Current Issue
Vol. 6 No. 1 (2026): Eurasian Journal of Mathematical Theory and Computer Sciences
The Eurasian Journal of Mathematical Theory and Computer Sciences is a new journal recently launched by the “Innovative Academy RSC” publisher. The Eurasian Journal of Mathematical Theory and Computer Sciences (EJMTCS) is a peer-reviewed, open-access monthly journal that publishes high-quality original research papers on the development of theories and methods for the mathematical, computer, and information sciences; the design, implementation, and analysis of algorithms and software tools for mathematical computation and reasoning; and the integration of mathematics and computer science for scientific and engineering applications.
Published: 2026-01-05
Articles
USING THE GRAPHICAL TOOLS OF BORLAND C++ BUILDER
Curious students who begin learning a modern programming language, regardless of which one, inevitably become acquainted with graphics libraries. This article provides information on the graphical tools of Borland C++ Builder, as well as tips and instructions for creating multi-form applications.
2026-01-05
NEW ALGORITHMIC SOLUTIONS AND THEIR MATHEMATICAL FOUNDATIONS
This article examines new algorithmic solutions and their underlying mathematical foundations. It analyzes complex mathematical principles necessary for understanding and developing advanced algorithms in areas such as machine learning, quantum computing, blockchain technology, optimization, and graph theory. The article highlights the role of linear algebra, calculus, probability theory, and information theory in modern algorithms. It also discusses mathematical approaches to pressing issues such as algorithmic fairness and interpretability. By synthesizing existing research and conceptual ideas, the article demonstrates the inseparable connection between abstract mathematical theory and practical algorithmic innovations, and outlines future research and application directions.
2026-01-05
OPTIMIZATION OF NEURAL NETWORK HYPERPARAMETERS USING GENETIC ALGORITHMS
This paper investigates the problem of automatic hyperparameter optimization for artificial neural networks. Traditional hyperparameter optimization methods (manual tuning, grid search) are often inefficient and resource-intensive. The study proposes a method for automatic selection and optimization of neural network hyperparameters (learning rate, number of layers, number of neurons, activation function, batch size, etc.) using genetic algorithms. In the genetic algorithm, each individual (chromosome) in the population encodes a set of hyperparameters, and validation accuracy is used as the fitness function. Through selection, crossover, and mutation operators, the best combination of hyperparameters is identified over successive generations. Experiments conducted on the MNIST, CIFAR-10, and Iris datasets show that the proposed method achieves 15-25% faster and more accurate optimization than traditional methods (grid search, random search). Additionally, it is shown that this method improves neural network performance by an average of 3-8%.
2026-01-12
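The genetic-algorithm scheme the abstract describes (chromosomes encoding hyperparameters, validation accuracy as fitness, selection/crossover/mutation across generations) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the search space, the truncation-selection scheme, and the fitness function are all assumptions, and a synthetic score stands in for the validation accuracy a real run would obtain by training a network.

```python
import random

# Hypothetical search space; the paper's actual hyperparameter ranges are not given.
SPACE = {
    "learning_rate": [0.0001, 0.001, 0.01, 0.1],
    "num_layers": [1, 2, 3, 4],
    "neurons": [16, 32, 64, 128],
    "batch_size": [16, 32, 64, 128],
}

def random_chromosome():
    """One individual: a dict mapping each hyperparameter to a chosen value."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(chrom):
    """Stand-in for validation accuracy. In practice, train a network with
    these hyperparameters and evaluate it on a held-out validation set."""
    return (chrom["num_layers"] * chrom["neurons"]) / (
        1 + abs(chrom["learning_rate"] - 0.01) * 100 + chrom["batch_size"] / 64
    )

def crossover(a, b):
    """Uniform crossover: each gene is inherited from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(chrom, rate=0.1):
    """With probability `rate`, resample a gene from its allowed values."""
    return {k: (random.choice(v) if random.random() < rate else chrom[k])
            for k, v in SPACE.items()}

def evolve(pop_size=20, generations=10):
    """Truncation selection: keep the fitter half, refill with mutated children."""
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Replacing the synthetic `fitness` with actual training-and-validation of a network on a dataset such as MNIST or Iris would turn this sketch into the kind of optimizer the paper evaluates.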
