This paper investigates automatic hyperparameter optimization for artificial neural networks. Traditional approaches (manual tuning, grid search) are often inefficient and resource-intensive. The study proposes a method for automatically selecting and optimizing neural network hyperparameters (learning rate, number of layers, number of neurons, activation function, batch size, etc.) using genetic algorithms. Each individual (chromosome) in the genetic algorithm's population encodes one set of hyperparameters, and validation accuracy serves as the fitness function. Through selection, crossover, and mutation operators, the best hyperparameter combination emerges over successive generations. Experiments on the MNIST, CIFAR-10, and Iris datasets show that the proposed method achieves 15-25% faster optimization than traditional methods (grid search, random search) while finding more accurate configurations. The results further show that the method improves neural network performance by an average of 3-8%.