In this paper, we propose a method to improve the accuracy and scalability of bio-inspired neural networks by applying a different biological learning rule to each layer via Neural Architecture Search (NAS). We extend the search space of existing NAS-based models to include a variety of biological learning rules, and automatically discover the optimal architecture and per-layer learning rules through NAS. Experimental results show that networks combining different biological learning rules across layers achieve higher accuracy than those trained with a single rule. On the CIFAR-10, CIFAR-100, ImageNet16-120, and ImageNet datasets, our method surpasses the best reported performance of existing biologically inspired models and, in some cases, outperforms backpropagation-based networks. These results suggest that per-layer diversity of learning rules contributes to improved scalability and accuracy.
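The idea of searching jointly over architectures and per-layer learning rules can be sketched as a combined search space in which each layer is assigned an (operation, learning rule) pair. The following minimal illustration uses random search over a toy space; the candidate sets `OPS` and `RULES`, and the placeholder `evaluate` function, are hypothetical assumptions for exposition and do not reflect the paper's actual operations, rules, or NAS algorithm:

```python
import random

# Hypothetical candidate sets; the paper's actual operation and rule sets may differ.
OPS = ["conv3x3", "conv1x1", "skip"]
RULES = ["hebbian", "oja", "feedback_alignment"]


def sample_candidate(num_layers, rng):
    """Sample one candidate: an (operation, learning rule) pair for each layer."""
    return [(rng.choice(OPS), rng.choice(RULES)) for _ in range(num_layers)]


def evaluate(candidate, rng):
    """Placeholder: a real NAS loop would train the network, applying each
    layer's learning rule, and return validation accuracy."""
    return rng.random()


def random_search(num_layers=3, trials=10, seed=0):
    """Keep the highest-scoring (architecture, rules) assignment seen so far."""
    rng = random.Random(seed)
    best_candidate, best_acc = None, -1.0
    for _ in range(trials):
        candidate = sample_candidate(num_layers, rng)
        acc = evaluate(candidate, rng)
        if acc > best_acc:
            best_candidate, best_acc = candidate, acc
    return best_candidate, best_acc


best_candidate, best_acc = random_search()
print(best_candidate)
```

Because each layer draws its rule independently, the search can discover mixed assignments (e.g. a Hebbian-style rule in early layers and a different rule later), which is the degree of freedom the paper exploits.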