This paper proposes EDNAG (Evolutionary Diffusion-based Neural Architecture Generation), a neural architecture generation method that addresses the high computational cost and long runtimes of neural architecture search (NAS). EDNAG uses an evolutionary algorithm to mimic the denoising process of a diffusion model, with fitness guiding the transition from a random Gaussian distribution to an optimal architecture distribution. By combining the strengths of evolutionary strategies and diffusion models, it generates architectures efficiently and, notably, requires no training. Experiments show that EDNAG achieves state-of-the-art performance, improving accuracy by up to 10.45% and accelerating inference by an average of 50x.
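The core idea, fitness-guided denoising from Gaussian noise toward a high-fitness distribution, can be illustrated with a minimal sketch. This is not EDNAG's actual algorithm; the fitness function, update rule, and noise schedule below are all hypothetical stand-ins chosen for simplicity (here, architectures are toy real-valued vectors and fitness is distance to a fixed target):

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Hypothetical fitness: negative squared distance to a fixed "optimal"
    # architecture vector (a real NAS fitness would be validation accuracy).
    target = np.full(x.shape[-1], 0.5)
    return -np.sum((x - target) ** 2, axis=-1)

def evolutionary_denoise(pop_size=64, dim=8, steps=50, noise_scale=0.5):
    # Start from pure Gaussian noise, as in a diffusion model's reverse process.
    x = rng.normal(size=(pop_size, dim))
    for t in range(steps):
        f = fitness(x)
        # Fitness-weighted estimate of the clean optimum (softmax selection
        # plays the role a learned denoiser would play in a diffusion model).
        w = np.exp(f - f.max())
        w /= w.sum()
        x0_hat = w @ x
        # Denoising-style update: interpolate toward the estimate while the
        # injected noise decays to zero over the schedule.
        alpha = (t + 1) / steps
        noise = rng.normal(size=x.shape) * noise_scale * (1 - alpha)
        x = (1 - alpha) * x + alpha * x0_hat + noise
    return x

final = evolutionary_denoise()
```

Because selection replaces a trained denoising network, no training phase is needed, which mirrors the training-free property claimed for EDNAG.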