This paper identifies performance shortcomings in existing AutoML platforms and proposes Gradients, a distributed system built on the Bittensor network. Gradients is a competitive marketplace in which independent miners race to find optimal hyperparameters and receive rewards proportional to their performance. Experiments show that Gradients achieved a 100% win rate against TogetherAI, Databricks, and Google Cloud, and an 82.8% win rate against HuggingFace AutoTrain. On average it outperformed commercial platforms by 42.1%, with gains of 30-40% on retrieval-augmented generation tasks and 23.4% on diffusion models. These results demonstrate that a distributed system with economic incentives can outperform existing centralized AutoML platforms.
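The incentive scheme described above, in which rewards are proportional to miner performance, can be illustrated with a minimal sketch. This is a hypothetical example for intuition only, not the paper's actual incentive formula; the function name, score scale, and reward pool are assumptions.

```python
def proportional_rewards(scores, pool):
    """Split a reward pool among miners in proportion to their
    evaluation scores (illustrative; not the Gradients formula).

    scores: mapping of miner id -> non-negative performance score
    pool:   total reward to distribute
    """
    total = sum(scores.values())
    if total == 0:
        # No miner produced a useful result; nothing is paid out.
        return {miner: 0.0 for miner in scores}
    return {miner: pool * s / total for miner, s in scores.items()}

# A miner whose hyperparameter search scores 3x higher earns 3x the reward.
print(proportional_rewards({"miner_a": 3.0, "miner_b": 1.0}, 100.0))
```

Under such a scheme, a miner's expected payout rises monotonically with the quality of the configuration it finds, which is what makes the competition converge toward better hyperparameters.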