TreeGPT is an attention-free neural network architecture that explores a pure TreeFFN encoder-decoder design for structured inference tasks. Unlike conventional transformer approaches that rely on attention mechanisms, TreeGPT aims to deliver competitive inference performance while maintaining computational efficiency by using bidirectional TreeFFN components that process sequences in parallel through simple neighbor connections: the encoder captures left-to-right dependencies, while the decoder captures right-to-left patterns. With 3.16 million parameters, we achieved 99% validation accuracy on the ARC Prize 2025 dataset; the model converged within 1,500 training steps and reached 100% token-level accuracy on selected evaluation samples.
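
To make the attention-free, neighbor-connected design concrete, the following is a minimal PyTorch sketch of one possible reading of the architecture. The class names (`TreeFFN`, `TreeGPTSketch`), the exact update rule (concatenating each token with its immediate neighbor before a feed-forward layer), and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class TreeFFN(nn.Module):
    """Sketch of a TreeFFN block: every position is updated in parallel from
    itself and its immediate left or right neighbor, with no attention.
    The update rule here is an assumption for illustration."""

    def __init__(self, d_model: int, hidden: int, left_to_right: bool = True):
        super().__init__()
        self.left_to_right = left_to_right
        self.ffn = nn.Sequential(
            nn.Linear(2 * d_model, hidden),  # current token + neighbor token
            nn.GELU(),
            nn.Linear(hidden, d_model),
        )
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        if self.left_to_right:
            # left neighbor: shift right by one position, zero-pad the start
            neighbor = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1)
        else:
            # right neighbor: shift left by one position, zero-pad the end
            neighbor = torch.cat([x[:, 1:], torch.zeros_like(x[:, :1])], dim=1)
        update = self.ffn(torch.cat([x, neighbor], dim=-1))
        return self.norm(x + update)  # residual connection + layer norm


class TreeGPTSketch(nn.Module):
    """Hypothetical attention-free encoder-decoder: a stack of left-to-right
    TreeFFN encoder blocks followed by right-to-left TreeFFN decoder blocks."""

    def __init__(self, vocab: int, d_model: int = 256, hidden: int = 1024, layers: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        self.encoder = nn.ModuleList(
            [TreeFFN(d_model, hidden, left_to_right=True) for _ in range(layers)]
        )
        self.decoder = nn.ModuleList(
            [TreeFFN(d_model, hidden, left_to_right=False) for _ in range(layers)]
        )
        self.head = nn.Linear(d_model, vocab)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        h = self.embed(tokens)           # (batch, seq_len, d_model)
        for layer in self.encoder:
            h = layer(h)                 # left-to-right neighbor passes
        for layer in self.decoder:
            h = layer(h)                 # right-to-left neighbor passes
        return self.head(h)              # per-token logits


# Usage example with made-up sizes:
# model = TreeGPTSketch(vocab=16)
# logits = model(torch.randint(0, 16, (2, 30)))   # (2, 30, 16)
```

Because each block only combines a position with one neighbor, all positions can be updated in parallel per layer, which is the source of the computational-efficiency claim relative to quadratic attention.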