In this paper, we propose the Bayesian Autoregressive and Recurrent Neural Network (BARNN) to address the lack of uncertainty handling in conventional autoregressive and recurrent networks. Building on the variational dropout method, BARNN provides a systematic way to convert conventional models into Bayesian ones. In particular, by introducing the temporal Variational Mixture of Posteriors (tVAMP) prior, it enables efficient and well-calibrated Bayesian inference even in large-scale recurrent neural networks. Through extensive experiments on PDE modeling and molecule generation, we demonstrate that BARNN not only achieves accuracy comparable or superior to that of conventional methods, but also excels in uncertainty quantification and in modeling long-range dependencies.
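To make the underlying idea concrete, the sketch below illustrates generic variational dropout in a recurrent network, where a single dropout mask is sampled per sequence and reused at every timestep so that repeated stochastic forward passes yield a predictive mean and an uncertainty estimate. This is a minimal, hedged illustration of the prior art the abstract builds on, not the paper's BARNN or tVAMP method; all names and shapes here are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (NOT the paper's BARNN implementation): variational dropout
# for a vanilla RNN. One dropout mask is sampled per sequence and reused at
# every timestep, so each stochastic forward pass behaves like a Monte Carlo
# sample from an approximate Bayesian posterior over the weights.

rng = np.random.default_rng(0)

def rnn_forward(x_seq, W_x, W_h, keep_prob=0.8):
    """Run a tanh RNN over x_seq with variational (timestep-tied) dropout."""
    hidden = np.zeros(W_h.shape[0])
    # Sample the masks ONCE for the whole sequence -- the key difference
    # from standard dropout, which resamples at every timestep.
    mask_x = rng.binomial(1, keep_prob, size=x_seq.shape[1]) / keep_prob
    mask_h = rng.binomial(1, keep_prob, size=hidden.shape) / keep_prob
    for x_t in x_seq:
        hidden = np.tanh(W_x @ (x_t * mask_x) + W_h @ (hidden * mask_h))
    return hidden

# Toy usage: averaging several stochastic passes gives a predictive mean,
# and their spread gives a simple per-unit uncertainty estimate.
x_seq = rng.normal(size=(5, 3))       # 5 timesteps, 3 input features
W_x = rng.normal(size=(4, 3)) * 0.1   # input-to-hidden weights (illustrative)
W_h = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden weights (illustrative)
samples = np.stack([rnn_forward(x_seq, W_x, W_h) for _ in range(32)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
```

The per-unit standard deviation `std` is what a non-Bayesian RNN cannot provide from a single deterministic pass; BARNN's contribution, per the abstract, is making this kind of inference systematic and well calibrated at scale.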