In this paper, we present a novel approach to time series modeling that builds on the success of large-scale pre-trained models. We introduce Foundation Auto-Encoders (FAEs), a generative model for anomaly detection in time series data based on variational autoencoders (VAEs). The term "foundation" refers to a model pre-trained on a large corpus of time series data to learn complex temporal patterns, enabling accurate modeling, forecasting, and anomaly detection on previously unseen datasets. FAEs combine VAEs with dilated convolutional neural networks (DCNNs) to build a general model for univariate time series, which can ultimately perform out-of-the-box, zero-shot anomaly detection. We introduce the main concepts behind FAEs and present preliminary results on several multidimensional time series datasets from various domains, including a real-world dataset from an operational mobile ISP and the well-known KDD 2021 anomaly detection dataset.
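A likely motivation for pairing VAEs with dilated convolutions is that stacking dilated layers grows the temporal receptive field exponentially with depth while the parameter count grows only linearly. The sketch below illustrates this general property of WaveNet-style DCNN stacks; the kernel size and dilation schedule are illustrative assumptions, not the paper's actual architecture:

```python
# Receptive field of a stack of dilated 1-D convolutions.
# Illustrative sketch, not the FAE architecture: we assume kernel
# size 2 and dilations doubling per layer (1, 2, 4, ...), as in
# WaveNet-style dilated CNNs.

def receptive_field(kernel_size: int, dilations: list[int]) -> int:
    """Number of input timesteps seen by one output timestep."""
    rf = 1
    for d in dilations:
        # Each layer extends the receptive field by (k - 1) * dilation.
        rf += (kernel_size - 1) * d
    return rf

# Ten layers with doubling dilation cover 1024 timesteps:
dilations = [2 ** i for i in range(10)]
print(receptive_field(2, dilations))  # -> 1024
```

With this schedule, modeling long temporal contexts (e.g. daily or weekly seasonality at fine sampling rates) stays cheap, since depth, not kernel width, buys the context length.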