This paper proposes a Flexible Coded Distributed Convolution Computing (FCDCC) framework to address straggler nodes, which cause delays when deploying CNNs in resource-constrained environments. It extends existing Coded Distributed Computing (CDC) with Circulant and Rotation Matrix Embedding (CRME) and applies it to high-dimensional tensor convolutions. The proposed Numerically Stable Coded Tensor Convolution (NSCTC) scheme introduces two novel coded partitioning strategies: Adaptive-Padding Coded Partitioning (APCP) for input tensors and Kernel-Channel Coded Partitioning (KCCP) for filter tensors. These strategies enable the linear decomposition of tensor convolutions and their encoding into CDC subtasks, combining model parallelism with coded redundancy for robust and efficient execution. Theoretical analysis identifies the optimal trade-off between communication and storage costs, and experimental results demonstrate computational efficiency, resilience to straggler nodes, and scalability across various CNN architectures.
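To make the coded-subtask idea concrete, the sketch below illustrates one ingredient only: KCCP-style partitioning of the filter tensor along output channels, encoding the k partitions into n redundant convolution subtasks, and decoding the full result from any k non-straggler workers. It is a minimal illustration under stated assumptions, not the paper's method: it uses a plain Vandermonde generator matrix rather than the CRME construction the paper adopts for numerical stability, omits APCP input partitioning, and all names (`conv2d`, `f_parts`, `G`, the worker indices) are invented for this example.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def conv2d(x, f):
    """Valid cross-correlation: x is (C_in, H, W), f is (C_out, C_in, kh, kw)."""
    kh, kw = f.shape[-2:]
    windows = sliding_window_view(x, (kh, kw), axis=(1, 2))  # (C_in, H', W', kh, kw)
    return np.einsum('oikl,ihwkl->ohw', f, windows)          # (C_out, H', W')

# Illustrative sizes only.
rng = np.random.default_rng(0)
C_in, H, W, C_out, kh, kw = 3, 16, 16, 8, 3, 3
k, n = 4, 6                                   # k data partitions, n workers; tolerates n - k stragglers
x = rng.standard_normal((C_in, H, W))
filters = rng.standard_normal((C_out, C_in, kh, kw))

# KCCP-style partitioning: split the filter tensor along the output-channel axis.
f_parts = np.split(filters, k, axis=0)        # each block is (C_out/k, C_in, kh, kw)

# Encode with an n x k generator matrix (plain Vandermonde here; the paper
# instead uses CRME for numerical stability of the decoding step).
alphas = np.arange(1, n + 1, dtype=float)
G = np.vander(alphas, k, increasing=True)     # G[i, j] = alphas[i] ** j
f_coded = [sum(G[i, j] * f_parts[j] for j in range(k)) for i in range(n)]

# Each worker runs one coded convolution subtask; convolution is linear in the
# filter, so worker i computes sum_j G[i, j] * conv2d(x, f_parts[j]).
worker_outputs = [conv2d(x, fc) for fc in f_coded]

# Decode from any k surviving workers (here workers 0 and 3 are stragglers).
survivors = [1, 2, 4, 5]
G_sub = G[survivors, :]                       # k x k, invertible for distinct alphas
Y = np.stack([worker_outputs[i] for i in survivors])           # (k, C_out/k, H', W')
decoded = np.einsum('ks,s...->k...', np.linalg.inv(G_sub), Y)  # per-partition results
recovered = decoded.reshape(-1, *decoded.shape[2:])            # (C_out, H', W')

assert np.allclose(recovered, conv2d(x, filters))
```

Because the convolution is linear in the filter tensor, the coded subtasks are themselves linear combinations of the uncoded partial results, which is what allows any k of the n worker outputs to be decoded; the same linearity argument underlies coding the input partitions in the full FCDCC scheme.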