This paper addresses the problem of synthesizing diverse multi-finger robotic grasps from uncertain partial observations. Existing generative models struggle to capture the complex grasp distribution of dexterous hands and tend to produce unreliable or overly conservative grasps because they cannot account for the shape uncertainty inherent in partial point clouds. We propose FFHFlow, a flow-based variational framework that generates diverse and robust multi-finger grasps while explicitly quantifying perceptual uncertainty in partial point clouds. It overcomes the mode collapse and fixed-prior limitations of conditional VAEs by learning a hierarchical grasp manifold with a regularized flow-based deep latent variable model. Leveraging the invertibility and exact likelihoods of normalizing flows, we quantify shape uncertainty in partial observations and detect novel object structures, enabling risk-aware grasp synthesis. To further enhance reliability, we combine a discriminative grasp estimator with the flow likelihood in an uncertainty-aware ranking strategy that prioritizes grasps robust to shape ambiguity. Extensive experiments in simulation and real-world environments demonstrate that FFHFlow outperforms state-of-the-art baselines (including diffusion models) in grasp diversity and success rate while achieving efficient runtime sampling. Furthermore, diversity-aware sampling mitigates collisions, demonstrating its practical value in cluttered and confined environments.