This paper proposes Token-Entropy Conformal Prediction (TECP), a novel framework that leverages token entropy for uncertainty quantification (UQ) in open-ended language generation under black-box constraints. TECP uses token-level entropy as an uncertainty measure, requiring neither logits nor reference answers, and integrates it into a split conformal prediction (CP) pipeline to construct prediction sets with formal coverage guarantees. Unlike existing methods that rely on semantic-consistency heuristics or white-box features, TECP estimates epistemic uncertainty directly from the token-entropy structure of sampled generations and calibrates uncertainty thresholds via CP quantiles, ensuring verifiable error control. Experiments with six large language models on two benchmarks (CoQA and TriviaQA) show that TECP consistently achieves reliable coverage and compact prediction sets, outperforming prior self-consistency-based UQ methods. This study provides a principled and efficient solution for reliable generation in the black-box LLM setting.
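For intuition, the sketch below illustrates the two ingredients the abstract describes: an entropy-style uncertainty score computed from sampled generations (no logits required) and a split-CP quantile threshold used to form prediction sets. The function names, the position-wise token-alignment heuristic, and the miscoverage level `alpha` are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def token_entropy_score(sampled_answers):
    """Illustrative token-entropy uncertainty score (assumed form, not the paper's exact formula).

    Estimates per-position token entropy from the empirical distribution of tokens
    across multiple sampled generations, then averages over positions.
    """
    max_len = max(len(s) for s in sampled_answers)
    entropies = []
    for pos in range(max_len):
        # Align samples by position; pad shorter samples with an end-of-sequence token.
        tokens = [s[pos] if pos < len(s) else "<eos>" for s in sampled_answers]
        _, counts = np.unique(tokens, return_counts=True)
        probs = counts / counts.sum()
        entropies.append(-np.sum(probs * np.log(probs)))
    return float(np.mean(entropies))

def conformal_threshold(calibration_scores, alpha=0.1):
    """Split-CP calibration: the ceil((n+1)(1-alpha))/n empirical quantile of calibration scores."""
    n = len(calibration_scores)
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return float(np.quantile(calibration_scores, q_level, method="higher"))

def prediction_set(candidates, scores, threshold):
    """Keep every candidate whose uncertainty score is at or below the calibrated threshold."""
    return [c for c, s in zip(candidates, scores) if s <= threshold]
```

Under the usual exchangeability assumption of split conformal prediction, thresholding held-out scores at this quantile yields marginal coverage of at least 1 - alpha, which is the formal guarantee the abstract refers to.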