This paper proposes MedRep, a novel medical concept representation based on the OMOP Common Data Model (CDM). Despite the performance gains of electronic health record (EHR)-based models, generalizing and integrating models trained on different vocabularies remains difficult because such models cannot handle unregistered medical codes; MedRep addresses this challenge. MedRep enriches the information of each concept by generating a minimal definition through large language model (LLM) prompting and complements the resulting textual representations using the graph ontology of the OMOP vocabulary. Experimental results demonstrate that MedRep outperforms existing EHR-based models and existing medical code tokenizers across a variety of prediction tasks, and external validation confirms its generalizability.