Research is actively underway on models in the 13B to 60B parameter range, alongside various efforts to rapidly deploy embedding models on top of foundation models exceeding 100B parameters. Ultimately, all of these approaches aim to achieve maximum utility at minimal cost. For everyone except Google, OpenAI, and Meta, who have already secured their infrastructure, datasets, and funding, this is virtually the only viable option left in the market.