
Building a Service with a Language Model

Prompts play a significant role in streamlining interactions with GPT models like GPT-3.5 and GPT-4. As AI continues to evolve, I believe prompts will further simplify communication with foundation models and make it easier and more efficient to use them across a variety of applications.
With AI becoming increasingly mainstream, it reminds me of the days when planners had to study concepts like APIs and SDKs. It isn't strictly necessary knowledge, but it's definitely useful to have. In reality, what people who build or plan to build AI-related services really need to understand is how these systems are structured.

The most straightforward approach is to use APIs provided by companies like OpenAI: send your input in a request and handle the output. However, most CEOs and service providers are reluctant to stop there, because they're after a better user experience or greater cost efficiency. Unless the goal is to break into the market quickly, this isn't usually the favored choice. A useful analogy is the domestic cloud storage business: if you build your own IDC or data center, operating costs can be cut by more than 50%, but designing and building it properly is neither cheap nor easy.
Personally, if you're planning or aspiring to create an AI-related service, I'd suggest reading the information above, regardless of your role. In fact, it's enough to understand just three core concepts clearly: Foundation Model, Embedding Model, and VectorDB.
Figure: "Emerging Architectures for LLM Applications," by Matt Bornstein and Rajko Radovanovic
Imagine the whole system as a library. The Foundation Model is the librarian: they possess vast knowledge about the books (information) and help you find exactly what you're looking for. If you ask, "Can you tell me about computers?" they'll suggest books about computers.
The Embedding Model converts each book into a numeric code so it can be found easily. It's like assigning a unique barcode to each book, enabling you to locate the one you want in no time.
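In practice you would use a trained embedding model (OpenAI's embedding endpoints, sentence-transformers, and so on). Purely as an illustration of the idea — text in, fixed-length "barcode" vector out — here is a toy character-trigram hashing embedder in plain Python; the trigram-hashing scheme is my own stand-in, not how real models work internally:

```python
import hashlib
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy embedding: hash character trigrams into a fixed-length vector.
    Real services use trained models; this only illustrates mapping text
    to a numeric vector that similar texts map near to."""
    vec = [0.0] * dim
    padded = f"  {text.lower()}  "
    for i in range(len(padded) - 2):
        trigram = padded[i:i + 3]
        bucket = int(hashlib.md5(trigram.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    # L2-normalize so vectors can be compared by cosine similarity
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

vector = embed("Can you tell me about computers?")
print(len(vector))  # 64
```

Because similar strings share trigrams, "computer" and "computers" end up closer together than "computer" and an unrelated word — the property a real embedding model provides at a much deeper, semantic level.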
VectorDB can be thought of as a large bookshelf where books are organized using barcodes. This shelf stores books grouped by subject or organized however you prefer, so when you say, "I'd like to see books on this topic," it can immediately provide the relevant books.
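Real vector databases (Pinecone, Weaviate, pgvector, and others) add indexing, persistence, and scale, but the core operation is nearest-neighbor search over stored vectors. A minimal in-memory sketch, assuming vectors are plain Python lists:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class TinyVectorDB:
    """Minimal in-memory vector store: add (id, vector), query top-k."""
    def __init__(self):
        self.items = []  # list of (doc_id, vector) pairs

    def add(self, doc_id: str, vector: list[float]) -> None:
        self.items.append((doc_id, vector))

    def query(self, vector: list[float], k: int = 3) -> list[str]:
        # Score every stored vector and return the k most similar ids
        scored = [(cosine(vector, v), doc_id) for doc_id, v in self.items]
        scored.sort(reverse=True)
        return [doc_id for _, doc_id in scored[:k]]

db = TinyVectorDB()
db.add("book-computers", [1.0, 0.0, 0.2])
db.add("book-cooking", [0.0, 1.0, 0.1])
print(db.query([0.9, 0.1, 0.2], k=1))  # ['book-computers']
```

Asking for "books on this topic" is exactly this `query` call: embed the question, then return the shelf positions whose barcodes point in the most similar direction.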
When building AI services, you’ll notice that people’s questions to AI fall into clear, frequently asked categories much more than you’d expect. Understanding this allows you to design your VectorDB and Embedding models more effectively, which can significantly cut costs.
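One way to exploit that repetition is a semantic cache: before calling the expensive foundation model, check whether a sufficiently similar question has already been answered. The sketch below is illustrative; `embed_fn` and `llm_fn` are placeholders you would wire to your real embedding model and model API, and the 0.9 threshold is an arbitrary example value you'd tune:

```python
import math

def _cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return dot / (na * nb)

class SemanticCache:
    """Serve repeat questions from cache instead of re-calling the model."""
    def __init__(self, embed_fn, llm_fn, threshold: float = 0.9):
        self.embed_fn = embed_fn      # placeholder: your embedding model
        self.llm_fn = llm_fn          # placeholder: your foundation-model call
        self.threshold = threshold
        self.entries = []             # list of (question_vector, answer)
        self.llm_calls = 0            # how often we actually paid for the model

    def ask(self, question: str) -> str:
        qvec = self.embed_fn(question)
        # Cache hit: an earlier question was similar enough
        for vec, answer in self.entries:
            if _cosine(qvec, vec) >= self.threshold:
                return answer
        # Cache miss: call the model and remember the result
        self.llm_calls += 1
        answer = self.llm_fn(question)
        self.entries.append((qvec, answer))
        return answer

# Demo with stand-in functions (a real service would call actual models)
fake_embed = lambda q: [1.0, 0.0] if "refund" in q.lower() else [0.0, 1.0]
fake_llm = lambda q: f"answer to: {q}"
cache = SemanticCache(fake_embed, fake_llm)
cache.ask("How do I get a refund?")
cache.ask("Refund process?")   # similar question -> served from cache
print(cache.llm_calls)  # 1
```

Every cache hit is a model call you didn't pay for, which is why knowing your users' frequent question clusters translates directly into lower costs.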
ⓒ 2023. Haebom, all rights reserved.
It may be used for commercial purposes with permission from the copyright holder, provided the source is cited.