When working with LLMs (Large Language Models), adjusting certain settings can make a big difference in the response. Here is an analysis of these settings and how to use them effectively. At first it may be hard to see why a temperature value is shown or what a stop sequence is, but for now just treat them as terms to learn. If you only use ChatGPT, there is just an input box, so it is hard to see how these settings work; but if you open the Playground, you can see how models like GPT-3.5 are configured.