Haebom
Google PaLM 2 scale leak: 340B parameters
The PaLM 2 technical report left the token count ambiguous, so the exact scale could not be pinned down; based on Chinchilla scaling, the best guess was roughly 175B parameters and 3.7T training tokens. The leaked figures match that token estimate fairly closely, but the model itself, at 340B parameters, is almost twice as large.
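As a rough sanity check on that comparison, here is a minimal sketch of the arithmetic, assuming the Chinchilla heuristic of roughly 20 training tokens per parameter and the widely reported leaked figures of 340B parameters and about 3.6T tokens. The exact ratio depends on the compute budget, so treat these numbers as ballpark only.

```python
# Chinchilla sanity check: Hoffmann et al. (2022) suggest a
# compute-optimal budget of roughly 20 training tokens per parameter.
TOKENS_PER_PARAM = 20  # approximate Chinchilla heuristic

def chinchilla_optimal_tokens(params: float) -> float:
    """Compute-optimal training token count for a given parameter count."""
    return params * TOKENS_PER_PARAM

# Earlier guess: ~175B parameters, which Chinchilla pairs with ~3.5T tokens,
# close to the ~3.7T estimate circulating before the leak.
guess_params = 175e9
print(f"175B model wants ~{chinchilla_optimal_tokens(guess_params) / 1e12:.1f}T tokens")

# Leaked figures (assumed here): 340B parameters, ~3.6T tokens.
leak_params, leak_tokens = 340e9, 3.6e12
print(f"340B model wants ~{chinchilla_optimal_tokens(leak_params) / 1e12:.1f}T tokens")
print(f"Leaked ratio: ~{leak_tokens / leak_params:.1f} tokens/param (Chinchilla ~20)")
```

On these numbers the leaked model sits at roughly 10 tokens per parameter, well under the Chinchilla optimum of about 20, which is consistent with the post's point: a similar token count, but nearly double the parameters.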