Haebom
Google PaLM 2 scale leak – 340B parameters
The PaLM 2 technical report omitted the parameter and token counts, so the exact scale was unknown. Based on Chinchilla scaling laws, the best guess was roughly 175B parameters trained on about 3.7T tokens. The leaked figures put the token count in the same range (3.6T), but the model itself at 340B parameters, almost twice the guessed size.
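As a rough sanity check, the leaked numbers can be run against the commonly cited Chinchilla rule of thumb of about 20 training tokens per parameter. Note that the 3.6T-token and 340B-parameter figures come from the leak, and the 20:1 ratio is an approximation, not anything stated in the PaLM 2 report:

```python
# Rough Chinchilla sanity check on the leaked PaLM 2 numbers.
# Assumes the commonly cited ~20 tokens-per-parameter rule of thumb;
# the 3.6T-token / 340B-parameter figures are from the leak, not the report.

TOKENS = 3.6e12          # leaked training-token count
PARAMS_LEAKED = 340e9    # leaked parameter count
TOKENS_PER_PARAM = 20    # approximate Chinchilla-optimal ratio

# Compute-optimal parameter count implied by the token budget
params_optimal = TOKENS / TOKENS_PER_PARAM

print(f"Chinchilla-optimal size for 3.6T tokens: ~{params_optimal / 1e9:.0f}B params")
print(f"Leaked size vs. compute-optimal guess: {PARAMS_LEAKED / params_optimal:.1f}x")
# -> ~180B params would be compute-optimal; the leaked 340B is roughly
#    1.9x that, matching the "almost twice as large" observation above.
```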