The availability of vast unsupervised training data, along with neural scaling laws, has resulted in an unprecedented surge in model size and in the compute required to train and serve LLMs.