
Google Trains 280 Billion Parameter AI Language Model Gopher

InfoQ.com
 15 days ago

Google subsidiary DeepMind announced Gopher, a 280-billion-parameter AI natural language processing (NLP) model. Based on the Transformer architecture and trained on a 10.5TB corpus called MassiveText, Gopher outperformed the current state-of-the-art on 100 of 124 evaluation tasks. The model and several experiments were described in a paper published on...

