Technology Skill

Bidirectional encoder representations from transformers (BERT)

Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine learning technique for natural language processing (NLP) pre-training, created and published in 2018 by Jacob Devlin and his colleagues at Google. In 2019, Google announced that it had begun leveraging BERT in its search engine, and by late 2020 it was using BERT in almost every English-language query. A 2020 literature survey concluded that "in a little over a year, BERT has become a ubiquitous baseline in NLP experiments", counting over 150 research publications analyzing and improving the model.
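
To illustrate how workers typically apply a pre-trained BERT model in practice, the minimal sketch below loads one and extracts contextual token embeddings. It assumes the Hugging Face Transformers library and the "bert-base-uncased" checkpoint, neither of which is referenced on this page; they are common choices used here only for the example.

# Minimal sketch: contextual embeddings from a pre-trained BERT model.
# Assumes the Hugging Face `transformers` and `torch` packages are installed.
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence. BERT is bidirectional: each token's representation
# is conditioned on both its left and right context in the sequence.
inputs = tokenizer("The bank raised interest rates.", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual embedding per token: shape (batch, tokens, hidden_size).
print(outputs.last_hidden_state.shape)
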

Examples of occupations where workers may use Bidirectional encoder representations from transformers (BERT):

1 occupation shown
Job Zone  Code        Occupation
5         15-1221.00  Computer and Information Research Scientists (Bright Outlook)

This page uses material from the Wikipedia article BERT (language model), which is released under the Creative Commons Attribution-Share-Alike License 3.0.