I'm confused by the language model explorer mentioned in the article. It says it's for GPT-2, but then also says it was built using BERT. Which is it?
The underlying library is called "pytorch-pretrained-BERT" because it initially contained only an implementation of BERT, but it now includes implementations of several models, so they backronym-ed it to "Big-&-Extending-Repository-of-Transformers". :)