🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG), with 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.

Features:

- State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch
- Deep interoperability between TensorFlow 2.0 and PyTorch models
- Train state-of-the-art models in 3 lines of code
- 8 architectures with over 30 pretrained models, some in more than 100 languages
- Choose the right framework for every part of a model's lifetime
- Low barrier to entry for educators and practitioners
- Lower compute costs, smaller carbon footprint: researchers can share trained models instead of always retraining, and practitioners can reduce compute time and production costs
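As a minimal sketch of the "3 lines of code" claim, here is the standard pattern for loading a pretrained model and tokenizer with the library's `from_pretrained` API (the `bert-base-uncased` checkpoint is one of the library's published pretrained models; the first call downloads weights from the model hub):

```python
# Minimal sketch: load a pretrained BERT model and run one sentence
# through it. Requires: pip install transformers torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence into PyTorch tensors and compute hidden states.
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model(**inputs)
hidden = outputs.last_hidden_state  # shape: (batch, seq_len, hidden_size)
```

Because the checkpoints are shared on the hub, anyone can reuse this trained model instead of retraining it, which is the compute- and carbon-saving point made above.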