Metadata-Version: 2.1
Name: ai-transformersx
Version: 0.4.7
Summary: tools for using huggingface/transformers more easily
Home-page: https://github.com/aicanhelp/ai-transformers
Author: modongsong
Author-email: modongsongml@163.com
License: MIT
Project-URL: Bug Reports, https://github.com/aicanhelp/ai-transformersx/issues
Project-URL: Source, https://github.com/aicanhelp/ai-transformers/
Keywords: deeplearning tools development
Platform: UNKNOWN
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Build Tools
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Requires-Python: >=3.5, <4
Description-Content-Type: text/markdown
Provides-Extra: dev
Requires-Dist: check-manifest ; extra == 'dev'
Provides-Extra: test
Requires-Dist: coverage ; extra == 'test'
Requires-Dist: pytest ; extra == 'test'

# Transformersx

[🤗 Transformers](https://github.com/huggingface/transformers) is a great project implementing the Transformer architecture for NLP and NLG.  

> **🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose
> architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, T5, CTRL...) for Natural Language Understanding (NLU) and
> Natural Language Generation (NLG) with thousands of pretrained models in 100+ languages and deep interoperability
> between PyTorch & TensorFlow 2.0.**

The purpose of this project is to add tools that make the huggingface/transformers library easier to use for NLP.
NLP task examples have been refactored or added as well.

Because it is hard to download the pretrained models from huggingface, especially in China, this project uses a trick to
work around the problem.  

For personal use, Aliyun builds docker images for free, and the build can typically run on an overseas machine. An image
built on an overseas machine can therefore download the pretrained models from huggingface quickly.  

After the image is built, it can be pulled from Aliyun quickly, and the pretrained models can then be extracted from the docker image.
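The trick above could be sketched with a Dockerfile like the following. This is only an illustration, not the project's actual build file: the model name `bert-base-chinese` and the Python/library versions are placeholder assumptions, and the cache location depends on the transformers version installed.

```dockerfile
# Sketch: build this image on an overseas machine (e.g. via Aliyun's
# image-building service) so the downloads from huggingface are fast.
FROM python:3.7-slim

RUN pip install torch transformers

# Download an example pretrained model into the image's cache at build time.
# The model name is illustrative; replace it with the models you need.
RUN python -c "from transformers import AutoModel, AutoTokenizer; \
    AutoModel.from_pretrained('bert-base-chinese'); \
    AutoTokenizer.from_pretrained('bert-base-chinese')"
```

After pulling the image, the cached models can be copied out with standard docker commands, for example `docker create --name tmp <image>`, then `docker cp tmp:/root/.cache ./models`, then `docker rm tmp`. Note that the exact cache path inside the image varies by transformers version (older releases use `~/.cache/torch/transformers`, newer ones `~/.cache/huggingface`).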



#### TODO:  
- (1) Build a management tool for transformers models that categorizes the models
    and supports training, fine-tuning, and prediction
- (2) Use Streamlit to build a UI for the management tool



