Instructions for using Shivam098/Translation with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Shivam098/Translation with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Shivam098/Translation")
model = AutoModelForSeq2SeqLM.from_pretrained("Shivam098/Translation")
```

- Notebooks
- Google Colab
- Kaggle
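Once the checkpoint is loaded, it can be used for generation. The sketch below is a non-authoritative example: it assumes the model behaves like its mBART-50 base (facebook/mbart-large-50-many-to-many-mmt, per the PR below), whose tokenizer takes a `src_lang` attribute and whose decoder is steered with `forced_bos_token_id`. The language codes and the `translate` helper are illustrative, not part of the model card.

```python
# Sketch of translation inference, assuming an mBART-50-style tokenizer.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM


def translate(text, src_lang="en_XX", tgt_lang="fr_XX",
              checkpoint="Shivam098/Translation"):
    # Hypothetical helper; language codes follow the mBART-50 convention.
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
    tokenizer.src_lang = src_lang  # tell the tokenizer the input language
    inputs = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **inputs,
        # Force the first decoded token to the target-language code.
        forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang],
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]


if __name__ == "__main__":
    print(translate("Hello, world!"))
```

If the fine-tuned checkpoint targets a fixed language pair, the `src_lang`/`tgt_lang` arguments may be unnecessary; check the model's README for its intended usage.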
Librarian Bot: Add base_model information to model
#3
by librarian-bot - opened
This pull request aims to enrich the metadata of your model by adding facebook/mbart-large-50-many-to-many-mmt as a base_model field, situated in the YAML block of your model's README.md.
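Concretely, the change adds a `base_model` entry to the YAML front matter at the top of README.md. A minimal sketch (the `base_model` field is from this PR; any other fields shown are illustrative placeholders for whatever the README already contains):

```yaml
---
# ...existing metadata fields stay as they are...
base_model: facebook/mbart-large-50-many-to-many-mmt
---
```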
How did we find this information? We performed a regular expression match on your README.md file to determine the connection.
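The PR does not show the actual pattern used, but a minimal sketch of this kind of match, assuming the README mentions the base checkpoint as a Hub-style `owner/model` id:

```python
import re

# Hypothetical README excerpt; the real file's wording may differ.
readme = "Fine-tuned from facebook/mbart-large-50-many-to-many-mmt on a custom dataset."

# Look for a repo id of the form "owner/model-name" containing "mbart".
match = re.search(r"\b([\w.-]+)/(mbart[\w.-]+)\b", readme)
base_model = match.group(0) if match else None
print(base_model)  # → facebook/mbart-large-50-many-to-many-mmt
```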
Why add this? Enhancing your model's metadata in this way:
- Boosts Discoverability - It becomes straightforward to trace the relationships between various models on the Hugging Face Hub.
- Highlights Impact - It showcases the contributions and influences different models have within the community.
For a hands-on example of how such metadata can play a pivotal role in mapping model connections, take a look at librarian-bots/base_model_explorer.
This PR comes courtesy of Librarian Bot. If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien. Your input is invaluable to us!