Instructions to use openai/privacy-filter with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use openai/privacy-filter with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="openai/privacy-filter")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("openai/privacy-filter")
model = AutoModelForTokenClassification.from_pretrained("openai/privacy-filter")
```

- Transformers.js
How to use openai/privacy-filter with Transformers.js:
```javascript
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate pipeline
const pipe = await pipeline('token-classification', 'openai/privacy-filter');
```

- Inference
- Notebooks
- Google Colab
- Kaggle
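As a sketch of what to do with the pipeline's output: a token-classification pipeline called with `aggregation_strategy="simple"` returns a list of dicts containing `entity_group`, `score`, and the `start`/`end` character offsets of each flagged span. The snippet below shows one way to redact those spans from the input text. The label names and example spans are invented for illustration and are not taken from the model card.

```python
# Hypothetical post-processing sketch: redact character spans flagged by a
# token-classification pipeline (e.g. pipe(text, aggregation_strategy="simple")).
def redact(text, entities, mask="[REDACTED]"):
    """Replace each flagged character span [start, end) with a mask string."""
    out, last = [], 0
    for ent in sorted(entities, key=lambda e: e["start"]):
        out.append(text[last:ent["start"]])  # keep text before the span
        out.append(mask)                     # mask the flagged span
        last = ent["end"]
    out.append(text[last:])                  # keep the tail after the last span
    return "".join(out)

# Example entities in the standard pipeline output shape (values invented):
text = "Contact Jane Doe at jane@example.com."
entities = [
    {"entity_group": "NAME",  "score": 0.99, "start": 8,  "end": 16},
    {"entity_group": "EMAIL", "score": 0.98, "start": 20, "end": 36},
]
print(redact(text, entities))  # → Contact [REDACTED] at [REDACTED].
```

The same function works unchanged on the Transformers.js output, since both libraries report spans as character offsets into the original string.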
Add Transformers.js usage/sample code
#5
opened by Xenova (HF Staff)
No description provided.
mihaimaruseac changed pull request status to merged
Is it normal that, despite this update, I still cannot run this model with Transformers.js? (Unsupported model type: openai_privacy_filter)
Do you have the newest version of Transformers.js installed?
That's my bad: I'm just discovering JavaScript, and thanks to your comment I noticed I was fetching an older version of Transformers.js. Thank you!
mihaimaruseac deleted the refs/pr/5 ref