deepvk/USER2-base

Tags: Sentence Similarity · sentence-transformers · ONNX · Safetensors · 9 datasets · Russian · modernbert · feature-extraction · text-embeddings-inference
Papers: arXiv:2205.13147, arXiv:2401.00368
License: apache-2.0
Files and versions (branch: main, 1.2 GB)
3 contributors (incl. SpirinEgor, GoshaBoss) · History: 3 commits
Latest commit: Adding ONNX file of this model (#3) · 94bbff0 · 26 days ago
Name                               Size       Last commit message                  Updated
1_Pooling/                         -          Upload model and tokenizer           about 1 year ago
assets/                            -          Upload model and tokenizer           about 1 year ago
onnx/                              -          Adding ONNX file of this model (#3)  26 days ago
.gitattributes                     1.56 kB    Upload model and tokenizer           about 1 year ago
README.md                          14.9 kB    Adding ONNX file of this model (#3)  26 days ago
config.json                        2.27 kB    Upload model and tokenizer           about 1 year ago
config_sentence_transformers.json  359 Bytes  Upload model and tokenizer           about 1 year ago
model.safetensors                  596 MB     Upload model and tokenizer           about 1 year ago
modules.json                       229 Bytes  Upload model and tokenizer           about 1 year ago
sentence_bert_config.json          54 Bytes   Upload model and tokenizer           about 1 year ago
special_tokens_map.json            837 Bytes  Upload model and tokenizer           about 1 year ago
tokenizer.json                     4.75 MB    Upload model and tokenizer           about 1 year ago
tokenizer_config.json              21.2 kB    Upload model and tokenizer           about 1 year ago
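Individual files from the listing above can be fetched without cloning the whole 1.2 GB repository. A minimal sketch using `huggingface_hub`, assuming only the repo id and file names shown on this page:

```python
# Minimal sketch: download a single file from the deepvk/USER2-base repo.
# hf_hub_download caches the file locally and returns its path.
from huggingface_hub import hf_hub_download

config_path = hf_hub_download(repo_id="deepvk/USER2-base", filename="config.json")
print(config_path)
```

The same call with `filename="model.safetensors"` would fetch the weights alone; files are cached, so repeated calls do not re-download.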