# AlephBertGimmel

AlephBertGimmel is a Modern Hebrew pretrained BERT model with a 128K-token vocabulary.
NOTE: This model was trained only on sequences of up to 128 tokens.
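Below is a minimal usage sketch with the Hugging Face `transformers` library. The repo id `dicta-il/alephbertgimmel-base` and the Hebrew example sentence are illustrative assumptions, not confirmed by this card; substitute the actual repo id. Inputs are truncated to 128 tokens to match the training setup noted above.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumption: replace with this card's actual Hub repo id.
model_id = "dicta-il/alephbertgimmel-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# The model was trained only on sequences of up to 128 tokens,
# so truncate inputs to that length.
text = f"{tokenizer.mask_token} היא עיר הבירה של ישראל."
inputs = tokenizer(text, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Report the top prediction for the masked position.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```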
When using AlephBertGimmel, please cite:

Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker, and Reut Tsarfaty, "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All", November 2022, arXiv:2211.15199.
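For convenience, a BibTeX entry assembled from the reference above (the entry key is arbitrary):

```bibtex
@article{guetta2022alephbertgimmel,
  title   = {Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All},
  author  = {Guetta, Eylon and Shmidman, Avi and Shmidman, Shaltiel and Shmidman, Cheyn Shmuel and Guedalia, Joshua and Koppel, Moshe and Bareket, Dan and Seker, Amit and Tsarfaty, Reut},
  journal = {arXiv preprint arXiv:2211.15199},
  year    = {2022}
}
```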