ParsBERT is a monolingual language model based on Google's BERT architecture, with the same configuration as BERT-Base. It is pre-trained on large Persian corpora covering a variety of writing styles; the paper presenting ParsBERT is available on arXiv.
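Since the ParsBERT checkpoints are distributed through the Hugging Face Hub, a minimal sketch of loading the model with the transformers library could look like the following; the model id HooshvareLab/bert-base-parsbert-uncased is the one referenced later in these snippets, and the Persian example sentence is purely illustrative.

```python
# Minimal sketch: load ParsBERT and extract contextual embeddings.
# Requires: pip install transformers torch
from transformers import AutoTokenizer, AutoModel

model_name = "HooshvareLab/bert-base-parsbert-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a short Persian sentence ("The Persian language is beautiful").
inputs = tokenizer("زبان فارسی زیباست", return_tensors="pt")
outputs = model(**inputs)

# BERT-Base configuration -> hidden size 768 per token.
print(outputs.last_hidden_state.shape)
```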
BERT itself was pretrained on the English language using a masked language modeling (MLM) objective; it was introduced in the original BERT paper and first released in the accompanying repository.
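ParsBERT is trained with the same masked language modeling setup, so masked-token prediction can be sketched with the fill-mask pipeline; the checkpoint id is the same as above, and the example sentence and expected completion are assumptions for illustration.

```python
# Sketch: masked language modeling inference with ParsBERT.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="HooshvareLab/bert-base-parsbert-uncased")

# "Tehran is the [MASK] of Iran." -- a well-trained Persian MLM should
# rank "پایتخت" (capital) among the top predictions.
for prediction in fill_mask("تهران [MASK] ایران است."):
    print(prediction["token_str"], round(prediction["score"], 3))
```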
ParsBERT (v2.0) is a Transformer-based model for Persian language understanding. For this release the vocabulary was reconstructed and ParsBERT v1.1 was fine-tuned on the new corpora.
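Because v2.0 reconstructs the vocabulary, one quick way to inspect the change is to compare tokenizer vocabulary sizes between the two releases; the v2.0 model id HooshvareLab/bert-fa-base-uncased used below is an assumption, not stated in the snippets above.

```python
# Sketch: compare v1 and v2 tokenizer vocabularies.
from transformers import AutoTokenizer

v1 = AutoTokenizer.from_pretrained("HooshvareLab/bert-base-parsbert-uncased")
v2 = AutoTokenizer.from_pretrained("HooshvareLab/bert-fa-base-uncased")  # assumed v2.0 id

print("v1 vocab size:", v1.vocab_size)
print("v2 vocab size:", v2.vocab_size)
```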
Several published checkpoints are fine-tuned versions of HooshvareLab/bert-base-parsbert-uncased; one such model card describes a checkpoint fine-tuned on an unknown dataset.
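As a sketch of how such a fine-tuned variant can be produced, the following uses the transformers Trainer for sequence classification on top of the ParsBERT base checkpoint; the toy dataset, label count, and hyperparameters are placeholders rather than the ones behind the model card above.

```python
# Sketch: fine-tune ParsBERT for binary text classification with Trainer.
# The two-example dataset below is a placeholder so the script runs end to end.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "HooshvareLab/bert-base-parsbert-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

# Placeholder data: "The movie was great" (positive) / "The movie was bad" (negative).
train_dataset = Dataset.from_dict({
    "text": ["فیلم عالی بود", "فیلم بد بود"],
    "label": [1, 0],
}).map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="parsbert-finetuned",
    per_device_train_batch_size=2,
    num_train_epochs=1,
    learning_rate=2e-5,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
trainer.save_model("parsbert-finetuned")
```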