  1. google-bert/bert-base-uncased · Hugging Face

    BERT base model (uncased). Pretrained model on English using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English.
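
    A minimal sketch of using this fill-mask (MLM) model through the Hugging Face transformers pipeline; the pipeline call and the example sentence are illustrative assumptions, not part of the result above.

    ```python
    # Sketch: masked-token prediction with bert-base-uncased via the fill-mask pipeline.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model="google-bert/bert-base-uncased")

    # The model is uncased, so "English" and "english" map to the same tokens.
    for prediction in unmasker("Hello, I'm a [MASK] model."):
        print(prediction["token_str"], round(prediction["score"], 3))
    ```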

  2. google-bert/bert-base-chinese - Hugging Face

    This model has been pre-trained for Chinese; training and random input masking have been applied independently to word pieces (as in the original BERT paper). Developed by: HuggingFace team; Model Type: Fill-Mask; Language(s): Chinese; License: [More Information needed]; Parent Model: See the BERT base uncased model for more information about the ...
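
    As a sketch of the word-piece masking unit mentioned above, the Chinese tokenizer splits text into single-character pieces; this assumes the standard transformers AutoTokenizer, and the example string is illustrative.

    ```python
    # Sketch: the Chinese BERT tokenizer yields one word piece per character,
    # which is the unit that random input masking is applied to.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-chinese")
    print(tokenizer.tokenize("今天天气很好"))
    # expected output (roughly): ['今', '天', '天', '气', '很', '好']
    ```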

  3. tftransformers/bert-base-uncased - Hugging Face

    BERT base model (uncased). Pretrained model on English using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English.

  4. BERT - Hugging Face

    A BertConfig is used to instantiate a BERT model according to the specified arguments, defining the model architecture. Instantiating a configuration with the defaults will yield a similar configuration to that of the BERT google-bert/bert-base-uncased architecture. Configuration objects inherit from PretrainedConfig and can be used to control the model ...
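
    A minimal sketch of that configuration pattern, assuming the standard transformers classes; the printed attributes are only illustrative.

    ```python
    # Sketch: a default BertConfig mirrors the bert-base-uncased architecture;
    # passing it to BertModel builds a randomly initialized model.
    from transformers import BertConfig, BertModel

    config = BertConfig()      # defaults: 12 layers, hidden size 768, 12 attention heads
    model = BertModel(config)  # random weights; use BertModel.from_pretrained(...) for trained ones

    print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)
    ```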

  5. ckiplab/bert-base-chinese - Hugging Face

    CKIP BERT Base Chinese. This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).

  6. Contents/bert-base-uncased - Hugging Face

    BERT was originally released in base and large variations, for cased and uncased input text. The uncased models also strip out accent markers. Chinese and multilingual uncased and cased versions followed shortly after.
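
    A small sketch of the cased vs. uncased behaviour described above; the side-by-side comparison and the example string are assumptions added for illustration.

    ```python
    # Sketch: the uncased tokenizer lowercases text and strips accent markers,
    # while the cased tokenizer preserves both.
    from transformers import AutoTokenizer

    uncased = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")
    cased = AutoTokenizer.from_pretrained("google-bert/bert-base-cased")

    print(uncased.tokenize("Héllo Göttingen"))  # lowercased, accents stripped
    print(cased.tokenize("Héllo Göttingen"))    # case and accents preserved (may split into subwords)
    ```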

  7. google-bert/bert-base-uncased at main - Hugging Face

    bert-base-uncased at main: 14 contributors; history of 26 commits. Latest commit 86b5e09 (verified, 11 months ago): lysandre (HF staff), "Updates the tokenizer configuration file". Other entries include the coreml directory ("Add Core ML conversion (#42)", over 1 year ago), .gitattributes (491 Bytes, "Adding `safetensors` variant of this model (#15)", about 2 years ago), and LICENSE.

  8. aubmindlab/bert-base-arabertv2 - Hugging Face

    AraBERT v1 & v2: Pre-training BERT for Arabic Language Understanding. AraBERT is an Arabic pretrained language model based on Google's BERT architecture. AraBERT uses the same BERT-Base config. More details are available in the AraBERT Paper and in the AraBERT Meetup.

  9. neuralmind/bert-base-portuguese-cased · Hugging Face

    BERTimbau Base is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performance on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity, and Recognizing Textual Entailment.

  10. google-bert/bert-base-multilingual-cased · Hugging Face

    BERT multilingual base model (cased). Pretrained model on the top 104 languages with the largest Wikipedias using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository.