
cambridgeltl/SapBERT-from-PubMedBERT-fulltext

Tags:

datasets:

  • UMLS

[news] A cross-lingual extension of SapBERT will appear in the main conference of ACL 2021!
[news] SapBERT will appear in the conference proceedings of NAACL 2021!


Expected input and output

The input should be a string representing a biomedical entity name, e.g., “covid infection” or “Hydroxychloroquine”. The [CLS] embedding of the last layer is used as the output representation.
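
To make this concrete, here is a minimal sketch using the Hugging Face transformers library; the embed_names helper is a hypothetical convenience for this card, not an official API:

import torch
from transformers import AutoTokenizer, AutoModel

MODEL = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL)

def embed_names(names):
    # Tokenize a batch of entity name strings.
    inputs = tokenizer(names, padding=True, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Take the [CLS] embedding of the last layer as the entity representation.
    return outputs.last_hidden_state[:, 0, :]  # shape: (batch, hidden_size)

embeddings = embed_names(["covid infection", "Hydroxychloroquine"])
print(embeddings.shape)  # torch.Size([2, 768]) for this base-sized model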


SapBERT-PubMedBERT

SapBERT by Liu et al. (2020), trained with UMLS 2020AA (English only), using microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext as the base model.
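
Because pretraining self-aligns synonymous UMLS names, synonyms should land close together in the embedding space, so entity linking reduces to nearest-neighbor search over pre-embedded ontology names. A hedged sketch of that idea, reusing the embed_names helper above (the entity strings are illustrative, not from the model card):

import torch.nn.functional as F

# Embed a toy target ontology and one query mention.
ontology = ["myocardial infarction", "hydroxychloroquine", "covid infection"]
targets = embed_names(ontology)        # (3, hidden_size)
query = embed_names(["heart attack"])  # (1, hidden_size)

# Link the mention to its nearest ontology entry by cosine similarity.
scores = F.cosine_similarity(query, targets)  # broadcasts to shape (3,)
print(ontology[scores.argmax().item()])  # expected: "myocardial infarction"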


Citation

@inproceedings{liu-etal-2021-self,
    title = "Self-Alignment Pretraining for Biomedical Entity Representations",
    author = "Liu, Fangyu  and
      Shareghi, Ehsan  and
      Meng, Zaiqiao  and
      Basaldella, Marco  and
      Collier, Nigel",
    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jun,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2021.naacl-main.334",
    pages = "4228--4238",
    abstract = "Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERTand and PubMedBERT, our pretraining scheme proves to be both effective and robust.",
}
