cambridgeltl/SapBERT-from-PubMedBERT-fulltext
datasets:
- UMLS
[news] A cross-lingual extension of SapBERT will appear in the main conference of ACL 2021!
[news] SapBERT will appear in the conference proceedings of NAACL 2021!
Expected input and output
The input should be a string containing a biomedical entity name, e.g., "covid infection" or "Hydroxychloroquine". The [CLS] embedding of the last layer is taken as the output representation.
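For example, a minimal sketch of extracting this embedding with the Hugging Face transformers library (the batching and device handling here are illustrative, not prescribed by this card):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cambridgeltl/SapBERT-from-PubMedBERT-fulltext")
model = AutoModel.from_pretrained("cambridgeltl/SapBERT-from-PubMedBERT-fulltext")

names = ["covid infection", "Hydroxychloroquine"]
inputs = tokenizer(names, padding=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# [CLS] embedding of the last layer = position 0 of last_hidden_state
cls_embeddings = outputs.last_hidden_state[:, 0, :]  # shape: (2, 768)
```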
SapBERT-PubMedBERT
SapBERT by Liu et al. (2021). Trained with UMLS 2020AA (English only), using microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext as the base model.
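Since the model is trained so that synonymous entity names map to nearby points in embedding space, a common use is nearest-neighbour medical entity linking. Below is a hedged sketch under assumptions: the toy candidate list and the `embed` helper are hypothetical stand-ins for a real UMLS concept dictionary.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_name = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name).eval()

def embed(names):
    """Return [CLS] embeddings of the last layer for a list of entity names."""
    inputs = tokenizer(names, padding=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    return out.last_hidden_state[:, 0, :]

# Hypothetical toy candidate list; in practice this would be UMLS concept names.
candidates = ["COVID-19", "influenza", "hydroxychloroquine", "aspirin"]

query = embed(["covid infection"])
cands = embed(candidates)

# Link the mention to the candidate whose name embedding is most similar.
scores = F.cosine_similarity(query, cands)
print(candidates[scores.argmax()])  # expected: "COVID-19"
```

For a full linking pipeline one would precompute embeddings for all dictionary names and use an approximate nearest-neighbour index rather than exhaustive comparison.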
Citation
@inproceedings{liu-etal-2021-self,
title = "Self-Alignment Pretraining for Biomedical Entity Representations",
author = "Liu, Fangyu and
Shareghi, Ehsan and
Meng, Zaiqiao and
Basaldella, Marco and
Collier, Nigel",
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.naacl-main.334",
pages = "4228--4238",
abstract = "Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT, and PubMedBERT, our pretraining scheme proves to be both effective and robust.",
}