CodeBERT-base
Pretrained weights for CodeBERT: A Pre-Trained Model for Programming and Natural Languages.
Training Data
The model is trained on the bi-modal data (documentation & code pairs) of CodeSearchNet.
Training Objective
This model is initialized with RoBERTa-base and trained with the MLM+RTD objective (masked language modeling plus replaced token detection; cf. the paper).
Usage
Please see the official repository for scripts that support “code search” and “code-to-document generation”.
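For a quick start outside those scripts, the following is a minimal sketch of loading the checkpoint with the Hugging Face transformers library and embedding a paired natural-language/code input; the docstring and function below are illustrative placeholders, not examples from the paper:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")

# Bi-modal input: a natural-language description paired with code.
# Passing both segments lets the tokenizer insert the separator tokens.
nl = "return the maximum value"           # illustrative placeholder
code = "def max_value(xs): return max(xs)"  # illustrative placeholder
inputs = tokenizer(nl, code, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings of shape (batch, seq_len, 768)
print(outputs.last_hidden_state.shape)
```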
Reference
- CodeBERT trained with the Masked LM objective (suitable for code completion; see the fill-mask sketch after this list)
- 🤗 Hugging Face's CodeBERTa (small size, 6 layers)
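As a sketch of the code-completion use case, the MLM variant can be queried through the fill-mask pipeline; the masked snippet below is an illustrative example, not one from the official scripts:

```python
from transformers import pipeline

# Load the MLM-trained variant referenced above.
fill_mask = pipeline("fill-mask", model="microsoft/codebert-base-mlm")

# <mask> is RoBERTa's mask token; the model proposes completions for it.
code = "if x is not <mask>:"
for pred in fill_mask(code):
    print(pred["token_str"], pred["score"])
```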
Citation
@misc{feng2020codebert,
title={CodeBERT: A Pre-Trained Model for Programming and Natural Languages},
author={Zhangyin Feng and Daya Guo and Duyu Tang and Nan Duan and Xiaocheng Feng and Ming Gong and Linjun Shou and Bing Qin and Ting Liu and Daxin Jiang and Ming Zhou},
year={2020},
eprint={2002.08155},
archivePrefix={arXiv},
primaryClass={cs.CL}
}