facebook/dragon-plus-context-encoder

DRAGON+ is a BERT-base-sized dense retriever initialized from RetroMAE and further trained on data augmented from the MS MARCO corpus, following the approach described in *How to Train Your DRAGON: Diverse Augmentation Towards Generalizable Dense Retrieval*.

The associated GitHub repository is available at https://github.com/facebookresearch/dpr-scale/tree/main/dragon. We use an asymmetric dual encoder with two distinctly parameterized encoders. The following models are also available:

| Model | Initialization | MARCO Dev | BEIR | Query Encoder Path | Context Encoder Path |
|---|---|---|---|---|---|
| DRAGON+ | Shitao/RetroMAE | 39.0 | 47.4 | facebook/dragon-plus-query-encoder | facebook/dragon-plus-context-encoder |
| DRAGON-RoBERTa | RoBERTa-base | 39.4 | 47.2 | facebook/dragon-roberta-query-encoder | facebook/dragon-roberta-context-encoder |


Usage (HuggingFace Transformers)

The model can be used directly with HuggingFace `transformers`:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('facebook/dragon-plus-query-encoder')
query_encoder = AutoModel.from_pretrained('facebook/dragon-plus-query-encoder')
context_encoder = AutoModel.from_pretrained('facebook/dragon-plus-context-encoder')

# We use an MS MARCO-style query and passages as an example
query = "Where was Marie Curie born?"
contexts = [
    "Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
    "Born in Paris on 15 May 1859, Pierre Curie was the son of Eugène Curie, a doctor of French Catholic origin from Alsace."
]

# Apply tokenizer
query_input = tokenizer(query, return_tensors='pt')
ctx_input = tokenizer(contexts, padding=True, truncation=True, return_tensors='pt')

# Compute embeddings: take the last-layer hidden state of the [CLS] token
query_emb = query_encoder(**query_input).last_hidden_state[:, 0, :]
ctx_emb = context_encoder(**ctx_input).last_hidden_state[:, 0, :]

# Compute similarity scores using dot product
score1 = query_emb @ ctx_emb[0]  # 396.5625
score2 = query_emb @ ctx_emb[1]  # 393.8340
```
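In practice you usually score many passages against a query at once; the pairwise dot products above generalize to a single matrix product. A minimal sketch, using random stand-in tensors in place of real encoder outputs (the shapes match the 768-dimensional [CLS] embeddings produced by the BERT-base-sized encoders above):

```python
import torch

# Stand-ins for encoder outputs: one query embedding and a batch of
# four context embeddings, each 768-dimensional as in BERT-base.
torch.manual_seed(0)
query_emb = torch.randn(1, 768)
ctx_emb = torch.randn(4, 768)

# Score all contexts in one matrix product: [1, 768] @ [768, 4] -> [1, 4]
scores = query_emb @ ctx_emb.T

# Index of the highest-scoring (most relevant) passage
best = scores.argmax(dim=1)
```

With real embeddings, ranking the contexts by these dot-product scores gives the retrieval order; no normalization is applied, since DRAGON is trained with unnormalized dot-product similarity.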
