cross-encoder-ettin-17m-MarginMSE


This model is a cross-encoder based on jhu-clsp/ettin-encoder-17m. It was trained on MS MARCO with the MarginMSE loss as part of a reproducibility paper on training cross-encoders, "Reproducing and Comparing Distillation Techniques for Cross-Encoders"; see the paper for more details.


Model Description

This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE).

  • Training Data: MS MARCO Passage
  • Language: English
  • Loss: MarginMSE

Training can easily be reproduced using the associated repository. The exact training configuration used for this model is also detailed in config.yaml.
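
As a reminder of the objective, MarginMSE distils a teacher's score margin between a positive and a negative passage into the student: the student's margin is regressed onto the teacher's margin with a mean squared error. A minimal sketch of the loss is shown below (the actual training loop and teacher scores live in the associated repository; this is only an illustration):

import torch.nn.functional as F

def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
    # Regress the student's (positive - negative) score margin
    # onto the teacher's margin for the same (query, passage) pairs.
    return F.mse_loss(student_pos - student_neg, teacher_pos - teacher_neg)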

Usage

Quick Start:

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("xpmir/cross-encoder-ettin-17m-MarginMSE")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-ettin-17m-MarginMSE")

features = tokenizer("What is experimaestro ?", "Experimaestro is a powerful framework for ML experiments management...", padding=True, truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    scores = model(**features).logits
    print(scores)
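
Building on the snippet above, one query can also be scored against several candidate passages in a single batch, and the passages sorted by score. The passages below are illustrative only:

query = "What is experimaestro ?"
passages = [
    "Experimaestro is a powerful framework for ML experiments management...",
    "MS MARCO is a passage ranking collection built from Bing queries.",
]

# Score each (query, passage) pair; higher logits mean higher estimated relevance.
features = tokenizer([query] * len(passages), passages, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**features).logits.squeeze(-1)

for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.2f}  {passage}")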

Evaluations

We evaluate this cross-encoder by re-ranking the top 1000 documents retrieved by naver/splade-v3-distilbert; RR@10 and nDCG@10 are reported for each dataset.

| Dataset | RR@10 | nDCG@10 |
|---|---|---|
| msmarco_dev | 30.19 | 35.85 |
| trec2019 | 82.75 | 59.06 |
| trec2020 | 84.78 | 58.81 |
| fever | 68.73 | 70.30 |
| arguana | 18.92 | 28.38 |
| climate_fever | 24.94 | 18.41 |
| dbpedia | 60.12 | 34.31 |
| fiqa | 37.39 | 30.22 |
| hotpotqa | 79.72 | 63.19 |
| nfcorpus | 44.37 | 25.27 |
| nq | 42.58 | 47.43 |
| quora | 78.84 | 80.04 |
| scidocs | 23.43 | 12.92 |
| scifact | 61.12 | 64.06 |
| touche | 59.18 | 31.98 |
| trec_covid | 86.01 | 64.39 |
| robust04 | 54.84 | 33.24 |
| lotte_writing | 54.54 | 46.38 |
| lotte_recreation | 51.88 | 46.95 |
| lotte_science | 41.42 | 33.95 |
| lotte_technology | 44.98 | 36.89 |
| lotte_lifestyle | 66.51 | 57.08 |
| Mean (in-domain) | 65.91 | 51.24 |
| Mean (BEIR, 13 datasets) | 52.72 | 43.92 |
| Mean (LoTTE, OOD) | 52.36 | 42.42 |
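
As a side note, metrics like these can be computed from a TREC-format run and qrels with the ir_measures package; a minimal sketch, where the file names are placeholders rather than files shipped with this model:

import ir_measures
from ir_measures import RR, nDCG

# Relevance judgements and the re-ranked run, both in TREC format.
qrels = ir_measures.read_trec_qrels("dataset.qrels")
run = ir_measures.read_trec_run("rerank_top1000.run")

# Aggregate RR@10 and nDCG@10 over all queries.
print(ir_measures.calc_aggregate([RR@10, nDCG@10], qrels, run))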