Where Relevance Emerges: A Layer-Wise Study of Internal Attention for Zero-Shot Re-Ranking

This repository contains reproduction code for the paper "Where Relevance Emerges: A Layer-Wise Study of Internal Attention for Zero-Shot Re-Ranking".

Overview

This work investigates how relevance signals are distributed across transformer layers in Large Language Models (LLMs) for zero-shot document re-ranking. The main contributions include:

  1. Layer-wise Analysis: Discovering a universal "Bell-Curve" distribution of relevance signals across transformer layers
  2. Selective-ICR: A strategy that focuses on high-signal layers, reducing inference latency by 30%-50% without compromising effectiveness (see the illustrative sketch after this list)
  3. Unified Comparison: Systematic evaluation of three scoring mechanisms (generation, likelihood, internal attention) across Listwise and Setwise ranking frameworks
  4. Reasoning-Intensive Tasks: Demonstrating that attention-based re-ranking has high potential on reasoning-intensive tasks
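
The exact layer-wise scoring procedure lives in icr/; the snippet below is only a minimal sketch of the general idea, aggregating query-to-document attention over a chosen band of layers with Hugging Face transformers. The model name, the selected layer range, and the mean-aggregation rule are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of attention-based relevance scoring over selected layers.
# NOT the repository's implementation: the model, the layer band, and the
# mean aggregation below are illustrative assumptions only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "Qwen/Qwen2.5-7B-Instruct"  # hypothetical choice of backbone

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, torch_dtype=torch.bfloat16)
model.eval()

def attention_score(query: str, document: str, layers=range(8, 20)) -> float:
    """Score a (query, document) pair by how much the query tokens attend
    to the document tokens, averaged over the selected layers and heads."""
    layers = list(layers)
    doc_ids = tokenizer(document, return_tensors="pt").input_ids
    query_ids = tokenizer(query, return_tensors="pt", add_special_tokens=False).input_ids
    # Document first, query last, so query tokens can attend back to the document.
    input_ids = torch.cat([doc_ids, query_ids], dim=1)
    with torch.no_grad():
        out = model(input_ids, output_attentions=True)
    doc_len = doc_ids.shape[1]
    score = 0.0
    for layer in layers:                 # restrict to the "high-signal" band
        attn = out.attentions[layer][0]  # (num_heads, seq_len, seq_len)
        score += attn[:, doc_len:, :doc_len].float().mean().item()
    return score / len(layers)

# Re-ranking then amounts to sorting candidate documents by attention_score(query, doc).
```

Note that this sketch still runs the full forward pass; the latency savings of Selective-ICR come from skipping computation beyond the high-signal layers, for which see the actual implementation in icr/.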

Repository Structure

This codebase reproduces two main components:

1. icr/ - In-Context Reranking (ICR)

2. llm-rankers/ - LLM-based Ranking Methods

Key Research Questions

  • RQ1: How is the relevance signal distributed across transformer layers, and do all layers contribute equally?
  • RQ2: How does attention-based scoring compare with generative and likelihood-based methods in Listwise and Setwise frameworks?
  • RQ3: Does attention-based ranking remain effective on reasoning-intensive tasks, and is the layer-wise signal distribution universal?

Quick Start

See the README files in icr/ and llm-rankers/ for detailed installation and usage instructions.

Experiment Results:

Download from Google Drive: https://drive.google.com/file/d/1Myurx_3DsnBcSq6Em9pb0Y2k_iXX5sV7/view?usp=drive_link
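
One way to fetch the archive programmatically is with the third-party gdown package (not a dependency of this repository); the output filename below is arbitrary.

```python
# Optional download helper -- requires `pip install gdown` (not a repo dependency).
import gdown

# URL is the Google Drive link above; the output filename is arbitrary.
url = "https://drive.google.com/file/d/1Myurx_3DsnBcSq6Em9pb0Y2k_iXX5sV7/view?usp=drive_link"
gdown.download(url, output="experiment_results.zip", fuzzy=True)  # fuzzy=True parses the /view link
```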

Citation

If you use this code, please cite the original papers:

TBD
