
# MSANet: Multi-Similarity and Attention Guidance for Boosting Few-Shot Segmentation
This is the official implementation of the paper [MSANet: Multi-Similarity and Attention Guidance for Boosting Few-Shot Segmentation](https://arxiv.org/abs/2206.09667)

[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/msanet-multi-similarity-and-attention-1/few-shot-semantic-segmentation-on-coco-20i-1)](https://paperswithcode.com/sota/few-shot-semantic-segmentation-on-coco-20i-1?p=msanet-multi-similarity-and-attention-1)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/msanet-multi-similarity-and-attention-1/few-shot-semantic-segmentation-on-coco-20i-5)](https://paperswithcode.com/sota/few-shot-semantic-segmentation-on-coco-20i-5?p=msanet-multi-similarity-and-attention-1)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/msanet-multi-similarity-and-attention-1/few-shot-semantic-segmentation-on-pascal-5i-1)](https://paperswithcode.com/sota/few-shot-semantic-segmentation-on-pascal-5i-1?p=msanet-multi-similarity-and-attention-1)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/msanet-multi-similarity-and-attention-1/few-shot-semantic-segmentation-on-pascal-5i-5)](https://paperswithcode.com/sota/few-shot-semantic-segmentation-on-pascal-5i-5?p=msanet-multi-similarity-and-attention-1)

Authors: Ehtesham Iqbal, Sirojbek Safarov, Seongdeok Bang

> **Abstract:** *Few-shot segmentation aims to segment unseen-class objects given only a handful of densely labeled samples. Prototype learning, where the support feature yields a single or several prototypes by averaging global and local object information, has been widely used in FSS. However, utilizing only prototype vectors may be insufficient to represent the features for all training data. To extract abundant features and make more precise predictions, we propose a Multi-Similarity and Attention Network (MSANet) including two novel modules, a multi-similarity module and an attention module. The multi-similarity module exploits multiple feature-maps of support images and query images to estimate accurate semantic relationships. The attention module instructs the network to concentrate on class-relevant information. The network is tested on standard FSS datasets: PASCAL-5i 1-shot, PASCAL-5i 5-shot, COCO-20i 1-shot, and COCO-20i 5-shot. MSANet with a ResNet-101 backbone achieves state-of-the-art performance on all four benchmark settings, with mean intersection over union (mIoU) of 69.13%, 73.99%, 51.09%, and 56.80%, respectively.*
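
To make the multi-similarity idea concrete, here is a minimal, hypothetical PyTorch sketch (ours, not the authors' implementation): for each backbone layer it densely correlates the query feature map with the masked support feature map and keeps, at every query location, the strongest foreground match. The function name and shapes are illustrative only; see the code in this repo for the actual module.

```python
import torch
import torch.nn.functional as F

def multi_similarity_maps(query_feats, support_feats, support_mask):
    """Illustrative sketch of per-layer query/support similarity.

    query_feats, support_feats: lists of (B, C_l, H_l, W_l) feature maps
        taken from several backbone layers.
    support_mask: (B, 1, H, W) binary foreground mask of the support image.
    Returns a (B, num_layers, H_0, W_0) stack of similarity maps.
    """
    sim_maps = []
    for q, s in zip(query_feats, support_feats):
        b, c, hq, wq = q.shape
        # Restrict the support features to the annotated foreground.
        m = F.interpolate(support_mask, size=s.shape[-2:], mode="bilinear",
                          align_corners=True)
        q_flat = F.normalize(q.flatten(2), dim=1)        # (B, C_l, Hq*Wq)
        s_flat = F.normalize((s * m).flatten(2), dim=1)  # (B, C_l, Hs*Ws)
        # Dense cosine correlation between every query/support location pair.
        corr = torch.bmm(q_flat.transpose(1, 2), s_flat)  # (B, Hq*Wq, Hs*Ws)
        # Keep the best-matching foreground support location per query pixel.
        sim_maps.append(corr.max(dim=2).values.view(b, 1, hq, wq))
    # Bring all layers to a common resolution and stack them as channels.
    target = sim_maps[0].shape[-2:]
    sim_maps = [F.interpolate(sm, size=target, mode="bilinear",
                              align_corners=True) for sm in sim_maps]
    return torch.cat(sim_maps, dim=1)
```

Correlation channels of this kind could then be fed, together with the query features, into a segmentation decoder; MSANet additionally applies its attention module to emphasize class-relevant information (see the paper for the exact architecture).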

<p align="middle">
<img src="figure/Main.png">

- COCO-20<sup>i</sup>: [COCO2014](https://cocodataset.org/#download)

Download the data [lists](https://drive.google.com/uc?export=download&id=1J1vPgctx8ojlkswMZS9Ccc8_bU6Zuej4) (.txt files) and put them into the `MSANet/lists` directory.

### Models

- Download the pre-trained backbones from [here](https://drive.google.com/uc?export=download&id=1bWVt8OZt2pSDtXn_XLDbR5obSTVrUpN9) and put them into the `MSANet/initmodel` directory.
- Download our trained base learners from [Drive](https://drive.google.com/uc?export=download&id=1pPOC2rsSPMTm7B4Dr2b1yJnsMqa20u98) and put them under `initmodel/PSPNet`.
- We provide all trained MSANet [models](https://drive.google.com/uc?export=download&id=14ILPhyKXva9N8pZB495T5DC5081v9IJj) for performance evaluation. _Backbone: VGG16 & ResNet50; Dataset: PASCAL-5<sup>i</sup> & COCO-20<sup>i</sup>; Setting: 1-shot & 5-shot_.

### Scripts

- Change the configuration and add the weight path to the `.yaml` files in `MSANet/config`, then run `test.py` for testing.
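
For example, a test launch might look like the snippet below. Note that the config filename and the `--config` flag are assumptions for illustration; check the files under `MSANet/config` and the argument parsing in `test.py` for the actual interface.

```python
# Hypothetical launcher; the config path and the "--config" flag are
# assumptions -- verify against test.py's actual argument parser.
import subprocess

subprocess.run(
    ["python", "test.py",
     "--config", "config/pascal/pascal_split0_resnet50.yaml"],
    check=True,
)
```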

### Performance

Performance comparison with the state-of-the-art approaches (*i.e.*, [HSNet](https://github.com/juhongm999/hsnet), [BAM](https://github.com/chunbolang/BAM) and [VAT](https://github.com/Seokju-Cho/Volumetric-Aggregation-Transformer)) in terms of **average mIoU** across all folds.
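
As a reminder of the metric, mIoU averages per-class intersection-over-union over a fold's novel classes, and the reported number averages that score across all four folds. A self-contained toy sketch (illustrative counts only, not from this repo):

```python
import numpy as np

def mean_iou(intersection, union):
    """Per-class IoU averaged over classes; the counts are pixel totals
    accumulated over one fold's test episodes."""
    iou = intersection / np.maximum(union, 1)  # guard against empty classes
    return iou.mean()

# Toy per-class (intersection, union) pixel counts for four folds.
folds = [
    (np.array([80.0, 60.0]), np.array([100.0, 100.0])),
    (np.array([70.0, 75.0]), np.array([100.0, 100.0])),
    (np.array([65.0, 68.0]), np.array([100.0, 100.0])),
    (np.array([72.0, 66.0]), np.array([100.0, 100.0])),
]
avg_miou = np.mean([mean_iou(i, u) for i, u in folds])
print(f"average mIoU: {avg_miou:.2%}")
```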

1. ##### PASCAL-5<sup>i</sup>


## References

This repository is built mainly upon [PFENet](https://github.com/dvlab-research/PFENet), [HSNet](https://github.com/juhongm999/hsnet), and [BAM](https://github.com/chunbolang/BAM). Thanks for their great work!


### BibTeX
If you find this research useful, please consider citing:
````BibTeX
@article{MSANet2022,
  title={MSANet: Multi-Similarity and Attention Guidance for Boosting Few-Shot Segmentation},
  author={Ehtesham Iqbal and Sirojbek Safarov and Seongdeok Bang},
  journal={arXiv preprint arXiv:2206.09667},
  year={2022}
}
````