Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/144247
Title: CRNet: Cross-reference networks for few-shot segmentation
Authors: Liu, Weide; Zhang, Chi; Lin, Guosheng; Liu, Fayao
Keywords: Engineering::Computer science and engineering
Issue Date: 2020
Source: Liu, W., Zhang, C., Lin, G., & Liu, F. (2020). CRNet: Cross-reference networks for few-shot segmentation. Proceedings of the 2020 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). doi:10.1109/CVPR42600.2020.00422
Project: AISG-RP-2018-003; RG126/17 (S); RG22/19 (S)
Conference: 2020 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Abstract: Over the past few years, state-of-the-art image segmentation algorithms have been based on deep convolutional neural networks. To give a deep network the ability to understand a concept, humans need to collect a large amount of pixel-level annotated data to train the model, which is time-consuming and tedious. Recently, few-shot segmentation has been proposed to address this problem. Few-shot segmentation aims to learn a segmentation model that can be generalized to novel classes with only a few training images. In this paper, we propose a cross-reference network (CRNet) for few-shot segmentation. Unlike previous works, which only predict the mask in the query image, our proposed model concurrently makes predictions for both the support image and the query image. With a cross-reference mechanism, our network can better find the co-occurring objects in the two images, which helps the few-shot segmentation task. We also develop a mask refinement module to recurrently refine the prediction of the foreground regions. For k-shot learning, we propose to finetune parts of the network to take advantage of multiple labeled support images. Experiments on the PASCAL VOC 2012 dataset show that our network achieves state-of-the-art performance.
URI: https://hdl.handle.net/10356/144247
DOI: 10.1109/CVPR42600.2020.00422
Schools: School of Computer Science and Engineering
Rights: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/CVPR42600.2020.00422
Fulltext Permission: open
Fulltext Availability: With Fulltext
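The sketch below illustrates the cross-reference idea described in the abstract: features from the support and query branches are mutually gated so that only channels activated in both images are emphasized, highlighting co-occurring objects. This is a minimal PyTorch sketch and not the authors' released implementation; the module name `CrossReferenceModule`, the channel count, and the pooling-plus-sigmoid gating layers are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's official code): a minimal
# cross-reference-style module that fuses query and support feature maps
# so that channels strongly activated in BOTH images are reinforced.
import torch
import torch.nn as nn


class CrossReferenceModule(nn.Module):
    """Mutually gate support/query features to highlight co-occurring objects."""

    def __init__(self, channels: int = 256):  # channel count is an assumption
        super().__init__()
        # Per-branch channel attention: global pooling -> 1x1 conv -> sigmoid.
        self.query_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.support_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, query_feat: torch.Tensor, support_feat: torch.Tensor):
        # Channel-importance vectors for each branch, shape (B, C, 1, 1).
        q_gate = self.query_gate(query_feat)
        s_gate = self.support_gate(support_feat)
        # Cross-reference: only channels important in BOTH images survive.
        common = q_gate * s_gate
        # Reinforce the co-occurring features in both branches, so the network
        # can predict masks for the support and the query image concurrently.
        return query_feat * common, support_feat * common


if __name__ == "__main__":
    # Toy usage: 256-channel feature maps at 1/8 resolution for both branches.
    q = torch.randn(2, 256, 32, 32)
    s = torch.randn(2, 256, 32, 32)
    module = CrossReferenceModule(channels=256)
    q_out, s_out = module(q, s)
    print(q_out.shape, s_out.shape)  # torch.Size([2, 256, 32, 32]) each
```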
Appears in Collections: SCSE Conference Papers
Files in This Item:

File | Description | Size | Format
---|---|---|---
weide 01916.pdf | | 2.42 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.