Concealed object detection

DP Fan, GP Ji, MM Cheng… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
We present the first systematic study on concealed object detection (COD), which aims to identify objects that are visually embedded in their background. The high intrinsic similarity between concealed objects and their background makes COD far more challenging than traditional object detection/segmentation. To better understand this task, we collect a large-scale dataset, called COD10K, which consists of 10,000 images covering concealed objects in diverse real-world scenarios from 78 object categories. Further, we provide rich annotations including object categories, object boundaries, challenging attributes, object-level labels, and instance-level annotations. Our COD10K is the largest COD dataset to date, with the richest annotations, which enables comprehensive concealed object understanding and can even be used to help progress several other vision tasks, such as detection, segmentation, classification, etc. Motivated by how animals hunt in the wild, we also design a simple but strong baseline for COD, termed the Search Identification Network (SINet). Without any bells and whistles, SINet outperforms twelve cutting-edge baselines on all datasets tested, making it a robust, general architecture that could serve as a catalyst for future research in COD. Finally, we provide some interesting findings, and highlight several potential applications and future directions. To spark research in this new field, our code, dataset, and online demo are available at our project page: http://mmcheng.net/cod.
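The abstract describes SINet only at the level of its two-stage "search, then identify" idea. The sketch below is a minimal, illustrative PyTorch rendering of that idea, not the authors' implementation: the plain convolutional backbone, module names (SearchModule, IdentificationModule, ToySearchIdentifyNet), channel sizes, and the 352x352 input size are all assumptions chosen for brevity. The actual SINet uses a pretrained backbone and more elaborate receptive-field and attention components; see the paper and the project page for details.

```python
# Illustrative sketch of a search-then-identify pipeline (NOT the official SINet).
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """3x3 conv -> BN -> ReLU; a generic building block (assumption)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class SearchModule(nn.Module):
    """Coarsely locates the concealed object, producing a low-resolution search map."""
    def __init__(self, in_ch=3, feat_ch=32):
        super().__init__()
        self.encoder = nn.Sequential(
            conv_block(in_ch, feat_ch), nn.MaxPool2d(2),
            conv_block(feat_ch, feat_ch * 2), nn.MaxPool2d(2),
            conv_block(feat_ch * 2, feat_ch * 4),
        )
        self.search_head = nn.Conv2d(feat_ch * 4, 1, 1)  # coarse location logits

    def forward(self, x):
        feats = self.encoder(x)
        return feats, self.search_head(feats)


class IdentificationModule(nn.Module):
    """Refines the coarse search map into a full-resolution segmentation mask."""
    def __init__(self, feat_ch=32):
        super().__init__()
        self.refine = nn.Sequential(
            conv_block(feat_ch * 4 + 1, feat_ch * 2),
            conv_block(feat_ch * 2, feat_ch),
        )
        self.seg_head = nn.Conv2d(feat_ch, 1, 1)

    def forward(self, feats, search_map, out_size):
        # Use the coarse search map as spatial guidance for the refinement stage.
        guided = torch.cat([feats, torch.sigmoid(search_map)], dim=1)
        seg = self.seg_head(self.refine(guided))
        return F.interpolate(seg, size=out_size, mode="bilinear", align_corners=False)


class ToySearchIdentifyNet(nn.Module):
    """Two-stage pipeline: search (where is the object?) then identify (exact mask)."""
    def __init__(self):
        super().__init__()
        self.search = SearchModule()
        self.identify = IdentificationModule()

    def forward(self, x):
        feats, search_map = self.search(x)
        return self.identify(feats, search_map, out_size=x.shape[-2:])


if __name__ == "__main__":
    model = ToySearchIdentifyNet()
    image = torch.randn(1, 3, 352, 352)   # 352x352 input is an assumption
    mask_logits = model(image)             # (1, 1, 352, 352) segmentation logits
    print(mask_logits.shape)
```

The design choice the sketch mirrors is the animal-hunting analogy from the abstract: a first stage scans the scene for a candidate location (search), and a second stage attends to that location to delineate the object precisely (identification).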