Cloth-Changing Person Re-Identification With Noisy Patch Filtering

Abstract:

Cloth-changing person re-identification (ReID) aims to recognize the identities of persons even when they change clothes. Even for the same person, changes in clothing can cause significant visual variations, making identification challenging. Therefore, a technique is needed that can extract a person's inherent characteristics while remaining invariant to changes in clothing. Recent studies have utilized additional information such as gait, body parsing maps, and 3D shape to address clothing variation. However, these methods require additional processing, and their effectiveness depends on the quality of the information provided. In this letter, we propose a two-stream model for ReID based on a single RGB image, consisting of a baseline ReID network and cloth-unrelated ReID networks. The baseline network takes the full RGB image as input, while the cloth-unrelated networks receive only cropped patches of the RGB image that contain cloth-independent identity cues such as the face and legs. We also propose a noisy patch filtering module (NPFM) to remove interference from noisy patches during training. Finally, we combine the features from the baseline and cloth-unrelated ReID networks to perform cloth-changing person ReID. Our approach achieves a significant improvement of +7.4% in Rank-1 accuracy and +1.7% in mAP over the recent state-of-the-art method in cloth-changing environments. The effectiveness of our model for cloth-changing person ReID is verified on a benchmark dataset with various ablation studies. In summary, our method extracts the unique features of an individual person in a cloth-changing environment and trains robustly even with occluded or poorly captured data, making it more versatile in practical environments than existing ReID methods.
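To make the described two-stream design concrete, the following PyTorch sketch shows one possible realization. The module names (TwoStreamReID, NoisyPatchFilter), the ResNet-18 backbones, and the threshold-based patch-scoring rule used as an NPFM stand-in are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a two-stream cloth-changing ReID model, assuming PyTorch
# and torchvision. The NPFM stand-in scores each patch and gates out
# low-confidence (noisy) patches; the real module may differ.
import torch
import torch.nn as nn
import torchvision.models as models


class NoisyPatchFilter(nn.Module):
    """Assumed NPFM stand-in: suppresses low-scoring patches during training."""

    def __init__(self, feat_dim: int):
        super().__init__()
        self.scorer = nn.Sequential(nn.Linear(feat_dim, 1), nn.Sigmoid())

    def forward(self, patch_feats: torch.Tensor) -> torch.Tensor:
        # patch_feats: (batch, num_patches, feat_dim)
        scores = self.scorer(patch_feats)        # (batch, num_patches, 1)
        mask = (scores > 0.5).float()            # hard gate on noisy patches
        # Straight-through estimator so the scorer still receives gradients.
        gate = mask + scores - scores.detach()
        return (patch_feats * gate).mean(dim=1)  # filtered patch feature


class TwoStreamReID(nn.Module):
    def __init__(self, num_ids: int, feat_dim: int = 512):
        super().__init__()
        # Baseline stream: encodes the full RGB image.
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, feat_dim)
        self.baseline = backbone
        # Cloth-unrelated stream: shared encoder over cropped patches
        # (e.g., face and leg crops), followed by the patch filter.
        patch_net = models.resnet18(weights=None)
        patch_net.fc = nn.Linear(patch_net.fc.in_features, feat_dim)
        self.patch_encoder = patch_net
        self.npfm = NoisyPatchFilter(feat_dim)
        self.classifier = nn.Linear(2 * feat_dim, num_ids)

    def forward(self, image: torch.Tensor, patches: torch.Tensor):
        # image:   (batch, 3, H, W) full RGB frame
        # patches: (batch, num_patches, 3, h, w) face/leg crops
        b, p = patches.shape[:2]
        global_feat = self.baseline(image)
        patch_feats = self.patch_encoder(patches.flatten(0, 1)).view(b, p, -1)
        patch_feat = self.npfm(patch_feats)
        # Combine baseline and cloth-unrelated features for identification.
        fused = torch.cat([global_feat, patch_feat], dim=1)
        return fused, self.classifier(fused)
```

At inference, the fused feature would serve as the ReID descriptor, while the identity logits drive a standard classification loss during training; both choices are assumptions consistent with common ReID practice rather than details stated in the abstract.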