
Ing. Vladan Stojnić

All publications

Label Propagation for Zero-shot Classification with Vision-Language Models

  • Authors: Ing. Vladan Stojnić, Kalantidis, Y., doc. Georgios Tolias, Ph.D.
  • Publication: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). Los Alamitos: IEEE Computer Society, 2024. p. 23209-23218. ISSN 2575-7075. ISBN 979-8-3503-5300-6.
  • Year: 2024
  • DOI: 10.1109/CVPR52733.2024.02190
  • Link: https://doi.org/10.1109/CVPR52733.2024.02190
  • Department: Visual Recognition Group
  • Annotation:
    Vision-Language Models (VLMs) have demonstrated impressive performance on zero-shot classification, i.e. classification when provided merely with a list of class names. In this paper, we tackle the case of zero-shot classification in the presence of unlabeled data. We leverage the graph structure of the unlabeled data and introduce ZLaP, a method based on label propagation (LP) that utilizes geodesic distances for classification. We tailor LP to graphs containing both text and image features and further propose an efficient method for performing inductive inference based on a dual solution and a sparsification step. We perform extensive experiments to evaluate the effectiveness of our method on 14 common datasets and show that ZLaP outperforms the latest related works. Code: https://github.com/vladan-stojnic/ZLaP
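    The core idea described above can be illustrated with a minimal sketch of standard label propagation over a joint graph of text and image features. This is not the ZLaP implementation (see the linked repository for that); it is a generic sketch assuming L2-normalized features, a cosine-similarity kNN graph, and the classic normalized-iteration update, where the text embeddings act as the labeled class nodes.

    ```python
    import numpy as np

    def knn_graph(feats, k):
        # Cosine-similarity kNN adjacency; features are assumed L2-normalized.
        sims = feats @ feats.T
        np.fill_diagonal(sims, -np.inf)
        W = np.zeros_like(sims)
        idx = np.argsort(-sims, axis=1)[:, :k]
        rows = np.arange(len(feats))[:, None]
        W[rows, idx] = np.maximum(sims[rows, idx], 0)  # keep non-negative edges
        return np.maximum(W, W.T)  # symmetrize

    def label_propagation(img_feats, text_feats, alpha=0.9, iters=30, k=3):
        # Build one graph over text (class) nodes and image nodes.
        feats = np.vstack([text_feats, img_feats])
        n_cls = len(text_feats)
        W = knn_graph(feats, k)
        d = W.sum(axis=1)
        d[d == 0] = 1.0
        d_inv_sqrt = 1.0 / np.sqrt(d)
        S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]  # symmetric normalization
        Y = np.zeros((len(feats), n_cls))
        Y[:n_cls] = np.eye(n_cls)  # text nodes carry one-hot class labels
        F = Y.copy()
        for _ in range(iters):
            F = alpha * (S @ F) + (1 - alpha) * Y  # propagate labels along edges
        return F[n_cls:].argmax(axis=1)  # predicted classes for image nodes
    ```

    The hyperparameters (`alpha`, `k`, `iters`) are illustrative defaults, not values from the paper.
    
    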

Training Ensembles with Inliers and Outliers for Semi-supervised Active Learning

  • Authors: Ing. Vladan Stojnić, Laskar, Z., doc. Georgios Tolias, Ph.D.
  • Publication: 2024 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV). Piscataway: IEEE, 2024. p. 259-268. ISSN 2642-9381. ISBN 979-8-3503-1892-0.
  • Year: 2024
  • DOI: 10.1109/WACV57701.2024.00033
  • Link: https://doi.org/10.1109/WACV57701.2024.00033
  • Department: Visual Recognition Group
  • Annotation:
    Deep active learning in the presence of outlier examples poses a realistic yet challenging scenario. Acquiring unlabeled data for annotation requires a delicate balance between avoiding outliers to conserve the annotation budget and prioritizing useful inlier examples for effective training. In this work, we present an approach that leverages three highly synergistic components, which are identified as key ingredients: joint classifier training with inliers and outliers, semi-supervised learning through pseudo-labeling, and model ensembling. Our work demonstrates that ensembling significantly enhances the accuracy of pseudo-labeling and improves the quality of data acquisition. By enabling semi-supervision through the joint training process, where outliers are properly handled, we observe a substantial boost in classifier accuracy through the use of all available unlabeled examples. Notably, we reveal that the integration of joint training renders explicit outlier detection, a conventional component of acquisition in prior work, unnecessary. The three key components align seamlessly with numerous existing approaches, and through empirical evaluations we showcase that their combined use leads to a performance increase. Remarkably, despite its simplicity, our proposed approach outperforms all other methods. Code: https://github.com/vladan-stojnic/active-outliers
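    The ensembling-plus-pseudo-labeling ingredient above can be sketched generically: average the softmax outputs of the ensemble members and keep only confident predictions as pseudo-labels. This is a minimal illustration of the general technique, not the paper's implementation; the threshold and the simple mean are assumptions.

    ```python
    import numpy as np

    def ensemble_pseudo_labels(probs_per_model, threshold=0.8):
        # probs_per_model: list of (n_samples, n_classes) softmax outputs,
        # one array per ensemble member.
        mean_probs = np.mean(probs_per_model, axis=0)  # average member predictions
        labels = mean_probs.argmax(axis=1)             # candidate pseudo-labels
        confidence = mean_probs.max(axis=1)
        keep = confidence >= threshold                 # accept only confident ones
        return labels, keep
    ```

    In a semi-supervised loop, the accepted pairs `(labels[keep])` would be fed back into joint training alongside the labeled inliers and outliers.
    
    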

Responsible person Ing. Mgr. Radovan Suk