Name: Krisztián Koós
Affiliation: GE Healthcare
Primary research interest: medical image processing, self-supervised learning
Title of the lecture: Multi-Anatomy X-ray Foundation Model
Keywords: self-supervised learning, radiology, x-ray, foundation models
Summary: Self-supervised learning (SSL) has emerged as a powerful approach for developing general-purpose models that perform exceptionally well across a variety of downstream tasks, including classification and segmentation. Among SSL methods, DINOv2 stands out as one of the most prominent. In the medical domain, and particularly in X-ray imaging, foundation models are gaining significant traction. Chest-specific models such as RadDINO and RayDINO demonstrate the effectiveness of SSL using imaging data alone. This talk presents a novel multi-anatomy X-ray model pretrained with self-supervised learning. The model is evaluated on a diverse set of tasks, including image-to-image retrieval, anatomical localization, and report generation, showcasing its versatility and generalization capabilities.