We are a research group headed by Prof. Jy-yong Sohn in the Department of Applied Statistics at Yonsei University.
We focus on research topics at the intersection of information theory and machine learning. More broadly, we explore various topics in machine learning and artificial intelligence, using mathematical tools from information theory, optimization, learning theory, and probability & statistics. Current research topics include
Foundation Models (including Large Language Models)
Representation Learning
Efficient Machine Learning
For details on each research topic, please check this page. If you are interested in joining our group, please contact Prof. Sohn.
[ICML'25] Measuring Representational Shifts in Continual Learning: A Linear Transformation Perspective
[ICML'25] On the Similarities of Embeddings in Contrastive Learning
[AISTATS'25] A Theoretical Framework for Preventing Class Collapse in Supervised Contrastive Learning
[UAI'24] Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks
[AISTATS'24] Analysis of Using Sigmoid Loss for Contrastive Learning
[ICML'23] Looped Transformers as Programmable Computers
[ICLR'23] Equal Improvability: A New Fairness Notion Considering the Long-term Impact
[AAAI'23] Can We Find Strong Lottery Tickets in Generative Models?
NRF Korea, Basic Research Lab (기초연구실), 2024 - 2027
NRF Korea, Outstanding Young Scientist (우수신진), 2024 - 2027
Excellence in Teaching Award, Yonsei University, 2023
Role of Mathematics in the Era of AI [SKIS, 2025 Mar] [HYU, 2025 Feb]
Theory of Contrastive Learning [KSS, 2025 Jan] [NIMS ICIM, 2024 Nov] [Krafton AI, 2024 Oct] [KIAS, 2024 Sep]
Theory of Foundation Models [KICS, 2024 Nov] [KSIAM, 2024 Aug]
Theory of Efficient Machine Learning [KIAS, 2023 Dec]
Large Language Models (LLMs) [KICS, 2024 Feb] [Linq, 2023 Jul] [Yonsei Institute of Data Science, 2023 Apr]