Information Theory and Machine Learning Lab
We are a research group headed by Prof. Jy-yong Sohn in the Department of Applied Statistics at Yonsei University.
We focus on research topics at the intersection of information theory and machine learning. More broadly, we explore a range of topics in machine learning and artificial intelligence, using mathematical tools from information theory, optimization, learning theory, and probability & statistics. Current research topics include:
Efficient ML
Foundation Models
Representation Learning
For details on each research topic, please check this page. If you are interested in joining our group, please contact us.
Selected Publications
[TMLR'24] Mini-Batch Optimization of Contrastive Loss
[UAI'24] Memorization Capacity for Additive Fine-Tuning
[AISTATS'24] Analysis of Using Sigmoid Loss for Contrastive Learning
[NAACL WS'24] Improving Multi-lingual Alignment Through Soft Contrastive Learning
[NAACL WS'24] ERD: A Framework for Improving LLM Reasoning for Cognitive Distortion Classification
[ICLR WS'24] Re-Ex: Revising after Explanation Reduces the Factual Errors in LLM Responses
[EMNLP WS'23] Retrieval-based Evaluation for LLMs: A Case Study in Korean Legal QA
[ICML'23] Looped Transformers as Programmable Computers
[ICLR'23] Equal Improvability: A New Fairness Notion Considering the Long-term Impact
[AAAI'23] Can We Find Strong Lottery Tickets in Generative Models?
[Findings of EMNLP'22] Utilizing Language-Image Pretraining for Efficient and Robust Bilingual Word Alignment
[NeurIPS'22] LIFT: Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks
[NeurIPS'22] Rare Gems: Finding Lottery Tickets at Initialization
[ICML'22] GenLabel: Mixup Relabeling using Generative Models
[ISIT'22] Breaking Fair Binary Classification with Optimal Flipping Attacks
[AISTATS'22] Finding Nearly Everything within Random Binary Networks
[NeurIPS'20] Election Coding for Distributed Learning: Protecting SignSGD Against Byzantine Attacks
[NeurIPS'20] Attack of the Tails: Yes, You Really Can Backdoor Federated Learning
[TIT'19] Capacity of Clustered Distributed Storage (won the Best Paper Award)
Selected Awards & Grants
National Research Foundation of Korea (Outstanding Young Researcher Grant), 2024 – 2027
National Research Foundation of Korea (Overseas Postdoctoral Fellowship), 2020 – 2022
Excellence in Teaching Award, Yonsei University, 2023
KAIST EE Best Research Achievement Award, 2018
Best Paper Award, IEEE International Conference on Communications, 2017
Selected Invited Talks
Theoretical Machine Learning [KIAS, Dec 2023]
Large Language Models (LLMs), Transformers [Yonsei Institute of Data Science, Apr 2023] [Linq, Jul 2023]