Information Theory and
Machine Learning Lab
We are a research group headed by Prof. Jy-yong Sohn in the Department of Applied Statistics at Yonsei University.
We focus on research topics at the intersection of information theory and machine learning. More broadly, we delve into various topics in machine learning and artificial intelligence, using mathematical tools from information theory, optimization, learning theory, and probability & statistics. Current research topics include:
Efficient ML
Foundation Models
ML Theory
Trustworthy ML
For details on each research topic, please check this page. If you are interested in joining our group, please contact us.
[arXiv'23] Mini-Batch Optimization of Contrastive Loss
[ICML'23] Looped Transformers as Programmable Computers
[ICLR'23] Equal Improvability: A New Fairness Notion Considering the Long-term Impact
[AAAI'23] Can We Find Strong Lottery Tickets in Generative Models?
[Findings of EMNLP'22] Utilizing Language-Image Pretraining for Efficient and Robust Bilingual Word Alignment
[NeurIPS'22] LIFT: Language-Interfaced FineTuning for Non-Language Machine Learning Tasks
[NeurIPS'22] Rare Gems: Finding Lottery Tickets at Initialization
[ICML'22] GenLabel: Mixup Relabeling using Generative Models
[ISIT'22] Breaking Fair Binary Classification with Optimal Flipping Attacks
[AISTATS'22] Finding Nearly Everything within Random Binary Networks
[NeurIPS'20] Election Coding for Distributed Learning: Protecting SignSGD Against Byzantine Attacks
[NeurIPS'20] Attack of the Tails: Yes, You Really Can Backdoor Federated Learning
[TIT'19] Capacity of Clustered Distributed Storage (Best Paper Award)
Selected Invited Talks
Large Language Models (LLMs), Transformers [Yonsei Institute of Data Science, April 2023] [Wecover, July 2023]