Welcome to MINDS!
Established in 2020, the POSTECH Mathematical Institute for Data Science (MINDS) is a community of researchers in fundamental data science, machine learning, artificial intelligence, scientific computing, and humanitarian data science. The mission of MINDS is to provide a platform for collaboration among researchers and to offer a range of opportunities for students in data science. MINDS also aims to use its data science research to serve our local and global communities by pursuing humanitarian data science.
News
🌟 DACON Ranker Special Lecture: Winning Strategies for AI Competitions 🌟
2023.11.30
3rd POSTECH & Peking SIAM Student Chapter Joint Conference
2023.11.01
2023 PSSC Summer Camp
2023.10.04
[POSTECH (Pohang University of Science and Technology) Mathematical Institute for Data Science: Contract Researcher Job Posting] - Open recruitment
2023.07.25
[POSTECH (Pohang University of Science and Technology) Mathematical Institute for Data Science: Contract Researcher Job Posting]
2023.07.14
[POSTECH (Pohang University of Science and Technology) Mathematical Institute for Data Science: Research Professor Job Posting]
2023.06.12
Seminar | Joint seminar on probability and mathematical biology
2023.05.02
[POSTECH (Pohang University of Science and Technology) Mathematical Institute for Data Science: Contract Researcher Job Posting]
2023.02.15
Upcoming Events
Schedule
MINDS SEMINAR
MINDS Seminar Series | Wooseok Ha (Amazon Web Services) - Prominent Roles of Conditionally Invariant Components in Domain Adaptation: Theory and Algorithms
Overview
Date : 2023-05-09
Time : 10:00 ~ 11:00
Speaker : Wooseok Ha
Affiliation : Amazon Web Services
Place : Online streaming (Zoom)
Streaming link : Zoom ID 688 896 1076 / PW 54321
Topic : Prominent Roles of Conditionally Invariant Components in Domain Adaptation: Theory and Algorithms
Contents : Domain adaptation (DA) is a statistical learning problem that arises when the distribution of the source data used to train a model differs from that of the target data used to test the model. While many DA algorithms have demonstrated considerable empirical success, the unavailability of target labels in DA makes it challenging to determine their effectiveness on new datasets without a theoretical basis. Therefore, it is essential to clarify the assumptions required for successful DA algorithms and to quantify the corresponding theoretical guarantees under these assumptions. In this work, we focus on the assumption that conditionally invariant components (CICs) useful for prediction exist across the source and target data. Under this assumption, we demonstrate that CICs, which can be estimated through the conditional invariant penalty (CIP), play three prominent roles in providing theoretical guarantees for DA algorithms. First, we introduce a new CIC-based algorithm called importance-weighted conditional invariant penalty (IW-CIP), which has target risk guarantees beyond simple settings such as covariate shift and label shift. Second, we show that CICs can be used to identify large discrepancies between source and target risks of other DA algorithms. Finally, we demonstrate that incorporating CICs into the domain invariant projection (DIP) algorithm helps to address its well-known failure scenario caused by label-flipping features. We support our new algorithms and theoretical findings via numerical experiments on synthetic data and on the MNIST, CelebA, and Camelyon17 datasets.
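The conditional invariant penalty (CIP) mentioned in the abstract encourages learned features to have the same class-conditional distribution across domains. As a rough illustration only (a mean-matching toy sketch under assumed inputs and function names, not the speaker's CIP or IW-CIP implementation), such a penalty can be approximated by comparing class-conditional feature means across source domains:

# Toy sketch of a conditional-invariance-style penalty (illustrative assumption,
# not the algorithm presented in the talk). For each label y, compare the mean
# of the encoded features across domains; features whose class-conditional
# means agree across domains behave like conditionally invariant components.
import numpy as np

def conditional_invariance_penalty(features_by_domain, labels_by_domain):
    """features_by_domain: list of (n_d, p) arrays, one per labeled source domain.
    labels_by_domain: list of (n_d,) integer label arrays aligned with the features.
    Returns the summed squared distance between class-conditional feature means
    across domains (a simple mean-matching surrogate for a CIP-style penalty)."""
    penalty = 0.0
    for y in np.unique(np.concatenate(labels_by_domain)):
        # class-conditional feature means for every domain that contains label y
        means = [f[l == y].mean(axis=0)
                 for f, l in zip(features_by_domain, labels_by_domain)
                 if np.any(l == y)]
        center = np.mean(means, axis=0)
        penalty += sum(np.sum((m - center) ** 2) for m in means)
    return penalty

# Example: two hypothetical source domains with 2-D features and binary labels.
rng = np.random.default_rng(0)
f1, l1 = rng.normal(size=(100, 2)), rng.integers(0, 2, size=100)
f2, l2 = rng.normal(loc=0.5, size=(100, 2)), rng.integers(0, 2, size=100)
print(conditional_invariance_penalty([f1, f2], [l1, l2]))

In practice a penalty of this kind would be added to the training loss of a feature encoder; the exact estimator and importance weighting used in IW-CIP are described in the work presented at the seminar.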
POSTECH SIAM Student Chapter
🌟 DACON Ranker Special Lecture: Winning Strategies for AI Competitions 🌟
2023 POSTECH & Peking SIAM Student Chapter Joint Conference
2023 PSSC Summer Camp
2022 PSSC Summer Camp
2022 POSTECH & Peking SIAM Student Chapter Joint Conference
MINDS-MoNET-ISE Workshop
Information, Network & Topological Data Analysis
2021 POSTECH MINDS WORKSHOP
Recent Progress in Data Science and Applications
- Nov. 19 (Fri) ~ Nov. 20 (Sat), 2021 (1 night, 2 days)
- Workshop homepage
Fall 2021 Seminar Series
MINDS Seminar Series on Data Science, Machine Learning, and Scientific Computing
Every Tuesday at 5:00 PM
ILJU POSTECH MINDS Workshop on TDA and ML
July 6 ~ July 9
Registration is required (please register here)