Hi, I'm Taejun Kim, a 3rd-year Ph.D. student in the KAIST HCI Lab working with Prof. Geehyuk Lee. I'm interested in devising new AR/VR interaction techniques that utilize eye gaze movement. My research focuses on eye gaze-based interaction, AR/VR interfaces, and wearable haptic interfaces.
Taejun Kim, Auejin Ham, Sunggeun Ahn, Geehyuk Lee
Lattice Menu: A Low-Error Gaze-Based Marking Menu Utilizing Target-Assisted Gaze Gestures on a Lattice of Visual Anchors
CHI 2022: ACM Conference on Human Factors in Computing Systems
Youngbo Aram Shim, Taejun Kim, Geehyuk Lee
QuadStretch: A Forearm-wearable Multi-dimensional Skin Stretch Display for Immersive VR Haptic Feedback [Best Demo Award]
CHI 2022 Interactivity (Demo): ACM Conference on Human Factors in Computing Systems
Taejun Kim, Youngbo Aram Shim, Geehyuk Lee
Heterogeneous Stroke: Using Unique Vibration Cues to Improve the Wrist-Worn Spatiotemporal Tactile Display
CHI 2021: ACM Conference on Human Factors in Computing Systems
Jun 2022 - Present
Meta Reality Labs Research
Research Intern
Dec 2015 - Feb 2016
Bhaptics
Frontend Developer
May 2022
CHI '22 Best Demo Award
Demonstrating “QuadStretch: A Forearm-wearable Multi-dimensional Skin Stretch Display for Immersive VR Haptic Feedback”
Feb 2021
Outstanding Master's Thesis Award, KAIST School of Computing
Thesis title: "Improving Recognition Accuracy of Wrist-Worn Spatiotemporal Tactile Display using Heterogeneous Vibrotactile Stimuli"
Nov 2022
Interface Control with Eye Movement
Stanford HCI Lunch, Stanford University
Nov 2022
Interface Control with Eye Movement
DGP Lab, University of Toronto
Oct 2021
Lecture on SPSS & R practice [Slides]
in CS584 Human-Computer Interaction, School of Computing, KAIST
2017 - 2021
Teaching Assistant (TA)
CS584 Human-Computer Interaction (2021 Fall), CS550 Software Engineering (2021 Spring), CS300 Introduction to Algorithms (2020 Fall), CS204 Discrete Mathematics (2019 Spring), CS230 System Programming (2018 Spring), CS101 Introduction to Programming (2017 Fall), School of Computing, KAIST
2021
SuggestBot: Development of a Context-Based Smart Interaction Service Platform (National R&D Project Demonstration)
Skills: Unity, HoloLens 2 API (MRTK-Unity), Eye Gaze-Responsive Application
2020
Are They Really Fair?: Using Transfer Learning to Verify the Learned Fair Representation
Skills: PyTorch
Code
2020
AudioHero: Sound-Based Danger Detection System Using VGGish Deep Learning Model
Skills: TensorFlow, Python (Audio Processing)
Video | Code
2019
CommandPad: Using Finger Identification for Seamless Mode Switching between Command Gesture and Cursor Manipulation on a Touchpad
Skills: C# WPF, Python OpenCV Processing of RGB and Depth Images, TCP Networking
2017
OTL Plus KAIST: Online Timeplanner with Lectures (Custom Timetable Team)
Skills: Python Django, HTML, CSS, JS