Hi, I'm Myung Jin Kim.
You can call me MJ for short.
I am a student researcher in Human-Computer Interaction (HCI), particularly interested in interaction design, tangible interfaces for XR, and visuo-haptic perception.
Currently I am a PhD candidate at the Make Lab in the Industrial Design Department, KAIST, which is part of the HCI@KAIST group.
I am actively looking for positions opening in 2025 at institutions in Korea designated for military service, as I am currently serving as Technical Research Personnel in the Republic of Korea Army.
Recent News & Events
24.08.20 -- 🚄 Participated in 2nd Korea Haptics Conference in Gyeongju.
24.07.16 -- 🚄 Participated in SIGCHI Korea Local Chapter 2024 Summer Event at Dong-eui University, Busan.
24.06.27 -- 👨🏫 Invited Talk "My Journey in Finding My Niche as a Researcher" at KAIST exploreCSR: Rising Stars 2024 at KAIST.
24.05.24 ~ 10.05 -- 🏛️ "Touching the Virtual with SpinOcchio & SpinOcchietto" Exhibited in "Noli : jeux coréens" at the Korean Cultural Center, France. (Video)
+ More Past Events
24.06.05 -- 📰 CHI'24 Paper Featured on KAIST School of Computing Awards News.
24.05.31 -- 👨🏫 Presented "The Evolution of VR Haptics Research at ID KAIST" at 2024 ID KAIST Doctoral Colloquium: Spring Chat.
24.05.16 -- 🎤 Chaired "Immersive Experiences: UIs and Personalisation" session at CHI'24.
24.05.15 -- 🏅 Received Best Paper Honourable Mention Award for our work "Big or Small, It’s All in Your Head" and
👨🏫 Presented in "Perception and Input in Immersive Environments" session at CHI'24.
24.05.07 -- 👨🏫 Invited Talk "What do you see?" at Interactive Wearable Computing Lab (EE488) course at KAIST.
Projects
🏅Big or Small, It’s All in Your Head: Visuo-Haptic Illusion of Size-Change Using Finger-Repositioning
[CHI 2024 - Best Paper Honorable Mention]
Myung Jin Kim, Eyal Ofek, Michel Pahud, Mike Sinclair, and Andrea Bianchi
doi | pdf | CHI 2024 Program
Abstract:
"Haptic perception of physical sizes increases the realism and immersion in Virtual Reality (VR). Prior work rendered sizes by exerting pressure on the user’s fingertips or employing tangible, shape-changing devices. These interfaces are constrained by the physical shapes they can assume, making it challenging to simulate objects growing larger or smaller than the perceived size of the interface. Motivated by literature on pseudo-haptics describing the strong influence of visuals over haptic perception, this work investigates modulating the perception of size beyond this range. We developed a fixed-sized VR controller leveraging finger-repositioning to create a visuo-haptic illusion of dynamic size-change of handheld virtual objects. Through two user studies, we found that with an accompanying size-changing visual context, users can perceive virtual object sizes up to 44.2% smaller to 160.4% larger than the perceived size of the device. Without the accompanying visuals, a constant size (141.4% of device size) was perceived."
SpinOcchietto: A Wearable Skin-Slip Haptic Device for Rendering Width and Motion of Objects Gripped Between the Fingertips
[UIST 2022 - Demo]
Myung Jin Kim and Andrea Bianchi
doi | pdf | UIST 2022 Program
Abstract:
"Various haptic feedback techniques have been explored to enable users to interact with their virtual surroundings using their hands. However, investigation on interactions with virtual objects slipping against the skin using skin-slip haptic feedback is still at its early stages. Prior skin-slip virtual reality (VR) haptic display implementations involved bulky actuation mechanisms and were not suitable for multi-finger and bimanual interactions. As a solution to this limitation, we present SpinOcchietto, a wearable skin-slip haptic feedback device using spinning discs for rendering the width and movement of virtual objects gripped between the fingertips. SpinOcchietto was developed to miniaturize and simplify SpinOcchio, a 6-DoF handheld skin-slip haptic display. With its smaller, lighter, and wearable form factor, SpinOcchietto enables users with a wide range of hand sizes to interact with virtual objects with their thumb and index fingers while freeing the rest of the hand. Users can perceive the speed of virtual objects slipping against the fingertips and can use varying grip strengths to grab and release the objects. Three demo applications were developed to showcase the different types of virtual object interactions enabled by the prototype."
Thumble: One-Handed 3D Object Manipulation Using a Thimble-Shaped Wearable Device in Virtual Reality
[UIST 2022 - Poster]
Changsung Lim*, Jina Kim*, and Myung Jin Kim*
(*Equal contribution by all authors)
doi | pdf | poster | UIST 2022 Program
Abstract:
"Conventional controllers or hand-tracking interactions in VR cause hand fatigue when manipulating 3D objects because repetitive wrist rotation and hand movements are often required. As a solution to this inconvenience, we propose Thumble, a novel wearable input device worn on the thumb for modifying the orientation of 3D objects. Thumble rotates 3D objects according to the orientation of the thumb, using the thumb pad as an input surface on which the index finger rubs to control the direction and degree of rotation. It therefore requires minimal motion of the wrist and arm. Through an informal user study, we collected subjective feedback and found that Thumble requires less hand movement than a conventional VR controller."
SpinOcchio: Understanding Haptic-Visual Congruency of Skin-Slip in VR with a Dynamic Grip Controller
[CHI 2022 - Paper]
Myung Jin Kim, Neung Ryu, Wooje Chang, Michel Pahud, Mike Sinclair, and Andrea Bianchi
doi | pdf | CHI 2022 Program
Abstract:
"This paper's goal is to understand the haptic-visual congruency perception of skin-slip on the fingertips given visual cues in Virtual Reality (VR). We developed SpinOcchio ('Spin' for the spinning mechanism used, 'Occhio' for the Italian word “eye”), a handheld haptic controller capable of rendering the thickness and slipping of a virtual object pinched between two fingers. This is achieved using a mechanism with spinning and pivoting disks that apply a tangential skin-slip movement to the fingertips. With SpinOcchio, we determined the baseline haptic discrimination threshold for skin-slip, and, using these results, we tested how haptic realism of motion and thickness is perceived with varying visual cues in VR. Surprisingly, the results show that in all cases, visual cues dominate over haptic perception. Based on these results, we suggest applications that leverage skin-slip and grip interaction, contributing further to realistic experiences in VR."
🏅Exploring Pseudo Hand-Eye Interaction on the Head-Mounted Display
[Augmented Humans 2021 - Honorable Mention]
Myung Jin Kim and Andrea Bianchi
Abstract:
"Virtual and augmented reality devices and applications have enabled users to experience a variety of simulated real-life experiences through first-person visual, auditory, and haptic feedback. However, among the numerous everyday interactions that have been emulated, the familiar interaction of touching or rubbing the eyes is yet to be explored and remains to be understood. In this paper, we aim to understand the components of natural hand-eye interaction, propose an interaction technique through a proof-of-concept prototype head-mounted display, and evaluate the user experience of the prototype through a user study. In addition, we share insights that emerged from the studies, with suggestions for further development of interaction techniques based on combinations of hardware and software."
Demonstration of ElaStick: A Variable Stiffness Display for Rendering Handheld Flexible Object
[SIGGRAPH Asia 2020 - Emerging Technologies]
Neung Ryu, Myung Jin Kim, and Andrea Bianchi
Abstract:
"We present ElaStick, a handheld variable stiffness controller capable of simulating the kinesthetic sensation of deformable and flexible objects when swung or shaken. ElaStick renders gradual changes of stiffness along two independent axes over a wide continuous range. Two trackers on the controller enable closed-loop feedback that accurately maps the device’s deformations to the visuals of a Virtual Reality application."
BodyPrinter: Fabricating Circuits Directly on the Skin at Arbitrary Locations Using a Wearable Compact Plotter
[UIST 2020 - Paper]
Youngkyung Choi, Neung Ryu, Myung Jin Kim, Artem Dementyev, and Andrea Bianchi
doi | pdf | UIST 2020 Program
Abstract:
"On-body electronics and sensors offer the opportunity to seamlessly augment the human with computing power. Accordingly, numerous previous works have investigated methods that exploit conductive materials and flexible substrates to fabricate circuits in the form of wearable devices, stretchable patches, and stickers that can be attached to the skin. For all these methods, the fabrication process involves several manual steps, such as designing the circuit in software, constructing conductive patches, and manually placing these physical patches on the body. In contrast, in this work, we propose to fabricate electronics directly on the skin. We present BodyPrinter, a wearable conductive-ink deposition machine that prints flexible electronics directly on the body using skin-safe conductive ink. The paper describes our system in detail and, through a series of examples and a technical evaluation, we show how direct on-body fabrication of electronic circuits and sensors can further enhance the human body."
🏆ElaStick: A Handheld Variable Stiffness Display for Rendering Dynamic Haptic Response of Flexible Object
[UIST 2020 - Best Paper]
Neung Ryu, Woojin Lee, Myung Jin Kim, and Andrea Bianchi
doi | pdf | UIST 2020 Program
Abstract:
"Haptic controllers have an important role in providing rich and immersive Virtual Reality (VR) experiences. While previous works have succeeded in creating handheld devices that simulate dynamic properties of rigid objects, such as weight, shape, and movement, recreating the behavior of flexible objects with different stiffness using ungrounded controllers remains an open challenge. In this paper we present ElaStick, a variable-stiffness controller that simulates the dynamic response resulting from shaking or swinging flexible virtual objects. This is achieved by dynamically changing the stiffness of four custom elastic tendons along a joint that effectively increase and reduce the overall stiffness of a perceived object in 2-DoF. We show that with the proposed mechanism, we can render stiffness with high precision and granularity in a continuous range between 10.8 and 71.5 N·mm/degree. We estimate the threshold of the human perception of stiffness with a just-noticeable difference (JND) study and investigate the levels of immersion, realism and enjoyment using a VR application."
Aero-plane: A Handheld Force-Feedback Device that Renders Weight Motion Illusion on a Virtual 2D Plane
[UIST 2019 - Paper]
Seungwoo Je, Myung Jin Kim, Woojin Lee, Byungjoo Lee, Xing-Dong Yang, Pedro Lopes, and Andrea Bianchi
doi | pdf | UIST 2019 Program
Abstract:
"Force feedback is said to be the next frontier in virtual reality (VR). Recently, with consumers pushing forward with untethered VR, researchers turned away from solutions based on bulky hardware (e.g., exoskeletons and robotic arms) and started exploring smaller portable or wearable devices. However, when it comes to rendering inertial forces, such as when moving a heavy object around or when interacting with objects with unique mass properties, current ungrounded force feedback devices are unable to provide quick weight shifting sensations that can realistically simulate weight changes over 2D surfaces. In this paper we introduce Aero-plane, a force-feedback handheld controller based on two miniature jet propellers that can render shifting weights of up to 14 N within 0.3 seconds. Through two user studies we: (1) characterize the users' ability to perceive and correctly recognize different motion paths on a virtual plane while using our device; and (2) test the level of realism and immersion of the controller when used in two VR applications (a rolling ball on a plane, and using kitchen tools of different shapes and sizes). Lastly, we present a set of applications that further explore different usage cases and alternative form-factors for our device."
Wind-Blaster: a wearable propeller-based prototype that provides ungrounded force-feedback
[SIGGRAPH 2018 - Emerging Technologies]
Seungwoo Je, Hyelip Lee, Myung Jin Kim, Andrea Bianchi
Abstract:
"Ungrounded haptic force-feedback is a crucial element for applications that aim to immerse users in virtual environments where mobility is also an important component of the experience, such as Virtual Reality games. In this paper, we present a novel wearable interface that generates force-feedback by spinning two drone propellers mounted on the wrist. The device is interfaced with a game running in Unity and is capable of rendering different haptic stimuli mapped to four weapons. A simple evaluation with users demonstrates the feasibility of the proposed approach."
An Artistic Provocation to Explore Effects and Opportunities of Virtual Surreal Spaces
[DIS 2018 - Provocations and Work-in-Progress]
Hyelip Lee, Myung Jin Kim, Byungjoo Lee, Andrea Bianchi
doi | pdf | DIS 2018 Program
Abstract:
"The concept of surreal virtual space is used in this paper to describe a space that looks realistic but cannot exist in reality. For this project, we developed a 3D virtual space using Google Cardboard and an Android mobile device. Referring to M.C. Escher's 2D drawing "Relativity," the virtual space was designed to have multi-directional but connected stairs. This work was exhibited with other artworks at a gallery for a period of three weeks. Despite some minor sensory confusion, all audiences experienced a degree of place illusion, enjoyment, and a sense of self-awareness even though the virtual environment did not provide a visual representation of the audience's own body. For future work, we plan to investigate the advantages of these effects and apply them to everyday non-virtual environments."