Research Projects

ARMath View Video »

ARMath is a mobile Augmented Reality (AR) system that allows children to discover mathematical concepts in familiar, ordinary objects and engage with math problems in meaningful contexts. Leveraging advanced computer vision, ARMath recognizes everyday objects, visualizes their mathematical attributes, and turns them into tangible or virtual manipulatives. Using the manipulatives, children can solve problems that situate math operations or concepts in specific everyday contexts.

  • Augmented Reality, Computer Vision, STEM Education, Human-AI Interaction

PrototypAR View Video »

PrototypAR is an AR smart desk that allows young children to design and experiment with complex systems. The system combines low-fidelity prototyping to facilitate the iterative design of system representations, an AR visualization to scaffold learning through design, and a virtual simulation to support personalized experiments.

  • Augmented Reality, Tangible Prototyping, Virtual Simulation, Computer Vision, STEM Education

SharedPhys View Video »

This project introduces three mixed-reality tools that support new forms of embodied interaction, scientific inquiry, and collaborative learning about human body systems. These tools tightly integrate real-time physiological sensing, whole-body interaction, and responsive large-screen visualizations to provide learners with an immersive learning experience.

  • Mixed-Reality, Real-time Physiological Sensing, Whole-body Interaction, Large-screen Visualization

I Like This Shirt

We explore how the dynamics of virtual social interactions translate into the physical world. Specifically, by creating everyday physical objects that can be “liked” in the physical world as if they existed in the virtual world, we study wearers’ and viewers’ reactions and experiences.

  • Hardware Prototyping, Social Interaction

Kids in Fairytales

We introduce an immersive learning system that combines mixed-reality experiences and interactive storytelling to engage young children in reading. By interacting with virtual objects and immersing themselves in responsive 3D scenes, young users can develop their own fairytales and become deeply interested in the stories. We deployed the system, along with 10+ immersive 3D fairytales, at 20+ national children's libraries over five years.

  • 3D Mixed-Reality, Natural User Interface, Computer Vision, Physics Simulation, System Integration

Tangible User Interface

AccStick

AccStick is a tangible computing interface that adds accessibility functions to desktop computing. Each stick is mapped to an accessibility function such as the screen magnifier, screen resolution changes, display brightness adjustment, cursor size changes, and volume control. By plugging in or bending AccSticks, users can easily invoke these functions.

View Video »


City Light from Windmills

City Light from Windmills is an interactive visual artwork that shows how our city lights up with electricity generated by windmills. Users can light up city buildings by blowing on the windmills and change their color and brightness. The colors and brightness of the LEDs are adjusted according to the rotational speed of each windmill.

View Video »


Tint Picker

Tint Picker is a tangible color picker that lets you save any color in your life and build a color palette for future design, painting, or fun. Tint Picker senses RGB/light values at its tip and sends the data to an application server over Bluetooth, via an embedded Arduino Blend Micro. Our application visualizes each picked color as a colorful ball and presents your recent colors in a palette.

View Video »


Ukulele Player

Ukulele Player allows you to play a ukulele with no expertise required: just wave your arms to strum and change chords. Servo motors mounted above the instrument strum the strings and rotate gears to press down on the fretboard. A Kinect is used for gesture recognition, so moving your right arm up and down strums, while moving your left arm back and forth switches between chords.

View Video »


Flappy Hat

Flappy Hat is an intelligent hat that senses the wearer's environment (UV level, temperature, and humidity) and visualizes the readings. As a form of social interaction, the wearer can share this climate information with nearby people.

View Video »


Publications

  • Kang, S., Shokeen, E., Byrne, V., Norooz, L., Bonsignore, E., Williams-Pierce, C., & Froehlich, J. (2020). “ARMath: Augmenting Everyday Life with Math Learning”. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM. (To appear)
  • Kang, S., Norooz, L., Bonsignore, E., Byrne, V., Clegg, T., & Froehlich, J. (2019). “PrototypAR: Prototyping and Simulating Complex Systems with Paper Craft and Augmented Reality”. In Proceedings of the 18th International Conference on Interaction Design and Children. ACM.
  • Byrne, V. L., Kang, S., Norooz, L., Froehlich, J., & Clegg, T. (2019). Bringing life-relevant embodied learning with e-textiles into the classroom: Tensions with classroom rules and academic norms. American Educational Research Association annual meeting. Toronto, ON.
  • Kang, S., Norooz, L., Byrne, V., Clegg, T., & Froehlich, J. E. (2018). Prototyping and Simulating Complex Systems with Paper Craft and Augmented Reality: An Initial Investigation. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (pp. 320-328). ACM.
  • Byrne, V., Kang, S., Norooz, L., Velez, R., Addeh, A., Froehlich, J., & Clegg, T. (2018). Scaffolding Authentic Wearable-Based Scientific Inquiry for Early Elementary Learners. In Proceedings of ICLS 2018.
  • Clegg, T., Norooz, L., Kang, S., Byrne, V., Katzen, M., Valez, R., Plane, A., Oguamanam, V., Outing, T., Yip, J., Bonsignore, E., & Froehlich, J. (2017). "Live Physiological Sensing and Visualization Ecosystems: An Activity Theory Analysis". In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM.
  • Kang, S., Norooz, L., Oguamanam, V., Plane, A., Clegg, T., & Froehlich, J. (2016). “SharedPhys: Live Physiological Sensing, Whole-Body Interaction, and Large-Screen Visualizations to Support Shared Inquiry Experiences”. In Proceedings of the 15th International Conference on Interaction Design and Children. ACM.
  • Norooz, L., Clegg, T., Kang, S., Plane, A., Oguamanam, V., & Froehlich, J. (2016). “‘That's your heart!’: Live Physiological Sensing & Visualization Tools for Life-Relevant & Collaborative STEM Learning”. In Proceedings of ICLS 2016.
  • Kang, S., Lee, Y., & Lee, S. (2015). “Kids in Fairytales: Experiential and Interactive Storytelling in Children's Libraries”. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • Najafizadeh, L., Kang, S., & Froehlich, J. E. (2015). I Like This Shirt: Exploring the Translation of Social Mechanisms in the Virtual World into Physical Experiences. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • Lee, S., Yun, J., Kang, S., & Lee, J. (2013). “Design and Implementation of Plug-in based Interactive e-book Authoring System”. In Proceedings of International Conference on Convergence Content 2013, 11(2).
  • Kwak, J. W., Kang, S., & Jhang, S. T. (2013). On-chip Inter-victim Cache Architecture and its Snooping Protocol for Shared Bus-based CMP Systems. International Information Institute (Tokyo). Information, 16(5), 3185.
  • Ko, J., Lee, S., Kang, S., & Lee, J. (2011). Hybrid Camera Based Real-Time Human Body Segmentation for Virtual Reality E-learning System. In Proceedings of the First ACIS/JNU International Conference on Computers, Networks, Systems and Industrial Engineering (CNSI 2011). IEEE.
  • Lee, S., Ko, J. G., Kang, S., & Lee, J. (2010, October). An immersive e-learning system providing virtual experience. In Proceedings of the 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2010). IEEE.


Patents

  • Lee, S. W., Kang, S. B., Lim, S. H., & Lee, J. S. (2016). "Apparatus for extracting image object in 3D image system and method thereof." U.S. Patent No. 9,294,753.
  • Kang, S., Lee, J., Ko, J., Lee, S., & Lee, J. (2012). "Image Separation Apparatus and Method." U.S. Patent Application Publication No. 2012/0121191 A1.
  • Lee, J., Kang, S., Kim, S. Y., Yoo, J. S., & Lee, J. (2012). "Apparatus and method for recognizing multi-user interactions." U.S. Patent Application Publication No. 2012/0163661.
  • Lee, S. W., Lee, J., Kang, S., Sung, J., & Lee, G. H. (2012). "Apparatus and method for authoring experiential learning content." U.S. Patent Application Publication No. 2012/0107790.


Honors & Awards

  • NSF 2019 Video Showcase: Innovations in STEM Education, Facilitator’s Choice. 2019
    PrototypAR: Learning through Design and Experimentation
  • NSF 2016 Video Showcase: Advancing STEM for All, Facilitator’s Choice. 2016
    BodyVis: Advancing New Science Learning and Inquiry Experiences via On-Body Sensing and Visualization
  • PhD Graduate Study Fellowship (5yr), Kwanjeong Educational Foundation. 2014
  • MS Graduate Study Fellowship (2yr), Brain Korea 21. 2007
  • Undergraduate Study Scholarship (4yr), National Scholarship for Science and Engineering. 2003