In a groundbreaking fusion of artificial intelligence and cosmetics, researchers at the Institute of Science Tokyo have developed an innovative makeup simulation system that personalizes color exploration through mood-based textual input, going beyond traditional virtual try-on experiences. The system uses dynamic projection mapping to cast AI-generated makeup palettes directly onto the user’s face in real time, so that colors appear as they would on the individual’s actual skin tone and texture under real lighting, changing how both consumers and professionals interact with makeup.
Traditional virtual makeup tools rely primarily on augmented reality overlays displayed on two-dimensional screens such as smartphones or tablets. While useful for experimenting with different styles, these systems often fail to convey how makeup actually looks as it interacts with the nuanced light reflections and textures unique to each individual’s face. The Institute of Science Tokyo’s system overcomes these limitations with an integrated high-speed projector and camera setup that tracks facial movements dynamically, keeping the projected makeup precisely aligned even as users move naturally.
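The team’s high-speed tracking pipeline is custom-built and not published as code, but the general projector-camera idea can be sketched with off-the-shelf tools. The example below is a minimal illustration only: MediaPipe Face Mesh and an ordinary webcam are assumptions standing in for the team’s dedicated high-speed camera and landmark tracker, and the drawing step stands in for warping a makeup texture into projector space.

```python
# Minimal sketch of per-frame facial landmark tracking, the signal
# that would drive projection alignment. MediaPipe and a webcam are
# stand-ins for the custom high-speed rig described in the article.
import cv2
import mediapipe as mp
import numpy as np

face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1, refine_landmarks=True)

cap = cv2.VideoCapture(0)  # camera observing the face
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        h, w = frame.shape[:2]
        # Pixel coordinates of the tracked landmarks (eyes, lips, cheeks).
        pts = np.array(
            [(lm.x * w, lm.y * h)
             for lm in results.multi_face_landmarks[0].landmark],
            dtype=np.float32)
        # A real system would warp the makeup layer into projector space
        # from these points; here we only visualize them.
        for x, y in pts.astype(int):
            cv2.circle(frame, (x, y), 1, (0, 255, 0), -1)
    cv2.imshow("landmark tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```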
At the core of this breakthrough technology is an artificial intelligence model trained to convert user-described impressions—expressed in natural language—into bespoke makeup color themes. Users can articulate evocative phrases such as “Sakura in spring,” “night rose,” or “autumn forest with warm sunlight,” enabling the AI to interpret these sentiments and generate corresponding reference images and five distinct color palettes applicable to cheeks, eyeshadow, and lips. This impression-guided text-to-makeup-color transformation marks a significant leap forward in personalizing beauty experiences by linking abstract concepts to concrete visual aesthetics.
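The paper’s text-to-makeup-color model is not reproduced here, but its final stage, distilling palettes from an AI-generated reference image, can be approximated with ordinary color clustering. In the sketch below, the file name sakura_reference.png, the palette counts, and the luminance-based grouping are all assumptions for illustration, not the authors’ method.

```python
# Hypothetical sketch: extract five candidate makeup palettes from a
# reference image (e.g., one generated from "Sakura in spring").
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

def extract_palettes(image_path, colors_per_palette=3, n_palettes=5):
    """Cluster image pixels into n_palettes palettes (RGB, 0-255)."""
    pixels = np.asarray(Image.open(image_path).convert("RGB"),
                        dtype=np.float32).reshape(-1, 3)
    pixels = pixels[::20]  # subsample for speed
    km = KMeans(n_clusters=n_palettes * colors_per_palette,
                n_init="auto").fit(pixels)
    centers = km.cluster_centers_.astype(np.uint8)
    # Group cluster centers by luminance so each palette spans a range
    # of shades usable for cheeks, eyeshadow, and lips.
    order = np.argsort(centers.mean(axis=1))
    return [centers[order[i::n_palettes]] for i in range(n_palettes)]

for i, palette in enumerate(extract_palettes("sakura_reference.png")):
    print(f"palette {i}: {palette.tolist()}")
```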
The research team, comprising graduate students Kemeng Zhang and Hao-Lun Peng alongside Associate Professor Yoshihiro Watanabe, spearheaded this interdisciplinary project within the Department of Information and Communications Engineering. Their findings were published in the International Journal of Human-Computer Interaction, highlighting the technical and experiential advantages of combining AI-driven image generation with projection mapping technologies for interactive, personalized makeup simulation.
One of the most compelling attributes of this system is its dynamic optimization capability, which continually refines color suggestions based on real-time user selections. As users explore various suggested themes through the projected makeup, the AI adapts to their preferences, generating increasingly tailored palettes. This feedback loop dramatically reduces the effort and guesswork typically associated with selecting from hundreds of makeup color options, enabling users to swiftly converge on styles that resonate with their personal tastes.
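The optimization procedure itself is not described in enough detail in this article to reproduce, but the shape of such a selection-driven feedback loop is easy to illustrate. In the toy sketch below, each palette is summarized as a single mean color, and the learning rate and distance-based scoring are invented for the example.

```python
# Toy sketch of preference-adaptive palette suggestion: a running
# preference vector drifts toward whichever palette the user selects,
# so later suggestions cluster around demonstrated taste.
import numpy as np

rng = np.random.default_rng(0)
candidates = rng.uniform(0, 255, size=(200, 3))  # stand-in palette summaries
preference = candidates.mean(axis=0)             # neutral starting point

def suggest(k=5):
    """Return indices of the k candidates closest to the preference."""
    dist = np.linalg.norm(candidates - preference, axis=1)
    return np.argsort(dist)[:k]

def update(chosen_index, lr=0.3):
    """Pull the preference vector toward the selected palette."""
    global preference
    preference = (1 - lr) * preference + lr * candidates[chosen_index]

for round_no in range(3):
    top = suggest()
    pick = int(top[0])  # pretend the user picks the first suggestion
    update(pick)
    print(f"round {round_no}: suggested {top.tolist()}, picked {pick}")
```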
The high-speed projection setup operates with remarkable precision, employing tracking algorithms that monitor facial landmarks such as the eyes, lips, and cheek contours, and compensate for movement to maintain projection fidelity. Users therefore see a seamless, true-to-life visualization of makeup effects directly in a mirror rather than on a digital screen, bridging the gap between virtual experimentation and tangible beauty applications.
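One standard way to keep a texture registered to a moving face, offered here only as an illustrative stand-in for the team’s algorithm, is to estimate a homography from the landmark positions recorded when the makeup layer was authored to the landmarks in the current frame, then warp and composite the layer. The landmark arrays and the BGRA makeup texture are assumed inputs.

```python
# Hypothetical sketch: re-register a makeup layer to the current face
# pose via a landmark-to-landmark homography. A production system would
# instead use a dense, calibrated camera-to-projector mapping.
import cv2
import numpy as np

def align_makeup(makeup_bgra, ref_pts, cur_pts, out_size):
    """Warp the makeup layer from its reference pose to the current pose."""
    H, _ = cv2.findHomography(ref_pts, cur_pts, cv2.RANSAC)
    return cv2.warpPerspective(makeup_bgra, H, out_size)  # out_size = (w, h)

def composite(frame_bgr, layer_bgra):
    """Alpha-blend the warped makeup layer onto the camera frame."""
    alpha = layer_bgra[..., 3:4].astype(np.float32) / 255.0
    out = (1 - alpha) * frame_bgr.astype(np.float32) \
        + alpha * layer_bgra[..., :3].astype(np.float32)
    return out.astype(np.uint8)
```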
Importantly, the system accounts for the complex optical properties of human skin and lips, simulating how light interacts with diverse tones and textures. This leads to a realistic portrayal of makeup colors under natural lighting conditions, which is essential for users to make informed choices that translate accurately to real-world appearance, a challenge that standard AR-based tools have struggled to surmount.
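The article does not specify the underlying optical model, but the difference from a flat AR overlay can be conveyed with a toy shading-aware blend in which the makeup tint is modulated by per-pixel skin luminance, darkening in shadow and brightening in highlights. The blend formula, the mask, and the strength parameter below are assumptions for illustration only.

```python
# Toy shading-aware tinting: unlike a flat overlay, the applied color
# follows the skin's own shading, loosely approximating how projected
# makeup responds to the face's real illumination.
import numpy as np

def tint_with_shading(skin_bgr, color_bgr, mask, strength=0.6):
    """Blend color_bgr into skin_bgr within mask, scaled by luminance."""
    skin = skin_bgr.astype(np.float32)
    lum = skin.mean(axis=2, keepdims=True) / 255.0   # crude shading proxy
    shaded_color = np.asarray(color_bgr, np.float32) * lum
    m = (mask[..., None].astype(np.float32) / 255.0) * strength
    return ((1 - m) * skin + m * shaded_color).astype(np.uint8)
```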
User evaluations have underscored both the usability and aesthetic impact of the system. An online survey involving one hundred participants indicated that the system consistently delivered color schemes well-aligned with the users’ mood-based descriptions. Moreover, a hands-on study with fifteen users compared this impression-driven projection method against traditional manual color adjustment tools, with findings showing enhanced speed, intuitiveness, and overall enjoyment. Participants reported discovering appealing styles and color combinations that they might not have spontaneously considered.
The potential applications of this system extend beyond consumer use into the professional realm of cosmetics and fashion. Industry experts have lauded its capacity for creative ideation, particularly in settings such as themed fashion shows or early-phase product design. By swiftly generating accurate color concepts from abstract descriptive inputs, the technology can streamline the traditionally labor-intensive process of makeup design, fostering innovation and efficient experimentation.
The emergence of this AI-powered makeup simulation underscores a broader trend wherein artificial intelligence facilitates interaction between human creativity and technical systems. It exemplifies how AI can not only automate routine tasks but also enhance artistic expression and personalized experience by interpreting subjective concepts and rendering them into visually compelling outputs.
Looking forward, the integration of such systems into retail environments could transform customer experiences, enabling shoppers to articulate their desired ambiance or style verbally and instantly witness corresponding makeup effects in situ. This natural language interface, combined with hyper-realistic projection, promises to elevate the interactivity and satisfaction of cosmetic trials, reducing barriers and uncertainty inherent in makeup purchase decisions.
This pioneering work also sets a technical precedent for future explorations in projection-based design and personalized digital augmentation in other domains, showcasing how AI and real-time dynamic projection can come together to create immersive, user-centered experiences that respect and enhance individual uniqueness.
In sum, the Institute of Science Tokyo’s impression-guided interactive personalized color exploration framework heralds a new era in beauty technology, where the abstract meets the tangible through sophisticated AI and projection methodologies, making makeup exploration more intuitive, enjoyable, and authentically personalized than ever before.
Subject of Research: People
Article Title: Impression-Guided Interactive Personalized Color Exploration Framework for Dynamic Projection Mapping Makeup
News Publication Date: 21-Jan-2026
Web References:
- International Journal of Human-Computer Interaction: https://www.tandfonline.com/doi/full/10.1080/10447318.2025.2599521
- Demonstration Video: https://www.youtube.com/watch?v=xLcoilAGIns
References: 10.1080/10447318.2025.2599521
Image Credits: Institute of Science Tokyo
Keywords: Artificial intelligence, Dynamic projection mapping, Personalized makeup, Virtual makeup simulation, Real-time facial tracking, Human-computer interaction, Computational modeling, Cosmetics technology