UARTS FEAST

Faculty Engineering/Arts Student Teams
The goal of this project is to develop a new genre of inclusive augmented reality games and room-sized interactive systems that remove physical and social barriers to play. The project addresses the unmet need of players with different mobility abilities to play and exercise together in spaces such as school gymnasiums, community centers, and family entertainment centers.
This team will make Korean Art Song (Gagok) more accessible to English-speaking students by finding scores of Korean-composed songs; creating English translations, phoneticizations, and spoken recordings of the song texts; and organizing these materials into an accessible database.
TAMIE
Designers of digital musical instruments have worked to make them accessible to people with cognitive or motor disabilities. The TAMIE project advances this work by letting users train their own instruments: machine learning maps their movements to musical output, with an emphasis on consistent gesture recognition for effective interaction.
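As an illustrative sketch only (not the TAMIE implementation, whose details are not described here), a trainable movement-to-sound mapping can be as simple as nearest-neighbor lookup from recorded example gestures to synthesis parameters; all positions and parameter values below are hypothetical:

```python
import numpy as np

# Hypothetical training examples: each row is a 2-D gesture position
# paired with (pitch in Hz, amplitude 0-1) synthesis parameters that the
# user recorded while "training" their instrument.
gestures = np.array([[0.0, 0.0],
                     [1.0, 0.0],
                     [0.0, 1.0],
                     [1.0, 1.0]])
params = np.array([[220.0, 0.2],
                   [440.0, 0.5],
                   [330.0, 0.8],
                   [660.0, 1.0]])

def map_gesture(g):
    """Return the synthesis parameters of the nearest trained gesture."""
    dists = np.linalg.norm(gestures - np.asarray(g), axis=1)
    return params[np.argmin(dists)]

# A live gesture near (1, 0) recalls the parameters trained at (1, 0).
pitch, amp = map_gesture([0.9, 0.1])
```

A consistent mapping like this is what makes the instrument learnable: the same movement always produces the same sound, which is the "consistent gesture recognition" the project emphasizes.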
Aura
Studying the interaction between a human artist (a painter) and an embodied AI agent (a robot), using biometrics to quantitatively capture their creative process.
Expanding Photographic Possibilities
Creative exploration of visual perception and depiction via the design, creation, and use of experimental cameras through a collaboration between artists and engineers.
Deep Drawing
Deep Drawing is an intermedia project that explores the sound of drawing and writing through a deep learning model. An AI co-performer interprets the musical gestures of the human co-performer drawing on an amplified wooden box, creating projected visual drawings that correspond to the sound.
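Deep Drawing's actual co-performer is a deep learning model; as a much simpler, hedged sketch of the underlying sound-to-visual idea, one could map the loudness envelope of the amplified drawing surface to the width of a projected stroke (all function names and values here are illustrative):

```python
import numpy as np

def rms_envelope(signal, frame_len=256):
    """Short-time RMS loudness of an audio signal, one value per frame."""
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    return np.sqrt((frames ** 2).mean(axis=1))

def stroke_widths(envelope, max_width=12.0):
    """Map loudness to stroke width: a louder gesture draws a bolder line."""
    peak = envelope.max() if envelope.size else 1.0
    if peak == 0.0:
        return np.zeros_like(envelope)  # silence draws nothing
    return max_width * envelope / peak

# A silent frame followed by a loud frame: no stroke, then a bold stroke.
audio = np.concatenate([np.zeros(256), np.ones(256)])
widths = stroke_widths(rms_envelope(audio))
```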
Abriendo Caminos: Developing Engineering Pathways
This FEAST project will create engaging educational materials designed specifically for middle-school Hispanic students that describe and encourage pathways into engineering careers, especially in materials science.
Visualizing Telematic Music Performance
The goal of this project is to explore methods of incorporating visual communication of effort, gesture, and movement into telematic performance without video transmission. Practical experiments with different sensing techniques, including infrared motion capture, inertial measurement, electromyography, and force sensing, will be coupled with novel digitally fabricated mechatronic displays.
The student team will explore current participatory design theory and practice across ideation, fabrication, and production, and test the pieces they develop, advancing our understanding and application of participatory design.
This team will conduct a collaborative, interdisciplinary study of shadows, expanding and hybridizing conceptions of shadows drawn from a range of fields as a way of mining their artistic potential and imagining their use in experiential and immersive art encounters.
THE BIG CITY: Lost and Found in XR
No copies of the lost 1928 film THE BIG CITY are known to exist; only still photographs, a cutting continuity, and a detailed scenario of the film survive. Using Unreal Engine, detailed 3D model renderings, and live performance, students will take users back in time into the fictional Harlem Black Bottom cabaret and clubs shown in the film.
SMART Lab
The SMART (Sound and Media Art) Lab is an artistic research lab where students explore the intersection of art, technology, and science to create sound and media art. Our multidisciplinary approach blends sound design, human-computer interaction, NIME (new interfaces for musical expression), artistic sonification, and bio-inspired art.
Roleplaying Realities
A project supporting and presenting immersive, real-time, digitally mediated tabletop roleplaying game experiences that combine improvisational performance, new technologies, and experimental interactive storytelling to explore queer, decolonial, and feminist futures.
Kids, Robots and Eye Exams: Making Needed Medical Testing Fun
This project proposes a new class of eye-imaging device for children, featuring an embodied robot character. We seek to transform the sterile, cold, and clinical pediatric eye-imaging process into an engaging activity that invites children into an interactive partnership. This robotic eye-imaging character may enable routine eye examination in young patient populations, substantially improving the standard of care in pediatric ophthalmology.
This team integrates LiDAR (Light Detection and Ranging) and photogrammetry to capture 3D point-cloud data from artifacts, buildings, urban environments, and landscapes, then animates that data in Unreal Engine, a gaming platform and advanced 3D creation tool, to develop empathic, inclusive spatial narratives through immersive interfaces.
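As a hedged illustration of one routine step in pipelines like this (not necessarily this team's toolchain), raw LiDAR point clouds are often far too dense for a real-time engine and are voxel-downsampled before import; the coordinates below are made up:

```python
def voxel_downsample(points, voxel_size):
    """Keep one representative point (the first seen) per occupied voxel.

    points: list of [x, y, z] coordinates; voxel_size: cube edge length.
    """
    seen = {}
    for p in points:
        # Integer voxel index along each axis identifies the cube p falls in.
        key = tuple(int(c // voxel_size) for c in p)
        seen.setdefault(key, p)
    return list(seen.values())

# Four scanned points: the first two fall in the same 1-metre voxel
# and are merged, leaving three points for the engine to render.
cloud = [[0.1, 0.1, 0.1],
         [0.2, 0.2, 0.2],
         [5.0, 5.0, 5.0],
         [9.0, 0.0, 1.0]]
reduced = voxel_downsample(cloud, voxel_size=1.0)
```

Production tools make the same density/fidelity trade-off; the voxel size controls how much spatial detail survives into the interactive experience.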