UARTS FEAST

Faculty-led engineering/arts student teams
This FEAST project will create engaging educational materials designed specifically for middle school Hispanic students to describe and encourage pathways into engineering careers, especially those in the material sciences.
The project creates sound and color analogs of people whose identities span multiple physical, emotional, cultural, and social axes. It considers the implications of combinations of sound, color, and visuals in presenting all data, including combinations that may be highly identifiable.
This UARTS FEAST project will design, prototype and test an aerospace vehicle to locate and identify endangered plant species in wetland environments. The vehicle should minimally disrupt plant and animal wildlife. Your test client is the University's Botanical Gardens.
Sonic Scenographies is a research program catalyzing experimental collaboration at the intersection of performance, music, theater, dance, architecture, information science, engineering, and digital space. Participating students will experiment with XR tools and gaming engines in Taubman College's new XR lab, and work to develop a virtual platform which interrogates the digital sphere's impact on live performance and audience participation.
1001++ (Magical Technologies) is a series of artistic inquiries inspired by Arthur C. Clarke’s 3rd law, “[a]ny sufficiently advanced technology is indistinguishable from magic.” This lens allows students on this UARTS FEAST project to re-examine folk narratives not as superstition but to read the ‘magic’ as culturally aspirational desires for applied technologies (e.g., VR, machine learning, robotics, storytelling, choreography).
This UARTS FEAST project will capture the intricate and fast physical movements required for many percussive techniques in slow motion in order to analyze both the physical conditions and sonic outcomes associated with some of percussion's signature performance demands.
“22/26 Midwest” is a net-zero building concept for the US Midwest climate. The technology aims to reduce greenhouse gas emissions from building operation, improve occupant comfort, and lower construction costs. The UARTS student team will work on prototypes, programming, and control technologies to develop the “22/26 Midwest” project.
Studio JOY draws upon interventionist art practices and psychologist Victor Frankl's logotherapy as a form of activism in the face of institutional betrayal. Interventionist art practices made in the spirit of culture jamming as defined by Mark Dery: “Groucho Marxists, ever mindful of the fun to be had in the joyful demolition of oppressive ideologies.” Studio JOY will engage in making practices that may include the construction of superhero or mascot costumes, public signage, and/or screenplays.
This UARTS FEAST project, "Picturing the Structure of Musical Spaces," will study and construct visual representations of music using mathematics. Drawing on scholarship that represents musical chords as points in geometric spaces, we will explore new ways of “picturing” these musical spaces by constructing visualizations of their structure, patterns, and symmetries.
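As a taste of the underlying idea, a chord can be treated as a point whose coordinates are its pitch classes, and distances between such points reflect how far the voices must move. This is a minimal illustrative sketch, not the project's actual method; the function names and the taxicab metric are assumptions for the example.

```python
from itertools import permutations

def chord_to_point(chord):
    """Represent a chord as a sorted point of pitch classes (mod 12)."""
    return tuple(sorted(p % 12 for p in chord))

def voice_leading_distance(a, b):
    """Smallest total motion mapping chord a onto chord b, minimized over
    voice assignments (taxicab metric on the pitch-class circle)."""
    def pc_dist(x, y):
        d = abs(x - y) % 12
        return min(d, 12 - d)  # shortest way around the circle
    return min(
        sum(pc_dist(x, y) for x, y in zip(a, perm))
        for perm in permutations(b)
    )

c_major = chord_to_point([60, 64, 67])  # C E G -> (0, 4, 7)
e_minor = chord_to_point([64, 67, 71])  # E G B -> (4, 7, 11)
print(voice_leading_distance(c_major, e_minor))  # 1: only C moves, to B
```

Nearby points in such a space correspond to chords connected by efficient voice leading, which is what makes the geometric picture musically meaningful.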
To create a healthier and more sustainable future, we need to find ways to make organic food less expensive to produce and thus more accessible to everyone. This UARTS FEAST project will explore ways to use robots to assist with organic farming and gardening tasks, collaborating with the U-M Campus Farm and working with a real robot on agricultural tasks.
Collaborators and conspirators will explore the structure, philosophy and dance of multiple forms of language, to define language and its use in multiple ways, to discover how it can be activated, (de)constructed and deciphered in relationship to effort, shape, time and space.
This project will enable a team of students to learn about environmental sensors and data, specifically around water and watersheds, and to create tools and technologies with that data that inform and empower community stakeholders.
This team is developing an interactive sound installation that helps users learn the basics of coding. Utilizing research on embodied engagement with sound and critical improvisation studies, this installation will facilitate real-time audio feedback for users’ physical interactions with it. The code that facilitates these interactions will then be displayed, helping users understand the interactive potentials of coding.
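The core loop of such an installation is a mapping from a physical reading to an audible parameter, with the mapping code itself shown to the user. The sketch below is a hypothetical example of that idea (the sensor range and frequency band are assumptions), not the team's implementation.

```python
def gesture_to_tone(distance_cm, min_cm=5, max_cm=100):
    """Map a proximity-sensor reading to a tone frequency:
    closer hands produce higher pitches (220-880 Hz)."""
    d = max(min_cm, min(distance_cm, max_cm))   # clamp to sensor range
    nearness = (max_cm - d) / (max_cm - min_cm)  # 0.0 far .. 1.0 near
    return 220 + nearness * (880 - 220)

print(gesture_to_tone(5))    # 880.0 Hz at closest approach
print(gesture_to_tone(100))  # 220.0 Hz at the far edge
```

Displaying this function alongside the sound it produces lets a visitor connect a line of code to an audible consequence of their own movement.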
ORBIT stands for the Online Resource for Building Intercultural Teams—and it’s one of many projects underway in the ORBIT Lab! We’re also developing a tool for middle schoolers to team up on social justice issues, working on a book called Creative Resilience, and collaborating with faculty in pharmacy and cardiology on an interactive dashboard to help providers better care for heart failure patients.
The goal of this project is to develop a new genre of inclusive augmented reality games and room-sized interactive systems that remove physical and social barriers to play. The project addresses the unmet need of players with different mobility abilities to play and exercise together in spaces such as school gymnasiums, community centers, and family entertainment centers.
The SparkVotes Parties project is a series of games designed to educate and energize college-age voters. Our collaborative team will be developing imaginative ways to gamify the skills and knowledge needed for campus civic participation in the 2022 election.
This team will conduct a collaborative and interdisciplinary study of shadows to expand and hybridize conceptions of shadows, from a range of fields, as a way of mining their artistic potential and imagining their use in experiential and immersive art encounters.
Following the inspiration of the meteorology community and Weather Underground, which connected backyard weather stations into the global weather system, this student team will deploy magnetometers and other sensors everywhere to make a dense distributed array that enables new science and understanding of the Earth’s space environment.
The student team will explore current participatory design theory and practices toward ideation/fabrication/production, and test developed pieces that will advance our understanding and application of participatory design.
This team will enable architecture students to translate and test spatial ideas in the design process through immersive technologies, using point clouds generated from photogrammetry and LiDAR. In addition to scanning and photogrammetry, this team will test design methodologies (experimenting with VFX and VR), create templates for workflow documentation, and establish a database for site scans and student projects.
The project is called LuCelegans (Luce: “light” in Latin; Light-up C. elegans), or the Interactive Worm Project. A student research team will build the first interactive, physical, three-dimensional prototype of the C. elegans nervous system.
The goal of this project is to explore methods of incorporating visual communication of effort, gesture, and movement into telematic performance without video transmission. Practical experiments with different sensing techniques, including infrared motion capture, inertial measurement, electromyography, and force sensing will be coupled with novel digitally fabricated mechatronic displays.
This team will make Korean Art Song (Gagok) more accessible to English speaking students by finding Korean composed song scores, creating English translations, phoneticizations and spoken recordings of song texts, and organizing these materials into an accessible database.
This HAPLAB project aims to understand the relationship between the quality of breathing and exceptional performance. We will use data visualization, sonification, and/or visceralization to communicate breathing back to musical performers.
The team will explore how pervasive technologies are mediating the way people interact with their cities. The project seeks to make visible and transparent the complex yet critical issues around the use of computer vision and artificial intelligence (as in controversial programs like Detroit’s Project Greenlight and New York’s LinkNYC systems) in public and urban spaces as we build citizen-engaged, physical installations and interventions.
This team will receive the anatomical model, then print, patent, and market a trio of 3-D polymer objects: the already designed lung/diaphragm simulator, a polymer tongue, and a voice box/vocal folds simulator. The objects will be affordably reprintable, housed in a "toolbox"-style kit of anatomically correct parts, and made available for purchase by artists, academics, and physicians.
The project team will work collaboratively on a new multi-media artwork produced through printmaking, animation, and storytelling. The project seeks to visually stretch the boundaries of the analog and digital realms of art-making into a multi-media experience.
This student team will work on a new edition of Telemann’s chorale book of 430 chorales. This will involve developing score recognition technologies to automatically transcribe the 1730 edition into machine-readable notation, and a computational model that can generate the four parts of these chorales from that notation.
The research project team will create physically and socially intelligent structures that facilitate cooperation and emotional release, while transcending the fixed expectations of architecture and infrastructure, thereby emboldening viewers to become participants.
The student team will develop better construction, testing, and shipping methods; create survey instruments and data collection strategies; and develop and test marketing materials for the ceramic water filters. This project will be complete at the end of Fall 2021.
The student team will be tasked with developing and evaluating task-specific programming language prototypes for use in integrating computing into high school and undergraduate classes.
Working with doctors at the Mayo Clinic Center for Sleep Medicine, the student team will explore the possibilities of creating techno tracks from up to four data points drawn from raw polysomnogram data (EEG, pulse, oxygenation). The goal is to convert sleep data into interesting music, enabling sleep diagnostics that are accurate and fun for the world.
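One simple way to turn a biosignal stream into musical material is to rescale each reading onto a range of MIDI note numbers. This is only an illustrative sketch under assumed defaults (C3-C6 pitch range, linear mapping), not the team's pipeline.

```python
def sonify(values, low_note=48, high_note=84):
    """Rescale a series of sensor readings (e.g., an EEG amplitude
    trace) onto MIDI note numbers between low_note and high_note."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid divide-by-zero on a flat signal
    return [
        round(low_note + (v - lo) / span * (high_note - low_note))
        for v in values
    ]

# A toy amplitude trace becomes a note sequence for a sequencer:
print(sonify([0.1, 0.5, 0.3, 0.9, 0.1]))  # [48, 66, 57, 84, 48]
```

In practice, each polysomnogram channel could drive a different musical layer (pitch, rhythm, filter cutoff), so that diagnostic features remain audible in the resulting track.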