Eye imaging in pediatric patients is notoriously difficult but is essential for screening, diagnosis, and management of pediatric eye diseases. Patient participation and cooperation are necessary for successful imaging since the patient must gaze into the imaging instrument, follow a fixation target, and hold still for several minutes. Even the smallest misalignments or movements of the eye cause signal loss or corrupt images with motion artifacts. For pediatric patients, age-appropriate fear and aversion responses limit cooperation, and thus participation, in clinical imaging. Consequently, such patients frequently require an exam under anesthesia in the operating room, which adds significant systemic medical risk, stress, delay, and cost to care. This project seeks to develop a motion-tolerant, child-friendly eye imaging system for pediatric ophthalmologists and their young patients.
The project for the upcoming term focuses on designing a robotic, interactive character capable of performing an optical coherence tomography (OCT) eye exam. By leveraging robotics, automation, and user interaction design, we seek to transform the sterile, cold, and clinical pediatric eye imaging process into an engaging activity that invites pediatric patients into an interactive partnership to achieve the diagnostic goal safely. Pediatricians and other child caregivers have long understood that child-centric activities such as storytelling and the use of videos and toys can engage pediatric patients and encourage them to participate in their care. Over the next term, we will design, prototype, and evaluate several robotic characters and incorporate the best-performing designs into the prototype medical device that we are building.
In the first stage of this project, we plan to:
- Design an experimental user study (including procedure, method, recruitment strategies, analysis plan) to collect information from children and key stakeholders about their preferences for a robot character, including the form factor and interactive behaviors of the robot
- Submit an IRB (Institutional Review Board) application for the designed study
- Collect preliminary data after IRB approval
- Record video and audio of the robot playing different characters. These videos will be used in the experimental user study and will be post-produced to be engaging, fun, and informative
Future stages of this project will include:
- Development of hardware and software for the robot, progressing from prototype to full implementation
- Data analysis of in situ human-robot interaction studies, which will take place in the Kellogg Eye Clinic
- Engagement with stakeholders, including doctors, hospital staff, and parents, to collect information about the acceptance of the robot character
Students should apply for one of the following specific roles on the team:
UX Design Research Specialist (2 students)
Preferred Skills: Experience with design research, including user journeys, personas, and ideation, and knowledge of surveys/scales/questionnaires, interview formats, and IRB applications.
Prerequisites: Any course related to UX design research, design of experiments, analysis of human data, qualitative or quantitative analysis, or survey design.
Likely Majors/Minors: ARTDES, BBA, COGSCI, COMM, CS, DATA, PSYCH, SI, STATS
AV Editing & Production (2 students)
Preferred Skills: Experience with video/sound recording and editing, and use of tools like Audacity, Adobe Audition, Premiere Pro, etc. The successful candidate will work with the team to document the process of creating a robot character and to create the final videos used during user study testing.
Likely Majors/Minors: ARTDES, CE, CS, EE, FTVM, PAT
Character Design and Fabrication (2 students)
Preferred Skills: Character design, fabrication, prototyping, presentation sketching. Students in this role will work to design and develop robot arm “characters”, and will prototype those character accessories with the exam unit. Portfolio required with application.
Likely Majors/Minors: ARCH, ARTDES, COMM, FTVM, THTREMUS
Animation/Graphic Design (2 students)
Preferred Skills: Familiarity with graphic design and/or animation software. Students in this role will produce animated clips or graphically illustrated literature aimed at educating kids and families about the upcoming eye exam, featuring characters developed to encourage participation. Portfolio required with application.
Likely Majors/Minors: ARCH, ARTDES, CS, FTVM, SI
Scriptwriting (2 students)
Preferred Skills: Experience with script and/or creative writing. Students in this role will be creating scripts and text for video, animation, and/or literature, in collaboration with videographers, animators, and graphic designers, helping kids and families understand the eye exam process. Writing samples required with application.
Likely Majors/Minors: ARTDES, COMM, ENGLISH, FTVM, THTREMUS
Faculty Project Lead
Mark Draelos is an Assistant Professor of Robotics and Ophthalmology at the University of Michigan, where he directs the Image-Guided Medical Robotics Lab. His work focuses on image-guided medical robots for both clinical and surgical applications with an emphasis on clinical translation. His interests include medical robotics, biomedical imaging, data visualization, medical device development, and real-time algorithms. Previously, Mark was a fellow in the Entrepreneurial Postdoctoral Program and a general surgery resident, both at Duke University.
Patrícia Alves-Oliveira is an Assistant Professor in Robotics at the University of Michigan. Previously, she was a Senior UX Designer for the Astro robot at Amazon Lab126. Patrícia designs interactions for social robots that empower and enhance human well-being. Her interdisciplinary background unifies the fields of robotics, design research, and psychology. Patrícia was a Postdoctoral Research Associate in the Computer Science and Engineering Department at the University of Washington. She received her Ph.D. from ISCTE-University Institute of Lisbon and spent time at Cornell University as a Visiting Graduate Scholar. Her research received two Best Paper Awards at the International Conference on Human-Robot Interaction. She co-founded Talking Robotics and now volunteers with Open Style Lab.
Nita Valikodath is a Clinical Assistant Professor of Ophthalmology at the University of Michigan. She is an adult and pediatric vitreoretinal surgeon. Her research focuses on biomedical device development and innovative imaging solutions to improve clinical and surgical care for adult and pediatric patients. She also has expertise in applications of telemedicine and artificial intelligence in pediatric retina. She is a K12 scholar and will be focusing on clinical trials and medical device development. She graduated from the University of Michigan Medical School and also obtained a Master of Science in Clinical Research from the University of Michigan. She completed her ophthalmology residency at the University of Illinois at Chicago and her vitreoretinal surgery fellowship at Duke University.
Students: 10
Likely Majors/Minors: ARCH, ARTDES, BBA, CE, COGSCI, COMM, CS, DATA, EE, ENGLISH, FTVM, PAT, PSYCH, SI, STATS, THTREMUS
Meeting Details: This project will meet weekly for one hour in the Robotics Building. The meeting day and time will be set according to the team's availability. Meetings will take place in person, with occasional hybrid meetings if needed.
Application: Consider including a link to your portfolio or other websites in the personal statement portion of your application to share work you would like considered as part of your submission.
Summer Opportunity: Summer research fellowships may be available for qualifying students.
Citizenship Requirements: This project is open to all students on campus.
IP/NDA: Students who successfully match to this project team will be required to sign an Intellectual Property (IP) Agreement prior to participation.
Course Substitutions: Creative Expression, Intellectual Breadth, CoE Honors, Flex Tech (CoE), other capstone and substitutions as arranged.