The mission of the UW Biorobotics Lab (BRL) is to develop science, technology, and human resources at the interface between robotics and biological movement systems. The two main focuses of the lab during my time there were robotic surgery and novel haptic interfaces.
I sought to develop an approach for characterizing the mechanical properties of soft tissue in real time. Identifying these tissue-specific properties could enable partial automation of grasping in robotic surgery, preventing unintended tissue damage, and could feed machine learning algorithms that distinguish different types of healthy and pathological tissue.
To address this problem, I designed, built, and programmed a robotic device that collects force and tissue-displacement data as it palpates soft tissue. Building the device from scratch honed my CAD, machining, and electrical circuit design skills. To identify parameters of several nonlinear, dynamic models of soft tissue from the collected data, I implemented an Unscented Kalman Filter (UKF), a nonlinear, recursive filtering algorithm, in C/C++. I chose the UKF because it allows simultaneous online identification of tissue model parameters and the states of the dynamic system (robot position).
After calibrating the sensors and actuators on the device, I tested and validated system performance on synthetic tissue and in vivo porcine abdominal organs. Initial performance was very promising and the project was continued by several students in the Biorobotics Lab following my graduation.
Here is a video demo of the device moving without any tissue:
This highly interdisciplinary project allowed me to hone my skills in advanced signal processing and data analysis algorithms as well as embedded-system programming and hardware development. I won a Mary Gates Research Scholarship for my project and presented my work at the 2012 UW Undergraduate Research Symposium.

Download Report (PDF)
Two graduate students and I developed a glove that lets a user manipulate background sound effects by moving their hand in space and bending their fingers, as if playing an instrument. I co-developed the glove hardware and microcontroller code to track finger bending with piezoresistive sensors, drive vibration motors with intensity proportional to finger bend, and relay sensor data to a computer over Bluetooth. We used a Microsoft Kinect to track the user's hand position, allowing different sound effects to be controlled at different locations in space. As a group, we integrated the data in a music synthesis platform to generate sound effects and vary their volume, speed, and pitch with finger bending. Our project has since been extended in the BRL as a general-purpose device for haptic interaction with virtual environments.
Here is a video demonstration of our final prototype in action: