I focus on how individuals control their bodies during a variety of movements. I have studied how individuals with a diverse range of capabilities perform turning tasks, such as dancers performing multiple revolutions, athletes circumventing obstacles, and older adults changing direction while walking. By advancing our understanding of how individuals satisfy competing mechanical objectives, we can personalize intervention strategies to improve performance and reduce the risk of injury. I am developing biofeedback technology that improves movement mechanics by encouraging successful mechanical strategies identified experimentally.

At Rush University Medical Center, we are launching a number of projects and are currently seeking students interested in the following topics: overhand throwing, tracking motion of the upper extremity, comparing upper extremity surgical methods, and developing biofeedback using wearable sensors. Please see my faculty page for more information.

Jumping during optical motion capture at the Rush Motion Analysis Lab, with real-time marker tracking, while testing the feasibility of a scapula-tracking grid system.

Prior to joining Rush, I was a Postdoctoral Fellow in the Mechanical Engineering Department at the University of Michigan. As a postdoc, I worked with Professor Noel Perkins and his lab to further understand the dynamics of successful and unsuccessful pedestrian and athletic maneuvers using inertial measurement units (IMUs). These IMUs are useful for studying the orientations and accelerations of body segments in real-world contexts (outside of the laboratory). For more information about this research, please visit http://www.ncp.engin.umich.edu/. While at Michigan, I also led a multidisciplinary effort to create a biofeedback device that provides real-time sound feedback to dancers about the alignment of their pelvis.
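To give a flavor of how IMU signals can be turned into segment orientation, below is a minimal Python sketch of a complementary filter that blends integrated gyroscope rates with a gravity-based tilt estimate. The sampling rate, blend weight, and axis conventions are illustrative assumptions, not a description of the lab's actual processing pipeline.

```python
import numpy as np

def complementary_filter_tilt(gyro, accel, dt=0.01, alpha=0.98):
    """Estimate segment tilt (roll, pitch) in radians from IMU data.

    gyro  : (N, 3) angular velocity in rad/s
    accel : (N, 3) specific force in m/s^2
    dt    : sample period in seconds (100 Hz assumed here)
    alpha : blend weight favoring the integrated gyro estimate
    """
    n = gyro.shape[0]
    roll = np.zeros(n)
    pitch = np.zeros(n)
    for k in range(1, n):
        # Integrate angular velocity for a responsive short-term estimate.
        roll_gyro = roll[k - 1] + gyro[k, 0] * dt
        pitch_gyro = pitch[k - 1] + gyro[k, 1] * dt
        # The gravity direction gives a drift-free (but noisy) reference.
        ax, ay, az = accel[k]
        roll_acc = np.arctan2(ay, az)
        pitch_acc = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
        # Blend: gyro tracks fast changes, accelerometer corrects drift.
        roll[k] = alpha * roll_gyro + (1 - alpha) * roll_acc
        pitch[k] = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    return roll, pitch
```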

Please see below for other research interests and example biomechanics videos:

SONIFICATION AS FEEDBACK DURING TURNS

Effective translation of research findings to a performer entails a comprehensive understanding of contextually relevant feedback modalities. Therefore, I have been developing interactive auditory real-time feedback, because it is an especially relevant feedback format for dancers performing turns. Sonified feedback for skill acquisition is a relatively new field, but it has the potential to augment clinical practice, in-field training, and at-home rehabilitation game design, especially during turning tasks.

Why provide feedback during turns? Turns are fundamental to daily living and particularly challenging for clinical populations who struggle to maintain balance. Additionally, for applications of interactive media within the gaming and rehabilitation industries, incorporating turning into gameplay is strategic for use in smaller spaces.

Why use sound feedback? Visual feedback is the typical format used by biomechanists and clinicians. However, forcing the visual system to focus on a display during turning tasks could overtax the visual processing that turning already requires. Further, within my dissertation's focus, dancers are accustomed to, and attracted to, interacting with musical cues, supporting sonified feedback as a contextually relevant modality.

Sonified feedback at the whole-body level: As part of my dissertation, I piloted sonifying reaction forces during a series of balance regulation and turning tasks in collaboration with interactive media designer Vangelis Lympouridis. For instance, one dancer's most common error was to fall backward during a turn when the reaction force was directed behind her for too long during the turn's initiation. Therefore, we designed a sound environment that encourages reaction forces directed in front of her by using harmonic sounds and discourages backward reaction forces by using dissonant sounds.
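To make that mapping concrete, here is a minimal Python sketch of how the direction of the horizontal reaction force could be reduced to a single consonance value for a sound engine (1 = fully forward, harmonic; 0 = fully backward, dissonant). The function name, neutral fallback, and rescaling are illustrative assumptions, not the actual sound design built with the interactive media tools.

```python
import numpy as np

def grf_to_consonance(grf_horizontal, facing_direction):
    """Map horizontal ground reaction force direction to a 0-1 consonance value.

    grf_horizontal   : (2,) horizontal GRF vector (x, y) in N
    facing_direction : (2,) unit vector pointing 'forward' for the dancer
    """
    magnitude = np.linalg.norm(grf_horizontal)
    if magnitude < 1e-6:
        return 0.5  # no meaningful horizontal force: neutral sound
    # Cosine of the angle between the GRF and the forward direction, in [-1, 1].
    alignment = np.dot(grf_horizontal, facing_direction) / magnitude
    # Rescale to [0, 1]: forward -> consonant, backward -> dissonant.
    return 0.5 * (alignment + 1.0)

# Example: a force directed mostly behind the dancer yields a value near 0,
# which the sound engine would render as a more dissonant timbre.
print(grf_to_consonance(np.array([-50.0, 10.0]), np.array([1.0, 0.0])))
```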

Sonified feedback at the subsystem level: We are developing real-time interactive media focused on improving the alignment of forces relative to body segments. For example, during turning tasks, loads are transmitted upward through the joints of the leg, and if the mechanical demand imposed on these joints exceeds the individual's control capabilities, the risk of injury increases. However, reaction forces aligned with body segments reduce the load imposed on the joints and can be advantageous. Sonification can be used to encourage alignment between the leg's segments and the reaction forces, simplifying multijoint control requirements.
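As a rough illustration of the underlying calculation, the sketch below computes the angle between the reaction force and a leg segment's long axis and maps it to a sonification parameter. The segment definition, angle threshold, and "detune" mapping are hypothetical placeholders for illustration, not the mapping used in the lab.

```python
import numpy as np

def force_segment_misalignment(grf, segment_proximal, segment_distal):
    """Angle (degrees) between the reaction force and a leg segment's long axis.

    grf              : (3,) ground reaction force vector in N
    segment_proximal : (3,) position of the segment's proximal joint center (m)
    segment_distal   : (3,) position of the segment's distal joint center (m)

    Small angles mean the force is closely aligned with the segment,
    which reduces the joint moments the performer must control.
    """
    segment_axis = segment_proximal - segment_distal
    cos_angle = np.dot(grf, segment_axis) / (
        np.linalg.norm(grf) * np.linalg.norm(segment_axis)
    )
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def misalignment_to_detune(angle_deg, max_angle=30.0):
    """Map misalignment to a 0-1 detune amount: aligned -> pure tone, misaligned -> detuned."""
    return float(np.clip(angle_deg / max_angle, 0.0, 1.0))
```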