Gait retraining to reduce the knee adduction moment is an important treatment for knee osteoarthritis, but current paradigms do not account for differences in human anatomy, do not allow patients to change walking direction, and provide haptic feedback that is difficult to perceive during gait, all of which have prevented clinical application. This project investigates human movement control, sensing, and haptic feedback to explore the scientific basis for a wearable system to treat knee osteoarthritis, built on the novel concepts of data-driven adaptable gait prediction optimization, real-time heading vector estimation, and skin mechanical resonance haptic feedback. The research draws on technical theory and scientific experiments to investigate the following issues: real-time gait biomechanics modeling and wearable sensor fusion, gait prediction optimization modeling and control, and wearable skin stretch feedback device design.
This project focuses on three key problems: 1) the inability to accurately predict new gait kinematics that reduce the knee adduction moment, 2) the inability to measure human gait kinematics after a change in walking direction, and 3) haptic feedback that is imperceptible during gait. A wearable sensing and haptic feedback system prototype will be developed to demonstrate key research findings, and clinical testing will be performed to establish a scientific basis for practical application. The results of this study are intended to provide a scientific foundation for wearable gait retraining systems research.
Wearable System for Gait and Balance Training
Wearable system hardware
Eight distributed nodes (Dots) simultaneously send data to and receive data from the central control unit (Hub), where real-time computation and control algorithms are performed. Dots are configured in software to act as sensors, feedback vibrotactors, or both. Each Dot comprises a 9-axis IMMU sensor, a vibration feedback motor, a ZigBee wireless communication module, and a 100 mAh lithium-ion battery. All Dot components are embedded in a single silicone mold. Each Dot weighs 12 grams, and its top surface area is roughly that of a 25 mm coin.
Real-time sensing and feedback control diagram
Raw wearable sensing data is transformed into relevant biomechanical measurements through the real-time gait biomechanics model. The gait prediction optimization model outputs desired biomechanical measurements, and wearable haptic feedback is used to inform patients of required gait changes.
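The sense, model, compare, feedback cycle described above can be sketched as a simple loop. This is a hypothetical illustration only: the function names, baseline heading, target angle, tolerance, and cue labels are all assumptions, not details of the actual system.

```python
# Hypothetical sketch of one iteration of the sense -> model -> feedback loop.
# Thresholds, names, and cue labels are illustrative, not from the real system.

def gait_model(raw_heading_deg):
    """Toy biomechanics model: convert a raw heading (deg) into a
    foot progression angle relative to an assumed baseline heading."""
    baseline_heading = 90.0
    return raw_heading_deg - baseline_heading

def feedback_command(measured_fpa, target_fpa, tolerance=2.0):
    """Return a vibrotactile cue only when the measured angle deviates
    from the target by more than the tolerance (deg)."""
    error = measured_fpa - target_fpa
    if abs(error) <= tolerance:
        return "none"
    return "toe_in_cue" if error > 0 else "toe_out_cue"

# One iteration of the loop with a synthetic sensor sample:
raw = 98.5                        # raw heading from wearable sensing
fpa = gait_model(raw)             # -> 8.5 deg foot progression angle
cue = feedback_command(fpa, target_fpa=3.0)
print(cue)                        # prints "toe_in_cue"
```

In a real system the model output would drive the Dots' vibration motors rather than a printed string, and the target would come from the gait prediction optimization model.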
Mobile devices such as smartphones, tablets, and laptops are increasingly integral to our daily lives. Traditional interaction paradigms like on-screen keyboards can be an inconvenient and cumbersome way to interact with these devices. This has led to the advancement of bio-signal interfaces such as speech-based and hand-gesture-based interfaces. Compared to speech input, hand gesture interfaces can be more intuitive because they resemble the physical manipulations involved in spatial tasks such as navigating a map or manipulating a picture.
Electromyography (EMG) is well suited for capturing static hand features involving relatively long and stable muscle activations, while inertial sensing inherently captures dynamic features related to hand rotation and translation. This project explores a hand gesture recognition wristband based on combined EMG and IMU signals. Preliminary testing was performed on four healthy subjects to evaluate a classification algorithm for identifying four surface pressing gestures at two force levels and eight air gestures. Average classification accuracy across all subjects was 88% for surface gestures and 96% for air gestures. Classification accuracy improved significantly when EMG and inertial sensing were used in combination compared to either sensing modality alone.
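The benefit of combining modalities comes from feature-level fusion: per-window EMG features and inertial features are concatenated before classification. The sketch below illustrates the idea with a simple nearest-centroid classifier; the feature values, gesture labels, and classifier choice are all synthetic assumptions, not the study's actual algorithm or data.

```python
import numpy as np

# Illustrative sketch: feature-level fusion of EMG and accelerometer (ACC)
# features followed by a nearest-centroid classifier. All values synthetic.

def fuse(emg_feats, acc_feats):
    """Concatenate per-window EMG and accelerometer feature vectors."""
    return np.concatenate([emg_feats, acc_feats])

def nearest_centroid(train_X, train_y, sample):
    """Classify a fused feature vector by its closest class centroid."""
    labels = sorted(set(train_y))
    centroids = {c: train_X[np.array(train_y) == c].mean(axis=0)
                 for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(sample - centroids[c]))

# Two toy gestures: "OK" (strong muscle activity, little motion) versus
# "TP" (turn palm over: little activity, large motion).
X = np.array([fuse([0.90, 0.80], [0.10, 0.10]),
              fuse([0.85, 0.75], [0.15, 0.05]),
              fuse([0.20, 0.30], [0.90, 0.80]),
              fuse([0.25, 0.20], [0.85, 0.90])])
y = ["OK", "OK", "TP", "TP"]

print(nearest_centroid(X, y, fuse([0.8, 0.7], [0.2, 0.1])))  # prints "OK"
```

With only one modality the two gestures above would be harder to separate; fusing both feature sets makes the class centroids well separated, which is the intuition behind the improved combined-sensing accuracy.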
EMG-IMU Wristband for Hand Gesture Recognition
Overall structure of the hybrid EMG and IMU data acquisition wristband
Air Gesture Recognition Rate
Classification accuracy of eight air gestures: okay sign (OK), peace sign (PS), hang loose (HL), finger snap (FS), thumbs up (TU), thumbs down (TD), turn palm over (TP), walking fingers (WF) for a representative subject. Classification accuracy is shown when using EMG or inertial sensing (ACC) individually, and in combination (Both).
Pressure Sensing Hand Gesture Recognition
Hand movement measurements are vital for upper extremity rehabilitation, particularly after stroke. Wearable hand gesture recognition is a promising approach as it has been shown to aid in upper extremity stroke rehabilitation. Wearables can further serve to monitor short- and long-term rehabilitation progress for acute injuries and chronic disease by providing objective patient metrics to enable therapists to optimize such rehabilitation programs. Wearable hand gesture recognition is also fundamental to enabling intelligent upper extremity prosthetics to function in real-life situations.
This project presents a new approach to wearable hand gesture recognition and finger angle estimation based on barometric pressure sensing. A wearable prototype consisting of an array of barometric pressure sensors around the wrist was developed and validated with experimental testing for three different hand gesture sets and finger flexion/extension trials for each of the five fingers. This project demonstrates that a barometric pressure wristband can be used to classify hand gestures and to estimate individual finger joint angles. This approach could serve to improve clinical treatment for upper extremity deficiencies, such as for stroke rehabilitation, by providing objective patient motor control metrics to inform and aid physicians and therapists throughout the rehabilitation process.
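Finger joint angle estimation from wrist pressure can be framed as a regression problem: as a finger flexes, the tendons deform the skin under the wristband and change the barometric sensor readings. The sketch below fits a least-squares linear map from a (synthetic) pressure reading to a flexion angle; the calibration values and the linear model are assumptions for illustration, not the project's actual method or data.

```python
import numpy as np

# Hedged sketch: map a normalized wrist pressure reading to a finger
# flexion angle with a least-squares linear fit. Calibration data below
# is synthetic, not measured.

pressure = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # normalized sensor units
angle = np.array([0.0, 22.0, 46.0, 68.0, 91.0])  # finger joint angle (deg)

# Fit angle ~ a * pressure + b
A = np.vstack([pressure, np.ones_like(pressure)]).T
a, b = np.linalg.lstsq(A, angle, rcond=None)[0]

def estimate_angle(p):
    """Estimated finger flexion angle (deg) for a pressure reading p."""
    return a * p + b

print(round(estimate_angle(2.5), 1))  # prints 56.8
```

A real pipeline would regress against the full sensor array (and likely a nonlinear model) per finger, but the calibrate-then-estimate structure is the same.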
Barometric Pressure Sensing Wristband for Hand Gesture Recognition
Classification Algorithm Flowchart
Three Groups of Hand Gestures for Validation Testing
Sensor Fusion Algorithms
The aim of this project is to develop novel multi-sensor fusion models that combine wearable sensing data (accelerometer, gyroscope, and magnetometer) to compute clinically important human kinematics during dynamic movement. We initially focus on accurate sensing during gait of the foot progression angle (relevant to knee loading in knee osteoarthritis) and the trunk sway angle (relevant to walking stability).
One primary problem with current kinematic estimation algorithms is that gravity and magnetic field vectors are measured in the earth's reference frame, but gait kinematics must be expressed in the human body reference frame. Thus, a static trial is typically performed before the walking trial to align the earth reference frame with the body reference frame. However, this only works when the two frames do not move with respect to each other during the entire walking trial. Current research on wearable kinematic sensing is therefore confined to straight-line walking and does not allow subjects to change walking direction, yet in real-life situations humans are constantly turning and changing direction.
Thus, research will be performed to develop a model that computes human gait kinematics in real time during changes of walking direction, based on multiple wearable sensors attached to multiple body segments. First, sensor fusion algorithms will be developed to combine accelerometer, gyroscope, and magnetometer signals and accurately estimate the orientation of each body segment at the sensor locations; this requires solving the drift problem of integrating gyroscope angular velocities, the environmental magnetic noise problem of magnetometers not always measuring true magnetic north, and the unwanted-acceleration problem of accelerometers. Second, individual body segment orientations will be combined to compute anatomical gait kinematics, which involves solving the problem of inaccurate sensor placement on the body. Third, all gait kinematics will be projected into the reference frame of the human so that they remain valid after changes in walking direction. Accelerometer, gyroscope, and magnetometer data will be fused and integrated, and gradient descent, Kalman, and custom filters will be applied to estimate kinematics. A Vicon marker-based motion capture system is used to validate measurement accuracy.
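The core drift-versus-noise trade-off behind these fusion filters can be illustrated with a one-axis complementary filter: gyroscope integration is accurate over short intervals but drifts with bias, while the accelerometer tilt estimate is noisy but drift-free. The blending gain, sample data, and single-axis simplification below are illustrative assumptions; the project itself targets full 9-axis fusion with gradient descent, Kalman, and custom filters.

```python
# Minimal single-angle complementary filter sketch of gyro/accelerometer
# fusion. Gains and sample data are illustrative, not from the project.

def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Blend integrated gyro rate (accurate short-term, drifts long-term)
    with accelerometer tilt angles (noisy, but drift-free)."""
    angle = accel_angles[0]
    history = []
    for w, a_angle in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + w * dt) + (1.0 - alpha) * a_angle
        history.append(angle)
    return history

# Stationary segment: true tilt is 10 deg, gyro has a +1 deg/s bias.
n = 2000  # 20 s at 100 Hz
est = complementary_filter([1.0] * n, [10.0] * n)
print(round(est[-1], 2))  # settles near 10.5; pure integration would drift to 30
```

The accelerometer term continuously pulls the estimate back toward the drift-free reference, bounding the gyro bias error instead of letting it grow without limit, which is the same principle the gradient descent and Kalman formulations exploit with all nine sensor axes.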
Foot Progression Angle Estimation
Accelerometer, gyroscope, and magnetometer data fusion from the miniaturized sensor to estimate foot progression angle regardless of walking direction.
Trunk Sway Angle Estimation
Accelerometer and gyroscope sensor fusion to estimate medial-lateral trunk sway angle from the miniaturized sensor during walking and running gait.
Smart Sensing Shoes
This work focuses on the development and validation of a smart shoe for estimating foot progression angle during walking. The smart shoe is composed of an electronic module with inertial and magnetometer sensing inserted into the sole of a standard walking shoe. The smart shoe charges wirelessly and up to 160 hours of continuous data (sampled at 100 Hz) can be stored locally on the shoe. Foot progression angle estimations from the smart shoe were compared with estimations from a standard motion capture system. In general, foot progression angle estimations from the smart shoe closely followed motion capture estimations for all walking conditions with an overall average estimation error of 0.1 ± 1.9 deg and an overall average absolute estimation error of 1.7 ± 1.0 deg. There were no significant differences in foot progression angle estimation accuracy among the seven different walking gait patterns. The presented smart shoe could potentially be used for knee osteoarthritis or other clinical applications requiring foot progression angle assessment in daily life or in clinics without specialized motion capture equipment. The smart shoe could also potentially be used for interactive games, assessing athletic performance, or assisting in rehabilitation.
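The foot progression angle itself is the signed horizontal-plane angle between the foot's long-axis heading and the walking direction. The sketch below shows that geometry with synthetic vectors; the sign convention (positive = toe-out) is a common one but is an assumption here, not a stated detail of the smart shoe.

```python
import math

# Illustrative foot progression angle (FPA) computation: signed angle
# between the foot's long-axis heading and the walking direction in the
# horizontal plane. Vectors and sign convention are assumptions.

def heading_deg(vx, vy):
    """Heading of a horizontal-plane vector, in degrees."""
    return math.degrees(math.atan2(vy, vx))

def foot_progression_angle(foot_vec, walk_vec):
    """Positive = toe-out, negative = toe-in (assumed convention)."""
    fpa = heading_deg(*foot_vec) - heading_deg(*walk_vec)
    return (fpa + 180.0) % 360.0 - 180.0  # wrap into (-180, 180]

# Foot axis rotated 8 deg outward from the walking direction:
walk = (1.0, 0.0)
foot = (math.cos(math.radians(8)), math.sin(math.radians(8)))
print(round(foot_progression_angle(foot, walk), 1))  # prints 8.0
```

The wrapping step matters for a wearable that works "regardless of walking direction": after a turn, both headings change together, but their wrapped difference still yields the correct small angle.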
Smart Shoe Design and Demo
Electronics (3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, wireless data transmission, and wireless charging circuits) are embedded in the shoe sole.
Smart shoe charges wirelessly.
Energy Harvesting Shoes
Wearable energy harvesting during gait is a promising avenue for powering the vast array of recently available wearable devices for clinical and recreational purposes; however, current approaches either produce too little power to be useful or increase the human metabolic cost of gait, preventing practical application. This project will investigate movement control, adaptation, and energy consumption to explore the scientific basis for a regenerative, smart shoe system for harvesting energy during gait, based on the novel concepts of comprehensive human mechanical and metabolic energy estimation, gait change prediction, and horizontal foot sole slider energy harvesting. The research will be conducted based on technical theories and scientific experiments to investigate the following issues: a human mechanical and metabolic energy gait model, a gait change prediction model to convert negative work to positive work, and a smart shoe regenerative harvester design. Three key problems are expected to be solved: unknown gait braking energy loss, converting negative-work braking energy into positive-work propulsion energy, and harvesting gait energy without increasing metabolic cost. A regenerative, smart shoe system prototype will be developed to demonstrate key research findings, and gait testing will be performed to establish a scientific basis for practical application. The results of this study will provide a scientific foundation for energy harvesting research aimed at powering wearable devices.
Energy Harvesting Sliding Shoe
Custom sliding shoe created by mounting a sliding mechanism to the sole of a standard walking shoe. Energy is harvested as the shoe slides during the stance phase of gait by the generator on the sliding mechanism. The compressed spring returns the sliding mechanism to the original position during swing phase when the shoe is not contacting the ground.
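A back-of-envelope energy budget for one sliding step follows directly from the mechanism described above: the generator extracts work over the slide distance during stance, while the spring stores energy that is returned during swing. Every number below (generator force, slide distance, spring stiffness, conversion efficiency) is an illustrative assumption, not a measurement from the prototype.

```python
# Back-of-envelope sketch of energy per sliding step: work done against
# the generator over the slide distance, plus energy stored in the spring
# for the swing-phase return. All numbers are illustrative assumptions.

def harvested_energy(gen_force_n, slide_m, spring_k, efficiency):
    """Return (electrical energy captured, spring energy stored) in joules
    for one stance-phase slide."""
    mech_work = gen_force_n * slide_m            # J, work into the generator
    spring_energy = 0.5 * spring_k * slide_m**2  # J, stored for spring return
    return efficiency * mech_work, spring_energy

elec, spring = harvested_energy(gen_force_n=20.0, slide_m=0.02,
                                spring_k=500.0, efficiency=0.5)
print(round(elec, 3), round(spring, 3))  # 0.2 J electrical, 0.1 J in spring
```

At roughly one step per second per shoe, 0.2 J per step would correspond to about 0.2 W of continuous electrical power under these assumed parameters, the order of magnitude relevant for powering small wearable sensors.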
Resonant frequency skin stretch uses cyclic lateral skin stretches matching the skin's resonant frequency to create highly noticeable stimuli, signifying a new approach for wearable haptic stimulation. In this project, three experiments were performed to explore biomechanical and perceptual aspects of resonant frequency skin stretch. In the first experiment, skin resonant frequencies were quantified at the forearm, shank, and foot. In the second experiment, perceived haptic stimuli were characterized for skin stretch actuations across a spectrum of frequencies. In the third experiment, haptic classification ability was determined as subjects differentiated haptic stimulation cues while sitting, walking, and jogging. Results showed that subjects perceived stimulations at, above, and below the skin's resonant frequency differently: stimulations below the skin resonant frequency felt like distinct impacts, stimulations at the skin resonant frequency felt like cyclic skin stretches, and stimulations above the skin resonant frequency felt like standard vibrations. Subjects successfully classified stimulations while sitting, walking, and jogging, although classification accuracy decreased with increasing speed, especially for stimulations at the shank. This work could facilitate more widespread use of wearable skin stretch. Potential applications include gaming, medical simulation, surgical augmentation, and movement training to reduce injury risk or improve sports performance.
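Quantifying a skin resonant frequency from high-speed-camera displacement data amounts to finding the dominant peak of the displacement spectrum. The sketch below does this with an FFT on a synthetic damped 60 Hz response; the frame rate, damping, and 60 Hz value are illustrative assumptions, not measured skin properties from the experiments.

```python
import numpy as np

# Sketch: estimate a resonant frequency from tracked displacement data
# via an FFT peak. Signal is synthetic (damped 60 Hz response); actual
# skin values vary by body site.

fs = 1000.0                          # assumed camera frame rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
disp = np.sin(2 * np.pi * 60.0 * t) * np.exp(-2.0 * t)  # synthetic displacement

spectrum = np.abs(np.fft.rfft(disp))
freqs = np.fft.rfftfreq(len(disp), d=1.0 / fs)
resonant_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

print(resonant_freq)  # peak near 60 Hz
```

With the peak identified, actuation frequencies can then be chosen below, at, or above it to produce the three perceptually distinct stimulus classes described above.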
Wearable Skin Stretch Prototype
The overall size is 29 × 16 × 20 mm, and the weight is 18.4 g.
Skin resonance experimental setup showing the device at the (a) left posterior forearm, (b) left posterior shank and (c) left top of foot. A high-speed camera was used to track device position during testing.
Existing wearable devices are typically hard, manufactured with techniques for rigid consumer electronics like smart phones, and are thus often not well suited for use on human skin, which is soft and elastic. We focus on developing elastic substrates which closely match human skin properties. Electronics, sensors, actuators, and wires are embedded in elastic substrates as part of the design process. Soft robotic sensing devices contain accelerometers, gyroscopes, and magnetometers, and haptic devices contain small vibration motors for haptic vibration feedback.
Designing and fabricating elastic substrates for electronic sensing and feedback devices is challenging due to the inherent properties of flexible/elastic materials. Specific challenges include susceptibility to microcracks during the deposition process and the potential for buckling and wrinkling of rigid thin films on elastic substrates. Additionally, adhesion of deposited metal to elastic substrates can be difficult given the mismatch in thermal expansion coefficients between the two materials.
Lithography and electroforming are critical to the fabrication process. First a layer of PDMS is coated on a rigid glass base. This is followed by seed layer deposition and the use of lithography to form a resist layer to define the circuit structure. Electroforming is then used to make the gold metal conductive circuit. The resist layer is then removed and the seed layer etched. The sensors, microcontroller, battery, and other necessary electronic components are then soldered to the circuit substrate to complete the device packaging. Finally, the glass base is released leaving only the elastic substrate.
Manufacturing soft sensor and feedback devices on an elastic substrate
Elastic substrate is 0.5 mm thick
Elastic substrate with MCU, Zigbee wireless, 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, battery, power regulator, and vibration motor
Miniaturized Sensing and Haptic Feedback
Movement retraining has the potential for effectively treating musculoskeletal diseases and improving rehabilitation for clinical and sports applications. While movement retraining in controlled laboratory settings has shown promise, practical applications of portable movement retraining have not been achieved due to critical barriers in wearable hardware, sensing, real-time modeling, and haptic feedback, specifically: cumbersome and uncomfortable hardware, inaccurate sensing, lack of real-time functionality, and ineffective haptic feedback paradigms.
This project seeks to solve these problems by: 1) combining sensing and feedback in a small, comfortable package, 2) developing a wearable sensing framework for accurate human kinematics estimation, 3) modeling key kinematic, kinetic, and muscle force gait parameters in real-time, and 4) creating novel haptic feedback paradigms for fast and effective movement training. The goal of this project is to expand the scope and effectiveness of human movement training beyond systems tethered to the laboratory to extend health and lifestyle benefits to the general public. Results are expected to produce a movement retraining methodology and portable device platform prototype with wearable movement sensing, wearable haptic feedback, and real-time dynamic modeling.
Real-Time Orientation Estimation
Orientation estimation performed via 3-axis accelerometer, 3-axis gyroscope, and 3-axis magnetometer sensor fusion
Combined sensing and feedback package: accelerometer, gyroscope, magnetometer and vibration motor
Made through a 2-step embedded electronics pouring process based on a 3D printed mold
Size and Weight
Height: 22.5 mm, Width: 20.5 mm, Depth: 9.3 mm, Weight: 6.8 g
Attaches to the skin with skin-safe tape (no straps needed), like a motion capture reflective marker
Wearable system contains eight wirelessly interconnected Dots for movement analysis and retraining