Designed and programmed in collaboration with Eric Wells.
Gesture control is becoming a popular technique in the growing areas of virtual reality and rehabilitation. The popular Myo Armband achieved this using electromyography (EMG) sensors placed on the forearm. However, that device was recently discontinued, leaving a gap in the marketplace.
We achieved the same result as the Myo Armband using a unique method based on Force Sensitive Resistors (FSRs), which cost a fraction of the price of EMG sensors. The FSRs are wrapped around the user's forearm and measure the pressure changes produced by different muscle configurations. We use a desktop robotic arm to demonstrate the gesture control.
Everything was constructed within the 24-hour time limit of the hackathon. This included:
- Hardware design and construction
- Mechanical design and construction
- Machine learning algorithm design and implementation
- Implementation of trained model onto functioning hardware
- All software involved in reading and mapping sensor data
- Non-repeatability in the FSR readings, caused by the compliant wristband, meant many retraining sessions were required
- Successfully classified 4 different gestures
- Successfully controlled a 3-degree-of-freedom (DOF) robotic arm
How to train and design various classifiers in MATLAB, and how to export the trained model for implementation on real-time hardware.
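Once a model is exported from MATLAB, running it on hardware reduces to evaluating the decision functions with hard-coded coefficients. The snippet below sketches that idea for a 4-class linear SVM in one-vs-rest form; the weights, biases, and feature count are illustrative placeholders, not the project's trained values.

```cpp
#include <array>

constexpr int kNumClasses = 4;   // 0=relax, 1=extension, 2=flexion, 3=fist
constexpr int kNumFeatures = 4;  // toy feature count for illustration

using Features = std::array<double, kNumFeatures>;

// Placeholder coefficients: in a real build these arrays would be
// filled with the weights and biases exported from the MATLAB model.
constexpr double kWeights[kNumClasses][kNumFeatures] = {
    {-1.0, -1.0, -1.0, -1.0},  // relax: fires when all pressures are low
    { 2.0, -0.5,  0.0,  0.0},  // extension
    {-0.5,  2.0,  0.0,  0.0},  // flexion
    { 0.5,  0.5,  1.5,  1.5},  // fist
};
constexpr double kBias[kNumClasses] = {0.5, -0.2, -0.2, -0.8};

// Evaluate each one-vs-rest decision function and return the argmax class.
int classify(const Features& x) {
    int best = 0;
    double bestScore = -1e9;
    for (int c = 0; c < kNumClasses; ++c) {
        double score = kBias[c];
        for (int j = 0; j < kNumFeatures; ++j) score += kWeights[c][j] * x[j];
        if (score > bestScore) { bestScore = score; best = c; }
    }
    return best;
}
```

Because inference is just a few multiply-adds, this runs comfortably in an Arduino loop at the sensor sampling rate.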
Cable management and a more robustly designed mechanical wristband would allow for better repeatability, and likely more classification options.
- C++
- Arduino
- MATLAB
- hardware
- machine-learning
- SVM
- human-computer-interaction
- 0 --> relax
- 1 --> extension
- 2 --> flexion
- 3 --> fist
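The label encoding above can be turned into readable commands for the arm controller with a simple lookup; the enum and function names here are ours, only the 0-3 encoding comes from the project.

```cpp
#include <string>

// The four class labels output by the classifier, matching the mapping above.
enum class Gesture { Relax = 0, Extension = 1, Flexion = 2, Fist = 3 };

// Translate a predicted label into a command string that could be
// sent over serial to the robotic arm controller.
std::string gestureName(int label) {
    switch (static_cast<Gesture>(label)) {
        case Gesture::Relax:     return "relax";
        case Gesture::Extension: return "extension";
        case Gesture::Flexion:   return "flexion";
        case Gesture::Fist:      return "fist";
        default:                 return "unknown";
    }
}
```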