Winning Project from 2019
Winning project from the fifth annual CNT hackathon, held February 2019.
RISE (winning team)
Melchizedek Mashiku (GSU), Preston Pan (UW), and Karley Beniff (UW)
Rehabilitation for Independent Seated Extension (RISE)
RISE uses electromyography (EMG), accelerometry/magnetometry, and patient gaze data to create an immersive seated-extension rehab experience for the patient while providing real-time information about trunk position and muscle activity to the clinician. RISE measures perispinal muscle activity via surface EMG sensors, patient gaze via HoloLens, and spinal flexion via magnetometer and Arduino Uno.
Microsoft HoloLens, Magnetometer & Flex API, EMG electrodes, Arduino Uno + Bluetooth
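As an illustration of the spinal-flexion measurement, the angle between orientation vectors from two sensors (say, one on the upper back and one on the lower back) can serve as a trunk-flexion estimate. This is a minimal Python sketch of that geometry, not the team's actual code; the sensor placement and vector convention are assumptions.

```python
import math

def flexion_angle(upper, lower):
    """Angle (degrees) between two 3-axis sensor vectors, e.g. readings
    from sensors on the upper and lower back, as a trunk-flexion proxy."""
    dot = sum(a * b for a, b in zip(upper, lower))
    mag = (math.sqrt(sum(a * a for a in upper)) *
           math.sqrt(sum(b * b for b in lower)))
    # Clamp to [-1, 1] to avoid acos domain errors from rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

# Upright: both sensors aligned -> 0 degrees of flexion.
print(flexion_angle((0, 0, 1), (0, 0, 1)))   # 0.0
# Bent forward: upper sensor tilted ~90 degrees relative to the lower one.
print(flexion_angle((1, 0, 0), (0, 0, 1)))
```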
Projects from 2017
Team projects from the third annual CNT hackathon, held February 2017. Not listed in any particular order.
DropStop (winning team)
Rachel Adenekan (MIT), Camille Birch (UW), Alisha Menon (Arizona State University)
Interactive rehabilitation system for patients with foot drop
Health professionals recommend physical therapy for almost all patients with foot drop. However, therapy can be extremely frustrating for patients, as the exercises - movements that were easy before their stroke or injury - are now extremely difficult to perform. This makes patients less motivated to complete the exercises on their own. Research strongly suggests that visual and audio feedback during rehabilitation and training can help patients understand their own performance and improvement in a clear and motivating manner.
DropStop measures tibialis anterior muscle activity via surface EMG sensors connected to an OpenBCI Cyton biosensing board. The Cyton records the signals and communicates wirelessly - via a USB dongle based on the RFduino radio module - with a computer connected to an Arduino Uno that drives the audio and visual feedback.
OpenBCI Cyton, EMG electrodes, Arduino Uno
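A common way to turn raw EMG into a feedback trigger - and one plausible reading of the pipeline above - is to rectify the signal, smooth it with a moving average, and threshold the resulting envelope. This Python sketch illustrates the idea; the window size and threshold are placeholder values, not the team's parameters.

```python
def emg_envelope(samples, window=5):
    """Rectify raw EMG, then smooth with a trailing moving average."""
    rect = [abs(s) for s in samples]
    env = []
    for i in range(len(rect)):
        chunk = rect[max(0, i - window + 1):i + 1]
        env.append(sum(chunk) / len(chunk))
    return env

def muscle_active(samples, threshold=0.5, window=5):
    """True once the smoothed envelope crosses the activation threshold."""
    return any(v > threshold for v in emg_envelope(samples, window))
```

With real hardware, the feedback loop would call `muscle_active` on each incoming window of Cyton samples and toggle the Arduino's audio/visual cue accordingly.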
Extend Your Limits
Adrian Fernandez (SDSU), Surabhi Nimbalkar (MIT), Anand Selvan Sekar (UW)
Augmented Reality (AR) upper-limb rehabilitation exercises
"Extend your Limits" is an augmented reality interface that makes rehabilitation exercises - such as stretching for patients with neuromuscular degenerative diseases - more interactive and informative. A flex resistor determines the "external" degree of stretching, while an EMG sensor determines the "internal" level of activity of the antagonist muscle pair. This information is combined and displayed on a Microsoft HoloLens worn by the patient.
In the scope of the hackathon, the flex resistor was worn on the elbow and the EMG sensor was worn on the bicep - measuring a tricep stretch. This information was directly sent to an Arduino Uno on a Backyard Brains Muscle SpikerShield, then transmitted to the HoloLens through a Bluetooth keyboard (by simulating a key press every instant the flex resistor and EMG both passed a threshold). The HoloLens interface displayed the number of stretches completed and how long a current stretch is being held - as well as a diagram of the stretch.
Backyard Brains Muscle SpikerShield, Arduino Uno, EMG electrodes, flex resistor, Microsoft HoloLens
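The key-press condition described above (flex resistor and EMG both past a threshold) amounts to a small state machine that counts completed stretches and tracks how long the current stretch is held. A hedged Python sketch, with made-up threshold values:

```python
def count_stretches(readings, flex_thresh=500, emg_thresh=0.6):
    """Count stretches from paired (flex_sensor, emg) samples.

    A stretch starts when BOTH signals pass their thresholds (the
    condition used to simulate a key press) and ends when either drops.
    Returns (stretch count, longest hold in samples)."""
    count, holding, hold_len, longest = 0, False, 0, 0
    for flex, emg in readings:
        active = flex > flex_thresh and emg > emg_thresh
        if active:
            hold_len += 1
            longest = max(longest, hold_len)
            if not holding:
                count += 1
                holding = True
        else:
            holding, hold_len = False, 0
    return count, longest
```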
Aaron Adler (UW), Marc-Joseph Antonini (MIT), Natat Premvuti (UW)
EEG guided meditation/focus routine for casual enthusiasts
There has been growing interest in neurofeedback as a treatment for a variety of disorders - including ADHD, stress, and depression - and as an aid in meditation for reaching a highly focused state. Neurofeedback, within an operant conditioning framework, helps individuals regulate their cortical EEG activity while receiving feedback from a visual or acoustic signal. However, many companies provide neurofeedback and use EEG with almost no scientific basis, even in clinical settings.
That is why we decided to develop an open-source, open-hardware neurofeedback toolbox. Using the OpenBCI Mark IV as an EEG acquisition system, we provide feedback to the patient in the form of pleasing acoustic stimuli that track EEG metrics corresponding to the desired application.
OpenBCI Ultracortex "Mark IV" EEG Headset
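One way such an EEG metric might be computed is the relative power in the alpha band (8-12 Hz), which could then drive the volume or pitch of the acoustic feedback. A plain-Python sketch using a direct DFT; the band limits and lack of windowing are illustrative simplifications, not the toolbox's actual pipeline:

```python
import math

def band_power(signal, fs, lo=8.0, hi=12.0):
    """Fraction of total spectral power in one EEG band (alpha, 8-12 Hz),
    computed with a direct DFT -- fine for short demo windows."""
    n = len(signal)
    total = band = 0.0
    for k in range(1, n // 2 + 1):          # skip the DC bin
        freq = k * fs / n
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        total += power
        if lo <= freq <= hi:
            band += power
    return band / total
```

Feeding a one-second window of a 10 Hz sine through `band_power` yields a ratio near 1.0; real EEG would give a fluctuating alpha fraction that the audio engine can map to a feedback parameter.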
Arvind Balasubramani (SDSU), Ropafadzo Denga (Spelman), Ariel Stutzman (Southwestern College)
Virtual game for training the sympathetic and parasympathetic nervous systems
The system uses the OpenBCI Cyton board to capture biosignals associated with stress (e.g. EKG and EMG) in order to manipulate a virtual environment in Unity. This immersive environment, projected through HoloLens, is meant to provide multimodal sensory feedback that reflects the user’s stress levels and appropriately challenges or eases their ability to maintain a calm brain state. We implemented a simple golf-like game in Unity for this purpose, with an intended stress-modulated game mechanic that would shorten the distance between the ball and the hole as stress levels improved, but did not fully integrate this feature in time. Our hands-free system is designed to be usable by anyone—sufferers of neurological disease, health-minded people, and students alike—as a stress management tool and as a fun demonstration of brain-computer interface technology.
OpenBCI Cyton, Unity, Microsoft HoloLens, Visual Studio
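As one example of the stress-related biosignals involved, heart rate can be estimated from an EKG trace by detecting threshold crossings at the R peaks. A simplified Python sketch - the fixed threshold is an assumption, and real EKG would need filtering first:

```python
def heart_rate(ecg, fs, threshold=0.5):
    """Estimate beats per minute from rising threshold crossings (R peaks)."""
    beats = [i for i in range(1, len(ecg))
             if ecg[i - 1] < threshold <= ecg[i]]
    if len(beats) < 2:
        return None                      # not enough beats to estimate
    intervals = [(b - a) / fs for a, b in zip(beats, beats[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```

A stress index for the game could then combine this heart-rate estimate with an EMG tension measure, though how the team weighted the two is not stated.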
Purushothaman Padmanabhan (UW), Benjamin Pedigo (UW), Zezhi Zheng (UW)
Sensorized personal trainer for sports rehabilitation
Without a personal trainer or expensive clinical equipment, a lot of valuable data is not taken into consideration when trying to correct a person's bodily movements, such as in rehabilitation or in sports. Current technology uses motion capture to correct mistakes in movement, but it does not take into account the physiology of the movement - the muscles causing it. We aim to use available consumer devices to help people with this. We are using a Myo armband and a 5-DT Data Glove to acquire motion data as well as EMG data, to get a better picture of the physiology behind the movements and advise on correcting them.
Myo Armband, 5-DT Data Glove
Projects from 2015
Team projects from the second annual CNT hackathon, held November 2015. Not listed in any particular order.
Watch the team presentations on YouTube!
Presentations took place Monday November 9th, 2015 9:00am - 11:00am
Face the Music (winning team)
Timothy Brown (UW), Jaycee Holmes (Spelman), Catherine Yunis (MIT)
Facial EMG game set for facial muscle rehabilitation and artistic expression.
Face the Music is an affordable, open-source way to play music using your face muscles. It can be used by patients with facial paralysis, people learning about neuroscience, or anyone looking for a hands-free creative outlet. It uses a set of wet electrodes on the user's face to pick up muscle activity in two places: on the cheeks and above the eyebrows. The electrodes are connected in a particular way: each side's electrodes are daisy-chained together before connecting via a lead to the Muscle SpikerShield. A reference electrode is placed right below the hairline. The signals are amplified, digitized, and transmitted using a Backyard Brains Muscle SpikerShield and Arduino Uno. The signals are analyzed and mapped to images and sounds by a program written in the Processing environment.
Backyard Brains Muscle SpikerShield, Wet EMG electrodes (unipolar), Arduino Uno, Processing (Java)
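The mapping from muscle groups to sounds can be sketched as a small lookup from channel activations to note events. The channel names and MIDI note numbers below are hypothetical placeholders, not the team's Processing code:

```python
# Hypothetical MIDI note numbers for the two muscle groups.
NOTE_MAP = {"cheeks": 60, "brows": 67}   # C4 and G4

def face_notes(channels, threshold=0.4):
    """Turn per-channel EMG envelope levels into a list of note-on events.

    `channels` maps a muscle-group name to its current envelope value;
    any channel past the threshold sounds its note."""
    return [NOTE_MAP[name] for name, level in channels.items()
            if level > threshold]

print(face_notes({"cheeks": 0.8, "brows": 0.1}))   # [60]
```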
Alexander Lim (MIT), Jonathan Realmuto (UW) (not in photo), Seleste Braddock (Spelman)
Smart cane for blind individuals.
Revision is a white cane (i.e., a cane designed to help the blind navigate) equipped with sonar and haptic feedback. The sonar senses the distance to objects, and the haptic feedback produces a vibration of the handle proportional to that distance. Thus, the user is able to "feel" their surroundings.
A Raspberry Pi 2 is used for on-board computation and sensor integration. Revision is also equipped with a webcam, which was going to be used for online image processing of faces and street signs; however, we didn't have enough time to integrate the image-processing software on the Pi.
Sonar sensor, 3D-Printed Parts, Webcam, Raspberry Pi
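The distance-proportional vibration can be sketched as a linear mapping from sonar range to a PWM duty cycle for the handle motor. The range limit and duty values below are assumptions for illustration, not Revision's actual constants:

```python
def vibration_duty(distance_cm, max_range_cm=300, max_duty=255):
    """Map sonar distance to an 8-bit PWM duty cycle:
    the closer the obstacle, the stronger the handle vibrates."""
    clamped = max(0, min(distance_cm, max_range_cm))
    return round(max_duty * (1 - clamped / max_range_cm))

print(vibration_duty(0))     # 255 (obstacle touching the cane)
print(vibration_duty(300))   # 0   (nothing in range)
```

On the Pi, the returned value would feed a PWM output driving the vibration motor each time a sonar reading arrives.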
Ebenezer Nkwater (MIT), Brandon Smith (UW), Brendan King (UW)
Haptic feedback in the form of vibration pulse patterns to assist in emotion recognition from facial expressions.
EmoVibe is a discreet wearable device for autistic patients that performs facial expression recognition in real time and provides feedback in the form of vibrations. Individuals can either use preset patterns or set up patterns for each emotion to ease learning time. EmoVibe also prevents stigma with its discreet nature, and its software can easily be incorporated into available tools such as tablets and phones.
Laptop, Camera, Adafruit Flora, Vibro-Tactile Motors, 3D-Printed Vibro-Tactile Motor Holders
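The configurable pulse patterns might be represented as per-emotion lists of (on, off) durations that the wearable plays back on its motors. The emotion names and timings below are invented placeholders, not EmoVibe's actual presets:

```python
import itertools

# Hypothetical pulse patterns: (on_ms, off_ms) pairs per recognized emotion.
PATTERNS = {
    "happy": [(100, 100), (100, 300)],
    "sad":   [(400, 200)],
    "angry": [(50, 50)] * 3,
}

def motor_schedule(emotion, repeats=2):
    """Flatten a pattern into the on/off schedule the motors would play.

    Unrecognized emotions yield an empty schedule (no vibration)."""
    pattern = PATTERNS.get(emotion, [])
    return list(itertools.chain.from_iterable([pattern] * repeats))
```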
Pragnesh Patel (SDSU), Miranda Gavrin (MIT), Gaurav Mukherjee (UW)
EMG controlled video game for hand muscle rehabilitation.
Rocket brain is an open-source game designed to provide an interactive environment for use as a biofeedback platform, a simple-to-learn tool for STEM outreach, and for neuro-rehabilitation research. In its current form, it accepts a single channel of electromyography (EMG) signal as an input to the game. The objective is to steer the rocket brain to avoid obstacles approaching it in the form of pillars. The distribution of the pillars grows increasingly complex with increasing levels of difficulty. This keeps the user focused and engaged.
The source code is open for non-commercial use and development under a creative commons license and is available upon request.
Biometrics EMG, PyGame
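The difficulty scaling described above - pillar layouts growing harder with level - can be sketched as a gap between pillar pairs that narrows as the level rises. The constants here are illustrative, not taken from the actual PyGame source:

```python
import random

def pillar_gap(level, base_gap=200, min_gap=80, step=20):
    """Gap (pixels) between a pillar pair shrinks as the level rises,
    bottoming out at a minimum the rocket can still squeeze through."""
    return max(min_gap, base_gap - step * (level - 1))

def spawn_pillar(level, screen_h=480):
    """Pick a random vertical position for the opening at this level.

    Returns the (top, bottom) y-extent the rocket must fly through."""
    gap = pillar_gap(level)
    top = random.randint(0, screen_h - gap)
    return top, top + gap
```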
Lisa Liu (MIT), Oliver Stanley (UW), Alberto Perez (SDSU)
EMG controlled robotic arm for educational outreach.
We developed a gesture-controlled robotic hand, which can be used as an educational tool. The system used a Myo Armband and the Myo SDK to read gestures. The Myo sends the signals to a computer, which then sends signals to an Arduino, which is used to control the motors in the OWI Arm Edge. Students can learn basic engineering principles, such as mechanical assembly, circuit design, programming, and signal processing. In future implementations, it would be nice to take the computer out of the loop and have the Myo send information directly to the Arduino via Bluetooth.
Myo Armband, OWI Robotic Arm, Intel Galileo, Lots of MOSFETs
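The gesture-to-motor pipeline could be as simple as a lookup table translating Myo SDK gesture names into one-byte serial commands for the Arduino sketch. All gesture names and command bytes below are hypothetical illustrations, not the team's protocol:

```python
# Hypothetical one-byte commands the Arduino sketch would interpret.
GESTURE_COMMANDS = {
    "fist":           b"C",   # close gripper
    "fingers_spread": b"O",   # open gripper
    "wave_in":        b"L",   # rotate base left
    "wave_out":       b"R",   # rotate base right
    "rest":           b"S",   # stop all motors
}

def command_for(gesture):
    """Translate a recognized gesture into a serial command byte;
    unknown gestures fall back to the safe 'stop' command."""
    return GESTURE_COMMANDS.get(gesture, b"S")
```

The host program would write `command_for(gesture)` to the Arduino's serial port each time the Myo reports a new pose.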
Projects from 2014
Team projects from the first annual CNT hackathon, held October 2014. Not listed in any particular order.
Orthotic hand controlled by facial EMG. Used: Emotiv headset (for EMG), servo motors, fishing line, 3D-printed gears/spools and rings, gloves, and Arduino.
Kinect-integrated Google Earth for rehab. Used Kinect, Google Earth API, Arduino + accelerometer.
Suite of measurement tools for stress and anxiety. Used Biometrics EMG, Arduino + accelerometers.
EMG-controlled camera for movement-hindered patients. Used Biometrics EMG and goniometers, MS LifeCam, USB turret.
One handed vibrotactile typing through actuated glove. Used gloves, vibrotactile motors, tactile switches.