
Winning Project from 2020

Projects from the sixth annual CNT Hackathon, 2020

Sign2Speech

Nikolas Ioannou (UW), Collin May (Whatcom), and Tian Wang (UW)


Description:

Biosignals from a CyberGlove were classified using a cloud-hosted ML model trained on over 1,000 hand poses - a dataset the team created during the Hackathon. Their live Sign2Speech demo included interaction with an online GUI they had developed, which also featured common-word and phrase prediction. Sign2Speech was designed to assist people with aphasia, a common consequence of stroke.
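As a rough illustration of the approach (a sketch, not the team's actual model), a glove-pose classifier in TensorFlow might look like the following, assuming each CyberGlove frame arrives as a vector of 22 joint-sensor values and each pose has an integer label:

```python
import numpy as np
import tensorflow as tf

NUM_SENSORS = 22   # CyberGlove joint-angle channels (assumed count)
NUM_POSES = 26     # e.g., one class per fingerspelled letter (assumed)

# Small dense network mapping one glove frame to a pose class.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_SENSORS,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_POSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# X: (n_samples, 22) sensor readings; y: (n_samples,) integer pose labels.
# Placeholder data stands in for the team's 1,000+ recorded hand poses.
X = np.random.rand(1000, NUM_SENSORS).astype("float32")
y = np.random.randint(0, NUM_POSES, size=1000)
model.fit(X, y, epochs=10, validation_split=0.2)
```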

 

Equipment Used:

CyberGlove, Python, TensorFlow, Grit & Ingenuity

Winning Project from 2019

Winning project from the fifth annual CNT Hackathon, held in February 2019.

RISE (winning team)

Melchizedek Mashiku (GSU), Preston Pan (UW), and Karley Beniff (UW)

 

Rehabilitation for Independent Seated Extension (RISE)

 

Description:

RISE uses electromyography (EMG), accelerometry/magnetometry, and patient gaze data to create an immersive seated-extension rehab experience for the patient while providing real-time information about trunk position and muscle activity to the clinician. RISE measures paraspinal muscle activity via surface EMG sensors, patient gaze via the HoloLens, and spinal flexion via a magnetometer and an Arduino Uno.
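One simple way to turn two magnetometer readings - one sensor above and one below the flexing segment - into a flexion angle is the angle between the two field vectors. A minimal sketch under that assumption (an illustration, not the team's implementation):

```python
import numpy as np

def flexion_angle_deg(upper, lower):
    """Angle in degrees between two 3-axis magnetometer vectors.

    upper/lower: length-3 readings from sensors mounted above and below
    the flexing spinal segment (the placement is an assumption here).
    """
    u = np.asarray(upper, dtype=float)
    v = np.asarray(lower, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

print(flexion_angle_deg([0.0, 0.0, 1.0], [0.0, 0.5, 0.87]))  # ~29.9 deg
```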

 

Equipment Used:

Microsoft HoloLens, Magnetometer & Flex API, EMG electrodes, Arduino Uno + Bluetooth

Previous Years!

DropStop (First Place in the CNT Hackathon 2017)

Rachel Adenekan (MIT), Camille Birch (UW), Alisha Menon (Arizona State University)

 

Interactive rehabilitation system for patients with foot drop

 

Description:

Health professionals recommend physical therapy for almost all patients with foot drop. However, therapy can be extremely frustrating: the exercises - movements that were easy before the patient's stroke or injury - are now extremely difficult to perform, which leaves patients less motivated to complete the tasks on their own. Research strongly suggests that visual and audio feedback during rehabilitation and training can help patients understand their own performance and improvement in a clear and motivating manner.

 

DropStop measures anterior tibialis muscle activity via surface EMG sensors connected to an OpenBCI Cyton biosensing board. The Cyton streams its recordings wirelessly to a computer through a USB dongle based on the RFduino radio module; the computer then drives audio and visual feedback through an attached Arduino Uno.
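That signal chain is easy to sketch in Python. The snippet below uses BrainFlow (a present-day OpenBCI interface, not the software the team used at the time) to read one Cyton channel as EMG, and pyserial to trigger the Arduino's cue; the serial ports and the RMS threshold are placeholders:

```python
import time
import numpy as np
import serial  # pyserial, to reach the Arduino driving the feedback
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"      # Cyton RF dongle (placeholder)
board = BoardShim(BoardIds.CYTON_BOARD, params)
arduino = serial.Serial("/dev/ttyACM0", 9600)
EMG_CHANNEL = BoardShim.get_eeg_channels(BoardIds.CYTON_BOARD)[0]
THRESHOLD_UV = 50.0                      # tune per subject

board.prepare_session()
board.start_stream()
time.sleep(1)                            # let the buffer fill
try:
    while True:
        window = board.get_current_board_data(250)  # ~1 s at 250 Hz
        emg = window[EMG_CHANNEL]
        rms = np.sqrt(np.mean(np.square(emg - emg.mean())))
        # '1' triggers the Arduino's audio/visual cue, '0' clears it.
        arduino.write(b"1" if rms > THRESHOLD_UV else b"0")
        time.sleep(0.1)
finally:
    board.stop_stream()
    board.release_session()
```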

 

Equipment Used:

OpenBCI Cyton, EMG electrodes, Arduino Uno

Extend Your Limits

Adrian Fernandez (SDSU), Surabhi Nimbalkar (MIT), Anand Selvan Sekar (UW)

 

Augmented Reality (AR) upper-limb rehabilitation exercises

 

Description:

"Extend your Limits" is an augmented reality interface that makes rehabilitation exercises - such as stretching for patients with neuromuscular degenerative diseases - more interactive and informative. A flex resistor determines the "external" degree of stretching, while an EMG sensor determines the "internal" level of activity of the antagonist muscle pair. This information is combined and displayed on a Microsoft HoloLens worn by the patient.


Within the scope of the hackathon, the flex resistor was worn on the elbow and the EMG sensor on the bicep, together measuring a tricep stretch. These signals were read by an Arduino Uno on a Backyard Brains Muscle SpikerShield, then transmitted to the HoloLens through a Bluetooth keyboard (by simulating a key press whenever the flex resistor and EMG readings both passed a threshold). The HoloLens interface displayed the number of stretches completed and how long the current stretch had been held, as well as a diagram of the stretch.
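The dual-threshold trigger is simple to sketch. Below, the pynput library stands in for the Bluetooth keyboard the HoloLens listens to; the threshold values and the key are placeholders:

```python
from pynput.keyboard import Controller  # stands in for the Bluetooth keyboard

keyboard = Controller()
FLEX_THRESHOLD = 600   # raw ADC counts from the flex resistor (placeholder)
EMG_THRESHOLD = 300    # raw ADC counts from the EMG channel (placeholder)

def on_sample(flex_value, emg_value):
    """Fire one key press when both signals exceed their thresholds,
    mirroring the hackathon's 'flex AND EMG above threshold' rule."""
    if flex_value > FLEX_THRESHOLD and emg_value > EMG_THRESHOLD:
        keyboard.press("s")
        keyboard.release("s")  # the HoloLens app counts these presses

# Example sample, as if read from the Arduino (values made up):
on_sample(flex_value=712, emg_value=345)
```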

 

Equipment Used:

Backyard Brains Muscle SpikerShield, Arduino Uno, EMG electrodes, flex resistor, Microsoft HoloLens 

EEGuide

Aaron Adler (UW), Marc-Joseph Antonini (MIT), Natat Premvuti (UW)

 

EEG-guided meditation/focus routine for casual enthusiasts

 

Description:

There has been growing interest in neurofeedback as a treatment for a variety of disorders, including ADHD, stress, and depression, and as an aid for reaching a highly focused state in meditation. Neurofeedback, within an operant conditioning framework, helps individuals regulate their cortical EEG activity while receiving feedback from a visual or acoustic signal. However, much of the neurofeedback offered commercially - and even in some clinical settings - rests on little scientific basis.


That is why we decided to develop an open-source, open-hardware neurofeedback toolbox. Using the OpenBCI Ultracortex Mark IV as the EEG acquisition system, we provide feedback to the patient in the form of pleasing acoustic stimuli that follow EEG metrics chosen for the desired application.
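As one illustration of an EEG metric driving acoustic feedback, the sketch below computes relative alpha-band (8-12 Hz) power for a single EEG window and maps it to a feedback volume; the band, sampling rate, and mapping are illustrative assumptions rather than the toolbox's actual code:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # Hz; Cyton-class sampling rate (assumed)

def alpha_ratio(eeg_window):
    """Fraction of 1-30 Hz power in the 8-12 Hz alpha band,
    a common relaxation metric (one choice among many)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    def band(lo, hi):
        return psd[(freqs >= lo) & (freqs < hi)].sum()
    return band(8, 12) / band(1, 30)

# Map the metric to an audio parameter, e.g. the volume of a drone tone.
eeg = np.random.randn(2 * FS)             # placeholder 2 s window
volume = float(np.clip(alpha_ratio(eeg), 0.0, 1.0))
print(f"feedback volume: {volume:.2f}")
```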

 

Equipment Used:

OpenBCI Ultracortex "Mark IV" EEG Headset

StressLess

Arvind Balasubramani (SDSU), Ropafadzo Denga (Spelman), Ariel Stutzman (Southwestern College)

 

Virtual game for training the sympathetic and parasympathetic nervous systems

 

Description:

StressLess uses the OpenBCI Cyton board to capture biosignals associated with stress (e.g., EKG and EMG) in order to manipulate a virtual environment in Unity. This immersive environment, projected through the HoloLens, provides multimodal sensory feedback that reflects the user's stress levels and appropriately challenges or eases their ability to maintain a calm brain state. We implemented a simple golf-like game in Unity for this purpose, with an intended stress-modulated mechanic that would shorten the distance between the ball and the hole as stress levels improved, though we did not fully integrate this feature in time. Our hands-free system is designed to be usable by anyone - people with neurological disease, health-minded users, and students alike - as a stress-management tool and as a fun demonstration of brain-computer interface technology.
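The intended mechanic reduces to a mapping from a stress index to putt length; a minimal sketch of that mapping (the linear form and the distance range are illustrative choices, and the actual game was built in Unity):

```python
def hole_distance(stress, d_min=1.0, d_max=10.0):
    """Map a normalized stress index in [0, 1] to ball-to-hole distance:
    a calmer player (lower stress) gets a shorter putt, as intended."""
    stress = min(max(stress, 0.0), 1.0)
    return d_min + stress * (d_max - d_min)

print(hole_distance(0.2))  # 2.8 (calm: easy putt)
print(hole_distance(0.9))  # 9.1 (stressed: long putt)
```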

 

Equipment Used:

OpenBCI Cyton, Unity, Microsoft HoloLens, Visual Studio

Team Pitch

Purushothaman Padmanabhan (UW), Benjamin Pedigo (UW), Zezhi Zheng (UW)

 

Sensorized personal trainer for sports rehabilitation

 

Description:

Without a personal trainer or expensive clinical equipment, much valuable data goes unused when trying to correct a person's movements, whether in rehabilitation or in sports. Current technology relies on motion capture to correct mistakes in movement, but it does not account for the physiology behind the movement - the muscles producing it. We aim to address this with available consumer devices, using a Myo armband and a 5DT Data Glove to acquire motion and EMG data, build a better picture of the physiology of each movement, and advise on correcting it.

 

Equipment Used:

Myo Armband, 5DT Data Glove

Watch the 2015 team presentations on YouTube!

https://www.youtube.com/user/CSNEERC

Face the Music (First Place in the 2015 Hackathon)

Timothy Brown (UW), Jaycee Holmes (Spelman), Catherine Yunis (MIT)

 

Facial EMG game set for facial muscle rehabilitation and artistic expression.

 

Description:

Face the Music is an affordable, open-source way to play music using your facial muscles. It can be used by patients with facial paralysis, people learning about neuroscience, or anyone looking for a hands-free creative outlet. It uses wet electrodes on the user's face to pick up muscle activity in two places: on the cheeks and above the eyebrows. The electrodes are wired in a particular way: each side's electrodes are daisy-chained together before connecting via a single lead to the Muscle SpikerShield, and a reference electrode is placed just below the hairline. The signals are amplified, digitized, and transmitted by the Backyard Brains Muscle SpikerShield and an Arduino Uno, then analyzed and mapped to images and sounds by a program written in the Processing environment.
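The original mapping was written in Processing; for consistency with the other sketches on this page, the snippet below shows the same idea in Python - rectified EMG amplitude from one channel binned onto a small note scale (the scale and thresholds are illustrative):

```python
import numpy as np

NOTES = [60, 62, 64, 65, 67]  # MIDI pitches (C major fragment, illustrative)

def emg_to_note(window, floor=20.0, ceil=200.0):
    """Map the rectified mean of one EMG window (cheek or brow channel)
    onto the note scale: harder contraction, higher pitch."""
    level = np.mean(np.abs(window))
    frac = np.clip((level - floor) / (ceil - floor), 0.0, 1.0)
    return NOTES[int(frac * (len(NOTES) - 1))]

print(emg_to_note(np.full(100, 150.0)))  # mid-range contraction -> 64
```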

 

Equipment Used:

Backyard Brains Muscle SpikerShield, Wet EMG electrodes (unipolar), Arduino Uno, Processing (Java)

Revision

Alexander Lim (MIT), Jonathan Realmuto (UW) (not in photo), Seleste Braddock (Spelman)

 

Smart cane for blind individuals.

 

Description:

Revision is a white cane (i.e., a cane designed to help blind users navigate) equipped with sonar and haptic feedback. The sonar senses the distance to objects, and the haptic feedback vibrates the handle in proportion to that distance, so the user is able to "feel" their surroundings.

A Raspberry Pi 2 is used for on-board computation and sensor integration. Revision is also equipped with a webcam, which was intended for online image processing of faces and street signs; however, we did not have enough time to integrate the image-processing software on the Pi.
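The core distance-to-vibration loop is straightforward on the Pi. In this sketch the sonar driver is a stub, and the mapping makes nearer obstacles vibrate harder (a common tuning, assumed here; the write-up says only that the vibration tracks distance):

```python
import time
import RPi.GPIO as GPIO

MOTOR_PIN = 18        # PWM-capable pin driving the handle motor (assumed)
MAX_RANGE_CM = 300.0  # beyond this, no vibration (placeholder)

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT)
pwm = GPIO.PWM(MOTOR_PIN, 200)  # 200 Hz PWM for the vibration motor
pwm.start(0)

def read_distance_cm():
    """Stub for the sonar driver; replace with your sensor's library."""
    return 120.0  # placeholder reading

try:
    while True:
        d = min(read_distance_cm(), MAX_RANGE_CM)
        # Closer object -> higher duty cycle -> stronger vibration.
        pwm.ChangeDutyCycle(100.0 * (1.0 - d / MAX_RANGE_CM))
        time.sleep(0.05)
finally:
    pwm.stop()
    GPIO.cleanup()
```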

 

Equipment Used:

Sonar sensor, 3D-Printed Parts, Webcam, Raspberry Pi

EmoVibe

Ebenezer Nkwater (MIT), Brandon Smith (UW), Brendan King (UW) 

 

Haptic feedback in the form of vibration pulse patterns to assist in emotion recognition from facial expressions.

 

Description:

EmoVibe is a discreet wearable device for autistic users that performs facial expression recognition in real time and provides feedback in the form of vibrations. Individuals can either use preset vibration patterns or define their own pattern for each emotion to shorten the learning curve. EmoVibe's discreet form also helps avoid stigma, and its software can easily be incorporated into commonly available devices such as tablets and phones.
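The pattern-per-emotion idea can be sketched as a lookup from emotion label to pulse train; the patterns below are illustrative (EmoVibe let users define their own), and the motor callbacks are left abstract:

```python
import time

# Each pattern is a list of (on_seconds, off_seconds) pulses.
PATTERNS = {
    "happy": [(0.1, 0.1)] * 3,           # three quick buzzes
    "sad":   [(0.6, 0.3)],               # one long pulse
    "angry": [(0.2, 0.1), (0.4, 0.0)],   # short then long
}

def play(emotion, motor_on, motor_off):
    """Drive a vibro-tactile motor with the pattern for one emotion.
    motor_on/motor_off are callbacks into the motor driver (e.g. a
    Flora pin toggle); they are left abstract in this sketch."""
    for on_s, off_s in PATTERNS.get(emotion, []):
        motor_on()
        time.sleep(on_s)
        motor_off()
        time.sleep(off_s)

play("happy", motor_on=lambda: print("bzz"), motor_off=lambda: None)
```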

 

Equipment Used:

Laptop, Camera, Adafruit Flora, Vibro-Tactile Motors, 3D-Printed Vibro-Tactile Motor Holders

Rocket Brain

Pragnesh Patel (SDSU), Miranda Gavrin (MIT), Gaurav Mukherjee (UW)

 

EMG-controlled video game for hand muscle rehabilitation.

 

Description:

Rocket Brain is an open-source game designed to provide an interactive environment for use as a biofeedback platform, a simple-to-learn tool for STEM outreach, and a testbed for neuro-rehabilitation research. In its current form, it accepts a single channel of electromyography (EMG) signal as input to the game. The objective is to steer the rocket brain around obstacles approaching in the form of pillars; the distribution of the pillars grows more complex at higher difficulty levels, keeping the user focused and engaged.
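A stripped-down PyGame sketch of the core mechanic - muscle activation pushes the rocket up, relaxation lets it fall - is below; the EMG source is a random stand-in and the pillars are omitted:

```python
import random
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
rocket_y = 240.0

def read_emg_level():
    """Stand-in for the single EMG channel; returns 0-1 activation.
    Replace with a read from your amplifier."""
    return random.random()

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    # Activation above ~0.5 pushes the rocket up; below, it falls.
    rocket_y += 4.0 - 8.0 * read_emg_level()
    rocket_y = max(0.0, min(460.0, rocket_y))
    screen.fill((0, 0, 30))
    pygame.draw.rect(screen, (255, 80, 80), (60, int(rocket_y), 30, 20))
    pygame.display.flip()
    clock.tick(60)
pygame.quit()
```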

The source code is open for non-commercial use and development under a Creative Commons license and is available upon request.

 

Equipment Used:

Biometrics EMG, PyGame

Sunshine Arm

Lisa Liu (MIT), Oliver Stanley (UW), Alberto Perez (SDSU)

 

EMG-controlled robotic arm for educational outreach.

 

Description:

We developed a gesture-controlled robotic hand that can be used as an educational tool. The system uses a Myo Armband and the Myo SDK to read gestures; the Myo sends the signals to a computer, which in turn commands an Arduino that drives the motors in the OWI Robotic Arm Edge. Students can learn basic engineering principles such as mechanical assembly, circuit design, programming, and signal processing. In future implementations, it would be nice to take the computer out of the loop and have the Myo send information directly to the Arduino via Bluetooth.
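The computer's role in that chain is essentially gesture-to-serial forwarding, sketched below; the pose names loosely follow the Myo SDK's standard poses, and the one-byte command protocol is invented for illustration:

```python
import serial  # pyserial; the Arduino sketch maps bytes to motor moves

# Pose-to-command table (the single-byte protocol is a made-up stand-in).
GESTURE_TO_COMMAND = {
    "fist":           b"G",  # close gripper
    "fingers_spread": b"R",  # release gripper
    "wave_in":        b"L",  # rotate base left
    "wave_out":       b"B",  # rotate base right
}

arduino = serial.Serial("/dev/ttyACM0", 9600)  # port is a placeholder

def on_gesture(name):
    """Called by the Myo event loop with the recognized pose name."""
    cmd = GESTURE_TO_COMMAND.get(name)
    if cmd:
        arduino.write(cmd)

on_gesture("fist")  # example: close the gripper
```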

 

Equipment Used:

Myo Armband, OWI Robotic Arm, Intel Galileo, Lots of MOSFETs

iHAND

Orthotic hand controlled by facial EMG. Used: Emotiv headset (for EMG), servo motors, fishing line, 3D-printed gears/spools and rings, gloves, and Arduino.

Reincarnation  

Kinect-integrated Google Earth for rehab. Used: Kinect, Google Earth API, Arduino + accelerometer.

Stress-tector

Suite of measurement tools for stress and anxiety. Used: Biometrics EMG, Arduino + accelerometers.

VisualAid

EMG-controlled camera for movement-hindered patients. Used: Biometrics EMG and goniometers, MS LifeCam, USB turret.

TouchType  

One-handed vibrotactile typing through an actuated glove. Used: gloves, vibrotactile motors, tactile switches.
