University of Utah electrical and computer engineering assistant professor Jacob George has received a $200,000 grant from the U.S. Department of Veterans Affairs “Specially Adapted Housing Assistive Technology Program” to expand his lab’s thought-based control of smart-home devices for VA patients with neuromuscular disabilities.
The grant, entitled “Smart Control of Smart Devices,” will allow George and his lab to improve adaptive home environments through the development of a wearable device, in the form of a wristwatch, capable of translating intended hand movements into real-world control of smart-home devices. The vision of this project is a device that uses electromyography (EMG) and various embedded sensors to “decode,” or determine, the user’s intended hand gestures, then communicates that intent wirelessly to a smart-home device, triggering the desired action without the need for physical movement or contact.
“The idea is that we have these smart devices in our home and instead of yelling at them with voice commands or trying to flip switches and dials on the wall, we simply think about a desired action and it happens,” says George.
Traditional control of smart-home devices requires users to issue voice commands and listen to the device’s responses. George’s device would eliminate the need for vocal or physical interaction, giving individuals thought-based control over their smart-home devices. This would not only create a more intuitive, accessible home environment but also offer users more privacy: with typical smart-home devices, every command must be announced aloud.
In 2020, George’s lab began recording muscle activity from individuals with paralysis. “In general, when an individual is paralyzed and unable to move, information starts as an electrical signal in their brain that goes down to their nervous system. In the case of a stroke or other forms of paralysis – even partial spinal cord injuries – what happens is the signal can come out to your arm, for example, and it still causes your muscles to contract, but they contract so weakly that they can’t actually move the hand,” he said. “It’s like you’re trying to push the gas pedal on a car but your foot is not strong enough to make the car move – it’s the same idea. The electrical signals are coming to your muscles, but they’re not strong enough to make your muscles forcefully move your hand.”
From these initial recordings, George was able to show that an individual’s muscle activity can be “decoded”: using AI, the signals can be deciphered into the action the person is trying to complete.
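The decoding pipeline the article describes, measuring muscle activity, inferring the intended gesture, and mapping it to a smart-home command, can be sketched with a toy example. Everything here is an assumption for illustration: the RMS amplitude features, the nearest-centroid classifier, the gesture names, and the command mapping are stand-ins, not the lab’s actual signal processing or models.

```python
import math
import random

# Toy EMG gesture decoder: nearest-centroid classification over
# per-channel RMS amplitude features. Illustrative sketch only;
# gestures, channel count, and commands are hypothetical.

GESTURES = ["rest", "grasp", "point"]

def rms(window):
    """Root-mean-square amplitude of one channel's samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def extract_features(emg_windows):
    """One RMS feature per EMG channel."""
    return [rms(channel) for channel in emg_windows]

def synthetic_trial(gesture, rng):
    """Simulate 4-channel EMG: each gesture activates channels
    at different (made-up) amplitude levels."""
    levels = {"rest":  [0.1, 0.1, 0.1, 0.1],
              "grasp": [0.9, 0.8, 0.2, 0.1],
              "point": [0.2, 0.1, 0.9, 0.7]}[gesture]
    return [[rng.gauss(0.0, lvl) for _ in range(50)] for lvl in levels]

def train_centroids(labeled_trials):
    """Average feature vector per gesture label."""
    centroids = {}
    for gesture in GESTURES:
        feats = [extract_features(t) for g, t in labeled_trials if g == gesture]
        centroids[gesture] = [sum(col) / len(col) for col in zip(*feats)]
    return centroids

def decode(centroids, emg_windows):
    """Classify a new trial by nearest centroid (Euclidean distance)."""
    f = extract_features(emg_windows)
    return min(centroids, key=lambda g: math.dist(f, centroids[g]))

# Hypothetical mapping from decoded gesture to a smart-home command.
COMMANDS = {"grasp": "lights/toggle", "point": "thermostat/up", "rest": None}

rng = random.Random(0)
training = [(g, synthetic_trial(g, rng)) for g in GESTURES for _ in range(20)]
centroids = train_centroids(training)

gesture = decode(centroids, synthetic_trial("grasp", rng))
print(gesture, "->", COMMANDS[gesture])
```

In a real system the command would be sent over a wireless protocol rather than printed, and the key point from the article is that even the weak contractions of a partially paralyzed muscle still change these amplitude features enough for a trained model to pick up.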
This grant will fund the project for one year, resulting in a functional prototype and a plan for future commercialization of the assistive technology. To learn more about this project and find out about research opportunities, visit George’s Lab.
Click here to check out this video from the Utah NeuroRobotics Lab of a stroke participant controlling a virtual hand.