Neuromorphic computing is a powerful tool for identifying time-varying patterns, but it is often less effective than some AI-based techniques on more complex tasks. Researchers at the iCAS Lab, directed by Ramtin Zand at the University of South Carolina, are working on an NSF CAREER project to show how the capabilities of neuromorphic systems can be improved by blending them with specialized machine learning systems, without sacrificing their impressive energy efficiency. Using this approach, the team aims to show how the gestures of American Sign Language could be translated into written and spoken language in real time.
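To make the idea of such a hybrid concrete, here is a minimal sketch (not the iCAS Lab's implementation) of one way a spiking, neuromorphic-style front end could be paired with a conventional machine learning readout: a leaky integrate-and-fire layer compresses a time-varying gesture signal into spike counts, and a small softmax classifier maps those counts to gesture labels. All names, shapes, and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_encode(x_t, w_in, tau=0.9, v_thresh=1.0):
    """Run a leaky integrate-and-fire (LIF) layer over a (T, D) input sequence.

    Returns per-neuron spike counts, a compact temporal summary that a
    downstream classifier can consume.
    """
    T, _ = x_t.shape
    n_neurons = w_in.shape[1]
    v = np.zeros(n_neurons)          # membrane potentials
    counts = np.zeros(n_neurons)     # accumulated spikes per neuron
    for t in range(T):
        v = tau * v + x_t[t] @ w_in  # leak, then integrate weighted input
        spikes = v >= v_thresh       # fire where the threshold is crossed
        counts += spikes
        v[spikes] = 0.0              # reset the neurons that fired
    return counts

def softmax_classify(features, w_out):
    """Conventional (non-spiking) readout: softmax over gesture classes."""
    logits = features @ w_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Toy dimensions (assumed): 30 time steps of 8 input channels,
# 16 spiking neurons, 4 gesture classes.
T, D, N, C = 30, 8, 16, 4
w_in = rng.normal(scale=0.5, size=(D, N))
w_out = rng.normal(scale=0.5, size=(N, C))

gesture_sequence = rng.random((T, D))   # stand-in for real gesture data
spike_counts = lif_encode(gesture_sequence, w_in)
class_probs = softmax_classify(spike_counts, w_out)
print("predicted class:", int(np.argmax(class_probs)))
```

The division of labor in the sketch reflects the general motivation for hybrids of this kind: the event-driven spiking stage only does work when inputs change, which is where the energy efficiency comes from, while the conventional classifier supplies the accuracy on the harder recognition step.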