Researchers at the New Jersey Institute of Technology (NJIT) are leveraging artificial intelligence to train robotic dogs to respond to their owners. This international collaboration, funded by a one-year seed grant from the Institute for Future Technologies (IFT)—a partnership between NJIT and Ben-Gurion University of the Negev (BGU)—aims to advance how robotic dogs interact with humans through edge intelligence.
Assistant Professor Kasthuri Jayarajah of NJIT’s Ying Wu College of Computing is spearheading the development of a socially assistive model for the Unitree Go2 robotic dog. The model will dynamically adapt the robot’s behavior and interactions to the characteristics of the individuals it engages with. The goal is to create a more lifelike robotic companion by incorporating wearable sensing devices that detect physiological and emotional cues, from enduring personality traits to transient states such as pain or comfort.
This technology has potential applications in home and healthcare settings, particularly for alleviating loneliness among the elderly and assisting in therapy and rehabilitation. Jayarajah’s research, which involves robotic dogs interpreting and responding to gestural cues from their human partners, will be presented at the International Conference on Intelligent Robots and Systems (IROS) later this year.
Co-principal investigator Shelly Levy-Tzedek, an associate professor in the Department of Physical Therapy at BGU, brings extensive expertise in rehabilitation robotics and in how age and disease affect control of the body. The research team is exploring multimodal wearable sensors, such as repurposed earphones, to track signals like brain activity and microexpressions. These sensors will be integrated with the robot’s traditional onboard sensors to monitor user states objectively and passively.
Despite the promise of socially assistive robots, Jayarajah acknowledges that long-term, sustained use remains challenging given the current cost, processing power, memory, and battery life of robots like the Unitree Go2. The initial phase of the project will focus on enhancing traditional sensor fusion and designing deep-learning architectures that extract user attributes from wearable sensors and adapt the robot’s motion commands accordingly.
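The fusion-and-adaptation step described above can be sketched, very loosely, as a late-fusion network: features from the wearable sensors and the robot's onboard sensors are concatenated and mapped to bounded adjustments of motion commands. Everything here is illustrative—the feature dimensions, command set, and randomly initialized weights are invented for the sketch, and the article does not describe the project's actual architecture.

```python
import numpy as np

# Illustrative sketch only. Feature sets, dimensions, and command
# semantics are assumptions, not the project's real design.
rng = np.random.default_rng(0)

WEARABLE_DIM = 8   # e.g., EEG-band powers, microexpression scores (assumed)
ONBOARD_DIM = 6    # e.g., IMU and proximity readings (assumed)
HIDDEN = 16
N_COMMANDS = 3     # e.g., gait speed, approach distance, gesture intensity

# Randomly initialized weights stand in for a trained network.
W1 = rng.normal(0.0, 0.1, (WEARABLE_DIM + ONBOARD_DIM, HIDDEN))
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_COMMANDS))

def fuse_and_adapt(wearable_feats, onboard_feats):
    """Late fusion: concatenate both modalities, then map the fused
    state to motion-command adjustments bounded in [-1, 1]."""
    x = np.concatenate([wearable_feats, onboard_feats])
    h = np.tanh(x @ W1)   # shared hidden representation
    return np.tanh(h @ W2)

# One simulated reading from each modality.
wearable = rng.normal(size=WEARABLE_DIM)
onboard = rng.normal(size=ONBOARD_DIM)
adjustment = fuse_and_adapt(wearable, onboard)
print(adjustment.shape)  # (3,)
```

The appeal of this shape of model for edge deployment is that it is small enough to run on constrained hardware, which matters given the processing-power and battery constraints the researchers cite.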