A Perspective on Robotics in the Caregiving Industry


There’s new tech in town, and it includes caregiving robots introduced to meet growing demand. Researchers at the University of New Hampshire are receiving a $2.8M grant to develop assistive care robots intended to ease the burden on caregivers.

Is this technology a worthy investment for the caregiving industry, or is the human connection irreplaceable?

Larry Carlson, MBA, former President & CEO of United Methodist Communities, shares his perspective on robotic caregiving and its prospective place in the industry.

“Robotic caregiving will never replace human contact. That said, there is a place for robotics. They can be good for enhancing human connection, but never for replacing it. That might look like a product called It’s Never Too Late, or IN2L, run by a human; a mechanical cat initiated by a human; or a monitoring system that helps the caregiver respond in person more effectively.

Honestly, the idea of a robotic elder care system sounds terrifying to me. Dementia does not mean stupid. People with a dementia diagnosis know the difference between a robot and a human. I can also see it inducing paranoia and fear. A fall-alert system we piloted had a robotic voice that, when triggered, would come out of the ceiling and say, “Please sit down. Someone will be here to help you.”

It scared the daylights out of our residents, so we had to turn it off. It just was not normal. The right kind of tech is a useful investment, and what I think the tech industry needs to examine is who they are creating this tech for. Nothing seems to be aimed at the person with dementia or at supporting the direct caregiver’s experience, which is where most of the dementia support is needed.

There are lots of great ideas, but I feel like they’re developed by people who have never had a relationship with someone with dementia.”
