Adam Zhao on Revolutionizing Fall Detection with AI

by Sage
July 11, 2025

Adam Zhao joined Sage as the head of AI in February 2025. In that short time, he has led the development and launch of Sage Detect, a new product that provides real-time fall alerts and wellness trends. We spoke with Adam about his path from engineering to his current role at Sage, the process of building Sage Detect, and how AI is transforming senior living.

What inspired you to work in senior care technology? 

I wanted to work on something deeply impactful to a lot of people. Everyone gets old and the current paradigm of aging is neither sustainable nor optimal—that’s not what I want for my parents, grandparents, or even myself. 

On a personal level, my grandma was diagnosed with dementia. She was living in a senior living community and everything was going relatively well—until the first fall. Then everything started compounding. I wanted to solve this problem and build something that could both keep my grandma safe and alleviate the burden on my mom.

How did you begin working on AI fall detection?

When I left my engineering role at Palantir in 2024, AI had become good enough and, more importantly, cheap enough for wide application across many industries. Companies like OpenAI and Google have slashed the cost of their models with every generation. I did the math and realized that commercial AI models were finally cheap enough to make fall detection viable at scale.

I had the opportunity to join Sage to work on AI fall detection technology. We recently launched Sage Detect—and it’s had an immediate impact. It has been very meaningful, validating, and rewarding so far, a powerful example of what can happen when the right people come together around a shared vision. 

What sets Sage Detect’s AI apart from other solutions?

Sage Detect uses the largest vision language model on the market—and the best so far for the cost. It can detect actions with more accuracy and granularity than anything I’ve seen to date. Before the latest AI wave, we could maybe detect “here’s a fall” or “here’s not a fall.” The models have gotten so much better and will keep getting better as they are trained on more data by more people. Plus, the models we use iterate quickly, so we can be responsive to the needs of care staff and residents.

Why is AI a great solution for fall detection and wellness tracking?

Non-optical sensors and wearables each have pros and cons. The latest AI is accurate and, most importantly, granular enough to classify other actions, such as sitting, walking, standing, lying in bed, lying on the ground, drinking water, and more. If you look at the descriptions of what the AI is detecting, you can get an accurate picture of what the person is doing while still preserving privacy, since you don’t have to watch video footage.

In Japan, they have one person watching a big screen with CCTV footage. We want to detect emergencies, like falls, seizures, and signs of distress, where we need a caregiver to immediately render help—but a human watching even 10 screens can’t spot long-term trends or changing conditions. In AI terms, the human context window is so short—I’ve already forgotten what I ate for lunch yesterday, but the computer will never forget. It can pick up on trends in what I ate for lunch, maybe even the calories of the food, and then give me a health report.

What wellness trends can AI track over time and how can it change health outcomes?

AI surfaces interesting insights, like sleep patterns, inactivity patterns, and bathroom frequency, so you can observe changes in condition and catch them earlier. Changes in sleep patterns may indicate the progression of Alzheimer’s, and a spike in bathroom frequency can be an early sign of a UTI. Then comes the marriage between what computers are good at and what humans are good at. Computers can crunch huge amounts of data and surface trends, and caregivers can leverage those insights to provide more personalized and proactive care.

Why is it important to have humans working alongside AI?

Eliminating humans would be a huge loss. From the technical side, AI is not good enough to fully eliminate humans—I don’t know if it will ever be good enough. You’d also be losing something computers can never replace: the human intuition that leads to better care, not to mention the human interaction and other intangibles that a person needs.

What do you think about concerns around AI and privacy?

Humans have a knee-jerk reaction of, “You’re going to put a camera in my room, is this a complete loss of privacy?” As a person ages, there can be trade-offs for safety. For example, if I lose my ability to ambulate, I may have to call someone to take me places. That’s a loss of privacy, but it increases safety. I believe AI can actually change the tradeoff between safety and privacy. Having AI “watch” video feeds and output deidentified descriptions of what you’re doing is actually very private. The caregiver receiving this alert can’t see the footage—reading that someone is changing is more private than watching them change. But, to push this further, this technology may also delay loss of independence. In that case, if we make the environment safer, we’re preserving privacy in the long term.

How has the process of building and deploying Sage Detect surprised you?

The speed has been incredible. We deployed our pilot in CountryHouse Omaha and, two days later, we detected a fall. It’s been very gratifying to get seniors help in minutes—and the alerts are sent out in seconds. Then we can give footage to clinicians in the community to make determinations of whether a senior needs additional help. For example, we detected a fall involving a resident who had fallen a week earlier and needed hip surgery on their right side. This time, we knew the resident had fallen on their left side. Without Sage Detect, the care staff would have sent this resident to the ER out of an abundance of caution. So we’re not only able to provide much faster help to that person, but also to save them a hospital trip.

What excites you about the future of AI?

I'm really excited about the intersection of AI and human intelligence. AI can catch what we could never observe, but caregivers are the ones who know whether someone has a sore or a bruise, as well as the greater context around the senior as a human being. If AI can crunch large amounts of data, distill it with the right granularity, and present it to the right person at the right time, then human intelligence and empathy kick in to deliver the best care. In time, I think we’ll be able to anticipate the needs of seniors. My hope is that the call button will be rendered obsolete—because we will be able to give you the care and attention required before you need to push the help button. 

What inspires you in your current role?

Building in the senior living space is immensely fulfilling. I’m not only talking about the people at Sage, who are all rockstars, but the clients. Caregiving can be emotionally draining and often thankless. Most people do it not for the money, but for a higher calling. I find that really inspiring. It’s very motivating for me to build something for residents as well as the frontline caregivers who put so much of themselves into their roles. 

Learn more about Sage Detect here.