Chronic cancer pain is difficult to measure. That’s why Booz Allen and the NIH are using AI to better understand it.

Using AI to Combat Cancer Pain

The nation’s first multimodal cancer pain dataset

For decades, doctors have relied on a simple one-to-ten scale to measure pain. But chronic cancer pain is complex, and numbers alone, especially self-reported ones, rarely tell the whole story. Now, a public-private partnership is giving doctors a clearer picture—one that could transform how cancer pain is understood and treated.

The breakthrough comes from a joint Booz Allen and National Institutes of Health (NIH) project to develop the nation’s first cancer pain dataset containing multiple types of information, including video, audio, text, and self-reported patient details. By training AI models on the dataset, researchers can predict pain far more accurately than traditional methods. It’s a first-of-its-kind effort to capture and classify chronic cancer pain on a large scale—providing clinicians with a powerful new tool to guide treatment and improve care.
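
The article doesn’t detail the model architecture, but one common way to train on a dataset like this is late fusion: encode each modality separately, then combine the embeddings to predict a pain level. The sketch below is a minimal, hypothetical PyTorch example under that assumption; the class name, feature dimensions, and 0-to-10 output scale are illustrative, not the project’s actual design.

```python
import torch
import torch.nn as nn

class MultimodalPainClassifier(nn.Module):
    """Hypothetical late-fusion model: each modality (video, audio, text,
    self-report) is encoded separately, then the embeddings are
    concatenated and mapped to a pain-level prediction."""

    def __init__(self, video_dim=512, audio_dim=128, text_dim=768,
                 report_dim=10, hidden_dim=256, num_levels=11):
        super().__init__()
        # One small encoder per modality; in practice these would be
        # pretrained backbones (e.g., a CNN for video frames,
        # a transformer for clinical text).
        self.video_enc = nn.Linear(video_dim, hidden_dim)
        self.audio_enc = nn.Linear(audio_dim, hidden_dim)
        self.text_enc = nn.Linear(text_dim, hidden_dim)
        self.report_enc = nn.Linear(report_dim, hidden_dim)
        # Fusion head: concatenated embeddings -> logits over pain levels 0-10.
        self.head = nn.Sequential(
            nn.ReLU(),
            nn.Linear(4 * hidden_dim, num_levels),
        )

    def forward(self, video, audio, text, report):
        fused = torch.cat([
            self.video_enc(video),
            self.audio_enc(audio),
            self.text_enc(text),
            self.report_enc(report),
        ], dim=-1)
        return self.head(fused)

# Toy usage with random features standing in for real extracted inputs.
model = MultimodalPainClassifier()
logits = model(torch.randn(1, 512), torch.randn(1, 128),
               torch.randn(1, 768), torch.randn(1, 10))
print(logits.argmax(dim=-1))  # predicted pain level, 0-10
```

One advantage of this design is that it supports the retraining described above: a new modality such as thermal imagery can be added as another encoder without rebuilding the rest of the model.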

Today, the dataset houses more than 500 patient videos and nearly 200,000 video frames, making it the nation’s largest repository of cancer pain information. As patients continue to enter the ongoing clinical trial, the dataset will only grow—allowing the AI models to be retrained and updated with new information, such as thermal imagery. With strict safeguards in place to protect patient privacy, the dataset will also be made available to AI researchers, opening the door for future breakthroughs.

When Pain Is Hard to See

Cancer patients often minimize their pain, and traditional scales are subjective, failing to capture the full spectrum of emotional and physical distress. This gap can leave doctors underestimating how much patients are suffering and limit their ability to offer the right treatment at the right time.

By analyzing data from multiple sources, the AI-powered effort provides a clearer view of what patients are going through in their daily lives. The models can detect subtle cues in facial expressions, tone of voice, or even word choice—signals that may reveal pain even when a patient doesn’t report it. Combined with patients’ own reports, this fuller picture gives doctors a stronger foundation for making care decisions. With more accurate insights, doctors can adjust treatment plans earlier and provide more effective support.
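
As an illustration of how observed cues might be reconciled with self-reports, here is a hypothetical Python sketch that flags visits where the cues suggest more pain than the patient rated. The `PainCues` fields, the 0-to-1 scoring scale, and the threshold are all assumptions for illustration, not the project’s method.

```python
from dataclasses import dataclass

@dataclass
class PainCues:
    """Hypothetical per-visit cue scores, each scaled 0-1 by upstream models."""
    facial: float       # e.g., grimacing detected in video frames
    vocal: float        # e.g., strained tone detected in audio
    linguistic: float   # e.g., pain-related word choice in transcripts
    self_report: float  # patient's own 0-10 rating, normalized to 0-1

def flag_underreporting(cues: PainCues, margin: float = 0.3) -> bool:
    """Flag visits where observed cues suggest substantially more pain
    than the patient reported, so a clinician can follow up.
    The margin is an illustrative threshold, not a clinical standard."""
    observed = (cues.facial + cues.vocal + cues.linguistic) / 3
    return observed - cues.self_report > margin

# Example: strong facial and vocal cues despite a low self-report.
print(flag_underreporting(PainCues(0.8, 0.7, 0.6, 0.2)))  # True
```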

Today, cancer pain remains one of health care’s toughest challenges, but these new tools are making it possible to understand it in ways that once felt out of reach. For patients and families, that means more than numbers on a chart. It means an evidence-based approach to pain management—one that can ultimately improve quality of life.
