When discussing innovation in healthcare technology, much of the terminology is exotic-sounding and futuristic. Recent examples from this column include: functional MRIs to detect lies, active cancellation of tremor (ACT) to stabilize food utensils for Parkinson’s patients, and virtual assistant apps for people with cognitive disabilities.
But it’s important to remember that a great deal of progress is made by applying older technologies in new, transformative ways. Take, for example, a technological wonder of the 19th century: the camera. Obviously, cameras and imaging technologies have improved dramatically over the past few decades and are already used today in many parts of the hospital environment (e.g., laparoscopic surgeries and MRI scans). The use of advanced imaging technologies will continue to grow rapidly. Here are four new and exciting ways that cameras will impact the clinical landscape in the years ahead.
A laparoscopic robotic surgery machine. The use of cameras and advanced imaging technologies will start to expand beyond traditional surgical and scanning applications. (Photo credit: Wikipedia)
1) Facial recognition to determine pain levels: Until now, the best gauge of a patient’s pain level has been his or her self-assessment on a scale from 1 to 10. This subjective “measure” is a vague one; for something as important as pain management, more precise methods are badly needed. At the University of Notre Dame’s Robotics, Health, and Communication Lab, researchers are “creating high-fidelity robotic human patient simulators (HPS) that have the ability to exhibit realistic, clinically-relevant facial expressions – critical cues providers need to assess and treat patients.” Binghamton University has built a 3D Facial Expression Database, which the researchers say “aims to achieve a high rate of accuracy in identifying a wide range of facial expressions, with the ultimate goal of increasing the general understanding of facial behavior and 3D structure of facial expressions on a detailed level.” Both of these projects are building blocks towards cameras linked with data that can identify and quantify pain states.
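Neither lab’s scoring method is spelled out here, but one established way to turn facial expressions into a pain number is the Prkachin and Solomon Pain Intensity (PSPI) score, which sums the intensities of a handful of facial action units (AUs). A minimal sketch, assuming AU intensities on the standard 0–5 coding scale have already been extracted by some facial-coding system:

```python
def pspi_score(au):
    """Prkachin-Solomon Pain Intensity (PSPI) from facial action unit
    intensities (0-5 scale, except AU43 which is 0 or 1):

    PSPI = AU4 (brow lowering)
         + max(AU6, AU7)  (cheek raise / lid tightening)
         + max(AU9, AU10) (nose wrinkle / upper-lip raise)
         + AU43           (eye closure)
    """
    return (au["AU4"]
            + max(au["AU6"], au["AU7"])
            + max(au["AU9"], au["AU10"])
            + au["AU43"])

# Example: moderate brow lowering and cheek raising, eyes open
score = pspi_score({"AU4": 2, "AU6": 3, "AU7": 1,
                    "AU9": 0, "AU10": 1, "AU43": 0})
print(score)  # → 6
```

The hard part in practice, of course, is the computer-vision front end that estimates the AU intensities from camera frames; the scoring itself is simple arithmetic.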
2) Automated detection of patient activity: Traditionally, monitoring of patients has required some sort of physical contact with the monitoring device, such as EEG patches and fingertip oxygen sensors. For patients at risk for falls or self-injury, the solution has usually been “patient-sitters” who are stationed in the room. These sitters are often volunteers from the patient’s family, or hourly employees whose services aren’t covered by insurance. Cisco offers a Video Surveillance Manager that feeds high-def video to an operations center where trained staff can notify appropriate personnel by two-way video, voice, text, paging, or integration with existing nurse call systems. Fujitsu has gone further, creating a camera that recognizes when a patient sits up in bed, gets out of bed, or is tossing and turning in a restless effort to get to sleep.
3) Measurement of changes in heart rate from head movement: Researchers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) are working to improve their algorithms that detect the head motion associated with each heartbeat. In other words, each time your heart beats, your head moves slightly. CSAIL is developing technology to translate video of those head movements into accurate data on heart activity. For now, the margin of error is large enough to prevent the data from being clinically useful, but that should change as the imaging and analysis elements improve.
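The core signal-processing idea is straightforward: track the head’s position frame by frame, then find the dominant frequency of that motion within a plausible heart-rate band. A simplified sketch using a synthetic motion trace in place of real video tracking (the frame rate, band limits, and signal itself are illustrative assumptions):

```python
import numpy as np

fs = 30.0                     # video frame rate (frames per second)
t = np.arange(0, 20, 1 / fs)  # 20 seconds of head-position samples

# Synthetic vertical head motion: a 1.2 Hz (72 bpm) pulse plus camera noise
rng = np.random.default_rng(0)
motion = 0.1 * np.sin(2 * np.pi * 1.2 * t) + 0.02 * rng.standard_normal(t.size)

# Dominant frequency within a plausible heart-rate band (0.75-3 Hz)
spectrum = np.abs(np.fft.rfft(motion - motion.mean()))
freqs = np.fft.rfftfreq(motion.size, 1 / fs)
band = (freqs >= 0.75) & (freqs <= 3.0)
heart_rate_bpm = 60 * freqs[band][np.argmax(spectrum[band])]
print(round(heart_rate_bpm))  # → 72
```

The clinical challenge is everything this sketch skips: tracking the head robustly in real video, and separating the tiny heartbeat-driven motion from breathing, talking, and deliberate movement.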
4) Peripheral vein imaging: Finding a viable vein for an injection or IV can be tricky, even for experienced phlebotomists. Multiple sticks can waste time during an emergency, in addition to causing the patient discomfort. A company called Christie Medical Holdings is marketing a product called VeinViewer to make seeing the veins easier. They claim, “Projected near-infrared light is absorbed by blood and reflected by surrounding tissue. The information is captured, processed and projected digitally in real time directly onto the surface of the skin. It provides a real time accurate image of the patient’s blood pattern.” Technology like this would be very welcome in hospitals, outpatient labs, chemotherapy suites, and blood drives.
As with any advances in technology, these innovations may also pose ethical and legal challenges. Patients’ privacy when observed by remote cameras is an obvious example. Are the images being recorded and stored? Who has access to the images, and how? One factor that could speed widespread acceptance is demonstrating that these concerns are addressed within the technology itself, or by procedural safeguards.
The use of cameras in the clinical environment will become ubiquitous as image-processing techniques provide more accurate and timely monitoring of critical physiological indicators.
Rob Szczerba is the CEO of X Tech Ventures. Follow him on Forbes, Twitter (@RJSzczerba), Facebook, and LinkedIn.