This blog post was first published on the SAS Voices Blog in October 2019.
I suffer from arthritis. You can tell just by watching me walk: depending on the day, I have a slight limp, which varies in severity based on a number of factors such as the time of day and recent physical activity.
Years of treatment for my condition have shown me that while technology advances have certainly transformed aspects of treatment, many practices remain stubbornly rooted in old ways of providing care – which I believe has a direct impact on the efficacy and cost of my treatment.
For example, when trained medical clinicians monitor my body’s response to treatment, the process is highly qualitative and subjective, and seems to gather only a sliver of the data points that could be measured. They put me through a series of 14 simple physical tests (“Try to put your leg there, Mark. Okay, now move it here. Now stand on one foot.”) and make notes on my movements. This is called the Berg Balance Scale, and it takes about 20 minutes to conduct.
Isn’t there a better, more scientific way to do this? One that uses technology to monitor much more data, while requiring less observation time on the part of the doctors, nurses, and others caring for me? I would much rather have them spending their valuable time trying to identify new ways to treat my condition (and those of others) than simply taking notes on my ability to execute a series of basic physical tests. Plus, in the context of a health care system that is by most accounts straining under the expense of care, experiencing unsustainable cost increases, and woefully unprepared for the coming generation of care demands, the industry must find a better way.
As a scientist who specializes in analytics and the IoT, I am an inveterate tinkerer. Coming out of yet another assessment at my doctor’s office, I couldn’t help but wonder whether small sensors measuring my movements, delivering gyroscopic and accelerometer data, combined with machine learning algorithms, could deliver more useful quantitative data, more efficiently, in a more scientific, reproducible way. So, I bootstrapped a proof of concept. I created a basic algorithm to process movement data generated by a fairly sophisticated instrument that I carry with me virtually everywhere I go – my smartphone.
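To make the idea concrete: one simple thing a smartphone's accelerometer lets you quantify is postural sway. The sketch below is not the algorithm from my proof of concept – it's a minimal, hypothetical illustration of the kind of metric you can derive from raw accelerometer samples: the RMS deviation of acceleration magnitude from gravity, which is near zero for a perfectly still subject and grows as the subject wobbles.

```python
import math

def sway_rms(samples, g=9.81):
    """Rough postural-sway proxy from accelerometer readings.

    `samples` is a list of (ax, ay, az) tuples in m/s^2. A motionless
    subject produces magnitudes near g, so the root-mean-square
    deviation of magnitude from g grows with body sway.
    """
    if not samples:
        raise ValueError("no samples")
    deviations = [
        math.sqrt(ax**2 + ay**2 + az**2) - g
        for ax, ay, az in samples
    ]
    return math.sqrt(sum(d * d for d in deviations) / len(deviations))

# A still subject (gravity only) scores ~0; a wobbly one scores higher.
still = [(0.0, 0.0, 9.81)] * 50
wobbly = [(0.3, -0.2, 9.9), (0.5, 0.1, 9.6), (-0.4, 0.2, 10.1)] * 17
```

A real assessment would fuse gyroscope data and far richer features, but even this one number is already objective and reproducible in a way that a clinician's notes are not.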
The results were encouraging enough to warrant more exploration. So, I called up my colleague Chaz Henry, a longtime software engineer who specializes in AI and machine learning. Could we find a way to update the Berg Balance Scale, making it a quantitative, digital, automated clinical assessment?
Here’s what we did. We replaced my phone with some more advanced hardware:
- Intel® IEI* Tank Developer Kit (i5)
- Intel® FPGA V100
- Intel® RealSense™ Camera
- BLE-to-Wi-Fi gateway hub
- MetaMotionR 9-axis IMU (a wearable device that offers real-time, continuous motion monitoring)
In place of my rudimentary algorithm, we used these SAS solutions:
- SAS Analytics for IoT – which allows for GUI-based data mining and machine learning, and
- SAS Event Stream Processing – which allows real-time intelligent decisions with data streams.
The result – an AI-driven version of the Berg Balance Scale – was featured in the Intel booth at a recent conference. Our approach has not yet been adopted in the field, nor has it made its way through all the required regulatory channels for approval in clinical use. It is still very much an experiment. But it is promising, and shows what is possible in applying analytics, AI, and machine learning to motion capture – not to mention modern medicine.
Using our approach, mathematical models describe the preoperative condition of the patient as recorded in the assessment – a quantitative baseline. From there, models are used to score and track rehabilitation progress and, ultimately, to evaluate the effect of surgery in terms of therapeutic efficacy compared to the primary diagnosis.
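One way to picture "baseline plus tracking" is as a simple comparison between two assessments of the same quantitative features. The function below is a hypothetical scoring scheme, not our actual model: it assumes each feature is a "lower is better" measurement (such as sway RMS) and reports the mean fractional improvement of a follow-up assessment over the baseline.

```python
def progress_score(baseline, followup):
    """Mean fractional improvement of a follow-up over a baseline.

    `baseline` and `followup` map feature names to measured values
    where lower is better (e.g. sway RMS). A positive score means
    the patient improved on average. Illustrative scheme only.
    """
    if set(baseline) != set(followup):
        raise ValueError("assessments must measure the same features")
    changes = [
        (baseline[k] - followup[k]) / baseline[k] for k in baseline
    ]
    return sum(changes) / len(changes)

# Example: sway halved, one-foot-stand wobble down 25% -> score 0.375.
score = progress_score({"sway": 2.0, "wobble": 4.0},
                       {"sway": 1.0, "wobble": 3.0})
```

The real models are far richer, but the principle is the same: because the baseline is numeric, every later assessment can be scored against it objectively.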
What does that mean? For starters, it means that when I’m being led through the Berg Balance Scale assessment, the system is recording a host of measurements – far more than any clinician can capture. This data is fed into a model and could ultimately be shared across providers treating others affected by arthritis. Subsequent, routine assessments are fed into the same model, which could allow my doctors to track my progress with pinpoint accuracy and in rich detail, seeing patterns that would otherwise have remained hidden. Using machine learning, the models themselves improve over time as more data arrives. And as data volumes increase, SAS software combined with Intel processing power helps ensure the system can keep pace.
Using our approach, the 5-point scale that my doctors have relied on for decades would be replaced by a much more useful, detailed dashboard view – of even more data points. The dashboard below is only one possible version of how this data could be presented.
Will this approach, put into practice by the health care industry, affect how I’m treated? Maybe. But I am hopeful that by the time my children begin to grapple with these same issues, this approach (or one like it, using advanced AI, analytics, and machine learning) will help doctors provide more effective care, more cheaply and quickly. That’s the very real potential of these technology advances to change the shape of medical care today, and it’s another example of a real opportunity for data scientists to apply their talents to Data for Good.
For more details about the 20+ year collaborative partnership between Intel and SAS, please visit sas.com/intel.
To learn more about how SAS delivers AI to enhance human ingenuity, please visit sas.com/AI.
Mark Wolff, PhD. has over 25 years of experience in the health and life science industries as a scientist and analyst working in the U.S. and Europe. For more details of his background, visit his LinkedIn profile.