SBIR-STTR Award

Device to Measure Pain Using Facial Expression Recognition Integrated with Patient Painreportit® Tablet
Award last edited on: 1/29/20

Sponsored Program
SBIR
Awarding Agency
NIH : NIDA
Total Award Amount
$149,997
Award Phase
1
Solicitation Topic Code
-----

Principal Investigator
Zhanli Chen

Company Information

eNursing LLC

1138 East 5645 South
Murray, UT 84121
   (801) 414-0627
   contact@enursingllc.com
   www.enursingllc.com
Location: Single
Congr. District: 03
County: Salt Lake

Phase I

Contract Number: 1R43DA046973-01
Start Date: 4/15/19    Completed: 5/31/20
Phase I year
2019
Phase I Amount
$149,997
Our goal is to develop a product that objectively measures pain using computer vision and machine learning technologies together with tablet-based self-reported pain data from patients, for research or clinical purposes. The device will be low-cost because it will consist of one or two cameras to record video and a computer to analyze the video in near real time. Its software will be portable to ordinary personal computers and tablets and will be able to simultaneously control multiple cameras that capture facial pain expression, an important modality for detecting pain, especially when a patient's verbal ability to communicate is impaired. Non-verbal facial pain behaviors provide important cues for estimating the level of pain, while self-reported tablet-based pain data discriminate the type of pain (nociceptive or neuropathic). Used together, these two data sources could contribute to appropriate use of opioids.

Facial muscle-based action units (AUs), defined by the Facial Action Coding System (FACS), have been widely studied and are highly reliable for detecting facial expressions, including valid detection of pain. Unfortunately, manual FACS coding is time-consuming, which makes routine clinical use prohibitive. An automated system for capturing facial images and detecting pain-related AUs, as proposed here, would be highly beneficial for efficient and practical pain monitoring.

Preliminary work based on a unique video dataset captured by MPI Wilkie led to a highly promising method for detecting AUs related to facial pain expression. Building on our existing research, we will enhance the performance of the automated method for detecting pain-related AUs and then integrate this information with patients' self-reported pain data to objectively measure pain intensity. We will develop a near real-time AU detection system for clinical settings using deep-learning-based facial video analysis software.
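To illustrate how detected AU intensities can map to a pain score, the sketch below computes the Prkachin–Solomon Pain Intensity (PSPI), one widely used FACS-derived pain metric. It is shown only as an example of AU-to-pain scoring; the project's own method may differ.

```python
def pspi(au: dict) -> int:
    """PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43.

    AU4 (brow lowerer), AU6/AU7 (orbit tightening), and AU9/AU10
    (nose wrinkle / upper-lip raise) are coded 0-5 from absent to
    maximal; AU43 (eye closure) is 0 or 1. Score ranges 0-16.
    Missing AUs are treated as absent.
    """
    g = au.get
    return (g("AU4", 0)
            + max(g("AU6", 0), g("AU7", 0))
            + max(g("AU9", 0), g("AU10", 0))
            + g("AU43", 0))

# Example frame: strong brow lowering and orbit tightening, eyes closed.
print(pspi({"AU4": 3, "AU6": 2, "AU7": 4, "AU43": 1}))  # → 8
```

An automated detector would emit per-frame AU intensities, so a score like this can be computed continuously rather than by a human coder reviewing video offline.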
The likelihood scores for the presence of different AUs in the video, together with the patient-reported data, will be fed to a second machine learning module, which will estimate the presence and intensity of pain automatically. The software will include an option for human verification. In Phase I of this SBIR project, we will estimate the accuracy of the proposed system through extensive tests on two existing facial pain image datasets. One of these is the Wilkie video dataset, which pairs patient-reported data with ground-truth scores from certified human FACS coders. We envision that when we move on to Phase II activities, eNursing LLC will create a device ready for testing in a suitable population to generate data supporting FDA approval.
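The two-stage fusion described above can be sketched as follows. Stage 1 (the video model) is assumed to output AU likelihoods in [0, 1]; stage 2 blends that facial evidence with the tablet-reported intensity. All AU names, weights, and the convex-combination rule here are illustrative placeholders — the actual second module would be a trained ML model, not fixed weights.

```python
from dataclasses import dataclass
from typing import Dict

# Pain-related AUs, following PSPI-style scoring; used here only as
# plausible feature names, not the project's actual feature set.
PAIN_AUS = ("AU4", "AU6", "AU7", "AU9", "AU10", "AU43")

@dataclass
class Observation:
    au_scores: Dict[str, float]  # stage-1 likelihoods in [0, 1]
    self_report: float           # tablet-reported intensity, 0-10 scale

def estimate_pain(obs: Observation, w_face: float = 0.5) -> float:
    """Blend facial-AU evidence with self-report into a 0-10 estimate.

    A simple convex combination for illustration; the proposed system
    would learn this mapping from labeled data instead.
    """
    face_evidence = sum(obs.au_scores.get(au, 0.0) for au in PAIN_AUS)
    face_score = 10.0 * min(face_evidence / len(PAIN_AUS), 1.0)
    return w_face * face_score + (1.0 - w_face) * obs.self_report

obs = Observation(
    au_scores={"AU4": 0.8, "AU6": 0.6, "AU7": 0.7, "AU43": 0.5},
    self_report=7.0,
)
print(round(estimate_pain(obs), 2))  # → 5.67
```

Because the fused estimate is bounded by both inputs, a flagged disagreement between face score and self-report is exactly the case where the proposal's human-verification option would be invoked.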

Public Health Relevance Statement:
Narrative: Automated detection of pain from facial expressions and patient-reported data can greatly improve patient-care efficiency and provide practical pain monitoring in a variety of clinical settings. We will develop and test an advanced video analysis and machine learning system to objectively measure pain for research and clinical purposes. This new tool has the potential to help rectify the poor pain outcomes that still plague Americans with opioid addiction, cancer, and other health conditions in many health care settings.

Project Terms:
Acute Pain; American; Analgesics; base; chronic pain; Clinical; clinically relevant; Clip; Code; Computer software; Computer Vision Systems; Computers; Consumption; cost; Cues; Data; Data Reporting; Data Set; Data Sources; deep learning; deep neural network; Detection; Devices; Environment; Eye; Face; Facial Expression; Facial Expression Recognition; Facial Muscles; Facial Pain; Goals; Health; Health Care Costs; health care settings; Human; Image; Impairment; improved; Injury; Label; Learning Module; Lighting; Machine Learning; Malignant Neoplasms; Manuals; Measures; Methods; Modality; Monitor; Nerve Tissue; Neuropathy; new technology; Nociception; Opiate Addiction; Opioid; opioid epidemic; Outcome; Pain; pain behavior; Pain intensity; Pain Measurement; pain outcome; pain patient; Pain Research; pain signal; painful neuropathy; Patient Care; Patient Self-Report; Patients; Performance; Personal Computers; Pharmaceutical Preparations; Phase; Plague; Population; portability; Process; prototype; Reporting; Research; Running; Small Business Innovation Research Grant; societal costs; Statistical Data Interpretation; System; Tablets; Technology; Testing; Time; Tissues; tool; Training; United States; Variant; Work

Phase II

Contract Number: ----------
Start Date: 00/00/00    Completed: 00/00/00
Phase II year
----
Phase II Amount
----