The world is facing a maternal health crisis. According to the World Health Organization, roughly 810 women die every day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, one of the leading causes of maternal mortality is infected Cesarean section wounds.
An interdisciplinary team of doctors and researchers from MIT, Harvard University, and Partners in Health (PIH) in Rwanda has proposed a solution to address this problem. They have developed a mobile health (mHealth) platform that uses artificial intelligence and real-time computer vision to predict infection in C-section wounds with roughly 90 percent accuracy.
“Early detection of infection is an important issue worldwide, but in low-resource areas such as rural Rwanda, the problem is even more dire due to a lack of trained doctors and the high prevalence of bacterial infections that are resistant to antibiotics,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, research scientist in mechanical engineering at MIT and technology lead for the team. “Our idea was to employ mobile phones that could be used by community health workers to visit new mothers in their homes and inspect their wounds to detect infection.”
This summer, the team, which is led by Bethany Hedt-Gauthier, a professor at Harvard Medical School, was awarded the $500,000 first-place prize in the NIH Technology Accelerator Challenge for Maternal Health.
“The lives of women who deliver by Cesarean section in the developing world are compromised by both limited access to quality surgery and postpartum care,” adds Fredrick Kateera, a team member from PIH. “Use of mobile health technologies for early identification, plausible accurate diagnosis of those with surgical site infections within these communities would be a scalable game changer in optimizing women’s health.”
Training algorithms to detect infection
The project’s inception was the result of several chance encounters. In 2017, Fletcher and Hedt-Gauthier bumped into each other on the Washington Metro during an NIH investigator meeting. Hedt-Gauthier, who had been working on research projects in Rwanda for five years at that point, was searching for a solution to the gap in Cesarean care that she and her collaborators had encountered in their research. Specifically, she was interested in exploring the use of cellphone cameras as a diagnostic tool.
Fletcher, who leads a group of students in Professor Sanjay Sarma’s AutoID Lab and has spent decades applying phones, machine learning algorithms, and other mobile technologies to global health, was a natural fit for the project.
“Once we realized that these types of image-based algorithms could support home-based care for women after Cesarean delivery, we approached Dr. Fletcher as a collaborator, given his extensive experience in developing mHealth technologies in low- and middle-income settings,” says Hedt-Gauthier.
During that same trip, Hedt-Gauthier serendipitously sat next to Audace Nakeshimana ’20, then a new MIT student from Rwanda who would later join Fletcher’s team at MIT. With Fletcher’s mentorship, during his senior year, Nakeshimana founded Insightiv, a Rwandan startup applying AI algorithms to the analysis of clinical images, and was a top grant awardee at the annual MIT IDEAS competition in 2020.
The first step in the project was gathering a database of wound images taken by community health workers in rural Rwanda. They collected over 1,000 images of both infected and non-infected wounds and then trained an algorithm using that data.
A central problem emerged with this first dataset, collected between 2018 and 2019: many of the photos were of poor quality.
“The quality of wound images collected by the health workers was highly variable and it required a large amount of manual labor to crop and resample the images. Since these images are used to train the machine learning model, the image quality and variability fundamentally limits the performance of the algorithm,” says Fletcher.
To solve this problem, Fletcher turned to tools he had used in previous projects: real-time computer vision and augmented reality.
Improving image quality with real-time image processing
To encourage community health workers to take higher-quality images, Fletcher and the team revised the wound-screener mobile app and paired it with a simple paper frame. The frame contains a printed calibration color pattern and another optical pattern that guides the app’s computer vision software.
Health workers are instructed to place the frame over the wound and open the app, which provides real-time feedback on camera placement. The app uses augmented reality to display a green check mark when the phone is in the proper range. Once in range, other parts of the computer vision software automatically balance the color, crop the image, and apply transformations to correct for parallax.
“By using real-time computer vision at the time of data collection, we are able to generate beautiful, clean, uniform color-balanced images that can then be used to train our machine learning models, without any need for manual data cleaning or post-processing,” says Fletcher.
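The automatic color balancing Fletcher describes can be illustrated with a simple gray-world correction. This is only a minimal sketch of the idea, not the team's actual pipeline: in the real app, the printed calibration color pattern in the paper frame would let the software solve for per-channel gains directly rather than assuming the scene averages to gray.

```python
import numpy as np

def gray_world_balance(img):
    """Scale each color channel so its mean matches the global mean
    (the gray-world assumption), removing a uniform color cast."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    global_mean = channel_means.mean()
    balanced = img * (global_mean / channel_means)
    return np.clip(balanced, 0, 255).astype(np.uint8)

# Synthetic example: an image with a strong red cast.
rng = np.random.default_rng(0)
img = rng.integers(0, 120, size=(64, 64, 3)).astype(np.uint8)
img[..., 0] = np.clip(img[..., 0].astype(int) + 100, 0, 255)  # add red cast

out = gray_world_balance(img)
print(out.mean(axis=(0, 1)))  # channel means are now roughly equal
```

Applying the same correction to every photo at capture time is what makes the resulting dataset uniform enough to train on without manual cleaning.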
Using convolutional neural network (CNN) machine learning models, along with a technique called transfer learning, the software has been able to successfully predict infection in C-section wounds with roughly 90 percent accuracy within 10 days of childbirth. Women who are predicted to have an infection through the app are then given a referral to a clinic, where they can receive diagnostic bacterial testing and be prescribed life-saving antibiotics as needed.
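The transfer-learning approach can be sketched as follows: a pretrained CNN backbone is kept frozen as a feature extractor, and only a small classification head is retrained on the wound images. The sketch below is illustrative only — it stands in a fixed random projection for the frozen backbone and synthetic data for the wound photos, whereas a real pipeline would use a network pretrained on a large image corpus.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a frozen, pretrained CNN backbone: a fixed projection
# from raw pixels to a 32-dim feature vector (weights are never updated).
W_backbone = rng.normal(size=(64 * 64, 32))

def extract_features(images):
    flat = images.reshape(len(images), -1)
    return np.tanh(flat @ W_backbone / flat.shape[1] ** 0.5)

# Synthetic stand-in data: "infected" images are brighter on average.
n = 200
labels = rng.integers(0, 2, size=n)
images = rng.normal(loc=labels[:, None, None] * 2.0, size=(n, 64, 64))

X = extract_features(images)

# Transfer learning: train only the small classification head
# (logistic regression) while the backbone stays frozen.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.2 * X.T @ (p - labels) / n
    b -= 0.2 * (p - labels).mean()

pred = 1 / (1 + np.exp(-(X @ w + b))) > 0.5
print("train accuracy:", (pred == labels).mean())
```

Freezing the backbone is what makes the approach practical with only ~1,000 labeled wound images: the head has far fewer parameters than a full CNN, so it can be fit on a small dataset without overfitting.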
The app has been well received by women and community health workers in Rwanda.
“The trust that women have in community health workers, who were a big promoter of the app, meant the mHealth tool was accepted by women in rural areas,” adds Anne Niyigena of PIH.
Using thermal imaging to address algorithmic bias
One of the biggest hurdles to scaling this AI-based technology to a more global audience is algorithmic bias. When trained on a relatively homogenous population, such as that of rural Rwanda, the algorithm performs as expected and can successfully predict infection. But when images of patients with different skin colors are introduced, the algorithm is less effective.
To address this problem, Fletcher turned to thermal imaging. Simple thermal camera modules, designed to attach to a cellphone, cost roughly $200 and can be used to capture infrared images of wounds. Algorithms can then be trained using the heat patterns of infrared wound images to predict infection. A study published last year showed over 90 percent prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
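The reason thermal imaging sidesteps pigmentation bias is that an infected wound shows up as elevated skin temperature, a signal that looks the same across skin tones. A minimal sketch of extracting such a feature from synthetic temperature maps (the 2.5 °C elevation, map size, and mask here are illustrative values, not figures from the study):

```python
import numpy as np

rng = np.random.default_rng(7)

def thermal_features(temp_map, wound_mask):
    """Summarize a thermal image: how much warmer the wound region
    is than the surrounding skin, plus its temperature spread."""
    wound = temp_map[wound_mask]
    skin = temp_map[~wound_mask]
    return np.array([wound.mean() - skin.mean(), wound.std()])

# Synthetic 32x32 skin-temperature maps in deg C; the wound occupies
# a central patch, and infection raises its temperature.
mask = np.zeros((32, 32), dtype=bool)
mask[12:20, 12:20] = True

def make_map(infected):
    base = 33.0 + rng.normal(scale=0.3, size=(32, 32))
    base[mask] += 2.5 if infected else 0.5  # assumed elevations
    return base

healthy_feat = thermal_features(make_map(False), mask)
infected_feat = thermal_features(make_map(True), mask)
print(healthy_feat[0], infected_feat[0])  # infected wound runs warmer
```

Because the elevation feature does not depend on skin color, features like these can be fed into the CNN pipeline alongside (or instead of) visible-light features to build the more general model the team describes.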
While more expensive than simply using the phone’s camera, the thermal imaging approach could be used to scale the team’s mHealth technology to a more diverse, global population.
“We’re giving the health staff two options: in a homogenous population, like rural Rwanda, they can use their standard phone camera, using the model that has been trained with data from the local population. Otherwise, they can use the more general model which requires the thermal camera attachment,” says Fletcher.
While the current generation of the mobile app uses a cloud-based algorithm to run the infection prediction model, the team is now working on a stand-alone mobile app that does not require internet access, and that also looks at all aspects of maternal health, from pregnancy to postpartum.
In addition to expanding the library of wound images used in the algorithms, Fletcher is working closely with former student Nakeshimana and his team at Insightiv on the app’s development, using Android phones that are locally manufactured in Rwanda. PIH will then conduct user testing and field-based validation in Rwanda.
As the team looks to develop the comprehensive app for maternal health, privacy and data protection are a top priority.
“As we develop and refine these tools, a closer attention must be paid to patients’ data privacy. More data security details should be incorporated so that the tool addresses the gaps it is intended to bridge and maximizes user’s trust, which will eventually favor its adoption at a larger scale,” says Niyigena.
Members of the prize-winning team include: Bethany Hedt-Gauthier from Harvard Medical School; Richard Fletcher from MIT; Robert Riviello from Brigham and Women’s Hospital; Adeline Boatin from Massachusetts General Hospital; Anne Niyigena, Frederick Kateera, Laban Bikorimana, and Vincent Cubaka from PIH in Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.