The U.S. Food and Drug Administration (FDA) recently cleared the radiology software product Quantib ND for distribution in the United States.
The clearance covers Quantib Neurodegenerative (ND), a tool that helps radiologists read MRI brain scans.
The software measures brain atrophy and detects white matter hyperintensities, brain changes associated with conditions such as multiple sclerosis, dementia, and aging.
Quantib ND performs fully automatic segmentation of the hippocampus and brain lobes to assess atrophy progression objectively.
It also includes white matter hyperintensity (WMH) segmentation, which supports monitoring of neurological changes in patients with conditions such as MS and dementia.
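The basic quantity behind atrophy tracking is a structure's volume, derived from a segmentation mask. The following is an illustrative sketch, not Quantib's actual pipeline: the function name and toy mask are assumptions for demonstration.

```python
# Illustrative sketch (not Quantib's pipeline): deriving a structure volume
# from a binary segmentation mask, the basic quantity behind atrophy tracking.
import numpy as np

def structure_volume_ml(mask, voxel_size_mm=(1.0, 1.0, 1.0)):
    """Volume of a segmented structure in millilitres: voxel count x voxel volume."""
    voxel_mm3 = float(np.prod(voxel_size_mm))
    return mask.sum() * voxel_mm3 / 1000.0  # 1 mL = 1000 mm^3

# Toy example: a 20x20x20-voxel cube at 1 mm isotropic resolution
mask = np.zeros((64, 64, 64), dtype=bool)
mask[10:30, 10:30, 10:30] = True
print(structure_volume_ml(mask))  # 8000 voxels * 1 mm^3 = 8.0 mL
```

Comparing such volumes across scans taken months or years apart is how atrophy progression can be quantified.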
First Product Line Receives Certification
The clearance is part of an initial wave of regulatory approvals for products that apply machine learning and deep learning to data derived from other studies.
According to Quantib's Chief Scientific Officer, Wiro Niessen, this is an encouraging sign of where healthcare is headed.
“For the first time, we’re seeing FDA approvals and CE marks for products in which you do objective quantification, using data from other studies. That’s the first step towards the dot in the horizon, in which a patient is now treated with all the knowledge from previously ill patients,” he said.
Niessen, a professor of biomedical image analysis at Delft University of Technology and Erasmus MC in Rotterdam, said that impressive tools have begun to emerge not only on the market but also from academia, which is increasingly exploring partnerships with industry.
Quantib currently has several machine learning products with FDA clearance and CE marking, and others still in development.
The company is backed by UMC Utrecht and Erasmus MC Rotterdam, giving it access to large-scale imaging data.
The Personal Health Train and FAIR Data
Niessen said that suitable data must be used to develop algorithms that advance medical imaging and enable more precise diagnostics and precision medicine.
“In imaging and health data science, the concept of FAIR data, i.e. data that are Findable, Accessible, Interoperable and Reusable, must prevail. You have to know if you have a certain question, whether these data are somewhere.
The concept of FAIR means you’re able to find the data in your national or regional healthcare system, and to put them into an algorithm to train,” he said.
Genomic data, imaging data, and data held in electronic patient records (EPRs) and produced in clinical practice should all be made FAIR so that they can be used to build prognostic classifiers.
“As a hospital or organization that has relevant data, you want to deal with your data in order to ensure that they are FAIR.
Using federated learning, you want to be able to bring your software to these different places and analyze the data, bring the results back and learn from all the data,” Niessen added.
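The federated learning pattern Niessen describes can be sketched in a few lines: each site trains on its own data, only model parameters travel, and a server averages them. This is a minimal simulation of federated averaging (FedAvg) with a linear model; the hospital datasets and function names are illustrative assumptions, not any vendor's implementation.

```python
# Minimal federated averaging (FedAvg) sketch: data stays at each "hospital",
# only model parameters are exchanged. Illustrative, not a production system.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """Train a linear model locally by gradient descent; data never leaves the site."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(site_weights, site_sizes):
    """Server step: average parameters, weighting each site by its dataset size."""
    total = sum(site_sizes)
    return sum(w * (n / total) for w, n in zip(site_weights, site_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
# Two simulated hospitals holding different amounts of local data
sites = []
for n in (100, 300):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=n)
    sites.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in sites]
    global_w = federated_average(updates, [len(y) for _, y in sites])

print(global_w)  # converges toward the true coefficients without pooling raw data
```

The key property is the one Niessen highlights: the algorithm travels to the data, and only the learned results come back.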