Currently, the power of artificial intelligence (AI) is being harnessed in almost every aspect of life, from medicine and agriculture to finance and the military. The Defense Advanced Research Projects Agency (DARPA) is one of the entities in the military space that has shown growing interest in artificial intelligence technology in recent years.
In fact, Raytheon BBN Technologies is currently creating a distinctive neural network capable of explaining itself. The effort falls under DARPA’s Explainable Artificial Intelligence (XAI) program.
The aim of the groundbreaking XAI program is to create a suite of machine learning techniques that produce more explainable models while maintaining a high level of performance. Beyond that, the program is geared toward helping human users understand, trust, and effectively manage the emerging generation of artificially intelligent partners.
The Explainable Question Answering System (EQUAS) being developed by Raytheon BBN Technologies is set to enable artificial intelligence programs to show their work, thereby increasing a human user’s confidence in a machine’s suggestions.
Bill Ferguson, lead scientist and Raytheon BBN’s principal investigator for EQUAS, said the company’s goal is to give human users enough information about how the machine arrived at its answer, and to show that the system considered the right information, so that users feel comfortable acting on the recommendations it provides.
EQUAS is expected to show users which data mattered most in the AI’s decision-making process. Through a graphical interface, users can explore the system’s recommendations to see why it chose one answer over the others.
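To make the idea concrete, here is a minimal sketch (not EQUAS’s actual method, and the feature names and weights are purely hypothetical) of the kind of evidence an explainable system might surface: ranking input features by how much each contributed to a simple linear model’s score.

```python
# Toy illustration of feature-level explanation for a linear model.
# The features, weights, and names here are invented for illustration.

def explain_decision(features, weights):
    """Return the model's score and its features ranked by the
    magnitude of their contribution (feature value * weight)."""
    contributions = {name: value * weights[name]
                     for name, value in features.items()}
    score = sum(contributions.values())
    ranked = sorted(contributions.items(),
                    key=lambda item: abs(item[1]), reverse=True)
    return score, ranked

# Hypothetical inputs: measured feature values and learned weights.
features = {"shadow_density": 0.9, "edge_sharpness": 0.2, "symmetry": 0.5}
weights = {"shadow_density": 2.0, "edge_sharpness": 0.5, "symmetry": -0.3}

score, ranked = explain_decision(features, weights)
print(f"score = {score:.2f}")
for name, contribution in ranked:
    print(f"{name}: {contribution:+.2f}")
```

A real system would use far richer attribution techniques, but the principle is the same: expose which inputs drove the answer so a user can judge whether the right evidence was considered.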
Although this remarkable technology is still in an early stage of development, it could eventually be applied across a broad range of domains.
According to Ferguson, a fully developed system such as Raytheon BBN’s Explainable Question Answering System (EQUAS) could support decision-making in Department of Defense (DoD) operations as well as in many other settings, including medicine, industrial operations, and campus security.
Ferguson gave the example of a doctor examining a lung X-ray whose AI system indicates cancer.
The doctor asks why the system gave that answer, and in turn the system highlights the regions it considers suspicious shadows, areas she had previously dismissed as artifacts of the X-ray process.
Prompted by the system’s suggestions, the doctor can investigate further or make a diagnosis. Alternatively, if she remains convinced the system is wrong, she can disregard its answer.
With continued enhancement, the Explainable Question Answering System (EQUAS) is expected to gain the ability not only to monitor itself but also to report the factors that limit its ability to generate trustworthy recommendations.
This self-monitoring capability will, in turn, help developers refine artificial intelligence systems so that they can take in additional data or change the way they process it.