Recently, the incorporation of artificial intelligence into cell biology at the Allen Institute for Cell Science led to the creation of new computer models that can convert simple black-and-white pictures of live human cells into detailed, colored 3D images. The online database, dubbed the Allen Integrated Cell, is now being made available to the public. The creators of the technology say it could reveal new insights into cells' inner workings.
Molly Maleckar, the Allen Institute's director of modeling, told GeekWire that from a single, simple microscopy image you can obtain a high-contrast 3D image in which all the separate structures and components are easy to see. She added that the images let you see the relationships between those structures and, by applying a time series, watch the changes dynamically.
In the long run, the Allen Integrated Cell could simplify monitoring how stem cells differentiate into the different types of body cells, examining the effects of drugs on individual cells, and observing how diseases influence cellular activities. Graham Johnson, the Allen Institute's director of the Animated Cell project, said the Seattle-based institution's goal is to observe the structures inside cells as close to their happy state as possible, without damaging them with light or interfering with their function.
This effort by the Allen Institute started with the collection of gene-edited human induced pluripotent stem cell (hiPSC) lines. To make it possible for researchers to identify the substructures inside them, the institute engineered these cells to carry fluorescent labels. Examples of such substructures include the microtubules that serve as cellular scaffolds, the energy-generating mitochondria, and the central region of a cell's nucleus.
Researchers involved in the project trained an artificial intelligence (AI) program to identify these glow-in-the-dark substructures in thousands of cells. They then applied the resulting deep-learning model to black-and-white images of cells without fluorescent labels. The label-free models allow highly detailed 3D images to be created from the kind of views obtained with a standard high-school microscope. The technique is described more comprehensively in a research paper available on the bioRxiv preprint site.
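The core idea, train on image pairs where the fluorescent "answer" is known, then apply the learned mapping to unlabeled images, can be illustrated with a deliberately tiny sketch. This is a hypothetical toy, not the institute's actual deep-learning model: instead of a neural network, it fits a simple per-pixel linear map from transmitted-light intensity to fluorescence intensity, then predicts a fluorescence channel for pixels that were never labeled.

```python
# Toy illustration of label-free prediction (hypothetical sketch, not the
# Allen Institute's real model): learn fluorescence = a * brightfield + b
# from labeled training pixels, then apply it to an unlabeled image.

def fit_linear_map(brightfield, fluorescence):
    """Least-squares fit of fluorescence = a * brightfield + b."""
    n = len(brightfield)
    mean_x = sum(brightfield) / n
    mean_y = sum(fluorescence) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(brightfield, fluorescence))
    var = sum((x - mean_x) ** 2 for x in brightfield)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def predict(brightfield, a, b):
    """Predict a fluorescence channel for unlabeled pixels."""
    return [a * x + b for x in brightfield]

# "Training" pair: label-free pixel intensities and measured fluorescence.
train_x = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
train_y = [0.1, 0.3, 0.5, 0.7, 0.9, 1.1]   # fluorescence = brightfield + 0.1

a, b = fit_linear_map(train_x, train_y)
unlabeled = [0.25, 0.75]
print(predict(unlabeled, a, b))  # approximately [0.35, 0.85]
```

The real system replaces this one-parameter-per-pixel map with a 3D convolutional network trained on thousands of image pairs, but the train-then-apply structure is the same.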
In addition to the recent model, the Allen Institute has another capable of predicting the most likely location and shape of structures in any pluripotent stem cell from the shape of the nucleus and cell membrane alone.
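The idea of predicting where a structure most likely sits from cell geometry alone can also be sketched in miniature. Again, this is a hypothetical illustration, not the institute's method: the "model" simply learns the average displacement of a structure's centroid from the nucleus centroid across training cells (2D points stand in for full 3D shapes and probability maps).

```python
# Hypothetical sketch: predict a structure's likely position from nuclear
# geometry by learning the mean nucleus-to-structure offset across cells.

def mean_offset(nucleus_centroids, structure_centroids):
    """Average displacement from nucleus centroid to structure centroid."""
    n = len(nucleus_centroids)
    dx = sum(s[0] - c[0] for c, s in zip(nucleus_centroids, structure_centroids)) / n
    dy = sum(s[1] - c[1] for c, s in zip(nucleus_centroids, structure_centroids)) / n
    return dx, dy

def predict_location(nucleus_centroid, offset):
    """Predict the structure centroid for a new cell from its nucleus."""
    return (nucleus_centroid[0] + offset[0], nucleus_centroid[1] + offset[1])

# Training cells: nucleus centroids and matching structure centroids.
nuclei = [(10.0, 10.0), (20.0, 12.0), (15.0, 8.0)]
structures = [(12.0, 11.0), (22.0, 13.0), (17.0, 9.0)]  # consistently (+2, +1)

offset = mean_offset(nuclei, structures)
print(predict_location((30.0, 30.0), offset))  # (32.0, 31.0)
```

The actual model outputs full probabilistic shapes rather than a single point, but both rest on the same premise: structure placement is statistically predictable from cell and nuclear geometry.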
These models could help researchers gather detailed information about cellular interactions over time without laser scans, chemical dyes, or other techniques that disturb the cells being studied.
Currently, the Allen Institute is focused on improving the computer modeling tools rather than venturing into clinical applications. Maleckar said that although the institute is thrilled about the downstream applications, its primary focus at the moment is probing the limits of the new technology.
Although the tools developed by the institute are aimed at converting high school-quality microscopy into professional-level visualizations for researchers, they could one day benefit high-school students as well.