Democratizing Cell Biology: Scientists Can Now Look Inside Live Human Cells

Label-free prediction of structures inside a single human cell. Gray: Nuclear envelope and DNA, Purple: Mitochondria, Yellow: Actin, Blue: Endoplasmic reticulum, Brown: Microtubules. (Credit: Allen Institute for Cell Science)

The Allen Institute for Cell Science has created a new machine learning-enabled tool that gives researchers a new way to see inside living human cells.

“It’s like seeing the whole cell for the first time,” said Rick Horwitz, PhD, Executive Director of the Allen Institute for Cell Science. “In the future, this will impact drug discovery, disease research and how we frame basic studies involving human cells.”

The tool, called the Allen Integrated Cell, works by learning from a large collection of live human cells that Allen Institute scientists gene-edited to incorporate fluorescent protein tags. These tags illuminate specific structures inside the cell, such as the nucleus and mitochondria.

The scientists took pictures of tens of thousands of these glowing cells and used artificial intelligence to analyze them.

First, the researchers developed a computer algorithm that studied the shapes of the plasma membrane, the nucleus, and other fluorescently labeled cell structures to learn their spatial relationships. This training produced a powerful probabilistic model that predicts the most probable shape and location of those structures in any cell, based solely on the shape of the plasma membrane and the nucleus.
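
To make the idea concrete, here is a minimal, hypothetical sketch of such a conditional model: it takes membrane and nucleus masks as input and outputs a probability map for each tagged structure. This is not the Allen Institute's published model (their code is open source on GitHub); the architecture, channel counts, and image sizes below are illustrative assumptions only.

```python
# Hypothetical sketch: a tiny conditional model that predicts per-structure
# probability maps from cell-membrane and nucleus masks. Illustrative only,
# not the Allen Institute's actual implementation.
import torch
import torch.nn as nn

class StructurePredictor(nn.Module):
    def __init__(self, n_structures=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1),   # input: membrane + nucleus masks
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, n_structures, kernel_size=1),   # one output map per structure
        )

    def forward(self, masks):
        # Sigmoid turns raw scores into per-pixel probabilities that a given
        # structure (e.g. mitochondria) occupies that location.
        return torch.sigmoid(self.net(masks))

model = StructurePredictor()
masks = torch.rand(1, 2, 128, 128)          # toy membrane/nucleus masks
fluorescence = torch.rand(1, 5, 128, 128)   # toy "ground truth" labeled images
loss = nn.functional.binary_cross_entropy(model(masks), fluorescence)
loss.backward()                              # gradient for one training step
```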

Second, the researchers took images of those same fluorescently labeled cells and applied a different machine learning algorithm. This algorithm used what it learned from cells with fluorescent labels to find the same cellular structures in cells without fluorescent labels. The resulting label-free model can be applied to relatively easy-to-collect brightfield microscope images to visualize the integration of many structures inside cells, simultaneously and with high precision. Viewed side by side, the images generated by the label-free method look nearly identical to the fluorescently labeled images of the same cells.
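
For illustration, the sketch below shows the general shape of a label-free approach under the same assumptions as above: a small image-to-image network is trained to regress fluorescence-like channels directly from a brightfield image, so that at inference time no fluorescent labels are needed. Again, this is not the Allen Institute's actual open-source implementation; names and dimensions are made up.

```python
# Hypothetical sketch: an image-to-image regressor mapping a transmitted-light
# (brightfield) image to fluorescence-like structure channels, in the spirit of
# the label-free approach described above. Illustrative assumptions throughout.
import torch
import torch.nn as nn

label_free = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1),    # 1 brightfield channel in
    nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(32, 5, kernel_size=1),               # predicted structure channels out
)

optimizer = torch.optim.Adam(label_free.parameters(), lr=1e-3)

brightfield = torch.rand(8, 1, 128, 128)    # toy batch of brightfield images
fluorescent = torch.rand(8, 5, 128, 128)    # matching fluorescently labeled images

# Training minimizes the pixel-wise difference between predicted and real
# fluorescence; at inference time only the brightfield image is needed.
pred = label_free(brightfield)
loss = nn.functional.mse_loss(pred, fluorescent)
loss.backward()
optimizer.step()
```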

“Fluorescence microscopy is expensive and toxic to cells; increasingly so when you tag multiple structures,” said Molly Maleckar, PhD, Director of Modeling. “Our approach allows scientists to view cells and conduct experiments at the reduced cost of brightfield microscopy, with the structure-identifying power of fluorescence microscopy – and without its toxic effects. It’s really the best of both worlds.”

“Until now, our ability to see what is going on inside of human cells has been very limited,” said Michael Elowitz, PhD, Professor of Biology, Bioengineering and Applied Physics at the California Institute of Technology. “Previously, we could only see the proteins that we deliberately labeled. But the Allen Integrated Cell is like the ultimate free lunch. We get to sample a ‘buffet’ of many different proteins and organelles, without having to label anything at all. This opens up a totally new and much more powerful way of doing cell biology. It’s a total game changer.”

Rolling out alongside the Allen Integrated Cell is the Visual Guide to Human Cells: an online interactive overview of human cell structure and function.

Like all Allen Institute tools and resources, the Allen Integrated Cell is publicly and freely available on the Allen Cell Explorer at AllenCell.org. All code is open source and downloadable from GitHub.