Speedy computation enables scientists to reconstruct an animal's developmental building plan cell by cell
Source: Howard Hughes Medical Institute
Recent advances in imaging technology are transforming how scientists see the cellular universe, showing the form and movement of once grainy and blurred structures in stunning detail. But extracting the torrent of information contained in those images often surpasses the limits of existing computational and data analysis techniques, leaving scientists less than satisfied.
Now, researchers at the Howard Hughes Medical Institute's Janelia Research Campus have developed a way around that problem. They have developed a new computational method that can rapidly track the three-dimensional movements of cells in such data-rich images. Using the method, the Janelia scientists can essentially automate much of the time-consuming process of reconstructing an animal's developmental building plan cell by cell.
Philipp Keller, a group leader at Janelia, led the team that developed the computational framework. He and his colleagues, including Janelia postdoc Fernando Amat, Janelia group leader Kristin Branson and former Janelia lab head Eugene Myers, who is now at the Max Planck Institute of Molecular Cell Biology and Genetics, have used the method to reconstruct cell lineage during development of the early nervous system in a fruit fly. Their method can be used to trace cell lineages in multiple organisms and efficiently processes data from multiple kinds of fluorescent microscopes.
The scientists describe their approach in a paper published online on July 20, 2014, in Nature Methods.
In 2012, Keller developed the simultaneous multi-view (SiMView) light sheet microscope, which captures three-dimensional images with unprecedented speed and precision over periods of hours or days. The microscope's images can reveal the divisions and intricate rearrangements of individual cells as biological structures emerge in a developing embryo. Since then, Keller has been perfecting the system so that he can use it to follow the development of an organism's early nervous system.
"We want to reconstruct the elemental building plan of animals, tracking each cell from very early development until late stages, so that we know everything that has happened in terms of cell movement and cell division," Keller says. "In particular, we want to understand how the nervous system forms. Ultimately, we would like to collect the developmental history of every cell in the nervous system and link that information to the cell's final function. For this purpose, we need to be able to follow individual cells on a fairly large scale and over a long period of time."
It takes more than a week for the nervous system to become functional in an embryonic mouse. Even in a fruit fly, the process takes a day. Following development for that long means imaging tens of thousands of cells at thousands of time points, and that adds up to terabytes of data. "We can get good image data sets, but if you want to reconstruct them, this is something that the human can't really do without help from the computer," Keller says.
Amat, a bioinformatics specialist on Keller's team, and his colleagues have solved that problem with the new computational method that identifies and tracks dividing cells as quickly as their high-speed microscope can capture images. The process is largely automated, but incorporates a manual editing step to improve accuracy for a small percentage of cells that are difficult to track computationally.
Keller's team has been grappling with how to interpret this kind of imaging data since 2010. The problem was challenging not only because of the sheer volume of data his light sheet microscope produced, but also because of the data's complexity. Cells in a developing embryo have different shapes and behaviors and can be densely packed, making it difficult for a computer to identify and track individual cells. Inevitable variations in image quality further complicate the analysis.
Amat led the effort to develop an efficient solution. His first priority was to reduce the complexity of the data. His strategy was to first cluster the voxels (essentially three-dimensional pixels) that make up each image into larger units called supervoxels. Using a supervoxel as the smallest unit reduces an image's complexity a thousand-fold, Keller says.
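To illustrate the idea, the following Python sketch groups the foreground voxels of a 3D image stack into supervoxel-like regions with a standard seeded watershed. This is not the published implementation; the function name, parameters and choice of scikit-image routines are assumptions made here for illustration.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def build_supervoxels(volume, sigma=1.0, min_distance=3):
    """Group foreground voxels of a 3D stack into supervoxel-like regions (illustrative sketch)."""
    # Smooth the raw stack to suppress noise before thresholding.
    smoothed = gaussian(volume.astype(np.float32), sigma=sigma)
    # Separate fluorescent foreground from background with Otsu's threshold.
    foreground = smoothed > threshold_otsu(smoothed)
    # Peaks of the distance transform seed the watershed that splits the foreground.
    distance = ndi.distance_transform_edt(foreground)
    seeds = peak_local_max(distance, min_distance=min_distance, labels=foreground)
    markers = np.zeros(volume.shape, dtype=np.int32)
    markers[tuple(seeds.T)] = np.arange(1, len(seeds) + 1)
    # Each watershed basin becomes one supervoxel; label 0 stays background.
    return watershed(-distance, markers, mask=foreground)
```

Working with a few thousand supervoxels per frame, rather than hundreds of millions of raw voxels, is what makes the later detection and tracking steps tractable at imaging speed.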
Next, the program searches for ellipsoid shapes among groups of connected supervoxels, which it recognizes as cell nuclei. Once a cluster of supervoxels has been identified as a cell nucleus, the computer uses that information to find the nucleus again in subsequent images. High-speed microscopy captures its images quickly enough that a single cell can't migrate very far from frame to frame. "We take advantage of that situation and use the solution from one time point as the starting point for the next point," Keller says.
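A rough way to picture this step, as a simplified stand-in for the paper's actual machinery rather than its code, is to summarize each detected region as an ellipsoid (centroid plus covariance) and match it to the nearest nucleus from the previous frame; all function and parameter names below are invented for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def fit_nuclei(labels):
    """Reduce each labelled supervoxel cluster to an ellipsoid: (centroid, covariance)."""
    nuclei = {}
    for lab in np.unique(labels):
        if lab == 0:
            continue  # skip background
        coords = np.argwhere(labels == lab).astype(float)
        nuclei[lab] = (coords.mean(axis=0), np.cov(coords.T))
    return nuclei

def link_frames(prev_nuclei, curr_nuclei, max_shift=5.0):
    """Match each nucleus to the nearest one in the previous frame.

    Relies on the assumption quoted in the article: imaging is fast enough
    that a nucleus moves only a short distance between consecutive frames.
    """
    prev_ids = list(prev_nuclei)
    tree = cKDTree([prev_nuclei[i][0] for i in prev_ids])
    links = {}
    for cid, (centroid, _) in curr_nuclei.items():
        dist, idx = tree.query(centroid)
        links[cid] = prev_ids[idx] if dist <= max_shift else None  # None = new or uncertain
    return links
```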
"With this fairly fast, simple approach, we can solve easy cases fairly efficiently," Keller says. Those cases make up about 95 percent of the data. "In harder cases, where we might have mistakes, we use heavier machinery."
He explains that in instances where cells are harder to track -- because image quality is poor or cells are crowded, for example -- the computer draws on additional information. "We look at what all the cells in that neighborhood do a little bit into the future and a little bit into the past," Keller explains. Informative patterns usually emerge from that contextual information. The strategy takes more computing power than the initial tactics. "We don't want to do it for all the cells," Keller says. "But we try to crack these hard cases by gathering more information and making better informed decisions."
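One way to sketch this contextual reasoning, assuming tracks are stored as per-time-point centroids (the names and data layout here are illustrative, not the released software), is to borrow the typical motion of confidently tracked neighbors over a short window of past and future frames:

```python
import numpy as np

def contextual_motion(track_positions, neighbor_ids, t, window=2):
    """Estimate the local motion around an ambiguous nucleus at time t.

    Averages the frame-to-frame displacements of confidently tracked
    neighbors a few time points into the past and future, giving a motion
    vector that can be applied to the cell's last reliable position.
    `track_positions[i]` is assumed to map time point -> (z, y, x) centroid.
    """
    displacements = []
    for n in neighbor_ids:
        for dt in range(-window, window):
            a = track_positions[n].get(t + dt)
            b = track_positions[n].get(t + dt + 1)
            if a is not None and b is not None:
                displacements.append(np.subtract(b, a))
    if not displacements:
        return None  # not enough context; leave the case for manual curation
    return np.median(displacements, axis=0)  # robust typical motion of the neighborhood
```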
All of these steps can be carried out as quickly as images are acquired by the microscope, and the result is lineage information for every cell. "You know the path, you know where it is at a certain time point. You know it divided at a certain point, you know the daughter cells, you know what mother cell it came from," Keller says.
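A minimal data structure for the kind of lineage record Keller describes might look like the following; the class and field names are illustrative rather than taken from the released software.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class LineageNode:
    """One cell in a reconstructed lineage tree (illustrative only).

    Captures what the article describes: the cell's path over time,
    the mother cell it came from, and the daughters created on division.
    """
    cell_id: int
    positions: List[Tuple[int, Tuple[float, float, float]]] = field(default_factory=list)  # (t, (z, y, x))
    mother: Optional[int] = None
    daughters: List[int] = field(default_factory=list)
```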
Finally, a human steps in to check the computer's work and fix any mistakes. A computer-generated "confidence score" for every cell at every time point guides the user to the small percentage of data most likely to require a human eye, making high overall accuracy possible without manual examination of each cell.
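The curation step could be sketched as follows, assuming the pipeline emits a confidence score for every cell at every time point; the 5 percent review budget is only an illustrative default, not a figure from the paper.

```python
def flag_for_review(confidence, budget=0.05):
    """Select the lowest-confidence detections for manual proofreading.

    `confidence` maps (cell_id, time_point) -> score in [0, 1]; the
    lowest-scoring fraction (here 5%) is handed to a human editor,
    so most cells never need to be inspected by eye.
    """
    ranked = sorted(confidence.items(), key=lambda kv: kv[1])
    n_review = max(1, int(len(ranked) * budget))
    return [key for key, _ in ranked[:n_review]]
```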
To test the power of the program, Keller's team collected images of the beginnings of the nervous system as it developed in an embryonic fruit fly. They used their method to trace the lineages of 295 neuroblasts (precursors of nerve cells) and discovered that it is possible to predict the future fate and function of many cells based on their early dynamic behavior.
Keller is eager to begin using the method to investigate a variety of questions about early development, and hopes that others will apply the approach to their own questions. To that end, the team took care to ensure that the technique can be used with a variety of data types. In addition to fruit flies, they successfully used the program to analyze images of zebrafish and mice, as well as data collected from a commercial light sheet microscope and a commercial confocal microscope.
Their open-source software can be downloaded for free at http://www.janelia.org/lab/keller-lab/.
Selected videos (Video credit: Philipp Keller Lab, HHMI Janelia Research Campus)
Video 2: https://www.dropbox.com/s/vmr54us4osjfyw5/Supplementary_Video_2%20ProResHQ.mov
Automated segmentation and tracking in SiMView data set of Drosophila embryogenesis (gradient color code)
This video shows automated computational cell lineage reconstruction of the image data. Each circle represents one cell nucleus. The tails of the circles (solid lines) indicate the history of object positions for the past 10 time points. The color scheme was initialized in the first frame using a color gradient from anterior to posterior, using different colors on the dorsal and ventral sides and ensuring continuity in color space at the anterior and posterior ends of the embryo. After this initial color assignment, the color information was propagated in time using the tracking information, thus providing a color-coded single-cell resolution fate map. Some cell nucleus detections correspond to background objects, arising from autofluorescence and limitations in image quality.
Video 20: https://www.dropbox.com/s/zueg4fff1amwilj/Supplementary_Video_20%20ProRes422HQv.2.mov
Automated segmentation and tracking in SiMView data set of zebrafish embryogenesis (gradient color code)
This video shows automated computational cell lineage reconstruction of the image data. Each circle represents one cell nucleus. The tails of the circles (solid lines) indicate the history of object positions for the past 10 time points. The color scheme was initialized in the first frame using a radially-symmetrical color gradient from the animal pole to the periphery of the blastoderm. After this initial color assignment, the color information was propagated in time using the tracking information, thus providing a color-coded single-cell resolution fate map. Some cell nucleus detections correspond to background objects, arising from autofluorescence and limitations in image quality.
Video 25: https://www.dropbox.com/s/46qjosezn9r3stz/Supplementary_Video_25ProResHQ422.mov
Movements and divisions of neural precursors in the early Drosophila embryonic nervous system (with SiMView data)
This video shows ventral and lateral maximum-intensity projections of the SiMView time-lapse recording of the nuclei-labeled Drosophila embryo, superimposed with green spheres marking the location, movements and divisions of neural precursors from the blastoderm stage up to 5 h after egg laying. The tails indicate the history of cell positions for the past ten time points, using a color code that gradually transitions from purple to white as a function of time.
Video 28: https://www.dropbox.com/s/q21xrxo624txgd4/Supplementary_Video_28-ProResHQ422.mov
Cell lineage reconstruction of early Drosophila embryonic nervous system development
This video shows a rotating view of the neural precursor cell tracks obtained from the cell lineage reconstruction of early Drosophila embryonic nervous system development. The tracks are represented by solid lines, using a color code that indicates time (purple to white: from the blastoderm stage to the end point of the reconstruction at 5 h after egg laying). Cell locations at the end point of the reconstruction are marked by green spheres. Anterior is to the left, posterior to the right.