Tuesday, August 11, 2015

Moving Towards Gene Editing


For decades, geneticists have theorized that targeted gene editing would provide an elegant method to repair a defective gene, treat or prevent disease, or allow selection of genetic traits in offspring. Nevertheless, efforts to incorporate targeted genetic material into higher organisms have remained risky at best. Several approaches are currently being tested, including swapping an abnormal gene for a healthy one, repairing a segment known to contain the deleterious error, and altering control of gene expression by editing a gene's transcriptional regulatory elements. All of these approaches remain experimental.

One example of gene therapy involves a mutated form of the gene RPE65, which is linked to an inherited disorder causing vision loss. In this case, a harmless virus was engineered as a vector to deliver the healthy gene [1]. While these studies have shown some success, more importantly they point the way for future work.

Gene editing emerged in the early 2000s with the discovery of zinc finger nucleases, followed by engineered synthetic nucleases called TALENs. More recently, a new technology has been adopted with high expectations of revolutionizing genome editing: CRISPR-Cas9-based genome editing. Clustered regularly interspaced short palindromic repeats (CRISPRs) and CRISPR-associated (Cas) proteins were first discovered in bacteria and other prokaryotic organisms, where they form an immune system that provides a form of acquired immunity against foreign genetic elements such as plasmids and phages. The CRISPR/Cas system has been adapted for gene editing in a wide variety of species by delivering the Cas9 protein and appropriate guide RNAs into the target cell, allowing the host genome to be cut at virtually any desired target sequence.
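To make the targeting step concrete, here is a minimal Python sketch that scans a DNA sequence for candidate Cas9 target sites: a 20-nucleotide protospacer immediately followed by the NGG PAM required by the commonly used S. pyogenes Cas9. The example sequence and function name are illustrative only; real guide design also screens candidates against the rest of the genome for off-target matches.

    import re

    def find_cas9_sites(seq, protospacer_len=20):
        """Return (position, protospacer, PAM) for candidate SpCas9 target
        sites: a protospacer immediately followed by an NGG PAM."""
        sites = []
        # Lookahead so that overlapping candidate sites are all reported.
        pattern = r"(?=([ACGT]{%d})([ACGT]GG))" % protospacer_len
        for m in re.finditer(pattern, seq.upper()):
            sites.append((m.start(), m.group(1), m.group(2)))
        return sites

    # Hypothetical demo sequence, not a real genomic target.
    demo = "ATGCGTACCGTTAGCTAGGACTTCAGGATCCGGTACGTTAGGCGT"
    for pos, protospacer, pam in find_cas9_sites(demo):
        print(f"site at {pos}: {protospacer} | PAM {pam}")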

Crystal Structure of Cas9 in Complex with Guide RNA and Target DNA:  http://dx.doi.org/10.1016/j.cell.2014.02.001
In work currently in press, researchers report functional correction of the Factor VIII gene chromosomal inversions found in hemophilia A patients by using the CRISPR-Cas9 system and induced pluripotent stem cells (iPSCs). As proof of concept, the corrected cells rescued factor VIII deficiency in an otherwise lethal mouse model [2]. The hope is that this technology will be transferable to human disease targets in the near future.

1. Jacobson SG, et al. "Improvement and decline in vision with gene therapy in childhood blindness." N Engl J Med. 2015. http://dx.doi.org/10.1056/NEJMoa1412965

2. Park CY, et al. "Functional Correction of Large Factor VIII Gene Chromosomal Inversions in Hemophilia A Patient-Derived iPSCs Using CRISPR-Cas9." Cell Stem Cell. 2015. http://dx.doi.org/10.1016/j.stem.2015.07.001



By: BioTek Instruments, Peter J. Brescia Jr., MSc, MBA  

Tuesday, August 4, 2015

Microscopy, Astronomy and Vincent Van Gogh

Anyone in biomedical research knows that not all experiments work exactly as planned. This was the case a few weeks ago when I seeded GFP-expressing NIH3T3 cells into the wells of a microplate. Maybe it was the recent images of Pluto transmitted from NASA's New Horizons space probe, or maybe it was the Van Gogh print of "The Starry Night" that used to hang in my daughter’s bedroom, but when I saw the completed montage image of the NIH3T3-GFP cells, fixed and stained with DAPI, my first thought was that it looked like stars and constellations.
Figure 1. DAPI-stained NIH3T3 cells expressing GFP. Cells were fixed with 4% formaldehyde and then stained with DAPI nuclear stain. A total of 225 images in a 15 x 15 montage array, acquired with a 20X objective, were stitched into a single image. Scale bar represents 1000 µm.

Scientists examine things in particular ways, using a combination of very sophisticated equipment, everyday instruments, and many unlikely tools. Some phenomena that scientists want to observe are so tiny that a microscope is required; other things are so far away that a powerful telescope must be used to see them. What is fascinating to me is that, despite the vast differences in scale, the resulting images appear very similar.
Figure 2. Ultraviolet Coverage of the Hubble Space Telescope Ultra Deep Field. The Hubble Ultra Deep Field 2014 image is a composite of separate exposures taken from 2003 to 2012 with Hubble's Advanced Camera for Surveys and Wide Field Camera 3.

Despite the differences in true object size, astronomy and microscopy are very similar. Both fields of research depend on visual information about targets that are often inaccessible to the human eye. Astronomy relies on telescopes to provide information about extraterrestrial objects, while microscopy uses microscopes to visualize cellular objects at much closer range. Even though the objects of astronomy are tremendously large, their distance from us renders them microscopic to the naked eye. At the most basic level, both systems use much the same magnifier: essentially a tube with focusing lenses, but with markedly different focal lengths.

BioTek has a number of imaging products that have produced some remarkable fluorescence and brightfield digital micrographs. Coincidentally, one of my colleagues at BioTek China created a 2015 calendar that shows the similarity between images generated by BioTek Cytation readers and those taken by the Hubble Telescope.

The GFP-expressing NIH3T3 cells that I plated were unevenly distributed and clumped, making them unusable for the experiment I had planned, but they certainly had the appearance of stars in the night sky. I have a pretty good idea as to why my cells are arranged the way they are, but astronomers puzzled over the same question about stars for centuries.


By: BioTek Instruments, Paul Held, PhD., Laboratory Manager

Tuesday, July 28, 2015

Tumor Invasion Assays using 3D Spheroids

Metastasis is the main cause of death in cancer patients and one of the most complex biological processes in human disease. Target-based approaches that center on cancer gene mutations are fraught with difficulty, as the development of cancer involves a series of mutations, some of which drive the disease (see Figure below), while others are merely passengers in the process of carcinogenesis.

Carcinogenesis involves exposure to a carcinogen that provokes an initial genetic mutation.  For cancer to develop and metastasize, numerous other genetic changes occur.
Phenotypic approaches involving 3D cell culture methods that model tumor invasion may provide a more successful route to new therapies. By combining 3D spheroidal tumor structures built from co-cultured cells, a suitable invasion matrix, and image-based monitoring with quantification of invadopodia, metastasis can be modeled for preclinical drug discovery studies using the Cytation 5 Cell Imaging Multi-Mode Reader.
 
A 3D spheroid consisting of a co-culture of MDA-MB-231 cells (a breast cancer cell line) and human fibroblasts invading into the surrounding matrix in a 96-well microplate. Gen5 Image+ software quantifies the extent of invasion by measuring the invadopodia, outlined in gold in the figure.
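Gen5 Image+ performs this measurement automatically; for readers who want to prototype a comparable readout outside the instrument software, the Python sketch below (using scikit-image rather than the Gen5 API, with hypothetical file names) thresholds a spheroid image and reports the area of the invading structure:

    import numpy as np
    from skimage import io, filters, measure, morphology

    def invasion_area(image_path):
        """Rough invasion readout: threshold the image, keep the largest
        object (spheroid body plus invadopodia), and report its area in
        pixels. Comparing areas across time points tracks invasion."""
        img = io.imread(image_path, as_gray=True)
        mask = img > filters.threshold_otsu(img)          # cells vs. matrix
        mask = morphology.remove_small_objects(mask, 64)  # drop debris
        labels = measure.label(mask)
        largest = max(measure.regionprops(labels), key=lambda r: r.area)
        return largest.area

    # area_day0 = invasion_area("spheroid_day0.tif")  # hypothetical files
    # area_day3 = invasion_area("spheroid_day3.tif")
    # print("fold increase in area:", area_day3 / area_day0)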

A full application note can be found here.
 
By: BioTek Instruments, Peter Banks Ph.D., Scientific Director

Tuesday, July 14, 2015

Amazing Medical Discoveries: The “Missing Link”


Medical research produces new and amazing discoveries on an almost daily basis. These discoveries take place across many areas of research, but they typically occur at the cellular and molecular levels. For example, a new protein is discovered, the role of a previously discovered protein is elucidated, or a disease is found to be linked to a genetic element. Very rarely does an anatomical discovery make the news. But in a recent paper in Nature, a link between the lymphatic system and the brain was reported by researchers from Jonathan Kipnis' lab at the University of Virginia. The complete separation of the lymphatic system and the brain has been anatomical dogma for quite some time.

Besides draining excess fluid from tissue, lymphatic vessels carry immune cells throughout the body. For decades, researchers had assumed that the lymphatic system stopped short of the brain. The blood brain barrier (BBB), which prevents exposure of the brain to many compounds (toxins, drugs, bacteria, etc.) and separates it from the rest of the immune system, has long been believed to provide "immune privilege" to the brain, exempting it from normal immune surveillance. While limiting the body's ability to clear pathogens, immune privilege was thought to protect against inflammatory swelling of the brain within the rigid skull.

Old and New Representations of the Lymphatic System. Credit: University of Virginia Health System.

The discovery originated when Dr. Antoine Louveau, a researcher in Kipnis' lab, mounted the membranes covering mouse brains, known as meninges, on a slide. In the dural sinuses, which drain blood from the brain, he noticed linear patterns in the arrangement of immune T-cells that suggested lymphatic vessels. The trick to observing the vessels was to apply fixative to the brain meninges within the skullcap prior to dissection, rather than afterward as one would normally do. The vessels have since been observed in live animals, confirming their function.

These findings begin to explain some poorly understood links between immune function and neurological conditions: gastrointestinal problems in children with autism, changes in immune system function associated with neurological diseases such as multiple sclerosis and Alzheimer's, and autoimmune diseases of the gut, such as Crohn's disease, correlating with psychiatric illness.

The reason I found this discovery intriguing is that I have a number of friends who have been diagnosed with multiple sclerosis (MS). MS is a known example of the immune system attacking the brain, although the reasons are poorly understood. The discovery that lymphatic vessels link the brain to the immune system could transform our understanding of how these attacks occur, and of what could stop them.

With all the advances made over the years in biomedical science, one almost wonders what is left to discover about human anatomy at the macroscopic level. A discovery such as this makes you realize that there is still a great deal we don't understand about the human body. It's not every day that discoveries of this kind are made and dogma must be rewritten. To quote Kevin Lee, PhD, chairman of the UVA Department of Neuroscience, "They'll have to change the textbooks".

References: Louveau A, Smirnov I, Keyes TJ, Eccles JD, Rouhani SJ, Peske JD, Derecki NC, Castle D, Mandell JW, Lee KS, Harris TH, Kipnis J. Structural and functional features of central nervous system lymphatic vessels. Nature. 2015.



By: BioTek Instruments, Paul Held, PhD., Laboratory Manager

Thursday, July 2, 2015

Variable Bandwidth Monochromators: A Useful Tool for Quantifying Fluorescent Probes in Produced Effluent Water


Produced effluent water is one of the primary waste products of the process of separating oil, gas and water, and is typically a mixture of formation and injection process water containing oil, salts, chemicals, solids and trace metals. Stringent environmental regulations require producers to monitor oil content in water streams from oil production and refining. To further reduce the level of crude oil released in produced water streams, more advanced methods are needed to accurately detect minute traces of oil in purified samples, down to parts per million (ppm) or parts per billion (ppb) levels.

Several reference methods have historically been used by the industry to measure oil in produced water, including infrared absorption, gravimetric, and gas chromatography with flame ionization detection (GC-FID) methods. However, none has been universally adopted, and all are subject to limitations. Newer methods that incorporate a fluorescent probe overcome these limitations while also delivering a high level of sensitivity. Fluorescent probes can also be used to monitor other pollutants, provided that the dye molecule has a high partitioning coefficient into the contaminant.


BioTek’s new Synergy™ Neo2 Multi-Mode Reader has a monochromator-based detection system that can scan and record excitation and emission spectra for fluorescent probes used in produced effluent water, such as those shown above for the unknown sample 201. These spectra provide the basis for optimizing wavelength and bandwidth selection using the excitation and emission variable bandwidth monochromators in the Synergy Neo2. With Gen5 software, multiple parameter combinations can be tested in the same experiment, setting excitation and emission values at the determined spectral peaks, or off-peak to reduce potential crosstalk.


The sensitivity of each parameter combination can then be assessed by examining calculated signal-to-noise and limit of detection (LOD) values.
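As a sketch of how these figures of merit are commonly calculated, using standard definitions (signal-to-noise from blank statistics, and LOD as three standard deviations of the blank divided by the calibration slope) rather than anything specific to the application note, with made-up readings:

    import statistics

    def signal_to_noise(sample_rfu, blank_rfu):
        """S/N = (mean signal - mean blank) / standard deviation of blank."""
        return ((statistics.mean(sample_rfu) - statistics.mean(blank_rfu))
                / statistics.stdev(blank_rfu))

    def limit_of_detection(blank_rfu, slope):
        """LOD = 3 * SD(blank) / calibration slope, in concentration units."""
        return 3 * statistics.stdev(blank_rfu) / slope

    # Hypothetical fluorescence readings (RFU) for blank and sample wells:
    blank  = [102, 98, 105, 99, 101, 97]
    sample = [410, 402, 395, 408, 399, 405]
    print("S/N:", round(signal_to_noise(sample, blank), 1))
    # Assumed calibration slope of 150 RFU per ppm:
    print("LOD (ppm):", round(limit_of_detection(blank, slope=150.0), 3))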


The results of the complete set of reads confirm that the flexibility of the variable bandwidth monochromators on the Synergy Neo2 provides ideal parameters for easily detecting trace crude oil levels and other potential pollutants in effluent water streams. We invite you to learn more about this application, as well as the Synergy Neo2, by following the link to read the entire application note.


By: BioTek Instruments, Brad Larson, Principal Scientist

Tuesday, June 23, 2015

Top 10 Tips for Cellular Microscopy: Part 2

6. Proper Focusing

Autofocusing is the most popular method used in automated imaging. Label-free assays require brightfield or phase contrast imaging channels for focusing; because unlabeled cells provide limited contrast against the background, proper illumination of the sample is critical. It can be helpful to first use manual imaging to find the z-height at which samples are properly focused. Adding this value to the vessel definition in the software ensures that the auto-focus process begins at the proper z-height, which can minimize focus time.
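As an illustration of the principle behind contrast-based auto-focus (not BioTek's particular algorithm), the Python sketch below scores a series of z-plane images with a common sharpness metric, the variance of the Laplacian, and picks the z-height that maximizes it:

    import numpy as np
    from scipy import ndimage

    def focus_score(img):
        """Variance of the Laplacian: higher values indicate sharper
        images. A common contrast metric for autofocus routines."""
        return ndimage.laplace(img.astype(float)).var()

    def best_focus(z_stack, z_heights):
        """Return the z-height whose image maximizes the focus score."""
        scores = [focus_score(img) for img in z_stack]
        return z_heights[int(np.argmax(scores))]

    # z_stack: list of 2D numpy arrays captured at z_heights (in µm),
    # e.g. a sweep centered on the z-height from the vessel definition.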

With fluorescence imaging, auto-focusing may be accomplished using the signal from an included fluorescent label. While this improves the ease of focusing in many situations, certain caveats still exist. A fluorophore with a strong signal allows the proper focal plane to be found easily, minimizing time spent focusing on each sample. Conversely, a fluorophore with a weak signal, or a high background signal, lowers contrast within the image and can cause images to be out of focus. The localization of the fluorophore within the cell also needs to be taken into account when using objectives with high numerical aperture and limited depth of field. Use of a nuclear stain will lead to accurate focusing on the nucleus of a cell, but when a second channel images fluorescence from a fluorophore localized to the plasma membrane, that image may be improperly focused, as the two cellular compartments lie in slightly different focal planes. A fixed z-offset can accommodate this difference and easily correct for the disparity.

Finally, fixed-focusing can be used in lieu of auto-focus when there is high confidence that samples lie at the same focal plane across multiple wells or fields of view. Fixed-focusing is also useful when only a cursory view of the state of an experiment or process is needed. By eliminating the auto-focus procedure, imaging is carried out immediately on all samples, in the fastest possible processing time.

7. Establish Optimal Image Acquisition (Exposure) Settings

Appropriate image acquisition settings are critical for obtaining meaningful, quantifiable data and images suitable for publication. These settings typically include excitation light intensity, camera gain and integration time. Qualitative microscopy uses the image acquisition settings that provide the best-looking image. Quantitative digital microscopy adjusts exposure parameters so that as much of the camera's bit depth as possible is used.

When using a fixed exposure for all wells in an experimental imaging step, use positive and negative controls, if possible, to establish exposure settings. This helps avoid pixel saturation while maintaining an exposure that accurately quantifies changes in fluorescence among test samples. Saturated pixels cannot be accurately quantified, causing the dataset to be truncated at the maximum end of the dynamic range. Conversely, parameters set too low, often used to hide “background” cellular fluorescence, truncate the data at the minimum end of the dynamic range and again skew quantitative measurements. Manual imaging can also be used for this purpose, with the final parameters then transferred to the automated imaging step. If a change in fluorescence is expected between test wells, or between sample and control wells, the auto-exposure function should not be used, as the imager will attempt to compensate for high and low signals by adjusting the parameters accordingly. This will normalize the data and eliminate the expected assay window. Auto-exposure is useful for starting the optimization process, when single samples are being analyzed, or when a comparison between actual signal values is not part of the analysis criteria.
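A quick way to vet candidate exposure settings on control wells is to check the image for clipping at either end of the camera's dynamic range. A minimal numpy sketch, assuming a 16-bit camera and an illustrative underexposure cutoff (a generic check, not part of Gen5):

    import numpy as np

    def exposure_report(img, bit_depth=16):
        """Report clipping at both ends of the camera's dynamic range."""
        max_val = 2 ** bit_depth - 1
        return {
            "saturated": float(np.mean(img >= max_val)),             # clipped high
            "underexposed": float(np.mean(img <= 0.01 * max_val)),   # clipped low
            "range_used": float(img.max() / max_val),                # bit depth in use
        }

    # Run on positive and negative control images; aim for essentially no
    # saturated pixels on the positive control while still using most of
    # the camera's bit depth.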

8. Kinetic Imaging

Kinetic imaging to track cellular changes in real time can reveal important nuances that might otherwise be missed in a single endpoint image. An additional benefit of kinetic imaging is the ability to create a movie from the images captured over time, allowing visualization of an expected change in addition to numerical quantification. As with all imaging, however, certain guidelines must be followed to ensure the best possible images and data are obtained.

Fixed exposure is recommended for kinetic experimental procedures. Most likely there will be a desire to quantify changes in the sample over the incubation period, and fixed exposure prevents data normalization over the kinetic time period. If a reduction in signal is expected, use a positive control for parameter optimization; if an increase is expected, use a negative control. Take care, however, not to choose a setting so low that the expected change in signal will not fall within the linear range of the assay. Fluorophores should also be stable, with low susceptibility to photobleaching over the entire length of the imaging procedure.

9. Pixel Shift

Pixel shift occurs when a filter in the imaging path diverts the light rays, resulting in a shift of the image detected on a high-resolution CCD camera. Typically this is caused by emission filters of non-uniform thickness, referred to as filter wedge. The shift becomes problematic when two or more images of the same object are acquired using different filter sets and then overlaid in order to simultaneously view fluorescence from multiple fluorophores. Images produced by different fluorophores will not be accurately correlated or combined, because each image is shifted differently according to the wedge angle of its filter set.

Because even the most expensive objectives and filters have some aberration, calibration routines are used to correct for this phenomenon. With automated digital microscopes, this is often done during the LED and objective calibration steps. Using brightfield illumination as a reference, each LED and objective combination is tested for pixel shift by determining the center of an aperture. The calculated pixel shift for each is then used to offset images so that overlays are in their proper location. Individual colors in multi-color images can also be repositioned after imaging using a channel shift tool available in some digital microscopy software. For example, a live specimen moving between separate fluorescent color imaging steps can result in slight misalignment of the colors in overlaid images: because color channels are imaged in sequence, there is a short temporal interval between images that can capture specimen movement. The channel shift tool allows the researcher to reposition the channel to correct for the movement.
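As an illustration of the idea behind such a channel shift tool (the offsets would come from instrument calibration or the researcher's inspection; the code below is a generic scipy sketch, not vendor software):

    import numpy as np
    from scipy import ndimage

    def shift_channel(channel_img, dy, dx):
        """Translate one color channel by (dy, dx) pixels so it overlays
        correctly with the reference channel."""
        return ndimage.shift(channel_img, shift=(dy, dx), order=1,
                             mode="nearest")

    def overlay(dapi, gfp, gfp_offset=(0.0, 0.0)):
        """Combine two aligned channels into an RGB composite
        (GFP shown in green, DAPI in blue)."""
        gfp_aligned = shift_channel(gfp, *gfp_offset)
        rgb = np.zeros(dapi.shape + (3,), dtype=float)
        rgb[..., 1] = gfp_aligned / max(float(gfp_aligned.max()), 1.0)
        rgb[..., 2] = dapi / max(float(dapi.max()), 1.0)
        return rgb

    # gfp_offset would be the measured pixel shift for the GFP filter set,
    # e.g. (0.5, -1.2), determined during LED/objective calibration.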

10. Establish Favorable Environmental Conditions

Long-term kinetic live cell imaging typically requires some control of environmental conditions to maintain sample viability. Temperature is a main consideration: while most mammalian cells grow at 37 °C, bacterial or yeast cells may require higher or lower temperatures. Manual pre-setting, or inclusion of a temperature control step in the procedure, can ensure that proper temperatures are maintained while imaging is performed. Atmospheric control can also be critical. For example, maintenance of 5% CO2/95% air within the imaging chamber provides an appropriate atmosphere for mammalian cells. Together, the correct temperature and atmospheric conditions allow live cell imaging to take place without sacrificing sample integrity over long kinetic reads.

By: BioTek Instruments

Tuesday, June 16, 2015

Top 10 Tips for Cellular Microscopy: Part 1

1. Sample Preparation

Sample preparation is critical to ensure high quality images. Cells should be at an adequate density when harvested from the tissue culture flask, and removed with an appropriate dissociation solution to retain essential cellular function. Use aseptic technique to prevent contamination during the experiment and avoid unnecessary introduction of foreign particles or cellular debris that can negatively impact image quality.

For live cell assays, fluorescent probes need to be cell membrane permeable to assess structure and function within the cell. Optimize concentrations and incubation times prior to performing the actual assay to maintain cell viability while still ensuring sufficient fluorescence. Be aware of potential background fluorescence; it may be necessary to incorporate wash steps before imaging. Together, these measures help provide images of live cells with good signal-to-background ratios.

For fixed cell assays such as immunofluorescence, always use a proven fixing/permeabilizing/staining protocol, including the addition of a blocking agent to prevent non-specific binding and optimization of antibody concentrations when necessary.

2. Sample Vessel Considerations

The bottom thickness of sample vessels used for microscopy is important, since inverted microscopes view samples through this thickness. Different vessels have different bottom thicknesses. Some common vessel bottom thicknesses include:
  • microscope slide cover glass: 0.17 mm
  • common plastic microplates: 0.5 mm
  • low density microplates: 1.0 mm
Newer microplates, developed specifically for imaging, have bottom thicknesses closer to that of a cover glass. When imaging with lower numerical aperture objectives (< 0.7), vessel bottom thickness is not a great concern. However, when air objectives with high numerical apertures (0.8 or greater) are used, variations of just a few micrometers in vessel thickness can cause image degradation, and aberrations worsen as vessel thickness increases. A correction collar is used to compensate for these errors. The collar allows adjustment of the position of the central lens group within the objective and ensures proper image focusing no matter what type of vessel is used.

3. Cell Number Optimization

Typically, cell-based assays are more robust when more cells are present. Cellular imaging is a bit trickier: plating too many cells can cause them to grow on top of each other, confounding proper segmentation for cell counting or other assessments, while plating too few cells can yield statistically insignificant results and skew cell sub-population analysis. Post-plating and post-treatment incubation times should also be factored into cell number determination, as cells can continue to propagate during the preparation and execution of the experiment.

4. Cell Type

A wide variety of cell types can be used for cellular imaging, including immortalized cell lines, primary cells, and stem cells. Most cell types used in microscopy applications are adherent in nature: with the right treatment, the cells will adhere and form a two-dimensional (2D) layer across the bottom of the well. 2D is the most straightforward format for imaging, as the cells lie in a single plane that can be found using the auto-focus capability of the imager.

Suspension cells, such as erythrocytes and leukocytes, can also be imaged. These cell types lack the ability to adhere to a surface and require additional manipulation to restrict them to a single focal plane at the surface of the vessel used for microscopy. Suspension cells can be transferred onto a microscope slide with a coverslip, or into a hemocytometer with cover glass, which restricts the cells in the axial direction and provides better image clarity. With microplates, a centrifugation step can also be used to bring cells to the bottom of the well.

More complicated biology, such as tissues and three-dimensional (3D) cellular structures, has also been incorporated into image-based experimental processes. The difference in size and shape compared to individual 2D-plated cells makes more advanced imaging procedures necessary. Tissue samples can often be much larger than the field of view of the microscope objective, particularly when high resolution is desired with high numerical aperture objectives. An image montage procedure captures numerous images across the entire tissue, which are then stitched together into a single composite image for analysis. 3D cellular structures contain hundreds to thousands of cells extending not only in the x- and y-axes, but also in the z-axis. Capturing a z-stacked set of images across multiple z-planes, coupled with z-projection image processing, yields a composite image with better overall focus than any one of the individual z-stack images. This z-projected composite image can increase the accuracy of any subsequent analysis.
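One common way such a composite is formed is a maximum-intensity projection; below is a minimal numpy sketch, assuming the stack is already aligned:

    import numpy as np

    def max_intensity_projection(z_stack):
        """Collapse a (z, y, x) image stack into a single 2D image by
        keeping, for each pixel, the brightest value across all z-planes.
        Structures in focus at different depths all appear in the result."""
        return np.max(np.asarray(z_stack), axis=0)

    # stack = np.stack([plane0, plane1, plane2])  # images at successive z-heights
    # composite = max_intensity_projection(stack)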

5. Fluorophores and Imaging Filter Sets

It’s essential to use the best fluorophore for successful imaging. The excitation and emission spectra of the fluorescent probe or protein should be matched with the available LED light sources, excitation and emission filters, and dichroic mirrors to assure satisfactory fluorescent signal. The fluorophore's Stokes shift is an important variable to consider, as a narrow Stokes shift can lead to excessive background fluorescence and poor signal to background. Additional optimization is necessary if multiple fluorophores are used together in a multiplexed format: molecular spectra tend to be broad, and overlap in both excitation and emission can occur, resulting in bleed-through of one fluorophore into the fluorescent channel of another. This is particularly important should both fluorophores be colocalized in the same area of a cell.
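To see how spectral overlap translates into bleed-through, the sketch below models two emission spectra as simple Gaussians and integrates each over a hypothetical 525/50 bandpass window; the peak positions, widths and filter band are illustrative numbers, not measured spectra:

    import numpy as np

    def toy_spectrum(wl, peak_nm, width_nm):
        """Emission spectrum modeled as a Gaussian (illustration only)."""
        return np.exp(-0.5 * ((wl - peak_nm) / width_nm) ** 2)

    wl = np.arange(400, 701)                             # wavelengths in nm
    green = toy_spectrum(wl, peak_nm=510, width_nm=20)   # GFP-like dye
    red   = toy_spectrum(wl, peak_nm=580, width_nm=25)   # RFP-like dye

    # Hypothetical "green" emission filter: 525/50 bandpass (500-550 nm).
    band = (wl >= 500) & (wl <= 550)
    signal = green[band].sum()
    bleed  = red[band].sum()
    print(f"red dye bleed-through into green channel: {100 * bleed / signal:.1f}%")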

By: BioTek Instruments