Much of the fun of AGBT is the advances in genome technology
If you have attended Advances in Genome Biology and Technology in the past, you know there is sometimes amazing technology presented in a plenary session. Who can forget Stephen Turner’s electrifying talk at the 2009 meeting, complete with fireworks on the beach? <https://www.genomeweb.com/blog/fireworks-beach-guess-theyre-officially-out-stealth-mode> (And if you would like a bit of a nostalgic tour, copies of this article from the New York Times were passed out then.) I remember my jaw dropping more than a few times during that presentation.
But other times amazing technology is represented by a single person (whom you may or may not happen to meet among the approximately seven hundred attendees). I remember one year (perhaps 2014, if memory serves) sitting at a place where, um, adult beverages are usually served, and a relatively young Japanese gentleman sat next to me. He was from a new company with a poster on using electron microscopy for DNA sequencing, and he told me several interesting things about their technical approach and its unique advantages. (Here’s some background on the technology for those interested.) When I took a look at the poster, the data showed 10 or 12 bases being read, which was a start.

These sequencing technologies are very difficult to develop, which should go without saying; yet many consumers of a technology, who may know a considerable amount about the principles at work in a given sequencing instrument, will not know the major challenges that have to be overcome to bring that technology to market. These instruments (as little as $1,000 in the case of Oxford Nanopore’s MinION, up to $50,000 for the new MiniSeq or the older Ion Torrent Personal Genome Machine) have to combine biochemistry (sometimes with complex engineered fluor-modified nucleotides and accompanying engineered polymerases) with fluidics (those reagents have to do their business with the template somehow), optics (the fluorescence has to be picked up off the template), and electronics (that light then has to be turned into a digital signal), all inside a user-friendly instrument.
So instead of highlighting talks from the second day of Advances in Genome Biology and Technology here in Orlando, Florida, I’ll take the liberty of highlighting a few advances in technologies that caught my eye.
The poster session had a concurrent software demonstration session, so one had to pick and choose among the 20 or so tables set up by vendors such as DNAnexus, Biomatters, Golden Helix, QIAGEN’s Ingenuity, and many other firms focusing on different aspects of data analysis. Now the software business is not for the faint of heart – my friend Geoffrey Routh maintains a list of Genomics Service Providers that currently numbers a whopping 145 here – but these firms have something to offer the genomics market, and they are making progress in saving researchers’ time.
One firm, SolveBio, out of New York, has done nice work compiling database resources, tracking version changes of GRCh builds, and generally packaging database and software-engineering API resources in an easy-to-use format. I am no expert in how firms in this very competitive software market can cleanly differentiate themselves, but even in the brief demonstration it wasn’t hard for me to see the value. At the company I work for we’re just getting started with clinical informatics internally, and a friend had to spend hours finding just the right tool to convert a VCF file from one GRCh build to another. They even have an on-demand custom annotation service, which may well be of interest to institutions or laboratories with more urgent data interpretation needs than their existing resources allow.
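For those curious why converting coordinates between genome builds is such a chore: under the hood, liftover tools walk a set of aligned blocks (UCSC "chain" files) that map intervals of one build onto another, and any position falling in a gap between blocks simply has no equivalent. A minimal sketch of the idea, with invented block coordinates (real chain files are far larger and per-chromosome):

```python
# Toy chain for one chromosome: (src_start, src_end, dst_start) aligned
# blocks. The coordinates are invented for illustration only -- real
# liftover uses UCSC chain files covering the whole genome.
CHAIN_CHR1 = [
    (0,       500_000,   0),       # block maps with no offset
    (500_000, 900_000, 510_000),   # a 10 kb insertion shifts everything
]

def lift(pos, chain):
    """Map a 0-based source coordinate to the target build, or return
    None if it falls in a gap between aligned blocks."""
    for src_start, src_end, dst_start in chain:
        if src_start <= pos < src_end:
            return dst_start + (pos - src_start)
    return None

print(lift(600_000, CHAIN_CHR1))  # 610_000: shifted by the insertion
print(lift(950_000, CHAIN_CHR1))  # None: no aligned block covers it
```

The hours my friend spent were mostly in handling the positions that return `None`, plus re-checking reference alleles against the new build – the part no toy sketch captures.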
Bobby Sebra (Mount Sinai Icahn School of Medicine) certainly ranks high among the people I’ve met at AGBT who are wonderful at straddling that difficult line between advanced technology and applying it to genomic analysis. (For those interested, I interviewed Bobby at last year’s AGBT, available here.) He presented a poster on a new technology from a startup coming out of ‘stealth mode’ called Berkeley Lights, which uses a principle called Opto-ElectroPositioning to manipulate single cells. Now this is a crowded area for manipulating Circulating Tumor Cells (CTCs) – a Cynvenio instrument is located in the Thermo Fisher Scientific suite, Silicon Biosciences has a suite of its own at AGBT for the first time this year, and others such as ClearBridge Biomedics and Fluxion Biosciences also have instruments that I’ve seen at other conferences.
The poster showed the BLI OptoSelect™ technology, which uses machine vision and automated path planning to manipulate as many as 1,000 individual cells using light. (These individual cells had a white square around them, similar to what I’ve put together in the screenshot from their website.) With the ability to ‘pen’ and ‘unpen’ individual cells, the cells are placed into a 384- or 96-well plate for downstream sequencing; the Sinai team used the Ion Torrent Cancer Hotspot Panel v2 with Ion Torrent S5 XL sequencing. He showed data from 1, 10 and 100 cells from a bulk endometrial tumor sample, and how the sensitivity of detection for certain genes changed with the number of cells analyzed (bulk, 100, 10 or 1 cell), reflecting the inherent heterogeneity of the sample. He also had a lot of interesting Oncomine results from two individual cases.
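Why does sensitivity change with the number of cells analyzed? Simple sampling statistics: if a variant subclone makes up a fraction f of a heterogeneous tumor, the chance that a random draw of n cells includes at least one variant cell is 1 − (1 − f)^n. A quick sketch, using an invented subclone frequency of 5% (not a figure from the poster):

```python
# If a variant subclone is present at fraction f of the tumor, the
# probability that a random sample of n cells contains at least one
# variant cell is 1 - (1 - f)**n. f = 0.05 is illustrative only.

def p_detect(f, n):
    """Probability that >= 1 of n sampled cells carries the variant."""
    return 1 - (1 - f) ** n

for n in (1, 10, 100):
    print(n, round(p_detect(0.05, n), 3))
# 1 cell:   0.05  -- you will usually miss the subclone entirely
# 10 cells: 0.401
# 100 cells: 0.994 -- but its signal is now diluted across 100 genomes
```

This is the tension single-cell workflows live with: few cells means you may miss rare subclones; many cells (or bulk) means rare variants get averaged into the background.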
He showed 1:100,000 cell selection sensitivity, along with the ability to handle inputs ranging from as few as a few thousand cells up to several million. When I asked him how the platform differentiates itself, it came down to two things: the cells are handled so gently that they remain alive, which opens up a lot of potential (culturing, secretion and phenotype testing on chip), and the number of cells isolated to single-cell purity is a lot higher than the alternatives.
Also coming later this year: the system will go all the way from cell selection to sequence-ready library prep output.
Another company, this one founded by Joel McComb, is offering a new library preparation methodology called TempO-Seq for measuring (as its first product) targeted RNA expression, using the NGS platform as a readout device rather than a measuring one. The concept is to count tags instead of going through reverse transcription (with all of its variability, which, if you look into it, you do not want to dwell on too much), then fragmentation, library preparation, sequencing, and mapping. By simply binning the tags you can count them and associate each tag with a given sequence.
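The tag-binning idea above is computationally trivial, which is part of the appeal: no alignment, just a lookup and a count. A minimal sketch, assuming each read carries a probe tag with a 1:1 tag-to-gene mapping (the tags, genes and tag length here are invented for illustration, not TempO-Seq’s actual design):

```python
from collections import Counter

# Hypothetical 1:1 mapping from probe tag to gene -- invented values.
TAG_TO_GENE = {"ACGTAC": "TP53", "GGATCC": "EGFR", "TTAACC": "KRAS"}

def count_expression(reads):
    """Bin reads by their tag (first 6 bases in this sketch) and report
    counts per gene; reads with unrecognized tags are simply dropped."""
    tags = Counter(read[:6] for read in reads)
    return {gene: tags[tag] for tag, gene in TAG_TO_GENE.items()}

reads = ["ACGTACTTTT", "ACGTACAAAA", "GGATCCTTTT", "NNNNNNTTTT"]
print(count_expression(reads))  # {'TP53': 2, 'EGFR': 1, 'KRAS': 0}
```

Contrast that one dictionary lookup per read with aligning millions of reads to a reference transcriptome, and the cost-per-sample argument in the next paragraph makes intuitive sense.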
Thus the overall cost per sample is driven down (how does $5/sample for the sequencing sound?), and they showed nice MAQC-equivalence at 2,700 genes (as a proof of principle on the way to some 21,000), sensitivity down to 10 pg total RNA (single-cell level), and the ability to work with compromised FFPE-derived samples. Very much a company I’m going to keep an eye on; they also have the potential to do DNA-based variation assays (the probe recognizing DNA instead of RNA, but tagging nonetheless) or even protein assays (reminiscent of the TaqMan Protein Assays).