News Article | January 15, 2016
The tunnel that allows the L train to shuttle European tourists and junior advertising executives between Williamsburg, Brooklyn and Manhattan will soon be shut down for repairs, according to recent reports. The Canarsie Tube was flooded with corrosive saltwater during Hurricane Sandy, and fixing the damage could take anywhere between one and four years, depending on which plan the MTA decides to undertake. This will, in effect, isolate much of Williamsburg from an easily accessible subway route into Manhattan.

For those unaware of why this matters, Williamsburg, according to Wikipedia, “is an influential hub of contemporary music like indie rock, it is attributed to be the place of origin of electroclash, and has a large local hipster culture, a momentous art community and vibrant nightlife.”

The MTA is reportedly considering two major options: shut down the entire tunnel, work nonstop on repairs, and finish in just over a year; or, because there are really two separate tunnels—one covering tracks going each direction—reduce traffic to half-volume, repair the tunnels one at a time, and finish in three to four years.

Either way, Williamsburg will be left without, gasp, an obvious, super-convenient transit path into Manhattan (much like the rest of the city!) for a significant chunk of time, a prospect to which people reacted totally reasonably. Even if one tunnel were kept open, the delays would make the commute across the river a crowded logistical nightmare; the train is already running at maximum throughput as it is.

It really is a fascinating engineering problem—350,000 riders travel through the Canarsie Tunnel every day. So I asked an urban planner and transit expert to parse out what this will mean for the future of the birthplace of electroclash.
(Full disclosure: I recently lived in Williamsburg, until I felt the blow of an ill wind on the nape of my neck and hastily moved to LA, just in time to escape the L-trainpocalypse.)

Adam Davidson is a former MTA budget analyst; he holds a Master’s in City and Regional Planning from the University of Pennsylvania and is currently researching at CUNY how GPS and smartphone data are affecting the urban environment.

“It’s just one year,” he begins by emphasizing. “In that year, it will suck for a lot of people. I’m sure rents won’t be going up that year, or they might even go down a little bit. But when the L train comes back, it will be better than ever. It’s going to have more throughput. One thing about the L train is that it’s awesome except when it isn’t. It works really well, but there’s no alternative when it shuts down. So that year will suck. Or the three years will suck. It’s going to dampen an area that’s just been hot.”

“I just don’t know how it will handle the morning commute,” Davidson tells me. “Right now it’s at the maximum throughput. To me, the three-year option sounds worse in some ways. I don’t think they’d be able to meet half the demand. Or even a quarter of the demand.”

Instead, Davidson thinks, the Department of Transportation may get creative. “It really gives the DOT the option to rethink the surface transit options.” He thinks they may look to expand Select Bus Service, and to more fully adopt the Bus Rapid Transit model that has risen in popularity in cities around the world: buses that get dedicated lanes, are exempt from certain traffic stops, and move faster than other city traffic. Davidson says that if the L tunnel shuts down, we can expect to see more BRT-like solutions, and perhaps a move closer to true bus rapid transit.
“During Sandy, they basically closed down sections of the bridges to all but buses—on the Manhattan Bridge, one deck was exclusively for buses, and something similar happened on the Williamsburg Bridge.”

“What we call the SBS is just the bus in Europe,” he says. “Payment is just like the subway. It’s a little more cumbersome in New York. It’s not as quick as going through the turnstile. They have a system where they check the fare payment, but the driver doesn’t deal with that. You have a card that you flash to an inspector if they’re checking.”

“Not that this can totally solve the 350,000-person demand. But I think that you can re-envision ways to get buses across the Williamsburg Bridge or the Queens-Midtown Tunnel.”

“I think they’re going to have to consider this BRT concept, and getting more surface transit than they’ve been offering. The city will have a lot more latitude in how they approach the situation, so they might be able to get more acceptance of the SBS, and say ‘we can find a way to dedicate some road space so we can get bus after bus after bus over the Williamsburg Bridge.’”

“People are going to reconsider the trips that they’re making,” Davidson says. “The work trip, which is the most valued trip, that is where we’re going to see the most creativity in terms of how people figure that out. I would expect people to stay home, or stay in the neighborhood, or hold meetings—Brooklyn is becoming a work destination in and of itself; I’ve seen offices where everyone is commuting to Brooklyn from Manhattan. People will be looking for ways to not go into Manhattan all the time.”

Changes to the neighborhood

“There are going to be some winners, going to be some losers,” Davidson says. “Some places are going to see an opportunity catering to people who find it harder to get out of the neighborhood.
The weekday lunch crowd may be bigger, but the nighttime dinner crowd may be smaller, obviously.”

Davidson says that those who stand to be most affected are small business owners whose month-to-month earnings depend on the influx of riders. “Bedford Avenue might be hurt because it relies more on tourism.”

Real estate, he says, will take a brief hit, but it will likely amount to just a tiny delay in recouping expenses. “The ones with deep pockets are totally going to survive this. And it will be better on the other side. The people you should be concerned for are the more marginal players: the businesses who, with a couple of bad months, could destabilize their finances. Your smaller businesses, your storefronts, who don’t have international financing. The businesses I’m most worried about are those who signed those 5-10 year leases without this being a reality.”

“I expect those big developers have the long term in mind; they have the finance in mind to be able to do that. The Edge [a giant condominium complex on the waterfront] has its own ferry. They’ve got transportation.”

Furthermore, it’s possible that squeezed businesses could be exploited by their landlords. “If they’re thinking they can use this to incentivize someone to leave and get an even more lucrative deal, that’s certainly an option there. I think that the concern in this is those marginal populations. And I think that’s why I was talking about BRT: the city needs to come up with a solution that’s outside their normal operations here.”

“The extent that people feel they can get into and out of Williamsburg, that’s the extent that it’s really going to hurt people. Those people on the edge are going to feel the pain a bit more.”

“It does suggest that the M and the J will become a lot more attractive to people in that area. Also the 7 in Long Island City.
So the people who aren’t going to be harmed are the people who have yet to move there.”

A place to go to complain about it

The city will likely set up a center to deal with complaints, much like it has for its nigh-eternal 2nd Avenue subway project. “Look to what they’re doing on 2nd Avenue: the mitigation they’re willing to take for people disrupted by the building of the 2nd Avenue subway. In some cases, they dealt with potentially lost revenue—I know that the city and the MTA wanted to mitigate the issues they were causing as much as they could, and they have community outreach programs, a center to go and talk about the project. If the project was hitting you in an unanticipated way, there are people you can go and speak to about it.”

Don’t hold your breath for compensation for lost artisanal cheese sales, but the city will take the disruption seriously.

But will Williamsburg still be hip?

“I don’t think it’s going to ruin Williamsburg. I think a lot of people are going to be inconvenienced, and some real estate investments are going to have a delay on their returns,” Davidson says. “I think some people will be happy; they’ll see it as a reprieve. Other people will be worried about money that they might be losing. For everyone who’s tired of seeing all these hipsters coming across, I could imagine it would be a calmer year on that front.”

“Maybe one way of talking about it is that the people of Williamsburg are getting a break, culturally, when they’re not getting a break: when they’re not going to work. It’s going to need some real innovative alternatives on the part of the city to mitigate that. Saturday night might be a little bit different.”

“It will be really bad, but the other side of this is that it will be like ripping off a Band-Aid, and the service will be much, much better.”
After more than a decade of work, and at a cost of around US$3 billion, the Human Genome Project yielded the DNA base sequence of a representative human genome in 2001. Now, some 15 years later, technological advances have created the next generation of sequencing machines, which are capable of sequencing many genomes in a day at a cost of around $1,000 each (see 'Technological leap'). “The sequencing is almost the easy part now,” says Cordelia Langford, senior scientific operations manager at the Sanger Institute in Hinxton, UK, and a participant in the original Human Genome Project.

The technology is not perfect: inaccuracies still creep into the sequencing data, and some regions of DNA cannot be sequenced at all. A huge analytical effort is then required to do something useful with the data generated. Nonetheless, the ability of modern technology to achieve so quickly and cheaply what once took years of enormously expensive work is making the dream of precision medicine more plausible by the day.

Genome sequencing reveals the exact order in which nucleotide molecules — each containing one of four bases, adenine (A), cytosine (C), guanine (G) and thymine (T) — are arranged along the strand of DNA. There are about 3 billion bases in a human genome sequence, arranged as complementary pairs that hold matching strands of the DNA double helix together, and they are distributed across 23 pairs of chromosomes.

Patients around the world are already benefiting from genome sequencing, and the cost is falling so sharply that the practice could soon become almost routine. The Sanger Institute, for example, is sequencing the genomes of patients with rare diseases and cancer as part of the 100,000 Genomes Project organized by Genomics England. Some participants already benefit from improved diagnosis and treatment, and researchers are discovering more about the genetic variations that cause disease.

Sequencing is not the only option in genetic analysis, however.
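The scale of that cost drop is worth pausing on. A back-of-envelope sketch, using only the round figures quoted above (US$3 billion then, about $1,000 now, roughly 3 billion bases per genome):

```python
# Back-of-envelope cost per base: the Human Genome Project vs. a modern genome.
GENOME_BASES = 3_000_000_000            # roughly 3 billion bases per human genome

hgp_cost_per_base = 3_000_000_000 / GENOME_BASES   # ~US$1 per base originally
modern_cost_per_base = 1_000 / GENOME_BASES        # a tiny fraction of a cent today

# The ratio works out to a millionfold-scale drop in cost per base.
print(round(hgp_cost_per_base / modern_cost_per_base))
```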
A key part of the Precision Medicine Initiative, run by the US National Institutes of Health, is the more conventional, and arguably less technologically heroic, approach of genotyping. Here, the variants of specific genes that people carry are identified without knowing their full genome sequence. But genotyping requires some idea of what to look for. Sequencing is the only way to uncover everything about the DNA that governs the onset and progression of so many diseases, and to learn how our DNA keeps us healthy.

To sequence a genome, you must first smash it into millions of bits. The original method used by the Human Genome Project, known as Sanger sequencing, made copies of parts of the initial fragments of DNA, each copy a single nucleotide longer than the last. These were then laboriously separated on electrophoresis gels and identified by the radioactively or fluorescently labelled nucleotides at the end of each strand. “Each of the fragments had to be sequenced one, or just a few, at a time,” explains Langford. Sanger sequencing is still in use today, albeit in a more automated form.

The technological advance that allows genomes to be sequenced in a single day is massively parallel sequencing. Billions of fragments can now be sequenced and read simultaneously, Langford says.

The Sanger Institute uses and tests several modern sequencing methods — part of its remit is to assess emerging technologies. Its main workhorse, however, and the method used most often in the 100,000 Genomes Project, is sequencing by synthesis (SBS). This is a finely choreographed cycle in which enzymes build strands of DNA that are complementary to template strands derived from the fragments of the genome being sequenced. Each new strand is built by adding the nucleotides that match the template one by one. At each step, fluorescently labelled nucleotides bring the synthesis process to a temporary halt.
An optical analysis system then scans the strands, which are held on a glass plate about the size of a microscope slide, and detects by way of coloured signals which nucleotides have been added. The chemical groups that block further synthesis can then be cut off and washed away, and another cycle of synthesis begins. In this way, nucleotide by nucleotide, base by base, new strands are synthesized as specified by the template strands, and the sequence in which the bases are added is recorded.

The technique was invented in the 1990s by University of Cambridge spin-out company Solexa, which was acquired in 2007 by Illumina, a company based in San Diego, California, that now claims a roughly 90% share of sequenced bases worldwide. “Developing the technology required the use of genetic engineering to create enzymes that will work with the modified fluorescent nucleotides,” explains Illumina's chief scientist, David Bentley. These reactions are based on the way DNA is copied in living cells. Crucial to the advancement, Bentley says, has been the move away from natural reagents. The adoption of non-natural chemistry makes modern sequencing reactions robust and efficient enough to operate at the speeds necessary to sequence genomes in hours, rather than years, he says.

The next big challenge is one for software: analysing all the sequenced fragments and piecing them back together to form a three-billion-base genome sequence. Langford likens this to completing an incredibly complex jigsaw. But whereas a jigsaw puzzle comes with a complete picture for guidance, all the computer has to help it decide where the fragments should fit is the reference genome, derived from the Human Genome Project. The reference genome is a representative example of a human genome that approximates what the pieces in our individual jigsaws will create, but with slight differences that make us who we are — and these differences are central to the aims of precision medicine.
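The jigsaw step can be caricatured in a few lines of Python. This toy aligner simply slides a read along the reference and keeps the position with the fewest mismatches; real alignment software uses indexed data structures to avoid such exhaustive search, and the sequences here are invented for illustration:

```python
# Place a short read at the position on the reference where it differs in the
# fewest bases (the minimum Hamming distance over all possible placements).
def best_alignment(read, reference):
    """Return (position, mismatches) for the best placement of `read`."""
    best_pos, best_mismatches = 0, len(read) + 1
    for pos in range(len(reference) - len(read) + 1):
        window = reference[pos:pos + len(read)]
        mismatches = sum(a != b for a, b in zip(read, window))
        if mismatches < best_mismatches:
            best_pos, best_mismatches = pos, mismatches
    return best_pos, best_mismatches

reference = "GATTACAGATTACACAT"
print(best_alignment("TACACA", reference))  # -> (10, 0): an exact fit at position 10
```

Multiply this by billions of reads, each placed against a three-billion-base reference, and the scale of the computing problem becomes clear.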
Illumina's SBS is one of several technologies that can read a person's genetic code. Ion Torrent sequencing, for example, is quite similar to SBS: it also reads the sequence piece by piece from a newly synthesized strand of DNA. But rather than using a coloured marker to denote each nucleotide, the signal that distinguishes the bases comes from hydrogen ions that are released into solution when new nucleotides are added. The ions cause a detectable blip in the pH of the solution, and these blips translate into a sequence. The machine washes each nucleotide in turn through the system and monitors which one causes the ion torrent at each stage.

The length of the fragments sequenced, and therefore the complexity of piecing together the jigsaw puzzle afterwards, also varies between techniques. Some of the longest fragments are sequenced by biotech company Pacific Biosciences, based in Menlo Park, California. “Our technology delivers DNA sequence reads about one hundred times longer than the short-read technologies used in most next-generation sequencing,” says Jonas Korlach, the company's chief scientific officer. “This makes understanding and assembling the sequence reads into complete genomes much easier.” Reading longer unbroken sections of DNA also helps to reveal complex long-range structural features, but such long-read technologies are often more expensive than other techniques.

The UK company Oxford Nanopore Technologies uses a unique system in which DNA strands are fed through tiny protein nanopores that have been inserted into a polymer membrane. Rather than requiring any DNA synthesis, the system simply notes the sequence of nucleotides passing through the nanopore, based on specific electrical signals generated by different combinations of bases. This is the technology behind the company's MinION — a portable sequencing device about the same size as a mobile phone.
Clive Brown, chief technology officer at Oxford Nanopore, says that the device weighs less than 100 g; the next-smallest box on the market is 46 kg, he adds. Portability may be most important in remote areas, such as makeshift clinics set up to tackle emerging diseases in developing countries. MinION sequencing, for example, was used to sequence short viral genomes in field hospitals during the 2014 Ebola outbreak in West Africa.

Portability is simple to compare across technologies, but not all comparisons are so straightforward. Cost per sequence, for instance, depends as much on how many genomes a lab is sequencing as it does on the system being used. Accuracy can be difficult to pin down too. Manufacturers talk about accuracy of between 90% and 99.9%, often at the higher end of the range, but that still adds up to a large number of individual reads of a sequence that contain errors (Genome Med. 8, 24; 2016). For this reason, genome sequencing is often repeated multiple times to achieve a truly reliable result. Practitioners talk about sequencing to differing degrees of 'depth', depending on how many times the same DNA is sequenced to increase confidence in the results. It is the accuracy of the final collated analysis that really matters.

Regardless of which sequencing technology is used, researchers and clinicians face an important decision about whether to sequence an entire genome or to take a more targeted approach. They can choose to focus on a specific region of interest in a particular chromosome. They can choose to examine only the genes that actually code for proteins or functional RNA molecules, while ignoring the vast bulk of our DNA — often misleadingly called junk DNA — that may have a crucial regulatory role or have no real function. The exome, for example, is the part of the genome comprising only the stretches of DNA called exons that code for protein molecules. Targeting only these regions is like fishing: it requires bait.
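The fishing metaphor maps neatly onto code: a bait is a short synthetic sequence that base-pairs with (that is, matches the reverse complement of) its target, and anything it binds is kept while the rest is washed away. A toy sketch, with made-up sequences and exact matching standing in for real hybridization:

```python
# Toy model of bait capture: keep only the DNA fragments that would hybridize
# to (i.e. contain the reverse complement of) one of the bait sequences.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq):
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def pull_down(fragments, baits):
    """Return the fragments 'pulled down' by the bait-and-bead capture."""
    targets = {reverse_complement(bait) for bait in baits}
    return [f for f in fragments if any(t in f for t in targets)]

fragments = ["GGATTACAGG", "CCCCCCCC", "TTGATTACA"]
baits = ["TGTAATC"]  # base-pairs with the made-up exon motif GATTACA
print(pull_down(fragments, baits))  # -> ['GGATTACAGG', 'TTGATTACA']
```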
As Langford explains, an exome bait can be a collection of small sections of synthetic DNA that will bind by base-pairing to regions of DNA in a sample that identify exons. Each piece of exome bait has a corresponding magnetic bead attached to it. An external magnet is used to literally pull down the exon DNA, leaving everything else to be discarded. “It is an absolutely beautifully elegant technology,” says Langford. Researchers can either devise their own baits for the specific parts of the genome they are interested in, or they can buy commercial bait kits that target either the whole exome or specific parts.

“Clinical applications will differ as to whether a targeted approach is enough,” says Illumina's Bentley. Looking at whole genomes can detect the unexpected, such as genes that were not suspected of having a role in a disease and whose significance may be missed by a targeted approach. “For some studies exome sequencing may be okay, but it will become increasingly less sufficient as precision medicine builds,” Bentley says. “There will be a moral imperative to try to fully characterize every patient and not miss anything.”

Many large medical centres now have dedicated gene-sequencing centres that offer the whole gamut, from whole-genome sequencing to the precise targeting of specific genes. The Dana-Farber Cancer Institute in Boston, Massachusetts, for example, outlines the choices to patients on its website, saying: “Before starting a project, we will discuss the best sequencing strategies, experimental design, and analysis options with you.” It goes on to explain that whole-genome sequencing can discover most genomic aberrations, but that targeted sequencing is often sufficient for many clinical applications.
It points out that “targeted sequencing has the advantage of sequencing larger sample sizes with lower cost and easier data analysis.”

In a comprehensive review of the current state of gene-sequencing technologies, Sara Goodwin of Cold Spring Harbor Laboratory in New York discusses some of the factors that influence decisions about which technologies and methods to use (Goodwin, S. et al. Nature Rev. Genet. 17, 333–351; 2016). Limiting the scope of an analysis can sometimes be crucial in getting fast results, she says, adding that the limiting factor for speed is often the data analysis, rather than the actual sequencing. Langford agrees, highlighting the need for “highly sophisticated software algorithms to handle the huge stream of data emerging from a modern sequencing machine”.

The coming years are likely to bring more diverse applications of sequencing technology. The basic strategies that are used in DNA sequencing can also, for instance, be used to sequence RNA. Looking at RNA focuses attention on many of the parts of the genome that are most likely to have functional significance — but it may miss regions of DNA that have crucial regulatory roles, even though they are never copied into RNA. The choice of DNA or RNA sequencing, or a combination of the two, will depend on the clinical situation.

Another target, which reveals a limitation of the existing technology, is the pattern of epigenetic chemical modifications carried by some of the four bases of DNA. These modifications, such as the addition of methyl groups to specific bases, can be crucial in controlling the activity of genes — and knowing whether a gene is active can be at least as valuable as knowing which genes are present in a sequence. Bentley says that efforts to add epigenetic analysis to the sequencing toolbox are still in the research phase, but that it would provide an important additional level of information.
Variations of crucial clinical significance may be missed by looking only at the four bases in DNA, he says, without considering the effects of whatever chemical modifications they may carry.

Existing sequencing technologies are already helping many individual patients, but personalized sequencing cannot yet reveal everything that clinicians need to know to fully understand the links between DNA and disease. The technology has come a long way in the past 15 years, but “there are still many mountains ahead,” says Bentley. He seems confident that solutions are within reach, however. The march towards the widespread use of personalized gene sequence analysis is well under way and shows little sign of slowing.
News Article | April 14, 2016
Scientists from the University of Southampton, in partnership with the Japan Advanced Institute of Science and Technology (JAIST), have developed a graphene-based sensor and switch that can detect harmful air pollution in the home with very low power consumption.

The sensor detects individual CO2 molecules and volatile organic compound (VOC) gas molecules found in building and interior materials, furniture and even household goods, which can adversely affect health in modern, well-insulated houses. These harmful chemical gases are present at concentrations of only parts per billion (ppb) and are extremely difficult to detect with current environmental sensor technology, which can only detect concentrations of parts per million (ppm). In recent years, there has been an increase in health problems due to air pollution in personal living spaces, known as sick building syndrome (SBS), along with other conditions such as sick car and sick school syndromes.

The research group, led by Professor Hiroshi Mizuta, who holds a joint appointment at the University of Southampton and JAIST, together with Dr. Jian Sun and Assistant Professor Manoharan Muruganathan of JAIST, developed the sensor to detect individual CO2 molecules adsorbed (adsorption is the binding of molecules from a gas to a surface) onto suspended graphene (a single atomic sheet of carbon atoms arranged in a honeycomb-like hexagonal crystal lattice) one by one, by applying an electric field across the structure. By monitoring the electrical resistance of the graphene beam, the adsorption and desorption (release of a substance from a surface) of individual CO2 molecules were detected as “quantized” changes in resistance: step-wise increases or decreases.
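The read-out principle lends itself to a simple sketch: if each adsorbed molecule shifts the beam's resistance by roughly one fixed step, the number of molecules on the sensor at any instant is just the offset from the clean-graphene baseline divided by the step size. The baseline and step values below are invented for illustration and are not taken from the study:

```python
# Toy model of single-molecule counting: each adsorption raises the graphene
# beam's resistance by ~one fixed quantum and each desorption lowers it, so
# the molecule count is the change from baseline divided by the step size.
STEP_OHMS = 0.5          # assumed resistance change per adsorbed molecule
BASELINE_OHMS = 1000.0   # assumed clean-graphene resistance

def molecules_adsorbed(resistance_trace):
    """Estimate the number of adsorbed molecules at each sample point."""
    return [round((r - BASELINE_OHMS) / STEP_OHMS) for r in resistance_trace]

# Step-wise trace: two molecules adsorb, then one desorbs.
trace = [1000.0, 1000.5, 1000.5, 1001.0, 1000.5]
print(molecules_adsorbed(trace))  # -> [0, 1, 1, 2, 1]
```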
In the study, published today in Science Advances, the journal of the American Association for the Advancement of Science (AAAS), a small volume of CO2 gas (equivalent to a concentration of approximately 30 ppb) was released, and the detection time was only a few minutes.

Mizuta says, “In contrast to the commercially available environmental monitoring tools, this extreme sensing technology enables us to realize significant miniaturization, resulting in weight and cost reduction in addition to the remarkable improvement in the detection limit from the ppm levels to the ppb levels.”

Research group members Dr. Harold Chong from Southampton and Dr. Marek Schmidt and Dr. Jian Sun of JAIST have also recently developed graphene-based switches (published in the March issue of Nanoscale, the journal of the Royal Society of Chemistry) using a uniquely thin film developed at the University of Southampton. The switches, which require remarkably low voltages (below three volts), can be used to power electronic components on demand, greatly improving the battery lifetime of personal electronic devices.

Mizuta and the research group are now aiming to bring the two technologies together to create ultra-low-power environmental sensor systems that can detect single molecules.
News Article | April 18, 2016
A graphene-based sensor that can detect harmful air pollution in the home has been developed by scientists from the University of Southampton and the Japan Advanced Institute of Science and Technology (JAIST). The sensor could prove helpful in combating sick building syndrome (SBS), a condition typically marked by headaches and respiratory problems; polluted air trapped within an enclosed space can give rise to these health issues.

The team of researchers, including Professor Hiroshi Mizuta, Assistant Professor Manoharan Muruganathan and Dr. Jian Sun, developed the graphene-based sensor. Graphene is an extremely thin layer of pure carbon: a single, tightly packed layer of carbon atoms bonded together in a hexagonal honeycomb lattice. It is the thinnest and lightest known material and is highly sensitive when it comes to detecting chemical gases.

Commercially available environmental sensors can typically detect chemical gases only at concentrations of parts per million (ppm); they cannot detect the far lower concentrations of parts per billion (ppb). The new graphene sensor, however, can detect concentrations at the ppb level. This capability enables it to detect individual CO2 molecules, as well as volatile organic compound (VOC) gas molecules, found within buildings and houses. These molecules are even found on household goods and furniture.

"In contrast to the commercially available environmental monitoring tools, this extreme sensing technology enables us to realize significant miniaturization, resulting in weight and cost reduction in addition to the remarkable improvement in the detection limit from the ppm levels to the ppb levels," said Mizuta.

The study has been published in the journal Science Advances.

© 2016 Tech Times, All rights reserved. Do not reproduce without permission.
News Article | April 15, 2016
The sensor detects individual CO2 molecules and volatile organic compound (VOC) gas molecules found in building and interior materials, furniture and even household goods, which can adversely affect health in modern, well-insulated houses. These harmful chemical gases are present at concentrations of only parts per billion (ppb) and are extremely difficult to detect with current environmental sensor technology, which can only detect concentrations of parts per million (ppm). In recent years, there has been an increase in health problems due to air pollution in personal living spaces, known as sick building syndrome (SBS), along with other conditions such as sick car and sick school syndromes.

The research group, led by Professor Hiroshi Mizuta, who holds a joint appointment at the University of Southampton and JAIST, together with Dr Jian Sun and Assistant Professor Manoharan Muruganathan of JAIST, developed the sensor to detect individual CO2 molecules adsorbed (adsorption is the binding of molecules from a gas to a surface) onto suspended graphene (a single atomic sheet of carbon atoms arranged in a honeycomb-like hexagonal crystal lattice) one by one, by applying an electric field across the structure. By monitoring the electrical resistance of the graphene beam, the adsorption and desorption (release of a substance from a surface) of individual CO2 molecules were detected as 'quantised' changes in resistance: step-wise increases or decreases.

In the study, published today in Science Advances, the journal of the American Association for the Advancement of Science (AAAS), a small volume of CO2 gas (equivalent to a concentration of approximately 30 ppb) was released, and the detection time was only a few minutes.
Professor Mizuta said: "In contrast to the commercially available environmental monitoring tools, this extreme sensing technology enables us to realise significant miniaturisation, resulting in weight and cost reduction in addition to the remarkable improvement in the detection limit from the ppm levels to the ppb levels."

Research group members Dr Harold Chong from Southampton, and Dr Marek Schmidt and Dr Jian Sun of JAIST, have also recently developed graphene-based switches (published in the March issue of Nanoscale, the journal of the Royal Society of Chemistry) using a uniquely thin film developed at the University of Southampton. The switches, which require remarkably low voltages (below three volts), can be used to power electronic components on demand, greatly improving the battery lifetime of personal electronic devices.

Professor Mizuta and the research group are now aiming to bring the two technologies together to create ultra-low-power environmental sensor systems that can detect single molecules.