Morgridge Institute for Research

Madison, WI, United States


Cut to 2016, and HTCondor is on to a new collision: helping scientists detect gravitational waves produced 1.3 billion years ago by the collision of two black holes, each roughly 30 times as massive as our sun. From revealing the Higgs boson, among the smallest particles known to science, to detecting the collisions of impossibly massive black holes, the HTCondor High Throughput Computing (HTC) software has proven indispensable to processing the vast and complex data produced by big international science. Computer scientists at the University of Wisconsin-Madison pioneered these distributed high throughput computing technologies over the last three decades.

The announcement in February that scientists from the Laser Interferometer Gravitational-Wave Observatory (LIGO) had confirmed the last major prediction of Albert Einstein's general theory of relativity, that gravitational waves send ripples through space and time, has a rich back story involving HTCondor. Since 2004, HTCondor has been a core part of the data analysis effort of the project, which includes more than 1,000 scientists from 80 institutions across 15 countries. By the numbers, more than 700 LIGO scientists have used HTCondor over the past 12 years to run complex data analysis workflows on computing resources scattered throughout the U.S. and Europe. In the past six months alone, about 50 million core-hours managed by HTCondor supported the data analysis that led to the detection reported in the February 2016 papers.

The path of HTCondor to LIGO was paved by a collaboration that started more than a decade ago between two University of Wisconsin teams: the 29-member LIGO team at UW-Milwaukee and the HTCondor team at UW-Madison. This interdisciplinary collaboration helped transform the computational model of LIGO and advance the state of the art of HTC. The HTCondor team is led by Miron Livny, UW-Madison professor of computer sciences and chief technology officer for the Morgridge Institute for Research and the Wisconsin Institute for Discovery. The HTCondor software implements innovative high-throughput computing technologies that harness the power of tens of thousands of networked computers to run large ensembles of computational tasks.

"What we have is the expertise of two UW System schools coming together to tackle a complex data analysis problem," says Thomas Downes, a UW-Milwaukee senior scientist of physics and LIGO investigator. "The problem was, how do you manage thousands upon thousands of interrelated data analysis jobs in a way that scientists can effectively use? And it was much more easily solved because Milwaukee and Madison are right down the street from each other."

The UW-Milwaukee team began using HTCondor in the early 2000s as part of an NSF Information Technology Research (ITR) project. Its then-lead scientist, Bruce Allen, went on to become director of the Albert Einstein Institute for Gravitational Physics in Hannover, Germany, one of the leading hubs in the LIGO project. Duncan Brown, then a UW-Milwaukee physics Ph.D. candidate, became a professor of physics at Syracuse University, leading that university's LIGO efforts. Allen, Brown and others worked hard at Milwaukee to demonstrate the value of the HTCondor approach to the mission of LIGO, eventually leading to its adoption at other LIGO sites. HTCondor soon became the go-to technology for the core LIGO data groups at UW-Milwaukee, Syracuse University, the Albert Einstein Institute, the California Institute of Technology (Caltech), and Cardiff University in the UK.
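The scale Downes describes is typically expressed through HTCondor's submit description language, in which one short file can queue an entire ensemble of jobs. A minimal sketch follows; the file names and resource requests here are hypothetical illustrations, not LIGO's actual configuration:

    # ensemble.sub -- hypothetical HTCondor submit description
    # analyze_segment.sh is a hypothetical wrapper around one analysis task;
    # $(Process) gives each of the 1,000 jobs its own index, 0..999.
    executable   = analyze_segment.sh
    arguments    = $(Process)
    output       = out/$(Process).txt
    error        = err/$(Process).txt
    log          = ensemble.log
    request_cpus   = 1
    request_memory = 2GB
    # one submission, 1,000 independent jobs
    queue 1000

HTCondor matches each queued job to an available machine and reruns it elsewhere if that machine fails or is reclaimed, which is what lets a small team drive tens of thousands of cores.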
Peter Couvares has a 360-degree view of HTCondor's relationship to LIGO. He worked on the HTCondor team for 10 years at UW-Madison and managed the relationship between LIGO and HTCondor for about five years after joining the LIGO team led by Brown at Syracuse. Today he is a senior scientist at Caltech managing the LIGO data analysis computing team.

Why is the HTCondor software such a boon to big science endeavors like LIGO? "We know it will work—that's the killer feature of HTCondor," says Couvares. It works, he adds, because HTCondor takes seriously the core challenge of distributed computing: in a network of thousands of individual computers, local failures are inevitable. HTCondor bakes that assumption into its core software. "The HTCondor team always asks people to think ahead to the issues that are going to come up in real production environments, and they're good about not letting HTCondor users take shortcuts or make bad assumptions," Couvares adds.

In a project like LIGO, that approach is especially important. The steady stream of data from LIGO detectors is a mix of gravitational information and noise such as seismic activity, wind, temperature and light—all of which helps define and differentiate both good and bad data. "In the absence of noise, this would have been a very easy search," Couvares says. "But the trick is in picking a needle out of a haystack of noise. The biggest trick of all the data analysis in LIGO is to come up with a better signal-to-noise ratio."

Stuart Anderson, LIGO senior staff scientist at Caltech, has been supporting the use of HTCondor within LIGO for more than a decade. The reason HTCondor succeeds is less about technology than it is about the human element, he says. "The HTCondor team provides a level of long-term collaboration and support in cyber-infrastructure that I have not seen anywhere else," he says. "The team has provided the highest quality of technical expertise, communication skills and collaborative problem-solving that I have had the privilege of working with."

Adds Todd Tannenbaum, the current HTCondor technical lead who works closely with Anderson and Couvares: "Our relationship with LIGO is mutually profitable. The improvements made on behalf of our relationship with LIGO have greatly benefited HTCondor and the wider high throughput computing community."

Ewa Deelman, research associate professor and research director at the University of Southern California Information Sciences Institute (ISI), became involved with HTCondor in 2001 as she launched Pegasus, a system that automates the workflow for scientists using systems like HTCondor. Together, Pegasus and HTCondor help lower the technological barriers for scientists. "I think the automation and the reliability provided by Pegasus and HTCondor are key to enabling scientists to focus on their science, rather than the details of the underlying cyber-infrastructure and its inevitable failures," she says.

The future of LIGO is tremendously exciting, and the underpinning high-throughput computing technologies of HTCondor will change with the science and the computing technologies. Most agree that with the initial observation of gravitational waves, a potential tsunami of data awaits from the universe. "The field of gravitational wave astronomy has just begun," says Couvares. "This was a physics and engineering experiment. Now it's astronomy, where we're seeing things. For 20 years, LIGO was trying to find a needle in a haystack.
Now we're going to build a needle detection factory." Adds Livny: "What started 15 years ago as a local Madison-Milwaukee collaboration turned into a computational framework for a new field of astronomy. We are ready and eager to address the evolving HTC needs of this new field. By collaborating with scientists from other international efforts that use technologies ranging from a neutrino detector at the South Pole (IceCube) to a telescope floating in space (Hubble) to collect data about our universe, HTC will continue to support scientific discovery."
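Couvares's two themes, interrelated jobs and inevitable local failures, both surface in HTCondor's DAGMan workflow layer, which Pegasus targets when it plans workflows like LIGO's. A minimal sketch with hypothetical node and file names, not an actual LIGO workflow:

    # workflow.dag -- hypothetical DAGMan workflow description
    # prefilter conditions one stretch of detector data; two template
    # searches run over its output, and combine merges the candidates.
    JOB  prefilter  prefilter.sub
    JOB  search_a   search.sub
    JOB  search_b   search.sub
    JOB  combine    combine.sub
    PARENT prefilter CHILD search_a search_b
    PARENT search_a search_b CHILD combine
    # failures are expected: retry the node, don't abort the workflow
    RETRY  search_a 3
    RETRY  search_b 3

DAGMan releases a node only after its parents succeed, and the per-node retry budget encodes exactly the assumption Couvares describes: individual machines will fail, and the workflow has to absorb that without human intervention.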


News Article | November 28, 2016
Site: www.marketwired.com

MADISON, WI--(Marketwired - November 28, 2016) - OnLume Inc., a medical device company developing novel surgical lighting technology, today announced support from the Small Business Innovation Research (SBIR) program to accelerate work on a fluorescence image-guided surgery (FIGS) system. OnLume received the $300,000 Phase I SBIR grant through the National Cancer Institute (NCI) at the National Institutes of Health (NIH) for the development of imaging and lighting systems for transient lighting in fluorescence image-guided surgery. A portion of the work is being performed in collaboration with researchers led by Kevin Eliceiri at the Morgridge Institute for Research and the University of Wisconsin-Madison. The SBIR program is highly competitive, encouraging small businesses to engage in federal research and development with the potential for commercialization.

FIGS systems use fluorescent dyes for real-time intraoperative imaging of subsurface blood vessels, perfusion, and cancer, in which residual cancerous cells give off light that surgeons may use to guide the removal of additional tissue. But the light-sensitive technology typically requires a darkened operating environment, hampering surgeons and disrupting the treatment process. OnLume's transient lighting technology enables fluorescence-guided methods to work in the operating room without any discernible loss of light for surgeons.

"At OnLume, we believe that fluorescence guidance will play a major role in the future of surgery," says Adam Uselmann, CEO of OnLume and principal investigator of the grant. "Being able to assess tissue function and delineate cancer in real time without interrupting existing surgical workflows will be a boon to surgeons across disciplines. The generous support of the NIH is enabling the research and development necessary to realize our vision."

The field of fluorescence image-guided surgery is rapidly evolving and has broad clinical applications, such as more efficient and efficacious removal of tumors when used in conjunction with cancer-targeting fluorescent drugs. One of the challenges to FIGS is eliminating light contamination from ambient room lighting, which impedes the fluorescence signal emitted from the patient during a surgical procedure. OnLume's technology offers broad compatibility with fluorescent drugs across the fluorescent spectrum while eliminating the contaminating light.


Thelen's mechanical engineering background and interest in biomedical applications catalyzed his current research in knee biomechanics. Much like Thelen, Colin Smith, a mechanical engineering PhD candidate at UW-Madison, gravitated to the field after studying machines as an undergraduate. "What we do is use mechanics—all the things people use when designing cars such as how motors drive movement—and use those ideas to analyze how humans move," Smith says.

Thelen and Smith specifically look at knee surgical procedures and the complex variables that affect the outcome of surgeries. When somebody tears their anterior cruciate ligament (ACL), the surgeon must adjust a multitude of surgical parameters while performing the operation to achieve the best outcome for the patient. Such parameters include how tight to pull the graft before fixing it to the bones, and at what location and angle to attach the replacement ACL. While ACL reconstruction surgeries can enable individuals to return fully to activity, high rates of osteoarthritis, a degenerative joint disease, are seen 10-15 years after the procedure. Thelen and Smith want to come up with a strategic way to tell orthopedic surgeons, with increased certainty, what surgical adjustments may reduce the risk of osteoarthritis for each patient.

In order to do so, their lab uses OpenSim software to investigate joint contact pressures by simulating movements such as walking, running and stair climbing. OpenSim is a freely distributed, open-source software package developed under a biocomputing grant from the National Institutes of Health and used around the world to model the musculoskeletal system. "We knew we needed to perform a lot of simulations. So we had to go beyond running simulations on a single computer to utilizing a high throughput computing platform, which enabled us to do the simulations in parallel," Thelen says.

Supported by the National Science Foundation (NSF), the National Institutes of Health (NIH), the Department of Energy (DOE), the Morgridge Institute for Research, and various grants from the university, the Center for High-Throughput Computing (CHTC) provides computing technology to researchers like Thelen and Smith who need extensive computing resources. With the help of the CHTC, running 3,000 simulations takes about two hours rather than the estimated 1,500 hours on a single computer; that is roughly half an hour of computing per simulation, spread across hundreds of machines working at once (a sketch of such a parameter sweep appears at the end of this article). The CHTC uses HTCondor software, developed within CHTC, to schedule and send jobs to large grids of computers. This enables researchers to run simulations on open computers across campus, including the undergraduate computer labs and research-oriented computing clusters, as well as computers belonging to CHTC. The CHTC also takes advantage of the resources of the Open Science Grid, consisting of computing and storage elements at over 100 individual sites spanning the United States.

The use of high throughput computing and probabilistic analysis helps scientists like Thelen and Smith answer research questions and speeds up the process of doing so. With the ability to run simulations on a multitude of computers, scientists can run hundreds of tests in a fraction of the time. This technology allows researchers to get to results, change their approach, or do exploratory work more quickly.

Christina Koch, a research computing facilitator with the CHTC since November 2014, assists researchers in using the large-scale computing resources available to them.
Koch's background in math, computing, and computing education led her to take on the position of a computing facilitator. On any given day, Koch helps researchers solve problems, think critically about their work, and discover how to best leverage the resources at CHTC to transform their research. "I've had a good day if I've helped someone overcome a hurdle that allows them to move forward with their work – whether that be solving a specific problem together, helping them learn something new, or hearing about what they were able to accomplish on their own," Koch says. "It's great to hear that someone was able to finish their dissertation research, get a paper in, or expand the scope of their project – just because they could access computing through CHTC and there was someone there to help them move forward."

The computing power of the Open Science Grid coupled with freely available software such as OpenSim is something Koch considers a major gateway for the scientific community. Easy access to so much computing power allows researchers to try new approaches or methods that just wouldn't have been possible before.

What does the future hold for OpenSim and knee modeling? Through more OpenSim simulations, researchers hope to devise subject-specific treatment plans and get a better grasp on predicting surgical outcomes. Smith hopes to see this research progress into personalized medicine, where surgeries are pre-planned and knee replacement implants are optimally designed.
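As a rough sketch of how a simulation campaign like this maps onto HTCondor (all names here are hypothetical illustrations, not the lab's actual setup), each line of a parameter file, say one graft tension and attachment angle per line, can become one queued simulation:

    # knee_sweep.sub -- hypothetical submit description for a parameter sweep
    # run_opensim_sim.sh is a hypothetical wrapper around the OpenSim tools;
    # each line of params.txt holds one graft tension and attachment angle.
    executable = run_opensim_sim.sh
    arguments  = $(tension) $(angle)
    output     = results/$(tension)_$(angle).out
    error      = results/$(tension)_$(angle).err
    log        = sweep.log
    request_memory = 4GB
    # one job per line of the parameter file
    queue tension, angle from params.txt

With 3,000 such lines, HTCondor fans the work out across whatever machines are free, which is how 1,500 hours of serial computing collapses to about two hours.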


News Article | February 28, 2017
Site: www.biosciencetechnology.com

The mystery of what controls the range of developmental clocks in mammals -- from 22 months for an elephant to 12 days for an opossum -- may lie in the strict time-keeping of pluripotent stem cells for each unique species. Developmental clocks are of high importance to regenerative medicine, since many cell types take long periods to grow to maturity, limiting their usefulness to human therapies. The regenerative biology team at the Morgridge Institute for Research, led by stem cell pioneer and UW-Madison professor James Thomson, is studying whether stem cell differentiation rates can be accelerated in the lab and made available to patients faster.

In a study published in the February online edition of the journal Developmental Biology, Morgridge scientists tested the stringency of the developmental clock in human stem cells during neural differentiation. First, they closely compared the differentiation rates of the cells growing in dishes to the known growth rates of human cells in utero. Second, they grew the human stem cells within a mouse host, surrounded by factors -- such as blood, growth hormones and signaling molecules -- endemic to a species that grows much more rapidly than humans. In both cases -- lab dish or different species -- the cells did not waver from their innate timetable for development, regardless of environmental changes.

"What we found remarkable was this very intrinsic process within cells," said lead author Chris Barry, a Morgridge assistant scientist. "They have self-coding clocks that do not require outside stimulus from the mother or the uterus or even neighboring cells to know their pace of development."

While the study suggests that cellular timing is a stubborn process, the Thomson lab is exploring a variety of follow-up studies on potential factors that could help cells alter their pace, Barry said. One aspect of the study that's immediately valuable across biology is the realization that how stem cells behave in the dish aligns almost precisely with what happens in nature. "The promising thing is that we can take species of stem cells, put them in tissue culture, and more confidently believe that events we're seeing are probably happening in the wild as well," Barry said. "That is potentially great news for studying embryology in general, understanding what's going on in the womb and disease modeling for when things can go wrong." It also opens up potential avenues in embryology that would have been inconceivable otherwise -- for example, using stem cells to accurately study the embryology of whales and other species with much longer (or shorter) gestation rates than humans.

In order to accurately compare developmental timing across species with wildly different gestation rates -- nine months compared to three weeks -- the team used an algorithm called Dynamic Time Warping, originally developed for speech pattern recognition. The algorithm stretches or compresses the time frame of one species to match up with similar gene expression patterns in the other (a minimal sketch of the idea appears after this article). Using this process, they identified more than 3,000 genes whose activity changes more rapidly in mice, and found none that change faster in human cells.

The impact of solving the cell timing puzzle could be enormous, Barry said. For example, cells of the central nervous system take months to develop to a functional state, far too long to make them therapeutically practical. If scientists can shorten that timing to weeks, cells could potentially be grown from individual patients to counteract grave diseases such as Parkinson's disease, multiple sclerosis, Alzheimer's disease, Huntington's disease and spinal cord injuries. "If it turns out these clocks are universal across different cell types," said Barry, "you are looking at broad-spectrum impact across the body."
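Dynamic Time Warping itself is a compact dynamic-programming algorithm. Below is a minimal, self-contained Python sketch on toy data; the study aligned whole gene-expression time courses across thousands of genes, not a single curve:

    import numpy as np

    def dtw_distance(a, b):
        """Dynamic Time Warping distance between two 1-D time series."""
        n, m = len(a), len(b)
        # cost[i, j] = cheapest alignment of a[:i] with b[:j]
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                # advance one series, the other, or both: this choice is what
                # lets DTW stretch or compress one species' timeline
                cost[i, j] = d + min(cost[i - 1, j],
                                     cost[i, j - 1],
                                     cost[i - 1, j - 1])
        return cost[n, m]

    # Toy example: the same expression pulse unfolding at two speeds, standing
    # in for a fast (mouse) and a slow (human) developmental clock.
    fast = np.sin(np.linspace(0, np.pi, 20))
    slow = np.sin(np.linspace(0, np.pi, 60))
    print(dtw_distance(fast, slow))  # small, despite the threefold length difference

The warped alignment is what lets expression patterns from a three-week mouse gestation be compared point-for-point against a nine-month human one.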


News Article | December 13, 2016
Site: www.eurekalert.org

MADISON, Wis. -- Social media has erased many of the boundaries between leaders and the people they represent, between experts and the lay public, between scientists and nonscientists. It has enabled people to communicate directly and interact in unprecedented ways.

At the University of Wisconsin-Madison, a survey of 372 scientists engaged in biological or physical science research shows that scientists are increasingly using social media to communicate with nonscientific audiences. Nearly 75 percent of the scientists surveyed at UW-Madison between April and June 2016 believe that nonscientists add valuable perspective to discussions about scientific research, which came as a surprise to Dominique Brossard, professor and a leader of the group that administered the survey, the Science, Media and the Public research group (SCIMEP) in the UW-Madison Department of Life Sciences Communication. A report from the survey is published on the SCIMEP website.

"Scientists think lay audiences have something important to say," says Brossard. "It really reflects the reality of complex science today, where often there are ethical dimensions to consider." At the same time, the SCIMEP team found scientists at UW-Madison are also using social media more often to communicate with their peers. "The norms are changing," says Brossard. "UW-Madison is representing this quite well." Indeed, a non-UW-Madison study published in October in the journal PLOS One shows that scientists from a variety of disciplines around the world report that, while they have not widely adopted social media, they believe there are numerous advantages to using it in their work.

Partnering with the Morgridge Institute for Research, the UW-Madison team approached its survey with an interest in how scientists view social media communication around sensitive topics like synthetic biology and gene editing. They were also interested in how scientists view the role of social media in conducting post-publication peer review of scientific studies. The survey was partially inspired by a case in 2010 in which a scientific study claimed to have discovered a bacterial species that incorporated the element arsenic into its genetic material, alluding to extraterrestrial life, says Kathleen Rose, the SCIMEP graduate student who compiled the report. Shortly after the "arsenic life" study was published, a blog post from another scientist in the field disputed the claims. Heated conversation ensued on Twitter. Brossard and Rose were co-authors of an analysis of that case, which was published earlier this year and examined how social media influenced the fate of the controversial study once it became public.

"We know the peer-review system is flawed, and this may be a mechanism that is fixing it, like a crowd-sourced correcting mechanism," Brossard says. Rose adds that they were curious whether scientists at UW-Madison viewed social media as a place "where they can go to discuss and find other people and build up that post-publication peer review." The survey reveals that one in four of the responding scientists think that scientists should use social media to comment on the validity of scientific findings after publication. Another 54 percent neither agreed nor disagreed with this statement, and Brossard finds this particularly interesting. "More than half of scientists are ambivalent that peer review is always right," she says, noting that many scientists may not have thought before about social media as a tool for post-publication peer review.
"We may have planted a seed. Surveys can get people to think about something they may not have thought about before." The survey also asked UW-Madison scientists how they believed using social media impacts their reputation and scientific credibility. Despite SCIMEP research that shows otherwise, only 17 percent of respondents agreed that social media increases their citation rates, a measure of their impact in the field, while nearly half did not agree or disagree. This ambivalence, she says, "is usually a sign that things are changing." According to the survey, for scientific purposes, 78 percent of the responding scientists frequent Wikipedia anywhere from more than once a week to at least a few times a month, half never use Facebook, just 35 percent ever use Twitter and very few use Reddit. Others use ResearchGate, YouTube, blogs and podcasts with variable frequency. Just over 40 percent talk to reporters about their research at least a few times a year or more, and nearly three-quarters engage at least a few times a year in public outreach efforts related to their field. More than three quarters of respondents at UW-Madison also believe scientists should be actively involved in political debates around scientific issues like synthetic biology. "Scientists are willing to engage with lay audiences; they want to engage," says Brossard. "It's not just scientists in the Ivory Tower anymore." The survey shows that scientists at UW-Madison by and large believe the public is interested in what they have to say about science on social media. However, a majority report that using it is too time-consuming. Respondents were overwhelmingly male (70 percent) and relatively far along in their careers. The mean number of years since they earned their doctorate was just over 23. Brossard was heartened to learn that 34 percent of respondents say they pay attention to the social science underlying science communication. "That's amazing," says Brossard, who trained as a biological scientist before transitioning to study the science of science communication. "Fifteen years ago, it was really hard for a social scientist to have credibility when talking to a physical or biological scientist." Better study of science communication may be on the horizon. On Tuesday [Dec. 13, 2016], the National Academies of Sciences, Engineering, and Medicine issued a report on effective science communication, compiled by a committee co-chaired by Dietram Scheufele, professor in the Department of Life Sciences Communication with Brossard and member of SCIMEP. The report identifies what we currently know about effective science communication and proposes a research agenda to better understand how to improve it, particularly around contentious issues like climate change, stem cells, and vaccines. Rose and Brossard say they would like to help normalize scientists' use of social media and help them see their peers are also using it productively and in meaningful ways. And they would like to understand what happens when scientists do use social media. "Does it change your view of the public? Does it change the way you see your own research? Does it make you more creative? What's the impact?" Brossard asks. "It's not just to do it to say that you do it, does it actually change you as a person, as a researcher in your relationship with society? That's what I find fascinating. Most research is focused on how does it affect the public, but how does it impact the scientist?" 
To illustrate, she shares an experience she had serving on a public panel focused on gene editing held at a local science festival last year. One of the panelists remarked to her, "Wow, the public is asking super hard questions!" Brossard says. "Not all of them were technical questions, but they were profound questions," she says. "I'm sure this person was transformed by the experience and maybe that makes his research a little more meaningful. That's what we would like to see ... I know that it has changed me."


News Article | November 9, 2016
Site: www.sciencedaily.com

Biochemists at the University of Wisconsin-Madison have created the first atlas that maps where molecular tools that can switch genes on and off will bind to the human genome. It is a development they say could enable these tools to be targeted to specific parts of an individual's genome for use in precision medicine, developing therapies and treating disease. The study is published this week in the Proceedings of the National Academy of Sciences.

The tools are polyamides, engineered DNA-binding molecules that are an important component of artificial transcription factors. Transcription factors -- both natural and artificial -- determine which genes are translated into proteins inside cells. "We know that transcription factors bind to specific sites in the genome and when they misfire they drive many diseases, including cancers," explains lead study author Graham Erwin, a former graduate student in the lab of Aseem Ansari, a professor in the Department of Biochemistry and the Genome Center of Wisconsin. "Using insights gleaned from this research, we hope to design polyamides that can bind to these same sites and outcompete the cancer-inducing factors, helping to repress that gene."

Transcription factors work by binding to a particular gene and then recruiting the cellular machinery necessary to read it and manufacture the desired protein, or they can stop a protein from being created. While polyamides had already been designed to attach to particular regions of the human genome and turn corresponding genes on or off, the new study answers lingering questions about where particular polyamides bind and ultimately function in the cell. "Our big question was, where are these molecules going across the genome?" says Erwin, who is now a postdoctoral researcher at Stanford University. "With this study, we have a whole new understanding of how they read the genome."

Using a technique called COSMIC (Crosslinking of Small Molecules to Isolate Chromatin), the researchers were able to create polyamides that would bind to DNA in human embryonic stem cells. Then, using a light-activated "handle," the scientists could search for the locations where the polyamides had bound to the genome. This enabled them to build an atlas of specific binding sites within the context of whole cells.

To their surprise, the researchers learned some polyamides can bind to DNA previously thought to be inaccessible. Each human genome, nearly five feet in length, must be highly packaged to fit inside the tiny volume of a cell's nucleus. To accomplish this extraordinary task, cells tightly wind up most of the DNA that isn't readily needed. At any given time, more than two-thirds of the DNA in the human stem cells the researchers used is packaged in this way. The study team found some of the polyamides were bound to this off-limits DNA. While such DNA is unavailable to most transcription factors, polyamides may be small enough to reach it, the researchers believe.

"Being able to target a specific site in the genome is essential for the next generation of rationally designed therapies, and the lessons we've learned have changed the way we design molecules to target individual genomes," says Ansari. In a parallel study published Nov. 4 in the journal Angewandte Chemie, Ansari collaborated with Kan Xiong and Paul Blainey of MIT to visualize how these small molecules search long stretches of DNA for their binding sites.
The study shows that these synthetic genome readers behave like "molecular sleds" and slide effortlessly across vast tracts of the genome. Together the studies provide new insights into how these molecules locate their preferred target sites in the genome. "For 15 years we've been working on this idea and now it seems we're finally on the way to being able to intervene in a thoughtful way," Ansari says. Other authors on the PNAS "atlas" study include past or present members of the Ansari lab, members of the Department of Electrical and Computer Engineering, and researchers at the Morgridge Institute for Research.


Ossorio P., Morgridge Institute for Research | Ossorio P., University of Wisconsin - Madison
Genetics in Medicine | Year: 2012

Most discussions of researchers' duties to return incidental findings or research results to research participants or repository contributors fail to provide an adequate theoretical grounding for such duties. Returning findings is a positive duty, a duty to help somebody. Typically, such duties are specified narrowly such that helping is only a duty when it poses little or no risk or burden to the helper and does not interfere with her legitimate aims. Under current budgetary and personnel constraints, and with currently available information technology, routine return of individual findings from research using repository materials would constitute a substantial burden on the scientific enterprise and would seriously frustrate the aims of both scientists and specimen/data contributors. In most cases, researchers' limited duties to help repository contributors probably can be fulfilled by some action less demanding than returning individual findings. Furthermore, the duty-to-return issue should be analyzed as a conflict between (possibly) helping some contributors now and (possibly) helping a greater number of people who would benefit in the future from the knowledge produced by research. © American College of Medical Genetics and Genomics.


Benjamin Shapiro R., Tufts University | Ossorio P.N., Morgridge Institute for Research | Ossorio P.N., University of Wisconsin - Madison
Science | Year: 2013

How should research studying adolescent players of online educational games be conducted responsibly?
