Johns Hopkins

Baltimore, MD, United States


A peer-support program launched six years ago at Johns Hopkins Medicine to help doctors and nurses recover after traumatic patient-care events, such as a patient's death, probably saves the institution close to $2 million annually, according to a recent cost-benefit analysis. The findings, published online in the Journal of Patient Safety, could provide impetus for other medical centers to offer similar programs -- whose benefits go far beyond the financial, the Johns Hopkins Bloomberg School of Public Health researchers say. Clinicians who aren't able to cope with the stress or who don't feel supported following these events often suffer a decrease in work productivity, take time off or quit their jobs, they say.

"We often refer to medical providers who are part of these stressful events as 'second victims,'" says study leader William V. Padula, PhD, an assistant professor in the Department of Health Policy and Management at the Bloomberg School, using a term coined by Johns Hopkins professor Albert Wu, MD. "Although providers often aren't considered to be personally affected, the impact of these events can last through their entire career."

In 2011, Johns Hopkins Medicine started the Resilience In Stressful Events (RISE) program. The program relies on a multidisciplinary network of peer counselors -- nurses, physicians, social workers, chaplains and other professionals -- who visit or call a fellow clinician in need within 30 minutes of a request for help following an emotionally difficult care-related event, such as treating a patient in extreme pain, dealing with an overwhelmed family, or seeing a patient harmed by a medical error. At large academic medical centers such as Johns Hopkins, with a complicated and often very sick patient population, such events happen on a daily basis, Padula says.
Although Padula says that he and others involved in the RISE program believe in its importance regardless of cost, the program does require Johns Hopkins to redirect some resources. For example, he says, although the peer counselors all volunteer their time, that's time taken away from other billable work, such as patient care. For Johns Hopkins to continue to invest in the program, he explains, showing a financial benefit is key.

To explore whether such a benefit exists, Padula and his colleagues developed a model, focused just on the nursing population, to investigate the likely financial outcomes of a year with or without the RISE program in place. The model used data from a survey of nurses familiar with the RISE program on their probability of quitting or taking a day off after a stressful event, with or without the program in place. It also drew on Johns Hopkins human resources data and the average cost of replacing a nurse as reported in published literature, among other inputs. After inputting this information into the model, the researchers found that the annual cost of the RISE program per nurse was about $656, whereas the expected annual cost of not having the program in place was $23,232. Thus, the RISE program results in a net cost savings of $22,576 per nurse. Expanding that out to all users of the system -- including doctors, who have a much higher cost per billable hour and dramatically higher replacement costs -- the total savings to the entire institution in one year was expected to be about $1.81 million.

The savings alone are an attractive reason to implement a program like RISE at other large academic medical centers, Padula says. However, he says, helping clinicians get through a stressful event is the right thing to do, regardless of cost. "It's hard to put a true price on the emotional support and coping mechanisms this program provides for clinicians after tragic events," he says.
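The paper's bottom-line figure is straightforward expected-cost accounting; a minimal sketch using only the dollar amounts reported above (the $1.81 million institution-wide projection comes from the paper's fuller model and is not reproduced here):

```python
# Expected-cost comparison for the RISE program, per nurse per year,
# using the figures reported in the Journal of Patient Safety analysis.
COST_WITH_PROGRAM = 656        # annual RISE program cost per nurse ($)
COST_WITHOUT_PROGRAM = 23_232  # expected annual cost per nurse without RISE ($)

net_savings_per_nurse = COST_WITHOUT_PROGRAM - COST_WITH_PROGRAM
print(net_savings_per_nurse)  # 22576, matching the paper's $22,576 figure
```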
Other Johns Hopkins researchers who participated in this study include Dane Moran, MPH; Albert W. Wu, MD; Cheryl Connors, MS; Meera R. Chappidi, MPH; Sushama K. Sreedhara, MBBS; and Jessica J. Selter, MD. Funding for this study was provided by the Josie King Foundation and the Maryland Patient Safety Center.


News Article | May 9, 2017
Site: www.sciencemag.org

Could the Trump administration be changing its mind about slashing funding for the National Institutes of Health (NIH)? Scientific leaders were optimistic yesterday after meeting for 2 hours at the White House with several biotech executives to discuss the “ecosystem” in which federally funded basic research leads to discoveries that companies turn into treatments.

The closed meeting, described 2 weeks ago by Bloomberg News as a “summit,” took place on 8 May in a room in the White House residence. Twenty-seven people attended, including NIH Director Francis Collins, Health and Human Services Secretary Tom Price, Food and Drug Administration Acting Commissioner Stephen Ostroff, and nine White House officials including President Donald Trump’s daughter Ivanka Trump. About a dozen outside speakers ranged from Stanford University’s president and the CEOs of the Mayo Clinic and Johns Hopkins Medicine to four CEOs from biotech companies including Vertex Pharmaceuticals and Regeneron Pharmaceuticals.

The meeting took place against the backdrop of White House plans for massive cuts to NIH that lawmakers in Congress have so far rebuffed. After the president proposed cutting NIH by about $1 billion in 2017, Congress instead gave the agency a $2 billion raise, to $34 billion, in a bill Trump signed last Friday. However, Trump’s proposed “skinny budget” for the 2018 fiscal year that begins 1 October would trim $5.8 billion from NIH’s budget.

Yesterday’s gathering began with remarks by Vice President Mike Pence and a few brief presentations in which NIH and other leaders discussed milestones such as the Human Genome Project and falling rates of heart disease and HIV, and presented data on how NIH research contributes to the U.S. economy. Then the attendees had a discussion. Biotech leaders explained why private investment can’t substitute for NIH’s support for basic research at academic institutions.
NIH and academic leaders described the crushing one-in-five odds of winning an NIH grant that early-career investigators are facing after years of flat NIH funding. Another concern was that Trump immigration policies are making it more difficult to recruit foreign talent. Heart disease researcher Helen Hobbs of the University of Texas Southwestern Medical Center in Dallas told the group that her Chinese postdocs are now accepting job offers in China instead of staying in the United States.

White House officials were attentive, including Ivanka Trump, who asked a lot of questions, including on the role of women in biomedical research and K–12 science education, Collins says. “She was totally engaged,” he told reporters later. At the end, the group visited the Oval Office for a photo with President Trump.

“By far the overwhelming message was the critical role of NIH in supporting fundamental biomedical research that has laid the foundation” for new diagnostics and therapeutics, says human geneticist Rick Lifton, president of The Rockefeller University in New York City. “I certainly came away with the understanding that we were carefully listened to,” he adds. “The message in the room was loud and clear: We need the NIH! And we need it now more than ever,” says Cori Bargmann, president of science for the Chan Zuckerberg Initiative in Palo Alto, California, in a Facebook post.

The meeting was organized by Bill Ford, CEO of General Atlantic, a global investment firm, who has ties to Reed Cordish, Trump’s assistant for intragovernmental and technology initiatives, Collins said. Ford, who sits on the boards of The Rockefeller University and Memorial Sloan Kettering Cancer Center in New York City, thought it would be a “good idea” to have experts describe “the whole ecosystem” that biomedical innovation depends on, Collins said.
Participants did not discuss the proposed $5.8 billion NIH budget cut in 2018—it was one of several “elephants in the room,” including drug pricing, Collins said. The meeting did touch on Price’s proposal to make that cut by slashing indirect costs, the overhead payments for research grants that NIH now disburses to grantee universities. But “it was not the main focus,” Collins says. (One attendee said that academic leaders explained how indirect cost payments don’t come close to covering the full cost of NIH-funded research.)

The meeting “was an important step in laying out bold plans to fortify America’s role as the global leader in biomedicine,” Collins said in a statement afterward. He told reporters he thinks there will be more such meetings. Are NIH’s budget prospects looking better? “I think time will tell,” Collins said.


News Article | May 9, 2017
Site: www.npr.org

Location A Bigger Influence Than Race For Children In Public Housing

Do black and white children who live in assisted or subsidized housing experience different life outcomes? That question was at the center of a new study by Sandra Newman and C. Scott Holupka, two researchers at Johns Hopkins University in Baltimore. They combed through federal data on households in public housing or those that received housing vouchers from the 1970s through the first decade of the 2000s.

What the pair found was neither straightforward nor surprising. When it came to life outcomes, living in assisted housing didn't predict significant differences in how children turned out because of their race, a big change from decades past. They also found that black and white people living in subsidized housing in the first decade of the aughts were living in homes of comparable quality, which was not true in the past.

But while they found parity in the housing conditions in which black and white households lived, there were still wide disparities in the neighborhoods in which their homes were located. "Despite achieving equality of access to quality public housing, because of historical and structural forces, black families are more likely to live in central cities," Newman said — that is, places with more concentrated poverty and the problems that come with it. By contrast, white families in assisted housing were more likely to live on the edges of those cities, or in the inner-ring suburbs — which often meant access to better schools and public services. "Living in a better place can have a very good influence on these outcomes," Newman said.

The researchers said that the study's findings were concerning given the number of children involved: In 2011, there were about 1.9 million households with children living in assisted housing across the country, and nearly half of children who lived in assisted housing in the United States were black.
The researchers found that black families in assisted housing were likely to have been on welfare longer than white families in similar situations; they also had lower household incomes, and were more likely to have a head of household with a disability. "We really need to understand the neighborhood, because it's the key to unlocking these [more advantageous] family backgrounds," Newman said. The data they analyzed bore out this neighborhood effect: black kids who lived in assisted housing in neighborhoods with better services and resources had better outcomes than those in public housing in more economically distressed areas.

But there are challenges to keeping affordable housing from being concentrated in poorer inner cities. One is social: families with fewer resources rely on their close ties for things like childcare and transportation. "Even if people were given a [housing] voucher where they can choose where they live, people tend to pick places with which they're already familiar," Holupka said.

The other is local politics. Holupka pointed to Johns Hopkins' own backyard, which is dealing with an ongoing affordable housing crisis. "Look at Baltimore City," he said. "It would need to work with other housing authorities" — the local public-private agencies that oversee assisted housing — "to give assistance to families looking to move for housing elsewhere." About 150,000 people in the city — around a third of its renting households — live below the poverty line at a time when rental costs have sharply increased. "It's not like some suburban jurisdiction is going to say, all these people from the inner city need housing so let's make more available," he said.

The study comes at a time when Republicans in Washington seem to be considering dismantling a fair housing provision put in place under the Obama administration to address such concerns.
Under the rule, known as Affirmatively Furthering Fair Housing, local housing agencies that receive federal money for housing must keep data on the racial demographics of the people who live in affordable or subsidized housing — and take steps to remove barriers to housing for low-income earners. As Slate reported in March, that rule has already made a dent in residential segregation in some cities by nudging local agencies into changing how they distribute their affordable housing resources.

During the presidential campaign last year, Donald Trump pledged to do away with the rule. Ben Carson, President Trump's housing secretary, was on the record as opposing it years ago. For now, though, it seems Republicans have turned their attention elsewhere. "[The rule] is being ignored," Newman said. "It hasn't been overturned yet — but it's not being enforced."


News Article | May 11, 2017
Site: www.chromatographytechniques.com

The biggest problem in sports today—whether at the pee-wee, high school, collegiate or professional level—is the diagnosis, or non-diagnosis, of concussions. The signs and symptoms can be tricky to identify, even for seasoned medical professionals. With approximately 6 million people a year suffering a mild traumatic brain injury (TBI) that may or may not result in a concussion, the desperate need for an accurate diagnostic tool is obvious.

Researchers all over the world are working on a solution, but it’s a Maryland-based company called BrainScope that has taken the lead with its FDA-approved Ahead 300 device. The Ahead 300 is a portable EEG (electroencephalogram) that provides a rapid, objective assessment of the likelihood of the presence of TBI in patients who present with mild symptoms at the point of care, which could be a field, the emergency room, a park, etc. The goal of the device is to offer clinicians a comprehensive panel of data to assist in the diagnosis of the full spectrum of TBI. Eight years and three generations in the making, this iteration of the Ahead 300 was approved by the FDA in September 2016 and became commercially available on a limited basis in January 2017.

“This is a first-of-its-kind instrument. We have finally reached the age where there will be objective quantitative measurements made on the mild-TBI patient, and that’s what is most exciting,” Daniel Hanley Jr., M.D., professor of neurological medicine at the Johns Hopkins University School of Medicine, told Laboratory Equipment. Hanley was the lead investigator on the latest clinical trial for the Ahead 300, which showed 97 percent sensitivity in detecting likely brain bleeding in people with head injuries.

Development of the device comes in part from grant support by the National Football League through the GE/NFL Head Health Initiative. BrainScope has also been awarded more than $27 million in research contracts since 2011 by the U.S. Department of Defense for research and development. These contracts helped support the diagnostic tool through more than 20 clinical studies at 55 sites, as well as its evolution and enhancement through the years.

“We’ve learned in the last 15 years, unfortunately, that blasts cause concussions, so the military is very interested in being able to identify who had a serious blast injury and who had no injury at all but was near where the blast went off,” explained Hanley, noting that the device has applications beyond sports. “The military is very interested in triage and who can go back to work and who can’t. It looks like this device has potential in that second area.”

The Ahead 300 features BrainScope’s proprietary, patent-protected EEG capabilities in combination with algorithms and machine learning to assess the likelihood that a patient presenting with mild-TBI symptoms has more than 1 milliliter of bleeding in the brain, which would require immediate evaluation and further testing or intervention. A disposable electrode headset, powered by smartphone technology, records EEG data from five regions of the forehead and feeds the signals back to the handheld device.

For clinicians who have never had a suitable diagnostic tool for TBI before, the Ahead 300 answers two very important questions that can facilitate proper decision-making: 1) Is it likely the mildly presenting head-injured patient has a traumatic structural brain injury that would be visible on a CT scan, the gold standard used in emergency rooms? And 2) Is there evidence of something functionally abnormal with the brain after head injury, which could be a concussion? In addition to these questions, the Ahead 300 also offers two rapid cognitive performance tests, as well as a number of professional society-based concussion assessment tools commonly used in today’s medical landscape.
The 5- to 10-minute cognitive performance tests allow doctors to assess patient performance compared with healthy individuals in the same age group, providing an objective metric. The device also includes four objective tests and 16 standard concussion assessment tools, all of which can be customized since there is no national protocol for TBI/concussion diagnosis.

Johns Hopkins’ Hanley, also the director of the university’s Brain Injury Outcomes program, recently published a study in Academic Emergency Medicine designed to test the accuracy and effectiveness of the Ahead 300 against a CT scan, the current gold standard for TBI/concussion assessment. Hanley and his research team recruited 720 adults from 11 emergency departments across the nation who presented with a closed head injury. After standard clinical assessment tests by a doctor or nurse to characterize the patient’s symptoms, the Ahead 300 device was used to measure EEG data—essentially tracking and recording brain wave patterns.

For this study, the device was programmed to read approximately 30 specific features of brain electrical activity, which an algorithm analyzes to compare the patient’s pattern of brain activity with patterns considered normal. For example, it looked for how fast or slow information traveled from one side of the brain to the other, or whether electrical activity in both sides of the brain was coordinated or one side was lagging.

The accuracy of the device was then tested using CT scans from the patients. The presence of any blood within the intracranial cavity was considered a positive finding, indicating brain bleeding. Initially, researchers sorted the patients tested with Ahead 300 into two categories—“yes,” indicating likely traumatic brain injury with over 1 mL of bleeding, and “no,” for those with likely no bleeding in the brain.
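The two-category sorting described above (and the three-tier yes/maybe/no refinement the researchers adopted later in the study) is, in effect, a thresholding rule applied to an abnormality score. A toy sketch of such a rule follows; the score scale and cutoff values are invented for illustration and are not BrainScope's proprietary classifier:

```python
# Toy two-threshold ("yes"/"maybe"/"no") triage rule, illustrating the
# kind of three-tier sorting described in the study. The score and the
# cutoffs are invented for illustration only.
def triage(abnormality_score: float,
           lower: float = 0.4, upper: float = 0.7) -> str:
    """Classify an EEG abnormality score into a three-tier result."""
    if abnormality_score >= upper:
        return "yes"    # likely structural injury: escalate to CT
    if abnormality_score >= lower:
        return "maybe"  # elevated but not definitive: clinical judgment
    return "no"         # likely no bleeding

print(triage(0.9), triage(0.5), triage(0.1))  # yes maybe no
```

Collapsing "maybe" into "yes" recovers the original two-category scheme, which is why the three-tier version can only raise sensitivity.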
Of the 564 patients without brain bleeding as confirmed by CT scans, 291 were scored on the Ahead 300 as likely not having a brain injury. Of the 156 patients with confirmed brain bleeding, however, the Ahead 300 correctly identified 144, or 92 percent.

Given that nearly half of the patients without brain bleeding had been flagged as positive, the researchers then created three categories to sort patients—yes, no and maybe. The maybe category included a small number of patients with greater-than-usual abnormal EEG activity that was not statistically high enough to be definitively positive. When the results were recalculated on the three-tier system, the sensitivity for detecting a traumatic brain injury increased to 97 percent, with 152 of 156 traumatic head injuries detected by the Ahead 300—99 percent of those having at least 1 milliliter of bleeding in the brain. None of the four false negatives required surgery, returned to the hospital due to their injury or needed additional brain imaging.

The researchers say these predictive capabilities improve on the clinical criteria currently used to assess whether to do a CT scan—known as the New Orleans Criteria and the Canadian Head CT rules—and predicted the absence of brain bleeding more than 70 percent of the time in people with no more than one symptom of brain injury, such as disorientation, headache or amnesia.

“This work opens up the possibility of diagnosing head injury in a very early and precise way,” Hanley said. “This technology is not meant to replace the CT scan in patients with mild head injury, but it provides the clinician with additional information to facilitate routine clinical decision-making. If someone with a mild head injury was evaluated on the sports or battlefield, then this test could assist in the decision of whether or not he or she needs rapid transport to the hospital.
Alternatively, if there is an accident with many people injured, medical personnel could use the device to triage which patients would need to have CT scans and who should go first.”

This specific study tested the Ahead 300 only on adults between the ages of 18 and 85, leaving out a large portion of the younger target audience. But since EEG differs across the spectrum of ages, the effectiveness of the device on children and teens needs to be tested in a separate pediatric study—which BrainScope is already working on. Hanley said he hopes to collaborate on that study, as well as on overall concussion research going forward.

“The exciting thing is this has applications across mild TBI,” he said. “This is a measurement that could be used in chronic traumatic encephalopathy (CTE) research. The question there is ‘how many miles add up to CTE?’ This would give you some measure of mild concussion with or without brain bleeding.”
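The detection rates quoted in the study can be reproduced directly from the reported patient counts; a quick check, using only the numbers given above:

```python
# Recomputing the study's reported detection rates from its raw counts.
bleeds_total = 156           # patients with CT-confirmed brain bleeding
bleeds_detected_2tier = 144  # flagged "yes" under the original yes/no scheme
bleeds_detected_3tier = 152  # flagged "yes" or "maybe" under the 3-tier scheme
no_bleed_total = 564         # patients without bleeding on CT
no_bleed_correct = 291       # scored as likely not having a brain injury

print(round(100 * bleeds_detected_2tier / bleeds_total))  # 92 percent
print(round(100 * bleeds_detected_3tier / bleeds_total))  # 97 percent
print(round(100 * no_bleed_correct / no_bleed_total))     # 52 percent
```

The last figure is the roughly 50 percent rate of correct "no" calls among uninjured patients that motivated the move to the three-tier system.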


News Article | May 9, 2017
Site: www.npr.org

Almost 100 hospitals reported suspicious data on dangerous infections to Centers for Medicare & Medicaid Services officials, but the agency did not follow up or examine any of the cases in depth, according to a report by the Health and Human Services inspector general's office. Most hospitals report how many infections strike patients during treatment, meaning the infections are likely contracted inside the facility.

Each year, Medicare is supposed to review up to 200 cases in which hospitals report suspicious infection-tracking results. The IG said Medicare should have done an in-depth review of 96 hospitals that submitted "aberrant data patterns" in 2013 and 2014. Such patterns could include a rapid change in results, improbably low infection rates or assertions that infections nearly always struck before patients arrived at the hospital.

The IG's report, released Thursday, was designed to address concerns over whether hospitals are "gaming" a system in which it falls to the hospitals to report patient infection rates and, in turn, the facilities can see a bonus or a penalty worth millions of dollars. The bonuses and penalties are part of Medicare's Hospital Inpatient Quality Reporting program, which is meant to reward hospitals for low infection rates and give consumers access to the information at the agency's Hospital Compare website.

The report zeroes in on a persistent concern about deadly infections that patients develop as a result of being in the hospital. A recent report in the journal BMJ identified medical errors as the third-leading cause of death in the country. Hospital infections particularly threaten senior citizens with weakened immune systems.

Rigorous review of hospital-reported data is important to protect patients, says Lisa McGiffert, director of the Consumers Union's Safe Patient Project. "There's a certain amount of blind faith that the hospitals are going to tell the truth," McGiffert says.
"It's a bit much to expect that if they have a bad record, they're going to fess up to it." Yet there are no uniform standards for reviewing the data that hospitals report to Medicare, says Dr. Peter Pronovost, senior vice president for patient safety and quality at Johns Hopkins Medicine. "There are greater requirements for what a company says about a washing machine's performance than there is for a hospital on quality of care, and this needs to change," Pronovost said. "We require auditing of financial data, but we don't require auditing of [health care] quality data, and what that implies is that dollars are more important than deaths." In 2015, Medicare and the Centers for Disease Control and Prevention issued a joint statement cautioning against efforts to manipulate the infection data. The report said CDC officials heard "anecdotal" reports of hospitals declining to test apparently infected patients so there would be no infection to report. They also warned against overtesting, which helps hospitals assert that patients came into the hospital with a pre-existing infection, thus avoiding a penalty. In double-checking hospital-reported data from 2013 and 2014, Medicare reviewed the results from 400 randomly selected hospitals, about 10 percent of the nation's more than 4,000 hospitals. Officials also examined the data from 49 "targeted" hospitals that had previously underreported infections or had a low score on a prior year's review. All told, only six hospitals failed the review, which included a look at patients' medical records and tissue sample analyses. Those hospitals were subject to a 0.6 percent reduction in their Medicare payments. Medicare did not specify which six hospitals failed the data review, but it did identify dozens of hospitals that received a pay reduction based on their reports on the quality of care. The new IG report recommends that Medicare "make better use of analytics to ensure the integrity of hospital-reported quality data." 
A response letter from Centers for Medicare & Medicaid Services Administrator Seema Verma says Medicare concurs with the finding and will "continue to evaluate the use of better analytics ... as feasible, based on [Medicare's] operational capabilities."

Questions about truth in reporting hospital infections have percolated for years, as reports have trickled out from states that double-check data. In Colorado, one-third of the central-line infections that state reviewers found in 2012 were not reported to the state by hospitals, as required. Central lines are inserted into a patient's vein to deliver nutrients, fluids or medicine. Two years later, though, reviewers found that only 2 percent of central-line infections were not reported. In Connecticut, a 2010 analysis of three months of cases found that hospitals reported about half, or 23 of the 48 central-line infections that made patients sick. Reviewers took a second look in 2012 and found improved reporting, with about a quarter of the cases unreported, according to the state public health department. New York state officials have a rigorous data-checking system that they described in a report on 2015 infection rates. In 2014, they targeted hospitals that were reporting low rates of infections and urged self-audits that found underreporting rates of nearly 11 percent.

Not all states double-check the data, though, which Pronovost says underscores the problem with data tracking the quality of health care. He said common oversight standards, like the accounting standards that apply to publicly traded corporations, would make sense in health care, given that patients make life-or-death decisions based on quality ratings assigned to hospitals. "You'd think, given the stakes, you'd have more confidence that the data is reliable," he said.

Kaiser Health News is a national health policy news service. It is an editorially independent program of the Henry J. Kaiser Family Foundation.


News Article | May 12, 2017
Site: www.prweb.com

Modern Healthcare, a leader in health care business news, research and data, has ranked OSF HealthCare as having one of the top 10 innovation centers in the U.S. OSF Innovation is listed among other world-renowned centers at Cleveland Clinic, Johns Hopkins and Mayo Clinic. Launched in 2016, OSF Innovation is a multidisciplinary innovation center focused on internal and external innovation for the transformation of health care.

“We are proud to see Modern Healthcare recognize our progress as a leading health care innovation program,” said Jeffry Tillery, MD, Senior Vice President and Chief Transformation Officer for OSF. “This speaks to our vision of transforming health care into value for the patients and communities we serve, as well as the work of an extremely talented team that works tirelessly to integrate innovation within our health care ministry.”

According to Modern Healthcare’s “By the Numbers” research, OSF Innovation ranked #2 out of 10 health care systems that are incubating startups. The recognition is based on the number of startup companies OSF HealthCare is working with to test ideas, offer mentorship and feedback, and help with troubleshooting. The organization collaborates with about 50 entrepreneurial companies. The health care publication also listed OSF Innovation #6 out of 10 health care systems accelerating innovations. That ranking is based on the number of innovation projects being tested internally; OSF HealthCare boasts more than 50.

“The unsustainable nature of health care today is what is driving our innovation agenda,” said Michelle Conger, Chief Strategy Officer for OSF.
“We believe working together with startups and piloting new ideas can help us find solutions to some of health care's biggest challenges.” OSF Innovation employs a variety of approaches to innovation such as improving processes and functions to serve patients; mentoring, networking and partnering with external companies working on solutions to health care problems; investing in start-ups through OSF Ventures; and developing and testing internal and external ideas that could revolutionize how health care is delivered.


News Article | May 10, 2017
Site: motherboard.vice.com

Sitting in traffic on the way to the Brooklyn Navy Yard on Thursday, I wondered to myself why I hadn't simply taken my bike instead. I was en route to interview Dr. Amen Ra Mashariki, the City of New York's chief analytics officer, determined to better understand how data analysis can help make living in this cramped city a little more pleasant for its nearly 8.5 million residents. After all, the purpose of the event where I caught up with Mashariki, Smart Cities NYC 2017, was to explore the intersection of "technology and urban life." Who better to ask how the reams of data the city collects can actually make a difference? What follows is an edited and condensed recreation of my conversation with Mashariki, which took place on Thursday afternoon.

Motherboard: When I saw the title "chief analytics officer," I had no idea what that was in the context of a city employee. Can you explain what a chief analytics officer does and how you got here?

Amen Ra Mashariki: I think it's a great question primarily because across the country you'll see mostly chief data officers. You'll rarely see chief analytics officers, and I think that speaks to the difference of the role that I play here than what you might see with folks who are referred to as chief data officers. To give you the quick lay of the land, I started in the private sector in Motorola, but then I went back and got my doctorate. And when I went back and got my doctorate my nephew was diagnosed with cancer and I decided that I wanted to be impactful in the technological space. When I did my dissertation I did it on interoperable medical devices, which is essentially on how to share data across medical devices. Then I did a post-doc at the University of Chicago's cancer research center. Again, staying in that vein of, "How do you use technology for good?" Then I went to Johns Hopkins and helped create their first bioinformatics research space.
I was a computer scientist there, so all of my degrees are in computer science. So I was a core programmer but I had never thought about public service. I had been in the private sector and academia, but I applied for this program called the White House Fellows. What happens is 11 people are appointed by the president every year to function as high level senior advisers to heads of agencies. It's a very prestigious program about leadership and so that was my foray in 2012 into government. And I just absolutely fell in love with the concept of public service. What was jumping into government like? For me it was a learning experience, because I came in thinking the only people who go into government don't want to do work, or they're lazy or not smart. I was absolutely wrong—my whole concept turned on its head! These are hard workers, committed people, extremely smart, who are experienced and knowledgeable in their fields, who have decided, "Hey I want to be maximally impactful." People in the private sector are the same way but in government you have to have a sense of patience. It's more of a mentality like, "We're not planning for the next quarterly results, we're planning how to better society." That's right! And that's a great segue into my role. My office, the way I describe it is, we're a no-cost data analytics consulting firm to the city. There are things that are long plays but there are things that city agencies need to do now. We need to find the top X-number of buildings that are not up to code. We need to find the places where the city is not performing. We need to identify a more efficient route when we do an emergency response or plow snow. So there are things that need to happen virtually immediately, right? When an agency decides that they're looking for a more innovative way to be more efficient and to be more impactful and to drive costs down, that's where we come in. 
We help city agencies determine what data sets they can use to help solve any number of problems that they may have. What sort of questions do these agencies ask you when they want help with, say, snow removal? The way that we function is that all a city agency has to do is essentially talk about their business challenge. Let's take one problem—one of the problems that we did a while back was to help fire inspectors identify buildings that were illegally converted [such as an apartment building only being licensed for 10 units actually having 14 units instead]. Why do we want to know which buildings have been illegally converted? Because the business problem was, "We want to minimize the number of fires that exist in the city." We translated that into an analytics question that says, "If we identify buildings that have been illegally converted, you're taking gas and electricity to those additional units that you're not permitted to. So you have to do all sorts of splicing and connections and so forth, so maybe you've got your cousin or your uncle doing the work and they're not qualified to do it properly, which increases the chance of fire. So if we can minimize the number of illegally converted buildings we can minimize the number of fires." The thing that we think about the most is emergency responses. When you look at the recent history of New York City, there's 9/11, Hurricane Sandy—the de Blasio administration is very concerned about making sure we have the right resources and infrastructure in place for emergency response. So my office thinks about emergency response from the standpoint of sharing data. Every month we do a thing called a data drill. Essentially we work with city agencies to create an emergency scenario and then we present that scenario. "This happened on this day, here are the circumstances, and here are the things that the city leadership needs to know to be able to respond." 
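The illegal-conversion triage Mashariki describes can be sketched as a simple risk-scoring pass over building records: compare permitted unit counts against observed signals and send inspectors to the highest-scoring buildings first. This is purely an illustrative sketch; the field names, weights, and data sources below are invented and do not reflect the city's actual model.

```python
# Hypothetical sketch of the illegal-conversion triage described above.
# All field names and weights are invented for illustration.

def conversion_risk(building):
    """Score how likely a building is an unpermitted conversion."""
    permitted = building["permitted_units"]
    # Utility accounts can exceed permitted units when extra units were added.
    observed = max(building["electric_accounts"], building["gas_accounts"])
    excess = max(0, observed - permitted)
    # Complaints (e.g., overcrowding, illegal work) add weight to the score.
    return excess + 0.5 * building["complaint_count"]

def top_inspection_targets(buildings, k):
    """Return the Building Identification Numbers of the k highest-risk buildings."""
    ranked = sorted(buildings, key=conversion_risk, reverse=True)
    return [b["bin"] for b in ranked[:k]]
```

A real system would draw on many more signals and a trained model, but the shape is the same: translate the business question ("minimize fires") into a ranking that tells inspectors where to go first.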
We think about these agencies, whether it's the NYPD or FDNY, but then there's also questions about data that needs to get passed so that the leadership of these agencies have a better level of insight so that they can move the pieces where they need in an emergency situation. What kind of pieces? If there are downed trees, for example. Let's say there are X-number of downed trees in the city. How do we know which ones to go and remove first? Should we go and remove the downed tree because someone called it in first so it's at the top of the queue but it's in the middle of nowhere so it's not impacting traffic? Or maybe it's the people who called in 20th on the list but the tree is right in front of a building where people with disabilities live? Well, you should probably go there first. Gotcha, right. Mashariki: So there's all these things around data and information that agencies need to know. During a drill, we practice what we call "data at the speed of thought." We want people who are responding to emergencies to have data at the speed of thought. Were those conversations happening 10 years ago, 20 years ago, where the NYPD was talking to the Department of Buildings in terms of sharing data? Not at the level that we're facilitating now, no. I always bristle when I hear people say, "Aren't there silos where this agency isn't talking to that agency?" You have people who have been in government for 20, 30 years who've worked at four different agencies. And because they've worked at four different agencies they know the people to call at those agencies when they need stuff. So there is a mechanism that exists, "Oh I know that guy at the agency, let me call him," but there wasn't a framework or an infrastructure across the board. There's always one to one sharing where this agency can share with that agency, but to create an environment where all agencies can share across everyone else, no that wasn't happening. Right. 
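The downed-tree example above amounts to replacing a first-come-first-served queue with an impact-weighted one: a tree blocking traffic or in front of a building housing residents with disabilities outranks an earlier call from the middle of nowhere. A minimal sketch, with invented fields and weights:

```python
# Hypothetical sketch of impact-weighted triage for downed-tree reports.
# Field names and weights are invented for illustration.
import heapq

def impact_score(report):
    score = 0.0
    if report["blocks_traffic"]:
        score += 10.0
    if report["near_vulnerable_residents"]:  # e.g., residents with disabilities
        score += 20.0
    return score

def removal_order(reports):
    """Return report IDs, highest impact first; earlier calls break ties."""
    # heapq is a min-heap, so negate the score to pop highest impact first.
    heap = [(-impact_score(r), r["call_order"], r["id"]) for r in reports]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

With this ordering, the twentieth caller whose tree endangers vulnerable residents is dispatched before the first caller whose tree affects no one.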
Now we're talking about data-sharing but let me also add another piece in there. Let's say 10 years ago everyone was sharing data but there was something that no one was asking and that's whether or not the data was even good. That's my job to ask. Just because Agency A asks Agency B for the data and they emailed it to them in an Excel spreadsheet does it mean that that data's going to be useful? It just means that they responded. Awesome, thank you, check, you responded, but is the data useful? Is it even in a format that I can understand? So who's responsible for ensuring that the data that's being collected is quality data to begin with? Mashariki: The agencies. The agencies do a good job but then, say, the Department of Buildings is a huge agency, so the question really should be who in that agency is working to ensure that that particular data is accurate. There isn't one "data guru" who oversees all the different sets of data. So in an emergency what happens is someone from City Hall calls the head of that agency. Someone from City Hall doesn't know the middle manager who oversees any particular dataset, but the head of that agency will, or can call on the Chief Information Officer to find that person. At that point you've already reached the threshold of how many people you should contact to get something out. So the overall question for us is, in an emergency situation what's the process for getting the right and best data? My sense is if you gave an agency a couple of days to find the best, highest quality data on any given topic they can do that, but if you give them 30 seconds are you going to get that? So it sounds like we're limited to moving at the speed of people? It's not so much, data flying everywhere, ahh! It's more, OK, who knows this and how quickly can we find this person? That's exactly right. We lead with people and not data. It's all about whom you engage, how you engage them, and what tools you give them with which to respond. 
It's something I learned during my White House fellowship: government is well-equipped with the right people but we have to build out processes to give them the tools to be able to respond successfully. And that doesn't seem all that different from any other large organization. You often have the right people but they don't have the tools, or even know where to get the tools, to be as efficient and productive as they fully have the ability to be. Absolutely. So let's step back a bit. What kind of data does the city actually collect? I assume there's the usual things like crime statistics, car accidents, but what else? That's a big question so here's how I'll answer it. If you go to our Open Data Portal we have over 1,700 datasets and the next highest city has like 800, maybe 900. And we've only just begun. As for specifics: location, location, location. Almost every single dataset that we have in the city revolves around location. Some of our bigger data agencies are the Department of Finance because they manage information around taxes and land use. The Department of Buildings has building information. Every building in New York City has a Building Identification Number, and that BIN corresponds to a host of characteristics. Like you said, the NYPD collects crime statistics so that's a big dataset. The Department of Sanitation collects data about snow plow routing. The Department of Homeless Services sends us nightly the aggregate number of people in homeless shelters. What we don't get for all sorts of privacy reasons is data from the Department of Health, at least that we release to the Open Data Portal. But I would say buildings data is probably your most expansive dataset but that's primarily because buildings cut across all sorts of different agencies: Department of Finance for tax data, obviously the Department of Buildings, you've got Housing Preservation and Development, you've got NYPD that stores data. 
What sort of data is there surrounding transportation? I know everyone hates the subway but de Blasio has taken great pains to remind people that the city doesn't control the subway [a state agency does]. I'm just wondering what role does data play in getting folks from Point A to Point B as quickly and as safely as possible? We have CitiBike locations. We get TLC [Taxi and Limousine Commission] locations, that's a big dataset. We've got Department of Transportation data in terms of crashes and fatalities. We also have a partnership with Waze where we get some data from them that can be useful. DOT also has cameras where they can track traffic. It's funny you say Waze because I'm wondering if there are any other startups or private companies either working with the city to improve the data or improve how it's accessed? My office did a project called Business Atlas where we took federal, state, and city data, things like liquor license data and a bunch of Department of Consumer Affairs data, and we created a view of businesses such that you could get a better sense of market research if you were trying to open up a business in an area. You'd get a sense of median income, median age, what businesses existed in that area, and so on and so forth. We worked with a startup called PlaceMeter that uses sensors to measure foot traffic information around that area. So we created this map where you could put in an address and immediately get a sense of what the business environment is like in that area. We also have strong partnerships with academia. So the theme of this event is the future. Things are whatever they are today, but how can we make things better tomorrow? When you leave your role as chief analytics officer, what would you like your legacy to be? That's an easy answer. Two things. 
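At its core, the Business Atlas project described above is a location join: several per-area datasets merged into one profile a prospective business owner can look up. The sketch below invents the dataset shapes and field names purely for illustration; the actual Business Atlas data model is not public in this form.

```python
# Hypothetical sketch of the Business Atlas location join: merge
# per-neighborhood datasets into one lookup table. Dataset shapes and
# field names are invented for illustration.

def build_atlas(demographics, licenses, foot_traffic):
    """Merge per-area datasets into one market-research profile per area."""
    atlas = {}
    for area, stats in demographics.items():
        atlas[area] = {
            "median_income": stats["median_income"],
            "median_age": stats["median_age"],
            # Count of liquor licenses on file for this area, if any.
            "liquor_licenses": len(licenses.get(area, [])),
            # Sensor-derived daily foot traffic, defaulting to 0 if unmeasured.
            "daily_foot_traffic": foot_traffic.get(area, 0),
        }
    return atlas
```

The point of the design is that each source agency keeps publishing its own dataset in its own shape; the join layer is what turns "an address" into "a sense of the business environment."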
One is, I've already said that my job is to grow the competency of the Mayor's Office of Data Analytics [MODA] in the short term such that we can lessen its role across the city in the long term. And the concept behind that is, creating a culture will be more sustainable than saying, "As long as there's a MODA these things will get done." Because one thing we know about government is, there may not be a MODA depending on who gets elected years down the line. So my job is not to ensure that MODA maintains this leadership role and is always this hub of analytics excellence. My job is to spread analytics excellence across the city within these agencies. So it almost sounds like a successful MODA under your leadership is one that doesn't even really need to exist. That's exactly right. And the second is, oversight of our open data strategy. When I came in we published this vision called Open Data for All. And Open Data for All says that, yes, the city has this data that we want to release to New Yorkers, but it's not useful if only a small, statistically minded few can use it. We need to have it so that all New Yorkers from all walks of life know that this exists, know how to use it, and, not only that, actually use it to their advantage. We want it so that all New Yorkers can say, "Having access to this data can do things like help me start a business or can help me figure out a smart way to engage with my community." A strong implementation of Open Data for All and a growing understanding that everyone in New York—everyone—should have access to the resources of the city.


News Article | May 8, 2017
Site: www.eurekalert.org

In experiments with human colon cancer cells and mice, a team led by scientists at the Johns Hopkins Kimmel Cancer Center say they have evidence that cancer arises when a normal part of cells' machinery generally used to repair DNA damage is diverted from its usual task. The findings, if further studies confirm them, could lead to the identification of novel molecular targets for anticancer drugs or tests for cancer recurrence, the investigators say. Scientists have long known that chronic inflammation, a risk factor for cancer, can damage DNA. They've also known that cancer cells' ability to spread is in part due to so-called "epigenetic" factors that sabotage the ability of genes to turn on or off when they should. In their new study, described in the May 8 issue of Cancer Cell, the scientists uncovered a link between those two phenomena by turning their attention to a protein known as CHD4, short for chromodomain helicase DNA-binding protein 4. The protein is associated with DNA damage repair. The researchers, principally Limin Xia, a postdoctoral fellow in the laboratory of Stephen B. Baylin, M.D., the Virginia and D.K. Ludwig Professor of Oncology and Medicine and associate director for research programs at the Kimmel Cancer Center, designed a series of experiments to determine how the CHD4 protein repairs DNA damage. First, the researchers exposed human colon cancer cells in the laboratory to hydrogen peroxide, which damages DNA through an inflammatory-like process, namely the generation of negatively charged, highly reactive molecules called reactive oxygen species (ROS). The experiments showed that CHD4 was present at the DNA damage sites within minutes of exposure to hydrogen peroxide, and was soon accompanied by a "repair crew" of other proteins, composed in part of DNA methyltransferases, proteins that place methyl groups on genes to "silence," or turn them off. Then, the research team used a laser beam to cause DNA damage in the colon cancer cell lines. 
Again, CHD4 and its crew of repair proteins swooped into the damage site. "This result suggests that the presence of CHD4 and its accompanying proteins may be part of a universal system for repairing DNA damage," says Baylin. Adding support to that idea, he says, when the team stopped cells from making CHD4 by genetically disrupting the gene, the accompanying proteins were no-shows after exposure to hydrogen peroxide or the laser. Presumably, Baylin says, the mechanism exists to shut down genes in damaged regions while cells repair DNA. However, he says, the repair team may stick around in some genes, keeping them turned off even after DNA repair is finished. The type of gene that is kept turned off could be linked to cancer, notes the team. The researchers found that the eight genes most likely to be already methylated, and thus turned off, in colon cancer cells are thought to be potential tumor suppressors. Further investigation showed that these genes were also already enriched with CHD4. When researchers prevented cells from making CHD4, these genes lost their methylation and became reactivated, able to produce proteins that prevented the spread of cancer cells. Checking the Cancer Genome Atlas, a database funded by the National Institutes of Health that catalogs genetic mutations thought to be responsible for cancer, the investigators found that a significant subset of colon, lung and other cancers -- between 30 and 40 percent -- had much higher levels of CHD4 than healthy tissues. Curious as to how CHD4 is drawn to damaged DNA, Xia, Baylin and their colleagues looked for other factors that might attract CHD4 to the DNA damage site. They found that CHD4 interacts directly with an enzyme called 8-oxoguanine glycosylase (OGG1), which removes guanine -- one of the units that makes up DNA -- when it becomes damaged. When the researchers removed this enzyme from cells, CHD4 failed to arrive at sections of damaged DNA. 
When the researchers color-stained the DNA of colon cancer cells to find the most likely locations of OGG1, they found it at the locations of the eight tumor suppressor genes that are often turned off when cancer occurs. Finally, the researchers performed a series of experiments to examine the behavior of two sets of colon cancer cells: one set with a characteristically high amount of CHD4, and one in which the researchers had used genetic techniques to reduce levels of this protein. The researchers found that the unmodified colon cancer cells readily moved around in petri dishes, penetrated other cell membranes there, and migrated from one area to another in live mice to create new tumors -- the hallmarks of metastases. However, when the researchers tried the same experiments with the cells in which CHD4 had been knocked down, the cells had lost all of these characteristic cancer cell abilities. "Taken together," Baylin says, "our experiments suggest that CHD4 and the resulting methylation is a really important phenomenon associated with the cause of colon and probably many other cancer types." Consequently, he says, finding ways to reduce the amount of CHD4 in tumors could be one way to treat cancer. Additionally, he says, tracking high levels of OGG1, which attracts CHD4, might be useful for doctors to gauge risk of cancer recurrence. Other researchers who participated in this study include Yi Cai, Yang W. Zhang, Huili Li, Cynthia A. Zahnow, Wenbing Xie, and Ray-Whay Chiu Yen of Johns Hopkins, and Feyruz V. Rassool from the University of Maryland Greenebaum Comprehensive Cancer Center. The research was supported by grants from the National Institutes of Health's National Institute of Environmental Health Sciences (RO1 ES011858), the Hodson Trust, and the National Natural Science Foundation of China (No.81522031, No.81272652). The study was also supported in part by internal funds from the National Center for Toxicological Research, U.S. Food and Drug Administration.


News Article | May 10, 2017
Site: www.eurekalert.org

Using gene sequencing tools, scientists from Johns Hopkins Medicine and the University of British Columbia have found a set of genetic mutations in samples from 24 women with benign endometriosis, a painful disorder marked by the growth of uterine tissue outside of the womb. The findings, described in the May 11 issue of the New England Journal of Medicine, may eventually help scientists develop molecular tests to distinguish between aggressive and clinically "indolent," or non-aggressive, types of endometriosis. "Our discovery of these mutations is a first step in developing a genetics-based system for classifying endometriosis so that clinicians can sort out which forms of the disorder may need more aggressive treatment and which may not," says Ie-Ming Shih, M.D., Ph.D., the Richard W. TeLinde Distinguished Professor in the Department of Gynecology & Obstetrics at the Johns Hopkins University School of Medicine and co-director of the Breast and Ovarian Cancer Program at the Johns Hopkins Kimmel Cancer Center. Endometriosis occurs when tissue lining the uterus forms and grows outside of the organ, most often into the abdomen. The disease occurs in up to 10 percent of women before menopause, and in up to half of women with abdominal pain and infertility problems. In the 1920s, the Johns Hopkins-trained gynecologist John Sampson first coined the term "endometriosis" and proposed the idea that endometriosis resulted when normal endometrial tissue spilled out through the fallopian tubes into the abdominal cavity during menstruation. The new study, Shih says, challenges that view. The presence of the unusual set of mutations they found in their tissue samples, he says, suggests that while the origins of endometriosis are rooted in normal endometrial cells, acquired mutations changed their fate. For reasons the researchers say are not yet clear, the mutations they identified have some links to genetic mutations found in some forms of cancer. 
They emphasize that although abnormal tissue growth in endometriosis often spreads throughout the abdominal cavity, the tissue rarely becomes cancerous except in a few cases when ovaries are involved. For the study, Shih and his colleagues sequenced -- or figured out the genetic alphabet of -- a part of the genome known as the exome, which contains all of the genes that can be expressed and make proteins. Specifically, they sequenced the exome of both normal tissue and endometriosis tissue removed during laparoscopic biopsies on 24 women, some with more than one abnormal endometrial growth. All had deep infiltrating endometriosis, the type that typically causes pain and infertility. Seven of the 24 women were from Japan; the rest were patients at Lenox Hill Hospital-Northwell Health in New York City. Samples from Japanese women were included because endometriosis before menopause occurs more often in Asian women (13-18 percent) than in Caucasian women (6-10 percent), Shih says. The scientists looked for mutations, or abnormal changes in the DNA, and filtered out normal variations in genes that commonly occur among humans. Of the 24 women, 19 had one or more mutations in their endometriosis tissue that were not present in their normal tissue. The type and number of mutations varied per endometriosis lesion and between each of the women. The most common mutations, found in five of the women, occurred in genes including ARID1A, PIK3CA, KRAS and PPP2R1A, all known for controlling cell growth, cell invasion and DNA damage repair. Mutations in these genes have been associated with one of the deadliest types of ovarian cancer, called clear cell carcinoma. Nickolas Papadopoulos, Ph.D., professor of oncology and pathology at the Johns Hopkins Kimmel Cancer Center, led the team that completed the first sequencing of the clear cell ovarian cancer genome in 2010. 
"We were surprised to find cancer-linked genes in these benign endometriosis samples because these lesions do not typically become cancer," says Papadopoulos, whose Ludwig Center laboratories performed the sequencing. "We don't yet understand why these mutations occur in these tissues, but one possibility is that they could be giving the cells an advantage for growth and spread." In an additional group of endometriosis samples biopsied from 15 women at the University of British Columbia, the scientists looked specifically for mutations in the KRAS gene, whose expression signals proteins that spur cell growth and replication. They found KRAS mutations in five of the 15 patients. The scientists make clear that their sequencing studies may have missed mutations in some of the samples. Their data do not at this point reveal the aggressiveness of the lesions. However, Shih says, he and his team are working on additional studies to determine if the mutations correlate with patients' outcomes. He says a molecular test that sorts lesions as more or less aggressive has the potential to help doctors and patients decide how to treat and monitor the progression and control of the disease. "We may also be able to develop new treatments for endometriosis that use agents that block a gene-related pathway specific to a person's disease," says Shih. Women with endometriosis are typically prescribed anti-hormonal treatments that block estrogen to shrink lesions. When the disease occurs in the ovaries and forms a large cyst, which increases the risk of developing ovarian cancer, the lesion is usually surgically removed. Other scientists involved in the research include M.S. Anglesio, A. Ayhan, T.M. Nazeran, M. Noë, H.M. Horlings, A. Lum, S. Jones, J. Senz, T. Seckin, J. Ho, R.-C. Wu, V. Lac, H. Ogawa, B. Tessier?Cloutier, R. Alhassan, A. Wang, Y. Wang, J.D. Cohen, F. Wong, A. Hasanovic, N. Orr, M. Zhang, M. Popoli, W. McMahon, L.D. Wood, A. Mattox, C. Allaire, J. Segars, C. 
Williams, C. Tomasetti, N. Boyd, K.W. Kinzler, C.B. Gilks, L. Diaz, T.-L. Wang, B. Vogelstein, P.J. Yong, and D.G. Huntsman. Funding for the studies was provided by the Richard W. TeLinde Gynecologic Pathology Research Program at The Johns Hopkins University, the Virginia and D.K. Ludwig Fund for Cancer Research, the Ephraim and Wilma Shaw Roseman Foundation, the Endometriosis Foundation of America, the National Institutes of Health and National Cancer Institute (grants P50-CA62924, CA06973, GM07184, GM07309, CA09243, CA57345, P30-CA006973, CA215483, and UO1-CA200469), the Gray Family Ovarian Clear Cell Carcinoma Research Resource, the Canadian Cancer Society (grant 701603), the Canadian Institutes of Health Research (IHD-137431 and MOP-142273), the Canadian Foundation for Innovation (John R. Evans Leaders Fund) and British Columbia Knowledge Development Fund, the Women's Health Research Institute (Nelly Auersperg Grant), and the Canadian Foundation for Women's Health (General Research Grant), the BC Women's Hospital and Health Centre Foundation, The BC Cancer Foundation and the VGH and UBC Hospital Foundation, David and Darrell Mindell, Peter and Shelley O'Sullivan, the Jemini Foundation, the Vancouver Coastal Health Research Institute, the Dr. Chew Wei Memorial Professorship in Gynecologic Oncology, the Canada Research Chairs program (Research Chair in Molecular and Genomic Pathology), and the Dutch Cancer Society translational research fellowship (KWF 2013-5869).


Grant
Agency: Department of Defense | Branch: Navy | Program: STTR | Phase: Phase I | Award Amount: 80.00K | Year: 2013

We propose a solution to the Navy's laser weapons warning problem. The proposed solution employs a combined approach of extending the wavelength spectrum and dynamic range of current state-of-the-art laser warning systems. The proposed wide optical bandwidth will be achieved by adapting previously developed methods. The system will exploit ongoing design work created for visible multi-threat optical systems and leverage advances in focal plane array (FPA) technologies. The optical bandwidth will require multiple diffraction gratings integrated with reflective wide field-of-view (FOV) optics. FPA enhancements will address anticipated dynamic range requirements.
