News Article | February 16, 2017
EAST LANSING, Mich. --- Black girls are disproportionately punished in American schools - an "overlooked crisis" that is populating the school-to-prison pipeline at rising rates, two education scholars argue in a new paper. Dorinda Carter Andrews, associate professor at Michigan State University, and Dorothy Hines-Datiri, assistant professor at the University of Kansas and former doctoral student at MSU, cite various examples of black girls in elementary school being handcuffed and taken away in police cars for classroom disruptions such as temper tantrums. These zero tolerance policies unfairly target students of color and should be abolished, Carter Andrews said. But while a wealth of research and public discussion has focused on black male students, little attention has been paid to the mistreatment of black girls in U.S. classrooms, she said. "Zero tolerance constructs these young girls as criminals," Carter Andrews said. "It's a criminalization of their childhood, and it's a very prison-type mentality for schools to take." The paper, which appears online in the journal Urban Education, notes that zero tolerance is defined as a form of school discipline that imposes removal from school for an array of violations, from violence to truancy to dress code violations. Black students are two to three times more likely to be suspended than white students and are overrepresented in office referrals, expulsions and corporal punishment, the paper says. Black female students in the United States receive out-of-school suspensions at higher rates (12 percent) than female students across all other racial and ethnic categories, according to the U.S. Department of Education Office for Civil Rights. Only black boys (20 percent) and American Indian/Alaska native boys (13 percent) have higher suspension rates than black girls. 
Black girls are also more likely to receive harsher discipline than their white peers for minor offenses, such as talking back to the teacher, Carter Andrews said. "The research shows that teachers and other adults may give a pass to certain students for the ways in which they talk back," she said. "Teachers may view some girls, particularly African-American girls, as attitudinal or aggressive, even though they may be using the same talk-back language as a white female student." In addition to the abolishment of zero tolerance policies, the researchers call for the establishment of culturally responsive professional-development training for educators that would raise their awareness of the experiences of girls of color. "We cannot afford to have more black girls' identities snuffed out by disciplinary policies and ultimately the educational and criminal justice systems," the study says.
News Article | February 27, 2017
Paul and Hannah Catlett have been named one of the "7 Springfield Power Couples". Studio 417 Salon in Springfield, MO, belongs to Paul and Hannah Catlett, who met each other at the very hair salon that they now own. The pair were recently listed as one of the “7 Springfield Power Couples You Need To Meet” by the Murney Blog. They are a longstanding couple that does everything together and puts family above all else. They give back to the community by hiring locally and sharing their secrets of success in business and marriage. The salon has received multiple forms of recognition for its above-average service and its myriad of top-notch products, including at least three awards that label it as the best in its field. Missouri State University's "The Standard" dubbed it the best at MSU, the Springfield News-Leader gave it a "Best of the Ozarks" award, and 417 Magazine awarded it the title of “best salon”. Studio 417 offers a broad range of services for both men and women. Customers can visit to get their hair or nails done for a special occasion such as a wedding or a banquet. The salon offers makeup applications, nail services, haircuts, perms, highlights, extensions, wigs, and eyebrow services. The team is made up of men and women with colorful personalities, friendly demeanors, unique backgrounds, and a deep sense of artistic prowess. Studio 417 Salon has been serving the residents of Springfield for over 16 years, and continues to grow in its professionalism and expand the scope of its work. To see the results of their work, one has to look no further than the many 5-star reviews they have received, as well as the extensive portfolio they creatively post on Instagram. The salon is open Monday through Friday from 8 a.m. to 8 p.m., Saturday from 8 a.m. to 6 p.m., and is closed on Sundays.
Those interested in being served by the best can schedule appointments by calling 417-866-6455, booking online at www.studio417salon.com, or stopping by the salon's location in Farmer’s Park, at 2144 E Republic Road, Ste A104. Any questions can be directed to email@example.com.
News Article | February 15, 2017
EAST LANSING, Mich. -- Engineering researchers at Michigan State University have developed the first stretchable integrated circuit that is made entirely using an inkjet printer, raising the possibility of inexpensive mass production of smart fabric. Imagine: an ultrathin smart tablet that can be stretched easily from mini-size to extra large. Or a rubber band-like wrist monitor that measures one's heartbeat. Or wallpaper that turns an entire wall into an electronic display. These are some of the potential applications of the stretchable smart fabric developed in the lab of Chuan Wang, assistant professor of electrical and computer engineering. And because the material can be produced on a standard printer, it has a major potential cost advantage over current technologies that are expensive to manufacture. "We can conceivably make the costs of producing flexible electronics comparable to the costs of printing newspapers," said Wang. "Our work could soon lead to printed displays that can easily be stretched to larger sizes, as well as wearable electronics and soft robotics applications." The smart fabric is made up of several materials fabricated from nanomaterials and organic compounds. These compounds are dissolved in solution to produce different electronic inks, which are run through the printer to make the devices. From the ink, Wang and his team have successfully created the elastic material, the circuit and the organic light-emitting diode, or OLED. The next step is combining the circuit and OLED into a single pixel, which Wang estimates will take one to two years. There are generally millions of pixels just underneath the screen of a smart tablet or a large display. Once the researchers successfully combine the circuit and OLED into a working pixel, the smart fabric can be potentially commercialized. Conceivably, Wang said, the stretchable electronic fabric can be folded and put in one's pocket without breaking. 
This is an advantage over current "flexible" electronics material technology that cannot be folded. "We have created a new technology that is not yet available," Wang said. "And we have taken it one big step beyond the flexible screens that are about to become commercially available." The groundbreaking discovery of the ink-fabricated stretchable circuitry was published recently in the journal ACS Nano. Wang's co-researchers were Le Cai, Suoming Zhang and Jinshui Miao of MSU and Zhibin Yu of Florida State University.
News Article | March 2, 2017
EAST LANSING, Mich. - A Michigan State University researcher has received a $1.65 million grant to build a better understanding of fertility treatments in women by studying the effect of hormones on ovulation and reproduction in cows. "Cattle are a useful model because they have a relatively long reproductive cycle similar to women and they ovulate a single egg at the end of each cycle," said James Ireland, a professor of reproductive physiology. "Plus, a cow with a smaller egg reserve typically doesn't respond to fertility methods as well as cattle who have more eggs stored, a phenomenon women often experience too." With funding from the National Institutes of Health and the United States Department of Agriculture, Ireland will lead the five-year study with Keith Latham, co-director of the Reproductive and Developmental Sciences Program at MSU. Richard Leach, chair of MSU's Department of Obstetrics, Gynecology and Reproductive Biology, will also contribute to the project. Although many fertility techniques used today were developed using cows as a model, Ireland and his research team are the first to try to establish how increased doses of a certain fertility hormone given to women during in vitro fertilization can positively or negatively affect live birth rates. Follicle stimulating hormone, or FSH, is produced by the pituitary gland and controls the ovaries in women and testes in men. It's essential for reproduction and physicians often use it to stimulate as many follicles as possible in a woman's ovaries, so a larger number of eggs can be recovered for IVF treatment. Ireland said that evaluating the impact and mechanisms of excess FSH levels on ovarian function and egg quality could lead to developing better assisted reproductive technologies in the future, something the team will also try to accomplish as part of its research. 
According to 2014 data reported by the Centers for Disease Control and Prevention, 33 percent of women who actually went through fertility treatments using their own eggs were able to get pregnant but only 27 percent had a live birth. "If we can improve the fertility response rate of cows that have these small ovarian reserves, our findings could be useful for clinicians to use and may eventually lead to more successful pregnancies ending in live births in women," Ireland said. Michigan State University has been working to advance the common good in uncommon ways for more than 150 years. One of the top research universities in the world, MSU focuses its vast resources on creating solutions to some of the world's most pressing challenges, while providing life-changing opportunities to a diverse and inclusive academic community through more than 200 programs of study in 17 degree-granting colleges.
News Article | January 6, 2017
A cure for skin cancer could soon be on the horizon after researchers from Michigan State University (MSU) developed a new chemical compound that can significantly reduce the spread of melanoma cells in the body. Melanoma continues to be one of the deadliest forms of cancer in the United States. Data from the Centers for Disease Control and Prevention shows that 71,943 Americans were diagnosed with the malignancy in 2013. Of these, 9,394 patients later died from the disease. This type of skin cancer often develops when skin cells are damaged because of overexposure to ultraviolet radiation from sunlight or tanning beds. The cells then mutate into rapidly multiplying cancer cells and form malignant tumors on the skin. If melanomas are diagnosed and treated early on, there is a chance that they can still be cured. However, those left untreated can advance and easily spread to other parts of the body, making them more life-threatening and difficult to root out. To help stop the development of melanoma, MSU pharmacology professor Richard Neubig and his colleagues developed a small-molecule chemical compound that can prevent cancer cells from multiplying. According to the researchers, the new treatment focuses on stopping the ability of genes to produce proteins and RNA molecules used by melanoma tumors to further develop and spread cells to other parts of the body. Neubig explained that it was difficult for them to create a small-molecule drug that could halt the gene activity responsible for allowing melanomas to develop. The chemical compound they came up with was the same one they intended to use to treat another illness called scleroderma. People who suffer from scleroderma experience a hardening of tissues, such as those in the heart, lungs, kidneys, and skin. Scientists discovered that the mechanism behind fibrosis, or thickening of the skin, in scleroderma patients is similar to the one that helps spread cancer. 
Neubig and his team used the chemical compound to treat laboratory mice injected with human melanoma cells. They discovered that the drug was able to reduce the migration of malignant cells by as much as 85 to 90 percent. With small-molecule drugs making up 90 percent of medications currently available on the market, study co-author Kate Appleton said their work could lead to the development of a highly effective treatment for skin cancer. Appleton pointed out that melanoma is so deadly because it can quickly spread its malignant cells all over the body, where they attack other vital organs such as the lungs and brain. The MSU researchers believe it is important to identify the correct pathway that malignant cells use to spread throughout the body, so that they can develop their chemical compound further. This would then allow them to determine which melanoma patients would benefit the most from their treatment. Appleton said the ability of their chemical compound to stop the development and spread of melanoma cells is stronger when they are able to activate the correct pathway. The findings of the study are featured in the journal Molecular Cancer Therapeutics.
News Article | February 21, 2017
EAST LANSING, Mich. -- A Michigan State University breast cancer researcher has shown that effective treatment options can be predicted based on the way certain breast cancer genes act or express themselves. The research, published in the journal Oncogene, offers up proof that gene expression patterns can help direct the type of therapy a patient might receive, paving the way for more targeted and personalized approaches to care. The National Institutes of Health and the Susan G. Komen Foundation funded the study. "Breast cancer has numerous subtypes," said Eran Andrechek, a physiology professor in the College of Human Medicine. "Treatments for these various subtypes have to be different because there are different genes that drive the cancer." Estrogen- or progesterone-receptor positive breast cancer, where hormones drive cancer growth, is one subtype. Other subtypes include human epidermal growth factor receptor 2, or HER2, which is a protein that also promotes the development of the disease, and triple-negative breast cancer, or TNBC. This cancer type isn't driven by either the HER2 protein or hormone receptors and is the one that Andrechek focused on in his study. His research, also led by doctoral student Jing-Ru Jhan, first examined the unique genetic characteristics and differences within each TNBC tumor. Then Andrechek's team took the genomic information they gathered and compared it to various drugs that could target the specific tumor activity. "Triple-negative breast cancer is highly aggressive and currently there are limited treatment options," Andrechek said. "By looking at the particular gene expression patterns of this cancer and determining the pathways that were activated, or turned on, we identified certain drugs that could turn these pathways off and stop tumor growth." 
Andrechek's study discovered that a three-drug combination, including the two FDA-approved drugs Afatinib and Trametinib, targeted a specific pathway associated with triple-negative breast cancer and was effective at stopping the cancer's growth. Currently, both drugs are commonly used for other types of cancers. Andrechek said his proof-of-concept study is a positive first step in determining the feasibility of this type of treatment approach. "We tested several other drug combinations too and when we expanded our study to include human breast cancers that were grown in mice, we received the same positive result," Andrechek said. "This gives us a much clearer indication that targeted, individualized breast cancer treatment is viable."
News Article | February 18, 2017
Cosmic detonations on the scale of classical novae - massive stellar explosions on the surface of a white dwarf star - and larger created many of the atoms in our bodies, says Michigan State University's Christopher Wrede, who presented at the American Association for the Advancement of Science meeting. A safe way to study these events in laboratories on Earth is to investigate the exotic nuclei or "rare isotopes" that influence them. "Astronomers observe exploding stars and astrophysicists model them on supercomputers," said Wrede, assistant professor of physics at MSU's National Superconducting Cyclotron Laboratory. "At NSCL and, in the future at the Facility for Rare Isotope Beams, we're able to measure the nuclear properties that drive stellar explosions and synthesize the chemical elements - essential input for the models. Rare isotopes are like the DNA of exploding stars." Wrede's presentation explained how rare isotopes are produced and studied at MSU's NSCL, and how they shed light on the evolution of visible matter in the universe. "Rare isotopes will help us to understand how stars processed some of the hydrogen and helium gas from the Big Bang into elements that make up solid planets and life," Wrede said. "Experiments at rare isotope beam facilities are beginning to provide the detailed nuclear physics information needed to understand our origins." In a recent experiment, Wrede's team investigated stellar production of the radioactive isotope aluminum-26 present in the Milky Way. An injection of aluminum-26 into the nebula that formed the solar system could have influenced the amount of water on Earth. Using a rare isotope beam created at NSCL, the team determined the last unknown nuclear-reaction rate affecting the production of aluminum-26 in classical novae. They concluded that up to 30 percent could be produced in novae, and the rest must be produced in other sources like supernovae. 
Future research can now focus on counting the number of novae in the galaxy per year, modeling the hydrodynamics of novae and investigating the other sources in complete nuclear detail. To extend their reach to more extreme astrophysical events, nuclear scientists are continuing to improve their technology and techniques. Traditionally, stable ion beams have been used to measure nuclear reactions. For example, bombarding a piece of aluminum foil with a beam of protons can produce silicon atoms. However, exploding stars make radioactive isotopes of aluminum that would decay into other elements too quickly to make a foil target out of them. "With FRIB, we will reverse the process; we'll create a beam of radioactive aluminum ions and use it to bombard a target of protons," Wrede said. "Once FRIB comes online, we will be able to measure many more of the nuclear reactions that affect exploding stars."
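The kinematics reversal Wrede describes can be written out explicitly. The specific reactions below are illustrative examples chosen to balance charge and mass, not a claim about which reactions the team measured. In normal kinematics, a proton beam strikes a stable aluminum foil:

\[
\mathrm{^{27}_{13}Al} + \mathrm{^{1}_{1}H} \;\rightarrow\; \mathrm{^{28}_{14}Si} + \gamma
\]

In inverse kinematics, a beam of short-lived aluminum ions (here aluminum-26, which decays too quickly to be made into a foil) strikes a hydrogen (proton) target:

\[
\mathrm{^{26}_{13}Al} + \mathrm{^{1}_{1}H} \;\rightarrow\; \mathrm{^{27}_{14}Si} + \gamma
\]

In both cases the nucleon totals balance (27 + 1 = 28 and 26 + 1 = 27, with charge 13 + 1 = 14, i.e. silicon); the underlying reaction physics is identical, and only the roles of beam and target are exchanged.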
News Article | February 18, 2017
EAST LANSING, Mich. - Imagine being able to view microscopic aspects of a classical nova, a massive stellar explosion on the surface of a white dwarf star (about as big as Earth), in a laboratory rather than from afar via a telescope. Cosmic detonations of this scale and larger created many of the atoms in our bodies, says Michigan State University's Christopher Wrede, who presented at the American Association for the Advancement of Science meeting. A safe way to study these events in laboratories on Earth is to investigate the exotic nuclei or "rare isotopes" that influence them. "Astronomers observe exploding stars and astrophysicists model them on supercomputers," said Wrede, assistant professor of physics at MSU's National Superconducting Cyclotron Laboratory. "At NSCL and, in the future at the Facility for Rare Isotope Beams, we're able to measure the nuclear properties that drive stellar explosions and synthesize the chemical elements - essential input for the models. Rare isotopes are like the DNA of exploding stars." Wrede's presentation explained how rare isotopes are produced and studied at MSU's NSCL, and how they shed light on the evolution of visible matter in the universe. "Rare isotopes will help us to understand how stars processed some of the hydrogen and helium gas from the Big Bang into elements that make up solid planets and life," Wrede said. "Experiments at rare isotope beam facilities are beginning to provide the detailed nuclear physics information needed to understand our origins." In a recent experiment, Wrede's team investigated stellar production of the radioactive isotope aluminum-26 present in the Milky Way. An injection of aluminum-26 into the nebula that formed the solar system could have influenced the amount of water on Earth. Using a rare isotope beam created at NSCL, the team determined the last unknown nuclear-reaction rate affecting the production of aluminum-26 in classical novae. 
They concluded that up to 30 percent could be produced in novae, and the rest must be produced in other sources like supernovae. Future research can now focus on counting the number of novae in the galaxy per year, modeling the hydrodynamics of novae and investigating the other sources in complete nuclear detail. To extend their reach to more extreme astrophysical events, nuclear scientists are continuing to improve their technology and techniques. Traditionally, stable ion beams have been used to measure nuclear reactions. For example, bombarding a piece of aluminum foil with a beam of protons can produce silicon atoms. However, exploding stars make radioactive isotopes of aluminum that would decay into other elements too quickly to make a foil target out of them. "With FRIB, we will reverse the process; we'll create a beam of radioactive aluminum ions and use it to bombard a target of protons," Wrede said. "Once FRIB comes online, we will be able to measure many more of the nuclear reactions that affect exploding stars." MSU is establishing FRIB as a new scientific user facility for the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. Under construction on campus and operated by MSU, FRIB will enable scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security and industry. Project completion is expected in 2022, with the project team managing to early completion in fiscal year 2021.
News Article | March 2, 2017
Gavin has already discussed John Christy’s misleading graph earlier in 2016. However, since the end of 2016 there has been a surge of interest in this graph in Norway among people who try to diminish the role of anthropogenic global warming. I think this graph warrants some extra comments in addition to Gavin’s points, because it is flawed on more counts than those he has already discussed. In fact, those using this graph to judge climate models reveal an elementary lack of understanding of climate data.
Different types of numbers
The upper left panel in Fig. 1 shows that Christy compared the average of 102 climate model simulations with temperature from satellite measurements (average of three different analyses) and weather balloons (average of two analyses). This is a flawed comparison because it compares a statistical parameter with a variable. Parameters, such as the mean (also referred to as the ‘average’) and the standard deviation, describe the statistical distribution of a given variable. However, such parameters are not equivalent to the variable they describe. The comparison between the average of model runs and observations is surprising, because it is clearly incorrect from elementary statistics (this is a similar statistical confusion to the flaw found in Douglass et al. (2007)). I can illustrate this with an example: Fig. 2 shows 108 different model simulations of the global mean temperature (from the CMIP5 experiment). The thick black line shows the average of all the model runs (the ‘multi-model ensemble’). None of the individual runs (coloured thin curves) match the mean (thick black curve), and if I were to use the same logic as Christy, I could incorrectly claim that the average is inconsistent with the individual runs because of their different characters. But the average is based on all these individual runs. Hence, this type of logic is obviously flawed. 
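The point about ensemble means can be illustrated with a toy calculation. The series below are synthetic, generated on the spot, not actual CMIP5 output: every individual run wanders around a shared trend, so the multi-model mean is much smoother than any single run, and no individual realization should be expected to match it.

```python
import random

random.seed(42)

n_runs, n_years = 20, 30
trend = 0.02  # shared warming trend, degrees per year (illustrative)

# Synthetic "model runs": shared trend plus run-specific noise,
# standing in for internal variability in each simulation.
runs = [
    [trend * year + random.gauss(0, 0.15) for year in range(n_years)]
    for _ in range(n_runs)
]

# Multi-model ensemble mean: average across runs for each year.
ensemble_mean = [
    sum(run[year] for run in runs) / n_runs for year in range(n_years)
]

def variance(series):
    m = sum(series) / len(series)
    return sum((x - m) ** 2 for x in series) / len(series)

# Detrended variability: the averaging cancels the run-specific noise,
# so the ensemble mean has far less spread than any single run. Judging
# one observed realization against the mean ignores this difference.
mean_var = variance([t - trend * y for y, t in enumerate(ensemble_mean)])
run_vars = [variance([t - trend * y for y, t in enumerate(run)]) for run in runs]

print(mean_var < min(run_vars))  # prints True
```

The observations are a single realization of the climate, much like one of the coloured curves, which is why comparing them directly to the smooth black ensemble mean is misleading.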
To be fair, the observations shown in Christy’s graph were also based on averages, although of a small set of analyses. This does not improve the case, because all the satellite data are based on the same measurements and only differ in terms of synthesis and the choices made in the analyses (they are highly correlated, as we will see later on). By the way, one of the curves shown in Fig. 2 is observations. Can you see which? Eyeballing such curves, however, is not the proper way to compare different data, and there are numerous statistical tests to do so properly.
Different physical aspects
Christy compared temperatures estimated for the troposphere (satellites and balloons) with near-surface temperatures computed by global climate models. This must be the case because the data portal where he obtained the model results was the KNMI ClimateExplorer, and ClimateExplorer does not hold upper-air temperature stored as 3D fields (I checked this with Geert Jan van Oldenborgh). (Correction: ‘taz’ is zonally integrated temperature as a function of height, but it does not take into account the differences between land and sea. Nevertheless, this variable still does not correspond closely with those measured from satellites.) A proper comparison between the satellite temperature and the model results needs to estimate a weighted average of the temperature over the troposphere and lower stratosphere with an appropriate altitude-dependent weighting. The difference between the near-surface and tropospheric temperature matters, as the stratosphere has cooled in contrast to the warming surface.
Temperatures from satellites are also model results
It is fair to compare the satellite record with model results to explore uncertainties, but the satellite data are not the ground truth and cannot be used to invalidate the models. The microwave sounding unit (MSU), the instrument used to measure the temperature, measures light in certain wavelength bands emitted by oxygen molecules. 
An algorithm is then used to compute the air temperature consistent with the measured irradiance. This algorithm is a model based on the same physics as the models which predict that higher concentrations of CO2 result in higher surface temperatures. I wonder if Christy sees the irony in his use of satellite temperatures to dispute the effect of CO2 on the global mean temperature. It is nevertheless reassuring to see a good match between the balloon and satellite data, which suggests that the representation of the physics in both the satellite retrieval algorithm and the climate models is more or less correct.
How to compare the models with observations
The two graphs (courtesy of Gavin) below show comparisons between tropospheric mean temperatures (TMT) that are comparable to the satellite data and include a confidence interval for the ensemble rather than just the ensemble mean. This type of comparison is more consistent with standard statistical tests such as Student's t-test. The graphs also show several satellite-based analyses: Remote Sensing Systems (RSS; different versions), the University of Alabama in Huntsville (UAH; different versions), and NOAA (STAR). All these curves are so similar (highly correlated) that taking the average doesn't make much difference. According to Fig. 3, the tropospheric temperature simulated by the global climate models (from the CMIP5 experiment) increased slightly faster than the temperatures derived from the satellite measurements between 2000 and 2015, but they were not very different. The RSS temperatures gave the closest match with the global climate models. Fig. 4 shows a trend analysis for the 1979-2016 interval where the satellite-based temperature trends are shown with appropriate error bars. The trends from the satellite analyses and the model results overlap if the confidence limits are taken into consideration. 
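The kind of trend comparison behind Fig. 4 can be sketched with ordinary least squares. This is a minimal illustration on synthetic series with idealized alternating variability, not real satellite or model data; the trends and noise levels are made up for the example.

```python
import math

def ols_trend_with_ci(series, z=1.96):
    """Least-squares trend per time step with an approximate 95% confidence interval."""
    n = len(series)
    xs = list(range(n))
    xbar, ybar = sum(xs) / n, sum(series) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, series)) / sxx
    intercept = ybar - slope * xbar
    residuals = [y - (intercept + slope * x) for x, y in zip(xs, series)]
    # Standard error of the slope from the residual variance.
    se = math.sqrt(sum(r * r for r in residuals) / (n - 2) / sxx)
    return slope, (slope - z * se, slope + z * se)

def intervals_overlap(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

# Synthetic "model" and "satellite" series over 38 "years" (1979-2016):
# slightly different underlying trends plus high-frequency variability.
model = [0.020 * t + 0.1 * (-1) ** t for t in range(38)]
satellite = [0.019 * t - 0.1 * (-1) ** t for t in range(38)]

m_slope, m_ci = ols_trend_with_ci(model)
s_slope, s_ci = ols_trend_with_ci(satellite)
print(intervals_overlap(m_ci, s_ci))  # prints True
```

With realistic variability, the two trend estimates differ by less than the widths of their confidence intervals, which is the sense in which the model and satellite trends "overlap" once the confidence limits are taken into consideration.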
The story behind the upper tropospheric warming
The biggest weight in the tropospheric temperature trends comes from the tropics, because the tropics account for the largest volume of air (half of the Earth's surface area lies between 30°S and 30°N because of the sphere's geometry), and the trends are therefore sensitive to conditions around the equator. This is also where large-scale convection takes place that produces bands of high clouds (the Inter-Tropical Convergence Zone – ITCZ). Cloud formation through convection and condensation is associated with the release of latent heat and influences the temperatures (e.g. Vecchi et al., 2006). It is part of the hydrological cycle, and a slow change in the atmospheric overturning, moisture and circulation patterns is expected to have a bearing on the global tropospheric temperature trend estimates. This means that the picture is complex when it comes to global tropospheric temperature trends, because many physical processes acting on a wide range of spatial scales have an influence.
Hard evidence of misrepresentation
Despite the complicated nature of tropospheric temperatures, it is an indisputable fact that Christy's graph presents numbers with different meanings as if they were equivalent. It is really surprising to see such a basic misrepresentation in a testimony to the U.S. House Committee on Science, Space & Technology. One of the most elementary parts of science is to know what the numbers really represent and how they should be interpreted.
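The claim above that half of the Earth's surface lies between 30°S and 30°N follows directly from spherical geometry: the area between two latitudes is proportional to the difference of the sines of those latitudes, so

\[
\frac{A(30^{\circ}\mathrm{S},\,30^{\circ}\mathrm{N})}{A_{\mathrm{sphere}}}
= \frac{2\pi R^{2}\left[\sin(30^{\circ}) - \sin(-30^{\circ})\right]}{4\pi R^{2}}
= \frac{\tfrac{1}{2} + \tfrac{1}{2}}{2} = \frac{1}{2}.
\]

This is why the tropics dominate any area- or volume-weighted tropospheric average.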
News Article | February 23, 2017
EAST LANSING, Mich. -- As the number of English learners continues to grow across the nation, new research indicates these students are being treated differently depending on where they go to school. Michigan State University researchers found that schools in Texas - second only to California in total number of English learners - vary widely in how they determine if students should be reclassified as English proficient, affecting their chances of success in school and beyond. An English learner in the El Paso metropolitan area, for example, is nearly twice as likely to be reclassified by the end of seventh grade as a student performing at the same level in the Rio Grande Valley. Recent changes in federal law require all states to standardize how they identify and reclassify English learners, but Texas has had policies in place since the 1990s. "If we are seeing this amount of variation in Texas, imagine what we would see in a state where the population is newer and educators have less experience serving immigrants and English learners," said Madeline Mavrogordato, assistant professor of K-12 educational administration and lead author. At least one in 10 U.S. students is classified as an English learner, compared with one in 20 in 1990. Being reclassified is a key turning point in a student's educational trajectory, said Mavrogordato. If it occurs too early, English learners could find themselves struggling without the support services they need. If too late, students may be restricted from taking higher-level courses that would prepare them for college. Mavrogordato used state data to estimate reclassification rates for English learners throughout Texas over seven years. The study, published in Educational Evaluation and Policy Analysis, is one of the first to examine how reclassification rates vary - and to document how educators make decisions in schools. 
Mavrogordato and Rachel White, an MSU doctoral candidate in education policy, observed eight Texas elementary schools while educators conducted the annual meetings required to determine the status of English learners. They found clear differences in what happens during the meetings, how technology is incorporated into the process, what data sources are used and, ultimately, how individual students are reviewed. In one school, the meeting entailed committee members filling in assessment scores and signing forms, while another school invited each child's teacher to provide input, followed by a discussion of the relevant assessment data and how best to serve the student in the coming year. Focus group interviews showed most educators believe they are approaching the reclassification process in the same way. However, Mavrogordato says the likelihood of reclassification in different parts of the state appears to be linked to how educators understand the purpose of the policy and their role in implementation. "We need to give educators the background needed to understand the spirit of the law," said Mavrogordato. "Since they are the ones implementing policy on the ground, we need to build their understanding of why these policies are in place. Otherwise, we may end up focusing on demonstrating compliance as opposed to truly expanding educational opportunity."