Lehigh University is an American private research university located in Bethlehem, Pennsylvania. It was established in 1865 by businessman Asa Packer and has grown to include studies in a wide variety of disciplines. Its undergraduate programs have been coeducational since the 1971–72 academic year. As of 2014, the university had 4,904 undergraduate students and 2,165 graduate students. Lehigh is considered one of the twenty-four Hidden Ivies in the Northeastern United States, and is ranked 12th in the nation for college return on investment by The Wall Street Journal. The university has over 680 faculty members; awards and honors recognizing Lehigh faculty and alumni include the Nobel Prize, the Pulitzer Prize, the Fulbright Fellowship, and membership in the American Academy of Arts & Sciences and the National Academy of Sciences. The university has four colleges: the P.C. Rossin College of Engineering and Applied Science, the College of Arts and Sciences, the College of Business and Economics, and the College of Education. The College of Arts and Sciences is the largest college today, home to roughly 40 percent of the university's students. The university offers a variety of degrees, including Bachelor of Arts, Bachelor of Science, Master of Arts, Master of Science, Master of Business Administration, Master of Engineering, Master of Education, and Doctor of Philosophy. (Source: Wikipedia)
Lehigh University | Date: 2016-08-10
Compositions, compounds and methods are described for addressing both toxicity of membrane disruptive anti-microbial agents as well as poor transport of such agents across the blood-brain-barrier (BBB) via the use of molecular appendages including one or more facial amphiphiles. These molecules have in vitro anti-fungal activity that is very similar to that of the native drug but with hemolytic activity and toxicity towards mammalian cells that is greatly reduced.
Rutgers University and Lehigh University | Date: 2016-10-26
Augmented or synergized anti-inflammatory constructs are disclosed including anti-inflammatory amino acids covalently conjugated with other anti-inflammatory molecules such as nonsteroidal anti-inflammatory drugs, vanilloids and ketone bodies. Further conjugation with a choline bioisostere or an additional anti-inflammatory moiety further augments the anti-inflammatory activity.
Lehigh University | Date: 2016-09-22
The present invention includes methods of promoting single crystal growth via solid-solid transformation of an appropriate glass, while avoiding the gaseous or liquid phase. In certain embodiments, in the all-solid-state glass-to-crystal transformation of the invention, extraneous nucleation is avoided relative to crystal growth via spatially localized laser heating and optional inclusion of a suitable glass former in the composition. The ability to fabricate patterned single-crystal architecture on a glass surface was demonstrated, providing a new class of micro-structured substrate for low cost epitaxial growth and active planar devices, for example.
Lehigh University | Date: 2016-10-19
A process of preparing a glass comprising: (a) heating a mixture of precursor chemicals to a melt temperature to form a melt, the melt being characterized in that quenching the melt at or above a threshold temperature results in a spinodal phase separation, and quenching the melt below the threshold temperature results in a droplet phase separation; and (b) quenching the melt at or above the threshold temperature in a preheated mold to form the glass composition having the spinodal phase separation.
Yu Z.C., Lehigh University
Biogeosciences | Year: 2012
Peatlands contain a large belowground carbon (C) stock in the biosphere, and their dynamics have important implications for the global carbon cycle. However, there are still large uncertainties in C stock estimates and poor understanding of C dynamics across timescales. Here I review different approaches to C stock estimation in the literature and their associated uncertainties; on the basis of this review, my best estimate of the C stock in northern peatlands is 500 ± 100 (approximate range) gigatons of C (Gt C). The greatest source of uncertainty for all approaches is the lack, or insufficient representation, of data, including depth, bulk density, and carbon accumulation data, especially from the world's large peatlands. Several ways to improve estimates of peat carbon stocks are also discussed in this paper, including estimating C stocks by region and further utilizing widely available basal peat ages. Changes in peatland carbon stocks over time, estimated using Sphagnum (peat moss) spore data and down-core peat accumulation records, show different patterns during the Holocene, and I argue that the spore-based approach underestimates the abundance of peatlands in their early histories. Accounting for long-term peat decomposition in peat accumulation data allows estimates of net carbon sequestration rates by peatlands, or net (ecosystem) carbon balance (NECB), which indicates that more than half of peat carbon (> 270 Gt C) was sequestered before 7000 yr ago during the Holocene. Contemporary carbon flux studies at five peatland sites show a much larger NECB during the last decade (32 ± 7.8 (S.E.) g C m-2 yr-1) than during the last 7000 yr (~11 g C m-2 yr-1), as modeled from peat records across northern peatlands.
This discrepancy highlights the urgent need for carbon accumulation data and process understanding, especially at decadal and centennial timescales, that would bridge current knowledge gaps and facilitate comparisons of NECB across all timescales. © 2012 Author(s).
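The stock and rate figures quoted above can be sanity-checked with simple arithmetic. In the sketch below, the stock and rate numbers come from the abstract, while the northern peatland area (~4 million km²) is an assumed round value used only for illustration.

```python
# Back-of-the-envelope check of the abstract's carbon figures.
# Stocks and rates are taken from the text; the peatland area is assumed.

total_stock_gt = 500          # best-estimate northern peatland C stock (Gt C)
early_stock_gt = 270          # C sequestered before 7000 yr ago (Gt C)

# "more than half of peat carbon ... was sequestered before 7000 yr ago"
early_fraction = early_stock_gt / total_stock_gt
assert early_fraction > 0.5

# Convert the long-term areal rate (~11 g C m-2 yr-1) into a total flux
# over an assumed northern peatland area of 4 million km^2:
area_m2 = 4e6 * 1e6           # km^2 -> m^2
long_term_rate = 11           # g C m-2 yr-1 (modeled from peat records)
contemporary_rate = 32        # g C m-2 yr-1 (flux studies, last decade)

flux_gt_per_yr = long_term_rate * area_m2 / 1e15   # g -> Gt

print(f"early fraction: {early_fraction:.0%}")
print(f"implied long-term flux: {flux_gt_per_yr:.3f} Gt C per yr")
print(f"contemporary/long-term ratio: {contemporary_rate / long_term_rate:.1f}x")
```

The roughly threefold gap between the contemporary and long-term rates is the discrepancy the author highlights.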
Agency: NSF | Branch: Standard Grant | Program: | Phase: BIOTECH, BIOCHEM & BIOMASS ENG | Award Amount: 500.00K | Year: 2016
Brown, Angela C.
The rise of antibiotic resistance threatens our ability to treat even minor bacterial infections. With few new drugs in development and rates of resistance increasing, we urgently need new, nontraditional approaches to fight pathogenic bacteria. The goal of this project is to target the delivery of molecules that facilitate the virulence of pathogenic bacteria as a novel approach to the treatment of bacterial infections. Rather than targeting some essential life process to kill the bacteria directly, as traditional antibiotics do, the PI intends to target those processes that allow the pathogen to settle in the host. This CAREER proposal addresses two goals outlined by the United States CDC: to develop new approaches to the treatment of bacterial infections and to improve antibiotic stewardship. In addition to identifying unique targets for the development of novel antibiotics, the PI plans extensive outreach activities to demonstrate, in a fun, hands-on manner, the danger of misusing antibiotics. The PI intends to further integrate the research with her educational goals by developing new courses at Lehigh University, providing undergraduate and graduate research opportunities, and developing a program for local Girl Scout troops to learn about STEM fields. The PI's overall goal in these educational activities is to develop well-trained, interdisciplinary scientists and to encourage young women to choose STEM majors and pursue related technical careers.
Pathogenic bacteria produce certain molecules, called virulence factors, that allow them to thrive within the host. One mechanism used by Gram-negative bacteria to transfer these factors to target cells is the production of outer membrane vesicles (OMVs), which bleb off from the outer membrane of the bacterium and encapsulate multiple virulence factors. The goal of this project is to characterize the trafficking of these OMVs to identify shared mechanisms that can be blocked to inhibit virulence factor delivery. The overall hypothesis of the project is that toxins present on the surface of the OMV facilitate binding of the OMV to target cells, and that inhibition of this binding can be used to disrupt delivery of virulence factors. To investigate this hypothesis, the PI will study three representative organisms: enterotoxigenic Escherichia coli, Vibrio cholerae, and Bordetella pertussis. With these three organisms, she will first characterize the mechanisms by which bacterial toxins bind to the lipid components of the OMV and determine the role of membrane properties in this interaction. This aim will be accomplished by studying the affinity of each toxin for specific lipids using isothermal titration calorimetry. Next, she will study the interaction between OMVs and specific components of host cells using calorimetric and microscopy techniques. Finally, with this information, she will design methods to prevent OMV delivery by blocking the specific factors found to facilitate it, allowing the PI to inhibit the virulence of these organisms. Targeting virulence in this way represents a novel approach to treating bacterial infections, which is desperately needed given the current lack of effective antibiotic strategies.
Agency: NSF | Branch: Cooperative Agreement | Program: | Phase: Natural Hazards Engineering Re | Award Amount: 3.04M | Year: 2016
The Natural Hazards Engineering Research Infrastructure (NHERI) will be supported by the National Science Foundation (NSF) as a distributed, multi-user national facility that will provide the natural hazards research community with access to research infrastructure, including earthquake and wind engineering experimental facilities, cyberinfrastructure, computational modeling and simulation tools, and research data, as well as education and community outreach activities. NHERI will comprise separate awards for a Network Coordination Office, Cyberinfrastructure, a Computational Modeling and Simulation Center, and Experimental Facilities, including a post-disaster, rapid response research facility. Awards made for NHERI will contribute to NSF's role in the National Earthquake Hazards Reduction Program (NEHRP) and the National Windstorm Impact Reduction Program. NHERI continues NSF's emphasis on earthquake engineering research infrastructure previously supported under the George E. Brown, Jr. Network for Earthquake Engineering Simulation as part of NEHRP, but now broadens that support to include wind engineering research infrastructure. NHERI has the broad goal of supporting research that will improve the resilience and sustainability of civil infrastructure, such as buildings and other structures, underground structures, levees, and critical lifelines, against the natural hazards of earthquakes and windstorms, in order to reduce loss of life, damage, and economic loss. Information about NHERI resources will be available on the DesignSafe-ci.org web portal.
NHERI Experimental Facilities will provide access to their experimental resources, user services, and data management infrastructure for NSF-supported research and education awards. This award will provide a NHERI Experimental Facility at Lehigh University to support research to mitigate the impact of natural hazards, including earthquakes, on structures. This facility will provide experimental resources for accurate, large-scale, multi-directional simulations to investigate the effects of natural hazard events on civil infrastructure systems, with potential soil-foundation effects. The new discoveries and knowledge from these simulations will enable researchers to develop and validate new hazard mitigation solutions and innovative hazard-resistant structural concepts. The research conducted by facility users will contribute to the development of the next generation workforce for natural hazards engineering research, educational activities, and professional practice. The education and community outreach programs, along with the strategic education partnerships of the facility, will target a diverse audience and attract young individuals into science, technology, and engineering, while also reaching out to educate and inform a broader community. This facility will conduct annual workshops for prospective users and will host Research Experiences for Undergraduate students.
The facility's experimental resources, which include a strong floor, multi-directional reaction wall, static and dynamic actuators, and testing algorithms, will support large-scale, multi-directional, real-time hybrid simulations that combine physical experiments with computer-based simulations for evaluating the performance of large-scale components and systems. The types of laboratory simulations and tests enabled by the facility include: (1) hybrid simulation (HS), which combines large-scale physical models with computer-based numerical simulation models; (2) geographically distributed hybrid simulation (DHS), which is an HS with physical models and/or numerical simulation models located at different sites; (3) real-time hybrid earthquake simulation (RTHS), which is an HS conducted at the actual time scale of the physical models; (4) geographically distributed, real-time hybrid earthquake simulation, which combines DHS and RTHS; (5) dynamic testing, which loads large-scale physical models at real-time scales through predefined load histories; and (6) quasi-static testing, which loads large-scale physical models at slow rates through predefined load histories. The facility resources will enable multiple, large-scale simulations and tests to be conducted simultaneously, allowing numerous users to work concurrently without significant interruption. The large-scale hybrid simulations are a unique resource for system-level response data, since they enable the definition of the system to be expanded well beyond the size of typical laboratory physical models. At the same time, due to its large size, the facility can accommodate large-scale physical models, which reduces the scaling effects associated with typical, small physical models. A broad array of instrumentation, large-scale data acquisition systems, and advanced sensors will provide the system-level data needed for advancing computational modeling and simulation.
The facility will utilize the HS capabilities for earthquake engineering to develop hybrid wind engineering simulation capabilities within the facility. The HS being advanced at the facility has the potential for broader impacts in other fields of science and engineering, where combining physical models and numerical simulation models will enable more holistic definitions of the system in experimental research.
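As a rough illustration of the hybrid simulation concept described above, the sketch below steps a one-degree-of-freedom numerical substructure through a ground-motion record while querying a stand-in for the physical specimen at each step. All names and parameter values are hypothetical; in the actual facility, the restoring force would be measured from the test specimen through actuators and load cells, not computed from a formula.

```python
import numpy as np

# Sketch of a hybrid simulation (HS) time-stepping loop: a 1-DOF numerical
# substructure (mass + damper) is coupled to a "physical" substructure.
# Here the physical restoring force is stubbed as a linear spring; in a
# real test it would be the measured specimen force. Values illustrative.

m, c = 1000.0, 200.0          # numerical substructure: mass (kg), damping (N s/m)
k_phys = 5.0e4                # stand-in stiffness for the physical specimen (N/m)
dt = 0.01                     # integration time step (s)

def measured_restoring_force(displacement):
    """Stub for the experimentally measured specimen force."""
    return k_phys * displacement

# Ground acceleration record (a short sine pulse as a placeholder).
t = np.arange(0, 2.0, dt)
ag = 0.5 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * (t < 0.5)

u, v = 0.0, 0.0               # displacement (m), velocity (m/s)
history = []
for a_g in ag:
    r = measured_restoring_force(u)        # command u, read back force
    a = (-m * a_g - c * v - r) / m         # equation of motion
    v += a * dt                            # semi-implicit Euler update
    u += v * dt
    history.append(u)

print(f"peak displacement: {max(abs(h) for h in history):.4f} m")
```

In an RTHS, this loop must complete each iteration within the physical time step dt, which is what makes the facility's real-time actuators and algorithms essential.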
Agency: NSF | Branch: Standard Grant | Program: | Phase: ITEST | Award Amount: 1.20M | Year: 2016
This project will advance efforts of the Innovative Technology Experiences for Students and Teachers (ITEST) program to better understand and promote practices that increase students' motivations and capacities to pursue careers in science, technology, engineering, or mathematics (STEM). It will do so by producing empirical findings and/or research tools that contribute to knowledge about which models and interventions with K-12 students and teachers are most likely to increase capacity in the STEM and STEM-cognate intensive workforce of the future.
This project will develop, implement, and evaluate a series of innovative socio-environmental science investigations (SESI) using a geospatial curriculum approach with STEM-related mentoring, providing economically disadvantaged 9th-grade students at the Building 21 school in Allentown, PA with technology-rich geospatial learning experiences that develop STEM-related skills. The project will focus on social issues related to environmental science. It uses a design partnership among Lehigh University natural science, social science, and education professors, the STEM Valley Mentoring Coalition, Building 21 science and social studies teachers, the Allentown city government, and PPL Corporation to develop geospatial investigations with Web GIS (geographic information systems) that will prepare students with skills and career awareness and motivate them to pursue appropriate education pathways to STEM-related occupations.
Despite the accelerating growth in geospatial industries and their congruence with STEM, few school-based programs integrate geospatial technology within their curriculum, and even fewer incorporate STEM-related mentoring to promote interest and aspirations in related occupations. The project will contribute to educational research infrastructure by producing a series of instruments to measure the development of STEM-related skills in the context of SESI investigations, students' perceptions of the mentoring experience, STEM career motivation, and STEM interest, for use with high school students who are traditionally underrepresented in STEM-related disciplines and careers. The project also promotes a unique community-based partnership with university scientists, social scientists, educators, a science center, and STEM mentors that involves high school students in the research and decision-making process about local environmental problems. The learning activities will provide opportunities for students to collaborate, seek evidence, problem-solve, master technology, develop geospatial thinking and reasoning skills, and practice communication skills that are essential for the STEM workplace. The project will enable researchers to (1) examine how SESI and mentoring increase students' interest in STEM and their motivation to pursue STEM-related careers, and (2) analyze how the geospatial curriculum approach, when combined with STEM-related mentoring, can improve STEM-related skills among students underrepresented in STEM.
Agency: NSF | Branch: Standard Grant | Program: | Phase: Secure & Trustworthy Cyberspace | Award Amount: 599.93K | Year: 2016
Today individuals and organizations leverage machine learning systems to adjust room temperature, provide recommendations, detect malware, predict earthquakes, forecast weather, maneuver vehicles, and turn Big Data into insights. Unfortunately, these systems are prone to a variety of malicious attacks with potentially disastrous consequences. For example, an attacker might trick an Intrusion Detection System into ignoring the warning signs of a future attack by injecting carefully crafted samples into the training set for the machine learning model (i.e., polluting the model). This project is creating an approach to machine unlearning and the necessary algorithms, techniques, and systems to efficiently and effectively repair a learning system after it has been compromised. Machine unlearning provides a last resort against various attacks on learning systems, and is complementary to other existing defenses.
The key insight in machine unlearning is that most learning systems can be converted into a form that can be updated incrementally without costly retraining from scratch. For instance, several common learning techniques (e.g., naive Bayesian classifier) can be converted to the non-adaptive statistical query learning form, which depends only on a constant number of summations, each of which is a sum of some efficiently computable transformation of the training data samples. To repair a compromised learning system in this form, operators add or remove the affected training sample and re-compute the trained model by updating a constant number of summations. This approach yields huge speedup -- the asymptotic speedup over retraining is equal to the size of the training set. With unlearning, operators can efficiently correct a polluted learning system by removing the injected sample from the training set, strengthen an evaded learning system by adding evasive samples to the training set, and prevent system inference attacks by forgetting samples stolen by the attacker so that no future attacks can infer anything about the samples.
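The summation idea can be illustrated with a toy naive Bayes classifier whose sufficient statistics are per-class counts: unlearning a polluted sample simply decrements the handful of summations it touched, with no retraining pass over the data. This is a minimal sketch of the statistical-query intuition, not the project's actual system, and all class and variable names are hypothetical.

```python
import math
from collections import defaultdict

# Toy naive Bayes in "summation form": the model depends only on per-class
# counts, so removing an attacker-injected sample exactly undoes learning it.

class UnlearnableNB:
    def __init__(self, n_features):
        self.n_features = n_features
        self.class_count = defaultdict(int)
        self.feature_count = defaultdict(lambda: [0] * n_features)

    def learn(self, x, y):
        self.class_count[y] += 1
        for j, xj in enumerate(x):
            self.feature_count[y][j] += xj

    def unlearn(self, x, y):
        # Exactly undo learn(x, y): update a constant number of summations.
        self.class_count[y] -= 1
        for j, xj in enumerate(x):
            self.feature_count[y][j] -= xj

    def predict(self, x):
        def log_score(y):
            n = self.class_count[y]
            s = math.log(n + 1)
            for j, xj in enumerate(x):
                p = (self.feature_count[y][j] + 1) / (n + 2)  # Laplace smoothing
                s += math.log(p if xj else 1 - p)
            return s
        return max(self.class_count, key=log_score)

nb = UnlearnableNB(n_features=2)
for x, y in [([1, 0], "ham"), ([1, 1], "ham"), ([0, 1], "spam")]:
    nb.learn(x, y)
poisoned = ([1, 0], "spam")       # attacker-injected training sample
nb.learn(*poisoned)
nb.unlearn(*poisoned)             # repair: counts return to the clean state
print(nb.predict([1, 0]))
```

The asymptotic savings claimed above show up here directly: `unlearn` costs O(number of features) regardless of how many samples were ever trained on.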
Agency: NSF | Branch: Standard Grant | Program: | Phase: COMPUTATIONAL MATHEMATICS | Award Amount: 499.14K | Year: 2016
Intelligent systems that, say, recommend music or movies based on past interests, or recognize faces or handwriting based on labeled samples, often learn from examples using supervised learning. The system tries to find a prediction function: a combination of feature values of the song, movie, image, or pen movements that, on known inputs, produces score values that agree with known preferences. Some combinations may add with simple positive or negative weight parameters ("the more guitar the better," or "I really don't want accordion"), while others can be more complex ("neither too loud nor too soft"). If parameters for such a function can be found, then it can be hoped that, on a new input, the function will be a good approximation of the preference.
In scientific computing, there are many optimization techniques used to find the best parameters. The type called gradient methods is like a group hike that gets caught in the hills after dark; the members want to go downhill to return to the valley quickly, but take small steps so as not to trip. With a little light, the group can discover more about its vicinity to 1) suggest the best direction, 2) take longer steps without tripping, or 3) send different members in different directions so that someone finds the best way. When there are many parameters (not just latitude and longitude), there are many more directions to step. Simple combinations define simple (i.e., convex) valleys, and many optimization-based learning methods (including support vector machines (SVM), least squares, and logistic regression) have been effectively applied to find the best parameters. More complex combinations, which sometimes lead to better learning, may define non-convex valleys, so the known methods may get stuck in dips or have to take very small steps; they often lack theoretical convergence guarantees and do not always work well in practice.
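The hiking analogy can be made concrete with a few lines of gradient descent: on a convex "valley" the method finds the unique minimum, while on a non-convex function the result depends entirely on where the hike starts. The functions and step sizes below are arbitrary illustrative choices, not anything from the project.

```python
# Plain gradient descent: repeatedly step downhill along the negative gradient.

def gradient_descent(grad, x0, step=0.1, iters=200):
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# Convex "valley": f(x) = x^2 has a unique minimum at 0, always found.
convex_min = gradient_descent(lambda x: 2 * x, x0=5.0)

# Non-convex: f(x) = x^4 - 3x^2 has two local minima near +-1.2247;
# which one we reach depends on the starting point (the "dip" we fall into).
grad_nc = lambda x: 4 * x**3 - 6 * x
left_min = gradient_descent(grad_nc, x0=-2.0, step=0.01)
right_min = gradient_descent(grad_nc, x0=2.0, step=0.01)

print(round(convex_min, 6), round(left_min, 3), round(right_min, 3))
```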
This project will explore non-convex optimization for machine learning with three techniques that are analogous to the hikers' use of the light:
First, new techniques will be explored for exploiting approximate second-order derivatives within stochastic methods, which is expected to improve performance over stochastic gradient methods, avoid convergence to saddle points, and improve complexity guarantees over first-order approaches. Compared to other such techniques that have been proposed, these approaches will be unique in that they will be set within trust-region frameworks, the exploration of which represents the second component of the project. Known for decades to offer improved performance for non-convex optimization, trust-region algorithms have not been fully explored for machine learning, and we believe that, when combined with second-order information, dramatic improvements (both theoretical and practical) can be achieved. Finally, for such methods to be efficient in large-scale settings, one needs techniques for solving trust-region subproblems when all the data might not be stored on a single computer. To address this, parallel and distributed optimization techniques will be developed for solving trust-region subproblems and related problems. The three PIs work together with about a dozen students at Lehigh; their website is one way they disseminate research papers, software, and news of weekly activities.
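As a deliberately minimal illustration of a trust-region step, the sketch below uses the classical Cauchy point to approximately solve the subproblem (minimize the quadratic model g·p + ½pᵀBp subject to ‖p‖ ≤ Δ). The matrix and radius are illustrative, and the project's proposed subproblem solvers are far more capable, including the parallel and distributed variants mentioned above.

```python
import numpy as np

# One trust-region step via the Cauchy point: the best step along the
# negative gradient that stays inside the trust region of radius delta.

def cauchy_point(g, B, delta):
    gBg = g @ B @ g
    gnorm = np.linalg.norm(g)
    if gBg <= 0:
        tau = 1.0                      # model decreases along -g to the boundary
    else:
        tau = min(gnorm**3 / (delta * gBg), 1.0)
    return -tau * (delta / gnorm) * g

# Example: quadratic f(x) = 0.5 x.T B x with exact Hessian B (illustrative).
B = np.array([[3.0, 0.5], [0.5, 1.0]])
x = np.array([2.0, -1.5])
delta = 0.8                            # fixed trust-region radius for simplicity
for _ in range(50):
    g = B @ x                          # gradient of the quadratic
    if np.linalg.norm(g) < 1e-10:
        break
    x = x + cauchy_point(g, B, delta)

print(np.round(x, 6))                  # approaches the minimizer at the origin
```

A full trust-region method would also compare actual versus predicted decrease and shrink or grow delta accordingly; the Cauchy point alone already guarantees sufficient model decrease.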
This project is funded jointly by NSF CISE CCF Algorithmic Foundations, and NSF MPS DMS Computational Mathematics.