University of New Orleans
New Orleans, LA, United States

The University of New Orleans, often referred to locally as UNO, is a medium-sized public urban university located on the New Orleans Lakefront within New Orleans, Louisiana, United States. It is a member of the University of Louisiana System and the Urban 13 association. In the fall of 2011 the Southern Association of Colleges and Schools Commission on Colleges gave approval for the University of New Orleans to join the University of Louisiana System, concluding the five-month transition from the LSU System since Act 419 of the 2011 Louisiana Legislative Regular Session was signed into law in July 2011. Soon after the transition was approved, the UNO Presidential Search Committee selected UNO alumnus Peter J. Fos as president.

Louisiana State University and University of New Orleans | Date: 2015-05-04

A surface protein of the murine fungal pathogen Pneumocystis murina can be used to generate an immune response in a recipient animal or human that provides prophylactic protection and exerts anti-fungal activity in subjects already infected with a Pneumocystis species. Further, the disclosure provides novel polypeptides or peptides derived from the P. murina surface protein Surface Peptidase 1 (SPD-1) that are useful, alone or in combination with the SPD-1 polypeptide, in compositions and methods for the generation of an anti-Pneumocystis immune reaction by a recipient subject. The compositions and methods of the disclosure provide advantageous alternatives to available immunogenic determinants for the treatment or prevention of fungal pneumonia.

University of Pittsburgh, University of New Orleans and Louisiana State University | Date: 2015-03-19

Pneumonia due to the fungus Pneumocystis jirovecii is a life-threatening infection that occurs in immunocompromised patients. The inability to culture the organism as well as the lack of a sequenced genome has hindered antigen discovery that could be useful in developing effective vaccines, therapeutic antibodies and diagnostic methods. A method of surface proteomics of Pneumocystis murina that reliably detects surface proteins that are conserved in Pneumocystis jirovecii is described. In particular, eight identified P. murina surface proteins are described. Methods of eliciting immune responses against the identified proteins, generating therapeutic antibodies against the identified proteins, as well as diagnostic methods based on the identified peptides are described.

Laughrey Z., Arizona State University | Gibb B.C., University of New Orleans
Chemical Society Reviews | Year: 2011

Over the past five years, an important development in the area of self-assembling containers has been the increase in interest in those containers that function in aqueous solution. This progress is a reflection of a similar trend within supramolecular chemistry in general, and is driven in part by the need to address issues and challenges within the biological sciences, as well as a desire to develop new strategies for greener chemistries carried out in water. It is also an opportunity to learn more about fundamental topics such as the hydrophobic effect. In this critical review we discuss progress in aqueous-based self-assembling container molecules since 2005 (177 references). © 2011 The Royal Society of Chemistry.

Agency: NSF | Branch: Standard Grant | Program: | Phase: Cyber-Human Systems (CHS) | Award Amount: 156.97K | Year: 2016

The goal of this project is to improve the software that generates stories automatically for virtual environments like training simulations and educational games. Specifically, the software will be able to reason about what is actually true, what each character thinks is true, what they think others think is true, and so on, to improve the way virtual characters act and make them seem more believable and more human. Current approaches to designing these narratives often assume agents know everything about others' beliefs and goals; this often leads to inconsistent or unbelievable behaviors by the agents, which damages the credibility of the software and the quality of the experience for their human users. The proposal will extend the lead researcher's existing narrative planning system, using an approach that lets agents consider multiple sets of beliefs that are consistent with their own and others' actions so far, ruling out situations where agents have beliefs that are inconsistent with their actions. Compared to existing approaches, this should allow the narrative planner to generate a wider variety of narratives that are also more believable to humans, as well as to handle situations such as trickery and uncertainty where reasoning about beliefs is crucial. The research team will test the software and these assumptions through several experiments that ask people to compare narratives generated by the new software to those generated by state-of-the-art methods. If successful, the project sets the stage to improve the quality of systems where virtual agents interact with humans, such as smart phone assistants, online games, automated customer chat tools, and educational software. In particular, the work will lead to training scenarios where understanding others' beliefs is crucial, such as officer-citizen interactions.
The work is also interdisciplinary, ranging from computer science to psychology, and the lead researcher is committed to training young researchers to do work that crosses these intellectual boundaries and to recruiting researchers who might not otherwise participate in computer science-related research.

In the work, the lead researcher proposes to develop a model of agent belief based on doxastic modal logic and possible-worlds reasoning, suitable for use in a planning algorithm that coordinates a virtual environment. By supporting a single modal "believes" predicate, the planner can treat the narrative search space as a Kripke structure to reason about epistemically accessible states. This improves on previous models by allowing arbitrarily nested beliefs while simultaneously reducing the burden on the virtual environment's author to write alternative scenarios, thus increasing flexibility and expressiveness. The research team will integrate this model of beliefs into a prototype system based on the Glaive narrative planner previously developed by the lead researcher. This prototype will take advantage of Glaive's existing heuristic-driven state-space search techniques: in addition to expanding temporally accessible states, Glaive will also expand epistemically accessible states and track when an action taken by an epistemic child can be anticipated by its epistemic parent in the Kripke structure. The initial prototype will be too slow for real-time use, but it will be suitable for conducting the proposed experiments that investigate to what extent such a model improves the believability of agent behavior in automatically generated stories. In particular, the team will study whether the planner produces narratives whose structure better meets the expectations of a human audience: that is, the model will answer questions about agent beliefs more similarly to a human audience, and the resulting planner will generate stories more like those composed by human authors. Further, the prototype is expected to solve certain narrative planning problems that algorithms lacking a model of agent beliefs cannot solve.
These claims will be evaluated by having the new prototype and two state-of-the-art planners generate narratives for a library of scenarios, developed by the team, that rely on agents having a theory of mind for other agents, then asking both the systems and human users a number of questions about the generated narrative and the agents' beliefs to evaluate how well the planners' output conforms with human expectations and believability.
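The possible-worlds machinery described above can be illustrated with a minimal sketch: each agent's beliefs at a world are the propositions true in every world that agent considers possible, and nesting "believes" formulas walks the accessibility relation. The class and scenario below are hypothetical illustrations, not code from the Glaive planner.

```python
# Minimal Kripke-structure sketch of nested agent beliefs.
# All names (KripkeModel, holds, the treasure scenario) are illustrative.

class KripkeModel:
    def __init__(self):
        self.facts = {}    # world -> set of propositions true there
        self.access = {}   # (agent, world) -> worlds the agent considers possible

    def add_world(self, world, facts):
        self.facts[world] = set(facts)

    def set_access(self, agent, world, worlds):
        self.access[(agent, world)] = set(worlds)

    def holds(self, world, formula):
        # A formula is either a proposition (str) or ('believes', agent, sub).
        if isinstance(formula, str):
            return formula in self.facts[world]
        _, agent, sub = formula
        # The agent believes `sub` iff it holds in every accessible world;
        # recursion handles arbitrarily nested beliefs.
        worlds = self.access.get((agent, world), set())
        return bool(worlds) and all(self.holds(w, sub) for w in worlds)

# Trickery example: the treasure was moved to the chest, but Alice still
# considers only a world where it is in the cave.
m = KripkeModel()
m.add_world('actual', {'treasure_in_chest'})
m.add_world('alice_view', {'treasure_in_cave'})
m.set_access('alice', 'actual', {'alice_view'})

print(m.holds('actual', 'treasure_in_chest'))                        # True
print(m.holds('actual', ('believes', 'alice', 'treasure_in_cave')))  # True
print(m.holds('actual', ('believes', 'alice', 'treasure_in_chest'))) # False
```

A planner built this way can detect the mismatch between the actual state and Alice's beliefs, which is exactly the condition that makes trickery narratives expressible.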

Agency: NSF | Branch: Standard Grant | Program: | Phase: Secure & Trustworthy Cyberspace | Award Amount: 299.98K | Year: 2016

One of the main obstacles to providing extensive hands-on experience in cybersecurity classes is the substantial amount of manual work involved in creating and grading the exercises. Combined with the frequent need to update the exercises, this obstacle effectively limits the amount of hands-on work that gets incorporated into cybersecurity education. This project seeks to eliminate such barriers, and to greatly improve the efficiency of the educational process by automating the most time-consuming tasks.

This project makes two main contributions to cybersecurity education: the development of a specification-driven, dynamic environment for implementing realistic cyber defense and forensic analysis exercises; and advanced support for class management and automated evaluation. The platform, AutoCUE, provides a high-level specification language and an execution runtime that enable instructors to easily and efficiently run realistic scenarios that result in customized environments; based on the same methods, the system can also be used to automatically create realistic experimental data sets. The infrastructure provides an automated class management component, which consists of: a) a deployment automation module, which guarantees a consistent student lab environment and central control by the instructor; b) a scenario personalization module, which can generate customized exercises for each student (for evaluation purposes); and c) an automated grading module, which combines ideas from capture-the-flag competitions and environment sensors to track student progress and automate the grading process. The project also provides ready-to-use seed content for two classes: digital forensics and network penetration testing.
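The abstract does not publish AutoCUE's internals, but the pairing of per-student scenario personalization with capture-the-flag-style grading can be sketched as follows. Every name here, and the keyed-hash flag scheme itself, is an illustrative assumption, not AutoCUE's actual design:

```python
# Hypothetical sketch: derive a unique flag per (student, exercise) so that
# a flag copied from another student's environment fails to validate.
import hashlib
import hmac

COURSE_SECRET = b"instructor-only-secret"  # illustrative; held by the instructor

def student_flag(student_id: str, exercise: str) -> str:
    """Keyed hash over the (student, exercise) pair yields a per-student flag."""
    mac = hmac.new(COURSE_SECRET, f"{student_id}:{exercise}".encode(),
                   hashlib.sha256)
    return "FLAG{" + mac.hexdigest()[:16] + "}"

def grade(student_id: str, exercise: str, submitted: str) -> bool:
    """Automated grading: recompute the expected flag and compare in
    constant time."""
    return hmac.compare_digest(submitted, student_flag(student_id, exercise))

flag = student_flag("s123", "forensics-lab1")
print(grade("s123", "forensics-lab1", flag))  # True: own flag validates
print(grade("s456", "forensics-lab1", flag))  # False: copied from another student
```

The design point is that personalization and grading share one derivation, so the instructor stores a single secret rather than a table of per-student answers.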

Agency: NSF | Branch: Standard Grant | Program: | Phase: Secure & Trustworthy Cyberspace | Award Amount: 150.00K | Year: 2016

Cybersecurity is one of the most strategically important areas in computer science, and also one of the most difficult disciplines to teach effectively. Historically, hands-on cybersecurity exercises have helped students reinforce basic concepts, but most of them have focused on user-level attacks and defenses. Since OS kernels provide the foundation for applications, any compromise of the OS kernel renders the entire computing stack untrusted. Therefore, it is imperative to teach students the practice of kernel-level attacks and defenses.

Over the past decade, there has been great interest in using virtualization to profile, characterize, and observe kernel events, including security incidents. Inspired by the success of virtual machine introspection (VMI), this project aims to advance the field by building practical VMI tools and libraries (a toolkit) directly on top of virtualization, and applying them to deep cybersecurity education. The depth comes from the study of lower-level system internals such as OS kernels. The project will further provide seed content to teach both instructors and students to use the toolkit for studying not only traditional user-level attacks such as buffer overflows, but also defenses inside the OS kernel. The outcome of this project (i.e., the toolkit and the cybersecurity exercises) will contribute to the health, safety, and economic well-being of our society by helping to improve the state of the art in cybersecurity education, especially for effectively performing hands-on cybersecurity exercises.

Agency: Department of Defense | Branch: Navy | Program: STTR | Phase: Phase I | Award Amount: 80.00K | Year: 2016

For machinery control systems, forensics is a vital part of a cyber-protection strategy and aids in the identification and troubleshooting of system malfunctions due to malicious and non-malicious events. A number of unique challenges exist for the forensic analysis of SCADA-based systems. Components of a SCADA system are often resource constrained. In addition, SCADA-based systems have a critical requirement of being continuously operational. The resource-constrained nature of SCADA systems and the 24/7 availability requirement call for live forensic solutions where data acquisition and analysis are performed at run time. Despite such emerging demands, there is still no comprehensive software design and implementation to systematically address live forensic issues on a SCADA system in a way that minimizes risk to the system's services. To address this critical need, IAI and its team propose to develop the Digital Forensic Tool Kit for Machinery Control Systems (TRACE), a live digital forensics tool kit that, at run time, provides a cyber-protection strategy and aids in the identification of malfunctions while ensuring minimal impact on overall system performance. The key innovation is to deploy

Agency: NSF | Branch: Standard Grant | Program: | Phase: Secure & Trustworthy Cyberspace | Award Amount: 300.00K | Year: 2015

Engineering a secure IT system requires, in addition to technical skills, a particular mindset focused on using cybersecurity solutions effectively against sophisticated and stealthy cyber attacks. The traditional lecture-centric style of teaching has failed to deliver that mindset; this failure is the direct result of an over-emphasis on specific technical skills (with limited lifespan and insufficient technical depth), abstract rather than deeply technical examination of fundamental concepts, and an impatience in developing broader analytical skills. The vast majority of cybersecurity failures are the result of poor understanding of the security landscape and an inability to adapt to new threats.

Peer instruction may be a solution to this challenge. This project evaluates the effectiveness of the peer instruction methodology for cybersecurity education, and develops peer instruction material for three cybersecurity courses offering an introduction to security concepts, a defensive view of cybersecurity, and an offensive view of cybersecurity. Two primary mechanisms are used for measuring the impact of peer instruction: pre- and post-testing, and isomorphic questions.
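Pre- and post-testing of this kind is commonly summarized with Hake's normalized gain, the fraction of the possible improvement a student actually achieves between the two tests. A minimal illustration follows; the abstract does not specify the project's actual analysis, so this is only the standard formula:

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain: (post - pre) / (100 - pre), with scores
    given as percentages. A student at the ceiling has no room to gain."""
    if pre_pct >= 100:
        return 0.0
    return (post_pct - pre_pct) / (100 - pre_pct)

# A student moving from 40% to 70% realizes half of the possible gain.
print(normalized_gain(40, 70))  # 0.5
```

Normalizing by the available headroom lets sections with different pre-test scores be compared on one scale, which is why the measure is standard in education research.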

Agency: NSF | Branch: Continuing grant | Program: | Phase: Environmental Chemical Science | Award Amount: 450.00K | Year: 2015

In this project funded by the Environmental Chemical Sciences Program of the Chemistry Division, Professor Matthew Tarr of the University of New Orleans is developing new methods to understand how oil spilled in ocean environments is transformed by sunlight. Such methods help identify the chemicals formed when crude oil is subjected to photochemical changes in the environment. The nature and behavior of these products are important in oil spill remediation. The project involves collaboration with the National High Magnetic Field Laboratory. In addition, the project trains young scientists through direct involvement of high school students, high school teachers, and undergraduate students.

This project identifies the types of compounds formed when crude oil is exposed to sunlight and investigates structures of photoproducts present in the gas phase, the aqueous phase, and the oil phase. The project utilizes fractionation methods, chromatography, and derivatization to assess the functional groups present in photoproducts. The study also utilizes a wide range of analytical tools, including gas chromatography, liquid chromatography, fluorescence spectroscopy, and mass spectrometry (including GC-MS, LC-MS, electrospray MS, and high resolution electrospray FTICR-MS) in order to gain a complete picture of oil photochemistry. The results of this project will be important for understanding the fate of spilled oil and improving predictive models for oil spill remediation. Additional broader impacts include training of future scientists and exposing teachers to environmental research.

Agency: NSF | Branch: Standard Grant | Program: | Phase: UBE - Undergraduate Biology Ed | Award Amount: 50.00K | Year: 2016

Research efforts in the Chemical and Biological Sciences have significantly benefitted from the rapid growth of computational capacity and the extensive use of data-analytics tools. The rapid proliferation of these methods has, however, brought about an urgent need for researchers who can effectively incorporate field-relevant computational tools and methods in their research workflows. This is a necessity in order to meet the growing Materials, Energy, and Health needs of the nation. Toward meeting this objective, undergraduate students enrolled in Chemical and Biological Sciences degree programs are typically steered toward introductory Computer Science courses. These courses are, however, not geared toward the needs of these students and consequently meet with mixed success. In contrast, recent efforts to develop degree-relevant computational courses that specifically cater to these undergraduate students have successfully broadened the appeal of computational methods. The goal of the workshop is to identify strategies to successfully introduce this population of undergraduate students to computational methods. As such, this workshop will likely be a critical step in promoting the progress of science and the prosperity of the nation's population.

Workshop attendees will include domain specialists, industrial partners, computer scientists, and education experts. The overall goal will be to provide a framework for undergraduate courses for Chemical and Biological Science degree programs that introduce the student to computing in a familiar environment. These courses will use hands-on computational activities and project-based learning approaches that are specific to the degree programs. Instructors will use existing computational tools and visualization techniques to teach students how to solve domain-specific research problems in an intuitive manner. As such, this approach is likely to successfully engage students who lack formal training in programming and help them develop an appreciation for computation. To help this effort, attendees at the workshop will work to develop a common standard for computing topics to be taught in Biology and Chemistry degree programs, provide frameworks for hands-on activities and research projects for these courses, identify practices and strategies to promote the recruitment and retention of underrepresented minorities in these courses, and identify means to train interested instructors who may not have the benefit of a background in computational methods.
