News Article | May 10, 2017
Site: www.chromatographytechniques.com

The presence of just a few autonomous vehicles can eliminate the stop-and-go driving of the human drivers in traffic, along with the accident risk and fuel inefficiency it causes, according to new research. The finding indicates that self-driving cars and related technology may be even closer to revolutionizing traffic control than previously thought.

"Our experiments show that with as few as five percent of vehicles being automated and carefully controlled, we can eliminate stop-and-go waves caused by human driving behavior," said Daniel B. Work, assistant professor at the University of Illinois at Urbana-Champaign, a lead researcher in the study.

The use of autonomous vehicles to regulate traffic flow is the next innovation in the rapidly evolving science of traffic monitoring and control, Work said. Just as fixed traffic sensors have been replaced by crowd-sourced GPS data in many navigation systems, the use of self-driving cars is poised to replace classical freeway traffic control concepts like variable speed limits. Critical to the success of this innovation is a deeper understanding of the dynamic between these autonomous vehicles and the human drivers on the road.

Funded by the National Science Foundation's Cyber-Physical Systems program, the research was led by a multi-disciplinary team of researchers with expertise in traffic flow theory, control theory, robotics, cyber-physical systems, and transportation engineering.

The team conducted field experiments in Tucson, Arizona, in which a single autonomous vehicle circled a track continuously with at least 20 other human-driven cars. Under normal circumstances, human drivers naturally create stop-and-go traffic, even in the absence of bottlenecks, lane changes, merges or other disruptions, Work said. This phenomenon is called the "phantom traffic jam."

Researchers found that by controlling the pace of the autonomous car in the study, they were able to smooth out the traffic flow for all the cars. For the first time, researchers demonstrated experimentally that even a small percentage of such vehicles can have a significant impact on the road, eliminating waves and reducing the total fuel consumption by up to 40 percent. Moreover, the researchers found that conceptually simple and easy to implement control strategies can achieve the goal.

"Before we carried out these experiments, I did not know how straightforward it could be to positively affect the flow of traffic," said Jonathan Sprinkle, the Litton Industries John M. Leonis Distinguished Associate Professor in Electrical and Computer Engineering at the University of Arizona, Tucson. "I assumed we would need sophisticated control techniques, but what we showed was that controllers which are staples of undergraduate control theory will do the trick."

This latest research suggests that even the related technology available now – such as adaptive cruise control – has the power to improve traffic even before there are large numbers of autonomous vehicles on the road.

"Fully autonomous vehicles in common traffic may be still far away in the future due to many technological, market and policy constraints," said Benedetto Piccoli, the Joseph and Loretta Lopez Chair Professor of Mathematics at Rutgers University, Camden. "However, increased communication among vehicles and increased levels of autonomy in human-driven vehicles is in the near future."

The near future with only a few autonomous vehicles on the road is more challenging than the far future in which all vehicles are connected, said Benjamin Seibold, associate professor of Mathematics at Temple University. "The proper design of autonomous vehicles requires a profound understanding of the reaction of humans to them," Seibold said, "and traffic experiments play a crucial role in understanding this interplay of human and robotic agents."

The researchers say the next step will be to study the impact of autonomous vehicles in denser traffic with more freedom granted to the human drivers, such as the ability to change lanes.
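
Sprinkle's point that simple controllers "will do the trick" can be made concrete with a small sketch. The snippet below is a hypothetical, simplified wave-damping controller in the spirit of the strategies described above (it is not the controller used in the field experiments): the automated car targets the recent average speed of the traffic around it, subject to a safe-gap cap, instead of reacting sharply to every speed change of the car ahead. All parameter values are illustrative assumptions.

    # Hypothetical sketch of a wave-damping speed controller for a single
    # automated vehicle. Not the controller from the published experiments;
    # window length, headway, and gap values are illustrative assumptions.
    from collections import deque

    class AverageSpeedController:
        """Command a speed near the recent average traffic speed,
        capped by a simple safe-gap check on the vehicle ahead."""

        def __init__(self, window_s=30.0, dt=0.1, min_gap_m=5.0, headway_s=1.4):
            self.history = deque(maxlen=int(window_s / dt))  # recent lead-car speeds
            self.min_gap_m = min_gap_m
            self.headway_s = headway_s

        def command(self, lead_speed_mps, gap_m):
            self.history.append(lead_speed_mps)
            v_avg = sum(self.history) / len(self.history)        # smoothed traffic speed
            v_safe = max(0.0, (gap_m - self.min_gap_m) / self.headway_s)
            return min(v_avg, v_safe)                            # smooth, but never unsafe

    # Lead-car speed swings between 2 and 12 m/s (crude stop-and-go); the
    # commanded speed stays close to the recent average instead of mirroring
    # each swing.
    ctrl = AverageSpeedController()
    for t in range(600):
        lead = 7.0 + 5.0 * (-1) ** (t // 100)
        v_cmd = ctrl.command(lead_speed_mps=lead, gap_m=20.0)
    print(round(v_cmd, 2))

In the actual experiments the controller also had to respect the geometry of the ring track and additional safety constraints, but the core idea is the same: the automated car absorbs speed fluctuations rather than amplifying them.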


News Article | May 10, 2017
Site: www.rdmag.com

Autonomous cars may help reduce gridlock on city highways and roads. According to a new study conducted by researchers from several universities, autonomous vehicles may be crucial in alleviating the traffic problems plaguing many metropolitan areas by smoothing out stop-and-go traffic.

"Our experiments show that with as few as 5 percent of vehicles being automated and carefully controlled, we can eliminate stop-and-go waves caused by human driving behavior," Daniel Work, assistant professor at the University of Illinois at Urbana-Champaign and a lead researcher in the study, said in a statement.

According to Work, the use of self-driving cars will one day lead to the replacement of classical freeway traffic control concepts, including variable speed limits. However, Work said that for this to happen, a deeper understanding of the dynamic between autonomous vehicles and human drivers is necessary.

During field experiments conducted in Tucson, the researchers had a single autonomous vehicle circle a track continuously with at least 20 other human-driven cars. This revealed that under normal conditions humans naturally create stop-and-go traffic, called a "phantom traffic jam," even without bottlenecks, lane changes, merges or other disruptions. However, by controlling the pace of the autonomous car, the researchers were able to smooth out the traffic flow for all the cars, showing that even a small percentage of autonomous vehicles can have a significant benefit to traffic flow.

"Before we carried out these experiments, I did not know how straightforward it could be to positively affect the flow of traffic," Jonathan Sprinkle, the Litton Industries John M. Leonis Distinguished Associate Professor in Electrical and Computer Engineering at the University of Arizona, said in a statement. "I assumed we would need sophisticated control techniques, but what we showed was that controllers which are staples of undergraduate control theory will do the trick."

Benedetto Piccoli, the Joseph and Loretta Lopez Chair Professor of Mathematics at Rutgers University, explained that there are still hurdles to clear before autonomous vehicles can be unleashed on the highways. "Fully autonomous vehicles in common traffic may be still far away in the future due to many technological, market and policy constraints," Piccoli said in a statement. "However, increased communication among vehicles and increased levels of autonomy in human-driven vehicles is in the near future."

The research team now plans to study the impact of autonomous vehicles in denser traffic with more freedom given to the human drivers.


News Article | May 10, 2017
Site: phys.org

Angelo Gaitas, a research assistant professor in the Electrical and Computer Engineering Department, along with research scientist Gwangseong Kim, is commercializing a device that reduces the screening process to just a few hours at the same cost as current devices.

If you've ever suffered from food poisoning, you'll appreciate why it's so important to inspect food before it reaches the consumer. Food producers have to check for bacteria and signs of contamination before they are able to ship out any perishable food. Some common bacteria that can lead to foodborne illnesses include E. coli, salmonella and listeria. In fact, according to the Centers for Disease Control, each year one in six Americans gets sick from consuming contaminated foods or beverages; that is 48 million people, of whom 128,000 are hospitalized.

Typically, the inspection process, which involves putting samples in a solution and placing it in an incubator to see if bacteria grow, takes anywhere from 18 hours to several days. The reason is that it takes time for bacteria to grow to detectable levels. Current detection techniques are limited: depending on the technique, about 1,000 to a million bacteria may need to be present in a small volume before they can be successfully detected. Reaching that level takes time.

With this new device, food producers are able to run the whole solution through a smaller container inside the incubator oven. Antibodies in the device capture the target bacteria. This procedure concentrates the bacteria in a smaller volume, enabling same-day detection.

"We are focused on helping food producers reduce storage cost and get fresher food to consumers," Gaitas says. "We are addressing a major and well documented need in a very large market. There are about 1.2 billion food tests conducted worldwide and about 220 million tests in the United States."

By shortening the detection time by one day, the team believes that the device can save the food industry billions. For example, meat producers, as a collective industry, could save up to $3 billion in storage costs by shortening detection to one day. The device can also be used to expedite the detection of bloodborne illnesses such as sepsis and viral infections; however, the current commercial focus is on food due to the lower barriers to entry.
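
The benefit of concentrating the sample comes down to simple growth arithmetic: a detector needs a certain number of organisms per milliliter, so capturing the same organisms in a much smaller volume substitutes for hours of incubation. The sketch below works through that trade-off; the 20-minute doubling time, the 10,000 CFU/mL threshold, and the volumes are illustrative assumptions for the example, not figures from the device.

    import math

    def hours_to_detect(initial_cfu, volume_ml, threshold_cfu_per_ml,
                        doubling_time_min=20.0):
        """Hours of incubation needed for a culture to reach the detection
        threshold, assuming ideal exponential growth (an optimistic assumption)."""
        needed = threshold_cfu_per_ml * volume_ml
        if initial_cfu >= needed:
            return 0.0
        doublings = math.log2(needed / initial_cfu)
        return doublings * doubling_time_min / 60.0

    # 100 captured cells that must reach 10,000 CFU/mL to be detected:
    print(hours_to_detect(100, volume_ml=25.0, threshold_cfu_per_ml=1e4))  # ~3.8 h in 25 mL
    print(hours_to_detect(100, volume_ml=0.5, threshold_cfu_per_ml=1e4))   # ~1.9 h after capture into 0.5 mL

Real enrichment protocols take far longer than these idealized numbers because of lag phases and non-ideal growth conditions, which is why the 18-hour-to-several-day figure is the practical baseline the device is competing against.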


"Our experiments show that with as few as 5 percent of vehicles being automated and carefully controlled, we can eliminate stop-and-go waves caused by human driving behavior," said Daniel B. Work, assistant professor at the University of Illinois at Urbana-Champaign, a lead researcher in the study. The use of autonomous vehicles to regulate traffic flow is the next innovation in the rapidly evolving science of traffic monitoring and control, Work said. Just as fixed traffic sensors have been replaced by crowd-sourced GPS data in many navigation systems, the use of self-driving cars is poised to replace classical freeway traffic control concepts like variable speed limits. Critical to the success of this innovation is a deeper understanding of the dynamic between these autonomous vehicles and the human drivers on the road. Funded by the National Science Foundation's Cyber-Physical Systems program, the research was led by a multi-disciplinary team of researchers with expertise in traffic flow theory, control theory, robotics, cyber-physical systems, and transportation engineering. Principal investigators (PIs) were: Benedetto Piccoli, the Joseph and Loretta Lopez Chair Professor of Mathematics at Rutgers University, Camden; Benjamin Seibold, associate professor of Mathematics at Temple University; Jonathan Sprinkle, the Litton Industries John M. Leonis Distinguished Associate Professor in Electrical and Computer Engineering at the University of Arizona, Tucson; and Daniel B. Work, assistant professor in Civil and Environmental Engineering and the Coordinated Science Laboratory at the University of Illinois at Urbana-Champaign. The team conducted field experiments in Tucson, Arizona, in which a single autonomous vehicle circled a track continuously with at least 20 other human-driven cars. Under normal circumstances, human drivers naturally create stop-and-go traffic, even in the absence of bottlenecks, lane changes, merges or other disruptions, Work said. This phenomenon is called the "phantom traffic jam." Researchers found that by controlling the pace of the autonomous car in the study, they were able to smooth out the traffic flow for all the cars. For the first time, researchers demonstrated experimentally that even a small percentage of such vehicles can have a significant impact on the road, eliminating waves and reducing the total fuel consumption by up to 40 percent. Moreover, the researchers found that conceptually simple and easy to implement control strategies can achieve the goal. "Before we carried out these experiments, I did not know how straightforward it could be to positively affect the flow of traffic," Sprinkle said. "I assumed we would need sophisticated control techniques, but what we showed was that controllers which are staples of undergraduate control theory will do the trick." This latest research suggests that even the related technology available now - such as adaptive cruise control - has the power to improve traffic even before there are large numbers of autonomous vehicles on the road. "Fully autonomous vehicles in common traffic may be still far away in the future due to many technological, market and policy constraints," Piccoli said. "However, increased communication among vehicles and increased levels of autonomy in human-driven vehicles is in the near future." The near future with only a few autonomous vehicles on the road is more challenging than the far future in which all vehicles are connected, Seibold said. 
"The proper design of autonomous vehicles requires a profound understanding of the reaction of humans to them," Seibold said, "and traffic experiments play a crucial role in understanding this interplay of human and robotic agents." The researchers say the next step will be to study the impact of autonomous vehicles in denser traffic with more freedom granted to the human drivers, such as the ability to change lanes. The paper describing this work, "Dissipation of stop-and-go waves via control of autonomous vehicles: Field experiments," is available here: https://arxiv.org/abs/1705.01693 Explore further: California gives green light to self-driving car tests More information: Dissipation of stop-and-go waves via control of autonomous vehicles: Field experiments, arXiv:1705.01693 [cs.SY] arxiv.org/abs/1705.01693


News Article | May 18, 2017
Site: www.techrepublic.com

Even with the disdain heaped on digital passwords, they still manage to survive, and dare I say thrive? Adding insult to injury, "password" is likely the most popular password in use. It is a pretty safe bet that anyone reading this article is well aware why using password as a password is not a good idea. So rather than preach to the choir, let's look at why password is such a popular password and learn about a painless way to improve password selection.

It does not take long for professionals new to cybersecurity to realize convenience typically wins over security, which is why password and 123456 are the most popular passwords. It is just human nature, and there's no getting around it—or is there? Some people might suggest using password meters. While password meters are great at informing us about whether passwords are strong enough according to current best practices, do they actually help?

"One of our findings is that password meters do not yield much improvement in helping users choose passwords for unimportant accounts, yet they are commonly deployed in such contexts," write the authors of the research paper Does My Password Go up to Eleven? The Impact of Password Meters on Password Selection (PDF). "Equally, where meters make a difference—password changes for important accounts—they are less often seen. Thus, practice at real sites appears to be very far from what our results dictate. This indicates a real opportunity for improvement."

There is another problem with password meters: If a password is deemed to be unacceptable, the user has to guess how to strengthen the password, usually tries to do so multiple times, and ends up frustrated with the process. (Readers of this article likely know how to strengthen a password to appease a password testing app.)

"Instead of having a meter say, 'Your password is bad,' we thought it would be useful for the meter to say, 'Here's why it's bad and here's how you could do better,'" says Nicolas Christin, a professor in Carnegie Mellon University's Engineering and Public Policy department, in a press release by Daniel Tkacik.

SEE: Firms that force you to change your password are clueless says cyber security chief (TechRepublic)

This team of researchers from Carnegie Mellon University and the University of Chicago was on a quest to help users create strong passwords, and they focused on the fact that most passwords are relatively simple to determine using password crackers and rainbow tables. "The way attackers guess passwords is by exploiting the patterns that they observe in large datasets of breached passwords," said Blase Ur, lead author on the study and currently an assistant professor in the University of Chicago's Department of Computer Science. "For example, if you change 'Es' to '3s' in your password, that's not going to fool an attacker."

"The key result is that providing the data-driven feedback makes a huge difference in security compared to just having a password labeled as weak or strong," continues Ur in the press release. "Our new meter leads users to create stronger passwords that are no harder to remember than passwords created without the feedback."

In the press release, Tkacik writes that in order to compile data-driven feedback, the researchers developed an artificial neural network—a large, complex map of information that resembles the way neurons behave in the brain. "The network 'learns' by scanning millions of existing passwords and identifying trends," continues Tkacik. "If the meter detects a characteristic in a password that it knows attackers may guess, it tells the user."

The research team has a working model of their password meter online. Figure A depicts a password that is reasonably strong, but could easily be made stronger by heeding the advice listed. A nice feature is that the feedback is presented in real time as the user is typing in the information. The team has open-sourced the password meter on GitHub.

"There's a lot of different tweaking one could imagine doing for a specific application of the meter," Ur told Tkacik. "We're hoping to do some of that ourselves, and also engage other members of the security and privacy community to help contribute to the meter."

Other authors of the study included current CMU students Jessica Colnago, Henry Dixon, Pardis Emami-Naeini, Hana Habib, Noah Johnson and William Melicher; former CMU students Felicia Alfieri and Maung Aung; Lujo Bauer, associate professor in the ISR and the Electrical and Computer Engineering Department; and Lorrie Faith Cranor, professor of computer science and engineering and public policy.

There has been a great deal of tech press given to the lowly password. Until it is officially retired, it seems prudent to support password authentication and efforts to improve its effectiveness.


News Article | April 17, 2017
Site: www.eurekalert.org

Professor Federico Rosei, Director of the INRS Centre Énergie Matériaux Télécommunications, is the recipient of the 2017 Outstanding Engineer Award from IEEE (Institute of Electrical and Electronics Engineers) Canada. The award recognizes outstanding Canadian engineers who have made important contributions to electrical and electronics engineering.

This is not the first time Dr. Rosei's remarkable contributions to engineering in Canada have been highlighted. His election as a fellow of the Engineering Institute of Canada in 2013 and of the Canadian Academy of Engineering in 2015 attest to his status among Canada's engineering elite. Dr. Rosei holds the UNESCO Chair in Materials and Technologies for Energy Conversion, Saving and Storage (MATECSS) and the newly established Canada Research Chair in Nanostructured Materials. His ever-growing national and international reputation is reflected in the numerous awards and honours he has received in recent years from around the world.

Dr. Rosei will receive a medal and plaque at the IEEE Canada awards ceremony, part of the Annual IEEE Canadian Conference on Electrical and Computer Engineering, on May 1, 2017, in Windsor, Ontario. The theme of the conference is "Two Great Nations Innovate the Technology." Our heartfelt congratulations to Professor Rosei on this new honour!


News Article | February 15, 2017
Site: www.spie.org

The FireFly architecture features free-space optics communication links and represents an extreme design approach to meet the requirements of modern robust data center networks.

Data centers (DCs)—facilities that are used to centralize the IT operations and equipment of an organization—represent a critical piece of modern networked applications, in both the private and public sectors. The trend toward DCs has been driven by a number of key factors, e.g., economies of scale, reduced management costs, better use of hardware (via statistical multiplexing), and the ability to elastically scale applications in response to changing workload patterns. In particular, a robust network fabric is fundamental for the success of DCs, i.e., to ensure that the network does not become a bottleneck for high-performance applications. In this context, the design of a DC network must satisfy several goals, including high performance (e.g., high throughput and low latency), low equipment and management costs, robustness to dynamic traffic patterns, incremental expandability to add new servers or racks, as well as other practical concerns (e.g., cabling complexity, and power and cooling costs).

Currently available DC network architectures, however, do not provide satisfactory solutions to these requirements. There are two main problems with traditional static (wired) networks. They are either 'overprovisioned' to account for worst-case traffic patterns and thus incur high costs (e.g., with fat trees or Clos architectures), or they are 'oversubscribed' (such as with simple trees or leaf-spine architectures). Although the latter networks have low costs, they offer poor performance because of their congested links. In recent studies, attempts have been made to overcome these limitations by augmenting a static 'core' with some flexible links (radio-frequency or optical wireless). These augmented architectures do show some promise, but they provide only a small improvement in performance. Moreover, all these architectures involve high cabling costs and complexities.

In our work,1 we envision an extreme design point to meet the requirements of modern DC networks rather than trying to incrementally improve the cost-performance tradeoffs, high cabling complexity, and rigidity of current DC architectures. In our architecture vision—known as FireFly—we use free-space optics (FSO) communication links to realize a fully flexible, all-wireless inter-rack fabric. FSO communication technology is particularly well suited to our aim because it offers very high data rates (tens of Gb/s) over long ranges (more than 100m), while having low transmission power and a small interference footprint. A conceptual overview of our FireFly architecture vision is shown in Figure 1. In our design, each top-of-rack (ToR) switch has flexible (steerable) FSO links that can be dynamically reconfigured to connect to other ToRs. The controller reconfigures the topology to adapt to current traffic workloads.

This vision provides several benefits over current DC architectures. For instance, our topological flexibility (if achieved correctly) provides a low-cost option (i.e., few switches and links) with performance comparable to more expensive overprovisioned networks. In addition, our all-wireless fabric eliminates cabling complexity and associated overheads (e.g., obstructed cooling). Our approach can also facilitate new and radical DC topology structures that would otherwise remain at the 'paper design' phase because of their cabling complexity. Lastly, our method of flexibly turning links on or off brings us closer to realizing the aim of energy-proportional DCs (and the flexibility enables easier incremental expansion of a DC).

Figure 1. Schematic illustration of the FireFly architecture. FSO: Free-space optics. ToR: Top of rack.

The unique characteristics of our approach (i.e., the FSO-based inter-rack links and the fully flexible topology) give rise to novel algorithmic, networking, and system-level challenges that need to be addressed before our vision can be made into a reality. First, we need to develop cost-effective and robust link technologies that have small form factors and that can be steered at short timescales to impart flexibility. Second, we require algorithmic techniques to design the efficient and flexible networks. Third, we need new network management solutions. These may include algorithms for the joint optimization problem of runtime topology selection and traffic engineering, as well as data-plane mechanisms to guarantee various consistency and performance requirements.

In our recent work,1 we have demonstrated the viability of our FireFly architecture by building a proof-of-concept prototype (with commodity components) for a steerable, small-form-factor FSO device (see Figure 2). We have also developed practical heuristics to address the algorithmic and system-level challenges in the network design and management of our architecture. In addition, we have developed techniques to provide line-of-sight for FSO links in the FireFly architecture.

For our steerable, small-form-factor FSO device, we have been exploring the use of microelectromechanical systems (MEMS) mirrors as a steering technology to steer the FSO beams with minimal latency. In this device, we use a collimated laser beam that is transmitted from the fiber collimator of an FSO terminal. The laser beam passes onto a gimbal-less two-axis MEMS micromirror (2mm diameter), which steers the beam in an ultrafast manner across a large optical deflection (10°) over the entire device bandwidth (1.2kHz). The MEMS mirror deflects the beam into a wide-angle lens that magnifies (about three times) the optical scan angles of the system. This magnification is linear and therefore results in an overall scan field of view of more than 30°. The power consumption of this system is less than 1mW and thus several orders of magnitude lower than that of galvanometer mirrors. The outgoing FSO beam from our MEMS beam-steering mechanism passes through autopoints and onto a receiving aperture (where it is efficiently coupled into a fiber collimator). With this fast and precise MEMS steering mechanism, we can switch the FSO from one rack to the next for reconfigurable networking. It also enables an autocorrection mechanism for fixing any misalignments (in real time).

Figure 2. Photographs of the MEMS (microelectromechanical systems)-based proof-of-concept prototype assembly used to realize steerable FSO beams.

In summary, we have designed the novel FireFly architecture for radically improving modern DC networks. Our vision includes unique characteristics, such as FSO-based inter-rack links and a fully flexible topology. Such features give rise to a number of algorithmic, networking, and system-level challenges that we are working to address. We have recently demonstrated the feasibility of our architecture with a proof-of-concept prototype for a MEMS-based steerable, small-form-factor FSO device.
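
The runtime topology-selection problem mentioned above can be made concrete with a small sketch: given an estimated inter-rack traffic matrix and the constraint that each steerable terminal can point at only one peer at a time, choose the rack-to-rack links that carry the most traffic. The greedy matching below is a hypothetical illustration under the simplifying assumption of one FSO terminal per rack; it is not the heuristic used in the FireFly work.

    # Hypothetical greedy link selection for one reconfiguration epoch, assuming
    # a single steerable FSO terminal per rack (so chosen links form a matching).
    # The real FireFly heuristics handle multiple terminals and traffic engineering.

    def select_topology(demand):
        """demand: dict mapping (rack_a, rack_b) -> estimated traffic (Gb/s).
        Returns the links to establish for the next epoch."""
        chosen, busy = [], set()
        for (a, b), traffic in sorted(demand.items(), key=lambda kv: -kv[1]):
            if a not in busy and b not in busy:
                chosen.append((a, b, traffic))
                busy.update((a, b))      # each terminal can aim at only one peer
        return chosen

    demand = {
        ("rack1", "rack3"): 40.0,
        ("rack1", "rack2"): 25.0,
        ("rack2", "rack4"): 18.0,
        ("rack3", "rack4"): 12.0,
    }
    print(select_topology(demand))
    # rack1-rack3 and rack2-rack4 get direct FSO links this epoch; the remaining
    # demand would ride multi-hop paths until the next reconfiguration.
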
There are, however, various challenges that we need to address before we can realize commercialization of our architecture. In our current work we are thus building a small testbed for the FireFly architecture, which includes autoalignment through the use of galvanometers and MEMS steering technologies. We now plan to demonstrate the resilience of our FSO-link technologies against indoor effects, e.g., rack vibrations and temperature variations.

This project was supported by the National Science Foundation award 1513866 (NeTS: Medium: Collaborative Research: Flexible All-Wireless Inter-Rack Fabric for Datacenters using Free-Space Optics) and represents a collaboration between faculty members, postdoctoral fellows, research associates, and graduate students at Pennsylvania State University, Stony Brook University, and Carnegie Mellon University.

Electrical Engineering and Computer Science, Pennsylvania State University

Mohsen Kavehrad has been the W. L. Weiss Chair Professor of Electrical Engineering since 1997, and is the founding director of the Center for Information and Communications Technology Research. He has previously worked for Bell Laboratories and is a fellow of the IEEE. He is the author of more than 400 papers, several books and book chapters, and holds several patents.

Department of Computer Science, Stony Brook University

Samir Das received his PhD in computer science from Georgia Institute of Technology. He previously studied at Jadavpur University, India, and the Indian Institute of Science. He has also worked briefly at the Indian Statistical Institute. He moved to Stony Brook in 2002 and was previously a faculty member at the University of Texas at San Antonio and then at the University of Cincinnati.

Himanshu Gupta obtained his PhD in computer science from Stanford University in 1999 and his BTech from the Indian Institute of Technology in 1992. In his recent research he focuses on theoretical issues associated with wireless networking. In particular, he is interested in sensor networks and databases. His other research interests include database systems and theory, e.g., materialized views, (multiple) query optimization, and data analysis.

Jon Longtin joined the mechanical engineering faculty in 1996. He is the author of more than 130 technical research publications, including a number of book chapters. He also holds six issued and three pending US patents. His expertise is in the thermal sciences, with a focus on the development of laser-based optical techniques for the measurement of temperature, concentration, and thermal properties. He is also interested in the use of ultrafast lasers for precision material processing and micromachining, and the development of sensors for harsh environments (e.g., direct-write thermal spray technology). He has been the recipient of a Japan Society for the Promotion of Science postdoctoral fellowship, the National Science Foundation's Faculty Early Career Development award and the Presidential Early Career Award for Scientists and Engineers, and the Stony Brook Excellence in Teaching award. He is a registered professional engineer in New York State.

School of Computer Science, Carnegie Mellon University

Vyas Sekar is an assistant professor in the Electrical and Computer Engineering department. He received his PhD from Carnegie Mellon University in 2010 and earned his bachelor's degree from the Indian Institute of Technology Madras (during which he was awarded the President of India Gold Medal). His research interests lie at the intersection of networking, security, and systems. He has also received a number of best paper awards, e.g., at the Association for Computing Machinery's SIGCOMM, CoNext, and Multimedia conferences.


News Article | February 22, 2017
Site: www.eurekalert.org

BEER-SHEVA, Israel...Feb. 22, 2017 - Researchers at the Ben-Gurion University of the Negev (BGU) Cyber Security Research Center have demonstrated that data can be stolen from an isolated "air-gapped" computer by reading the pulses of light from its hard drive LED using various types of cameras and light sensors. In the new paper, the researchers demonstrated how the data can be received by a quadcopter drone flying outside a window, within line of sight of the transmitting computer.

Air-gapped computers are isolated -- separated both logically and physically from public networks -- ostensibly so that they cannot be hacked over the Internet or within company networks. These computers typically contain an organization's most sensitive and confidential information.

Led by Dr. Mordechai Guri, head of R&D at the Cyber Security Research Center, the research team utilized the hard-drive (HDD) activity LED lights that are found on most desktop PCs and laptops. The researchers found that once malware is on a computer, it can indirectly control the HDD LED, turning it on and off rapidly (thousands of flickers per second) -- a rate that exceeds human visual perception. As a result, highly sensitive information can be encoded and leaked over the fast LED signals, which are received and recorded by remote cameras or light sensors.

"Our method compared to other LED exfiltration is unique, because it is also covert," Dr. Guri says. "The hard drive LED flickers frequently, and therefore the user won't be suspicious about changes in its activity."

Dr. Guri and the Cyber Security Research Center have conducted a number of studies to demonstrate how malware can infiltrate air-gapped computers and transmit data. Previously, they determined that computer speakers and fans, FM waves and heat are all methods that can be used to obtain data.

In addition to Dr. Guri, the other BGU researchers include Boris Zadov, who received his M.Sc. degree from the BGU Department of Electrical and Computer Engineering, and Prof. Yuval Elovici, director of the BGU Cyber Security Research Center. Prof. Elovici is also a member of Ben-Gurion University's Department of Software and Information Systems Engineering and director of Deutsche Telekom Laboratories at BGU.

About American Associates, Ben-Gurion University of the Negev

American Associates, Ben-Gurion University of the Negev (AABGU) plays a vital role in sustaining David Ben-Gurion's vision: creating a world-class institution of education and research in the Israeli desert, nurturing the Negev community and sharing the University's expertise locally and around the globe. As Ben-Gurion University of the Negev (BGU) looks ahead to turning 50 in 2020, AABGU imagines a future that goes beyond the walls of academia. It is a future where BGU invents a new world and inspires a vision for a stronger Israel and its next generation of leaders. Together with supporters, AABGU will help the University foster excellence in teaching, research and outreach to the communities of the Negev for the next 50 years and beyond. Visit vision.aabgu.org to learn more. AABGU, which is headquartered in Manhattan, has nine regional offices throughout the United States. For more information, visit http://www.


News Article | January 25, 2017
Site: www.techtimes.com

A new breakthrough has brightened the prospects for using the terahertz band of frequencies for faster and wider data transmission. Lying between infrared light and radio waves, the terahertz band has been underutilized because of the lack of compact, on-chip components such as transmitters, receivers, and modulators.

Researchers at the Tufts University School of Engineering in Massachusetts have developed a modulator that is high-speed, chip-sized, and requires no DC power supply. Its modulation bandwidth exceeds 14 gigahertz, and it can operate above 1 terahertz (THz) on the electromagnetic spectrum. This opens the door to a new generation of wireless devices that transmit at terahertz frequencies, with data rates far beyond what is possible today. The fabrication of an on-chip device capable of gigahertz-rate amplitude modulation gives the industry good reason to be excited. The study has been published in Scientific Reports.

"A prototype device is fabricated which shows THz intensity modulation of 96% at 0.25 THz carrier frequency with low insertion loss and device length as small as 100 microns. The demonstrated modulation cutoff frequency exceeds 14 GHz indicating the potential for the high-speed modulation of terahertz waves. The entire device operates at room temperature with low drive voltage (<2 V) and zero DC power consumption," the researchers said.

Right now, most Wi-Fi and cellular networks work with microwave frequencies of around one gigahertz. Moving to higher terahertz frequencies, with data rates of 100 Gbit/s, would be a big opportunity considering the growing bandwidth crunch.

"This is a very promising device that can operate at terahertz frequencies, is miniaturized using mainstream semiconductor foundry, and is in the same form factor as current communication devices. It's only one building block, but it could help to start filling the THz gap," noted Sameer Sonkusale, corresponding author from the Nano Lab, Department of Electrical and Computer Engineering, Tufts University.

A modulation cutoff frequency higher than 14 gigahertz and the potential to work above 1 THz would be a boon for cellular networks, which are currently confined to bands at the lower end of the spectrum and struggling to keep up with data demand. During the experiments, the prototype device operated within the frequency band of 0.22-0.325 THz, in accordance with the experimental facilities available, though it can work in other bands as well.

The result also raises the bar substantially compared with past efforts to build terahertz modulators, in which engineers could push modulation speeds only to a few kilohertz. The work is a harbinger of faster yet compact terahertz modulators that can deliver high-data-rate wireless communication, since the high carrier frequency of THz waves supports far more signal bandwidth than the radio-frequency (RF) bands currently in use. Upcoming application areas include material identification, imaging, wireless communications, and chemical and biological sensing.
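
The bandwidth argument above is easy to quantify with a back-of-envelope sketch: if a link can use a fixed fraction of its carrier frequency as channel bandwidth at a given spectral efficiency, the achievable bit rate scales with the carrier, which is why moving from roughly 1 GHz microwave carriers toward terahertz carriers makes 100 Gbit/s-class links plausible. The 5 percent bandwidth fraction and 2 bits/s/Hz efficiency below are illustrative assumptions, not measurements of the Tufts device.

    # Back-of-envelope link-rate estimate. Usable bandwidth is taken as a fixed
    # fraction of the carrier frequency and spectral efficiency is assumed;
    # both numbers are illustrative, not properties of the Tufts modulator.

    def rough_bit_rate(carrier_hz, bandwidth_fraction=0.05, bits_per_hz=2.0):
        usable_bw_hz = carrier_hz * bandwidth_fraction
        return usable_bw_hz * bits_per_hz          # bits per second

    for carrier in (2.4e9, 0.25e12, 1.0e12):       # Wi-Fi band, 0.25 THz, 1 THz
        print(f"{carrier / 1e9:8.1f} GHz carrier -> ~{rough_bit_rate(carrier) / 1e9:6.1f} Gbit/s")

In practice the modulator's own cutoff (here, above 14 GHz) limits the symbol rate of a single channel, so reaching the upper end of these figures would require wider modulation bandwidth, several channels, or higher-order modulation.
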
To further optimize terahertz communications, researchers are now experimenting with other components, such as two-dimensional superlattice materials that can accelerate electron oscillations in the terahertz range, and with new power splitters with unique waveguide architectures to enable transmission of wireless terahertz waves through existing fiber optic networks.

© 2017 Tech Times, All rights reserved. Do not reproduce without permission.
