
Bell Laboratories is the research and development subsidiary of Alcatel-Lucent. Bell Laboratories operates its headquarters in Murray Hill, New Jersey, United States, and has research and development facilities throughout the world. The historic laboratory originated in the late 19th century as the Volta Laboratory and Bureau created by Alexander Graham Bell. Bell Labs was also at one time a division of the American Telephone & Telegraph Company, half-owned through its Western Electric manufacturing subsidiary. Researchers working at Bell Labs are credited with the development of radio astronomy, the transistor, the laser, the charge-coupled device, information theory, the UNIX operating system, the C programming language, the S programming language, and the C++ programming language. Eight Nobel Prizes have been awarded for work completed at Bell Laboratories. On May 20, 2014, Bell Labs announced the Bell Labs Prize, a competition for innovators to offer proposals in information and communications technologies, with cash awards of up to $100,000 for the grand prize. (Wikipedia)


Lu L., Georgia Institute of Technology | Li G.Y., Georgia Institute of Technology | Swindlehurst A.L., University of California at Irvine | Ashikhmin A., Bell Laboratories | Zhang R., National University of Singapore
IEEE Journal on Selected Topics in Signal Processing | Year: 2014

Massive multiple-input multiple-output (MIMO) wireless communications refers to the idea of equipping cellular base stations (BSs) with a very large number of antennas, and it has been shown to potentially allow for orders-of-magnitude improvements in spectral and energy efficiency using relatively simple (linear) processing. In this paper, we present a comprehensive overview of state-of-the-art research on the topic, which has recently attracted considerable attention. We begin with an information-theoretic analysis to illustrate the conjectured advantages of massive MIMO, and then we address implementation issues related to channel estimation, detection, and precoding schemes. We particularly focus on the potential impact of pilot contamination caused by the use of non-orthogonal pilot sequences by users in adjacent cells. We also analyze the energy efficiency achieved by massive MIMO systems, and demonstrate how the degrees of freedom provided by massive MIMO systems enable efficient single-carrier transmission. Finally, the challenges and opportunities associated with implementing massive MIMO in future wireless communications systems are discussed. © 2014 IEEE.
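
The claim that relatively simple linear processing suffices once the antenna count is large can be made concrete with a small simulation. The sketch below is not taken from the paper; the antenna counts, user count, SNR, and i.i.d. Rayleigh channel model are illustrative assumptions. It applies matched-filter (maximum-ratio combining) detection on the uplink and reports how the post-processing SINR of one user grows with the number of base-station antennas M:

    # Minimal sketch (assumed parameters, not from the paper): uplink massive MIMO
    # with matched-filter (MRC) detection, illustrating why simple linear
    # processing works well when the number of base-station antennas M is large.
    import numpy as np

    rng = np.random.default_rng(0)
    K = 8            # single-antenna users (assumed)
    snr_lin = 10.0   # per-user transmit SNR, linear scale (assumed)

    for M in (16, 64, 256):                       # base-station antenna counts
        # i.i.d. Rayleigh channel: M x K complex Gaussian matrix, unit-variance entries
        H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
        x = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)
        n = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
        y = np.sqrt(snr_lin) * H @ x + n          # received signal at the BS

        x_hat = H.conj().T @ y                    # MRC: soft estimates of the K symbols
        # Post-processing SINR of user 0: desired power vs. interference plus noise
        g = H.conj().T @ H
        desired = snr_lin * np.abs(g[0, 0]) ** 2
        interference = snr_lin * np.sum(np.abs(g[0, 1:]) ** 2)
        noise = np.linalg.norm(H[:, 0]) ** 2
        sinr_db = 10 * np.log10(desired / (interference + noise))
        print(f"M={M:4d}  MRC SINR for user 0: {sinr_db:5.1f} dB")

As M grows well beyond the number of users, the user channels become nearly orthogonal and the matched filter alone separates the users reasonably well; pilot contamination, which the paper analyzes, is deliberately not modeled here.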


Shang X., Bell Laboratories | Poor H.V., Princeton University
IEEE Transactions on Information Theory | Year: 2012

An interference channel is said to have strong interference if a certain pair of mutual-information inequalities is satisfied for all input distributions. These inequalities ensure that the capacity of the interference channel with strong interference is achieved by jointly decoding the signal and the interference. This definition of strong interference applies to discrete memoryless, scalar Gaussian, and vector Gaussian interference channels. However, there exist vector Gaussian interference channels that may not satisfy the strong-interference condition but for which the capacity can still be achieved by jointly decoding the signal and the interference. This kind of interference is called generally strong interference. Sufficient conditions for a vector Gaussian interference channel to have generally strong interference are derived. The sum-rate capacity and the boundary points of the capacity region are also determined. © 2012 IEEE.
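
For reference, the usual strong-interference conditions for a two-user memoryless interference channel with inputs X_1, X_2 and outputs Y_1, Y_2 are the pair of mutual-information inequalities below, in the form due to Costa and El Gamal. This is a standard textbook statement included for context, not the paper's exact formulation; the paper's "generally strong interference" concerns vector Gaussian channels for which joint decoding remains optimal even though such conditions need not hold for all input distributions.

    % Standard two-user strong-interference conditions (Costa and El Gamal),
    % required to hold for every product input distribution p(x_1)p(x_2):
    \begin{align*}
      I(X_1; Y_1 \mid X_2) &\le I(X_1; Y_2 \mid X_2), \\
      I(X_2; Y_2 \mid X_1) &\le I(X_2; Y_1 \mid X_1).
    \end{align*}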


Gettys J., Bell Laboratories
IEEE Internet Computing | Year: 2011

Bufferbloat is the existence of excessively large (bloated) buffers in systems, particularly network communication systems. Systems suffering from bufferbloat exhibit bad latency under load under some or all circumstances, depending on whether and where the bottleneck in the communication path exists. Bufferbloat encourages network congestion; it destroys congestion avoidance in protocols such as TCP, HTTP, BitTorrent, and so on. Network congestion-avoidance algorithms depend on timely packet drops or ECN; bloated buffers invalidate this design presumption. Without active queue management, these bloated buffers will fill and stay full. Bufferbloat is an endemic disease in today's Internet. © 2011 IEEE.
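
A rough fluid model makes the latency effect concrete; the link rate, offered load, and buffer sizes below are assumed values, not figures from the article. Once a bottleneck is persistently overloaded, a drop-tail queue fills to whatever the buffer permits, and the standing delay is simply the backlog divided by the link rate:

    # Minimal sketch (assumed parameters, not from the article): a fluid model of a
    # drop-tail FIFO at a bottleneck link, showing how an oversized buffer becomes a
    # standing queue and hence seconds of added latency under sustained overload.

    LINK_RATE = 1_000_000        # bottleneck service rate: 1 Mbit/s (assumed)
    OFFERED   = 1_200_000        # offered load: 1.2 Mbit/s, i.e. 20% overload (assumed)
    STEP      = 0.01             # simulation time step, seconds

    def standing_delay(buffer_bits, duration=30.0):
        """Queueing delay (s) seen by a new packet after `duration` s of overload."""
        backlog = 0.0
        t = 0.0
        while t < duration:
            backlog += (OFFERED - LINK_RATE) * STEP        # overload fills the queue
            backlog = min(max(backlog, 0.0), buffer_bits)  # drop-tail: capped at the buffer
            t += STEP
        return backlog / LINK_RATE                         # time to drain the backlog

    for buf_kb in (32, 256, 4096):                         # buffer sizes in kilobytes
        delay = standing_delay(buf_kb * 8 * 1024)
        print(f"{buf_kb:5d} KB buffer -> ~{delay:6.2f} s of added latency under load")

The model deliberately omits TCP dynamics; its only point is that a buffer sized far beyond the bandwidth-delay product converts directly into queueing delay, which is what active queue management is meant to prevent.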


Grant
Agency: NSF | Branch: Continuing grant | Program: | Phase: RES IN NETWORKING TECH & SYS | Award Amount: 35.09K | Year: 2016

Software defined radio (SDR) is emerging as a key technology to satisfy rapidly increasing data rate demands on the nation's mobile wireless networks while ensuring coexistence with other spectrum users. When SDRs are in the hands and pockets of average people, it will be easy for a selfish user to alter his device to transmit and receive data on unauthorized spectrum, or to ignore priority rules, making the network less reliable for many other users. Further, malware could cause an SDR to exhibit illegal spectrum use without the user's awareness. The FCC has an enforcement bureau that detects interference via complaints and extensive manual investigation. The mechanisms currently used for locating spectrum offenders are time consuming, human-intensive, and expensive. A violator's illegal spectrum use can be too temporary or too mobile to be detected and located using existing processes. This project envisions a future where a crowdsourced and networked fleet of spectrum sensors deployed in homes, community and office buildings, on vehicles, and in cell phones will detect, identify, and locate illegal use of the spectrum across wide areas and frequency bands. This project will investigate and test new privacy-preserving crowdsourcing methods to detect and locate spectrum offenders. New tools to quickly find offenders will discourage users from illegal SDR activity and enable recovery from spectrum-offending malware. In short, these tools will ensure the efficient, reliable, and fair use of the spectrum for network operators, government and scientific purposes, and wireless users. New course materials and demonstrations for use in public outreach will be developed on the topics of wireless communications, dynamic spectrum access, data mining, network security, and crowdsourcing.

There are several challenges the project will address in developing methods and tools to find spectrum offenders. First, the project will enable localization of offenders via crowdsourced spectrum measurements that do not decode the transmitted data and thus preserve users' data and identity privacy. Second, the crowdsourced sensing strategy will implicitly adapt to the density of traffic and explicitly adapt to focus on suspicious activity. Next, the sensing strategy will stay within an energy budget and include incentive models to encourage participation, yet have sufficient spatial and temporal coverage to provide high statistical confidence in detecting illegal activity. Finally, the developed methods will be evaluated using both simulation and extensive experiments, to quantify performance and provide a rich public data set for other researchers.
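
As one illustration of decode-free sensing, the sketch below estimates a transmitter's position from crowdsourced received-signal-strength (RSS) reports using a simple RSS-weighted centroid. This is a generic baseline under assumed sensor positions, path-loss exponent, and shadowing, not the localization method proposed in this project, and it does not capture the project's privacy, incentive, or adaptive-sensing mechanisms:

    # Minimal sketch (hypothetical setup, not the project's method): locating a
    # transmitter from crowdsourced received-signal-strength reports alone, i.e.
    # without decoding any payload, via an RSS-weighted centroid.
    import numpy as np

    rng = np.random.default_rng(1)
    tx = np.array([420.0, 310.0])                 # unknown transmitter position, meters (assumed)
    sensors = rng.uniform(0, 1000, size=(50, 2))  # 50 crowdsourced sensors in a 1 km square (assumed)

    # Log-distance path loss: RSS[dBm] = P0 - 10*n*log10(d) + shadowing (assumed model)
    d = np.linalg.norm(sensors - tx, axis=1)
    rss = -30.0 - 10 * 3.0 * np.log10(d) + rng.normal(0, 2.0, size=d.shape)

    # Weighted centroid: convert dBm to linear power and weight each sensor by it,
    # so nearby (louder) sensors dominate the position estimate.
    w = 10 ** (rss / 10.0)
    estimate = (w[:, None] * sensors).sum(axis=0) / w.sum()

    print("true position:", tx)
    print("estimate     :", np.round(estimate, 1))
    print("error (m)    :", round(float(np.linalg.norm(estimate - tx)), 1))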


Grant
Agency: NSF | Branch: Standard Grant | Program: | Phase: SOFTWARE & HARDWARE FOUNDATION | Award Amount: 145.50K | Year: 2016

Software is embedded in our daily activities. Ensuring that software is trustworthy (does what is intended) and secure (is not vulnerable to attack) is a prime concern. Much attention has been devoted to establishing the correctness of high-level programs. This project is focused on the important task of ensuring that the often complex and opaque transformations carried out by a compiler do not degrade the trustworthiness and security guarantees of its input program.

The key innovation pursued in this project is self-certification, which guarantees the correctness and security of compilation. A self-certifying compiler creates a tangible, independently checkable proof justifying the correctness of each compilation run. By linking in information from external analysis tools, certificates can also aid in obtaining better machine code. In particular, they allow for the automatic insertion of defensive measures that protect the program from common security attacks. This work builds on existing theoretical ideas and compiler implementations while extending them in new directions. The self-certifying compiler is implemented in the popular LLVM framework, making it suitable for immediate adoption by programmers and making its security benefits available to end users in a transparent fashion. Provable program correctness is a true Grand Challenge for computing. By developing both the theory and the implementation of a self-certifying compiler, this project takes a significant step toward meeting that challenge.
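
The overall shape of self-certification can be illustrated with a toy example: a miniature "compiler" performs a single rewrite and emits a certificate, and an independent checker validates the certificate without trusting the compiler. The sketch below is hypothetical Python, not the project's LLVM-based implementation; the exhaustive check over 8-bit inputs stands in for a real machine-checkable proof:

    # Minimal sketch (hypothetical, not the project's LLVM-based system): the shape
    # of self-certification. A toy "compiler" rewrites an expression and emits a
    # certificate; an independent checker validates the certificate by exhaustive
    # testing over all 8-bit inputs instead of a formal proof.

    WIDTH = 8
    MASK = (1 << WIDTH) - 1

    def compile_expr(source):
        """Strength-reduce x*2 to x<<1 and emit a (source, target) certificate."""
        target = source.replace("x * 2", "x << 1")
        certificate = {"source": source, "target": target}
        return target, certificate

    def check_certificate(cert):
        """Independent checker: the two expressions must agree on every 8-bit x."""
        return all(
            (eval(cert["source"], {"x": x}) & MASK) == (eval(cert["target"], {"x": x}) & MASK)
            for x in range(1 << WIDTH)
        )

    code, cert = compile_expr("(x * 2) + 3")
    print("compiled to   :", code)
    print("certificate ok:", check_certificate(cert))

Because the checker is separate from the compiler, a bug in the rewrite would surface as a rejected certificate rather than as silently miscompiled code, which is the property the project aims to provide at the scale of a real compiler.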
