
Cambridge, United Kingdom

Lutz D.R., ARM Inc
Proceedings - Symposium on Computer Arithmetic | Year: 2011

We present an IEEE 754-2008- and ARM-compliant floating-point microarchitecture that preserves the higher performance of separate multiply and add units while decreasing the effective latency of fused multiply-adds (FMAs). The multiplier supports subnormals in a novel and faster manner, shifting the partial products so that injection rounding can be used. The early-normalizing adder retains the low latency of a split-path near/far adder, but does so in a unified path with less area. The adder also allows rounding on effective subtractions involving one input that is twice the normal width, a necessary feature for handling FMAs. The resulting floating-point unit has about twice the instructions-per-cycle (IPC) performance of the best previous ARM design, and can be clocked at a higher speed despite the wider paths required by FMAs. © 2011 IEEE.
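The defining property of an FMA is that the product and addend are combined with a single rounding step, whereas separate multiply and add units round twice. A minimal C sketch (not from the paper, purely illustrative) makes this visible on any machine with a correctly rounded fma():

/* Contrast a fused multiply-add (one rounding) with a separate
 * multiply then add (two roundings). Inputs are chosen so the
 * intermediate product loses its low-order bit when rounded alone. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double a = 1.0 + 0x1.0p-27;      /* 1 + 2^-27 */
    double b = 1.0 + 0x1.0p-27;
    double c = -(1.0 + 0x1.0p-26);

    double separate = a * b + c;     /* a*b rounds before the add: result 0.0 */
    double fused    = fma(a, b, c);  /* single rounding keeps the 2^-54 term  */

    printf("separate: %a\n", separate);
    printf("fused:    %a\n", fused);
    return 0;
}

With contraction disabled (e.g., -ffp-contract=off) and a correctly rounded fma(), the separate path prints 0 while the fused path prints 0x1p-54.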


Blake G., University of Michigan | Dreslinski R.G., University of Michigan | Mudge T., University of Michigan | Flautner K., ARM Inc
Proceedings - International Symposium on Computer Architecture | Year: 2010

As the effective limits of frequency and instruction-level parallelism have been reached, the strategy of microprocessor vendors has changed to increasing the number of processing cores on a single chip each generation. The implicit expectation is that software developers will write their applications with concurrency in mind to take advantage of this sudden change in direction. In this study we analyze whether software developers for laptop/desktop machines have followed the recent hardware trends by creating software for chip multi-processing. We conduct a study of a wide range of applications on Microsoft Windows 7 and Apple's OS X Snow Leopard, measuring Thread Level Parallelism on a high-performance workstation and a low-power desktop. In addition, we explore graphics processing units (GPUs) and their impact on chip multi-processing. We compare our findings to a study done 10 years ago, which concluded that a second core was sufficient to improve system responsiveness. Our results on today's machines show that, 10 years later, 2-3 cores are surprisingly still more than adequate for most applications and that the GPU often remains under-utilized. However, in some application-specific domains an 8-core SMT system with a 240-core GPU can be effectively utilized. Overall, these studies suggest that many-core architectures are not a natural fit for current desktop/laptop applications. Copyright 2010 ACM.
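Thread Level Parallelism (TLP) in studies of this kind is typically computed as the average number of simultaneously runnable threads, counted only over intervals in which the machine is not idle. A small C sketch of that calculation, using hypothetical per-interval thread counts (illustrative only, not the authors' measurement tooling):

#include <stdio.h>

/* counts[i] holds how many threads were runnable during sample interval i.
 * TLP = (sum of counts over non-idle intervals) / (number of non-idle intervals). */
static double tlp(const int *counts, int n) {
    long busy = 0, threads = 0;
    for (int i = 0; i < n; i++) {
        if (counts[i] > 0) {
            busy++;
            threads += counts[i];
        }
    }
    return busy ? (double)threads / (double)busy : 0.0;
}

int main(void) {
    int samples[] = {0, 1, 2, 2, 0, 1, 3, 0};   /* hypothetical trace */
    printf("TLP = %.2f\n", tlp(samples, 8));    /* (1+2+2+1+3)/5 = 1.80 */
    return 0;
}

A TLP of around 2 on an 8-core machine, for example, would be consistent with the paper's finding that 2-3 cores cover most desktop workloads.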


Becker A., ARM Inc
Proceedings of the IEEE VLSI Test Symposium | Year: 2016

A new methodology and algorithm are presented for testing on-chip memories concurrently with the normal operation of a processor, with little or no effect on its performance. The test algorithm uses a series of short bursts of memory accesses and does not destroy the memory contents. This paper describes the implementation of the memory built-in self-test (MBIST) methodology in the ARM® Cortex®-M7 microprocessor to allow on-line testing of the Level 1 cache SRAMs and Tightly Coupled Memories (TCMs). A programmable MBIST controller is also described that can execute the algorithm to test memories at speed and autonomously from software running on the processor, via the ARM standard MBIST interface. © 2016 IEEE.
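The paper's hardware algorithm is not reproduced here, but the central idea of non-destructive, burst-based testing can be sketched in software: save a word, exercise it with test patterns in a short burst, and restore it before normal accesses resume. A hedged C illustration (a software analogue only, not the MBIST implementation):

#include <stdint.h>
#include <stdbool.h>

/* Test a single memory word without destroying its contents.
 * Returns false if any pattern fails to read back correctly. */
static bool test_word_nondestructive(volatile uint32_t *addr) {
    static const uint32_t patterns[] = { 0x00000000u, 0xFFFFFFFFu,
                                         0xAAAAAAAAu, 0x55555555u };
    uint32_t saved = *addr;                     /* preserve live data     */
    bool ok = true;
    for (unsigned i = 0; i < 4; i++) {
        *addr = patterns[i];                    /* write test pattern     */
        if (*addr != patterns[i])               /* read back and compare  */
            ok = false;
    }
    *addr = saved;                              /* restore original value */
    return ok;
}

In a real controller the bursts are interleaved with normal processor traffic and run at speed; the sketch only shows why the memory contents survive the test.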


Patent
ARM Inc | Date: 2014-09-29

A graphics processing pipeline comprises a tessellation stage


News Article | November 11, 2015
Site: www.eweek.com

Throughout keynote presentations and other conference sessions, the message was that security needs to be a priority when developing for the Internet of things.

SANTA CLARA, Calif.—ARM is looking to extend the reach of its chip designs beyond smartphones and tablets and into new growth areas, with the Internet of things being a key one. The company sees the development of tens of billions of connected devices, systems and sensors as a natural fit for its low-power system-on-a-chip (SoC) designs, and has made a strong push into the burgeoning Internet of things (IoT) market over the last couple of years. That has included everything from the development of its mbed IoT platform to partnerships with the likes of IBM and acquisitions to build out its capabilities in the market.

At the company's TechCon 2015 show here Nov. 10, CTO Mike Muller unveiled ARM's new Cortex-A35 SoC, which is aimed at low-cost smartphones but also can be used for IoT devices. With a broad array of devices becoming connected, the attack surface for hackers is rapidly increasing, and the issues of security and privacy have been at the forefront of IoT discussions throughout the industry. They also have been a focus at TechCon this week, where there were more than a dozen sessions about security and the IoT. In addition, at the same time he announced the Cortex-A35, Muller introduced the company's efforts to bring its TrustZone technology, which is pervasive in its architectures for mobile systems, to IoT devices through the development of its new ARMv8-M architecture. Muller stressed the need for a layered approach to security that starts with the hardware and works its way up through software and communications.

CEO Simon Segars followed that up in his Nov. 11 keynote address, talking about security, privacy and the need to develop trust among end users and governments in the use of the myriad of connected devices. Otherwise, they risk stalling what he and ARM partners on the TechCon stage said is a significant opportunity for technology innovations, business advancements and improving the lives of the world's population. If people don't trust their connected devices, they won't want to use them, Segars said. If governments are concerned about security, they will move in with regulations. The key is for the tech industry to get ahead of the security and privacy issues that impact trust and develop solutions that will deal with the challenges, the CEO said. An important part of that will be to address security as the devices are developed, rather than trying to bolt on technologies later.

"We have the opportunity to get this right," Segars said to several thousand TechCon attendees. "Let's take that opportunity to get the IoT right." As the IoT evolves and gets more complex, it will be difficult to address security after the fact, he added.

The Internet of things is expected to grow quickly over the next few years, with Cisco Systems forecasting that the number of connected devices worldwide will jump from 25 billion in 2014 to more than 50 billion by 2020. IDC analysts expect that IoT spending will hit $1.7 trillion by that year. IDC also is predicting that as the number of devices grows, so will the number of cyber-attacks. According to IDC figures, the number of IoT devices will grow to 22 billion by 2018 and will fuel the development of 200,000 new apps and solutions to take advantage of them. However, security will continue to be a key issue.

During a recent webcast, Frank Gens, senior vice president and chief analyst at IDC, reportedly said that by 2018, two-thirds of enterprises will experience IoT security breaches. "Trust is all about risk mitigation," Coby Sella, vice president of products and technology at ARM, said during a panel discussion. "You need to address that risk factor."
