Massachusetts Institute of Technology

Cambridge, MA, United States

The Massachusetts Institute of Technology is a private research university in Cambridge, Massachusetts. Founded in 1861 in response to the increasing industrialization of the United States, MIT adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. Researchers worked on computers, radar, and inertial guidance during World War II and the Cold War. Post-war defense research contributed to the rapid expansion of the faculty and campus under James Killian. The current 168-acre campus opened in 1916 and extends over 1 mile along the northern bank of the Charles River basin. MIT, with five schools and one college containing a total of 32 departments, is traditionally known for research and education in the physical sciences and engineering, and more recently in biology, economics, linguistics, and management as well. The "Engineers" sponsor 31 sports, most teams of which compete in the NCAA Division III's New England Women's and Men's Athletic Conference; the Division I rowing programs compete as part of the EARC and EAWRC. MIT is often cited as among the world's top universities. As of 2014, 81 Nobel laureates, 52 National Medal of Science recipients, 45 Rhodes Scholars, 38 MacArthur Fellows, and 2 Fields Medalists have been affiliated with MIT. MIT has a strong entrepreneurial culture, and the aggregated revenues of companies founded by MIT alumni would rank as the eleventh-largest economy in the world. (Source: Wikipedia)


The Advisory Council is led by Swamy Kotagiri, Magna Chief Technology Officer (CTO), and consists of some of the most recognized and respected experts in the global automotive and tech industries. The council brings a wider circle of insight, knowledge and experience from various industries that ultimately helps accelerate the execution of Magna's technology and business objectives. "The pace of innovation in the automotive industry is like nothing we have ever seen before, creating even more challenges and opportunities," said Kotagiri. "At Magna, we welcome the challenge and aim to seize the opportunities by continuing to leverage our culture of innovation, while embracing a new level of innovation outreach. We are excited to have such a distinguished group of individuals bringing their vision and insights to our company." Advisory Council members will provide high-level strategic planning insights and experience in the areas of advanced driver assistance systems, environmental and automotive safety, overall industry trends, and next-generation technologies. Chaired by Kotagiri, the Advisory Council comprises six members who are recognized leaders in their respective fields, several of whom have significant experience in product innovation and the implementation of new technologies. "Magna's deep vehicle systems knowledge and electronics capabilities, combined with its global engineering and manufacturing expertise, are remarkable," said Tony Fadell. "They are in a great position to help drive change in the auto industry and I am excited to be working with such an innovative company." "Magna is a company committed to helping define the future of mobility and I am delighted to be a part of such a distinguished group of individuals who collectively can bring new opportunities to Magna and the industry," said Dr. Ian Hunter. Swamy Kotagiri is globally responsible for managing Magna's innovation and new product strategy and development.
As CTO, Kotagiri helps Magna's product groups bring innovative ideas to market, which allows the company to move the automotive industry forward. Mei-Wei Cheng is a member of the Board of Directors of Seagate Technology PLC and recently served as non-executive Chairman of Pactera. He formerly served as CEO and President of the Chinese subsidiaries of AT&T, Siemens, Ford Motor Company and General Electric. He holds a bachelor's degree in industrial engineering/operations research from Cornell University and an MBA from Rutgers University. Tony Fadell is the inventor of the iPod, an inventor of the iPhone, and founder of Nest, the company that pioneered the "Internet of things". He is an active investor and entrepreneur with a 25-year history of founding companies and designing products that improve people's lives. Fadell has authored more than 300 patents. In May 2016, TIME named the Nest Thermostat, the iPod and the iPhone as three of the "50 Most Influential Gadgets of All Time." Dr. Ian Hunter is a Professor of Mechanical Engineering and runs the BioInstrumentation Lab at the Massachusetts Institute of Technology. Dr. Hunter has filed over 150 patents, produced more than 500 scientific and engineering publications, and has founded and/or co-founded 25 companies. He received his bachelor's, master's and doctorate degrees from the University of Auckland and completed a post-doctoral fellowship in the department of Biomedical Engineering at McGill University in Canada. John Maddox is the CEO of the American Center for Mobility. He began his career as a Research Engineer at Ford Motor Company and has held positions such as Associate Administrator at the National Highway Traffic Safety Administration and Compliance Officer at Volkswagen North America. He holds a degree in mechanical engineering from the University of Maryland and a master's degree in engineering management from the University of Detroit Mercy.
Paul Mascarenas is a member of the Board of Directors at ON Semiconductor and the United States Steel Corporation. He previously held a number of senior leadership positions at Ford Motor Company, most recently serving as Chief Technical Officer. Paul holds a bachelor's degree in mechanical engineering from the University of London, King's College in England and holds an honorary doctorate degree from Chongqing University in China.

ABOUT MAGNA
We are a leading global automotive supplier with 317 manufacturing operations and 102 product development, engineering and sales centres in 29 countries. We have over 155,000 employees focused on delivering superior value to our customers through innovative products and processes, and world class manufacturing. We have complete vehicle engineering and contract manufacturing expertise, as well as product capabilities which include body, chassis, exterior, seating, powertrain, active driver assistance, vision, closure and roof systems. We also have electronic and software capabilities across many of these areas. Our common shares trade on the Toronto Stock Exchange (MG) and the New York Stock Exchange (MGA). For further information about Magna, visit our website at www.magna.com.

THIS RELEASE MAY CONTAIN STATEMENTS WHICH CONSTITUTE "FORWARD-LOOKING STATEMENTS" UNDER APPLICABLE SECURITIES LEGISLATION AND ARE SUBJECT TO, AND EXPRESSLY QUALIFIED BY, THE CAUTIONARY DISCLAIMERS THAT ARE SET OUT IN MAGNA'S REGULATORY FILINGS. PLEASE REFER TO MAGNA'S MOST CURRENT MANAGEMENT'S DISCUSSION AND ANALYSIS OF RESULTS OF OPERATIONS AND FINANCIAL POSITION, ANNUAL INFORMATION FORM AND ANNUAL REPORT ON FORM 40-F, AS REPLACED OR UPDATED BY ANY OF MAGNA'S SUBSEQUENT REGULATORY FILINGS, WHICH SET OUT THE CAUTIONARY DISCLAIMERS, INCLUDING THE RISK FACTORS THAT COULD CAUSE ACTUAL EVENTS TO DIFFER MATERIALLY FROM THOSE INDICATED BY SUCH FORWARD-LOOKING STATEMENTS. THESE DOCUMENTS ARE AVAILABLE FOR REVIEW ON MAGNA'S WEBSITE AT WWW.MAGNA.COM.


News Article | May 16, 2017
Site: phys.org

American wunderkind Reuben Paul may still be only in the 6th grade at his school in Austin, Texas, but he and his teddy bear Bob wowed hundreds at a timely cyber security conference in The Netherlands. "From airplanes to automobiles, from smart phones to smart homes, anything or any toy can be part of the Internet of Things (IoT)," he said, a small figure pacing the huge stage at the World Forum in The Hague. "From terminators to teddy bears, anything or any toy can be weaponised." To demonstrate, he deployed his cuddly bear, which connects to the iCloud via WiFi and Bluetooth smart technology to receive and transmit messages. Plugging into his laptop a Raspberry Pi, a small credit-card-sized computer, Reuben scanned the hall for available Bluetooth devices and, to everyone's amazement including his own, downloaded dozens of numbers, including some belonging to top officials. Then, using the Python programming language, he hacked into his bear via one of the numbers to turn on one of its lights and record a message from the audience. "Most internet-connected things have a Bluetooth functionality ... I basically showed how I could connect to it, and send commands to it, by recording audio and playing the light," he told AFP later. "IoT home appliances, things that can be used in our everyday lives, our cars, lights, refrigerators, everything like this that is connected can be used and weaponised to spy on us or harm us." Such devices can be used to steal private information such as passwords, to spy on kids through remote surveillance, or to track a person's location via GPS. More chillingly, a toy could say "meet me at this location and I will pick you up," Reuben said. His father, information technology expert Mano Paul, recounted how, at about age six, Reuben had revealed his early IT skills by correcting him during a business call.
Using a simple explanation from his dad of how one smart phone game worked, Reuben figured out that the same kind of algorithm lay behind the popular video game Angry Birds. "He has always surprised us. Every moment when we teach him something he's usually the one who ends up teaching us," Mano Paul told AFP. But Paul said he had been "shocked" by the vulnerabilities discovered in kids' toys after Reuben first hacked a toy car, before moving on to more complicated things. "It means that my kids are playing with timebombs, that over time somebody who is bad or malicious can exploit." The family has since helped Reuben, who is also the youngest American to have become a Shaolin Kung Fu black belt, set up his CyberShaolin non-profit organisation. Its aim is "to inform kids and adults about the dangers of cyber insecurity," Reuben said, adding that he also wants to press home the message that manufacturers, security researchers and the government have to work together. Reuben has ambitious plans for the future, aiming to study cyber security at Caltech or MIT and then use his skills for good. Failing that, maybe he could become an Olympian in gymnastics, another sport he excels in.
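The first step of the demonstration, enumerating nearby Bluetooth devices from a Raspberry Pi before picking a target, can be sketched in Python. This is a minimal, hypothetical illustration, not Reuben's actual code: it assumes the Linux BlueZ tool `hcitool lescan`, whose output is one `address name` pair per line, with `(unknown)` for devices that do not advertise a name. The device name in the example is made up.

```python
import re

# `hcitool lescan` prints lines such as:
#   AA:BB:CC:DD:EE:FF TeddyBear
#   11:22:33:44:55:66 (unknown)
ADDR_RE = re.compile(r"^([0-9A-F]{2}(?::[0-9A-F]{2}){5})\s+(.*)$", re.IGNORECASE)

def parse_lescan_output(text):
    """Return {address: name} for every named device in the scan output."""
    devices = {}
    for line in text.splitlines():
        match = ADDR_RE.match(line.strip())
        if match and match.group(2) != "(unknown)":
            devices[match.group(1)] = match.group(2)
    return devices

# On a Raspberry Pi (Linux + BlueZ, run as root), a live scan would look like:
#   import subprocess
#   scan = subprocess.run(["timeout", "10", "hcitool", "lescan"],
#                         capture_output=True, text=True)
#   print(parse_lescan_output(scan.stdout))
```

Once an address is known, a tool such as BlueZ's `gatttool` (or a Python BLE library) can connect and write to the device's characteristics, which is the "send commands to it" step Reuben describes.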


News Article | May 22, 2017
Site: news.mit.edu

We are entering the age of ubiquitous sensing. Smart sensors will soon track our health and wellness, enable autonomous cars, and monitor machines, buildings, and bridges. Massive networks of small, inexpensive sensors will enable large-scale global data collection — impacting the distribution of agriculture and water, environmental monitoring, disaster recovery, disease-outbreak detection and intervention, and the operation of cities. With this change in mind, MIT is creating a singular hub to unite experts as they develop a new generation of sensors, and sensing and measurement technologies. On May 25-26, SENSE.nano will debut, marking the first “center of excellence” powered by MIT.nano, the 214,000-square-foot research facility taking shape in the heart of MIT’s campus. The center will empower people in the MIT community, engage industry leaders, and educate the public. “There is a thing we do extremely well at MIT: We lock arms and make progress that is beyond the scope of any one researcher,” says Timothy Swager, the John D. MacArthur Professor in the Department of Chemistry. “If you look at what’s happening with sensors, you’ll see that many different disciplines have to come together. Ubiquitous sensing has so many aspects — chemical, biological, physical, radiological,” he says. “With all this sensing research going on, we need a place to coordinate our synergies.” As part of the kickoff, a full-day symposium will feature experts discussing technical challenges, commercial and humanitarian needs, and the societal impact of ubiquitous sensor and sensing systems. In a nod to the everyday impact of this technology, NPR journalist Tom Ashbrook will lead a broad discussion on “Sensing, Society, and Technology.” “Novel sensors and sensing systems will provide previously unimaginable insight into the condition of individuals and the built and natural worlds, positively impacting people, machines, and the environment,” says Brian W.
Anthony, a principal research engineer at MIT and director of the Advanced Manufacturing and Design program, who is co-leading the new center. SENSE.nano will support collaboration between people from a range of specialty areas — engineering, business, Earth science, electronics, computation, nanoscience, materials science, neuroscience, chemistry, physics, computer science, biology, and advanced manufacturing. “We want to use this event as an opportunity to strengthen the community and improve our connection to the local innovation and manufacturing ecosystem,” adds Anthony. “And to accelerate the rate at which our new sensing technologies and innovations are scaled up and go out and impact the IoT-enabled industries, advanced instrumentation, and beyond.” Vince Roche, CEO of Analog Devices, and Gururaj “Desh” Deshpande, founder of the Deshpande Foundation, will offer morning and afternoon keynotes, framing the broad impact and opportunity of sensing technologies for the U.S. economy and the world’s societal needs. Analog Devices, a semiconductor company cofounded by Raymond S. Stata, is a cornerstone company in sensor products and advanced manufacturing in Massachusetts. “It is time for people to reach out and find the best ways to collaborate,” he says. “We’re looking for input from the community, sensor and sensing system manufacturers, government, academe, and researchers to help us define the grand challenge focus areas within SENSE.nano.”


News Article | November 2, 2015
Site: phys.org

As the population of internet-connected things continues to rise, how will autonomous robotic devices see, and be smart enough to navigate obstacles, each other and people in the real world? Light Detection And Ranging (LiDAR) is one answer. The remote sensing technology has been around since the 1960s. Back then, it required enormous scanners that filled planes and cost hundreds of thousands of dollars. It has become increasingly affordable, and in some configurations can fit inside something the size of a baseball. It is widely used for surveying and for mapping in geological research and archaeology. It works like this: a scanner shoots out infrared laser pulses that bounce off solid surfaces; the time each reflection takes to return gives a distance, and those distances combine into a 3D image. In the case of robots, that 3D image is interpreted by the computer system, which is coded with instructions for how the robot should react. Like most technologies that survive the test of time, LiDAR has evolved. Today, police use LiDAR speed guns to catch vehicles that exceed the speed limit. A small, cutting-edge LiDAR sensor can be purchased for $8,000. While this is still expensive, advancements in LiDAR technology and the rise in robot and autonomous machine research have combined to create a boom in projects that are pushing the limits of what the environment-mapping technology can do. Deloitte estimates that this year's $18 billion market for sales of robots for logistics, packaging and materials handling will nearly double by 2020. The research firm stated that many of these robots are already designed to interact directly with consumers or assist them with shopping. These machines need to somehow see in order to function properly. "LiDAR systems have been an absolutely necessary tool for the advancement of mobile robots," said Charles Grinnell, CEO of Harvest Automation, which builds autonomous agricultural robots. "LiDAR is a great all-in-one solution for us."
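The ranging step is simple arithmetic: a pulse's round-trip time t at the speed of light c gives a one-way distance d = c·t/2, and each distance plus the scanner's beam angles yields one point of the 3D image. A minimal sketch; the spherical-coordinate convention below is an illustrative assumption, not any particular sensor's:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """Distance to a surface from a LiDAR pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def to_point(distance, azimuth_rad, elevation_rad):
    """Convert one range/angle return into an (x, y, z) point,
    the raw ingredient of the 3D image described above."""
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)
```

A return arriving 200 nanoseconds after the pulse left corresponds to a surface roughly 30 metres away; sweeping the beam and repeating this per pulse produces the point cloud a robot's planner consumes.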
Then there's the Cheetah running robot. This four-legged, 70-pound robot, developed at MIT and funded by the Defense Advanced Research Projects Agency (DARPA), can run untethered at 13 mph over a flat course. Now it can jump over hurdles autonomously with the help of an onboard LiDAR setup. The system lets the robot spot approaching obstacles, then calculate their size and distance. A series of path-planning algorithms tell the robot when and how high to jump, all without any human help. The Cheetah can clear hurdles 18 inches high—more than half its own height—while running 5 mph. On a treadmill, the robot can make it over almost three-quarters of the obstacles. On a track, with more space and time to make its calculations, the robot has a 90 percent success rate. The Harvest Automation HV-100 has a more benign purpose: helping move and place plants with exacting precision. Nicknamed "Harvey," the 20-inch-tall autonomous model is already zipping around nurseries and fields across the country. LiDAR helps these robots identify their location and determine where to place the plants they're carrying amid thousands of others—and avoid obstacles in the way. "Our robots work with people in the same space, so we are constantly on the lookout to avoid collisions," said Grinnell. This plant placement is also important for growers to optimize space while minimizing labor costs and preventing worker injury related to repetitive tasks. "It's critical for mobile robots to process movement and sense obstacles," said Eric Mantion, an evangelist for Intel RealSense camera technology. RealSense technology, which is built into many new laptop and all-in-one PC models, gives computer vision to experimental drones and robots. Mantion said that just as a high jumper's brain needs to process every step of a jump (the approach, the launch, the clearing of the bar), robots also have to process all of these types of movements to function in the real world.
"The more efficiently these algorithms can be processed, the more responsive and successful the outcome," Mantion said. RealSense is much smaller and more affordable than LiDAR, making it possible to create new consumer robotic devices. One example is the iRobot butler, which uses RealSense cameras to navigate hotel hallways. In addition to helping researchers pioneer new capabilities for autonomous machines, LiDAR sensors are being used by a number of car manufacturers. Together with other kinds of navigational systems, LiDAR helps direct self-driving vehicles. Recently, however, researchers have shown that LiDAR sensors can be fooled from more than 300 feet away by someone using a specially modified laser pointer. Hackers could conceivably make a self-driving car stop or swerve by convincing the sensor that it is about to hit an imaginary obstacle like a car, person or wall, said Jonathan Petit with the University of Cork's Computer Security Group. "LiDAR uses light, an unregulated spectrum that is accessible to anyone," he explained. Luckily, there are ways to mitigate such attacks, he said, such as identifying and removing outliers from the raw LiDAR data. With its nearly instantaneous 360-degree sweeps, however, LiDAR is ideal for covering large areas quickly. That's why it's so popular with geologists, surveyors and archaeologists, who often mount sensors on planes. Some companies specialize in the scanning process itself, capturing data for a variety of clients. London-based ScanLab works with historians, climatologists and forensic anthropologists to create large-scale 3D images of everything from historic sites to clothes. They've helped researchers at the University of Cambridge calculate how ice floes in the Arctic are being affected by climate change and recreated the beaches of Normandy for a PBS documentary about D-Day.
Some of their projects edge into the morbid, such as scans of concentration camps in former Yugoslavia and the coat Lord Nelson wore at the Battle of Trafalgar, complete with bullet hole and bloodstains. Last year, ScanLab technicians descended 70 feet beneath the streets of London to record images of part of the old London Post Office Railway. The 23-mile "Mail Rail" carried post and packages until 2003, when it was almost completely abandoned. A total of 223 scans resulted in more than 11 billion data points, filling more than a terabyte of memory. CyArk, a nonprofit, was launched to create 3D models of threatened cultural heritage sites around the world before they are lost. So far their free online library includes LiDAR scans of the Leaning Tower of Pisa, Chichén Itzá and the towering stone moai of Easter Island. In 2014, the organization scanned 5.71 square miles of the historic core of New Orleans, creating a virtual version that is accurate to within about five inches. The idea is to capture the city as it stands today—and to have something to guide rebuilding in the aftermath of a natural disaster like Hurricane Katrina. Some LiDAR projects have a more artistic bent. In 2008, Radiohead filmed the video for the song "House of Cards" entirely using laser scans, in part by driving a LiDAR-equipped van around Florida. "In the Eyes of the Animal," a virtual reality (VR) exhibit created by Marshmallow Laser Feast for the Abandon Normal Devices Festival in the U.K., gives users an immersive experience of what it's like to be an animal in the wild. First they used LiDAR, CT scanning and video-equipped drones to record part of Grizedale Forest in Britain's Lake District. To experience the virtual forest, users don sensory units the size of beach balls that completely enclose their heads. Inside, VR goggles and binaural headphones simulate what a fox, bird or deer might see and hear, down to the specific field of vision and light wavelengths. 
A wearable subwoofer adds a physical dimension. Despite the awesome things LiDAR enables people to do, Grinnell says price is still an issue. The systems are much cheaper than they used to be, but the one that guides each HV-100 is the single most expensive part of the robot, accounting for about 20 percent of the $30,000 price tag. "The cost represents a big barrier to creating more commercially viable solutions using mobile robots." All of this means that LiDAR technology isn't nearly as inexpensive or ubiquitous as something like GPS—yet. Luckily, there are plenty of important projects to motivate experts to continue working toward even better solutions.
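Petit's suggested spoofing mitigation, identifying and removing outliers from the raw LiDAR data, can be approximated with a median-deviation filter over a window of consecutive range readings. This is a hedged sketch, not a production defense; the 1.5-metre threshold is an illustrative assumption:

```python
from statistics import median

def reject_outliers(ranges, threshold_m=1.5):
    """Filter a window of consecutive LiDAR range readings (in metres),
    dropping any reading that sits far from the window's median.

    A return injected by an attacker's laser pointer tends to disagree
    sharply with genuine neighbouring returns from the same surface,
    so a median-deviation test removes it while keeping the real data.
    """
    mid = median(ranges)
    return [r for r in ranges if abs(r - mid) <= threshold_m]
```

For example, a spoofed 2-metre "obstacle" inserted into a run of roughly 10-metre returns from a wall is discarded, while the genuine readings pass through unchanged; real systems would combine such filtering with cross-checks against other sensors.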


News Article | February 8, 2017
Site: www.scientificcomputing.com

Equipped with high-tech versions of common city fixtures — namely, smart benches and digital information signs — and fueled by a “deploy or die” attitude, MIT Media Lab spinout Changing Environments is hoping to accelerate the development of “smart” cities that use technology to solve urban challenges. “The idea is to bring simple technologies to the streets,” says CEO Sandra Richter, a former Media Lab researcher who co-founded the startup with Nan Zhao, a Media Lab PhD student, and Jutta Friedrichs, a Harvard University graduate. “When it comes to smart cities, there’s been a lot of talking, but not a lot of doing. If you don’t want this [smart city] concept to die, you need to bring real-world examples to the places where we live, work, and play.” The women-founded startup is the brains behind the Soofa Benches that have cropped up around Boston and Cambridge, including on MIT’s campus. The benches contain an embedded charging station powered by a mini solar panel, with two USB ports for plugging in mobile devices. They also connect to wireless networks. First installed in Boston in June 2014, the benches are now in 65 cities across 23 U.S. states, including New York; Washington; Los Angeles; Boulder, Colorado; Oklahoma City; and Austin, Texas. Cities in Canada, Costa Rica, Saudi Arabia, and Germany have adopted the benches as well. The startup also sells a Soofa charging station independently, which can be integrated into existing city infrastructure. Recently, Changing Environments started deploying its second solar-powered product, the Soofa Sign, in Metro Boston spots including Kendall Square in Cambridge, Samuel Adams Park in Boston, and Porter Square in Cambridge and Somerville. Each sign has apps installed that display public transit times, weather, and events, among other information. This month, the startup will select three additional cities where it will pilot the Soofa Sign.
Each Soofa product comes equipped with sensors that gather pedestrian-traffic data for cities, and can be considered part of the “internet of things” (IoT), in which many kinds of everyday devices are wirelessly connected and exchange data. This data can be used by cities to make decisions about funding city developments, events, and other initiatives that impact the public. Richter and Zhao came together in the Media Lab after realizing they shared similar interests in developing “persuasive” technologies that helped people live healthier and more sustainably. Richter studied in the Changing Places group led by Principal Research Scientist Kent Larson, where she designed technologies and apps that encouraged people to bike more, use electric cars, and maintain other healthy practices. (Richter was named one of the most creative people in business by Fast Company in 2013 for her work at the lab.) Zhao, a student in the Responsive Environments group led by Joseph Paradiso, the Alexander W. Dreyfoos Professor in Media Arts and Sciences, develops technologies that help people save energy by using less light. The startup’s name, in fact, is a combination of the two group names. Many of Richter and Zhao’s projects centered on building for smart cities, in which IoT devices would collect data to help improve efficiency of services and meet residents’ needs. As a side project, in 2013, they decided to build an IoT fixture that could be easily deployed in urban areas and would benefit the public. “And what is better than a park bench?” Richter says. “It’s something we have all around the globe, has existed for centuries, and is a place for people to connect with each other. For us, that was the ideal platform to start introducing sensors into the public environment.” From there, things moved quickly. 
Influenced by the Media Lab’s oft-repeated motto, “deploy or die,” Richter and Zhao, then joined by Friedrichs, developed a concrete prototype of the current Soofa Bench model, “every now and then turning the Media Lab into a concrete mess,” Richter says, laughing. Thanks to a meeting facilitated by Larson with Boston’s Mayor’s Office of New Urban Mechanics, the first Soofa Bench prototype was installed in Titus Sparrow Park in June 2014. One week later, Richter took the prototype to the first White House Maker Faire, where she sat down with then-President Barack Obama to discuss the bench and the future of smart cities. Upon returning to Boston, the three co-founders embarked on a “crazy summer where everything happened,” Zhao says. Verizon and Cisco — which invests in IoT technologies — had funded the students to develop more Soofa Benches. But the students didn’t even have a company bank account. “So literally a couple days before Cisco transferred money, we said, ‘Alright, we need to start a company,’” Richter says. Naming themselves Changing Environments, the students cranked out about 10 benches in the Media Lab as part of a pilot launch for spots in the Boston Common, the Rose Kennedy Greenway, and other Boston locations. In mid-2015, Changing Environments opened headquarters in East Cambridge and brought its first commercial Soofa Bench to Central Square, in Cambridge, before spreading to dozens of other U.S. and international cities. Today, the Soofa Benches are certainly seeing use. Charging activity is tracked at headquarters where, in a lighthearted competition, the office has a “bench leadership board” with benches that see the most charging activity. Currently holding the top spot is “Amelia,” the startup’s first commercial bench in Cambridge’s Central Square, with 1,817 total hours charged over a total of 3,571 charging sessions, as of mid-January. (Soofa encourages cities to name each bench to keep things amusing and engaging.)
Benches in Harvard Square and on the Rose Kennedy Greenway have logged around 2,500 charging sessions. Some benches newly installed in New York City in May 2016 have already logged more than 2,000 charging sessions. On the back end, Soofa fixtures collect valuable data for the city governments that purchase them. The sensors count the wireless signals emitted from pedestrians’ mobile devices and assign an activity level for the location. Cities can use Soofa software to check if activity was high or low in certain areas at certain times. A city may note, for instance, that a certain event drew a big crowd and may decide to host similar events. More broadly, Richter says, installing Soofa Benches — often a publicized event — can open discussions about smart cities. When benches are installed in a new city, the mayor or other city official usually meets the co-founders at the bench to discuss the technology and its benefits and limitations. In that way, Soofa serves as “an icon for internet of things in cities,” Richter says. Now a thriving startup, Changing Environments owes some of its success to MIT’s entrepreneurial ecosystem, the co-founders say, including an early investment from the Media Lab’s E14 Fund, which provides stipends, mentoring, introductions to investors, and basic legal and accounting services to recent MIT graduates such as Richter. The E14 Fund, Richter says, gave her a great “runway” for transitioning from student to entrepreneur. “It’s like you’re still under the wing of the lab, but you’re just learning how to fly,” she says. The newly deployed Soofa Sign, Richter adds, came about through a collaboration with MIT spinout E Ink — which invented electronic ink for e-readers and other devices — that was initiated by Joi Ito, director of the Media Lab. “It’s beautiful to see two MIT companies working together to push the envelope on a product,” Richter says.
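The activity metric described above, counting wireless signals from pedestrians' devices to score a location per time window, can be approximated by bucketing device sightings into windows and counting distinct identifiers in each. This is a sketch under stated assumptions, not Soofa's actual pipeline: `device_id` stands in for a hashed MAC address, and the one-hour window is an arbitrary choice.

```python
from collections import defaultdict

def activity_levels(sightings, window_seconds=3600):
    """Count distinct devices seen per time window, a rough proxy
    for pedestrian activity at a fixture's location.

    `sightings` is an iterable of (timestamp_seconds, device_id)
    pairs, e.g. hashed identifiers from observed wireless signals.
    Returns {window_index: distinct_device_count}.
    """
    windows = defaultdict(set)
    for ts, device in sightings:
        # Using a set de-duplicates repeated sightings of one device.
        windows[int(ts // window_seconds)].add(device)
    return {w: len(devs) for w, devs in sorted(windows.items())}
```

A city dashboard could then compare these counts across locations and hours to spot which events drew a crowd, which is the use the article describes.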
This winter, MIT professors also helped the startup recruit interns for the Institute’s Independent Activities Period. And mentors still offer advice when needed. “It’s been great to continue the relationship with professors, students, and the MIT community as a whole,” Richter says.


News Article | February 21, 2017
Site: motherboard.vice.com

I recently read The End of Ownership, a new book about how companies are using contract law, Digital Rights Management, and End User License Agreements to strip away the very concept of "owning" the things we buy. When our Kindle books mysteriously disappear overnight, or we're prevented from installing unauthorized software on an iPhone, we can surmise that we don't really own the things we've bought. But the internet of things has created new and interesting ways for tech manufacturers to assert ownership not just of MP3s, ebooks, and software, but of the physical goods we have in our homes, our garages, and even inside our bodies. After you read the excerpt, check out our podcast with the authors. - Jason Koebler Excerpted from The End of Ownership: Personal Property in the Digital Economy by Aaron Perzanowski and Jason Schultz, published by The MIT Press. All rights reserved. Cars, refrigerators, televisions, Barbie dolls. When people buy these everyday objects, they rarely give much thought to whether or not they own them. We pay for them, so we think of them as our property. And historically, with the exception of the occasional lease or rental, we owned our personal possessions. They were ours to use as we saw fit. They were free to be shared, resold, modified, or repaired. That expectation is a deeply held one. When manufacturers tried to leverage the DMCA to control how we used our printers and garage door openers, a big reason courts pushed back was that the effort was so unexpected, so out of step with our understanding of our relationship to the things we buy. But in the decade or so that followed those first bumbling attempts, we've witnessed a subtler and more effective strategy for convincing people to cede control over everyday purchases.
It relies less—or at least less obviously—on DRM and the threat of DMCA liability, and more on the appeal of new product features, and in particular those found in the smart devices that make up the so-called Internet of Things (IoT). Your car is a computer with wheels; a plane is a computer with wings; your watch, your child's toys, even your pacemaker are all computers at their core. And as computers, they are susceptible to the same sort of external limitations and controls we've witnessed with previous generations of digital goods. Even if we resist it, we're accustomed to software telling us whether we can watch a digital movie. But what happens when computer code dictates when your light bulbs have to be replaced? Or how fast you can drive? Or whether you can fly your drone in a particular neighborhood? Or what brand of cat litter you can use? What are the social consequences of a smart mattress that collects and analyzes heart rate and breathing data, monitors your movements, and provides you a nightly summary? That's what Samsung's Sleepsense device promises. Samsung even suggests you track your loved ones by "simply put[ting] the sensor under their mattress ... to receive an analysis of the quality of their sleep via email." What could possibly go wrong? With so many networked devices in their homes, consumers are relying on home automation hubs—devices that allow them to control their home security systems, lights, garage door openers, and entertainment systems from any place with an Internet connection. The maker of one such device, Revolv, was acquired by Google-owned IoT company Nest in 2014. The Revolv hub sold for $300 and touted a "lifetime subscription" for updates and new features. 
But in April of 2016, Nest announced it would no longer support the Revolv. What's more, Nest planned to exercise its software-enabled remote control over the devices to render them entirely inoperable. After a May 15 software update, it explained, "The Revolv app won't open and the hub won't work." Alphabet, Google's parent company, which has its sights set on the self-driving car and medical device markets, decided it was within its rights to reduce a device that consumers bought to nothing more than an overpriced paperweight. Consider that before you buy a Google car. Let's look at a small sampling of IoT devices across a wide range of sectors and consider their consequences for ownership and consumer welfare more broadly. In many cases, these technologies offer real benefits. Yet the core cultural and legal shifts they represent strike a blow against ownership in the digital economy.

Jailbreaking Is Not a Crime

The exact origin of the Internet of Things is difficult to pinpoint, but one significant moment in its early history was the introduction of the iPhone on January 9, 2007. Steve Jobs told the assembled crowd, "Today, Apple is going to reinvent the phone." He proceeded to wow them with "a revolutionary mobile phone, a widescreen iPod with touch controls, and a breakthrough Internet communications device" combined in a single product. But like nearly every Apple product, the user experience was carefully choreographed and tightly controlled. iPhone users could only run Apple's iOS. They could only configure the settings Apple allowed them to access. They could only use Apple-approved mobile carriers. And they could only run the applications Apple provided. And later, once Apple launched its App Store, they could only install software that Apple approved—on the basis of opaque and inconsistent standards. What you could do with this remarkably powerful pocket computer depended entirely on what Apple let you do. 
This walled-garden approach was a dramatic departure from the approach of general-purpose computers, including Macs, which allowed third-party applications and considerable freedom for user modification. In some ways, Apple's approach to the iPhone was more in line with an earlier phone maker, AT&T. During its decades-long reign as a telecommunications monopolist, AT&T—née Bell Telephone—used a number of strategies to maintain strict control over telephones. As the holder of Alexander Graham Bell's patents, AT&T had total control over the design, production, and distribution of phones. And even after those patents expired, it extended that control by leasing phones rather than selling them, making certain that users didn't acquire property rights in their devices. It also used contractual provisions and legal threats to stamp out innovation, no matter how innocuous. In the 1940s, AT&T exercised this power by targeting the Hush-a-phone, a small non-electronic accessory that attached over a telephone receiver to increase privacy and cut down on noise. AT&T forbade its use, and it took nearly a decade of legal battles before the DC Circuit rejected that restriction as an "unwarranted interference with the telephone subscriber's right reasonably to use his telephone in ways which are privately beneficial without becoming publicly detrimental." This case, along with the FCC's subsequent Carterfone decision, which permitted the attachment of wireless technology to AT&T's phones, paved the way for competition and individual ownership of landline phones. In some ways, Apple's control over the iPhone is a throwback to these bad old days. But it's one that many consumers happily accepted in exchange for the convenience of integrating all of their online activities into a single device. But not everyone was willing to go along quietly. Apple's restrictions sparked a movement to "jailbreak" iPhones in order to regain some semblance of ownership. 
"Jailbreaking" refers to the act of eliminating software restrictions and DRM that limit how phone owners can use their devices. With a jailbroken iPhone, you can install any software you choose, replace Apple's operating system with one you prefer, and customize the look and feel of your phone. Jailbreaking is related to, but distinct from, unlocking a mobile phone—the process of removing software locks that prevent you from switching wireless carriers—from AT&T to T-Mobile, for example. Jailbreaking is not a new practice. Similar homebrew communities formed around other devices long before the iPhone launched, from Xbox hacks to do-it-yourself DVRs. But nothing galvanized that community more than the thought of turning Apple's powerful and ubiquitous product into an open platform. The first iPhone jailbreak was announced on July 10, 2007, just eleven days after the device launched. With each inevitable Apple software update, the jailbreaking community would free that new version within weeks, if not days. Although it didn't file suit, Apple insisted that jailbreaking was illegal. In 2009, the Electronic Frontier Foundation (EFF) filed a petition with the U.S. Copyright Office requesting formal permission for iPhone owners to jailbreak their devices without fearing anti-circumvention liability. This provoked Apple to explain precisely why jailbreaking should be banned. Despite referring to consumers as "iPhone owners" throughout its filing, Apple asserted that "iPhone users are licensees, not owners, of the copies of iPhone operating software." In other words, when you buy an iPhone, all you own is the physical hardware. The software stored on it that makes it work and accounts for much of its value still belongs to Apple. 
While perhaps shocking to those with an iPhone in their pocket, this stance was a logical conclusion for Apple, a company with one foot in the software industry and a commitment to controlling the user experience that bordered on zealotry. And because Apple has consistently proven its nearly unrivaled skill as a designer of end user experiences, it succeeded in selling us DRM in the guise of a smart device. It made us believe that a bug was a feature. Consumers recoiled at the idea of these sorts of restrictions when Chamberlain and Lexmark tried to sneak them into our garage door openers and laser printers, but when Jobs offered us the same vision, we lined up to give Apple our money. Eventually, the Copyright Office ruled in favor of the right to jailbreak phones. However, in doing so, it sidestepped the contentious issue of ownership and focused on jailbreaking as a fair use of Apple's copyrighted iOS. And in 2014, an otherwise hopelessly gridlocked Congress passed, and President Obama signed, the Unlocking Consumer Choice and Wireless Competition Act in response to a petition signed by over 100,000 Americans. Although each of these measures suggests both that people still care deeply about owning their devices and that government can be responsive to those concerns, they are temporary fixes. Both the Copyright Office exemptions and the unlocking legislation expire after three years. Apple's battle for ownership of our phones signaled the beginning of a much broader shift. Every day, we learn of yet another object that will come with embedded software, location detection sensors, and network connections that limit consumer control and surreptitiously communicate back to its corporate mother ship. 
And while companies like Apple are slowly making their devices more open and user-configurable as a result of public pressure and competitive threats from open-source mobile operating systems such as Android, whole other areas of our lives are becoming constrained and preconfigured for us, often without our knowledge.

Old MacDonald Licensed a Farm

Farmers have enough to worry about. Banks are coming to foreclose on their land. Locusts are eating their crops. Immigration policy is complicating their hiring practices. And corporate agri-business long ago redefined the economics of their way of life. On top of all of this, today's farmers have to contend with intellectual property. It began with seeds. For years, Monsanto successfully sold Roundup, an herbicide that helped farmers control weeds and other unwanted vegetation. But Roundup also often damaged the crops themselves, so Monsanto began manufacturing crops resistant to Roundup. It patented so-called Roundup Ready soybeans and later added alfalfa, canola, corn, cotton, and sugar beets to the list of Roundup-resistant products. Though many farmers initially welcomed the seeds, some were troubled by Monsanto's claim that its seeds were licensed for a single season, not sold. This meant that no matter how many seeds you saved, they couldn't be replanted the following year, foreclosing a centuries-old farming practice. Instead, you had to buy new seeds from Monsanto or else contend with pests and less-effective pesticides. Seed patents were just the beginning of the IP frustrations facing farmers. Software has also found its way onto the farm. The iconic John Deere tractor now contains no fewer than eight control units—hardware and software components that regulate various functions, ranging from running the engine to adjusting the armrest to operating the hitch. When tractors were purely mechanical, farmers could easily maintain, repair, and modify their own equipment as needed. But now, software stands in their way. 
That barrier is no accident. Tired of losing revenue to industrious farmers who repaired their own tractors or bargain hunters who took their equipment to an independent repair shop, John Deere decided to force its customers to have their equipment serviced by authorized John Deere dealers. By interposing a software layer between farmers and their tractors, John Deere created a practical hurdle. And by wrapping its software controls in DRM, it created a legal one. A quick glance at the John Deere owner's manual gives you a good indication of the result. Almost any problem—from high coolant temperature to a parking brake that's not working or a seat that's too firm—ends the same way, with a trip to the John Deere dealer. Fed up with John Deere's tactics, a group of farmers petitioned the Copyright Office in February of 2015 for a temporary DMCA exemption, like the one granted to smartphone jailbreakers, that would give them clear legal authority to repair, upgrade, and modify their tractors. John Deere responded with adamant opposition, insisting that tractor owners had no right to look under the digital hood, even if the fix was quick and technically simple. Its argument hinged on ownership. John Deere claimed it owns the software, and not just as an abstract matter of copyright law. It owns the copies of its code embedded in the tractors it sells to farmers, code that is essential to the functioning of the equipment. Farmers, in John Deere's words, merely had "an implied license for the life of the vehicle to operate the vehicle." That means you get to keep driving the tractor you bought from John Deere for tens of thousands of dollars unless and until it tells you otherwise. John Deere's attitude toward ownership has a number of important implications that typify the core risks presented by the Internet of Things. 
Most obviously, by denying farmers the right to repair—a right entrenched enough that even patent protection can't disturb it—John Deere has effectively raised the price of its products for farmers. It has also done serious harm to the market for repair services, which are less competitive since farmers have no real choice of mechanics.

Free as in Coffee

Those in the free software movement are fond of distinguishing between two ways in which we use the word "free." "Free as in beer" refers to price. "Free as in speech" refers to liberty, the freedom you have to use a thing as you choose. Until recently, you could be confident that if you overheard someone talking about free coffee, it meant Starbucks was running a promotion. But thanks to Keurig, the maker of the popular K-Cup brewing system, conversations about coffee now have to account for questions of liberty as well. The Keurig saga began in 2012, when several of the coffee company's key patents expired. Those patents covered its pod-based brewing system. Users placed single-serving portions of coffee or other brewed beverages in the machine, hit a button, and got a consistent drink each time. Without patent protection, Keurig had to contend with competition. As it turned out, Keurig wasn't a fan. Rival companies started producing compatible pods and undercutting Keurig's prices. In response, Keurig released new machines featuring "Keurig 2.0 Brewing Technology which reads each lid to deliver on the promise of excellent quality beverages." Marketing speak aside, what that meant was that Keurig's machines would only accept pods embedded with a code that verified your coffee came from a licensed supplier. And it also killed off its generic pod that let you supply your own coffee grounds. If you tried to brew rogue coffee, your Keurig machine greeted you with a cheerful error message. The public reaction was swift and vicious. Angry Facebook posts and irate Amazon reviews flooded the Internet. 
As Brian Barrett wrote, "A coffee maker limiting your choice of grind seems as out of place as a frying pan dictating your eggs." It didn't take long for competitors to capitalize on this outrage by cracking the Keurig DRM. Coffee drinkers even figured out how to defeat it with a single piece of tape. Soon Keurig was persuaded to reverse course, at least in part. It appears to be sticking to its guns when it comes to blocking pods from competitors, but it announced plans to reintroduce the My K-Cup product that allowed coffee drinkers to fill their own pods. Nonetheless, the company and its investors have paid a price for its overreach. Keurig stock dropped by 10 percent in the wake of the DRM controversy. The Keurig example shows that people still care deeply about owning and controlling their devices and that they have the potential to make their voices heard in the marketplace. But it also cautions that market pressure is often only partly effective in protecting consumer interests.

Open the Pod Bay Doors, Barbie

At this point, it should come as no surprise that the Internet of Things threatens our sense of control over the devices we purchase. However, those threats aren't limited to intellectual property and DRM; they also include battles for control over information about our behavior and our inner lives. One troubling example is the Wi-Fi-enabled Hello Barbie doll from Mattel. This IoT Barbie looks like many of her predecessors but offers a unique feature. She can engage in conversation with a child and learn about them in the process. Barbie does this by recording her conversations and transmitting them via network connections to ToyTalk, a third-party cloud-based speech recognition service. ToyTalk then uses software and data analytics to analyze those conversations and deliver personalized responses. 
It's an impressive trick, but the implications for our sense of ownership are quite shocking. For many children, talking to toy dolls is a way to share their unfiltered thoughts, dreams, and fears in a safe, private environment. But according to the terms of the Hello Barbie EULA, ToyTalk and its unnamed partners have wide latitude to make use of information about your child's conversations in ways that few parents would anticipate: All information, materials and content ... is owned by ToyTalk or is used with permission. ... You agree that ToyTalk and its licensors and contractors may use, transcribe and store. ... Recordings and any speech data contained therein, including your voice and likeness as may be captured therein, to provide and maintain the ToyTalk App, to develop, tune, test, enhance or improve speech recognition technology and artificial intelligence algorithms, to develop acoustic and language models and for other research and development purposes. ... By using any Service, you consent to ToyTalk's collection, use and/or disclosure of your personal information as described in this Policy. By allowing other people to use the Service via your account, you are confirming that you have the right to consent on their behalf to ToyTalk's collection, use and disclosure of their personal information as described below. In other words, ToyTalk claims to own anything you, your child, or even their friends say to Barbie. Conversations with the doll are corporate property. The safety and privacy of a child's bedroom is compromised by the collection, sharing, and commercial use of those conversations. And while these services may offer benefits, they come with significant new risks. Shortly after the IoT-enabled Barbie shipped, security vulnerabilities that could allow hackers to intercept a child's conversations with the doll were revealed. And those worries aren't just hypothetical. 
Around the same time, VTech—maker of the children's smartwatch Kidizoom and InnoTab mobile device—disclosed that more than six million children had their personal information, including photos and chat messages, stolen from VTech's servers. Hello Barbie is just the latest example of this trend of networked appliances. Samsung shipped a SmartTV with a default listening mode—and accompanying privacy policy—intended to continuously eavesdrop on viewers and send audio back via the cloud for analysis. In a pitch to investors, Vizio recently touted the fact that its smart televisions will be able to detect any content that users watch, regardless of the source, and use that information to customize advertising and programming. The June smart oven features cameras and software that can recognize the food you cook. Google's Nest thermostat takes a similar approach to learning about you. Amazon's Echo, Apple's Siri, Microsoft's Cortana, and Google Now go a step further by encouraging us to interact with disembodied soothing, friendly, and—by default—female voices. Science tells us that we engage more readily with technology that mimics human interaction. A recent study showed that gamblers risk more on slot machines with humanlike features. Of course, such services have the potential to offer real benefits. But such a service relationship comes not only with divided loyalties but also diminished autonomy. It is very different from owning an object completely and suggests we should be mindful of exactly who controls our relationship with any object we purchase. A person's home may be their castle, but their appliances may belong to someone else.

Our Bodies, Our Servers

As if our connection to the Internet of Things wasn't intimate enough, network-enabled and software-dependent devices are now inside our bodies. When open source advocate Karen Sandler found out at age thirty-one that she could die suddenly from a heart condition, she did what most of us would do. 
She went to the doctor to fix it. In her case, that meant implanting a pacemaker-defibrillator in her chest to give her heart a jolt in the event it gave out. The device—about the size of an avocado—was literally a life-saving invention. But because it ran proprietary software, Sandler had no way to tell how it worked or how likely it was to fail. As she explained in an interview, "A statistic came out recently that 25 percent of all medical device recalls in the last few years have been due to software failure. When you read these statistics it becomes very personal." It turns out that Sandler's questions about her pacemaker weren't so easy to answer. Much like Apple and its iPhone, pacemaker manufacturers won't let patients look inside or test the devices they purchase. Nor are you allowed to read the data from your own device while you are at home or on the road—even in the midst of a medical emergency. Instead, you can only access your health data from manufacturer-approved sources. And until recently, you couldn't even test your device to make sure it was functioning correctly or running the latest software or security update. The reason for such restrictions? According to a filing with the Copyright Office, the Advanced Medical Technology Association "believe[s] that patients have an inherent right to access their own medical data, however, this in and of itself does not necessitate bypass of any intellectual property protections." In other words, even if you own the physical parts of the pacemaker, the manufacturer's copyright trumps any claim you might have to see how it works or what data it collects on you—even when it is implanted inside your body. 
Dana Lewis proved what patients can do when they own their devices and control their care. Lewis is a diabetic living in Seattle who relies on a glucose monitor and a handheld wireless device to alert her when her blood sugar is too high or low. Yet Lewis often wasn't able to hear the alarm, especially when she was sleeping. So she and her partner, Scott Leibrand, built a new program that displayed blood sugar levels with louder alarms and a snooze button. They even added the ability to send the information to other mobile devices, such as Leibrand's Pebble watch. Next they turned to Lewis's insulin regimen. Traditionally, diabetics control their insulin levels manually. But Lewis and Leibrand began experimenting with the data to devise an algorithm specific to Lewis's needs—something that would automate and adapt based on the data her device was sending out. It could predict her insulin needs thirty, sixty, and even ninety minutes in the future. Eventually they hope to produce an artificial pancreas that will essentially automate this process. No IP law, and certainly not one designed to stop infringers from sharing movies online, should stand in the way of patients adapting equipment they own to keep them alive. These concerns are not limited to those of us with life-threatening conditions. When you buy a Fitbit wearable tracker, its Terms of Sale specifically state that "to the extent the Products contain or consist of software in any form ... such Software is licensed to you, not sold[.] Terms such as 'sell' and 'purchase,' as used in these Terms, apply only to the extent the Products consist of items other than Software." Again, all you own is the shell and the components. 
Everything digital—including physical storage media—belongs to Fitbit. While Fitbit's privacy policy does promise to remove personally identifiable information whenever it shares your records with third parties, it reserves the right to keep everything else indefinitely, even after you delete your account. Every move you make, every step you take, Fitbit will be tracking you. And as Kate Crawford wrote, because the type of information collected by these devices is so personal, and so intimate, it is almost as if the device itself becomes a more authoritative source about us than we are. Network security has also become an issue for medical devices. From insulin pumps to cochlear implants and powered prosthetic joints, more and more medical devices rely on transmitting medical data to providers through Wi-Fi and Bluetooth protocols. These connections have already opened the door to numerous security issues. Even former Vice President Dick Cheney claims to have switched off the wireless functionality on his own pacemaker to prevent terrorists from hacking it. Fortunately, much like with vehicle security testing, the Copyright Office granted an exemption for testing exterior medical devices and passively testing those that are implanted in ways that don't affect functionality. The ability to innovate and improve these devices, however, remains highly contested. Karen Sandler's dream of an open source pacemaker may inspire us, but it also presents complications. Open source could allow patients to examine, test, and improve devices in ways far more flexible and permissive than the current proprietary model, but it doesn't give us autonomy in quite the same way as analog ownership. Instead it offers a future with different, more user-friendly restrictions to navigate. When the focus is on medical devices, the argument for individual ownership and control resonates more viscerally. For the rest of the stuff we buy, the stakes may be lower, but the arguments are the same. 
If you don't own your devices, you can't repair or customize them. You can't innovate with them. And in the end, the products you buy may end up using you more than you use them.


News Article | October 28, 2016
Site: www.techrepublic.com

IBM is investing $200 million into its Watson IoT business, which is headquartered in Munich, Germany. The Watson IoT headquarters will be home to new hands-on industry labs where clients and partners will work with IBM's researchers, engineers, and developers to drive innovation in the automotive, electronics, manufacturing, healthcare, and insurance industries. This is one of IBM's largest-ever investments in Europe, and it comes in response to customers wanting to use a combination of IoT and Artificial Intelligence technologies. IBM currently has 6,000 global clients, up from 4,000 eight months ago, according to an IBM press release. These clients are using Watson IoT technologies to gather information from billions of sensors embedded in machines, cars, drones, ball bearings, and hospitals. "IBM is making tremendous strides to ensure that businesses around the world are able to take advantage of this incredible period of technological transformation and develop new products and services that really change people's lives," said Harriet Green, global head of IBM's Watson IoT business. "Germany is at the forefront of the Industry 4.0 initiative and by inviting our clients and partners to join us in Munich, we are opening up our talent and technologies to help deliver on the promise of IoT and establishing a global hotbed for collaborative innovation." IBM Watson IoT is working with Schaeffler, Aerialtronics, and Thomas Jefferson University Hospital on the following new projects:


News Article | September 5, 2016
Site: www.technologyreview.com

Many of the inventors who fueled the digital revolution have become household names. And rightfully so. Innovators such as Steve Jobs, Bill Gates, and Mark Zuckerberg all contributed mightily to the technologies that have transformed our daily lives and society. If you’re not an engineer, however, you have probably never heard of the brilliant inventor Rudolf Kálmán, a Budapest-born engineer and mathematician who died on July 2 in Gainesville, Florida, at age 86. His fundamental contribution, an algorithm called the Kalman filter, made possible many essential technological achievements of the last 50 years. These include aerospace systems such as the computers that landed Apollo astronauts on the moon, robotic vehicles that explore our world from the deep sea to the outer planets, and nearly any endeavor that needs to estimate the state of the world from noisy data. Someone once described the entire GPS system—an Earth-girdling constellation of satellites, ground stations, and computers—as “one enormous Kalman filter.” Within his professional community, Kálmán was well known and highly admired, the recipient of numerous awards and honors. In 2009 President Obama awarded him the National Medal of Science. If you have studied any form of robotics, control, or aerospace engineering in the past four decades, then Kálmán’s eponymous filter was as fundamental to your work as the Pythagorean theorem is to high schoolers preparing for the SAT. Here’s why. Control engineers know that you can only control what you can measure. The more precisely you can measure it, the better you can control it. Consider the challenge faced by the engineers tasked with designing the Apollo flight computers in the early 1960s. The computers’ raw data—measurements from sensors such as gyroscopes, accelerometers, and radar—were inherently noisy, full of random errors and messy inaccuracies. When barreling toward a rocky moon at high speed, those errors can ruin your day. 
Somehow you have to be able to filter out this noise from the measurements and make the best possible estimate of where you are and how fast you’re moving. You also need to know just how good or bad your estimates are, in a statistical sense, since it can be disastrous to think that you’re doing better than you actually are. And all this needs to happen in fractions of a second as the spacecraft speeds toward the moon, attempts a lunar landing, or threads the needle of an entry corridor as it reënters Earth’s atmosphere. That’s where Rudolf Kálmán came in. He published an ingenious recursive estimation algorithm in 1960. The filter would accomplish the goal of accurately estimating and predicting critical variables such as location, direction, and speed in the presence of noisy measurements, and even estimate the noise. Others, such as cybernetics inventor Norbert Wiener, had tackled the problem before, but Kálmán tailored his solution to the emerging world of digital computers and real-time processing. When the Apollo 11 lunar module, controlled by Neil Armstrong and a software program, made its heart-stopping landing on the Sea of Tranquility, the Kalman filter ensured that real-time position data coming from Earth-based radar tracking agreed closely with the onboard sensors. Listen to the tapes and you’ll hear Buzz Aldrin calling out the Kalman filter estimates as Armstrong landed. Nearly that same calculation, with modernized Kalman filters, happens routinely inside your mobile phone. The phone’s GPS sensor provides real-world coördinates on the face of the Earth, while its accelerometers sense rapid, small motions. Each has noise and inaccuracy of different types; the Kalman filter combines them for the best of both worlds. 
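The recursive predict-and-update cycle is simple enough to sketch in a few lines. The following is a toy one-dimensional filter, purely illustrative (not Apollo or phone code, and all numbers are invented): it fuses a stream of noisy readings into an estimate of a roughly constant quantity while tracking its own uncertainty.

```python
# A minimal one-dimensional Kalman filter: each cycle predicts, compares
# the prediction with a noisy measurement, and corrects in proportion to
# the Kalman gain, which weighs how much to trust the new reading.

def kalman_step(x, p, z, r, q=1e-4):
    """One predict/update cycle.
    x: prior estimate, p: prior variance,
    z: new measurement, r: measurement variance,
    q: process noise (expected drift of the true value)."""
    # Predict: the state model here is "the value stays put",
    # so only the uncertainty grows.
    p = p + q
    # Update: the gain k is near 1 when we trust the measurement
    # more than the estimate, near 0 when the reverse is true.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Fuse ten noisy readings of a quantity whose true value is about 10.0.
measurements = [10.3, 9.8, 10.1, 9.6, 10.4, 10.0, 9.9, 10.2, 9.7, 10.1]
x, p = 0.0, 1000.0  # start knowing nothing: huge initial variance
for z in measurements:
    x, p = kalman_step(x, p, z, r=0.25)
print(x, p)  # estimate converges near 10, variance shrinks with each reading
```

Real navigation filters run the same predict/update cycle over a multi-dimensional state (position, velocity, attitude) with matrix-valued gains, which is how a phone can blend occasional GPS fixes with rapid accelerometer data.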
Drive your car into a tunnel, for example, and you lose GPS, but the Kalman filter still achieves pretty good dead reckoning until you come out the other side and get a new GPS “fix.”

But that’s only the beginning of the impact that Rudolf Kálmán’s work will have on the world. Within the next decade the Kalman filter will be at work in consumer technologies that will change your life in equally profound ways. The very same guidance and navigation problems faced by Apollo engineers 50 years ago—how to locate objects accurately in the vastness of space—challenge engineers today as they design self-driving cars that can navigate safely in smart cities, augmented-reality computer games, and robot companions to work on the factory floor and in your home. All these inventions require precise information, what we call “microlocation,” in some cases down to millimeters, to ensure that your self-driving car parks in your garage and not on your lawn, that your virtual-reality gaming headset makes you fly and not vomit, and that your trusted robot companion pours coffee into your cup and not on your lap. This means millions and perhaps billions of Kalman filters.

But then there’s the Internet of things, the much anticipated infrastructure of a connected, smart world of the future. The Internet of things will require Kalman filters in trillions of smart objects to guide them to where and when we want them, at our workplaces, in our homes, and elsewhere in our lives. Then perhaps Kálmán will finally join Jobs, Gates, and Zuckerberg as a household name.

David Mindell is a professor at MIT and founder of Humatics, a Cambridge, Massachusetts, microlocation company that utilizes the Kalman filter. Frank Moss, an alumnus of the MIT Instrumentation Laboratory that built the Apollo computers, is a former director of the MIT Media Lab and a board member at Humatics.


News Article | November 12, 2015
Site: www.techtimes.com

If you're tired of sitting in your car frustrated over traffic congestion, or just too sleepy and tired at the end of the day, Kevin Ashton has a technological forecast that will surely excite you. The man who coined the term "Internet of Things" predicted that by the year 2030, you'll be able to put your feet up on your dashboard because cars will do the driving for you.

That's right: Massachusetts Institute of Technology (MIT) Auto-ID Center founder Kevin Ashton believes that the technology of the future is predictable when you base it on past trends. He wrote a book titled "How to Fly a Horse: The Secret History of Creation, Invention, and Discovery," which is currently a finalist for Best Innovation & Creativity Book of the Year in the 800-CEO-READ Business Book Awards for 2015.

"Expect self-driving features in most new cars by 2020 and cars without steering wheels between 2025-2030, varying by country. What's the point? They will be safer, faster, more fuel efficient, and you'll be able to get things done, or take a nap, while you move from place to place," he wrote.

He anchored the predictability of future technology on three laws that technological progress seems to abide by: Moore's Law, the observation that microprocessor transistors halve in size (and so double in number) roughly every two years; Metcalfe's Law, which holds that a network's value grows with the square of its number of users; and Koomey's Law, which holds that the energy needed for a given computation halves roughly every 18 months. Basically, this means that computer technology keeps becoming smaller, more connected to the Internet, and more energy efficient.

Ashton's predictions may already be on their way to the streets, as more and more car manufacturers are investing in technology to produce a self-driving vehicle. Toyota, Nissan, General Motors and Google are racing to perfect their technology, having all announced plans to get autonomous vehicles cruising the streets by 2020.
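The compounding these laws imply is easy to work out. A minimal sketch follows; the 2015–2030 span and the 1.5-year Koomey halving period are illustrative assumptions for the arithmetic, not figures from the article:

```python
def shrink_factor(years, halving_period):
    """Fraction remaining after repeated halving (Moore/Koomey-style laws)."""
    return 0.5 ** (years / halving_period)

def metcalfe_value(users):
    """Metcalfe's Law: a network's value scales as the square of its user count."""
    return users ** 2

# Koomey's Law: energy per computation halves roughly every 18 months, so a
# computation run in 2030 would need about a thousandth of its 2015 energy.
energy_factor = shrink_factor(years=15, halving_period=1.5)   # 0.5 ** 10

# Metcalfe's Law: doubling a network's users quadruples its value.
value_ratio = metcalfe_value(200) / metcalfe_value(100)
```

Ten halvings over fifteen years leave about 1/1024 of the original energy cost per computation, which is the sense in which Ashton can treat "smaller, more connected, more efficient" as a predictable trajectory rather than a guess.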
That gives car manufacturers five years to make good on their plans, and us 15 years to see if Ashton's predictions are correct. To pique your interest further, Ashton also predicted that we will discover extraterrestrial life, and gave his take on the reality of climate change and how humans will cope with it.


News Article | September 18, 2016
Site: techcrunch.com

Over the past few years, the Internet of Things (IoT) has been the white-hot center of a flurry of activity. Startups that create embedded sensors for physical things have been snapped up by larger companies at a rapid pace, with deals for IoT startups totaling more than $30 billion in the past four years.

The IoT may well be The Next Big Thing, but maybe the attention around sensors is misplaced. What if we didn’t even need embedded sensors to allow things to gather data about their surrounding environment? What if material could be a sensor in and of itself?

Sentient materials might sound like the stuff of sci-fi, but it’s quickly becoming a reality. A new generation of materials is being developed that can sense temperature, pressure, impact and other variables — completely removing the need for sensors. Not only can these materials capture and relay data to the cloud, they also can reconfigure themselves on the fly to react to changing environmental conditions. It’s as if materials are becoming not just smart, but “alive” — and it will change the way things are designed and used in startling ways.

How did we arrive here? Design and engineering used to focus on materials that behaved isotropically — which is to say, uniformly and predictably. In the isotropic age, you would create a design and then assign a material to carry out a specific role in that design. What if, however, you allowed materials to determine design, rather than vice versa? We see this in nature all the time. A seed, for example, works together with a specific environment to create a tree. This is an example of anisotropic materials in action. Unlike isotropic materials, their behavior isn’t predetermined, so their performance can be tailored to their environment. Welcome to the anisotropic age of design.

Imagine an airplane skin that self-heals to remove dings and dents, thereby maintaining optimal aerodynamics.
In the isotropic age that’d be virtually impossible to design — but in the anisotropic age, it becomes a possibility. Here’s how it would work: An airplane component (like the wing) is made out of a composite material that has been coated with a thin layer of nanosensors. This coating serves as a “nervous system,” allowing the component to “sense” everything that is happening around it — pressure, temperature and so on. When the wing’s nervous system senses damage, it sends a signal to microspheres of uncured material within the nanosensor coating. This signal instructs the microspheres to release their contents in the damaged area and then start curing, much like putting glue on a crack and letting it harden. Airbus is already doing important research in this area at the University of Bristol’s National Composites Centre, moving us closer to an aviation industry shaped by smart materials.

The automotive industry, meanwhile, can use smart materials to manufacture cars that not only sense damage and self-heal, but also collect data about performance that can be fed back into the design and engineering process. The Hack Rod project — which brings technology partners together with a team of automotive enthusiasts in Southern California — is out to design the first car in history built with smart materials and engineered using artificial intelligence. In another example, Paulo Gameiro, coordinator of the EU-funded HARKEN project and R&D manager for the Portuguese automotive textiles supplier Borgstena, is developing a prototype seat and seatbelt that uses smart textiles with built-in sensors to detect a driver’s heart and breathing rates, so it can alert drivers to tell-tale signs of drowsiness.

Beyond transportation, more opportunities await in the construction and civil engineering fields, where smart materials can greatly assist with structural health monitoring.
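The sense–signal–cure loop described above can be caricatured as a tiny control loop. Everything in this sketch — the `Patch` model, the strain threshold, the one-step cure — is invented for illustration; a real smart-material "nervous system" is a distributed physical process, not a Python loop:

```python
from dataclasses import dataclass

# Assumed trigger level for illustration, not a real material constant.
DAMAGE_THRESHOLD = 0.8

@dataclass
class Patch:
    """One region of a nanosensor-coated composite (hypothetical model)."""
    strain: float = 0.0   # what the nanosensor layer "feels" here
    cured: bool = True    # whether the surface is intact

def sense_and_heal(patches):
    """One pass of the sense -> signal -> cure loop; returns repaired indices."""
    repaired = []
    for i, patch in enumerate(patches):
        if patch.strain > DAMAGE_THRESHOLD:
            # The "nervous system" senses damage and signals the microspheres,
            # which release uncured material that hardens over the crack.
            patch.cured = False
            patch.strain = 0.0
            patch.cured = True
            repaired.append(i)
    return repaired

# A wing with three regions, one of them damaged.
wing = [Patch(strain=0.1), Patch(strain=0.95), Patch(strain=0.5)]
fixed = sense_and_heal(wing)
```

Only the region whose strain exceeds the threshold is repaired; the others are left alone, which is the essential economy of the scheme — repair happens locally, exactly where the damage is sensed.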
Today, the world has countless roads, bridges and other pieces of infrastructure that are slowly falling apart because of wear and tear and exposure to the elements. More often than not, we don’t even know which items need our attention most urgently. But what if you could build these structures out of “smart concrete”? The “nervous system” within the concrete could constantly monitor and assess the status of the infrastructure and initiate self-repair as soon as any damage was sustained. There is a major project currently underway at the Massachusetts Institute of Technology (MIT), called ZERO+, that aims to reshape the construction industry with exactly these types of advanced composite materials.

The researchers at MIT are also hard at work at the newly formed Advanced Functional Fabrics of America (AFFOA) Institute. Their goal is to come up with a new generation of fabrics and fibers that will have the ability to see, hear and sense their surroundings; communicate; store and convert energy; monitor health; control temperature; and change their color.

These functional fabrics mean that clothes won’t necessarily just be clothes anymore. They can be agents of health and well-being, serving as noninvasive ways to monitor body temperature or to analyze sweat for the presence of various elements. They can be portable power sources, capturing energy from outside sources like the sun and retaining that energy. They even can be used by soldiers to adapt to different environments more quickly and efficiently. And if you accidentally rip a hole in your garment? Naturally, the nanosensors within the fabric will engage a self-repair process to patch things up — in the exact same way the airplane wing and the smart concrete healed themselves. This is no Hollywood movie — this is reality, and a clear indicator of how quickly smart materials are coming along.
These materials have an increasingly important role to play in shaping the world around us — whether that’s airplanes and infrastructure or the clothes on our backs. By creating things that can not only capture data about their environment, but also adjust their performance based on that data, materials are starting to play an active role in design. This is the potential of smart materials, and it’s one of the keys to creating a better-designed world around us.
