News Article | May 3, 2017
The Advisory Council is led by Swamy Kotagiri, Magna Chief Technology Officer (CTO), and consists of some of the most recognized and respected experts in the global automotive and tech industries. The council brings a wider circle of insight, knowledge and experience from various industries that ultimately helps accelerate the execution of Magna's technology and business objectives. "The pace of innovation in the automotive industry is like nothing we have ever seen before, creating even more challenges and opportunities," said Kotagiri. "At Magna, we welcome the challenge and aim to seize the opportunities by continuing to leverage our culture of innovation, while embracing a new level of innovation outreach. We are excited to have such a distinguished group of individuals bringing their vision and insights to our company." Advisory Council members will provide high-level strategic planning insights and experience in the areas of advanced driver assistance systems, environmental and automotive safety, overall industry trends, and next-generation technologies. Chaired by Kotagiri, the Advisory Council comprises six members who are recognized leaders in their respective fields, several of whom have significant experience in product innovation and the implementation of new technologies. "Magna's deep vehicle systems knowledge and electronics capabilities, combined with its global engineering and manufacturing expertise, are remarkable," said Tony Fadell. "They are in a great position to help drive change in the auto industry and I am excited to be working with such an innovative company." "Magna is a company committed to helping define the future of mobility and I am delighted to be a part of such a distinguished group of individuals who collectively can bring new opportunities to Magna and the industry," said Dr. Ian Hunter. Swamy Kotagiri is globally responsible for managing Magna's innovation and new product strategy and development. 
As CTO, Kotagiri helps Magna's product groups bring innovative ideas to the market, which allows the company to move the automotive industry forward. Mei-Wei Cheng is a member of the Board of Directors of Seagate Technology PLC and recently served as non-executive Chairman of Pactera. He previously served as CEO and President of the Chinese subsidiaries of AT&T, Siemens, Ford Motor Company and General Electric. He holds a bachelor's degree in industrial engineering/operations research from Cornell University and an MBA from Rutgers University. Tony Fadell is the inventor of the iPod, an inventor of the iPhone, and founder of Nest, the company that pioneered the "Internet of Things". He is an active investor and entrepreneur with a 25-year history of founding companies and designing products that improve people's lives. Fadell has authored more than 300 patents. In May 2016, TIME named the Nest Thermostat, the iPod and the iPhone as three of the "50 Most Influential Gadgets of All Time." Dr. Ian Hunter is a Professor of Mechanical Engineering and runs the BioInstrumentation Lab at the Massachusetts Institute of Technology. Dr. Hunter has filed over 150 patents, produced more than 500 scientific and engineering publications, and has founded and/or co-founded 25 companies. He received his bachelor's, master's and doctorate degrees from the University of Auckland and completed a post-doctoral fellowship in the department of Biomedical Engineering at McGill University in Canada. John Maddox is the CEO of the American Center for Mobility. He began his career as a Research Engineer at Ford Motor Company and has held positions such as Associate Administrator at the National Highway Traffic Safety Administration and Compliance Officer at Volkswagen North America. He holds a degree in mechanical engineering from the University of Maryland and a master's degree in engineering management from the University of Detroit Mercy. 
Paul Mascarenas is a member of the Board of Directors at ON Semiconductor and the United States Steel Corporation. He previously held a number of senior leadership positions at Ford Motor Company, most recently serving as Chief Technical Officer. Paul holds a bachelor's degree in mechanical engineering from the University of London, King's College in England and holds an honorary doctorate degree from Chongqing University in China. ABOUT MAGNA We are a leading global automotive supplier with 317 manufacturing operations and 102 product development, engineering and sales centres in 29 countries. We have over 155,000 employees focused on delivering superior value to our customers through innovative products and processes, and world class manufacturing. We have complete vehicle engineering and contract manufacturing expertise, as well as product capabilities which include body, chassis, exterior, seating, powertrain, active driver assistance, vision, closure and roof systems. We also have electronic and software capabilities across many of these areas. Our common shares trade on the Toronto Stock Exchange (MG) and the New York Stock Exchange (MGA). For further information about Magna, visit our website at www.magna.com. THIS RELEASE MAY CONTAIN STATEMENTS WHICH CONSTITUTE "FORWARD-LOOKING STATEMENTS" UNDER APPLICABLE SECURITIES LEGISLATION AND ARE SUBJECT TO, AND EXPRESSLY QUALIFIED BY, THE CAUTIONARY DISCLAIMERS THAT ARE SET OUT IN MAGNA'S REGULATORY FILINGS. PLEASE REFER TO MAGNA'S MOST CURRENT MANAGEMENT'S DISCUSSION AND ANALYSIS OF RESULTS OF OPERATIONS AND FINANCIAL POSITION, ANNUAL INFORMATION FORM AND ANNUAL REPORT ON FORM 40-F, AS REPLACED OR UPDATED BY ANY OF MAGNA'S SUBSEQUENT REGULATORY FILINGS, WHICH SET OUT THE CAUTIONARY DISCLAIMERS, INCLUDING THE RISK FACTORS THAT COULD CAUSE ACTUAL EVENTS TO DIFFER MATERIALLY FROM THOSE INDICATED BY SUCH FORWARD-LOOKING STATEMENTS. THESE DOCUMENTS ARE AVAILABLE FOR REVIEW ON MAGNA'S WEBSITE AT WWW.MAGNA.COM.
News Article | May 16, 2017
American wunderkind Reuben Paul may still be only in 6th grade at his school in Austin, Texas, but he and his teddy bear Bob wowed hundreds at a timely cyber security conference in The Netherlands. "From airplanes to automobiles, from smart phones to smart homes, anything or any toy can be part of the Internet of Things (IoT)," he said, a small figure pacing the huge stage at the World Forum in The Hague. "From terminators to teddy bears, anything or any toy can be weaponised." To demonstrate, he deployed his cuddly bear, which connects to the iCloud via WiFi and Bluetooth smart technology to receive and transmit messages. Plugging a rogue device known as a Raspberry Pi—a small, credit-card-sized computer—into his laptop, Reuben scanned the hall for available Bluetooth devices, and to everyone's amazement, including his own, suddenly downloaded dozens of numbers, including some belonging to top officials. Then, using the Python programming language, he hacked into his bear via one of the numbers to turn on one of its lights and record a message from the audience. "Most internet-connected things have a Bluetooth functionality ... I basically showed how I could connect to it, and send commands to it, by recording audio and playing the light," he told AFP later. "IoT home appliances, things that can be used in our everyday lives, our cars, lights, refrigerators, everything like this that is connected can be used and weaponised to spy on us or harm us." They can be used to steal private information such as passwords, as remote surveillance to spy on kids, or employ GPS to find out where a person is. More chillingly, a toy could say "meet me at this location and I will pick you up," Reuben said. His father, information technology expert Mano Paul, told how, aged about six, Reuben had revealed his early IT skills by correcting him during a business call. 
Using a simple explanation from dad on how one smart phone game worked, Reuben then figured out it was the same kind of algorithm behind the popular video game Angry Birds. "He has always surprised us. Every moment when we teach him something he's usually the one who ends up teaching us," Mano Paul told AFP. But Paul said he had been "shocked" by the vulnerabilities discovered in kids' toys, after Reuben first hacked a toy car before moving on to more complicated things. "It means that my kids are playing with timebombs, that over time somebody who is bad or malicious can exploit." Now the family has helped Reuben, who is also the youngest American to have become a Shaolin Kung Fu black belt, to set up his CyberShaolin non-profit organisation. Its aim is "to inform kids and adults about the dangers of cyber insecurity," Reuben said, adding he also wants to press home the message that manufacturers, security researchers and the government have to work together. Reuben also has ambitious plans for the future, aiming to study cyber security at either Caltech or MIT and then use his skills for good. Failing that, maybe he could become an Olympian in gymnastics—another sport he excels in.
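The attack Reuben demonstrated follows a simple pattern: scan for nearby devices, pick a target, then send it commands. A minimal sketch of that control flow, with the Bluetooth radio and the toy replaced by stub classes so it runs without hardware (all class names, addresses, and commands here are illustrative, not Reuben's actual code):

```python
class FakeRadio:
    """Stands in for a Bluetooth adapter; returns nearby device addresses."""
    def __init__(self, devices):
        self._devices = devices

    def scan(self):
        return list(self._devices)


class FakeToy:
    """Stands in for the connected toy; accepts simple commands."""
    def __init__(self, address):
        self.address = address
        self.light_on = False
        self.recordings = []

    def send(self, command, payload=None):
        if command == "light":
            self.light_on = (payload == "on")
        elif command == "record":
            self.recordings.append(payload)


def hijack(radio, target_prefix):
    """Scan, 'connect' to the first matching device, and send it commands."""
    for address in radio.scan():
        if address.startswith(target_prefix):
            toy = FakeToy(address)
            toy.send("light", "on")
            toy.send("record", "audio-from-audience")
            return toy
    return None


radio = FakeRadio(["AA:11:22:33:44:55", "BB:66:77:88:99:00"])  # made-up addresses
toy = hijack(radio, "BB")
```

Real Bluetooth Low Energy scanning would use a library (and an actual adapter), but the spy-then-command shape of the attack is exactly this.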
News Article | May 22, 2017
We are entering the age of ubiquitous sensing. Smart sensors will soon track our health and wellness, enable autonomous cars, and monitor machines, buildings, and bridges. Massive networks of small, inexpensive sensors will enable large-scale global data collection — impacting the distribution of agriculture and water, environmental monitoring, disaster recovery, disease-outbreak detection and intervention, and the operation of cities. With this change in mind, MIT is creating a singular hub to unite experts as they develop a new generation of sensors, and sensing and measurement technologies. On May 25-26, SENSE.nano will debut, marking the first “center of excellence” powered by MIT.nano, the 214,000-square-foot research facility taking shape in the heart of the MIT campus. The center will empower people in the MIT community, engage industry leaders, and educate the public. “There is a thing we do extremely well at MIT: We lock arms and make progress that is beyond the scope of any one researcher,” says Timothy Swager, the John D. MacArthur Professor in the Department of Chemistry. “If you look at what’s happening with sensors, you’ll see that many different disciplines have to come together. Ubiquitous sensing has so many aspects — chemical, biological, physical, radiological,” he says. “With all this sensing research going on, we need a place to coordinate our synergies.” As part of the kickoff, a full-day symposium will feature experts discussing technical challenges, commercial and humanitarian needs, and the societal impact of ubiquitous sensor and sensing systems. In a nod to the everyday impact of this technology, NPR journalist Tom Ashbrook will lead a broad discussion on “Sensing, Society, and Technology.” “Novel sensors and sensing systems will provide previously unimaginable insight into the condition of individuals and the built and natural worlds, positively impacting people, machines, and the environment,” says Brian W. 
Anthony, a principal research engineer at MIT and director of the Advanced Manufacturing and Design program, who is co-leading the new center. SENSE.nano will support collaboration between people from a range of specialty areas — engineering, business, Earth science, electronics, computation, nanoscience, materials science, neuroscience, chemistry, physics, computer science, biology, and advanced manufacturing. “We want to use this event as an opportunity to strengthen the community and improve our connection to the local innovation and manufacturing ecosystem,” adds Anthony. “And to accelerate the rate at which our new sensing technologies and innovations are scaled up and go out and impact IoT-enabled industries, advanced instrumentation, and beyond.” Vince Roche, CEO of Analog Devices, and Gururaj “Desh” Deshpande, founder of the Deshpande Foundation, will offer morning and afternoon keynotes framing the broad impact and opportunity of sensing technologies for the U.S. economy and the world’s societal needs. Analog Devices, a semiconductor company cofounded by Raymond S. Stata, is a cornerstone company in sensor products and advanced manufacturing in Massachusetts. “It is time for people to reach out and find the best ways to collaborate,” he says. “We’re looking for input from the community, sensor and sensing system manufacturers, government, academe, and researchers to help us define the grand challenge focus areas within SENSE.nano.”
News Article | November 2, 2015
As the population of internet-connected things continues to rise, how will autonomous robotic devices see and be smart enough to navigate obstacles, each other and people in the real world? Light Detection And Ranging (LiDAR) is one answer. The remote sensing technology has been around since the 1960s. Back then, it required enormous scanners that filled planes and cost hundreds of thousands of dollars. It has become increasingly affordable, and in some configurations can fit inside something the size of a baseball. It is widely used for surveying and for mapping in geological research and archaeology. It works like this: A scanner emits pulses of infrared laser light that reflect off solid surfaces; the round-trip time of each pulse gives the distance to whatever it hit, and many such measurements together form a 3D image. In the case of robots, that 3D image is interpreted by the computer system, which is coded with instructions for how the robot should react. Like most technologies that survive the test of time, LiDAR technology has evolved. In fact, today police use LiDAR speed guns to catch vehicles that exceed the speed limit. A small, cutting-edge LiDAR sensor can be purchased for $8,000. While this is still expensive, advancements in LiDAR technology and the rise in robot and autonomous machine research have combined to create a boom in projects that are pushing the limits of what the environment-mapping technology can do. Deloitte estimates that this year's $18 billion market for sales of robots for logistics, packaging and materials handling will nearly double by 2020. The research firm stated that many of these robots are already designed to interact directly with consumers or assist them with shopping. These machines need to somehow see in order to function properly. "LiDAR systems have been an absolutely necessary tool for the advancement of mobile robots," said Charles Grinnell, CEO of Harvest Automation, which builds autonomous agricultural robots. "LiDAR is a great all-in-one solution for us." 
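The time-of-flight principle behind LiDAR is simple arithmetic: light travels out and back, so the range is the speed of light times half the round-trip time, and each range plus its scan angle yields one point of the 3D image. A short sketch (the timing value is illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds):
    """Range from one LiDAR pulse: the beam travels out and back,
    so the one-way distance is half the round trip."""
    return C * round_trip_seconds / 2.0

def to_point(distance_m, angle_rad):
    """Convert one (range, bearing) sample from a planar scan
    into x, y coordinates relative to the scanner."""
    return (distance_m * math.cos(angle_rad),
            distance_m * math.sin(angle_rad))

# A pulse that returns after ~66.7 nanoseconds hit something ~10 m away.
d = tof_distance(66.7e-9)
x, y = to_point(d, math.pi / 2)  # an obstacle directly to the scanner's left
```

A real scanner repeats this at a high rate while sweeping the beam, producing the dense point clouds the article describes.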
Then there's the Cheetah running robot. This four-legged, 70-pound robot, developed at MIT and funded by the Defense Advanced Research Projects Agency (DARPA), can run untethered at 13 mph over a flat course. Now it can jump over hurdles autonomously with the help of an onboard LiDAR setup. The system lets the robot spot approaching obstacles, then calculate their size and distance. A series of path-planning algorithms tell the robot when and how high to jump, all without any human help. The Cheetah can clear hurdles 18 inches high—more than half its own height—while running 5 mph. On a treadmill, the robot can make it over almost three-quarters of obstacles. On a track, with more space and time to make its calculations, the robot has a 90 percent success rate. The Harvest Automation HV-100 has a more benign purpose: helping move and place plants with exacting precision. Nicknamed "Harvey," the 20-inch-tall autonomous model is already zipping around nurseries and fields across the country. LiDAR helps these robots identify their location and determine where to place the plants they're carrying amid thousands of others—and avoid obstacles in the way. "Our robots work with people in the same space, so we are constantly on the lookout to avoid collisions," said Grinnell. This plant placement is also important for growers to optimize space while minimizing labor costs and preventing worker injury related to repetitive tasks. "It's critical for mobile robots to process movement and sense obstacles," said Eric Mantion, an evangelist for Intel RealSense camera technology. RealSense technology, which is built into many new laptop and all-in-one PC models, gives computer vision to experimental drones and robots. Mantion said that just as a high jumper's brain needs to process every step of a jump—the approach, the launch, the clearing of the bar—robots also have to process all of these types of movements to function in the real world. 
"The more efficiently these algorithms can be processed, the more responsive and successful the outcome," Mantion said. RealSense is much smaller and more affordable than LiDAR, making it possible to create new consumer robotic devices. One example is the iRobot butler, which uses RealSense cameras to navigate hotel hallways. In addition to helping researchers pioneer new capabilities for autonomous machines, LiDAR sensors are being used by a number of car manufacturers. Together with other kinds of navigational systems, LiDAR helps direct self-driving vehicles. Recently, however, researchers have shown that LiDAR sensors can be fooled from more than 300 feet away by someone using a specially modified laser pointer. Hackers could conceivably make a self-driving car stop or swerve by convincing the sensor that it is about to hit an imaginary obstacle like a car, person or wall, said Jonathan Petit with the University of Cork's Computer Security Group. "LiDAR uses light, an unregulated spectrum that is accessible to anyone," he explained. Luckily there are ways to mitigate such attacks, he says, such as identifying and removing outliers from the raw LiDAR data. With its nearly instantaneous 360-degree sweeps, however, LiDAR is ideal for covering large areas quickly. That's why it's so popular with geologists, surveyors and archaeologists, who often mount sensors on planes. Some companies specialize in the scanning process itself, capturing data for a variety of clients. London-based ScanLab works with historians, climatologists and forensic anthropologists to create large-scale 3D images of everything from historic sites to clothes. They've helped researchers at the University of Cambridge calculate how ice floes in the Arctic are being affected by climate change and recreated the beaches of Normandy for a PBS documentary about D-Day. 
Some of their projects edge into the morbid, such as scans of concentration camps in former Yugoslavia and the coat Lord Nelson wore at the Battle of Trafalgar, complete with bullet hole and bloodstains. Last year, ScanLab technicians descended 70 feet beneath the streets of London to record images of part of the old London Post Office Railway. The 23-mile "Mail Rail" carried post and packages until 2003, when it was almost completely abandoned. A total of 223 scans resulted in more than 11 billion data points, filling more than a terabyte of memory. CyArk, a nonprofit, was launched to create 3D models of threatened cultural heritage sites around the world before they are lost. So far their free online library includes LiDAR scans of the Leaning Tower of Pisa, Chichén Itzá and the towering stone moai of Easter Island. In 2014, the organization scanned 5.71 square miles of the historic core of New Orleans, creating a virtual version that is accurate to within about five inches. The idea is to capture the city as it stands today—and to have something to guide rebuilding in the aftermath of a natural disaster like Hurricane Katrina. Some LiDAR projects have a more artistic bent. In 2008, Radiohead filmed the video for the song "House of Cards" entirely using laser scans, in part by driving a LiDAR-equipped van around Florida. "In the Eyes of the Animal," a virtual reality (VR) exhibit created by Marshmallow Laser Feast for the Abandon Normal Devices Festival in the U.K., gives users an immersive experience of what it's like to be an animal in the wild. First they used LiDAR, CT scanning and video-equipped drones to record part of Grizedale Forest in Britain's Lake District. To experience the virtual forest, users don sensory units the size of beach balls that completely enclose their heads. Inside, VR goggles and binaural headphones simulate what a fox, bird or deer might see and hear, down to the specific field of vision and light wavelengths. 
A wearable subwoofer adds a physical dimension. Despite the awesome things LiDAR enables people to do, Grinnell says price is still an issue. The systems are much cheaper than they used to be, but the one that guides each HV-100 is the single most expensive part of the robot, accounting for about 20 percent of the $30,000 price tag. "The cost represents a big barrier to creating more commercially viable solutions using mobile robots." All this means is that LiDAR technology isn't nearly as inexpensive and ubiquitous as something like GPS—yet. Luckily, there are plenty of important projects to motivate experts to continue working toward even better solutions.
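The spoofing mitigation Petit describes, identifying and removing outliers from the raw LiDAR data, can be sketched with a local median filter: a return that deviates sharply from its neighbors, as an isolated phantom echo injected by an attacker's laser would, gets discarded. This is an illustrative simplification with made-up numbers, not a production defense:

```python
from statistics import median

def filter_outliers(ranges, window=5, threshold=1.0):
    """Drop LiDAR range readings (in meters) that differ from the
    median of their local neighborhood by more than `threshold`."""
    kept = []
    for i, r in enumerate(ranges):
        lo = max(0, i - window // 2)
        hi = min(len(ranges), i + window // 2 + 1)
        local = median(ranges[lo:hi])
        if abs(r - local) <= threshold:
            kept.append(r)
    return kept

# Five consistent ~10 m returns plus one 2 m phantom, as a spoofer
# might inject to fake an obstacle directly ahead.
scan = [10.1, 10.0, 9.9, 2.0, 10.2, 10.1]
clean = filter_outliers(scan)
```

Real systems combine filtering like this with cross-checks against other sensors, since a determined attacker can inject more than a single stray point.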
News Article | February 8, 2017
Equipped with high-tech versions of common city fixtures — namely, smart benches and digital information signs — and fueled by a “deploy or die” attitude, MIT Media Lab spinout Changing Environments is hoping to accelerate the development of “smart” cities that use technology to solve urban challenges. “The idea is to bring simple technologies to the streets,” says CEO Sandra Richter, a former Media Lab researcher who co-founded the startup with Nan Zhao, a Media Lab PhD student, and Jutta Friedrichs, a Harvard University graduate. “When it comes to smart cities, there’s been a lot of talking, but not a lot of doing. If you don’t want this [smart city] concept to die, you need to bring real-world examples to the places where we live, work, and play.” The women-founded startup is the brains behind the Soofa Benches that have cropped up around Boston and Cambridge, including on MIT’s campus. The benches contain an embedded charging station powered by a mini solar panel, with two USB ports for plugging in mobile devices. They also connect to wireless networks. First installed in Boston in June 2014, the benches are now in 65 cities across 23 U.S. states, including in New York; Washington; Los Angeles; Boulder, Colorado; Oklahoma City; and Austin, Texas. Cities in Canada, Costa Rica, Saudi Arabia, and Germany have adopted the benches as well. The startup also sells a Soofa charging station independently, which can be integrated into existing city infrastructure. Recently, Changing Environments started deploying its second solar-powered product, the Soofa Sign, in Metro Boston spots including Kendall Square in Cambridge, Samuel Adams Park in Boston, and Porter Square in Cambridge and Somerville. Each sign has apps installed that display public transit times, weather, and events, among other information. This month, the startup will select three additional cities where it will pilot the Soofa Sign. 
Each Soofa product comes equipped with sensors that gather pedestrian-traffic data for cities, and can be considered part of the “internet of things” (IoT), in which many kinds of everyday devices are wirelessly connected and exchange data. This data can be used by cities to make decisions about funding city developments, events, and other initiatives that impact the public. Richter and Zhao came together in the Media Lab after realizing they shared similar interests in developing “persuasive” technologies that helped people live healthier and more sustainably. Richter studied in the Changing Places group led by Principal Research Scientist Kent Larson, where she designed technologies and apps that encouraged people to bike more, use electric cars, and maintain other healthy practices. (Richter was named one of the most creative people in business by Fast Company in 2013 for her work at the lab.) Zhao, a student in the Responsive Environments group led by Joseph Paradiso, the Alexander W. Dreyfoos Professor in Media Arts and Sciences, develops technologies that help people save energy by using less light. The startup’s name, in fact, is a combination of the two group names. Many of Richter and Zhao’s projects centered on building for smart cities, in which IoT devices would collect data to help improve efficiency of services and meet residents’ needs. As a side project, in 2013, they decided to build an IoT fixture that could be easily deployed in urban areas and would benefit the public. “And what is better than a park bench?” Richter says. “It’s something we have all around the globe, has existed for centuries, and is a place for people to connect with each other. For us, that was the ideal platform to start introducing sensors into the public environment.” From there, things moved quickly. 
Influenced by the Media Lab’s oft-repeated motto, “deploy or die,” Richter and Zhao, then joined by Friedrichs, developed a concrete prototype of the current Soofa Bench model, “every now and then turning the Media Lab into a concrete mess,” Richter says, laughing. Thanks to a meeting facilitated by Larson with Boston’s Mayor’s Office of New Urban Mechanics, the first Soofa Bench prototype was installed in Titus Sparrow Park in June 2014. One week later, Richter took the prototype to the first White House Maker Faire, where she sat down with then-President Barack Obama to discuss the bench and the future of smart cities. Upon returning to Boston, the three co-founders embarked on a “crazy summer where everything happened,” Zhao says. Verizon and Cisco — which invests in IoT technologies — had funded the students to develop more Soofa Benches. But the students didn’t even have a company bank account. “So literally a couple days before Cisco transferred money, we said, ‘Alright, we need to start a company,’” Richter says. Naming themselves Changing Environments, the students cranked out about 10 benches in the Media Lab as part of a pilot launch for spots in the Boston Common, the Rose Kennedy Greenway, and other Boston locations. In mid-2015, Changing Environments opened headquarters in East Cambridge and brought its first commercial Soofa Bench to Central Square, in Cambridge, before spreading to dozens of other U.S. and international cities. Today, the Soofa Benches are certainly seeing use. Charging activity is tracked at headquarters where, in a lighthearted competition, the office has a “bench leadership board” with benches that see the most charging activity. Currently holding the top spot is “Amelia,” the startup’s first commercial bench in Cambridge’s Central Square, with 1,817 total hours charged over a total of 3,571 charging sessions, as of mid-January. (Soofa encourages cities to name each bench to keep things amusing and engaging.) 
Benches in Harvard Square and on the Rose Kennedy Greenway have logged around 2,500 charging sessions. Some newly installed benches in New York City, which were just implemented in May 2016, already have more than 2,000 sessions charged. On the back end, Soofa fixtures collect valuable data for the city governments that purchase them. The sensors count the wireless signals emitted from pedestrians’ mobile devices and assign an activity level for the location. Cities can use Soofa software to check if activity was high or low in certain areas at certain times. A city may note, for instance, that a certain event drew a big crowd and may decide to host similar events. More broadly, Richter says, installing Soofa Benches — often a publicized event — can open discussions about smart cities. When benches are installed in a new city, the mayor or other city official usually meets the co-founders at the bench to discuss the technology and its benefits and limitations. In that way, Soofa serves as “an icon for internet of things in cities,” Richter says. Now a thriving startup, Changing Environments owes some of its success to MIT’s entrepreneurial ecosystem, the co-founders say, including an early investment from the Media Lab’s E14 Fund, which provides stipends, mentoring, introductions to investors, and basic legal and accounting services to recent MIT graduates such as Richter. The E14 Fund, Richter says, gave her a great “runway” for transitioning from student to entrepreneur. “It’s like you’re still under the wing of the lab, but you’re just learning how to fly,” she says. The newly deployed Soofa Sign, Richter adds, came about through a collaboration with MIT spinout E Ink — which invented electronic ink for e-readers and other devices — that was initiated by Joi Ito, director of the Media Lab. “It’s beautiful to see two MIT companies working together to push the envelope on a product,” Richter says. 
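The activity scoring described above, counting wireless signals from pedestrians' devices and assigning a level to the location, can be illustrated by tallying distinct device identifiers per sensing interval and bucketing the count. The function, thresholds, and data below are all hypothetical, not Soofa's actual pipeline:

```python
def activity_level(observed_ids, thresholds=(25, 100)):
    """Score one sensing interval: count distinct device identifiers
    and bucket the count into low / medium / high foot traffic."""
    count = len(set(observed_ids))  # repeats from one device collapse
    low, high = thresholds
    if count < low:
        return count, "low"
    if count < high:
        return count, "medium"
    return count, "high"

# One interval's (made-up, anonymized) observations: 30 distinct
# devices, each seen three times as its phone re-broadcasts.
interval = ["dev%03d" % (i % 30) for i in range(90)]
count, level = activity_level(interval)
```

A city dashboard would aggregate such scores over hours and days to spot, say, the crowd drawn by a particular event.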
This winter, MIT professors also helped the startup recruit interns for the Institute’s Independent Activities Period. And mentors still offer advice when needed. “It’s been great to continue the relationship with professors, students, and the MIT community as a whole,” Richter says.
News Article | October 28, 2016
IBM is investing $200 million into its Watson IoT business, which is headquartered in Munich, Germany. The Watson IoT headquarters will be home to new hands-on industry labs where clients and partners will work with IBM's researchers, engineers, and developers to drive innovation in the automotive, electronics, manufacturing, healthcare, and insurance industries. This is one of IBM's largest-ever investments in Europe, and is in response to customers wanting to use a combination of IoT and Artificial Intelligence technologies. IBM currently has 6,000 global clients, up from 4,000 eight months ago, according to an IBM press release. These clients are using Watson IoT technologies to gather information from billions of sensors embedded in machines, cars, drones, ball bearings, and hospitals. "IBM is making tremendous strides to ensure that businesses around the world are able to take advantage of this incredible period of technological transformation and develop new products and services that really change people's lives," said Harriet Green, global head of IBM's Watson IoT business. "Germany is at the forefront of the Industry 4.0 initiative and by inviting our clients and partners to join us in Munich, we are opening up our talent and technologies to help deliver on the promise of IoT and establishing a global hotbed for collaborative innovation." IBM Watson IoT is working with Schaeffler, Aerialtronics, and Thomas Jefferson University Hospital with the following new projects:
News Article | September 5, 2016
Many of the inventors who fueled the digital revolution have become household names. And rightfully so. Innovators such as Steve Jobs, Bill Gates, and Mark Zuckerberg all contributed mightily to the technologies that have transformed our daily lives and society. If you’re not an engineer, however, you have probably never heard of the brilliant inventor Rudolf Kálmán, a Budapest-born engineer and mathematician who died on July 2 in Gainesville, Florida, at age 86. His fundamental contribution, an algorithm called the Kalman filter, made possible many essential technological achievements of the last 50 years. These include aerospace systems such as the computers that landed Apollo astronauts on the moon, robotic vehicles that explore our world from the deep sea to the outer planets, and nearly any endeavor that needs to estimate the state of the world from noisy data. Someone once described the entire GPS system—an Earth-girdling constellation of satellites, ground stations, and computers—as “one enormous Kalman filter.” Within his professional community, Kálmán was well known and highly admired, the recipient of numerous awards and honors. In 2009 President Obama awarded him the National Medal of Science. If you have studied any form of robotics, control, or aerospace engineering in the past four decades, then Kálmán’s eponymous filter was as fundamental to your work as the Pythagorean theorem is to high schoolers preparing for the SAT. Here’s why. Control engineers know that you can only control what you can measure. The more precisely you can measure it, the better you can control it. Consider the challenge faced by the engineers tasked with designing the Apollo flight computers in the early 1960s. The computers’ raw data—measurements from sensors such as gyroscopes, accelerometers, and radar—were inherently noisy, full of random errors and messy inaccuracies. When barreling toward a rocky moon at high speed, those errors can ruin your day.
Somehow you have to be able to filter out this noise from the measurements and make the best possible estimate of where you are and how fast you’re moving. You also need to know just how good or bad your estimates are, in a statistical sense, since it can be disastrous to think that you’re doing better than you actually are. And all this needs to happen in fractions of a second as the spacecraft speeds toward the moon, attempts a lunar landing, or threads the needle of an entry corridor as it reenters Earth’s atmosphere. That’s where Rudolf Kálmán came in. He published an ingenious recursive estimation algorithm in 1960. The filter would accomplish the goal of accurately estimating and predicting critical variables such as location, direction, and speed in the presence of noisy measurements, and even estimate the noise. Others, such as cybernetics inventor Norbert Wiener, had tackled the problem before, but Kálmán tailored his solution to the emerging world of digital computers and real-time processing. When the Apollo 11 lunar module, controlled by Neil Armstrong and a software program, made its heart-stopping landing on the Sea of Tranquility, the Kalman filter ensured that real-time position data coming from Earth-based radar tracking agreed closely with the onboard sensors. Listen to the tapes and you’ll hear Buzz Aldrin calling out the Kalman filter estimates as Armstrong landed. Nearly that same calculation, with modernized Kalman filters, happens routinely inside your mobile phone. The phone’s GPS sensor provides real-world coordinates on the face of the Earth, while its accelerometers sense rapid, small motions. Each has noise and inaccuracy of different types; the Kalman filter combines them for the best of both worlds.
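The recursion itself is remarkably small. Here is a minimal one-dimensional Kalman filter in Python that estimates a fixed position from noisy measurements; the state model, noise values, and data are illustrative only, not anything from Apollo or a real phone:

```python
# A minimal one-dimensional Kalman filter: estimate a fixed position
# from noisy measurements. All values here are illustrative.

def kalman_step(x, p, z, q, r):
    """One predict/update cycle.
    x: current estimate, p: its variance,
    z: new measurement, q: process noise, r: measurement noise."""
    p = p + q            # predict: uncertainty grows between measurements
    k = p / (p + r)      # Kalman gain: weigh measurement vs. prediction
    x = x + k * (z - x)  # update: blend in the new measurement
    p = (1 - k) * p      # uncertainty shrinks after the update
    return x, p

# Noisy measurements of a true position of 10.0
measurements = [10.3, 9.8, 10.1, 9.9, 10.2, 10.05]
x, p = 0.0, 1000.0       # start from a deliberately vague prior
for z in measurements:
    x, p = kalman_step(x, p, z, q=0.0001, r=0.1)
print(x, p)              # estimate converges toward 10; variance shrinks
```

The same structure — with vectors and matrices in place of the scalars, and a motion model in the predict step — is what fuses a phone's GPS fixes with its accelerometer readings.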
Drive your car into a tunnel, for example, and you lose GPS, but the Kalman filter still achieves pretty good dead reckoning until you come out the other side and get a new GPS “fix.” But that’s only the beginning of the impact that Rudolf Kálmán’s work will have on the world. Within the next decade the Kalman filter will be at work in consumer technologies that will change your life in equally profound ways. The very same guidance and navigation problems faced by Apollo engineers 50 years ago—how to locate objects accurately in the vastness of space—challenge engineers today as they design self-driving cars that can navigate safely in smart cities, augmented-reality computer games, and robot companions to work on the factory floor and in your home. All these inventions require precise information, what we call “microlocation,” in some cases down to millimeters, to ensure that your self-driving car parks in your garage and not on your lawn, that your virtual-reality gaming headset makes you fly and not vomit, and that your trusted robot companion pours coffee into your cup and not on your lap. This means millions and perhaps billions of Kalman filters. But then there’s the Internet of things, the much anticipated infrastructure of a connected, smart world of the future. The Internet of things will require Kalman filters in trillions of smart objects to guide them to where and when we want them, at our workplaces, in our homes, and elsewhere in our lives. Then perhaps Kálmán will finally join Jobs, Gates, and Zuckerberg as a household name. David Mindell is a professor at MIT and founder of Humatics, a Cambridge, Massachusetts, microlocation company that utilizes the Kálmán filter. Frank Moss, an alumnus of the MIT Instrumentation Laboratory that built the Apollo computers, is a former director of the MIT Media Lab and a board member at Humatics.
News Article | November 12, 2015
If you're tired of sitting in your car frustrated over traffic congestion, or just plain exhausted at the end of the day, Kevin Ashton has a technological forecast that will surely excite you. The man who coined the term "Internet of Things" predicts that by the year 2030, you'll be able to put your feet up on your dashboard because cars will do the driving for you. That's right: Massachusetts Institute of Technology (MIT) Auto-ID Center founder Kevin Ashton believes that the technology of the future is predictable when you base it on past trends. He wrote a book titled "How to Fly a Horse: The Secret History of Creation, Invention, and Discovery," which is currently a finalist for Best Innovation & Creativity Book of the Year in the 800-CEO-READ Business Book Awards for 2015. "Expect self-driving features in most new cars by 2020 and cars without steering wheels between 2025-2030, varying by country. What's the point? They will be safer, faster, more fuel efficient, and you'll be able to get things done, or take a nap, while you move from place to place," he wrote. He anchored the predictability of future technology on three laws that technological progress seems to obey: Moore's Law, the observation that microprocessor components halve in size roughly every two years; Metcalfe's Law, which holds that a network's value is proportional to the square of its number of users; and Koomey's Law, which holds that the energy needed for a given amount of computation halves roughly every 18 months. Basically, this means computer technology becomes smaller, more connected to the Internet, and more energy efficient over time. Ashton's predictions may already be on their way to the streets, as more and more car manufacturers invest in technology to produce a self-driving vehicle. Toyota, Nissan, General Motors and Google are racing to perfect their technology, and all have announced plans to get autonomous vehicles cruising the streets by 2020.
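The arithmetic behind those three laws is easy to sketch. The Python snippet below projects their compound effect using the commonly quoted doubling periods; the time spans and figures are illustrative assumptions, not Ashton's own numbers:

```python
# Toy projections of the three laws, using their commonly quoted
# doubling periods. All figures are illustrative.

def growth(years, doubling_period):
    """Compound growth factor after the given number of years."""
    return 2 ** (years / doubling_period)

moore = growth(10, 2.0)    # transistor density: doubles every 2 years
koomey = growth(10, 1.5)   # computations per joule: doubles every 18 months

# Metcalfe's law: a network's value scales with users squared,
# so doubling the user count quadruples the value.
metcalfe_ratio = (2 * 1000) ** 2 / 1000 ** 2

print(moore)               # 32.0: five doublings in a decade
print(metcalfe_ratio)      # 4.0: twice the users, four times the value
```

Compounding is the point: five doublings of density and six-plus doublings of energy efficiency per decade are what make Ashton's small, connected, frugal devices plausible.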
We have five years to wait for car manufacturers to make good on their plans, and 15 years to see if Ashton's predictions are correct. To pique your interest further, Ashton also predicts that we will discover extraterrestrial life, and he gives his take on the reality of climate change and how humans will cope with it.
News Article | September 18, 2016
Over the past few years, the Internet of Things (IoT) has been the white-hot center of a flurry of activity. Startups that create embedded sensors for physical things have been snapped up by larger companies at a rapid pace, with deals for IoT startups totaling more than $30 billion in the past four years. The IoT may well be The Next Big Thing, but maybe the attention around sensors is misplaced… What if we didn’t even need embedded sensors to allow things to gather data about their surrounding environment? What if material could be a sensor in and of itself? Sentient materials might sound like the stuff of sci-fi, but they’re quickly becoming a reality. A new generation of materials is being developed that can sense temperature, pressure, impact and other variables — completely removing the need for separate embedded sensors. Not only can these materials capture and relay data to the cloud, they can also reconfigure themselves on the fly to react to changing environmental conditions. It’s as if materials are becoming not just smart, but “alive” — and it will change the way things are designed and used in startling ways. How did we arrive here? Design and engineering used to focus on materials that behaved isotropically — which is to say, uniformly and predictably. In the isotropic age, you would create a design and then assign a material to carry out a specific role in that design. What if, however, you allowed materials to determine design, rather than vice versa? We see this in nature all the time. A seed, for example, works together with a specific environment to create a tree. This is an example of anisotropic materials in action. Unlike isotropic materials, their behavior isn’t predetermined, so their performance can be tailored to their environment. Welcome to the anisotropic age of design. Imagine an airplane skin that self-heals to remove dings and dents, thereby maintaining optimal aerodynamics.
In the isotropic age that’d be virtually impossible to design — but in the anisotropic age, it becomes a possibility. Here’s how it would work: An airplane component (like the wing) is made out of a composite material that has been coated with a thin layer of nanosensors. This coating serves as a “nervous system,” allowing the component to “sense” everything that is happening around it — pressure, temperature and so on. When the wing’s nervous system senses damage, it sends a signal to microspheres of uncured material within the nanocrystal coating. This signal instructs the microspheres to release their contents in the damaged area and then start curing, much like putting glue on a crack and letting it harden. Airbus is already doing important research in this area at the University of Bristol’s National Composites Centre, moving us closer to an aviation industry shaped by smart materials. The automotive industry, meanwhile, can use smart materials to manufacture cars that not only sense damage and self-heal, but also collect data about performance that can be fed back into the design and engineering process. The Hack Rod project — which brings technology partners together with a team of automotive enthusiasts in Southern California — is out to design the first car in history built with smart materials and engineered using artificial intelligence. In another example, Paulo Gameiro, coordinator of the EU-funded HARKEN project and R&D manager for the Portuguese automotive textiles supplier Borgstena, is developing a prototype seat and seatbelt that uses smart textiles with built-in sensors to detect a driver’s heart and breathing rates, so it can alert drivers to tell-tale signs of drowsiness. Beyond transportation, more opportunities await in the construction and civil engineering fields, where smart materials can greatly assist with structural health monitoring. 
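The sense-and-cure sequence described above can be caricatured as a simple control loop: read strain from the nanosensor "nervous system," and trigger microsphere release wherever a damage threshold is exceeded. The Python sketch below is purely illustrative; every name, reading, and threshold is hypothetical, and nothing here comes from Airbus's or HARKEN's actual systems:

```python
# Caricature of the sense-and-cure loop. The "nervous system" reports
# a strain reading per region; regions over the damage threshold
# trigger their microspheres to release and cure. All names and
# numbers are hypothetical.

def regions_to_cure(strain_by_region, damage_threshold=0.8):
    """Return the regions whose strain reading indicates damage."""
    return [region for region, strain in strain_by_region.items()
            if strain > damage_threshold]

# One reading cycle on a hypothetical wing section
readings = {"leading_edge": 0.2, "mid_span": 0.95, "trailing_edge": 0.1}
damaged = regions_to_cure(readings)
print(damaged)             # only the over-threshold region is cured
```

The interesting engineering lives in what this sketch omits: the chemistry of the curing agent and a sensing layer dense enough to localize damage to a specific region.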
Today, the world has countless roads, bridges and other pieces of infrastructure that are slowly falling apart because of wear and tear and exposure to the elements. More often than not, we don’t even know which items need our attention most urgently. But what if you could build these structures out of “smart concrete”? The “nervous system” within the concrete could constantly monitor and assess the status of the infrastructure and initiate self-repair as soon as any damage was sustained. There is a major project currently underway at the Massachusetts Institute of Technology (MIT), called ZERO+, that aims to reshape the construction industry with exactly these types of advanced composite materials. The researchers at MIT are also hard at work at the newly formed Advanced Functional Fabrics of America (AFFOA) Institute. Their goal is to come up with a new generation of fabrics and fibers that will have the ability to see, hear and sense their surroundings; communicate; store and convert energy; monitor health; control temperature; and change their color. These functional fabrics mean that clothes won’t necessarily just be clothes anymore. They can be agents of health and well-being, serving as noninvasive ways to monitor body temperature or to analyze sweat for the presence of various elements. They can be portable power sources, capturing energy from outside sources like the sun and retaining that energy. They even can be used by soldiers to adapt to different environments more quickly and efficiently. And if you accidentally rip a hole in your garment? Naturally, the nanosensors within the fabric will engage a self-repair process to patch things up — in the exact same way the airplane wing and the smart concrete healed themselves. This is no Hollywood movie — this is reality, and a clear indicator of how quickly smart materials are coming along.
These materials have an increasingly important role to play in shaping the world around us — whether that’s airplanes and infrastructure or the clothes on our backs. By creating things that can not only capture data about their environment, but also adjust their performance based on that data, materials are starting to play an active role in design. This is the potential of smart materials, and it’s one of the keys to creating a better-designed world around us.