Los Angeles, CA, United States


Fraundorfer F., ETH Zurich | Scaramuzza D., Perception Robotics
IEEE Robotics and Automation Magazine | Year: 2012

Visual odometry (VO) is the process of estimating the egomotion of an agent using the input of a single or multiple cameras attached to it. The advantage of VO with respect to wheel odometry is that VO is not affected by wheel slip in uneven terrain or other adverse conditions. During the feature-detection step, the image is searched for salient keypoints that are likely to match well in other images. A local feature is an image pattern that differs from its immediate neighborhood in terms of intensity, color, and texture. In the feature-description step, the region around each detected feature is converted into a compact descriptor that can be matched against other descriptors. After comparing all feature descriptors between two images, the best correspondence for a feature in the second image is chosen as the one with the closest descriptor. Alternatively, if only the motion model is known but not the 3-D feature position, the corresponding match can be searched for along the epipolar line in the second image.
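
To make the detect-describe-match pipeline above concrete, here is a minimal sketch using OpenCV's ORB features and brute-force nearest-neighbor descriptor matching. The image file names and the ratio-test threshold are illustrative assumptions, not part of the original tutorial.

```python
# Minimal feature detection/description/matching sketch
# (assumes opencv-python; image file names are hypothetical).
import cv2

img1 = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Feature detection + description: ORB finds salient keypoints and
# computes a compact binary descriptor for the region around each.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Feature matching: for each descriptor in image 1, keep the closest
# descriptor in image 2 (Hamming distance), accepting it only if it is
# clearly better than the second-best candidate (ratio test).
bf = cv2.BFMatcher(cv2.NORM_HAMMING)
pairs = bf.knnMatch(des1, des2, k=2)
good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < 0.8 * p[1].distance]
print(f"{len(good)} putative correspondences")
```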


Forster C., Perception Robotics | Pizzoli M., Perception Robotics | Scaramuzza D., Perception Robotics
Proceedings - IEEE International Conference on Robotics and Automation | Year: 2014

We propose a semi-direct monocular visual odometry algorithm that is precise, robust, and faster than current state-of-the-art methods. The semi-direct approach eliminates the need for costly feature extraction and robust matching techniques for motion estimation. Our algorithm operates directly on pixel intensities, which results in subpixel precision at high frame rates. A probabilistic mapping method that explicitly models outlier measurements is used to estimate 3D points, which results in fewer outliers and more reliable points. Precise and high frame-rate motion estimation brings increased robustness in scenes of little, repetitive, and high-frequency texture. The algorithm is applied to micro-aerial-vehicle state estimation in GPS-denied environments and runs at 55 frames per second on the onboard embedded computer and at more than 300 frames per second on a consumer laptop. We call our approach SVO (Semi-direct Visual Odometry) and release our implementation as open-source software. © 2014 IEEE.
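
As an illustration of what "operates directly on pixel intensities" means, below is a toy sketch of direct image alignment for a single patch: a 2-D translation is refined by Gauss-Newton minimization of the photometric error. SVO itself optimizes a full 6-DoF camera pose over many sparse patches; this simplified version only demonstrates the principle and is not the paper's implementation.

```python
# Toy direct-alignment sketch (NumPy only): refine the 2-D translation of
# a small patch by Gauss-Newton on the photometric error -- minimizing
# intensity differences instead of matching descriptors.
import numpy as np

def align_patch(ref, cur, x0, y0, size=8, iters=10):
    """Estimate (dx, dy) so that cur sampled at the shifted patch matches ref."""
    ys, xs = np.mgrid[y0:y0 + size, x0:x0 + size]
    template = ref[y0:y0 + size, x0:x0 + size].astype(np.float64)
    dx = dy = 0.0
    for _ in range(iters):
        # Bilinear-sample the current image at the translated patch.
        xi, yi = xs + dx, ys + dy
        xf, yf = np.floor(xi).astype(int), np.floor(yi).astype(int)
        ax, ay = xi - xf, yi - yf
        I = ((1 - ax) * (1 - ay) * cur[yf, xf] + ax * (1 - ay) * cur[yf, xf + 1]
             + (1 - ax) * ay * cur[yf + 1, xf] + ax * ay * cur[yf + 1, xf + 1])
        # Photometric residual and intensity gradients (the Jacobian).
        r = (I - template).ravel()
        J = np.stack([np.gradient(I, axis=1).ravel(),
                      np.gradient(I, axis=0).ravel()], axis=1)
        # Gauss-Newton update: solve (J^T J) delta = -J^T r.
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        dx, dy = dx + delta[0], dy + delta[1]
        if np.hypot(*delta) < 1e-3:
            break
    return dx, dy
```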


Grant
Agency: National Science Foundation | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 149.99K | Year: 2015

The broader impact/commercial potential of this project stems from its ability to transform excavator operation and control from a primarily skill-based activity to a knowledge-based practice, leading to significant increases in productivity and safety. This in turn will help realize enormous cost savings, reduce potential hazards to the public, improve the competitiveness of the U.S. construction industry, and reduce the life-cycle costs of civil infrastructure. Such benefits will also accrue in fields such as manufacturing, transportation, mining, and shipbuilding, where the transition from skill-based to knowledge-based processes is of value. In the long run, the accidental utility strike warning capabilities of the proposed solution also have the potential to save millions of dollars in liability and opportunity costs for customers, avoid disruptions to life and commerce, and prevent physical danger to workers and the public. The societal benefit and commercial impact of the project are thus expected to be the significant reductions in construction and underground-infrastructure costs that will be possible through safe and efficient excavation.

This Small Business Innovation Research (SBIR) Phase I project will translate fundamental computer-vision-based motion-tracking technology into a transformative solution for tracking an excavator end-effector (bucket) in field conditions, and demonstrate the capability to meet target-market performance demands. Excavation is a quintessential construction activity in which every operator faces two major problems: 1) the need to maintain precise grade control; and 2) the need to avoid accidental utility strikes. This project will overcome these pain points by presenting operators with a real-time, augmented-reality visualization of excavation job plans, target grade profiles, and the evolving grade profile, allowing operators to achieve target grades with high precision, improved productivity, and safety. The disruptive innovation is the use of inexpensive computer-vision-based tracking to: 1) track the position of an excavator bucket in a local coordinate system; and 2) track the position of a cabin-mounted camera in the same coordinate system to visualize buried utility locations in augmented reality. Extensive field testing has demonstrated that this technology tracks markers relative to a camera with an uncertainty of less than one inch, offering significant advantages over current global-positioning-system-based methods, which are expensive and unreliable for the pursued application.
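
To illustrate the kind of marker-to-camera pose recovery described above, here is a minimal sketch using OpenCV's solvePnP on the four corners of a square fiducial marker of known physical size. The marker size, corner pixel coordinates, and camera intrinsics below are placeholder assumptions; the grant does not describe the actual field system.

```python
# Marker-to-camera pose recovery sketch (assumes opencv-python, numpy).
# Given the four image corners of a square fiducial of known size,
# solvePnP returns the marker pose in the camera frame. All numeric
# values are illustrative, not from the actual system.
import cv2
import numpy as np

MARKER_SIZE = 0.20  # marker side length in meters (assumed)
s = MARKER_SIZE / 2.0

# 3-D corner model of the marker in its own frame, in the corner order
# required by the IPPE_SQUARE solver.
object_pts = np.array([[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]],
                      dtype=np.float64)

# Detected 2-D corners in pixels (would come from a fiducial detector).
image_pts = np.array([[412, 300], [520, 305], [515, 410], [408, 404]],
                     dtype=np.float64)

# Camera intrinsics from calibration (placeholder values).
K = np.array([[800.0, 0.0, 640.0], [0.0, 800.0, 360.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist,
                              flags=cv2.SOLVEPNP_IPPE_SQUARE)
if ok:
    print("marker position in camera frame (m):", tvec.ravel())
```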


Grant
Agency: National Science Foundation | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 115.75K | Year: 2014

The broader impact/commercial potential of this project will enable improved cost-efficiency and industrial automation in manufacturing, increasing worker productivity and reducing injuries. The end-users of the robots, i.e., automotive original-equipment manufacturers and subassembly suppliers, will be able to achieve significant cost advantages by automating new assembly tasks with less expensive systems. Of the non-fatal injury and illness cases reported in the U.S. workforce, 43% of injuries were due to bodily reaction/exertion, and 62% of illness cases were due to repetitive trauma. This innovative solution will facilitate the automation of repetitive, injury-prone manual tasks and greatly improve the speed, accuracy, and cost-efficiency of current robotic handling systems. Beyond handling, there is significant market potential in packaging and warehousing, hazardous-materials handling, medical-device and other precision manufacturing, and military applications such as bomb-defusal and evacuation robots. By enabling new robotic applications and increasing productivity in current automation, this sensor will help the U.S. (and other developed economies) maintain a competitive domestic manufacturing sector.

This Small Business Innovation Research (SBIR) Phase I project's goal is to develop a visual-tactile sensing package for parts handling. The solution is two-fold: (1) a new flexible tactile sensor that can be tailored to a wide variety of form factors; and (2) software to fuse the tactile data with a vision system to estimate the pose of objects in pick-and-place tasks. Object grasping and manipulation by robotic hands in unstructured environments demands a sensor that is durable, compliant, and responsive to various force and slip conditions. The goal is to be the first commercially available sensing package that integrates tactile and visual data with accompanying software for state estimation. A large software and gaming company was able to greatly impact the machine-vision space by introducing an inexpensive, easily calibrated, robust visual sensor; this solution will do the same for touch sensing: our team has studied the desirable properties of such tactile sensors for years and discovered a way to produce them in an inexpensive, robust format.
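
As a simple illustration of fusing vision and tactile estimates of an object's pose, the sketch below combines two noisy position measurements by inverse-variance weighting, the basic building block of Kalman-style fusion. All noise figures and measurement values are assumptions for illustration; the grant does not disclose the actual fusion algorithm.

```python
# Minimal visual-tactile fusion sketch (NumPy only).
# Combines a camera-based and a tactile contact-based estimate of an
# object's position by inverse-variance weighting -- the core step of
# Kalman-style fusion. All noise figures are illustrative assumptions.
import numpy as np

def fuse(mean_a, var_a, mean_b, var_b):
    """Inverse-variance weighted fusion of two independent estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    mean = (w_a * mean_a + w_b * mean_b) / (w_a + w_b)
    return mean, 1.0 / (w_a + w_b)

# Vision: accurate at range but degraded by occlusion once the part is
# grasped (assumed 5 mm standard deviation).
vision_xyz = np.array([0.512, 0.031, 0.204])   # meters
vision_var = 0.005 ** 2

# Tactile: contact geometry gives a tighter in-hand estimate
# (assumed 2 mm standard deviation).
tactile_xyz = np.array([0.515, 0.029, 0.201])
tactile_var = 0.002 ** 2

fused_xyz, fused_var = fuse(vision_xyz, vision_var, tactile_xyz, tactile_var)
print("fused position (m):", fused_xyz, "std (m):", np.sqrt(fused_var))
```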


Grant
Agency: NSF | Branch: Standard Grant | Program: | Phase: SMALL BUSINESS PHASE II | Award Amount: 746.17K | Year: 2016

The broader impact/commercial potential of this project is improvement in cost-efficiency, energy-efficiency, and quality in manufacturing automation, increasing worker productivity and reducing repetitive-motion injuries. This integrated visual-tactile system will be roughly 3-4 times less expensive ($20,000 purchase cost vs. existing $65,000-$80,000 vision systems), improve the speed and accuracy of current robotic handling systems, and facilitate the automation of repetitive, injury-prone manual tasks. By enabling new robotic applications and increasing productivity in current automation, this solution will help the U.S. maintain a competitive domestic manufacturing sector. In 2009 there were 36,190 logged repetitive-motion injuries in the U.S.; the median missed work time from these injuries was 21 days (U.S. Bureau of Labor Statistics). The immediate commercial applications are in industrial robotics, specifically robotics in agile manufacturing. In the long term, the technology will be applied in personal, healthcare, and military robots. The current market potential for tactile sensors for industrial robots is estimated at $576 million to $1.15 billion and is expected to more than double by 2025.

This Small Business Innovation Research (SBIR) Phase 2 project will result in a combined visual-tactile system that will give robots an integrated sense of touch and vision, much like the hand-eye coordination of humans. It incorporates a technically novel compliant tactile sensing solution: a rubber "skin" that can be molded into any form factor and is inexpensive and durable. This advanced skin technology can resolve object shape, contact/slip events, and the forces of contacted objects. It will uniquely fuse visual and tactile information for object handling and pose estimation, resulting in a flexible robotic system that handles objects more like humans do. This approach addresses key weaknesses in vision-based robotic manufacturing, such as occlusion and dislodging when parts are grasped. Current industrial robots are restricted in their ability to handle small, irregularly shaped, soft, or fragile parts. Existing solutions rely on expensive and complex 3D-vision systems or repetitive manual labor. This solution is two-fold: (1) a new flexible tactile sensor that can be tailored to a wide variety of form factors; and (2) software to fuse the tactile data with a vision system to estimate the pose of objects in pick-and-place tasks.
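
To illustrate how contact/slip events might be resolved from such a skin's signal, here is a minimal sketch of one common approach: incipient slip produces high-frequency vibration in the contact signal, so high-pass filtering and thresholding the short-time energy flags slip windows. The sampling rate, cutoff, and threshold are hypothetical; the grant does not describe the actual detection method.

```python
# Slip-detection sketch from a tactile pressure signal (NumPy + SciPy).
# Incipient slip shows up as high-frequency vibration, so high-pass
# filter the signal and threshold the short-time energy. Sampling rate,
# cutoff, and threshold are assumed placeholder values.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 1000.0        # tactile sampling rate in Hz (assumed)
CUTOFF = 50.0      # high-pass cutoff in Hz (assumed)
THRESHOLD = 1e-4   # energy threshold, tuned per sensor (assumed)

sos = butter(4, CUTOFF, btype="highpass", fs=FS, output="sos")

def detect_slip(pressure, window=64):
    """Return a boolean per window: True where vibration energy suggests slip."""
    # Subtract the first sample to avoid a step transient at filter start.
    vib = sosfilt(sos, pressure - pressure[0])
    n = len(vib) // window
    energy = (vib[:n * window].reshape(n, window) ** 2).mean(axis=1)
    return energy > THRESHOLD

# Synthetic example: steady grasp, then a vibration burst mid-signal.
t = np.arange(0, 2.0, 1.0 / FS)
signal = 0.5 + 0.005 * np.random.randn(len(t))
signal[900:1100] += 0.05 * np.sin(2 * np.pi * 120 * t[900:1100])  # simulated slip
print("slip windows:", np.flatnonzero(detect_slip(signal)))
```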


Day-Long Conference Convenes the Top Funders, Firms and Founders in this Emerging Industry, Including Executives from Intel Capital, IBM and Virtuosity

LOS ANGELES, CA--(Marketwired - Nov 16, 2016) - Mentor InSight (http://www.mentorinsight.net/), the media and consulting company that provides guidance empowering superachievement, today announced the creation of the #AIShowBiz Event, a day-long celebration featuring trailblazer talks, an entrepreneurial pitch fest, and an innovation fair on January 12, 2017 in Downtown Los Angeles.

#AIShowBiz brings together industry visionaries at the intersection of the artificial intelligence and entertainment industries to further the economic development of these industries as well as the profitability of the companies pioneering the technologies of tomorrow. The day-long conference includes Trailblazer Talks infused with techno music, a PitchFest for entrepreneurs seeking financing, and an evening networking reception featuring showcases of never-before-revealed technology demos.

#AIShowBiz PitchFest Award Prize: The top 3 winners receive R&D-to-prototype services provided by Above Solutions, Inc. Entrepreneurs and inventors with ideas, products, and/or services at the intersection of the AI/entertainment industry are encouraged to apply by visiting: http://www.mentorinsight.net/aishowbiz-pitchfest-easy-application/

Brilliant electronic artist/composer IAMEVE, whose "Starman" is the musical theme of #AIShowBiz, will be a featured performer during the event, held at The Los Angeles Cleantech Incubator (LACI), 525 S Hewitt St., Los Angeles, CA 90013. Tickets are available by visiting www.AIShow.Biz. For questions, please contact the #AIShowBiz Creator Molly Lavik at info@mentorinsight.net or 310-488-4401.

About #AIShowBiz: The #AIShowBiz event is created by Mentor InSight, Inc. in partnership with LACI: The Los Angeles Cleantech Incubator and sponsored by Perception Robotics, Kevin S. Reid Insurance Services, Inc., Transform Group, and Above Solutions, Inc.

About Mentor InSight, Inc.: Mentor InSight, Inc. is a media and consulting company that provides guidance empowering superachievement by producing educational forums and tools for people to realize their full human potential. The company was founded with a moonshot goal of facilitating the development of an Obi-Wan Kenobi-style hologram guru that can give advice on anything, anywhere, at any time. The #AIShowBiz conference's overarching mission is to create a marketplace that facilitates the development of the visionary AI/entertainment-industry innovations of tomorrow, today.


First-Ever Week-Long Discounts, Events and Trailblazer Talks to Celebrate the Landmark Achievements and Innovations in the Emerging Industry of Artificial Intelligence

LOS ANGELES, CA--(Marketwired - Nov 23, 2016) - Mentor InSight (http://www.mentorinsight.net/), the media and consulting company that provides guidance empowering superachievement, today announced the creation of a week-long celebration of landmark innovations in the AI industry known as "AI Appreciation Week." From November 23 through December 1, 2016, customers can receive discounts on event tickets and information about relevant industry events, as well as visionary talks that share the expertise of some of the most fascinating and prolific people involved with the artificial intelligence industry.

"Southern California is the capital of invention, especially as it relates to the extraordinary innovations of brilliant entrepreneurs in the artificial intelligence industry. The future possibilities of artificial intelligence are profound, and 2016 marks the rapid adoption of these new technologies, which inspired me to launch 'AI Appreciation Week,'" said Molly Lavik, Creator of #AIShowBiz, the world's first conference that explores the intersection of AI and the entertainment industry.

"AI Appreciation Week" inaugural-year activities include: 50% off #AIShowBiz early-bird tickets for the January 12, 2017 event, marked down from $200 to $100 and offered at this half-off rate only through December 1, 2016. Rates will return to $200 and higher after AI Appreciation Week. Tickets are available by visiting: http://www.AIShow.Biz.

Trailblazer Talks from Artificial Intelligence Visionaries: On Tuesday, November 29, 2016, Nicholas Wettels, PhD, President and CTO of Perception Robotics (http://www.perceptionrobotics.com/), will speak, broadcast via Facebook Live from the recently launched #AIShowBiz event page: https://www.facebook.com/aishowbiz/. Updates for the AI Trailblazer Talks broadcasts will be featured on this page.

The special promotion also includes a free #AIShowBiz pre-event mixer on December 1, 2016 featuring Greg Panos, Futurist and Human Simulation Evangelist. The event will be held from 6:30 p.m. to 8:00 p.m. in Venice, CA, with a no-host bar and menu offerings. Registration is required via tinyurl.com/AIFreeMixer.

#AIShowBiz Event, January 12, 2017: #AIShowBiz brings together industry visionaries at the intersection of the artificial intelligence and entertainment industries to further the economic development of these industries as well as the profitability of the companies pioneering the technologies of tomorrow. The day-long conference includes Trailblazer Talks infused with techno music, a PitchFest for entrepreneurs seeking financing, and an evening networking reception featuring showcases of never-before-revealed technology demos.

#AIShowBiz PitchFest Award Prize: The top 3 winners receive R&D-to-prototype services provided by Above Solutions, Inc. Entrepreneurs and inventors with ideas, products, and/or services at the intersection of the AI/entertainment industry are encouraged to apply by visiting: http://www.mentorinsight.net/aishowbiz-pitchfest-easy-application/

Brilliant electronic artist/composer IAMEVE, whose "Starman" is the musical theme of #AIShowBiz, will be a featured performer during the event, held at The Los Angeles Cleantech Incubator (LACI), 525 S Hewitt St., Los Angeles, CA 90013. Tickets are available by visiting www.AIShow.Biz. For questions about "AI Appreciation Week" and to join the celebration, please contact the #AIShowBiz Creator Molly Lavik at info@mentorinsight.net or 310-488-4401.

About #AIShowBiz: The #AIShowBiz event is created by Mentor InSight, Inc. in partnership with LACI: The Los Angeles Cleantech Incubator and sponsored by Perception Robotics, Kevin S. Reid Insurance Services, Inc., Transform Group, and Above Solutions, Inc.

About Mentor InSight, Inc.: Mentor InSight, Inc. is a media and consulting company that provides guidance empowering superachievement by producing educational forums and tools for people to realize their full human potential. The company was founded with a moonshot goal of facilitating the development of an Obi-Wan Kenobi-style hologram guru that can give advice on anything, anywhere, at any time. The #AIShowBiz conference's overarching mission is to create a marketplace that facilitates the development of the visionary AI/entertainment-industry innovations of tomorrow, today.
