Albany, CA, United States


Rozen O.,University of California at Davis | Block S.T.,University of California at Davis | Block S.T.,Chirp Microsystems | Shelton S.E.,Chirp Microsystems | Horsley D.A.,University of California at Davis
2015 Transducers - 2015 18th International Conference on Solid-State Sensors, Actuators and Microsystems, TRANSDUCERS 2015 | Year: 2015

In a conventional piezoelectric micromachined ultrasonic transducer (PMUT), only half the acoustic output is used, because pressure emerging from the back of the PMUT is wasted. Here, we demonstrate a novel method to recycle the back-side acoustic pressure by redirecting it to the front-side through concentric venting rings. The ring diameter determines the phase-shift between the sound emerging from the front-side port and the ring, and can be adjusted to either amplify the far-field sound pressure level (SPL) or change the directivity of the output beam. Devices were fabricated in an industrial foundry process using wafer-level bonding of a MEMS PMUT wafer to a CMOS wafer using a conductive metal eutectic bond. We have designed, fabricated, and characterized nine new venting designs, and we achieved a 4.5 dB increase in SPL for a design with a 400 μm radius venting ring. © 2015 IEEE.
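The tuning described above (ring diameter sets the phase shift, which either boosts far-field SPL or steers the beam) can be illustrated with a minimal two-source interference model. This is an illustrative sketch only, not the authors' model: the operating frequency and path-length difference below are assumed values, not figures from the paper.

```python
import math

# Illustrative two-source model: sound from the front-side port and from a
# concentric venting ring reach a far-field point with a relative phase set
# by their path-length difference. All numbers are assumptions.
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C
FREQ_HZ = 200e3          # assumed ultrasonic operating frequency
PATH_DIFF_M = 400e-6     # assumed extra path length for the ring output

wavelength = SPEED_OF_SOUND / FREQ_HZ
phase = 2 * math.pi * PATH_DIFF_M / wavelength  # relative phase of the two sources

# Far-field amplitude of two equal unit sources with relative phase:
# |1 + e^{j*phase}| = 2*cos(phase/2)
combined = abs(2 * math.cos(phase / 2))

# Gain in dB relative to the single front-side source alone (amplitude 1)
gain_db = 20 * math.log10(combined)
print(f"wavelength = {wavelength * 1e3:.2f} mm, phase = {math.degrees(phase):.1f} deg")
print(f"far-field SPL change vs. single source: {gain_db:+.1f} dB")
```

With a path difference much smaller than half a wavelength the two outputs add constructively; as the ring diameter (and hence the phase) grows, the sum can instead reshape the beam, which is the trade-off the abstract describes.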


Chirp's MEMS-based ultrasonic sensor also enables ultra-wide field-of-view, inside-out controller tracking for mobile VR/AR at 1/1,000th the power of other solutions

BARCELONA, SPAIN--(Marketwired - Feb 28, 2017) - Chirp Microsystems, the pioneer in low-power ultrasonic sensing solutions, today introduced at Mobile World Congress 2017 the first high-accuracy, ultra-low-power ultrasonic sensing development platform for wearables. The new Chirp development platform -- which leverages the company's microelectromechanical systems (MEMS)-based time-of-flight (ToF) sensor -- senses tiny "microgestures" with 1 mm accuracy, allowing users to interact with wearables and other consumer electronics devices using the smallest of gestures. Chirp's ToF sensor is also the foundation for vastly superior virtual reality/augmented reality (VR/AR) experiences, which the company demonstrated privately at Mobile World Congress.

Chirp's ultrasonic sensing development platform for VR/AR enhances user mobility, supporting "inside-out tracking" of controllers or input devices with six degrees of freedom, which allows users to interact with the VR/AR environment without being tethered to a base station or confined to a prescribed space. They can literally take their VR/AR systems with them.

In VR/AR applications, Chirp's development platform offers significant benefits over camera-based controller-tracking systems. It makes possible a 360-degree immersive experience because the tracking system moves with the user. It also supports a wide field of view, a vast improvement over the narrow field of view that camera-based systems provide.

"Chirp has developed a revolutionary new approach to ultrasonic sensing that expands the ways in which users can interact with consumer electronics devices," said Michelle Kiang, CEO of Chirp Microsystems. "We are demonstrating some of those ways this week at Mobile World Congress.
Wearing a Chirp-enabled smartwatch, I can use subtle finger gestures on the back of my hand, controlling watch functions without ever touching the screen. I will also demonstrate how Chirp brings the mobile VR experience one step closer to that of high-end tethered VR systems. We developed an ultrasonically tracked, six-degree-of-freedom controller that connects to any mobile VR headset, letting me play a VR saber game anywhere on the show floor, or even under the sunlight outside the exhibit halls. Since the controller is tracked from the headset, I can move anywhere in the virtual environment and the tracking moves with me, making the VR experience that much more immersive."

For More Information

Chirp's ultrasonic sensing development platforms are now available to qualified customers. For more information, please email: info@chirpmicro.com.

About Chirp Technology

Chirp's ultrasonic development platforms leverage its proprietary ToF sensor, a system-in-package (SiP) that combines a MEMS ultrasound transducer with a power-efficient digital signal processor (DSP) on a custom low-power mixed-signal CMOS ASIC. Relative to existing optical ToF sensors, Chirp's ultrasonic ToF sensor has extremely low power consumption and a wide 180-degree field of view; it works in any lighting conditions, including direct sunlight, and can detect objects of any color, including optically transparent ones. Compared to existing infrared (IR) ToF sensors, Chirp's ToF sensor offers ultra-precise range and position measurement. For more information, visit: http://www.chirpmicro.com/technology.html

About Chirp Microsystems

Chirp Microsystems is bringing ultrasound to everyday products. Founded in 2013 based on pioneering research performed at the University of California, Chirp's piezoelectric micromachined ultrasonic transducers offer long range and low power in a tiny package, enabling products that perceive the three-dimensional world in which we live.
Combined with Chirp's embedded software library, these sensors enable new user interfaces for wearables, smart home and other IoT devices, AR/VR and many more. For more information, please visit: www.chirpmicro.com The Chirp Microsystems logo is a registered trademark of Chirp Microsystems. All other product and company names are trademarks or registered trademarks of their respective holders.


MEMS & Sensors Industry Group Reveals Tech Showcase Finalists -- MSIG hosts competition for unique MEMS/sensors demos at MEMS & Sensors Executive Congress, 11/10/16 in Scottsdale, AZ

Abstract: MEMS & Sensors Industry Group (MSIG)’s annual MEMS & Sensors Technology Showcase at MEMS & Sensors Executive Congress® 2016 (November 9-11, 2016 in Scottsdale, AZ) highlights some of the newest and most unique MEMS/sensors-enabled applications in the industry. MSIG today announced the shortlist of finalists who will compete for the title of winner at this year’s event.

i-BLADES’ Smartplatform

i-BLADES’ mobile Smartcase is a new modular accessory that dramatically accelerates time to market and reach for MEMS and Internet of Things (IoT) technologies. It lets new technologies quickly reach mass-market mobile consumers through one integrated smartphone accessory -- a mobile phone case. It not only provides protection but also a Smartplatform that forms a “hard-wired” smartphone connection, enabling add-on MEMS and IoT technologies. Developers can add new sensors to Smartcase directly or through snap-on Smartblade modules. With i-BLADES, technologies can quickly go onto hundreds of millions of smartphones as an after-market opportunity, making smartphones “smarter.” i-BLADES partnered with Bosch to successfully deploy the BME680 sensor faster than via other routes. For more information, visit: www.i-blades.com or watch video: https://www.youtube.com/watch?v=dVcOewMhopE&feature=youtu.be

Chirp Microsystems’ MEMS-Based Ultrasonic Sensing Solution

Today’s VR and gaming systems are limited by their reliance on complex computer vision techniques for controller tracking, resulting in higher cost, limited tracking area and lack of mobility due to high power consumption.
Chirp Microsystems’ ultrasonic tracking technology addresses these limitations, offering solutions that enable truly mobile VR and AR systems at attractive price points suitable for multiple tiers of products. Chirp Microsystems’ new ultrasonic time-of-flight (ToF) technology uses pulses of ultrasound to measure an object’s range with millimeter accuracy. This ultra-low power ultrasonic ToF technology enables low-latency, millimeter-accurate 6 degrees of freedom (DOF) inside-out controller tracking for VR/AR and gaming systems. This system solution is enabled by Chirp’s ultra-low power ultrasonic ToF sensor, which offers ultra-wide field-of-view, noise and light immunity, fast sample rate, and small package size. The ToF sensor is a system in package (SiP) that combines a MEMS ultrasound transducer with a power-efficient digital signal processor (DSP) on a custom integrated circuit. In wearable applications, Chirp’s ultrasonic SiP provides a transformative and intuitive touchless gesture interface. For more information, visit: www.chirpmicro.com

Integrated Device Technology’s Gas Sensor for Air Quality and Breath Detection

Integrated Device Technology’s (IDT’s) new highly sensitive gas sensor family based on the ZMOD3250 targets indoor air quality with a roadmap that includes environmental (outdoor) air quality and breath detection. The ZMOD3250 family detects total volatile organic compounds (VOCs) and odors, and can be used to selectively identify several VOCs, including formaldehyde, ethanol and toluene. The company is promoting several features and applications of this new gas sensor product line, including the off-gassing detection of chemicals from common home and office materials, odor detection, selective measurements among VOCs and detection of several breath components. IDT’s flagship product, the ZMOD3250, features a unique silicon microhotplate with nanostructured sensing material that enables a highly sensitive measurement of gas.
The accompanying ASIC provides a flexible solution for integrating the sensor with various consumer devices, including mobile phones, wearables and appliances. Packaged in a 12-pin LGA assembly (3.0 mm x 3.0 mm x 0.7 mm), the sensor emulates a sensor array with a single sensor element. Suitable for a wide range of applications, the sensor features a programmable measurement sequence and a highly integrated CMOS design. To request more information about the ZMOD3250, visit: www.idt.com or watch video: http://www.idt.com/video/uv-sensor-and-gas-sensor-demonstration-idt

Valencell’s Biometric Gaming

Biometric input adds a new element to gaming. For example, fitness games can use heart rate as a key control measure, or action games can require users to hold their breath while their characters are swimming. Audio earbuds, headsets, armbands and wrist devices -- all of which make good use of MEMS/sensors -- are natural peripherals for gaming as well as for exercising. Valencell has created a demonstration game that not only uses real-time biometric data to affect the gaming experience, but also collects meaningful health metrics in the background. This has implications not only for the gaming industry, but also for healthcare and medical markets. In fact, healthcare practitioners are integrating biometric game play into physical therapy and surgery recovery protocols to measure and manage recovery processes. Valencell will demonstrate the game as well as its biometric output and analysis. For more information, visit: www.valencell.com or watch video: https://www.youtube.com/watch?v=QMTJP6OBmjA

Vesper’s Wake-on-Sound MEMS Microphone

Always-listening MEMS microphones may signal a new era of ubiquitous sensors that can run indefinitely on small batteries. That’s good news for developers of TV remote controls, smart speakers, smartphones, intelligent sensor nodes, hearables and other electronic devices.
It’s even better news for consumers who want to cut the power cord but end up incessantly charging devices or replacing batteries, even when those devices aren’t in regular use. Vesper -- developer of the world’s only piezoelectric MEMS microphones -- will demonstrate VM1010, the first quiescent-sensing MEMS microphone, during the MEMS & Sensors Technology Showcase. VM1010 alleviates the heavy power consumption typical of speech recognition, which consumes up to 1,000 µW or more. Because it supports wake-on-sound at practically zero power draw (a mere 3 µA of current while in listening mode), VM1010 reduces standby power by two orders of magnitude and can increase standby time by a factor of 100. Vesper will also demonstrate the extremely fast response time of VM1010, showing how it can go to full power within microseconds, quickly enough to record what a user is saying and capture keywords and other acoustic event triggers. For more information, visit: www.vespermems.com or watch video: https://www.youtube.com/watch?v=KhFtrjbpffE

Join Us for MEMS & Sensors Technology Showcase

Sponsored by Rogue Valley Microdevices, the MEMS & Sensors Technology Showcase will take place from 11:15 am to 12:15 pm on November 10, 2016. Audiences will choose a winner, which MSIG Executive Director Karen Lightman will announce at the close of MEMS Executive Congress US 2016 on November 11.

About MEMS & Sensors Industry Group

MEMS & Sensors Industry Group (MSIG) is the trade association advancing MEMS and sensors across global markets. MSIG advocates for near-term commercialization of MEMS/sensors-based products through a wide range of activities, such as conferences, technical working groups and education. By bringing the TSensors® (Trillion Sensors) initiative under the umbrella of its events and programs, MSIG also increases worldwide awareness of emerging MEMS/sensors-based applications with huge commercialization potential in the next decade and beyond.
Nearly 200 companies and industry partners comprise MEMS & Sensors Industry Group. For more information, visit: www.memsindustrygroup.org and follow MSIG on LinkedIn and Twitter (use @MEMSGroup). MEMS & Sensors Industry Group, the MEMS & Sensors Industry Group logo, MEMS & Sensors Executive Congress and TSensors are registered trademarks of MEMS & Sensors Industry Group. All other product and company names are trademarks or registered trademarks of their respective holders.

About MEMS & Sensors Executive Congress

Now in its 12th year, MEMS & Sensors Executive Congress 2016 is an annual event that brings together business leaders from a broad spectrum of industries: automotive, communications, consumer goods, energy/environmental, industrial and medical. It is a unique professional forum at which executives from companies designing and manufacturing MEMS/sensors technology sit side-by-side with their end-user customers in panel discussions and networking events to exchange ideas and information about the use of MEMS and sensors in commercial applications. Premier sponsors of MEMS & Sensors Executive Congress US 2016 include: Platinum Sponsor EV Group; Gold Sponsors Enterprise Florida and SPTS Technologies; Silver Sponsor NXP; Bronze Sponsors Analog Devices, Applied Materials, OMRON, TECNISCO, ULVAC and X-FAB MEMS Foundry. Event sponsors include: Bosch Automotive Electronics (AE), Bosch Sensortec GmbH, Coventor, Huawei, IHS Markit, MCA Public Relations, MEMS Journal, microGen Systems, MIPI Alliance, PNI Sensor, Rogue Valley Microdevices and Yole Développement. MEMS & Sensors Executive Congress US will take place November 9-11, 2016 at the JW Marriott Scottsdale Camelback Inn Resort & Spa. For more information, please contact MSIG via phone: +412/390-1644, email: or visit MEMS & Sensors Executive Congress US at: http://msigevents.org/msec2016/

Registration

For conference registration, please visit: http://msigevents.org/msec2016/registration/.
For press registration, please contact Maria Vetrano, Vetrano Communications, email: maria[at]vetrano.com.


News Article | December 21, 2016
Site: www.eurekalert.org

From bioprinting to adaptive learning, innovative tech based on fundamental research featured at world's largest consumer technology event

More than 20 small businesses funded by the National Science Foundation (NSF) will showcase their early-stage technologies at the 2017 CES, a global conference that unveils up-and-coming consumer technologies. The companies will be featured at the Eureka Park Marketplace, an area dedicated to pre-market technologies born from fundamental science and engineering innovation. NSF cofounded Eureka Park in 2012 to help NSF-funded entrepreneurs with emerging, ready-for-commercialization technology gain marketplace exposure by giving them access to potential partners and investors at CES. Since then, Eureka Park has had a six-fold increase in exhibitors, and now features more than 600 companies. The small businesses exhibiting at CES 2017 will showcase cutting-edge technologies designed to address challenges in robotics, biomedicine, energy, the Internet of Things, and many other areas.

WHAT: Startups and small businesses will demonstrate their pre-market or new-to-market technologies to thousands of CES attendees.

WHO: NSF-funded small businesses in Eureka Park include the following:

SynTouch, a sensor technology that gives robots the ability to replicate -- and sometimes exceed -- the human sense of touch.
TAG Optics Inc., a lens that uses sound to bring images into focus more quickly.
Neural Analytics, a robotic headset designed to enable people to monitor brain health at home for disorders such as stroke, dementia and concussion.
SE3D, a desktop system that prints biological scaffolds, cells and liquids for 3D tissues.
Weinberg Medical Physics, a portable, low-power MRI system using magnets that can be switched on and off to image heads or other parts of the body in seconds with high spatial resolution.
PFP Cybersecurity, an early warning system for wireless networking equipment and smart grid infrastructure that uses Internet of Things monitors, machine learning and data analytics to detect tiny anomalies in power patterns.
ZillionInfo, location intelligence solutions that help people deal with large datasets, dig into data insights, and make better location decisions such as site selection, territory mapping and routing.
Imagars, a design decision support tool that helps designers learn proper design techniques, develop engineering judgment and foster creativity in a stimulating and user-friendly fashion.
IntellADAPT, an adaptive learning technology based on pedagogy, learning models, rich media, real-time feedback and learner analytics.
SmartyPal, a technology platform that takes award-winning stories and videos and infuses them with personalized, interactive, educational games, which are designed for two or more people to play and learn together on the same screen.
ThoughtSTEM, a game engine platform that immerses students in computer science education by allowing them to easily modify one of the most popular video games of all time, Minecraft.
VoiceVibes, automated speech-coaching software that enables people to practice and improve their public speaking skills by analyzing vocal features of speech, such as pace, pauses, vocal variety and more, to make users sound more engaging, personable and confident.
Zyrobotics, an app-connected plush toy that helps young children fall in love with STEM by immersing them in an animated world of learning games and stories.
ARGIL, Inc., a fast-switching electrochromic film to improve light and heat control in cars by replacing standard glass in sunglasses and goggles with dynamic glass.
Chirp Microsystems Inc., an ultrasound, low-power 3D sensing technology that lets users play music or check email on a tablet with the wave of a hand.
Pointivo, a cloud-based computer vision algorithm that uses machine learning to create 3D models from video and photos.
VisiSonics Corporation, highly realistic and computationally efficient software for sound synthesis, with applications that include virtual reality, gaming and prostheses for the blind.
Stratio Inc., innovative sensor and low-cost optics products to expand access to materials identification, product quality control, medical imaging and other applications.
Vaporsens, nanofiber-based sensor and instrumentation technology for the next generation of gas and vapor chemical sensing.
LightUp, a personal tutoring platform powered by augmented reality and artificial intelligence that provides real-time voice guidance, adaptive learning analytics and 3D holograms to teach STEM subjects.
Electroninks, Inc., particle-free, metallic, conductive inks for consumer electronics, wearables and e-textiles.

NSF awards nearly $190 million annually to startups and small businesses through the Small Business Innovation Research (SBIR)/Small Business Technology Transfer (STTR) program, transforming scientific discovery into products and services with commercial and societal impact.

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2016, its budget is $7.5 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives more than 48,000 competitive proposals for funding and makes about 12,000 new funding awards. NSF also awards about $626 million in professional and service contracts yearly.


Grant
Agency: NSF | Branch: Standard Grant | Program: | Phase: SMALL BUSINESS PHASE II | Award Amount: 971.00K | Year: 2015

This Small Business Innovation Research (SBIR) Phase II project proposes the development of an ultralow-power ultrasonic three-dimensional (3D) rangefinder system for mobile gesture recognition. The proposed 3D rangefinder uses an array of tiny piezoelectric ultrasound transducers which are built on a silicon wafer using microfabrication techniques. Custom electronics are used to control the transducers and the system emits sound into the air and receives echoes from objects in front of the transducer array. The proposed ultrasonic 3D rangefinder has the potential to be small and low-power enough to be left on continuously, giving devices such as smartphones, tablets, and wearable electronic devices a way to sense physical objects in the surrounding environment. Based on the smartphone market alone, the potential market size for this device is over one billion units per year. Mobile contextual awareness will enable 3D interaction with smartphones and tablets, facilitating rich user interfaces for applications such as gaming and hands-free control in automobiles. Looking beyond the smartphone and tablet market, the proposed rangefinder will feature size and power advantages that will permit integration into centimeter-sized devices which are too small to support a touchscreen.

During Phase II, the major technical goals of this project are to transfer the ultrasound transducer manufacturing from a university laboratory to a commercial production facility, to develop a custom integrated circuit for signal processing, and to develop engineering prototypes. In Phase I, micromachined ultrasound transducers having a novel structure designed to improve manufacturability were developed and a demonstration prototype was built using signal processing algorithms running on a personal computer. In Phase II, the ultrasound transducers will be manufactured in a commercial facility for the first time and signal processing algorithms will be realized on a custom mixed-signal integrated circuit. A prototype package for the transducer and integrated circuit chips will be developed and detailed acoustic testing of the packaged prototypes will be conducted.


Grant
Agency: National Science Foundation | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 150.00K | Year: 2014

This Small Business Innovation Research Phase I project proposes the development of an ultrasonic three-dimensional (3D) rangefinder system for mobile gesture recognition. Optical gesture recognition has been introduced for gaming and will soon be launched for personal computer (PC) interaction, but optical gesture sensors are too large and power-hungry to be incorporated into tablets, smartphones, and smaller devices. The proposed 3D rangefinder uses an array of tiny piezoelectric ultrasound transducers which are built on a silicon wafer using microfabrication techniques. Custom electronics are used to control the transducers. In operation, the system emits sound into the air and receives echoes from objects in front of the transducer array. The system infers the location of the objects by measuring the time delay between transmission of the sound wave and reception of the echo. The system will be designed for incorporation into smartphones, tablets, and other mobile devices. The broader impact/commercial potential of this project is to bring contextual awareness to everyday devices, which currently have very little idea about what is going on in the space around them. The proposed ultrasonic 3D rangefinder has the potential to be small and low-power enough to be left on continuously, giving the device a way to sense the physical objects surrounding it in the environment. While today's optical 3D ranging systems work across a small room and are capable of sufficient resolution, they are too large and power hungry to be integrated into battery-powered devices. Mobile contextual awareness will enable 3D interaction with smartphones and tablets, facilitating rich user interfaces for applications such as gaming and hands-free control in automobiles. 
Looking beyond the smartphone and tablet market, the proposed rangefinder would be well-suited for wearable devices that are too small or simply don't allow for a full-function touchscreen, such as head mounted displays and smart watches. These products currently have limited input options since the area available for buttons and touch-sensor inputs is only slightly larger than a finger. Ultrasonic contextual awareness has the potential to revolutionize the user interface for tiny consumer electronics.
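The pulse-echo principle the grant abstracts describe (measure the delay between transmitting a sound wave and receiving its echo, then infer distance) reduces to a one-line calculation. A minimal sketch, assuming sound travels at roughly 343 m/s in room-temperature air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed nominal value)

def echo_range_m(round_trip_s: float, c: float = SPEED_OF_SOUND) -> float:
    """Range to a target from the pulse-echo round-trip time.

    The sound travels out to the target and back, so the one-way
    distance is half the total path: r = c * t / 2.
    """
    return c * round_trip_s / 2.0

# Example: a 5.83 ms round trip corresponds to a target about 1 m away.
print(echo_range_m(5.83e-3))
```

A real rangefinder must also correct for the temperature dependence of the speed of sound and reject spurious echoes, but the core time-of-flight arithmetic is exactly this.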


A rangefinding apparatus and method are disclosed. The apparatus may include at least one processor and memory operably connected to the at least one processor. The memory may store instructions that, when executed, cause the apparatus to iterate a target-acquisition process until a target is identified and then iterate a target-tracking process after the target has been identified. The target-acquisition process may include transmitting a short ultrasonic pulse, transmitting a long ultrasonic pulse, and listening for one or more echoes corresponding to the short or long ultrasonic pulses. The target-tracking process may include steering an optimized ultrasonic pulse toward the target, listening for an echo corresponding to the optimized ultrasonic pulse, and calculating, based on the echo, an updated location for the target.
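The two-phase control flow in this abstract (iterate acquisition until a target is identified, then iterate tracking) can be sketched as a simple loop. All function names and signatures below are hypothetical placeholders for hardware-specific routines, and the fall-back-to-acquisition behavior when tracking loses its echo is an assumption the abstract does not spell out:

```python
from typing import Callable, Optional, Tuple

Location = Tuple[float, float, float]  # hypothetical (x, y, z) target position

def rangefinder_loop(
    transmit: Callable[[str], None],                      # send a "short" or "long" pulse
    listen: Callable[[], Optional[Location]],             # echo -> location, or None
    steer_and_listen: Callable[[Location], Optional[Location]],
    max_iters: int = 100,
) -> Optional[Location]:
    """Skeleton of the acquisition/tracking loop described in the abstract."""
    target: Optional[Location] = None
    for _ in range(max_iters):
        if target is None:
            # Target acquisition: transmit a short pulse and a long pulse,
            # then listen for echoes corresponding to either.
            transmit("short")
            transmit("long")
            target = listen()
        else:
            # Target tracking: steer an optimized pulse toward the last known
            # location and update the location from the returned echo.
            update = steer_and_listen(target)
            if update is not None:
                target = update
            else:
                target = None  # assumed: lost echo drops back to acquisition
    return target
```

In a real driver, `transmit`, `listen`, and `steer_and_listen` would wrap the ultrasonic front end; the point here is only the control structure the claims describe.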


Patent
Chirp Microsystems | Date: 2016-04-28

A piezoelectric micromachined ultrasonic transducer (PMUT) device includes a substrate having an opening therethrough and a membrane attached to the substrate over the opening. A portion of the membrane that overlies the opening is divided into a plurality of cantilevers that are mechanically coupled so that the cantilevers resonate at a common frequency.
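As a rough illustration of the scales involved, the fundamental resonance of a single micromachined cantilever can be estimated from standard Euler-Bernoulli beam theory. The silicon properties and dimensions below are assumed for illustration and are not taken from the patent; in the actual device it is the mechanical coupling between cantilevers that enforces the single shared frequency:

```python
import math

# Euler-Bernoulli estimate of a clamped-free (cantilever) beam's fundamental
# resonance: f1 = (lambda1^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)),
# where lambda1 ~ 1.875 is the first clamped-free mode constant.
LAMBDA1 = 1.875104

# Assumed silicon properties and cantilever dimensions (illustrative only)
E_SI = 169e9     # Young's modulus, Pa
RHO_SI = 2330.0  # density, kg/m^3
LENGTH = 100e-6  # cantilever length, m
THICK = 2e-6     # cantilever thickness, m

# For a rectangular cross-section, I/A = t^2/12, so sqrt(E*I/(rho*A))
# simplifies to t * sqrt(E / (12*rho)); width cancels out.
f1 = (LAMBDA1**2 / (2 * math.pi * LENGTH**2)) * THICK * math.sqrt(E_SI / (12 * RHO_SI))
print(f"estimated fundamental resonance: {f1 / 1e3:.0f} kHz")
```

For these assumed dimensions the estimate lands in the hundreds of kilohertz, i.e. the ultrasonic range; fabrication tolerances would leave isolated cantilevers slightly mismatched, which is why coupling them to resonate at a common frequency matters.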


