Barrington, RI, United States

News Article | May 25, 2017
Site: www.prweb.com

XIMEA, the innovator of small-size, high-speed cameras, announced today that its xiB-64 camera line was recognized by the judges of the annual Vision Systems Design Innovators Awards program. The judging panel consisted of esteemed experts from system integrator and end-user companies: https://www.ximea.com/en/corporate-news/vision-systems-design-award-2017

XIMEA was honored with a Platinum-level award for its fast camera based on the PCI Express interface. Besides extremely high bandwidth of up to 64 Gbit/s, the PCIe interface also allows different interfaces and camera models to be aggregated: many cameras can easily be combined into a customizable hub from which the compiled data is transferred over just one cable.

Image sensors available in the first camera family include the CMOSIS CMV12000, CMV20000 and CMV50000, providing resolutions of 12 MPixel, 20 MPixel or 50 MPixel and reaching speeds of up to 330 fps. The second camera family features global shutter CMOS image sensors with 13.7 µm pixels that can reach up to 3500 fps at a full resolution of 1280 x 864.

“We were positively surprised by the overwhelming feedback received after the announcement of the PCIe product line,” says Max Larin, CEO and head of product development and engineering at XIMEA. “The demand for fast frame rates and higher resolution is ever growing in the machine vision field.”

Alan Bergstein, publisher of Vision Systems Design (http://www.vision-systems.com), said, "This prestigious program allows Vision Systems Design to celebrate and recognize the most innovative products and services in the vision and image processing industry. Our 2017 Honorees are an outstanding example of companies who are making an impact in the industry."
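As a rough sanity check of the figures quoted above, the raw data rate of each sensor mode can be compared against the 64 Gbit/s link. The 10-bit pixel depth below is an assumption for illustration; the release does not state the bit depth.

```python
GBIT = 1e9
LINK_GBIT_S = 64  # xiB-64 aggregate PCIe bandwidth from the release

def data_rate_gbit_s(megapixels, fps, bits_per_pixel=10):
    """Raw sensor data rate in Gbit/s (bit depth assumed, not from the release)."""
    return megapixels * 1e6 * fps * bits_per_pixel / GBIT

# CMV12000: 12 MPixel at up to 330 fps
cmv12000 = data_rate_gbit_s(12, 330)
print(f"CMV12000 @ 330 fps: {cmv12000:.1f} Gbit/s "
      f"({'fits' if cmv12000 <= LINK_GBIT_S else 'exceeds'} the 64 Gbit/s link)")

# Second family: 1280 x 864 at up to 3500 fps
fast = data_rate_gbit_s(1280 * 864 / 1e6, 3500)
print(f"1280x864 @ 3500 fps: {fast:.1f} Gbit/s")
```

Under this assumption both headline modes land near 39 Gbit/s, comfortably inside the quoted 64 Gbit/s, which is consistent with the claim that multiple cameras can share one link.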
The 2017 Vision Systems Design Innovators Awards Honorees are featured in the June issue of Vision Systems Design magazine as well as on http://www.vision-systems.com.

About Vision Systems Design
Published since 1996, Vision Systems Design is a global resource for engineers, engineering managers and systems integrators that provides comprehensive global coverage of vision systems technologies, applications, and markets. Vision Systems Design's magazine, website (http://www.vision-systems.com), email newsletters and webcasts report on and analyze the latest technology and business developments and trends in the worldwide machine vision and image processing industry.

About the Vision Systems Design 2017 Innovators Awards program
The Vision Systems Design 2017 Innovators Awards program reviewed and recognized the most innovative products and services in the vision and image processing industry. Honorees were announced at Automate 2017, held in Chicago, IL, USA. Criteria used in the Innovators Awards ranking included originality; innovation; impact on designers, systems integrators and end-users; fulfilling a need in the market that hasn't been addressed; leveraging a novel technology; and increasing productivity.

About XIMEA
Drawing on two decades of experience in the industry, XIMEA's offering consists of state-of-the-art cameras with USB 3.1, USB 3.0, USB 2.0, PCI Express and FireWire interfaces, as well as X-RAY, Hyperspectral and Thunderbolt™ technology enabled cameras. For more than 20 years XIMEA has developed, manufactured and sold standard and OEM cameras for machine vision applications in motion control, assembly, robotics, industrial inspection and security, as well as scientific-grade cooled cameras for life science and microscopy. The cameras are distinguished by their extremely robust construction combined with very high speed, as exemplified by the USB3 Vision camera line.
Learn more about XIMEA at http://www.ximea.com


GENEVA, SWITZERLAND--(Marketwired - May 25, 2017) - At the May 22-24 EBACE aircraft show in Geneva, the industry's growing movement towards SPD-Smart EDWs was evident in many forms: aircraft currently being equipped with SPD-Smart EDWs, new programs selecting SPD, and new strategic partnerships combining respective strengths and innovations.

SPD-Smart EDWs deliver unprecedented passenger benefits. By enabling users to precisely control the amount of daylight and glare coming through windows, passengers can instantly tune the tint to a comfortable level while continuing to enjoy views, rather than blocking their view with a shade. The system delivers many other practical benefits, including a cooler cabin due to remarkable thermal insulation properties and a quieter cabin due to acoustic insulation properties.

SPD-Smart EDWs selected for Lufthansa Technik and Mercedes interior for VIP aircraft
Lufthansa Technik and Mercedes-Benz Style are now offering for sale their next-generation interior design for large private jets. At EBACE, the aircraft industry also learned that SPD-Smart EDWs have been selected for use in this luxurious interior. Ten very large SPD-Smart EDWs, each covering multiple window openings and more, will be used to create technological luxury and aesthetics in this remarkable interior. Visitors to the Lufthansa Technik booth were able to view mockups of the cabin interior and experience a walk-through of the interior via a virtual reality tour. See this luxurious interior by Mercedes-Benz Style and Lufthansa Technik by viewing this video.
Vision Systems and PPG Aerospace enter into commercial agreement to offer SPD-Smart EDWs
In a joint press release last week, Research Frontiers licensee Vision Systems and PPG Aerospace, a leader in aircraft transparencies, announced that they will combine their respective strengths to bring "a wide range of dimmable solutions," using SPD-Smart EDW technology, to the commercial, regional, military and general aviation sectors of the industry. EBACE marked the first time that these two companies appeared together publicly, and visitors to the booth were treated to a wide array of SPD-Smart dimmable solutions, including Nuance V2, their second-generation SPD-Smart EDW. Other products featured included the latest version of their smart Acti-Vision interactive window and SPD-Smart EDWs for VIP helicopters.

Also at the Vision Systems booth was another new innovation developed by Vision Systems: embedded electronics and a laminated touch panel for a more compact, easy-to-retrofit solution. On the passenger-facing side of the SPD-Smart EDW panel, the passenger can instantly and precisely tune the tint of the lightweight SPD-Smart EDW by simply touching the bottom of the window itself. On the opposite side, the electronic EDW controller is integrated into the window panel itself. This elegant solution simplifies the aftermarket installation procedure for airlines and operators, and saves space and weight. View this video to see this innovation.

Research Frontiers licensee InspecTech and Fokker Services, a division of GKN Aerospace, have teamed to offer the brand "Element EDW" using SPD-Smart technology. At EBACE, visitors to Fokker's booth had the opportunity to experience a virtual reality tour of a full cabin of Element EDWs. This experience provided perspective on how SPD-Smart EDWs improve the passenger experience for all -- not just those seated at windows. To learn more about InspecTech / Fokker Element EDWs, read this Fokker Element EDW brochure or view this video.
SPD-Smart EDWs, supplied by Research Frontiers licensee InspecTech Aero Service, are standard equipment on all three models of Beechcraft King Airs: the King Air 350i, King Air 250, and King Air C90GTx. The King Air 250 with these instantly switching SPD-Smart EDWs was at this week's EBACE show at the static display of aircraft at Geneva airport. InspecTech is now shipping its "iShade" brand of SPD-Smart EDWs for all three models, and King Airs with the new interiors are now being delivered to customers. To see SPD-Smart EDWs on board the King Air 250 at EBACE, view this video.

Other notable aircraft at EBACE that have selected SPD-Smart EDWs
The HondaJet HA-420 aircraft, from Honda Aircraft Company, was on display at EBACE. SPD-Smart EDWs, supplied by Research Frontiers licensee Vision Systems, are standard equipment on the HondaJet. Also at EBACE was a mockup of the forthcoming Dassault Falcon 5X. SPD-Smart skylights, supplied by Vision Systems, have been selected as standard equipment on this aircraft. To offer business aviation's first skylight, Dassault faced a critical need to manage the intense solar light, glare and heat coming into the cabin, and SPD-Smart EDW technology provided the solution. The first delivery of the Falcon 5X is expected in 2020.

Research Frontiers (REFR) is the developer of SPD-Smart light-control technology, which allows users to instantly, precisely and uniformly control the shading of glass or plastic, either manually or automatically. Research Frontiers has an infrastructure of over 40 licensed companies that collectively are capable of serving the growing global demand for smart glass products in automobiles, homes, buildings, museums, aircraft and boats. For more information, please visit our website at www.SmartGlass.com, and find us on Facebook, Twitter, LinkedIn and YouTube.

Note: From time to time Research Frontiers may issue forward-looking statements which involve risks and uncertainties.
This press release contains forward-looking statements; actual results could differ and are not guaranteed, and any forward-looking statements should be considered accordingly. "SPD-Smart" is a trademark of Research Frontiers Inc. "Nuance V2" and "Acti-Vision" are trademarks of Vision Systems. "Element EDW" is a trademark of Fokker Services. "iShade" is a trademark of InspecTech Aero Service. "Alteos" is a trademark of PPG Aerospace.


Patent
Vision Systems, Inc. | Date: 2013-03-22

A new technology is provided for a portable bar code verification reader, designed to create a video image of a bar code label in accordance with strict industry bar code verification standards. The bar code verification reader captures a bar code label and then sends the resulting video image to a computer. The resulting bar code label quality may be displayed on the portable verification reader via one or more LEDs, an embedded display monitor, or the like.


Grant
Agency: Department of Defense | Branch: Air Force | Program: SBIR | Phase: Phase II | Award Amount: 749.86K | Year: 2015

ABSTRACT: VSI proposes the development of the SIGMA (Speedy Imagery Geo-registration and Motion Analysis) system, which will address three key requirements by the end of Phase II: (i) adapt to different types of sensors, sensor configurations and sensor operating conditions, (ii) handle large amounts of data in real time or close to real time using limited resources, and (iii) operate under conditions such as lack of metadata in GPS-denied areas and linear, unbounded trajectories. The central tenet of the proposed approach is the formation of a graph whose nodes are groups of images, with nodes connected to each other if there is spatial or temporal connectivity. Furthermore, the development in Phase II will leverage the Phase I developments, specifically (i) the robust online SFM system, which was successfully tested on multiple datasets such as CLIF 2007, MAMI and synthetic datasets, and (ii) the demonstrated feasibility of the dynamic graph representation to handle large amounts of image data and variable camera geometry constraints. The proposed system is a plugin-based framework which will allow each of the components to be replaced by different implementations or algorithms, so that the system can work flexibly under different operating conditions. BENEFIT: Real-time mapping of aerial imagery for urban planning, construction and disaster relief, and real-time traffic updates.
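The graph formation described in the abstract can be sketched minimally as follows. Nodes are groups of consecutive frames; edges join groups that are adjacent in time or whose ground footprints overlap. The footprint representation (axis-aligned bounding boxes) and all names are illustrative assumptions, not VSI's actual implementation.

```python
def footprints_overlap(a, b):
    """a, b: (xmin, ymin, xmax, ymax) ground footprints."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def build_group_graph(groups):
    """groups: list of (group_id, footprint) in temporal order.
    Returns adjacency sets keyed by group_id."""
    adj = {gid: set() for gid, _ in groups}
    for i, (gi, fi) in enumerate(groups):
        # Temporal connectivity: consecutive groups in the stream.
        if i + 1 < len(groups):
            gj = groups[i + 1][0]
            adj[gi].add(gj); adj[gj].add(gi)
        # Spatial connectivity: temporally distant groups whose
        # footprints overlap (loop closures).
        for gj, fj in groups[i + 2:]:
            if footprints_overlap(fi, fj):
                adj[gi].add(gj); adj[gj].add(gi)
    return adj

# A pass that revisits its start: group 3 overlaps groups 0 and 1.
groups = [(0, (0, 0, 2, 2)), (1, (3, 0, 5, 2)),
          (2, (6, 0, 8, 2)), (3, (1, 1, 3, 3))]
graph = build_group_graph(groups)
```

A graph like this is what makes the "spatial or temporal connectivity" constraint concrete: temporal edges chain the stream together, while spatial edges supply the cross-links needed for global consistency.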


Grant
Agency: Department of Defense | Branch: Air Force | Program: SBIR | Phase: Phase II | Award Amount: 1.50M | Year: 2016

ABSTRACT: A system is proposed to access, collate and process imagery from multiple small satellite constellations. The imagery is fused and registered to a common reference so that exploitation algorithms, such as change detection and event monitoring, report results with accurate geographic coordinates. The system, called SatTel, is implemented as open source software that is designed to run on the cloud with web-based analyst tools for defining intelligence queries. These queries arise from tasks such as monitoring weapon test sites, and receive timely responses due to SatTel's ability to aggregate images through an integrated portal that accesses multiple vendor archives. The imagery to satisfy intelligence queries is selected using a rule-based algorithm that matches exploitation algorithm data requirements against available image attributes such as resolution, viewpoint and atmospheric effects. The image fusion process yields useful intermediate products, including a high-resolution digital elevation model (DEM) and a controlled orthorectified image base map (CIB). These products are exported in standard formats such as GeoTIFF and NITF. The image exploitation algorithms include global change detection, event triggers and vehicle tracking, and their results are exported as attributed shape files, a format accepted by many current analyst tools such as ArcGIS. BENEFIT: Oil and gas energy; natural resource management (agriculture and forestry); insurance; fleet management; real estate; city planning and engineering.
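The rule-based selection step described above can be illustrated with a minimal sketch: each exploitation algorithm declares attribute limits, and candidate images are filtered against them. Field names and thresholds here are invented for illustration; the abstract does not specify them.

```python
def select_images(images, rules):
    """images: list of dicts with attribute values per image.
    rules: {attribute: maximum acceptable value} for one algorithm."""
    return [img for img in images
            if all(img[attr] <= limit for attr, limit in rules.items())]

# Hypothetical requirements for a change-detection algorithm:
# ground sample distance, off-nadir angle, cloud cover.
change_detection_rules = {"gsd_m": 1.0, "off_nadir_deg": 30, "cloud_pct": 10}

archive = [
    {"id": "A", "gsd_m": 0.5, "off_nadir_deg": 12, "cloud_pct": 3},
    {"id": "B", "gsd_m": 3.0, "off_nadir_deg": 5,  "cloud_pct": 0},   # too coarse
    {"id": "C", "gsd_m": 0.8, "off_nadir_deg": 45, "cloud_pct": 2},   # too oblique
]
selected = select_images(archive, change_detection_rules)
```

Keeping the rules as data rather than code is what lets one portal serve many exploitation algorithms with different requirements.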


Grant
Agency: Department of Defense | Branch: Air Force | Program: SBIR | Phase: Phase I | Award Amount: 149.93K | Year: 2013

ABSTRACT: VSI proposes a flexible sensor estimation framework for airborne calibration processing of image streams over long time periods. The components of the framework will consist of existing software and libraries. The plugin-based framework provides the flexibility to switch components easily. The core of the framework is a belief propagation engine that globally optimizes the sensor parameters. The input to the approach is an image stream and its associated navigational data, when available. The proposed framework is an online system wherein the sensor estimation is continuously refined as more spatial overlap is discovered. The proposed system can be dynamically tuned for either speed/efficiency or accuracy, depending on the requirements of the operator. The central tenets of the proposed approach are (i) to use the temporal ordering of the images to increase computational efficiency, (ii) to use the spatial overlap of images which are temporally disjoint to correct for any accumulated drift, and (iii) to locally optimize the sensor parameters and use belief propagation to globally optimize them for hundreds of thousands of images. BENEFIT: Sensor estimation capabilities for airborne image acquisition systems for remote sensing applications. On-board processing capabilities for updating traffic reports.
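The "plugin-based framework" idea that recurs in these abstracts can be sketched as a pipeline of named, swappable stages. The interfaces and stage names below are illustrative assumptions, not VSI's actual design.

```python
class Pipeline:
    """Each stage is a named slot whose implementation can be swapped
    at configuration time without touching the other stages."""

    def __init__(self):
        self._slots = {}

    def register(self, slot, impl):
        """Bind an implementation (any callable) to a named stage."""
        self._slots[slot] = impl

    def run(self, frame):
        # Stages run in a fixed order; any stage can be replaced,
        # e.g. to trade speed for accuracy in the optimizer.
        for slot in ("feature_extractor", "matcher", "optimizer"):
            frame = self._slots[slot](frame)
        return frame

# Toy implementations that just tag the data as it flows through.
p = Pipeline()
p.register("feature_extractor", lambda f: f + ["features"])
p.register("matcher", lambda f: f + ["matches"])
p.register("optimizer", lambda f: f + ["refined"])
result = p.run(["frame"])
```

Re-registering a single slot is how such a system would be "dynamically tuned for either speed/efficiency or accuracy", as the abstract puts it.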


Grant
Agency: Department of Defense | Branch: Air Force | Program: SBIR | Phase: Phase I | Award Amount: 149.87K | Year: 2013

ABSTRACT: While the state of the art in both single-image reconstruction algorithms and multi-view structure from motion algorithms has advanced considerably in recent years, little work has been performed which leverages the constraints relied upon by both approaches. When an area of interest is imaged from a long distance with little angular diversity in viewpoint, multi-view constraints alone are often insufficient to produce an accurate 3-d reconstruction. It is proposed that the additional constraints provided by surface properties learned from image data will improve reconstruction performance significantly. The proposed Phase I effort is focused on the development of an aerial image-based 3-d reconstruction algorithm that combines the relative strengths of single-image reconstruction and context algorithms with state-of-the-art multi-view stereo. The result is an automatically generated 3-d model that is optimally constrained by all information contained in a set of collected aerial images. The proposed system is general enough to exploit high angular diversity datasets, but exhibits graceful degradation as viewpoint diversity decreases. The decrease in information due to low view-angle diversity is compensated by single-image constraints on surface orientation derived by machine learning algorithms. BENEFIT: Benefits of the proposed approach include improved sensor model estimation and high-accuracy 3-d modeling capabilities. Applications include support of downstream processing (tracking, geo-positioning, geo-registration), augmented reality / situational awareness, and simulation/training.
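One simple way to picture the graceful degradation described above is a blended cost in which the multi-view term is down-weighted as angular diversity shrinks, letting learned single-image constraints take over. The blending rule and its knee point are assumptions for illustration, not the funded algorithm.

```python
def combined_cost(mv_cost, si_cost, angular_diversity_deg, knee_deg=15.0):
    """Blend multi-view (mv) and single-image (si) reconstruction costs.
    angular_diversity_deg: spread of viewing angles in the image set.
    knee_deg: assumed diversity above which multi-view is fully trusted."""
    w_mv = min(1.0, angular_diversity_deg / knee_deg)  # weight in [0, 1]
    return w_mv * mv_cost + (1.0 - w_mv) * si_cost

# Wide baseline: the multi-view term dominates.
wide = combined_cost(mv_cost=0.2, si_cost=0.8, angular_diversity_deg=40)
# Narrow baseline: single-image surface constraints dominate.
narrow = combined_cost(mv_cost=0.2, si_cost=0.8, angular_diversity_deg=3)
```

The point of such a formulation is that there is no hard switch between the two regimes: the reconstruction always uses all available information, weighted by how informative each source is.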


Grant
Agency: Department of Defense | Branch: National Geospatial-Intelligence Agency | Program: SBIR | Phase: Phase II | Award Amount: 499.54K | Year: 2013

With the advancement of aerial imaging sensors, high quality data equipped with partial sensor calibration models is available. There is recent research activity in the computer vision community that aims to reconstruct the 3-d structure of observed scenes from the content of the imagery in fully automated ways. However, the research has not matured into robust systems ready for operational settings. In this proposal, a novel architecture is presented that reconstructs the 3-d geometry of the scene in the form of a geo-registered 3-d point cloud, given imagery from multiple sensor platforms. The 3-d cloud is equipped with LE and CE measurements through propagation of errors in the sensor calibration and geometry reconstruction stages. The CVG team proposes to use a volumetric probabilistic 3-d representation (P3DM) and dense image matching to reconstruct the geometry and appearance of the scene starting from a set of images with partial calibration data. The P3DM technology is at Technology Readiness Level (TRL) 4, with critical modules of the system parallelized and implemented on GPU hardware for real-time processing.
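The core of a volumetric probabilistic representation like the one described is a per-voxel surface probability that is updated as new views agree or disagree with it. Below is a toy Bayesian update as a stand-in for that idea, with invented sensor likelihoods; it is not the P3DM implementation.

```python
def bayes_update(prior, likelihood_occ, likelihood_free):
    """Posterior probability that a voxel contains surface, given one
    observation. likelihood_occ / likelihood_free: probability of the
    observation if the voxel is occupied / empty (assumed values)."""
    num = likelihood_occ * prior
    return num / (num + likelihood_free * (1.0 - prior))

p = 0.5  # uninformative prior for a fresh voxel
for _ in range(3):  # three views consistently observe a surface here
    p = bayes_update(p, likelihood_occ=0.8, likelihood_free=0.3)
```

Because each view multiplies the voxel's odds by a likelihood ratio, agreement across views drives the probability toward 1 while a single inconsistent view (e.g. an occlusion) only pulls it partway back, which is what makes such representations robust for fusing imagery from multiple platforms.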


Grant
Agency: Department of Defense | Branch: Air Force | Program: SBIR | Phase: Phase I | Award Amount: 149.90K | Year: 2012

ABSTRACT: Multi-sensor fusion is a key capability required for effective inference in timely and accurate strategic and tactical planning. For effective fusion, a multi-sensor fusion technology needs to appropriately weigh different image streams according to the uncertainties and errors across the inputs and parameters in each data source, such as GPS and INS measurements, sensor model parameters, inherent ambiguities of the data (for example, localization in featureless regions), and errors due to registering data in a common frame. Vision Systems, Inc. proposes to fuse aerial imagery via 3-d models using subjective logic to represent and account for the various uncertainties. Additionally, the 3-d models will be registered by computing a 3-d to 3-d transformation. BENEFIT: Remote sensing, surveying, evacuation and emergency planning, visualization and navigation.
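In subjective logic, an opinion is a (belief, disbelief, uncertainty) triple summing to 1, and sources are combined with fusion operators. Below is a sketch of Jøsang's cumulative fusion operator for two sources, as one concrete way the uncertainties above could be combined; the abstract does not state which operators VSI uses, and the example opinions are invented.

```python
def cumulative_fuse(op1, op2):
    """Cumulative fusion of two subjective-logic opinions (b, d, u).
    Valid when the two uncertainties are not both zero."""
    b1, d1, u1 = op1
    b2, d2, u2 = op2
    k = u1 + u2 - u1 * u2
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)

# A confident source fused with an uncertain one: the fused opinion
# leans toward the confident source and has lower uncertainty than either.
gps = (0.7, 0.1, 0.2)    # low-uncertainty GPS/INS-derived estimate
match = (0.4, 0.1, 0.5)  # featureless region: uncertain image match
b, d, u = cumulative_fuse(gps, match)
```

The appeal for multi-sensor fusion is that uncertainty is carried explicitly: an unreliable stream automatically contributes less, rather than needing hand-tuned weights.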


Patent
Vision Systems, Inc. | Date: 2012-02-10

Driver distraction is reduced by providing information only when necessary to assist the driver, and in a visually pleasing manner. Obstacles such as other vehicles, pedestrians, and road defects are detected based on analysis of image data from a forward-facing camera system. An internal camera images the driver to determine a line of sight. Navigational information, such as a line with an arrow, is displayed on a windshield so that it appears to overlay and follow the road along the line of sight. Brightness of the information may be adjusted to correct for lighting conditions, so that the overlay will appear brighter during daylight hours and dimmer during the night. A full augmented reality is modeled and navigational hints are provided accordingly, so that the navigational information indicates how to avoid obstacles by directing the driver around them. Obstacles also may be visually highlighted.
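The brightness correction described in the abstract (brighter overlay by day, dimmer at night) can be pictured as a mapping from ambient light to overlay level. The log-linear ramp and all constants below are invented for illustration; the patent does not disclose its mapping.

```python
import math

def overlay_brightness(ambient_lux, min_level=0.1, max_level=1.0,
                       night_lux=10.0, day_lux=10000.0):
    """Map ambient illuminance to an overlay brightness in [min, max].
    night_lux/day_lux: assumed clamp points for night and daylight."""
    if ambient_lux <= night_lux:
        return min_level
    if ambient_lux >= day_lux:
        return max_level
    # Log-linear ramp between night and day, since perceived
    # brightness tracks the logarithm of illuminance.
    t = (math.log10(ambient_lux) - math.log10(night_lux)) / (
        math.log10(day_lux) - math.log10(night_lux))
    return min_level + t * (max_level - min_level)

noon = overlay_brightness(50000)  # bright daylight: full brightness
dusk = overlay_brightness(100)    # dim conditions: partial brightness
```

A logarithmic ramp is a natural choice here because human brightness perception is roughly logarithmic, so equal steps in overlay level correspond to roughly equal perceived changes across lighting conditions.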
