Buckley E., Light Blue Optics
Optics Letters | Year: 2010
It is shown that the lens count in a Fourier holographic projector can be reduced by encoding the equivalent lens power in sets of Fresnel holograms. By using appropriately calculated Fresnel holograms in a reflective configuration to effectively share a lens between the beam-expansion and demagnification stages of a holographic projector, a reduction in lens count from four to one is demonstrated. © 2010 Optical Society of America.
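The lens-sharing approach rests on the fact that a thin lens of focal length f is equivalent to a quadratic phase profile, which can be added directly to a hologram's phase. A minimal numpy sketch of this encoding step (illustrative only; the grid, wavelength, and focal length values are assumptions, not values from the paper):

```python
import numpy as np

# Assumed parameters (illustrative, not from the paper)
wavelength = 532e-9      # green laser, metres
focal_length = 0.1       # encoded lens focal length, metres
n = 256                  # hologram size, pixels
pitch = 10e-6            # SLM pixel pitch, metres

# Pixel coordinate grids centred on the optical axis
coords = (np.arange(n) - n / 2) * pitch
x, y = np.meshgrid(coords, coords)

# Quadratic phase of a thin lens: phi = -pi (x^2 + y^2) / (lambda f)
lens_phase = -np.pi * (x**2 + y**2) / (wavelength * focal_length)

# Encoding the lens: add its phase (mod 2*pi) to an existing
# Fourier hologram's phase pattern (a random stand-in here)
hologram_phase = np.random.uniform(0, 2 * np.pi, (n, n))
combined = np.mod(hologram_phase + lens_phase, 2 * np.pi)
```

Because the lens is carried in the hologram's phase rather than in glass, the same physical element can serve both the beam-expansion and demagnification stages, which is what permits the reduction in lens count.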
Buckley E., Light Blue Optics
IEEE/OSA Journal of Display Technology | Year: 2011
Phase-only holographic projection has prompted a great deal of research and has often been cited as a desirable method of 2-D image formation, since such a technique offers a number of advantages over conventional imaging projection technology. Although holographic image formation was demonstrated some forty years ago, efforts at realizing a real-time 2-D video projection system based on this technique have not been successful, principally due to the computational complexity of calculating diffraction patterns in real time and the poor quality of the resultant images. In this paper, a new approach to hologram generation and display is presented which overcomes both of these problems, enabling, for the first time, a high-quality real-time holographic projector. © 2006 IEEE.
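For context on why computation is the bottleneck: even the simplest phase-only hologram calculation involves a full 2-D Fourier transform per frame. A generic single-FFT sketch (this is a textbook random-phase method, not the paper's algorithm) computes a phase-only diffraction pattern for a target image and checks its replay field:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target intensity image: a bright square (illustrative)
n = 128
target = np.zeros((n, n))
target[48:80, 48:80] = 1.0

# Assign random phase to the target amplitude, then inverse-FFT to the
# hologram plane; keeping only the phase gives a phase-only hologram
field = np.sqrt(target) * np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n)))
hologram = np.angle(np.fft.ifft2(field))

# Replay: illuminate the phase hologram and FFT back to the image plane
replay = np.abs(np.fft.fft2(np.exp(1j * hologram))) ** 2
replay /= replay.max()
```

Discarding the amplitude in the hologram plane is what introduces the noise that historically limited image quality; iterative or multi-frame methods trade extra computation for lower noise.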
Light Blue Optics | Date: 2013-08-13
Various approaches to touch sensing systems are disclosed. As an example, a touch sensing system is disclosed that includes: an optical beam source to provide an optical beam; a pair of controllable beam deflectors comprising at least first and second beam deflectors, wherein the first beam deflector is configured to deflect the optical beam through a first angle towards a touch sensing region, and wherein the second beam deflector is configured to deflect scattered light from an object in the touch sensing region through a second angle; a detector, in particular a detector array; and an imaging system to image the deflected scattered light from the second beam deflector onto the detector array. The first and second beam deflectors are controlled in tandem to scan the touch sensing region.
Light Blue Optics | Date: 2013-07-03
We describe a touch sensitive holographic image display device for holographically projecting a touch sensitive image at an acute angle onto a surface on which the device is placed. The device includes holographic image projection optics comprising at least one coherent light source illuminating a spatial light modulator (SLM), output optics to project a hologram onto an acute angle surface, and a remote touch sensing system to remotely detect a touch of a location within or adjacent to the holographically displayed image. A control system is configured to provide data defining an image for display, to receive detected touch data, and to control the device responsive to remote detection of a touch of the displayed image.
Light Blue Optics | Date: 2014-12-19
We describe a touch sensing system projecting light defining a touch sheet above a surface and a camera to capture a touch image of light scattered by a pen intersecting the touch sheet. A signal processor identifies a lateral location of the pen. The pen includes a light source to provide a light signal, and the system also includes two photodiodes to detect the signal from the pen. The touch detection is augmented by modelling the signals received at the photodiodes, dependent on the distance and angle of a pen, and using this to derive a probability of the observed received signals given the pen location determined from the touch sheet. This information can be used, for example, to allocate identities to the pens.
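The photodiode model described above can be sketched as: predict the signal each photodiode should see for a candidate pen location, then score the actual observation with a likelihood. A toy numpy version, in which the photodiode layout, the inverse-square falloff, and the Gaussian noise level are all assumptions for illustration (the real model also depends on pen angle, as the abstract notes):

```python
import numpy as np

# Photodiode positions on the device (assumed layout, metres)
photodiodes = np.array([[-0.1, 0.0], [0.1, 0.0]])

def predicted_signals(pen_xy, strength=1.0):
    """Predict photodiode amplitudes for a pen at pen_xy
    using a simple inverse-square distance falloff."""
    d2 = np.sum((photodiodes - pen_xy) ** 2, axis=1)
    return strength / d2

def log_likelihood(observed, pen_xy, sigma=0.05):
    """Gaussian log-probability of the observed signals given a pen location."""
    residual = observed - predicted_signals(pen_xy)
    return -0.5 * np.sum((residual / sigma) ** 2)

# Two candidate pen locations reported by the touch sheet
candidates = [np.array([0.0, 0.3]), np.array([0.2, 0.5])]
observed = predicted_signals(candidates[0])  # noise-free observation

# The candidate that generated the observation scores highest,
# which is how identities can be allocated to pens
scores = [log_likelihood(observed, c) for c in candidates]
best = int(np.argmax(scores))
```

Comparing such likelihoods across pens and touch-sheet locations is one way the photodiode signals can disambiguate which pen produced which touch.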
Light Blue Optics | Date: 2013-01-17
We describe a touch sensitive image display device. The device comprises: an image projector to project a displayed image onto a surface in front of the device; a touch sensor light source to project light defining a touch sheet above the displayed image; a camera directed to capture a touch sense image comprising light scattered from the touch sheet by an object approaching the displayed image; and a signal processor to process the touch sense image to identify a location of the object relative to the displayed image. The camera is able to capture an image projected by the image projector, the image projector is configured to project a calibration image, and the device includes a calibration module configured to use a calibration image from the projector, captured by the camera, to calibrate locations in said captured touch sense image with reference to said displayed image.
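The calibration step amounts to estimating a mapping from camera pixel coordinates to displayed-image coordinates, using point correspondences found in the projected calibration image. A standard way to do this is a planar homography fitted by the direct linear transform; this is a generic sketch with assumed correspondence values, not the device's actual routine:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src (DLT via SVD).

    src, dst: (N, 2) arrays of corresponding points, N >= 4.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def apply_homography(h, pts):
    """Map (N, 2) points through homography h (homogeneous divide)."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ h.T
    return mapped[:, :2] / mapped[:, 2:3]

# Camera sees the calibration pattern's corners at cam_pts, which
# correspond to known display coordinates disp_pts (values assumed)
cam_pts = np.array([[120.0, 80.0], [520.0, 95.0], [540.0, 410.0], [100.0, 400.0]])
disp_pts = np.array([[0.0, 0.0], [800.0, 0.0], [800.0, 600.0], [0.0, 600.0]])

H = fit_homography(cam_pts, disp_pts)
# Any later touch location seen by the camera maps into display coordinates
mapped = apply_homography(H, cam_pts)
```

Once H is known, every object location identified in a touch sense image can be reported directly in the displayed image's coordinate frame.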
Light Blue Optics | Date: 2013-09-12
We describe a touch sensing system comprising a light source to project light defining a touch sheet above a surface; a camera directed to capture a touch sense image from the touch sheet, comprising light scattered by an object; and a signal processor to process the touch sense image to identify a lateral location of the object on the surface. A brightness of the projected light is modulated to define bright, touch detecting intervals and dark intervals. The camera and the light projection are synchronized such that the camera selectively captures scattered light during the touch detecting intervals and rejects ambient light during the dark intervals. The system further comprises a pen. The pen comprises a photodetector to detect the projected light, a first light source detectable by the camera, and a controller coupled to control the first light source such that it is selectively illuminated during touch detecting intervals in synchronism with modulated projected light of the touch sheet.
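Synchronizing the camera to the modulated light source permits simple ambient rejection: frames captured during dark intervals contain only ambient light and can be subtracted from the bright, touch-detecting frames. A minimal sketch of this differencing, with made-up frame values:

```python
import numpy as np

# Simulated camera frames (assumed values): ambient light is present in
# both intervals; scattered touch-sheet light only in the bright interval
ambient = np.full((4, 4), 20.0)
touch_signal = np.zeros((4, 4))
touch_signal[2, 2] = 100.0             # light scattered by an object

bright_frame = ambient + touch_signal  # touch-detecting interval
dark_frame = ambient                   # dark interval

# Differencing rejects the ambient component and keeps the touch signal
touch_image = np.clip(bright_frame - dark_frame, 0.0, None)
peak = np.unravel_index(np.argmax(touch_image), touch_image.shape)
```

The pen's synchronized first light source exploits the same timing: by flashing only during touch-detecting intervals, its light survives this differencing just as the touch-sheet scatter does.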
Light Blue Optics | Date: 2012-10-08
Some embodiments of the present invention provide touch sensitive image display devices including: an image projector to project a displayed image onto a surface; a touch sensor light source to project a plane of light above the displayed image; a camera to capture a touch sense image from light scattered from the plane of light by an approaching object; and a signal processor to process the touch sense image to identify a location of the object. The light path to the touch sensor camera includes a keystone-distortion compensating optical element, in particular a convex curved aspherical mirror.
Light Blue Optics | Date: 2013-03-25
A touch sensing system, for sensing the position of at least one object with respect to a surface, the system comprising: a first, 2D touch sensing subsystem to detect a first location of said object with respect to a surface and to provide first location data; a second, object position sensing subsystem to detect a second location of said object, wherein said second location of said object is not constrained by said surface, and to provide second location data; a system to associate said first location data and said second location data and to determine additional object-related data from said association; and a system to report position data for said object, wherein said position data comprises data dependent on at least one of said first and second locations and on said additional object-related data.
Light Blue Optics | Date: 2012-06-15
A touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by a plurality of objects simultaneously approaching or touching said surface; a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of each of said objects; and an object position output, to provide output object positions for said plurality of objects; wherein said signal processor is further configured to: input a succession of said touch sense images; process each said touch sense image to identify a plurality of candidate object touch positions; filter said candidate object touch positions from said successive touch sense images to link said candidate object touch positions to previously identified output object positions for said plurality of objects; and update said previously identified output object positions using said linked candidate object touch positions.
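The filtering stage in this claim, linking each frame's candidate touch positions to previously identified object positions, can be sketched as greedy nearest-neighbour association with a gating distance. This is a deliberate simplification (the claim does not specify the association algorithm), with the gating threshold and coordinates assumed:

```python
import math

def link_candidates(tracks, candidates, max_dist=0.05):
    """Greedily link each existing track to its nearest unclaimed candidate.

    tracks: dict of object id -> (x, y) previous output position
    candidates: list of (x, y) candidate touch positions in the new frame
    Returns updated tracks; unlinked candidates start new objects.
    """
    unused = list(candidates)
    updated = {}
    for obj_id, pos in tracks.items():
        if not unused:
            break
        nearest = min(unused, key=lambda c: math.dist(pos, c))
        if math.dist(pos, nearest) <= max_dist:
            updated[obj_id] = nearest      # update track with linked candidate
            unused.remove(nearest)
    next_id = max(tracks, default=-1) + 1
    for c in unused:                       # new touches become new objects
        updated[next_id] = c
        next_id += 1
    return updated

# Two tracked touches; the new frame reports them in swapped order
tracks = {0: (0.10, 0.20), 1: (0.50, 0.50)}
candidates = [(0.51, 0.50), (0.11, 0.21)]
tracks = link_candidates(tracks, candidates)
```

Because candidates are linked by proximity to the previous output positions rather than by detection order, each object keeps a stable identity across frames even when the per-frame candidate list is unordered.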