VideoMining Corporation

State College, PA, United States


News Article | December 14, 2016
Site: globenewswire.com

NEW YORK, Dec. 14, 2016 (GLOBE NEWSWIRE) -- ICAP Patent Brokerage announces for sale patents disclosing a means of real-time shopper analysis, available from VideoMining Corporation. “This is sophisticated technology comparable to the kinds of data mining usually applied to large-scale e-commerce operations, but intended for brick-and-mortar analytics,” said Paul Greco, Senior Vice President of Sales for ICAP Patent Brokerage.

Key Characteristics & Benefits

With a 2007 priority date, the VideoMining portfolio discloses an apparatus, systems, and methods to understand and measure shopper composition, behavior, and attitudes in retail environments, primarily to gauge shoppers’ responses to various stimuli or elements (e.g., marketing, products, pricing) so that retailers can optimize their marketing and merchandising strategies.

This includes a method and system for measuring human response to retail elements based on the shopper's facial expressions and behaviors. Facial geometry (facial pose and facial feature positions) is estimated to facilitate the recognition of facial expressions, gaze, and demographic categories. Dynamic changes in these cues, together with measured interest, are used to estimate both the shopper's changing attitude toward the retail element and the end response, such as a purchase decision or a product rating.

Also included is a comprehensive method for designing an automatic media audience measurement system capable of estimating the site-wide audience of a medium of interest (e.g., the site-wide viewership of a target display) from measurements of an audience subset. Site, display, crowd, and audience, along with the relationships among them, are characterized in terms of visibility, viewership relevancy, and crowd occupancy; these characterizations, combined with optimal sensor positioning and intelligent noise removal, are used to extrapolate viewership data.

See the technical description of the shopper behavior IP sales offering.

To learn more about the intellectual property available for sale in this portfolio, contact Michelle Tyler of ICAP Patent Brokerage at (312) 327-4438 or via email at michelle@icapip.com. If you have a patent portfolio for sale, visit our website to make a patent portfolio submission for an upcoming sealed bid event. Follow us on Twitter (@ICAP_IP) and join our LinkedIn group.

About ICAP Patent Brokerage

ICAP Patent Brokerage is the world's largest intellectual property brokerage and patent auction firm, leveraging the talents of experienced patent brokers to match buyers and sellers for the sale of patents and other intellectual property assets. With multiple transaction platforms and unparalleled industry knowledge, including experience with domain names, trademarks, brands, intellectual property licensing, and UCC sales, ICAP Patent Brokerage is the global leader in the transaction of intellectual property.
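The audience measurement portion of the portfolio, as summarized above, extrapolates site-wide viewership from measurements of an audience subset. The following Python snippet is a minimal sketch of that extrapolation idea only; the function name, parameters, and correction weights are hypothetical illustrations and are not drawn from the patents.

# Minimal sketch (not the patented method): scaling a subset measurement
# up to a site-wide viewership estimate. All names and weights are
# hypothetical illustrations.

def extrapolate_viewership(measured_viewers: int,
                           sampled_fraction: float,
                           visibility_weight: float = 1.0,
                           relevancy_weight: float = 1.0) -> float:
    """Scale a subset measurement up to a site-wide estimate.

    measured_viewers: viewers counted by sensors covering part of the site
    sampled_fraction: fraction of the relevant crowd covered by those sensors, in (0, 1]
    visibility_weight: correction for areas where the display is only partially visible
    relevancy_weight: correction for traffic unlikely to view the display
    """
    if not 0 < sampled_fraction <= 1:
        raise ValueError("sampled_fraction must be in (0, 1]")
    return measured_viewers / sampled_fraction * visibility_weight * relevancy_weight


if __name__ == "__main__":
    # 120 viewers measured by cameras covering 30% of the relevant area,
    # with modest corrections for visibility and relevancy.
    print(extrapolate_viewership(120, 0.30, visibility_weight=0.9, relevancy_weight=0.95))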


Kasturi R., University of South Florida | Goldgof D.B., University of South Florida | Ekambaram R., University of South Florida | Pratt G., DARPA | And 15 more authors.
Proceedings - International Conference on Pattern Recognition | Year: 2014

The U.S. Defense Advanced Research Projects Agency's (DARPA) Neovision2 program aims to develop artificial vision systems based on the design principles employed by mammalian vision systems. Three such algorithms are briefly described in this paper. These neuromorphic-vision systems' performance in detecting objects in video was measured using a set of annotated clips. This paper describes the results of these evaluations, including the data domains, metrics, methodologies, performance over a range of operating points, and a comparison with computer-vision-based baseline algorithms. © 2014 IEEE.
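As a rough companion to the evaluation described above, the Python sketch below scores detections against ground-truth annotations in a single frame using intersection-over-union (IoU) matching, the kind of frame-level comparison such evaluations typically build on. It is a generic illustration, not the Neovision2 evaluation protocol; the box format, greedy matching, and 0.5 threshold are assumptions.

# Generic frame-level detection scoring sketch; not the Neovision2 protocol.

def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def frame_precision_recall(detections, annotations, thresh=0.5):
    """Greedy one-to-one matching of detections to ground-truth boxes."""
    unmatched = list(annotations)
    tp = 0
    for det in detections:
        best = max(unmatched, key=lambda gt: iou(det, gt), default=None)
        if best is not None and iou(det, best) >= thresh:
            tp += 1
            unmatched.remove(best)
    fp = len(detections) - tp   # detections with no matching annotation
    fn = len(unmatched)         # annotations no detection covered
    precision = tp / (tp + fp) if detections else 1.0
    recall = tp / (tp + fn) if annotations else 1.0
    return precision, recall

Aggregating these per-frame counts over all annotated clips, and sweeping the detector's confidence threshold, yields the kind of operating-point curves the paper reports.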


Hong K., LG Corp | Medeiros H., Marquette University | Shin P.J., VideoMining Corporation | Park J., Purdue University
International Journal of Sensor Networks | Year: 2016

This paper presents a novel resource-aware framework for the implementation of distributed particle filters in resource-constrained wireless camera networks (WCNs). WCNs often suffer from communication failures caused by physical limitations of the communication channel as well as network congestion. Unreliable communication degrades the visual information shared by the cameras, such as visual feature data, and consequently leads to inaccurate vision processing at individual camera nodes. This paper focuses on the effects of communication failures on object tracking performance and presents a novel communication resource-aware tracking methodology, which adjusts the amount of data packets transmitted by the cameras according to the network conditions. We demonstrate the performance of the proposed framework using three different mechanisms to share the particle information among nodes: synchronised particles, Gaussian mixture models, and Parzen windows. The experimental results show that the proposed resource-aware method makes the distributed particle filters more tolerant to packet losses and also more energy efficient. Copyright © 2016 Inderscience Enterprises Ltd.
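One of the sharing mechanisms named above approximates a camera's particle set with a Gaussian mixture so that only the mixture parameters, rather than every particle, need to cross the network. The Python sketch below illustrates that compression idea in spirit only; the rule tying component count to packet loss and all function names are hypothetical, not taken from the paper.

# Sketch: compress a particle set into a Gaussian mixture and shrink the
# mixture (smaller payload) as the observed packet-loss rate rises.
import numpy as np
from sklearn.mixture import GaussianMixture

def components_for(loss_rate: float, max_components: int = 5) -> int:
    """Hypothetical rule: fewer mixture components as packet loss grows."""
    return max(1, int(round(max_components * (1.0 - loss_rate))))

def compress_particles(particles: np.ndarray, loss_rate: float):
    """Fit a GMM to (N, d) particle states; return the parameters to transmit."""
    k = components_for(loss_rate)
    gmm = GaussianMixture(n_components=k, covariance_type="diag").fit(particles)
    return gmm.weights_, gmm.means_, gmm.covariances_

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    particles = rng.normal(loc=[2.0, -1.0], scale=0.3, size=(500, 2))
    for loss in (0.0, 0.6):
        w, mu, var = compress_particles(particles, loss)
        print(f"loss={loss:.1f}: sending {len(w)} mixture components")
    # A receiving camera node could resample particles from these parameters
    # instead of waiting for the full particle set.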
