News Article | May 19, 2017
Quantum Applied Science & Research (QUASAR), Inc., a San Diego-based high-tech company, has just been awarded a research contract from the Defense Advanced Research Projects Agency (DARPA) to develop a wearable detector of extreme mental stress, referred to as acute cognitive strain (ACS). Under this funding, QUASAR will collaborate with the Florida Institute for Human and Machine Cognition (IHMC) to develop and test a wearable, comfortable, and nonintrusive Detector of Acute Cognitive Strain (DACS) for U.S. warfighters. The $150K award is the first of two planned phases of a Small Business Technology Transfer (STTR) research effort whose total funding could exceed $1M. In October 2016, QUASAR and IHMC also received a $2M award from the U.S. Air Force to monitor cognitive effort, stress, and fatigue.

ACS is the psychological state that results from excessively high mental demands. It can cause loss of productivity, situational awareness, and self-control, in addition to breakdowns in team cooperation. This can worsen job performance and satisfaction, and in certain environments lead to team failure and catastrophic consequences.

“In today’s world, almost everyone is stressed out. ACS is the extreme end of the stress spectrum, when tempers are lost and grown-ups melt down into tantrums,” explained Dr. Walid Soussou, CEO of QUASAR, who is leading this project along with Dr. Anil Raj from IHMC.

Many consumer-grade wearable stress monitors have recently started appearing on the market, such as Thync, Wellbe, and Spire, and there are even apps for the Apple Watch. QUASAR and IHMC aim to develop a military-grade wearable DACS sensor suite to measure ACS-related physiological changes, including increased blood pressure, heart rate, sweatiness, muscle and voice tension, and shortness of breath.
These measurements would be used to detect ACS and alert management that the operator needs assistance, or fed back into the system itself so it could take appropriate action to address the situation. For both military and non-military environments, detecting ACS is important for rapidly mitigating stressful situations, especially in mission- or safety-critical settings such as command center operation, emergency response teams, air traffic control, and plant management. ACS detection also matters in other stressful professional environments, including high-stakes trading in the financial sector, emergency medicine, and competitive sports. Wearable DACS technology could therefore be extended to a range of non-military applications, including anger management and specialized peak-performance training methods for maintaining athletic performance under high-stress conditions.

About QUASAR

QUASAR, Inc. (http://www.quasarusa.com) was founded in 1998. The company is known for high-fidelity dry sensors for noninvasive physiological monitoring. QUASAR’s focus is the creation of high-performance systems for measuring electroencephalogram (EEG) and electrocardiogram (ECG) signals. QUASAR also develops specialized algorithms to interpret these signals for real-world applications such as optimizing education programs and enhancing athletic performance.

About IHMC

IHMC is a non-profit research institute of the Florida University System and is affiliated with several Florida universities. Researchers at IHMC pioneer technologies aimed at leveraging and extending human capabilities to deliver cognitive, physical, or perceptual augmentation. IHMC collaborates extensively with industry and government to develop science and technology that can be enabling with respect to society’s broader goals of human life enhancement.
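The article describes a pipeline in which several physiological signals are fused into a single strain estimate that triggers an alert. A minimal illustrative sketch of that idea follows; the sensor names, baselines, weights, and threshold are all made-up assumptions for illustration, not details of the actual DACS design.

```python
# Hypothetical sketch: fusing physiological readings into an ACS alert.
# Baselines, weights, and the threshold are illustrative assumptions.

BASELINES = {"heart_rate": 70.0, "blood_pressure": 120.0,
             "skin_conductance": 5.0, "respiration_rate": 14.0}
WEIGHTS = {"heart_rate": 0.3, "blood_pressure": 0.25,
           "skin_conductance": 0.25, "respiration_rate": 0.2}

def acs_score(readings: dict) -> float:
    """Weighted sum of each signal's fractional elevation above baseline."""
    score = 0.0
    for name, value in readings.items():
        elevation = max(0.0, (value - BASELINES[name]) / BASELINES[name])
        score += WEIGHTS[name] * elevation
    return score

def acs_alert(readings: dict, threshold: float = 0.25) -> bool:
    """True when the fused score suggests acute cognitive strain."""
    return acs_score(readings) >= threshold
```

A real detector would of course need per-individual calibration and temporal filtering; this sketch only shows the fuse-then-threshold structure the article implies.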
News Article | February 27, 2017
BARCELONA, Spain, Feb. 27, 2017 (GLOBE NEWSWIRE) -- MOBILE WORLD CONGRESS 2017 -- InterDigital, Inc. (NASDAQ:IDCC), a mobile technology research and development company, and the Florida Institute for Human and Machine Cognition (IHMC), a not-for-profit multidisciplinary research institute based in Pensacola, FL, today announced a debut demonstration of their Contextual Driving Platform (CDP) at this year’s Mobile World Congress.

InterDigital’s Innovation Partners and IHMC collaborated to develop a contextual driving platform that demonstrates cooperative sensor data fusion techniques, based on data from on-board sensors and Vehicle-to-X communications, for situational awareness. In an interactive demonstration, MWC attendees will be able to experience the contextual services offered by the platform through a virtual reality driving experience. The technology demonstration will highlight how vehicles sense their surroundings and collaborate with each other to compute risk exposure, improve performance, optimize trajectories, and adapt self-driving behavior.

“We will be seeing self-driving vehicles equipped with a multitude of camera, radar and LIDAR sensors, as well as 5G and V2V communication capabilities, appear on the road soon. For a long time, we still will have human-driven cars sharing the road with automated vehicles. With this platform, we are demonstrating how these two different sets of vehicles can work together to understand mixed traffic conditions, share situational data and monitor, assess and mitigate risks to make driving safe both for computer algorithms and humans. Via the immersive Virtual Reality experience, we are offering a taste of the future’s automated driving world now,” said Samian Kaur, Director, Partner Development at InterDigital.

“Vehicle safety should be a collaborative process, informed by prior knowledge of historical risk collected from surrounding vehicles. This is basically the way that humans reduce risk.
We notice the driving manner of an approaching vehicle, its closing speed and proximity, and make a risk assessment that helps us reduce or increase speed for a better safety margin. Part of that assessment is an intuitive notion based on the surrounding vehicles, road conditions, situational awareness, and driving experience. The Contextual Driving Platform allows us to explore these possibilities as a V2X networked solution,” said Tim Hutcheson, Research Associate at IHMC.

Innovation Partners is InterDigital's technology sourcing initiative based on partnerships with leading technology firms, innovators, and research organizations worldwide. Innovation Partners recognizes the importance of partnerships to capitalize on opportunities to expand the company's research and development efforts in response to the evolution of the mobile technology industry.

Attendees of Mobile World Congress can see the Contextual Driving Platform demo, along with other IoT and 5G demos, at InterDigital’s pavilion in Hall 7, Stand 7C61.

InterDigital develops mobile technologies that are at the core of devices, networks, and services worldwide. We solve many of the industry's most critical and complex technical challenges, inventing solutions for more efficient broadband networks and a richer multimedia experience years ahead of market deployment. InterDigital has licenses and strategic relationships with many of the world's leading wireless companies. Founded in 1972, InterDigital is listed on NASDAQ and is included in the S&P MidCap 400® index. InterDigital is a registered trademark of InterDigital, Inc.

IHMC is one of the nation’s premier research organizations, with world-class scientists and engineers investigating a broad range of topics related to building technological systems aimed at amplifying and extending human cognitive, perceptual, and physical capacities. IHMC headquarters are in Pensacola, Florida, with a branch research facility in Ocala, Florida.
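The risk-assessment intuition Hutcheson describes (closing speed plus proximity) is often formalized as time-to-collision (TTC). The toy sketch below illustrates that heuristic; the thresholds and labels are made-up assumptions and are not the Contextual Driving Platform's actual risk model.

```python
# Illustrative time-to-collision (TTC) risk heuristic.
# Thresholds (2 s, 5 s) are assumed for illustration only.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until the gap closes; infinity if the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

def risk_level(gap_m: float, closing_speed_mps: float) -> str:
    """Map TTC to a coarse risk label a planner could act on."""
    ttc = time_to_collision(gap_m, closing_speed_mps)
    if ttc < 2.0:
        return "high"      # e.g., brake or yield now
    if ttc < 5.0:
        return "moderate"  # e.g., widen the safety margin
    return "low"
```

A V2X platform would fuse such per-vehicle estimates with shared historical risk data, as the press release suggests, rather than rely on a single instantaneous gap measurement.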
Koschmann T.,University of Illinois at Springfield |
LeBaron C.,Brigham Young University |
Goodwin C.,University of California at Los Angeles |
Feltovich P.,Florida Institute for Human and Machine Cognition
Journal of Pragmatics | Year: 2011
As our contribution to this special issue, we examine how understandings of objects are talked and worked into being within concerted action. We will argue that formal procedure can serve as a resource in this regard. Procedures make relevant certain kinds of objects, objects that serve as its materials, tools, end-products, agents, etc. Our analysis traces all references to a particular object, the cystic artery, over the course of a surgery conducted at a teaching hospital. The arrangements of the operating theatre impose certain constraints on how the key participants, a surgeon in training, a faculty member and a medical student, were able to display and detect particular features of their material environment. Also, because of the surgery's status as a 'site of instruction,' a special set of accountabilities came into play during its performance. Talk was frequently seen to do both instructional and instrumental work. The team members were called upon to interpret the visual field in congruent ways and, more specifically, to strike agreements as to what would serve as salient objects for the purposes of the work at hand. The identification of the cystic artery was called into question and its thingness had to be renegotiated. We draw on Garfinkel's notion of 'trust' to describe the prospective/retrospective processes of referring to what comes to be the cystic-artery-for-the-purposes-of-this-surgery. We argue that procedure both determines and is determined by its objects. © 2010.
Tortonesi M.,University of Ferrara |
Stefanelli C.,University of Ferrara |
Benvegnu E.,Florida Institute for Human and Machine Cognition |
Ford K.,Florida Institute for Human and Machine Cognition |
And 2 more authors.
IEEE Communications Magazine | Year: 2012
Unmanned aerial vehicles are becoming prevalent in tactical networks as they are proving to be an extremely flexible platform for a variety of applications. Increasingly, UAVs need to cooperate with each other in order to perform complex tasks such as target monitoring and prosecution, information gathering and processing, and delivery between disconnected portions of the network. However, UAV cooperation in tactical scenarios represents a major challenge from both the coordination and communication perspectives. In fact, cooperating UAVs must achieve a high degree of coordination in order to accomplish complex tasks in a dynamic and uncertain environment. In turn, as UAVs interact with other entities, the effective coordination of multiple-UAV operations requires specific support in terms of efficient communication protocols and mechanisms exploiting UAVs as mobile assets that facilitate and hasten critical information flows. This article presents a series of considerations and lessons learned that we have collected in our experience with multiple-UAV coordination and communications in tactical edge networks, and discusses some of the main components of a middleware we specifically designed to support multiple-UAV operations. © 1979-2012 IEEE.
Bias R.G.,University of Texas at Austin |
Hoffman R.,Florida Institute for Human and Machine Cognition
IEEE Intelligent Systems | Year: 2013
If usability is to be a valuable, empirical methodology, then where's the science in usability analysis? © 2013 IEEE.
Hoffman R.R.,Florida Institute for Human and Machine Cognition |
McCloskey M.J.,361 Interactive, LLC
IEEE Intelligent Systems | Year: 2013
The conceptual distinction between requirements and desirements was introduced in a previous installment; here, the authors expand on the concept and specify a methodology for eliciting desirements in support of the development of intelligent systems. © 2001-2011 IEEE.
Neuhaus P.D.,Florida Institute for Human and Machine Cognition |
Pratt J.E.,Florida Institute for Human and Machine Cognition |
Johnson M.J.,Florida Institute for Human and Machine Cognition
International Journal of Robotics Research | Year: 2011
We discuss the main issues and challenges with quadrupedal locomotion over rough terrain in the context of the Defense Advanced Research Projects Agency's Learning Locomotion program. We present our controller for the LittleDog platform, which allows for continuous transition between a static crawl gait and a dynamic trot gait depending on the roughness of the terrain. We provide detailed descriptions for some of our key algorithm components, such as a fast footstep planner for rough terrain, a body pose finder for a given support polygon, and a new type of parameterized gait. We present the results of our algorithm, which proved successful in the program, crossing all 10 terrain boards on the final test at an average speed of 11.2 cm/s. We conclude with a discussion on the applicability of this work for platforms other than LittleDog and in environments other than the Learning Locomotion designed tests. © 2011 The Author(s).
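One component the abstract highlights is a continuous transition between a static crawl gait and a dynamic trot gait depending on terrain roughness. The sketch below illustrates gait switching with hysteresis; the roughness metric, thresholds, and discrete (rather than continuous) switch are assumptions for illustration, not the authors' actual LittleDog controller.

```python
# Toy gait selector: trot on smooth terrain, crawl on rough terrain.
# Roughness metric and thresholds are illustrative assumptions.

def roughness(heights_m: list) -> float:
    """Peak-to-peak terrain height variation under the robot, in meters."""
    return max(heights_m) - min(heights_m)

class GaitSelector:
    def __init__(self, to_crawl: float = 0.05, to_trot: float = 0.03):
        # to_trot < to_crawl creates a hysteresis band that prevents
        # rapid gait chatter near a single threshold.
        self.to_crawl = to_crawl
        self.to_trot = to_trot
        self.gait = "trot"

    def update(self, heights_m: list) -> str:
        r = roughness(heights_m)
        if self.gait == "trot" and r > self.to_crawl:
            self.gait = "crawl"
        elif self.gait == "crawl" and r < self.to_trot:
            self.gait = "trot"
        return self.gait
```

The paper's controller blends gaits continuously rather than switching discretely; the hysteresis idea here only illustrates why a roughness-dependent transition needs some smoothing.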
Clancey W.J.,Florida Institute for Human and Machine Cognition
Cognitive Processing | Year: 2015
People conceive their everyday affairs (their practices) as social actors in activities, in which they perceive, infer, move, manipulate objects, and communicate in some physical setting (e.g., going to the grocery to buy dinner). These behaviors are conceptually choreographed in an ongoing, usually tacit understanding of “what I’m doing now,” encapsulating roles (“who I’m being now”), norms (“what I should be doing”; “how I should be dressed/talking/sitting”), and progress appraisals (“how well I’m doing”). Activity motives and modalities vary widely (e.g., waiting in line, listening to music, sleeping), all of which require time and occur in particular settings. Brahms is a multi-agent work systems design tool for modeling and simulating activities, used extensively to design aerospace work systems. For example, the Generalized Überlingen Model (Brahms-GÜM) simulates air transportation practices, focusing on how pilots and air traffic controllers interact with automated systems in safety-critical, time-pressured encounters. Spatial cognition is pervasive: scanning displays of multiple workstations; coordinating airspaces and flight paths; and prioritizing and timing interventions to maintain aircraft separations. Brahms-GÜM demonstrates how events may become unpredictable when aspects of the work system are missing or malfunctioning, making a routinely complicated system into one that is cognitively complex and becomes out of control. Normally asynchronous processes become coupled in space and time, leading to difficulty comprehending the situation (“what is happening now”) as a familiar multi-modal flow of events. Such examples illustrate the dynamics of spatial cognition inherent in our conceptually situated experience—our consciousness—of who we are and what we are doing. © 2015, Marta Olivetti Belardinelli and Springer-Verlag Berlin Heidelberg (outside the USA).
Florida Institute For Human And Machine Cognition | Date: 2015-12-04
A wearable robotic device configured to provide exercise for a human user. A primary use of the device is to address muscle and bone density loss for astronauts spending extended periods in microgravity. In one configuration the device applies a compressive force between a user's feet and torso. This force acts very generally like gravity, forcing the user to exert a reactive force. The compressive force is precisely controlled using a processor running software, so that a virtually endless variety of force applications are possible. For example, the wearable device can be configured to apply a gravity-simulating force throughout the device's range of motion. The robotic device may also be configurable for non-wearable uses. In these cases the robotic device may act as an exercise machine. The programmable nature of the force application allows the device to simulate weight-training devices and other useful exercise devices. The device's functions may be implemented in a microgravity environment or a normal terrestrial environment.
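The abstract says the compressive force is "precisely controlled using a processor running software." One conventional way to realize such control is a proportional-integral loop that drives the measured force toward a gravity-simulating setpoint. The sketch below illustrates that approach; the gains, setpoint, and interface are assumptions for illustration, not the patented design.

```python
# Illustrative proportional-integral (PI) force control loop.
# Gains and the 70 kg example user are assumed values.

class ForceController:
    def __init__(self, target_n: float, kp: float = 0.8, ki: float = 0.2):
        self.target_n = target_n  # desired compressive force, newtons
        self.kp = kp
        self.ki = ki
        self.integral = 0.0

    def command(self, measured_n: float, dt_s: float) -> float:
        """Actuator correction (N) driving measured force to the setpoint."""
        error = self.target_n - measured_n
        self.integral += error * dt_s
        return self.kp * error + self.ki * self.integral

# Example setpoint: simulate Earth-weight loading for a 70 kg user.
ctrl = ForceController(target_n=70.0 * 9.81)
```

Reprogramming `target_n` over the range of motion is what would let a device like this mimic different exercise machines, as the abstract describes.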
News Article | January 15, 2016
The robot takeover is nigh—but first, let one of your future masters clean your home. While last year’s DARPA Robotics Challenge sought a robot most capable of performing missions in disaster areas, the rest of us have been hoping for a mechanized friend better suited to more...immediate needs, such as doing our chores. Fortunately, the Florida Institute for Human and Machine Cognition, or IHMC, is taking us one step closer to the robot butler of our dreams with its multi-million-dollar Atlas. Since its second-place win at the DARPA Robotics Challenge, Team IHMC hasn’t slowed down its innovations, paying particular attention to Atlas. Case in point: The humanoid robot can now sweep floors. Atlas can also pick up boxes, fold up ladders, and move chairs—all useful résumé skills for when your office hires its next assistant. And in what we’re sure is an attempt to showcase its humanity (and thereby lull us into a false sense of security before stealing our identities), Atlas appears to have the capacity for the very human traits of laziness and inefficiency, as evidenced by its throwing a paper airplane. Next up on the docket: teaching Atlas how to scroll Facebook and access your bank account.