Citec is a Finnish corporation providing multi-discipline engineering and information management services for the power, civil, rail vehicle, process, ICT and health care industries. Citec also offers a wide range of services and solutions related to information lifecycle management. Founded in 1984, Citec long consisted of two companies, Citec Engineering and Citec Information, which merged in 2012 to form Citec Group. As of 2012, Citec had about 1,100 employees and a turnover of approximately 66 million euros. The company is headquartered in Vaasa, Finland, and has offices in Finland, Sweden, India, France, Russia, Germany and the UK. In 2011, Sentica Partners acquired a majority stake in Citec.
News Article | June 8, 2017
The system was developed as part of the large-scale research project Famula at Bielefeld University's Cluster of Excellence Cognitive Interaction Technology (CITEC). The knowledge gained from this project could contribute to future service robots, for instance, that are able to independently adapt to working in new households. CITEC has invested approximately one million euros in Famula. In a new "research_tv" report from Bielefeld University, the coordinators of the Famula project present the new system. "Our system learns by trying out and exploring on its own - just as babies approach new objects," says neuroinformatics Professor Dr. Helge Ritter, who heads the Famula project together with sports scientist and cognitive psychologist Professor Dr. Thomas Schack and robotics researcher Privatdozent Dr. Sven Wachsmuth. The CITEC researchers are working on a robot with two hands that are based on human hands in terms of both shape and mobility. The robot brain for these hands has to learn how everyday objects like pieces of fruit, dishes, or stuffed animals can be distinguished on the basis of their color or shape, as well as what matters when attempting to grasp an object: a banana can be held, and a button can be pressed. "The system learns to recognize such possibilities as characteristics, and constructs a model for interacting and re-identifying the object," explains Ritter. To accomplish this, the interdisciplinary project combines work in artificial intelligence with research from other disciplines. Thomas Schack's research group, for instance, investigated which characteristics study participants perceived to be significant in grasping actions. In one study, test subjects had to compare the similarity of more than 100 objects. "It was surprising that weight hardly plays a role. We humans rely mostly on shape and size when we differentiate objects," says Thomas Schack.
In another study, test subjects' eyes were covered and they had to handle cubes that differed in weight, shape, and size. Infrared cameras recorded their hand movements. "Through this, we find out how people touch an object, and which strategies they prefer to use to identify its characteristics," explains Dirk Koester, who is a member of Schack's research group. "Of course, we also find out which mistakes people make when blindly handling objects." Dr. Robert Haschke, a colleague of Helge Ritter, stands in front of a large metal cage containing the two robot arms and a table with various test objects. In his role as a human learning mentor, Dr. Haschke helps the system to acquire familiarity with novel objects, telling the robot hands which object on the table they should inspect next. To do this, Haschke points to individual objects or gives spoken hints, such as the direction in which an interesting object can be found (e.g. "behind, at left"). Two monitors display how the system, using color cameras and depth sensors, perceives its surroundings and reacts to instructions from humans. "In order to understand which objects they should work with, the robot hands have to be able to interpret not only spoken language, but also gestures," explains Sven Wachsmuth, of CITEC's Central Labs. "And they also have to be able to put themselves in the position of a human to ask themselves whether they have correctly understood." Wachsmuth and his team are not only responsible for the system's language capabilities: they have also given the system a face. From one of the monitors, Flobi follows the movements of the hands and reacts to the researchers' instructions. Flobi is a stylized robot head that complements the robot's language and actions with facial expressions; the Famula system currently uses a virtual version of Flobi.
With the Famula project, CITEC researchers are conducting basic research that can benefit self-learning robots of the future in both the home and industry. "We want to literally understand how we learn to 'grasp' our environment with our hands. The robot makes it possible for us to test our findings in reality and to rigorously expose the gaps in our understanding. In doing so, we are contributing to the future use of complex, multi-fingered robot hands, which today are still too costly or complex to be used, for instance, in industry," explains Ritter. The project name Famula stands for "Deep Familiarization and Learning Grounded in Cooperative Manual Action and Language: From Analysis to Implementation." The project has been running since 2014 and is currently funded through October 2017. Eight research groups from the Cluster of Excellence CITEC are working together on the project. Famula is one of four large-scale projects at CITEC; the others include a robotic service apartment, the walking robot Hector, and the virtual coaching space ICSpace. As part of the Excellence Initiative of the Deutsche Forschungsgemeinschaft (German Research Foundation, DFG), CITEC is funded by state and federal governments (EXC 277).
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.1.3 | Award Amount: 2.83M | Year: 2008
SPIKE will develop a software platform for the easy and fast setup of business alliances. The project targets two main organisational objectives: first, outsourcing parts of the value chain to business partners (and vice versa, offering such parts in the form of services); second, enabling collaboration between members of participating organisations through ad-hoc created as well as predefined business processes. SPIKE will enable collaboration and cooperation between networked enterprises. The solution will encompass a semantically enriched service-oriented infrastructure, including a virtual semantic service bus for workflow control and for the handling and transformation of messages. At the enterprise interface level, we follow a collaborative process portal approach, capturing the user's working context and seamlessly transmitting it to other applications and services according to the current workflow. This will also enable integration of legacy systems via tailored portlets and connectors. Special focus will be put on the security issues involved; the solution will include an easy-to-administer security infrastructure for the networked enterprise which will provide security services for service and workflow management.

The user partners will demonstrate the potential of SPIKE through pilot deployments and use cases, i.e. a collaborative business alliance and two services ready for use in the networked enterprise. Because of its focus, the project will have an impact on organisations of all sizes that want to collaborate with each other. The base SPIKE components will be developed as an open source solution, with a special emphasis on easy adoption and cost feasibility. Where possible, we will build upon and enhance existing open source software. In this way, SPIKE will have a special impact on SMEs, enabling them to offer their services to potential new customers in a cost-saving and timely manner.
News Article | December 21, 2016
Privatdozent Dr. Dirk Koester and his colleagues reported their findings in the research journal PLOS ONE. According to the researchers, the method could offer an approach for new therapies, such as treating stroke patients. "Latest theories in cognitive science research hypothesize that our memory also records physical sensations as part of the words stored," says Dirk Koester, who works in the CITEC research group "Neurocognition and Action - Biomechanics" led by Prof. Dr. Thomas Schack. "Similar to an entry in a reference book, the brain records a word like 'whisk', associating it with concepts such as 'inanimate' and 'kitchen device.' In addition to this, the brain connects the word to one's own experience - how a whisk feels, for instance, and that a spinning motion is related to it." In their new study conducted with 28 participants, Koester and his colleagues lend support to the thesis of the embodiment of knowledge. Koester explains the central finding of their study: "When the study participants had to grasp an object while reading, their brain processed parts of the meaning of the words earlier than in previous studies in which words were evaluated without something being gripped." The participants sat in front of a computer screen; on the tabletop before it, three cubes lay next to each other: one about the size of an apple, one the size of a table tennis ball, and one the size of a die. On the screen behind the cubes, three white fields were displayed. Words then appeared in one of the fields on the screen - sometimes made-up words, sometimes real ones. When a pseudo-word such as "whask" was displayed, the participants did not have to do anything. But if a real noun like "orange" appeared, they were supposed to grip the cube corresponding to that respective field. An EEG electrode cap recorded brain activity, allowing the researchers to then evaluate how the word was processed.
As demonstrated in previous studies, it takes the brain a third of a second to process a word. "In our study, however, we were able to show that comprehension can already begin much earlier, after just a tenth of a second - if a grasping action is required," explains Koester. This study not only provides evidence that the brain has a common control center for language and movement, but "it also shows that our brain's processing steps shift very quickly and adjust to current tasks - in this case, the task of grasping something while reading." Koester believes that the findings from this study could also be used in the future for various therapies, such as treatments for aphasia, a language disorder that can occur after a stroke, in which one's ability to comprehend or formulate words is impaired or lost. "As in our experiment, patients could practice words they cannot access by indicating not only verbally but also with grip movements to show they recognize a word - in short, motor training," explains Koester. "As such, one's knowledge of words would be strengthened through the 'back door' of motor control." Dirk Koester, Thomas Schack: Early neurophysiological interaction of conceptual and motor representations. PLOS ONE, published on 14 December 2016.
News Article | December 23, 2016
Chess is one of the oldest - and most popular - board games. On Christmas Eve, the classic game is given as a gift several hundred thousand times over, whether as a chess set, computer game, or chess computer. Yet what is the secret of successful chess players? Cognitive scientists at the Cluster of Excellence Cognitive Interaction Technology (CITEC) at Bielefeld University have been investigating this question for the past year in the project "Ceege" by recording players' eye movements and facial expressions. Now, the researchers are revealing their preliminary results and explaining why Norwegian grandmaster Magnus Carlsen again earned the title of world chess champion at this year's tournament. "There are numerous theories on how the brain controls attention and solves problems in both everyday situations and game situations," says Professor Dr. Thomas Schack. The sports scientist and cognitive psychologist heads the CITEC research group "Neurocognition and Action - Biomechanics" as well as the chess research project. "The game of chess is an ideal object of research for testing these theories because chess players have to be extremely attentive and make decisions in quick succession as to how they will proceed." Schack's research group is working on "Ceege" together with Inria Grenoble Rhône-Alpes, a research institute in France. The project name stands for "Chess Expertise from Eye Gaze and Emotion." "We are investigating individual game tactics, chess players' behavior towards one another, and their body language," says Dr. Kai Essig, who together with Thomas Küchelmann is working on the project. "With the findings from this project, we will be able to predict in the future how strong an individual chess player is, and how high the chances are that a player wins a match. It appears that we will even be able to recognize a series of optimal moves that will increase the player's probability of winning."
In order to gather as much information as possible on the players and their activity, the Bielefeld researchers use various techniques. Eye-tracking glasses make it possible to measure players' gaze positions, while video cameras record their facial expressions and body language. Professor Dr. James Crowley and his team from Inria are focusing on chess players' emotions, capturing for instance microexpressions - facial expressions that are only recognizable for a few milliseconds - as well as gestures, heart and respiratory rate, and perspiration. More than 120 participants have so far played chess under observation in the study and pilot study. Of these, a third were chess experts, and the other two-thirds novices. "The current study and the pilot study already show that chess experts show significant differences in their eye movements," says Kai Essig. "Chess experts concentrate for most of the time on the main chess pieces that can make or break the game in the respective situation. The experts control their attention more efficiently than novices." According to Essig, novices jump very frequently from one piece to the next with their gaze, and look at nearly all the pieces on the board, regardless of whether they play an important role in the particular game situation. With the knowledge gleaned from their project, the researchers closely followed the chess world championship in November. "Early in the tournament, it was already apparent that Magnus Carlsen would win. He had shown more initiative in the first six matches. It was hardly possible for his opponent Sergej Karjakin to dominate the game," says physicist Thomas Küchelmann. When observing from a distance, though, only limited conclusions can be drawn. As Küchelmann explains: "In order to make concrete predictions, we would have actually had to measure Carlsen's and Karjakin's game with our test equipment.
It would have been interesting to measure, for instance, Carlsen's emotional reaction to his missed endgame opportunities and his mistake in the eighth match, which he lost, along with Karjakin's emotional reactions to running out of time in the tiebreak." With their findings, the researchers want to develop an electronic chess assistant that would analyze the weaknesses of chess novices and experts, using eye tracking for instance, and train players by providing tips and explanations. The assistant would recommend which move is optimal in the particular situation. "Looking forward, it would also be conceivable to integrate this assistive system into a robot. With their physical presence, robots could motivate players in a different way than, for example, an assistant operating verbally on a tablet," explains Thomas Schack. The "Ceege" research project will run for three years, through February 2019. The Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) and the French research funding body Agence Nationale de la Recherche (ANR) are providing funding for the project. Bielefeld University has received 300,000 euros for the research. More information on the "Neurocognition and Action - Biomechanics" research group is available online at: http://bit.
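The expert-novice gaze difference described above can be made concrete with a toy computation. The following is a hypothetical sketch, not code from the Ceege project: it assumes the eye-tracking output has already been reduced to (fixation duration, piece) records, and the piece names and durations are made up for illustration. It measures what fraction of a player's total fixation time falls on the pieces that matter in the current position.

```python
def attention_share(fixations, key_pieces):
    """Fraction of total fixation time spent on the given key pieces.

    fixations: list of (duration_ms, piece_id) tuples from eye tracking.
    key_pieces: set of piece_ids considered decisive in this position.
    """
    total = sum(d for d, _ in fixations)
    on_key = sum(d for d, p in fixations if p in key_pieces)
    return on_key / total if total else 0.0

# Made-up gaze logs: the expert dwells on the decisive pieces,
# the novice scans nearly everything on the board.
expert = [(420, "white_queen"), (380, "black_rook"), (95, "white_pawn_a2")]
novice = [(120, "white_queen"), (110, "white_pawn_a2"), (130, "black_pawn_h7"),
          (90, "white_knight"), (105, "black_rook"), (100, "white_pawn_b2")]
key = {"white_queen", "black_rook"}

print(attention_share(expert, key))  # high share: gaze concentrated on key pieces
print(attention_share(novice, key))  # lower share: gaze spread across the board
```

A real analysis would of course derive the "key pieces" from the game state rather than hard-coding them, but the per-player share of fixation time is the kind of summary statistic from which expertise differences can be read off.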
News Article | September 6, 2016
Virtual reality (VR) does not just help companies save money when it comes to testing new technologies; it also enables trainees and students to quickly and intensively learn on-the-job skills (e.g. in a virtual hospital). From 8 to 9 September 2016, experts will present the latest developments and studies on augmented and virtual reality at the Cluster of Excellence Cognitive Interaction Technology (CITEC) of Bielefeld University. The conference also includes a competition in which the researchers will demonstrate how virtual and augmented reality will enhance the workplace of the future. It brings together specialists and professionals from both academia and the business world. In the foyer of the CITEC Building, visitors will be able to experience the research prototypes firsthand and learn about virtual reality systems from the exhibitors. One example of this is a product from the software company Virtalis, which allows users to interactively discuss and modify 3-D constructions (such as those used for machines) in real time. Participation in the conference is free, but visitors must register in advance on the conference website. "With virtual reality, we can generate spaces and situations that help people learn to perform occupational duties," says computer scientist Dr. Thies Pfeiffer, who together with colleagues is organizing and leading the conference at CITEC.
"In virtual hospital rooms, for instance, aspiring healthcare professionals can train more quickly and cost-effectively in routine tasks like inserting an intravenous line," explains Dr. Pfeiffer. "In many professions, certain tasks are performed over and over again to ensure that everything runs smoothly in emergencies or stressful situations. Training in 'reality' would require considerable resources, and students need a lot of space. In virtual reality, however, students and trainees can practice at their own pace until the processes become second nature." All they need to do this is a pair of virtual reality glasses with suitable software. This allows situations to be generated in which the users are more immersed than if they were only to practice in front of a computer monitor. At the conference, CITEC research demonstrators will be on display, such as the virtual training space "ICSpace" and the smart data glasses from the Adamaas project. In ICSpace, a virtual coach instructs the user in how to correctly perform sports exercises - and immediately points out mistakes. The data glasses developed for the Adamaas project help people, particularly those with cognitive disabilities, to successfully complete the tasks of daily living. The glasses provide assistance, for example, in performing maintenance work on a coffee machine. In addition to the prototypes, the researchers will also present studies on virtual and augmented reality, including the concept for a reality simulator, which serves to test both hardware and software in virtual reality. The simulator is suitable for programmers, for instance, who develop apps for smartphones. It can simulate a smartphone or a pair of data glasses, which the programmer can then use to test whether their app really does what it is designed to do.
This type of simulation is also helpful for applications that involve larger buildings or entire factory settings: the programmers can remain seated at their computers and only need to take a look through the virtual reality glasses, rather than traveling to the actual production facilities for which they are developing the software. A highlight of the conference held in the CITEC Building will be the competition called "Virtual and Augmented Reality for the Workplace of the Future." For this competition, the research teams are each supplying a research prototype, which will be judged by an expert jury. The company Ceyoniq Technology GmbH has donated a pair of virtual reality video glasses, an HTC Vive, as the prize for the competition. Ceyoniq develops software for document management and is one of the main sponsors of the conference, along with Virtalis and Raumtänzer. In addition to this, outstanding research articles will be recognized with a "Best Paper Award," which will be chosen by both the organizing committee and the steering committee of the specialist group on virtual and augmented reality (Fachgruppe VR/AR) of the German Informatics Society (Gesellschaft für Informatik), which is organizing the conference. The winning articles will be published in a special edition of the "Journal of Virtual Reality and Broadcasting." For the past 13 years, interested parties from the academic and business world have met once a year for the "Virtual and Augmented Reality" conference, which was launched by the specialist group VR/AR. A detailed program overview and a list of exhibitors will be provided at the end of August. Event organizers expect approximately 100 participants to attend.
Rohlfing K., CITEC | Deak G.O., University of California at San Diego | IEEE Transactions on Autonomous Mental Development | Year: 2013
Social learning takes place within an interactional loop. The contributions of this Special Issue exemplify approaches capturing the microdynamics of interaction to provide us with insights into the adaptation and learning processes. © 2013 IEEE.
Citec | Date: 2010-05-05
The invention relates to a method and an arrangement for separating particulate material and/or water from heavy oil. An input flow of heavy oil comprising particulate material and/or water is introduced into a first centrifugal separator, where the heavy oil is separated from water and particles. Heavy oil is led out of the first centrifugal separator (6) as a primary output flow (10), and water (13) and/or particulate material (11) is led out of the first centrifugal separator (6) as a secondary output flow(s). According to the invention, the heavy oil comprising particulate material and/or water is introduced into the first centrifugal separator at a density that is higher than the density of the water at that temperature.
Frontiers in Heat and Mass Transfer | Year: 2016
Urea-water solution droplet evaporation is modelled using a multi-component droplet evaporation approach. The heat and mass transfer process of a multi-component droplet is implemented in the Lagrangian framework through a custom code in ANSYS-Fluent R15. The evaporation process is defined by a convection-diffusion controlled model which includes the effect of Stefan flow. A rapid mixing model assumption is used for the droplet internal physics. The code is tested on a single multi-component droplet, and the predicted evaporation rates at different ambient temperatures are compared with experimental data from the literature. The approach is used to model the injection of a urea-water solution spray into a duct carrying hot air to predict the urea-to-ammonia conversion efficiency. The thermolysis reaction of the evaporated urea and the hydrolysis of the byproduct isocyanic acid are solved as volumetric reactions in the Eulerian framework using a laminar finite-rate approach. The spray simulation results are compared with the experimental data and the numerical results of the surface reaction based direct thermolysis approach available in the literature. © 2016, Global Digital Central. All rights reserved.
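A convection-diffusion controlled evaporation rate with Stefan flow, as named in the abstract, can be sketched for a single droplet with the classical film model. This is not the authors' ANSYS-Fluent code: it is a generic quasi-steady estimate using the Spalding mass-transfer number and the Ranz-Marshall correlation, and all property values in the example are illustrative assumptions rather than conditions from the paper.

```python
import math

def evaporation_rate(d, rho_g, D_v, Y_s, Y_inf, Re, Sc):
    """Quasi-steady evaporation rate of a single droplet, in kg/s.

    Film model with Stefan flow:
        mdot = pi * rho_g * D_v * d * Sh * ln(1 + B_M)
    where B_M is the Spalding mass-transfer number built from the vapor
    mass fractions at the droplet surface (Y_s) and far away (Y_inf),
    and Sh is the Sherwood number from the Ranz-Marshall correlation.
    """
    B_M = (Y_s - Y_inf) / (1.0 - Y_s)            # Spalding mass-transfer number
    Sh = 2.0 + 0.6 * math.sqrt(Re) * Sc ** (1/3)  # Ranz-Marshall correlation
    return math.pi * rho_g * D_v * d * Sh * math.log(1.0 + B_M)

# Illustrative conditions (assumed, not from the paper):
# a 50-micron droplet in hot air with modest slip velocity.
mdot = evaporation_rate(d=50e-6, rho_g=0.7, D_v=3e-5,
                        Y_s=0.12, Y_inf=0.0, Re=5.0, Sc=0.7)
print(f"evaporation rate: {mdot:.3e} kg/s")
```

The log(1 + B_M) factor is what distinguishes a Stefan-flow model from a purely diffusive one: when the surface and far-field mass fractions coincide, B_M is zero and evaporation stops, while strong surface blowing damps the rate below the linear estimate.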
News Article | May 27, 2015
The Palaszczuk government has continued its push to de-Newmanise Queensland by scrapping plans to outsource government IT services. There were fears of job cuts when the former Liberal National Party government, led by Campbell Newman, revealed it would divest its technology services provider CITEC as part of a cost-cutting drive. But Innovation Minister Leeanne Enoch announced on Wednesday that CITEC would remain a Queensland government-owned information and communication technology provider. "The decision to keep CITEC in public ownership provides certainty for its 344 full-time equivalent employees, as well as security for its government and commercial customers," she said. Enoch said she would be focused on working with CITEC management, staff, government and commercial clients to build a future-proof business model. "While there will be no job losses, we will need a strategy to ensure that staff continue to grow their skills in order to meet changing technology and customer needs so that they can deliver ICT services now and into the future," she said. Enoch said significant transformations had already begun, including data storage and protection upgrades that would reduce government costs by more than $12 million over three years. The decision to outsource the Queensland government's IT services was taken by then-IT minister Ian Walker in 2013, after an audit conducted by former federal treasurer Peter Costello found that the cost to maintain and fix the government's IT systems would amount to AU$4.7 billion. At the time, 5,000 government IT jobs were set to be outsourced. Last week, Anna Bligh acknowledged that when the state government partnered with IBM to roll out its new health payroll system in 2010, it bought the wrong one. "The single biggest failure of the project was failure around managing the program and governance of it," Bligh said. "There was no real clarity of governance.
There was one part of the government that was responsible for whole-of-government IT in a shared service provider model, and then we had the line agency Queensland Health." "Between those two agencies there was not a single point of accountability. So everybody was in charge, which ultimately meant nobody was," she said. "We basically got the product we bought, but we bought the wrong one, or we bought one that was not fit for purpose." The reversal of the decision to outsource CITEC is yet another legacy of the Newman government that the current administration has either scrapped or plans to scrap. The government has already reversed the Liberal National Party's changes to electoral donation laws, scrapped its cross-river bus and train tunnel and ended its surgery wait-time guarantee, among other things.