TUDelft

Delft, Netherlands

Air travel could be better, and technology could deliver that, but only if the customer is at the heart of the brand.

Shortly after the United incident, the New York Times published an article arguing that technology has not made air travel better. Instead, the writer suggested, technology has only improved airline margins and efficiencies while providing no tangible value to consumers. Though the premise is arguable, the story shows that airline reputations have taken a hit, especially in the US. The industry's operating practices and CEOs face Congressional scrutiny, and viral videos of poor passenger treatment have now affected all three major US carriers.

The Harris Poll released a report this month showing that United Airlines suffered a 500 percent increase in negative consumer sentiment in a survey this April. Forty-two percent of U.S. consumers described the airline as "bad" or "very bad," compared to seven percent of consumers in 2016.

It's time to take notice. United has. It published a 10-point program of improvement. But delivering will require technology. United has promised to reduce overbooking and to create automated alerts to recruit volunteers when seats are needed. There is no irony in this. When you serve millions of customers a day, you need technology to keep up.

But should technology only benefit airline operations, or can it repair brand damage and build better relationships with consumers? Can it build loyalty? Can technology, in fact, improve air travel?

As Chris Nurko, global chairman of Futurebrand, explains, the way we think of technology drives how we apply it. Nurko believes technology needs to be framed properly, focused on creating affordable, simpler and more efficient real-time products, services and experiences that appeal to consumer needs. It's the old argument of sales versus marketing: they should work together, but they don't think alike.

Why aren't technology and innovation customer-centric? How can all travel brands, not just airlines, change this focus? The difference is best encapsulated by the iconic image from the Crispin & Porter advertising agency in Florida. It is not enough to know the product a customer needs if you don't understand why the customer needs it.

Nurko believes it's critical to start with what he describes as "the voice and needs of the customer or passenger." Successful companies, he tells us, start with the understanding that the brand experience, and the related products and services, must be relevant to the consumer and meet a specific need. Technology, deployed with a customer-centric design, can offer choice, flexibility, and rewards for return custom.

Nurko believes this gap in understanding the consumer explains the rise of the great disruptors. They offer consumers direct access to competitive products and services, democratizing the path to purchase and beyond and cutting out the middleman. Services like TripAdvisor, Cruise Critic, Yelp! and similar technology platforms, Nurko says, have helped fuel marketing awareness and build preference.

Jeroen van Erp, founding partner at Fabrique and professor of concept design at TUDelft, which works with KLM on the hyper-personalized application of technology, suggests that companies have placed blind faith in algorithms without really understanding their function, the data they process, or, more importantly, what that data reveals about the customer.
What is more, the data collected may not be specific or relevant enough to offer customers value, because it can't answer the "why". As an example of this disconnect, van Erp shares an anecdote about a study one of his graduate students conducted on Nike products. Basic purchasing data revealed nothing meaningful, but by cross-checking for motivation and formulating the right questions, they learned that many of Nike's buyers don't intend to run. They may not even like to run. They just like to look good in Nike shoes. They like Nike as a fashion brand, and associate it positively as a real sports brand, even if those consumers don't themselves do sports.

This aspirational and identity brand association is important to travel brands, including airlines. Looking at passenger purchases alone, for example, might identify someone who most often flies one airline in economy class as a budget flyer. The airline may follow up with fare promotions and miss the detail that the passenger flies with them while complying with strict corporate travel budgets, but is affluent enough to pay out-of-pocket for a premium ticket on a partner or competitor that offers a superior product or serves a choice holiday destination. Aspirational and identity consumer behaviors, and the hyper-personalized brand positioning required to satisfy them, require thinking beyond the numbers.

Applying technology to more intelligent and focused marketing, with hyper-personalized consumer engagement, isn't a touchy-feely notion. It is a brand necessity in a competitive market, van Erp tells us.

"It should really be the preferred brand in our heads. They can't afford to lose market share. So, with KLM for example, it's very important to ensure that the collective satisfaction for travelers is really high. Also, because KLM is not a low-cost carrier, the added value should be very evident.

"Technology can only deliver that if you can really tap into individual concerns and personalities.

"For example, with KLM one of the ideas the graduates came up with is that people have different personality traits. Some people always want to know what to expect, but there are also people who want to be surprised.

"You feel that someone knows who you are, starts a dialogue and can really delight you. Then a flight is better than just a flight."

Who, where, when?

Seeing beyond the immediate to what might surprise and delight requires brands to make organizational changes, van Erp believes.

Rob Sinclair-Barnes, strategic marketing director at Amadeus IT Group, believes that technology, as a tool, can do a lot if given a chance. He says it's a cycle: focus on the relationship by acting on what you've learned, and you learn more, so that you can focus better on building and growing the relationship with improved services and better products.

Sinclair-Barnes believes one hurdle for travel brands, especially airlines, is a disconnect built into the system by the various sales channels. During disruptions, for example, there may be inadequate information available about the passenger for the airline to respond. He believes there is a lot of work ahead for travel brands to collaborate on "data hygiene", ensuring that everyone has shared insights on the customer at a basic level. He also believes travel brands should focus technology around the consumer's "hierarchy of needs": the 'What', 'Where' and 'When', but also the 'Why' and 'How'.
Getting full answers, though, requires permission from the consumer, and brands need to build consumer confidence to earn it. If we're giving the impression that consumer satisfaction requires more granular consumer surveys, that's not the whole story. Structured and unstructured data, which reveal behavioral patterns and preferences, can perhaps be more valuable than survey answers alone. That's because we, as consumers, often don't know what we want. Deeper consumer understanding allows brands to anticipate a need and fulfill it.

Unstructured data, from social media for example, as well as a fuller view of past behavior with data shared between partners, can reveal consumers' true identity and values. Not passport details, but true identity: knowing that a person would buy Nike shoes just to drive a short distance to the shops because it makes them "feel" sporty.

Trying to understand millions of people on a personal level is impossible without applied technology. The amount of information to be processed is staggering. There's a good reason we call it Big Data. But the industry is making progress.

Sinclair-Barnes says that by testing data gathered on consumers in general, Amadeus can predict that passengers might want to buy lounge access and Fast Track services when they are flying with an airline they don't fly often, or when they are traveling through a non-hub airport. Building trust with consumers to access data like GPS location can enable point-and-click ground transport services that fit their preferences. What is known about customer preferences can be shared with staff to enhance operations and, with trust and permission, with network partners to improve joint services. Amadeus has already tested the appeal of cross-brand services with the Star Alliance airport experience at Heathrow's Terminal 2.

And, yes, technology can improve operations while satisfying customer needs. It's difficult to imagine keeping customers happy or building trust if you can't deliver on the core service offered, regardless of all the consumer insights you may have gathered. Efficient disruption recovery is as important as selling ancillaries, and it can also put a smile on a passenger's face. AI can make brand interactions more meaningful, timely and relevant. Bots won't replace personal service, but they can enhance it. Airlines are only beginning to dip their toes in this area. Sinclair-Barnes believes that, in the end, technology will come through for consumers.

Making technology work for consumers isn't just about throwing money at it, though. As Nurko and van Erp point out, you have to put your heart and mind into it. There has to be a destination in mind. You have to get the kid by the side of the road to mom's for Christmas. Forging stronger brand ties through applied technology is a long trip, but it's important to get started. Now would be good.


Wu J.,University of Chile | Thompson J.,University of Edinburgh | Zhang H.,Zhejiang University | Prasad R.V.,TUDelft | Guo S.,University of Aizu
IEEE Communications Magazine | Year: 2016

Under the framework of the United Nations Framework Convention on Climate Change (UNFCCC) and the Conferences of the Parties (COPs), the United Nations Climate Change Conferences have been held yearly since 1995, when COP 1 was held in Berlin, Germany, to evaluate progress in dealing with climate change. COP 20, held in Lima, Peru in December 2014, reached an agreement urging all countries to submit their greenhouse gas (GHG) emission reduction targets by 31 March 2015. These submissions are called Intended Nationally Determined Contributions (INDCs). When the deadline of 31 March 2015 passed, only 35 of the 193 countries had published their INDCs. After solid and united global efforts, COP 21 was held in Paris, France from 30 November to 12 December 2015, when, in a historic breakthrough and a milestone toward securing the future of the Earth, a global agreement on limiting climate change was agreed upon by representatives of the more than 193 countries in attendance. According to the COP 21 Organizing Committee, the agreement is to limit global warming to well below 2°C compared to pre-industrial levels. By 12 December 2015, 160 INDCs had been submitted, and on 4 February 2016 Nepal confirmed the 161st INDC; together these represented 188 countries. For the agreement to become legally binding, at least 55 countries, jointly representing at least 55 percent of global greenhouse gas emissions, must sign it in New York between 22 April 2016 (Earth Day) and 21 April 2017 and adopt it within their own legal systems. Readers may find detailed information in the sixth United Nations Environment Programme (UNEP) Emissions Gap Report, published in 2015 [1]. © 1979-2012 IEEE.


Waly T.,UNESCO IHE | Waly T.,Dow Chemical Company | Kennedy M.D.,UNESCO IHE | Witkamp G.-J.,TUDelft | And 3 more authors.
Desalination | Year: 2012

In supersaturated solutions the period preceding the start of 'measurable' crystallization is normally referred to as the 'induction time'. This research project aimed to investigate the induction times of CaCO₃ in the presence of Mg²⁺ and SO₄²⁻. The prepared synthetic solutions have the same ionic strength values found in the Gulf of Oman SWRO concentrates at 30% and 50% recovery. The results showed a significant increase in the induction time, by 1140%, 2820%, and 3880% for a recovery of 50%, when adding SO₄²⁻ only, Mg²⁺ only, or both Mg²⁺ and SO₄²⁻, respectively, to synthetic SWRO concentrate, compared to that obtained in the absence of Mg²⁺ and SO₄²⁻ at an initial pH of 8.3. The increase in the induction time in the presence of SO₄²⁻ was most likely due to nucleation and growth inhibition, while the presence of Mg²⁺ affected nucleation and growth through both complexation and inhibition. After a 5-month solution stabilization period, ESEM and XRD analyses showed aragonite in solutions containing Mg²⁺. In contrast, calcite was the final crystal phase formed in solutions with no Mg²⁺. This suggests that magnesium may play an important role in inhibiting the formation of calcite. © 2011 Elsevier B.V.
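A quick way to read the reported percentage figures is to convert them into multiplicative ratios. A minimal sketch in Python, assuming the percentages are increases relative to the baseline induction time measured without Mg²⁺ and SO₄²⁻:

```python
# Convert the reported induction-time increases into ratios relative to the
# baseline without Mg2+ and SO4^2- (illustrative arithmetic only).
reported_increase_pct = {
    "SO4 2- only": 1140,
    "Mg 2+ only": 2820,
    "Mg 2+ and SO4 2-": 3880,
}

for case, pct in reported_increase_pct.items():
    ratio = 1 + pct / 100  # e.g. a 1140% increase means ~12.4x the baseline
    print(f"{case}: induction time ~{ratio:.1f}x the baseline")
```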


Timarchi S.,Shahid Beheshti University | Fazlali M.,Shahid Beheshti University | Cotofana S.D.,TUDelft
Proceedings - IEEE International Conference on Computer Design: VLSI in Computers and Processors | Year: 2010

Given that moduli of the form 2^n ± 1 are the most popular in Residue Number Systems (RNS), a large variety of modulo 2^n ± 1 adder designs have been proposed based on different number representations. However, in most cases these encodings do not allow the implementation of a unified adder for all the moduli of the form 2^n − 1, 2^n, and 2^n + 1. In this paper, we address the modular addition issue by introducing a new encoding, namely the stored-unibit RNS. Moreover, we demonstrate how the proposed representation can be utilized to derive a unified design for the moduli set {2^n − 1, 2^n, 2^n + 1}. Our approach enables a unified design for the moduli set adders, which opens the possibility of designing reliable RNS processors with low hardware redundancy. Moreover, the proposed representation can be utilized in conjunction with any fast state-of-the-art binary adder without requiring any extra hardware for end-around-carry addition. © 2010 IEEE.
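For orientation, the end-around-carry addition mentioned in the last sentence is the conventional way to add modulo 2^n − 1. The sketch below illustrates that baseline arithmetic only; it is not the paper's stored-unibit encoding:

```python
def add_mod_2n_minus_1(a, b, n):
    """Conventional modulo (2^n - 1) addition using an end-around carry.

    Illustrates the baseline operation that the stored-unibit encoding avoids
    in hardware; operands are assumed to lie in [0, 2^n - 1].
    """
    mask = (1 << n) - 1           # 2^n - 1
    s = a + b
    s = (s & mask) + (s >> n)     # fold the carry-out back in (end-around carry)
    return 0 if s == mask else s  # 2^n - 1 is congruent to 0 modulo 2^n - 1


# Example: n = 4, modulus 15; 9 + 11 = 20, and 20 mod 15 = 5
assert add_mod_2n_minus_1(9, 11, 4) == 5
```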


News Article | December 7, 2016
Site: phys.org

That is why NASA's stratospheric balloon STO2 will be launched from Antarctica to the edge of space to measure cosmic far-infrared radiation. At an altitude of 40 kilometers above Antarctica, the air is crystal clear. There is scarcely any water vapor, which often blocks this type of radiation at other locations in the atmosphere. The NASA balloon that will carry the measuring instruments to this altitude will make use of the circumpolar vortex, a stable airflow on which the balloon can circulate for one or more rounds of about 14 days each. This will allow scientists to carry out observations for a period of two weeks before they find the balloon at nearly the same location again.

STO2 has been developed under the leadership of the University of Arizona and contains vital contributions from SRON Netherlands Institute for Space Research (Utrecht and Groningen) and tech university TUDelft: three receivers for 1.4, 1.9 and 4.7 terahertz respectively. Spectra of radiation at these frequencies often disclose the presence of elements in space, including electrically neutral atomic oxygen. Localizing that last element in space, which can be achieved using the 4.7 terahertz receiver, is a long-cherished dream of astronomers. It is the first time a 4.7 terahertz receiver will be brought to the edge of space for an unrestricted view. Together with the Massachusetts Institute of Technology (MIT), the partners developed a reference source for radiation at this frequency.

Electrically neutral atomic oxygen reveals the places in the gas clouds between stars that are particularly warm. This is a good indicator of stars that have only just formed, so in this way we can directly find the birthplaces of new stars. STO2 is therefore an important scouting mission for future terahertz missions using a satellite in space. Far-infrared radiation is sometimes also referred to as terahertz radiation; one terahertz corresponds to a wavelength of 300 micrometers.

The University of Arizona leads the mission scientifically. The teams of prof. dr. Alexander Tielens (Universiteit Leiden) and prof. dr. Floris van der Tak (SRON/Rijksuniversiteit Groningen) will help in the international scientific analysis of the observations.

On Thursday the team in Antarctica gets a three-hour window of good weather. If this is too short, favourable launch weather is expected in the following days.

More information: A livestream of the mission from NASA: www.csbf.nasa.gov/antarctica/ice.htm
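The frequency-to-wavelength relation mentioned above (one terahertz corresponding to roughly 300 micrometers) follows from λ = c/f. A small sketch, with the receiver frequencies taken from the article; the rounded outputs are illustrative, not mission specifications:

```python
# Wavelengths corresponding to the STO2 receiver frequencies, via lambda = c / f.
C = 299_792_458.0  # speed of light in m/s

for f_thz in (1.0, 1.4, 1.9, 4.7):
    wavelength_um = C / (f_thz * 1e12) * 1e6  # metres -> micrometres
    print(f"{f_thz:.1f} THz -> ~{wavelength_um:.0f} micrometers")
# 1.0 THz comes out near 300 micrometers, as quoted in the article.
```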


Ridler M.E.,DHI | Ridler M.E.,Copenhagen University | Van Velzen N.,TUDelft | Hummel S.,Deltares | And 4 more authors.
Environmental Modelling and Software | Year: 2014

Data assimilation optimally merges model forecasts with observations, taking into account both model and observational uncertainty. This paper presents a new data assimilation framework that gives the many Open Model Interface (OpenMI) 2.0 .NET compliant hydrological models already available access to a robust data assimilation library. OpenMI is an open standard that allows models to exchange data during runtime, thus transforming a complex numerical model into a 'plug and play' like component. OpenDA is an open interface standard for a set of tools, filters, and numerical techniques to quickly implement data assimilation. The OpenDA-OpenMI framework is presented and tested on a synthetic case that highlights the potential of this new framework. MIKE SHE, a distributed and integrated hydrological model, is used to assimilate hydraulic head in a catchment in Denmark. The simulated heads over the entire domain were significantly improved by using an ensemble-based Kalman filter. © 2014 Elsevier Ltd.
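As an indication of what an ensemble-based Kalman filter update involves, here is a minimal, generic analysis step in Python. It is a sketch of the general technique, not the actual OpenDA or MIKE SHE implementation, and the variable names are illustrative:

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_var, rng=np.random.default_rng(0)):
    """One stochastic ensemble Kalman filter update (generic sketch).

    ensemble : (n_state, n_members) forecast states, e.g. hydraulic heads
    obs      : (n_obs,) observed values
    H        : (n_obs, n_state) linear observation operator
    obs_var  : scalar observation error variance
    """
    n_obs, n_members = len(obs), ensemble.shape[1]
    Hx = H @ ensemble                                    # predicted observations
    X = ensemble - ensemble.mean(axis=1, keepdims=True)  # state anomalies
    Y = Hx - Hx.mean(axis=1, keepdims=True)              # observation anomalies
    Pxy = X @ Y.T / (n_members - 1)                      # state-obs covariance
    Pyy = Y @ Y.T / (n_members - 1) + obs_var * np.eye(n_obs)
    K = Pxy @ np.linalg.inv(Pyy)                         # Kalman gain
    perturbed_obs = obs[:, None] + np.sqrt(obs_var) * rng.standard_normal((n_obs, n_members))
    return ensemble + K @ (perturbed_obs - Hx)           # analysis ensemble
```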


Verstrynge E.,Catholic University of Leuven | Schueremans L.,Catholic University of Leuven | Van Gemert D.,Catholic University of Leuven | Hendriks M.A.N.,TUDelft
Engineering Structures | Year: 2011

Masonry subjected to high, sustained stress levels can suffer from long-term damage accumulation. This type of stress-induced damage interacts with other long-term phenomena, such as deterioration and fatigue. In this work, the time-dependent damage caused by elevated stress levels is analysed and modelled. A one-dimensional rheological model, calibrated on the results of an extensive experimental test campaign, is extended to a three-dimensional version. The time-dependent constitutive relations are implemented in a finite element code, and the issues of triaxial stresses and mesh dependency are addressed. In a first application, the model is used to simulate the long-term behaviour of a masonry tower. Secondly, the effects of time-dependent stress redistributions on the long-term stability of three-leaf masonry are investigated. © 2010 Elsevier Ltd.
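To give a sense of what a one-dimensional rheological building block looks like, the sketch below evaluates the creep strain of an elastic spring in series with a Kelvin-Voigt element under constant stress. It is a generic textbook element with hypothetical parameters, not the authors' calibrated damage model:

```python
import math

def creep_strain(stress, E0, E1, eta1, t):
    """Strain of a spring (E0) in series with a Kelvin-Voigt element (E1, eta1)
    under a constant stress, as a function of time t.

    Generic rheological illustration only; units just need to be consistent
    (e.g. stress and stiffnesses in MPa, viscosity in MPa*day, t in days).
    """
    instantaneous = stress / E0
    delayed = (stress / E1) * (1.0 - math.exp(-E1 * t / eta1))
    return instantaneous + delayed


# Hypothetical numbers: 4 MPa sustained stress held for one year
print(creep_strain(stress=4.0, E0=3000.0, E1=9000.0, eta1=450000.0, t=365.0))
```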


Rajabalinejad M.,Ecole Polytechnique de Montréal | Mahdi T.,Ecole Polytechnique de Montréal | van Gelder P.,TUDelft
Natural Hazards | Year: 2010

In this study, we address an effective way to apply a novel reliability method, integrated with finite element models, to the safety assessment of the Scheldt pilot site in the Netherlands. This site was considered one of the three main pilot sites in Europe for assessing newly suggested techniques to reduce and manage flood risk within the FLOODsite project (http://www.floodsite.net, 2004-2009). The novel method of dynamic bounds (DB) is applied to this site after a successful experience reported in (Rajabalinejad, Reliability Methods for Finite Element Models, 1st edn., IOS Press, Amsterdam, 2009). In this study, the bi-functional response of the finite element model is considered, and a dimensional uncertainty is defined, representing the expected uncertainty for a certain dimension in the DB method. This uncertainty is used as a judgment tool to choose the dimension for the DB method for the desired accuracy. The results obtained by applying this technique are presented in this paper. © 2010 Springer Science+Business Media B.V.
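For context on what a structural reliability assessment computes, the sketch below estimates a failure probability by plain Monte Carlo sampling of a hypothetical limit-state function. The dynamic bounds method discussed in the paper is a technique for reducing the number of expensive finite element evaluations such an assessment needs; it is not shown here, and the limit-state function is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def limit_state(x):
    """Hypothetical limit-state function: g(x) <= 0 means failure.
    In the paper, this role is played by the finite element response."""
    return 4.0 - x[:, 0] - 0.5 * x[:, 1]

samples = rng.standard_normal((200_000, 2))   # two standardized random variables
failures = limit_state(samples) <= 0.0
print(f"Monte Carlo failure probability: {failures.mean():.2e}")
```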


Gouiza M.,University of Saskatchewan | Hall J.,Memorial University of Newfoundland | Bertotti G.,TUDelft | Bertotti G.,VU University Amsterdam
Basin Research | Year: 2015

The Orphan Basin, lying along the Newfoundland rifted continental margin, formed in Mesozoic time during the opening of the North Atlantic Ocean and the breakup of Iberia/Eurasia from North America. To investigate the evolution of the Orphan Basin and the factors that governed its formation, we (i) analysed the stratigraphic and crustal architecture documented by seismic data (courtesy of TGS), (ii) quantified the tectonic and thermal subsidence along a constructed geological transect, and (iii) used forward numerical modelling to understand the state of the pre-rift lithosphere and the distribution of deformation during rifting. Our study shows that the pre-rift lithosphere was 200-km thick and rheologically strong (150-km-thick elastic plate) prior to rifting. It also indicates that extension in the Orphan Basin occurred in three distinct phases during the Jurassic, the Early Cretaceous and the Late Cretaceous. Each rifting phase is characterized by a specific crustal and subcrustal thinning configuration. Crustal deformation initiated in the eastern part of the basin during the Jurassic and migrated to the west during the Cretaceous. It was coupled with a subcrustal thinning which was reduced underneath the eastern domain and very intense in the western domains of the basin. © 2014 John Wiley & Sons Ltd.


Lamers A.J.,University of Twente | Gallego Sanchez J.A.,EAFIT University | Herder J.L.,TUDelft
Mechanism and Machine Theory | Year: 2015

Monolithic and thus fully compliant surgical graspers are promising when they provide equal or better force feedback than conventional graspers. In this work, for the first time, a fully compliant grasper is designed to exhibit zero stiffness and zero operation force. The design problem is addressed by taking a building-block approach, in which a pre-existing positive-stiffness compliant grasper is compensated by a negative-stiffness balancer. The design of the balancer is conceived from a 4-bar linkage and explores the rigid-body-replacement method as a design approach towards static balancing. Design variables and sensitivities are determined through the use of a pseudo-rigid-body model, and final dimensions are obtained using rough hand calculations. Justification of the pseudo-rigid-body model, as well as the set of final dimensions, is done by non-linear finite element analysis. Experimental validation is done through a titanium prototype of 40 mm size with an unbalanced positive stiffness of 61.2 N/mm, showing that a force reduction of 91.75% is achievable over a range of 0.6 mm, with an approximate hysteresis of 1.32%. The behavior can be tuned from monostable to bistable. The rigid-body-replacement method proved successful in the design of a statically balanced fully compliant mechanism, thus widening the design possibilities for this kind of mechanism. © 2015 Elsevier Ltd.
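The quoted figures allow an order-of-magnitude check of the balancing. A minimal sketch, assuming the 91.75% reduction refers to the peak actuation force over the 0.6 mm stroke:

```python
# Rough arithmetic on the reported prototype figures (assumption: the quoted
# force reduction is relative to the unbalanced grasper over the same stroke).
k_unbalanced = 61.2   # N/mm, unbalanced positive stiffness
stroke = 0.6          # mm, balanced operating range
reduction = 0.9175    # 91.75% force reduction

peak_force_unbalanced = k_unbalanced * stroke                    # ~36.7 N
peak_force_balanced = peak_force_unbalanced * (1.0 - reduction)  # ~3.0 N
k_residual = peak_force_balanced / stroke                        # ~5.1 N/mm effective

print(peak_force_unbalanced, peak_force_balanced, k_residual)
```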
