KPA Ltd. | Ra'anana, Israel

Kenett R.S., KPA Ltd. | Kenett R.S., University of Turin | Pollak M., Hebrew University of Jerusalem
Quality and Reliability Engineering International | Year: 2012

The literature on statistical process control has focused mostly on the average run length (ARL) to an alarm as a performance criterion for sequential schemes. When the process is in control, this is the ARL to false alarm, generally denoted by ARL₀, and it represents the in-control operating characteristic of the procedure. The ARL from the occurrence of a change to its detection represents an out-of-control operating characteristic and is typically embodied by ARL₁, the ARL to detection assuming that the change occurs at the very start of surveillance. However, these indices do not tell the whole story, and at times they are not well characterized by a single number. We review the role of various operating characteristics in assessing the performance of sequential procedures, in comparison with ARL₀ and ARL₁. Copyright © 2012 John Wiley & Sons, Ltd.
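As a rough illustration of these performance indices (a sketch of standard practice, not code from the paper), the following Python snippet estimates ARL₀ and ARL₁ by Monte Carlo simulation for a one-sided CUSUM monitoring a unit-variance Gaussian mean. The reference value k, the threshold h, the shift size and the replication count are all illustrative choices.

    # Monte Carlo sketch of ARL0 and ARL1 for a one-sided CUSUM on N(mu, 1) data.
    # All parameter values below are illustrative, not taken from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def cusum_run_length(delta, k=0.5, h=5.0, max_n=100_000):
        """Observations until S_n = max(0, S_{n-1} + x_n - k) first exceeds h."""
        s = 0.0
        for n in range(1, max_n + 1):
            s = max(0.0, s + rng.normal(delta, 1.0) - k)
            if s > h:
                return n
        return max_n  # censored run; rare for these settings

    # ARL0: process in control throughout (delta = 0).
    # ARL1: change present from the very start of surveillance (delta = 1).
    arl0 = np.mean([cusum_run_length(0.0) for _ in range(2_000)])
    arl1 = np.mean([cusum_run_length(1.0) for _ in range(2_000)])
    print(f"estimated ARL0 ~ {arl0:.0f}, estimated ARL1 ~ {arl1:.1f}")

For these textbook settings (k = 0.5, h = 5), the in-control ARL of a one-sided CUSUM is known to be roughly 930 observations, against an ARL to detection of about 10 for a one-standard-deviation shift; this is exactly the contrast between ARL₀ and ARL₁ that the abstract describes.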


Assaraf S., Israel Aerospace Industries | Assaraf S., Systems Engineer and Process Group | Kenett R.S., KPA Group | Kenett R.S., University of Turin | Kenett R.S., KPA Ltd.
54th Israel Annual Conference on Aerospace Sciences 2014 | Year: 2014

Integration and testing of large-scale systems can be a long and tedious effort: many defects and failures, limited resources and a tight schedule. Using Software Trouble Assessment Matrix (STAM) metrics to organize the relationships between three dimensions - the phase where a defect is injected, the earliest phase where it could be detected and the phase where it is actually detected - the project manager and systems engineer can evaluate the system's actual development stage, plan realistic schedules and support the achievement of quality and operational goals. The paper presents a STAM analysis of data from a large satellite program at IAI. The analysis indicated that the planning of defect detection is very effective but its execution is not: the tests are planned so that defects can be detected at the required stage, yet testing is performed inefficiently, so detection takes longer and defects are found late. This conclusion triggered changes to, and reviews of, the test procedures and preparation processes. Results a year later showed a significant improvement.
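As a hedged sketch of the STAM idea (hypothetical phases and defect log, not the IAI data), the snippet below tabulates defects by the phase where they were injected against the phase where they were detected. The paper's three-dimensional version also tracks the earliest phase at which each defect could have been detected, which this two-dimensional sketch omits.

    # Hedged STAM-style tabulation: injection phase vs. actual detection phase.
    # Phase names and the defect log are hypothetical, not the IAI data.
    import numpy as np

    phases = ["Requirements", "Design", "Coding", "Integration", "System Test"]
    idx = {p: i for i, p in enumerate(phases)}

    defects = [("Requirements", "Design"), ("Requirements", "System Test"),
               ("Design", "Coding"), ("Coding", "Coding"),
               ("Coding", "Integration"), ("Design", "System Test")]

    stam = np.zeros((len(phases), len(phases)), dtype=int)
    for injected, detected in defects:
        stam[idx[injected], idx[detected]] += 1

    # Diagonal mass = defects caught in the phase where they were injected;
    # everything above it leaked to a later, more expensive phase.
    on_time = np.trace(stam) / stam.sum()
    print(stam)
    print(f"defects detected in their injection phase: {on_time:.0%}")

In the full STAM, comparing the earliest-possible-detection tabulation against the actual-detection tabulation exposes the planning-versus-execution gap the authors report.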


Kenett R.S., KPA Ltd. | Kenett R.S., University of Turin
Proceedings - International Computer Software and Applications Conference | Year: 2011

Software Cybernetics can address important challenges in future software-based systems. Achieving this requires interdisciplinary work and research, so an evaluation of how Software Cybernetics can interact with other disciplines is called for. © 2011 IEEE.


Figini S., University of Pavia | Kenett R.S., KPA Ltd. | Kenett R.S., University of Turin | Salini S., University of Milan
Quality and Reliability Engineering International | Year: 2010

The focus of the paper is the use of optimal scaling techniques to reduce the dimensionality of ordinal variables describing the quality of services to a continuous score interpretable as a measure of operational risk. This operational risk score is merged with a financial risk score in order to obtain an integrated measure of risk. The proposed integration methodology generalizes the merging model suggested in Figini and Giudici (J. Oper. Res. Soc. 2010; in press) to a hierarchical data structure. To demonstrate the methodology, we use real data from a telecommunication company providing services to enterprises across different business lines and geographical locations. For each enterprise, we collected information about operational and financial performance. The approach demonstrated in this case study can be generalized to service providers concerned with both the quality of service and the financial solvency of their customers. © 2010 John Wiley & Sons, Ltd.
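A minimal sketch of the scoring pipeline follows, under assumed stand-ins: column-wise standardization plus a one-component PCA replaces the paper's optimal scaling step, the data are synthetic, and the blending weight is arbitrary.

    # Sketch of the scoring pipeline on synthetic data. Standardization plus a
    # one-component PCA stands in for the paper's optimal scaling step.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)

    # 200 enterprises rated on 4 ordinal service-quality items (1 = worst, 5 = best)
    ratings = rng.integers(1, 6, size=(200, 4)).astype(float)

    # Crude monotone quantification of the ordinal columns, then a projection
    # onto one dimension: the operational risk score.
    z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)
    operational = PCA(n_components=1).fit_transform(z).ravel()

    financial = rng.normal(size=200)  # placeholder for an external financial score
    weight = 0.5                      # illustrative blending weight
    integrated = weight * operational + (1 - weight) * financial
    print(integrated[:5])

In the paper the merging follows the hierarchical model of Figini and Giudici rather than this fixed linear blend.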


Dalla Valle L., University of Plymouth | Kenett R.S., KPA Ltd. | Kenett R.S., University of Turin | Kenett R.S., New York University
Quality and Reliability Engineering International | Year: 2015

This work concerns the integrated analysis of data collected as official statistics with administrative data from operational systems, in order to increase the quality of information. Information quality, or InfoQ, is 'the potential of a data set to achieve a specific goal by using a given empirical analysis method'. InfoQ is based on the identification of four interacting components - the analysis goal, the data, the data analysis and the utility - and it is assessed through eight dimensions: data resolution, data structure, data integration, temporal relevance, generalizability, chronology of data and goal, construct operationalization and communication. The paper illustrates, through case studies, a novel strategy to increase InfoQ based on the integration of official statistics with administrative data using copulas and Bayesian Networks. Official statistics are extraordinary sources of information. However, because of weaknesses in temporal relevance and in the chronology of data and goals, these fundamental sources are often not properly leveraged, resulting in a poor level of InfoQ, low-value statistical analyses and insufficiently informative results. By improving temporal relevance and the chronology of data and goals, Bayesian Networks allow us to calibrate official with administrative data, strengthening the quality of the information derived from official surveys and, overall, enhancing InfoQ. We show, with examples, how to design and implement such a calibration strategy. Copyright © 2015 John Wiley & Sons, Ltd.
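The following toy example (synthetic data; a hand-rolled two-node fragment rather than the paper's models) illustrates the calibration idea: an administrative variable updates the estimate of a quantity that the official survey measures, via a learned conditional probability table.

    # Two-node Bayesian-network fragment on synthetic linked records: an
    # administrative activity flag updates the official estimate of the share
    # of "active" units. All names and numbers here are assumptions.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)

    admin = rng.choice(["low", "high"], size=500, p=[0.6, 0.4])
    survey = np.where((admin == "high") & (rng.random(500) < 0.8),
                      "active", "inactive")

    # Learned edge of the network: P(survey outcome | admin activity)
    cpt = pd.crosstab(admin, survey, normalize="index")
    print(cpt)

    # Calibration: propagate the current administrative mix through the CPT
    # to refresh the official "active" share.
    admin_prior = pd.Series({"low": 0.55, "high": 0.45})  # assumed current mix
    calibrated_active = (admin_prior * cpt["active"]).sum()
    print(f"calibrated share of active units: {calibrated_active:.2f}")

The paper's strategy involves richer networks, plus copulas for continuous variables, but the underlying idea is similar: administrative evidence propagates through the model to refresh an official estimate whose temporal relevance has decayed.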
