Cremona, Italy


News Article | April 17, 2017
Site: www.prnewswire.com

Of the high-risk categories, financial management tops the list, accounting for 17.32% of the overall data. The other four categories that round out the top five are travel, life services, shopping, and social. High-risk categories and sub-categories are ranked from highest frequency to lowest.

Testin's mobile security experts have set out specific explanations for the five high-risk categories, their potential risks, and recommendations to fix them. The following recommendations are for reference only.

Decompilation refers to reverse analysis and research of targeted software (such as executable programs) with the aim of deriving the ideas, principles, structures, algorithms, processes, methods of operation, and other design elements used in the software. In some cases, such as Java-developed Android APK programs, the source code can be derived directly.

Risks:

1. Core code and processing logic leaked in this way can be stolen directly by competitors.
2. Hackers can inject malicious code and repackage the app as an imposter of the original program to carry out malicious actions.
3. Exposed source code makes software vulnerabilities easier to find and exploit, leaving the app more open to attack.

Recommendations:

1. Convert the entire DEX into another file by encrypting or compressing the target DEX file, save it in the assets folder or elsewhere, and then use class-loader technology to decrypt it in memory and load it at run time (a sketch of this class-loading approach follows below).
2. Extract the bytecode instructions from DexCode and replace them with zeros, or modify the method attributes, then make corrections and repairs in memory at run time.
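To make the first recommendation concrete, here is a minimal, illustrative sketch of the class-loader approach. It is not code from Testin's report: the asset name payload.dex.enc, the AES/CBC handling, and the entry class com.example.app.RealEntry are placeholder assumptions, and real packers do considerably more (integrity checks, anti-debugging, in-memory-only loading).

```java
import android.content.Context;
import dalvik.system.DexClassLoader;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

/** Minimal "shell" loader: decrypts an encrypted DEX from assets and loads it at run time. */
public final class DexShellLoader {

    /** Decrypts assets/payload.dex.enc, writes it to app-private storage, and returns a class from it. */
    public static Class<?> loadHiddenClass(Context context, byte[] aesKey, byte[] iv)
            throws Exception {
        // 1. Read the encrypted DEX shipped inside the APK's assets folder.
        byte[] encrypted;
        try (InputStream in = context.getAssets().open("payload.dex.enc")) {
            encrypted = readAll(in);
        }

        // 2. Decrypt it (AES/CBC purely as an example; key management is out of scope here).
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(aesKey, "AES"), new IvParameterSpec(iv));
        byte[] dexBytes = cipher.doFinal(encrypted);

        // 3. Write the plain DEX to app-private storage.
        File dexDir = context.getDir("payload", Context.MODE_PRIVATE);
        File dexFile = new File(dexDir, "payload.dex");
        try (FileOutputStream out = new FileOutputStream(dexFile)) {
            out.write(dexBytes);
        }

        // 4. Load the hidden code through a class loader and hand back the entry class.
        DexClassLoader loader = new DexClassLoader(
                dexFile.getAbsolutePath(),                                        // decrypted DEX
                context.getDir("odex", Context.MODE_PRIVATE).getAbsolutePath(),   // optimized output dir
                null,                                                             // no extra native libs
                context.getClassLoader());                                        // parent class loader
        return loader.loadClass("com.example.app.RealEntry");
    }

    private static byte[] readAll(InputStream in) throws Exception {
        java.io.ByteArrayOutputStream buffer = new java.io.ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buffer.write(chunk, 0, n);
        }
        return buffer.toByteArray();
    }
}
```

Writing the decrypted DEX to app-private storage still leaves it on disk briefly; on API 26+ an in-memory class loader (InMemoryDexClassLoader) can narrow that exposure.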
Code obfuscation rewrites various elements of the code, such as variable, function, and class names, into meaningless names: a single letter, a short combination of meaningless letters, or even symbols such as "__", so that nobody can guess their purpose from their names. It can also rewrite some of the logic in the code into a functionally equivalent form that is more difficult to understand; for example, a for loop can be rewritten as a while loop, and a while loop can be rewritten as recursion with streamlined intermediate variables. Disrupting the code format, such as deleting spaces and pushing multiple lines of code into one line, or breaking a line of code into multiple lines, also increases the difficulty for hackers trying to analyze the code directly.

Risks:

1. Without code obfuscation, the program's original code is completely exposed to hackers, lowering the cost of an attack.
2. Functional code is easier to analyze, copy, and steal.

Android SDK packages include the WebView component, which is mainly used to control web views. WebView uses the addJavascriptInterface method to enable interaction between local Java code and JavaScript, but on older platforms the method places no restriction on what can be called, allowing attackers to invoke arbitrary Java classes and ultimately letting JavaScript achieve arbitrary code execution on the device.

Risks:

1. Attackers can build malicious web pages, induce users to open them, and then execute arbitrary commands in the context of the application.
2. Attackers can use the vulnerability to remotely control the victim's mobile phone and implant Trojans.

Recommendations:

1. Android 4.2 (API 17) introduced a new interface mechanism that restricts addJavascriptInterface: only methods annotated with @JavascriptInterface are exposed. Some Android 2.3 devices can no longer be upgraded, so browsers and apps still need to remain compatible with them.
2. When the JS-to-Java bridge is used, every input parameter must be validated to block attack code (see the sketch after this list).
3. Control the related permissions, or avoid the JS-to-Java bridge as much as possible.
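Recommendations 1 and 2 above can be illustrated with a short, hypothetical sketch. The bridge name NativeBridge, the echo method, and the whitelist pattern are placeholder assumptions, not code from the report.

```java
import android.webkit.JavascriptInterface;
import android.webkit.WebView;

/** Minimal JS-to-Java bridge: only annotated methods are exposed, and every input is validated. */
public final class JsBridge {

    /** On API 17+ only methods carrying @JavascriptInterface are reachable from JavaScript. */
    @JavascriptInterface
    public String echo(String input) {
        // Validate every parameter crossing the bridge before using it.
        if (input == null || !input.matches("[A-Za-z0-9 _.-]{1,64}")) {
            return "";
        }
        return input;
    }

    /** Wires the bridge into a WebView with a few defensive defaults. */
    public static void attach(WebView webView) {
        webView.getSettings().setJavaScriptEnabled(true);
        // Remove an interface some platform versions add by default (defense in depth on old devices).
        webView.removeJavascriptInterface("searchBoxJavaBridge_");
        webView.addJavascriptInterface(new JsBridge(), "NativeBridge");
    }
}
```

From JavaScript the page can then only call NativeBridge.echo(...); on API 17 and above nothing without the annotation is reachable through the bridge.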
When a customized subclass of X509TrustManager does not authenticate the server certificate and accepts any server certificate by default, it creates a security risk: malicious programs can use man-in-the-middle attacks to bypass certificate verification.

Risk: man-in-the-middle (MitM) attacks, with all traffic readable directly by hackers.

Recommendation: use the checkServerTrusted function in the X509TrustManager subclass to check the legitimacy of server-side certificates. When the client fails to verify the server's certificate, attackers can establish separate connections with both ends of the communication and relay the data they receive, tricking both ends into believing they are communicating directly over a private connection while the whole exchange is controlled by the attacker. In a man-in-the-middle attack, attackers can intercept the communications of both parties and insert new content. Through such hijacking, attackers can steal plaintext accounts and passwords, chat content, mailing addresses, phone numbers, credit card payment information, and other sensitive data. They can even replace the original content with a malicious link or malicious code for remote control, fraudulent charges, and other offensive purposes. It is recommended to verify the SSL certificate: whether the signing CA is valid, whether the certificate is self-signed, whether the host domain name matches, whether the certificate has expired, and so on (a code sketch follows at the end of this article).

Testin is a leading global provider of one-stop mobile application cloud testing services, offering application testing and quality assurance for developers of mobile applications, games, VR/AR, wearable devices, the Internet of Things, and artificial intelligence. Testin's cloud testing covers functional, compatibility, regression, and automated security testing, real-device debugging, A/B testing, and bug management on real devices deployed in the cloud, driven by deep machine learning and AI-automated scripts. Testin's distributed testing, supported by shared experts around the world, targets functionality, user experience, scenarios, and usability. Testin Pro provides automated private-cloud compatibility testing, real-device debugging, functionality and performance provisioning, and application testing, with dedicated deployment for test management. After more than 150 million test iterations on over 2 million apps in the past 5+ years, Testin has grown from a groundbreaking idea into the #1 mobile app quality assurance platform, secured US$84.9 million across 3 rounds from IDG, Banyan, Haiyin, and CEL, and has succeeded not only in capturing the domestic market in China but also in setting foot in the global arena. Testin has been recognized in the 2014 and 2015 Zero2IPO v50 China, the 2014 Red Herring 100 Asia and 2015 Red Herring 100 Global, and the 2015 and 2016 Deloitte High-Tech & Growth Top 50 China.

By addressing mobile and OS fragmentation, app compatibility, functionality, user experience, performance, security, and analytics, Testin builds the confidence of thousands of developers, including McDonald's, Nestle, Starbucks, Benz, Philips, Kabam, and JD, to ensure great experiences for their users. For more information on security services, please keep an eye on http://www.Testin.net, a security testing and authentication service embedded with AI learning technology for developers and QA teams. To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/testin-security-report-q1-2017-financial-management-tops-five-high-risk-categories-300440131.html
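As a follow-up to the certificate-validation recommendation above, here is a minimal, hypothetical sketch of the safe direction: instead of a checkServerTrusted that silently accepts everything, the custom X509TrustManager delegates to the platform's default trust manager, which rejects invalid signing CAs, self-signed certificates, and expired certificates. The class name and wiring are illustrative assumptions, not code from the report.

```java
import java.security.KeyStore;
import java.security.cert.CertificateException;
import java.security.cert.X509Certificate;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.TrustManagerFactory;
import javax.net.ssl.X509TrustManager;

/** Delegates checkServerTrusted to the platform's default trust manager instead of accepting everything. */
public final class SafeTrustManager implements X509TrustManager {

    private final X509TrustManager systemTrustManager;

    public SafeTrustManager() throws Exception {
        TrustManagerFactory factory =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        factory.init((KeyStore) null); // null = use the platform's built-in CA store
        X509TrustManager found = null;
        for (TrustManager tm : factory.getTrustManagers()) {
            if (tm instanceof X509TrustManager) {
                found = (X509TrustManager) tm;
                break;
            }
        }
        if (found == null) {
            throw new IllegalStateException("No system X509TrustManager available");
        }
        this.systemTrustManager = found;
    }

    @Override
    public void checkServerTrusted(X509Certificate[] chain, String authType)
            throws CertificateException {
        // Full chain validation: bad signing CAs, self-signed and expired certificates are rejected here.
        systemTrustManager.checkServerTrusted(chain, authType);
    }

    @Override
    public void checkClientTrusted(X509Certificate[] chain, String authType)
            throws CertificateException {
        systemTrustManager.checkClientTrusted(chain, authType);
    }

    @Override
    public X509Certificate[] getAcceptedIssuers() {
        return systemTrustManager.getAcceptedIssuers();
    }

    /** Example wiring: builds an SSLContext that uses the delegating trust manager. */
    public static SSLContext newSslContext() throws Exception {
        SSLContext context = SSLContext.getInstance("TLS");
        context.init(null, new TrustManager[] { new SafeTrustManager() }, null);
        return context;
    }
}
```

Note that no custom HostnameVerifier is installed, so the default verifier still checks that the host domain name matches the certificate, as the article recommends.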


Dickenson E.R.V., Colorado School of Mines | Snyder S.A., App Quality | Sedlak D.L., University of California at Berkeley | Drewes J.E., Colorado School of Mines
Water Research | Year: 2011

Numerous studies have reported the presence of trace (i.e., ng/L) organic chemicals in municipal wastewater effluents, but it is unclear which compounds will be useful to evaluate the contribution of effluent to overall river flow or the attenuation processes that occur in receiving streams. This paper presents a new approach that uses a suite of common trace organic chemicals as indicators to assess the degree of impact and attenuation of trace organic chemicals in receiving streams. The utility of the approach was validated by effluent monitoring at ten wastewater treatment plants and two effluent-impacted rivers with short retention times (<17 h). A total of 56 compounds were particularly well suited as potential indicators, occurring frequently in effluent samples at concentrations that were at least five times higher than their limit of quantification. Monitoring data from two effluent-impacted rivers indicated that biotransformation was not important for these two river stretches, whereas photolysis attenuation was possibly important for the shallow river. The application of this approach to receiving waters and water reclamation and reuse systems will allow for more effective allocation of resources in future monitoring programs. © 2010 Elsevier Ltd.


Tarazona S., Research Center Principe Felipe | Tarazona S., App Quality | Garcia-Alcalde F., Research Center Principe Felipe | Dopazo J., Research Center Principe Felipe | And 2 more authors.
Genome Research | Year: 2011

Next-generation sequencing (NGS) technologies are revolutionizing genome research, and in particular, their application to transcriptomics (RNA-seq) is increasingly being used for gene expression profiling as a replacement for microarrays. However, the properties of RNA-seq data have not yet been fully established, and additional research is needed for understanding how these data respond to differential expression analysis. In this work, we set out to gain insights into the characteristics of RNA-seq data analysis by studying an important parameter of this technology: the sequencing depth. We have analyzed how sequencing depth affects the detection of transcripts and their identification as differentially expressed, looking at aspects such as transcript biotype, length, expression level, and fold-change. We have evaluated different algorithms available for the analysis of RNA-seq and proposed a novel approach, NOISeq, that differs from existing methods in that it is data-adaptive and nonparametric. Our results reveal that most existing methodologies suffer from a strong dependency on sequencing depth for their differential expression calls and that this results in a considerable number of false positives that increases as the number of reads grows. In contrast, our proposed method models the noise distribution from the actual data, can therefore better adapt to the size of the data set, and is more effective in controlling the rate of false discoveries. This work discusses the true potential of RNA-seq for studying regulation at low expression ranges, the noise within RNA-seq data, and the issue of replication. © 2011 by Cold Spring Harbor Laboratory Press.


Ferrer A., App Quality
Quality Engineering | Year: 2014

The basic fundamentals of statistical process control (SPC) were proposed by Walter Shewhart for data-starved production environments typical in the 1920s and 1930s. In the 21st century, the traditional scarcity of data has given way to a data-rich environment typical of highly automated and computerized modern processes. These data often exhibit high correlation, rank deficiency, low signal-to-noise ratio, multistage and multiway structures, and missing values. Conventional univariate and multivariate SPC techniques are not suitable in these environments. This article discusses the paradigm shift to which those working in the quality improvement field should pay keen attention. We advocate the use of latent structure-based multivariate statistical process control methods as efficient quality improvement tools in these massive data contexts. This is a strategic issue for industrial success in the tremendously competitive global market. © Copyright Taylor and Francis Group, LLC.


Prats-Montalban J.M., App Quality | de Juan A., University of Barcelona | Ferrer A., App Quality
Chemometrics and Intelligent Laboratory Systems | Year: 2011

Nowadays, image analysis is becoming more important because of its ability to perform fast and non-invasive low-cost analysis on products and processes. Image analysis is a wide denomination that encloses classical studies on gray scale or RGB images, analysis of images collected using a few spectral channels (sometimes called multispectral images) or, most recently, data treatments to deal with hyperspectral images, where the spectral direction is exploited to its full extent. Pioneering data treatments in image analysis were applied to simple images, mainly for defect detection, segmentation, and classification, by the Computer Science community. From the late 80s, the chemometric community joined this field, introducing powerful tools for image analysis that were already in use for the study of classical spectroscopic data sets and were appropriately modified to fit the particular characteristics of image structures. These chemometric approaches adapt to images of all kinds, from the simplest to hyperspectral images, and have provided new insights into the spatial and spectroscopic information in this kind of data set. New fields opened by the introduction of chemometrics in image analysis include exploratory image analysis, multivariate statistical process control (monitoring), multivariate image regression, and image resolution. This paper reviews the different techniques developed in image analysis and shows the evolution in the information provided by the different methodologies, which has been heavily pushed by the increasing complexity of the image measurements in the spatial and, particularly, the spectral direction. © 2011 Elsevier B.V.


Wert E.C., App Quality
Journal - American Water Works Association | Year: 2014

The removal of biodegradable ozone by-products was evaluated at pilot scale using a fixed-bed biofilm reactor (FBBR) containing spherical plastic support media. Six FBBRs were operated in parallel with varying media sizes (1-, 1.25-, or 2-in. diameter) and empty bed contact times (EBCTs; 6 or 12 min). Influent water was provided from a full-scale water treatment plant following ozonation, coagulation, and flocculation processes. After seven months of operation, pseudosteady-state conditions were achieved with up to 50% removal of assimilable organic carbon (AOC) and up to 40% reduction in ultraviolet absorbance at 254 nm (UV254). Increases in FBBR effluent turbidity and head loss were also indicative of biomass development and sloughing. Process efficiency deteriorated because of the consumption of biomass by snails and other invertebrates. © 2014 Koch Membrane Systems, Inc.


Perfumes are manufactured by mixing odorous materials with different volatilities. The parameter that measures the lasting property of a material when applied on the skin is called substantivity or tenacity. It is well known by perfumers that citrus and green notes are perceived as fresh and they tend to evaporate quickly, while odors most dissimilar to 'fresh' (e.g., oriental, powdery, erogenic and animalic scents) are tenacious. However, studies aimed at quantifying the relationship between fresh odor quality and substantivity have not received much attention. In this work, perceptual olfactory ratings on a fresh scale, estimated in a previous study, were compared with substantivity parameters and antierogenic ratings from the literature. It was found that the correlation between fresh odor character and odorant substantivity is quite strong (r = -0.85). 'Fresh' is sometimes interpreted in perfumery as 'cool' and the opposite of 'warm'. This association suggests that odor freshness might be somehow related to temperature. Assuming that odor perception space was shaped throughout evolution in temperate climates, results reported here are consistent with the hypothesis that 'fresh' evokes scents typically encountered in the cool season, while 'warm' would be evoked by odors found in nature during summer. This hypothesis is rather simplistic but it may provide a new insight to better understand the perceptual space of scents. © 2013 by the authors; licensee MDPI, Basel, Switzerland.


Capilla C., App Quality
WIT Transactions on Ecology and the Environment | Year: 2012

Meteorological variability must be taken into account when modelling temporal changes in air pollutants in order to evaluate emission reduction strategies. In this paper, nitrogen dioxide (NO2) hourly data are analyzed at two monitoring stations in Valencia (Spain). Meteorologically adjusted nitric oxide (NO) is used as an indicator of traffic density. The low-pass filter developed by Kolmogorov and Zurbenko is used to split the logarithm of NO2 and NO hourly concentrations into long-term, seasonal, and short-term components. Meteorological effects are analyzed and removed from the filtered pollutant time series. Forward stepwise regression is employed to select the filtered meteorological variables that explain the most variability. A natural logarithmic transformation is applied to the series of hourly data, and the model for the hourly pollutant concentrations is multiplicative. The dependencies of urban NO2 on the corresponding vehicular emissions and relevant meteorological parameters are non-linear. Long-term components represent a small proportion of the overall variability of air pollution data. Seasonal and short-term components mask the underlying relationship between NO2 and emissions if the series is studied as a whole. The pollutant temporal components have to be studied separately because of their different physical and explanatory mechanisms. © 2012 WIT Press.
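The Kolmogorov-Zurbenko filter mentioned above, KZ(m, k), is simply k passes of a centered moving average of length m over the series; applying it with two different window lengths to the log-transformed hourly concentrations yields the long-term, seasonal, and short-term components. The sketch below illustrates that splitting; the window lengths are placeholder assumptions for hourly data, not the values used in the paper.

```java
import java.util.Arrays;

/** Sketch of the Kolmogorov-Zurbenko (KZ) decomposition described above. */
public final class KzFilter {

    /** KZ(m, k): k iterations of a centered moving average of (odd) window length m. */
    public static double[] kz(double[] series, int m, int k) {
        double[] result = Arrays.copyOf(series, series.length);
        int half = m / 2;
        for (int pass = 0; pass < k; pass++) {
            double[] smoothed = new double[result.length];
            for (int t = 0; t < result.length; t++) {
                // Near the edges the average is taken over the available points only.
                int from = Math.max(0, t - half);
                int to = Math.min(result.length - 1, t + half);
                double sum = 0.0;
                for (int i = from; i <= to; i++) {
                    sum += result[i];
                }
                smoothed[t] = sum / (to - from + 1);
            }
            result = smoothed;
        }
        return result;
    }

    /**
     * Splits ln(concentration) into long-term, seasonal and short-term parts.
     * The window lengths (in hours) are placeholders, not the paper's settings.
     */
    public static double[][] decompose(double[] logHourly) {
        double[] baselinePlusSeasonal = kz(logHourly, 25, 5);     // removes the short-term part
        double[] longTerm = kz(logHourly, 365 * 24 + 1, 3);       // very long window isolates the trend
        double[] seasonal = new double[logHourly.length];
        double[] shortTerm = new double[logHourly.length];
        for (int t = 0; t < logHourly.length; t++) {
            seasonal[t] = baselinePlusSeasonal[t] - longTerm[t];
            shortTerm[t] = logHourly[t] - baselinePlusSeasonal[t];
        }
        return new double[][] { longTerm, seasonal, shortTerm };
    }
}
```

Because the decomposition is carried out on the natural-log scale, exponentiating each component recovers multiplicative factors of the original concentration, which matches the multiplicative model described in the abstract.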


Computer-implemented methods and systems are disclosed for organizing user reviews, especially computer app reviews, into clusters and ranking the clusters so that the reviews may be more meaningfully analyzed.
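The abstract above does not say how the clustering or ranking is performed, so the sketch below is only a toy illustration of the general idea (bucketing reviews by a shared keyword and ranking the buckets by size), not the patented method; the topic list and single-keyword assignment are assumptions.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Toy illustration: bucket app reviews by a shared keyword and rank buckets by size. */
public final class ReviewClusters {

    /** Groups each review under the first known topic keyword it mentions. */
    public static Map<String, List<String>> cluster(List<String> reviews, List<String> topics) {
        Map<String, List<String>> clusters = new HashMap<>();
        for (String review : reviews) {
            String lower = review.toLowerCase();
            for (String topic : topics) {
                if (lower.contains(topic)) {
                    clusters.computeIfAbsent(topic, t -> new ArrayList<>()).add(review);
                    break; // assign each review to a single cluster
                }
            }
        }
        return clusters;
    }

    /** Ranks clusters by how many reviews they contain, largest first. */
    public static List<String> rankBySize(Map<String, List<String>> clusters) {
        List<String> topics = new ArrayList<>(clusters.keySet());
        topics.sort((a, b) -> clusters.get(b).size() - clusters.get(a).size());
        return topics;
    }
}
```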


News Article | October 2, 2012
Site: thenextweb.com

App Quality Alliance (AQuA), the non-profit organization backed by some of the major players in mobile, has launched what it's calling the "first ever quality app directory." Just to recap, AQuA emerged from the ashes of The Unified Testing Initiative back in August, as it sought a "deeper commitment to quality in app development". It's run and funded by AT&T, LG, Motorola, Nokia, Oracle, Orange, Samsung and Sony Mobile.

The Quality App Directory has been developed in close consultation with mobile app developers, and claims to be an independent and free resource for developers to upload details of apps – either self-tested or tested by an approved body – to gain accreditation and recognition for their good QA practice.

So the idea here is that only apps that meet a certain standard will be included in the directory, but it's worth stressing that this isn't another app store – you can't download apps from here. It's purely a directory for interested parties to glean further information about an app, and for developers to claim that their work has been AQuA verified. It's like a seal of approval.

It's also worth noting that there is a little wriggle room, as developers themselves can sign the terms and conditions stipulating that they will only upload details of apps that have met these standards, though AQuA will audit the self-verified apps periodically. This is all entirely funded by AQuA's backers, so there's no cost attached for developers themselves.

While developers can submit self-tested apps based on AQuA's testing criteria and receive entry-level accreditation, they can also choose one of four test houses to accredit their app for an "independent" accreditation – these are Babel Media, Intertek, Sogeti High Tech and VMC Consulting. The cost of testing an app using one of the independent bodies depends on which body is used and the nature of the app. Where an AQuA member has tested an app itself, it will gain the status of AQuA Member Verified. Developers who achieve consistent accreditation across a series of apps will attain 'trusted developer' status.

AQuA, with industry input, has compiled a set of testing criteria for Android, though other platforms will be added in due course. The criteria are focused mainly on usability and application behaviour within the mobile device, but will be expanded to cover other areas relating to battery life, privacy, network usage and more. Interestingly, the directory doesn't look at the usefulness or function of the app in question, so this is entirely about usability and whether the app actually works.

"In today's fast growing app market, the three ingredients for success are innovation, quality in delivery and effective marketing," says Martin Wrigley, Chairman, AQuA, and Director of Developer Service at Orange. "AQuA has always supported quality in delivery and with the directory we are also supporting the effective marketing of developers' quality apps."

A beta trial kicked off in February 2012 with developers and test houses, who flagged a number of points, and this resulted in many of the suggested changes for today's first release.
