Cremona, Italy


Dickenson E.R.V., Colorado School of Mines | Snyder S.A., App Quality | Sedlak D.L., University of California at Berkeley | Drewes J.E., Colorado School of Mines
Water Research | Year: 2011

Numerous studies have reported the presence of trace (i.e., ng/L) organic chemicals in municipal wastewater effluents, but it is unclear which compounds will be useful to evaluate the contribution of effluent to overall river flow or the attenuation processes that occur in receiving streams. This paper presents a new approach that uses a suite of common trace organic chemicals as indicators to assess the degree of impact and attenuation of trace organic chemicals in receiving streams. The utility of the approach was validated by effluent monitoring at ten wastewater treatment plants and two effluent-impacted rivers with short retention times (<17 h). A total of 56 compounds were particularly well suited as potential indicators, occurring frequently in effluent samples at concentrations that were at least five times higher than their limit of quantification. Monitoring data from two effluent-impacted rivers indicated that biotransformation was not important for these two river stretches, whereas photolysis attenuation was possibly important for the shallow river. The application of this approach to receiving waters and water reclamation and reuse systems will allow for more effective allocation of resources in future monitoring programs. © 2010 Elsevier Ltd.
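As a rough illustration of the screening criterion described above (compounds detected frequently at concentrations of at least five times their limit of quantification), the following Python sketch applies that filter to a hypothetical monitoring table. The compound names, concentrations, and frequency threshold are invented for the example, not taken from the study.

    import pandas as pd

    # Hypothetical effluent monitoring results: one row per (compound, sample),
    # with the measured concentration and the compound's limit of quantification (LOQ).
    data = pd.DataFrame({
        "compound": ["sucralose", "sucralose", "compound_x", "compound_x"],
        "concentration_ng_L": [40000.0, 35000.0, 250.0, 12.0],
        "loq_ng_L": [100.0, 100.0, 10.0, 10.0],
    })

    MIN_DETECTION_FREQUENCY = 0.8   # assumed cut-off for "occurring frequently"
    MIN_LOQ_MULTIPLE = 5.0          # at least five times the limit of quantification

    def screen_indicators(df: pd.DataFrame) -> list:
        """Return compounds that frequently occur at >= 5x their LOQ."""
        flagged = df["concentration_ng_L"] >= MIN_LOQ_MULTIPLE * df["loq_ng_L"]
        frequency = flagged.groupby(df["compound"]).mean()
        return frequency[frequency >= MIN_DETECTION_FREQUENCY].index.tolist()

    print(screen_indicators(data))   # -> ['sucralose']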


Tarazona S., Research Center Principe Felipe | Tarazona S., App Quality | Garcia-Alcalde F., Research Center Principe Felipe | Dopazo J., Research Center Principe Felipe | And 2 more authors.
Genome Research | Year: 2011

Next-generation sequencing (NGS) technologies are revolutionizing genome research, and in particular, their application to transcriptomics (RNA-seq) is increasingly being used for gene expression profiling as a replacement for microarrays. However, the properties of RNA-seq data have not yet been fully established, and additional research is needed to understand how these data respond to differential expression analysis. In this work, we set out to gain insights into the characteristics of RNA-seq data analysis by studying an important parameter of this technology: the sequencing depth. We have analyzed how sequencing depth affects the detection of transcripts and their identification as differentially expressed, looking at aspects such as transcript biotype, length, expression level, and fold-change. We have evaluated different algorithms available for the analysis of RNA-seq and proposed a novel approach, NOISeq, that differs from existing methods in that it is data-adaptive and nonparametric. Our results reveal that most existing methodologies suffer from a strong dependency on sequencing depth for their differential expression calls and that this results in a considerable number of false positives that increases as the number of reads grows. In contrast, our proposed method models the noise distribution from the actual data, can therefore better adapt to the size of the data set, and is more effective in controlling the rate of false discoveries. This work discusses the true potential of RNA-seq for studying regulation at low expression ranges, the noise within RNA-seq data, and the issue of replication. © 2011 by Cold Spring Harbor Laboratory Press.
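The toy sketch below conveys the general flavour of a data-adaptive, nonparametric differential expression call: per-gene fold-change and difference statistics are compared against an empirical noise distribution built from within-condition replicates. It is not the published NOISeq algorithm; the counts, replicate layout, and probability cut-off are all made up for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy count matrix: 1,000 genes x 2 replicates per condition (synthetic data).
    counts_a = rng.poisson(lam=50, size=(1000, 2)).astype(float) + 1.0
    counts_b = counts_a.copy()
    counts_b[:50] *= 3.0   # pretend the first 50 genes are truly up-regulated in condition B

    def signal(x, y):
        """Per-gene |log2 fold-change| (M) and absolute difference (D)."""
        return np.abs(np.log2(x / y)), np.abs(x - y)

    # Empirical "noise": M and D computed between replicates of the same condition.
    noise_m, noise_d = signal(counts_a[:, 0], counts_a[:, 1])

    # "Signal": M and D computed between the two condition means.
    sig_m, sig_d = signal(counts_a.mean(axis=1), counts_b.mean(axis=1))

    # Probability that a gene's (M, D) pair exceeds the noise in both statistics.
    prob = np.array([np.mean((noise_m < m) & (noise_d < d)) for m, d in zip(sig_m, sig_d)])
    print(int(np.sum(prob > 0.8)), "genes flagged at an illustrative 0.8 probability cut-off")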


Ferrer A., App Quality
Quality Engineering | Year: 2014

The fundamentals of statistical process control (SPC) were proposed by Walter Shewhart for the data-starved production environments typical of the 1920s and 1930s. In the 21st century, the traditional scarcity of data has given way to a data-rich environment typical of highly automated and computerized modern processes. These data often exhibit high correlation, rank deficiency, low signal-to-noise ratio, multistage and multiway structures, and missing values. Conventional univariate and multivariate SPC techniques are not suitable in these environments. This article discusses the paradigm shift to which those working in the quality improvement field should pay keen attention. We advocate the use of latent structure-based multivariate statistical process control methods as efficient quality improvement tools in these massive data contexts. This is a strategic issue for industrial success in the tremendously competitive global market. © Copyright Taylor and Francis Group, LLC.
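A minimal sketch of the kind of latent structure-based monitoring the article advocates: a principal component model of in-control data, with Hotelling's T2 for variation inside the latent model and the squared prediction error (SPE) for variation outside it. The simulated process data, component count, and shifted observation are illustrative assumptions, and control limits (normally derived from the in-control reference data) are omitted.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)

    # Simulated in-control data: 20 correlated process variables driven by 3 latent factors.
    latent = rng.normal(size=(500, 3))
    X = latent @ rng.normal(size=(3, 20)) + 0.1 * rng.normal(size=(500, 20))

    mean, std = X.mean(axis=0), X.std(axis=0)
    pca = PCA(n_components=3).fit((X - mean) / std)

    def monitoring_statistics(x_new: np.ndarray) -> tuple:
        """Hotelling's T2 (variation inside the latent model) and SPE (residual variation)."""
        xs = (x_new - mean) / std
        scores = pca.transform(xs.reshape(1, -1))[0]
        t2 = float(np.sum(scores ** 2 / pca.explained_variance_))
        residual = xs - pca.inverse_transform(scores.reshape(1, -1))[0]
        return t2, float(np.sum(residual ** 2))

    print(monitoring_statistics(X[0]))        # an in-control observation: both statistics small
    print(monitoring_statistics(X[0] + 5.0))  # a shift that breaks the correlation structure: large SPE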


Prats-Montalban J.M., App Quality | de Juan A., University of Barcelona | Ferrer A., App Quality
Chemometrics and Intelligent Laboratory Systems | Year: 2011

Nowadays, image analysis is becoming more important because of its ability to perform fast, non-invasive, low-cost analysis of products and processes. Image analysis is a wide denomination that encloses classical studies on gray scale or RGB images, analysis of images collected using a few spectral channels (sometimes called multispectral images) or, most recently, data treatments to deal with hyperspectral images, where the spectral direction is exploited in its full extension. Pioneering data treatments in image analysis were applied to simple images, mainly for defect detection, segmentation and classification, by the computer science community. From the late 1980s, the chemometric community joined this field, introducing powerful tools for image analysis that were already in use for the study of classical spectroscopic data sets and were appropriately modified to fit the particular characteristics of image structures. These chemometric approaches adapt to images of all kinds, from the simplest to hyperspectral images, and have provided new insights into the spatial and spectroscopic information of these data sets. New fields opened by the introduction of chemometrics into image analysis include exploratory image analysis, multivariate statistical process control (monitoring), multivariate image regression and image resolution. This paper reviews the different techniques developed in image analysis and shows the evolution in the information provided by the different methodologies, driven largely by the increasing complexity of the image measurements in the spatial and, particularly, the spectral direction. © 2011 Elsevier B.V.
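To make the hyperspectral case concrete, here is a small sketch of the common unfold-analyse-refold pattern used in multivariate image analysis. The image cube is synthetic and the number of components arbitrary; this is a generic illustration, not a reproduction of any specific method reviewed in the paper.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)

    # Synthetic hyperspectral image: 64 x 64 pixels, 100 spectral channels.
    rows, cols, channels = 64, 64, 100
    cube = rng.random((rows, cols, channels))

    # Unfold the 3-D cube into a (pixels x channels) matrix for pixel-wise multivariate analysis.
    unfolded = cube.reshape(rows * cols, channels)

    # A few principal components summarise the spectral variation across pixels.
    pca = PCA(n_components=3)
    scores = pca.fit_transform(unfolded)

    # Refold each score vector into a "score image" that can be inspected for spatial patterns,
    # defects, or used for segmentation and process monitoring.
    score_images = scores.reshape(rows, cols, 3)
    print(score_images.shape, pca.explained_variance_ratio_.round(3))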


Wert E.C., App Quality
Journal - American Water Works Association | Year: 2014

The removal of biodegradable ozone by-products was evaluated at pilot scale using a fixed-bed biofilm reactor (FBBR) containing spherical plastic support media. Six FBBRs were operated in parallel with varying media sizes (1-, 1.25-, or 2-in. diameter) and empty bed contact times (EBCTs; 6 or 12 min). Influent water was provided from a full-scale water treatment plant following ozonation, coagulation, and flocculation processes. After seven months of operation, pseudosteady-state conditions were achieved with up to 50% removal of assimilable organic carbon (AOC) and up to 40% reduction in ultraviolet absorbance at 254 nm (UV254). Increases in FBBR effluent turbidity and head loss were also indicative of biomass development and sloughing. Process efficiency deteriorated because of the consumption of biomass by snails and other invertebrates. © 2014 Koch Membrane Systems, Inc.
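For readers unfamiliar with the operating parameters mentioned above, the short sketch below shows how empty bed contact time and percent removal are conventionally computed. The bed volume, flow rate, and AOC concentrations are invented example values, not data from the study.

    # Conventional definitions; all numbers below are illustrative, not study data.
    def ebct_minutes(bed_volume_L: float, flow_L_per_min: float) -> float:
        """Empty bed contact time = empty bed volume / volumetric flow rate."""
        return bed_volume_L / flow_L_per_min

    def percent_removal(influent: float, effluent: float) -> float:
        """Percent removal of a constituent across the reactor."""
        return 100.0 * (influent - effluent) / influent

    print(ebct_minutes(bed_volume_L=120.0, flow_L_per_min=10.0))   # 12-minute EBCT
    print(percent_removal(influent=200.0, effluent=100.0))         # 50% AOC removal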


Perfumes are manufactured by mixing odorous materials with different volatilities. The parameter that measures the lasting property of a material when applied to the skin is called substantivity or tenacity. It is well known by perfumers that citrus and green notes are perceived as fresh and tend to evaporate quickly, while odors most dissimilar to 'fresh' (e.g., oriental, powdery, erogenic and animalic scents) are tenacious. However, studies aimed at quantifying the relationship between fresh odor quality and substantivity have not received much attention. In this work, perceptual olfactory ratings on a fresh scale, estimated in a previous study, were compared with substantivity parameters and antierogenic ratings from the literature. It was found that the correlation between fresh odor character and odorant substantivity is quite strong (r = -0.85). 'Fresh' is sometimes interpreted in perfumery as 'cool' and the opposite of 'warm'. This association suggests that odor freshness might be somehow related to temperature. Assuming that odor perception space was shaped throughout evolution in temperate climates, the results reported here are consistent with the hypothesis that 'fresh' evokes scents typically encountered in the cool season, while 'warm' would be evoked by odors found in nature during summer. This hypothesis is rather simplistic, but it may provide new insight into the perceptual space of scents. © 2013 by the authors; licensee MDPI, Basel, Switzerland.
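The reported r = -0.85 is a Pearson correlation between perceptual freshness ratings and substantivity; the sketch below shows that calculation on invented values. The odorant ratings and substantivities are placeholders, not the study's data.

    import numpy as np

    # Placeholder values for a handful of odorants (not the published data).
    freshness = np.array([8.5, 7.9, 3.1, 2.4, 5.0])        # perceptual "fresh" ratings
    substantivity = np.array([2.0, 3.0, 20.0, 30.0, 8.0])  # hypothetical tenacity scores

    r = np.corrcoef(freshness, substantivity)[0, 1]
    print(f"Pearson r = {r:.2f}")   # strongly negative, in the spirit of the reported r = -0.85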


Capilla C., App Quality
WIT Transactions on Ecology and the Environment | Year: 2012

Meteorological variability must be taken into account when modelling temporal changes in air pollutants in order to evaluate emission reduction strategies. In this paper, nitrogen dioxide (NO2) hourly data are analyzed at two monitoring stations in Valencia (Spain). Meteorologically adjusted nitric oxide (NO) is used as an indicator of traffic density. The low-pass filter developed by Kolmogorov and Zurbenko is used to split the logarithm of NO2 and NO hourly concentrations into long-term, seasonal and short-term components. Meteorological effects are analyzed and removed from the filtered pollutant time series. Forward stepwise regression is employed to select the filtered meteorological variables that explain the most variability. A natural logarithmic transformation is applied to the series of hourly data, and the model for the hourly pollutant concentrations is multiplicative. The dependencies of urban NO2 on the corresponding vehicular emissions and relevant meteorological parameters are non-linear. Long-term components account for only a small fraction of the overall variability of air pollution data. Seasonal and short-term components mask the underlying relationship between NO2 and emissions if the series is studied as a whole. The pollutant temporal components therefore have to be studied separately, due to their different physical and explanatory mechanisms. © 2012 WIT Press.
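The Kolmogorov-Zurbenko (KZ) filter referred to above is simply a centred moving average applied repeatedly; the sketch below decomposes a synthetic hourly log-concentration series into long-term, seasonal, and short-term components. The data, window lengths, and iteration counts are illustrative assumptions, not necessarily those used in the paper.

    import numpy as np
    import pandas as pd

    def kz_filter(x: pd.Series, window: int, iterations: int) -> pd.Series:
        """Kolmogorov-Zurbenko low-pass filter: a centred moving average applied k times."""
        y = x.copy()
        for _ in range(iterations):
            y = y.rolling(window, center=True, min_periods=1).mean()
        return y

    # Synthetic stand-in for three years of hourly NO2 (annual cycle + daily cycle + noise).
    rng = np.random.default_rng(3)
    n = 3 * 365 * 24
    t = np.arange(n)
    no2 = 40 + 10 * np.sin(2 * np.pi * t / (365 * 24)) + 5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, n)
    log_no2 = pd.Series(np.log(np.clip(no2, 1.0, None)))

    baseline = kz_filter(log_no2, window=103, iterations=5)     # long-term + seasonal
    long_term = kz_filter(log_no2, window=8761, iterations=3)   # long-term trend only
    seasonal = baseline - long_term
    short_term = log_no2 - baseline
    print(long_term.var(), seasonal.var(), short_term.var())    # long-term variance is comparatively small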


Computer-implemented methods and systems are disclosed for organizing user reviews, especially computer app reviews, into clusters and ranking the clusters so that the reviews may be more meaningfully analyzed.
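The patented method's specific clustering and ranking criteria are not described here, so the following Python sketch only illustrates the general idea with generic off-the-shelf pieces: TF-IDF vectors, k-means clusters, and clusters ranked by the number of reviews they contain. The reviews, cluster count, and ranking rule are assumptions for the example, not the claimed invention.

    from sklearn.cluster import KMeans
    from sklearn.feature_extraction.text import TfidfVectorizer

    # Made-up app reviews for illustration.
    reviews = [
        "Crashes every time I open the camera",
        "App crashes on startup after the update",
        "Love the new dark mode, looks great",
        "Dark mode is beautiful, nice work",
        "Please add an export to CSV feature",
    ]

    vectors = TfidfVectorizer(stop_words="english").fit_transform(reviews)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

    clusters = {}
    for review, label in zip(reviews, labels):
        clusters.setdefault(label, []).append(review)

    # Rank clusters by size, so the most commonly voiced themes surface first.
    for label, members in sorted(clusters.items(), key=lambda kv: len(kv[1]), reverse=True):
        print(f"cluster {label} ({len(members)} reviews): {members[0]!r} ...")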


News Article | June 21, 2010
Site: techcrunch.com

AppMakr is a service that allows you to make simple, Internet-fed apps for the iPhone, iPod touch, or iPad. Historically, AppMakr has been called a glorified RSS reader but, with the addition of push notifications, a native image viewer, and GeoRSS support, they’re trying to shake that image. GeoRSS is the latest addition to the product and allows content providers to stick a little XML code into their feed to mark posts with GPS coordinates. You can then use iOS’s native map handling to view these posts on Google Maps or within the app itself. We wrote about AppMakr when they first launched, and many found that their apps were rejected for what amounted to over-simplicity. By adding a few new features, the folks at AppMakr hope to make it easier for developers to submit their apps and get them accepted immediately. AppMakr has also added the App Quality Index, which ensures that apps have a minimum of functionality before they’re submitted. Jeremy Caverly, an AppMakr spokesperson, told us that “We’re seeing a lot of success since we’ve started having people publish under their own license as the default. Lots of people are getting their apps accepted immediately and that has a lot to do with the added functionality.” The current pricing scheme at AppMakr is $999 for a full-service, authored app and, for a limited time, free for developers who want to handle the submission process themselves.
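For context, a GeoRSS-Simple location is just an extra element inside an ordinary RSS item; the short sketch below builds one in Python. The feed content and coordinates are invented, and only the standard georss:point element from the public GeoRSS-Simple specification is used.

    import xml.etree.ElementTree as ET

    GEORSS_NS = "http://www.georss.org/georss"
    ET.register_namespace("georss", GEORSS_NS)

    # A minimal RSS item carrying a GeoRSS-Simple point (latitude then longitude).
    item = ET.Element("item")
    ET.SubElement(item, "title").text = "Street fair this weekend"
    ET.SubElement(item, "link").text = "http://example.com/street-fair"
    ET.SubElement(item, f"{{{GEORSS_NS}}}point").text = "45.13 10.02"  # illustrative coordinates

    print(ET.tostring(item, encoding="unicode"))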


News Article | October 2, 2012
Site: thenextweb.com

App Quality Alliance (AQuA), the non-profit organization backed by some of the major players in mobile, has launched what it’s calling the “first ever quality app directory.” Just to recap, AQuA emerged from the ashes of The Unified Testing Initiative back in August, as it sought a “deeper commitment to quality in app development”. It’s run and funded by AT&T, LG, Motorola, Nokia, Oracle, Orange, Samsung and Sony Mobile.

The Quality App Directory has been developed in close consultation with mobile app developers, and claims to be an independent and free resource for developers to upload details of apps – either self-tested or tested by an approved body – to gain accreditation and recognition for their good QA practice. So the idea here is that only apps that meet a certain standard will be included in the directory, but it’s worth stressing that this isn’t another app store – you can’t download apps from here. It’s purely a directory for interested parties to glean further information about an app, and for developers to show that their work has been AQuA verified. It’s like a seal of approval.

It’s also worth noting that there is a little room for wriggling, as developers themselves can sign the terms and conditions, which stipulate they will only upload app details that have met these standards, though AQuA will conduct periodic audits of the self-verified apps. This is all entirely funded by AQuA’s backers, so there’s no cost attached for developers themselves.

While developers can submit self-tested apps based on AQuA’s testing criteria and receive entry-level accreditation, they can also choose one of four test houses to accredit their app for an “independent” accreditation – these are Babel Media, Intertek, Sogeti High Tech and VMC Consulting. The cost of testing an app using one of the independent bodies depends on which body is used and the nature of the app. Where an AQuA member has tested an app itself, the app gains the status of AQuA Member Verified. Developers who achieve consistent accreditation across a series of apps will attain ‘trusted developer’ status.

AQuA, with industry input, has compiled a set of testing criteria for Android, and other platforms will be added in due course. The criteria are focused mainly on usability and application behaviour on the device, but will be expanded to cover other areas such as battery life, privacy and network usage. Interestingly, the accreditation doesn’t look at the usefulness or function of the app in question, so this is entirely about usability and whether it actually works.

“In today’s fast growing app market, the three ingredients for success are innovation, quality in delivery and effective marketing,” says Martin Wrigley, Chairman, AQuA, and Director of Developer Service at Orange. “AQuA has always supported quality in delivery and with the directory we are also supporting the effective marketing of developers’ quality apps.”

A beta trial kicked off in February 2012 with developers and test houses, who flagged a number of points; many of their suggested changes made it into today’s first release.
