Brookfield, CT, United States

Workman J.,Unity Scientific LLC | Workman J.,University of San Diego | Workman J.,Liberty University | Mark H.,Mark Electronics
Spectroscopy (Santa Monica) | Year: 2016

This column addresses the issue of degrees of freedom (df) for regression models. There is some confusion about the use of df in the various calibration and prediction situations: the standard error parameters should be comparable, and they depend on the total number of independent samples, the number of data channels containing information (that is, wavelengths or wavenumbers), and the number of factors or terms in the regression. By convention everyone could simply choose a definition, but there is a more correct choice that should be verified and discussed for each case. The problem lies in computing the standard deviation using different degrees of freedom without a rigorous explanation, and then placing so much emphasis on the actual number derived for the standard error of the estimate (SEE) and the standard error of cross validation (SECV), rather than on the computed confidence intervals. © 2016 Advanstar Communications, Inc. All rights reserved.
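The df issue described above can be made concrete with a minimal sketch. The residual values, the number of terms, and the df conventions below are illustrative assumptions, not taken from the column itself; in particular, the choice of df for SECV is exactly the kind of convention the column says should be verified case by case.

```python
import math

def standard_error(residuals, df):
    """Square root of the residual sum of squares divided by the chosen df."""
    return math.sqrt(sum(r * r for r in residuals) / df)

# Hypothetical residuals from a 3-term regression fit to 10 calibration samples.
residuals = [0.12, -0.08, 0.05, -0.11, 0.07, 0.02, -0.04, 0.09, -0.06, 0.03]
n, k = len(residuals), 3

# SEE conventionally uses df = n - k - 1 (terms plus intercept consume df).
see = standard_error(residuals, n - k - 1)

# SECV residuals come from left-out samples, so one common (but debated)
# convention is df = n; the same residuals are reused here for illustration.
secv = standard_error(residuals, n)
```

Because the two conventions divide the same sum of squares by different df, the reported numbers differ even though the underlying errors are identical, which is why the column argues for emphasizing confidence intervals over the raw SEE/SECV values.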


Workman J.,Unity Scientific LLC | Workman J.,University of San Diego | Workman J.,Liberty University | Mark H.,Mark Electronics
Spectroscopy (Santa Monica) | Year: 2014

Photometric accuracy and precision, expressed as reproducibility and repeatability, respectively, are essential for building consistent large databases over time for use in qualitative searches or quantitative multivariate analysis. If the spectrophotometer in use is inconsistent in terms of linearity and photometric accuracy, the analytical precision and accuracy will be jeopardized over time. Photometric accuracy and linearity drift over time within a single instrument and vary between instruments, creating errors and variation in the accuracy of measurements that use databases collected with different photometric registrations. How do current commercial instruments vary with respect to photometric accuracy and precision over time? What are potential solutions to this challenge? © 2014, Advanstar Communications Inc. All rights reserved.
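The repeatability/reproducibility distinction can be sketched numerically. The absorbance readings and instrument names below are invented for illustration: repeatability is taken as the scatter of back-to-back readings on one instrument, reproducibility as the scatter of mean readings across instruments.

```python
import statistics

# Hypothetical absorbance readings of the same photometric standard
# measured four times on each of three instruments.
readings = {
    "instrument_A": [0.5012, 0.5010, 0.5013, 0.5011],
    "instrument_B": [0.5048, 0.5051, 0.5049, 0.5050],
    "instrument_C": [0.4975, 0.4973, 0.4976, 0.4974],
}

# Repeatability: average within-instrument standard deviation.
repeatability = statistics.mean(statistics.stdev(v) for v in readings.values())

# Reproducibility: standard deviation of the per-instrument means.
means = [statistics.mean(v) for v in readings.values()]
reproducibility = statistics.stdev(means)
```

In this toy data set the between-instrument spread dwarfs the within-instrument spread, which is the scenario the column warns about: databases pooled across instruments inherit the larger reproducibility error.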


Workman Jr. J.,Unity Scientific LLC | Workman Jr. J.,University of San Diego | Workman Jr. J.,Liberty University | Mark H.,Mark Electronics
Spectroscopy (Santa Monica) | Year: 2014

Different units of measurement have different relationships to the spectral values, for reasons having nothing to do with the spectroscopy. The resulting conclusion is the experimental finding that electromagnetic spectroscopy is sensitive to the volume percent (or, strictly speaking, the volume fraction) of materials in a sample. A variety of measurement errors arise from variation in sample presentation and instrumentation, but fundamentally the spectroscopy relates to volume fraction, not weight percent. Both graphical and numeric methods are available for the comparison. It made sense to use both approaches, if for no other reason than that this is standard procedure when performing calibrations with the other algorithms, and we decided to examine our CLS results as closely as is normally done. We started with the numeric approach and computed the root mean square differences and correlation coefficients between the concentration values from the CLS method and the concentration values obtained using other units.
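The numeric comparison described above reduces to two standard figures of merit. The concentration values below are fabricated for illustration; only the RMS-difference and correlation formulas themselves are standard.

```python
import math

def rms_difference(a, b):
    """Root mean square difference between two concentration series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def correlation(a, b):
    """Pearson correlation coefficient between two concentration series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# Hypothetical CLS-predicted concentrations vs. reference volume fractions.
cls_pred = [0.10, 0.21, 0.29, 0.42, 0.49]
reference = [0.10, 0.20, 0.30, 0.40, 0.50]
```

A high correlation combined with a small RMS difference indicates that the CLS values track the reference units closely; a high correlation with a large RMS difference would instead suggest a systematic unit-conversion offset.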


Mark H.,Mark Electronics | Workman J.,Unity Scientific LLC | Workman J.,University of San Diego
Spectroscopy (Santa Monica) | Year: 2015

The science of statistics is concerned with the effects of the random portion of the uncertainty, not least because it can help discern and, even better, calculate the systematic errors so that they can be corrected. Over the years, statisticians have discovered much about the nature and behavior of random errors, and have learned how to specify bounds for what can legitimately be said about data that are subject to these errors. One of the more important findings is that if the data are subject to fluctuations because of random error, then anything you calculate from those data will also be subject to fluctuations. To the statistician, a value calculated from a set of data subject to random fluctuations is known as a statistic. An important difference between samples and statistics is the distribution of the variations. The distribution of values of multiple measurements from a sample, or the distribution of values of measurements from multiple samples, can be almost anything, and is determined by the physics or chemistry (or other applicable discipline) governing the behavior of the samples.
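The point that a statistic computed from fluctuating data itself fluctuates can be demonstrated with a small simulation. The uniform measurement model and sample sizes below are arbitrary assumptions chosen only to make the contrast visible.

```python
import random
import statistics

random.seed(0)

# Individual measurements drawn from a (decidedly non-normal) uniform
# distribution, standing in for whatever distribution the physics dictates.
def measure():
    return random.uniform(0.0, 1.0)

# Raw readings fluctuate with the full spread of the underlying distribution.
single_readings = [measure() for _ in range(2000)]
spread_raw = statistics.stdev(single_readings)

# A statistic (here, the mean of 50 readings) also fluctuates from one
# data set to the next, but with a much narrower spread.
sample_means = [statistics.mean(measure() for _ in range(50)) for _ in range(2000)]
spread_stat = statistics.stdev(sample_means)
```

The simulation shows both halves of the argument: the statistic is itself a random quantity, yet its distribution is far tighter (and closer to normal) than that of the raw measurements, regardless of the shape the physics imposed on the individual readings.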


Workman J.,Unity Scientific LLC | Workman J.,University of San Diego | Workman J.,Liberty University | Mark H.,Mark Electronics
Spectroscopy (Santa Monica) | Year: 2015

When using any regression technique, whether linear or nonlinear, there is a rational process that allows the researcher to select the best model. One question often arises: Which regression method (or model) is better or best when compared to the others? This column discusses a mathematical and rational process that is useful for selecting the best predictive model when using regression methods for spectroscopic quantitative analysis. © 2015 Advanstar Communications, Inc. All rights reserved.
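One common ingredient of such a selection process is comparing candidate models on data they were not fit to. The sketch below uses leave-one-out cross validation of a simple linear calibration; the data pairs, function names, and the choice of a linear model are all illustrative assumptions, not the column's specific procedure.

```python
import math

# Hypothetical calibration pairs (spectral response, concentration).
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8), (5.0, 10.1), (6.0, 11.9)]

def fit_linear(points):
    """Ordinary least-squares slope and intercept."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    slope = (sum((x - mx) * (y - my) for x, y in points)
             / sum((x - mx) ** 2 for x, _ in points))
    return slope, my - slope * mx

def loo_cv_error(points):
    """Leave-one-out cross-validation error of the linear model."""
    residuals = []
    for i, (x, y) in enumerate(points):
        slope, intercept = fit_linear(points[:i] + points[i + 1:])
        residuals.append(y - (slope * x + intercept))
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))
```

Running `loo_cv_error` for each candidate model on the same data gives a directly comparable prediction-error figure, so the model with the lowest cross-validation error can be preferred on a rational rather than ad hoc basis.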
