Statistics and Chemometrics for Analytical Chemistry


Statistics and Chemometrics for Analytical Chemistry, 7th edition, provides a clear, accessible introduction to the main statistical methods used in modern analytical laboratories. It continues to be the ideal companion for students in chemistry and related fields keen to build their understanding of how to conduct high-quality analyses in areas such as the safety of food, water and medicines, environmental monitoring, and chemical manufacturing.

With a focus on the underlying statistical ideas, this book incorporates useful real-world examples, step-by-step explanations and helpful exercises throughout. Features of the new edition include a significant revision of the 'Quality of analytical measurements' chapter, with more detailed coverage of the estimation of measurement uncertainty and the validation of analytical methods, and updated coverage of a range of topics including robust statistics, Bayesian methods, and testing for normality of distribution, plus expanded material on regression and calibration methods.

Additional experimental design methods are also covered, including the increasingly popular optimal designs. Worked examples have been updated throughout to ensure compatibility with the latest versions of Excel and Minitab, and exercises at the end of each chapter allow students to check their understanding and prepare for exams.

The current review is limited in the number of references it can include, which continues to pose a challenge since the number of citations on chemometrics continues to show steady growth.

A large number of citations appears, for example, when the terms pattern recognition and multivariate calibration are used as keywords in a Chemical Abstracts search. This comes as no surprise, since many areas of chemometrics have been assimilated by other disciplines. The extraction of information from chemical data drives research in chemometrics, and growth in this field will continue as long as practitioners of chemometrics continue to solve problems that need to be solved, as opposed to solving problems simply because they can be solved.

There were more papers presented on bioanalytical, biomedical, and nanotechnology subjects than on classical instrumental techniques at the recent Pittsburgh Conference. However, the opposite is true of the chemometric literature, which raises the question: has chemometrics failed to branch out into other areas of chemistry beyond pattern recognition and calibration to become a computational and multivariate branch of chemistry?

Lima (A4), Greensill (A5), and Small (A6) have evaluated and compared the performance of various standardization methods used for the transfer of calibration models between instruments. Using the wavelet transform to preprocess the data and remove noise, Brown (A7) and Galvao (A8) have shown improvements in the robustness of calibration models, thereby circumventing the problem entirely.

It should be noted that although these solutions exist in the literature, their use is not widespread for applied problem solving. At present, the general approach used to tackle this problem involves formulating a fixed mathematical solution to a chaotic system.

In practice, a calibration is performed at a particular time and is expected to hold for an indefinite period. Furthermore, previous solutions to the problem of calibration transfer have focused on the variability between the first-order instrument responses of different instruments when attempting to develop a regression model that quantitates spectral response across instruments.

However, there are other higher order variations between instruments over time that need to be addressed to ensure a successful calibration transfer. These are best tackled by having a set of reference standards that can be used to calibrate all instruments at any moment in time.


With such standards, it becomes a simple matter to check the instrument periodically for drift or unmodeled variation. Chemometrics could then be used to conform the instrument signal to the standard samples, to report how far the current response of the standards lies outside the modeled variance, and to develop methods for updating the models using information about the unmodeled variance.
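To make this concrete, the following is a minimal sketch (in Python, assuming scikit-learn is available) of such a periodic check: a PCA model is built from historical reference-standard spectra, and each new measurement of the standards is scored against it with a Hotelling-type T-squared statistic and a Q residual. The array names, number of components, and 95% limit are illustrative assumptions, not details taken from the review.

```python
# Sketch of a PCA-based drift check on reference-standard spectra.
# `historical` is an (n_samples x n_wavelengths) array of spectra collected
# while the instrument was known to be in control; the component count and
# the 95% Q limit are illustrative choices.
import numpy as np
from sklearn.decomposition import PCA

def fit_control_model(historical, n_components=3):
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(historical)
    score_var = scores.var(axis=0, ddof=1)
    residuals = historical - pca.inverse_transform(scores)
    q_limit = np.percentile((residuals ** 2).sum(axis=1), 95)
    return pca, score_var, q_limit

def check_standard(pca, score_var, q_limit, spectrum):
    t = pca.transform(spectrum.reshape(1, -1))[0]
    t2 = np.sum(t ** 2 / score_var)              # distance within the model plane
    recon = pca.inverse_transform(t.reshape(1, -1))[0]
    q = np.sum((spectrum - recon) ** 2)          # residual off the model plane
    return t2, q, bool(q > q_limit)              # flag unmodeled variation
```

In practice the control limits would be set from the known distribution of the in-control data rather than a simple percentile.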

Current methods of calibration transfer rarely address these issues, which are crucial when applying multivariate techniques to spectroscopic data for accurate quantitative analysis over time. At the other extreme, computer-aided molecular design, including quantitative structure–activity relationship (QSAR)-type techniques, needs to become more of a focus area in chemometrics. Practitioners of chemometrics bring mathematical, statistical, and chemical expertise to bear on complex problems.

For QSAR, issues that need to be investigated include molecular descriptor generation, improved algorithms for multivariate analysis, and libraries of molecular properties. These different components need to be systematized and centralized in a single core facility. They also need to be extended from spectroscopic and reactivity properties of molecules to toxicity, mutagenicity, and other structure–activity relationships (SARs). The field of chemoinformatics encompasses the analysis, visualization, and use of chemical information as a surrogate variable for other data or information.

Using multivariate analysis techniques such as principal component analysis or partial least squares, nonionic organic pesticides can be partitioned into different environmental compartments based on their physicochemical properties (A13), compounds for drug development can be optimized (A14), and catalysts can be designed using quantum mechanical parameters as molecular descriptors for the formulation of a QSAR (A). During this reporting period, there have been a few papers published on the development of new molecular descriptors for 2-D and 3-D QSAR (A16, A17) and on new algorithms for multivariate analysis (A18, A). During the same period in which chemoinformatics evolved, chemometrics focused primarily on process analytical chemistry.
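As a rough illustration of the QSAR-type workflow referred to here, the sketch below regresses an activity value onto a small table of molecular descriptors with partial least squares; the descriptor names, values, and component count are invented for illustration and are not taken from the cited studies.

```python
# Minimal QSAR-style sketch: regress an activity value onto molecular
# descriptors with partial least squares. The descriptors and values are
# invented placeholders, not data from any of the cited papers.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# rows: compounds; columns: e.g. logP, molar refractivity, dipole moment, HOMO energy
X = np.array([
    [1.2, 25.3, 1.8, -9.1],
    [2.4, 30.1, 2.2, -8.7],
    [0.8, 22.4, 1.1, -9.4],
    [3.1, 35.6, 2.9, -8.3],
    [1.9, 28.0, 2.0, -8.9],
    [2.7, 33.2, 2.5, -8.5],
])
y = np.array([0.35, 0.61, 0.22, 0.88, 0.50, 0.74])   # measured activity

pls = PLSRegression(n_components=2)
pls.fit(X, y)
print("calibration R^2:", round(pls.score(X, y), 3))
print("descriptor coefficients:", pls.coef_.ravel())
```

In practice such a model would be validated by cross-validation or an external test set before any of the coefficients are interpreted.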

The expectation was that engineers would depart from their traditional approach and accept the use of sensors and multivariate analysis for process monitoring and control. Although chemometrics has made some progress in this field, the vision of an engineering discipline rooted in chemometrics has not been realized. Furthermore, analytical scientists and chemometricians have been relegated to service departments in most pharmaceutical, chemical, and biotechnology organizations, rather than being treated as key contributors to experimental discovery, new product development, and process optimization.

Meanwhile, new problems and challenges appeared but were largely ignored by most practitioners of chemometrics because of their ties to the more mundane problems found in process analytical chemistry, medicinal chemistry, or biotechnology.

What are the important problems associated with chemometrics and therefore multivariate thinking? From the broadest perspective, these problems fall into several general categories or topics.

Such a list of important problems might include calibration and calibration transfer, signal processing and digital filtering, second and higher order data processing, machine learning, propagation of uncertainty in machine learning, image enhancement, hyperspectral image analysis, computer-aided molecular design such as QSAR and other in silico techniques, and data fusion.

It is important to note that chemometric research groups today often collaborate with psychometricians, bioinformaticians, statisticians, chemical engineers, and electrical and computer engineering groups. The next review article will be used to address the details of new algorithmic and mathematical approaches in the field. For this review, only applications with dramatic significance and at least a modicum of published research papers have been selected for inclusion.

During this reporting period, we observed that a number of Web-based resources exist to support chemometrics. However, the Umea website needs some updating of its 44 web links.

One book surveys the application of genetic algorithms and neural networks.


Both theoretical and applied aspects of these chemometric tools are described. In this book, the author has focused on the main concepts of chemometrics, which he defines as experimental design, signal processing, pattern recognition, calibration, and evolutionary data.

He explores the basic principles and applications of these concepts through problem solving. The text has worked examples to demonstrate chemometric concepts.

There are 54 problems included along with relevant appendixes in subjects such as matrix algebra, statistics, and commonly used algorithms. The aim of the text is to provide the reader with a guide to the field of multivariate calibration and classification through a careful survey of the literature.

Topics that are covered and treated at length include data preprocessing techniques. Chemometrics software continues to proliferate, particularly packages containing algorithms intended for broad applications.

Many of these packages are rooted in traditional chemometric history and have enjoyed a decade or more of critical use. There are also several journals dedicated to the mechanics and logic of chemometrics.

Other journals that cover chemometrics in their general editorial scope but are more focused on applications include the following: Environmetrics, Analytical Chemistry, Analytical Letters, and Analytica Chimica Acta.

Much of the effort in chemometrics has been directed toward exploiting other literatures in an effort to find an existing method that might solve a chemical problem of interest.

In the past, the invention of new methods has not played a large role in the field. Recently, both chemistry and biology have begun to evolve into data-rich fields, thereby opening up the possibility of data-driven research. This, in turn, has led to a new approach for solving scientific problems, which consists of four interrelated steps: (1) measure a phenomenon or process using instrumentation that generates data effortlessly and inexpensively, (2) analyze the multivariate data, (3) iterate if necessary, and (4) create and test a model that will provide fundamental multivariate understanding of the process being investigated.

This new approach to scientific problem solving constitutes a true paradigm shift, since multiple experimentation and chemometrics are used as a vehicle to investigate the world from a multivariate perspective. Mathematics is not used for modeling per se but rather for discovery; it serves as a data microscope to sort, probe, and look for hidden relationships in data. In the previous review, we discussed the implications of this new paradigm for discovery and cited examples of its use in fields other than chemistry.

During this reporting period, we have observed that chemists are beginning to take advantage of this new approach to problem solving because of high-throughput experimentation, which is a result of the proliferation of microreactors.

They allow complex chemistries to be conducted at a miniature scale at relatively low temperatures. The favorable kinetics and increased yield and efficiency of microreactor systems promise a bold change in the traditional chemical and pharmaceutical manufacturing processes.

Potyrailo (A20) has shown that conditions for polymerization reactions can be optimized. A 96-microreactor array for combinatorial screening of new catalysts was used, and the properties of the polymer measured in situ were correlated to the polymer formulation and reaction conditions using an appropriate multivariate optimization function.

Strategies for developing new high-throughput screening tools and multivariate methods for the prediction of material properties and the determination of the factors contributing to combinatorial-scale chemical reactions have been discussed by Potyrailo (A21) and Tuchbreiter (A). Chemometrics is an application-driven field.

Any review of this field cannot and should not be formulated without focusing on so-called novel or exciting applications.

Therefore, this review has been divided into three sections with each section corresponding to an application area that has been judged to be exciting or hot. The criteria used to select these application areas are based in part on the number of literature citations uncovered during the search and in part on the perceived impact that developments in these areas will have on chemometrics and analytical chemistry.

The three application areas highlighted in this review are image analysis, sensors, and microarrays. Two of the three areas were highlighted in the previous review.

Image analysis attempts to exploit the power gained by interfacing human perception with cameras and imaging systems. It is the interface between data and the human operator.

Insight into chemical and physical phenomena can be garnered, and the currently superior pattern recognition of humans relative to computers provides a strong argument for developing chemometric tools for imaging. These include tools for the interpretation, creation, or extraction of virtual images from real data, data compression and display, image enhancement, and three-dimensional views into structures and mixtures.

Chemometrics has an even greater potential to improve sensor performance than miniaturization of hardware. Fast computations combined with multivariate sensor data can provide the user with continuous feedback control information for both the sensor and process diagnostics.

The sensor can literally become a self-diagnosing entity, flagging unusual data that arise from a variety of sources including sensor malfunction, process disruption, unusual events, or sampling issues. Microarrays have allowed the expression levels of thousands of genes or proteins to be measured simultaneously.

Data sets generated by these arrays consist of a small number of observations and a very large number of variables. The observations in these data sets often have other attributes associated with them, such as a class label denoting the pathology of the subject. Finding genes or proteins that are correlated to these attributes is often a difficult task, since most of the variables do not contain information about the pathology and as such can mask the identity of the relevant features.

The development of better algorithms to analyze and to visualize expression data and to integrate it with other information is crucial to making expression data more amenable to interpretation. We would like to be able to analyze the large arrays of data from a microarray experiment at an intermediate level using pattern recognition techniques for interpretation. At the very least, such an analysis could identify those genes worthy of further study among the thousands of genes already known.
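One common first pass at this problem is sketched below: rank each gene by a per-gene test statistic between two class labels and keep only the top-ranked genes for subsequent pattern recognition. The simulated data, labels, and cutoff are invented for illustration.

```python
# Toy sketch: rank genes by how well they separate two pathology classes,
# then retain only the top-ranked genes for downstream pattern recognition.
# The simulated expression matrix, labels, and cutoff are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_samples, n_genes = 40, 5000
X = rng.normal(size=(n_samples, n_genes))      # expression matrix
y = np.array([0] * 20 + [1] * 20)              # pathology labels
X[y == 1, :25] += 1.5                          # 25 genes carry genuine signal

# Welch t-test for each gene between the two classes
t_stat, p_val = stats.ttest_ind(X[y == 0], X[y == 1], equal_var=False, axis=0)
top_genes = np.argsort(p_val)[:50]             # 50 most discriminating genes
print("candidate genes for further study:", sorted(top_genes[:10]))
```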

Other potential focal topics that will not be treated in great detail in this review but are worthy of mention include estimation of kinetic rate constants, protein folding, DNA hybridization, and metabonomics.

Brereton (A23) discusses the relative merits of different approaches to estimating kinetic rate constants. Smilde (A24) investigated constrained least squares as one approach to improving the accuracy of the estimation and concluded that using constraints does not necessarily result in an improvement in the accuracy of the rate constant estimate.

Rutan (A25) was able to resolve the reactant, product, and intermediate spectra and determine the rate constant for the degradation of a herbicide using NMR and alternating least squares. Olivieri (A26) used both alternating least squares and parallel factor analysis to determine second-order rate constants for two pesticides, carbaryl and chlorpyrifos.

Using iterative target testing factor analysis, Zhu (A27) was able to resolve two-way kinetic spectral data. Tauler (AA31) applied multivariate curve resolution with alternating least squares to study intermediate species in protein-folding processes, to monitor temperature-dependent protein structural transitions, and to study nucleic acid melting and salt-induced transitions.

Rutan (A32) and Kvalheim (A33) studied the self-association of alcohols (methanol, propanol, butanol, pentanol, hexanol, heptanol) by infrared and Raman spectroscopy, using alternating least squares, evolving factor analysis, iterative target testing factor analysis, and orthogonal projection to resolve the spectra and determine concentration profiles as a function of composition.
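A bare-bones sketch of the alternating least squares idea underlying several of the studies above is shown below: given a data matrix D that is approximately C S^T, the algorithm alternately solves for the concentration profiles C and the spectra S under a non-negativity constraint. This is a generic illustration; the cited papers use additional constraints (closure, unimodality, kinetic hard models) and more careful initializations.

```python
# Bare-bones alternating least squares (MCR-ALS) sketch: D ~ C @ S.T with
# non-negativity imposed by clipping. Generic illustration only; published
# implementations add closure, unimodality, or kinetic-model constraints.
import numpy as np

def mcr_als(D, C_init, n_iter=200):
    """D: (n_times x n_wavelengths) data; C_init: (n_times x n_components) guess."""
    C = C_init.copy()
    for _ in range(n_iter):
        S = np.linalg.lstsq(C, D, rcond=None)[0].T   # spectra given concentrations
        S = np.clip(S, 0.0, None)
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T # concentrations given spectra
        C = np.clip(C, 0.0, None)
    return C, S   # concentration profiles and (n_wavelengths x n_components) spectra
```

The initial estimate of C is typically obtained from something like evolving factor analysis or from simple kinetic profiles.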

Metabonomics, which is a rapidly emerging field of research combining sophisticated analytical instrumentation such as NMR with multivariate statistical analysis to generate complex metabolic profiles of biofluids and tissues, received considerable attention during this reporting period as evidenced by the large number of publications on this subject.

There were several reviews (AA36) published on the chemometric contributions to the evolution of the field, with emphasis on characterizing and interpreting complex biological NMR data using pattern recognition techniques. Defernez (A37) used principal component analysis to investigate whether there are factors that may affect the NMR spectra in a way that subsequently decreases the robustness of the metabolic fingerprint.

Nicholson (A38) showed that discriminant PLS with orthogonal signal correction was effective at removing confounding variation obscuring subtle changes in NMR profile data. Holmes (A39) also discussed multivariate techniques that may be useful for minimizing confounding biological and analytical noise present in the metabolic data. The analytical reproducibility of proton NMR for metabolic fingerprinting was investigated by Nicholson (A40), who used principal component analysis to evaluate the effect that different spectrometers at different operating frequencies had on the observed profiles.

During this reporting period, there were several unique and innovative applications of chemometrics that do not fit in a particular category but should be reported to the community. They include the use of principal components to reduce the combinatorial explosion of possibilities in the conformational analysis of organic molecules (A41); the monitoring of the conservation state of wooden boards from the 16th century (A42), based on periodically collected Raman spectra; the assessment of the structural similarity of G-protein coupled receptors using principal property descriptors to characterize their amino acid sequences (A43, A44); and the use of wavelets and principal component analysis to eliminate instrumental variation in peptide maps obtained by liquid chromatography (A).

Data sets generated by chemical imaging are large, are multivariate, and require significant processing.

Review articles on near-infrared and Raman spectroscopy for chemical imaging have appeared in the literature (B1, B2). Segmentation and classification tasks can be impeded by the high dimensionality of the data. Willse (B3) proposes multivariate methods based on Poisson and multinomial mixture models to segment SIMS images into chemically homogeneous regions.
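To make the segmentation idea concrete, the sketch below clusters the pixel spectra of a multivariate image into chemically similar regions. Willse's work uses Poisson and multinomial mixture models suited to ion-count data; a Gaussian mixture is used here only as a readily available stand-in, and the array names are hypothetical.

```python
# Simplified stand-in for mixture-model image segmentation: cluster the pixel
# spectra of a multivariate image into regions. Willse (B3) uses Poisson and
# multinomial mixtures for ion-count data; a Gaussian mixture is shown here
# only as an accessible analogue. `image` is a hypothetical array.
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_image(image, n_regions=3):
    """image: (height, width, n_channels) array of pixel spectra."""
    h, w, c = image.shape
    pixels = image.reshape(-1, c)                     # unfold to (pixels x channels)
    labels = GaussianMixture(n_components=n_regions,
                             random_state=0).fit_predict(pixels)
    return labels.reshape(h, w)                       # segmentation map
```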

Fulghum (B4) demonstrates that additional information can be obtained from XPS imaging data when multivariate methods are applied. Ruckebusch (B5) discussed the use of time-resolved step-scan FT-IR and chemometrics to study the photocycle of bacteriorhodopsin.

Three-dimensional data recorded over time were suitably unfolded and studied using principal component analysis, evolving factor analysis, and multivariate curve resolution. Transient intermediates formed in the time domain were identified. Alternating least squares was used by Sum (B6) to extract concentration profiles and individual spectra from FT-IR images of in situ plant tissue.
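The "unfolding" step mentioned above is essentially a reshape of the three-way data set so that familiar two-way methods can be applied; a minimal sketch with hypothetical dimensions is shown below.

```python
# Minimal sketch of unfolding a three-way data set for two-way analysis.
# `cube` stands in for a (n_times x n_pixels x n_wavenumbers) measurement.
import numpy as np
from sklearn.decomposition import PCA

cube = np.random.rand(50, 64, 200)                  # placeholder data
n_times, n_pixels, n_wavenumbers = cube.shape

# unfold so that every (time, pixel) combination becomes one row
unfolded = cube.reshape(n_times * n_pixels, n_wavenumbers)

scores = PCA(n_components=4).fit_transform(unfolded)
# refold the scores so each component can be inspected as a time-resolved map
score_maps = scores.reshape(n_times, n_pixels, 4)
```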

Hancewicz (B7) discusses the use of confocal Raman spectroscopy and self-modeling curve resolution to measure the concentrations of phase-separated biopolymers in foods.

The use of chemometrics to analyze descriptive image information in pharmaceutical powder technology and pharmaceutical process control has been investigated by Laitinen (B8) and Tauler (B9). Multivariate curve resolution has played an important role in analyzing image data. Tauler (B10) has reviewed the contribution of this methodology to unraveling multicomponent processes and mixtures from images. The influence of selectivity and sensitivity on detection limits in multivariate curve resolution, using iterative target testing factor analysis as the specific method studied, has been treated by Rodriguez-Cuesta (B). Duponchel (B12) has investigated the influence of instrumental perturbations on the performance of widely used multivariate curve resolution methods.

Van Benthem (B13) has reviewed the effect of equality constraints on the performance of alternating least squares. Hopke (B14) describes the development of a new convergence criterion for multivariate curve resolution, and Lavine (B15) describes a new method to perform multivariate curve resolution based on a Varimax extended rotation.

Visser (B16) presents an information-theoretical framework that can be used to extract pure component spectra from images without prior knowledge of the system under investigation, and Sin (B17) discusses a new spectral reconstruction algorithm based on maximum entropy.

Larsen (B18) discusses the use of maximum autocorrelation factors to extract information from images where there is an ordering of objects. Multiway methods also play an important role in the analysis of image data.


Esbensen (B19) provides an overview of multiway methods. Object-oriented data modeling (B20), which can provide a framework for multiway methods based on the PLS paradigm, is treated by Esbensen in a separate publication. Smilde (B21) also offers a framework for sequential multiblock component methods to study complex data sets. Rutan (B22) describes an improvement in the three-way alternating least squares multivariate curve resolution algorithm that makes use of the recently introduced multidimensional arrays of MATLAB.

Gurden (B23) discusses principal component analysis and parallel factor analysis for the analysis of both single images and movies, with the similarities and differences between the two methods highlighted. A problem in multiway analysis is the estimation of chemical rank. Xie describes two approaches for tackling this problem: two-mode subspace comparison (B24) and principal norm vector orthogonal projection (B). Jack-knife techniques for the detection of outliers, which can be deleterious to the performance of parallel factor analysis and related methods, are described by Bro (B).

SENSORS

During this reporting period, there have been a large number of papers published on the applications of chemometrics to sensors.

A brief survey of the more interesting applications is provided in this section. Many of the applications have focused on the detection of biological organisms. Fry (C1) has developed a microporous polyethylene disposable optical film, mostly transparent to IR light, to characterize bacterial strains by FT-IR for subsequent classification by principal component analysis and hierarchical clustering. Bacterial cultures are harvested and placed onto the film, where they are allowed to dry.

Goodacre (C2) showed that surface-enhanced Raman spectroscopy (SERS) on colloidal silver could be used to fingerprint whole bacteria and fungi. Discriminant analysis and hierarchical clustering identified patterns in the Raman spectra characteristic of the strain level of the particular organism. Raman spectra and pattern recognition techniques were also used to differentiate basal cell carcinoma from its surrounding noncancerous tissue (C3) and to identify epithelial cancer cells (C4).

Microorganisms on food surfaces could be differentiated using Fourier transform IR (C5). A Mahalanobis distance metric was used to evaluate and quantify the statistical differences in the spectra of six different microorganisms. Sensor applications involving the detection of specific compounds focused on sugars. Ben-Amotz (C6) demonstrated the feasibility of using Raman spectroscopy and PLS for the classification and quantitation of oligosaccharides.
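A short sketch of the Mahalanobis-distance idea is given below: a test spectrum is compared against a reference class of spectra, and in a classification setting it would be assigned to whichever microorganism class gives the smallest distance. The arrays are hypothetical, and in practice the spectra are usually compressed (for example by PCA) first so that the covariance matrix is well conditioned.

```python
# Sketch of a Mahalanobis distance between a test spectrum and a class of
# reference spectra. Arrays are hypothetical; real spectra are usually
# compressed (e.g. by PCA) first so the covariance matrix is well conditioned.
import numpy as np

def mahalanobis_to_class(test_spectrum, class_spectra):
    mean = class_spectra.mean(axis=0)
    cov = np.cov(class_spectra, rowvar=False)
    diff = test_spectrum - mean
    return float(np.sqrt(diff @ np.linalg.pinv(cov) @ diff))

# classify by computing the distance to each microorganism class and
# assigning the spectrum to the class with the smallest distance
```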

Potentiometric assays were also developed to detect saccharides. PLS and multiple linear regression analysis were used to quantitate the responses of a potentiometric sensor array on a lab-on-a-chip device, with a high reported correlation coefficient.

A glucose biosensor based on SERS was developed that relies on an alkanethiolate monolayer that acts as a partition layer, preconcentrating the glucose. Chemometric analysis of the captured SERS spectra reveals that glucose can be reliably quantitated at physiological levels (C8). Noninvasive glucose monitoring with NIR diffuse reflectance spectroscopy remains an active, yet controversial, research area with a large and growing literature.

Li (C9) has recently reviewed the instrumental precision required to achieve this goal, as well as the biological complexity of the problem.

An alternative to noninvasive glucose monitoring is to monitor changes in tear proteins from diabetic patients. Using electrophoretic methods, changes in protein patterns may contain information about glucose levels, based on the Wilks lambda test (C). Despite the large number of failed attempts to solve the noninvasive glucose problem for insulin dosing, this commercially and medically important application continues to receive funding due to its market attractiveness for investors.

Many of the citations on the application of chemometrics to sensors have focused on improving sensor performance. Brown (C11) was able to use wavelet analysis to remove a nonconstant, varying spectroscopic background from near-IR data, leading to a simpler and more parsimonious multivariate linear model.

Signal denoising and baseline correction using discrete wavelets were also demonstrated in a study on microchip electrophoresis. Liu (C12) was able to show that baseline drift, a frequently occurring problem with chip devices, can be circumvented.

The fast wavelet transform through the WILMA algorithm has also been coupled with multiple linear regression analysis and partial least squares for the selection of optimal regression models.

Using this approach, Cocchi (C13) was able to improve the predictive ability of regression models: the wavelet coefficients that primarily contained noise were discarded, and the remaining coefficients were used for spectral reconstruction.
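A generic wavelet-denoising sketch is shown below to illustrate the discard-and-reconstruct idea: decompose the signal, shrink the small detail coefficients that are mostly noise, and reconstruct. It uses the PyWavelets package, and the universal-threshold rule is a common default rather than the WILMA procedure used in the cited work.

```python
# Generic wavelet denoising (not the WILMA algorithm of the cited work):
# decompose, soft-threshold the detail coefficients, and reconstruct.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # universal threshold estimated from the finest-scale detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(signal)]
```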


There were other approaches taken during this reporting period to improve the signal-to-noise ratio of the data. Martens (C14) developed a method to prewhiten spectra, which makes the instrument blind to certain interferences while retaining its analytical sensitivity. The method consists of shrinking the multidimensional data space of the spectra in the off-axis directions corresponding to the spectra of the interferences: a nuisance covariance matrix is developed, and each spectrum is multiplied by the square root of that matrix.
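The sketch below illustrates the prewhitening idea with hypothetical array names. The nuisance covariance is built from the interference spectra plus a small ridge term; note that the review's wording says "the square root", whereas the usual prewhitening convention for shrinking the interference directions is the inverse matrix square root, which is what is applied here.

```python
# Sketch of prewhitening spectra against known interference spectra. The
# nuisance covariance is regularized with a small ridge term, and spectra are
# multiplied by its inverse matrix square root so that the interference
# directions are shrunk. Array names and the ridge value are illustrative.
import numpy as np

def prewhiten(spectra, interference_spectra, ridge=1e-3):
    """spectra: (n_samples x n_channels); interference_spectra: (k x n_channels)."""
    n_channels = spectra.shape[1]
    nuisance_cov = interference_spectra.T @ interference_spectra
    nuisance_cov += ridge * np.eye(n_channels)       # keep the matrix invertible
    vals, vecs = np.linalg.eigh(nuisance_cov)
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T        # inverse matrix square root
    return spectra @ W                               # shrink interference directions
```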

Vogt (C15) proposed the idea of secured principal components for the detection and correction of calibration models that fail because of uncalibrated spectral features. The proposed algorithm searches for these features and corrects them in the disturbed sample. Esbensen (C16) took a different approach to robustifying a multivariate calibration model.

Piovoso (C17) was concerned about the deleterious effects that multivariate outliers have on a calibration model and focused his attention on outlier replacement in the score space generated by the principal component analysis of the data.