Winner of the 2015 Sugiyama Meiko Award (Publication Award) of the
Behaviormetric Society of Japan Developed by the authors,
generalized structured component analysis is an alternative to two
longstanding approaches to structural equation modeling: covariance
structure analysis and partial least squares path modeling.
Generalized structured component analysis allows researchers to
evaluate the adequacy of a model as a whole, compare a model to
alternative specifications, and conduct complex analyses in a
straightforward manner. Generalized Structured Component Analysis:
A Component-Based Approach to Structural Equation Modeling provides
a detailed account of this novel statistical methodology and its
various extensions. The authors present the theoretical
underpinnings of generalized structured component analysis and
demonstrate how it can be applied to various empirical examples.
The book enables quantitative methodologists, applied researchers,
and practitioners to grasp the basic concepts behind this new
approach and apply it to their own research. The book emphasizes
conceptual discussions throughout while relegating more technical
intricacies to the chapter appendices. Most chapters compare
generalized structured component analysis to partial least squares
path modeling to show how the two component-based approaches differ
when addressing an identical issue. The authors also offer a free,
online software program (GeSCA) and an Excel-based software program
(XLSTAT) for implementing the basic features of generalized
structured component analysis.
In multivariate data analysis, regression techniques predict one
set of variables from another while principal component analysis
(PCA) finds a subspace of minimal dimensionality that captures the
largest variability in the data. How can regression analysis and
PCA be combined? Why and when is combining them a good idea, and
what benefits does the combination bring?
Addressing these questions, Constrained Principal Component
Analysis and Related Techniques shows how constrained PCA (CPCA)
offers a unified framework for these approaches. The book begins
with four concrete examples of CPCA that provide readers with a
basic understanding of the technique and its applications. It gives
a detailed account of two key mathematical ideas in CPCA:
projection and singular value decomposition. The author then
describes the basic data requirements, models, and analytical tools
for CPCA and their immediate extensions. He also introduces
techniques that are special cases of or closely related to CPCA and
discusses several topics relevant to practical uses of CPCA. The
book concludes with a technique that imposes different constraints
on different dimensions (DCDD), along with its analytical
extensions. MATLAB® programs for CPCA and DCDD as well as data
to create the book's examples are available on the author's
website.
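The core mechanism described above, splitting the data into a part explained by external variables (projection) and then extracting dominant dimensions from that part (SVD), can be illustrated with a minimal NumPy sketch. This is not code from the book; the data and variable names are hypothetical, and it shows only the basic projection-then-SVD idea, not the full CPCA methodology.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((100, 6))   # data matrix: 100 cases, 6 variables
G = rng.standard_normal((100, 2))   # external (constraint) variables

# Step 1 (projection): regress Z on G by least squares, splitting Z
# into a part explained by G and a residual part orthogonal to G.
B, *_ = np.linalg.lstsq(G, Z, rcond=None)
Z_hat = G @ B            # projection of Z onto the column space of G
Z_res = Z - Z_hat        # part of Z unrelated to G

# Step 2 (SVD): apply PCA via the SVD to the projected part, giving
# components of Z constrained to lie in the space spanned by G.
U, s, Vt = np.linalg.svd(Z_hat, full_matrices=False)
components = U[:, :2] * s[:2]   # constrained component scores

# Sanity checks: the two parts reconstruct Z, and the residual
# part is orthogonal to the external variables.
assert np.allclose(Z_hat + Z_res, Z)
assert np.allclose(G.T @ Z_res, 0, atol=1e-10)
```

Because `Z_hat` has rank at most 2 here (the number of external variables), only the first two singular values are nonzero, which is why two component scores are retained.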
Aside from distribution theory, projections and the singular value
decomposition (SVD) are the two most important concepts for
understanding the basic mechanism of multivariate analysis. The
former underlies the least squares estimation in regression
analysis, which is essentially a projection of one subspace onto
another, and the latter underlies principal component analysis,
which seeks to find a subspace that captures the largest
variability in the original space. This book is about projections
and SVD. A thorough discussion of generalized inverse (g-inverse)
matrices is also given because it is closely related to the former.
The book provides systematic and in-depth accounts of these
concepts from a unified viewpoint of linear transformations on
finite-dimensional vector spaces. More specifically, it shows that
projection matrices (projectors) and g-inverse matrices can be
defined in various ways so that a vector space is decomposed into a
direct sum of (disjoint) subspaces. Projection Matrices, Generalized Inverse
Matrices, and Singular Value Decomposition will be useful for
researchers, practitioners, and students in applied mathematics,
statistics, engineering, behaviormetrics, and other fields.
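The relationships sketched above, least squares as a projection, PCA as an SVD, and the link between projectors and g-inverses, can be checked numerically. The following NumPy sketch is illustrative only (random data, hypothetical names) and assumes the design matrix has full column rank.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))   # design matrix, full column rank
y = rng.standard_normal(50)

# Projection: the hat matrix P = X (X'X)^{-1} X' projects y onto the
# column space of X; P y equals the least squares fitted values.
P = X @ np.linalg.inv(X.T @ X) @ X.T
fitted = P @ y
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(fitted, X @ beta)

# A projector is idempotent (and here symmetric, i.e. orthogonal).
assert np.allclose(P @ P, P)

# g-inverse connection: with the Moore-Penrose inverse X^+,
# the same projector is obtained as P = X X^+.
assert np.allclose(X @ np.linalg.pinv(X), P)

# SVD: the right singular vectors of the centered data are the
# principal axes; the squared singular values give the variance
# captured along each axis.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = s**2 / (len(X) - 1)
```

The idempotence check (`P @ P == P`) is exactly the direct-sum decomposition in matrix form: applying the projection a second time changes nothing, since the image and its complement split the space.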