For more than thirty years the senior author has been trying to
learn algebraic geometry. In the process he discovered that many of
the classic textbooks in algebraic geometry require substantial
knowledge of cohomology, homological algebra, and sheaf theory. In
an attempt to demystify these abstract concepts and facilitate
understanding for a new generation of mathematicians, he, along with
his co-author, wrote this book for an audience that is familiar with
the basic concepts of linear and abstract algebra but has had no
prior exposure to algebraic geometry or homological algebra. As
such, the book consists of two parts. The first part gives a
crash-course on the homological and cohomological aspects of
algebraic topology, with a bias in favor of cohomology. The second
part is devoted to presheaves, sheaves, Čech cohomology, derived
functors, sheaf cohomology, and spectral sequences. All important
concepts are intuitively motivated, and the proofs of the central
theorems are presented in a level of detail rarely found in
standard texts.
This book provides the mathematical fundamentals of linear algebra
to practitioners in computer vision, machine learning, robotics,
applied mathematics, and electrical engineering. Assuming only a
knowledge of calculus, the authors develop, in a rigorous yet
down-to-earth manner, the mathematical theory behind concepts such
as vector spaces, bases, linear maps, duality, Hermitian spaces,
the spectral theorems, SVD, and the primary decomposition theorem.
Pertinent real-world applications are provided throughout. The book
includes mathematical explanations of the tools used, at a level we
believe is adequate for computer scientists, engineers, and
mathematicians who want to do serious research and make significant
contributions in their respective fields.
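To give a concrete flavor of one of the topics listed above, here is a minimal NumPy sketch of the singular value decomposition (SVD). It is our illustration only, not code from the book, which develops the underlying theory rather than any particular library's API.

```python
import numpy as np

# Minimal illustration (not from the book): NumPy factors a matrix A
# into U @ diag(s) @ Vt, its singular value decomposition.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)                                    # singular values, in descending order
print(np.allclose(A, U @ np.diag(s) @ Vt))  # reconstruction check: True
```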
Volume 2 applies the linear algebra concepts presented in Volume 1
to optimization problems that frequently occur throughout machine
learning. This book blends theory with practice by not only
carefully discussing the mathematical underpinnings of each
optimization technique but also applying these techniques to linear
programming, support vector machines (SVM), principal component
analysis (PCA), and ridge regression. Volume 2 begins by discussing
preliminary concepts of optimization theory such as metric spaces,
derivatives, and the Lagrange multiplier technique for finding
extrema of real-valued functions. The focus then shifts to the
special case of optimizing a linear function over a region
determined by affine constraints, namely linear programming.
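As a rough illustration of the kind of problem those chapters treat, here is a toy linear program solved with scipy.optimize.linprog. This example is ours, written in Python rather than the book's MATLAB, and is not an excerpt from the text.

```python
import numpy as np
from scipy.optimize import linprog

# Toy linear program (illustrative only):
# maximize  x + 2y   subject to  x + y <= 4,  x <= 3,  x >= 0,  y >= 0.
# linprog minimizes, so we negate the objective.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0],
                 [1.0, 0.0]])
b_ub = np.array([4.0, 3.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimum (0, 4) with objective value 8
```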
Highlights include careful derivations and applications of the
simplex algorithm, the dual-simplex algorithm, and the primal-dual
algorithm. The theoretical heart of this book is the mathematically
rigorous presentation of various nonlinear optimization methods,
including but not limited to gradient descent, the
Karush-Kuhn-Tucker (KKT) conditions, Lagrangian duality,
alternating direction method of multipliers (ADMM), and the kernel
method. These methods are carefully applied to hard margin SVM,
soft margin SVM, kernel PCA, ridge regression, lasso regression,
and elastic-net regression. MATLAB programs implementing these
methods are included.
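For readers who want a quick, concrete feel for one of the listed techniques, the following is a minimal NumPy sketch of ridge regression via its closed-form normal equations. It is our illustration, not one of the MATLAB programs the book supplies.

```python
import numpy as np

def ridge_regression(X, y, lam):
    """Closed-form ridge estimate: w = (X^T X + lam * I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy data (illustrative only): y depends linearly on two features plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=100)
print(ridge_regression(X, y, lam=1.0))   # approximately [2, -1]
```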