A study of those statistical ideas that use a probability
distribution over parameter space. The first part describes the
axiomatic basis in the concept of coherence and the implications of
this for sampling theory statistics. The second part discusses the
use of Bayesian ideas in many branches of statistics.
This book serves well as an introduction to the more theoretical
aspects of the use of spline models. It develops a theory and
practice for the estimation of functions from noisy data on
functionals. The simplest example is the estimation of a smooth
curve, given noisy observations on a finite number of its values.
The estimate is a polynomial smoothing spline. By placing this
smoothing problem in the setting of reproducing kernel Hilbert
spaces, a theory is developed which includes univariate smoothing
splines, thin plate splines in d dimensions, splines on the sphere,
additive splines, and interaction splines in a single framework. A
straightforward generalization allows the theory to encompass the
very important area of (Tikhonov) regularization methods for
ill-posed inverse problems. Convergence properties, data-based
smoothing parameter selection, confidence intervals, and numerical
methods are established which are appropriate to a wide variety of
problems which fall within this framework. Methods for including
side conditions and other prior information in solving ill-posed
inverse problems are included. Data involving samples of random
variables with Gaussian, Poisson, binomial, and other distributions
are treated in a unified optimization context.
Experimental design questions, i.e., which functionals should be
observed, are studied in a general context. Extensions to
distributed parameter system identification problems are made by
considering implicitly defined functionals.
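As a toy illustration of the regularization ideas described above (an example constructed here, not taken from the book), the simplest instance of Tikhonov regularization is a ridge-penalized polynomial fit to noisy function values; the function name `tikhonov_fit`, the degree, and the penalty `lam` are all illustrative choices:

```python
import numpy as np

def tikhonov_fit(x, y, degree=10, lam=1e-3):
    """Fit a polynomial to noisy data with Tikhonov (ridge) regularization.

    Solves min_c ||A c - y||^2 + lam * ||c||^2, a simple finite-dimensional
    instance of the regularized ill-posed estimation problems above.
    """
    A = np.vander(x, degree + 1)                 # design matrix, powers high-to-low
    lhs = A.T @ A + lam * np.eye(degree + 1)     # normal equations + penalty
    rhs = A.T @ y
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = np.sin(np.pi * x) + 0.05 * rng.standard_normal(50)
c = tikhonov_fit(x, y)
smooth = np.polyval(c, x)
# the regularized fit should track the underlying sine curve closely
print(np.max(np.abs(smooth - np.sin(np.pi * x))))
```

The penalty term stabilizes the otherwise ill-conditioned normal equations, which is the essential mechanism behind the book's treatment of ill-posed inverse problems.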
This monograph deals with aspects of the computer programming
process that involve techniques derived from mathematical logic.
The author focuses on proving that a given program produces the
intended result whenever it halts, that a given program will
eventually halt, that a given program is partially correct and
terminates, and that a system of rewriting rules always halts.
Also, the author describes the intermediate behavior of a given
program, and discusses constructing a program to meet a given
specification.
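The verification tasks listed above can be illustrated on a toy program (a hypothetical example, not one of the author's): a loop annotated with an invariant for partial correctness and a strictly decreasing variant for termination:

```python
def divmod_by_subtraction(a, b):
    """Compute (a // b, a % b) for a >= 0, b > 0 by repeated subtraction.

    Partial correctness: the invariant  a == q * b + r  and  r >= 0  holds
    on every iteration, so if the loop halts with r < b the result is the
    intended quotient and remainder.
    Termination: the variant r is a non-negative integer that strictly
    decreases (by b > 0) on each iteration, so the loop must halt.
    """
    q, r = 0, a
    while r >= b:
        assert a == q * b + r and r >= 0       # invariant checked at runtime
        q, r = q + 1, r - b
    assert a == q * b + r and 0 <= r < b       # postcondition
    return q, r

print(divmod_by_subtraction(17, 5))  # → (3, 2)
```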
A monograph on some of the ways geometry and analysis can be used
in mathematical problems of physical interest. The roles of
symmetry, bifurcation, and Hamiltonian systems in diverse
applications are explored.
This book deals with the mathematical side of the theory of shock
waves. The author presents what is known about the existence and
uniqueness of generalized solutions of the initial value problem
subject to the entropy conditions. The subtle dissipation
introduced by the entropy condition is investigated, and it is
shown to cause a slow decay in signal strength.
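For a scalar conservation law, the entropy condition referred to above is commonly stated as follows (a standard formulation, not a quotation from the book; sign and normalization conventions vary by author):

```latex
% scalar conservation law
u_t + f(u)_x = 0,
% a discontinuity between states u_l and u_r travels with speed s given
% by the Rankine--Hugoniot condition
s\,(u_r - u_l) = f(u_r) - f(u_l),
% and is admissible only if it satisfies the Lax entropy condition
f'(u_l) > s > f'(u_r).
```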
The ideas of Elie Cartan are combined with the tools of Felix Klein
and Sophus Lie to present in this book the only detailed treatment
of the method of equivalence. An algorithmic description of this
method, which finds invariants of geometric objects under infinite
dimensional pseudo-groups, is presented for the first time. As part
of the algorithm, Gardner introduces several major new techniques.
In particular, the use of Cartan's idea of principal components
that appears in his theory of Repère Mobile, and the use of Lie
algebras instead of Lie groups, effectively a linear procedure,
provide a tremendous simplification. One must, however, know how to
convert from one to the other, and the author provides the Rosetta
stone to accomplish this. In complex problems, it is essential to
be able to identify natural blocks in group actions and not just
individual elements, and prior to this publication there was no
reference treating block matrix techniques. The Method of Equivalence and
Its Applications details ten diverse applications including
Lagrangian field theory, control theory, ordinary differential
equations, and Riemannian and conformal geometry. This is the only
book to treat this subject in such depth and to include the
algorithm, the use of principal components, and the use of
infinitesimal analysis on the Lie algebra level. This volume
contains a series of lectures, the purpose of which was to describe
the equivalence algorithm and to show, in particular, how it is
applied to several pedagogical examples and to a problem in control
theory called state estimation of plants under feedback. The
lectures, and hence the book, focus on problems in real geometry.
Here is an in-depth, up-to-date analysis of wave interactions for
general systems of hyperbolic and viscous conservation laws. This
self-contained study of shock waves explains the new wave phenomena
from both a physical and a mathematical standpoint. The analysis is
useful for the study of various physical situations, including
nonlinear elasticity, magnetohydrodynamics, multiphase flows,
combustion, and classical gas dynamics shocks. The central issue
throughout the book is the understanding of nonlinear wave
interactions. The book describes the qualitative theory of shock
waves. It begins with the basics of the theory for scalar
conservation laws and Lax's solution of the Riemann problem. For
hyperbolic conservation laws, the Glimm scheme and wave tracing
techniques are presented and used to study the regularity and
large-time behavior of solutions. Viscous nonlinear waves are
studied via the recent approach to pointwise estimates.
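As a minimal illustration of Lax's solution of the Riemann problem (specialized here to Burgers' equation, f(u) = u²/2, as a sketch constructed for this description rather than taken from the book), the two initial states determine either an entropy-satisfying shock or a rarefaction fan:

```python
def riemann_burgers(ul, ur):
    """Solve the Riemann problem for Burgers' equation u_t + (u^2/2)_x = 0.

    By the Lax entropy condition: ul > ur gives a shock moving at the
    Rankine-Hugoniot speed s = (ul + ur) / 2; ul < ur gives a rarefaction
    fan spreading between the characteristic speeds ul and ur.
    """
    if ul > ur:
        return ("shock", (ul + ur) / 2.0)
    return ("rarefaction", (ul, ur))

print(riemann_burgers(1.0, 0.0))   # → ('shock', 0.5)
print(riemann_burgers(0.0, 1.0))   # → ('rarefaction', (0.0, 1.0))
```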
This monograph presents new and elegant proofs of classical results
and makes difficult results accessible. The integer programming
models known as set packing and set covering have a wide range of
applications. Sometimes, owing to the special structure of the
constraint matrix, the natural linear programming relaxation yields
an optimal solution that is integral, thus solving the problem.
Sometimes, both the linear programming relaxation and its dual have
integral optimal solutions. Under what conditions does such
integrality hold? This question is of both theoretical
and practical interest. Min-max theorems, polyhedral combinatorics,
and graph theory all come together in this rich area of discrete
mathematics. This monograph presents several of these beautiful
results as it introduces mathematicians to this active area of
research. To encourage research on the many intriguing open
problems that remain, Dr. Cornuejols is offering a $5000 prize to
the first paper solving or refuting each of the 18 conjectures
described in the book. To claim one of the prizes mentioned in the
preface, papers must be accepted by a quality refereed journal
(such as Journal of Combinatorial Theory B, Combinatorica, SIAM
Journal on Discrete Mathematics, or others to be determined by Dr.
Cornuejols) before 2020. Claims must be sent to Dr. Cornuejols at
Carnegie Mellon University during his lifetime.
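One classical route to the integrality question above is the Hoffman-Kruskal theorem: if the constraint matrix is totally unimodular (every square submatrix has determinant -1, 0, or 1), then every vertex of the linear programming relaxation is integral. A brute-force check on a small interval matrix, which is known to be totally unimodular, might look like this (an illustrative sketch, not a practical test for large matrices):

```python
import numpy as np
from itertools import combinations

def is_totally_unimodular(A):
    """Check by brute force that every square submatrix of A has
    determinant in {-1, 0, 1} (exponential cost; illustration only)."""
    m, n = A.shape
    for k in range(1, min(m, n) + 1):
        for rows in combinations(range(m), k):
            for cols in combinations(range(n), k):
                d = int(round(np.linalg.det(A[np.ix_(rows, cols)])))
                if d not in (-1, 0, 1):
                    return False
    return True

# interval matrix (consecutive ones in each row): totally unimodular,
# so its set covering LP relaxation has integral optimal vertices
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]])
print(is_totally_unimodular(A))  # → True
```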
As this monograph shows, the purpose of cardinal spline
interpolation is to bridge the gap between the linear spline and
the cardinal series. The author explains cardinal spline functions,
the basic properties of B-splines, including B-splines with
equidistant knots and cardinal splines represented in terms of
B-splines, and exponential Euler splines, leading to the most
important case and central problem of the book: cardinal spline
interpolation, with main results, proofs, and some applications.
Other topics discussed include cardinal Hermite interpolation,
semi-cardinal interpolation, finite spline interpolation problems,
extremum and limit properties, equidistant spline interpolation
applied to approximations of Fourier transforms, and the smoothing
of histograms.
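A basic property of the B-splines with equidistant knots discussed above, the partition of unity, can be checked numerically via the Cox-de Boor recursion (an illustrative sketch; `cardinal_bspline` is a name chosen here, not the book's notation):

```python
import numpy as np

def cardinal_bspline(x, k):
    """Evaluate the cardinal B-spline of order k (degree k - 1) with
    equidistant knots 0, 1, ..., k via the Cox-de Boor recursion."""
    if k == 1:
        return np.where((0 <= x) & (x < 1), 1.0, 0.0)
    return (x * cardinal_bspline(x, k - 1)
            + (k - x) * cardinal_bspline(x - 1, k - 1)) / (k - 1)

# shifted cubic cardinal B-splines sum to one (partition of unity)
x = np.linspace(2.0, 3.0, 11)
total = sum(cardinal_bspline(x - j, 4) for j in range(-4, 5))
print(np.allclose(total, 1.0))  # → True
```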
Addresses external biofluiddynamics, concerning animal locomotion
through surrounding fluid media, and internal biofluiddynamics,
concerning heat and mass transport by fluid flow systems within an
animal.
The soliton is a dramatic concept in nonlinear science. What makes
this book unique in the treatment of this subject is its focus on
the properties that make the soliton physically ubiquitous and the
soliton equation mathematically miraculous. Here, on the classical
level, is the entity field theorists have been postulating for
years: a local traveling wave pulse; a lump-like coherent
structure; the solution of a field equation with remarkable
stability and particle-like properties. It is a fundamental mode of
propagation in gravity-driven surface and internal waves; in
atmospheric waves; in ion acoustic and Langmuir waves in plasmas;
in some laser waves in nonlinear media; and in many biologic
contexts, such as alpha-helix proteins. This is not an encyclopedia
of information on solitons in which every sentence is interrupted
by either a caveat or a reference. Rather, Newell has tried to tell
the story of the soliton as he would have liked to have heard it as
a graduate student, with some historical development, lots of
motivation, and frequent attempts to relate the topic at hand to
the big picture. The book begins with a history of the soliton from
its first sighting to the discovery of the inverse scattering
method and recent ideas on the algebraic structure of soliton
equations. Chapter 2 focuses on the universal nature of these
equations and how and why they arise in physical and engineering
contexts as asymptotic solvability conditions. The third chapter
deals with the inverse scattering method and perturbation theories.
Chapter 4 introduces the τ-function and discusses the relations
between the various methods for constructing solutions to the
soliton equations and their various properties. Finally, an
algebraic structure for the equations is provided in Chapter 5.
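The "local traveling wave pulse" described above can be written down explicitly for the Korteweg-de Vries equation, the prototypical soliton equation (standard formulas; normalizations vary across the literature):

```latex
% Korteweg--de Vries equation
u_t + 6\,u\,u_x + u_{xxx} = 0,
% one-soliton solution: a sech^2 pulse whose amplitude is proportional
% to its speed c, travelling without change of shape
u(x, t) = \frac{c}{2}\,
  \operatorname{sech}^2\!\Bigl(\frac{\sqrt{c}}{2}\,(x - c t - x_0)\Bigr).
```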
A study of sequential nonparametric methods emphasizing the unified
Martingale approach to the theory, with a detailed explanation of
major applications including problems arising in clinical trials,
life-testing experimentation, survival analysis, classical
sequential analysis and other areas of applied statistics and
biostatistics.
A systematic, self-contained treatment of the theory of stochastic
differential equations in infinite dimensional spaces. Included is
a discussion of Schwartz spaces of distributions in relation to
probability theory and infinite dimensional stochastic analysis, as
well as the random variables and stochastic processes that take
values in infinite dimensional spaces.
There has been an explosive growth in the field of combinatorial
algorithms. These algorithms depend not only on results in
combinatorics and especially in graph theory, but also on the
development of new data structures and new techniques for analyzing
algorithms. Four classical problems in network optimization are
covered in detail, including a development of the data structures
they use and an analysis of their running time. Data Structures and
Network Algorithms attempts to provide the reader with both a
practical understanding of the algorithms, described to facilitate
their easy implementation, and an appreciation of the depth and
beauty of the field of graph algorithms.
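As one example of the interplay between network optimization and data structures described above (Dijkstra's shortest-path algorithm, sketched here with Python's built-in binary heap; the book's own presentation and data structures differ):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    The binary heap (priority queue) is the data structure that yields
    an O((V + E) log V) running time for this classical network problem.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # → {'a': 0, 'b': 1, 'c': 3}
```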
Many situations exist in which solutions to problems are
represented as function space integrals. Such representations can
be used to study the qualitative properties of the solutions and to
evaluate them numerically using Monte Carlo methods. The emphasis
in this book is on the behavior of solutions in special situations
when certain parameters get large or small.
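A small numerical illustration of this idea (constructed here, not from the book): the heat-equation solution u(x, t) = E[f(x + W_t)] is a function-space (Wiener) integral over Brownian paths, which a plain Monte Carlo average over Brownian endpoints can estimate:

```python
import numpy as np

def heat_mc(f, x, t, n=200_000, seed=0):
    """Monte Carlo estimate of u(x, t) = E[f(x + W_t)], the solution of
    the heat equation u_t = u_xx / 2 with initial data f."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, np.sqrt(t), n)    # Brownian motion endpoints at time t
    return f(x + w).mean()

# for f(x) = x^2 the exact solution is x^2 + t
est = heat_mc(lambda y: y**2, x=1.0, t=0.5)
print(est)  # close to 1.5
```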
The jackknife and the bootstrap are nonparametric methods for
assessing the errors in a statistical estimation problem. They
provide several advantages over the traditional parametric
approach: the methods are easy to describe and they apply to
arbitrarily complicated situations; distribution assumptions, such
as normality, are never made. This monograph connects the
jackknife, the bootstrap, and many other related ideas such as
cross-validation, random subsampling, and balanced repeated
replications into a unified exposition. The theoretical development
is at an easy mathematical level and is supplemented by a large
number of numerical examples. The methods described in this
monograph form a useful set of tools for the applied statistician.
They are particularly useful in problem areas where complicated
data structures are common, for example, in censoring, missing
data, and highly multivariate situations.
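A minimal sketch of the bootstrap standard error (illustrative code; the function name and defaults are choices made here, not the authors'):

```python
import numpy as np

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Bootstrap estimate of the standard error of a statistic.

    Resamples the data with replacement n_boot times and returns the
    standard deviation of the statistic over the resamples; no
    distributional assumption (e.g. normality) is made.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = [stat(rng.choice(data, size=n, replace=True))
            for _ in range(n_boot)]
    return np.std(reps, ddof=1)

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, 100)
se_mean = bootstrap_se(sample, np.mean)
print(se_mean)  # roughly 1/sqrt(100) = 0.1 for a unit-variance sample
```

The same call works unchanged for statistics with no simple standard-error formula, such as the median, which is the point of the method.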
Presents a coherent body of theory for the derivation of the
sampling distributions of a wide range of test statistics. Emphasis
is on the development of practical techniques. A unified treatment
of the theory is attempted; for example, the author relates the
derivations for tests on the circle and for the two-sample problem
to the basic theory for the one-sample problem on the line. The
Markovian nature of the sample distribution function is stressed,
as it accounts for the elegance of many of the results achieved, as
well as the close relation with parts of the theory of stochastic
processes.
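The sample distribution function at the heart of this theory enters test statistics such as the one-sample Kolmogorov-Smirnov distance, which can be computed directly (an illustrative sketch using standard formulas, not the book's notation):

```python
import numpy as np

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the empirical distribution function and a hypothesized CDF."""
    x = np.sort(sample)
    n = len(x)
    f = cdf(x)
    # the EDF jumps from (i-1)/n to i/n at each ordered observation
    d_plus = np.max(np.arange(1, n + 1) / n - f)
    d_minus = np.max(f - np.arange(0, n) / n)
    return max(d_plus, d_minus)

# a small sample tested against the uniform CDF on [0, 1]
d = ks_statistic(np.array([0.1, 0.4, 0.6, 0.9]), lambda t: t)
print(d)  # → 0.15
```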
Acquaints the specialist in relativity theory with some global
techniques for the treatment of space-times and will provide the
pure mathematician with a way into the subject of general
relativity.
The synergism between the World Wide Web and fiber optics is a
familiar story to researchers of digital communications. Fibers are
the enablers of the rates of information flow that make the
Internet possible. Currently transoceanic optical fiber cables
transmit data at rates that could transfer the contents of a
respectable university library in a few minutes. No other medium is
capable of this rate of transmission at such distances. With the
maturing of mobile portable telephony and the emerging broadband
access market, greater fiber transmission capacity will be
essential in the early 21st century. Since the demand for more
capacity drives the development of new optics-based technologies,
fiber optics therefore remains a vibrant area for research. Knowing
that the basic fiber optic technology is mature means that open
questions are more sharply focused and permit deeper mathematical
content. This book is intended to support and promote
interdisciplinary research in optical fiber communications by
providing essential background in both the physical and
mathematical principles of the discipline. Chapter topics include
the basics of fibers and their construction, fiber modes and the
criterion of single-mode operation, the nonlinear Schrödinger
equation, the variational approach to the analysis of pulse
propagation, and, finally, solitons and some new results on soliton
formation energy thresholds. These chapters are written to be as
independent as possible while taking the reader to the frontiers of
research on fiber optics communications.
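The pulse-propagation and soliton chapters center on the nonlinear Schrödinger equation; in one common normalization (sign conventions differ across the literature, so this is a standard form rather than the book's) the fundamental soliton is:

```latex
% nonlinear Schr\"odinger equation for the pulse envelope A(z, t)
i\,\frac{\partial A}{\partial z}
  + \frac{1}{2}\,\frac{\partial^2 A}{\partial t^2}
  + |A|^2 A = 0,
% fundamental soliton: a sech-shaped pulse that propagates along the
% fiber with only a phase rotation
A(z, t) = \operatorname{sech}(t)\, e^{i z / 2}.
```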
A unified discussion of the formulation and analysis of special
methods of mixed initial boundary-value problems. The focus is on
the development of a new mathematical theory that explains why and
how well spectral methods work. Included are interesting extensions
of the classical numerical analysis.
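A one-line demonstration of the spectral accuracy such a theory explains (an illustrative sketch constructed here, not drawn from the book): differentiating a smooth periodic function by Fourier transform is accurate to near machine precision:

```python
import numpy as np

def fourier_derivative(u):
    """Differentiate a smooth periodic function sampled on a uniform grid
    over [0, 2*pi) by multiplying its Fourier coefficients by i*k.

    For smooth data the error decays faster than any power of the grid
    spacing: the "spectral accuracy" at the heart of spectral methods.
    """
    n = len(u)
    k = np.fft.fftfreq(n, d=1.0 / n)      # integer wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

n = 32
x = 2 * np.pi * np.arange(n) / n
err = np.max(np.abs(fourier_derivative(np.sin(x)) - np.cos(x)))
print(err)  # near machine precision
```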