This book provides insight into and an enhanced appreciation of the
analysis, modeling and control of dynamic systems. The reader is
assumed to be familiar with calculus and physics and to have some
programming skills. The book aims to develop the reader's ability
to interpret the physical significance of mathematical results in
system analysis, and it prepares the reader for more advanced
treatment of the automatic control field. Learning objectives are
performance-oriented, pursued through interactive MATLAB and
SIMULINK software tools, and realistic problems are presented for
analyzing, designing and developing automatic control systems.
Learning with computing tools can reinforce theory and help
students think, analyze and reason in meaningful ways. The book is
complemented with classroom slides and MATLAB and SIMULINK exercise
files that help students focus on the fundamental concepts treated.
In honor of professor and renowned statistician R. Dennis Cook,
this festschrift explores his influential contributions to an array
of statistical disciplines ranging from experimental design and
population genetics to statistical diagnostics and all areas of
regression-related inference and analysis. Since the early 1990s,
Prof. Cook has led the development of dimension reduction
methodology in three distinct but related regression contexts:
envelopes, sufficient dimension reduction (SDR), and regression
graphics. In particular, he has made fundamental and pioneering
contributions to SDR, inventing or co-inventing many popular
dimension reduction methods, such as sliced average variance
estimation, the minimum discrepancy approach, model-free variable
selection, and sufficient dimension reduction subspaces. A prolific
researcher and mentor, Prof. Cook is known for his ability to
identify research problems in statistics that are both challenging
and important, as well as his deep appreciation for the applied
side of statistics. This collection of Prof. Cook's collaborators,
colleagues, friends, and former students reflects the broad array
of his contributions to the research and instructional arenas of
statistics.
Introduction to Mathcad 15, 3/e is ideal for freshman or
introductory courses in engineering and computer science. It
introduces Mathcad's basic mathematical and data analysis functions
(e.g., trigonometric, regression, and interpolation functions)
using easy-to-follow examples, then applies the functions to
examples drawn from emerging or rapidly developing fields in
engineering. ESource (Prentice Hall's Engineering Source) provides
a complete, flexible introductory engineering and computing
program. ESource allows professors to fully customize their
textbooks through the ESource website: professors can not only pick
and choose modules, but also select sections of modules, incorporate
their own materials, and re-paginate and re-index the complete project.
prenhall.com/esource
This is a new type of calculus book: students who master this text will be well versed in calculus and, in addition, possess a useful working knowledge of how to use modern symbolic mathematics software systems for solving problems in calculus. This will equip them with the mathematical competence they need for science and engineering and the competitive workplace. MACSYMA is used as the software in which the example programs and calculations are given. However, with the experience gained from this book, the student will also be able to use any of the other major mathematical software systems, such as AXIOM, MATHEMATICA, MAPLE, DERIVE or REDUCE, for "doing calculus on computers".
This book promotes the experimental mathematics approach in the
context of secondary mathematics curriculum by exploring
mathematical models depending on parameters that were typically
considered advanced in the pre-digital education era. This
approach, by drawing on the power of computers to perform numerical
computations and graphical constructions, stimulates formal
learning of mathematics through making sense of a computational
experiment. It allows one (in the spirit of Freudenthal) to bridge
serious mathematical content and contemporary teaching practice. In
other words, the notion of teaching experiment can be extended to
include a true mathematical experiment. When used appropriately,
the approach creates conditions for collateral learning (in the
spirit of Dewey) to occur including the development of skills
important for engineering applications of mathematics. In the
context of a mathematics teacher education program, the book
addresses a call for the preparation of teachers capable of
utilizing modern technology tools for the modeling-based teaching
of mathematics, with a focus on methods conducive to the improvement
of STEM education as a whole at the secondary level. By the same
token, using the book's pedagogy and its mathematical content in a
pre-college classroom can assist teachers in introducing students
to the ideas that form the foundation of the engineering
profession.
This monograph provides the first comprehensive statistical
account of composite sampling as an ingenious environmental
sampling method for achieving observational economy in a variety
of environmental and ecological studies.
Sampling consists of selection, acquisition, and quantification of
a part of the population. But often what is desirable is not
affordable, and what is affordable is not adequate. How do we deal
with this dilemma? Operationally, composite sampling recognizes the
distinction between selection, acquisition, and quantification. In
certain applications, it is a common experience that the costs of
selection and acquisition are not very high, but the cost of
quantification, or measurement, is substantial. In such
situations, one may select a sample sufficiently large to satisfy
the requirement of representativeness and precision and then, by
combining several sampling units into composites, reduce the cost
of measurement to an affordable level. Thus composite sampling
offers an approach to deal with the classical dilemma of desirable
versus affordable sample sizes, when conventional statistical
methods fail to resolve the problem. Composite sampling, at least
under idealized conditions, incurs no loss of information for
estimating population means. But an important limitation of the
method has been the loss of information on individual sample
values, such as extremely large values. In many of the
situations where individual sample values are of interest or
concern, composite sampling methods can be suitably modified to
retrieve the information on individual sample values that may be
lost due to compositing. In this monograph, we present statistical
solutions to these and other issues that arise in the context of
applications of composite sampling.
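The core claim above (no loss of information for estimating the population mean, at a fraction of the measurement cost, while information on individual values is lost) can be illustrated with a short self-contained Python sketch. The data and the pooling factor below are invented for illustration and are not from the monograph.

```python
import random

random.seed(42)

# Hypothetical contaminant concentrations for 120 selected sampling units.
units = [random.gauss(10.0, 2.0) for _ in range(120)]

k = 6  # number of sampling units pooled into each composite
composites = [sum(units[i:i + k]) / k for i in range(0, len(units), k)]

# Estimating the population mean: under idealized, perfectly mixed
# conditions, compositing loses nothing.
mean_individual = sum(units) / len(units)
mean_composite = sum(composites) / len(composites)
print(abs(mean_individual - mean_composite) < 1e-6)  # True

# But 120 costly measurements are replaced by 20.
print(len(units), len(composites))  # 120 20

# The limitation: the largest individual value is masked by averaging.
print(max(units) > max(composites))  # True
```

The last line shows exactly the limitation the monograph addresses: recovering information about extreme individual values after compositing requires the modified methods it presents.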
Describing novel mathematical concepts for recommendation engines,
Realtime Data Mining: Self-Learning Techniques for Recommendation
Engines features a sound mathematical framework unifying approaches
based on control and learning theories, tensor factorization, and
hierarchical methods. Furthermore, it presents promising results of
numerous experiments on real-world data. The area of realtime data
mining is currently developing at an exceptionally dynamic pace,
and realtime data mining systems are the counterpart of today's
"classic" data mining systems. Whereas the latter learn from
historical data and then use it to deduce necessary actions,
realtime analytics systems learn and act continuously and
autonomously. In the vanguard of these new analytics systems are
recommendation engines. They are principally found on the Internet,
where all information is available in realtime and an immediate
feedback is guaranteed. This monograph appeals to computer
scientists and specialists in machine learning, especially from the
area of recommender systems, because it conveys a new way of
realtime thinking by considering recommendation tasks as
control-theoretic problems. Realtime Data Mining: Self-Learning
Techniques for Recommendation Engines will also interest
application-oriented mathematicians because it consistently
combines some of the most promising mathematical areas, namely
control theory, multilevel approximation, and tensor factorization.
This book focuses on the developing field of building probability
models with the power of symbolic algebra systems. The book combines the
uses of symbolic algebra with probabilistic/stochastic application
and highlights the applications in a variety of contexts. The
research explored in each chapter is unified by the use of A
Probability Programming Language (APPL) to achieve the modeling
objectives. APPL, as a research tool, enables a probabilist or
statistician to explore new ideas, methods, and models.
Furthermore, as an open-source language, it sets the foundation for
future algorithms to augment the original code. Computational
Probability Applications comprises fifteen chapters, each
presenting a specific application of computational probability
using the APPL modeling and computer language. The chapter topics
include using inverse gamma as a survival distribution, linear
approximations of probability density functions, and also
moment-ratio diagrams for univariate distributions. These works
highlight interesting examples, often developed by undergraduate
and graduate students, that can serve as templates for
future work. In addition, this book should appeal to researchers
and practitioners in a range of fields including probability,
statistics, engineering, finance, neuroscience, and economics.
This book presents the best papers from the 1st International
Conference on Mathematical Research for Blockchain Economy (MARBLE)
2019, held in Santorini, Greece. While most blockchain conferences
and forums are dedicated to business applications, product
development or Initial Coin Offering (ICO) launches, this
conference focused on the mathematics behind blockchain to bridge
the gap between practice and theory. Every year, thousands of
blockchain projects are launched and circulated in the market, and
there is a tremendous wealth of blockchain applications, from
finance to healthcare, education, media, logistics and more.
However, due to theoretical and technical barriers, most of these
applications are impractical for use in a real-world business
context. The papers in this book reveal the challenges and
limitations, such as scalability, latency, privacy and security,
and showcase solutions and developments to overcome them.
Applied Linear Regression for Business Analytics with R introduces
regression analysis to business students using the R programming
language, with a focus on illustrating and solving real-world,
topical problems. Specifically, this book presents modern and
relevant case studies from the business world, along with clear and
concise explanations of the theory, intuition, hands-on examples,
and the coding required to employ regression modeling. Each chapter
includes the mathematical formulation and details of regression
analysis and provides in-depth practical analysis using the R
programming language.
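The book's hands-on examples use R; as a language-neutral illustration of the computation that underlies simple regression modeling, the plain-Python sketch below fits a line by ordinary least squares to invented data (it is not an example from the book).

```python
# Simple linear regression by ordinary least squares, plain Python.
# Hypothetical data: advertising spend (x) vs. sales (y).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
mx = sum(x) / n
my = sum(y) / n

# slope = Sxy / Sxx, intercept = mean(y) - slope * mean(x)
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = my - slope * mx

print(round(slope, 3), round(intercept, 3))  # 1.99 0.05
```

In R this whole computation is a one-liner, `lm(y ~ x)`, which is the interface the book teaches; the sketch only makes the arithmetic behind it explicit.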
Multilevel and Longitudinal Modeling with IBM SPSS, Third Edition,
demonstrates how to use the multilevel and longitudinal modeling
techniques available in IBM SPSS Versions 25-27. Annotated
screenshots with all relevant output provide readers with a
step-by-step understanding of each technique as they are shown how
to navigate the program. Throughout, diagnostic tools, data
management issues, and related graphics are introduced. SPSS
commands show the flow of the menu structure and how to facilitate
model building, while annotated syntax is also available for those
who prefer this approach. Extended examples illustrating the logic
of model development and evaluation are included throughout the
book, demonstrating the context and rationale of the research
questions and the steps around which the analyses are structured.
The book opens with the conceptual and methodological issues
associated with multilevel and longitudinal modeling, followed by a
discussion of SPSS data management techniques that facilitate
working with multilevel, longitudinal, or cross-classified data
sets. The next few chapters introduce the basics of multilevel
modeling, developing a multilevel model, extensions of the basic
two-level model (e.g., three-level models, models for binary and
ordinal outcomes), and troubleshooting techniques for everyday
programming and modeling problems, along with potential solutions.
Models for investigating individual and organizational change are
next developed, followed by models with multivariate outcomes and,
finally, models with cross-classified and multiple membership data
structures. The book concludes with thoughts about ways to expand
on the various multilevel and longitudinal modeling techniques
introduced and issues (e.g., missing data, sample weights) to keep
in mind in conducting multilevel analyses. Key features of the
third edition: Thoroughly updated throughout to reflect IBM SPSS
Versions 25-27. Introduction to fixed-effects regression for
examining change over time where random-effects modeling may not be
an optimal choice. Additional treatment of key topics specifically
aligned with multilevel modeling (e.g., models with binary and
ordinal outcomes). Expanded coverage of models with
cross-classified and multiple membership data structures. Added
discussion on model checking for improvement (e.g., examining
residuals, locating outliers). Further discussion of alternatives
for dealing with missing data and the use of sample weights within
multilevel data structures. Supported by online data sets, the
book's practical approach makes it an essential text for
graduate-level courses on multilevel, longitudinal, latent variable
modeling, multivariate statistics, or advanced quantitative
techniques taught in departments of business, education, health,
psychology, and sociology. The book will also prove appealing to
researchers in these fields. The book is designed to provide an
excellent supplement to Heck and Thomas's An Introduction to
Multilevel Modeling Techniques, Fourth Edition; however, it can
also be used with any multilevel or longitudinal modeling book or
as a stand-alone text.
Essentials of Monte Carlo Simulation focuses on the fundamentals of
Monte Carlo methods using basic computer simulation techniques. The
theories presented in this text deal with systems that are too
complex to solve analytically. Readers are shown how, given a
system of interest, to construct computer code and algorithmic
models that emulate how the system works internally. After the
models are run many times, drawing random samples on each run, the
data for each output variable of interest are analyzed by
ordinary statistical methods. This book features 11 comprehensive
chapters, and discusses such key topics as random number
generators, multivariate random variates, and continuous random
variates. Over 100 numerical examples are presented as part of the
appendix to illustrate useful real-world applications. The text
offers an easy-to-read presentation with minimal use of
difficult mathematical concepts. Very little has been published in
the area of computer Monte Carlo simulation methods, and this book
will appeal to students and researchers in the fields of
Mathematics and Statistics.
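The workflow described above (build an algorithmic model of a system, run it many times with random inputs, then summarize the outputs with ordinary statistics) can be sketched in a few lines of Python. The "system" here is a deliberately simple stand-in whose exact answer is known, so the estimate can be checked; it is not an example from the book.

```python
import random
import statistics

random.seed(1)

def system_response():
    # Hypothetical system: completion time is the maximum of two
    # parallel exponential stages, each with mean 2.0.
    a = random.expovariate(1 / 2.0)
    b = random.expovariate(1 / 2.0)
    return max(a, b)

# Run the model many times and analyze the outputs with ordinary statistics.
runs = [system_response() for _ in range(100_000)]
estimate = statistics.mean(runs)
spread = statistics.stdev(runs)

# For the max of two iid Exp(mean=2) variables, E = 2 + 2/2 = 3 exactly,
# so the Monte Carlo estimate can be validated against theory.
print(estimate)  # close to the analytic value 3.0
```

For genuinely complex systems no such closed-form check exists, which is exactly why the simulate-then-summarize approach the book teaches is used.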
Presenting theory while using Mathematica in a complementary way,
Modern Differential Geometry of Curves and Surfaces with
Mathematica, the third edition of Alfred Gray's famous textbook,
covers how to define and compute standard geometric functions using
Mathematica for constructing new curves and surfaces from
existing ones. Since Gray's death, authors Abbena and Salamon have
stepped in to bring the book up to date. While maintaining Gray's
intuitive approach, they reorganized the material to provide a
clearer division between the text and the Mathematica code and
added a Mathematica notebook as an appendix to each chapter. They
also address important new topics, such as quaternions.
The approach of this book is at times more computational than is
usual for a book on the subject. For example, Brioschi's formula for
the Gaussian curvature in terms of the first fundamental form can
be too complicated for use in hand calculations, but Mathematica
handles it easily, either through computations or through graphing
curvature. Another part of Mathematica that can be used
effectively in differential geometry is its special function
library, where nonstandard spaces of constant curvature can be
defined in terms of elliptic functions and then plotted.
Using the techniques described in this book, readers will
understand concepts geometrically, plotting curves and surfaces on
a monitor and then printing them. Containing more than 300
illustrations, the book demonstrates how to use Mathematica to
plot many interesting curves and surfaces. Covering as many topics
of classical differential geometry of curves and surfaces as
possible, it highlights important theorems with many examples. It
includes 300 miniprograms for computing and plotting various
geometric objects, alleviating the drudgery of computing things
such as the curvature and torsion of a curve in space.
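The drudgery such miniprograms remove can be illustrated even without Mathematica: the plain-Python sketch below evaluates the standard formulas kappa = |r' x r''| / |r'|^3 and tau = (r' x r'') . r''' / |r' x r''|^2 for a helix, whose curvature and torsion are known in closed form (a/(a^2+b^2) and b/(a^2+b^2)). The derivatives are entered by hand here; a computer algebra system would also automate that step.

```python
import math

def cross(u, v):
    # Cross product of two 3-vectors.
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

# Helix r(t) = (a cos t, a sin t, b t) and its first three derivatives.
a, b, t = 2.0, 1.0, 0.7
r1 = (-a*math.sin(t), a*math.cos(t), b)    # r'
r2 = (-a*math.cos(t), -a*math.sin(t), 0.0) # r''
r3 = (a*math.sin(t), -a*math.cos(t), 0.0)  # r'''

c = cross(r1, r2)
kappa = norm(c) / norm(r1) ** 3       # curvature
tau = dot(c, r3) / dot(c, c)          # torsion

# Closed-form values: a/(a^2+b^2) = 0.4 and b/(a^2+b^2) = 0.2.
print(round(kappa, 6), round(tau, 6))  # 0.4 0.2
```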
This textbook addresses postgraduate students in applied
mathematics, probability, and statistics, as well as computer
scientists, biologists, physicists and economists, who are seeking
a rigorous introduction to applied stochastic processes. Pursuing a
pedagogic approach, the content follows a path of increasing
complexity, from the simplest random sequences to advanced
stochastic processes. Illustrations are provided from many applied
fields, together with connections to ergodic theory, information
theory, reliability and insurance. The main content is also
complemented by a wealth of examples and exercises with solutions.
This first book in the series describes the Net Generation as
visual learners who thrive when surrounded by new technologies
and whose needs can be met with technological innovations.
These new learners seek novel ways of studying, such as
collaborating with peers and multitasking, as well as using
multimedia, the Internet, and other Information and Communication
Technologies. Here we present mathematics as a contemporary subject
that is engaging, exciting and enlightening in new ways. For
example, in the distributed environment of cyber space, mathematics
learners play games, watch presentations on YouTube, create Java
applets of mathematics simulations, and exchange thoughts via
instant messaging tools. How should mathematics education resonate
with these learners and technological novelties that excite them?
Economists are regularly confronted with results of quantitative
economics research. Econometrics: Theory and Applications with
EViews provides a broad introduction to quantitative economic
methods, for example how models arise, their underlying assumptions
and how estimates of parameters or other economic quantities are
computed. The author combines econometric theory with practice by
demonstrating its use with the software package EViews through
extensive use of screen shots. The emphasis is on understanding how
to select the right method of analysis for a given situation, and
how to actually apply the theoretical methodology correctly. The
EViews software package is available from 'Quantitative Micro
Software'. Written for any undergraduate or postgraduate course in
Econometrics.
This is a fully revised edition of the best-selling Introduction to Maple. The book presents the modern computer algebra system Maple, teaching the reader not only what can be done by Maple, but also how and why it can be done. The book also provides the necessary background for those who want to get the most out of Maple or want to extend its built-in knowledge. Emphasis is on understanding the Maple system more than on factual knowledge of built-in possibilities. To this end, the book contains both elementary and more sophisticated examples as well as many exercises. The typical reader should have a background in mathematics at the intermediate level. Andre Heck began developing and teaching Maple courses at the University of Nijmegen in 1987. In 1989 he was appointed managing director of the CAN Expertise Center in Amsterdam. CAN, Computer Algebra in the Netherlands, stimulates and coordinates the use of computer algebra in education and research. In 1996 the CAN Expertise Center was integrated into the Faculty of Science at the University of Amsterdam, into what became the AMSTEL Institute. The institute's program focuses on the innovation of computer activities in mathematics and science education at all levels. The author is actively involved in research and development aimed at the integrated computer learning environment Coach for mathematics and science education at the secondary school level.
SAS Programming: The One-Day Course provides a concise introduction to the SAS programming language that gives readers not only a quick start in SAS programming, but also in the basic data manipulations and statistical summaries that are available through SAS. Unlike other introductory texts on the market, this is a pocket-sized reference that does not clutter the presentation of programming techniques by trying to teach statistical methods at the same time. Strong on explanations of how to carry out data manipulations that real-life data often call for, it also contains a short "workbook" appendix, complete with solutions. Datasets and the programming code are available to download from the Web.
Economists can use computer algebra systems to manipulate symbolic
models, derive numerical computations, and analyze empirical
relationships among variables. Maxima is an open-source
multi-platform computer algebra system that rivals proprietary
software. Maxima's symbolic and computational capabilities enable
economists and financial analysts to develop a deeper understanding
of models by allowing them to explore the implications of
differences in parameter values, by providing numerical solutions to
problems that would otherwise be intractable, and by providing
graphical representations that can guide analysis. This book
provides a step-by-step tutorial for using this program to examine
the economic relationships that form the core of microeconomics in
a way that complements traditional modeling techniques. Readers
learn how to phrase the relevant analysis and how symbolic
expressions, numerical computations, and graphical representations
can be used to learn from microeconomic models. In particular,
comparative statics analysis is facilitated. Little has been
published on Maxima and its applications in economics and finance,
and this volume will appeal to advanced undergraduates,
graduate-level students studying microeconomics, academic
researchers in economics and finance, economists, and financial
analysts.
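Maxima code itself is not reproduced here; as a language-neutral sketch of the comparative statics the book automates, the Python snippet below solves a textbook linear supply and demand equilibrium and approximates dp*/da by a finite difference. The model and parameter values are invented for illustration.

```python
def equilibrium_price(a, b, c, d):
    # Demand q = a - b*p, supply q = c + d*p; equate and solve for p.
    return (a - c) / (b + d)

# Hypothetical parameter values.
a, b, c, d = 100.0, 2.0, 10.0, 3.0
p_star = equilibrium_price(a, b, c, d)
print(p_star)  # 18.0

# Comparative statics: how does p* respond to a shift in demand (a)?
h = 1e-6
dp_da = (equilibrium_price(a + h, b, c, d) - p_star) / h
print(round(dp_da, 6))  # 0.2, matching the symbolic answer 1/(b + d)
```

A computer algebra system such as Maxima would produce the symbolic derivative 1/(b + d) directly, which is what makes it valuable once models grow beyond hand calculation.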
This book presents algorithmic tools for algebraic geometry and experimental applications of them. It also introduces a software system in which the tools have been implemented and with which the experiments can be carried out. Macaulay 2 is a computer algebra system devoted to supporting research in algebraic geometry, commutative algebra, and their applications. The reader of this book will encounter Macaulay 2 in the context of concrete applications and practical computations in algebraic geometry. The expositions of the algorithmic tools presented here are designed to serve as a useful guide for those wishing to bring such tools to bear on their own problems. These expositions will be valuable to both the users of other programs similar to Macaulay 2 (for example, Singular and CoCoA) and those who are not interested in explicit machine computations at all. The first part of the book is primarily concerned with introducing Macaulay2, whereas the second part emphasizes the mathematics.
Focused on practical matters: this book does not cover Shiny
concepts, but rather practical tools and methodologies for use in
production. Based on experience: this book formalizes several
years of experience building Shiny applications. Original content:
this book presents new methodology and tooling, not just a review
of what already exists.