In the last decade there has been a phenomenal growth in interest
in crime pattern analysis. Geographic information systems are now
widely used in urban police agencies throughout industrialized nations.
With this, scholarly interest in understanding crime patterns has
grown considerably. "Artificial Crime Analysis Systems: Using Computer Simulations and Geographic Information Systems" discusses
leading research on the use of computer simulation of crime
patterns to reveal hidden processes of urban crimes, taking an
interdisciplinary approach by combining criminology, computer
simulation, and geographic information systems into one
comprehensive resource.
Real-world problems are complex and full of uncertainty. Fuzzy
computing technologies embedded into data analysis methodologies
and decision-making processes help to address the complexity of
these problems. Contemporary Theory and Pragmatic Approaches in
Fuzzy Computing Utilization presents the most innovative systematic
and practical facets of fuzzy computing technologies to students,
scholars, and academicians, as well as practitioners, engineers,
and professionals. This premier reference source focuses on
up-to-date theoretical views of fuzzy computing while highlighting
empirical approaches useful for real-world utilization.
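At the heart of these technologies is the idea of graded set membership. As a minimal illustration (not drawn from the book; the function and values here are hypothetical), a triangular membership function maps a crisp measurement to a degree of membership between 0 and 1:

```python
def triangular_membership(x, a, b, c):
    """Degree to which x belongs to a fuzzy set with a triangular
    membership function: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# e.g. how "warm" is 24 degrees, if "warm" spans 15..25..35?
print(triangular_membership(24, 15, 25, 35))  # 0.9
```

Real fuzzy systems combine many such sets with rule bases and defuzzification, but graded membership of this kind is the basic building block.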
The healthcare industry is starting to adopt digital twins to
improve personalized medicine, healthcare organization performance, and the development of new medicines and devices. These digital twins can create useful
models based on information from wearable devices, omics, and
patient records to connect the dots across processes that span
patients, doctors, and healthcare organizations as well as drug and
device manufacturers. Digital twins are digital representations of
human physiology built on computer models. The use of digital twins
in healthcare is revolutionizing clinical processes and hospital
management by enhancing medical care with digital tracking and
advancing modelling of the human body. These tools are of great
help to researchers in studying diseases, new drugs, and medical
devices. Digital Twins and Healthcare: Trends, Techniques, and
Challenges facilitates the advancement and knowledge dissemination
in methodologies and applications of digital twins in the
healthcare and medicine fields. This book raises interest and
awareness of the uses of digital twins in healthcare in the
research community. Covering topics such as deep neural networks, edge computing, and transfer learning, this premier
reference source is an essential resource for hospital
administrators, pharmacists, medical professionals, IT consultants,
students and educators of higher education, librarians, and
researchers.
The discovery and development of new computational methods have
expanded the capabilities and uses of simulations. With agent-based
models, the applications of computer simulations are significantly
enhanced. Multi-Agent-Based Simulations Applied to Biological and
Environmental Systems is a pivotal reference source for the latest
research on the implementation of autonomous agents in computer
simulation paradigms. Featuring extensive coverage on relevant
applications, such as biodiversity conservation, pollution
reduction, and environmental risk assessment, this publication is
an ideal source for researchers, academics, engineers,
practitioners, and professionals seeking material on various issues
surrounding the use of agent-based simulations.
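For readers new to the paradigm, the essence of an agent-based simulation is a population of autonomous agents, each carrying local state and simple rules, iterated over time. A purely illustrative sketch (not taken from the book):

```python
import random

class Agent:
    """An autonomous agent on a 1-D habitat strip that moves toward
    less-crowded cells (a toy stand-in for, e.g., a foraging animal)."""
    def __init__(self, pos):
        self.pos = pos

    def step(self, occupancy, size):
        # Prefer the emptier of the two neighbouring cells.
        left, right = (self.pos - 1) % size, (self.pos + 1) % size
        self.pos = left if occupancy[left] <= occupancy[right] else right

def simulate(n_agents=20, size=50, steps=100):
    agents = [Agent(random.randrange(size)) for _ in range(n_agents)]
    for _ in range(steps):
        occupancy = [0] * size
        for a in agents:
            occupancy[a.pos] += 1
        for a in agents:
            a.step(occupancy, size)
    return agents

print(sorted(a.pos for a in simulate()))  # agents spread out over the strip
```

The interesting behaviour in such models, as in the applications the book covers, emerges from the interaction of many agents rather than from any global equation.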
This book addresses the experimental calibration of best-estimate
numerical simulation models. The results of measurements and
computations are never exact. Therefore, knowing only the nominal
values of experimentally measured or computed quantities is
insufficient for applications, particularly since the respective
experimental and computed nominal values seldom coincide. In the
author's view, the objective of predictive modeling is to extract
"best estimate" values for model parameters and predicted results,
together with "best estimate" uncertainties for these parameters
and results. To achieve this goal, predictive modeling combines
imprecisely known experimental and computational data, which calls
for reasoning on the basis of incomplete, error-rich, and
occasionally discrepant information. The customary methods used for
data assimilation combine experimental and computational
information by minimizing an a priori, user-chosen, "cost
functional" (usually a quadratic functional that represents the
weighted errors between measured and computed responses). In
contrast to these user-influenced methods, the BERRU (Best Estimate
Results with Reduced Uncertainties) Predictive Modeling methodology
developed by the author relies on the thermodynamics-based maximum
entropy principle to eliminate the need to minimize
user-chosen functionals, thus generalizing the "data adjustment"
and/or the "4D-VAR" data assimilation procedures used in the
geophysical sciences. The BERRU predictive modeling methodology
also provides a "model validation metric" which quantifies the
consistency (agreement/disagreement) between measurements and
computations. This "model validation metric" (or "consistency
indicator") is constructed from parameter covariance matrices,
response covariance matrices (measured and computed), and response
sensitivities to model parameters. Traditional methods for
computing response sensitivities are hampered by the "curse of
dimensionality," which makes them impractical for applications to
large-scale systems that involve many imprecisely known parameters.
Reducing the computational effort required for precisely
calculating the response sensitivities is paramount, and the
comprehensive adjoint sensitivity analysis methodology developed by
the author shows great promise in this regard, as shown in this
book. After discarding inconsistent data (if any) using the
consistency indicator, the BERRU predictive modeling methodology
provides best-estimate values for predicted parameters and
responses along with best-estimate reduced uncertainties (i.e.,
smaller predicted standard deviations) for the predicted
quantities. Applying the BERRU methodology yields optimal,
experimentally validated, "best estimate" predictive modeling tools
for designing new technologies and facilities, while also improving
on existing ones.
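To make the "consistency indicator" idea concrete: a common chi-square-type construction in this spirit (a sketch, not necessarily the author's exact formulation) weights the measured-minus-computed response deviations by the combined response covariance, with the computed part propagated through the sensitivity matrix:

```python
import numpy as np

def consistency_indicator(y_meas, y_comp, C_meas, C_alpha, S):
    """Chi-square-type consistency indicator: the deviation vector
    between measured and computed responses, weighted by the sum of
    the measured covariance and the computed covariance obtained by
    propagating the parameter covariance C_alpha through the
    sensitivity matrix S = dR/dalpha."""
    d = np.asarray(y_meas) - np.asarray(y_comp)
    C_total = C_meas + S @ C_alpha @ S.T
    return float(d @ np.linalg.solve(C_total, d))

# Toy example: 2 responses, 3 imprecisely known parameters.
S = np.array([[1.0, 0.5, 0.0],
              [0.2, 0.1, 0.8]])
chi2 = consistency_indicator([1.02, 0.97], [1.00, 1.00],
                             C_meas=0.01 * np.eye(2),
                             C_alpha=0.04 * np.eye(3),
                             S=S)
print(chi2)  # of order 2 (the number of responses) if data are consistent
```

A value far above the number of responses flags the kind of discrepant data that the methodology discards before producing its best estimates.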
This unique book covers the whole field of electronic warfare
modeling and simulation at a systems level, including chapters that
describe basic electronic warfare (EW) concepts. Written by a
well-known expert in the field with more than 24 years of
experience, the book explores EW applications and techniques and
the radio frequency spectrum. A detailed resource for entry-level
engineering personnel in EW, military personnel with no radio or
communications engineering background, technicians and software
professionals, the work explains the basic concepts required for
modeling and simulation that today's professionals need to
understand. Practitioners find clear explanations of important
mathematical concepts, such as decibel notation and spherical
trigonometry, necessary for modeling and simulation. Moreover, the
book describes specific types of EW equipment, how they work and
how each is mathematically modeled.
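Decibel notation, one of the mathematical concepts the book explains, compresses the enormous dynamic ranges found in EW link budgets into manageable numbers. A small illustrative sketch (the 10 GHz example values are assumptions, not figures from the book):

```python
import math

def to_db(power_ratio):
    """Express a power ratio in decibels: dB = 10 * log10(P1 / P2)."""
    return 10.0 * math.log10(power_ratio)

def from_db(db):
    """Invert decibel notation back to a power ratio."""
    return 10.0 ** (db / 10.0)

# A 1 kW transmitter expressed relative to 1 mW (i.e. in dBm):
print(to_db(1_000_000))  # 60.0 dBm

def spreading_loss_db(d_m, lam_m):
    """Free-space spreading loss (4*pi*d/lambda)^2, in dB: the
    standard link-budget term between transmitter and receiver."""
    return 20.0 * math.log10(4.0 * math.pi * d_m / lam_m)

print(round(spreading_loss_db(10_000, 0.03), 1))  # ~132.4 dB at 10 GHz, 10 km
```

Working in decibels turns the multiplications of a link budget into simple additions, which is why the notation pervades EW modeling.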
This book offers a comprehensive overview of the challenges in
hydrological modeling. Hydrology, on both a local and global scale,
has undergone dramatic changes, largely due to variations in
climate, population growth and the associated land-use and
land-cover changes. Written by experts in the field, the book
provides decision-makers with a better understanding of the
science, impacts, and consequences of these climate and land-use
changes on hydrology. Further, offering insights into how the
changing behavior of hydrological processes, related uncertainties
and their evolution affect the modeling process, it is of interest to all researchers and practitioners using hydrological modeling.
This book examines the origins and dynamical characteristics of
atmospheric inertia-gravity waves in the Antarctic mesosphere.
Gravity waves are relatively small-scale atmospheric waves whose restoring force is buoyancy; they can transport momentum upward from the troposphere to the middle atmosphere. In previous studies, the
dynamical characteristics of mesospheric gravity waves have not
been fully examined using numerical simulations, since performing a
numerical simulation with a high resolution and a high model-top
requires considerable computational power. However, recent advances
in computational capabilities have allowed us to perform numerical
simulations using atmospheric general circulation models, which
cover the troposphere to the mesosphere with a sufficiently fine
horizontal resolution to resolve small-scale gravity waves. The
book first describes the simulation of mesospheric gravity waves
using a high-resolution non-hydrostatic atmospheric model with a
high model top. The accuracy of the numerical results was confirmed
by the first Mesosphere-Stratosphere-Troposphere/Incoherent
Scattering (MST/IS) radar observation in the Antarctic. It also
depicts the origins and propagation processes of mesospheric
gravity waves on the basis of the results of the high-resolution
numerical model. The behaviors of mesospheric gravity waves can be
clearly explained using both fundamental and cutting-edge theories
of fluid dynamics.
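The buoyancy restoring force mentioned above is commonly quantified by the Brunt-Vaisala frequency N; a quick sketch with typical, purely illustrative tropospheric values:

```python
import math

def brunt_vaisala(theta, dtheta_dz, g=9.81):
    """Buoyancy (Brunt-Vaisala) frequency N = sqrt((g/theta) * dtheta/dz):
    the natural oscillation frequency of a vertically displaced air
    parcel. Gravity waves propagate only where N^2 > 0, i.e. where the
    stratification is stable."""
    return math.sqrt(g / theta * dtheta_dz)

# Typical tropospheric values: theta ~ 300 K, dtheta/dz ~ 4 K/km.
N = brunt_vaisala(300.0, 4.0e-3)
print(N, 2 * math.pi / N / 60)  # ~0.0114 s^-1, buoyancy period ~9 minutes
```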
The papers contained in this volume were originally presented at
the 2015 International Conference on Complex Systems in Business,
Administration, Science and Engineering. Included are the latest
works of practitioners from a variety of disciplines who have
developed new approaches for resolving complex issues that cannot
be formulated using conventional mathematical or software models. Complex systems occur in an infinite variety of problems, not only in the realm of the physical sciences and engineering, but also in such diverse fields as economics, the environment, the humanities, and the social and political sciences. The papers in the
book cover such topics as: Complex ecological systems; Complexity
science and urban developments; Complex energy systems; Complex
issues in biological and medical sciences; Extreme events: natural
and human made disasters; Climate change; Complexity of the
internet-based global market; Complex business processes; Supply
chain complexity; Transportation complexity; Logistics complexity;
Closed and open systems; Attractors and chaotic systems; Complex
adaptive software; Complexity of big data; Management of
complexity; Global economy as a complex system; Complexity in
social systems; Complex political systems; Administrations as
complex systems; Complexity in engineering; Complexity and
environment; Complexity and evolution; Complexity in linguistics,
literature and arts.
The book discusses the theoretical fundamentals of CAD graphics to
enhance readers' understanding of surface modeling and free-form
design by demonstrating how to use mathematical equations to define
curves and surfaces in CAD modelers. Additionally, it explains and
describes the main approaches to creating CAD models out of 3D
scans of physical objects. All CAD approaches are demonstrated with
guided examples and supported with comprehensive engineering
explanations. Furthermore, each approach includes exercises for
independent consolidation of advanced CAD skills. This book is
intended for engineers and designers who are already familiar with
the basics of modern CAD tools, e.g. feature-based and solid-based modeling in 3D space, and would like to improve and expand their knowledge and experience. It is also an easy-to-use guide and
excellent teaching and research aid for academics and practitioners
alike.
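As an example of the kind of mathematical curve definition such books build on, a Bezier curve expresses a free-form curve as a Bernstein-weighted blend of control points (a generic sketch, not an excerpt from the book):

```python
from math import comb

def bezier_point(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using the
    Bernstein form: B(t) = sum_i C(n, i) * (1-t)^(n-i) * t^i * P_i."""
    n = len(control_points) - 1
    x = y = 0.0
    for i, (px, py) in enumerate(control_points):
        w = comb(n, i) * (1 - t) ** (n - i) * t ** i
        x += w * px
        y += w * py
    return x, y

# A cubic Bezier segment, the workhorse free-form curve in CAD modelers:
ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]
print(bezier_point(ctrl, 0.5))  # (2.0, 1.5) -- the midpoint of the curve
```

Surfaces are defined the same way with a second parameter, which is precisely the surface-modeling machinery the book's guided examples exercise.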
Probabilistic modeling is a subject spanning many branches of mathematics, economics, and computer science, connecting pure mathematics with the applied sciences. Operational research also relies
on this connection to enable the improvement of business functions
and decision making. Analyzing Risk through Probabilistic Modeling
in Operations Research is an authoritative reference publication
discussing the various challenges in management and decision
science. Featuring exhaustive coverage of a range of topics within operational research, including but not limited to, decision
analysis, data mining, process modeling, probabilistic
interpolation and extrapolation, and optimization methods, this
book is an essential reference source for decision makers,
academicians, researchers, advanced-level students, technology
developers, and government officials interested in the
implementation of probabilistic modeling in various business
applications.
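A common entry point to probabilistic modeling in operations research is Monte Carlo risk analysis: propagate uncertain inputs through a decision model and read off risk measures. A toy sketch in which every number is hypothetical:

```python
import random

def simulate_profit(n_trials=100_000, seed=1):
    """Monte Carlo risk analysis of a toy operations decision:
    profit = demand * margin - fixed_cost, with uncertain demand."""
    rng = random.Random(seed)
    losses = 0
    total = 0.0
    for _ in range(n_trials):
        demand = rng.gauss(1000, 250)     # uncertain demand (mean, std dev)
        profit = demand * 4.0 - 3500.0    # unit margin 4.0, fixed cost 3500
        total += profit
        losses += profit < 0
    return total / n_trials, losses / n_trials

mean_profit, prob_loss = simulate_profit()
print(f"expected profit ~ {mean_profit:.0f}, P(loss) ~ {prob_loss:.3f}")
```

The probability of loss, not just the expected profit, is what such probabilistic models add to classical deterministic optimization.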
Computational Modelling of Nanoparticles highlights recent advances in the power and versatility of computational modelling and experimental techniques, and shows how this progress has opened the door
to a more detailed and comprehensive understanding of the world of
nanomaterials. Nanoparticles, having dimensions of 100 nanometers
or less, are increasingly being used in applications in medicine,
materials and manufacturing, and energy. Spanning the smallest
sub-nanometer nanoclusters to nanocrystals with diameters of tens of
nanometers, this book provides a state-of-the-art overview on how
computational modelling can provide, often otherwise unobtainable,
insights into nanoparticulate structure and properties. This
comprehensive, single resource is ideal for researchers who want to
start/improve their nanoparticle modelling efforts, learn what can
be (and what cannot) achieved with computational modelling, and
understand more clearly the value and details of computational
modelling efforts in their area of research.
From Digital Traces to Algorithmic Projections describes individual
digital fingerprints in interaction with the different algorithms
they encounter throughout life. Centered on the human user, this
formalism makes it possible to distinguish the voluntary projections of an individual from their systemic projections (those undergone, such as metadata), both open (public) and closed. As the global
algorithmic projection of an individual is now the focus of
attention (Big Data, neuromarketing, targeted advertising,
sentiment analysis, cybermonitoring, etc.) and is used to define
new concepts, this resource discusses the ubiquity of place and the
algorithmic consent of a user.
This book describes various technical outcomes of the EU project IoSense. The authors discuss sensor integration, including LEDs, dust sensors, LIDAR for automotive driving, and eight more, demonstrating their use in simulations for the design and
fabrication of sensor systems. Readers will benefit from the
coverage of topics such as sensor technologies for both discrete
and integrated innovative sensor devices, suitable for high volume
production, electrical, mechanical, security and software resources
for integration of sensor system components into IoT systems and
IoT-enabling systems, and IoT sensor system reliability. The book describes, from component-level to system-level simulation, how to use the available simulation techniques to reach a sound design with good performance; explains how to apply simulation techniques such as finite element, multi-body, dynamic, stochastic and many more in the virtual design of sensor systems; demonstrates the integration of several sensor solutions (thermal, dust, occupancy, distance, awareness and more) into large-scale system solutions in several industrial domains (lighting, automotive, transport and more); and includes state-of-the-art simulation techniques, both multi-scale and multi-physics, for use in the electronics industry.
Interval Finite Element Method with MATLAB provides a thorough
introduction to an effective way of investigating problems
involving uncertainty using computational modeling. The well-known
and versatile Finite Element Method (FEM) is combined with the
concept of interval uncertainties to develop the Interval Finite
Element Method (IFEM). Interval-valued (rather than crisp) parameters and variables are used, so that the governing equations themselves become interval equations and the uncertain problem can be modeled directly. The concept of interval uncertainties is systematically explained. Several examples are explored with IFEM using MATLAB on topics such as spring-mass systems, bars, trusses and frames.
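The core idea can be seen in the simplest possible "finite element": a single spring with interval stiffness and load, where interval arithmetic on the governing equation K u = F yields guaranteed bounds on the displacement. A minimal sketch (the book's examples use MATLAB; this illustrative version is in Python):

```python
class Interval:
    """Closed interval [lo, hi] with the arithmetic needed for a
    one-element interval finite element computation."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __truediv__(self, other):
        # Interval division; assumes the divisor does not contain zero.
        vals = [self.lo / other.lo, self.lo / other.hi,
                self.hi / other.lo, self.hi / other.hi]
        return Interval(min(vals), max(vals))
    def __repr__(self):
        return f"[{self.lo:.4g}, {self.hi:.4g}]"

# Single spring element K u = F with interval stiffness and load:
k = Interval(950.0, 1050.0)   # stiffness in [950, 1050] N/m
F = Interval(98.0, 102.0)     # load in [98, 102] N
u = F / k                     # interval enclosure of the displacement
print(u)                      # [0.09333, 0.1074] m
```

Full IFEM assembles many such interval-valued elements into an interval stiffness matrix, which is where the method's real machinery lies.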
Recent developments in model-predictive control promise remarkable
opportunities for designing multi-input, multi-output control
systems and improving the control of single-input, single-output
systems. This volume provides a definitive survey of the latest
model-predictive control methods available to engineers and
scientists today. The initial set of chapters presents various
methods for managing uncertainty in systems, including stochastic
model-predictive control. With the advent of affordable and fast
computation, control engineers now need to think about using
"computationally intensive controls," so the second part of this
book addresses the solution of optimization problems in "real" time
for model-predictive control. The theory and applications of
control theory often influence each other, so the last section of
Handbook of Model Predictive Control rounds out the book with
representative applications to automobiles, healthcare, robotics,
and finance. The chapters in this volume will be useful to working
engineers, scientists, and mathematicians, as well as students and
faculty interested in the progression of control theory. Future
developments in MPC will no doubt build on concepts demonstrated in this book, and anyone with an interest in MPC will find fruitful information and suggestions for additional reading.
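The receding-horizon mechanic at the core of MPC is easy to state: at each sampling instant, solve a finite-horizon optimal control problem from the current state, apply only the first input, then repeat. A minimal unconstrained sketch for a scalar plant (illustrative, not taken from the handbook):

```python
def mpc_first_input(x, a, b, r, horizon):
    """Receding-horizon control of the scalar plant x+ = a*x + b*u
    with stage cost x^2 + r*u^2: run the finite-horizon Riccati
    recursion backward from the terminal time and return only the
    input that applies at the current step."""
    p = 1.0                                   # terminal weight on x^2
    for _ in range(horizon):
        k = (a * b * p) / (r + b * b * p)     # feedback gain at this stage
        p = 1.0 + a * a * p - a * b * p * k   # Riccati update, one step back
    return -k * x                             # first input of the horizon

# Closed loop: re-solve at every step, apply only the first input.
a, b, r = 1.2, 1.0, 0.1                       # an open-loop unstable plant
x = 5.0
for t in range(8):
    u = mpc_first_input(x, a, b, r, horizon=10)
    x = a * x + b * u
    print(f"t={t}  u={u:+.3f}  x={x:+.4f}")
```

Constrained and stochastic variants replace the Riccati recursion with an online optimization solve, which is exactly why the "computationally intensive controls" discussed in the book matter.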
This book presents the state-of-the-art in supercomputer
simulation. It includes the latest findings from leading
researchers using systems from the High Performance Computing
Center Stuttgart (HLRS) in 2017. The reports cover all fields of
computational science and engineering ranging from CFD to
computational physics and from chemistry to computer science with a
special emphasis on industrially relevant applications. Presenting
findings of one of Europe's leading systems, this volume covers a
wide variety of applications that deliver a high level of sustained
performance. The book covers the main methods in high-performance
computing. Its outstanding results in achieving the best
performance for production codes are of particular interest for
both scientists and engineers. The book comes with a wealth of
color illustrations and tables of results.
This book is a survey of the research work done by the author over
the last 15 years, in collaboration with various eminent
mathematicians and climate scientists on the subject of tropical
convection and convectively coupled waves. In the areas of climate
modelling and climate change science, tropical dynamics and
tropical rainfall are among the biggest uncertainties of future
projections. This not only puts at risk billions of human beings
who populate the tropical continents, but is also of central
importance for climate predictions on the global scale. This book
aims to introduce non-expert readers in mathematics and theoretical physics to this fascinating topic, in order to attract interest in this difficult and exciting research area. The general theme revolves around the use of new deterministic and stochastic multi-cloud models for tropical convection and convectively coupled waves. It draws modelling ideas from various areas of mathematics and physics, which are used in conjunction with
state-of-the-art satellite and in-situ observations and detailed
numerical simulations. After a review of preliminary material on
tropical dynamics and moist thermodynamics, including recent
discoveries based on satellite observations as well as Markov
chains, the book immerses the reader in the area of models for convection and tropical waves. It begins with basic concepts of
linear stability analysis and ends with the use of these models to
improve the state-of-the-art global climate models. The book also contains a fair number of exercises, which makes it suitable as a textbook complement on the subject.
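The stochastic multi-cloud idea can be caricatured with a small Markov chain: each site of a lattice occupies one of a few cloud states (clear, congestus, deep, stratiform) and switches between them at random. The transition rates below are invented for illustration and are not those of the models in the book:

```python
import random

# Toy lattice version of a stochastic multi-cloud parameterization.
# States: 0 = clear, 1 = congestus, 2 = deep, 3 = stratiform.
RATES = {  # per-time-step transition probabilities from each state
    0: {1: 0.05, 2: 0.02},   # clear sky -> congestus / deep convection
    1: {2: 0.10, 0: 0.05},   # congestus deepens or decays
    2: {3: 0.20},            # deep convection detrains into stratiform
    3: {0: 0.15},            # stratiform anvil decays to clear sky
}

def step(state, rng):
    """One Markov transition: pick a successor with its rate, else stay."""
    u, acc = rng.random(), 0.0
    for new, p in RATES[state].items():
        acc += p
        if u < acc:
            return new
    return state

rng = random.Random(0)
sites = [0] * 1000
for _ in range(500):
    sites = [step(s, rng) for s in sites]
# Area fractions of each cloud type after spin-up:
print([sites.count(c) / len(sites) for c in range(4)])
```

In the actual models, such transition rates are conditioned on the large-scale thermodynamic state, which is what couples the stochastic cloud field to the wave dynamics.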
This book presents the state of the art in High Performance
Computing on modern supercomputer architectures. It addresses
trends in hardware and software development in general, as well as
the future of High Performance Computing systems and heterogeneous
architectures. The contributions cover a broad range of topics,
from improved system management to Computational Fluid Dynamics,
High Performance Data Analytics, and novel mathematical approaches
for large-scale systems. In addition, they explore innovative
fields like coupled multi-physics and multi-scale simulations. All
contributions are based on selected papers presented at the 24th
Workshop on Sustained Simulation Performance, held at the
University of Stuttgart's High Performance Computing Center in
Stuttgart, Germany in December 2016 and the subsequent Workshop on
Sustained Simulation Performance, held at the Cyberscience Center,
Tohoku University, Japan in March 2017.
This book is intended for researchers, practitioners and students
who are interested in the current trends and want to make their GI
applications and research dynamic. Time is the key element of
contemporary GIS: mobile and wearable electronics, sensor networks,
UAVs and other mobile sensing devices, the IoT and many other sources
produce a massive amount of data every minute, which is naturally
located in space as well as in time. Time series data is
transformed into almost continuous (from the human perspective)
data streams, which require changes to the concept of spatial data
recording, storage and manipulation. This book collects the latest
innovative research presented at the GIS Ostrava 2017 conference
held in 2017 in Ostrava, Czech Republic, under the auspices of
EuroSDR and EuroGEO. The accepted papers cover various aspects of
dynamics in GIscience, including spatiotemporal data analysis and
modelling; spatial mobility data and trajectories; real-time
geodata and real-time applications; dynamics in land use, land
cover and urban development; visualisation of dynamics; open
spatiotemporal data; crowdsourcing for spatiotemporal data and big
spatiotemporal data.
This book contains a selection of papers presented during a special
workshop on Complexity Science organized as part of the 9th
International Conference on GIScience 2016. Expert researchers in
the areas of Agent-Based Modeling, Complexity Theory, Network
Theory, Big Data, and emerging methods of Analysis and
Visualization for new types of data explore novel complexity
science approaches to dynamic geographic phenomena and their
applications, addressing challenges and enriching research
methodologies in geography in a Big Data Era.