Welcome to Loot.co.za!
This graduate-level text covers modeling, programming and analysis of simulation experiments, and provides a rigorous treatment of the foundations of simulation and why it works. It introduces object-oriented programming for simulation, covers both the probabilistic and statistical basis for simulation in a rigorous but accessible manner (providing all necessary background material), and provides a modern treatment of experiment design and analysis that goes beyond classical statistics. The book emphasizes essential foundations throughout, rather than providing a compendium of algorithms and theorems, and prepares the reader to use simulation in research as well as practice. It is a rigorous but concise treatment, emphasizing lasting principles while also providing specific training in modeling, programming and analysis. In addition to teaching readers how to do simulation, it also prepares them to use simulation in their research; no other book does this. An online solutions manual for end-of-chapter exercises is also provided.
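The kind of simulation experiment and output analysis described above can be illustrated with a minimal sketch: independent replications of an M/M/1 queue (via Lindley's recurrence), summarized by a t-based confidence interval. All parameter values here are hypothetical, not taken from the book.

```python
import math
import random
import statistics

def mm1_mean_wait(arrival_rate, service_rate, n_customers, rng):
    """One replication: average wait in an M/M/1 queue via Lindley's recurrence."""
    wait = 0.0
    total = 0.0
    for _ in range(n_customers):
        interarrival = rng.expovariate(arrival_rate)
        service = rng.expovariate(service_rate)
        # Lindley: next wait = max(0, current wait + service - interarrival)
        wait = max(0.0, wait + service - interarrival)
        total += wait
    return total / n_customers

rng = random.Random(42)
reps = [mm1_mean_wait(0.8, 1.0, 5000, rng) for _ in range(20)]
mean = statistics.mean(reps)
half = 2.093 * statistics.stdev(reps) / math.sqrt(len(reps))  # t(0.975, 19) ~ 2.093
print(f"mean wait ~ {mean:.2f} +/- {half:.2f}")
```

Replications are kept independent by drawing all randomness from one seeded stream, so the reported interval is reproducible.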
The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the "big data" era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, just as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction.
In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
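As a rough illustration of the two-step idea (transform the data to approximately i.i.d. quantities, then bootstrap them to obtain a prediction interval), here is a minimal sketch for a random-walk-like series where differencing plays the role of the transformation. All data are simulated and the setup is invented for illustration, not taken from the monograph.

```python
import random

rng = random.Random(0)

# A hypothetical random-walk-like series: differencing makes its steps i.i.d.
series, level = [], 0.0
for _ in range(200):
    level += rng.gauss(0.5, 1.0)
    series.append(level)

# Step 1 (the transformation): map the complex dataset to i.i.d. quantities.
diffs = [b - a for a, b in zip(series, series[1:])]

# Step 2 (the Model-Free Bootstrap): resample the i.i.d. quantities to
# simulate the next observation, then read off percentile prediction limits.
sims = sorted(series[-1] + rng.choice(diffs) for _ in range(2000))
lo, hi = sims[int(0.025 * len(sims))], sims[int(0.975 * len(sims))]
print(f"95% prediction interval for the next value: [{lo:.2f}, {hi:.2f}]")
```

No model is fitted at any point; the interval comes entirely from resampling the transformed, observable quantities.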
This volume introduces a series of data-driven computational methods for analyzing group processes through didactic, tutorial-based examples. Group processes are of central importance to many sectors of society, including government, the military, health care, and corporations. Computational methods are better suited to handling (potentially huge) group process data than traditional methodologies because of their more flexible assumptions and their capability to handle real-time trace data. Indeed, the use of methods under the banner of computational social science has exploded in recent years. However, attention has been focused on original research rather than pedagogy, leaving those interested in acquiring computational skills without a much-needed resource. Although the methods here can be applied to wider areas of social science, they are specifically tailored to group process research. A number of data-driven methods adapted to group process research are demonstrated in this volume, including text mining, relational event modeling, social simulation, machine learning, social sequence analysis, and response surface analysis. To help readers take advantage of these new opportunities, the book provides clear examples (including code) of group processes in various contexts, setting guidelines and best practices for future work to build upon. It will be of great benefit to those willing to learn computational methods: academics such as graduate students and faculty, multidisciplinary professionals and researchers working in organization and management science, and consultants for various types of organizations and groups.
Power system modelling and scripting is a quite general and ambitious title. Of course, to embrace all existing aspects of power system modelling would lead to an encyclopedia and would likely be an impossible task. Thus, the book focuses on a subset of power system models based on the following assumptions: (i) devices are modelled as a set of nonlinear differential-algebraic equations, (ii) all alternating-current devices operate in balanced three-phase conditions at fundamental frequency, and (iii) the time frame of the dynamics of interest ranges from tenths of a second to tens of seconds. These assumptions basically restrict the analysis to transient stability phenomena and generator controls. The modelling step is not self-sufficient: mathematical models have to be translated into computer programming code in order to be analyzed, understood and experienced. It is an object of the book to provide a general framework for a power system analysis software tool, together with hints for filling out this framework with versatile programming code. This book is for all students and researchers who are looking for a quick reference on power system models or need some guidelines for starting the challenging adventure of writing their own code.
In this book, for the first time, two scientific fields, consensus formation and synchronization of communications, are presented together and examined through their interrelations, which are of rapidly growing importance. Both fields have attracted enormous research interest, especially in relation to complex networks.
This book is a compilation of a selected subset of research articles presented at the Eighth INFORMS Computing Society Conference, held in Chandler, Arizona, from January 8 to 10, 2003. The articles represent the diversity and depth of the interface between OR/MS (operations research and the management sciences) and CS/AI (computer science and artificial intelligence). The volume starts with two papers that represent the reflective and integrative thinking that is critical to any scientific discipline. These two articles present philosophical perspectives on computation, covering a variety of traditional and newer methods for modeling, solving, and explaining mathematical models. The next set includes articles that study machine learning and computational heuristics, followed by articles that address issues in performance testing of solution algorithms and heuristics. These two sets of papers demonstrate the richness of thought that takes place at the OR/MS and CS/AI interface. The final set of articles demonstrates the usefulness of these and other methods at the interface in solving real-world problems, covering e-commerce, workflow, electronic negotiation, music, parallel computation, and telecommunications. The articles in this collection represent the results of cross-fertilization between OR/MS and CS/AI, making possible advances that could not have been achieved in isolation. The continuing aim of the INFORMS Computing Society and this research conference is to invigorate and further develop this interface.
This book provides a conceptual and computational framework to study how the nervous system exploits the anatomical properties of limbs to produce mechanical function. The study of the neural control of limbs has historically emphasized the use of optimization to find solutions to the muscle redundancy problem. That is, how does the nervous system select a specific muscle coordination pattern when the many muscles of a limb allow for multiple solutions? I revisit this problem from the emerging perspective of neuromechanics that emphasizes finding and implementing families of feasible solutions, instead of a single and unique optimal solution. Those families of feasible solutions emerge naturally from the interactions among the feasible neural commands, the anatomy of the limb, and the constraints of the task. Such an alternative perspective on the neural control of limb function is not only biologically plausible, but sheds light on the most central tenets and debates in the fields of neural control, robotics, rehabilitation, and brain-body co-evolutionary adaptations. This perspective developed from courses I taught to engineers and life scientists at Cornell University and the University of Southern California, and is made possible by combining fundamental concepts from mechanics, anatomy, mathematics, robotics and neuroscience with advances in the field of computational geometry. Fundamentals of Neuromechanics is intended for neuroscientists, roboticists, engineers, physicians, evolutionary biologists, athletes, and physical and occupational therapists seeking to advance their understanding of neuromechanics. Therefore, the tone is decidedly pedagogical, engaging, integrative, and practical to make it accessible to people coming from a broad spectrum of disciplines. I attempt to tread the line between making the mathematical exposition accessible to life scientists, and conveying the wonder and complexity of neuroscience to engineers and computational scientists.
While no one approach can hope to definitively resolve the important questions in these related fields, I hope to provide you with the fundamental background and tools to allow you to contribute to the emerging field of neuromechanics.
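The muscle redundancy problem and the idea of a family of feasible solutions can be made concrete with a toy single-joint example, two muscles with different moment arms jointly producing a target torque. All numbers are hypothetical and the model is a sketch, not one taken from the book.

```python
# Toy single-joint limb: torque = r1*F1*a1 + r2*F2*a2, activations a in [0, 1].
r1, F1 = 0.03, 1000.0   # moment arm (m), max muscle force (N) -- hypothetical
r2, F2 = 0.05, 600.0
target = 40.0           # desired joint torque (N*m)

# Sweep activation of muscle 1 and solve for muscle 2; keep the whole
# family of feasible solutions rather than a single "optimal" one.
feasible = []
n = 101
for i in range(n):
    a1 = i / (n - 1)
    a2 = (target - r1 * F1 * a1) / (r2 * F2)
    if 0.0 <= a2 <= 1.0:
        feasible.append((a1, a2))

print(f"{len(feasible)} feasible activation pairs, e.g. {feasible[0]}")
```

Every pair on this line segment produces exactly the required torque, which is the redundancy the nervous system must navigate.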
This book explores systems-based co-design, introducing a "Decision-Based Co-Design" (DBCD) approach for the co-design of materials, products, and processes. In recent years there have been significant advances in the modeling and simulation of material behavior, from the atomic scale to the macro scale. However, the uncertainties associated with these approaches and models across different scales need to be addressed to enable decision-making that results in designs which are robust, that is, relatively insensitive to uncertainties. An approach that facilitates co-design is needed across materials, product design and manufacturing processes. This book describes a cloud-based platform to support decisions in the design of engineered systems (CB-PDSIDES), which features an architecture that promotes co-design through the servitization of decision-making, knowledge capture, and the use of templates that allow previous solutions to be reused. Placing the platform in the cloud aids mass collaboration and open innovation. A valuable reference on all areas related to the design of materials, products and processes, the book appeals to material scientists, design engineers and all those involved in the emerging interdisciplinary field of integrated computational materials engineering (ICME).
Understanding how the human brain represents, stores, and processes information is one of the greatest unsolved mysteries of science today. The cerebral cortex is the seat of most of the mental capabilities that distinguish humans from other animals and, once understood, it will almost certainly lead to a better knowledge of other brain nuclei. Although neuroscience research has been underway for 150 years, very little progress has been made. What is needed is a key concept that will trigger a full understanding of existing information, and will also help to identify future directions for research. This book aims to help identify this key concept. Including contributions from leading experts in the field, it provides an overview of different conceptual frameworks that indicate how some pieces of the neuroscience puzzle fit together. It offers a representative selection of current ideas, concepts, analyses, calculations and computer experiments, and also looks at important advances such as the application of new modeling methodologies. Computational Models for Neuroscience will be essential reading for anyone who needs to keep up-to-date with the latest ideas in computational neuroscience, machine intelligence, and intelligent systems. It will also be useful background reading for advanced undergraduates and postgraduates taking courses in neuroscience and psychology.
This isn't a book about the Object Data Standard; it's the complete... When it comes to storing objects in databases, ODMG 3.0 is a...
Computer simulations not only belong to the most important methods for the theoretical investigation of granular materials, but also provide the tools that have enabled much of the expanding research by physicists and engineers. The present book is intended to serve as an introduction to the application of numerical methods to systems of granular particles. Accordingly, the emphasis is on a general understanding of the subject rather than on the presentation of the latest advances in numerical algorithms. Although a basic knowledge of C++ is needed to understand the numerical methods and algorithms in the book, it avoids elegant but complicated algorithms in order to remain accessible to those who prefer a different programming language. While the book focuses more on models than on the physics of granular material, many applications to real systems are presented.
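Simulations of granular particles typically rest on a pairwise contact-force model. A minimal sketch of the widely used linear spring-dashpot normal force is shown below, in Python rather than the C++ the book uses, with all parameter values hypothetical.

```python
import math

def contact_force(x1, x2, v1, v2, r1, r2, k=1e4, gamma=5.0):
    """Normal force between two spheres via a linear spring-dashpot model.

    k (stiffness) and gamma (damping) are hypothetical parameters.
    Returns the force on particle 1 (pointing away from particle 2).
    """
    dx = [a - b for a, b in zip(x1, x2)]
    dist = math.sqrt(sum(d * d for d in dx))
    overlap = r1 + r2 - dist
    if overlap <= 0.0:
        return [0.0, 0.0, 0.0]          # spheres not touching: no force
    normal = [d / dist for d in dx]     # unit vector from particle 2 to 1
    # relative velocity along the normal (negative when approaching)
    vrel = sum((a - b) * n for a, b, n in zip(v1, v2, normal))
    fmag = k * overlap - gamma * vrel   # elastic repulsion + viscous damping
    return [max(fmag, 0.0) * n for n in normal]

# Two 1 cm spheres overlapping by 2 mm, approaching at 0.1 m/s:
f = contact_force([0.0, 0.0, 0.0], [0.018, 0.0, 0.0],
                  [0.0, 0.0, 0.0], [-0.1, 0.0, 0.0], 0.01, 0.01)
print(f)
```

In a full discrete-element code this force would be summed over all contacts and fed into a time integrator for each particle.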
This volume is dedicated to Jacob Aboudi, a fine scientist who has made seminal contributions in applied mechanics. The papers presented here reflect the appreciation of many of Jacob's colleagues. A publication list following this introduction provides an indication of his distinguished academic career, currently in its fifth decade, and the breadth of his knowledge. His papers consistently demonstrate originality, innovation and diligence. This list uncovers the methodical work of a dedicated researcher whose achievements established him as a leading authority in the area of mathematical modeling of the behavior of heterogeneous materials, the area which became known as homogenization theory. Starting in 1981, Jacob established a micromechanical model known as the Method of Cells (MOC), which evolved into the Generalized Method of Cells (GMC) that predicts the macroscopic response of composite materials as a function of the properties, volume fractions, shapes, and constitutive behavior of its constituents. The versatility of the model has been demonstrated to effectively incorporate various types of constituent material behavior (i.e., both coupled and uncoupled mechanical, thermal, electrical and magnetic effects). As a result of its potential in providing an efficient tool for the emerging field of multiscale analysis, the method gained increasing attention and became a subject for further research.
Relevant to, and drawing from, a range of disciplines, the chapters in this collection show the diversity, and applicability, of research in Bayesian argumentation. Together, they form a challenge to philosophers versed in both the use and criticism of Bayesian models who have largely overlooked their potential in argumentation. Selected from contributions to a multidisciplinary workshop on the topic held in Sweden in 2010, the authors count linguists and social psychologists among their number, in addition to philosophers. They analyze material that includes real-life court cases, experimental research results, and the insights gained from computer models. The volume provides, for the first time, a formal measure of subjective argument strength and argument force, robust enough to allow advocates of opposing sides of an argument to agree on the relative strengths of their supporting reasoning. With papers from leading figures such as Michael Oaksford and Ulrike Hahn, the book comprises recent research conducted at the frontiers of Bayesian argumentation and provides a multitude of examples in which these formal tools can be applied to informal argument. It signals new and impending developments in philosophy, which has seen Bayesian models deployed in formal epistemology and philosophy of science, but has yet to explore the full potential of Bayesian models as a framework in argumentation. In doing so, this revealing anthology looks destined to become a standard teaching text in years to come.
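One common Bayesian formalization, sketched here purely for illustration and not necessarily the specific measure developed in the book, treats an argument's force as a likelihood ratio and its strength as the posterior belief it induces:

```python
# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
# The likelihood ratio plays the role of the argument's "force"; the
# posterior plays the role of its "strength" for a given audience.
def update(prior, likelihood_ratio):
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Two advocates with different priors can still agree on the force
# (here LR = 4) of the same piece of evidence:
sceptic = update(0.2, 4.0)
believer = update(0.6, 4.0)
print(f"sceptic: {sceptic:.2f}, believer: {believer:.2f}")
```

This separation is what lets opposing sides agree on relative strengths of reasoning while holding different priors.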
The book provides a bottom-up approach to understanding how a computer works and how to use computing to solve real-world problems. It covers the basics of digital logic through the lens of computer organization and programming. By the end of the book, the reader should be able to design his or her own computer from the ground up. Logic simulation with Verilog is used throughout, assembly languages are introduced and discussed, and the fundamentals of computer architecture and embedded systems are touched upon, all in a cohesive design-driven framework suitable for class or self-study.
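The design-driven progression from gates upward can be hinted at with a small logic-simulation sketch (in Python rather than the Verilog used in the book): a gate-level full adder chained into a ripple-carry adder.

```python
# Gate-level full adder: the basic building block of an arithmetic datapath.
def full_adder(a, b, cin):
    s1 = a ^ b                      # XOR gate
    total = s1 ^ cin                # sum bit
    carry = (a & b) | (s1 & cin)    # carry-out from two AND gates and an OR
    return total, carry

def ripple_add(x, y, bits=4):
    """Chain full adders into a ripple-carry adder, as a structural design would."""
    carry, out = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out, carry

print(ripple_add(11, 6))   # 11 + 6 = 17: 4-bit sum 0001 with carry-out 1
```

The same bottom-up composition, gates into adders, adders into an ALU, is how a complete machine is assembled.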
This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in many VR-based simulation systems, the book will be of particular interest to researchers and professionals in the areas of surgical simulation, rehabilitation, virtual assembly, and inspection and maintenance.
This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control, optimization, and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment. Reviews of relevant numerical computation methods and fundamental thermodynamics are followed by a detailed examination of the basic conservation equations. The bulk of the book is concerned with development of specific simulation models. Care is taken to trace each model derivation path from the basic underlying physical equations, explaining simplifying and restrictive assumptions as they arise and relating the model coefficients to the physical dimensions and physical properties of the working materials. Numerous photographs of real equipment complement the text and most models are illustrated by numerical examples based on typical real plant operations.
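The derivation path from a conservation equation to a runnable simulation model can be sketched with a deliberately simple lumped energy balance on a heated, well-mixed tank, integrated with explicit Euler. All coefficients below are hypothetical, not from the book.

```python
# Energy balance on a well-mixed tank: m*cp*dT/dt = Q_in - U*A*(T - T_amb)
m_cp = 4.2e5     # thermal capacity of the contents, J/K (hypothetical)
UA = 250.0       # overall heat-loss coefficient times area, W/K
Q_in = 5.0e4     # heater power, W
T_amb = 20.0     # ambient temperature, degrees C

T = 20.0         # initial temperature, degrees C
dt = 1.0         # time step, s (small compared with the time constant)
for _ in range(36000):               # simulate 10 hours
    dTdt = (Q_in - UA * (T - T_amb)) / m_cp
    T += dt * dTdt                   # explicit Euler step

steady = T_amb + Q_in / UA           # analytical steady state for comparison
print(f"T after 10 h: {T:.1f} C (steady state {steady:.1f} C)")
```

The time constant here is m_cp/UA = 1680 s, so ten simulated hours is ample for convergence; comparing the integrated result against the analytical steady state is a basic model-verification check.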
This book brings together some of the most influential pieces of research undertaken around the world in design synthesis. It is the first comprehensive work of this kind and covers all three aspects of research in design synthesis: understanding what constitutes and influences synthesis; the major approaches to synthesis; and the diverse range of tools created to support this crucial design task. The chapters comprise cutting-edge research and established methods, written by the originators of this growing field of research. They cover all major generic synthesis approaches, i.e., composition, retrieval, change and repair, and tackle problems from a wide variety of domains within architecture and engineering, as well as areas of application including clocks, sensors and medical devices. The book contains an editorial introduction to the chapters and the broader context of research they represent. With its range of tools and methods covered, it is an ideal introduction to design synthesis for those intending to research in this area, as well as being a valuable source of ideas for educators and practitioners of engineering design.
This book presents selected results of the XI Scientific Conference on Selected Issues of Electrical Engineering and Electronics (WZEE), held in Rzeszow and Czarna, Poland on September 27-30, 2013. The main aim of the Conference was to provide a forum for academia and industry to discuss and present the latest technological advances and research results, and to integrate the emerging interdisciplinary scientific community in the fields of electrical engineering, electronics and mechatronics. The Conference was organized by the Rzeszow Division of the Polish Association of Theoretical and Applied Electrical Engineering (PTETiS) in cooperation with the Faculty of Electrical and Computer Engineering of Rzeszow University of Technology and the Faculty of Mathematics and Natural Sciences of Rzeszow University.
The book describes the K-Method, which has been developed by the authors. The purpose of the K-Method is to negotiate and administer a complex portfolio of customised materials, all belonging to the same purchasing group (e.g. labels). The underlying idea is to agree prices for specification features, instead of giving each material an individual price based on its unique specification. By doing so, a price formula is agreed between the buyer and supplier that even defines the prices of future materials with any kind of specification.
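The idea of pricing specification features rather than individual materials can be sketched as follows; the feature names and rates below are invented for illustration and are not the book's actual formula.

```python
# Agreed formula: price = base + sum(feature rate * feature quantity).
# All rates and features below are illustrative, not taken from the book.
rates = {
    "base": 0.020,             # fixed cost per label
    "colour": 0.004,           # per printing colour
    "area_cm2": 0.0008,        # per square centimetre of label area
    "special_adhesive": 0.006, # surcharge for a special adhesive
}

def label_price(colours, area_cm2, special_adhesive=False):
    """Price any current or future label from its specification features."""
    price = rates["base"]
    price += rates["colour"] * colours
    price += rates["area_cm2"] * area_cm2
    if special_adhesive:
        price += rates["special_adhesive"]
    return round(price, 4)

# Two different materials in the same purchasing group, priced by one formula:
print(label_price(colours=2, area_cm2=25))                        # 0.048
print(label_price(colours=4, area_cm2=40, special_adhesive=True))
```

Because only the feature rates are negotiated, any future specification in the group is priced automatically by the same formula.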
The 2nd edition of Chopra's Google SketchUp provides key pedagogical elements which help prepare readers for the workforce. The content provides real-world and applied material, including improved PowerPoint presentations and how-to animations. Additional features include content updated to reflect software upgrades and market use; new pedagogical elements and interior design; and more robust resources appropriate for different users of Google SketchUp. The book also addresses the similarities between the adapted title, Google SketchUp 8 for Dummies, and Google SketchUp 2. This includes a title that contains the core content and basic software how-to from For Dummies; a TOC revised to reflect the course; and new material developed and written by the writer and academic advisors and reviewers. This edition goes beyond basic software use to teach with portions of SketchUp.
This text is about the spreading of information and influence in complex networks. Although epidemic and social spreading were previously considered similar and modeled with parallel approaches, there is now experimental evidence that they work in subtly different ways. While spreading was previously explored through modeling, there is currently an explosion of work on revealing the mechanisms underlying complex contagion based on big data and data-driven approaches. This volume consists of four parts. Part 1 is an introduction, providing an accessible summary of the state of the art. Part 2 provides an overview of the central theoretical developments in the field. Part 3 describes the empirical work on observing spreading processes in real-world networks. Finally, Part 4 goes into detail on recent and exciting new developments: dedicated studies designed to measure specific aspects of the spreading processes, often using randomized control trials to isolate the network effect from confounders such as homophily. Each contribution is authored by leading experts in the field. This volume, though based on technical selections of the most important results on complex spreading, remains accessible to the newly interested. The main benefit to the reader is that the topics are carefully structured to take the novice to the level of expert on the topic of social spreading processes. This book will be of great importance to a wide field: from researchers in physics, computer science, and sociology to professionals in public policy and public health.
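The distinction between simple (epidemic-like) and complex (socially reinforced) contagion can be demonstrated with a minimal threshold model on a ring lattice; the network and thresholds are invented for illustration.

```python
# A small ring lattice: each of n nodes is linked to 2 neighbours on each side.
n = 50
neighbours = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n}
              for i in range(n)}

def spread(threshold):
    """Threshold contagion: a node adopts once `threshold` neighbours have.

    threshold=1 behaves like simple contagion (one exposure suffices);
    threshold>=2 is complex contagion, needing reinforcement from
    multiple contacts.
    """
    adopted = {0, 1}                   # seed a pair of adjacent adopters
    changed = True
    while changed:
        changed = False
        for node in range(n):
            if node not in adopted and len(neighbours[node] & adopted) >= threshold:
                adopted.add(node)
                changed = True
    return len(adopted)

print(spread(1), spread(2), spread(3))
```

With thresholds 1 and 2 the cascade covers the whole ring, but at threshold 3 the two seeds cannot provide enough reinforcement and the contagion stalls immediately, the signature difference between simple and complex spreading.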
This book covers some important topics in the construction of computable general equilibrium (CGE) models and examines use of these models for the analysis of economic policies, their properties, and their implications. Readers will find explanation and discussion of the theoretical structure and practical application of several model typologies, including dynamic, stochastic, micro-macro, and simulation models, as well as different closure rules and policy experiments. The presentation of applications to various country and problem-specific case studies serves to provide an informed and clearly articulated summary of the state of the art and the most important methodological advancements in the field of policy modeling within the framework of general equilibrium analysis. The book is an outcome of a recent workshop of the Italian Development Economists Association attended by a group of leading practitioners involved in the generation of CGE models and research on modeling the economy and policy making. It will be of interest to researchers, professional economists, graduate students, and knowledgeable policy makers.
This book has the unique intention of returning the mathematical tools of neural networks to the biological realm of the nervous system, where they originated a few decades ago. It aims to introduce, in a didactic manner, two relatively recent developments in neural network methodology, namely recurrence in the architecture and the use of spiking or integrate-and-fire neurons. In addition, the neuro-anatomical processes of synapse modification during development, training, and memory formation are discussed as realistic bases for weight adjustment in neural networks. While neural networks have many applications outside biology, where it is irrelevant precisely which architecture and which algorithms are used, when a neuro-biological phenomenon is modelled or simulated in terms of a neural network it is essential that there be a close relationship between the network's properties and the phenomenon itself. A recurrent architecture, the use of spiking neurons and appropriate weight-update rules all contribute to the plausibility of a neural network in such a case. Therefore, in the first half of this book the foundations are laid for the application of neural networks as models for the various biological phenomena treated in the second half. These include various neural network models of sensory and motor control tasks that implement one or several of the requirements for biological plausibility.
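A minimal sketch of the integrate-and-fire neuron mentioned above: the membrane potential integrates a constant input current, leaks back toward rest, and resets after each threshold crossing. All constants are illustrative, not taken from the book.

```python
# Leaky integrate-and-fire neuron: tau * dV/dt = -(V - V_rest) + R*I
# Constants below are illustrative, not from the book.
tau, R = 0.020, 1e7          # membrane time constant (s), resistance (ohm)
V_rest, V_thresh, V_reset = -0.070, -0.050, -0.070   # volts
I = 2.5e-9                   # constant input current, A

dt = 1e-4                    # integration step, s
V = V_rest
spikes = []
for step in range(10000):    # 1 second of simulated time
    V += dt / tau * (-(V - V_rest) + R * I)   # Euler step of the leak + drive
    if V >= V_thresh:        # threshold crossing: record a spike and reset
        spikes.append(step * dt)
        V = V_reset

print(f"{len(spikes)} spikes in 1 s")
```

Because R*I (25 mV) exceeds the 20 mV gap between rest and threshold, the neuron fires regularly; reducing I below 2 nA would silence it.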
This book is a status report. It provides a broad overview of the most recent developments in the field, spanning a wide range of topical areas in simulational condensed matter physics. These areas include recent developments in simulations of classical statistical mechanics models, electronic structure calculations, quantum simulations, and simulations of polymers. Both new physical results and novel simulational and data analysis methods are presented. Some of the highlights of this volume include detailed accounts of recent theoretical developments in electronic structure calculations, novel quantum simulation techniques and their applications to strongly interacting lattice fermion models, and a wide variety of applications of existing methods as well as novel methods in the simulation of classical statistical mechanics models, including spin glasses and polymers.
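A classical statistical-mechanics simulation of the kind surveyed here, in minimal form: Metropolis Monte Carlo for a small 2D Ising model. The lattice size and temperature are chosen for illustration only.

```python
import math
import random

rng = random.Random(1)

# Metropolis Monte Carlo for a small 2D Ising model with J = 1 and
# periodic boundary conditions.
L = 16
T = 1.5                      # well below Tc ~ 2.269, so spins should stay ordered
spins = [[1 for _ in range(L)] for _ in range(L)]   # cold (all-up) start

def site_energy(i, j):
    """Bond energy of site (i, j) with its four neighbours."""
    nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
          + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
    return -spins[i][j] * nb

for _ in range(100 * L * L):                 # ~100 Monte Carlo sweeps
    i, j = rng.randrange(L), rng.randrange(L)
    dE = -2 * site_energy(i, j)              # energy change if this spin flips
    if dE <= 0 or rng.random() < math.exp(-dE / T):
        spins[i][j] *= -1                    # Metropolis acceptance rule

m = abs(sum(sum(row) for row in spins)) / (L * L)
print(f"|magnetisation| per spin at T={T}: {m:.2f}")
```

At this sub-critical temperature the magnetisation should remain close to 1; raising T above the critical point would drive it toward 0.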
This volume explores the complex problems that arise in the modeling and simulation of crowd dynamics in order to present the state-of-the-art of this emerging field and contribute to future research activities. Experts in various areas apply their unique perspectives to specific aspects of crowd dynamics, covering the topic from multiple angles. These include a demonstration of how virtual reality may solve dilemmas in collecting empirical data; a detailed study on pedestrian movement in smoke-filled environments; a presentation of one-dimensional conservation laws with point constraints on the flux; a collection of new ideas on the modeling of crowd dynamics at the microscopic scale; and others. Applied mathematicians interested in crowd dynamics, pedestrian movement, traffic flow modeling, urban planning, and other topics will find this volume a valuable resource. Additionally, researchers in social psychology, architecture, and engineering may find this information relevant to their work.
You may like...
- SolidWorks Simulation 2022 Black Book… by Gaurav Verma, Matt Weber (Hardcover, R1,719)
- Advances in Principal Component Analysis by Fausto Pedro Garcia Marquez (Hardcover)
- Pioneers in Machinima: The Grassroots of… by Tracy Gaynor Harwood (Hardcover, R1,786)
- Applying AI-Based IoT Systems to… by Bhatia Madhulika, Bhatia Surabhi, … (Hardcover, R7,037)
- Recent Advances in Numerical Simulations by Francisco Bulnes, Jan Peter Hessling (Hardcover)
- Software Defined Radio using MATLAB… by Robert W. Stewart, Kenneth W Barlee, … (Hardcover, R1,740)