The research and its outcomes presented in this collection focus on various aspects of high-performance computing (HPC) software and its development, which is confronted with various challenges as today's supercomputer technology heads towards exascale computing. The individual chapters address one or more of the research directions (1) computational algorithms, (2) system software, (3) application software, (4) data management and exploration, (5) programming, and (6) software tools. The collection thereby highlights pioneering research findings as well as innovative concepts in exascale software development that have been conducted under the umbrella of the priority programme "Software for Exascale Computing" (SPPEXA) of the German Research Foundation (DFG) and that have been presented at the SPPEXA Symposium, January 25-27, 2016, in Munich. The book has an interdisciplinary appeal: scholars from computational sub-fields in computer science, mathematics, physics, or engineering will find it of particular interest.
DRAG (from un modele de la Demande Routiere, des Accidents et leur Gravite) is a complex computer model that simulates accident propensities under detailed conditions. The DRAG approach constitutes the largest road accident modelling effort ever undertaken. Gaudry is the creator and developer of DRAG and this work explains its nature, purpose and value. Such a model, which explains accidents for a whole region, province or country, has advantages in answering many questions asked about accidents (such as the role of the economic cycle, weather, prices, insurance etc.) that other models fail to take fully into account.
An embedded system is loosely defined as any system that utilizes electronics but is not perceived or used as a general-purpose computer. Traditionally, one or more electronic circuits or microprocessors are literally embedded in the system, either taking up roles that used to be performed by mechanical devices, or providing functionality that is not otherwise possible. The goal of this book is to investigate how formal methods can be applied to the domain of embedded system design. The emphasis is on the specification, representation, validation, and design exploration of such systems from a high-level perspective. The authors review the framework upon which the theories and experiments are based, and through which the formal methods are linked to synthesis and simulation. A formal verification methodology is formulated to verify general properties of the designs and demonstrate that this methodology is efficient in dealing with the problem of complexity and effective in finding bugs. However, manual intervention in the form of abstraction selection and separation of timing and functionality is required. It is conjectured that, for specific properties, efficient algorithms exist for completely automatic formal validations of systems. Synchronous Equivalence: Formal Methods for Embedded Systems presents a brand new formal approach to high-level equivalence analysis. It opens design exploration avenues previously uncharted. It is a work that can stand alone but at the same time is fully compatible with the synthesis and simulation framework described in another book by Kluwer Academic Publishers, Hardware-Software Co-Design of Embedded Systems: The POLIS Approach, by Balarin et al.
Synchronous Equivalence: Formal Methods for Embedded Systems will be of interest to embedded system designers (automotive electronics, consumer electronics, and telecommunications), micro-controller designers, CAD developers and students, as well as IP providers, architecture platform designers, operating system providers, and designers of VLSI circuits and systems.
"Intelligent Control" considers non-traditional modelling and control approaches to nonlinear systems. Fuzzy logic, neural networks and evolutionary computing techniques are the main tools used. The book presents a modular switching fuzzy logic controller where a PD-type fuzzy controller is executed first, followed by a PI-type fuzzy controller, thus improving the performance of the controller compared with a PID-type fuzzy controller. The advantage of the switching-type fuzzy controller is that it uses one rule-base, thus minimising the rule-base during execution. A single rule-base is developed by merging the membership functions for change of error of the PD-type controller and sum of error of the PI-type controller. Membership functions are then optimized using evolutionary algorithms. Since the two fuzzy controllers are executed in series, further tuning of the differential and integral scaling factors of the controller is then performed. Neural-network-based tuning for the scaling parameters of the fuzzy controller is then described, and finally an evolutionary algorithm is applied to the neurally-tuned fuzzy controller, in which the sigmoidal function shape of the neural network is determined. The important issue of stability is addressed, and the text demonstrates empirically that the developed controller is stable within the operating range. The text concludes with ideas for future research to show the reader the potential for further study in this area. "Intelligent Control" will be of interest to researchers from engineering and computer science backgrounds working in intelligent and adaptive control.
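The PD/PI switching design described above is specific to the book, but the fuzzify/rule/defuzzify cycle it builds on can be sketched generically. The triangular membership breakpoints, the single error input, and the singleton rule outputs below are illustrative assumptions, not the book's rule-base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_output(error):
    """Minimal one-input fuzzy controller: fuzzify the error into three sets,
    fire one rule per set, defuzzify by a weighted average of singleton outputs."""
    rules = [  # (membership degree, singleton output)
        (tri(error, -2, -1, 0), +1.0),  # negative error -> push up
        (tri(error, -1,  0, 1),  0.0),  # near-zero error -> hold
        (tri(error,  0,  1, 2), -1.0),  # positive error -> push down
    ]
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0
```

A PD-type controller of the kind the book discusses would use a two-input rule-base over error and change-of-error; this sketch keeps one input for brevity.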
This book offers a unique multidisciplinary overview of how humans interact with soft objects and how multiple sensory signals are used to perceive material properties, with an emphasis on object deformability. The authors describe a range of setups that have been employed to study and exploit sensory signals involved in interactions with compliant objects, as well as techniques to simulate and modulate softness, including a psychophysical perspective of the field. Multisensory Softness focuses on the cognitive mechanisms underlying the use of multiple sources of information in softness perception. Divided into three sections, the first, Perceptual Softness, deals with the sensory components and computational requirements of softness perception; the second, Sensorimotor Softness, looks at the motor components of the interaction with soft objects; and the final part, Artificial Softness, focuses on the identification of exploitable guidelines to help replicate softness in artificial environments.
Manufacturing and operations management paradigms are evolving toward more open and resilient spaces where innovation is driven not only by ever-changing customer needs but also by agile and fast-reacting networked structures. Flexibility, adaptability and responsiveness are properties that the next generation of systems must have in order to successfully support such new emerging trends. Customers are being attracted to become involved in co-innovation networks, as improved responsiveness and agility are expected from industry ecosystems. Renewed production systems need to be modeled, engineered and deployed in order to achieve cost-effective solutions. The BASYS conferences have been developed and organized as a forum in which to share visions and research findings for innovative, sustainable and knowledge-based product-services and manufacturing models. Thus, the focus of BASYS is to discuss how human actors, emergent technologies and even organizations are integrated in order to redefine the way in which the value-creation process must be conceived and realized. BASYS 2010, which was held in Valencia, Spain, proposed new approaches in automation where synergies between people, systems and organizations need to be fully exploited in order to create high added-value products and services. This book contains the selection of the papers which were accepted for presentation at the BASYS 2010 conference, covering consolidated and emerging topics of the conference scope.
This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications. It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM. In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes' ready to be implemented. Agent-based modeling (ABM) is a powerful simulation-modeling technique that has seen a dramatic increase in real-world applications in recent years. In ABM, a system is modeled as a collection of autonomous decision-making entities called agents. Each agent individually assesses its situation and makes decisions on the basis of a set of rules. Agents may execute various behaviors appropriate for the system they represent, for example producing, consuming, or selling. ABM is increasingly used for simulating real-world systems, such as natural resource use, transportation, public health, and conflict. Decision makers increasingly demand support that covers a multitude of indicators that can be effectively addressed using ABM. This is especially the case in situations where human behavior is identified as a critical element. As a result, ABM will only continue its rapid growth.
This is the first volume in a series of books that aims to contribute to a cultural change in the community of empirical agent-based modelling. This series will bring together representational experiences and solutions in empirical agent-based modelling. Creating a platform to exchange such experiences allows comparison of solutions and facilitates learning in the empirical agent-based modelling community. Ultimately, the community requires such exchange and learning to test approaches and, thereby, to develop a robust set of techniques within the domain of empirical agent-based modelling. Based on robust and defendable methods, agent-based modelling will become a critical tool for research agencies, decision making and decision supporting agencies, and funding agencies. This series will contribute to more robust and defendable empirical agent-based modelling.
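The definition given above (autonomous agents, each applying a simple rule to its local situation) can be made concrete with a minimal sketch. The ring topology, adoption thresholds, and synchronous update below are illustrative assumptions, not taken from the book:

```python
import random

def run_adoption(n_agents=50, steps=30, seed=1):
    """Minimal agent-based model: agents on a ring adopt an innovation once
    the share of adopting neighbours reaches their personal threshold."""
    rng = random.Random(seed)
    thresholds = [rng.uniform(0.1, 0.5) for _ in range(n_agents)]
    adopted = [False] * n_agents
    adopted[0] = True                      # one initial adopter seeds the process
    for _ in range(steps):
        # Synchronous update: every agent decides on the same snapshot.
        decisions = []
        for i in range(n_agents):
            left, right = adopted[i - 1], adopted[(i + 1) % n_agents]
            share = (left + right) / 2
            decisions.append(adopted[i] or share >= thresholds[i])
        adopted = decisions
    return sum(adopted)
```

With these thresholds (all at most 0.5) adoption propagates one agent per step in each direction along the ring, so the macro-level diffusion curve emerges from purely local rules.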
The book describes what these models are, what they are based on, how they function, and then, most innovatively, how they can be used to generate new useful knowledge about the environmental system. It discusses this generation of knowledge by computer models from an epistemological perspective and illustrates it with numerous examples from applied and fundamental research. The book includes ample technical appendices and is a valuable source of information for graduate students and scientists alike working in the field of environmental sciences.
These are the proceedings of the 22nd International Conference on Domain Decomposition Methods, which was held in Lugano, Switzerland. With 172 participants from over 24 countries, this conference continued a long-standing tradition of internationally oriented meetings on Domain Decomposition Methods. The book features a well-balanced mix of established and new topics, such as the manifold theory of Schwarz Methods, Isogeometric Analysis, Discontinuous Galerkin Methods, exploitation of modern HPC architectures and industrial applications. As the conference program reflects, the growing capabilities in terms of theory and available hardware allow increasingly complex non-linear and multi-physics simulations, confirming the tremendous potential and flexibility of the domain decomposition concept.
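The Schwarz methods these proceedings discuss can be illustrated on the simplest possible case. The sketch below applies classical alternating Schwarz to -u'' = 1 on [0, 1] with homogeneous boundary conditions and two overlapping subdomains, solving each subdomain problem exactly; the model problem and overlap width are illustrative choices, not taken from the proceedings:

```python
def subdomain_solution(xl, xr, ul, ur):
    """Exact solution of -u'' = 1 on [xl, xr] with u(xl) = ul, u(xr) = ur.
    The general solution is u(x) = -x^2/2 + c1*x + c0."""
    c1 = (ur - ul + (xr**2 - xl**2) / 2) / (xr - xl)
    c0 = ul + xl**2 / 2 - c1 * xl
    return lambda x: -x**2 / 2 + c1 * x + c0

def alternating_schwarz(iters=20):
    """Two overlapping subdomains [0, 0.6] and [0.4, 1]; iterate, exchanging
    Dirichlet data at the artificial interfaces x = 0.4 and x = 0.6."""
    a, b = 0.4, 0.6
    gamma = 0.0                       # initial guess for u at x = b
    for _ in range(iters):
        left = subdomain_solution(0.0, b, 0.0, gamma)
        right = subdomain_solution(a, 1.0, left(a), 0.0)
        gamma = right(b)
    return subdomain_solution(0.0, b, 0.0, gamma)(0.5)
```

The interface error contracts by a factor of (a/b)·((1-b)/(1-a)) = 4/9 per sweep, and the iterates converge to the exact solution u(x) = x(1-x)/2, so u(0.5) = 0.125. Shrinking the overlap slows convergence, which is exactly the trade-off that motivates the methods in these proceedings.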
Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design. Once the related fundamentals of decision theory, economic analysis, and econometrics modelling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The scope of the chapters also provides: A rigorous framework of integrating the interests from both producer and consumers in engineering design, Analytical techniques of consumer choice modelling to forecast the impact of engineering decisions, Methods for synthesizing business and engineering models in multidisciplinary design environments, and Examples of effective application of Decision-Based Design supported by case studies. No matter whether you are an engineer facing decisions in consumer related product design, an instructor or student of engineering design, or a researcher exploring the role of decision making and consumer choice modelling in design, Decision-Based Design: Integrating Consumer Preferences into Engineering Design provides a reliable reference over a range of key topics.
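Consumer choice modelling of the kind described above is commonly done with a multinomial logit model, in which each design alternative gets a utility and choice probabilities follow from exponentiated utilities. The part-worth coefficients and attribute values below are hypothetical, purely to show the mechanics:

```python
import math

def logit_shares(utilities):
    """Multinomial logit: P(i) = exp(u_i) / sum_j exp(u_j)."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

def utility(price, performance, beta_price=-0.8, beta_perf=0.5):
    # Hypothetical linear utility with part-worths for price and performance.
    return beta_price * price + beta_perf * performance

# Two candidate designs (price, performance) plus a no-purchase option (utility 0).
designs = [(3.0, 6.0), (2.0, 3.0)]
utilities = [utility(p, q) for p, q in designs] + [0.0]
shares = logit_shares(utilities)
```

In an enterprise-driven DBD loop, such predicted shares would be multiplied by market size to forecast demand, closing the link from engineering attributes to enterprise objectives.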
Computational Challenges in the Geosciences addresses a cross-section of grand challenge problems arising in geoscience applications, including groundwater and petroleum reservoir simulation, hurricane storm surge, oceanography, volcanic eruptions and landslides, and tsunamis. Each of these applications gives rise to complex physical and mathematical models spanning multiple space-time scales, which can only be studied through computer simulation. The data required by the models is often highly uncertain, and the numerical solution of the models requires sophisticated algorithms which are mathematically accurate, computationally efficient and yet must preserve basic physical properties of the models. This volume summarizes current methodologies and future research challenges in this broad and important field.
The seventh book in the CHDL Series is composed of a selection of the best articles from the Forum on Specification and Design Languages (FDL'04). FDL is the European forum to learn about and exchange new trends in the application of languages and models for the design of electronic and heterogeneous systems. The forum was structured around four workshops that are all represented in the book by outstanding articles: Analog and Mixed-Signal Systems, UML-based System Specification and Design, C/C++-Based System Design, and Languages for Formal Specification and Verification. The Analog and Mixed-Signal Systems contributions bring some answers to the difficult problem of co-simulating discrete and continuous models of computation. The UML-based System Specification and Design chapters bring insight into how to use Model Driven Engineering to design Systems-on-Chip. The C/C++-Based System Design articles mainly explore system-level design with SystemC. The Languages for Formal Specification and Verification articles complete the picture. Overall, Advances in Design and Specification Languages for SoCs is an excellent opportunity to catch up with the latest research developments in the field of languages for electronic and heterogeneous system design.
This book focuses on new and emerging data mining solutions that offer a greater level of transparency than existing solutions. Transparent data mining solutions with desirable properties (e.g. effective, fully automatic, scalable) are covered in the book. Experimental findings of transparent solutions are tailored to different domain experts, and experimental metrics for evaluating algorithmic transparency are presented. The book also discusses societal effects of black box vs. transparent approaches to data mining, as well as real-world use cases for these approaches. As algorithms increasingly support different aspects of modern life, a greater level of transparency is sorely needed, not least because discrimination and biases have to be avoided. With contributions from domain experts, this book provides an overview of an emerging area of data mining that has profound societal consequences, and provides the technical background for readers to contribute to the field or to put existing approaches to practical use.
Discrete event simulation and agent-based modeling are increasingly recognized as critical for diagnosing and solving process issues in complex systems. Introduction to Discrete Event Simulation and Agent-based Modeling covers the techniques needed for success in all phases of simulation projects. These include: * Definition - The reader will learn how to plan a project and communicate using a charter. * Input analysis - The reader will discover how to determine defensible sample sizes for all needed data collections. They will also learn how to fit distributions to that data. * Simulation - The reader will understand how simulation controllers work, the Monte Carlo (MC) theory behind them, modern verification and validation, and ways to speed up simulation using variation reduction techniques and other methods. * Output analysis - The reader will be able to establish simultaneous intervals on key responses and apply selection and ranking, design of experiments (DOE), and black box optimization to develop defensible improvement recommendations. * Decision support - Methods to inspire creative alternatives are presented, including lean production. Also, over one hundred solved problems are provided and two full case studies, including one on voting machines that received international attention. Introduction to Discrete Event Simulation and Agent-based Modeling demonstrates how simulation can facilitate improvements on the job and in local communities. It allows readers to competently apply technology considered key in many industries and branches of government. It is suitable for undergraduate and graduate students, as well as researchers and other professionals.
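The output-analysis step above (establishing intervals on key responses) typically runs independent replications and forms a confidence interval on the mean response. A minimal sketch, using a toy single-server queue via the Lindley recursion; the arrival and service rates are illustrative, and a t quantile would replace the normal 1.96 in careful practice:

```python
import random
import statistics

def one_replication(rng, n_customers=200):
    """Toy single-server queue: mean waiting time via the Lindley recursion
    W_{n+1} = max(0, W_n + S_n - A_{n+1})."""
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        service = rng.expovariate(1.2)       # service rate 1.2
        interarrival = rng.expovariate(1.0)  # arrival rate 1.0
        wait = max(0.0, wait + service - interarrival)
        total += wait
    return total / n_customers

def replication_interval(n_reps=30, seed=7):
    """95% confidence interval on mean waiting time from independent replications."""
    rng = random.Random(seed)
    ys = [one_replication(rng) for _ in range(n_reps)]
    mean = statistics.fmean(ys)
    half = 1.96 * statistics.stdev(ys) / n_reps ** 0.5  # normal approximation
    return mean - half, mean + half
```

Simultaneous intervals on several responses, as the book covers, would additionally correct the confidence level (e.g. via Bonferroni) across responses.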
Drawing examples from mathematics, physics, chemistry, biology, engineering, economics, medicine, politics, and sports, this book illustrates how nonlinear dynamics plays a vital role in our world. Examples cover a wide range from the spread and possible control of communicable diseases, to the lack of predictability in long-range weather forecasting, to competition between political groups and nations. After an introductory chapter that explores what it means to be nonlinear, the book covers the mathematical concepts, such as limit cycles, fractals, chaos, bifurcations, and solitons, that will be applied throughout the book. Numerous computer simulations and exercises allow students to explore topics in greater depth using the Maple computer algebra system. The mathematical level of the text assumes prior exposure to ordinary differential equations and familiarity with the wave and diffusion equations. No prior knowledge of Maple is assumed. The book may be used at the undergraduate or graduate level to prepare science and engineering students for problems in the "real world," or for self-study by practicing scientists and engineers.
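The bifurcations and chaos mentioned above already appear in the one-dimensional logistic map x_{n+1} = r x_n (1 - x_n), a standard textbook example of this material (the book works in Maple; plain Python is used here just to illustrate):

```python
def logistic_orbit(r, x0=0.2, n=1000, keep=8):
    """Iterate the logistic map x -> r*x*(1-x) and return the last
    `keep` iterates, rounded, so the long-run behaviour is visible."""
    x = x0
    tail = []
    for i in range(n):
        x = r * x * (1 - x)
        if i >= n - keep:
            tail.append(round(x, 6))
    return tail
```

For r = 2.5 the orbit settles on the fixed point 1 - 1/r = 0.6; for r = 3.2 it settles on a period-2 cycle (the first bifurcation); beyond r ≈ 3.57 the dynamics become chaotic, which is the regime behind the weather-forecasting example above.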
In GPU Pro5: Advanced Rendering Techniques, section editors Wolfgang Engel, Christopher Oat, Carsten Dachsbacher, Michal Valient, Wessam Bahnassi, and Marius Bjorge have once again assembled a high-quality collection of cutting-edge techniques for advanced graphics processing unit (GPU) programming. Divided into six sections, the book covers rendering, lighting, effects in image space, mobile devices, 3D engine design, and compute. It explores rasterization of liquids, ray tracing of art assets that would otherwise be used in a rasterized engine, physically based area lights, volumetric light effects, screen-space grass, the usage of quaternions, and a quadtree implementation on the GPU. It also addresses the latest developments in deferred lighting on mobile devices, OpenCL optimizations for mobile devices, morph targets, and tiled deferred blending methods. In color throughout, GPU Pro5 is the only book that incorporates contributions from more than 50 experts who cover the latest developments in graphics programming for games and movies. It presents ready-to-use ideas and procedures that can help solve many of your daily graphics programming challenges. Example programs with source code are provided on the book's CRC Press web page.
Supramolecular chemistry has been defined by J.-M. Lehn as "a highly interdisciplinary field of science covering the chemical, physical, and biological features of chemical species of higher complexity, that are held together and organized by means of intermolecular (noncovalent) binding interactions" (Science, 1993). Recognition, reactivity, and transport represent three basic functional features, in essence dynamics, which may be translated into structural features. The purpose of the NATO workshop which took place September 1-5, 1993 at the Bischenberg (near Strasbourg) was to present computations which may contribute to the atomic-level understanding of the structural and thermodynamical features involved in the processes of molecular recognition and supramolecular organization. The main focus was therefore on the many facets of "supra-molecular modeling." Other applications of computers in chemistry, such as automation, simulation of processes, procedures for fitting kinetic or thermodynamic data, computer-assisted synthetic strategies, and use of databases for structure elucidation or for bibliographic searches, have an obvious impact in supramolecular chemistry as well, but were not presented at the workshop.
This book presents the state-of-the-art technology in Serious Games, which is driven extensively by applications and research in simulation. The topics in this book include: (1) Fashion simulation; (2) Chinese calligraphy ink diffusion simulation; (3) Rehabilitation; (4) Long vehicle turning simulation; (5) Marine traffic conflict control; (6) CNC simulation; (7) Special needs education. The book also addresses the fundamental issues in Simulation and Serious Games, such as rapid collision detection, game engines and game development platforms. The target audience for this book includes scientists, engineers and practitioners involved in the field of Serious Games and Simulation. The major part of this book comprises papers presented at the 2012 Asia-Europe Workshop on Serious Games and Simulation held at Nanyang Technological University, Singapore (May 9, 2012). All the contributions have been peer reviewed by scientific committee members, with reports on quality, content and originality.
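Rapid collision detection, listed among the fundamental issues above, usually starts with a broad phase over axis-aligned bounding boxes: two boxes are disjoint exactly when they are separated along some axis. A minimal 2D sketch of that test:

```python
class AABB:
    """Axis-aligned bounding box, a standard broad-phase collision primitive."""
    def __init__(self, xmin, ymin, xmax, ymax):
        self.xmin, self.ymin = xmin, ymin
        self.xmax, self.ymax = xmax, ymax

def overlaps(a, b):
    # Separating-axis test: disjoint iff separated along the x or y axis.
    return not (a.xmax < b.xmin or b.xmax < a.xmin or
                a.ymax < b.ymin or b.ymax < a.ymin)
```

A game engine would run this cheap test over candidate pairs first and reserve exact (narrow-phase) geometry tests for the pairs whose boxes overlap.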
Shape interrogation is the process of extraction of information from a geometric model. It is a fundamental component of Computer Aided Design and Manufacturing (CAD/CAM) systems. The authors focus on shape interrogation of geometric models bounded by free-form surfaces. Free-form surfaces, also called sculptured surfaces, are widely used in the bodies of ships, automobiles and aircraft, which have both functionality and attractive shape requirements. Many electronic devices as well as consumer products are designed with aesthetic shapes, which involve free-form surfaces. This book provides the mathematical fundamentals as well as algorithms for various shape interrogation methods including nonlinear polynomial solvers, intersection problems, differential geometry of intersection curves, distance functions, curve and surface interrogation, umbilics and lines of curvature, geodesics, and offset curves and surfaces. This book will be of interest both to graduate students and professionals.
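For free-form curves of the kind the book treats, the basic evaluation primitive is de Casteljau's algorithm, which evaluates a Bezier curve by repeated linear interpolation of its control polygon. A 2D sketch (the quadratic control points in the usage below are illustrative):

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve with the given 2D control points at parameter t."""
    pts = list(points)
    while len(pts) > 1:
        # One round of linear interpolation between consecutive control points.
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]
```

For the control points (0,0), (1,2), (2,0) the curve interpolates the endpoints and passes through (1, 1) at t = 0.5. Interrogation methods such as intersection and offset computation build on exactly this kind of stable evaluation.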
This introduction to random variables and signals provides engineering students with the analytical and computational tools for processing random signals using linear systems. It presents the underlying theory as well as examples and applications using computational aids throughout; in particular, computer-based symbolic computation programs are used for performing the analytical manipulations and the numerical calculations. The accompanying CD-ROM provides Mathcad(TM) and Matlab(TM) notebooks and sheets to develop processing methods. Intended for a one-semester course for advanced undergraduate or beginning graduate students, the book covers such topics as: set theory and probability; random variables, distributions, and processes; deterministic signals, spectral properties, and transformations; and filtering and detection theory. The large number of worked examples together with the programming aids make the book eminently suited for self-study as well as classroom use.
This book continues the biennial series of conference proceedings, which has become a classical reference resource in traffic and granular research alike, and addresses the latest developments at the intersection of physics, engineering and computational science. These involve complex systems, in which multiple simple agents, be they vehicles or particles, give rise to surprising and fascinating phenomena. The contributions collected in these proceedings cover several research fields, all of which deal with transport. Topics include highway, pedestrian and internet traffic; granular matter; biological transport; transport networks; data acquisition; data analysis and technological applications. Different perspectives, i.e., modeling, simulations, experiments, and phenomenological observations, are considered.
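Highway traffic of the kind these proceedings cover is often modelled with the Nagel-Schreckenberg cellular automaton: each car accelerates, brakes to the gap ahead, randomly slows down, and then all cars move in parallel. A minimal sketch on a circular road (the parameters below are the commonly used defaults, chosen here for illustration):

```python
import random

def nasch_step(pos, vel, length, vmax=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg traffic model.
    pos: cell positions on a ring of `length` cells; vel: speeds. In-place."""
    n = len(pos)
    idx = sorted(range(n), key=lambda i: pos[i])
    for j, i in enumerate(idx):
        ahead = idx[(j + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % length  # empty cells to the car ahead
        v = min(vel[i] + 1, vmax)                 # 1. accelerate
        v = min(v, gap)                           # 2. brake to avoid collision
        if v > 0 and rng.random() < p_slow:       # 3. random slowdown
            v -= 1
        vel[i] = v
    for i in range(n):                            # 4. move all cars in parallel
        pos[i] = (pos[i] + vel[i]) % length
    return pos, vel
```

Despite these three local rules, the model reproduces spontaneous phantom jams at high density, one of the emergent phenomena the proceedings describe.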
This book is devoted to a new branch of experimental design theory called simulation experimental design. There are many books devoted either to the theory of experimental design or to system simulation techniques, but in this book an approach combining both fields is developed. In particular, the mathematical theory of such universal variance reduction techniques as splitting and Russian roulette is explored. The book contains a number of results on regression design theory related to nonlinear problems, the E-optimum criterion, and designs which minimize bias. Audience: This volume will be of value to readers interested in systems simulation, applied statistics and numerical methods, with basic knowledge of applied statistics and linear algebra.
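Russian roulette, one of the variance reduction techniques named above, terminates low-weight samples probabilistically while keeping the estimator unbiased by reweighting the survivors. A minimal sketch; the integrand (E[U^2] = 1/3 for U uniform on [0,1]) and the weight threshold are illustrative assumptions:

```python
import random

def roulette(weight, survive_p=0.5, rng=random):
    """Russian roulette: kill a low-weight sample with probability 1 - survive_p;
    survivors carry weight / survive_p, so the expected weight is unchanged."""
    if rng.random() < survive_p:
        return weight / survive_p
    return 0.0

def estimate(n=200_000, threshold=0.1, seed=3):
    """Monte Carlo estimate of E[U^2] = 1/3, applying roulette below threshold."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        w = rng.random() ** 2        # sample weight after some transport process
        if w < threshold:
            w = roulette(w, rng=rng)
        total += w
    return total / n
```

Splitting is the dual move: a sample headed into an important region is cloned into k copies, each carrying weight w/k, again leaving the expectation untouched while reshaping the variance.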
The management of production and service processes can be supported by microcomputer simulation models, effectively and inexpensively, if the techniques are presented in an understandable manner. Drs. Klafehn, Weinroth, and Boronico prove this and show how to do it, not only for the benefit of operations managers themselves, but for others with management responsibilities in a variety of businesses and industries. They will learn how important daily operations problems can be modeled on a microcomputer, gain understanding of overall simulation methodology, and learn the several forms of cost savings achievable through simulation. For teachers in business schools the book will also provide a link between general management and the management of engineering and R&D. The first chapter introduces the reader to the concepts and steps for undertaking a microcomputer simulation project. In addition, the benefits, drawbacks, and myths are reviewed in detail. Chapter two explores, in a conversational scenario, what is involved in taking a management operations problem involving a truck transfer depot from its point of inception to the formulation of a systems operation model, which in a later chapter is put into a computer simulation model and tested to, in a sense, come up with answers to the questions posed in the hypothetical conversation. Subsequent chapters in the book are oriented to a discussion of other operations management problems and the effort to seek insight and solutions through simulation modeling. A Just-in-Time manufacturing system is addressed, recognizing the push-pull concept as well as looking at the quality aspect. Attempting to determine the optimum levels for safety stock, order points, and order quantity is investigated through computer simulation. These levels are predicated on balancing the costs associated with ordering and holding goods as well as the penalty costs of stocking out.
Using a simulated environment enables the inclusion of the variability evidenced by the type of distribution. The remaining chapters also review alternative rules and what-ifs as applied to machine configuration, facility location for a satellite EMS unit, and job shop operations. Each of the applications chapters provides a printout of the basic computer model, written in GPSS, that was then modified to investigate alternative scenarios.
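The order-point/order-quantity question raised above can be prototyped in a few lines. The cost figures, demand distribution, and lead time below are hypothetical, and the book's original models were written in GPSS rather than Python:

```python
import random

HOLD, STOCKOUT, ORDER = 0.5, 8.0, 25.0  # hypothetical unit costs
LEAD_TIME = 3                           # days from ordering to arrival

def average_daily_cost(order_point, order_qty, days=365, seed=11):
    """Toy (s, Q) inventory policy: when the inventory position drops to the
    order point s, order Q units; unmet demand is penalised as lost sales."""
    rng = random.Random(seed)
    on_hand, cost = 50, 0.0
    pipeline = []                                  # (arrival_day, qty)
    for day in range(days):
        on_hand += sum(q for d, q in pipeline if d <= day)
        pipeline = [(d, q) for d, q in pipeline if d > day]
        demand = rng.randint(0, 8)                 # hypothetical daily demand
        sold = min(on_hand, demand)
        cost += (demand - sold) * STOCKOUT         # stockout penalty
        on_hand -= sold
        position = on_hand + sum(q for _, q in pipeline)
        if position <= order_point:
            pipeline.append((day + LEAD_TIME, order_qty))
            cost += ORDER
        cost += on_hand * HOLD                     # end-of-day holding cost
    return cost / days

# Compare two candidate policies under the same demand stream (common seed).
low_s = average_daily_cost(order_point=4, order_qty=40)
high_s = average_daily_cost(order_point=20, order_qty=40)
```

Running candidate policies against a common random demand stream, as here, is the simulation analogue of the what-if comparisons the chapters describe.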