This book deals with transportation processes denoted as the Real-time Distribution of Perishable Goods (RDOPG). It presents three contributions to the field of transportation. First, a model that minimizes customer inconvenience is formulated. Second, a pro-active real-time control approach is proposed. Stochastic knowledge is generated from past request information by a new forecasting approach and used in the pro-active approach to guide vehicles to request-likely areas before real requests arrive there. Various computational results show that in many cases the pro-active approach achieves significantly improved results. Moreover, a measure for determining the structural quality of request data sets is proposed. The third contribution of this book is a method for considering driver inconvenience aspects that arise from vehicle en-route diversion activities; specifically, this method makes it possible to restrict the number of en-route diversions performed.
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies and changes in operational and analytical applications over the last decade have given rise to the unification of these systems, which can benefit both workloads. Research and industry have reacted, and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because OLTP and OLAP systems have been separate, existing benchmarks focus only on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems are needed as well. Based on an examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload, and is applied to analyze the impact of optimizations typically used in the historically separate OLTP and OLAP domains in mixed-workload scenarios.
Over the last decades, Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by an ever wider use of this technology in different fields of science and, on the other hand, by an incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from 10 countries. They illustrate the breadth of application of this technology and the quality of problems solved using Discrete Event Simulation. Practical applications of simulation dominate in the present book. The book is aimed at researchers and students who work with Discrete Event Simulation and want to inform themselves about current applications. By focusing on discrete event simulation, it can also serve as a source of inspiration for practitioners seeking to solve specific problems in their work. For decision makers weighing the introduction of discrete event simulation for planning support and optimization, the book provides orientation on which specific problems could be solved with its help within an organization.
The book shows how simulation's long history and close ties to industry since the third industrial revolution have led to its growing importance in Industry 4.0. The book emphasises the role of simulation in the new industrial revolution, and its application as a key aspect of making Industry 4.0 a reality - and thus achieving the complete digitisation of manufacturing and business. It presents various perspectives on simulation and demonstrates its applications, from augmented or virtual reality to process engineering, and from quantum computing to intelligent management. Simulation for Industry 4.0 is a guide and milestone for the simulation community, as well as those readers working to achieve the goals of Industry 4.0. The connections between simulation and Industry 4.0 drawn here will be of interest not only to beginners, but also to practitioners and researchers as a point of departure in the subject, and as a guide for new lines of study.
A wide variety of processes occur on multiple scales, either naturally or as a consequence of measurement. This book contains methodology for the analysis of data that arise from such multiscale processes. The book brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. The Bayesian approach also facilitates the use of knowledge from prior experience or data, and these methods can handle different amounts of prior knowledge at different scales, as often occurs in practice. The book is aimed at statisticians, applied mathematicians, and engineers working on problems dealing with multiscale processes in time and/or space, such as in engineering, finance, and environmetrics. The book will also be of interest to those working on multiscale computation research. The main prerequisites are knowledge of Bayesian statistics and basic Markov chain Monte Carlo methods. A number of real-world examples are thoroughly analyzed in order to demonstrate the methods and to assist the readers in applying these methods to their own work. To further assist readers, the authors are making source code (for R) available for many of the basic methods discussed herein.
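As a quick illustration of the "basic Markov chain Monte Carlo" prerequisite mentioned above, the following is a minimal random-walk Metropolis sampler for the posterior of a normal mean. It is a generic sketch in Python, not the book's R source code, and the prior, step size and synthetic data values are invented for the example.

```python
import numpy as np

# Minimal random-walk Metropolis sampler for the posterior of a normal mean
# with a normal prior -- a sketch of "basic MCMC", not code from the book.

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)   # synthetic observations

def log_posterior(mu, y, prior_mean=0.0, prior_sd=10.0, sigma=1.0):
    log_lik = -0.5 * np.sum((y - mu) ** 2) / sigma**2
    log_prior = -0.5 * (mu - prior_mean) ** 2 / prior_sd**2
    return log_lik + log_prior

samples, mu, step = [], 0.0, 0.5
for _ in range(5000):
    proposal = mu + step * rng.normal()
    # accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_posterior(proposal, data) - log_posterior(mu, data):
        mu = proposal
    samples.append(mu)

print("posterior mean estimate:", np.mean(samples[1000:]))  # discard burn-in
```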
This volume presents a compelling collection of state-of-the-art work in algorithmic computational biology, honoring the legacy of Professor Bernard M.E. Moret in this field. Reflecting the wide-ranging influences of Prof. Moret's research, the coverage encompasses such areas as phylogenetic tree and network estimation, genome rearrangements, cancer phylogeny, species trees, divide-and-conquer strategies, and integer linear programming. Each self-contained chapter provides an introduction to a cutting-edge problem of particular computational and mathematical interest. Topics and features: addresses the challenges in developing accurate and efficient software for the NP-hard maximum likelihood phylogeny estimation problem; describes the inference of species trees, covering strategies to scale phylogeny estimation methods to large datasets, and the construction of taxonomic supertrees; discusses the inference of ultrametric distances from additive distance matrices, and the inference of ancestral genomes under genome rearrangement events; reviews different techniques for inferring evolutionary histories in cancer, from the use of chromosomal rearrangements to tumor phylogenetics approaches; examines problems in phylogenetic networks, including questions relating to discrete mathematics, and issues of statistical estimation; highlights how evolution can provide a framework within which to understand comparative and functional genomics; provides an introduction to Integer Linear Programming and its use in computational biology, including its use for solving the Traveling Salesman Problem. Offering an invaluable source of insights for computer scientists, applied mathematicians, and statisticians, this illuminating volume will also prove useful for graduate courses on computational biology and bioinformatics.
This book covers the major problems of turbulence and turbulent processes, including physical phenomena, their modeling and their simulation. After a general introduction in Chapter 1 illustrating many aspects of turbulent flows, averaged equations and kinetic energy budgets are provided in Chapter 2. The concept of turbulent viscosity as a closure of the Reynolds stress is also introduced. Wall-bounded flows are presented in Chapter 3, where aspects specific to boundary layers and channel or pipe flows are also pointed out. Free shear flows, namely free jets and wakes, are considered in Chapter 4. Chapter 5 deals with vortex dynamics. Homogeneous turbulence, isotropy and the dynamics of isotropic turbulence are presented in Chapters 6 and 7. Turbulence is then described both in physical space and in wave-number space. Time-dependent numerical simulations are presented in Chapter 8, where an introduction to large eddy simulation is offered. The last three chapters of the book summarize notable current numerical and experimental techniques. Many results are presented in a practical way, based on both experiments and numerical simulations. The book is written for advanced engineering students as well as postgraduate engineers and researchers. For students, it contains the essential results as well as details and demonstrations whose oral transmission is often tedious. At a more advanced level, the text provides numerous references that allow readers to quickly find further reading related to their work and to acquire deeper knowledge of topics of interest.
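As a pointer to the closure concept introduced in Chapter 2, the Reynolds decomposition and the Boussinesq eddy-viscosity hypothesis can be written as follows (a standard textbook formulation; the book's own notation may differ):

```latex
% Reynolds decomposition and Boussinesq eddy-viscosity closure (standard form)
\begin{align}
  u_i &= \overline{u}_i + u_i', \\
  -\overline{u_i' u_j'} &= \nu_t
    \left( \frac{\partial \overline{u}_i}{\partial x_j}
         + \frac{\partial \overline{u}_j}{\partial x_i} \right)
    - \frac{2}{3}\, k\, \delta_{ij},
  \qquad k = \tfrac{1}{2}\,\overline{u_k' u_k'} .
\end{align}
```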
This book presents essential methods and tools for research into the reliability of energy systems. It describes in detail the problem setting, formalisation, and use of algorithms for assessing the reliability of modern, large, and complex electric power systems. The book uses a wealth of tables and illustrations to represent results and source information in a clear manner. It discusses the main operating conditions that affect the reliability of electric power systems, and describes corresponding computing tools that can help solve issues as they arise. Further, all methodologies presented here are demonstrated in numerical examples. Though primarily intended for researchers and practitioners in the field of electric power systems, the book will also benefit general readers interested in this area.
This volume contains the articles presented at the 21st International Meshing Roundtable (IMR), organized in part by Sandia National Laboratories and held on October 7-10, 2012, in San Jose, CA, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts from a variety of disciplines and from all over the world to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics, and visualization.
Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with papers ranging from contemporary views to representative case studies. The OR Essentials series presents a unique cross-section of high quality research work fundamental to understanding contemporary issues and research across a range of Operational Research (OR) topics. It brings together some of the best research papers from the esteemed Operational Research Society and its associated journals, also published by Palgrave Macmillan.
A senior/graduate-level text and reference presenting state-of-the-art numerical techniques to solve the wave equation in heterogeneous fluid-solid media. Numerical models have become standard research tools in acoustic laboratories, and thus computational acoustics is becoming an increasingly important branch of ocean acoustic science. The first edition of this successful book, written by the recognized leaders of the field, was the first to present a comprehensive and modern introduction to computational ocean acoustics accessible to students. This revision, with 100 additional pages, completely updates the material in the first edition and includes new models based on current research. It includes problems and solutions in every chapter, making the book more useful in teaching (the first edition had a separate solutions manual). The book is intended for graduate and advanced undergraduate students of acoustics, geology and geophysics, applied mathematics, and ocean engineering, or as a reference in computational methods courses, as well as for professionals in these fields, particularly those working in government (especially Navy) and industry labs engaged in the development or use of propagation models.
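For orientation, the frequency-domain form of the wave equation that such propagation models discretize is the Helmholtz equation for the acoustic pressure p in a medium with variable density and sound speed (a standard formulation, not quoted from the book):

```latex
% Helmholtz equation for acoustic pressure p with density rho and sound speed c
\begin{equation}
  \rho \, \nabla \cdot \left( \frac{1}{\rho}\, \nabla p \right)
  + \frac{\omega^{2}}{c^{2}}\, p = 0 .
\end{equation}
```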
The increased computational power and software tools available to engineers have increased the use of and dependence on modeling and computer simulation throughout the design process. These tools have given engineers the capability of designing highly complex systems and computer architectures that were previously unthinkable. Every complex design project, from integrated circuits to aerospace vehicles to industrial manufacturing processes, requires these new methods. This book fulfills the essential need of system and control engineers at all levels to understand modeling and simulation. Written as a true text/reference, it has become a standard for senior/graduate-level courses in EE departments worldwide and a resource for professionals in this area seeking to update their skills.
* Presents a working foundation necessary for compliance with High Level Architecture (HLA) standards
This thesis addresses one of the most fundamental challenges for modern science: how can the brain as a network of neurons process information, how can it create and store internal models of our world, and how can it infer conclusions from ambiguous data? The author addresses these questions with the rigorous language of mathematics and theoretical physics, an approach that requires a high degree of abstraction to transfer results of wet lab biology to formal models. The thesis starts with an in-depth description of the state-of-the-art in theoretical neuroscience, which it subsequently uses as a basis to develop several new and original ideas. Throughout the text, the author connects the form and function of neuronal networks. This is done in order to achieve functional performance of biological brains by transferring their form to synthetic electronics substrates, an approach referred to as neuromorphic computing. The obvious aspect that this transfer can never be perfect but necessarily leads to performance differences is substantiated and explored in detail. The author also introduces a novel interpretation of the firing activity of neurons. He proposes a probabilistic interpretation of this activity and shows by means of formal derivations that stochastic neurons can sample from internally stored probability distributions. This is corroborated by the author's recent findings, which confirm that biological features like the high conductance state of networks enable this mechanism. The author goes on to show that neural sampling can be implemented on synthetic neuromorphic circuits, paving the way for future applications in machine learning and cognitive computing, for example as energy-efficient implementations of deep learning networks. The thesis offers an essential resource for newcomers to the field and an inspiration for scientists working in theoretical neuroscience and the future of computing.
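To give a flavour of the sampling idea described above, in which stochastic neurons draw samples from an internally stored distribution, here is a deliberately simplified Python sketch: binary units with logistic firing probabilities perform Gibbs-style sampling from a small Boltzmann distribution. It illustrates the general principle only, not the author's model or its neuromorphic implementation; the weights and biases are arbitrary.

```python
import numpy as np

# Stochastic binary "neurons" sampling from p(z) proportional to
# exp(0.5 * z^T W z + b^T z), with symmetric W and zero diagonal.
rng = np.random.default_rng(1)
W = np.array([[ 0.0, 1.2, -0.8],
              [ 1.2, 0.0,  0.5],
              [-0.8, 0.5,  0.0]])        # arbitrary symmetric couplings
b = np.array([-0.2, 0.1, 0.3])           # arbitrary biases

z = rng.integers(0, 2, size=3).astype(float)
counts = {}
for _ in range(20000):
    k = rng.integers(0, 3)               # pick a neuron at random
    u = W[k] @ z + b[k]                  # input from the other neurons
    z[k] = float(rng.uniform() < 1.0 / (1.0 + np.exp(-u)))  # stochastic firing
    counts[tuple(z)] = counts.get(tuple(z), 0) + 1

print({state: n / 20000 for state, n in sorted(counts.items())})  # empirical p(z)
```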
This book presents both methodological papers on and examples of applying behavioral predictive models to specific economic problems, with a focus on how to take people's behavior into account when making economic predictions. This is an important issue, since traditional economic models assumed that people make wise economic decisions based on a detailed rational analysis of all the relevant aspects. However, in reality - as Nobel Prize-winning research has shown - people have a limited ability to process information and, as a result, their decisions are not always optimal. The book also discusses the need for prediction-oriented statistical techniques, since many statistical methods currently used in economics focus on model fitting and do not always lead to good predictions. It is a valuable resource for researchers and students interested in the latest results and challenges, and for practitioners wanting to learn how to use state-of-the-art techniques.
Particle models play an important role in many applications in physics, chemistry and biology. They can be studied on the computer with the help of molecular dynamics simulations. This book presents in detail both the necessary numerical methods and techniques (linked-cell method, SPME method, tree codes, multipole technique) and the theoretical background and foundations. It illustrates the aspects of modelling, discretization, algorithms and their parallel implementation with MPI on computer systems with distributed memory. Furthermore, detailed explanations are given of the different steps of numerical simulation, and code examples are provided. With the description of the algorithms and the presentation of results from various simulations in the areas of materials science, nanotechnology, biochemistry and astrophysics, readers of this book will be able to write their own molecular dynamics programs step by step and to run successful experiments.
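To indicate the kind of building blocks such a program involves, here is a minimal Python sketch of one ingredient: Lennard-Jones pair forces combined with velocity-Verlet time integration. It deliberately omits the linked-cell, SPME and MPI-parallel machinery covered in the book, and the particle positions and parameters are illustrative only.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces (no cutoff, O(N^2) for clarity)."""
    f = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            r2 = r @ r
            sr6 = (sigma**2 / r2) ** 3
            fmag = 24.0 * eps * (2.0 * sr6**2 - sr6) / r2  # -dU/dr divided by r
            f[i] += fmag * r
            f[j] -= fmag * r
    return f

def velocity_verlet(pos, vel, dt=1e-3, steps=1000, mass=1.0):
    """Integrate Newton's equations with the velocity-Verlet scheme."""
    f = lj_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * f / mass * dt**2
        f_new = lj_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# Three particles at illustrative starting positions, initially at rest.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
vel = np.zeros_like(pos)
print(velocity_verlet(pos, vel)[0])
```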
The development of innovative drugs is becoming more difficult when relying on empirical approaches alone. This has inspired all major pharmaceutical companies to pursue alternative model-based paradigms. The key question is: how to find innovative compounds and, subsequently, appropriate dosage regimens? Written from the industry perspective and based on many years of experience, this book offers: - concepts for the creation of drug-disease models, introduced and supplemented with extensive MATLAB programs; - guidance for exploring and modifying these programs to enhance understanding of the key principles; - application of differential equations to pharmacokinetic, pharmacodynamic and (patho-)physiologic problems, thereby acknowledging their dynamic nature; - a range of topics from single exponential decay to adaptive dosing, from single-subject exploration to clinical trial simulation, and from empirical to mechanistic disease modeling. Students with an undergraduate mathematical background or equivalent education, an interest in life sciences, and skills in a high-level programming language such as MATLAB are encouraged to engage in model-based pharmaceutical research and development.
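As a taste of the simplest case mentioned above, single exponential decay, the following sketch integrates a one-compartment elimination model and checks it against the closed-form solution. The book's own examples are MATLAB programs; this sketch uses Python, and the dose, volume and rate constant are invented illustrative values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-compartment IV bolus: dC/dt = -k_e * C, with illustrative parameters.
dose, V, k_e = 100.0, 10.0, 0.3        # mg, L, 1/h (made-up values)
C0 = dose / V                          # initial plasma concentration

def pk_rhs(t, C):
    return -k_e * C                    # first-order elimination

t_eval = np.linspace(0.0, 24.0, 25)
sol = solve_ivp(pk_rhs, (0.0, 24.0), [C0], t_eval=t_eval)

analytic = C0 * np.exp(-k_e * t_eval)  # closed-form single exponential decay
print(np.max(np.abs(sol.y[0] - analytic)))  # numerical vs analytic agreement
```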
Through a series of step-by-step tutorials and numerous hands-on exercises, this book aims to equip the reader with both a good understanding of the importance of space in the abstract world of engineers and the ability to create a model of a product in virtual space - a skill essential for any designer or engineer who needs to present ideas concerning a particular product within a professional environment. The exercises progress logically from the simple to the more complex; while SolidWorks or NX is the software used, the underlying philosophy is applicable to all modeling software. In each case, the explanation covers the entire procedure from the basic idea and production capabilities through to the real model; the conversion from 3D model to 2D manufacturing drawing is also clearly explained. Topics covered include modeling of prism, axisymmetric, symmetric and sophisticated shapes; digitization of physical models using modeling software; creation of a CAD model starting from a physical model; free-form surface modeling; modeling of product assemblies following bottom-up and top-down principles; and the presentation of a product in accordance with the rules of technical documentation. This book, which includes more than 500 figures, will be ideal for students wishing to gain a sound grasp of space modeling techniques. Academics and professionals will find it to be an excellent teaching and research aid, and an easy-to-use guide.
Make-believe plays a far stronger role in both the design and use of interfaces, games and services than we have come to believe. This edited volume illustrates ways of grasping and utilising that connection to improve interaction, user experiences, and customer value. Useful for designers, undergraduates and researchers alike, this new research provides tools for understanding and applying make-believe in various contexts, ranging from digital tools to physical services. It takes the reader through a world of imagination and intuition applied to efficient practice, with topics including the connection of human-computer interaction (HCI) to make-believe and backstories, the presence of imagination in gamification, gameworlds, virtual worlds and service design, and the believability of make-believe-based designs in various contexts. Furthermore, it discusses the challenges inherent in applying make-believe as a basis for interaction design, as well as the enactive mechanism behind it. Whether used as a university textbook or simply for design inspiration, Digital Make-Believe provides new and efficient insight into approaching interaction in the way in which actual users of devices, software and services can innately utilise it.
This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.
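For reference, the nonlinear equations at the heart of any power flow study are the AC power balance equations at each bus i (standard textbook form; the book's notation may differ):

```latex
% AC power flow equations at bus i, with bus admittance entries G_ij + j B_ij
\begin{align}
  P_i &= \sum_{j=1}^{N} |V_i||V_j|\bigl(G_{ij}\cos\theta_{ij} + B_{ij}\sin\theta_{ij}\bigr), \\
  Q_i &= \sum_{j=1}^{N} |V_i||V_j|\bigl(G_{ij}\sin\theta_{ij} - B_{ij}\cos\theta_{ij}\bigr),
  \qquad \theta_{ij} = \theta_i - \theta_j .
\end{align}
```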
This book presents bond graph model-based fault detection with a focus on hybrid system models. The book addresses model design, simulation, control and model-based fault diagnosis of multidisciplinary engineering systems. The text begins with a brief survey of the state of the art, then focuses on hybrid systems. The author uses different bond graph approaches throughout the text and provides case studies.
This book reflects more than three decades of research on Cellular Automata (CA), and nearly a decade of work on the application of CA to model biological strings, which forms the foundation of 'A New Kind of Computational Biology' pioneered by the start-up, CARLBio. After a brief introduction to Cellular Automata (CA) theory and functional biology, it reports on the modeling of basic biological strings with CA, starting with the basic nucleotides and leading to codon and anti-codon CA models. It derives a more involved CA model of DNA, RNA, the entire translation process for amino acid formation, and the evolution of protein to its unique structure and function. In subsequent chapters the interaction of proteins with other bio-molecules is also modeled. The only prior knowledge assumed necessary is an undergraduate knowledge of computer programming and biology. The book adopts a hands-on, "do-it-yourself" approach to enable readers to apply the method provided to derive the CA rules and comprehend how these are related to the physical 'rules' observed in biology. In a single framework, the authors present two branches of science - computation and biology. Instead of rigorous molecular dynamics modeling, which the authors describe as a bottom-up approach, or relying on top-down new-age Artificial Intelligence (AI) and Machine Learning (ML), which depends on the extensive availability of quality data, this book takes the best from both the top-down and bottom-up approaches and establishes how the behavior of complex molecules is represented in CA. The CA rules are derived from basic knowledge of molecular interaction and construction observed in the biological world, and are mapped to a small subset of known results in order to derive and predict new ones. This book is useful for students, researchers and industry practitioners who want to explore modeling and simulation of complex physical-world systems from a different perspective. It raises the inevitable question: 'Are life and the universe nothing but a collection of continuous systems processing information?'
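To make the "do-it-yourself" idea concrete, the following Python sketch runs a one-dimensional elementary cellular automaton (Wolfram rule 110) and prints its evolution from a single seed cell. The CA rules the book derives for nucleotides, codons and proteins are considerably more involved; this is only an illustration of how a local rule updates a string of cells.

```python
import numpy as np

def step(cells, rule=110):
    """One update of a 1D binary CA with periodic boundaries."""
    table = [(rule >> i) & 1 for i in range(8)]   # rule number as lookup table
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right            # neighbourhood encoded as 0..7
    return np.array([table[i] for i in idx])

cells = np.zeros(64, dtype=int)
cells[32] = 1                                     # single seed cell
for _ in range(20):
    print("".join(".#"[c] for c in cells))        # '.' = 0, '#' = 1
    cells = step(cells)
```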
This book describes CoSMoS (Complex Systems Modelling and Simulation), a pattern-based approach to engineering trustworthy simulations that are both scientifically useful to the researcher and scientifically credible to third parties. This approach emphasises three key aspects to this development of a simulation as a scientific instrument: the use of explicit models to capture the scientific domain, the engineered simulation platform, and the experimental results of running simulations; the use of arguments to provide evidence that the scientific instrument is fit for purpose; and the close co-working of domain scientists and simulation software engineers. In Part I the authors provide a managerial overview: the rationale for and benefits of using the CoSMoS approach, and a small worked example to demonstrate it in action. Part II is a catalogue of the core patterns. Part III lists more specific "helper" patterns, showing possible routes to a simulation. Finally Part IV documents CellBranch, a substantial case study developed using the CoSMoS approach.
The sense of touch is fundamental to the interaction between humans and their environment; in virtual reality, objects are created by computer simulations and can be experienced through haptic devices. In this context, haptic textures are fundamental to a realistic haptic perception of virtual objects. This book formalizes the specific artefacts corrupting the rendering of virtual haptic textures and offers a set of simple conditions to guide haptic researchers towards artefact-free textures. The conditions identified are also extremely valuable when designing psychophysical experiments and when analyzing the significance of the data collected. The Synthesis of Three Dimensional Haptic Textures: Geometry, Control, and Psychophysics examines the problem of rendering virtual haptic textures with force-feedback devices. The author provides an introduction to the topic of haptic textures that covers the basics of the physiology of the skin, the psychophysics of roughness perception, and the engineering challenges behind haptic texture rendering. The book continues with the presentation of a novel mathematical framework that characterizes haptic devices, texturing algorithms and their ability to generate realistic haptic textures. Finally, two psychophysical experiments link the perception of roughness with the parameters of the haptic rendering algorithms.
This book is dedicated to Prof. Peter Young on his 70th birthday. Professor Young has been a pioneer in systems and control, and over the past 45 years he has influenced many developments in this field. This volume comprises a collection of contributions by leading experts in system identification, time-series analysis, environmetric modelling and control system design - modern research in topics that reflect important areas of interest in Professor Young's research career. Recent theoretical developments in and relevant applications of these areas are explored, treating the various subjects broadly and in depth. The authoritative and up-to-date research presented here will be of interest to academic researchers in control and in disciplines related to environmental research, particularly those working with water systems. The tutorial style in which many of the contributions are composed also makes the book suitable as a source of study material for graduate students in those areas.