With the increasing complexity and dynamism of today's machine design and development, more precise, robust and practical approaches and systems are needed to support machine design. Existing design methods treat the target machine as stationary, and analysis and simulation are mostly performed at the component level. Although some computer-aided engineering tools are capable of motion analysis, vibration simulation and the like, the machine itself remains in a dry-run state. For effective machine design, understanding a machine's thermal behaviour is crucial to achieving the desired performance in real situations. Dynamic Thermal Analysis of Machines in Running State presents a set of innovative solutions for the dynamic thermal analysis of machines under actual working conditions. The objective is to better understand, while still at the design stage, the thermal behaviour of a machine in real operation. The book has two major sections: the first presents a broad-based review of the key areas of research in dynamic thermal analysis and simulation, and the second presents an in-depth treatment of the relevant methodology and algorithms. The book is a collection of novel ideas that takes into account the need to present intellectual challenges while appealing to a broad readership, including academic researchers, practising engineers and managers, and graduate students. Given the essential role of modern machines in factory automation and quality assurance, a book dedicated to dynamic thermal analysis and its practical applications to machine design will benefit readers across the design and manufacturing sectors, from machine design to automotive engineering, helping them understand the present challenges and solutions as well as future research directions in this important area.
This book highlights recent developments in the field, presented at the Social Simulation 2015 conference in Groningen, The Netherlands. It covers advances in both the applications and the methods of social simulation. The societal issues addressed range across complexities in economic systems, opinion dynamics and civil violence, changing mobility patterns, changing land use, transitions in the energy system, food production and consumption, ecosystem management, and historical processes. Methodological developments cover the use of empirical data in validating models, the formalization of behavioral theory in agent behavior, the construction of artificial populations for experimentation, the replication of models, and agent-based models that can be run in a web browser. Social simulation is a rapidly evolving field: social scientists are increasingly interested in social simulation as a tool to tackle the complex non-linear dynamics of society, and the software and hardware tools available for social simulation are becoming more and more powerful. This book is an important source for readers interested in the newest developments in the ways in which the simulation of social interaction contributes to our understanding and management of complex social phenomena.
Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis.
This book presents the textile, mathematical and mechanical background for the modelling of fibre-based structures such as yarns and braided and knitted textiles. The hierarchical scales of these textiles and the structural elements at the different levels are analysed, and methods for modelling them are presented. The author reports on problems, methods, algorithms and possible solutions drawn from his twenty years of experience in modelling and CAD software development for textiles.
This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts.
This book examines the air pollution of a big city using multi-year and multi-season data from ground-based air monitoring stations and satellite sounding data, which provide clearer and more detailed information on the main sources of air pollution, the long-term trend of pollution, the influence of meteorological parameters on pollution levels, and the trajectories of polluted air masses. For example, the book shows that particulate matter transported from deserts combines with local sources to create air quality challenges. It also analyzes the effects of desert and semi-desert landscapes on high concentrations of pollutants.
This book treats modeling and simulation in a simple way that builds on the existing knowledge and intuition of students, who will learn how to build a model and solve it using Excel. Most chemical engineering students feel a shiver down the spine when they see a set of complex mathematical equations generated from the modeling of a chemical engineering system, because they usually do not understand how to arrive at the mathematical model, or do not know how to solve the equation system without spending a lot of time and effort. Understanding how to generate a set of mathematical equations to represent a physical system (to model) and how to solve those equations (to simulate) is not a simple task. A model, most of the time, draws on all the phenomena studied during a chemical engineering course. Likewise, a multitude of numerical methods can be used to solve the same set of equations generated by the modeling, and many different computational languages can be adopted to implement those numerical methods. As a consequence of this comprehensiveness and combinatorial explosion of possibilities, most books that deal with this subject are very extensive, requiring a lot of time and effort to work through. It is expected that with this book, chemical engineering students and future chemical engineers will feel motivated to solve different practical problems involving chemical processes, knowing they can do so in an easy and fast way, without the need for expensive software.
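The workflow the blurb describes, writing a balance equation and marching it forward in time step by step, can be sketched outside Excel as well. Below is a minimal explicit-Euler sketch in Python for a well-mixed tank; the function name and all parameter values are illustrative, not taken from the book:

```python
# Hypothetical example: outlet concentration of a well-mixed tank,
#   dC/dt = (flow/volume) * (c_in - C),
# integrated with the explicit Euler method (the same recurrence a
# spreadsheet row-by-row solution would use).

def euler_tank(c0, c_in, flow, volume, dt, steps):
    """Explicit Euler on dC/dt = (flow/volume) * (c_in - C)."""
    c = c0
    history = [c]
    for _ in range(steps):
        c = c + dt * (flow / volume) * (c_in - c)
        history.append(c)
    return history

profile = euler_tank(c0=0.0, c_in=1.0, flow=2.0, volume=10.0, dt=0.1, steps=500)
# The outlet concentration relaxes monotonically towards the inlet value c_in = 1.0.
```

The same recurrence dropped into a spreadsheet column reproduces this profile, which is the translation between "model" and "simulate" that the book aims to demystify.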
Disaster management is a process or strategy that is implemented when any type of catastrophic event takes place. The process may be initiated when anything threatens to disrupt normal operations or puts the lives of human beings at risk. Governments at all levels, as well as many businesses, create some sort of disaster plan that makes it possible to overcome the catastrophe and return to normal function as quickly as possible. Response to natural disasters (e.g., floods, earthquakes) or technological disasters (e.g., nuclear, chemical) is an extremely complex process that involves severe time pressure, various uncertainties, high non-linearity and many stakeholders. Disaster management often requires several autonomous agencies to collaboratively mitigate, prepare for, respond to, and recover from heterogeneous and dynamic sets of hazards to society. Almost all disasters involve a high degree of novelty, requiring responders to deal with unexpected uncertainties and dynamic time pressures. Existing studies and approaches within disaster management have mainly focused on specific types of disasters from the perspective of particular agencies; a general framework is lacking that addresses the similarities and synergies among different disasters while taking their specific features into account. This book provides various decision analysis theories and support tools for complex systems in general and disaster management in particular. The book also grew out of the long-term preparation of a European project proposal among leading experts in the areas related to its title. Chapters were evaluated on quality and originality in theory and methodology, application orientation, and relevance to the title of the book.
This book deals with transportation processes denoted as the Real-time Distribution of Perishable Goods (RDOPG). It makes three contributions to the field of transportation. First, a model considering the minimization of customer inconvenience is formulated. Second, a pro-active real-time control approach is proposed: stochastic knowledge is generated from past request information by a new forecasting approach and is used in the pro-active approach to guide vehicles to request-likely areas before real requests arrive there. Various computational results show that in many cases the pro-active approach achieves significantly improved results. Moreover, a measure for determining the structural quality of request data sets is proposed. The third contribution is a method for considering driver inconvenience aspects that arise from vehicle en-route diversion activities; specifically, this method makes it possible to restrict the number of vehicle en-route diversions performed.
The book offers a comprehensive survey of soft-computing models for optical character recognition systems. The various techniques, including fuzzy and rough sets, artificial neural networks and genetic algorithms, are tested on real texts written in different languages, such as English, French, German, Latin, Hindi and Gujarati, extracted from publicly available datasets. The simulation studies, which are reported in detail here, show that soft-computing-based modeling of OCR systems consistently outperforms traditional models. Mainly intended as a state-of-the-art survey for postgraduates and researchers in pattern recognition, optical character recognition and soft computing, this book will also be useful for professionals in computer vision and image processing dealing with issues related to optical character recognition.
Over the last decades, Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by the ever wider use of this technology in different fields of science and, on the other, by the incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from ten countries. They illustrate the breadth of application of this technology and the quality of the problems solved using Discrete Event Simulation. Practical applications of simulation dominate the book. The book is aimed at researchers and students who use Discrete Event Simulation in their work and want to stay informed about current applications. It can also serve as a source of inspiration for practitioners solving specific problems in their work. For decision makers weighing the introduction of Discrete Event Simulation for planning support and optimization, the book offers orientation on which specific problems can be solved with its help within their organization.
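The core mechanism behind Discrete Event Simulation, a simulation clock that jumps between time-stamped events held in a priority queue, can be sketched in a few lines. This toy event loop illustrates the general technique only; it is not code from the book:

```python
import heapq

# A minimal discrete event simulation core: events are kept in a
# time-ordered heap, and the simulated clock advances event by event
# rather than in fixed increments. The sequence number breaks ties
# between events scheduled at the same time.

def run_events(events):
    """events: iterable of (time, label). Returns (time, label) pairs in simulated order."""
    queue = []
    for seq, (t, label) in enumerate(events):
        heapq.heappush(queue, (t, seq, label))
    order = []
    while queue:
        t, _, label = heapq.heappop(queue)
        order.append((t, label))
    return order

log = run_events([(5.0, "depart"), (1.0, "arrive"), (3.0, "start service")])
# log is ordered by simulated time: arrive, start service, depart
```

Real DES packages add state changes and the scheduling of new events inside each handler, but the clock-and-queue skeleton is the same.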
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies and changes in operational and analytical applications over the last decade have given rise to the unification of these systems, which can be of benefit for both workloads. Research and industry have reacted and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because of the separation of OLTP and OLAP systems, existing benchmarks are only focused on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems will be needed as well. Based on the examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload and is applied to analyze the impact of typically used optimizations in the historically separate OLTP and OLAP domains in mixed-workload scenarios.
This volume brings together, in a central text, chapters written by leading scholars working at the intersection of modeling, the natural and social sciences, and public participation. This book presents the current state of knowledge regarding the theory and practice of engaging stakeholders in environmental modeling for decision-making, and includes basic theoretical considerations, an overview of methods and tools available, and case study examples of these principles and methods in practice. Although there has been a significant increase in research and development regarding participatory modeling, a unifying text that provides an overview of the different methodologies available to scholars and a systematic review of case study applications has been largely unavailable. This edited volume seeks to address a gap in the literature and provide a primer that addresses the growing demand to adopt and apply a range of modeling methods that includes the public in environmental assessment and management. The book is divided into two main sections. The first part of the book covers basic considerations for including stakeholders in the modeling process and its intersection with the theory and practice of public participation in environmental decision-making. The second part of the book is devoted to specific applications and products of the various methods available through case study examination. This second part of the book also provides insight from several international experts currently working in the field about their approaches, types of interactions with stakeholders, models produced, and the challenges they perceived based on their practical experiences.
A wide variety of processes occur on multiple scales, either naturally or as a consequence of measurement. This book contains methodology for the analysis of data that arise from such multiscale processes. The book brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. The Bayesian approach also facilitates the use of knowledge from prior experience or data, and these methods can handle different amounts of prior knowledge at different scales, as often occurs in practice. The book is aimed at statisticians, applied mathematicians, and engineers working on problems dealing with multiscale processes in time and/or space, such as in engineering, finance, and environmetrics. The book will also be of interest to those working on multiscale computation research. The main prerequisites are knowledge of Bayesian statistics and basic Markov chain Monte Carlo methods. A number of real-world examples are thoroughly analyzed in order to demonstrate the methods and to assist the readers in applying these methods to their own work. To further assist readers, the authors are making source code (for R) available for many of the basic methods discussed herein.
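As a flavour of the basic Markov chain Monte Carlo prerequisite the book assumes (the book's own source code is for R), here is a minimal random-walk Metropolis sampler in Python; the target density and tuning values are illustrative:

```python
import math
import random

# A basic random-walk Metropolis sampler: propose x' = x + N(0, step),
# accept with probability min(1, p(x')/p(x)). Target here is a standard
# normal, specified up to a constant via its log-density.

def metropolis(log_density, x0, step, n, seed=0):
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        proposal = x + rng.gauss(0.0, step)
        log_accept = log_density(proposal) - log_density(x)
        if log_accept >= 0 or rng.random() < math.exp(log_accept):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=20000)
mean = sum(draws) / len(draws)  # close to the target mean of 0
```

Multiscale Bayesian models chain many such updates across scales, but each conditional update follows this same accept/reject pattern.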
This book covers the major problems of turbulence and turbulent processes, including the physical phenomena, their modeling and their simulation. After a general introduction in Chapter 1 illustrating many aspects of turbulent flows, averaged equations and kinetic energy budgets are provided in Chapter 2, where the concept of turbulent viscosity as a closure of the Reynolds stress is also introduced. Wall-bounded flows are presented in Chapter 3, and aspects specific to boundary layers and channel or pipe flows are pointed out. Free shear flows, namely free jets and wakes, are considered in Chapter 4. Chapter 5 deals with vortex dynamics. Homogeneous turbulence, isotropy and the dynamics of isotropic turbulence are presented in Chapters 6 and 7, where turbulence is described both in physical space and in wave-number space. Time-dependent numerical simulations are presented in Chapter 8, where an introduction to large eddy simulation is offered. The last three chapters of the book summarize current numerical and experimental techniques. Many results are presented in a practical way, based on both experiments and numerical simulations. The book is written for advanced engineering students as well as postgraduate engineers and researchers. For students, it contains the essential results as well as details and demonstrations whose oral transmission is often tedious. At a more advanced level, the text provides numerous references which allow readers to quickly find further material related to their work and to acquire deeper knowledge of topics of interest.
The book shows how simulation's long history and close ties to industry since the third industrial revolution have led to its growing importance in Industry 4.0. The book emphasises the role of simulation in the new industrial revolution, and its application as a key aspect of making Industry 4.0 a reality - and thus achieving the complete digitisation of manufacturing and business. It presents various perspectives on simulation and demonstrates its applications, from augmented or virtual reality to process engineering, and from quantum computing to intelligent management. Simulation for Industry 4.0 is a guide and milestone for the simulation community, as well as those readers working to achieve the goals of Industry 4.0. The connections between simulation and Industry 4.0 drawn here will be of interest not only to beginners, but also to practitioners and researchers as a point of departure in the subject, and as a guide for new lines of study.
This volume contains the articles presented at the 21st International Meshing Roundtable (IMR), organized in part by Sandia National Laboratories and held on October 7-10, 2012 in San Jose, CA, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts in a variety of disciplines, from all over the world, to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics, and visualization.
This volume presents a compelling collection of state-of-the-art work in algorithmic computational biology, honoring the legacy of Professor Bernard M.E. Moret in this field. Reflecting the wide-ranging influences of Prof. Moret's research, the coverage encompasses such areas as phylogenetic tree and network estimation, genome rearrangements, cancer phylogeny, species trees, divide-and-conquer strategies, and integer linear programming. Each self-contained chapter provides an introduction to a cutting-edge problem of particular computational and mathematical interest. Topics and features: addresses the challenges in developing accurate and efficient software for the NP-hard maximum likelihood phylogeny estimation problem; describes the inference of species trees, covering strategies to scale phylogeny estimation methods to large datasets, and the construction of taxonomic supertrees; discusses the inference of ultrametric distances from additive distance matrices, and the inference of ancestral genomes under genome rearrangement events; reviews different techniques for inferring evolutionary histories in cancer, from the use of chromosomal rearrangements to tumor phylogenetics approaches; examines problems in phylogenetic networks, including questions relating to discrete mathematics, and issues of statistical estimation; highlights how evolution can provide a framework within which to understand comparative and functional genomics; provides an introduction to Integer Linear Programming and its use in computational biology, including its use for solving the Traveling Salesman Problem. Offering an invaluable source of insights for computer scientists, applied mathematicians, and statisticians, this illuminating volume will also prove useful for graduate courses on computational biology and bioinformatics.
Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with papers ranging from contemporary views to representative case studies. The OR Essentials series presents a unique cross-section of high quality research work fundamental to understanding contemporary issues and research across a range of Operational Research (OR) topics. It brings together some of the best research papers from the esteemed Operational Research Society and its associated journals, also published by Palgrave Macmillan.
The increased computational power and software tools available to engineers have increased the use of, and dependence on, modeling and computer simulation throughout the design process. These tools have given engineers the capability to design highly complex systems and computer architectures that were previously unthinkable. Every complex design project, from integrated circuits to aerospace vehicles to industrial manufacturing processes, requires these new methods. This book fulfills the essential need of system and control engineers at all levels to understand modeling and simulation. Written as a true text/reference, it has become a standard for senior/graduate-level courses in EE departments worldwide and helps professionals in this area keep their skills up to date.
* Presents a working foundation necessary for compliance with High Level Architecture (HLA) standards
This book presents essential methods and tools for research into the reliability of energy systems. It describes in detail the content setting, formalisation, and use of algorithms for assessing the reliability of modern, large, and complex electric power systems. The book uses a wealth of tables and illustrations to represent results and source information in a clear manner. It discusses the main operating conditions which affect the reliability of electric power systems, and describes corresponding computing tools which can help solve issues as they arise. Further, all methodologies presented here are demonstrated in numerical examples. Though primarily intended for researchers and practitioners in the field of electric power systems, the book will also benefit general readers interested in this area.
This book reflects more than three decades of research on Cellular Automata (CA), and nearly a decade of work on the application of CA to model biological strings, which forms the foundation of 'A New Kind of Computational Biology' pioneered by the start-up CARLBio. After a brief introduction to Cellular Automata (CA) theory and functional biology, it reports on the modeling of basic biological strings with CA, starting with the basic nucleotides and leading to codon and anti-codon CA models. It derives a more involved CA model of DNA, RNA, the entire translation process for amino acid formation, and the evolution of a protein to its unique structure and function. In subsequent chapters the interaction of proteins with other bio-molecules is also modeled. The only prior knowledge assumed is an undergraduate understanding of computer programming and biology. The book adopts a hands-on, "do-it-yourself" approach to enable readers to apply the method provided, derive the CA rules, and comprehend how these are related to the physical 'rules' observed in biology. In a single framework, the authors present two branches of science: computation and biology. Instead of rigorous molecular dynamics modeling, which the authors describe as a bottom-up model, or reliance on top-down new-age Artificial Intelligence (AI) and Machine Learning (ML), which depends on the extensive availability of quality data, this book takes the best from both the top-down and bottom-up approaches and establishes how the behavior of complex molecules is represented in CA. The CA rules are derived from basic knowledge of molecular interaction and construction observed in the biological world, but are mapped to a small subset of known results to derive and predict outcomes. This book is useful for students, researchers and industry practitioners who want to explore the modeling and simulation of complex physical-world systems from a different perspective. It raises the inevitable question: are life and the universe nothing but a collection of continuous systems processing information?
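As a flavour of the CA machinery such a book builds on, here is a minimal elementary cellular automaton update in Python. Rule 110 is chosen purely as an illustration; it is not one of the biological rule sets the book derives:

```python
# An elementary cellular automaton: each cell is 0 or 1, and its next
# state depends only on its own state and its two neighbours. The rule
# number's binary expansion is the lookup table over the 8 possible
# 3-cell neighbourhoods. Periodic (wrap-around) boundaries are assumed.

def ca_step(cells, rule=110):
    """One synchronous update of a binary string under an elementary CA rule."""
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]  # bit i = output for neighbourhood i
    return [
        table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
        for i in range(n)
    ]

state = ca_step([0, 0, 0, 1, 0, 0, 0])  # → [0, 0, 1, 1, 0, 0, 0]
```

The book's approach amounts to choosing rules like this one so that repeated application of `ca_step` to an encoded biological string reproduces observed molecular behaviour.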
A senior/graduate-level text/reference presenting state-of-the-art numerical techniques for solving the wave equation in heterogeneous fluid-solid media. Numerical models have become standard research tools in acoustic laboratories, and computational acoustics is thus becoming an increasingly important branch of ocean acoustic science. The first edition of this successful book, written by recognized leaders of the field, was the first to present a comprehensive and modern introduction to computational ocean acoustics accessible to students. This revision, with 100 additional pages, completely updates the material of the first edition and includes new models based on current research. It includes problems and solutions in every chapter, making the book more useful in teaching (the first edition had a separate solutions manual). The book is intended for graduate and advanced undergraduate students of acoustics, geology and geophysics, applied mathematics, and ocean engineering, or as a reference in computational methods courses, as well as for professionals in these fields, particularly those working in government (especially Navy) and industry labs engaged in the development or use of propagation models.
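The kind of time-domain wave-equation solver such texts introduce can be illustrated with the standard second-order finite-difference scheme for the 1D wave equation u_tt = c^2 u_xx. The grid, fixed-end boundary conditions and Courant number below are illustrative choices, not a model from the book:

```python
import math

# Leapfrog-in-time, centred-in-space update for the 1D wave equation
# with fixed (zero) ends. courant2 = (c*dt/dx)^2; the scheme is stable
# for courant2 <= 1 and, at courant2 == 1, exact on the grid.

def wave_step(u_prev, u, courant2):
    n = len(u)
    u_next = [0.0] * n  # endpoints stay clamped at zero
    for i in range(1, n - 1):
        u_next[i] = (2 * u[i] - u_prev[i]
                     + courant2 * (u[i + 1] - 2 * u[i] + u[i - 1]))
    return u_next

n, c, dx = 101, 1.0, 0.01
dt = dx / c  # Courant number exactly 1
x = [i * dx for i in range(n)]
u0 = [math.sin(math.pi * xi) for xi in x]  # standing-wave initial shape
# Special first step for zero initial velocity:
u1 = [u0[i] + 0.5 * (c * dt / dx) ** 2 * (u0[i + 1] - 2 * u0[i] + u0[i - 1])
      if 0 < i < n - 1 else 0.0
      for i in range(n)]
u_prev, u = u0, u1
for _ in range(199):
    u_prev, u = u, wave_step(u_prev, u, (c * dt / dx) ** 2)
# After 200 steps (t = 2.0, one full period of this mode) u returns to u0.
```

Ocean-acoustics codes work with far richer media (depth-dependent sound speed, fluid-solid interfaces), but the march-in-time structure is the same.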
Particle models play an important role in many applications in physics, chemistry and biology. They can be studied on the computer with the help of molecular dynamics simulations. This book presents in detail both the necessary numerical methods and techniques (linked-cell method, SPME method, tree codes, multipole technique) and the theoretical background and foundations. It covers modelling, discretization, algorithms and their parallel implementation with MPI on computer systems with distributed memory. Furthermore, detailed explanations are given of the different steps of a numerical simulation, and code examples are provided. With the description of the algorithms and the presentation of results from various simulations in the areas of materials science, nanotechnology, biochemistry and astrophysics, readers of this book will be able to write their own molecular dynamics programs step by step and run successful experiments.
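As a taste of the time-integration step at the heart of such molecular dynamics codes, here is a velocity Verlet sketch for a single particle in a harmonic potential; the potential is a toy stand-in for the pair forces the book treats, and all parameter values are illustrative:

```python
# Velocity Verlet: the standard MD integrator, second-order accurate
# and good at conserving energy over long runs. One step updates the
# position, recomputes the force, then updates the velocity with the
# average of old and new accelerations.

def velocity_verlet(x, v, force, mass, dt, steps):
    a = force(x) / mass
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt
        a_new = force(x) / mass
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

k = 1.0  # illustrative spring constant: F(x) = -k * x
x, v = velocity_verlet(x=1.0, v=0.0, force=lambda x: -k * x,
                       mass=1.0, dt=0.01, steps=1000)
energy = 0.5 * v * v + 0.5 * k * x * x  # stays close to the initial 0.5
```

Production codes replace the scalar force with the pairwise force evaluation (via linked cells or tree codes) that dominates the cost, but the integrator loop looks just like this.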
This thesis addresses one of the most fundamental challenges for modern science: how can the brain, as a network of neurons, process information, how can it create and store internal models of our world, and how can it infer conclusions from ambiguous data? The author addresses these questions with the rigorous language of mathematics and theoretical physics, an approach that requires a high degree of abstraction to transfer results of wet-lab biology to formal models. The thesis starts with an in-depth description of the state of the art in theoretical neuroscience, which it subsequently uses as a basis to develop several new and original ideas. Throughout the text, the author connects the form and function of neuronal networks. This is done in order to achieve the functional performance of biological brains by transferring their form to synthetic electronic substrates, an approach referred to as neuromorphic computing. The obvious fact that this transfer can never be perfect, but necessarily leads to performance differences, is substantiated and explored in detail. The author also introduces a novel interpretation of the firing activity of neurons: he proposes a probabilistic interpretation of this activity and shows by means of formal derivations that stochastic neurons can sample from internally stored probability distributions. This is corroborated by the author's recent findings, which confirm that biological features like the high-conductance state of networks enable this mechanism. The author goes on to show that neural sampling can be implemented on synthetic neuromorphic circuits, paving the way for future applications in machine learning and cognitive computing, for example as energy-efficient implementations of deep learning networks. The thesis offers an essential resource for newcomers to the field and an inspiration for scientists working in theoretical neuroscience and on the future of computing.
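The probabilistic reading of neuronal firing described above can be illustrated with a toy sketch: a unit that fires with probability given by the logistic function of its membrane potential samples a Bernoulli distribution. The potential value and sample count here are illustrative, not taken from the thesis:

```python
import math
import random

# Toy stochastic neuron: in each trial the unit fires with probability
# sigma(u), where u plays the role of an abstract membrane potential.
# Over many trials the empirical firing rate converges to sigma(u),
# i.e. the unit samples from a Bernoulli(sigma(u)) distribution.

def sigma(u):
    return 1.0 / (1.0 + math.exp(-u))

def sample_firing(u, n, seed=0):
    rng = random.Random(seed)
    return sum(rng.random() < sigma(u) for _ in range(n)) / n

rate = sample_firing(u=0.5, n=50000)  # converges towards sigma(0.5)
```

Networks of such units with coupled potentials can sample from joint distributions, which is the mechanism the thesis formalises and maps onto neuromorphic hardware.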