Soft computing, as an engineering science, and statistics, as a
classical branch of mathematics, emphasize different aspects of
data analysis.
The Discrete Element Method (DEM) has emerged as a solution for predicting the load capacity of masonry structures. As DEM is one of many numerical methods and computational solutions being applied to evaluate masonry structures, further research on its tools and methodologies is essential to advancing the field. Computational Modeling of Masonry Structures Using the Discrete Element Method explores the latest digital solutions for the analysis and modeling of brick, stone, concrete, granite, limestone, and glass block structures. Focusing on critical research on mathematical and computational methods for masonry analysis, this publication is a pivotal reference source for scholars, engineers, consultants, and graduate-level engineering students.
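As a toy illustration of the discrete-element idea (a hypothetical sketch, not taken from the book), contact between discrete bodies is commonly handled with a penalty stiffness: a restoring force proportional to the penetration depth, and zero once the bodies separate. The stiffness `k_n` below is an invented parameter.

```python
def contact_force(gap, k_n):
    """Penalty-based normal contact force (hypothetical DEM sketch).

    gap -- signed distance between two bodies (negative means overlap)
    k_n -- normal contact stiffness (penalty parameter)
    """
    penetration = -gap
    # Force acts only while the bodies interpenetrate.
    return k_n * penetration if penetration > 0 else 0.0

# Separated blocks carry no contact force; overlapping blocks are pushed apart.
separated = contact_force(0.5, k_n=1e6)
overlapping = contact_force(-0.01, k_n=1e6)
```

In a full DEM code this force law is evaluated for every contact pair at every time step, together with tangential friction and damping terms.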
This book presents the textile, mathematical, and mechanical background for the modelling of fiber-based structures such as yarns and braided and knitted textiles. The hierarchical scales of these textiles and the structural elements at the different levels are analysed, and the methods for modelling them are presented. The author reports on problems, methods, algorithms, and possible solutions drawn from his twenty years of experience in the modelling of textiles and the development of CAD software for them.
Over the last few decades, Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by the ever-wider use of this technology in different fields of science and, on the other, by the incredibly creative use of available software by dedicated experts. This book contains articles from scientists and experts in 10 countries. They illustrate the breadth of application of this technology and the quality of the problems solved using Discrete Event Simulation. Practical applications of simulation dominate the present book. The book is aimed at researchers and students who use Discrete Event Simulation in their work and want to stay informed about current applications. It can also serve as a source of inspiration for practitioners solving specific problems in the course of their work. For decision makers weighing the introduction of Discrete Event Simulation for planning support and optimization, the book offers orientation on which specific problems can be solved with its help within an organization.
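Whatever the application area, the engine underneath is the same: a future-event list processed in timestamp order. A minimal sketch of that core (a hypothetical illustration, not code from the book), using Python's `heapq` as the event queue:

```python
import heapq

def run(events):
    """Process (time, name) events in timestamp order (minimal DES core)."""
    heapq.heapify(events)  # turn the list into a priority queue keyed on time
    log = []
    while events:
        time, name = heapq.heappop(events)  # always the earliest pending event
        log.append((time, name))
    return log

# Events may be scheduled in any order; the engine executes them by time.
trace = run([(5.0, "depart"), (1.0, "arrive"), (3.0, "start_service")])
```

Real engines additionally let event handlers schedule new future events and update model state; this skeleton only drains a fixed schedule.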
A wide variety of processes occur on multiple scales, either naturally or as a consequence of measurement. This book contains methodology for the analysis of data that arise from such multiscale processes. The book brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. The Bayesian approach also facilitates the use of knowledge from prior experience or data, and these methods can handle different amounts of prior knowledge at different scales, as often occurs in practice. The book is aimed at statisticians, applied mathematicians, and engineers working on problems dealing with multiscale processes in time and/or space, such as in engineering, finance, and environmetrics. The book will also be of interest to those working on multiscale computation research. The main prerequisites are knowledge of Bayesian statistics and basic Markov chain Monte Carlo methods. A number of real-world examples are thoroughly analyzed in order to demonstrate the methods and to assist the readers in applying these methods to their own work. To further assist readers, the authors are making source code (for R) available for many of the basic methods discussed herein.
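The methods in the book rest on Bayesian updating. Its simplest building block, conjugate updating of a normal mean with a known noise level, can be sketched in a few lines (a generic illustration with hypothetical numbers, not the book's R code); prior knowledge, for example from a coarser scale, enters through the prior mean and spread:

```python
def posterior_normal_mean(data, mu0, tau0, sigma):
    """Posterior for a normal mean with known noise sd (conjugate case).

    mu0, tau0 -- prior mean and prior sd (e.g. knowledge from a coarser scale)
    sigma     -- known standard deviation of the observations
    """
    n = len(data)
    xbar = sum(data) / n
    prec = 1.0 / tau0**2 + n / sigma**2                # posterior precision
    mean = (mu0 / tau0**2 + n * xbar / sigma**2) / prec
    return mean, prec ** -0.5                          # posterior mean and sd

# A vague prior (tau0 = 10) lets four precise observations dominate.
post_mean, post_sd = posterior_normal_mean(
    [9.8, 10.1, 10.3, 9.9], mu0=0.0, tau0=10.0, sigma=0.5)
```

The multiscale models in the book generalize this precision-weighted combination across scales, with MCMC taking over where conjugacy breaks down.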
Disaster management is a process or strategy implemented when a catastrophic event takes place. It may be initiated when anything threatens to disrupt normal operations or puts human lives at risk. Governments at all levels, as well as many businesses, create some sort of disaster plan that makes it possible to overcome the catastrophe and return to normal functioning as quickly as possible. Response to natural disasters (e.g., floods, earthquakes) or technological disasters (e.g., nuclear, chemical) is an extremely complex process that involves severe time pressure, various uncertainties, high non-linearity, and many stakeholders. Disaster management often requires several autonomous agencies to collaboratively mitigate, prepare for, respond to, and recover from heterogeneous and dynamic sets of hazards to society. Almost all disasters involve a high degree of novelty, requiring unexpected uncertainties and dynamic time pressures to be dealt with. Existing studies and approaches within disaster management have mainly focused on specific types of disaster from the viewpoint of particular agencies, and a general framework is lacking that addresses the similarities and synergies among different disasters while taking their specific features into account. This book provides various decision-analysis theories and support tools for complex systems in general and disaster management in particular. The book grew out of the long-term preparation of a European project proposal among leading experts in the areas related to its title. Chapters were evaluated on quality, originality in theory and methodology, application orientation, and relevance to the title of the book.
"Current Topics in Membranes" provides a systematic, comprehensive,
and rigorous approach to specific topics relevant to the study of
cellular membranes. Each volume is a guest edited compendium of
membrane biology.
This book highlights recent developments in the field, presented at the Social Simulation 2015 conference in Groningen, The Netherlands. It covers advances both in the applications and in the methods of social simulation. The societal issues addressed range across complexities in economic systems, opinion dynamics and civil violence, changing mobility patterns, land-use change, the transition of the energy system, food production and consumption, ecosystem management, and historical processes. Methodological developments cover the use of empirical data in validating models in general, the formalization of behavioral theory in agent behavior, the construction of artificial populations for experimentation, the replication of models, and agent-based models that can be run in a web browser. Social simulation is a rapidly evolving field. Social scientists are increasingly interested in social simulation as a tool for tackling the complex non-linear dynamics of society, and the software and hardware tools available for social simulation are becoming more and more powerful. This book is an important source for readers interested in the newest developments in the ways in which the simulation of social interaction contributes to our understanding and management of complex social phenomena.
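As a flavor of the agent-based models discussed, consider a toy opinion-dynamics sketch (hypothetical, not a model from the proceedings): repeatedly picking a random pair of agents and averaging their opinions drives the population toward consensus while conserving the mean opinion.

```python
import random

def step(opinions, rng):
    """One interaction: a random pair of agents averages their opinions."""
    i, j = rng.sample(range(len(opinions)), 2)
    mean = (opinions[i] + opinions[j]) / 2
    opinions[i] = opinions[j] = mean  # pair averaging conserves the total

rng = random.Random(1)
opinions = [0.0, 0.2, 0.8, 1.0]   # initial opinions on a [0, 1] scale
for _ in range(500):
    step(opinions, rng)
# After many interactions all agents sit near the conserved mean, 0.5.
```

Richer variants in the literature add bounded confidence (agents only interact when their opinions are close), which produces clustering instead of global consensus.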
This volume contains the articles presented at the 21st International Meshing Roundtable (IMR), organized in part by Sandia National Laboratories and held on October 7-10, 2012 in San Jose, CA, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts in a variety of disciplines, from all over the world, to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics, and visualization.
Modeling is a key component to sciences from mathematics to life
science, including environmental and ecological studies. By looking
at the underlying concepts of the software, we can make sure that
we build mathematically feasible models and that we get the most
out of the data and information that we have. This book shows how
models can be analyzed using simple math and software to generate
meaningful qualitative descriptions of system dynamics. It also
shows that even without a full, mathematically rigorous analysis
of the equations, there may be ways to derive some qualitative
understanding of the general behavior of a system. By
relating some of the modeling approaches and systems theory to
real-world examples the book illustrates how these approaches can
help understand concepts such as sustainability, peak oil, adaptive
management, optimal harvest and other practical applications.
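For instance, the qualitative behavior of a harvested population can be read off from its equilibria without a closed-form solution. A minimal sketch (illustrative parameter values, not an example from the book): logistic growth with proportional harvest settles at the equilibrium x* = K(1 - h/r).

```python
def simulate_logistic_harvest(r, K, h, x0, dt=0.01, steps=2000):
    """Euler-integrate dx/dt = r*x*(1 - x/K) - h*x (proportional harvest)."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x * (1 - x / K) - h * x)
    return x

# With r=1.0, K=100, h=0.3 the predicted equilibrium is K*(1 - h/r) = 70.
final = simulate_logistic_harvest(r=1.0, K=100.0, h=0.3, x0=10.0)
```

Setting dx/dt = 0 gives the equilibrium directly; the simulation merely confirms that the system is attracted to it, which is exactly the kind of qualitative check the book advocates.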
This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts.
This book deals with transportation processes denoted as the Real-time Distribution of Perishable Goods (RDOPG). It makes three contributions to the field of transportation. First, a model considering the minimization of customer inconvenience is formulated. Second, a pro-active real-time control approach is proposed: stochastic knowledge is generated from past request information by a new forecasting approach and is used to guide vehicles to request-likely areas before real requests arrive there. Various computational results are presented to show that in many cases the pro-active approach achieves significantly improved results; a measure for determining the structural quality of request data sets is also proposed. The third contribution is a method for taking into account the driver inconvenience that arises from vehicle en-route diversion activities; specifically, this method makes it possible to restrict the number of en-route diversions performed.
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies, and changes in operational and analytical applications over the last decade, have given rise to the unification of these systems, which can benefit both workloads. Research and industry have reacted, and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because of the separation of OLTP and OLAP systems, existing benchmarks focus only on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems will be needed as well. Based on an examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload, and is applied to analyze the impact of optimizations typically used in the historically separate OLTP and OLAP domains under mixed-workload scenarios.
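The mixed-workload idea can be made concrete in a few lines (a hypothetical miniature using Python's built-in `sqlite3`, not the benchmark from the book): transactional single-row inserts interleaved with an analytical aggregate over the same table.

```python
import sqlite3

# In-memory database standing in for a hybrid OLTP/OLAP system.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")

def oltp_insert(amount):
    """OLTP side: a short, single-row write transaction."""
    con.execute("INSERT INTO orders (amount) VALUES (?)", (amount,))

def olap_total():
    """OLAP side: a scan-and-aggregate query over the same live data."""
    return con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]

for amt in (10.0, 20.0, 30.0):
    oltp_insert(amt)
total = olap_total()
```

A benchmark in the spirit of the book would time both kinds of statement while varying their mix, measuring how each workload degrades the other.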
Achieving value-based healthcare (increasing quality, reducing cost, and broadening access) has proven to be extremely challenging, in part because research is siloed and largely focused on singular risk factors, care coordination is undermined by service fragmentation, and reform brings costly unintended consequences that emerge from the complexity of healthcare systems. Understanding the behaviour of the overall system is becoming a major concern among healthcare managers and decision-makers intent on increasing value for their systems. This book fills a gap in the literature that is becoming more evident as reform efforts proliferate: a holistic Modeling and Simulation (M&S) approach to value-based healthcare within a framework that enables designing, testing and implementing concepts to integrate resource allocations, health phenomenon dynamics, individual behaviour, and population dynamics. It presents a pathways-based model for the efficient coordination of care involving all stakeholders, including patients, providers, care deliverers, managers, and payers. It shows how M&S can help design a better service infrastructure and describes the information technologies that are necessary to implement it successfully. It also presents global and national healthcare perspectives from Europe, the USA, Asia and Africa, as well as the research directions needed to realize the value-based M&S healthcare vision.
This volume presents a compelling collection of state-of-the-art work in algorithmic computational biology, honoring the legacy of Professor Bernard M.E. Moret in this field. Reflecting the wide-ranging influences of Prof. Moret's research, the coverage encompasses such areas as phylogenetic tree and network estimation, genome rearrangements, cancer phylogeny, species trees, divide-and-conquer strategies, and integer linear programming. Each self-contained chapter provides an introduction to a cutting-edge problem of particular computational and mathematical interest. Topics and features: addresses the challenges in developing accurate and efficient software for the NP-hard maximum likelihood phylogeny estimation problem; describes the inference of species trees, covering strategies to scale phylogeny estimation methods to large datasets, and the construction of taxonomic supertrees; discusses the inference of ultrametric distances from additive distance matrices, and the inference of ancestral genomes under genome rearrangement events; reviews different techniques for inferring evolutionary histories in cancer, from the use of chromosomal rearrangements to tumor phylogenetics approaches; examines problems in phylogenetic networks, including questions relating to discrete mathematics, and issues of statistical estimation; highlights how evolution can provide a framework within which to understand comparative and functional genomics; provides an introduction to Integer Linear Programming and its use in computational biology, including its use for solving the Traveling Salesman Problem. Offering an invaluable source of insights for computer scientists, applied mathematicians, and statisticians, this illuminating volume will also prove useful for graduate courses on computational biology and bioinformatics.
The book shows how simulation's long history and close ties to industry since the third industrial revolution have led to its growing importance in Industry 4.0. The book emphasises the role of simulation in the new industrial revolution, and its application as a key aspect of making Industry 4.0 a reality - and thus achieving the complete digitisation of manufacturing and business. It presents various perspectives on simulation and demonstrates its applications, from augmented or virtual reality to process engineering, and from quantum computing to intelligent management. Simulation for Industry 4.0 is a guide and milestone for the simulation community, as well as those readers working to achieve the goals of Industry 4.0. The connections between simulation and Industry 4.0 drawn here will be of interest not only to beginners, but also to practitioners and researchers as a point of departure in the subject, and as a guide for new lines of study.
The development of innovative drugs is becoming more difficult while relying on purely empirical approaches. This has inspired all major pharmaceutical companies to pursue alternative, model-based paradigms. The key question is: how can innovative compounds, and subsequently appropriate dosage regimens, be found? Written from the industry perspective and based on many years of experience, this book offers: - Concepts for the creation of drug-disease models, introduced and supplemented with extensive MATLAB programs - Guidance for exploring and modifying these programs to enhance the understanding of key principles - The application of differential equations to pharmacokinetic, pharmacodynamic and (patho-)physiologic problems, thereby acknowledging their dynamic nature - A range of topics from single exponential decay to adaptive dosing, from single-subject exploration to clinical trial simulation, and from empirical to mechanistic disease modeling. Students with an undergraduate mathematical background or equivalent education, an interest in the life sciences, and skills in a high-level programming language such as MATLAB are encouraged to engage in model-based pharmaceutical research and development.
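The book's programs are written in MATLAB; as a flavor of its simplest starting point, single exponential decay, here is a one-compartment IV-bolus sketch in Python with hypothetical parameter values:

```python
import math

def concentration(dose, V, ke, t):
    """Plasma concentration for a one-compartment IV bolus model.

    Solves dC/dt = -ke*C with C(0) = dose/V, i.e. C(t) = (dose/V)*exp(-ke*t).
    """
    return (dose / V) * math.exp(-ke * t)

# Hypothetical values: dose 100 mg, volume of distribution 10 L, ke 0.1 /h.
c0 = concentration(100.0, 10.0, 0.1, 0.0)      # initial concentration, 10 mg/L
half_life = math.log(2) / 0.1                  # t1/2 = ln 2 / ke
c_half = concentration(100.0, 10.0, 0.1, half_life)
```

More realistic pharmacokinetic and pharmacodynamic models replace this closed-form solution with systems of coupled ODEs solved numerically, which is the path the book follows.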
This volume brings together, in a central text, chapters written by leading scholars working at the intersection of modeling, the natural and social sciences, and public participation. This book presents the current state of knowledge regarding the theory and practice of engaging stakeholders in environmental modeling for decision-making, and includes basic theoretical considerations, an overview of methods and tools available, and case study examples of these principles and methods in practice. Although there has been a significant increase in research and development regarding participatory modeling, a unifying text that provides an overview of the different methodologies available to scholars and a systematic review of case study applications has been largely unavailable. This edited volume seeks to address a gap in the literature and provide a primer that addresses the growing demand to adopt and apply a range of modeling methods that includes the public in environmental assessment and management. The book is divided into two main sections. The first part of the book covers basic considerations for including stakeholders in the modeling process and its intersection with the theory and practice of public participation in environmental decision-making. The second part of the book is devoted to specific applications and products of the various methods available through case study examination. This second part of the book also provides insight from several international experts currently working in the field about their approaches, types of interactions with stakeholders, models produced, and the challenges they perceived based on their practical experiences.
The book offers a comprehensive survey of soft-computing models for optical character recognition systems. The various techniques, including fuzzy and rough sets, artificial neural networks and genetic algorithms, are tested on real texts written in different languages, such as English, French, German, Latin, Hindi and Gujarati, taken from publicly available datasets. The simulation studies, which are reported in detail here, show that soft-computing-based modeling of OCR systems performs consistently better than traditional models. Mainly intended as a state-of-the-art survey for postgraduates and researchers in pattern recognition, optical character recognition and soft computing, this book will also be useful for professionals in computer vision and image processing dealing with different issues related to optical character recognition.
This thesis addresses one of the most fundamental challenges for modern science: how can the brain as a network of neurons process information, how can it create and store internal models of our world, and how can it infer conclusions from ambiguous data? The author addresses these questions with the rigorous language of mathematics and theoretical physics, an approach that requires a high degree of abstraction to transfer results of wet lab biology to formal models. The thesis starts with an in-depth description of the state-of-the-art in theoretical neuroscience, which it subsequently uses as a basis to develop several new and original ideas. Throughout the text, the author connects the form and function of neuronal networks. This is done in order to achieve functional performance of biological brains by transferring their form to synthetic electronics substrates, an approach referred to as neuromorphic computing. The obvious aspect that this transfer can never be perfect but necessarily leads to performance differences is substantiated and explored in detail. The author also introduces a novel interpretation of the firing activity of neurons. He proposes a probabilistic interpretation of this activity and shows by means of formal derivations that stochastic neurons can sample from internally stored probability distributions. This is corroborated by the author's recent findings, which confirm that biological features like the high conductance state of networks enable this mechanism. The author goes on to show that neural sampling can be implemented on synthetic neuromorphic circuits, paving the way for future applications in machine learning and cognitive computing, for example as energy-efficient implementations of deep learning networks. The thesis offers an essential resource for newcomers to the field and an inspiration for scientists working in theoretical neuroscience and the future of computing.
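The neural-sampling idea can be caricatured in a few lines (a hypothetical sketch, not the thesis's model): binary neurons that fire with a logistic probability of their membrane potential perform Gibbs sampling, so the network visits its states with Boltzmann-distribution frequencies.

```python
import math
import random

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def gibbs_sample(b, W, sweeps, rng):
    """Gibbs-sample binary states z with p(z) proportional to exp(b.z + z'Wz/2)."""
    n = len(b)
    z = [0] * n
    counts = {}
    for _ in range(sweeps):
        for i in range(n):
            # Membrane potential: bias plus recurrent input from active neighbors.
            u = b[i] + sum(W[i][j] * z[j] for j in range(n) if j != i)
            z[i] = 1 if rng.random() < sigmoid(u) else 0
        key = tuple(z)
        counts[key] = counts.get(key, 0) + 1
    return counts

rng = random.Random(42)
counts = gibbs_sample(b=[0.5, -0.5], W=[[0.0, 1.0], [1.0, 0.0]],
                      sweeps=20000, rng=rng)
freq_11 = counts.get((1, 1), 0) / 20000  # analytic Boltzmann value: ~0.455
```

The thesis's contribution lies in showing how biologically realistic spiking dynamics, and their neuromorphic hardware counterparts, approximate exactly this kind of sampling.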
Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with papers ranging from contemporary views to representative case studies. The OR Essentials series presents a unique cross-section of high quality research work fundamental to understanding contemporary issues and research across a range of Operational Research (OR) topics. It brings together some of the best research papers from the esteemed Operational Research Society and its associated journals, also published by Palgrave Macmillan.
Computational welding mechanics (CWM) provides an important
technique for modelling welding processes. Welding simulations are
a key tool in improving the design and control of welding processes
and the performance of welded components or structures. CWM can be
used to model phenomena such as heat generation, thermal stresses
and large plastic deformations of components or structures. It also
has a wider application in modelling thermomechanical and
microstructural phenomena in metals. This important book reviews
the principles, methods and applications of CWM.
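As a taste of the thermal side of such modelling (a generic finite-difference sketch, not a method specific to this book), one explicit time step of the 1D heat equation looks like this:

```python
def heat_step(T, r):
    """One explicit FD step of dT/dt = alpha * d2T/dx2 with fixed ends.

    r = alpha*dt/dx**2 is the diffusion number; r <= 0.5 for stability.
    """
    new = list(T)
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2.0 * T[i] + T[i + 1])
    return new

# A bar held at 100 degrees on the left and 0 on the right relaxes to a
# linear temperature profile: 100, 75, 50, 25, 0.
T = [100.0, 0.0, 0.0, 0.0, 0.0]
for _ in range(2000):
    T = heat_step(T, r=0.25)
```

Welding simulations couple fields like this temperature solution to mechanical and microstructural models, with a moving heat source rather than fixed boundary temperatures.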
Particle models play an important role in many applications in physics, chemistry and biology. They can be studied on the computer with the help of molecular dynamics simulations. This book presents in detail both the necessary numerical methods and techniques (linked-cell method, SPME method, tree codes, multipole technique) and the theoretical background and foundations. It illustrates the aspects of modelling, discretization, algorithms and their parallel implementation with MPI on computer systems with distributed memory. Furthermore, the different steps of a numerical simulation are explained in detail, and code examples are provided. With the description of the algorithms and the presentation of results from various simulations in materials science, nanotechnology, biochemistry and astrophysics, readers of this book will be able to write their own molecular dynamics programs step by step and to run successful simulations.
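As a minimal taste of the ingredients described (a generic sketch, not code from the book), here is the Lennard-Jones pair interaction that short-range molecular dynamics codes typically evaluate inside the linked-cell loop:

```python
def lj_potential(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential: 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def lj_force(r, epsilon=1.0, sigma=1.0):
    """Magnitude of the pair force, -dU/dr (positive = repulsive)."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

# At the potential minimum r = 2^(1/6)*sigma the force vanishes and the
# potential equals -epsilon.
r_min = 2.0 ** (1.0 / 6.0)
```

Because this interaction decays rapidly, it is cut off at a few sigma, which is precisely what makes the linked-cell method effective: only particles in neighboring cells need to be checked.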