This book constitutes the refereed proceedings of the 21st International Conference on Analytical and Stochastic Modelling Techniques and Applications, ASMTA 2014, held in Budapest, Hungary, in June/July 2014. The 18 papers presented were carefully reviewed and selected from 27 submissions. The papers discuss the latest developments in analytical, numerical and simulation algorithms for stochastic systems, including Markov processes, queueing networks, stochastic Petri nets, process algebras, game theory, etc.
The six-volume set LNCS 8579-8584 constitutes the refereed proceedings of the 14th International Conference on Computational Science and Its Applications, ICCSA 2014, held in Guimaraes, Portugal, in June/July 2014. The 347 revised papers presented in 30 workshops and a special track were carefully reviewed and selected from 1167 initial submissions. The 289 papers presented in the workshops cover various areas in computational science ranging from computational science technologies to specific areas of computational science such as computational geometry and security.
This book constitutes the refereed proceedings of the 8th International Conference on Tests and Proofs, TAP 2014, held in York, UK, in July 2014, as part of the STAF 2014 Federated Conferences. The 10 revised full papers and 4 short papers presented together with two tutorial descriptions were carefully reviewed and selected from 27 submissions. The papers cover topics in the following four research areas: test generation, bridging semantic gaps, integrated development processes and bounded verification.
Human-in-the-Loop Simulations is a compilation of articles from experts in the design, development, and use of human-in-the-loop simulations. The first section of the handbook consists of papers on fundamental concepts in human-in-the-loop simulations, such as object-oriented simulation development, interface design and development, and performance measurement. The second section includes papers from researchers who utilized HITL simulations to inform models of cognitive processes, including decision making and metacognition. The last section describes human-in-the-loop processes for complex simulation models in trade space exploration and epidemiological analyses. Human-in-the-Loop Simulations is a useful tool for multiple audiences, including graduate students and researchers in engineering and computer science.
Computational Atomic Physics deals with computational methods for calculating electron (and positron) scattering from atoms and ions, including elastic scattering, excitation, and ionization processes. Each chapter is divided into abstract, theory, computer program with sample input and output, summary, suggested problems, and references. An MS-DOS diskette is included, which holds 11 programs covering the features of each chapter, thereby contributing to a deeper understanding of the field. Thus the book provides a unique practical application of advanced quantum mechanics.
Using computers to solve problems and model physical systems has fast become an integral part of undergraduate and graduate education in physics. This third-year undergraduate and subsequent graduate course is a supplement to courses in theoretical physics and develops problem-solving techniques using the computer. It makes use of the newest version of Mathematica (3.0) while still remaining compatible with older versions. The programs, written in Mathematica 3.0 and C, run on both PCs and workstations, and the problems, source files, and graphic routines help students gain experience from the very beginning.
Fuzzy classifiers are important tools in exploratory data analysis, which is a vital set of methods used in various engineering, scientific and business applications. Fuzzy classifiers use fuzzy rules and do not require the assumptions common to statistical classification. Rough set theory is useful when data sets are incomplete: it defines a formal approximation of crisp sets by providing the lower and the upper approximation of the original set. Systems based on rough sets have a natural ability to work on such data, and incomplete vectors do not have to be preprocessed before classification. To achieve better performance than existing machine learning systems, fuzzy classifiers and rough sets can be combined in ensembles; such ensembles consist of a finite set of learning models, usually weak learners. The present book discusses the three aforementioned fields - fuzzy systems, rough sets and ensemble techniques. As the trained ensemble should represent a single hypothesis, considerable attention is given to combining fuzzy rules from the fuzzy systems that are members of the classification ensemble. Furthermore, an emphasis is placed on ensembles that can work on incomplete data, thanks to rough set theory.
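The lower and upper approximations mentioned above can be illustrated with a minimal sketch (all names and data here are hypothetical, not from the book): objects are grouped into equivalence classes by their known attribute values, and a target set is approximated from below by classes fully inside it and from above by classes that touch it.

```python
# Rough-set lower/upper approximation - illustrative sketch only.
from collections import defaultdict

def approximations(objects, attrs, target):
    """objects: dict id -> attribute dict; target: set of ids."""
    # Partition objects into equivalence classes by their attribute values.
    classes = defaultdict(set)
    for oid, values in objects.items():
        classes[tuple(values[a] for a in attrs)].add(oid)
    lower, upper = set(), set()
    for eq in classes.values():
        if eq <= target:
            lower |= eq      # whole class inside target: certainly in X
        if eq & target:
            upper |= eq      # class touches target: possibly in X
    return lower, upper

objects = {1: {"colour": "red"}, 2: {"colour": "red"},
           3: {"colour": "blue"}, 4: {"colour": "blue"}}
lower, upper = approximations(objects, ["colour"], {1, 2, 3})
# lower = {1, 2}: certainly in the set; upper = {1, 2, 3, 4}: possibly in it
```

The gap between the two approximations (here, objects 3 and 4) is exactly the region where the incomplete attributes cannot decide membership.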
Finite Element Programs for Structural Vibrations presents detailed descriptions of how to use six computer programs (written in Fortran 77) to determine the resonant frequencies of one, two, and three-dimensional skeletal structures through the finite element method. Chapter 1 is on "The Finite Element Method" and Chapter 2 demonstrates, with the aid of hand calculations, the finite element solution of some smaller structures. Chapter 3 covers "The Modular Approach", and Chapters 4 to 9 describe the six computer programs, with a large number of worked examples. The six computer programs are given in Appendices I through VI, and on a 3 1/2'' disk included with the book. The programs are suitable for use on IBM (or compatible) PC (640K or more) or minicomputer.
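The finite element method described above reduces a vibration problem to the generalized eigenproblem K x = omega^2 M x. A hand-sized sketch (illustrative only, not one of the book's Fortran programs; the stiffness/mass ratio is an assumed value): two equal masses joined by three equal springs between fixed walls, whose 2x2 stiffness matrix has eigenvalues k/m and 3k/m.

```python
# Resonant frequencies of a two-mass, three-spring chain - illustrative sketch.
import math

k_over_m = 4.0                  # stiffness/mass ratio (assumed value)
# Stiffness matrix divided by mass: [[2, -1], [-1, 2]] * k/m
a, b = 2 * k_over_m, -k_over_m
# Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, a]] are a + b and a - b,
# so the natural frequencies are sqrt(k/m) and sqrt(3 k/m).
omegas = sorted(math.sqrt(a + s * b) for s in (+1, -1))
print(omegas)  # [2.0, sqrt(12) = 3.464...]
```

The same structure, assembled from element stiffness and mass matrices, is what the book's programs solve for one-, two-, and three-dimensional skeletal structures.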
Discrete event simulation and agent-based modeling are increasingly recognized as critical for diagnosing and solving process issues in complex systems. Introduction to Discrete Event Simulation and Agent-based Modeling covers the techniques needed for success in all phases of simulation projects, including:
* Definition - The reader will learn how to plan a project and communicate using a charter.
* Input analysis - The reader will discover how to determine defensible sample sizes for all needed data collections, and how to fit distributions to that data.
* Simulation - The reader will understand how simulation controllers work, the Monte Carlo (MC) theory behind them, modern verification and validation, and ways to speed up simulation using variance reduction techniques and other methods.
* Output analysis - The reader will be able to establish simultaneous confidence intervals on key responses and apply selection and ranking, design of experiments (DOE), and black box optimization to develop defensible improvement recommendations.
* Decision support - Methods to inspire creative alternatives are presented, including lean production.
Also, over one hundred solved problems are provided, along with two full case studies, including one on voting machines that received international attention. Introduction to Discrete Event Simulation and Agent-based Modeling demonstrates how simulation can facilitate improvements on the job and in local communities, allowing readers to competently apply technology considered key in many industries and branches of government. It is suitable for undergraduate and graduate students, as well as researchers and other professionals.
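The simulation and output-analysis phases described above can be sketched in miniature (a minimal illustration, not from the book): an M/M/1 queue simulated via Lindley's recurrence, replicated with independent seeds, and summarized by a simple normal-theory confidence interval on the mean wait.

```python
# Minimal discrete event simulation sketch with replicated output analysis.
import random
import statistics

def mm1_mean_wait(arrival_rate, service_rate, n_customers, seed):
    """Average waiting time in queue for an M/M/1 system, one replication."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        inter = rng.expovariate(arrival_rate)
        service = rng.expovariate(service_rate)
        total += wait
        # Lindley's recurrence: W_{k+1} = max(0, W_k + S_k - A_{k+1})
        wait = max(0.0, wait + service - inter)
    return total / n_customers

# Ten independent replications, then a ~95% confidence interval on the mean.
reps = [mm1_mean_wait(0.8, 1.0, 20_000, seed) for seed in range(10)]
mean = statistics.mean(reps)
half = 1.96 * statistics.stdev(reps) / len(reps) ** 0.5
# Steady-state theory for this M/M/1 gives a mean wait of 0.8 / 0.2 = 4.0.
print(f"mean wait ~ {mean:.2f} +/- {half:.2f}")
```

Independent replications with a confidence interval are the simplest of the output-analysis tools the blurb mentions; selection/ranking and DOE build on the same replicated-run structure.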
This book constitutes the thoroughly refereed post-conference proceedings of the 9th International Conference on Large-Scale Scientific Computations, LSSC 2013, held in Sozopol, Bulgaria, in June 2013. The 74 revised full papers presented together with 5 plenary and invited papers were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on numerical modeling of fluids and structures; control and uncertain systems; Monte Carlo methods: theory, applications and distributed computing; theoretical and algorithmic advances in transport problems; applications of metaheuristics to large-scale problems; modeling and numerical simulation of processes in highly heterogeneous media; large-scale models: numerical methods, parallel computations and applications; numerical solvers on many-core systems; cloud and grid computing for resource-intensive scientific applications.
This book constitutes the refereed proceedings of the 4th International Conference on Computational Modeling of Objects Presented in Images, CompIMAGE 2014, held in Pittsburgh, PA, USA, in September 2014. The 29 revised full papers presented together with 10 short papers and 6 keynote talks were carefully reviewed and selected from 54 submissions. The papers cover the following topics: medical treatment, imaging and analysis; image registration, denoising and feature identification; image segmentation; shape analysis, meshing and graphs; medical image processing and simulations; image recognition, reconstruction and predictive modeling; image-based modeling and simulations; and computer vision and data-driven investigations.
This book constitutes the thoroughly refereed post-conference proceedings of the 17th International Workshop on Job Scheduling Strategies for Parallel Processing, JSSPP 2013, held in Boston, MA, USA, in May 2013. The 10 revised papers presented were carefully reviewed and selected from 20 submissions. The papers cover the following topics: parallel scheduling for commercial environments, scientific computing, supercomputing and cluster platforms.
This book constitutes the refereed proceedings of the 5th International Conference on Information Processing in Computer-Assisted Interventions, IPCAI 2014, held in Fukuoka, Japan, on June 28, 2014. The 28 papers presented were carefully reviewed and selected from 58 submissions. The papers are organized in topical sections on planning, simulation, patient specific models for computer assisted interventions, medical robotics and surgical navigation, interventional imaging and advanced intra-op visualization, cognition, modeling and context awareness, clinical applications, systems, software, and validation.
This book explains and examines the theoretical underpinnings of the Complex Variable Boundary Element Method (CVBEM) as applied to higher dimensions, providing the reader with the tools for extending and using the CVBEM in various applications. The CVBEM is an approximation method for solving problems involving the Laplace or Poisson equations on arbitrary two-dimensional domains, and it has recently been extended to three and higher spatial dimensions, making its precision in solving the Laplace equation available in multiple dimensions. Relevant mathematics and principles are assembled, and the reader is guided through the key topics necessary for understanding the development of the CVBEM in the usual two as well as three or higher dimensions; problems that build upon the material presented are also provided. The mathematical underpinnings of the CVBEM and its extension to higher dimensions draw on several areas of applied and pure mathematics, including Banach and Hilbert spaces. This book is intended for applied mathematics graduate students, engineering students or practitioners, developers of industrial applications involving the Laplace or Poisson equations, and developers of computer modelling applications.
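The CVBEM rests on the fact that the real (and imaginary) part of any analytic function of a complex variable satisfies Laplace's equation in two dimensions. A quick numerical check of that underlying fact (an illustrative sketch, not the CVBEM itself): the five-point finite-difference Laplacian of u(x, y) = Re((x + iy)^3) vanishes everywhere.

```python
# Harmonicity of the real part of an analytic function - illustrative check.
def u(x, y):
    # Re((x + iy)^3) = x^3 - 3*x*y^2, a harmonic function
    return ((x + 1j * y) ** 3).real

h = 1e-3  # finite-difference step

def laplacian(x, y):
    # Five-point stencil approximation of u_xx + u_yy
    return (u(x + h, y) + u(x - h, y) + u(x, y + h) + u(x, y - h)
            - 4 * u(x, y)) / h ** 2

print(laplacian(0.7, -0.4))  # ~0, up to floating point roundoff
```

CVBEM approximations are built from linear combinations of such analytic basis functions, so the approximate solution satisfies the Laplace equation exactly and only the boundary conditions are approximated.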
Energy consumption is of great interest to manufacturing companies. Beyond considering individual processes and machines, the perspective on process chains and factories as a whole holds major potential for energy efficiency improvements. To exploit this potential, dynamic interactions of different processes as well as auxiliary equipment (e.g. compressed air generation) need to be taken into account. In addition, planning and controlling manufacturing systems require balancing technical, economic and environmental objectives. Therefore, an innovative and comprehensive methodology - with a generic energy flow-oriented manufacturing simulation environment as a core element - is developed and embedded into a step-by-step application cycle. The concept is applied in its entirety to a wide range of case studies such as aluminium die casting, weaving mills, and printed circuit board assembly in order to demonstrate the broad applicability and the benefits that can be achieved.
The field of minimally invasive surgery (MIS) has now taken centre stage in modern clinical practice. With ever changing technologies in the field of MIS, such as robotics, there is now the need to train the surgeon to the next degree. Training by simulation, whether virtual, hybrid, or real, allows the surgeon to rehearse, learn, improve or maintain their skills in a safe and stress free environment. "Simulation Training in Laparoscopy and Robotic Surgery" gives a true insight into the latest educational and learning techniques for new technologies in surgery. Written by an international team of experts, this illustrated text provides advice on specialised team training, non technical skills and simulation. "Simulation Training in Laparoscopy and Robotic Surgery" is an important training aide for surgeons and residents interested in developing skills in this field.
This book introduces and describes in detail the SEQUAL framework for understanding the quality of models and modeling languages, including the numerous specializations of the generic framework, and the various ways in which this can be used for different applications. Topics and features: contains case studies, chapter summaries, review questions, problems and exercises throughout the text, in addition to Appendices on terminology and abbreviations; presents a thorough introduction to the most important concepts in conceptual modeling, including the underlying philosophical outlook on the quality of models; describes the basic tasks and model types in information systems development and evolution, and the main methodologies for mixing different phases of information system development; provides an overview of the general mechanisms and perspectives used in conceptual modeling; predicts future trends in technological development, and discusses how the role of modeling can be envisaged in this landscape.
3D Imaging, Analysis and Applications brings together core topics, both in terms of well-established fundamental techniques and the most promising recent techniques in the exciting field of 3D imaging and analysis. Many similar techniques are being used in a variety of subject areas and applications and the authors attempt to unify a range of related ideas. With contributions from high profile researchers and practitioners, the material presented is informative and authoritative and represents mainstream work and opinions within the community. Composed of three sections, the first examines 3D imaging and shape representation, the second, 3D shape analysis and processing, and the last section covers 3D imaging applications. Although 3D Imaging, Analysis and Applications is primarily a graduate text, aimed at masters-level and doctoral-level research students, much material is accessible to final-year undergraduate students. It will also serve as a reference text for professional academics, people working in commercial research and development labs and industrial practitioners.
This book constitutes the thoroughly refereed post-conference proceedings of the Second International Workshop on Energy Efficient Data Centers, E(2)DC 2013, held in Berkeley, CA, USA, in May 2013; co-located with SIGCOMM e-Energy 2013. The 8 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on energy and workload measurement; energy management; simulators and control.
This volume contains the proceedings of the Case Study Track, held at the 4th International Conference, ABZ 2014, in Toulouse, France, in June 2014. The 11 papers presented were carefully reviewed and selected from numerous submissions. They use different formal techniques: B, ASM, and Fiacre. They also propose different kinds of verification, such as proof, model checking, test generation, run-time monitoring, and simulation.
The physics and dynamics of the atmosphere and atmosphere-ocean interactions provide the foundation of modern climate models, upon which our understanding of the chemistry and biology of ocean and land surface processes are built. Originally published in 2006, Frontiers of Climate Modeling captures developments in modeling the atmosphere, and their implications for our understanding of climate change, whether due to natural or anthropogenic causes. Emphasis is on elucidating how greenhouse gases and aerosols are altering the radiative forcing of the climate system and the sensitivity of the system to such perturbations. An expert team of authors address key aspects of the atmospheric greenhouse effect, clouds, aerosols, atmospheric radiative transfer, deep convection dynamics, large scale ocean dynamics, stratosphere-troposphere interactions, and coupled ocean-atmosphere model development. The book is an important reference for researchers and advanced students interested in the forces driving the climate system and how they are modeled by climate scientists.
This SpringerBrief focuses on the use of egress models to assess the optimal strategy for total evacuation in high-rise buildings. It investigates occupant relocation and evacuation strategies involving the exit stairs, elevators, sky bridges and combinations thereof. Chapters review existing information on this topic and describe case study simulations of a multi-component exit strategy. This review provides the architectural design, regulatory and research communities with a thorough understanding of the current and emerging evacuation procedures and possible future options. A model case study simulates seven possible strategies for the total evacuation of two identical twin towers linked with two sky-bridges at different heights. The authors present the layout of the building and the available egress components including both vertical and horizontal egress components, namely stairs, occupant evacuation elevators (OEEs), service elevators, transfer floors and sky-bridges. The evacuation strategies employ a continuous spatial representation evacuation model (Pathfinder) and are cross-validated by a fine network model (STEPS). Assessment of Total Evacuation Systems for Tall Buildings is intended for practitioners as a tool for analyzing evacuation methods and efficient exit strategies. Researchers working in architecture and fire safety will also find the book valuable.
Christoph Clauser and Jörn Bartels. SHEMAT (Simulator for HEat and MAss Transport) is an easy-to-use, general-purpose reactive transport simulation code for a wide variety of thermal and hydrogeological problems in two and three dimensions. Specifically, SHEMAT solves coupled problems involving fluid flow, heat transfer, species transport, and chemical water-rock interaction in fluid-saturated porous media. It can handle a wide range of time scales. Therefore, it is useful to address both technical and geological processes. In particular, it offers special and attractive features for modeling steady-state and transient processes in hydro-geothermal reservoirs. This makes it well suited to predict the long-term behavior of heat mining installations in hot aquifers with highly saline brines. SHEMAT in its present form evolved from a fully coupled flow and heat transport model (Clauser 1988) which had been developed from the isothermal USGS 3-D groundwater model of Trescott and Larson (Trescott 1975; Trescott and Larson 1977). Transport of dissolved species, geochemical reactions between the solid and fluid phases, extended coupling between the individual processes (most notably between porosity and permeability), and a convenient user interface (developed from Processing Modflow (Chiang and Kinzelbach 2001)) were added during several research projects funded by the German Science Foundation (DFG) under grant CL 12117 and the German Federal Ministries for Education, Science, Research, and Technology (BMBF) under grant 032 69 95A-D and for Economics and Technology (BMWi) under grant 0327095 (Bartels et al. 2002, Kuhn et al. 2002a).
In order to satisfy the needs of their customers, network utilities require specially developed maintenance management capabilities. Maintenance management information systems are essential to ensure control, gain knowledge and improve decision-making in companies dealing with network infrastructure, such as the distribution of gas, water, electricity and telecommunications. Maintenance Management in Network Utilities studies the specific characteristics of maintenance management in this sector to offer a practical approach to defining and implementing the best management practices and suitable frameworks.
You may like...
* Recent Advances in Numerical Simulations - Francisco Bulnes, Jan Peter Hessling (Hardcover)
* Numerical Modeling and Computer… - Dragan M. Cvetkovic, Gunvant A. Birajdar (Hardcover)
* Advances in Principal Component Analysis - Fausto Pedro Garcia Marquez (Hardcover)
* Chemical Modelling - Volume 17 - Hilke Bahmann, Jean Christophe Tremblay (Hardcover)