Huge earthquakes and tsunamis have caused serious damage to important structures such as civil infrastructure elements, buildings and power plants around the globe. To quantitatively evaluate such damage processes and to design effective prevention and mitigation measures, the latest high-performance computational mechanics technologies, which include terascale to petascale computers, can offer powerful tools. The phenomena covered in this book include seismic wave propagation in the crust and soil, seismic response of infrastructure elements such as tunnels considering soil-structure interactions, seismic response of high-rise buildings, seismic response of nuclear power plants, tsunami run-up over coastal towns and tsunami inundation considering fluid-structure interactions. The book provides all necessary information for addressing these phenomena, ranging from the fundamentals of high-performance computing for finite element methods, key algorithms of accurate dynamic structural analysis, fluid flows with free surfaces, and fluid-structure interactions, to practical applications with detailed simulation results. The book will offer essential insights for researchers and engineers working in the field of computational seismic/tsunami engineering.
This book contains a selection of papers from the 16th International Symposium on Spatial Data Handling (SDH), the premier long-running forum in geographical information science. This collection offers readers exemplary contributions to geospatial scholarship and practice from the conference's 30th anniversary.
This book is a collection of writings by active researchers in the field of Artificial General Intelligence, on topics of central importance in the field. Each chapter focuses on one theoretical problem, proposes a novel solution, and is written in sufficiently non-technical language to be understandable by advanced undergraduates or scientists in allied fields. This book is the very first collection in the field of Artificial General Intelligence (AGI) focusing on theoretical, conceptual, and philosophical issues in the creation of thinking machines. All the authors are researchers actively developing AGI projects, thus distinguishing the book from much of the theoretical cognitive science and AI literature, which is generally quite divorced from practical AGI system building issues. And the discussions are presented in a way that makes the problems and proposed solutions understandable to a wide readership of non-specialists, providing a distinction from the journal and conference-proceedings literature. The book will benefit AGI researchers and students by giving them a solid orientation in the conceptual foundations of the field (something not currently available anywhere), and it will benefit researchers in allied fields by giving them a high-level view of the current state of thinking in the AGI field. Furthermore, by addressing key topics in the field in a coherent way, the collection as a whole may play an important role in guiding future research in both theoretical and practical AGI, and in linking AGI research with work in allied disciplines.
This book is a compilation of a selected subset of research articles presented at the Eighth INFORMS Computing Society Conference, held in Chandler, Arizona, from January 8 to 10, 2003. The articles in this book represent the diversity and depth of the interface between ORiMS (operations research and the management sciences) and CS/AI (computer science and artificial intelligence). This volume starts with two papers that represent the reflective and integrative thinking that is critical to any scientific discipline. These two articles present philosophical perspectives on computation, covering a variety of traditional and newer methods for modeling, solving, and explaining mathematical models. The next set includes articles that study machine learning and computational heuristics, and is followed by articles that address issues in performance testing of solution algorithms and heuristics. These two sets of papers demonstrate the richness of thought that takes place at the ORiMS and CS/AI interface. The final set of articles demonstrates the usefulness of these and other methods at the interface towards solving problems in the real world, covering e-commerce, workflow, electronic negotiation, music, parallel computation, and telecommunications. The articles in this collection represent the results of cross-fertilization between ORiMS and CS/AI, making possible advances that could not have been achieved in isolation. The continuing aim of the INFORMS Computing Society and this research conference is to invigorate and further develop this interface.
Power system modelling and scripting is a quite general and ambitious title. Of course, to embrace all existing aspects of power system modelling would lead to an encyclopedia and would likely be an impossible task. Thus, the book focuses on a subset of power system models based on the following assumptions: (i) devices are modelled as a set of nonlinear differential algebraic equations, (ii) all alternating-current devices are operating in three-phase balanced fundamental frequency, and (iii) the time frame of the dynamics of interest ranges from tenths to tens of seconds. These assumptions basically restrict the analysis to transient stability phenomena and generator controls. The modelling step is not self-sufficient. Mathematical models have to be translated into computer programming code in order to be analyzed, understood and experienced. It is an object of the book to provide a general framework for a power system analysis software tool and hints for filling this framework with versatile programming code. This book is for all students and researchers who are looking for a quick reference on power system models or need some guidelines for starting the challenging adventure of writing their own code.
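As a rough sketch of assumption (i), device models of this kind are commonly written in a generic semi-explicit differential algebraic form; the notation below is generic and not necessarily the book's own:

```latex
\dot{\mathbf{x}} = \mathbf{f}(\mathbf{x}, \mathbf{y}, \mathbf{u}), \qquad
\mathbf{0} = \mathbf{g}(\mathbf{x}, \mathbf{y}, \mathbf{u})
```

Here x collects the state variables (e.g., generator rotor angles and speeds), y the algebraic variables (e.g., bus voltage magnitudes and phases), and u the inputs and controls.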
Conceived for both computer scientists and biologists alike, this collection of 22 essays highlights the important new role that computers play in developmental biology research. Essays show how, through computer modeling, researchers gain further insight into developmental processes. Featured essays also cover their use in designing computer algorithms to tackle computer science problems in areas like neural network design, robot control, evolvable hardware, and more. Peter Bentley, noted for his prolific research on evolutionary computation, and Sanjeev Kumar head up a respected team to guide readers through these very complex and fascinating disciplines.
This volume introduces a series of different data-driven computational methods for analyzing group processes through didactic and tutorial-based examples. Group processes are of central importance to many sectors of society, including government, the military, health care, and corporations. Computational methods are better suited to handle (potentially huge) group process data than traditional methodologies because of their more flexible assumptions and capability to handle real-time trace data. Indeed, the use of methods under the name of computational social science has exploded over the years. However, attention has been focused on original research rather than pedagogy, leaving those interested in obtaining computational skills without a much-needed resource. Although the methods here can be applied to wider areas of social science, they are specifically tailored to group process research. A number of data-driven methods adapted to group process research are demonstrated in this volume. These include text mining, relational event modeling, social simulation, machine learning, social sequence analysis, and response surface analysis. In order to take advantage of these new opportunities, this book provides clear examples (e.g., providing code) of group processes in various contexts, setting guidelines and best practices for future work to build upon. This volume will be of great benefit to those willing to learn computational methods. These include academics like graduate students and faculty, multidisciplinary professionals and researchers working on organization and management science, and consultants for various types of organizations and groups.
This book provides a conceptual and computational framework to study how the nervous system exploits the anatomical properties of limbs to produce mechanical function. The study of the neural control of limbs has historically emphasized the use of optimization to find solutions to the muscle redundancy problem. That is, how does the nervous system select a specific muscle coordination pattern when the many muscles of a limb allow for multiple solutions? I revisit this problem from the emerging perspective of neuromechanics that emphasizes finding and implementing families of feasible solutions, instead of a single and unique optimal solution. Those families of feasible solutions emerge naturally from the interactions among the feasible neural commands, anatomy of the limb, and constraints of the task. Such an alternative perspective to the neural control of limb function is not only biologically plausible, but sheds light on the most central tenets and debates in the fields of neural control, robotics, rehabilitation, and brain-body co-evolutionary adaptations. This perspective developed from courses I taught to engineers and life scientists at Cornell University and the University of Southern California, and is made possible by combining fundamental concepts from mechanics, anatomy, mathematics, robotics and neuroscience with advances in the field of computational geometry. Fundamentals of Neuromechanics is intended for neuroscientists, roboticists, engineers, physicians, evolutionary biologists, athletes, and physical and occupational therapists seeking to advance their understanding of neuromechanics. Therefore, the tone is decidedly pedagogical, engaging, integrative, and practical to make it accessible to people coming from a broad spectrum of disciplines. I attempt to tread the line between making the mathematical exposition accessible to life scientists, and conveying the wonder and complexity of neuroscience to engineers and computational scientists. While no one approach can hope to definitively resolve the important questions in these related fields, I hope to provide you with the fundamental background and tools to allow you to contribute to the emerging field of neuromechanics.
Understanding how the human brain represents, stores, and processes information is one of the greatest unsolved mysteries of science today. The cerebral cortex is the seat of most of the mental capabilities that distinguish humans from other animals and, once understood, it will almost certainly lead to a better knowledge of other brain nuclei. Although neuroscience research has been underway for 150 years, very little progress has been made. What is needed is a key concept that will trigger a full understanding of existing information, and will also help to identify future directions for research. This book aims to help identify this key concept. Including contributions from leading experts in the field, it provides an overview of different conceptual frameworks that indicate how some pieces of the neuroscience puzzle fit together. It offers a representative selection of current ideas, concepts, analyses, calculations and computer experiments, and also looks at important advances such as the application of new modeling methodologies. Computational Models for Neuroscience will be essential reading for anyone who needs to keep up-to-date with the latest ideas in computational neuroscience, machine intelligence, and intelligent systems. It will also be useful background reading for advanced undergraduates and postgraduates taking courses in neuroscience and psychology.
In this book, for the first time, two scientific fields - consensus formation and synchronization of communications - are presented together and examined through their interrelated aspects, which are of rapidly growing importance. Both fields have indeed attracted enormous research interest, especially in relation to complex networks.
Computer simulations are not only among the most important methods for the theoretical investigation of granular materials, but also provide the tools that have enabled much of the expanding research by physicists and engineers. The present book is intended to serve as an introduction to the application of numerical methods to systems of granular particles. Accordingly, the emphasis is on a general understanding of the subject rather than on the presentation of the latest advances in numerical algorithms. Although a basic knowledge of C++ is needed for the understanding of the numerical methods and algorithms in the book, it avoids the usage of elegant but complicated algorithms to remain accessible for those who prefer to use a different programming language. While the book focuses more on models than on the physics of granular material, many applications to real systems are presented.
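For orientation only, one widely used normal contact force law in such particle simulations is the linear spring-dashpot model; this is a common textbook choice rather than necessarily the specific model adopted in the book:

```latex
F_n = k_n\,\xi + \gamma_n\,\dot{\xi}, \qquad \xi = R_i + R_j - \lvert \mathbf{r}_i - \mathbf{r}_j \rvert
```

The force acts only while the overlap xi between two particles is positive, with k_n an elastic stiffness and gamma_n a damping coefficient.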
The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the `big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
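In loose notation (mine, not the monograph's), the transformation idea can be sketched as finding an invertible map that carries the observed data to independent, identically distributed variables:

```latex
H_n:\ (Y_1,\dots,Y_n)\ \longmapsto\ (\epsilon_1,\dots,\epsilon_n), \qquad \epsilon_1,\dots,\epsilon_n \ \text{i.i.d.}
```

A predictor of Y_{n+1} can then be read off from the inverse map together with the known common distribution of the i.i.d. variable epsilon_{n+1}.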
This volume is dedicated to Jacob Aboudi, a fine scientist who has made seminal contributions in applied mechanics. The papers presented here reflect the appreciation of many of Jacob's colleagues. A publication list following this introduction provides an indication of his distinguished academic career, currently in its fifth decade, and the breadth of his knowledge. His papers consistently demonstrate originality, innovation and diligence. This list uncovers the methodical work of a dedicated researcher whose achievements established him as a leading authority in the area of mathematical modeling of the behavior of heterogeneous materials, the area which became known as homogenization theory. Starting in 1981, Jacob established a micromechanical model known as the Method of Cells (MOC) which evolved into the Generalized Method of Cells (GMC) that predicts the macroscopic response of composite materials as a function of the properties, volume fractions, shapes, and constitutive behavior of its constituents. The versatility of the model has been demonstrated to effectively incorporate various types of constituent material behavior (i.e., both coupled and uncoupled mechanical, thermal, electrical and magnetic effects). As a result of its potential in providing an efficient tool for the emerging field of multiscale analysis, the method gained increasing attention and became a subject for further research.
This book brings together some of the most influential pieces of research undertaken around the world in design synthesis. It is the first comprehensive work of this kind and covers all three aspects of research in design synthesis: understanding what constitutes and influences synthesis; the major approaches to synthesis; and the diverse range of tools that are created to support this crucial design task. The chapters comprise cutting-edge research and established methods, written by the originators of this growing field of research. They cover all major generic synthesis approaches, i.e., composition, retrieval, change and repair, and tackle problems that come from a wide variety of domains within architecture and engineering as well as areas of application including clocks, sensors and medical devices. The book contains an editorial introduction to the chapters and the broader context of research they represent. With its range of tools and methods covered, it is an ideal introduction to design synthesis for those intending to research in this area as well as being a valuable source of ideas for educators and practitioners of engineering design.
This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment. Reviews of relevant numerical computation methods and fundamental thermodynamics are followed by a detailed examination of the basic conservation equations. The bulk of the book is concerned with development of specific simulation models. Care is taken to trace each model derivation path from the basic underlying physical equations, explaining simplifying and restrictive assumptions as they arise and relating the model coefficients to the physical dimensions and physical properties of the working materials. Numerous photographs of real equipment complement the text and most models are illustrated by numerical examples based on typical real plant operations.
This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in many VR-based simulation systems, the book will be of particular interest to researchers and professionals in the areas of surgical simulation, rehabilitation, virtual assembly, and inspection and maintenance.
This book has the unique intention of returning the mathematical tools of neural networks to the biological realm of the nervous system, where they originated a few decades ago. It aims to introduce, in a didactic manner, two relatively recent developments in neural network methodology, namely recurrence in the architecture and the use of spiking or integrate-and-fire neurons. In addition, the neuro-anatomical processes of synapse modification during development, training, and memory formation are discussed as realistic bases for weight-adjustment in neural networks. While neural networks have many applications outside biology, where it is irrelevant precisely which architecture and which algorithms are used, in biological modelling it is essential that there is a close relationship between the network's properties and whatever is known about the neuro-biological phenomenon that is being modelled or simulated in terms of a neural network. A recurrent architecture, the use of spiking neurons and appropriate weight update rules contribute to the plausibility of a neural network in such a case. Therefore, in the first half of this book the foundations are laid for the application of neural networks as models for the various biological phenomena that are treated in the second half of this book. These include various neural network models of sensory and motor control tasks that implement one or several of the requirements for biological plausibility.
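For readers unfamiliar with the term, a standard textbook leaky integrate-and-fire neuron can be written as below; the specific neuron models used in the book may of course differ in detail:

```latex
\tau_m \frac{dV}{dt} = -(V - V_{\text{rest}}) + R\,I(t)
```

A spike is emitted and V is reset to a reset value whenever the membrane potential V reaches the firing threshold.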
This book presents the selected results of the XI Scientific Conference Selected Issues of Electrical Engineering and Electronics (WZEE), which was held in Rzeszow and Czarna, Poland, on September 27-30, 2013. The main aim of the Conference was to provide academia and industry with a forum to discuss and present the latest technological advances and research results, and to integrate the new interdisciplinary scientific circle in the field of electrical engineering, electronics and mechatronics. The Conference was organized by the Rzeszow Division of the Polish Association of Theoretical and Applied Electrical Engineering (PTETiS) in cooperation with the Faculty of Electrical and Computer Engineering at Rzeszow University of Technology and the Faculty of Mathematics and Natural Sciences at Rzeszow University.
The book describes the K-Method, which has been developed by the authors. The purpose of the K-Method is to negotiate and administer a complex portfolio of customised materials, all belonging to the same purchasing group (e.g. labels). The underlying idea is to agree prices for specification features, instead of giving each material an individual price based on its unique specification. In this way, the buyer and supplier agree on a price formula that even defines prices of future materials with any kind of specification.
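A minimal sketch of what such a feature-based price formula might look like in code; all class names, features and rates below are invented for illustration and are not taken from the book:

```java
import java.util.Map;

/**
 * Hypothetical illustration of a feature-based price formula: instead of one
 * negotiated price per material, a unit price is derived from agreed rates
 * for each specification feature.
 */
public class PriceFormula {
    private final double basePrice;                 // agreed base price per unit
    private final Map<String, Double> featureRates; // agreed surcharge per "feature=value"

    public PriceFormula(double basePrice, Map<String, Double> featureRates) {
        this.basePrice = basePrice;
        this.featureRates = featureRates;
    }

    /** Price of any material, present or future, given its specification. */
    public double priceOf(Map<String, String> specification) {
        double price = basePrice;
        for (Map.Entry<String, String> feature : specification.entrySet()) {
            // Unknown feature values fall back to a zero surcharge here;
            // a real agreement would define how to handle them.
            price += featureRates.getOrDefault(
                    feature.getKey() + "=" + feature.getValue(), 0.0);
        }
        return price;
    }

    public static void main(String[] args) {
        PriceFormula labels = new PriceFormula(0.02, Map.of(
                "material=paper", 0.005,
                "material=foil", 0.012,
                "adhesive=permanent", 0.003));
        System.out.println(labels.priceOf(Map.of("material", "foil", "adhesive", "permanent")));
    }
}
```

Once the per-feature rates are agreed, any new specification can be priced without a fresh negotiation, which is the point of the approach described above.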
This book is a status report. It provides a broad overview of the most recent developments in the field, spanning a wide range of topical areas in simulational condensed matter physics. These areas include recent developments in simulations of classical statistical mechanics models, electronic structure calculations, quantum simulations, and simulations of polymers. Both new physical results and novel simulational and data analysis methods are presented. Some of the highlights of this volume include detailed accounts of recent theoretical developments in electronic structure calculations, novel quantum simulation techniques and their applications to strongly interacting lattice fermion models, and a wide variety of applications of existing methods as well as novel methods in the simulation of classical statistical mechanics models, including spin glasses and polymers.
Agent-based modelling on a computer appears to have a special role to play in the development of social science. It offers a means of discovering general and applicable social theory, and grounding it in precise assumptions and derivations, whilst addressing those elements of individual cognition that are central to human society. However, there are important questions to be asked and difficulties to overcome in achieving this potential. What differentiates agent-based modelling from traditional computer modelling? Which model types should be used under which circumstances? If it is appropriate to use a complex model, how can it be validated? Is social simulation research to adopt a realist epistemology, or can it operate within a social constructionist framework? What are the sociological concepts of norms and norm processing that could either be used for planned implementation or for identifying equivalents of social norms among co-operative agents? Can sustainability be achieved more easily in a hierarchical agent society than in a society of isolated agents? What examples are there of hybrid forms of interaction between humans and artificial agents? These are some of the sociological questions that are addressed.
Introduction The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions that form the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools. Organization of the Book The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1. To reduce the run time, different interconnect planning approaches are applied in different ranges of temperatures. Chapter 2 introduces a new design methodology - the interconnect-centric design methodology and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
This book contains the research on modeling bodies, cloth and character based adaptation performed during the last 3 years at MIRALab at the University of Geneva. More than ten researchers have worked together in order to reach a truly 3D Virtual Try On. What we mean by Virtual Try On is the possibility for anyone to enter her dimensions on a predefined body and obtain her own sized body shape, select a 3D cloth and see herself animated in real time, walking along a catwalk. Some systems exist today, but they are unable to adapt to body dimensions and have no real-time animation of body and clothes. A true web-based Virtual Try On system does not exist so far. This book is an attempt to explain how to build a 3D Virtual Try On system which is now very much in demand in the clothing industry. To describe this work, the book is divided into five chapters. The first chapter contains a brief historical background of general deformation methods. It ends with a section on the 3D human body scanner systems that are used both for rapid prototyping and statistical analyses of the human body size variations.
The term "haptics" refers to the science of sensing and manipulation through touch. Multiple disciplines such as biomechanics, psychophysics, robotics, neuroscience, and software engineering converge to support haptics, and generally, haptic research is done by three communities: the robotics community, the human computer interface community, and the virtual reality community. This book is different from any other book that has looked at haptics. The authors treat haptics as a new medium rather than just a domain within one of the above areas. They describe human haptic perception and interfaces and present fundamentals in haptic rendering and modeling in virtual environments. Diverse software architectures for standalone and networked haptic systems are explained, and the authors demonstrate the vast application spectrum of this emerging technology along with its accompanying trends. The primary objective is to provide a comprehensive overview and a practical understanding of haptic technologies. An appreciation of the close relationship between the wide range of disciplines that constitute a haptic system is a key principle towards being able to build successful collaborative haptic environments. Structured as a reference to allow for fast accommodation of the issues concerned, this book is intended for researchers interested in studying touch and force feedback for use in technological multimedia systems in computer science, electrical engineering, or other related disciplines. With its novel approach, it paves the way for exploring research trends and challenges in such fields as interpersonal communication, games, or military applications.
Researchers and developers of simulation models state that the Java programming language presents a unique and significant opportunity for important changes in the way we develop simulation models today. The most important characteristics of the Java language that are advantageous for simulation are its multi-threading capabilities, its facilities for executing programs across the Web, and its graphics facilities. It is feasible to develop compatible and reusable simulation components that will facilitate the construction of newer and more complex models. This is possible with Java development environments. Another important trend that began very recently is web-based simulation, i.e., the execution of simulation models using Internet browser software. This book introduces the application of the Java programming language in discrete-event simulation. In addition, the fundamental concepts and practical simulation techniques for modeling different types of systems to study their general behavior and their performance are introduced. The approaches applied are the process interaction approach to discrete-event simulation and object-oriented modeling. Java is used as the implementation language and UML as the modeling language. The first offers several advantages compared to C++, the most important being: thread handling, graphical user interfaces (GUI) and Web computing. The second language, UML (Unified Modeling Language), is the standard notation used today for modeling systems as a collection of classes, class relationships, objects, and object behavior.
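As a very small illustration of the event-scheduling core that underlies discrete-event simulation, here is a self-contained Java sketch; it uses a plain event list rather than the thread-based process interaction approach and class library developed in the book, so treat it as a generic outline only:

```java
import java.util.PriorityQueue;

/**
 * Minimal event-scheduling skeleton for discrete-event simulation, given only
 * to illustrate the general idea of a time-ordered future event list.
 */
public class MiniSimulation {
    /** An event holds its scheduled time and the action to execute. */
    record Event(double time, Runnable action) {}

    private final PriorityQueue<Event> agenda =
            new PriorityQueue<>((a, b) -> Double.compare(a.time(), b.time()));
    private double clock = 0.0;

    void schedule(double delay, Runnable action) {
        agenda.add(new Event(clock + delay, action));
    }

    void run(double endTime) {
        while (!agenda.isEmpty() && agenda.peek().time() <= endTime) {
            Event next = agenda.poll();
            clock = next.time();   // advance simulated time to the next event
            next.action().run();   // execute the event's effect on the model
        }
    }

    public static void main(String[] args) {
        MiniSimulation sim = new MiniSimulation();
        // A customer arrives every 2.0 time units; each arrival schedules the next.
        Runnable[] arrival = new Runnable[1];
        arrival[0] = () -> {
            System.out.printf("arrival at t=%.1f%n", sim.clock);
            sim.schedule(2.0, arrival[0]);
        };
        sim.schedule(0.0, arrival[0]);
        sim.run(10.0);
    }
}
```

A process-interaction framework such as the one the book develops typically hides a similar time-ordered scheduler behind thread-like process objects.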