Particle models play an important role in many applications in physics, chemistry and biology. They can be studied on the computer with the help of molecular dynamics simulations. This book presents in detail both the necessary numerical methods and techniques (linked-cell method, SPME method, tree codes, multipole technique) and the theoretical background and foundations. It illustrates the aspects of modelling, discretization, algorithms and their parallel implementation with MPI on computer systems with distributed memory. Furthermore, detailed explanations are given of the different steps of numerical simulation, and code examples are provided. With the description of the algorithms and the presentation of the results of various simulations from the areas of materials science, nanotechnology, biochemistry and astrophysics, readers of this book will be able to write their own molecular dynamics programs step by step and to run successful experiments.
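The time-integration step at the core of such a molecular dynamics simulation can be sketched as follows. This velocity-Verlet update with a one-particle harmonic force is an illustrative assumption of this sketch, not code from the book, which works with realistic inter-particle potentials.

```python
# Minimal sketch of one molecular-dynamics time step (velocity-Verlet).
# The harmonic force f(x) = -k*x is an illustrative stand-in for a real
# inter-particle potential such as Lennard-Jones.

def force(x, k=1.0):
    return -k * x

def verlet_step(x, v, dt, m=1.0):
    """Advance position x and velocity v by one time step dt."""
    a = force(x) / m
    x_new = x + v * dt + 0.5 * a * dt * dt
    a_new = force(x_new) / m                 # force at the new position
    v_new = v + 0.5 * (a + a_new) * dt       # average old and new acceleration
    return x_new, v_new

# Integrate a single particle for 1000 steps; for this oscillator the
# total energy 0.5*v**2 + 0.5*x**2 should stay close to its initial 0.5.
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = verlet_step(x, v, dt=0.01)
```

Velocity-Verlet is the workhorse of molecular dynamics precisely because of this near-conservation of energy over long runs.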
Make-believe plays a far stronger role in both the design and use of interfaces, games and services than we have come to believe. This edited volume illustrates ways of grasping and utilising that connection to improve interaction, user experiences, and customer value. Useful for designers, undergraduates and researchers alike, this new research provides tools for understanding and applying make-believe in various contexts, ranging from digital tools to physical services. It takes the reader through a world of imagination and intuition applied to efficient practice, with topics including the connection of human-computer interaction (HCI) to make-believe and backstories, the presence of imagination in gamification, gameworlds, virtual worlds and service design, and the believability of make-believe based designs in various contexts. Furthermore, it discusses the challenges inherent in applying make-believe as a basis for interaction design, as well as the enactive mechanism behind it. Whether used as a university textbook or simply for design inspiration, Digital Make-Believe provides new and efficient insight into approaching interaction in the way in which actual users of devices, software and services can innately utilise it.
This book presents bond graph model-based fault detection with a focus on hybrid system models. The book addresses model design, simulation, control and model-based fault diagnosis of multidisciplinary engineering systems. The text begins with a brief survey of the state of the art, then focuses on hybrid systems. The author then uses different bond graph approaches throughout the text and provides case studies.
The physics of metal forming and metal removal is normally expressed using non-linear partial differential equations, which can be solved using the finite element method (FEM). However, when the process parameters are uncertain and/or the physics of the process is not well understood, soft computing techniques can be used with FEM or alone to model the process. Using FEM, fuzzy set theory and neural networks as modeling tools, Modeling of Metal Forming and Machining Processes provides a complete treatment of metal forming and machining, and includes: an explanation of FEM and its application to the modeling of manufacturing processes; a discussion of the numerical difficulties of FEM; and chapters on the application of soft computing techniques in this modeling process. The algorithms and solved examples included make Modeling of Metal Forming and Machining Processes of value to postgraduates, senior undergraduates, lecturers and researchers in these fields. R&D engineers and consultants for the manufacturing industry will also find it of use.
The Distinguished Dissertation series is published on behalf of the Conference of Professors and Heads of Computing and The British Computer Society, who annually select the best British PhD dissertations in computer science for publication. The dissertations are selected on behalf of the CPHC by a panel of eight academics. Each dissertation chosen makes a noteworthy contribution to the subject and reaches a high standard of exposition, placing all results clearly in the context of computer science as a whole. In this way computer scientists with significantly different interests are able to grasp the essentials - or even find a means of entry - to an unfamiliar research topic. This book develops a theory of game semantics, a recently discovered setting for modelling and reasoning about sequential programming languages, suitable for interpreting higher-order functional languages with rich type structure, and applies it to construct a fully abstract model of the metalanguage FPC.
Nowadays, engineering systems are of ever-increasing complexity and must be considered as multidisciplinary systems composed of interacting subsystems or system components from different engineering disciplines. Thus, an integration of various engineering disciplines, e.g. mechanical, electrical and control engineering, in a concurrent design approach is required. With regard to the systematic development and analysis of system models, interdisciplinary computer-aided methodologies are becoming more and more important. A graphical description formalism particularly suited to multidisciplinary systems is the bond graph, devised by Professor Henry Paynter as early as 1959 at the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, USA, and in use since then all over the world. This monograph is devoted exclusively to the bond graph methodology. It gives a comprehensive, in-depth, state-of-the-art presentation, including recent results scattered over research articles and dissertations, and research contributions by the author to a number of topics. The book systematically covers the fundamentals of developing bond graphs and deriving mathematical models from them, the recent developments in methodology, and the symbolic and numerical processing of mathematical models derived from bond graphs. Additionally, it discusses modern modelling languages, the paradigm of object-oriented modelling, and modern software that can be used for building and processing bond graph models, and provides a chapter with small case studies illustrating various applications of the methodology.
This book offers a basic introduction to genetic algorithms. It provides a detailed explanation of genetic algorithm concepts and examines numerous genetic algorithm optimization problems. In addition, the book presents implementations of optimization problems in C and C++, as well as simulated solutions to genetic algorithm problems using MATLAB 7.0. It also includes application case studies on genetic algorithms in emerging fields.
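The selection-crossover-mutation loop that such an introduction builds on can be sketched as follows. The OneMax objective, operators and parameter values here are illustrative assumptions of this sketch (written in Python rather than the book's C/C++), not examples from the book.

```python
# Minimal genetic-algorithm sketch: evolve bit strings toward the
# "OneMax" optimum of all ones, using tournament selection,
# one-point crossover and bit-flip mutation.
import random

random.seed(42)
N_BITS, POP, GENS = 20, 30, 60

def fitness(bits):
    return sum(bits)                     # number of 1s; maximum is N_BITS

def tournament(pop):
    a, b = random.sample(pop, 2)         # pick two, keep the fitter
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randrange(1, N_BITS)    # one-point crossover
    return p1[:cut] + p2[cut:]

def mutate(bits, rate=0.05):
    return [b ^ 1 if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]
best = max(pop, key=fitness)
```

After 60 generations the fittest individual is close to the all-ones optimum, well above the expected fitness of about 10 for a random 20-bit string.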
Computer languages and computer graphics have become the primary modes of human-computer interaction. This book provides a basic introduction to "Real and Virtual Environment" computer modelling. Graphics models are used both to illustrate the way computer languages are processed and to create computer models of graphic displays. Computer languages have been bootstrapped from machine code, to high-level languages such as Java, to animation scripting languages. Integrating graphic and computer models takes this support for programming, design and simulation work one step further, allowing interactive computer graphic displays to be used to construct computer models of both real and virtual environment systems. The Java language is used to implement basic algorithms for language translation and to generate graphic displays. It is also used to simulate the behaviour of a computer system, to explore the way programming and design-simulation environments can be put together.
This book contains a selection of papers from the 16th International Symposium on Spatial Data Handling (SDH), the premier long-running forum in geographical information science. This collection offers readers exemplary contributions to geospatial scholarship and practice from the conference's 30th anniversary.
This book is a collection of writings by active researchers in the field of Artificial General Intelligence, on topics of central importance in the field. Each chapter focuses on one theoretical problem, proposes a novel solution, and is written in sufficiently non-technical language to be understandable by advanced undergraduates or scientists in allied fields. This book is the very first collection in the field of Artificial General Intelligence (AGI) focusing on theoretical, conceptual, and philosophical issues in the creation of thinking machines. All the authors are researchers actively developing AGI projects, thus distinguishing the book from much of the theoretical cognitive science and AI literature, which is generally quite divorced from practical AGI system building issues. And the discussions are presented in a way that makes the problems and proposed solutions understandable to a wide readership of non-specialists, providing a distinction from the journal and conference-proceedings literature. The book will benefit AGI researchers and students by giving them a solid orientation in the conceptual foundations of the field (which is not currently available anywhere), and will benefit researchers in allied fields by giving them a high-level view of the current state of thinking in the AGI field. Furthermore, by addressing key topics in the field in a coherent way, the collection as a whole may play an important role in guiding future research in both theoretical and practical AGI, and in linking AGI research with work in allied disciplines.
This graduate-level text covers modeling, programming and analysis of simulation experiments and provides a rigorous treatment of the foundations of simulation and why it works. It introduces object-oriented programming for simulation, covers both the probabilistic and statistical basis for simulation in a rigorous but accessible manner (providing all necessary background material), and provides a modern treatment of experiment design and analysis that goes beyond classical statistics. The book emphasizes essential foundations throughout, rather than providing a compendium of algorithms and theorems, and prepares the reader to use simulation in research as well as practice. The book is a rigorous but concise treatment, emphasizing lasting principles while also providing specific training in modeling, programming and analysis. In addition to teaching readers how to do simulation, it also prepares them to use simulation in their research; no other book does this. An online solutions manual for end-of-chapter exercises is also provided.
This book is a compilation of a selected subset of research articles presented at the Eighth INFORMS Computing Society Conference, held in Chandler, Arizona, from January 8 to 10, 2003. The articles in this book represent the diversity and depth of the interface between ORiMS (operations research and the management sciences) and CS/AI (computer science and artificial intelligence). This volume starts with two papers that represent the reflective and integrative thinking that is critical to any scientific discipline. These two articles present philosophical perspectives on computation, covering a variety of traditional and newer methods for modeling, solving, and explaining mathematical models. The next set includes articles that study machine learning and computational heuristics, and is followed by articles that address issues in performance testing of solution algorithms and heuristics. These two sets of papers demonstrate the richness of thought that takes place at the ORiMS and CS/AI interface. The final set of articles demonstrates the usefulness of these and other methods at the interface towards solving problems in the real world, covering e-commerce, workflow, electronic negotiation, music, parallel computation, and telecommunications. The articles in this collection represent the results of cross-fertilization between ORiMS and CS/AI, making possible advances that could not have been achieved in isolation. The continuing aim of the INFORMS Computing Society and this research conference is to invigorate and further develop this interface.
This book provides a conceptual and computational framework to study how the nervous system exploits the anatomical properties of limbs to produce mechanical function. The study of the neural control of limbs has historically emphasized the use of optimization to find solutions to the muscle redundancy problem. That is, how does the nervous system select a specific muscle coordination pattern when the many muscles of a limb allow for multiple solutions? I revisit this problem from the emerging perspective of neuromechanics that emphasizes finding and implementing families of feasible solutions, instead of a single and unique optimal solution. Those families of feasible solutions emerge naturally from the interactions among the feasible neural commands, anatomy of the limb, and constraints of the task. Such an alternative perspective on the neural control of limb function is not only biologically plausible, but sheds light on the most central tenets and debates in the fields of neural control, robotics, rehabilitation, and brain-body co-evolutionary adaptations. This perspective developed from courses I taught to engineers and life scientists at Cornell University and the University of Southern California, and is made possible by combining fundamental concepts from mechanics, anatomy, mathematics, robotics and neuroscience with advances in the field of computational geometry. Fundamentals of Neuromechanics is intended for neuroscientists, roboticists, engineers, physicians, evolutionary biologists, athletes, and physical and occupational therapists seeking to advance their understanding of neuromechanics. Therefore, the tone is decidedly pedagogical, engaging, integrative, and practical to make it accessible to people coming from a broad spectrum of disciplines. I attempt to tread the line between making the mathematical exposition accessible to life scientists and conveying the wonder and complexity of neuroscience to engineers and computational scientists.
While no one approach can hope to definitively resolve the important questions in these related fields, I hope to provide you with the fundamental background and tools to allow you to contribute to the emerging field of neuromechanics.
Understanding how the human brain represents, stores, and processes information is one of the greatest unsolved mysteries of science today. The cerebral cortex is the seat of most of the mental capabilities that distinguish humans from other animals and, once understood, it will almost certainly lead to a better knowledge of other brain nuclei. Although neuroscience research has been underway for 150 years, very little progress has been made. What is needed is a key concept that will trigger a full understanding of existing information, and will also help to identify future directions for research. This book aims to help identify this key concept. Including contributions from leading experts in the field, it provides an overview of different conceptual frameworks that indicate how some pieces of the neuroscience puzzle fit together. It offers a representative selection of current ideas, concepts, analyses, calculations and computer experiments, and also looks at important advances such as the application of new modeling methodologies. Computational Models for Neuroscience will be essential reading for anyone who needs to keep up-to-date with the latest ideas in computational neuroscience, machine intelligence, and intelligent systems. It will also be useful background reading for advanced undergraduates and postgraduates taking courses in neuroscience and psychology.
This isn't a book about the Object Data Standard; it's the complete ... When it comes to storing objects in databases, ODMG 3.0 is a ...
Computer simulations not only belong to the most important methods for the theoretical investigation of granular materials, but provide the tools that have enabled much of the expanding research by physicists and engineers. The present book is intended to serve as an introduction to the application of numerical methods to systems of granular particles. Accordingly, the emphasis is on a general understanding of the subject rather than on the presentation of the latest advances in numerical algorithms. Although a basic knowledge of C++ is needed for the understanding of the numerical methods and algorithms in the book, it avoids usage of elegant but complicated algorithms to remain accessible for those who prefer to use a different programming language. While the book focuses more on models than on the physics of granular materials, many applications to real systems are presented.
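As one minimal example of the particle-level models involved, the following sketches a linear spring-dashpot normal contact force between two overlapping discs, a common choice in granular simulations. The force law and parameter values are illustrative assumptions of this sketch, written in Python rather than the book's C++.

```python
# Linear spring-dashpot normal contact force between two discs,
# a standard ingredient of discrete-element granular simulations.
import math

def contact_force(p1, p2, k=1e4, gamma=10.0):
    """Normal force exerted on particle 1 by particle 2.

    Each particle is a tuple (x, y, vx, vy, radius).
    Returns (fx, fy); zero if the discs do not overlap.
    """
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    dist = math.hypot(dx, dy)
    overlap = p1[4] + p2[4] - dist           # positive when discs overlap
    if overlap <= 0 or dist == 0:
        return (0.0, 0.0)
    nx, ny = dx / dist, dy / dist            # unit normal, from 2 toward 1
    vn = (p1[2] - p2[2]) * nx + (p1[3] - p2[3]) * ny  # normal separation rate
    fn = k * overlap - gamma * vn            # elastic repulsion + damping
    return (fn * nx, fn * ny)

# Two unit-radius discs overlapping by 0.1 and approaching each other:
# the force on particle 1 points in the -x direction, pushing them apart.
f = contact_force((0.0, 0.0, 1.0, 0.0, 1.0), (1.9, 0.0, -1.0, 0.0, 1.0))
```

Summing such pairwise forces over all contacting pairs, then integrating Newton's equations of motion, is the core loop of a discrete-element simulation.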
This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in many VR-based simulation systems, the book will be of particular interest to researchers and professionals in the areas of surgical simulation, rehabilitation, virtual assembly, and inspection and maintenance.
This book brings together some of the most influential pieces of research undertaken around the world in design synthesis. It is the first comprehensive work of its kind and covers all three aspects of research in design synthesis: understanding what constitutes and influences synthesis; the major approaches to synthesis; and the diverse range of tools that are created to support this crucial design task. The chapters consist of cutting-edge research and established methods, written by the originators of this growing field of research. They cover all major generic synthesis approaches, i.e. composition, retrieval, change and repair, and tackle problems from a wide variety of domains within architecture and engineering, as well as areas of application including clocks, sensors and medical devices. The book contains an editorial introduction to the chapters and the broader context of research they represent. With its range of tools and methods covered, it is an ideal introduction to design synthesis for those intending to research in this area, as well as being a valuable source of ideas for educators and practitioners of engineering design.
This book presents the selected results of the XI Scientific Conference Selected Issues of Electrical Engineering and Electronics (WZEE), which was held in Rzeszow and Czarna, Poland on September 27-30, 2013. The main aim of the Conference was to provide a forum for academia and industry to discuss and present the latest technological advances and research results, and to integrate the new interdisciplinary scientific circle in the field of electrical engineering, electronics and mechatronics. The Conference was organized by the Rzeszow Division of the Polish Association of Theoretical and Applied Electrical Engineering (PTETiS) in cooperation with Rzeszow University of Technology (the Faculty of Electrical and Computer Engineering) and Rzeszow University (the Faculty of Mathematics and Natural Sciences).
The 2nd edition of Chopra's Google SketchUp provides key pedagogical elements which help prepare readers for the workforce. The content provides real-world, applied material, including improved PowerPoint presentations and how-to animations. Additional features include content updated to reflect software upgrades and market use; new pedagogical elements and interior design; and more robust resources appropriate for different users of Google SketchUp. The book also addresses the similarities between the adapted title, Google SketchUp 8 for Dummies, and Google SketchUp 2. This includes a title that contains the core content and basic software how-to from For Dummies; a revised TOC to reflect the course; and new material developed and written by the writer and academic advisors/reviewers. This edition goes beyond basic software use to teach further portions of SketchUp.
This book has the unique intention of returning the mathematical tools of neural networks to the biological realm of the nervous system, where they originated a few decades ago. It aims to introduce, in a didactic manner, two relatively recent developments in neural network methodology, namely recurrence in the architecture and the use of spiking or integrate-and-fire neurons. In addition, the neuro-anatomical processes of synapse modification during development, training, and memory formation are discussed as realistic bases for weight adjustment in neural networks. While neural networks have many applications outside biology, where it is irrelevant precisely which architecture and which algorithms are used, in biological modelling it is essential that there is a close relationship between the network's properties and those of the neuro-biological phenomenon being modelled or simulated as a neural network. A recurrent architecture, the use of spiking neurons and appropriate weight update rules contribute to the plausibility of a neural network in such a case. Therefore, in the first half of this book the foundations are laid for the application of neural networks as models for the various biological phenomena that are treated in the second half of this book. These include various neural network models of sensory and motor control tasks that implement one or several of the requirements for biological plausibility.
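The integrate-and-fire neuron mentioned above can be sketched as follows: a leaky membrane integrated with the Euler method, firing whenever it crosses a fixed threshold. All parameter values and the function name are illustrative assumptions of this sketch, not the book's formulation.

```python
# Leaky integrate-and-fire neuron: the membrane potential v decays toward
# rest and is driven by an input current; a spike is emitted and v is reset
# whenever v reaches threshold.

def lif_spikes(i_ext, dt=1.0, tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return spike times for an input-current sequence i_ext (one value per dt)."""
    v, spikes = v_rest, []
    for step, i in enumerate(i_ext):
        # Euler step of tau * dv/dt = -(v - v_rest) + i
        v += dt / tau * (-(v - v_rest) + i)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Constant suprathreshold drive for 200 steps produces regular firing.
spikes = lif_spikes([1.5] * 200)
```

With this constant drive the neuron fires periodically; a subthreshold input (here anything at or below the threshold value 1.0) would produce no spikes at all, which is the essential nonlinearity that distinguishes spiking neurons from rate-based units.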
This book is a status report. It provides a broad overview of the most recent developments in the field, spanning a wide range of topical areas in simulational condensed matter physics. These areas include recent developments in simulations of classical statistical mechanics models, electronic structure calculations, quantum simulations, and simulations of polymers. Both new physical results and novel simulational and data analysis methods are presented. Some of the highlights of this volume include detailed accounts of recent theoretical developments in electronic structure calculations, novel quantum simulation techniques and their applications to strongly interacting lattice fermion models, and a wide variety of applications of existing methods as well as novel methods in the simulation of classical statistical mechanics models, including spin glasses and polymers.
Agent-based modelling on a computer appears to have a special role to play in the development of social science. It offers a means of discovering general and applicable social theory, and grounding it in precise assumptions and derivations, whilst addressing those elements of individual cognition that are central to human society. However, there are important questions to be asked and difficulties to overcome in achieving this potential. What differentiates agent-based modelling from traditional computer modelling? Which model types should be used under which circumstances? If it is appropriate to use a complex model, how can it be validated? Is social simulation research to adopt a realist epistemology, or can it operate within a social constructionist framework? What are the sociological concepts of norms and norm processing that could either be used for planned implementation or for identifying equivalents of social norms among co-operative agents? Can sustainability be achieved more easily in a hierarchical agent society than in a society of isolated agents? What examples are there of hybrid forms of interaction between humans and artificial agents? These are some of the sociological questions that are addressed.
This book examines the historical roots and evolution of simulation from an epistemological, institutional and technical perspective. Rich case studies go far beyond documentation of simulation's capacity for application in many domains; they also explore the "functional" and "structural" debate that continues to traverse simulation thought and action. This book is an essential contribution to the assessment of simulation as a scientific instrument.
Introduction: The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions that form the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools. Organization of the Book: The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1. To reduce the run time, different interconnect planning approaches are applied in different ranges of temperatures. Chapter 2 introduces a new design methodology, the interconnect-centric design methodology, and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
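The simulated-annealing loop of the kind Chapter 1 builds on can be sketched generically as follows. The toy one-dimensional objective, neighbour move and geometric cooling schedule are illustrative assumptions of this sketch, not the book's floorplanner, whose moves operate on floorplan representations instead.

```python
# Generic simulated annealing: accept uphill moves with probability
# exp(-dE/T), cooling the temperature T geometrically between stages.
import math, random

random.seed(0)

def anneal(cost, neighbor, x0, t0=10.0, cooling=0.95, iters_per_t=50, t_min=1e-3):
    """Minimize cost(x) starting from x0; returns the best solution seen."""
    x, t = x0, t0
    best = x
    while t > t_min:
        for _ in range(iters_per_t):
            y = neighbor(x)
            d = cost(y) - cost(x)
            # Always accept improvements; accept uphill moves with
            # probability exp(-d/t), which shrinks as t cools.
            if d <= 0 or random.random() < math.exp(-d / t):
                x = y
            if cost(x) < cost(best):
                best = x
        t *= cooling                    # geometric cooling schedule
    return best

# Toy objective: a bumpy 1-D landscape with its global minimum at x = 0;
# local minima near multiples of pi would trap a pure greedy search.
f = lambda x: x * x + 10 * math.sin(x) ** 2
best = anneal(f, lambda x: x + random.uniform(-1, 1), x0=8.0)
```

The multi-stage variant described in Chapter 1 exploits exactly this temperature structure, switching to cheaper interconnect-planning evaluations in some temperature ranges to cut run time.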