What is it about the structure and organisation of science and technology that has led to the spectacularly successful growth of knowledge during this century? This book explores this important and much-debated question in an innovative way, by using computer simulations. The computer simulation of societies and social processes is a methodology which is rapidly becoming recognised for its potential in the social sciences. This book applies the tools of simulation systematically to a specific domain: science and technology studies. The book shows how computer simulation can be applied both to questions in the history and philosophy of science and to issues of concern to sociologists of science and technology. Chapters in the book demonstrate the use of simulation for clarifying the notion of creativity and for understanding the logical processes employed by eminent scientists to make their discoveries. The book begins with three introductory chapters. The first introduces simulation for the social sciences, surveying current work and explaining the advantages and pitfalls of this new methodology. The second and third chapters review recent work on theoretical aspects of social simulation, introducing fundamental concepts such as self-organisation and complexity and relating these to the simulation of scientific discovery.
This text describes computer programs for simulating phenomena in hydrodynamics, gas dynamics, and elastic-plastic flow in one, two, and three dimensions. Included in the two-dimensional program are Maxwell's equations and thermal and radiation diffusion. The numerical procedures described in the text permit the exact conservation of physical properties in the solutions of the fundamental laws of mechanics. The author also treats materials, including the use of simulation programs to predict material behavior.
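To make the conservation idea concrete, here is a minimal sketch (illustrative only, not the book's actual programs) of a flux-form update for 1D linear advection: because each cell gains exactly the flux its neighbour loses, the total of the conserved quantity is preserved to machine precision.

```python
import numpy as np

# Minimal sketch of a conservative (flux-form) update for 1D advection,
# u_t + a*u_x = 0, on a periodic grid. Illustrative assumptions, not the
# book's codes: each cell gains exactly the flux its neighbour loses, so
# the total "mass" sum(u)*dx is preserved to machine precision.

def conservative_step(u, a, dx, dt):
    flux = a * u                              # upwind flux (assumes a > 0)
    return u - (dt / dx) * (flux - np.roll(flux, 1))

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.exp(-((x - 0.5) ** 2) / 0.01)          # initial pulse
dx, dt, a = 1.0 / 200, 0.002, 1.0             # CFL = a*dt/dx = 0.4
mass0 = u.sum() * dx
for _ in range(100):
    u = conservative_step(u, a, dx, dt)
print(abs(u.sum() * dx - mass0))              # ~1e-16: conserved exactly
```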
Since the first edition of this book was published seven years ago, the field of modeling and simulation of communication systems has grown and matured in many ways, and the use of simulation as a day-to-day tool is now even more common practice. With the current interest in digital mobile communications, a primary area of application of modeling and simulation is now in wireless systems of a different flavor from the 'traditional' ones. This second edition represents a substantial revision of the first, partly to accommodate the new applications that have arisen. New chapters include material on modeling and simulation of nonlinear systems, with a complementary section on related measurement techniques, channel modeling, and three new case studies; a consolidated set of problems is provided at the end of the book.
Emerging business models, value configurations, and information technologies interact over time to create competitive advantage. Modern information technology has to be studied, understood, and applied along the time dimension of months and years, where changes are the rule. Such changes created by interactions between business elements and resources are very well suited for system dynamics modeling. Business Dynamics in Information Technology presents business-technology alignment processes, interaction processes, and decision processes, helping the reader study information technology from a dynamic, rather than a static, perspective. By introducing two simple tools from system dynamics modeling, causal loops and reference modes, the dynamic perspective will become important to both students and practitioners in the future.
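As a taste of the approach, here is a minimal, hypothetical stock-and-flow sketch; the variable names, loops and rate constants are illustrative assumptions, not the book's models.

```python
# A minimal, hypothetical stock-and-flow model: one stock ("IT
# capability") driven by a reinforcing investment loop and a balancing
# obsolescence loop. Names and rates are illustrative, not the book's;
# the (time, stock) series, plotted over time, is what system dynamics
# calls a "reference mode".

def simulate(years=10.0, dt=0.25):
    capability = 1.0                      # stock: accumulated IT capability
    history = []
    for step in range(int(years / dt)):
        investment = 0.30 * capability    # reinforcing loop (+)
        obsolescence = 0.20 * capability  # balancing loop (-)
        capability += (investment - obsolescence) * dt
        history.append((step * dt, capability))
    return history

for t, c in simulate()[::8]:
    print(f"year {t:5.2f}: capability {c:.2f}")
```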
This volume focuses on the fundamentals, models, and information technology (IT) methods and tools for disaster prediction and mitigation. A more detailed list of topics includes mathematical and computational modeling of processes leading to or producing disasters, modeling of disaster effects, and IT means for disaster mitigation, including data-mining tools, knowledge-based and expert systems for use in disaster circumstances, GIS-based systems for disaster prevention and mitigation, and equipment for disaster-prone areas. No specific type or class of disaster (natural or human-made), however, is the main focus of this work. Instead, this book was conceived to offer a comprehensive, integrative view of disasters, seeking to determine what various disasters have in common. Because disaster resilience and mitigation involve humans, societies and cultures, not only technologies and economic models, special attention is paid in this volume to gaining a comprehensive view of these issues, as a foundation of the IT tool design.
From environmental management to land planning and geo-marketing, the number of application domains that may greatly benefit from using data enriched with spatio-temporal features is expanding very rapidly. Unfortunately, development of new spatio-temporal applications is hampered by the lack of conceptual design methods suited to cope with the additional complexity of spatio-temporal data. This complexity is obviously due to the particular semantics of space and time, but also to the need for multiple representations of the same reality to address the diversity of requirements from highly heterogeneous user communities. Conceptual design methods are also needed to facilitate the exchange and reuse of existing data sets, a must in geographical data management due to the high collection costs of the data. Yet current practice in areas like geographical information systems or moving objects databases includes conceptual design methods poorly, if at all. This book shows that a conceptual design approach for spatio-temporal databases is both feasible and easy to apprehend. While providing a firm basis through extensive discussion of traditional data modeling concepts, the major focus of the book is on modeling spatial and temporal information. Parent, Spaccapietra and Zimanyi provide a detailed and comprehensive description of an approach that fills the gap between application conceptual requirements and system capabilities, covering both data modeling and data manipulation features. The ideas presented summarize several years of research on the characteristics and description of space, time, and perception. In addition to the authors' own data modeling approach, MADS (Modeling of Application Data with Spatio-temporal features), the book also surveys alternative data models and approaches (from industry and academia) that target support of spatio-temporal modeling. The reader will acquire intimate knowledge of both the traditional and innovative features that form a consistent data modeling approach. Visual notations and examples are employed extensively to illustrate the use of the various constructs. This book is therefore of major importance and interest to advanced professionals, researchers, and graduate or post-graduate students in the areas of spatio-temporal databases and geographical information systems. "For anyone thinking of doing research in this field, or who is developing a system based on spatio-temporal data, this text is essential reading." (Mike Worboys, U Maine, Orono, ME, USA) "The high-level semantic model presented and validated in this book provides essential guidance to researchers and implementers when improving the capabilities of data systems to serve the actual needs of applications and their users in the temporal and spatial domains that are so prevalent today." (Gio Wiederhold, Stanford U, CA, USA)
This textbook reviews the theory, applications, and latest breakthroughs in Delay Tolerant Networks (DTNs). Presenting a specific focus on Opportunistic Mobile Networks (OMNs), the text considers the influence of human aspects, and examines emerging forms of inter-node cooperation. Features: contains review terms and exercises in each chapter, with the solutions and source code available at an associated website; introduces the fundamentals of DTNs, covering OMNs, PSNs, and MOONs; describes the ONE simulator, explaining how to set up a simulation project; provides detailed insights into the development and testing of protocols, together with a set of best practices for increased productivity and optimized performance; examines human aspects in the context of communication networks, from human-centric applications to the impact of emotion on human-network interplay; proposes various schemes for inter-node cooperation in DTNs/OMNs; presents a detailed discussion on aspects of heterogeneity in DTNs.
This book focuses on the use of farm-level, micro- and macro-data of cooperative systems and networks in developing new robust, reliable and coherent modeling tools for agricultural and environmental policy analysis. The efficacy of public intervention in agriculture is largely determined by the existence of reliable information on the effects of policy options and market developments on farmers' production decisions and, in particular, on key issues such as levels of agricultural and non-agricultural output, land use and incomes, use of natural resources, sustainability-centric management, structural change and the viability of family farms. Over recent years, several methods and analytical tools have been developed for policy analysis using various sets of data. Such methods have been based on integrated approaches in an effort to investigate the above key issues and have thus attempted to offer a powerful environment for decision making, particularly in an era of radical change for both agriculture and the wider economy.
The book reports on the 11th International Workshop on Railway Noise, held on 9-13 September 2013 in Uddevalla, Sweden. The event, which was jointly organized by the Competence Centre Chalmers Railway Mechanics (CHARMEC) and the Departments of Applied Mechanics and Applied Acoustics at Chalmers University of Technology in Gothenburg, Sweden, covered a broad range of topics in the field of railway noise and vibration, including: prospects, legal regulations and perceptions; wheel and rail noise; prediction, measurements and monitoring; ground-borne vibration; squeal noise and structure-borne noise; and aerodynamic noise generated by high-speed trains. Further topics included: resilient track forms; grinding, corrugation and roughness; and interior noise and sound barriers. This book, which consists of a collection of peer-reviewed papers originally submitted to the workshop, not only provides readers with an overview of the latest developments in the field, but also offers scientists and engineers essential support in their daily efforts to identify, understand and solve a number of problems related to railway noise and vibration, and to achieve their ultimate goal of reducing the environmental impact of railway systems.
DRAG (from the French "un modèle de la Demande Routière, des Accidents et leur Gravité": a model of road travel demand, accidents and their severity) is a complex computer model that simulates accident propensities under detailed conditions. The DRAG approach constitutes the largest road accident modelling effort ever undertaken. Gaudry is the creator and developer of DRAG, and this work explains its nature, purpose and value. Such a model, which explains accidents for a whole region, province or country, has advantages in answering many questions asked about accidents (such as the role of the economic cycle, weather, prices, insurance, etc.) that other models fail to take fully into account.
An embedded system is loosely defined as any system that utilizes electronics but is not perceived or used as a general-purpose computer. Traditionally, one or more electronic circuits or microprocessors are literally embedded in the system, either taking up roles that used to be performed by mechanical devices, or providing functionality that is not otherwise possible. The goal of this book is to investigate how formal methods can be applied to the domain of embedded system design. The emphasis is on the specification, representation, validation, and design exploration of such systems from a high-level perspective. The authors review the framework upon which the theories and experiments are based, and through which the formal methods are linked to synthesis and simulation. A formal verification methodology is formulated to verify general properties of the designs and demonstrate that this methodology is efficient in dealing with the problem of complexity and effective in finding bugs. However, manual intervention in the form of abstraction selection and separation of timing and functionality is required. It is conjectured that, for specific properties, efficient algorithms exist for completely automatic formal validations of systems. Synchronous Equivalence: Formal Methods for Embedded Systems presents a brand-new formal approach to high-level equivalence analysis. It opens design exploration avenues previously uncharted. It is a work that can stand alone but at the same time is fully compatible with the synthesis and simulation framework described in another Kluwer Academic Publishers book, Hardware-Software Co-Design of Embedded Systems: The POLIS Approach, by Balarin et al. Synchronous Equivalence: Formal Methods for Embedded Systems will be of interest to embedded system designers (automotive electronics, consumer electronics, and telecommunications), micro-controller designers, CAD developers and students, as well as IP providers, architecture platform designers, operating system providers, and designers of VLSI circuits and systems.
"Intelligent Control" considers non-traditional modelling and control approaches to nonlinear systems. Fuzzy logic, neural networks and evolutionary computing techniques are the main tools used. The book presents a modular switching fuzzy logic controller where a PD-type fuzzy controller is executed first followed by a PI-type fuzzy controller thus improving the performance of the controller compared with a PID-type fuzzy controller.The advantage of the switching-type fuzzy controller is that it uses one rule-base thus minimises the rule-base during execution. A single rule-base is developed by merging the membership functions for change of error of the PD-type controller and sum of error of the PI-type controller. Membership functions are then optimized using evolutionary algorithms. Since the two fuzzy controllers were executed in series, necessary further tuning of the differential and integral scaling factors of the controller is then performed. Neural-network-based tuning for the scaling parameters of the fuzzy controller is then described and finally an evolutionary algorithm is applied to the neurally-tuned-fuzzy controller in which the sigmoidal function shape of the neural network is determined. The important issue of stability is addressed and the text demonstrates empirically that the developed controller was stable within the operating range. The text concludes with ideas for future research to show the reader the potential for further study in this area. "Intelligent Control "will be of interest to researchers from engineering and computer science backgrounds working in the intelligent and adaptive control."
This book describes what computer models of the environmental system are, what they are based on, how they function and, most innovatively, how they can be used to generate new useful knowledge about the environmental system. It discusses this generation of knowledge by computer models from an epistemological perspective and illustrates it with numerous examples from applied and fundamental research. Ample technical appendices make the book a valuable source of information for graduate students and scientists alike working in the field of environmental sciences.
Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design. Once the related fundamentals of decision theory, economic analysis, and econometric modeling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The chapters also provide: a rigorous framework for integrating the interests of both producers and consumers in engineering design; analytical techniques of consumer choice modeling to forecast the impact of engineering decisions; methods for synthesizing business and engineering models in multidisciplinary design environments; and examples of effective application of Decision-Based Design supported by case studies. Whether you are an engineer facing decisions in consumer-related product design, an instructor or student of engineering design, or a researcher exploring the role of decision making and consumer choice modeling in design, Decision-Based Design: Integrating Consumer Preferences into Engineering Design provides a reliable reference over a range of key topics.
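The demand-modeling step can be illustrated with a multinomial logit, the workhorse of consumer choice modeling; the attributes, part-worth coefficients and market size below are hypothetical, not taken from the book's case studies.

```python
import numpy as np

# A multinomial logit used to forecast demand shares for competing
# design alternatives. All numbers are hypothetical illustrations.

beta = np.array([0.8, -0.5])          # part-worths: performance, price
designs = np.array([[3.0, 2.0],       # one row per design alternative:
                    [4.0, 3.5],       # (performance score, price level)
                    [2.5, 1.5]])

utility = designs @ beta                           # deterministic utilities
shares = np.exp(utility) / np.exp(utility).sum()   # predicted choice shares
demand = 10_000 * shares                           # market of 10,000 buyers
print(shares.round(3), demand.round(0))
```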
In GPU Pro 5: Advanced Rendering Techniques, section editors Wolfgang Engel, Christopher Oat, Carsten Dachsbacher, Michal Valient, Wessam Bahnassi, and Marius Bjorge have once again assembled a high-quality collection of cutting-edge techniques for advanced graphics processing unit (GPU) programming. Divided into six sections, the book covers rendering, lighting, effects in image space, mobile devices, 3D engine design, and compute. It explores rasterization of liquids, ray tracing of art assets that would otherwise be used in a rasterized engine, physically based area lights, volumetric light effects, screen-space grass, the usage of quaternions, and a quadtree implementation on the GPU. It also addresses the latest developments in deferred lighting on mobile devices, OpenCL optimizations for mobile devices, morph targets, and tiled deferred blending methods. In color throughout, GPU Pro 5 is the only book that incorporates contributions from more than 50 experts who cover the latest developments in graphics programming for games and movies. It presents ready-to-use ideas and procedures that can help solve many of your daily graphics programming challenges. Example programs with source code are provided on the book's CRC Press web page.
Supramolecular chemistry has been defined by J.-M. Lehn as "a highly interdisciplinary field of science covering the chemical, physical, and biological features of chemical species of higher complexity, that are held together and organized by means of intermolecular (noncovalent) binding interactions" (Science, 1993). Recognition, reactivity, and transport represent three basic functional features, in essence dynamics, which may be translated into structural features. The purpose of the NATO workshop which took place September 1-5, 1993 at the Bischenberg (near Strasbourg) was to present computations which may contribute to the atomic-level understanding of the structural and thermodynamical features involved in the processes of molecular recognition and supramolecular organization. The main focus was therefore on the many facets of "supra-molecular modeling." Other applications of computers in chemistry, such as automation, simulation of processes, procedures for fitting kinetic or thermodynamic data, computer-assisted synthetic strategies, and the use of data bases for structure elucidation or for bibliographic searches, have an obvious impact on supramolecular chemistry as well, but were not presented at the workshop.
Shape interrogation is the process of extracting information from a geometric model. It is a fundamental component of Computer Aided Design and Manufacturing (CAD/CAM) systems. The authors focus on shape interrogation of geometric models bounded by free-form surfaces. Free-form surfaces, also called sculptured surfaces, are widely used in the bodies of ships, automobiles and aircraft, which have both functionality and attractive shape requirements. Many electronic devices as well as consumer products are designed with aesthetic shapes, which involve free-form surfaces. This book provides the mathematical fundamentals as well as algorithms for various shape interrogation methods, including nonlinear polynomial solvers, intersection problems, differential geometry of intersection curves, distance functions, curve and surface interrogation, umbilics and lines of curvature, geodesics, and offset curves and surfaces. It will be of interest to both graduate students and professionals.
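As a small taste of one interrogation task, the sketch below computes the offset of a planar parametric curve by stepping along the unit normal; the parabola and offset distance are illustrative, and the sketch ignores the hard issues (self-intersection, trimming) that the book's algorithms address.

```python
import numpy as np

# Offset of a planar parametric curve in miniature; illustrative only.

t = np.linspace(-1.0, 1.0, 9)
x, y = t, t ** 2                    # curve C(t) = (t, t^2)
dx, dy = np.ones_like(t), 2.0 * t   # tangent C'(t) = (1, 2t)
norm = np.hypot(dx, dy)
d = 0.1
ox = x - d * dy / norm              # offset = C(t) + d * unit normal,
oy = y + d * dx / norm              # with normal n = (-y', x') / |C'|
for xi, yi in zip(ox, oy):
    print(f"({xi:+.3f}, {yi:+.3f})")
```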
This introduction to random variables and signals provides engineering students with the analytical and computational tools for processing random signals using linear systems. It presents the underlying theory as well as examples and applications using computational aids throughout; in particular, computer-based symbolic computation programs are used for performing the analytical manipulations and the numerical calculations. The accompanying CD-ROM provides Mathcad™ and Matlab™ notebooks and sheets to develop processing methods. Intended for a one-semester course for advanced undergraduate or beginning graduate students, the book covers such topics as: set theory and probability; random variables, distributions, and processes; deterministic signals, spectral properties, and transformations; and filtering and detection theory. The large number of worked examples together with the programming aids make the book eminently suited for self-study as well as classroom use.
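The flavor of symbolic computation the text describes can be suggested with a short example; the book's own sheets use Mathcad and Matlab, so the use of Python's sympy here is purely illustrative.

```python
import sympy as sp

# Derive the mean and variance of an exponential random variable
# symbolically rather than numerically; illustrative stand-in for the
# book's Mathcad/Matlab sheets.

x, lam = sp.symbols("x lambda", positive=True)
pdf = lam * sp.exp(-lam * x)                  # exponential density on [0, oo)
mean = sp.integrate(x * pdf, (x, 0, sp.oo))   # E[X] -> 1/lambda
var = sp.integrate((x - mean) ** 2 * pdf, (x, 0, sp.oo))
print(mean, sp.simplify(var))                 # 1/lambda, lambda**(-2)
```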
The management of production and service processes can be supported by microcomputer simulation models, effectively and inexpensively, if the techniques are presented in an understandable manner. Drs. Klafehn, Weinroth, and Boronico prove this and show how to do it, not only for the benefit of operations managers themselves, but for others with management responsibilities in a variety of businesses and industries. Readers will learn how important daily operations problems can be modeled on a microcomputer, gain understanding of overall simulation methodology, and learn the several forms of cost savings achievable through simulation. For teachers in business schools the book will also provide a link between general management and the management of engineering and R&D. The first chapter introduces the reader to the concepts and steps for undertaking a microcomputer simulation project. In addition, the benefits, drawbacks, and myths are reviewed in detail. Chapter two explores, in a conversational scenario, what is involved in taking a management operations problem involving a truck transfer depot from its point of inception to the formulation of a systems operation model, which in a later chapter is ultimately put into a computer simulation model and tested to, in a sense, come up with answers to the questions posed in the hypothetical conversation. Subsequent chapters in the book are oriented to a discussion of other operations management problems and the effort to seek insight and solutions through simulation modeling. A Just-in-Time manufacturing system is addressed, recognizing the push-pull concept as well as looking at the quality aspect. Determining the optimum levels for safety stock, order points, and order quantity is investigated through computer simulation; these levels are predicated on balancing the costs associated with ordering and holding goods against the penalty costs of stocking out, and the simulated environment enables the inclusion of the variability evidenced by the type of demand distribution. The remaining chapters also review alternative rules and what-ifs as applied to machine configuration, facility location for a satellite EMS unit, and job shop operations. Each of the applications chapters provides a printout of the basic computer model, written in GPSS, that was then modified to investigate alternative scenarios.
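A minimal rendition in Python (the book's models are written in GPSS) of the order-point/order-quantity experiment suggests how such trade-offs are explored; the costs, demand distribution and lead time are all illustrative assumptions.

```python
import random

# Toy inventory simulation: random daily demand, a fixed replenishment
# lead time, and holding, ordering and stockout costs to balance.
# All constants are illustrative assumptions, not the book's.

def total_cost(order_point, order_qty, days=365, seed=1):
    rng = random.Random(seed)
    stock, on_order, lead_left = 40, 0, 0
    hold = ordering = stockout = 0.0
    for _ in range(days):
        if lead_left > 0:                     # an order is in transit
            lead_left -= 1
            if lead_left == 0:
                stock, on_order = stock + on_order, 0
        demand = rng.randint(0, 10)           # variable daily demand
        short = max(demand - stock, 0)
        stock = max(stock - demand, 0)
        stockout += 25.0 * short              # penalty cost of stocking out
        hold += 0.05 * stock                  # cost of holding goods
        if stock <= order_point and on_order == 0:
            on_order, lead_left = order_qty, 5  # reorder; 5-day lead time
            ordering += 40.0                  # fixed cost per order
    return hold + ordering + stockout

for op, oq in [(15, 40), (20, 50), (30, 60)]:   # "what if" policy runs
    print(op, oq, round(total_cost(op, oq), 2))
```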
This book is devoted to a new branch of experimental design theory called simulation experimental design. There are many books devoted either to the theory of experimental design or to system simulation techniques, but in this book an approach that combines both fields is developed. In particular, the mathematical theory of such universal variance-reduction techniques as splitting and Russian roulette is explored. The book contains a number of results on regression design theory related to nonlinear problems, the E-optimum criterion, and designs which minimize bias. Audience: This volume will be of value to readers interested in systems simulation, applied statistics and numerical methods who have a basic knowledge of applied statistics and linear algebra.
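For readers new to these techniques, the toy Monte Carlo below sketches both: implicit capture drives particle weights down, Russian roulette fairly removes low-weight histories, and splitting doubles histories at an important boundary. The geometry and constants are illustrative, not the book's.

```python
import random

# Toy shielding Monte Carlo illustrating Russian roulette and splitting;
# all constants are illustrative assumptions.

rng = random.Random(7)
SURVIVE, LAYERS = 0.6, 8     # per-layer survival probability, shield depth

def history(weight=1.0, layer=0):
    while layer < LAYERS:
        weight *= SURVIVE            # implicit capture instead of analog kill
        if weight < 0.02:            # Russian roulette: kill half of the
            if rng.random() < 0.5:   # low-weight histories and double the
                return 0.0           # survivors -- unbiased on average
            weight *= 2.0
        layer += 1
        if layer == LAYERS // 2:     # splitting at the important boundary:
            return (history(weight / 2.0, layer) +
                    history(weight / 2.0, layer))
    return weight                    # weight escaping through the shield

n = 100_000
print(sum(history() for _ in range(n)) / n)  # ~ SURVIVE**LAYERS ~ 0.0168
```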
This textbook provides a comprehensive introduction to probability and stochastic processes, and shows how these subjects may be applied in computer performance modeling. The author's aim is to derive probability theory in a way that highlights the complementary nature of its formal, intuitive, and applicative aspects while illustrating how the theory is applied in a variety of settings. Readers are assumed to be familiar with elementary linear algebra and calculus, including being conversant with limits, but otherwise, this book provides a self-contained approach suitable for graduate or advanced undergraduate students. The first half of the book covers the basic concepts of probability, including combinatorics, expectation, random variables, and fundamental theorems. In the second half of the book, the reader is introduced to stochastic processes. Subjects covered include renewal processes, queueing theory, Markov processes, matrix geometric techniques, reversibility, and networks of queues. Examples and applications are drawn from problems in computer performance modeling. Throughout, large numbers of exercises of varying degrees of difficulty will help to secure a reader's understanding of these important and fascinating subjects.
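A small example in the book's spirit, linking theory to simulation: the sketch below simulates an M/M/1 queue and compares the observed mean time in system with the queueing-theory formula W = 1/(mu - lambda); the arrival and service rates are illustrative.

```python
import random

# Simulate an M/M/1 queue and check it against W = 1 / (mu - lam).

lam, mu = 0.8, 1.0
rng = random.Random(42)

t_arrive = t_free = 0.0        # next arrival time; when the server frees up
total_time, n = 0.0, 100_000
for _ in range(n):
    t_arrive += rng.expovariate(lam)       # Poisson arrivals
    start = max(t_arrive, t_free)          # wait if the server is busy
    t_free = start + rng.expovariate(mu)   # exponential service
    total_time += t_free - t_arrive        # this customer's time in system

print(total_time / n, 1.0 / (mu - lam))    # simulated vs. theoretical W
```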
This book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general, and the future of high-performance systems and heterogeneous architectures in particular. The application-related contributions cover computational fluid dynamics, material science, medical applications and climate research; innovative fields such as coupled multi-physics and multi-scale simulations are highlighted. All papers were chosen from presentations given at the 18th Workshop on Sustained Simulation Performance, held at the HLRS, University of Stuttgart, Germany, in October 2013, and at the subsequent workshop of the same name held at Tohoku University in March 2014.
This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter.
Changes and additions are sprinkled throughout. Among the significant new features are:
* Markov-chain simulation (Sections 1.3, 2.6, 3.6, 4.3, 5.4.5, and 5.5);
* gradient estimation (Sections 1.6, 2.5, and 4.9);
* better handling of asynchronous observations (Sections 3.3 and 3.6);
* radically updated treatment of indirect estimation (Section 3.3);
* a new section on standardized time series (Section 3.8);
* a better way to generate random integers (Section 6.7.1) and fractions (Appendix L, program UNIFL);
* thirty-seven new problems plus improvements of old problems.
Helpful comments by Peter Glynn, Barry Nelson, Lee Schruben, and Pierre Trudeau stimulated several changes. Our new random integer routine extends ideas of Aarni Perko. Our new random fraction routine implements Pierre L'Ecuyer's recommended composite generator and provides seeds to produce disjoint streams. We thank Springer-Verlag and its late editor, Walter Kaufmann-Bühler, for inviting us to update the book for its second edition. Working with them has been a pleasure. Denise St-Michel again contributed invaluable text-editing assistance. Preface to the First Edition: Simulation means driving a model of a system with suitable inputs and observing the corresponding outputs. It is widely applied in engineering, in business, and in the physical and social sciences.
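The composite generator credited above to Pierre L'Ecuyer is commonly given as his 1988 scheme combining two multiplicative congruential generators; the sketch below follows the published constants, while the seed handling is simplified relative to the book's UNIFL routine.

```python
# L'Ecuyer's 1988 combined multiplicative congruential generator;
# constants from the published paper, simplified seed handling.

M1, A1 = 2147483563, 40014
M2, A2 = 2147483399, 40692

def make_uniform(seed1=12345, seed2=67890):
    s1, s2 = seed1, seed2
    def uniform():
        nonlocal s1, s2
        s1 = (A1 * s1) % M1        # first component generator
        s2 = (A2 * s2) % M2        # second component generator
        z = (s1 - s2) % (M1 - 1)   # combine the two streams
        if z == 0:
            z = M1 - 1
        return z / M1              # uniform variate in (0, 1)
    return uniform

u = make_uniform()
print([round(u(), 6) for _ in range(3)])
```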
1.1. INTRODUCTION. Plastic covering, either framed or floating, is now used worldwide to protect crops from unfavorable growing conditions, such as severe weather, insects, and birds. Protected cultivation in the broad sense, including mulching, has been widely spread by the innovation of plastic films. Paper, straw, and glass were the main materials used before the era of plastics. Utilization of plastics in agriculture started in the developed countries and is now spreading to the developing countries. Early utilization of plastic was in cold regions, and plastic was mainly used for protection from the cold. Now plastic is also used for protection from wind, insects and diseases. The use of covering techniques started with a simple system such as mulching; then row covers and small tunnels were developed, and finally plastic houses. Floating mulch was an exception to this sequence: it was introduced rather recently, although it is a simple structure. New development of functional and inexpensive films triggered widespread use of floating mulch. Table 1.1 lists the use of plastic mulch in the world (after Jouet, 2001).
You may like...
Finite Element Computations in Mechanics… by Khameel Bayo Mustapha (Hardcover, R3,814)
Java How to Program, Late Objects… by Paul Deitel, Harvey Deitel (Paperback)
Handbook of Floating-Point Arithmetic by Jean-Michel Muller, Nicolas Brunie, … (Hardcover, R4,055)
Software Quality Control, Error… by Judith Clapp, Saul F. Stanten, … (Hardcover, R1,241)
Fundamentals of Algebraic Specification… by Hartmut Ehrig, Bernd Mahr (Hardcover, R1,491)
Dark Silicon and Future On-chip Systems… by Suyel Namasudra, Hamid Sarbazi-Azad (Hardcover, R3,940)