Want to create sophisticated games and graphics-intensive apps? Learn OpenGL ES gets you started immediately with OpenGL ES. After mastering the basics of OpenGL ES itself, you will quickly find yourself writing and building game apps, without having to learn about object-oriented programming techniques. This book demonstrates the use of a powerful open-source modeling tool, Blender. You will be guided, step by step, through the development of Tank Fence, a dynamic, interactive 3D game. Along the way you'll gain skills in building apps with Eclipse and the Android SDK or NDK, rendering graphics using hardware acceleration, and multithreading for performance and responsiveness. iOS developers will also find this book's information invaluable when writing their apps. You'll learn everything you need to know about: * Creating simple, efficient game UIs * Designing the basic building blocks of an exciting, interactive 3D game * Pulling all the elements together with Blender, a powerful open-source tool for modeling, animation, rendering, compositing, video editing, and game creation * Taking the next big step using custom and inbuilt functions, texturing, shading, light sources, and more * Refining your mobile game app through collision detection, player-room-obstacle classes, and storage classes * Doing all this efficiently on mobile devices with limited resources and processing power
Fuzzy classifiers are important tools in exploratory data analysis, a vital set of methods used in various engineering, scientific and business applications. Fuzzy classifiers use fuzzy rules and do not require the assumptions common to statistical classification. Rough set theory is useful when data sets are incomplete. It defines a formal approximation of crisp sets by providing the lower and the upper approximation of the original set. Systems based on rough sets have a natural ability to work on such data, and incomplete vectors do not have to be preprocessed before classification. To achieve better performance than existing machine learning systems, fuzzy classifiers and rough sets can be combined in ensembles. Such ensembles consist of a finite set of learning models, usually weak learners. The present book discusses the three aforementioned fields: fuzzy systems, rough sets and ensemble techniques. As the trained ensemble should represent a single hypothesis, considerable attention is given to the possibility of combining fuzzy rules from the fuzzy systems that are members of the classification ensemble. Furthermore, an emphasis is placed on ensembles that can work on incomplete data, thanks to rough set theory.
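The lower and upper approximations described in the blurb above can be sketched in a few lines. The following toy example is illustrative only (the function name and data are not from the book); it assumes the objects have already been partitioned into indiscernibility classes:

```python
def rough_approximations(classes, target):
    """Lower/upper approximation of `target` w.r.t. equivalence classes."""
    target = set(target)
    lower, upper = set(), set()
    for c in classes:
        c = set(c)
        if c <= target:   # class lies entirely inside the target set
            lower |= c
        if c & target:    # class overlaps the target set at all
            upper |= c
    return lower, upper

# Objects 1..6 partitioned by indiscernibility; class {3, 4} is only
# partly contained in X, so it falls in the boundary region.
classes = [{1, 2}, {3, 4}, {5, 6}]
lo, up = rough_approximations(classes, {1, 2, 3})
# lo == {1, 2}; up == {1, 2, 3, 4}; boundary region == up - lo == {3, 4}
```

Objects in the boundary region (here {3, 4}) are exactly those that cannot be classified with certainty given the available attributes — the situation rough-set-based ensembles exploit for incomplete data.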
This succinct book focuses on computer aided design (CAD), 3-D modeling, and engineering analysis and the ways they can be applied effectively in research and industrial sectors including aerospace, defense, automotive, and consumer products. These efficient tools, deployed for R&D in the laboratory and the field, perform three-dimensional modeling of finished products, render complex geometrical product designs, facilitate structural analysis and optimal product design, produce graphic and engineering drawings, and generate production documentation. Written with an eye toward green energy installations and novel manufacturing facilities, this concise volume enables scientific researchers and engineering professionals to learn design techniques, control existing and complex issues, proficiently use CAD tools, visualize technical fundamentals, and gain analytic and technical skills. This book also: * Equips practitioners and researchers to handle powerful tools for engineering design and analysis using many detailed illustrations * Emphasizes important engineering design principles in introducing readers to a range of techniques * Includes tutorials providing readers with appropriate scaffolding to accelerate their learning process * Adopts a product development, cost-consideration perspective through the book's many examples
Simulation of ODE/PDE Models with MATLAB®, OCTAVE and SCILAB shows the reader how to exploit a fuller array of numerical methods for the analysis of complex scientific and engineering systems than is conventionally employed. The book is dedicated to numerical simulation of distributed parameter systems described by mixed systems of algebraic equations, ordinary differential equations (ODEs) and partial differential equations (PDEs). Special attention is paid to the numerical method of lines (MOL), a popular approach to the solution of time-dependent PDEs, which proceeds in two basic steps: spatial discretization and time integration. Besides conventional finite-difference and finite-element techniques, more advanced spatial-approximation methods are examined in some detail, including nonoscillatory schemes and adaptive-grid approaches. A MOL toolbox has been developed within MATLAB®/OCTAVE/SCILAB. In addition to a set of spatial approximations and time integrators, this toolbox includes a collection of application examples, in specific areas, which can serve as templates for developing new programs. Simulation of ODE/PDE Models with MATLAB®, OCTAVE and SCILAB provides a practical introduction to some advanced computational techniques for dynamic system simulation, supported by many worked examples in the text, and a collection of codes available for download from the book's page at www.springer.com. This text is suitable for self-study by practicing scientists and engineers and for use in a final-year undergraduate course or at the graduate level.
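The two MOL steps mentioned above — spatial discretization followed by time integration — can be illustrated with a minimal sketch for the 1D heat equation u_t = alpha * u_xx. This is plain Python rather than MATLAB/OCTAVE/SCILAB, is not taken from the book's toolbox, and uses explicit Euler only for brevity:

```python
def heat_mol(u, alpha, dx, dt, steps):
    """Method of lines for u_t = alpha*u_xx with zero Dirichlet boundaries."""
    u = list(u)
    for _ in range(steps):
        # Step 1: spatial discretization (second-order central differences)
        dudt = [0.0] * len(u)
        for i in range(1, len(u) - 1):
            dudt[i] = alpha * (u[i - 1] - 2 * u[i] + u[i + 1]) / dx**2
        # Step 2: time integration (explicit Euler; for stability,
        # dt must satisfy dt <= dx**2 / (2 * alpha))
        u = [ui + dt * di for ui, di in zip(u, dudt)]
    return u

# An initial spike in the middle diffuses outward and decays.
u0 = [0.0, 0.0, 1.0, 0.0, 0.0]
u = heat_mol(u0, alpha=1.0, dx=0.1, dt=0.004, steps=50)
```

Real MOL codes replace step 2 with a stiff ODE integrator and step 1 with the higher-order or nonoscillatory schemes the book examines; the separation of the two steps is the point of the sketch.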
This book focuses on the use of farm-level, micro- and macro-data of cooperative systems and networks in developing new robust, reliable and coherent modeling tools for agricultural and environmental policy analysis. The efficacy of public intervention in agriculture is largely determined by the existence of reliable information on the effects of policy options and market developments on farmers' production decisions and, in particular, on key issues such as levels of agricultural and non-agricultural output, land use and incomes, use of natural resources, sustainability-centered management, structural change and the viability of family farms. In recent years, several methods and analytical tools have been developed for policy analysis using various sets of data. Such methods have been based on integrated approaches in an effort to investigate the above key issues and have thus attempted to offer a powerful environment for decision making, particularly in an era of radical change for both agriculture and the wider economy.
The methods considered in the 7th conference on "Finite Volumes for Complex Applications" (Berlin, June 2014) have properties which offer distinct advantages for a number of applications. The second volume of the proceedings covers reviewed contributions reporting successful applications in the fields of fluid dynamics, magnetohydrodynamics, structural analysis, nuclear physics, semiconductor theory and other topics. The finite volume method in its various forms is a space discretization technique for partial differential equations based on the fundamental physical principle of conservation. Recent decades have brought significant success in the theoretical understanding of the method. Many finite volume methods preserve further qualitative or asymptotic properties, including maximum principles, dissipativity, monotone decay of free energy, and asymptotic stability. Due to these properties, finite volume methods belong to the wider class of compatible discretization methods, which preserve qualitative properties of continuous problems at the discrete level. This structural approach to the discretization of partial differential equations becomes particularly important for multiphysics and multiscale applications. Researchers, PhD and master's-level students in numerical analysis, scientific computing and related fields such as partial differential equations will find this volume useful, as will engineers working in numerical modeling and simulations.
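The conservation principle at the heart of finite volume methods can be seen in a minimal sketch — a first-order upwind scheme for 1D linear advection on a periodic grid. This is an illustrative example, not drawn from the proceedings: each cell is updated by the difference of the fluxes through its two faces, so whatever leaves one cell enters its neighbor and the total is conserved exactly.

```python
def advect_fv(u, a, dx, dt, steps):
    """Finite-volume upwind scheme for u_t + a*u_x = 0, periodic, a > 0."""
    n = len(u)
    u = list(u)
    for _ in range(steps):
        # Numerical flux through the left face of each cell (upwind for a > 0);
        # Python's negative indexing gives the periodic wrap at i = 0.
        flux = [a * u[i - 1] for i in range(n)]
        # Each cell gains through its left face and loses through its right.
        u = [u[i] - dt / dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]
    return u

u0 = [0.0, 1.0, 0.0, 0.0]
u = advect_fv(u0, a=1.0, dx=1.0, dt=0.5, steps=4)
# sum(u) == sum(u0) up to round-off: discrete conservation
```

Because every inter-cell flux appears once with a plus sign and once with a minus sign, the telescoping sum over all cells vanishes — the discrete counterpart of the conservation law, and the property the compatible-discretization viewpoint generalizes.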
This SpringerBrief focuses on the use of egress models to assess the optimal strategy for total evacuation in high-rise buildings. It investigates occupant relocation and evacuation strategies involving the exit stairs, elevators, sky bridges and combinations thereof. Chapters review existing information on this topic and describe case study simulations of a multi-component exit strategy. This review provides the architectural design, regulatory and research communities with a thorough understanding of the current and emerging evacuation procedures and possible future options. A model case study simulates seven possible strategies for the total evacuation of two identical twin towers linked with two sky-bridges at different heights. The authors present the layout of the building and the available egress components including both vertical and horizontal egress components, namely stairs, occupant evacuation elevators (OEEs), service elevators, transfer floors and sky-bridges. The evacuation strategies employ a continuous spatial representation evacuation model (Pathfinder) and are cross-validated by a fine network model (STEPS). Assessment of Total Evacuation Systems for Tall Buildings is intended for practitioners as a tool for analyzing evacuation methods and efficient exit strategies. Researchers working in architecture and fire safety will also find the book valuable.
This book explains and examines the theoretical underpinnings of the Complex Variable Boundary Element Method (CVBEM) as applied to higher dimensions, providing the reader with the tools for extending and using the CVBEM in various applications. Relevant mathematics and principles are assembled and the reader is guided through the key topics necessary for an understanding of the development of the CVBEM in both the usual two as well as three or higher dimensions. In addition, problems are provided that build upon the material presented. The CVBEM is an approximation method useful for solving problems involving the Laplace equation in two dimensions. It has been shown to be a useful modelling technique for solving two-dimensional problems involving the Laplace or Poisson equations on arbitrary domains. The CVBEM has recently been extended to three or higher spatial dimensions, making the precision of the CVBEM in solving the Laplace equation available in multiple dimensions. The mathematical underpinnings of the CVBEM, as well as the extension to higher dimensions, involve several areas of applied and pure mathematics, including Banach spaces and Hilbert spaces, among other topics. This book is intended for applied mathematics graduate students, engineering students or practitioners, developers of industrial applications involving the Laplace or Poisson equations and developers of computer modelling applications.
The book presents a snapshot of the state of the art in the field of turbulence modeling and covers the latest developments concerning direct numerical simulations, large eddy simulations, compressible turbulence, coherent structures, two-phase flow simulation and other related topics. It provides readers with a comprehensive review of both theory and applications, describing in detail the authors' own experimental results. The book is based on the proceedings of the third Turbulence and Interactions Conference (TI 2012), which was held on June 11-14, 2012 in La Saline-les-Bains, La Réunion, France, and includes both keynote lectures and outstanding contributed papers presented at the conference. This multifaceted collection, which reflects the conference's emphasis on the interplay of theory, experiments and computing in the process of understanding and predicting the physics of complex flows and solving related engineering problems, offers a practice-oriented guide for students, researchers and professionals in the field of computational fluid dynamics, turbulence modeling and related areas.
This book highlights recent advances in the development of effective modeling and solution approaches to enhance the performance of military logistics. It seeks to further research in global defense-related topics, including military operations, governmental operations and security, as well as nation support. Additionally, its purpose is to promote the global exchange of information and ideas amongst developers and users of military operations research tools and techniques. Over the course of its nine chapters, this edited volume addresses significant issues in military logistics, including: a) Restructuring processes via OR methods aimed at improving the efficiency and effectiveness of military logistics, b) Sense-and-Respond logistics prediction and coordination techniques that provide competitive advantage, spanning the full range of military operations across the strategic, operational and tactical levels of war, c) Procurement and auctioning, d) Inventory and stock control theories and applications, e) Military transport and logistical equipment, and f) The impact of maintenance, repair and overhaul on operational capability in general and equipment availability in particular. The book aims to bridge the gap between the abundant literature on commercial logistics and its scarce defense & combat counterpart. This collection of useful insights into new trends and research will offer an ideal reference for practitioners and army-related personnel interested in integrating scientific rigor to improve logistics management within defense organizations & agencies. Ultimately this book should provide a relevant platform for the latest contributions of operations management, operations research, and computational intelligence towards the enhancement of military logistics.
In order to satisfy the needs of their customers, network utilities require specially developed maintenance management capabilities. Maintenance management information systems are essential to ensure control, gain knowledge and improve decision making in companies dealing with network infrastructure, such as the distribution of gas, water, electricity and telecommunications. Maintenance Management in Network Utilities studies the specific characteristics of maintenance management in this sector to offer a practical approach to defining and implementing best management practices and suitable frameworks.
Colloids are ubiquitous in the food, medical, cosmetics, polymers, water purification, and pharmaceutical industries. The thermal, mechanical, and storage properties of colloids are highly dependent on their interface morphology and their rheological behavior. Numerical methods provide a convenient and reliable tool for the study of colloids. Accelerated Lattice Boltzmann Model for Colloidal Suspensions introduces the main building blocks for an improved lattice Boltzmann based numerical tool designed for the study of colloidal rheology and interface morphology. The book also covers the migrating multi-block technique used to simulate single-component, multi-component, multiphase, and single-component multiphase flows, and its validation against experimental, numerical, and analytical solutions. Among the other topics discussed are the hybrid lattice Boltzmann method (LBM) for surfactant-covered droplets; biological suspensions such as blood; and the suppression of coalescence for investigating the rheology of colloids and microvasculature blood flow. The presented LBM model provides a flexible numerical platform consisting of various modules that can be used separately or in combination for the study of a variety of colloids and biological flow deformation problems.
Christoph Clauser and Jörn Bartels. SHEMAT (Simulator for HEat and MAss Transport) is an easy-to-use, general-purpose reactive transport simulation code for a wide variety of thermal and hydrogeological problems in two and three dimensions. Specifically, SHEMAT solves coupled problems involving fluid flow, heat transfer, species transport, and chemical water-rock interaction in fluid-saturated porous media. It can handle a wide range of time scales. Therefore, it is useful to address both technical and geological processes. In particular, it offers special and attractive features for modeling steady-state and transient processes in hydro-geothermal reservoirs. This makes it well suited to predict the long-term behavior of heat mining installations in hot aquifers with highly saline brines. SHEMAT in its present form evolved from a fully coupled flow and heat transport model (Clauser 1988), which had been developed from the isothermal USGS 3-D groundwater model of Trescott and Larson (Trescott 1975; Trescott and Larson 1977). Transport of dissolved species, geochemical reactions between the solid and fluid phases, extended coupling between the individual processes (most notably between porosity and permeability), and a convenient user interface (developed from Processing Modflow (Chiang and Kinzelbach 2001)) were added during several research projects funded by the German Science Foundation (DFG) under grant CL 12117 and the German Federal Ministries for Education, Science, Research, and Technology (BMBF) under grant 032 69 95A-D and for Economics and Technology (BMWi) under grant 0327095 (Bartels et al. 2002, Kuhn et al. 2002a).
An overview of biomechanical modeling of human soft tissue using nonlinear theoretical mechanics and incremental finite element methods, useful for computer simulation of the human musculoskeletal system.
The first volume of the proceedings of the 7th conference on "Finite Volumes for Complex Applications" (Berlin, June 2014) covers topics that include convergence and stability analysis, as well as investigations of these methods from the point of view of compatibility with physical principles. It collects together the focused invited papers, as well as the reviewed contributions from internationally leading researchers in the field of analysis of finite volume and related methods. Altogether, a rather comprehensive overview is given of the state of the art in the field. The finite volume method in its various forms is a space discretization technique for partial differential equations based on the fundamental physical principle of conservation. Recent decades have brought significant success in the theoretical understanding of the method. Many finite volume methods preserve further qualitative or asymptotic properties, including maximum principles, dissipativity, monotone decay of free energy, and asymptotic stability. Due to these properties, finite volume methods belong to the wider class of compatible discretization methods, which preserve qualitative properties of continuous problems at the discrete level. This structural approach to the discretization of partial differential equations becomes particularly important for multiphysics and multiscale applications. Researchers, PhD and master's-level students in numerical analysis, scientific computing and related fields such as partial differential equations will find this volume useful, as will engineers working in numerical modeling and simulations.
We make complex decisions every day, requiring trust in many different entities for different reasons. These decisions are not made by combining many isolated trust evaluations. Many interlocking factors play a role, each dynamically impacting the others. In this brief, "trust context" is defined as the system-level description of how the trust evaluation process unfolds. Networks today are part of almost all human activity, supporting and shaping it. Applications increasingly incorporate new interdependencies and new trust contexts. Social networks connect people and organizations throughout the globe in cooperative and competitive activities. Information is created and consumed at a global scale. Systems, devices, and sensors create and process data, manage physical systems, and participate in interactions with other entities, people and systems alike. To study trust in such applications, we need a multi-disciplinary approach. This book reviews the components of the trust context through a broad review of recent literature in many different fields of study. Common threads relevant to the trust context across many application domains are also illustrated.
The "Handbook of Simulation Optimization" presents an overview of the state of the art of simulation optimization, providing a survey of the most well-established approaches for optimizing stochastic simulation models and a sampling of recent research advances in theory and methodology. Leading contributors cover such topics as discrete optimization via simulation, ranking and selection, efficient simulation budget allocation, random search methods, response surface methodology, stochastic gradient estimation, stochastic approximation, sample average approximation, stochastic constraints, variance reduction techniques, model-based stochastic search methods and Markov decision processes. This single volume should serve as a reference for those already in the field and as a guide for those new to the field in understanding and applying the main approaches. The intended audience includes researchers, practitioners and graduate students in the business/engineering fields of operations research, management science, operations management and stochastic control, as well as in economics/finance and computer science.
Decision makers in large-scale interconnected network systems require simulation models for decision support. The behaviour of these systems is determined by many actors, situated in a dynamic, multi-actor, multi-objective and multi-level environment. How can such systems be modelled, and how can the socio-technical complexity be captured? Agent-based modelling is a proven approach to handle this challenge. This book provides a practical introduction to agent-based modelling of socio-technical systems, based on a methodology that has been developed at TU Delft and which has been deployed in a large number of case studies. The book consists of two parts: the first presents the background, theory and methodology as well as practical guidelines and procedures for building models. In the second part this theory is applied to a number of case studies, where for each model the development steps are presented extensively, preparing readers to create their own models.
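The core simulate-step loop of agent-based modelling can be illustrated with a toy example (entirely hypothetical, not from the TU Delft methodology): autonomous agents act each round on a shared resource, and system-level behaviour emerges from their individual decisions.

```python
import random

class Agent:
    def __init__(self, greed):
        self.greed = greed   # probability of consuming this step
        self.wealth = 0

    def step(self, pool):
        # Each agent decides independently whether to draw one unit.
        if pool > 0 and random.random() < self.greed:
            self.wealth += 1
            return 1         # units taken from the shared pool
        return 0

def run(agents, pool, steps):
    """Advance the model: every agent acts once per time step."""
    for _ in range(steps):
        for a in agents:
            pool -= a.step(pool)
    return pool, sum(a.wealth for a in agents)

random.seed(0)
agents = [Agent(greed=0.5) for _ in range(10)]
pool_left, consumed = run(agents, pool=100, steps=10)
# units consumed by agents exactly equal the pool drawdown
```

Real socio-technical models add heterogeneous agent states, interaction networks and institutional rules, but they share this structure: a population of agents, a per-step decision rule, and an environment updated by the agents' joint actions.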
The purpose of this book is to present a methodology for designing and tuning fuzzy expert systems in order to identify nonlinear objects; that is, to build input-output models using expert and experimental information. The results of these identifications are used for direct and inverse fuzzy evidence in forecasting and diagnosis problem solving. The book is organized as follows: Chapter 1 presents the basic knowledge about fuzzy sets, genetic algorithms and neural nets necessary for a clear understanding of the rest of this book. Chapter 2 analyzes direct fuzzy inference based on fuzzy if-then rules. Chapter 3 is devoted to the tuning of fuzzy rules for direct inference using genetic algorithms and neural nets. Chapter 4 presents models and algorithms for extracting fuzzy rules from experimental data. Chapter 5 describes a method for solving the fuzzy logic equations necessary for inverse fuzzy inference in diagnostic systems. Chapters 6 and 7 are devoted to inverse fuzzy inference based on fuzzy relations and fuzzy rules. Chapter 8 presents a method for extracting fuzzy relations from data. All the algorithms presented in Chapters 2-8 are validated by computer experiments and illustrated by solving medical and technical forecasting and diagnosis problems. Finally, Chapter 9 includes applications of the proposed methodology in dynamic and inventory control systems, predicting the results of football games, decision making in road accident investigations, project management and reliability analysis.
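Direct fuzzy inference with if-then rules, as analyzed in Chapter 2, can be sketched in miniature. The example below is a generic zero-order Sugeno-style inference with triangular membership functions — an illustrative sketch, not the book's specific formulation; the rule names and numbers are invented:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer(x):
    # Rule 1: IF x is LOW  THEN y = 1.0
    # Rule 2: IF x is HIGH THEN y = 3.0
    w_low = tri(x, 0, 2, 4)    # firing strength of "x is LOW"
    w_high = tri(x, 2, 4, 6)   # firing strength of "x is HIGH"
    # Weighted average of the rule consequents (assumes at least
    # one rule fires, i.e. x lies inside the rules' joint support).
    return (w_low * 1.0 + w_high * 3.0) / (w_low + w_high)

y = infer(3.0)   # x = 3 fires both rules equally, so y lands midway at 2.0
```

The interpolation between rule consequents, weighted by how strongly each antecedent matches the input, is what lets a handful of fuzzy rules approximate a nonlinear input-output mapping.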
This book is a collection of writings by active researchers in the field of Artificial General Intelligence, on topics of central importance in the field. Each chapter focuses on one theoretical problem, proposes a novel solution, and is written in sufficiently non-technical language to be understandable by advanced undergraduates or scientists in allied fields. This book is the very first collection in the field of Artificial General Intelligence (AGI) focusing on theoretical, conceptual, and philosophical issues in the creation of thinking machines. All the authors are researchers actively developing AGI projects, thus distinguishing the book from much of the theoretical cognitive science and AI literature, which is generally quite divorced from practical AGI system-building issues. And the discussions are presented in a way that makes the problems and proposed solutions understandable to a wide readership of non-specialists, providing a distinction from the journal and conference-proceedings literature. The book will benefit AGI researchers and students by giving them a solid orientation in the conceptual foundations of the field (which is not currently available anywhere), and it will benefit researchers in allied fields by giving them a high-level view of the current state of thinking in the AGI field. Furthermore, by addressing key topics in the field in a coherent way, the collection as a whole may play an important role in guiding future research in both theoretical and practical AGI, and in linking AGI research with work in allied disciplines.
This book looks at the convergent nature of technology and its relationship to the field of photogrammetry and 3D design. This is a facet of a broader discussion of the nature of technology itself and the relationship of technology to art, as well as an examination of the educational process. In the field of technology-influenced design-based education it is natural to push for advanced technology, yet within a larger institution the constraints of budget and adherence to tradition must be accepted. These opposing forces create a natural balance; in some cases constraints lead to greater creativity than freedom ever can, but in other cases the opposite is true. This work offers insights into ways to integrate new technologies into the field of design, and from a broader standpoint it also looks ahead to the near future, asking what additional technologies might cause further disruptions to 3D design, as well as what wonderful creative opportunities they might open.
Mobile Intention Recognition addresses problems of practical relevance for mobile system engineers: how can we make mobile assistance systems more intelligent? How can we model and recognize patterns of human behavior which span more than a limited spatial context? This text provides an overview of plan and intention recognition, ranging from the late 1970s to very recent approaches. This overview is unique in that it discusses approaches with respect to the specificities of mobile intention recognition. The book covers problems from research on mobile assistance systems using methods from artificial intelligence and natural language processing. It thus addresses an extraordinarily interdisciplinary audience.
Every chapter starts with a 'mission briefing' section that describes what is to be achieved by the end of the chapter. This is followed by the decisions and steps required to accomplish the mission objective, with challenges to take the project further. The scope of the book thus mimics the real-life requirements of a developer and gets you ready to successfully build your own project. If you are a web designer looking to expand your knowledge of 3D graphics concepts and broaden your existing skill set, then this book is for you. Those looking for an introduction to 3D graphics will benefit from WebGL Hotshot, as it is a perfect guide to mastering 3D concepts, helping you build and deploy 3D worlds much more quickly. The book assumes a basic knowledge of HTML, though it can be learned concurrently while reading this book. Basic programming knowledge is useful; however, the graphical nature of web 3D content allows you to learn programming through experimentation.
In this self-contained monograph, the author gathers and describes different mathematical techniques and combines them to form practical procedures for inverse analyses. It brings together topics from mathematical programming, soft computing and proper orthogonal decomposition in order to show, in the context of structural analyses, how these methods work and which main problems need to be tackled. Throughout the book, a number of examples and exercises are worked out in order to make the reader practically familiar with the topics discussed.
This two-volume set of CCIS 391 and CCIS 392 constitutes the refereed proceedings of the Fourth International Conference on Information Computing and Applications, ICICA 2013, held in Singapore, in August 2013. The 126 revised full papers presented in both volumes were carefully reviewed and selected from 665 submissions. The papers are organized in topical sections on Internet computing and applications; engineering management and applications; intelligent computing and applications; business intelligence and applications; knowledge management and applications; information management systems; computational statistics and applications.