Welcome to Loot.co.za!
This volume focuses on the fundamentals, models, and information technology (IT) methods and tools for disaster prediction and mitigation. Topics include mathematical and computational modeling of processes leading to or producing disasters, modeling of disaster effects, and IT means for disaster mitigation, including data mining tools, knowledge-based and expert systems for use in disaster circumstances, GIS-based systems for disaster prevention and mitigation, and equipment for disaster-prone areas. No specific type or class of disaster (natural or human-made), however, is the main focus of this work. Instead, the book was conceived to offer a comprehensive, integrative view of disasters, seeking to determine what various disasters have in common. Because disaster resilience and mitigation involve humans, societies, and cultures, not only technologies and economic models, special attention is paid in this volume to gaining a comprehensive view of these issues as a foundation for the design of IT tools.
Fuzzy classifiers are important tools in exploratory data analysis, a vital set of methods used in various engineering, scientific, and business applications. Fuzzy classifiers use fuzzy rules and do not require the assumptions common to statistical classification. Rough set theory is useful when data sets are incomplete. It defines a formal approximation of crisp sets by providing the lower and the upper approximation of the original set. Systems based on rough sets have a natural ability to work on such data, and incomplete vectors do not have to be preprocessed before classification. To achieve better performance than existing machine learning systems, fuzzy classifiers and rough sets can be combined in ensembles. Such ensembles consist of a finite set of learning models, usually weak learners. The present book discusses the three aforementioned fields: fuzzy systems, rough sets, and ensemble techniques. As the trained ensemble should represent a single hypothesis, much attention is devoted to the possibility of combining fuzzy rules from the fuzzy systems that are members of a classification ensemble. Furthermore, an emphasis is placed on ensembles that can work on incomplete data, thanks to rough set theory.
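As an illustration of the lower and upper approximations described above, here is a minimal sketch; the data, partition, and function names are my own illustrative assumptions, not code from the book:

```python
# Rough-set approximations: objects are partitioned into indiscernibility
# blocks (objects indistinguishable on their known attributes). A crisp
# target set is then bracketed by a lower and an upper approximation.

def lower_approximation(blocks, target):
    """Union of blocks entirely contained in the target set (certain members)."""
    out = set()
    for b in blocks:
        if b <= target:
            out |= b
    return out

def upper_approximation(blocks, target):
    """Union of blocks that intersect the target set (possible members)."""
    out = set()
    for b in blocks:
        if b & target:
            out |= b
    return out

blocks = [{1, 2}, {3}, {4, 5}]   # indiscernibility classes (assumed example)
target = {1, 2, 3, 4}            # the crisp set to approximate

print(sorted(lower_approximation(blocks, target)))  # → [1, 2, 3]
print(sorted(upper_approximation(blocks, target)))  # → [1, 2, 3, 4, 5]
```

Missing attribute values only coarsen the partition into blocks, which is why rough-set-based systems can handle incomplete vectors without preprocessing.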
This volume constitutes the papers presented at the 15th International Conference on Computer Aided Systems Theory, EUROCAST 2015, held in February 2015 in Las Palmas de Gran Canaria, Spain. A total of 107 papers were carefully reviewed and selected for inclusion in the book. The contributions are organized in topical sections on Systems Theory and Applications; Modelling Biological Systems; Intelligent Information Processing; Theory and Applications of Metaheuristic Algorithms; Computer Methods, Virtual Reality and Image Processing for Clinical and Academic Medicine; Signals and Systems in Electronics; Model-Based System Design, Verification, and Simulation; Digital Signal Processing Methods and Applications; Modelling and Control of Robots; Mobile Platforms, Autonomous and Computing Traffic Systems; Cloud and Other Computing Systems; and Marine Sensors and Manipulators.
This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use those techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but in a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, gaining a deeper, intrinsic grasp of the models behind heat transfer. Developed from over twenty-five years of lecture notes used to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering disciplines seeking a solid understanding of heat transfer.
This book also:
* Adopts a novel inductive pedagogy where commonly understood examples are introduced early and theory is developed to explain and predict readily recognized phenomena
* Introduces new techniques as needed to address specific problems, in contrast to traditional texts' deductive approach, in which abstract general principles lead to specific examples
* Elucidates the idea that "heat transfer takes time": transient analysis applications are introduced first, and steady-state methods are shown to be a limiting case of those applications
* Focuses on basic numerical methods rather than analytical methods of solving partial differential equations, which are largely obsolete in light of modern computer power
* Maximizes readers' insight into heat transfer modeling by framing theory as an engineering design tool, not as a pure science, as has been done in traditional textbooks
* Integrates practical use of spreadsheets for calculations and provides many tips for their use throughout the text examples
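The "1-node lumped model" approach mentioned above can be sketched briefly: treating the whole body as a single node at uniform temperature yields a closed-form exponential cooling curve. All numeric values below are illustrative assumptions, not data from the book:

```python
import math

# One-node lumped-capacitance cooling model:
#   T(t) = T_inf + (T0 - T_inf) * exp(-t / tau),  with tau = m*c / (h*A)

def lumped_temperature(t, T0, T_inf, m, c, h, A):
    tau = m * c / (h * A)                      # thermal time constant [s]
    return T_inf + (T0 - T_inf) * math.exp(-t / tau)

# A small steel part cooling in air (assumed properties).
T0, T_inf = 400.0, 300.0   # initial and ambient temperature [K]
m, c = 0.1, 500.0          # mass [kg], specific heat [J/(kg K)]
h, A = 20.0, 0.01          # convection coefficient [W/(m^2 K)], area [m^2]

tau = m * c / (h * A)
print(f"time constant: {tau:.0f} s")
print(f"T after one time constant: {lumped_temperature(tau, T0, T_inf, m, c, h, A):.1f} K")
```

After one time constant the excess temperature has decayed by a factor of e; numerical methods are only needed when such a closed form fails (for example, when properties vary with temperature).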
This book explains and examines the theoretical underpinnings of the Complex Variable Boundary Element Method (CVBEM) as applied to higher dimensions, providing the reader with the tools for extending and using the CVBEM in various applications. The relevant mathematics and principles are assembled, and the reader is guided through the key topics necessary for an understanding of the development of the CVBEM in the usual two dimensions as well as in three or higher dimensions. In addition, problems are provided that build upon the material presented. The CVBEM is an approximation method useful for solving problems involving the Laplace equation in two dimensions. It has been shown to be a useful modelling technique for solving two-dimensional problems involving the Laplace or Poisson equations on arbitrary domains. The CVBEM has recently been extended to three or higher spatial dimensions, which makes the precision of the CVBEM in solving the Laplace equation available in multiple dimensions. The mathematical underpinnings of the CVBEM, as well as its extension to higher dimensions, involve several areas of applied and pure mathematics, including Banach spaces and Hilbert spaces, among other topics. This book is intended for applied mathematics graduate students, engineering students or practitioners, developers of industrial applications involving the Laplace or Poisson equations, and developers of computer modelling applications.
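The CVBEM rests on the fact that the real and imaginary parts of an analytic function are harmonic, i.e. satisfy the Laplace equation. A quick finite-difference check of this property (my own illustrative sketch, not code from the book):

```python
# Verify numerically that u(x, y) = Re(z^2) = x^2 - y^2 satisfies the
# Laplace equation u_xx + u_yy = 0 at an arbitrary interior point.

def u(x, y):
    return (complex(x, y) ** 2).real   # real part of the analytic f(z) = z^2

h = 1e-3          # finite-difference step
x, y = 0.3, 0.7   # arbitrary interior point

# Five-point stencil approximation of the Laplacian.
lap = (u(x + h, y) + u(x - h, y) + u(x, y + h) + u(x, y - h) - 4 * u(x, y)) / h**2
print(abs(lap))   # ≈ 0 (round-off only): u is harmonic
```

Because every analytic function supplies such a harmonic pair, linear combinations of analytic basis functions can approximate solutions of the Laplace equation, which is the essence of the method.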
3D Imaging, Analysis and Applications brings together core topics, both in terms of well-established fundamental techniques and the most promising recent techniques in the exciting field of 3D imaging and analysis. Many similar techniques are being used in a variety of subject areas and applications and the authors attempt to unify a range of related ideas. With contributions from high profile researchers and practitioners, the material presented is informative and authoritative and represents mainstream work and opinions within the community. Composed of three sections, the first examines 3D imaging and shape representation, the second, 3D shape analysis and processing, and the last section covers 3D imaging applications. Although 3D Imaging, Analysis and Applications is primarily a graduate text, aimed at masters-level and doctoral-level research students, much material is accessible to final-year undergraduate students. It will also serve as a reference text for professional academics, people working in commercial research and development labs and industrial practitioners.
Christoph Clauser and Jörn Bartels. SHEMAT (Simulator for HEat and MAss Transport) is an easy-to-use, general-purpose reactive transport simulation code for a wide variety of thermal and hydrogeological problems in two and three dimensions. Specifically, SHEMAT solves coupled problems involving fluid flow, heat transfer, species transport, and chemical water-rock interaction in fluid-saturated porous media. It can handle a wide range of time scales. Therefore, it is useful to address both technical and geological processes. In particular, it offers special and attractive features for modeling steady-state and transient processes in hydro-geothermal reservoirs. This makes it well suited to predict the long-term behavior of heat mining installations in hot aquifers with highly saline brines. SHEMAT in its present form evolved from a fully coupled flow and heat transport model (Clauser 1988) which had been developed from the isothermal USGS 3-D groundwater model of Trescott and Larson (Trescott 1975; Trescott and Larson 1977). Transport of dissolved species, geochemical reactions between the solid and fluid phases, extended coupling between the individual processes (most notably between porosity and permeability), and a convenient user interface (developed from Processing Modflow (Chiang and Kinzelbach 2001)) were added during several research projects funded by the German Science Foundation (DFG) under grant CL 12117 and the German Federal Ministries for Education, Science, Research, and Technology (BMBF) under grant 032 69 95A-D and for Economics and Technology (BMWi) under grant 0327095 (Bartels et al. 2002, Kuhn et al. 2002a).
In order to satisfy the needs of their customers, network utilities require specially developed maintenance management capabilities. Maintenance management information systems are essential to ensure control, gain knowledge, and improve decision making in companies dealing with network infrastructure, such as the distribution of gas, water, electricity, and telecommunications. Maintenance Management in Network Utilities studies the specific characteristics of maintenance management in this sector to offer a practical approach to defining and implementing best management practices and suitable frameworks.
A Focus on SLM and SLS Methods in 3D Printing is an indispensable collection of articles for anyone involved in additive manufacturing - from academics and researchers through to engineers and managers within the manufacturing industry. The collection features examples of innovative research involving selective laser melting and selective laser sintering techniques applied across a range of contexts.
Variational Regularization of 3D Data provides an introduction to variational methods for data modelling and their application in computer vision. In this book, the authors identify interpolation as an inverse problem that can be solved by Tikhonov regularization. The proposed solutions are generalizations of one-dimensional splines, applicable to n-dimensional data; the central idea is that these splines can be obtained by regularization theory using a trade-off between fidelity to the data and smoothness properties. As a foundation, the authors present a comprehensive guide to the necessary fundamentals of functional analysis and variational calculus, as well as splines. The implementation and numerical experiments are illustrated using MATLAB®. The book also includes the necessary theoretical background for approximation methods and some details of the computer implementation of the algorithms. A working knowledge of multivariable calculus and basic vector and matrix methods should serve as an adequate prerequisite.
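The fidelity-versus-smoothness trade-off at the heart of Tikhonov regularization can be sketched in one dimension. The operator, weight, and test signal below are my own illustrative assumptions, not the book's MATLAB code:

```python
import numpy as np

# Smooth noisy samples d by minimizing ||u - d||^2 + lam * ||D2 u||^2,
# whose normal equations are (I + lam * D2^T D2) u = d.
n = 100
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, n)
clean = np.sin(2 * np.pi * x)
d = clean + 0.3 * rng.standard_normal(n)       # noisy data (assumed)

# Second-difference operator: penalizes discrete curvature (spline-like).
D2 = np.zeros((n - 2, n))
for i in range(n - 2):
    D2[i, i:i + 3] = [1.0, -2.0, 1.0]

lam = 50.0                                      # smoothness weight (assumed)
u = np.linalg.solve(np.eye(n) + lam * D2.T @ D2, d)

err_d = np.linalg.norm(d - clean)
err_u = np.linalg.norm(u - clean)
print(f"error of raw data: {err_d:.2f}, error of regularized fit: {err_u:.2f}")
```

Increasing lam drives the solution toward zero curvature (a straight line), while lam → 0 reproduces the data exactly; the n-dimensional splines in the book generalize this same trade-off.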
An overview of biomechanical modeling of human soft tissue using nonlinear theoretical mechanics and incremental finite element methods, useful for computer simulation of the human musculoskeletal system.
This book looks at the convergent nature of technology and its relationship to the field of photogrammetry and 3D design. This is a facet of a broader discussion of the nature of technology itself and the relationship of technology to art, as well as an examination of the educational process. In the field of technology-influenced design-based education it is natural to push for advanced technology, yet within a larger institution the constraints of budget and adherence to tradition must be accepted. These opposing forces create a natural balance; in some cases constraints lead to greater creativity than freedom ever can - but in other cases the opposite is true. This work offers insights into ways to integrate new technologies into the field of design, and from a broader standpoint it also looks ahead, raising further questions and looking to the near future as to what additional technologies might cause further disruptions to 3D design as well as wonderful creative opportunities.
In this self-contained monograph, the author gathers and describes different mathematical techniques and combines them to form practical procedures for inverse analyses. It brings together topics from mathematical programming, soft computing, and Proper Orthogonal Decomposition in order to show, in the context of structural analyses, how things work and what the main problems are that one needs to tackle. Throughout the book, a number of examples and exercises are worked out to give the reader practical familiarity with the topics discussed.
This book provides an overview of recent developments and applications of the Land Use Scanner model, which has been used in spatial planning for well over a decade. Internationally recognized as among the best of its kind, this versatile model can be applied at a national level for trend extrapolation, scenario studies and optimization, yet can also be employed in a smaller-scale regional context, as demonstrated by the assortment of regional case studies included in the book. Alongside these practical examples from the Netherlands, readers will find discussion of more theoretical aspects of land-use models as well as an assessment of various studies that aim to develop the Land-Use Scanner model further. Spanning the divide between the abstractions of land-use modelling and the imperatives of policy making, this is a cutting-edge account of the way in which the Land-Use Scanner approach is able to interrogate a spectrum of issues that range from climate change to transportation efficiency. Aimed at planners, researchers and policy makers who need to stay abreast of the latest advances in land-use modelling techniques in the context of planning practice, the book guides the reader through the applications supported by current instrumentation. It affords the opportunity for a wide readership to benefit from the extensive and acknowledged expertise of Dutch planners, who have originated a host of much-used models.
This two volume set (CCIS 398 and 399) constitutes the refereed proceedings of the International Symposium on Geo-Informatics in Resource Management and Sustainable Ecosystem, GRMSE 2013, held in Wuhan, China, in November 2013. The 136 papers presented, in addition to 4 keynote speeches and 5 invited sessions, were carefully reviewed and selected from 522 submissions. The papers are divided into 5 sessions: smart city in resource management and sustainable ecosystem, spatial data acquisition through RS and GIS in resource management and sustainable ecosystem, ecological and environmental data processing and management, advanced geospatial model and analysis for understanding ecological and environmental process, applications of geo-informatics in resource management and sustainable ecosystem.
The focus of most Virtual Reality (VR) systems lies mainly on the visual immersion of the user. But an emphasis on visual perception alone is insufficient for some applications, as it limits the user's interactions within the VR. This textbook therefore presents the principles and theoretical background needed to develop a VR system that links physical simulations and haptic rendering, which requires update rates of 1 kHz for the force feedback. Special attention is given to the modeling and computation of contact forces in a two-finger grasp of textiles. Addressing further the perception of small-scale surface properties such as roughness, novel algorithms are presented that are not only able to consider the highly dynamic behaviour of textiles but are also capable of computing the small forces needed for tactile rendering at the contact point. A final analysis of the entire VR system shows the problems encountered and the solutions found in this work.
This book constitutes the refereed proceedings of the 14th International Conference on Systems Simulation, Asia Simulation 2014, held in Kitakyushu, Japan, in October 2014. The 32 revised full papers presented were carefully reviewed and selected from 69 submissions. The papers are organized in topical sections on modeling and simulation technology; network simulation; high performance computing and cloud simulation; numerical simulation and visualization; simulation of instrumentation and control application; simulation technology in diversified higher education; general purpose simulation.
This book constitutes the refereed proceedings of the 17th International Conference on Model Driven Engineering Languages and Systems, MODELS 2014, held in Valencia, Spain, in September/October 2014. The 41 full papers presented in this volume were carefully reviewed and selected from a total of 126 submissions. The scope of the conference series is broad, encompassing modeling languages, methods, tools, and applications considered from theoretical and practical angles and in academic and industrial settings. The papers report on the use of modeling in a wide range of areas, including cloud, mobile, and web computing; model transformation; behavioral modeling; MDE: past, present, and future; formal semantics, specification, and verification; models at runtime; feature and variability modeling; composition and adaptation; practices and experience; modeling for analysis; pragmatics; and model extraction, manipulation and persistence, querying, and reasoning.
Daylight is a dynamic source of illumination in architectural space, creating diverse and ephemeral configurations of light and shadow within the built environment. Perceptual qualities of daylight, such as contrast and temporal variability, are essential to our understanding of both material and visual effects in architecture. Although spatial contrast and light variability are fundamental to the visual experience of architecture, architects still rely primarily on intuition to evaluate their designs because there are few metrics that address these factors. Through an analysis of contemporary architecture, this work develops a new typological language that categorizes architectural space in terms of contrast and temporal variation. This research proposes a new family of metrics that quantify the magnitude of contrast-based visual effects and time-based variation within daylit space through the use of time-segmented daylight renderings to provide a more holistic analysis of daylight performance.
The physics and dynamics of the atmosphere and atmosphere-ocean interactions provide the foundation of modern climate models, upon which our understanding of the chemistry and biology of ocean and land surface processes are built. Originally published in 2006, Frontiers of Climate Modeling captures developments in modeling the atmosphere, and their implications for our understanding of climate change, whether due to natural or anthropogenic causes. Emphasis is on elucidating how greenhouse gases and aerosols are altering the radiative forcing of the climate system and the sensitivity of the system to such perturbations. An expert team of authors address key aspects of the atmospheric greenhouse effect, clouds, aerosols, atmospheric radiative transfer, deep convection dynamics, large scale ocean dynamics, stratosphere-troposphere interactions, and coupled ocean-atmosphere model development. The book is an important reference for researchers and advanced students interested in the forces driving the climate system and how they are modeled by climate scientists.
This book contains the refereed proceedings of the International Conference on Modeling and Simulation in Engineering, Economics, and Management, MS 2013, held in Castellon de la Plana, Spain, in June 2013. The event was co-organized by the AMSE Association and the SoGReS Research Group of the Jaume I University. This edition of the conference paid special attention to modeling and simulation in diverse fields of business management. The 28 full papers in this book were carefully reviewed and selected from 65 submissions. They are organized in topical sections on: modeling and simulation in CSR and sustainable development; modeling and simulation in finance and accounting; modeling and simulation in management and marketing; modeling and simulation in economics and politics; knowledge-based expert and decision support systems; and modeling and simulation in engineering.
We make complex decisions every day, requiring trust in many different entities for different reasons. These decisions are not made by combining many isolated trust evaluations. Many interlocking factors play a role, each dynamically impacting the others. In this brief, "trust context" is defined as the system level description of how the trust evaluation process unfolds. Networks today are part of almost all human activity, supporting and shaping it. Applications increasingly incorporate new interdependencies and new trust contexts. Social networks connect people and organizations throughout the globe in cooperative and competitive activities. Information is created and consumed at a global scale. Systems, devices, and sensors create and process data, manage physical systems, and participate in interactions with other entities, people and systems alike. To study trust in such applications, we need a multi-disciplinary approach. This book reviews the components of the trust context through a broad review of recent literature in many different fields of study. Common threads relevant to the trust context across many application domains are also illustrated.
The purpose of this book is to present a methodology for designing and tuning fuzzy expert systems in order to identify nonlinear objects; that is, to build input-output models using expert and experimental information. The results of these identifications are used for direct and inverse fuzzy evidence in forecasting and diagnosis problem solving. The book is organised as follows: Chapter 1 presents the basic knowledge about fuzzy sets, genetic algorithms and neural nets necessary for a clear understanding of the rest of this book. Chapter 2 analyzes direct fuzzy inference based on fuzzy if-then rules. Chapter 3 is devoted to the tuning of fuzzy rules for direct inference using genetic algorithms and neural nets. Chapter 4 presents models and algorithms for extracting fuzzy rules from experimental data. Chapter 5 describes a method for solving fuzzy logic equations necessary for the inverse fuzzy inference in diagnostic systems. Chapters 6 and 7 are devoted to inverse fuzzy inference based on fuzzy relations and fuzzy rules. Chapter 8 presents a method for extracting fuzzy relations from data. All the algorithms presented in Chapters 2-8 are validated by computer experiments and illustrated by solving medical and technical forecasting and diagnosis problems. Finally, Chapter 9 includes applications of the proposed methodology in dynamic and inventory control systems, prediction of results of football games, decision making in road accident investigations, project management and reliability analysis.
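As a toy illustration of direct fuzzy inference from if-then rules (Chapter 2's topic), here is a minimal sketch; the rule base, membership functions, and output values are my own assumptions, not from the book:

```python
# Direct fuzzy inference: each rule fires to the degree its antecedent
# membership function is satisfied, and the fired rules are combined by
# weighted averaging (Sugeno-style defuzzification).

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Rule base: IF temperature IS cold THEN heater = 80; IF hot THEN heater = 10.
rules = [
    (lambda t: tri(t, -10.0, 0.0, 15.0), 80.0),   # "cold" -> high heater power
    (lambda t: tri(t, 10.0, 25.0, 40.0), 10.0),   # "hot"  -> low heater power
]

def infer(t):
    """Weighted average of rule outputs, weighted by firing strengths."""
    weights = [(mu(t), out) for mu, out in rules]
    den = sum(w for w, _ in weights)
    return sum(w * out for w, out in weights) / den if den else None

print(round(infer(5.0), 6))    # → 80.0 (only the "cold" rule fires)
print(round(infer(12.0), 6))   # → 52.0 (both rules fire partially)
```

Tuning such a system, as the book describes, amounts to adjusting the membership-function parameters (a, b, c) and rule outputs against expert and experimental data.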
The collected articles in this series are dedicated to the development and use of software for earth system modelling and aim at bridging the gap between IT solutions and climate science. The particular topics covered in this volume address the usefulness of coupling infrastructures and data management; strategies and tools for pre- and post-processing; and coupling software and strategies in regional and global coupled climate models. This first part in the series of six books sets the scene for the following volumes.
Commanding and controlling organizations in extreme situations is a challenging task in military, intelligence, and disaster management. Such command and control must be quick, effective, and considerate when dealing with the changing, complex, and risky conditions of the situation. To enable optimal command and control under extremes, robust structures and efficient operations are required of organizations. This work discusses how to design and conduct virtual experiments on resilient organizational structures and operational practices using modeling and simulation. The work illustrates key aspects of robustly networked organizations and modeled performance of human decision-makers through examples of naval-air defense, counterterrorism operations, and disaster responses.