Welcome to Loot.co.za!
This book focuses on two issues related to human figures: real-time dynamics computation and interactive motion generation. Despite the growing interest in human figures as both physical robots and virtual characters, standard algorithms and tools for their kinematics and dynamics computation have received little attention. "Simulating and Generating Motions of Human Figures" presents original algorithms to simulate, analyze, generate and control motions of human figures, all focusing on real-time and interactive computation. The book provides both practical methods for contact/collision simulation, essential for the simulation of humanoid robots and virtual characters, and a general framework for online, interactive motion generation of human figures based on the dynamics simulation algorithms.
This volume on financial and economic simulations in Swarm marks the continued progress by a group of researchers to incorporate agent-based computer models as an important tool within their discipline. Swarm promotes agent-based computer models as a tool for the study of complex systems. A common language is leading to the growth of user communities in specific areas of application. Furthermore, by providing an organizing framework to guide the development of more problem-specific structures, and by dealing with a whole range of issues that affect their fundamental correctness and their ability to be developed and reused, Swarm has sought to make the use of agent-based models a legitimate tool of scientific investigation that also meets the practical needs of investigators within a community. Swarm's principal foundation is an object-oriented representation of active agents interacting among themselves and with their environment. To this base layer it adds its own structures to drive, record and portray the events that occur across this world. The specific contents of any world, however, are up to the experimenter to provide, either by building them from scratch or by tapping previous contributions. This book is notable in assembling a rich array of such contributions, which are significant in their own right, but which can also be mined to extract the reusable elements in their respective areas of finance and economics. It also presents three interesting software additions with tutorials in the form of simple financial and economic applications: a Swarm meta-language closer to a natural language, the use of internet-augmented Swarm for experimental economics, and a Swarm visual builder that will meet the challenges launched by other agent-based modelling competitors. The Swarm community at large can benefit greatly from the lead that the growing field of computational economics is taking to address its own needs, as represented by this book.
This book extends the conventional two-dimensional (2D) magnet arrangement into a three-dimensional (3D) pattern for permanent magnet linear machines for the first time, and proposes a novel dual Halbach array. This array can not only effectively increase the radial component of magnetic flux density and the output force of tubular linear machines, but also significantly reduce the axial flux density and radial force, and thus system vibration and noise. The book is also the first to address the fundamentals and provide a summary of conventional arrays, as well as novel concepts for PM pole design in electric linear machines. It covers theoretical study, numerical simulation, design optimization and experimental work systematically. The design concept and analytical approaches can be applied to other linear and rotary machines with similar structures. The book will be of interest to academics, researchers, R&D engineers and graduate students in electronic engineering and mechanical engineering who wish to learn the core principles, methods, and applications of linear and rotary machines.
This work presents lines of investigation and scientific achievements of the Ukrainian school of optimization theory and adjacent disciplines. These include the development of approaches to mathematical theories, methodologies, methods, and application systems for the solution of applied problems in economics, finance, energy saving, agriculture, biology, genetics, environmental protection, hardware and software engineering, information protection, decision making, pattern recognition, self-adapting control of complicated objects, personnel training, etc. The methods developed include sequential analysis of variants, nondifferentiable optimization, stochastic optimization, discrete optimization, mathematical modeling, econometric modeling, solution of extremum problems on graphs, construction of discrete images and combinatorial recognition, etc. Some of these methods became well known in the world's mathematical community and are now regarded as classic.
Simulation Methods for Reliability and Availability of Complex Systems discusses the use of computer simulation-based techniques and algorithms to determine reliability and availability (R and A) levels in complex systems. The book: shares theoretical and applied models and decision support systems that make use of simulation to estimate and improve system R and A levels; forecasts emerging technologies and trends in the use of computer simulation for R and A; and proposes hybrid approaches to the development of efficient methodologies designed to solve R and A-related problems in real-life systems. Dealing with practical issues, Simulation Methods for Reliability and Availability of Complex Systems is designed to support managers and system engineers in the improvement of R and A, and it provides a thorough exploration of the available techniques and algorithms for researchers and for advanced undergraduate and postgraduate students.
This major reference work represents the first attempt to confront, on a world-wide basis, the way computer associations face up to their own responsibilities in an age increasingly dominated by information and communication technology. The book deals with codes of ethics and conduct and related issues. It is the first book to deal with homogeneous codes, namely the codes of national computer societies. Some thirty codes are compared and analysed in depth. To put these into perspective, there are discussion papers covering the methodological, philosophical and organisational issues.
The planned construction of traffic routes through the European Alps represents a challenge for science and technology. In the past decades, Austria has gained a leading position in the field of tunnelling. This has been verified by many successful projects all over the world, which have been realised with the well-known "New Austrian Tunnelling Method". However, further development and economic success of modern tunnelling methods, which are still partly based on empirical assumptions, can only be assured if their scientific basis is improved. The book discusses the application of numerical simulation methods to assist tunnel engineers. Numerical simulation tools for the estimation of the required tunnel support and the required construction measures are described in this book. By using them, it is possible to study the impact on construction and environment during the planning stage and during construction. This will result in an improvement of the safety and economy of tunnels.
This work contains up-to-date coverage of the last 20 years' advances in Bayesian inference in econometrics, with an emphasis on dynamic models. It shows how to treat Bayesian inference in nonlinear models by integrating recent developments in simulation-based numerical integration techniques (such as Markov chain Monte Carlo methods) with the long-available analytical results of Bayesian inference for linear regression models. It thus covers a broad range of fairly recent models for economic time series, such as nonlinear models, autoregressive conditional heteroskedastic regressions, and cointegrated vector autoregressive models. It also contains an extensive chapter on unit-root inference from the Bayesian viewpoint. Several examples illustrate the methods. This book is intended for econometrics and statistics postgraduates, professors and researchers in economics departments, business schools, statistics departments, or any research centre in the same fields, especially econometricians.
Over the past thirty years, with improvements in optics, electronics, and computer technology, great strides have been made in the quantitative analysis of the visual system. A number of books on eye movement research have been written that deal with specific aspects of eye movement control. However, none of these books provides a comprehensive overview of multiple aspects of the visual system. Moreover, few of these books contain modeling and detailed quantitative analyses of the visual system. Further, since the major books are almost ten years old, there is a need for an update to include the most recent research findings. It is with these considerations in mind that we have carefully compiled this updated, comprehensive, and quantitative model-based edited book on various components of the visual system. Some of the best vision scientists in the world in their respective fields have contributed chapters to this book. They have expertise in a wide variety of fields, including bioengineering, basic and clinical visual science, medicine, neurophysiology, optometry, and psychology. Their combined efforts have resulted in a high-quality book that covers modeling and quantitative analysis of optical, neurosensory, oculomotor, perceptual and clinical systems. It includes only those techniques and models that have such fundamentally strong physiological, control-system, and perceptual bases that they will serve as foundations for models and analysis techniques in the future. The book is aimed first towards seniors and beginning graduate students in biomedical engineering, neurophysiology, optometry, and psychology, who will gain a broad understanding of quantitative analysis of the visual system. In addition, it has sufficient depth in each area to be useful as an updated reference and tutorial for graduate and post-doctoral students, as well as general vision scientists.
This book introduces modeling and simulation of linear time-invariant systems and demonstrates how these translate to systems engineering, mechatronics engineering, and biomedical engineering. It is organized into nine chapters that follow the lectures used for a one-semester course on this topic, making it appropriate for students as well as researchers. The author discusses state-space modeling derived from two modeling techniques, the analysis of such systems, and the use of modeling in control system design. The book also contains a unique chapter on multidisciplinary energy systems, with a special focus on bioengineering systems, and expands upon how the bond graph augments research in biomedical and bio-mechatronic systems.
This book is a collection of papers presented at the Forum "The Impact of Applications on Mathematics" in October 2013. It describes an appropriate framework in which to highlight how real-world problems, over the centuries and today, have influenced and are influencing the development of mathematics, and how mathematics is in turn reshaped in order to advance both the discipline and its applications. The contents of this book address the productive and successful interaction between industry and mathematicians, as well as the cross-fertilization and collaboration that result when mathematics is involved in the advancement of science and technology.
Aerodynamic design, like many other engineering applications, increasingly relies on computational power. The growing need for multi-disciplinarity and high fidelity in design optimization for industrial applications requires a huge number of repeated simulations in order to find an optimal design candidate. The main drawback is that each simulation can be computationally expensive - this becomes an even bigger issue when simulations are used within parametric studies, automated searches or optimization loops, which typically may require thousands of analysis evaluations. The core issue of a design-optimization problem is the search process involved. However, when facing complex problems, the high dimensionality of the design space and the high multi-modality of the target functions cannot be tackled with standard techniques. In recent years, global optimization using meta-models has been widely applied to design exploration in order to rapidly investigate the design space and find sub-optimal solutions. Indeed, surrogate and reduced-order models can provide a valuable alternative at a much lower computational cost. In this context, this volume offers advanced surrogate modeling applications and optimization techniques that require only reasonable computational resources. It also discusses basic theory concepts and their application to aerodynamic design cases. It is aimed at researchers and engineers who deal with complex aerodynamic design problems on a daily basis and employ expensive simulations to solve them.
In recent years, it has become apparent that knowing the average atomic structure of materials is insufficient to understand their properties. Diffuse scattering, in addition to Bragg scattering, holds the key to learning about defects in materials, the topic of many recent books. What has been missing is a detailed step-by-step guide on how to simulate disordered materials. The DISCUS cookbook fills this need, covering topics ranging from simple ones, such as building a computer crystal, to complex ones, such as domain structures, stacking faults, and the use of advanced refinement techniques to adjust the parameters of a disordered model. The book contains a CD-ROM with all files needed to recreate every example given using the program DISCUS. The reader is free to follow the principles behind simulating disordered materials or to get down into the details and run or modify the given examples.
This book provides a simple and unified approach to the mechanics of discontinuous-fibre reinforced composites, and introduces readers as generally as possible to the key concepts regarding the mechanics of elastic stress transfer, intermediate modes of stress transfer, plastic stress transfer, fibre pull-out, fibre fragmentation and matrix rupture. These concepts are subsequently applied to progressive stages of the loading process, through to composite fracture. The book offers a valuable guide for advanced undergraduate and graduate students attending lecture courses on fibre composites. It is also intended for beginning researchers who wish to develop deeper insights into how discontinuous fibres provide reinforcement to composites, and for engineers, particularly those who wish to apply the concepts presented here to design and develop discontinuous-fibre reinforced composites.
This book, which contains a collection of review articles as well as a focus on evidence-based policy making, will serve as a valuable resource not just for all postgraduate students conducting research using systems analysis thinking but also for policy makers. To our knowledge, a book of this nature which also has a strong African focus is currently not available. The book examines environmental and socio-economic risks with the aim of providing an analytical foundation for the management and governance of natural resources and disasters, addressing climate change, and easing the technological and ecological transitions to sustainability. It provides scientific and strategic analysis to better understand the dynamics of future energy transitions, their main driving forces, enabling factors and barriers, as well as their consequences for the social, economic and environmental dimensions of human wellbeing. Science-based policy advice is achieved through an integrated assessment and modeling of how to simultaneously address the major energy policy challenges in the areas of environment (climate change and air pollution), energy poverty (or access to affordable and clean energy for the poor), and energy security and reliability. It also aims to improve our understanding of ecosystems and their management in today's changing world - in particular, the current state of ecosystems, and their ecological thresholds and buffering capacities. It provides support for policy makers in developing rational, realistic and science-based regional, national and global strategies for the production of fuel, food and fibre that sustain ecosystem services and safeguard food security. Finally, it addresses the human development dimension of global change based on comprehensive studies of the changing size and composition of human populations around the world, analyzing both their impacts and the differential vulnerabilities by age, gender and level of education.
Computerized modeling is a powerful tool to describe the complex interrelations between measured data and the dynamics of sedimentary systems. Complex interaction of environmental factors with natural variations and increasing anthropogenic intervention is reflected in the sedimentary record at varying scales. The understanding of these processes opens the way to the reconstruction of the past and is a key to the prediction of future trends. Especially in cases where observations are limited and/or expensive, computer simulations may substitute for the lack of data. State-of-the-art research work requires a thorough knowledge of processes at the interfaces between atmosphere, hydrosphere, biosphere, and lithosphere, and is therefore an interdisciplinary approach.
This book is the result of a NATO-sponsored workshop entitled "Student Modelling: The Key to Individualized Knowledge-Based Instruction", which was held May 4-8, 1991 at Ste. Adele, Quebec, Canada. The workshop was co-directed by Gordon McCalla and Jim Greer of the ARIES Laboratory at the University of Saskatchewan. The workshop focused on the problem of student modelling in intelligent tutoring systems. An intelligent tutoring system (ITS) is a computer program that is aimed at providing knowledgeable, individualized instruction in a one-on-one interaction with a learner. In order to individualize this interaction, the ITS must keep track of many aspects of the learner: how much and what he or she has learned to date; what learning styles seem to be successful for the student and what seem to be less successful; what deeper mental models the student may have; motivational and affective dimensions impacting the learner; and so on. Student modelling is the problem of keeping track of all of these aspects of a learner's learning.
Fuzzy classifiers are important tools in exploratory data analysis, which is a vital set of methods used in various engineering, scientific and business applications. Fuzzy classifiers use fuzzy rules and do not require assumptions common to statistical classification. Rough set theory is useful when data sets are incomplete. It defines a formal approximation of crisp sets by providing the lower and the upper approximation of the original set. Systems based on rough sets have a natural ability to work on such data, and incomplete vectors do not have to be preprocessed before classification. To achieve better performance than existing machine learning systems, fuzzy classifiers and rough sets can be combined in ensembles. Such ensembles consist of a finite set of learning models, usually weak learners. The present book discusses the three aforementioned fields - fuzzy systems, rough sets and ensemble techniques. As the trained ensemble should represent a single hypothesis, a lot of attention is placed on the possibility of combining fuzzy rules from the fuzzy systems that are members of a classification ensemble. Furthermore, an emphasis is placed on ensembles that can work on incomplete data, thanks to rough set theory.
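The lower and upper approximations that rough set theory provides are straightforward to compute once objects are grouped into equivalence classes by their attribute values. The following is a minimal, illustrative sketch, not code from the book; the toy data and the function name `approximations` are invented for the example:

```python
# Rough-set lower and upper approximations of a target set X.
# Objects with identical attribute tuples are indiscernible and
# therefore form one equivalence class.
from collections import defaultdict

def approximations(objects, target):
    """objects: dict id -> attribute tuple; target: set of ids (X)."""
    classes = defaultdict(set)
    for oid, attrs in objects.items():
        classes[attrs].add(oid)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:      # class lies entirely inside X
            lower |= cls
        if cls & target:       # class overlaps X at all
            upper |= cls
    return lower, upper

# Toy data: objects 1 and 2 are indiscernible (same attributes).
objects = {1: ("red", "small"), 2: ("red", "small"),
           3: ("blue", "small"), 4: ("blue", "large")}
lower, upper = approximations(objects, {1, 3})
# lower == {3}, upper == {1, 2, 3}; the boundary {1, 2} holds the
# objects whose membership in X the attributes cannot decide.
```

The boundary region (upper minus lower) is exactly where incomplete or coarse data leaves classification undecided, which is the property the ensemble methods described above exploit.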
"Multi-finger Haptic Interaction" presents a panorama of technologies and methods for multi-finger haptic interaction, together with an analysis of the benefits and implications of adding multiple fingers to haptic applications. Research topics covered include: design and control of advanced haptic devices; multi-contact point simulation algorithms; and interaction techniques and implications in human perception when interacting with multiple fingers. These multi-disciplinary results are integrated into applications such as medical simulators for training manual skills, simulators for virtual prototyping, and precise manipulation in remote environments. "Multi-finger Haptic Interaction" presents the current and potential applications that can be developed with these systems, and details the systems' complexity. The research is focused on enhancing haptic interaction by providing multiple contact points to the user. This state-of-the-art volume is oriented towards researchers who are involved in haptic device design, rendering methods and perception studies, as well as readers from different disciplines who are interested in applying multi-finger haptic technologies and methods to their field of interest.
Implicit objects have gained increasing importance in geometric modeling, visualisation, animation, and computer graphics, because their geometric properties provide a good alternative to traditional parametric objects. This book presents the mathematics, computational methods and data structures, as well as the algorithms needed to render implicit curves and surfaces, and shows how implicit objects can easily describe smooth, intricate, and articulatable shapes, and hence why they are being increasingly used in graphical applications. Divided into two parts, the first introduces the mathematics of implicit curves and surfaces, as well as the data structures suited to store their sampled or discrete approximations, and the second deals with different computational methods for sampling implicit curves and surfaces, with particular reference to how these are applied to functions in 2D and 3D spaces.
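As a toy illustration of the sampling problem the second part of the book addresses, points on an implicit curve f(x, y) = 0 can be located by scanning a grid for sign changes of f and refining each crossing by bisection. This sketch is only a simple sign-scan under that idea, not any specific algorithm from the book; the function names are invented:

```python
# Sample points on the implicit curve f(x, y) = 0 (here: the unit
# circle) by detecting sign changes of f along horizontal grid lines
# and refining each crossing with bisection.

def f(x, y):
    return x * x + y * y - 1.0

def bisect(g, lo, hi, tol=1e-9):
    """Root of g on [lo, hi], assuming g(lo) and g(hi) differ in sign."""
    glo = g(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        gmid = g(mid)
        if (gmid < 0) == (glo < 0):
            lo, glo = mid, gmid   # root is in the upper half
        else:
            hi = mid              # root is in the lower half
    return 0.5 * (lo + hi)

def sample_curve(f, n=64, span=1.5):
    """Return (x, y) points near f(x, y) = 0 found on horizontal lines."""
    points = []
    step = 2.0 * span / n
    for j in range(n + 1):
        y = -span + j * step
        prev_x, prev_v = -span, f(-span, y)
        for i in range(1, n + 1):
            x = -span + i * step
            v = f(x, y)
            if (prev_v < 0) != (v < 0):   # sign change: curve crossed
                points.append((bisect(lambda t: f(t, y), prev_x, x), y))
            prev_x, prev_v = x, v
    return points

points = sample_curve(f)
# every sampled point lies (numerically) on the unit circle
```

A denser grid, or marching over both grid directions, recovers more of the curve; the same sign-change idea extends to surfaces f(x, y, z) = 0 sampled over a 3D lattice.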
This volume consists of selected papers presented at the 3rd St. Petersburg Workshop on Simulation, held in St. Petersburg, Russia, during June 28 - July 3, 1998. The Workshop is a regular international event devoted to mathematical problems of simulation and applied statistics, organized by the Department of Stochastic Simulation at St. Petersburg State University in cooperation with the INFORMS College on Simulation (USA). Its main purpose is to exchange ideas between researchers from Russia and from the West, as well as from other countries throughout the world. The 1st Workshop was held during May 24-28, 1994, and the 2nd during June 18-21, 1996. The selected proceedings of the 2nd Workshop were published as a special issue of the Journal of Statistical Planning and Inference. The Russian mathematical tradition has been formed by such geniuses as Tchebysheff, Markov and Kolmogorov, whose ideas have formed the basis for contemporary probabilistic models. However, for many decades Russian scholars were isolated from their colleagues in the West, and as a result their mathematical contributions have not been widely known. One of the primary reasons for these workshops is to bring the contributions of Russian scholars into the limelight, and we sincerely hope that this volume helps in this specific purpose.
Revitalize your architectural visualizations by bringing new levels of realism to your day and night interior scenes. This book features full-color, step-by-step tutorials to develop a firm understanding of the processes and techniques involved in creating jaw-dropping 3D visualizations for top marketing agencies. This second volume includes day and night lighting of an atrium scene using seasoned tools and techniques to deploy V-Ray 5 and 3ds Max 2020. It has never been quicker or easier to create the industry's top-of-the-range 3D marketing visuals. The book starts with an overview of the best techniques to approach clients via emails, calls, meetings, and social media. There are also key insights into the best practices of handling projects, pricing, contracts, invoices, pre-production, production, post-production, etc. The subsequent step takes users through the installation of V-Ray 5 and the process of accessing the V-Ray Material browser dialog. Throughout the book, users are taken through VRayMtl functions such as Diffuse, Roughness, Reflect, Glossiness, Metalness, Refract, IOR, Abbe number, Fog color, Translucency, BRDF, Coat, Sheen and Bump. Users will also learn how to use procedural maps such as VRayBitmap, VRayTriplanarTex, Bricks, Metals, Carpaint, VRayDisplacementMod, VRayUVWRandomizer, VRayMultiSubTex, VRayPointCloudColor, VRayDirt, VRayAerialPerspective, VRayLightMtl, VRayMtlWrapper, VRayOverrideMtl, VRay2SidedMtl, VRayBlendMtl and VRayEdgesTex. Users will have a rare insight into all functionalities of a V-Ray camera, VRayLight objects, Render settings, Frame buffer, Global switches, IPR options, Bucket and Progressive image samplers, Image filters, Global DMC, Color mapping, Brute force GI, Light cache, Color management, Distributed rendering, Render elements, V-Ray image file format, VFB History settings, VFB Lens Effects, LightMix, Film tonemap, Hue/Saturation, Lookup Table and much more.
Finally, users will embark on the amazing journey of utilizing the previous chapters to create eye-catching 3D marketing visuals through the meticulous process of pre-production, production and post-production of both day and night scenes/lighting. The tips and tricks section extensively covers key topics such as Verified Views for planning applications, parametric modeling with AdvArray, anima (R), the Project Manager plug-in, Verge3D, WebRotate 360, Accucities 3D city models and much more.
Growth in the pharmaceutical market has slowed down - almost to a standstill. One reason is that governments and other payers are cutting costs in a faltering world economy. But a more fundamental problem is the failure of major companies to discover, develop and market new drugs. Major drugs losing patent protection or being withdrawn from the market are simply not being replaced by new therapies - the pharmaceutical market model is no longer functioning effectively and most pharmaceutical companies are failing to produce the innovation needed for success. This multi-authored new book looks at a vital strategy which can bring innovation to a market in need of new ideas and new products: Systems Biology (SB). Modeling is a significant task of systems biology. SB aims to develop and use efficient algorithms, data structures, visualization and communication tools to orchestrate the integration of large quantities of biological data with the goal of computer modeling. It involves the use of computer simulations of biological systems, such as the networks of metabolites that comprise signal transduction pathways and gene regulatory networks, to both analyze and visualize the complex connections of these cellular processes. SB involves a series of operational protocols used for performing research, namely a cycle composed of theoretical, analytic or computational modeling to propose specific testable hypotheses about a biological system, experimental validation, and then using the newly acquired quantitative description of cells or cell processes to refine the computational model or theory.
This book examines the use of agent-based modelling (ABM) in population studies, from concepts to applications, best practices to future developments. It features papers written by leading experts in the field that will help readers to better understand the usefulness of ABM for population projections, how ABM can be injected with empirical data to achieve a better match between model and reality, how geographic information can be fruitfully used in ABM, and how ABM results can be reported effectively and correctly. Coverage ranges from detailing the relation between ABM and existing paradigms in population studies to infusing agent-based models with empirical data. The papers show the benefits that ABM offers the field, including enhanced theory formation by better linking the micro level with the macro level, the ability to represent populations more adequately as complex systems, and the possibility to study rare events and the implications of alternative mechanisms in artificial laboratories. In addition, readers will discover guidelines and best practices with detailed examples of how to apply agent-based models in different areas of population research, including human mating behaviour, migration, and socio-structural determinants of health behaviours. Earlier versions of the papers in this book have been presented at the workshop "Recent Developments and Future Directions in Agent-Based Modelling in Population Studies," which took place at the University of Leuven (KU Leuven), Belgium, in September 2014. The book will contribute to the development of best practices in the field and will provide a solid point of reference for scholars who want to start using agent-based modelling in their own research.
Computational molecular and materials modeling has emerged to deliver solid technological impacts in the chemical, pharmaceutical, and materials industries. It is not the all-predictive science fiction that discouraged early adopters in the 1980s. Rather, it is proving a valuable aid to designing and developing new products and processes. People create, not computers, and these tools give them qualitative relations and quantitative properties that they need to make creative decisions.