This work presents the lines of investigation and scientific achievements of the Ukrainian school of optimization theory and adjacent disciplines. These include the development of approaches to mathematical theories, methodologies, methods, and application systems for the solution of applied problems in economics, finance, energy saving, agriculture, biology, genetics, environmental protection, hardware and software engineering, information protection, decision making, pattern recognition, self-adapting control of complex objects, personnel training, and more. The methods developed include sequential analysis of variants, nondifferentiable optimization, stochastic optimization, discrete optimization, mathematical modeling, econometric modeling, solution of extremum problems on graphs, and construction of discrete images and combinatorial recognition. Some of these methods became well known in the world's mathematical community and are now regarded as classic.
Simulation Methods for Reliability and Availability of Complex Systems discusses the use of computer simulation-based techniques and algorithms to determine reliability and availability (R and A) levels in complex systems. The book shares theoretical and applied models and decision-support systems that make use of simulation to estimate and to improve system R and A levels, forecasts emerging technologies and trends in the use of computer simulation for R and A, and proposes hybrid approaches to the development of efficient methodologies designed to solve R and A-related problems in real-life systems. Dealing with practical issues, Simulation Methods for Reliability and Availability of Complex Systems is designed to support managers and system engineers in the improvement of R and A, and provides a thorough exploration of the techniques and algorithms available for researchers and for advanced undergraduate and postgraduate students.
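As a minimal, purely illustrative sketch of the kind of simulation-based R and A estimation the book describes (the component names, system layout, and up-probabilities below are hypothetical, not taken from the book):

```python
import random

def simulate_availability(p_up, structure, n_trials=100_000, seed=1):
    """Estimate system availability by Monte Carlo sampling.

    p_up: dict component -> probability the component is up at a random instant.
    structure: function mapping component states (True = up) to system state.
    """
    rng = random.Random(seed)
    up_count = 0
    for _ in range(n_trials):
        state = {c: rng.random() < p for c, p in p_up.items()}
        if structure(state):
            up_count += 1
    return up_count / n_trials

# Hypothetical example: pumps A and B in parallel, in series with valve C.
p_up = {"A": 0.90, "B": 0.90, "C": 0.99}
system = lambda s: (s["A"] or s["B"]) and s["C"]
est = simulate_availability(p_up, system)
# Analytic value for comparison: (1 - 0.1 * 0.1) * 0.99 = 0.9801
```

For this small system the exact availability is computable by hand, which makes it a convenient check; the simulation approach pays off when the structure function is too complex for closed-form analysis.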
This major reference work represents the first attempt to confront, on a world-wide basis, the way computer associations face up to their own responsibilities in an age increasingly dominated by information and communication technology. The book deals with codes of ethics and conduct and related issues. It is the first book to deal with homogeneous codes, namely the codes of national computer societies. Some thirty codes are compared and analysed in depth. To put these into perspective, there are discussion papers covering methodological, philosophical, and organisational issues.
The planned construction of traffic routes through the European Alps represents a challenge for science and technology. In the past decades, Austria has gained a leading position in the field of tunnelling. This has been verified by many successful projects all over the world, which have been realised with the well-known "New Austrian Tunnelling Method". However, further development and economic success of modern tunnelling methods, which are still partly based on empirical assumptions, can only be assured if their scientific basis is improved. The book discusses the application of numerical simulation methods to assist tunnel engineers. Numerical simulation tools for the estimation of the required tunnel support and the required construction measures are described in this book. By using them, it is possible to study the impact on construction and environment during the planning stage and during construction. This will result in an improvement of the safety and economy of tunnels.
This work contains up-to-date coverage of the last 20 years' advances in Bayesian inference in econometrics, with an emphasis on dynamic models. It shows how to treat Bayesian inference in nonlinear models by integrating the useful developments in simulation-based numerical integration techniques (such as Markov chain Monte Carlo methods) with the long-available analytical results of Bayesian inference for linear regression models. It thus covers a broad range of relatively recent models for economic time series, such as nonlinear models, autoregressive conditional heteroskedastic regressions, and cointegrated vector autoregressive models. It also contains an extensive chapter on unit-root inference from the Bayesian viewpoint. Several examples illustrate the methods. This book is intended for postgraduates, professors, and researchers in econometrics and statistics, in economics departments, business schools, statistics departments, or any research centre in the same fields, especially econometricians.
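A minimal sketch of the simulation-based numerical integration the blurb mentions: a random-walk Metropolis sampler applied to a toy normal-mean posterior (the data, step size, and burn-in choice are invented for illustration and are not from the book):

```python
import random, math

def metropolis(logpost, x0, n_samples=20_000, step=0.5, seed=0):
    """Random-walk Metropolis sampler for a scalar parameter."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)          # propose a nearby point
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept with MH probability
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy model: y_i ~ N(mu, 1) with a flat prior on mu, so the posterior
# is N(ybar, 1/n); here ybar = 1.0.
data = [1.2, 0.8, 1.1, 0.9, 1.0]
logpost = lambda mu: -0.5 * sum((y - mu) ** 2 for y in data)
draws = metropolis(logpost, x0=0.0)
post_mean = sum(draws[5000:]) / len(draws[5000:])  # discard burn-in
```

Because the toy posterior is known in closed form, the sampler's output can be checked against it; the same machinery carries over to nonlinear models where no closed form exists.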
Over the past thirty years, with improvements in optics, electronics, and computer technology, great strides have been made in the quantitative analysis of the visual system. A number of books on eye movement research have been written that deal with specific aspects of eye movement control. However, none of these books provides a comprehensive overview of multiple aspects of the visual system. Moreover, few of them contain modeling and detailed quantitative analyses of the visual system. Further, since the major books are almost ten years old, there is a need for an update to include the most recent research findings. It is with these considerations in mind that we have carefully compiled this updated, comprehensive, and quantitative model-based edited book on various components of the visual system. Some of the best vision scientists in the world in their respective fields have contributed chapters to this book. They have expertise in a wide variety of fields, including bioengineering, basic and clinical visual science, medicine, neurophysiology, optometry, and psychology. Their combined efforts have resulted in a high-quality book that covers modeling and quantitative analysis of optical, neurosensory, oculomotor, perceptual, and clinical systems. It includes only those techniques and models that have such fundamentally strong physiological, control-system, and perceptual bases that they will serve as foundations for models and analysis techniques in the future. The book is aimed first towards seniors and beginning graduate students in biomedical engineering, neurophysiology, optometry, and psychology, who will gain a broad understanding of quantitative analysis of the visual system. In addition, it has sufficient depth in each area to be useful as an updated reference and tutorial for graduate and post-doctoral students, as well as general vision scientists.
This book introduces modeling and simulation of linear time-invariant systems and demonstrates how these translate to systems engineering, mechatronics engineering, and biomedical engineering. It is organized into nine chapters that follow the lectures used for a one-semester course on this topic, making it appropriate for students as well as researchers. The author discusses state-space modeling derived from two modeling techniques, the analysis of such systems, and the use of modeling in control-systems design. The book also contains a unique chapter on multidisciplinary energy systems, with a special focus on bioengineering systems, and expands upon how the bond graph augments research in biomedical and bio-mechatronics systems.
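As an illustrative sketch of simulating a linear time-invariant state-space model (the mass-spring-damper parameters below are hypothetical, and forward Euler is just one simple integration choice):

```python
def simulate_lti(A, B, C, u, x0, dt):
    """Forward-Euler simulation of x' = A x + B u, output y = C x."""
    x = list(x0)
    n = len(x)
    ys = []
    for uk in u:
        ys.append(sum(C[j] * x[j] for j in range(n)))   # record output
        dx = [sum(A[i][j] * x[j] for j in range(n)) + B[i] * uk
              for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]       # Euler step
    return ys

# Mass-spring-damper: m x'' + c x' + k x = F, with states [position, velocity].
m, c, k = 1.0, 2.0, 4.0
A = [[0.0, 1.0], [-k / m, -c / m]]
B = [0.0, 1.0 / m]
C = [1.0, 0.0]                                          # observe position
y = simulate_lti(A, B, C, u=[1.0] * 5000, x0=[0.0, 0.0], dt=0.002)
# Step response: position settles toward F / k = 0.25
```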
This book is a collection of papers presented at the Forum "The Impact of Applications on Mathematics" in October 2013. It describes a framework in which to highlight how real-world problems, over the centuries and today, have influenced and are influencing the development of mathematics, and how mathematics is reshaped in response in order to advance both the discipline and its applications. The contents address productive and successful interaction between industry and mathematicians, as well as the cross-fertilization and collaboration that result when mathematics is involved with the advancement of science and technology.
Aerodynamic design, like many other engineering applications, is increasingly relying on computational power. The growing need for multi-disciplinarity and high fidelity in design optimization for industrial applications requires a huge number of repeated simulations in order to find an optimal design candidate. The main drawback is that each simulation can be computationally expensive - this becomes an even bigger issue when used within parametric studies, automated search or optimization loops, which typically may require thousands of analysis evaluations. The core issue of a design-optimization problem is the search process involved. However, when facing complex problems, the high-dimensionality of the design space and the high-multi-modality of the target functions cannot be tackled with standard techniques. In recent years, global optimization using meta-models has been widely applied to design exploration in order to rapidly investigate the design space and find sub-optimal solutions. Indeed, surrogate and reduced-order models can provide a valuable alternative at a much lower computational cost. In this context, this volume offers advanced surrogate modeling applications and optimization techniques featuring reasonable computational resources. It also discusses basic theory concepts and their application to aerodynamic design cases. It is aimed at researchers and engineers who deal with complex aerodynamic design problems on a daily basis and employ expensive simulations to solve them.
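A minimal illustration of the surrogate idea sketched above: replace an expensive simulation by a cheap fitted model and minimize that instead. Here the surrogate is simply a parabola fitted through three samples; the "expensive" function is a hypothetical stand-in for a costly solver:

```python
def expensive(x):
    """Stand-in for a costly simulation (hypothetical objective)."""
    return (x - 1.7) ** 2 + 0.5

def quadratic_surrogate_min(f, a, b, c):
    """Sample f at three points, fit the interpolating parabola,
    and return the parabola's minimizer (standard vertex formula)."""
    fa, fb, fc = f(a), f(b), f(c)
    num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
    den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
    return b - 0.5 * num / den

x_star = quadratic_surrogate_min(expensive, 0.0, 1.0, 3.0)
# Only three expensive evaluations were needed; for this quadratic
# target the surrogate minimizer is exact: x_star == 1.7
```

Real surrogate-based design loops use richer models (kriging, radial basis functions, reduced-order models) and refine them iteratively, but the cost structure is the same: a handful of expensive samples, then cheap optimization on the fitted model.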
The rapid evolution of computer science, communication, and information technology has enabled the application of control techniques to systems beyond the possibilities of control theory just a decade ago. Critical infrastructures such as electricity, water, traffic, and intermodal transport networks are now in the scope of control engineers. The sheer size of such large-scale systems requires the adoption of advanced distributed control approaches. Distributed model predictive control (MPC) is one of the promising control methodologies for the control of such systems. This book provides a state-of-the-art overview of distributed MPC approaches, while at the same time making clear which directions of research deserve more attention. The core and rationale of 35 approaches are carefully explained, and detailed step-by-step algorithmic descriptions of each approach are provided. These features make the book a comprehensive guide both for those seeking an introduction to distributed MPC and for those who want to gain deeper insight into the wide range of distributed MPC techniques available.
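As a toy sketch of the receding-horizon idea underlying MPC (centralized and brute-force, not any of the book's 35 distributed approaches; the scalar model x+ = x + u and the cost weights are invented for illustration):

```python
import itertools

def mpc_step(x, horizon=5, candidates=(-1.0, 0.0, 1.0)):
    """Receding-horizon control for the toy model x+ = x + u:
    enumerate candidate input sequences over the horizon, score each by a
    quadratic state/input cost, and apply only the first input of the best."""
    best_u, best_cost = 0.0, float("inf")
    for seq in itertools.product(candidates, repeat=horizon):
        xs, cost = x, 0.0
        for u in seq:
            xs = xs + u
            cost += xs * xs + 0.1 * u * u
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

x = 3.0
for _ in range(10):          # closed loop: re-plan at every step
    x += mpc_step(x)
# The controller drives the state to the origin
```

Distributed MPC replaces this single global optimization with coupled local problems solved by communicating subsystems, which is precisely the design space the book surveys.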
This book provides a simple and unified approach to the mechanics of discontinuous-fibre reinforced composites, and introduces readers as generally as possible to the key concepts regarding the mechanics of elastic stress transfer, intermediate modes of stress transfer, plastic stress transfer, fibre pull-out, fibre fragmentation and matrix rupture. These concepts are subsequently applied to progressive stages of the loading process, through to the composite fractures. The book offers a valuable guide for advanced undergraduate and graduate students attending lecture courses on fibre composites. It is also intended for beginning researchers who wish to develop deeper insights into how discontinuous fibre provides reinforcement to composites, and for engineers, particularly those who wish to apply the concepts presented here to design and develop discontinuous-fibre reinforced composites.
Computerized modeling is a powerful tool for describing the complex interrelations between measured data and the dynamics of sedimentary systems. The complex interaction of environmental factors with natural variations and increasing anthropogenic intervention is reflected in the sedimentary record at varying scales. Understanding these processes opens the way to reconstructing the past and is a key to predicting future trends. Especially in cases where observations are limited and/or expensive, computer simulations may substitute for the lack of data. State-of-the-art research requires a thorough knowledge of processes at the interfaces between atmosphere, hydrosphere, biosphere, and lithosphere, and therefore demands an interdisciplinary approach.
Fuzzy classifiers are important tools in exploratory data analysis, which is a vital set of methods used in various engineering, scientific, and business applications. Fuzzy classifiers use fuzzy rules and do not require the assumptions common to statistical classification. Rough set theory is useful when data sets are incomplete. It defines a formal approximation of crisp sets by providing the lower and the upper approximation of the original set. Systems based on rough sets have a natural ability to work on such data, and incomplete vectors do not have to be preprocessed before classification. To achieve better performance than existing machine learning systems, fuzzy classifiers and rough sets can be combined in ensembles. Such ensembles consist of a finite set of learning models, usually weak learners. The present book discusses the three aforementioned fields: fuzzy systems, rough sets, and ensemble techniques. As the trained ensemble should represent a single hypothesis, a lot of attention is placed on the possibility of combining fuzzy rules from the fuzzy systems that are members of the classification ensemble. Furthermore, an emphasis is placed on ensembles that can work on incomplete data, thanks to rough set theory.
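A minimal sketch of the rough-set lower and upper approximations mentioned above (the partition into indiscernibility classes and the target set are invented example data):

```python
def rough_approximations(partition, target):
    """Lower/upper approximations of `target` w.r.t. an equivalence partition.

    A block belongs to the lower approximation if it lies entirely inside the
    target, and to the upper approximation if it overlaps the target at all.
    """
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= target:       # certainly in the target
            lower |= block
        if block & target:        # possibly in the target
            upper |= block
    return lower, upper

# Objects 1..6 grouped into indiscernibility classes (hypothetical data).
partition = [{1, 2}, {3}, {4, 5}, {6}]
target = {1, 2, 3, 4}
low, up = rough_approximations(partition, target)
# low == {1, 2, 3}; up == {1, 2, 3, 4, 5}; boundary region = up - low = {4, 5}
```

The boundary region (objects 4 and 5 here) is exactly where the available attributes cannot decide membership, which is why rough-set classifiers can defer or soften decisions on such objects.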
This book is the result of a NATO-sponsored workshop entitled "Student Modelling: The Key to Individualized Knowledge-Based Instruction" which was held May 4-8, 1991 at Ste. Adele, Quebec, Canada. The workshop was co-directed by Gordon McCalla and Jim Greer of the ARIES Laboratory at the University of Saskatchewan. The workshop focused on the problem of student modelling in intelligent tutoring systems. An intelligent tutoring system (ITS) is a computer program that is aimed at providing knowledgeable, individualized instruction in a one-on-one interaction with a learner. In order to individualize this interaction, the ITS must keep track of many aspects of the learner: how much and what he or she has learned to date; what learning styles seem to be successful for the student and what seem to be less successful; what deeper mental models the student may have; motivational and affective dimensions impacting the learner; and so on. Student modelling is the problem of keeping track of all of these aspects of a learner's learning.
Implicit objects have gained increasing importance in geometric modeling, visualisation, animation, and computer graphics, because their geometric properties provide a good alternative to traditional parametric objects. This book presents the mathematics, computational methods and data structures, as well as the algorithms needed to render implicit curves and surfaces, and shows how implicit objects can easily describe smooth, intricate, and articulatable shapes, and hence why they are being increasingly used in graphical applications. Divided into two parts, the first introduces the mathematics of implicit curves and surfaces, as well as the data structures suited to store their sampled or discrete approximations, and the second deals with different computational methods for sampling implicit curves and surfaces, with particular reference to how these are applied to functions in 2D and 3D spaces.
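A minimal sketch of sampling an implicit curve, in the spirit of the book's second part: scan a grid and keep the cells where the defining function changes sign, so the curve f = 0 must pass through them (the grid resolution and the unit-circle example are illustrative choices, not the book's algorithms):

```python
def sample_implicit(f, xmin, xmax, ymin, ymax, n=100):
    """Return centres of grid cells crossed by the implicit curve f(x, y) = 0,
    detected by a sign change of f among the four cell corners."""
    dx = (xmax - xmin) / n
    dy = (ymax - ymin) / n
    cells = []
    for i in range(n):
        for j in range(n):
            x0, y0 = xmin + i * dx, ymin + j * dy
            corners = [f(x0, y0), f(x0 + dx, y0),
                       f(x0, y0 + dy), f(x0 + dx, y0 + dy)]
            if min(corners) < 0.0 < max(corners):   # curve crosses this cell
                cells.append((x0 + dx / 2, y0 + dy / 2))
    return cells

circle = lambda x, y: x * x + y * y - 1.0           # unit circle, implicitly
pts = sample_implicit(circle, -1.5, 1.5, -1.5, 1.5)
# Every reported cell centre lies close to the curve
```

Note that no parametrization of the circle was needed: the same routine works unchanged for any f(x, y), which is the flexibility the blurb attributes to implicit objects.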
Multi-finger Haptic Interaction presents a panorama of technologies and methods for multi-finger haptic interaction, together with an analysis of the benefits and implications of adding multiple fingers to haptic applications. Research topics covered include: design and control of advanced haptic devices; multi-contact point simulation algorithms; and interaction techniques and their implications for human perception when interacting with multiple fingers. These multi-disciplinary results are integrated into applications such as medical simulators for training manual skills, simulators for virtual prototyping, and precise manipulations in remote environments. The book presents the current and potential applications that can be developed with these systems, and details the systems' complexity. The research is focused on enhancing haptic interaction by providing multiple contact points to the user. This state-of-the-art volume is oriented towards researchers who are involved in haptic device design, rendering methods, and perception studies, as well as readers from different disciplines who are interested in applying multi-finger haptic technologies and methods to their field of interest.
This is a volume of selected papers presented at the 3rd St. Petersburg Workshop on Simulation, held in St. Petersburg, Russia, during June 28-July 3, 1998. The Workshop is a regular international event devoted to mathematical problems of simulation and applied statistics, organized by the Department of Stochastic Simulation at St. Petersburg State University in cooperation with the INFORMS College on Simulation (USA). Its main purpose is to exchange ideas between researchers from Russia and from the West, as well as from other countries throughout the world. The 1st Workshop was held during May 24-28, 1994, and the 2nd during June 18-21, 1996. The selected proceedings of the 2nd Workshop were published as a special issue of the Journal of Statistical Planning and Inference. The Russian mathematical tradition has been formed by such geniuses as Tchebysheff, Markov, and Kolmogorov, whose ideas have formed the basis for contemporary probabilistic models. However, for many decades Russian scholars were isolated from their colleagues in the West, and as a result their mathematical contributions have not been widely known. One of the primary reasons for these workshops is to bring the contributions of Russian scholars into the limelight, and we sincerely hope that this volume helps in this specific purpose.
Growth in the pharmaceutical market has slowed down - almost to a standstill. One reason is that governments and other payers are cutting costs in a faltering world economy. But a more fundamental problem is the failure of major companies to discover, develop and market new drugs. Major drugs losing patent protection or being withdrawn from the market are simply not being replaced by new therapies - the pharmaceutical market model is no longer functioning effectively and most pharmaceutical companies are failing to produce the innovation needed for success. This multi-authored new book looks at a vital strategy which can bring innovation to a market in need of new ideas and new products: Systems Biology (SB). Modeling is a significant task of systems biology. SB aims to develop and use efficient algorithms, data structures, visualization and communication tools to orchestrate the integration of large quantities of biological data with the goal of computer modeling. It involves the use of computer simulations of biological systems, such as the networks of metabolites that comprise signal transduction pathways and gene regulatory networks, to both analyze and visualize the complex connections of these cellular processes. SB involves a series of operational protocols used for performing research, namely a cycle composed of theoretical, analytic or computational modeling to propose specific testable hypotheses about a biological system, experimental validation, and then using the newly acquired quantitative description of cells or cell processes to refine the computational model or theory.
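As a minimal illustration of the computational-modeling step in the cycle described above, here is a toy ODE model of constitutive protein expression (the rate constants are invented; real systems-biology models couple many such equations):

```python
def simulate_expression(k_syn, k_deg, p0=0.0, dt=0.01, t_end=50.0):
    """Euler integration of dp/dt = k_syn - k_deg * p:
    constant protein synthesis balanced by first-order degradation."""
    p, t, traj = p0, 0.0, []
    while t < t_end:
        traj.append(p)
        p += dt * (k_syn - k_deg * p)
        t += dt
    return traj

traj = simulate_expression(k_syn=2.0, k_deg=0.5)
# Protein level relaxes to the steady state k_syn / k_deg = 4.0
```

The testable hypothesis here is the predicted steady-state level and relaxation time; comparing those predictions to measurements, then adjusting k_syn and k_deg, is a miniature version of the model-experiment cycle the blurb describes.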
Computational molecular and materials modeling has emerged to deliver solid technological impacts in the chemical, pharmaceutical, and materials industries. It is not the all-predictive science fiction that discouraged early adopters in the 1980s. Rather, it is proving a valuable aid to designing and developing new products and processes. People create, not computers, and these tools give them qualitative relations and quantitative properties that they need to make creative decisions.
Here is a comprehensive guide to the incorporation of computer simulation in all levels of the planning function of an organization. Writing for managers of planning, planners, and programmers, the author enables readers to gain an overall understanding of the potential role of simulation in planning, to apply simulation to their own particular needs, and to translate planning concepts into computer instructions. Nersesian demonstrates that for manager, planner, and programmer alike, simulation is neither difficult in concept nor complicated to put into practice. The author argues that simulation is a necessary activity in a planning environment characterized by uncertain futures and rapidly changing conditions. The book is organized into separate chapters, each of which acts as a case study of an aspect in the use of simulation. The synopsis that begins every chapter provides the manager of a planning operation with an appreciation of the general application of simulation to one facet of planning. The chapters themselves focus on particular situations which might befall a planner within the general application of simulation to the planning process. Special appendices--designed to aid programmers who have not had much previous experience in setting up simulation programs--follow each chapter and provide descriptive material and the applicable simulation program. As a comprehensive yet easily understood guide to the benefits of utilizing simulation in the planning process, this book will be an invaluable resource for planners, corporate executives, and programmers.
Computational Creativity, Concept Invention, and General Intelligence are all flourishing research disciplines in their own right, producing surprising and captivating results that continuously influence and change our view on where the limits of intelligent machines lie, each day pushing the boundaries a bit further. By 2014, all three fields had also left their marks on everyday life - machine-composed music has been performed in concert halls, automated theorem provers are accepted tools in enterprises' R&D departments, and cognitive architectures are being integrated in pilot assistance systems for next-generation airplanes. Still, although the corresponding aims and goals are clearly similar (as are the common methods and approaches), the developments in each of these areas have happened mostly individually within the respective community and without closer relationships to the goings-on in the other two disciplines. In order to overcome this gap and to provide a common platform for interaction and exchange between the different directions, the International Workshops on "Computational Creativity, Concept Invention, and General Intelligence" (C3GI) have been started. At ECAI-2012 and IJCAI-2013, the first and second editions of C3GI each gathered researchers from all three fields, presenting recent developments and results from their research and, in dialogue and joint debates, bridging the disciplinary boundaries. The chapters contained in this book are based on expanded versions of accepted contributions to the workshops and additional selected contributions by renowned researchers in the relevant fields. Individually, they give an account of the state-of-the-art in their respective area, discussing both theoretical approaches and implemented systems.
When taken together and looked at from an integrative perspective, the book in its totality offers a starting point for a (re)integration of Computational Creativity, Concept Invention, and General Intelligence, making visible common lines of work and theoretical underpinnings, and pointing at chances and opportunities arising from the interplay of the three fields.
Object-Oriented Computer Simulation of Discrete-Event Systems offers a comprehensive presentation of a wide repertoire of computer simulation techniques available to the modelers of dynamic systems. Unlike other books on simulation, this book includes a complete and balanced description of all essential issues relevant to computer simulation of discrete event systems, and it teaches simulation users how to design, program and exploit their own computer simulation models. In addition, it uses the object-oriented methodology throughout the book as its main programming platform. The reader is expected to have some background in the theory of probability and statistics and only a little programming experience in C++, as the book is not tied down to any particular simulation language. The book also provides 50 complete simulation problems to assist with writing such simulation programs. Object-Oriented Computer Simulation of Discrete-Event Systems demonstrates the basic and generic concepts used in computer simulation of discrete-event systems in a comprehensive, uniform and self-contained manner.
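A minimal sketch of a discrete-event queueing simulation of the kind the book programs (here a single-server FIFO queue driven by the Lindley recurrence; the arrival and service rates are example values, and this is not the book's object-oriented framework):

```python
import random

def mm1_wait_times(lam, mu, n_customers, seed=42):
    """Simulate an M/M/1 FIFO queue and return each customer's waiting time.

    Uses the Lindley recurrence: a customer starts service at the later of
    their arrival time and the moment the server frees up.
    """
    rng = random.Random(seed)
    t_arrive = 0.0
    server_free_at = 0.0
    waits = []
    for _ in range(n_customers):
        t_arrive += rng.expovariate(lam)          # next arrival
        start = max(t_arrive, server_free_at)     # wait if server is busy
        waits.append(start - t_arrive)
        server_free_at = start + rng.expovariate(mu)  # service completion
    return waits

waits = mm1_wait_times(lam=0.5, mu=1.0, n_customers=50_000)
avg_wait = sum(waits) / len(waits)
# Queueing theory gives the mean wait Wq = rho / (mu - lam) = 0.5 / 0.5 = 1.0,
# so the simulated average should be close to 1.0
```

Having an analytic benchmark like M/M/1 is a standard way to validate a simulation program before applying it to systems with no closed-form solution.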
Colloids are ubiquitous in the food, medical, cosmetics, polymer, water purification, and pharmaceutical industries. The thermal, mechanical, and storage properties of colloids are highly dependent on their interface morphology and their rheological behavior. Numerical methods provide a convenient and reliable tool for the study of colloids. "Accelerated Lattice Boltzmann Model for Colloidal Suspensions" introduces the main building blocks for an improved lattice Boltzmann based numerical tool designed for the study of colloidal rheology and interface morphology. The book also covers the migrating multi-block technique used to simulate single-component, multi-component, multiphase, and single-component multiphase flows, and its validation by experimental, numerical, and analytical solutions. Among other topics discussed is the hybrid lattice Boltzmann method (LBM) for surfactant-covered droplets and biological suspensions such as blood, used in conjunction with the suppression of coalescence for investigating the rheology of colloids and microvasculature blood flow. The presented LBM model provides a flexible numerical platform consisting of various modules that can be used separately or in combination for the study of a variety of colloids and biological flow deformation problems.
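As a toy illustration of the lattice Boltzmann idea, here is a minimal D1Q2 collide-and-stream scheme for 1-D diffusion (far simpler than the multiphase models in the book; the lattice size, relaxation time, and initial spike are arbitrary choices made for this sketch):

```python
def lbm_diffusion(rho0, tau=1.0, steps=200):
    """Minimal D1Q2 lattice Boltzmann scheme for 1-D diffusion.

    Two populations per site (right- and left-moving, weights 1/2 each)
    relax toward equilibrium w_i * rho, then stream to neighbouring sites.
    """
    n = len(rho0)
    f_plus = [0.5 * r for r in rho0]    # right-moving population
    f_minus = [0.5 * r for r in rho0]   # left-moving population
    for _ in range(steps):
        # Collision: relax each population toward its equilibrium share.
        rho = [fp + fm for fp, fm in zip(f_plus, f_minus)]
        f_plus = [fp + (0.5 * r - fp) / tau for fp, r in zip(f_plus, rho)]
        f_minus = [fm + (0.5 * r - fm) / tau for fm, r in zip(f_minus, rho)]
        # Streaming: shift populations one site (periodic boundaries).
        f_plus = [f_plus[(i - 1) % n] for i in range(n)]
        f_minus = [f_minus[(i + 1) % n] for i in range(n)]
    return [fp + fm for fp, fm in zip(f_plus, f_minus)]

rho0 = [0.0] * 64
rho0[32] = 1.0                          # initial concentration spike
rho = lbm_diffusion(rho0)
# Total mass is conserved exactly, and the spike spreads out diffusively
```

The collide-then-stream structure seen here is the same skeleton that full multiphase LBM codes elaborate with more velocities, forcing terms, and interface models.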
Master the art of computer animation and visual effects production with the latest edition of this cutting-edge guide This remarkable edition of "The Art of 3D Computer Animation and Effects" offers clear, step-by-step guidelines for the entire process of creating a fully rendered 3D computer animation. With up-to-date coverage of the latest computer animation styles and techniques, this versatile guide provides insightful information for creating animations and visual effects--from creative development and preproduction to finished animation. Designed to work with any computer platform, this "Fourth Edition" cuts through technical jargon and presents numerous easy-to-understand instructive diagrams. Full-color examples are presented--including VFX and animated feature movies, games, and TV commercials--by such leading companies as Blue Sky, Blur, BUF, Disney, DreamWorks, Electronic Arts, Framestore, ILM, Imagi, Microsoft, Mac Guff, The Mill, Menfond, Pixar, Polygon, Rhythm & Hues, Sony Imageworks, Tippett, Ubisoft, and Weta, and many other studios and groundbreaking independent artists from around the world. This fully revised edition features new material on the latest visual effects techniques, a useful update of the traditional principles of animation, practical information on creative development, multiple production pipeline ideas for shorts and visual effects, plus updated information on current production trends and techniques in animation, rendering, modeling, rigging, and compositing. 
Whether you are a student, an independent artist or creator, or a production company team member, "The Art of 3D Computer Animation and Effects, Fourth Edition" gives you a broad palette of tips and techniques for bringing your visions to life through 3D computer animation. Features include: a unique focus on creative development and production issues; non-platform-specific coverage, with multiple examples illustrated in a practical, step-by-step approach; the newest computer animation techniques, including facial animation, image-based and non-photorealistic rendering, model rigging, real-time models, and 2D/3D integration; over 700 full-color images; and an encyclopedic timeline and production pipeline.
In this self-contained monograph, the author gathers and describes different mathematical techniques, combining them into practical procedures for inverse analyses. It brings together topics from mathematical programming, soft computing, and Proper Orthogonal Decomposition in order to show, in the context of structural analyses, how these methods work and what main problems need to be tackled. Throughout the book, a number of examples and exercises are worked out to make the reader practically familiar with the topics discussed.
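A minimal sketch of an inverse analysis in the spirit of the monograph: identify a model parameter from measurements by least squares (the one-spring model, the true stiffness, and the synthetic noise-free data are invented for illustration):

```python
def identify_stiffness(forces, displacements):
    """Least-squares inverse identification of spring stiffness k in u = F / k.

    Fitting the compliance c = 1/k minimizes sum (u_i - c * F_i)^2,
    which is linear in c and so has the closed-form solution below.
    """
    num = sum(F * u for F, u in zip(forces, displacements))
    den = sum(F * F for F in forces)
    c = num / den
    return 1.0 / c

forces = [1.0, 2.0, 3.0, 4.0]
true_k = 50.0
displacements = [F / true_k for F in forces]   # synthetic, noise-free "measurements"
k_hat = identify_stiffness(forces, displacements)
# Recovers the true stiffness exactly for noise-free data
```

Structural inverse problems replace this scalar model with a finite-element forward model and many parameters, which is where the mathematical programming and model-reduction techniques the book assembles become necessary.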