Protein engineering endeavors to design new peptides and proteins or to change the structural and/or functional characteristics of existing ones for specific purposes, opening the way for the development of new drugs. This work develops in a comprehensive way the theoretical formulation for the methods used in computer-assisted modeling and predictions, starting from the basic concepts and proceeding to the more sophisticated methods, such as Monte Carlo and molecular dynamics. An evaluation of the approximations inherent to the simulations will allow the reader to obtain a perspective of the possible deficiencies and difficulties and approach the task with realistic expectations. Examples from the authors' laboratories, as well as from the literature, provide useful information.
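The Monte Carlo methods this blurb refers to can be illustrated by a minimal Metropolis sampler. This is an illustrative sketch, not code from the book; the toy harmonic "energy" function, temperature, and step size are all assumed for demonstration:

```python
import math
import random

def metropolis(energy, x0, steps, beta=1.0, step_size=0.5, seed=0):
    """Minimal Metropolis Monte Carlo sampler over a 1-D energy landscape."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    samples = []
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        e_new = energy(x_new)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        samples.append(x)
    return samples

# Toy harmonic potential: after burn-in, samples concentrate near the minimum at x = 0.
samples = metropolis(lambda x: x * x, x0=5.0, steps=20000)
mean = sum(samples[5000:]) / len(samples[5000:])
```

The same accept/reject scheme, applied to torsion angles or atomic coordinates with a molecular force field as the energy function, is the core of the conformational sampling the book describes.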
This volume contains thoroughly refereed full versions of the best
papers presented at the 5th European Workshop on Modelling
Autonomous Agents in a Multi-Agent World, MAAMAW '93, held in
Neuchâtel, Switzerland, in August 1993.
This book is devoted to a new branch of experimental design theory called simulation experimental design. There are many books devoted either to the theory of experimental design or to system simulation techniques, but in this book an approach that combines both fields is developed. In particular, the mathematical theory of universal variance reduction techniques such as splitting and Russian Roulette is explored. The book contains a number of results on regression design theory related to nonlinear problems, the E-optimum criterion and designs which minimize bias. Audience: This volume will be of value to readers interested in systems simulation, applied statistics and numerical methods with basic knowledge of applied statistics and linear algebra.
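The Russian Roulette technique mentioned here can be sketched with a toy particle-transmission estimate. This is an illustrative example under assumed parameters, not the book's own material: histories carry statistical weights, low-weight histories are killed with probability 1/2, and survivors have their weight doubled, which keeps the estimator unbiased while avoiding tracking unimportant histories forever:

```python
import random

def transmit_with_roulette(layers, p, n_hist, threshold=0.1, seed=1):
    """Estimate the transmission probability p**layers by carrying statistical
    weights and playing Russian Roulette on low-weight histories."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_hist):
        w = 1.0
        alive = True
        for _ in range(layers):
            w *= p                    # expected surviving fraction per layer
            if w < threshold:         # Russian Roulette: kill with prob 1/2 ...
                if rng.random() < 0.5:
                    alive = False
                    break
                w *= 2.0              # ... or double the weight of survivors
        if alive:
            total += w
    return total / n_hist

# Exact answer is 0.7**10 ≈ 0.0282; the weighted estimate should be close.
est = transmit_with_roulette(layers=10, p=0.7, n_hist=50000)
```

Splitting is the mirror-image technique: instead of killing cheap histories, important ones are duplicated (with their weight divided) so that more effort is spent where the variance lives.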
Experts from university and industry present new technologies for solving industrial problems and give many important and practical impulses for new research. Topics explored include NURBS, product engineering, object-oriented modelling, solid modelling, surface interrogation, feature modelling, variational design, scattered data algorithms, geometry processing, blending methods, smoothing and fairing algorithms, and spline conversion. This collection of 24 articles gives a state-of-the-art survey of the relevant problems and issues in geometric modelling.
This volume constitutes the proceedings of the 7th International
Conference on Advanced Information Systems Engineering, CAiSE '95,
held in Jyväskylä, Finland, in June 1995.
Macrosystems are systems in which the stochastic behaviour of the elements is transformed into deterministic behaviour of the system as a whole. This publication discusses equilibrium in these systems. Mathematical models of stationary states using the principle of maximum entropy are presented. This is developed and generalized for macrosystems with constrained resources. Parametric properties which characterize a model's response to data variations are discussed. The author has developed new computational methods for the computer-aided realization of stationary state models. Algorithms and computer experiments are evaluated. Mathematical modelling methods are applied to problems of hierarchical structures, interregional product exchange and image reconstruction.
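The maximum-entropy stationary states described above have a standard computational form: under a mean (resource) constraint, the entropy-maximizing distribution over a finite set of states is the Gibbs form p_i ∝ exp(-λ x_i), with the multiplier λ fixed by the constraint. The following sketch, with assumed state values and constraint, solves for λ by bisection; it illustrates the principle, not the author's specific algorithms:

```python
import math

def maxent_distribution(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution over finite states subject to a fixed mean.
    The solution is p_i proportional to exp(-lam * x_i); the multiplier lam is
    found by bisection so that the mean constraint holds."""
    def mean_for(lam):
        ws = [math.exp(-lam * v) for v in values]
        z = sum(ws)
        return sum(w * v for w, v in zip(ws, values)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # mean_for is decreasing in lam, so bisect accordingly.
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    ws = [math.exp(-lam * v) for v in values]
    z = sum(ws)
    return [w / z for w in ws]

# States 0..4 with the mean constrained to 1.0: probability tilts toward low states.
p = maxent_distribution([0, 1, 2, 3, 4], target_mean=1.0)
```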
The theory of evolution has been most successful explaining the emergence of new species in terms of their morphological traits. Ethologists teach that behaviors, too, qualify as first-class phenotypic features, but evolutionary accounts of behaviors have been much less satisfactory. In part this is because maturational "programs" transforming genotype to phenotype are "open" to environmental influences affected by behaviors. Further, many organisms are able to continue to modify their behavior, i.e., learn, even after fully mature. This creates an even more complex relationship between the genotypic features underlying the mechanisms of maturation and learning and the adapted behaviors ultimately selected. A meeting held at the Santa Fe Institute during the summer of 1993 brought together a small group of biologists, psychologists, and computer scientists with shared interests in questions such as these. This volume consists of papers that explore interacting adaptive systems from a range of interdisciplinary perspectives. About half of the articles are classic, seminal references on the subject, ranging from biologists like Lamarck and Waddington to psychologists like Piaget and Skinner. The other half represent new work by the workshop participants. The role played by mathematical and computational tools, both as models of natural phenomena and as algorithms useful in their own right, is particularly emphasized in these new papers. In all cases, the prefaces help to put the older papers in a modern context. For the new papers, the prefaces have been written by colleagues from a discipline other than the paper's authors, and highlight, for example, what a computer scientist can learn from a biologist's model, or vice versa. Through these cross-disciplinary "dialogues" and a glossary collecting multidisciplinary connotations of pivotal terms, the process of interdisciplinary investigation itself becomes a central theme.
Since the early 1980s, CAD frameworks have received a great deal of attention, both in the research community and in the commercial arena. It is generally agreed that CAD framework technology promises much: advanced CAD frameworks can turn collections of individual tools into effective and user-friendly design environments. But how can this promise be fulfilled? CAD Frameworks: Principles and Architecture describes the design and construction of CAD frameworks. It presents principles for building integrated design environments and shows how a CAD framework can be based on these principles. It derives the architecture of a CAD framework in a systematic way, using well-defined primitives for representation. This architecture defines how the many different framework sub-topics, ranging from concurrency control to design flow management, relate to each other and come together into an overall system. The origin of this work is the research and development performed in the context of the Nelsis CAD Framework, which has been a working system for well over eight years, gaining functionality while evolving from one release to the next. The principles and concepts presented in this book have been field-tested in the Nelsis CAD Framework. CAD Frameworks: Principles and Architecture is primarily intended for EDA professionals, both in industry and in academia, but is also valuable outside the domain of electronic design. Many of the principles and concepts presented are also applicable to other design-oriented application domains, such as mechanical design or computer-aided software engineering (CASE). It is thus a valuable reference for all those involved in computer-aided design.
This book develops, for the first time, a qualitative model for the
representation of spatial knowledge based only on locative
relations between the objects involved.
The conference, co-organized by INRIA and the Ecole des Mines de Paris, focuses on Discrete Event Systems (DES) and is aimed at engineers, scientists and mathematicians working in the fields of Automatic Control, Operations Research and Statistics who are interested in the modelling, analysis and optimization of DES. Various methods, such as automata theory and Petri nets, are proposed to describe and analyze such systems. The goals of the conference are the comparison of these different mathematical approaches and the global confrontation of theoretical approaches with applications in manufacturing, telecommunications, parallel computing, transportation, and other fields.
Performance evaluation, reliability, and performability are key factors in the development and improvement of computer systems and computer networks. This volume contains the 25 accepted and invited papers presented at the 7th International Conference on Modelling Techniques and Tools for Computer Performance Evaluation. The papers focus on new techniques and the extension of existing techniques for performance and reliability analysis. Tools to support performance and reliability modelling and measurement in all kinds of applications and environments are presented, and the practicability and generality of the approaches are emphasized. The volume summarizes the state of the art and points out future demands and challenges, and will interest both scientists and practitioners.
Supramolecular chemistry has been defined by J.-M. Lehn as "a highly interdisciplinary field of science covering the chemical, physical, and biological features of chemical species of higher complexity, that are held together and organized by means of intermolecular (noncovalent) binding interactions" (Science, 1993). Recognition, reactivity, and transport represent three basic functional features, in essence dynamic ones, which may be translated into structural features. The purpose of the NATO workshop, which took place September 1-5, 1993 at the Bischenberg (near Strasbourg), was to present computations which may contribute to the atomic-level understanding of the structural and thermodynamical features involved in the processes of molecular recognition and supramolecular organization. The main focus was therefore on the many facets of "supra-molecular modeling." Other applications of computers in chemistry, such as automation, simulation of processes, procedures for fitting kinetic or thermodynamic data, computer-assisted synthetic strategies, and the use of databases for structure elucidation or for bibliographic searches, have an obvious impact in supramolecular chemistry as well, but were not presented at the workshop.
This book is the result of a NATO sponsored workshop entitled "Student Modelling: The Key to Individualized Knowledge-Based Instruction" which was held May 4-8, 1991 at Ste. Adele, Quebec, Canada. The workshop was co-directed by Gordon McCalla and Jim Greer of the ARIES Laboratory at the University of Saskatchewan. The workshop focused on the problem of student modelling in intelligent tutoring systems. An intelligent tutoring system (ITS) is a computer program that is aimed at providing knowledgeable, individualized instruction in a one-on-one interaction with a learner. In order to individualize this interaction, the ITS must keep track of many aspects of the learner: how much and what he or she has learned to date; what learning styles seem to be successful for the student and what seem to be less successful; what deeper mental models the student may have; motivational and affective dimensions impacting the learner; and so on. Student modelling is the problem of keeping track of all of these aspects of a learner's learning.
The software process is the total set of software engineering activities necessary to develop and maintain software products. Software process technology (SPT) deals with methods, formalisms, and tools for supporting the software process. SPT has developed into a key technology in terms of its importance to software engineering environments, systems integration, cooperative working, and business process re-engineering. This volume contains the proceedings of the third European Workshop on Software Process Technology. It is organized into six parts: architecture, meta-process and methodology, process modeling concepts, PML concepts and paradigms, experiences with SPT, and related domains.
Dynamic Modeling introduces an approach to modeling that makes it a more practical, intuitive endeavour. The book enables readers to convert their understanding of a phenomenon to a computer model, and then to run the model and let it yield the inevitable dynamic consequences built into the structure of the model. Part I provides an introduction to modeling dynamic systems, while Part II offers general methods for modeling. Parts III through VIII then apply these methods to model real-world phenomena from chemistry, genetics, ecology, economics, and engineering. To develop and execute dynamic simulation models, Dynamic Modeling comes with STELLA II run-time software for Windows-based computers, as well as computer files of sample models used in the book. A clear, approachable introduction to the modeling process, of interest in any field where real problems can be illuminated by computer simulation.
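The stock-and-flow modeling style that STELLA supports reduces, numerically, to stepping a stock forward by its net flow. The sketch below is an assumed illustration (logistic population growth with made-up parameters), not a model from the book:

```python
def simulate_stock(stock0, net_flow, dt, steps):
    """Euler integration of a single stock-and-flow model, the basic scheme
    behind STELLA-style dynamic models: stock(t+dt) = stock(t) + dt * net flow."""
    stock = stock0
    history = [stock]
    for _ in range(steps):
        stock += dt * net_flow(stock)
        history.append(stock)
    return history

# Logistic population growth: net flow r*P*(1 - P/K); the trajectory rises from
# the initial population and levels off at the carrying capacity K.
r, K = 0.5, 100.0
traj = simulate_stock(10.0, lambda p: r * p * (1 - p / K), dt=0.1, steps=200)
```

Running the model and reading off the trajectory is exactly the "let it yield the inevitable dynamic consequences" step the blurb describes.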
This book summarizes recent advances in robotics using 3D printers and rapid prototyping as a concept development tool. The book is focused on industrial applications, educational aspects, rehabilitation, and other related topics. In particular, the book is intended to offer the reader a smooth yet deep introduction to the use of 3D printers and rapid prototyping techniques as a solution to robotics and mechatronics problems, highlighting successful case studies.
This book is a collection of work arising from an NSF/AFOSR sponsored workshop held at the University of California, Santa Barbara, 18-20 June 1992. Sixty-nine researchers, from nine countries, participated. Twelve keynote essays give an overview of the field and speculate on future directions, and nineteen technical papers delineate the state of the art in the field. This book serves both as an introduction to the topic and as a reference on the current technical problems and approaches.
This book contains the proceedings of the International Conference on Artificial Neural Networks, which was held between September 13 and 16 in Amsterdam. It is the third in a series which started two years ago in Helsinki and which last year took place in Brighton. Thanks to the European Neural Network Society, ICANN has emerged as the leading conference on neural networks in Europe. Neural networks is a field of research which has enjoyed rapid expansion and great popularity in both the academic and industrial research communities. The field is motivated by the commonly held belief that applications in the fields of artificial intelligence and robotics will benefit from a good understanding of the neural information processing properties that underlie human intelligence. Essential aspects of neural information processing are highly parallel execution of computation, integration of memory and process, and robustness against fluctuations. It is believed that intelligent skills, such as perception, motion and cognition, can be realized more easily in neuro-computers than in a conventional computing paradigm. This requires active research in neurobiology to extract computational principles from experimental neurobiological findings, in physics and mathematics to study the relation between architecture and function in neural networks, and in cognitive science to study higher brain functions, such as language and reasoning. Neural networks technology has already led to practical methods that solve real problems in a wide range of industrial applications. The clusters on robotics and applications contain sessions on various sub-topics in these fields.
This book covers computer simulation and computer algebra. Starting from simple examples in classical mechanics, these introductory lectures proceed to simulations in statistical physics (using FORTRAN) and then explain in detail the use of computer algebra (by means of Reduce). This third edition takes into account the most recent version of Reduce (3.4.1) and updates the description of large-scale simulations to subjects such as the 170000 × 170000 Ising model. Furthermore, an introduction to both vector and parallel computing is given.
This book is based on a number of systems concepts, of which the following are emphasized here: the interacting systems of society and the environment are dynamic and evolutionary; evolution of these systems carries them through stages of differential stability and instability, continuity and discontinuity; associated with evolution and instability is structural change that is essentially irreversible; the present is a stage of world transformation that may not have been equaled for decades or even centuries; and policies and decisions must match the times, in the present case the stage of world transformation. The time 11:59:59 PM, approximately, on December 31, 2000 has an important symbolic meaning. It marks the end of a minute, the end of an hour, the end of a day, the end of a year, the end of a decade, the end of a century, and the end of a millennium. The time and date provide a convenient yardstick against which we can evaluate the evolution of our thinking and the adequacy of our assumptions, mental models, paradigms, and policies. Will the beginning turn out to be appropriately different from the end? We hope that this book is helpful in such evaluation. This is a new-paradigm book, which both presents and advances the new way of thinking about the systems of science, technology, society, economics, politics, and the environment, and actively calls for the replacement of the worn-out cognitive/sociotechnical paradigm.
In this volume experts from university and industry present new technologies for solving industrial problems as well as important and practical impulses for new research. The following topics are treated: solid modelling, geometry processing, feature modelling, product modelling, surfaces over arbitrary topologies, blending methods, scattered data algorithms, smoothing and fairing algorithms, and NURBS. The 21 articles give a state-of-the-art survey of the relevant problems and issues in the rapidly growing area of geometric modelling.
From mulching to greenhouses, the air space between the cover and the soil surface is the key to the classification of climates under cover. The same mechanism governs environments produced by the various covers. This book describes and analyses all the different environments from mulching to greenhouses. The relationship between plants and environment is another important topic in the book. Stress is placed on the link between quantitative phenomena and qualitative analyses. Most phenomena involved are nonlinear and non-steady-state. An approach called System Dynamics is used, and simulation models developed in the simulation language CSMP are fully used. The subjects covered are of relevance to graduate students, to scientists and researchers in agriculture and biological sciences and, of course, to agricultural organizations in both the developing and developed countries.
Why a book about logs? That's easy: the humble log is an abstraction that lies at the heart of many systems, from NoSQL databases to cryptocurrencies. Even though most engineers don't think much about them, this short book shows you why logs are worthy of your attention. Based on his popular blog posts, LinkedIn principal engineer Jay Kreps shows you how logs work in distributed systems, and then delivers practical applications of these concepts in a variety of common uses - data integration, enterprise architecture, real-time stream processing, data system design, and abstract computing models. Go ahead and take the plunge with logs; you're going to love them. Learn how logs are used for programmatic access in databases and distributed systems. Discover solutions to the huge data integration problem when more data of more varieties meet more systems. Understand why logs are at the heart of real-time stream processing. Learn the role of a log in the internals of online data systems. Explore how Jay Kreps applies these ideas to his own work on data infrastructure systems at LinkedIn.
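The log abstraction this book is built around is small enough to sketch in a few lines. This is an illustrative in-memory toy (class and method names are assumed, not from the book or from any particular system): records get monotonically increasing offsets, and consumers read forward from any stored offset:

```python
class Log:
    """A minimal append-only log: each record gets the next offset, and
    readers replay records in order from any starting offset."""

    def __init__(self):
        self._records = []

    def append(self, record):
        """Append a record and return its offset."""
        self._records.append(record)
        return len(self._records) - 1

    def read(self, offset=0):
        """Yield (offset, record) pairs from the given offset onward."""
        for i in range(offset, len(self._records)):
            yield i, self._records[i]

log = Log()
log.append({"user": "alice", "action": "signup"})
off = log.append({"user": "bob", "action": "signup"})
# A consumer that crashed can resume by replaying from its last saved offset.
replayed = list(log.read(0))
```

Because every consumer sees the same records in the same order, the log doubles as a data-integration bus and as the input to stream processors, which is the thread running through the book's use cases.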
The disease that came to be called acquired immunodeficiency syndrome (AIDS) was first identified in the summer of 1981. By that time, nearly 100,000 persons in the United States may have been infected with human immunodeficiency virus (HIV). By the time the routes of transmission were clearly identified and HIV was established as the cause of AIDS in 1983, over 300,000 people may have been infected. That number has continued to increase, with approximately 1,000,000 Americans believed to be infected in 1991. The epidemic is of great public health concern because HIV is infectious, causes severe morbidity and death in most if not all of those infected, and often occurs in relatively young persons. In addition, the cost of medical care for a person with HIV disease is high, and the medical care needs of HIV-infected persons place a severe burden on the medical care systems in many areas. Understanding and controlling the HIV epidemic is a particularly difficult challenge. The long and variable period between HIV infection and clinical disease makes it difficult both to forecast the future magnitude of the epidemic, which is important for health care planning, and to estimate the number infected in the last several years, which is important for monitoring the current status of the epidemic.