This book constitutes the refereed proceedings of the Joint Workshop on Process Algebra and Performance Modeling and Probabilistic Methods in Verification, PAPM-PROBMIV 2001, held in Aachen, Germany, in September 2001.
The book contains 16 papers and one invited talk presenting the latest research in computer animation and simulation. Special focus is given to the modelling and animation of complex phenomena. This includes the modelling of virtual creatures, from their body parts to the control of their behaviour, and the animation of natural phenomena such as water, smoke, fire, and vegetation.
Geometric modelling is concerned with the computer-aided design, manipulation, storage, and transmission of geometric shape. It provides fundamental techniques to different areas of application such as CAD/CAM, computer graphics, scientific visualization, and virtual reality. The 20 papers presented by leading experts give a state-of-the-art survey of the following topics: surface design and fairing; multiresolution models; reverse engineering; solid modelling; and constraint-based modelling.
Context has emerged as a central concept in a variety of contemporary approaches to reasoning. The conference at which the papers in this volume were presented was the third international, interdisciplinary conference on the topic of context, and was held in Dundee, Scotland on July 27-30, 2001. The first conference in this series was held in Rio de Janeiro in 1997, and the second in Trento in 1999. Like the previous conferences, CONTEXT 2001 was remarkably successful in bringing together representatives of many different fields, spanning the entire range of the cognitive and informational sciences, and with interests ranging from specific, commercial applications to highly general philosophical and logical theories. The papers collected here demonstrate well the range of context-related research. While foundational problems remain, and continue to be discussed in many of the contributions collected in this volume, the work shows increased sophistication about what forms of reasoning are important, and what techniques are appropriate in accounting for them. The papers themselves, however, do not convey the lively excitement of the conference itself, and the continuing spirit of cooperation and communication across disciplines that has been the hallmark of these conferences. We are very pleased that the field of context research has shown intense, sustained development over four years while retaining this sense of interdisciplinary cooperation.
What should be every software organization's primary goal? Managing Software Quality. Producing and sustaining high quality in the products and processes of evolutionary systems is at the core of software engineering, and it is only through a comprehensive measurement program that a successful outcome can be assured. Cost and budget limitations and schedule due dates all represent systems engineering constraints that impinge on the degree to which software development and maintenance professionals can achieve maximum quality. Richard Nance and James Arthur's guide to managing software quality goes beyond the usual answers to the "why" and "what" questions generally provided in the standards documents. They not only look at the "how to" in their focus on the measurement of software quality, but also offer specific suggestions addressing the pressing needs of practising software engineers, quality assurance engineers, and software and project managers. "This is one of the few books in this area that addresses the 'quality' aspect based upon the important aspect of documentation. In addition, the book provides a basis for not only the software manager concerned with measurement implementation, but also the researcher in identifying the current state of the art and practice. This will be a key reference guide for anyone that is concerned with developing quality software." (William H. Farr, PhD, Naval Surface Warfare Center, Dahlgren Division) About the authors: Research motivated by problems arising in large, complex software systems is what stimulates Richard Nance. His collaboration with the U.S. Navy on major software-intensive programs spans over 30 years. James Arthur is an Associate Professor of Computer Science at Virginia Tech.
This book contains 33 papers presented at the Third Joint Visualization Symposium of the Eurographics Association and the Technical Committee on Visualization and Graphics of the IEEE Computer Society. The main topics treated are: visualization of geoscience data; multi-resolution and adaptive techniques; unstructured data, multi-scale and visibility; flow visualization; biomedical applications; information visualization; object representation; volume rendering; information visualization applications; and automotive applications.
This book focuses on the use of computer vision and graphics in architecture. It arose from a convergence of several hot topics: 1. visualization of built environments for engineering, historical, and other purposes; 2. virtual reconstruction of architecture from visual data of existing structures, whether via photogrammetric or range sensing techniques; and 3. augmentation of video data of architecture with useful information. The focus here is on architecture and how to present it, enhance its abilities, make it easier to understand, and make it accessible to a larger public. Collective interest in this topic led to the International Symposium on Virtual and Augmented Architecture, whose papers are contained in this book. As editors, we were very pleased about how well the different papers chosen gave a nice focus to the topic and conference. It is clear that there are many different research approaches still active in this area, which makes it an exciting time. We hope that this book captures that excitement and succeeds in bringing it to you.
This book constitutes the refereed proceedings of the 20th International Conference on Conceptual Modeling, ER 2001, held in Yokohama, Japan, in November 2001. The 45 revised full papers presented together with three keynote presentations were carefully reviewed and selected from a total of 197 submissions. The papers are organized in topical sections on spatial databases, spatio-temporal databases, XML, information modeling, database design, data integration, data warehouses, UML, conceptual models, systems design, method reengineering and video databases, workflows, web information systems, applications, and software engineering.
This book is devoted to investigating and developing the synergy between software engineering for multi-agent systems and agent-based social simulation; it originates from the Second International Workshop on Multi-Agent-Based Simulation, MABS 2000, held in Boston, MA, USA, in July 2000, in conjunction with ICMAS 2000. Besides the thoroughly revised full papers accepted for presentation at the workshop, two invited papers and an introductory survey by one of the volume editors have been added in order to round off the scope and achieve complete coverage of all relevant topics. The book competently surveys the state of the art in the area by offering topical sections on model design issues, applications, simulating social relations and processes, and formal approaches.
Why is the question of the difference between living and non-living matter intellectually so attractive to the man of the West? Where are our dreams about our own ability to understand this difference and to overcome it using the firmly established technologies rooted? Where are, for instance, the cultural roots of the enterprises covered nowadays by the discipline of Artificial Life? Contemplating such questions, one of us has recognized [6] the existence of the eternal dream of the man of the West expressed, for example, in the Old Testament as follows: . . . the Lord God formed the man from the dust of the ground and breathed into his nostrils the breath of life, and the man became a living being (Genesis 2:7). This is the dream about the workmanlike act of the creation of Adam from clay, about the creation of life from something non-living, and the confidence in the magic power of technologies. How has this dream developed and been converted into a reality, and how does it determine our present-day activities in science and technology? What is this confidence rooted in? Then God said: "Let us make man in our image. . . " (Genesis 1:26). Man believes in his own ability to repeat the Creator's acts, to change ideas into real things, because he believes he is godlike. This confidence is, to use the trendy Dawkins term, perhaps the most important cultural meme of the West.
This book presents state-of-the-art methods in computer animation and simulation. This collection of papers covers current research in human animation, physically based modeling, motion control, animation systems, and other key aspects.
Seventeen papers report on the latest scientific advances in the fields of immersive projection technology and virtual environments. The main topics included here are human computer interaction (user interfaces, interaction techniques), software developments (virtual environment applications, rendering techniques), and input/output devices.
This book is the third official archival publication devoted to RoboCup and documents the achievements presented at the Third Robot World Cup Soccer Games and Conferences, RoboCup-99, held in Stockholm, Sweden, in July/August 1999. The book presents the following parts
It is now 30 years since the network for digital communication, the ARPA-net, first came into operation. Since the first experiments with sending electronic mail and performing file transfers, the development of networks has been truly remarkable. Today's Internet continues to develop at an exponential rate that even surpasses that of computing and storage technologies. About five years after being commercialized, it has become as pervasive as the telephone had become 30 years after its initial deployment. In the United States, the size of the Internet industry already exceeds that of the auto industry, which has been in existence for about 100 years. The exponentially increasing capabilities of communication, computing, and storage systems are also reshaping the way science and engineering are pursued. Large-scale simulation studies in chemistry, physics, engineering, and several other disciplines may now produce data sets of several terabytes or petabytes. Similarly, almost all measurements today produce data in digital form, whether from collections of sensors, three-dimensional digital images, or video. These data sets often represent complex phenomena that require rich visualization capabilities and efficient data-mining techniques to understand. Furthermore, the data may be produced and archived in several different locations, and the analysis carried out by teams with members at several locations, possibly distinct from those with significant storage, computation, or visualization facilities. The emerging computational Grids enable the transparent use of remote instruments, computational and data resources.
Neural networks are a new, interdisciplinary tool for information processing. Neurocomputing is being successfully applied to structural problems that are difficult or even impossible to analyse with standard (hard computing) methods. The book is devoted to the foundations and applications of neural networks in structural mechanics and the design of structures.
This second volume of the series 'Reviews in Computational Chemistry' explores new applications, new methodologies, and new perspectives. The topics covered include conformational analysis, protein folding, force field parameterizations, hydrogen bonding, charge distributions, electrostatic potentials, electronic spectroscopy, molecular property correlations, and the computational chemistry literature. Methodologies described include conformational search strategies, distance geometry, molecular mechanics, molecular dynamics, ab initio and semiempirical molecular orbital calculations, and quantitative structure-activity relationships (QSAR) using topological and electronic descriptors.
This book provides a comprehensive state-of-the-art account of conceptual modeling. It grew out of research papers presented at the 18th International Conference on Conceptual Modeling (ER '99) and arranged by the editors. The scope of the conference covers the whole spectrum of conceptual modeling as it relates to database and information systems design, offering complete coverage of data and process modeling, database technology, and database applications. The aim of the conference and of these proceedings is to present new insights related to each of these topics. This book contains both selected and invited papers. The 33 selected papers are organized in 11 sessions encompassing the major themes of the conference, especially: schema transformation, evolution, and integration; temporal database design; views and reuse in conceptual modeling; advanced conceptual modeling; business process modeling and workflows; and data warehouse design. Besides the selected papers, 3 invited papers present the views of three keynote speakers, internationally known for their contributions to conceptual modeling and database research and for their active role in knowledge dissemination. Peter Chen presents the results of his ongoing research on the ER model, XML, and the Web. Georges Gardarin presents the first results of an ESPRIT project federating various data sources with XML and XML-QL. Finally, Matthias Jarke develops a way to capture and evaluate the experiences gained about process designs in so-called process data warehouses.
The objective of the present work is to review the existing literature on joint incongruity, cellular mechano-transduction, and computer simulations of mechano-adaptive bone remodelling, and to quantitatively assess the effect of incongruity on load transmission and subchondral mineralisation. Idealised computer models of incongruous joints and a specific anatomically based model of the humero-ulnar joint articulation were analysed with the finite element method, and the results directly compared with experimental and morphological data.
Increasing the designer's confidence that a piece of software or hardware is compliant with its specification has become a key objective in the design process for software and hardware systems. Many approaches to reaching this goal have been developed, including rigorous specification, formal verification, automated validation, and testing. Finite-state model checking, as it is supported by the explicit-state model checker SPIN, is enjoying a constantly increasing popularity in automated property validation of concurrent, message-based systems. SPIN has been in large parts implemented and is being maintained by Gerard Holzmann, and is freely available via ftp from netlib.bell-labs.com or from URL http://cm.bell-labs.com/cm/cs/what/spin/Man/README.html. The beauty of finite-state model checking lies in the possibility of building "push-button" validation tools. When the state space is finite, the state-space traversal will eventually terminate with a definite verdict on the property that is being validated. Equally helpful is the fact that in case the property is invalidated the model checker will return a counterexample, a feature that greatly facilitates fault identification. On the downside, the time it takes to obtain a verdict may be very long if the state space is large, and the type of properties that can be validated is restricted to a logic of rather limited expressiveness.
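To make the "push-button" idea above concrete, here is a minimal sketch of explicit-state safety checking: a breadth-first traversal of a finite state space that either proves an error state unreachable or returns a counterexample trace. This is an illustration of the general technique only, not SPIN itself or Promela; the state encoding, successor function, and error predicate are hypothetical placeholders.

```python
from collections import deque

def check_safety(initial, successors, is_error):
    """Explicit-state breadth-first search over a finite state space.
    Returns None if no error state is reachable from `initial`, otherwise
    a counterexample trace (list of states) ending in an error state."""
    parent = {initial: None}        # doubles as the visited set
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_error(state):
            trace = []              # walk parent links back to the initial state
            while state is not None:
                trace.append(state)
                state = parent[state]
            return list(reversed(trace))
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None                     # the safety property holds on every reachable state

# Toy model (hypothetical): a counter that may step by 1 or 2 must never exceed 3.
if __name__ == "__main__":
    step = lambda n: [n + 1, n + 2] if n < 4 else []
    print(check_safety(0, step, lambda n: n > 3))   # prints a trace such as [0, 2, 4]
```

Because the state space is finite, the loop is guaranteed to terminate with either a verdict of "safe" or a concrete trace, which is exactly the definite-verdict-plus-counterexample behaviour the blurb describes.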
The book presents innovative methods for the solution of multibody descriptor models. It emphasizes the interdependence of modeling and numerical solution of the arising system of differential-algebraic equations (DAE). Here, it is shown that modifications of non-stiff ODE-solvers are very effective for a large class of multibody systems. In particular, implicit methods are found to dovetail optimally with the linearly implicit structure of the model equations, allowing an inverse dynamics approach for their solution. Furthermore, the book stresses the importance of software development in scientific computing and thus presents a complete example of an interdisciplinary problem solution for an important field of application from technical mechanics.
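As a point of reference for the differential-algebraic equations mentioned above, a commonly used descriptor form of constrained multibody dynamics is sketched below. This is standard textbook notation under generic assumptions (holonomic constraints, position-level formulation) and not necessarily the exact formulation adopted in the book.

```latex
% Descriptor (DAE) form of a constrained multibody system:
%   q: generalized coordinates, v: generalized velocities, M: mass matrix,
%   f: applied and gyroscopic forces, g: holonomic constraints,
%   G = \partial g / \partial q, \lambda: Lagrange multipliers.
\[
\begin{aligned}
  \dot{q}       &= v,\\
  M(q)\,\dot{v} &= f(q, v, t) - G(q)^{\top}\lambda,\\
  0             &= g(q), \qquad G(q) = \frac{\partial g}{\partial q}(q).
\end{aligned}
\]
```

With the constraint imposed at position level this system is an index-3 DAE, which is why dedicated solution strategies that exploit its structure, rather than generic ODE integration, are the natural approach.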
The 20 research papers in this volume demonstrate novel models and concepts in animation and graphics simulation. Special emphasis is given to innovative approaches to Modelling Human Motion, Models of Collision Detection and Perception, Facial Animation and Communication, Specific Animation Models, Realistic Rendering for Animation, and Behavioral Animation.
This volume contains the papers presented at the Second International and Interdisciplinary Conference on Modeling and Using Context (CONTEXT 99), held in Trento (Italy) from 9 to 11 September 1999. CONTEXT 99 is the second in the CONTEXT series. The first was held in Rio de Janeiro (Brazil) in 1997. The CONTEXT conference series is meant to provide an interdisciplinary forum where researchers can exchange ideas, methodologies, and results on context, and is increasingly becoming an important reference for all people doing research on context. This is testified by the larger number of research areas that are represented at CONTEXT 99 (in particular, Philosophy and Cognitive Psychology were not significantly present at the first conference), and by the number and quality of submitted papers. Specifically, we received 118 papers, mostly of good or excellent quality. Among them, 33 (28%) have been accepted as full papers, and 21 as short papers. We think it is fair to say that the 54 papers collected in this volume provide a significant picture of the international research on context currently going on. The notion of context plays an important role in many areas, both theoretical and applied, such as Formal Logic, Artificial Intelligence, Philosophy, Pragmatics, Computational Linguistics, Computer Science, Cognitive Psychology.
User modeling researchers look for ways of enabling interactive software systems to adapt to their users by constructing, maintaining, and exploiting user models, which are representations of properties of individual users. User modeling has been found to enhance the effectiveness and/or usability of software systems in a wide variety of situations. Techniques for user modeling have been developed and evaluated by researchers in a number of fields, including artificial intelligence, education, psychology, linguistics, human-computer interaction, and information science. The biennial series of International Conferences on User Modeling provides a forum in which academic and industrial researchers from all of these fields can exchange their complementary insights on user modeling issues. The published proceedings of these conferences represent a major source of information about developments in this area.
Neural networks have had considerable success in a variety of disciplines including engineering, control, and financial modelling. However, a major weakness is the lack of established procedures for testing mis-specified models and the statistical significance of the various parameters which have been estimated. This is particularly important in the majority of financial applications, where the data-generating processes are dominantly stochastic and only partially deterministic. Based on the latest, most significant developments in estimation theory, model selection, and the theory of mis-specified models, this volume develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the required theoretical framework and displays the efficient use of neural networks for modelling complex financial phenomena. Unlike most other books in this area, this one treats neural networks as statistical devices for non-linear, non-parametric regression analysis.
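To illustrate the closing point, that a neural network can be treated as a statistical device for non-linear, non-parametric regression, the following sketch fits a small multilayer perceptron to synthetic noisy data and scores it on a held-out set. The data, network size, and split are invented for the example and are unrelated to the book's financial case studies or its estimation theory.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical data: a partially deterministic signal buried in noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=(500, 1))
y = np.sin(3.0 * x[:, 0]) + 0.3 * rng.standard_normal(500)

# A small multilayer perceptron used as a non-parametric regression estimator.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(x[:400], y[:400])                              # fit on the first 400 points
print("held-out R^2:", model.score(x[400:], y[400:]))    # evaluate on the remaining 100
```

The held-out score is one simple way to check that the fitted network captures the deterministic component without merely memorizing the noise.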
This book contains the scientific papers presented at the 5th EUROGRAPHICS Workshop on Virtual Environments '99, which was held in Vienna on May 31st and June 1st. It was organized by the Institute of Computer Graphics of the Vienna University of Technology together with the Austrian Academy of Sciences and EUROGRAPHICS. The workshop brought together scientists from all over the world to present and discuss the latest scientific advances in the field of Virtual Environments. 31 papers were submitted for reviewing and 18 were selected to be presented at the workshop. Most of the top research institutions working in the area submitted papers and presented their latest results. These presentations were complemented by invited lectures from Stephen Feiner and Ron Azuma, two key researchers in the area of Augmented Reality. The book gives a good overview of the state of the art in Augmented Reality and Virtual Environment research. The special focus of the workshop was Augmented Reality, reflecting a noticeably strong trend in the field of Virtual Environments. Augmented Reality tries to enrich real environments with virtual objects rather than replacing the real world with a virtual world. The main challenges include real-time rendering, tracking, registration and occlusion of real and virtual objects, shading and lighting interaction, and interaction techniques in augmented environments. These problems are addressed by new research results documented in this book. Besides Augmented Reality, the papers collected here also address levels of detail, distributed environments, systems and applications, and interaction techniques.