A central issue in computer vision is the problem of signal to symbol transformation. In the case of texture, which is an important visual cue, this problem has hitherto received very little attention. This book presents a solution to the signal to symbol transformation problem for texture. The symbolic description scheme consists of a novel taxonomy for textures, and is based on appropriate mathematical models for different kinds of texture. The taxonomy classifies textures into the broad classes of disordered, strongly ordered, weakly ordered and compositional. Disordered textures are described by statistical measures, strongly ordered textures by the placement of primitives, and weakly ordered textures by an orientation field. Compositional textures are created from these three classes of texture by using certain rules of composition. The unifying theme of this book is to provide standardized symbolic descriptions that serve as a descriptive vocabulary for textures. The algorithms developed in the book have been applied to a wide variety of textured images arising in semiconductor wafer inspection, flow visualization and lumber processing. The taxonomy for texture can serve as a scheme for the identification and description of surface flaws and defects occurring in a wide range of practical applications.
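The four-class taxonomy described above is, in effect, a small type hierarchy pairing each texture class with its characteristic descriptor. As a loose illustration only (not code from the book; all class and field names are hypothetical), a minimal sketch of that structure might look like this:

```python
# Illustrative sketch of the texture taxonomy described in the blurb above.
# Class and field names are hypothetical, not taken from the book.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class DisorderedTexture:
    statistics: Dict[str, float]              # described by statistical measures


@dataclass
class StronglyOrderedTexture:
    primitive: str                            # repeated texture element
    placements: List[Tuple[float, float]]     # described by primitive placement


@dataclass
class WeaklyOrderedTexture:
    orientation_field: List[List[float]]      # dominant orientation per image block


@dataclass
class CompositionalTexture:
    components: list                          # textures from the three classes above
    composition_rule: str                     # rule used to combine them
```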
This book constitutes the proceedings of the 4th International Workshop on Computational Topology in Image Context, CTIC 2012, held in Bertinoro, Italy, in May 2012. The 16 papers presented in this volume were carefully reviewed and selected for inclusion in this book. They focus on the topology and computation in image context. The workshop is devoted to computational methods using topology for the analysis and comparison of images. The involved research fields comprise computational topology and geometry, discrete topology and geometry, geometrical modeling, algebraic topology for image applications, and any other field involving a geometric-topological approach to image processing.
This book constitutes the refereed proceedings of the 6th International Conference on Test and Proofs, TAP 2012, held in Prague, Czech Republic, in May/June 2012, as part of the TOOLS 2012 Federated Conferences. The 9 revised full papers presented together with 2 invited papers, 4 short papers and one tutorial were carefully reviewed and selected from 29 submissions. The papers are devoted to the convergence of tests and proofs for developing novel techniques and applications that support engineers in building secure, safe, and reliable systems. Among the topics covered are model-based testing; scenario-based testing; complex data structure generation; and the validation of protocols and libraries.
This book constitutes the thoroughly refereed post-conference proceedings of the Second International Symposium on Foundations of Health Information Engineering and Systems, FHIES 2012, held in Paris, France, in August 2012. The 11 revised full papers presented together with 3 short papers in this volume were carefully reviewed and selected from 26 submissions. Topics covered in this volume include software engineering, systems engineering, data engineering, applied mathematics, and psychology.
This book contains extended and revised versions of a set of selected papers from two workshops organized by the Euro Working Group on Decision Support Systems (EWG-DSS), which were held in Liverpool, UK, and Vilnius, Lithuania, in April and July 2012. From a total of 33 submissions, 9 papers were accepted for publication in this edition after being reviewed by at least three internationally known experts from the EWG-DSS Program Committee and external invited reviewers. The selected papers are representative of the current research activities in the area of decision support systems, focusing on topics such as decision analysis for enterprise systems and non-hierarchical networks, integrated solutions for decision support and knowledge management in distributed environments, decision support system evaluation and analysis through social networks, and e-learning and its application to real environments.
The development of new-generation micro-manufacturing technologies and systems has revolutionised the way products are designed and manufactured today, with a significant impact in a number of key industrial sectors. Micro-manufacturing technologies are often described as disruptive, enabling and interdisciplinary, leading to the creation of whole new classes of products that were previously not feasible to manufacture. While key processes for volume manufacture of micro-parts such as machining and moulding are becoming mature technologies, micro-assembly remains a key challenge for the cost-effective manufacture of complex micro-products. The ability to manufacture customizable micro-products that can be delivered in variable volumes within relatively short timescales is very much dependent on the level of development of the micro-assembly processes, positioning, alignment and measurement techniques, gripping and feeding approaches and devices. Micro-assembly has developed rapidly over the last few years and all the predictions are that it will remain a critical technology for high-value products in a number of key sectors such as healthcare, communications, defence and aerospace. The key challenge is to match the significant technological developments with a new generation of micro-products that will firmly establish micro-assembly as a mature manufacturing process. The book includes the set of papers presented at the 5th International Precision Assembly Seminar IPAS 2010, held in Chamonix, France from the 14th to the 17th February 2010.
This book presents a collection of chapters describing the state of the art on computational modelling and fabrication in tissue engineering. Tissue engineering is a multidisciplinary field involving scientists from different fields. The development of mathematical methods is quite relevant to understanding cell biology and human tissues, as well as to modelling, designing and fabricating optimized and smart scaffolds. The chapter authors are the distinguished keynote speakers at the first Eccomas thematic conference on Tissue Engineering, where the emphasis was on mathematical and computational modeling for scaffold design and fabrication. This particular area of tissue engineering, whose goal is to obtain substitutes for hard tissues such as bone and cartilage, is growing in importance.
In recent years, rapid changes and improvements have been witnessed in the field of transformer condition monitoring and assessment, especially with the advances in computational intelligence techniques. Condition Monitoring and Assessment of Power Transformers Using Computational Intelligence applies a broad range of computational intelligence techniques to deal with practical transformer operation problems. The approaches introduced are presented in a concise and flowing manner, tackling complex transformer modelling problems and uncertainties occurring in transformer fault diagnosis. Condition Monitoring and Assessment of Power Transformers Using Computational Intelligence covers both the fundamental theories and the most up-to-date research in this rapidly changing field. Many examples have been included that use real-world measurements and realistic operating scenarios of power transformers to fully illustrate the use of computational intelligence techniques for a variety of transformer modelling and fault diagnosis problems. Condition Monitoring and Assessment of Power Transformers Using Computational Intelligence is a useful book for professional engineers and postgraduate students. It also provides a firm foundation for advanced undergraduate students in power engineering.
Daylight is a dynamic source of illumination in architectural space, creating diverse and ephemeral configurations of light and shadow within the built environment. Perceptual qualities of daylight, such as contrast and temporal variability, are essential to our understanding of both material and visual effects in architecture. Although spatial contrast and light variability are fundamental to the visual experience of architecture, architects still rely primarily on intuition to evaluate their designs because there are few metrics that address these factors. Through an analysis of contemporary architecture, this work develops a new typological language that categorizes architectural space in terms of contrast and temporal variation. This research proposes a new family of metrics that quantify the magnitude of contrast-based visual effects and time-based variation within daylit space through the use of time-segmented daylight renderings to provide a more holistic analysis of daylight performance.
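The blurb does not spell out the proposed metrics, so the following is only a generic illustration of the kind of computation involved: a per-frame RMS contrast over a time series of daylight renderings, plus the spread of that value over time. Function names, data, and formulas are assumptions, not the metrics proposed in the work.

```python
# Generic illustration (not the metrics proposed in the work): per-frame RMS
# contrast of luminance renderings and its variability across a time series.
import numpy as np


def rms_contrast(luminance: np.ndarray) -> float:
    """Standard deviation of luminance divided by its mean (RMS contrast)."""
    mean = float(luminance.mean())
    return float(luminance.std() / mean) if mean > 0 else 0.0


def temporal_variability(frames: list) -> float:
    """Spread of the per-frame contrast values across the time series."""
    contrasts = np.array([rms_contrast(f) for f in frames])
    return float(contrasts.std())


if __name__ == "__main__":
    # Stand-in data: random luminance maps for 12 hourly frames.
    rng = np.random.default_rng(0)
    frames = [rng.lognormal(mean=3.0, sigma=0.5, size=(64, 64)) for _ in range(12)]
    print([round(rms_contrast(f), 3) for f in frames[:3]])
    print("temporal variability:", round(temporal_variability(frames), 3))
```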
There is a tradition of theoretical brain science which started in the forties (Wiener, McCulloch, Turing, Craik, Hebb). This was continued by a small number of people without interruption up to the present. It has definitely provided main guiding lines for brain science, the development of which has been spectacular in the last decades. However, within the bulk of experimental neuroscience, the theoreticians sometimes had a difficult stand, since it was felt that the times were not ripe yet and the methods not yet available for the development of a true theoretical speciality in this field. Thus theory remained in the hands of a fairly small club which recruited its members from theoretical physicists, mathematicians and some experimentalists with amateurish theoretical leanings. The boom of approaches which go by the name of 'computational neuroscience', 'neuronal networks', 'associative memory', 'spinglass theory', 'parallel processing' etc. should not blind one to the fact that the group of people professionally interested in realistic models of brain function up to the present date remains rather small and suffers from a lack of professional organization. It was against this background that we decided to organize a meeting on Theoretical Brain Science. The meeting was held April 18-20, 1990 and took place at Schloss Ringberg, West Germany, a facility sponsored by the Max-Planck-Society.
Simulation Methods for Reliability and Availability of Complex Systems discusses the use of computer simulation-based techniques and algorithms to determine reliability and availability (R and A) levels in complex systems. The book shares theoretical or applied models and decision support systems that make use of simulation to estimate and improve system R and A levels; forecasts emerging technologies and trends in the use of computer simulation for R and A; and proposes hybrid approaches to the development of efficient methodologies designed to solve R and A-related problems in real-life systems. Dealing with practical issues, Simulation Methods for Reliability and Availability of Complex Systems is designed to support managers and system engineers in the improvement of R and A, as well as providing a thorough exploration of the techniques and algorithms available for researchers and for advanced undergraduate and postgraduate students.
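As a toy illustration of simulation-based R and A estimation (not a method from the book; the single-component model and all rates are assumptions), the sketch below estimates the availability of one repairable component with exponentially distributed failure and repair times and compares it against the analytic steady-state value.

```python
# Toy Monte Carlo sketch: availability of a single repairable component with
# exponential time-to-failure and time-to-repair. All parameters are
# illustrative assumptions, not data from the book.
import random


def simulate_availability(mtbf: float, mttr: float, horizon: float, runs: int = 1000) -> float:
    """Average fraction of the mission time the component is up."""
    total_up = 0.0
    for _ in range(runs):
        t, up_time = 0.0, 0.0
        while t < horizon:
            ttf = random.expovariate(1.0 / mtbf)     # time to next failure
            up = min(ttf, horizon - t)
            up_time += up
            t += up
            if t >= horizon:
                break
            t += random.expovariate(1.0 / mttr)      # repair duration (downtime)
        total_up += up_time / horizon
    return total_up / runs


if __name__ == "__main__":
    # Analytic steady-state availability is mtbf / (mtbf + mttr) = 0.98 here.
    print(simulate_availability(mtbf=490.0, mttr=10.0, horizon=10_000.0))
```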
This book presents a powerful new language and methodology for programming complex reactive systems in a scenario-based manner. The language is live sequence charts (LSCs), a multimodal extension of sequence charts and UML's sequence diagrams, used in the past mainly for requirements. The methodology is play-in/play-out, an unusually convenient means for specifying inter-object scenario-based behavior directly from a GUI or an object model diagram, with the surprising ability to execute that behavior, or those requirements, directly. The language and methodology are supported by a fully implemented tool, the Play-Engine, which is attached to the book on CD. Comments from experts in the field: The design of reactive systems is one of the most challenging problems in computer science. This book starts with a critical insight to explain the difficulty of this problem: there is a fundamental gap between the scenario-based way in which people think about such systems and the state-based way in which these systems are implemented. The book then offers a radical proposal to bridge this gap by means of playing scenarios. Systems can be specified by playing in scenarios and implemented by means of a Play-Engine that plays out scenarios. This idea is carried out and developed, lucidly, formally and playfully, to its fullest. The result is a compelling proposal, accompanied by a prototype software engine, for reactive systems design, which is bound to cause a splash in the software-engineering community. Moshe Y. Vardi, Rice University, Houston, Texas, USA. Scenarios are a primary exchange tool in explaining system behavior to others, but their limited expressive power never made them able to fully describe systems, thus limiting their use. The language of Live Sequence Charts (LSCs) presented in this beautifully written book achieves this goal, and the attached Play-Engine software makes these LSCs really come alive. This is undoubtedly a key breakthrough that will start long-awaited and exciting new directions in systems specification, synthesis, and analysis. Gerard Berry, Esterel Technologies and INRIA, Sophia-Antipolis, France. The approach of David Harel and Rami Marelly is a fascinating way of combining prototyping techniques with techniques for identifying behavior and user interfaces. Manfred Broy, Technical University of Munich, Germany.
This book constitutes the proceedings of the 7th International ICST Conference, TridentCom 2011, held in Shanghai, China, in April 2011. Out of numerous submissions the Program Committee finally selected 26 full papers and 2 invited papers. They focus on topics such as future Internet testbeds, future wireless testbeds, federated and large-scale testbeds, network and resource virtualization, overlay network testbeds, management provisioning and tools for networking research, and experimentally driven research and user experience evaluation.
The LNCS journal Transactions on Computational Systems Biology is devoted to inter- and multidisciplinary research in the fields of computer science and life sciences and supports a paradigmatic shift in the techniques from computer and information science to cope with the new challenges arising from the systems oriented point of view of biological phenomena. This, the 14th Transactions on Computational Systems Biology volume, guest edited by Ion Petre and Erik de Vink, focuses on Computational Models for Cell Processes and features a number of carefully selected and enhanced contributions, initially presented at the CompMod workshop, which took place in Aachen, Germany, in September 2011. The papers, written from different points of view and following various approaches, cover a wide range of topics within the field of modeling and analysis of biological systems. In addition, two regular submissions deal with models of self-assembling systems and metabolic constraints on the evolution of genetic codes.
Arturo Carsetti. According to molecular biology, true invariance (life) can exist only within the framework of ongoing autonomous morphogenesis, and vice versa. With respect to this secret dialectics, life and cognition appear as indissolubly interlinked. In this sense, for instance, the inner articulation of conceptual spaces appears to be linked to an inner functional development based on a continuous activity of selection and "anchorage" realised on semantic grounds. It is the work of "invention" and generation (in invariance), linked with the "rooting" of meaning, which determines the evolution, the leaps and punctuated equilibria, the conditions related to the unfolding of new modalities of invariance, an invariance which is never simple repetition and which springs on each occasion through deep-level processes of renewal and recovery. The selection perpetrated by meaning reveals its autonomy above all in its underpinning, in an objective way, the ongoing choice of these new modalities. As such it is not, then, concerned only with the game of "possibles," offering itself as a simple channel for pure chance, but with providing a channel for the articulation of the "possible" in the humus of a semantic (and embodied) net in order to prepare the necessary conditions for a continuous renewal and recovery of original creativity. In effect, it is this autonomy in inventing new possible modules of incompressibility which determines the actual emergence of new (and true) creativity, which also takes place through the "narration" of the effected construction.
This volume is dedicated to Jacob Aboudi, a fine scientist who has made seminal contributions in applied mechanics. The papers presented here reflect the appreciation of many of Jacob's colleagues. A publication list following this introduction provides an indication of his distinguished academic career, currently in its fifth decade, and the breadth of his knowledge. His papers consistently demonstrate originality, innovation and diligence. This list uncovers the methodical work of a dedicated researcher whose achievements established him as a leading authority in the area of mathematical modeling of the behavior of heterogeneous materials, the area which became known as homogenization theory. Starting in 1981, Jacob established a micromechanical model known as the Method of Cells (MOC) which evolved into the Generalized Method of Cells (GMC) that predicts the macroscopic response of composite materials as a function of the properties, volume fractions, shapes, and constitutive behavior of its constituents. The versatility of the model has been demonstrated to effectively incorporate various types of constituent material behavior (i.e., both coupled and uncoupled mechanical, thermal, electrical and magnetic effects). As a result of its potential in providing an efficient tool for the emerging field of multiscale analysis, the method gained increasing attention and became a subject for further research.
Computer simulation has become a basic tool in many branches of physics such as statistical physics, particle physics, or materials science. The application of efficient algorithms is at least as important as good hardware in large-scale computation. This volume contains didactic lectures on such techniques based on physical insight. The emphasis is on Monte Carlo methods (introduction, cluster algorithms, reweighting and multihistogram techniques, umbrella sampling), efficient data analysis and optimization methods, but aspects of supercomputing, the solution of stochastic differential equations, and molecular dynamics are also discussed. The book addresses graduate students and researchers in theoretical and computational physics.
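As a minimal illustration of the starting point for such techniques (a plain single-spin-flip Metropolis update for the 2D Ising model; the lattice size and coupling are illustrative, and the lectures cover far more efficient cluster and reweighting methods), consider the following sketch:

```python
# Single-spin-flip Metropolis Monte Carlo for the 2D Ising model
# (illustrative baseline only; cluster algorithms are far more efficient
# near criticality).
import math
import random


def metropolis_sweep(spins, beta):
    """One lattice sweep of single-spin-flip Metropolis updates."""
    n = len(spins)
    for _ in range(n * n):
        i, j = random.randrange(n), random.randrange(n)
        # Sum of the four nearest neighbours with periodic boundaries.
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        dE = 2.0 * spins[i][j] * nb           # energy cost of flipping spin (i, j)
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            spins[i][j] *= -1


if __name__ == "__main__":
    n, beta = 16, 0.44                        # lattice size, coupling near criticality
    spins = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    for _ in range(200):
        metropolis_sweep(spins, beta)
    m = abs(sum(sum(row) for row in spins)) / (n * n)
    print(f"|magnetization| per spin ~ {m:.3f}")
```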
Society depends heavily on infrastructure systems, such as road-traffic networks, water networks, and electricity networks. Infrastructure systems are here considered to be large-scale, networked systems that almost everybody uses on a daily basis, and that are so vital that their incapacity or destruction would have a debilitating impact on the defense or economic security and functioning of society. The operation and control of existing infrastructures are failing: too often we are confronted with capacity problems, safety issues, unreliability and inefficiency. This book concentrates on a wide range of problems concerning the way infrastructures function today and discusses novel, advanced, intelligent methods and tools for the operation and control of existing and future infrastructures.
This book constitutes selected papers from the lectures given at the workshops held in conjunction with the User Modeling, Adaptation and Personalization Conference, UMAP 2011, Girona, Spain, in July 2011. The 40 papers presented were carefully reviewed and selected for inclusion in this book. For each workshop there is an overview paper summarizing the workshop themes, the accepted contributions and the future research trends. In addition the volume presents a selection of the best poster papers of UMAP 2011. The workshops included are: AST, adaptive support for team collaboration; AUM, augmenting user models with real worlds experiences to enhance personalization and adaptation; DEMRA, decision making and recommendation acceptance issues in recommender systems; PALE, personalization approaches in learning environments; SASWeb, semantic adaptive social web; TRUM, trust, reputation and user modeling; UMADR, user modeling and adaptation for daily routines: providing assistance to people with special and specific needs; UMMS, user models for motivational systems: the affective and the rational routes to persuasion.
This book constitutes the proceedings of the 8th International Workshop on Programming Multi-Agent Systems, held in Toronto, Canada, in May 2010 in conjunction with AAMAS 2010, the 9th International Joint Conference on Autonomous Agents and Multiagent Systems. The 7 revised full papers presented together with 1 invited paper were carefully reviewed and selected for inclusion in the book. The papers cover a broad range of mostly practical topics such as decision components of agent systems, practical examples of programming languages, and interaction with the environment, and are thus organized in topical sections on reasoning, programming languages, and environments.
We make complex decisions every day, requiring trust in many different entities for different reasons. These decisions are not made by combining many isolated trust evaluations. Many interlocking factors play a role, each dynamically impacting the others. In this brief, "trust context" is defined as the system level description of how the trust evaluation process unfolds. Networks today are part of almost all human activity, supporting and shaping it. Applications increasingly incorporate new interdependencies and new trust contexts. Social networks connect people and organizations throughout the globe in cooperative and competitive activities. Information is created and consumed at a global scale. Systems, devices, and sensors create and process data, manage physical systems, and participate in interactions with other entities, people and systems alike. To study trust in such applications, we need a multi-disciplinary approach. This book reviews the components of the trust context through a broad review of recent literature in many different fields of study. Common threads relevant to the trust context across many application domains are also illustrated. Illustrations in the text (c) 2013 Aaron Hertzmann. www.dgp.toronto.edu/~hertzman
Welcome to Bavaria, Germany, and to the First Intercontinental Maritime Simulation Symposium and Mathematical Modelling Workshop, a triennial international conference jointly promoted by Control Data, IMSF and SCS, which takes place at Schliersee, a small town near the Alps. The aim of the Symposium is to cover most of the aspects of maritime modelling and simulation in theory and practice, to promote the exchange of knowledge and experience between different international research groups in this field, and to strengthen the international contact between developers and users of modelling and simulation techniques. On the occasion of the Symposium, people of scientific and engineering disciplines will meet to discuss the state of the art and future activities and developments. A large number of contributed papers has been strictly examined and selected by the papers committee to guarantee a high international standard. The book contains the accepted papers which will be presented at the Symposium. The papers have been classified according to the following topics: 1. Fifth Generation Computer Technology; 2. Simulation-Software-Tools; 3. An Industrial Computer System - The Chrysler Story; 4. Marine Mathematical Modelling; 5. CFD for Marine Vehicles; 6. Navigation Methodology; 7. Marine Maneuvering and Motion Simulation; 8. Off-Shore Modelling; 9. Steering and Control of Marine Vehicles; 10. Training and Traffic Control; 11. Under-Water Vehicles Operation. Authors from 9 countries will meet at the Symposium.
Nonsmooth Modeling and Simulation for Switched Circuits concerns the modeling and the numerical simulation of switched circuits with the nonsmooth dynamical systems (NSDS) approach, using piecewise-linear and multivalued models of electronic devices like diodes, transistors, and switches. Numerous examples (ranging from introductory academic circuits to various types of power converters) are analyzed, and many simulation results obtained with the INRIA open-source SICONOS software package are presented. Comparisons with SPICE and hybrid methods demonstrate the power of the NSDS approach. Nonsmooth Modeling and Simulation for Switched Circuits is intended for researchers and engineers in the field of circuit simulation and design, but may also attract applied mathematicians interested in the numerical analysis of nonsmooth dynamical systems, as well as researchers from systems and control.
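As a rough illustration of the piecewise-linear/complementarity idea (this is not the SICONOS API; the circuit, component values, and explicit time stepping are assumptions), the sketch below simulates a half-wave rectifier whose ideal diode satisfies the complementarity condition 0 <= i_D perp -v_D >= 0, which for this single-diode circuit can be resolved in closed form at every step.

```python
# Illustrative sketch only (not the SICONOS API): half-wave rectifier with an
# ideal diode modelled by the complementarity condition 0 <= i_D  perp  -v_D >= 0.
import math

Rs, R, C = 10.0, 1e3, 100e-6     # series resistance, load, capacitor (assumed values)
f, amp = 50.0, 10.0              # 50 Hz sinusoidal source, 10 V amplitude
dt, t_end = 1e-5, 0.1            # explicit Euler step and simulated time

vc, t = 0.0, 0.0                 # capacitor (output) voltage
while t < t_end:
    vs = amp * math.sin(2.0 * math.pi * f * t)
    i_d = max(0.0, (vs - vc) / Rs)      # closed-form solution of the 1-D complementarity problem
    vc += dt * (i_d - vc / R) / C       # explicit Euler update of the capacitor voltage
    t += dt

print(f"capacitor voltage after {t_end} s: {vc:.2f} V")
```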
This book contains the joint proceedings of the Winter School of Hakodate (WSH) 2011 held in Hakodate, Japan, March 15-16, 2011, and the 6th International Workshop on Natural Computing (6th IWNC) held in Tokyo, Japan, March 28-30, 2012, organized by the Special Interest Group of Natural Computing (SIG-NAC), the Japanese Society for Artificial Intelligence (JSAI). This volume compiles refereed contributions to various aspects of natural computing, ranging from computing with slime mold, artificial chemistry, eco-physics, and synthetic biology, to computational aesthetics.
Scientific visualization is concerned with exploring data and information in such a way as to gain understanding and insight into the data. This is a fundamental objective of much scientific investigation. To achieve this goal, scientific visualization utilises aspects in the areas of computer graphics, user-interface methodology, image processing, system design, and signal processing. This volume is intended for readers new to the field who require a quick and easy-to-read summary of what scientific visualization is and what it can do. Written in a popular and journalistic style with many illustrations, it will enable readers to appreciate the benefits of scientific visualization and how current tools can be exploited in many application areas. This volume is indispensable for scientists and research workers who have never used computer graphics or other visual tools before, and who wish to find out the benefits and advantages of the new approaches.