This book is an introduction to the fundamental concepts and tools needed for solving problems of a geometric nature using a computer. It attempts to fill the gap between standard geometry books, which are primarily theoretical, and applied books on computer graphics, computer vision, robotics, or machine learning. This book covers the following topics: affine geometry, projective geometry, Euclidean geometry, convex sets, SVD and principal component analysis, manifolds and Lie groups, quadratic optimization, basics of differential geometry, and a glimpse of computational geometry (Voronoi diagrams and Delaunay triangulations). Some practical applications of the concepts presented in this book include computer vision, more specifically contour grouping, motion interpolation, and robot kinematics. In this extensively updated second edition, more material on convex sets, Farkas's lemma, quadratic optimization, and the Schur complement has been added. The chapter on SVD has been greatly expanded and now includes a presentation of PCA. The book is well illustrated and has chapter summaries and a large number of exercises throughout. It will be of interest to a wide audience including computer scientists, mathematicians, and engineers. Reviews of the first edition: "Gallier's book will be a useful source for anyone interested in applications of geometrical methods to solve problems that arise in various branches of engineering. It may help to develop the sophisticated concepts from the more advanced parts of geometry into useful tools for applications." (Mathematical Reviews, 2001) "...it will be useful as a reference book for postgraduates wishing to find the connection between their current problem and the underlying geometry." (The Australian Mathematical Society, 2001)
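The link between SVD and PCA that the expanded chapter presents can be sketched in a few lines of NumPy; the data here is a purely illustrative random matrix, not an example from the book:

```python
import numpy as np

# PCA via the SVD: for centered data X = U S Vt, the rows of Vt are the
# principal directions and the columns of U*S are the component scores.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # toy data: 100 samples, 3 features
Xc = X - X.mean(axis=0)                # center each feature

U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt                        # principal directions (rows)
scores = U * S                         # projections onto the directions
explained_var = S**2 / (len(X) - 1)    # variance captured by each direction

# Sanity check: the scores equal the projection of Xc onto the components.
assert np.allclose(scores, Xc @ Vt.T)
```

Because NumPy returns the singular values in decreasing order, `explained_var` is automatically sorted from the most to the least informative direction.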
Auctions have long been a popular method for the allocation and procurement of products and services. Traditional auctions are constrained by time, place, number of bidders, number of bids, and the bidding experience. With the advent of internet communication technologies, the online auction environment has blossomed to support a bustling enterprise. Until now, the functional inner workings of these online exchange mechanisms have only been described through anecdotal accounts. Best Practices for Online Procurement Auctions offers a systematic approach to auction examination that will prove invaluable to practitioners and researchers alike.
This comprehensive, detailed reference to Mathematica provides the reader with both a working knowledge of Mathematica programming in general and a detailed knowledge of key aspects of Mathematica needed to create the fastest, shortest, and most elegant implementations possible to solve problems from the natural and physical sciences. The Guidebook gives the user a deeper understanding of Mathematica through instructive implementations, explanations, and examples from a range of disciplines at varying levels of complexity. "Programming" covers the structure of Mathematica expressions, after an overview of the syntax of Mathematica and its programming, graphic, numeric, and symbolic capabilities in Chapter 1. Chapters 2-6 cover the hierarchical construction of all Mathematica objects out of symbolic expressions, the definition of functions, the recognition of patterns and their efficient application, program flow and program structuring, the manipulation of lists, and additional topics. An appendix contains some general references on algorithms and applications of computer algebra, Mathematica itself, and comparisons of various algebra systems. The multiplatform CD contains Mathematica 4.1 notebooks with detailed descriptions and explanations of the Mathematica commands needed in each chapter and used in applications, supplemented by a variety of mathematical, physical, and graphic examples and worked-out solutions to all exercises. The Mathematica Guidebook is an indispensable resource for practitioners, researchers, and professionals in mathematics, the sciences, and engineering. It will find a natural place on the bookshelf as an essential reference work.
Science has made great progress in the twentieth century, with the establishment of proper disciplines in the fields of physics, computer science, molecular biology, and many others. At the same time, there have also emerged many engineering ideas that are interdisciplinary in nature, beyond the realm of such orthodox disciplines. These include, for example, artificial intelligence, fuzzy logic, artificial neural networks, evolutionary computation, data mining, and so on. In order to generate new technology that is truly human-friendly in the twenty-first century, integration of various methods beyond specific disciplines is required. Soft computing is a key concept for the creation of such human-friendly technology in our modern information society. Professor Rutkowski is a pioneer in this field, having devoted himself for many years to publishing a large variety of original work. The present volume, based mostly on his own work, is a milestone in the development of soft computing, integrating various disciplines from the fields of information science and engineering. The book consists of three parts, the first of which is devoted to probabilistic neural networks. Neural excitation is stochastic, so it is natural to investigate the Bayesian properties of connectionist structures developed by Professor Rutkowski. This new approach has proven to be particularly useful for handling regression and classification problems in time-varying environments. Throughout this book, major themes are selected from theoretical subjects that are tightly connected with challenging applications.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears to be adequate to real situations, in which the coalitional bargaining anticipates a proper realization of the game with strategic behaviour of players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which has recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game theoretical models of human behaviour.
The design process of embedded systems has changed substantially in recent years. One of the main reasons for this change is the pressure to shorten time-to-market when designing digital systems. To shorten the product cycles, programmable processors are used to implement more and more functionality of the embedded system. Therefore, nowadays, embedded systems are very often implemented by heterogeneous systems consisting of ASICs, processors, memories, and peripherals. As a consequence, the research topic of hardware/software co-design, dealing with the problems of designing these heterogeneous systems, has gained great importance. Hardware/Software Co-design for Data Flow Dominated Embedded Systems introduces the different tasks of hardware/software co-design, including system specification, hardware/software partitioning, co-synthesis, and co-simulation. The book summarizes and classifies state-of-the-art co-design tools and methods for these tasks. In addition, the co-design tool COOL is presented, which solves the co-design tasks for the class of data-flow dominated embedded systems. In Hardware/Software Co-design for Data Flow Dominated Embedded Systems the primary emphasis has been put on the hardware/software partitioning and the co-synthesis phase and their coupling. In contrast to many other publications in this area, a mathematical formulation of the hardware/software partitioning problem is given. This problem formulation supports target architectures consisting of multiple processors and multiple ASICs. Several novel approaches are presented and compared for solving the partitioning problem, including an MILP approach, a heuristic solution, and an approach based on genetic algorithms. The co-synthesis phase is based on the idea of controlling the system by means of a static run-time scheduler implemented in hardware. New algorithms are introduced which generate a complete set of hardware and software specifications required to implement heterogeneous systems.
All of these techniques are described in detail and exemplified. Hardware/Software Co-design for Data Flow Dominated Embedded Systems is intended to serve students and researchers working on hardware/software co-design. At the same time, the variety of presented techniques automating the design tasks of hardware/software systems will be of interest to industrial engineers and designers of digital systems. From the foreword by Peter Marwedel: "Niemann's method should be known by all persons working in the field. Hence, I recommend this book for everyone who is interested in hardware/software co-design."
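The flavor of the partitioning problem described above can be illustrated with a toy model: each task runs either in software (slow, no silicon cost) or in hardware (fast, but consuming area), and we minimize hardware area subject to a deadline. All numbers and task names here are invented for illustration, and brute-force enumeration stands in for the book's MILP, heuristic, and genetic-algorithm solvers:

```python
from itertools import product

# Toy hardware/software partitioning (illustrative numbers only).
tasks = {            # name: (sw_time, hw_time, hw_area)
    "fir":  (8, 2, 5),
    "fft":  (12, 3, 9),
    "ctrl": (3, 1, 4),
}
DEADLINE = 12        # total execution-time budget

best = None          # (area, assignment) of the cheapest feasible partition
for assign in product([0, 1], repeat=len(tasks)):   # 0 = software, 1 = hardware
    time = sum(hw_t if a else sw_t
               for a, (sw_t, hw_t, _) in zip(assign, tasks.values()))
    area = sum(hw_area
               for a, (_, _, hw_area) in zip(assign, tasks.values()) if a)
    if time <= DEADLINE and (best is None or area < best[0]):
        best = (area, dict(zip(tasks, assign)))

print(best)   # cheapest hardware area meeting the deadline, and who goes where
```

A real MILP formulation would express the same 0/1 choices as integer variables with linear timing and area constraints and hand them to a solver; the enumeration above only scales to a handful of tasks.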
For almost four decades, Software Engineering: A Practitioner's Approach (SEPA) has been the world's leading textbook in software engineering. The ninth edition represents a major restructuring and update of previous editions, solidifying the book's position as the most comprehensive guide to this important subject.
This book covers the wide-ranging scientific areas of computational science, from basic research fields such as algorithms and soft-computing to diverse applied fields targeting macro, micro, nano, genome and complex systems. It presents the proceedings of the International Symposium on Frontiers of Computational Science 2005, held in Nagoya in December 2005.
Research argues that e-government technologies have positive influences on politics and democracy, improving citizens' environment as well as their engagement with their government. Although much research indicates that e-government technologies have increased citizen participation, there is much more that can be developed. Politics, Democracy and E-Government: Participation and Service Delivery examines how e-government impacts politics and democracy in both developed and developing countries, discussing the participation of citizens in government service delivery. This book brings forth the idea that e-government has a direct influence on the important function of governing through participation and service delivery. Containing chapters from leading e-government scholars and practitioners from across the globe, the overall objective of this book is accomplished through its discussion of the influences of e-government on democratic institutions and processes.
Sadly enough, war, conflicts, and terrorism appear to stay with us in the 21st century. But what is our outlook on new methods for preventing and ending them? Present-day hardware and software enable the development of large crisis, conflict, and conflict-management databases with many variables, sometimes with automated updates, statistical analyses of high complexity, elaborate simulation models, and even interactive uses of these databases. In this book, these methods are presented, further developed, and applied in relation to the main issue: the resolution and prevention of intra- and international conflicts. Conflicts are a worldwide phenomenon. Therefore, internationally leading researchers from the USA, Austria, Canada, Germany, New Zealand, and Switzerland have contributed.
Homeland security information systems are an important area of inquiry due to the tremendous influence information systems play on the preparation and response of government to a terrorist attack or natural disaster. "Homeland Security Preparedness and Information Systems: Strategies for Managing Public Policy" delves into the issues and challenges that public managers face in the adoption and implementation of information systems for homeland security. A defining collection of field advancements, this publication provides solutions for those interested in adopting additional information systems security measures in their governments.
Introduction The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions that form the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools. Organization of the Book The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1. To reduce the run time, different interconnect planning approaches are applied in different ranges of temperatures. Chapter 2 introduces a new design methodology, the interconnect-centric design methodology, and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
This book offers complete training in digital communications, covering all the aspects involved in such training: courses, tutorials with many typical problems and detailed solutions, and practical work concretely illustrating various aspects of technical implementation. It breaks down into three parts. The first covers information theory itself, which concerns both the sources of information and the channels of its transmission, taking into account the errors those channels introduce into the transmission of information and the means of protection through the use of appropriate coding methods. The second part addresses the technical aspects of transmission: baseband transmission is presented first, with the important concept and fundamental technique of equalization; performance evaluation in terms of error probability is systematically developed and detailed, as are the line codes used. Finally, the third part presents transmission with digital modulation of carriers, used in radio transmissions but also over electrical cables. A second important aspect of building a learner's knowledge and skills concerns the "directed work" component of a training programme: an ordered set of 33 typical problems with detailed solutions covering the different parts of the course. Finally, the last aspect concerns practical work in the proper sense of the term, an essential complement that carries training through to know-how. We propose here a set of 5 practical exercises.
Facing the challenge of a fast-changing technological environment, many companies are developing an interest in the field of technology intelligence. Their aim is to support the decision-making process by taking advantage of the well-timed preparation of relevant information by means of systematic identification, collection, analysis, dissemination, and application of this information. This book fills a gap in the literature by showing how a technology intelligence system can be designed and implemented.
Knowledge in its pure state is tacit in nature, difficult to formalize and communicate, but it can be converted into codified form and shared through both social interactions and the use of IT-based applications and systems. Even though there seem to be considerable synergies between the resulting huge data and the convertible knowledge, there is still a debate on how the increasing amount of data captured by corporations could improve decision making and foster innovation through effective knowledge-sharing practices. Big Data and Knowledge Sharing in Virtual Organizations provides innovative insights into the influence of big data analytics and artificial intelligence and the tools, methods, and techniques for knowledge-sharing processes in virtual organizations. The content within this publication examines cloud computing, machine learning, and knowledge sharing. It is designed for government officials and organizations, policymakers, academicians, researchers, technology developers, and students.
Reputation in Artificial Societies discusses the role of reputation in the achievement of social order. The book proposes that reputation is an agent property that results from transmission of beliefs about how the agents are evaluated with regard to a socially desirable conduct. This desirable conduct represents one or another of the solutions to the problem of social order and may consist of cooperation or altruism, reciprocity, or norm obedience.
This Proceedings Volume documents recent cutting-edge developments in multi-robot systems research and is the result of the Second International Workshop on Multi-Robot Systems that was held in March 2003 at the Naval Research Laboratory in Washington, D.C. This Workshop brought together top researchers working in areas relevant to designing teams of autonomous vehicles, including robots and unmanned ground, air, surface, and undersea vehicles. The workshop focused on the challenging issues of team architectures, vehicle learning and adaptation, heterogeneous group control and cooperation, task selection, dynamic autonomy, mixed initiative, and human and robot team interaction. A broad range of applications of this technology are presented in this volume, including UCAVs (Unmanned Combat Air Vehicles), micro-air vehicles, UUVs (Unmanned Underwater Vehicles), UGVs (Unmanned Ground Vehicles), planetary exploration, assembly in space, clean-up, and urban search and rescue. This Proceedings Volume represents the contributions of the top researchers in this field and serves as a valuable tool for professionals in this interdisciplinary field.
Information engineering and applications is the field of study concerned with constructing information computing, intelligent systems, mathematical models, numerical solution techniques, and using computers and other electronic devices to analyze and solve natural scientific, social scientific and engineering problems. Information engineering is an important underpinning for techniques used in information and computational science and there are many unresolved problems worth studying. The Proceedings of the 2nd International Conference on Information Engineering and Applications (IEA 2012), which was held in Chongqing, China, from October 26-28, 2012, discusses the most innovative research and developments including technical challenges and social, legal, political, and economic issues. A forum for engineers and scientists in academia, industry, and government, the Proceedings of the 2nd International Conference on Information Engineering and Applications presents ideas, results, works in progress, and experience in all aspects of information engineering and applications.
The augmentation of urban spaces with technology, commonly referred to as Media Architecture, has found increasing interest in the scientific community within the last few years. At the same time architects began to use digital media as a new material apart from concrete, glass or wood to create buildings and urban structures. Simultaneously, Human-Computer Interaction (HCI) researchers began to exploit the interaction opportunities between users and buildings and to bridge the gaps between interface, information medium and architecture. As an example, they extended architectural structures with interactive, light-emitting elements on their outer shell, thereby transforming the surfaces of these structures into giant public screens. At the same time the wide distribution of mobile devices and the coverage of mobile internet allow manifold interaction opportunities between open data and citizens, thereby enabling the internet of things in the public domain. However, the appropriate distribution of information to all citizens is still cumbersome and a mutual dialogue not always successful (i.e. who gets what data and when?). In this book we therefore provide a deeper investigation of Using Information and Media as Construction Material with media architecture as an input and output medium.
The collapse of the Soviet Union has seen the emergence of its unprecedentedly comprehensive global secret military mapping project and the commercial availability of a vast number of detailed topographic maps and city plans at several scales. This thesis provides an in-depth examination of the series of over 2,000 large-scale city plans produced in secret by the Military Topographic Directorate of the General Staff between the end of the Second World War and the collapse of the USSR in 1991. After positioning the series in its historical context, the nature and content of the plans are examined in detail. A poststructuralist perspective introduces possibilities to utilise and apply the maps in new contexts, which this thesis facilitates by providing a systematic, empirical analysis of the Soviet map symbology at 1:10,000 and 1:25,000, using new translations of production manuals and a sample of the city plans. A comparative analysis with the current OpenStreetMap symbology indicates scope for Soviet mapping to be used as a valuable supplementary topographic resource in a variety of existing and future global mapping initiatives, including humanitarian crisis mapping. This leads to a conclusion that the relevance and value of Soviet military maps endure in modern applications, both as a source of data and as a means of overcoming contemporary cartographic challenges relating to symbology, design and the handling of large datasets.
With the development of networked computing and the increased complexity of applications and software systems development, the importance of computer-supported collaborative work (CSCW) has dramatically increased. Globalization has further accentuated the necessity of collaboration, while the Web has made geographically distributed collaborative systems technologically feasible in a manner that was impossible until recently. The software environments needed to support such distributed teams are referred to as Groupware. Groupware is intended to address the logistical, managerial, social, organizational, and cognitive difficulties that arise in the application of distributed expertise. These issues represent the fundamental challenges to the next generation of process management. Computer-Supported Collaboration with Applications to Software Development reviews the theory of collaborative groups and the factors that affect collaboration, particularly collaborative software development. The influences considered derive from diverse sources: social and cognitive psychology, media characteristics, the problem-solving behavior of groups, process management, group information processing, and organizational effects. It also surveys empirical studies of computer-supported problem solving, especially for software development. The concluding chapter describes a collaborative model for program development. Computer-Supported Collaboration with Applications to Software Development is designed for an academic and professional market in software development: professionals and researchers in the areas of software engineering, collaborative development, management information systems, problem solving, and cognitive and social psychology. This book also meets the needs of graduate-level students in computer science and information systems.
Internet heterogeneity is driving a new challenge in application development: adaptive software. Together with the increased Internet capacity and new access technologies, network congestion and the use of older technologies, wireless access, and peer-to-peer networking are increasing the heterogeneity of the Internet. Applications should provide gracefully degraded levels of service when network conditions are poor, and enhanced services when network conditions exceed expectations. Existing adaptive technologies, which are primarily end-to-end or proxy-based and often focus on a single deficient link, can perform poorly in heterogeneous networks. Instead, heterogeneous networks frequently require multiple, coordinated, and distributed remedial actions. Conductor: Distributed Adaptation for Heterogeneous Networks describes a new approach to graceful degradation in the face of network heterogeneity - distributed adaptation - in which adaptive code is deployed at multiple points within a network. The feasibility of this approach is demonstrated by Conductor, a middleware framework that enables distributed adaptation of connection-oriented, application-level protocols. By adapting protocols, Conductor provides application-transparent adaptation, supporting both existing applications and applications designed with adaptation in mind. Conductor: Distributed Adaptation for Heterogeneous Networks introduces new techniques that enable distributed adaptation, making it automatic, reliable, and secure. In particular, we introduce the notion of semantic segmentation, which maintains exactly-once delivery of the semantic elements of a data stream while allowing the stream to be arbitrarily adapted in transit. We also introduce a secure architecture for automatic adaptor selection, protecting user data from unauthorized adaptation. These techniques are described both in the context of Conductor and in the broader context of distributed systems.
Finally, this book presents empirical evidence from several case studies indicating that distributed adaptation can allow applications to degrade gracefully in heterogeneous networks, providing a higher quality of service to users than other adaptive techniques. Further, experimental results indicate that the proposed techniques can be employed without excessive cost. Thus, distributed adaptation is both practical and beneficial. Conductor: Distributed Adaptation for Heterogeneous Networks is designed to meet the needs of a professional audience composed of researchers and practitioners in industry and graduate-level students in computer science.