Welcome to Loot.co.za!
Interleaving Planning and Execution for Autonomous Robots develops a formal representation for interleaving planning and execution in the context of incomplete information. This work bridges the gap between theory and practice in robotics by presenting control architectures that are provably sound, complete and optimal, and then describing real-world implementations of these robot architectures. Dervish, winner of the 1994 AAAI National Robot Contest, is one of the robots featured. Interleaving Planning and Execution for Autonomous Robots is based on the author's PhD research, covering the same material as CS 224, the very popular Introduction to Robot Programming Laboratory taught at Stanford for four years by Professor Michael Genesereth and the author.
This is a thorough description of this increasingly important technology, starting from the development of head-up displays (HUDs), particularly specifications and standards and operational problems associated with HUD use. HUD involvement in spatial disorientation and its use in recognizing and recovering from unusual attitudes is discussed. The book summarizes the design criteria including hardware, software, interface and display criteria. It goes on to outline flight tasks to be used for evaluating HUDs and discusses the impact of HUDs on flight training. Recent work indicates that a HUD may allow a significant reduction in the time required to train a pilot on a particular aircraft, even considering non-HUD-related tasks. The author concludes with a review of unresolved HUD issues and recommendations for further research and provides an impressive bibliography, glossary and index. Within the military aviation sector the book will be of use to industry, research agencies, test pilot schools and air force training establishments. In the civil area regulatory authorities, airlines and industry will also have an increasing interest.
This monograph describes new methods for intelligent pattern recognition using soft computing techniques including neural networks, fuzzy logic, and genetic algorithms. Hybrid intelligent systems that combine several soft computing techniques are needed due to the complexity of pattern recognition problems. Hybrid intelligent systems can have different architectures, which have an impact on the efficiency and accuracy of pattern recognition systems, to achieve the ultimate goal of pattern recognition. This book also shows results of the application of hybrid intelligent systems to real-world problems of face, fingerprint, and voice recognition. This monograph is intended to be a major reference for scientists and engineers applying new computational and mathematical tools to intelligent pattern recognition and can be also used as a textbook for graduate courses in soft computing, intelligent pattern recognition, computer vision, or applied artificial intelligence.
Geographic information system (GIS) computer technology is revolutionizing the way we interact with information. Data, text, drawings, maps, and images contain information that can be accessed and used intuitively through drawings containing graphical representations of the facilities to which they apply, e.g., emission stacks, sampling locations, and sites, to name only a few examples.
This book presents the fundamental concepts of fuzzy logic and fuzzy control, chaos theory and chaos control. It also provides a definition of chaos on the metric space of fuzzy sets. The book raises many questions and generates a great potential to attract more attention to combine fuzzy systems with chaos theory. In this way it contains important seeds for future scientific research and engineering applications.
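A minimal numerical illustration (my own example, not taken from the book) of the chaotic dynamics the blurb refers to: the logistic map with parameter r = 4 is a standard example of sensitive dependence on initial conditions, the hallmark of chaos that any definition of chaos, including one on a metric space of fuzzy sets, must capture.

```python
# Hypothetical sketch: the logistic map x_{n+1} = r * x_n * (1 - x_n)
# with r = 4 is chaotic -- trajectories from nearly identical starting
# points diverge rapidly (sensitive dependence on initial conditions).

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)  # perturb by one part in a billion

# The initial gap of 1e-9 is roughly doubled at every step, so the two
# trajectories become macroscopically different within a few dozen steps.
gaps = [abs(x - y) for x, y in zip(a, b)]
```

The exponential growth of `gaps` is exactly the kind of behavior that makes the combination of chaos control with fuzzy methods attractive.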
The aim of this volume of scientific essays is twofold. On the one hand, by remembering the scientific figure of Eduardo R. Caianiello, it focuses on his outstanding contributions - from theoretical physics to cybernetics - which after so many years still open innovative paths to be fruitfully followed. It should be stressed that his interdisciplinary methodology can still be of great help in addressing and solving present-day complex problems. On the other hand, it aims at pinpointing - with the help of the scientists contributing to the volume - some crucial problems in present-day research in the fields of interest of Eduardo Caianiello, which are still among the main lines of investigation of some of the institutes he founded (Istituto di Cibernetica del CNR, IIAS, etc.).
Countering Cyber Sabotage: Introducing Consequence-Driven, Cyber-Informed Engineering (CCE) introduces a new methodology to help critical infrastructure owners, operators and their security practitioners make demonstrable improvements in securing their most important functions and processes. Current best practice approaches to cyber defense struggle to stop targeted attackers from creating potentially catastrophic results. From a national security perspective, it is not just the damage to the military, the economy, or essential critical infrastructure companies that is a concern. It is the cumulative, downstream effects from potential regional blackouts, military mission kills, transportation stoppages, water delivery or treatment issues, and so on. CCE is a validation that engineering first principles can be applied to the most important cybersecurity challenges and in so doing, protect organizations in ways current approaches do not. The most pressing threat is cyber-enabled sabotage, and CCE begins with the assumption that well-resourced, adaptive adversaries are already in and have been for some time, undetected and perhaps undetectable. Chapter 1 recaps the current and near-future states of digital technologies in critical infrastructure and the implications of our near-total dependence on them. Chapters 2 and 3 describe the origins of the methodology and set the stage for the more in-depth examination that follows. Chapter 4 describes how to prepare for an engagement, and chapters 5-8 address each of the four phases. The CCE phase chapters take the reader on a more granular walkthrough of the methodology with examples from the field, phase objectives, and the steps to take in each phase. Concluding chapter 9 covers training options and looks towards a future where these concepts are scaled more broadly.
This book has a rather strange history. It began in spring 1989, thirteen years after our Systems Science Department at SUNY-Binghamton was established, when I was asked by a group of students in our doctoral program to have a meeting with them. The spokesman of the group, Cliff Joslyn, opened our meeting by stating its purpose. I can closely paraphrase what he said: "We called this meeting to discuss with you, as Chairman of the Department, a fundamental problem with our systems science curriculum. In general, we consider it a good curriculum: we learn a lot of concepts, principles, and methodological tools, mathematical, computational, heuristic, which are fundamental to understanding and dealing with systems. And, yet, we learn virtually nothing about systems science itself. What is systems science? What are its historical roots? What are its aims? Where does it stand and where is it likely to go? These are pressing questions to us. After all, aren't we supposed to carry the systems science flag after we graduate from this program? We feel that a broad introductory course to systems science is urgently needed in the curriculum. Do you agree with this assessment?" The answer was obvious and, yet, not easy to give: "I agree, of course, but I do not see how the situation could be alleviated in the foreseeable future."
As increasing numbers of social anthropologists use computers for word processing, interest in other applications inevitably follows. "Applications in Computing for Social Anthropologists" addresses this interest and encourages researchers to make full use of their computers to help them organize data. Firstly, the author discusses computing applications in relation to research activities shared by all anthropologists - ethnographic fieldwork, management and analysis of fieldnotes and the use of visual and aural material. The book then illustrates the way in which computer-based representations can satisfy the requirements of anthropological methods with a detailed examination of representing kinship relations in an original way. Developments in the representation of visual and aural data on computers, as well as possible applications of knowledge-based models, are also introduced.
Techniques for Designing and Analyzing Algorithms
Design and analysis of algorithms can be a difficult subject for students due to its sometimes-abstract nature and its use of a wide variety of mathematical tools. Here the author, an experienced and successful textbook writer, makes the subject as straightforward as possible in an up-to-date textbook incorporating various new developments appropriate for an introductory course. This text presents the main techniques of algorithm design, namely, divide-and-conquer algorithms, greedy algorithms, dynamic programming algorithms, and backtracking. Graph algorithms are studied in detail, and a careful treatment of the theory of NP-completeness is presented. In addition, the text includes useful introductory material on mathematical background including order notation, algorithm analysis and reductions, and basic data structures. This will serve as a useful review and reference for students who have covered this material in a previous course.
Features:
* The first three chapters provide a mathematical review, basic algorithm analysis, and data structures
* Detailed pseudocode descriptions of the algorithms along with illustrative examples are included
* Proofs of correctness of algorithms are included when appropriate
* The book presents a suitable amount of mathematical rigor
After reading and understanding the material in this book, students will be able to apply the basic design principles to various real-world problems that they may encounter in their future professional careers.
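Divide-and-conquer, one of the design techniques this text covers, can be illustrated with a classic example (my own sketch, not the book's pseudocode): merge sort splits the input in half, sorts each half recursively, and merges the sorted halves in linear time, for O(n log n) overall.

```python
# Hypothetical sketch of divide-and-conquer: merge sort.

def merge_sort(items):
    """Sort a list by recursively sorting halves and merging them."""
    if len(items) <= 1:              # base case: already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])   # conquer each half recursively
    right = merge_sort(items[mid:])
    # combine: merge two sorted lists in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The recurrence T(n) = 2T(n/2) + O(n) for this scheme resolves to O(n log n), the kind of analysis the book's order-notation chapters support.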
One of the main applications of VHDL is the synthesis of electronic circuits. Circuit Synthesis with VHDL is an introduction to the use of VHDL logic (RTL) synthesis tools in circuit design. The modeling styles proposed are independent of specific market tools and focus on constructs widely recognized as synthesizable by synthesis tools. A statement of the prerequisites for synthesis is followed by a short introduction to the VHDL concepts used in synthesis. Circuit Synthesis with VHDL presents two possible approaches to synthesis: the first starts with VHDL features and derives hardware counterparts; the second starts from a given hardware component and derives several description styles. The book also describes how to introduce the synthesis design cycle into existing design methodologies and the standard synthesis environment. Circuit Synthesis with VHDL concludes with a case study providing a realistic example of the design flow from behavioral description down to the synthesized level. Circuit Synthesis with VHDL is essential reading for all students, researchers, design engineers and managers working with VHDL in a synthesis environment.
Written by a distinguished cast of contributors, Alan Turing: Life and Legacy of a Great Thinker is the definitive collection of essays in commemoration of the 90th birthday of Alan Turing. This fascinating text covers the rich facets of his life, thoughts, and legacy, but also sheds some light on the future of computing science with a chapter contributed by visionary Ray Kurzweil, winner of the 1999 National Medal of Technology. Further, important contributions come from the philosopher Daniel Dennett, the Turing biographer Andrew Hodges, and from the distinguished logician Martin Davis, who provides a first critical essay on an emerging and controversial field termed "hypercomputation."
Cycloadditions are a very important class of reactions, which can be used to obtain compounds of various ring sizes. Although these reactions have been largely investigated experimentally, considerable controversy still surrounds their mechanism. A cycloaddition generally involves the formation of new sigma bonds between the reactants at the expense of pi bonds. For such processes it is possible to postulate three different mechanisms: i) a synchronous concerted approach involving a cyclic transition state (TS) in which the new bonds are formed to the same extent; ii) an asynchronous concerted mechanism in which two distinct changes occur, one mainly between the reactants and the single TS and the other mainly between the TS and the products; iii) a two-step process, which occurs in two kinetically distinct steps via an intermediate. We have computed the potential energy surfaces of a series of prototype cycloaddition reactions: i) the [2+2] cycloadditions H2C=CH2 + H2C=CH2, H2C=O + H2C=O, and H2C=CH2 + O=O; ii) the 1,3-dipolar cycloadditions HCNO + HC=CH, HCNO + H2C=CH2, and H2CNHO + H2C=CH2; iii) a cycloaddition of H2C=CH2. The potential energy surfaces have been investigated with ab initio techniques using minimal (STO-3G) and extended (4-31G) basis sets. All critical points have been fully optimized using MC-SCF gradient techniques and characterized by diagonalizing the related Hessian matrices computed using finite differences.
The objective of this edited volume is to offer a general view at the recent conceptual developments of Soft Computing (SC) regarded as a general methodology supporting the design of hybrid systems along with their diversified applications to modeling, simulation and control of non-linear dynamical systems. As of now, SC methodologies embrace neural networks, fuzzy logic, genetic algorithms and chaos theory. Each of these methodologies exhibits well delineated advantages and disadvantages. Interestingly, they have been found useful in solving a broad range of problems. However, many real-world complex problems require a prudent, carefully orchestrated integration of several of these methodologies to fully achieve the required efficiency, accuracy, and interpretability of the solutions. In this edited volume, an overview of SC methodologies, and their applications to modeling, simulation and control, will be given in an introductory paper by the Editors. Then, detailed methods for integrating the different SC methodologies in solving real-world problems will be given in the papers by the other authors in the book. The edited volume will cover a wide spectrum of applications including areas such as: robotic dynamic systems, non-linear plants, manufacturing systems, and time series prediction.
Discussing career decision making (CDM), career guidance, a computerized system of career guidance, and the interplay among them, this book describes the way people sort themselves, or are sorted, into educational and occupational options. The options represent the content of this book, and the sorting represents the process. The sequence of decisions may extend over a lifetime, but several crucial choice-points tend to occur at predictable stages in a career. Career guidance is a professional intervention in CDM; "professional" implies that practitioners conform to a standard of ethics, knowledge, and competence beyond what may be offered by other intervenors. Guidance is partly an art, but it is also partly a science -- at least an application of science, based on a synthesis of logic and evidence derived from research.
This work examines the topic of dispute resolution, specifically the multi-criteria approach that seeks to arrive at a conclusion that is mutually beneficial to both sides. Through the use of decision-aiding software, the multi-criteria approach can allow each side to give on various criteria that are not important to it, but are important to the other side. In this way, a super-optimum solution may even be reached, in which both sides receive something significantly better than they had expected. Such a result is very difficult, if not impossible, to achieve, Stuart Nagel points out, in traditional single-dimension dispute resolution. Nagel and Mills describe the nature of multi-criteria dispute resolution utilizing decision-aiding software. The first part of the book clarifies the general character of computer-aided negotiation, computer-aided mediation, and super-optimizing dispute resolution. Part two guides the reader through the use of Policy/Goal Percentaging (P/G%) decision-aiding software, centering on general decision-making, negotiation, mediation, and prediction of outcomes. Multi-criteria resolution in the context of rule-making and legal policy disputes is the focus of part three, where such matters as determining initial alternatives and criteria, resolving deadlocks, and arriving at super-optimum solutions are discussed. Part four emphasizes dispute resolution in the context of rule-applying and litigation disputes, as well as mediation at the international level and between lawyers and clients. The final part deals with future applications, such as computer-aided mediation and group decision-making with phone modems. The book's combination of decision-aiding software, arbitration-mediation, and super-optimum expansionist decision-making brings a truly innovative approach to the topic of dispute resolution.
This volume should be a welcome addition to academic, legal, and public libraries, and a valuable reference work for lawyers, law students, and legal professors and researchers.
Virtual technology is increasingly prevalent in all spheres of daily life, including infiltration into governmental policies, processes, infrastructures, and frameworks. E-Government Research: Policy and Management provides scholars and practitioners with a critical mass of research on the integration, management, implications, and application of e-government. Covering such issues as e-government adoption and diffusion; social and performance issues of e-government; and information security, privacy, and policy, this book is an essential resource to any library collection.
The book is devoted to the perturbation analysis of matrix equations. The importance of perturbation analysis is that it gives a way to estimate the influence of measurement and/or parametric errors in mathematical models together with the rounding errors done in the computational process. The perturbation bounds may further be incorporated in accuracy estimates for the solution computed in finite arithmetic. This is necessary for the development of reliable computational methods, algorithms and software from the viewpoint of modern numerical analysis.
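A minimal numerical sketch (my own example, not from the book) of the phenomenon perturbation analysis quantifies: for a linear system A x = b, a small relative error in the data b can be amplified in the solution x by up to the condition number of A, which is why rigorous perturbation bounds matter for reliable software.

```python
# Hypothetical illustration: error amplification in an ill-conditioned
# 2x2 linear system, solved exactly by Cramer's rule.

def solve2(a, b, c, d, e, f):
    """Solve the 2x2 system [[a, b], [c, d]] [x, y]^T = [e, f]^T."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

# A mildly ill-conditioned coefficient matrix: its rows are nearly parallel.
x, y = solve2(1.0, 1.0, 1.0, 1.001, 2.0, 2.0)      # solution is (2, 0)
xp, yp = solve2(1.0, 1.0, 1.0, 1.001, 2.0, 2.001)  # b perturbed by ~0.05%

# The ~0.05% change in b moves the solution from (2, 0) to (1, 1) -- an
# amplification of roughly a thousand, consistent with the matrix's
# condition number (about 4 * 10^3).
```

Perturbation bounds of the kind the book develops predict this amplification in advance, without having to re-solve the perturbed system.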
Becoming a Teacher offers a broad context for understanding education, addressing issues such as social justice, educational ideology and teacher well-being and identity. The theoretical content is balanced with practical advice for the classroom on topics such as assessment for learning, behaviour management, differentiation and curriculum planning. Becoming a Teacher draws extensively on contemporary research and empirical evidence to support critical reflection about learning and teaching. Encouraging the reader to reflect on their own knowledge and beliefs, it explores some of the complex social and cultural dimensions that influence professional learning and practice. Becoming a Teacher's approach chimes with the commonly accepted recognition that all those involved in the education of young people should take a research-informed approach towards classroom practice. The substantial rethinking that has informed this sixth edition means that Becoming a Teacher continues to provide invaluable support, guidance and insight for all those training to be secondary teachers and a rich resource for students undertaking undergraduate or postgraduate education studies programmes.
Designing VLSI systems represents a challenging task. It is a transformation among different specifications corresponding to different levels of design abstraction: behavioral, structural and physical. The behavioral level describes the functionality of the design. It consists of two components: static and dynamic. The static component describes operations, whereas the dynamic component describes sequencing and timing. The structural level contains information about components, control and connectivity. The physical level describes the constraints that should be imposed on the floor plan, the placement of components, and the geometry of the design. Constraints of area, speed and power are also applied at this level. To implement such a multilevel transformation, a design methodology should be devised, taking into consideration the constraints, limitations and properties of each level. The mapping process between any of these domains is non-isomorphic. A single behavioral component may be transformed into more than one structural component. Design methodologies are the most recent evolution in the design automation era, which started off with the introduction and subsequent usage of module generation, especially for regular structures such as PLAs and memories. A design methodology should offer an integrated design system rather than a set of separate unrelated routines and tools. A general outline of a desired integrated design system is as follows:
* Decide on a certain unified framework for all design levels.
* Derive a design method based on this framework.
* Create a design environment to implement this design method.
Robotic systems are characterized by the intersection of computer intelligence with the physical world. This blend of physical reasoning and computational intelligence is well illustrated by the Tetrobot study described in this book. Tetrobot: A Modular Approach to Reconfigurable Parallel Robotics describes a new approach to the design of robotic systems. The Tetrobot approach utilizes modular components which may be reconfigured into many different mechanisms which are suited to different applications. The Tetrobot system includes two unique contributions: a new mechanism (a multilink spherical joint design), and a new control architecture based on propagation of kinematic solutions through the structure. The resulting Tetrobot system consists of fundamental components which may be mechanically reassembled into any modular configuration, and the control architecture will provide position control of the resulting structure. A prototype Tetrobot system has been built and evaluated experimentally. Tetrobot arms, platforms, and walking machines have been built and controlled in a variety of motion and loading conditions. The Tetrobot system has applications in a variety of domains where reconfiguration, flexibility, load capacity, and failure recovery are important aspects of the task. A number of key research directions have been opened by the Tetrobot research activities. Continuing topics of interest include: development of a more distributed implementation of the computer control architecture, analysis of the dynamics of the Tetrobot system motion for improved control of high-speed motions, integration of sensor systems to control the motion and shape of the high-dimensionality systems, and exploration of self-reconfiguration of the system. Tetrobot: A Modular Approach to Reconfigurable Parallel Robotics will be of interest to research workers, specialists and professionals in the areas of robotics, mechanical systems and computer engineering.
Memes work as rhetorical weapons and discursive arguments in political conflicts. Across digital platforms, they confirm, contest and challenge political power and hierarchies. They simultaneously create social distortion, hostility, and a sense of community. Memes thus not only reflect norms but also work as a tool for negotiating them. At the same time, memes meld symbolic and cultural elements with technological functionalities, allowing for replicability and remixing. This book studies how memes disrupt and reimagine politics in humorous ways. Memes create a playful activity that follows a shared set of rules and gives a (shared) voice, which may generate togetherness and political identities but also increase polarization. As their template travels, memes continue to appropriate new political contexts and to (re)negotiate frontiers in the political. The chapters in this book allow us to chart the playful politics of memes and how they establish or push frontiers in various political, cultural, and platform-specific contexts. Taken together, memes can challenge and regenerate populism, carve out spaces for new identity formations, and create togetherness in situations of crises. They can also, however, lead to the normalization of racist discourses. This book will be of interest to researchers and advanced students of Media and Communication Studies, Information Studies, Politics, Sociology, and Cultural Studies. It was originally published as a special issue of the journal, Information, Communication & Society.
Unit Integration Testing (UIT) was a challenge because there was no tool that could help with XHR programming and unit integration validations in an efficient way until Cypress arrived. Cypress started releasing versions in 2015 and became popular in 2018 with version 2.0.0. This book explores Cypress scripts that help implement 'shift left testing', which is a dream come true for many software testers. Shift left occurs in the majority of testing projects, but could not be implemented fully because tools were unavailable and knowledge was lacking about the possibilities of testing early in the life cycle. Shift left is a key testing strategy to help testing teams focus less on defect identification and more on developing practices to prevent defects. Cypress scripts can help front-end developers and quality engineers to work together to find defects soon after web components are built. These components can be tested immediately after they are built with Cypress Test Driven Development (TDD) scripts. Thus, defects can be fixed straight away during the development stage. Testing teams do not have to worry about finding these same defects in a later development stage because Cypress tests keep verifying components in the later stages. Defect fixing has become much cheaper with Cypress than when other tools are used. The book also covers Behaviour Driven Development (BDD)-based Gherkin scripts and the Cypress Cucumber preprocessor, which can improve test scenario coverage. Automated Software Testing with Cypress is written to fulfil the BDD and TDD needs of testing teams. Two distinct open source repositories are provided in GitHub to help start running Cypress tests in no time!