Genetic programming (GP) is a popular heuristic methodology of program synthesis with origins in evolutionary computation. In this generate-and-test approach, candidate programs are iteratively produced and evaluated. The latter involves running programs on tests, where they exhibit complex behaviors reflected in changes of variables, registers, or memory. That behavior not only ultimately determines program output, but may also reveal a program's 'hidden qualities' and important characteristics of the considered synthesis problem. However, conventional GP is oblivious to most of that information and usually cares only about the number of tests passed by a program. This 'evaluation bottleneck' leaves the search algorithm underinformed about the actual and potential qualities of candidate programs. This book proposes behavioral program synthesis, a conceptual framework that opens GP to detailed information on program behavior in order to make program synthesis more efficient. Several existing and novel mechanisms subscribing to that perspective to varying extents are presented and discussed, including implicit fitness sharing, semantic GP, co-solvability, trace convergence analysis, pattern-guided program synthesis, and behavioral archives of subprograms. The framework involves several concepts that are new to GP, including the execution record, the combined trace, and the search driver, a generalization of the objective function. Empirical evidence gathered in several presented experiments clearly demonstrates the usefulness of the behavioral approach. The book also contains an extensive discussion of the implications of the behavioral perspective for program synthesis and beyond.
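As a rough illustration of the evaluation bottleneck described above, the sketch below contrasts conventional fitness (the count of passed tests) with a behavioral execution record that retains intermediate values; the candidate program and trace format are illustrative inventions, not the book's notation.

```python
# Minimal sketch: conventional GP fitness vs. a behavioral "execution record".
# The candidate program and trace layout are illustrative only.

def program(x):
    """A candidate program; its intermediate values form its behavior."""
    a = x * x          # intermediate step 1
    b = a + x          # intermediate step 2
    return b

tests = [(0, 0), (1, 2), (2, 6), (3, 12)]   # (input, expected) pairs

# Conventional evaluation: only the number of passed tests survives.
fitness = sum(1 for x, want in tests if program(x) == want)

# Behavioral evaluation: keep the full trace of intermediate values per test,
# so the search algorithm can compare candidate programs in more detail.
def execution_record(x):
    a = x * x
    b = a + x
    return [a, b]      # traces combined across tests characterize behavior

record = [execution_record(x) for x, _ in tests]
print(fitness, record)  # 4 [[0, 0], [1, 2], [4, 6], [9, 12]]
```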
The book 'BiLBIQ: A biologically inspired Robot with walking and rolling locomotion' deals with implementing, in a robot prototype, a locomotion behavior observed in the biological archetype Cebrennus villosus; the prototype's structural design needs to be developed. The biological sample is investigated as far as possible and compared to other evolutionary solutions within the framework of nature's inventions. Current achievements in robotics are examined and evaluated for their relation and relevance to the robot prototype in question. An overview of the state of the art in actuation informs the choice of the available hardware most suitable for this project. By constantly considering how two fundamentally different modes of locomotion can be achieved with one and the same structure, a robot design is developed and constructed that takes hardware constraints into account. The development of a special leg structure that must resemble and replace body elements of the biological archetype is a particular challenge. Finally, a robot prototype was achieved that is able to walk and roll, inspired by the spider Cebrennus villosus.
The Internet has become the major form of map delivery. The current presentation of maps is based on the use of online services. This session examines developments related to online methods of map delivery, particularly Application Programming Interfaces (APIs) and MapServices in general, including the Google Maps API and similar services. Map mashups have had a major impact on how spatial information is presented. The advantage of using a major online mapping site is that the maps represent a common and recognizable representation of the world. Overlaying features on top of these maps provides a frame of reference for the map user. A particular advantage for thematic mapping is the ability to spatially reference thematic data.
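To make the mashup idea concrete, here is a minimal sketch of overlaying thematic point data on a common base map. It uses the open-source folium library as a stand-in (the services named above, such as the Google Maps API, are accessed differently), and all locations and values are made up.

```python
# Hypothetical mashup sketch with the folium library: thematic points
# overlaid on a recognizable base map that supplies the frame of reference.
import folium

# Base map centred on an arbitrary location.
m = folium.Map(location=[52.52, 13.405], zoom_start=11)

# Thematic overlay: spatially referenced data drawn on top of the base map.
stations = [("Station A", 52.53, 13.40, 17.5), ("Station B", 52.50, 13.42, 21.0)]
for name, lat, lon, value in stations:
    folium.CircleMarker(
        location=[lat, lon],
        radius=value / 2,            # symbol size encodes the thematic value
        popup=f"{name}: {value}",
    ).add_to(m)

m.save("mashup.html")                # open in a browser to view the mashup
```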
The 4th FTRA International Conference on Information Technology Convergence and Services (ITCS-12) will be held in Gwangju, Korea on September 6-8, 2012.
Grids, P2P and Services Computing, the 12th volume of the CoreGRID series, is based on the CoreGRID ERCIM Working Group Workshop on Grids, P2P and Service Computing, held in conjunction with Euro-Par 2009 on August 24, 2009 in Delft, The Netherlands. Grids, P2P and Services Computing, an edited volume contributed by well-established researchers worldwide, focuses on solving research challenges for Grid and P2P technologies. Topics of interest include: Service Level Agreements, Data & Knowledge Management, Scheduling, Trust and Security, Network Monitoring, and more. Grids are a crucial enabling technology for scientific and industrial development. This book also includes new challenges related to service-oriented infrastructures. Grids, P2P and Services Computing is designed for a professional audience of researchers and practitioners within the Grid community and industry. This volume is also suitable for advanced-level students in computer science.
Validation and verification is an area of software engineering that has been around since the early stages of program development, especially one of its better-known areas: testing. Testing, the dynamic side of validation and verification (V&V), has been complemented by other, more formal techniques of software engineering, and so static verification - traditional in formal methods - has been joined by model checking and other techniques. "Verification, Validation and Testing in Software Engineering" offers thorough coverage of many valuable formal and semiformal techniques of V&V. It explores, depicts, and provides examples of different applications of V&V across the many areas of software development - including real-time applications - where V&V techniques are required.
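A toy illustration of the dynamic/static distinction: the sketch below runs a conventional test on a tiny state machine, then exhaustively explores its reachable states in the spirit of model checking. The example is ours and far simpler than the techniques the book covers.

```python
# Toy contrast between dynamic testing and an exhaustive, model-checking-style
# check of a tiny traffic-light state machine (illustrative only).

TRANSITIONS = {"red": "green", "green": "yellow", "yellow": "red"}

def step(state):
    return TRANSITIONS[state]

# Dynamic V&V (testing): run the system on selected inputs.
assert step("red") == "green"

# Static-flavoured V&V (exhaustive state exploration): verify an invariant
# over *every* reachable state, not just the tested ones.
def check_invariant(initial, invariant, max_steps=10):
    state, seen = initial, set()
    for _ in range(max_steps):
        if state in seen:
            return True                 # all reachable states explored
        seen.add(state)
        assert invariant(state), f"invariant violated in state {state!r}"
        state = step(state)
    return True

check_invariant("red", lambda s: s in TRANSITIONS)  # never an unknown state
```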
The focus of this book is on three influential cognitive motives: achievement, affiliation, and power motivation. Incentive-based theories of achievement, affiliation and power motivation are the basis for competence-seeking behaviour, relationship-building, leadership, and resource-controlling behaviour in humans. In this book we show how these motives can be modelled and embedded in artificial agents to achieve behavioural diversity. Theoretical issues are addressed for representing and embedding computational models of motivation in rule-based agents, learning agents, crowds and evolution of motivated agents. Practical issues are addressed for defining games, mini-games or in-game scenarios for virtual worlds in which computer-controlled, motivated agents can participate alongside human players. The book is structured into four parts: game playing in virtual worlds by humans and agents; comparing human and artificial motives; game scenarios for motivated agents; and evolution and the future of motivated game-playing agents. It will provide game programmers, and those with an interest in artificial intelligence, with the knowledge required to develop diverse, believable game-playing agents for virtual worlds.
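As a loose sketch of incentive-based action selection (not the book's actual models), the example below scores candidate game actions against an agent's motive profile; all motive weights and incentive values are invented for illustration.

```python
# Hypothetical sketch of incentive-based action selection: an agent weighs
# candidate actions by how well they satisfy its motive profile.

MOTIVES = ("achievement", "affiliation", "power")

# Incentive values each action offers per motive (illustrative numbers).
ACTIONS = {
    "enter_tournament": {"achievement": 0.9, "affiliation": 0.2, "power": 0.4},
    "join_guild":       {"achievement": 0.3, "affiliation": 0.9, "power": 0.3},
    "claim_territory":  {"achievement": 0.4, "affiliation": 0.1, "power": 0.9},
}

def choose_action(profile):
    """Pick the action whose incentives best match the agent's motives."""
    score = lambda a: sum(profile[m] * ACTIONS[a][m] for m in MOTIVES)
    return max(ACTIONS, key=score)

# Two agents with different motive profiles behave differently in the same game.
achiever = {"achievement": 1.0, "affiliation": 0.2, "power": 0.1}
socialiser = {"achievement": 0.2, "affiliation": 1.0, "power": 0.1}
print(choose_action(achiever), choose_action(socialiser))
# enter_tournament join_guild
```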
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to operations in every sector: information technology, telecommunications, energy, banking and finance, transportation systems, chemicals, agriculture and food, defense industrial base, public health and health care, national monuments and icons, drinking water and water treatment systems, commercial facilities, dams, emergency services, commercial nuclear reactors, materials and waste, postal and shipping, and government facilities. Global business and industry, governments, indeed society itself, cannot function if major components of the critical information infrastructure are degraded, disabled or destroyed. This book, Critical Infrastructure Protection III, is the third volume in the annual series produced by IFIP Working Group 11.10 on Critical Infrastructure Protection, an active international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts related to critical infrastructure protection. The book presents original research results and innovative applications in the area of infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This volume contains seventeen edited papers from the Third Annual IFIP Working Group 11.10 International Conference on Critical Infrastructure Protection, held at Dartmouth College, Hanover, New Hampshire, March 23-25, 2009. The papers were refereed by members of IFIP Working Group 11.10 and other internationally-recognized experts in critical infrastructure protection.
The latest work by the world's leading authorities on the use of formal methods in computer science is presented in this volume, based on the 1995 International Summer School in Marktoberdorf, Germany. Logic is of special importance in computer science, since it provides the basis for giving correct semantics of programs, for specification and verification of software, and for program synthesis. The lectures presented here provide the basic knowledge a researcher in this area should have and give excellent starting points for exploring the literature. Topics covered include semantics and category theory, machine-based theorem proving, logic programming, bounded arithmetic, proof theory, algebraic specifications and rewriting, algebraic algorithms, and type theory.
"Distributed Programming: Theory and Practice" presents a practical and rigorous method to develop distributed programs that correctly implement their specifications. The method also covers how to write specifications and how to use them. Numerous examples such as bounded buffers, distributed locks, message-passing services, and distributed termination detection illustrate the method. Larger examples include data transfer protocols, distributed shared memory, and TCP network sockets. "Distributed Programming: Theory and Practice" bridges the gap between books that focus on specific concurrent programming languages and books that focus on distributed algorithms. Programs are written in a "real-life" programming notation, along the lines of Java and Python with explicit instantiation of threads and programs.Students and programmers will see these as programs and not "merely" algorithms in pseudo-code. The programs implement interesting algorithms and solve problems that are large enough to serve as projects in programming classes and software engineering classes. Exercises and examples are included at the end of each chapter with on-line access to the solutions. "Distributed Programming: Theory and Practice "is designed as an advanced-level text book for students in computer science and electrical engineering. Programmers, software engineers and researchers working in this field will also find this book useful."
A principal aim of computer graphics is to generate images that look as real as photographs. Realistic computer graphics imagery has however proven to be quite challenging to produce, since the appearance of materials arises from complicated physical processes that are difficult to analytically model and simulate, and image-based modeling of real material samples is often impractical due to the high-dimensional space of appearance data that needs to be acquired. This book presents a general framework based on the inherent coherency in the appearance data of materials to make image-based appearance modeling more tractable. We observe that this coherence manifests itself as low-dimensional structure in the appearance data, and by identifying this structure we can take advantage of it to simplify the major processes in the appearance modeling pipeline. This framework consists of two key components, namely the coherence structure and the accompanying reconstruction method to fully recover the low-dimensional appearance data from sparse measurements. Our investigation of appearance coherency has led to three major forms of low-dimensional coherence structure and three types of coherency-based reconstruction upon which our framework is built. This coherence-based approach can be comprehensively applied to all the major elements of image-based appearance modeling, from data acquisition of real material samples to user-assisted modeling from a photograph, from synthesis of volumes to editing of material properties, and from efficient rendering algorithms to physical fabrication of objects. In this book we present several techniques built on this coherency framework to handle various appearance modeling tasks both for surface reflections and subsurface scattering, the two primary physical components that generate material appearance. We believe that coherency-based appearance modeling will make it easier and more feasible for practitioners to bring computer graphics imagery to life. This book is aimed towards readers with an interest in computer graphics. In particular, researchers, practitioners and students will benefit from this book by learning about the underlying coherence in appearance structure and how it can be utilized to improve appearance modeling. The specific techniques presented in our manuscript can be of value to anyone who wishes to elevate the realism of their computer graphics imagery. For understanding this book, an elementary background in computer graphics is assumed, such as from an introductory college course or from practical experience with computer graphics.
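A minimal numerical sketch of the low-dimensional structure idea: a data matrix generated from a few underlying factors is recovered almost exactly by a rank-k truncated SVD. The book's coherence structures and sparse-measurement reconstruction methods are far richer than this toy example.

```python
# Minimal numpy sketch of low-dimensional structure: a matrix built from
# few underlying factors is recovered almost exactly at rank k.
import numpy as np

rng = np.random.default_rng(0)
k = 3                                   # true intrinsic dimensionality
A = rng.standard_normal((200, k)) @ rng.standard_normal((k, 100))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = (U[:, :k] * s[:k]) @ Vt[:k]       # rank-k reconstruction

print(np.linalg.norm(A - A_k) / np.linalg.norm(A))   # ~1e-15: near-exact
```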
This volume of LNCSE is a collection of the papers from the proceedings of the third workshop on sparse grids and applications. Sparse grids are a popular approach for the numerical treatment of high-dimensional problems. Where classical numerical discretization schemes fail in more than three or four dimensions, sparse grids, in their different guises, are frequently the method of choice, be it spatially adaptive in the hierarchical basis or via the dimensionally adaptive combination technique. Demonstrating once again the importance of this numerical discretization scheme, the selected articles present recent advances on the numerical analysis of sparse grids as well as efficient data structures. The book also discusses a range of applications, including uncertainty quantification and plasma physics.
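To see why sparse grids postpone the curse of dimensionality, the sketch below counts interior points of a regular sparse grid, assuming the standard hierarchical construction (each level vector l with |l|_1 <= n + d - 1 contributes prod(2^(l_i - 1)) points), and compares them with a full tensor grid.

```python
# Interior-point counts: regular sparse grid vs. full tensor grid
# (standard hierarchical construction assumed).
from itertools import product

def full_grid_points(n, d):
    return (2**n - 1) ** d              # (2^n - 1) interior points per axis

def sparse_grid_points(n, d):
    # Sum hierarchical increments W_l with |l|_1 <= n + d - 1;
    # each W_l contributes prod(2^(l_i - 1)) points.
    total = 0
    for levels in product(range(1, n + 1), repeat=d):
        if sum(levels) <= n + d - 1:
            pts = 1
            for l in levels:
                pts *= 2 ** (l - 1)
            total += pts
    return total

for d in (2, 3, 5):
    print(d, full_grid_points(6, d), sparse_grid_points(6, d))
# The full grid explodes exponentially in d; the sparse grid does not.
```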
The development of software systems with an acceptable level of reliability and quality, within the available time frame and budget, is a challenging objective. This objective can be achieved to some extent through early prediction of the number of faults present in the software, which reduces the cost of development by providing an opportunity to make corrections early in the development process. The book presents an early software reliability prediction model that helps to grow the reliability of software systems by monitoring them in each development phase, i.e. from the requirements phase to the testing phase. Different approaches to tackling this challenging issue are discussed in this book. An important approach presented here is a model that classifies modules into two categories: (a) fault-prone and (b) not fault-prone. The methods presented in this book for assessing the expected number of faults present in the software, assessing the expected number of faults present at the end of each phase, and classifying software modules into the fault-prone or not-fault-prone category are easy to understand, develop and use for any practitioner. Practitioners are expected to gain more information about their development process and product reliability, which can help to optimize the resources used.
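As a bare-bones sketch of the fault-prone classification idea (not the book's model), the example below fits a logistic regression on invented per-module metrics using scikit-learn.

```python
# Hypothetical sketch of fault-prone / not-fault-prone classification
# with scikit-learn; metrics and numbers are made up for illustration.
from sklearn.linear_model import LogisticRegression

# Per-module metrics, e.g. [lines of code, cyclomatic complexity, changes].
X = [[120, 4, 1], [950, 27, 9], [300, 8, 2], [1400, 35, 14], [80, 2, 0]]
y = [0, 1, 0, 1, 0]          # 1 = fault-prone, 0 = not fault-prone

clf = LogisticRegression(max_iter=1000).fit(X, y)
new_module = [[700, 22, 6]]
print(clf.predict(new_module))   # e.g. [1] -> flag the module for extra review
```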
This book focuses on next-generation optical networks as well as mobile communication technologies. The reader will find chapters on Cognitive Optical Networks, 5G Cognitive Wireless, LTE, Data Analysis and Natural Language Processing. It also presents a comprehensive view of the enhancements and requirements foreseen for Machine Type Communication. Moreover, some data analysis techniques and Brazilian Portuguese natural language processing technologies are also described.
This book introduces readers to genetic algorithms (GAs) with an emphasis on making the concepts, algorithms, and applications discussed as easy to understand as possible. Further, it avoids a great deal of formalism and thus opens the subject to a broader audience than texts overloaded with notation and equations. The book is divided into three parts, the first of which provides an introduction to GAs, starting with basic concepts like evolutionary operators and continuing with an overview of strategies for tuning and controlling parameters. In turn, the second part focuses on solution space variants like multimodal, constrained, and multi-objective solution spaces. Lastly, the third part briefly introduces theoretical tools for GAs, the intersections and hybridizations with machine learning, and highlights selected promising applications.
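A minimal genetic algorithm on the classic OneMax problem (maximize the number of 1-bits) illustrates the evolutionary operators the first part introduces; all parameter settings are illustrative.

```python
# Minimal GA on OneMax, showing selection, crossover, and mutation.
import random

random.seed(1)
LENGTH, POP, GENS, MUT = 30, 40, 60, 0.02

def fitness(bits):
    return sum(bits)

def tournament(pop):                       # selection operator
    return max(random.sample(pop, 3), key=fitness)

def crossover(a, b):                       # one-point crossover
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

def mutate(bits):                          # bit-flip mutation
    return [1 - b if random.random() < MUT else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for gen in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]

print(max(fitness(ind) for ind in pop))    # approaches LENGTH (30)
```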
This book provides comprehensive coverage of the latest trends and advances in subjective and objective quality evaluation for traditional visual signals, such as 2D images and video, as well as the most recent challenges for the field of multimedia quality assessment and processing, such as mobile video and social media. Readers will learn how to ensure the highest storage/delivery/transmission quality of visual content (including image, video, graphics, animation, etc.) from the server to the consumer, under resource constraints such as computation, bandwidth, storage space, battery life, etc.
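As one small, concrete example of an objective quality metric of the kind such a book covers, the sketch below computes the peak signal-to-noise ratio (PSNR) between a reference image and a distorted copy.

```python
# PSNR, a standard objective quality metric for images and video frames.
import numpy as np

def psnr(reference, distorted, peak=255.0):
    mse = np.mean((reference.astype(np.float64)
                   - distorted.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in image
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(round(psnr(ref, noisy), 2))      # higher dB = closer to the reference
```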
This book examines how and why collaborative quality assurance techniques, particularly pair programming and peer code review, affect group cognition and software quality in agile software development teams. Prior research on these extremely popular but also costly techniques has focused on isolated pairs of developers and ignored the fact that they are typically applied in larger, enduring teams. This book is one of the first studies to investigate how these techniques depend on and influence the joint cognitive accomplishments of entire development teams rather than individuals. It employs theories on transactive memory systems and functional affordances to provide answers based on empirical research. The mixed-methods research presented includes several in-depth case studies and survey results from more than 500 software developers, team leaders, and product managers in 81 software development teams. The book's findings will advance IS research and have explicit implications for developers of code review tools, information systems development teams, and software development managers.
In his rich and varied career as a mathematician, computer scientist, and educator, Jacob T. Schwartz wrote seminal works in analysis, mathematical economics, programming languages, algorithmics, and computational geometry. In this volume of essays, his friends, students, and collaborators at the Courant Institute of Mathematical Sciences present recent results in some of the fields that Schwartz explored: quantum theory, the theory and practice of programming, program correctness and decision procedures, dexterous manipulation in robotics, motion planning, and genomics. In addition to presenting recent results in these fields, these essays illuminate the astonishingly productive trajectory of a brilliant and original scientist and thinker.
Software Life Cycle Models. Object-oriented Concepts and Modeling. Formal Specification and Verification. Design Methodologies and Specifications. Programming and Coding. Programming Tools. Declarative Programming. Automatic Program Synthesis and Reuse. Program Verification and Testing. Software Maintenance. Advanced Programming Environments. Other Selected Topics. Index.
This monograph describes the latest advances in discriminative learning methods for biometric recognition. Specifically, it focuses on three representative categories of methods: sparse representation-based classification, metric learning, and discriminative feature representation, together with their applications in palmprint authentication, face recognition and multi-biometrics. The ideas, algorithms, experimental evaluation and underlying rationales are also provided for a better understanding of these methods. Lastly, it discusses several promising research directions in the field of discriminative biometric recognition.
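A minimal sketch of sparse representation-based classification (SRC) on synthetic data: a test sample is sparsely coded over a dictionary of training samples, and the class whose atoms yield the smallest reconstruction residual wins. The regularization strength and data are illustrative only.

```python
# Minimal SRC sketch: sparse-code a test sample over a training dictionary,
# then classify by per-class reconstruction residual.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
d, per_class = 50, 20
# Synthetic dictionary: columns are training samples from two classes.
class0 = rng.standard_normal((d, 1)) + 0.1 * rng.standard_normal((d, per_class))
class1 = rng.standard_normal((d, 1)) + 0.1 * rng.standard_normal((d, per_class))
D = np.hstack([class0, class1])
labels = np.array([0] * per_class + [1] * per_class)

y = class1[:, 0] + 0.05 * rng.standard_normal(d)   # test sample from class 1

coef = Lasso(alpha=0.01, max_iter=10000).fit(D, y).coef_   # sparse code
residuals = [np.linalg.norm(y - D[:, labels == c] @ coef[labels == c])
             for c in (0, 1)]
print(int(np.argmin(residuals)))                   # expected: 1
```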
Collaboration among individuals - from users to developers - is central to modern software engineering. It takes many forms: joint activity to solve common problems, negotiation to resolve conflicts, creation of shared definitions, and both social and technical perspectives impacting all software development activity. The difficulties of collaboration are also well documented. The grand challenge is not only to ensure that developers in a team deliver effectively as individuals, but that the whole team delivers more than just the sum of its parts. The editors of this book have assembled an impressive selection of authors, who have contributed to an authoritative body of work tackling a wide range of issues in the field of collaborative software engineering. The resulting volume is divided into four parts, preceded by a general editorial chapter providing a more detailed review of the domain of collaborative software engineering. Part 1 is on "Characterizing Collaborative Software Engineering," Part 2 examines various "Tools and Techniques," Part 3 addresses organizational issues, and finally Part 4 contains four examples of "Emerging Issues in Collaborative Software Engineering." As a result, this book delivers a comprehensive state-of-the-art overview and empirical results for researchers in academia and industry in areas like software process management, empirical software engineering, and global software development. Practitioners working in this area will also appreciate the detailed descriptions and reports which can often be used as guidelines to improve their daily work.