
Estimation of Distribution Algorithms - A New Tool for Evolutionary Computation (Paperback, Softcover reprint of the original 1st ed. 2002)
Pedro Larranaga, Jose A. Lozano
R5,177 Discovery Miles 51 770 Ships in 18 - 22 working days

Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation is devoted to a new paradigm for evolutionary computation, named estimation of distribution algorithms (EDAs). This new class of algorithms generalizes genetic algorithms by replacing the crossover and mutation operators with learning and sampling from the probability distribution of the best individuals of the population at each iteration of the algorithm. Working in such a way, the relationships between the variables involved in the problem domain are explicitly and effectively captured and exploited. This text constitutes the first compilation and review of the techniques and applications of this new tool for performing evolutionary computation. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation is clearly divided into three parts. Part I is dedicated to the foundations of EDAs. In this part, after introducing some probabilistic graphical models - Bayesian and Gaussian networks - a review of existing EDA approaches is presented, as well as some new methods based on more flexible probabilistic graphical models. A mathematical modeling of discrete EDAs is also presented. Part II covers several applications of EDAs in some classical optimization problems: the travelling salesman problem, the job scheduling problem, and the knapsack problem. EDAs are also applied to the optimization of some well-known combinatorial and continuous functions. Part III presents the application of EDAs to solve some problems that arise in the machine learning field: feature subset selection, feature weighting in K-NN classifiers, rule induction, partial abductive inference in Bayesian networks, partitional clustering, and the search for optimal weights in artificial neural networks. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation is a useful and interesting tool for researchers working in the field of evolutionary computation and for engineers who face real-world optimization problems. This book may also be used by graduate students and researchers in computer science. '...I urge those who are interested in EDAs to study this well-crafted book today.' David E. Goldberg, University of Illinois at Urbana-Champaign.
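The mechanism this description names - fit a probability distribution to the selected best individuals, then sample the next population from it - is easiest to see in the simplest EDA, the univariate marginal distribution algorithm (UMDA). The sketch below is a generic illustration, not code from the book; the onemax fitness function and all parameter values are assumptions chosen for the example.

```python
import random

def onemax(bits):
    """Toy fitness: number of 1-bits (assumed example problem)."""
    return sum(bits)

def umda(n_bits=20, pop_size=100, n_select=30, generations=50):
    p = [0.5] * n_bits  # p[i] = estimated P(bit i == 1), initially uniform
    for _ in range(generations):
        # Sample a population from the current distribution.
        pop = [[1 if random.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        # Select the best individuals and re-estimate the marginals from them.
        # This learning/sampling step replaces GA crossover and mutation.
        best = sorted(pop, key=onemax, reverse=True)[:n_select]
        p = [sum(ind[i] for ind in best) / n_select for i in range(n_bits)]
    return p

print(umda())  # the marginals drift towards 1.0 on this toy problem
```

UMDA treats the variables as independent; the EDAs built on Bayesian and Gaussian networks in Part I generalize exactly this re-estimation step so that dependencies between variables are captured as well.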

Computer Applications for Software Engineering, Disaster Recovery, and Business Continuity - International Conferences, ASEA and DRBC 2012, Held in Conjunction with GST 2012, Jeju Island, Korea, November 28-December 2, 2012. Proceedings (Paperback, 2012 ed.)
Tai-Hoon Kim, Carlos Ramos, Haeng-kon Kim, Akingbehin Kiumi, Sabah Mohammed, …
R1,462 Discovery Miles 14 620 Ships in 18 - 22 working days

This book comprises the refereed proceedings of the International Conferences, ASEA and DRBC 2012, held in conjunction with GST 2012 on Jeju Island, Korea, in November/December 2012. The papers presented were carefully reviewed and selected from numerous submissions and focus on the various aspects of advanced software engineering and its applications, and disaster recovery and business continuity.

Logics for Databases and Information Systems (Paperback, Softcover reprint of the original 1st ed. 1998)
Jan Chomicki, Gunter Saake
R5,185 Discovery Miles 51 850 Ships in 18 - 22 working days

Time is ubiquitous in information systems. Almost every enterprise faces the problem of its data becoming out of date. However, such data is often valuable, so it should be archived and some means to access it should be provided. Also, some data may be inherently historical, e.g., medical, cadastral, or judicial records. Temporal databases provide a uniform and systematic way of dealing with historical data. Many languages have been proposed for temporal databases, among others temporal logic. Temporal logic combines abstract, formal semantics with amenability to efficient implementation. This chapter shows how temporal logic can be used in temporal database applications. Rather than presenting new results, we report on recent developments and survey the field in a systematic way using a unified formal framework [GHR94; Cho94]. The handbook [GHR94] is a comprehensive reference on the mathematical foundations of temporal logic. In this chapter we study how temporal logic is used as a query and integrity constraint language. Consequently, model-theoretic notions, particularly formula satisfaction, are of primary interest. Axiomatic systems and proof methods for temporal logic [GHR94] have so far found relatively few applications in the context of information systems. Moreover, one needs to bear in mind that for the standard linearly-ordered time domains temporal logic is not recursively axiomatizable [GHR94], so recursive axiomatizations are by necessity incomplete.
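As a flavour of temporal logic used as an integrity constraint language, consider a standard textbook-style constraint (an assumed illustration, not a formula quoted from the chapter): salaries never decrease across consecutive database states.

```latex
% "Salaries never decrease": in every state, if employee e earns s now
% and s' in the next state, then s <= s'. (\Box = always, \bigcirc = next.)
\[
  \Box\, \forall e\, \forall s\, \forall s'\,
    \bigl( \mathit{salary}(e,s) \wedge \bigcirc \mathit{salary}(e,s')
           \rightarrow s \le s' \bigr)
\]
```

Checking such a constraint against a database history is a question of formula satisfaction (does the history satisfy the formula?), which is why the model-theoretic notions mentioned above, rather than proof theory, are the primary interest.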

VHDL'92 - The New Features of the VHDL Hardware Description Language (Paperback, Softcover reprint of the original 1st ed. 1993)
Jean-Michel Berge, Alain Fonkoua, Serge Maginot, Jacques Rouillard
R1,392 Discovery Miles 13 920 Ships in 18 - 22 working days

An open process of restandardization, conducted by the IEEE, has led to the definition of the new VHDL standard. The changes make VHDL safer, more portable, and more powerful; VHDL also becomes bigger and more complete. The canonical simulator of VHDL is enriched by new mechanisms, the predefined environment is more complete, and the syntax is more regular and flexible. Discrepancies and known bugs of VHDL'87 have been fixed. However, the new VHDL'92 is compatible with VHDL'87, with some minor exceptions. This book presents the new VHDL'92 for the VHDL designer. New features are explained and classified. Examples are provided, each new feature is given a rationale, and its impact on design methodology and performance is analysed. Where appropriate, pitfalls and traps are explained. The VHDL designer will quickly be able to find the feature needed, evaluate the benefits it brings, and modify previous VHDL'87 code to make it more efficient, more portable, and more flexible. VHDL'92 is the essential update for all VHDL designers and managers involved in electronic design.

Games and Full Abstraction for a Functional Metalanguage with Recursive Types (Paperback, Softcover reprint of the original 1st ed. 1998)
Guy McCusker
R1,384 Discovery Miles 13 840 Ships in 18 - 22 working days

This book is a minor revision of the thesis submitted in August 1996; no major changes have been made. However, I would like to take this opportunity to mention that since the thesis was written, discoveries have been made which would allow a substantial simplification and strengthening of the results in Chapters 3 and 6. In particular, it is now possible to model sums correctly in the category I as well as in GBP, which means that the definability results of Chapter 6 can be stated and proved at the intensional level, making them simpler and much closer in spirit to the original proofs of Abramsky, Jagadeesan, Malacaria, Hyland, Ong and Nickau [10,61,79]. This also leads quite straightforwardly to an understanding of call-by-value languages. Details of these improvements can be found in [14,73]. It is also worth mentioning that progress has been made on some of the topics suggested for future research in Chapter 7. In particular, fully abstract models have been found for various kinds of languages with local variables [8,13-16], and a fully complete games model of the polymorphic language System F has been constructed by Hughes [59]. Guy McCusker February 1998 Acknowledgements First of all, I must thank my supervisor, Samson Abramsky. It was he who first introduced me to game semantics and suggested avenues of research in the area; this book would certainly not exist were it not for him.

TeX Reference Manual (Paperback, Softcover reprint of the original 1st ed. 2002)
David Bausum
R4,041 Discovery Miles 40 410 Ships in 18 - 22 working days

Introduction, or Why I Wrote This Book: In the fall of 1997 a dedicated troff user e-mailed me the macros he used to typeset his books. I took one look inside his file and thought, "I can do this; it's just code." As an experiment I spent a week and wrote a C program and troff macros which formatted and typeset a membership directory for a scholarly society with approximately 2,000 members. When I was done, I could enter two commands, and my program and troff would convert raw membership data into 200 pages of PostScript in 35 seconds. Previously, it had taken me several days to prepare camera-ready copy for the directory using a word processor. For completeness I sat down and tried to write TeX macros for the typesetting. I failed. Although ninety-five percent of my macros worked, I was unable to prepare the columns the project required. As my frustration grew, I began this book - mentally, in my head - as an answer to the question, "Why is TeX so hard to learn?" Why use TeX? Lest you accuse me of the old horse and cart problem, I should address the question, "Why use TeX at all?" before I explain why TeX is hard. I use TeX for the following reasons: it is stable, fast, free, and it uses ASCII. Of course, the most important reason is: TeX does a fantastic job. By stable, I mean it is not likely to change in the next 10 years (much less the next one or two), and it is free of bugs. Both of these are important.

Programming and Meta-Programming in Scheme (Paperback, Softcover reprint of the original 1st ed. 1998)
Jon Pearce
R1,438 Discovery Miles 14 380 Ships in 18 - 22 working days

A comprehensive first course in Scheme, covering all of its major features: abstraction, functional programming, data types, recursion, and semantic programming. Although the primary goal is to teach students to program in Scheme, the book is suitable for anyone taking a general programming principles course. Each chapter is divided into three sections: core, appendix, and problems. Most essential topics are covered in the core section, but it is assumed that most students will read the appendices and solve most of the problems - all of which require short Scheme procedures. As well as providing a thorough grounding in Scheme, the author discusses different programming paradigms in depth. An important theme throughout is that of "meta-programming", thus providing an insight into topics such as type-checking and overloading which might otherwise be missed.

Advanced Relational Programming (Paperback, Softcover reprint of the original 1st ed. 1996)
F. Cacace, G. Lamperti
R2,691 Discovery Miles 26 910 Ships in 18 - 22 working days

Database programming is the process of developing data-intensive applications which demand access to large amounts of structured, persistent data. The primary tool required for implementing such applications is a database programming language, namely a formal language which is specialized in the definition and manipulation of relevant large-scale data. As such, a database programming language is expected to provide high-level data modeling capabilities as well as a variety of constructs which facilitate the handling of the specified data. In this perspective, the aim of this book is: (i) to present the recent advances in database technology from the viewpoint of the novel database paradigms proposed for the development of advanced, non-standard, data-intensive applications, (ii) to focus specifically on the relational approach, with considerable emphasis on the extensions proposed in the last decade, and (iii) to describe the extended relational database language Algres, which is primarily the outcome of research work conducted by the authors in cooperation with a large number of other colleagues and students. Furthermore, in order to put the concepts presented in the book into practice, the reader is invited to experiment with the Algres system, a free copy of which can be requested from Kluwer Academic Publishers, or directly from the authors. Depending on the specific interest and background of the reader, the book can serve either: (1) to overview recent trends in databases, (2) to introduce in more detail the concepts and theory of the nested relational model, or (3) to present a complete advanced relational language which can be freely used for experimental purposes within academic and research frameworks.

Formal Specification Techniques for Engineering Modular C Programs (Paperback, Softcover reprint of the original 1st ed. 1996)
Tan Yang Meng
R3,995 Discovery Miles 39 950 Ships in 18 - 22 working days

Software is difficult to develop, maintain, and reuse. Two factors that contribute to this difficulty are the lack of modular design and good program documentation. The first makes software changes more difficult to implement. The second makes programs more difficult to understand and to maintain. Formal Specification Techniques for Engineering Modular C Programs describes a novel approach to promoting program modularity. The book presents a formal specification language that promotes software modularity through the use of abstract data types, even though the underlying programming language may not have such support. This language is structured to allow useful information to be extracted from a specification, which is then used to perform consistency checks between the specification and its implementation. Formal Specification Techniques for Engineering Modular C Programs also describes a specification-driven, software re-engineering process model for improving existing programs. The aim of this process is to make existing programs easier to maintain and reuse while keeping their essential functionalities unchanged. Audience: Suitable as a secondary text for graduate level courses in software engineering, and as a reference for researchers and practitioners in industry.

The Field Programming Environment: A Friendly Integrated Environment for Learning and Development (Paperback, Softcover reprint of the original 1st ed. 1995)
Steven P. Reiss
R4,017 Discovery Miles 40 170 Ships in 18 - 22 working days

FIELD has been a remarkably successful research project. The ideas first exhibited in the environment now form the basis for most of the current generation of programming environments, including Hewlett-Packard's Softbench, DEC's FUSE, Sun's Tooltalk, Lucid's Energize, and SGI's Codevision. FIELD pioneered the notion of broadcast messaging as a basis for tool integration. Moreover, many of the other tool concepts introduced in FIELD have made their way into these environments. Thus in discussing the FIELD environment, this book actually explains the inner workings of today's programming environments. The book will be valuable for those interested in the development of programming tools and environments, as well as serious users of programming environments. It will also be of interest to anyone undertaking a large software project, both by introducing the software tools needed to work on such a project and by demonstrating the concepts of message-based integration which can be applied to a variety of domains.
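The broadcast-messaging idea credited to FIELD here (tools announce events on a central bus, and every tool whose registered pattern matches reacts) can be sketched in a few lines. This is a generic illustration of selective broadcast, not FIELD's actual message protocol; the pattern syntax and the tool behaviours are invented for the example.

```python
import fnmatch

class MessageBus:
    """Toy selective-broadcast bus: every subscriber whose pattern
    matches an announced message gets a callback."""
    def __init__(self):
        self.subscribers = []  # (glob pattern, callback) pairs

    def subscribe(self, pattern, callback):
        self.subscribers.append((pattern, callback))

    def broadcast(self, message):
        for pattern, callback in self.subscribers:
            if fnmatch.fnmatch(message, pattern):
                callback(message)

bus = MessageBus()
# Two hypothetical tools register interest in breakpoint events.
bus.subscribe("breakpoint *", lambda m: print("debugger reacts:", m))
bus.subscribe("breakpoint set *", lambda m: print("editor marks line:", m))
bus.broadcast("breakpoint set main.c:42")  # both tools fire; the sender knows neither
```

The point of the design is decoupling: the announcing tool does not know which other tools exist, which is what lets debuggers, editors, and build tools be integrated without pairwise wiring.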

Quality of Communication-Based Systems - Proceedings of an International Workshop held at the TU Berlin, Germany, September 1994 (Paperback, Softcover reprint of the original 1st ed. 1995)
Gunter Hommel
R2,632 Discovery Miles 26 320 Ships in 18 - 22 working days

Quality of Communication-Based Systems presents the research results of students of the Graduiertenkolleg 'Communication-Based Systems' to an international community. To stimulate the scientific discussion, renowned experts have been invited to give their views on the research areas:
  • Formal specification and mathematical foundations of distributed systems using process algebra, graph transformations, process calculi and temporal logics
  • Performance evaluation, dependability modelling and analysis of real-time systems with different kinds of timed Petri nets
  • Specification and analysis of communication protocols
  • Reliability, security and dependability in distributed systems
  • Object orientation in distributed systems architecture
  • Software development and concepts for distributed applications
  • Computer network architecture and management
  • Language concepts for distributed systems

Tools and Environments for Parallel and Distributed Systems (Paperback, Softcover reprint of the original 1st ed. 1996)
Amr Zaky, Ted Lewis
R4,018 Discovery Miles 40 180 Ships in 18 - 22 working days

Developing correct and efficient software is far more complex for parallel and distributed systems than it is for sequential processors. Some of the reasons for this added complexity are: the lack of a universally acceptable parallel and distributed programming paradigm, the criticality of achieving high performance, and the difficulty of writing correct parallel and distributed programs. These factors collectively shape the current state of tool-development efforts for parallel and distributed software. Tools and Environments for Parallel and Distributed Systems addresses the above issues by describing working tools and environments, and gives a solid overview of some of the fundamental research being done worldwide. Topics covered in this collection are: mainstream program development tools; performance prediction tools and studies; debugging tools and research; and nontraditional tools. Audience: Suitable as a secondary text for graduate level courses in software engineering and parallel and distributed systems, and as a reference for researchers and practitioners in industry.

Scientific Data Analysis using Jython Scripting and Java (Paperback, 2010 ed.)
Sergei V. Chekanov
R1,453 Discovery Miles 14 530 Ships in 18 - 22 working days

Scientific Data Analysis using Jython Scripting and Java presents practical approaches for data analysis using Java scripting based on Jython, a Java implementation of the Python language. The chapters essentially cover all aspects of data analysis, from arrays and histograms to clustering analysis, curve fitting, metadata and neural networks. A comprehensive coverage of data visualisation tools implemented in Java is also included. Written by the primary developer of the jHepWork data-analysis framework, the book provides a reliable and complete reference source laying the foundation for data-analysis applications using Java scripting. More than 250 code snippets (of around 10-20 lines each) written in Jython and Java, plus several real-life examples help the reader develop a genuine feeling for data analysis techniques and their programming implementation. This is the first data-analysis and data-mining book which is completely based on the Jython language, and opens doors to scripting using a fully multi-platform and multi-threaded approach. Graduate students and researchers will benefit from the information presented in this book.

Reversible Grammar in Natural Language Processing (Paperback, Softcover reprint of the original 1st ed. 1994)
T. Strzalkowski
R5,199 Discovery Miles 51 990 Ships in 18 - 22 working days

Reversible grammar allows computational models to be built that are equally well suited for the analysis and generation of natural language utterances. This task can be viewed from very different perspectives by theoretical and computational linguists and computer scientists. The papers in this volume present a broad range of approaches to reversible, bi-directional, and non-directional grammar systems that have emerged in recent years. This is also the first collection entirely devoted to the problems of reversibility in natural language processing. Most papers collected in this volume are derived from presentations at a workshop held at the University of California at Berkeley in the summer of 1991, organised under the auspices of the Association for Computational Linguistics. This book will be a valuable reference for researchers in linguistics and computer science with interests in computational linguistics, natural language processing, and machine translation, as well as in practical aspects of computability.

Network and Parallel Computing - 9th IFIP International Conference, NPC 2012, Gwangju, Korea, September 6-8, 2012, Proceedings (Paperback, 2012 ed.)
James J. Park, Albert Y. Zomaya, Sang-Soo Yeo, Sartaj Sahni
R2,753 Discovery Miles 27 530 Ships in 18 - 22 working days

This book constitutes the refereed post-proceedings of the 9th IFIP International Conference on Network and Parallel Computing, NPC 2012, held in Gwangju, Korea, in September 2012. The 38 papers presented were carefully reviewed and selected from 136 submissions. The papers are organized in the following topical sections: algorithms, scheduling, analysis, and data mining; network architecture and protocol design; network security; parallel, distributed, and virtualization techniques; performance modeling, prediction, and tuning; resource management; ubiquitous communications and networks; and web, communication, and cloud computing. In addition, a total of 37 papers selected from five satellite workshops (ATIMCN, ATSME, Cloud&Grid, DATICS, and UMAS 2012) are included.

Constructing Predictable Real Time Systems (Paperback, Softcover reprint of the original 1st ed. 1991)
Alexander D. Stoyenko
R4,025 Discovery Miles 40 250 Ships in 18 - 22 working days

Foreword: In nature, real-time systems have been evolving for a few hundred million years. Animal nervous systems have the task of issuing control commands to the active organs in response to messages from the environment; conditioned reflexes, for example, play an important role here. Perhaps the emergence of man can be dated roughly to the time when his gradually developing brain formed thoughts whose significance reached, in a forward-planning way, beyond the situation immediately at hand. Among other things, this eventually led to today's scientist, who builds his theories and systems on the basis of lengthy deliberation. The development of computers essentially took the opposite path. At first they served only to execute 'rigid' programs, such as the first program-controlled computing device, the Z3, which the undersigned was able to demonstrate in 1941. There followed, among other things, a special device for measuring wings, which may be regarded as the first process-control computer. About forty dial gauges working as analog-to-digital converters were read off by the automatic computer and processed as variables within a program. But even that still happened in a rigid order. True process control, today also known as real-time systems, requires reacting to constantly changing situations.

Intelligent Image Processing in Prolog (Paperback, Softcover reprint of the original 1st ed. 1991)
Bruce G. Batchelor
R4,052 Discovery Miles 40 520 Ships in 18 - 22 working days

After a slow and somewhat tentative beginning, machine vision systems are now finding widespread use in industry. So far, there have been four clearly discernible phases in their development, based upon the types of images processed and how that processing is performed: (1) binary (two-level) images, processed in software; (2) grey-scale images, processed in software; (3) binary or grey-scale images processed in fast, special-purpose hardware; (4) coloured/multi-spectral images. Third-generation vision systems are now commonplace, although a large number of binary and software-based grey-scale processing systems are still being sold. At the moment, colour image processing is commercially much less significant than the other three, and this situation may well remain for some time, since many industrial artifacts are nearly monochrome and the use of colour increases the cost of the equipment significantly. A great deal of colour image processing is a straightforward extension of standard grey-scale methods. Industrial applications of machine vision systems can also be subdivided, this time into two main areas, which have largely retained distinct identities: (i) Automated Visual Inspection (AVI) and (ii) Robot Vision (RV). This book is about a fifth generation of industrial vision systems, in which this distinction, based on applications, is blurred and the processing is marked by being much smarter (i.e. more "intelligent") than in the other four generations.

Software Prototyping in Data and Knowledge Engineering (Paperback, Softcover reprint of the original 1st ed. 1999)
G. Guida, G. Lamperti, Marina Zanella
R2,695 Discovery Miles 26 950 Ships in 18 - 22 working days

This monograph describes an innovative prototyping framework for data and knowledge intensive systems. The proposed approach will prove especially useful for advanced and research-oriented projects that aim to develop a traditional database perspective into fully-fledged advanced database approaches and knowledge engineering technologies. The book is organised in two parts. The first part, comprising chapters 1 to 4, provides an introduction to the concept of prototyping, to database and knowledge-based technologies, and to the main issues involved in the integration of data and knowledge engineering. The second part, comprising chapters 5 to 12, illustrates the proposed approach in technical detail. Audience: This volume will be of interest to researchers in the field of databases and knowledge engineering in general, and for software designers and knowledge engineers who aim to expand their expertise in data and knowledge intensive systems.

Compiler Technology - Tools, Translators and Language Implementation (Paperback, Softcover reprint of the original 1st ed. 1997)
Derek Beng Kee Kiong
R3,995 Discovery Miles 39 950 Ships in 18 - 22 working days

Compiler technology is fundamental to computer science since it provides the means to implement many other tools. It is interesting that, in fact, many tools have a compiler framework - they accept input in a particular format, perform some processing and present output in another format. Such tools support the abstraction process and are crucial to productive systems development. The focus of Compiler Technology: Tools, Translators and Language Implementation is to enable quick development of analysis tools. Both lexical scanner and parser generator tools are provided as supplements to this book, since a hands-on approach to experimentation with a toy implementation aids in understanding abstract topics such as parse-trees and parse conflicts. Furthermore, it is through hands-on exercises that one discovers the particular intricacies of language implementation. Compiler Technology: Tools, Translators and Language Implementation is suitable as a textbook for an undergraduate or graduate level course on compiler technology, and as a reference for researchers and practitioners interested in compilers and language implementation.

Multiprocessor Execution of Logic Programs (Paperback, Softcover reprint of the original 1st ed. 1994)
Gopal Gupta
R4,003 Discovery Miles 40 030 Ships in 18 - 22 working days

Multiprocessor Execution of Logic Programs addresses the problem of efficient implementation of logic programming languages, specifically Prolog, on multiprocessor architectures. The approaches and implementations developed attempt to take full advantage of sequential implementation technology developed for Prolog (such as the WAM) while exploiting all forms of control parallelism present in logic programs, namely, or-parallelism, independent and-parallelism and dependent and-parallelism. Coverage includes a thorough survey of parallel implementation techniques and parallel systems developed for Prolog. Multiprocessor Execution of Logic Programs is recommended for people implementing parallel logic programming systems, parallel symbolic systems, parallel AI systems, and parallel theorem proving systems. It will also be useful to people who wish to learn about the implementation of parallel logic programming systems.

Objects for Concurrent Constraint Programming (Paperback, Softcover reprint of the original 1st ed. 1998)
Martin Henz
R5,796 Discovery Miles 57 960 Ships in 18 - 22 working days

Concurrent constraint programming (ccp) is a recent development in programming language design. Its central contribution is the notion of partial information provided by a shared constraint store. This constraint store serves as a communication medium between concurrent threads of control and as a vehicle for their synchronization. Objects for Concurrent Constraint Programming analyzes the possibility of supporting object-oriented programming in ccp. Starting from established approaches, the book covers various object models and discusses their properties. Small Oz, a sublanguage of the ccp language Oz, is used as a model language for this analysis. This book presents a general-purpose object system for Small Oz and describes its implementation and expressivity for concurrent computation. Objects for Concurrent Constraint Programming is written for programming language researchers with an interest in programming language aspects of concurrency, object-oriented programming, or constraint programming. Programming language implementors will benefit from the rigorous treatment of the efficient implementation of Small Oz. Oz programmers will get a first-hand view of the design decisions that lie behind the Oz object system.
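The store's double role described above (communication medium and synchronization vehicle) shows up already in the simplest case: a single-assignment dataflow variable, where a reader blocks until some writer adds the binding to the store. The sketch below is a minimal Python model of that idea, assuming nothing about Oz syntax or the Small Oz implementation.

```python
import threading

class DataflowVar:
    """Single-assignment variable: a (very) reduced model of one
    constraint-store entry in concurrent constraint programming."""
    _UNBOUND = object()

    def __init__(self):
        self._value = DataflowVar._UNBOUND
        self._cond = threading.Condition()

    def bind(self, value):
        """Tell the store x = value; wakes all threads waiting on x."""
        with self._cond:
            if self._value is not DataflowVar._UNBOUND:
                raise ValueError("already bound")
            self._value = value
            self._cond.notify_all()

    def wait_value(self):
        """Ask the store for x; blocks until the binding is present."""
        with self._cond:
            while self._value is DataflowVar._UNBOUND:
                self._cond.wait()
            return self._value

x = DataflowVar()
t = threading.Thread(target=lambda: print("consumer saw", x.wait_value()))
t.start()   # the consumer blocks: the store holds only partial information
x.bind(42)  # adding the binding both communicates 42 and releases the consumer
t.join()
```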

Flex & Bison (Paperback)
John Levine
R679 R592 Discovery Miles 5 920 Save R87 (13%) Ships in 9 - 17 working days

If you need to parse or process text data in Linux or Unix, this classic book explains how to use flex and bison to solve your problems quickly - whether you're interpreting code, configuration files, or any other structured format. "Flex and Bison" is the long-awaited sequel to the classic O'Reilly book, "Lex and Yacc". In the nearly two decades since that book was published, the "Flex and Bison" utilities have proven to be more reliable and more powerful than the original Unix tools. This book covers the same core functionality vital to Linux and Unix program development, along with several important new topics. This thoroughly updated edition will help you: address syntax crunching that regular expression tools can't handle; build compilers and interpreters, and handle a wide range of text processing functions; learn key programming techniques, including syntax trees and symbol tables; implement a full SQL grammar, with complete sample code; and use new features such as pure (reentrant) lexers and parsers, powerful GLR parsers, and interfaces to C++. This book includes revised tutorial sections for novice users and reference sections for advanced users, with chapters that explain each utility's basic usage and simple, stand-alone applications. Dive into "Flex and Bison" and discover the wide range of uses these flexible tools provide.
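The claim above (regular expression tools alone cannot handle certain syntax, while a lexer/parser pair can) is easy to demonstrate. Since this page has no single code language, the sketch below plays both roles in plain Python: a regex tokenizer standing in for flex and a recursive-descent parser standing in for a bison grammar. The toy arithmetic grammar is an invented example, unrelated to the book's SQL grammar.

```python
import re

# Lexer (flex's role): chop the input into tokens with a regular expression.
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(text):
    for number, op in TOKEN.findall(text):
        yield ("NUM", int(number)) if number else ("OP", op)

# Parser (bison's role), for the grammar
#   expr := term (('+' | '-') term)*     term := NUM | '(' expr ')'
# Matching arbitrarily nested parentheses is exactly what a lone regex cannot do.
class Parser:
    def __init__(self, text):
        self.tokens = list(tokenize(text))
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else (None, None)

    def advance(self):
        tok = self.peek()
        self.pos += 1
        return tok

    def expr(self):
        value = self.term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            _, op = self.advance()
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        kind, val = self.advance()
        if kind == "NUM":
            return val
        if (kind, val) == ("OP", "("):
            value = self.expr()
            if self.advance() != ("OP", ")"):
                raise SyntaxError("unbalanced parenthesis")
            return value
        raise SyntaxError(f"unexpected token {val!r}")

print(Parser("(1 + (2 + 3)) - 4").expr())  # -> 2
```

In flex and bison proper, the same split appears as a .l file of token rules and a .y file of grammar productions, with the generated parser calling the generated lexer for its next token.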

Constraint and Integer Programming - Toward a Unified Methodology (Paperback, Softcover reprint of the original 1st ed. 2004)
Michela Milano
R4,042 Discovery Miles 40 420 Ships in 18 - 22 working days

Constraint and Integer Programming presents some of the basic ideas of constraint programming and mathematical programming, explores approaches to integration, brings us up to date on heuristic methods, and attempts to discern future directions in this fast-moving field.

Transactions on Petri Nets and Other Models of Concurrency VI (Paperback, 2012 ed.)
Kurt Jensen; Edited by Wil M.P. van der Aalst, Marco Ajmone Marsan, Giuliana Franceschinis, Jetty Kleijn, …
R1,433 Discovery Miles 14 330 Ships in 18 - 22 working days

These Transactions publish archival papers in the broad area of Petri nets and other models of concurrency, ranging from theoretical work to tool support and industrial applications. ToPNoC issues are published as LNCS volumes, and hence are widely distributed and indexed. This Journal has its own Editorial Board which selects papers based on a rigorous two-stage refereeing process. ToPNoC contains: - Revised versions of a selection of the best papers from workshops and tutorials at the annual Petri net conferences - Special sections/issues within particular subareas (similar to those published in the Advances in Petri Nets series) - Other papers invited for publication in ToPNoC - Papers submitted directly to ToPNoC by their authors.

The sixth volume of ToPNoC includes revised versions of selected papers from workshops and tutorials held at the 32nd International Conference on Application and Theory of Petri Nets and Concurrency. It also contains a special section on Networks, Protocols, and Services, as well as a contributed paper submitted through the regular submission track of ToPNoC. The 14 papers cover a diverse range of topics including model checking and system verification, synthesis, foundational work on specific classes of Petri nets, and innovative applications of Petri nets and other models of concurrency. Thus this volume gives a good view of ongoing concurrent systems and Petri nets research.

Data Structures and Algorithms - An Object-Oriented Approach Using Ada 95 (Paperback, Softcover reprint of the original 1st ed. 1997)
John Beidler
R1,457 Discovery Miles 14 570 Ships in 18 - 22 working days

This textbook provides an in-depth course on data structures in the context of object-oriented development. Its main themes are abstraction, implementation, encapsulation, and measurement: that is, the software process begins with abstraction of data types, which then leads to alternate representations and encapsulation, and finally to resource measurement. A clear object-oriented approach, making use of Booch components, provides readers with a useful library of data structure components and experience in software reuse. Students using this book are expected to have a reasonable understanding of basic logical structures such as stacks and queues. Throughout, Ada 95 is used, and the author takes full advantage of Ada's encapsulation features and the ability to present specifications without implementation details. The Ada code is supported by two suites available over the World Wide Web.
