

Future Business Software - Current Trends in Business Software Development (Hardcover, 2014)
Gino Brunetti, Thomas Feld, Lutz Heuser, Joachim Schnitter, Christian Webel
R4,989 R4,533 Discovery Miles 45 330 Save R456 (9%) Ships in 10 - 15 working days

What will business software look like in the future? And how will it be developed?

This book covers the proceedings of the first international conference on Future Business Software - a new think tank discussing the trends in enterprise software with speakers from Europe's most successful software companies and the leading research institutions. The articles focus on two of the most prominent trends in the field: emergent software and agile development processes.

"Emergent Software" is a new paradigm of software development that addresses the highly complex requirements of tomorrow's business software and aims at dynamically and flexibly combining a business software solution's different components in order to fulfill customers' needs with a minimum of effort. Agile development processes are the response of software technology to the implementation of diverse and rapidly changing software requirements. A major focus is on the minimization of project risks, e.g. through short, iterative development cycles, test-driven development and an intensive culture of communication.

A Framework of Software Measurement (Hardcover, Reprint 2012)
Horst Zuse
R5,641 R4,362 Discovery Miles 43 620 Save R1,279 (23%) Ships in 18 - 22 working days

Zuse's textbook on software measurement provides basic principles as well as theoretical and practical guidelines for the use of numerous kinds of software measures. It is written to enable scientists, teachers, practitioners, and students to define the basic terminology of software measurement and to contribute to theory building. The textbook considers, among others, the qualitative and numerical models behind software measures. It explains step by step the importance of qualitative properties, the meaning of scale types, the foundations of the validation of measures, the foundations of prediction models, the models behind the Function-Point method and the COCOMO model, and the qualitative assumptions of object-oriented measures. For applications of software measures in practice, more than two hundred software measures across the software life-cycle are described in detail (object-oriented measures included). The enclosed CD contains a selection of more than 1,600 literature references and a small demo version of ZD-MIS (Zuse/Drabe Measurement Information System).

Formal Equivalence Checking and Design Debugging (Hardcover, 1998 ed.)
Shi-Yu Huang, Kwang-Ting (Tim) Cheng
R4,820 Discovery Miles 48 200 Ships in 18 - 22 working days

Formal Equivalence Checking and Design Debugging covers two major topics in design verification: logic equivalence checking and design debugging. The first part of the book reviews the design problems that require logic equivalence checking and describes the underlying technologies that are used to solve them. Some novel approaches to the problems of verifying design revisions after intensive sequential transformations such as retiming are described in detail. The second part of the book gives a thorough survey of previous and recent literature on design error diagnosis and design error correction. This part also provides an in-depth analysis of the algorithms used in two logic debugging software programs, ErrorTracer and AutoFix, developed by the authors. From the Foreword: 'With the adoption of the static sign-off approach to verifying circuit implementations, the application-specific integrated circuit (ASIC) industry will experience the first radical methodological revolution since the adoption of logic synthesis. Equivalence checking is one of the two critical elements of this methodological revolution. This book is timely for either the designer seeking to better understand the mechanics of equivalence checking or for the CAD researcher who wishes to investigate well-motivated research problems such as equivalence checking of retimed designs or error diagnosis in sequential circuits.' - Kurt Keutzer, University of California, Berkeley
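
The core idea behind combinational equivalence checking is simple to sketch: two implementations are equivalent exactly when they agree on every input vector. The brute-force check below is only a toy illustration (practical checkers like those surveyed in the book rely on BDDs or SAT solvers rather than enumeration), and the `spec`/`impl` functions are made-up examples, not circuits from the book:

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Exhaustively compare two combinational functions on all 2^n input vectors.
    (Real equivalence checkers use BDDs or SAT instead of enumeration.)"""
    return all(f(*bits) == g(*bits) for bits in product([0, 1], repeat=n_inputs))

# Two implementations of the same function: a XOR b
spec = lambda a, b: a ^ b
impl = lambda a, b: (a | b) & ~(a & b) & 1   # XOR built from OR/AND/NOT

print(equivalent(spec, impl, 2))  # → True
```

The exhaustive loop blows up at 2^n vectors, which is precisely why the symbolic techniques the book describes matter for real designs.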

Patterns, Programming and Everything (Hardcover, 2012)
Karin K. Breitman, R.Nigel Horspool
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

With 11 invited submissions from leading researchers and teams of researchers sharing one common characteristic - all have worked with Dr. Judith Bishop during her long and continuing career as a leader in computer science education and research - this book reflects on Dr. Bishop's outstanding contribution to computer science. Having worked at three different universities, she now holds a leadership position in the research division of a major software company.

The topics covered reflect some of the transitions in her career. The dominant theme is programming languages, with chapters on object-oriented programming, real-time programming, component programming and design patterns. Another major and related topic is compilers, with contributions on dataflow analysis, tree rewriting and keyword recognition. Finally, there are some additional chapters on other varied but highly interesting topics including smart homes, mobile systems and teaching computer science.

Designing Sorting Networks - A New Paradigm (Hardcover, 2011)
Sherenaz W. Al-Haj Baddar, Kenneth E. Batcher
R1,397 Discovery Miles 13 970 Ships in 18 - 22 working days

Designing Sorting Networks: A New Paradigm provides an in-depth guide to maximizing the efficiency of sorting networks, and uses 0/1 cases, partially ordered sets and Hasse diagrams to closely analyze their behavior in an easy, intuitive manner. This book also outlines new ideas and techniques for designing faster sorting networks using Sortnet, and illustrates how these techniques were used to design faster 12-key and 18-key sorting networks through a series of case studies. Finally, it examines and explains the mysterious behavior exhibited by the fastest-known 9-step 16-key network. Designing Sorting Networks: A New Paradigm is intended for advanced-level students, researchers and practitioners as a reference book. Academics in the fields of computer science, engineering and mathematics will also find this book invaluable.
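
The 0/1 cases mentioned above refer to the zero-one principle: a comparator network sorts every input sequence if and only if it sorts every binary input. A minimal sketch of that verification idea (the 4-key network and helper names here are illustrative; this is not the book's Sortnet tool):

```python
from itertools import product

def apply_network(network, values):
    """Run a comparator network: each pair (i, j) swaps so that values[i] <= values[j]."""
    v = list(values)
    for i, j in network:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

# A classic 4-key sorting network using 5 comparators
NET4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

def is_sorting_network(network, n):
    """Zero-one principle: checking all 2^n binary inputs suffices,
    instead of all n! orderings."""
    return all(apply_network(network, bits) == sorted(bits)
               for bits in product([0, 1], repeat=n))

print(is_sorting_network(NET4, 4))  # → True
```

Checking 2^n binary vectors instead of n! permutations is exactly what makes exhaustive verification of small networks tractable.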

Practical .NET for Financial Markets (Hardcover, 1st ed.)
Vivek Shetty, Manish Jayaswal
R2,335 Discovery Miles 23 350 Ships in 18 - 22 working days

This unique book examines up-to-the-minute uses of technology in financial markets and then explains how you can profit from that knowledge. To participate in mainstream .NET development, you must address the changes in financial markets by using the most sophisticated tools available: Microsoft .NET technology.

Software developers and architects, IT pros, and tech-savvy business users alike will find this book comprehensive and relevant. Each chapter presents problems and solutions that cover business aspects and relevant .NET features. Each aspect of .NET is analyzed in its proper context, so you'll understand why it is relevant and applicable in a real-life business case.

C++17 - The Complete Guide (Hardcover)
Nicolai M. Josuttis
R1,464 Discovery Miles 14 640 Ships in 18 - 22 working days

Managing Chaos - Digital Governance by Design (Paperback)
Lisa Welchman
R921 Discovery Miles 9 210 Ships in 9 - 17 working days

Internet of Things and Advanced Application in Healthcare (Hardcover)
Catarina I. Reis, Marisa da Silva Maximiano
R5,263 Discovery Miles 52 630 Ships in 18 - 22 working days

The ubiquitous nature of the Internet of Things allows for enhanced connectivity between people in modern society. When applied to various industries, these current networking capabilities create opportunities for new applications. Internet of Things and Advanced Application in Healthcare is a critical reference source for emerging research on the implementation of the latest networking and technological trends within the healthcare industry. Featuring in-depth coverage across the broad scope of the Internet of Things in specialized settings, such as context-aware computing, reliability, and healthcare support systems, this publication is an ideal resource for professionals, researchers, upper-level students, practitioners, and technology developers seeking innovative material on the Internet of Things and its distinct applications. Topics covered include: assistive technologies, context-aware computing systems, health risk management, healthcare support systems, reliability concerns, smart healthcare, and wearable sensors.

Compiler Design - Analysis and Transformation (Hardcover, 2012 ed.)
Helmut Seidl, Reinhard Wilhelm, Sebastian Hack
R1,986 Discovery Miles 19 860 Ships in 18 - 22 working days

While compilers for high-level programming languages are large complex software systems, they have particular characteristics that differentiate them from other software systems. Their functionality is almost completely well-defined - ideally there exist complete precise descriptions of the source and target languages. Additional descriptions of the interfaces to the operating system, programming system and programming environment, and to other compilers and libraries are often available. The book deals with the optimization phase of compilers. In this phase, programs are transformed in order to increase their efficiency. To preserve the semantics of the programs in these transformations, the compiler has to meet the associated applicability conditions. These are checked using static analysis of the programs. In this book the authors systematically describe the analysis and transformation of imperative and functional programs. In addition to a detailed description of important efficiency-improving transformations, the book offers a concise introduction to the necessary concepts and methods, namely to operational semantics, lattices, and fixed-point algorithms. This book is intended for students of computer science. The book is supported throughout with examples, exercises and program fragments.
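
The lattices and fixed-point algorithms mentioned above are easy to illustrate with a classic dataflow analysis. The sketch below runs live-variable analysis over a hypothetical four-block control-flow graph, iterating gen/kill transfer functions on the powerset lattice of variables until nothing changes; it is a simplified illustration, not code from the book:

```python
# Live-variable analysis: iterate transfer functions to a fixed point on the
# powerset lattice of variables. Hypothetical straight-line program:
#   B0: x = input()    succ: B1
#   B1: y = x + 1      succ: B2
#   B2: print(y)       succ: B3
#   B3: return         succ: -
blocks = {
    "B0": {"gen": set(),  "kill": {"x"}, "succ": ["B1"]},
    "B1": {"gen": {"x"},  "kill": {"y"}, "succ": ["B2"]},
    "B2": {"gen": {"y"},  "kill": set(), "succ": ["B3"]},
    "B3": {"gen": set(),  "kill": set(), "succ": []},
}

def live_variables(blocks):
    live_in = {b: set() for b in blocks}
    live_out = {b: set() for b in blocks}
    changed = True
    while changed:                      # round-robin iteration until fixed point
        changed = False
        for b, info in blocks.items():
            out = set().union(*(live_in[s] for s in info["succ"])) if info["succ"] else set()
            inn = info["gen"] | (out - info["kill"])
            if inn != live_in[b] or out != live_out[b]:
                live_in[b], live_out[b], changed = inn, out, True
    return live_in

print(live_variables(blocks)["B1"])  # x is live entering B1 → {'x'}
```

Termination is guaranteed because the sets only grow and the lattice of variable subsets is finite, which is the general argument behind the fixed-point algorithms the book introduces.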

Evolutionary Algorithms and Agricultural Systems (Hardcover, 2002 ed.)
David G. Mayer
R5,201 Discovery Miles 52 010 Ships in 18 - 22 working days

Evolutionary Algorithms and Agricultural Systems deals with the practical application of evolutionary algorithms to the study and management of agricultural systems. The rationale of systems research methodology is introduced, and examples listed of real-world applications. It is the integration of these agricultural systems models with optimization techniques, primarily genetic algorithms, which forms the focus of this book. The advantages are outlined, with examples of agricultural models ranging from national and industry-wide studies down to the within-farm scale. The potential problems of this approach are also discussed, along with practical methods of resolving these problems. Agricultural applications using alternate optimization techniques (gradient and direct-search methods, simulated annealing and quenching, and the tabu search strategy) are also listed and discussed. The particular problems and methodologies of these algorithms, including advantageous features that may benefit a hybrid approach or be usefully incorporated into evolutionary algorithms, are outlined. From consideration of this and the published examples, it is concluded that evolutionary algorithms are the superior method for the practical optimization of models of agricultural and natural systems. General recommendations on robust options and parameter settings for evolutionary algorithms are given for use in future studies. Evolutionary Algorithms and Agricultural Systems will prove useful to practitioners and researchers applying these methods to the optimization of agricultural or natural systems, and would also be suited as a text for systems management, applied modeling, or operations research.
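
As a rough illustration of the genetic algorithms the book applies to agricultural models, the sketch below is a minimal generational GA with tournament selection, one-point crossover and bit-flip mutation, run on a toy objective (maximizing the number of 1-bits) as a stand-in for a real simulation model; all parameter values are illustrative, not the book's recommendations:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=60,
                      mutation_rate=0.02, seed=0):
    """Minimal generational GA: tournament selection, one-point crossover,
    per-bit mutation. Returns the best individual in the final population."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            return max(rng.sample(pop, 3), key=fitness)   # best of 3 random picks
        nxt = []
        while len(nxt) < pop_size:
            a, b = tournament(), tournament()
            cut = rng.randrange(1, n_bits)                # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy stand-in for a simulation model's objective: maximize the number of 1s
best = genetic_algorithm(fitness=sum)
print(sum(best))  # close to 20 for this easy landscape
```

In the applications the book describes, `fitness` would instead run an agricultural systems model and return, say, simulated farm profit.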

Strategies for Quasi-Monte Carlo (Hardcover, 1999 ed.)
Bennett L. Fox
R4,230 Discovery Miles 42 300 Ships in 18 - 22 working days

Strategies for Quasi-Monte Carlo builds a framework to design and analyze strategies for randomized quasi-Monte Carlo (RQMC). One key to efficient simulation using RQMC is to structure problems to reveal a small set of important variables, their number being the effective dimension, while the other variables collectively are relatively insignificant. Another is smoothing. The book provides many illustrations of both keys, in particular for problems involving Poisson processes or Gaussian processes. RQMC beats grids by a huge margin. With low effective dimension, RQMC is an order of magnitude more efficient than standard Monte Carlo. With, in addition, certain smoothness - perhaps induced - RQMC is an order of magnitude more efficient than deterministic QMC. Unlike the latter, RQMC permits error estimation via the central limit theorem. For random-dimensional problems, such as occur with discrete-event simulation, RQMC gets judiciously combined with standard Monte Carlo to keep memory requirements bounded. This monograph has been designed to appeal to a diverse audience, including those with applications in queueing, operations research, computational finance, mathematical programming, partial differential equations (both deterministic and stochastic), and particle transport, as well as to probabilists and statisticians wanting to know how to apply a powerful tool effectively, and to those interested in numerical integration or optimization in their own right. It recognizes that the heart of practical application is algorithms, so pseudocodes appear throughout the book. While not primarily a textbook, it is suitable as a supplementary text for certain graduate courses. As a reference, it belongs on the shelf of everyone with a serious interest in improving simulation efficiency.
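
The randomization idea behind RQMC can be sketched briefly: take one deterministic low-discrepancy point set, shift it by independent uniform offsets, and use the spread of the replication means for a CLT-based error estimate. The rank-1 lattice and generating vector below are illustrative choices, not ones recommended by the book:

```python
import random

def shifted_lattice(n, z, shift):
    """Rank-1 lattice: points {(i*z/n + shift) mod 1} - a simple RQMC point set."""
    return [[(i * zk / n + sk) % 1.0 for zk, sk in zip(z, shift)] for i in range(n)]

def rqmc_estimate(f, n=1024, z=(1, 433), replications=10, seed=1):
    """Average f over several independently shifted copies of one lattice.
    Each replication mean is an unbiased estimate of the integral, so the
    spread across replications yields a CLT-based error estimate."""
    rng = random.Random(seed)
    means = []
    for _ in range(replications):
        shift = [rng.random() for _ in z]
        pts = shifted_lattice(n, z, shift)
        means.append(sum(f(p) for p in pts) / n)
    return sum(means) / replications

# Smooth 2-D test integrand: the integral of x*y over the unit square is 0.25
est = rqmc_estimate(lambda p: p[0] * p[1])
print(est)  # close to the true value 0.25
```

Unlike deterministic QMC, the independent random shifts make each replication unbiased, which is what licenses the central-limit-theorem error estimate mentioned above.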

Software Architectures and Component Technology (Hardcover, 2002 ed.)
Mehmed Aksit
R5,358 Discovery Miles 53 580 Ships in 18 - 22 working days

Software architectures have gained wide popularity in the last decade. They generally play a fundamental role in coping with the inherent difficulties of the development of large-scale and complex software systems. Component-oriented and aspect-oriented programming enables software engineers to implement complex applications from a set of pre-defined components. Software Architectures and Component Technology collects excellent chapters on software architectures and component technologies from well-known authors, who not only explain the advantages, but also present the shortcomings of the current approaches while introducing novel solutions to overcome the shortcomings. The unique features of this book are: it evaluates the current architecture design methods and component composition techniques and explains their shortcomings; presents three practical architecture design methods in detail; gives four industrial architecture design examples; presents conceptual models for distributed message-based architectures; explains techniques for refining architectures into components; presents the recent developments in component- and aspect-oriented techniques; and explains the status of research on Piccola, Hyper/J, Pluggable Composite Adapters and Composition Filters. Software Architectures and Component Technology is a suitable text for graduate level students in computer science and engineering, and a reference for researchers and practitioners in industry.

Software Engineering with Computational Intelligence (Hardcover, 2003 ed.)
Jonathan Lee
R4,164 Discovery Miles 41 640 Ships in 18 - 22 working days

This edited book invites the reader to explore how the latest technologies developed in computational intelligence can be extended and applied to software engineering. Leading experts demonstrate how this recent confluence of software engineering and computational intelligence provides a powerful tool to address the increasing demand for complex applications in diversified areas, the ever-increasing complexity and size of software systems, and the inherently imperfect nature of the information. The presented treatments to software modeling and formal analysis permit the extension of computational intelligence to various phases in software life cycles, such as managing fuzziness resident in the requirements, coping with fuzzy objects and imprecise knowledge, and handling uncertainty encountered in quality prediction.

Advances in Digital Forensics V - Fifth IFIP WG 11.9 International Conference on Digital Forensics, Orlando, Florida, USA, January 26-28, 2009, Revised Selected Papers (Hardcover, 2009 ed.)
Gilbert Peterson, Sujeet Shenoi
R2,693 Discovery Miles 26 930 Ships in 18 - 22 working days

Digital forensics deals with the acquisition, preservation, examination, analysis and presentation of electronic evidence. Networked computing, wireless communications and portable electronic devices have expanded the role of digital forensics beyond traditional computer crime investigations. Practically every crime now involves some aspect of digital evidence; digital forensics provides the techniques and tools to articulate this evidence. Digital forensics also has myriad intelligence applications. Furthermore, it has a vital role in information assurance - investigations of security breaches yield valuable information that can be used to design more secure systems.

Advances in Digital Forensics V describes original research results and innovative applications in the discipline of digital forensics. In addition, it highlights some of the major technical and legal issues related to digital evidence and electronic crime investigations. The areas of coverage include: themes and issues, forensic techniques, integrity and privacy, network forensics, forensic computing, investigative techniques, legal issues and evidence management.

This book is the fifth volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.9 on Digital Forensics, an international community of scientists, engineers and practitioners dedicated to advancing the state of the art of research and practice in digital forensics. The book contains a selection of twenty-three edited papers from the Fifth Annual IFIP WG 11.9 International Conference on Digital Forensics, held at the National Center for Forensic Science, Orlando, Florida, USA in January 2009.

Advances in Digital Forensics V is an important resource for researchers, faculty members and graduate students, as well as for practitioners and individuals engaged in research and development efforts for the law enforcement and intelligence communities.

Managing Software Engineering Knowledge (Hardcover, 2003 ed.)
Aybuke Aurum, Ross Jeffery, Claes Wohlin, Meliha Handzic
R2,871 Discovery Miles 28 710 Ships in 18 - 22 working days

Software development is a complex problem-solving activity with a high level of uncertainty. There are many technical challenges concerning scheduling, cost estimation, reliability, performance, etc., which are further aggravated by weaknesses such as changing requirements, team dynamics, and high staff turnover. Thus the management of knowledge and experience is a key means of systematic software development and process improvement. "Managing Software Engineering Knowledge" illustrates several theoretical examples of this vision and solutions applied to industrial practice. It is structured in four parts addressing the motives for knowledge management, the concepts and models used in knowledge management for software engineering, their application to software engineering, and practical guidelines for managing software engineering knowledge. This book provides a comprehensive overview of the state of the art and best practice in knowledge management applied to software engineering. While researchers and graduate students will benefit from the interdisciplinary approach leading to basic frameworks and methodologies, professional software developers and project managers will also profit from industrial experience reports and practical guidelines.

Trust Management III - Third IFIP WG 11.11 International Conference, IFIPTM 2009, West Lafayette, IN, USA, June 15-19, 2009, Proceedings (Hardcover, 2009 ed.)
Elena Ferrari, Ninghui Li, Elisa Bertino, Yucel Karabulut
R2,692 Discovery Miles 26 920 Ships in 18 - 22 working days

This volume contains the proceedings of IFIPTM 2009, the Third IFIP WG 11.11 International Conference on Trust Management, held at Purdue University in West Lafayette, Indiana, USA during June 15-19, 2009. IFIPTM 2009 provided a truly global platform for the reporting of research, development, policy and practice in the interdependent areas of privacy, security, and trust. Building on the traditions inherited from the highly successful iTrust conference series, the IFIPTM 2007 conference in Moncton, New Brunswick, Canada, and the IFIPTM 2008 conference in Trondheim, Norway, IFIPTM 2009 focused on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion about relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2009 was an open IFIP conference. The program of the conference featured both theoretical research papers and reports of real-world case studies. IFIPTM 2009 received 44 submissions. The Program Committee selected 17 papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include one invited paper and five demo descriptions. The highlights of IFIPTM 2009 included invited talks and tutorials by academic and governmental experts in the fields of trust management, privacy and security, including Eugene Spafford, Marianne Winslett, and Michael Novak. Running an international conference requires an immense effort from all parties involved. We would like to thank the Program Committee members and external referees for having provided timely and in-depth reviews of the submitted papers. We would also like to thank the Workshop, Tutorial, Demonstration, Local Arrangements, and Website Chairs for having provided great help organizing the conference.

Making Databases Work - The Pragmatic Wisdom of Michael Stonebraker (Hardcover)
Michael L Brodie
R2,911 Discovery Miles 29 110 Ships in 18 - 22 working days

This book celebrates Michael Stonebraker's accomplishments that led to his 2014 ACM A.M. Turing Award "for fundamental contributions to the concepts and practices underlying modern database systems." The book describes, for the broad computing community, the unique nature, significance, and impact of Mike's achievements in advancing modern database systems over more than forty years. Today, data is considered the world's most valuable resource, whether it is in the tens of millions of databases used to manage the world's businesses and governments, in the billions of databases in our smartphones and watches, or residing elsewhere, as yet unmanaged, awaiting the elusive next generation of database systems. Every one of the millions or billions of databases includes features that are celebrated by the 2014 Turing Award and are described in this book. Why should I care about databases? What is a database? What is data management? What is a database management system (DBMS)? These are just some of the questions that this book answers, in describing the development of data management through the achievements of Mike Stonebraker and his over 200 collaborators. In reading the stories in this book, you will discover core data management concepts that were developed over the two greatest eras (so far) of data management technology. The book is a collection of 36 stories written by Mike and 38 of his collaborators: 23 world-leading database researchers, 11 world-class systems engineers, and 4 business partners. If you are an aspiring researcher, engineer, or entrepreneur you might read these stories to find these turning points as practice to tilt at your own computer-science windmills, to spur yourself to your next step of innovation and achievement.

Real-Time Systems - Engineering and Applications (Hardcover, 1992 ed.)
Michael Schiebe, Saskia Pferrer
R5,391 Discovery Miles 53 910 Ships in 18 - 22 working days

Real-Time Systems Engineering and Applications is a well-structured collection of chapters pertaining to present and future developments in real-time systems engineering. After an overview of real-time processing, theoretical foundations are presented. The book then introduces useful modeling concepts and tools. This is followed by concentration on the more practical aspects of real-time engineering with a thorough overview of the present state of the art, both in hardware and software, including related concepts in robotics. Examples are given of novel real-time applications which illustrate the present state of the art. The book concludes with a focus on future developments, giving direction for new research activities and an educational curriculum covering the subject. This book can be used as a source for academic and industrial researchers as well as a textbook for computing and engineering courses covering the topic of real-time systems engineering.

TeX Reference Manual (Hardcover, 2002 ed.)
David Bausum
R4,226 Discovery Miles 42 260 Ships in 18 - 22 working days

Introduction, or why I wrote this book: In the fall of 1997 a dedicated troff user e-mailed me the macros he used to typeset his books. I took one look inside his file and thought, "I can do this; it's just code." As an experiment I spent a week and wrote a C program and troff macros which formatted and typeset a membership directory for a scholarly society with approximately 2,000 members. When I was done, I could enter two commands, and my program and troff would convert raw membership data into 200 pages of PostScript in 35 seconds. Previously, it had taken me several days to prepare camera-ready copy for the directory using a word processor. For completeness I sat down and tried to write TeX macros for the typesetting. I failed. Although ninety-five percent of my macros worked, I was unable to prepare the columns the project required. As my frustration grew, I began this book - mentally, in my head - as an answer to the question, "Why is TeX so hard to learn?" Why use TeX? Lest you accuse me of the old horse and cart problem, I should address the question, "Why use TeX at all?" before I explain why TeX is hard. I use TeX for the following reasons. It is stable, fast, free, and it uses ASCII. Of course, the most important reason is: TeX does a fantastic job. By stable, I mean it is not likely to change in the next 10 years (much less the next one or two), and it is free of bugs. Both of these are important.

Noniterative Coordination in Multilevel Systems (Hardcover, 1999 ed.)
Todor Stoilov
R2,679 Discovery Miles 26 790 Ships in 18 - 22 working days

Multilevel decision theory arises to resolve the contradiction between increasing requirements on the process of design, synthesis, control and management of complex systems and the limited power of the technical, control, computer and other executive devices that have to perform these actions and satisfy requirements in real time. The theory suggests replacing centralised management of the system with hierarchical co-ordination of sub-processes. All sub-processes have lower dimensions, which supports easier management and decision making. But the sub-processes are interconnected and they influence each other. Multilevel systems theory supports two main methodological tools: decomposition and co-ordination. Both have been developed and implemented in practical applications concerning the design, control and management of complex systems. In general, it is always beneficial to find the best or optimal solution in processes of system design, control and management. The tendency towards the best (optimal) decision requires presenting all activities as the definition, and then the solution, of an appropriate optimization problem. Every optimization process needs the mathematical definition and solution of a well-stated optimization problem. These problems belong to two classes: static optimization and dynamic optimization. Static optimization problems are solved by applying methods of mathematical programming: conditional and unconditional optimization. Dynamic optimization problems are solved by methods of the calculus of variations: the Euler-Lagrange method, the maximum principle, and dynamic programming.

Number Theory for Computing (Hardcover, 2nd ed. 2002)
M. E. Hellmann; Song Y. Yan
R2,448 Discovery Miles 24 480 Ships in 18 - 22 working days

There are many surprising connections between the theory of numbers, which is one of the oldest branches of mathematics, and computing and information theory. Number theory has important applications in computer organization and security, coding and cryptography, random number generation, hash functions, and graphics. Conversely, number theorists use computers in factoring large integers, determining primes, testing conjectures, and solving other problems. This book takes the reader from elementary number theory, via algorithmic number theory, to applied number theory in computer science. It introduces basic concepts, results, and methods, and discusses their applications in the design of hardware and software, cryptography, and security. It is aimed at undergraduates in computing and information technology, but will also be valuable to mathematics students interested in applications. In this 2nd edition, full proofs of many theorems have been added and some corrections made.
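
As a small taste of the algorithmic number theory such a book covers, the sketch below combines fast modular exponentiation (Python's built-in three-argument `pow`) with the Miller-Rabin probable-prime test; the fixed base set is a common illustrative choice, not taken from the book:

```python
def is_probable_prime(n, bases=(2, 3, 5, 7, 11, 13, 17)):
    """Miller-Rabin probable-prime test with a fixed set of witness bases."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17):   # quick trial division by small primes
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:                    # write n - 1 = d * 2^s with d odd
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)                 # fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # a witnesses that n is composite
    return True

print(is_probable_prime(2**61 - 1))  # Mersenne prime → True
print(is_probable_prime(2**61 + 1))  # divisible by 3 → False
```

The same `pow(a, d, n)` primitive underlies RSA and Diffie-Hellman, which is why modular exponentiation is a workhorse of applied number theory.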

Linked Data in Linguistics - Representing and Connecting Language Data and Language Metadata (Hardcover, 2012 ed.): Christian... Linked Data in Linguistics - Representing and Connecting Language Data and Language Metadata (Hardcover, 2012 ed.)
Christian Chiarcos, Sebastian Nordhoff, Sebastian Hellmann
R1,419 Discovery Miles 14 190 Ships in 18 - 22 working days

The explosion of information technology has led to substantial growth of web-accessible linguistic data in terms of quantity, diversity and complexity. These resources become even more useful when interlinked with each other to generate network effects.

The general trend of providing data online is thus accompanied by newly developing methodologies to interconnect linguistic data and metadata. This includes linguistic data collections, general-purpose knowledge bases (e.g., DBpedia, a machine-readable edition of Wikipedia), and repositories with specific information about languages, linguistic categories and phenomena. The Linked Data paradigm provides a framework for interoperability and access management, and thereby makes it possible to integrate information from such a diverse set of resources.
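The core of the Linked Data paradigm can be sketched in a few lines: resources are identified by IRIs, and statements are subject-predicate-object triples that can point into external knowledge bases such as DBpedia. The example.org IRIs below are invented for illustration; the rdfs/owl/DBpedia IRIs are real vocabulary and resource identifiers.

```python
# Toy triple store illustrating the Linked Data idea: a lexical entry
# gets an English label and an owl:sameAs link into DBpedia.
triples = [
    ("http://example.org/lexicon/cat",
     "http://www.w3.org/2000/01/rdf-schema#label",
     '"cat"@en'),
    ("http://example.org/lexicon/cat",
     "http://www.w3.org/2002/07/owl#sameAs",
     "http://dbpedia.org/resource/Cat"),
]

def objects(triples, subject, predicate):
    """Return all objects of triples matching (subject, predicate, *)."""
    return [o for s, p, o in triples if s == subject and p == predicate]

links = objects(triples,
                "http://example.org/lexicon/cat",
                "http://www.w3.org/2002/07/owl#sameAs")
print(links)  # ['http://dbpedia.org/resource/Cat']
```

Because both sides of the sameAs link are globally resolvable IRIs, an application can follow the link and merge whatever DBpedia states about the same resource, which is the "network effect" the blurb describes.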

The contributions assembled in this volume illustrate the breadth of applications of the Linked Data paradigm for representative types of language resources. They cover lexical-semantic resources, annotated corpora, typological databases as well as terminology and metadata repositories. The book includes representative applications from diverse fields, ranging from academic linguistics (e.g., typology and corpus linguistics) through applied linguistics (e.g., lexicography and translation studies) to technical applications (in computational linguistics, Natural Language Processing and information technology).

This volume accompanies the Workshop on Linked Data in Linguistics 2012 (LDL-2012) in Frankfurt/M., Germany, organized by the Open Linguistics Working Group (OWLG) of the Open Knowledge Foundation (OKFN). It assembles contributions of the workshop participants and, beyond this, it summarizes initial steps in the formation of a Linked Open Data cloud of linguistic resources, the Linguistic Linked Open Data cloud (LLOD).

Multiprocessor Execution of Logic Programs (Hardcover, 1994 ed.): Gopal Gupta Multiprocessor Execution of Logic Programs (Hardcover, 1994 ed.)
Gopal Gupta
R4,148 Discovery Miles 41 480 Ships in 18 - 22 working days

Multiprocessor Execution of Logic Programs addresses the problem of efficient implementation of logic programming languages, specifically Prolog, on multiprocessor architectures. The approaches and implementations developed attempt to take full advantage of sequential implementation technology developed for Prolog (such as the WAM) while exploiting all forms of control parallelism present in logic programs, namely, or-parallelism, independent and-parallelism and dependent and-parallelism. Coverage includes a thorough survey of parallel implementation techniques and parallel systems developed for Prolog. Multiprocessor Execution of Logic Programs is recommended for people implementing parallel logic programming systems, parallel symbolic systems, parallel AI systems, and parallel theorem proving systems. It will also be useful to people who wish to learn about the implementation of parallel logic programming systems.
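As a loose analogy for the independent and-parallelism mentioned above: two subgoals that share no unbound variables can be solved concurrently and their answer bindings combined. The book's actual subject is Prolog/WAM-level implementation; the Python facts and goal functions below are purely a conceptual sketch.

```python
# Conceptual sketch of independent and-parallelism for the conjunction
# parent(ann, P), age(ann, A): the two goals share no variables, so
# they can be evaluated in parallel and their solutions cross-combined.
from concurrent.futures import ThreadPoolExecutor

def goal_parent(x):           # parent(X, P) over hypothetical facts
    return [p for c, p in [("ann", "bob"), ("ann", "eva")] if c == x]

def goal_age(x):              # age(X, A) over hypothetical facts
    return [a for n, a in [("ann", 7), ("tom", 40)] if n == x]

with ThreadPoolExecutor() as pool:
    f1 = pool.submit(goal_parent, "ann")
    f2 = pool.submit(goal_age, "ann")
    parents, ages = f1.result(), f2.result()

# Combine the independent binding sets (cross-product of solutions).
solutions = [(p, a) for p in parents for a in ages]
print(solutions)  # [('bob', 7), ('eva', 7)]
```

Or-parallelism would instead explore alternative clauses for one goal concurrently; the engineering challenge the book addresses is doing either without giving up the efficiency of sequential WAM-style execution.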

Fundamentals of Algebraic Specification 2 - Module Specifications and Constraints (Hardcover, 1990 ed.): Hartmut Ehrig, Bernd... Fundamentals of Algebraic Specification 2 - Module Specifications and Constraints (Hardcover, 1990 ed.)
Hartmut Ehrig, Bernd Mahr
R1,491 Discovery Miles 14 910 Ships in 18 - 22 working days

Since the early seventies, concepts of specification have become central in the whole area of computer science. Algebraic specification techniques for abstract data types and software systems in particular have gained considerable importance in recent years. They have not only played a central role in the theory of data type specification, but have meanwhile had a remarkable influence on programming language design, system architectures, and software tools and environments. The fundamentals of algebraic specification lay a basis for teaching, research, and development in all those fields of computer science where algebraic techniques are the subject or are used with advantage on a conceptual level. We do not, however, regard such a basis as a synopsis of all the different approaches and achievements, but rather as a consistently developed theory. Such a theory should mainly emphasize elaboration of basic concepts from one point of view and, in a rigorous way, reach the state of the art in the field. We understand fundamentals in this context as: 1. Fundamentals in the sense of a carefully motivated introduction to algebraic specification, which is understandable for computer scientists and mathematicians. 2. Fundamentals in the sense of mathematical theories which are the basis for precise definitions, constructions, results, and correctness proofs. 3. Fundamentals in the sense of concepts from computer science, which are introduced on a conceptual level and formalized in mathematical terms.
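The classic introductory example of an algebraic specification is the stack: an abstract data type given by its operation signatures and equations, independent of any concrete representation. The sketch below (not taken from the book) uses a Python tuple as one possible carrier and checks the usual stack equations on a sample value.

```python
# A stack specified by operations and equations rather than by its
# representation. Here the carrier happens to be a Python tuple; any
# implementation satisfying the same equations would be equally valid.
empty = ()

def push(s, x):
    return s + (x,)

def pop(s):
    return s[:-1]          # partial: defined only for non-empty stacks

def top(s):
    return s[-1]           # partial: defined only for non-empty stacks

# The specification's equations, checked on a sample stack:
s = push(push(empty, 1), 2)
assert pop(push(s, 9)) == s      # pop(push(s, x)) = s
assert top(push(s, 9)) == 9      # top(push(s, x)) = x
print("stack equations hold")
```

The book's module specifications generalize exactly this picture: signatures, equations, and constraints become the interfaces along which larger software systems are composed and proved correct.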

You may like...
Java Programming
Joyce Farrell Paperback R1,326 R1,236 Discovery Miles 12 360
Clean Architecture - A Craftsman's Guide…
Robert Martin Paperback  (1)
R860 R549 Discovery Miles 5 490
Programming Logic & Design
Joyce Farrell Paperback R757 Discovery Miles 7 570
Essential Java for Scientists and…
Brian Hahn, Katherine Malan Paperback R1,266 Discovery Miles 12 660
Hardware Accelerator Systems for…
Shiho Kim, Ganesh Chandra Deka Hardcover R3,950 Discovery Miles 39 500
Introducing Delphi Programming - Theory…
John Barrow, Linda Miller, … Paperback  (1)
R751 Discovery Miles 7 510
Reachable Sets of Dynamic Systems…
Stanislaw Raczynski Paperback R3,927 Discovery Miles 39 270
Program Construction - Calculating…
Roland Backhouse Paperback R2,460 Discovery Miles 24 600
C++ Programming - Program Design…
D. Malik Paperback R1,646 R1,523 Discovery Miles 15 230
Programming Logic & Design…
Joyce Farrell Paperback R1,256 R1,170 Discovery Miles 11 700