Linked Data in Linguistics - Representing and Connecting Language Data and Language Metadata (Hardcover, 2012 ed.)
Christian Chiarcos, Sebastian Nordhoff, Sebastian Hellmann
R1,534 Discovery Miles 15 340 Ships in 10 - 15 working days

The explosion of information technology has led to substantial growth of web-accessible linguistic data in terms of quantity, diversity and complexity. These resources become even more useful when interlinked with each other to generate network effects.

The general trend of providing data online is thus accompanied by newly developing methodologies to interconnect linguistic data and metadata. This includes linguistic data collections, general-purpose knowledge bases (e.g., DBpedia, a machine-readable edition of Wikipedia), and repositories with specific information about languages, linguistic categories and phenomena. The Linked Data paradigm provides a framework for interoperability and access management, and thereby makes it possible to integrate information from such a diverse set of resources.
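
As a rough illustration of the Linked Data idea described above, here is a short sketch using Python and the rdflib library; the lexicon namespace, entry URI and property choices are invented for illustration and are not taken from the book or from any particular resource.

    # Illustrative sketch only: the vocabulary and URIs below are invented.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, RDFS

    EX = Namespace("http://example.org/lexicon/")     # hypothetical namespace
    DBPEDIA = Namespace("http://dbpedia.org/resource/")

    g = Graph()
    entry = EX["entry/tree"]
    g.add((entry, RDF.type, EX.LexicalEntry))
    g.add((entry, RDFS.label, Literal("tree", lang="en")))
    # Interlinking: point the local entry at a general-purpose knowledge base.
    g.add((entry, RDFS.seeAlso, DBPEDIA.Tree))

    print(g.serialize(format="turtle"))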

The contributions assembled in this volume illustrate the range of applications of the Linked Data paradigm to representative types of language resources. They cover lexical-semantic resources, annotated corpora, typological databases, as well as terminology and metadata repositories. The book includes representative applications from diverse fields, ranging from academic linguistics (e.g., typology and corpus linguistics) through applied linguistics (e.g., lexicography and translation studies) to technical applications (in computational linguistics, Natural Language Processing and information technology).

This volume accompanies the Workshop on Linked Data in Linguistics 2012 (LDL-2012) in Frankfurt/M., Germany, organized by the Open Linguistics Working Group (OWLG) of the Open Knowledge Foundation (OKFN). It assembles contributions of the workshop participants and, beyond this, it summarizes initial steps in the formation of a Linked Open Data cloud of linguistic resources, the Linguistic Linked Open Data cloud (LLOD).

Advances in Computing and Intelligent Systems - Proceedings of ICACM 2019 (Hardcover, 1st ed. 2020)
Harish Sharma, Kannan Govindan, Ramesh C. Poonia, Sandeep Kumar, Wael M. El-Medany
R4,482 Discovery Miles 44 820 Ships in 10 - 15 working days

This book gathers selected papers presented at the International Conference on Advancements in Computing and Management (ICACM 2019). Discussing current research in the field of artificial intelligence and machine learning, cloud computing, recent trends in security, natural language processing and machine translation, parallel and distributed algorithms, as well as pattern recognition and analysis, it is a valuable resource for academics, practitioners in industry and decision-makers.

Multi-Disciplinary Digital Signal Processing - A Functional Approach Using Matlab (Hardcover, 1st ed. 2018)
E.S. Gopi
R4,542 Discovery Miles 45 420 Ships in 12 - 19 working days

This book provides a comprehensive overview of digital signal processing for a multi-disciplinary audience. It argues that although the theory of digital signal processing stems from electrical, electronics, communication, and control engineering, the subject is also useful in other disciplines such as chemical, mechanical, and civil engineering, computer science, and management. The book presents digital signal processing in a way that suits a wide-ranging audience: readers should be able to grasp the field, understand the concepts easily, and apply them as needed in their own disciplines. It covers sampling and reconstruction of signals; infinite impulse response filters; finite impulse response filters; multirate signal processing; statistical signal processing; and applications in multidisciplinary domains. The book takes a functional approach, and all techniques are illustrated using Matlab.
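
The book's own examples are in Matlab; purely as a language-neutral illustration of one of the topics listed above (an FIR filter), here is a short NumPy sketch that applies a simple moving-average FIR filter to a noisy sinusoid. All signal parameters are made up for the example.

    # Minimal FIR filtering sketch (illustrative values, not from the book).
    import numpy as np

    fs = 1000                      # sampling rate in Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)  # one second of samples
    clean = np.sin(2 * np.pi * 5 * t)               # 5 Hz tone
    noisy = clean + 0.5 * np.random.randn(t.size)   # add white noise

    # A length-21 moving-average filter: a very simple low-pass FIR filter.
    taps = np.ones(21) / 21
    filtered = np.convolve(noisy, taps, mode="same")

    print("noise power before:", np.mean((noisy - clean) ** 2))
    print("noise power after: ", np.mean((filtered - clean) ** 2))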

Coding - All the Basic Need to Learn Programming Like a Pro. This Book Includes Python, Java, and C ++ (Hardcover)
Alan Grid
R1,091 R936 Discovery Miles 9 360 Save R155 (14%) Ships in 10 - 15 working days
Internet of Things and Advanced Application in Healthcare (Hardcover)
Catarina I. Reis, Marisa da Silva Maximiano
R5,707 Discovery Miles 57 070 Ships in 10 - 15 working days

The ubiquitous nature of the Internet of Things allows for enhanced connectivity between people in modern society. When applied to various industries, these current networking capabilities create opportunities for new applications. Internet of Things and Advanced Application in Healthcare is a critical reference source for emerging research on the implementation of the latest networking and technological trends within the healthcare industry. Featuring in-depth coverage across the broad scope of the Internet of Things in specialized settings, such as context-aware computing, reliability, and healthcare support systems, this publication is an ideal resource for professionals, researchers, upper-level students, practitioners, and technology developers seeking innovative material on the Internet of Things and its distinct applications. Topics covered include assistive technologies, context-aware computing systems, health risk management, healthcare support systems, reliability concerns, smart healthcare, and wearable sensors.

Algorithms: Design Techniques And Analysis (Hardcover)
M H Alsuwaiyel
R4,020 Discovery Miles 40 200 Ships in 12 - 19 working days

Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on its own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires the understanding of various algorithm design techniques, how and when to use them to formulate solutions and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm design techniques and illustrating them through numerous examples.

Evolutionary Algorithms and Agricultural Systems (Hardcover, 2002 ed.)
David G. Mayer
R5,640 Discovery Miles 56 400 Ships in 10 - 15 working days

Evolutionary Algorithms and Agricultural Systems deals with the practical application of evolutionary algorithms to the study and management of agricultural systems. The rationale of systems research methodology is introduced, and examples listed of real-world applications. It is the integration of these agricultural systems models with optimization techniques, primarily genetic algorithms, which forms the focus of this book. The advantages are outlined, with examples of agricultural models ranging from national and industry-wide studies down to the within-farm scale. The potential problems of this approach are also discussed, along with practical methods of resolving these problems. Agricultural applications using alternate optimization techniques (gradient and direct-search methods, simulated annealing and quenching, and the tabu search strategy) are also listed and discussed. The particular problems and methodologies of these algorithms, including advantageous features that may benefit a hybrid approach or be usefully incorporated into evolutionary algorithms, are outlined. From consideration of this and the published examples, it is concluded that evolutionary algorithms are the superior method for the practical optimization of models of agricultural and natural systems. General recommendations on robust options and parameter settings for evolutionary algorithms are given for use in future studies. Evolutionary Algorithms and Agricultural Systems will prove useful to practitioners and researchers applying these methods to the optimization of agricultural or natural systems, and would also be suited as a text for systems management, applied modeling, or operations research.
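
As a toy illustration of the kind of optimization the book discusses, the sketch below runs a very small genetic algorithm on an invented two-variable "farm management" objective; the objective function and all parameters are hypothetical and not taken from the book.

    # Tiny genetic algorithm sketch; problem and settings are invented.
    import numpy as np

    rng = np.random.default_rng(0)

    def profit(x):
        # Hypothetical stand-in for an agricultural systems model:
        # x[0] = stocking rate, x[1] = fertiliser rate (arbitrary units).
        return -(x[0] - 3.2) ** 2 - (x[1] - 1.5) ** 2 + 10.0

    pop = rng.uniform(0, 5, size=(30, 2))          # initial population
    for generation in range(100):
        fitness = np.array([profit(ind) for ind in pop])
        # Tournament selection: keep the better of two random individuals.
        idx = rng.integers(0, len(pop), size=(len(pop), 2))
        parents = np.where(
            (fitness[idx[:, 0]] > fitness[idx[:, 1]])[:, None],
            pop[idx[:, 0]], pop[idx[:, 1]])
        # Uniform crossover and Gaussian mutation.
        mates = parents[rng.permutation(len(parents))]
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, mates)
        children += rng.normal(0, 0.1, size=children.shape)
        pop = np.clip(children, 0, 5)

    best = pop[np.argmax([profit(ind) for ind in pop])]
    print("best management found:", best)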

Real-Time Systems Engineering and Applications - Engineering and Applications (Hardcover, 1992 ed.)
Michael Schiebe, Saskia Pferrer
R5,846 Discovery Miles 58 460 Ships in 10 - 15 working days

Real-Time Systems Engineering and Applications is a well-structured collection of chapters pertaining to present and future developments in real-time systems engineering. After an overview of real-time processing, theoretical foundations are presented. The book then introduces useful modeling concepts and tools. This is followed by concentration on the more practical aspects of real-time engineering with a thorough overview of the present state of the art, both in hardware and software, including related concepts in robotics. Examples are given of novel real-time applications which illustrate the present state of the art. The book concludes with a focus on future developments, giving direction for new research activities and an educational curriculum covering the subject. This book can be used as a source for academic and industrial researchers as well as a textbook for computing and engineering courses covering the topic of real-time systems engineering.

Strategies for Quasi-Monte Carlo (Hardcover, 1999 ed.)
Bennett L. Fox
R4,585 Discovery Miles 45 850 Ships in 10 - 15 working days

Strategies for Quasi-Monte Carlo builds a framework to design and analyze strategies for randomized quasi-Monte Carlo (RQMC). One key to efficient simulation using RQMC is to structure problems to reveal a small set of important variables, their number being the effective dimension, while the other variables collectively are relatively insignificant. Another is smoothing. The book provides many illustrations of both keys, in particular for problems involving Poisson processes or Gaussian processes. RQMC beats grids by a huge margin. With low effective dimension, RQMC is an order of magnitude more efficient than standard Monte Carlo. With certain smoothness in addition - perhaps induced - RQMC is an order of magnitude more efficient than deterministic QMC. Unlike the latter, RQMC permits error estimation via the central limit theorem. For random-dimensional problems, such as occur with discrete-event simulation, RQMC is judiciously combined with standard Monte Carlo to keep memory requirements bounded. This monograph has been designed to appeal to a diverse audience, including those with applications in queueing, operations research, computational finance, mathematical programming, partial differential equations (both deterministic and stochastic), and particle transport, as well as to probabilists and statisticians wanting to know how to apply a powerful tool effectively, and to those interested in numerical integration or optimization in their own right. It recognizes that the heart of practical application is algorithms, so pseudocodes appear throughout the book. While not primarily a textbook, it is suitable as a supplementary text for certain graduate courses. As a reference, it belongs on the shelf of everyone with a serious interest in achieving more than incremental improvements in simulation efficiency.
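
A minimal sketch of the randomized quasi-Monte Carlo idea described above (not code from the book): a rank-1 lattice rule with independent random shifts, so the replicate means give a central-limit-theorem error estimate. The integrand and generating vector are chosen purely for illustration.

    # Randomly shifted lattice rule with CLT error estimate (illustrative).
    import numpy as np

    rng = np.random.default_rng(1)

    def f(u):
        # Toy integrand on [0,1]^2 with known mean 1.0.
        return np.prod(1.0 + 0.5 * (u - 0.5) * 12 ** 0.5, axis=1)

    n, d, replicates = 1024, 2, 20
    z = np.array([1, 433])                     # generating vector (assumed)
    lattice = (np.outer(np.arange(n), z) / n) % 1.0

    estimates = []
    for _ in range(replicates):
        shift = rng.random(d)                  # independent random shift
        points = (lattice + shift) % 1.0       # shifted lattice in [0,1)^d
        estimates.append(f(points).mean())

    estimates = np.array(estimates)
    mean = estimates.mean()
    stderr = estimates.std(ddof=1) / np.sqrt(replicates)
    print(f"estimate {mean:.5f} +/- {1.96 * stderr:.5f} (approx. 95% CI)")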

Support Vector Machines and Evolutionary Algorithms for Classification - Single or Together? (Hardcover, 2014 ed.)
Catalin Stoean, Ruxandra Stoean
R3,430 Discovery Miles 34 300 Ships in 12 - 19 working days

In classification, support vector machines are known as a capable and efficient technique for learning and predicting with high accuracy in a short time frame. Yet the black-box way in which they do so leaves practical users circumspect about relying on them without much understanding of the how and why of their predictions. The question raised in this book is how this 'masked hero' can be made more comprehensible and friendly to the public: provide a surrogate model for its hidden optimization engine, replace the method completely, or appoint a friendlier approach to tag along and offer the much-desired explanations? Evolutionary algorithms can do all of these, and this book presents such possibilities for achieving high accuracy, comprehensibility, and reasonable runtime, as well as unconstrained performance.
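
A compact sketch of one of the combinations mentioned above: a simple evolution strategy that evolves a linear rule to mimic a trained SVM's predictions, giving a more transparent surrogate. The data, fitness measure, and parameters are invented for illustration and are not the authors' method.

    # Illustrative only: evolve a linear surrogate for an SVM's predictions.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    X, y = make_classification(n_samples=300, n_features=2, n_informative=2,
                               n_redundant=0, random_state=0)
    svm = SVC(kernel="rbf").fit(X, y)
    target = svm.predict(X)                     # labels we try to reproduce

    def agreement(w):
        # Fitness: how often the linear rule sign(w.x + b) matches the SVM.
        pred = (X @ w[:2] + w[2] > 0).astype(int)
        return np.mean(pred == target)

    best = rng.normal(size=3)                   # [w1, w2, b]
    for _ in range(500):                        # (1+20) evolution strategy
        offspring = best + rng.normal(scale=0.2, size=(20, 3))
        scores = np.array([agreement(o) for o in offspring])
        if scores.max() >= agreement(best):
            best = offspring[np.argmax(scores)]

    print("surrogate agreement with SVM:", agreement(best))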

Basics of Game Design (Hardcover)
Michael Moore
R5,854 Discovery Miles 58 540 Ships in 12 - 19 working days

Basics of Game Design is for anyone wanting to become a professional game designer. Focusing on creating the game mechanics for data-driven games, it covers role-playing, real-time strategy, first-person shooter, simulation, and other games. Written by a 25-year veteran of the game industry, the guide offers detailed explanations of how to design the data sets used to resolve game play for moving, combat, solving puzzles, interacting with NPCs, managing inventory, and much more. Advice on developing stories for games, building maps and levels, and designing the graphical user interface is also included.

.NET Test Automation Recipes - A Problem-Solution Approach (Hardcover, 1st ed.)
James McCaffrey
R1,614 Discovery Miles 16 140 Ships in 10 - 15 working days

If you develop, test, or manage .NET software, you will find ".NET Test Automation Recipes: A Problem-Solution Approach" very useful. The book presents practical techniques for writing lightweight software test automation in a .NET environment and covers API testing thoroughly. It also discusses lightweight, custom Windows application user interface automation and teaches you low-level web application user interface automation. Additional material covers SQL stored procedure testing techniques.

The examples in this book have been successfully used in seminars and teaching environments where they have proven highly effective for students who are learning intermediate-level .NET programming. You'll come away from the book knowing how to write production-quality combination and permutation methods. Table of contents: API Testing; Reflection-Based UI Testing; Windows-Based UI Testing; Test Harness Design Patterns; Request-Response Testing; Script-Based Web UI Testing; Low-Level Web UI Testing; Web Services Testing; SQL Stored Procedure Testing; Combinations and Permutations; ADO.NET Testing; XML Testing.

Advances in Digital Forensics V - Fifth IFIP WG 11.9 International Conference on Digital Forensics, Orlando, Florida, USA, January 26-28, 2009, Revised Selected Papers (Hardcover, 2009 ed.)
Gilbert Peterson, Sujeet Shenoi
R2,917 Discovery Miles 29 170 Ships in 10 - 15 working days

Digital forensics deals with the acquisition, preservation, examination, analysis and presentation of electronic evidence. Networked computing, wireless communications and portable electronic devices have expanded the role of digital forensics beyond traditional computer crime investigations. Practically every crime now involves some aspect of digital evidence; digital forensics provides the techniques and tools to articulate this evidence. Digital forensics also has myriad intelligence applications. Furthermore, it has a vital role in information assurance - investigations of security breaches yield valuable information that can be used to design more secure systems.

Advances in Digital Forensics V describes original research results and innovative applications in the discipline of digital forensics. In addition, it highlights some of the major technical and legal issues related to digital evidence and electronic crime investigations. The areas of coverage include: themes and issues, forensic techniques, integrity and privacy, network forensics, forensic computing, investigative techniques, legal issues and evidence management.

This book is the fifth volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.9 on Digital Forensics, an international community of scientists, engineers and practitioners dedicated to advancing the state of the art of research and practice in digital forensics. The book contains a selection of twenty-three edited papers from the Fifth Annual IFIP WG 11.9 International Conference on Digital Forensics, held at the National Center for Forensic Science, Orlando, Florida, USA in the spring of 2009.

Advances in Digital Forensics V is an important resource for researchers, faculty members and graduate students, as well as for practitioners and individuals engaged in research and development efforts for the law enforcement and intelligence communities.

Managing Software Engineering Knowledge (Hardcover, 2003 ed.)
Aybuke Aurum, Ross Jeffery, Claes Wohlin, Meliha Handzic
R3,111 Discovery Miles 31 110 Ships in 10 - 15 working days

Software development is a complex problem-solving activity with a high level of uncertainty. There are many technical challenges concerning scheduling, cost estimation, reliability, performance, etc., which are further aggravated by weaknesses such as changing requirements, team dynamics, and high staff turnover. Thus the management of knowledge and experience is a key means of systematic software development and process improvement. "Managing Software Engineering Knowledge" illustrates several theoretical examples of this vision and solutions applied to industrial practice. It is structured in four parts addressing the motives for knowledge management, the concepts and models used in knowledge management for software engineering, their application to software engineering, and practical guidelines for managing software engineering knowledge. This book provides a comprehensive overview of the state of the art and best practice in knowledge management applied to software engineering. While researchers and graduate students will benefit from the interdisciplinary approach leading to basic frameworks and methodologies, professional software developers and project managers will also profit from industrial experience reports and practical guidelines.

Trust Management III - Third IFIP WG 11.11 International Conference, IFIPTM 2009, West Lafayette, IN, USA, June 15-19, 2009, Proceedings (Hardcover, 2009 ed.)
Elena Ferrari, Ninghui Li, Elisa Bertino, Yucel Karabulut
R2,916 Discovery Miles 29 160 Ships in 10 - 15 working days

This volume contains the proceedings of IFIPTM 2009, the Third IFIP WG 11.11 International Conference on Trust Management, held at Purdue University in West Lafayette, Indiana, USA during June 15-19, 2009. IFIPTM 2009 provided a truly global platform for the reporting of research, development, policy and practice in the interdependent areas of privacy, security, and trust. Building on the traditions inherited from the highly successful iTrust conference series, the IFIPTM 2007 conference in Moncton, New Brunswick, Canada, and the IFIPTM 2008 conference in Trondheim, Norway, IFIPTM 2009 focused on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion about relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2009 was an open IFIP conference. The program of the conference featured both theoretical research papers and reports of real-world case studies. IFIPTM 2009 received 44 submissions. The Program Committee selected 17 papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include one invited paper and five demo descriptions. The highlights of IFIPTM 2009 included invited talks and tutorials by academic and governmental experts in the fields of trust management, privacy and security, including Eugene Spafford, Marianne Winslett, and Michael Novak. Running an international conference requires an immense effort from all parties involved. We would like to thank the Program Committee members and external referees for having provided timely and in-depth reviews of the submitted papers. We would also like to thank the Workshop, Tutorial, Demonstration, Local Arrangements, and Website Chairs for having provided great help organizing the conference.

TeX Reference Manual (Hardcover, 2002 ed.)
David Bausum
R4,581 Discovery Miles 45 810 Ships in 10 - 15 working days

Introduction, or why I wrote this book: In the fall of 1997 a dedicated troff user e-mailed me the macros he used to typeset his books. I took one look inside his file and thought, "I can do this; it's just code." As an experiment I spent a week and wrote a C program and troff macros which formatted and typeset a membership directory for a scholarly society with approximately 2,000 members. When I was done, I could enter two commands, and my program and troff would convert raw membership data into 200 pages of PostScript in 35 seconds. Previously, it had taken me several days to prepare camera-ready copy for the directory using a word processor. For completeness I sat down and tried to write TeX macros for the typesetting. I failed. Although ninety-five percent of my macros worked, I was unable to prepare the columns the project required. As my frustration grew, I began this book - mentally, in my head - as an answer to the question, "Why is TeX so hard to learn?" Why use TeX? Lest you accuse me of the old horse and cart problem, I should address the question, "Why use TeX at all?" before I explain why TeX is hard. I use TeX for the following reasons. It is stable, fast, free, and it uses ASCII. Of course, the most important reason is: TeX does a fantastic job. By stable, I mean it is not likely to change in the next 10 years (much less the next one or two), and it is free of bugs. Both of these are important.

Noniterative Coordination in Multilevel Systems (Hardcover, 1999 ed.)
Todor Stoilov
R2,902 Discovery Miles 29 020 Ships in 10 - 15 working days

Multilevel decision theory arises to resolve the contradiction between increasing requirements towards the process of design, synthesis, control and management of complex systems and the limited power of the technical, control, computer and other executive devices which have to perform actions and satisfy requirements in real time. This theory suggests how to replace the centralised management of the system by hierarchical coordination of sub-processes. All sub-processes have lower dimensions, which supports easier management and decision making. But the sub-processes are interconnected and they influence each other. Multilevel systems theory supports two main methodological tools: decomposition and coordination. Both have been developed and implemented in practical applications concerning design, control and management of complex systems. In general, it is always beneficial to find the best or optimal solution in processes of system design, control and management. The pursuit of the best (optimal) decision requires presenting all activities as the definition and then the solution of an appropriate optimization problem. Every optimization process needs the mathematical definition and solution of a well-stated optimization problem. These problems belong to two classes: static optimization and dynamic optimization. Static optimization problems are solved by applying methods of mathematical programming: conditional and unconditional optimization. Dynamic optimization problems are solved by methods of the calculus of variations: the Euler-Lagrange method, the maximum principle, and dynamic programming.
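
As a toy illustration of the decomposition-and-coordination idea sketched above (not the book's algorithm), the snippet below uses price-based coordination: two subsystems solve small local problems, and a coordinator iteratively adjusts a price until a shared resource constraint is met. The problem data are invented.

    # Price-based coordination of two quadratic subproblems (toy example).
    # Minimise (x1 - 4)^2 + (x2 - 6)^2  subject to  x1 + x2 = 8.
    def local_solve(a, price):
        # Each subsystem minimises (x - a)^2 + price * x on its own:
        # derivative 2(x - a) + price = 0  ->  x = a - price / 2.
        return a - price / 2.0

    price, step, resource = 0.0, 0.4, 8.0
    for _ in range(100):
        x1 = local_solve(4.0, price)           # subsystem 1
        x2 = local_solve(6.0, price)           # subsystem 2
        # Coordinator: raise the price if the shared resource is overused.
        price += step * (x1 + x2 - resource)

    print(f"x1={x1:.3f}, x2={x2:.3f}, price={price:.3f}")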

Protocol Engineering (Hardcover, 2012 ed.)
Hartmut Koenig
R2,484 Discovery Miles 24 840 Ships in 10 - 15 working days

Communication protocols form the operational basis of computer networks and telecommunication systems. They are behavior conventions that describe how communication systems interact with each other, defining the temporal order of the interactions and the formats of the data units exchanged - essentially they determine the efficiency and reliability of computer networks. Protocol Engineering is an important discipline covering the design, validation, and implementation of communication protocols. Part I of this book is devoted to the fundamentals of communication protocols, describing their working principles and implicitly also those of computer networks. The author introduces the concepts of service, protocol, layer, and layered architecture, and introduces the main elements required in the description of protocols using a model language. He then presents the most important protocol functions. Part II deals with the description of communication protocols, offering an overview of the various formal methods, the essence of Protocol Engineering. The author introduces the fundamental description methods, such as finite state machines, Petri nets, process calculi, and temporal logics, that are in part used as semantic models for formal description techniques. He then introduces one representative technique for each of the main description approaches, among others SDL and LOTOS, and surveys the use of UML for describing protocols. Part III covers the protocol life cycle and the most important development stages, presenting the reader with approaches for systematic protocol design, with various verification methods, with the main implementation techniques, and with strategies for their testing, in particular with conformance and interoperability tests, and the test description language TTCN. The author uses the simple data transfer example protocol XDT (eXample Data Transfer) throughout the book as a reference protocol to exemplify the various description techniques and to demonstrate important validation and implementation approaches. The book is an introduction to communication protocols and their development for undergraduate and graduate students of computer science and communication technology, and it is also a suitable reference for engineers and programmers. Most chapters contain exercises, and the author's accompanying website provides further online material including a complete formal description of the XDT protocol and an animated simulation visualizing its behavior.
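
To make the finite-state-machine idea above concrete, here is a small hedged sketch (not the book's XDT protocol) of a sender FSM for a stop-and-wait style data-transfer protocol, written as a transition table; states, events and actions are invented for illustration.

    # Toy sender FSM for a stop-and-wait style protocol (illustrative only).
    TRANSITIONS = {
        ("IDLE",     "send_request"): ("WAIT_ACK", "transmit data unit"),
        ("WAIT_ACK", "ack_received"): ("IDLE",     "deliver confirmation"),
        ("WAIT_ACK", "timeout"):      ("WAIT_ACK", "retransmit data unit"),
        ("WAIT_ACK", "abort"):        ("IDLE",     "report failure"),
    }

    def run(events):
        state = "IDLE"
        for event in events:
            state, action = TRANSITIONS.get((state, event), (state, "ignore"))
            print(f"{event:13s} -> {state:8s} ({action})")
        return state

    # Example run: a lost acknowledgement forces one retransmission.
    run(["send_request", "timeout", "ack_received"])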

Equilibrium Problems and Variational Models (Hardcover, 2003 ed.)
P. Daniele, F. Giannessi, A. Maugeri
R3,140 Discovery Miles 31 400 Ships in 10 - 15 working days

The volume, devoted to variational analysis and its applications, collects selected and refereed contributions, which provide an outline of the field. The meeting of the title "Equilibrium Problems and Variational Models," which was held in Erice (Sicily) in the period June 23 - July 2, 2000, was the occasion of the presentation of some of these papers; other results are a consequence of a fruitful and constructive atmosphere created during the meeting. New results, which enlarge the field of application of variational analysis, are presented in the book; they deal with vectorial analysis, time-dependent variational analysis, exact penalization, high-order derivatives, geometric aspects, distance functions and the log-quadratic proximal methodology. The new theoretical results allow one to improve in a remarkable way the study of significant problems arising from the applied sciences, such as continuum models of transportation, unilateral problems, multicriteria spatial price models, network equilibrium problems and many others. As noted in the previous book "Equilibrium Problems: Nonsmooth Optimization and Variational Inequality Models," edited by F. Giannessi, A. Maugeri and P.M. Pardalos, Kluwer Academic Publishers, Vol. 58 (2001), the progress obtained by variational analysis has made it possible to handle problems whose equilibrium conditions are not obtained by the minimization of a functional. These problems obey a more realistic equilibrium condition expressed by a generalized orthogonality (complementarity) condition, which enriches our knowledge of the equilibrium behaviour. This volume also presents important examples of this formulation.
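
For readers new to the terminology, the generalized orthogonality (complementarity) condition mentioned above can be stated in the standard finite-dimensional variational-inequality form (a textbook formulation, not a result specific to this volume):

    % Variational inequality VI(F, K): find x* in the closed convex set K with
    \langle F(x^*),\, x - x^* \rangle \ge 0 \qquad \text{for all } x \in K.
    % When K is a closed convex cone with dual cone K^*, this is equivalent to
    % the complementarity (generalized orthogonality) condition:
    x^* \in K, \qquad F(x^*) \in K^*, \qquad \langle F(x^*),\, x^* \rangle = 0.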

Inventing Software - The Rise of Computer-Related Patents (Hardcover)
Kenneth Nichols
R2,770 Discovery Miles 27 700 Ships in 10 - 15 working days

Since the introduction of personal computers, software has emerged as a driving force in the global economy and a major industry in its own right. During this time, the U.S. government has reversed its prior policy against software patents and is now issuing thousands of such patents each year, provoking heated controversy among programmers, lawyers, scholars, and software companies. This book is the first to step outside of the highly-polarized debate and examine the current state of the law, its suitability to the realities of software development, and its implications for day-to-day software development.

Written by a former lawyer and working software developer, "Inventing Software" provides a comprehensive overview of software patents, from the lofty perspectives of legal history and computing theory to the technical details and issues of actual patents. People interested in the legal aspect of software patents will find detailed technical analysis of actual patented software, the legal strategies behind the wording of the patents, and an analysis of the ease or difficulty of detecting infringements. Software developers will find ways to integrate patent planning into their standard software engineering practices, and a practical guide for studying and appraising their competitors' patents and safeguarding the value of their own. Intended primarily for programmers and software industry executives and managers, "Inventing Software" will also be useful, illuminating reading for attorneys and software company investors.

Number Theory for Computing (Hardcover, 2nd ed. 2002)
M. E. Hellmann; Song Y. Yan
R2,623 Discovery Miles 26 230 Ships in 12 - 19 working days

There are many surprising connections between the theory of numbers, which is one of the oldest branches of mathematics, and computing and information theory. Number theory has important applications in computer organization and security, coding and cryptography, random number generation, hash functions, and graphics. Conversely, number theorists use computers in factoring large integers, determining primes, testing conjectures, and solving other problems. This book takes the reader from elementary number theory, via algorithmic number theory, to applied number theory in computer science. It introduces basic concepts, results, and methods, and discusses their applications in the design of hardware and software, cryptography, and security. It is aimed at undergraduates in computing and information technology, but will also be valuable to mathematics students interested in applications. In this second edition, full proofs of many theorems have been added and some corrections made.
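
As one concrete example of the applications mentioned above (chosen here purely for illustration, not taken from the book), fast modular exponentiation underlies much of the cryptography and primality testing the blurb refers to:

    # Square-and-multiply modular exponentiation and a simple Fermat test.
    import random

    def mod_pow(base, exponent, modulus):
        result, base = 1, base % modulus
        while exponent > 0:
            if exponent & 1:                   # use this bit of the exponent
                result = (result * base) % modulus
            base = (base * base) % modulus
            exponent >>= 1
        return result

    def probably_prime(n, rounds=20):
        # Fermat test: a^(n-1) = 1 (mod n) for prime n and gcd(a, n) = 1.
        # (Carmichael numbers can fool it; Miller-Rabin is the usual fix.)
        if n < 4:
            return n in (2, 3)
        return all(mod_pow(random.randrange(2, n - 1), n - 1, n) == 1
                   for _ in range(rounds))

    print(mod_pow(7, 128, 13))        # same result as pow(7, 128, 13)
    print(probably_prime(2**61 - 1))  # a Mersenne prime -> True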

Algorithms for Parallel Processing (Hardcover, 1999 ed.)
Michael T. Heath, Abhiram Ranade, Robert S. Schreiber
R3,097 Discovery Miles 30 970 Ships in 10 - 15 working days

This IMA Volume in Mathematics and its Applications, ALGORITHMS FOR PARALLEL PROCESSING, is based on the proceedings of a workshop that was an integral part of the 1996-97 IMA program on "Mathematics in High-Performance Computing." The workshop brought together algorithm developers from theory, combinatorics, and scientific computing. The topics ranged over models, linear algebra, sorting, randomization, and graph algorithms and their analysis. We thank Michael T. Heath of the University of Illinois at Urbana (Computer Science), Abhiram Ranade of the Indian Institute of Technology (Computer Science and Engineering), and Robert S. Schreiber of Hewlett Packard Laboratories for their excellent work in organizing the workshop and editing the proceedings. We also take this opportunity to thank the National Science Foundation (NSF) and the Army Research Office (ARO), whose financial support made the workshop possible. Avner Friedman and Robert Gulliver. The Workshop on Algorithms for Parallel Processing was held at the IMA September 16-20, 1996; it was the first workshop of the IMA year dedicated to the mathematics of high-performance computing. The workshop organizers were Abhiram Ranade of the Indian Institute of Technology, Bombay, Michael Heath of the University of Illinois, and Robert Schreiber of Hewlett Packard Laboratories. Our idea was to bring together researchers who do innovative, exciting, parallel algorithms research on a wide range of topics, and by sharing insights, problems, tools, and methods to learn something of value from one another.

Future Directions for Intelligent Systems and Information Sciences - The Future of Speech and Image Technologies, Brain Computers, WWW, and Bioinformatics (Hardcover, 2000 ed.)
Nikola Kasabov
R4,593 Discovery Miles 45 930 Ships in 10 - 15 working days

This edited volume comprises invited chapters that cover five areas of the current and future development of intelligent systems and information sciences. Half of the chapters were presented as invited talks at the workshop "Future Directions for Intelligent Systems and Information Sciences" held in Dunedin, New Zealand, 22-23 November 1999, after the International Conference on Neuro-Information Processing (ICONIP/ANZIIS/ANNES '99) held in Perth, Australia. In order to make this volume useful for researchers and academics in the broad area of information sciences, I invited prominent researchers to submit materials and present their views about future paradigms, trends and directions. Part I contains chapters on adaptive, evolving, learning systems. These are systems that learn in a life-long, on-line mode and in a changing environment. The first chapter, written by the editor, briefly presents the paradigm of Evolving Connectionist Systems (ECOS) and some of their applications. The chapter by Sung-Bae Cho presents the paradigms of artificial life and evolutionary programming in the context of several applications (mobile robots, adaptive agents on the WWW). The following three chapters, written by R. Duro, J. Santos and J. A. Becerra (Chapter 3), G. Coghill (Chapter 4), and Y. Maeda (Chapter 5), introduce new techniques for building adaptive, learning robots.

Multiprocessor Execution of Logic Programs (Hardcover, 1994 ed.)
Gopal Gupta
R4,497 Discovery Miles 44 970 Ships in 10 - 15 working days

Multiprocessor Execution of Logic Programs addresses the problem of efficient implementation of logic programming languages, specifically Prolog, on multiprocessor architectures. The approaches and implementations developed attempt to take full advantage of sequential implementation technology developed for Prolog (such as the WAM) while exploiting all forms of control parallelism present in logic programs, namely, or-parallelism, independent and-parallelism and dependent and-parallelism. Coverage includes a thorough survey of parallel implementation techniques and parallel systems developed for Prolog. Multiprocessor Execution of Logic Programs is recommended for people implementing parallel logic programming systems, parallel symbolic systems, parallel AI systems, and parallel theorem proving systems. It will also be useful to people who wish to learn about the implementation of parallel logic programming systems.

Fundamentals of Algebraic Specification 2 - Module Specifications and Constraints (Hardcover, 1990 ed.)
Hartmut Ehrig, Bernd Mahr
R1,612 Discovery Miles 16 120 Ships in 10 - 15 working days

Since the early seventies concepts of specification have become central in the whole area of computer science. Especially algebraic specification techniques for abstract data types and software systems have gained considerable importance in recent years. They have not only played a central role in the theory of data type specification, but meanwhile have had a remarkable influence on programming language design, system architectures, and software tools and environments. The fundamentals of algebraic specification lay a basis for teaching, research, and development in all those fields of computer science where algebraic techniques are the subject or are used with advantage on a conceptual level. Such a basis, however, we do not regard to be a synopsis of all the different approaches and achievements but rather a consistently developed theory. Such a theory should mainly emphasize elaboration of basic concepts from one point of view and, in a rigorous way, reach the state of the art in the field. We understand fundamentals in this context as: 1. Fundamentals in the sense of a carefully motivated introduction to algebraic specification, which is understandable for computer scientists and mathematicians. 2. Fundamentals in the sense of mathematical theories which are the basis for precise definitions, constructions, results, and correctness proofs. 3. Fundamentals in the sense of concepts from computer science, which are introduced on a conceptual level and formalized in mathematical terms.
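
To give a flavour of what an algebraic specification of an abstract data type looks like in practice, here is a small hedged sketch (an invented example, not drawn from the book): a stack signature, its axioms written as executable checks, and a list-based implementation tested against them.

    # Toy algebraic specification of a Stack ADT, with the axioms checked
    # against a concrete implementation (invented example, not from the book).
    import random

    # Implementation (the "model" of the specification).
    empty = ()
    def push(s, x): return s + (x,)
    def pop(s):     return s[:-1]
    def top(s):     return s[-1]
    def is_empty(s): return s == ()

    # Axioms of the specification, written as boolean checks.
    def axioms_hold(s, x):
        return (pop(push(s, x)) == s and
                top(push(s, x)) == x and
                is_empty(empty) and
                not is_empty(push(s, x)))

    # Property-based check: the implementation satisfies the axioms on
    # randomly generated stacks and values.
    for _ in range(1000):
        s = tuple(random.randrange(100) for _ in range(random.randrange(5)))
        assert axioms_hold(s, random.randrange(100))
    print("all stack axioms hold on the sampled cases")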
