
The Definitive Guide to Apache mod_rewrite (Hardcover, 1st ed.)
Rich Bowen
R1,417 Discovery Miles 14 170 Ships in 18 - 22 working days

Website organization is highly dynamic and often chaotic, so it is crucial that web servers be able to manipulate URLs: to cope with temporarily or permanently relocated resources, to block automated worms, and to control access to resources.

The Apache mod_rewrite module has long inspired fits of joy because it offers an unparalleled toolset for manipulating URLs. "The Definitive Guide to Apache mod_rewrite" guides you through configuration and use of the module for a variety of purposes, including basic and conditional rewrites, access control, virtual host maintenance, and proxies.
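As a flavor of the kinds of rules the book covers, here is a minimal sketch of a basic rewrite and a conditional access-control rule; the paths and user-agent string are hypothetical and not taken from the book:

```apache
# Permanently redirect a relocated resource (hypothetical paths)
RewriteEngine On
RewriteRule "^old-page\.html$" "/new-page.html" [R=301,L]

# Conditional rewrite: refuse requests from a known worm's user agent
RewriteCond "%{HTTP_USER_AGENT}" "BadBot" [NC]
RewriteRule "^" "-" [F]
```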

This book was authored by Rich Bowen, noted Apache expert and Apache Software Foundation member, and draws on his years of experience administering the Apache server and regularly speaking and writing about it.

Nearest Neighbor Search - A Database Perspective (Hardcover, 2005 ed.)
Apostolos N. Papadopoulos, Yannis Manolopoulos
R2,755 Discovery Miles 27 550 Ships in 18 - 22 working days

Modern applications are both data and computationally intensive and require the storage and manipulation of voluminous traditional (alphanumeric) and nontraditional data sets (images, text, geometric objects, time-series). Examples of such emerging application domains are: Geographical Information Systems (GIS), Multimedia Information Systems, CAD/CAM, Time-Series Analysis, Medical Information Systems, On-Line Analytical Processing (OLAP), and Data Mining. These applications pose diverse requirements with respect to the information and the operations that need to be supported. From the database perspective, new techniques and tools therefore need to be developed to increase processing efficiency.

This monograph explores the way spatial database management systems aim at supporting queries that involve the space characteristics of the underlying data, and discusses query processing techniques for nearest neighbor queries. It provides both basic concepts and state-of-the-art results in spatial databases and parallel processing research, and studies numerous applications of nearest neighbor queries.
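For readers unfamiliar with the query type, a nearest neighbor search can be sketched as a brute-force scan in a few lines of Python; the spatial database techniques the monograph studies replace this linear scan with index structures such as R-trees. The sample points are illustrative, not drawn from the book:

```python
import math

def nearest_neighbor(points, query):
    """Brute-force nearest neighbor: scan every point and keep the closest.
    Spatial indexes prune this O(n) scan in real database systems."""
    return min(points, key=lambda p: math.dist(p, query))

pts = [(1.0, 1.0), (4.0, 2.0), (0.0, 5.0)]
print(nearest_neighbor(pts, (3.5, 2.5)))  # -> (4.0, 2.0), the closest point
```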

Data Streams - Models and Algorithms (Hardcover)
Charu C. Aggarwal
R4,387 Discovery Miles 43 870 Ships in 10 - 15 working days

Data Streams: Models and Algorithms primarily discusses issues related to the mining aspects of streams. Recent progress in hardware technology makes it possible for organizations to store and record large streams of transactional data. For example, even simple daily transactions, such as using the credit card or phone, result in automated data storage, which brings us to a fairly new topic called data streams. This volume covers mining aspects of data streams in a comprehensive style, in which each contributed chapter contains a survey on the topic, the key ideas in the field from that particular topic, and future research directions. Data Streams: Models and Algorithms is intended for a professional audience composed of researchers and practitioners in industry. This book is also appropriate for graduate-level students in computer science.
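A defining constraint in stream mining is that items arrive once and cannot all be stored. Reservoir sampling is a classic illustration of one-pass, bounded-memory processing of the kind the volume's chapters build on; this sketch is illustrative and not taken from the book:

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of unknown
    length using O(k) memory: item i replaces a random slot with
    probability k/(i+1)."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            j = random.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample

print(reservoir_sample(range(1_000_000), 5))  # five items, one pass, tiny memory
```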

Mining Spatio-Temporal Information Systems (Hardcover, 2002 ed.)
Roy Ladner, Kevin Shaw, Mahdi Abdelguerfi
R2,746 Discovery Miles 27 460 Ships in 18 - 22 working days

Mining Spatio-Temporal Information Systems, an edited volume composed of chapters from leading experts in the field of spatio-temporal information systems, addresses the many issues involved in supporting their modeling, creation, querying, visualizing and mining. The book is intended to bring together a coherent body of recent knowledge relating to STIS data modeling, design, implementation, and the use of STIS in knowledge discovery. In particular, the reader is exposed to the latest techniques for the practical design of STIS, essential for complex query processing.
Mining Spatio-Temporal Information Systems is structured to meet the needs of practitioners and researchers in industry and graduate-level students in Computer Science.

Foundations of Dependable Computing - System Implementation (Hardcover, 1994 ed.)
Gary M. Koob, Clifford G. Lau
R4,190 Discovery Miles 41 900 Ships in 18 - 22 working days

Foundations of Dependable Computing: System Implementation explores the system infrastructure needed to support the various paradigms of Paradigms for Dependable Applications. Approaches to implementing support mechanisms and to incorporating additional appropriate levels of fault detection and fault tolerance at the processor, network, and operating system level are presented. A primary concern at these levels is balancing cost and performance against coverage and overall dependability. As these chapters demonstrate, low overhead, practical solutions are attainable and not necessarily incompatible with performance considerations. The section on innovative compiler support, in particular, demonstrates how the benefits of application specificity may be obtained while reducing hardware cost and run-time overhead.

A companion to this volume (published by Kluwer), subtitled Models and Frameworks for Dependable Systems, presents two comprehensive frameworks for reasoning about system dependability, thereby establishing a context for understanding the roles played by the specific approaches presented in this book's two companion volumes. It then explores the range of models and analysis methods necessary to design, validate and analyze dependable systems. Another companion (also published by Kluwer), subtitled Paradigms for Dependable Applications, presents a variety of specific approaches to achieving dependability at the application level. Driven by the higher level fault models of Models and Frameworks for Dependable Systems, and built on the lower level abstractions implemented in this volume, those approaches demonstrate how dependability may be tuned to the requirements of an application, the fault environment, and the characteristics of the target platform.
Three classes of paradigms are considered: protocol-based paradigms for distributed applications, algorithm-based paradigms for parallel applications, and approaches to exploiting application semantics in embedded real-time control systems.

Information Security in Research and Business - Proceedings of the IFIP TC11 13th international conference on Information Security (SEC '97): 14-16 May 1997, Copenhagen, Denmark (Hardcover, 1997 ed.)
Louise Yngstroem, Jan Carlsen
R5,403 Discovery Miles 54 030 Ships in 18 - 22 working days

Recently, IT has entered all important areas of society. Enterprises, individuals and civilisations all depend on functioning, safe and secure IT. Focus on IT security has previously been fractionalised, detailed and often linked to non-business applications. The aim of this book is to address the current and future prospects of modern IT security and its functionality in business, trade, industry, health care and government. The main topic areas covered include existing IT security tools and methodology for modern IT environments; laws, regulations and ethics in IT security environments; current and future prospects in technology, infrastructures, technique and methodology; and IT security in retrospect.

Facilitating Technology Transfer through Partnership (Hardcover, 1997 ed.)
Tom McMaster, E. Mumford, E. B. Swanson, B Warboys, David Wastell
R4,230 Discovery Miles 42 300 Ships in 18 - 22 working days

The primary aim of this book is to gather and collate articles representing the best and latest thinking in the domain of technology transfer, from research, academia and practice around the world. We envisage that the book will, as a result, represent an important source of knowledge in this domain for students (undergraduate and postgraduate), researchers, practitioners and consultants, chiefly in the software engineering and IT industries, but also in management and other organisational and social disciplines. An important aspect of the book is the role played by reflective practitioners (and not just academics): they are involved in the production and evaluation of contributions, as well as in the design and delivery of the conference events upon which the book is based.

Text Retrieval and Filtering - Analytic Models of Performance (Hardcover, 1998 ed.)
Robert M. Losee
R5,279 Discovery Miles 52 790 Ships in 18 - 22 working days

Text Retrieval and Filtering: Analytical Models of Performance is the first book that addresses the problem of analytically computing the performance of retrieval and filtering systems. The book describes means by which retrieval may be studied analytically, allowing one to describe current performance, predict future performance, and to understand why systems perform as they do. The focus is on retrieving and filtering natural language text, with material addressing retrieval performance for the simple case of queries with a single term, the more complex case with multiple terms, both with term independence and term dependence, and for the use of grammatical information to improve performance. Unambiguous statements of the conditions under which one method or system will be more effective than another are developed. Text Retrieval and Filtering: Analytical Models of Performance focuses on the performance of systems that retrieve natural language text, considering full sentences as well as phrases and individual words. The last chapter explicitly addresses how grammatical constructs and methods may be studied in the context of retrieval or filtering system performance. The book builds toward solving this problem, although the material in earlier chapters is as useful to those addressing non-linguistic, statistical concerns as it is to linguists. Those interested in grammatical information should be cautioned to carefully examine earlier chapters, especially Chapters 7 and 8, which discuss purely statistical relationships between terms, before moving on to Chapter 10, which explicitly addresses linguistic issues. Text Retrieval and Filtering: Analytical Models of Performance is suitable as a secondary text for a graduate level course on Information Retrieval or Linguistics, and as a reference for researchers and practitioners in industry.

Fundamentals of Cryptology - A Professional Reference and Interactive Tutorial (Mixed media product, 2000 ed.)
Henk C.A. van Tilborg
R1,540 Discovery Miles 15 400 Ships in 18 - 22 working days

The protection of sensitive information against unauthorized access or fraudulent changes has been of prime concern throughout the centuries. Modern communication techniques, using computers connected through networks, make all data even more vulnerable to these threats. In addition, new issues have surfaced that did not exist previously, e.g. adding a signature to an electronic document. Cryptology addresses the above issues - it is at the foundation of all information security. The techniques employed to this end have become increasingly mathematical in nature. Fundamentals of Cryptology serves as an introduction to modern cryptographic methods. After a brief survey of classical cryptosystems, it concentrates on three main areas. First, stream ciphers and block ciphers are discussed. These systems have extremely fast implementations, but sender and receiver must share a secret key. Second, the book presents public key cryptosystems, which make it possible to protect data without a prearranged key. Their security is based on intractable mathematical problems, such as the factorization of large numbers. The remaining chapters cover a variety of topics, including zero-knowledge proofs, secret sharing schemes and authentication codes. Two appendices explain all mathematical prerequisites in detail: one presents elementary number theory (Euclid's Algorithm, the Chinese Remainder Theorem, quadratic residues, inversion formulas, and continued fractions) and the other introduces finite fields and their algebraic structure. Fundamentals of Cryptology is an updated and improved version of An Introduction to Cryptology, originally published in 1988. Apart from a revision of the existing material, there are many new sections, and two new chapters on elliptic curves and authentication codes, respectively.
In addition, the book is accompanied by a full text electronic version on CD-ROM as an interactive Mathematica manuscript. Fundamentals of Cryptology will be of interest to computer scientists, mathematicians, and researchers, students, and practitioners in the area of cryptography.
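Since the appendix material covers Euclid's Algorithm, a short Python sketch of its extended form may be useful; it underlies computations such as deriving an RSA private exponent. The numbers below are the common textbook RSA toy parameters (e = 17, phi = 3120), not an example taken from this book:

```python
def extended_gcd(a, b):
    """Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y == g."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

def mod_inverse(a, m):
    """Modular inverse of a mod m; exists only when gcd(a, m) == 1."""
    g, x, _ = extended_gcd(a, m)
    if g != 1:
        raise ValueError("inverse does not exist")
    return x % m

print(mod_inverse(17, 3120))  # -> 2753, since 17 * 2753 = 1 (mod 3120)
```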

Data-Driven Technology for Engineering Systems Health Management - Design Approach, Feature Construction, Fault Diagnosis, Prognosis, Fusion and Decisions (Hardcover, 1st ed. 2017)
Gang Niu
R5,092 Discovery Miles 50 920 Ships in 10 - 15 working days

This book introduces condition-based maintenance (CBM)/data-driven prognostics and health management (PHM) in detail, first explaining the PHM design approach from a systems engineering perspective, then summarizing and elaborating on the data-driven methodology for feature construction, as well as feature-based fault diagnosis and prognosis. The book includes a wealth of illustrations and tables to help explain the algorithms, as well as practical examples showing how to use this tool to solve situations for which analytic solutions are poorly suited. It equips readers to apply the concepts discussed in order to analyze and solve a variety of problems in PHM system design, feature construction, fault diagnosis and prognosis.

Multimedia Processing, Communication and Computing Applications - Proceedings of the First International Conference, ICMCCA, 13-15 December 2012 (Hardcover, 2013 ed.)
Punitha P. Swamy, Devanur S Guru
R6,325 Discovery Miles 63 250 Ships in 18 - 22 working days

ICMCCA 2012 is the first International Conference on Multimedia Processing, Communication and Computing Applications, and the theme of the conference is 'Multimedia Processing and its Applications'. Multimedia processing has been an active research area contributing to many frontiers of today's science and technology. This book presents peer-reviewed quality papers on multimedia processing, which covers a very broad area of science and technology. The prime objective of the book is to familiarize readers with the latest scientific developments taking place in the various fields of multimedia processing, which is widely used in many disciplines such as medical diagnosis, digital forensics, object recognition, image and video analysis, robotics, the military and automotive industries, surveillance and security, and quality inspection. The book will help the research community gain insight into overlapping work being carried out across the globe at medical hospitals and institutions, defense labs, forensic labs, academic institutions, IT companies, and security and surveillance organisations. It also discusses the latest state-of-the-art research problems and techniques, and helps to encourage, motivate and introduce budding researchers to the larger domain of multimedia.

Advanced Digital Image Steganography Using LSB, PVD, and EMD - Emerging Research and Opportunities (Hardcover)
Gandharba Swain
R4,138 Discovery Miles 41 380 Ships in 18 - 22 working days

In the last few decades, the use of the Internet has grown tremendously, and the use of online communications has grown even more. The lack of security in private messages between individuals, however, allows hackers to collect loads of sensitive information. Modern security measures are required to prevent this attack on the world's communication technologies. Advanced Digital Image Steganography Using LSB, PVD, and EMD: Emerging Research and Opportunities provides evolving research exploring the theoretical and practical aspects of data encryption techniques and applications within computer science. The book provides introductory knowledge on steganography and its importance, detailed analysis of how RS and PDH analyses are performed, discussion of pixel value differencing principles, and hybrid approaches using substitution, PVD, and EMD principles. It is ideally designed for researchers and for graduate and undergraduate students seeking current research on the security of data during transit.
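To illustrate the baseline technique named in the title, LSB substitution can be sketched on plain integer pixel values. This is an illustrative sketch, not code from the book; PVD and EMD schemes refine the idea by adapting embedding capacity to local image complexity:

```python
def embed_lsb(pixels, message_bits):
    """LSB substitution: overwrite the least significant bit of each pixel
    value with one message bit, so each pixel changes by at most 1."""
    return [(p & ~1) | b for p, b in zip(pixels, message_bits)]

def extract_lsb(pixels, n):
    """Recover the first n embedded bits by reading each pixel's LSB."""
    return [p & 1 for p in pixels[:n]]

cover = [142, 97, 200, 51]
bits = [1, 0, 1, 1]
stego = embed_lsb(cover, bits)
print(stego)                  # -> [143, 96, 201, 51]
print(extract_lsb(stego, 4))  # -> [1, 0, 1, 1]
```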

Foundations of Dependable Computing - Models and Frameworks for Dependable Systems (Hardcover, 1994 ed.)
Gary M. Koob, Clifford G. Lau
R4,159 Discovery Miles 41 590 Ships in 18 - 22 working days

Foundations of Dependable Computing: Models and Frameworks for Dependable Systems presents two comprehensive frameworks for reasoning about system dependability, thereby establishing a context for understanding the roles played by the specific approaches presented in this book's two companion volumes. It then explores the range of models and analysis methods necessary to design, validate and analyze dependable systems.

A companion to this book (published by Kluwer), subtitled Paradigms for Dependable Applications, presents a variety of specific approaches to achieving dependability at the application level. Driven by the higher level fault models of Models and Frameworks for Dependable Systems, and built on the lower level abstractions implemented in a third companion book, subtitled System Implementation, these approaches demonstrate how dependability may be tuned to the requirements of an application, the fault environment, and the characteristics of the target platform. Three classes of paradigms are considered: protocol-based paradigms for distributed applications, algorithm-based paradigms for parallel applications, and approaches to exploiting application semantics in embedded real-time control systems.

Another companion book (published by Kluwer), subtitled System Implementation, explores the system infrastructure needed to support the various paradigms of Paradigms for Dependable Applications. Approaches to implementing support mechanisms and to incorporating additional appropriate levels of fault detection and fault tolerance at the processor, network, and operating system level are presented. A primary concern at these levels is balancing cost and performance against coverage and overall dependability. As these chapters demonstrate, low overhead, practical solutions are attainable and not necessarily incompatible with performance considerations.
The section on innovative compiler support, in particular, demonstrates how the benefits of application specificity may be obtained while reducing hardware cost and run-time overhead.

Compression and Coding Algorithms (Hardcover, 2002 ed.)
Alistair Moffat, Andrew Turpin
R1,561 Discovery Miles 15 610 Ships in 18 - 22 working days

Compression and Coding Algorithms describes in detail the coding mechanisms that are available for use in data compression systems. The well-known Huffman coding technique is one mechanism, but there have been many others developed over the past few decades, and this book describes, explains and assesses them. People undertaking research or software development in the areas of compression and coding algorithms will find this book an indispensable reference. In particular, the careful and detailed description of algorithms and their implementation, plus accompanying pseudo-code that can be readily implemented on computer, make this book a definitive reference in an area currently without one.
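Since Huffman coding is the blurb's named example of a coding mechanism, a compact Python sketch may help convey the idea: repeatedly merge the two least frequent subtrees, so rarer symbols end up with longer codes. This is an illustrative sketch; the book's own pseudo-code is far more careful about implementation detail:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free Huffman code from symbol frequencies.
    Heap entries are [frequency, tie-break index, partial code table]."""
    heap = [[freq, i, {sym: ""}]
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)  # least frequent subtree
        hi = heapq.heappop(heap)  # second least frequent
        codes = {s: "0" + c for s, c in lo[2].items()}
        codes.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], codes])
    return heap[0][2]

codes = huffman_codes("abracadabra")
print(codes)  # 'a', the most frequent symbol, gets the shortest code
```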

Foundations of Dependable Computing - Paradigms for Dependable Applications (Hardcover, 1994 ed.)
Gary M. Koob, Clifford G. Lau
R4,132 Discovery Miles 41 320 Ships in 18 - 22 working days

Foundations of Dependable Computing: Paradigms for Dependable Applications presents a variety of specific approaches to achieving dependability at the application level. Driven by the higher level fault models of Models and Frameworks for Dependable Systems, and built on the lower level abstractions implemented in a third companion book, subtitled System Implementation, these approaches demonstrate how dependability may be tuned to the requirements of an application, the fault environment, and the characteristics of the target platform. Three classes of paradigms are considered: protocol-based paradigms for distributed applications, algorithm-based paradigms for parallel applications, and approaches to exploiting application semantics in embedded real-time control systems.

The companion volume subtitled Models and Frameworks for Dependable Systems presents two comprehensive frameworks for reasoning about system dependability, thereby establishing a context for understanding the roles played by the specific approaches presented in this book's two companion volumes. It then explores the range of models and analysis methods necessary to design, validate and analyze dependable systems.

Another companion book (published by Kluwer), subtitled System Implementation, explores the system infrastructure needed to support the various paradigms of Paradigms for Dependable Applications. Approaches to implementing support mechanisms and to incorporating additional appropriate levels of fault detection and fault tolerance at the processor, network, and operating system level are presented. A primary concern at these levels is balancing cost and performance against coverage and overall dependability. As these chapters demonstrate, low overhead, practical solutions are attainable and not necessarily incompatible with performance considerations.
The section on innovative compiler support, in particular, demonstrates how the benefits of application specificity may be obtained while reducing hardware cost and run-time overhead.

Intelligent Multimedia Databases and Information Retrieval - Advancing Applications and Technologies (Hardcover, New)
Li Yan, Zongmin Ma
R4,933 Discovery Miles 49 330 Ships in 18 - 22 working days

As consumer costs for multimedia devices such as digital cameras and Web phones have decreased and diversity in the market has skyrocketed, the amount of digital information has grown considerably. Intelligent Multimedia Databases and Information Retrieval: Advancing Applications and Technologies details the latest information retrieval technologies and applications, the research surrounding the field, and the methodologies and design related to multimedia databases. Together with academic researchers and developers from both information retrieval and artificial intelligence fields, this book details issues and semantics of data retrieval with contributions from around the globe. As the information and data from multimedia databases continues to expand, the research and documentation surrounding it should keep pace as best as possible, and this book provides an excellent resource for the latest developments.

The Testability of Distributed Real-Time Systems (Hardcover, 1993 ed.)
Werner Schutz
R2,735 Discovery Miles 27 350 Ships in 18 - 22 working days

The Testability of Distributed Real-Time Systems starts by collecting and analyzing all principal problems, as well as their interrelations, that one has to keep in mind when testing a distributed real-time system. The book discusses them in some detail from the viewpoints of software engineering, distributed systems principles, and real-time system development. These problems are organization, observability, reproducibility, the host/target approach, environment simulation, and (test) representativity. Based on this framework, the book summarizes and evaluates the current work done in this area before going on to argue that the particular system architecture (hardware plus operating system) has a much greater influence on testing than is the case for 'ordinary', non-real-time software. The notions of event-triggered and time-triggered system architectures are introduced, and it is shown that time-triggered systems 'automatically' (i.e. by the nature of their system architecture) solve, or greatly ease solving, some of the problems introduced earlier, i.e. observability, reproducibility, and (partly) representativity.

A test methodology is derived for the time-triggered, distributed real-time system MARS. The book describes in detail how the author has taken advantage of its architecture, and shows how the remaining problems can be solved for this particular system architecture. Some experiments conducted to evaluate this test methodology are reported, including the experience gained from them, leading to a description of a number of prototype support tools.

The Testability of Distributed Real-Time Systems can be used by both academic and industrial researchers interested in distributed and/or real-time systems, or in software engineering for such systems. This book can also be used as a text in advanced courses on distributed or real-time systems.

Cyber Security - Everything an Executive Needs to Know (Hardcover)
Phillip Ferraro
R627 R566 Discovery Miles 5 660 Save R61 (10%) Ships in 18 - 22 working days

Text Mining - Predictive Methods for Analyzing Unstructured Information (Hardcover)
Sholom M. Weiss, Nitin Indurkhya, Tong Zhang, Fred Damerau
R4,146 Discovery Miles 41 460 Ships in 18 - 22 working days

Data mining is a mature technology. The prediction problem, looking for predictive patterns in data, has been widely studied. Strong methods are available to the practitioner. These methods process structured numerical information, where uniform measurements are taken over a sample of data. Text is often described as unstructured information. So, it would seem, text and numerical data are different, requiring different methods. Or are they? In our view, a prediction problem can be solved by the same methods, whether the data are structured numerical measurements or unstructured text. Text and documents can be transformed into measured values, such as the presence or absence of words, and the same methods that have proven successful for predictive data mining can be applied to text. Yet, there are key differences. Evaluation techniques must be adapted to the chronological order of publication and to alternative measures of error. Because the data are documents, more specialized analytical methods may be preferred for text. Moreover, the methods must be modified to accommodate very high dimensions: tens of thousands of words and documents. Still, the central themes are similar.
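The transformation described above, turning text into measured values such as the presence or absence of words, can be sketched as follows (an illustrative sketch with a hypothetical vocabulary, not code from the book):

```python
def presence_features(documents, vocabulary):
    """Turn free text into fixed-length 0/1 vectors: entry j is 1 iff
    vocabulary word j appears in the document."""
    vectors = []
    for doc in documents:
        words = set(doc.lower().split())
        vectors.append([1 if w in words else 0 for w in vocabulary])
    return vectors

vocab = ["data", "mining", "text"]
docs = ["Text mining borrows from data mining", "Structured data"]
print(presence_features(docs, vocab))  # -> [[1, 1, 1], [1, 0, 0]]
```

Once documents are vectors like these, any standard predictive method can consume them.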

Data Mining and Computational Intelligence (Hardcover, 2001 ed.)
Abraham Kandel, Mark Last, Horst Bunke
R4,208 Discovery Miles 42 080 Ships in 18 - 22 working days

Many business decisions are made in the absence of complete information about the decision consequences. Credit lines are approved without knowing the future behavior of the customers; stocks are bought and sold without knowing their future prices; parts are manufactured without knowing all the factors affecting their final quality; etc. All these cases can be categorized as decision making under uncertainty. Decision makers (human or automated) can handle uncertainty in different ways. Deferring the decision due to the lack of sufficient information may not be an option, especially in real-time systems. Sometimes expert rules, based on experience and intuition, are used. A decision tree is a popular form of representing a set of mutually exclusive rules. An example of a two-branch tree is: if a credit applicant is a student, approve; otherwise, decline. Expert rules are usually based on some hidden assumptions, which are trying to predict the decision consequences. A hidden assumption of the last rule set is: a student will be a profitable customer. Since the direct predictions of the future may not be accurate, a decision maker can consider using some information from the past. The idea is to utilize the potential similarity between the patterns of the past (e.g., "most students used to be profitable") and the patterns of the future (e.g., "students will be profitable").
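The closing idea, utilizing patterns of the past such as "most students used to be profitable", amounts to estimating a rule's support from historical records. A minimal sketch, with hypothetical field names and data:

```python
def rule_support(history, antecedent, consequent):
    """Estimate P(consequent | antecedent) from past records, i.e. how well
    history supports a rule like 'students are profitable'."""
    matching = [r for r in history if r[antecedent]]
    if not matching:
        return 0.0
    return sum(1 for r in matching if r[consequent]) / len(matching)

past = [
    {"student": True, "profitable": True},
    {"student": True, "profitable": True},
    {"student": True, "profitable": False},
    {"student": False, "profitable": True},
]
print(rule_support(past, "student", "profitable"))  # 2 of 3 past students were profitable
```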

Organizational Data Mining - Leveraging Enterprise Data Resources for Optimal Performance (Hardcover, New)
R2,153 Discovery Miles 21 530 Ships in 18 - 22 working days

Successfully competing in the new global economy requires immediate decision capability. This immediate decision capability requires quick analysis of both timely and relevant data. To support this analysis, organizations are piling up mountains of business data in their databases every day. Terabyte-sized (1,000 gigabytes) databases are commonplace in organizations today, and this enormous growth will make petabyte-sized databases (1,000 terabytes) a reality within the next few years (Whiting, 2002). Those organizations making swift, fact-based decisions by optimally leveraging their data resources will outperform those organizations that do not. A technology that facilitates this process of optimal decision-making is known as Organizational Data Mining (ODM). Organizational Data Mining: Leveraging Enterprise Data Resources for Optimal Performance demonstrates how organizations can leverage ODM for enhanced competitiveness and optimal performance.

Communications and Multimedia Security II (Hardcover, 1996 ed.)
Patrick Horster
R4,164 Discovery Miles 41 640 Ships in 18 - 22 working days

In multimedia and communication environments, all documents must be protected against attacks. The movie Forrest Gump showed how multimedia documents can be manipulated. The required security can be achieved by a number of different security measures. This book provides an overview of current research in multimedia and communication security. A broad variety of subjects is addressed, including: network security; attacks; cryptographic techniques; healthcare and telemedicine; security infrastructures; payment systems; access control; models and policies; and auditing and firewalls. This volume contains the selected proceedings of the joint conference on Communications and Multimedia Security, organized by the International Federation for Information Processing and supported by the Austrian Computer Society, Gesellschaft für Informatik e.V., and TeleTrusT Deutschland e.V. The conference took place in Essen, Germany, in September 1996.

Variation Principle in Informational Macrodynamics (Hardcover, 2003 ed.): Vladimir S. Lerner Variation Principle in Informational Macrodynamics (Hardcover, 2003 ed.)
Vladimir S. Lerner
R2,801 Discovery Miles 28 010 Ships in 18 - 22 working days

Informational Macrodynamics (IMD) is an interdisciplinary science that represents a new theoretical and computer-based methodology for the informational description and improvement of systems, covering activities in such areas as thinking, intelligent processes, communications, management, and other nonphysical subjects, with their mutual interactions, informational superimposition, and the information transferred between interactions. IMD is based on the implementation of a single concept by a unique mathematical principle and formalism, rather than on an artificial combination of many arbitrary, auxiliary concepts and/or postulates and different mathematical subjects, such as game, automata, catastrophe, and logical operations theories. This concept is explored mathematically using classical mathematics, such as the calculus of variations and probability theory, which are potent enough without the need to develop new, specialized mathematical systemic methods. The formal IMD model automatically includes related results from other fields, such as linear, nonlinear, collective, and chaotic dynamics; stability theory; information theory; physical analogies of classical and quantum mechanics; irreversible thermodynamics; and kinetics. The main goal of IMD is to reveal information regularities, mathematically expressed by the considered variation principle (VP), as a tool to extract the regularities and define the model that describes them. The IMD regularities and mechanisms are the results of analytical solutions and are not obtained merely by logical argumentation, rational introduction, or reasonable discussion. The IMD information computer modeling formalism includes a human being (as an observer, carrier, and producer of information), with restoration of the model during object observations.

Oracle 10g Developing Media Rich Applications (Paperback, New): Lynne Dunckley, Larry Guros Oracle 10g Developing Media Rich Applications (Paperback, New)
Lynne Dunckley, Larry Guros
R1,994 Discovery Miles 19 940 Ships in 10 - 15 working days

Oracle 10g Developing Media Rich Applications is focused squarely on database administrators and programmers as the foundation of multimedia database applications. With the release of Oracle8 Database in 1997, Oracle became the first commercial database with integrated multimedia technology for application developers. Since then, Oracle has enhanced and extended these features to include native support for image, audio, video, and streaming media storage, indexing, retrieval, and processing in the Oracle Database, the Application Server, and development tools. Databases are no longer just words and numbers for accountants; they should also exploit a full range of media to satisfy customer needs, from race-car engineering to manufacturing processes to security.
Full audio and video support, and the integration of media into databases, are mission critical to these applications. This book details the most recent features of Oracle's multimedia technology, including those of the Oracle 10g Release 2 Database and the Oracle9i Application Server. The technology covered includes: object-relational media storage and services within the database, middle-tier application development interfaces, wireless delivery mechanisms, and Java-based tools.
* Gives broad coverage to the integration of multimedia features such as audio and instrumentation video, from race cars (to analyze performance) to voice and picture recognition for security databases, as well as full multimedia for presentations
* Includes field-tested examples from enterprise environments
* Provides thorough and clear coverage developed in a London University professional course

Advanced Topics in Information Retrieval (Hardcover, 2011 Ed.): Massimo Melucci, Ricardo Baeza-Yates Advanced Topics in Information Retrieval (Hardcover, 2011 Ed.)
Massimo Melucci, Ricardo Baeza-Yates
R2,702 Discovery Miles 27 020 Ships in 18 - 22 working days

Information retrieval is the science concerned with the effective and efficient retrieval of documents based on their semantic content. It is employed to fulfill an information need from a large collection of digital documents. Given the ever-growing amount of documents available and the heterogeneous data structures used for storage, information retrieval has recently faced and tackled novel applications.

In this book, Melucci and Baeza-Yates present a wide-spectrum illustration of recent research results in advanced areas related to information retrieval. Readers will find chapters on e.g. aggregated search, digital advertising, digital libraries, discovery of spam and opinions, information retrieval in context, multimedia resource discovery, quantum mechanics applied to information retrieval, scalability challenges in web search engines, and interactive information retrieval evaluation. All chapters are written by well-known researchers, are completely self-contained and comprehensive, and are complemented by an integrated bibliography and subject index.

With this selection, the editors provide the most up-to-date survey of topics usually not addressed in depth in traditional (text)books on information retrieval. The presentation is intended for a wide audience of people interested in information retrieval: undergraduate and graduate students, post-doctoral researchers, lecturers, and industrial researchers.
