Many complex systems found in nature can be viewed as function optimizers. In particular, they can be viewed as such optimizers of functions in extremely high dimensional spaces. Given the difficulty of performing such high-dimensional optimization with modern computers, there has been a lot of exploration of computational algorithms that try to emulate those naturally-occurring function optimizers. Examples include simulated annealing (SA [15,18]), genetic algorithms (GAs) and evolutionary computation [2,3,9,11,20-22,24,28]. The ultimate goal of this work is an algorithm that can, for any provided high-dimensional function, come close to extremizing that function. Particularly desirable would be such an algorithm that works in an adaptive and robust manner, without any explicit knowledge of the form of the function being optimized. In particular, such an algorithm could be used for distributed adaptive control, one of the most important tasks engineers will face in the future, when the systems they design will be massively distributed and horribly messy congeries of computational systems.
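To make this family of algorithms concrete, here is a minimal simulated-annealing sketch in Python. It is an illustration of the general technique only, not code from any of the surveyed works; the objective function, neighbor move, and geometric cooling schedule are all our assumptions.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=1.0, cooling=0.995, steps=10_000):
    """Minimize f from x0 using a caller-supplied neighbor() move.

    A candidate that worsens f is still accepted with probability
    exp(-delta / t), which lets the search escape local minima; the
    temperature t decays geometrically so late iterations become greedy.
    """
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        fy = f(y)
        delta = fy - fx
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Example: minimize a bumpy 1-D function with a Gaussian neighbor move.
f = lambda x: x * x + 10 * math.sin(3 * x)
x, fx = simulated_annealing(f, 5.0, lambda x: x + random.gauss(0, 0.5))
print(x, fx)
```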
Computers that `program themselves' have long been an aim of computer scientists. Recently genetic programming (GP) has started to show its promise by automatically evolving programs. Indeed, in a small number of problems GP has evolved programs whose performance is similar to or even slightly better than that of programs written by people. The main thrust of GP has been to automatically create functions. While these can be of great use, they contain no memory, and relatively little work has addressed the automatic creation of program code including stored data. This issue is the main focus of Genetic Programming and Data Structures: Genetic Programming + Data Structures = Automatic Programming!. This book is motivated by the observation from software engineering that data abstraction (e.g., via abstract data types) is essential in programs created by human programmers. This book shows that abstract data types can be similarly beneficial to the automatic production of programs using GP. Genetic Programming and Data Structures: Genetic Programming + Data Structures = Automatic Programming! shows how abstract data types (stacks, queues and lists) can be evolved using genetic programming, and demonstrates how GP can evolve general programs which solve the nested brackets problem, recognise a Dyck context-free language, and implement a simple four-function calculator. In these cases, an appropriate data structure is beneficial compared to simple indexed memory. This book also includes a survey of GP, with a critical review of experiments with evolving memory, and reports investigations of real-world electrical network maintenance scheduling problems that demonstrate that Genetic Algorithms can find low-cost viable solutions to such problems. Genetic Programming and Data Structures: Genetic Programming + Data Structures = Automatic Programming! should be of direct interest to computer scientists doing research on genetic programming, genetic algorithms, data structures, and artificial intelligence. In addition, this book will be of interest to practitioners working in all of these areas and to those interested in automatic programming.
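As a point of reference for the nested brackets problem mentioned above, a hand-written stack-based solution (the target behavior GP is asked to evolve, not an evolved program itself) can be sketched as:

```python
def brackets_balanced(s: str) -> bool:
    """Check nested brackets with a stack -- the kind of data structure
    the book shows GP can exploit for exactly this problem.
    (Hand-written reference solution, not an evolved program.)
    """
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)                 # remember the open bracket
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False                 # mismatched or unopened bracket
    return not stack                         # all opened brackets were closed

assert brackets_balanced("([]{})") and not brackets_balanced("([)]")
```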
Restricted-orientation convexity is the study of geometric objects whose intersections with lines from some fixed set are connected. This notion generalizes standard convexity and several types of nontraditional convexity. The authors explore the properties of this generalized convexity in multidimensional Euclidean space, describe restricted-orientation analogs of lines, hyperplanes, flats, and halfspaces, and identify major properties of standard convex sets that also hold for restricted-orientation convexity. They then introduce the notion of strong restricted-orientation convexity, which is an alternative generalization of convexity, and show that its properties are also similar to those of standard convexity.
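In symbols, the core definition can be sketched as follows (the notation is ours, not necessarily the authors'):

```latex
\documentclass{article}
\usepackage{amssymb}
\begin{document}
% O-convexity, sketched: fix a set O of line orientations in R^d.
% A set S is O-convex when every line parallel to an orientation in O
% meets S in a connected (possibly empty) set.
\[
  S \subseteq \mathbb{R}^d \ \text{is } \mathcal{O}\text{-convex}
  \iff
  \ell \cap S \ \text{is connected for every line } \ell
  \ \text{parallel to some orientation in } \mathcal{O}.
\]
% Standard convexity is the special case in which O contains all
% orientations, since a set is convex iff every line meets it in a
% (possibly empty) segment.
\end{document}
```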
One of the main uses of computer systems is the management of large amounts of symbolic information representing the state of some application domain, such as information about all the people I communicate with in my personal address database, or relevant parts of outer space in the knowledge base of a NASA space mission. While database management systems offer only the basic services of information storage and retrieval, more powerful knowledge systems offer, in addition, a number of advanced services such as deductive and abductive reasoning for the purpose of finding explanations and diagnoses, or generating plans. In order to design and understand database and knowledge-based applications it is important to build upon well-established conceptual and mathematical foundations. What are the principles behind database and knowledge systems? What are their major components? Which are the important cases of knowledge systems? What are their limitations? Addressing these questions, and discussing the fundamental issues of information update, knowledge assimilation, integrity maintenance, and inference-based query answering, is the purpose of this book. Foundations of Databases and Knowledge Systems covers both basic and advanced topics. It may be used as the textbook of a course offering a broad introduction to databases and knowledge bases, or it may be used as an additional textbook in a course on databases or Artificial Intelligence. Professionals and researchers interested in learning about new developments will benefit from the encyclopedic character of the book, which provides organized access to many advanced concepts in the theory of databases and knowledge bases.
This book comprises the refereed proceedings of the International Conferences, MAS and ASNT 2012, held in conjunction with GST 2012 on Jeju Island, Korea, in November/December 2012.
The Eighth Annual Working Conference of Information Security Management and Small Systems Security, jointly presented by WG11.1 and WG11.2 of the International Federation for Information Processing (IFIP), focuses on various state-of-the-art concepts in the two relevant fields. The conference focuses on technical, functional as well as managerial issues. This working conference brings together researchers and practitioners of different disciplines, organisations, and countries, to discuss the latest developments in (amongst others) information security methods, methodologies and techniques, information security management issues, risk analysis, managing information security within electronic commerce, computer crime and intrusion detection. We are fortunate to have attracted two highly acclaimed international speakers to present invited lectures, which will set the platform for the reviewed papers. Invited speakers will talk on a broad spectrum of issues, all related to information security management and small system security issues. These talks cover new perspectives on electronic commerce, security strategies, documentation and many more. All papers presented at this conference were reviewed by a minimum of two international reviewers. We wish to express our gratitude to all authors of papers and the international referee board. We would also like to express our appreciation to the organising committee, chaired by Gurpreet Dhillon, for all their inputs and arrangements. Finally, we would like to thank Les Labuschagne and Hein Venter for their contributions in compiling these proceedings for WG11.1 and WG11.2.
This book is a comprehensive presentation of recent results and developments on several widely used transforms and their fast algorithms. In many cases, new options are provided for improved or new fast algorithms, some of which are not well known in the digital signal processing community. The book is suitable as a textbook for senior undergraduate and graduate courses in digital signal processing. It may also serve as an excellent self-study reference for electrical engineers and applied mathematicians whose work is related to the fields of electronics, signal processing, image and speech processing, or digital design and communication.
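As one illustration of what "fast algorithm" means in this context, a radix-2 Cooley-Tukey FFT (a standard construction, not necessarily the book's own presentation) reduces the discrete Fourier transform from O(n^2) to O(n log n):

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT: the archetypal fast transform algorithm.
    This minimal sketch requires the input length to be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])          # split by index parity
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + twiddle[k] for k in range(n // 2)] + \
           [even[k] - twiddle[k] for k in range(n // 2)]

# A length-8 rectangular pulse; the result matches the naive O(n^2) DFT.
print(fft([1, 1, 1, 1, 0, 0, 0, 0]))
```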
Time is ubiquitous in information systems. Almost every enterprise faces the problem of its data becoming out of date. However, such data is often valuable, so it should be archived and some means to access it should be provided. Also, some data may be inherently historical, e.g., medical, cadastral, or judicial records. Temporal databases provide a uniform and systematic way of dealing with historical data. Many languages have been proposed for temporal databases, among others temporal logic. Temporal logic combines abstract, formal semantics with amenability to efficient implementation. This chapter shows how temporal logic can be used in temporal database applications. Rather than presenting new results, we report on recent developments and survey the field in a systematic way using a unified formal framework [GHR94; Cho94]. The handbook [GHR94] is a comprehensive reference on the mathematical foundations of temporal logic. In this chapter we study how temporal logic is used as a query and integrity constraint language. Consequently, model-theoretic notions, particularly formula satisfaction, are of primary interest. Axiomatic systems and proof methods for temporal logic [GHR94] have so far found relatively few applications in the context of information systems. Moreover, one needs to bear in mind that for the standard linearly-ordered time domains temporal logic is not recursively axiomatizable [GHR94], so recursive axiomatizations are by necessity incomplete.
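To give a flavor of temporal logic as an integrity constraint language, here is a hypothetical constraint (our illustration, not an example from the chapter) stating that an employee's salary never decreases across database states:

```latex
\documentclass{article}
\usepackage{amssymb}
\begin{document}
% Hypothetical temporal integrity constraint: salaries never decrease.
% \Box reads "at every state"; \bigcirc reads "at the next state".
\[
  \Box\, \forall e\, \forall s\, \forall s'\,
  \bigl( \mathrm{salary}(e,s) \wedge \bigcirc\,\mathrm{salary}(e,s')
         \rightarrow s' \ge s \bigr)
\]
% Such a constraint is checked incrementally as each new database state
% is appended to the history, using formula satisfaction over the
% linearly-ordered sequence of states.
\end{document}
```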
With the proliferation of packaging technology, failure and reliability have become serious concerns. This invaluable reference details processes that enable detection, analysis and prevention of failures. It provides a comprehensive account of the failures of device packages, discrete component connectors, PCB carriers and PCB assemblies.
Recent Advances in RSA Cryptography surveys the most important achievements of the last 22 years of research in RSA cryptography. Special emphasis is laid on the description and analysis of proposed attacks against the RSA cryptosystem. The first chapters introduce the necessary background information on number theory, complexity and public key cryptography. Subsequent chapters review factorization algorithms and specific properties that make RSA attractive for cryptographers. The most recent attacks against RSA are discussed in the third part of the book (among them attacks against low-exponent RSA, Hastad's broadcast attack, and Franklin-Reiter attacks). Finally, the last chapter reviews the use of the RSA function in signature schemes. Recent Advances in RSA Cryptography is of interest to graduate-level students and researchers who will gain insight into current research topics in the field and an overview of recent results in a unified way. It is suitable as a secondary text for a graduate-level course, and as a reference for researchers and practitioners in industry.
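For readers new to the underlying function, a "textbook RSA" round trip (tiny primes, no padding; insecure by design, purely to show the arithmetic the surveyed attacks target) fits in a few lines of Python:

```python
# Textbook RSA with toy parameters. Requires Python 3.8+ for pow(e, -1, phi).
p, q = 61, 53
n = p * q                 # public modulus (3233)
phi = (p - 1) * (q - 1)   # Euler's totient of n
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent: d * e = 1 (mod phi)

m = 42                    # message encoded as an integer < n
c = pow(m, e, n)          # encryption: c = m^e mod n
assert pow(c, d, n) == m  # decryption recovers m

# Low-exponent attacks such as Hastad's broadcast attack exploit small e
# (e.g. e = 3) when the same m is sent to several recipients unpadded.
```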
Introduction, or Why I Wrote This Book. In the fall of 1997 a dedicated troff user e-mailed me the macros he used to typeset his books. I took one look inside his file and thought, "I can do this; it's just code." As an experiment I spent a week and wrote a C program and troff macros which formatted and typeset a membership directory for a scholarly society with approximately 2,000 members. When I was done, I could enter two commands, and my program and troff would convert raw membership data into 200 pages of PostScript in 35 seconds. Previously, it had taken me several days to prepare camera-ready copy for the directory using a word processor. For completeness I sat down and tried to write TeX macros for the typesetting. I failed. Although ninety-five percent of my macros worked, I was unable to prepare the columns the project required. As my frustration grew, I began this book, mentally, in my head, as an answer to the question, "Why is TeX so hard to learn?" Why use TeX? Lest you accuse me of the old horse-and-cart problem, I should address the question, "Why use TeX at all?" before I explain why TeX is hard. I use TeX for the following reasons: it is stable, fast, free, and it uses ASCII. Of course, the most important reason is: TeX does a fantastic job. By stable, I mean it is not likely to change in the next 10 years (much less the next one or two), and it is free of bugs. Both of these are important.
Fuzzy Logic and Soft Computing contains contributions from world-leading experts from both the academic and industrial communities. The first part of the volume consists of invited papers by international authors describing possibilistic logic in decision analysis, fuzzy dynamic programming in optimization, linguistic modifiers for word computation, and theoretical treatments and applications of fuzzy reasoning. The second part is composed of eleven contributions from Chinese authors focusing on some of the key issues in the fields: stable adaptive fuzzy control systems, partial evaluations and fuzzy reasoning, fuzzy wavelet neural networks, analysis and applications of genetic algorithms, partial repeatability, rough set reduction for data enriching, limits of agents in process calculus, medium logic and its evolution, and factor spaces canes. These contributions are not only theoretically sound and well-formulated, but are also coupled with applicability implications and/or implementation treatments. The domains of applications realized or implied are: decision analysis, word computation, databases and knowledge discovery, power systems, control systems, and multi-destinational routing. Furthermore, the articles contain materials that are an outgrowth of recently conducted research, addressing fundamental and important issues of fuzzy logic and soft computing.
This book constitutes the refereed proceedings of the First Mediterranean Conference on Algorithms, MedAlg 2012, held in Kibbutz Ein Gedi, Israel, in December 2012.
Multimedia Mining: A Highway to Intelligent Multimedia Documents brings together experts in digital media content analysis, state-of-the-art data mining and knowledge discovery in multimedia database systems, knowledge engineers and domain experts from diverse applied disciplines. Multimedia documents are ubiquitous and often required, if not essential, in many applications today. This phenomenon has made multimedia documents widespread and extremely large. There are tools for managing and searching within these collections, but the need for tools to extract hidden useful knowledge embedded within multimedia objects is becoming pressing and central for many decision-making applications. The tools needed today are tools for discovering relationships between objects or segments within multimedia document components, such as classifying images based on their content, extracting patterns in sound, categorizing speech and music, and recognizing and tracking objects in video streams.
Database programming is the process of developing data-intensive applications which demand access to large amounts of structured, persistent data. The primary tool required for implementing such applications is a database programming language, namely a formal language which is specialized in the definition and manipulation of relevant large-scale data. As such, a database programming language is expected to provide high-level data modeling capabilities as well as a variety of constructs which facilitate the handling of the specified data. In this perspective, the aim of this book is: (i) to present the recent advances in database technology from the viewpoint of the novel database paradigms proposed for the development of advanced, non-standard, data-intensive applications, (ii) to focus specifically on the relational approach, with considerable emphasis on the extensions proposed in the last decade, and (iii) to describe the extended relational database language Algres, which is primarily the outcome of research work conducted by the authors in cooperation with a large number of other colleagues and students. Furthermore, in order to put the concepts presented in the book into practice, the reader is invited to experiment with the Algres system, a free copy of which can be requested from Kluwer Academic Publishers, or directly from the authors. Depending on the specific interest and background of the reader, the book can serve either: (1) to overview recent trends in databases, (2) to introduce in more detail the concepts and theory of the nested relational model, or (3) to present a complete advanced relational language which can be freely used for experimental purposes within academic and research frameworks.
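To make the nested relational model concrete, here is a sketch using plain Python data structures (our illustration; Algres' actual syntax and operators differ):

```python
# A nested (non-first-normal-form) relation sketched as Python data:
# each department tuple carries a relation-valued attribute "employees",
# which a nested relational language can query directly.
departments = [
    {"dept": "R&D",
     "employees": [{"name": "Ada", "skills": {"ml", "db"}},
                   {"name": "Alan", "skills": {"logic"}}]},
    {"dept": "Sales",
     "employees": [{"name": "Grace", "skills": {"crm"}}]},
]

# UNNEST, one of the model's signature operators, flattens the
# relation-valued attributes back into ordinary first-normal-form tuples:
flat = [(d["dept"], e["name"], s)
        for d in departments for e in d["employees"] for s in e["skills"]]
print(flat)
```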
The advent of the World Wide Web has changed the perspectives of groupware systems. The interest and deployment of Internet and intranet groupware solutions is growing rapidly, not just in academic circles but also in the commercial arena. The first generation of Web-based groupware tools has already started to emerge, and leading groupware vendors are urgently adapting their products for compatibility and integration with Web technologies. The focus of Groupware and the World Wide Web is to explore the potential for Web-based groupware. This book includes an analysis of the key characteristics of the Web, presenting reasons for its success, and describes developments of a diverse range of Web-based groupware systems. An emphasis on the technical obstacles and challenges is complemented by more analytical discussions and perspectives, including that of Information Technology managers looking to deploy groupware solutions within their organizations. Written by experts from different backgrounds - academic and commercial, technical and organizational - this book provides a unique overview of and insight into current issues and future possibilities concerning extension of the World Wide Web for group working.
Genetic Programming Theory and Practice explores the emerging interaction between theory and practice in the cutting-edge machine learning method of Genetic Programming (GP). The material contained in this contributed volume was developed from a workshop at the University of Michigan's Center for the Study of Complex Systems where an international group of genetic programming theorists and practitioners met to examine how GP theory informs practice and how GP practice impacts GP theory. The contributions cover the full spectrum of this relationship and are written by leading GP theorists from major universities, as well as active practitioners from leading industries and businesses. Chapters include such topics as John Koza's development of human-competitive electronic circuit designs; David Goldberg's application of "competent GA" methodology to GP; Jason Daida's discovery of a new set of factors underlying the dynamics of GP starting from applied research; and Stephen Freeland's essay on the lessons of biology for GP and the potential impact of GP on evolutionary theory.
Fundamentals of Information Systems contains articles from the 7th International Workshop on Foundations of Models and Languages for Data and Objects (FoMLaDO '98), which was held in Timmel, Germany. These articles capture various aspects of database and information systems theory: * identification as a primitive of database models * deontic action programs * marked nulls in queries * topological canonization in spatial databases * complexity of search queries * complexity of Web queries * attribute grammars for structured document queries * hybrid multi-level concurrency control * efficient navigation in persistent object stores * formal semantics of UML * reengineering of object bases and integrity dependence. Fundamentals of Information Systems serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
Autonomous, Model-Based Diagnosis Agents defines and describes the implementation of an architecture for autonomous, model-based diagnosis agents. It does this by developing a logic programming approach for model-based diagnosis and introducing strategies to deal with more complex diagnosis problems, and then embedding the diagnosis framework into the agent architecture of vivid agents. Autonomous, Model-Based Diagnosis Agents surveys extended logic programming and shows how this expressive language is used to model diagnosis problems stemming from applications such as digital circuits, traffic control, integrity checking of a chemical database, alarm correlation in cellular phone networks, diagnosis of an automatic mirror furnace, and diagnosis of communication protocols. The book reviews a bottom-up algorithm to remove contradiction from extended logic programs and substantially improves it by top-down evaluation of extended logic programs. Both algorithms are evaluated in the circuit domain including some of the ISCAS85 benchmark circuits. This comprehensive in-depth study of concepts, architectures, and implementation of autonomous, model-based diagnosis agents will be of great value for researchers, engineers, and graduate students with a background in artificial intelligence. For practitioners, it provides three main contributions: first, it provides many examples from diverse areas, ranging from alarm correlation in phone networks to inconsistency checking in databases; second, it describes an architecture to develop agents; and third, it describes a sophisticated and declarative implementation of the concepts and architectures introduced.
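The flavor of model-based diagnosis can be sketched in a few lines of Python: find the minimal sets of components whose abnormality explains an observation. This is a toy two-inverter circuit with an assumed "pass-through" fault mode, not the book's logic-programming or vivid-agents implementation.

```python
from itertools import chain, combinations

def system_output(x, ab):
    """Model of two inverters in series; a component in the abnormal set
    `ab` is assumed (toy fault mode) to pass its input through unchanged."""
    a = x if 'inv1' in ab else (not x)
    return a if 'inv2' in ab else (not a)

def diagnoses(x, observed):
    """Return the minimal abnormality sets consistent with the observation."""
    comps = ['inv1', 'inv2']
    candidates = chain.from_iterable(combinations(comps, k) for k in range(3))
    consistent = [set(s) for s in candidates if system_output(x, set(s)) == observed]
    return [d for d in consistent
            if not any(d2 < d for d2 in consistent)]   # keep minimal sets only

print(diagnoses(True, True))   # double inversion is correct: [] (no fault needed)
print(diagnoses(True, False))  # faulty output: [{'inv1'}, {'inv2'}]
```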
This book constitutes the thoroughly refereed post-conference proceedings of the 7th International Conference on Information Security and Cryptology, Inscrypt 2011, held in Beijing, China, in November/December 2011. The 24 revised full papers presented together with 2 invited talks were carefully reviewed and selected from 80 submissions. The papers present research advances in the areas of information security, cryptology, and their applications.
Information systems are the backbone of many of today's computerized applications. Distributed databases and the infrastructure needed to support them have been well studied. However, this book is the first to address distributed database interoperability by examining the successes and failures, various approaches, infrastructures, and trends of the field. A gap exists in the way that these systems have been investigated by real practitioners. This gap is more pronounced than usual, partly because of the way businesses operate, the systems they have, and the difficulties created by systems' autonomy and heterogeneity. Telecommunications firms, for example, must deal with an increased demand for automation while at the same time continuing to function at their current level. While academics are focusing on investigating differences between distributed databases, federated databases, heterogeneous databases, and, more generally, among loosely connected and tightly coupled systems, those who have to deal with real problems right away know that the only relevant research is the one that will ensure that their system works to produce reasonably correct results. Interconnecting Heterogeneous Information Systems covers the underlying principles and infrastructures needed to realize truly global information systems. The book discusses technologies related to middleware, the Web, workflows, transactions, and data warehousing. It also overviews architectures with a discussion of critical issues. The book gives an overview of systems that can be viewed as learning platforms. While these systems do not translate to successful commercial realities, they push the envelope in terms of research. Successful commercial systems have benefited from the experiments conducted in these prototypes. The book includes two case studies based on the authors' own work. Interconnecting Heterogeneous Information Systems is suitable as a textbook for a graduate-level course on Interconnecting Heterogeneous Information Systems, as well as a secondary text for a graduate-level course on database or information systems, and as a reference for researchers and practitioners in industry.
Real-time systems are defined as those for which correctness depends not only on the logical properties of the produced results, but also on the temporal properties of these results. In a database, real-time means that in addition to typical logical consistency constraints, such as a constraint on a data item's value, there are constraints on when transactions execute and on the `freshness' of the data transactions access. The challenges and tradeoffs faced by the designers of real-time database systems are quite different from those faced by the designers of general-purpose database systems. To achieve the fundamental requirements of timeliness and predictability, not only do conventional methods for scheduling and transaction management have to be redesigned, but also new concepts that have not been considered in conventional database systems or in real-time systems need to be added. Real-Time Database and Information Systems: Research Advances is devoted to new techniques for scheduling of transactions, concurrency management, transaction logging, database languages, and new distributed database architectures. Real-Time Database and Information Systems: Research Advances is primarily intended for practicing engineers and researchers working in the growing area of real-time database and information retrieval systems. For practitioners, the book will provide a much needed bridge for technology transfer and continued education. For researchers, the book will provide a comprehensive reference for well-established results. The book can also be used in a senior or graduate level course on real-time systems, real-time database systems, and database systems, or closely related courses.
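A minimal illustration of the "freshness" constraint described above, in Python (the class, attribute names, and validity-interval rule are our assumptions, not a specific system's API):

```python
import time

class StaleDataError(Exception):
    """Raised when a transaction reads a data item past its validity interval."""

class DataItem:
    def __init__(self, value, avi_seconds):
        self.value = value
        self.avi = avi_seconds              # absolute validity interval: max age
        self.updated = time.monotonic()

    def write(self, value):
        self.value = value
        self.updated = time.monotonic()

    def read(self):
        age = time.monotonic() - self.updated
        if age > self.avi:                  # temporal consistency check
            raise StaleDataError(f"age {age:.3f}s exceeds AVI {self.avi}s")
        return self.value

sensor = DataItem(value=20.1, avi_seconds=0.5)  # reading must be under 0.5s old
print(sensor.read())                             # fresh immediately after creation
```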
Big Data Application Architecture Pattern Recipes provides an insight into the heterogeneous infrastructures, databases, and visualization and analytics tools used for realizing the architectures of big data solutions. Its problem-solution approach helps in selecting the right architecture to solve the problem at hand. In the process of reading through these problems, you will learn to harness the power of new big data opportunities which various enterprises use to attain real-time profits. Big Data Application Architecture Pattern Recipes answers one of the most critical questions of this time: 'how do you select the best end-to-end architecture to solve your big data problem?' The book deals with various mission-critical problems encountered by solution architects, consultants, and software architects while dealing with the myriad options available for implementing a typical solution, trying to extract insight from huge volumes of data in real time and across multiple relational and non-relational data types for clients from industries like retail, telecommunication, banking, and insurance. The patterns in this book provide the strong architectural foundation required to launch your next big data application. The architectures for realizing these opportunities are based on relatively less expensive and heterogeneous infrastructures compared to the traditional monolithic and hugely expensive options that exist currently. This book describes and evaluates the benefits of heterogeneity, which brings with it multiple options for solving the same problem, evaluation of trade-offs, and validation of 'fitness for purpose' of the solution. What you'll learn: * Major considerations in building a big data solution * Big data application architecture problems for specific industries * What are the components one needs to build an end-to-end big data solution? * Does one really need a real-time big data solution or an off-line analytics batch solution? * What are the operations and support architectures for a big data solution? * What are the scalability considerations and options for a Hadoop installation? Who this book is for: * CIOs, CTOs, enterprise architects, and software architects * Consultants, solution architects, and information management (IM) analysts who want to architect a big data solution for their enterprise
Solders have given the designer of modern consumer, commercial, and military electronic systems a remarkable flexibility to interconnect electronic components. The properties of solder have facilitated broad assembly choices that have fueled creative applications to advance technology. Solder is the electrical and mechanical "glue" of electronic assemblies. This pervasive dependency on solder has stimulated new interest in applications as well as a more concerted effort to better understand materials properties. We need not look far to see solder being used to interconnect ever finer geometries. Assembly of micropassive discrete devices that are hardly visible to the unaided eye, of silicon chips directly to ceramic and plastic substrates, and of very fine peripheral leaded packages constitute a few of solder's uses. There has been a marked increase in university research related to solder. New electronic packaging centers stimulate applications, and materials engineering and science departments have demonstrated a new vigor to improve both the materials and our understanding of them. Industrial research and development continues to stimulate new application, and refreshing new packaging ideas are emerging. New handbooks have been published to help both the neophyte and seasoned packaging engineer.
The volume LNCS 8155 constitutes the refereed proceedings of the 19th International Workshop on Cellular Automata and Discrete Complex Systems, AUTOMATA 2013, held in Giessen, Germany, in September 2013. The 8 papers presented were carefully reviewed and selected from 26 submissions. The scope of the workshop spans the theoretical and practical aspects of the field, with the following aims: to serve as a permanent, international, multidisciplinary forum for the collaboration of researchers in the field of Cellular Automata (CA) and Discrete Complex Systems (DCS); to provide a platform for presenting and discussing new ideas and results; to support the development of theory and applications of CA and DCS (e.g. parallel computing, physics, biology, social sciences, and others) as far as fundamental aspects and their relations are concerned; and to identify and study, within an inter- and multidisciplinary context, the important fundamental aspects, concepts, notions and problems concerning CA and DCS.