This new book, the first of its kind, examines the use of algorithmic techniques to compress random and non-random sequential strings found in chains of polymers. The book is an introduction to algorithmic complexity. Examples taken from current research in the polymer sciences illustrate the compression of like-natured properties found along a polymer chain. Both theoretical and applied aspects of algorithmic compression are reviewed. A description of the types of polymers and their uses is followed by a chapter on the various types of compression systems that can be used to compress polymer chains into manageable units. The work is intended for graduate and postgraduate university students in the physical sciences and engineering.
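As a rough illustration of the compression idea (not an excerpt from the book): the algorithmic complexity of a symbol string such as a polymer sequence can be bounded from above by the length of its compressed encoding, and a highly regular chain compresses far better than a random one. The four-letter alphabet and the use of zlib below are illustrative assumptions.

```python
# Illustrative sketch (not from the book): estimate how compressible two
# symbol strings are, as a crude upper bound on their algorithmic complexity.
import random
import zlib

random.seed(0)
regular = "AB" * 500                                                  # highly regular "chain"
random_chain = "".join(random.choice("ABCD") for _ in range(1000))   # random chain

for name, chain in [("regular", regular), ("random", random_chain)]:
    compressed = zlib.compress(chain.encode(), level=9)
    print(f"{name}: {len(chain)} symbols -> {len(compressed)} bytes compressed")
```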
The Design and Analysis of Computer Algorithms introduces the basic data structures and programming techniques often used in efficient algorithms. It covers the use of lists, push-down stacks, queues, trees, and graphs.
Hardware-intrinsic security is a young field dealing with secure secret key storage. By generating the secret keys from the intrinsic properties of the silicon, e.g., from intrinsic Physical Unclonable Functions (PUFs), no permanent secret key storage is required anymore, and the key is only present in the device for a minimal amount of time. The field is extending to hardware-based security primitives and protocols such as block ciphers and stream ciphers entangled with the hardware, thus improving IC security. At the application level, there is growing interest in hardware security for RFID systems and the necessary accompanying system architectures. This book brings together contributions from researchers and practitioners in academia and industry, an interdisciplinary group with backgrounds in physics, mathematics, cryptography, coding theory and processor theory. It will serve as important background material for students and practitioners, and will stimulate much further research and development.
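A toy sketch of the underlying idea (an assumption-laden illustration, not a construction from the book): the intrinsic variation of a device is modelled as a fixed random bit string, and a key is derived from it on demand instead of being stored. Real PUF responses are noisy, so practical schemes add error correction via fuzzy extractors, which this sketch omits.

```python
# Toy illustration only: derive a key from a simulated device "fingerprint"
# instead of storing it. Real PUF responses are noisy and need error correction.
import hashlib
import secrets

def manufacture_device() -> bytes:
    # Stand-in for intrinsic silicon variation fixed at manufacturing time.
    return secrets.token_bytes(32)

def derive_key(puf_response: bytes, challenge: bytes) -> bytes:
    # The key exists only while being derived; nothing secret is kept in storage.
    return hashlib.sha256(puf_response + challenge).digest()

device = manufacture_device()
key = derive_key(device, b"session-1")
print(key.hex())
```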
Software has become a key component of contemporary life and algorithms that rank, classify, or recommend are everywhere. Building on the philosophy of Gilbert Simondon and the cultural techniques tradition, this book examines the constructive and cumulative character of software and retraces the historical trajectories of a series of algorithmic techniques that have become the building blocks for contemporary practices of ordering. Developed in opposition to centuries of library tradition, these techniques instantiate dynamic, perspectivist, and interested forms of knowing. Embedded in technical infrastructures and economic logics, they have become engines of order that transform how we arrange information, ideas, and people.
For 60 years the International Federation for Information Processing (IFIP) has been advancing research in Information and Communication Technology (ICT). This book looks into both past experiences and future perspectives using the core of IFIP's competence, its Technical Committees (TCs) and Working Groups (WGs). Soon after IFIP was founded, it established TCs and related WGs to foster the exchange and development of the scientific and technical aspects of information processing. IFIP TCs are as diverse as the different aspects of information processing, but they share the following aims:
To establish and maintain liaison with national and international organizations with allied interests and to foster cooperative action, collaborative research, and information exchange.
To identify subjects and priorities for research, to stimulate theoretical work on fundamental issues, and to foster fundamental research which will underpin future development.
To provide a forum for professionals with a view to promoting the study, collection, exchange, and dissemination of ideas, information, and research findings and thereby to promote the state of the art.
To seek and use the most effective ways of disseminating information about IFIP's work including the organization of conferences, workshops and symposia and the timely production of relevant publications.
To have special regard for the needs of developing countries and to seek practicable ways of working with them.
To encourage communication and to promote interaction between users, practitioners, and researchers.
To foster interdisciplinary work and, in particular, to collaborate with other Technical Committees and Working Groups.
The 17 contributions in this book describe the scientific, technical, and further work in TCs and WGs and in many cases also assess the future consequences of the work's results. These contributions explore the developments of IFIP and the ICT profession now and over the next 60 years. The contributions are arranged per TC and conclude with the chapter on the IFIP code of ethics and conduct.
This book focuses on the implementation, evaluation and application of DNA/RNA-based genetic algorithms in connection with neural network modeling, fuzzy control, the Q-learning algorithm and CNN deep learning classifiers. It presents several DNA/RNA-based genetic algorithms and their modifications, which are tested using benchmarks, as well as detailed information on the implementation steps and program code. In addition to single-objective optimization, genetic algorithms are also used here to solve multi-objective optimization problems for neural network modeling, fuzzy control, model predictive control and PID control. In closing, new topics such as Q-learning and CNN are introduced. The book offers a valuable reference guide for researchers and designers in system modeling and control, and for senior undergraduate and graduate students at colleges and universities.
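A minimal genetic-algorithm sketch in the spirit of the methods surveyed (a generic GA, not one of the book's DNA/RNA-encoded variants); the objective function, population size and mutation rate are chosen purely for illustration.

```python
# Minimal generic genetic algorithm (illustrative only, not the book's
# DNA/RNA-encoded variants): maximise f(x) = -(x - 3)^2 over real x.
import random

random.seed(1)

def fitness(x: float) -> float:
    return -(x - 3.0) ** 2

population = [random.uniform(-10, 10) for _ in range(30)]
for generation in range(100):
    # Selection: keep the fitter half of the population.
    population.sort(key=fitness, reverse=True)
    parents = population[:15]
    # Crossover: average two random parents; mutation: small Gaussian noise.
    children = [
        (random.choice(parents) + random.choice(parents)) / 2 + random.gauss(0, 0.1)
        for _ in range(15)
    ]
    population = parents + children

print(round(max(population, key=fitness), 3))  # close to the optimum at 3.0
```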
This book provides a systematic and comparative description of the vast number of research issues related to the quality of data and information. It does so by delivering a sound, integrated and comprehensive overview of the state of the art and future development of data and information quality in databases and information systems. To this end, it presents an extensive description of the techniques that constitute the core of data and information quality research, including record linkage (also called object identification), data integration, error localization and correction, and examines the related techniques in a comprehensive and original methodological framework. Quality dimension definitions and adopted models are also analyzed in detail, and differences between the proposed solutions are highlighted and discussed. Furthermore, while systematically describing data and information quality as an autonomous research area, paradigms and influences deriving from other areas, such as probability theory, statistical data analysis, data mining, knowledge representation, and machine learning are also included. Last but not least, the book also highlights very practical solutions, such as methodologies, benchmarks for the most effective techniques, case studies, and examples. The book has been written primarily for researchers in the fields of databases and information management or in natural sciences who are interested in investigating properties of data and information that have an impact on the quality of experiments, processes and on real life. The material presented is also sufficiently self-contained for master's or PhD-level courses, and it covers all the fundamentals and topics without the need for other textbooks. Data and information system administrators and practitioners, who deal with systems exposed to data-quality issues and as a result need a systematization of the field and practical methods in the area, will also benefit from the combination of concrete practical approaches with sound theoretical formalisms.
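As a small illustration of record linkage, one of the core techniques listed above (a toy similarity measure assumed for illustration, not one of the book's methods): candidate record pairs can be scored by the overlap of their word tokens and flagged as likely duplicates above a threshold.

```python
# Toy record linkage sketch: score candidate pairs with Jaccard similarity
# over word tokens and flag likely duplicates above a chosen threshold.
def jaccard(a: str, b: str) -> float:
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

records = [
    ("r1", "ACME Corporation 12 Main Street Springfield"),
    ("r2", "Acme Corp 12 Main St Springfield"),
    ("r3", "Globex Industries 99 Ocean Avenue"),
]

THRESHOLD = 0.4  # illustrative cut-off
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        score = jaccard(records[i][1], records[j][1])
        if score >= THRESHOLD:
            print(records[i][0], records[j][0], round(score, 2))  # r1 r2 0.5
```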
Tremendous achievements in the area of semiconductor electronics have turned microelectronics into nanoelectronics. Actually, we observe a real technical boom connected with achievements in nanoelectronics. It results in the development of very complex integrated circuits, particularly field programmable logic devices (FPLD). Up-to-date FPLD chips are so huge that a single chip is enough to implement a really complex digital system including a datapath and a control unit. Because of the extreme complexity of modern microchips, it is very important to develop effective design methods oriented on particular properties of logic elements. The development of digital systems with the use of FPLD microchips is not possible without the use of different hardware description languages (HDL), such as VHDL and Verilog. Different computer-aided design tools (CAD) are widely used to develop digital system hardware. As the majority of researchers point out, the design process is now very similar to the process of program development. It allows a researcher to pay more attention to some specific problems where there are no standard formal methods of solution. But application of all these achievements does not guarantee per se the development of a competitive electronic product, especially in an acceptable time-to-market. This problem can be solved only if a researcher possesses fundamental knowledge of the design process and knows exactly the mode of operation of the industrial CAD tools in use. As is known, any digital system can be represented as a composition of a datapath and a control unit.
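The closing remark, that any digital system can be seen as a composition of a datapath and a control unit, can be illustrated with a toy sketch (a hypothetical example, not from the book, written in Python rather than an HDL such as VHDL or Verilog):

```python
# Toy illustration of the datapath/control-unit decomposition: a small finite
# state machine (the control unit) chooses which datapath operation runs on
# each step; here the datapath multiplies by repeated addition.
def multiply_by_repeated_addition(a: int, b: int) -> int:
    acc, count = 0, b          # datapath registers
    state = "IDLE"             # control-unit state register
    while True:
        if state == "IDLE":
            state = "ADD" if count > 0 else "DONE"
        elif state == "ADD":
            acc += a           # operation selected by the control unit
            count -= 1
            state = "ADD" if count > 0 else "DONE"
        else:                  # "DONE"
            return acc

print(multiply_by_repeated_addition(6, 7))  # 42
```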
This open access book presents the results of three years of collaboration between earth scientists and data scientists in developing and applying data science methods for scientific discovery. The book will be highly beneficial for other researchers at senior and graduate levels interested in applying visual data exploration, computational approaches and scientific workflows.
Descriptive complexity theory establishes a connection between the computational complexity of algorithmic problems (the computational resources required to solve the problems) and their descriptive complexity (the language resources required to describe the problems). This groundbreaking book approaches descriptive complexity from the angle of modern structural graph theory, specifically graph minor theory. It develops a 'definable structure theory' concerned with the logical definability of graph theoretic concepts such as tree decompositions and embeddings. The first part starts with an introduction to the background, from logic, complexity, and graph theory, and develops the theory up to first applications in descriptive complexity theory and graph isomorphism testing. It may serve as the basis for a graduate-level course. The second part is more advanced and mainly devoted to the proof of a single, previously unpublished theorem: properties of graphs with excluded minors are decidable in polynomial time if, and only if, they are definable in fixed-point logic with counting.
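For context, two classical results convey the flavour of this connection between computational and descriptive complexity (standard theorems of the field, stated here as background rather than quoted from the book):

```latex
% Fagin's theorem: NP coincides with existential second-order logic.
\mathrm{NP} = \exists\mathrm{SO}
% Immerman-Vardi: on ordered finite structures, polynomial time coincides
% with first-order logic extended with a least-fixed-point operator.
\mathrm{P} = \mathrm{FO}(\mathrm{LFP}) \quad \text{on ordered structures}
```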
In recent years, popular media have inundated audiences with sensationalised headlines recounting data breaches, new forms of surveillance and other dangers of our digital age. Despite their regularity, such accounts treat each case as unprecedented and unique. This book proposes a radical rethinking of the history, present and future of our relations with the digital, spatial technologies that increasingly mediate our everyday lives. From smartphones to surveillance cameras, to navigational satellites, these new technologies offer visions of integrated, smooth and efficient societies, even as they directly conflict with the ways users experience them. Recognising the potential for both control and liberation, the authors argue against both acquiescence to and rejection of these technologies. Through intentional use of the very systems that monitor them, activists from Charlottesville to Hong Kong are subverting, resisting and repurposing geographic technologies. Using examples as varied as writings on the first telephones to the experiences of a feminist collective for migrant women in Spain, the authors present a revolution of everyday technologies. In the face of the seemingly inevitable dominance of corporate interests, these technologies allow us to create new spaces of affinity, and a new politics of change.
Written in easy-to-understand language, this self-explanatory guide introduces the fundamentals of finite element methods and their application to differential equations. Beginning with a brief introduction to Sobolev spaces and elliptic scalar problems, the text progresses through an explanation of finite element spaces and estimates for the interpolation error. The concepts of finite element methods for scalar parabolic problems, object-oriented finite element algorithms, efficient implementation techniques, and high dimensional parabolic problems are presented in different chapters. Recent advances in finite element methods, including non-conforming finite elements for boundary value problems of higher order and approaches for solving differential equations in high dimensional domains, are explained for the benefit of the reader. Numerous solved examples and mathematical theorems are interspersed throughout the text for enhanced learning.
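A compact sketch of the finite element method for the simplest model problem (a standard textbook example assumed here, not material from this particular book): piecewise-linear elements for -u'' = f on (0, 1) with homogeneous boundary conditions reduce the problem to a small tridiagonal linear system.

```python
# Minimal 1D finite element sketch (standard example, not from the book):
# solve -u'' = f on (0, 1), u(0) = u(1) = 0, with piecewise-linear elements.
import numpy as np

n = 16                       # number of elements
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = lambda t: np.pi ** 2 * np.sin(np.pi * t)   # exact solution u = sin(pi x)

# Assemble the stiffness matrix and load vector for the interior nodes only.
A = np.zeros((n - 1, n - 1))
b = np.zeros(n - 1)
for i in range(n - 1):
    A[i, i] = 2.0 / h
    if i > 0:
        A[i, i - 1] = A[i - 1, i] = -1.0 / h
    b[i] = f(x[i + 1]) * h   # nodal quadrature for the load integral

u = np.linalg.solve(A, b)
print(float(np.max(np.abs(u - np.sin(np.pi * x[1:-1])))))  # small error, O(h^2)
```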
Shadow Algorithms Data Miner provides a high-level understanding of the complete set of shadow concepts and algorithms, addressing their usefulness from a larger graphics system perspective. It discusses the applicability and limitations of all the direct illumination approaches for shadow generation. With an emphasis on shadow fundamentals, the book gives an organized picture of the motivations, complexities, and categorized algorithms available to generate digital shadows. It helps readers select the most relevant algorithms for their needs by placing the shadow algorithms in real-world contexts and looking at them from a larger graphics system perspective. As a result, readers know where to start for their application needs, which algorithms to begin considering, and which papers and supplemental material should be consulted for further details.
This comprehensive textbook presents a clean and coherent account of most fundamental tools and techniques in Parameterized Algorithms and is a self-contained guide to the area. The book covers many of the recent developments of the field, including application of important separators, branching based on linear programming, Cut & Count to obtain faster algorithms on tree decompositions, algorithms based on representative families of matroids, and use of the Strong Exponential Time Hypothesis. A number of older results are revisited and explained in a modern and didactic way. The book provides a toolbox of algorithmic techniques. Part I is an overview of basic techniques, each chapter discussing a certain algorithmic paradigm. The material covered in this part can be used for an introductory course on fixed-parameter tractability. Part II discusses more advanced and specialized algorithmic ideas, bringing the reader to the cutting edge of current research. Part III presents complexity results and lower bounds, giving negative evidence by way of W[1]-hardness, the Exponential Time Hypothesis, and kernelization lower bounds. All the results and concepts are introduced at a level accessible to graduate students and advanced undergraduate students. Every chapter is accompanied by exercises, many with hints, while the bibliographic notes point to original publications and related work.
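A minimal sketch of the branching paradigm covered in Part I (the classic bounded search tree for k-Vertex Cover, given here as a standard illustration rather than an excerpt from the book): some endpoint of any remaining edge must go into the cover, so branching on the two choices gives a search tree with at most 2^k leaves.

```python
# Classic FPT branching (bounded search tree) for k-Vertex Cover.
def vertex_cover(edges, k):
    """Return True iff the graph has a vertex cover of size at most k."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = edges[0]          # some endpoint of this edge must be in the cover
    return (vertex_cover([e for e in edges if u not in e], k - 1)
            or vertex_cover([e for e in edges if v not in e], k - 1))

triangle_plus_edge = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(vertex_cover(triangle_plus_edge, 2))  # True, e.g. cover {2, 3}
print(vertex_cover(triangle_plus_edge, 1))  # False
```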
Gather and analyze data successfully, identify trends, and then create overarching strategies and actionable next steps, all through Excel. This book will show even those who lack a technical background how to make advanced interactive reports with only Excel at hand. Advanced visualization is available to everyone, and this step-by-step guide will show you how. The information in this book is presented in an accessible and understandable way for everyone, regardless of their level of technical skill and proficiency in MS Excel. The dashboard development process is given in the format of step-by-step instructions, taking you through each step in detail. Universal checklists and recommendations of a practicing business analyst and trainer will help in solving various tasks when working with data visualization. Illustrations will help you perceive information easily and quickly. Make Your Data Speak will show you how to master the main rules, techniques and tricks of professional data visualization in just a few days.
What You'll Learn:
See how interactive dashboards can be useful for a business
Review basic rules for building dashboards
Understand why it's important to pay attention to colors and fonts when developing a dashboard
Create interactive management reports in Excel
Who This Book Is For:
Company executives and divisional managers, middle managers, business analysts
In machine learning applications, practitioners must take into account the costs associated with the algorithm, from the cost of acquiring data during training to the cost of making predictions for new samples.
Cost-Sensitive Machine Learning is one of the first books to provide an overview of the current research efforts and problems in this area. It discusses real-world applications that incorporate the cost of learning into the modeling process. The first part of the book presents the theoretical underpinnings of cost-sensitive machine learning. It describes well-established machine learning approaches for reducing data acquisition costs during training as well as approaches for reducing costs when systems must make predictions for new samples. The second part covers real-world applications that effectively trade off different types of costs. These applications not only use traditional machine learning approaches, but they also incorporate cutting-edge research that advances beyond the constraining assumptions by analyzing the application needs from first principles. Spurring further research on several open problems, this volume highlights the often implicit assumptions in machine learning techniques that were not fully understood in the past. The book also illustrates the commercial importance of cost-sensitive machine learning through its coverage of the rapid application developments made by leading companies and academic research labs.
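The core decision rule behind cost-sensitive prediction can be sketched in a few lines (a standard expected-cost argument, assumed here as an illustration rather than taken from the book): given a model's class probabilities for a new sample and a cost matrix C whose entry C[i][j] is the cost of predicting class j when the true class is i, predict the class with the smallest expected cost rather than the most probable one.

```python
# Cost-sensitive prediction sketch: choose the class minimising expected cost,
# which can differ from the most probable class when errors cost unequally.
import numpy as np

# Illustrative cost matrix: rows = true class, columns = predicted class.
# Missing a positive (row 1, column 0) is ten times worse than a false alarm.
C = np.array([[0.0, 1.0],
              [10.0, 0.0]])

p = np.array([0.8, 0.2])              # model's class probabilities for one sample
expected_cost = p @ C                 # expected cost of each possible prediction
print(expected_cost)                  # [2.0, 0.8]
print(int(np.argmin(expected_cost)))  # predicts class 1 even though p[1] < p[0]
```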
Algorithms and Theory of Computation Handbook, Second Edition: General Concepts and Techniques provides an up-to-date compendium of fundamental computer science topics and techniques. It also illustrates how the topics and techniques come together to deliver efficient solutions to important practical problems. Along with updating and revising many of the existing chapters, this second edition contains four new chapters that cover external memory and parameterized algorithms as well as computational number theory and algorithmic coding theory. This best-selling handbook continues to help computer professionals and engineers find significant information on various algorithmic topics. The expert contributors clearly define the terminology, present basic results and techniques, and offer a number of current references to the in-depth literature. They also provide a glimpse of the major research issues concerning the relevant topics.
This revised and extensively expanded edition of "Computability and Complexity Theory" comprises essential materials that are core knowledge in the theory of computation. The book is self-contained, with a preliminary chapter describing key mathematical concepts and notations. Subsequent chapters move from the qualitative aspects of classical computability theory to the quantitative aspects of complexity theory. Dedicated chapters on undecidability, NP-completeness, and relative computability focus on the limitations of computability and the distinctions between feasible and intractable. Substantial new content in this edition includes:
a chapter on nonuniformity studying Boolean circuits, advice classes and the important result of Karp and Lipton;
a chapter studying properties of the fundamental probabilistic complexity classes;
a study of the alternating Turing machine and uniform circuit classes;
an introduction to counting classes, proving the famous results of Valiant and Vazirani and of Toda;
a thorough treatment of the proof that IP is identical to PSPACE.
With its accessibility and well-devised organization, this text/reference is an excellent resource and guide for those looking to develop a solid grounding in the theory of computing. Beginning graduates, advanced undergraduates, and professionals involved in theoretical computer science, complexity theory, and computability will find the book an essential and practical learning tool. Topics and features:
Concise, focused materials cover the most fundamental concepts and results in the field of modern complexity theory, including the theory of NP-completeness, NP-hardness, the polynomial hierarchy, and complete problems for other complexity classes
Contains information that otherwise exists only in the research literature and presents it in a unified, simplified manner
Provides key mathematical background information, including sections on logic and number theory and algebra
Supported by numerous exercises and supplementary problems for reinforcement and self-study purposes
A One-Stop Source of Known Results, a Bibliography of Papers on the Subject, and Novel Research Directions
Focusing on a very active area of research in the last decade, Combinatorics of Compositions and Words provides an introduction to the methods used in the combinatorics of pattern avoidance and pattern enumeration in compositions and words. It also presents various tools and approaches that are applicable to other areas of enumerative combinatorics. After a historical perspective on research in the area, the text introduces techniques to solve recurrence relations, including iteration and generating functions. It then focuses on enumeration of basic statistics for compositions. The text goes on to present results on pattern avoidance for subword, subsequence, and generalized patterns in compositions and then applies these results to words. The authors also cover automata, the ECO method, generating trees, and asymptotic results via random compositions and complex analysis. Highlighting both established and new results, this book explores numerous tools for enumerating patterns in compositions and words. It includes a comprehensive bibliography and incorporates the use of the computer algebra systems Maple and Mathematica®, as well as C++ to perform computations.
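As a worked example of the generating-function techniques mentioned above (a standard fact about compositions, not an excerpt from the book): each part of a composition contributes x + x^2 + x^3 + ... = x/(1-x), so summing over the number of parts gives

```latex
\sum_{k \ge 0} \left( \frac{x}{1-x} \right)^{k}
  = \frac{1}{1 - \frac{x}{1-x}}
  = \frac{1-x}{1-2x}
  = 1 + \sum_{n \ge 1} 2^{\,n-1} x^{n},
```

recovering the classical count of 2^(n-1) compositions of an integer n >= 1.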
Computation, itself a form of calculation, consists of arithmetical and non-arithmetical (logical) steps that follow a specific set of rules (an algorithm). This uniquely accessible textbook introduces students using a very distinctive approach, quite rapidly leading them into essential topics with sufficient depth, yet in a highly intuitive manner. From core elements like sets, types, Venn diagrams and logic, to patterns of reasoning, calculus, recursion and expression trees, the book spans the breadth of key concepts and methods that will enable students to readily progress with their studies in Computer Science.
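Two of the topics listed above, recursion and expression trees, fit together naturally in a short sketch (an invented example, not taken from the textbook): an arithmetic expression stored as a tree is evaluated by a recursive walk.

```python
# Recursive evaluation of an expression tree (illustrative example):
# a node is either a number or an (operator, left, right) triple.
def evaluate(node):
    if isinstance(node, (int, float)):
        return node
    op, left, right = node
    a, b = evaluate(left), evaluate(right)
    return {"+": a + b, "-": a - b, "*": a * b, "/": a / b}[op]

# Tree for (2 + 3) * 4
tree = ("*", ("+", 2, 3), 4)
print(evaluate(tree))  # 20
```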
Networked computers are ubiquitous, and are subject to attack, misuse, and abuse. One way to counteract this cyber threat is to provide security analysts with better tools to discover patterns, detect anomalies, identify correlations, and communicate their findings. Visualization for computer security (VizSec) researchers and developers are doing just that. VizSec is about putting robust information visualization tools into the hands of human analysts to take advantage of the power of the human perceptual and cognitive processes in solving computer security problems. This volume collects the papers presented at the 4th International Workshop on Computer Security - VizSec 2007.
Customer-Oriented Optimization in Public Transportation develops models, results and algorithms for optimizing public transportation from a customer-oriented point of view. The methods used are based on graph-theoretic approaches and integer programming. The specific topics are all motivated by real-world examples that occurred in practical projects. An appendix summarizes some of the basics of optimization needed to interpret the material in the book. In detail, the topics the book covers in its three parts are as follows:
Stop location - Does it make sense to open new stations along existing bus or railway lines? If yes, in which locations? The problem is modeled as a continuous covering problem. To solve it, the author develops a finite dominating set and shows that efficient methods are possible if the special structure of the covering matrix is used.
Delay management - Should a train wait for delayed feeder trains or should it depart on time?
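The delay-management question can be illustrated with a back-of-the-envelope comparison (a deliberately simplified model for illustration, not the book's formulation): waiting pays off when the extra delay imposed on passengers already on the train is smaller than the missed-connection penalty avoided for transferring passengers.

```python
# Simplified wait-or-depart rule for delay management (illustrative model only).
def should_wait(feeder_delay_min: float,
                transferring_passengers: int,
                onboard_passengers: int,
                missed_connection_penalty_min: float) -> bool:
    # Cost of waiting: everyone on the departing train is delayed.
    wait_cost = feeder_delay_min * (onboard_passengers + transferring_passengers)
    # Cost of departing on time: transferring passengers miss their connection.
    depart_cost = missed_connection_penalty_min * transferring_passengers
    return wait_cost < depart_cost

print(should_wait(5, 40, 200, 60))   # True: 1200 < 2400
print(should_wait(20, 10, 300, 60))  # False: 6200 > 600
```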
You may like...
Multicore Systems On-Chip: Practical… (Abderazek Ben Abdallah), Hardcover, R1,950
Precalculus: Mathematics for Calculus… (Lothar Redlin, Saleem Watson, …), Paperback
Evaluating Children's Interactive… (Panos Markopoulos, Janet C. Read, …), Paperback, R1,178
Grandma and Grandpa Can You Code (Timothy Amadi, Eugene Amadi, …), Hardcover, R538
Big Data Management, Technologies, and… (Wen-Chen Hu, Naima Kaabouch), Hardcover, R4,548
Research Anthology on Strategies for… (Information R Management Association), Hardcover, R13,719