Explores the impact of the analysis of algorithms on many areas within and beyond computer science. Offers a flexible, interactive teaching format enhanced by a large selection of examples and exercises. Developed from the author's own graduate-level course, Methods in Algorithmic Analysis presents numerous theories, techniques, and methods used for analyzing algorithms. It exposes students to mathematical techniques and methods that are practical and relevant to theoretical aspects of computer science. After introducing basic mathematical and combinatorial methods, the text focuses on various aspects of probability, including finite sets, random variables, distributions, Bayes' theorem, and the Chebyshev inequality. It explores the role of recurrences in computer science, numerical analysis, engineering, and discrete mathematics applications. The author then describes the powerful tool of generating functions, which is demonstrated in enumeration problems such as probabilistic algorithms, compositions and partitions of integers, and shuffling. He also discusses the symbolic method and the principle of inclusion and exclusion and its applications. The book goes on to show how strings can be manipulated and counted, how finite state machines and Markov chains can help solve probabilistic and combinatorial problems, how to derive asymptotic results, and how convergence and singularities play leading roles in deducing asymptotic information from generating functions. The final chapter presents the definitions and properties of the mathematical infrastructure needed to accommodate generating functions. Accompanied by more than 1,000 examples and exercises, this comprehensive, classroom-tested text develops students' understanding of the mathematical methodology behind the analysis of algorithms. It emphasizes the important relation between continuous (classical) mathematics and discrete mathematics, which is the basis of computer science.
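As a small illustrative example of the kind of enumeration problem the book treats with generating functions (the example is not taken from the text), the Python sketch below counts compositions of an integer n, i.e. ordered sequences of positive parts summing to n, whose generating function x/(1-2x) yields 2^(n-1) compositions for n >= 1.

```python
# Illustrative sketch (not from the book): counting compositions of n, i.e.
# ordered sequences of positive integers summing to n. Their generating
# function is x/(1 - 2x), so there are 2^(n-1) compositions for n >= 1.

def compositions_by_recurrence(n_max):
    """c[n] = sum of c[n-k] over part sizes k = 1..n, with c[0] = 1."""
    c = [1] + [0] * n_max
    for n in range(1, n_max + 1):
        c[n] = sum(c[n - k] for k in range(1, n + 1))
    return c

def compositions_by_gf(n_max):
    """Coefficients read off the closed form of x/(1 - 2x)."""
    return [0] + [2 ** (n - 1) for n in range(1, n_max + 1)]

if __name__ == "__main__":
    assert compositions_by_recurrence(12)[1:] == compositions_by_gf(12)[1:]
    print(compositions_by_recurrence(6))   # [1, 1, 2, 4, 8, 16, 32]
```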
This book constitutes the proceedings of the 14th International Workshop on Knowledge Management and Acquisition for Intelligent Systems, PKAW 2016, held in Phuket, Thailand, in August 2016. The 16 full papers and 5 short papers included in this volume were carefully reviewed and selected from 61 initial submissions. They deal with knowledge acquisition and machine learning; knowledge acquisition and natural language processing; knowledge acquisition from network and big data; and knowledge acquisition and applications.
The book is based on the PhD thesis "Descriptive Set Theoretic Methods in Automata Theory," awarded the E.W. Beth Prize in 2015 for outstanding dissertations in the fields of logic, language, and information. The thesis reveals unexpected connections between advanced concepts in logic, descriptive set theory, topology, and automata theory and provides many deep insights into the interplay between these fields. It opens new perspectives on central problems in the theory of automata on infinite words and trees and offers very impressive advances in this theory from the point of view of topology. "...the thesis of Michal Skrzypczak offers certainly what we expect from excellent mathematics: new unexpected connections between a priori distinct concepts, and proofs involving enlightening ideas." Thomas Colcombet.
Edited in collaboration with FoLLI, the Association of Logic, Language and Information, this book constitutes the refereed proceedings of the 23rd Workshop on Logic, Language, Information and Communication, WoLLIC 2016, held in Puebla, Mexico, in August 2016. The 23 contributed papers, presented together with 9 invited lectures and tutorials, were carefully reviewed and selected from 33 submissions. The focus of the workshop is to provide a forum on interdisciplinary research involving formal logic, computing and programming theory, and natural language and reasoning.
Churn is the bane of any subscription business, such as content subscriptions, software as a service, and even ad-supported freemium apps. You can improve customer retention through product changes and targeted engagement campaigns based on data-driven interventions. This hands-on guide is packed with techniques for converting raw data into measurable metrics, testing hypotheses, and presenting findings that are easily understandable to non-technical decision makers. Don't let your hard-won customers vanish from subscription services, taking their money with them. In Fighting Churn with Data you'll learn powerful data-driven techniques to maximize customer retention and minimize actions that cause them to stop engaging or unsubscribe altogether.
* Identifying processes suited to machine learning
* Using machine learning to automate back office processes
* Seven everyday business process projects
* Using open source and cloud-based tools
* Case studies for machine learning decision making
For readers with basic data analysis skills, including Python and SQL.
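As a purely hypothetical sketch of one of the metrics discussed (a churn rate), and not code from the book, the snippet below computes the fraction of customers active in one month who are inactive the next; the data layout is an assumption made for illustration.

```python
# Hypothetical sketch of a simple churn metric: the fraction of customers
# active in one month who are no longer active in the following month.
# The (customer_id, month) event layout is an assumption for illustration.

from collections import defaultdict

def monthly_churn_rate(events, month, next_month):
    """events: iterable of (customer_id, month) pairs marking activity."""
    active = defaultdict(set)
    for customer, m in events:
        active[m].add(customer)
    base = active[month]
    if not base:
        return 0.0
    churned = base - active[next_month]
    return len(churned) / len(base)

if __name__ == "__main__":
    sample = [("a", "2020-01"), ("b", "2020-01"), ("c", "2020-01"),
              ("a", "2020-02"), ("c", "2020-02")]
    print(monthly_churn_rate(sample, "2020-01", "2020-02"))  # 0.333... (customer "b" churned)
```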
This, the 27th issue of Transactions on Large-Scale Data- and Knowledge-Centered Systems, contains extended and revised versions of 12 papers presented at the Big Data and Technology for Complex Urban Systems symposium, held in Kauai, HI, USA in January 2016. The papers explore the use of big data in complex urban systems in the areas of politics, society, commerce, tax, and emergency management.
This book discusses semantic interaction, a user interaction methodology for visual analytic applications that more closely couples the visual reasoning processes of people with the computation. This methodology affords user interaction on visual data representations that are native to the domain of the data. User interaction in visual analytics systems is critical to enabling visual data exploration. Interaction transforms people from mere viewers to active participants in the process of analyzing and understanding data. This discourse between people and data enables people to understand aspects of their data, such as structure, patterns, trends, outliers, and other properties that ultimately result in insight. Through interacting with visualizations, users engage in sensemaking, a process of developing and understanding relationships within datasets through foraging and synthesis. The book provides a description of the principles of semantic interaction, providing design guidelines for the integration of semantic interaction into visual analytics, examples of existing technologies that leverage semantic interaction, and a discussion of how to evaluate these technologies. Semantic interaction has the potential to increase the effectiveness of visual analytic technologies and opens possibilities for a fundamentally new design space for user interaction in visual analytics systems.
Designed for use in a second course on linear algebra, Matrix Theory and Applications with MATLAB covers the basics of the subject, from a review of matrix algebra through vector spaces to matrix calculus and unitary similarity, in a presentation that stresses insight, understanding, and applications. Among its most outstanding features is the integration of MATLAB throughout the text. Each chapter includes a MATLAB subsection that discusses the various commands used to do the computations in that section and offers code for the graphics and some algorithms used in the text.
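The book's worked computations are in MATLAB; as a rough, hypothetical Python/NumPy analogue (not code from the text), the sketch below performs one computation from the same circle of ideas, reducing a matrix to upper triangular form by a unitary similarity (Schur decomposition) and verifying the factorization.

```python
# Hypothetical NumPy/SciPy analogue (the book itself uses MATLAB) of a
# unitary-similarity computation: the Schur decomposition A = Z T Z^H,
# with Z unitary and T upper triangular carrying the eigenvalues of A.

import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

T, Z = schur(A, output="complex")              # T upper triangular, Z unitary
print(np.allclose(Z @ T @ Z.conj().T, A))      # True: A is recovered
print(np.allclose(Z.conj().T @ Z, np.eye(4)))  # True: Z is unitary
print(np.diag(T))                              # eigenvalues of A on the diagonal
```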
This book constitutes revised selected papers from the 7th International Workshop on Constructive Side-Channel Analysis and Secure Design, COSADE 2016, held in Graz, Austria, in April 2016. The 12 papers presented in this volume were carefully reviewed and selected from 32 submissions. They were organized in topical sections named: security and physical attacks; side-channel analysis (case studies); fault analysis; and side-channel analysis (tools).
This book will contemplate the nature of our participatory digital media culture, the diversity of actors involved, and how the role of the news librarian has evolved from information gatekeeper to knowledge networker, collaborating and facilitating content creation with print and broadcast media professionals. It will explore how information professionals assist in the newsroom, drawing on the author's experiential knowledge as an embedded research librarian in the media industry. The past decade has seen significant changes in the media landscape. Large media outlets have traditionally controlled news and information flows, with everyone obtaining news via these dominant channels. In the digital world, the nature of what constitutes news has changed in fundamental ways. Social media and technologies such as crowdsourcing now play a pivotal role in how broadcast media connect and engage with their audiences. The book will focus on news reporting in the age of social media, examining the significance of verification and evaluating social media content from a journalistic and Information Science (IS) perspective. With such an emphasis on using social media for research, it is imperative to have mechanisms in place to make sure that information is authoritative before passing it on to a client as correct and accurate. Technology innovation and the 24/7 news cycle are driving forces compelling information professionals and journalists alike to adapt and learn new skills. The shift to tablets and smartphones for communication, news, and entertainment has dramatically changed the library and media landscape. Finally, we will consider automated journalism and examine future roles for news library professionals in the age of digital social media.
This book deals with timing attacks on cryptographic ciphers. It describes and analyzes various unintended covert timing channels that are formed when ciphers are executed in microprocessors. The book considers modern superscalar microprocessors with features such as multi-threaded, pipelined, parallel, speculative, and out-of-order execution. Various timing attack algorithms are described and analyzed for both block ciphers and public-key ciphers. The interplay between the cipher implementation, the system architecture, and the attack's success is analyzed. Further, hardware and software countermeasures are discussed with the aim of illustrating methods to build systems that can protect against these attacks.
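As a toy, self-contained illustration of the underlying principle (and not one of the microarchitectural attacks analyzed in the book), the Python sketch below recovers a short secret from an early-exit comparison purely by measuring execution time.

```python
# Toy illustration of a timing side channel: an early-exit comparison runs
# longer the more leading bytes of the guess are correct, so timing leaks the
# secret byte by byte. The attacks in the book target real ciphers and
# microarchitectural features and are far more sophisticated.

import time

SECRET = b"s3cr"

def insecure_compare(guess, secret=SECRET):
    if len(guess) != len(secret):
        return False
    for g, s in zip(guess, secret):
        if g != s:
            return False          # early exit: time depends on the matching prefix
        time.sleep(0.0005)        # exaggerate the per-byte cost for the demo
    return True

def recover_secret(length=len(SECRET), trials=5):
    recovered = b""
    for pos in range(length):
        timings = {}
        for byte in range(256):
            guess = recovered + bytes([byte]) + b"\x00" * (length - pos - 1)
            start = time.perf_counter()
            for _ in range(trials):
                insecure_compare(guess)
            timings[byte] = time.perf_counter() - start
        recovered += bytes([max(timings, key=timings.get)])  # slowest guess matched furthest
    return recovered

if __name__ == "__main__":
    print(recover_secret())       # expected to print b's3cr' on an otherwise idle machine
```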
Frontiers of Higher Order Fuzzy Sets provides a unified representation theorem for higher order fuzzy sets. The book elaborates on the concept of gradual elements and their integration with higher order fuzzy sets. It is also devoted to the introduction of new frameworks based on general T2FSs, IT2FSs, gradual elements, shadowed sets and rough sets, which provide more capable frameworks for real applications. Applications of higher order fuzzy sets in various fields are discussed, and the properties and characteristics of the newly proposed frameworks are studied in particular. These frameworks, the result of integrating general T2FSs, IT2FSs, gradual elements, shadowed sets and rough sets, are shown to be suitable for application in the fields of bioinformatics, business, management, ambient intelligence, medicine, cloud computing and smart grids.
This book offers a coherent and comprehensive approach to feature subset selection in the scope of classification problems, explaining the foundations, real application problems and the challenges of feature selection for high-dimensional data. The authors first focus on the analysis and synthesis of feature selection algorithms, presenting a comprehensive review of basic concepts and experimental results for the most well-known algorithms. They then address different real scenarios with high-dimensional data, showing the use of feature selection algorithms in contexts with different requirements and information: microarray data, intrusion detection, tear film lipid layer classification and cost-based features. The book then delves into the big-dimensionality scenario, paying attention to important problems in high-dimensional spaces, such as scalability, distributed processing and real-time processing, which open up new and interesting challenges for researchers. The book is useful for practitioners, researchers and graduate students in the areas of machine learning and data mining.
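As a minimal, hypothetical illustration of one family of methods the book surveys, filter-based feature selection, the scikit-learn sketch below scores each feature of a synthetic high-dimensional dataset with a univariate test and keeps the top ten; all dataset parameters are arbitrary.

```python
# Minimal, hypothetical illustration of filter-based feature selection:
# score each feature with a univariate ANOVA F-test and keep the k best.

from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic high-dimensional data: 500 features, only 10 of them informative.
X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=10, random_state=0)

selector = SelectKBest(score_func=f_classif, k=10)
X_reduced = selector.fit_transform(X, y)

print(X_reduced.shape)                              # (200, 10)
print(sorted(selector.get_support(indices=True)))   # indices of the kept features
```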
This book constitutes the refereed proceedings of the 23rd International Static Analysis Symposium, SAS 2016, held in Edinburgh, UK, in September 2016. The 21 papers presented in this volume were carefully reviewed and selected from 55 submissions. The contributions cover a variety of multi-disciplinary topics in abstract domains; abstract interpretation; abstract testing; bug detection; data flow analysis; model checking; new applications; program transformation; program verification; security analysis; theoretical frameworks; and type checking.
A major, comprehensive professional text/reference for designing and maintaining security and reliability. From basic concepts to design principles to deployment, all critical concepts and phases are clearly explained and presented. Includes coverage of wireless security testing techniques and intrusion prevention techniques. An essential resource for wireless network administrators and developers.
This book represents the refereed proceedings of the Ninth International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing, held at the University of Warsaw (Poland) in August 2010. These biennial conferences are major events for Monte Carlo research and the premier event for quasi-Monte Carlo research. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. The reader will be provided with information on the latest developments in these very active areas. The book is an excellent reference for theoreticians and practitioners interested in solving high-dimensional computational problems arising, in particular, in finance and statistics.
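As a small illustrative sketch, unrelated to any particular paper in the proceedings, the snippet below estimates a two-dimensional integral with plain Monte Carlo and with a quasi-Monte Carlo (Sobol) point set from SciPy.

```python
# Illustrative sketch (not from the proceedings): estimate the integral of
# f(x, y) = exp(-(x^2 + y^2)) over the unit square with plain Monte Carlo
# versus a scrambled Sobol (quasi-Monte Carlo) point set.

import numpy as np
from scipy.stats import qmc

def f(points):
    return np.exp(-np.sum(points ** 2, axis=1))

n = 2 ** 12                                         # a power of two suits Sobol sequences
rng = np.random.default_rng(0)

mc_points = rng.random((n, 2))                      # pseudo-random points
qmc_points = qmc.Sobol(d=2, scramble=True, seed=0).random(n)  # low-discrepancy points

print("MC  estimate:", f(mc_points).mean())
print("QMC estimate:", f(qmc_points).mean())
# Exact value: (integral of exp(-t^2) over [0, 1])^2, approximately 0.5577
```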
The authors describe systematic methods for uncovering scientific laws a priori, on the basis of intuition, or "Gedanken Experiments". Mathematical expressions of scientific laws are, by convention, constrained by the rule that their form must be invariant with changes of the units of their variables. This constraint makes it possible to narrow down the possible forms of the laws. It is closely related to, but different from, dimensional analysis. It is a mathematical book, largely based on solving functional equations. In fact, one chapter is an introduction to the theory of functional equations.
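A standard one-line instance of this invariance argument (the example is illustrative and not necessarily one used by the authors) is the period of a pendulum, derived below.

```latex
% Illustrative worked example of the unit-invariance constraint (not
% necessarily an example from the book): the period of a pendulum.
% Assume a power-law dependence T = C L^a m^b g^c and require the form to be
% invariant under changes of units, i.e. the dimensions must match:
\[
  \mathrm{s}^{1} \;=\; \mathrm{m}^{\,a+c}\,\mathrm{kg}^{\,b}\,\mathrm{s}^{-2c}
  \quad\Longrightarrow\quad
  b = 0,\qquad a + c = 0,\qquad -2c = 1 .
\]
\[
  \text{Hence } a = \tfrac12,\; c = -\tfrac12
  \quad\text{and}\quad
  T \;=\; C\,\sqrt{L/g},
\]
% with the dimensionless constant C (equal to 2\pi for small oscillations)
% left undetermined by the invariance argument alone.
```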
In this introductory textbook the author explains the key topics in cryptography. He takes a modern approach, where defining what is meant by "secure" is as important as creating something that achieves that goal, and security definitions are central to the discussion throughout. The author balances a largely non-rigorous style (many proofs are only sketched) with appropriate formality and depth. For example, he uses the terminology of groups and finite fields so that the reader can understand both the latest academic research and "real-world" documents such as application programming interface descriptions and cryptographic standards. The text employs colour to distinguish between public and private information, and all chapters include summaries and suggestions for further reading. This is a suitable textbook for advanced undergraduate and graduate students in computer science, mathematics and engineering, and for self-study by professionals in information security. While the appendix summarizes most of the basic algebra and notation required, it is assumed that the reader has a basic knowledge of discrete mathematics, probability, and elementary calculus.
This work reviews the state of the art in SVM and perceptron classifiers. The Support Vector Machine (SVM) is one of the most popular tools for dealing with a variety of machine-learning tasks, including classification. SVMs are associated with maximizing the margin between two classes; the underlying optimization problem is convex, guaranteeing a globally optimal solution. The weight vector associated with an SVM is obtained as a linear combination of some of the boundary and noisy vectors. Further, when the data are not linearly separable, tuning the coefficient of the regularization term becomes crucial. Even though SVMs have popularized the kernel trick, linear SVMs remain the popular choice in most high-dimensional practical applications. The text examines applications to social and information networks. The work also discusses another popular linear classifier, the perceptron, and compares its performance with that of the SVM in different application areas.
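A minimal sketch (not taken from this work) comparing the two classifiers on a synthetic dataset with scikit-learn follows; the dataset parameters are arbitrary.

```python
# Minimal sketch comparing the two linear classifiers discussed in the text,
# a linear SVM and a perceptron, on a synthetic dataset (parameters arbitrary).

from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=1000, n_features=50,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = LinearSVC(C=1.0, max_iter=10000).fit(X_tr, y_tr)   # margin maximization, convex objective
perceptron = Perceptron(max_iter=1000).fit(X_tr, y_tr)   # mistake-driven linear classifier

print("linear SVM accuracy:", svm.score(X_te, y_te))
print("perceptron accuracy:", perceptron.score(X_te, y_te))
```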
This unique textbook/reference presents unified coverage of bioinformatics topics relating to both biological sequences and biological networks, providing an in-depth analysis of cutting-edge distributed algorithms, as well as of relevant sequential algorithms. In addition to introducing the latest algorithms in this area, more than fifteen new distributed algorithms are also proposed. Topics and features: reviews a range of open challenges in biological sequences and networks; describes in detail both sequential and parallel/distributed algorithms for each problem; suggests approaches for distributed algorithms as possible extensions to sequential algorithms, when the distributed algorithms for the topic are scarce; proposes a number of new distributed algorithms in each chapter, to serve as potential starting points for further research; concludes each chapter with self-test exercises, a summary of the key points, a comparison of the algorithms described, and a literature review.
Computability and complexity theory are two central areas of research in theoretical computer science. This book provides a systematic, technical development of "algorithmic randomness" and complexity for scientists from diverse fields.
This textbook connects three vibrant areas at the interface between economics and computer science: algorithmic game theory, computational social choice, and fair division. It thus offers an interdisciplinary treatment of collective decision making from an economic and computational perspective. Part I introduces algorithmic game theory, focusing on both noncooperative and cooperative game theory. Part II introduces computational social choice, focusing on both preference aggregation (voting) and judgment aggregation. Part III introduces fair division, focusing on the division of both a single divisible resource ("cake-cutting") and multiple indivisible and unshareable resources ("multiagent resource allocation"). In all three parts, much weight is given to the algorithmic and complexity-theoretic aspects of problems arising in these areas, and the interconnections between the three parts are of central interest.
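As a tiny illustration of preference aggregation, one of the topics of Part II (the rule is standard, but the example profile is made up), the sketch below computes a Borda winner.

```python
# Tiny illustration of preference aggregation (a Part II topic): the Borda
# rule gives a candidate m-1 points per first place, m-2 per second place,
# and so on, and elects the candidate with the largest total score.

from collections import Counter

def borda_winner(ballots):
    """ballots: list of rankings, each a list of candidates, best first."""
    m = len(ballots[0])
    scores = Counter()
    for ranking in ballots:
        for position, candidate in enumerate(ranking):
            scores[candidate] += m - 1 - position
    winner, top_score = scores.most_common(1)[0]
    return winner, dict(scores)

if __name__ == "__main__":
    profile = [["a", "b", "c"], ["a", "b", "c"], ["b", "c", "a"], ["c", "a", "b"]]
    print(borda_winner(profile))   # ('a', {'a': 5, 'b': 4, 'c': 3})
```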
New generations of IT users are increasingly abstracted from the underlying devices and platforms that provide and safeguard their services. As a result they may have little awareness that they are critically dependent on the embedded security devices that are becoming pervasive in daily modern life. Secure Smart Embedded Devices, Platforms and Applications provides a broad overview of the many security and practical issues of embedded devices, tokens, and their operating systems, platforms and main applications. It also addresses a diverse range of industry/government initiatives and considerations, while focusing strongly on technical and practical security issues. The benefits and pitfalls of developing and deploying applications that rely on embedded systems and their security functionality are presented. A sufficient level of technical detail to support embedded systems is provided throughout the text, although the book is quite readable for those seeking awareness through an initial overview of the topics. This edited volume benefits from the contributions of industry and academic experts and helps provide a cross-discipline overview of the security and practical issues for embedded systems, tokens, and platforms. It is an ideal complement to the earlier work, Smart Cards, Tokens, Security and Applications, from the same editors.
This book offers an introduction to cryptology, the science that makes secure communications possible, and addresses its two complementary aspects: cryptography, the art of making secure building blocks, and cryptanalysis, the art of breaking them. The text describes some of the most important systems in detail, including AES, RSA, group-based and lattice-based cryptography, signatures, hash functions, random generation, and more, providing detailed underpinnings for most of them. With regard to cryptanalysis, it presents a number of basic tools such as the differential and linear methods and lattice attacks. This text, based on lecture notes from the author's many courses on the art of cryptography, consists of two interlinked parts. The first, modern part explains some of the basic systems used today and some attacks on them. However, a text on cryptology would not be complete without describing its rich and fascinating history. As such, the colorfully illustrated historical part interspersed throughout the text highlights selected inventions and episodes, providing a glimpse into the past of cryptology. The first sections of this book can be used as a textbook for an introductory course for computer science or mathematics students. Other sections are suitable for advanced undergraduate or graduate courses. Many exercises are included. The emphasis is on providing a reasonably complete explanation of the background for some selected systems.
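As a toy illustration of one of the systems described (RSA), with deliberately tiny and insecure parameters and no connection to the book's actual code or exercises, the sketch below generates a key pair and round-trips a message.

```python
# Toy RSA round trip with deliberately tiny, insecure parameters, meant only
# to illustrate the textbook key-generation / encryption / decryption steps.

p, q = 61, 53                      # small primes (real keys use far larger primes)
n = p * q                          # public modulus
phi = (p - 1) * (q - 1)            # Euler's totient of n
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # private exponent: inverse of e modulo phi

message = 42
ciphertext = pow(message, e, n)    # c = m^e mod n
recovered = pow(ciphertext, d, n)  # m = c^d mod n

print(ciphertext, recovered)       # recovered equals the original message, 42
assert recovered == message
```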
The two-volume set LNCS 9722 and LNCS 9723 constitutes the refereed proceedings of the 21st Australasian Conference on Information Security and Privacy, ACISP 2016, held in Melbourne, VIC, Australia, in July 2016. The 52 revised full and 8 short papers presented together with 6 invited papers in this double volume were carefully reviewed and selected from 176 submissions. The papers of Part I (LNCS 9722) are organized in topical sections on National Security Infrastructure; Social Network Security; Bitcoin Security; Statistical Privacy; Network Security; Smart City Security; Digital Forensics; Lightweight Security; Secure Batch Processing; Pseudo Random/One-Way Function; Cloud Storage Security; Password/QR Code Security; and Functional Encryption and Attribute-Based Cryptosystem. Part II (LNCS 9723) comprises topics such as Signature and Key Management; Public Key and Identity-Based Encryption; Searchable Encryption; Broadcast Encryption; Mathematical Primitives; Symmetric Cipher; Public Key and Identity-Based Encryption; Biometric Security; Digital Forensics; National Security Infrastructure; Mobile Security; Network Security; and Pseudo Random / One-Way Function.