The first part of this book covers the key concepts of cryptography at an undergraduate level, from encryption and digital signatures to cryptographic protocols. Essential techniques are demonstrated in protocols for key exchange, user identification, electronic elections and digital cash. In the second part, more advanced topics are addressed, such as the bit security of one-way functions and computationally perfect pseudorandom bit generators. The security of cryptographic schemes is a central topic. Typical examples of provably secure encryption and signature schemes and their security proofs are given. Though particular attention is given to the mathematical foundations, no special background in mathematics is presumed. The necessary algebra, number theory and probability theory are included in the appendix. Each chapter closes with a collection of exercises. In the second edition the authors added a complete description of the AES, an extended section on cryptographic hash functions, and new sections on random oracle proofs and public-key encryption schemes that are provably secure against adaptive chosen-ciphertext attacks. The third edition is a further substantive extension, with new topics added, including: elliptic curve cryptography; Paillier encryption; quantum cryptography; the new SHA-3 standard for cryptographic hash functions; a considerably extended section on electronic elections and Internet voting; mix nets; and zero-knowledge proofs of shuffles. The book is appropriate for undergraduate and graduate students in computer science, mathematics, and engineering.
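To make the protocol material concrete, here is a minimal sketch of Diffie-Hellman key exchange, the classic example of the key-exchange protocols the book treats. The tiny prime and the specific exponents are illustrative assumptions, not parameters from the book; real deployments use large primes or elliptic-curve groups.

```python
# Toy Diffie-Hellman key exchange (illustrative only; real systems use
# large safe primes or elliptic-curve groups, not a 9-bit modulus).
p = 467          # small public prime, for demonstration only
g = 2            # public generator of a subgroup of Z_p*

alice_secret = 153                       # Alice's private exponent
bob_secret = 197                         # Bob's private exponent

alice_public = pow(g, alice_secret, p)   # sent to Bob over the open channel
bob_public = pow(g, bob_secret, p)       # sent to Alice over the open channel

# Each side combines its own secret with the other's public value;
# (g^a)^b = (g^b)^a mod p, so both arrive at the same key.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)

assert alice_key == bob_key              # both derive the same shared secret
print("shared secret:", alice_key)
```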
This book explores Probabilistic Cellular Automata (PCA) from the perspectives of statistical mechanics, probability theory, computational biology and computer science. PCA are extensions of the well-known Cellular Automata models of complex systems, characterized by random updating rules. Thanks to their probabilistic component, PCA offer flexible computing tools for complex numerical constructions, and realistic simulation tools for phenomena driven by interactions among a large number of neighboring structures. PCA are currently being used in various fields, ranging from pure probability to the social sciences and including a wealth of scientific and technological applications. This situation has produced a highly diversified pool of theoreticians, developers and practitioners whose interaction is highly desirable but can be hampered by differences in jargon and focus. This book - just as the workshop on which it is based - is an attempt to overcome these differences and foster interest among newcomers and interaction between practitioners from different fields. It is not intended as a treatise, but rather as a gentle introduction to the role and relevance of PCA technology, illustrated with a number of applications in probability, statistical mechanics, computer science, the natural sciences and dynamical systems. As such, it will be of interest to students and non-specialists looking to enter the field and to explore its challenges and open issues.
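As a taste of what a PCA is, here is a minimal sketch of a one-dimensional automaton with a noisy majority rule; the rule, the noise level and the lattice size are illustrative assumptions, not a model taken from the book.

```python
import random

def pca_step(cells, noise=0.1):
    """One synchronous update of a 1-D probabilistic cellular automaton.

    Each cell takes the majority of its neighbourhood (left, self, right,
    periodic boundary), but with probability `noise` flips to the opposite
    state -- the random component that distinguishes PCA from classical CA.
    """
    n = len(cells)
    nxt = []
    for i in range(n):
        majority = int(cells[i - 1] + cells[i] + cells[(i + 1) % n] >= 2)
        nxt.append(majority ^ (random.random() < noise))
    return nxt

random.seed(0)
cells = [random.randint(0, 1) for _ in range(60)]
for _ in range(20):                      # run 20 synchronous generations
    cells = pca_step(cells)
    print("".join(".#"[c] for c in cells))
```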
The aim of the book is to give an accessible introduction to mathematical models and signal processing methods in speech and hearing sciences for senior undergraduate and beginning graduate students with basic knowledge of linear algebra, differential equations, numerical analysis, and probability. Speech and hearing sciences are fundamental to numerous technological advances of the digital world in the past decade, from music compression in MP3 to digital hearing aids, from network-based voice-enabled services to speech interaction with mobile phones. Mathematics and computation are intimately related to these leaps and bounds. On the other hand, speech and hearing are strongly interdisciplinary areas where dissimilar scientific and engineering publications and approaches often coexist and make it difficult for newcomers to enter.
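As a minimal example of the signal processing such a course builds on, the sketch below recovers the frequency of a pure tone from its discrete Fourier spectrum; the sampling rate and the 440 Hz test tone are assumed for illustration.

```python
import numpy as np

fs = 8000                                # sampling rate in Hz (assumed)
t = np.arange(fs) / fs                   # one second of samples
tone = np.sin(2 * np.pi * 440 * t)       # a 440 Hz test tone

spectrum = np.abs(np.fft.rfft(tone))     # magnitude spectrum
freqs = np.fft.rfftfreq(len(tone), d=1 / fs)

# The spectral peak recovers the tone's frequency.
print("dominant frequency: %.1f Hz" % freqs[np.argmax(spectrum)])  # -> 440.0
```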
This book discusses all the major nature-inspired algorithms, with a focus on their application to navigation and routing problems. It also reviews the approximation methods and recent nature-inspired approaches for practical navigation, and compares these methods with traditional algorithms to validate the approach for the case studies discussed. Further, it examines the design of alternative solutions using nature-inspired techniques, and explores the challenges of navigation and routing problems and nature-inspired metaheuristic approaches.
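As an illustration of the genre of method surveyed, here is a minimal ant colony optimization sketch for a shortest-path problem. The toy graph, pheromone parameters and the simplified per-ant evaporation schedule are assumptions for illustration and do not reproduce any case study from the book.

```python
import random

# A small weighted directed graph as an adjacency dict (invented example).
graph = {
    "A": {"B": 2, "C": 5},
    "B": {"C": 1, "D": 4},
    "C": {"D": 1},
    "D": {},
}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def walk(src, dst, alpha=1.0, beta=2.0):
    """One ant builds a path, preferring strong pheromone and short edges."""
    path, node = [src], src
    while node != dst:
        edges = list(graph[node].items())
        if not edges:                    # dead end: abandon this ant
            return None, float("inf")
        weights = [pheromone[(node, v)] ** alpha * (1 / w) ** beta
                   for v, w in edges]
        node = random.choices([v for v, _ in edges], weights=weights)[0]
        path.append(node)
    cost = sum(graph[u][v] for u, v in zip(path, path[1:]))
    return path, cost

random.seed(1)
best = (None, float("inf"))
for _ in range(200):                     # send out 200 ants
    path, cost = walk("A", "D")
    if path is None:
        continue
    if cost < best[1]:
        best = (path, cost)
    for u, v in zip(path, path[1:]):     # deposit pheromone on used edges
        pheromone[(u, v)] += 1.0 / cost
    for e in pheromone:                  # evaporate everywhere
        pheromone[e] *= 0.95
print("best path:", best)                # expected: ['A','B','C','D'], cost 4
```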
This is the first book devoted to the task of computing integrability structures by computer. The symbolic computation of integrability operators is a computationally hard problem, and the book covers a large number of situations through tutorials. The mathematical part of the book is a new approach to integrability structures that makes it possible to treat all of them in a unified way. The software is an official package of Reduce. Reduce is free software, so everybody can download it and experiment with the programs available at our website.
This book shows how mathematics, computer science and science can be usefully and seamlessly intertwined. It begins with a general model of cognitive processes in a network of computational nodes, such as neurons, using a variety of tools from mathematics, computational science and neurobiology. It then moves on to solve the diffusion model from a low-level random walk point of view. It also demonstrates how this idea can be used in a new approach to solving the cable equation, in order to better understand the neural computation approximations. It introduces specialized data for emotional content, which allows a brain model to be built using MATLAB tools, and also highlights a simple model of cognitive dysfunction.
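The random-walk view of diffusion can be sketched in a few lines: the mean squared displacement of unbiased walkers grows linearly with time, which is the defining signature of diffusion. The step count and ensemble size below are assumed for illustration.

```python
import random

random.seed(0)
steps, walkers = 1000, 5000
positions = [0] * walkers

for _ in range(steps):                   # each walker takes a +/-1 step
    positions = [x + random.choice((-1, 1)) for x in positions]

# For an unbiased lattice walk, <x^2> = t, i.e. diffusion with D = 1/2.
msd = sum(x * x for x in positions) / walkers
print("mean squared displacement: %.1f (theory: %d)" % (msd, steps))
```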
This book presents a mathematical treatment of radio resource allocation in modern cellular communications systems operating in contested environments. It focuses on fulfilling the quality-of-service requirements of the applications running on user devices, which rely on the cellular system, with attention to elevating the users' quality of experience. The authors also address spectrum congestion by allowing sharing with the band incumbents while still providing quality-of-service-minded resource allocation in the network. The content is of particular interest to scheduling experts in the telecommunications industry, academics working on communications applications, and graduate students whose research deals with resource allocation and quality of service.
This book addresses the basics of interval/fuzzy set theory, artificial neural networks (ANN) and computational methods. It presents step-by-step modeling for application problems along with simulation and numerical solutions. In general, every science and engineering problem is inherently biased by uncertainty, and there is often a need to model, solve and interpret problems in the world of uncertainty. At the same time, exact information about models and parameters of practical applications is usually not known and precise values do not exist. This book discusses uncertainty in both data and models. It consists of seven chapters covering various aspects of fuzzy uncertainty in application problems, such as shallow water wave equations, static structural problems, robotics, radon diffusion in soil, risk of invasive alien species and air quality quantification. These problems are handled by means of advanced computational and fuzzy theory along with machine intelligence when the uncertainties involved are fuzzy. The book proposes new fuzzy computing methods that can support knowledge construction in other areas where inexact information is present.
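As a minimal sketch of the simplest uncertainty model treated here, the interval arithmetic below propagates uncertain inputs through sums and products without ever committing to precise values; the Interval class and the sample numbers are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """A closed interval [lo, hi] representing an uncertain quantity."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product interval is bounded by the extreme endpoint products.
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

# An uncertain load and an uncertain factor propagate through a model.
load = Interval(9.5, 10.5)
factor = Interval(1.5, 2.5)
print(load + factor)   # Interval(lo=11.0, hi=13.0)
print(load * factor)   # Interval(lo=14.25, hi=26.25)
```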
The Semantic Web is characterized by the existence of a very large number of distributed semantic resources, which together define a network of ontologies. These ontologies in turn are interlinked through a variety of different meta-relationships such as versioning, inclusion, and many more. This scenario is radically different from the relatively narrow contexts in which ontologies have been traditionally developed and applied, and thus calls for new methods and tools to effectively support the development of novel network-oriented semantic applications. This book by Suarez-Figueroa et al. provides the necessary methodological and technological support for the development and use of ontology networks, which ontology developers need in this distributed environment. After an introduction, in its second part the authors describe the NeOn Methodology framework. The book's third part details the key activities relevant to the ontology engineering life cycle. For each activity, a general introduction, methodological guidelines, and practical examples are provided. The fourth part then presents a detailed overview of the NeOn Toolkit and its plug-ins. Lastly, case studies from the pharmaceutical and the fishery domain round out the work. The book primarily addresses two main audiences: students (and their lecturers) who need a textbook for advanced undergraduate or graduate courses on ontology engineering, and practitioners who need to develop ontologies in particular or Semantic Web-based applications in general. Its educational value is maximized by its structured approach to explaining guidelines and combining them with case studies and numerous examples. The description of the open source NeOn Toolkit provides an additional asset, as it allows readers to easily evaluate and apply the ideas presented.
This book presents cutting-edge developments in the advanced mathematical theories utilized in computer graphics research - fluid simulation, realistic image synthesis, and texture, visualization and digital fabrication. A spin-off book from the International Symposium on Mathematical Progress in Expressive Image Synthesis in 2016 and 2017 (MEIS2016/2017) held in Fukuoka, Japan, it includes lecture notes and an expert introduction to the latest research presented at the symposium. The book offers an overview of emerging interdisciplinary themes between computer graphics and the mathematical theories driving it, such as discrete differential geometry. Further, it highlights open problems in those themes, making it a valuable resource not only for researchers, but also for graduate students interested in computer graphics and mathematics.
This book develops a coherent and quite general theoretical approach to algorithm design for iterative learning control based on the use of operator representations and quadratic optimization concepts, including the related ideas of inverse model control and gradient-based design. Using detailed examples taken from linear, discrete and continuous-time systems, the author gives the reader access to theories based on either signal or parameter optimization. Although the two approaches are shown to be related in a formal mathematical sense, the text presents them separately, as their relevant algorithm design issues are distinct and give rise to different performance capabilities. Together with algorithm design, the text demonstrates the underlying robustness of the paradigm and also includes new control laws that are capable of incorporating input and output constraints, that enable the algorithm to reconfigure systematically in order to meet the requirements of different reference and auxiliary signals, and that support new properties such as spectral annihilation. Iterative Learning Control will interest academics and graduate students working in control, who will find it a useful reference to the current status of a powerful and increasingly popular method of control. The depth of background theory and links to practical systems will be of use to engineers responsible for precision repetitive processes.
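The gradient-based design mentioned above admits a compact sketch: for a lifted discrete-time plant y = G u, repeating the update u_{k+1} = u_k + beta * G^T e_k across trials drives the tracking error toward zero. The plant's impulse response and the gain below are assumptions for illustration, not examples from the book.

```python
import numpy as np

# Lifted representation of a discrete-time SISO plant over N samples:
# y = G @ u, with G lower-triangular Toeplitz in the impulse response.
N = 50
h = 0.5 ** np.arange(N)                   # assumed impulse response
G = np.array([[h[i - j] if i >= j else 0.0 for j in range(N)]
              for i in range(N)])

r = np.sin(np.linspace(0, 2 * np.pi, N))  # reference to track each trial
u = np.zeros(N)
beta = 0.3                                # gradient step size (assumed)

for trial in range(100):
    e = r - G @ u                         # tracking error on this trial
    u = u + beta * G.T @ e                # gradient-based ILC update

print("final error norm: %.2e" % np.linalg.norm(r - G @ u))
```

The update is gradient descent on the quadratic cost ||r - G u||^2 across trials, so it converges whenever beta is small relative to the largest singular value of G.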
This book presents the latest research advances in complex network structure analytics based on computational intelligence (CI) approaches, particularly evolutionary optimization. Most if not all network issues are actually optimization problems, which are mostly NP-hard and challenge conventional optimization techniques. To effectively and efficiently solve these hard optimization problems, CI based network structure analytics offer significant advantages over conventional network analytics techniques. Meanwhile, using CI techniques may facilitate smart decision making by providing multiple options to choose from, while conventional methods can only offer a decision maker a single suggestion. In addition, CI based network structure analytics can greatly facilitate network modeling and analysis. Moreover, employing CI techniques to resolve network issues is likely to inspire other fields of study such as recommender systems and systems biology, which will in turn expand CI's scope and applications. As a comprehensive text, the book covers a range of key topics, including network community discovery, evolutionary optimization, network structure balance analytics, network robustness analytics, community-based personalized recommendation, influence maximization, and biological network alignment. Offering a rich blend of theory and practice, the book is suitable for students, researchers and practitioners interested in network analytics and computational intelligence, both as a textbook and as a reference work.
This book reveals the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945. It traces the all-important genesis and development of the cryptanalytic techniques used to break the main Japanese Navy code (JN-25) and the Japanese Army's Water Transport Code during WWII. This is the first book to describe, explain and analyze the code breaking techniques developed and used to provide this intelligence, thus closing the sole remaining gap in the published accounts of the Pacific War. The authors also explore the organization of cryptographic teams and issues of security, censorship, and leaks. Correcting gaps in previous research, this book illustrates how Sigint remained crucial to Allied planning throughout the war. It helped direct the advance to the Philippines from New Guinea, the sea battles and the submarine onslaught on merchant shipping. Written by well-known authorities on the history of cryptography and mathematics, Code Breaking in the Pacific is designed for cryptologists, mathematicians and researchers working in communications security. Advanced-level students interested in cryptology, the history of the Pacific War, mathematics or the history of computing will also find this book a valuable resource.
Quantum physics started in the 1920s with wave mechanics and the wave-particle duality. However, the last 20 years have seen a second quantum revolution, centered around non-locality and quantum correlations between measurement outcomes. The associated key property, entanglement, is recognized today as the signature of quantumness. This second revolution opened the possibility of studying quantum correlations without any assumption on the internal functioning of the measurement apparatuses, the so-called Device-Independent Approach to Quantum Physics. This thesis explores this new approach using the powerful geometrical tool of polytopes. Emphasis is placed on the study of non-locality in the case of three or more parties, where it is shown that a whole new variety of phenomena appears compared to the bipartite case. Genuine multiparty entanglement is also studied for the first time within the device-independent framework. Finally, these tools are used to answer a long-standing open question: could quantum non-locality be explained by influences that propagate from one party to the others faster than light, but that remain hidden so that one cannot use them to communicate faster than light? This would provide a way around Einstein's notion of action at a distance that would be compatible with relativity. However, the answer is shown to be negative, as such influences could not remain hidden.
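The polytope viewpoint is easiest to see in the familiar bipartite case, where the nontrivial facets of the local polytope are CHSH inequalities, which quantum correlations violate up to Tsirelson's bound:

```latex
% CHSH: a facet of the bipartite local polytope and its quantum violation
\[
  S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b')
  \;\le\; 2 \quad\text{(local hidden variables)},
\]
\[
  S_{\max}^{\text{quantum}} \;=\; 2\sqrt{2} \quad\text{(Tsirelson's bound)}.
\]
```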
Imagine yourself as a military officer in a conflict zone trying to identify locations of weapons caches supporting road-side bomb attacks on your country's troops. Or imagine yourself as a public health expert trying to identify the location of contaminated water that is causing diarrheal diseases in a local population. Geospatial abduction is a new technique introduced by the authors that allows such problems to be solved. Geospatial Abduction provides the mathematics underlying geospatial abduction and the algorithms to solve such problems in practice; it has wide applicability and can be used by practitioners and researchers in many different fields. Real-world applications of geospatial abduction to military problems are included. Compelling examples drawn from other domains as diverse as criminology, epidemiology and archaeology are covered as well. This book also includes access to a dedicated website on geospatial abduction hosted by the University of Maryland. Geospatial Abduction targets practitioners working in general AI, game theory, linear programming, data mining, machine learning, and more. Those working in the fields of computer science, mathematics, geoinformation, geological and biological science will also find this book valuable.
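In one simplified reading (an assumption of this sketch, not the book's full formalism), a geospatial abduction problem asks for a small set of candidate points that jointly explain all observations, where a candidate explains an observation only within a distance bound; that reading reduces to a covering problem, for which a greedy pass is the standard heuristic. The coordinates and radius below are invented for illustration.

```python
from math import dist  # Python 3.8+

# Observations (e.g. attack sites) and candidate explanation points.
observations = [(1, 1), (2, 1), (8, 8), (9, 7), (5, 5)]
candidates = [(1.5, 1.0), (8.5, 7.5), (5.0, 5.5), (0.0, 9.0)]
radius = 1.5          # a candidate "explains" observations within this radius

def explains(c):
    return {o for o in observations if dist(c, o) <= radius}

# Greedy set cover: repeatedly pick the candidate that explains the most
# still-unexplained observations.
uncovered, chosen = set(observations), []
while uncovered:
    best = max(candidates, key=lambda c: len(explains(c) & uncovered))
    gained = explains(best) & uncovered
    if not gained:
        raise ValueError("some observations cannot be explained")
    chosen.append(best)
    uncovered -= gained

print("inferred source locations:", chosen)
```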
Information is an important concept that is studied extensively across a range of disciplines, from the physical sciences to genetics to psychology to epistemology. Information continues to increase in importance, and the present age has been referred to as the "Information Age." One may understand information in a variety of ways. For some, information is found in facts that were previously unknown. For others, a fact must have some economic value to be considered information. Other people emphasize the movement through a communication channel from one location to another when describing information. In all of these instances, information is the set of characteristics of the output of a process. Yet information has seldom been studied in a consistent way across different disciplines. "Information from Processes" provides a discipline-independent and precise presentation of both information and computing processes. Information concepts and phenomena are examined in an effort to understand them, given a hierarchy of information processes, where one process uses others. Research about processes and computing is applied to answer the question of what information can and cannot be produced, and to determine the nature of this information (theoretical information science). The book also presents some of the basic processes that are used in specific domains (applied information science), such as those that generate information in areas like reasoning, the evolution of informative systems, cryptography, knowledge, natural language, and the economic value of information. Written for researchers and graduate students in information science and related fields, "Information from Processes" details a unique information model independent from other concepts in computer or archival science, which is thus applicable to a wide range of domains. Combining theoretical and empirical methods as well as psychological, mathematical, philosophical, and economic techniques, Losee's book delivers a solid basis and starting point for future discussions and research about the creation and use of information.
This book offers a self-contained guide to advanced algorithms and their applications in various fields of science. Gathering contributions by authoritative researchers in the fields of mathematics, statistics and computer science, it aims to offer a comprehensive and up-to-date view of algorithms, including the theory behind them, as well as practical considerations, current limitations and solutions. It covers applications in energy management, decision making, computer networks, materials science, mechanics and process optimization. It offers an integrated and timely guide to important algorithms, and represents a valuable reference resource for graduate students and researchers in various fields of applied mathematics, statistics and engineering.
Fuzzy Cluster Analysis presents advanced and powerful fuzzy clustering techniques. This thorough and self-contained introduction to fuzzy clustering methods and applications covers classification, image recognition, data analysis and rule generation. Combining theoretical and practical perspectives, each method is analysed in detail and fully illustrated with examples. Features include:
Logical form has always been a prime concern for philosophers belonging to the analytic tradition. For at least one century, the study of logical form has been widely adopted as a method of investigation, relying on its capacity to reveal the structure of thoughts or the constitution of facts. This book focuses on the very idea of logical form, which is directly relevant to any principled reflection on that method. Its central thesis is that there is no such thing as a correct answer to the question of what logical form is: two significantly different notions of logical form are needed to fulfill two major theoretical roles that pertain respectively to logic and to semantics. This thesis has a negative and a positive side. The negative side is that a deeply rooted presumption about logical form turns out to be overly optimistic: there is no unique notion of logical form that can play both roles. The positive side is that the distinction between two notions of logical form, once properly spelled out, sheds light on some fundamental issues concerning the relation between logic and language.
Analysis and Control of Boolean Networks presents a systematic new approach to the investigation of Boolean control networks. The fundamental tool in this approach is a novel matrix product called the semi-tensor product (STP). Using the STP, a logical function can be expressed as a conventional discrete-time linear system. In the light of this linear expression, certain major issues concerning Boolean network topology - fixed points, cycles, transient times and basins of attraction - can be easily revealed by a set of formulae. This framework renders the state-space approach to dynamic control systems applicable to Boolean control networks. The bilinear-systemic representation of a Boolean control network makes it possible to investigate basic control problems including controllability, observability, stabilization, disturbance decoupling etc.
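The structural questions listed above can be made concrete on a toy network. The brute-force state enumeration below stands in for the STP formulae (which it does not implement), and the three-node update rules are an invented example.

```python
from itertools import product

# A toy 3-node Boolean network (invented example):
#   x1' = x2 AND x3,   x2' = x1,   x3' = x2
def step(state):
    x1, x2, x3 = state
    return (x2 & x3, x1, x2)

# Build the full state-transition map over all 2^3 states.
transition = {s: step(s) for s in product((0, 1), repeat=3)}

fixed_points = [s for s, t in transition.items() if s == t]
print("fixed points:", fixed_points)     # (0,0,0) and (1,1,1)

# Every trajectory eventually enters a cycle; follow one until it repeats.
seen, s = [], (1, 0, 0)
while s not in seen:
    seen.append(s)
    s = transition[s]
print("cycle entered at:", s, "after", seen.index(s), "transient steps")
```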
This book presents intellectual, innovative information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. A number of applications are given to show the effectiveness of risk management technologies. In addition, suggested topics for lectures and practical computer exercises for a two-semester course on risk management technologies are included.
This study explores an approach to text generation that interprets systemic grammar as a computational representation. Terry Patten demonstrates that systemic grammar can be easily and automatically translated into current AI knowledge representations and efficiently processed by the same knowledge-based techniques currently exploited by expert systems. Thus the fundamental methodological problem of interfacing specialized computational representations with equally specialized linguistic representations can be resolved. The study provides a detailed discussion of a substantial implementation involving a relatively large systemic grammar, and a formal model of the method. It represents a fundamental and productive contribution to the literature on text generation.
This collaborative book presents recent trends on the study of sequences, including combinatorics on words and symbolic dynamics, and new interdisciplinary links to group theory and number theory. Other chapters branch out from those areas into subfields of theoretical computer science, such as complexity theory and theory of automata. The book is built around four general themes: number theory and sequences, word combinatorics, normal numbers, and group theory. Those topics are rounded out by investigations into automatic and regular sequences, tilings and theory of computation, discrete dynamical systems, ergodic theory, numeration systems, automaton semigroups, and amenable groups. This volume is intended for use by graduate students or research mathematicians, as well as computer scientists who are working in automata theory and formal language theory. With its organization around unified themes, it would also be appropriate as a supplemental text for graduate level courses.
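One of the recurring objects here, the automatic sequence, fits in a few lines: the Thue-Morse sequence is 2-automatic, its n-th term being the parity of the binary digits of n, and it is famously cube-free. The sanity check at the end only samples short factors; it is an illustration, not a proof.

```python
def thue_morse(n):
    """n-th Thue-Morse term: parity of 1-bits in n (a 2-automatic sequence)."""
    return bin(n).count("1") % 2

# First 16 terms: 0 1 1 0 1 0 0 1 1 0 0 1 0 1 1 0
print("".join(str(thue_morse(n)) for n in range(16)))

# The sequence is cube-free: no factor of the form www ever occurs.
word = "".join(str(thue_morse(n)) for n in range(2000))
assert all(word.find(3 * word[i:i + k]) == -1
           for k in range(1, 8) for i in range(40))
```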
This book is a comprehensive, systematic survey of the synthesis problem, and of region theory which underlies its solution, covering the related theory, algorithms, and applications. The authors focus on safe Petri nets and place/transition nets (P/T-nets), treating synthesis as an automated process which, given behavioural specifications or partial specifications of a system to be realized, decides whether the specifications are feasible, and then produces a Petri net realizing them exactly, or if this is not possible produces a Petri net realizing an optimal approximation of the specifications. In Part I the authors introduce elementary net synthesis. In Part II they explain variations of elementary net synthesis and the unified theory of net synthesis. The first three chapters of Part III address the linear algebraic structure of regions, synthesis of P/T-nets from finite initialized transition systems, and the synthesis of unbounded P/T-nets. Finally, the last chapter in Part III and the chapters in Part IV cover more advanced topics and applications: P/T-nets with the step firing rule, extracting concurrency from transition systems, process discovery, supervisory control, and the design of speed-independent circuits. Most chapters conclude with exercises, and the book is a valuable reference for both graduate students of computer science and electrical engineering and researchers and engineers in this domain.
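The P/T-net firing rule underlying all of this is compact enough to state in code: a transition is enabled when the marking covers its preset weights, and firing yields m' = m - pre(t) + post(t). The two-place producer/consumer net below is an invented example, not one from the book.

```python
# A place/transition net as pre/post weight maps over places (invented
# producer/consumer example with places "free" and "buffer").
pre  = {"produce": {"free": 1, "buffer": 0}, "consume": {"free": 0, "buffer": 1}}
post = {"produce": {"free": 0, "buffer": 1}, "consume": {"free": 1, "buffer": 0}}

def enabled(marking, t):
    """t is enabled iff the marking covers its preset weights."""
    return all(marking[p] >= w for p, w in pre[t].items())

def fire(marking, t):
    """Fire t: m' = m - pre(t) + post(t)."""
    assert enabled(marking, t), f"{t} is not enabled"
    return {p: marking[p] - pre[t][p] + post[t][p] for p in marking}

m = {"free": 3, "buffer": 0}             # initial marking: 3 free slots
for t in ("produce", "produce", "consume"):
    m = fire(m, t)
print(m)                                  # {'free': 2, 'buffer': 1}
```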
This book is devoted to Professor Jürgen Lehn, who passed away on September 29, 2008, at the age of 67. It contains invited papers that were presented at the Workshop on Recent Developments in Applied Probability and Statistics Dedicated to the Memory of Professor Jürgen Lehn, Middle East Technical University (METU), Ankara, April 23-24, 2009, which was jointly organized by the Technische Universität Darmstadt (TUD) and METU. The papers present surveys on recent developments in the area of applied probability and statistics. In addition, papers from the Panel Discussion: Impact of Mathematics in Science, Technology and Economics are included. Jürgen Lehn was born on the 28th of April, 1941 in Karlsruhe. From 1961 to 1968 he studied mathematics in Freiburg and Karlsruhe, and obtained a Diploma in Mathematics from the University of Karlsruhe in 1968. He obtained his Ph.D. at the University of Regensburg in 1972, and his Habilitation at the University of Karlsruhe in 1978. Later in 1978, he became a C3 level professor of Mathematical Statistics at the University of Marburg. In 1980 he was promoted to a C4 level professorship in mathematics at the TUD, where he was a researcher until his death.