Network Science is the emerging field concerned with the study of large, realistic networks. This interdisciplinary endeavor, focusing on the patterns of interaction that arise between the individual components of natural and engineered systems, has been applied to data sets from activities as diverse as high-throughput biological experiments, online trading information, smart-meter utility supplies, and pervasive telecommunications and surveillance technologies. This unique text/reference provides a fascinating insight into the state of the art in network science, highlighting the commonality across very different areas of application and the ways in which each area can be advanced by injecting ideas and techniques from another. The book includes contributions from an international selection of experts, providing viewpoints from a broad range of disciplines. It emphasizes networks that arise in nature (such as food webs, protein interactions, gene expression, and neural connections) and in technology (such as finance, airline transport, urban development, and global trade). Topics and Features: begins with a clear overview chapter to introduce this interdisciplinary field; discusses the classic network science of fixed connectivity structures, including empirical studies, mathematical models, and computational algorithms; examines time-dependent processes that take place over networks, covering topics such as synchronisation and message-passing algorithms; investigates time-evolving networks, such as the World Wide Web, and shifts in topological properties (connectivity, spectrum, percolation); explores applications of complex networks in the physical and engineering sciences, looking ahead to new developments in the field.
Researchers and professionals from disciplines as varied as computer science, mathematics, engineering, physics, chemistry, biology, ecology, neuroscience, epidemiology, and the social sciences will all benefit from this topical and broad overview of current activities and grand challenges in the unfolding field of network science.
This book provides a literature review of techniques used to pass from continuous to combinatorial space, before discussing a detailed example, with individual steps, of how cuckoo search (CS) can be adapted to solve combinatorial optimization problems. It demonstrates the application of CS to three different problems and describes their source code. The content is divided into five chapters, the first of which provides a technical description, together with examples of combinatorial search spaces. The second chapter summarizes a diverse range of methods used to solve combinatorial optimization problems. In turn, the third chapter presents a description of CS, its formulation and characteristics. In the fourth chapter, the application of discrete cuckoo search (DCS) to solve three combinatorial optimization problems (the traveling salesman problem, the quadratic assignment problem, and the job shop scheduling problem) is explained, focusing mainly on a reinterpretation of the terminology used in CS and its source of inspiration. In closing, the fifth chapter discusses random-key cuckoo search (RKCS), which uses random keys to represent positions found by cuckoo search in the TSP and QAP solution spaces.
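The random-key device behind RKCS can be sketched in a few lines of Python (a hedged illustration; the instance, key values, and function names below are invented, not taken from the book): a vector of continuous keys is decoded into a TSP tour by sorting indices, which is what lets a continuous-space method such as CS search a combinatorial space.

```python
def decode_keys(keys):
    """Decode continuous random keys into a tour: city i is visited at the
    rank of keys[i], so any point of [0, 1)^n maps to a valid permutation."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def tour_length(tour, dist):
    """Length of the closed tour under a distance matrix."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# Toy symmetric 4-city instance (invented for illustration).
dist = [
    [0, 1, 4, 3],
    [1, 0, 2, 5],
    [4, 2, 0, 1],
    [3, 5, 1, 0],
]

keys = [0.42, 0.17, 0.65, 0.90]  # one candidate solution in continuous space
tour = decode_keys(keys)
print(tour, tour_length(tour, dist))  # [1, 0, 2, 3] 11
```

Because any real-valued key vector decodes to a permutation, the continuous update rules of CS (Levy flights over the keys) need no modification.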
The First International Conference on Computational Methods (ICCM04), organized by the Department of Mechanical Engineering, National University of Singapore, was held in Singapore, December 15-17, 2004, with great success. These conference proceedings contain some 290 papers from more than 30 countries and regions. The papers cover a broad range of topics such as meshfree particle methods, generalized and extended FE methods, and inverse analysis and optimization methods. Computational methods for geomechanics, machine learning, vibration, shock, impact, health monitoring, material modeling, fracture and damage mechanics, multi-physics and multi-scale simulation, sports, and environments are also included. All papers were reviewed before being accepted for publication in these proceedings. The proceedings will provide an informative, timely and invaluable resource for engineers and scientists working in the important areas of computational methods.
This monograph illustrates important notions and essential techniques in security reductions for group-based cryptosystems. Using digital signatures and encryption as examples, the authors explain how to program correct security reductions for those cryptographic primitives. Various schemes are selected and re-proven in this book to demonstrate and exemplify correct security reductions. This book is suitable for researchers and graduate students engaged in public-key cryptography.
This book helps readers understand and write alongside non-human agents, examine the impact of algorithms and AI on writing, and accommodate relationships with autonomous agents. Its ground-breaking, future-driven framework prepares scholars and practitioners to investigate and plan for the social, digital-literacy, and civic implications arising from emerging technologies, and readies researchers, students, practitioners, and citizens to work with AI writers, virtual humans, and social robots. The book uses prompts to envision how fields and professions will change, and its unique integration with Fabric of Digital Life, a database and structured content repository for conducting social and cultural analysis of emerging technologies, provides concrete examples throughout. Readers gain imperative direction for collaborative, algorithmic, and autonomous writing futures.
This book focuses on the different representations and cryptographic properties of Boolean functions and presents constructions of Boolean functions with good cryptographic properties. More specifically, Walsh-spectrum descriptions of the traditional cryptographic properties of Boolean functions, including linear structure, propagation criterion, nonlinearity, and correlation immunity, are presented. Constructions of symmetric Boolean functions and of Boolean permutations with good cryptographic properties are specifically studied. The book is not meant to be comprehensive, but rather focuses on some of the authors' past original research; to keep it self-contained, some basic concepts and properties are introduced. It can serve as a reference for cryptographic algorithm designers, particularly designers of stream ciphers and block ciphers, and for academics interested in the cryptographic properties of Boolean functions.
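As a rough, self-contained illustration of the Walsh-spectrum viewpoint (a generic sketch based on the standard definitions, not code from the book), the following Python computes the Walsh-Hadamard transform of a truth table and derives the nonlinearity of a small Boolean function:

```python
def walsh_spectrum(truth_table):
    """Fast Walsh-Hadamard transform of a Boolean function given as a truth
    table of 0/1 values of length 2**n (input bits index the table).
    Entry u of the result is W_f(u) = sum over x of (-1)**(f(x) + u.x)."""
    w = [1 - 2 * b for b in truth_table]  # work with (-1)**f(x)
    h, n = 1, len(w)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x, y = w[j], w[j + h]
                w[j], w[j + h] = x + y, x - y
        h *= 2
    return w

def nonlinearity(truth_table):
    """NL(f) = 2**(n-1) - max|W_f(u)|/2: the distance to the affine functions."""
    w = walsh_spectrum(truth_table)
    return len(truth_table) // 2 - max(abs(v) for v in w) // 2

# f(x1, x2, x3) = x1*x2 XOR x3, a standard quadratic example in 3 variables.
tt = [(((x >> 2) & 1) * ((x >> 1) & 1)) ^ (x & 1) for x in range(8)]
print(walsh_spectrum(tt), nonlinearity(tt))  # [0, 4, 0, 4, 0, 4, 0, -4] 2
```

Parseval's relation (the squared spectrum values sum to 2**(2n)) gives a quick sanity check on any such computation.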
Applying TQM to systems engineering can reduce costs while simultaneously improving product quality. This guide to proactive systems engineering shows how to develop and optimize a practical approach, while highlighting the pitfalls and potentials involved.
This study explores an approach to text generation that interprets systemic grammar as a computational representation. Terry Patten demonstrates that systemic grammar can be easily and automatically translated into current AI knowledge representations and efficiently processed by the same knowledge-based techniques currently exploited by expert systems. Thus the fundamental methodological problem of interfacing specialized computational representations with equally specialized linguistic representations can be resolved. The study provides a detailed discussion of a substantial implementation involving a relatively large systemic grammar, and a formal model of the method. It represents a fundamental and productive contribution to the literature on text generation.
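To give a flavour of the idea (a deliberately tiny, invented sketch; real systemic grammars and Patten's implementation are far richer), a system network can be encoded directly as a data structure whose choices carry realization constraints, which is what makes it amenable to standard knowledge-based processing:

```python
# A toy "system network": one system ('mood') offering mutually exclusive
# choices, each carrying a realization constraint (here, word order).
# The features, slot names, and rule are invented for illustration.
SYSTEMS = {
    "mood": {
        "declarative": ["SUBJECT", "FINITE"],
        "interrogative": ["FINITE", "SUBJECT"],
    },
}

def realize(features, subject, finite):
    """Generate a clause fragment by applying the ordering constraint
    selected by the chosen 'mood' feature."""
    fillers = {"SUBJECT": subject, "FINITE": finite}
    return " ".join(fillers[slot] for slot in SYSTEMS["mood"][features["mood"]])

print(realize({"mood": "declarative"}, "you", "can"))    # you can
print(realize({"mood": "interrogative"}, "you", "can"))  # can you
```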
This book is devoted to Professor Jurgen Lehn, who passed away on September 29, 2008, at the age of 67. It contains invited papers that were presented at the Wo- shop on Recent Developments in Applied Probability and Statistics Dedicated to the Memory of Professor Jurgen Lehn, Middle East Technical University (METU), Ankara, April 23-24, 2009, which was jointly organized by the Technische Univ- sitat Darmstadt (TUD) and METU. The papers present surveys on recent devel- ments in the area of applied probability and statistics. In addition, papers from the Panel Discussion: Impact of Mathematics in Science, Technology and Economics are included. Jurgen Lehn was born on the 28th of April, 1941 in Karlsruhe. From 1961 to 1968 he studied mathematics in Freiburg and Karlsruhe, and obtained a Diploma in Mathematics from the University of Karlsruhe in 1968. He obtained his Ph.D. at the University of Regensburg in 1972, and his Habilitation at the University of Karlsruhe in 1978. Later in 1978, he became a C3 level professor of Mathematical Statistics at the University of Marburg. In 1980 he was promoted to a C4 level professorship in mathematics at the TUD where he was a researcher until his death."
This book presents a systematic exposition of the main ideas and methods in treating inverse problems for PDEs arising in basic mathematical models, though it makes no claim to being exhaustive. Mathematical models of most physical phenomena are governed by initial and boundary value problems for PDEs, and inverse problems governed by these equations arise naturally in nearly all branches of science and engineering. The book's content, especially in the Introduction and Part I, is self-contained and is also intended to be accessible for beginning graduate students, whose mathematical background includes only basic courses in advanced calculus, PDEs and functional analysis. Further, the book can be used as the backbone for a lecture course on inverse and ill-posed problems for partial differential equations. In turn, the second part of the book consists of six nearly independent chapters. The choice of these chapters was motivated by the fact that the inverse coefficient and source problems considered here are based on basic and commonly used mathematical models governed by PDEs. These chapters describe not only these inverse problems, but also the main inversion methods and techniques. Since the most distinctive features of any inverse problem related to PDEs are hidden in the properties of the corresponding solutions to direct problems, special attention is paid to the investigation of these properties. For the second edition, the authors have added two new chapters focusing on real-world applications of inverse problems arising in wave and vibration phenomena. They have also revised the whole text of the first edition.
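A representative inverse coefficient problem of the kind described (a textbook-standard formulation chosen here for illustration, not a specific example from this text): recover the diffusion coefficient of a 1-D heat equation from additional boundary data.

```latex
% Direct problem: given k(x), find u(x,t) satisfying
u_t = \bigl(k(x)\,u_x\bigr)_x, \qquad (x,t) \in (0,\ell) \times (0,T],
u(x,0) = u_0(x), \qquad u(0,t) = 0, \qquad -k(\ell)\,u_x(\ell,t) = g(t).
% Inverse coefficient problem: recover k(x) from the additional
% boundary measurement
f(t) := u(\ell,t), \qquad t \in (0,T].
```

The direct problem (given the coefficient, find the solution) is well posed; the inverse problem is typically ill posed, which is why the properties of the direct solution matter so much.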
In this essay collection, leading physicists, philosophers, and historians attempt to fill the empty theoretical ground in the foundations of information and address the related question of the limits to our knowledge of the world. Over recent decades, our practical approach to information and its exploitation has radically outpaced our theoretical understanding - to such a degree that reflection on the foundations may seem futile. But it is exactly fields such as quantum information, which are shifting the boundaries of the physically possible, that make a foundational understanding of information increasingly important. One of the recurring themes of the book is the claim by Eddington and Wheeler that information involves interaction and putting agents or observers centre stage. Thus, physical reality, in their view, is shaped by the questions we choose to put to it and is built up from the information residing at its core. This is the root of Wheeler's famous phrase "it from bit." After reading the stimulating essays collected in this volume, readers will be in a good position to decide whether they agree with this view.
This book highlights the current challenges for engineers involved in product development and the associated changes in procedure they make necessary. Methods for systematically analyzing the requirements for safety and security mechanisms are described, with examples of how they are implemented in software and hardware, and the book discusses how their effectiveness can be demonstrated in terms of functional and design safety. Given today's new E-mobility and automated driving approaches, new challenges are arising and further issues concerning "Road Vehicle Safety" and "Road Traffic Safety" have to be resolved. To address the growing complexity of vehicle functions, as well as the increasing need to accommodate interdisciplinary project teams, previous development approaches now have to be reconsidered, and systems engineering approaches and proven management systems need to be supplemented or wholly redefined. The book presents a continuous system development process, starting with the basic requirements of quality management and continuing until the release of a vehicle and its components for road use. Attention is paid to the necessary definition of the respective development item; the threat, hazard, and risk analysis; and safety concepts and their relation to architecture development. The book also addresses the aspects of product realization in mechanics, electronics and software, as well as the subsequent testing, verification, integration and validation phases. In November 2011, requirements for the Functional Safety (FuSa) of road vehicles were first published in ISO 26262. The processes and methods described here are intended to show developers how vehicle systems can be implemented according to ISO 26262, so that their compliance with the relevant standards can be demonstrated as part of a safety case, including audits, reviews and assessments.
This book focuses on three core knowledge requirements for effective and thorough data analysis for solving business problems. These are a foundational understanding of: 1. statistical, econometric, and machine learning techniques; 2. data handling capabilities; 3. at least one programming language. Practical in orientation, the volume offers illustrative case studies throughout and examples using Python in the context of Jupyter notebooks. Covered topics include demand measurement and forecasting, predictive modeling, pricing analytics, customer satisfaction assessment, market and advertising research, and new product development and research. This volume will be useful to business data analysts, data scientists, and market research professionals, as well as aspiring practitioners in business data analytics. It can also be used in colleges and universities offering courses and certifications in business data analytics, data science, and market research.
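As a minimal example of the kind of demand-forecasting baseline such case studies start from (a generic ordinary-least-squares sketch in plain Python; the data are invented, and the book's own examples use Python in Jupyter notebooks):

```python
def ols_fit(x, y):
    """Ordinary least squares for y = a + b*x with one predictor,
    a baseline demand model. Returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical monthly demand with a clean linear trend.
months = [1, 2, 3, 4, 5]
demand = [110, 120, 130, 140, 150]
a, b = ols_fit(months, demand)
forecast_month_6 = a + b * 6
print(a, b, forecast_month_6)  # 100.0 10.0 160.0
```

In practice one would reach for statsmodels or scikit-learn, but the closed-form fit makes the underlying arithmetic explicit.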
Most networks and databases that humans have to deal with contain a large, albeit finite, number of units. Their structure, which maintains the functional consistency of the components, is essentially not random and calls for a precise quantitative description of the relations between nodes (or data units) and all network components. This book is an introduction, for both graduate students and newcomers to the field, to the theory of graphs and random walks on such graphs. The methods based on random walks and diffusions for exploring the structure of finite connected graphs and databases are reviewed (Markov chain analysis). This provides the necessary basis for consistently discussing applications as diverse as electric resistance networks, estimation of land prices, urban planning, linguistic databases, music, and gene expression regulatory networks.
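A standard fact from the Markov-chain analysis the book reviews: the stationary distribution of a simple random walk on an undirected connected graph is proportional to node degree. A short sketch (the graph and function names are invented for illustration):

```python
from fractions import Fraction

def stationary_distribution(adj):
    """pi(v) = deg(v) / (2|E|) for the simple random walk on an
    undirected graph given by a 0/1 adjacency matrix."""
    degrees = [sum(row) for row in adj]
    total = sum(degrees)  # equals 2|E|
    return [Fraction(d, total) for d in degrees]

def step(pi, adj):
    """One step of the walk: new_pi(v) = sum over neighbours u of pi(u)/deg(u)."""
    degs = [sum(row) for row in adj]
    n = len(adj)
    return [sum(pi[u] * Fraction(adj[u][v], degs[u]) for u in range(n))
            for v in range(n)]

# Path graph 0 - 1 - 2: the middle node is visited twice as often.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
pi = stationary_distribution(adj)
print(pi)                   # [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]
print(step(pi, adj) == pi)  # True: pi is invariant under the walk
```

Exact rational arithmetic makes the invariance check pi * P == pi free of floating-point noise.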
This book is designed both for FPGA users interested in developing new, specific components, generally for reducing execution times, and for IP core designers interested in extending their catalog of specific components. The main focus is circuit synthesis, and the discussion shows, for example, how a given algorithm executing some complex function can be translated into a synthesizable circuit description, as well as the best choices the designer can make to reduce the circuit cost, latency, or power consumption. This is not a book on algorithms. It is a book that shows how to efficiently translate an algorithm into a circuit, using techniques such as parallelism, pipelining, loop unrolling, and others. Numerous examples of FPGA implementation are described throughout the book, and the circuits are modeled in VHDL. Complete and synthesizable source files are available for download.
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: evolutionary constraints, relaxation of selection mechanisms, diversity preservation strategies, flexing fitness evaluation, evolution in dynamic environments, multi-objective and multi-modal selection, foundations of evolvability, evolvable and adaptive evolutionary operators, foundation of injecting expert knowledge in evolutionary search, analysis of problem difficulty and required GP algorithm complexity, foundations in running GP on the cloud - communication, cooperation, flexible implementation, and ensemble methods. Additional focal points for GP symbolic regression are: (1) The need to guarantee convergence to solutions in the function discovery mode; (2) Issues on model validation; (3) The need for model analysis workflows for insight generation based on generated GP solutions - model exploration, visualization, variable selection, dimensionality analysis; (4) Issues in combining different types of data. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
Decision-aiding software, the underpinning of computer-aided judicial analysis, can facilitate the prediction of how cases are likely to be decided, prescribe decisions that should be reached in such cases, and help administer the court process more efficiently. It can do so, says Nagel, by listing past cases on each row of a spreadsheet matrix, by listing predictive criteria in the columns, and in general by showing for each factual element the estimated probability of winning a case. The software aggregates the information available and deduces likely outcomes. But it can also prescribe judicial decisions by listing alternatives in the rows, the goals to be achieved in the columns, and the relations between alternatives and goals in the cells. By similar means decision-aiding software can also help perform administrative tasks, such as rationally assigning judges or other personnel to cases and sequencing cases to reduce the time consumed by each case. In Part I, Nagel provides an overview of computer-aided analysis and the role of decision-aiding software in the legal process. In the second part he deals with judicial prediction from prior cases and from present facts; and in the third part he emphasizes the prescribing role of judges, particularly in deciding the rules that ought to be applied in civil and criminal procedures. Nagel also covers computer-aided mediation and provides a new perspective on judicial decisions. Then, in Part IV, he treats at length the process of judicial administration and how to improve its efficiency. Of particular interest to court personnel will be the benefits to be derived from reducing delays in the docketing and sequencing of cases.
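Nagel's rows-as-alternatives, columns-as-goals layout amounts to a weighted-sum decision matrix. A minimal sketch (the alternatives, goal weights, and scores below are invented for illustration, not taken from the book):

```python
def score_alternatives(matrix, weights):
    """Weighted-sum scoring: alternatives in rows, goals in columns,
    cell values say how well an alternative meets a goal (higher is better)."""
    return [sum(w * c for w, c in zip(weights, row)) for row in matrix]

# Hypothetical options for resolving a case, scored against three goals.
alternatives = ["settle", "go to trial", "mediate"]
goal_weights = [0.5, 0.3, 0.2]  # cost, speed, fairness
score_matrix = [
    [8, 9, 5],   # settle
    [4, 2, 9],   # go to trial
    [7, 6, 8],   # mediate
]
scores = score_alternatives(score_matrix, goal_weights)
best = alternatives[scores.index(max(scores))]
print(best)  # settle
```

Sensitivity analysis (varying the weights and watching when the winning row changes) is the usual next step with such matrices.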
This book provides a general overview of multiple instance learning (MIL), defining the framework and covering the central paradigms. The authors discuss the most important tasks for MIL, such as classification, regression and clustering. With a focus on classification, a taxonomy is established and the most relevant proposals are specified. Efficient algorithms are developed to discover relevant information when working with uncertainty. Key representative applications are included. The book also studies the key related topics of distance metrics and alternative hypotheses. Chapters examine new and developing aspects of MIL such as data reduction for multi-instance problems and imbalanced MIL data. Class imbalance for multi-instance problems is defined at the bag level, a representation that incorporates ambiguity because bag labels are available while the labels of the individual instances are not. Additionally, multiple instance multiple label learning is explored. This learning framework introduces flexibility and ambiguity into the object representation, providing a natural formulation for representing complicated objects. Thus, an object is represented by a bag of instances and is allowed to have multiple associated class labels simultaneously. This book is suitable for developers and engineers working to apply MIL techniques to solve a variety of real-world problems. It is also useful for researchers or students seeking a thorough overview of MIL literature, methods, and tools.
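The bag-level representation is easy to make concrete. Under the standard multiple-instance assumption (one of several aggregation rules; the instances and threshold classifier below are invented for illustration), a bag is positive iff at least one of its instances is positive, so bag predictions can be made by max-aggregating instance scores:

```python
def bag_label(instance_labels):
    """Standard MI assumption: a bag is positive iff at least one of its
    (unobserved) instances is positive."""
    return int(any(instance_labels))

def classify_bag(bag, instance_classifier):
    """Bag-level prediction by aggregating instance scores with max()."""
    return int(max(instance_classifier(x) for x in bag) >= 0.5)

# Hypothetical 1-D instances with a simple threshold instance classifier.
clf = lambda x: 1.0 if x > 3 else 0.0
positive_bag = [1.0, 2.5, 4.2]   # contains one positive instance
negative_bag = [0.3, 1.1, 2.9]   # all instances negative
print(classify_bag(positive_bag, clf), classify_bag(negative_bag, clf))  # 1 0
```

The max-aggregation step is exactly where the ambiguity lives: a positive bag label tells the learner that some instance is positive, but not which one.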
Improved geospatial instrumentation and technology, such as laser scanning, now result in millions of data points being collected, e.g., as point clouds. In recognition that such huge amounts of data require efficient and robust mathematical solutions, this third edition of the book extends the second edition with three new chapters: robust parameter estimation, multiobjective optimization, and symbolic regression. Furthermore, the linear homotopy chapter is expanded to include nonlinear homotopy. These disciplines are discussed first in the theoretical part of the book, before their geospatial applications are illustrated in the applications chapters, where numerous numerical examples are presented. The renewed electronic supplement contains these new theoretical and practical topics, with the corresponding Mathematica statements and functions supporting their computations introduced and applied. This third edition is renamed in light of these technological advancements.
With more restrictions on animal experimentation, pharmaceutical industries are currently focusing on a new generation of experiments and technologies that are considerably more efficient and less controversial. The integration of computational and experimental strategies has led to the identification and development of promising compounds. Computer Applications in Drug Discovery and Development is a pivotal reference source that provides innovative research on the application of computers for discovering and designing new drugs in modern molecular biology and medicinal chemistry. While highlighting topics such as chemical structure databases and dataset utilization, this publication delves into the current panorama of drug discovery, where high drug failure rates are a major concern and properly designed virtual screening strategies can be a time-saving, cost-effective, and productive alternative. This book is ideally designed for chemical engineers, pharmacists, molecular biologists, students, researchers, and academicians seeking current research on the unexplored avenues and future perspectives of drug design.
This unique textbook/reference presents unified coverage of bioinformatics topics relating to both biological sequences and biological networks, providing an in-depth analysis of cutting-edge distributed algorithms, as well as of relevant sequential algorithms. In addition to introducing the latest algorithms in this area, more than fifteen new distributed algorithms are also proposed. Topics and features: reviews a range of open challenges in biological sequences and networks; describes in detail both sequential and parallel/distributed algorithms for each problem; suggests approaches for distributed algorithms as possible extensions to sequential algorithms, when the distributed algorithms for the topic are scarce; proposes a number of new distributed algorithms in each chapter, to serve as potential starting points for further research; concludes each chapter with self-test exercises, a summary of the key points, a comparison of the algorithms described, and a literature review.
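As an example of the kind of sequential baseline from which distributed sequence algorithms are typically developed (a classic textbook routine, not code from this book), here is the dynamic-programming edit distance between two sequences, using a rolling row to keep memory linear:

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming, keeping only the
    previous row so memory stays linear in len(b)."""
    prev = list(range(len(b) + 1))  # distances from the empty prefix of a
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # (mis)match
        prev = cur
    return prev[-1]

print(edit_distance("GATTACA", "GCATGCU"))  # 4
print(edit_distance("kitten", "sitting"))   # 3
```

The row-by-row data dependence of this recurrence is precisely what wavefront-style parallel and distributed formulations exploit.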
Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to the renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretical developments, new computational algorithms, and multidisciplinary applications. Special features of this volume: presents results and approximation methods in various computational settings, including polynomial and orthogonal systems, analytic functions, and differential equations; provides a historical overview of approximation theory and many of its subdisciplines; contains new results from diverse areas of research spanning mathematics, engineering, and the computational sciences. "Approximation and Computation" is intended for mathematicians and researchers focusing on approximation theory and numerical analysis, but can also be a valuable resource for students and researchers in the computational and applied sciences.
You may like...
Discovering Computers 2018 - Digital…
Misty Vermaat, Steven Freund, …
Paperback
Discovering Computers, Essentials…
Susan Sebok, Jennifer Campbell, …
Paperback