This book discusses the role of mobile network data in urban informatics, particularly how mobile network data is utilized in the mobility context, where approaches, models, and systems are developed for understanding travel behavior. The objectives of this book are thus to evaluate the extent to which mobile network data reflects travel behavior and to develop guidelines on how to best use such data to understand and model travel behavior. To achieve these objectives, the book attempts to evaluate the strengths and weaknesses of this data source for urban informatics and its applicability to the development and implementation of travel behavior models through a series of the authors' research studies. Traditionally, survey-based information is used as an input for travel demand models that predict future travel behavior and transportation needs. A survey-based approach is however costly and time-consuming, and hence its information can be dated and limited to a particular region. Mobile network data thus emerges as a promising alternative data source that is massive in both cross-sectional and longitudinal perspectives, and one that provides both broader geographic coverage of travelers and longer-term travel behavior observation. The two most common types of travel demand model that have played an essential role in managing and planning for transportation systems are four-step models and activity-based models. The book's chapters are structured on the basis of these travel demand models in order to provide researchers and practitioners with an understanding of urban informatics and the important role that mobile network data plays in advancing the state of the art from the perspectives of travel behavior research.
This collection of essays explores the different ways the insights from complexity theory can be applied to law. Complexity theory - a variant of systems theory - views law as an emergent, complex, self-organising system comprised of an interactive network of actors and systems that operate with no overall guiding hand, giving rise to complex, collective behaviour in law communications and actions. Addressing such issues as the unpredictability of legal systems, the ability of legal systems to adapt to changes in society, the importance of context, and the nature of law, the essays look to the implications of a complexity theory analysis for the study of public policy and administrative law, international law and human rights, regulatory practices in business and finance, and the practice of law and legal ethics. These are areas where law, which craves certainty, encounters unending, irresolvable complexity. This collection shows the many ways complexity theory thinking can reshape and clarify our understanding of the various problems relating to the theory and practice of law.
Exact eigenvalues, eigenvectors, and principal vectors of operators with infinite dimensional ranges can rarely be found. Therefore, one must approximate such operators by finite rank operators, then solve the original eigenvalue problem approximately. Serving as both an outstanding text for graduate students and as a source of current results for research scientists, Spectral Computations for Bounded Operators addresses the issue of solving eigenvalue problems for operators on infinite dimensional spaces. From a review of classical spectral theory through concrete approximation techniques to finite dimensional situations that can be implemented on a computer, this volume illustrates the marriage of pure and applied mathematics. It contains a variety of recent developments, including a new type of approximation that encompasses a variety of approximation methods but is simple to verify in practice. It also suggests a new stopping criterion for the QR Method and outlines advances in both the iterative refinement and acceleration techniques for improving the accuracy of approximations. The authors illustrate all definitions and results with elementary examples and include numerous exercises. Spectral Computations for Bounded Operators thus serves as both an outstanding text for second-year graduate students and as a source of current results for research scientists.
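As a rough, hedged flavour of the approach the blurb describes (a generic Nyström-style sketch, not a method taken from the book), the eigenvalues of an operator on an infinite dimensional space can be approximated by discretising its kernel into a finite matrix; the kernel min(x, y), grid size, and quadrature rule below are illustrative choices.

```python
import numpy as np

# Finite-rank (Nystrom-style) approximation: discretise the kernel k(x, y) = min(x, y)
# of an integral operator on L^2[0, 1] with a midpoint rule, then solve the
# resulting matrix eigenvalue problem as a stand-in for the original one.
N = 200
x = (np.arange(N) + 0.5) / N            # midpoint quadrature nodes on [0, 1]
K = np.minimum.outer(x, x) / N          # kernel values times quadrature weight 1/N
approx = np.sort(np.linalg.eigvalsh(K))[::-1][:3]
exact = [1.0 / ((n - 0.5) ** 2 * np.pi ** 2) for n in (1, 2, 3)]
print("approximate:", approx)           # largest eigenvalues, roughly 0.4053, 0.0450, 0.0162
print("exact      :", exact)            # known eigenvalues of this particular operator
```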
Fuzzy social choice theory is useful for modeling the uncertainty and imprecision prevalent in social life yet it has been scarcely applied and studied in the social sciences. Filling this gap, Application of Fuzzy Logic to Social Choice Theory provides a comprehensive study of fuzzy social choice theory. The book explains the concept of a fuzzy maximal subset of a set of alternatives, fuzzy choice functions, the factorization of a fuzzy preference relation into the "union" (conorm) of a strict fuzzy relation and an indifference operator, fuzzy non-Arrowian results, fuzzy versions of Arrow's theorem, and Black's median voter theorem for fuzzy preferences. It examines how unambiguous and exact choices are generated by fuzzy preferences and whether exact choices induced by fuzzy preferences satisfy certain plausible rationality relations. The authors also extend known Arrowian results involving fuzzy set theory to results involving intuitionistic fuzzy sets as well as the Gibbard-Satterthwaite theorem to the case of fuzzy weak preference relations. The final chapter discusses Georgescu's degree of similarity of two fuzzy choice functions.
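As a tiny illustration of one notion mentioned above (using one common textbook definition of a fuzzy maximal subset, which may differ in detail from the book's treatment), a fuzzy weak-preference relation can be tabulated and the degree to which each alternative is maximal computed directly; the alternatives and membership values below are invented.

```python
# R[x][y] = degree in [0, 1] to which alternative x is at least as good as y.
# One common definition of the fuzzy maximal subset: M(x) = min over y of R(x, y).
alternatives = ["a", "b", "c"]
R = {
    "a": {"a": 1.0, "b": 0.8, "c": 0.6},
    "b": {"a": 0.4, "b": 1.0, "c": 0.9},
    "c": {"a": 0.3, "b": 0.5, "c": 1.0},
}
maximal = {x: min(R[x][y] for y in alternatives) for x in alternatives}
print(maximal)  # {'a': 0.6, 'b': 0.4, 'c': 0.3} -> 'a' is maximal to the highest degree
```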
In Algorithms Illuminated, Tim Roughgarden teaches the basics of algorithms in the most accessible way imaginable. This Omnibus Edition contains the complete text of Parts 1-4, with thorough coverage of asymptotic analysis, graph search and shortest paths, data structures, divide-and-conquer algorithms, greedy algorithms, dynamic programming, and NP-hard problems. Hundreds of worked examples, quizzes, and exercises, plus comprehensive online videos, help readers become better programmers; sharpen their analytical skills; learn to think algorithmically; acquire literacy with computer science's greatest hits; and ace their technical interviews.
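As a minimal sketch of the divide-and-conquer material listed above (illustrative only, not reproduced from the book), merge sort splits the input, recursively sorts the halves, and merges them.

```python
# Divide-and-conquer example: merge sort.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 3, 8, 1, 9, 2]))         # [1, 2, 3, 5, 8, 9]
```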
Network Function Virtualization (NFV) has recently attracted considerable attention from both research and industrial communities. Numerous papers have been published on solving resource-allocation problems in NFV, from various perspectives, considering different constraints, and adopting a range of techniques. However, it is difficult to get a clear impression of how to understand and classify different kinds of resource allocation problems in NFV and how to design solutions to solve these problems efficiently. This book addresses these concerns by offering a comprehensive overview and explanation of different resource allocation problems in NFV and presenting efficient solutions to solve them. It covers resource allocation problems in NFV, including an introduction to NFV and Quality of Service (QoS) parameter modelling as well as the related problem definitions, formulations and the respective state-of-the-art algorithms. This book allows readers to gain a comprehensive understanding of and deep insights into the resource allocation problems in NFV. It does so by exploring (1) the working principle and architecture of NFV, (2) how to model QoS parameters in NFV services, (3) definition, formulation and analysis of different kinds of resource allocation problems in various NFV scenarios, (4) solutions for solving the resource allocation problem in NFV, and (5) possible future work in the respective area.
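To give a hedged flavour of what a resource allocation problem in this setting can look like (a deliberately simplified first-fit placement sketch, not one of the book's algorithms), virtual network function (VNF) instances with CPU demands can be assigned to capacity-limited servers; the names and numbers are invented for illustration.

```python
# Toy first-fit placement of VNF instances onto servers with limited CPU capacity.
def first_fit(vnf_demands, server_capacity, n_servers):
    remaining = [server_capacity] * n_servers
    placement = {}
    for vnf, demand in vnf_demands.items():
        for s in range(n_servers):
            if remaining[s] >= demand:       # place the VNF on the first server that fits
                remaining[s] -= demand
                placement[vnf] = s
                break
        else:
            placement[vnf] = None            # rejected: no server has enough capacity left
    return placement

print(first_fit({"fw": 4, "nat": 3, "ids": 5, "lb": 2}, server_capacity=8, n_servers=2))
# {'fw': 0, 'nat': 0, 'ids': 1, 'lb': 1}
```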
Examines classic algorithms, geometric diagrams, and mechanical principles for enhancing visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming.
Synthetic Aperture Radar Automatic Detection Algorithms (SARADA) for Oil Spills provides the tools required to fully comprehend advanced algorithms for radar monitoring and detection of oil spills, treating quantum computing and algorithms as a keystone to understanding the theories and algorithms behind radar imaging and the detection of marine pollution. Bridging the gap between modern quantum mechanics and computational oil spill detection algorithms, this book contains precise theories and techniques for the automatic identification of oil spills from SAR measurements. Based on modern quantum physics, the book also includes a novel theory of the radar imaging mechanism of oil spills. Using precise quantum simulation of the trajectory movements of oil spills over a sequence of radar images, the book demonstrates SARADA as a promising novel technique for monitoring contamination by oil spills. Key Features: Introduces basic concepts of radar remote sensing. Fills a gap in the knowledge base of quantum theory and microwave remote sensing. Discusses the important aspects of oil spill imaging in radar data in relation to quantum theory. Presents recent developments and progress in automatic oil spill detection algorithms for radar data. Presents 2-D oil spill radar data in 4-D images.
For computer scientists, especially those in the security field, the use of chaos has been limited to the computation of a small collection of famous but unsuitable maps that offer no explanation of why chaos is relevant in the considered contexts. Discrete Dynamical Systems and Chaotic Machines: Theory and Applications shows how to make finite machines, such as computers, neural networks, and wireless sensor networks, work chaotically as defined in a rigorous mathematical framework. Taking into account that these machines must interact in the real world, the authors share their research results on the behaviors of discrete dynamical systems and their use in computer science. Covering both theoretical and practical aspects, the book presents: key mathematical and physical ideas in chaos theory; computer science fundamentals, clearly establishing that chaos properties can be satisfied by finite state machines; concrete applications of chaotic machines in computer security, including pseudorandom number generators, hash functions, digital watermarking, and steganography; and concrete applications of chaotic machines in wireless sensor networks, including secure data aggregation and video surveillance. Until the authors' recent research, the practical implementation of the mathematical theory of chaos on finite machines raised several issues. This self-contained book illustrates how chaos theory enables the study of computer security problems, such as steganalysis, that otherwise could not be tackled. It also explains how the theory reinforces existing cryptographically secure tools and schemes.
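For a rough sense of the "chaos as a source of pseudorandomness" idea, here is a toy logistic-map bit generator. This is neither the rigorous chaotic-iteration construction the book develops nor cryptographically secure; it only illustrates the general idea in a few lines.

```python
# Toy pseudorandom bit generator driven by the chaotic logistic map x -> r*x*(1-x).
# Illustrative only: NOT secure, and NOT the book's construction.
def logistic_bits(seed=0.123456789, r=3.99, n=32):
    x, bits = seed, []
    for _ in range(n):
        x = r * x * (1.0 - x)        # one chaotic iteration on (0, 1)
        bits.append(1 if x > 0.5 else 0)
    return bits

print(logistic_bits())
```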
This book provides theoretical and practical knowledge on AI and swarm intelligence. It provides a methodology for an EA (evolutionary algorithm)-based approach to complex adaptive systems, integrating several meta-heuristics such as ACO (Ant Colony Optimization), ABC (Artificial Bee Colony), and PSO (Particle Swarm Optimization). These developments contribute towards better problem-solving methodologies in AI. The book also covers emerging uses of swarm intelligence in applications such as complex adaptive systems, reaction-diffusion computing, and diffusion-limited aggregation. Another emphasis is on real-world applications: empirical examples from real-world problems show that the proposed approaches are successful when addressing tasks from such areas as swarm robotics, silicon traffics, image understanding, Voronoi diagrams, queuing theory, and slime intelligence. Each chapter begins with the background of the problem, follows with the current state-of-the-art techniques of the field, and ends with a detailed discussion. In addition, the simulators, based on optimizers such as PSO and ABC complex adaptive system simulation, are described in detail. These simulators, as well as some source code, are available online on the author's website for the benefit of readers interested in getting hands-on experience of the subject. The concepts presented in this book aim to promote and facilitate effective research in swarm intelligence approaches in both theory and practice. The book will also be of value to readers interested in interdisciplinary research topics that encompass problem-solving tasks in AI, complex adaptive systems, and meta-heuristics.
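As a minimal, hedged sketch of one of the meta-heuristics named above (a bare-bones PSO minimising a toy objective, far simpler than the simulators described in the book), the core update loop fits in a few lines; all parameter values below are arbitrary illustrative choices.

```python
import random

# Minimal particle swarm optimisation: minimise the 2-D sphere function f(x) = x1^2 + x2^2.
def f(p):
    return sum(v * v for v in p)

dim, n_particles, iters = 2, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration coefficients
pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]                    # each particle's best position so far
gbest = min(pbest, key=f)                      # best position found by the whole swarm

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=f)

print("best found:", gbest, "f =", f(gbest))   # close to the optimum at (0, 0)
```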
Modelling Transitions shows what computational, formal and data-driven approaches can and could mean for sustainability transitions research, presenting the state-of-the-art and exploring what lies beyond. Featuring contributions from many well-known authors, this book presents the various benefits of modelling for transitions research. More than just taking stock, it also critically examines what modelling of transformative change means and could mean for transitions research and for other disciplines that study societal changes. This includes identifying a variety of approaches currently not part of the portfolios of transitions modellers. Far from only singing praise, critical methodological and philosophical introspection are key aspects of this important book. This book speaks to modellers and non-modellers alike who value the development of robust knowledge on transitions to sustainability, including colleagues in congenial fields. Be they students, researchers or practitioners, everyone interested in transitions should find this book relevant as reference, resource and guide.
Bioinformatics is growing by leaps and bounds; theories, algorithms, and statistical techniques are constantly evolving. Nevertheless, a core body of algorithmic ideas has emerged, and researchers are beginning to adopt a "problem solving" approach to bioinformatics, wherein they use solutions to well-abstracted problems as building blocks to solve larger-scope problems. Problem Solving Handbook for Computational Biology and Bioinformatics is an edited volume contributed by world-renowned leaders in this field. This comprehensive handbook, with its problem-solving emphasis, covers all relevant areas of computational biology and bioinformatics. Web resources and related themes are highlighted at every opportunity in this central, easy-to-read reference. Designed for advanced-level students, researchers, and professors in computer science and bioengineering as a reference or secondary text, this handbook is also suitable for professionals working in industry.
This book explores inductive inference using the minimum message length (MML) principle, a Bayesian method which is a realisation of Ockham's Razor based on information theory. Accompanied by a library of software, the book can assist an applications programmer, student or researcher in the fields of data analysis and machine learning to write computer programs based upon this principle. MML inference has been around for 50 years and yet only one highly technical book has been written about the subject. The majority of research in the field has been backed by specialised one-off programs but this book includes a library of general MML-based software, in Java. The Java source code is available under the GNU GPL open-source license. The software library is documented using Javadoc which produces extensive cross referenced HTML manual pages. Every probability distribution and statistical model that is described in the book is implemented and documented in the software library. The library may contain a component that directly solves a reader's inference problem, or contain components that can be put together to solve the problem, or provide a standard interface under which a new component can be written to solve the problem. This book will be of interest to application developers in the fields of machine learning and statistics as well as academics, postdocs, programmers and data scientists. It could also be used by third year or fourth year undergraduate or postgraduate students.
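As a conceptual toy only (the book's accompanying library is in Java and its constructions are far more careful), the two-part flavour of MML can be illustrated by scoring hypotheses as "bits to state the model" plus "bits to encode the data given the model"; the fixed parameter precision used below is an arbitrary assumption.

```python
import math

# Crude two-part "message length" for a Bernoulli hypothesis about binary data:
# first part states the parameter to a fixed precision, second part encodes the data.
def message_length_bits(data, p, param_precision_bits=6):
    ones = sum(data)
    zeros = len(data) - ones
    if p in (0.0, 1.0):
        return float("inf")
    data_bits = -(ones * math.log2(p) + zeros * math.log2(1 - p))
    return param_precision_bits + data_bits

data = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]          # 8 ones, 2 zeros
for p in (0.5, 0.8):
    print(f"p={p}: total message length = {message_length_bits(data, p):.2f} bits")
# The hypothesis giving the shorter total message (p=0.8 here) is the one preferred.
```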
The authors' treatment of data structures in Data Structures and Algorithms is unified by an informal notion of "abstract data types," allowing readers to compare different implementations of the same concept. Algorithm design techniques are also stressed and basic algorithm analysis is covered. Most of the programs are written in Pascal.
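The abstract-data-type idea mentioned above can be sketched as follows (in Python rather than the book's Pascal, and purely as an illustration): one stack interface with two interchangeable implementations that client code can compare without changing.

```python
# Two implementations of the same Stack abstract data type.
class ListStack:
    def __init__(self): self._items = []
    def push(self, x): self._items.append(x)
    def pop(self): return self._items.pop()
    def empty(self): return not self._items

class LinkedStack:
    def __init__(self): self._top = None           # chain of (value, rest) pairs
    def push(self, x): self._top = (x, self._top)
    def pop(self):
        x, self._top = self._top
        return x
    def empty(self): return self._top is None

# Client code works identically against either implementation.
for stack in (ListStack(), LinkedStack()):
    stack.push(1); stack.push(2)
    print(type(stack).__name__, stack.pop(), stack.pop())   # prints 2 1 for both
```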
After a decade of development, genetic algorithms and genetic programming have become a widely accepted toolkit for computational finance. Genetic Algorithms and Genetic Programming in Computational Finance is a pioneering volume devoted entirely to a systematic and comprehensive review of this subject. Chapters cover various areas of computational finance, including financial forecasting, trading strategies development, cash flow management, option pricing, portfolio management, volatility modeling, arbitraging, and agent-based simulations of artificial stock markets. Two tutorial chapters are also included to help readers quickly grasp the essence of these tools. Finally, a menu-driven software program, Simple GP, accompanies the volume, which will enable readers without a strong programming background to gain hands-on experience in dealing with much of the technical material introduced in this work.
Divided roughly into two sections, this book provides a brief history of the development of ECG along with heart rate variability (HRV) algorithms and the engineering innovations over the last decade in this area. It reviews clinical research, presents an overview of the clinical field, and explains the importance of heart rate variability in diagnosis. The book then discusses the use of particular ECG and HRV algorithms in the context of clinical applications.
Pattern Recognition Algorithms for Data Mining addresses different pattern recognition (PR) tasks in a unified framework with both theoretical and experimental results. Tasks covered include data condensation, feature selection, case generation, clustering/classification, and rule generation and evaluation. This volume presents various theories, methodologies, and algorithms, using both classical approaches and hybrid paradigms. The authors emphasize large datasets with overlapping, intractable, or nonlinear boundary classes, and datasets that demonstrate granular computing in soft frameworks. Organized into eight chapters, the book begins with an introduction to PR, data mining, and knowledge discovery concepts. The authors analyze the tasks of multi-scale data condensation and dimensionality reduction, then explore the problem of learning with support vector machine (SVM). They conclude by highlighting the significance of granular computing for different mining tasks in a soft paradigm.
This book offers a clear and comprehensive introduction to broad learning, one of the novel learning problems studied in data mining and machine learning. Broad learning aims at fusing multiple large-scale information sources of diverse varieties together, and carrying out synergistic data mining tasks across these fused sources in one unified analytic. This book takes online social networks as an application example to introduce the latest alignment and knowledge discovery algorithms. Besides the overview of broad learning, machine learning and social network basics, specific topics covered in this book include network alignment, link prediction, community detection, information diffusion, viral marketing, and network embedding.
This edited book first consolidates the results of the EU-funded EDISON project (Education for Data Intensive Science to Open New science frontiers), which developed training material and information to assist educators, trainers, employers, and research infrastructure managers in identifying, recruiting and inspiring the data science professionals of the future. It then deepens the presentation of the information and knowledge gained to allow for easier assimilation by the reader. The contributed chapters are presented in sequence, each chapter picking up from the end point of the previous one. After the initial book and project overview, the chapters present the relevant data science competencies and body of knowledge, the model curriculum required to teach the required foundations, profiles of professionals in this domain, and use cases and applications. The text is supported with appendices on related process models. The book can be used to develop new courses in data science, evaluate existing modules and courses, draft job descriptions, and plan and design efficient data-intensive research teams across scientific disciplines.
This book collects selected contributions presented at the INdAM Workshop "Geometric Challenges in Isogeometric Analysis", held in Rome, Italy on January 27-31, 2020. It gives an overview of the forefront research on splines and their efficient use in isogeometric methods for the discretization of differential problems over complex and trimmed geometries. A variety of research topics in this context are covered, including (i) high-quality spline surfaces on complex and trimmed geometries, (ii) construction and analysis of smooth spline spaces on unstructured meshes, (iii) numerical aspects and benchmarking of isogeometric discretizations on unstructured meshes, meshing strategies and software. Given its scope, the book will be of interest to both researchers and graduate students working in the areas of approximation theory, geometric design and numerical simulation. Chapter 10 is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.
Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
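As a small hedged illustration of the M/M/1 application area (not an example from the book), a direct simulation shows the exponentially decaying queue-length tail, P(Q >= n) = rho^n in steady state, which is exactly the kind of exponential decay that large deviations theory quantifies in far greater generality; the rates and time horizon below are arbitrary.

```python
import random

# Continuous-time simulation of an M/M/1 queue via competing exponential clocks.
lam, mu = 0.8, 1.0                        # arrival and service rates, so rho = 0.8
rho = lam / mu
n, horizon = 10, 500_000.0
q, t, time_at_or_above = 0, 0.0, 0.0
while t < horizon:
    rate = lam + (mu if q > 0 else 0.0)   # total rate of the next event
    dt = random.expovariate(rate)
    if q >= n:
        time_at_or_above += dt            # accumulate time spent with Q >= n
    t += dt
    if random.random() < lam / rate:      # arrival wins the exponential race
        q += 1
    else:
        q -= 1

print(f"empirical P(Q >= {n}) ~ {time_at_or_above / t:.4f}   rho**{n} = {rho ** n:.4f}")
```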
In recent years game theory has had a substantial impact on computer science, especially on Internet- and e-commerce-related issues. Algorithmic Game Theory, first published in 2007, develops the central ideas and results of this exciting area in a clear and succinct manner. More than 40 of the top researchers in this field have written chapters that go from the foundations to the state of the art. Basic chapters on algorithmic methods for equilibria, mechanism design and combinatorial auctions are followed by chapters on important game theory applications such as incentives and pricing, cost sharing, information markets and cryptography and security. This definitive work will set the tone of research for the next few years and beyond. Students, researchers, and practitioners alike need to learn more about these fascinating theoretical developments and their widespread practical application.
'The book under review is an interesting elaboration that fills the gaps in libraries for concisely written and student-friendly books about essentials in computer science ... I recommend this book for anyone who would like to study algorithms, learn a lot about computer science or simply would like to deepen their knowledge ... The book is written in very simple English and can be understood even by those with limited knowledge of the English language. It should be emphasized that, despite the fact that the book consists of many examples, mathematical formulas and theorems, it is very hard to find any mistakes, errors or typos.' (zbMATH) In computer science, an algorithm is an unambiguous specification of how to solve a class of problems. Algorithms can perform calculation, data processing and automated reasoning tasks. As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing 'output' and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input. This book introduces a set of concepts in solving problems computationally, such as Growth of Functions; Backtracking; Divide and Conquer; Greedy Algorithms; Dynamic Programming; Elementary Graph Algorithms; Minimal Spanning Tree; Single-Source Shortest Paths; All Pairs Shortest Paths; Flow Networks; and Polynomial Multiplication, through to ways of solving NP-Complete Problems, supported with comprehensive and detailed problems and solutions, making it an ideal resource for those studying computer science, computer engineering and information technology.
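A tiny concrete instance of the definition quoted above (illustrative, not drawn from the book): Euclid's algorithm proceeds through a finite sequence of well-defined states and terminates with its output.

```python
# Euclid's algorithm: each loop iteration is an unambiguous transition
# from state (a, b) to state (b, a mod b), terminating when b reaches 0.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 198))   # 18
```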
This practically-focused study guide introduces the fundamentals of discrete mathematics through an extensive set of classroom-tested problems. Each chapter presents a concise introduction to the relevant theory, followed by a detailed account of common challenges and methods for overcoming these. The reader is then encouraged to practice solving such problems for themselves, by tackling a varied selection of questions and assignments of different levels of complexity. This updated second edition now covers the design and analysis of algorithms using Python, and features more than 50 new problems, complete with solutions. Topics and features: provides a substantial collection of problems and examples of varying levels of difficulty, suitable for both laboratory practical training and self-study; offers detailed solutions to each problem, applying commonly-used methods and computational schemes; introduces the fundamentals of mathematical logic, the theory of algorithms, Boolean algebra, graph theory, sets, relations, functions, and combinatorics; presents more advanced material on the design and analysis of algorithms, including Turing machines, asymptotic analysis, and parallel algorithms; includes reference lists of trigonometric and finite summation formulae in an appendix, together with basic rules for differential and integral calculus. This hands-on workbook is an invaluable resource for undergraduate students of computer science, informatics, and electronic engineering. Suitable for use in a one- or two-semester course on discrete mathematics, the text emphasizes the skills required to develop and implement an algorithm in a specific programming language.
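In the spirit of the book's Python-based algorithm-analysis material (an illustrative sketch, not an exercise from the text), counting comparisons makes the asymptotic gap between linear and binary search visible.

```python
# Count comparisons for linear vs. binary search on a sorted list:
# linear search grows roughly like n, binary search like log2(n).
def linear_steps(seq, target):
    for steps, value in enumerate(seq, start=1):
        if value == target:
            return steps
    return len(seq)

def binary_steps(seq, target):
    lo, hi, steps = 0, len(seq) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if seq[mid] == target:
            return steps
        if seq[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

for n in (1_000, 1_000_000):
    data = list(range(n))
    print(n, linear_steps(data, n - 1), binary_steps(data, n - 1))
```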
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: evolutionary constraints, relaxation of selection mechanisms, diversity preservation strategies, flexing fitness evaluation, evolution in dynamic environments, multi-objective and multi-modal selection, foundations of evolvability, evolvable and adaptive evolutionary operators, foundation of injecting expert knowledge in evolutionary search, analysis of problem difficulty and required GP algorithm complexity, foundations in running GP on the cloud - communication, cooperation, flexible implementation, and ensemble methods. Additional focal points for GP symbolic regression are: (1) The need to guarantee convergence to solutions in the function discovery mode; (2) Issues on model validation; (3) The need for model analysis workflows for insight generation based on generated GP solutions - model exploration, visualization, variable selection, dimensionality analysis; (4) Issues in combining different types of data. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
You may like...
Dialect Accent Features for Establishing… by Manisha Kulshreshtha, Ramkumar Mathur (Hardcover), R1,408 (Discovery Miles 14 080)
Text, Speech, and Dialogue - 16th… by Ivan Habernal, Vaclav Matousek (Paperback), R1,495 (Discovery Miles 14 950)
Automatic Speech Signal Analysis for… by Ladan Baghai-Ravary, Steve W. Beet (Paperback), R1,408 (Discovery Miles 14 080)