A best-seller in its French edition, this book has an original construction, and its success in the French market demonstrates its appeal. It is based on three principles: 1. The chapters are organized by families of algorithms: exhaustive search, divide and conquer, etc. By contrast, there is no chapter devoted solely to a systematic exposition of, say, algorithms on strings; some of these appear across different chapters. 2. For each family of algorithms, an introduction is given to the mathematical principles and the issues of rigorous design, with one or two pedagogical examples. 3. For the most part, the book details 150 problems, spanning seven families of algorithms. For each problem, a precise and progressive statement is given. More importantly, a complete solution is detailed with respect to the design principles that have been presented; classical errors are often pointed out. Roughly speaking, two thirds of the book is devoted to the detailed, rational construction of the solutions.
This book highlights selected papers from the 4th ICSA-Canada Chapter Symposium, as well as invited articles from established researchers in the areas of statistics and data science. It covers a variety of topics, including methodology development in data science, such as methodology in the analysis of high dimensional data, feature screening in ultra-high dimensional data and natural language ranking; statistical analysis challenges in sampling, multivariate survival models and contaminated data, as well as applications of statistical methods. With this book, readers can make use of frontier research methods to tackle their problems in research, education, training and consultation.
This volume contains refereed papers and extended abstracts of papers presented at the NATO Advanced Research Workshop entitled 'Numerical Integration: Recent Developments, Software and Applications', held at Dalhousie University, Halifax, Canada, August 11-15, 1986. The Workshop was attended by thirty-six scientists from eleven NATO countries. Thirteen invited lectures and twenty-two contributed lectures were presented, of which twenty-five appear in full in this volume, together with extended abstracts of the remaining ten. It is more than ten years since the last workshop of this nature was held, in Los Alamos in 1975. Many developments have occurred in quadrature in the intervening years, and it seemed an opportune time to bring together again researchers in this area. The development of QUADPACK by Piessens, de Doncker, Uberhuber and Kahaner has changed the focus of research in the area of one dimensional quadrature from the construction of new rules to an emphasis on reliable robust software. There has been a dramatic growth in interest in the testing and evaluation of software, stimulated by the work of Lyness and Kaganove, Einarsson, and Piessens. The earlier research of Patterson into Kronrod extensions of Gauss rules, followed by the work of Monegato, and Piessens and Branders, has greatly increased interest in Gauss-based formulas for one-dimensional integration.
A new era of complexity science is emerging, in which nature- and bio-inspired principles are being applied to provide solutions. At the same time, the complexity of systems is increasing due to models such as the Internet of Things (IoT) and fog computing. Will complexity science, applying the principles of nature, be able to tackle the challenges posed by highly complex networked systems? Bio-Inspired Optimization in Fog and Edge Computing: Principles, Algorithms, and Systems is an attempt to answer this question. It presents innovative, bio-inspired solutions for fog and edge computing and highlights the role of machine learning and informatics. Nature- or biologically-inspired techniques are successful tools for understanding and analyzing collective behavior. As this book demonstrates, algorithms and mechanisms of self-organization of complex natural systems have been used to solve optimization problems, particularly in complex systems that are adaptive, ever-evolving, and distributed in nature. The chapters look at ways to enhance the performance of fog networks in real-world applications using nature-based optimization techniques. They discuss challenges and provide solutions to the concerns of security, privacy, and power consumption in cloud data center nodes and fog computing networks. The book also examines how: The existing fog and edge architecture is used to provide solutions to future challenges. A geographical information system (GIS) can be used with fog computing to help users in an urban region access prime healthcare. An optimization framework helps in cloud resource management. Fog computing can improve the quality, quantity, long-term viability, and cost-effectiveness of agricultural production. Virtualization can support fog computing, increase the resources to be allocated, and be applied to different network layers.
The combination of fog computing and IoT or cloud computing can help healthcare workers predict and analyze diseases in patients.
* Covers different technologies such as AI, IoT and signal processing in the context of biomedical applications
* Reviews medical image analysis, disease detection, and prediction
* Explains the advantages of recent technologies for medical record keeping through electronic health records (EHRs)
* Presents state-of-the-art research in the field of biomedical engineering using various physiological signals
* Explores different biosensors used in healthcare applications using IoT
* Discusses concepts such as basic programming principles, OOP principles, database programming, GUI programming, network programming, data analytics and visualization, statistical analysis, virtual reality, web development, machine learning, and deep learning
* Provides the code and the output for all the concepts discussed
* Includes a case study at the end of each chapter
This book provides a comprehensive, modern introduction to convex optimization, a field that is becoming increasingly important in applied mathematics, economics and finance, engineering, and computer science, notably in data science and machine learning. Written by a leading expert in the field, this book includes recent advances in the algorithmic theory of convex optimization, naturally complementing the existing literature. It contains a unified and rigorous presentation of the acceleration techniques for minimization schemes of first- and second-order. It provides readers with a full treatment of the smoothing technique, which has tremendously extended the abilities of gradient-type methods. Several powerful approaches in structural optimization, including optimization in relative scale and polynomial-time interior-point methods, are also discussed in detail. Researchers in theoretical optimization as well as professionals working on optimization problems will find this book very useful. It presents many successful examples of how to develop very fast specialized minimization algorithms. Based on the author's lectures, it can naturally serve as the basis for introductory and advanced courses in convex optimization for students in engineering, economics, computer science and mathematics.
This textbook introduces basic and advanced embedded system topics through Arm Cortex M microcontrollers, covering programmable microcontroller usage starting from basic to advanced concepts using the STMicroelectronics Discovery development board. Designed for use in upper-level undergraduate and graduate courses on microcontrollers, microprocessor systems, and embedded systems, the book explores fundamental and advanced topics, real-time operating systems via FreeRTOS and Mbed OS, and then offers a solid grounding in digital signal processing, digital control, and digital image processing concepts - with emphasis placed on the usage of a microcontroller for these advanced topics. The book uses C language, "the" programming language for microcontrollers, C++ language, and MicroPython, which allows Python language usage on a microcontroller. Sample codes and course slides are available for readers and instructors, and a solutions manual is available to instructors. The book will also be an ideal reference for practicing engineers and electronics hobbyists who wish to become familiar with basic and advanced microcontroller concepts.
A decision procedure is an algorithm that, given a decision problem, terminates with a correct yes/no answer. Here, the authors focus on theories that are expressive enough to model real problems, but are still decidable. Specifically, the book concentrates on decision procedures for first-order theories that are commonly used in automated verification and reasoning, theorem-proving, compiler optimization and operations research. The techniques described in the book draw from fields such as graph theory and logic, and are routinely used in industry. The authors introduce the basic terminology of satisfiability modulo theories and then, in separate chapters, study decision procedures for each of the following theories: propositional logic; equalities and uninterpreted functions; linear arithmetic; bit vectors; arrays; pointer logic; and quantified formulas.
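As a minimal illustration of the opening definition (a standard textbook sketch, not drawn from the book itself), the simplest decision procedure for propositional satisfiability enumerates every truth assignment; it always terminates with a correct yes/no answer, albeit in exponential time:

```python
from itertools import product

def satisfiable(formula, variables):
    """Decide propositional satisfiability by exhaustive truth-table
    enumeration: try every assignment of True/False to the variables."""
    return any(
        formula(dict(zip(variables, assignment)))
        for assignment in product([False, True], repeat=len(variables))
    )

# (p or q) and (not p or not q) -- exclusive-or, satisfiable
f = lambda v: (v["p"] or v["q"]) and (not v["p"] or not v["q"])
assert satisfiable(f, ["p", "q"])

# p and not p -- a contradiction, unsatisfiable
g = lambda v: v["p"] and not v["p"]
assert not satisfiable(g, ["p"])
```

The decision procedures studied in the book are far more sophisticated, but each shares this contract: termination plus a correct yes/no verdict.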
This open access handbook describes foundational issues, methodological approaches and examples of how to analyse and model data using Computational Social Science (CSS) for policy support. Up to now, CSS studies have mostly developed on a small, proof-of-concept scale that has prevented CSS from making a systematic impact on the policy cycle, from improving the understanding of societal problems through to the definition, assessment, evaluation, and monitoring of policies. The aim of this handbook is to fill this gap by exploring ways to analyse and model data for policy support, and to advocate the adoption of CSS solutions for policy by raising awareness of existing implementations of CSS in policy-relevant fields. To this end, the book explores applications of computational methods and approaches like big data, machine learning, statistical learning, sentiment analysis, text mining, systems modelling, and network analysis to different problems in the social sciences. The book is structured into three Parts: the first chapters, on foundational issues, open with an exposition and description of key policymaking areas where CSS can provide insights and information. In detail, the chapters cover public policy, governance, data justice and other ethical issues. Part two consists of chapters on methodological aspects dealing with issues such as the modelling of complexity, natural language processing, validity and lack of data, and innovation in official statistics. Finally, Part three describes the application of computational methods, challenges and opportunities in various social science areas, including economics, sociology, demography, migration, climate change, epidemiology, geography, and disaster management.
The target audience of the book spans from the scientific community engaged in CSS research to policymakers interested in evidence-informed policy interventions, but also includes private companies holding data that can be used to study social sciences and are interested in achieving a policy impact.
Following an introduction to the basis of the fast Fourier transform (FFT), this book focuses on the implementation details of the FFT for parallel computers. The FFT is an efficient implementation of the discrete Fourier transform (DFT), and is widely used for many applications in engineering, science, and mathematics. Presenting many algorithms in pseudo-code and a complexity analysis, this book offers a valuable reference guide for graduate students, engineers, and scientists in the field who wish to apply the FFT to large-scale problems. Parallel computation is becoming indispensable in solving the large-scale problems increasingly arising in a wide range of applications. The performance of parallel supercomputers is steadily improving, and it is expected that a massively parallel system with hundreds of thousands of compute nodes equipped with multi-core processors and accelerators will be available in the near future. Accordingly, the book also provides up-to-date computational techniques relevant to the FFT in state-of-the-art parallel computers. Following the introductory chapter, Chapter 2 introduces readers to the DFT and the basic idea of the FFT. Chapter 3 explains mixed-radix FFT algorithms, while Chapter 4 describes split-radix FFT algorithms. Chapter 5 explains multi-dimensional FFT algorithms, Chapter 6 presents high-performance FFT algorithms, and Chapter 7 addresses parallel FFT algorithms for shared-memory parallel computers. In closing, Chapter 8 describes parallel FFT algorithms for distributed-memory parallel computers.
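To illustrate the DFT/FFT relationship mentioned above (a sketch, not code from the book), the following compares a naive O(n^2) DFT with a recursive radix-2 Cooley-Tukey FFT, the even/odd decomposition that underlies the more elaborate mixed- and split-radix algorithms the book covers:

```python
import cmath

def dft(x):
    """Naive O(n^2) discrete Fourier transform of a sequence x."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])          # split into even/odd indices
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddle[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] for k in range(n // 2)])

signal = [1.0, 2.0, 3.0, 4.0, 0.0, -1.0, -2.0, -3.0]
# Both transforms agree to floating-point precision
assert all(abs(a - b) < 1e-9 for a, b in zip(dft(signal), fft(signal)))
```

The recursion halves the problem at each level, giving the O(n log n) cost that makes the FFT practical at the large scales the book targets.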
Algorithms (mathematics) help in understanding the direct and indirect relationships between plants and other environmental factors. This book helps readers understand how yield is related to different growth parameters, how growth is influenced by different environmental phenomena, how best resources can be used for crop production, and so on. The numerical examples in the book guide the student to coordinate the different parameters and understand the subject of Agronomy well. The book is divided into thirteen chapters and comprehensively covers the different agronomic aspects needed to understand the science of mathematical Agronomy and to meet current and future challenges related to cropping practices.
This book provides a unified framework that describes how genetic learning can be used to design pattern recognition and learning systems. It examines how a search technique, the genetic algorithm, can be used for pattern classification mainly through approximating decision boundaries. Coverage also demonstrates the effectiveness of the genetic classifiers vis-a-vis several widely used classifiers, including neural networks.
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. In this year's edition, the topics covered include many of the most important issues and research questions in the field, such as: opportune application domains for GP-based methods, game playing and co-evolutionary search, symbolic regression and efficient learning strategies, encodings and representations for GP, schema theorems, and new selection mechanisms. The volume includes several chapters on best practices and lessons learned from hands-on experience. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
This book exposes how inequalities based on class and social background arise from employment practices in the digital age. It considers instances where social media is used in recruitment to infiltrate private lives and hide job advertisements based on locality; where algorithms assess socio-economic data to filter candidates; where human interviewers are replaced by artificial intelligence with design that disadvantages users of classed language; and where already vulnerable groups become victims of digitalisation and remote work. The author examines whether these practices create risks of discrimination based on certain protected attributes, including ‘social origin’ in international labour law and laws in Australia and South Africa, ‘social condition’ and ‘family status’ in laws within Canada, and others. The book proposes essential law reform and improvements to workplace policy.
This book presents blockchain-enabled Internet of Things (B-IoT) techniques from several perspectives: architecture, key technologies, security and privacy, service models and frameworks, practical use cases, and more. Its main contents derive from the most recent technical achievements and breakthroughs in the field. A number of representative IoT service offerings are covered, such as vehicular networks, document sharing systems, and telehealth. Both theoretical and practical contents are included in order to help readers gain a comprehensive and deep understanding of the mechanisms of using blockchain to power IoT systems. B-IoT is deemed a novel technical alternative that provides network-based services with additional functionalities, benefits, and implementations in terms of decentralization, immutability, and auditability. Towards an enhanced secure and privacy-preserving Internet of Things (IoT), this book introduces several significant aspects of B-IoT, including fundamental knowledge of both blockchain and IoT, state-of-the-art reviews of B-IoT applications, crucial components in the B-IoT system and its model design, and future development potentials and trends. IoT technologies and services, e.g. cloud data storage technologies and vehicular services, play important roles in wireless technology developments. On the other side, blockchain technologies are being adopted in a variety of academic societies and professional realms due to their promising characteristics. It is observable that research and development on integrating these two technologies will provide critical thinking and solid references for contemporary and future network-relevant solutions.
This book targets researchers and advanced level students in computer science, who are focused on cryptography, cloud computing and internet of things, as well as electrical engineering students and researchers focused on vehicular networks and more. Professionals working in these fields will also find this book to be a valuable resource.
This text emphasizes the importance of artificial intelligence techniques in the field of biological computation. It also discusses fundamental principles that can be applied beyond bio-inspired computing. It comprehensively covers important topics including data integration, data mining, machine learning, genetic algorithms, evolutionary computation, evolved neural networks, nature-inspired algorithms, and protein structure alignment. The text covers the application of evolutionary computation for fractal visualization of sequence data, artificial intelligence, and automatic image interpretation in modern biological systems. The text is primarily written for graduate students and academic researchers in areas of electrical engineering, electronics engineering, computer engineering, and computational biology. This book:
* Covers algorithms in the fields of artificial intelligence and machine learning useful in biological data analysis.
* Discusses comprehensively artificial intelligence and automatic image interpretation in modern biological systems.
* Presents the application of evolutionary computation for fractal visualization of sequence data.
* Explores the use of genetic algorithms for pair-wise and multiple sequence alignments.
* Examines the roles of efficient computational techniques in biology.
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to operations in every sector: information technology, telecommunications, energy, banking and finance, transportation systems, chemicals, agriculture and food, defense industrial base, public health and health care, national monuments and icons, drinking water and water treatment systems, commercial facilities, dams, emergency services, commercial nuclear reactors, materials and waste, postal and shipping, and government facilities. Global business and industry, governments, indeed society itself, cannot function if major components of the critical information infrastructure are degraded, disabled or destroyed. This book, Critical Infrastructure Protection IV, is the fourth volume in the annual series produced by IFIP Working Group 11.10 on Critical Infrastructure Protection, an active international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts related to critical infrastructure protection. The book presents original research results and innovative applications in the area of infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This volume contains seventeen edited papers from the Fourth Annual IFIP Working Group 11.10 International Conference on Critical Infrastructure Protection, held at the National Defense University, Washington, DC, March 15-17, 2010. The papers were refereed by members of IFIP Working Group 11.10 and other internationally-recognized experts in critical infrastructure protection.
The solitaire game "The Tower of Hanoi" was invented in the 19th century by the French number theorist Edouard Lucas. The book presents its mathematical theory and offers a survey of the historical development from predecessors up to recent research. In addition to long-standing myths, it provides a detailed overview of the essential mathematical facts with complete proofs, and also includes unpublished material, e.g., on some captivating integer sequences. The main objects of research today are the so-called Hanoi graphs and the related Sierpinski graphs. Acknowledging the great popularity of the topic in computer science, algorithms, together with their correctness proofs, form an essential part of the book. In view of the most important practical applications, namely in physics, network theory and cognitive (neuro)psychology, the book also addresses other structures related to the Tower of Hanoi and its variants. The updated second edition includes, for the first time in English, the breakthrough reached with the solution of "The Reve's Puzzle" in 2014. This is a special case of the famed Frame-Stewart conjecture which is still open after more than 75 years. Enriched with elaborate illustrations, connections to other puzzles and challenges for the reader in the form of (solved) exercises as well as problems for further exploration, this book is enjoyable reading for students, educators, game enthusiasts and researchers alike. Excerpts from reviews of the first edition: "The book is an unusual, but very welcome, form of mathematical writing: recreational mathematics taken seriously and serious mathematics treated historically. I don't hesitate to recommend this book to students, professional research mathematicians, teachers, and to readers of popular mathematics who enjoy more technical expository detail." Chris Sangwin, The Mathematical Intelligencer 37(4) (2015) 87f.
"The book demonstrates that the Tower of Hanoi has a very rich mathematical structure, and as soon as we tweak the parameters we surprisingly quickly find ourselves in the realm of open problems." Laszlo Kozma, ACM SIGACT News 45(3) (2014) 34ff. "Each time I open the book I discover a renewed interest in the Tower of Hanoi. I am sure that this will be the case for all readers." Jean-Paul Allouche, Newsletter of the European Mathematical Society 93 (2014) 56.
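For readers curious about the algorithmic side the blurb mentions, the classic recursive solution (a standard textbook sketch, not taken from the book) moves n discs in the provably minimal 2^n - 1 moves:

```python
def hanoi(n, source="A", target="C", spare="B"):
    """Return the optimal move sequence for n discs as (from, to) pairs:
    move n-1 discs aside, move the largest, then move the n-1 back on top."""
    if n == 0:
        return []
    return (hanoi(n - 1, source, spare, target)   # clear the way to the spare peg
            + [(source, target)]                  # move the largest disc
            + hanoi(n - 1, spare, target, source))  # stack the rest on top

moves = hanoi(3)
assert len(moves) == 2**3 - 1  # 7 moves, the minimum for 3 discs
```

The minimality proof, and what happens when the puzzle is generalized to four pegs (The Reve's Puzzle), is exactly the territory the book explores.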
Presenting a novel biomimetic design method for transferring design solutions from nature to technology, this book focuses on structure-function patterns in nature and advanced modeling tools derived from TRIZ, the theory of inventive problem-solving. The book includes an extensive literature review on biomimicry as an engine of both innovation and sustainability, and discusses in detail the biomimetic design process, current biomimetic design methods and tools. The structural biomimetic design method for innovation and sustainability put forward in this text encompasses (1) the research method and rationale used to develop and validate this new design method; (2) the suggested design algorithm and tools including the Find structure database, structure-function patterns and ideality patterns; and (3) analyses of four case studies describing how to use the proposed method. This book offers an essential resource for designers who wish to use nature as a source of inspiration and knowledge, innovators and sustainability experts, and scientists and researchers, amongst others.
Activity recognition has emerged as a challenging and high-impact research field, as over the past years smaller and more powerful sensors have been introduced in wide-spread consumer devices. Validation of techniques and algorithms requires large-scale human activity corpuses and improved methods to recognize activities and the contexts in which they occur. This book deals with the challenges of designing valid and reproducible experiments, running large-scale dataset collection campaigns, designing activity and context recognition methods that are robust and adaptive, and evaluating activity recognition systems in the real world with real users.
This book focuses on the development of approximation-related algorithms and their relevant applications. Individual contributions are written by leading experts and reflect emerging directions and connections in data approximation and optimization. Chapters discuss state-of-the-art topics with highly relevant applications throughout science, engineering, technology and the social sciences. Academics, researchers, data science practitioners, business analysts, social sciences investigators and graduate students will find the numerous illustrations, applications, and examples useful. This volume is based on the conference Approximation and Optimization: Algorithms, Complexity, and Applications, which was held at the National and Kapodistrian University of Athens, Greece, June 29-30, 2017. The mix of survey and research content includes topics in approximations to discrete noisy data; binary sequences; design of networks and energy systems; fuzzy control; large scale optimization; noisy data; data-dependent approximation; networked control systems; machine learning; optimal design; no free lunch theorem; non-linearly constrained optimization; spectroscopy.
This book introduces a new scheduler to fairly and efficiently distribute system resources among many users with varying usage patterns who compete for them in large shared computing environments. The Rawlsian Fair scheduler developed for this effort is shown to boost performance while reducing delay in high-performance computing workloads of certain types, including the following four classes examined in this book: (i) Class A - similar but complementary workloads; (ii) Class B - similar but steady vs. intermittent workloads; (iii) Class C - large vs. small workloads; (iv) Class D - large vs. noise-like workloads. The new scheduler achieves short-term fairness on the small timescales that demand rapid response to varying workloads and usage profiles. The Rawlsian Fair scheduler is shown to consistently benefit workload Classes C and D, while it benefits Classes A and B only where the workloads become disproportionate as the number of users increases. A simulation framework, dSim, simulates the new Rawlsian Fair scheduling mechanism, helping to achieve instantaneous fairness in High Performance Computing environments, effective utilization of computing resources, and user satisfaction through the Rawlsian Fair scheduler.
Computer science is the science of the future, and already underlies every facet of business and technology, and much of our everyday lives. In addition, it will play a crucial role in the science of the 21st century, which will be dominated by biology and biochemistry, similar to the role of mathematics in the physical sciences of the 20th century. In this award-winning best-seller, the author and his co-author focus on the fundamentals of computer science, which revolve around the notion of the "algorithm." They discuss the design of algorithms, their efficiency and correctness, the inherent limitations of algorithms and computation, quantum algorithms, concurrency, large systems and artificial intelligence. Throughout, the authors, in their own words, stress the 'fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'. This version of the book is published to celebrate 25 years since its first edition, and in honor of the Alan M. Turing Centennial year. Turing was a true pioneer of computer science, whose work forms the underlying basis of much of this book.