Markov decision process (MDP) models are widely used for modeling
sequential decision-making problems that arise in engineering,
economics, computer science, and the social sciences. Many
real-world problems modeled by MDPs have huge state and/or action spaces, exposing them to the curse of dimensionality and making practical solution of the resulting models intractable. In
other cases, the system of interest is too complex to allow
explicit specification of some of the MDP model parameters, but
simulation samples are readily available (e.g., for random
transitions and costs). For these settings, various sampling and
population-based algorithms have been developed to overcome the
difficulties of computing an optimal solution in terms of a policy
and/or value function. Specific approaches include adaptive
sampling, evolutionary policy iteration, evolutionary random policy
search, and model reference adaptive search.
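To make the setting concrete, here is a minimal Python sketch of simulation-based decision making in an MDP whose transition model is available only through a simulator; the `simulate` function, its toy dynamics, and all parameters are illustrative assumptions, not taken from the book.

```python
import random

def simulate(state, action):
    """Hypothetical black-box simulator: returns (next_state, cost)."""
    next_state = (state + action) % 10            # toy dynamics (assumption)
    cost = abs(next_state - 5) + random.random()  # toy noisy cost (assumption)
    return next_state, cost

def sampled_q(state, action, value, gamma=0.9, n_samples=100):
    """Estimate Q(s, a) by averaging simulated one-step lookaheads."""
    total = 0.0
    for _ in range(n_samples):
        nxt, cost = simulate(state, action)
        total += cost + gamma * value.get(nxt, 0.0)
    return total / n_samples

def greedy_action(state, actions, value):
    """Choose the action minimizing the sampled Q-value (costs are minimized)."""
    return min(actions, key=lambda a: sampled_q(state, a, value))

# Usage: one greedy decision at state 3 with a zero initial value function.
print(greedy_action(3, actions=[-1, 0, 1], value={}))
```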
A best-seller in its French edition, this book has an original construction, and its success in the French market demonstrates its appeal. It is based on three principles: 1. An organization of the chapters by families of algorithms: exhaustive search, divide and conquer, etc. By contrast, there is no chapter devoted solely to a systematic exposition of, say, algorithms on strings; some of these will be found in different chapters. 2. For each family of algorithms, an introduction is given to the mathematical principles and the issues of rigorous design, with one or two pedagogical examples. 3. For the most part, the book details 150 problems, spanning seven families of algorithms. For each problem, a precise and progressive statement is given. More importantly, a complete solution is detailed, with respect to the design principles that have been presented; often, some classical errors are pointed out. Roughly speaking, two thirds of the book is devoted to the detailed, rational construction of the solutions.
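As a concrete illustration of one of these families, here is a minimal divide-and-conquer sketch in Python; merge sort is a standard pedagogical example and is not drawn from the book itself.

```python
def merge_sort(xs):
    """Sort a list by splitting it, sorting the halves, and merging them."""
    if len(xs) <= 1:                 # base case: already sorted
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])      # divide
    right = merge_sort(xs[mid:])
    merged, i, j = [], 0, 0          # conquer: merge two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```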
This volume contains the proceedings of IFIPTM 2008, the Joint iTrust and PST Conferences on Privacy, Trust Management and Security, held in Trondheim, Norway, from June 18 to June 20, 2008. IFIPTM 2008 provides a truly global platform for the reporting of research, development, policy and practice in the interdependent areas of Privacy, Security, and Trust. Following the traditions inherited from the highly successful iTrust and PST conference series, IFIPTM 2008 focuses on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion about relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2008 is an open IFIP conference, which only accepts contributed papers, so all papers in these proceedings have passed strict peer review. The program of the conference features both theoretical research papers and reports of real-world case studies. IFIPTM 2008 received 62 submissions. The program committee selected 22 papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include 3 demo descriptions. The highlights of IFIPTM 2008 include invited talks and tutorials by industrial and academic experts in the fields of trust management, privacy and security, including Jon Bing and Michael Steiner.
Genetic Programming Theory and Practice VII presents the results of the annual Genetic Programming Theory and Practice Workshop, contributed by the foremost international researchers and practitioners in the GP arena. Contributions examine the similarities and differences between theoretical and empirical results on real-world problems, and explore the synergy between theory and practice, producing a comprehensive view of the state of the art in GP application. Application areas include chemical process control, circuit design, financial data mining and bio-informatics, to name a few. About this book: it discusses the hurdles encountered when solving large-scale, cutting-edge applications, and provides in-depth presentations of the latest and most significant applications of GP and the most recent theoretical results with direct applicability to state-of-the-art problems. Genetic Programming Theory and Practice VII is suitable for researchers, practitioners and students of Genetic Programming, including industry technical staff, technical consultants and business entrepreneurs.
Computer systems research is heavily influenced by changes in computer technology. As technology changes alter the characteristics of the underlying hardware components of the system, the algorithms used to manage the system need to be re-examined and new techniques need to be developed. Technological influences are particularly evident in the design of storage management systems such as disk storage managers and file systems. The influences have been so pronounced that techniques developed as recently as ten years ago are being made obsolete. The basic problem for disk storage managers is the unbalanced scaling of hardware component technologies. Disk storage manager design depends on the technology for processors, main memory, and magnetic disks. During the 1980s, processors and main memories benefited from the rapid improvements in semiconductor technology and improved by several orders of magnitude in performance and capacity. This improvement has not been matched by disk technology, which is bounded by the mechanics of rotating magnetic media. Magnetic disks of the 1980s have improved by a factor of 10 in capacity but only a factor of 2 in performance. This unbalanced scaling of the hardware components challenges the disk storage manager to compensate for the slower disks and allow performance to scale with the processor and main memory technology. Unless the performance of file systems can be improved over that of the disks, I/O-bound applications will be unable to use the rapid improvements in processor speeds to improve performance for computer users. Disk storage managers must break this bottleneck and decouple application performance from the disk.
This book introduces the concepts, applications and development of data science in the telecommunications industry by focusing on advanced machine learning and data mining methodologies in the wireless networks domain. Mining Over Air describes the problems and their solutions for wireless network performance and quality, device quality readiness and returns analytics, wireless resource usage profiling, network traffic anomaly detection, intelligence-based self-organizing networks, telecom marketing, social influence, and other important applications in the telecom industry. Written by authors who study big data analytics in wireless networks and telecommunication markets from both industrial and academic perspectives, the book targets the pain points in telecommunication networks and markets through big data. Designed for both practitioners and researchers, the book explores the intersection between the development of new engineering technology and the use of industry data to understand consumer behavior. It combines engineering savvy with insights about human behavior. Engineers will understand how the data generated by the technology can be used to understand consumer behavior, and social scientists will gain a better understanding of the data-generation process.
Details robustness, stability, and performance of Evolutionary Algorithms in dynamic environments
A state-of-the-art research monograph providing consistent treatment of supervisory control, by one of the world's leading groups in the area of Bayesian identification, control, and decision making. An accompanying CD illustrates the book's underlying theory.
Synthesis and Optimization of DSP Algorithms describes approaches taken to synthesising structural hardware descriptions of digital circuits from high-level descriptions of Digital Signal Processing (DSP) algorithms. The book contains: - A tutorial on the subjects of digital design and architectural synthesis, intended for DSP engineers,
This book is an up-to-date documentation of the state of the art in combinatorial optimization, presenting approximate solutions of virtually all relevant classes of NP-hard optimization problems. The well-structured wealth of problems, algorithms, results, and techniques introduced systematically will make the book an indispensable source of reference for professionals. The smooth integration of numerous illustrations, examples, and exercises makes this monograph an ideal textbook.
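For a flavor of the kind of approximation algorithm such a book covers, here is a minimal Python sketch of the classic 2-approximation for minimum vertex cover via greedy edge selection; the example graph is an illustrative assumption, not an example from the book.

```python
def vertex_cover_2approx(edges):
    """Greedily take both endpoints of each uncovered edge; the resulting
    cover is at most twice the size of an optimal vertex cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Usage on a small illustrative graph.
edges = [(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)]
print(vertex_cover_2approx(edges))
```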
For the introductory Data Structures course (CS2) that typically follows a first course in programming. This text continues to offer a thorough, well-organized, and up-to-date presentation of essential principles and practices in data structures using C++. Reflecting the newest trends in computer science, new and revised material throughout the Second Edition places increased emphasis on abstract data types (ADTs) and object-oriented design. To access the author's Companion Website, including the Solutions Manual, for ADTs, Data Structures and Problem Solving with C++, please go to http://cs.calvin.edu/books/c++/ds/2e/ For other books by Larry Nyhoff, please go to www.prenhall.com/nyhoff
There has been continuing interest in improving the speed of digital signal processing. The use of Residue Number Systems (RNS) for the design of DSP systems has been extensively researched in the literature. Szabo and Tanaka popularized this approach through their book published in 1967. Subsequently, Jenkins and Leon rekindled the interest of researchers in this area in 1978, from which time there have been several efforts to use RNS in practical system implementation. An IEEE Press book, a collection of papers, was published in 1986. It is very interesting to note that in the recent past, since 1988, the research activity has received a new thrust with emphasis on VLSI design using both non-ROM-based and ROM-based designs, as evidenced by the increased publications in this area. The main advantage of using RNS is that several small word-length processors are used to perform operations such as addition, multiplication and accumulation, and subtraction, thus needing less instruction execution time than that needed in conventional 16-bit/32-bit DSPs. However, the disadvantages of RNS have been the difficulty of overflow detection, sign detection, comparison of two numbers, scaling, division by an arbitrary number, and RNS-to-binary and binary-to-RNS conversion. These operations, unfortunately, are computationally intensive and time consuming.
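As a rough illustration of the RNS idea described above, the following Python sketch splits numbers into small residues modulo pairwise coprime moduli, multiplies channel-wise, and converts back to binary via the Chinese Remainder Theorem; the moduli set is an illustrative assumption.

```python
from math import prod

MODULI = (7, 11, 13)  # pairwise coprime (assumption); dynamic range = 1001

def to_rns(x):
    """Binary-to-RNS conversion: one small residue per modulus."""
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    """Channel-wise multiplication: each small word is independent,
    which is the source of RNS's speed advantage."""
    return tuple((ai * bi) % m for ai, bi, m in zip(a, b, MODULI))

def from_rns(r):
    """RNS-to-binary conversion via the Chinese Remainder Theorem --
    one of the costly operations the blurb lists as a drawback."""
    M = prod(MODULI)
    x = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        x += ri * Mi * pow(Mi, -1, mi)  # modular inverse of Mi mod mi
    return x % M

# Usage: multiply 23 * 19 entirely in residue form.
a, b = to_rns(23), to_rns(19)
print(from_rns(rns_mul(a, b)))  # 437
```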
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: exploiting subprograms in genetic programming, schema frequencies in GP, Accessible AI, GP for Big Data, lexicase selection, symbolic regression techniques, co-evolution of GP and LCS, and applying ecological principles to GP. It also covers several chapters on best practices and lessons learned from hands-on experience. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
This book not only provides a comprehensive introduction to neural-based PCA methods in control science, but also presents many novel PCA algorithms and their extensions and generalizations, e.g., dual-purpose algorithms, coupled PCA, GED, and neural-based SVD algorithms. It also discusses in detail various methods for analyzing the convergence, stability, and self-stabilizing properties of these algorithms, and introduces the deterministic discrete-time systems method for analyzing the convergence of PCA/MCA algorithms. Readers should be familiar with numerical analysis and the fundamentals of statistics, such as the basics of least squares and stochastic algorithms. Although it focuses on neural networks, the book only presents their learning laws, which are simply iterative algorithms; therefore, no a priori knowledge of neural networks is required. This book will be of interest, and serve as a reference source, to researchers and students in applied mathematics, statistics, engineering, and other related fields.
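To illustrate what such an iterative learning law looks like, here is a minimal sketch of Oja's rule, a standard neural PCA algorithm; it is offered as a generic example rather than one of the book's own algorithms, and the data, seed, and step size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Anisotropic toy data: variance is largest along the first axis (assumption).
X = rng.normal(size=(2000, 3)) @ np.diag([3.0, 1.0, 0.3])

w = rng.normal(size=3)
w /= np.linalg.norm(w)
eta = 0.01  # learning rate (assumption)

for x in X:
    y = w @ x                    # neuron output
    w += eta * y * (x - y * w)   # Oja's rule: Hebbian term with weight decay

# w should align (up to sign) with the leading principal component, here the
# first coordinate axis.
print(np.round(w / np.linalg.norm(w), 2))
```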
This monograph gives a thorough treatment of the celebrated compositions of signature and encryption that allow for verifiability, that is, that allow one to efficiently prove properties about the encrypted data. This study is provided in the context of two cryptographic primitives: (1) designated confirmer signatures, an opaque signature which was introduced to control the proliferation of certified copies of documents, and (2) signcryption, a primitive that offers privacy and authenticity at once in an efficient way. This book is a useful resource for researchers in cryptology and information security, graduate and PhD students, and security professionals.
This book contains extended and revised versions of the best papers presented at the 17th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2009, held in Florianópolis, Brazil, in October 2009. The 8 papers included in the book, together with two keynote talks, were carefully reviewed and selected from 27 papers presented at the conference. The papers cover a wide variety of excellent and advanced research in VLSI technology, addressing the current trend toward increasing chip integration and technology process advancements, which bring stimulating new challenges at both the physical and system-design levels, as well as in the test of these systems.
"Examines classic algorithms, geometric diagrams, and mechanical principles for enhances visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming."
Critical Infrastructure Protection II describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This book is the second volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of twenty edited papers from the Second Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection held at George Mason University, Arlington, Virginia, USA in the spring of 2008.
This book aims to present the impact of Artificial Intelligence (AI) and Big Data in healthcare for medical decision making and data analysis in myriad fields including Radiology, Radiomics, Radiogenomics, Oncology, Pharmacology, COVID-19 prognosis, Cardiac imaging, Neuroradiology, Psychiatry and others. This includes topics such as the Artificial Intelligence of Things (AIoT), Explainable Artificial Intelligence (XAI), distributed learning, the Blockchain of Internet of Things (BIoT), cybersecurity, and the Internet of (Medical) Things (IoMT). Healthcare providers will learn how to leverage Big Data analytics and AI as a methodology for accurate analysis based on their clinical data repositories and for clinical decision support. The capacity to recognize patterns and transform large amounts of data into usable information for precision medicine assists healthcare professionals in achieving these objectives. Intelligent health systems have the potential to monitor at-risk patients with underlying conditions and track their progress during therapy. Some of the greatest challenges in using these technologies stem from the legal and ethical concerns of using medical data and from adequately representing and serving disparate patient populations. One major potential benefit of this technology is to make health systems more sustainable and standardized. Privacy and data security, establishing protocols, appropriate governance, and improving technologies will be among the crucial priorities for digital transformation in healthcare.
This text explains the fundamental principles of algorithms available for performing arithmetic operations on digital computers. These include basic arithmetic operations like addition, subtraction, multiplication, and division in fixed-point and floating-point number systems as well as more complex operations such as square root extraction and evaluation of exponential, logarithmic, and trigonometric functions. The algorithms described are independent of the particular technology employed for their implementation.
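As one concrete example of the technology-independent algorithms such a text describes, here is a minimal Python sketch of square root extraction by a digit-recurrence method (one result bit per iteration) using only shifts, additions, and comparisons; this particular binary formulation is a standard one, not necessarily the text's own.

```python
def isqrt(n):
    """Integer square root of n, computed one binary digit at a time."""
    assert n >= 0
    root = 0
    bit = 1 << ((n.bit_length() // 2) * 2)  # power of 4 near the top of n
    while bit > n:                          # step down to the largest usable digit
        bit >>= 2
    while bit:
        if n >= root + bit:      # trial subtraction: accept this result bit
            n -= root + bit
            root = (root >> 1) + bit
        else:
            root >>= 1           # reject this result bit
        bit >>= 2
    return root

print(isqrt(10), isqrt(144))  # 3 12
```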
In the research area of computer science, practitioners are constantly searching for faster platforms with pertinent results. With analytics that span environmental development to computer hardware emulation, problem-solving algorithms are in high demand. Field-Programmable Gate Array (FPGA) is a promising computing platform that can be significantly faster for some applications and can be applied to a variety of fields. FPGA Algorithms and Applications in the IoT, AI, and High-Performance Computing provides emerging research exploring the theoretical and practical aspects of computable algorithms and applications within robotics and electronics development. Featuring coverage on a broad range of topics such as neuroscience, bioinformatics, and artificial intelligence, this book is ideally designed for computer science specialists, researchers, professors, and students seeking current research on cognitive analytics and advanced computing.
In delivering lectures and writing books, we were most often forced to pay no attention to a great body of interesting results and useful algorithms appearing in numerous sources and only occasionally encountered. It was absolutely clear that most of these results would eventually be forgotten, because it is impossible to run through the entire variety of sources where these materials could be published. Therefore, we decided to do what we can to correct this situation. We discussed this problem with Ershov and came to the idea of writing an encyclopedia of algorithms on graphs, focusing our main attention on the algorithms already used in programming and on their generalizations or modifications. We thought it reasonable to group all graphs into certain classes and place the algorithms developed for each class into a separate book. The existence of trees, i.e., a class of graphs especially important for programming, also supported this decision. This monograph is the first but, as we hope, not the last book written as part of our project. It was preceded by two books, "Algorithms on Trees" (1984) and "Algorithms of Processing of Trees" (1990), small editions of which were published at the Computer Center of the Siberian Division of the Russian Academy of Sciences. The books were distributed immediately, and this made our decision to prepare a combined monograph on the basis of these books even stronger.
This book provides a comprehensive picture of fog computing technology, including fog architectures, latency-aware application management issues with real-time requirements, security and privacy issues, and fog analytics, in wide-ranging application scenarios such as M2M device communication, smart homes, smart vehicles, augmented reality and transportation management. This book explores the research issues involved in the application of traditional shallow machine learning and deep learning techniques to big data analytics. It surveys global research advances in extending conventional unsupervised or clustering algorithms, supervised and semi-supervised algorithms, and association rule mining algorithms to big data scenarios. Further, it discusses deep learning applications of big data analytics in the fields of computer vision and speech processing, and describes applications such as semantic indexing and data tagging. Lastly, it identifies 25 unsolved research problems and research directions in fog computing, as well as in the context of applying deep learning techniques to big data analytics, such as dimensionality reduction in high-dimensional data and improved formulation of data abstractions, along with possible directions for their solutions.
I want to express my sincere thanks to all authors who submitted research papers to support the Third IFIP International Conference on Computer and Computing Technologies in Agriculture and the Third Symposium on Development of Rural Information (CCTA 2009), held in China during October 14-17, 2009. This conference was hosted by the CICTA (EU-China Centre for Information & Communication Technologies, China Agricultural University), the China National Engineering Research Center for Information Technology in Agriculture, the Asian Conference on Precision Agriculture, the International Federation for Information Processing, the Chinese Society of Agricultural Engineering, the Beijing Society for Information Technology in Agriculture, and the Chinese Society for Agricultural Machinery. The platinum sponsors include the Ministry of Science and Technology of China, the Ministry of Agriculture of China, and the Ministry of Education of China, among others. The CICTA (EU-China Centre for Information & Communication Technologies, China Agricultural University) focuses on research and development of advanced and practical technologies applied in agriculture and on promoting international communication and cooperation. It has successfully held three International Conferences on Computer and Computing Technologies in Agriculture, namely CCTA 2007, CCTA 2008 and CCTA 2009. Sustainable agriculture is the focus of the whole world currently, and therefore the application of information technology in agriculture is becoming more and more important. 'Informatized agriculture' has been sought by many countries recently in order to scientifically manage agriculture to achieve low costs and high incomes.
You may like...
Computational and Statistical Methods… by Shen Liu, James McGree, … (Hardcover, R1,802)
Comprehensive Metaheuristics… by S. Ali Mirjalili, Amir Hossein Gandomi (Paperback, R3,956)