Welcome to Loot.co.za!
This book introduces the concepts, applications and development of data science in the telecommunications industry by focusing on advanced machine learning and data mining methodologies in the wireless networks domain. Mining Over Air describes the problems and their solutions for wireless network performance and quality, device quality readiness and returns analytics, wireless resource usage profiling, network traffic anomaly detection, intelligence-based self-organizing networks, telecom marketing, social influence, and other important applications in the telecom industry. Written by authors who study big data analytics in wireless networks and telecommunication markets from both industrial and academic perspectives, the book targets the pain points in telecommunication networks and markets through big data. Designed for both practitioners and researchers, the book explores the intersection between the development of new engineering technology and the use of industry data to understand consumer behavior. It combines engineering savvy with insights about human behavior. Engineers will understand how the data generated from the technology can be used to understand consumer behavior, and social scientists will gain a better understanding of the data generation process.
Details robustness, stability, and performance of Evolutionary Algorithms in dynamic environments
A state-of-the-art research monograph providing consistent treatment of supervisory control, by one of the world's leading groups in the area of Bayesian identification, control, and decision making. An accompanying CD illustrates the book's underlying theory.
Synthesis and Optimization of DSP Algorithms describes approaches taken to synthesising structural hardware descriptions of digital circuits from high-level descriptions of Digital Signal Processing (DSP) algorithms. The book contains: - A tutorial on the subjects of digital design and architectural synthesis, intended for DSP engineers.
This book is an up-to-date documentation of the state of the art in combinatorial optimization, presenting approximate solutions of virtually all relevant classes of NP-hard optimization problems. The well-structured wealth of problems, algorithms, results, and techniques introduced systematically will make the book an indispensable source of reference for professionals. The smooth integration of numerous illustrations, examples, and exercises makes this monograph an ideal textbook.
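As a taste of the approximate-solution techniques such a book covers, here is the textbook maximal-matching heuristic for minimum vertex cover, a simple 2-approximation, sketched in Python (the example graph is invented for illustration):

```python
def vertex_cover_2_approx(edges):
    """Maximal-matching 2-approximation for minimum vertex cover.

    Repeatedly pick an edge with both endpoints uncovered and add
    both endpoints to the cover; the result is guaranteed to be at
    most twice the size of an optimal cover.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

edges = [(0, 1), (0, 2), (1, 3), (3, 4)]
cover = vertex_cover_2_approx(edges)
# every edge has at least one endpoint in the cover
assert all(u in cover or v in cover for u, v in edges)
```

An optimal cover here is {0, 3} (size 2); the heuristic returns a cover of size 4, within the guaranteed factor of two.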
For the introductory Data Structures course (CS2) that typically follows a first course in programming. This text continues to offer a thorough, well-organized, and up-to-date presentation of essential principles and practices in data structures using C++. Reflecting the newest trends in computer science, new and revised material throughout the Second Edition places increased emphasis on abstract data types (ADTs) and object-oriented design. To access the author's Companion Website, including the Solutions Manual, for ADTs, Data Structures and Problem Solving with C++, please go to http://cs.calvin.edu/books/c++/ds/2e/ For other books by Larry Nyhoff, please go to www.prenhall.com/nyhoff
Written for developers with some understanding of deep learning algorithms. Experience with reinforcement learning is not required. Grokking Deep Reinforcement Learning introduces this powerful machine learning approach, using examples, illustrations, exercises, and crystal-clear teaching. You'll love the perfectly paced teaching and the clever, engaging writing style as you dig into this awesome exploration of reinforcement learning fundamentals, effective deep learning techniques, and practical applications in this emerging field. We all learn through trial and error. We avoid the things that cause us to experience pain and failure. We embrace and build on the things that give us reward and success. This common pattern is the foundation of deep reinforcement learning: building machine learning systems that explore and learn based on the responses of the environment.
* Foundational reinforcement learning concepts and methods
* The most popular deep reinforcement learning agents solving high-dimensional environments
* Cutting-edge agents that emulate human-like behavior and techniques for artificial general intelligence
Deep reinforcement learning is a form of machine learning in which AI agents learn optimal behavior on their own from raw sensory input. The system perceives the environment, interprets the results of its past decisions, and uses this information to optimize its behavior for maximum long-term return.
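The trial-and-error loop described above can be sketched in its simplest tabular form: Q-learning on a toy 5-state corridor. The environment, hyperparameters, and reward below are invented for illustration; the deep variants the book covers replace the table with a neural network fed by raw sensory input.

```python
import random

random.seed(0)

# Toy corridor: states 0..4, actions 0 = left, 1 = right; reward 1 at state 4.
N_STATES, GOAL = 5, 4

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    return nxt, float(nxt == GOAL)

Q = [[0.0, 0.0] for _ in range(N_STATES)]   # the agent's value estimates
alpha, gamma, eps = 0.5, 0.9, 0.1           # learning rate, discount, exploration

for _ in range(500):                         # episodes of trial and error
    s = 0
    while s != GOAL:
        if random.random() < eps:            # explore...
            a = random.randrange(2)
        else:                                # ...or exploit current estimates
            a = max((1, 0), key=lambda act: Q[s][act])
        s2, r = step(s, a)
        # learn from the environment's response (the Q-learning update)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max((1, 0), key=lambda act: Q[s][act]) for s in range(N_STATES)]
print(policy[:4])  # the agent learns to move right toward the reward
```

The pain/failure vs. reward/success pattern from the blurb is exactly the update line: actions whose outcomes earn reward have their estimates raised and are chosen more often.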
There has been continuing interest in improving the speed of Digital Signal Processing. The use of Residue Number Systems (RNS) for the design of DSP systems has been extensively researched in the literature. Szabo and Tanaka popularized this approach through their book published in 1967. Subsequently, Jenkins and Leon rekindled the interest of researchers in this area in 1978, from which time there have been several efforts to use RNS in practical system implementation. An IEEE Press book, a collection of papers, was published in 1986. It is very interesting to note that since 1988 the research activity has received a new thrust, with emphasis on VLSI design using both ROM-based and non-ROM-based designs, as evidenced by the increased publications in this area. The main advantage of using RNS is that several small word-length processors are used to perform operations such as addition, multiplication, accumulation, and subtraction, thus needing less instruction execution time than that needed in conventional 16-bit/32-bit DSPs. However, the disadvantages of RNS have been the difficulty of overflow detection, sign detection, comparison of two numbers, scaling, division by an arbitrary number, RNS-to-binary conversion, and binary-to-RNS conversion. These operations, unfortunately, are computationally intensive and time consuming.
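Both the word-length advantage and the costly conversions the blurb lists can be seen in a tiny residue-number sketch. The moduli (3, 5, 7) are an arbitrary illustrative choice, giving a dynamic range of 105:

```python
from math import prod

MODULI = (3, 5, 7)          # pairwise coprime; dynamic range = 3 * 5 * 7 = 105

def to_rns(x):
    """Binary-to-RNS conversion: one small residue per modulus."""
    return tuple(x % m for m in MODULI)

def add_rns(a, b):
    # Addition is carried out independently on each small residue,
    # with no carries between channels -- the source of the RNS
    # speed advantage over wide-word arithmetic.
    return tuple((ai + bi) % m for ai, bi, m in zip(a, b, MODULI))

def from_rns(residues):
    # RNS-to-binary conversion via the Chinese Remainder Theorem --
    # one of the computationally intensive operations the blurb mentions.
    M = prod(MODULI)
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(., -1, m) is the modular inverse
    return x % M

a, b = to_rns(17), to_rns(23)
print(from_rns(add_rns(a, b)))  # 40
```

Note the asymmetry: the per-channel addition is a few small modular operations, while the reconstruction requires full-width multiplications and modular inverses, mirroring the trade-off described above.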
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: exploiting subprograms in genetic programming, schema frequencies in GP, Accessible AI, GP for Big Data, lexicase selection, symbolic regression techniques, co-evolution of GP and LCS, and applying ecological principles to GP. It also covers several chapters on best practices and lessons learned from hands-on experience. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
This book not only provides a comprehensive introduction to neural-based PCA methods in control science, but also presents many novel PCA algorithms and their extensions and generalizations, e.g., dual-purpose algorithms, coupled PCA, GED, neural-based SVD algorithms, etc. It also discusses in detail various analysis methods for the convergence, stability, and self-stabilizing properties of algorithms, and introduces the deterministic discrete-time systems method to analyze the convergence of PCA/MCA algorithms. Readers should be familiar with numerical analysis and the fundamentals of statistics, such as the basics of least squares and stochastic algorithms. Although it focuses on neural networks, the book only presents their learning law, which is simply an iterative algorithm; therefore, no a priori knowledge of neural networks is required. This book will be of interest and serve as a reference source to researchers and students in applied mathematics, statistics, engineering, and other related fields.
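The point that such a learning law is "simply an iterative algorithm" can be made concrete with Oja's rule, one of the classic neural PCA updates, which extracts the first principal component of a data stream. The toy 2-D data below is invented for illustration:

```python
import random

random.seed(1)

# Zero-mean toy 2-D data whose principal direction is (1, 1) / sqrt(2)
data = []
for _ in range(2000):
    t = random.gauss(0.0, 1.0)   # large variance along (1, 1)
    n = random.gauss(0.0, 0.1)   # small variance along (1, -1)
    data.append((t + n, t - n))

w = [1.0, 0.0]                   # weight vector of a single linear neuron
eta = 0.01                       # learning rate

for x in data:
    y = w[0] * x[0] + w[1] * x[1]            # neuron output y = w . x
    # Oja's rule: Hebbian growth plus a self-stabilizing decay term,
    # w <- w + eta * y * (x - y * w)
    w = [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
print(round(w[0] / norm, 1), round(w[1] / norm, 1))  # roughly 0.7 0.7
```

The decay term -eta * y^2 * w is what keeps the weight norm bounded, which is exactly the kind of self-stabilizing property the book analyzes.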
This is a how-to book for solving geometric problems robustly or error free in actual practice. The contents and accompanying source code are based on the feature requests and feedback received from industry professionals and academics who want both the descriptions and source code for implementations of geometric algorithms. The book provides a framework for geometric computing using several arithmetic systems and describes how to select the appropriate system for the problem at hand. Key Features: A framework of arithmetic systems that can be applied to many geometric algorithms to obtain robust or error-free implementations Detailed derivations for algorithms that lead to implementable code Teaching the readers how to use the book concepts in deriving algorithms in their fields of application The Geometric Tools Library, a repository of well-tested code at the Geometric Tools website, https://www.geometrictools.com, that implements the book concepts
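The idea of selecting an arithmetic system per problem can be illustrated with the classic 2-D orientation predicate, evaluated here over exact rationals. This is a minimal sketch using Python's fractions module, not the book's Geometric Tools Library code:

```python
from fractions import Fraction

def orient2d(a, b, c):
    # Sign of the cross product (b - a) x (c - a): +1 for a left turn,
    # -1 for a right turn, 0 for collinear points. The predicate is
    # exact whenever the coordinate type's arithmetic is exact.
    det = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (det > 0) - (det < 0)

# Rational coordinates such as 1/3 cannot be represented in binary
# floating point, but Fraction keeps the collinearity test error-free.
a = (Fraction(0), Fraction(0))
b = (Fraction(1, 3), Fraction(1, 3))
c = (Fraction(2, 3), Fraction(2, 3))
print(orient2d(a, b, c))  # 0: exactly collinear
eps = Fraction(1, 10**18)
print(orient2d(a, b, (c[0], c[1] + eps)))  # 1: a tiny exact perturbation is detected
```

With fixed-precision floating point, near-degenerate inputs like these can yield the wrong sign after rounding; choosing a rational arithmetic system for the predicate removes that failure mode at the cost of speed.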
This monograph gives a thorough treatment of the celebrated compositions of signature and encryption that allow for verifiability, that is, to efficiently prove properties about the encrypted data. This study is provided in the context of two cryptographic primitives: (1) designated confirmer signatures, an opaque signature which was introduced to control the proliferation of certified copies of documents, and (2) signcryption, a primitive that offers privacy and authenticity at once in an efficient way. This book is a useful resource to researchers in cryptology and information security, graduate and PhD students, and security professionals.
"Examines classic algorithms, geometric diagrams, and mechanical principles for enhanced visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming."
This book contains extended and revised versions of the best papers presented at the 17th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2009, held in Florianópolis, Brazil, in October 2009. The 8 papers included in the book, together with two keynote talks, were carefully reviewed and selected from 27 papers presented at the conference. The papers cover a wide variety of topics in VLSI technology and advanced research, addressing the current trend toward increasing chip integration and technology process advancements, which bring stimulating new challenges both at the physical and system-design levels, as well as in the testing of these systems.
Critical Infrastructure Protection II describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. It also highlights the importance of weaving science, technology and policy in crafting sophisticated solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This book is the second volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of twenty edited papers from the Second Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection held at George Mason University, Arlington, Virginia, USA in the spring of 2008.
This book aims to present the impact of Artificial Intelligence (AI) and Big Data in healthcare for medical decision making and data analysis in myriad fields including Radiology, Radiomics, Radiogenomics, Oncology, Pharmacology, COVID-19 prognosis, Cardiac imaging, Neuroradiology, Psychiatry and others. This will include topics such as the Artificial Intelligence of Things (AIoT), Explainable Artificial Intelligence (XAI), Distributed learning, Blockchain of Internet of Things (BIoT), Cybersecurity, and the Internet of Medical Things (IoMT). Healthcare providers will learn how to leverage Big Data analytics and AI as a methodology for accurate analysis based on their clinical data repositories and clinical decision support. The capacity to recognize patterns and transform large amounts of data into usable information for precision medicine assists healthcare professionals in achieving these objectives. Intelligent Health has the potential to monitor patients at risk with underlying conditions and track their progress during therapy. Some of the greatest challenges in using these technologies are based on legal and ethical concerns of using medical data and adequately representing and servicing disparate patient populations. One major potential benefit of this technology is to make health systems more sustainable and standardized. Privacy and data security, establishing protocols, appropriate governance, and improving technologies will be among the crucial priorities for Digital Transformation in Healthcare.
This text explains the fundamental principles of algorithms available for performing arithmetic operations on digital computers. These include basic arithmetic operations like addition, subtraction, multiplication, and division in fixed-point and floating-point number systems as well as more complex operations such as square root extraction and evaluation of exponential, logarithmic, and trigonometric functions. The algorithms described are independent of the particular technology employed for their implementation.
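As a concrete instance of a technology-independent algorithm of this kind, here is a sketch (in Python, chosen only for illustration) of the classic digit-by-digit binary square-root extraction, a shift-and-subtract scheme that retires one result bit per iteration:

```python
import math

def isqrt_binary(n):
    """Digit-by-digit (radix-2) integer square root.

    A restoring shift-and-subtract scheme of the kind used in
    hardware: each pass of the loop decides one bit of the result,
    independent of any particular implementation technology.
    """
    root = 0
    bit = 1 << ((n.bit_length() // 2) * 2)  # start at an even power of two
    while bit > n:                           # trim down to the working range
        bit >>= 2
    while bit:
        if n >= root + bit:
            n -= root + bit                  # subtract the trial term
            root = (root >> 1) + bit         # accept the result bit
        else:
            root >>= 1                       # reject the result bit
        bit >>= 2
    return root

# sanity check against the library routine
assert all(isqrt_binary(k) == math.isqrt(k) for k in range(10_000))
print(isqrt_binary(10**12))  # 1000000
```

The same shift-and-subtract structure underlies many hardware square-root units; elementary functions like the logarithm and trigonometric functions use analogous digit-recurrence (e.g. CORDIC-style) iterations.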
Your secret weapon to understanding--and using!--one of the most powerful influences in the world today From your Facebook News Feed to your most recent insurance premiums--even making toast!--algorithms play a role in virtually everything that happens in modern society and in your personal life. And while they can seem complicated from a distance, the reality is that, with a little help, anyone can understand--and even use--these powerful problem-solving tools! In Algorithms For Dummies, you'll discover the basics of algorithms, including what they are, how they work, where you can find them (spoiler alert: everywhere!), who invented the most important ones in use today (a Greek philosopher is involved), and how to create them yourself. You'll also find: Dozens of graphs and charts that help you understand the inner workings of algorithms Links to an online repository on GitHub for constant access to updated code Step-by-step instructions on how to use Google Colaboratory, a zero-setup coding environment that runs right from your browser Whether you're a curious internet user wondering how Google seems to always know the right answer to your question or a beginning computer science student looking for a head start on your next class, Algorithms For Dummies is the can't-miss resource you've been waiting for.
In the research area of computer science, practitioners are constantly searching for faster platforms with pertinent results. With analytics that span environmental development to computer hardware emulation, problem-solving algorithms are in high demand. Field-Programmable Gate Array (FPGA) is a promising computing platform that can be significantly faster for some applications and can be applied to a variety of fields. FPGA Algorithms and Applications in the IoT, AI, and High-Performance Computing provides emerging research exploring the theoretical and practical aspects of computable algorithms and applications within robotics and electronics development. Featuring coverage on a broad range of topics such as neuroscience, bioinformatics, and artificial intelligence, this book is ideally designed for computer science specialists, researchers, professors, and students seeking current research on cognitive analytics and advanced computing.
In delivering lectures and writing books, we were most often forced to pay no attention to a great body of interesting results and useful algorithms appearing in numerous sources and only occasionally encountered. It was absolutely clear that most of these results would finally be forgotten, because it is impossible to run through the entire variety of sources where these materials could be published. Therefore, we decided to do what we can to correct this situation. We discussed this problem with Ershov and came to the idea of writing an encyclopedia of algorithms on graphs, focusing our main attention on the algorithms already used in programming and their generalizations or modifications. We thought it reasonable to group all graphs into certain classes and place the algorithms developed for each class into a separate book. The existence of trees, i.e., a class of graphs especially important for programming, also supported this decision. This monograph is the first but, as we hope, not the last book written as part of our project. It was preceded by two books, "Algorithms on Trees" (1984) and "Algorithms of Processing of Trees" (1990), small editions of which were published at the Computer Center of the Siberian Division of the Russian Academy of Sciences. The books were distributed immediately, and this made our decision to prepare a combined monograph on the basis of these books even stronger.
This book provides a comprehensive picture of fog computing technology, including fog architectures, latency-aware application management with real-time requirements, security and privacy issues, and fog analytics, in wide-ranging application scenarios such as M2M device communication, smart homes, smart vehicles, augmented reality and transportation management. It explores the research issues involved in applying traditional shallow machine learning and deep learning techniques to big data analytics. It surveys global research advances in extending conventional unsupervised or clustering algorithms, supervised and semi-supervised algorithms, and association rule mining algorithms to big data scenarios. Further, it discusses deep learning applications of big data analytics in computer vision and speech processing, and describes applications such as semantic indexing and data tagging. Lastly, it identifies 25 unsolved research problems and research directions in fog computing, as well as in applying deep learning techniques to big data analytics, such as dimensionality reduction in high-dimensional data and improved formulation of data abstractions, along with possible directions for their solutions.
I want to express my sincere thanks to all authors who submitted research papers to support the Third IFIP International Conference on Computer and Computing Technologies in Agriculture and the Third Symposium on Development of Rural Information (CCTA 2009) held in China, during October 14-17, 2009. This conference was hosted by the CICTA (EU-China Centre for Information & Communication Technologies, China Agricultural University), China National Engineering Research Center for Information Technology in Agriculture, Asian Conference on Precision Agriculture, International Federation for Information Processing, Chinese Society of Agricultural Engineering, Beijing Society for Information Technology in Agriculture, and the Chinese Society for Agricultural Machinery. The platinum sponsors include the Ministry of Science and Technology of China, the Ministry of Agriculture of China, and the Ministry of Education of China, among others. The CICTA (EU-China Centre for Information & Communication Technologies, China Agricultural University) focuses on research and development of advanced and practical technologies applied in agriculture and on promoting international communication and cooperation. It has successfully held three International Conferences on Computer and Computing Technologies in Agriculture, namely CCTA 2007, CCTA 2008 and CCTA 2009. Sustainable agriculture is the focus of the whole world currently, and therefore the application of information technology in agriculture is becoming more and more important. 'Informatized agriculture' has been sought by many countries recently in order to scientifically manage agriculture to achieve low costs and high incomes.
This book shares essential insights into how the social sciences and technology could foster new advances in managing the complexity inherent to the criminal and digital policing landscape. Said landscape is both dynamic and intricate, emanating as it does from crimes that are both persistent and transnational. Globalization, human and drug trafficking, cybercrime, terrorism, and other forms of transnational crime can have significant impacts on societies around the world. This necessitates a reassessment of what crime, national security and policing mean. Recent global events such as human and drug trafficking, the COVID-19 pandemic, violent protests, cyber threats and terrorist activities underscore the vulnerabilities of our current security and digital policing posture. This book presents concepts, theories and digital policing applications, offering a comprehensive analysis of current and emerging trends in digital policing. Pursuing an evidence-based approach, it offers an extraordinarily perceptive and detailed view of issues and solutions regarding the crime and digital policing landscape. To this end, it highlights current technological and methodological solutions as well as advances concerning integrated computational and analytical solutions deployed in digital policing. It also provides a comprehensive analysis of the technical, ethical, legal, privacy and civil liberty challenges stemming from the aforementioned advances in the field of digital policing; and accordingly, offers detailed recommendations supporting the design and implementation of best practices including technical, ethical and legal approaches when conducting digital policing. The research gathered here fits well into the larger body of work on various aspects of AI, cybersecurity, national security, digital forensics, cyberterrorism, ethics, human rights, cybercrime and law. 
It provides a valuable reference for law enforcement, policymakers, cybersecurity experts, digital forensic practitioners, researchers, graduates and advanced undergraduates, and other stakeholders with an interest in counter-terrorism. In addition to this target audience, it offers a valuable tool for lawyers, criminologists and technology enthusiasts.
Our cyber defenses are static and are governed by lengthy processes, e.g., for testing and security patch deployment. Adversaries can plan their attacks carefully over time and launch attacks at cyber speeds at any given moment. We need a new class of defensive strategies that would force adversaries to continually engage in reconnaissance and re-planning of their cyber operations. One such strategy is to present adversaries with a moving target where the attack surface of a system keeps changing. "Moving Target Defense II: Application of Game Theory and Adversarial Modeling" includes contributions from world experts in the cyber security field. In the first volume of MTD, we presented MTD approaches based on software transformations, and MTD approaches based on network and software stack configurations. In this second volume of MTD, a group of leading researchers describe game-theoretic, cyber maneuver, and software transformation approaches for constructing and analyzing MTD systems. Designed as a professional book for practitioners and researchers working in the cyber security field, it will also be valuable to advanced-level students and researchers focused on computer science as a secondary textbook or reference.
This book takes an interdisciplinary approach to contribute to the ongoing development of human-AI interaction. Current debate and development of AI is "algorithm-driven" or technically oriented rather than human-centered. At present, there is no systematic interdisciplinary discussion to effectively deal with the issues and challenges arising from AI. This book offers critical analysis of the logic and social implications of algorithmic processes. Reporting from the processes of scientific research, the results can be useful for understanding the relationship between algorithms and humans, allowing AI designers to assess the quality of meaningful interactions with AI systems.