Structural operational semantics is a simple, yet powerful mathematical theory for describing the behaviour of programs in an implementation-independent manner. This book provides a self-contained introduction to structural operational semantics, featuring semantic definitions using big-step and small-step semantics of many standard programming language constructs, including control structures, structured declarations and objects, parameter mechanisms and procedural abstraction, concurrency, nondeterminism and the features of functional programming languages. Along the way, the text introduces and applies the relevant proof techniques, including forms of induction and notions of semantic equivalence (including bisimilarity). Thoroughly class-tested, this book has evolved from lecture notes used by the author over a 10-year period at Aalborg University to teach undergraduate and graduate students. The result is a thorough introduction that makes the subject clear to students and computing professionals without sacrificing its rigour. No experience with any specific programming language is required.
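To give a flavor of the small-step style the blurb mentions, here is a minimal, hypothetical sketch (not taken from the book) of small-step reduction for arithmetic expressions, with expressions encoded as Python tuples:

```python
# Small-step (structural operational) semantics for arithmetic expressions.
# An expression is either an int (a value) or a tuple ("add"/"mul", e1, e2).

def step(e):
    """Perform one small-step reduction; ints are values and cannot step."""
    op, e1, e2 = e
    if isinstance(e1, tuple):                 # reduce the left operand first
        return (op, step(e1), e2)
    if isinstance(e2, tuple):                 # then reduce the right operand
        return (op, e1, step(e2))
    return e1 + e2 if op == "add" else e1 * e2   # both operands are values

def evaluate(e):
    """Iterate small steps until only a value remains."""
    while isinstance(e, tuple):
        e = step(e)
    return e

# (1 + 2) * 3 reduces step by step to 9
print(evaluate(("mul", ("add", 1, 2), 3)))
```

Each call to `step` performs exactly one reduction, which is the defining characteristic of the small-step presentation; a big-step semantics would instead relate an expression directly to its final value.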
This book constitutes the refereed proceedings of the 40th International Conference on Current Trends in Theory and Practice of Computer Science, SOFSEM 2014, held in Novy Smokovec, Slovakia, in January 2014. The 40 revised full papers presented in this volume were carefully reviewed and selected from 104 submissions. The book also contains 6 invited talks. The contributions cover topics such as: Foundations of Computer Science, Software and Web Engineering, as well as Data, Information and Knowledge Engineering and Cryptography, Security and Verification.
A more accessible approach than most competitor texts, which move into advanced, research-level topics too quickly for today's students. Part I is comprehensive in providing all necessary mathematical underpinning, particularly for those who need more opportunity to develop their mathematical competence. More confident students may move directly to Part II and dip back into Part I as a reference. Ideal for use as an introductory text for courses in quantum computing. Fully worked examples illustrate the application of mathematical techniques. Exercises throughout develop concepts and enhance understanding. End-of-chapter exercises offer more practice in developing a secure foundation.
This new volume provides the information needed to understand the simplex method, the revised simplex method, the dual simplex method, and more for solving linear programming problems. Following a logical order, the book first gives a mathematical model of the linear programming problem and describes the usual assumptions under which the problem is solved. It gives a brief description of classic algorithms for solving linear programming problems, as well as some theoretical results. It goes on to explain the definitions and solutions of linear programming problems, outlining the simplest geometric methods and showing how they can be implemented. Practical examples are included along the way. The book concludes with a discussion of multi-criteria decision-making methods. Advances in Optimization and Linear Programming is a highly useful guide to linear programming for professors and students in optimization and linear programming.
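The "geometric methods" mentioned above rest on a simple fact: the optimum of a linear program lies at a vertex of the feasible polygon. As an illustrative sketch (the example LP is invented, not from the book), a two-variable problem can be solved by intersecting constraint boundaries pairwise and picking the best feasible vertex:

```python
# Geometric solution of a 2-variable LP: enumerate vertices of the feasible
# region and pick the one maximizing the objective. Illustrative sketch only.
from itertools import combinations

# maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6,  x >= 0,  y >= 0
# Every constraint is written as a*x + b*y <= c (x >= 0 becomes -x <= 0).
constraints = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]

def vertices(cons):
    """Intersect constraint boundaries pairwise; keep feasible points."""
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                      # parallel boundaries, no vertex
        x = (c1 * b2 - c2 * b1) / det     # Cramer's rule for the 2x2 system
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):
            yield (x, y)

best = max(vertices(constraints), key=lambda p: 3 * p[0] + 2 * p[1])
print(best)  # optimal vertex; objective value is 3*x + 2*y there
```

The simplex method generalizes this idea to many dimensions: instead of enumerating all vertices, it walks from vertex to adjacent vertex, improving the objective at every step.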
This book explores issues of diversity and inclusion in relation to artificial intelligence (AI). The author leads a research group on Digitalization and Robotization of Society at NTNU Norwegian University of Science and Technology.
The best-selling 'Algorithmics' presents the most important concepts, methods and results that are fundamental to the science of computing. It starts by introducing the basic ideas of algorithms, including their structures and methods of data manipulation. It then goes on to demonstrate how to design accurate and efficient algorithms, and discusses their inherent limitations. As the author himself says in the preface to the book: 'This book attempts to present a readable account of some of the most important and basic topics of computer science, stressing the fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'.
A best-seller in its French edition, this book's construction is original, and its success in the French market demonstrates its appeal. It is based on three principles: 1. An organization of the chapters by families of algorithms: exhaustive search, divide and conquer, etc. By contrast, there is no chapter devoted solely to a systematic exposition of, say, algorithms on strings; some of these will be found in different chapters. 2. For each family of algorithms, an introduction is given to the mathematical principles and the issues of a rigorous design, with one or two pedagogical examples. 3. For the most part, the book details 150 problems, spanning seven families of algorithms. For each problem, a precise and progressive statement is given. More importantly, a complete solution is detailed with respect to the design principles that have been presented; often, classical errors are pointed out. Roughly speaking, two thirds of the book are devoted to the detailed rational construction of the solutions.
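Divide and conquer, one of the algorithm families named above, can be illustrated with the classic merge sort (a standard textbook example, not necessarily one of the book's 150 problems):

```python
# Divide and conquer: split the input in half, solve each half recursively,
# then combine the sub-solutions (here, by merging two sorted lists).
def merge_sort(xs):
    """Return a sorted copy of xs using merge sort."""
    if len(xs) <= 1:
        return xs                            # base case: already sorted
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]     # append whichever half remains

print(merge_sort([5, 2, 4, 1, 3]))  # → [1, 2, 3, 4, 5]
```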
Physically unclonable functions (PUFs) are innovative physical security primitives that produce unclonable and inherent instance-specific measurements of physical objects; in many ways they are the inanimate equivalent of biometrics for human beings. Since they are able to securely generate and store secrets, they allow us to bootstrap the physical implementation of an information security system. In this book the author discusses PUFs in all their facets: the multitude of their physical constructions, the algorithmic and physical properties which describe them, and the techniques required to deploy them in security applications. The author first presents an extensive overview and classification of PUF constructions, with a focus on so-called intrinsic PUFs. He identifies subclasses, implementation properties, and design techniques used to amplify submicroscopic physical distinctions into observable digital response vectors. He lists the useful qualities attributed to PUFs and captures them in descriptive definitions, identifying the truly PUF-defining properties in the process, and he also presents the details of a formal framework for deploying PUFs and similar physical primitives in cryptographic reductions. The author then describes a silicon test platform carrying different intrinsic PUF structures which was used to objectively compare their reliability, uniqueness, and unpredictability based on experimental data. In the final chapters, the author explains techniques for PUF-based entity identification, entity authentication, and secure key generation. He proposes practical schemes that implement these techniques, and derives and calculates measures for assessing different PUF constructions in these applications based on the quality of their response statistics. 
Finally, he presents a fully functional prototype implementation of a PUF-based cryptographic key generator, demonstrating the full benefit of using PUFs and the efficiency of the processing techniques described. This is a suitable introduction and reference for security researchers and engineers, and graduate students in information security and cryptography.
This open access handbook describes foundational issues, methodological approaches and examples of how to analyse and model data using Computational Social Science (CSS) for policy support. Up to now, CSS studies have mostly been developed on a small, proof-of-concept scale, which has prevented CSS from unleashing its potential to provide systematic impact on the policy cycle, from improving the understanding of societal problems to the definition, assessment, evaluation, and monitoring of policies. The aim of this handbook is to fill this gap by exploring ways to analyse and model data for policy support, and to advocate the adoption of CSS solutions for policy by raising awareness of existing implementations of CSS in policy-relevant fields. To this end, the book explores applications of computational methods and approaches such as big data, machine learning, statistical learning, sentiment analysis, text mining, systems modelling, and network analysis to different problems in the social sciences. The book is structured into three parts: the first chapters, on foundational issues, open with an exposition and description of key policymaking areas where CSS can provide insights and information. In detail, these chapters cover public policy, governance, data justice and other ethical issues. Part two consists of chapters on methodological aspects, dealing with issues such as the modelling of complexity, natural language processing, validity and lack of data, and innovation in official statistics. Finally, Part three describes the application of computational methods, and the associated challenges and opportunities, in various social science areas, including economics, sociology, demography, migration, climate change, epidemiology, geography, and disaster management.
The target audience of the book spans from the scientific community engaged in CSS research to policymakers interested in evidence-informed policy interventions, but also includes private companies holding data that can be used to study social sciences and are interested in achieving a policy impact.
A new era of complexity science is emerging, in which nature- and bio-inspired principles are being applied to provide solutions. At the same time, the complexity of systems is increasing due to models such as the Internet of Things (IoT) and fog computing. Will complexity science, applying the principles of nature, be able to tackle the challenges posed by highly complex networked systems? Bio-Inspired Optimization in Fog and Edge Computing: Principles, Algorithms, and Systems is an attempt to answer this question. It presents innovative, bio-inspired solutions for fog and edge computing and highlights the role of machine learning and informatics. Nature- or biologically-inspired techniques are successful tools for understanding and analyzing collective behavior. As this book demonstrates, algorithms and mechanisms of self-organization in complex natural systems have been used to solve optimization problems, particularly in complex systems that are adaptive, ever-evolving, and distributed in nature. The chapters look at ways of enhancing the performance of fog networks in real-world applications using nature-based optimization techniques. They discuss challenges and provide solutions to concerns over security, privacy, and power consumption in cloud data center nodes and fog computing networks. The book also examines how: The existing fog and edge architecture is used to provide solutions to future challenges. A geographical information system (GIS) can be used with fog computing to help users in an urban region access prime healthcare. An optimization framework helps in cloud resource management. Fog computing can improve the quality, quantity, long-term viability, and cost-effectiveness of agricultural production. Virtualization can support fog computing, increase the resources to be allocated, and be applied to different network layers.
The combination of fog computing and IoT or cloud computing can help healthcare workers predict and analyze diseases in patients.
The book describes state-of-the-art advances in simulators and emulators for quantum computing. It introduces the main concepts of quantum computing, defining qubits, explaining the parallelism behind any quantum computation, describing measurement of the quantum state of information and explaining the processes of quantum bit entanglement, collapsed states and cloning. The book reviews the concepts of unitary, binary and ternary quantum operators, as well as the computation implied by each operator. It provides details of the quantum processor, presenting its architecture, which is validated via execution simulation of some quantum instructions.
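The core simulation idea behind such tools can be sketched in a few lines of NumPy: a quantum state is a complex vector, gates are unitary matrices, and entanglement arises from multi-qubit gates. A minimal, generic illustration (not the book's simulator) building a Bell state:

```python
# State-vector simulation of two qubits: a Hadamard gate creates
# superposition, then a CNOT gate entangles the pair into a Bell state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate (unitary)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])     # controlled-NOT gate

zero = np.array([1.0, 0.0])                       # the |0> basis state
state = np.kron(H @ zero, zero)                   # (|0>+|1>)/sqrt(2) tensor |0>
bell = CNOT @ state                               # (|00>+|11>)/sqrt(2)
print(np.round(bell, 3))                          # amplitudes of the Bell state
```

Measuring either qubit of `bell` yields 0 or 1 with probability 1/2 each, and the two outcomes are perfectly correlated, which is the "entanglement" and "collapsed state" behavior the blurb refers to.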
Describes the contribution of soft computing techniques towards a new paradigm shift. Explores soft computing techniques in a systematic manner, from their initial stage to recent developments in this area. Presents a systematic application of fuzzy logic in mathematical sciences and decision-making. Examines the application of soft computing in health sciences and in the modeling of epidemics, including the effects of vaccination. Discusses the application of soft computing techniques in the modeling of infectious diseases.
Matrix Algorithms in MATLAB focuses on the MATLAB code implementations of matrix algorithms. The MATLAB codes presented in the book are tested with thousands of runs of MATLAB randomly generated matrices, and the notation in the book follows the MATLAB style to ensure a smooth transition from formulation to the code, with MATLAB codes discussed in this book kept to within 100 lines for the sake of clarity. The book provides an overview and classification of the interrelations of various algorithms, as well as numerous examples to demonstrate code usage and the properties of the presented algorithms. Despite the wide availability of computer programs for matrix computations, it continues to be an active area of research and development. New applications, new algorithms, and improvements to old algorithms are constantly emerging.
This book describes simple to complex ASIC design practical scenarios using Verilog. It builds a story from the basic fundamentals of ASIC designs to advanced RTL design concepts using Verilog. Looking at current trends of miniaturization, the contents provide practical information on the issues in ASIC design and synthesis using Synopsys DC and their solution. The book explains how to write efficient RTL using Verilog and how to improve design performance. It also covers architecture design strategies, multiple clock domain designs, low-power design techniques, DFT, pre-layout STA and the overall ASIC design flow with case studies. The contents of this book will be useful to practicing hardware engineers, students, and hobbyists looking to learn about ASIC design and synthesis.
Covers different technologies, such as AI, IoT and signal processing, in the context of biomedical applications. Reviews medical image analysis, disease detection, and prediction. Describes the advantages of recent technologies for medical record keeping through electronic health records (EHRs). Presents state-of-the-art research in the field of biomedical engineering using various physiological signals. Explores different biosensors used in healthcare applications using IoT.
With the rapid penetration of technology into varied application domains, existing cities are becoming more seamlessly connected. Cities become smart by introducing ICT into the classical city infrastructure for its management. According to the McKenzie Report, about 68% of the world population will migrate towards urban settlements in the near future. This migration is largely driven by the improved Quality of Life (QoL) and livelihood in urban settlements. In the light of urbanization, climate change, democratic flaws, and rising urban welfare expenditures, smart cities have emerged as an important approach to society's future development. Smart cities achieve enhanced QoL by giving people smart information regarding healthcare, transportation, smart parking, smart traffic structures, smart homes, smart agronomy, community security, etc. Typically, in smart cities data is sensed by sensor devices and provided to end users for further use. This sensitive data is transferred over the internet, creating higher chances for adversaries to breach it. With privacy and security as the areas of prime focus, this book covers the most prominent security vulnerabilities associated with varied application areas such as healthcare, manufacturing, transportation, education and agriculture. Furthermore, the massive amount of data generated by the ubiquitous sensors placed across smart cities needs to be handled in an effective, efficient, secure and privacy-preserving manner. Since a typical smart city ecosystem is data-driven, it is imperative to manage this data in an optimal manner. Enabling technologies like the Internet of Things (IoT), Natural Language Processing (NLP), Blockchain Technology, Deep Learning, Machine Learning, Computer Vision, Big Data Analytics, Next Generation Networks and Software Defined Networks (SDN) provide exemplary benefits if they are integrated into the classical city ecosystem in an effective manner.
The application of Artificial Intelligence (AI) is expanding across many domains in the smart city, such as infrastructure, transportation, environmental protection, power and energy, privacy and security, governance, data management, healthcare, and more. AI has the potential to improve human health, prosperity, and happiness by reducing our reliance on manual labor and accelerating our progress in the sciences and technologies. NLP is an extensive domain of AI and is used in collaboration with machine learning and deep learning algorithms for clinical informatics and data processing. In modern smart cities, blockchain provides a complete framework that controls the city operations and ensures that they are managed as effectively as possible. Besides having an impact on our daily lives, it also facilitates many areas of city management.
This book highlights selected papers from the 4th ICSA-Canada Chapter Symposium, as well as invited articles from established researchers in the areas of statistics and data science. It covers a variety of topics, including methodology development in data science, such as methodology in the analysis of high dimensional data, feature screening in ultra-high dimensional data and natural language ranking; statistical analysis challenges in sampling, multivariate survival models and contaminated data, as well as applications of statistical methods. With this book, readers can make use of frontier research methods to tackle their problems in research, education, training and consultation.
The volume contains the latest research on software reliability assessment, testing, quality management, inventory management, mathematical modeling, analysis using soft computing techniques and management analytics. It links researcher and practitioner perspectives from different branches of engineering and management, and from around the world, for a bird's-eye view of the topics. The interdisciplinarity of engineering and management research is widely recognized and considered to be most appropriate and significant in the fast-changing dynamics of today's times. With insights from the volume, companies looking to drive decision-making gain actionable insights at each level and for every role, using key indicators to generate mobile-enabled scorecards, time-series analyses with charts, and dashboards. At the same time, the book provides scholars with a platform to derive maximum utility in the area by subscribing to the idea of managing business through performance and business analytics.
Probability as an Alternative to Boolean Logic
While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data.
Decision-Making Tools and Methods for Incomplete and Uncertain Data
Emphasizing probability as an alternative to Boolean logic, Bayesian Programming covers new methods to build probabilistic programs for real-world applications. Written by the team who designed and implemented an efficient probabilistic inference engine to interpret Bayesian programs, the book offers many Python examples that are also available on a supplementary website together with an interpreter that allows readers to experiment with this new approach to programming.
Principles and Modeling
Only requiring a basic foundation in mathematics, the first two parts of the book present a new methodology for building subjective probabilistic models. The authors introduce the principles of Bayesian programming and discuss good practices for probabilistic modeling. Numerous simple examples highlight the application of Bayesian modeling in different fields.
Formalism and Algorithms
The third part synthesizes existing work on Bayesian inference algorithms, since an efficient Bayesian inference engine is needed to automate the probabilistic calculus in Bayesian programs. Many bibliographic references are included for readers who would like more details on the formalism of Bayesian programming, the main probabilistic models, general-purpose algorithms for Bayesian inference, and learning problems.
FAQs
Along with a glossary, the fourth part contains answers to frequently asked questions.
The authors compare Bayesian programming and possibility theories, discuss the computational complexity of Bayesian inference, cover the irreducibility of incompleteness, and address the subjectivist versus objectivist epistemology of probability.
The First Steps toward a Bayesian Computer
A new modeling methodology, new inference algorithms, new programming languages, and new hardware are all needed to create a complete Bayesian computing framework. Focusing on the methodology and algorithms, this book describes the first steps toward reaching that goal. It encourages readers to explore emerging areas, such as bio-inspired computing, and develop new programming languages and hardware architectures.
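The email-filtering example mentioned above comes down to Bayes' rule: updating a prior belief that a message is spam after observing a flagged word. A minimal sketch, with made-up probabilities chosen purely for illustration:

```python
# Bayes' rule applied to spam filtering: P(spam | word) from a prior and
# two likelihoods. All numbers are invented for the sake of the example.
p_spam = 0.4                      # prior: P(spam)
p_word_given_spam = 0.8           # likelihood: P(word | spam)
p_word_given_ham = 0.1            # likelihood: P(word | not spam)

# Evidence: total probability of seeing the word at all.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Posterior: probability the message is spam given that the word appeared.
posterior = p_word_given_spam * p_spam / p_word
print(round(posterior, 4))
```

A Bayesian program generalizes exactly this calculation: it declares the relevant variables, a decomposition of their joint distribution, and a question, and the inference engine automates the probabilistic calculus that we did here by hand.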
The book presents laboratory experiments concerning ARM microcontrollers, and discusses the architecture of the Tiva Cortex-M4 ARM microcontrollers from Texas Instruments, describing various ways of programming them. Given the meager peripherals and sensors available on the kit, the authors describe the design of Padma - a circuit board with a large set of peripherals and sensors that connects to the Tiva Launchpad and exploits the Tiva microcontroller family's on-chip features. ARM microcontrollers, which are classified as 32-bit devices, are currently the most popular of all microcontrollers. They cover a wide range of applications that extend from traditional 8-bit devices to 32-bit devices. Of the various ARM subfamilies, Cortex-M4 is a middle-level microcontroller that lends itself well to data acquisition and control as well as digital signal manipulation applications. Given the prominence of ARM microcontrollers, it is important that they should be incorporated in academic curriculums. However, there is a lack of up-to-date teaching material - textbooks and comprehensive laboratory manuals. In this book each of the microcontroller's resources - digital input and output, timers and counters, serial communication channels, analog-to-digital conversion, interrupt structure and power management features - is addressed in a set of more than 70 experiments to help teach a full semester course on these microcontrollers. Beyond these physical interfacing exercises, it describes an inexpensive BoB (break-out board) that allows students to learn how to design and build standalone projects, as well as a number of illustrative projects.
This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.
Automatic Differentiation (AD) is a maturing computational technology and has become a mainstream tool used by practicing scientists and computer engineers. The rapid advance of hardware computing power and AD tools has enabled practitioners to quickly generate derivative-enhanced versions of their code for a broad range of applications in applied research and development. "Automatic Differentiation of Algorithms" provides a comprehensive and authoritative survey of all recent developments, new techniques, and tools for AD use. The book covers all aspects of the subject: mathematics, scientific programming (i.e., use of adjoints in optimization) and implementation (i.e., memory management problems). A strong theme of the book is the relationships between AD tools and other software tools, such as compilers and parallelizers. A rich variety of significant applications are presented as well, including optimum-shape design problems, for which AD offers more efficient tools and techniques. Topics and features: * helpful introductory AD survey chapter for a brief overview of the field * extensive applications chapters, i.e., for circuit simulation, optimization and optimal-control shape design, structural mechanics, and multibody dynamical systems modeling * comprehensive bibliography for all current literature and results in the field * performance issues * optimal control sensitivity analysis * AD use with object-oriented software toolkits. The book is an ideal and accessible survey of recent developments and applications of AD tools and techniques for a broad scientific computing and computer engineering readership. Practitioners, professionals, and advanced graduates working in AD development will find the book a useful reference and essential resource for their work.
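The essence of forward-mode AD can be shown with dual numbers: every value carries its derivative, and arithmetic propagates both by the chain rule. The following toy sketch illustrates the technique in general; it is not the implementation of any specific AD tool surveyed in the book:

```python
# Forward-mode automatic differentiation via dual numbers: (val, dot) pairs
# where dot is the derivative with respect to the chosen input variable.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot        # value and its derivative

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)  # sum rule

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)  # product rule

def derivative(f, x):
    """Evaluate f and df/dx at x in a single forward pass."""
    return f(Dual(x, 1.0)).dot               # seed dx/dx = 1

# d/dx (x*x + 3x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + x * 3, 2.0))
```

Reverse-mode AD (the "adjoints" the blurb mentions) instead records the computation and propagates derivatives backward, which is far cheaper when one output depends on many inputs, as in optimization.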
This book presents a model of electromagnetic (EM) information leakage based on electromagnetic and information theory. It discusses anti-leakage, anti-interception and anti-reconstruction technologies from the perspectives of both computer science and electrical engineering. In the next five years, the threat posed by EM information leakage will only become greater, and the demand for protection will correspondingly increase. The book systematically introduces readers to the theory of EM information leakage and the latest technologies and measures designed to counter it, and puts forward an EM information leakage model that has established the foundation for new research in this area, paving the way for new technologies to counter EM information leakage. As such, it offers a valuable reference guide for all researchers and engineers involved in EM information leakage and countermeasures.
This book is designed as a reference book and presents a systematic approach to analyzing evolutionary and nature-inspired population-based search algorithms. Beginning with an introduction to optimization methods and algorithms and to various enzymes, the book then moves on to provide a unified framework of process optimization for enzymes with various algorithms. The book presents current research on various applications of machine learning and discusses optimization techniques to solve real-life problems. It compiles the different machine learning models for optimization of process parameters for the production of industrially important enzymes. The production and optimization of various enzymes produced by different microorganisms are elaborated in the book. It discusses the optimization methods that help minimize the error in developing patterns and classifications, which further helps improve prediction and decision-making. It covers the best-performing methods and approaches for optimizing sustainable enzyme production with AI integration in a real-time environment. Featuring valuable insights, the book helps readers explore new avenues leading towards multidisciplinary research discussions. The book is aimed primarily at advanced undergraduates and graduates studying machine learning, data science and industrial biotechnology. Researchers and professionals will also find this book useful.
Algorithms (mathematics) help in understanding the direct and indirect relationships between the plants that exist within a cropping system and other environmental factors. This book helps readers understand how yield is related to different growth parameters, how growth is influenced by different environmental phenomena, how best resources can be used for crop production, and so on. The numerical examples in the book guide students to coordinate the different parameters and understand the subject of Agronomy well. This book is divided into thirteen chapters and comprehensively covers the different agronomic aspects needed to understand the science of mathematical Agronomy and to meet current and future challenges related to cropping practices.