This book provides an overview of the functional encryption methods used in the academic and professional communities. It covers functional encryption algorithms and their modern applications in developing secure systems via entity authentication, message authentication, software security, cyber security, hardware security, the Internet of Things (IoT), cloud security, smart card technology, CAPTCHA, digital signatures, and digital watermarking. The book is organized into fifteen chapters; topics include the foundations of functional encryption, the impact of group theory in cryptosystems, elliptic curve cryptography, the XTR algorithm, pairing-based cryptography, NTRU algorithms, ring units, Cocks IBE schemes, Boneh-Franklin IBE, Sakai-Kasahara IBE, hierarchical identity-based encryption, attribute-based encryption, extensions of IBE and related primitives, and digital signatures. It explains the latest functional encryption algorithms in a simple way with examples; includes applications of functional encryption in information security, application security, and network security; and is relevant to academics, research scholars, software developers, and others.
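As a flavor of the elliptic curve arithmetic such a book covers, here is a minimal sketch of point addition and scalar multiplication over a small prime field. The curve y^2 = x^3 + 2x + 2 (mod 17) and base point are common textbook choices, not examples taken from this book:

```python
# Illustrative sketch only: toy elliptic-curve group arithmetic over a
# small prime field, the kind of operation underlying elliptic curve
# cryptography. Not secure at this size -- real curves use ~256-bit primes.
P, A = 17, 2          # field prime and curve coefficient a (with b = 2)
O = None              # point at infinity: the group identity

def ec_add(p1, p2):
    if p1 is O: return p2
    if p2 is O: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O                                    # inverse points
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P)  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P)         # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    # double-and-add scalar multiplication
    acc = O
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc
```

Repeated scalar multiplication of a public base point by secret scalars is the primitive behind elliptic-curve key exchange and signatures.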
This book provides an essential update for experienced data processing professionals, transaction managers and database specialists who are seeking system solutions beyond the confines of traditional approaches. It provides practical advice on how to manage complex transactions and share distributed databases on client servers and the Internet. Based on extensive research in over 100 companies in the USA, Europe, Japan and the UK, topics covered include: * the challenge of global transaction requirements within an expanding business perspective * how to handle long transactions and their constituent elements * possible benefits from object-oriented solutions * the contribution of knowledge engineering in transaction management * the Internet, the World Wide Web and transaction handling * systems software and transaction-processing monitors * OSF/1 and the Encina transaction monitor * active data transfers and remote procedure calls * serialization in a transaction environment * transaction locks, two-phase commit and deadlocks * improving transaction-oriented database management * the successful development of an increasingly complex transaction environment.
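The two-phase commit rule named in that topic list reduces to a few lines; the participant names and vote functions below are invented for illustration, not drawn from the book:

```python
# Two-phase commit, reduced to its core: the coordinator collects
# prepare votes (phase 1) and broadcasts a uniform commit/abort
# decision (phase 2). Participant names and votes are illustrative.
def two_phase_commit(participants):
    # Phase 1 (voting): every participant must successfully prepare.
    votes = {name: prepare() for name, prepare in participants.items()}
    # Phase 2 (completion): commit only on a unanimous yes.
    decision = "commit" if all(votes.values()) else "abort"
    return decision, votes

all_ready = {"orders_db": lambda: True, "billing_db": lambda: True}
one_down  = {"orders_db": lambda: True, "billing_db": lambda: False}
```

In a real transaction monitor such as Encina, the same all-or-nothing rule is enforced with durable logging and timeouts, which this sketch omits.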
This book discusses machine learning and artificial intelligence (AI) for agricultural economics. It is written with a view towards bringing the benefits of advanced analytics and prognostics capabilities to small scale farmers worldwide. This volume provides data science and software engineering teams with the skills and tools to fully utilize economic models to develop the software capabilities necessary for creating lifesaving applications. The book introduces essential agricultural economic concepts from the perspective of full-scale software development with the emphasis on creating niche blue ocean products. Chapters detail several agricultural economic and AI reference architectures with a focus on data integration, algorithm development, regression, prognostics model development and mathematical optimization. Upgrading traditional AI software development paradigms to function in dynamic agricultural and economic markets, this volume will be of great use to researchers and students in agricultural economics, data science, engineering, and machine learning as well as engineers and industry professionals in the public and private sectors.
This book focuses on the combination of IoT and data science, in particular how methods, algorithms, and tools from data science can effectively support IoT. The authors show how data science methodologies, techniques and tools can translate data into information, enabling the effectiveness and usefulness of new services offered by IoT stakeholders. The authors posit that if IoT is indeed the infrastructure of the future, data science is the key that can lead to a significant improvement of human life. The book aims to present innovative IoT applications as well as ongoing research that exploit modern data science approaches. Readers are offered issues and challenges in a cross-disciplinary scenario that involves both IoT and data science fields. The book features contributions from academics, researchers, and professionals from both fields.
This book describes concepts and tools needed for water resources management, including methods for modeling, simulation, optimization, big data analysis, data mining, remote sensing, geographical information systems, game theory, conflict resolution, system dynamics, agent-based models, multiobjective, multicriteria, and multiattribute decision making, and risk and uncertainty analysis, for better and sustainable management of water resources and consumption, thus mitigating the present and future global water shortage crisis. It presents the applications of these tools through case studies which demonstrate the benefits of proper management of water resources systems. The book acts as a reference for students, professors, industrial practitioners, and stakeholders in the field of water resources and hydrology.
This book highlights the design, use and structure of blockchain systems and decentralized ledger technologies (B/DLT) for use in the construction industry. Construction remains a fragmented change-resistant industry with chronic problems of underproductivity and a very low digitization factor compared to other fields. In parallel, the convergence, embedding and coordination of digital technologies in the physical world provides a unique opportunity for the construction industry to leap ahead and adopt fourth industrial revolution technologies. Within this context, B/DLT are an excellent fit for the digitization of the construction industry. B/DLT are effective in this as they organize and align digital and physical supply chains, produce stigmergic coordination out of decentralization, enable the governance of complex projects for multiple stakeholders, while enabling the creation of a new class of business models and legal instruments for construction.
Computational geometry as an area of research in its own right emerged in the early seventies of this century. Right from the beginning, it was obvious that strong connections of various kinds exist to questions studied in the considerably older field of combinatorial geometry. For example, the combinatorial structure of a geometric problem usually decides which algorithmic method solves the problem most efficiently. Furthermore, the analysis of an algorithm often requires a great deal of combinatorial knowledge. As it turns out, however, the connection between the two research areas commonly referred to as computational geometry and combinatorial geometry is not as lop-sided as it appears. Indeed, the interest in computational issues in geometry gives a new and constructive direction to the combinatorial study of geometry. It is the intention of this book to demonstrate that computational and combinatorial investigations in geometry are doomed to profit from each other. To reach this goal, I designed this book to consist of three parts, a combinatorial part, a computational part, and one that presents applications of the results of the first two parts. The choice of the topics covered in this book was guided by my attempt to describe the most fundamental algorithms in computational geometry that have an interesting combinatorial structure. In this early stage geometric transforms played an important role as they reveal connections between seemingly unrelated problems and thus help to structure the field.
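A small taste of that interplay is Andrew's monotone-chain convex hull, a standard computational-geometry algorithm whose correctness rests on a combinatorial fact about left and right turns. This is a generic sketch, not code from the book:

```python
# Andrew's monotone-chain convex hull in O(n log n): sort the points,
# then build lower and upper chains, popping any point that would make
# a non-left turn. Returns hull vertices in counter-clockwise order.
def cross(o, a, b):
    # z-component of (a - o) x (b - o); positive means a left turn
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def half_hull(seq):
        chain = []
        for p in seq:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain[:-1]  # each endpoint reappears in the other chain

    return half_hull(pts) + half_hull(pts[::-1])
```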
This book introduces state-of-the-art algorithms for data and computation privacy. It mainly focuses on searchable symmetric encryption algorithms and privacy-preserving multi-party computation algorithms. The book also introduces algorithms for breaking privacy, and gives intuition on how to design algorithms that counter privacy attacks. Some well-designed differential privacy algorithms are also included. Driven by lower cost, higher reliability, better performance, and faster deployment, data and computing services are increasingly outsourced to clouds. In this computing paradigm, one often has to store privacy-sensitive data with parties that one cannot fully trust, and perform privacy-sensitive computation with parties that, again, one cannot fully trust. For both scenarios, preserving data privacy and computation privacy is extremely important. After the Facebook-Cambridge Analytica data scandal and the implementation of the General Data Protection Regulation by the European Union, users are becoming more privacy aware and more concerned with their privacy in this digital world. This book targets database engineers, cloud computing engineers and researchers working in this field. Advanced-level students studying computer science and electrical engineering will also find this book useful as a reference or secondary text.
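One member of the differential privacy family, the Laplace mechanism, fits in a few lines; the epsilon value and query below are illustrative choices, not parameters from the book:

```python
# The Laplace mechanism: answer a counting query, then add noise drawn
# from Laplace(0, sensitivity/epsilon). A counting query changes by at
# most 1 when a single record changes, so its sensitivity is 1.
import math
import random

def laplace_noise(scale):
    # inverse-CDF sampling of a Laplace(0, scale) variate
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means stronger privacy and noisier answers; the noise scale grows as 1/epsilon.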
This book introduces the technological innovations of robotic vehicles. It presents the concepts required for self-driving cars on the road. In addition, readers can gain invaluable knowledge in the construction, programming, and control of the six-legged robot. The book also presents the controllers and aerodynamics of several different types of rotorcraft. It includes the simulation and flight of various kinds of rotor-propelled air vehicles, each under its own aerodynamic environment. The book is suitable for academia, educators, students, and researchers who are interested in autonomous vehicles, robotics, and rotor-propelled vehicles.
This book is written for software product teams that use AI to add intelligent models to their products, or are planning to do so. As AI adoption grows, it is becoming important that all AI-driven products can demonstrate they are not introducing any bias into the AI-based decisions they are making, as well as reducing any pre-existing bias or discrimination. The responsibility to ensure that AI models are ethical and make responsible decisions does not lie with the data scientists alone. The product owners and the business analysts are as important in ensuring bias-free AI as the data scientists on the team. This book addresses the part that these roles play in building a fair, explainable and accountable model, along with ensuring model and data privacy. Each chapter covers the fundamentals of the topic and then goes deep into the subject matter, providing the details that enable business analysts and data scientists to implement these fundamentals. AI research is one of the most active and growing areas of computer science and statistics. This book includes an overview of the many techniques that draw from the research or are created by combining different research outputs. Some of the techniques from relevant and popular libraries are covered, but deliberately not drawn on very heavily, as they are already well documented and new research is likely to replace some of them.
Books on computation in the marketplace tend to discuss the topics within specific fields. Many computational algorithms, however, share common roots. Great advantages emerge if numerical methodologies break the boundaries and find their uses across disciplines. Interdisciplinary Computing In Java Programming Language introduces readers of different backgrounds to the beauty of the selected algorithms. Serious quantitative researchers, writing customized codes for computation, enjoy cracking source codes as opposed to the black-box approach. Most C and Fortran programs, despite being slightly faster in program execution, lack built-in support for plotting and graphical user interface. This book selects Java as the platform where source codes are developed and applications are run, helping readers/users best appreciate the fun of computation. Interdisciplinary Computing In Java Programming Language is designed to meet the needs of a professional audience composed of practitioners and researchers in science and technology. This book is also suitable for senior undergraduate and graduate-level students in computer science, as a secondary text.
This book presents a new understanding of how control systems truly operate, and explains how to recognize, simulate, and improve control systems in all fields of activity. It also reveals the pervasive, ubiquitous and indispensable role of control processes in our life, and the need to develop a "control-oriented thinking" (based on uncomplicated but effective models derived from systems thinking), that is, a true "discipline of control." Over the book's thirteen chapters, Piero Mella shows that there are simple control systems (rather than complex ones) that can easily help us to manage complexity without drawing upon more sophisticated control systems. The book begins by reviewing the basic language of systems thinking and the models it allows users to create. It then introduces the control process, presenting the theoretical structure of three simple control systems we all can observe, in order to gain fundamental knowledge from them about the basic structure of a control system. Next, it presents the anatomy of the simplest "magic ring" and the general theoretical model of any control system. This is followed by an introduction to a general typology of control systems and a broader view of control systems, investigating multi-lever control systems and multi-objective systems. The book develops these concepts across various environments, increasingly broader in scope, to suggest to readers how to recognize manifestations of control systems in everyday life and in natural phenomena. Updated for the 2nd edition, new chapters explore control systems regulating the biological environment and organizations, with an in-depth study of the control of quality, productivity, production, stocks and costs. Finally, the book concludes by dealing with the learning process, problem-solving, and designing the logical structure of control systems.
This introductory book on quantum computing includes an emphasis on the development of algorithms. Appropriate for both university students and software developers interested in programming a quantum computer, this practical approach to modern quantum computing takes the reader through the required background and up to the latest developments. Beginning with introductory chapters on the required math and quantum mechanics, Fundamentals of Quantum Computing proceeds to describe four leading qubit modalities and explains the core principles of quantum computing in detail. Providing a step-by-step derivation of math and source code, some of the well-known quantum algorithms are explained in simple ways so the reader can try them either on IBM Q or the Microsoft QDK. The book also includes a chapter on adiabatic quantum computing and modern concepts such as topological quantum computing and surface codes. Features:
o Foundational chapters that build the necessary background on math and quantum mechanics.
o Examples and illustrations throughout provide a practical approach to quantum programming, with end-of-chapter exercises.
o Detailed treatment of four leading qubit modalities -- trapped-ion, superconducting transmons, topological qubits, and quantum dots -- teaches how qubits work, so that readers can understand how quantum computers work under the hood and devise efficient algorithms and error correction codes. Also introduces protected qubits, including 0-π qubits, fluxon parity protected qubits, and charge-parity protected qubits.
o Principles of quantum computing, such as the quantum superposition principle, quantum entanglement, quantum teleportation, the no-cloning theorem, quantum parallelism, and quantum interference, are explained in detail.
o A dedicated chapter on quantum algorithms explores both oracle-based and Quantum Fourier Transform-based algorithms in detail, with step-by-step math and working code that runs on IBM Qiskit and the Microsoft QDK.
Topics on the EPR paradox, quantum key distribution protocols, the density matrix formalism, and the stabilizer formalism are intriguing. While focusing on the universal gate model of quantum computing, this book also introduces adiabatic quantum computing and quantum annealing. The book includes a section on fault-tolerant quantum computing to make the discussions complete. The topics on quantum error correction, surface codes such as the toric code and the planar code, and protected qubits help explain how fault tolerance can be built at the system level.
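The superposition principle such a book explains can be seen in a tiny state-vector simulation; this stand-in uses plain Python rather than the IBM Qiskit or Microsoft QDK code the book works with:

```python
# Applying a Hadamard gate to |0> produces (|0> + |1>)/sqrt(2); the Born
# rule then gives probability 1/2 for each measurement outcome. Applying
# H again restores |0>, since H is its own inverse.
import math

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    # matrix-vector product over the amplitude vector
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(state))]

ket0  = [1.0, 0.0]
plus  = apply(H, ket0)                # equal superposition of |0> and |1>
probs = [abs(a) ** 2 for a in plus]   # measurement probabilities
back  = apply(H, plus)                # returns to |0>
```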
This book surveys the state-of-the-art in the theory of combinatorial games, that is, games not involving chance or hidden information. Enthusiasts will find a wide variety of exciting topics, from a trailblazing presentation of scoring to solutions of three-piece ending positions of bidding chess. Theories and techniques in many subfields are covered, such as universality, Wythoff Nim variations, misère play, partizan bidding (a.k.a. Richman games), loopy games, and the algebra of placement games. Also included are an updated list of unsolved problems, extremely efficient algorithms for taking and breaking games, a historical exposition of binary numbers and games by David Singmaster, chromatic Nim variations, renormalization for combinatorial games, and a survey of temperature theory by Elwyn Berlekamp, one of the founders of the field. The volume was initiated at the Combinatorial Game Theory Workshop, January 2011, held at the Banff International Research Station.
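The flavor of such games is easy to demonstrate with ordinary Nim, whose complete theory is a single XOR identity; this is a textbook sketch, not material from the volume:

```python
# Bouton's theorem: a Nim position is lost for the player to move
# exactly when the XOR (nim-sum) of the heap sizes is zero; otherwise
# some move makes the nim-sum zero, and that move wins.
from functools import reduce

def nim_sum(heaps):
    return reduce(lambda a, b: a ^ b, heaps, 0)

def winning_move(heaps):
    x = nim_sum(heaps)
    if x == 0:
        return None                  # every move loses against best play
    for i, h in enumerate(heaps):
        if (h ^ x) < h:
            return i, h - (h ^ x)    # remove this many from heap i
```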
This book presents high-quality research papers presented at the International Conference on Smart Computing and Cyber Security: Strategic Foresight, Security Challenges and Innovation (SMARTCYBER 2020) held during July 7-8, 2020, in the Department of Smart Computing, Kyungdong University, Global Campus, South Korea. The book includes selected works from academics and industrial experts in the field of computer science, information technology, and electronics and telecommunication. The content addresses challenges of cyber security.
This book develops survey data analysis tools in Python, to create and analyze cross-tab tables and data visuals, weight data, perform hypothesis tests, and handle special survey questions such as Check-all-that-Apply. In addition, the basics of Bayesian data analysis and its Python implementation are presented. Since surveys are widely used as the primary method to collect data, and ultimately information, on attitudes, interests, and opinions of customers and constituents, these tools are vital for private or public sector policy decisions. As a compact volume, this book uses case studies to illustrate methods of analysis essential for those who work with survey data in either sector. It focuses on two overarching objectives: Demonstrate how to extract actionable, insightful, and useful information from survey data; and Introduce Python and Pandas for analyzing survey data.
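The core of a weighted cross-tab can be sketched in plain Python; the book itself works in Pandas, and the survey fields and weights below are invented for illustration:

```python
# A weighted cross-tabulation: each response contributes its survey
# weight (default 1.0) to the cell indexed by its row and column values.
from collections import defaultdict

def crosstab(rows, row_key, col_key, weight_key=None):
    table = defaultdict(float)
    for r in rows:
        w = r[weight_key] if weight_key else 1.0
        table[(r[row_key], r[col_key])] += w
    return dict(table)

responses = [
    {"region": "north", "answer": "yes", "w": 1.2},
    {"region": "north", "answer": "no",  "w": 0.8},
    {"region": "south", "answer": "yes", "w": 1.0},
    {"region": "north", "answer": "yes", "w": 1.0},
]
```

With Pandas the same table is roughly one call, pd.crosstab(df.region, df.answer, values=df.w, aggfunc="sum").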
This volume deals with problems of modern effective algorithms for the numerical solution of the most frequently occurring elliptic partial differential equations. From the point of view of implementation, attention is paid to algorithms for both classical sequential and parallel computer systems. The first two chapters are devoted to fast algorithms for solving the Poisson and biharmonic equation. In the third chapter, parallel algorithms for model parallel computer systems of the SIMD and MIMD types are described. The implementation aspects of parallel algorithms for solving model elliptic boundary value problems are outlined for systems with matrix, pipeline and multiprocessor parallel computer architectures. A modern and popular multigrid computational principle, which offers a good opportunity for a parallel realization, is described in the next chapter. Several parallel variants based on this idea are presented, whereby methods and assignment strategies for hypercube systems are treated in more detail. The last chapter presents VLSI designs for solving special tridiagonal linear systems of equations arising from finite-difference approximations of elliptic problems. The book is intended for researchers interested in the development and application of fast algorithms for solving elliptic partial differential equations using advanced computer systems.
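The tridiagonal systems mentioned in the last chapter are exactly the ones the sequential Thomas algorithm solves in O(n); here is a generic sketch (not the VLSI design from the book), assuming a well-conditioned system:

```python
# The Thomas algorithm: O(n) Gaussian elimination specialized to the
# tridiagonal systems produced by finite-difference discretizations of
# elliptic problems, e.g. the 1-D Poisson equation.
def thomas(a, b, c, d):
    # Solves a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i],
    # with a[0] = 0 and c[n-1] = 0.
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```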
This book presents a summary of artificial intelligence and machine learning techniques in its first two chapters. The remaining chapters provide everything one must know, from basic artificial intelligence to modern machine intelligence techniques, including hybrid computational intelligence techniques, using the concepts of several real-life solved examples, project designs and research ideas. The solved examples, with more than 200 illustrations, are a great help to instructors, students, non-AI professionals, and researchers. Each example is discussed in detail with encoding, normalization, architecture, detailed design, process flow, and sample input/output. The summary of fundamental concepts together with solved examples is a unique combination and a highlight of this book.
This innovative textbook presents material for a course on modern statistics that incorporates Python as a pedagogical and practical resource. Drawing on many years of teaching and conducting research in various applied and industrial settings, the authors have carefully tailored the text to provide an ideal balance of theory and practical applications. Numerous examples and case studies are incorporated throughout, and comprehensive Python applications are illustrated in detail. A custom Python package is available for download, allowing students to reproduce these examples and explore others. The first chapters of the text focus on analyzing variability, probability models, and distribution functions. Next, the authors introduce statistical inference and bootstrapping, and variability in several dimensions and regression models. The text then goes on to cover sampling for estimation of finite population quantities and time series analysis and prediction, concluding with two chapters on modern data analytic methods. Each chapter includes exercises, data sets, and applications to supplement learning. Modern Statistics: A Computer-Based Approach with Python is intended for a one- or two-semester advanced undergraduate or graduate course. Because of the foundational nature of the text, it can be combined with any program requiring data analysis in its curriculum, such as courses on data science, industrial statistics, physical and social sciences, and engineering. Researchers, practitioners, and data scientists will also find it to be a useful resource with the numerous applications and case studies that are included. A second, closely related textbook is titled Industrial Statistics: A Computer-Based Approach with Python. It covers topics such as statistical process control, including multivariate methods, the design of experiments, including computer experiments and reliability methods, including Bayesian reliability. 
These texts can be used independently or for consecutive courses. The mistat Python package can be accessed at https://gedeck.github.io/mistat-code-solutions/ModernStatistics/
"In this book on Modern Statistics, the last two chapters on modern analytic methods contain what is very popular at the moment, especially in Machine Learning, such as classifiers, clustering methods and text analytics. But I also appreciate the previous chapters, since I believe that people using machine learning methods should be aware that they rely heavily on statistical ones. I very much appreciate the many worked-out cases, based on the longstanding experience of the authors. They are very useful to better understand, and then apply, the methods presented in the book. The use of Python corresponds to the best programming experience nowadays. For all these reasons, I think the book has also a brilliant and impactful future and I commend the authors for that."
Professor Fabrizio Ruggeri, Research Director at the National Research Council, Italy; President of the International Society for Business and Industrial Statistics (ISBIS); Editor-in-Chief of Applied Stochastic Models in Business and Industry (ASMBI)
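The bootstrapping the text introduces is easy to sketch; the sample data, resample count, and seed below are arbitrary illustrative choices, not an example from the book:

```python
# A bare-bones percentile bootstrap: resample the data with replacement
# many times, recompute the statistic each time, and read a confidence
# interval off the empirical percentiles of those replicates.
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=7):
    rng = random.Random(seed)
    stats = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sample = [4.1, 5.0, 3.8, 4.4, 5.2, 4.9, 4.6, 4.0, 5.1, 4.3]
mean = lambda xs: sum(xs) / len(xs)
ci = bootstrap_ci(sample, mean)
```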
This book aims to provide some insights into recently developed bio-inspired algorithms within recent emerging trends of fog computing, sentiment analysis, and data streaming, as well as to provide a more comprehensive approach to big data management from pre-processing to analytics to visualization phases. The subject area of this book is within the realm of computer science, notably algorithms (meta-heuristic and, more particularly, bio-inspired algorithms). Although application domains of these new algorithms may be mentioned, the scope of this book is not the application of algorithms to specific or general domains but an update on recent research trends for bio-inspired algorithms within a specific application domain or emerging area. These areas include data streaming, fog computing, and phases of big data management. One of the reasons for writing this book is that the bio-inspired approach does not receive much attention, yet shows considerable promise and diversity in its approach to many issues in big data and streaming. Some novel approaches of this book are the application of these algorithms to all phases of data management (not just a particular phase, such as data mining or business intelligence, as many books do), and an effective demonstration of the effectiveness of a selected algorithm within a chapter against comparative algorithms using the experimental method. Another novel approach is a brief overview and evaluation of traditional algorithms, both sequential and parallel, for use in data mining, in order to provide an overview of existing algorithms in use. This overview complements a further chapter on bio-inspired algorithms for data mining, to enable readers to make a more suitable choice of algorithm for data mining within a particular context. In all chapters, references for further reading are provided, and in selected chapters the authors also include ideas for future research.
This book comprises select proceedings of the 7th International Conference on Data Science and Engineering (ICDSE 2021). The contents of this book focus on responsible data science. This book tries to integrate research across diverse topics related to data science, such as fairness, trust, ethics, confidentiality, transparency, and accuracy. The chapters in this book represent research from different perspectives that offer novel theoretical implications that span multiple disciplines. The book will serve as a reference resource for researchers and practitioners in academia and industry.
Although rigidity has been studied since the time of Lagrange (1788) and Maxwell (1864), it is only in the last twenty-five years that it has begun to find applications in the basic sciences. The modern era starts with Laman (1970), who made the subject rigorous in two dimensions, followed by the development of computer algorithms that can test over a million sites in seconds and find the rigid regions, and the associated pivots, leading to many applications. This workshop was organized to bring together leading researchers studying the underlying theory, and to explore the various areas of science where applications of these ideas are being implemented.
This book highlights essential concepts in connection with the traditional bat algorithm and its recent variants, as well as its application to find optimal solutions for a variety of real-world engineering and medical problems. Today, swarm intelligence-based meta-heuristic algorithms are extensively being used to address a wide range of real-world optimization problems due to their adaptability and robustness. Developed in 2009, the bat algorithm (BA) is one of the most successful swarm intelligence procedures, and has been used to tackle optimization tasks for more than a decade. The BA's mathematical model is quite straightforward and easy to understand and enhance, compared to other swarm approaches. Hence, it has attracted the attention of researchers who are working to find optimal solutions in a diverse range of domains, such as N-dimensional numerical optimization, constrained/unconstrained optimization and linear/nonlinear optimization problems. Along with the traditional BA, its enhanced versions are now also being used to solve optimization problems in science, engineering and medical applications around the globe.
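A compact sketch of the standard bat algorithm conveys the idea; the population size, loudness, pulse rate, and test function below are illustrative defaults, not settings from the book:

```python
# Minimal bat algorithm (after Yang, 2009) minimizing a 2-D sphere
# function: each bat adjusts its velocity toward the best-known solution
# at a random frequency, and occasionally takes a local random walk
# around that best solution. Simplified: loudness and pulse rate are
# kept constant rather than updated per iteration.
import random

def bat_algorithm(f, dim=2, n=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    rng = random.Random(seed)
    loudness, pulse = 0.9, 0.5
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    best = min(x, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            freq = rng.uniform(0.0, 2.0)       # frequency tuning
            cand = []
            for d in range(dim):
                v[i][d] += (x[i][d] - best[d]) * freq
                cand.append(min(hi, max(lo, x[i][d] + v[i][d])))
            if rng.random() > pulse:
                # local random walk around the current best solution
                cand = [min(hi, max(lo, best[d] + 0.01 * rng.gauss(0, 1)))
                        for d in range(dim)]
            if f(cand) <= f(x[i]) and rng.random() < loudness:
                x[i] = cand                    # accept the new position
            if f(cand) <= f(best):
                best = cand[:]                 # track the global best
    return best

sphere = lambda p: sum(c * c for c in p)
```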
The theory of parsing is an important application area of the theory of formal languages and automata. The evolution of modern high-level programming languages created a need for a general and theoretically clean methodology for writing compilers for these languages. It was perceived that the compilation process had to be "syntax-directed," that is, the functioning of a programming language compiler had to be defined completely by the underlying formal syntax of the language. A program text to be compiled is "parsed" according to the syntax of the language, and the object code for the program is generated according to the semantics attached to the parsed syntactic entities. Context-free grammars were soon found to be the most convenient formalism for describing the syntax of programming languages, and accordingly methods for parsing context-free languages were developed. Practical considerations led to the definition of various kinds of restricted context-free grammars that are parsable by means of efficient deterministic linear-time algorithms.
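The syntax-directed style described above is easy to see in a miniature recursive-descent parser for arithmetic expressions; the toy grammar is the usual classroom one, not an example from the text:

```python
# Recursive-descent parsing of the grammar
#   expr   -> term (('+'|'-') term)*
#   term   -> factor (('*'|'/') factor)*
#   factor -> NUMBER | '(' expr ')'
# Each nonterminal becomes one function; the value of each input is
# computed directly from the semantics attached to the parsed entities.
import re

def parse(src):
    toks = re.findall(r"\d+|[()+\-*/]", src) + ["$"]  # "$" = end marker
    pos = 0

    def peek():
        return toks[pos]

    def eat():
        nonlocal pos
        pos += 1
        return toks[pos - 1]

    def expr():
        val = term()
        while peek() in "+-":
            val = val + term() if eat() == "+" else val - term()
        return val

    def term():
        val = factor()
        while peek() in "*/":
            val = val * factor() if eat() == "*" else val / factor()
        return val

    def factor():
        if peek() == "(":
            eat()          # consume '('
            val = expr()
            eat()          # consume ')'
            return val
        return int(eat())

    return expr()
```

One token of lookahead decides every step, which is what makes this parser deterministic and linear-time, in the spirit of the restricted grammars the book studies.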
The five-volume set IFIP AICT 630, 631, 632, 633, and 634 constitutes the refereed proceedings of the International IFIP WG 5.7 Conference on Advances in Production Management Systems, APMS 2021, held in Nantes, France, in September 2021.*The 378 papers presented were carefully reviewed and selected from 529 submissions. They discuss artificial intelligence techniques, decision aid and new and renewed paradigms for sustainable and resilient production systems at four-wall factory and value chain levels. The papers are organized in the following topical sections: Part I: artificial intelligence based optimization techniques for demand-driven manufacturing; hybrid approaches for production planning and scheduling; intelligent systems for manufacturing planning and control in the industry 4.0; learning and robust decision support systems for agile manufacturing environments; low-code and model-driven engineering for production system; meta-heuristics and optimization techniques for energy-oriented manufacturing systems; metaheuristics for production systems; modern analytics and new AI-based smart techniques for replenishment and production planning under uncertainty; system identification for manufacturing control applications; and the future of lean thinking and practice Part II: digital transformation of SME manufacturers: the crucial role of standard; digital transformations towards supply chain resiliency; engineering of smart-product-service-systems of the future; lean and Six Sigma in services healthcare; new trends and challenges in reconfigurable, flexible or agile production system; production management in food supply chains; and sustainability in production planning and lot-sizing Part III: autonomous robots in delivery logistics; digital transformation approaches in production management; finance-driven supply chain; gastronomic service system design; modern scheduling and applications in industry 4.0; recent advances in sustainable manufacturing; regular 
session: green production and circularity concepts; regular session: improvement models and methods for green and innovative systems; regular session: supply chain and routing management; regular session: robotics and human aspects; regular session: classification and data management methods; smart supply chain and production in society 5.0 era; and supply chain risk management under coronavirus Part IV: AI for resilience in global supply chain networks in the context of pandemic disruptions; blockchain in the operations and supply chain management; data-based services as key enablers for smart products, manufacturing and assembly; data-driven methods for supply chain optimization; digital twins based on systems engineering and semantic modeling; digital twins in companies first developments and future challenges; human-centered artificial intelligence in smart manufacturing for the operator 4.0; operations management in engineer-to-order manufacturing; product and asset life cycle management for smart and sustainable manufacturing systems; robotics technologies for control, smart manufacturing and logistics; serious games analytics: improving games and learning support; smart and sustainable production and supply chains; smart methods and techniques for sustainable supply chain management; the new digital lean manufacturing paradigm; and the role of emerging technologies in disaster relief operations: lessons from COVID-19 Part V: data-driven platforms and applications in production and logistics: digital twins and AI for sustainability; regular session: new approaches for routing problem solving; regular session: improvement of design and operation of manufacturing systems; regular session: crossdock and transportation issues; regular session: maintenance improvement and lifecycle management; regular session: additive manufacturing and mass customization; regular session: frameworks and conceptual modelling for systems and services efficiency; regular session: 
optimization of production and transportation systems; regular session: optimization of supply chain agility and reconfigurability; regular session: advanced modelling approaches; regular session: simulation and optimization of systems performances; regular session: AI-based approaches for quality and performance improvement of production systems; and regular session: risk and performance management of supply chains *The conference was held online. |