In many real-world problems, rare categories (minority classes) play essential roles despite their extreme scarcity. The discovery, characterization and prediction of rare categories and rare examples may protect us from fraudulent or malicious behavior, aid scientific discovery, and even save lives. This book focuses on rare category analysis, where the majority classes have smooth distributions and the minority classes exhibit the compactness property. Furthermore, it focuses on the challenging cases where the support regions of the majority and minority classes overlap. The author has developed effective algorithms with theoretical guarantees and good empirical results for the related techniques, and these are explained in detail. The book is suitable for researchers in artificial intelligence, in particular machine learning and data mining.
In this book the author introduces a novel approach to securing exam systems. He provides an in-depth understanding, useful for studying the security of exams and similar systems, such as public tenders, personnel selection, project reviews, and conference management systems. After a short chapter that explains the context and objectives of the book, in Chap. 2 the author introduces terminology for exams and the foundations required to formulate their security requirements. He describes the tasks that occur during an exam, taking account of the levels of detail and abstraction of an exam specification and the threats that arise out of the different exam roles. He also presents a taxonomy that classifies exams by types and categories. Chapter 3 contains formal definitions of the authentication, privacy, and verifiability requirements for exams, a framework based on the applied pi-calculus for the specification of authentication and privacy, and a more abstract approach based on set theory that enables the specification of verifiability. Chapter 4 describes the Huszti-Pethő protocol in detail and proposes a security enhancement. In Chap. 5 the author details Remark!, a protocol for Internet-based exams, discussing its cryptographic building blocks and some security considerations. Chapter 6 focuses on WATA, a family of computer-assisted exams that retain face-to-face testing. The chapter also introduces formal definitions of accountability requirements and details the analysis of a WATA protocol against such definitions. In Chaps. 4, 5, and 6 the author uses the cryptographic protocol verifier ProVerif for the formal analyses. Finally, the author outlines future work in Chap. 7. The book is valuable for researchers and graduate students in the areas of information security, in particular those engaged with exams or similar protocols.
With this book, Christopher Kormanyos delivers a highly practical guide to programming real-time embedded microcontroller systems in C++. It is divided into three parts plus several appendices. Part I provides a foundation for real-time C++ by covering language technologies, including object-oriented methods, template programming and optimization. Next, Part II presents detailed descriptions of a variety of C++ components that are widely used in microcontroller programming. It details some of C++'s most powerful language elements, such as class types, templates and the STL, to develop components for microcontroller register access, low-level drivers, custom memory management, embedded containers, multitasking, etc. Finally, Part III describes mathematical methods and generic utilities that can be employed to solve recurring problems in real-time C++. The appendices include a brief C++ language tutorial, information on the real-time C++ development environment, and instructions for building GNU GCC cross-compilers and a microcontroller circuit. For this fourth edition, the most recent specification of C++20 is used throughout the text. Several sections on new C++20 functionality have been added, and various others have been reworked to reflect changes in the standard. Several new example projects, ranging from introductory to advanced level, have also been included, existing ones extended, and various reader suggestions incorporated. Efficiency is always in focus, and numerous examples are backed up with runtime measurements and size analyses that quantify the true costs of the code down to the very last byte and microsecond. The target audience of this book mainly consists of students and professionals interested in real-time C++. Readers should be familiar with C or another programming language and will benefit most if they have had some previous experience with microcontroller electronics and with the performance and size issues prevalent in embedded systems programming.
This book covers the two broad areas of the electronics and electrical aspects of control applications, highlighting the many different types of control systems of relevance to real-life control system design. The control techniques presented are state-of-the-art. In the electronics section, readers will find essential information on microprocessor, microcontroller, mechatronics and electronics control. Low-level assembly programming is used to implement basic input/output control techniques as well as stepper motor and PWM DC motor control. In the electrical section, the book addresses complete elevator PLC system design, neural network plant control, load flow analysis, and process control, as well as machine vision topics. Illustrative diagrams, circuits, programming examples and algorithms help to explain the details of the system function design. Readers will find a wealth of computer control and industrial automation practices and applications for modern industries, as well as the educational sector.
This volume is intended to be used as a textbook for a special topic course in computer science. It addresses contemporary research topics of interest such as intelligent control, genetic algorithms, neural networks, optimization techniques, expert systems, fractals, and computer vision. The work incorporates many new research ideas, and focuses on the role of continuous mathematics. Audience: This book will be valuable to graduate students interested in theoretical computer topics, algorithms, expert systems, neural networks, and software engineering.
Fault-Tolerant Parallel Computation presents recent advances in algorithmic ways of introducing fault-tolerance in multiprocessors under the constraint of preserving efficiency. The difficulty associated with combining fault-tolerance and efficiency is that the two have conflicting means: fault-tolerance is achieved by introducing redundancy, while efficiency is achieved by removing redundancy. This monograph demonstrates how in certain models of parallel computation it is possible to combine efficiency and fault-tolerance and shows how it is possible to develop efficient algorithms without concern for fault-tolerance, and then correctly and efficiently execute these algorithms on parallel machines whose processors are subject to arbitrary dynamic fail-stop errors. The efficient algorithmic approaches to multiprocessor fault-tolerance presented in this monograph make a contribution towards bridging the gap between the abstract models of parallel computation and realizable parallel architectures. Fault-Tolerant Parallel Computation presents the state of the art in algorithmic approaches to fault-tolerance in efficient parallel algorithms. The monograph synthesizes work that was presented in recent symposia and published in refereed journals by the authors and other leading researchers. This is the first text that takes the reader on the grand tour of this new field summarizing major results and identifying hard open problems. This monograph will be of interest to academic and industrial researchers and graduate students working in the areas of fault-tolerance, algorithms and parallel computation and may also be used as a text in a graduate course on parallel algorithmic techniques and fault-tolerance.
The Complexity Theory Companion is an accessible, algorithmically oriented, research-centered, up-to-date guide to some of the most interesting techniques of complexity theory. The book's thesis is that simple algorithms are at the heart of complexity theory. From the tree-pruning and interval-pruning algorithms that shape the first chapter to the query simulation procedures that dominate the last chapter, the central proof methods of the book are algorithmic. And to more clearly highlight the role of algorithmic techniques in complexity theory, the book is - unlike other texts on complexity - organized by technique rather than by topic. Each chapter of this book focuses on one technique: what it is, and what results and applications it yields. This textbook was developed at the University of Rochester in courses given to graduate students and advanced undergraduates. Researchers also will find this book a valuable source of reference due to the comprehensive bibliography of close to five hundred entries, the thirty-five page subject index, and the appendices giving overviews of complexity classes and reductions.
Today, the Internet of Things (IoT) is ubiquitous, applied in practice in everything from Industrial Control Systems (ICS) to e-health, e-commerce, Cyber Physical Systems (CPS), smart cities, smart parking, healthcare and supply chain management. Numerous industries, academics, alliances and standardization organizations are working on IoT standardization, innovation and development, but there is still a need for a comprehensive framework with integrated standards under one IoT vision. Furthermore, existing IoT systems are vulnerable to a huge range of malicious attacks owing to the massive number of deployed IoT systems, inadequate data security standards and their resource-constrained nature. Existing security solutions are insufficient, and it is therefore necessary to enable IoT devices to dynamically counter threats and protect the system. Apart from illustrating diversified IoT applications, this book also addresses the issue of data safekeeping, the development of new security-enhancing schemes such as blockchain, and a range of other advances in IoT. The reader will discover that the IoT facilitates a multidisciplinary approach dedicated to creating novel applications and developing integrated solutions to build a sustainable society. The innovative and fresh advances that demonstrate IoT and computational intelligence in practice are discussed in this book, which will be helpful and informative for scientists, research scholars, academicians, policymakers, industry professionals, government organizations and others. This book is intended for a broad target audience, including scholars of various generations and disciplines, recognized scholars (lecturers and professors) and young researchers (postgraduates and undergraduates) who study the legal and socio-economic consequences of the emergence and dissemination of digital technologies such as IoT.
Furthermore, the book is intended for researchers, developers and operators working in the field of IoT who are eager to comprehend the vulnerability of the IoT paradigm. The book will serve as a comprehensive guide for advanced-level students in computer science who are interested in understanding the severity and implications of the security issues that accompany IoT. Dr. Bharat Bhushan is an Assistant Professor in the Department of Computer Science and Engineering (CSE) at the School of Engineering and Technology, Sharda University, Greater Noida, India. Prof. (Dr.) Sudhir Kumar Sharma is currently a Professor and Head of the Department of Computer Science, Institute of Information Technology & Management, affiliated to GGSIPU, New Delhi, India. Prof. (Dr.) Bhuvan Unhelkar (BE, MDBA, MSc, PhD; FACS; PSM-I, CBAP(R)) is an accomplished IT professional and Professor of IT at the University of South Florida, Sarasota-Manatee (Lead Faculty). Dr. Muhammad Fazal Ijaz is an Assistant Professor in the Department of Intelligent Mechatronics Engineering, Sejong University, Seoul, Korea. Prof. (Dr.) Lamia Karim is a Professor of Computer Science at the National School of Applied Sciences Berrechid (ENSAB), Hassan 1st University.
Robust statistics is important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers and practitioners, as well as younger researchers. The papers cover a multitude of different aspects of robust statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms. Finally, papers on applications and programming tools complete the volume.
This book discusses the effective use of modern ICT solutions for business needs, including the efficient use of IT resources, decision support systems, business intelligence, data mining and advanced data processing algorithms, as well as the processing of large datasets (e.g. from social networks such as Twitter and Facebook). The ability to generate, record and process qualitative and quantitative data, including in the areas of big data, the Internet of Things (IoT) and cloud computing, offers a real prospect of significant improvements for business, as well as for the operation of a company within Industry 4.0. The book presents new ideas, approaches, solutions and algorithms in the areas of knowledge representation, management and processing, quantitative and qualitative data processing (including sentiment analysis), problems of simulation performance, and the use of advanced signal processing to increase the speed of computation. The solutions presented are also aimed at the effective use of business process model and notation (BPMN), business process semantization and investment project portfolio selection. It is a valuable resource for researchers, data analysts, entrepreneurs and IT professionals alike, and the research findings presented make it possible to reduce costs, increase the accuracy of investment, optimize resources and streamline operations and marketing.
This book discusses the interplay between statistics, data science, machine learning and artificial intelligence, with a focus on environmental science, the natural sciences, and technology. It covers the state of the art from both a theoretical and a practical viewpoint and describes how to successfully apply machine learning methods, demonstrating the benefits of statistics for modeling and analyzing high-dimensional and big data. The book's expert contributions include theoretical studies of machine learning methods, expositions of general methodologies for sound statistical analyses of data as well as novel approaches to modeling and analyzing data for specific problems and areas. In terms of applications, the contributions deal with data as arising in industrial quality control, autonomous driving, transportation and traffic, chip manufacturing, photovoltaics, football, transmission of infectious diseases, Covid-19 and public health. The book will appeal to statisticians and data scientists, as well as engineers and computer scientists working in related fields or applications.
Besides scheduling problems for single and parallel machines and shop scheduling problems, this book covers advanced models involving due-dates, sequence dependent changeover times and batching. Discussion also extends to multiprocessor task scheduling and problems with multi-purpose machines. Among the methods used to solve these problems are linear programming, dynamic programming, branch-and-bound algorithms, and local search heuristics. The text goes on to summarize complexity results for different classes of deterministic scheduling problems.
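Single-machine models like those covered here often reduce to simple priority rules. As an illustration (a textbook-classic result, not code from this book), the shortest-processing-time (SPT) rule minimizes the total completion time on one machine; the job data below is hypothetical:

```python
# SPT rule: sorting jobs by nondecreasing processing time minimizes the
# total completion time on a single machine (the classic 1 || sum C_j result).
def total_completion_time(sequence):
    """Sum of completion times when jobs run back to back in this order."""
    t = total = 0
    for p in sequence:
        t += p            # this job's completion time
        total += t
    return total

jobs = [4, 1, 3, 2]       # hypothetical processing times
spt = sorted(jobs)        # SPT order: [1, 2, 3, 4]

print(total_completion_time(spt))    # 20 (optimal)
print(total_completion_time(jobs))   # 27
```

Any deviation from SPT order can only increase the objective, which is why such rules serve as building blocks inside the branch-and-bound and local search methods mentioned above.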
This book presents a new understanding of how control systems truly operate, and explains how to recognize, simulate, and improve control systems in all fields of activity. It also reveals the pervasive, ubiquitous and indispensable role of control processes in our lives and the need to develop a "control-oriented thinking" (based on uncomplicated but effective models derived from systems thinking), that is, a true "discipline of control." Over the book's thirteen chapters, Piero Mella shows that there are simple control systems (rather than complex ones) that can easily help us to manage complexity without drawing upon more sophisticated control systems. The book begins by reviewing the basic language of systems thinking and the models it allows users to create. It then introduces the control process, presenting the theoretical structure of three simple control systems we all can observe in order to gain fundamental knowledge about the basic structure of a control system. Next, it presents the anatomy of the simplest "magic ring" and the general theoretical model of any control system. This is followed by an introduction to a general typology of control systems and a broader view obtained by investigating multi-lever control systems and multi-objective systems. The book then develops these concepts across increasingly broad environments, suggesting how readers can recognize manifestations of control systems in everyday life and in natural phenomena. Updated for the 2nd edition, new chapters explore control systems regulating the biological environment and organizations, with an in-depth study of the control of quality, productivity, production, stocks and costs. Finally, the book concludes with the learning process, problem-solving, and designing the logical structure of control systems.
This introductory book on quantum computing includes an emphasis on the development of algorithms. Appropriate both for university students and for software developers interested in programming a quantum computer, this practical approach to modern quantum computing takes the reader through the required background and up to the latest developments. Beginning with introductory chapters on the required math and quantum mechanics, Fundamentals of Quantum Computing proceeds to describe four leading qubit modalities and explains the core principles of quantum computing in detail. Providing step-by-step derivations of the math and source code, some of the well-known quantum algorithms are explained in simple terms so the reader can try them on either IBM Q or the Microsoft QDK. The book also includes a chapter on adiabatic quantum computing and modern concepts such as topological quantum computing and surface codes. Features:
o Foundational chapters that build the necessary background in math and quantum mechanics.
o Examples and illustrations throughout provide a practical approach to quantum programming, with end-of-chapter exercises.
o Detailed treatment of four leading qubit modalities (trapped-ion, superconducting transmons, topological qubits, and quantum dots) teaches how qubits work, so that readers can understand how quantum computers work under the hood and devise efficient algorithms and error correction codes. Also introduces protected qubits: 0-π qubits, fluxon parity protected qubits, and charge-parity protected qubits.
o Principles of quantum computing, such as the quantum superposition principle, quantum entanglement, quantum teleportation, the no-cloning theorem, quantum parallelism, and quantum interference, are explained in detail.
o A dedicated chapter on quantum algorithms explores both oracle-based and Quantum Fourier Transform-based algorithms in detail, with step-by-step math and working code that runs on IBM Qiskit and the Microsoft QDK.
Further intriguing topics include the EPR paradox, quantum key distribution protocols, the density matrix formalism, and the stabilizer formalism. While focusing on the universal gate model of quantum computing, this book also introduces adiabatic quantum computing and quantum annealing, and a section on fault-tolerant quantum computing makes the discussion complete. The topics on quantum error correction, surface codes such as the toric code and the planar code, and protected qubits help explain how fault tolerance can be built at the system level.
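The superposition and entanglement principles described above can be illustrated with a generic state-vector sketch (not code from the book): a Hadamard gate on the first qubit followed by a CNOT prepares the Bell state (|00> + |11>)/sqrt(2).

```python
import numpy as np

# State-vector simulation of a 2-qubit circuit: H on qubit 0, then CNOT.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                   # control = first qubit

state = np.zeros(4)
state[0] = 1.0                                    # start in |00>
state = CNOT @ np.kron(H, I2) @ state             # apply the circuit

probs = np.abs(state) ** 2                        # Born-rule probabilities
print(probs)  # probability 0.5 each for |00> and |11>, 0 otherwise
```

Measuring either qubit yields 0 or 1 with equal probability, but the two outcomes are perfectly correlated, which is exactly the entanglement the blurb refers to; the same circuit is a standard first exercise on IBM Qiskit or the Microsoft QDK.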
This book highlights the design, use and structure of blockchain systems and decentralized ledger technologies (B/DLT) for use in the construction industry. Construction remains a fragmented change-resistant industry with chronic problems of underproductivity and a very low digitization factor compared to other fields. In parallel, the convergence, embedding and coordination of digital technologies in the physical world provides a unique opportunity for the construction industry to leap ahead and adopt fourth industrial revolution technologies. Within this context, B/DLT are an excellent fit for the digitization of the construction industry. B/DLT are effective in this as they organize and align digital and physical supply chains, produce stigmergic coordination out of decentralization, enable the governance of complex projects for multiple stakeholders, while enabling the creation of a new class of business models and legal instruments for construction.
This textbook aims to help the reader develop an in-depth understanding of logical reasoning and gain knowledge of the theory of computation. The book combines theoretical teaching and practical exercises; the latter are realised in Isabelle/HOL, a modern theorem prover, and PAT, an industry-scale model checker. I also give entry-level tutorials on the two tools to help the reader get started, and by the end of the book the reader should be proficient in both. Content-wise, this book focuses on the syntax, semantics and proof theory of various logics, as well as automata theory, formal languages, computability and complexity. The final chapter closes the gap with a discussion of the insight that links logic with computation. This book is written for a high-level undergraduate course or a Master's course. The hybrid skill set of practical theorem proving and model checking should serve readers well should they pursue a research or engineering career in formal methods.
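To give a flavour of the automata-theory material (an illustrative sketch, not taken from the book), a deterministic finite automaton can be simulated directly from its transition table; the example DFA below accepts binary strings containing an even number of 1s:

```python
# Simulate a deterministic finite automaton via its transition table.
def dfa_accepts(word, delta, start, accepting):
    state = start
    for symbol in word:
        state = delta[(state, symbol)]    # total transition function
    return state in accepting

# Example DFA: two states tracking the parity of 1s seen so far.
delta = {('even', '0'): 'even', ('even', '1'): 'odd',
         ('odd', '0'): 'odd', ('odd', '1'): 'even'}

print(dfa_accepts("1001", delta, 'even', {'even'}))   # True  (two 1s)
print(dfa_accepts("1011", delta, 'even', {'even'}))   # False (three 1s)
```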
Swarm intelligence algorithms are a form of nature-inspired optimization algorithms. Their main inspiration is the cooperative behavior of animals within specific communities: simple behaviors of individuals, along with mechanisms for sharing knowledge between them, result in the complex behavior of the entire community. Examples of such behavior can be found in ant colonies, bee swarms, schools of fish or bird flocks. Swarm intelligence algorithms are used to solve difficult optimization problems for which there are no exact solving methods, or for which the use of such methods is impossible, e.g. due to unacceptable computational time. This book thoroughly presents the basics of 24 algorithms selected from the entire family of swarm intelligence algorithms. Each chapter deals with a different algorithm, describing it in detail and showing how it works in the form of pseudo-code. In addition, the source code is provided for each algorithm in Matlab and in the C++ programming language. To help the reader understand how each swarm intelligence algorithm works, a simple numerical example is included in each chapter, which guides the reader step by step through the individual stages of the algorithm, showing all the necessary calculations. This book can provide the basics for understanding how swarm intelligence algorithms work, and aid readers in programming these algorithms on their own to solve various computational problems. It should also be useful for undergraduate and postgraduate students studying nature-inspired optimization algorithms, and can be a helpful tool for learning the basics of these algorithms efficiently and quickly. In addition, it can be a useful source of knowledge for scientists working in the field of artificial intelligence, as well as for engineers interested in using this type of algorithm in their work.
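As a taste of this family of algorithms, here is a minimal particle swarm optimization (PSO) sketch in Python (the book's own listings are in Matlab and C++; the parameter values below are illustrative assumptions, not the book's):

```python
import random

random.seed(1)  # for reproducibility of this illustration

def pso(f, dim=2, n_particles=20, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5):
    """Minimize f over [lo, hi]^dim with a basic global-best PSO."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    gbest = min(pbest, key=f)[:]                # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull (pbest) + social pull (gbest)
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda x: sum(v * v for v in x)        # global minimum at the origin
best = pso(sphere)
print(best, sphere(best))                       # close to [0, 0]
```

The "knowledge sharing" the paragraph describes is the gbest term: every particle is pulled toward the best position any member of the swarm has found so far.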
If the reader already has basic knowledge of swarm intelligence algorithms, we recommend the book: "Swarm Intelligence Algorithms: Modifications and Applications" (Edited by A. Slowik, CRC Press, 2020), which describes selected modifications of these algorithms and presents their practical applications.
Machine Vision Algorithms in Java provides a comprehensive introduction to the algorithms and techniques associated with machine vision systems. The Java programming language is also introduced, with particular reference to its imaging capabilities. The book contains explanations of key machine vision techniques and algorithms, along with the associated Java source code. Special features include:
- A complete self-contained treatment of the topics and techniques essential to the understanding and implementation of machine vision.
- An introduction to object-oriented programming and to the Java programming language, with particular reference to its imaging capabilities.
- Java source code for a wide range of practical image processing and analysis functions.
- The opportunity to download a fully functional Java-based visual programming environment for machine vision, available via the WWW. This contains over 200 image processing, manipulation and analysis functions and will enable users to implement many of the ideas covered in this book.
- Details relating to the design of a Java-based visual programming environment for machine vision.
- An introduction to the Java 2D imaging and Java Advanced Imaging (JAI) APIs.
- A wide range of illustrative examples.
- Practical treatment of the subject matter.
This book is aimed at senior undergraduate and postgraduate students in engineering and computer science as well as practitioners in machine vision who may wish to update or expand their knowledge of the subject. The techniques and algorithms of machine vision are expounded in a way that will be understood not only by specialists but also by those who are less familiar with the topic.
This book is a groundbreaking resource that covers both algorithms and technologies of interactive videos. It presents recent research and application work for building and browsing interactive digital videos. The book deals mainly with low-level semi-automatic and full-automatic processing of the video content for intelligent human computer interaction. There is a special focus on eye tracking methods.
This book comprises select proceedings of the 7th International Conference on Data Science and Engineering (ICDSE 2021). The contents of this book focus on responsible data science. This book tries to integrate research across diverse topics related to data science, such as fairness, trust, ethics, confidentiality, transparency, and accuracy. The chapters in this book represent research from different perspectives that offer novel theoretical implications that span multiple disciplines. The book will serve as a reference resource for researchers and practitioners in academia and industry.
Is the exponential function computable? Are union and intersection of closed subsets of the real plane computable? Are differentiation and integration computable operators? Is zero finding for complex polynomials computable? Is the Mandelbrot set decidable? And in case of computability, what is the computational complexity? Computable analysis supplies exact definitions for these and many other similar questions and tries to solve them. Merging fundamental concepts of analysis and recursion theory into a new, exciting theory, this book provides a solid basis for studying various aspects of computability and complexity in analysis. It is the result of an introductory course given for several years and is written in a style suitable for graduate-level and senior students in computer science and mathematics. Many examples illustrate the new concepts while numerous exercises of varying difficulty extend the material and stimulate readers to work actively on the text.
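The flavour of these questions can be sketched in code: a real function is computable when, for every requested precision 2^-n, an algorithm outputs a rational within that distance of the true value. The following illustrative sketch (not from the book) does this for exp on [0, 1] using exact rational arithmetic:

```python
from fractions import Fraction

def exp_approx(x: Fraction, n: int) -> Fraction:
    """Rational r with |r - exp(x)| <= 2**-n, for 0 <= x <= 1."""
    # Sum the Taylor series; for 0 <= x <= 1 the tail after the term
    # x**k / k! is bounded by 2 * x**(k+1) / (k+1)!, so we can stop
    # as soon as that bound drops below the requested precision.
    term = Fraction(1)    # current term x**k / k!
    total = Fraction(1)
    k = 0
    while 2 * term * x / (k + 1) > Fraction(1, 2**n):
        k += 1
        term = term * x / k
        total += term
    return total

r = exp_approx(Fraction(1, 2), 30)
print(float(r))   # ~1.6487212707, i.e. exp(0.5) to within 2**-30
```

The point is that every output is exact rational data with a proven error bound, which is precisely the sense in which computable analysis calls exp "computable".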
The papers in this volume comprise the refereed proceedings of the Second IFIP International Conference on Computer and Computing Technologies in Agriculture (CCTA 2008), held in Beijing, China, in 2008. CCTA 2008 was cooperatively sponsored and organized by the China Agricultural University (CAU), the National Engineering Research Center for Information Technology in Agriculture (NERCITA), the Chinese Society of Agricultural Engineering (CSAE), the International Federation for Information Processing (IFIP), the Beijing Society for Information Technology in Agriculture, and the Beijing Research Center for Agro-products Test and Farmland Inspection, China. Related departments of China's central government, such as the Ministry of Science and Technology, the Ministry of Industry and Information Technology and the Ministry of Education, as well as the Beijing Municipal Natural Science Foundation and the Beijing Academy of Agricultural and Forestry Sciences, greatly contributed to and supported this event. The conference served as a platform to bring together scientists and researchers, agronomists and information engineers, extension workers and entrepreneurs from a range of disciplines concerned with the impact of information technology on sustainable agriculture and rural development. Participants included representatives of all the supporting organizations and a group of invited speakers, experts and researchers from more than 15 countries, including the Netherlands, Spain, Portugal, Mexico, Germany, Greece, Australia, Estonia, Japan, Korea, India, Iran, Nigeria, Brazil and China.
This book is about conformal prediction, an approach to prediction that originated in machine learning in the late 1990s. The main feature of conformal prediction is the principled treatment of the reliability of predictions. The prediction algorithms described - conformal predictors - are provably valid in the sense that they evaluate the reliability of their own predictions in a way that is neither over-pessimistic nor over-optimistic (the latter being especially dangerous). The approach is still flexible enough to incorporate most of the existing powerful methods of machine learning. The book covers both key conformal predictors and the mathematical analysis of their properties. Algorithmic Learning in a Random World contains, in addition to proofs of validity, results about the efficiency of conformal predictors. The only assumption required for validity is that of "randomness" (the prediction algorithm is presented with independent and identically distributed examples); in later chapters, even the assumption of randomness is significantly relaxed. Interesting results about efficiency are established both under randomness and under stronger assumptions. Since publication of the First Edition in 2005 conformal prediction has found numerous applications in medicine and industry, and is becoming a popular machine-learning technique. This Second Edition contains three new chapters. One is about conformal predictive distributions, which are more informative than the set predictions produced by standard conformal predictors. Another is about the efficiency of ways of testing the assumption of randomness based on conformal prediction. The third new chapter harnesses conformal testing procedures for protecting machine-learning algorithms against changes in the distribution of the data. In addition, the existing chapters have been revised, updated, and expanded.
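The validity guarantee described above can be illustrated with a minimal split (inductive) conformal predictor, a simplified variant of the conformal predictors the book studies; the data and point predictor below are hypothetical:

```python
import math, random

random.seed(0)

# Hypothetical regression data: y = 2x + Gaussian noise.
xs = [random.uniform(0, 10) for _ in range(200)]
data = [(x, 2 * x + random.gauss(0, 1)) for x in xs]
train, calib = data[:100], data[100:]

# Any point predictor can be wrapped; here a deliberately crude one.
slope = sum(y for _, y in train) / sum(x for x, _ in train)
predict = lambda x: slope * x

# Nonconformity scores on the held-out calibration set: absolute residuals.
scores = sorted(abs(y - predict(x)) for x, y in calib)

# The ceil((1 - alpha)(n + 1))-th smallest score gives a valid margin.
alpha = 0.1
k = math.ceil((1 - alpha) * (len(scores) + 1)) - 1
q = scores[min(k, len(scores) - 1)]

# Prediction set for a new object: an interval around the point prediction.
x_new = 5.0
print((predict(x_new) - q, predict(x_new) + q))
```

Under the randomness (i.i.d.) assumption, intervals built this way cover the true label with probability at least 1 - alpha, no matter how crude the underlying predictor is; a better predictor only makes the intervals narrower, which is the efficiency question the book analyzes.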
This book:
- bridges machine learning and optimisation;
- discusses optimisation techniques to improve ML algorithms for big data problems;
- identifies key research areas for solving large-scale machine learning problems;
- identifies recent research directions for tackling the major challenges in this area.
This book provides an essential update for experienced data processing professionals, transaction managers and database specialists who are seeking system solutions beyond the confines of traditional approaches. It provides practical advice on how to manage complex transactions and share distributed databases on client servers and the Internet. Based on extensive research in over 100 companies in the USA, Europe, Japan and the UK, topics covered include:
* the challenge of global transaction requirements within an expanding business perspective
* how to handle long transactions and their constituent elements
* possible benefits from object-oriented solutions
* the contribution of knowledge engineering in transaction management
* the Internet, the World Wide Web and transaction handling
* systems software and transaction-processing monitors
* OSF/1 and the Encina transaction monitor
* active data transfers and remote procedure calls
* serialization in a transaction environment
* transaction locks, two-phase commit and deadlocks
* improving transaction-oriented database management
* the successful development of an increasingly complex transaction environment.