Recent achievements in hardware and software development, such as multi-core CPUs and DRAM capacities of multiple terabytes per server, have enabled the introduction of a revolutionary technology: in-memory data management. This technology supports the flexible and extremely fast analysis of massive amounts of enterprise data. Professor Hasso Plattner and his research group at the Hasso Plattner Institute in Potsdam, Germany, have been investigating and teaching the corresponding concepts and their adoption in the software industry for years. This book is based on an online course that was first launched in autumn 2012 with more than 13,000 enrolled students and marked the successful starting point of the openHPI e-learning platform. The course is designed mainly for students of computer science, software engineering, and IT-related subjects, but addresses business experts, software developers, technology experts, and IT analysts alike. Plattner and his group focus on exploring the inner mechanics of a column-oriented, dictionary-encoded in-memory database. Covered topics include, among others, physical data storage and access, basic database operators, compression mechanisms, and parallel join algorithms. Beyond that, implications for future enterprise applications and their development are discussed. Step by step, readers will understand the radical differences and advantages of the new technology over traditional row-oriented, disk-based databases. In this completely revised 2nd edition, we incorporate the feedback of thousands of course participants on openHPI and take into account the latest advancements in hardware and software. Improved figures, explanations, and examples further ease the understanding of the concepts presented. We introduce advanced data management techniques such as transparent aggregate caches and provide new showcases that demonstrate the potential of in-memory databases for two diverse industries: retail and life sciences.
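To make the central idea of dictionary encoding concrete, here is a minimal Python sketch (not taken from the book or the course; the column data and function names are illustrative) of how a column store replaces values with small integer IDs and then scans only those IDs:

```python
# Minimal sketch of dictionary encoding for one column of a column store.
# Illustrative only; a real in-memory database adds bit-packing,
# ordered dictionaries, and vectorized scans on top of this idea.

def dictionary_encode(column):
    """Replace each value with a small integer ID into a sorted dictionary."""
    dictionary = sorted(set(column))             # distinct values, ordered
    value_to_id = {v: i for i, v in enumerate(dictionary)}
    attribute_vector = [value_to_id[v] for v in column]
    return dictionary, attribute_vector

def scan_equals(dictionary, attribute_vector, predicate_value):
    """Predicate evaluation touches only integer IDs, not the raw values."""
    try:
        vid = dictionary.index(predicate_value)  # binary search in practice
    except ValueError:
        return []
    return [row for row, v in enumerate(attribute_vector) if v == vid]

cities = ["Berlin", "Potsdam", "Berlin", "Munich", "Potsdam", "Berlin"]
d, av = dictionary_encode(cities)
print(d)                              # ['Berlin', 'Munich', 'Potsdam']
print(av)                             # [0, 2, 0, 1, 2, 0]
print(scan_equals(d, av, "Potsdam"))  # rows [1, 4]
```

Real systems additionally bit-pack the attribute vector and keep the dictionary sorted, so that range predicates can also be rewritten to operate on IDs.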
This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of the computational simulation of sound environments, the book is organized in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.
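As a flavour of the first method the book covers, the finite-difference time-domain idea can be sketched in a few lines for the 1D scalar wave equation (a toy illustration with an assumed grid size, rigid boundaries, and an initial Gaussian pulse, not an example from the book):

```python
import math

# Minimal 1D FDTD sketch for the scalar wave equation u_tt = c^2 u_xx.
# Illustrative assumptions: rigid (u = 0) boundaries, a Gaussian initial
# pulse with zero initial velocity, and a Courant number of 1.

n = 200                       # grid points
courant = 1.0                 # c*dt/dx; must be <= 1 for stability
u_prev = [math.exp(-((i - n // 2) / 8.0) ** 2) for i in range(n)]
u_curr = u_prev[:]            # zero initial velocity

for step in range(150):
    u_next = [0.0] * n
    for i in range(1, n - 1):
        # Leapfrog update: second-order central differences in t and x.
        u_next[i] = (2 * u_curr[i] - u_prev[i]
                     + courant ** 2 * (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]))
    u_prev, u_curr = u_curr, u_next

print(max(u_curr), min(u_curr))   # the pulse has split and reflected
```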
When discussing classification, support vector machines are known to be a capable and efficient technique that learns and predicts with high accuracy in a short time. Yet their black-box way of doing so makes practical users circumspect about relying on them without much understanding of the how and why of their predictions. The question raised in this book is how this ‘masked hero’ can be made more comprehensible and friendly to the public: provide a surrogate model for its hidden optimization engine, replace the method completely, or appoint a friendlier approach to tag along and offer the much-desired explanations? Evolutionary algorithms can do all of this, and this book presents such possibilities for achieving high accuracy, comprehensibility, reasonable runtime, and unconstrained performance.
This book focuses on the organization and mechanisms of expert decision-making support using modern information and communication technologies, as well as information analysis and collective intelligence technologies (electronic expertise, or simply e-expertise). Chapter 1 (E-Expertise) discusses the role of e-expertise in decision-making processes. The procedures of e-expertise are classified, their benefits and shortcomings are identified, and the conditions for their efficiency are considered. Chapter 2 (Expert Technologies and Principles) provides a comprehensive overview of modern expert technologies. A special emphasis is placed on the specifics of e-expertise. Moreover, the authors study the feasibility and reasonability of employing well-known methods and approaches in e-expertise. Chapter 3 (E-Expertise: Organization and Technologies) describes some examples of up-to-date technologies for performing e-expertise. Chapter 4 (Trust Networks and Competence Networks) deals with the problems of finding and grouping experts by means of information and communication technologies. Chapter 5 (Active Expertise) treats the problem of expertise stability against strategic manipulation by experts or coordinators pursuing individual goals. The book addresses a wide range of readers interested in management, decision-making and expert activity in political, economic, social and industrial spheres.
This book serves as a practical guide for practicing engineers who need to design embedded systems for high-speed data acquisition and control systems. A minimum amount of theory is presented, along with a review of analog and digital electronics, followed by detailed explanations of essential topics in hardware design and software development. The discussion of hardware focuses on microcontroller design (ARM microcontrollers and FPGAs), techniques of embedded design, and high-speed data acquisition (DAQ) and control systems. Coverage of software development includes the main programming techniques, culminating in the study of real-time operating systems. All concepts are introduced in a manner that is highly accessible to practicing engineers and lead to the practical implementation of an embedded board that can be used in various industrial fields as a control and high-speed data acquisition system.
The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, whose features are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader to gradually gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of the information sought.
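To illustrate the parity-check view that underlies block and LDPC codes, here is a toy sketch using the classic (7,4) Hamming code; this is a standard textbook example, not one of the book's constructions:

```python
# A classic (7,4) Hamming code sketch to illustrate the parity-check view
# underlying block and LDPC codes (all arithmetic is mod 2).

G = [  # generator matrix in systematic form [I | P]
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [  # parity-check matrix [P^T | I]; every codeword c satisfies H c^T = 0
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    """Codeword c = m G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Syndrome s = H w^T over GF(2); zero iff w is a codeword."""
    return [sum(w * h for w, h in zip(word, row)) % 2 for row in H]

c = encode([1, 0, 1, 1])
c[2] ^= 1                         # flip one bit to simulate a channel error
s = syndrome(c)
# The syndrome equals a column of H, which locates the single-bit error.
err = [col == tuple(s) for col in zip(*H)].index(True)
c[err] ^= 1
print(syndrome(c))                # [0, 0, 0] -> corrected
```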
The present book includes a set of selected papers from the tenth International Conference on Informatics in Control, Automation and Robotics (ICINCO 2013), held in Reykjavík, Iceland, from 29 to 31 July 2013. The conference was organized in four simultaneous tracks: “Intelligent Control Systems and Optimization”, “Robotics and Automation”, “Signal Processing, Sensors, Systems Modeling and Control” and “Industrial Engineering, Production and Management”. The book follows the same structure. ICINCO 2013 received 255 paper submissions from 50 countries on all continents. After a double-blind review performed by the Program Committee, only 30% of the submissions were accepted for publication and oral presentation. A further refinement was made after the conference, based also on the assessment of presentation quality, so that this book includes the extended and revised versions of the very best papers of ICINCO 2013.
This book describes how evolutionary algorithms (EA), including genetic algorithms (GA) and particle swarm optimization (PSO), can be utilized for solving multi-objective optimization problems in the area of embedded and VLSI system design. Many complex engineering optimization problems can be modelled as multi-objective formulations. This book provides an introduction to multi-objective optimization using the meta-heuristic algorithms GA and PSO, and how they can be applied to problems such as hardware/software partitioning in embedded systems, circuit partitioning in VLSI, design of operational amplifiers in analog VLSI, design space exploration in high-level synthesis, delay fault testing in VLSI testing, and scheduling in heterogeneous distributed systems. It is shown how, in each case, the various aspects of the EA, namely its representation and operators like crossover, mutation, etc., can be separately formulated to solve these problems. This book is intended for design engineers and researchers in the field of VLSI and embedded system design. The book introduces multi-objective GA and PSO in a simple and easily understandable way that will appeal to introductory readers.
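As a hint of what separately formulating the EA ingredients means, the sketch below shows a bit-string representation, one-point crossover, bit-flip mutation, and a Pareto-dominance test on two made-up objectives; everything here is an illustrative assumption, standing in for, e.g., a hardware/software partitioning instance:

```python
import random

# Minimal sketch of the EA ingredients formulated per problem:
# a representation, variation operators, and a Pareto-dominance test.

def crossover(a, b):
    """One-point crossover on bit-string chromosomes."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(x, rate=0.05):
    """Independent bit-flip mutation."""
    return [bit ^ (random.random() < rate) for bit in x]

def objectives(x):
    """Toy bi-objective, e.g. 'cost' vs. 'delay' of a partitioning."""
    return (sum(x), sum(i * bit for i, bit in enumerate(x)))

def dominates(f, g):
    """Pareto dominance (minimizing both): no worse in all, better in one."""
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
for _ in range(50):
    child = mutate(crossover(*random.sample(pop, 2)))
    pop[random.randrange(len(pop))] = child
front = [x for x in pop
         if not any(dominates(objectives(y), objectives(x)) for y in pop)]
print(len(front), "non-dominated solutions")
```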
This book explores the extent to which fuzzy set logic can overcome some of the shortcomings of public choice theory, particularly its inability to provide adequate predictive power in empirical studies. Especially in the case of social preferences, public choice theory has failed to produce the set of alternatives from which collective choices are made. The book presents empirical findings achieved by the authors in their efforts to predict the outcome of government formation processes in European parliamentary and semi-presidential systems. Using data from the Comparative Manifesto Project (CMP), the authors propose a new approach that reinterprets error in the coding of CMP data as ambiguity in the actual political positions of parties on the policy dimensions being coded. The range of this error establishes parties’ fuzzy preferences. The set of possible outcomes in the process of government formation is then calculated on the basis of both the fuzzy Pareto set and the fuzzy maximal set, and the predictions are compared with those made by two conventional approaches as well as with the government that was actually formed. The comparison shows that, in most cases, the fuzzy approaches outperform their conventional counterparts.
This book focuses on recent advances in computer vision methodologies and technical solutions using conventional and intelligent paradigms. The contributions include:
· Morphological Image Analysis for Computer Vision Applications
· Methods for Detecting of Structural Changes in Computer Vision Systems
· Hierarchical Adaptive KL-based Transform: Algorithms and Applications
· Automatic Estimation for Parameters of Image Projective Transforms Based on Object-invariant Cores
· A Way of Energy Analysis for Image and Video Sequence Processing
· Optimal Measurement of Visual Motion Across Spatial and Temporal Scales
· Scene Analysis Using Morphological Mathematics and Fuzzy Logic
· Digital Video Stabilization in Static and Dynamic Scenes
· Implementation of Hadamard Matrices for Image Processing
· A Generalized Criterion of Efficiency for Telecommunication Systems
The book is intended for PhD students, professors, researchers and software developers working in the areas of digital video processing and computer vision technologies.
Soft computing, intelligent robotics and control are at the core of contemporary engineering interests. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and their ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in these fields, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book deals with issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro-electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule interpolation, subjective-weights-based meta-learning in multi-criteria decision making, swarm-based heuristics for area exploration, and knowledge-driven adaptive product representations. The last part addresses various problems, issues and methods of applied mathematics. This includes perturbation estimates for invariant subspaces of Hessenberg matrices, uncertainty and nonlinearity modelling by probabilistic metric spaces, and comparison and visualization of the DNA of six primates.
To deal with the flexible architectures and evolving functionalities of complex modern systems, the agent metaphor and agent-based computing are often the most appropriate software design approach. As a result, a broad range of special-purpose design processes has been developed in the last several years to tackle the challenges of these specific application domains. In this context, in early 2012 the IEEE-FIPA Design Process Documentation Template SC0097B was defined, which facilitates the representation of design processes and method fragments through the use of standardized templates, thus supporting the creation of easily sharable repositories and facilitating the composition of new design processes. Following this standardization approach, this book gathers the documentations of some of the best-known agent-oriented design processes. After an introductory section describing the goal of the book and the existing IEEE FIPA standard for design process documentation, thirteen processes (including the widely known OpenUP, the de facto standard in object-oriented software engineering) are documented by their original creators or other well-known scientists working in the field. As a result, this is the first work to adopt a standard, unified descriptive approach for documenting different processes, making it much easier to study the individual processes, to rigorously compare them, and to apply them in industrial projects. While there are a few books on the market describing individual agent-oriented design processes, none of them presents all the processes, let alone in the same format. With this handbook, researchers as well as professional software developers looking for an overview or for detailed and standardized descriptions of design processes will for the first time find a comprehensive presentation of the most important agent-oriented design processes, an invaluable resource when developing solutions in various application areas.
This volume presents an analysis of the problems and solutions of the market mockery of the democratic collective decision-choice system, with an imperfect information structure composed of defective and deceptive structures, using methods of fuzzy rationality. The book is devoted to the political economy of rent-seeking, rent-protection and rent-harvesting to enhance profits under democratic collective decision-choice systems. The toolbox used in the monograph consists of methods of fuzzy decision, approximate reasoning, negotiation games and fuzzy mathematics. The monograph further discusses the rent-seeking phenomenon in the Schumpeterian and Marxian political economies, where rent-seeking activities transform the qualitative character of general capitalism into oligarchic socialism, turning the democratic collective decision-choice system into an ideology rather than a social calculus for resolving conflicts of preference in the collective decision-choice space without violence.
This book offers a comprehensive analysis of the social choice literature and shows, by applying fuzzy sets, how the use of fuzzy preferences, rather than strict ones, may affect the social choice theorems. To do this, the book explores the presupposition of rationality within the fuzzy framework and shows that the two conditions for rationality, completeness and transitivity, do exist with fuzzy preferences. Specifically, this book examines the conditions under which a maximal set exists, Arrow’s theorem, the Gibbard-Satterthwaite theorem, and the median voter theorem. After showing that a non-empty maximal set does exist for fuzzy preference relations, the book goes on to demonstrate the existence of a fuzzy aggregation rule satisfying all five Arrowian conditions, including non-dictatorship. While the Gibbard-Satterthwaite theorem only considers individual fuzzy preferences, this work shows that both individuals and groups can choose alternatives to various degrees, resulting in a social choice that can be both strategy-proof and non-dictatorial. Moreover, the median voter theorem is shown to hold under strict fuzzy preferences but not under weak fuzzy preferences. By providing a standard model of fuzzy social choice and by drawing the necessary connections between the major theorems, this book fills an important gap in the current literature and encourages future empirical research in the field.
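One standard way to obtain a maximal set from fuzzy preferences is Orlovsky's non-dominated set; the sketch below illustrates the computation on made-up relation values (the book's precise maximal-set and aggregation definitions may differ):

```python
# Sketch of Orlovsky's non-dominated set for a fuzzy weak preference
# relation R, one common construction of a "maximal set" from fuzzy
# preferences. The relation values are invented for illustration.

alternatives = ["x", "y", "z"]
R = {  # R[(a, b)]: degree to which a is at least as good as b
    ("x", "y"): 0.8, ("y", "x"): 0.4,
    ("x", "z"): 0.6, ("z", "x"): 0.7,
    ("y", "z"): 0.9, ("z", "y"): 0.3,
}

def strict(a, b):
    """Fuzzy strict preference: how much more a is preferred to b."""
    return max(R.get((a, b), 0.0) - R.get((b, a), 0.0), 0.0)

def nondominance(a):
    """Degree to which no alternative strictly dominates a."""
    return 1.0 - max(strict(b, a) for b in alternatives if b != a)

for a in alternatives:
    print(a, round(nondominance(a), 2))   # x 0.9, y 0.6, z 0.4
# The alternatives with the highest degrees form the fuzzy maximal set.
```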
This book has been motivated by an urgent need for the design and implementation of innovative control algorithms and systems for tracked vehicles. Nowadays, unmanned vehicles are becoming more and more common, so there is a need for innovative mechanical constructions capable of adapting to various applications, regardless of whether they operate on the ground, in the air, or in water/underwater environments. The many activities connected with tracked vehicles can be distributed among three main groups: design and control algorithms, sensor- and vision-based information, and the construction and testing of mechanical parts of unmanned vehicles. Scientists and researchers involved in mechanics, control algorithms, image processing, computer vision, data fusion, or IC will find this book useful.
This volume offers newcomers and experienced practitioners a guide to the state of the art in the fast-evolving field of biometric recognition. It is focused on emerging strategies for performing biometric recognition under uncontrolled data acquisition conditions. The mainstream research work in this field is presented in an organized manner, so the reader can easily follow the trends that best suit their interests in this growing field. The book chapters cover recent advances in less controlled/covert data acquisition frameworks, segmentation of poor-quality biometric data, biometric data quality assessment, normalization of poor-quality biometric data, contactless biometric recognition strategies, biometric recognition robustness (with respect to data resolution, illumination, distance, pose, motion and occlusions), multispectral biometric recognition, multimodal biometrics, fusion at different levels, and high-confidence automatic surveillance.
This book is a selection of results obtained within three years of research performed under SYNAT, a nation-wide scientific project aiming to create an infrastructure for scientific content storage and sharing for academia, education and an open knowledge society in Poland. The book is intended to be the last of the series related to the SYNAT project. The previous books, titled “Intelligent Tools for Building a Scientific Information Platform” and “Intelligent Tools for Building a Scientific Information Platform: Advanced Architectures and Solutions”, were published as volumes 390 and 467 in Springer's Studies in Computational Intelligence. Its contents are based on the SYNAT 2013 Workshop held in Warsaw. The papers included in this volume present an overview of and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering, addressing the problems of implementing intelligent tools for building a scientific information platform.
The book is a unique effort to present a variety of techniques designed to represent, enhance, and empower multi-disciplinary and multi-institutional machine learning research in healthcare informatics. It provides a unique compendium of current and emerging machine learning paradigms for healthcare informatics and reflects the diversity, complexity, and depth and breadth of this multi-disciplinary area. The integrated, panoramic view of data and machine learning techniques can provide an opportunity for novel clinical insights and discoveries.
This book introduces a novel transcoding algorithm for real-time video applications, designed to overcome inter-operability problems between MPEG-2 and H.264/AVC. The new algorithm achieves a 92.8% reduction in transcoding run time at the price of an acceptable Peak Signal-to-Noise Ratio (PSNR) degradation, enabling readers to use it for real-time video applications. The algorithm is evaluated through simulation and experimental results. In addition, the authors present a hardware implementation of the new algorithm using a Field Programmable Gate Array (FPGA) and an Application-Specific Integrated Circuit (ASIC).
• Describes a novel transcoding algorithm for real-time video applications, designed to overcome inter-operability problems between MPEG-2 and H.264/AVC;
• Implements the presented algorithm using a Field Programmable Gate Array (FPGA) and an Application-Specific Integrated Circuit (ASIC);
• Demonstrates the solution to real problems, with verification through simulation and experimental results.
The book discusses intelligent system design using soft computing and similar systems, and their interdisciplinary applications. It also focuses on recent trends in using soft computing as a versatile tool for designing a host of decision support systems.
Teaching and learning paradigms have attracted increased attention, especially in the last decade. Immense developments in information and communication technologies and services have paved the way for alternative but effective approaches in educational processes. Many concepts of agent technology, such as intelligence, autonomy and cooperation, have had a direct positive impact on many of the requirements imposed on modern e-learning systems and educational processes. This book presents the state of the art in e-learning and tutoring systems and discusses the capabilities and benefits that stem from integrating software agents. We hope that the presented work will be of great use to our colleagues and researchers interested in e-learning and agent technology.
This book presents an exhaustive and timely review of key research work on fuzzy XML data management, and provides readers with a comprehensive resource on the state-of-the-art tools and theories in this fast-growing area. Topics covered in the book include: representation of fuzzy XML, querying of fuzzy XML, fuzzy database models, extraction of fuzzy XML from fuzzy database models, reengineering of fuzzy XML into fuzzy database models, and reasoning on fuzzy XML. The book is intended as a reference guide for researchers, practitioners and graduate students working and/or studying in the field of Web Intelligence, as well as for data and knowledge engineering professionals seeking new approaches to replace traditional methods, which may be unnecessarily complex or even unproductive.
Membrane Computing was introduced as a computational paradigm in Natural Computing. The models introduced, called Membrane (or P) Systems, provide a coherent platform to describe and study living cells as computational systems. Membrane Systems have been investigated for their computational aspects and employed to model problems in other fields, such as computer science, linguistics, biology, economics, computer graphics, and robotics. Their inherent parallelism, heterogeneity and intrinsic versatility allow them to model a broad range of processes and phenomena, making them also an efficient means to solve and analyze problems in a novel way. Membrane Computing has been used to model biological systems, over time becoming a thorough modelling paradigm comparable, in its modelling and predicting capabilities, to more established models in this area. This book is the result of the need to collect, in an organic way, different facets of this paradigm. The chapters of this book, together with the web pages accompanying them, present different applications of Membrane Systems to biology. Deterministic, non-deterministic and stochastic systems, paired with different algorithms and methodologies, show the full potential of this framework. The book is addressed to researchers interested in applications of discrete biological models and in the interplay between Membrane Systems and other approaches to the analysis of complex systems.
This monograph presents a comprehensive study of portfolio optimization, an important area of quantitative finance. Considering that the information available in financial markets is incomplete and that the markets are affected by vagueness and ambiguity, the monograph deals with fuzzy portfolio optimization models. First, the book familiarizes the reader with basic concepts, including the classical mean–variance portfolio analysis. Then, it introduces advanced optimization techniques and applies them to the development of various multi-criteria portfolio optimization models in an uncertain environment. The models are developed considering both the financial and non-financial criteria of investment decision making, as well as input from investment experts. The utility of these models in practice is then demonstrated using numerical illustrations based on real-world data collected from one of the premier stock exchanges in India. The book addresses both academics and professionals pursuing advanced research and/or engaged in practical issues in the rapidly evolving field of portfolio optimization.
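As a reminder of the classical starting point, the minimum-variance portfolio has the closed form w = C^-1 1 / (1^T C^-1 1); the sketch below computes it on made-up numbers (not the book's models or data, which add fuzzy and multi-criteria elements on top of this baseline):

```python
import numpy as np

# Minimal sketch of the classical mean-variance baseline:
# the closed-form minimum-variance portfolio w = C^-1 1 / (1^T C^-1 1).
# The expected returns and covariance matrix are toy numbers.

mu = np.array([0.10, 0.12, 0.08])            # expected returns
C = np.array([[0.040, 0.006, 0.002],         # covariance of returns
              [0.006, 0.090, 0.010],
              [0.002, 0.010, 0.030]])

ones = np.ones(len(mu))
w = np.linalg.solve(C, ones)                 # solve C w = 1
w /= w @ ones                                # normalize weights to sum to 1

print("weights:", np.round(w, 3))
print("expected return:", round(w @ mu, 4))
print("variance:", round(float(w @ C @ w), 5))
```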
Modern information and communication technologies, together with a cultural upheaval within the research community, have profoundly changed research in nearly every aspect. Ranging from sharing and discussing ideas in social networks for scientists to new collaborative environments and novel publication formats, knowledge creation and dissemination as we know it is experiencing a vigorous shift towards increased transparency, collaboration and accessibility. Many assume that research workflows will change more in the next 20 years than they have in the last 200. This book provides researchers, decision makers, and other scientific stakeholders with a snapshot of the basics, the tools, and the underlying visions that drive the current scientific (r)evolution, often called ‘Open Science.’
You may like...
Digital Libraries - Integrating Content… (Mark V Dahl, Kyle Banerjee, …) | Paperback | R1,150 | Discovery Miles 11 500
Handbook of Artificial Intelligence in… (Benedict du Boulay, Antonija Mitrovic, …) | Hardcover | R8,636 | Discovery Miles 86 360
Principles Of Business Information… (Ralph Stair, George Reynolds, …) | Paperback | R1,780 | Discovery Miles 17 800
If Anyone Builds It, Everyone Dies - The… (Eliezer Yudkowsky, Nate Soares) | Paperback