This book presents an examination of the middleware that can be used to configure and operate heterogeneous node platforms and sensor networks. The middleware requirements for a range of application scenarios are compared and analysed. The text then defines a middleware architecture that has been integrated into an approach demonstrated live in a refinery. Features: presents a thorough introduction to the major concepts behind wireless sensor networks (WSNs); reviews the various application scenarios and existing middleware solutions for WSNs; discusses the middleware mechanisms necessary for heterogeneous WSNs; provides a detailed examination of a platform-agnostic middleware architecture, including important implementation details; investigates the programming paradigms for WSNs, and for heterogeneous sensor networks in general; describes the results of extensive experimentation and testing, demonstrating that the generic architecture is viable for implementation on multiple platforms.
This book presents practical optimization techniques used in image processing and computer vision problems. Ill-posed problems are introduced and used as examples to show how each type of problem is related to typical image processing and computer vision problems. Unconstrained optimization gives the best solution based on numerical minimization of a single, scalar-valued objective function or cost function. Unconstrained optimization problems have been intensively studied, and many algorithms and tools have been developed to solve them. Most practical optimization problems, however, arise with a set of constraints. Typical examples of constraints include: (i) pre-specified pixel intensity range, (ii) smoothness or correlation with neighboring information, (iii) existence on a certain contour of lines or curves, and (iv) given statistical or spectral characteristics of the solution. Regularized optimization is a special method used to solve a class of constrained optimization problems. The term regularization refers to the transformation of an objective function with constraints into a different objective function, automatically reflecting constraints in the unconstrained minimization process. Because of its simplicity and efficiency, regularized optimization has many application areas, such as image restoration, image reconstruction, optical flow estimation, etc. Optimization plays a major role in a wide variety of theories for image processing and computer vision. Various optimization techniques are used at different levels for these problems, and this volume summarizes and explains these techniques as applied to image processing and computer vision.
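To make the regularization idea above concrete, here is a minimal Python sketch (an illustrative assumption, not code from the book): a smoothness constraint on a 1-D signal is folded into the objective as a quadratic penalty, and the resulting unconstrained problem is solved by plain gradient descent.

import numpy as np

def regularized_denoise(y, lam=5.0, step=0.01, iters=1000):
    """Minimize ||x - y||^2 + lam * ||diff(x)||^2 by gradient descent.

    The smoothness constraint is absorbed into the objective as a
    quadratic penalty, so the minimization is unconstrained."""
    x = y.copy()
    for _ in range(iters):
        grad = 2.0 * (x - y)          # gradient of the data-fidelity term
        d = np.diff(x)                # differences between neighbouring samples
        grad[:-1] -= 2.0 * lam * d    # gradient of the smoothness penalty
        grad[1:] += 2.0 * lam * d
        x -= step * grad
    return x

noisy = np.sin(np.linspace(0, 3, 200)) + 0.2 * np.random.randn(200)
restored = regularized_denoise(noisy)

The penalty weight lam plays the role of the regularization parameter: larger values enforce the smoothness constraint more strongly at the expense of data fidelity.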
This book develops a coherent and quite general theoretical approach to algorithm design for iterative learning control based on the use of operator representations and quadratic optimization concepts, including the related ideas of inverse model control and gradient-based design. Using detailed examples taken from linear, discrete and continuous-time systems, the author gives the reader access to theories based on either signal or parameter optimization. Although the two approaches are shown to be related in a formal mathematical sense, the text presents them separately, as their relevant algorithm design issues are distinct and give rise to different performance capabilities. Together with algorithm design, the text demonstrates the underlying robustness of the paradigm and also includes new control laws that can incorporate input and output constraints, enable the algorithm to reconfigure systematically to meet the requirements of different reference and auxiliary signals, and support new properties such as spectral annihilation. Iterative Learning Control will interest academics and graduate students working in control, who will find it a useful reference to the current status of a powerful and increasingly popular method of control. The depth of background theory and links to practical systems will be of use to engineers responsible for precision repetitive processes.
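As a hedged sketch of the gradient-based signal-optimization idea described above (the plant model, step size and update law here are illustrative assumptions, not the author's algorithms), the following Python fragment refines the input signal from trial to trial using the lifted system description y = G u and the steepest-descent update u_{k+1} = u_k + gamma * G^T e_k.

import numpy as np

# Lifted description of a discrete-time SISO plant over one trial: y = G @ u,
# with G the lower-triangular Toeplitz matrix of Markov parameters.
N = 50
h = 0.2 * 0.9 ** np.arange(N)                 # assumed impulse response
G = np.array([[h[i - j] if i >= j else 0.0 for j in range(N)]
              for i in range(N)])

r = np.sin(np.linspace(0, 2 * np.pi, N))      # reference trajectory
u = np.zeros(N)                               # input applied on trial 0
gamma = 1.0 / np.linalg.norm(G, 2) ** 2       # step size for monotone error decay

for trial in range(200):
    e = r - G @ u                             # tracking error on this trial
    u = u + gamma * G.T @ e                   # gradient (steepest-descent) update

print("error norm after 200 trials:", np.linalg.norm(r - G @ u))

Because the trial-to-trial error evolves as e_{k+1} = (I - gamma G G^T) e_k, any step size 0 < gamma < 2 / ||G||^2 yields monotone convergence for this invertible model.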
This reference and handbook describes the theory, algorithms and applications of the Global Positioning System (GPS/Glonass/Galileo/Compass). It is primarily based on source-code descriptions of the KSGsoft program developed at the GFZ in Potsdam. The theory and algorithms are extended and verified for a new development of multi-functional GPS/Galileo software. In addition to the concepts reported in the first edition (the unified GPS data processing method, the diagonalisation algorithm, the adaptive Kalman filter, the general ambiguity search criteria, and the algebraic solution of the variation equation) and in the second edition (the equivalence theorem of the GPS algorithms, the independent parameterisation method, and the alternative solar radiation model), this new edition adds the modernisation of the GNSS system, new developments in theory and algorithms, and research on broad applications. Mathematically rigorous, the book begins with an introduction, the basics of coordinate and time systems and satellite orbits, as well as GPS observables, and deals with topics such as physical influences, observation equations and their parameterisation, adjustment and filtering, ambiguity resolution, software development and data processing, and the determination of perturbed orbits.
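For orientation on one of the building blocks listed above, here is a minimal Python sketch of a single predict/update cycle of a standard linear Kalman filter; the constant-velocity model and noise covariances are purely illustrative assumptions and are far simpler than the adaptive GNSS filters treated in the book.

import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a standard linear Kalman filter."""
    # Predict: propagate state estimate and covariance through the model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend the prediction with the new measurement z
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy constant-velocity example (illustrative, not a GNSS model)
F = np.array([[1.0, 1.0], [0.0, 1.0]])       # state transition
H = np.array([[1.0, 0.0]])                   # only the position is measured
Q = 0.01 * np.eye(2)                         # process noise covariance
R = np.array([[0.5]])                        # measurement noise covariance
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, np.array([1.2]), F, H, Q, R)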
The biggest challenges faced by the software industry are cost control and schedule control. As such, effective strategies for process improvement must be researched and implemented. Analyzing the Role of Risk Mitigation and Monitoring in Software Development is a critical scholarly resource that explores software risk and development as organizations continue to implement more applications across multiple technologies and a multi-tiered environment. Featuring coverage on a broad range of topics such as quantitative risk assessment, threat analysis, and software vulnerability management, this book is a vital resource for engineers, academicians, professionals, and researchers seeking current research on the importance of risk management in software development.
The Semantic Web is characterized by the existence of a very large number of distributed semantic resources, which together define a network of ontologies. These ontologies in turn are interlinked through a variety of different meta-relationships such as versioning, inclusion, and many more. This scenario is radically different from the relatively narrow contexts in which ontologies have been traditionally developed and applied, and thus calls for new methods and tools to effectively support the development of novel network-oriented semantic applications. This book by Suarez-Figueroa et al. provides the necessary methodological and technological support for the development and use of ontology networks, which ontology developers need in this distributed environment. After an introduction, in its second part the authors describe the NeOn Methodology framework. The book's third part details the key activities relevant to the ontology engineering life cycle. For each activity, a general introduction, methodological guidelines, and practical examples are provided. The fourth part then presents a detailed overview of the NeOn Toolkit and its plug-ins. Lastly, case studies from the pharmaceutical and the fishery domain round out the work. The book primarily addresses two main audiences: students (and their lecturers) who need a textbook for advanced undergraduate or graduate courses on ontology engineering, and practitioners who need to develop ontologies in particular or Semantic Web-based applications in general. Its educational value is maximized by its structured approach to explaining guidelines and combining them with case studies and numerous examples. The description of the open source NeOn Toolkit provides an additional asset, as it allows readers to easily evaluate and apply the ideas presented.
Increasing numbers of businesses and Information Technology firms are outsourcing their software and Web development tasks. It has been estimated that half of the Fortune 500 companies have utilized outsourcing for their development needs, and that by the end of 2008, 40% of U.S. companies would either develop, test, support, or store software overseas, with another 40% considering doing the same. Several industries, from computer software to telemarketing, have begun aggressively shifting white-collar work out of the United States. The United States currently accounts for more than half of worldwide spending on IT outsourcing, with a growing portion of this spending going to countries such as India, Russia, and the Philippines, and this trend will continue. Research has indicated that the primary problem is language, because of idiomatic expressions and subtle cultural nuances associated with the use of particular words. Thus communication frequently breaks down when dealing with overseas companies.
The book presents various state-of-the-art approaches for process synchronization in a distributed environment. The range of algorithms discussed in the book starts from token-based mutual exclusion algorithms that work on tree-based topologies. It then covers solutions for more flexible logical topologies, such as directed graphs with or without cycles. In a completely different approach, one of the chapters presents two recent voting-based distributed mutual exclusion (DME) algorithms. All DME algorithms presented in the book aim to ensure fairness in terms of first-come, first-served (FCFS) order among equal-priority processes. At the same time, the solutions consider the priority of the requesting processes and allocate the resource to the earliest request when no request from a higher-priority process is pending.
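The allocation rule sketched above (serve the earliest pending request unless a higher-priority request is waiting) can be illustrated with a small priority queue; the Python fragment below is a hypothetical, purely local scheduler used only to show the ordering rule, not one of the distributed algorithms from the book.

import heapq

class RequestQueue:
    """Orders pending requests by (priority, arrival time).

    A lower 'priority' value means higher priority; ties are broken by the
    arrival timestamp, giving FCFS order among equal-priority processes."""
    def __init__(self):
        self._heap = []
        self._clock = 0

    def request(self, process_id, priority):
        self._clock += 1                       # logical arrival timestamp
        heapq.heappush(self._heap, (priority, self._clock, process_id))

    def grant_next(self):
        # Earliest request for which no higher-priority request is pending
        if self._heap:
            priority, ts, pid = heapq.heappop(self._heap)
            return pid
        return None

q = RequestQueue()
q.request("P1", priority=2)
q.request("P2", priority=2)    # same priority, arrives after P1
q.request("P3", priority=1)    # higher priority (lower value)
assert q.grant_next() == "P3"  # higher-priority request served first
assert q.grant_next() == "P1"  # then FCFS among equal-priority requests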
This book presents fundamental new techniques for understanding and processing geospatial data. These "spatial gems" articulate and highlight insightful ideas that often remain unstated in graduate textbooks, and which are not the focus of research papers. They teach us how to do something useful with spatial data, in the form of algorithms, code, or equations. Unlike a research paper, Spatial Gems, Volume 1 does not focus on "Look what we have done!" but rather shows "Look what YOU can do!" With contributions from researchers at the forefront of the field, this volume occupies a unique position in the literature by serving graduate students, professional researchers, professors, and computer developers in the field alike.
This book reveals the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945. It traces the all-important genesis and development of the cryptanalytic techniques used to break the main Japanese Navy code (JN-25) and the Japanese Army's Water Transport Code during WWII. This is the first book to describe, explain and analyze the code-breaking techniques developed and used to provide this intelligence, thus closing the sole remaining gap in the published accounts of the Pacific War. The authors also explore the organization of cryptographic teams and issues of security, censorship, and leaks. Correcting gaps in previous research, this book illustrates how Sigint remained crucial to Allied planning throughout the war. It helped direct the advance to the Philippines from New Guinea, the sea battles and the submarine onslaught on merchant shipping. Written by well-known authorities on the history of cryptography and mathematics, Code Breaking in the Pacific is designed for cryptologists, mathematicians and researchers working in communications security. Advanced-level students interested in cryptology, the history of the Pacific War, mathematics or the history of computing will also find this book a valuable resource.
The variety and abundance of qualitative characteristics of agricultural products have been the main reasons for the development of different types of non-destructive testing (NDT) methods. Quality control of these products is one of the most important tasks in manufacturing processes. The use of control and automation has become more widespread, and new approaches provide opportunities for production competition through new technologies. Applications of Image Processing and Soft Computing Systems in Agriculture examines applications of artificial intelligence in agriculture and the main uses of shape analysis on agricultural products, such as relationships between form and genetics, adaptation, product characteristics, and product sorting. Additionally, it provides insights developed through computer vision techniques. Highlighting such topics as deep learning, agribusiness, and augmented reality, it is designed for academicians, researchers, agricultural practitioners, and industry professionals.
The book emphasizes neural network structures for achieving practical and effective systems, and provides many examples. Practitioners, researchers, and students in industrial, manufacturing, electrical, mechanical, and production engineering will find this volume a unique and comprehensive reference source for diverse application methodologies.
This graduate-level textbook elucidates low-risk and fail-safe systems in mathematical detail. It addresses, in particular, problems where mission-critical performance is paramount, such as in aircraft, missiles, nuclear reactors and weapons, submarines, and many other types of systems where "failure" can result in overwhelming loss of life and property. The book is divided into four parts: Fundamentals, Electronics, Software, and Dangerous Goods. The first part on Fundamentals addresses general concepts of system safety engineering that are applicable to any type of system. The second part, Electronics, addresses the detection and correction of electronic hazards. In particular, the Bent Pin Problem, Sneak Circuit Problem, and related electrical problems are discussed with mathematical precision. The third part on Software addresses predicting software failure rates as well as detecting and correcting deep software logical flaws (called defects). The fourth part on Dangerous Goods presents solutions to three typical industrial chemical problems faced by the system safety engineer during the design, storage, and disposal phases of a dangerous goods' life cycle.
Quantum physics started in the 1920s with wave mechanics and the wave-particle duality. However, the last 20 years have seen a second quantum revolution, centered around non-locality and quantum correlations between measurement outcomes. The associated key property, entanglement, is recognized today as the signature of quantumness. This second revolution opened the possibility of studying quantum correlations without any assumption on the internal functioning of the measurement apparatus, the so-called Device-Independent Approach to Quantum Physics. This thesis explores this new approach using the powerful geometrical tool of polytopes. Emphasis is placed on the study of non-locality in the case of three or more parties, where it is shown that a whole new variety of phenomena appears compared to the bipartite case. Genuine multiparty entanglement is also studied for the first time within the device-independent framework. Finally, these tools are used to answer a long-standing open question: could quantum non-locality be explained by influences that propagate from one party to the others faster than light, but that remain hidden so that one cannot use them to communicate faster than light? This would provide a way around Einstein's notion of action at a distance that would be compatible with relativity. However, the answer is shown to be negative, as such influences could not remain hidden.
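For readers new to the polytope picture mentioned above, a standard textbook example (not material from the thesis) is the simplest bipartite scenario, where the non-trivial facets of the local polytope are the CHSH inequalities |S| <= 2; the Python fragment below checks that the singlet-state correlations E(a,b) = -cos(a-b) reach |S| = 2*sqrt(2), beyond that facet.

import numpy as np

def E(a, b):
    # Quantum correlation of spin measurements along angles a and b
    # performed on the two halves of a singlet state
    return -np.cos(a - b)

# Measurement angles for Alice (a, a2) and Bob (b, b2)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, -np.pi / 4

# CHSH combination: any local-hidden-variable model satisfies |S| <= 2
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(abs(S), 2 * np.sqrt(2))   # both print 2.828..., violating the local bound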
Technological advancements have become an integral part of life, impacting the way we work, communicate, make decisions, learn, and play. As technology continually progresses, humans are being outpaced by its capabilities, and it is important for businesses, organizations, and individuals to understand how to optimize data and to implement new methods for more efficient knowledge discovery and information management and retrieval. Innovative Applications of Knowledge Discovery and Information Resources Management offers in-depth coverage on the pervasiveness of technological change with a collection of material on topics such as the impact of permeable work-life boundaries, burnout and turnover, big data usage, and computer-based learning. It proves a worthy source for academicians, practitioners, IT leaders, IT professionals, and advanced-level students interested in examining the ways in which technology is changing the world.
Enabling information interoperability, fostering legal knowledge usability and reuse, enhancing legal information search, in short, formalizing the complexity of legal knowledge to enhance legal knowledge management are challenging tasks, for which different solutions and lines of research have been proposed. During the last decade, research and applications based on the use of legal ontologies as a technique to represent legal knowledge have raised a very interesting debate about their capacity and limitations to represent conceptual structures in the legal domain. Making conceptual legal knowledge explicit would support the development of a web of legal knowledge, improve communication, create trust and enable and support open data, e-government and e-democracy activities. Moreover, this explicit knowledge is also relevant to the formalization of software agents and the shaping of virtual institutions and multi-agent systems or environments. This book explores the use of ontologies in legal knowledge representation for semantically-enhanced legal knowledge systems or web-based applications. In it, current methodologies, tools and languages used for ontology development are reviewed, and the book includes an exhaustive review of existing ontologies in the legal domain. The development of the Ontology of Professional Judicial Knowledge (OPJK) is presented as a case study.
Blockchain and other trustless systems have gone from being relatively obscure technologies, which were only known to a small community of computer scientists and cryptologists, to mainstream phenomena that are now considered powerful game changers for many industries. This book explores and assesses real-world use cases and case studies on blockchain and related technologies. The studies describe the respective applications and address how these technologies have been deployed, the rationale behind their application, and finally, their outcomes. The book shares a wealth of experiences and lessons learned regarding financial markets, energy, SCM, healthcare, law and compliance. Given its scope, it is chiefly intended for academics and practitioners who want to learn more about blockchain applications.
Logical form has always been a prime concern for philosophers belonging to the analytic tradition. For at least one century, the study of logical form has been widely adopted as a method of investigation, relying on its capacity to reveal the structure of thoughts or the constitution of facts. This book focuses on the very idea of logical form, which is directly relevant to any principled reflection on that method. Its central thesis is that there is no such thing as a correct answer to the question of what logical form is: two significantly different notions of logical form are needed to fulfill two major theoretical roles that pertain respectively to logic and to semantics. This thesis has a negative and a positive side. The negative side is that a deeply rooted presumption about logical form turns out to be overly optimistic: there is no unique notion of logical form that can play both roles. The positive side is that the distinction between two notions of logical form, once properly spelled out, sheds light on some fundamental issues concerning the relation between logic and language.
This book gathers selected high-quality research papers presented at the Arab Conference for Emerging Technologies 2020, organized virtually in Cairo during 21-23 June 2020. The book emphasizes the role and recent developments in the field of emerging technologies, artificial intelligence, and related technologies, with a special focus on sustainable development in the Arab world. It targets high-quality scientific research papers with applications, including theory, practice, prototypes, new ideas, case studies and surveys covering machine learning applications in data science.
Social networks have emerged as a major trend in computing and social paradigms in the past few years. The social network model helps to inform the study of community behavior, allowing qualitative and quantitative assessments of how people communicate and the rules that govern communication. Social Networking and Community Behavior Modeling: Qualitative and Quantitative Measures provides a clear and consolidated view of current social network models. This work explores new methods for modeling, characterizing, and constructing social networks. Chapters contained in this book study critical security issues confronting social networking, the emergence of new mobile social networking devices and applications, network robustness, and how social networks impact the business aspects of organizations.
Tearing and interconnecting methods, such as FETI, FETI-DP, BETI, etc., are among the most successful domain decomposition solvers for partial differential equations. The purpose of this book is to give a detailed and self-contained presentation of these methods, including the corresponding algorithms as well as a rigorous convergence theory. In particular, two issues are addressed that have not been covered in any monograph yet: the coupling of finite and boundary elements within the tearing and interconnecting framework including exterior problems, and the case of highly varying (multiscale) coefficients not resolved by the subdomain partitioning. In this context, the book offers a detailed view of an active and up-to-date area of research.
The aim of this book is to give a complete overview of classical electromagnetic theory, together with detailed insight into modern numerical methods for the analysis of problems in electromagnetics. Classical electromagnetic theory was developed in the 19th century, but thanks to a wide range of applications, from electrical apparatus such as motors and heaters to telecommunications, the subject remains highly relevant. This book explains the basic postulates and laws of the theory and its specialization to static and time-dependent problems. Special attention is given to the utilization of computers in applying modern numerical methods to the solution of electromagnetic field problems.