In recent years, IT standardization has become increasingly complex as a result of globalization, widespread Internet use, and the economic importance of standards. New Applications in IT Standards: Developments and Progress unites contributions on all facets of standards research, providing essential research on developing, teaching, and implementing standards in global organizations and institutions. Researchers can benefit from specific cases, frameworks, and new theories in IT standards studies.
Imagine yourself as a military officer in a conflict zone trying to identify locations of weapons caches supporting road-side bomb attacks on your country's troops. Or imagine yourself as a public health expert trying to identify the location of contaminated water that is causing diarrheal diseases in a local population. Geospatial abduction is a new technique introduced by the authors that allows such problems to be solved. Geospatial Abduction provides the mathematics underlying geospatial abduction and the algorithms to solve such problems in practice; it has wide applicability and can be used by practitioners and researchers in many different fields. Real-world applications of geospatial abduction to military problems are included. Compelling examples drawn from other domains as diverse as criminology, epidemiology and archaeology are covered as well. This book also includes access to a dedicated website on geospatial abduction hosted by the University of Maryland. Geospatial Abduction targets practitioners working in general AI, game theory, linear programming, data mining, machine learning, and more. Those working in the fields of computer science, mathematics, geoinformation, and geological and biological science will also find this book valuable.
This monograph presents examples of best practices when combining bioinspired algorithms with parallel architectures. The book includes recent work by leading researchers in the field and offers a map with the main paths already explored and new ways towards the future. Parallel Architectures and Bioinspired Algorithms will be of value both to specialists in Bioinspired Algorithms and Parallel and Distributed Computing, and to computer science students trying to understand the present and the future of Parallel Architectures and Bioinspired Algorithms.
Organizations of all types are consistently working on new initiatives, product lines, or implementation of new workflows as a way to remain competitive in the modern business environment. No matter the type of project, employing the best methods for effective execution and timely completion of the task at hand is essential to project success. The implementation of computer technology has provided further opportunities for innovation and progress in the daily operations and initiatives of corporations. Knowledge Management and Innovation in Network Organizations: Emerging Research and Opportunities is an essential scholarly resource that explores the use of information communication technologies in management models and the development of network organizations operating in various sectors of the economy. Highlighting coverage on a wide range of topics such as cloud computing, organizational development, and business management, this book is ideal for business professionals, organizational researchers, and academicians interested in the latest research on network organizations.
Fuzzy Cluster Analysis presents advanced and powerful fuzzy clustering techniques. This thorough and self-contained introduction to fuzzy clustering methods and applications covers classification, image recognition, data analysis and rule generation. Combining theoretical and practical perspectives, each method is analysed in detail and fully illustrated with examples.
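To make the idea concrete, here is a minimal sketch of the classic fuzzy c-means algorithm, the prototypical fuzzy clustering method: unlike hard clustering, every point receives a graded membership in each cluster. This is a generic textbook version, not the book's specific formulation; the data and parameters are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Classic fuzzy c-means: every point gets a graded membership in each cluster."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m                               # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))            # standard membership update rule
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated blobs: memberships become near-crisp after convergence.
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

The fuzzifier `m` controls how soft the partition is: as `m` approaches 1 the memberships approach hard k-means assignments.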
Modelling for Business Improvement contains the proceedings of the First International Conference on Process Modelling and Process Management (MMEP 2010), held in Cambridge, England, in March 2010. It contains contributions from an international group of leading researchers in the fields of process modelling and process management. The conference showcased recent trends in the modelling and management of engineering processes, explored potential synergies between different modelling approaches, and gathered and discussed future challenges and research areas for the management of engineering processes. Modelling for Business Improvement is divided into three main parts: the theoretical foundations of modelling and management of engineering processes, and achievements in theory; experiences from management practice using various modelling methods and tools, and their future challenges; and new perspectives on modelling methods, techniques and tools. Based on the latest achievements in this and related fields, the editors aim to chart the research map for modelling and management of engineering processes towards 2020.
This volume presents some recent and principal developments related to computational intelligence and optimization methods in control. Theoretical aspects and practical applications of control engineering are covered by 14 self-contained contributions. Additional gems include the discussion of future directions and research perspectives, designed to add to the reader's understanding of both the challenges faced in control engineering and the insights into the development of new techniques. With the knowledge obtained, readers are encouraged to determine the appropriate control method for specific applications.
This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. A number of applications are given to show the effectiveness of risk management technologies. In addition, topics of lectures and practical computer exercises intended for a two-semester course "Risk Management Technologies" are suggested.
This book is a comprehensive, systematic survey of the synthesis problem, and of region theory which underlies its solution, covering the related theory, algorithms, and applications. The authors focus on safe Petri nets and place/transition nets (P/T-nets), treating synthesis as an automated process which, given behavioural specifications or partial specifications of a system to be realized, decides whether the specifications are feasible, and then produces a Petri net realizing them exactly, or if this is not possible produces a Petri net realizing an optimal approximation of the specifications. In Part I the authors introduce elementary net synthesis. In Part II they explain variations of elementary net synthesis and the unified theory of net synthesis. The first three chapters of Part III address the linear algebraic structure of regions, synthesis of P/T-nets from finite initialized transition systems, and the synthesis of unbounded P/T-nets. Finally, the last chapter in Part III and the chapters in Part IV cover more advanced topics and applications: P/T-nets with the step firing rule, extracting concurrency from transition systems, process discovery, supervisory control, and the design of speed-independent circuits. Most chapters conclude with exercises, and the book is a valuable reference for both graduate students of computer science and electrical engineering and researchers and engineers in this domain.
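For readers unfamiliar with P/T-nets, the basic firing rule that synthesis targets can be sketched in a few lines. This is a generic illustration of the standard semantics, not the authors' algorithms; the place and transition names are hypothetical.

```python
def enabled(marking, pre):
    """A transition is enabled iff each input place holds at least the required tokens."""
    return all(marking.get(place, 0) >= w for place, w in pre.items())

def fire(marking, pre, post):
    """Fire an enabled transition: consume from input places, produce on output places."""
    assert enabled(marking, pre), "transition not enabled under this marking"
    m = dict(marking)
    for place, w in pre.items():
        m[place] -= w
    for place, w in post.items():
        m[place] = m.get(place, 0) + w
    return m

# A two-place toy net: transition t_produce moves the single token from
# 'ready' into 'buffer'; afterwards it is no longer enabled.
m0 = {"ready": 1, "buffer": 0}
pre, post = {"ready": 1}, {"buffer": 1}
m1 = fire(m0, pre, post)
```

Region theory then asks the converse question: given the reachable behaviour (a transition system), can weights `pre`/`post` and an initial marking be found that generate exactly that behaviour?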
This reference and handbook describes theory, algorithms and applications of the Global Positioning System (GPS/Glonass/Galileo/Compass). It is primarily based on source-code descriptions of the KSGsoft program developed at the GFZ in Potsdam. The theory and algorithms are extended and verified for a new development of a multi-functional GPS/Galileo software. Besides the concepts such as the unified GPS data processing method, the diagonalisation algorithm, the adaptive Kalman filter, the general ambiguity search criteria, and the algebraic solution of variation equation reported in the first edition, the equivalence theorem of the GPS algorithms, the independent parameterisation method, and the alternative solar radiation model reported in the second edition, the modernisation of the GNSS system, the new development of the theory and algorithms, and research in broad applications are supplemented in this new edition. Mathematically rigorous, the book begins with the introduction, the basics of coordinate and time systems and satellite orbits, as well as GPS observables, and deals with topics such as physical influences, observation equations and their parameterisation, adjustment and filtering, ambiguity resolution, software development and data processing and the determination of perturbed orbits.
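Among the techniques the book lists, the adaptive Kalman filter builds on the basic predict/correct recursion, which can be sketched in scalar form. This is a generic textbook version, unrelated to the KSGsoft code; all parameter values are illustrative.

```python
def kalman_1d(zs, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed with noise variance r."""
    x, p, estimates = x0, p0, []
    for z in zs:
        p = p + q                  # predict: state variance grows by process noise q
        k = p / (p + r)            # Kalman gain balances prediction vs measurement
        x = x + k * (z - x)        # correct the estimate with the measurement residual
        p = (1.0 - k) * p          # updated (shrunken) estimate variance
        estimates.append(x)
    return estimates

est = kalman_1d([5.0] * 20)        # constant measurements: estimate climbs toward 5
```

An adaptive variant, as in GNSS processing, additionally re-estimates the noise variances `q` and `r` online from the residuals.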
"The Supply of ConceptS" achieves a major breakthrough in the general theory of systems. It unfolds a theory of everything that steps beyond Physics' theory of the same name. The author unites all knowledge by including not only the natural but also the philosophical and theological universes of discourse. The general systems model presented here resembles an organizational flow chart that represents conceptual positions within any type of system and shows how the parts are connected hierarchically for communication and control. Analyzing many types of systems in various branches of learned discourse, the model demonstrates how any system type manages to maintain itself true to type. The concepts thus generated form a network that serves as a storehouse for the supply of concepts in learned discourse. Partial to the use of analogies, Irving Silverman presents his thesis in an easy-to-read style, explaining a way of thinking that he has found useful. This book will be of particular interest to the specialist in systems theory, philosophy, linguistics, and the social sciences. Irving Silverman applies his general systems model to 22 system types and presents rationales for these analyses. He provides the reader with a method, and a way to apply that method; a theory of knowledge derived from the method; and a practical outlook based on a comprehensive approach. Chapters include: Minding the Storehouse; Standing Together; The Cognitive Contract; The Ecological Contract; The Social Contract; The Semantic Terrain.
This book presents material from 3 survey lectures and 14 additional invited lectures given at the Euroconference "Computational Methods for Representations of Groups and Algebras" held at Essen University in April 1997. The purpose of this meeting was to provide a survey of general theoretical and computational methods and recent advances in the representation theory of groups and algebras. The foundations of these research areas were laid in survey articles by P. Dräxler and R. Nörenberg on "Classification problems in the representation theory of finite-dimensional algebras," R. A. Wilson on "Construction of finite matrix groups" and E. Green on "Noncommutative Gröbner bases, and projective resolutions." Furthermore, new applications of the computational methods in linear algebra to the revision of the classification of finite simple sporadic groups are presented. Computational tools (including high-performance computations on supercomputers) have become increasingly important for classification problems. They are also inevitable for the construction of projective resolutions of finitely generated modules over finite-dimensional algebras and the study of group cohomology and rings of invariants. A major part of this book is devoted to a survey of algorithms for computing special examples in the study of Grothendieck groups, quadratic forms and derived categories of finite-dimensional algebras. Open questions on Lie algebras, Bruhat orders, Coxeter groups and Kazhdan-Lusztig polynomials are investigated with the aid of computer programs. The contents of this book provide an overview of the present state of the art. Therefore it will be very useful for graduate students and researchers in mathematics, computer science and physics.
Analysis and Control of Boolean Networks presents a systematic new approach to the investigation of Boolean control networks. The fundamental tool in this approach is a novel matrix product called the semi-tensor product (STP). Using the STP, a logical function can be expressed as a conventional discrete-time linear system. In the light of this linear expression, certain major issues concerning Boolean network topology - fixed points, cycles, transient times and basins of attractors - can be easily revealed by a set of formulae. This framework renders the state-space approach to dynamic control systems applicable to Boolean control networks. The bilinear-systemic representation of a Boolean control network makes it possible to investigate basic control problems including controllability, observability, stabilization, disturbance decoupling etc.
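The semi-tensor product itself is easy to state: for A (m x n) and B (p x q), A ⋉ B = (A ⊗ I_{t/n})(B ⊗ I_{t/p}) with t = lcm(n, p), which reduces to the ordinary matrix product when n = p. The sketch below (a generic illustration, not the authors' code) shows how a logical function becomes a matrix under this product, using the usual vector encoding of truth values.

```python
import numpy as np
from math import lcm

def stp(A, B):
    """Semi-tensor product: matrix product generalized to mismatched dimensions."""
    A, B = np.atleast_2d(A), np.atleast_2d(B)
    n, p = A.shape[1], B.shape[0]
    t = lcm(n, p)
    return np.kron(A, np.eye(t // n)) @ np.kron(B, np.eye(t // p))

# Vector encoding of truth values and the structure matrix of conjunction (AND):
T = np.array([[1], [0]])          # delta_2^1 encodes True
F = np.array([[0], [1]])          # delta_2^2 encodes False
M_and = np.array([[1, 0, 0, 0],
                  [0, 1, 1, 1]])
# M_and ⋉ x ⋉ y evaluates x AND y as an ordinary linear-algebra computation.
```

With every logical connective replaced by such a structure matrix, a whole Boolean network update becomes one linear map on the state vector, which is what makes the state-space analysis described above possible.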
As population increases, the need for energy becomes a crisis of great importance. Technologies for Electrical Power Conversion, Efficiency, and Distribution: Methods and Processes combines unparalleled research, contemporary achievements, and emerging trends within electrical energy conversion technologies and renewable energy sources. The scholarly findings compiled provide a background for discussion of the problems and opportunities of power efficiency and energy conversion in order to develop innovative ways to implement such cutting-edge technologies in the future.
"The healthcare industry in the United States consumes roughly 20% of the gross national product per year. This huge expenditure not only represents a large portion of the country's collective interests, but also an enormous amount of medical information. Information intensive healthcare enterprises have unique issues related to the collection, disbursement, and integration of various data within the healthcare system.Information Systems and Healthcare Enterprises provides insight on the challenges arising from the adaptation of information systems to the healthcare industry, including development, design, usage, adoption, expansion, and compliance with industry regulations. Highlighting the role of healthcare information systems in fighting healthcare fraud and the role of information technology and vendors, this book will be a highly valued addition to academic, medical, and health science libraries."
Most networks and databases that humans have to deal with contain a large, albeit finite, number of units. Their structure, which maintains the functional consistency of the components, is essentially non-random and calls for a precise quantitative description of the relations between nodes (or data units) and all network components. This book is an introduction, for both graduate students and newcomers to the field, to the theory of graphs and random walks on such graphs. The methods based on random walks and diffusions for exploring the structure of finite connected graphs and databases are reviewed (Markov chain analysis). This provides the necessary basis for consistently discussing a number of applications as diverse as electric resistance networks, estimation of land prices, urban planning, linguistic databases, music, and gene expression regulatory networks.
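The basic object of such Markov chain analysis, the random-walk transition matrix of a graph and its stationary distribution, can be sketched as follows; the four-node graph is a made-up example.

```python
import numpy as np

# Adjacency matrix of a small undirected graph: a triangle {0,1,2} with a
# pendant node 3 attached to node 2 (connected and non-bipartite).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

deg = A.sum(axis=1)
P = A / deg[:, None]               # row-stochastic transition matrix of the walk

# For an undirected graph the stationary distribution is proportional to degree.
pi = deg / deg.sum()

# Power iteration from a point mass converges to pi, since the chain is ergodic.
mu = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(200):
    mu = mu @ P
```

Quantities such as commute times and effective resistances, which underlie the electric-network application mentioned above, are all derived from this matrix `P`.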
Applying TQM to systems engineering can reduce costs while simultaneously improving product quality. This guide to proactive systems engineering shows how to develop and optimize a practical approach, while highlighting the pitfalls and potentials involved.
Prolog Programming at its best! Discover a book that tells you what you should do and how! Instead of jumping right into the instructions, this book first provides you with all the necessary concepts you need in order to make the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you get to the more complex lessons in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to your learning. You will also learn the designs and forms of Parallel, and what's more convenient than getting to know both sides? Want to know more? Buy now!
Biological and biomedical studies have entered a new era over the past two decades thanks to the wide use of mathematical models and computational approaches. A booming of computational biology, which was merely a theoretician's fantasy twenty years ago, has become a reality. Obsession with computational biology and theoretical approaches is evidenced in articles hailing the arrival of what are variously called quantitative biology, bioinformatics, theoretical biology, and systems biology. New technologies and data resources in genetics, such as the International HapMap project, enable large-scale studies, such as genome-wide association studies, which could potentially identify most common genetic variants as well as rare variants of the human DNA that may alter an individual's susceptibility to disease and the response to medical treatment. Meanwhile, multi-electrode recording from behaving animals makes it feasible to control the animal's mental activity, which could potentially lead to the development of useful brain-machine interfaces. Embracing the sheer volume of genetic, genomic, and other types of data, an essential approach is, first of all, to avoid drowning the true signal in the data. It has been witnessed that the theoretical approach to biology has emerged as a powerful and stimulating research paradigm in biological studies, which in turn leads to a new research paradigm in mathematics, physics, and computer science and moves forward with the interplays among experimental studies and outcomes, simulation studies, and theoretical investigations.
"Digital Forensics for Legal Professionals" provides you with a guide to digital technology forensics in plain English. In the authors years of experience in working with attorneys as digital forensics experts, common questions arise again and again: What do I ask for? Is the evidence relevant? What does this item in the forensic report mean? What should I ask the other expert? What should I ask you? Can you explain that to a jury? This book answers many of those questions in clear language that is understandable by non-technical people. With many illustrations and diagrams that will be usable in court, they explain technical concepts such as unallocated space, forensic copies, timeline artifacts and metadata in simple terms that make these concepts accessible to both attorneys and juries. The authors also explain how to determine what evidence to ask
for, evidence might be that could be discoverable, and the methods
for getting to it including relevant subpoena and motion language.
Additionally, this book provides an overview of the current state
of digital forensics, the right way to select a qualified expert,
what to expect from a qualified expert and how to properly use
experts before and during trial.
New Edition: Introduction to Computational Earthquake Engineering (3rd Edition). Introduction to Computational Earthquake Engineering covers solid continuum mechanics, the finite element method and stochastic modeling comprehensively, with the second and third chapters explaining the numerical simulation of strong ground motion and faulting, respectively. Stochastic modeling is used for uncertain underground structures, and advanced analytical methods for linear and non-linear stochastic models are presented. The verification of these methods by comparing the simulation results with observed data is then presented, and examples of numerical simulations which apply these methods to practical problems are generously provided. Furthermore, three advanced topics of computational earthquake engineering are covered, detailing examples of applying computational science technology to earthquake engineering problems.