"Computer Science and Convergence" is the proceedings of the 3rd FTRA International Conference on Computer Science and its Applications (CSA-11) and the 2011 FTRA World Convergence Conference (FTRA WCC 2011). The topics of CSA and WCC cover current hot topics that address ever-changing worldwide needs. CSA-11 is a comprehensive conference focused on the various aspects of advances in computer science and its applications, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of CSA. In addition, the conference publishes high-quality papers closely related to the various theories and practical applications in CSA. Furthermore, the conference and its publications are expected to be a trigger for further related research and technology improvements in this important subject. The main scope of CSA-11 includes mobile and ubiquitous computing. WCC-11 is a major conference for scientists, engineers, and practitioners throughout the world to present the latest research results, ideas, developments and applications in all areas of convergence technologies. The main scope of WCC-11 includes cryptography and security for converged environments, and wireless sensor networks for converged environments.
The book presents various state-of-the-art approaches for process synchronization in a distributed environment. The range of algorithms discussed in the book starts from token-based mutual exclusion algorithms that work on tree-based topologies. Then there are interesting solutions for more flexible logical topologies such as a directed graph, with or without cycles. In a completely different approach, one of the chapters presents two recent voting-based distributed mutual exclusion (DME) algorithms. All DME algorithms presented in the book aim to ensure fairness in terms of first come first serve (FCFS) order among equal-priority processes. At the same time, the solutions consider the priority of the requesting processes and allocate the resource to the earliest request when no request from a higher-priority process is pending.
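The priority-plus-FCFS allocation rule described above can be sketched in miniature. This is a centralized toy with a single shared queue, not one of the book's distributed algorithms; the class and method names are hypothetical:

```python
import heapq

class PriorityFcfsAllocator:
    """Grant a single shared resource to the highest-priority request,
    breaking ties among equal-priority requests first-come-first-served."""

    def __init__(self):
        self._heap = []      # entries: (priority, arrival_stamp, process_id)
        self._arrivals = 0   # monotonically increasing arrival counter

    def request(self, process_id, priority):
        # Lower number = higher priority; the arrival stamp enforces FCFS
        heapq.heappush(self._heap, (priority, self._arrivals, process_id))
        self._arrivals += 1

    def grant_next(self):
        # Earliest request among the highest-priority pending ones wins
        if not self._heap:
            return None
        _, _, process_id = heapq.heappop(self._heap)
        return process_id

alloc = PriorityFcfsAllocator()
alloc.request("p1", priority=1)
alloc.request("p2", priority=0)  # higher priority, arrives later
alloc.request("p3", priority=0)
print(alloc.grant_next())  # p2: highest priority, earliest among equals
print(alloc.grant_next())  # p3
print(alloc.grant_next())  # p1
```

The tuple ordering in the heap does all the work: priority is compared first, and the arrival stamp settles ties in request order.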
This reference and handbook describes theory, algorithms and applications of the Global Positioning System (GPS/Glonass/Galileo/Compass). It is primarily based on source-code descriptions of the KSGsoft program developed at the GFZ in Potsdam. The theory and algorithms are extended and verified for a new development of a multi-functional GPS/Galileo software. Besides the concepts such as the unified GPS data processing method, the diagonalisation algorithm, the adaptive Kalman filter, the general ambiguity search criteria, and the algebraic solution of variation equation reported in the first edition, the equivalence theorem of the GPS algorithms, the independent parameterisation method, and the alternative solar radiation model reported in the second edition, the modernisation of the GNSS system, the new development of the theory and algorithms, and research in broad applications are supplemented in this new edition. Mathematically rigorous, the book begins with the introduction, the basics of coordinate and time systems and satellite orbits, as well as GPS observables, and deals with topics such as physical influences, observation equations and their parameterisation, adjustment and filtering, ambiguity resolution, software development and data processing and the determination of perturbed orbits.
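Of the techniques this blurb mentions, the Kalman filter is the easiest to illustrate in miniature. Below is a one-dimensional sketch of a standard (non-adaptive) Kalman filter with illustrative noise variances; it is not the book's adaptive variant or its GPS processing code:

```python
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Estimate a slowly varying scalar from noisy measurements.
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: variance grows by process noise
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update with the measurement residual
        p = (1 - k) * p          # variance of the updated estimate
        estimates.append(x)
    return estimates

# Noisy readings of a quantity near 1.0 (made-up data)
est = kalman_1d([1.2, 0.8, 1.1, 0.9, 1.0])
print([round(e, 3) for e in est])
```

Each step blends the prediction with the new measurement, weighted by the gain k; an adaptive filter would additionally re-estimate q and r online.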
This book is devoted to Professor Jürgen Lehn, who passed away on September 29, 2008, at the age of 67. It contains invited papers that were presented at the Workshop on Recent Developments in Applied Probability and Statistics Dedicated to the Memory of Professor Jürgen Lehn, Middle East Technical University (METU), Ankara, April 23-24, 2009, which was jointly organized by the Technische Universität Darmstadt (TUD) and METU. The papers present surveys on recent developments in the area of applied probability and statistics. In addition, papers from the Panel Discussion: Impact of Mathematics in Science, Technology and Economics are included. Jürgen Lehn was born on the 28th of April, 1941 in Karlsruhe. From 1961 to 1968 he studied mathematics in Freiburg and Karlsruhe, and obtained a Diploma in Mathematics from the University of Karlsruhe in 1968. He obtained his Ph.D. at the University of Regensburg in 1972, and his Habilitation at the University of Karlsruhe in 1978. Later in 1978, he became a C3 level professor of Mathematical Statistics at the University of Marburg. In 1980 he was promoted to a C4 level professorship in mathematics at the TUD, where he was a researcher until his death.
"The Supply of Concepts" achieves a major breakthrough in the general theory of systems. It unfolds a theory of everything that steps beyond Physics' theory of the same name. The author unites all knowledge by including not only the natural but also the philosophical and theological universes of discourse. The general systems model presented here resembles an organizational flow chart that represents conceptual positions within any type of system and shows how the parts are connected hierarchically for communication and control. Analyzing many types of systems in various branches of learned discourse, the model demonstrates how any system type manages to maintain itself true to type. The concepts thus generated form a network that serves as a storehouse for the supply of concepts in learned discourse. Partial to the use of analogies, Irving Silverman presents his thesis in an easy-to-read style, explaining a way of thinking that he has found useful. This book will be of particular interest to the specialist in systems theory, philosophy, linguistics, and the social sciences. Irving Silverman applies his general systems model to 22 system types and presents rationales for these analyses. He provides the reader with a method, and a way to apply that method; a theory of knowledge derived from the method; and a practical outlook based on a comprehensive approach. Chapters include: Minding the Storehouse; Standing Together; The Cognitive Contract; The Ecological Contract; The Social Contract; The Semantic Terrain.
This edited book presents scientific results of the 14th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2015), which was held on June 28 - July 1, 2015 in Las Vegas, USA. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way. The papers present research results on all aspects (theory, applications and tools) of computer and information science, and discuss the practical challenges encountered along the way and the solutions adopted to solve them.
This book presents material from 3 survey lectures and 14 additional invited lectures given at the Euroconference "Computational Methods for Representations of Groups and Algebras" held at Essen University in April 1997. The purpose of this meeting was to provide a survey of general theoretical and computational methods and recent advances in the representation theory of groups and algebras. The foundations of these research areas were laid in survey articles by P. Dräxler and R. Nörenberg on "Classification problems in the representation theory of finite-dimensional algebras," R. A. Wilson on "Construction of finite matrix groups" and E. Green on "Noncommutative Gröbner bases and projective resolutions." Furthermore, new applications of the computational methods in linear algebra to the revision of the classification of finite simple sporadic groups are presented. Computational tools (including high-performance computations on supercomputers) have become increasingly important for classification problems. They are also indispensable for the construction of projective resolutions of finitely generated modules over finite-dimensional algebras and the study of group cohomology and rings of invariants. A major part of this book is devoted to a survey of algorithms for computing special examples in the study of Grothendieck groups, quadratic forms and derived categories of finite-dimensional algebras. Open questions on Lie algebras, Bruhat orders, Coxeter groups and Kazhdan-Lusztig polynomials are investigated with the aid of computer programs. The contents of this book provide an overview of the present state of the art. Therefore it will be very useful for graduate students and researchers in mathematics, computer science and physics.
In recent decades there has been incredible growth in the use of various internet applications by individuals and organizations who store sensitive information online on different servers. This greater reliance of organizations and individuals on internet technologies and applications increases the threat space and poses several challenges for implementing and maintaining cybersecurity practices. Constructing an Ethical Hacking Knowledge Base for Threat Awareness and Prevention provides innovative insights into how an ethical hacking knowledge base can be used for testing and improving the network and system security posture of an organization. It is critical for each individual and institute to learn hacking tools and techniques that are used by dangerous hackers in tandem with forming a team of ethical hacking professionals to test their systems effectively. Highlighting topics including cyber operations, server security, and network statistics, this publication is designed for technical experts, students, academicians, government officials, and industry professionals.
Analysis and Control of Boolean Networks presents a systematic new approach to the investigation of Boolean control networks. The fundamental tool in this approach is a novel matrix product called the semi-tensor product (STP). Using the STP, a logical function can be expressed as a conventional discrete-time linear system. In light of this linear expression, certain major issues concerning Boolean network topology - fixed points, cycles, transient times and basins of attraction - can easily be revealed by a set of formulae. This framework renders the state-space approach to dynamic control systems applicable to Boolean control networks. The bilinear-system representation of a Boolean control network makes it possible to investigate basic control problems, including controllability, observability, stabilization and disturbance decoupling.
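For very small networks, the fixed points and transient times that the STP formulae reveal can also be found by brute-force state enumeration. The two-node update rules below are hypothetical, chosen only for illustration; this sketch does not use the semi-tensor product itself:

```python
from itertools import product

# Toy 2-node Boolean network (made-up update rules):
#   x1' = x1 AND x2,   x2' = x1 OR x2
def step(state):
    x1, x2 = state
    return (x1 and x2, x1 or x2)

states = list(product([False, True], repeat=2))

# Fixed points: states mapped to themselves by one update step
fixed = [s for s in states if step(s) == s]
print("fixed points:", fixed)

def transient(state):
    """Number of steps before the trajectory first revisits a state,
    i.e. the length of the tail before entering a cycle."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return seen.index(state)

for s in states:
    print(s, "-> transient", transient(s))
```

Enumeration scales as 2^n and is only practical for tiny n; the point of the STP machinery is precisely to replace this search with matrix formulae.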
This book considers logical proof systems from the point of view of their space complexity. After an introduction to propositional proof complexity, the author structures the book into three main parts. Part I contains two chapters on resolution, one containing results already known in the literature before this work and one focused on space in resolution; the author then moves on to polynomial calculus and its space complexity, with a focus on the combinatorial technique for proving monomial space lower bounds. The first chapter in Part II addresses the proof complexity and space complexity of the pigeonhole principles. Then there is an interlude on a new type of game, defined on bipartite graphs, essentially independent from the rest of the book, collecting some results on graph theory. Finally, Part III analyzes the size of resolution proofs in connection with the Strong Exponential Time Hypothesis (SETH) in complexity theory. The book is appropriate for researchers in theoretical computer science, in particular computational complexity.
As population increases, the need for energy becomes a crisis of great importance. Technologies for Electrical Power Conversion, Efficiency, and Distribution: Methods and Processes combines unparalleled research, contemporary achievements, and emerging trends within electrical energy conversion technologies and renewable energy sources. The scholarly findings compiled provide a background for discussion of the problems and opportunities of power efficiency and energy conversion in order to develop innovative ways to implement such cutting-edge technologies in the future.
Most networks and databases that humans have to deal with contain a large, albeit finite, number of units. Their structure, which maintains the functional consistency of the components, is essentially not random and calls for a precise quantitative description of the relations between nodes (or data units) and all network components. This book is an introduction, for both graduate students and newcomers to the field, to the theory of graphs and random walks on such graphs. The methods based on random walks and diffusions for exploring the structure of finite connected graphs and databases are reviewed (Markov chain analysis). This provides the necessary basis for consistently discussing a number of applications as diverse as electric resistance networks, estimation of land prices, urban planning, linguistic databases, music, and gene expression regulatory networks.
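A minimal sketch of the random-walk idea: on a connected, non-bipartite undirected graph, the long-run fraction of time a simple random walk spends at a node is proportional to the node's degree. The four-node graph below is illustrative, not from the book:

```python
import random

# Small undirected graph as adjacency lists (illustrative example)
graph = {
    "a": ["b", "c"],
    "b": ["a", "c", "d"],
    "c": ["a", "b"],
    "d": ["b"],
}

def random_walk_visits(graph, start, steps, seed=0):
    """Count node visits along a simple random walk (a Markov chain)."""
    rng = random.Random(seed)
    visits = {node: 0 for node in graph}
    node = start
    for _ in range(steps):
        visits[node] += 1
        node = rng.choice(graph[node])
    return visits

visits = random_walk_visits(graph, "a", 100000)
total = sum(visits.values())
degree_sum = sum(len(nbrs) for nbrs in graph.values())  # equals 2|E|
for node in graph:
    empirical = visits[node] / total
    stationary = len(graph[node]) / degree_sum  # pi(v) = deg(v) / 2|E|
    print(node, round(empirical, 3), "vs", round(stationary, 3))
```

Comparing the empirical visit frequencies with deg(v)/2|E| is the simplest instance of the Markov chain analysis the book builds on.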
Applying Total Quality Management (TQM) to systems engineering can reduce costs while simultaneously improving product quality. This guide to proactive systems engineering shows how to develop and optimize a practical approach, while highlighting the pitfalls and potentials involved.
Prolog Programming at its best! Discover A Book That Tells You What You Should Do and How! Instead of jumping right into the instructions, this book will first provide you with all the necessary concepts that you need to learn in order to make the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you get to the more complex lessons provided in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to your learning. You will also learn the designs and forms of Parallel, and what's more convenient than getting to know both sides! Want to know More? Buy Now!
This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. A number of applications are given to show the effectiveness of risk management technologies. In addition, topics of lectures and practical computer exercises intended for a two-semester course "Risk management technologies" are suggested.
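A minimal sketch of a logical-probabilistic risk calculation: the failure probability of a system whose top event is (A AND B) OR C, with independent basic events. The structure and the probabilities are illustrative, not taken from the book:

```python
# Independent basic-event probabilities (made-up numbers)
p_a, p_b, p_c = 0.1, 0.2, 0.05

# Top event: system fails if (A AND B) OR C
p_and = p_a * p_b                    # both A and B occur (independence)
p_fail = p_and + p_c - p_and * p_c   # inclusion-exclusion for the OR
print(round(p_fail, 4))
```

The same two rules, multiplication for AND of independent events and inclusion-exclusion for OR, extend recursively to larger logical structures, which is the core of LP risk modeling.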
New Edition: Introduction to Computational Earthquake Engineering (3rd Edition). Introduction to Computational Earthquake Engineering covers solid continuum mechanics, the finite element method and stochastic modeling comprehensively, with the second and third chapters explaining the numerical simulation of strong ground motion and faulting, respectively. Stochastic modeling is used for uncertain underground structures, and advanced analytical methods for linear and non-linear stochastic models are presented. The verification of these methods by comparing the simulation results with observed data is then presented, and examples of numerical simulations which apply these methods to practical problems are generously provided. Furthermore, three advanced topics of computational earthquake engineering are covered, detailing examples of applying computational science technology to earthquake engineering problems.
An Expert Guide to Software Performance Optimization From mobile and cloud apps to video games to driverless vehicle control, more and more software is time-constrained: It must deliver reliable results seamlessly, consistently, and virtually instantaneously. If it doesn't, customers are unhappy--and sometimes lives are put at risk. When complex software underperforms or fails, software engineers need to identify and address the root causes. This is difficult and, historically, few tools have been available to help. In Understanding Software Dynamics, performance expert Richard L. Sites tackles the problem head on, offering expert methods and advanced tools for understanding complex, time-constrained software dynamics, improving reliability and troubleshooting challenging performance problems. Sites draws on several decades of experience pioneering software performance optimization, as well as extensive experience teaching graduate-level developers. He introduces principles and techniques for use in any environment, from embedded devices to datacenters, illuminating them with examples based on x86 or ARM processors running Linux and linked by Ethernet. He also guides readers through building and applying a powerful, new, extremely low-overhead open-source software tool, KUtrace, to precisely trace executions on every CPU core. Using insights gleaned from this tool, readers can apply nuanced solutions--not merely brute-force techniques such as turning off caches or cores. 
- Measure and address issues associated with CPUs, memory, disk/SSD, networks, and their interactions
- Fix programs that are always too slow, and those that sometimes lag for no apparent reason
- Design useful observability, logging, and time-stamping capabilities into your code
- Reason more effectively about performance data to see why reality differs from expectations
- Identify problems such as excess execution, slow instruction execution, waiting for resources, and software locks

Understanding Software Dynamics will be valuable to experienced software professionals, including application and OS developers, hardware and system architects, real-time system designers, and game developers, as well as advanced students. Register your book for convenient access to downloads, updates, and/or corrections as they become available. See inside book for details.
This monograph illustrates important notions in security reductions and essential techniques in security reductions for group-based cryptosystems. Using digital signatures and encryption as examples, the authors explain how to program correct security reductions for those cryptographic primitives. Various schemes are selected and re-proven in this book to demonstrate and exemplify correct security reductions. This book is suitable for researchers and graduate students engaged with public-key cryptography.
The main objective of pervasive computing systems is to create environments where computers become invisible by being seamlessly integrated and connected into our everyday environment, where such embedded computers can then provide information and exercise intelligent control when needed, but without being obtrusive. Pervasive computing and intelligent multimedia technologies are becoming increasingly important to the modern way of living. However, many of their potential applications have not yet been fully realized. Intelligent multimedia allows dynamic selection, composition and presentation of the most appropriate multimedia content based on user preferences. A variety of applications of pervasive computing and intelligent multimedia are being developed for all walks of personal and business life. Pervasive computing (often synonymously called ubiquitous computing, palpable computing or ambient intelligence) is an emerging field of research that brings in revolutionary paradigms for computing models in the 21st century. Pervasive computing is the trend towards increasingly ubiquitous connected computing devices in the environment, a trend being brought about by a convergence of advanced electronic - and particularly, wireless - technologies and the Internet. Recent advances in pervasive computers, networks, telecommunications and information technology, along with the proliferation of multimedia mobile devices - such as laptops, iPods, personal digital assistants (PDAs) and cellular telephones - have further stimulated the development of intelligent pervasive multimedia applications. These key technologies are creating a multimedia revolution that will have significant impact across a wide spectrum of consumer, business, healthcare and governmental domains.
This book provides a general overview of multiple instance learning (MIL), defining the framework and covering the central paradigms. The authors discuss the most important algorithms for MIL such as classification, regression and clustering. With a focus on classification, a taxonomy is set and the most relevant proposals are specified. Efficient algorithms are developed to discover relevant information when working with uncertainty. Key representative applications are included. This book carries out a study of the key related fields of distance metrics and alternative hypotheses. Chapters examine new and developing aspects of MIL such as data reduction for multi-instance problems and imbalanced MIL data. Class imbalance for multi-instance problems is defined at the bag level, a representation that involves ambiguity because bag labels are available but the labels of the individual instances are not. Additionally, multiple instance multiple label learning is explored. This learning framework introduces flexibility and ambiguity in the object representation, providing a natural formulation for representing complicated objects. Thus, an object is represented by a bag of instances and is allowed to have multiple associated class labels simultaneously. This book is suitable for developers and engineers working to apply MIL techniques to solve a variety of real-world problems. It is also useful for researchers or students seeking a thorough overview of MIL literature, methods, and tools.
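The bag-level ambiguity described above is easiest to see under the standard multiple-instance assumption: a bag is positive iff at least one of its instances is positive. The threshold "classifier" below is a hypothetical stand-in, used only to make the bag rule concrete:

```python
def classify_instance(x, threshold=0.5):
    """Hypothetical instance-level rule: positive if the feature exceeds
    a threshold (a stand-in for a learned classifier)."""
    return x > threshold

def classify_bag(bag, threshold=0.5):
    # Standard MI assumption: the bag label is the disjunction of the
    # (unobserved) instance labels
    return any(classify_instance(x, threshold) for x in bag)

bags = {
    "bag1": [0.1, 0.2, 0.9],  # one positive instance -> positive bag
    "bag2": [0.1, 0.3, 0.4],  # no positive instance  -> negative bag
}
for name, bag in bags.items():
    print(name, classify_bag(bag))
```

Note that only the bag labels are observable during training; which instance in bag1 is responsible for the positive label is exactly the ambiguity MIL algorithms must cope with.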