The world of the 21st century is, more than ever, global and impersonal. Criminal and terrorist threats, both physical and on the Internet, increase by the day. The demand for better methods of identification and access control is growing, not only in companies and organisations but also in the world at large. At the same time, such security measures have to be balanced with means for protecting the privacy of users. Identity management is under pressure due to the growing number of fraudsters who want to hide their true identity. This challenges the information security research community to focus on interdisciplinary and holistic approaches while retaining the benefits of previous research efforts. In this context, the IFIP Working Group 11.6 on Identity Management was founded in August 2006. The intention of the Working Group is to offer a broad forum for the exchange of knowledge and for the tracking and discussion of issues and new developments, taking an interdisciplinary approach. Scientists as well as practitioners from government and business who are involved in the field of identity management are welcome to participate. The IDMAN 2007 Conference on Policies and Research in Identity Management was the very first conference organized by this Working Group, which aims to organize conferences biennially. The IDMAN 2007 Conference was centered around the theme of National Identity Management or, in other words, identity management in the public sector.
As the business paradigm shifts from a desktop-centric environment to a data-centric mobile environment, mobile services provide numerous new business opportunities and, in some cases, challenge some of the basic premises of existing business models. Strategy, Adoption, and Competitive Advantage of Mobile Services in the Global Economy seeks to foster a scientific understanding of mobile services, provide a timely publication of current research efforts, and forecast future trends in the mobile services industry. This book is an ideal resource for academics, researchers and government policymakers, as well as for corporate managers looking to enhance their competitive edge in, or understanding of, mobile services.
Data management is the process of planning, coordinating and controlling data resources. Increasingly, applications need to store and search large amounts of data. Managing data has been continuously challenged by demands from various areas and applications and has evolved in parallel with advances in hardware and computing techniques. This volume focuses on its recent advances and is composed of five parts and a total of eighteen chapters. The first part of the book contains five contributions in the area of information retrieval and Web intelligence: a novel approach to solving the index selection problem, integrated retrieval from the Web of documents and data, bipolarity in database querying, deriving data summarization through ontologies, and granular computing for Web intelligence. The second part of the book contains four contributions in the knowledge discovery area. Its third part contains three contributions in the information integration and data security area. The remaining two parts of the book contain six contributions in the area of intelligent agents and applications of data management in the medical domain.
Intelligent information and database systems are two closely related and well-established subfields of modern computer science. They focus on the integration of artificial intelligence and classic database technologies in order to create the class of next generation information systems. The major target of this new generation of systems is to provide end-users with intelligent behavior: simple and/or advanced learning, problem solving, uncertain and certain reasoning, self-organization, cooperation, etc. Such intelligent abilities are implemented in classic information systems to make them autonomous and user oriented, in particular when advanced problems of multimedia information and knowledge discovery, access, retrieval and manipulation are to be solved in the context of large, distributed and heterogeneous environments. It means that intelligent knowledge-based information and database systems are used to solve basic problems of large collections management, carry out knowledge discovery from large data collections, reason about information under uncertain conditions, support users in their formulation of complex queries, etc. Topics discussed in this volume include but are not limited to the foundations and principles of data, information, and knowledge models, and methodologies for intelligent information and database systems analysis, design, implementation, validation, maintenance and evolution.
The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool for putting numbers on, i.e. quantifying, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology for verifying the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.
This textbook provides an accessible introduction to the most important features of Fortran 2008. Features: presents a complete discussion of all the basic features needed to write complete Fortran programs; makes extensive use of examples and case studies to illustrate the practical use of features of Fortran 2008, and supplies simple problems for the reader; provides a detailed exploration of control constructs, modules, procedures, arrays, character strings, data structures and derived types, pointer variables, and object-oriented programming; includes coverage of such major new features in Fortran 2008 as coarrays, submodules, parameterized derived types, and derived-type input and output; highlights the topic of modules as the framework for organizing data and procedures for a Fortran program; investigates the excellent input/output facilities available in Fortran; contains appendices listing the many intrinsic procedures and providing a brief informal syntax specification for the language.
Managing Complexity is the first book that clearly defines the concept of Complexity, explains how Complexity can be measured and tuned, and describes the seven key features of Complex Systems: 1. Connectivity 2. Autonomy 3. Emergence 4. Nonequilibrium 5. Non-linearity 6. Self-organisation 7. Co-evolution. The thesis of the book is that the complexity of the environment in which we work and live offers new opportunities and that the best strategy for surviving and prospering under conditions of complexity is to develop adaptability to perpetually changing conditions. An effective method for designing adaptability into business processes using multi-agent technology is presented and illustrated by several extensive examples, including adaptive, real-time scheduling of taxis, sea-going tankers, road transport, supply chains, railway trains, production processes and swarms of small space satellites. Additional case studies include adaptive servicing of the International Space Station; adaptive processing of design changes of large structures such as the wings of the largest airliner in the world; and dynamic data mining, knowledge discovery and distributed semantic processing. Finally, the book provides a foretaste of the next generation of complex issues, notably the Internet of Things, Smart Cities, Digital Enterprises and Smart Logistics.
This book explores the two major elements of Hintikka's model of inquiry: its underlying game-theoretical motivations and the central role of questioning. The chapters build on the Hintikkan tradition, extending Hintikka's model, and present a wide variety of approaches to the philosophy of inquiry from different directions, ranging from erotetic logic to Lakatosian philosophy, from socio-epistemological approaches to strategic reasoning and mathematical practice. Hintikka's theory of inquiry is a well-known example of a dynamic epistemic procedure. In an interrogative inquiry, the inquirer is given a theory and a question. He then tries to answer the question on the basis of the theory by posing questions to nature or an oracle. The initial formulation of this procedure by Hintikka is rather broad and informal. This volume introduces carefully selected responses to the issues discussed by Hintikka. The articles in the volume were contributed by various authors associated with a research project on Hintikka's interrogative theory of inquiry conducted at the Institut d'Histoire et de Philosophie des Sciences et des Techniques (IHPST) in Paris, including those who visited to share their insight.
Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting potentials can be identified. The book presents the relevant theoretical background and measuring methods as well as proposed solutions. An evaluation of network monitors and checklists rounds out the work.
The information technology explosion in our global society is creating tremendous challenges and opportunities for educators as we help shape the next generation of information pioneers. But in these times of severe budget cuts, our challenges become even greater, and the necessity for success more critical. Current Issues in IT Education addresses the ongoing quest for teaching excellence in the midst of implementing teaching technologies and crossing disciplinary boundaries.
This book focuses on recent research in modern optimization and its implications in control and data analysis. It is a collection of papers from the conference "Optimization and Its Applications in Control and Data Science", dedicated to Professor Boris T. Polyak, which was held in Moscow, Russia on May 13-15, 2015. The book reflects developments in theory and applications rooted in Professor Polyak's fundamental contributions to constrained and unconstrained optimization, differentiable and nonsmooth functions, control theory and approximation. Each paper focuses on techniques for solving complex optimization problems in different application areas and on recent developments in optimization theory and methods. Open problems in optimization, game theory and control theory are included in this collection, which will interest engineers and researchers working with efficient algorithms and software for solving optimization problems in market and data analysis. Theoreticians in operations research, applied mathematics, algorithm design, artificial intelligence, machine learning, and software engineering will find this book useful, and graduate students will find the state-of-the-art research valuable.
The focus of these conference proceedings is on research, development, and applications in the fields of numerical geometry, scientific computing and numerical simulation, particularly in mesh generation and related problems. In addition, this year's special focus is on Voronoi diagrams and their applications, celebrating the 150th birthday of G.F. Voronoi. In terms of content, the book strikes a balance between engineering algorithms and mathematical foundations. It presents an overview of recent advances in numerical geometry, grid generation and adaptation in terms of mathematical foundations, algorithm and software development and applications. The specific topics covered include: quasi-conformal and quasi-isometric mappings, hyperelastic deformations, multidimensional generalisations of the equidistribution principle, discrete differential geometry, spatial and metric encodings, Voronoi-Delaunay theory for tilings and partitions, duality in mathematical programming and numerical geometry, mesh-based optimisation and optimal control methods. Further aspects examined include iterative solvers for variational problems and algorithm and software development. The applications of the methods discussed are multidisciplinary and include problems from mathematics, physics, biology, chemistry, material science, and engineering.
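The Voronoi-Delaunay theory highlighted in this year's special focus rests on a simple idea: a Voronoi diagram partitions space so that every point belongs to the cell of its nearest site. A minimal brute-force sketch of that nearest-site rule in Python (not taken from the proceedings; the function name and the three example sites are illustrative assumptions — production mesh generators instead exploit Voronoi-Delaunay duality and compute cell boundaries directly):

```python
# Nearest-site classification, the defining rule of a Voronoi diagram:
# a query point p lies in the Voronoi cell of the site closest to it.
def nearest_site(p, sites):
    # Squared Euclidean distance is enough for comparison (no sqrt needed).
    return min(range(len(sites)),
               key=lambda i: (p[0] - sites[i][0]) ** 2 + (p[1] - sites[i][1]) ** 2)

# Three hypothetical sites forming a triangle.
sites = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

# A point near the first site falls in that site's Voronoi cell.
cell = nearest_site((0.1, 0.1), sites)  # -> 0
```

Labelling a fine grid of query points this way yields a discrete picture of the diagram; the duality mentioned in the blurb arises because connecting sites whose cells share a boundary produces the Delaunay triangulation.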
The computer is the great technological and scientific innovation of the second half of the twentieth century. It has revolutionized how we organize information, how we communicate with each other, and even the way that we think about the human mind. Computers have eased the drudgery of such tasks as calculating sums and clerical work, making them both more bearable and more efficient. The computer has become ubiquitous in many aspects of business, recreation, and everyday life, and computers are becoming both more powerful and easier to use. Computers: The Life Story of a Technology provides an accessible overview of this ever-changing technology, giving students and lay readers an understanding of the complete scope of its history from ancient times to the present day. In addition to providing a concise biography of how this technology developed, this book provides insights into how the computer has changed our lives: * Demonstrates how, just as the invention of the steam engine in the 1700s stimulated scientists to think of the laws of nature in terms of machines, the success of the computer in the late 1900s prompted scientists to think of the basic laws of the universe as being similar to the operation of a computer. * Provides a worldwide examination of computing, and of how needs such as security and defense during the Cold War drove the development of computing technology. * Shows how the computer has entered almost every aspect of daily life in the 21st century. The volume includes a glossary of terms, a timeline of important events, and a selected bibliography of useful resources for further information.
This is volume 73 of "Advances in Computers." This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. In this current volume, subtitled "Emerging Technologies," we discuss several new advances in computer software generation as well as describe new applications of those computers.
At a time when Internet use is closely tracked and social networking sites supply data for targeted advertising, Lars Heide presents the first academic study of the invention that fueled today's information revolution: the punched card. Early punched cards helped to process the United States census in 1890. They soon proved useful in calculating invoices and issuing pay slips. As demand for more sophisticated systems and reading machines increased in both the United States and Europe, punched cards served ever-larger data-processing purposes. Insurance companies, public utilities, businesses, and governments all used them to keep detailed records of their customers, competitors, employees, citizens, and enemies. The United States used punched-card registers in the late 1930s to pay roughly 21 million Americans their Social Security pensions, Vichy France used similar technologies in an attempt to mobilize an army against the occupying German forces, and the Germans in 1941 developed several punched-card registers to make the war effort--and surveillance of minorities--more effective. Heide's analysis of these three major punched-card systems, as well as the impact of the invention on Great Britain, illustrates how different cultures collected personal and financial data and how they adapted to new technologies. This comparative study will interest students and scholars from a wide range of disciplines, including the history of technology, computer science, business history, and management and organizational studies.
The World Wide Web exploded into public consciousness in 1995, a year which saw the coming of age of the Internet. People are communicating, working, shopping, learning, and entertaining themselves, as well as satisfying carnal desires and even finding God through the simple act of connecting their computers to the wide universe of cyberspace. We are assured, at the same time, that this progress will have profound effects on work, culture, leisure--everything, including the ways in which we interact with each other. Yet just what these effects will be, how power will be distributed, and what recourse will be available to those adversely affected by the new technologies, are issues that have yet to be negotiated. Aside from the occasional panic over cyber-porn, few have considered the wide-ranging effects of our increasing reliance on interactive technologies. "Cyberfutures" offers a close examination of issues that will become increasingly important as computers, networks, and technologies occupy crucial roles in our everyday lives. Comprised of essays from a range of occupational and disciplinary perspectives, including those of Vivian Sobchack and Arturo Escobar, this volume makes essential reading for students in cultural and media studies, anthropology, as well as for citizens interested in considering the larger implications of the Information Superhighway.
Competitive intelligence uses public sources to obtain valuable information on competition and competitors. In an open society such as our own, businesses place a great deal of information in the public domain. By using competitive intelligence aggressively and intelligently, corporations can obtain information on potential acquisition targets, markets, key personnel, the probable emergence of new products, or the financial strength or contracts of a competing firm. In fact, the authors contend that as much as 90 percent of the information required to decide on a course of litigation, acquisitions, expansion, new product introduction, or financing is available through competitive intelligence. "An absolutely indispensable playbook for anyone who has to compete during the information explosion." (Martin Sikora, Editor, Mergers and Acquisitions)
This book presents advances in alternative swarm development that have proved to be effective in several complex problems. Swarm intelligence (SI) is a problem-solving methodology that results from the cooperation between a set of agents with similar characteristics. The study of biological entities, such as animals and insects, manifesting social behavior has resulted in several computational models of swarm intelligence. While there are numerous books addressing the most widely known swarm methods, namely ant colony algorithms and particle swarm optimization, those discussing new alternative approaches are rare. The focus on developments based on the simple modification of popular swarm methods overlooks the opportunity to discover new techniques and procedures that can be useful in solving problems formulated by the academic and industrial communities. Presenting various novel swarm methods and their practical applications, the book helps researchers, lecturers, engineers and practitioners solve their own optimization problems.
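Of the two widely known swarm methods the blurb names, particle swarm optimization illustrates the core idea of agents cooperating through shared information. A minimal Python sketch, not from the book (the `pso` helper, its parameter values and the sphere-function demo are assumptions for illustration; the book's own contribution is alternative swarm methods, not this classic one):

```python
import random

def pso(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Classic particle swarm optimization: minimise f over a box."""
    lo, hi = bounds
    # Random initial positions; zero initial velocities.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best

    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends momentum, attraction to the particle's own
                # best, and attraction to the swarm's best (the cooperation).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(42)  # reproducible demo run
# Minimise the 2-D sphere function; the optimum is at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

The "simple modification of popular swarm methods" the authors criticise typically amounts to tweaking the `w`, `c1`, `c2` update above, which is why the book argues for genuinely new swarm models instead.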
From the reviews of the 1st edition: "This book provides a comprehensive and detailed account of different topics in algorithmic 3-dimensional topology, culminating with the recognition procedure for Haken manifolds and including the up-to-date results in computer enumeration of 3-manifolds. Originating from lecture notes of various courses given by the author over a decade, the book is intended to combine the pedagogical approach of a graduate textbook (without exercises) with the completeness and reliability of a research monograph... All the material, with few exceptions, is presented from the peculiar point of view of special polyhedra and special spines of 3-manifolds. This choice contributes to keeping the level of the exposition really elementary. In conclusion, the reviewer subscribes to the quotation from the back cover: 'the book fills a gap in the existing literature and will become a standard reference for algorithmic 3-dimensional topology both for graduate students and researchers.'" (Zentralblatt für Mathematik, 2004) For this 2nd edition, new results, new proofs, and commentaries for a better orientation of the reader have been added. In particular, in Chapter 7 several new sections concerning applications of the computer program "3-Manifold Recognizer" have been included.
A formal method is not the main engine of a development process; its contribution is to improve system dependability by motivating formalisation where useful. This book summarizes the results of the DEPLOY research project on engineering methods for dependable systems through the industrial deployment of formal methods in software development. The applications considered were in automotive, aerospace, railway, and enterprise information systems, and microprocessor design. The project introduced a formal method, Event-B, into several industrial organisations and built on the lessons learned to provide an ecosystem of better tools, documentation and support to help others select and introduce rigorous systems engineering methods. The contributing authors report on these projects and the lessons learned. For the academic and research partners and the tool vendors, the project identified improvements required in the methods and supporting tools, while the industrial partners learned about the value of formal methods in general. A particular feature of the book is the frank assessment of the managerial and organisational challenges, the weaknesses in some current methods and supporting tools, and the ways in which they can be successfully overcome. The book will be of value to academic researchers, systems and software engineers developing critical systems, industrial managers, policymakers, and regulators.