Privacy, Security and Trust within the Context of Pervasive Computing is an edited volume based on a workshop held at the Second International Conference on Pervasive Computing, April 18-23, 2004, in Vienna, Austria. The goal of the workshop was not to focus on specific, even novel mechanisms, but rather on the interfaces between mechanisms in different technical and social problem spaces. An investigation of the interfaces between the notions of context, privacy, security, and trust will result in a deeper understanding of the "atomic" problems, leading to a more complete understanding of the social and technical issues in pervasive computing.
This book encapsulates some work done in the DIRC project concerned with trust and responsibility in socio-technical systems. It brings together a range of disciplinary approaches - computer science, sociology and software engineering - to produce a socio-technical systems perspective on the issues surrounding trust in technology in complex settings. Computer systems can only bring about their purported benefits if functionality, users and usability are central to their design and deployment. Thus, technology can only be trusted in situ and in everyday use if these issues have been brought to bear on the process of technology design, implementation and use. The studies detailed in this book analyse the ways in which trust in technology is achieved and/or worked around in everyday situations in a range of settings - including hospitals, a steelworks, a public enquiry, the financial services sector and air traffic control.
While a typical project manager's responsibility and accountability are limited to a project with a clear start and end date, IT managers are responsible for an ongoing, ever-changing process to which they must adapt and evolve in order to stay current, dependable, and secure in their field. Professional Advancements and Management Trends in the IT Sector offers the latest managerial trends within the field of information technology management. By collecting research from experts around the world, across a variety of sectors and levels of technical expertise, this volume offers a broad range of case studies, best practices, methodologies, and research within the field of information technology management. It will serve as a vital resource for practitioners and academics alike.
This volume covers a variety of topics in the field of research in strategic management and information technology. These topics include organizational fit and flexibility and the determinants of business unit reliance on information technologies.
Digital Timing Macromodeling for VLSI Design Verification begins with an extensive history of the development of simulation techniques, presenting a detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation, as well as mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. These are major sources of error in existing macromodeling techniques, and they must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions, much wider than has been presented for previous macromodels, demonstrating their broad applicability.
The papers in this volume comprise the refereed proceedings of the First International Conference on Computer and Computing Technologies in Agriculture (CCTA 2007), held in Wuyishan, China, in 2007. The conference was organized by China Agricultural University, the Chinese Society of Agricultural Engineering and the Beijing Society for Information Technology in Agriculture. Its purpose is to facilitate communication and cooperation between institutions and researchers on theories, methods and implementations of computer science and information technology. By researching information technology development and resource integration in rural areas of China, an innovative and effective approach is expected to be explored to promote the application of technology to the development of modern agriculture and to contribute to the construction of the new countryside. The rapid development of information technology has induced substantial changes in, and had a substantial impact on, the development of China's rural areas. Western thought has exerted great influence on studies of Chinese information technology development, helping more Chinese and Western scholars to expand their studies in this academic and application area. Thus, this conference, with works by many prominent scholars, has covered computer science and technology and information development in China's rural areas, and has probed into all the important issues and the newest research topics, such as Agricultural Decision Support Systems and Expert Systems, GIS, GPS, RS and Precision Farming, ICT Applications in Rural Areas, Agricultural System Simulation, Evolutionary Computing, etc.
Geocomputation may be viewed as the application of a computational science paradigm to study a wide range of problems in geographical systems contexts. This volume presents a clear, comprehensive and thoroughly state-of-the-art overview of current research, written by leading figures in the field. It provides important insights into this new and rapidly developing field, attempts to establish principles and develop techniques for solving real-world problems in a wide array of application domains, and acts as a catalyst for a greater understanding of what geocomputation is and what it entails. The broad coverage makes it invaluable reading for researchers and professionals in geography, environmental and economic sciences, as well as for graduate students of spatial science and computer science.
Collaborative research in bioinformatics and systems biology is a key element of modern biology and health research. This book highlights and provides access to many of the methods, environments, results and resources involved, including integral laboratory data generation and experimentation and clinical activities. Collaborative projects embody a research paradigm that connects many of the top scientists, institutions, their resources and research worldwide, resulting in first-class contributions to bioinformatics and systems biology. Central themes include describing processes and results in collaborative research projects using computational biology and providing a guide for researchers to access them. The book is also a practical guide on how science is managed. It shows how collaborative researchers are putting results together in a way accessible to the entire biomedical community.
This self-contained book systematically explores the statistical dynamics on and of complex networks, a subject with relevance across a large number of scientific disciplines. Theories related to complex networks are increasingly used by researchers for their usefulness in tackling the most difficult problems of a particular discipline. The book is a collection of surveys and cutting-edge research contributions exploring the interdisciplinary relationship of dynamics on and of complex networks. Topics covered include complex networks found in nature (genetic pathways, ecological networks, linguistic systems, and social systems) as well as man-made systems such as the World Wide Web and peer-to-peer networks. The contributed chapters in this volume are intended to promote cross-fertilization in several research areas, and will be valuable to newcomers in the field, experienced researchers, practitioners, and graduate students interested in systems exhibiting an underlying complex network structure in disciplines such as computer science, biology, statistical physics, nonlinear dynamics, linguistics, and the social sciences.
Computer-Aided Verification is a collection of papers that begins with a general survey of hardware verification methods. Ms. Gupta starts with the issue of verification itself and develops a taxonomy of verification methodologies, focusing especially on recent advances. Although her emphasis is hardware verification, most of what she reports applies to software verification as well. Graphical presentation is coming to be a de facto requirement for a 'friendly' user interface. The second paper presents a generic format for graphical presentations of coordinating systems represented by automata. The last two papers, as a pair, present a variety of generic techniques for reducing the computational cost of computer-aided verification based upon explicit computational memory: the first gives a time-space trade-off, while the second gives a technique which trades space for a (sometimes predictable) probability of error. Computer-Aided Verification is an edited volume of original research. This research has also been published as a special issue of the journal Formal Methods in System Design, 1:2-3.
This volume examines the application of swarm intelligence in data mining, addressing the issues of swarm intelligence and data mining using novel intelligent approaches. The book comprises 11 chapters including an introduction reviewing fundamental definitions and important research challenges. Important features include a detailed overview of swarm intelligence and data mining paradigms, focused coverage of timely, advanced data mining topics, state-of-the-art theoretical research and application developments and contributions by pioneers in the field.
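To give a flavour of the swarm-intelligence paradigm the volume surveys, the sketch below is a minimal particle swarm optimisation loop in Python. It is a generic textbook illustration, not code from the book; the objective `sphere` and all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

def sphere(x):
    # Toy objective (an assumption for illustration): minimum 0 at the origin.
    return np.sum(x ** 2)

def pso(objective, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimisation; parameters are illustrative defaults."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
    vel = np.zeros((n_particles, dim))             # particle velocities
    pbest = pos.copy()                             # per-particle best positions
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()     # swarm-wide best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull (own best) + social pull (swarm best).
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_f = pso(sphere)
print(best_x, best_f)  # converges toward the origin
```

In data mining settings, the hand-rolled `sphere` objective would typically be replaced by a clustering or feature-selection quality measure evaluated over the data.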
This book presents the most recent advances in fuzzy clustering techniques and their applications. The contents include: Introduction to Fuzzy Clustering; Fuzzy Clustering based Principal Component Analysis; Fuzzy Clustering based Regression Analysis; Kernel based Fuzzy Clustering; Evaluation of Fuzzy Clustering; and Self-Organized Fuzzy Clustering. The book is directed to computer scientists, engineers, professors and students of engineering, science, computer science, business, management, avionics and related disciplines.
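For orientation, the sketch below shows the classic fuzzy c-means update loop, the standard algorithm underlying much of this area. It is not code from the book; the sample data, cluster count `c` and fuzzifier `m` are illustrative assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Classic fuzzy c-means: soft memberships instead of hard assignments.

    c and m are illustrative defaults (assumptions, not values from the book).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted means
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-10)            # avoid division by zero
        # Membership update: inversely related to relative distance to each center.
        inv = dist ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Two synthetic blobs (assumed demo data).
X = np.vstack([np.random.randn(50, 2) + [0, 0],
               np.random.randn(50, 2) + [5, 5]])
centers, U = fuzzy_c_means(X, c=2)
print(centers)   # two centers, near (0, 0) and (5, 5)
```

Unlike hard k-means, each point retains a graded membership in every cluster, which is the core idea the book's chapters build on.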
For organizations, it's imperative to have the ability to analyze data sources, harmonize disparate data elements, and communicate the results of the analysis effectively to stakeholders. Created by certified enterprise data architect Jeff Voivoda, this simple guide to data analysis and harmonization begins by identifying the problems caused by inefficient data storage. It moves through the life cycle of identifying, gathering, recording, harmonizing and presenting data so that it is organized and comprehensible. Other key areas covered include: seeking out the right experts; reviewing data standards and considerations; grouping and managing data; understanding the practical applications of data analysis; and suggesting next steps in the development life cycle. It's essential to understand data requirements, management tools, and industry-wide standards if you want your organization to succeed or improve on its already strong position. Determine your next strategic step and manage your data as an asset with "Data Analysis and Harmonization."
The papers gathered in this book were published over a period of more than twenty years in widely scattered journals. They led to the discovery of randomness in arithmetic which was presented in the recently published monograph on "Algorithmic Information Theory" by the author. There the strongest possible version of Goedel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs, was discussed. The present book is intended as a companion volume to the monograph and it will serve as a stimulus for work on complexity, randomness and unpredictability, in physics and biology as well as in metamathematics.
Silicon-On-Insulator (SOI) CMOS technology has been regarded as another major technology for VLSI in addition to bulk CMOS technology. Owing to the buried oxide structure, SOI technology offers superior CMOS devices with higher speed, high density, and reduced second order effects for deep-submicron low-voltage, low-power VLSI circuits applications. In addition to VLSI applications, and because of its outstanding properties, SOI technology has been used to realize communication circuits, microwave devices, BICMOS devices, and even fiber optics applications. CMOS VLSI Engineering: Silicon-On-Insulator addresses three key factors in engineering SOI CMOS VLSI - processing technology, device modelling, and circuit designs are all covered with their mutual interactions. Starting from the SOI CMOS processing technology and the SOI CMOS digital and analog circuits, behaviors of the SOI CMOS devices are presented, followed by a CAD program, ST-SPICE, which incorporates models for deep-submicron fully-depleted mesa-isolated SOI CMOS devices and special purpose SOI devices including polysilicon TFTs. CMOS VLSI Engineering: Silicon-On-Insulator is written for undergraduate senior students and first-year graduate students interested in CMOS VLSI. It will also be suitable for electrical engineering professionals interested in microelectronics.
As miniaturisation deepens, and nanotechnology and its machines become more prevalent in the real world, the need to consider using quantum mechanical concepts to perform various tasks in computation increases. Such tasks include: the teleporting of information, breaking heretofore "unbreakable" codes, communicating with messages that betray eavesdropping, and the generation of random numbers. This is the first book to apply quantum physics to the basic operations of a computer, representing the ideal vehicle for explaining the complexities of quantum mechanics to students, researchers and computer engineers alike as they prepare to design and create the computing and information delivery systems of the future. Both authors have solid backgrounds in the subject matter at the theoretical and the more practical level. While serving as a text for senior/graduate level students in computer science, physics and engineering, this book has its primary use as an up-to-date reference work in the emerging interdisciplinary field of quantum computing, the only prerequisites being knowledge of calculus and familiarity with the concept of the Turing machine.
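As a taste of one of the tasks listed above, the sketch below classically simulates quantum random-number generation by "measuring" qubits prepared in an equal superposition. It is a plain numpy simulation written for illustration, not code from the book and not a true quantum device.

```python
import numpy as np

def quantum_random_bits(n, seed=None):
    """Classically simulate measuring n qubits in the state (|0> + |1>)/sqrt(2).

    Each measurement yields 0 or 1 with probability |amplitude|^2 = 1/2, which is
    the textbook quantum random-number scheme. A real device would draw its
    randomness from the measurement itself; here a pseudo-RNG stands in for it.
    """
    rng = np.random.default_rng(seed)
    amp = np.array([1.0, 1.0]) / np.sqrt(2)   # Hadamard applied to |0>
    probs = np.abs(amp) ** 2                  # Born rule: [0.5, 0.5]
    return rng.choice([0, 1], size=n, p=probs)

print(quantum_random_bits(16))   # e.g. [0 1 1 0 ...]
```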
The 20th anniversary of the IFIP WG 6.1 Joint International Conference on Formal Methods for Distributed Systems and Communication Protocols (FORTE XIII / PSTV XX) was celebrated by the year 2000 edition of the Conference, which was held for the first time in Italy, at Pisa, October 10-13, 2000. In devising the subtitle for this special edition, 'Formal Methods Implementation Under Test', we wanted to convey two main concepts that, in our opinion, are reflected in the contents of this book. First, the early, pioneering phases in the development of Formal Methods (FMs), with their conflicts between evangelistic and agnostic attitudes, with their over-optimistic applications to toy examples and over-skeptical views about scalability to industrial cases, with their misconceptions and myths..., all this is essentially over. Many FMs have successfully reached their maturity, having been 'implemented' into concrete development practice: a number of papers in this book report successful experiences in specifying and verifying real distributed systems and protocols. Second, one of the several myths about FMs - the fact that their adoption would eventually eliminate the need for testing - is still quite far from becoming a reality, and, again, this book indicates that testing theory and applications are still remarkably healthy. A total of 63 papers were submitted to FORTE/PSTV 2000, out of which the Programme Committee selected 22 for presentation at the Conference and inclusion in the Proceedings.
In many countries, small businesses comprise over 95% of private businesses and employ approximately half of the private workforce, with information technology being used in more than 90% of these businesses. As a result, governments worldwide are placing increasing importance on the success of small business entrepreneurs and are providing increased resources to support this emphasis. Managing Information Technology in Small Business: Challenges and Solutions presents research in areas such as IT performance, electronic commerce, internet adoption, and IT planning methodologies, and focuses on how these areas impact small businesses.
"Intelligent Data Mining Techniques and Applications" is an organized edited collection of contributed chapters covering basic knowledge for intelligent systems and data mining, applications in economic and management, industrial engineering and other related industrial applications. The main objective of this book is to gather a number of peer-reviewed high quality contributions in the relevant topic areas. The focus is especially on those chapters that provide theoretical/analytical solutions to the problems of real interest in intelligent techniques possibly combined with other traditional tools, for data mining and the corresponding applications to engineers and managers of different industrial sectors. Academic and applied researchers and research students working on data mining can also directly benefit from this book.
Modern electronics is driven by the explosive growth of digital communications and multi-media technology. A basic challenge is to design first-time-right complex digital systems that meet stringent constraints on performance and power dissipation. In order to combine this growing system complexity with an increasingly short time-to-market, new system design technologies are emerging based on the paradigm of embedded programmable processors. This concept introduces modularity, flexibility and re-use into the electronic system design process. However, its success will critically depend on the availability of efficient and reliable CAD tools to design, programme and verify the functionality of embedded processors. Recently, new research efforts have emerged on the edge between software compilation and hardware synthesis, to develop high-quality code generation tools for embedded processors. Code Generation for Embedded Systems provides a survey of these new developments. Although not limited to these targets, the main emphasis is on code generation for modern DSP processors. Important themes covered by the book include: the scope of general-purpose versus application-specific processors, machine code quality for embedded applications, retargetability of the code generation process, machine description formalisms, and code generation methodologies. Code Generation for Embedded Systems is the essential introduction to this fast-developing field of research for students, researchers, and practitioners alike.
The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters. 1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as being symmetric, that make it unsuitable for some applications. The measure reliability is found to be a universal information measure. 2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra. 3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. Etc. The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is a must when the information carries knowledge, or is a base for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
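To make the symmetry remark concrete, the sketch below computes Shannon mutual information from a small joint distribution and shows that it is unchanged when the two variables are swapped. This is a generic information-theory illustration, not the author's code, and the toy probability table is an assumption chosen for demonstration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (zero terms dropped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), from the marginals and the joint.
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Toy joint distribution P(X, Y); rows index X, columns index Y (assumed values).
joint = np.array([[0.3, 0.1],
                  [0.1, 0.5]])

print(mutual_information(joint))    # I(X;Y)
print(mutual_information(joint.T))  # I(Y;X): identical, since the entropy-based
                                    # measure is symmetric in its two arguments
```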
The more complex instructional design (ID) projects grow, the more a design language can support the success of the projects, and the continuing integration of technologies in education makes this issue even more relevant. The Handbook of Visual Languages for Instructional Design: Theories and Practice serves as a practical guide for the integration of ID languages and notation systems into the practice of ID by presenting recent languages and notation systems for ID, exploring the connection between the use of ID languages and the integration of technologies in education, and assessing the benefits and drawbacks of the use of ID languages in specific project settings.
Collaboration is a form of electronic communication in which individuals work on the same documents or processes over a period of time. When applied to technologies development, collaboration often has a focus on user-centered design and rapid prototyping, with a strong people-orientation. "Collaborative Technologies and Applications for Interactive Information Design: Emerging Trends in User Experiences" covers a wide range of emerging topics in collaboration, Web 2.0, and social computing, with a focus on technologies that impact the user experience. This cutting-edge source provides the latest international findings useful to practitioners, researchers, and academicians involved in education, ontologies, open source communities, and trusted networks.
You may like...
Practical Guide to Usability Testing - Joseph S. Dumas, Janice C. Redish (Paperback) R984, Discovery Miles 9 840
Discovering Computers, Essentials… - Susan Sebok, Jennifer Campbell, … (Paperback)
Dynamic Web Application Development… - David Parsons, Simon Stobart (Paperback)
Computer-Graphic Facial Reconstruction - John G. Clement, Murray K. Marks (Hardcover) R2,327, Discovery Miles 23 270
Infinite Words, Volume 141 - Automata… - Dominique Perrin, Jean-Eric Pin (Hardcover) R4,065, Discovery Miles 40 650
Discovering Computers 2018 - Digital… - Misty Vermaat, Steven Freund, … (Paperback)