Privacy, Security and Trust within the Context of Pervasive Computing is an edited volume based on a workshop held at the Second International Conference on Pervasive Computing. The workshop took place April 18-23, 2004, in Vienna, Austria. The goal of the workshop was not to focus on specific, even novel mechanisms, but rather on the interfaces between mechanisms in different technical and social problem spaces. An investigation of the interfaces between the notions of context, privacy, security, and trust will result in a deeper understanding of the "atomic" problems, leading to a more complete understanding of the social and technical issues in pervasive computing.
Client/server applications are of increasing importance in industry, and have been improved by advanced distributed object-oriented techniques, dedicated tool support and both multimedia and mobile computing extensions. Recent responses to this trend are standardized distributed platforms and models including the Distributed Computing Environment (DCE) of the Open Software Foundation (OSF), Open Distributed Processing (ODP), and the Common Object Request Broker Architecture (CORBA) of the Object Management Group (OMG). These proceedings are the compilation of papers from the technical stream of the IFIP/IEEE International Conference on Distributed Platforms, Dresden, Germany. This conference was sponsored by IFIP TC6.1, by the IEEE Communications Society, and by the German Association of Computer Science (GI - Gesellschaft für Informatik). ICDP'96 was organized jointly by Dresden University of Technology and Aachen University of Technology. It is closely related to the International Workshop on OSF DCE in Karlsruhe, 1993, and to the IFIP International Conference on Open Distributed Processing. ICDP has been designed to bring together researchers and practitioners who are studying and developing new methodologies, tools and technologies for advanced client/server environments, distributed systems, and network applications based on distributed platforms.
In today's competitive world, industries are focusing on shorter lead times, improved quality, reduced cost, increased profit, improved productivity and better customer service. As ERP and other information management systems have been widely implemented, information growth poses new challenges to decision makers in areas ranging from shop floor control to supply chain management and design. Frontiers in Computing Technologies for Manufacturing Applications presents an overview of the state of the art in intelligent computing in manufacturing. Modelling, data processing, algorithms and computational analysis of difficult problems found in advanced manufacturing are discussed. It is the first book to bring together combinatorial optimization, information systems and fault diagnosis and monitoring in a consistent manner. Techniques are presented to aid decision makers who must consider multiple, conflicting objectives in their decision processes. In particular, the use of metaheuristic optimization techniques for multi-objective problems is discussed. Readers will learn about computational technologies that can improve the performance of manufacturing systems ranging from manufacturing equipment to supply chains. Frontiers in Computing Technologies for Manufacturing Applications will be of interest to students in industrial and mechanical engineering as well as information engineers needing practical examples for the successful integration of information in manufacturing applications. The book will also appeal to technical decision makers involved in production planning, logistics, supply chain, industrial ecology, manufacturing information systems, fault diagnosis and monitoring.
For organizations, it's imperative to have the ability to analyze data sources, harmonize disparate data elements, and communicate the results of the analysis effectively to stakeholders. Created by certified enterprise data architect Jeff Voivoda, this simple guide to data analysis and harmonization begins by identifying the problems caused by inefficient data storage. It moves through the life cycle of identifying, gathering, recording, harmonizing and presenting data so that it is organized and comprehensible. Other key areas covered include: seeking out the right experts; reviewing data standards and considerations; grouping and managing data; understanding the practical applications of data analysis; and suggesting next steps in the development life cycle. It's essential to understand data requirements, management tools, and industry-wide standards if you want your organization to succeed or improve on its already strong position. Determine your next strategic step and manage your data as an asset with "Data Analysis and Harmonization."
This volume covers a variety of topics in the field of research in strategic management and information technology. These topics include organizational fit and flexibility and the determinants of business unit reliance on information technologies.
Sparse grids have gained increasing interest in recent years for the numerical treatment of high-dimensional problems. Whereas classical numerical discretization schemes fail in more than three or four dimensions, sparse grids make it possible to overcome the curse of dimensionality to some degree, extending the number of dimensions that can be dealt with. This volume of LNCSE collects the papers from the proceedings of the second workshop on sparse grids and applications, demonstrating once again the importance of this numerical discretization scheme. The selected articles present recent advances on the numerical analysis of sparse grids as well as efficient data structures, and the range of applications extends to uncertainty quantification settings and clustering, to name but a few examples.
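The blurb's central claim, that sparse grids temper the curse of dimensionality, can be made concrete with a quick point count. The sketch below is an illustration, not code from the book: it assumes the standard boundary-free regular sparse grid, in which each hierarchical subspace with multi-index l satisfying |l|_1 <= level + dim - 1 contributes prod_i 2^(l_i - 1) points.

```python
from itertools import product

def full_grid_points(level, dim):
    # Full tensor-product grid without boundary: 2^level - 1 points per axis.
    return (2 ** level - 1) ** dim

def sparse_grid_points(level, dim):
    # Regular sparse grid (interior points only): sum the contributions of
    # all hierarchical subspaces l with |l|_1 <= level + dim - 1.
    total = 0
    for l in product(range(1, level + 1), repeat=dim):
        if sum(l) <= level + dim - 1:
            pts = 1
            for li in l:
                pts *= 2 ** (li - 1)  # subspace l holds prod 2^(l_i - 1) points
            total += pts
    return total

print(full_grid_points(5, 5))    # full grid: grows exponentially with dim
print(sparse_grid_points(5, 5))  # sparse grid: orders of magnitude smaller
```

In one dimension the two counts coincide; the gap widens rapidly with dimension, which is exactly the effect the blurb describes.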
As instructors move further into the incorporation of 21st century technologies in adult education, a new paradigm of digitally-enriched mediated learning has emerged. "Adult Learning in the Digital Age: Perspectives on Online Technologies and Outcomes" provides a comprehensive framework of trends and issues related to adult learning for the facilitation of authentic learning in the age of digital technology. This significant reference source offers researchers, academicians, and practitioners a valuable compendium of expert ideas, practical experiences, field challenges, and potential opportunities concerning the advancement of new technological and pedagogical techniques used in adult schooling.
This self-contained book systematically explores the statistical dynamics on and of complex networks having relevance across a large number of scientific disciplines. The theories related to complex networks are increasingly being used by researchers for their usefulness in harnessing the most difficult problems of a particular discipline. The book is a collection of surveys and cutting-edge research contributions exploring the interdisciplinary relationship of dynamics on and of complex networks. Topics covered include complex networks found in nature (genetic pathways, ecological networks, linguistic systems, and social systems) as well as man-made systems such as the World Wide Web and peer-to-peer networks. The contributed chapters in this volume are intended to promote cross-fertilization in several research areas, and will be valuable to newcomers in the field, experienced researchers, practitioners, and graduate students interested in systems exhibiting an underlying complex network structure in disciplines such as computer science, biology, statistical physics, nonlinear dynamics, linguistics, and the social sciences.
Organisational Semiotics offers an effective approach to analysing organisations and modelling organisational behaviour. The methods and techniques derived from Organisational Semiotics enable us to study the organisation by examining how information is created and used for communication, coordination and performance of actions towards organisational objectives. The latest development of the young discipline and its applications have been reported in this book, which provides a useful guide and a valuable reference to anyone working in the areas of organisational study and information systems development.
Collaborative research in bioinformatics and systems biology is a key element of modern biology and health research. This book highlights and provides access to many of the methods, environments, results and resources involved, including integral laboratory data generation and experimentation and clinical activities. Collaborative projects embody a research paradigm that connects many of the top scientists, institutions, their resources and research worldwide, resulting in first-class contributions to bioinformatics and systems biology. Central themes include describing processes and results in collaborative research projects using computational biology and providing a guide for researchers to access them. The book is also a practical guide on how science is managed. It shows how collaborative researchers are putting results together in a way accessible to the entire biomedical community.
This book presents the most recent advances in fuzzy clustering techniques and their applications. The contents include: Introduction to Fuzzy Clustering; Fuzzy Clustering based Principal Component Analysis; Fuzzy Clustering based Regression Analysis; Kernel based Fuzzy Clustering; Evaluation of Fuzzy Clustering; and Self-Organized Fuzzy Clustering. This book is directed to computer scientists, engineers, scientists, professors and students of engineering, science, computer science, business, management, avionics and related disciplines.
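For readers unfamiliar with the technique the blurb names, a minimal sketch of fuzzy c-means may help. This is not code from the book; it implements the standard alternating updates (fuzzified cluster centers and inverse-distance memberships with fuzzifier m), and the hyperparameters are common defaults.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m                             # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # distances from every point to every center, shape (n, c)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                  # guard against division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

On well-separated data the memberships become nearly crisp; on overlapping clusters they express graded degrees of belonging, which is the point of the fuzzy formulation.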
This text describes the advanced concepts and techniques used for ASIC chip synthesis, formal verification and static timing analysis, using the Synopsys suite of tools. In addition, the entire ASIC design flow methodology targeted for VDSM (Very-Deep-Sub-Micron) technologies is covered in detail. The emphasis of this book is on real-time application of Synopsys tools used to combat various problems seen at VDSM geometries. Readers are exposed to an effective design methodology for handling complex, sub-micron ASIC designs. Significance is placed on HDL coding styles, synthesis and optimization, dynamic simulation, formal verification, DFT scan insertion, links to layout, and static timing analysis. At each step, problems related to each phase of the design flow are identified, with solutions and work-arounds described in detail. In addition, crucial issues related to layout, which includes clock tree synthesis and back-end integration (links to layout) are also discussed at length. The book is intended for anyone who is involved in the ASIC design methodology, starting from RTL synthesis to final tape-out. Target audiences for this book are practicing ASIC design engineers and graduate students undertaking advanced courses in ASIC chip design and DFT techniques.
Digital Timing Macromodeling for VLSI Design Verification first of all provides an extensive history of the development of simulation techniques. It presents detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation. It also discusses mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address the device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. The techniques address major sources of errors in existing macromodeling techniques, which must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions: much wider than has been presented for previous macromodels, thus demonstrating the wide range of applicability of these techniques.
Geocomputation may be viewed as the application of a computational science paradigm to study a wide range of problems in geographical systems contexts. This volume presents a clear, comprehensive and thoroughly state-of-the-art overview of current research, written by leading figures in the field. It provides important insights into this new and rapidly developing field, attempts to establish its principles and to develop techniques for solving real-world problems in a wide array of application domains, and serves as a catalyst to a greater understanding of what geocomputation is and what it entails. The broad coverage makes it invaluable reading for researchers and professionals in geography, environmental and economic sciences as well as for graduate students of spatial science and computer science.
This book explores non-extensive statistical mechanics in non-equilibrium thermodynamics, and presents an overview of the strong nonlinearity of chaos and complexity in natural systems, drawing on relevant mathematics from topology, measure-theory, inverse and ill-posed problems, set-valued analysis, and nonlinear functional analysis. It offers a self-contained theory of complexity and complex systems as the steady state of non-equilibrium systems, denoting a homeostatic dynamic equilibrium between stabilizing order and destabilizing disorder.
This book encapsulates some work done in the DIRC project concerned with trust and responsibility in socio-technical systems. It brings together a range of disciplinary approaches - computer science, sociology and software engineering - to produce a socio-technical systems perspective on the issues surrounding trust in technology in complex settings. Computer systems can only bring about their purported benefits if functionality, users and usability are central to their design and deployment. Thus, technology can only be trusted in situ and in everyday use if these issues have been brought to bear on the process of technology design, implementation and use. The studies detailed in this book analyse the ways in which trust in technology is achieved and/or worked around in everyday situations in a range of settings - including hospitals, a steelworks, a public enquiry, the financial services sector and air traffic control.
Computer-Aided Verification is a collection of papers that begins with a general survey of hardware verification methods. Ms. Gupta starts with the issue of verification itself and develops a taxonomy of verification methodologies, focusing especially upon recent advances. Although her emphasis is hardware verification, most of what she reports applies to software verification as well. Graphical presentation is coming to be a de facto requirement for a 'friendly' user interface. The second paper presents a generic format for graphical presentations of coordinating systems represented by automata. The last two papers, as a pair, present a variety of generic techniques for reducing the computational cost of computer-aided verification based upon explicit computational memory: the first of the two gives a time-space trade-off, while the second gives a technique which trades space for a (sometimes predictable) probability of error. Computer-Aided Verification is an edited volume of original research. This research work has also been published as a special issue of the journal Formal Methods in System Design, 1:2-3.
While a typical project manager's responsibility and accountability are both limited to a project with a clear start and end date, IT managers are responsible for an ongoing, ever-changing process to which they must adapt and evolve in order to stay current, dependable, and secure in their field. Professional Advancements and Management Trends in the IT Sector offers the latest managerial trends within the field of information technology management. By collecting research from experts from around the world, in a variety of sectors and levels of technical expertise, this volume offers a broad variety of case studies, best practices, methodologies, and research within the field of information technology management. It will serve as a vital resource for practitioners and academics alike.
The papers in this volume comprise the refereed proceedings of the First International Conference on Computer and Computing Technologies in Agriculture (CCTA 2007), held in Wuyishan, China, in 2007. The conference was organized by China Agricultural University, the Chinese Society of Agricultural Engineering and the Beijing Society for Information Technology in Agriculture. The purpose of the conference is to facilitate communication and cooperation between institutions and researchers on theories, methods and implementation of computer science and information technology. By researching information technology development and resources integration in rural areas in China, an innovative and effective approach is expected to be explored to promote the application of technology to the development of modern agriculture and contribute to the construction of the new countryside. The rapid development of information technology has induced substantial changes and impact on the development of China's rural areas. Western thoughts have exerted great impact on studies of Chinese information technology development, and this helps more Chinese and western scholars to expand their studies in this academic and application area. Thus, this conference, with works by many prominent scholars, has covered computer science and technology and information development in China's rural areas, and probed into all the important issues and the newest research topics, such as Agricultural Decision Support Systems and Expert Systems; GIS, GPS, RS and Precision Farming; ICT applications in rural areas; Agricultural System Simulation; Evolutionary Computing; etc.
This book presents methodologies for analysing large data sets produced by the direct numerical simulation (DNS) of turbulence and combustion. It describes the development of models that can be used to analyse large eddy simulations, and highlights both the most common techniques and newly emerging ones. The chapters, written by internationally respected experts, invite readers to consider DNS of turbulence and combustion from a formal, data-driven standpoint, rather than one led by experience and intuition. This perspective allows readers to recognise the shortcomings of existing models, with the ultimate goal of quantifying and reducing model-based uncertainty. In addition, recent advances in machine learning and statistical inferences offer new insights on the interpretation of DNS data. The book will especially benefit graduate-level students and researchers in mechanical and aerospace engineering, e.g. those with an interest in general fluid mechanics, applied mathematics, and the environmental and atmospheric sciences.
Collaboration is a form of electronic communication in which individuals work on the same documents or processes over a period of time. When applied to technologies development, collaboration often has a focus on user-centered design and rapid prototyping, with a strong people-orientation. "Collaborative Technologies and Applications for Interactive Information Design: Emerging Trends in User Experiences" covers a wide range of emerging topics in collaboration, Web 2.0, and social computing, with a focus on technologies that impact the user experience. This cutting-edge source provides the latest international findings useful to practitioners, researchers, and academicians involved in education, ontologies, open source communities, and trusted networks.
In many countries, small businesses comprise over 95% of the proportion of private businesses and approximately half of the private workforce, with information technology being used in more than 90% of these businesses. As a result, governments worldwide are placing increasing importance upon the success of small business entrepreneurs and are providing increased resources to support this emphasis. Managing Information Technology in Small Business: Challenges and Solutions presents research in areas such as IT performance, electronic commerce, internet adoption, and IT planning methodologies and focuses on how these areas impact small businesses.
This volume examines the application of swarm intelligence in data mining, addressing the issues of swarm intelligence and data mining using novel intelligent approaches. The book comprises 11 chapters including an introduction reviewing fundamental definitions and important research challenges. Important features include a detailed overview of swarm intelligence and data mining paradigms, focused coverage of timely, advanced data mining topics, state-of-the-art theoretical research and application developments and contributions by pioneers in the field.
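To make the blurb's subject concrete, here is a minimal sketch of the canonical swarm intelligence algorithm, global-best particle swarm optimization. It is not code from the book; the inertia and acceleration coefficients are common textbook defaults, and the objective function is an assumed example.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, n_iter=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal global-best PSO: returns (best position, best value)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # per-particle best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()          # swarm-wide best position
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia / cognitive / social
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()
```

In data mining settings of the kind the volume surveys, the objective f would typically score a clustering, a rule set, or a feature subset rather than a simple analytic function.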
The 20th anniversary of the IFIP WG6.1 Joint International Conference on Formal Methods for Distributed Systems and Communication Protocols (FORTE XIII / PSTV XX) was celebrated by the year 2000 edition of the Conference, which was held for the first time in Italy, at Pisa, October 10-13, 2000. In devising the subtitle for this special edition, 'Formal Methods Implementation Under Test', we wanted to convey two main concepts that, in our opinion, are reflected in the contents of this book. First, the early, pioneering phases in the development of Formal Methods (FMs), with their conflicts between evangelistic and agnostic attitudes, with their over-optimistic applications to toy examples and over-skeptical views about scalability to industrial cases, with their misconceptions and myths, all this is essentially over. Many FMs have successfully reached their maturity, having been 'implemented' into concrete development practice: a number of papers in this book report about successful experiences in specifying and verifying real distributed systems and protocols. Second, one of the several myths about FMs, the belief that their adoption would eventually eliminate the need for testing, is still quite far from becoming a reality, and, again, this book indicates that testing theory and applications are still remarkably healthy. A total of 63 papers were submitted to FORTE/PSTV 2000, out of which the Programme Committee selected 22 for presentation at the Conference and inclusion in the Proceedings.