Privacy, Security and Trust within the Context of Pervasive Computing is an edited volume based on a post-workshop of the Second International Conference on Pervasive Computing. The workshop was held April 18-23, 2004, in Vienna, Austria. The goal of the workshop was not to focus on specific, even novel mechanisms, but rather on the interfaces between mechanisms in different technical and social problem spaces. An investigation of the interfaces between the notions of context, privacy, security, and trust will result in a deeper understanding of the "atomic" problems, leading to a more complete understanding of the social and technical issues in pervasive computing.
Pulling aside the curtain of 'Big Data' buzz, this book introduces C-suite and other non-technical senior leaders to the essentials of obtaining and maintaining accurate, reliable data, especially for decision-making purposes. Bad data begets bad decisions, and an understanding of data fundamentals - how data is generated, organized, stored, evaluated, and maintained - has never been more important when solving problems such as the pandemic-related supply chain crisis. This book addresses the data-related challenges that businesses face, answering questions such as: What are the characteristics of high-quality data? How do you get from bad data to good data? What procedures and practices ensure high-quality data? How do you know whether your data supports the decisions you need to make? This clear and valuable resource will appeal to C-suite executives and top-line managers across industries, as well as business analysts at all career stages and data analytics students.
In today's competitive world, industries are focusing on shorter lead times, improved quality, reduced cost, increased profit, improved productivity and better customer service. As ERP and other information management systems have been widely implemented, information growth poses new challenges to decision makers in areas ranging from shop floor control to supply chain management and design. Frontiers in Computing Technologies for Manufacturing Applications presents an overview of the state-of-the-art intelligent computing in manufacturing. Modelling, data processing, algorithms and computational analysis of difficult problems found in advanced manufacturing are discussed. It is the first book to bring together combinatorial optimization, information systems and fault diagnosis and monitoring in a consistent manner. Techniques are presented in order to aid decision makers needing to consider multiple, conflicting objectives in their decision processes. In particular, the use of metaheuristic optimization techniques for multi-objective problems is discussed. Readers will learn about computational technologies that can improve the performance of manufacturing systems ranging from manufacturing equipment to supply chains. Frontiers in Computing Technologies for Manufacturing Applications will be of interest to students in industrial and mechanical engineering as well as information engineers needing practical examples for the successful integration of information in manufacturing applications. The book will also appeal to technical decision makers involved in production planning, logistics, supply chain, industrial ecology, manufacturing information systems, fault diagnosis and monitoring.
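The multi-objective decision problems this blurb mentions hinge on one core idea: with conflicting objectives (say, cost versus lead time) there is no single best solution, only a set of non-dominated trade-offs. The sketch below is a minimal illustration of Pareto filtering, not code from the book; the candidate (cost, lead_time) pairs are invented for the example.

```python
def dominates(a, b):
    """True if solution a is at least as good as b on every objective
    (minimization) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (cost, lead_time) pairs."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# hypothetical schedules scored as (cost, lead_time), both to be minimized
candidates = [(10, 5), (8, 7), (12, 4), (9, 6), (8, 8)]
print(pareto_front(candidates))  # (8, 8) is dominated by (8, 7) and drops out
```

A metaheuristic (e.g. a multi-objective genetic algorithm) would generate candidates and repeatedly apply exactly this kind of dominance test to steer the search toward the front.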
Sparse grids have gained increasing interest in recent years for the numerical treatment of high-dimensional problems. Whereas classical numerical discretization schemes fail in more than three or four dimensions, sparse grids make it possible to overcome the curse of dimensionality to some degree, extending the number of dimensions that can be dealt with. This volume of LNCSE collects the papers from the proceedings of the second workshop on sparse grids and applications, demonstrating once again the importance of this numerical discretization scheme. The selected articles present recent advances on the numerical analysis of sparse grids as well as efficient data structures, and the range of applications extends to uncertainty quantification settings and clustering, to name but a few examples.
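The "curse of dimensionality" the blurb refers to is easy to see numerically: a full tensor grid with n points per dimension needs n^d points in d dimensions, while a sparse grid of comparable accuracy grows roughly like n (log n)^(d-1). The snippet below is a back-of-the-envelope illustration of that gap, not a construction from the proceedings; the O(n (log n)^(d-1)) figure is the standard asymptotic estimate, used here only as a rough comparison.

```python
import math

n = 17  # points per dimension (a level-4 dyadic grid: 2**4 + 1)
for d in (2, 4, 10):
    full = n ** d                                  # full tensor grid size
    sparse_est = n * math.log2(n) ** (d - 1)       # rough sparse-grid growth
    print(f"d={d}: full grid {full:,} points, sparse estimate ~{sparse_est:,.0f}")
```

At d=10 the full grid already exceeds 10^12 points, which is why classical discretizations become infeasible beyond a few dimensions.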
Client/Server applications are of increasing importance in industry, and have been improved by advanced distributed object-oriented techniques, dedicated tool support and both multimedia and mobile computing extensions. Recent responses to this trend are standardized distributed platforms and models including the Distributed Computing Environment (DCE) of the Open Software Foundation (OSF), Open Distributed Processing (ODP), and the Common Object Request Broker Architecture (CORBA) of the Object Management Group (OMG). These proceedings are the compilation of papers from the technical stream of the IFIP/IEEE International Conference on Distributed Platforms, Dresden, Germany. This conference has been sponsored by IFIP TC6.1, by the IEEE Communications Society, and by the German Association of Computer Science (GI - Gesellschaft für Informatik). ICDP'96 was organized jointly by Dresden University of Technology and Aachen University of Technology. It is closely related to the International Workshop on OSF DCE in Karlsruhe, 1993, and to the IFIP International Conference on Open Distributed Processing. ICDP has been designed to bring together researchers and practitioners who are studying and developing new methodologies, tools and technologies for advanced client/server environments, distributed systems, and network applications based on distributed platforms.
Much of the world's advanced data processing applications are now dependent on eXtensible Markup Language (XML), from publishing to medical information storage. Therefore, XML has become a de facto standard for data exchange and representation on the World Wide Web and in daily life. Applications and Structures in XML Processing: Label Streams, Semantics Utilization and Data Query Technologies reflects the significant research results and latest findings of scholars worldwide, working to explore and expand the role of XML. This collection conveys an understanding of XML processing technologies in connection with both advanced applications and the latest processing techniques, which is of primary importance. It provides the opportunity to understand topics in detail and discover XML research at a comprehensive level.
XML Programming at its best! Discover a book that tells you what you should do and how! Instead of jumping right into the instructions, this book first provides all the necessary concepts you need, making the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you reach the more complex lessons in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to your learning. You will also learn the designs and forms of XML - and what's more convenient than getting to know both sides! Want to know more?
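The kind of XML processing both of these titles cover boils down to parsing a document tree and querying its elements. As a minimal, hedged sketch (using Python's standard-library parser, not any tool from the books, and an invented two-book catalog):

```python
import xml.etree.ElementTree as ET

doc = """<catalog>
  <book id="bk101"><title>XML Basics</title><price>29.99</price></book>
  <book id="bk102"><title>Advanced XML</title><price>49.99</price></book>
</catalog>"""

root = ET.fromstring(doc)                     # parse the string into a tree
titles = [b.find("title").text for b in root.findall("book")]
total = sum(float(b.find("price").text) for b in root.findall("book"))
print(titles)   # ['XML Basics', 'Advanced XML']
print(total)    # 79.98
```

The same traversal pattern (find elements, read attributes and text, aggregate) underlies far more elaborate XML query technologies such as XPath and XQuery.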
This self-contained book systematically explores the statistical dynamics on and of complex networks having relevance across a large number of scientific disciplines. The theories related to complex networks are increasingly being used by researchers for their usefulness in harnessing the most difficult problems of a particular discipline. The book is a collection of surveys and cutting-edge research contributions exploring the interdisciplinary relationship of dynamics on and of complex networks. Topics covered include complex networks found in nature-genetic pathways, ecological networks, linguistic systems, and social systems-as well as man-made systems such as the World Wide Web and peer-to-peer networks. The contributed chapters in this volume are intended to promote cross-fertilization in several research areas, and will be valuable to newcomers in the field, experienced researchers, practitioners, and graduate students interested in systems exhibiting an underlying complex network structure in disciplines such as computer science, biology, statistical physics, nonlinear dynamics, linguistics, and the social sciences.
Collaborative research in bioinformatics and systems biology is a key element of modern biology and health research. This book highlights and provides access to many of the methods, environments, results and resources involved, including integral laboratory data generation and experimentation and clinical activities. Collaborative projects embody a research paradigm that connects many of the top scientists, institutions, their resources and research worldwide, resulting in first-class contributions to bioinformatics and systems biology. Central themes include describing processes and results in collaborative research projects using computational biology and providing a guide for researchers to access them. The book is also a practical guide on how science is managed. It shows how collaborative researchers are putting results together in a way accessible to the entire biomedical community.
This book presents the most recent advances in fuzzy clustering techniques and their applications. The contents include Introduction to Fuzzy Clustering; Fuzzy Clustering based Principal Component Analysis; Fuzzy Clustering based Regression Analysis; Kernel based Fuzzy Clustering; Evaluation of Fuzzy Clustering; Self-Organized Fuzzy Clustering. This book is directed to the computer scientists, engineers, scientists, professors and students of engineering, science, computer science, business, management, avionics and related disciplines.
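Fuzzy clustering differs from hard clustering in that each point receives a degree of membership in every cluster rather than a single label. The sketch below is a minimal 1-D fuzzy c-means loop written for illustration only, with invented toy data; it is not an implementation from the book.

```python
import random

def fcm(points, c=2, m=2.0, iters=50):
    """Minimal fuzzy c-means for 1-D data: alternate membership and
    centroid updates for a fixed number of iterations."""
    random.seed(0)
    centers = random.sample(points, c)
    u = []
    for _ in range(iters):
        # u[i][j]: degree to which point i belongs to cluster j (rows sum to 1)
        u = []
        for x in points:
            d = [abs(x - v) + 1e-12 for v in centers]  # distances to centers
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1)) for k in range(c))
                      for j in range(c)])
        # centroid update: membership-weighted mean of the points
        centers = [sum(u[i][j] ** m * points[i] for i in range(len(points))) /
                   sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(c)]
    return centers, u

centers, u = fcm([1.0, 1.2, 0.9, 8.0, 8.3, 7.9])
print([round(v, 2) for v in sorted(centers)])  # one center near each group
```

The fuzzifier m controls how soft the memberships are; the kernel-based and self-organized variants listed in the contents generalize exactly this update scheme.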
This book presents a novel account of the human temporal dimension called the "human temporality" and develops a special mathematical formalism for describing such an object as the human mind. One of the characteristic features of the human mind is its temporal extent. For objects of physical reality, only the present exists, which may be conceived as a point-like moment in time. In the human temporality, the past retained in the memory, the imaginary future, and the present coexist and are closely intertwined and impact one another. This book focuses on one fragment of the human temporality called the complex present. A detailed analysis of the classical and modern concepts has enabled the authors to put forward the idea of the multi-component structure of the present. For the concept of the complex present, the authors propose a novel account that involves a qualitative description and a special mathematical formalism. This formalism takes into account human goal-oriented behavior and uncertainty in human perception. The book will be of interest to theoreticians, physicists modeling systems where the human factor plays a crucial role, philosophers interested in applying philosophical concepts to constructing mathematical models, and psychologists whose research is related to modeling mental processes.
Organisational Semiotics offers an effective approach to analysing organisations and modelling organisational behaviour. The methods and techniques derived from Organisational Semiotics enable us to study the organisation by examining how information is created and used for communication, coordination and performance of actions towards organisational objectives. The latest development of the young discipline and its applications have been reported in this book, which provides a useful guide and a valuable reference to anyone working in the areas of organisational study and information systems development.
The papers in this volume comprise the refereed proceedings of the First International Conference on Computer and Computing Technologies in Agriculture (CCTA 2007), held in Wuyishan, China, in 2007. This conference was organized by China Agricultural University, the Chinese Society of Agricultural Engineering and the Beijing Society for Information Technology in Agriculture. The purpose of this conference is to facilitate the communication and cooperation between institutions and researchers on theories, methods and implementation of computer science and information technology. By researching information technology development and resources integration in rural areas in China, an innovative and effective approach is expected to be explored to promote the technology application to the development of modern agriculture and contribute to the construction of the new countryside. The rapid development of information technology has induced substantial changes and impact on the development of China's rural areas. Western thoughts have exerted great impact on studies of Chinese information technology development and help more Chinese and western scholars to expand their studies in this academic and application area. Thus, this conference, with works by many prominent scholars, has covered computer science and technology and information development in China's rural areas, and probed into all the important issues and the newest research topics, such as Agricultural Decision Support Systems and Expert Systems; GIS, GPS, RS and Precision Farming; ICT applications in rural areas; Agricultural System Simulation; Evolutionary Computing; etc.
For organizations, it's imperative to have the ability to analyze data sources, harmonize disparate data elements, and communicate the results of the analysis in an effective manner to stakeholders. Created by certified enterprise data architect Jeff Voivoda, this simple guide to data analysis and harmonization begins by identifying the problems caused by inefficient data storage. It moves through the life cycle of identifying, gathering, recording, harmonizing and presenting data so that it is organized and comprehensible. Other key areas covered include the following: seeking out the right experts; reviewing data standards and considerations; grouping and managing data; understanding the practical applications of data analysis; and suggesting next steps in the development life cycle. It's essential to understand data requirements, management tools, and industry-wide standards if you want your organization to succeed or improve on its already strong position. Determine your next strategic step and manage your data as an asset with "Data Analysis and Harmonization."
This book explores non-extensive statistical mechanics in non-equilibrium thermodynamics, and presents an overview of the strong nonlinearity of chaos and complexity in natural systems, drawing on relevant mathematics from topology, measure-theory, inverse and ill-posed problems, set-valued analysis, and nonlinear functional analysis. It offers a self-contained theory of complexity and complex systems as the steady state of non-equilibrium systems, denoting a homeostatic dynamic equilibrium between stabilizing order and destabilizing disorder.
This volume covers a variety of topics in the field of research in strategic management and information technology. These topics include organizational fit and flexibility and the determinants of business unit reliance on information technologies.
Geocomputation may be viewed as the application of a computational science paradigm to study a wide range of problems in geographical systems contexts. This volume presents a clear, comprehensive and thoroughly state-of-the-art overview of current research, written by leading figures in the field. It provides important insights into this new and rapidly developing field, attempts to establish principles and develop techniques for solving real-world problems in a wide array of application domains, and acts as a catalyst to a greater understanding of what geocomputation is and what it entails. The broad coverage makes it invaluable reading for researchers and professionals in geography, environmental and economic sciences, as well as for graduate students of spatial science and computer science.
This text describes the advanced concepts and techniques used for ASIC chip synthesis, formal verification and static timing analysis, using the Synopsys suite of tools. In addition, the entire ASIC design flow methodology targeted for VDSM (Very-Deep-Sub-Micron) technologies is covered in detail. The emphasis of this book is on real-time application of Synopsys tools used to combat various problems seen at VDSM geometries. Readers are exposed to an effective design methodology for handling complex, sub-micron ASIC designs. Significance is placed on HDL coding styles, synthesis and optimization, dynamic simulation, formal verification, DFT scan insertion, links to layout, and static timing analysis. At each step, problems related to each phase of the design flow are identified, with solutions and work-arounds described in detail. In addition, crucial issues related to layout, which includes clock tree synthesis and back-end integration (links to layout) are also discussed at length. The book is intended for anyone who is involved in the ASIC design methodology, starting from RTL synthesis to final tape-out. Target audiences for this book are practicing ASIC design engineers and graduate students undertaking advanced courses in ASIC chip design and DFT techniques.
As instructors move further into the incorporation of 21st century technologies in adult education, a new paradigm of digitally-enriched mediated learning has emerged. "Adult Learning in the Digital Age: Perspectives on Online Technologies and Outcomes" provides a comprehensive framework of trends and issues related to adult learning for the facilitation of authentic learning in the age of digital technology. This significant reference source offers researchers, academicians, and practitioners a valuable compendium of expert ideas, practical experiences, field challenges, and potential opportunities concerning the advancement of new technological and pedagogical techniques used in adult schooling.
With the recent advancements and implementations of technology within the global community, various regions of the world have begun to transform. The idea of smart transportation and mobility is a specific field that has been implemented among countless areas around the world that are focused on intelligent and efficient environments. Despite its strong influence and potential, sustainable mobility still faces multiple demographic and environmental challenges. New perspectives, improvements, and solutions are needed in order to successfully apply efficient and sustainable transportation within populated environments. Implications of Mobility as a Service (MaaS) in Urban and Rural Environments: Emerging Research and Opportunities is a pivotal reference source that provides vital research on recent transportation improvements and the development of mobility systems in populated regions. While highlighting topics such as human-machine interaction, alternative vehicles, and sustainable development, this publication explores competitive solutions for transport efficiency as well as its impact on citizens' quality of life. This book is ideally designed for researchers, environmentalists, civil engineers, architects, policymakers, strategists, academicians, and students seeking current research on mobility advancements in urban and rural areas across the globe.
This book encapsulates some work done in the DIRC project concerned with trust and responsibility in socio-technical systems. It brings together a range of disciplinary approaches - computer science, sociology and software engineering - to produce a socio-technical systems perspective on the issues surrounding trust in technology in complex settings. Computer systems can only bring about their purported benefits if functionality, users and usability are central to their design and deployment. Thus, technology can only be trusted in situ and in everyday use if these issues have been brought to bear on the process of technology design, implementation and use. The studies detailed in this book analyse the ways in which trust in technology is achieved and/or worked around in everyday situations in a range of settings - including hospitals, a steelworks, a public enquiry, the financial services sector and air traffic control.
Digital Timing Macromodeling for VLSI Design Verification first of all provides an extensive history of the development of simulation techniques. It presents detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation. It also discusses mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address the device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. The techniques address major sources of errors in existing macromodeling techniques, which must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions: much wider than has been presented for previous macromodels, thus demonstrating the wide range of applicability of these techniques.
REAL-TIME MANAGEMENT OF RESOURCE ALLOCATION SYSTEMS focuses on the problem of managing the resource allocation taking place within the operational context of many contemporary technological applications, including flexibly automated production systems, automated railway and/or monorail transportation systems, electronic workflow management systems, and business transaction supporting systems. A distinct trait of all these applications is that they limit the role of the human element to remote high-level supervision, while placing the burden of the real-time monitoring and coordination of the ongoing activity upon a computerized control system. Hence, any applicable control paradigm must address not only the issues of throughput maximization, work-in-process inventory reduction, and delay and cost minimization, that have been the typical concerns for past studies on resource allocation, but it must also guarantee the operational correctness and the behavioral consistency of the underlying automated system. The resulting problem is rather novel for the developers of these systems, since, in the past, many of its facets were left to the jurisdiction of the present human intelligence. It is also complex, due to the high levels of choice - otherwise known as flexibility - inherent in the operation of these environments. This book proposes a control paradigm that offers a comprehensive and integrated solution to both the behavioral / logical and the performance-oriented control problems underlying the management of the resource allocation taking place in the aforementioned highly automated technological applications. Building upon a series of fairly recent results from Discrete Event Systems theory, the proposed paradigm is distinguished by: (i) its robustness to the experienced stochasticities and operational contingencies; (ii) its scalability to the large-scale nature of the target technological applications; and (iii) its operational efficiency.
These three properties are supported through the adoption of a "closed-loop" structure for the proposed control scheme, and also, through a pertinent decomposition of the overall control function to a logical and a performance-oriented controller for the underlying resource allocation. REAL-TIME MANAGEMENT OF RESOURCE ALLOCATION SYSTEMS provides a rigorous study of the control problems addressed by each of these two controllers, and of their integration to a unified control function. A notion of optimal control is formulated for each of these problems, but it turns out that the corresponding optimal policies are computationally intractable. Hence, a large part of the book is devoted to the development of effective and computationally efficient approximations for these optimal control policies, especially for those that correspond to the more novel logical control problem.
An Introduction to R and Python for Data Analysis helps teach students to code in both R and Python simultaneously. As both R and Python can be used in similar manners, it is useful and efficient to learn both at the same time, helping lecturers and students teach and learn more, save time, and reinforce the shared concepts and differences of the two languages. This tandem learning is highly useful for students, helping them become literate in both languages and develop skills which will be handy after their studies. This book presumes no prior experience with computing, and is intended to be used by students from a variety of backgrounds. The side-by-side formatting of this book helps introductory graduate students quickly grasp the basics of R and Python, with the exercises helping them to teach themselves the skills they will need upon the completion of their course, as employers now ask for competency in both R and Python. Teachers and lecturers will also find this book useful in their teaching, providing a singular work to help ensure their students are well trained in both computer languages. All data for exercises can be found here: https://github.com/tbrown122387/r_and_python_book/tree/master/data. Key features: - Teaches R and Python in a "side-by-side" way. - Examples are tailored to aspiring data scientists and statisticians, not software engineers. - Designed for introductory graduate students. - Does not assume any mathematical background.
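The "side-by-side" approach the blurb describes pairs each R idiom with its Python counterpart. As a small illustrative sketch (not an example from the book, with made-up data), the Python code below carries the equivalent R expressions in comments:

```python
# Python side of a side-by-side R/Python comparison; the R
# equivalent of each step is shown as a comment.
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # R: data <- c(2, 4, 4, 4, 5, 5, 7, 9)
avg = statistics.mean(data)                       # R: mean(data)
sd = statistics.stdev(data)                       # R: sd(data)  (sample std. dev.)
print(avg, round(sd, 3))
```

Both languages default to the sample (n-1) standard deviation here, which is exactly the kind of shared convention - and occasional divergence - that tandem learning makes visible.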
Computer-Aided Verification is a collection of papers that begins with a general survey of hardware verification methods. Ms. Gupta starts with the issue of verification itself and develops a taxonomy of verification methodologies, focusing especially upon recent advances. Although her emphasis is hardware verification, most of what she reports applies to software verification as well. Graphical presentation is coming to be a de facto requirement for a 'friendly' user interface. The second paper presents a generic format for graphical presentations of coordinating systems represented by automata. The last two papers, as a pair, present a variety of generic techniques for reducing the computational cost of computer-aided verification based upon explicit computational memory: the first of the two gives a time-space trade-off, while the second gives a technique which trades space for a (sometimes predictable) probability of error. Computer-Aided Verification is an edited volume of original research. This research work has also been published as a special issue of the journal Formal Methods in System Design, 1:2-3.