To optimally design and manage a directory service, IS architects and managers must understand current state-of-the-art products. Directory Services covers Novell's NDS eDirectory, Microsoft's Active Directory, UNIX directories, and products by NEXOR, MaxWare, Siemens, Critical Path and others. Directory design fundamentals and products are woven into case studies of large enterprise deployments. Cox thoroughly explores replication, security, migration, legacy system integration, and interoperability. Business issues such as how to cost-justify, plan, budget and manage a directory project are also included. The book culminates in a visionary discussion of future trends and emerging directory technologies, including the strategic direction of the top directory products, the impact of wireless technology on directory-enabled applications, and using directories to customize content delivery from the Enterprise Portal.
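As a minimal sketch of what a directory-enabled application actually does, the snippet below queries a directory over LDAP, the protocol shared by eDirectory, Active Directory and the other products discussed. It assumes the third-party `ldap3` Python package; the server address, bind DN and search base are hypothetical placeholders, not values from the book.

```python
# Hedged sketch: query a directory service over LDAP using the ldap3 package.
# Host, credentials and DNs below are illustrative placeholders.
from ldap3 import Server, Connection, ALL

server = Server("ldap://directory.example.com", get_info=ALL)
conn = Connection(server, user="cn=reader,dc=example,dc=com",
                  password="secret", auto_bind=True)

# Find all person entries and read their common-name and mail attributes.
conn.search(search_base="dc=example,dc=com",
            search_filter="(objectClass=person)",
            attributes=["cn", "mail"])

for entry in conn.entries:
    print(entry.cn, entry.mail)

conn.unbind()
```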
The design of computer systems to be embedded in critical real-time applications is a complex task. Such systems must not only guarantee to meet hard real-time deadlines imposed by their physical environment; they must also guarantee to do so dependably, despite both physical faults (in hardware) and design faults (in hardware or software). A fault-tolerance approach is mandatory for these guarantees to be commensurate with the safety and reliability requirements of many life- and mission-critical applications. A Generic Fault-Tolerant Architecture for Real-Time Dependable Systems explains the motivations and the results of a collaborative project (*), whose objective was to significantly decrease the lifecycle costs of such fault-tolerant systems. The end-user companies participating in this project currently deploy fault-tolerant systems in critical railway, space and nuclear-propulsion applications. However, these are proprietary systems whose architectures have been tailored to meet domain-specific requirements. This has led to very costly, inflexible, and often hardware-intensive solutions that, by the time they are developed, validated and certified for use in the field, can already be out-of-date in terms of their underlying hardware and software technology. The project thus designed a generic fault-tolerant architecture with two dimensions of redundancy and a third multi-level integrity dimension for accommodating software components of different levels of criticality. The architecture is largely based on commercial off-the-shelf (COTS) components and follows a software-implemented approach so as to minimise the need for special hardware. Using an associated development and validation environment, system developers may configure and validate instances of the architecture that can be shown to meet the very diverse requirements of railway, space, nuclear-propulsion and other critical real-time applications. This book describes the rationale of the generic architecture, the design and validation of its communication, scheduling and fault-tolerance components, and the tools that make up its design and validation environment. The book concludes with a description of three prototype systems that have been developed following the proposed approach. (*) Esprit project No. 20716: GUARDS: a Generic Upgradable Architecture for Real-time Dependable Systems.
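The following is a minimal sketch, not taken from the book, of the basic idea behind software-implemented redundancy: the same computation runs on redundant channels and a majority vote masks a single faulty result.

```python
# Hedged sketch of fault masking by majority voting across redundant channels.
from collections import Counter

def vote(results):
    """Return the majority value among redundant channel outputs,
    or raise if no majority exists (an unmaskable fault)."""
    value, count = Counter(results).most_common(1)[0]
    if count * 2 <= len(results):
        raise RuntimeError("no majority: fault cannot be masked")
    return value

# Three redundant channels compute the same set-point; one is faulty.
channel_outputs = [42, 42, 17]
print(vote(channel_outputs))   # -> 42
```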
This book provides an overview of the resources and research projects that are bringing Big Data and High Performance Computing (HPC) onto converging tracks. It demystifies Big Data and HPC for the reader by covering the primary resources, middleware, applications, and tools that enable the use of HPC platforms for Big Data management and processing. Through interesting use cases from traditional and non-traditional HPC domains, the book highlights the most critical challenges related to Big Data processing and management, and shows ways to mitigate them using HPC resources. Unlike most books on Big Data, it covers a variety of alternatives to Hadoop, and explains the differences between HPC platforms and Hadoop. Written by professionals and researchers in a range of departments and fields, this book is designed for anyone studying Big Data and its future directions. Those studying HPC will also find the content valuable.
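As an illustrative sketch, independent of any system covered in the book, the map-reduce pattern that Hadoop popularised can be expressed with Python's standard multiprocessing pool on a single node, one simple "alternative to Hadoop" for small workloads.

```python
# Hedged sketch: word count in the map-reduce style using only the standard library.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk_of_lines):
    """Map step: count words in one chunk of the input."""
    c = Counter()
    for line in chunk_of_lines:
        c.update(line.split())
    return c

def reduce_counts(partials):
    """Reduce step: merge the per-chunk counters."""
    total = Counter()
    for p in partials:
        total.update(p)
    return total

if __name__ == "__main__":
    lines = ["big data on hpc", "hpc meets big data", "data data data"]
    chunks = [lines[0:1], lines[1:2], lines[2:3]]   # pretend these are file splits
    with Pool(processes=3) as pool:
        partial_counts = pool.map(map_count, chunks)
    print(reduce_counts(partial_counts))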
Calendar units, such as months and days, clock units, such as hours and seconds, and specialized units, such as business days and academic years, play a major role in a wide range of information system applications. System support for reasoning about these units, called granularities in this book, is important for the efficient design, use, and implementation of such applications. The book deals with several aspects of temporal information and provides a unifying model for granularities. It is intended for computer scientists and engineers who are interested in the formal models and technical development of specific issues. Practitioners can learn about critical aspects that must be taken into account when designing and implementing databases supporting temporal information. Lecturers may find this book useful for an advanced course on databases. Moreover, any graduate student working on time representation and reasoning, either in data or knowledge bases, should definitely read it.
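A minimal sketch of reasoning over two of the granularities the book discusses: converting a span expressed in calendar days into business days, ignoring holidays for simplicity (a real system would also model holiday calendars).

```python
# Hedged sketch: counting business days between two dates with the standard library.
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count Monday-Friday days in the half-open interval [start, end)."""
    days = 0
    current = start
    while current < end:
        if current.weekday() < 5:      # 0=Monday ... 4=Friday
            days += 1
        current += timedelta(days=1)
    return days

print(business_days_between(date(2024, 1, 1), date(2024, 1, 15)))  # -> 10
```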
Learn how to apply the principles of machine learning to time series modeling with this indispensable resource. Machine Learning for Time Series Forecasting with Python is an incisive and straightforward examination of one of the most crucial elements of decision-making in finance, marketing, education, and healthcare: time series modeling. Despite the centrality of time series forecasting, few business analysts are familiar with the power or utility of applying machine learning to time series modeling. Author Francesca Lazzeri, a distinguished machine learning scientist and economist, corrects that deficiency by providing readers with a comprehensive and approachable explanation and treatment of the application of machine learning to time series forecasting. Written for readers who have little to no experience in time series forecasting or machine learning, the book comprehensively covers all the topics necessary to understand time series forecasting concepts such as stationarity, horizon, trend, and seasonality; prepare time series data for modeling; evaluate the performance and accuracy of time series forecasting models; and understand when to use neural networks instead of traditional time series models. Machine Learning for Time Series Forecasting with Python is full of real-world examples, resources, and concrete strategies to help readers explore and transform data and develop usable, practical time series forecasts. Perfect for entry-level data scientists, business analysts, developers, and researchers, this book is an invaluable and indispensable guide to the fundamental and advanced concepts of machine learning applied to time series modeling.
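The sketch below, not taken from the book, illustrates the workflow it describes: split a series in time order, produce a baseline seasonal-naive forecast, and evaluate accuracy with mean absolute error. The synthetic series is invented for illustration.

```python
# Hedged sketch: time-ordered split, seasonal-naive baseline, MAE evaluation.
import numpy as np

rng = np.random.default_rng(0)
season = 12
t = np.arange(120)
series = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / season) + rng.normal(0, 0.3, t.size)

train, test = series[:-season], series[-season:]   # hold out the last season

# Seasonal-naive forecast: repeat the last observed season.
forecast = train[-season:]

mae = np.mean(np.abs(test - forecast))
print(f"seasonal-naive MAE: {mae:.3f}")
```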
A field manual on contextualizing cyber threats, vulnerabilities, and risks to connected cars through penetration testing and risk assessment. Hacking Connected Cars deconstructs the tactics, techniques, and procedures (TTPs) used to hack into connected cars and autonomous vehicles to help you identify and mitigate vulnerabilities affecting cyber-physical vehicles. Written by a veteran of risk management and penetration testing of IoT devices and connected cars, this book provides a detailed account of how to perform penetration testing, threat modeling, and risk assessments of telematics control units and infotainment systems. It demonstrates how vulnerabilities in wireless networking, Bluetooth, and GSM can be exploited to affect the confidentiality, integrity, and availability of connected cars. Passenger vehicles have experienced a massive increase in connectivity over the past five years, and the trend will only continue to grow with the expansion of the Internet of Things and increasing consumer demand for always-on connectivity. Manufacturers and OEMs need the ability to push updates without requiring service visits, but this leaves the vehicle's systems open to attack. This book examines the issues in depth, providing cutting-edge preventative tactics that security practitioners, researchers, and vendors can use to keep connected cars safe without sacrificing connectivity. Readers learn to perform penetration testing of infotainment systems and telematics control units through a step-by-step methodical guide; analyze risk levels surrounding vulnerabilities and threats that impact confidentiality, integrity, and availability; and conduct penetration testing using the same tactics, techniques, and procedures used by hackers. From relatively small features such as automatic parallel parking to completely autonomous self-driving cars, all connected systems are vulnerable to attack. As connectivity becomes a way of life, the need for security expertise for in-vehicle systems is becoming increasingly urgent. Hacking Connected Cars provides practical, comprehensive guidance for keeping these vehicles secure.
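As a minimal sketch of the kind of risk rating such assessments produce (not the book's own methodology), each finding can be given a likelihood and an impact score whose product is bucketed into a qualitative risk level. The thresholds and findings below are illustrative only.

```python
# Hedged sketch: a simple likelihood-times-impact risk rating; values are invented.
def risk_level(likelihood: int, impact: int) -> str:
    """likelihood and impact are rated 1 (low) to 5 (high)."""
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

findings = [
    ("unauthenticated Bluetooth pairing on infotainment unit", 4, 4),
    ("debug serial port exposed on telematics control unit", 2, 5),
]
for name, likelihood, impact in findings:
    print(f"{name}: {risk_level(likelihood, impact)}")
```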
This book discusses the development of a theory of info-statics as a sub-theory of the general theory of information. It describes the factors required to establish a definition of the concept of information that fixes the applicable boundaries of the phenomenon of information, its linguistic structure and scientific applications. The book establishes the definitional foundations of information and how the concepts of uncertainty, data, fact, evidence and evidential things are sequential derivatives of information as the primary category, which is a property of matter and energy. The sub-definitions are extended to include the concepts of possibility, probability, expectation, anticipation, surprise, discounting, forecasting, prediction and the nature of past-present-future information structures. It shows that the factors required to define the concept of information are those that allow differences and similarities to be established among universal objects over the ontological and epistemological spaces in terms of varieties and identities. These factors are characteristic and signal dispositions on the basis of which general definitional foundations are developed to construct the general information definition (GID). The book then demonstrates that this definition is applicable to all types of information over the ontological and epistemological spaces. It also defines the concepts of uncertainty, data, fact, evidence and knowledge based on the GID. Lastly, it uses set-theoretic analytics to enhance the definitional foundations, and shows the value of the theory of info-statics to establish varieties and categorial varieties at every point of time and thus initializes the construct of the theory of info-dynamics.
Today's information technology and security networks demand increasingly complex algorithms and cryptographic systems. Individuals implementing security policies for their companies must utilize technical skill and information technology knowledge to implement these security mechanisms. Cryptography & Security Devices: Mechanisms & Applications addresses cryptography from the perspective of the security services and mechanisms available to implement those services, discussing issues such as e-mail security, public-key architecture, virtual private networks, Web services security, wireless security, and the confidentiality and integrity of security services. This book provides scholars and practitioners in the field of information assurance with a working knowledge of fundamental encryption algorithms and the systems supported in information technology and secure communication networks.
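A minimal sketch of one mechanism behind the integrity service mentioned above: a keyed hash (HMAC) lets a receiver detect tampering. It uses only the Python standard library; key management and the other services (confidentiality, authentication) are out of scope here.

```python
# Hedged sketch: message integrity with HMAC-SHA256 from the standard library.
import hmac
import hashlib

key = b"shared-secret-key"
message = b"transfer 100 to account 42"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                          # True
print(verify(key, b"transfer 100 to account 666", tag))   # False: tampered
```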
Data warehouses have captured the attention of practitioners and researchers alike. But the design and optimization of data warehouses remains an art rather than a science. This book presents the first comparative review of the state of the art and best current practice of data warehouses. It covers source and data integration, multidimensional aggregation, query optimization, update propagation, metadata management, quality assessment, and design optimization. Also, based on results of the European Data Warehouse Quality project, it offers a conceptual framework by which the architecture and quality of data warehouse efforts can be assessed and improved using enriched metadata management combined with advanced techniques from databases, business modeling, and artificial intelligence. For researchers and database professionals in academia and industry, the book offers an excellent introduction to the issues of quality and metadata usage in the context of data warehouses.
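The sketch below, independent of the book, illustrates the multidimensional aggregation that warehouse queries perform: sales facts are rolled up along chosen dimensions. The data is invented for illustration.

```python
# Hedged sketch: rolling up a fact table along dimension attributes.
from collections import defaultdict

facts = [
    {"region": "EU", "product": "widget", "amount": 120},
    {"region": "EU", "product": "gadget", "amount": 80},
    {"region": "US", "product": "widget", "amount": 200},
]

def roll_up(facts, dims):
    """Aggregate the 'amount' measure over the given dimension attributes."""
    totals = defaultdict(float)
    for row in facts:
        key = tuple(row[d] for d in dims)
        totals[key] += row["amount"]
    return dict(totals)

print(roll_up(facts, ["region"]))              # totals per region
print(roll_up(facts, ["region", "product"]))   # finer-grained cube cells
```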
Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and edited to present a coherent and comprehensive, yet not redundant, practically oriented introduction.
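As a minimal sketch of the model building the book describes, the snippet below fits a single linear neuron by gradient descent on a toy regression problem using only NumPy; it is illustrative, not an example from the book.

```python
# Hedged sketch: one neuron trained by gradient descent on mean squared error.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5        # target the neuron should learn

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(500):
    pred = X @ w + b                           # linear neuron (identity activation)
    err = pred - y
    w -= lr * (X.T @ err) / len(y)             # gradient of the mean squared error
    b -= lr * err.mean()

print(np.round(w, 2), round(b, 2))             # close to [3. -2.] and 0.5
```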
This book constitutes the thoroughly refereed post-conference proceedings of the 11th IFIP WG 6.11 Conference on e-Business, e-Services and e-Society, I3E 2011, held in Kaunas, Lithuania, in October 2011. The 25 revised papers presented were carefully reviewed and selected from numerous submissions. They are organized in the following topical sections: e-government and e-governance, e-services, digital goods and products, e-business process modeling and re-engineering, innovative e-business models and implementation, e-health and e-education, and innovative e-business models.
The explosion of computer use and internet communication has placed new emphasis on the ability to store, retrieve and search for all types of images, both still photos and video. The success and the future of visual information retrieval depend on the cutting-edge research and applications explored in this book, which combines expertise from both computer vision and database research.
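A minimal sketch of one classic content-based retrieval idea this field builds on: describe each image by an intensity histogram and rank candidates by histogram intersection. The random arrays below stand in for real pixel data.

```python
# Hedged sketch: ranking images by histogram intersection with the query.
import numpy as np

rng = np.random.default_rng(2)

def histogram(image, bins=16):
    h, _ = np.histogram(image, bins=bins, range=(0, 256))
    return h / h.sum()                   # normalise so images of any size compare

def intersection(h1, h2):
    return np.minimum(h1, h2).sum()      # 1.0 means identical distributions

query = rng.integers(0, 256, size=(64, 64))
database = {name: rng.integers(0, 256, size=(64, 64)) for name in ["a", "b", "c"]}

scores = {name: intersection(histogram(query), histogram(img))
          for name, img in database.items()}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```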
The central purpose of this collection of essays is to make a creative addition to the debates surrounding the cultural heritage domain. In the 21st century the world faces epochal changes which affect every part of society, including the arenas in which cultural heritage is made, held, collected, curated, exhibited, or simply exists. The book is about these changes; about the decentring of culture and cultural heritage away from institutional structures towards the individual; about the questions which the advent of digital technologies is demanding that we ask and answer in relation to how we understand, collect and make available Europe's cultural heritage. Cultural heritage has enormous potential in terms of its contribution to improving the quality of life for people, understanding the past, assisting territorial cohesion, driving economic growth, opening up employment opportunities and supporting wider developments such as improvements in education and in artistic careers. Given that spectrum of possible benefits to society, the range of studies that follow here are intended to be a resource and stimulus to help inform not just professionals in the sector but all those with an interest in cultural heritage.
Virtually all nontrivial, modern service-related problems and systems involve data volumes and types that clearly fall into what is presently meant by "big data": they are huge, heterogeneous, complex, distributed, etc. Data mining is a series of processes that include collecting and accumulating data, modeling phenomena, and discovering new information, and it is one of the most important steps in the scientific analysis of service processes. Applying data mining to services requires a thorough understanding of the characteristics of each service and knowledge of the compatibility of data mining technology within each particular service, rather than knowledge only of calculation speed and prediction accuracy. The varied examples of services provided in this book will help readers understand the relation between services and data mining technology. This book is intended to stimulate interest among researchers and practitioners in the relation between data mining technology and its application to other fields.
Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Properties of Social Networks, Algorithms for Structural Discovery of Social Networks and Content Analysis in Social Networks. This book is also unique in focussing on the data analytical aspects of social networks in the internet scenario, rather than the traditional sociology-driven emphasis prevalent in the existing books, which do not focus on the unique data-intensive characteristics of online social networks. Emphasis is placed on simplifying the content so that students and practitioners benefit from this book. This book targets advanced level students and researchers concentrating on computer science as a secondary text or reference book. Data mining, database, information security, electronic commerce and machine learning professionals will find this book a valuable asset, as well as primary associations such as ACM, IEEE and Management Science.
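A minimal sketch, not from the book, of one structural property such analytics compute: the degree distribution of a social graph, derived in a single pass over an edge stream so the full graph never needs to be materialised.

```python
# Hedged sketch: per-user degree and degree distribution from a stream of edges.
from collections import Counter

edge_stream = [("alice", "bob"), ("bob", "carol"), ("alice", "carol"),
               ("dave", "alice")]

degree = Counter()
for u, v in edge_stream:
    degree[u] += 1
    degree[v] += 1

distribution = Counter(degree.values())   # degree -> number of users with that degree
print(dict(degree))
print(dict(distribution))
```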
This book inclusively and systematically presents the fundamental methods, models and techniques of practical application of grey data analysis, bringing together the authors' many years of theoretical exploration, real-life application, and teaching. It also reflects the majority of recent theoretical and applied advances in the theory achieved by scholars from across the world, providing readers a vivid overall picture of this new theory and its pioneering research activities. The book includes 12 chapters, covering the introduction to grey systems, a novel framework of grey system theory, grey numbers and their operations, sequence operators and grey data mining, grey incidence analysis models, grey clustering evaluation models, series of GM models, combined grey models, techniques for grey systems forecasting, grey models for decision-making, techniques for grey control, etc. It also includes a software package that allows practitioners to conveniently and practically employ the theory and methods presented in this book. All methods and models presented here were chosen for their practical applicability and have been widely employed in various research works. I still remember 1983, when I first participated in a course on Grey System Theory. The mimeographed teaching materials had a blue cover and were presented as a book. It was like finding a treasure: This fascinating book really inspired me as a young intellectual going through a period of confusion and lack of academic direction. It shone with pearls of wisdom and offered a beacon in the mist for a man trying to find his way in academic research. This book became the guiding light in my life journey, inspiring me to forge an indissoluble bond with Grey System Theory. --Sifeng Liu
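As a minimal sketch of one member of the GM model family mentioned above, the snippet below fits a GM(1,1) forecasting model by ordinary least squares; the input series is invented, and the formulation follows the standard accumulated-generation construction rather than any specific listing from the book or its software package.

```python
# Hedged sketch: GM(1,1) one-step-ahead forecast on a short, invented series.
import numpy as np

x0 = np.array([100.0, 104.0, 109.0, 115.0, 122.0])   # original series
x1 = np.cumsum(x0)                                    # accumulated (AGO) series
z1 = 0.5 * (x1[1:] + x1[:-1])                         # background values

B = np.column_stack([-z1, np.ones_like(z1)])
Y = x0[1:]
(a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)        # development and control coefficients

def x1_hat(k):
    """Time response of the accumulated series at step k (k = 0 gives x0[0])."""
    return (x0[0] - b / a) * np.exp(-a * k) + b / a

n = len(x0)
forecast = x1_hat(n) - x1_hat(n - 1)                  # difference back to the original scale
print(round(float(forecast), 2))
```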
This book gathers authoritative contributions in the field of Soft Computing. Based on selected papers presented at the 7th World Conference on Soft Computing, which was held on May 29-31, 2018, in Baku, Azerbaijan, it describes new theoretical advances, as well as cutting-edge methods and applications. New theories and algorithms in fuzzy logic, cognitive modeling, graph theory and metaheuristics are discussed, and applications in data mining, social networks, control and robotics, geoscience, biomedicine and industrial management are described. This book offers a timely, broad snapshot of recent developments, including thought-provoking trends and challenges that are yielding new research directions in the diverse areas of Soft Computing.
This book gathers high-quality papers presented at the International Conference on Smart Trends for Information Technology and Computer Communications (SmartCom 2020), organized by the Global Knowledge Research Foundation (GR Foundation) from 23 to 24 January 2020. It covers the state-of-the-art and emerging topics in information, computer communications, and effective strategies for their use in engineering and managerial applications. It also explores and discusses the latest technological advances in, and future directions for, information and knowledge computing and its applications.
Explains processes and scenarios (process chains) for planning with SAP characteristics, using the latest releases of SAP R/3 and APO (Advanced Planning & Optimization software). The levels of scenario, process and function are explained from the business case down to the implementation level, and the relations between these levels are consistently pointed out throughout the book. Many illustrations help readers understand the interdependencies between scenario, process and function. The book aims to help avoid costly dead ends and to secure a smooth implementation and management of supply chains.
Cultural forces govern a synergistic relationship among information institutions that shapes their roles collectively and individually. Cultural synergy is the combination of perception- and behavior-shaping knowledge within, between, and among groups. Our hyperlinked era makes information-sharing among institutions critically important for scholarship as well as for the advancement of humankind. Information institutions are those that have, or share in, the mission to preserve, conserve, and disseminate information objects and their informative content. A central idea is the notion of social epistemology that information institutions arise culturally from social forces of the cultures they inhabit, and that their purpose is to disseminate that culture. All information institutions are alike in critical ways. Intersecting lines of cultural mission are trajectories for synergy for allowing us to perceive the universe of information institutions as interconnected and evolving and moving forward in distinct ways for the improvement of the condition of humankind through the building up of its knowledge base and of its information-sharing processes. This book is an exploration of the cultural synergy that can be realized by seeing commonalities among information institutions (sometimes also called cultural heritage institutions): museums, libraries, and archives. The hyperlinked era of the Semantic Web makes information sharing among institutions critically important for scholarship as well as the advancement of mankind. The book addresses the origins of cultural information institutions, the history of the professions that run them, and the social imperative of information organization as a catalyst for semantic synergy.
This proceedings book presents selected papers from the 4th Conference on Signal and Information Processing, Networking and Computers (ICSINC), held in Qingdao, China on May 23-25, 2018. It focuses on current research in a wide range of areas related to information theory, communication systems, computer science, signal processing, aerospace technologies, and other related technologies. With contributions from experts in both academia and industry, it is a valuable resource for anyone interested in this field.
This edited book first consolidates the results of the EU-funded EDISON project (Education for Data Intensive Science to Open New science frontiers), which developed training material and information to assist educators, trainers, employers, and research infrastructure managers in identifying, recruiting and inspiring the data science professionals of the future. It then deepens the presentation of the information and knowledge gained to allow for easier assimilation by the reader. The contributed chapters are presented in sequence, each chapter picking up from the end point of the previous one. After the initial book and project overview, the chapters present the relevant data science competencies and body of knowledge, the model curriculum required to teach the required foundations, profiles of professionals in this domain, and use cases and applications. The text is supported with appendices on related process models. The book can be used to develop new courses in data science, evaluate existing modules and courses, draft job descriptions, and plan and design efficient data-intensive research teams across scientific disciplines.
This book gathers visionary ideas from leading academics and scientists to predict the future of wireless communication and enabling technologies in 2050 and beyond. The content combines a wealth of illustrations, tables, business models, and novel approaches to the evolution of wireless communication. The book also provides glimpses into the future of emerging technologies, end-to-end systems, and entrepreneurial and business models, broadening readers' understanding of potential future advances in the field and their influence on society at large.
Cyberspace security is a critical subject of our times. On the one hand, the development of the Internet, mobile communications, distributed computing, and computer software and databases storing essential enterprise information has helped people conduct business and personal communication. On the other hand, it has created many opportunities for abuse, fraud and expensive damage. This book is a selection of the best papers presented at the NATO Advanced Research Workshop dealing with the subject of cyberspace security and defense. The level of the individual contributions in the volume is advanced and suitable for senior and graduate students, researchers and technologists who wish to get a feeling for the state of the art in several sub-disciplines of cyberspace security. Several papers provide a broad-brush description of national security issues and brief summaries of technology states. These papers can be read and appreciated by technically enlightened managers and executives who want to understand security issues and approaches to technical solutions. An important question of our times is not "Should we do something to enhance the security of our digital assets?"; the question is "How do we do it?"
"The Berkeley DB Book" is a practical guide to the intricacies of the Berkeley DB. This book covers in-depth the complex design issues that are mostly only touched on in terse footnotes within the dense Berkeley DB reference manual. It explains the technology at a higher level and also covers the internals, providing generous code and design examples. In this book, you will get to see a developer's perspective on intriguing design issues in Berkeley DB-based applications, and you will be able to choose design options for specific conditions. Also included is a special look at fault tolerance and high-availability frameworks. Berkeley DB is becoming the database of choice for large-scale applications like search engines and high-traffic web sites. |