The purpose of the 3rd International Conference on Enterprise Information Systems (ICEIS) was to bring together researchers, engineers, and practitioners interested in the advances and business applications of information systems. The research papers published here have been carefully selected from those presented at the conference, and focus on real world applications covering four main themes: database and information systems integration; artificial intelligence and decision support systems; information systems analysis and specification; and internet computing and electronic commerce. Audience: This book will be of interest to information technology professionals, especially those working on systems integration, databases, decision support systems, or electronic commerce. It will also be of use to middle managers who need to work with information systems and require knowledge of current trends in development methods and applications.
The volume contains the papers presented at the fifth working conference on Communications and Multimedia Security (CMS 2001), held on May 21-22, 2001 at (and organized by) GMD - the German National Research Center for Information Technology, Integrated Publication and Information Systems Institute (IPSI), in Darmstadt, Germany. The conference was arranged jointly by the Technical Committees 11 and 6 of the International Federation for Information Processing (IFIP). The name "Communications and Multimedia Security" was first used in 1995, when Reinhard Posch organized the first in this series of conferences in Graz, Austria, following up on the previously national (Austrian) "IT Sicherheit" conferences held in Klagenfurt (1993) and Vienna (1994). In 1996, CMS took place in Essen, Germany; in 1997 the conference moved to Athens, Greece. CMS 1999 was held in Leuven, Belgium. This conference provides a forum for presentations and discussions on issues which combine innovative research work with highly promising application potential in the area of security for communication and multimedia systems. State-of-the-art issues as well as practical experiences and new trends in these areas were topics of interest again, as at previous conferences. This year, the organizers wanted to focus attention on watermarking and copyright protection for e-commerce applications and multimedia data. The volume also encompasses work on recent advances in cryptography and their applications. In recent years, digital media data have gained enormously in importance.
As adoption of Electronic Health Record Systems (EHR-Ss) shifts from early adopters to mainstream, an increasingly large group of decision makers must assess what they want from EHR-Ss and how to go about making their choices. The purpose of this book is to inform that decision. This book explains typical needs of a variety of stakeholders, describes current and imminent technologies, and assesses the available evidence regarding issues in implementing and using EHR-Ss. Divided into four important sections--Needs, Current State, Technology, and Going Forward--the book provides the background and general notions regarding the EHRS and lays out the framework; delves into the historical review; presents a high-level view of EHR systems, focused on the needs of different stakeholders in health care and the health enterprise; offers practical views of existing systems and current (and short-term future) issues in specifying an EHR system and deciding how to approach the institution of such a system; deals with technology issues, from front- to back-end; and describes where we are and where we should be going with EHR systems. Designed for use by chief information officers, chief medical informatics officers, medical liaisons to hospital systems, private practitioners, and business managers at academic and non-academic hospitals, care management organizations, and practices. The book could be used in any medical or health informatics course, at any level (undergrad, fellowship, MBA).
Rules represent a simplified means of programming, congruent with our understanding of human brain constructs. With the advent of business rules management systems, it has been possible to introduce rule-based programming to nonprogrammers, allowing them to map expert intent into code in applications such as fraud detection, financial transactions, healthcare, retail, and marketing. However, a remaining concern is the quality, safety, and reliability of the resulting programs. This book is on business rules programs, that is, rule programs as handled in business rules management systems. Its conceptual contribution is to present the foundation for treating business rules as a topic of scientific investigation in semantics and program verification, while its technical contribution is to present an approach to the formal verification of business rules programs. The author proposes a method for proving correctness properties for a business rules program in a compositional way, meaning that the proof of a correctness property for a program is built up from correctness properties for the individual rules, thus bridging a gap between the intuitive understanding of rules and the formal semantics of rule programs. With this approach the author enables rule authors and tool developers to understand, express formally, and prove properties of the execution behavior of business rules programs. This work will be of interest to practitioners and researchers in the areas of program verification, enterprise computing, database management, and artificial intelligence.
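As an illustration of the kind of rule program the book studies, here is a minimal forward-chaining loop in Python. This is a sketch only: the function names and the discount rule are invented for this example and are not taken from the book. It shows the compositional idea: a property such as "a discount is granted only when the total exceeds 100" can be verified by checking each rule individually.

```python
def run_rules(facts, rules, max_steps=100):
    """Naive forward chaining: repeatedly fire the first rule whose
    condition holds on the working memory, until no rule applies."""
    for _ in range(max_steps):
        for condition, action in rules:
            if condition(facts):
                action(facts)
                break
        else:
            return facts  # fixpoint reached: no rule applies
    raise RuntimeError("rule program did not terminate")

# One hypothetical business rule: grant a 10% discount on large orders,
# but only once (the condition guards against re-firing).
rules = [
    (lambda f: f["total"] > 100 and "discount" not in f,
     lambda f: f.update(discount=0.10)),
]

result = run_rules({"total": 150.0}, rules)
# Compositional check: the program-level property holds because the
# single rule that writes "discount" only fires when total > 100.
assert "discount" in result and result["total"] > 100
```

Because the loop terminates only at a fixpoint, properties proved per rule carry over to the whole program, which is the intuition behind the compositional proof method the blurb describes.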
Integrity and Internal Control in Information Systems is a state-of-the-art book that establishes the basis for an ongoing dialogue between the IT security specialists and the internal control specialists so that both may work more effectively together to assist in creating effective business systems in the future. Building on the issues presented in the preceding volume of this series, this book seeks further answers to the following questions: What precisely do business managers need in order to have confidence in the integrity of their information systems and their data? What is the status quo of research and development in this area? Where are the gaps between business needs on the one hand and research/development on the other; what needs to be done to bridge these gaps? Integrity and Internal Control in Information Systems contains the selected proceedings of the Second Working Conference on Integrity and Internal Control in Information Systems, sponsored by the International Federation for Information Processing (IFIP) and held in Warrenton, Virginia, USA, in November 1998. It will be essential reading for academics and practitioners in computer science, information technology, business informatics, accountancy and EDP auditing.
This book thoroughly covers remote sensing visualization and analysis techniques based on computational imaging and vision in Earth science. Remote sensing is considered a significant information source for monitoring and mapping natural and man-made land cover, enabled by the sensor resolutions offered by different Earth observation platforms. The book includes related topics on the different systems, models, and approaches used in the visualization of remote sensing images. It offers flexible and sophisticated solutions for removing uncertainty from satellite data. It introduces real-time big data analytics to drive intelligent systems in enterprise Earth science applications. Furthermore, the book integrates statistical concepts with computer-based geographic information systems (GIS). It focuses on image processing techniques for observing data together with uncertainty information raised by the spectral, spatial, and positional accuracy of GPS data. The book addresses several advanced improvement models to guide engineers in developing different remote sensing visualization and analysis schemes. Highlights on the advanced improvement models of supervised/unsupervised classification algorithms, support vector machines, artificial neural networks, fuzzy logic, decision-making algorithms, and time series modeling and forecasting are addressed. This book guides engineers, designers, and researchers in exploiting the intrinsic design of remote sensing systems. The book gathers remarkable material from an international panel of experts to guide readers through the development of Earth big data analytics and its challenges.
The field of data mining has made significant and far-reaching advances over the past three decades. Because of its potential power for solving complex problems, data mining has been successfully applied to diverse areas such as business, engineering, social media, and biological science. Many of these applications search for patterns in complex structural information. In biomedicine, for example, modeling complex biological systems requires linking knowledge across many levels of science, from genes to disease. Further, the data characteristics of the problems have grown from static to dynamic and spatiotemporal, complete to incomplete, and centralized to distributed, and have grown in scope and size (this is known as "big data"). The effective integration of big data for decision-making also requires privacy preservation. The contributions to this monograph summarize the advances of data mining in the respective fields. This volume consists of nine chapters that address subjects ranging from mining opinion data, spatiotemporal databases, discriminative subgraph patterns, path knowledge discovery, social media, and privacy issues to computation reduction via binary matrix factorization.
This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, time series analysis methods and their application to flight data are illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.
The purpose of this book is to provide a record of the state of the art in Topic Detection and Tracking (TDT) in a single place. Research in TDT has been going on for about five years, and publications related to it are scattered all over the place as technical reports, unpublished manuscripts, or in numerous conference proceedings. The third and fourth in a series of ongoing TDT evaluations marked a turning point in the research. As such, it provides an excellent time to pause, review the state of the art, gather lessons learned, and describe the open challenges. This book is a collection of technical papers. As such, its primary audience is researchers interested in the current state of TDT research, researchers who hope to leverage that work so that their own efforts can avoid pointless duplication and false starts. It might also point them in the direction of interesting unsolved problems within the area. The book is also of interest to practitioners in fields that are related to TDT, e.g., Information Retrieval, Automatic Speech Recognition, Machine Learning, Information Extraction, and so on. In those cases, TDT may provide a rich application domain for their own research, or it might address similar enough problems that some lessons learned can be tweaked slightly to answer, perhaps partially, ...
Database and Application Security XV provides a forum for original research results, practical experiences, and innovative ideas in database and application security. With the rapid growth of large databases and the application systems that manage them, security issues have become a primary concern in business, industry, government and society. These concerns are compounded by the expanding use of the Internet and wireless communication technologies. This volume covers a wide variety of topics related to the security and privacy of information in systems and applications.
Database and Application Security XV contains papers, keynote addresses, and panel discussions from the Fifteenth Annual Working Conference on Database and Application Security, organized by the International Federation for Information Processing (IFIP) Working Group 11.3 and held July 15-18, 2001 in Niagara-on-the-Lake, Ontario, Canada.
This guide is for practicing statisticians and data scientists who use IBM SPSS for statistical analysis of big data in business and finance. This is the first of a two-part guide to SPSS for Windows, introducing data entry into SPSS, along with elementary statistical and graphical methods for summarizing and presenting data. Part I also covers the rudiments of hypothesis testing and business forecasting, while Part II will present multivariate statistical methods and more advanced forecasting methods. IBM SPSS Statistics offers a powerful set of statistical and information analysis systems that run on a wide variety of personal computers. The software is built around routines that have been developed, tested, and widely used for more than 20 years. As such, IBM SPSS Statistics is extensively used in industry, commerce, banking, local and national governments, and education. Just a small subset of users of the package include the major clearing banks, the BBC, British Gas, British Airways, British Telecom, the Consumer Association, Eurotunnel, GSK, TfL, the NHS, Shell, Unilever, and W.H.S. Although the emphasis in this guide is on applications of IBM SPSS Statistics, there is a need for users to be aware of the statistical assumptions and rationales underpinning correct and meaningful application of the techniques available in the package; therefore, such assumptions are discussed, and methods of assessing their validity are described. Also presented is the logic underlying the computation of the more commonly used test statistics in the area of hypothesis testing. Mathematical background is kept to a minimum.
Putting capability management into practice requires both a solid theoretical foundation and realistic approaches. This book introduces a development methodology that integrates business and information system development and run-time adjustment based on the concept of capability by presenting the main findings of the CaaS project - the Capability-Driven Development (CDD) methodology, the architecture and components of the CDD environment, examples of real-world applications of CDD, and aspects of CDD usage for creating business value and new opportunities. Capability thinking characterizes an organizational mindset, putting capabilities at the center of the business model and information systems development. It is expected to help organizations and in particular digital enterprises to increase flexibility and agility in adapting to changes in their economic and regulatory environments. Capability management denotes the principles of how capability thinking should be implemented in an organization, and the organizational means of doing so. This book is intended for anyone who wants to explore the opportunities for developing and managing context-dependent business capabilities and the supporting business services. It does not require a detailed understanding of specific development methods and tools, although some background knowledge and experience in information system development is advisable. The individual chapters have been written by leading researchers in the field of information systems development, enterprise modeling and capability management, as well as practitioners and industrial experts from these fields.
Great advances have been made in the database field. Relational and object-oriented databases, distributed and client/server databases, and large-scale data warehousing are among the more notable. However, none of these advances promises to have as great and direct an effect on the daily lives of ordinary citizens as video databases. Video databases will provide a quantum jump in our ability to deal with visual data, and in allowing people to access and manipulate visual information in ways hitherto thought impossible. Video Database Systems: Issues, Products and Applications gives practical information on academic research issues, commercial products that have already been developed, and the applications of the future driving this research and development. This book can also be considered a reference text for those entering the field of video or multimedia databases, as well as a reference for practitioners who want to identify the kinds of products needed in order to utilize video databases. Video Database Systems: Issues, Products and Applications covers concepts, products and applications. It is written at a level which is less detailed than that normally found in textbooks but more in-depth than that normally written in trade press or professional reference books. Thus, it seeks to serve both an academic and industrial audience by providing a single source of information about the research issues in the field, and the state-of-the-art of practice.
This book explores community dynamics within social media. Using Wikipedia as an example, the volume explores communities that rely upon commons-based peer production. Fundamental theoretical principles spanning such domains as organizational configurations, leadership roles, and social evolutionary theory are developed. In the context of Wikipedia, these theories explain how a functional elite of highly productive editors has emerged and why they are responsible for a majority of the content. It explains how the elite shapes the project and how this group tends to become stable and increasingly influential over time. Wikipedia has developed a new and resilient social hierarchy, an adhocracy, which combines features of traditional and new, online, social organizations. The book presents a set of practical approaches for using these theories in real-world practice. This work fundamentally changes the way we think about social media leadership and evolution, emphasizing the crucial contributions of leadership, of elite social roles, and of group global structure to the overall success and stability of large social media projects. Written in an accessible and direct style, the book will be of interest to academics as well as professionals with an interest in social media and commons-based peer production processes.
Steganography, a means by which two or more parties may communicate using "invisible" or "subliminal" communication, and watermarking, a means of hiding copyright data in images, are becoming necessary components of commercial multimedia applications that are subject to illegal use. This is a comprehensive survey of steganography and watermarking and their application to modern communications and multimedia. It helps the reader to understand steganography, the history of this previously neglected element of cryptography, the hurdles of international law on strong cryptographic techniques, and a description of the methods you can use to hide information in modern media. Included in this discussion is an overview of "steganalysis", methods which can be used to break steganographic communication. This resource also includes an introduction to and survey of watermarking methods, and discusses this method's similarities to and differences from steganography. The reader should gain a working knowledge of watermarking's pros and cons, and learn the legal implications of watermarking and copyright issues on the Internet.
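Hiding information in modern media typically exploits redundancy in the carrier. The classic textbook technique, sketched here in Python as an illustration rather than as any specific scheme from the book, overwrites the least-significant bit of each cover byte, which in an image changes each pixel value by at most one:

```python
def hide(cover_bytes, secret_bits):
    """Embed one secret bit into the least-significant bit (LSB)
    of each cover byte; the visible change per byte is at most 1."""
    if len(secret_bits) > len(cover_bytes):
        raise ValueError("cover too small for the secret")
    out = bytearray(cover_bytes)
    for i, bit in enumerate(secret_bits):
        out[i] = (out[i] & 0xFE) | (bit & 1)  # clear LSB, set to bit
    return bytes(out)

def reveal(stego_bytes, n_bits):
    """Recover the first n_bits embedded by hide()."""
    return [b & 1 for b in stego_bytes[:n_bits]]

cover = bytes([200, 201, 202, 203])   # e.g. four pixel intensities
stego = hide(cover, [1, 0, 1, 1])
assert reveal(stego, 4) == [1, 0, 1, 1]
```

Simple LSB embedding is also the easiest target for the steganalysis methods the book surveys, since it measurably disturbs the statistics of the LSB plane.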
In our increasingly mobile world the ability to access information on demand at any time and place can satisfy people's information needs as well as confer on them a competitive advantage. The emergence of battery-operated, low-cost and portable computers such as palmtops and PDAs, coupled with the availability and exploitation of wireless networks, has made ubiquitous computing possible. Through the wireless networks, portable equipment will become an integrated part of existing distributed computing environments, and mobile users can have access to data stored at information servers located at the static portion of the network even while they are on the move. Traditionally, information is retrieved following a request-response model. However, this model is no longer adequate in a wireless computing environment. First, the wireless channel is unreliable and the bandwidth is low compared to the wired counterpart. Second, the environment is essentially asymmetric, with a large number of mobile users accessing a small number of servers. Third, battery-operated portable devices can typically operate only for a short time because of the short battery lifespan. Thus, clients are expected to be disconnected most of the time. To overcome these limitations, there has been a proliferation of research efforts on designing data delivery mechanisms to support wireless computing more effectively. Data Dissemination in Wireless Computing Environments focuses on such mechanisms. The purpose is to provide a thorough and comprehensive review of recent advances on energy-efficient data delivery protocols, efficient wireless channel bandwidth utilization, reliable broadcasting and cache invalidation strategies for clients with long disconnection time. Besides surveying existing methods, this book also compares and evaluates some of the more promising schemes.
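The asymmetry described above is why broadcast-based delivery works well: the server pushes data on the downlink and clients simply listen, so channel cost is independent of the number of clients. A common construction in this literature is the broadcast disk, in which frequently requested ("hot") pages repeat in every minor cycle while rarely requested ("cold") pages appear once per major cycle. The following is a sketch of that idea under those assumptions; the names are invented, and real schemes tune the hot/cold ratio to access probabilities:

```python
def broadcast_disk(hot, cold, ratio):
    """Build one major broadcast cycle in which every hot page appears
    `ratio` times and every cold page appears exactly once.

    The cold list is split into `ratio` chunks; each minor cycle
    broadcasts all hot pages followed by one cold chunk."""
    size = -(-len(cold) // ratio)  # ceiling division: chunk length
    chunks = [cold[i * size:(i + 1) * size] for i in range(ratio)]
    cycle = []
    for chunk in chunks:
        cycle.extend(hot)    # hot pages: once per minor cycle
        cycle.extend(chunk)  # cold pages: once per major cycle
    return cycle

# Hot page 'h1' is broadcast twice as often as cold pages 'c1', 'c2'.
print(broadcast_disk(["h1"], ["c1", "c2"], 2))
```

A client waiting for a hot page therefore observes roughly half the average latency of one waiting for a cold page, at the cost of a longer overall cycle; this latency/bandwidth trade-off is exactly what the surveyed protocols optimize.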
This book presents the cyber culture of micro, macro, cosmological, and virtual computing. The book shows how these work to formulate, explain, and predict the current processes and phenomena monitoring and controlling technology in physical and virtual space. The authors posit a basic proposal to transform the description of a function truth table and a structure adjacency matrix into a qubit vector, focusing on memory-driven computing based on the performance of parallel logic operations. The authors offer a metric for the measurement of processes and phenomena in cyberspace, and also the architecture of logic associative computing for decision-making and big data analysis. The book outlines an innovative theory and practice of design, test, simulation, and diagnosis of digital systems based on the use of a qubit coverage-vector to describe the functional components and structures. The authors provide a description of the technology for SoC HDL-model diagnosis, based on a Test Assertion Blocks Activated Graph. Examples of cyber-physical systems for digital monitoring and cloud management of social objects and transport are proposed. A presented automaton model of cosmological computing explains the cyclical and harmonious evolution of matter-energy essence, and also the space-time form of the Universe.
Many real-time systems rely on static scheduling algorithms. This includes cyclic scheduling, rate monotonic scheduling and fixed schedules created by off-line scheduling techniques such as dynamic programming, heuristic search, and simulated annealing. However, for many real-time systems, static scheduling algorithms are quite restrictive and inflexible. For example, highly automated agile manufacturing, command, control and communications, and distributed real-time multimedia applications all operate over long lifetimes and in highly non-deterministic environments. Dynamic real-time scheduling algorithms are more appropriate for these systems and are used in such systems. Many of these algorithms are based on earliest deadline first (EDF) policies. There exists a wealth of literature on EDF-based scheduling with many extensions to deal with sophisticated issues such as precedence constraints, resource requirements, system overload, multi-processors, and distributed systems. Deadline Scheduling for Real-Time Systems: EDF and Related Algorithms aims at collecting a significant body of knowledge on EDF scheduling for real-time systems, but it does not try to be all-inclusive (the literature is too extensive). The book primarily presents the algorithms and associated analysis, but guidelines, rules, and implementation considerations are also discussed, especially for the more complicated situations where mathematical analysis is difficult. In general, it is very difficult to codify and taxonomize scheduling knowledge because there are many performance metrics, task characteristics, and system configurations. Also, adding to the complexity is the fact that a variety of algorithms have been designed for different combinations of these considerations. In spite of the recent advances there are still gaps in the solution space and there is a need to integrate the available solutions.
For example, a list of issues to consider includes: preemptive versus non-preemptive tasks, uni-processors versus multi-processors, using EDF at dispatch time versus EDF-based planning, precedence constraints among tasks, resource constraints, periodic versus aperiodic versus sporadic tasks, scheduling during overload, fault tolerance requirements, and providing guarantees and levels of guarantees (meeting quality of service requirements). Deadline Scheduling for Real-Time Systems: EDF and Related Algorithms should be of interest to researchers, real-time system designers, and instructors and students, either as a focussed course on deadline-based scheduling for real-time systems, or, more likely, as part of a more general course on real-time computing. The book serves as an invaluable reference in this fast-moving field.
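The core EDF policy itself is compact: at every dispatch point, run the ready task whose absolute deadline is nearest. The following minimal preemptive uniprocessor simulation is a sketch of that policy (not an algorithm reproduced from the book), using unit time slices and a heap keyed by deadline:

```python
import heapq

def edf_schedule(tasks, horizon):
    """Simulate preemptive earliest-deadline-first on one processor.

    tasks: list of (release_time, absolute_deadline, work_units),
           assumed sorted by release time; task ids in the returned
           timeline are indices into this list.
    Returns one entry per unit time slot: the id of the task that ran,
    or None if the processor was idle."""
    timeline = []
    ready = []  # min-heap of (deadline, task_id, remaining_work)
    i = 0
    for t in range(horizon):
        # Admit every task released by time t into the ready queue.
        while i < len(tasks) and tasks[i][0] <= t:
            release, deadline, work = tasks[i]
            heapq.heappush(ready, (deadline, i, work))
            i += 1
        if ready:
            deadline, tid, work = heapq.heappop(ready)
            timeline.append(tid)  # run nearest-deadline task one slot
            if work > 1:          # preemptible: requeue leftover work
                heapq.heappush(ready, (deadline, tid, work - 1))
        else:
            timeline.append(None)
    return timeline

# Task 1 (deadline 3) preempts task 0 (deadline 4) on arrival at t=1.
print(edf_schedule([(0, 4, 2), (1, 3, 1)], 4))  # [0, 1, 0, None]
```

The example shows the defining behavior: when the tighter-deadline task arrives at t=1 it preempts the running task, which resumes afterwards. The extensions the book covers (precedence, resources, overload) all complicate this basic dispatch rule.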
This book presents a state-of-the-art review of current perspectives on Communications and Multimedia Security. It contains the Proceedings of the 3rd Joint Working Conference of IFIP TC6 and TC11, arranged by the International Federation for Information Processing and held in Athens, Greece in September 1997. The book aims to cover the subject of Communications and Multimedia Systems Security as fully as possible. It constitutes essential reading for information technology security specialists; computer professionals; communication systems professionals; EDP managers; EDP auditors; managers, researchers and students working on the subject.
This book presents the combined peer-reviewed proceedings of the tenth International Symposium on Intelligent Distributed Computing (IDC'2016), which was held in Paris, France from October 10th to 12th, 2016. The 23 contributions address a range of topics related to theory and application of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.
Chaos-based cryptography, which has attracted many researchers in the past decade, is a research field spanning two areas: chaos (nonlinear dynamical systems) and cryptography (computer and data security). Chaos properties, such as randomness and ergodicity, have been proved to be suitable for designing means of data protection. The book gives a thorough description of chaos-based cryptography, consisting of basic chaos theory, chaos properties suitable for cryptography, chaos-based cryptographic techniques, and various secure applications based on chaos. Additionally, it covers both the latest research results and some open issues and hot topics. The book is a collection of high-quality chapters contributed by leading experts in the related fields. It embraces a wide variety of aspects of the related subject areas and provides a scientifically and scholarly sound treatment of state-of-the-art techniques to students, researchers, academics, law enforcement personnel and IT practitioners who are interested or involved in the study, research, use, design and development of techniques related to chaos-based cryptography.
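The link between chaotic maps and cryptography can be sketched with the logistic map x -> r*x*(1-x), whose orbit for r near 4 is sensitive to the initial condition and statistically noise-like. The following Python toy (an illustration of the general idea only, not a scheme from the book, and not secure as written) treats the initial condition and parameter as the key and derives an XOR keystream from the orbit:

```python
def logistic_keystream(x0, r, n, skip=100):
    """Generate n keystream bytes from the logistic map x -> r*x*(1-x).

    x0 in (0, 1) and r near 4.0 play the role of the secret key.
    The first `skip` iterations are discarded so the orbit moves
    away from the (key-revealing) transient."""
    x = x0
    for _ in range(skip):
        x = r * x * (1 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1 - x)
        stream.append(int(x * 256) % 256)  # quantize orbit to a byte
    return stream

def xor_cipher(data, x0, r):
    """Symmetric: the same call encrypts and decrypts."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

ciphertext = xor_cipher(b"chaos", 0.3141592, 3.9999)
assert xor_cipher(ciphertext, 0.3141592, 3.9999) == b"chaos"
```

Serious chaos-based designs of the kind the book surveys address the weaknesses this toy ignores: finite-precision orbits become periodic, naive quantization leaks state, and key/parameter spaces must be analyzed against the dedicated attacks in the literature.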
Requirements engineering has long acknowledged the importance of the notion that system requirements are stakeholder goals, rather than system functions, and ought to be elicited, modeled and analyzed accordingly. In this book, Nurcan and her co-editors collected twenty contributions from leading researchers in requirements engineering with the intention to comprehensively present an overview of the different perspectives that exist today, in 2010, on the concept of intention in the information systems community. These original papers honor Colette Rolland for her contributions to this field, as she was probably the first to emphasize that 'intention' has to be considered as a first-class concept in information systems engineering. Written by long-term collaborators (and most often friends) of Colette Rolland, this volume covers topics like goal-oriented requirements engineering, model-driven development, method engineering, and enterprise modeling. As such, it is a tour d'horizon of Colette Rolland's lifework, and is presented to her on the occasion of her retirement at CAiSE 2010 in Hammamet, the conference she once cofounded and which she helped to grow and prosper for more than 20 years.