This book illustrates the current work of leading multilevel modeling (MLM) researchers from around the world. The book's goal is to critically examine the real problems that occur when trying to use MLMs in applied research, such as power, experimental design, and model violations. This presentation of cutting-edge work and statistical innovations in multilevel modeling includes topics such as growth modeling, repeated measures analysis, nonlinear modeling, outlier detection, and meta-analysis. This volume will be beneficial for researchers with advanced statistical training and extensive experience in applying multilevel models, especially in the areas of education; clinical intervention; social, developmental and health psychology, and other behavioral sciences; or as a supplement for an introductory graduate-level course.
The new emphasis on physical security resulting from the terrorist threat has forced many information security professionals to struggle to maintain their organization's focus on protecting information assets. In order to command attention, they need to emphasize the broader role of information security in the strategy of their companies. Until now, however, most books about strategy and planning have focused on the production side of the business, rather than operations.
This book addresses and examines the impacts of applications and services for data management and analysis, such as infrastructure, platforms, software, and business processes, on both academia and industry. The chapters cover effective approaches in dealing with the inherent complexity and increasing demands of big data management from an applications perspective. Various case studies included have been reported by data analysis experts who work closely with their clients in such fields as education, banking, and telecommunications. Understanding how data management has been adapted to these applications will help students, instructors and professionals in the field. Application areas also include the fields of social network analysis, bioinformatics, and the oil and gas industries.
This book offers researchers an understanding of the fundamental issues and a good starting point for work in this rapidly expanding field. It provides a comprehensive survey of current developments in heterogeneous information networks, and presents the newest research on applying heterogeneous information networks to similarity search, ranking, clustering, and recommendation. This information will help researchers understand how to analyze networked data with heterogeneous information networks. Common data mining tasks are explored, including similarity search, ranking, and recommendation. The book illustrates prototypes that analyze networked data. Professionals and academics working in data analytics, networks, machine learning, and data mining will find this content valuable. It is also suitable for advanced-level students in computer science who are interested in networking or pattern recognition.
Data Mining and Multi-agent Integration aims to reflect state-of-the-art research and development in agent-mining interaction and integration (for short, agent mining). The book was motivated by increasing interest and work in agents and data mining, and vice versa. The interaction and integration come about from the intrinsic challenges faced by agent technology and data mining respectively; for instance, multi-agent systems face the problem of enhancing agent learning capability, and avoiding the uncertainty of self-organization and intelligence emergence. Data mining, if integrated into agent systems, can greatly enhance the learning skills of agents, and assist agents with prediction of future states, thus initiating follow-up action or intervention. The data mining community is now struggling with mining distributed, interactive and heterogeneous data sources. Agents can be used to manage such data sources for data access, monitoring, integration, and pattern merging from the infrastructure, gateway, message-passing and pattern-delivery perspectives. These two examples illustrate the potential of agent mining in handling challenges in the respective communities. There is an excellent opportunity to create innovative, dual agent-mining interaction and integration technology, tools and systems which will deliver results in one new technology.
As Information Technology becomes a vital part of our everyday activities, ranging from personal use to government and defense applications, the need to develop high-assurance systems increases. Data and applications security and privacy are crucial elements in developing such systems. Research Directions in Data and Applications Security XVIII presents original unpublished research results, practical experiences, and innovative ideas in the field of data and applications security and privacy. This book is the eighteenth volume in the series produced by the International Federation for Information Processing (IFIP) Working Group 11.3 on Data and Applications Security. It contains twenty-three papers and two invited talks that were presented at the Eighteenth Annual IFIP WG 11.3 Conference on Data and Applications Security, which was sponsored by IFIP and held in Sitges, Catalonia, Spain in July 2004. Research Directions in Data and Applications Security XVIII is a high-quality reference volume that addresses several aspects of information protection, and is aimed at researchers, educators, students, and developers.
The advent of the World Wide Web has changed the perspectives of groupware systems. Interest in and deployment of Internet and intranet groupware solutions are growing rapidly, not just in academic circles but also in the commercial arena. The first generation of Web-based groupware tools has already started to emerge, and leading groupware vendors are urgently adapting their products for compatibility and integration with Web technologies. The focus of Groupware and the World Wide Web is to explore the potential for Web-based groupware. This book includes an analysis of the key characteristics of the Web, presenting reasons for its success, and describes developments of a diverse range of Web-based groupware systems. An emphasis on the technical obstacles and challenges is complemented by more analytical discussions and perspectives, including that of Information Technology managers looking to deploy groupware solutions within their organizations. Written by experts from different backgrounds - academic and commercial, technical and organizational - this book provides a unique overview of and insight into current issues and future possibilities concerning extension of the World Wide Web for group working.
Advance Praise for Indian Mujahideen: Computational Analysis and Public Policy. "This book presents a highly innovative computational approach to analyzing the strategic behavior of terrorist groups and formulating counter-terrorism policies. It would be very useful for international security analysts and policymakers." Uzi Arad, National Security Advisor to the Prime Minister of Israel and Head, Israel National Security Council (2009-2011). "An important book on a complex security problem. Issues have been analysed in depth based on quality research. Insightful and well-balanced in describing the way forward." Naresh Chandra, Indian Ambassador to the USA (1996-2001) and Cabinet Secretary (1990-1992). "An objective and clinical account of the origins, aims, extra-territorial links and modus operandi of a growingly dangerous terrorist organization that challenges the federal, democratic, secular and pluralistic ethos of India's polity. The authors have meticulously researched and analysed the multi-faceted challenges that the Indian Mujahideen poses and realistically dwelt on the ways in which these challenges could be faced and overcome." G. Parthasarathy, High Commissioner of India to Australia (1995-1998) and Pakistan (1998-2000). This book provides the first in-depth look at how advanced mathematics and modern computing technology can influence insights on analysis and policies directed at the Indian Mujahideen (IM) terrorist group. The book also summarizes how the IM group is committed to the destabilization of India by leveraging links with other terror groups such as Lashkar-e-Taiba, and through support from the Pakistani Government and Pakistan's intelligence service. Foreword by The Hon. Louis J. Freeh.
This monograph gives a thorough treatment of the celebrated compositions of signature and encryption that allow for verifiability, that is, to efficiently prove properties about the encrypted data. This study is provided in the context of two cryptographic primitives: (1) designated confirmer signatures, an opaque signature which was introduced to control the proliferation of certified copies of documents, and (2) signcryption, a primitive that offers privacy and authenticity at once in an efficient way. This book is a useful resource to researchers in cryptology and information security, graduate and PhD students, and security professionals.
There are two groups of researchers who are interested in designing network protocols and who cannot (yet) effectively communicate with one another concerning these protocols. The first is the group of protocol verifiers, and the second is the group of protocol implementors. The main reason for the lack of effective communication between these two groups is that they use languages with quite different semantics to specify network protocols. On one hand, the protocol verifiers use specification languages whose semantics are abstract, coarse-grained, and with large atomicity. Clearly, protocol specifications that are developed based on such semantics are easier to prove correct. On the other hand, the protocol implementors use specification languages whose semantics are concrete, fine-grained, and with small atomicity. Protocol specifications that are developed based on such semantics are easier to implement using system programming languages such as C, C++, and Java. To help close this communication gap between the group of protocol verifiers and the group of protocol implementors, we present in this monograph a protocol specification language called the Timed Abstract Protocol (or TAP, for short) notation. This notation is greatly influenced by the Abstract Protocol Notation in the textbook Elements of Network Protocol Design, written by the second author, Mohamed G. Gouda. The TAP notation has two types of semantics: an abstract semantics that appeals to the protocol verifiers and a concrete semantics that appeals to the protocol implementors.
The use of information and communication technologies to support public administrations, governments and decision makers has been recorded for more than 20 years and dubbed e-Government. Moving towards open governance roadmaps worldwide, electronic participation and citizen engagement stand out as a new domain, important both for decision makers and citizens; and over the last decade, there have been a variety of related pilot projects and innovative approaches. With contributions from leading researchers, Charalabidis and Koussouris provide the latest research findings such as theoretical foundations, principles, methodologies, architectures, technical frameworks, cases and lessons learnt within the domain of open, collaborative governance and online citizen engagement. The book is divided into three sections: Section one, "Public Policy Debate Foundations," lays the foundations regarding processes and methods for scoping, planning, evaluating and transforming citizen engagement. The second section, "Information and Communication Technologies for Citizen Participation," details practical approaches to designing and creating collaborative governance infrastructures and citizen participation for businesses and administrations. Lastly, the third section on "Future Research Directions of Open, Collaborative ICT-enabled Governance" provides a constructive critique of the developments in the past and presents prospects regarding future challenges and research directions. The book is mainly written for academic researchers and graduate students working in the computer, social, political and management sciences. Its audience includes researchers and practitioners in e-Governance, public administration officials, policy and decision makers at the local, national and international level engaged in the design and creation of policies and services, and ICT professionals engaged in e-Governance and policy modelling projects and solutions.
Language, Compilers and Run-time Systems for Scalable Computers contains 20 articles based on presentations given at the third workshop of the same title, and 13 extended abstracts from the poster session. Starting with new developments in classical problems of parallel compiler design, such as dependence analysis and an exploration of loop parallelism, the book goes on to address the issues of compiler strategy for specific architectures and programming environments. Several chapters investigate support for multi-threading, object orientation, irregular computation, locality enhancement, and communication optimization. Issues of the interface between language and operating system support are also discussed. Finally, the load balance issues are discussed in different contexts, including sparse matrix computation and iteratively balanced adaptive solvers for partial differential equations. Some additional topics are also discussed in the extended abstracts. Each chapter provides a bibliography of relevant papers and the book can thus be used as a reference to the most up-to-date research in parallel software engineering.
This book:
- Explains the basic concepts of Python and its role in machine learning
- Provides comprehensive coverage of feature engineering, including real-time case studies
- Examines structural patterns with reference to data science, statistics, and analytics
- Includes machine learning-based structured exercises
- Covers different algorithmic concepts of machine learning, including unsupervised, supervised and reinforcement learning
Constraint databases provide extra expressive power over relational databases in a largely hidden way at the data-storage or physical level. Constraints, such as linear or polynomial equations, are used to represent large sets in a compact manner. They keep the view of the database for a user or application programmer almost as simple as in relational databases. "Introduction to Constraint Databases" comprehensively covers both constraint-database theory and several sample systems. The book reveals how constraint databases bring together techniques from a variety of fields, such as logic and model theory, algebraic and computational geometry, and symbolic computation, to the design and analysis of data models and query languages. Constraint databases are shown to be powerful and simple tools for data modeling and querying in application areas, such as environmental modeling, bioinformatics, and computer vision, that are not suitable for relational databases. Specific applications are examined in geographic information systems, spatiotemporal data management, linear programming, genome databases, model checking of automata, and other areas. Topics and features:
- Offers a database perspective and a focus on simplicity at the user level
- Utilizes simple tools for determining whether queries are safe or not
- Incorporates scientist-supplied descriptions of applications
- Explains constraint databases from a developer's viewpoint
- Provides extensive exercise sets, and sample software systems, that facilitate rapid learning of the topic within a real-world software context
This volume presents a comprehensive introduction to the theory and applications of constraint database systems, which provide new methods for the design of data models and query languages. It is an essential resource for advanced students, practitioners, and professionals in computer science, database systems, and information systems.
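The blurb's central idea, that a constraint such as a linear inequality finitely represents an infinite set of points, can be illustrated with a small sketch. The class and method names below are hypothetical, for illustration only, and are not taken from any particular constraint-database system.

```python
# A constraint "relation" stored as a symbolic predicate rather than an
# explicit (possibly infinite) set of tuples. Conjunction of constraints
# (here, intersection of regions) is again a finite representation.

class ConstraintRelation:
    def __init__(self, predicate):
        self.predicate = predicate  # callable (x, y) -> bool

    def contains(self, x, y):
        # Membership is decided by evaluating the constraint, not by lookup.
        return self.predicate(x, y)

    def intersect(self, other):
        # Combining two constraints stays constant-size regardless of the
        # (infinite) point sets they describe.
        return ConstraintRelation(
            lambda x, y: self.predicate(x, y) and other.predicate(x, y))

# The infinite half-plane x + y <= 10 stored in constant space:
half_plane = ConstraintRelation(lambda x, y: x + y <= 10)
first_quadrant = ConstraintRelation(lambda x, y: x >= 0 and y >= 0)
region = half_plane.intersect(first_quadrant)

print(region.contains(3, 4))   # True
print(region.contains(8, 5))   # False: 8 + 5 > 10
print(region.contains(-1, 2))  # False: outside the first quadrant
```

Real constraint-database systems go much further (quantifier elimination, closed query languages over linear or polynomial constraints), but the sketch shows why the user-level view stays as simple as a relational table.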
Digital Watermarking for Digital Media discusses the new aspects of digital watermarking in a worldwide context. Approached not only from the technical side, but the business and legal sides as well, this book discusses digital watermarking as it relates to many areas of digital media. Broad in its approach, Digital Watermarking for Digital Media provides a comprehensive overview not provided by any texts. Students in information technology, law, multimedia design, and economics will all find valuable material here. But this book is not limited to only students. Artists, composers, lawyers, and publishers will all find value in this digital watermarking book.
This book presents material which is more algorithmically oriented than most alternatives. It also deals with topics that are at or beyond the state of the art. Examples include practical and applicable wavelet and other multiresolution transform analysis. New areas are broached, like the ridgelet and curvelet transforms. The reader will find in this book an engineering approach to the interpretation of scientific data. Compared to the 1st Edition, various additions have been made throughout, and the topics covered have been updated. The background or environment of this book's topics includes continuing interest in e-science and the virtual observatory, which are based on web-based and increasingly web-service-based science and engineering. Additional colleagues whom we would like to acknowledge in this 2nd edition include: Bedros Afeyan, Nabila Aghanim, Emmanuel Candès, David Donoho, Jalal Fadili, and Sandrine Pires. We would like to particularly acknowledge Olivier Forni, who contributed to the discussion on compression of hyperspectral data, Yassir Moudden on multiwavelength data analysis, and Vicent Martínez on the genus function. The cover image to this 2nd edition is from the Deep Impact project. It was taken approximately 8 minutes after impact on 4 July 2005 with the CLEAR6 filter and deconvolved using the Richardson-Lucy method. We thank Don Lindler, Ivo Busko, Mike A'Hearn and the Deep Impact team for the processing of this image and for providing it to us.
Every second, users produce large amounts of image data from medical and satellite imaging systems. Image mining techniques that are capable of extracting useful information from image data are becoming increasingly useful, especially in medicine and the health sciences. Biomedical Image Analysis and Mining Techniques for Improved Health Outcomes addresses major techniques regarding image processing as a tool for disease identification and diagnosis, as well as treatment recommendation. Highlighting current research intended to advance the medical field, this publication is essential for use by researchers, advanced-level students, academicians, medical professionals, and technology developers. An essential addition to the reference material available in the field of medicine, this timely publication covers a range of applied research on data mining, image processing, computational simulation, data visualization, and image retrieval.
Based on interdisciplinary research into "Directional Change", a new data-driven approach to financial data analysis, Detecting Regime Change in Computational Finance: Data Science, Machine Learning and Algorithmic Trading applies machine learning to financial market monitoring and algorithmic trading. Directional Change is a new way of summarising price changes in the market. Instead of sampling prices at fixed intervals (such as daily closing in time series), it samples prices when the market changes direction ("zigzags"). By sampling data in a different way, this book lays out concepts which enable the extraction of information that other market participants may not be able to see. The book includes a Foreword by Richard Olsen and explores the following topics:
- Data science: as an alternative to time series, price movements in a market can be summarised as directional changes
- Machine learning for regime change detection: historical regime changes in a market can be discovered by a Hidden Markov Model
- Regime characterisation: normal and abnormal regimes in historical data can be characterised using indicators defined under Directional Change
- Market monitoring: by using historical characteristics of normal and abnormal regimes, one can monitor the market to detect whether the market regime has changed
- Algorithmic trading: regime tracking information can help us to design trading algorithms
It will be of great interest to researchers in computational finance, machine learning and data science. About the Authors: Jun Chen received his PhD in computational finance from the Centre for Computational Finance and Economic Agents, University of Essex in 2019. Edward P K Tsang is an Emeritus Professor at the University of Essex, where he co-founded the Centre for Computational Finance and Economic Agents in 2002.
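The event-driven sampling described above can be sketched in a few lines: instead of recording prices at fixed times, record an event whenever the price reverses by at least a chosen threshold from its most recent extreme. This is a minimal illustrative sketch, not the book's reference implementation; the function name, event format, and the threshold value are assumptions.

```python
# Directional Change sampling: record an event whenever price reverses
# by at least `threshold` (a fraction, e.g. 0.02 = 2%) from its most
# recent extreme, rather than sampling at fixed time intervals.

def directional_changes(prices, threshold=0.02):
    """Return a list of (index, price, new_direction) change events."""
    events = []
    if not prices:
        return events
    mode = "up"          # assume an initial uptrend; the first confirmed
    extreme = prices[0]  # reversal corrects this if the assumption is wrong
    for i, p in enumerate(prices):
        if mode == "up":
            if p > extreme:
                extreme = p  # new running maximum
            elif p <= extreme * (1 - threshold):
                events.append((i, p, "down"))  # downturn confirmed
                mode, extreme = "down", p
        else:  # mode == "down"
            if p < extreme:
                extreme = p  # new running minimum
            elif p >= extreme * (1 + threshold):
                events.append((i, p, "up"))    # upturn confirmed
                mode, extreme = "up", p
    return events

# A quiet drift produces no events; only genuine reversals are sampled.
prices = [100, 101, 102, 100, 99, 101.5]
print(directional_changes(prices, threshold=0.02))
# -> [(4, 99, 'down'), (5, 101.5, 'up')]
```

The key design point, as the blurb notes, is that event density tracks market activity: a volatile regime yields many events per day while a calm one may yield none, which is precisely the signal the book's regime-detection methods exploit.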
This book deals with Invitations to Tender (ITTs) for the provision of Facility Management (FM) services. It presents a framework to support companies in preparing clear, comprehensive and effective ITTs, focusing on such key aspects as: organizational structures, tools and procedures for managing information, allocation of information responsibilities, procedures for services monitoring and control, quality policies, and risk management. It discusses and analyzes a range of basic terms and concepts, procedures, and international standards concerning the Tendering Process, as well as the contents of ITTs, which should represent the translation of information needs into requirements related to: the client's goals, main categories of information to deal with, expected organization of information, modalities of reporting and control, and level of knowledge to be reached. A further major focus is on potential key innovation scenarios concerning current FM practice, such as Sustainable Procurement, Building Information Modeling (BIM), Big Data and Internet of Things (IoT) technologies, highlighting both the possible benefits and the possible risks and implications that could negatively affect the quality of FM service provision if not properly treated within the ITT. The book will be of interest to real estate owners, demand organizations and facility managers, enhancing their ability to prepare, interpret and/or critically analyze ITTs.
Web mining is moving the World Wide Web toward a more useful environment in which users can quickly and easily find the information they need. Web mining uses document content, hyperlink structure, and usage statistics to help users locate the information they need. This book provides a record of current research and practical applications in Web searching. It includes techniques that will improve the utilization of the Web through the design of Websites, as well as the design and application of search agents. This book presents this research and related applications in a manner that encourages additional work toward reducing the information overload that is so common today in Web search results.
This text introduces the concepts of information warfare from a non-military, organizational perspective. It is designed to stimulate managers to develop policies, strategies, and tactics for the aggressive use and defence of their data and knowledge base. The book covers the full gamut of information warfare subjects, from the direct attack on computer systems to the more subtle psychological technique of perception management. It provides the framework needed to build management strategies in this area. The topics covered include the basics of information warfare, corporate intelligence systems, the use of deception, security of systems, modes of attack, a methodology to develop defensive measures, plus specific issues associated with information warfare.
This volume explores the diverse applications of advanced tools and technologies of the emerging field of big data and their evidential value in business. It examines the role of analytics tools and methods of using big data in strengthening businesses to meet today's information challenges and shows how businesses can adapt big data for effective business practices. This volume shows how big data and the use of data analytics is being effectively adopted more frequently, especially in companies that are looking for new methods to develop smarter capabilities and tackle challenges in dynamic processes. Many illustrative case studies are presented that highlight how companies in every sector are now focusing on harnessing data to create a new way of doing business.