Data mapping in a data warehouse is the process of creating links
between the tables and attributes of two distinct data models
(source and target). Data mapping is required at many stages of the
DW life cycle to help save processor overhead, and every stage has
its own unique requirements and challenges. Therefore, many data
warehouse professionals want to learn data mapping in order to move
from an ETL (extract, transform, and load) developer role to a data
modeler role. Data Mapping for Data Warehouse Design provides basic
and advanced knowledge of business intelligence and data warehouse
concepts, including real-life scenarios that apply the standard
techniques to projects across various domains.
After reading this book, readers will understand the importance of
data mapping across the data warehouse life cycle.
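The source-to-target link described above can be sketched as a simple mapping specification. This is a minimal illustration, not the book's method; the table and column names below are invented for the example:

```python
# A hypothetical source-to-target data mapping, of the kind designed
# before building ETL for a data warehouse. Each entry links a source
# table/column to its target dimension table/column, with an optional
# transformation rule. All names here are invented for illustration.
mapping = [
    {"source": "crm.customers.cust_name",
     "target": "dw.dim_customer.customer_name",
     "transform": str.strip},               # trim stray whitespace
    {"source": "crm.customers.signup_dt",
     "target": "dw.dim_customer.signup_date",
     "transform": lambda v: v[:10]},        # keep only YYYY-MM-DD
]

def apply_mapping(row, mapping):
    """Translate one source row (dict keyed by source path) into a
    target row (dict keyed by target path)."""
    out = {}
    for m in mapping:
        out[m["target"]] = m["transform"](row[m["source"]])
    return out

row = {"crm.customers.cust_name": "  Ada Lovelace ",
       "crm.customers.signup_dt": "2023-05-01T09:30:00"}
print(apply_mapping(row, mapping))
```

In practice such a specification is usually kept as a document or metadata table rather than code, but the structure is the same: one row per target attribute, tracing it back to its source and transformation rule.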
ClearRevise is all about making your revision easy. At the end of
the course, doing practice papers is useful, but an exam tutor can
make a big difference. This book provides support from both
angles and will really help you to ace the exam. The first section
is your exam tutor. It shows you example questions with model
answers. Just like a tutor, it gives you exam tips and lets you
know what the examiner is looking for. Secondly, you are given
similar questions from the same topic to have a go at, applying
your knowledge and tips. With over 400 marks in this section and
all the answers provided, you'll easily revise the topics as you
go. Lastly, there are two complete exam papers
written in the same style as the live OCR papers to try. They're
exactly the same length and marks as the real exam, providing a
realistic experience and a great opportunity to show how much
you've progressed.
Data Simplification: Taming Information With Open Source Tools
addresses the simple fact that modern data is too big and complex
to analyze in its native form. Data simplification is the process
whereby large and complex data is rendered usable. Complex data
must be simplified before it can be analyzed, but the process of
data simplification is anything but simple, requiring a specialized
set of skills and tools. This book provides data scientists from
every scientific discipline with the methods and tools to simplify
their data for immediate analysis or long-term storage in a form
that can be readily repurposed or integrated with other data.
Drawing upon years of practical experience, and using numerous
examples and use cases, Jules Berman discusses: the principles,
methods, and tools that must be studied and mastered to achieve
data simplification; open source tools, free utilities, and
snippets of code that can be reused and repurposed to simplify
data; natural language processing and machine translation as tools
to simplify data; and data summarization and visualization and the
role they play in making data useful for the end user.
The effective application of knowledge management principles has
proven to be beneficial for modern organizations. When utilized in
the academic community, these frameworks can enhance the value and
quality of research initiatives. Enhancing Academic Research With
Knowledge Management Principles is a pivotal reference source for
the latest research on implementing theoretical frameworks of
information management in the context of academia and universities.
Featuring extensive coverage of relevant areas such as data mining
and organizational and academic culture, this publication is an ideal
resource for researchers, academics, practitioners, professionals,
and students.
The world is witnessing the growth of a global movement facilitated
by technology and social media. Fueled by information, this
movement contains enormous potential to create more accountable,
efficient, responsive, and effective governments and businesses, as
well as to spur economic growth. Big Data Governance and
Perspectives in Knowledge Management is a collection of innovative
research on the methods and applications of applying robust
processes around data, and aligning organizations and skillsets
around those processes. Highlighting a range of topics including
data analytics, prediction analysis, and software development, this
book is ideally designed for academicians, researchers, information
science professionals, software developers, computer engineers,
graduate-level computer science students, policymakers, and
managers seeking current research on the convergence of big data
and information governance as two major trends in information
management.
Faced with the exponential development of Big Data and both its
legal and economic repercussions, we are still slightly in the dark
concerning the use of digital information. In the perpetual balance
between confidentiality and transparency, this data will lead us to
call into question how we understand certain paradigms, such as the
Hippocratic Oath in medicine. As a consequence, a reflection on the
study of the risks associated with the ethical issues surrounding
the design and manipulation of this "massive data" seems to be
essential. This book gives direction and ethical grounding to the
use of these significant volumes of data. It proposes an ethical
analysis model and recommendations to better keep this data in check. This
empirical and ethico-technical approach brings together the first
aspects of a moral framework directed toward thought, conscience
and the responsibility of citizens concerned by the use of data of
a personal nature.
With the emergence of the Java 3D API, the creation of high-quality
3D animated graphics for Java applications and applets becomes a
possibility. With numerous aspects of the business, science,
medical, and educational fields implementing this technology, the
need for familiarity with Java 3D grows. Interactive Web-Based
Virtual Reality with Java 3D provides both novice and advanced
programmers with comprehensive, detailed coverage of all of the
important issues in Java 3D. This essential book delivers
illustrations of essential keywords, syntax, and methods to provide
an easy-to-read learning experience for the reader.
Advances in Computers carries on a tradition of excellence,
presenting detailed coverage of innovations in computer hardware,
software, theory, design, and applications. The book provides
contributors with a medium in which they can explore their subjects
in greater depth and breadth than journal articles typically allow.
The articles included in this book will become standard references,
with lasting value in this rapidly expanding field.
The development of artificial intelligence (AI) involves the
creation of computer systems that can perform tasks that would
ordinarily require human intelligence, such as visual perception,
speech recognition, decision-making, and language translation.
Through increasingly complex programming approaches, it has been
transforming and advancing the discipline of computer science.
Artificial Intelligence Methods and Applications in Computer
Engineering illuminates how today's computer engineers and
scientists can use AI in real-world applications. It focuses on a
few current and emergent AI applications, allowing a more in-depth
discussion of each topic. Covering topics such as biomedical
research applications, navigation systems, and search engines, this
premier reference source is an excellent resource for computer
scientists, computer engineers, IT managers, students and educators
of higher education, librarians, researchers, and academicians.
Vehicular traffic congestion and accidents remain universal issues
in today's world. Due to the continued growth in the use of
vehicles, optimizing traffic management operations is an immense
challenge. To reduce the number of traffic accidents, improve the
performance of transportation systems, enhance road safety, and
protect the environment, vehicular ad-hoc networks have been
introduced. Current developments in wireless communication,
computing paradigms, big data, and cloud computing enable the
enhancement of these networks, equipped with wireless communication
capabilities and high-performance processing tools. Cloud-Based Big
Data Analytics in Vehicular Ad-Hoc Networks is a pivotal reference
source that provides vital research on cloud and data analytic
applications in intelligent transportation systems. While
highlighting topics such as location routing, accident detection,
and data warehousing, this publication addresses future challenges
in vehicular ad-hoc networks and presents viable solutions. This
book is ideally designed for researchers, computer scientists,
engineers, automobile industry professionals, IT practitioners,
academicians, and students seeking current research on cloud
computing models in vehicular networks.
Emerging Trends in Applications and Infrastructures for
Computational Biology, Bioinformatics, and Systems Biology: Systems
and Applications covers the latest trends in the field with special
emphasis on their applications. The first part covers the major
areas of computational biology, development and application of
data-analytical and theoretical methods, mathematical modeling, and
computational simulation techniques for the study of biological and
behavioral systems. The second part covers bioinformatics, an
interdisciplinary field concerned with methods for storing,
retrieving, organizing, and analyzing biological data. The book
also explores the software tools used to generate useful biological
knowledge. The third part, on systems biology, explores how to
obtain, integrate, and analyze complex datasets from multiple
experimental sources using interdisciplinary tools and techniques,
with the final section focusing on big data and the collection of
datasets so large and complex that it becomes difficult to process
using conventional database management systems or traditional data
processing applications.
Data is powerful. It separates leaders from laggards and it drives
business disruption, transformation, and reinvention. Today's most
progressive companies are using the power of data to propel their
industries into new areas of innovation, specialization, and
optimization. The horsepower of new tools and technologies has
provided more opportunities than ever to harness, integrate, and
interact with massive amounts of disparate data for business
insights and value - something that will only continue in the era
of the Internet of Things. And, as a new breed of tech-savvy and
digitally native knowledge workers rise to the ranks of data
scientist and visual analyst, the needs and demands of the people
working with data are changing, too. The world of data is changing
fast. And, it's becoming more visual. Visual insights are becoming
increasingly dominant in information management, and with the
reinvigorated role of data visualization, this imperative is a
driving force to creating a visual culture of data discovery. The
traditional standards of data visualizations are making way for
richer, more robust and more advanced visualizations and new ways
of seeing and interacting with data. Data visualization is a
critical tool for exploring and understanding bigger, more diverse,
and more dynamic data. By understanding and embracing our human
hardwiring for visual communication and storytelling, and by
properly incorporating key design principles and evolving best
practices, we take the next step forward: transforming data
visualizations from tools into unique visual information assets.
The development of new and effective analytical and numerical
models is essential to understanding the performance of a variety
of structures. As computational methods continue to advance, so too
do their applications in structural performance modeling and
analysis. Modeling and Simulation Techniques in Structural
Engineering presents emerging research on computational techniques
and applications within the field of structural engineering. This
timely publication features practical applications as well as new
research insights and is ideally designed for use by engineers, IT
professionals, researchers, and graduate-level students.
The highly dynamic world of information technology service
management stresses the benefits of the quick and correct
implementation of IT services. A disciplined approach relies on a
different set of assumptions and principles than an agile approach,
and both have complicated implementation processes as well as
copious benefits. Combining these two approaches to enhance the
effectiveness of each, while difficult, can yield exceptional
dividends. Balancing Agile and Disciplined Engineering and
Management Approaches for IT Services and Software Products is an
essential publication that focuses on clarifying theoretical
foundations of balanced design methods with conceptual frameworks
and empirical cases. Highlighting a broad range of topics including
business trends, IT service, and software development, this book is
ideally designed for software engineers, software developers,
programmers, information technology professionals, researchers,
academicians, and students.
The WWW era made billions of people dramatically dependent on the
progress of data technologies, out of which Internet search and Big
Data are arguably the most notable. The Structured Search paradigm
connects them via the fundamental concept of key-objects, evolving
out of keywords as the units of search. The key-object data model
and KeySQL revamp the data independence principle, making it
applicable to Big Data, and complement NoSQL with full-blown
structured querying functionality. The ultimate goal is extracting
Big Information from Big Data. As a Big Data consultant, Mikhail
Gilula combines academic background with 20 years of industry
experience in the database and data warehousing technologies
working as a Sr. Data Architect for Teradata, Alcatel-Lucent, and
PayPal, among others. He has authored three books, including The
Set Model for Database and Information Systems and holds four US
Patents in Structured Search and Data Integration.
Intelligent Security Systems: Dramatically improve your
cybersecurity using AI and machine learning. In Intelligent Security
Systems, distinguished professor and computer scientist Dr. Leon
Reznik delivers an expert synthesis of artificial intelligence,
machine learning and data science techniques, applied to computer
security to assist readers in hardening their computer systems
against threats. Emphasizing practical and actionable strategies
that can be immediately implemented by industry professionals and
computer device owners, the author explains how to install and
harden firewalls, intrusion detection systems, attack recognition
tools, and malware protection systems. He also explains how to
recognize and counter common hacking activities. This book bridges
the gap between cybersecurity education and new data science
programs, discussing how cutting-edge artificial intelligence and
machine learning techniques can work for and against cybersecurity
efforts. Intelligent Security Systems includes supplementary
resources on an author-hosted website, such as classroom
presentation slides; sample review, test, and exam questions; and
practice exercises that make the material practical and useful.
The book also offers:
- A thorough introduction to computer security, artificial
intelligence, and machine learning, including basic definitions and
concepts like threats, vulnerabilities, risks, attacks, protection,
and tools
- An exploration of firewall design and implementation, including
firewall types and models, typical designs and configurations, and
their limitations and problems
- Discussions of intrusion detection systems (IDS), including
architecture topologies, components, operational ranges,
classification approaches, and machine learning techniques in IDS
design
- A treatment of malware and vulnerability detection and
protection, including malware classes, history, and development
trends
Perfect for undergraduate and graduate students in computer
security, computer science and engineering, Intelligent Security
Systems will also earn a place in the libraries of students and
educators in information technology and data science, as well as
professionals working in those fields.