Quality assurance is essential to the success of corporations
worldwide. Consistent quality requirements across
organizations of similar types ensure that these requirements can
be accurately and easily evaluated. Shaping the Future Through
Standardization is an essential scholarly book that examines
quality and standardization within diverse organizations globally
with a special focus on future perspectives, including how
standards and standardization may shape the future. Featuring a
wide range of topics such as economics, pedagogy, and management,
this book is ideal for academicians, researchers, decision makers,
policymakers, managers, corporate professionals, and students.
Over the last two decades, imbalanced data learning has emerged as
a prominent research area. Many critical
real-world application areas like finance, health, network, news,
online advertisement, social network media, and weather have
imbalanced data, which emphasizes the research necessity for
real-time implications of precise fraud/defaulter detection, rare
disease/reaction prediction, network intrusion detection, fake news
detection, fraud advertisement detection, cyber bullying
identification, disaster event prediction, and more. Machine
learning algorithms typically assume equally distributed, balanced
data and therefore produce results biased toward the majority
class. This is unacceptable, given that imbalanced data is
omnipresent in real-life scenarios, and it forces us to learn from
imbalanced data to design robust applications. Imbalanced data is
multifaceted and demands new perspectives: novel sampling
approaches to data preprocessing, active learning approaches, and
cost-perceptive approaches to resolving data imbalance. The
Handbook of Research on
Data Preprocessing, Active Learning, and Cost Perceptive Approaches
for Resolving Data Imbalance offers new aspects for imbalanced data
learning by providing the advancements of the traditional methods,
with respect to big data, through case studies and research from
experts in academia, engineering, and industry. The chapters
provide theoretical frameworks and the latest empirical research
findings that help to improve the understanding of the impact of
imbalanced data and its resolving techniques based on data
preprocessing, active learning, and cost perceptive approaches.
This book is ideal for data scientists, data analysts, engineers,
practitioners, researchers, academicians, and students looking for
more information on imbalanced data characteristics and solutions
using varied approaches.
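The core idea behind cost-perceptive (cost-sensitive) approaches can be sketched with class weighting: rare classes receive larger weights so a weighted training loss is not dominated by the majority class. The sketch below is a minimal illustration, not taken from the handbook; the weighting convention n_samples / (n_classes * class_count) is one common choice, and the function name is hypothetical.

```python
from collections import Counter

def balanced_class_weights(labels):
    """Per-class weights inversely proportional to class frequency.

    A common convention: weight(c) = n_samples / (n_classes * count(c)),
    so minority classes get larger weights and contribute more to a
    weighted training loss.
    """
    counts = Counter(labels)
    n_samples = len(labels)
    n_classes = len(counts)
    return {c: n_samples / (n_classes * k) for c, k in counts.items()}

# Example: 90 majority-class samples vs. 10 minority-class samples.
labels = [0] * 90 + [1] * 10
weights = balanced_class_weights(labels)
print(weights)  # minority class 1 is weighted 9x heavier than class 0
```

Sampling approaches attack the same imbalance from the data side instead, by over-sampling the minority class or under-sampling the majority class before training.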
Many approaches have sprouted from artificial intelligence (AI) and
produced major breakthroughs in the computer science and
engineering industries. Deep learning is a method that is
transforming the world of data and analytics. Optimization of this
new approach is still unclear, however, and there is a need for
research on the various applications and techniques of deep
learning in the field of computing. Deep Learning Techniques and
Optimization Strategies in Big Data Analytics is a collection of
innovative research on the methods and applications of deep
learning strategies in the fields of computer science and
information systems. While highlighting topics including data
integration, computational modeling, and scheduling systems, this
book is ideally designed for engineers, IT specialists, data
analysts, data scientists, researchers, academicians, and students
seeking current research on deep learning methods and their
application in the digital industry.
Pultrusion: State-of-the-Art Process Models with Applications,
Second Edition is a detailed guide to pultrusion, providing
methodical coverage of process models and computation simulation,
governing principles and science, and key challenges to help
readers enable process optimization and scale-up. This new edition
has been revised and expanded to include the latest advances,
state-of-the-art process models, and governing principles. The main
challenges in pultrusion, such as process-induced residual
stresses, shape distortions, thermal history, species conversion,
phase changes, impregnation of the reinforcements, and pulling
force, are described, with related examples provided. Moreover,
strategies for achieving a reliable and optimized process using
probabilistic approaches and optimization algorithms are
summarized. Another focus of this book is on the thermo-chemical
and mechanical analyses of the pultrusion process for industrial
profiles.
It is crucial that forensic science meet challenges such as
identifying hidden patterns in data, validating results for
accuracy, and understanding varied criminal activities in order to
remain authoritative and uphold justice and public safety.
Artificial intelligence, with its subsets of machine learning and
deep learning, has the potential to transform the domain of
forensic science by handling diverse data, recognizing patterns,
and analyzing, interpreting, and presenting results. Machine
learning and deep learning frameworks, with well-developed
mathematical and computational tools, help investigators produce
reliable results. Further study on the potential uses of
these technologies is required to better understand their benefits.
Aiding Forensic Investigation Through Deep Learning and Machine
Learning Frameworks provides an outline of deep learning and
machine learning frameworks and methods for use in forensic science
to produce accurate and reliable results to aid investigation
processes. The book also considers the challenges, developments,
advancements, and emerging approaches of deep learning and machine
learning. Covering key topics such as biometrics, augmented
reality, and fraud investigation, this reference work is crucial
for forensic scientists, law enforcement, computer scientists,
researchers, scholars, academicians, practitioners, instructors,
and students.
Advances in digital technologies continue to impact all areas of
life, including the business sector. Digital transformation is
expected to usher in the digitalized economy and involves new
concepts and management tools that must be considered in the
context of management science and practice. For business leaders to
ensure their companies remain competitive and relevant, it is
essential for them to utilize these innovative technologies and
strategies. The Handbook of Research on Digital Transformation
Management and Tools highlights new digital concepts within
management, such as digitalization and digital disruption, and
addresses the paradigm shift in management science driven by the
digital transformation towards the digitalized economy. Covering a
range of important topics such as cultural economy, online consumer
behavior, sustainability, and social media, this major reference
work is crucial for managers, business owners, researchers,
scholars, academicians, practitioners, instructors, and students.
In this technological age, the information technology (IT) industry
is an important facet of society and business. The IT industry is
able to become more efficient and successful through the
examination of its structure and a deeper understanding of the
individuals who work in the field. Multidisciplinary Perspectives
on Human Capital and Information Technology Professionals is a
critical scholarly resource that focuses on IT as an industry and
examines it from an array of academic viewpoints. Featuring
coverage on a wide range of topics, such as employee online
communities, role stress, and competence frameworks, this book is
targeted toward academicians, students, and researchers seeking
relevant research on IT as an industry.
Due to the increasing availability of affordable internet services,
the number of users, and the need for a wider range of
multimedia-based applications, internet usage is on the rise. With
so many users and such a large amount of data, the need to analyze
large data sets drives further advancements in information
processing. Big Data Processing with
Hadoop is an essential reference source that discusses possible
solutions for millions of users working with a variety of data
applications, who expect fast turnaround responses, but encounter
issues with processing data at the rate it comes in. Featuring
research on topics such as market basket analytics, scheduler load
simulation, and writing YARN applications, this book is ideally
designed for IoT professionals, students, and engineers seeking
coverage on many of the real-world challenges regarding big data.
Changing business environments and information technology
advancements have fundamentally reshaped the traditional information
landscape in our contemporary society, urging companies to seek
innovative ways to diffuse and manage assets on a global scale. It
is crucial for society to understand the new methodologies and
common practices that organizations can utilize to leverage their
knowledge into practice. Global Information Diffusion and
Management in Contemporary Society is an essential reference source
featuring research on the development and implementation of
contemporary global information management initiatives in
organizations. Including coverage on a multitude of topics such as
data security, global manufacturing, and information governance,
this book explores the importance of information management in a
global context. This book is ideally designed for managers,
information systems specialists, professionals, researchers, and
administrators seeking current research on the theories and
applications of global information management.
With technology creating a more competitive market, the global
economy has been continually evolving in recent years. These
technological developments have drastically changed the ways
organizations manage their resources, as they are constantly
seeking innovative methods to implement new systems. Because of
this, there is an urgent need for empirical research that studies
advancing theories and applications that organizations can use to
successfully handle information and supplies. Novel Theories and
Applications of Global Information Resource Management is a pivotal
reference source that provides vital research on developing
practices for businesses to effectively manage their assets on a
global scale. While highlighting topics such as enterprise systems,
library management, and information security, this publication
explores the implementation of technological innovation into
business techniques as well as the methods of controlling
information in a contemporary society. This book is ideally
designed for brokers, accountants, marketers, researchers, data
scientists, financiers, managers, and academicians seeking current
research on global resource management.
The idea of this book grew out of a symposium that was held at
Stony Brook in September 2012 in celebration of David S. Warren's
fundamental contributions to Computer Science and the area of Logic
Programming in particular. Logic Programming (LP) is at the nexus
of Knowledge Representation, Artificial Intelligence, Mathematical
Logic, Databases, and Programming Languages. It is fascinating and
intellectually stimulating due to the fundamental interplay among
theory, systems, and applications brought about by logic. Logic
programs are more declarative in the sense that they strive to be
logical specifications of "what" to do rather than "how" to do it,
and thus they are high-level and easier to understand and maintain.
Yet, without being given an actual algorithm, LP systems implement
the logical specifications automatically. Several books cover the
basics of LP but focus mostly on the Prolog language with its
incomplete control strategy and non-logical features. At the same
time, there is generally a lack of accessible yet comprehensive
collections of articles covering the key aspects in declarative LP.
These aspects include, among others, well-founded vs. stable model
semantics for negation, constraints, object-oriented LP, updates,
probabilistic LP, and evaluation methods, including top-down vs.
bottom-up, and tabling. For systems, the situation is even less
satisfactory, lacking accessible literature that can help train the
new crop of developers, practitioners, and researchers. There are a
few guides on Warren's Abstract Machine (WAM), which underlies most
implementations of Prolog, but very little exists on what is needed
for constructing a state-of-the-art declarative LP inference
engine. Contrast this with the literature on, say, Compilers, where
one can first study a book on the general principles and algorithms
and then dive into the particulars of a specific compiler. Such
resources greatly facilitate the ability to start making meaningful
contributions quickly. There is also a dearth of articles about
systems that support truly declarative languages, especially those
that tie into first-order logic, mathematical programming, and
constraint solving. LP helps solve challenging problems in a wide
range of application areas, but in-depth analysis of their
connection with LP language abstractions and LP implementation
methods is lacking. Also, rare are surveys of challenging
application areas of LP, such as Bioinformatics, Natural Language
Processing, Verification, and Planning. The goal of this book is to
help fill in the previously mentioned void in the LP literature. It
offers a number of overviews on key aspects of LP that are suitable
for researchers and practitioners as well as graduate students.
The following chapters cover the theory, systems, and applications
of LP.
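The declarative "what, not how" character of LP, and the bottom-up evaluation style contrasted with top-down above, can be illustrated with a toy fixpoint computation. The sketch below is not drawn from the book; it is a minimal Python rendering of the classic Datalog ancestor rules, with hypothetical names.

```python
def bottom_up_ancestor(parent_facts):
    """Naive bottom-up evaluation of the Datalog-style rules:

        ancestor(X, Y) :- parent(X, Y).
        ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).

    The rules state *what* an ancestor is; the engine below supplies
    the *how*: apply the rules repeatedly until no new facts are
    derived (a fixpoint).
    """
    ancestor = set(parent_facts)  # base rule: every parent is an ancestor
    changed = True
    while changed:
        changed = False
        for (x, y) in parent_facts:
            for (y2, z) in list(ancestor):
                if y == y2 and (x, z) not in ancestor:
                    ancestor.add((x, z))  # recursive rule fires
                    changed = True
    return ancestor

facts = {("ann", "bob"), ("bob", "carl"), ("carl", "dana")}
print(sorted(bottom_up_ancestor(facts)))  # full transitive closure
```

A Prolog system would answer the same queries top-down, resolving goals against the rules on demand; tabling, discussed in this book, memoizes such goals so that recursive programs like this one terminate.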
Competition in today's global market offers strong motivation for
the development of sophisticated tools within computer science. The
neuron multi-functional technology platform is a developing field
of study concerned with the various interactive approaches that
can be applied within it. As advancing technologies
continue to emerge, managers and researchers need a compilation of
research that discusses the advancements and specific
implementations of these intelligent approaches with this platform.
Avatar-Based Control, Estimation, Communications, and Development
of Neuron Multi-Functional Technology Platforms is a pivotal
reference source that provides vital research on the application of
artificial and natural approaches towards neuron-based programs.
While highlighting topics such as natural intelligence,
neurolinguistics, and smart data storage, this publication presents
techniques, case studies, and methodologies that combine
intelligent artificial and natural approaches with optimization
techniques for solving problems, and that pair many types of
hardware and software with a variety of communication technologies
to enable the development of innovative applications. This book is
ideally
designed for researchers, practitioners, scientists, field experts,
professors, and students seeking current research on the
optimization of avatar-based advancements in multifaceted
technology systems.