Big Data Analytics for Internet of Things
Discover the latest developments in IoT big data with a new resource from established and emerging leaders in the field. Big Data Analytics for Internet of Things delivers a comprehensive overview of all aspects of big data analytics in Internet of Things (IoT) systems. The book includes discussions of the enabling technologies of IoT data analytics, types of IoT data analytics, challenges in IoT data analytics, demand for IoT data analytics, computing platforms, analytical tools, privacy, and security. The distinguished editors have included resources that address key techniques in the analysis of IoT data. The book demonstrates how to select the appropriate techniques to unearth valuable insights from IoT data and offers novel designs for IoT systems. With an abiding focus on practical strategies with concrete applications for data analysts and IoT professionals, Big Data Analytics for Internet of Things also offers readers:
- A thorough introduction to the Internet of Things, including IoT architectures, enabling technologies, and applications
- An exploration of the intersection between the Internet of Things and big data, including IoT as a source of big data and the unique characteristics of IoT data
- A discussion of IoT data analytics, including the data analytical requirements of IoT data and the types of IoT analytics, including predictive, descriptive, and prescriptive analytics
- A treatment of machine learning techniques for IoT data analytics
Perfect for professionals, industry practitioners, and researchers engaged in big data analytics related to IoT systems, Big Data Analytics for Internet of Things will also earn a place in the libraries of IoT designers and manufacturers interested in facilitating the efficient implementation of data analytics strategies.
Multifunctional Nanocomposites for Targeted Drug Delivery in Cancer Therapy explores the design, synthesis, and application of different multifunctional nanocomposite drug delivery systems for cancer treatment. It opens with introductory chapters on cancer, followed by chapters providing detailed information about novel drug delivery systems for the treatment of cancers at several organ sites, such as the prostate, skin, breast, lung, liver, pancreas, stomach, colon, blood, mouth, and throat. It is a valuable resource for cancer researchers, oncologists, graduate students, and members of the biomedical research community who need to understand more about novel nanotechnologies applied to cancer treatment.
Handbook of Medical Image Computing and Computer Assisted
Intervention presents important advanced methods and state-of-the-art research in medical image computing and computer assisted
intervention, providing a comprehensive reference on current
technical approaches and solutions, while also offering proven
algorithms for a variety of essential medical imaging applications.
This book is written primarily for university researchers, graduate
students and professional practitioners (assuming an elementary
level of linear algebra, probability and statistics, and signal
processing) working on medical image computing and computer
assisted intervention.
Quality assurance is essential to ensuring the success of corporations worldwide. Consistent quality requirements across
organizations of similar types ensure that these requirements can
be accurately and easily evaluated. Shaping the Future Through
Standardization is an essential scholarly book that examines
quality and standardization within diverse organizations globally
with a special focus on future perspectives, including how
standards and standardization may shape the future. Featuring a
wide range of topics such as economics, pedagogy, and management,
this book is ideal for academicians, researchers, decision makers,
policymakers, managers, corporate professionals, and students.
Medical and information communication technology professionals are
working to develop robust classification techniques, especially in
healthcare data/image analysis, to ensure quick diagnoses and
treatment for patients. Without fast, immediate access to healthcare databases and information, medical professionals' success rates and treatment options become severely limited. Advanced Classification Techniques for
Healthcare Analysis provides emerging insight into classification
techniques in delivering quality, accurate, and affordable
healthcare, while also discussing the impact health data has on
medical treatments. Featuring coverage on a broad range of topics
such as early diagnosis, brain-computer interface, metaheuristic
algorithms, clustering techniques, learning schemes, and mobile
telemedicine, this book is ideal for medical professionals,
healthcare administrators, engineers, researchers, academicians,
and technology developers seeking current research on furthering
information and communication technology that improves patient
care.
Software engineering has emerged as an industrial field that is continually evolving due to advancing technologies and innovative methodologies. Scrum is the most recent revolution transforming traditional software processes, and it has researchers and practitioners scrambling to find the best techniques for implementation. The continued development of this
agile process requires an extensive level of research on up-to-date
findings and applicable practices. Agile Scrum Implementation and
Its Long-Term Impact on Organizations is a collection of innovative
research on the methods and applications of scrum practices in
developing agile software systems. The book combines perspectives
from both the academic and professional communities as the
challenges and solutions expressed by each group can create a
better understanding of how practice must be applied in the real
world of software development. While highlighting topics including
scrum adoption, iterative deployment, and human impacts, this book
is ideally designed for researchers, developers, engineers,
practitioners, academicians, programmers, students, and educators
seeking current research on practical improvements in agile
software progression using scrum methodologies.
Over the last two decades, researchers have come to regard imbalanced data learning as a prominent research area. Many critical real-world application areas, such as finance, health, networking, news, online advertisement, social media, and weather, involve imbalanced data, which underscores the need for research with real-time implications: precise fraud/defaulter detection, rare disease/reaction prediction, network intrusion detection, fake news detection, fraudulent advertisement detection, cyberbullying identification, disaster event prediction, and more. Machine learning algorithms assume equally distributed, balanced data and therefore produce results biased toward the majority class, which is not acceptable given that imbalanced data is omnipresent in real-life scenarios, forcing us to learn from imbalanced data for foolproof application design. Imbalanced data is multifaceted and demands a new perspective, using novel sampling approaches to data preprocessing, active learning approaches, and cost-perceptive approaches to resolve data imbalance. The Handbook of Research on
Data Preprocessing, Active Learning, and Cost Perceptive Approaches
for Resolving Data Imbalance offers new aspects for imbalanced data
learning by providing the advancements of the traditional methods,
with respect to big data, through case studies and research from
experts in academia, engineering, and industry. The chapters
provide theoretical frameworks and the latest empirical research
findings that help to improve the understanding of the impact of
imbalanced data and its resolving techniques based on data
preprocessing, active learning, and cost perceptive approaches.
This book is ideal for data scientists, data analysts, engineers,
practitioners, researchers, academicians, and students looking for
more information on imbalanced data characteristics and solutions
using varied approaches.
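The sampling approach to data preprocessing mentioned above can be sketched briefly. The following Python example is a minimal illustration only (the helper name and toy data are invented here, not taken from any chapter): it balances a binary dataset by random oversampling, replicating minority-class rows until both classes are the same size.

```python
import numpy as np

def random_oversample(X, y, seed=0):
    """Balance a dataset by duplicating rows of the smaller classes
    (random oversampling). Hypothetical helper, for illustration only."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = []
    for c in classes:
        c_idx = np.flatnonzero(y == c)
        # keep every original row, then add random duplicates
        # until this class reaches the majority-class size
        extra = rng.choice(c_idx, size=n_max - c_idx.size, replace=True)
        idx.append(np.concatenate([c_idx, extra]))
    idx = np.concatenate(idx)
    return X[idx], y[idx]

# toy imbalanced data: 8 majority rows, 2 minority rows
X = np.arange(20).reshape(10, 2)
y = np.array([0] * 8 + [1] * 2)
Xb, yb = random_oversample(X, y)
print(np.bincount(yb))  # both classes now have 8 rows
```

The cost-perceptive alternative the handbook also covers would instead leave the data unchanged and weight minority-class errors more heavily in the loss function.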
Many approaches have sprouted from artificial intelligence (AI) and
produced major breakthroughs in the computer science and
engineering industries. Deep learning is a method that is
transforming the world of data and analytics. Optimization of this
new approach is still unclear, however, and there's a need for
research on the various applications and techniques of deep
learning in the field of computing. Deep Learning Techniques and
Optimization Strategies in Big Data Analytics is a collection of
innovative research on the methods and applications of deep
learning strategies in the fields of computer science and
information systems. While highlighting topics including data
integration, computational modeling, and scheduling systems, this
book is ideally designed for engineers, IT specialists, data analysts, data scientists, researchers, academicians, and students seeking current research on deep learning methods and their application in the digital industry.
If you look around you will find that all computer systems, from
your portable devices to the strongest supercomputers, are
heterogeneous in nature. The most obvious heterogeneity is the
existence of computing nodes of different capabilities (e.g.
multicore, GPUs, FPGAs, ...). But there are also other
heterogeneity factors that exist in computing systems, like the
memory system components, interconnection, etc. The main reason for these different types of heterogeneity is to achieve good performance with power efficiency. Heterogeneous computing results in both
challenges and opportunities. This book discusses both. It shows
that we need to deal with these challenges at all levels of the
computing stack: from algorithms all the way to process technology.
We discuss the topic of heterogeneous computing from different
angles: hardware challenges, current hardware state-of-the-art,
software issues, how to make the best use of the current
heterogeneous systems, and what lies ahead. The aim of this book is
to introduce the big picture of heterogeneous computing. Whether
you are a hardware designer or a software developer, you need to
know how the pieces of the puzzle fit together. The main goal is to bring researchers and engineers to the forefront of research in the new era that started a few years ago and is
expected to continue for decades. We believe that academics,
researchers, practitioners, and students will benefit from this
book and will be prepared to tackle the big wave of heterogeneous
computing that is here to stay.
Technology is used in various forms within today’s modern market.
Businesses and companies, specifically, are beginning to manage
their effectiveness and performance using intelligent systems and
other modes of digitization. The rise of artificial intelligence
and automation has caused organizations to re-examine how they
utilize their personnel and how to train employees for new
skillsets using these technologies. These responsibilities fall on
the shoulders of human resources, creating a need for further
understanding of autonomous systems and their capabilities within
organizational progression. Transforming Human Resource Functions
With Automation is a collection of innovative research on the
methods and applications of artificial intelligence and autonomous systems within human resource management and the modern changes occurring in the field. While highlighting topics including cloud-based
systems, robotics, and social media, this book is ideally designed
for managers, practitioners, researchers, executives, policymakers,
strategists, academicians, and students seeking current research on
advancements within human resource strategies through the
implementation of information technology and automation.
Pultrusion: State-of-the-Art Process Models with Applications,
Second Edition is a detailed guide to pultrusion, providing
methodical coverage of process models and computation simulation,
governing principles and science, and key challenges to help
readers enable process optimization and scale-up. This new edition
has been revised and expanded to include the latest advances,
state-of-the-art process models, and governing principles. The main challenges in pultrusion, such as process-induced residual stresses, shape distortions, thermal history, species conversion, phase changes, impregnation of the reinforcements, and pulling force, are described, and related examples are provided. Moreover, strategies for achieving a reliable, optimized process using probabilistic approaches and optimization algorithms are summarized. Another focus of this book is on the thermo-chemical
and mechanical analyses of the pultrusion process for industrial
profiles.
Communication based on the internet of things (IoT) generates huge
amounts of data from sensors over time, which opens a wide range of
applications and areas for researchers. The application of
analytics, machine learning, and deep learning techniques over such
a large volume of data is a very challenging task. Therefore, it is
essential to find patterns, retrieve novel insights, and predict
future behavior using this large amount of sensory data. Artificial
intelligence (AI) has an important role in facilitating analytics and learning in IoT devices. Applying AI-Based IoT Systems to
Simulation-Based Information Retrieval provides relevant frameworks
and the latest empirical research findings in the area. It is ideal
for professionals who wish to improve their understanding of the
strategic role of trust at different levels of the information and
knowledge society and trust at the levels of the global economy,
networks and organizations, teams and work groups, information
systems, and individuals as actors in the networked environments.
Covering topics such as blockchain visualization, computer-aided
drug discovery, and health monitoring, this premier reference
source is an excellent resource for business leaders and
executives, IT managers, security professionals, data scientists,
students and faculty of higher education, librarians, hospital
administrators, researchers, and academicians.
It is crucial that forensic science meets challenges such as identifying hidden patterns in data, validating results for accuracy, and understanding varied criminal activities in order to remain authoritative and uphold justice and public safety. Artificial intelligence, with its subsets of machine learning and deep learning, has the potential to transform the domain of forensic science by handling diverse data, recognizing patterns, and analyzing, interpreting, and presenting results. Machine learning and deep learning frameworks, with well-developed mathematical and computational tools, enable investigators to produce reliable results. Further study on the potential uses of
these technologies is required to better understand their benefits.
Aiding Forensic Investigation Through Deep Learning and Machine
Learning Frameworks provides an outline of deep learning and
machine learning frameworks and methods for use in forensic science
to produce accurate and reliable results to aid investigation
processes. The book also considers the challenges, developments,
advancements, and emerging approaches of deep learning and machine
learning. Covering key topics such as biometrics, augmented
reality, and fraud investigation, this reference work is crucial
for forensic scientists, law enforcement, computer scientists,
researchers, scholars, academicians, practitioners, instructors,
and students.
Intelligent technologies have emerged as imperative tools in
computer science and information security. However, advanced computing practices have been followed by new methods of attack on the storage and transmission of data. Developing approaches such as image processing and pattern recognition are susceptible to security breaches. Modern protection methods for these
innovative techniques require additional research. The Handbook of
Research on Intelligent Data Processing and Information Security
Systems provides emerging research exploring the theoretical and
practical aspects of cyber protection and applications within
computer science and telecommunications. Special attention is paid
to data encryption, steganography, image processing, and
recognition, and it targets professionals who want to improve their
knowledge in order to increase strategic capabilities and
organizational effectiveness. As such, this book is ideal for
analysts, programmers, computer engineers, software engineers,
mathematicians, data scientists, developers, IT specialists,
academicians, researchers, and students within fields of
information technology, information security, robotics, artificial
intelligence, image processing, computer science, and
telecommunications.
Spatial Regression Analysis Using Eigenvector Spatial Filtering
provides theoretical foundations and guides practical
implementation of the Moran eigenvector spatial filtering (MESF)
technique. MESF is a novel and powerful spatial statistical
methodology that allows spatial scientists to account for spatial
autocorrelation in their georeferenced data analyses. Its appeal is
in its simplicity, yet its implementation drawbacks include serious
complexities associated with constructing an eigenvector spatial
filter. This book discusses MESF specifications for various
intermediate-level topics, including spatially varying coefficients
models, (non) linear mixed models, local spatial autocorrelation,
space-time models, and spatial interaction models. Spatial
Regression Analysis Using Eigenvector Spatial Filtering is
accompanied by sample R code and a Windows application with
illustrative datasets so that readers can replicate the examples in
the book and apply the methodology to their own application
projects. It also includes a Foreword by Pierre Legendre.
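As a rough sketch of the core construction behind MESF (rendered here in Python rather than the book's R code, with a small hypothetical weights matrix): the candidate eigenvectors come from the doubly centered spatial weights matrix MCM, where M = I - 11'/n, and each eigenvalue is proportional to its eigenvector's Moran coefficient.

```python
import numpy as np

# hypothetical spatial weights matrix C for a 4-region chain
# (region i neighbors region i+1); binary contiguity weights
C = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])

n = C.shape[0]
M = np.eye(n) - np.ones((n, n)) / n      # centering projector I - 11'/n

# Moran eigenvectors: eigenvectors of the doubly centered matrix M C M
evals, evecs = np.linalg.eigh(M @ C @ M)

# order from strongest positive to strongest negative spatial autocorrelation
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# candidate columns for a spatial filter: positively autocorrelated vectors
candidates = evecs[:, evals > 1e-9]
```

In practice a subset of the candidate vectors is then selected (for example by stepwise regression) and entered as covariates in the regression model; that selection step, and the complexities it entails, is what the book develops in detail.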
Brain-machine interfacing or brain-computer interfacing (BMI/BCI)
is an emerging and challenging technology used in engineering and
neuroscience. The ultimate goal is to provide a pathway from the
brain to the external world via mapping, assisting, augmenting or
repairing human cognitive or sensory-motor functions. In this book
an international panel of experts introduce signal processing and
machine learning techniques for BMI/BCI and outline their practical
and future applications in neuroscience, medicine, and
rehabilitation, with a focus on EEG-based BMI/BCI methods and
technologies. Topics covered include discriminative learning of EEG connectivity patterns; feature extraction from EEG recordings; EEG signal processing; transfer learning algorithms in BCI; convolutional neural networks for event-related potential detection; spatial filtering techniques for improving individual template-based SSVEP detection; feature extraction and classification algorithms for image RSVP-based BCI; decoding music perception and imagination using deep learning techniques; neurofeedback games using EEG-based brain-computer interface technology; affective computing systems; and more.
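To make one of the listed topics concrete, feature extraction from EEG recordings often begins with something as simple as band power. The Python sketch below is a minimal, hypothetical illustration (a synthetic sine signal standing in for real EEG, not any chapter's pipeline): it estimates average spectral power in the alpha (8-12 Hz) and beta (20-30 Hz) bands from an FFT periodogram.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Average periodogram power of `signal` within [lo, hi] Hz --
    a simple, common EEG feature. Illustrative helper, invented here."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

# synthetic 1-second "EEG" trace sampled at 250 Hz:
# a strong 10 Hz (alpha) component plus a weak 25 Hz (beta) component
fs = 250.0
t = np.arange(0, 1.0, 1.0 / fs)
x = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 25 * t)

alpha = band_power(x, fs, 8, 12)   # dominated by the 10 Hz component
beta = band_power(x, fs, 20, 30)
```

A classifier for a BCI would typically stack such band powers across channels and frequency bands into a feature vector, which is where the spatial filtering and learning methods surveyed in the book take over.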