Advances in Geophysics, Volume 61 - Machine Learning and Artificial
Intelligence in Geosciences, the latest release in this
highly-respected publication in the field of geophysics, contains
new chapters on a variety of topics, including a historical review
on the development of machine learning, machine learning to
investigate fault rupture on various scales, a review on machine
learning techniques to describe fractured media, signal
augmentation to improve the generalization of deep neural networks,
deep generator priors for Bayesian seismic inversion, as well as a
review on homogenization for seismology, and more.
Security in IoT Social Networks takes a deep dive into security
threats and risks, focusing on real-world social and financial
effects. Mining and analyzing vast networks is a vital
part of exploiting Big Data. This book provides insight into the
technological aspects of modeling, searching, and mining for
corresponding research issues, as well as designing and analyzing
models for resolving such challenges. The book will help start-ups
grow, providing research directions concerning security mechanisms
and protocols for social information networks. The book covers
structural analysis of large social information networks,
elucidating models and algorithms and their fundamental properties.
Moreover, this book includes smart solutions based on artificial
intelligence, machine learning, and deep learning for enhancing the
performance of social information network security protocols and
models. This book is a detailed reference for academicians,
professionals, and young researchers. The wide range of topics
provides extensive information and data for future research
challenges in present-day social information networks.
Cybersecurity has been gaining serious attention and recently has
become an important topic of concern for organizations, government
institutions, and people at large interacting with digital online systems. As many individual and organizational activities
continue to grow and are conducted in the digital environment, new
vulnerabilities have arisen which have led to cybersecurity
threats. The nature, source, motives, and sophistication of cyberattacks are often not clearly known or understood, and in many cases the invisible attackers are never traced or identified. A cyberattack frequently becomes known only after the attack and the destruction have taken place, long after the attackers have left.
Cybersecurity for computer systems has become increasingly important because government, military, corporate, financial, critical infrastructure, and medical organizations rely heavily on digital network systems. These systems process, store, and exchange large volumes of data over the internet, and they are vulnerable to "continuous" cyberattacks. As
cybersecurity has become a global concern, it needs to be clearly
understood, and innovative solutions are required. Advancing
Cybersecurity for Digital Transformation: Opportunities and
Challenges looks deeper into issues, problems, and innovative
solutions and strategies that are linked to cybersecurity. This
book will provide important knowledge that can impact the
improvement of cybersecurity, which can add value in terms of
innovation to solving cybersecurity threats. The chapters cover
cybersecurity challenges, technologies, and solutions in the
context of different industries and different types of threats.
This book is ideal for cybersecurity researchers, professionals,
scientists, scholars, and managers, as well as practitioners,
stakeholders, researchers, academicians, and students interested in
the latest advancements in cybersecurity for digital
transformation.
Parallel Programming with OpenACC is a modern, practical guide to
implementing dependable computing systems. The book explains how
anyone can use OpenACC to quickly ramp-up application performance
using high-level code directives called pragmas. The OpenACC
directive-based programming model is designed to provide a simple,
yet powerful, approach to accelerators without significant
programming effort. Author Rob Farber, working with a team of
expert contributors, demonstrates how to turn existing applications
into portable GPU accelerated programs that demonstrate immediate
speedups. The book also helps users get the most from the latest
NVIDIA and AMD GPU plus multicore CPU architectures (and soon for
Intel® Xeon Phi™ as well). Downloadable example codes
provide hands-on OpenACC experience for common problems in
scientific, commercial, big-data, and real-time systems. Topics
include writing reusable code, asynchronous capabilities, using
libraries, multicore clusters, and much more. Each chapter explains
how a specific aspect of OpenACC technology fits, how it works, and
the pitfalls to avoid. Throughout, the book demonstrates the use of simple working examples that can be adapted to meet application needs.
This book presents research on recent developments in collective
decision-making. With contributions from leading scholars from a
variety of disciplines, it provides an up-to-date overview of
applications in social choice theory, welfare economics, and
industrial organization. The contributions address, amongst others,
topics such as measuring power, the manipulability of collective
decisions, and experimental approaches. Applications range from
analysis of the complicated institutional rules of the European
Union to responsibility-based allocation of cartel
damages or the design of webpage rankings. With its
interdisciplinary focus, the book seeks to bridge the gap between
different disciplinary approaches by pointing to open questions
that can only be resolved through collaborative efforts.
Edsger Wybe Dijkstra (1930-2002) was one of the most influential
researchers in the history of computer science, making fundamental
contributions to both the theory and practice of computing. Early
in his career, he proposed the single-source shortest path
algorithm, now commonly referred to as Dijkstra's algorithm. He
wrote (with Jaap Zonneveld) the first ALGOL 60 compiler, and
designed and implemented with his colleagues the influential THE
operating system. Dijkstra invented the field of concurrent
algorithms, with concepts such as mutual exclusion, deadlock
detection, and synchronization. A prolific writer and forceful
proponent of the concept of structured programming, he convincingly
argued against the use of the Go To statement. In 1972 he was
awarded the ACM Turing Award for "fundamental contributions to
programming as a high, intellectual challenge; for eloquent
insistence and practical demonstration that programs should be
composed correctly, not just debugged into correctness; for
illuminating perception of problems at the foundations of program
design." Subsequently he invented the concept of self-stabilization
relevant to fault-tolerant computing. He also devised an elegant
language for nondeterministic programming and its weakest
precondition semantics, featured in his influential 1976 book A
Discipline of Programming in which he advocated the development of
programs in concert with their correctness proofs. In the later
stages of his life, he devoted much attention to the development
and presentation of mathematical proofs, providing further support
to his long-held view that the programming process should be viewed
as a mathematical activity. In this unique new book, 31 computer
scientists, including five recipients of the Turing Award, present
and discuss Dijkstra's numerous contributions to computing science
and assess their impact. Several authors knew Dijkstra as a friend,
teacher, lecturer, or colleague. Their biographical essays and
tributes provide a fascinating multi-author picture of Dijkstra,
from the early days of his career up to the end of his life.
Multi-Paradigm Modelling for Cyber-Physical Systems explores
modeling and analysis as crucial activities in the development of
Cyber-Physical Systems, which are inherently cross-disciplinary in
nature and require distinct modeling techniques related to
different disciplines, as well as a common background knowledge.
This book will serve as a reference for anyone starting in the
field of CPS who needs a solid foundation of modeling, including a
comprehensive introduction to existing techniques and a clear
explanation of their advantages and limitations. This book is aimed
at both researchers and practitioners who are interested in various
modeling paradigms across computer science and engineering.
Data analytics is proving to be an ally for epidemiologists as they
join forces with data scientists to address the scale of crises.
Analytics applied to data from many sources can derive insights used to study and fight global outbreaks. Pandemic analytics is a
modern way to combat a problem as old as humanity itself: the
proliferation of disease. Machine Learning and Data Analytics for
Predicting, Managing, and Monitoring Disease explores different
types of data and discusses how to prepare data for analysis,
perform simple statistical analyses, create meaningful data
visualizations, predict future trends from data, and more by
applying cutting edge technology such as machine learning and data
analytics in the wake of the COVID-19 pandemic. Covering a range of
topics such as mental health analytics during COVID-19, data
analysis and machine learning using Python, and statistical model
development and deployment, it is ideal for researchers,
academicians, data scientists, technologists, data analysts,
diagnosticians, healthcare professionals, computer scientists, and
students.
Advances in Imaging and Electron Physics, Volume 216, merges two
long-running serials, Advances in Electronics and Electron Physics
and Advances in Optical and Electron Microscopy. The series
features extended articles on the physics of electron devices
(especially semiconductor devices), particle optics at high and low
energies, microlithography, image science, digital image
processing, electromagnetic wave propagation, electron microscopy
and the computing methods used in all these domains.
Trends in Deep Learning Methodologies: Algorithms, Applications,
and Systems covers deep learning approaches such as neural
networks, deep belief networks, recurrent neural networks,
convolutional neural networks, deep auto-encoder, and deep
generative networks, which have emerged as powerful computational
models. Chapters elaborate on these models which have shown
significant success in dealing with massive data for a large number
of applications, given their capacity to extract complex hidden
features and learn efficient representation in unsupervised
settings. Chapters investigate deep learning-based algorithms in a
variety of applications, including biomedical and health
informatics, computer vision, image processing, and more. In recent
years, many powerful algorithms have been developed for matching
patterns in data and making predictions about future events. The
major advantage of deep learning is to process big data analytics
for better analysis and self-adaptive algorithms to handle more
data. Deep learning methods can deal with multiple levels of
representation in which the system learns to abstract higher level
representations of raw data. Earlier, it was a common requirement
to have a domain expert to develop a specific model for each
specific application; however, recent advancements in
representation learning algorithms allow researchers across various
subject domains to automatically learn the patterns and
representation of the given data for the development of specific
models.
Quantum Inspired Computational Intelligence: Research and
Applications explores the latest quantum computational intelligence
approaches, initiatives, and applications in computing,
engineering, science, and business. The book explores this emerging
field of research that applies principles of quantum mechanics to
develop more efficient and robust intelligent systems. Conventional
computational intelligence (or soft computing) is conjoined with
quantum computing to achieve this objective. The models covered can
be applied to any endeavor which handles complex and meaningful
information.
Innovation is the key to maintaining competitive advantage. Innovation in products, processes, and business models helps companies provide economic value to their customers. Identifying innovative ideas, implementing them, and bringing them to market requires investing significant resources and can incur large costs. Technology encourages companies to foster innovation to
remain competitive in the marketplace. Emerging Technologies for
Innovation Management in the Software Industry serves as a resource
for technology absorption in companies supporting innovation. It
highlights the role of technology to assist software
companies, especially small start-ups, to innovate their products,
processes, and business models. This book provides the necessary
guidelines of which tools to use and under what situations.
Covering topics such as risk management, prioritization approaches,
and digitally-enabled innovation processes, this premier reference
source is an ideal resource for entrepreneurs, software developers,
software managers, business leaders, engineers, students and
faculty of higher education, researchers, and academicians.
Developing new approaches and reliable enabling technologies in the
healthcare industry is needed to enhance our overall quality of
life and lead to a healthier, innovative, and secure society.
Further study is required to ensure these current technologies,
such as big data analytics and artificial intelligence, are
utilized to their utmost potential and are appropriately applied to
advance society. Big Data Analytics and Artificial Intelligence in
the Healthcare Industry discusses technologies and emerging topics
regarding reliable and innovative solutions applied to the
healthcare industry and considers various applications, challenges,
and issues of big data and artificial intelligence for enhancing
our quality of life. Covering a range of topics such as electronic
health records, machine learning, and e-health, this reference work
is ideal for healthcare professionals, computer scientists, data
analysts, researchers, practitioners, scholars, academicians,
instructors, and students.
Virtual Reality (VR) is the use of computer technology to construct a simulated environment. VR places the user inside and in
the center of the experience, unlike conventional user interfaces.
Users are immersed and able to connect with 3D environments instead
of seeing a screen in front of them. The computer's role is to provide the user's experience in this artificial environment
by simulating as many senses as possible, such as sight, hearing,
touch and smell. In Augmented Reality (AR) we have an enhanced
version of the real physical world that is achieved through the use
of digital visual elements, sound, or other sensory stimuli
delivered via technology. It can be seen as VR superimposed on real
life. In both VR and AR the experience is composed of a virtual or
extended world, an immersion technology, sensory feedback and
interactivity. These elements rely on a multitude of technologies that must work together and be presented to the user in a seamlessly integrated and synchronized way. This book is dedicated to applications, new
technologies and emerging trends in the fields of virtual reality
and augmented reality in healthcare. It covers technical areas as well as areas of applied intervention, addressing hardware and software technologies and encompassing all components of the virtual experience. The main
goal of this book is to show how to put Virtual Reality in action
by linking academic and informatics researchers with professionals
who use and need VR in their day-to-day work, with a special focus
on healthcare professionals and related areas. The idea is to
disseminate and exchange the knowledge, information and technology
provided by the international communities in the area of VR, AR and
XR throughout the 21st century. Another important goal is to
synthesize all the trends, best practices, methodologies, languages
and tools which are used to implement VR. In order to shape the
future of VR, new paradigms and technologies should be discussed,
not forgetting aspects related to regulation and certification of
VR technologies, especially in the healthcare area. These last
topics are crucial for the standardization of VR. This book presents important achievements and shows how to use VR technologies in a full range of settings to provide decision support anywhere and anytime.