With new technologies such as computer vision, the internet of things,
mobile computing, e-governance, and e-commerce, and with the wide
application of social media, organizations generate a huge volume of
data at a much faster rate than they did several years ago. Big data
in large-/small-scale systems, characterized by high volume,
diversity, and velocity, increasingly drives decision making and is
changing the landscape of business intelligence. From governments
to private organizations, from communities to individuals, all
areas are being affected by this shift. There is a high demand for
big data analytics that offer insights for computing efficiency,
knowledge discovery, problem solving, and event prediction. To meet
this demand and cope with the growth of big data, research on
innovative and optimized machine learning algorithms is needed in
both large- and small-scale systems. Applications of Big Data in
Large- and Small-Scale Systems includes state-of-the-art research
findings on the latest developments, up-to-date issues, and
challenges in the field of big data and presents the latest
innovative and intelligent applications related to big data. This
book encompasses big data in various multidisciplinary fields, from
medicine to agriculture, business research, and smart
cities. While highlighting topics including machine learning, cloud
computing, data visualization, and more, this book is a valuable
reference tool for computer scientists, data scientists and
analysts, engineers, practitioners, stakeholders, researchers,
academicians, and students interested in the versatile and
innovative use of big data in both large-scale and small-scale
systems.
Advances in Nonvolatile Memory and Storage Technology, Second
Edition, addresses recent developments in the non-volatile memory
spectrum, from fundamental understanding to technological aspects.
The book provides up-to-date information on current memory
technologies as presented by leading experts in both academia and
industry. To reflect this rapidly changing field, many new chapters
have been included to feature the latest in RRAM technology,
STT-RAM, memristors, and more. The new edition describes emerging
technologies, including oxide-based ferroelectric memories,
MRAM technologies, and 3D memory. Finally, to further widen the
discussion on the applications space, neuromorphic computing
aspects have been included. This book is a key resource for
postgraduate students and academic researchers in physics,
materials science and electrical engineering. In addition, it will
be a valuable tool for research and development managers concerned
with electronics, semiconductors, nanotechnology, solid-state
memories, magnetic materials, organic materials and portable
electronic devices.
Recent advances in information and communication technologies have
enhanced the standards of metropolitan planning and development.
The growth of mobile communication, in particular, is helping to
deliver innovative new services and applications in the field of
urban e-planning. New Approaches, Methods, and Tools in Urban E-Planning
is a key resource for the latest academic research on recent
innovations in urban e-planning, citizen e-participation, the use
of social media, and new forms of data collection and idea
generation for urban planning. Presenting broad coverage of a
variety of pertinent views and themes such as ethnography,
e-consultation, and civic engagement, this book is ideally designed
for planners, policymakers, researchers, and graduate students
interested in how recent technological advancements are enhancing
the traditional practices in e-planning.
Social media has emerged as a powerful tool that reaches a wide
audience with minimal time and effort. It plays a diverse role in
society and human life and can boost the visibility of information,
allowing citizens to play a vital part in creating and fostering
social change. This practice can have both positive
and negative consequences on society. Examining the Roles of IT and
Social Media in Democratic Development and Social Change is a
collection of innovative research on the methods and applications
of social media within community development and democracy. While
highlighting topics including information capitalism, ethical
issues, and e-governance, this book is ideally designed for social
workers, politicians, public administrators, sociologists,
journalists, policymakers, government administrators, academicians,
researchers, and students seeking current research on social
advancement and change through social media and technology.
Quality assurance is an essential aspect for ensuring the success
of corporations worldwide. Consistent quality requirements across
organizations of similar types ensure that these requirements can
be accurately and easily evaluated. Shaping the Future Through
Standardization is an essential scholarly book that examines
quality and standardization within diverse organizations globally
with a special focus on future perspectives, including how
standards and standardization may shape the future. Featuring a
wide range of topics such as economics, pedagogy, and management,
this book is ideal for academicians, researchers, decision makers,
policymakers, managers, corporate professionals, and students.
Over the last two decades, researchers have come to regard imbalanced
data learning as a prominent research area. Many critical
real-world application areas like finance, health, network, news,
online advertisement, social network media, and weather have
imbalanced data, which underscores the need for research with
real-time implications: precise fraud/defaulter detection, rare
disease/reaction prediction, network intrusion detection, fake news
detection, fraud advertisement detection, cyber bullying
identification, disaster event prediction, and more. Machine
learning algorithms are typically built on the assumption of equally
distributed, balanced data and therefore produce results biased
toward the majority class. This is unacceptable given that
imbalanced data is omnipresent in real-life scenarios, forcing us to
learn from imbalanced data if applications are to be designed
reliably. Imbalanced data is multifaceted and demands a fresh
perspective, drawing on novel sampling approaches to data
preprocessing, active learning approaches, and cost perceptive
approaches to resolve the imbalance. The Handbook of Research on
Data Preprocessing, Active Learning, and Cost Perceptive Approaches
for Resolving Data Imbalance offers new perspectives on imbalanced
data learning by presenting advancements of traditional methods,
with respect to big data, through case studies and research from
experts in academia, engineering, and industry. The chapters
provide theoretical frameworks and the latest empirical research
findings that improve understanding of the impact of imbalanced data
and of the techniques for resolving it based on data preprocessing,
active learning, and cost perceptive approaches.
This book is ideal for data scientists, data analysts, engineers,
practitioners, researchers, academicians, and students looking for
more information on imbalanced data characteristics and solutions
using varied approaches.
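
To make the sampling idea concrete, the following is a minimal
sketch, in Python with NumPy, of random oversampling, one of the
data-preprocessing approaches to imbalance described above. The
dataset sizes, class labels, and function name are illustrative
assumptions, not examples from the handbook.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical imbalanced dataset: 950 majority rows, 50 minority rows.
    X = rng.normal(size=(1000, 4))
    y = np.array([0] * 950 + [1] * 50)

    def random_oversample(X, y, rng):
        """Duplicate minority-class rows until every class matches the largest."""
        classes, counts = np.unique(y, return_counts=True)
        target = counts.max()
        X_parts, y_parts = [], []
        for cls, count in zip(classes, counts):
            idx = np.flatnonzero(y == cls)
            # Sample with replacement to bring this class up to `target` rows.
            extra = rng.choice(idx, size=target - count, replace=True)
            keep = np.concatenate([idx, extra])
            X_parts.append(X[keep])
            y_parts.append(y[keep])
        return np.concatenate(X_parts), np.concatenate(y_parts)

    X_bal, y_bal = random_oversample(X, y, rng)
    print(np.bincount(y))      # [950  50] -> training here favors class 0
    print(np.bincount(y_bal))  # [950 950] -> balanced classes for training

A cost perceptive alternative would leave the data untouched and
instead weight minority-class errors more heavily in the learning
algorithm's loss function.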
Many approaches have sprouted from artificial intelligence (AI) and
produced major breakthroughs in the computer science and
engineering industries. Deep learning is a method that is
transforming the world of data and analytics. How best to optimize
this new approach remains unclear, however, and there is a need for
research on the various applications and techniques of deep learning
in the field of computing. Deep Learning Techniques and
Optimization Strategies in Big Data Analytics is a collection of
innovative research on the methods and applications of deep
learning strategies in the fields of computer science and
information systems. While highlighting topics including data
integration, computational modeling, and scheduling systems, this
book is ideally designed for engineers, IT specialists, data
analysts, data scientists, researchers, academicians,
and students seeking current research on deep learning methods and
its application in the digital industry.
Pultrusion: State-of-the-Art Process Models with Applications,
Second Edition is a detailed guide to pultrusion, providing
methodical coverage of process models and computation simulation,
governing principles and science, and key challenges to help
readers enable process optimization and scale-up. This new edition
has been revised and expanded to include the latest advances,
state-of-the-art process models, and governing principles. The main
challenges in pultrusion, such as process-induced residual stresses,
shape distortions, thermal history, species conversion, phase
changes, impregnation of the reinforcements, and pulling force, are
described, and related examples are provided. Moreover,
strategies for achieving a reliable and optimized process using
probabilistic approaches and optimization algorithms are
summarized. Another focus of this book is on the thermo-chemical
and mechanical analyses of the pultrusion process for industrial
profiles.
It is crucial that forensic science meets challenges such as
identifying hidden patterns in data, validating results for
accuracy, and understanding varying criminal activities in order to
remain authoritative and uphold justice and public safety.
Artificial intelligence, with its subsets of machine learning and
deep learning, has the potential to transform the domain of forensic
science by handling diverse data, recognizing patterns, and
analyzing, interpreting, and presenting results. Machine learning
and deep learning frameworks, with their developed mathematical and
computational tools, help investigators produce reliable results.
Further study on the potential uses of
these technologies is required to better understand their benefits.
Aiding Forensic Investigation Through Deep Learning and Machine
Learning Frameworks provides an outline of deep learning and
machine learning frameworks and methods for use in forensic science
to produce accurate and reliable results to aid investigation
processes. The book also considers the challenges, developments,
advancements, and emerging approaches of deep learning and machine
learning. Covering key topics such as biometrics, augmented
reality, and fraud investigation, this reference work is crucial
for forensic scientists, law enforcement, computer scientists,
researchers, scholars, academicians, practitioners, instructors,
and students.
In this technological age, the information technology (IT) industry
is an important facet of society and business. The IT industry can
become more efficient and successful through examination of its
structure and a deeper understanding of the individuals who work in
the field. Multidisciplinary Perspectives
on Human Capital and Information Technology Professionals is a
critical scholarly resource that focuses on IT as an industry and
examines it from an array of academic viewpoints. Featuring
coverage on a wide range of topics, such as employee online
communities, role stress, and competence frameworks, this book is
targeted toward academicians, students, and researchers seeking
relevant research on IT as an industry.
Due to the increasing availability of affordable internet services,
the growing number of users, and the demand for a wider range of
multimedia-based applications, internet usage is on the rise. With
so many users and such a large amount of data, the requirement of
analyzing large data sets leads to the need for further advancements
in information processing. Big Data Processing with
Hadoop is an essential reference source that discusses possible
solutions for the millions of users working with a variety of data
applications who expect fast turnaround responses but encounter
issues with processing data at the rate it comes in. Featuring
research on topics such as market basket analytics, scheduler load
simulator, and writing YARN applications, this book is ideally
designed for IoT professionals, students, and engineers seeking
coverage on many of the real-world challenges regarding big data.
Changing business environments and information technology
advancements have fundamentally reshaped the traditional information
landscape of contemporary society, urging companies to seek
innovative ways to diffuse and manage assets on a global scale. It
is crucial for society to understand the new methodologies and
common practices that organizations can utilize to leverage their
knowledge into practice. Global Information Diffusion and
Management in Contemporary Society is an essential reference source
featuring research on the development and implementation of
contemporary global information management initiatives in
organizations. Including coverage on a multitude of topics such as
data security, global manufacturing, and information governance,
this book explores the importance of information management in a
global context. This book is ideally designed for managers,
information systems specialists, professionals, researchers, and
administrators seeking current research on the theories and
applications of global information management.
The idea of this book grew out of a symposium that was held at
Stony Brook in September 2012 in celebration of David S. Warren's
fundamental contributions to Computer Science and the area of Logic
Programming in particular. Logic Programming (LP) is at the nexus
of Knowledge Representation, Artificial Intelligence, Mathematical
Logic, Databases, and Programming Languages. It is fascinating and
intellectually stimulating due to the fundamental interplay among
theory, systems, and applications brought about by logic. Logic
programs are more declarative in the sense that they strive to be
logical specifications of "what" to do rather than "how" to do it,
and thus they are high-level and easier to understand and maintain.
Yet, without being given an actual algorithm, LP systems implement
the logical specifications automatically. Several books cover the
basics of LP but focus mostly on the Prolog language with its
incomplete control strategy and non-logical features. At the same
time, there is generally a lack of accessible yet comprehensive
collections of articles covering the key aspects in declarative LP.
These aspects include, among others, well-founded vs. stable model
semantics for negation, constraints, object-oriented LP, updates,
probabilistic LP, and evaluation methods, including top-down vs.
bottom-up, and tabling. For systems, the situation is even less
satisfactory, lacking accessible literature that can help train the
new crop of developers, practitioners, and researchers. There are a
few guides on Warren's Abstract Machine (WAM), which underlies most
implementations of Prolog, but very little exists on what is needed
for constructing a state-of-the-art declarative LP inference
engine. Contrast this with the literature on, say, Compilers, where
one can first study a book on the general principles and algorithms
and then dive into the particulars of a specific compiler. Such
resources greatly facilitate the ability to start making meaningful
contributions quickly. There is also a dearth of articles about
systems that support truly declarative languages, especially those
that tie into first-order logic, mathematical programming, and
constraint solving. LP helps solve challenging problems in a wide
range of application areas, but in-depth analysis of their
connection with LP language abstractions and LP implementation
methods is lacking. Surveys of challenging application areas of LP,
such as Bioinformatics, Natural Language Processing, Verification,
and Planning, are also rare. The goal of this book is to
help fill in the previously mentioned void in the LP literature. It
offers a number of overviews on key aspects of LP that are suitable
for researchers and practitioners as well as graduate students. The
chapters that follow cover the theory, systems, and applications of
LP.
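
To make the declarative flavor concrete, the following is a minimal
sketch, in Python, of naive bottom-up evaluation of a Datalog-style
logic program, one of the evaluation methods mentioned above. The
two rules state what an ancestor is, and the engine derives all
consequences by iterating to a fixpoint; the family facts and all
identifiers are illustrative assumptions, not examples from the book.

    # Facts: parent(ann, bob) and parent(bob, cal), stored as tuples.
    facts = {("parent", "ann", "bob"), ("parent", "bob", "cal")}

    def step(db):
        """Apply both ancestor rules once and return the enlarged database."""
        new = set(db)
        for rel, x, y in db:
            if rel == "parent":
                # Rule 1: ancestor(X, Y) :- parent(X, Y).
                new.add(("ancestor", x, y))
        for rel1, x, y in db:
            for rel2, y2, z in db:
                if rel1 == "parent" and rel2 == "ancestor" and y == y2:
                    # Rule 2: ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
                    new.add(("ancestor", x, z))
        return new

    db = set(facts)
    while True:                  # iterate until no new facts can be derived
        nxt = step(db)
        if nxt == db:            # least fixpoint reached
            break
        db = nxt

    print(sorted(t for t in db if t[0] == "ancestor"))
    # [('ancestor', 'ann', 'bob'), ('ancestor', 'ann', 'cal'),
    #  ('ancestor', 'bob', 'cal')]

A real engine would use semi-naive evaluation or tabling to avoid
rederiving known facts, but the fixpoint loop above is the essential
idea behind bottom-up evaluation.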
With technology creating a more competitive market, the global
economy has been continually evolving in recent years. These
technological developments have drastically changed the ways
organizations manage their resources, as they are constantly
seeking innovative methods to implement new systems. Because of
this, there is an urgent need for empirical research into advancing
theories and applications that organizations can use to successfully
handle information and supplies. Novel Theories and
Applications of Global Information Resource Management is a pivotal
reference source that provides vital research on developing
practices for businesses to effectively manage their assets on a
global scale. While highlighting topics such as enterprise systems,
library management, and information security, this publication
explores the implementation of technological innovation into
business techniques as well as the methods of controlling
information in a contemporary society. This book is ideally
designed for brokers, accountants, marketers, researchers, data
scientists, financiers, managers, and academicians seeking current
research on global resource management.