Due to the growing use of web applications and communication devices, the use of data has increased throughout various industries, and new techniques for managing data must be developed to ensure adequate usage. The Handbook of Research on Pattern Engineering System Development for Big Data Analytics is a critical scholarly resource that examines the incorporation of pattern management into business technologies, as well as into decision-making and prediction processes, through data management and analysis. Featuring coverage of a broad range of topics such as business intelligence, feature extraction, and data collection, this publication is geared towards professionals, academicians, practitioners, and researchers seeking current research on the development of pattern management systems for business applications.
An intellectual property discussion is central to qualitative research projects, and ethical guidelines are essential to the safe accomplishment of research projects. Undertaking research studies without adhering to ethics may be dangerous to researchers and research subjects. It is therefore important to understand and develop practical techniques for handling ethics, with a specific focus on qualitative projects, so that researchers conducting this type of research may continue to use ethical practices at every step of the project. Data Analysis and Methods of Qualitative Research: Emerging Research and Opportunities discusses in detail the methods related to the social constructionist paradigm that is popular with qualitative research projects. These methods help researchers undertake ideal qualitative projects, free from quantitative research techniques and concepts, while acquiring practical skills in handling ethics and ethical issues. Each chapter contains case studies, learning outcomes, and question-and-answer sections, and discusses critical research philosophies in detail, along with topics such as ethics, research design, data gathering and sampling methods, research outputs, data analysis, and report writing. Featuring a wide range of topics such as epistemology, probability sampling, and big data, this book is ideal for researchers, practitioners, computer scientists, academicians, analysts, coders, and students looking to become competent qualitative research specialists.
The universe is considered an expansive informational field subjected to a general organizational law. The organization of the deployment results in the emergence of an autonomous organization of spatial and material elements endowed with permanence, which are generated on an informational substratum where an organizational law is exercised at all scales. The initial action of a generating informational element produces a quantity of basic informational elements that multiply to form other informational elements that will either be neutral, constituting the basic spatial elements, or active, forming quantum elements. The neutral basic elements will form the space by a continuous aggregation and will represent the substrate of the informational links, allowing the active informational elements to communicate, in order to aggregate and organize themselves. Every active element is immersed in an informational envelope, allowing it to continue its organization through constructive communications. The organizational law engages the active quantum elements to aggregate and produce new and more complex quantum elements, then molecular elements, massive elements, suns and planets. Gravity will then be the force of attraction exerted by the informational envelopes of the aggregates depending on their mass, to develop them by acquisition of new aggregates. The organizational communication of the informational envelopes of all of the physical material elements on Earth will enable the organization of living things, with reproduction managed by communications between the informational envelopes of the elements, realizing a continuous and powerful evolution.
Whether you are brand new to data mining or working on your tenth predictive analytics project, "Commercial Data Mining" will be there for you as an accessible reference outlining the entire process and related themes. In this book, you'll learn that your organization does not need a huge volume of data or a Fortune 500 budget to generate business using existing information assets. Expert author David Nettleton guides you through the process from beginning to end, covering everything from business objectives to data sources and selection through analysis and predictive modeling. "Commercial Data Mining" includes case studies and practical examples from Nettleton's more than 20 years of commercial experience. Real-world cases covering customer loyalty, cross-selling, and audience prediction in industries including insurance, banking, and media illustrate the concepts and techniques explained throughout the book.
Information and communication technologies (ICTs) have long been important in supporting doctoral study. Though ICTs have been integrated into educational practices at all levels, there is little understanding of how effective these technologies are in supporting resource development for students and researchers in academic institutions. Enhancing the Role of ICT in Doctoral Research Processes is a collection of innovative research that identifies the ways that doctoral supervisors and students perceive the role of ICTs within the doctoral research process and supports the development of guidelines to enhance ICT skills within these programs. While highlighting topics including professional development, online learning, and ICT management, this book is ideally designed for academicians, researchers, and professionals seeking current research on ICT use for doctoral research.
Method engineering is a very young field. Broadly, it ranges from engineering an entire methodology for information systems development to engineering modeling techniques according to project requirements. Computer-aided method engineering concerns the generation and use of information systems design techniques according to user needs; such environments are sometimes called generic tools or MetaCASE. Computer-Aided Method Engineering: Designing Case Repositories for the 21st Century presents a methodology and architecture for a CASE repository, forwarding a theory that brings component-based development into CASE tool design and development and covering a repository construction principle for the 21st century.
This book provides a snapshot of the state of current research at the interface between machine learning and healthcare with special emphasis on machine learning projects that are (or are close to) achieving improvement in patient outcomes. The book provides overviews on a range of technologies including detecting artefactual events in vital signs monitoring data; patient physiological monitoring; tracking infectious disease; predicting antibiotic resistance from genomic data; and managing chronic disease. With contributions from an international panel of leading researchers, this book will find a place on the bookshelves of academic and industrial researchers and advanced students working in healthcare technologies, biomedical engineering, and machine learning.
"Implementing Analytics" demystifies the concept, technology and
application of analytics and breaks its implementation down to
repeatable and manageable steps, making it possible for widespread
adoption across all functions of an organization. "Implementing
Analytics "simplifies and helps democratize a very specialized
discipline to foster business efficiency and innovation without
investing in multi-million dollar technology and manpower. A
technology agnostic methodology that breaks down complex tasks like
model design and tuning and emphasizes business decisions rather
than the technology behind analytics. Simplifies the understanding of analytics from a technical and functional perspective and shows a wide array of problems that can be tackled using existing technology Provides a detailed step by step approach to identify opportunities, extract requirements, design variables and build and test models. It further explains the business decision strategies to use analytics models and provides an overview for governance and tuning Helps formalize analytics projects from staffing, technology and implementation perspectives Emphasizes machine learning and data mining over statistics and shows how the role of a Data Scientist can be broken down and still deliver the value by building a robust development process
Today's work is characterized by a high degree of innovation and thus demands a thorough overview of relevant knowledge in the world and in organizations. Semantic Work Environments support the work of the user by collecting knowledge about needs and providing processed and improved knowledge to be integrated into work. "Emerging Technologies for Semantic Work Environments: Techniques, Methods, and Applications" provides an overview of this emerging field by combining various research studies and underlining the similarities between different processes, issues, and approaches in order to provide the reader with techniques, methods, and applications of the study.
This book uncovers the stakes and possibilities that Computational Intelligence and Predictive Analytics offer to medical science. The main focus is on data technologies, classification, analysis and mining, information retrieval, and the algorithms needed to process this information. A section with use cases and applications follows the two main parts of the book, dedicated respectively to the foundations and techniques of the discipline.
"Efficient Computation of Argumentation Semantics" addresses argumentation semantics and systems, introducing readers to cutting-edge decomposition methods that drive increasingly efficient logic computation in AI and intelligent systems. Such complex and distributed systems are increasingly used in the automation and transportation systems field, and particularly autonomous systems, as well as more generic intelligent computation research. The Series in Intelligent Systems publishes titles that cover
state-of-the-art knowledge and the latest advances in research and
development in intelligent systems. Its scope includes theoretical
studies, design methods, and real-world implementations and
applications. The series publishes titles in three core sub-topic
areas: intelligent automation, intelligent transportation systems,
and intelligent computing.
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
For the past decade or more, much of cell biology research has been focused on determining the key molecules involved in different cellular processes, an analytical problem that has been amenable to biochemical and genetic approaches. Now, we face an integrative problem of understanding how all of these molecules work together to produce living cells, a challenge that requires using quantitative approaches to model the complex interactions within a cell, and testing those models with careful quantitative measurements. This book is an introductory overview of the various approaches, methods, techniques, and models employed in quantitative cell biology, which are reviewed in greater detail in the other volumes in this e-book series. Particular emphasis is placed on the goals and purpose of quantitative analysis and modeling, and the special challenges that cell biology holds for understanding life at the physical level.
This textbook offers an insightful study of the intelligent Internet-driven revolutionary and fundamental forces at work in society. Readers will have access to tools and techniques to mentor and monitor these forces rather than be driven by changes in Internet technology and the flow of money. These submerged social and human forces form a powerful synergistic foursome: (a) processor technology, (b) evolving wireless networks of the next generation, (c) the intelligent Internet, and (d) the motivation that drives individuals and corporations. In unison, the technological forces can tear apart the lives of the passive or provide a cohesive set of opportunities for the knowledgeable to lead and reap the rewards of the evolved knowledge society. The book also provides in-depth coverage of the functions embedded in modern processors and intelligent communication networks. It focuses on the convergence of the design of modern processor technologies with the switching and routing methodologies of global intelligent networks. Most of the concepts generic to the design of teraflop parallel processors and terabit fiber-optic networks are presented. The book also highlights recent extensions of computer and processor technologies into microscopic and macroscopic medical functions in hospitals and medical centers.
"Visual Computing for Medicine, Second Edition, "offers
cutting-edge visualization techniques and their applications in
medical diagnosis, education, and treatment. The book
includesalgorithms, applications, and ideas on achieving
reliability of results and clinical evaluation of the techniques
covered. Preim and Botha illustrate visualization techniques
fromresearch, but also cover the information required to solve
practical clinical problems. They base the book on several years of
combined teaching and research experience. This new edition
includes six new chapters on treatment planning, guidance and
training; an updated appendix on software support for visual
computing for medicine; and a new global structure that better
classifies and explains the major lines of work in the field.
"Calculus of Thought: Neuromorphic Logistic Regression in Cognitive Machines" is a must-read for all scientists about a very simple computation method designed to simulate big-data neural processing. This book is inspired by the Calculus Ratiocinator idea of Gottfried Leibniz, which is that machine computation should be developed to simulate human cognitive processes, thus avoiding problematic subjective bias in analytic solutions to practical and scientific problems. The reduced error logistic regression (RELR) method is proposed
as such a "Calculus of Thought." This book reviews how RELR's
completely automated processing may parallel important aspects of
explicit and implicit learning in neural processes. It emphasizes
the fact that RELR is really just a simple adjustment to already
widely used logistic regression, along with RELR's new applications
that go well beyond standard logistic regression in prediction and
explanation. Readers will learn how RELR solves some of the most
basic problems in today s big and small data related to high
dimensionality, multi-colinearity, and cognitive bias in capricious
outcomes commonly involving human behavior.
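Since RELR is described as an adjustment to "already widely used logistic regression," it helps to recall what that baseline looks like. The sketch below is illustrative only: the function names, learning rate, and toy data are assumptions, and RELR's specific error-reduction adjustment is not reproduced here.

```python
import math

def fit_logistic(X, y, lr=0.5, iters=2000):
    """Standard logistic regression fit by batch gradient descent.
    RELR is described as an adjustment on top of this baseline;
    that adjustment is not shown here."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(iters):
        grad_w, grad_b = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # predicted probability
            err = p - yi                    # gradient of log-loss w.r.t. z
            for j in range(d):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

# Toy one-dimensional data: class 1 whenever the feature is positive.
X = [[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(X, y)
```

On this separable toy data the fitted weight is positive, so the model assigns high probability to positive inputs and low probability to negative ones.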
In recent years, swarm intelligence has become a popular computational approach among researchers working on optimization problems throughout the globe. Several swarm intelligence algorithms have been implemented owing to their applicability to real-world problems and other advantages. One such procedure, the Fireworks Algorithm, is an emerging method that simulates the explosion process of fireworks to search local areas. Applications of this developing technique remain largely unexplored, and research is necessary for scientists to fully understand the workings of this innovative system. The Handbook of Research on Fireworks Algorithms and Swarm Intelligence is a pivotal reference source that provides vital research on theory analysis, improvements, and applications of the fireworks algorithm. While highlighting topics such as convergence rate, parameter applications, and global optimization analysis, this publication explores up-to-date progress on the specific techniques of this algorithm. This book is ideally designed for researchers, data scientists, mathematicians, engineers, software developers, postgraduates, and academicians seeking coverage of this evolutionary computation method.
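The explosion process referred to above works by letting better fireworks produce more sparks within a smaller explosion amplitude (local exploitation) and worse fireworks fewer sparks within a larger amplitude (exploration). A minimal, illustrative sketch follows; all names and parameter values are assumptions, and the selection step is simplified to plain elitism rather than the algorithm's usual distance-based selection.

```python
import random

def fireworks(f, dim, n_fireworks=5, iters=100, bounds=(-5.0, 5.0),
              total_sparks=30, max_amp=2.0):
    """Minimal fireworks algorithm for minimizing f over a box.
    Better fireworks get more sparks within a smaller explosion
    amplitude; worse fireworks get fewer sparks within a larger one."""
    lo, hi = bounds
    eps = 1e-12
    pop = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_fireworks)]
    for _ in range(iters):
        vals = [f(x) for x in pop]
        best, worst = min(vals), max(vals)
        sum_up = sum(worst - v for v in vals) + eps   # scales spark counts
        sum_low = sum(v - best for v in vals) + eps   # scales amplitudes
        sparks = []
        for x, v in zip(pop, vals):
            n = max(1, round(total_sparks * (worst - v + eps) / sum_up))
            amp = max_amp * (v - best + eps) / sum_low
            for _ in range(n):
                sparks.append([min(hi, max(lo, xd + random.uniform(-amp, amp)))
                               for xd in x])
        # Simplified selection: keep the best points among fireworks and sparks.
        pool = sorted(pop + sparks, key=f)
        pop = pool[:n_fireworks]
    return pop[0], f(pop[0])

# Minimize the 2-D sphere function as a smoke test.
best_x, best_val = fireworks(lambda x: sum(t * t for t in x), dim=2)
```

The fitness-proportional spark count and amplitude are the core of the method; published variants differ mainly in how sparks are generated and how the next generation is selected.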
Websites are a central part of today's business world; however, with the vast amount of information that constantly changes and the frequency of required updates, this can come at a high cost to modern businesses. Web Data Mining and the Development of Knowledge-Based Decision Support Systems is a key reference source on decision support systems in view of end user accessibility and identifies methods for extraction and analysis of useful information from web documents. Featuring extensive coverage across a range of relevant perspectives and topics, such as semantic web, machine learning, and expert systems, this book is ideally designed for web developers, internet users, online application developers, researchers, and faculty.
Statistical learning and analysis techniques have become extremely important today, given the tremendous growth in the size of heterogeneous data collections and the ability to process them even from physically distant locations. Recent advances in the field of machine learning provide a strong framework for robust learning from diverse corpora and continue to impact a variety of research problems across multiple scientific disciplines. The aim of this handbook is to familiarize beginners as well as experts with some of the recent techniques in this field. The handbook is divided into two sections, Theory and Applications, covering machine learning, data analytics, biometrics, and document recognition and security, with an emphasis on applications-oriented techniques.
Swarm intelligence and bio-inspired computation have become increasingly popular in the last two decades. Bio-inspired algorithms such as ant colony algorithms, bat algorithms, bee algorithms, firefly algorithms, cuckoo search, and particle swarm optimization have been applied in almost every area of science and engineering, with a dramatic increase in the number of relevant publications. This book reviews the latest developments in swarm intelligence and bio-inspired computation from both the theory and application sides, providing a complete resource that analyzes and discusses the latest and future trends in research directions. It can help new researchers to carry out timely research and inspire readers to develop new algorithms. With its impressive breadth and depth, this book will be useful for advanced undergraduate students, PhD students, and lecturers in computer science, engineering, and science, as well as researchers and engineers.
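Among the algorithms named, particle swarm optimization has a particularly compact canonical form: each particle is pulled toward both its own best-known position and the swarm's best. The sketch below is illustrative only; the parameter values (inertia 0.7, acceleration coefficients 1.5) are common defaults, not values taken from this book.

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    """Canonical particle swarm optimization minimizing f over a box."""
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive and social weights
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best position so far
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the 3-D sphere function as a smoke test.
best_x, best_val = pso(lambda x: sum(t * t for t in x), dim=3)
```

The same pull-toward-best structure, with different sources of attraction, underlies many of the other bio-inspired algorithms the book surveys.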