Recent decades have witnessed the thriving development of new mathematical, computational and theoretical approaches, such as bioinformatics and neuroinformatics, to tackle fundamental issues in biology. These approaches no longer focus on individual units, such as nerve cells or genes, but rather on dynamic patterns of interactions between them. This volume explores the concept in full, featuring contributions from a global group of researchers, many of whom are pre-eminent in their fields.
This book brings together historical notes, reviews of research developments, fresh ideas on how to make VC (Vapnik-Chervonenkis) guarantees tighter, and new technical contributions in the areas of machine learning, statistical inference, classification, algorithmic statistics, and pattern recognition. The contributors are leading scientists in domains such as statistics, mathematics, and theoretical computer science, and the book will be of interest to researchers and graduate students in these domains.
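As a rough illustration of the kind of VC guarantee the contributors work to tighten, the classical bound can be evaluated numerically. The sketch below uses one common textbook formulation (constants differ between variants) and is illustrative only, not taken from the book:

```python
import math

def vc_bound(emp_risk, n, d, delta):
    """One common form of the classical VC generalization bound:
    with probability at least 1 - delta, the true risk is at most
    emp_risk + sqrt((d*(ln(2n/d) + 1) + ln(4/delta)) / n),
    where d is the VC dimension and n the sample size.
    Constants vary between textbook formulations."""
    slack = math.sqrt((d * (math.log(2 * n / d) + 1) + math.log(4 / delta)) / n)
    return emp_risk + slack

# The confidence term shrinks as the sample grows; at modest n the
# slack dominates the empirical risk, which is why tighter
# guarantees are worth pursuing.
b1 = vc_bound(emp_risk=0.05, n=1_000, d=10, delta=0.05)
b2 = vc_bound(emp_risk=0.05, n=100_000, d=10, delta=0.05)
```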
This book discusses major milestones in Rohit Jivanlal Parikh's scholarly work. Highlighting the transition in Parikh's interest from formal languages to natural languages, and how he approached Wittgenstein's philosophy of language, it traces the academic trajectory of a brilliant scholar whose work opened up various new avenues in research. This volume is part of Springer's book series Outstanding Contributions to Logic, and honours Rohit Parikh and his works in many ways. Parikh is a leader in the realm of ideas, offering concepts and definitions that enrich the field and lead to new research directions. Parikh has contributed to a variety of areas in logic, computer science and game theory. In mathematical logic his contributions have been in recursive function theory, proof theory and non-standard analysis; in computer science, in the areas of modal, temporal and dynamic logics of programs and semantics of programs, as well as logics of knowledge; in artificial intelligence in the area of belief revision; and in game theory in the formal analysis of social procedures, with a strong undercurrent of philosophy running through all his work.This is not a collection of articles limited to one theme, or even directly connected to specific works by Parikh, but instead all papers are inspired and influenced by Parikh in some way, adding structures to and enriching "Parikh-land". The book presents a brochure-like overview of Parikh-land before providing an "introductory video" on the sights and sounds that you experience when reading the book.
Graphs are widely used to represent structural information in the form of objects and connections between them. Graph transformation is the rule-based manipulation of graphs, an increasingly important concept in computer science and related fields. This is the first textbook treatment of the algebraic approach to graph transformation, based on algebraic structures and category theory. Part I is an introduction to the classical case of graph and typed graph transformation. In Part II basic and advanced results are first shown for an abstract form of replacement systems, so-called adhesive high-level replacement systems based on category theory, and are then instantiated to several forms of graph and Petri net transformation systems. Part III develops typed attributed graph transformation, a technique of key relevance in the modeling of visual languages and in model transformation. Part IV contains a practical case study on model transformation and a presentation of the AGG (attributed graph grammar) tool environment. Finally the appendix covers the basics of category theory, signatures and algebras. The book addresses both research scientists and graduate students in computer science, mathematics and engineering.
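The core idea of rule-based graph manipulation can be sketched very simply. The toy below treats a graph as a set of labelled edges and applies a single rewrite rule; it is a hypothetical stand-in for, not an implementation of, the algebraic double-pushout construction developed in the book:

```python
def apply_rule(edges, lhs_edge, rhs_edges):
    """Apply one rewrite rule: if an edge matching the left-hand side
    exists, delete it and add the right-hand-side edges.
    Graphs are sets of (source, target) label pairs."""
    if lhs_edge in edges:
        return (edges - {lhs_edge}) | set(rhs_edges)
    return edges  # rule not applicable; graph unchanged

# Hypothetical example: collapse a proxy hop into a direct link.
g = {("client", "proxy"), ("proxy", "server")}
g2 = apply_rule(g, ("client", "proxy"), [("client", "server")])
```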
Computer science is the science of the future, and already underlies every facet of business and technology, and much of our everyday lives. In addition, it will play a crucial role in the science of the 21st century, which will be dominated by biology and biochemistry, similar to the role of mathematics in the physical sciences of the 20th century. In this award-winning best-seller, the author and his co-author focus on the fundamentals of computer science, which revolve around the notion of the "algorithm." They discuss the design of algorithms, their efficiency and correctness, the inherent limitations of algorithms and computation, quantum algorithms, concurrency, large systems and artificial intelligence. Throughout, the authors, in their own words, stress the 'fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'. This version of the book is published to celebrate 25 years since its first edition, and in honor of the Alan M. Turing Centennial year. Turing was a true pioneer of computer science, whose work forms the underlying basis of much of this book.
This book is written for anyone who is interested in how a field of research evolves and in the fundamental role of understanding the uncertainties involved at different levels of analysis, ranging from macroscopic views to meso- and microscopic ones. We introduce a series of computational and visual analytic techniques from research areas such as text mining, deep learning, information visualization and science mapping, so that readers can apply these tools to the study of a subject matter of their choice. In addition, we set this diverse set of methods in an integrative context that draws upon insights from philosophical, sociological, and evolutionary theories of what drives the advances of science, so that readers can guide their own research with enriched theoretical foundations. Scientific knowledge is complex. A subject matter is typically built on its own set of concepts, theories, methodologies and findings, discovered by generations of researchers and practitioners. Scientific knowledge, as known to the scientific community as a whole, experiences constant change. Some changes are long-lasting, whereas others may be short-lived. How can we keep abreast of the state of the art as science advances? How can we effectively and precisely convey the status of the current science to the general public as well as to scientists across different disciplines? The study of scientific knowledge has been overwhelmingly focused on scientific knowledge per se. In contrast, the status of scientific knowledge at various levels of granularity has been largely overlooked. This book aims to highlight the role of uncertainties in developing a better understanding of the status of scientific knowledge at a particular time, and how that status evolves over the course of research.
Furthermore, we demonstrate how the knowledge of the types of uncertainties associated with scientific claims serves as an integral and critical part of our domain expertise.
This book presents some recent work on the application of Soft Computing techniques to information access on the World Wide Web. The book comprises 15 chapters from internationally known researchers and is divided into four parts reflecting the research areas of the presented works: Document Classification, the Semantic Web, Web Information Retrieval and Web Applications. This book demonstrates that Web Information Retrieval is a stimulating area of research in which Soft Computing technologies can be applied satisfactorily.
This text centers on three main subjects. The first is the concept of modularity and independence in classical logic, nonmonotonic logic and other nonclassical logics, and its consequences for syntactic and semantic interpolation and language change. In particular, we show the connection between interpolation for nonmonotonic logic and the manipulation of an abstract notion of size. Modularity is essentially the ability to put together partial results achieved independently to obtain a global result. The second aspect of the book is the authors' uniform picture of conditionals, including many-valued logics and structures on the language elements themselves and on the truth value set. The third topic is neighbourhood semantics, its connection to independence, and its common points and differences across various logics, e.g., for defaults and deontic logic, for the limit version of preferential logics, and for general approximation. The book will be of value to researchers and graduate students in logic and theoretical computer science.
Networks have become nearly ubiquitous and increasingly complex, and their support of modern enterprise environments has become fundamental. Accordingly, robust network management techniques are essential to ensure optimal performance of these networks. This monograph treats the application of numerous graph-theoretic algorithms to a comprehensive analysis of dynamic enterprise networks. Network dynamics analysis yields valuable information about network performance, efficiency, fault prediction, cost optimization, indicators and warnings. Based on many years of applied research of generic network dynamics, this work covers a number of elegant applications (including many new and experimental results) of traditional graph theory algorithms and techniques to computationally tractable network dynamics analysis to motivate network analysts, practitioners and researchers alike. The material is also suitable for graduate courses addressing state-of-the-art applications of graph theory in analysis of dynamic communication networks, dynamic databasing, and knowledge management.
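One of the simplest graph-theoretic indicators of network dynamics, reachability degradation between snapshots, can be computed with breadth-first search. The sketch below (node names and topology invented for illustration, not drawn from the monograph) flags a snapshot in which a lost link has stretched shortest paths:

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS hop counts from src in an undirected graph
    given as {node: set(neighbors)}."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def eccentricity(adj, src):
    """Longest shortest path from src to any reachable node."""
    return max(shortest_paths(adj, src).values())

# Two snapshots of a toy network: the second has lost link B-D.
snap1 = {"A": {"B"}, "B": {"A", "C", "D"}, "C": {"B", "D"}, "D": {"B", "C"}}
snap2 = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}

# Rising eccentricity between snapshots is a simple warning
# indicator of degraded reachability.
e1 = eccentricity(snap1, "A")
e2 = eccentricity(snap2, "A")
```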
Confronting the digital revolution in academia, this book examines the application of new computational techniques and visualisation technologies in the Arts & Humanities. Uniting differing perspectives, leading and emerging scholars discuss the theoretical and practical challenges that computation raises for these disciplines.
Computer simulation and mathematical modelling are the most important approaches in the quantitative analysis of the diffusive processes fundamental to many physical, chemical, biological, and geological systems. This comprehensive text/reference addresses the key issues in the "Modelling and Simulation of Diffusive Processes" from a broad range of different application areas. Applying a holistic approach, the book presents illuminating viewpoints drawn from an international selection of experts across a wide spectrum of disciplines, from computer science, mathematics and engineering, to natural resource management, environmental sciences, applied geo-sciences, agricultural sciences, and theoretical medicine. Topics and features: presents a detailed introduction to diffusive processes and modelling; discusses diffusion and molecular transport in living cells, and suspended sediment in open channels; examines the mathematical modelling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media, and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modelling of nitrogen fate and transport at the sediment-water interface, and groundwater flow in unconfined aquifers; investigates two-dimensional solute transport from a varying pulse type point source, and futile cycles in metabolic flux modelling; studies contaminant concentration prediction along unsteady groundwater flow, and modelling synovial fluid flow in human joints; explores the modelling of soil organic carbon, and crop growth simulation. This interdisciplinary volume will be invaluable to researchers, lecturers and graduate students from such diverse fields as computer science, mathematics, hydrology, agriculture and biology.
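The simplest diffusive model in this family, the one-dimensional diffusion equation u_t = D u_xx, can be simulated with an explicit finite-difference scheme. The following is a minimal self-contained sketch (not code from the book):

```python
def diffuse_1d(u, D, dx, dt, steps):
    """Explicit FTCS scheme for u_t = D * u_xx with fixed
    (Dirichlet) boundary values. Stable when D*dt/dx**2 <= 0.5."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    u = list(u)
    for _ in range(steps):
        u = ([u[0]]
             + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                for i in range(1, len(u) - 1)]
             + [u[-1]])
    return u

# A concentration spike in the middle spreads and flattens over time.
initial = [0.0] * 10 + [1.0] + [0.0] * 10
final = diffuse_1d(initial, D=1.0, dx=1.0, dt=0.4, steps=50)
```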
The two volumes IFIP AICT 545 and 546 constitute the refereed post-conference proceedings of the 11th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2017, held in Jilin, China, in August 2017. The 100 revised papers included in the two volumes were carefully reviewed and selected from 282 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture. The papers focus on four topics: Internet of Things and big data in agriculture, precision agriculture and agricultural robots, agricultural information services, and animal and plant phenotyping for agriculture.
In 1998-99, at the dawn of the SoC Revolution, we wrote Surviving the SOC Revolution: A Guide to Platform Based Design. In that book, we focused on presenting guidelines and best practices to aid engineers beginning to design complex System-on-Chip devices (SoCs). Now, in 2003, facing the mid-point of that revolution, we believe that it is time to focus on winning. In this book, Winning the SoC Revolution: Experiences in Real Design, we gather the best practical experiences in how to design SoCs from the most advanced design groups, while setting the issues and techniques in the context of SoC design methodologies. As an edited volume, this book has contributions from the leading design houses who are winning in SoCs - Altera, ARM, IBM, Philips, TI, UC Berkeley, and Xilinx. These chapters present the many facets of SoC design - the platform based approach, how to best utilize IP, Verification, FPGA fabrics as an alternative to ASICs, and next generation process technology issues. We also include observations from Ron Wilson of CMP Media on best practices for SoC design team collaboration. We hope that by utilizing this book, you too, will win the SoC Revolution.
This book constitutes the thoroughly refereed post-conference proceedings of the 6th IFIP WG 9.2, 9.6/11.7, 11.4, 11.6/PrimeLife International Summer School, held in Helsingborg, Sweden, in August 2010. The 27 revised papers were carefully selected from numerous submissions during two rounds of reviewing. They are organized in topical sections on terminology, privacy metrics, ethical, social, and legal aspects, data protection and identity management, eID cards and eID interoperability, emerging technologies, privacy for eGovernment and AAL applications, social networks and privacy, privacy policies, and usable privacy.
The past decades have seen significant improvements in 3D imaging where the related techniques and technologies have advanced to a mature state. These exciting developments have sparked increasing interest in the challenges and opportunities afforded by 3D sensing. As a consequence, the emerging area of safety and security related imaging incorporates these important new technologies beyond the limitations of 2D image processing. This book presents the thoroughly revised versions of lectures given by leading researchers during the Workshop on Advanced 3D Imaging for Safety and Security in conjunction with the International Conference on Computer Vision and Pattern Recognition CVPR 2005, held in San Diego, CA, USA in June 2005. It covers the current state of the art in 3D imaging for safety and security.
Today, new skills are required to compete in a global economy where organizations have new alternatives to choose from. In the next ten years, as baby boomers retire, even more opportunities will become available. Finding IT professionals with specific skills is no easy feat. Today's jobs require not only strong technical skills, but also excellent business, industry, communication, marketing and negotiating abilities. Managing IT Human Resources: Considerations for Organizations and Personnel provides a comprehensive presentation of current and emerging perspectives on all aspects of managing IT HR, from the viewpoints of both practitioners and academics located around the globe. It focuses on the results of recent research (from leading practitioners and academics) and their implications for IT human resource considerations. It presents what IT professionals seek in a position, characteristics of the IT environment that contribute to HR complexity, the retention of IT talent, stress in the workplace, IT career development, and the impact of IT outsourcing.
Cognitive Intelligence with Neutrosophic Statistics in Bioinformatics investigates and presents the many applications that have arisen in the last ten years using neutrosophic statistics in bioinformatics, medicine, agriculture and cognitive science. This book will be very useful to the scientific community, appealing to audiences interested in fuzzy, vague concepts from which uncertain data are collected, including academic researchers, practicing engineers and graduate students. Neutrosophic statistics is a generalization of classical statistics. In classical statistics, the data is known, formed by crisp numbers. In comparison, data in neutrosophic statistics has some indeterminacy. This data may be ambiguous, vague, imprecise, incomplete, and even unknown. Neutrosophic statistics refers to a set of data, such that the data or a part of it are indeterminate in some degree, and to methods used to analyze the data.
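A common way to carry such indeterminacy through a computation is to represent each observation as an interval rather than a crisp number. The sketch below illustrates this general principle only; it is not the book's own formalism:

```python
def neutrosophic_mean(sample):
    """Mean of interval-valued observations given as (lo, hi) pairs.
    Each interval captures the indeterminacy of one measurement;
    the result is itself an interval [mean of lows, mean of highs]."""
    n = len(sample)
    lo = sum(a for a, _ in sample) / n
    hi = sum(b for _, b in sample) / n
    return (lo, hi)

# Crisp data is the special case lo == hi, which recovers the
# classical sample mean; wider intervals signal more indeterminacy.
uncertain = [(2.0, 2.4), (3.1, 3.1), (1.8, 2.2)]
m = neutrosophic_mean(uncertain)
```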
"Fixed-Point Algorithms for Inverse Problems in Science and Engineering" presents some of the most recent work from top-notch researchers studying projection and other first-order fixed-point algorithms in several areas of mathematics and the applied sciences. The material presented provides a survey of the state-of-the-art theory and practice in fixed-point algorithms, identifying emerging problems driven by applications, and discussing new approaches for solving these problems. This book incorporates diverse perspectives from broad-ranging areas of research including, variational analysis, numerical linear algebra, biotechnology, materials science, computational solid-state physics, and chemistry. Topics presented include: Theory of Fixed-point algorithms: convex analysis, convex optimization, subdifferential calculus, nonsmooth analysis, proximal point methods, projection methods, resolvent and related fixed-point theoretic methods, and monotone operator theory. Numerical analysis of fixed-point algorithms: choice of step lengths, of weights, of blocks for block-iterative and parallel methods, and of relaxation parameters; regularization of ill-posed problems; numerical comparison of various methods. Areas of Applications: engineering (image and signal reconstruction and decompression problems), computer tomography and radiation treatment planning (convex feasibility problems), astronomy (adaptive optics), crystallography (molecular structure reconstruction), computational chemistry (molecular structure simulation) and other areas. Because of the variety of applications presented, this book can easily serve as a basis for new and innovated research and collaboration.
This book constitutes Part IV of the refereed four-volume post-conference proceedings of the 4th IFIP TC 12 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2010, held in Nanchang, China, in October 2010. The 352 revised papers presented were carefully selected from numerous submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including simulation models and decision-support systems for agricultural production, agricultural product quality testing, traceability and e-commerce technology, the application of information and communication technology in agriculture, and universal information service technology and service systems development in rural areas.
This book includes 23 papers dealing with the impact of modern information and communication technologies that support a wide variety of communities: local communities, virtual communities, and communities of practice, such as knowledge communities and scientific communities. The volume is the result of the second multidisciplinary "Communities and Technologies Conference," a major event in this emerging research field. The various chapters discuss how communities are affected by technologies, and how understanding of the way that communities function can be used in improving information systems design. This state of the art overview will be of interest to computer and information scientists, social scientists and practitioners alike.
For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical processes within the brain which correspond with certain forms of thought. Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction broadly surveys research in the Brain-Computer Interface domain. More specifically, each chapter articulates some of the challenges and opportunities for using brain sensing in Human-Computer Interaction work, as well as applying Human-Computer Interaction solutions to brain sensing work. For researchers with little or no expertise in neuroscience or brain sensing, the book provides background information to equip them to not only appreciate the state-of-the-art, but also ideally to engage in novel research. For expert Brain-Computer Interface researchers, the book introduces ideas that can help in the quest to interpret intentional brain control and develop the ultimate input device. It challenges researchers to further explore passive brain sensing to evaluate interfaces and feed into adaptive computing systems. Most importantly, the book will connect multiple communities allowing research to leverage their work and expertise and blaze into the future.
The growing demand of speed, accuracy, and reliability in scientific and engineering computing has been accelerating the merging of symbolic and numeric computations. These two types of computation coexist in mathematics yet are separated in traditional research of mathematical computation. This book presents 27 research articles on the integration and interaction of symbolic and numeric computation.
This book presents the theory of continuum mechanics for mechanical, thermodynamical, and electrodynamical systems. It shows how to obtain the governing equations and applies them in realistic computations. It uses only open-source codes developed under the FEniCS project and includes codes for 20 engineering applications from mechanics, fluid dynamics, applied thermodynamics, and electromagnetism. Moreover, it derives and utilizes constitutive equations, including coupling terms, which make it possible to compute multiphysics problems by incorporating interactions between the primitive variables, namely motion, temperature, and electromagnetic fields. An engineering system is described by primitive variables satisfying field equations that are partial differential equations in space and time. The field equations are mostly coupled and nonlinear, in other words, difficult to solve. In order to solve the coupled, nonlinear system of partial differential equations, the book uses a collection of open-source packages developed under the FEniCS project. All primitive variables are solved for at once, in a fully coupled fashion, using the finite difference method in time and the finite element method in space.
This volume presents the proceedings of the Third International Conference on Recent Trends in Information, Telecommunication and Computing (ITC 2012), held during August 3-4, 2012, in Kochi, India. ITC 2012 brings together innovative academics and industrial experts in the fields of Computer Science, Information Technology, Computational Engineering, and Communication in a common forum. The primary goal of the conference is to promote research and development activities in these fields; another is to promote the exchange of scientific information between researchers, developers, engineers, students, and practitioners.