Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
This book is intended to make recent results on the derivation of higher order numerical schemes for random ordinary differential equations (RODEs) available to a broader readership, and to familiarize readers with RODEs themselves as well as the closely associated theory of random dynamical systems. In addition, it demonstrates how RODEs are being used in the biological sciences, where non-Gaussian and bounded noise are often more realistic than the Gaussian white noise in stochastic differential equations (SODEs).
RODEs are used in many important applications and play a fundamental role in the theory of random dynamical systems. They can be analyzed pathwise with deterministic calculus, but require treatment beyond classical ODE theory due to the lack of smoothness in their time variable. Although classical numerical schemes for ODEs can be used pathwise for RODEs, they rarely attain their traditional order, since the solutions of RODEs are not smooth enough to have Taylor expansions in the usual sense. However, Taylor-like expansions can be derived for RODEs through an iterated application of the appropriate chain rule in integral form; these represent the starting point for the systematic derivation of consistent higher order numerical schemes for RODEs.
The book is directed at a wide range of readers in applied and computational mathematics and related areas, as well as readers interested in the applications of mathematical models involving random effects, particularly in the biological sciences. The level is suitable for graduate students in applied mathematics and related areas, the computational sciences, and systems biology. A basic knowledge of ordinary differential equations and numerical analysis is required.
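To make the pathwise idea concrete (a minimal sketch with a hypothetical noise process and right-hand side, not an example from the book): the RODE dx/dt = -x + sin(O_t), driven by an Ornstein-Uhlenbeck path O_t, can be solved with the classical Euler scheme applied path by path, although the scheme's usual order generally degrades because O_t is continuous but not differentiable in t.

    # Illustrative pathwise Euler scheme for a RODE (assumed example,
    # not taken from the book): dx/dt = -x + sin(O_t).
    import numpy as np

    rng = np.random.default_rng(0)
    T, n = 10.0, 10_000
    dt = T / n

    # Sample one Ornstein-Uhlenbeck path O_t using its exact transition law.
    theta, sigma = 1.0, 1.0
    O = np.empty(n + 1)
    O[0] = 0.0
    a = np.exp(-theta * dt)
    s = sigma * np.sqrt((1 - a**2) / (2 * theta))
    for k in range(n):
        O[k + 1] = a * O[k] + s * rng.standard_normal()

    # Classical Euler applied pathwise to f(x, O_t) = -x + sin(O_t);
    # the driving path is merely continuous, so the usual order 1 is
    # generally not attained.
    x = np.empty(n + 1)
    x[0] = 1.0
    for k in range(n):
        x[k + 1] = x[k] + dt * (-x[k] + np.sin(O[k]))

    print(x[-1])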
As computers have infiltrated virtually every facet of our lives, so has computer science influenced nearly every academic subject in science, engineering, medicine, social science, the arts and humanities. Michael Knee offers a selective guide to the major resources and tools central to the computer industry: teaching institutions, research institutes and laboratories, manufacturers, standardization organizations, professional associations and societies, and publishers. He begins with a discussion of the three subject classification systems most commonly used to describe, index, and manage computer science information: the Association for Computing Machinery, Inspec, and the Library of Congress. An annotated bibliography of over 500 items follows, grouped by material type and featuring a mix of classic works and current sources.
This book questions the relevance of computation to the physical universe. Our theories deliver computational descriptions, but the gaps and discontinuities in our grasp suggest a need for continued discourse between researchers from different disciplines, and this book is unique in its focus on the mathematical theory of incomputability and its relevance for the real world. The core of the book consists of thirteen chapters in five parts on extended models of computation; the search for natural examples of incomputable objects; mind, matter, and computation; the nature of information, complexity, and randomness; and the mathematics of emergence and morphogenesis. This book will be of interest to researchers in the areas of theoretical computer science, mathematical logic, and philosophy.
Collected in this book are ten state-of-the-art expository articles on the most important topics in optimization, written by leading experts in the field. The book therefore provides a primary reference for those performing research in some area of optimization, or for those who have some basic knowledge of optimization techniques but wish to learn the most up-to-date and efficient algorithms for particular classes of problems. The first sections of each chapter are expository and therefore accessible to master's level graduate students. However, the chapters also contain advanced material on current topics of interest to researchers. For instance, there are chapters describing the polynomial-time linear programming algorithms of Khachiyan and Karmarkar, and the techniques used to solve combinatorial and integer programming problems an order of magnitude larger than was possible just a few years ago. Overall, a comprehensive yet lively and up-to-date discussion of the state of the art in optimization is presented in this book.
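For reference, the linear programs that the Khachiyan (ellipsoid) and Karmarkar (interior-point) algorithms address are conventionally stated in the standard form (a textbook formulation, not one quoted from this book):

    \min_{x \in \mathbb{R}^n} \; c^{\top} x \quad \text{subject to} \quad A x = b, \; x \ge 0

Both algorithms solve such problems in time polynomial in the bit size of the input data, in contrast to the exponential worst-case behaviour of the simplex method.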
The book presents laboratory experiments concerning ARM microcontrollers, and discusses the architecture of the Tiva Cortex-M4 ARM microcontrollers from Texas Instruments, describing various ways of programming them. Given the meager peripherals and sensors available on the kit, the authors describe the design of Padma - a circuit board with a large set of peripherals and sensors that connects to the Tiva Launchpad and exploits the Tiva microcontroller family's on-chip features. ARM microcontrollers, which are classified as 32-bit devices, are currently the most popular of all microcontrollers. They cover a wide range of applications that extends from the domain of traditional 8-bit devices to that of 32-bit devices. Of the various ARM subfamilies, Cortex-M4 is a middle-level microcontroller that lends itself well to data acquisition and control as well as digital signal manipulation applications. Given the prominence of ARM microcontrollers, it is important that they be incorporated into academic curricula; however, there is a lack of up-to-date teaching material - textbooks and comprehensive laboratory manuals. In this book, each of the microcontroller's resources - digital input and output, timers and counters, serial communication channels, analog-to-digital conversion, interrupt structure and power management features - is addressed in a set of more than 70 experiments designed to support a full-semester course on these microcontrollers. Beyond these physical interfacing exercises, the book describes an inexpensive BoB (break-out board) that allows students to learn how to design and build standalone projects, as well as a number of illustrative projects.
In this book the editors have gathered a number of contributions by people working on problems of Cognitive Technology (CT). The collection initiates explorations of the human mind via the technologies the mind produces. These explorations take as their point of departure the question: what happens when humans produce new technologies? Two interdependent perspectives from which such production can be approached are adopted:
- How and why constructs that have their origins in human mental life are embodied in physical environments when people fabricate their habitat, even to the point of those constructs becoming that very habitat;
- How and why these fabricated habitats affect, and feed back into, human mental life.
The aim of the CT research programme is to determine which technologies in general, and which interactive computer-based technologies in particular, are humane with respect to the cognitive development and evolutionary adaptation of their end users. But what does it really mean to be humane in a technological world? To shed light on this central issue, other pertinent questions are raised, e.g.:
- Why are human minds externalised, i.e., what purpose does the process of externalisation serve?
- What can we learn about the human mind by studying how it externalises itself?
- How does the use of externalised mental constructs (the objects we call 'tools') change people fundamentally?
- To what extent does human interaction with technology serve as an amplification of human cognition, and to what extent does it lead to an atrophy of the human mind?
The book calls for a reflection on what a tool is. Strong parallels between CT and environmentalism are drawn: both are seen as trends originating in our need to understand how we manipulate, by means of the tools we have created, our natural habitat, consisting on the one hand of the cognitive environment which generates thought and determines action, and on the other of the physical environment in which thought and action are realised. Both trends endeavour to protect the human habitat from the unwanted or uncontrolled impact of technology, and both are ultimately concerned with the ethics and aesthetics of tool design and tool use. Among the topics selected by the contributors, the following themes emerge (the list is not exhaustive): using technology to empower the cognitively impaired; the ethics versus aesthetics of technology; the externalisation of emotive and affective life and its special dialectic ('mirror') effects; creativity enhancement: cognitive space and problem tractability; externalisation of sensory life and mental imagery; the engineering and modelling aspects of externalised life; externalised communication channels and inner dialogue; externalised learning protocols; and relevance analysis as a theoretical framework for cognitive technology.
In this monograph we introduce and examine four new temporal logic formalisms that can be used as specification languages for the automated verification of the reliability of hardware and software designs with respect to a desired behavior. The work is organized in two parts. The first part discusses two logics for computations, graded computation tree logic and computation tree logic with minimal model quantifiers, which have proved useful in describing correct executions of monolithic closed systems. The second part focuses on logics for strategies, strategy logic and memoryful alternating-time temporal logic, which have been successfully applied to formalize several properties of interactive plays in multi-entity systems modeled as multi-agent games.
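As a flavour of the first of these formalisms (a standard illustrative formula, not one quoted from the monograph), graded computation tree logic extends CTL's existential path quantifier E with a counting superscript, so that, informally,

    \mathsf{E}^{>1}\, \mathsf{F}\, \mathit{error}

asserts that more than one distinct computation path eventually reaches an error state, a property plain CTL's EF error cannot distinguish from the existence of a single such path.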
This book introduces new logic primitives for electronic design automation tools. The author approaches fundamental EDA problems from a different, unconventional perspective in order to demonstrate the key role of rethinking EDA solutions in overcoming the technological limitations of present and future technologies. He discusses techniques that improve the efficiency of logic representation, manipulation and optimization tasks by taking advantage of majority and biconditional logic primitives. Readers will learn to accelerate formal methods by studying core properties of logic circuits and developing new frameworks for logic reasoning engines.
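To make the primitives concrete (an illustrative sketch, not code from the book): the three-input majority function MAJ and the two-input biconditional (XNOR) can be defined in a few lines, and MAJ subsumes AND and OR by fixing one input to a constant, which is the basic observation behind majority-based logic representations.

    # Illustrative definitions of the two primitives discussed above.
    def maj(a: int, b: int, c: int) -> int:
        """Three-input majority: 1 iff at least two inputs are 1."""
        return (a & b) | (a & c) | (b & c)

    def xnor(a: int, b: int) -> int:
        """Biconditional: 1 iff both inputs are equal."""
        return 1 - (a ^ b)

    # MAJ generalizes AND and OR: fix the third input to 0 or 1.
    for a in (0, 1):
        for b in (0, 1):
            assert maj(a, b, 0) == (a & b)   # MAJ(a, b, 0) = AND
            assert maj(a, b, 1) == (a | b)   # MAJ(a, b, 1) = OR
    print("majority subsumes AND/OR on all input pairs")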
Useful to healthcare providers, severity indices identify which patients are most at risk of infection, as well as the intensity of illness while in the hospital. "Text Mining Techniques for Healthcare Provider Quality Determination: Methods for Rank Comparisons" discusses the general practice of defining a patient severity index for risk adjustment and for the comparison of patient outcomes to assess quality factors. This Premier Reference Source examines the consequences of patient severity models and investigates the general assumptions required to perform standard severity adjustment.
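As a rough illustration of the kind of risk adjustment involved (a generic indirect-standardization sketch with made-up numbers, not a method taken from the book): each patient's severity class carries an expected infection probability, and a provider's observed-to-expected (O/E) ratio compares its outcomes against what its case mix predicts.

    # Hypothetical data: per-patient severity class and infection outcome.
    # Expected infection probability per severity class (made-up numbers).
    expected_rate = {"low": 0.02, "medium": 0.08, "high": 0.20}

    patients = [  # (provider, severity class, infected?)
        ("A", "low", 0), ("A", "high", 1), ("A", "medium", 0), ("A", "high", 0),
        ("B", "low", 1), ("B", "low", 0), ("B", "medium", 1), ("B", "medium", 0),
    ]

    observed, expected = {}, {}
    for provider, severity, infected in patients:
        observed[provider] = observed.get(provider, 0) + infected
        expected[provider] = expected.get(provider, 0.0) + expected_rate[severity]

    for provider in sorted(observed):
        oe = observed[provider] / expected[provider]
        # O/E > 1 means more infections than the case mix predicts.
        print(f"provider {provider}: O/E ratio = {oe:.2f}")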
This book constitutes the refereed post-conference proceedings of the 10th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2016, held in Dongying, China, in October 2016. The 55 revised papers presented were carefully reviewed and selected from 128 submissions. They cover a wide range of theories and applications of information technology in agriculture: intelligent sensing, cloud computing and key technologies of the Internet of Things; precision agriculture; animal husbandry information technology, including 'Internet +' modern animal husbandry, livestock big-data platforms and cloud computing applications, intelligent breeding equipment and precision production models; and aquatic-product networking and big data, including fishery IoT, intelligent aquaculture facilities and big data applications.
This book sets out to define and consolidate the field of bioinformation studies in its transnational and global dimensions, drawing on debates in science and technology studies, anthropology and sociology. It provides situated analyses of bioinformation journeys across domains and spheres of interpretation. As unprecedented amounts of data relating to biological processes and lives are collected, aggregated, traded and exchanged, infrastructural systems and machine learners produce real consequences as they turn indeterminate data into actionable decisions for states, companies, scientific researchers and consumers. Bioinformation accrues multiple values as it traverses multiple registers and domains, and as it is transformed from bodies into a subject of analysis tied to particular social relations, promises, desires and futures. The volume harnesses the anthropological sensibility for situated, fine-grained, ethnographically grounded analysis to develop an interdisciplinary dialogue on the conceptual, political, social and ethical dimensions posed by bioinformation.
This book covers principles and methods of software development for communication networks, based on practical experience from a number of software projects. The specific characteristics of this software are concurrent processes, time-critical response behaviour, complex functionality, and very high quality requirements. Architecture plays a key role in mastering the complexity of such software: it provides the rules and methods for an effective system design on which the entire development process can rest. This includes a complete specification methodology based on a formal language whose semantics is geared to the typical characteristics of communication software. The focus of the presentation is on adapting software development to the growing demands regarding functionality, market orientation, cost, and time.
The best-selling 'Algorithmics' presents the most important concepts, methods and results that are fundamental to the science of computing. It starts by introducing the basic ideas of algorithms, including their structures and methods of data manipulation. It then goes on to demonstrate how to design accurate and efficient algorithms, and discusses their inherent limitations. As the author himself says in the preface to the book: 'This book attempts to present a readable account of some of the most important and basic topics of computer science, stressing the fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'.
This book illustrates how to use description logic-based formalisms to their full potential in the creation, indexing, and reuse of multimedia semantics. To do so, it introduces researchers to multimedia semantics by providing an in-depth review of state-of-the-art standards, technologies, ontologies, and software tools. It draws attention to the importance of formal grounding in the knowledge representation of multimedia objects, highlights the potential of multimedia reasoning in intelligent multimedia applications, and presents both theoretical discussions and best practices in multimedia ontology engineering. Readers already familiar with mathematical logic, the Internet, and multimedia fundamentals will learn to develop formally grounded multimedia ontologies and map concept definitions to high-level descriptors. The core reasoning tasks, reasoning algorithms, and industry-leading reasoners are presented, and scene interpretation via reasoning is also demonstrated. Overall, this book offers readers an essential introduction to the formal grounding of web ontologies, as well as a comprehensive collection and review of description logics (DLs) from the perspectives of expressivity and reasoning complexity. It covers best practices for developing multimedia ontologies with formal grounding to guarantee decidability and obtain the desired level of expressivity while maximizing the reasoning potential. The capabilities of such multimedia ontologies are demonstrated by DL implementations, with an emphasis on multimedia reasoning applications.
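To illustrate the kind of formal grounding involved (a generic description logic example with hypothetical concept and role names, not a definition from the book), a scene depicting at least one action could be captured by the axiom

    \mathit{ActionScene} \equiv \mathit{Scene} \sqcap \exists \mathit{depicts}.\mathit{Action}

so that a DL reasoner can automatically classify any annotated scene with a depicts-link to an Action instance as an ActionScene, and conversely infer the existence of a depicted action for anything asserted to be an ActionScene.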
"Advanced Topics in Information Technology Standards and Standardization Research" is a book series featuring the most current research findings in all aspects of IT standardization research, from a diversity of angles, traversing the traditional boundaries between individual disciplines. Volume 1 of the series presents a collection of chapters addressing a variety of aspects related to IT standards and the setting of standards. It covers topics such as the economic aspects of standards, alliances in standardization, and the relation between 'formal' standards bodies and industry consortia, and also offers a glimpse inside a standards working group, as well as a look at applications of standards in different sectors.
This book first focuses on the explanation of the theory of focal mechanisms and moment tensor solutions and their role in modern seismology. The second part of the book compiles several state-of-the-art case studies in different seismotectonic settings of the planet. The assessment of seismic hazard and the reduction of losses due to future earthquakes is probably the most important contribution of seismology to society. In this regard, the reliable determination of the seismic source, and an understanding of its uncertainty, can play a key role in geodynamic investigation, seismic hazard assessment and earthquake studies. In the last two decades, the use of waveforms recorded at local-to-regional distances has increased considerably, and waveform modeling has also been used to estimate the faulting parameters of small-to-moderate sized earthquakes.
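For orientation (a standard relation from seismology textbooks, not a formula quoted from this book), the double-couple moment tensor of a shear dislocation on a fault with unit normal n and unit slip direction d is

    M_{ij} = M_0 \left( n_i d_j + n_j d_i \right)

where M_0 is the scalar seismic moment; inverting recorded waveforms for the six independent components of the symmetric tensor M_{ij} is what a moment tensor solution provides.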
An increasing number of global institutions look to advancements in technology to enhance access to learning and development and, in doing so, seek collaborative opportunities to maximize the benefits of educational technology. Cases on Technology Enhanced Learning through Collaborative Opportunities analyzes and evaluates how organizations and institutions of learning in the developing and developed world are adapting to technology-enhanced learning environments and exploring transnational collaborative opportunities, providing prospects for learning, growth and development through a blend of traditional and technological methods.
This book opens the door to the new, interesting and ambitious world of reversible and quantum computing research, presenting the state of the art required to travel around that world safely. Top universities, companies and government institutions are in a race to develop new methodologies, algorithms and circuits in reversible logic, quantum logic, reversible and quantum computing and nano-technologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single volume, together with some new proposals; sequential reversible logic circuits are also discussed for the first time in book form. Reversible logic plays an important role in quantum computing, and any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing; for this purpose, a new implementation of wavelet and multiwavelet transforms using quantum computing is presented. Researchers in academia or industry and graduate students who work in logic synthesis, quantum computing, nano-technology, and low-power VLSI circuit design will be interested in this book.
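As a small illustration of why reversible logic maps onto quantum logic (a generic sketch, not an example from the book): reversible gates such as CNOT and Toffoli are bijections on bit vectors, so every output pattern has exactly one preimage and no information is erased, which is the property quantum circuits require of all their (unitary) gates.

    # Reversible gates as bijections on bit tuples.
    def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
        """Flip target c iff both controls a and b are 1."""
        return a, b, c ^ (a & b)

    def cnot(a: int, b: int) -> tuple[int, int]:
        """Flip target b iff control a is 1."""
        return a, b ^ a

    # Reversibility check: the map over all 3-bit inputs is a permutation,
    # and Toffoli is its own inverse (applying it twice is the identity).
    inputs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    outputs = [toffoli(*x) for x in inputs]
    assert sorted(outputs) == sorted(inputs)                 # bijection
    assert all(toffoli(*toffoli(*x)) == x for x in inputs)   # self-inverse
    print("Toffoli is a reversible (self-inverse) permutation of 3-bit states")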
The papers in this volume represent the most timely and advanced contributions to the 2014 Joint Applied Statistics Symposium of the International Chinese Statistical Association (ICSA) and the Korean International Statistical Society (KISS), held in Portland, Oregon. The contributions cover new developments in statistical modeling and clinical research, including model development, model checking, and innovative clinical trial design and analysis. Each paper was peer-reviewed by at least two referees and an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including North America, Asia, and Europe, and offered 3 keynote speeches, 7 short courses, 76 parallel scientific sessions, student paper sessions, and social events.
This edited volume collects the research results presented at the 14th International Symposium on Computer Methods in Biomechanics and Biomedical Engineering, held in Tel Aviv, Israel, in 2016. The topical focus includes, but is not limited to, cardiovascular fluid dynamics, computer modeling of tissue engineering, skin and spine biomechanics, and biomedical image analysis and processing. The target audience primarily comprises research experts in the field of bioengineering, but the book may also be beneficial for graduate students.
Evolutionary algorithms constitute a class of well-known algorithms designed on the basis of the Darwinian theory of evolution and the Mendelian theory of heredity. They are partly random and partly deterministic in nature, which makes it challenging to predict and control their performance on complex nonlinear problems. Recently, the study of evolutionary dynamics has focused not only on the traditional investigations but also on understanding and analyzing new principles, with the intention of controlling and utilizing their properties and performance in more effective real-world applications. Based on the authors' many years of intensive research, this book proposes novel ideas for advancing evolutionary dynamics towards new phenomena, including many new topics such as the dynamics of equivalent social networks. It brings in more advanced complex networks and combines them with coupled map lattices (CMLs), which are commonly used for the simulation and analysis of spatiotemporal complex systems, building on the observation that chaos in a CML can be controlled, and so, the authors argue, can evolutionary dynamics. The chapter authors are, to the best of our knowledge, originators of the ideas mentioned above and researchers on evolutionary algorithms, chaotic dynamics and complex networks, whose contributions will benefit readers interested in modern scientific research on these subjects.
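To make the CML ingredient concrete (a textbook-style sketch with arbitrary parameter choices, not code from the book): a coupled map lattice evolves a ring of sites, each iterating a chaotic logistic map while being diffusively coupled to its neighbours, and it is this kind of spatiotemporal system whose controllable chaos the book connects to evolutionary dynamics.

    # Diffusively coupled logistic-map lattice on a ring of N sites.
    import numpy as np

    N, steps = 64, 200
    r, eps = 4.0, 0.3            # logistic parameter (chaotic regime) and coupling
    f = lambda x: r * x * (1.0 - x)

    rng = np.random.default_rng(1)
    x = rng.random(N)            # random initial state in [0, 1)

    for _ in range(steps):
        fx = f(x)
        # Each site mixes its own update with its two neighbours' updates.
        x = (1 - eps) * fx + (eps / 2) * (np.roll(fx, 1) + np.roll(fx, -1))

    print(x[:5])  # final state of the first few lattice sites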
You may like...
Agile Scrum Implementation and Its…
Kenneth R Walsh, Sathiadev Mahesh, …
Hardcover
R6,500
Discovery Miles 65 000
Advances in Database Technology - EDBT…
Alain Pirotte, Claude Delobel, …
Paperback
R3,128
Discovery Miles 31 280
Oberon-2 Programming with Windows
Joerg R. Muhlbacher, Bernhard Leisch, …
Mixed media product
R2,655
Discovery Miles 26 550
Formal Methods for Open Object-Based…
Paolo Ciancarini, Alessandro Fantechi, …
Hardcover
R5,838
Discovery Miles 58 380
A Student Guide to Object-Oriented…
Carol Britton, Jill Doake
Paperback
R1,482
Discovery Miles 14 820