This book is intended to make recent results on the derivation of higher order numerical schemes for random ordinary differential equations (RODEs) available to a broader readership, and to familiarize readers with RODEs themselves as well as the closely associated theory of random dynamical systems. In addition, it demonstrates how RODEs are being used in the biological sciences, where non-Gaussian and bounded noise are often more realistic than the Gaussian white noise in stochastic ordinary differential equations (SODEs). RODEs are used in many important applications and play a fundamental role in the theory of random dynamical systems. They can be analyzed pathwise with deterministic calculus, but require further treatment beyond that of classical ODE theory due to the lack of smoothness in their time variable. Although classical numerical schemes for ODEs can be used pathwise for RODEs, they rarely attain their traditional order, since the solutions of RODEs do not have sufficient smoothness to have Taylor expansions in the usual sense. However, Taylor-like expansions can be derived for RODEs using an iterated application of the appropriate chain rule in integral form; these represent the starting point for the systematic derivation of consistent higher order numerical schemes for RODEs. The book is directed at a wide range of readers in applied and computational mathematics and related areas, as well as readers who are interested in the applications of mathematical models involving random effects, in particular in the biological sciences. The level of this book is suitable for graduate students in applied mathematics and related areas, computational sciences and systems biology. A basic knowledge of ordinary differential equations and numerical analysis is required.
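The pathwise approach described above can be illustrated with a minimal sketch (not from the book): a scalar RODE dx/dt = -x + sin(O_t) driven by a presampled Ornstein-Uhlenbeck noise path, integrated step by step with classical explicit Euler. The equation, parameter values, and function names are illustrative assumptions, not the book's examples.

```python
import numpy as np

def ou_path(n_steps, dt, theta=1.0, sigma=1.0, seed=0):
    """Presample an Ornstein-Uhlenbeck noise path via Euler-Maruyama."""
    rng = np.random.default_rng(seed)
    o = np.zeros(n_steps + 1)
    for k in range(n_steps):
        o[k + 1] = o[k] - theta * o[k] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return o

def pathwise_euler(f, x0, noise, dt):
    """Classical explicit Euler applied pathwise to dx/dt = f(x, o_t):
    for a fixed noise realization, the RODE is an ODE in t."""
    x = np.empty_like(noise)
    x[0] = x0
    for k in range(len(noise) - 1):
        x[k + 1] = x[k] + dt * f(x[k], noise[k])
    return x

dt, n = 1e-3, 1000
o = ou_path(n, dt)
x = pathwise_euler(lambda x, o_t: -x + np.sin(o_t), x0=1.0, noise=o, dt=dt)
```

Because the sampled path is only Hölder continuous in t, this scheme converges at a lower order than Euler's classical order one, which is exactly the gap the book's Taylor-like expansions address.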
As science becomes increasingly computational, the limits of what is computationally tractable become a barrier to scientific progress. Many scientific problems, however, are amenable to human problem solving skills that complement computational power. By leveraging these skills on a larger scale, beyond the relatively few individuals currently engaged in scientific inquiry, there is the potential for new scientific discoveries. This book presents a framework for mapping open scientific problems into video games. The game framework combines computational power with human problem solving and creativity to work toward solving scientific problems that neither computers nor humans could previously solve alone. To maximize the potential contributors to scientific discovery, the framework designs a game to be played by people with no formal scientific background and incentivizes long-term engagement with a myriad of collaborative or competitive reward structures. The framework allows for the continual coevolution of the players and the game: as players gain expertise through gameplay, the game changes to become a better tool. The framework is validated by being applied to proteomics problems with the video game Foldit. Foldit players have contributed to novel discoveries in protein structure prediction, protein design, and protein structure refinement algorithms. The coevolution of human problem solving and computer tools in an incentivized game framework is an exciting new scientific pathway that can lead to discoveries currently unreachable by other methods.
Tourism is one of the leading industries worldwide. The magnitude of growth in tourism will bring both opportunities and problems to source and destination markets in years to come, especially in the internal and external exchange of information in the industry. "Information and Communication Technologies in Support of the Tourism Industry" examines the process of transformation as it relates to the tourism industry, and the changes to that industry from modern electronic communications. The book covers not only supportive technologies in communication, but also cultural, economic, marketing, social, and regional issues. In-depth analyses range from the use of the Internet to supply information to the emerging patterns of tourist decision making and investments.
Physically unclonable functions (PUFs) are innovative physical security primitives that produce unclonable and inherent instance-specific measurements of physical objects; in many ways they are the inanimate equivalent of biometrics for human beings. Since they are able to securely generate and store secrets, they allow us to bootstrap the physical implementation of an information security system. In this book the author discusses PUFs in all their facets: the multitude of their physical constructions, the algorithmic and physical properties which describe them, and the techniques required to deploy them in security applications. The author first presents an extensive overview and classification of PUF constructions, with a focus on so-called intrinsic PUFs. He identifies subclasses, implementation properties, and design techniques used to amplify submicroscopic physical distinctions into observable digital response vectors. He lists the useful qualities attributed to PUFs and captures them in descriptive definitions, identifying the truly PUF-defining properties in the process, and he also presents the details of a formal framework for deploying PUFs and similar physical primitives in cryptographic reductions. The author then describes a silicon test platform carrying different intrinsic PUF structures which was used to objectively compare their reliability, uniqueness, and unpredictability based on experimental data. In the final chapters, the author explains techniques for PUF-based entity identification, entity authentication, and secure key generation. He proposes practical schemes that implement these techniques, and derives and calculates measures for assessing different PUF constructions in these applications based on the quality of their response statistics. 
Finally, he presents a fully functional prototype implementation of a PUF-based cryptographic key generator, demonstrating the full benefit of using PUFs and the efficiency of the processing techniques described. This is a suitable introduction and reference for security researchers and engineers, and graduate students in information security and cryptography.
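The comparison of PUF constructions by reliability and uniqueness mentioned above rests on standard response statistics; a minimal sketch (my illustration, not the author's code) computes the two usual figures of merit from fractional Hamming distances: inter-device distance for uniqueness (ideally near 0.5) and intra-device distance across repeated readouts for reliability (ideally near 0). The response length, device count, and noise rate below are arbitrary assumptions.

```python
import numpy as np
from itertools import combinations

def frac_hd(a, b):
    """Fractional Hamming distance between two equal-length bit arrays."""
    return float(np.mean(a != b))

def uniqueness(responses):
    """Mean pairwise inter-device fractional HD (ideal value: 0.5)."""
    return float(np.mean([frac_hd(a, b) for a, b in combinations(responses, 2)]))

def reliability(reference, remeasurements):
    """1 minus mean intra-device fractional HD over repeated readouts (ideal: 1.0)."""
    return 1.0 - float(np.mean([frac_hd(reference, r) for r in remeasurements]))

rng = np.random.default_rng(42)
# Hypothetical 128-bit responses from 8 devices, plus 5 noisy re-reads of device 0
devices = [rng.integers(0, 2, 128) for _ in range(8)]
noisy = [np.where(rng.random(128) < 0.03, 1 - devices[0], devices[0]) for _ in range(5)]
u = uniqueness(devices)
r = reliability(devices[0], noisy)
```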
The book presents laboratory experiments concerning ARM microcontrollers, and discusses the architecture of the Tiva Cortex-M4 ARM microcontrollers from Texas Instruments, describing various ways of programming them. Given the meager peripherals and sensors available on the kit, the authors describe the design of Padma, a circuit board with a large set of peripherals and sensors that connects to the Tiva Launchpad and exploits the Tiva microcontroller family's on-chip features. ARM microcontrollers, which are classified as 32-bit devices, are currently the most popular of all microcontrollers. They cover a wide range of applications that extend from traditional 8-bit devices to 32-bit devices. Of the various ARM subfamilies, Cortex-M4 is a middle-level microcontroller that lends itself well to data acquisition and control as well as digital signal manipulation applications. Given the prominence of ARM microcontrollers, it is important that they should be incorporated in academic curriculums. However, there is a lack of up-to-date teaching material: textbooks and comprehensive laboratory manuals. In this book each of the microcontroller's resources (digital input and output, timers and counters, serial communication channels, analog-to-digital conversion, interrupt structure and power management features) is addressed in a set of more than 70 experiments to help teach a full semester course on these microcontrollers. Beyond these physical interfacing exercises, it describes an inexpensive BoB (break out board) that allows students to learn how to design and build standalone projects, as well as a number of illustrative projects.
This book constitutes the refereed post-conference proceedings of the 10th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2016, held in Dongying, China, in October 2016. The 55 revised papers presented were carefully reviewed and selected from 128 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including intelligent sensing, cloud computing, key technologies of the Internet of Things, and precision agriculture; animal husbandry information technology, including Internet + modern animal husbandry, livestock big data platforms and cloud computing applications, intelligent breeding equipment, and precision production models; and aquaculture networking and big data, including fishery IoT, intelligent aquaculture facilities, and big data applications.
This book questions the relevance of computation to the physical universe. Our theories deliver computational descriptions, but the gaps and discontinuities in our grasp suggest a need for continued discourse between researchers from different disciplines, and this book is unique in its focus on the mathematical theory of incomputability and its relevance for the real world. The core of the book consists of thirteen chapters in five parts on extended models of computation; the search for natural examples of incomputable objects; mind, matter, and computation; the nature of information, complexity, and randomness; and the mathematics of emergence and morphogenesis. This book will be of interest to researchers in the areas of theoretical computer science, mathematical logic, and philosophy.
This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.
With the proliferation of Software-as-a-Service (SaaS) offerings, it is becoming increasingly important for individual SaaS providers to operate their services at a low cost. This book investigates SaaS from the perspective of the provider and shows how operational costs can be reduced by using "multi-tenancy," a technique for consolidating a large number of customers onto a small number of servers. Specifically, the book addresses multi-tenancy on the database level, focusing on in-memory column databases, which are the backbone of many important new enterprise applications. For efficiently implementing multi-tenancy in a farm of databases, two fundamental challenges must be addressed: (i) workload modeling and (ii) data placement. The first involves estimating the (shared) resource consumption for multi-tenancy on a single in-memory database server. The second consists in assigning tenants to servers in a way that minimizes the number of required servers (and thus costs) based on the assumed workload model. This step also entails replicating tenants for performance and high availability. This book presents novel solutions to both problems.
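The data placement problem described above is, at its core, a bin-packing problem: pack tenants (items with estimated loads) onto as few servers (bins with fixed capacity) as possible. A minimal sketch using the classic first-fit-decreasing heuristic, not the book's own algorithm; the load values and capacity are illustrative assumptions.

```python
def place_tenants(loads, capacity):
    """First-fit-decreasing heuristic: sort tenants by estimated load
    (largest first), assign each to the first server with enough remaining
    capacity, and open a new server only when none fits."""
    servers = []  # each server: {"free": remaining capacity, "tenants": ids}
    for tid, load in sorted(enumerate(loads), key=lambda t: -t[1]):
        for s in servers:
            if s["free"] >= load:
                s["free"] -= load
                s["tenants"].append(tid)
                break
        else:
            servers.append({"free": capacity - load, "tenants": [tid]})
    return servers

# Hypothetical per-tenant load estimates as fractions of one server's capacity
loads = [0.6, 0.3, 0.5, 0.2, 0.4]
servers = place_tenants(loads, capacity=1.0)
```

A production placement step would additionally account for shared-resource interference between co-located tenants (the workload-modeling problem) and for tenant replicas, which this sketch ignores.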
This book gathers threads that have evolved across different mathematical disciplines into a seamless narrative. It deals with condition as a main aspect in the understanding of the performance (regarding both stability and complexity) of numerical algorithms. While the role of condition was shaped in the last half-century, so far there has not been a monograph treating this subject in a uniform and systematic way. The book puts special emphasis on the probabilistic analysis of numerical algorithms via the analysis of the corresponding condition. The level of exposition increases over the course of the book, starting in the context of linear algebra at an undergraduate level and reaching in its third part the recent developments and partial solutions for Smale's 17th problem, which can be explained within a graduate course. Its middle part contains a condition-based course on linear programming that fills a gap between the current elementary expositions of the subject based on the simplex method and those focusing on convex programming.
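The role of condition in stability can be seen in the textbook example of solving a linear system: the condition number kappa(A) = ||A|| ||A^{-1}|| bounds how much a relative perturbation of the data can be amplified in the solution. A standard illustration (not taken from the book), using a nearly singular 2x2 matrix:

```python
import numpy as np

# Nearly singular, hence ill-conditioned: kappa(A) is about 4e4
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
kappa = np.linalg.cond(A)

b = np.array([2.0, 2.0001])
x = np.linalg.solve(A, b)          # exact solution is [1, 1]

db = np.array([0.0, 1e-4])         # tiny perturbation of the right-hand side
x_pert = np.linalg.solve(A, b + db)

# Ratio of relative solution error to relative input error; bounded by kappa
amplification = (np.linalg.norm(x_pert - x) / np.linalg.norm(x)) \
              / (np.linalg.norm(db) / np.linalg.norm(b))
```

A perturbation of the data on the order of 1e-4 moves the solution from [1, 1] to roughly [0, 2], an amplification of the same order of magnitude as kappa(A).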
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Gründe") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. A rigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. Requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors to Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas, exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
This book features selected papers presented at the 2nd International Conference on Advanced Computing Technologies and Applications, held at SVKM's Dwarkadas J. Sanghvi College of Engineering, Mumbai, India, from 28 to 29 February 2020. Covering recent advances in next-generation computing, the book focuses on recent developments in intelligent computing, such as linguistic computing, statistical computing, data computing and ambient applications.
This book introduces new logic primitives for electronic design automation tools. The author approaches fundamental EDA problems from a different, unconventional perspective, in order to demonstrate the key role of rethinking EDA solutions in overcoming technological limitations of present and future technologies. The author discusses techniques that improve the efficiency of logic representation, manipulation and optimization tasks by taking advantage of majority and biconditional logic primitives. Readers will be enabled to accelerate formal methods by studying core properties of logic circuits and developing new frameworks for logic reasoning engines.
This book first focuses on the explanation of the theory of focal mechanisms and moment tensor solutions and their role in modern seismology. The second part of the book compiles several state-of-the-art case studies in different seismotectonic settings of the planet. The assessment of seismic hazard and the reduction of losses due to future earthquakes is probably the most important contribution of seismology to society. In this regard, the reliable determination of the seismic source, and of its uncertainty, can play a key role in contributing to geodynamic investigation, seismic hazard assessment and earthquake studies. In the last two decades, the use of waveforms recorded at local-to-regional distances has increased considerably. Waveform modeling has also been used to estimate faulting parameters of small-to-moderate sized earthquakes.
Evolutionary algorithms constitute a class of well-known algorithms, which are designed based on the Darwinian theory of evolution and the Mendelian theory of heredity. They are based partly on random and partly on deterministic principles. Due to this nature, it is challenging to predict and control their performance in solving complex nonlinear problems. Recently, the study of evolutionary dynamics has focused not only on the traditional investigations but also on understanding and analyzing new principles, with the intention of controlling and utilizing their properties and performances toward more effective real-world applications. This book, based on many years of intensive research by the authors, proposes novel ideas about advancing evolutionary dynamics towards new phenomena, including many new topics, even the dynamics of equivalent social networks. It includes more advanced complex networks and incorporates them with CMLs (coupled map lattices), which are usually used for the simulation and analysis of spatiotemporal complex systems, based on the observation that since chaos in a CML can be controlled, so can evolutionary dynamics. All the chapter authors are, to the best of our knowledge, originators of the ideas mentioned above and researchers on evolutionary algorithms and chaotic dynamics as well as complex networks, who will provide benefits to the readers regarding modern scientific research on related subjects.
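The CMLs mentioned above have a standard minimal form: a ring of logistic maps with diffusive nearest-neighbor coupling, x_{n+1}(i) = (1 - eps) f(x_n(i)) + (eps/2) [f(x_n(i-1)) + f(x_n(i+1))]. A sketch of one such lattice (a generic textbook CML, not the book's specific models; lattice size and parameters are arbitrary):

```python
import numpy as np

def cml_step(x, eps=0.3, r=4.0):
    """One step of a diffusively coupled logistic-map lattice with periodic
    boundaries. With r = 4 the local map f(x) = r*x*(1-x) maps [0, 1] into
    itself, and the convex combination keeps the lattice state in [0, 1]."""
    f = r * x * (1.0 - x)
    return (1.0 - eps) * f + 0.5 * eps * (np.roll(f, 1) + np.roll(f, -1))

rng = np.random.default_rng(1)
x = rng.random(64)            # random initial lattice state in [0, 1]
for _ in range(200):
    x = cml_step(x)
```

Varying the coupling strength eps moves such a lattice through frozen, pattern-forming and turbulent regimes, which is the kind of controllable spatiotemporal chaos the book connects to evolutionary dynamics.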
This book illustrates how to use description logic-based formalisms to their full potential in the creation, indexing, and reuse of multimedia semantics. To do so, it introduces researchers to multimedia semantics by providing an in-depth review of state-of-the-art standards, technologies, ontologies, and software tools. It draws attention to the importance of formal grounding in the knowledge representation of multimedia objects and the potential of multimedia reasoning in intelligent multimedia applications, and presents both theoretical discussions and best practices in multimedia ontology engineering. Readers already familiar with mathematical logic, the Internet, and multimedia fundamentals will learn to develop formally grounded multimedia ontologies, and map concept definitions to high-level descriptors. The core reasoning tasks, reasoning algorithms, and industry-leading reasoners are presented, while scene interpretation via reasoning is also demonstrated. Overall, this book offers readers an essential introduction to the formal grounding of web ontologies, as well as a comprehensive collection and review of description logics (DLs) from the perspectives of expressivity and reasoning complexity. It covers best practices for developing multimedia ontologies with formal grounding to guarantee decidability and obtain the desired level of expressivity while maximizing the reasoning potential. The capabilities of such multimedia ontologies are demonstrated by DL implementations with an emphasis on multimedia reasoning applications.
In this monograph we introduce and examine four new temporal logic formalisms that can be used as specification languages for the automated verification of the reliability of hardware and software designs with respect to a desired behavior. The work is organized in two parts. In the first part, two logics for computations are discussed: graded computation tree logic and computation tree logic with minimal model quantifiers. These have proved to be useful in describing correct executions of monolithic closed systems. The second part focuses on logics for strategies, strategy logic and memoryful alternating-time temporal logic, which have been successfully applied to formalize several properties of interactive plays in multi-entity systems modeled as multi-agent games.
This edited volume collects the research results presented at the 14th International Symposium on Computer Methods in Biomechanics and Biomedical Engineering, Tel Aviv, Israel, 2016. The topical focus includes, but is not limited to, cardiovascular fluid dynamics, computer modeling of tissue engineering, skin and spine biomechanics, as well as biomedical image analysis and processing. The target audience primarily comprises research experts in the field of bioengineering, but the book may also benefit graduate students.
This thesis deals with topological orders from two different perspectives: from a condensed matter point of view, where topological orders are considered breakthrough phases of matter; and from the emerging realm of quantum computation, where topological quantum codes are considered the most appealing platform against decoherence. The thesis reports remarkable studies from both sides. It thoroughly investigates a topological order called the double semion model, a counterpart of the Kitaev model but exhibiting richer quasiparticles as excitations. A new model for symmetry enriched topological order is constructed, which adds an onsite global symmetry to the double semion model. Using this topological phase, a new example of topological code is developed, the semion code, which is non-CSS, additive, non-Pauli and within the stabiliser formalism. Furthermore, the thesis analyses the Rashba spin-orbit coupling within topological insulators, turning the helical edge states into generic edge modes with potential application in spintronics. New types of topological superconductors are proposed and the novel properties of the correspondingly created Majorana fermions are investigated. These Majorana fermions have inherent properties enabling braiding and the performance of logical gates as fundamental blocks for a universal quantum computer.
This collection of peer-reviewed workshop papers provides comprehensive coverage of cutting-edge research into topological approaches to data analysis and visualization. It encompasses the full range of new algorithms and insights, including fast homology computation, comparative analysis of simplification techniques, and key applications in materials and medical science. The book also addresses core research challenges such as the representation of large and complex datasets, and integrating numerical methods with robust combinatorial algorithms. In keeping with the focus of the TopoInVis 2017 Workshop, the contributions reflect the latest advances in finding experimental solutions to open problems in the sector. They provide an essential snapshot of state-of-the-art research, helping researchers to keep abreast of the latest developments and providing a basis for future work. Gathering papers by some of the world's leading experts on topological techniques, the book represents a valuable contribution to a field of growing importance, with applications in disciplines ranging from engineering to medicine.
This book presents the mathematical background underlying security modeling in the context of next-generation cryptography. By introducing new mathematical results in order to strengthen information security, while simultaneously presenting fresh insights and developing the respective areas of mathematics, it is the first-ever book to focus on areas that have not yet been fully exploited for cryptographic applications such as representation theory and mathematical physics, among others. Recent advances in cryptanalysis, brought about in particular by quantum computation and physical attacks on cryptographic devices, such as side-channel analysis or power analysis, have revealed the growing security risks for state-of-the-art cryptographic schemes. To address these risks, high-performance, next-generation cryptosystems must be studied, which requires the further development of the mathematical background of modern cryptography. More specifically, in order to avoid the security risks posed by adversaries with advanced attack capabilities, cryptosystems must be upgraded, which in turn relies on a wide range of mathematical theories. This book is suitable for use in an advanced graduate course in mathematical cryptography, while also offering a valuable reference guide for experts.
Voronoi diagrams partition space according to the influence certain sites exert on their environment. Since the 17th century, such structures have played an important role in many areas such as astronomy, physics, chemistry, biology, ecology, economics, mathematics and computer science. They help to describe zones of political influence, to determine the hospital nearest to an accident site, to compute collision-free paths for mobile robots, to reconstruct curves and surfaces from sample points, to refine triangular meshes, and to design location strategies for competing markets. This unique book offers a state-of-the-art view of Voronoi diagrams and their structure, and it provides efficient algorithms towards their computation. Readers with an entry-level background in algorithms can enjoy a guided tour of gently increasing difficulty through a fascinating area. Lecturers might find this volume a welcome source for their courses on computational geometry. Experts are offered a broader view, including many alternative solutions, and up-to-date references to the existing literature; they might benefit in their own research or application development.
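The defining rule of a Voronoi diagram, that every point belongs to the cell of its nearest site, can be sketched directly by brute force (a generic illustration, not one of the book's efficient algorithms; the site and query coordinates are arbitrary):

```python
import numpy as np

def voronoi_labels(points, sites):
    """Assign each query point the index of its nearest site (Euclidean
    distance, brute force). Points sharing a label lie in that site's
    Voronoi cell. O(n*m), versus O(n log n) for the sweep/divide-and-conquer
    constructions a computational geometry text develops."""
    d = np.linalg.norm(points[:, None, :] - sites[None, :, :], axis=2)
    return np.argmin(d, axis=1)

sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
queries = np.array([[0.1, 0.1], [0.9, 0.1], [0.1, 0.9], [0.6, 0.4]])
labels = voronoi_labels(queries, sites)
```

This nearest-site rule is exactly the "hospital nearest to an accident site" query mentioned above, evaluated pointwise.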