An exciting aspect of contemporary legal scholarship is a concern for law from a global perspective across all legal fields. The book draws upon examples from North America, Western Europe, Africa, Asia, Eastern Europe, and Latin America. It refers to the basic private law fields of torts, property, contracts, and family law. It also refers to the basic public law fields of constitutional law, administrative law, criminal law, and international law. It analyzes diverse legal policy problems from a perspective that is designed to produce solutions whereby conservatives, liberals, and other major viewpoints can all come out ahead of their best initial expectations simultaneously. Such solutions can be considered an important part of an innovative concept of justice that emphasizes being effective, efficient, and equitable simultaneously, rather than compromising on any of those justice components. Another exciting aspect of contemporary legal scholarship is a concern for the use of modern technology in the form of microcomputer software that can be helpful in law teaching, practice, and research. Computer-aided instruction can supplement the case method by using what-if analysis to make changes in the goals to be achieved, alternative decisions available for achieving them, the factual relations, and other inputs to see how the decisions might change with changes in those inputs. Computer-aided law practice can be helpful in counseling, negotiation, mediation, case analysis, legal policy evaluation, and advocacy. Computer-aided research can be helpful in testing deductive or statistical models to determine how well they can explain variance across the judicial process or other legal processes.
This book is open access under a CC BY 4.0 license. This easy-to-read book introduces the basics of solving partial differential equations by means of finite difference methods. Unlike many of the traditional academic works on the topic, this book was written for practitioners. Accordingly, it especially addresses: the construction of finite difference schemes, formulation and implementation of algorithms, verification of implementations, analyses of physical behavior as implied by the numerical solutions, and how to apply the methods and software to solve problems in the fields of physics and biology.
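As a flavor of the kind of scheme the blurb mentions, here is a minimal sketch (not taken from the book itself) of an explicit finite difference method for the 1D heat equation u_t = a·u_xx with zero Dirichlet boundaries; the function name and parameters are illustrative assumptions:

```python
import numpy as np

# Explicit scheme (forward Euler in time, central differences in space) for
# u_t = a * u_xx on [0, L] with u = 0 at both ends. All names here are
# illustrative; the book's own examples and software are not reproduced.
def solve_heat_explicit(u0, a=1.0, L=1.0, T=0.1, nx=21, nt=400):
    dx = L / (nx - 1)
    dt = T / nt
    F = a * dt / dx**2          # mesh Fourier number; stability requires F <= 0.5
    assert F <= 0.5, "explicit scheme unstable for this dt/dx combination"
    u = u0.copy()
    for _ in range(nt):
        # interior update: u_i^{n+1} = u_i^n + F * (u_{i+1}^n - 2 u_i^n + u_{i-1}^n)
        u[1:-1] = u[1:-1] + F * (u[2:] - 2 * u[1:-1] + u[:-2])
        u[0] = u[-1] = 0.0      # Dirichlet boundary conditions
    return u

x = np.linspace(0.0, 1.0, 21)
u = solve_heat_explicit(np.sin(np.pi * x))
```

For the initial condition sin(pi*x) the exact solution at time T is exp(-pi^2 T) sin(pi*x), which gives a quick verification target in the spirit of the book's emphasis on verifying implementations.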
Biological and biomedical studies have entered a new era over the past two decades thanks to the wide use of mathematical models and computational approaches. The booming of computational biology, which was sheer theoretician's fantasy twenty years ago, has become a reality. The embrace of computational biology and theoretical approaches is evidenced in articles hailing the arrival of what are variously called quantitative biology, bioinformatics, theoretical biology, and systems biology. New technologies and data resources in genetics, such as the International HapMap project, enable large-scale studies, such as genome-wide association studies, which could potentially identify most common genetic variants as well as rare variants of human DNA that may alter an individual's susceptibility to disease and response to medical treatment. Meanwhile, multi-electrode recording from behaving animals makes it feasible to control the animal's mental activity, which could potentially lead to the development of useful brain-machine interfaces. Embracing the sheer volume of genetic, genomic, and other types of data, an essential approach is, first of all, to avoid drowning the true signal in the data. The theoretical approach to biology has emerged as a powerful and stimulating research paradigm in biological studies, which in turn leads to a new research paradigm in mathematics, physics, and computer science, and moves forward with the interplay among experimental studies and outcomes, simulation studies, and theoretical investigations.
This book presents a collection of research papers that address the challenge of how to develop software in a principled way that, in particular, enables reasoning. The individual papers approach this challenge from various perspectives including programming languages, program verification, and the systematic variation of software. Topics covered include programming abstractions for concurrent and distributed software, specification and verification techniques for imperative programs, and development techniques for software product lines. With this book the editors and authors wish to acknowledge - on the occasion of his 60th birthday - the work of Arnd Poetzsch-Heffter, who has made major contributions to software technology throughout his career. It features articles on Arnd's broad research interests including, among others, the implementation of programming languages, formal semantics, specification and verification of object-oriented and concurrent programs, programming language design, distributed systems, software modeling, and software product lines. All contributing authors are leading experts in programming languages and software engineering who have collaborated with Arnd in the course of his career. Overall, the book offers a collection of high-quality articles, presenting original research results, major case studies, and inspiring visions. Some of the work included here was presented at a symposium in honor of Arnd Poetzsch-Heffter, held in Kaiserslautern, Germany, in November 2018.
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: evolutionary constraints, relaxation of selection mechanisms, diversity preservation strategies, flexing fitness evaluation, evolution in dynamic environments, multi-objective and multi-modal selection, foundations of evolvability, evolvable and adaptive evolutionary operators, foundation of injecting expert knowledge in evolutionary search, analysis of problem difficulty and required GP algorithm complexity, foundations in running GP on the cloud - communication, cooperation, flexible implementation, and ensemble methods. Additional focal points for GP symbolic regression are: (1) The need to guarantee convergence to solutions in the function discovery mode; (2) Issues on model validation; (3) The need for model analysis workflows for insight generation based on generated GP solutions - model exploration, visualization, variable selection, dimensionality analysis; (4) Issues in combining different types of data. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretical developments, new computational algorithms, and multidisciplinary applications. Special features of this volume: - Presents results and approximation methods in various computational settings, including polynomial and orthogonal systems, analytic functions, and differential equations; - Provides a historical overview of approximation theory and many of its subdisciplines; - Contains new results from diverse areas of research spanning mathematics, engineering, and the computational sciences. "Approximation and Computation" is intended for mathematicians and researchers focusing on approximation theory and numerical analysis, but can also be a valuable resource to students and researchers in the computational and applied sciences.
This book focuses on the different representations and cryptographic properties of Boolean functions and presents constructions of Boolean functions with good cryptographic properties. More specifically, Walsh spectrum descriptions of the traditional cryptographic properties of Boolean functions, including linear structure, propagation criterion, nonlinearity, and correlation immunity, are presented. Constructions of symmetric Boolean functions and of Boolean permutations with good cryptographic properties are specifically studied. This book is not meant to be comprehensive, but rather focuses on some of the authors' past original research. To be self-contained, some basic concepts and properties are introduced. This book can serve as a reference for cryptographic algorithm designers, particularly the designers of stream ciphers and of block ciphers, and for academics with an interest in the cryptographic properties of Boolean functions.
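The Walsh spectrum mentioned above follows a standard definition: W_f(a) = sum over x of (-1)^(f(x) XOR a·x), and the nonlinearity of f is 2^(n-1) minus half the maximum absolute spectrum value. A minimal sketch (the example function is an illustrative choice, not drawn from the book):

```python
# Walsh spectrum of a Boolean function given as a truth table of length 2**n.
# These are the standard textbook definitions; the sample function below is
# purely illustrative.
def walsh_spectrum(tt, n):
    spec = []
    for a in range(2**n):
        s = 0
        for x in range(2**n):
            dot = bin(a & x).count("1") & 1   # inner product a.x over F_2
            s += (-1) ** (tt[x] ^ dot)
        spec.append(s)
    return spec

def nonlinearity(tt, n):
    # NL(f) = 2^(n-1) - max_a |W_f(a)| / 2
    return 2 ** (n - 1) - max(abs(w) for w in walsh_spectrum(tt, n)) // 2

# Example: f(x1, x2, x3) = x1*x2 XOR x3 on 3 variables (a quadratic function)
n = 3
tt = [(((x >> 2) & 1) & ((x >> 1) & 1)) ^ (x & 1) for x in range(2**n)]
```

By Parseval's relation the squared spectrum values sum to 2^(2n), which is a handy sanity check on any implementation.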
This monograph illustrates important notions in security reductions and essential techniques in security reductions for group-based cryptosystems. Using digital signatures and encryption as examples, the authors explain how to program correct security reductions for those cryptographic primitives. Various schemes are selected and re-proven in this book to demonstrate and exemplify correct security reductions. This book is suitable for researchers and graduate students engaged with public-key cryptography.
Images are all around us. The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery that is possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However, this book does not so much focus on images per se, but rather on spatial data sets, with one or more measurements taken over a two- or higher-dimensional space, to which standard image-processing algorithms may not apply. This text develops many important data analysis methods for such statistical image problems. Examples abound throughout remote sensing (satellite data mapping, data assimilation, climate-change studies, land use), medical imaging (organ segmentation, anomaly detection), computer vision (image classification, segmentation), and other 2D/3D problems (biological imaging, porous media). The goal of this text, then, is to address methods for solving multidimensional statistical problems. The text strikes a balance between mathematics and theory on the one hand and applications and algorithms on the other, by deliberately developing the basic theory (Part I), the mathematical modeling (Part II), and the algorithmic and numerical methods (Part III) of solving a given problem. The particular emphases of the book include inverse problems, multidimensional modeling, random fields, and hierarchical methods.
Blockchain and other trustless systems have gone from being relatively obscure technologies, which were only known to a small community of computer scientists and cryptologists, to mainstream phenomena that are now considered powerful game changers for many industries. This book explores and assesses real-world use cases and case studies on blockchain and related technologies. The studies describe the respective applications and address how these technologies have been deployed, the rationale behind their application, and finally, their outcomes. The book shares a wealth of experiences and lessons learned regarding financial markets, energy, SCM, healthcare, law and compliance. Given its scope, it is chiefly intended for academics and practitioners who want to learn more about blockchain applications.
This book highlights the current challenges for engineers involved in product development and the associated changes in procedure they make necessary. Methods for systematically analyzing the requirements for safety and security mechanisms are described, using examples of how they are implemented in software and hardware, and it is discussed how their effectiveness can be demonstrated in terms of functional and design safety. Given today's new E-mobility and automated driving approaches, new challenges are arising, and further issues concerning "Road Vehicle Safety" and "Road Traffic Safety" have to be resolved. To address the growing complexity of vehicle functions, as well as the increasing need to accommodate interdisciplinary project teams, previous development approaches now have to be reconsidered, and systems engineering approaches and proven management systems need to be supplemented or wholly redefined. The book presents a continuous system development process, starting with the basic requirements of quality management and continuing until the release of a vehicle and its components for road use. Attention is paid to the necessary definition of the respective development item, the threat, hazard and risk analysis, and safety concepts and their relation to architecture development, while the book also addresses the aspects of product realization in mechanics, electronics and software, as well as the subsequent testing, verification, integration and validation phases. In November 2011, requirements for the Functional Safety (FuSa) of road vehicles were first published in ISO 26262. The processes and methods described here are intended to show developers how vehicle systems can be implemented according to ISO 26262, so that their compliance with the relevant standards can be demonstrated as part of a safety case, including audits, reviews and assessments.
It is clear that computation is playing an increasingly prominent role in the development of mathematics, as well as in the natural and social sciences. The work of Stephen Wolfram over the last several decades has been a salient part of this phenomenon, helping to found the field of Complex Systems, with many of the constructs and ideas presented in his book A New Kind of Science (NKS) becoming part of the scientific discourse and general academic knowledge - from the now established Elementary Cellular Automata to the unconventional concept of mining the Computational Universe, from today's widespread Wolfram Behavioural Classification to his principles of Irreducibility and Computational Equivalence. This volume, with a Foreword by Gregory Chaitin and an Afterword by Cris Calude, covers these and other topics related to or motivated by Wolfram's seminal ideas, reporting on research undertaken in the decade following the publication of the NKS book. Featuring 39 authors, its 23 contributions are organized into seven parts: Mechanisms in Programs & Nature; Systems Based on Numbers & Simple Programs; Social and Biological Systems & Technology; Fundamental Physics; The Behavior of Systems & the Notion of Computation; Irreducibility & Computational Equivalence; and Reflections and Philosophical Implications.
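The Elementary Cellular Automata mentioned above are simple enough to sketch in a few lines: each cell's next state depends only on its 3-cell neighborhood, and the 8 possible neighborhoods are mapped to outputs by an 8-bit rule number. The ring size and seed below are arbitrary choices for illustration:

```python
# One step of an Elementary Cellular Automaton on a ring of cells.
# The next state of cell i is bit (4*left + 2*center + right) of the rule
# number. Rule 30 is the classic chaotic example from A New Kind of Science.
def eca_step(cells, rule=30):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Single-cell seed on a ring of 11 cells, evolved for a few steps.
row = [0] * 5 + [1] + [0] * 5
history = [row]
for _ in range(4):
    row = eca_step(row)
    history.append(row)
```

Printing `history` row by row reproduces the familiar triangular, irregular pattern that made Rule 30 a centerpiece of Wolfram's behavioural classification.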
The main objective of pervasive computing systems is to create environments where computers become invisible by being seamlessly integrated and connected into our everyday environment, where such embedded computers can then provide information and exercise intelligent control when needed, but without being obtrusive. Pervasive computing and intelligent multimedia technologies are becoming increasingly important to the modern way of living. However, many of their potential applications have not yet been fully realized. Intelligent multimedia allows dynamic selection, composition and presentation of the most appropriate multimedia content based on user preferences. A variety of applications of pervasive computing and intelligent multimedia are being developed for all walks of personal and business life. Pervasive computing (often synonymously called ubiquitous computing, palpable computing or ambient intelligence) is an emerging field of research that brings in revolutionary paradigms for computing models in the 21st century. Pervasive computing is the trend towards increasingly ubiquitous connected computing devices in the environment, a trend being brought about by a convergence of advanced electronic - and particularly, wireless - technologies and the Internet. Recent advances in pervasive computers, networks, telecommunications and information technology, along with the proliferation of multimedia mobile devices - such as laptops, iPods, personal digital assistants (PDAs) and cellular telephones - have further stimulated the development of intelligent pervasive multimedia applications. These key technologies are creating a multimedia revolution that will have significant impact across a wide spectrum of consumer, business, healthcare and governmental domains.
* Draws on work across multiple disciplines, from astrobiology and physics to linguistics and the social sciences, making it appealing to graduates from a wide variety of fields. * The first accessible introduction to the important work of philosopher Howard Pattee. * Aims to equip readers with new approaches to simple and complex systems theory to take into their respective disciplines.
This unique textbook/reference presents unified coverage of bioinformatics topics relating to both biological sequences and biological networks, providing an in-depth analysis of cutting-edge distributed algorithms, as well as of relevant sequential algorithms. In addition to introducing the latest algorithms in this area, more than fifteen new distributed algorithms are also proposed. Topics and features: reviews a range of open challenges in biological sequences and networks; describes in detail both sequential and parallel/distributed algorithms for each problem; suggests approaches for distributed algorithms as possible extensions to sequential algorithms, when the distributed algorithms for the topic are scarce; proposes a number of new distributed algorithms in each chapter, to serve as potential starting points for further research; concludes each chapter with self-test exercises, a summary of the key points, a comparison of the algorithms described, and a literature review.
Improved geospatial instrumentation and technology, such as laser scanning, now result in the collection of millions of data points, e.g., point clouds. It is in recognition that such huge amounts of data require efficient and robust mathematical solutions that this third edition of the book extends the second edition by introducing three new chapters: Robust parameter estimation, Multiobjective optimization and Symbolic regression. Furthermore, the linear homotopy chapter is expanded to include nonlinear homotopy. These disciplines are discussed first in the theoretical part of the book, before their geospatial applications are illustrated in the applications chapters, where numerous numerical examples are presented. The renewed electronic supplement contains these new theoretical and practical topics, with the corresponding Mathematica statements and functions supporting their computations introduced and applied. This third edition is renamed in light of these technological advancements.
In this essay collection, leading physicists, philosophers, and historians attempt to fill the empty theoretical ground in the foundations of information and address the related question of the limits to our knowledge of the world. Over recent decades, our practical approach to information and its exploitation has radically outpaced our theoretical understanding - to such a degree that reflection on the foundations may seem futile. But it is exactly fields such as quantum information, which are shifting the boundaries of the physically possible, that make a foundational understanding of information increasingly important. One of the recurring themes of the book is the claim by Eddington and Wheeler that information involves interaction and putting agents or observers centre stage. Thus, physical reality, in their view, is shaped by the questions we choose to put to it and is built up from the information residing at its core. This is the root of Wheeler's famous phrase "it from bit." After reading the stimulating essays collected in this volume, readers will be in a good position to decide whether they agree with this view.
This book provides a general overview of multiple instance learning (MIL), defining the framework and covering the central paradigms. The authors discuss the most important algorithms for MIL tasks such as classification, regression and clustering. With a focus on classification, a taxonomy is set out and the most relevant proposals are specified. Efficient algorithms are developed to discover relevant information when working with uncertainty. Key representative applications are included. The book also studies the key related fields of distance metrics and alternative hypotheses. Chapters examine new and developing aspects of MIL such as data reduction for multi-instance problems and imbalanced MIL data. Class imbalance for multi-instance problems is defined at the bag level, a representation that embraces ambiguity: bag labels are available, but the labels of the individual instances are not. Additionally, multiple instance multiple label learning is explored. This learning framework introduces flexibility and ambiguity into the object representation, providing a natural formulation for representing complicated objects: an object is represented by a bag of instances and is allowed to have multiple associated class labels simultaneously. This book is suitable for developers and engineers working to apply MIL techniques to solve a variety of real-world problems. It is also useful for researchers or students seeking a thorough overview of MIL literature, methods, and tools.
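The bag-level labeling described above can be sketched under the standard multiple-instance assumption: a bag is positive if and only if at least one of its instances is positive. The threshold-based "instance model" below is a hypothetical stand-in, purely for illustration:

```python
# Minimal sketch of the standard multiple-instance (MI) assumption.
# The simple threshold rule standing in for an instance classifier is a
# hypothetical choice for illustration, not a method from the book.
def instance_positive(instance, threshold=0.5):
    return max(instance) > threshold

def bag_label(bag, threshold=0.5):
    # Only the bag-level label is produced; the labels of individual
    # instances remain unobserved, which is the defining MIL ambiguity.
    return int(any(instance_positive(inst, threshold) for inst in bag))

bags = [
    [(0.1, 0.2), (0.9, 0.3)],   # contains one positive instance -> bag label 1
    [(0.1, 0.2), (0.3, 0.4)],   # all instances negative         -> bag label 0
]
labels = [bag_label(b) for b in bags]
```

Class imbalance at the bag level then simply means that one of the two bag labels is much rarer than the other, regardless of how instances are distributed inside the bags.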
The latest work by the world's leading authorities on the use of formal methods in computer science is presented in this volume, based on the 1995 International Summer School in Marktoberdorf, Germany. Logic is of special importance in computer science, since it provides the basis for giving correct semantics of programs, for specification and verification of software, and for program synthesis. The lectures presented here provide the basic knowledge a researcher in this area should have and give excellent starting points for exploring the literature. Topics covered include semantics and category theory, machine based theorem proving, logic programming, bounded arithmetic, proof theory, algebraic specifications and rewriting, algebraic algorithms, and type theory.
The last decade has witnessed a rapid surge of interest in new sensing and monitoring devices for wellbeing and healthcare. One key development in this area is wireless, wearable and implantable "in vivo" monitoring and intervention. A myriad of platforms are now available from both academic institutions and commercial organisations. They permit the management of patients with both acute and chronic symptoms, including diabetes, cardiovascular diseases, and the treatment of epilepsy and other debilitating neurological disorders. Despite extensive developments in sensing technologies, there are significant research issues related to system integration, sensor miniaturisation, low-power sensor interfaces, wireless telemetry and signal processing. In the 2nd edition of this popular and authoritative reference on Body Sensor Networks (BSN), major topics related to the latest technological developments and potential clinical applications are discussed, with contents covering: Biosensor Design, Interfacing and Nanotechnology; Wireless Communication and Network Topologies; Communication Protocols and Standards; Energy Harvesting and Power Delivery; Ultra-low Power Bio-inspired Processing; Multi-sensor Fusion and Context Aware Sensing; Autonomic Sensing; Wearable, Ingestible Sensor Integration and Exemplar Applications; and System Integration and Wireless Sensor Microsystems. The book also provides a comprehensive review of the current wireless sensor development platforms and a step-by-step guide to developing your own BSN applications using the BSN development kit.
This book provides a comprehensive presentation of the most advanced research results and technological developments enabling the understanding, qualification and mitigation of soft errors in advanced electronics. It covers the fundamental physical mechanisms of radiation-induced soft errors; the various steps that lead to a system failure; the modelling and simulation of soft errors at various levels (including physical, electrical, netlist, event-driven, RTL, and system-level modelling and simulation); hardware fault injection; accelerated radiation testing and natural environment testing; soft-error-oriented test structures; and process-level, device-level, cell-level, circuit-level, architectural-level, software-level and system-level soft error mitigation techniques. These advances are presented by academia and industry experts in reliability, fault tolerance, EDA, processor, SoC and system design - in particular, experts from industries that have faced the soft error impact in terms of product reliability and related business issues, and who were at the forefront of the countermeasures taken at multiple levels to mitigate soft error effects at a cost acceptable for commercial products. In this fast-moving field, where the impact on ground-level electronics is very recent and its severity is steadily increasing at each new process node, impacting one industry sector after another (as an example, the Automotive Electronics Council has come to publish qualification requirements on soft errors), research, technology developments and industrial practices have evolved very fast, outdating the most recent books, published in 2004.
In this book you will learn all of the basics required to rig any character in Maya. The book covers everything from joints to wires, from the Connection Editor to pruning small weights. With over 30 example files, 200 images, and countless step-by-step tutorials, you will be shown exactly how to rig a foot and leg, a hand, and much more. What separates this book from the competition is that it answers the question "Why?": What exactly is gimbal lock, and how do you avoid it? Which axis should you use as your primary one? How do I add an influence object to fix a rig already in progress, and why would I use a joint in some cases? Knowing why things are done the way they are will allow you to innovate and create new rigs, troubleshoot someone else's rig, and progress far beyond being a beginner. This book will help the beginner build a solid foundation and is a great addition for any character rigger using Maya. Proceeds are donated to charity.
This book presents the first paradigm of social multimedia computing developed entirely from the user perspective. Unlike traditional multimedia and web multimedia computing, which are content-centric, social multimedia computing arises from the participatory Web 2.0 and is essentially user-centric. The goal of this book is to emphasize the user factor in facilitating effective solutions for multimedia content analysis, user modeling and customized user services. Advanced topics such as cross-network social multimedia computing are also introduced as extensions and potential directions along this research line.
This book gathers selected high-quality research papers presented at the Arab Conference for Emerging Technologies 2020, held virtually in Cairo during 21-23 June 2020. It emphasizes the role and recent developments of emerging technologies, artificial intelligence, and related technologies, with a special focus on sustainable development in the Arab world. The book targets high-quality scientific research papers with applications, including theory, practice, prototypes, new ideas, case studies and surveys, covering machine learning applications in data science.