This self-contained essay collection is published to commemorate half a century of Bell's theorem. Like its much-acclaimed predecessor "Quantum [Un]Speakables: From Bell to Quantum Information" (published 2002), it comprises essays by many of the world's leading quantum physicists and philosophers. These revisit the foundations of quantum theory and elucidate the remarkable progress in quantum technologies achieved over the last two decades. Fundamental concepts such as entanglement, nonlocality and contextuality are described in an accessible manner and, alongside lively descriptions of the various theoretical and experimental approaches, the book also delivers interesting philosophical insights. The collection as a whole will serve as a broad introduction for students and newcomers, while delighting the scientifically literate general reader.
The world-class National Palace Museum (NPM) in Taiwan holds the largest collection of Chinese cultural treasures of outstanding quality. By restructuring its organization and shifting its operational focus from being object-oriented to public-centered, it aims to capture public attention and promote awareness of Chinese culture and traditions. In this vein, the NPM combines its expertise in museum services with the possibilities afforded by information technology (IT). This book analyses the findings of a research team sponsored by the National Science Council in Taiwan, which observed the NPM's development processes and accomplishments and conducted scientific research covering not only the technology and management disciplines but also the humanities and social sciences. The NPM's development of new digital content and IT-enabled services offers a useful benchmark for museums, cultural and creative organizations, and traditional organizations in Taiwan and around the world.
Everything you know about the future is wrong. Presumptive Design: Design Provocations for Innovation is for people "inventing" the future: future products, services, companies, strategies and policies. It introduces a design-research method that shortens time to insights from months to days. Presumptive Design is a fundamentally agile approach to identifying your audiences' key needs. Offering rapidly crafted artifacts, your teams collaborate with your customers to identify preferred and profitable elements of your desired outcome. Presumptive Design focuses on your users' problem space, informing your business strategy, your project's early-stage definition, and your innovation pipeline. Combining design theory with case studies and how-tos, the book offers business leaders, managers and innovators the benefits of design thinking and user experience in the context of early-stage problem definition. Presumptive Design is an advanced technique and quick to use: within days of reading this book, your research and design teams can apply the approach to capture a risk-reduced view of your future.
We are now entering an era where the human world assumes recognition of itself as data. Much of humanity's basis for existence is becoming subordinate to software processes that tabulate, index, and sort the relations that comprise what we perceive as reality. The acceleration of data collection threatens to relinquish ephemeral modes of representation to ceaseless processes of computation. This situation compels the human world to form relations with non-human agencies, to establish exchanges with software processes in order to allow a profound upgrade of our own ontological understanding. By mediating with a higher intelligence, we may be able to rediscover the inner logic of the age of intelligent machines. In The End of the Future, Stephanie Polsky conceives an understanding of the digital through its dynamic intersection with the advent and development of the nation-state, race, colonization, navigational warfare, mercantilism, capitalism, and the mathematical sciences over the past five centuries, the era during which the world became "modern." The book animates the twenty-first century as an era in which the screen has split off from itself and proliferated onto multiple surfaces, allowing an inverted image of totalitarianism to flash up and be altered to support our present condition of binary apperception. It progresses through a recognition of atomized political power, whose authority lies in the control not of the means of production but of information, and in which digital media now serve to legitimize and promote a customized micropolitics of identity management. On this new apostolate plane, humanity may be able to shape a new world in which each human soul is captured and reproduced as an autonomous individual bearing affects and identities.
The digital infrastructure of the twenty-first century makes it possible for power to operate through an esoteric mathematical means, and for factual material to be manipulated in the interest of advancing the means of control. This volume travels a course from Elizabethan England, to North American slavery, through cybernetic Social Engineering, Cold War counterinsurgency, and the (neo)libertarianism of Silicon Valley in order to arrive at a place where an organizing intelligence that started from an ambition to resourcefully manipulate physical bodies has ended with their profound neutralization.
This text presents an algebraic approach to the construction of several important families of quantum codes derived from classical codes, applying the well-known Calderbank-Shor-Steane (CSS), Hermitian, and Steane enlargement constructions to certain classes of classical codes. In addition, the book presents families of asymmetric quantum codes with good parameters and provides a detailed description of the procedures adopted to construct families of asymmetric quantum convolutional codes. Featuring accessible language and clear explanations, the book is suitable for use in advanced undergraduate and graduate courses as well as for self-guided study and reference. It provides an expert introduction to algebraic techniques of code construction and, because all of the constructions are performed algebraically, it enables the reader to construct families of codes, rather than only codes with specific parameters. The text offers an abundance of worked examples, exercises, and open-ended problems to motivate the reader to further investigate this rich area of inquiry. End-of-chapter summaries and a glossary of key terms allow for easy review and reference.
In information technology, unlike many other fields, the need to support the unique perspective of technologically advanced students and deliver technology-rich content presents unique challenges. Today's IT students need the ability to interact with their instructor in near-real time, interact with their peers and project team members, and access and manipulate technology tools in the pursuit of their educational objectives. "Handbook of Distance Learning for Real-Time and Asynchronous Information Technology Education" delves deep into the construct of real-time, asynchronous education through information technology, pooling experiences from seasoned researchers and educators to detail their past successes and failures, discussing their techniques, hardships, and triumphs in the search for innovative and effective distance learning education for IT programs. This Premier Reference Source answers the increasing demand for a fundamental, decisive source on this cutting-edge issue facing all institutions, covering topics such as asynchronous communication, real-time instruction, multimedia content, content delivery, and distance education technologies.
This unique text/reference provides an overview of crossbar-based interconnection networks, offering novel perspectives on these important components of high-performance, parallel-processor systems. A particular focus is placed on solutions to the blocking and scalability problems. Topics and features: introduces the fundamental concepts in interconnection networks in multi-processor systems, including issues of blocking, scalability, and crossbar networks; presents a classification of interconnection networks, and provides information on recognizing each of the networks; examines the challenges of blocking and scalability, and analyzes the different solutions that have been proposed; reviews a variety of different approaches to improve fault tolerance in multistage interconnection networks; discusses the scalable crossbar network, which is a non-blocking interconnection network that uses small-sized crossbar switches as switching elements. This invaluable work will be of great benefit to students, researchers and practitioners interested in computer networks, parallel processing and reliability engineering. The text is also essential reading for course modules on interconnection network design and reliability.
This book addresses a number of questions from the perspective of complex systems: How can we quantitatively understand life phenomena? How can we model life systems as complex bio-molecular networks? Are there methods to clarify the relationships among the structures, dynamics and functions of bio-molecular networks? How can we statistically analyse large-scale bio-molecular networks? Focusing on the modeling and analysis of bio-molecular networks, the book presents various sophisticated mathematical and statistical approaches. The life system can be described using various levels of bio-molecular networks, including gene regulatory networks and protein-protein interaction networks. The book first provides an overview of approaches to reconstructing various bio-molecular networks, and then discusses the modeling and dynamical analysis of simple genetic circuits, coupled genetic circuits, and middle-sized and large-scale biological networks, clarifying the relationships between the structures, dynamics and functions of the networks covered. In the context of large-scale bio-molecular networks, it introduces a number of statistical methods for exploring important bioinformatics applications, including the identification of significant bio-molecules for network medicine and genetic engineering. Lastly, the book describes various state-of-the-art statistical methods for analysing omics data generated by high-throughput sequencing. This book is a valuable resource for readers interested in applying systems biology, dynamical systems or complex networks to explore the truth of nature.
The book provides a comprehensive introduction and a novel mathematical foundation of the field of information geometry, with complete proofs and detailed background material on measure theory, Riemannian geometry and Banach space theory. Parametrised measure models are defined as fundamental geometric objects, which can be either finite or infinite dimensional. Based on these models, canonical tensor fields are introduced and further studied, including the Fisher metric and the Amari-Chentsov tensor, and embeddings of statistical manifolds are investigated. This novel foundation then leads to application highlights, such as generalizations and extensions of the classical uniqueness result of Chentsov or the Cramér-Rao inequality. Additionally, several new application fields of information geometry are highlighted, for instance hierarchical and graphical models, complexity theory, population genetics, and Markov Chain Monte Carlo. The book will be of interest to mathematicians interested in geometry, information theory, or the foundations of statistics, to statisticians, as well as to scientists interested in the mathematical foundations of complex systems.
This book examines construction safety from the perspective of informatics and econometrics. It demonstrates the potential of employing various information technology approaches to share construction safety knowledge. In addition, it presents the application of econometrics in construction safety studies, such as an analytic hierarchy process used to create a construction safety index. It also discusses structural equation and dynamic panel models for the analysis of construction safety claims. Lastly, it describes the use of mathematical and econometric models to investigate construction practitioners' safety.
This book offers readers an accessible introduction to quantum computing and to the design of corresponding devices. The authors cover several design tasks that are important for quantum computing and introduce corresponding solutions. A special feature of the book is that these tasks and solutions are explicitly discussed from a design automation perspective, i.e., utilizing clever algorithms and data structures that were developed by the design automation community for conventional logic (i.e., for electronic devices and systems) and are now applied to this new technology. As a result, relevant design tasks can be conducted far more efficiently than before, yielding improvements of several orders of magnitude (with respect to runtime and other design objectives). Describes the current state of the art for designing quantum circuits, for simulating them, and for mapping them to real hardware; Provides a first comprehensive introduction to design automation for quantum computing that tackles practically relevant tasks; Targets the quantum computing community as well as the design automation community, showing both perspectives on quantum computing, and what impressive improvements are possible when combining the knowledge of both communities.
Handbook of Research on Ambient Intelligence and Smart Environments: Trends and Perspectives covers the cutting-edge aspects of Ambient Intelligence (AmI) applications, specifically those involving the effective design, realization, and implementation of a comprehensive AmI application. This pertinent publication targets researchers and practitioners in Ambient Intelligence, as well as those in ubiquitous and pervasive computing, artificial intelligence, sensor networks, knowledge representation, automated reasoning and learning, system and software engineering, and man-machine interfaces.
This book is an introduction to the methodology and practice of the analysis, design and implementation of distributed health information systems. Special attention is dedicated to the security and interoperability of such systems, as well as to advanced electronic health record approaches. The book considers both available architectures and implementations and current and future innovations. To this end, the component paradigm, UML, XML and eHealth are discussed in a concise way. Many practical solutions first specified and implemented in the author's environment are presented in greater detail. The book addresses information scientists, administrators, health professionals, managers and other users of health information systems.
This book discusses the fusion of mobile and WiFi network data with semantic technologies and diverse context sources for offering semantically enriched context-aware services in the telecommunications domain. It presents the OpenMobileNetwork as a platform for providing estimated and semantically enriched mobile and WiFi network topology data using the principles of Linked Data. This platform is based on the OpenMobileNetwork Ontology consisting of a set of network context ontology facets that describe mobile network cells as well as WiFi access points from a topological perspective and geographically relate their coverage areas to other context sources. The book also introduces Linked Crowdsourced Data and its corresponding Context Data Cloud Ontology, which is a crowdsourced dataset combining static location data with dynamic context information. Linked Crowdsourced Data supports the OpenMobileNetwork by providing the necessary context data richness for more sophisticated semantically enriched context-aware services. Various application scenarios and proof of concept services as well as two separate evaluations are part of the book. As the usability of the provided services closely depends on the quality of the approximated network topologies, it compares the estimated positions for mobile network cells within the OpenMobileNetwork to a small set of real-world cell positions. The results prove that context-aware services based on the OpenMobileNetwork rely on a solid and accurate network topology dataset. The book also evaluates the performance of the exemplary Semantic Tracking as well as Semantic Geocoding services, verifying the applicability and added value of semantically enriched mobile and WiFi network data.
This pioneering book presents new models for the thermomechanical behavior of composite materials and structures, taking into account internal physico-chemical transformations such as thermodecomposition, sublimation and melting at high temperatures (up to 3000 K). It is of great importance for the design of new thermostable materials and for the investigation of the reliability and fire safety of composite structures. It also supports the investigation of the interaction of composites with laser irradiation and the design of heat-shield systems. Structural methods are presented for calculating the effective mechanical and thermal properties of matrices, fibres, and unidirectional, dispersed-particle-reinforced and textile composites, in terms of the properties of their constituent phases. Useful calculation methods are developed for characteristics such as the rate of thermomechanical erosion of composites under high-speed flow and the heat deformation of composites, accounting for chemical shrinkage. The author extensively compares modeling results with experimental data, and readers will find unique experimental results on the mechanical and thermal properties of composites at temperatures up to 3000 K. Chapters show how the behavior of composite shells under high temperatures is simulated by the finite-element method, and cylindrical, axisymmetric composite shells and composite plates are investigated under local high-temperature heating. The book will be of interest to researchers and to engineers designing composite structures, and invaluable to materials scientists developing advanced performance thermostable materials.
Note: This is the second printing. It contains all of the corrections as of May 2017 as well as an updated back cover. Roger Wagner's Assembly Lines articles originally appeared in Softalk magazine from October 1980 to June 1983. The first fifteen articles were reprinted in 1982 in Assembly Lines: The Book. Now, for the first time, all thirty-three articles are available in one complete volume. This edition also contains all of the appendices from the original book as well as new appendices on the 65C02, zero-page memory usage, and a beginner's guide to using the Merlin Assembler. The book is designed for students of all ages: the nostalgic programmer enjoying the retro revolution, the newcomer interested in learning low-level assembly coding, or the embedded systems developer using the latest 65C02 chips from Western Design Center. "Roger Wagner didn't just read the first book on programming the Apple computer - he wrote it." - Steve Wozniak
This treatise presents an integrated perspective on the interplay of set theory and graph theory, providing an extensive selection of examples that highlight how methods from one theory can be used to better solve problems originating in the other. Features: explores the interrelationships between sets and graphs and their applications to finite combinatorics; introduces the fundamental graph-theoretical notions from the standpoint of both set theory and dyadic logic, and presents a discussion on set universes; explains how sets can conveniently model graphs, discussing set graphs and set-theoretic representations of claw-free graphs; investigates when it is convenient to represent sets by graphs, covering counting and encoding problems, the random generation of sets, and the analysis of infinite sets; presents excerpts of formal proofs concerning graphs, whose correctness was verified by means of an automated proof-assistant; contains numerous exercises, examples, definitions, problems and insight panels.
Several Python programming books feature tools designed for experimental psychologists. What sets this book apart is its focus on eye-tracking. Eye-tracking is a widely used research technique in psychology and neuroscience labs. Research-grade eye-trackers are typically faster, more accurate, and of course more expensive than the ones seen in consumer goods or usability labs. Not surprisingly, a successful eye-tracking study usually requires sophisticated computer programming. Easy syntax and flexibility make Python a perfect choice for this task, especially for psychology researchers with little or no computer programming experience. This book offers detailed coverage of the Pylink library, a Python interface for the gold-standard EyeLink® eye-trackers, with many step-by-step example scripts. This book is a useful reference for eye-tracking researchers, but it can also be used as a textbook for graduate-level programming courses.
This is a volume of chapters on the historical study of information, computing, and society written by seven of the most senior, distinguished members of the History of Computing field. These are edited, expanded versions of papers presented in a distinguished lecture series in 2018 at the University of Colorado Boulder - in the shadow of the Flatirons, the front range of the Rocky Mountains. Topics range widely across the history of computing. They include the digitalization of computer and communication technologies, gender history of computing, the history of data science, incentives for innovation in the computing field, labor history of computing, and the process of standardization. Authors were given wide latitude to write on a topic of their own choice, so long as the result is an exemplary article that represents the highest level of scholarship in the field, producing articles that scholars in the field will still look to read twenty years from now. The intention is to publish articles of general interest, well situated in the research literature, well grounded in source material, and well-polished pieces of writing. The volume is primarily of interest to historians of computing, but individual articles will be of interest to scholars in media studies, communication, computer science, cognitive science, general and technology history, and business.
Threshold graphs have a beautiful structure and possess many important mathematical properties. They have applications in many areas including computer science and psychology. Over the last 20 years the interest in threshold graphs has increased significantly, and the subject continues to attract much attention. The book contains many open problems and research ideas which will appeal to graduate students and researchers interested in graph theory. But above all "Threshold Graphs and Related Topics" provides a valuable source of information for all those working in this field.
Based on current developments in business informatics, the authors present concepts that can serve as guidelines for the future development of an application system encompassing operational systems, data warehouse, information, expert, and data mining systems.
The world of corporate management benefits when organizations realize the profitability, reliability, and flexibility obtained through IT standardization. Toward Corporate IT Standardization Management: Frameworks and Solutions details the IT standards conceptual model through insightful case studies that illustrate the factors affecting the performance of business processes. By offering organizations the opportunity to enhance process performance through IT standardization, this reference work demonstrates the effectiveness of IT standards, and the applicable techniques for implementation and management of such practices. This book features information useful to educators and students in the fields of Information Systems, IT-Management, Business Studies, and Economics, as well as IT practitioners and IS Managers.
This volume presents a selection of reports from scientific projects requiring high-end computing resources on the Hitachi SR8000-F1 supercomputer operated by the Leibniz Computing Center in Munich. All reports were presented at the joint HLRB and KONWIHR workshop at the Technical University of Munich in October 2002. The following areas of scientific research are covered: Applied Mathematics, Biosciences, Chemistry, Computational Fluid Dynamics, Cosmology, Geosciences, High-Energy Physics, Informatics, Nuclear Physics, Solid-State Physics. Moreover, projects from interdisciplinary research within the KONWIHR framework (Competence Network for Scientific High Performance Computing in Bavaria) are also included. Each report summarizes its scientific background and discusses the results, with special consideration of the quantity and quality of Hitachi SR8000 resources needed to complete the research.
Ontologies and formal representations of knowledge are extremely powerful tools for modeling and managing large applications in several domains, ranging from knowledge engineering to data mining to the semantic web. Ontology Theory, Management and Design: Advanced Tools and Models explores the wide range of applications for ontologies, while providing a complete view of both the theory behind the design and the problems posed by the practical development and use of ontologies. This reference presents an in-depth and forward-looking analysis of current research, illustrating the importance of this field and pointing toward the future of knowledge engineering, management and information technology.