This thesis devotes three introductory chapters to outlining basic recipes for constructing the quantum Hamiltonian of an arbitrary superconducting circuit, starting from classical circuit design. Since a superconducting circuit is one of the most promising platforms for realizing a practical quantum computer, anyone who is starting out in the field will benefit greatly from this introduction. The second focus of the introduction is the ultrastrong light-matter interaction (USC), where the latest developments are described. This is followed by three main research works comprising quantum memory in USC; scaling up the 1D circuit to a 2D lattice configuration; creation of Noisy Intermediate-Scale Quantum era quantum error correction codes and polariton-mediated qubit-qubit interaction. The research work detailed in this thesis will make a major contribution to the development of quantum random access memory, a prerequisite for various quantum machine learning algorithms and applications.
This volume collects contributions written by different experts in honor of Prof. Jaime Munoz Masque. It covers a wide variety of research topics, from differential geometry to algebra, but particularly focuses on the geometric formulation of variational calculus; geometric mechanics and field theories; symmetries and conservation laws of differential equations, and pseudo-Riemannian geometry of homogeneous spaces. It also discusses algebraic applications to cryptography and number theory. It offers state-of-the-art contributions in the context of current research trends. The final result is a challenging panoramic view of connecting problems that initially appear distant.
Knowledge and Technology Management in Virtual Organizations: Issues, Trends, Opportunities and Solutions presents a collection of the most recent contributions in the areas of organization, knowledge, and technology management in the context of virtual enterprises. This book contains important and in-depth information on four dimensions: semantic, managerial, technological, and social. The semantic dimensions covered in this book are ontological and organizational approaches, concepts, organizational models, and knowledge management models. With respect to the managerial dimension, this book covers process management, integration management, relationship management, process integration, knowledge management, technology integration management, and information integration. "Knowledge and Technology Management in Virtual Organizations: Issues, Trends, Opportunities and Solutions" presents the technological dimension by explaining the infrastructures and technologies to support technology and information integration standards and protocols. Lastly, this title highlights the social dimension, including human resources management, human resources integration, social issues, social impact, social requirements, and communities of knowledge.
This work presents the Clifford-Cauchy-Dirac (CCD) technique for solving problems involving the scattering of electromagnetic radiation from materials of all kinds. It allows anyone who is interested to master techniques that lead to simpler and more efficient solutions to problems of electromagnetic scattering than are currently in use. The technique is formulated in terms of the Cauchy kernel, single integrals, Clifford algebra and a whole-field approach. This is in contrast to many conventional techniques that are formulated in terms of Green's functions, double integrals, vector calculus and the combined field integral equation (CFIE). Whereas these conventional techniques lead to an implementation using the method of moments (MoM), the CCD technique is implemented as alternating projections onto convex sets in a Banach space. The ultimate outcome is an integral formulation that lends itself to a more direct and efficient solution than conventionally is the case, and applies without exception to all types of materials. On any particular machine, it results in either a faster solution for a given problem or the ability to solve problems of greater complexity. The Clifford-Cauchy-Dirac technique offers very real and significant advantages in uniformity, complexity, speed, storage, stability, consistency and accuracy.
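The alternating-projection scheme the blurb mentions can be illustrated in miniature. The sketch below (a hypothetical toy case, not the electromagnetic CCD formulation itself) runs the classic projections-onto-convex-sets iteration with two lines in the plane as the convex sets; the iterates converge to a point in their intersection.

```python
# Alternating projections onto two convex sets (POCS), the general
# iteration pattern the CCD technique builds on. The sets here are
# illustrative: the lines x + y = 2 and x - y = 0 in R^2, whose
# intersection is the single point (1, 1).

def project_onto_line(point, a, b, c):
    """Orthogonally project (x, y) onto the line a*x + b*y = c."""
    x, y = point
    d = (a * x + b * y - c) / (a * a + b * b)
    return (x - a * d, y - b * d)

def alternating_projections(start, iterations=50):
    """Repeatedly project onto set A, then set B."""
    p = start
    for _ in range(iterations):
        p = project_onto_line(p, 1.0, 1.0, 2.0)   # project onto set A
        p = project_onto_line(p, 1.0, -1.0, 0.0)  # project onto set B
    return p

x, y = alternating_projections((5.0, -3.0))
print(x, y)  # converges to the intersection point (1, 1)
```

In the full CCD method the "sets" are constraint sets in a Banach space of fields rather than lines in the plane, but the fixed-point structure of the iteration is the same.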
Ecological Assessment of Polymers: Strategies for Product Stewardship and Regulatory Programs, by John D. Hamilton and Roger Sutcliffe. The expense of providing ecological assessments of new commercial products is formidable. The cost of failing to comply with current regulations--measured in fines, liability damages, and loss of public trust--is potentially much, much higher. Establishing effective environmental product stewardship strategies for assessment upfront not only promotes initial and continued compliance, it can reduce costs through the more efficient development of new products. Based on the collaboration of the Rohm and Haas Company and S.C. Johnson Wax with other manufacturers, contract laboratories, universities, and government agencies, Ecological Assessment of Polymers is the first complete reference to provide environment-oriented information about polymers from a product development and regulatory compliance perspective. A number of books deal with the potential hazards of pesticides and solvents; this is the first to focus on the commercial synthetic polymers found in laundry detergents, paints, super-absorbent diapers, packaging materials, and many other consumer and industrial products. Using the principles of environmental toxicology and chemistry, Ecological Assessment of Polymers approaches environmental evaluation as a decision-making process. The book demonstrates how assessment can be used as a planning tool for developing products, reducing potential liability, and creating new products, processes, and disposal systems.
The study of network theory is a highly interdisciplinary field, which has emerged as a major topic of interest in various disciplines ranging from physics and mathematics to biology and sociology. This book promotes the diverse nature of the study of complex networks by balancing the needs of students from very different backgrounds. It references the most commonly used concepts in network theory, provides examples of their applications in solving practical problems, and gives clear guidance on how to analyse the results. In the first part of the book, students and researchers will discover the quantitative and analytical tools necessary to work with complex networks, including the most basic concepts in network and graph theory, linear and matrix algebra, as well as the physical concepts most frequently used for studying networks. They will also find instruction in key skills such as how to prove analytic results and how to manipulate empirical network data. The bulk of the text is focused on instructing readers in the most useful tools for modern practitioners of network theory. These include degree distributions, random networks, network fragments, centrality measures, clusters and communities, communicability, and local and global properties of networks. The combination of theory, example, and method presented in this text should ready the student to conduct their own analysis of networks with confidence and allow teachers to select appropriate examples and problems to teach this subject in the classroom.
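The degree distribution mentioned above is the simplest of these tools, and a minimal sketch shows what it computes. The edge list below is a hypothetical toy graph (not from the book); only the standard library is used.

```python
# Degree distribution of a small undirected network: for each degree k,
# the fraction P(k) of nodes that have exactly k neighbours.
from collections import Counter

# Hypothetical undirected edge list for a 5-node toy graph.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (3, 4)]

# Count each node's degree (each edge contributes to both endpoints).
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# P(k) = (number of nodes with degree k) / (total number of nodes)
n = len(degree)
distribution = {k: c / n for k, c in sorted(Counter(degree.values()).items())}
print(distribution)
```

For real data one would typically use a library such as NetworkX, but the underlying computation is exactly this counting step.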
This unique collection of research papers offers a comprehensive and up-to-date guide to algebraic approaches to rough sets and reasoning with vagueness. It bridges important gaps, outlines intriguing future research directions, and connects algebraic approaches to rough sets with those for other forms of approximate reasoning. In addition, the book reworks algebraic approaches to axiomatic granularity. Given its scope, the book offers a valuable resource for researchers and teachers in the areas of rough sets and algebras of rough sets, algebraic logic, non-classical logic, fuzzy sets, possibility theory, formal concept analysis, computational learning theory, category theory, and other formal approaches to vagueness and approximate reasoning. Consultants in AI and allied fields will also find the book to be of great practical value.
This textbook intends to be a comprehensive and substantially self-contained two-volume work covering performance, reliability, and availability evaluation subjects. The volumes focus on computing systems, although the methods may also be applied to other systems. The first volume covers Chapter 1 to Chapter 14 and is subtitled "Performance Modeling and Background". The second volume encompasses Chapter 15 to Chapter 25 and has the subtitle "Reliability and Availability Modeling, Measuring and Workload, and Lifetime Data Analysis". This text helps computer performance professionals support the planning, design, configuration, and tuning of the performance, reliability, and availability of computing systems. Such professionals may use these volumes to get acquainted with specific subjects by looking at the particular chapters. Many examples in the textbook on computing systems will help them understand the concepts covered in each chapter. The text may also be helpful for instructors who teach performance, reliability, and availability evaluation subjects. Many possible threads could be configured according to the interest of the audience and the duration of the course. Chapter 1 presents a number of possible course programs that could be organized using this text.
The CMOS Cookbook contains all you need to know to understand and successfully use CMOS (Complementary Metal-Oxide Semiconductor) integrated circuits. Written in a "cookbook" format that requires little math, this practical, user-oriented book covers all the basics for working with digital logic and many of its end applications.
Today, fuzzy methods provide tools to handle data sets in relevant, robust and interpretable ways, making it possible to model and exploit imprecision and uncertainty in data modeling and data mining. Scalable Fuzzy Algorithms for Data Management and Analysis: Methods and Design presents innovative, cutting-edge fuzzy techniques that highlight the relevance of fuzziness for huge data sets in the perspective of scalability issues, from both a theoretical and experimental point of view. It covers a wide scope of research areas including data representation, structuring and querying as well as information retrieval and data mining. It encompasses different forms of databases, including data warehouses, data cubes, tabular or relational data, and many applications among which music warehouses, video mining, bioinformatics, semantic web and data streams.
This is a student solutions manual for Elementary Number Theory with Applications 1st edition by Thomas Koshy (2002). Note that the textbook itself is not included in this purchase. From the back cover of the textbook: Modern technology has brought a new dimension to the power of number theory: constant practical use. Once considered the purest of pure mathematics, number theory has become an essential tool in the rapid development of technology in a number of areas, including art, coding theory, cryptology, and computer science. The range of fascinating applications confirms the boundlessness of human ingenuity and creativity. Elementary Number Theory captures the author's fascination for the subject: its beauty, elegance, and historical development, and the opportunities number theory provides for experimentation, exploration, and, of course, its marvelous applications.
Chapters "Turing and Free Will: A New Take on an Old Debate" and "Turing and the History of Computer Music" are available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.
This book examines a writing activity that has recently fallen into disrepute. Outlining has a bad reputation among students, even though many teachers and textbooks still recommend the process. In part, the author argues, the medium is to blame: paper and ink make revision difficult. But if one uses an electronic outliner, the activity can be very helpful in developing a thoughtful and effective document, particularly one that spans many pages and deals with a complicated subject. Outlining Goes Electronic takes a historical approach, examining the way people developed the idea of outlining, from the classical period to the present. We see that the medium in which people worked strongly shaped their assumptions, ideas, and use of outlines. In developing a theoretical model of outlining as an activity, the author argues that a relatively new electronic tool (software that accelerates and performs the process of outlining) can give us a new perspective from which to engage previous classroom models of writing, recent writing theory, and current practice in the technical writing field.
This book serves not only as an introduction, but also as an advanced text and reference source in the field of deterministic optimal control systems governed by ordinary differential equations. It also includes an introduction to the classical calculus of variations.
We are now entering an era where the human world assumes recognition of itself as data. Much of humanity's basis for existence is becoming subordinate to software processes that tabulate, index, and sort the relations that comprise what we perceive as reality. The acceleration of data collection threatens to relinquish ephemeral modes of representation to ceaseless processes of computation. This situation compels the human world to form relations with non-human agencies, to establish exchanges with software processes in order to allow a profound upgrade of our own ontological understanding. By mediating with a higher intelligence, we may be able to rediscover the inner logic of the age of intelligent machines. In The End of the Future, Stephanie Polsky conceives an understanding of the digital through its dynamic intersection with the advent and development of the nation-state, race, colonization, navigational warfare, mercantilism, and capitalism, and the mathematical sciences over the past five centuries, the era during which the world became "modern." The book animates the twenty-first century as an era in which the screen has split off from itself and proliferated onto multiple surfaces, allowing an inverted image of totalitarianism to flash up and be altered to support our present condition of binary apperception. It progresses through a recognition of atomized political power, whose authority lies in the control not of the means of production, but of information, and in which digital media now serves to legitimize and promote a customized micropolitics of identity management. On this new apostolate plane, humanity may be able to shape a new world in which each human soul is captured and reproduced as an autonomous individual bearing affects and identities. 
The digital infrastructure of the twenty-first century makes it possible for power to operate through an esoteric mathematical means, and for factual material to be manipulated in the interest of advancing the means of control. This volume travels a course from Elizabethan England, to North American slavery, through cybernetic Social Engineering, Cold War counterinsurgency, and the (neo)libertarianism of Silicon Valley in order to arrive at a place where an organizing intelligence that started from an ambition to resourcefully manipulate physical bodies has ended with their profound neutralization.
This book discusses efficient prediction techniques for the current state-of-the-art High Efficiency Video Coding (HEVC) standard, focusing on the compression of a wide range of video signals, such as 3D video, Light Fields and natural images. The authors begin with a review of the state-of-the-art predictive coding methods and compression technologies for both 2D and 3D multimedia contents, which provides a good starting point for new researchers in the field of image and video compression. New prediction techniques that go beyond the standardized compression technologies are then presented and discussed. In the context of 3D video, the authors describe a new predictive algorithm for the compression of depth maps, which combines intra-directional prediction with flexible block partitioning and linear residue fitting. New approaches are described for the compression of Light Field and still images, which enforce sparsity constraints on linear models. The Locally Linear Embedding-based prediction method is investigated for compression of Light Field images based on the HEVC technology. A new linear prediction method using sparse constraints is also described, enabling improved coding performance of the HEVC standard, particularly for images with complex textures based on repeated structures. Finally, the authors present a new, generalized intra-prediction framework for the HEVC standard, which unifies the directional prediction methods used in current video compression standards with linear prediction methods using sparse constraints. Experimental results for the compression of natural images are provided, demonstrating the advantage of the unified prediction framework over the traditional directional prediction modes used in the HEVC standard.
The main aim of this book is to discuss model order reduction (MOR) methods for differential-algebraic equations (DAEs) with linear coefficients that make use of splitting techniques before applying model order reduction. The splitting produces a system of ordinary differential equations (ODE) and a system of algebraic equations, which are then reduced separately. For the reduction of the ODE system, conventional MOR methods can be used, whereas for the reduction of the algebraic systems new methods are discussed. The discussion focuses on the index-aware model order reduction method (IMOR) and its variations, methods for which the so-called index of the original model is automatically preserved after reduction.
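The splitting step described above can be sketched in equations. The notation here is the generic one for linear DAEs (index-1 case) and is not necessarily the book's own:

```latex
E\,\dot{x}(t) = A\,x(t) + B\,u(t), \qquad E \text{ singular}.
\]
A suitable change of basis separates the state into a differential part and an algebraic part:
\[
\begin{aligned}
\dot{x}_1(t) &= A_1\,x_1(t) + B_1\,u(t)
  && \text{(ODE part: reduce with conventional MOR)}\\
0 &= A_2\,x_2(t) + B_2\,u(t)
  \;\Longrightarrow\;
  x_2(t) = -A_2^{-1} B_2\,u(t)
  && \text{(algebraic part: reduce separately)}
\end{aligned}
```

The index-aware methods discussed in the book perform this separation so that each subsystem can be reduced with techniques suited to it, while preserving the index of the original model.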
This volume examines the complex, contradictory discourses of hypertext. Using theoretical material from cultural theory, radical and border pedagogies, and technology criticism, the text discusses three primary ways hypertext is articulated: as automated book (technical communication), as virtual commodity (online databases), and as an environment for constructing and exploring multiple subject positions (postmodern hypertext in composition and literature). "I would recommend the entire book to researchers and academics who recognize the need to integrate new technologies into our classrooms and pedagogies." - Technical Communication
This book explains the most prominent and some promising new, general techniques that combine metaheuristics with other optimization methods. A first introductory chapter reviews the basic principles of local search, prominent metaheuristics, and tree search, dynamic programming, mixed integer linear programming, and constraint programming for combinatorial optimization purposes. The chapters that follow present five generally applicable hybridization strategies, with exemplary case studies on selected problems: incomplete solution representations and decoders; problem instance reduction; large neighborhood search; parallel non-independent construction of solutions within metaheuristics; and hybridization based on complete solution archives. The authors are among the leading researchers in the hybridization of metaheuristics with other techniques for optimization, and their work reflects the broad shift to problem-oriented rather than algorithm-oriented approaches, enabling faster and more effective implementation in real-life applications. This hybridization is not restricted to different variants of metaheuristics but includes, for example, the combination of mathematical programming, dynamic programming, or constraint programming with metaheuristics, reflecting cross-fertilization in fields such as optimization, algorithmics, mathematical modeling, operations research, statistics, and simulation. The book is a valuable introduction and reference for researchers and graduate students in these domains.
Conceptual modeling has always been one of the main issues in information systems engineering as it aims to describe the general knowledge of the system at an abstract level that facilitates user understanding and software development. This collection of selected papers provides a comprehensive and extremely readable overview of what conceptual modeling is and perspectives on making it more and more relevant in our society. It covers topics like modeling the human genome, blockchain technology, model-driven software development, data integration, and wiki-like repositories and demonstrates the general applicability of conceptual modeling to various problems in diverse domains. Overall, this book is a source of inspiration for everybody in academia working on the vision of creating a strong, fruitful and creative community of conceptual modelers. With this book the editors and authors want to honor Prof. Antoni Olive for his enormous and ongoing contributions to the conceptual modeling discipline. It was presented to him on the occasion of his keynote at ER 2017 in Valencia, a conference that he has contributed to and supported for over 20 years. Thank you very much to Antoni for so many years of cooperation and friendship.
The Internet generation of interculturally minded, socially networked leaders is redefining the workplace. Management is slow to respond. Asian philosophy - with concepts like ba, Zen, feng shui, and ki - is becoming increasingly important for tomorrow's leader. Blend that with the Scandinavian mindset of egalitarianism, openness, and gender equity, add sophisticated use of network effects, and you begin to understand the true logic of the Internet in the workplace. Cultural diversity and technological dependence are global trends that demand constant attention. Know:
- How to lead without being the leader
- How to adapt quickly to change
- How to thrive on diversity
- How to be a trendsetter in technology
- How to be on top but still have a life.