This book is aimed at two kinds of readers: firstly, people working in or near mathematics, who are curious about continued fractions; and secondly, senior or graduate students who would like an extensive introduction to the analytic theory of continued fractions. The book contains several recent results and new angles of approach and thus should be of interest to researchers throughout the field. The first five chapters contain an introduction to the basic theory, while the last seven chapters present a variety of applications. Finally, an appendix presents a large number of special continued fraction expansions. This very readable book also contains many valuable examples and problems.
This book presents the proceedings of the EAI International Conference on Computer Science: Applications in Engineering and Health Services (COMPSE 2019). The conference highlighted the latest research innovations and applications of algorithms designed for optimization within the fields of science, computer science, engineering, information technology, management, finance and economics, and health systems. Focusing on a variety of methods and systems as well as practical examples, this volume is a significant resource for postgraduate students, decision makers, and researchers in both the public and private sectors who are seeking research-based methods for modelling uncertain and unpredictable real-world problems.
Note: This is the second printing. It contains all of the corrections as of May 2017 as well as an updated back cover. Roger Wagner's Assembly Lines articles originally appeared in Softalk magazine from October 1980 to June 1983. The first fifteen articles were reprinted in 1982 in Assembly Lines: The Book. Now, for the first time, all thirty-three articles are available in one complete volume. This edition also contains all of the appendices from the original book as well as new appendices on the 65C02, zero-page memory usage, and a beginner's guide to using the Merlin Assembler. The book is designed for students of all ages: the nostalgic programmer enjoying the retro revolution, the newcomer interested in learning low-level assembly coding, or the embedded systems developer using the latest 65C02 chips from Western Design Center. "Roger Wagner didn't just read the first book on programming the Apple computer; he wrote it." - Steve Wozniak
This book presents selected papers from the 3rd International Workshop on Computational Engineering, held in Stuttgart from October 6 to 10, 2014, which brought together innovative contributions from related fields, with computer science and mathematics serving as an important technical basis. The workshop discussed the state of the art and the further evolution of numerical techniques for simulation in engineering and science, focusing on current trends in numerical simulation, new requirements arising from rapidly increasing parallelism in computer architectures, and novel mathematical approaches. Accordingly, the chapters of the book particularly focus on parallel algorithms and performance optimization, coupled systems, and complex applications and optimization.
The authors describe systematic methods for uncovering scientific laws a priori, on the basis of intuition, or "Gedanken Experiments". Mathematical expressions of scientific laws are, by convention, constrained by the rule that their form must be invariant with changes of the units of their variables. This constraint makes it possible to narrow down the possible forms of the laws. It is closely related to, but different from, dimensional analysis. It is a mathematical book, largely based on solving functional equations. In fact, one chapter is an introduction to the theory of functional equations.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of size and generations of a study's parameters, modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel working hardware. Bionic Optimization means finding the best solution to a problem using methods found in nature. As Evolutionary Strategies and Particle Swarm Optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them. A set of sample applications shows how Bionic Optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for Multi-Objective-Optimization. As most structural designers today use commercial software such as FE-Codes or CAE systems with integrated simulation modules, ways of integrating Bionic Optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented. The closing section focuses on an overview and outlook on reliable and robust as well as on Multi-Objective-Optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
This timely text/reference presents a comprehensive review of the workflow scheduling algorithms and approaches that are rapidly becoming essential for a range of software applications, due to their ability to efficiently leverage diverse and distributed cloud resources. Particular emphasis is placed on how workflow-based automation in software-defined cloud centers and hybrid IT systems can significantly enhance resource utilization and optimize energy efficiency. Topics and features: describes dynamic workflow and task scheduling techniques that work across multiple (on-premise and off-premise) clouds; presents simulation-based case studies, and details of real-time test bed-based implementations; offers analyses and comparisons of a broad selection of static and dynamic workflow algorithms; examines the considerations for the main parameters in projects limited by budget and time constraints; covers workflow management systems, workflow modeling and simulation techniques, and machine learning approaches for predictive workflow analytics. This must-read work provides invaluable practical insights from three subject matter experts in the cloud paradigm, which will empower IT practitioners and industry professionals in their daily assignments. Researchers and students interested in next-generation software-defined cloud environments will also greatly benefit from the material in the book.
This book shows you 1000 ways to seduce your website's visitors: turn visitors into buyers and, better still, turn customers into friends. It explains how to become a successful seller through viral marketing on the web.
The world-class National Palace Museum (NPM) in Taiwan possesses the largest repository of Chinese cultural treasures of outstanding quality. Through organizational restructuring and a shift in its operational focus from being object-oriented to public-centered, it aims to capture people's attention and promote awareness of the culture and traditions of China. In this vein, the NPM combines its expertise in museum service with the possibilities afforded by information technology (IT). This book analyses the research results of a team sponsored by the National Science Council in Taiwan, which observed the NPM's development processes and accomplishments and conducted scientific research covering not only the technology and management disciplines but also the humanities and social sciences. The development process of the NPM's new digital content and IT-enabled services offers a useful benchmark for museums, cultural and creative organizations, and traditional organizations in Taiwan and around the world.
The concepts of innovation management and the learning organization strongly emphasize the central role of human and intellectual capital in the company and the crucial function of knowledge in modern society. However, there is often a paradox between managerial language and actual practice in many organizations: on the one hand, knowledge workers are perceived as the most valued members of organizations while, on the other, they are manipulated and "engineered", commonly driven to burnout and deprived of family life. All this leads to the emergence of new organizational phenomena that, up to now, have been insufficiently analyzed and described. Management Practices in High-Tech Environments studies this issue thoroughly from an international, comparative, cross-cultural perspective, presenting cutting-edge research on management practices in American, European, Asian and Middle-Eastern high-tech companies, with a particular focus on fieldwork-driven, but reflective, contributions.
Alfred Tarski was one of the two giants of the twentieth-century development of logic, along with Kurt Goedel. The four volumes of this collection contain all of Tarski's papers and abstracts published during his lifetime, as well as a comprehensive bibliography. Here will be found many of the works, spanning the period 1921 through 1979, which are the bedrock of contemporary areas of logic, whether in mathematics or philosophy. These areas include the theory of truth in formalized languages, decision methods and undecidable theories, foundations of geometry, set theory, and model theory, algebraic logic, and universal algebra.
Recent industry surveys expect the cloud computing services market to be in excess of $20 billion and cloud computing jobs to be in excess of 10 million worldwide in 2014 alone. In addition, since a majority of existing information technology (IT) jobs are focused on maintaining legacy in-house systems, the demand for these kinds of jobs is likely to drop rapidly if cloud computing continues to take hold of the industry. However, there are very few educational options available in the area of cloud computing beyond vendor-specific training by cloud providers themselves. Cloud computing courses have not found their way (yet) into mainstream college curricula. This book is written as a textbook on cloud computing for educational programs at colleges. It can also be used by cloud service providers who may be interested in offering a broader perspective of cloud computing to accompany their own customer and employee training programs. The typical reader is expected to have completed a couple of college-level courses in programming using traditional high-level languages, and to be either a senior or a beginning graduate student in one of the science, technology, engineering or mathematics (STEM) fields. We have tried to write a comprehensive book that transfers knowledge through an immersive "hands-on approach," in which the reader is provided the necessary guidance and knowledge to develop working code for real-world cloud applications. Additional support is available at the book's website: www.cloudcomputingbook.info. The book is organized into three main parts. Part I covers technologies that form the foundations of cloud computing, including topics such as virtualization, load balancing, scalability & elasticity, deployment, and replication. Part II introduces the reader to the design & programming aspects of cloud computing.
Case studies on the design and implementation of several cloud applications in areas such as image processing, live streaming and social network analytics are provided. Part III introduces the reader to specialized aspects of cloud computing, including cloud application benchmarking, cloud security, multimedia applications and big data analytics. Case studies in areas such as IT, healthcare, transportation, networking and education are provided.
This book presents new efficient methods for optimization in realistic large-scale, multi-agent systems. These methods do not require the agents to have the full information about the system, but instead allow them to make their local decisions based only on the local information, possibly obtained during communication with their local neighbors. The book, primarily aimed at researchers in optimization and control, considers three different information settings in multi-agent systems: oracle-based, communication-based, and payoff-based. For each of these information types, an efficient optimization algorithm is developed, which leads the system to an optimal state. The optimization problems are set without such restrictive assumptions as convexity of the objective functions, complicated communication topologies, closed-form expressions for costs and utilities, and finiteness of the system's state space.
Cloud service benchmarking can provide important, sometimes surprising insights into the quality of services and leads to a more quality-driven design and engineering of complex software architectures that use such services. Starting with a broad introduction to the field, this book guides readers step-by-step through the process of designing, implementing and executing a cloud service benchmark, as well as understanding and dealing with its results. It covers all aspects of cloud service benchmarking, i.e., both benchmarking the cloud and benchmarking in the cloud, at a basic level. The book is divided into five parts: Part I discusses what cloud benchmarking is, provides an overview of cloud services and their key properties, and describes the notion of a cloud system and cloud-service quality. It also addresses the benchmarking lifecycle and the motivations behind running benchmarks in particular phases of an application lifecycle. Part II then focuses on benchmark design by discussing key objectives (e.g., repeatability, fairness, or understandability) and defining metrics and measurement methods, and by giving advice on developing own measurement methods and metrics. Next, Part III explores benchmark execution and implementation challenges and objectives as well as aspects like runtime monitoring and result collection. Subsequently, Part IV addresses benchmark results, covering topics such as an abstract process for turning data into insights, data preprocessing, and basic data analysis methods. Lastly, Part V concludes the book with a summary, suggestions for further reading and pointers to benchmarking tools available on the Web. 
The book is intended for researchers and graduate students of computer science and related subjects looking for an introduction to benchmarking cloud services, but also for industry practitioners who are interested in evaluating the quality of cloud services or who want to assess key qualities of their own implementations through cloud-based experiments.
This book presents the outcomes of the trans- and interdisciplinary research project NEMo (Nachhaltige Erfüllung von Mobilitätsbedürfnissen im ländlichen Raum - Sustainable Fulfilment of Mobility Needs in Rural Areas). Due to demographic change, it is becoming increasingly difficult for rural districts and communities to maintain a basic set of public transport services such as bus and train transit without encountering issues regarding necessary social participation, sensible regional value creation and, last but not least, achievable environmental protection goals. At the same time, the demand for mobility in rural areas will continue to rise in the future, e.g. due to the concentration of medical care facilities and shopping centres close to cities. Focusing on the development of sustainable and innovative mobility services and business models, this book explains how new mobility offers can be created in which citizens themselves become mobility providers. To do so, it combines the findings of the individual research groups with external contributions from science and practice.
This self-contained essay collection is published to commemorate half a century of Bell's theorem. Like its much acclaimed predecessor "Quantum [Un]Speakables: From Bell to Quantum Information" (published 2002), it comprises essays by many of the world's leading quantum physicists and philosophers. These revisit the foundations of quantum theory as well as elucidating the remarkable progress in quantum technologies achieved in the last couple of decades. Fundamental concepts such as entanglement, nonlocality and contextuality are described in an accessible manner and, alongside lively descriptions of the various theoretical and experimental approaches, the book also delivers interesting philosophical insights. The collection as a whole will serve as a broad introduction for students and newcomers as well as delighting the scientifically literate general reader.
Everything you know about the future is wrong. Presumptive Design: Design Provocations for Innovation is for people "inventing" the future: future products, services, companies, strategies and policies. It introduces a design-research method that shortens time to insights from months to days. Presumptive Design is a fundamentally agile approach to identifying your audiences' key needs. Offering rapidly crafted artifacts, your teams collaborate with your customers to identify preferred and profitable elements of your desired outcome. Presumptive Design focuses on your users' problem space, informing your business strategy, your project's early stage definition, and your innovation pipeline. Comprising discussions of design theory with case studies and how-to's, the book offers business leadership, management and innovators the benefits of design thinking and user experience in the context of early stage problem definition. Presumptive Design is an advanced technique and quick to use: within days of reading this book, your research and design teams can apply the approach to capture a risk-reduced view of your future.
We are now entering an era where the human world assumes recognition of itself as data. Much of humanity's basis for existence is becoming subordinate to software processes that tabulate, index, and sort the relations that comprise what we perceive as reality. The acceleration of data collection threatens to relinquish ephemeral modes of representation to ceaseless processes of computation. This situation compels the human world to form relations with non-human agencies, to establish exchanges with software processes in order to allow a profound upgrade of our own ontological understanding. By mediating with a higher intelligence, we may be able to rediscover the inner logic of the age of intelligent machines. In The End of the Future, Stephanie Polsky conceives an understanding of the digital through its dynamic intersection with the advent and development of the nation-state, race, colonization, navigational warfare, mercantilism, and capitalism, and the mathematical sciences over the past five centuries, the era during which the world became "modern." The book animates the twenty-first century as an era in which the screen has split off from itself and proliferated onto multiple surfaces, allowing an inverted image of totalitarianism to flash up and be altered to support our present condition of binary apperception. It progresses through a recognition of atomized political power, whose authority lies in the control not of the means of production, but of information, and in which digital media now serves to legitimize and promote a customized micropolitics of identity management. On this new apostolate plane, humanity may be able to shape a new world in which each human soul is captured and reproduced as an autonomous individual bearing affects and identities. 
The digital infrastructure of the twenty-first century makes it possible for power to operate through an esoteric mathematical means, and for factual material to be manipulated in the interest of advancing the means of control. This volume travels a course from Elizabethan England, to North American slavery, through cybernetic Social Engineering, Cold War counterinsurgency, and the (neo)libertarianism of Silicon Valley in order to arrive at a place where an organizing intelligence that started from an ambition to resourcefully manipulate physical bodies has ended with their profound neutralization.
This text presents an algebraic approach to the construction of several important families of quantum codes derived from classical codes by applying the well-known Calderbank-Shor-Steane (CSS), Hermitian, and Steane enlargement constructions to certain classes of classical codes. In addition, the book presents families of asymmetric quantum codes with good parameters and provides a detailed description of the procedures adopted to construct families of asymmetric quantum convolutional codes. Featuring accessible language and clear explanations, the book is suitable for use in advanced undergraduate and graduate courses as well as for self-guided study and reference. It provides an expert introduction to algebraic techniques of code construction and, because all of the constructions are performed algebraically, it enables the reader to construct families of codes, rather than only codes with specific parameters. The text offers an abundance of worked examples, exercises, and open-ended problems to motivate the reader to further investigate this rich area of inquiry. End-of-chapter summaries and a glossary of key terms allow for easy review and reference.
In information technology, unlike many other fields, the need to support the unique perspective of technologically advanced students and deliver technology-rich content presents unique challenges. Today's IT students need the ability to interact with their instructor in near-real time, interact with their peers and project team members, and access and manipulate technology tools in the pursuit of their educational objectives. "Handbook of Distance Learning for Real-Time and Asynchronous Information Technology Education" delves deep into the construct of real-time, asynchronous education through information technology, pooling experiences from seasoned researchers and educators to detail their past successes and failures, discussing their techniques, hardships, and triumphs in the search for innovative and effective distance learning education for IT programs. This Premier Reference Source answers the increasing demand for a fundamental, decisive source on this cutting-edge issue facing all institutions, covering topics such as asynchronous communication, real-time instruction, multimedia content, content delivery, and distance education technologies.
This unique text/reference provides an overview of crossbar-based interconnection networks, offering novel perspectives on these important components of high-performance, parallel-processor systems. A particular focus is placed on solutions to the blocking and scalability problems. Topics and features: introduces the fundamental concepts in interconnection networks in multi-processor systems, including issues of blocking, scalability, and crossbar networks; presents a classification of interconnection networks, and provides information on recognizing each of the networks; examines the challenges of blocking and scalability, and analyzes the different solutions that have been proposed; reviews a variety of different approaches to improve fault tolerance in multistage interconnection networks; discusses the scalable crossbar network, which is a non-blocking interconnection network that uses small-sized crossbar switches as switching elements. This invaluable work will be of great benefit to students, researchers and practitioners interested in computer networks, parallel processing and reliability engineering. The text is also essential reading for course modules on interconnection network design and reliability.
The book provides a comprehensive introduction and a novel mathematical foundation of the field of information geometry with complete proofs and detailed background material on measure theory, Riemannian geometry and Banach space theory. Parametrised measure models are defined as fundamental geometric objects, which can be both finite or infinite dimensional. Based on these models, canonical tensor fields are introduced and further studied, including the Fisher metric and the Amari-Chentsov tensor, and embeddings of statistical manifolds are investigated. This novel foundation then leads to application highlights, such as generalizations and extensions of the classical uniqueness result of Chentsov or the Cramer-Rao inequality. Additionally, several new application fields of information geometry are highlighted, for instance hierarchical and graphical models, complexity theory, population genetics, or Markov Chain Monte Carlo. The book will be of interest to mathematicians who are interested in geometry, information theory, or the foundations of statistics, to statisticians as well as to scientists interested in the mathematical foundations of complex systems.
This book offers readers an easy introduction into quantum computing as well as into the design for corresponding devices. The authors cover several design tasks which are important for quantum computing and introduce corresponding solutions. A special feature of the book is that those tasks and solutions are explicitly discussed from a design automation perspective, i.e., utilizing clever algorithms and data structures which have been developed by the design automation community for conventional logic (i.e., for electronic devices and systems) and are now applied for this new technology. By this, relevant design tasks can be conducted in a much more efficient fashion than before - leading to improvements of several orders of magnitude (with respect to runtime and other design objectives). Describes the current state of the art for designing quantum circuits, for simulating them, and for mapping them to real hardware; Provides a first comprehensive introduction into design automation for quantum computing that tackles practically relevant tasks; Targets the quantum computing community as well as the design automation community, showing both perspectives to quantum computing, and what impressive improvements are possible when combining the knowledge of both communities.
Handbook of Research on Ambient Intelligence and Smart Environments: Trends and Perspectives covers the cutting-edge aspects of AmI applications, specifically those involving the effective design, realization, and implementation of a comprehensive AmI application. This pertinent publication targets researchers and practitioners in ambient intelligence, as well as those in ubiquitous and pervasive computing, artificial intelligence, sensor networks, knowledge representation, automated reasoning and learning, system and software engineering, and man-machine interfaces.