This book is a tribute to Kenichi Morita's ideas and achievements in theoretical computer science, reversibility and computationally universal mathematical machines. It offers a unique source of information on universality and reversibility in computation and is an indispensable book for computer scientists, mathematicians, physicists and engineers. Morita is renowned for his works on two-dimensional language accepting automata, complexity of Turing machines, universality of cellular automata, regular and context-free array grammars, and undecidability. His high-impact works include findings on parallel generation and parsing of array languages by means of reversible automata, construction of a reversible automaton from Fredkin gates, solving a firing squad synchronization problem in reversible cellular automata, self-reproduction in reversible cellular spaces, universal reversible two-counter machines, solution of nondeterministic polynomial (NP) problems in hyperbolic cellular automata, reversible P-systems, a new universal reversible logic element with memory, and reversibility in asynchronous cellular automata. Kenichi Morita's achievements in reversibility, universality and theory of computation are celebrated in over twenty high-profile contributions from his colleagues, collaborators, students and friends. 
The theoretical constructs presented in this book are amazing in their diversity and depth of intellectual insight, addressing: queue automata, hyperbolic cellular automata, Abelian invertible automata, number-conserving cellular automata, Brownian circuits, chemical automata, logical gates implemented via glider collisions, computation in swarm networks, picture arrays, universal reversible counter machines, input-position-restricted models of language acceptors, descriptional complexity and persistence of cellular automata, partitioned cellular automata, firing squad synchronization algorithms, reversible asynchronous automata, reversible simulations of ranking trees, Shor's factorization algorithms, and power consumption of cellular automata.
"Specification and transformation of programs" is short for a methodology of software development where, from a formal specification of a problem to be solved, programs correctly solving that problem are constructed by stepwise application of formal, semantics-preserving transformation rules. The approach considers programming as a formal activity. Consequently, it requires some mathematical maturity and, above all, the will to try something new. A somewhat experienced programmer or a third- or fourth-year student in computer science should be able to master most of this material - at least, this is the level I have aimed at. This book is primarily intended as a general introductory textbook on transformational methodology. As with any methodology, reading and understanding is necessary but not sufficient. Therefore, most of the chapters contain a set of exercises for practising as homework. Solutions to these exercises exist and can, in principle, be obtained at nominal cost from the author upon request on appropriate letterhead. In addition, the book can also be seen as a comprehensive account of the particular transformational methodology developed within the Munich CIP project.
Deryn Watson and David Tinsley. The topic of the conference, integrating information technology into education, is both broad and multi-faceted. In order to help focus the papers and discussion we identified seven themes:
* Current developments in society and education influencing integration;
* Teachers, their roles and concerns;
* Learners, their expectations of and behaviour in an integrated environment;
* Developments and concerns in the curriculum;
* Successes and failures in existing practice;
* Organisation and management of integrated environments;
* Identification of social and political influences.
Each author was invited to focus on one theme, and these remained strands throughout, as can be seen from the short papers and focus group reports. The first and most significant concern, therefore, was to be clear about our notions of integration: what do we mean, and how is this relevant? Our keynote paper from Cornu clearly marked out this debate by examining the notion of integration and alerting us to the fact that as long as the use of IT is still added to the curriculum, integration has not yet begun.
This book contains the ceremonials and the proceedings pertaining to the International Symposium CCN2005 on "Complex Computing-Networks: A Link between Brain-like and Wave-Oriented Electrodynamics Algorithms," convened at Doğuş University, Istanbul, Turkey, on 13-14 June 2005, in connection with the bestowal of honorary doctorate degrees on Professors Leopold B. Felsen and Leon O. Chua for their extraordinary achievements in electromagnetics and nonlinear systems, respectively. The symposium was co-organized by Cem Goknar and Levent Sevgi, in consultation with Leopold B. Felsen and Leon O. Chua. Istanbul is a city with wonderful natural and historical surroundings, a city interconnecting not only Asia and Europe but also Eastern and Western cultures. Therefore, CCN2005 was a memorable event not only in the lifetime of Drs. Felsen, Chua, and their families, but also for all the other participants who were there to congratulate the recipients and participate in the symposium.
In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design-for-testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.
Understanding the digital modes and practices of traditional rhetoric is essential to emphasising information and interaction in human-to-human and human-computer contexts. These emerging technologies are essential in gauging information processes across global contexts. Digital Rhetoric and Global Literacies: Communication Modes and Digital Practices in the Networked World compiles relevant theoretical frameworks, current practical applications, and emerging practices of digital rhetoric. Highlighting the key principles and understandings of the underlying modes, practices, and literacies of communication, this book is a vital guide for professionals, scholars, researchers, and educators interested in finding clarity and enrichment in the diverse perspectives of digital rhetoric research.
E-ffective Writing for E-Learning Environments integrates research and practice in user-centered design and learning design for instructors in post-secondary institutions and learning organizations who are developing e-learning resources. The book is intended as a development guide for experts in areas other than instructional or educational technology (in other words, experts in cognate areas such as Biology or English or Nursing) rather than as a learning design textbook. The organization of the book reflects the development process for a resource, course, or program from planning and development through formative evaluation, and identifies trends and issues that faculty or developers might encounter along the way. The account of the process of one faculty member's course development journey illustrates the suggested design guidelines. The accompanying practice guide provides additional information, examples, learning activities, and tools to supplement the text.
The success of VHDL since it was balloted in 1987 as an IEEE standard may look incomprehensible to the large population of hardware designers who had never heard of Hardware Description Languages before (at least 90% of them), as well as to the few hundred specialists who had been working on these languages for a long time (25 years for some of them). Until 1988, only a very small subset of designers, in a few large companies, were accustomed to describing their designs using a proprietary HDL, or sometimes an HDL inherited from a university when some software environment happened to be developed around it, allowing usability by third parties. A number of benefits were definitely recognized in this practice, such as functional verification of a specification through simulation, first performance evaluation of a tentative design, and sometimes automatic microprogram generation or even automatic high-level synthesis. As there was apparently no market for HDLs, the ECAD vendors did not care about them, start-up companies were seldom able to survive in this area, and large users of proprietary tools were spending more and more people and money just to maintain their internal systems.
From Google search to self-driving cars to human longevity, is Alphabet creating a neoteric Garden of Eden or Bentham's Panopticon? Will King Solomon's challenge supersede the Turing test for artificial intelligence? Can transhumanism mitigate existential threats to humankind? These are some of the overarching questions in this book, which explores the impact of information awareness on humanity starting from the Book of Genesis to the Royal Library of Alexandria in the 3rd century BC to the modern day of Google Search, IBM Watson, and Wolfram|Alpha. The book also covers Search Engine Optimization, Google AdWords, Google Maps, Google Local Search, and what every business leader must know about digital transformation. "Search is curiosity, and that will never be done," said Google's first female engineer and Yahoo's sixth CEO Marissa Mayer. The truth is out there; we just need to know how to Google it!
In the globalizing world, knowledge and information (and the social and technological settings for their production and communication) are now seen as keys to economic prosperity. The economy of a knowledge city creates value-added products using research, technology, and brainpower. The social benefit of knowledge-based urban development (KBUD), however, extends beyond aggregate economic growth. "Knowledge-Based Urban Development" covers the theoretical, thematic, and country-specific issues of knowledge cities to underline the growing importance of KBUD all around the world, providing academics, researchers, and practitioners with substantive research on the decisive lineaments of urban development for knowledge-based production (drawing attention to new planning processes to foster such development), and worldwide best practices and case studies in the field of urban development.
Microprocessors are the key component of the infrastructure of our 21st-century electronic and digital information-based society. More than four billion are sold each year for use in 'intelligent' electronic devices, ranging from smart egg-timers through to aircraft management systems. Most of these processor devices appear in the form of highly integrated microcontrollers, which comprise a core microprocessor together with memory and analog/digital peripheral ports. By using simple cores, these single-chip computers are the cost- and size-effective means of adding brains to previously dumb widgets, such as the credit card. Using the same winning format as the successful Springer guide, The Quintessential PIC(R) Microcontroller, this down-to-earth new textbook/guide has been completely rewritten based on the more powerful PIC18 enhanced-range Microchip MCU family. Throughout the book, commercial hardware and software products are used to illustrate the material, as readers are provided real-world, in-depth guidance on the design, construction and programming of small, embedded microcontroller-based systems. Suitable for stand-alone usage, the text does not require a prerequisite deep understanding of digital systems.
Topics and features: uses an in-depth bottom-up approach to the topic of microcontroller design using the Microchip enhanced-range PIC18 (R) microcontroller family as the exemplar; includes fully worked examples and self-assessment questions, with additional support material available on an associated website; provides a standalone module on foundation topics in digital, logic and computer architecture for microcontroller engineering; discusses the hardware aspects of interfacing and interrupt handling, with an emphasis on the integration of hardware and software; covers parallel and serial input/output, timing, analog, and EEPROM data-handling techniques; presents a practical build-and-program case study, as well as illustrating simple testing strategies. This useful text/reference book will be of great value to industrial engineers, hobbyists and people in academia. Students of Electronic Engineering and Computer Science, at both undergraduate and postgraduate level, will also find this an ideal textbook, with many helpful learning tools. Dr. Sid Katzen is Associate to the School of Engineering, University of Ulster at Jordanstown, Northern Ireland.
This book is concerned with the associated issues between the differing paradigms of academic and organizational computing infrastructures. Driven by the increasing impact Information Communication Technology (ICT) has on our working and social lives, researchers within the Computer Supported Cooperative Work (CSCW) field try to find ways to situate new hardware and software in rapidly changing socio-digital ecologies. Adopting a design-orientated research perspective, researchers from the European Society for Socially Embedded Technologies (EUSSET) elaborate on the challenges and opportunities we face through the increasing permeation of society by ICT from commercial, academic, design and organizational perspectives. Designing Socially Embedded Technologies in the Real-World is directed at researchers and industry practitioners, and will be of great interest to any other societal actors involved in the design of IT systems.
This book is for people who work in the tech industry-computer and data scientists, software developers and engineers, designers, and people in business, marketing or management roles. It is also for people who are involved in the procurement and deployment of advanced applications, algorithms, and AI systems, and in policy making. Together, they create the digital products, services, and systems that shape our societies and daily lives. The book's aim is to empower people to take responsibility, to 'upgrade' their skills for ethical reflection, inquiry, and deliberation. It introduces ethics in an accessible manner with practical examples, outlines of different ethical traditions, and practice-oriented methods. Additional online resources are available at: ethicsforpeoplewhoworkintech.com.
Performance evaluation of increasingly complex human-made systems requires the use of simulation models. However, these systems are difficult to describe and capture by succinct mathematical models. The purpose of this book is to address the difficulties of the optimization of complex systems via simulation models or other computation-intensive models involving possible stochastic effects and discrete choices. This book establishes distinct advantages of the "softer" ordinal approach for search-based type problems, analyzes its general properties, and shows the many orders of magnitude improvement in computational efficiency that is possible.
As suggested by the title of this book, I will present a collection of coherently related applications and a theoretical development of a general systems theory. Hopefully, this book will invite all readers to sample an exciting and challenging (even fun) piece of interdisciplinary research that has characterized the scientific and technological achievements of the twentieth century. And I hope that many of them will be motivated to do additional reading and to contribute to topics along the lines described in the following pages. Since the applications in this volume range through many scientific disciplines, from sociology to atomic physics, from Einstein's relativity theory to Dirac's quantum mechanics, from optimization theory to the unreasonable effectiveness of mathematics to foundations of mathematical modeling, from general systems theory to Schwartz's distributions, special care has been given to write each application in a language appropriate to that field. That is, mathematical symbols and abstractions are used at different levels so that readers in various fields will find it possible to read. Also, because of the wide range of applications, each chapter has been written so that, in general, there is no need to reference a different chapter in order to understand a specific application. At the same time, if a reader has the desire to go through the entire book without skipping any chapter, it is strongly suggested to refer back to Chapters 2 and 3 as often as possible.
In April 1993, an interdisciplinary NATO Advanced Research Workshop on "Collaborative dialogue technologies in distance learning" was held in Segovia, Spain. The workshop brought together researchers in fields related to distance learning using computer-mediated communication. The statement of justification of the NATO ARW follows hereafter. Justification of the NATO Advanced Research Workshop on Collaborative Dialogue Technologies in Distance Learning: Computer Mediated Communication (CMC) systems have features that reduce some temporal, physical and social constraints on communication. Theories of communication have shifted from viewing communication as a linear transmission of messages by a sender to a receiver, to viewing it as a social paradigm, where individuals are actors in a network of interdependent relationships embedded in organizational and social structures. Recent research focuses on models of information-sharing to support not only the activities of individuals but also the problem-solving activities of groups, such as decision-making, planning or co-writing. This area of research is called Computer Supported Cooperative Work (CSCW). The Artificial Intelligence (AI) approach uses knowledge-based systems to enhance and facilitate all these processes, including the possibility of using natural language. The traditional model of distance education places a strong emphasis on independent study, supported by well-developed learning materials. This model can be characterized as one-way media. However, the potential of CMC to provide better guidance to the student in Higher Distance Education has been quickly recognized for at least two kinds of activities: information sharing and interaction.
The developments within the computationally and numerically oriented areas of Operations Research, Finance, Statistics and Economics have been significant over the past few decades. Each area has been developing its own computer systems and languages that suit its needs, but there is relatively little cross-fertilization among them yet. This volume contains a collection of papers that each highlights a particular system, language, model or paradigm from one of the computational disciplines, aimed at researchers and practitioners from the other fields. The 15 papers cover a number of relevant topics: Models and Modelling in Operations Research and Economics, novel High-level and Object-Oriented approaches to programming, through advanced uses of Maple and MATLAB, and applications and solution of Differential Equations in Finance. It is hoped that the material in this volume will whet the reader's appetite for discovering and exploring new approaches to old problems, and in the longer run facilitate cross-fertilization among the fields. We would like to thank the contributing authors, the reviewers, the publisher, and last, but not least, Jesper Saxtorph, Anders Nielsen, and Thomas Stidsen for invaluable technical assistance.
Integrated circuits are finding ever wider applications through a range of industries. Introduction to VLSI Process Engineering presents the design principles for devices, describes the overall VLSI process, and deals with the essential manufacturing technologies and inspection procedures.
The CoreGRID Network of Excellence (NoE) project began in September 2004. Two months later, in November 2004, the first CoreGRID Integration Workshop was held within the framework of the prestigious international Dagstuhl seminars. CoreGRID aims at strengthening and advancing long-term research, knowledge transfer and integration in the area of Grid and Peer-to-Peer technologies. CoreGRID is a Network of Excellence - a new type of project within the European 6th Framework Programme, to ensure progressive evolution and durable integration of the European Grid research community. To achieve this objective, CoreGRID brings together a critical mass of well-established researchers and doctoral students from forty-two institutions that have constructed an ambitious joint programme of activities. Although excellence is a goal to which CoreGRID is committed, durable integration is our main concern. It means that CoreGRID has to carry out activities to improve the effectiveness of European research in Grid by coordinating and adapting the participants' activities in Grid research, to share resources such as Grid testbeds, to encourage exchange of research staff and students, and to ensure close collaboration and wide dissemination of its results to the international community. Organising CoreGRID Integration Workshops is one of the activities that aims at identifying and promoting durable collaboration between partners involved in the network.
I3E 2002 is the second in a series of conferences on eCommerce, eBusiness, and eGovernment organised by the three IFIP committees TC6, TC8, and TC11. As I3E 2001 did last year in Zurich, I3E 2002 continues to provide a forum for users, engineers, and researchers from academia, industry and government to present their latest findings in eCommerce, eBusiness, and eGovernment applications and the underlying technologies which support those applications. This year's conference comprises a main track with sessions on eGovernment, Trust, eMarkets, Fraud and Security, eBusiness (both B2B and B2C), the Design of Systems, eLearning, Public and Health Systems, Web Design, and the Applications of and Procedures for eCommerce and eBusiness, as well as two associated workshops (not included in these proceedings): eBusiness Models in the Digital Online Music and Online News Sectors; and eBusiness Standardisation - Challenges and Solutions for the Networked Economy. The 47 papers accepted for presentation in these sessions and published in this book of proceedings were selected from 80 submissions. They were rigorously reviewed (all papers were double-blind refereed) before being selected by the International Programme Committee. This rejection rate of around 40% indicates just how seriously the Committee took its quality control activities.
Health institutions are investing in and fielding information technology solutions at an unprecedented pace. With the recommendations from the Institute of Medicine around information technology solutions for patient safety, mandates from industry groups such as Leapfrog about using information systems to improve health care, and the move toward evidence-based practice, health institutions cannot afford to retain manual practices. The installation of multi-million dollar computerized health systems represents the very life blood of contemporary clinical operations and a crucial link to the financial viability of institutions. Yet, the implementation of health information systems is exceptionally complex, expensive and often just plain messy. The need for improvement in the art and science of systems implementation is clear: up to 70-80% of information technology installations fail. The reasons are multi-faceted, ranging from the complexity of the diverse workflows being computerized, the intricate nature of health organizations, and the knowledge and skills of users, to other reasons such as strategies for obtaining key executive support, weaving through the politics peculiar to the institution, and technical facets including the usability of systems. Thus, the art and science of successfully implementing systems remains deeply layered in elusiveness. Still, given the pervasiveness of system implementations and the importance of the outcomes, this is a critical topic, especially for nurses and informatics nurse specialists.
From a linguistic perspective, it is quantification which makes all the difference between "having no dollars" and "having a lot of dollars". And it is the meaning of the quantifier "most" which eventually decides if "Most Americans voted Kerry" or "Most Americans voted Bush" (as it stands). Natural language (NL) quantifiers like "all", "almost all", "many" etc. serve an important purpose because they permit us to speak about properties of collections, as opposed to describing specific individuals only; in technical terms, quantifiers are a 'second-order' construct. Thus the quantifying statement "Most Americans voted Bush" asserts that the set of voters of George W. Bush comprises the majority of Americans, while "Bush sneezes" only tells us something about a specific individual. By describing collections rather than individuals, quantifiers extend the expressive power of natural languages far beyond that of propositional logic and make them a universal communication medium. Hence language heavily depends on quantifying constructions. These often involve fuzzy concepts like "tall", and they frequently refer to fuzzy quantities in agreement like "about ten", "almost all", "many" etc. In order to exploit this expressive power and make fuzzy quantification available to technical applications, a number of proposals have been made how to model fuzzy quantifiers in the framework of fuzzy set theory. These approaches usually reduce fuzzy quantification to a comparison of scalar or fuzzy cardinalities [197, 132].
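The idea of reducing a fuzzy quantifier to a comparison of cardinalities can be illustrated with a minimal Python sketch. Everything here is an illustrative assumption, not the book's own model: the sigma-count cardinality, the piecewise-linear threshold function for "most", and the invented membership degrees.

```python
# Minimal sketch of a cardinality-based fuzzy quantifier "most".
# The sigma-count, the piecewise-linear "most" function, and the
# data below are illustrative assumptions only.

def sigma_count(memberships):
    """Scalar cardinality of a fuzzy set: the sum of membership degrees."""
    return sum(memberships)

def most(proportion):
    """Truth value of 'most': 0 up to a proportion of 0.5,
    1 from 0.8 upward, linear in between."""
    if proportion <= 0.5:
        return 0.0
    if proportion >= 0.8:
        return 1.0
    return (proportion - 0.5) / 0.3

def truth_of_most(memberships):
    """Truth degree of 'most X are P' given fuzzy membership degrees in P."""
    return most(sigma_count(memberships) / len(memberships))

# Five individuals with graded degrees of satisfying the predicate.
print(truth_of_most([1.0, 1.0, 0.9, 0.4, 0.2]))
```

The sigma-count here is 3.5 out of 5, a proportion of 0.7, which the threshold function maps to a partial truth degree of about 0.67 rather than a crisp yes/no.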
Testing techniques for VLSI circuits are undergoing many exciting changes. The predominant method for testing digital circuits consists of applying a set of input stimuli to the IC and monitoring the logic levels at primary outputs. If, for one or more inputs, there is a discrepancy between the observed output and the expected output, then the IC is declared to be defective. A new approach to testing digital circuits, which has come to be known as IDDQ testing, has been actively researched for the last fifteen years. In IDDQ testing, the steady-state supply current, rather than the logic levels at the primary outputs, is monitored. Years of research suggest that IDDQ testing can significantly improve the quality and reliability of fabricated circuits. This has prompted many semiconductor manufacturers to adopt this testing technique, among them Philips Semiconductors, Ford Microelectronics, Intel, Texas Instruments, LSI Logic, Hewlett-Packard, Sun Microsystems, Alcatel, and SGS-Thomson. This increase in the use of IDDQ testing should be of interest to three groups of individuals associated with the IC business: Product Managers and Test Engineers, CAD Tool Vendors and Circuit Designers. Introduction to IDDQ Testing is designed to educate this community. The authors have summarized in one volume the main findings of more than fifteen years of research in this area.
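The conventional logic-level test flow that IDDQ testing is contrasted with above can be sketched in a few lines of Python. The two-input circuit model, the stuck-at fault, and the test vectors are invented purely for illustration:

```python
# Sketch of conventional logic-level testing: apply input vectors and
# declare the device defective on any observed/expected mismatch.
# The AND-gate model and the stuck-at-1 fault are illustrative only.

def good_circuit(a, b):
    """Fault-free reference model: a single AND gate."""
    return a & b

def faulty_circuit(a, b):
    """Device under test whose output is stuck at logic 1."""
    return 1

def logic_test(dut, vectors):
    """Return True only if the device matches the reference on every vector."""
    return all(dut(a, b) == good_circuit(a, b) for a, b in vectors)

vectors = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(logic_test(good_circuit, vectors))    # fault-free part passes
print(logic_test(faulty_circuit, vectors))  # stuck-at fault is caught
```

An IDDQ test, by contrast, would measure the quiescent supply current after each vector instead of comparing output logic levels, catching defects (such as bridging faults) that may not flip any primary output.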