E-ffective Writing for E-Learning Environments integrates research and practice in user-centered design and learning design for instructors in post-secondary institutions and learning organizations who are developing e-learning resources. The book is intended as a development guide for experts in areas other than instructional or educational technology (in other words, experts in cognate areas such as Biology or English or Nursing) rather than as a learning design textbook. The organization of the book reflects the development process for a resource, course, or program from planning and development through formative evaluation, and identifies trends and issues that faculty or developers might encounter along the way. The account of the process of one faculty member's course development journey illustrates the suggested design guidelines. The accompanying practice guide provides additional information, examples, learning activities, and tools to supplement the text.
Microprocessors are the key component of the infrastructure of our 21st-century electronic and digital information-based society. More than four billion are sold each year for use in 'intelligent' electronic devices, ranging from smart egg-timers to aircraft management systems. Most of these processors appear in the form of highly integrated microcontrollers, which comprise a core microprocessor together with memory and analog/digital peripheral ports. By using simple cores, these single-chip computers are a cost- and size-effective means of adding brains to previously dumb widgets, such as the credit card. Using the same winning format as the successful Springer guide, The Quintessential PIC (R) Microcontroller, this down-to-earth textbook/guide has been completely rewritten around the more powerful PIC18 enhanced-range Microchip MCU family. Throughout the book, commercial hardware and software products are used to illustrate the material, and readers are given real-world, in-depth guidance on the design, construction and programming of small, embedded microcontroller-based systems. Suitable for stand-alone use, the text does not presuppose a deep understanding of digital systems. Topics and features: uses an in-depth bottom-up approach to microcontroller design, with the Microchip enhanced-range PIC18 (R) microcontroller family as the exemplar; includes fully worked examples and self-assessment questions, with additional support material available on an associated website; provides a standalone module on foundation topics in digital logic and computer architecture for microcontroller engineering; discusses the hardware aspects of interfacing and interrupt handling, with an emphasis on the integration of hardware and software; covers parallel and serial input/output, timing, analog, and EEPROM data-handling techniques; presents a practical build-and-program case study, as well as illustrating simple testing strategies. This useful text/reference will be of great value to industrial engineers, hobbyists and people in academia. Students of Electronic Engineering and Computer Science, at both undergraduate and postgraduate level, will also find it an ideal textbook, with many helpful learning tools. Dr. Sid Katzen is Associate to the School of Engineering, University of Ulster at Jordanstown, Northern Ireland.
The success of VHDL since it was balloted as an IEEE standard in 1987 may look incomprehensible to the large population of hardware designers who had never heard of hardware description languages before (at least 90% of them), as well as to the few hundred specialists who had been working on these languages for a long time (25 years for some of them). Until 1988, only a very small subset of designers, in a few large companies, were accustomed to describing their designs in a proprietary HDL, or sometimes an HDL inherited from a university when a software environment happened to be developed around it, allowing use by third parties. A number of benefits were clearly recognized in this practice, such as functional verification of a specification through simulation, early performance evaluation of a tentative design, and sometimes automatic microprogram generation or even automatic high-level synthesis. As there was apparently no market for HDLs, the ECAD vendors did not care about them, start-up companies were seldom able to survive in this area, and large users of proprietary tools were spending more and more people and money just to maintain their internal systems.
Performance evaluation of increasingly complex human-made systems requires the use of simulation models. However, these systems are difficult to describe and capture with succinct mathematical models. The purpose of this book is to address the difficulties of optimizing complex systems via simulation models or other computation-intensive models involving possible stochastic effects and discrete choices. This book establishes the distinct advantages of the "softer" ordinal approach for search-based problems, analyzes its general properties, and shows the many orders of magnitude improvement in computational efficiency that are possible.
This book contains the ceremonials and the proceedings pertaining to the International Symposium CCN2005 on "Complex Computing-Networks: A Link between Brain-like and Wave-Oriented Electrodynamics Algorithms," convened at Doğuş University, Istanbul, Turkey, on 13-14 June 2005, in connection with the bestowal of honorary doctorate degrees on Professors Leopold B. Felsen and Leon O. Chua for their extraordinary achievements in electromagnetics and nonlinear systems, respectively. The symposium was co-organized by Cem Göknar and Levent Sevgi, in consultation with Leopold B. Felsen and Leon O. Chua. Istanbul is a city with wonderful natural and historical surroundings, a city interconnecting not only Asia and Europe but also Eastern and Western cultures. Therefore, CCN2005 was a memorable event not only in the lifetime of Drs. Felsen, Chua, and their families, but also for all the other participants who were there to congratulate the recipients and take part in the symposium.
Integrated circuits are finding ever wider applications through a range of industries. Introduction to VLSI Process Engineering presents the design principles for devices, describes the overall VLSI process, and deals with the essential manufacturing technologies and inspection procedures.
The CoreGRID Network of Excellence (NoE) project began in September 2004. Two months later, in November 2004, the first CoreGRID Integration Workshop was held within the framework of the prestigious international Dagstuhl seminars. CoreGRID aims at strengthening and advancing long-term research, knowledge transfer and integration in the area of Grid and Peer-to-Peer technologies. CoreGRID is a Network of Excellence - a new type of project within the European 6th Framework Programme, to ensure progressive evolution and durable integration of the European Grid research community. To achieve this objective, CoreGRID brings together a critical mass of well-established researchers and doctoral students from forty-two institutions that have constructed an ambitious joint programme of activities. Although excellence is a goal to which CoreGRID is committed, durable integration is our main concern. It means that CoreGRID has to carry out activities to improve the effectiveness of European research in Grid by coordinating and adapting the participants' activities in Grid research, to share resources such as Grid testbeds, to encourage exchange of research staff and students, and to ensure close collaboration and wide dissemination of its results to the international community. Organising CoreGRID Integration Workshops is one of the activities that aims at identifying and promoting durable collaboration between partners involved in the network.
This book is a tribute to Kenichi Morita's ideas and achievements in theoretical computer science, reversibility and computationally universal mathematical machines. It offers a unique source of information on universality and reversibility in computation and is an indispensable book for computer scientists, mathematicians, physicists and engineers. Morita is renowned for his works on two-dimensional language accepting automata, complexity of Turing machines, universality of cellular automata, regular and context-free array grammars, and undecidability. His high-impact works include findings on parallel generation and parsing of array languages by means of reversible automata, construction of a reversible automaton from Fredkin gates, solving a firing squad synchronization problem in reversible cellular automata, self-reproduction in reversible cellular spaces, universal reversible two-counter machines, solution of nondeterministic polynomial (NP) problems in hyperbolic cellular automata, reversible P-systems, a new universal reversible logic element with memory, and reversibility in asynchronous cellular automata. Kenichi Morita's achievements in reversibility, universality and theory of computation are celebrated in over twenty high-profile contributions from his colleagues, collaborators, students and friends. The theoretical constructs presented in this book are amazing in their diversity and depth of intellectual insight, addressing: queue automata, hyperbolic cellular automata, Abelian invertible automata, number-conserving cellular automata, Brownian circuits, chemical automata, logical gates implemented via glider collisions, computation in swarm networks, picture arrays, universal reversible counter machines, input-position-restricted models of language acceptors, descriptional complexity and persistence of cellular automata, partitioned cellular automata, firing squad synchronization algorithms, reversible asynchronous automata, reversible simulations of ranking trees, Shor's factorization algorithms, and power consumption of cellular automata.
BE 2002 is the second in a series of conferences on eCommerce, eBusiness, and eGovernment organised by the three IFIP committees TC6, TC8, and TC11. As BE 2001 did last year in Zurich, BE 2002 continues to provide a forum for users, engineers, and researchers from academia, industry and government to present their latest findings in eCommerce, eBusiness, and eGovernment applications and the underlying technologies which support those applications. This year's conference comprises a main track with sessions on eGovernment, Trust, eMarkets, Fraud and Security, eBusiness (both B2B and B2C), the Design of Systems, eLearning, Public and Health Systems, Web Design, and the Applications of and Procedures for eCommerce and eBusiness, as well as two associated Workshops (not included in these proceedings): eBusiness Models in the Digital Online Music and Online News Sectors; and eBusiness Standardisation - Challenges and Solutions for the Networked Economy. The 47 papers accepted for presentation in these sessions and published in this book of proceedings were selected from 80 submissions. They were rigorously reviewed (all papers were double-blind refereed) before being selected by the International Programme Committee. This rejection rate of about 40% indicates just how seriously the Committee took its quality control activities.
In April 1993, an interdisciplinary NATO Advanced Research Workshop on "Collaborative dialogue technologies in distance learning" was held in Segovia, Spain. The workshop brought together researchers in fields related to distance learning using computer-mediated communication. The statement of justification of the NATO ARW follows hereafter. Justification of the NATO Advanced Research Workshop on Collaborative Dialogue Technologies in Distance Learning: Computer-Mediated Communication (CMC) systems have features that reduce some temporal, physical and social constraints on communication. Theories of communication have shifted from viewing communication as a linear transmission of messages by a sender to a receiver, to viewing it as a social paradigm, where individuals are actors in a network of interdependent relationships embedded in organizational and social structures. Recent research focuses on models of information-sharing to support not only the activities of individuals but also the problem-solving activities of groups, such as decision-making, planning or co-writing. This area of research is called Computer Supported Cooperative Work (CSCW). The Artificial Intelligence (AI) approach uses knowledge-based systems to enhance and facilitate all these processes, including the possibility of using natural language. The traditional model of distance education places a strong emphasis on independent study, supported by well-developed learning materials. This model can be characterized as one-way media. However, the potential of CMC to provide better guidance to the student in Higher Distance Education has been quickly recognized for at least two kinds of activities: information sharing and interaction.
In the globalizing world, knowledge and information (and the social and technological settings for their production and communication) are now seen as keys to economic prosperity. The economy of a knowledge city creates value-added products using research, technology, and brainpower. The social benefit of knowledge-based urban development (KBUD), however, extends beyond aggregate economic growth. "Knowledge-Based Urban Development" covers the theoretical, thematic, and country-specific issues of knowledge cities to underline the growing importance of KBUD all around the world, providing academics, researchers, and practitioners with substantive research on the decisive lineaments of urban development for knowledge-based production (drawing attention to new planning processes to foster such development), and worldwide best practices and case studies in the field of urban development.
As suggested by the title of this book, I will present a collection of coherently related applications and a theoretical development of a general systems theory. Hopefully, this book will invite all readers to sample an exciting and challenging (even fun) piece of interdisciplinary research that has characterized the scientific and technological achievements of the twentieth century. And I hope that many of them will be motivated to do additional reading and to contribute to topics along the lines described in the following pages. Since the applications in this volume range through many scientific disciplines, from sociology to atomic physics, from Einstein's relativity theory to Dirac's quantum mechanics, from optimization theory to the unreasonable effectiveness of mathematics to the foundations of mathematical modeling, from general systems theory to Schwartz's distributions, special care has been given to write each application in a language appropriate to that field. That is, mathematical symbols and abstractions are used at different levels so that readers in various fields will find it possible to read. Also, because of the wide range of applications, each chapter has been written so that, in general, there is no need to reference a different chapter in order to understand a specific application. At the same time, if a reader has the desire to go through the entire book without skipping any chapter, it is strongly suggested to refer back to Chapters 2 and 3 as often as possible.
The developments within the computationally and numerically oriented areas of Operations Research, Finance, Statistics and Economics have been significant over the past few decades. Each area has been developing its own computer systems and languages that suit its needs, but there is relatively little cross-fertilization among them yet. This volume contains a collection of papers that each highlights a particular system, language, model or paradigm from one of the computational disciplines, aimed at researchers and practitioners from the other fields. The 15 papers cover a number of relevant topics: models and modelling in Operations Research and Economics, novel high-level and object-oriented approaches to programming, advanced uses of Maple and MATLAB, and applications and solution of differential equations in Finance. It is hoped that the material in this volume will whet the reader's appetite for discovering and exploring new approaches to old problems, and in the longer run facilitate cross-fertilization among the fields. We would like to thank the contributing authors, the reviewers, the publisher, and last, but not least, Jesper Saxtorph, Anders Nielsen, and Thomas Stidsen for invaluable technical assistance.
From a linguistic perspective, it is quantification which makes all the difference between "having no dollars" and "having a lot of dollars". And it is the meaning of the quantifier "most" which eventually decides if "Most Americans voted Kerry" or "Most Americans voted Bush" (as it stands). Natural language (NL) quantifiers like "all", "almost all", "many" etc. serve an important purpose because they permit us to speak about properties of collections, as opposed to describing specific individuals only; in technical terms, quantifiers are a 'second-order' construct. Thus the quantifying statement "Most Americans voted Bush" asserts that the set of voters of George W. Bush comprises the majority of Americans, while "Bush sneezes" only tells us something about a specific individual. By describing collections rather than individuals, quantifiers extend the expressive power of natural languages far beyond that of propositional logic and make them a universal communication medium. Hence language heavily depends on quantifying constructions. These often involve fuzzy concepts like "tall", and they frequently refer to fuzzy quantities in agreement like "about ten", "almost all", "many" etc. In order to exploit this expressive power and make fuzzy quantification available to technical applications, a number of proposals have been made for modelling fuzzy quantifiers in the framework of fuzzy set theory. These approaches usually reduce fuzzy quantification to a comparison of scalar or fuzzy cardinalities [197, 132].
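For readers curious how such a cardinality comparison looks in practice, the following is a minimal Python sketch of the scalar-cardinality (sigma-count) style of fuzzy quantification that this description alludes to; it is not taken from the book, and the membership function chosen for "most" and the toy data are illustrative assumptions.

```python
# Minimal sketch of fuzzy quantification via scalar cardinality (sigma-count).
# The shape of the "most" membership function and the data are assumptions.

def sigma_count(fuzzy_set):
    """Scalar cardinality of a fuzzy set: the sum of its membership degrees."""
    return sum(fuzzy_set.values())

def most(proportion):
    """Illustrative quantifier 'most': 0 up to 0.5, 1 from 0.8, linear in between."""
    if proportion <= 0.5:
        return 0.0
    if proportion >= 0.8:
        return 1.0
    return (proportion - 0.5) / 0.3

def truth_of_most(subset, reference):
    """Truth degree of 'most elements of the reference set belong to subset'."""
    return most(sigma_count(subset) / sigma_count(reference))

# Toy data: membership degrees in the fuzzy set "voted Bush" over a crisp
# reference set of five Americans.
americans = {"ann": 1.0, "bob": 1.0, "eve": 1.0, "joe": 1.0, "sue": 1.0}
voted_bush = {"ann": 1.0, "bob": 0.9, "eve": 0.8, "joe": 0.0, "sue": 0.2}

print(truth_of_most(voted_bush, americans))  # about 0.27 for this toy data
```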
Health institutions are investing in and fielding information technology solutions at an unprecedented pace. With the recommendations from the Institute of Medicine around information technology solutions for patient safety, mandates from industry groups such as Leapfrog about using information systems to improve health care, and the move toward evidence-based practice, health institutions cannot afford to retain manual practices. The installation of multi-million dollar computerized health systems represents the very lifeblood of contemporary clinical operations and a crucial link to the financial viability of institutions. Yet, the implementation of health information systems is exceptionally complex, expensive and often just plain messy. The need for improvement in the art and science of systems implementation is clear: up to 70-80% of information technology installations fail. The reasons are multi-faceted, ranging from the complexity of the diverse workflows being computerized, the intricate nature of health organizations, and the knowledge and skills of users, to other reasons such as strategies for obtaining key executive support, weaving through the politics peculiar to the institution, and technical facets including the usability of systems. Thus, the art and science of successfully implementing systems remains deeply layered in elusiveness. Still, given the pervasiveness of system implementations and the importance of the outcomes, this is a critical topic, especially for nurses and informatics nurse specialists.
The papers contained in this volume were presented at the fourth edition of the IFIP International Conference on Theoretical Computer Science (IFIP TCS), held August 23-24, 2006 in Santiago, Chile. They were selected from 44 papers submitted from 17 countries in response to the call for papers. A total of 16 submissions were accepted as full papers, yielding an acceptance rate of about 36%. Papers solicited for IFIP TCS 2006 were meant to constitute original contributions in two general areas: Algorithms, Complexity and Models of Computation; and Logic, Semantics, Specification and Verification. The conference also included six invited presentations: Marcelo Arenas (Pontificia Universidad Católica de Chile, Chile), Jozef Gruska (Masaryk University, Czech Republic), Claudio Gutierrez (Universidad de Chile, Chile), Marcos Kiwi (Universidad de Chile, Chile), Nicola Santoro (Carleton University, Canada), and Mihalis Yannakakis (Columbia University, USA). The abstracts of those presentations are included in this volume. In addition, Jozef Gruska and Nicola Santoro accepted our invitation to write full papers related to their talks. Those two surveys are included in the present volume as well. TCS is a biennial conference. The first edition was held in Sendai (Japan, 2000), followed by Montreal (Canada, 2002) and Toulouse (France, 2004).
From Google search to self-driving cars to human longevity, is Alphabet creating a neoteric Garden of Eden or Bentham's Panopticon? Will King Solomon's challenge supersede the Turing test for artificial intelligence? Can transhumanism mitigate existential threats to humankind? These are some of the overarching questions in this book, which explores the impact of information awareness on humanity starting from the Book of Genesis to the Royal Library of Alexandria in the 3rd century BC to the modern day of Google Search, IBM Watson, and Wolfram|Alpha. The book also covers Search Engine Optimization, Google AdWords, Google Maps, Google Local Search, and what every business leader must know about digital transformation. "Search is curiosity, and that will never be done," said Google's first female engineer and Yahoo's sixth CEO Marissa Mayer. The truth is out there; we just need to know how to Google it!
Testing techniques for VLSI circuits are undergoing many exciting changes. The predominant method for testing digital circuits consists of applying a set of input stimuli to the IC and monitoring the logic levels at primary outputs. If, for one or more inputs, there is a discrepancy between the observed output and the expected output, then the IC is declared to be defective. A new approach to testing digital circuits, which has come to be known as IDDQ testing, has been actively researched for the last fifteen years. In IDDQ testing, the steady-state supply current, rather than the logic levels at the primary outputs, is monitored. Years of research suggest that IDDQ testing can significantly improve the quality and reliability of fabricated circuits. This has prompted many semiconductor manufacturers to adopt this testing technique, among them Philips Semiconductors, Ford Microelectronics, Intel, Texas Instruments, LSI Logic, Hewlett-Packard, Sun Microsystems, Alcatel, and SGS-Thomson. This increase in the use of IDDQ testing should be of interest to three groups of individuals associated with the IC business: product managers and test engineers, CAD tool vendors, and circuit designers. Introduction to IDDQ Testing is designed to educate this community. The authors have summarized in one volume the main findings of more than fifteen years of research in this area.
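To make the contrast with logic-level testing concrete, here is a minimal Python sketch, assuming made-up readings and a made-up current limit (neither is from the book), of the pass/fail decision that IDDQ testing implies: the quiescent supply current measured for each test vector is compared against a threshold, rather than comparing output logic levels against expected values.

```python
# Illustrative IDDQ pass/fail decision: flag a die as defective if the measured
# quiescent supply current exceeds the limit for any test vector.
# The limit and the readings below are assumed values for illustration only.

IDDQ_LIMIT_UA = 5.0  # assumed current limit in microamperes

def iddq_test(readings_ua):
    """Return (passed, failing_vectors) for per-vector IDDQ readings in microamperes."""
    failing = [i for i, current in enumerate(readings_ua) if current > IDDQ_LIMIT_UA]
    return (len(failing) == 0, failing)

# Example: vector 2 draws abnormally high quiescent current, e.g. from a bridging defect.
readings = [0.8, 1.1, 42.0, 0.9]
passed, failing = iddq_test(readings)
print("PASS" if passed else f"FAIL at vectors {failing}")
```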
A Classical Introduction to Cryptography Exercise Book, by Thomas Baignères, Pascal Junod, Yi Lu, Jean Monnerat and Serge Vaudenay (EPFL - I&C - LASEC, Lausanne, Switzerland). A CIP catalogue record for this book is available from the Library of Congress. ISBN-10: 0-387-27934-2; e-ISBN-10: 0-387-28835-X; ISBN-13: 978-0-387-27934-3; e-ISBN-13: 978-0-387-28835-2. Printed on acid-free paper. © 2006 Springer Science+Business Media, Inc. All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, Inc., 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights. Printed in the United States of America.
Aimed at final-year undergraduate students, this is the first volume to publish in a new series of texts covering core subjects in operational research in an accessible, student-friendly format. This volume presents simulation paired with inventory control. The Operational Research Series aims to provide a new generation of European-originated texts of practical relevance to today's student. To guarantee accessibility, the texts are concise and have a non-mathematical orientation. These texts will provide students with the grounding in operational research theory they need to become the innovators of tomorrow.
This text looks at how computers are being used in primary classrooms and how they could be used better. Its three sections focus upon: how do we investigate learning through talk around computers? What affects the quality of group work around computers? What can teachers do to improve this?
There are many challenges facing organizations today as they incorporate electronic marketing methods into their strategy. Advances in Electronic Marketing examines these challenges within three major themes: the global environment, the strategic/technological realm, and the buyer behavior of online consumers. Each chapter raises important issues, practical applications, and relevant solutions for the electronic marketer. Advances in Electronic Marketing not only addresses Internet marketing and the World Wide Web, but also other electronic marketing tools, such as geographic information systems, database marketing, and mobile advertising. This book provides researchers and practitioners with an updated source of knowledge on electronic marketing methods.
This edited text draws together the insights of numerous worldwide eminent academics to evaluate the condition of predictive policing and artificial intelligence (AI) as interlocked policy areas. Predictive and AI technologies are growing in prominence and at an unprecedented rate. Powerful digital crime mapping tools are being used to identify crime hotspots in real-time, as pattern-matching and search algorithms are sorting through huge police databases populated by growing volumes of data in an effort to identify people liable to experience (or commit) crime, places likely to host it, and variables associated with its solvability. Facial and vehicle recognition cameras are locating criminals as they move, while police services develop strategies informed by machine learning and other kinds of predictive analytics. Many of these innovations are features of modern policing in the UK, the US and Australia, among other jurisdictions. AI promises to reduce unnecessary labour, speed up various forms of police work, encourage police forces to more efficiently apportion their resources, and enable police officers to prevent crime and protect people from a variety of future harms. However, the promises of predictive and AI technologies and innovations do not always match reality. They often have significant weaknesses, come at a considerable cost and require challenging trade-offs to be made. Focusing on the UK, the US and Australia, this book explores themes of choice architecture, decision-making, human rights, accountability and the rule of law, as well as future uses of AI and predictive technologies in various policing contexts. The text contributes to ongoing debates on the benefits and biases of predictive algorithms, big data sets, machine learning systems, and broader policing strategies and challenges. Written in a clear and direct style, this book will appeal to students and scholars of policing, criminology, crime science, sociology, computer science, cognitive psychology and all those interested in the emergence of AI as a feature of contemporary policing.
Parallel and distributed computing is one of the foremost technologies shaping modern computing. New Horizons of Parallel and Distributed Computing is a collection of self-contained chapters written by pioneering researchers to provide solutions for newly emerging problems in this field. This volume will not only provide novel ideas, work in progress and state-of-the-art techniques in the field, but will also stimulate future research activities in the area of parallel and distributed computing with applications. New Horizons of Parallel and Distributed Computing is intended for industry researchers and developers, as well as for academic researchers and advanced-level students in computer science and electrical engineering. A valuable reference work, it is also suitable as a textbook.