The CoreGRID Network of Excellence (NoE) project began in September 2004. Two months later, in November 2004, the first CoreGRID Integration Workshop was held within the framework of the prestigious international Dagstuhl seminars. CoreGRID aims at strengthening and advancing long-term research, knowledge transfer and integration in the area of Grid and Peer-to-Peer technologies. CoreGRID is a Network of Excellence - a new type of project within the European 6th Framework Programme - intended to ensure progressive evolution and durable integration of the European Grid research community. To achieve this objective, CoreGRID brings together a critical mass of well-established researchers and doctoral students from forty-two institutions that have constructed an ambitious joint programme of activities. Although excellence is a goal to which CoreGRID is committed, durable integration is our main concern. This means that CoreGRID has to carry out activities to improve the effectiveness of European research in Grid by coordinating and adapting the participants' activities in Grid research, to share resources such as Grid testbeds, to encourage exchange of research staff and students, and to ensure close collaboration and wide dissemination of its results to the international community. Organising CoreGRID Integration Workshops is one of the activities that aims at identifying and promoting durable collaboration between partners involved in the network.
This book investigates in detail the emerging deep learning (DL) technique in computational physics, assessing its promising potential to substitute for conventional numerical solvers in calculating fields in real time. Once properly trained, the proposed architectures can solve both the forward computing and the inverse retrieval problems. Pursuing a holistic perspective, the book covers the following areas. The first chapter discusses the basic DL frameworks. Then, the steady heat conduction problem is solved by the classical U-net in Chapter 2, involving both the passive and active cases. Afterwards, the sophisticated heat flux on a curved surface is reconstructed by the presented Conv-LSTM, exhibiting high accuracy and efficiency. Next, the electromagnetic parameters of complex media, such as the permittivity and conductivity, are retrieved by a cascaded framework in Chapter 4. Additionally, a physics-informed DL structure along with a nonlinear mapping module is employed to obtain the space-, temperature- and time-dependent thermal conductivity from the transient temperature in Chapter 5. Finally, in Chapter 6, a series of the latest advanced frameworks and the corresponding physics applications are introduced. As deep learning techniques are experiencing vigorous development in computational physics, more readers are seeking related material. This book is intended for graduate students, professional practitioners, and researchers interested in DL for computational physics.
This book is a tribute to Kenichi Morita's ideas and achievements in theoretical computer science, reversibility and computationally universal mathematical machines. It offers a unique source of information on universality and reversibility in computation and is an indispensable book for computer scientists, mathematicians, physicists and engineers. Morita is renowned for his works on two-dimensional language accepting automata, complexity of Turing machines, universality of cellular automata, regular and context-free array grammars, and undecidability. His high-impact works include findings on parallel generation and parsing of array languages by means of reversible automata, construction of a reversible automaton from Fredkin gates, solving a firing squad synchronization problem in reversible cellular automata, self-reproduction in reversible cellular spaces, universal reversible two-counter machines, solution of nondeterministic polynomial (NP) problems in hyperbolic cellular automata, reversible P-systems, a new universal reversible logic element with memory, and reversibility in asynchronous cellular automata. Kenichi Morita's achievements in reversibility, universality and theory of computation are celebrated in over twenty high-profile contributions from his colleagues, collaborators, students and friends. 
The theoretical constructs presented in this book are amazing in their diversity and depth of intellectual insight, addressing: queue automata, hyperbolic cellular automata, Abelian invertible automata, number-conserving cellular automata, Brownian circuits, chemical automata, logical gates implemented via glider collisions, computation in swarm networks, picture arrays, universal reversible counter machines, input-position-restricted models of language acceptors, descriptional complexity and persistence of cellular automata, partitioned cellular automata, firing squad synchronization algorithms, reversible asynchronous automata, reversible simulations of ranking trees, Shor's factorization algorithms, and power consumption of cellular automata.
BE 2002 is the second in a series of conferences on eCommerce, eBusiness, and eGovernment organised by the three IFIP committees TC6, TC8, and TC11. As BE 2001 did last year in Zurich, BE 2002 continues to provide a forum for users, engineers, and researchers from academia, industry and government to present their latest findings in eCommerce, eBusiness, and eGovernment applications and the underlying technologies which support those applications. This year's conference comprises a main track with sessions on eGovernment, Trust, eMarkets, Fraud and Security, eBusiness (both B2B and B2C), the Design of Systems, eLearning, Public and Health Systems, Web Design, and the Applications of and Procedures for eCommerce and eBusiness, as well as two associated workshops (not included in these proceedings): eBusiness Models in the Digital Online Music and Online News Sectors; and eBusiness Standardisation - Challenges and Solutions for the Networked Economy. The 47 papers accepted for presentation in these sessions and published in this book of proceedings were selected from 80 submissions. They were rigorously reviewed (all papers were double-blind refereed) before being selected by the International Programme Committee. This rejection rate of almost 50% indicates just how seriously the Committee took its quality control activities.
In April 1993, an interdisciplinary NATO Advanced Research Workshop on "Collaborative dialogue technologies in distance learning" was held in Segovia, Spain. The workshop brought together researchers in fields related to distance learning using computer-mediated communication. The statement of justification of the NATO ARW follows hereafter. Justification of the NATO Advanced Research Workshop on Collaborative Dialogue Technologies in Distance Learning: Computer-Mediated Communication (CMC) systems have features that reduce some temporal, physical and social constraints on communication. Theories of communication have shifted from viewing communication as a linear transmission of messages by a sender to a receiver, to viewing it as a social paradigm, where individuals are actors in a network of interdependent relationships embedded in organizational and social structures. Recent research focuses on models of information-sharing to support not only the activities of individuals but also the problem-solving activities of groups, such as decision-making, planning or co-writing. This area of research is called Computer Supported Cooperative Work (CSCW). The Artificial Intelligence (AI) approach uses knowledge-based systems to enhance and facilitate all these processes, including the possibility of using natural language. The traditional model of distance education places a strong emphasis on independent study, supported by well developed learning materials. This model can be characterized as one-way media. However, the potential of CMC to provide better guidance to the student in Higher Distance Education has been quickly recognized for at least two kinds of activities: information sharing and interaction.
As suggested by the title of this book, I will present a collection of coherently related applications and a theoretical development of a general systems theory. Hopefully, this book will invite all readers to sample an exciting and challenging (even fun) piece of interdisciplinary research that has characterized the scientific and technological achievements of the twentieth century. And I hope that many of them will be motivated to do additional reading and to contribute to topics along the lines described in the following pages. Since the applications in this volume range through many scientific disciplines, from sociology to atomic physics, from Einstein's relativity theory to Dirac's quantum mechanics, from optimization theory to the unreasonable effectiveness of mathematics to the foundations of mathematical modeling, from general systems theory to Schwartz's distributions, special care has been given to write each application in a language appropriate to that field. That is, mathematical symbols and abstractions are used at different levels so that readers in various fields will find it possible to read. Also, because of the wide range of applications, each chapter has been written so that, in general, there is no need to reference a different chapter in order to understand a specific application. At the same time, if a reader has the desire to go through the entire book without skipping any chapter, it is strongly suggested to refer back to Chapters 2 and 3 as often as possible.
The developments within the computationally and numerically oriented areas of Operations Research, Finance, Statistics and Economics have been significant over the past few decades. Each area has been developing its own computer systems and languages that suit its needs, but there is relatively little cross-fertilization among them yet. This volume contains a collection of papers that each highlights a particular system, language, model or paradigm from one of the computational disciplines, aimed at researchers and practitioners from the other fields. The 15 papers cover a number of relevant topics: models and modelling in Operations Research and Economics, novel high-level and object-oriented approaches to programming, advanced uses of Maple and MATLAB, and applications and solution of differential equations in Finance. It is hoped that the material in this volume will whet the reader's appetite for discovering and exploring new approaches to old problems, and in the longer run facilitate cross-fertilization among the fields. We would like to thank the contributing authors, the reviewers, the publisher, and last, but not least, Jesper Saxtorph, Anders Nielsen, and Thomas Stidsen for invaluable technical assistance.
From a linguistic perspective, it is quantification which makes all the difference between "having no dollars" and "having a lot of dollars". And it is the meaning of the quantifier "most" which eventually decides if "Most Americans voted Kerry" or "Most Americans voted Bush" (as it stands). Natural language (NL) quantifiers like "all", "almost all", "many" etc. serve an important purpose because they permit us to speak about properties of collections, as opposed to describing specific individuals only; in technical terms, quantifiers are a 'second-order' construct. Thus the quantifying statement "Most Americans voted Bush" asserts that the set of voters of George W. Bush comprises the majority of Americans, while "Bush sneezes" only tells us something about a specific individual. By describing collections rather than individuals, quantifiers extend the expressive power of natural languages far beyond that of propositional logic and make them a universal communication medium. Hence language heavily depends on quantifying constructions. These often involve fuzzy concepts like "tall", and they frequently refer to fuzzy quantities in agreement like "about ten", "almost all", "many" etc. In order to exploit this expressive power and make fuzzy quantification available to technical applications, a number of proposals have been made for modelling fuzzy quantifiers in the framework of fuzzy set theory. These approaches usually reduce fuzzy quantification to a comparison of scalar or fuzzy cardinalities [197, 132].
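The reduction of fuzzy quantification to a cardinality comparison that the passage alludes to can be sketched in a few lines. The snippet below is illustrative only: the sigma-count (sum of membership degrees) and the piecewise-linear shape chosen for "most" are assumptions made for the example, not definitions taken from the book.

```python
# Illustrative sketch: evaluating the fuzzy quantifier "most" by a
# scalar-cardinality (sigma-count) comparison. Membership values and the
# shape of "most" below are assumptions, not the book's own model.

def sigma_count(memberships):
    """Scalar cardinality of a fuzzy set: the sum of membership degrees."""
    return sum(memberships)

def most(proportion):
    """An assumed piecewise-linear membership function for 'most':
    0 below a proportion of 0.5, 1 above 0.8, linear in between."""
    if proportion <= 0.5:
        return 0.0
    if proportion >= 0.8:
        return 1.0
    return (proportion - 0.5) / 0.3

# Fuzzy set "tall" over five people (degrees chosen for illustration).
tall = [0.9, 0.8, 0.7, 0.3, 0.1]

# Truth degree of "most people are tall": sigma-count 2.8 over 5 people
# gives a proportion of 0.56, which "most" maps to about 0.2.
truth = most(sigma_count(tall) / len(tall))
print(round(truth, 2))  # prints 0.2
```

The same pattern extends to other relative quantifiers ("almost all", "few") by swapping in a different membership function over the proportion.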
In the globalizing world, knowledge and information (and the social and technological settings for their production and communication) are now seen as keys to economic prosperity. The economy of a knowledge city creates value-added products using research, technology, and brainpower. The social benefit of knowledge-based urban development (KBUD), however, extends beyond aggregate economic growth. "Knowledge-Based Urban Development" covers the theoretical, thematic, and country-specific issues of knowledge cities to underline the growing importance of KBUD all around the world, providing academics, researchers, and practitioners with substantive research on the decisive lineaments of urban development for knowledge-based production (drawing attention to new planning processes to foster such development), and worldwide best practices and case studies in the field of urban development.
Health institutions are investing in and fielding information technology solutions at an unprecedented pace. With the recommendations from the Institute of Medicine around information technology solutions for patient safety, mandates from industry groups such as Leapfrog about using information systems to improve health care, and the move toward evidence-based practice, health institutions cannot afford to retain manual practices. The installation of multi-million dollar computerized health systems represents the very lifeblood of contemporary clinical operations and a crucial link to the financial viability of institutions. Yet the implementation of health information systems is exceptionally complex, expensive and often just plain messy. The need for improvement in the art and science of systems implementation is clear: up to 70-80% of information technology installations fail. The reasons are multi-faceted, ranging from the complexity of the diverse workflows being computerized, the intricate nature of health organizations, and the knowledge and skills of users, to strategies for obtaining key executive support, weaving through the politics peculiar to the institution, and technical facets including the usability of systems. Thus, the art and science of successfully implementing systems remain deeply layered in elusiveness. Still, given the pervasiveness of system implementations and the importance of the outcomes, this is a critical topic, especially for nurses and informatics nurse specialists.
The papers contained in this volume were presented at the fourth edition of the IFIP International Conference on Theoretical Computer Science (IFIP TCS), held August 23-24, 2006 in Santiago, Chile. They were selected from 44 papers submitted from 17 countries in response to the call for papers. A total of 16 submissions were accepted as full papers, yielding an acceptance rate of about 36%. Papers solicited for IFIP TCS 2006 were meant to constitute original contributions in two general areas: Algorithms, Complexity and Models of Computation; and Logic, Semantics, Specification and Verification. The conference also included six invited presentations: Marcelo Arenas (Pontificia Universidad Catolica de Chile, Chile), Jozef Gruska (Masaryk University, Czech Republic), Claudio Gutierrez (Universidad de Chile, Chile), Marcos Kiwi (Universidad de Chile, Chile), Nicola Santoro (Carleton University, Canada), and Mihalis Yannakakis (Columbia University, USA). The abstracts of those presentations are included in this volume. In addition, Jozef Gruska and Nicola Santoro accepted our invitation to write full papers related to their talks. Those two surveys are included in the present volume as well. TCS is a biennial conference. The first edition was held in Sendai (Japan, 2000), followed by Montreal (Canada, 2002) and Toulouse (France, 2004).
From Google search to self-driving cars to human longevity, is Alphabet creating a neoteric Garden of Eden or Bentham's Panopticon? Will King Solomon's challenge supersede the Turing test for artificial intelligence? Can transhumanism mitigate existential threats to humankind? These are some of the overarching questions in this book, which explores the impact of information awareness on humanity starting from the Book of Genesis to the Royal Library of Alexandria in the 3rd century BC to the modern day of Google Search, IBM Watson, and Wolfram|Alpha. The book also covers Search Engine Optimization, Google AdWords, Google Maps, Google Local Search, and what every business leader must know about digital transformation. "Search is curiosity, and that will never be done," said Google's first female engineer and Yahoo's sixth CEO Marissa Mayer. The truth is out there; we just need to know how to Google it!
Testing techniques for VLSI circuits are undergoing many exciting changes. The predominant method for testing digital circuits consists of applying a set of input stimuli to the IC and monitoring the logic levels at primary outputs. If, for one or more inputs, there is a discrepancy between the observed output and the expected output, then the IC is declared to be defective. A new approach to testing digital circuits, which has come to be known as IDDQ testing, has been actively researched for the last fifteen years. In IDDQ testing, the steady-state supply current, rather than the logic levels at the primary outputs, is monitored. Years of research suggest that IDDQ testing can significantly improve the quality and reliability of fabricated circuits. This has prompted many semiconductor manufacturers to adopt this testing technique, among them Philips Semiconductors, Ford Microelectronics, Intel, Texas Instruments, LSI Logic, Hewlett-Packard, Sun Microsystems, Alcatel, and SGS-Thomson. This increase in the use of IDDQ testing should be of interest to three groups of individuals associated with the IC business: product managers and test engineers, CAD tool vendors, and circuit designers. Introduction to IDDQ Testing is designed to educate this community. The authors have summarized in one volume the main findings of more than fifteen years of research in this area.
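The conventional stimulus/response testing model described above can be sketched in a few lines. This is a hypothetical illustration, not taken from the book: the device under test is modelled as a function from input vectors to output vectors, and any mismatch against the expected ("golden") responses marks the IC as defective.

```python
# Hypothetical sketch of logic-level testing as described in the passage:
# apply input stimuli, compare observed outputs with expected outputs,
# and declare the device defective on any mismatch.

def logic_test(device, test_set):
    """device: function mapping an input tuple to an output tuple.
    test_set: list of (inputs, expected_outputs) pairs."""
    for inputs, expected in test_set:
        observed = device(inputs)
        if observed != expected:
            return "defective"
    return "pass"

# A fault-free 2-input AND gate as the device under test (illustrative).
good_and = lambda ab: (ab[0] & ab[1],)
# A faulty device whose output is stuck at logic 0.
stuck_at_0 = lambda ab: (0,)

# Exhaustive test set for the 2-input gate.
vectors = [((0, 0), (0,)), ((0, 1), (0,)), ((1, 0), (0,)), ((1, 1), (1,))]
print(logic_test(good_and, vectors))    # prints pass
print(logic_test(stuck_at_0, vectors))  # prints defective
```

The contrast with IDDQ testing is that the check above inspects only logic levels at the primary outputs; an IDDQ tester would instead measure the quiescent supply current after each vector, which can expose defects that never propagate to an output.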
A Classical Introduction to Cryptography Exercise Book, by Thomas Baignères, Pascal Junod, Yi Lu, Jean Monnerat and Serge Vaudenay (EPFL - I&C - LASEC, Lausanne, Switzerland). Springer. A C.I.P. Catalogue record for this book is available from the Library of Congress. ISBN-10: 0-387-27934-2; e-ISBN-10: 0-387-28835-X; ISBN-13: 978-0-387-27934-3; e-ISBN-13: 978-0-387-28835-2. Printed on acid-free paper. © 2006 Springer Science+Business Media, Inc. All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, Inc., 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights. Printed in the United States of America.
There are many challenges facing organizations today as they incorporate electronic marketing methods into their strategy. Advances in Electronic Marketing examines these challenges within three major themes: the global environment, the strategic/technological realm, and the buyer behavior of online consumers. Each chapter raises important issues, practical applications, and relevant solutions for the electronic marketer. Advances in Electronic Marketing not only addresses Internet marketing and the World Wide Web, but also other electronic marketing tools, such as geographic information systems, database marketing, and mobile advertising. This book provides researchers and practitioners with an updated source of knowledge on electronic marketing methods.
Parallel and distributed computing is one of the foremost technologies in computing today. New Horizons of Parallel and Distributed Computing is a collection of self-contained chapters written by pioneering researchers to provide solutions for newly emerging problems in this field. This volume will not only provide novel ideas, work in progress and state-of-the-art techniques in the field, but will also stimulate future research activities in the area of parallel and distributed computing with applications. New Horizons of Parallel and Distributed Computing is intended for industry researchers and developers, as well as for academic researchers and advanced-level students in computer science and electrical engineering. A valuable reference work, it is also suitable as a textbook.
Hybrid dynamical systems, which combine continuous and discrete dynamics and variables, have attracted considerable interest recently. This emerging area is found at the interface of control theory and computer engineering, focusing on the analogue and digital aspects of systems and devices. They are essential for advances in modern digital-controller technology. "Qualitative Theory of Hybrid Dynamical Systems" provides a thorough development and systematic presentation of the foundations and framework for hybrid dynamical systems. The presentation offers an accessible, but precise, development of the mathematical models, conditions for existence of limit cycles, and criteria for their stability. The book largely concentrates on the case of discretely controlled continuous-time systems and their relevance for modeling aspects of flexible manufacturing systems and dynamically routed queuing networks. Features and topics: differential automata; development and use of the concept of "cyclic linear differential automata" (CLDA); coverage of switched single-server flow networks; application to specific models of manufacturing systems and queuing networks; a select collection of open problems for the subject; self-contained presentation of topics, with the necessary background. This book is an excellent resource for the study and analysis of hybrid dynamical systems used in systems and control engineering. Researchers, postgraduates and professionals in control engineering and computer engineering will find the book an up-to-date development of the relevant new concepts and tools.
The advancement of technology in today's world has led to the progression of several professional fields. This includes the classroom, as teachers have begun using new technological strategies to increase student involvement and motivation. ICT innovations, including virtual reality and blended learning methods, have changed the scope of classroom environments across the globe; however, significant research is lacking in this area. ICTs and Innovation for Didactics of Social Sciences is a fundamental reference focused on the didactics of social sciences and on ICTs, including issues related to innovation, resources, and strategies for teachers that can link to the transformation of social sciences teaching and learning as well as societal transformation. Highlighting topics such as blended learning, augmented reality, and virtual classrooms, this book is ideally designed for researchers, administrators, educators, practitioners, and students interested in current ICT resources and innovative strategies for the didactics of social sciences: didactic possibilities in relation to concrete conceptual contents, problem solving, planning, decision making, and the development of social skills, attention, and motivation, promoting a necessary technological literacy.
Today more than 90% of all programmable processors are employed in embedded systems. This number is actually not surprising, considering that in a typical home you might find one or two PCs equipped with high-performance standard processors, and probably dozens of embedded systems, including electronic entertainment, household, and telecom devices, each of them equipped with one or more embedded processors. The question arises why programmable processors are so popular in embedded system design. The answer lies in the fact that they help to narrow the gap between chip capacity and designer productivity. Embedded processor cores are nothing but one step further towards improved design reuse, along the lines of standard cells in logic synthesis and macrocells in RTL synthesis in earlier times of IC design. Additionally, programmable processors make it possible to migrate functionality from hardware to software, resulting in a further improved reuse factor as well as greatly increased flexibility. The LISA processor design platform (LPDP) presented in Architecture Exploration for Embedded Processors with LISA addresses recent design challenges and results in highly satisfactory solutions. The LPDP covers all major high-level phases of embedded processor design and is capable of automatically generating almost all required software development tools from processor models in the LISA language. It supports a profiling-based, stepwise refinement of processor models down to cycle-accurate and even RTL synthesis models. Moreover, it elegantly avoids the model inconsistencies otherwise omnipresent in traditional design flows. The next step in design reuse is already in sight: SoC platforms, i.e., partially pre-designed multi-processor templates that can be quickly tuned towards given applications, thereby guaranteeing a high degree of hardware/software reuse in system-level design. Consequently, the LPDP approach goes even beyond processor architecture design.
The LPDP solution explicitly addresses SoC integration issues by offering comfortable APIs for external simulation environments as well as clever solutions for the problem of both efficient and user-friendly heterogeneous multiprocessor debugging.
This text looks at how computers are being used in primary classrooms and how they could be used better. Its three sections focus upon: how do we investigate learning through talk around computers? What affects the quality of group work around computers? What can teachers do to improve this?
This book aims at providing a view of the current trends in the development of research on Synthesis and Control of Discrete Event Systems. Papers collected in this volume are based on a selection of talks given in June and July 2001 at two independent meetings: the Workshop on Synthesis of Concurrent Systems, held in Newcastle upon Tyne as a satellite event of ICATPN/ICACSD and organized by Ph. Darondeau and L. Lavagno, and the Symposium on the Supervisory Control of Discrete Event Systems (SCODES), held in Paris as a satellite event of CAV and organized by B. Caillaud and X. Xie. Synthesis is a generic term that covers all procedures aiming to construct, from specifications given as input, objects matching these specifications. Theories and applications of synthesis have long been studied and developed in connection with logics, programming, automata, discrete event systems, and hardware circuits. Logics and programming are outside the scope of this book, whose focus is on Discrete Event Systems and Supervisory Control. The stress today in this field is on a better applicability of theories and algorithms to practical systems design. Coping with decentralization or distribution and caring for an efficient realization of the synthesized systems or controllers are of the utmost importance in areas as diverse as the supervision of embedded or manufacturing systems, or the implementation of protocols in software or in hardware.
Aimed at final year undergraduate students, this is the first volume to publish in a new series of texts covering core subjects in operational research in an accessible, student-friendly format. This volume presents simulation paired with inventory control. The Operational Research Series aims to provide a new generation of European-originated texts of practical relevance to today's student. To guarantee accessibility, the texts are concise and have a non-mathematical orientation. These texts will provide students with the grounding in operational research theory they need to become the innovators of tomorrow.
Identifying Emerging Trends in Technological Innovation. Doctoral programs in science and engineering are important sources of innovative ideas and techniques that might lead to new products and technological innovation. Certainly most PhD students are not experienced researchers and are still in the process of learning how to do research. Nevertheless, a number of empirical studies also show that a high number of technological innovation ideas are produced in the early careers of researchers. The combination of the eagerness of young doctoral students to try new approaches and directions with the experience and broad knowledge of their supervisors is likely to result in an important pool of innovation potential. The DoCEIS doctoral conference on Computing, Electrical and Industrial Engineering aims at creating a space for sharing and discussing ideas and results from doctoral research in these inter-related areas of engineering. Innovative ideas and hypotheses can be better enhanced when presented and discussed in an encouraging and open environment. DoCEIS aims to provide such an environment, releasing PhD students from the pressure of presenting their propositions in more formal contexts.