This institute was organized and presented by an international group of scholars interested in the advancement of instructional design automation through theory, research and applied evaluation. Members of the organizing committee included Dr. Klaus Breuer (Germany), Dr. Jose J. Gonzalez from Agder College of Engineering (Norway), Dr. Begoña Gros from the University of Barcelona, and Dr. J. Michael Spector from the Armstrong Laboratory (USA). Dr. Gonzalez, co-director of the institute, and the staff of Agder College were directly responsible for the preparation and operation of the institute in Grimstad, Norway. The institute was held on the campus of Agder College of Engineering, July 12-23, 1993. The theme of the institute extended the initial work developed by the presenters at a NATO Advanced Research Workshop held in Sitges, Spain in 1992. During the two-week institute, 21 presentations were made, including papers and demonstrations. In addition to the formal presentations, working groups and on-site study groups provided opportunities for the students to participate directly in program activities. An important outcome for the working groups was the formal presentation of their efforts as chapters for this volume.
Deryn Watson: CapBIT '97, Capacity Building for Information Technologies in Education in Developing Countries, from which this publication derives, was an invited IFIP working conference sponsored by Working Groups in secondary (WG 3.1), elementary (WG 3.5), and vocational and professional (WG 3.4) education under the auspices of the IFIP Technical Committee for Education (TC3). The conference was held in Harare, Zimbabwe, 25th-29th August 1997. CapBIT '97 was the first time that the IFIP Technical Committee for Education had held a conference in a developing country. When the Computer Society of Zimbabwe offered to host the event, we determined that the location and conference topic should reflect the importance of issues facing countries at all stages of development, especially Information Technologies (IT) development. Information Technologies have become, within a short time, one of the basic building blocks of modern industrial society. Understanding IT, and mastering basic skills and concepts of IT, are now regarded as part of the core education of all people around the world, alongside reading and writing. IT now permeates the business environment and underpins the success of modern corporations as well as providing government with cost-effective civil service systems. At the same time, the tools and technologies of IT are of value in the process of learning, and in the organisation and management of learning institutions.
Frequency Compensation Techniques for Low-Power Operational Amplifiers is intended for professional designers of integrated amplifiers, emphasizing low-voltage and low-power solutions. The book bridges the gap between the professional designer's needs and available techniques for frequency compensation. It does so by explaining existing techniques and introducing several new techniques, including Hybrid Nested Miller compensation, Multipath Miller Zero cancellation and Multipath Conditionally Stable compensation. All compensation techniques are treated in a stage-number-based order, progressing from a single transistor to circuits with six stages and more. Apart from discussing the mathematical basis of the compensation methods, the book provides the reader with the factual information that is required for practicing the design of integrated feedback amplifiers, along with many worked-out examples. What is more, many bipolar and CMOS operational amplifier realizations, along with their measurement results, prove the effectiveness of the compensation techniques in real-life circuits. The text focuses on low-voltage, low-power integrated amplifiers. Many of the presented bipolar circuits operate at supply voltages down to 1 V, while several CMOS amplifiers that function correctly just slightly above this voltage are demonstrated. The lowest measured power consumption amounts to 17 μW for a class-AB CMOS op amp with 120 dB gain. Despite this attention to low voltage and low power, the frequency compensation strategies provided are universally applicable. The fundamental approach followed leads to efficient compensation strategies that are well guarded against the parameter variations inherent to the mass fabrication of integrated circuits. The book is essential reading for practicing analog design engineers and researchers in the field. It is also suitable as a text for an advanced course on the subject.
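As background for readers new to the topic, the classic two-stage Miller (pole-splitting) result, which the nested and multipath schemes above generalize, can be sketched as follows. This is the standard textbook approximation, not a formula quoted from this book:

```latex
% Two-stage Miller-compensated amplifier, small-signal approximation.
% g_{m1}, g_{m2}: transconductances of the first and second stage;
% R_1, C_1 and R_2, C_2: resistance and capacitance at the internal
% and output nodes; C_m: the Miller compensation capacitor.
\omega_{p1} \approx \frac{1}{g_{m2} R_1 R_2 C_m}, \qquad
\omega_{p2} \approx \frac{g_{m2}}{C_1 + C_2}, \qquad
\mathrm{GBW} \approx \frac{g_{m1}}{C_m}
```

Increasing C_m pushes the dominant pole down and the second pole up, which is why the technique is called pole splitting; the compensation schemes treated in the book extend this idea to amplifiers with three and more stages.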
The SCAN conference, the International Symposium on Scientific Computing, Computer Arithmetic and Validated Numerics, takes place biannually under the joint auspices of GAMM (Gesellschaft für Angewandte Mathematik und Mechanik) and IMACS (International Association for Mathematics and Computers in Simulation). SCAN-98 attracted more than 100 participants from 21 countries all over the world. During the four days from September 22 to 25, nine highlighted plenary lectures and over 70 contributed talks were given. These figures indicate a large participation, partly due to the attraction of the organizing country, Hungary, but the effective support system also contributed to the success. The conference was substantially supported by the Hungarian Research Fund OTKA, GAMM, the National Technology Development Board OMFB and by the József Attila University. Thanks to this funding, it was possible to subsidize the participation of over 20 scientists, mainly from Eastern European countries. Importantly, the support obtained made possible what was probably the first conference participation of 6 young researchers. The number of East-European participants was relatively high. These results are especially valuable since, in contrast to the usual two-year period, the present meeting was organized just one year after the previous SCAN conference.
Small Business Clustering Technology: Applications in Marketing, Management, Finance and IT examines the development and role of small business clusters from a variety of disciplines: economics, marketing, management, and information systems. Unlike many issues that are hampered by ideological divides between disciplines, this book shows that cluster analysis is truly interdisciplinary. It brings together perspectives on small business clusters from a range of disciplines and countries, highlights the commonalities in the literature, and gives a range of case studies illustrating the variety of clusters throughout the world.
Python Programming and Numerical Methods: A Guide for Engineers and Scientists introduces programming tools and numerical methods to engineering and science students, with the goal of helping the students to develop good computational problem-solving techniques through the use of numerical methods and the Python programming language. Part One introduces fundamental programming concepts, using simple examples to put new concepts quickly into practice. Part Two covers the fundamentals of algorithms and numerical analysis at a level that allows students to quickly apply results in practical settings.
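To give a flavor of the material in Part Two, a typical exercise combines a basic Python function with a classic numerical method such as the composite trapezoid rule. This is an illustrative sketch in the spirit of the book, not an excerpt from it:

```python
def trapezoid(f, a, b, n):
    """Approximate the integral of f over [a, b] using n trapezoids."""
    h = (b - a) / n
    # Endpoints are weighted by 1/2, interior points by 1.
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return h * total

# Integrate x**2 on [0, 1]; the exact value is 1/3.
approx = trapezoid(lambda x: x * x, 0.0, 1.0, 1000)
```

The error of the composite trapezoid rule shrinks quadratically as n grows, which is exactly the kind of accuracy-versus-cost trade-off the numerical analysis chapters examine.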
Many challenges lie ahead in the development of a global information society. Culture and democracy are two areas which may be under particular threat. The book reflects on today's complex and uncertain cultural and democratic developments arising as a result of an increasingly global, technologically-connected world. In particular it focuses on the Internet, examining new metaphors for communication, defining the issues at stake and proposing options, actions and solutions. Among the issues discussed were: multi-cultural developments; cultural sensitivities and the involvement of cultural minorities; generation gaps; gender issues; technology access for the elderly and the disabled; technology transfer.
The primary goal of this book is to present to the scientific and management communities a selection of applications using recent Soft Computing (SC) and Computing with Words and Perceptions (CWP) models and techniques intended to solve economic and financial problems. The selected examples could also serve as a starting point for applying SC and CWP techniques to a wider range of problems in economics and finance. Decision making in the present world is becoming more and more sophisticated, time consuming and difficult for human beings, who require more and more computational support. This book addresses the significant increase in research and applications of Soft Computing and Computing with Words and Perceptions for decision making in economics and finance in recent years. Decision making is heavily based on information and knowledge, usually extracted from the analysis of large amounts of data. Data mining techniques enabled with the capability to integrate human experience could be used for more realistic business decision support. Computing with Words and Perceptions, introduced by Lotfi Zadeh, can serve as a basis for such an extension of traditional data mining and decision-making systems. Fuzzy logic, as a main constituent of CWP, gives powerful tools for modeling and processing linguistic information defined on numerical domains.
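Fuzzy membership functions are the basic building block behind the linguistic modeling mentioned above. A minimal sketch (illustrative only; the linguistic term and its numeric range are hypothetical, not taken from the book) of a triangular membership function:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic term "moderate return" on a 0-10% scale:
# fully "moderate" at 5%, not "moderate" at all below 2% or above 8%.
mu = triangular(5.0, 2.0, 5.0, 8.0)
```

A concrete return of 3.5% would belong to "moderate return" with degree 0.5, and fuzzy rules then combine such degrees instead of crisp thresholds.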
Very large scale integration (VLSI) technologies are now maturing, with a current emphasis toward submicron structures and sophisticated applications combining digital as well as analog circuits on a single chip. Abundant examples are found in today's advanced systems for telecommunications, robotics, automotive electronics, image processing, intelligent sensors, etc. Exciting new applications are being unveiled in the field of neural computing, where the massive use of analog/digital VLSI technologies will have a significant impact. To match such a fast technological trend towards single-chip analog/digital VLSI systems, researchers worldwide have long realized the vital need to produce advanced computer-aided tools for designing both digital and analog circuits and systems for silicon integration. Architecture and circuit compilation, device sizing and layout generation are but a few familiar tasks in the world of digital integrated circuit design which can be efficiently accomplished by mature computer-aided tools. In contrast, the art of tools for designing and producing analog or even analog/digital integrated circuits is quite primitive and still lacking the industrial penetration and acceptance already achieved by their digital counterparts. In fact, analog design is commonly perceived to be one of the most knowledge-intensive design tasks, and analog circuits are still designed, largely by hand, by experts intimately familiar with the nuances of the target application and integrated circuit fabrication process. The techniques needed to build good analog circuits seem to exist solely as expertise invested in individual designers.
Jerome McGann has been at the forefront of the digital revolution in the humanities. His pioneering critical projects on the World Wide Web have redefined traditional notions about interpreting literature. In this trailblazing book, McGann explores the profound implications digital media have for the core critical tasks of the humanities. Drawing on his work as editor of the acclaimed hypertext project The Rossetti Archive, he sets the foundation for a new critical practice for the digital age. Digital media, he demonstrates, can do much more than organize access to great works of literature and art. Beyond their acknowledged editorial and archival capabilities, digital media are also critical tools of unprecedented power. In McGann's practical vision, digital tools give scholars a flexible, dynamic means for interpreting expressive works, especially those that combine text and image. Radiant Textuality demonstrates eloquently how new technologies can deepen our understanding of complex, multi-layered works of the human imagination in ways never before thought possible.
Web Systems Design and Online Consumer Behavior takes an interdisciplinary approach toward systems design in the online environment by providing an understanding of how consumers behave while shopping online and how certain system design elements may impact consumers' perceptions, attitudes, intentions, and actual behavior. This book contains theoretical and empirical research from expert scholars in a number of areas, including communications, psychology, marketing and advertising, and information systems. It provides an integrated look at the subject area to further our understanding of the linkages among the various disciplines inherently connected with one another in electronic commerce.
This textbook presents a thorough foundation to the theory of computation. Combining intuitive descriptions and illustrations with rigorous arguments and detailed proofs for key topics, the logically structured discussion guides the reader through the core concepts of automata and languages, computability, and complexity of computation. Topics and features: presents a detailed introduction to the theory of computation, complete with concise explanations of the mathematical prerequisites; provides end-of-chapter problems with solutions, in addition to chapter-opening summaries and numerous examples and definitions throughout the text; draws upon the author's extensive teaching experience and broad research interests; discusses finite automata, context-free languages, and pushdown automata; examines the concept, universality and limitations of the Turing machine; investigates computational complexity based on Turing machines and Boolean circuits, as well as the notion of NP-completeness.
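The finite automata covered above are easy to make concrete in code. A minimal sketch (not taken from the textbook) of simulating a deterministic finite automaton given its transition table:

```python
def run_dfa(transitions, start, accepting, word):
    """Simulate a DFA: follow one transition per input symbol and
    accept iff the final state is in the accepting set."""
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting

# DFA over {0, 1} accepting exactly the strings with an even number of 1s.
delta = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
```

For example, `run_dfa(delta, "even", {"even"}, "1010")` accepts, while `"100"` is rejected; the same two-state machine decides the language no matter how long the input grows, which is the essence of the regular languages studied in the early chapters.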
This book is a result of the Tenth International Conference on Information Systems Development (ISD2001) held at Royal Holloway, University of London, United Kingdom, during September 5-7, 2001. ISD2001 carries on the fine tradition established by the first Polish-Scandinavian Seminar on Current Trends in Information Systems Development Methodologies, held in Gdansk, Poland in 1988. Through the years, this seminar evolved into the International Conference on Information Systems Development. The conference gives participants an opportunity to express ideas on the current state of the art in information systems development, and to discuss and exchange views on new methods, tools, applications as well as theory. In all, 55 papers were presented at ISD2001, organised into twelve tracks covering the following themes: Systems Analysis and Development, Modelling, Methodology, Database Systems, Collaborative Systems, Theory, Knowledge Management, Project Management, IS Education, Management Issues, E-Commerce, and Technical Issues. We would like to thank all the contributing authors for making this book possible and for their participation in ISD2001. We are grateful to our panel of paper reviewers for their help and support. We would also like to express our sincere thanks to Ceri Bowyer and Steve Brown for their unfailing support with organising ISD2001.
Chance discovery means discovering chances - the breaking points in systems, the marketing windows in business, etc. It involves determining the significance of some piece of information about an event and then using this new knowledge in decision making. The techniques developed combine data mining methods for finding rare but important events with knowledge management, groupware, and social psychology. The reader will find many applications, such as finding information on the Internet, recognizing changes in customer behavior, detecting the first signs of an imminent earthquake, etc. This first book dedicated to chance discovery covers the state of the art in the theory and methods and examines typical scenarios, and it thus appeals to researchers working on new techniques and algorithms and also to professionals dealing with real-world applications.
Data science has always been an effective way of extracting knowledge and insights from information in various forms. One industry that can utilize the benefits from the advances in data science is the healthcare field. The Handbook of Research on Data Science for Effective Healthcare Practice and Administration is a critical reference source that overviews the state of data analysis as it relates to current practices in the health sciences field. Covering innovative topics such as linear programming, simulation modeling, network theory, and predictive analytics, this publication is recommended for all healthcare professionals, graduate students, engineers, and researchers that are seeking to expand their knowledge of efficient techniques for information analysis in the healthcare professions.
Distributed Infrastructure Support for E-Commerce and Distributed Applications is organized in three parts. The first part constitutes an overview, a more detailed motivation of the problem context, and a tutorial-like introduction to middleware systems. The second part comprises a set of chapters that study solutions to leverage the trade-off between a transparent programming model and application-level resource control. The third part presents three detailed distributed application case studies and demonstrates how standard middleware platforms fail to adequately cope with the resource control needs of the application designer in these three cases.
This book proposes a new approach to circuit simulation that is still in its infancy. The reason for publishing this work as a monograph at this time is to quickly distribute these ideas to the research community for further study. The book is based on a doctoral dissertation undertaken at MIT between 1982 and 1985. In 1982 the author joined a research group that was applying bounding techniques to simple VLSI timing analysis models. The conviction that bounding analysis could also be successfully applied to sophisticated digital MOS circuit models led to the research presented here. Acknowledgments: The author would like to acknowledge many helpful discussions and much support from his research group at MIT, including Lance Glasser, John Wyatt, Jr., and Paul Penfield, Jr. Many others have also contributed to this work in some way, including Albert Ruehli, Mark Horowitz, Rich Zippel, Chris Terman, Jacob White, Mark Matson, Bob Armstrong, Steve McCormick, Cyrus Bamji, John Wroclawski, Omar Wing, Gary Dare, Paul Bassett, and Rick LaMaire. The author would like to give special thanks to his wife, Deborra, for her support and many contributions to the presentation of this research. The author would also like to thank his parents for their encouragement, and IBM for its financial support of this project through a graduate fellowship. THE BOUNDING APPROACH TO VLSI CIRCUIT SIMULATION. 1. INTRODUCTION. The VLSI revolution of the 1970s has created a need for new circuit analysis techniques.
Real-time model predictive controller (MPC) implementation in active vibration control (AVC) is often rendered difficult by fast sampling speeds and extensive actuator-deformation asymmetry. If the control of lightly damped mechanical structures is assumed, the region of attraction containing the set of allowable initial conditions requires a large prediction horizon, making the already computationally demanding online process even more complex. Model Predictive Vibration Control provides insight into the predictive control of lightly damped vibrating structures by exploring computationally efficient algorithms which are capable of low-frequency vibration control with guaranteed stability and constraint feasibility. In addition to a theoretical primer on active vibration damping and model predictive control, Model Predictive Vibration Control provides a guide through the necessary steps in understanding the founding ideas of predictive control applied in AVC, such as: the implementation of computationally efficient algorithms; control strategies in simulation and experiment; and typical hardware requirements for piezoceramic-actuated smart structures. The use of a simple laboratory model and the inclusion of over 170 illustrations provide readers with clear and methodical explanations, making Model Predictive Vibration Control the ideal support material for graduates, researchers and industrial practitioners with an interest in efficient predictive control to be utilized in active vibration attenuation.
The first volume in a series which aims to focus on advances in computational biology. This volume discusses such topics as: fluctuations in the shape of flexible macromolecules; the hydration of carbohydrates as seen by computer simulation; and studies of salt-peptide solutions.
Circuit simulation has become an essential tool in circuit design, and without its aid, analogue and mixed-signal IC design would be impossible. However, the applicability and limitations of circuit simulators have not been generally well understood, and this book now provides a clear and easy-to-follow explanation of their function. The material covered includes the algorithms used in circuit simulation and the numerical techniques needed for linear and non-linear DC analysis, transient analysis and AC analysis. The book goes on to extend the numerical methods to include sensitivity and tolerance analysis and the optimisation of component values for circuit design. The final part deals with logic simulation and mixed-signal simulation algorithms. There are comprehensive and detailed descriptions of the numerical methods, and the material is presented in a way that provides for the needs of both experienced engineers who wish to extend their knowledge of current tools and techniques, and of advanced students and researchers who wish to develop new simulators.
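The non-linear DC analysis mentioned above is typically built on Newton-Raphson iteration over the circuit equations. A minimal sketch for a single-node circuit (the component values are hypothetical, chosen for illustration, not taken from the book):

```python
import math

def newton(f, dfdx, x0, tol=1e-12, max_iter=100):
    """Newton-Raphson iteration, the workhorse of non-linear DC analysis."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Hypothetical circuit: 5 V source, 1 kOhm series resistor, diode to ground.
# Unknown: the diode voltage vd.  KCL: diode current = resistor current.
Is, Vt, Vs, R = 1e-14, 0.025, 5.0, 1e3
f = lambda vd: Is * (math.exp(vd / Vt) - 1) - (Vs - vd) / R
dfdx = lambda vd: Is / Vt * math.exp(vd / Vt) + 1 / R
vd = newton(f, dfdx, 0.6)  # converges to roughly 0.67 V
```

Real simulators solve the same kind of system for thousands of nodes at once, with damping and limiting schemes to keep the exponential device models from derailing the iteration.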
Switching Theory for Logic Synthesis covers the basic topics of switching theory and logic synthesis in fourteen chapters. Chapters 1 through 5 provide the mathematical foundation. Chapters 6 through 8 include an introduction to sequential circuits, optimization of sequential machines and asynchronous sequential circuits. Chapters 9 through 14 are the main feature of the book. These chapters introduce and explain various topics that make up the subject of logic synthesis: multi-valued input, two-valued output functions, logic design for PLDs/FPGAs, EXOR-based design, and complexity theories of logic networks. An appendix providing a history of switching theory is included. The reference list consists of over four hundred entries. Switching Theory for Logic Synthesis is based on the author's lectures at Kyushu Institute of Technology as well as seminars for CAD engineers from various Japanese technology companies. Switching Theory for Logic Synthesis will be of interest to CAD professionals and students at the advanced level. It is also useful as a textbook, as each chapter contains examples, illustrations, and exercises.
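EXOR-based design, one of the topics listed above, rests on the positive-polarity Reed-Muller (algebraic normal form) expansion of a Boolean function, which can be computed from a truth table with a butterfly-style transform. The following is an illustrative sketch of that standard construction, not code from the book:

```python
def anf_coefficients(truth):
    """Positive-polarity Reed-Muller (ANF) coefficients of a Boolean
    function given as a truth table of length 2**n (entries 0 or 1).
    coeffs[m] = 1 means the product of the variables in bitmask m
    appears in the EXOR-sum-of-products expression."""
    coeffs = list(truth)
    n = len(truth).bit_length() - 1
    for i in range(n):  # butterfly over each variable, in place
        step = 1 << i
        for j in range(len(coeffs)):
            if j & step:
                coeffs[j] ^= coeffs[j ^ step]
    return coeffs

# f(x1, x0) = x0 XOR x1, truth table indexed by the bits (x1, x0).
xor_anf = anf_coefficients([0, 1, 1, 0])
```

For the XOR example the transform yields coefficients at the masks for x0 and x1 only, i.e. f = x0 ⊕ x1, while AND ([0, 0, 0, 1]) yields a single coefficient for the product term x0·x1.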
The power of modern information systems and information technology (IS/IT) offers new opportunities to rethink, at the broadest levels, existing business strategies, approaches and practices. Over the past decade, IT has opened up new business opportunities, led to the development of new strategic IS and challenged all managers and users of IS/IT to devise new ways to make better use of information. Yet this era, which began with much confidence and optimism, is now suffering under a legacy of systems that are increasingly failing to meet business needs, and lasting fixes are proving costly and difficult to implement. General management is experiencing a crisis of confidence in their IS functions and in the chief information systems officers who lead them (Earl and Feeney, 1994:11). The concern for chief executive officers is that they are confronting a situation that is seemingly out of control. They are asking, 'What is the best way to rein in these problems and effectively assess IS performance? Further, how can we be certain that IS is adequately adding value to the organisational bottom line?' On the other hand, IS executives and professionals who are responsible for creating, managing and maintaining the organisation's systems are worried about the preparedness of general managers to cope with the growth in new technologies and systems. They see IT having a polarising effect on general managers; it either bedazzles or frightens them (Davenport, 1994:119).
Advances in Computer and Information Sciences and Engineering includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Software Engineering, Computer Engineering, and Systems Engineering and Sciences. Advances in Computer and Information Sciences and Engineering includes selected papers from the conference proceedings of the International Conference on Systems, Computing Sciences and Software Engineering (SCSS 2007) which was part of the International Joint Conferences on Computer, Information and Systems Sciences and Engineering (CISSE 2007).
Embedded core processors are becoming a vital part of today's system-on-a-chip in the growing areas of telecommunications, multimedia and consumer electronics. This is mainly in response to a need to track evolving standards with the flexibility of embedded software. Consequently, maintaining high product performance and low product cost requires a careful design of the processor tuned to the application domain. With the increased presence of instruction-set processors, retargetable software compilation techniques are critical, not only for improving engineering productivity, but to allow designers to explore the architectural possibilities for the application domain. Retargetable Compilers for Embedded Core Processors, with a Foreword written by Ahmed Jerraya and Pierre Paulin, overviews the techniques of modern retargetable compilers and shows the application of practical techniques to embedded instruction-set processors. The methods are highlighted with examples from industry processors used in products for multimedia, telecommunications, and consumer electronics. An emphasis is given to the methodology and experience gained in applying two different retargetable compiler approaches in industrial settings. The book also discusses many pragmatic areas such as language support, source code abstraction levels, validation strategies, and source-level debugging. In addition, new compiler techniques are described which support address generation for DSP architecture trends. The contribution is an address calculation transformation based on an architectural model. Retargetable Compilers for Embedded Core Processors will be of interest to embedded system designers and programmers, the developers of electronic design automation (EDA) tools for embedded systems, and researchers in hardware/software co-design.