This text is about methods used for the computer simulation of analog systems. It concentrates on electronic applications, but many of the methods are applicable to other engineering problems as well. This revised edition (first edition, 1983) encompasses recent theoretical developments and program-writing tips.
Faster, better and cheaper are challenges that IT companies face every day. Customers' expectations must be met in a world where constant change in environment, organization and technology is the rule rather than the exception. One solution for meeting these challenges is to share knowledge and experience: use the company's own experience, and the experience of other companies. Process Improvement in Practice - A Handbook for IT Companies tackles the problems involved in launching these solutions.
Energy Power Risk: Derivatives, Computation and Optimization is a comprehensive guide presenting the latest mathematical and computational tools required for the quantification and management of energy power risk. Written by a practitioner with many years' experience in the field, it provides readers with valuable insights into the latest practices and methodologies used in today's markets, showing readers how to create innovative quantitative models for energy and power risk and derivative valuation. The book begins with an introduction to the mathematics of Brownian motion and stochastic processes, covering Geometric Brownian motion, Ito's lemma, Ito's isometry, the Ornstein-Uhlenbeck process and more. It then moves on to the simulation of power prices and the valuation of energy derivatives, before considering software engineering techniques for energy risk and portfolio optimization. The book also covers additional topics including wind and solar generation, intraday storage, and generation and demand optionality. Written in a highly practical manner and with example C++ and VBA code provided throughout, Energy Power Risk: Derivatives, Computation and Optimization will be an essential reference for quantitative analysts, financial engineers and other practitioners in the field of energy risk management, as well as researchers and students interested in the industry and how it works.
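The Geometric Brownian motion mentioned in this blurb is the standard starting point for simulating prices. As a minimal sketch (not the book's C++/VBA code, and with purely illustrative parameters), one path can be generated with the exact log-normal update S(t+dt) = S(t) * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z):

```python
import math
import random

def simulate_gbm(s0, mu, sigma, dt, n_steps, seed=42):
    """One Geometric Brownian Motion path via the exact log-normal step:
    S_{t+dt} = S_t * exp((mu - sigma^2/2) * dt + sigma * sqrt(dt) * Z)."""
    rng = random.Random(seed)  # seeded for reproducibility
    path = [s0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)  # standard normal increment
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

# Illustrative parameters (not taken from the book): a power price starting
# at 50 EUR/MWh, 5% annual drift, 30% annualized volatility, daily steps.
path = simulate_gbm(s0=50.0, mu=0.05, sigma=0.30, dt=1 / 365, n_steps=365)
print(f"simulated year-end price: {path[-1]:.2f}")
```

Because the update multiplies by an exponential, simulated prices stay strictly positive, one reason GBM is a common baseline for price models.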
Written by one of the founding fathers of Quantum Information, this book gives an accessible (albeit mathematically rigorous), self-contained introduction to quantum information theory. The central role is played by the concept of quantum channel and its entropic and information characteristics. In this revised edition, the main results have been updated to reflect the most recent developments in this very active field of research.
Offers support for a wide range of products for the RISC System/6000. An important reference for all programmers and product developers.
This is the first book to cover verification strategies and methodologies for SOC verification, from system-level verification to design sign-off. All the verification aspects in this exciting new book are illustrated with a single reference design for a Bluetooth application.
Information engineering and applications is the field of study concerned with constructing information computing, intelligent systems, mathematical models, numerical solution techniques, and using computers and other electronic devices to analyze and solve natural scientific, social scientific and engineering problems. Information engineering is an important underpinning for techniques used in information and computational science and there are many unresolved problems worth studying. The Proceedings of the 2nd International Conference on Information Engineering and Applications (IEA 2012), which was held in Chongqing, China, from October 26-28, 2012, discusses the most innovative research and developments including technical challenges and social, legal, political, and economic issues. A forum for engineers and scientists in academia, industry, and government, the Proceedings of the 2nd International Conference on Information Engineering and Applications presents ideas, results, works in progress, and experience in all aspects of information engineering and applications.
This annual publication deals with how microcomputers and other computers can be applied to improving the explanatory and evaluative roles of modern social science. Each volume contains chapters by experts in political science, psychology, sociology, economics and computer science.
Software architectures have gained wide popularity in the last decade. They generally play a fundamental role in coping with the inherent difficulties of the development of large-scale and complex software systems. Component-oriented and aspect-oriented programming enables software engineers to implement complex applications from a set of pre-defined components. Software Architectures and Component Technology collects excellent chapters on software architectures and component technologies from well-known authors, who not only explain the advantages, but also present the shortcomings of the current approaches while introducing novel solutions to overcome the shortcomings. The unique features of this book are: evaluates the current architecture design methods and component composition techniques and explains their shortcomings; presents three practical architecture design methods in detail; gives four industrial architecture design examples; presents conceptual models for distributed message-based architectures; explains techniques for refining architectures into components; presents the recent developments in component and aspect-oriented techniques; explains the status of research on Piccola, Hyper/J (R), Pluggable Composite Adapters and Composition Filters. Software Architectures and Component Technology is a suitable text for graduate-level students in computer science and engineering, and a reference for researchers and practitioners in industry.
This book is the first in a set of forthcoming books focused on state-of-the-art development in the VLSI Signal Processing area. It is a response to the tremendous research activities taking place in that field. These activities have been driven by two factors: the dramatic increase in demand for high speed signal processing, especially in consumer electronics, and the evolving microelectronic technologies. The available technology has always been one of the main factors in determining algorithms, architectures, and design strategies to be followed. With every new technology, signal processing systems go through many changes in concepts, design methods, and implementation. The goal of this book is to introduce the reader to the main features of VLSI Signal Processing and the ongoing developments in this area. The focus of this book is on:
* Current developments in Digital Signal Processing (DSP) processors and architectures - several examples and case studies of existing DSP chips are discussed in Chapter 1.
* Features and requirements of image and video signal processing architectures - both application-specific integrated circuits (ASICs) and programmable image processors are studied in Chapter 2.
* New market areas for signal processing - especially in consumer electronics such as multimedia, teleconferencing, and movies on demand.
* Impact of arithmetic circuitry on the performance of DSP processors - several topics are discussed in Chapter 3, such as number representation, arithmetic algorithms and circuits, and implementation.
After the mainframe and personal computer eras, the third major era in computer science, ubiquitous computing, describes the state of technology in which networked computers surround every user. The Handbook of Research on Ubiquitous Computing Technology for Real Time Enterprises combines the fundamental methods, algorithms, and concepts of pervasive computing with current innovations and solutions to emerging challenges. With more than 25 authoritative contributions by over 50 of the world's leading experts, this groundbreaking resource systematically covers such salient topics as network and application scalability, wireless network connectivity, adaptability and "context-aware" computing, information technology security and liability, and human-computer interaction.
Embedded systems are characterized by the presence of processors running application-specific software. Recent years have seen a large growth of such systems, and this trend is projected to continue with the growth of systems on a chip. Many of these systems have strict performance and cost requirements. To design these systems, sophisticated timing analysis tools are needed to accurately determine the extreme case (best case and worst case) performance of the software components. Existing techniques for this analysis have one or more of the following limitations: they cannot model complicated programs; they cannot model advanced micro-architectural features of the processor, such as cache memories and pipelines; and they cannot be easily retargeted for new hardware platforms. In Performance Analysis of Real-Time Embedded Software, a new timing analysis technique is presented to overcome the above limitations. The technique determines the bounds on the extreme case (best case and worst case) execution time of a program when running on a given hardware system. It partitions the problem into two sub-problems: program path analysis and microarchitecture modeling. Performance Analysis of Real-Time Embedded Software will be of interest to Design Automation professionals as well as designers of circuits and systems.
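The program path analysis the blurb describes amounts to bounding execution cost over all paths through a program's control-flow graph. A minimal sketch of the idea (a hypothetical toy CFG with made-up cycle costs, not the book's method, which also models caches and pipelines):

```python
# Toy control-flow graph: node -> (cost in cycles, successor nodes).
# Costs and structure are invented for illustration only.
cfg = {
    "entry": (2, ["check"]),
    "check": (1, ["fast", "slow"]),  # a branch with two outcomes
    "fast":  (3, ["exit"]),
    "slow":  (9, ["exit"]),
    "exit":  (1, []),
}

def path_bounds(graph, node="entry", acc=0):
    """Return (best-case, worst-case) total cost over all entry-to-exit
    paths of an acyclic CFG by exhaustive path enumeration."""
    cost, succs = graph[node]
    acc += cost
    if not succs:               # reached the exit block
        return acc, acc
    bounds = [path_bounds(graph, s, acc) for s in succs]
    return min(b for b, _ in bounds), max(w for _, w in bounds)

best, worst = path_bounds(cfg)
print(best, worst)  # → 7 13  (fast path: 2+1+3+1, slow path: 2+1+9+1)
```

Exhaustive enumeration only works for small acyclic graphs; real tools of this kind use implicit path enumeration (e.g. integer linear programming) to handle loops and large programs.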
This volume contains contributions to the First Kazakh-German Advanced Research Workshop on Computational Science and High Performance Computing, held in September 2005 in Almaty, Kazakhstan. The workshop was organized by the High Performance Computing Center Stuttgart (Stuttgart, Germany), al-Farabi Kazakh National University (Almaty, Kazakhstan) and the Institute of Computational Technologies SB RAS (Novosibirsk, Russia) in the framework of activities of the German-Russian Center for Computational Technologies and High Performance Computing.
Technological and virtual development is growing, creating an environment of online gaming that can be used as an effective and motivational instrument for math didactics in education. Simulation and Gaming for Mathematical Education: Epistemology and Teaching Strategies provides leading research on ways for various learning environments to be created with reference to math didactics through redefinition and reassessment of teaching experiences. A defining collection of field advancements, this publication gradually leads readers through the steps of planning innovative strategies in math education.
The expanding field of adult learning encompasses the study and practice of utilizing sound instructional design principles, technology, and learning theory as a means to solve educational challenges and human performance issues relating to adults, often occurring online. "Online Education and Adult Learning: New Frontiers for Teaching Practices" disseminates current issues and trends emerging in the field of adult e-learning and online instruction for the design of better products and services. This advanced publication provides theoretical understanding of the essential links between authentic learning, social psychology, instructional design, e-learning, online education, and various additional methods of adult learning.
This book presents suitable methodologies for the dynamic analysis of multibody mechanical systems with joints. It contains studies and case studies of real and imperfect joints. The book is intended for researchers, engineers, and graduate students in applied and computational mechanics.
As governmental entities face accelerating public demand for electronic services and the internal need to utilize technology to achieve superior outcomes and operational efficiency, traditional techniques and tools are radically reshaping and evolving into innovative electronic methods of conducting governmental activities. "E-Government Diffusion, Policy, and Impact: Advanced Issues and Practices" sheds light on how e-government technologies are shaping today's knowledge society, from the grass roots of the citizen experience to the supreme level of policy and decision making. With chapters providing insights into such critical topics as public service delivery, technological diffusion, and e-readiness, this publication offers researchers, students, policy makers, and practitioners a quality depiction of worldwide social practice and how advancements within the realm of technology will affect all corners of the globe.
Systematic Design of Analog IP Blocks introduces a design methodology that can help to bridge the productivity gap. Two different types of designs, depending on the design challenge, have been identified: commodity IP and star IP. Each category requires a different approach to boost design productivity. Commodity IP blocks are well suited to be automated in an analog synthesis environment and provided as soft IP. The design knowledge is usually common knowledge, and reuse is high, accounting for the setup time needed for the analog library. Star IP still changes as technology evolves, and the design cost can only be reduced by following a systematic design approach supported by point tools to relieve the designer from error-prone, repetitive tasks, allowing him or her to focus on new ideas to push the limits of the design.
Distributed and communicating objects are becoming ubiquitous. In global, Grid and Peer-to-Peer computing environments, extensive use is made of objects interacting through method calls. So far, no general formalism has been proposed for the foundation of such systems. Caromel and Henrio are the first to define a calculus for distributed objects interacting using asynchronous method calls with generalized futures, i.e., wait-by-necessity -- a must in large-scale systems, providing both high structuring and low coupling, and thus scalability. The authors provide very generic results on expressiveness and determinism, and the potential of their approach is further demonstrated by its capacity to cope with advanced issues such as mobility, groups, and components. Researchers and graduate students will find here an extensive review of concurrent languages and calculi, with comprehensive figures and summaries. Developers of distributed systems can adopt the many implementation strategies that are presented and analyzed in detail. Preface by Luca Cardelli
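The wait-by-necessity mechanism this blurb highlights means an asynchronous method call returns a future immediately, and the caller blocks only at the point where the result value is actually needed. A minimal sketch of the idea using Python's standard futures (this illustrates the mechanism only; the book defines a formal calculus, not this API):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_square(x):
    """Stand-in for a remote asynchronous method call."""
    time.sleep(0.05)
    return x * x

with ThreadPoolExecutor() as pool:
    fut = pool.submit(slow_square, 6)   # returns at once with a future
    print("call issued; caller keeps working without blocking...")
    result = fut.result()               # wait-by-necessity: block only here
    print(result)                       # → 36
```

Decoupling the call from the wait is what gives the "high structuring and low coupling" the blurb mentions: callers never block on communication itself, only on genuine data dependencies.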
Introduction, or Why I Wrote This Book. In the fall of 1997 a dedicated troff user e-mailed me the macros he used to typeset his books. I took one look inside his file and thought, "I can do this; it's just code." As an experiment I spent a week and wrote a C program and troff macros which formatted and typeset a membership directory for a scholarly society with approximately 2,000 members. When I was done, I could enter two commands, and my program and troff would convert raw membership data into 200 pages of PostScript in 35 seconds. Previously, it had taken me several days to prepare camera-ready copy for the directory using a word processor. For completeness I sat down and tried to write TeX macros for the typesetting. I failed. Although ninety-five percent of my macros worked, I was unable to prepare the columns the project required. As my frustration grew, I began this book (mentally, in my head) as an answer to the question, "Why is TeX so hard to learn?" Why use TeX? Lest you accuse me of the old horse and cart problem, I should address the question, "Why use TeX at all?" before I explain why TeX is hard. I use TeX for the following reasons. It is stable, fast, free, and it uses ASCII. Of course, the most important reason is: TeX does a fantastic job. By stable, I mean it is not likely to change in the next 10 years (much less the next one or two), and it is free of bugs. Both of these are important.
Information Technology (IT) has found its way into legal practice and, as part of that, into the judiciary. The present publication provides an impression of the developments on three continents, or, better, in a country on each of these continents: Australia (Australia), Singapore (Asia), and Venezuela (South America). In addition, reports from Norway, the Netherlands and Italy are provided; these countries can be qualified as the best equipped and organized in IT for the judiciary in Europe. Amongst the issues addressed are electronic filing systems, decision support systems, the employment of knowledge management, and on-line services, including the publication of verdicts. The central issue in the development of IT support for the judiciary worldwide appears to be the use of case management systems. IT is used as a means to bring about change in most countries. This happened in a very strong way in Singapore, where IT was used to transform an old-fashioned organization, where it was hard or even impossible to get access to case information, into a modern, well-equipped institution. This book is meant to be a comprehensive source on the use of IT in legal organizations, in particular the judiciary, for practitioners (attorneys, company lawyers, consultants) and governments, but also for academics, both students and staff. This is Volume 4 in the Information Technology and Law (IT&Law) Series.
Computing in the Nordic countries started in the late 1940s, mainly as an engineering activity to build computing devices to perform mathematical calculations and assist mathematicians and engineers in scientific problem solving. The early computers of the Nordic countries emerged during the 1950s and had names like BARK, BESK, DASK, SMIL, SARA, ESKO, and NUSSE. Each of them became a nucleus in institutes and centres for mathematical computations programmed and used by highly qualified professionals. However, one should not forget the punched-card machine technology of this time, which had existed for several decades. In addition, we have a Nordic name, namely Frederik Rosing Bull, contributing to the foundations of punched-card technology and forming the French company Bull. Commercial products such as FACIT EDB and SAAB D20-series computers in Sweden, the Danish GIER computer, the Nokia MIKKO computer in Finland, as well as the computers of Norsk Data in Norway followed the early computers. In many cases, however, companies and institutions did not further develop or exploit Nordic computing hardware, even though it exhibited technical advantages. Consequently, in the 1970s, US computers, primarily from IBM, flooded the Nordic market.
Logic Synthesis Using Synopsys (R), Second Edition is for anyone who hates reading manuals but would still like to learn logic synthesis as practised in the real world. Synopsys Design Compiler, the leading synthesis tool in the EDA marketplace, is the primary focus of the book. The contents of this book are specially organized to assist designers accustomed to schematic capture-based design to develop the required expertise to effectively use the Synopsys Design Compiler. Over 100 'Classic Scenarios' faced by designers when using the Design Compiler have been captured, discussed and solutions provided. These scenarios are based on both personal experiences and actual user queries. A general understanding of the problem-solving techniques provided should help the reader debug similar and more complicated problems. In addition, several examples and dc_shell scripts (Design Compiler scripts) have also been provided. Logic Synthesis Using Synopsys (R), Second Edition is an updated and revised version of the very successful first edition. The second edition covers several new and emerging areas, in addition to improvements in the presentation and contents in all chapters from the first edition. With the rapid shrinking of process geometries it is becoming increasingly important that 'physical' phenomena like clusters and wire loads be considered during the synthesis phase. The increasing demand for FPGAs has warranted a greater focus on FPGA synthesis tools and methodology. Finally, behavioral synthesis, the move to designing at a higher level of abstraction than RTL, is fast becoming a reality. These factors have resulted in the inclusion of separate chapters in the second edition to cover Links to Layout, FPGA Synthesis and Behavioral Synthesis, respectively. Logic Synthesis Using Synopsys (R), Second Edition has been written with the CAD engineer in mind.
A clear understanding of the synthesis tool concepts, its capabilities and the related CAD issues will help the CAD engineer formulate an effective synthesis-based ASIC design methodology. The intent is also to assist design teams to better incorporate and effectively integrate synthesis with their existing in-house design methodology and CAD tools.
Educational institutions in which administrators, managers and teachers will be working in the late 1990s will be far different from those of today. Schools, which until recently were lagging behind in the implementation of information technology (IT) in their administration and management, are now attempting to close the gap. A massive and rapid computerization process in schools, school districts and throughout the other levels of the educational system, including universities, has made computers an integral part of the educational management scene. A computer on the desk of every educational management staff member might become a reality in the near future. The term "IT" includes three main components: hardware, software - mainly management information systems (MIS) and decision support systems (DSS) - and human factors. Presently, successful implementation depends on adequate software and on human factors. MIS/DSSs are being implemented with the aim of providing meaningful support for school employees in their daily activities, and to improve their performance, effectiveness and efficiency. Much as at universities, usable and accessible school databases are being established, encompassing data on students, teachers, employees, classrooms, grade levels, courses, student achievements and behavior, school space, curriculum, finance, inventory, transportation, etc.
You may like...
Computer-Graphic Facial Reconstruction
John G. Clement, Murray K. Marks
Hardcover
R2,327
Discovery Miles 23 270
Discovering Computers - Digital…
Misty Vermaat, Mark Ciampa, …
Paperback
Discovering Computers 2018 - Digital…
Misty Vermaat, Steven Freund, …
Paperback
R1,136
Discovery Miles 11 360
Dynamic Web Application Development…
David Parsons, Simon Stobart
Paperback