The evaluation of IT and its business value have recently been the subject of many academic and business discussions, as business managers, management consultants and researchers regularly question whether and how the contribution of IT to business performance can be evaluated effectively. Investments in IT are growing rapidly, and business managers worry that the benefits of IT investments might not be as high as expected. This phenomenon is often called the IT investment paradox or the IT Black Hole: large sums are invested in IT that seem to be swallowed by a large black hole without rendering many returns. Information Systems Evaluation Management discusses these issues, among others, through its presentation of the most current research in the field of IS evaluation. It is an area of study that touches upon a variety of types of businesses and organizations: essentially, all those that involve IT in their business practices.
This book focuses on the aspects related to the parallelization of evolutionary computations, such as parallel genetic operators, parallel fitness evaluation, distributed genetic algorithms, and parallel hardware implementations, as well as on their impact on several applications. It offers a wide spectrum of sample works from leading research on parallel implementations of efficient techniques at the heart of computational intelligence.
System-Level Synthesis deals with the concurrent design of electronic applications, including both hardware and software. The issue has become the bottleneck in the design of electronic systems in several major industrial fields, including telecommunications, automotive and aerospace engineering. The major difficulty with the subject is that it demands contributions from several research fields, including system specification, system architecture, hardware design, and software design. Most existing books cover only a few aspects of system-level synthesis well. The present volume presents a comprehensive discussion of all the aspects of system-level synthesis. Each topic is covered by a contribution written by an international authority on the subject.
Analog Design Issues in Digital VLSI Circuits and Systems brings together in one place important contributions and up-to-date research results in this fast moving area. Analog Design Issues in Digital VLSI Circuits and Systems serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
Mobile devices are the 'it' technology, and everyone wants to know how to apply them to their environments. This book brings together the best examples and insights for implementing mobile technology in libraries. Chapters cover a wide variety of the most important tools and procedures, from developing applications to marketing and augmented reality. Readers of this volume will get complete and timely knowledge of library applications for handheld devices. The Handheld Librarian conferences have been a centrepiece of learning about how to apply mobile technologies to library services and collections, as well as a forum for sharing examples and lessons learned. The conferences have brought our profession forward into the trend and kept us up to date with ongoing advances. This volume brings together the best from that rich history and presents librarians with the basic information they need to successfully make the case for and implement programs leveraging mobile devices in their libraries. Authors of the diverse practical and well-researched pieces originate in all types of libraries and segments of the profession. This wide representation ensures that front-line librarians, library administrators, systems staff, and even library professors will find this volume perfectly geared for their needs. This book was published as a special issue of The Reference Librarian.
Timing research in high performance VLSI systems has advanced at a steady pace over the last few years. Tools, however, especially theoretical mechanisms, lag behind. Much of the present timing research relies heavily on timing diagrams, which, although intuitive, are inadequate for analysis of large designs with many parameters. Further, timing diagrams offer only approximations, not exact solutions, to many timing problems, and provide little insight into cases where temporal properties of a design interact intricately with the design's logical functionalities. Timed Boolean Functions presents a methodology for timing research which facilitates analysis and design of circuits and systems in a unified temporal and logical domain. The goal of the book is to present the central idea of representing logical and timing information in a common structure, the timed Boolean function (TBF), and to present a canonical form suitable for efficient manipulation. This methodology is then applied to practical applications to provide intuition and insight into the subject, so that these general methods can be adapted to specific engineering problems, and also to further the research necessary to enhance the understanding of the field. Timed Boolean Functions is written for professionals involved in timing research and digital designers who want to enhance their understanding of the timing aspects of high speed circuits. The prerequisites are a common background in logic design, computer algorithms, combinatorial optimization and a certain degree of mathematical sophistication.
It has long been apparent to academic library administrators that the current technical services operations within libraries need to be redirected and refocused in terms of both format priorities and human resources. A number of developments and directions have made this reorganization imperative, many of which have been accelerated by the current economic crisis. All of the chapters detail some aspect of technical services reorganization due to downsizing and/or reallocation of human resources, retooling professional and support staff in higher level duties and/or non-MARC metadata, "value-added" metadata opportunities, outsourcing redundant activities, and shifting resources from analog to digital object organization and description. This book will assist both catalogers and library administrators with concrete examples of moving technical services operations and personnel from the analog to the digital environment. This book was published as a special double issue of Cataloging & Classification Quarterly.
The size of technically producible integrated circuits increases continuously, but the ability to design and verify these circuits does not keep up. Today's design flow therefore has to be improved. Taking a visionary approach, this book analyzes the current design and verification methodologies, identifies a number of deficiencies, and suggests solutions. Improvements in the methodology as well as in the underlying algorithms are proposed.
Linear algebra is growing in importance. 3D entertainment, animations in movies, and video games are developed using linear algebra. Animated characters are generated using equations straight out of this book. Linear algebra is used to extract knowledge from the massive amounts of data generated by modern technology. The Fourth Edition of this popular text introduces linear algebra in a comprehensive, geometric, and algorithmic way. The authors start with the fundamentals in 2D and 3D, then move on to higher dimensions, expanding on the fundamentals and introducing new topics, which are necessary for many real-life applications and the development of abstract thought. Applications are introduced to motivate topics. The subtitle, A Geometry Toolbox, hints at the book's geometric approach, which is supported by many sketches and figures. Furthermore, the book covers applications of triangles, polygons, conics, and curves. Examples demonstrate each topic in action. This practical approach to a linear algebra course, whether through classroom instruction or self-study, is unique to this book. New to the Fourth Edition:
- Ten new application sections.
- A new section on change of basis; this concept now appears in several places.
- Chapters 14-16 on higher dimensions are notably revised.
- A deeper look at polynomials in the gallery of spaces.
- Introduces the QR decomposition and its relevance to least squares.
- Similarity and diagonalization are given more attention, as are eigenfunctions.
- A longer thread on least squares, running from orthogonal projections to a solution via SVD and the pseudoinverse.
- More applications for PCA have been added.
- More examples, exercises, and more on the kernel and general linear spaces.
- A list of applications has been added in Appendix A.
The book gives instructors the option of tailoring the course for the primary interests of their students: mathematics, engineering, science, computer graphics, and geometric modeling.
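The least-squares thread the blurb describes can be sketched in a few lines of NumPy; the data points below are invented for illustration, and both the pseudoinverse and QR routes are shown to agree:

```python
# Minimal sketch: fit a line y = c0 + c1*x to four made-up points by
# least squares, once via the pseudoinverse (SVD under the hood) and
# once via the QR decomposition.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.1, 2.9, 4.2])
A = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]

coeffs = np.linalg.pinv(A) @ y              # pseudoinverse solution

Q, R = np.linalg.qr(A)                      # QR route: solve R c = Q^T y
coeffs_qr = np.linalg.solve(R, Q.T @ y)

print(coeffs)   # [intercept, slope]
```

Both routes solve the same normal equations, which is why the two coefficient vectors coincide up to floating-point error.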
Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, which is illustrated by many realistic demonstrations used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multimedia, radar, and sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.
Gay provides an authoritative overview of the major developments, people, and organizations that have shaped the design and use of computers. He also describes innovations in computer research and technology (including highlights from developments in supercomputers, supercomputer networks, and Internet 2), the latest trends in consumer products (new applications that influence the way individuals and businesses interface with their world), social issues related to computers (Microsoft's influence, privacy, encryption, universal access, and adaptive technologies), and information on computer careers and how to prepare for them.
The objective of the NATO Advanced Research Workshop "Learning electricity and electronics with advanced educational technology" was to bring together researchers coming from different domains. Electricity education is a domain where a lot of research has already been done. The first meeting on electricity teaching was organized in 1984 by R. Duit, W. Jung and C. von Rhoneck in Ludwigsburg (Germany). Since then, research has been going on, and we can consider the workshop the successor of this first meeting. Our goal was not to organize a workshop grouping only people producing software in the field of electricity education, or more generally in the field of physics education, even if this software was based on artificial intelligence techniques. On the contrary, we wanted this workshop to bring together researchers involved in the connection between cognitive science and the learning of a well-defined domain such as electricity. So during the workshop, people doing research in physics education, cognitive psychology, and artificial intelligence had the opportunity to discuss and exchange ideas. These proceedings reflect the different points of view. The main idea is that designing a learning environment requires the confrontation of different approaches. The proceedings are organized in five parts which reflect these different aspects.
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to come back, I would never have gone.') Jules Verne. 'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell. 'The series is divergent; therefore we may be able to do something with it.' O. Heaviside. Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the Eric Bell quote above, one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
What do philosophy and computer science have in common? It turns out, quite a lot! In providing an introduction to computer science (using Python), Daniel Lim presents in this book key philosophical issues, ranging from external world skepticism to the existence of God to the problem of induction. These issues, and others, are introduced through the use of critical computational concepts, ranging from image manipulation to recursive programming to elementary machine learning techniques. In illuminating some of the overlapping conceptual spaces of computer science and philosophy, Lim teaches the reader fundamental programming skills and also allows her to develop the critical thinking skills essential for examining some of the enduring questions of philosophy. Key Features:
- Teaches readers actual computer programming, not merely ideas about computers
- Includes fun programming projects (like digital image manipulation and a Game of Life simulation), allowing the reader to develop the ability to write larger computer programs that require decomposition, abstraction, and algorithmic thinking
- Uses computational concepts to introduce, clarify, and develop a variety of philosophical issues
- Covers various aspects of machine learning and relates them to philosophical issues involving science and induction as well as to ethical issues
- Provides a framework to critically analyze arguments in classic and contemporary philosophical debates
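A sketch of the kind of Game of Life project the book assigns, assuming a set-of-live-cells representation; the representation and the blinker example are illustrative choices here, not the book's own code:

```python
# Minimal Game of Life step: `live` is a set of (x, y) coordinates.
from collections import Counter

def step(live):
    """Advance one generation of Conway's Game of Life."""
    # Count, for every cell adjacent to a live cell, how many live neighbours it has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives next generation with exactly 3 neighbours,
    # or with 2 neighbours if it is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, 1), (1, 1), (2, 1)}   # horizontal bar of three cells
print(sorted(step(blinker)))          # flips to a vertical bar
```

Representing the board as a set of live cells keeps the grid unbounded and makes the update rule a two-line expression, which is why this decomposition is a popular teaching choice.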
The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area and performance effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
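For intuition, two-level minimization can be illustrated with a toy Quine-McCluskey-style prime-implicant generator; note this is the exact textbook method, not ESPRESSO's heuristic algorithm, and the code is a sketch rather than anything from the book:

```python
# Toy exact two-level minimization: generate prime implicants by
# repeatedly merging implicants that differ in one cared bit.
# Implicants are strings over {'0', '1', '-'} ('-' = don't care).
from itertools import combinations

def merge(a, b):
    """Merge two implicants differing in exactly one specified bit, else None."""
    diff = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diff) == 1 and a[diff[0]] != '-' and b[diff[0]] != '-':
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None

def prime_implicants(minterms, nbits):
    """Return all prime implicants of the function defined by `minterms`."""
    terms = {format(m, f'0{nbits}b') for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a, b in combinations(sorted(terms), 2):
            m = merge(a, b)
            if m is not None:
                merged.add(m)
                used.update((a, b))
        primes |= terms - used   # unmergeable terms are prime
        terms = merged
    return primes

# f(a, b) = a'b + ab' + ab minimizes to a + b:
print(sorted(prime_implicants({1, 2, 3}, 2)))   # ['-1', '1-'], i.e. b + a
```

ESPRESSO attacks the same problem, but with iterative expand/reduce heuristics that scale to the industrial examples the preface mentions, where exhaustive prime generation is infeasible.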
This book presents the proceedings of the Third International Conference on Electrical Engineering and Control (ICEECA2017). It covers new control system models and troubleshooting tips, and also addresses complex system requirements, such as increased speed, precision and remote capabilities, bridging the gap between the complex, math-heavy controls theory taught in formal courses, and the efficient implementation required in real-world industry settings. Further, it considers both the engineering aspects of signal processing and the practical issues in the broad field of information transmission and novel technologies for communication networks and modern antenna design. This book is intended for researchers, engineers, and advanced postgraduate students in control and electrical engineering, computer science, signal processing, as well as mechanical and chemical engineering.
- Discusses all the major tools and techniques for Decision Support Systems, supported by examples
- Explains techniques in both their deterministic and stochastic aspects
- Covers network tools including GERT and Q-GERT
- Explains the application of both probability and fuzzy orientation in the pertinent techniques
- Includes a number of relevant case studies along with a dedicated chapter on software
The advancement of technology in today's world has led to the progression of several professional fields. This includes the classroom, as teachers have begun using new technological strategies to increase student involvement and motivation. ICT innovation, including virtual reality and blended learning methods, has changed the scope of classroom environments across the globe; however, significant research is lacking in this area. ICTs and Innovation for Didactics of Social Sciences is a fundamental reference focused on the didactics of social sciences and on ICTs, including issues related to innovation, resources, and strategies for teachers that can link to the transformation of social sciences teaching and learning as well as to societal transformation. Highlighting topics such as blended learning, augmented reality, and virtual classrooms, this book is ideally designed for researchers, administrators, educators, practitioners, and students interested in current ICT resources and innovative strategies for the didactics of social sciences: didactic possibilities in relation to concrete conceptual contents, problem resolution, planning, decision making, the development of social skills, and attention and motivation, promoting a necessary technological literacy.
E-agriculture and e-government have transformed public service delivery across the globe, though there are still a number of associated economic, social, political, and legal challenges. E-Agriculture and E-Government for Global Policy Development: Implications and Future Directions provides critical research and knowledge on electronic agriculture and e-government development experiences from around the world. This authoritative reference source describes and evaluates real-life e-agriculture and e-government case studies, examines theoretical frameworks, and discusses key global policy development issues, challenges, and constraints on socio-economic advancements.
Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
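The bit-parallel idea behind multibit vectors can be sketched with ordinary Python integers, where each bit position carries one test pattern; this is only an illustration of packing patterns into words, not the book's high-level bus-fault model, and the circuit and stuck-at fault below are made up:

```python
# Bit-parallel fault detection sketch: each int packs one bit per test
# pattern, so a single bitwise expression evaluates a gate on all
# patterns at once.
MASK = 0xFF  # eight test patterns packed per signal

def good_circuit(a, b, c):
    return (a & b) | c          # fault-free AND-OR network

def faulty_circuit(a, b, c):
    return (a & b) | (c & 0)    # same network with line c stuck-at-0

# Eight test patterns per input, one per bit position:
a, b, c = 0b11001010, 0b10101100, 0b01010001

# XOR of good and faulty responses marks the patterns that detect the fault.
detects = (good_circuit(a, b, c) ^ faulty_circuit(a, b, c)) & MASK
print(f"{detects:08b}")
```

With 64-bit machine words the same trick evaluates 64 patterns per logic operation, which is the speedup that motivates vector-based fault models.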
Open Distributed Processing contains the selected proceedings of the Third International Conference on Open Distributed Systems, organized by the International Federation for Information Processing and held in Brisbane, Australia, in February 1995. The book deals with the interconnectivity problems that advanced computer networking raises, providing those working in the area with the most recent research, including security and management issues.
"Soft Computing and its Applications in Business and Economics," or SC-BE for short, is a work whose importance is hard to exaggerate. Authored by leading contributors to soft computing and its applications, SC-BE is a sequel to an earlier book by Professors R. A. Aliev and R. R. Aliev, "Soft Computing and Its Applications," World Scientific, 2001. SC-BE is a self-contained exposition of the foundations of soft computing, and presents a vast compendium of its applications to business, finance, decision analysis and economics. One cannot but be greatly impressed by the wide variety of applications, ranging from the use of fuzzy logic in transportation and health care systems, to the use of a neuro-fuzzy approach to modeling of credit risk in trading, and the application of soft computing to e-commerce. To view the contents of SC-BE in a clearer perspective, a bit of history is in order. In science, as in other realms of human activity, there is a tendency to be nationalistic: to commit oneself to a particular methodology and relegate to a position of inferiority or irrelevance all alternative methodologies. As we move further into the age of machine intelligence and automated reasoning, we run into more and more problems which do not lend themselves to solution through the use of our favorite methodology.
Performance and Reliability Analysis of Computer Systems: An Example-Based Approach Using the SHARPE Software Package provides a variety of probabilistic, discrete-state models used to assess the reliability and performance of computer and communication systems. The models included are combinatorial reliability models (reliability block diagrams, fault trees and reliability graphs), directed acyclic task precedence graphs, Markov and semi-Markov models (including Markov reward models), product-form queueing networks and generalized stochastic Petri nets. A practical approach to system modeling is followed; all of the examples described are solved and analyzed using the SHARPE tool. In structuring the book, the authors have been careful to provide the reader with a methodological approach to analytical modeling techniques. These techniques are not seen as alternatives but rather as an integral part of a single process of assessment which, by hierarchically combining results from different kinds of models, makes it possible to use state-space methods for those parts of a system that require them and non-state-space methods for the more well-behaved parts of the system. The SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) package is the 'toolchest' that allows the authors to specify stochastic models easily and solve them quickly, adopting model hierarchies and very efficient solution techniques. All the models described in the book are specified and solved using the SHARPE language; its syntax is described and the source code of almost all the examples discussed is provided. Audience: Suitable for use in advanced level courses covering reliability and performance of computer and communications systems, and by researchers and practicing engineers whose work involves modeling of system performance and reliability.
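The simplest of the combinatorial models mentioned above, a series-parallel reliability block diagram, can be evaluated directly; the component reliabilities below are invented for illustration, and nothing here uses SHARPE itself:

```python
# Sketch of a series-parallel reliability block diagram: a series chain
# works only if every block works; a parallel group fails only if every
# block fails.

def series(*rs):
    """Reliability of components in series: product of reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """Reliability of redundant components: 1 - product of failure probs."""
    fail = 1.0
    for r in rs:
        fail *= (1.0 - r)
    return 1.0 - fail

# Two redundant processors (each 0.95) in series with one disk (0.99):
system = series(parallel(0.95, 0.95), 0.99)
print(round(system, 6))
```

Tools like SHARPE automate exactly this kind of composition, and additionally let block reliabilities themselves be outputs of lower-level Markov or queueing models, which is the hierarchical combination the blurb describes.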
Service Intelligence and Service Science: Evolutionary Technologies and Challenges explores the emerging fields of service intelligence and service science, positioning them as the most promising directions for the evolution of service computing. This book demonstrates the critical role these areas play in supporting service computing processes, and promotes further research, best practices, and new directions in service computing technologies and applications.