Fuzzy Sets in the Management of Uncertainty presents an overview of current problems in business management, primarily in situations involving decision making of an economic-financial nature. The monograph discusses problems of planning, programming, and control, and sheds light on the entire financial network in its three phases: raising funds, analysis, and investment. Special attention is paid to production processes and the marketing of products and services. This monograph is a highly readable overview and introduction for scientists, professionals, graduate students, managers and consultants in the growing field of applications of fuzzy logic in management.
Mobile devices are the 'it' technology, and everyone wants to know how to apply them to their environments. This book brings together the best examples and insights for implementing mobile technology in libraries. Chapters cover a wide variety of the most important tools and procedures from developing applications to marketing and augmented reality. Readers of this volume will get complete and timely knowledge of library applications for handheld devices. The Handheld Librarian conferences have been a centrepiece of learning about how to apply mobile technologies to library services and collections as well as a forum for sharing examples and lessons learned. The conferences have brought our profession forward into the trend and kept us up to date with ongoing advances. This volume brings together the best from that rich story and presents librarians with the basic information they need to successfully make the case for and implement programs leveraging mobile devices in their libraries. Authors of the diverse practical and well researched pieces originate in all types of libraries and segments of the profession. This wide representation ensures that front line librarians, library administrators, systems staff, even library professors will find this volume perfectly geared for their needs. This book was published as a special issue of The Reference Librarian.
The growth of mobile technology has caused considerable changes in the way we interact with one another within both personal and business environments. Advancements in mobile computing and mobile multimedia resonate with engineers, strategists, developers, and managers while also determining the behavior and interaction of end users. Advancing the Next-Generation of Mobile Computing: Emerging Technologies offers historical perspectives on mobile computing, as well as new frameworks and methodologies for mobile networks, intelligent mobile applications, and mobile computing applications. This collection of research aims to inform researchers, designers, and users of mobile technology and promote awareness of new trends and tools in this growing field of study.
Timing research in high performance VLSI systems has advanced at a steady pace over the last few years. Tools, however, especially theoretical mechanisms, lag behind. Much of the present timing research relies heavily on timing diagrams, which although intuitive, are inadequate for analysis of large designs with many parameters. Further, timing diagrams offer only approximations, not exact solutions to many timing problems and provide little insight in the cases where temporal properties of a design interact intricately with the design's logical functionalities. Timed Boolean Functions presents a methodology for timing research which facilitates analysis and design of circuits and systems in a unified temporal and logical domain. The goal of the book is to present the central idea of representing logical and timing information in a common structure, TBFs, and to present a canonical form suitable for efficient manipulation. This methodology is then applied to practical applications to provide intuition and insight into the subject so that these general methods can be adapted to specific engineering problems and also to further the research necessary to enhance the understanding of the field. Timed Boolean Functions is written for professionals involved in timing research and digital designers who want to enhance their understanding of the timing aspects of high speed circuits. The prerequisites are a common background in logic design, computer algorithms, combinatorial optimization and a certain degree of mathematical sophistication.
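The book's central idea of carrying logical and timing information in one structure can be illustrated with a small sketch (an illustration of the general idea only, not the book's TBF formalism or notation): an output is an ordinary boolean function of inputs sampled at delayed time points.

```python
# Illustrative sketch only: an AND gate whose inputs arrive with different
# delays, modeled as a boolean function of time-shifted input waveforms.
def timed_and(a, b, delay_a=2, delay_b=1):
    """Output waveform of an AND gate with per-input delays:
    out(t) = a(t - delay_a) AND b(t - delay_b)."""
    return lambda t: a(t - delay_a) and b(t - delay_b)

# Step inputs: a rises at t = 0, b rises at t = 3 (hypothetical waveforms).
a = lambda t: t >= 0
b = lambda t: t >= 3

out = timed_and(a, b)
print([int(out(t)) for t in range(7)])  # [0, 0, 0, 0, 1, 1, 1]: settles at t = 4
```

Because the delayed inputs are folded into one function, temporal and logical behavior can be analyzed together rather than with separate timing diagrams.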
Gay provides an authoritative overview of the major developments, people, and organizations that have shaped the design and use of computers. He also describes innovations in computer research and technology (including highlights from developments in supercomputers, supercomputer networks, and Internet 2), the latest trends in consumer products (new applications that influence the way individuals and businesses interface with their world), social issues related to computers (Microsoft's influence, privacy, encryption, universal access, and adaptive technologies), and information on computer careers and how to prepare for them.
It has long been apparent to academic library administrators that the current technical services operations within libraries need to be redirected and refocused in terms of both format priorities and human resources. A number of developments and directions have made this reorganization imperative, many of which have been accelerated by the current economic crisis. All of the chapters detail some aspect of technical services reorganization due to downsizing and/or reallocation of human resources, retooling professional and support staff in higher level duties and/or non-MARC metadata, "value-added" metadata opportunities, outsourcing redundant activities, and shifting resources from analog to digital object organization and description. This book will assist both catalogers and library administrators with concrete examples of moving technical services operations and personnel from the analog to the digital environment. This book was published as a special double issue of Cataloging & Classification Quarterly.
Linear algebra is growing in importance. 3D entertainment, animations in movies and video games are developed using linear algebra. Animated characters are generated using equations straight out of this book. Linear algebra is used to extract knowledge from the massive amounts of data generated from modern technology. The Fourth Edition of this popular text introduces linear algebra in a comprehensive, geometric, and algorithmic way. The authors start with the fundamentals in 2D and 3D, then move on to higher dimensions, expanding on the fundamentals and introducing new topics, which are necessary for many real-life applications and the development of abstract thought. Applications are introduced to motivate topics. The subtitle, A Geometry Toolbox, hints at the book's geometric approach, which is supported by many sketches and figures. Furthermore, the book covers applications of triangles, polygons, conics, and curves. Examples demonstrate each topic in action. This practical approach to a linear algebra course, whether through classroom instruction or self-study, is unique to this book. New to the Fourth Edition: Ten new application sections. A new section on change of basis. This concept now appears in several places. Chapters 14-16 on higher dimensions are notably revised. A deeper look at polynomials in the gallery of spaces. Introduces the QR decomposition and its relevance to least squares. Similarity and diagonalization are given more attention, as are eigenfunctions. A longer thread on least squares, running from orthogonal projections to a solution via SVD and the pseudoinverse. More applications for PCA have been added. More examples, exercises, and more on the kernel and general linear spaces. A list of applications has been added in Appendix A. The book gives instructors the option of tailoring the course for the primary interests of their students: mathematics, engineering, science, computer graphics, and geometric modeling.
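The least-squares thread mentioned above can be illustrated with a minimal, self-contained sketch (hypothetical data; the book goes further, to a solution via the SVD and the pseudoinverse): fitting a line by solving the 2x2 normal equations by hand.

```python
# Minimal least-squares sketch (hypothetical data): fit y = c0 + c1*x by
# solving the normal equations A^T A c = A^T b for the 2x2 case.
def fit_line(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx          # determinant of A^T A, where A = [1 | x]
    return (sxx * sy - sx * sxy) / det, (n * sxy - sx * sy) / det

print(fit_line([0, 1, 2, 3], [1, 3, 5, 7]))  # (1.0, 2.0): data lie on y = 1 + 2x
```

Geometrically, this computes the orthogonal projection of the data vector onto the column space of A, which is where the book's geometric treatment begins.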
The objective of the NATO Advanced Research Workshop "Learning electricity and electronics with advanced educational technology" was to bring together researchers from different domains. Electricity education is a domain where a great deal of research has already been done. The first meeting on electricity teaching was organized in 1984 by R. Duit, W. Jung and C. von Rhoneck in Ludwigsburg (Germany). Research has continued since, and the workshop can be considered the successor of that first meeting. Our goal was not to organize a workshop grouping only people producing software in the field of electricity education, or more generally in the field of physics education, even if this software was based on artificial intelligence techniques. On the contrary, we wanted this workshop to bring together researchers working at the intersection of cognitive science and the learning of a well-defined domain such as electricity. So during the workshop, people doing research in physics education, cognitive psychology, and artificial intelligence had the opportunity to discuss and exchange ideas. These proceedings reflect the different points of view. The main idea is that designing a learning environment requires the confrontation of different approaches. The proceedings are organized in five parts which reflect these different aspects.
This book focuses on the aspects related to the parallelization of evolutionary computations, such as parallel genetic operators, parallel fitness evaluation, distributed genetic algorithms, and parallel hardware implementations, as well as on their impact on several applications. It offers a wide spectrum of sample works developed in leading research about parallel implementations of efficient techniques at the heart of computational intelligence.
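One facet the book covers, parallel fitness evaluation, can be sketched in a few lines (illustrative only: the fitness function is made up, and a thread pool stands in for the parallel hardware the book discusses). Since each individual's fitness is computed independently, evaluation maps directly onto a worker pool.

```python
# Illustrative sketch: evaluate a population's fitness in parallel.
from concurrent.futures import ThreadPoolExecutor

def fitness(x):
    return -(x - 3) ** 2  # hypothetical objective, maximized at x = 3

population = [0, 1, 2, 3, 4, 5]
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(fitness, population))  # one task per individual

best = max(zip(scores, population))
print(best)  # (0, 3): the fittest individual is x = 3
```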
Analog Design Issues in Digital VLSI Circuits and Systems brings together in one place important contributions and up-to-date research results in this fast moving area. Analog Design Issues in Digital VLSI Circuits and Systems serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, which is illustrated by many realistic demonstrations, used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multi-media, radar, sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.
'Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to get back, I would never have gone.') Jules Verne. 'The series is divergent; therefore we may be able to do something with it.' O. Heaviside. 'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell. Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and non-linearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to Bell's quote above, one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
This book takes a formal approach to teaching software engineering, using not only UML, but also the Object Constraint Language (OCL) for the specification and analysis of designed models. Employing technical details typically missing from existing textbooks on software engineering, the author shows how precise specifications lead to static verification of software systems. In addition, data management is given the attention that is required in order to produce a successful software project. The book uses constraints in all phases of software development, follows recent developments in software technologies, gives technical coverage of data management issues and software verification, is illustrated throughout to present the analysis, specification, implementation and verification of multiple applications, and includes end-of-chapter exercises and instructor presentation slides.
The evaluation of IT and its business value has recently been the subject of many academic and business discussions, as business managers, management consultants and researchers regularly question whether and how the contribution of IT to business performance can be evaluated effectively. Investments in IT are growing extensively, and business managers worry that the benefits of IT investments might not be as high as expected. This phenomenon is often called the IT investment paradox or the IT Black Hole: large sums are invested in IT that seem to be swallowed by a large black hole without rendering many returns. Information Systems Evaluation Management discusses these issues, among others, through its presentation of the most current research in the field of IS evaluation. It is an area of study that touches upon a variety of businesses and organizations: essentially all those that involve IT in their business practices.
The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at the University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area- and performance-effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born, and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.
This book discusses all the major tools and techniques for decision support systems, supported by examples. Techniques are explained in terms of both their deterministic and stochastic aspects. The book covers network tools, including GERT and Q-GERT, explains the application of both probability and fuzzy orientation in the pertinent techniques, and includes a number of relevant case studies along with a dedicated chapter on software.
Open Distributed Processing contains the selected proceedings of the Third International Conference on Open Distributed Systems, organized by the International Federation for Information Processing and held in Brisbane, Australia, in February 1995. The book deals with the interconnectivity problems that advanced computer networking raises, providing those working in the area with the most recent research, including security and management issues.
Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
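The idea of testing many faults at once with multibit vectors can be sketched as follows (a hedged illustration, not the monograph's algorithm: the circuit, the fault list, and the bit-packing scheme are all hypothetical). Each bit position holds one copy of the circuit: bit 0 is the fault-free machine, and each higher bit is a machine with one stuck line; a fault is detected when its machine's output differs from the good machine's.

```python
# Hedged illustration: bit i of each integer holds the value seen by
# "machine" i, so one pass of bitwise operations simulates all machines.
MACHINES = 3                        # 0: fault-free; 1: a stuck-at-0; 2: c stuck-at-1
ALL = (1 << MACHINES) - 1

def replicate(v):
    """Broadcast one logic value to every machine's bit position."""
    return ALL if v else 0

def simulate(a, b, c):
    """Simulate out = (a AND b) OR c in all machines at once; return a
    bit mask (bit 0 excluded) of the faults this input pattern detects."""
    av = replicate(a) & ~(1 << 1)   # inject a stuck-at-0 in machine 1
    bv = replicate(b)
    cv = replicate(c) | (1 << 2)    # inject c stuck-at-1 in machine 2
    out = (av & bv) | cv            # one bitwise evaluation, all machines
    good = -(out & 1) & ALL         # broadcast the fault-free output
    return (out ^ good) >> 1        # machines whose output differs

print(simulate(1, 1, 0))  # 1: pattern a=1,b=1,c=0 detects a stuck-at-0
print(simulate(0, 0, 0))  # 2: pattern a=0,b=0,c=0 detects c stuck-at-1
```

Widening the vectors (one bit per fault across a whole bus) is what lets a single high-level bus fault stand in for many SSL faults, as the blurb describes.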
E-agriculture and e-government have transformed public service delivery across the globe, though there are still a number of associated economic, social, political, and legal challenges. E-Agriculture and E-Government for Global Policy Development: Implications and Future Directions provides critical research and knowledge on electronic agriculture and e-government development experiences from around the world. This authoritative reference source describes and evaluates real-life e-agriculture and e-government case studies, examines theoretical frameworks, and discusses key global policy development issues, challenges, and constraints on socio-economic advancements.
"Soft Computing and its Applications in Business and Economics," or SC-BE for short, is a work whose importance is hard to exaggerate. Authored by leading contributors to soft computing and its applications, SC-BE is a sequel to an earlier book by Professors R. A. Aliev and R. R. Aliev, "Soft Computing and Its Applications," World Scientific, 200l. SC-BE is a self-contained exposition of the foundations of soft computing, and presents a vast compendium of its applications to business, finance, decision analysis and economics. One cannot but be greatly impressed by the wide variety of applications - applications ranging from use of fuzzy logic in transportation and health case systems, to use of a neuro-fuzzy approach to modeling of credit risk in trading, and application of soft computing to e-commerce. To view the contents of SC-BE in a clearer perspective, a bit of history is in order. In science, as in other realms of human activity, there is a tendency to be nationalistic - to commit oneself to a particular methodology and relegate to a position of inferiority or irrelevance all alternative methodologies. As we move further into the age of machine intelligence and automated reasoning, we run into more and more problems which do not lend themselves to solution through the use of our favorite methodology.
Worldwide, the urge is being felt to pave the way towards the introduction of an electronic government. Many countries recognise the potential of digital aids in providing information and services to citizens, organisations and companies. Recent developments have put pressure on the legislature to provide an adequate legal framework for electronic administrative communication. Thus, various countries have started to draft provisions in their administrative law in order to remove legal impediments that hamper electronic services from public administrations. Written by specialists from different countries, E-Government and its Implications for Administrative Law provides an overview and analysis of such legislative developments in France, Germany, Norway and the United States. What approach has been taken in these countries? What specific provisions have been formulated to facilitate electronic administrative communication and at what level? What requirements are introduced to gain sufficient trust in electronic service delivery? In providing an in-depth analysis of the legislative projects in the various countries, this book gives a glance at the differences in policy making as well as the lessons that can be learned for future regulatory projects to amend administrative law for the digital era. This is Volume 1 in the Information Technology and Law (IT&Law) Series
Today, a major component of any project management effort is the combined use of qualitative and quantitative tools. While publications on qualitative approaches to project management are widely available, few project management books have focused on the quantitative approaches. This book represents the first major project management book with a practical focus on the quantitative approaches to project management. The book organizes quantitative techniques into an integrated framework for project planning, scheduling, and control. Numerous illustrative examples are presented. Topics covered in the book include PERT/CPM/PDM and extensions, mathematical project scheduling, heuristic project scheduling, project economics, statistical data analysis for project planning, computer simulation, assignment and transportation problems, and learning curve analysis. Chapter one gives a brief overview of project management, presenting a general-purpose project management model. Chapter two covers CPM, PERT, and PDM network techniques. Chapter three covers project scheduling subject to resource constraints. Chapter four covers project optimization. Chapter five discusses economic analysis for project planning and control. Chapter six discusses learning curve analysis. Chapter seven covers statistical data analysis for project planning and control. Chapter eight presents techniques for project analysis and selection. Tables and figures are used throughout the book to enhance the effectiveness of the discussions. This book is excellent as a textbook for upper-level undergraduate and graduate courses in Industrial Engineering, Engineering Management, and Business, and as a detailed, comprehensive guide for corporate management.
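The CPM forward pass covered in chapter two can be sketched briefly (the four-activity network here is hypothetical): each activity's earliest finish is its duration plus the latest earliest finish among its predecessors, and the project length is the maximum over all activities.

```python
# CPM forward pass over a hypothetical four-activity network.
durations = {"A": 3, "B": 2, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

def earliest_finish(task, memo):
    """Earliest finish = duration + latest earliest finish of predecessors."""
    if task not in memo:
        ef_pred = max((earliest_finish(p, memo) for p in preds[task]), default=0)
        memo[task] = ef_pred + durations[task]
    return memo[task]

memo = {}
project_length = max(earliest_finish(t, memo) for t in durations)
print(project_length)  # 8: the critical path is A -> C -> D (3 + 4 + 1)
```

A matching backward pass (latest finish times) would then yield each activity's slack, identifying the critical path explicitly.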
Performance and Reliability Analysis of Computer Systems: An Example-Based Approach Using the SHARPE Software Package provides a variety of probabilistic, discrete-state models used to assess the reliability and performance of computer and communication systems. The models included are combinatorial reliability models (reliability block diagrams, fault trees and reliability graphs), directed, acyclic task precedence graphs, Markov and semi-Markov models (including Markov reward models), product-form queueing networks and generalized stochastic Petri nets. A practical approach to system modeling is followed; all of the examples described are solved and analyzed using the SHARPE tool. In structuring the book, the authors have been careful to provide the reader with a methodological approach to analytical modeling techniques. These techniques are not seen as alternatives but rather as an integral part of a single process of assessment which, by hierarchically combining results from different kinds of models, makes it possible to use state-space methods for those parts of a system that require them and non-state-space methods for the more well-behaved parts of the system. The SHARPE (Symbolic Hierarchical Automated Reliability and Performance Evaluator) package is the 'toolchest' that allows the authors to specify stochastic models easily and solve them quickly, adopting model hierarchies and very efficient solution techniques. All the models described in the book are specified and solved using the SHARPE language; its syntax is described and the source code of almost all the examples discussed is provided. Audience: Suitable for use in advanced level courses covering reliability and performance of computer and communications systems and by researchers and practicing engineers whose work involves modeling of system performance and reliability.
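The hierarchical combination of models the authors describe can be illustrated with the simplest case, a reliability block diagram (a minimal Python sketch, not the SHARPE language or syntax; the component reliabilities are hypothetical): series blocks multiply reliabilities, while parallel blocks multiply failure probabilities.

```python
# Not SHARPE syntax: a toy reliability-block-diagram evaluator.
def series(*rs):
    """Series blocks: the system works only if every block works."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """Parallel (redundant) blocks: the system fails only if all fail."""
    fail = 1.0
    for r in rs:
        fail *= 1.0 - r
    return 1.0 - fail

# Hypothetical system: two redundant CPUs (R = 0.9 each) in series with a disk.
system = series(parallel(0.9, 0.9), 0.99)
print(round(system, 4))  # 0.9801 = 0.99 * (1 - 0.1 * 0.1)
```

Because the combinators nest, the result of one sub-model feeds directly into the next level, which is the hierarchical-composition idea the book generalizes to Markov and queueing models.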