It is widely accepted today that nowhere is it more important to focus on the improvement of software quality than in systems with requirements in the areas of safety and reliability - especially distributed, real-time and embedded systems. Much research is therefore in progress in these fields, since software process improvement impinges directly on achieved levels of quality, and many application experiments aim to show quantitative results demonstrating the efficacy of particular approaches. Requirements for safety and reliability - like other so-called non-functional requirements for computer-based systems - are often stated in imprecise and ambiguous terms, or not at all. Specifications focus on functional and technical aspects, with issues like safety covered only implicitly, or not addressed directly because they are felt to be obvious; unfortunately, what is obvious to an end user or system user is progressively less so to others, to the extent that a software developer may not even be aware that safety is an issue. There is therefore a growing case for encouraging greater understanding of safety and reliability requirements issues, right across the spectrum from end user to software developer; not just in traditional safety-critical areas (e.g. nuclear, aerospace) but also in acknowledging the need for such things as heart pacemakers and other medical and robotic systems to be highly dependable.
Digital Intermediation offers a new framework for understanding content creation and distribution across automated media platforms - a new mediatisation process. The book draws on empirical and theoretical research to carefully identify and describe a number of unseen digital infrastructures that contribute to a predictive media production process through technologies, institutions and automation. Field data is drawn from several international sites, including Los Angeles, San Francisco, Portland, London, Amsterdam, Munich, Berlin, Hamburg, Sydney and Cartagena. By highlighting an increasingly automated content production and distribution process, the book responds to a number of regulatory debates on the societal impact of social media platforms. It highlights emerging areas of key importance that shape the production and distribution of social media content, including micro-platformization and digital-first personalities. The book explains how technologies, institutions and automation are used within agencies to increase exposure for the talent they manage, while providing inside access to the processes and requirements of producers who create content for platform algorithms. Finally, it outlines user agency as a strategy for those who seek diversity in the information they access on automated social media content distribution platforms. The findings in this book provide key recommendations for policymakers working within digital media platforms, and will be invaluable reading for students and academics interested in automated media environments.
The quadratic assignment problem (QAP) was introduced in 1957 by Koopmans and Beckmann to model a plant location problem. Since then the QAP has been the object of numerous investigations by mathematicians, computer scientists, operations researchers and practitioners. Nowadays the QAP is widely considered a classical combinatorial optimization problem which is (still) attractive from many points of view. In our opinion there are at least three main reasons which make the QAP a popular problem in combinatorial optimization. First, the number of real-life problems which are mathematically modeled by QAPs has been continuously increasing and the variety of the fields they belong to is astonishing. To recall just a restricted number among the applications of the QAP, let us mention placement problems, scheduling, manufacturing, VLSI design, statistical data analysis, and parallel and distributed computing. Secondly, a number of other well-known combinatorial optimization problems can be formulated as QAPs. Typical examples are the traveling salesman problem and a large number of optimization problems in graphs such as the maximum clique problem, the graph partitioning problem and the minimum feedback arc set problem. Finally, from a computational point of view the QAP is a very difficult problem. The QAP is not only NP-hard and hard to approximate, but it is also practically intractable: it is generally considered impossible to solve (to optimality) QAP instances of size larger than 20 within reasonable time limits.
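As a rough illustration of the problem just described (a minimal sketch, not material from the book), the Koopmans-Beckmann form of the QAP assigns n facilities to n locations so as to minimise the total flow-times-distance cost; brute-force enumeration of all n! assignments makes it clear why instances larger than about size 20 are considered intractable.

```python
# Minimal sketch of the Koopmans-Beckmann QAP by brute force (illustrative only).
from itertools import permutations

def qap_bruteforce(flow, dist):
    """flow[i][j]: flow between facilities i and j;
       dist[a][b]: distance between locations a and b."""
    n = len(flow)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):        # perm[i] = location assigned to facility i
        cost = sum(flow[i][j] * dist[perm[i]][perm[j]]
                   for i in range(n) for j in range(n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_cost, best_perm

# Tiny illustrative instance (hypothetical data).
flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(qap_bruteforce(flow, dist))
```

Since the search space grows as n!, a size-20 instance already has roughly 2.4 x 10^18 assignments, which is the practical intractability the blurb refers to.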
Effective compilers allow for a more efficient execution of application programs for a given computer architecture, while well-conceived architectural features can support more effective compiler optimization techniques. A well thought-out strategy of trade-offs between compilers and computer architectures is the key to the successful design of highly efficient and effective computer systems. From embedded micro-controllers to large-scale multiprocessor systems, it is important to understand the interaction between compilers and computer architectures. The goal of the Annual Workshop on Interaction between Compilers and Computer Architectures (INTERACT) is to promote new ideas and to present recent developments in compiler techniques and computer architectures that enhance each other's capabilities and performance. Interaction Between Compilers and Computer Architectures is an updated and revised volume consisting of seven papers originally presented at the Fifth Workshop on Interaction between Compilers and Computer Architectures (INTERACT-5), which was held in conjunction with the IEEE HPCA-7 in Monterrey, Mexico, in 2001. This volume explores recent developments and ideas for better integration of the interaction between compilers and computer architectures in designing modern processors and computer systems. Interaction Between Compilers and Computer Architectures is suitable as a secondary text for a graduate level course, and as a reference for researchers and practitioners in industry.
Mathematical programming has known a spectacular diversification in the last few decades. This process has happened both at the level of mathematical research and at the level of the applications generated by the solution methods that were created. To write a monograph dedicated to a certain domain of mathematical programming is, under such circumstances, especially difficult. In the present monograph we opt for the domain of fractional programming. Interest in this subject was generated by the fact that various optimization problems from engineering and economics consider the minimization of a ratio between physical and/or economic functions, for example cost/time, cost/volume, cost/profit, or other quantities that measure the efficiency of a system. For example, the productivity of industrial systems, defined as the ratio between the realized services in a system within a given period of time and the utilized resources, is used as one of the best indicators of the quality of their operation. Such problems, where the objective function appears as a ratio of functions, constitute fractional programming problems. Due to its importance in modeling various decision processes in management science, operational research, and economics, and also due to its frequent appearance in other problems that are not necessarily economic, such as information theory, numerical analysis, stochastic programming, decomposition algorithms for large linear systems, etc., fractional programming has received particular attention in the last three decades.
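In its standard general form (a textbook statement rather than a quotation from the monograph), a fractional program minimises a ratio of functions over a feasible set:

$$ \min_{x \in S} \; \frac{f(x)}{g(x)}, \qquad g(x) > 0 \ \text{for all } x \in S, $$

where, for instance, f may measure cost and g time, volume or profit, matching the cost/time and cost/profit ratios mentioned above.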
Algorithmic discrete mathematics plays a key role in the development of information and communication technologies, and methods that arise in computer science, mathematics and operations research (in particular in algorithms, computational complexity, distributed computing and optimization) are vital to modern services such as mobile telephony, online banking and VoIP. This book examines communication networking from a mathematical viewpoint. The contributing authors took part in the European COST Action 293, a four-year program of multidisciplinary research on this subject. In this book they offer introductory overviews and state-of-the-art assessments of current and future research in the fields of broadband, optical, wireless and ad hoc networks. Particular topics of interest are design, optimization, robustness and energy consumption. The book will be of interest to graduate students, researchers and practitioners in the areas of networking, theoretical computer science, operations research, distributed computing and mathematics.
C++/CLI is Microsoft's latest extension to C++ that targets the heart of .NET 2.0, the common language runtime. "Expert Visual C++/CLI" is written by Visual C++ MVP Marcus Heege, who examines the core of the C++/CLI language. He explains both how the language elements work and how Microsoft intends them to be used. Even if you're new to C++/CLI and are planning to migrate to it from another language, this book will ground you in the core language elements and give you the confidence to explore further and migrate effectively. It provides concise, yet in-depth coverage of all major C++/CLI features; short code examples succinctly illustrate syntax and concepts, and more elaborate examples show how C++/CLI should be used.
'Subdivision' is a way of representing smooth shapes in a computer. A curve or surface (both of which contain an infinite number of points) is described in terms of two objects. One object is a sequence of vertices, which we visualise as a polygon, for curves, or a network of vertices, which we visualise by drawing the edges or faces of the network, for surfaces. The other object is a set of rules for making denser sequences or networks. When applied repeatedly, the denser and denser sequences are claimed to converge to a limit, which is the curve or surface that we want to represent. This book focuses on curves, because the theory for curves is complete enough that a book claiming our understanding is complete is exactly what is needed to stimulate research proving that claim wrong, and because there are already a number of good books on subdivision surfaces. The way in which the limit curve relates to the polygon, and a lot of interesting properties of the limit curve, depend on the set of rules, and this book is about how one can deduce those properties from the set of rules, and how one can then use that understanding to construct rules which give the properties that one wants.
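One classic rule set of this kind is Chaikin's corner-cutting scheme (used here purely as an illustration; it is not claimed to be the book's own example): each refinement step replaces every edge of the control polygon with two points at its quarter points, and the resulting polygons converge to a smooth quadratic B-spline curve. A minimal sketch:

```python
# Minimal sketch of Chaikin's corner-cutting subdivision for an open polygon.
# Each step replaces every edge (p, q) with the points 3/4*p + 1/4*q and
# 1/4*p + 3/4*q; repeated refinement yields denser and denser vertex sequences.
def chaikin(points, steps=1):
    for _ in range(steps):
        refined = [points[0]]                       # keep the first endpoint
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        refined.append(points[-1])                  # keep the last endpoint
        points = refined
    return points

polygon = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(chaikin(polygon, steps=3)[:4])
```

The properties of the limit curve (smoothness, how closely it follows the polygon) are determined entirely by the averaging weights in the rule, which is exactly the kind of dependence the book analyses.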
Software Process Modeling brings together experts to discuss relevant results in software process modeling and to express their personal views of this field. This book focuses on new aspects of software process modeling. Specifically, it deals with socio-technological aspects, process modeling for new development types (open source software, dependability applications, etc.) and organization change management. Computer users are placing growing demands on the software industry today. Consumers are looking for more complex products that are, at the same time, easier to use. Software development organizations are expected to produce higher quality products and deliver them to the public faster. In so doing, however, globally distributed development teams have to cope with understaffing and changing technologies. The challenges for the software industry are clearly mounting. Over the years, a variety of software process models have been designed to structure, describe and prescribe the software systems construction process. Most recently, software process modeling has increasingly had to deal with new challenges raised by the demands the software industry must meet. Software Process Modeling is designed for a professional audience of researchers and practitioners in industry. The book is also suitable for graduate-level students in computer science.
In any software design project, the analysis stage - documenting and designing technical requirements for the needs of users - is vital to the success of the project. This book provides a thorough introduction to and survey of all aspects of analysis. This new edition provides new features including: additional chapters on the System Development Life Cycle and Data Element Naming Conventions & Standards; more coverage of converting logical models to physical models, generating DDL and testing database functionality; expansion of the database section with concepts such as denormalization, security and change control; developments in new design technologies, particularly in the area of web analysis and design; a revised Web/Commerce chapter, which addresses component middleware for complex systems design; and new case studies. This book is a valuable resource and guide for all information systems students, practitioners and professionals who need an in-depth understanding of the principles of the analysis and design process.
The new multimedia standards (for example, MPEG-21) facilitate the seamless integration of multiple modalities into interoperable multimedia frameworks, transforming the way people work and interact with multimedia data. These key technologies and multimedia solutions interact and collaborate with each other in increasingly effective ways, contributing to the multimedia revolution and having a significant impact across a wide spectrum of consumer, business, healthcare, education and governmental domains. This book aims to provide complete coverage of the areas outlined and to bring together researchers from academia and industry as well as practitioners to share ideas, challenges and solutions relating to the multifaceted aspects of this field.
In recent years there has been a remarkable convergence of interest in programming languages based on ALGOL 60. Researchers interested in the theory of procedural and object-oriented languages discovered that ALGOL 60 shows how to add procedures and object classes to simple imperative languages in a general and clean way. And, on the other hand, researchers interested in purely functional languages discovered that ALGOL 60 shows how to add imperative mechanisms to functional languages in a way that does not compromise their desirable properties. Unfortunately, many of the key works in this field have been rather hard to obtain. The primary purpose of this collection is to make the most significant material on ALGOL-like languages conveniently available to graduate students and researchers. Contents: Introduction to Volume 1; Part I: Historical Background; Part II: Basic Principles; Part III: Language Design; Introduction to Volume 2; Part IV: Functor-Category Semantics; Part V: Specification Logic; Part VI: Procedures and Local Variables; Part VII: Interference, Irreversibility and Concurrency; Acknowledgements; Bibliography. Introduction to Volume 1: This volume contains historical and foundational material, and works on language design. All of the material should be accessible to beginning graduate students in programming languages and theoretical computer science.
Linear Programming provides an in-depth look at simplex-based as well as the more recent interior point techniques for solving linear programming problems. Starting with a review of the mathematical underpinnings of these approaches, the text provides details of the primal and dual simplex methods along with the primal-dual, composite, and steepest edge simplex algorithms. This is then followed by a discussion of interior point techniques, including projective and affine potential reduction, primal and dual affine scaling, and path following algorithms. Also covered is the theory and solution of the linear complementarity problem using both the complementary pivot algorithm and interior point routines. A feature of the book is its early and extensive development and use of duality theory. Audience: The book is written for students in the areas of mathematics, economics, engineering and management science, and for professionals who need a sound foundation in the important and dynamic discipline of linear programming.
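As a reminder of the duality theory that the book develops early on (a standard formulation, not quoted from the text), the symmetric primal-dual pair of linear programs is

$$ \text{(P)}\quad \min_{x}\; c^{\top}x \ \ \text{s.t.}\ Ax \ge b,\ x \ge 0, \qquad\qquad \text{(D)}\quad \max_{y}\; b^{\top}y \ \ \text{s.t.}\ A^{\top}y \le c,\ y \ge 0, $$

with weak duality $b^{\top}y \le c^{\top}x$ holding for every pair of feasible solutions, and equality at optimality when both problems are feasible.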
There are increasing opportunities to consider the application of semantic technologies for business information systems. Semantic technologies are expected to improve business processes and information systems, and lead to savings in cost and time as well as improved efficiency. Semantic Technologies for Business and Information Systems Engineering: Concepts and Applications investigates the application of semantic technologies to business and information systems engineering. This reference work assists researchers in academia and industry, students, business process analysts, information management professionals, software engineers, and other practitioners in gaining knowledge on applying semantic technologies for advanced business information systems, in annotating semantics to business processes, and in semantically integrating advanced business information systems.
The connected dominating set has been a classic subject studied in graph theory since 1975. Since the 1990s, it has been found to have important applications in communication networks, especially in wireless networks, as a virtual backbone. Motivated by those applications, many papers have been published in the literature during the last 15 years, and the connected dominating set has become a hot research topic in computer science. This book collects recent developments on the connected dominating set and presents the state of the art in the study of connected dominating sets. The book consists of 16 chapters. Except for the first, each chapter is devoted to one problem and consists of three parts - motivation and overview, problem complexity analysis, and approximation algorithm design - which lead the reader clearly through the background, formulation, existing important research results, and open problems. It is therefore a valuable reference book for researchers in computer science and operations research, especially in the areas of theoretical computer science, computer communication networks, combinatorial optimization, and discrete mathematics.
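As a small illustration of the concept (an illustrative sketch, not code from the book): a vertex set is a connected dominating set if every vertex outside the set has a neighbour in it and the set induces a connected subgraph, which is exactly what lets it serve as a routing backbone in a wireless network.

```python
# Minimal sketch: check whether `cds` is a connected dominating set of an
# undirected graph given as an adjacency dict {vertex: set_of_neighbours}.
from collections import deque

def is_connected_dominating_set(adj, cds):
    cds = set(cds)
    # Dominating: every vertex is in the set or adjacent to a member of it.
    dominating = all(v in cds or adj[v] & cds for v in adj)
    if not cds:
        return dominating and len(adj) == 0
    # Connected: BFS restricted to `cds` must reach every member.
    start = next(iter(cds))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u] & cds:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return dominating and seen == cds

graph = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 5}, 4: {2}, 5: {3}}
print(is_connected_dominating_set(graph, {2, 3}))   # True: {2, 3} forms a backbone
```

Finding a *minimum* connected dominating set is NP-hard, which is why the book's chapters focus on complexity analysis and approximation algorithms.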
Putting capability management into practice requires both a solid theoretical foundation and realistic approaches. This book introduces a development methodology that integrates business and information system development and run-time adjustment based on the concept of capability, by presenting the main findings of the CaaS project - the Capability-Driven Development (CDD) methodology, the architecture and components of the CDD environment, examples of real-world applications of CDD, and aspects of CDD usage for creating business value and new opportunities. Capability thinking characterizes an organizational mindset, putting capabilities at the center of the business model and information systems development. It is expected to help organizations, and in particular digital enterprises, to increase flexibility and agility in adapting to changes in their economic and regulatory environments. Capability management denotes the principles of how capability thinking should be implemented in an organization and the organizational means for doing so. This book is intended for anyone who wants to explore the opportunities for developing and managing context-dependent business capabilities and the supporting business services. It does not require a detailed understanding of specific development methods and tools, although some background knowledge and experience in information system development is advisable. The individual chapters have been written by leading researchers in the field of information systems development, enterprise modeling and capability management, as well as practitioners and industrial experts from these fields.
Modern methods of filter design and controller design often yield systems of very high order, posing a problem for their implementation. Over the past two decades or so, sophisticated methods have been developed to achieve simplification of filters and controllers. Such methods often come with easy-to-use error bounds, and in the case of controller simplification methods, such error bounds will usually be related to closed-loop properties. This book is the first comprehensive treatment of approximation methods for filters and controllers. It is fully up to date, and it is authored by two leading researchers who have personally contributed to the development of some of the methods. Balanced truncation, Hankel norm reduction, multiplicative reduction, weighted methods and coprime factorization methods are all discussed. The book is amply illustrated with examples, and will equip practising control engineers and graduates for intelligent use of commercial software modules for model and controller reduction.
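As a rough illustration of one of these methods (a minimal sketch of square-root balanced truncation for a stable, minimal state-space model, assuming NumPy/SciPy; this is not code from the book), the system Gramians yield Hankel singular values whose discarded tail gives the familiar twice-the-sum error bound:

```python
# Minimal sketch: square-root balanced truncation of a stable LTI system
# dx/dt = A x + B u, y = C x, keeping r states. Assumes A is Hurwitz and the
# realization is minimal, so both Gramians are positive definite.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def balanced_truncation(A, B, C, r):
    Wc = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
    Lc, Lo = np.linalg.cholesky(Wc), np.linalg.cholesky(Wo)
    U, hsv, Vt = np.linalg.svd(Lo.T @ Lc)            # Hankel singular values
    S = np.diag(hsv ** -0.5)
    T, Tinv = Lc @ Vt.T @ S, S @ U.T @ Lo.T          # balancing transformation
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    error_bound = 2 * hsv[r:].sum()                  # classical H-infinity bound
    return Ab[:r, :r], Bb[:r, :], Cb[:, :r], hsv, error_bound

# Example on a small stable system (hypothetical data).
A = np.array([[-1.0, 0.2, 0.0], [0.0, -2.0, 0.5], [0.0, 0.0, -3.0]])
B = np.array([[1.0], [0.5], [0.2]])
C = np.array([[1.0, 1.0, 1.0]])
Ar, Br, Cr, hsv, bound = balanced_truncation(A, B, C, r=2)
print(hsv, bound)
```

The a priori bound of twice the sum of the discarded Hankel singular values is the kind of easy-to-use error bound the blurb mentions.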
2. The Algorithm; 3. Convergence Analysis; 4. Complexity Analysis; 5. Conclusions; References. A Simple Proof for a Result of Ollerenshaw on Steiner Trees (Xiufeng Du, Ding-Zhu Du, Biao Gao, and Lixue Qi): 1. Introduction; 2. In the Euclidean Plane; 3. In the Rectilinear Plane; 4. Discussion; References. Optimization Algorithms for the Satisfiability (SAT) Problem (Jun Gu): 1. Introduction; 2. A Classification of SAT Algorithms; 3. Preliminaries; 4. Complete Algorithms and Incomplete Algorithms; 5. Optimization: An Iterative Refinement Process; 6. Local Search Algorithms for SAT; 7. Global Optimization Algorithms for SAT Problem; 8. Applications; 9. Future Work; 10. Conclusions; References. Ergodic Convergence in Proximal Point Algorithms with Bregman Functions (Osman Guler): 1. Introduction; 2. Convergence for Function Minimization; 3. Convergence for Arbitrary Maximal Monotone Operators.
Genetic and evolutionary algorithms (GEAs) have achieved enviable success in solving optimization problems in a wide range of disciplines. This book provides effective optimization algorithms for solving a broad class of problems quickly, accurately, and reliably by employing evolutionary mechanisms.
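As a bare-bones illustration of the evolutionary mechanisms involved (selection, crossover and mutation on a population of candidate solutions; a generic sketch, not one of the book's specific algorithms):

```python
# Minimal genetic-algorithm sketch: maximise a fitness function over
# fixed-length bit strings using tournament selection, one-point crossover
# and bit-flip mutation.
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      p_crossover=0.9, p_mutation=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def tournament():                             # pick the fittest of 3 random parents
            return max(random.sample(pop, 3), key=fitness)
        children = []
        while len(children) < pop_size:
            a, b = tournament()[:], tournament()[:]
            if random.random() < p_crossover:         # one-point crossover
                cut = random.randrange(1, n_bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for i in range(n_bits):               # bit-flip mutation
                    if random.random() < p_mutation:
                        child[i] ^= 1
                children.append(child)
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)         # track the best solution seen so far
    return best

# Toy problem: OneMax (maximise the number of ones in the string).
print(sum(genetic_algorithm(fitness=sum)))
```

Real GEAs differ mainly in how they encode solutions and in the selection, recombination and mutation operators they use; the overall generational loop above is the common skeleton.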
This book introduces the parallel and distributed approach to logic programming, examining existing models of distributed logic programming and proposing an alternative framework for distributed logic programming using extended Petri nets. The hardwired realization of the Petri net based framework is presented in detail, and principles for mapping a logic program onto the proposed framework are outlined. Finally, the book explores the scope of Petri net models in designing next-generation deductive database machines.
System Modeling and Optimization XX deals with new developments in the areas of optimization, optimal control and system modeling. The themes range across various areas of optimization: continuous and discrete, numerical and analytical, finite and infinite dimensional, deterministic and stochastic, static and dynamic, theory and applications, foundations and case studies. Besides some classical topics, modern areas are also presented in the contributions, including robust optimization, filter methods, optimization of power networks, data mining and risk control.
Describing a new optimization algorithm, "Teaching-Learning-Based Optimization (TLBO)," in a clear and lucid style, this book maximizes reader insights into how the TLBO algorithm can be used to solve continuous and discrete optimization problems involving single or multiple objectives. As the algorithm operates on the principle of teaching and learning, where teachers influence the quality of learners' results, the elitist version of the TLBO algorithm (ETLBO) is described along with applications of the TLBO algorithm in the fields of electrical engineering, mechanical design, thermal engineering, manufacturing engineering, civil engineering, structural engineering, computer engineering, electronics engineering, physics and biotechnology. The book offers a valuable resource for scientists, engineers and practitioners involved in the development and use of advanced optimization algorithms.
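A minimal sketch of the teacher and learner phases on a continuous minimisation problem (my own rendering of the commonly published TLBO update rules, not code taken from the book):

```python
# Minimal TLBO sketch: minimise f over the box [lo, hi]^dim.
import random

def tlbo(f, dim=2, lo=-5.0, hi=5.0, pop_size=20, iterations=100):
    clip = lambda x: [min(max(v, lo), hi) for v in x]
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iterations):
        # Teacher phase: learners move toward the best solution (the teacher)
        # and away from the class mean, scaled by a teaching factor TF in {1, 2}.
        teacher = min(pop, key=f)
        mean = [sum(x[j] for x in pop) / pop_size for j in range(dim)]
        for i, x in enumerate(pop):
            tf = random.randint(1, 2)
            cand = clip([x[j] + random.random() * (teacher[j] - tf * mean[j])
                         for j in range(dim)])
            if f(cand) < f(x):
                pop[i] = cand
        # Learner phase: each learner interacts with a random peer, moving
        # toward a better peer or away from a worse one.
        for i, x in enumerate(pop):
            peer = pop[random.randrange(pop_size)]
            sign = 1.0 if f(peer) < f(x) else -1.0
            cand = clip([x[j] + sign * random.random() * (peer[j] - x[j])
                         for j in range(dim)])
            if f(cand) < f(x):
                pop[i] = cand
    return min(pop, key=f)

sphere = lambda x: sum(v * v for v in x)      # toy objective
print(tlbo(sphere))
```

A notable design choice of TLBO is that, unlike many evolutionary methods, it has no algorithm-specific tuning parameters beyond population size and iteration count; the elitist variant (ETLBO) additionally preserves the best solutions between iterations.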
Written expressly for hardware designers, this book presents a formal model of VHDL clearly specifying both the static and dynamic semantics of VHDL. It provides a mathematical framework for representing VHDL constructs and shows how those constructs can be formally manipulated to reason about VHDL.
Information systems for manufacturing often follow a three-layer architecture based on an enterprise resource planning (ERP) layer (for order planning), a manufacturing execution system (MES) layer (for factory control), and a shop floor layer (for machine control). Future requirements on flexibility and adaptability require a much closer integration of ERP systems with the manufacturing floor. To achieve this integration, an MES often pushes customer orders to the manufacturing floor in a flexible manner. Moreover, a large amount of shop floor data needs to be filtered and fed into business planning applications such as production planning or supply chain management. Radio Frequency Identification (RFID) chips can play an important role in the collection and management of such data. Gunther, Kletti, and Kubach explain the potential advantages of using RFID technology in a modern manufacturing and supply chain context. Areas of emphasis include integration of RFID data into legacy IT architectures, RFID-MES-ERP integration, and cost-benefit considerations. Their presentation is not restricted to intra-company production planning, but also emphasizes the benefits of inter-company collaboration. Six case studies based on SAP's ERP systems and MPDV's MES solution show how to successfully implement cross-company supply chain integration using RFID technology.
Mathematical Programming has been of significant interest and relevance in engineering, an area that is very rich in challenging optimization problems. In particular, many design and operational problems give rise to nonlinear and mixed-integer nonlinear optimization problems whose modeling and solution is often nontrivial. Furthermore, with the increased computational power and development of advanced analysis tools (e.g., process simulators, finite element packages) and modeling systems (e.g., GAMS, AMPL, SPEEDUP, ASCEND, gPROMS), the size and complexity of engineering optimization models is rapidly increasing. While the application of efficient local solvers (nonlinear programming algorithms) has become widespread, a major limitation is that there is often no guarantee that the solutions that are generated correspond to global optima. In some cases finding a local solution might be adequate, but in others it might mean incurring a significant cost penalty, or even worse, getting an incorrect solution to a physical problem. Thus, the need for finding global optima in engineering is a very real one. It is the purpose of this monograph to present recent developments in techniques and applications of deterministic approaches to global optimization in engineering. The present monograph is heavily represented by chemical engineers, and to a large extent this is no accident: mathematical programming is an active and vibrant area of research in chemical engineering, and this trend has existed for about 15 years.