This is a comprehensive study of various time-dependent scheduling problems in single-, parallel- and dedicated-machine environments. In addition to complexity issues and exact or heuristic algorithms, which are typically presented in scheduling books, the author also includes more advanced topics such as matrix methods in time-dependent scheduling, time-dependent scheduling with two criteria and time-dependent two-agent scheduling. The reader should be familiar with the basic notions of calculus, discrete mathematics and combinatorial optimization theory, while the book offers introductory material on the theory of algorithms, NP-complete problems, and the basics of scheduling theory. The author includes numerous examples, figures and tables, presents different classes of algorithms using pseudocode, completes all chapters with extensive bibliographies, and closes the book with comprehensive symbol and subject indexes. The previous edition of the book focused on the computational complexity of time-dependent scheduling problems. In this edition, the author concentrates on models of time-dependent job processing times and algorithms for solving time-dependent scheduling problems. The book is suitable for researchers working on scheduling, problem complexity, optimization, heuristics and local search algorithms.
This book is about describing the meaning of programming languages. The author teaches the skill of writing semantic descriptions as an efficient way to understand the features of a language. While a compiler or an interpreter offers a form of formal description of a language, it is not something that can be used as a basis for reasoning about that language, nor can it serve as a definition of the programming language itself, since a definition must allow a range of implementations. By writing a formal semantics of a language, a designer can produce a far shorter description and tease out, analyse and record design choices. Early in the book the author introduces a simple notation, a meta-language, used to record descriptions of the semantics of languages. In a practical approach, he considers dozens of issues that arise in current programming languages and the key techniques that must be mastered in order to write the required formal semantic descriptions. The book concludes with a discussion of the eight key challenges: delimiting a language (concrete representation), delimiting the abstract content of a language, recording semantics (deterministic languages), operational semantics (non-determinism), context dependency, modelling sharing, modelling concurrency, and modelling exits. The content is class-tested and suitable for final-year undergraduate and postgraduate courses. It is also suitable for any designer who wants to understand languages at a deep level. Most chapters offer projects, some of them quite advanced exercises that ask for complete descriptions of languages, and the book is supported throughout with pointers to further reading and resources. As a prerequisite the reader should know at least one imperative high-level language and have some knowledge of discrete mathematics notation for logic and set theory.
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to operations in every sector: information technology, telecommunications, energy, banking and finance, transportation systems, chemicals, agriculture and food, defense industrial base, public health and health care, national monuments and icons, drinking water and water treatment systems, commercial facilities, dams, emergency services, commercial nuclear reactors, materials and waste, postal and shipping, and government facilities. Global business and industry, governments, indeed society itself, cannot function if major components of the critical information infrastructure are degraded, disabled or destroyed. This book, Critical Infrastructure Protection IV, is the fourth volume in the annual series produced by IFIP Working Group 11.10 on Critical Infrastructure Protection, an active international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts related to critical infrastructure protection. The book presents original research results and innovative applications in the area of infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This volume contains seventeen edited papers from the Fourth Annual IFIP Working Group 11.10 International Conference on Critical Infrastructure Protection, held at the National Defense University, Washington, DC, March 15-17, 2010. The papers were refereed by members of IFIP Working Group 11.10 and other internationally-recognized experts in critical infrastructure protection.
This book explores the possible creation and impact of electronic markets underpinned by government. How far could electronic trade go? The author outlines a world in which open online marketplaces are routinely used to trade everything from office space to bicycle rental between individuals. Each transaction would be guaranteed by the system, not the reputation of the seller. Anyone could enter the market as an equal. The author argues that the electronic marketplaces of the future will have widespread and fundamental economic and social consequences. For more information about Guaranteed Electronic Markets visit the Gems Website at www.gems.org.uk
This book is a result of ISD2000 - The Ninth International Conference on Information Systems Development: Methods and Tools, Theory and Practice, held August 14-16, in Kristiansand, Norway. The ISD conference has its roots in the first Polish-Scandinavian Seminar on Current Trends in Information Systems Development Methodologies, held in Gdansk, Poland in 1988. This year, as the conference carries this fine tradition into the new millennium, it was fitting that it returned to Scandinavia. Velkommen tilbake (welcome back)! Next year, ISD crosses the North Sea and, in the tradition of the Vikings, invades England. Like every ISD conference, ISD2000 gave participants an opportunity to express ideas on the current state of the art in information systems development, and to discuss and exchange views about new methods, tools and applications. This is particularly important now, since the field of ISD has seen rapid, and often bewildering, changes. To quote a Chinese proverb, we are indeed cursed, or blessed, depending on how we choose to look at it, to be "living in interesting times."
An essential resource on the future of IP networks. Everyone agrees that Internet Protocol (IP) has played and will play a major role in the evolution of networks and services. The exact nature and scope of that role, however, remains a point of discussion. Assembling the foremost experts in their respective fields, editors Salah Aidarous and Thomas Plevyak present the community with an invaluable resource for research and development in Managing IP Networks: Challenges and Opportunities. Issues related to end-to-end network and service management, for example, will dramatically impact the future of IP networks, yet there remains a scholarly deficit between the significance of these topics and their presence in the current literature. Aidarous and Plevyak make up this deficit, addressing these and other critical challenges affecting the growth of IP networks, with contributions from experts across the field.
Network operations engineers, computer scientists, and management software vendors, as well as professionals in the continuing education and research and development communities, will find Managing IP Networks to be an invaluable addition to their professional libraries.
Advances in Computers, Volume 112, the latest volume in a series published since 1960, presents detailed coverage of innovations in computer hardware, software, theory, design and applications. Chapters in this updated volume include Mobile Application Quality Assurance, Advances in Combinatorial Testing, Advances in Applications of Object Constraint Language for Software Engineering, Advances in Techniques for Test Prioritization, Data Warehouse Testing, Mutation Testing Advances: An Analysis and Survey, Event-Based Concurrency: Applications, Abstractions, and Analyses, and A Taxonomy of Software Integrity Protection Techniques.
This book focuses on the design methodologies of various quantum circuits, DNA circuits, DNA-quantum circuits and quantum-DNA circuits. It considers the merits and challenges of multiple-valued logic circuits in quantum, DNA, quantum-DNA and DNA-quantum computing. Multiple-Valued Computing in Quantum Molecular Biology: Arithmetic and Combinational Circuits is Volume 1 of a two-volume set. From fundamentals to advanced levels, this book discusses different multiple-valued logic DNA-quantum and quantum-DNA circuits. The text consists of four parts. Part I introduces multiple-valued quantum computing and DNA computing. It contains the basic understanding of multiple-valued quantum computing, multiple-valued DNA computing, multiple-valued quantum-DNA computing and multiple-valued DNA-quantum computing. Part II examines heat calculation, speed calculation, heat transfer, data conversion and data management in multiple-valued quantum, DNA, quantum-DNA and DNA-quantum computing. Part III discusses multiple-valued logic operations in quantum and DNA computing such as ternary AND, NAND, OR, NOR, XOR, XNOR and multiple-valued arithmetic operations such as adder, multiplier, divider and more. Quantum-DNA and DNA-quantum multiple-valued arithmetic operations are also explained in this section. Part IV explains multiple-valued quantum and DNA combinational circuits such as multiple-valued DNA-quantum and quantum-DNA multiplexer, demultiplexer, encoder and decoder. This book will be of great help to researchers and students in quantum computing, DNA computing, quantum-DNA computing and DNA-quantum computing.
- Discusses various design aspects of multiple-valued logic DNA-quantum and quantum-DNA sequential circuits, memory devices, programmable logic devices and nano-processors
- Presents how multiple-valued quantum, DNA, quantum-DNA and DNA-quantum nano-processors are designed with algorithms
- Examines the architecture and design procedure of memory devices such as Random Access Memory (RAM) and Read-Only Memory (ROM)
- Reviews the designs and algorithms of multiple-valued quantum, DNA, quantum-DNA and DNA-quantum nano-processors
Computational Intelligence (CI) has emerged as a novel and highly diversified paradigm supporting the design, analysis and deployment of intelligent systems. This book presents a careful selection of the field that very well reflects the breadth of the discipline. It covers a range of highly relevant and practical design principles governing the development of intelligent systems in data mining, robotics, bioinformatics, and intelligent tutoring systems. The lucid presentations, coherent organization, breadth and the authoritative coverage of the area make the book highly attractive for everybody interested in the design and analysis of intelligent systems.
This newly revised edition of the highly successful 1997 book offers professionals and students an up-to-date, in-depth understanding of how payments are made electronically across the Internet. The second edition explores the very latest developments in this quickly expanding area, including the newest security techniques and methods, and features a completely new chapter on the exciting advances in mobile commerce.
Advances in Computers, Volume 113, the latest volume in this innovative series published since 1960, presents detailed coverage of new advancements in computer hardware, software, theory, design and applications. Chapters in this updated release include A Survey on Regression Test-case Prioritization, Symbolic Execution and Recent Applications to Worst-Case Execution, Load Testing and Security Analysis, Model Based Test Cases Reuse and Optimization, Advances in Using Agile and Lean Processes for Software Development, Three Open Problems in the Context of E2E Web Testing and a Vision: NEONATE, Experiences with replicable experiments and replication kits for software engineering research, and Advances in Symbolic Execution.
This volume is the first extensive study of the historical and philosophical connections between technology and mathematics. Coverage includes the use of mathematics in ancient as well as modern technology, devices and machines for computation, cryptology, mathematics in technological education, the epistemology of computer-mediated proofs, and the relationship between technological and mathematical computability. The book also examines the work of such historical figures as Gottfried Wilhelm Leibniz, Charles Babbage, Ada Lovelace, and Alan Turing.
This book describes how a key signal/image processing algorithm - that of the fast Hartley transform (FHT) or, via a simple conversion routine between their outputs, of the real-data version of the ubiquitous fast Fourier transform (FFT) - might best be formulated to facilitate computationally-efficient solutions. The author discusses this for both 1-D (such as required, for example, for the spectrum analysis of audio signals) and m-D (such as required, for example, for the compression of noisy 2-D images or the watermarking of 3-D video signals) cases, but requiring few computing resources (i.e. low arithmetic/memory/power requirements, etc.). This is particularly relevant for those application areas, such as mobile communications, where the available silicon resources (as well as the battery-life) are expected to be limited. The aim of this monograph, where silicon-based computing technology and a resource-constrained environment is assumed and the data is real-valued in nature, has thus been to seek solutions that best match the actual problem needing to be solved.
This volume offers an up-to-date overview of essential concepts and modern approaches to computational modelling, including the use of experimental techniques related to or directly inspired by them. The book introduces, at increasing levels of complexity and with the non-specialist in mind, state-of-the-art topics ranging from single-cell and molecular descriptions to circuits and networks. Four major themes are covered, including subcellular modelling of ion channels and signalling pathways at the molecular level, single-cell modelling at different levels of spatial complexity, network modelling from local microcircuits to large-scale simulations of entire brain areas and practical examples. Each chapter presents a systematic overview of a specific topic and provides the reader with the fundamental tools needed to understand the computational modelling of neural dynamics. This book is aimed at experimenters and graduate students with little or no prior knowledge of modelling who are interested in learning about computational models from the single molecule to the inter-areal communication of brain structures. The book will appeal to computational neuroscientists, engineers, physicists and mathematicians interested in contributing to the field of neuroscience. Chapters 6, 10 and 11 are available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.
The aim of this book is to provide detailed coverage of the topics in the new OCR AS and A Level Computer Science specifications H046 / H446. The book is divided into twelve sections and within each section, each chapter covers material that can comfortably be taught in one or two lessons. Material that is applicable only to the second year of the full A Level is clearly marked. Sometimes this may include an entire chapter and at other times, just a small part of a chapter. Each chapter contains exercises and questions, some new and some from past examination questions. Answers to all these are available to teachers only in a free Teacher's Pack which can be ordered from our website www.pgonline.co.uk. This book has been written to cover the topics which will be examined in the written papers at both AS and A Level. Sections 10, 11 and 12 relate principally to problem solving skills, with programming techniques covered in sufficient depth to allow students to answer questions in Component 02. Pseudocode, rather than any specific programming language, is used in the algorithms given in the text. Sample Python programs which implement many of the algorithms are included in a folder with the Teacher's Pack.
Cellular automata are widely-used tools for simulation in physics, ecology, evolution, mathematics and other fields. They are also digital "toy universes" worthy of study in their own right, with a significant and growing body of enthusiastic investigators. This book will present many of the most interesting new developments and applications of cellular automata. There has not been a compilation like this for some time and this field is due for an explosion of research interest in the rather near future.
This second volume in the series, "Strategic Management and Information Technology" presents a coherent set of papers that deal with the challenges of leveraging information technology for designing inter-organizational relationships. Instead of assembling a set of papers that are loosely connected to the broad theme of strategy and information technology, this volume presents a well-knit compendium of papers on a coherent topic.
Control technology is a new learning environment which offers the opportunity to take up the economic and educational challenge of enabling people to adapt to new technologies and use them to solve problems. Giving young children (and also adults) easy access to control technology introduces them to a learning environment where they can build their knowledge across a range of topics. As they build and program their own automata and robots, they learn to solve problems, work in collaboration, and be creative. They also learn more about science, electronics, physics, computer literacy, computer-assisted manufacturing, and so on. This book, based on a NATO Advanced Research Workshop in the Special Programme on Advanced Educational Technology, presents a cross-curricular approach to learning about control technology. The recommended methodology is active learning, where the teacher's role is to stimulate the learner to build knowledge by providing him/her with appropriate materials (hardware and software) and suggestions to develop the target skills. The results are encouraging, although more tools are needed to help the learner to generalize from his/her concrete experiment in control technology as well as to evaluate its effect on the target skills. The contributions not only discuss epistemological controversies linked to such learning environments as control technology, but also report on the state of the art and new developments in the field and present some stimulating ideas.
This book collects and explains the many theorems concerning the existence of certificates of positivity for polynomials that are positive globally or on semialgebraic sets. A certificate of positivity for a real polynomial is an algebraic identity that gives an immediate proof of a positivity condition for the polynomial. Certificates of positivity have their roots in fundamental work of David Hilbert from the late 19th century on positive polynomials and sums of squares. Because of the numerous applications of certificates of positivity in mathematics, applied mathematics, engineering, and other fields, it is desirable to have methods for finding, describing, and characterizing them. For many of the topics covered in this book, appropriate algorithms, computational methods, and applications are discussed. This volume contains a comprehensive, accessible, up-to-date treatment of certificates of positivity, written by an expert in the field. It provides an overview of both the theory and computational aspects of the subject, and includes many of the recent and exciting developments in the area. Background information is given so that beginning graduate students and researchers who are not specialists can learn about this fascinating subject. Furthermore, researchers who work on certificates of positivity or use them in applications will find this a useful reference for their work.
This book presents a peer-reviewed selection of extended versions of ten original papers that were presented at the 15th International Symposium on Computers in Education (SIIE 2013) held in Viseu, Portugal. The book provides a representative view of current Information and Communications Technology (ICT) educational research approaches in the Ibero-American context as well as internationally. It includes studies that range from elementary to higher education, from traditional to distance learning settings. It considers special needs and other inclusive issues, across a range of disciplines, using multiple and diverse perspectives and technologies to furnish detailed information on the latest trends in ICT and education globally. Design, development and evaluation of educational software; ICT use and evaluation methodologies; social web and collaborative systems; and learning communities are some of the topics covered.
The application of 3D methodology has recently been receiving increasing attention at many PET centres, and this monograph is an attempt to provide a state-of-the-art review of this methodology, covering 3D reconstruction methods, quantitative procedures, current tomography performance, and clinical and research applications. No such review has been available until now to assist PET researchers in understanding and implementing 3D methodology, and in evaluating the performance of the available imaging technology. In all the chapters, the subject matter is treated in sufficient depth to appeal equally to the physicist or engineer who wishes to establish the methodology, and to PET investigators with experience in 2D PET who wish to familiarize themselves with the concepts and advantages of 3D, and to be made aware of the pitfalls.
The book presents a comprehensive view of Flow-Aware Networking. It starts with a brief overview of the known QoS architectures based on the concept of a flow. Then, the original FAN concept is presented, along with its variations proposed by the authors. The next chapter covers a very valuable feature of the FAN architecture, namely its ability to assure net neutrality. The chapters that follow discuss, in detail, a variety of issues making the FAN concept implementable, including congestion control, fairness, resilience to failures, service differentiation and degradation. The final chapter presents the test implementation of the FAN router, including the environment used and performance tests. Chapters are supplemented with problems to solve, along with their solutions. The pedagogical character of the book is supported by a number of illustrative examples contained in most of the chapters. At the end of the book, a glossary of the key terms is included, along with a comprehensive bibliography. Flow-based traffic management is currently becoming mainstream. There are plenty of Quality of Service (QoS) techniques based on flows. Software-Defined Networking, with its dominant protocol OpenFlow, also follows this trend. Flow-Aware Networking (FAN) is a promising QoS architecture. Information on FAN can be found in various research papers and is therefore highly scattered. This book gathers practically all relevant information regarding FAN and puts it together. Quality of Service assurance is one of the key challenges of today's Internet. The existing approaches to providing QoS do not meet the expectations of network operators, managers and users, although numerous efforts in this area have been reported. One of the most promising concepts is the Flow-Aware Network (FAN). FAN can play a key role in assuring net neutrality, smoothly combining the interests of all the involved parties.
The authors have been involved in FAN research practically since its inception at the start of the 21st century. The book reports on the extensive experience the authors accumulated in the subject area during work on common FAN-related projects conducted with the team of James Roberts, who proposed the original FAN concept, as well as with other leading research groups in Europe and the USA. One of the aims of the book is to accompany courses taught by the authors.
The availability of today's online information systems rapidly increases the relevance of dynamic decision making within a large number of operational contexts. Whenever a sequence of interdependent decisions occurs, making a single decision raises the need for anticipation of its future impact on the entire decision process. Anticipatory support is needed for a broad variety of dynamic and stochastic decision problems from different operational contexts such as finance, energy management, manufacturing and transportation. Example problems include asset allocation, feed-in of electricity produced by wind power as well as scheduling and routing. All these problems entail a sequence of decisions contributing to an overall goal and taking place in the course of a certain period of time. Each of the decisions is derived by solution of an optimization problem. As a consequence, a stochastic and dynamic decision problem resolves into a series of optimization problems to be formulated and solved by anticipation of the remaining decision process. However, actually solving a dynamic decision problem by means of approximate dynamic programming is still a major scientific challenge. Most of the work done so far is devoted to problems allowing for formulation of the underlying optimization problems as linear programs. Problem domains like scheduling and routing, where linear programming typically does not produce a significant benefit for problem solving, have not been considered so far. Therefore, the industry demand for dynamic scheduling and routing is still predominantly satisfied by purely heuristic approaches to anticipatory decision making. Although this may work well for certain dynamic decision problems, these approaches lack transferability of findings to other, related problems. This book serves two major purposes: It provides a comprehensive and unique view of anticipatory optimization for dynamic decision making.
It fully integrates Markov decision processes, dynamic programming, data mining and optimization and introduces a new perspective on approximate dynamic programming. Moreover, the book identifies different degrees of anticipation, enabling an assessment of specific approaches to dynamic decision making. It shows for the first time how to successfully solve a dynamic vehicle routing problem by approximate dynamic programming. It elaborates on every building block required for this kind of approach to dynamic vehicle routing. The book thereby has a pioneering character and is intended to provide a footing for the dynamic vehicle routing community.
Countering Cyber Sabotage: Introducing Consequence-Driven, Cyber-Informed Engineering (CCE) introduces a new methodology to help critical infrastructure owners, operators and their security practitioners make demonstrable improvements in securing their most important functions and processes. Current best practice approaches to cyber defense struggle to stop targeted attackers from creating potentially catastrophic results. From a national security perspective, it is not just the damage to the military, the economy, or essential critical infrastructure companies that is a concern. It is the cumulative, downstream effects from potential regional blackouts, military mission kills, transportation stoppages, water delivery or treatment issues, and so on. CCE is a validation that engineering first principles can be applied to the most important cybersecurity challenges and in so doing, protect organizations in ways current approaches do not. The most pressing threat is cyber-enabled sabotage, and CCE begins with the assumption that well-resourced, adaptive adversaries are already in and have been for some time, undetected and perhaps undetectable. Chapter 1 recaps the current and near-future states of digital technologies in critical infrastructure and the implications of our near-total dependence on them. Chapters 2 and 3 describe the origins of the methodology and set the stage for the more in-depth examination that follows. Chapter 4 describes how to prepare for an engagement, and chapters 5-8 address each of the four phases. The CCE phase chapters take the reader on a more granular walkthrough of the methodology with examples from the field, phase objectives, and the steps to take in each phase. Concluding chapter 9 covers training options and looks towards a future where these concepts are scaled more broadly.