From the reviews of the 1st edition: "This book provides a comprehensive and detailed account of different topics in algorithmic 3-dimensional topology, culminating with the recognition procedure for Haken manifolds and including the up-to-date results in computer enumeration of 3-manifolds. Originating from lecture notes of various courses given by the author over a decade, the book is intended to combine the pedagogical approach of a graduate textbook (without exercises) with the completeness and reliability of a research monograph... All the material, with few exceptions, is presented from the peculiar point of view of special polyhedra and special spines of 3-manifolds. This choice contributes to keeping the level of the exposition truly elementary. In conclusion, the reviewer subscribes to the quotation from the back cover: 'the book fills a gap in the existing literature and will become a standard reference for algorithmic 3-dimensional topology both for graduate students and researchers.'" (Zentralblatt für Mathematik, 2004) For this 2nd edition, new results, new proofs, and commentaries for a better orientation of the reader have been added. In particular, several new sections concerning applications of the computer program "3-Manifold Recognizer" have been included in Chapter 7.
A formal method is not the main engine of a development process; its contribution is to improve system dependability by motivating formalisation where it is useful. This book summarizes the results of the DEPLOY research project on engineering methods for dependable systems through the industrial deployment of formal methods in software development. The applications considered were in automotive, aerospace, railway, and enterprise information systems, and microprocessor design. The project introduced a formal method, Event-B, into several industrial organisations and built on the lessons learned to provide an ecosystem of better tools, documentation and support to help others to select and introduce rigorous systems engineering methods. The contributing authors report on these projects and the lessons learned. For the academic and research partners and the tool vendors, the project identified improvements required in the methods and supporting tools, while the industrial partners learned about the value of formal methods in general. A particular feature of the book is the frank assessment of the managerial and organisational challenges, the weaknesses in some current methods and supporting tools, and the ways in which they can be successfully overcome. The book will be of value to academic researchers, systems and software engineers developing critical systems, industrial managers, policymakers, and regulators.
The computer is the great technological and scientific innovation of the last half of the twentieth century. It has revolutionized how we organize information, how we communicate with each other, and even the way that we think about the human mind. Computers have eased the drudgery of such tasks as calculating sums and clerical work, making them both more bearable and more efficient. The computer has become ubiquitous in many aspects of business, recreation, and everyday life, and the trend is that computers are becoming both more powerful and easier to use. Computers: The Life Story of a Technology provides an accessible overview of this ever-changing technology, giving students and lay readers an understanding of the complete scope of its history from ancient times to the present day. In addition to providing a concise biography of how this technology developed, this book provides insights into how the computer has changed our lives:
* Demonstrates how, just as the invention of the steam engine in the 1700s stimulated scientists to think of the laws of nature in terms of machines, the success of the computer in the late 1900s prompted scientists to think of the basic laws of the universe as being similar to the operation of a computer.
* Provides a worldwide examination of computing, and how such needs as security and defense during the Cold War drove the development of computing technology.
* Shows how the computer has entered almost every aspect of daily life in the 21st century.
The volume includes a glossary of terms, a timeline of important events, and a selected bibliography of useful resources for further information.
This book describes the benefits that emerge when the fields of constraint programming and concurrency meet. On the one hand, constraints can be used in concurrency theory to increase the conciseness and the expressive power of concurrent languages from a pragmatic point of view. On the other hand, problems modelled using constraints can be solved faster and more efficiently on a concurrent system. Both directions are explored, providing two separate lines of development. First, the expressive power of a concurrent language that supports constraints as a primitive construct, namely Constraint Handling Rules, is studied, and the features that make this language Turing-powerful are shown. Then a framework for solving constraint problems, intended to be deployed on a concurrent system, is proposed; it is developed in the concurrent language Jolie, which follows the service-oriented paradigm. Based on this experience, an extension to service-oriented languages is also proposed in order to overcome some of their limitations and to improve the development of concurrent applications.
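To give a feel for the Constraint Handling Rules style mentioned above, here is a minimal sketch (our own illustration, not code from the book) of the classic CHR "leq" program: a store of leq(X, Y) constraints is rewritten to a fixpoint by the transitivity propagation rule leq(X,Y), leq(Y,Z) ==> leq(X,Z), and the antisymmetry rule leq(X,Y), leq(Y,X) <=> X=Y detects implied equalities.

```python
# Illustrative sketch of CHR-style constraint rewriting (not the book's code).

def propagate_leq(store):
    """Close a set of (x, y) pairs meaning x <= y under transitivity."""
    store = set(store)
    changed = True
    while changed:                      # naive fixpoint iteration
        changed = False
        for (a, b) in list(store):
            for (c, d) in list(store):
                if b == c and (a, d) not in store:
                    store.add((a, d))   # fire: leq(a,b), leq(b,d) ==> leq(a,d)
                    changed = True
    return store

def implied_equalities(store):
    """Antisymmetry rule: leq(X,Y), leq(Y,X) <=> X = Y."""
    return {frozenset((a, b)) for (a, b) in store if (b, a) in store and a != b}

closed = propagate_leq({("x", "y"), ("y", "z"), ("z", "x")})
print(sorted(closed))
print(implied_equalities(closed))
```

A real CHR engine matches rule heads against the store far more efficiently; the naive fixpoint loop above only conveys the rewriting semantics.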
Information security and copyright protection are more important today than ever before. Digital watermarking is one of the most widely used techniques in the area of information security. This book introduces a number of digital watermarking techniques and is divided into four parts. The first part introduces the importance of watermarking techniques and intelligent technology. The second part presents a number of watermarking techniques. The third part covers hybrid watermarking techniques, and the final part presents conclusions. This book is directed to students, professors, researchers and application engineers who are interested in the area of information security.
The present book is the result of a three year research project which investigated the creative act of composing by means of algorithmic composition. Central to the investigation are the compositional strategies of 12 composers, which were documented through a dialogic and cyclic process of modelling and evaluating musical materials. The aesthetic premises and compositional approaches configure a rich spectrum of diverse positions, which is reflected also in the kinds of approaches and methods used. These approaches and methods include the generation and evaluation of chord sequences using genetic algorithms, the application of morphing strategies to research harmonic transformations, an automatic classification of personal preferences via machine learning, and an application of mathematical music theory to the analysis and resynthesis of musical material. The second part of the book features contributions by Sandeep Bhagwati, William Brooks, David Cope, Darla Crispin, Nicolas Donin, and Guerino Mazzola. These authors variously consider the project from different perspectives, offer independent approaches, or provide more general reflections from their respective research fields.
Competitive intelligence uses public sources to obtain valuable information on competition and competitors. In an open society such as our own, businesses place a great deal of information in the public domain. By using competitive intelligence aggressively and intelligently, corporations can obtain information on potential acquisition targets, markets, key personnel, the probable emergence of new products, or the financial strength or contracts of a competing firm. In fact, the authors contend that as much as 90 percent of the information required to decide on a course of litigation, acquisitions, expansion, new product introduction, or financing is available through competitive intelligence. "An absolutely indispensable playbook for anyone who has to compete during the information explosion." (Martin Sikora, Editor, Mergers and Acquisitions)
Fundamental Problems in Computing is in honor of Professor Daniel J. Rosenkrantz, a distinguished researcher in Computer Science. Professor Rosenkrantz has made seminal contributions to many subareas of Computer Science including formal languages and compilers, automata theory, algorithms, database systems, very large scale integrated systems, fault-tolerant computing and discrete dynamical systems. For many years, Professor Rosenkrantz served as the Editor-in-Chief of the Journal of the Association for Computing Machinery (JACM), a very prestigious archival journal in Computer Science. His contributions to Computer Science have earned him many awards including the Fellowship from ACM and the ACM SIGMOD Contributions Award.
This volume contains the articles presented at the 21st International Meshing Roundtable (IMR), organized in part by Sandia National Laboratories and held on October 7-10, 2012 in San Jose, CA, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts in a variety of disciplines, from all over the world, to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics, and visualization.
This volume represents the papers reviewed and accepted for the HOIT2007 conference held at the Indian Institute of Technology Madras in the city of Chennai, India in August 2007. This volume addresses many of the major themes of current interest in the field, with a particular focus on community-based technologies. This comprehensive book is divided into five different sections reflecting the most up-to-date research on computers and society.
This book is located at the interface of online learning within a context of English language studies and academic literacy and is underpinned, from a critical theoretical perspective, by an understanding of the implications of the digital divide for developing countries worldwide. The work is an exploration of online learning in an undergraduate English language and academic literacy classroom at a university in South Africa, and theorises the need for technology in developing countries as a means of social inclusion. The aim is to explore the extent to which communities of practice are enabled in an online environment, among English non-mother tongue speakers from technologically under-resourced backgrounds. This study examines the extent to which the students participate, negotiate meaning, and construct identities in online spaces. From a sociocultural perspective this book locates learning as a form of interaction and co-participation, and argues that learning occurs within specific contexts, hence the focus on how individuals become members of 'communities of practice'.
In recent years Genetic Algorithms (GA) and Artificial Neural Networks (ANN) have progressively increased in importance amongst the techniques routinely used in chemometrics. This book contains contributions from experts in the field and is divided into two sections (GA and ANN). In each part, tutorial chapters are included in which the theoretical bases of each technique are expertly (but simply) described. These are followed by application chapters in which special emphasis is given to the advantages of applying GA or ANN to that specific problem, compared to classical techniques, and to the risks connected with their misuse.
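As a flavour of the first technique, here is a minimal genetic-algorithm sketch (our own illustration, not from the book). In chemometrics, GAs are often used for variable selection, e.g. choosing wavelengths for a calibration model; here a toy fitness that simply counts selected variables stands in for a real model-quality score.

```python
# Minimal GA sketch: tournament selection, one-point crossover, bit mutation.
import random

random.seed(0)

def fitness(bits):
    return sum(bits)            # stand-in for a model-quality score

def evolve(n_bits=20, pop_size=30, generations=40, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():             # tournament selection of size 2
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_bits)                       # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < p_mut) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

The risks the book warns about show up even here: too high a mutation rate destroys good solutions, and a fitness tied to model fit alone invites overfitting.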
This is volume 73 of "Advances in Computers." This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. In this current volume, subtitled "Emerging Technologies," we discuss several new advances in computer software generation as well as describe new applications of those computers.
The innovative process of open source software is led in great part by end-users; this aspect of open source software therefore remains significant beyond the realm of traditional software development. Open Source Software Dynamics, Processes, and Applications is a multidisciplinary collection of research and approaches on the applications and processes of open source software. Highlighting the development processes performed by software programmers, the motivations of its participants, and the legal and economic issues that have been raised, this book is essential for scholars, students, and practitioners in the fields of software engineering and management as well as sociology.
later versions. In addition, the CD-ROM contains a complete solutions manual that includes detailed solutions to all the problems in the book. If the reader does not wish to consult these solutions, then a brief list of answers is provided in printed form at the end of the book. I would like to thank my family members for their help and continued support, without which this book would not have been possible. I would also like to acknowledge the help of the editor at Springer-Verlag (Dr. Thomas Ditzinger) for his assistance in bringing this book out in its present form. Finally, I would like to thank my brother, Nicola, for preparing most of the line drawings in both editions. In this edition, I am providing two email addresses for my readers to contact me (pkattan@tedata.net.jo and pkattan@lsu.edu). The old email address that appeared in the first edition was cancelled in 2004. December 2006, Peter I. Kattan. Preface to the First Edition: This is a book for people who love finite elements and MATLAB. We will use the popular computer package MATLAB as a matrix calculator for doing finite element analysis. Problems will be solved mainly using MATLAB to carry out the tedious and lengthy matrix calculations, in addition to some manual manipulations, especially when applying the boundary conditions. In particular, the steps of the finite element method are emphasized in this book. The reader will not find ready-made MATLAB programs for use as black boxes. Instead, step-by-step solutions of finite element problems are examined in detail using MATLAB.
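The step-by-step finite element procedure the preface describes — assemble element stiffness matrices into a global matrix, apply boundary conditions, solve — can be sketched as follows. This is our own minimal illustration in Python rather than the book's MATLAB code; the bar with two two-node elements, the element stiffness k, and the load P are all assumed example values.

```python
# 1D bar, two linear elements: assemble, apply BC, solve (illustrative sketch).

def assemble(n_nodes, elements):
    """Sum 2x2 element stiffness matrices k*[[1,-1],[-1,1]] into global K."""
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for (i, j, k) in elements:
        K[i][i] += k; K[i][j] -= k
        K[j][i] -= k; K[j][j] += k
    return K

def solve_2x2(A, b):
    """Cramer's rule for the reduced 2x2 system left after the BC."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

k = 100.0                                  # element stiffness E*A/L (assumed)
K = assemble(3, [(0, 1, k), (1, 2, k)])
# Boundary condition u0 = 0: drop row/column 0, keep the 2x2 block for u1, u2.
K_red = [[K[1][1], K[1][2]], [K[2][1], K[2][2]]]
P = 10.0                                   # axial load at the free end (assumed)
u1, u2 = solve_2x2(K_red, [0.0, P])
print(u1, u2)                              # u1 = P/k = 0.1, u2 = 2P/k = 0.2
```

In MATLAB the same steps would use the matrix operators directly (K(2:3,2:3)\[0;P]); the point of the book is precisely that the reader performs each step rather than calling a black box.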
This book reviews the algorithms for processing geometric data, with a practical focus on important techniques not covered by traditional courses on computer vision and computer graphics. Features: presents an overview of the underlying mathematical theory, covering vector spaces, metric spaces, affine spaces, differential geometry, and finite difference methods for derivatives and differential equations; reviews geometry representations, including polygonal meshes, splines, and subdivision surfaces; examines techniques for computing curvature from polygonal meshes; describes algorithms for mesh smoothing, mesh parametrization, and mesh optimization and simplification; discusses point location databases and convex hulls of point sets; investigates the reconstruction of triangle meshes from point clouds, including methods for registration of point clouds and surface reconstruction; provides additional material at a supplementary website; includes self-study exercises throughout the text.
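Of the algorithms listed, mesh smoothing has the simplest classical instance: Laplacian ("umbrella") smoothing, which moves each vertex a step toward the average of its neighbours. The sketch below is our own illustration, not the book's code; a noisy 1D polyline with fixed endpoints stands in for a mesh, where on a real mesh the neighbours would come from the connectivity.

```python
# Laplacian smoothing sketch: damp oscillations by averaging neighbours.

def laplacian_smooth(points, lam=0.5, iterations=10):
    pts = list(points)
    for _ in range(iterations):
        new = pts[:]                        # endpoints stay fixed
        for i in range(1, len(pts) - 1):
            avg = 0.5 * (pts[i - 1] + pts[i + 1])
            new[i] = pts[i] + lam * (avg - pts[i])   # step toward neighbour average
        pts = new
    return pts

noisy = [0.0, 1.0, -1.0, 1.0, -1.0, 0.0]
print(laplacian_smooth(noisy))              # oscillations shrink toward zero
```

The same operator, applied without care, also shrinks genuine features, which is why the literature pairs it with variants such as Taubin smoothing.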
This book presents intuitive explanations of the principles and applications of power system resiliency, as well as a number of straightforward and practical methods for the impact analysis of risk events on power system operations. It also describes the challenges of modelling distribution networks, optimal scheduling, multi-stage planning, deliberate attacks, cyber-physical systems and SCADA-based smart grids, and how to overcome these challenges. Further, it highlights resiliency issues using various methods, including strengthening the system against high-impact, low-frequency events and fast recovery of the system properties. A large number of specialists have collaborated to provide innovative solutions and research in power systems resiliency. They discuss the fundamentals and contemporary materials of power systems resiliency, theoretical and practical issues, as well as current issues and methods for controlling the risks of attacks and other threats to AC power systems. The book includes theoretical research, significant results, case studies, and practical implementation processes to offer insights into electric power and engineering and energy systems. Showing how systems should respond in case of malicious attacks, and helping readers to decide on the best approaches, this book is essential reading for electrical engineers, researchers and specialists. The book is also useful as a reference for undergraduate and graduate students studying the resiliency and reliability of power systems.
IT securiteers - The human and technical dimension working for the organisation. Current corporate governance regulations and international standards lead many organisations, big and small, to the creation of an information technology (IT) security function in their organisational chart or to the acquisition of services from the IT security industry. More often than desired, these teams are only useful for companies' executives to tick the corresponding box in a certification process, be it ISO, ITIL, PCI, etc. Many IT security teams do not provide business value to their company. They fail to really protect the organisation from the increasing number of threats targeting its information systems. IT Security Management provides an insight into how to create and grow a team of passionate IT security professionals. We will call them "securiteers." They will add value to the business, improving the information security stance of organisations.
This book contains papers presented at the International Symposium on Electromagnetic Fields in Mechatronics, Electrical and Electronic Engineering (ISEF'07), which was held in Prague, the Czech Republic, from September 13 to 15, 2007. ISEF conferences have been organized since 1985, and from the very beginning the series was a common initiative of Polish and other European researchers working on electromagnetic fields in electrical engineering. The conference travels through Europe and is organized in various academic centres. Relatively often it was held in a Polish city, as the initiative was on the part of Polish scientists. Now ISEF is much more international, and successive events take place in different European academic centres renowned for electromagnetic research. This time it was Prague, famous for its beauty and historical background, a place where many cultures mingle. The venue of the conference was the historical building of Charles University, right in the centre of Prague. The Technical University of Prague, in turn, constituted the logistic centre of the conference. It is the tradition of the ISEF meetings that they try to tackle quite a vast area of computational and applied electromagnetics. Moreover, the ISEF symposia aim at combining theory and practice; therefore the majority of papers are deeply rooted in engineering problems, being simultaneously of a high theoretical level.
This book contains the collection of papers presented at the conference of the International Federation for Information Processing Working Group 8.2 "Information and Organizations." The conference took place during June 21-24, 2009 at the Universidade do Minho in Guimaraes, Portugal. The conference, entitled "CreativeSME - The Role of IS in Leveraging the Intelligence and Creativity of SME's," attracted high-quality submissions from across the world. Each paper was reviewed by at least two reviewers in a double-blind review process. In addition to the 19 papers presented at the conference, there were five panels and four workshops, which covered a range of issues relevant to SMEs, creativity and information systems. We would like to show our appreciation of the efforts of our two invited keynote speakers, Michael Dowling of the University of Regensburg, Germany and Carlos Zorrinho, Portuguese coordinator of the Lisbon Strategy and the Technological Plan. The following organizations supported the conference through financial or other contributions, and we would like to thank them for their engagement.
With the rise of mobile and wireless technologies, more sustainable networks are necessary to support communication. These next-generation networks can now be utilized to extend the growing era of the Internet of Things. Enabling Technologies and Architectures for Next-Generation Networking Capabilities is an essential reference source that explores the latest research and trends in large-scale 5G technologies deployment, software-defined networking, and other emerging network technologies. Featuring research on topics such as data management, heterogeneous networks, and spectrum sensing, this book is ideally designed for computer engineers, technology developers, network administrators and researchers, professionals, and graduate-level students seeking coverage on current and future network technologies.
Multisensor fusion systems are only practical if the algorithms used are practical and effective, and if there is efficient database support. The first part of this book discusses a wide range of issues related to the development of robust, context-sensitive, and efficient data fusion algorithms. The second part addresses database requirements, structures, and issues related to achieving overall computational efficiency. Featuring highly accessible notation, the processing model and database issues presented in the text are aimed at system developers working in sensor fusion, automatic target recognition, multiple-target tracking, robotic control, automated image understanding, and large-scale integration and fabrication.
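The simplest data-fusion rule underlying systems like those described above combines independent sensor estimates of one quantity by inverse-variance weighting, which gives the minimum-variance unbiased linear estimate. The sketch below is our own illustration of that textbook rule, not an algorithm taken from this book; the sensor readings are assumed example values.

```python
# Inverse-variance fusion of independent estimates of the same quantity.

def fuse(estimates):
    """estimates: list of (value, variance) pairs from independent sensors."""
    weights = [1.0 / var for (_, var) in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    variance = 1.0 / total        # fused variance is below every input variance
    return value, variance

# Two sensors measuring the same range: a precise one and a noisy one.
value, variance = fuse([(10.0, 1.0), (12.0, 4.0)])
print(value, variance)            # 10.4, 0.8 - pulled toward the precise sensor
```

Practical fusion systems of the kind the book targets extend this idea to vector states and correlated errors (e.g. Kalman filtering), but the weighting principle is the same.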
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies and changes in operational and analytical applications over the last decade have given rise to the unification of these systems, which can be of benefit for both workloads. Research and industry have reacted and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because of the separation of OLTP and OLAP systems, existing benchmarks are only focused on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems will be needed as well. Based on the examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload and is applied to analyze the impact of typically used optimizations in the historically separate OLTP and OLAP domains in mixed-workload scenarios.
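The core measurement such a mixed-workload benchmark makes can be sketched in a few lines: run OLTP-style point updates and an OLAP-style aggregate against the same tables and time each. This is our own minimal illustration using SQLite, not the benchmark defined in the book; the table, row count, and query mix are assumed for the example.

```python
# Timing an OLTP-style and an OLAP-style workload on the same data.
import random
import sqlite3
import time

random.seed(1)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
db.executemany("INSERT INTO orders (amount) VALUES (?)",
               [(random.uniform(1, 100),) for _ in range(10_000)])

t0 = time.perf_counter()
for _ in range(1_000):                     # OLTP: short point transactions
    oid = random.randrange(1, 10_001)
    db.execute("UPDATE orders SET amount = amount + 1 WHERE id = ?", (oid,))
oltp_s = time.perf_counter() - t0

t0 = time.perf_counter()                   # OLAP: full-table aggregate scan
total, = db.execute("SELECT SUM(amount) FROM orders").fetchone()
olap_s = time.perf_counter() - t0

print(f"OLTP 1000 updates: {oltp_s:.4f}s  OLAP scan: {olap_s:.4f}s")
```

A real hybrid benchmark additionally interleaves the two workloads concurrently and varies their ratio, since the interesting question is how the analytical queries degrade transactional throughput and vice versa.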