The book introduces a hot topic of novel and emerging computing paradigms and architectures: computation by travelling waves in reaction-diffusion media. A reaction-diffusion computer is a massively parallel computing device in which the micro-volumes of the chemical medium act as elementary few-bit processors, and chemical species diffuse and react in parallel. In a reaction-diffusion computer both the data and the results of the computation are encoded as concentration profiles of the reagents, or as local disturbances of concentrations, whilst the computation per se is performed via the spreading and interaction of waves caused by the local disturbances. The monograph brings together the results of a decade-long study into designing experimental and simulated prototypes of reaction-diffusion computing devices for image processing, path planning, robot navigation, computational geometry, logics and artificial intelligence. The book is unique in giving a comprehensive presentation of the theoretical and experimental foundations, cutting-edge computation techniques, chemical laboratory experimental setups and hardware implementation technology employed in the development of novel nature-inspired computing devices.
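To give a flavour of the paradigm, the travelling-wave behaviour described above can be sketched with a minimal one-dimensional excitable-medium model. This is an illustrative sketch only; the bistable reaction term, grid size and all parameters below are my assumptions, not the book's chemistry:

```python
import numpy as np

def rd_step(u, D=0.5, k=1.0, theta=0.3, dt=0.1):
    """One explicit Euler step of u_t = D*u_xx + k*u*(1-u)*(u-theta),
    a minimal bistable reaction-diffusion model (illustrative only).
    Grid spacing dx = 1, periodic boundaries."""
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)  # discrete Laplacian
    return u + dt * (D * lap + k * u * (1 - u) * (u - theta))

# encode the "data" as a local disturbance of concentration
u = np.zeros(100)
u[48:52] = 1.0
for _ in range(200):
    u = rd_step(u)
# the disturbance has spread outward as a travelling front
```

Real reaction-diffusion computers replace the cubic reaction term with actual chemical kinetics, but the encoding of data as a local concentration disturbance that propagates is the same idea.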
Action theory is the object of growing attention in a variety of scientific disciplines, and this is the first volume to offer a synthetic view of the range of approaches possible in the topic. The volume focuses on the nexus of formal action theory with a startlingly diverse set of subjects, which range from logic, linguistics, artificial intelligence and automata theory to jurisprudence, deontology and economics. It covers semantic, mathematical and logical aspects of action, showing how the problem of action breaks the boundaries of traditional branches of logic located in syntactics and semantics and now lies on the borderline between logical pragmatics and praxeology. The chapters here focus on specialized tasks in formal action theory, beginning with a thorough description and formalization of the language of action and moving through material on the differing models of action theory to focus on probabilistic models, the relations of formal action theory to deontic logic, and its key applications in algorithmic and programming theory. The coverage thus fills a notable lacuna in the literary corpus and offers solid formal underpinning in cognitive science by approaching the problem of cognition as a composite action of mind.
This book focuses on the basic control and filtering synthesis problems for discrete-time switched linear systems under time-dependent switching signals. Chapter 1, as an introduction to the book, gives the background and motivations of switched systems, the definitions of the typical time-dependent switching signals, the differences from and links to other types of systems with hybrid characteristics, and a literature review mainly on the control and filtering of the underlying systems. By summarizing the multiple Lyapunov-like functions (MLFs) approach, in which different requirements are imposed on the comparison of Lyapunov function values at switching instants, a series of methodologies is developed for the issues of stability and stabilization, and l2-gain performance or tube-based robustness for l disturbance, respectively, in Chapters 2 and 3. Chapters 4 and 5 are devoted to the control and filtering problems for time-dependent switched linear systems with either polytopic uncertainties or measurable time-varying parameters under different senses of disturbance. The asynchronous switching problem, where there is a time lag between the switching of the currently activated system mode and the controller/filter to be designed, is investigated in Chapter 6. Systems with various time delays under typical time-dependent switching signals are addressed in Chapter 7.
Recent decades have witnessed the thriving development of new mathematical, computational and theoretical approaches, such as bioinformatics and neuroinformatics, to tackle fundamental issues in biology. These approaches focus no longer on individual units, such as nerve cells or genes, but rather on dynamic patterns of interactions between them. This volume explores the concept in full, featuring contributions from a global group of contributors, many of whom are pre-eminent in their field.
This book gives senior undergraduate and beginning graduate students and researchers in computer vision, applied mathematics, computer graphics, and robotics a self-contained introduction to the geometry of 3D vision; that is, the reconstruction of 3D models of objects from a collection of 2D images. Following a brief introduction, Part I provides background material for the rest of the book. The two fundamental transformations, namely rigid-body motion and perspective projection, are introduced, and image formation and feature extraction are discussed. Part II covers the classic theory of two-view geometry based on the so-called epipolar constraint. Part III shows that a more proper tool for studying the geometry of multiple views is the so-called rank condition on the multiple-view matrix. Part IV develops practical reconstruction algorithms step by step and discusses possible extensions of the theory. Exercises are provided at the end of each chapter. Software for the examples and algorithms is available on the author's website.
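The epipolar constraint behind two-view geometry can be verified numerically in a few lines. The sketch below is my own toy example (the rotation, translation and 3D point are assumed values): it projects one 3D point into two views related by a rigid motion and checks that x2ᵀ E x1 = 0 for the essential matrix E = [t]× R:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# toy rigid motion: rotation about z by 10 degrees plus a translation
a = np.deg2rad(10)
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([1.0, 0.2, 0.0])

X = np.array([0.5, -0.3, 4.0])   # a 3D point in the first camera frame
x1 = X / X[2]                    # normalized image coordinates, view 1
Y = R @ X + t                    # same point in the second camera frame
x2 = Y / Y[2]                    # normalized image coordinates, view 2

E = skew(t) @ R                  # essential matrix
residual = x2 @ E @ x1           # epipolar constraint: vanishes exactly
```

The residual is zero up to floating-point noise for any point and any rigid motion, which is what makes the constraint usable for estimating E from point correspondences.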
This proceedings volume collects review articles that summarize research conducted at the Munich Centre of Advanced Computing (MAC) from 2008 to 2012. The articles address the increasing gap between what should be possible in Computational Science and Engineering due to recent advances in algorithms, hardware, and networks, and what can actually be achieved in practice; they also examine novel computing architectures, where computation itself is a multifaceted process, with hardware awareness or ubiquitous parallelism due to many-core systems being just two of the challenges faced. Topics cover both the methodological aspects of advanced computing (algorithms, parallel computing, data exploration, software engineering) and cutting-edge applications from the fields of chemistry, the geosciences, civil and mechanical engineering, etc., reflecting the highly interdisciplinary nature of the Munich Centre of Advanced Computing.
A practical introduction to fundamentals of computer arithmetic.
Computer arithmetic is one of the foundations of computer science and engineering. Designed as both a practical reference for engineers and computer scientists and an introductory text for students of electrical engineering and the computer and mathematical sciences, Arithmetic and Logic in Computer Systems describes the various algorithms and implementations in computer arithmetic and explains the fundamental principles that guide them. Focusing on promoting an understanding of the concepts, Professor Mi Lu addresses:
To assist the reader, alternative methods are examined and thorough explanations of the material are supplied, along with discussions of the reasoning behind the theory. Ample examples and problems help the reader master the concepts.
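As an example of the kind of algorithm such a text covers, here is a sketch (mine, not taken from the book) of ripple-carry addition, the textbook full-adder chain, on little-endian bit lists:

```python
def ripple_add(a_bits, b_bits):
    """Add two equal-length little-endian bit lists with a ripple carry,
    mimicking a hardware full-adder chain; returns (sum bits, carry-out)."""
    out, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        s = a ^ b ^ carry                     # full-adder sum bit
        carry = (a & b) | (carry & (a ^ b))   # full-adder carry-out
        out.append(s)
    return out, carry

def to_bits(n, width):
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(b << i for i, b in enumerate(bits))

s, c = ripple_add(to_bits(13, 8), to_bits(7, 8))   # 13 + 7 = 20, no carry-out
```

A carry-lookahead adder computes the same carries in parallel instead of sequentially; the per-bit sum and carry equations are identical.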
Increasing system complexity has created a pressing need for better design tools and associated methodologies and languages for meeting stringent time-to-market and cost constraints. Platform-centric and platform-based system-on-chip (SoC) design methodologies, based on reuse of software and hardware functionality, have also gained increasing exposure and usage within the Electronic System-Level (ESL) design communities. The book proposes a new methodology for realizing platform-centric design of complex systems, and presents a detailed plan for its implementation. The proposed plan allows component vendors, system integrators and product developers to collaborate effectively and efficiently to create complex products within budget and schedule constraints. This book focuses more on the use of platforms in the design of products than on the design of platforms themselves. Platform-centric design is not for everyone, as some may feel that it does not allow them to differentiate their offering from competitors to a significant degree. However, its proponents may claim that the time-to-market and cost advantages of platform-centric design more than compensate for any drawbacks.
Introduction: The International Federation for Information Processing (IFIP) is a non-profit umbrella organization for national societies working in the field of information processing. It was founded in 1960 under the auspices of UNESCO. It is organized into several technical committees. This book represents the proceedings of the 2008 conference of technical committee 8 (TC8), which covers the field of information systems. TC8 aims to promote and encourage the advancement of research and practice of concepts, methods, techniques and issues related to information systems in organisations. TC8 has established eight working groups covering the following areas: design and evaluation of information systems; the interaction of information systems and the organization; decision support systems; e-business information systems: multi-disciplinary research and practice; information systems in public administration; smart cards, technology, applications and methods; and enterprise information systems. Further details of the technical committee and its working groups can be found on our website (ifiptc8.dsi.uminho.pt). This conference was part of IFIP's World Computer Congress in Milan, Italy, which took place 7-10 September 2008. The occasion celebrated the 32nd anniversary of IFIP TC8. The call for papers invited researchers, educators, and practitioners to submit papers and panel proposals that advance concepts, methods, techniques, tools, issues, education, and practice of information systems in organizations. Thirty-one submissions were received.
Gatewatching: Collaborative Online News Production is the first comprehensive study of the latest wave of online news publications. The book investigates the collaborative publishing models of key news Websites, ranging from the worldwide Indymedia network to the massively successful technology news site Slashdot, and further to the multitude of Weblogs that have emerged in recent years. Building on collaborative approaches borrowed from the open source software development community, this book illustrates how gatewatching provides an alternative to gatekeeping and other traditional journalistic models of reporting, and has enabled millions of users around the world to participate in the online news publishing process.
Getting organizations going is one thing. Stopping them is another. This book examines how and why organizations become trapped in disastrous decisions. The focal point is Project Taurus, an IT venture commissioned by the London Stock Exchange and supported by numerous City institutions. Taurus was intended to transform London's antiquated manual share settlement procedures into a state-of-the-art electronic system that would be the envy of the world. The project collapsed after three years' intensive work and investments totalling almost GBP500 million. This book is an in-depth study of escalation in decision making. The author has interviewed a number of people who played a key role and presents a most readable account of what actually happened. At the same time she sets the case in the broader literature of decision making.
This book provides the basic theory, techniques, and algorithms of modern cryptography that are applicable to network and cyberspace security. It consists of nine main chapters: Chapter 1 provides the basic concepts and ideas of cyberspace and cyberspace security, while Chapters 2 and 3 provide an introduction to mathematical and computational preliminaries, respectively. Chapter 4 discusses the basic ideas and systems of secret-key cryptography, whereas Chapters 5, 6, and 7 discuss the basic ideas and systems of public-key cryptography based on integer factorization, discrete logarithms, and elliptic curves, respectively. Quantum-safe cryptography is presented in Chapter 8, and offensive cryptography, particularly cryptovirology, is covered in Chapter 9. This book can be used as a secondary text for final-year undergraduate students and first-year postgraduate students for courses in computer, network, and cyberspace security. Researchers and practitioners working in cyberspace security and network security will also find this book useful as a reference.
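To illustrate the discrete-logarithm side of public-key cryptography, here is a toy Diffie-Hellman key exchange; the prime, generator and private exponents below are made-up teaching values, orders of magnitude too small for real use, where 2048-bit groups or elliptic curves are standard:

```python
# toy Diffie-Hellman over a small prime field (illustrative parameters only)
p, g = 467, 2                  # public prime modulus and generator

a_priv, b_priv = 153, 197      # each party's secret exponent
A = pow(g, a_priv, p)          # Alice publishes g^a mod p
B = pow(g, b_priv, p)          # Bob publishes g^b mod p

k_alice = pow(B, a_priv, p)    # Alice computes (g^b)^a mod p
k_bob = pow(A, b_priv, p)      # Bob computes (g^a)^b mod p
assert k_alice == k_bob        # both arrive at the shared secret g^(ab) mod p
```

Security rests on the difficulty of recovering a_priv from A, the discrete logarithm problem; an eavesdropper sees only p, g, A and B.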
This is a collection of classic research papers on the Dempster-Shafer theory of belief functions. The book is the authoritative reference in the field of evidential reasoning and an important archival reference in a wide range of areas including uncertainty reasoning in artificial intelligence and decision making in economics, engineering, and management. The book includes a foreword reflecting the development of the theory in the last forty years.
Smart cards have recently emerged as a key computer network and Internet security technology. These plastic cards contain an embedded microprocessor, allowing them to be programmed to perform specific duties. This extensively updated, second edition of the popular Artech House book, Smart Card Security and Applications, offers a current overview of the ways smart cards address the computer security issues of today's varied applications. Brand new discussions on multi-application operating systems, computer networks, and the Internet are included to keep technical and business professionals abreast of the very latest developments in this field. The book provides technical details on the newest protection mechanisms, features a discussion on the effects of recent attacks, and presents a clear methodology for solving unique security problems.
In 1984, Working Group 8.2 of the International Federation for Information Processing (IFIP) threw down the gauntlet at its Manchester conference, challenging the traditionalist orthodoxy with its uncommon research approaches and topics. Manchester 1984, followed by research methods conferences in Copenhagen (1990) and Philadelphia (1997), marked the growing legitimacy of the linguistic and qualitative turns in Information Systems research and played a key role in making qualitative methods a respected part of IS research. As evidenced by the papers in this volume, Working Group 8.2 conferences showcase fresh thinking, provocative sessions, and intellectual stimulation. The spirited, at times boisterous, and always enlivening debate has turned WG8.2 conferences into life-changing and discipline-changing inspirational events.
CSIE 2011 is an international scientific congress for distinguished scholars engaged in scientific, engineering and technological research, dedicated to building a platform for exploring and discussing the future of Computer Science and Information Engineering with existing and potential application scenarios. The congress has been held twice, first in Los Angeles, USA, and then in Changchun, China, each time attracting a large number of researchers from all over the world. The congress has developed a spirit of cooperation that leads to new friendships, addressing a wide variety of ongoing problems in this vibrant area of technology and fostering collaboration around the world. CSIE 2011 received 2483 full paper and abstract submissions from 27 countries and regions around the world. Through a rigorous peer review process, all submissions were refereed based on their quality of content, level of innovation, significance, originality and legibility. Ultimately, 688 papers were accepted for the international congress proceedings.
This book is about wireless local area networks (WLANs) based upon the IEEE 802.11 standards. It has three primary objectives:
* To introduce the principles of 802.11 wireless networks and show how to configure equipment in order to implement various network solutions.
* To provide an understanding of the security implications of wireless networks and demonstrate how vulnerabilities can be mitigated.
* To introduce the underlying 802.11 protocols and build mathematical models in order to analyse performance in a WLAN environment.
The book is aimed at industry professionals as well as undergraduate and graduate level students. It is intended as a companion for a university course on wireless networking. A practical approach is adopted in this book; examples are provided throughout, supported by detailed instructions. We cover a number of wireless vendors; namely, Cisco's Aironet, Alcatel-Lucent's OmniAccess and Meru Networks. While separate vendors, all three systems have a Cisco IOS-like command-line interface. The GNU/Linux operating system is used extensively throughout this book.
For the first time, advances in semiconductor manufacturing do not lead to a corresponding increase in performance. At 65 nm and below it is predicted that only a small portion of the performance increase will be attributable to shrinking geometries, while the lion's share is due to innovative processor architectures. To substantiate this assertion it is instructive to look at major drivers of the semiconductor industry: wireless communications and multimedia. Both areas are characterized by an exponentially increasing demand for computational power, which cannot be provided in an energy-efficient manner by traditional processor architectures. Today's applications in wireless communications and multimedia require highly specialized and optimized architectures. New software tools and a sophisticated methodology above RTL are required to answer the challenges of designing an optimized application-specific processor (ASIP). This book offers an automated and fully integrated implementation flow and compares it to common implementation practice. Case studies emphasise that neither the architectural advantages nor the design space of ASIPs is sacrificed for an automated implementation. Realizing a building block which fulfils the requirements on programmability and computational power is now efficiently possible for the first time. Optimized ASIP Synthesis from Architecture Description Language Models inspires hardware designers as well as application engineers to design powerful ASIPs that will make their SoC designs unique.
This book contains papers presented at the fifth and sixth Teraflop Workshop. It presents the state-of-the-art in high performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and specifically the future of vector-based systems and heterogeneous architectures. It covers computational fluid dynamics, fluid-structure interaction, physics, chemistry, astrophysics, and climate research.
Computer simulation and mathematical modelling are the most important approaches in the quantitative analysis of the diffusive processes fundamental to many physical, chemical, biological, and geological systems. This comprehensive text/reference addresses the key issues in the "Modelling and Simulation of Diffusive Processes" from a broad range of different application areas. Applying an holistic approach, the book presents illuminating viewpoints drawn from an international selection of experts across a wide spectrum of disciplines, from computer science, mathematics and engineering, to natural resource management, environmental sciences, applied geo-sciences, agricultural sciences, and theoretical medicine. Topics and features: presents a detailed introduction to diffusive processes and modelling; discusses diffusion and molecular transport in living cells, and suspended sediment in open channels; examines the mathematical modelling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media, and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modelling of nitrogen fate and transport at the sediment-water interface, and groundwater flow in unconfined aquifers; investigates two-dimensional solute transport from a varying pulse type point source, and futile cycles in metabolic flux modelling; studies contaminant concentration prediction along unsteady groundwater flow, and modelling synovial fluid flow in human joints; explores the modelling of soil organic carbon, and crop growth simulation. This interdisciplinary volume will be invaluable to researchers, lecturers and graduate students from such diverse fields as computer science, mathematics, hydrology, agriculture and biology.
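In the simplest setting, the diffusive processes treated in such a volume reduce to the heat equation. The following sketch (scheme and parameters are illustrative choices of mine, not drawn from the book) applies the classical explicit finite-difference method to 1D diffusion of a solute pulse:

```python
import numpy as np

def diffuse(c, D, dx, dt, steps):
    """Explicit FTCS scheme for c_t = D * c_xx with zero-flux boundaries.
    Stable while D*dt/dx**2 <= 0.5, the standard textbook condition."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme would be unstable"
    c = c.copy()
    for _ in range(steps):
        # update interior points from the previous time level
        c[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0], c[-1] = c[1], c[-2]   # zero-flux (Neumann) boundaries
    return c

c0 = np.zeros(51)
c0[25] = 1.0                        # unit pulse of solute at the centre
c1 = diffuse(c0, D=1.0, dx=1.0, dt=0.25, steps=100)
# total mass is conserved while the pulse spreads and flattens
```

The zero-flux boundaries make the scheme mass-conserving, which is a useful sanity check for any of the transport models the book surveys.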
Individuals, businesses, organizations, and countries all benefit from having access to data. People who generate data do it voluntarily, forming their habits, patterns, and behaviors in the process. Their psychological characteristics will be better understood as a result of the data that they generate, allowing them to make intelligent decisions. Organizations are motivated by the desire to collect and analyze as much data as possible from the general public or future customers in order to better understand their psychological features and influence them to purchase their products or services. As a result, there has been a great deal of debate concerning the use of data from the perspectives of individuals, organizations, the public, and the government. Digital Psychology's Impact on Business and Society considers the phenomena of digital psychology and society in general and evaluates individual strategies and those of businesses, organizations, and even nations. Covering topics such as big data, marketing, social media, and social computing, this reference work is ideal for policymakers, psychologists, business owners, managers, industry professionals, researchers, scholars, practitioners, academicians, instructors, and students.
This dictionary contains 13,000 terms with more than 4,000 cross-references used in the following fields: automation, technology of management and regulation, computing machines and data processing, computer control, automation of industry, laser technology, theory of information and theory of signals, theory of algorithms and programming, philosophical bases of cybernetics, cybernetics and mathematical methods.
This book constitutes the thoroughly refereed post conference proceedings of the 6th IFIP WG 9.2, 9.6/11.7, 11.4, 11.6/PrimeLife International Summer School, held in Helsingborg, Sweden, in August 2010. The 27 revised papers were carefully selected from numerous submissions during two rounds of reviewing. They are organized in topical sections on terminology, privacy metrics, ethical, social, and legal aspects, data protection and identity management, eID cards and eID interoperability, emerging technologies, privacy for eGovernment and AAL applications, social networks and privacy, privacy policies, and usable privacy.
This book contains extended and revised versions of the best papers presented at the 18th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2010, held in Madrid, Spain, in September 2010. The 14 papers included in the book were carefully reviewed and selected from the 52 full papers presented at the conference. The papers cover a wide variety of excellence in VLSI technology and advanced research. They address the current trend toward increasing chip integration and technology process advancements bringing about stimulating new challenges both at the physical and system-design levels, as well as in the test of these systems.