For the first time, advances in semiconductor manufacturing do not lead to a corresponding increase in performance. At 65 nm and below it is predicted that only a small portion of the performance increase will be attributed to shrinking geometries, while the lion's share is due to innovative processor architectures. To substantiate this assertion it is instructive to look at major drivers of the semiconductor industry: wireless communications and multimedia. Both areas are characterized by an exponentially increasing demand for computational power, which cannot be provided in an energy-efficient manner by traditional processor architectures. Today's applications in wireless communications and multimedia require highly specialized and optimized architectures. New software tools and a sophisticated methodology above RTL are required to answer the challenges of designing an optimized application-specific processor (ASIP). This book offers an automated and fully integrated implementation flow and compares it to common implementation practice. Case studies emphasise that neither the architectural advantages nor the design space of ASIPs are sacrificed for an automated implementation. Realizing a building block which fulfils the requirements on programmability and computational power is now efficiently possible for the first time. Optimized ASIP Synthesis from Architecture Description Language Models inspires hardware designers as well as application engineers to design powerful ASIPs that will make their SoC designs unique.
The papers contained in this volume were presented at the 5th IFIP International Conference on Theoretical Computer Science (IFIP TCS), 7-10 September 2008, Milan, Italy. TCS is a biennial conference. The first conference of the series was held in Sendai (Japan, 2000), followed by Montreal (Canada, 2002), Toulouse (France, 2004) and Santiago (Chile, 2006). TCS is organized by IFIP TC1 (Technical Committee 1: Foundations of Computer Science) and Working Group 2.2 of IFIP TC2 (Technical Committee 2: Software: Theory and Practice). TCS 2008 was part of the 20th IFIP World Computer Congress (WCC 2008), constituting the TC1 Track of WCC 2008. The contributed papers were selected from 36+45 submissions from altogether 30 countries. A total of 14+16 submissions were accepted as full papers. Papers in this volume are original contributions in two general areas: Track A: Algorithms, Complexity and Models of Computation; and Track B: Logic, Semantics, Specification and Verification. The conference also included seven invited presentations, from Luca Cardelli, Thomas Ehrhard, Javier Esparza, Antonio Restivo, Tim Roughgarden, Grzegorz Rozenberg and Avraham Trakhtman. These presentations are included (except one) in this volume. In particular, Luca Cardelli, Javier Esparza, Antonio Restivo, Tim Roughgarden and Avraham Trakhtman accepted our invitation to write full papers related to their talks.
Symbolic Integration I is destined to become the standard reference work in the field. Manuel Bronstein is a leading expert on this topic and his book is the first to treat the subject both comprehensively and in sufficient detail - incorporating new results along the way. The book addresses mathematicians and computer scientists interested in symbolic computation, developers and programmers of computer algebra systems as well as users of symbolic integration methods. Many algorithms are given in pseudocode ready for immediate implementation, making the book equally suitable as a textbook for lecture courses on symbolic integration. This second edition offers a new chapter on parallel integration, a number of other improvements and a couple of additional exercises. From the reviews of the first edition: "... The writing is excellent, and the author provides a clear and coherent treatment of the problem of symbolic integration of transcendental functions." F. Winkler, Computing Reviews 1997
This book contains extended and revised versions of the best papers presented at the 18th IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration, VLSI-SoC 2010, held in Madrid, Spain, in September 2010. The 14 papers included in the book were carefully reviewed and selected from the 52 full papers presented at the conference. The papers cover a wide range of advanced research in VLSI technology. They address the current trend toward increasing chip integration and technology process advancements bringing about stimulating new challenges both at the physical and system-design levels, as well as in the test of these systems.
Smart cards have recently emerged as a key computer network and Internet security technology. These plastic cards contain an embedded microprocessor, allowing them to be programmed to perform specific duties. This extensively updated, second edition of the popular Artech House book, Smart Card Security and Applications, offers a current overview of the ways smart cards address the computer security issues of today's varied applications. Brand new discussions on multi-application operating systems, computer networks, and the Internet are included to keep technical and business professionals abreast of the very latest developments in this field. The book provides technical details on the newest protection mechanisms, features a discussion on the effects of recent attacks, and presents a clear methodology for solving unique security problems.
This book presents a comprehensive introduction to Internetware, covering aspects ranging from the fundamental principles and engineering methodologies to operational platforms, quality measurements and assurance and future directions. It also includes guidelines and numerous representative real-world case studies that serve as an invaluable reference resource for software engineers involved in the development of Internetware applications. Providing a detailed analysis of current trends in modern software engineering in the Internet, it offers an essential blueprint and an important contribution to the research on software engineering and systems for future Internet computing.
This volume contains 27 contributions to the Fourth Russian-German Advanced Research Workshop on Computational Science and High Performance Computing, presented in October 2009 in Freiburg, Germany. The workshop was organized jointly by the High Performance Computing Center Stuttgart (HLRS), the Institute of Computational Technologies of the Siberian Branch of the Russian Academy of Sciences (ICT SB RAS) and the Section of Applied Mathematics of the University of Freiburg (IAM Freiburg). The contributions range from computer science, mathematics and high performance computing to applications in mechanical and aerospace engineering. They show a wealth of theoretical work and simulation experience with a potential of bringing together theoretical mathematical modelling and the usage of high performance computing systems, presenting the state of the art of computational technologies.
One of the world's leading problems in the field of national security is the protection of borders and borderlands. This book addresses multiple issues on advanced innovative methods of multi-level control of both unmanned ground vehicles (UGVs) and aerial drones (UAVs). Those objects, combined with innovative algorithms, become autonomous objects capable of patrolling chosen borderland areas by themselves and automatically informing the system operator about the potential location of a detected incident. This is achieved by using sophisticated methods of generating collision-free trajectories for those types of objects and enabling automatic integration of both ground and aerial unmanned vehicles. The topics included in this book also cover the presentation of complete information and communication technology (ICT) systems capable of control, observation and detection of various types of incidents and threats. This book is a valuable source of information for constructors and developers of such solutions for uniformed services. Scientists and researchers involved in computer vision, image processing, data fusion, control algorithms or ICT can find many valuable suggestions and solutions. Multiple challenges for such systems are also presented.
HACKING: Ultimate Hacking for Beginners. Hacking is a widespread problem that has compromised the records of individuals, major corporations, and even the federal government. This book lists the various ways hackers can breach the security of an individual's or an organization's data and network. Its information is for learning purposes only, and the hacking techniques should not be tried, because it is a crime to hack someone's personal details without his or her consent. In HACKING: Ultimate Hacking for Beginners you will learn:
- The advantages and disadvantages of Bluetooth technology
- The tools and software used for Bluetooth hacking, with a brief description of each
- The four primary methods of hacking a website, with a brief explanation of each
- Seven different types of spamming, with a focus on email spamming and how to prevent it
- Eight common types of security breaches
- How the process of hacking computers works and how to protect against it
- Using CAPTCHA to prevent hacking
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, control, and finance.
Recent decades have witnessed the thriving development of new mathematical, computational and theoretical approaches, such as bioinformatics and neuroinformatics, to tackle fundamental issues in biology. These approaches focus no longer on individual units, such as nerve cells or genes, but rather on dynamic patterns of interactions between them. This volume explores the concept in full, featuring contributions from a global group of researchers, many of whom are pre-eminent in their field.
This book brings together historical notes, reviews of research developments, fresh ideas on how to make VC (Vapnik-Chervonenkis) guarantees tighter, and new technical contributions in the areas of machine learning, statistical inference, classification, algorithmic statistics, and pattern recognition. The contributors are leading scientists in domains such as statistics, mathematics, and theoretical computer science, and the book will be of interest to researchers and graduate students in these domains.
This book discusses major milestones in Rohit Jivanlal Parikh's scholarly work. Highlighting the transition in Parikh's interest from formal languages to natural languages, and how he approached Wittgenstein's philosophy of language, it traces the academic trajectory of a brilliant scholar whose work opened up various new avenues in research. This volume is part of Springer's book series Outstanding Contributions to Logic, and honours Rohit Parikh and his works in many ways. Parikh is a leader in the realm of ideas, offering concepts and definitions that enrich the field and lead to new research directions. Parikh has contributed to a variety of areas in logic, computer science and game theory. In mathematical logic his contributions have been in recursive function theory, proof theory and non-standard analysis; in computer science, in the areas of modal, temporal and dynamic logics of programs and semantics of programs, as well as logics of knowledge; in artificial intelligence in the area of belief revision; and in game theory in the formal analysis of social procedures, with a strong undercurrent of philosophy running through all his work. This is not a collection of articles limited to one theme, or even directly connected to specific works by Parikh; instead, all papers are inspired and influenced by Parikh in some way, adding structures to and enriching "Parikh-land". The book presents a brochure-like overview of Parikh-land before providing an "introductory video" on the sights and sounds that you experience when reading the book.
This dictionary contains 13,000 terms with more than 4,000 cross-references used in the following fields: automation, technology of management and regulation, computing machines and data processing, computer control, automation of industry, laser technology, theory of information and theory of signals, theory of algorithms and programming, philosophical bases of cybernetics, cybernetics and mathematical methods.
Graphs are widely used to represent structural information in the form of objects and connections between them. Graph transformation is the rule-based manipulation of graphs, an increasingly important concept in computer science and related fields. This is the first textbook treatment of the algebraic approach to graph transformation, based on algebraic structures and category theory. Part I is an introduction to the classical case of graph and typed graph transformation. In Part II basic and advanced results are first shown for an abstract form of replacement systems, so-called adhesive high-level replacement systems based on category theory, and are then instantiated to several forms of graph and Petri net transformation systems. Part III develops typed attributed graph transformation, a technique of key relevance in the modeling of visual languages and in model transformation. Part IV contains a practical case study on model transformation and a presentation of the AGG (attributed graph grammar) tool environment. Finally the appendix covers the basics of category theory, signatures and algebras. The book addresses both research scientists and graduate students in computer science, mathematics and engineering.
Computer science is the science of the future, and already underlies every facet of business and technology, and much of our everyday lives. In addition, it will play a crucial role in the science of the 21st century, which will be dominated by biology and biochemistry, similar to the role of mathematics in the physical sciences of the 20th century. In this award-winning best-seller, the author and his co-author focus on the fundamentals of computer science, which revolve around the notion of the "algorithm." They discuss the design of algorithms, and their efficiency and correctness, the inherent limitations of algorithms and computation, quantum algorithms, concurrency, large systems and artificial intelligence. Throughout, the authors, in their own words, stress the 'fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'. This version of the book is published to celebrate 25 years since its first edition, and in honor of the Alan M. Turing Centennial year. Turing was a true pioneer of computer science, whose work forms the underlying basis of much of this book.
This book is written for anyone who is interested in how a field of research evolves and the fundamental role of understanding the uncertainties involved at different levels of analysis, ranging from macroscopic views to meso- and microscopic ones. We introduce a series of computational and visual analytic techniques, from research areas such as text mining, deep learning, information visualization and science mapping, such that readers can apply these tools to the study of a subject matter of their choice. In addition, we place this diverse set of methods in an integrative context that draws upon insights from philosophical, sociological, and evolutionary theories of what drives the advances of science, such that the readers of the book can guide their own research with their enriched theoretical foundations. Scientific knowledge is complex. A subject matter is typically built on its own set of concepts, theories, methodologies and findings, discovered by generations of researchers and practitioners. Scientific knowledge, as known to the scientific community as a whole, experiences constant changes. Some changes are long-lasting, whereas others may be short-lived. How can we keep abreast of the state of the art as science advances? How can we effectively and precisely convey the status of the current science to the general public as well as scientists across different disciplines? The study of scientific knowledge in general has been overwhelmingly focused on scientific knowledge per se. In contrast, the status of scientific knowledge at various levels of granularity has been largely overlooked. This book aims to highlight the role of uncertainties in developing a better understanding of the status of scientific knowledge at a particular time, and how its status evolves over the course of the development of research.
Furthermore, we demonstrate how the knowledge of the types of uncertainties associated with scientific claims serves as an integral and critical part of our domain expertise.
This book presents some recent works on the application of Soft Computing techniques in information access on the World Wide Web. The book comprises 15 chapters from internationally known researchers and is divided into four parts reflecting the areas of research of the presented works: Document Classification, Semantic Web, Web Information Retrieval and Web Applications. This book demonstrates that Web Information Retrieval is a stimulating area of research where Soft Computing technologies can be applied satisfactorily.
This text centers around three main subjects. The first is the concept of modularity and independence in classical logic and nonmonotonic and other nonclassical logic, and the consequences on syntactic and semantical interpolation and language change. In particular, we will show the connection between interpolation for nonmonotonic logic and manipulation of an abstract notion of size. Modularity is essentially the ability to put partial results achieved independently together for a global result. The second aspect of the book is the authors' uniform picture of conditionals, including many-valued logics and structures on the language elements themselves and on the truth value set. The third topic explained by the authors is neighbourhood semantics, their connection to independence, and their common points and differences for various logics, e.g., for defaults and deontic logic, for the limit version of preferential logics, and for general approximation. The book will be of value to researchers and graduate students in logic and theoretical computer science.
Networks have become nearly ubiquitous and increasingly complex, and their support of modern enterprise environments has become fundamental. Accordingly, robust network management techniques are essential to ensure optimal performance of these networks. This monograph treats the application of numerous graph-theoretic algorithms to a comprehensive analysis of dynamic enterprise networks. Network dynamics analysis yields valuable information about network performance, efficiency, fault prediction, cost optimization, indicators and warnings. Based on many years of applied research of generic network dynamics, this work covers a number of elegant applications (including many new and experimental results) of traditional graph theory algorithms and techniques to computationally tractable network dynamics analysis to motivate network analysts, practitioners and researchers alike. The material is also suitable for graduate courses addressing state-of-the-art applications of graph theory in analysis of dynamic communication networks, dynamic databasing, and knowledge management.
Confronting the digital revolution in academia, this book examines the application of new computational techniques and visualisation technologies in the Arts & Humanities. Uniting differing perspectives, leading and emerging scholars discuss the theoretical and practical challenges that computation raises for these disciplines.
Computer simulation and mathematical modelling are the most important approaches in the quantitative analysis of the diffusive processes fundamental to many physical, chemical, biological, and geological systems. This comprehensive text/reference addresses the key issues in the "Modelling and Simulation of Diffusive Processes" from a broad range of different application areas. Applying a holistic approach, the book presents illuminating viewpoints drawn from an international selection of experts across a wide spectrum of disciplines, from computer science, mathematics and engineering, to natural resource management, environmental sciences, applied geo-sciences, agricultural sciences, and theoretical medicine. Topics and features:
- presents a detailed introduction to diffusive processes and modelling
- discusses diffusion and molecular transport in living cells, and suspended sediment in open channels
- examines the mathematical modelling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics
- reviews thermal characterization of non-homogeneous media, and scale-dependent porous dispersion resulting from velocity fluctuations
- describes the modelling of nitrogen fate and transport at the sediment-water interface, and groundwater flow in unconfined aquifers
- investigates two-dimensional solute transport from a varying pulse type point source, and futile cycles in metabolic flux modelling
- studies contaminant concentration prediction along unsteady groundwater flow, and modelling synovial fluid flow in human joints
- explores the modelling of soil organic carbon, and crop growth simulation
This interdisciplinary volume will be invaluable to researchers, lecturers and graduate students from such diverse fields as computer science, mathematics, hydrology, agriculture and biology.
The two volumes IFIP AICT 545 and 546 constitute the refereed post-conference proceedings of the 11th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2017, held in Jilin, China, in August 2017. The 100 revised papers included in the two volumes were carefully reviewed and selected from 282 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture. The papers focus on four topics: Internet of Things and big data in agriculture, precision agriculture and agricultural robots, agricultural information services, and animal and plant phenotyping for agriculture.
In 1998-99, at the dawn of the SoC Revolution, we wrote Surviving the SOC Revolution: A Guide to Platform Based Design. In that book, we focused on presenting guidelines and best practices to aid engineers beginning to design complex System-on-Chip devices (SoCs). Now, in 2003, facing the mid-point of that revolution, we believe that it is time to focus on winning. In this book, Winning the SoC Revolution: Experiences in Real Design, we gather the best practical experiences in how to design SoCs from the most advanced design groups, while setting the issues and techniques in the context of SoC design methodologies. As an edited volume, this book has contributions from the leading design houses who are winning in SoCs - Altera, ARM, IBM, Philips, TI, UC Berkeley, and Xilinx. These chapters present the many facets of SoC design - the platform-based approach, how to best utilize IP, verification, FPGA fabrics as an alternative to ASICs, and next-generation process technology issues. We also include observations from Ron Wilson of CMP Media on best practices for SoC design team collaboration. We hope that by utilizing this book, you, too, will win the SoC Revolution.