At a time when Internet use is closely tracked and social networking sites supply data for targeted advertising, Lars Heide presents the first academic study of the invention that fueled today's information revolution: the punched card. Early punched cards helped to process the United States census in 1890. They soon proved useful in calculating invoices and issuing pay slips. As demand for more sophisticated systems and reading machines increased in both the United States and Europe, punched cards served ever-larger data-processing purposes. Insurance companies, public utilities, businesses, and governments all used them to keep detailed records of their customers, competitors, employees, citizens, and enemies. The United States used punched-card registers in the late 1930s to pay roughly 21 million Americans their Social Security pensions, Vichy France used similar technologies in an attempt to mobilize an army against the occupying German forces, and the Germans in 1941 developed several punched-card registers to make the war effort--and surveillance of minorities--more effective. Heide's analysis of these three major punched-card systems, as well as the impact of the invention on Great Britain, illustrates how different cultures collected personal and financial data and how they adapted to new technologies. This comparative study will interest students and scholars from a wide range of disciplines, including the history of technology, computer science, business history, and management and organizational studies.
With the rise of mobile and wireless technologies, more sustainable networks are necessary to support communication. These next-generation networks can now be utilized to extend the growing era of the Internet of Things. Enabling Technologies and Architectures for Next-Generation Networking Capabilities is an essential reference source that explores the latest research and trends in large-scale 5G technologies deployment, software-defined networking, and other emerging network technologies. Featuring research on topics such as data management, heterogeneous networks, and spectrum sensing, this book is ideally designed for computer engineers, technology developers, network administrators and researchers, professionals, and graduate-level students seeking coverage on current and future network technologies.
Multisensor fusion systems are only practical if the algorithms used are practical and effective, and if there is efficient database support. The first part of this book discusses a wide range of issues related to the development of robust, context-sensitive, and efficient data fusion algorithms. The second part addresses database requirements, structures, and issues related to achieving overall computational efficiency. Featuring highly accessible notation, the processing model and database issues presented in the text are aimed at system developers working in sensor fusion, automatic target recognition, multiple-target tracking, robotic control, automated image understanding, and large-scale integration and fabrication.
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies and changes in operational and analytical applications over the last decade have given rise to the unification of these systems, which can be of benefit for both workloads. Research and industry have reacted and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because of the separation of OLTP and OLAP systems, existing benchmarks are only focused on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems will be needed as well. Based on the examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload and is applied to analyze the impact of typically used optimizations in the historically separate OLTP and OLAP domains in mixed-workload scenarios.
This book presents computer programming as a key method for solving mathematical problems. There are two versions of the book, one for MATLAB and one for Python. The book was inspired by the Springer book TCSE 6: A Primer on Scientific Programming with Python (by Langtangen), but the style is more accessible and concise, in keeping with the needs of engineering students. The book outlines the shortest possible path from no previous experience with programming to a set of skills that allows the students to write simple programs for solving common mathematical problems with numerical methods in engineering and science courses. The emphasis is on generic algorithms, clean design of programs, use of functions, and automatic tests for verification.
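The style the blurb describes (a clean function solving a common mathematical problem numerically, paired with an automatic verification test) can be illustrated with a minimal Python sketch. This example is not taken from the book; the function name and tolerance are illustrative assumptions.

```python
import math

def bisection(f, a, b, tol=1e-12):
    """Find a root of f in [a, b] by repeated interval halving.

    Assumes f(a) and f(b) have opposite signs (a sign change
    brackets a root of a continuous function).
    """
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2.0
        if fa * f(m) <= 0:   # root lies in the left half
            b = m
        else:                # root lies in the right half
            a, fa = m, f(m)
    return (a + b) / 2.0

def test_bisection():
    # Automatic verification: sqrt(2) is the positive root of x**2 - 2.
    root = bisection(lambda x: x**2 - 2.0, 0.0, 2.0)
    assert abs(root - math.sqrt(2)) < 1e-10

test_bisection()
```

The `test_bisection` function follows the book's emphasis on automatic tests: it checks the computed result against a known answer rather than relying on visual inspection.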
Mathematical logic is a branch of mathematics that takes axiom systems and mathematical proofs as its objects of study. This book shows how it can also provide a foundation for the development of information science and technology. The first five chapters systematically present the core topics of classical mathematical logic, including the syntax and models of first-order languages, formal inference systems, computability and representability, and Goedel's theorems. The last five chapters present extensions and developments of classical mathematical logic, particularly the concepts of version sequences of formal theories and their limits, the system of revision calculus, proschemes (formal descriptions of proof methods and strategies) and their properties, and the theory of inductive inference. All of these themes contribute to a formal theory of axiomatization and its application to the process of developing information technology and scientific theories. The book also describes the paradigm of three kinds of language environments for theories and it presents the basic properties required of a meta-language environment. Finally, the book brings these themes together by describing a workflow for scientific research in the information era in which formal methods, interactive software and human invention are all used to their advantage. The second edition of the book includes major revisions on the proof of the completeness theorem of the Gentzen system and new contents on the logic of scientific discovery, R-calculus without cut, and the operational semantics of program debugging. This book represents a valuable reference for graduate and undergraduate students and researchers in mathematics, information science and technology, and other relevant areas of natural sciences. Its first five chapters serve as an undergraduate text in mathematical logic and the last five chapters are addressed to graduate students in relevant disciplines.
This volume discusses the underlying principles and analysis of the different concepts associated with an emerging socio-inspired optimization tool referred to as Cohort Intelligence (CI). CI algorithms have been coded in Matlab and are freely available from the link provided inside the book. The book demonstrates the ability of the CI methodology to solve combinatorial problems such as the Traveling Salesman Problem and the Knapsack Problem, in addition to real-world applications from healthcare, inventory, supply chain optimization and cross-border transportation. The inherent ability to handle constraints based on probability distributions is also revealed and proved using these problems.
Jack Ganssle has been forming the careers of embedded engineers for 20+ years. He has done this with four books, over 500 articles, a weekly column, and continuous lecturing. Technology moves fast, and much has changed since the first edition of this best-selling classic. The new edition reflects the author's new and ever-evolving philosophy in the face of new technology and realities.
No aspect of business, public, or private lives in developed economies can be discussed today without acknowledging the role of information and communication technologies (ICT). A shortage of studies still exists, however, on how ICTs can help developing economies. Leveraging Developing Economies with the Use of Information Technology: Trends and Tools moves toward filling the gap in research on ICT and developing nations, bringing these countries one step closer to advancement through technology. This essential publication will bring together ideas, views, and perspectives helpful to government officials, business professionals, and other individuals worldwide as they consider the use of ICT for socio-economic progress in the developing world.
Power consumption has become the most important design goal in a wide range of electronic systems. There are two driving forces behind this trend: continuing device scaling and the ever-increasing demand for higher computing power. First, device scaling continues to satisfy Moore's law via conventional scaling (More Moore) and a new way of exploiting vertical integration (More than Moore). Second, mobile and IT convergence requires more computing power on the silicon chip than ever. Cell phones are now evolving towards mobile PCs. PCs and data centers are becoming commodities in the home and a must in industry. Both the supply enabled by device scaling and the demand triggered by the convergence trend realize more computation on chip (via multi-core designs, integration of diverse functionalities on mobile SoCs, etc.) and ultimately more power consumption, incurring power-related issues and constraints. "Energy-Aware System Design: Algorithms and Architectures" provides state-of-the-art ideas for low power design methods from the circuit and architecture levels to the software level, and offers design case studies in three fast-growing areas: mobile storage, biomedical electronics and security. Important topics and features:
- Describes very recent advanced issues and methods for energy-aware design at each design level, from circuit and architecture to algorithm level, also covering important blocks including the low power main memory subsystem and on-chip network at the architecture level
- Explains efficient power conversion and delivery, which is becoming important as heterogeneous power sources are adopted for digital and non-digital parts
- Investigates 3D die stacking, emphasizing temperature awareness for a better perspective on energy efficiency
- Presents three practical energy-aware design case studies: novel storage devices (e.g., solid state disks), biomedical electronics (e.g., cochlear and retina implants), and wireless surveillance camera systems.
Researchers and engineers in the field of hardware and software design will find this book an excellent starting point to catch up with the state-of-the-art ideas of low power design.
The 1960s were perhaps a decade of confusion, when scientists faced difficulties in dealing with imprecise information and complex dynamics. A new set theory and then an infinite-valued logic of Lotfi A. Zadeh were so confusing that they were called fuzzy set theory and fuzzy logic; a deterministic system found by E. N. Lorenz to have random behaviours was so unusual that it was later named a chaotic system. Just like irrational and imaginary numbers, negative energy, anti-matter, etc., fuzzy logic and chaos were gradually and eventually accepted by many, if not all, scientists and engineers as fundamental concepts, theories, as well as technologies. In particular, fuzzy systems technology has achieved its maturity with widespread applications in many industrial, commercial, and technical fields, ranging from control, automation, and artificial intelligence to image/signal processing, pattern recognition, and electronic commerce. Chaos, on the other hand, was considered one of the three monumental discoveries of the twentieth century, together with the theory of relativity and quantum mechanics. As a very special nonlinear dynamical phenomenon, chaos has reached its current outstanding status from being merely a scientific curiosity in the mid-1960s to an applicable technology in the late 1990s. Finding the intrinsic relation between fuzzy logic and chaos theory is certainly of significant interest and of potential importance. The past 20 years have indeed witnessed some serious explorations of the interactions between fuzzy logic and chaos theory, leading to such research topics as fuzzy modeling of chaotic systems using Takagi-Sugeno models, linguistic descriptions of chaotic systems, fuzzy control of chaos, and a combination of fuzzy control technology and chaos theory for various engineering practice.
This book is intended as an introduction to fuzzy algebraic hyperstructures. As the first in its genre, it includes a number of topics, most of which reflect the authors' past research and thus provides a starting point for future research directions. The book is organized in five chapters. The first chapter introduces readers to the basic notions of algebraic structures and hyperstructures. The second covers fuzzy sets, fuzzy groups and fuzzy polygroups. The following two chapters are concerned with the theory of fuzzy Hv-structures: while the third chapter presents the concept of fuzzy Hv-subgroup of Hv-groups, the fourth covers the theory of fuzzy Hv-ideals of Hv-rings. The final chapter discusses several connections between hypergroups and fuzzy sets, and includes a study on the association between hypergroupoids and fuzzy sets endowed with two membership functions. In addition to providing a reference guide to researchers, the book is also intended as textbook for undergraduate and graduate students.
This book provides an overview of the confluence of ideas in Turing's era and work and examines the impact of his work on mathematical logic and theoretical computer science. It combines contributions by well-known scientists on the history and philosophy of computability theory as well as on generalised Turing computability. By looking at the roots and at the philosophical and technical influence of Turing's work, it is possible to gather new perspectives and new research topics which might be considered as a continuation of Turing's working ideas well into the 21st century. "The Stored-Program Universal Computer: Did Zuse Anticipate Turing and von Neumann?" is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com
This book offers a coherent and comprehensive approach to feature subset selection in the scope of classification problems, explaining the foundations, real application problems and the challenges of feature selection for high-dimensional data. The authors first focus on the analysis and synthesis of feature selection algorithms, presenting a comprehensive review of basic concepts and experimental results of the most well-known algorithms. They then address different real scenarios with high-dimensional data, showing the use of feature selection algorithms in different contexts with different requirements and information: microarray data, intrusion detection, tear film lipid layer classification and cost-based features. The book then delves into the scenario of big dimension, paying attention to important problems under high-dimensional spaces, such as scalability, distributed processing and real-time processing, scenarios that open up new and interesting challenges for researchers. The book is useful for practitioners, researchers and graduate students in the areas of machine learning and data mining.
This edited volume is intended to address in a comprehensive and integrated manner three major areas of national and international security research from an information systems-centric perspective: legal and policy frameworks; intelligence and security informatics; and emergency preparedness and infrastructure protection. The discussions are replete with real-world case studies and examples that present the concepts using an integrated, action-oriented and theory-based approach to validate the frameworks presented and to provide specific insights on the technical approaches and organizational issues under investigation.
Ambient Intelligence is one of the new paradigms in the development of information and communication technology, which has attracted much attention over the past years. The aim is to integrate technology into people's environment in such a way that it improves their daily lives in terms of well-being, creativity, and productivity. Ambient Intelligence is a multidisciplinary concept, which builds heavily on a number of fundamental breakthroughs that have been achieved in the development of new hardware concepts over the past years. New insights in nano- and microelectronics, packaging and interconnection technology, large-area electronics, energy scavenging devices, wireless sensors, low power electronics and computing platforms enable the realization of the heaven of ambient intelligence by overcoming the hell of physics. Based on contributions from leading technical experts, this book presents a number of key topics on novel hardware developments, thus providing the reader with a good insight into the physical basis of ambient intelligence. It also indicates key research challenges that must be addressed in the future.
The creation and consumption of content, especially visual content, is ingrained into our modern world. This book contains a collection of texts centered on the evaluation of image retrieval systems. To enable reproducible evaluation we must create standardized benchmarks and evaluation methodologies. The individual chapters in this book highlight major issues and challenges in evaluating image retrieval systems and describe various initiatives that provide researchers with the necessary evaluation resources. In particular they describe activities within ImageCLEF, an initiative to evaluate cross-language image retrieval systems which has been running as part of the Cross Language Evaluation Forum (CLEF) since 2003. To this end, the editors collected contributions from a range of people: those involved directly with ImageCLEF, such as the organizers of specific image retrieval or annotation tasks; participants who have developed techniques to tackle the challenges set forth by the organizers; and people from industry and academia involved with image retrieval and evaluation generally. Mostly written for researchers in academia and industry, the book stresses the importance of combining textual and visual information - a multimodal approach - for effective retrieval. It provides the reader with clear ideas about information retrieval and its evaluation in contexts and domains such as healthcare, robot vision, press photography, and the Web.
Instructional Design in the Real World: A View from the Trenches offers guidance on how the traditional instructional design system has been used and how it must be changed to work within other systems. The environments and systems that affect the ADDIE (Analysis, Design, Development, Implementation, Evaluation) process and to which it must be adapted include corporations, industry, consulting organizations, health care facilities, church and charitable groups, the military, the government, educational institutions, and others. Its application must be filtered and altered by the environments and the systems where the learning or training takes place. Every chapter includes a case study showing how the application of ID strategies, learning theories, systems theory, management theories and practices and communication tools and practices are adapted and applied in various environments. The chapters also contain lessons learned, tool tips, and suggestions for the future.
This book adheres to the vision that in the future compelling user experiences will be key differentiating benefits of products and services. Evaluating the user experience plays a central role, not only during the design process, but also during regular usage: for instance, a video recorder that recommends TV programs that fit your current mood, a product that measures your current level of relaxation and produces advice on how to balance your life, or a module that alerts a factory operator when he is getting drowsy. Such systems are required to assess and interpret user experiences (almost) in real-time, and that is exactly what this book is about. How to achieve this? What are potential applications of psychophysiological measurements? Are real-time assessments based on monitoring of user behavior possible? If so, which elements are critical? Are behavioral aspects important? Which technology can be used? How important are intra-individual differences? What can we learn from products already on the market? The book gathers a group of invited authors from different backgrounds, such as technology, academia and business. This is a mosaic of their work, and that of Philips Research, in the assessment of user experience, covering the full range from academic research to commercial propositions.
Candida Ferreira thoroughly describes the basic ideas of gene expression programming (GEP) and numerous modifications to this powerful new algorithm. This monograph provides all the implementation details of GEP so that anyone with elementary programming skills will be able to implement it themselves. The book also includes a self-contained introduction to this exciting new field of computational intelligence, including several new algorithms for decision tree induction, data mining, classifier systems, function finding, polynomial induction, time series prediction, evolution of linking functions, automatically defined functions, parameter optimization, logic synthesis, combinatorial optimization, and complete neural network induction. The book also discusses some important and controversial evolutionary topics that might be refreshing to both evolutionary computer scientists and biologists. This second edition has been substantially revised and extended with five new chapters, including a new chapter describing two new algorithms for inducing decision trees with nominal and numeric/mixed attributes.
Computable Calculus treats the fundamental topic of calculus in a novel way that is more in tune with today's computer age. Comprising 11 chapters and an accompanying CD-ROM, the book presents mathematical analysis that has been created to deal with constructively defined concepts. The book's "show your work" approach makes it easier to understand the pitfalls of various computations and, more importantly, how to avoid these pitfalls.
Calculus has been used in solving many scientific and engineering problems. For optimization problems, however, the differential calculus technique sometimes has a drawback when the objective function is step-wise, discontinuous, or multi-modal, or when decision variables are discrete rather than continuous. Thus, researchers have recently turned their interest to metaheuristic algorithms inspired by natural phenomena such as evolution, animal behavior, or metallic annealing. This book focuses on a music-inspired metaheuristic algorithm, harmony search. Interestingly, there exists an analogy between music and optimization: each musical instrument corresponds to a decision variable; each musical note corresponds to a variable value; and each harmony corresponds to a solution vector. Just as musicians in jazz improvisation play notes randomly or based on experience in order to find a fantastic harmony, variables in the harmony search algorithm take random values or previously memorized good values in order to find the optimal solution.
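The analogy described above can be sketched in a few lines of Python. This is a minimal illustrative implementation, not the book's code; the parameter names (`hmcr` for harmony memory considering rate, `par` for pitch adjusting rate) follow common usage in the harmony search literature, and the specific values are assumptions.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=0):
    """Minimize f over box constraints using a basic harmony search.

    bounds -- list of (low, high) per decision variable ("instrument")
    hms    -- harmony memory size (number of remembered solution vectors)
    hmcr   -- probability of playing a memorized note (value from memory)
    par    -- probability of pitch-adjusting a memorized note
    """
    rng = random.Random(seed)
    # Harmony memory: each "harmony" is one solution vector.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                # Play a note remembered from a previous good harmony.
                x = rng.choice(memory)[j]
                if rng.random() < par:
                    # Pitch adjustment: small tweak within a bandwidth.
                    bw = 0.01 * (hi - lo)
                    x = min(hi, max(lo, x + rng.uniform(-1.0, 1.0) * bw))
            else:
                # Improvise: play a completely random note.
                x = rng.uniform(lo, hi)
            new.append(x)
        # Replace the worst harmony in memory if the new one is better.
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new
    return min(memory, key=f)

# Usage: minimize the sphere function in 3 dimensions; optimum is the origin.
best = harmony_search(lambda v: sum(x * x for x in v), [(-5.0, 5.0)] * 3)
```

Note how the two behaviors the blurb describes appear directly: the `hmcr` branch reuses previously memorized good values, while the `else` branch plays random notes.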
The continuous and increasing interest concerning vector optimization perceptible in the research community, where contributions dealing with the theory of duality abound lately, constitutes the main motivation that led to writing this book. Decisive was also the research experience of the authors in this field, materialized in a number of works published within the last decade. The need for a book on duality in vector optimization comes from the fact that despite the large amount of papers in journals and proceedings volumes, no book mainly concentrated on this topic was available so far in the scientific landscape. There is a considerable presence of books, not all recent releases, on vector optimization in the literature. We mention here the ones due to Chen, Huang and Yang (cf. [49]), Ehrgott and Gandibleux (cf. [65]), Eichfelder (cf. [66]), Goh and Yang (cf. [77]), Göpfert and Nehse (cf. [80]), Göpfert, Riahi, Tammer and Zălinescu (cf. [81]), Jahn (cf. [104]), Kaliszewski (cf. [108]), Luc (cf. [125]), Miettinen (cf. [130]), Mishra, Wang and Lai (cf. [131,132]) and Sawaragi, Nakayama and Tanino (cf. [163]), where vector duality is at most tangentially treated. We hope that not only researchers interested in vector optimization, but also graduate and undergraduate students will benefit from our efforts. The framework we consider is taken as general as possible, namely we work in (locally convex) topological vector spaces, going to the usual finite-dimensional setting when this brings additional insights or relevant connections to the existing literature.