This volume is a collation of original contributions from the key actors of a new trend in the contemporary theory of knowledge and belief, that we call "dynamic epistemology." It brings the works of these researchers under a single umbrella by highlighting the coherence of their current themes, and by establishing connections between topics that, up until now, have been investigated independently. It also illustrates how the new analytical toolbox unveils questions about the theory of knowledge, belief, preference, action, and rationality, in a number of central axes in dynamic epistemology: temporal, social, probabilistic and even deontic dynamics.
As technology becomes the largest source of modern economic growth, the emergence of new models is challenging the standard Western model of organizational management. Companies from all over the world have built emerging economies with these new models and are now competing with established multinational corporations. Organizational Innovation and IT Governance in Emerging Economies develops a methodological framework that supports new approaches to technological innovation by companies. This reference book provides contributions from experts in emerging economies, highlighting specific case studies of home-grown companies from these emerging markets and offering lessons on how traditional multinationals can compete with these new companies, for policymakers, government officers, academics, researchers, students, and practitioners.
The electronics and information technology revolution continues, but it is a critical time in the development of technology. Once again, we stand on the brink of a new era where emerging research will yield exciting applications and products destined to transform and enrich our daily lives! The potential is staggering and the ultimate impact is unimaginable, considering the continuing marriage of technology with fields such as medicine, communications and entertainment, to name only a few. But who will actually be responsible for transforming these potential new products into reality? The answer, of course, is today's (and tomorrow's) design engineers! The design of integrated circuits today remains an essential discipline in support of technological progress, and the authors of this book have taken a giant step forward in the development of a practice-oriented treatise for design engineers who are interested in the practical, industry-driven world of integrated circuit design.
It is the aim of INDICES to document recent explorations in the various fields of philosophical logic and formal linguistics and their applications in other disciplines. The main emphasis of this series is on self-contained monographs covering particular areas of recent research and surveys of methods, problems, and results in all fields of inquiry where recourse to logical analysis and logical methods has been fruitful. INDICES will contain monographs dealing with the central areas of philosophical logic (extensional and intensional systems, indexical logics, non-classical logics, philosophy of logic, etc.) as well as studies in which these systems are applied to specific issues in philosophy, in the formal semantics of natural languages, in the foundations of linguistic theory, in computational linguistics, and in theoretical computer science. Constructive type theory was first presented in 1970 by the Swedish logician Per Martin-Löf. It has become one of the main approaches used in the foundations of mathematics and computer science. But it has remained relatively unknown among linguists and philosophers, although it provides a considerable extension of the concepts and techniques of logic. The book first gives an introduction to type theory from the point of view of linguistics and the philosophy of language. Type theory is then applied in the areas of quantification, anaphora, temporal reference, and the structure of text and discourse. By virtue of the type-theoretical concepts of proof object and context, various phenomena of dependence and progression in language can be discussed in precise terms, and several well-known problems can be solved. A categorial grammar is presented to generate formally a fragment of English, together with an example of a computer implementation.
Assertion-based design is a powerful new paradigm that is facilitating quality improvement in electronic design. Assertions are statements used to describe properties of the design (i.e., design intent) that can be included to actively check correctness throughout the design cycle and even the lifecycle of the product. With the appearance of two new languages, PSL and SVA, assertions have already started to improve verification quality and productivity. This is the first book that presents an under-the-hood view of generating assertion checkers, and as such it provides a unique and consistent perspective on employing assertions in major areas such as specification, verification, debugging, on-line monitoring and design quality improvement.
Essential Computational Thinking: Computer Science from Scratch helps students build a theoretical and practical foundation for learning computer science. Rooted in fundamental science, this text defines elementary ideas including data and information, quantifies these ideas mathematically, and, through key concepts in physics and computation, demonstrates the relationship between computer science and the universe itself. In Part I, students explore the theoretical underpinnings of computer science in a wide-ranging manner. Readers receive a robust overview of essential computational theories and programming ideas, as well as topics that examine the mathematical and physical foundations of computer science. Part II presents the basics of computation and underscores programming as an invaluable tool in the discipline. Students can apply their newfound knowledge and begin writing substantial programs immediately. Finally, Part III explores more sophisticated computational ideas, including object-oriented programming, databases, data science, and some of the underlying principles of machine learning. Essential Computational Thinking is an ideal text for a firmly technical CS0 course in computer science. It is also a valuable resource for highly motivated non-computer science majors at the undergraduate or graduate level who are interested in learning more about the discipline for either professional or personal development.
This book presents a comprehensive overview of the various aspects for the development of smart cities from a European perspective. It presents both theoretical concepts as well as empirical studies and cases of smart city programs and their capacity to create value for citizens. The contributions in this book are a result of an increasing interest for this topic, supported by both national governments and international institutions. The book offers a large panorama of the most important aspects of smart cities evolution and implementation. It compares European best practices and analyzes how smart projects and programs in cities could help to improve the quality of life in the urban space and to promote cultural and economic development.
'Behavior' is an increasingly important concept in the scientific, societal, economic, cultural, political, military, living and virtual worlds. Behavior computing, or behavior informatics, consists of methodologies, techniques and practical tools for examining and interpreting behaviors in these various worlds. Behavior computing contributes to the in-depth understanding, discovery, applications and management of behavior intelligence. With contributions from leading researchers in this emerging field, Behavior Computing: Modeling, Analysis, Mining and Decision includes chapters on: representation and modeling of behaviors; behavior ontology; behavior analysis; behavior pattern mining; clustering complex behaviors; classification of complex behaviors; behavior impact analysis; social behavior analysis; organizational behavior analysis; and behavior computing applications. Behavior Computing: Modeling, Analysis, Mining and Decision provides a dedicated source of reference for the theory and applications of behavior informatics and behavior computing. Researchers, research students and practitioners in behavior studies, including the computer science, behavioral science, and social science communities, will find this state-of-the-art volume invaluable.
Introduction The International Federation for Information Processing (IFIP) is a non-profit umbrella organization for national societies working in the field of information processing. It was founded in 1960 under the auspices of UNESCO. It is organized into several technical committees. This book represents the proceedings of the 2008 conference of technical committee 8 (TC8), which covers the field of information systems. TC8 aims to promote and encourage the advancement of research and practice of concepts, methods, techniques and issues related to information systems in organisations. TC8 has established eight working groups covering the following areas: design and evaluation of information systems; the interaction of information systems and the organization; decision support systems; e-business information systems: multi-disciplinary research and practice; information systems in public administration; smart cards, technology, applications and methods; and enterprise information systems. Further details of the technical committee and its working groups can be found on our website (ifiptc8.dsi.uminho.pt). This conference was part of IFIP's World Computer Congress in Milan, Italy, which took place 7-10 September 2008. The occasion celebrated the 32nd anniversary of IFIP TC8. The call for papers invited researchers, educators, and practitioners to submit papers and panel proposals that advance concepts, methods, techniques, tools, issues, education, and practice of information systems in organisations. Thirty-one submissions were received.
Global Optimization has emerged as one of the most exciting new areas of mathematical programming. Global optimization has attracted wide attention from many fields in the past few years, due to the success of new algorithms for addressing previously intractable problems from diverse areas such as computational chemistry and biology, biomedicine, structural optimization, computer sciences, operations research, economics, and engineering design and control. This book contains refereed invited papers submitted at the 4th international conference on Frontiers in Global Optimization, held at Santorini, Greece during June 8-12, 2003. Santorini is one of the few sites of Greece with a wild beauty created by the explosion of the volcano which lies in the middle of the gulf of the island. The mystic landscape, with its numerous multi-extrema, was an inspiring location, particularly for researchers working on global optimization. The three previous conferences, "Recent Advances in Global Optimization," "State-of-the-Art in Global Optimization," and "Optimization in Computational Chemistry and Molecular Biology: Local and Global Approaches," took place at Princeton University in 1991, 1995, and 1999, respectively. The papers in this volume focus on deterministic methods for global optimization, stochastic methods for global optimization, distributed computing methods in global optimization, and applications of global optimization in several branches of applied science and engineering, computer science, computational chemistry, structural biology, and bio-informatics.
This is a book about a code and about coding. The code is a case study which has been used to teach courses in e-Science at the Australian National University since 2001. Students learn advanced programming skills and techniques in the Java™ language. Above all, they learn to apply useful object-oriented design patterns as they progressively refactor and enhance the software. We think our case study, EScope, is as close to real life as you can get! It is a smaller version of a networked, graphical, waveform browser which is used in the control rooms of fusion energy experiments around the world. It is quintessential "e-Science" in the sense of e-Science being "computer science and information technology in the service of science". It is not, specifically, "Grid-enabled", but we develop it in a way that will facilitate its deployment onto the Grid. The standard version of EScope interfaces with a specialised database for waveforms, and related data, known as MDSplus. On the accompanying CD, we have provided you with software which will enable you to install MDSplus, EScope and sample data files onto Windows or Linux computers. There is much additional software including many versions of the case study as it gets built up and progressively refactored using design patterns. There will be a home web-site for this book which will contain up-to-date information about the software and other aspects of the case study.
A Modular Calculus for the Average Cost of Data Structuring introduces MOQA, a new domain-specific programming language which guarantees the average-case time analysis of its programs to be modular. Time in this context refers to a broad notion of cost, which can be used to estimate the actual running time, but also other quantitative information such as power consumption, while modularity means that the average time of a program can be easily computed from the times of its constituents, something that no programming language of this scope has been able to guarantee so far. MOQA principles can be incorporated in any standard programming language. MOQA supports tracking of data and their distributions throughout computations, based on the notion of random bag preservation. This allows a unified approach to average-case time analysis, and resolves fundamental bottleneck problems in the area. The main techniques are illustrated in an accompanying Flash tutorial, where the visual nature of this method can provide new teaching ideas for algorithms courses. This volume, with forewords by Greg Bollella and Dana Scott, presents novel programs based on the new advances in this area, including the first randomness-preserving version of Heapsort. Programs are provided, along with derivations of their average-case time, to illustrate the radically different approach to average-case timing. The automated static timing tool applies the Modular Calculus to extract the average-case running time of programs directly from their MOQA code. A Modular Calculus for the Average Cost of Data Structuring is designed for a professional audience composed of researchers and practitioners in industry, with an interest in algorithmic analysis and also static timing and power analysis, areas of growing importance. It is also suitable as an advanced-level text or reference book for students in computer science, electrical engineering and mathematics.
Michel Schellekens obtained his PhD from Carnegie Mellon University, following which he worked as a Marie Curie Fellow at Imperial College London. Currently he is an Associate Professor at the Department of Computer Science in University College Cork - National University of Ireland, Cork, where he leads the Centre for Efficiency-Oriented Languages (CEOL) as a Science Foundation Ireland Principal Investigator.
This book records the very first Working Conference of the newly established IFIP Working Group on Human-Work Interaction Design, which was hosted by the University of Madeira in 2006. The theme of the conference was synthesizing work analysis and design sketching, with a particular focus on how to read design sketches within different approaches to the analysis and design of human-work interaction. Authors were encouraged to submit papers about design sketches - for interfaces, for organizations of work, etc. - that they themselves had worked on. During the conference, they presented the lessons they had learnt from the design and evaluation process, citing reasons for why the designs worked or why they did not. Researchers, designers and analysts in this way confronted concrete design problems in complex work domains and used this unique opportunity to share their own design problems and solutions with the community. Successfully practising and doing research within Human-Work Interaction Design requires a high level of personal skill, which the conference fostered by bringing together designers, work analysts, and those whose research spans both analysis and design. They were asked to collaborate in small groups on analysis and solutions to a common design problem.
High-speed, power-efficient analog integrated circuits can be used as standalone devices or to interface modern digital signal processors and micro-controllers in various applications, including multimedia, communication, instrumentation, and control systems. New architectures and low device geometry of complementary metal-oxide-semiconductor (CMOS) technologies have accelerated the movement toward system-on-a-chip design, which merges analog circuits with digital and radio-frequency components.
This book provides an insight into the possibilities that so-called "Electronic Government" has to offer. It demonstrates the elements belonging to the concept of E-Government and acts as a point of reference for those aiming to implement it. Checklists and lists of questions enable self-assessment at local, state and federal levels, highlighting opportunities for further development. The book cannot be described as technical - programmers will not find any instructions. Instead, it is designed to act as a point of orientation for decision makers in the field of government and politics, without the need to get bogged down in technical details. Central to the book are the following questions: what is Electronic Government, what advantages does it bring to those involved with it, and how can it be introduced?
Aware that readers like a scientific array, the author strived to satiate this overlooked desire. The mosaic of topics offered here addresses this forgotten craving. Studying what has been invented over the years showed how inventiveness was affected by fiction, intuition, deliberate thinking and the tabooed. Though unlooked for, the author came up with a new Classification of Inventions. This showed that inventiveness can be learned. Besides inventing the materialistic, man was also enthused to invent the spiritualistic. This led the author to discuss our changing views on mythopoeia, religion, the expiration of man, our dystopian cultures and our global insociability. Thus and so, these subjects formed an appropriate connubiality between the materialistic and what has been established by fuliginous credos. His contrived methodologies comprised, to name a few, inventions for: collecting spilt oil lost to the sea; desalting sea water; protecting the affluent and the influential from being spied on or targeted by snipers; and severing skin folds bloodlessly, hence winning the battle of the flap. He wished to share them with his readers and to leave them behind for the indulgers of coming generations. Credentials, though important, are not everything: intuition, as instanced, knows no boundaries for the insightful.
Set your students on track to achieve the best grade possible with My Revision Notes: AQA A-level Computer Science. Our clear and concise approach to revision will help students learn, practise and apply their skills and understanding. Coverage of key content is combined with practical study tips and effective revision strategies to create a guide that can be relied on to build both knowledge and confidence. With My Revision Notes: AQA A-level Computer Science, students can:
> Consolidate knowledge with clear, focused and relevant content coverage, based on what examiners are looking for
> Develop understanding with self-testing - our regular 'Now test yourself' tasks and answers will help commit knowledge to memory
> Improve technique through exam-style practice questions, expert tips and examples of typical mistakes to avoid
> Identify key connections between topics and subjects with our 'Learning links' focus
> Plan and manage a successful revision programme with our topic-by-topic planner, new exam breakdown feature, user-friendly definitions throughout, and questions and answers online
This book contains papers presented at the fifth and sixth Teraflop Workshops. It presents the state of the art in high performance computing and simulation on modern supercomputer architectures, covering trends in hardware and software development in general, and specifically the future of vector-based systems and heterogeneous architectures. Application areas include computational fluid dynamics, fluid-structure interaction, physics, chemistry, astrophysics, and climate research.
Social media has revolutionized how individuals, communities, and organizations create, share, and consume information. At the same time, social media presents numerous opportunities as well as enormous social and economic ills for individuals, communities, and organizations. Despite the increase in popularity of social networking sites and related digital media, there are limited data and studies on consumption patterns of the new media by different global communities. Analyzing Global Social Media Consumption is an essential reference book that investigates the current trends, practices, and newly emerging narratives in theoretical and empirical research on all aspects of social media and its global use. Covering topics that include fake news detection, social media addiction, and the motivations and impacts of social media use, this book is ideal for big data analysts, media and communications experts, researchers, academicians, and students in media and communications, information systems, and information technology study programs.
First studied in social insects like ants, indirect self-organizing interactions - known as "stigmergy" - occur when one individual modifies the environment and another subsequently responds to the new environment. The implications of self-organizing behavior extend to robotics and beyond. This book explores the application of stigmergy for a variety of optimization problems. The volume comprises 12 chapters including an introductory chapter conveying the fundamental definitions, inspirations and research challenges.
- Nigel Holmes is one of the leading graphic and information designers of the late 20th and 21st centuries
- The book is written in non-academic, easy-to-understand language, is full of visual examples (historical and contemporary) and will appeal to any level of reader
- This is the first book to focus on humor and joy in relation to information graphics and data visualization, and it teaches the reader how to use humor and joy to make visual information more understandable