Ambient Intelligence is one of the new paradigms in the development of information and communication technology, and it has attracted much attention over the past years. The aim is to integrate technology into people's environments in such a way that it improves their daily lives in terms of well-being, creativity, and productivity. Ambient Intelligence is a multidisciplinary concept, which builds heavily on a number of fundamental breakthroughs that have been achieved in the development of new hardware concepts over the past years. New insights in nano and micro electronics, packaging and interconnection technology, large-area electronics, energy scavenging devices, wireless sensors, low power electronics and computing platforms enable the realization of the heaven of ambient intelligence by overcoming the hell of physics. Based on contributions from leading technical experts, this book presents a number of key topics on novel hardware developments, thus providing the reader a good insight into the physical basis of ambient intelligence. It also indicates key research challenges that must be addressed in the future.
This book constitutes the refereed proceedings of the 6th IFIP TC 5 International Conference on Computational Intelligence and Its Applications, CIIA 2018, held in Oran, Algeria, in May 2018. The 56 full papers presented were carefully reviewed and selected from 202 submissions. They are organized in the following topical sections: data mining and information retrieval; evolutionary computation; machine learning; optimization; planning and scheduling; wireless communication and mobile computing; Internet of Things (IoT) and decision support systems; pattern recognition and image processing; and semantic web services.
Nash equilibrium is the central solution concept in Game Theory. Since Nash's original paper in 1951, it has found countless applications in modeling strategic behavior of traders in markets, (human) drivers and (electronic) routers in congested networks, nations in nuclear disarmament negotiations, and more. A decade ago, the relevance of this solution concept was called into question by computer scientists, who proved (under appropriate complexity assumptions) that computing a Nash equilibrium is an intractable problem. And if centralized, specially designed algorithms cannot find Nash equilibria, why should we expect distributed, selfish agents to converge to one? The remaining hope was that at least approximate Nash equilibria can be efficiently computed. Understanding whether there is an efficient algorithm for approximate Nash equilibrium has been the central open problem in this field for the past decade. In this book, we provide strong evidence that even finding an approximate Nash equilibrium is intractable. We prove several intractability theorems for different settings (two-player games and many-player games) and models (computational complexity, query complexity, and communication complexity). In particular, our main result is that under a plausible and natural complexity assumption ("Exponential Time Hypothesis for PPAD"), there is no polynomial-time algorithm for finding an approximate Nash equilibrium in two-player games. The problem of approximate Nash equilibrium in a two-player game poses a unique technical challenge: it is a member of the class PPAD, which captures the complexity of several fundamental total problems, i.e., problems that always have a solution; and it also admits a quasipolynomial time algorithm. Either property alone is believed to place this problem far below NP-hard problems in the complexity hierarchy; having both simultaneously places it just above P, at what can be called the frontier of intractability.
Indeed, the tools we develop in this book to advance on this frontier are useful for proving hardness of approximation of several other important problems whose complexity lies between P and NP: Brouwer's fixed point, market equilibrium, CourseMatch (A-CEEI), densest k-subgraph, community detection, VC dimension and Littlestone dimension, and signaling in zero-sum games.
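The solution concept itself (though not the hardness results above) is easy to illustrate: in a bimatrix game, a pure-strategy Nash equilibrium is a cell where each player's action is a best response to the other's. A minimal sketch, not taken from the book, using the classic Prisoner's Dilemma payoffs:

```python
import itertools

def pure_nash(A, B):
    """Return all pure-strategy Nash equilibria of a bimatrix game.
    A[i][j] is the row player's payoff, B[i][j] the column player's."""
    m, n = len(A), len(A[0])
    equilibria = []
    for i, j in itertools.product(range(m), range(n)):
        row_best = all(A[i][j] >= A[k][j] for k in range(m))  # no row deviation helps
        col_best = all(B[i][j] >= B[i][l] for l in range(n))  # no column deviation helps
        if row_best and col_best:
            equilibria.append((i, j))
    return equilibria

# Prisoner's Dilemma: action 0 = cooperate, 1 = defect.
A = [[-1, -3], [0, -2]]   # row player's payoffs
B = [[-1, 0], [-3, -2]]   # column player's payoffs
print(pure_nash(A, B))    # mutual defection (1, 1) is the unique equilibrium
```

Note that this brute-force check only covers pure strategies; the hardness results in the book concern mixed (randomized) equilibria, which always exist but are far harder to find.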
In this volume, designed for engineers and scientists working in the area of Computational Fluid Dynamics (CFD), experts offer assessments of the capabilities of CFD, highlight some fundamental issues and barriers, and propose novel approaches to overcome these problems. They also offer new avenues for research in traditional and non-traditional disciplines. The scope of the papers ranges from the scholarly to the practical. This book is distinguished from earlier surveys by its emphasis on the problems facing CFD and by its focus on non-traditional applications of CFD techniques. There have been several significant developments in CFD since the last workshop held in 1990 and this book brings together the key developments in a single unified volume.
Candida Ferreira thoroughly describes the basic ideas of gene expression programming (GEP) and numerous modifications to this powerful new algorithm. This monograph provides all the implementation details of GEP so that anyone with elementary programming skills will be able to implement it themselves. The book also includes a self-contained introduction to this exciting new field of computational intelligence, including several new algorithms for decision tree induction, data mining, classifier systems, function finding, polynomial induction, time series prediction, evolution of linking functions, automatically defined functions, parameter optimization, logic synthesis, combinatorial optimization, and complete neural network induction. The book also discusses some important and controversial evolutionary topics that might be refreshing to both evolutionary computer scientists and biologists. This second edition has been substantially revised and extended with five new chapters, including a new chapter describing two new algorithms for inducing decision trees with nominal and numeric/mixed attributes.
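The core mechanism of GEP — decoding a fixed-length gene (a K-expression in Karva notation) breadth-first into an expression tree — can be sketched in a few lines. This is a simplified illustration under assumed conventions (binary functions only, single-character symbols), not Ferreira's implementation:

```python
import operator

# Binary functions and terminals for this illustrative sketch.
FUNCS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def eval_kexpression(gene, env):
    """Decode a K-expression breadth-first into a tree and evaluate it.
    Functions here are all binary; terminals are variables looked up in env."""
    # Build the tree level by level: each function node claims the next
    # unconsumed symbols in the gene as its children.
    root = {'sym': gene[0]}
    queue = [root]
    pos = 1
    while queue:
        node = queue.pop(0)
        if node['sym'] in FUNCS:
            node['children'] = [{'sym': gene[pos]}, {'sym': gene[pos + 1]}]
            pos += 2
            queue.extend(node['children'])

    def evaluate(node):
        if node['sym'] in FUNCS:
            left, right = (evaluate(c) for c in node['children'])
            return FUNCS[node['sym']](left, right)
        return env[node['sym']]

    return evaluate(root)

# The K-expression '+a*bc' reads breadth-first as a + (b * c).
print(eval_kexpression('+a*bc', {'a': 2, 'b': 3, 'c': 4}))  # 14
```

Because the tree is rebuilt from a flat string, any mutation or crossover on the gene still yields a syntactically valid program — the separation of genotype (string) and phenotype (tree) that gives GEP its flexibility.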
This book presents four mathematical essays which explore the foundations of mathematics and related topics ranging from philosophy and logic to modern computer mathematics. While connected to the historical evolution of these concepts, the essays place strong emphasis on developments still to come. The book originated in a 2002 symposium celebrating the work of Bruno Buchberger, Professor of Computer Mathematics at Johannes Kepler University, Linz, Austria, on the occasion of his 60th birthday. Among many other accomplishments, Professor Buchberger in 1985 was the founding editor of the Journal of Symbolic Computation; the founder of the Research Institute for Symbolic Computation (RISC) and its chairman from 1987-2000; the founder in 1990 of the Softwarepark Hagenberg, Austria, and since then its director. More than a decade in the making, Mathematics, Computer Science and Logic - A Never Ending Story includes essays by leading authorities, on such topics as mathematical foundations from the perspective of computer verification; a symbolic-computational philosophy and methodology for mathematics; the role of logic and algebra in software engineering; and new directions in the foundations of mathematics. These inspiring essays invite general, mathematically interested readers to share state-of-the-art ideas which advance the never ending story of mathematics, computer science and logic. Mathematics, Computer Science and Logic - A Never Ending Story is edited by Professor Peter Paule, Bruno Buchberger's successor as director of the Research Institute for Symbolic Computation.
Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.
This monograph presents recursion theory from a generalized point of view centered on the computational aspects of definability. A major theme is the study of the structures of degrees arising from two key notions of reducibility, the Turing degrees and the hyperdegrees, using techniques and ideas from recursion theory, hyperarithmetic theory, and descriptive set theory. The emphasis is on the interplay between recursion theory and set theory, anchored on the notion of definability. The monograph covers a number of fundamental results in hyperarithmetic theory as well as some recent results on the structure theory of Turing and hyperdegrees. It also features a chapter on the applications of these investigations to higher randomness.
The purpose of the 7th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2008) and the 2nd IEEE/ACIS International Workshop on e-Activity (IWEA 2008), held on May 14-16, 2008 in Portland, Oregon, U.S.A., is to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas and research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. In January 2008, one of the editors of this book approached in-house editor Dr. Thomas Ditzinger about preparing a volume containing extended and improved versions of some of the papers selected for presentation at the conference and workshop. Upon receiving Dr. Ditzinger's approval, conference organizers selected 23 outstanding papers from ICIS/IWEA 2008, all of which you will find in this volume of Springer's Studies in Computational Intelligence. In chapter 1, Fabio Perez Marzullo et al. describe a model driven architecture (MDA) approach for assessing database performance. The authors present a profiling technique that offers a way to assess performance and identify flaws while performing software construction activities. In chapter 2, authors Huy Nguyen Anh Pham and Evangelos Triantaphyllou offer a new approach for testing classification algorithms, and present this approach through reanalysis of the Pima Indian diabetes dataset, one of the most well-known datasets used for this purpose. The new method put forth by the authors is dubbed the Homogeneity-Based Algorithm (HBA), and it aims to optimally control the overfitting and overgeneralization behaviors that have proved problematic for previous classification algorithms on this dataset.
Cognitive Intelligence with Neutrosophic Statistics in Bioinformatics investigates and presents the many applications that have arisen in the last ten years using neutrosophic statistics in bioinformatics, medicine, agriculture and cognitive science. This book will be very useful to the scientific community, appealing to audiences interested in fuzzy, vague concepts from which uncertain data are collected, including academic researchers, practicing engineers and graduate students. Neutrosophic statistics is a generalization of classical statistics. In classical statistics, the data is known, formed by crisp numbers. In comparison, data in neutrosophic statistics has some indeterminacy. This data may be ambiguous, vague, imprecise, incomplete, and even unknown. Neutrosophic statistics refers to a set of data, such that the data or a part of it are indeterminate in some degree, and to methods used to analyze the data.
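One common way to represent such indeterminate observations is as intervals, with a lower and upper bound standing in for a value that is not precisely known. A toy sketch (not taken from the book) of an endpoint-wise neutrosophic sample mean, where indeterminacy in the data carries through to the statistic:

```python
def neutrosophic_mean(data):
    """Mean of interval-valued (indeterminate) observations.
    Each observation is a (lower, upper) pair; the mean is computed
    endpoint-wise, so indeterminacy propagates into the result."""
    n = len(data)
    lo = sum(x[0] for x in data) / n
    hi = sum(x[1] for x in data) / n
    return (lo, hi)

# Three measurements; the second and third are partly indeterminate.
readings = [(5.0, 5.0), (4.0, 6.0), (3.0, 7.0)]
print(neutrosophic_mean(readings))  # (4.0, 6.0)
```

A crisp (classical) sample is the special case where every interval has identical endpoints, which is the sense in which neutrosophic statistics generalizes classical statistics.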
This book presents a comprehensive overview of the various aspects for the development of smart cities from a European perspective. It presents both theoretical concepts as well as empirical studies and cases of smart city programs and their capacity to create value for citizens. The contributions in this book are a result of an increasing interest for this topic, supported by both national governments and international institutions. The book offers a large panorama of the most important aspects of smart cities evolution and implementation. It compares European best practices and analyzes how smart projects and programs in cities could help to improve the quality of life in the urban space and to promote cultural and economic development.
This book is a timely document of state-of-the-art techniques in the domain of contact tracing applications. Well known in the field of medical science, this topic has recently received attention from governments, industries and academic communities due to the COVID-19 pandemic. This book provides a link between new proposals related to contact tracing applications and a contextual literature review primarily from the cryptologic viewpoint. As these applications are related to security and privacy of individuals, analyzing them from cryptologic viewpoint is of utmost importance. Therefore, present developments from cryptologic aspects of most proposals around the world, including Singapore, Europe, USA, Australia and India, have been discussed. Providing an in-depth study on the design rationale of each protocol, this book is of value to researchers, students and professionals alike.
Assertion-based design is a powerful new paradigm that is facilitating quality improvement in electronic design. Assertions are statements used to describe properties of the design (i.e., design intent) that can be included to actively check correctness throughout the design cycle and even the lifecycle of the product. With the appearance of two new languages, PSL and SVA, assertions have already started to improve verification quality and productivity. This is the first book that presents an under-the-hood view of generating assertion checkers, and as such provides a unique and consistent perspective on employing assertions in major areas, such as: specification, verification, debugging, on-line monitoring and design quality improvement.
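The idea of compiling an assertion into a checker that monitors execution can be illustrated outside of PSL or SVA. Below is a hypothetical Python sketch of a monitor for the temporal property "every req is acknowledged within a bounded number of cycles"; hardware checkers generated from PSL/SVA assertions work on the same principle, as small automata watching signals at each clock cycle:

```python
def make_checker(max_delay):
    """Build a checker for the temporal property
    'every req must be followed by an ack within max_delay cycles',
    analogous to an assertion compiled into a monitor circuit."""
    pending = []  # cycles remaining for each outstanding request

    def step(req, ack):
        nonlocal pending
        if ack and pending:
            pending.pop(0)              # oldest request is satisfied
        pending = [t - 1 for t in pending]
        if any(t < 0 for t in pending):
            return False                # property violated this cycle
        if req:
            pending.append(max_delay)
        return True

    return step

check = make_checker(2)
trace = [(1, 0), (0, 0), (0, 1), (0, 0)]  # req, then ack two cycles later
print(all(check(req, ack) for req, ack in trace))  # True: property holds
```

The names and the two-signal protocol here are assumptions chosen for illustration; the point is that a declarative property becomes a small piece of state machinery evaluated once per cycle, which is exactly what checker generation automates.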
"Soar: A Cognitive Architecture in Perspective" represents a European perspective on Soar with the exception of the special contribution from Allen Newell arguing for Unified Theories of Cognition. The various papers derive from the work of the Soar Research Group that has been active at the University of Groningen, The Netherlands, since 1987. The work reported here has been inspired in particular by two topics that precipitated the group's interest in Soar in the first place: - road user behaviour and the temporal organization of behaviour, more specifically planning. At the same time, the various contributions go beyond the simple use of Soar as a convenient medium for modelling human cognitive activity. In every paper one or more fundamental issues are raised that touch upon the very nature and consistency of Soar as an intelligent architecture. As a result the reader will learn about the operator implementation problem, chunking, multitasking, the need to constrain the depth of the goal stack, and induction etc. Soar is still at a relatively early stage of development. It does, nevertheless, constitute an important breakthrough in the area of computer architectures for general intelligence. Soar shows one important direction that future efforts to build intelligent systems should take if they aim for a comprehensive, and psychologically meaningful, theory of cognition. This is argued by Newell in his contribution to this volume. For this reason, the Soar system will probably play an important integrative role within cognitive science in bringing together important subdomains of psychology, computer science, linguistics, and the neurosciences. Although Soar is not the only "architecture for intelligence", it is one of the most advanced and theoretically best motivated architectures presently available. 
This work should be of special interest to researchers in the domains of cognitive science, computer science and artificial intelligence, cognitive psychology, and the philosophy of mind.
'Behavior' is an increasingly important concept in the scientific, societal, economic, cultural, political, military, living and virtual worlds. Behavior computing, or behavior informatics, consists of methodologies, techniques and practical tools for examining and interpreting behaviors in these various worlds. Behavior computing contributes to the in-depth understanding, discovery, applications and management of behavior intelligence. With contributions from leading researchers in this emerging field, Behavior Computing: Modeling, Analysis, Mining and Decision includes chapters on: representation and modeling of behaviors; behavior ontology; behavior analysis; behavior pattern mining; clustering complex behaviors; classification of complex behaviors; behavior impact analysis; social behavior analysis; organizational behavior analysis; and behavior computing applications. Behavior Computing: Modeling, Analysis, Mining and Decision provides a dedicated source of reference for the theory and applications of behavior informatics and behavior computing. Researchers, research students and practitioners in behavior studies, including the computer science, behavioral science, and social science communities, will find this state-of-the-art volume invaluable.
The electronics and information technology revolution continues, but it is a critical time in the development of technology. Once again, we stand on the brink of a new era where emerging research will yield exciting applications and products destined to transform and enrich our daily lives! The potential is staggering and the ultimate impact is unimaginable, considering the continuing marriage of technology with fields such as medicine, communications and entertainment, to name only a few. But who will actually be responsible for transforming these potential new products into reality? The answer, of course, is today's (and tomorrow's) design engineers! The design of integrated circuits today remains an essential discipline in support of technological progress, and the authors of this book have taken a giant step forward in the development of a practice-oriented treatise for design engineers who are interested in the practical, industry-driven world of integrated circuit design.
The two volumes IFIP AICT 545 and 546 constitute the refereed post-conference proceedings of the 11th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2017, held in Jilin, China, in August 2017. The 100 revised papers included in the two volumes were carefully reviewed and selected from 282 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture. The papers focus on four topics: Internet of Things and big data in agriculture, precision agriculture and agricultural robots, agricultural information services, and animal and plant phenotyping for agriculture.
This book is written for anyone who is interested in how a field of research evolves and the fundamental role of understanding uncertainties involved at different levels of analysis, ranging from macroscopic views to meso- and microscopic ones. We introduce a series of computational and visual analytic techniques, from research areas such as text mining, deep learning, information visualization and science mapping, such that readers can apply these tools to the study of a subject matter of their choice. In addition, we set the diverse set of methods in an integrative context that draws upon insights from philosophical, sociological, and evolutionary theories of what drives the advances of science, such that the readers of the book can guide their own research with their enriched theoretical foundations. Scientific knowledge is complex. A subject matter is typically built on its own set of concepts, theories, methodologies and findings, discovered by generations of researchers and practitioners. Scientific knowledge, as known to the scientific community as a whole, experiences constant changes. Some changes are long-lasting, whereas others may be short-lived. How can we keep abreast of the state of the art as science advances? How can we effectively and precisely convey the status of the current science to the general public as well as scientists across different disciplines? The study of scientific knowledge in general has been overwhelmingly focused on scientific knowledge per se. In contrast, the status of scientific knowledge at various levels of granularity has been largely overlooked. This book aims to highlight the role of uncertainties in developing a better understanding of the status of scientific knowledge at a particular time, and how its status evolves over the course of the development of research.
Furthermore, we demonstrate how the knowledge of the types of uncertainties associated with scientific claims serves as an integral and critical part of our domain expertise.
This is a book about a code and about coding. The code is a case study which has been used to teach courses in e-Science at the Australian National University since 2001. Students learn advanced programming skills and techniques in the Java language. Above all, they learn to apply useful object-oriented design patterns as they progressively refactor and enhance the software. We think our case study, EScope, is as close to real life as you can get! It is a smaller version of a networked, graphical, waveform browser which is used in the control rooms of fusion energy experiments around the world. It is quintessential "e-Science" in the sense of e-Science being "computer science and information technology in the service of science". It is not, specifically, "Grid-enabled", but we develop it in a way that will facilitate its deployment onto the Grid. The standard version of EScope interfaces with a specialised database for waveforms, and related data, known as MDSplus. On the accompanying CD, we have provided you with software which will enable you to install MDSplus, EScope and sample data files onto Windows or Linux computers. There is much additional software, including many versions of the case study as it gets built up and progressively refactored using design patterns. There will be a home web-site for this book which will contain up-to-date information about the software and other aspects of the case study.
This book provides an extensive overview of the diffusion of Information and Communication Technologies (ICTs) in developing countries between 2000 and 2012. It covers issues such as country-specific ICT diffusion patterns, technological substitution and technological convergence. By identifying social, economic and institutional prerequisites and analyzing critical country-specific conditions, the author develops a new approach to explaining the emergence of their technological takeoff. Readers will discover how developing countries are now adopting ICTs, rapidly catching up with the developed world in terms of ICT access and use.
The convergence of biology and computer science was initially motivated by the need to organize and process a growing number of biological observations resulting from rapid advances in experimental techniques. Today, however, close collaboration between biologists, biochemists, medical researchers, and computer scientists has also generated remarkable benefits for the field of computer science. Systemic Approaches in Bioinformatics and Computational Systems Biology: Recent Advances presents new techniques that have resulted from the application of computer science methods to the organization and interpretation of biological data. The book covers three subject areas: bioinformatics, computational biology, and computational systems biology. It focuses on recent, systemic approaches in computer science and mathematics that have been used to model, simulate, and more generally, experiment with biological phenomena at any scale.
This volume is a collation of original contributions from the key actors of a new trend in the contemporary theory of knowledge and belief, that we call "dynamic epistemology." It brings the works of these researchers under a single umbrella by highlighting the coherence of their current themes, and by establishing connections between topics that, up until now, have been investigated independently. It also illustrates how the new analytical toolbox unveils questions about the theory of knowledge, belief, preference, action, and rationality, in a number of central axes in dynamic epistemology: temporal, social, probabilistic and even deontic dynamics.
Recently, the emergence of wireless and mobile networks has made possible the admission of electronic commerce to a new application and research subject: mobile commerce, defined as the exchange or buying and selling of commodities, services, or information on the Internet through the use of mobile handheld devices. In just a few years, mobile commerce has emerged from nowhere to become the hottest new trend in business transactions. However, the prosperity and popularity of mobile commerce will be brought to a higher level only if information is securely and safely exchanged among end systems (mobile users and content providers). Advances in Security and Payment Methods for Mobile Commerce includes high-quality research papers and industrial and practice articles in the areas of mobile commerce security and payment from academics and industrialists. It covers research and development results of lasting significance in the theory, design, implementation, analysis, and application of mobile commerce security and payment.