Graphs are widely used to represent structural information in the form of objects and connections between them. Graph transformation is the rule-based manipulation of graphs, an increasingly important concept in computer science and related fields. This is the first textbook treatment of the algebraic approach to graph transformation, based on algebraic structures and category theory. Part I is an introduction to the classical case of graph and typed graph transformation. In Part II basic and advanced results are first shown for an abstract form of replacement systems, so-called adhesive high-level replacement systems based on category theory, and are then instantiated to several forms of graph and Petri net transformation systems. Part III develops typed attributed graph transformation, a technique of key relevance in the modeling of visual languages and in model transformation. Part IV contains a practical case study on model transformation and a presentation of the AGG (attributed graph grammar) tool environment. Finally the appendix covers the basics of category theory, signatures and algebras. The book addresses both research scientists and graduate students in computer science, mathematics and engineering.
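As a toy illustration of the rule-based view (a minimal sketch only; the book's algebraic, category-theoretic formalism is far more general), a graph-transformation rule can be read as "wherever this pattern matches, rewrite it". The rule below, which adds a shortcut edge for every two-step path, and the edge set it acts on are invented for illustration:

```python
def transitive_step(edges):
    """One rewrite step: for every matched pattern x -> y -> z in the graph,
    add the edge x -> z. A toy graph-transformation rule applied at every
    match; the graph is just a set of directed edges."""
    new = set(edges)
    for (x, y) in edges:
        for (u, z) in edges:
            if y == u:
                new.add((x, z))
    return new

edges = {("a", "b"), ("b", "c")}
edges = transitive_step(edges)
# the match a -> b -> c produces the new edge ("a", "c")
```

Iterating the step to a fixed point would compute the transitive closure, one of the simplest examples of a terminating graph-rewriting system.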
Computer science is the science of the future, and already underlies every facet of business and technology, and much of our everyday lives. In addition, it will play a crucial role in the science of the 21st century, which will be dominated by biology and biochemistry, similar to the role of mathematics in the physical sciences of the 20th century. In this award-winning best-seller, the author and his co-author focus on the fundamentals of computer science, which revolve around the notion of the "algorithm". They discuss the design of algorithms, their efficiency and correctness, the inherent limitations of algorithms and computation, quantum algorithms, concurrency, large systems, and artificial intelligence. Throughout, the authors, in their own words, stress the 'fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'. This version of the book is published to celebrate 25 years since its first edition, and in honor of the Alan M. Turing Centennial year. Turing was a true pioneer of computer science, whose work forms the underlying basis of much of this book.
This book is written for anyone who is interested in how a field of research evolves and in the fundamental role of understanding the uncertainties involved at different levels of analysis, ranging from macroscopic views to meso- and microscopic ones. We introduce a series of computational and visual analytic techniques from research areas such as text mining, deep learning, information visualization, and science mapping, so that readers can apply these tools to the study of a subject matter of their choice. In addition, we set this diverse set of methods in an integrative context that draws upon insights from philosophical, sociological, and evolutionary theories of what drives the advances of science, so that readers can guide their own research from an enriched theoretical foundation. Scientific knowledge is complex. A subject matter is typically built on its own set of concepts, theories, methodologies, and findings, discovered by generations of researchers and practitioners. Scientific knowledge, as known to the scientific community as a whole, experiences constant change. Some changes are long-lasting, whereas others may be short-lived. How can we keep abreast of the state of the art as science advances? How can we effectively and precisely convey the status of the current science to the general public as well as to scientists across different disciplines? The study of scientific knowledge in general has been overwhelmingly focused on scientific knowledge per se. In contrast, the status of scientific knowledge at various levels of granularity has been largely overlooked. This book aims to highlight the role of uncertainties in developing a better understanding of the status of scientific knowledge at a particular time, and of how that status evolves over the course of the development of research.
Furthermore, we demonstrate how the knowledge of the types of uncertainties associated with scientific claims serves as an integral and critical part of our domain expertise.
This book presents some recent work on the application of Soft Computing techniques to information access on the World Wide Web. The book comprises 15 chapters from internationally known researchers and is divided into four parts reflecting the research areas of the presented works: Document Classification, Semantic Web, Web Information Retrieval, and Web Applications. The book demonstrates that Web Information Retrieval is a stimulating area of research where Soft Computing technologies can be applied satisfactorily.
Networks have become nearly ubiquitous and increasingly complex, and their support of modern enterprise environments has become fundamental. Accordingly, robust network management techniques are essential to ensure optimal performance of these networks. This monograph treats the application of numerous graph-theoretic algorithms to a comprehensive analysis of dynamic enterprise networks. Network dynamics analysis yields valuable information about network performance, efficiency, fault prediction, cost optimization, indicators and warnings. Based on many years of applied research of generic network dynamics, this work covers a number of elegant applications (including many new and experimental results) of traditional graph theory algorithms and techniques to computationally tractable network dynamics analysis to motivate network analysts, practitioners and researchers alike. The material is also suitable for graduate courses addressing state-of-the-art applications of graph theory in analysis of dynamic communication networks, dynamic databasing, and knowledge management.
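As a minimal sketch of the kind of graph-theoretic building block such analysis rests on (plain breadth-first search rather than any specific algorithm from the monograph; the network topology and names below are hypothetical):

```python
from collections import deque

def hop_distances(adj, source):
    """Breadth-first search: hop counts from `source` over an undirected
    network given as an adjacency dict. Hop distances feed into common
    network-dynamics measures such as reachability and diameter."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:           # first visit = shortest hop count
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# hypothetical enterprise snapshot: a router, two switches, one host
net = {"router": ["sw1", "sw2"], "sw1": ["router", "hostA"],
       "sw2": ["router"], "hostA": ["sw1"]}
hop_distances(net, "router")  # {'router': 0, 'sw1': 1, 'sw2': 1, 'hostA': 2}
```

Re-running the same computation on successive snapshots of a changing topology is one simple way to observe network dynamics over time.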
Confronting the digital revolution in academia, this book examines the application of new computational techniques and visualisation technologies in the Arts & Humanities. Uniting differing perspectives, leading and emerging scholars discuss the theoretical and practical challenges that computation raises for these disciplines.
The two volumes IFIP AICT 545 and 546 constitute the refereed post-conference proceedings of the 11th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2017, held in Jilin, China, in August 2017. The 100 revised papers included in the two volumes were carefully reviewed and selected from 282 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture. The papers focus on four topics: Internet of Things and big data in agriculture, precision agriculture and agricultural robots, agricultural information services, and animal and plant phenotyping for agriculture.
Cognitive Intelligence with Neutrosophic Statistics in Bioinformatics investigates and presents the many applications that have arisen in the last ten years using neutrosophic statistics in bioinformatics, medicine, agriculture and cognitive science. This book will be very useful to the scientific community, appealing to audiences interested in fuzzy, vague concepts from which uncertain data are collected, including academic researchers, practicing engineers and graduate students. Neutrosophic statistics is a generalization of classical statistics. In classical statistics, the data is known, formed by crisp numbers. In comparison, data in neutrosophic statistics has some indeterminacy. This data may be ambiguous, vague, imprecise, incomplete, and even unknown. Neutrosophic statistics refers to a set of data, such that the data or a part of it are indeterminate in some degree, and to methods used to analyze the data.
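One common way to make the idea of indeterminate data concrete is to represent each observation as an interval, with a crisp value x written as the degenerate interval (x, x). This small sketch, an illustration rather than the book's full neutrosophic machinery, computes the interval in which the classical mean must lie:

```python
def interval_mean(data):
    """Mean of interval-valued (indeterminate) data. Each observation is a
    (low, high) pair; the result is the interval that is guaranteed to
    contain the classical mean of the underlying true values."""
    n = len(data)
    lo = sum(a for a, b in data) / n
    hi = sum(b for a, b in data) / n
    return (lo, hi)

# two crisp readings plus one reading known only to lie in [4, 6]
obs = [(2.0, 2.0), (3.0, 3.0), (4.0, 6.0)]
interval_mean(obs)  # (3.0, 3.666...)
```

When every observation is crisp the interval collapses to a point, recovering the classical mean; the width of the result quantifies how much the indeterminacy in the data propagates into the statistic.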
This book constitutes Part IV of the refereed four-volume post-conference proceedings of the 4th IFIP TC 12 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2010, held in Nanchang, China, in October 2010. The 352 revised papers presented were carefully selected from numerous submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including simulation models and decision-support systems for agricultural production, agricultural product quality testing, traceability and e-commerce technology, the application of information and communication technology in agriculture, and universal information service technology and service systems development in rural areas.
For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical processes within the brain which correspond with certain forms of thought. Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction broadly surveys research in the Brain-Computer Interface domain. More specifically, each chapter articulates some of the challenges and opportunities for using brain sensing in Human-Computer Interaction work, as well as applying Human-Computer Interaction solutions to brain sensing work. For researchers with little or no expertise in neuroscience or brain sensing, the book provides background information to equip them to not only appreciate the state-of-the-art, but also ideally to engage in novel research. For expert Brain-Computer Interface researchers, the book introduces ideas that can help in the quest to interpret intentional brain control and develop the ultimate input device. It challenges researchers to further explore passive brain sensing to evaluate interfaces and feed into adaptive computing systems. Most importantly, the book will connect multiple communities allowing research to leverage their work and expertise and blaze into the future.
Information systems have five main areas of research and practice in which humans relate to information and communications technology. Typically isolated from one another, these areas are: the nature of computers and information, the creation of information technologies, the development of artifacts for human use, the usage of information systems, and information technology as our environment. Philosophical Frameworks for Understanding Information Systems strives to develop philosophical frameworks for these five areas and provides researchers, scholars, and practitioners in fields such as information systems, public administration, library science, education, and business management with an exemplary reference resource.
Third International Conference on Recent Trends in Information, Telecommunication and Computing - ITC 2012. ITC 2012 will be held during Aug 03-04, 2012, in Kochi, India. The aim of ITC 2012 is to bring together innovative academics and industrial experts in the fields of Computer Science, Information Technology, Computational Engineering, and Communication in a common forum. The primary goal of the conference is to promote research and developmental activities in these fields. Another goal is to promote the exchange of scientific information between researchers, developers, engineers, students, and practitioners.
Get the expert perspective and practical advice on big data. The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples, from Nate Silver to Copernicus and from Apple to Blackberry, to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehensive and accessible guide to winning customers, beating competitors, and boosting the bottom line with big data. The marketplace has entered an era where the customer holds all the cards. With unprecedented choice in both the consumer world and the B2B world, it is imperative that businesses gain a greater understanding of their customers and prospects. Big data is the key to this insight, because it provides a comprehensive view of a company's customers: who they are, and who they may be tomorrow. The Big Data-Driven Business is a complete guide to the future of business as seen through the lens of big data, with expert advice on real-world applications.
* Learn what big data is, and how it will transform the enterprise
* Explore why major corporations are betting their companies on marketing technology
* Read case studies of big data winners and losers
* Discover how to change privacy and security, and remodel marketing
Better information allows for better decisions, better targeting, and better reach. Big data has become an indispensable tool for the most effective marketers in the business, and it is becoming less of a competitive advantage and more of an industry standard. Remaining relevant as the marketplace evolves requires a full understanding and application of big data, and The Big Data-Driven Business provides the practical guidance businesses need.
This book approaches macroprudential oversight from the viewpoint of three tasks, focusing on a tight integration of means for risk communication into analytical tools for risk identification and risk assessment. Generally, this book explores approaches for representing complex data concerning financial entities on low-dimensional displays. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. Accordingly, this book creates a Self-Organizing Financial Stability Map (SOFSM) and lays out a general framework for mapping the state of financial stability. Beyond external risk communication, the aim of the visual means is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence.
Ambient Intelligence (AmI) is an integrating technology for supporting a pervasive and transparent infrastructure for implementing smart environments. Such technology is used to enable environments for detecting events and behaviors of people and for responding in a contextually relevant fashion. AmI proposes a multi-disciplinary approach for enhancing human-machine interaction. Ambient Intelligence: A Novel Paradigm is a compilation of edited chapters describing current state-of-the-art and new research techniques, including those related to intelligent visual monitoring, face and speech recognition, innovative education methods, as well as smart and cognitive environments. The authors start with a description of the iDorm as an example of a smart environment conforming to the AmI paradigm, and introduce computer vision as an important component of the system. Other computer vision examples describe visual monitoring for the elderly, classic and novel surveillance techniques using clusters of cameras installed in indoor and outdoor application domains, and the monitoring of public spaces. Face and speech recognition systems are also covered, as well as enhanced LEGO blocks for novel educational purposes. The book closes with a provocative chapter on how a cybernetic system can be designed as the backbone of human-machine interaction.
Over the last thirty years an abundance of papers have been written on adaptive dynamic control systems. Nevertheless, it may now be predicted with confidence that adaptive mechanics, a new line of inquiry in the rapidly developing field of cybernetic mechanics, is emerging. The birth process falls far short of being completed. New problems, and methods for their solution, keep appearing in the framework of adaptive nonlinear dynamics. Therefore, the present work cannot be treated as a polished, brought-to-perfection school textbook. More likely, this is an attempt to present a number of well-known scientific results in the parametric synthesis of nonlinear systems (this, strictly speaking, accounts for the availability of many reviews), as well as to bring to notice the author's developments on this question, which are undoubtedly modern and topical. The nonlinear, and practically Lagrangian, systems cover a wide class of classical objects in theoretical mechanics, primarily solid-body (robotic, gyroscopic, rocket-space, and other) systems. And, rather importantly, they have a direct trend toward practical application. To delimit this discussion, I should note that it does not touch upon questions concerning linear and stochastic control objects. Investigated are only nonlinear deterministic systems under conditions where some system parameters are either unknown or beyond the reach of measurement, or execute an unknown, limited, and fairly smooth drift in time.
This volume in the series brings together renowned experts in the field to present the reader with an account of the latest developments in quantum mechanics, molecular dynamics, and the teaching of computational chemistry.
This comprehensive book draws together experts to explore how knowledge technologies can be exploited to create new multimedia applications, and how multimedia technologies can provide new contexts for the use of knowledge technologies. Thorough coverage of all relevant topics is given. The step-by-step approach guides the reader from fundamental enabling technologies of ontologies, analysis and reasoning, through to applications which have hitherto had less attention.
The book presents state of the art knowledge about Decision-Making Support Systems (DMSS). Its main goals are to generate a compendium of quality theoretical and applied papers on decision-making support systems, to help diffuse scarce knowledge about effective methods and strategies for successfully designing, developing, implementing, and evaluating decision-making support systems, and to create an awareness among academicians and practitioners about the relevance of decision-making support systems in the current complex and dynamic management environment. Decision-Making Support Systems: Achievements and Challenges for the New Decade is a comprehensive compilation of DMSS thought and vision, dealing with issues such as decision making concepts in organizations.
This book contains articles written by experts on a wide range of topics that are associated with the analysis and management of biological information at the molecular level. It contains chapters on RNA and protein structure analysis, DNA computing, sequence mapping, genome comparison, gene expression data mining, metabolic network modeling, and phyloinformatics. The important work of some representative researchers in bioinformatics is brought together for the first time in one volume. The topic is treated in depth and is related to, where applicable, other emerging technologies such as data mining and visualization. The goal of the book is to introduce readers to the principle techniques of bioinformatics in the hope that they will build on them to make new discoveries of their own. Contents: Exploring RNA Intermediate Conformations with the Massively Parallel Genetic Algorithm; Introduction to Self-Assembling DNA Nanostructures for Computation and Nanofabrication; Mapping Sequence to Rice FPC; Graph Theoretic Sequence Clustering Algorithms and their Applications to Genome Comparison; The Protein Information Resource for Functional Genomics and Proteomics; High-Grade Ore for Data Mining in 3D Structures; Protein Classification: A Geometric Hashing Approach; Interrelated Clustering: An Approach for Gene Expression Data Analysis; Creating Metabolic Network Models Using Text Mining and Expert Knowledge; Phyloinformatics and Tree Networks. Readership: Molecular biologists who rely on computers and mathematical scientists with interests in biology.
In 2002, the International Conference on Computer Aided Design (ICCAD) celebrates its 20th anniversary. This book commemorates contributions made by ICCAD to the broad field of design automation during that time. The foundation of ICCAD in 1982 coincided with the growth of Large Scale Integration. The sharply increased functionality of board-level circuits led to a major demand for more powerful Electronic Design Automation (EDA) tools. At the same time, LSI grew quickly and advanced circuit integration became widely available. This, in turn, required new tools, using sophisticated modeling, analysis and optimization algorithms in order to manage the ever more complex design processes. Not surprisingly, during the same period, a number of start-up companies began to commercialize EDA solutions, complementing various existing in-house efforts. The overall increased interest in Design Automation (DA) required a new forum for the emerging community of EDA professionals; one which would be focused on the publication of high-quality research results and provide a structure for the exchange of ideas on a broad scale. Many of the original ICCAD volunteers were also members of CANDE (Computer-Aided Network Design), a workshop of the IEEE Circuits and Systems Society. In fact, it was at a CANDE workshop that Bill McCalla suggested the creation of a conference for the EDA professional. (Bill later developed the name.)
The growing commercial market of the microwave/millimeter wave industry over the past decade has led to an explosion of interest and opportunities for the design and development of microwave components. The design of most microwave components requires the use of commercially available electromagnetic (EM) simulation tools for their analysis. In the design process, the simulations are carried out by varying the design parameters until the desired response is obtained. The optimization of design parameters by manual searching is a cumbersome and time-consuming process. Soft computing methods such as Genetic Algorithms (GA), Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have been widely used by EM researchers for microwave design over the last decade. The aim of these methods is to tolerate imprecision, uncertainty, and approximation in order to achieve a robust, low-cost solution in a small time frame. Modeling and optimization are essential parts of, and powerful tools for, microwave/millimeter wave design. This book deals with the development and use of soft computing methods for tackling challenging design problems in the microwave/millimeter wave domain. The aim in developing these methods is to obtain the design in a small time frame while improving its accuracy for a wide range of applications. To achieve this goal, a few diverse design problems of the microwave field, representing varied design challenges, such as different microstrip antennas, microwave filters, a microstrip via, and critical high-power components such as nonlinear tapers and RF windows, are considered as case-study design problems. Different design methodologies are developed for these applications. The book presents soft computing methods, reviews them for microwave/millimeter wave design problems, and works through specific case-study problems to give better insight and understanding of the subject.
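As a hedged sketch of the GA side of this toolkit (the cost function below is a stand-in for an EM simulator response, and all names and parameter choices are invented for illustration, not taken from the book):

```python
import random

def ga_minimize(cost, bounds, pop=30, gens=60, seed=1):
    """Tiny genetic algorithm: evolve real-valued design parameters to
    minimize `cost`. Selection keeps the better half, crossover averages
    two parents, and mutation perturbs one gene within its bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=cost)                      # best designs first
        elite = P[: pop // 2]                 # survivors (elitism)
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            j = rng.randrange(dim)                         # mutate one gene
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, (hi - lo) * 0.1)))
            children.append(child)
        P = elite + children
    return min(P, key=cost)

# hypothetical "simulated response error" with its optimum near (1.0, 2.0)
best = ga_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2,
                   [(-5, 5), (-5, 5)])
```

In a real design loop the lambda would be replaced by a call into an EM simulator, which is exactly why cheap surrogate models (ANN, fuzzy) are paired with GA search in this literature: each cost evaluation is otherwise expensive.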