This book contains papers presented at the fifth and sixth Teraflop Workshops. It presents the state of the art in high-performance computing and simulation on modern supercomputer architectures, covering trends in hardware and software development in general and the future of vector-based systems and heterogeneous architectures in particular, as well as applications in computational fluid dynamics, fluid-structure interaction, physics, chemistry, astrophysics, and climate research.
Information systems have five main areas of research and practice in which humans relate to information and communications technology. Typically isolated from one another, these areas are: the nature of computers and information, the creation of information technologies, the development of artifacts for human use, the usage of information systems, and information technology as our environment. Philosophical Frameworks for Understanding Information Systems strives to develop philosophical frameworks for these five areas and provides researchers, scholars, and practitioners in fields such as information systems, public administration, library science, education, and business management with an exemplary reference resource.
Computational paleontology is, put simply, the application of computers and their capabilities to the field of paleontology. To be precise, however, the term is best described through the main themes of this motivating and attractive scientific field. The foremost aim of this book is to explain how computation can bring fossils to life and the past to the present. For paleontologists, computers save time and cost, help interpret puzzling events precisely and accurately, and make it possible to visualize ancient life vividly and convincingly.
Recent decades have witnessed the thriving development of new mathematical, computational and theoretical approaches, such as bioinformatics and neuroinformatics, to tackle fundamental issues in biology. These approaches no longer focus on individual units, such as nerve cells or genes, but rather on dynamic patterns of interactions between them. This volume explores these approaches in full, featuring contributions from a global group of experts, many of whom are pre-eminent in their fields.
This volume offers the advice of selected expert contributors on the application of heterogeneous methods for managing uncertainty and imprecision in databases. It contains both survey chapters on classic topics such as "flexible querying in databases" and up-to-date information on "database models to represent imperfect data." Further, it includes specific contributions on uncertainty management in database integration, and in representing and querying semistructured and spatial data.
Get the expert perspective and practical advice on big data. The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples from Nate Silver to Copernicus, and Apple to Blackberry, to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehensive and accessible guide to winning customers, beating competitors, and boosting the bottom line with big data. The marketplace has entered an era where the customer holds all the cards. With unprecedented choice in both the consumer world and the B2B world, it's imperative that businesses gain a greater understanding of their customers and prospects. Big data is the key to this insight, because it provides a comprehensive view of a company's customers: who they are, and who they may be tomorrow. The Big Data-Driven Business is a complete guide to the future of business as seen through the lens of big data, with expert advice on real-world applications.
* Learn what big data is, and how it will transform the enterprise
* Explore why major corporations are betting their companies on marketing technology
* Read case studies of big data winners and losers
* Discover how to change privacy and security, and remodel marketing
Better information allows for better decisions, better targeting, and better reach. Big data has become an indispensable tool for the most effective marketers in the business, and it's becoming less of a competitive advantage and more of an industry standard. Remaining relevant as the marketplace evolves requires a full understanding and application of big data, and The Big Data-Driven Business provides the practical guidance businesses need.
This is a collection of classic research papers on the Dempster-Shafer theory of belief functions. The book is the authoritative reference in the field of evidential reasoning and an important archival reference in a wide range of areas including uncertainty reasoning in artificial intelligence and decision making in economics, engineering, and management. The book includes a foreword reflecting the development of the theory in the last forty years.
This book presents some recent works on the application of Soft Computing techniques to information access on the World Wide Web. It comprises 15 chapters from internationally known researchers and is divided into four parts reflecting the research areas of the presented works: Document Classification, the Semantic Web, Web Information Retrieval, and Web Applications. The book demonstrates that Web Information Retrieval is a stimulating area of research in which Soft Computing technologies can be applied satisfactorily.
In 1984, Working Group 8.2 of the International Federation for Information Processing (IFIP) threw down the gauntlet at its Manchester conference, challenging the traditionalist orthodoxy with its uncommon research approaches and topics. Manchester 1984, followed by research methods conferences in Copenhagen (1990) and Philadelphia (1997), marked the growing legitimacy of the linguistic and qualitative turns in Information Systems research and played a key role in making qualitative methods a respected part of IS research. As evidenced by the papers in this volume, Working Group 8.2 conferences showcase fresh thinking, provocative sessions, and intellectual stimulation. The spirited, at times boisterous, and always enlivening debate has turned WG8.2 conferences into life-changing and discipline-changing inspirational events.
CSIE 2011 is an international scientific congress for distinguished scholars engaged in scientific, engineering and technological research, dedicated to building a platform for exploring and discussing the future of Computer Science and Information Engineering with existing and potential application scenarios. The congress has been held twice, first in Los Angeles, USA, and then in Changchun, China, each time attracting a large number of researchers from all over the world. The congress has developed a spirit of cooperation and new friendships, addressing a wide variety of ongoing problems in this vibrant area of technology and fostering collaboration around the world. The congress, CSIE 2011, received 2483 full paper and abstract submissions from 27 countries and regions around the world. Through a rigorous peer review process, all submissions were refereed based on their quality of content, level of innovation, significance, originality and legibility. Ultimately, 688 papers were accepted for the international congress proceedings.
This book provides a holistic perspective on Digital Twin (DT) technologies and presents cutting-edge research in the field. It assesses the opportunities that DTs can offer for smart cities, and covers the requirements for ensuring secure, safe and sustainable smart cities. Further, the book demonstrates that DTs and their benefits with regard to data visualisation, real-time data analytics, and learning leading to improved confidence in decision making; reasoning, monitoring and warning to support accurate diagnostics and prognostics; acting using edge control and what-if analysis; and connection with back-end business applications hold significant potential for applications in smart cities, by employing a wide range of sensory and data-acquisition systems in various parts of the urban infrastructure. The contributing authors reveal how and why DT technologies that are used for monitoring, visualising, diagnosing and predicting in real time are vital to cities' sustainability and efficiency. The concepts outlined in the book represent a city together with all of its infrastructure elements, which communicate with each other in a complex manner. Moreover, securing the Internet of Things (IoT), one of the key enablers of DTs, is discussed in detail and from various perspectives. The book offers an outstanding reference guide for practitioners and researchers in manufacturing, operations research and communications who are considering digitising some of their assets and related services. It is also a valuable asset for graduate students and academics who are looking to identify research gaps and develop their own proposals for further research.
Agda is an advanced programming language based on Type Theory. Agda's type system is expressive enough to support full functional verification of programs, in two styles. In external verification, we write pure functional programs and then write proofs of properties about them. The proofs are separate, external artifacts, typically using structural induction. In internal verification, we specify properties of programs through rich types for the programs themselves. This often necessitates including proofs inside code, to show the type checker that the specified properties hold. The power to prove properties of programs in these two styles is a profound addition to the practice of programming, giving programmers the ability to guarantee the absence of bugs and thus improve the quality of software more than was previously possible. Verified Functional Programming in Agda is the first book to provide a systematic exposition of external and internal verification in Agda, suitable for undergraduate students of Computer Science. No familiarity with functional programming or computer-checked proofs is presupposed. The book begins with an introduction to functional programming through familiar examples like booleans, natural numbers, and lists, and techniques for external verification. Internal verification is considered through the examples of vectors, binary search trees, and Braun trees. More advanced material on type-level computation, explicit reasoning about termination, and normalization by evaluation is also included. The book also includes a medium-sized case study on Huffman encoding and decoding.
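To make the two styles concrete, here is a minimal sketch, not taken from the book, written in Lean, a proof assistant in the same dependently typed tradition as Agda; the names double, double_mod_two, Vec and Vec.head are illustrative only. The first half shows external verification, where a property of an ordinary function is proved after the fact; the second half shows internal verification, where the property is carried by the type itself.

```lean
-- External verification: write a plain function, then prove a property
-- about it as a separate artifact.
def double (n : Nat) : Nat := 2 * n

theorem double_mod_two (n : Nat) : double n % 2 = 0 :=
  Nat.mul_mod_right 2 n

-- Internal verification: the length of the list lives in the type
-- (a vector), so the type checker enforces the property.
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : {n : Nat} → α → Vec α n → Vec α (n + 1)

-- Taking the head of an empty vector is ruled out by construction:
-- the argument type requires a length of at least one.
def Vec.head {α : Type} {n : Nat} : Vec α (n + 1) → α
  | .cons x _ => x
```

In Agda the same distinction appears with its own syntax: external proofs are written by structural induction over the data, while internally verified structures such as vectors, binary search trees, and Braun trees carry their invariants in their indexed types.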
For the first time, advances in semiconductor manufacturing do not lead to a corresponding increase in performance. At 65 nm and below it is predicted that only a small portion of the performance increase will be attributable to shrinking geometries, while the lion's share will be due to innovative processor architectures. To substantiate this assertion it is instructive to look at the major drivers of the semiconductor industry: wireless communications and multimedia. Both areas are characterized by an exponentially increasing demand for computational power, which cannot be provided in an energy-efficient manner by traditional processor architectures. Today's applications in wireless communications and multimedia require highly specialized and optimized architectures. New software tools and a sophisticated methodology above RTL are required to answer the challenges of designing an optimized application-specific processor (ASIP). This book offers an automated and fully integrated implementation flow and compares it to common implementation practice. Case studies emphasise that neither the architectural advantages nor the design space of ASIPs are sacrificed for an automated implementation. Realizing a building block which fulfils the requirements on programmability and computational power is now efficiently possible for the first time. Optimized ASIP Synthesis from Architecture Description Language Models inspires hardware designers as well as application engineers to design powerful ASIPs that will make their SoC designs unique.
The papers contained in this volume were presented at the 5th IFIP International Conference on Theoretical Computer Science (IFIP TCS), 7-10 September 2008, Milan, Italy. TCS is a bi-annual conference. The first conference of the series was held in Sendai (Japan, 2000), followed by Montreal (Canada, 2002), Toulouse (France, 2004) and Santiago (Chile, 2006). TCS is organized by IFIP TC1 (Technical Committee 1: Foundations of Computer Science) and Working Group 2.2 of IFIP TC2 (Technical Committee 2: Software: Theory and Practice). TCS 2008 was part of the 20th IFIP World Computer Congress (WCC 2008), constituting the TC1 Track of WCC 2008. The contributed papers were selected from 36+45 submissions from altogether 30 countries. A total of 14+16 submissions were accepted as full papers. Papers in this volume are original contributions in two general areas: Track A: Algorithms, Complexity and Models of Computation; and Track B: Logic, Semantics, Specification and Verification. The conference also included seven invited presentations, from Luca Cardelli, Thomas Ehrhard, Javier Esparza, Antonio Restivo, Tim Roughgarden, Grzegorz Rozenberg and Avraham Trakhtman. These presentations are included (except one) in this volume. In particular, Luca Cardelli, Javier Esparza, Antonio Restivo, Tim Roughgarden and Avraham Trakhtman accepted our invitation to write full papers related to their talks.
Networks have become nearly ubiquitous and increasingly complex, and their support of modern enterprise environments has become fundamental. Accordingly, robust network management techniques are essential to ensure optimal performance of these networks. This monograph treats the application of numerous graph-theoretic algorithms to a comprehensive analysis of dynamic enterprise networks. Network dynamics analysis yields valuable information about network performance, efficiency, fault prediction, cost optimization, indicators and warnings. Based on many years of applied research of generic network dynamics, this work covers a number of elegant applications (including many new and experimental results) of traditional graph theory algorithms and techniques to computationally tractable network dynamics analysis to motivate network analysts, practitioners and researchers alike. The material is also suitable for graduate courses addressing state-of-the-art applications of graph theory in analysis of dynamic communication networks, dynamic databasing, and knowledge management.
Set your students on track to achieve the best grade possible with My Revision Notes: AQA A-level Computer Science. Our clear and concise approach to revision will help students learn, practise and apply their skills and understanding. Coverage of key content is combined with practical study tips and effective revision strategies to create a guide that can be relied on to build both knowledge and confidence. With My Revision Notes: AQA A-level Computer Science, students can:
> Consolidate knowledge with clear, focused and relevant content coverage, based on what examiners are looking for
> Develop understanding with self-testing - our regular 'Now test yourself' tasks and answers will help commit knowledge to memory
> Improve technique through exam-style practice questions, expert tips and examples of typical mistakes to avoid
> Identify key connections between topics and subjects with our 'Learning links' focus
> Plan and manage a successful revision programme with our topic-by-topic planner, new exam breakdown feature, user-friendly definitions throughout and questions and answers online
An open process of restandardization, conducted by the IEEE, has led to the definition of the new VHDL standard. The changes make VHDL safer, more portable, and more powerful. VHDL also becomes bigger and more complete. The canonical simulator of VHDL is enriched by new mechanisms, the predefined environment is more complete, and the syntax is more regular and flexible. Discrepancies and known bugs of VHDL'87 have been fixed. However, the new VHDL'92 is compatible with VHDL'87, with some minor exceptions. This book presents the new VHDL'92 for the VHDL designer. New features are explained and classified. Examples are provided, each new feature is given a rationale, and its impact on design methodology and performance is analyzed. Where appropriate, pitfalls and traps are explained. The VHDL designer should quickly be able to find the feature needed, evaluate the benefits it brings, and modify previous VHDL'87 code to make it more efficient, more portable, and more flexible. This text should be a useful update for all VHDL designers and managers involved in electronic design.
For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical processes within the brain which correspond with certain forms of thought. Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction broadly surveys research in the Brain-Computer Interface domain. More specifically, each chapter articulates some of the challenges and opportunities for using brain sensing in Human-Computer Interaction work, as well as applying Human-Computer Interaction solutions to brain sensing work. For researchers with little or no expertise in neuroscience or brain sensing, the book provides background information to equip them not only to appreciate the state of the art, but ideally also to engage in novel research. For expert Brain-Computer Interface researchers, the book introduces ideas that can help in the quest to interpret intentional brain control and develop the ultimate input device. It challenges researchers to further explore passive brain sensing to evaluate interfaces and feed into adaptive computing systems. Most importantly, the book will connect multiple communities, allowing researchers to leverage one another's work and expertise and blaze into the future.
This book constitutes Part IV of the refereed four-volume post-conference proceedings of the 4th IFIP TC 12 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2010, held in Nanchang, China, in October 2010. The 352 revised papers presented were carefully selected from numerous submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including simulation models and decision-support systems for agricultural production, agricultural product quality testing, traceability and e-commerce technology, the application of information and communication technology in agriculture, and universal information service technology and service systems development in rural areas.
Modern electronic testing has a forty-year history. Test professionals hold some fairly large conferences and numerous workshops, have a journal, and there are over one hundred books on testing. Still, a full course on testing is offered only at a few universities, mostly by professors who have a research interest in this area. Apparently, most professors would not have taken a course on electronic testing when they were students. Other than the computer engineering curriculum being too crowded, the major reason cited for the absence of a course on electronic testing is the lack of a suitable textbook. For VLSI the foundation was provided by semiconductor device technology, circuit design, and electronic testing. In a computer engineering curriculum, therefore, it is necessary that foundations should be taught before applications. The field of VLSI has expanded to systems-on-a-chip, which include digital, memory, and mixed-signal subsystems. To our knowledge this is the first textbook to cover all three types of electronic circuits. We have written this textbook for an undergraduate "foundations" course on electronic testing. Obviously, it is too voluminous for a one-semester course and a teacher will have to select from the topics. We did not restrict such freedom because the selection may depend upon individual expertise and interests. Besides, there is merit in having a larger book that will retain its usefulness for the owner even after the completion of the course. With equal tenacity, we address the needs of three other groups of readers.
Graphs are widely used to represent structural information in the form of objects and connections between them. Graph transformation is the rule-based manipulation of graphs, an increasingly important concept in computer science and related fields. This is the first textbook treatment of the algebraic approach to graph transformation, based on algebraic structures and category theory. Part I is an introduction to the classical case of graph and typed graph transformation. In Part II basic and advanced results are first shown for an abstract form of replacement systems, so-called adhesive high-level replacement systems based on category theory, and are then instantiated to several forms of graph and Petri net transformation systems. Part III develops typed attributed graph transformation, a technique of key relevance in the modeling of visual languages and in model transformation. Part IV contains a practical case study on model transformation and a presentation of the AGG (attributed graph grammar) tool environment. Finally the appendix covers the basics of category theory, signatures and algebras. The book addresses both research scientists and graduate students in computer science, mathematics and engineering.
Third International Conference on Recent Trends in Information, Telecommunication and Computing - ITC 2012. ITC 2012 will be held during August 3-4, 2012, in Kochi, India. ITC 2012 aims to bring together innovative academics and industrial experts in the fields of Computer Science, Information Technology, Computational Engineering, and Communication in a common forum. The primary goal of the conference is to promote research and developmental activities in Computer Science, Information Technology, Computational Engineering, and Communication. Another goal is to promote the exchange of scientific information between researchers, developers, engineers, students, and practitioners.