Computer processing and internet communication have changed the way we learn, work, play and associate with each other. In the case of China, the introduction of the computer and the mass availability of the internet have boosted economic growth, sped up social progress, transformed the political landscape and changed the lifestyle of the Chinese people in profound ways. This new book discusses the influence that advances in computers and increased use of and dependence on the internet have had on China, with a particular focus on cyberspace governance.
Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers and graduate students in theoretical computer science, optimization, combinatorics, and related fields.
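For readers curious what such a data-reduction rule looks like in practice, here is a minimal, hedged sketch (not taken from the book) of one classical kernelization rule for Vertex Cover, often credited to Buss: any vertex of degree greater than k must belong to every cover of size at most k.

```python
# Illustrative sketch only -- not from the book. A classical kernelization
# (data-reduction) rule for Vertex Cover: a vertex of degree > k must be in
# any cover of size <= k, so take it into the cover and shrink the instance.

def vertex_cover_kernel(edges, k):
    """Return (reduced_edges, remaining_budget, forced_vertices)."""
    edges = {frozenset(e) for e in edges}
    forced = set()                       # vertices forced into the cover
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:                    # high-degree rule
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    return edges, k, forced

# Example: in a star with 5 leaves and k = 2, the centre is forced into the
# cover and the instance becomes empty.
print(vertex_cover_kernel([(0, i) for i in range(1, 6)], 2))  # (set(), 1, {0})
```

After exhaustive application of this rule (together with dropping isolated vertices), a yes-instance retains at most k^2 edges, which is the kind of kernel-size guarantee that kernelization analysis makes precise.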
Systems Analysis and Design, Eighth Edition offers a practical, visually appealing approach to information systems development.
High-throughput sequencing has revolutionised the field of biological sequence analysis. Its application has enabled researchers to address important biological questions, often for the first time. This book provides an integrated presentation of the fundamental algorithms and data structures that power modern sequence analysis workflows. The topics covered range from the foundations of biological sequence analysis (alignments and hidden Markov models), to classical index structures (k-mer indexes, suffix arrays and suffix trees), Burrows-Wheeler indexes, graph algorithms and a number of advanced omics applications. The chapters feature numerous examples, algorithm visualisations, exercises and problems, each chosen to reflect the steps of large-scale sequencing projects, including read alignment, variant calling, haplotyping, fragment assembly, alignment-free genome comparison, transcript prediction and analysis of metagenomic samples. Each biological problem is accompanied by precise formulations, providing graduate students and researchers in bioinformatics and computer science with a powerful toolkit for the emerging applications of high-throughput sequencing.
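As a purely illustrative aside (not drawn from the book), two of the classical index structures named above, a suffix array and a k-mer index, can be sketched naively in a few lines of Python; real sequencing tools use far more space- and time-efficient constructions.

```python
# Illustrative sketch only -- naive constructions of two index structures
# mentioned in the blurb. Production tools use linear-time / compressed versions.

def suffix_array(text):
    """Starting positions of all suffixes of `text`, in lexicographic order."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def kmer_index(text, k):
    """Map each k-mer of `text` to the list of positions where it occurs."""
    index = {}
    for i in range(len(text) - k + 1):
        index.setdefault(text[i:i + k], []).append(i)
    return index

genome = "ACGTACGTGACG"
print(suffix_array(genome))
print(kmer_index(genome, 3)["ACG"])   # [0, 4, 9]
```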
Cyberthreats to U.S. infrastructure and other assets are a growing concern to policymakers. Information and communications technology (ICT) is ubiquitous and relied upon for government services, corporate business processes, and individual professional and personal pursuits -- almost every facet of modern life. Many ICT devices and other components are interdependent, and disruption of one component may have a negative, cascading effect on others. A denial of service, theft or manipulation of data, or damage to critical infrastructure through a cyber-based attack could have significant impacts on national security, the economy, and the livelihood and safety of individual citizens. The federal legislative framework for cybersecurity is complex, with more than 50 statutes addressing various aspects of it either directly or indirectly. Many observers do not believe that the current framework is sufficient to address the growing concerns about the security of cyberspace in the United States; however, no major cybersecurity legislation has been enacted since 2002. This book provides an overview of the 2013 cybersecurity executive order and discusses considerations for Congress. It also examines ways to improve the cybersecurity of critical infrastructure.
Prolog, a language for logic programming, was one of the most intensively studied programming languages of the 1980s. During the same period, the data-flow model of parallel computation attracted a great deal of attention from researchers in computer science; hence, it was natural that several approaches were tried toward combining the two and implementing logic programs on parallel machines with a data-flow architecture. These approaches, however, were rather indirect ones, in the sense that they developed programs describing AND/OR-parallelism for deduction using a data-flow language and executed them on a data-flow computer, yet did not devise a 'direct' model for the parallel execution (reasoning) of a logic program. This book discusses fuzzy logic inferencing for Pong, dislog and SEProlog, and provides direct graphical representations of first-order logic for inference.
If you struggle with binary multiplication, or Big O Notation, this is the book for you. This textbook companion will help improve your essential maths skills for computer science, whichever awarding body specification you're following. You can use it throughout your course, whenever you feel you need some extra help.
- Develop your understanding of both maths and computer science with all worked examples and questions within a computer science context
- Improve your confidence with a step-by-step approach to every maths skill
- Measure your progress with guided and non-guided questions to see how you're improving
- Understand where you're going wrong with full worked solutions to every question
- Feel confident in expert guidance from experienced teachers and examiners
Written by Victoria Ellis and Gavin Craddock, and reviewed by Dr Kathleen Maitland, Senior Lecturer in Computing and Director of the SAS Student Academy at Birmingham City University.
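As a quick, hedged illustration (not an excerpt from the book) of one skill it names, binary multiplication can be carried out by shift-and-add, the binary analogue of long multiplication; counting the loop iterations also gives a simple Big O observation.

```python
# Illustrative sketch only -- binary multiplication by shift-and-add.
# For n-bit inputs the loop runs O(n) times: a simple Big O observation.

def binary_multiply(a_bits, b_bits):
    """Multiply two binary strings, e.g. '101' * '11' -> '1111' (5 * 3 = 15)."""
    a, b = int(a_bits, 2), int(b_bits, 2)
    result, shift = 0, 0
    while b:
        if b & 1:                  # if the current bit of b is set,
            result += a << shift   # add a shifted by that bit's position
        b >>= 1
        shift += 1
    return bin(result)[2:]

print(binary_multiply("101", "11"))   # '1111'
```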
Whether it's software, a cell phone, or a refrigerator, your customer wants -- no, expects -- your product to be easy to use. This fully revised handbook provides clear, step-by-step guidelines to help you test your product for usability. Completely updated with current industry best practices, it can give you that all-important marketplace advantage: products that perform the way users expect. You'll learn to recognize factors that limit usability, decide where testing should occur, set up a test plan to assess goals for your product's usability, and more.
Students are guided through the latest trends in computer concepts and technology in an exciting and easy-to-follow format. Updated for currency, DISCOVERING COMPUTERS provides the most up-to-date information on the latest technology in today's digital world.
Understanding and Troubleshooting Your PC from the Shelly Cashman Series (R) is designed to provide a basic understanding of how personal computers work. Created for the classroom, this text covers the basics of PC repair with an emphasis on troubleshooting and maintenance. Its full-color, extremely visual design will use conceptual art to help students learn about technical computer concepts.
Are you ready for embedded PP-DS? Advance your production planning and detailed scheduling with this comprehensive guide! Discover how the PP-DS integration model has been simplified with SAP S/4HANA. Then follow step-by-step instructions for configuring and running PP-DS in your system, from determining your requirements to monitoring your results. With details on advanced features, troubleshooting, and migration, this is your all-in-one PP-DS resource. In this book, you'll learn about:
a. Master Data: Walk through the PP-DS integration model, and see which master data objects are still required in SAP S/4HANA. Set up your master data so that all advanced features run smoothly in your system.
b. Configuration: Learn how to configure embedded PP-DS, step by step. Begin with basic settings for master data and SAP liveCache, and then move on to heuristics, the product view, the planning board, and more.
c. Execution: From leveraging planning, service, and scheduling heuristics to using the PP-DS optimizer, learn how to execute successful planning and scheduling runs in SAP S/4HANA.
Highlights include: 1) Master data 2) Configuration 3) Data transfer 4) Service and scheduling heuristics 5) Block planning 6) Shelf life planning 7) Push production 8) Alert monitor 9) Administration 10) Migration.
Combinatorial optimization is a multidisciplinary scientific area, lying at the interface of three major scientific domains: mathematics, theoretical computer science and management. The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area, dealing with fundamental notions and approaches as well as with several classical applications of combinatorial optimization. Concepts of Combinatorial Optimization is divided into three parts:
- On the complexity of combinatorial optimization problems, presenting basics about worst-case and randomized complexity;
- Classical solution methods, presenting the two best-known methods for solving hard combinatorial optimization problems, Branch-and-Bound and Dynamic Programming;
- Elements from mathematical programming, presenting fundamentals of mathematical programming-based methods that have been at the heart of Operations Research since the origins of the field.
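To make the second part concrete, here is an illustrative sketch (not taken from the book) of dynamic programming, one of the two classical methods it presents, applied to the 0/1 knapsack problem.

```python
# Illustrative sketch only -- dynamic programming for the 0/1 knapsack problem.

def knapsack(values, weights, capacity):
    """Best total value achievable without exceeding the weight capacity."""
    best = [0] * (capacity + 1)               # best[c] = optimum for capacity c
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):  # descend so each item is used once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack([60, 100, 120], [10, 20, 30], 50))   # 220
```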
Luciano Floridi presents a book that will set the agenda for the philosophy of information. PI is the philosophical field concerned with (1) the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation, and sciences, and (2) the elaboration and application of information-theoretic and computational methodologies to philosophical problems. This book lays down, for the first time, the conceptual foundations for this new area of research. It does so systematically, by pursuing three goals. Its metatheoretical goal is to describe what the philosophy of information is, its problems, approaches, and methods. Its introductory goal is to help the reader to gain a better grasp of the complex and multifarious nature of the various concepts and phenomena related to information. Its analytic goal is to answer several key theoretical questions of great philosophical interest, arising from the investigation of semantic information.
First published in 1958, John von Neumann's classic work The Computer and the Brain explored the analogies between computing machines and the living human brain. Von Neumann showed that the brain operates both digitally and analogically, but also has its own unique statistical language. And more than fifty years after its inception the von Neumann architecture - an organizational framework for computer design - still lies at the heart of today's machines. In his foreword to this new edition, Ray Kurzweil, a futurist famous for his own musings on the relationship between technology and consciousness, places von Neumann's work in a historical context and shows how it remains relevant today.
We -- the users turned creators and distributors of content -- are TIME's Person of the Year 2006, and AdAge's Advertising Agency of the Year 2007. We form a new Generation C. We have MySpace, YouTube, and OurMedia; we run social software, and drive the development of Web 2.0. But beyond the hype, what's really going on? In this groundbreaking exploration of our developing participatory online culture, Axel Bruns establishes the core principles which drive the rise of collaborative content creation in environments, from open source through blogs and Wikipedia to Second Life. This book shows that what's emerging here is no longer just a new form of content production, but a new process for the continuous creation and extension of knowledge and art by collaborative communities: produsage. The implications of the gradual shift from production to produsage are profound, and will affect the very core of our culture, economy, society, and democracy.
Bring your computer literacy course back to the BASICS. COMPUTER LITERACY BASICS: A COMPREHENSIVE GUIDE TO IC3 provides an introduction to computer concepts and skills, which maps to the newest Computing Core Certification (IC3) standards. Designed with new learners in mind, this text covers Computing Fundamentals, Key Applications, and Living Online - everything students need to pass the IC3 exam, and finish the course as confident computer users.
As the second volume of the "Digital Oil & Gas Pipeline: Research and Practice" series of monographs, this book introduces the implementation strategies, examples and technical roadmaps of two important aspects of Digital Oil & Gas Pipeline construction: pipeline real-time data integration and the pipeline network virtual reality system. Two examples of pipeline real-time data integration are elaborated: the integration of pipeline WebGIS (Geographic Information System) with pipeline SCADA (Supervisory Control and Data Acquisition) via OPC (OLE for Process Control) technology, and the integration of the pipeline network virtual reality system with pipeline SCADA via OPC, JNI (Java Native Interface) and SAI (Scene Access Interface). The pipeline network virtual reality system aims at virtual expression of the pipeline, interaction, and 3D visual management. It can be used for visual pipeline route design and planning, immersive pipeline industry training, remote visual supervision and control, etc. The implementation details of the pipeline network virtual reality system are also introduced, including 3D pipeline and terrain modeling with X3D (Extensible 3D) technology, improving large-scene display performance and speed in a network environment using LOD (Level of Detail) technology, interaction with virtual pipeline scenes, and pipeline 3D visual monitoring. The knowledge and experience delivered by this book will provide a useful reference for readers from the oil & gas pipeline, GIS, Virtual Reality and industrial control industries.
Hack your ride! In this volume of Make:, you'll find a 21-page special section on connected cars. You'll also see the world's cutest go-kart, DIY electric vehicles, and 12 bike mods, and learn about custom dashboard computing. And if you can't wait for the upcoming movie, build yourself a working, Star Wars-inspired BB-8 droid! This issue also features skill builders on spray paint, choosing the right battery, and working with sheet metal and rivets. On top of that, you'll find 40 projects, including a 3D-printed RC race car, the million-color flashlight, and a water balloon cannon.
This book, presented in three volumes, examines environmental disciplines in relation to major players in contemporary science: Big Data, artificial intelligence and cloud computing. Today, there is a real sense of urgency regarding the evolution of computer technology, the ever-increasing volume of data, threats to our climate and the sustainable development of our planet. As such, we need to reduce technology just as much as we need to bridge the global socio-economic gap between the North and South; between universal free access to data (open data) and free software (open source). In this book, we pay particular attention to certain environmental subjects, in order to enrich our understanding of cloud computing. These subjects are: erosion; urban air pollution and atmospheric pollution in Southeast Asia; melting permafrost (causing the accelerated release of soil organic carbon in the atmosphere); alert systems of environmental hazards (such as forest fires, prospective modeling of socio-spatial practices and land use); and web fountains of geographical data. Finally, this book asks the question: in order to find a pattern in the data, how do we move from a traditional computing model-based world to pure mathematical research? After thorough examination of this topic, we conclude that this goal is both transdisciplinary and achievable.
You may like...
Systems Analysis and Design
Harry J. Rosenblatt, Scott Tilley
Hardcover
Discovering Computers, Essentials…
Susan Sebok, Steven Freund, …
Paperback
Discovering Computers 2018 - Digital…
Misty Vermaat, Steven Freund, …
Paperback
Discovering Computers (c)2017
Jennifer Campbell, Mark Frydenberg, …
Paperback