As people use self-tracking devices and other digital technologies, they generate increasing quantities of personal information online. These data have many benefits, but they can also be accessed and exploited by third parties. Using rich examples from popular culture and empirical research, Deborah Lupton develops a fresh and intriguing perspective on how people make sense of and use their personal data, and what they know about others who use this information. Drawing on feminist new materialism theory and the anthropology of material culture, she acknowledges the importance of paying attention to embodied experiences, as well as discourses and ideas, in identifying the ways in which people make and enact data, and data make and enact people. Arguing that personal data are more-than-human phenomena, invested with diverse forms of vitalities, Lupton reveals significant implications for data futures, politics and ethics. Lupton's novel approach to understanding personal data will be of interest to students and scholars in media and cultural studies, sociology, anthropology, surveillance studies, information studies, cultural geography and science and technology studies.
As the world grows increasingly complex in matters of sustainability, culture, and technology, businesses and governments are searching for a form of problem solving that can effectively respond to unprecedented levels of ambiguity and disorder. Traditional "linear thinking" has been disparaged by the popular media as being inadequate for dealing with the global economic crisis. Traditional forms of marketing and product development have been rejected by businesses that need to find new ways of staying competitive in a global economy. Yet little has been offered as an alternative. It is not enough to demand that someone "be more innovative" without offering the tools to succeed. This book offers a way of thinking about complicated, multifaceted problems with a repeatable degree of success. Design synthesis methods can be applied in business to produce new and compelling products and services, or they can be applied in government with the goal of changing culture and bettering society. In both contexts, there is a need for timely and aggressive action. This text is intended to act as a practitioner's guide to using the magic of design to solve complex problems.
This book provides the foundations of both linear and nonlinear analysis necessary for understanding and working in twenty-first century applied and computational mathematics. In addition to the standard topics, this text includes several key concepts of modern applied mathematical analysis that should be, but are not typically, included in advanced undergraduate and beginning graduate mathematics curricula. This material is the introductory foundation upon which algorithm analysis, optimization, probability, statistics, differential equations, machine learning, and control theory are built. When used in concert with the free supplemental lab materials, this text teaches students both the theory and the computational practice of modern mathematical analysis. Foundations of Applied Mathematics, Volume 1: Mathematical Analysis includes several key topics not usually treated in courses at this level, such as uniform contraction mappings, the continuous linear extension theorem, Daniell-Lebesgue integration, resolvents, spectral resolution theory, and pseudospectra. Ideas are developed in a mathematically rigorous way and students are provided with powerful tools and beautiful ideas that yield a number of nice proofs, all of which contribute to a deep understanding of advanced analysis and linear algebra. Carefully thought out exercises and examples are built on each other to reinforce and retain concepts and ideas and to achieve greater depth. Associated lab materials are available that expose students to applications and numerical computation and reinforce the theoretical ideas taught in the text. The text and labs combine to make students technically proficient and to answer the age-old question, "When am I going to use this?"
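To give a flavor of the contraction mappings covered in that text, here is a minimal Python sketch (illustrative only, not taken from the book or its labs) of Banach fixed-point iteration, which converges whenever the mapping is a contraction near its fixed point:

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=100):
    """Iterate x <- f(x) until successive iterates are within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("did not converge")

# cos(x) is a contraction near its fixed point (the Dottie number).
root = fixed_point(math.cos, 1.0)
print(round(root, 6))  # 0.739085
```

The geometric convergence rate is governed by the contraction constant, here roughly |sin(0.739)| per step.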
This book presents a comprehensive, structured, up-to-date survey on instruction selection. The survey is structured according to two dimensions: approaches to instruction selection from the past 45 years are organized and discussed according to their fundamental principles, and according to the characteristics of the supported machine instructions. The fundamental principles are macro expansion, tree covering, DAG covering, and graph covering. The machine instruction characteristics introduced are single-output, multi-output, disjoint-output, inter-block, and interdependent machine instructions. The survey also examines problems that have yet to be addressed by existing approaches. The book is suitable for advanced undergraduate students in computer science, graduate students, practitioners, and researchers.
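Of the fundamental principles that the survey organizes, macro expansion is the simplest: each IR operation is translated in isolation through a template table, with no cross-operation optimization. A minimal Python sketch (the IR and target ISA here are hypothetical, purely for illustration):

```python
# Hypothetical three-address IR and toy ISA; all names are illustrative only.
MACROS = {
    "add": lambda d, a, b: [f"ADD {d}, {a}, {b}"],
    "mul": lambda d, a, b: [f"MUL {d}, {a}, {b}"],
    # A macro may expand to several instructions: subtraction on a
    # machine that only offers NEG and ADD.
    "sub": lambda d, a, b: [f"NEG {d}, {b}", f"ADD {d}, {a}, {d}"],
}

def select(ir):
    """Macro expansion: expand each IR tuple independently via the table."""
    out = []
    for op, dst, lhs, rhs in ir:
        out.extend(MACROS[op](dst, lhs, rhs))
    return out

prog = [("add", "r1", "a", "b"), ("sub", "r2", "r1", "c")]
print(select(prog))  # ['ADD r1, a, b', 'NEG r2, c', 'ADD r2, r1, r2']
```

Tree, DAG, and graph covering improve on this by matching patterns that span several IR operations at once.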
Many industry experts consider unsupervised learning the next frontier in artificial intelligence, one that may hold the key to general artificial intelligence. Since the majority of the world's data is unlabeled, conventional supervised learning cannot be applied. Unsupervised learning, on the other hand, can be applied to unlabeled datasets to discover meaningful patterns buried deep in the data, patterns that may be near impossible for humans to uncover. Author Ankur Patel shows you how to apply unsupervised learning using two simple, production-ready Python frameworks: Scikit-learn and TensorFlow using Keras. With code and hands-on examples, data scientists will identify difficult-to-find patterns in data and gain deeper business insight, detect anomalies, perform automatic feature engineering and selection, and generate synthetic datasets. All you need is programming and some machine learning experience to get started. This book shows you how to:
- Compare the strengths and weaknesses of the different machine learning approaches: supervised, unsupervised, and reinforcement learning
- Set up and manage machine learning projects end-to-end
- Build an anomaly detection system to catch credit card fraud
- Cluster users into distinct and homogeneous groups
- Perform semisupervised learning
- Develop movie recommender systems using restricted Boltzmann machines
- Generate synthetic images using generative adversarial networks
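The book builds on Scikit-learn and TensorFlow; as a dependency-light taste of the unsupervised idea, here is a tiny k-means clustering sketch in plain NumPy (not from the book), recovering two groups from unlabeled points:

```python
import numpy as np

def kmeans(X, centers, iters=50):
    """Tiny k-means: alternate nearest-centre assignment and centroid update."""
    centers = centers.astype(float).copy()
    for _ in range(iters):
        # Distance from every point to every centre, then nearest-centre labels.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centre to the mean of the points assigned to it.
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers))])
    return labels, centers

# Two well-separated blobs; k-means recovers them with no labels given.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
labels, _ = kmeans(X, np.array([X[0], X[-1]]))  # seed one centre per blob
print(int(labels[0]), int(labels[-1]))  # 0 1
```

Production implementations add smarter initialization (e.g. k-means++) and a convergence test rather than a fixed iteration count.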
What does it take to be the leader of a design firm or group? We often assume they have all the answers, but in this rapidly evolving industry they're forced to find their way like the rest of us. So how do good design leaders manage? If you lead a design group, or want to understand the people who do, this insightful book explores behind-the-scenes strategies and tactics from leaders of top design companies throughout North America. Based on scores of interviews he conducted over a two-year period, from small companies to massive corporations like ESPN, author Richard Banfield covers a wide range of topics, including:
- How design leaders create a healthy company culture
- Innovative ways of attracting and nurturing talent
- Creating productive workspaces and handling remote employees
- Staying on top of demands while making time for themselves
- Consistent patterns among vastly different leadership styles
- Techniques and approaches for keeping the work pipeline full
- Making strategic and tactical plans for the future
- Mistakes that design leaders made, and how they bounced back
Data-driven Artificial Intelligence (AI) and Machine Learning (ML) in digital pathology, radiology, and dermatology is very promising. In specific cases, for example in Deep Learning (DL), performance can even exceed that of human experts. However, in the context of medicine it is important for a human expert to verify the outcome. Consequently, there is a need for transparency and re-traceability in state-of-the-art solutions to make them usable for ethically responsible medical decision support. Moreover, big data is required for training, covering a wide spectrum of human diseases in different organ systems. These data sets must meet top-quality and regulatory criteria and must be well annotated for ML at patient, sample, and image level. Here biobanks play a central role, now and in the future, in providing large collections of high-quality, well-annotated samples and data. The main challenges are finding biobanks containing 'fit-for-purpose' samples, providing quality-related metadata, gaining access to standardized medical data and annotations, and mass scanning of whole slides, including efficient data management solutions.
In the forthcoming years, citizens of many countries will be provided with electronic identity cards. eID solutions may not only be used for passports, but also for communication with government authorities or local administrations, as well as for secure personal identification and access control in e-business. Further eID applications will be implemented in the healthcare sector. For some of these solutions we will not need a physical data carrier at all.
This book considers specific inferential issues arising from the analysis of dynamic shapes with the attempt to solve the problems at hand using probability models and nonparametric tests. The models are simple to understand and interpret and provide a useful tool to describe the global dynamics of the landmark configurations. However, because of the non-Euclidean nature of shape spaces, distributions in shape spaces are not straightforward to obtain. The book explores the use of the Gaussian distribution in the configuration space, with similarity transformations integrated out. Specifically, it works with the offset-normal shape distribution as a probability model for statistical inference on a sample of a temporal sequence of landmark configurations. This enables inference for Gaussian processes from configurations onto the shape space. The book is divided into two parts, with the first three chapters covering material on the offset-normal shape distribution, and the remaining chapters covering the theory of NonParametric Combination (NPC) tests. The chapters offer a collection of applications which are bound together by the theme of this book. They refer to the analysis of data from the FG-NET (Face and Gesture Recognition Research Network) database with facial expressions. For these data, it may be desirable to describe the dynamics of the expressions, to test whether there is a difference between the dynamics of two facial expressions, or to test which of the landmarks are most informative in explaining the pattern of an expression.
Provides a detailed analysis of the standards and technologies enabling applications for the wireless Internet of Things. The Wireless Internet of Things: A Guide to the Lower Layers presents a practitioner's perspective on the Internet of Things (IoT), focusing on the over-the-air interfaces used by applications such as home automation, sensor networks, smart grid, and healthcare. The author, a noted expert in the field, examines the IoT protocol stack, detailing the physical layer of the wireless links, as both a radio and a modem, and the media access control (MAC) that enables communication in congested bands. Focusing on low-power wireless personal area networks (WPANs), the text outlines the physical and MAC layer standards used by ZigBee, Bluetooth LE, Z-Wave, and Thread. The text deconstructs these standards and provides background including relevant communication theory, modulation schemes, and access methods. The author includes a discussion of Wi-Fi and gateways, and explores their role in IoT. He introduces radio topologies used in software-defined radio implementations for the WPANs. The book also discusses channel modelling and link budget analysis for WPANs in IoT. This important text:
- Introduces IEEE 802.15.4, ITU-T G.9959, and Bluetooth LE as physical layer technology standards enabling wireless IoT
- Takes a layered approach in order to cultivate an appreciation for the various standards that enable interoperability
- Provides clarity on wireless standards with particular focus on actual implementation
Written for IoT application and platform developers as well as digital signal processing, network, and wireless communication engineers, The Wireless Internet of Things: A Guide to the Lower Layers offers an inclusive overview of the complex field of wireless IoT, exploring its beneficial applications that are proliferating in a variety of industries.
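The link budget analysis mentioned above can be sketched in a few lines. This Python snippet (illustrative only; the parameter values are assumptions, not from the book) computes free-space path loss and received power for a WPAN-class link:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

def received_power_dbm(pt_dbm, gt_dbi, gr_dbi, distance_m, freq_hz):
    """Link budget, free space only: Pr = Pt + Gt + Gr - FSPL."""
    return pt_dbm + gt_dbi + gr_dbi - fspl_db(distance_m, freq_hz)

# A 0 dBm ZigBee-class transmitter at 2.4 GHz over 10 m, unity-gain antennas:
pr = received_power_dbm(0, 0, 0, 10, 2.4e9)
print(f"{pr:.1f} dBm")  # -60.1 dBm
```

Real WPAN budgets add fade margins and indoor path-loss exponents above the free-space value of 2.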
This book will change the way you think about problems. It focuses on creating solutions to all sorts of complex problems by taking a practical, problem-solving approach. It discusses not only what needs to be done, but also provides guidance and examples of how to do it. The book applies systems thinking to systems engineering and introduces several innovative concepts such as direct and indirect stakeholders and the Nine-System Model, which provides the context for the activities performed in the project, along with a framework for successful stakeholder management. A list of the figures and tables in this book is available at https://www.crcpress.com/9781138387935. FEATURES:
- Treats systems engineering as a problem-solving methodology
- Describes what tools systems engineers use and how they use them in each stage of the system lifecycle
- Discusses the perennial problem of poor requirements, defines the grammar and structure of a requirement, and provides a template for a good imperative construction statement and the requirements for writing requirements
- Provides examples of bad and questionable requirements and explains why they are bad and questionable
- Introduces new concepts such as direct and indirect stakeholders and the Shmemp!
- Includes the Nine-System Model and other unique tools for systems engineering
If you have ever worked on an Agile software development project, you know the importance of face-to-face communication. Having both business and IT professionals working together in the same room can become the critical success factor. But can Agile be successful when team members are scattered across rooms, buildings, regions, or even countries? Yes! By following the Design for Hybrid Agile Adoption (DH2A) approach, framework, and set of templates and tools explained in this book, you can implement successful Agile projects. This book contains three sections: Section I provides the basics of distributed Agile and DH2A, compares collocated with distributed Agile, and shares the rewards of following a distributed Agile approach. Section II dives into the DH2A methodology, with entire chapters dedicated to the Appraisal Segment, Estimation Segment, Planning Segment, and Implementation Segment. In addition, there is a chapter in Section II on the roles required to make DH2A a success. Section III focuses on the DH2A framework, with an emphasis on Project Management Office and Governance. Actual case studies are used to illustrate the many useful tools within this text.
Pathways to Excellence offers dynamic provision to fulfil the Computing and ICT elements of the Technology Experiences and Outcomes at level 3 of Curriculum for Excellence. The book acts as an access point to a range of textual and online digital resources designed to support delivery of the Outcomes in a motivating style that encourages active and collaborative learning throughout. A mapping grid ensures coverage of all experiences and outcomes, as well as literacy, numeracy and health & wellbeing requirements, and templates for cross-curricular projects are designed to: - involve individual and groupwork and include opportunities to experience and develop organisational and planning skills - incorporate the use of ICT to research, communicate, organise and present information - provide a range of opportunities to develop critical thinking and peer and self-evaluation skills - include checklists and criteria to aid assessment - connect with a wide range of other curricular areas - provide opportunities to contribute to numeracy and literacy outcomes. Accompanying Dynamic Learning materials offer Lesson Builder suggestions, interactive whiteboard activities, image support, weblinks, quizzes and teacher support.
Thorough, systematic introduction to serious cryptography, especially strong in the modern forms of cipher solution used by experts: Nihilist, grille, U.S. Army, key-phrase, multiple-alphabet, Gronsfeld, Porta, Beaufort, periodic ciphers and more. Simple and advanced methods. 166 specimens to solve, with solutions.
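Of the cipher families listed, the Gronsfeld is among the simplest to demonstrate: it is a Vigenere variant whose key is a string of digits. A short Python sketch (illustrative, not from the book):

```python
def gronsfeld(text, key, decrypt=False):
    """Gronsfeld cipher: shift each letter by the next digit of a numeric key."""
    out, i = [], 0
    for ch in text.upper():
        if ch.isalpha():
            shift = int(key[i % len(key)])
            if decrypt:
                shift = -shift
            out.append(chr((ord(ch) - 65 + shift) % 26 + 65))
            i += 1          # only letters consume key digits
        else:
            out.append(ch)  # punctuation and spaces pass through
    return "".join(out)

ct = gronsfeld("ATTACK AT DAWN", "31415")
print(ct)                                    # DUXBHN BX EFZO
print(gronsfeld(ct, "31415", decrypt=True))  # ATTACK AT DAWN
```

Because each digit limits the shift to 0-9, a Gronsfeld is weaker than a full Vigenere and yields to the same periodic-cipher attacks the book teaches.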
The use of computation and simulation has become an essential part of the scientific process. Being able to transform a theory into an algorithm requires significant theoretical insight, detailed physical and mathematical understanding, and a working level of competency in programming. This upper-division text provides an unusually broad survey of the topics of modern computational physics from a multidisciplinary, computational science point of view. Its philosophy is rooted in learning by doing (assisted by many model programs), with new scientific materials as well as with the Python programming language. Python has become very popular, particularly for physics education and large scientific projects. It is probably the easiest programming language to learn for beginners, yet is also used for mainstream scientific computing, and has packages for excellent graphics and even symbolic manipulations. The text is designed for an upper-level undergraduate or beginning graduate course and provides the reader with the essential knowledge to understand computational tools and mathematical methods well enough to be successful. As part of the teaching of using computers to solve scientific problems, the reader is encouraged to work through a sample problem stated at the beginning of each chapter or unit, which involves studying the text, writing, debugging and running programs, visualizing the results, and expressing in words what has been done and what can be concluded. Then there are exercises and problems at the end of each chapter for readers to work through on their own (with model programs given for that purpose).
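In the spirit of that learn-by-doing approach, here is a minimal Python example (illustrative only, not one of the book's model programs): forward-Euler integration of exponential decay, dN/dt = -lam*N, compared against the exact solution:

```python
import math

def euler_decay(n0, lam, t_end, steps):
    """Forward-Euler integration of dN/dt = -lam * N from t=0 to t_end."""
    dt = t_end / steps
    n = n0
    for _ in range(steps):
        n += dt * (-lam * n)   # one Euler step: N <- N + dt * N'
    return n

# Decay of 1000 nuclei with lam = 0.5 over t = 2; exact answer is 1000*exp(-1).
approx = euler_decay(1000, 0.5, 2.0, 10000)
exact = 1000 * math.exp(-1.0)
print(round(approx, 2), round(exact, 2))
```

Halving the step size roughly halves the error, the first-order behavior the text's sample problems ask students to observe.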
Genetic Algorithms in Java Basics is a brief introduction to solving problems using genetic algorithms, with working projects and solutions written in the Java programming language. This brief book will guide you step-by-step through various implementations of genetic algorithms and some of their common applications, with the aim of giving you a practical understanding that allows you to solve your own unique, individual problems. After reading this book you will be comfortable with the language-specific issues and concepts involved with genetic algorithms and you'll have everything you need to start building your own. Genetic algorithms are frequently used to solve highly complex real world problems and with this book you too can harness their problem solving capabilities. Understanding how to utilize and implement genetic algorithms is an essential tool in any respected software developer's toolkit. So step into this intriguing topic and learn how you too can improve your software with genetic algorithms, and see real Java code at work which you can develop further for your own projects and research. This book:
- Guides you through the theory behind genetic algorithms
- Explains how genetic algorithms can be used by software developers trying to solve a range of problems
- Provides a step-by-step guide to implementing genetic algorithms in Java
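The book's working code is in Java; purely to illustrate the core loop it teaches (selection, crossover, mutation), here is a compact Python sketch evolving bit-strings toward an all-ones target, the classic OneMax problem:

```python
import random

def onemax_ga(length=20, pop_size=30, generations=150, seed=42):
    """Minimal GA: elitism, tournament selection, one-point crossover,
    bit-flip mutation. Fitness is the number of 1-bits (OneMax)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = [max(pop, key=sum)]                   # elitism: keep current best
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 2), key=sum)   # tournament selection
            p2 = max(rng.sample(pop, 2), key=sum)
            cut = rng.randrange(1, length)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(length):                 # bit-flip mutation
                if rng.random() < 1.0 / length:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=sum)

best = onemax_ga()
print(sum(best))  # at or near the optimum of 20
```

Swapping in a different fitness function is all it takes to aim the same loop at a real problem, which is the pattern the book develops in Java.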
This book presents a collection of research papers that address the challenge of how to develop software in a principled way that, in particular, enables reasoning. The individual papers approach this challenge from various perspectives including programming languages, program verification, and the systematic variation of software. Topics covered include programming abstractions for concurrent and distributed software, specification and verification techniques for imperative programs, and development techniques for software product lines. With this book the editors and authors wish to acknowledge - on the occasion of his 60th birthday - the work of Arnd Poetzsch-Heffter, who has made major contributions to software technology throughout his career. It features articles on Arnd's broad research interests including, among others, the implementation of programming languages, formal semantics, specification and verification of object-oriented and concurrent programs, programming language design, distributed systems, software modeling, and software product lines. All contributing authors are leading experts in programming languages and software engineering who have collaborated with Arnd in the course of his career. Overall, the book offers a collection of high-quality articles, presenting original research results, major case studies, and inspiring visions. Some of the work included here was presented at a symposium in honor of Arnd Poetzsch-Heffter, held in Kaiserslautern, Germany, in November 2018.
A revealing look at how negative biases against women of color are embedded in search engine results and algorithms. Run a Google search for "black girls": what will you find? "Big Booty" and other sexually explicit terms are likely to come up as top search terms. But if you type in "white girls," the results are radically different. The suggested porn sites and un-moderated discussions about "why black women are so sassy" or "why black women are so angry" present a disturbing portrait of black womanhood in modern society. In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance, operating as a source for email, a major vehicle for primary and secondary school learning, and beyond, understanding and reversing these disquieting trends and discriminatory practices is of utmost importance. An original, surprising, and at times disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.
Many students, engineers, scientists and researchers have benefited from the practical, programming-oriented style of the previous editions of Programming the Finite Element Method, learning how to develop computer programs to solve specific engineering problems using the finite element method. This new fifth edition offers timely revisions that include programs and subroutine libraries fully updated to Fortran 2003, which are freely available online, and provides updated material on advances in parallel computing, thermal stress analysis, plasticity return algorithms, convection boundary conditions, and interfaces to third-party tools such as ParaView, METIS and ARPACK. As in the previous editions, a wide variety of problem solving capabilities are presented including structural analysis, elasticity and plasticity, construction processes in geomechanics, uncoupled and coupled steady and transient fluid flow, and linear and nonlinear solid dynamics. Key features:
- Updated to take into account advances in parallel computing, with new material on thermal stress analysis
- Programs use an updated version of Fortran 2003
- Includes exercises for students
- Accompanied by a website hosting software
Programming the Finite Element Method, Fifth Edition is an ideal textbook for undergraduate and postgraduate students in civil and mechanical engineering, applied mathematics and numerical analysis, and is also a comprehensive reference for researchers and practitioners. Further information and source codes described in this text can be accessed at the following web sites: www.inside.mines.edu/~vgriffit/PFEM5 for the serial programs from Chapters 4-11, and www.parafem.org.uk for the parallel programs from Chapter 12.
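The book's programs are in Fortran; to hint at what finite element programming involves at its smallest scale, here is an illustrative Python sketch (not from the book) assembling and solving a 1D Poisson problem with linear elements:

```python
import numpy as np

def solve_poisson_1d(n):
    """Linear-element FEM for -u'' = 1 on (0,1) with u(0)=u(1)=0.
    Returns interior nodal values; the exact solution is u = x(1-x)/2."""
    h = 1.0 / n
    m = n - 1                      # number of interior nodes
    K = np.zeros((m, m))
    F = np.full(m, h)              # load: integral of f=1 against each hat function
    for i in range(m):             # assemble the tridiagonal stiffness matrix
        K[i, i] = 2.0 / h
        if i > 0:
            K[i, i - 1] = K[i - 1, i] = -1.0 / h
    return np.linalg.solve(K, F)

u = solve_poisson_1d(8)
x = np.linspace(0, 1, 9)[1:-1]
print(np.max(np.abs(u - x * (1 - x) / 2)))  # effectively zero (machine precision)
```

For this particular problem the linear-element nodal values coincide with the exact solution, a well-known 1D result; the book's real programs extend the same assemble-and-solve pattern to 2D and 3D elasticity, plasticity, and dynamics.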
This revised edition covers all aspects of public health informatics and discusses the creation and management of an information technology infrastructure that is essential in linking state and local organizations in their efforts to gather data for surveillance and prevention. Public health officials will have to understand basic principles of information resource management in order to make the appropriate technology choices that will guide the future of their organizations. Public health continues to be at the forefront of modern medicine, given the importance of implementing a population-based health approach and of addressing chronic health conditions. This book provides informatics principles and examples of practice in a public health context. In doing so, it clarifies the ways in which newer information technologies will improve individual and community health status. This book's primary purpose is to consolidate key information and promote a strategic approach to information systems and development, making it a resource for use by faculty and students of public health, as well as the practicing public health professional. Chapter highlights include: The Governmental and Legislative Context of Informatics; Assessing the Value of Information Systems; Ethics, Information Technology, and Public Health; and Privacy, Confidentiality, and Security. Review questions are featured at the end of every chapter. Aside from its use for public health professionals, the book will be used by schools of public health, clinical and public health nurses and students, schools of social work, allied health, and environmental sciences.
Learn how to use R 4, write and save R scripts, read in and write out data files, use built-in functions, and understand common statistical methods. This in-depth tutorial includes key R 4 features including a new color palette for charts, an enhanced reference counting system (useful for big data), and new data import settings for text (as well as the statistical methods to model text-based, categorical data). Each chapter starts with a list of learning outcomes and concludes with a summary of any R functions introduced in that chapter, along with exercises to test your new knowledge. The text opens with a hands-on installation of R and CRAN packages for both Windows and macOS. The bulk of the book is an introduction to statistical methods (non-proof-based, applied statistics) that relies heavily on R (and R visualizations) to understand, motivate, and conduct statistical tests and modeling. Beginning R 4 shows the use of R in specific cases such as ANOVA analysis, multiple and moderated regression, data visualization, hypothesis testing, and more. It takes a hands-on, example-based approach incorporating best practices with clear explanations of the statistics being done. You will:
- Acquire and install R and RStudio
- Import and export data from multiple file formats
- Analyze data and generate graphics (including confidence intervals)
- Interactively conduct hypothesis testing
- Code multiple and moderated regression solutions
Who this book is for: programmers and data analysts who are new to R. Some prior experience in programming is recommended.
This book integrates the foundations of quantum computing with a hands-on coding approach to this emerging field; it is the first work to bring these strands together in an updated manner. This work is suitable for both academic coursework and corporate technical training. This volume comprises three books under one cover: Part I outlines the necessary foundations of quantum computing and quantum circuits. Part II walks through the canon of quantum computing algorithms and provides code on a range of quantum computing methods in current use. Part III covers the mathematical toolkit required to master quantum computing. Additional resources include a table of operators and circuit elements and a companion GitHub site providing code and updates. Jack D. Hidary is a research scientist in quantum computing and in AI at Alphabet X, formerly Google X. "Quantum Computing will change our world in unexpected ways. Everything technology leaders, engineers and graduate students need is in this book including the methods and hands-on code to program on this novel platform." -Eric Schmidt, PhD, Former Chairman and CEO of Google; Founder, Innovation Endeavors
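To give a concrete sense of the circuits covered in Part I, here is a tiny hand-rolled state-vector simulation in Python (illustrative only; the book itself uses established quantum frameworks) preparing a Bell state with a Hadamard followed by a CNOT:

```python
import numpy as np

# Single-qubit Hadamard and identity, plus the two-qubit CNOT matrix.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to the first qubit, then CNOT (control = first qubit).
state = np.array([1.0, 0.0, 0.0, 0.0])
state = np.kron(H, I) @ state
state = CNOT @ state

# Measurement probabilities in the computational basis: the Bell state
# (|00> + |11>)/sqrt(2) gives 0.5, 0, 0, 0.5.
probs = state ** 2
print(np.round(probs, 3))
```

Gates compose by matrix multiplication and single-qubit gates lift to multi-qubit registers via the Kronecker product, which is exactly the linear-algebra toolkit Part III of the book develops.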
This book serves as both a textbook and handbook on the benchmarking of systems and components used as building blocks of modern information and communication technology applications. It provides theoretical and practical foundations as well as an in-depth exploration of modern benchmarks and benchmark development. The book is divided into two parts: foundations and applications. The first part introduces the foundations of benchmarking as a discipline, covering the three fundamental elements of each benchmarking approach: metrics, workloads, and measurement methodology. The second part focuses on different application areas, presenting contributions in specific fields of benchmark development. These contributions address the unique challenges that arise in the conception and development of benchmarks for specific systems or subsystems, and demonstrate how the foundations and concepts in the first part of the book are being used in existing benchmarks. Further, the book presents a number of concrete applications and case studies based on input from leading benchmark developers from consortia such as the Standard Performance Evaluation Corporation (SPEC) and the Transaction Processing Performance Council (TPC). Providing both practical and theoretical foundations, as well as a detailed discussion of modern benchmarks and their development, the book is intended as a handbook for professionals and researchers working in areas related to benchmarking. It offers an up-to-date point of reference for existing work as well as latest results, research challenges, and future research directions. It also can be used as a textbook for graduate and postgraduate students studying any of the many subjects related to benchmarking. While readers are assumed to be familiar with the principles and practices of computer science, as well as software and systems engineering, no specific expertise in any subfield of these disciplines is required.
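The three fundamental elements named above (metrics, workloads, measurement methodology) can be seen in miniature in even the smallest harness. A Python sketch (illustrative only, not from the book) with warm-up runs excluded from the measurement and a robust summary metric:

```python
import statistics
import time

def benchmark(workload, warmup=3, repetitions=10):
    """Tiny harness: warm up, then time repeated runs and summarize.
    Warm-up runs are executed but excluded from the reported metric."""
    for _ in range(warmup):
        workload()
    samples = []
    for _ in range(repetitions):
        start = time.perf_counter()
        workload()
        samples.append(time.perf_counter() - start)
    return {"median_s": statistics.median(samples),
            "min_s": min(samples),
            "max_s": max(samples)}

# Workload: sort a reversed range; the median damps scheduler noise.
result = benchmark(lambda: sorted(range(10000, 0, -1)))
print(sorted(result))  # ['max_s', 'median_s', 'min_s']
```

Industrial benchmarks such as those from SPEC and TPC add precisely what this sketch lacks: standardized workloads, run rules, and reporting requirements that make results comparable across systems.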
You may like...
- Data Structures Using C++, D. Malik (Paperback)
- Oracle 12c - SQL, Joan Casteel (Paperback)
- BTEC First in I&CT Revision Workbook (Paperback)
- Discovering Computers 2018 - Digital…, Misty Vermaat, Steven Freund, … (Paperback)
- BTEC First in I&CT Revision Guide (Paperback)
- Introduction to Computer Theory, Daniel I. A. Cohen (Paperback)
- Systems Analysis and Design, Harry J. Rosenblatt, Scott Tilley (Hardcover)
- BTEC Level 3 National IT Student Book 1, Karen Anderson, Alan Jarvis, … (Paperback)
- C# Programming - From Problem Analysis…, Barbara Doyle (Paperback)
- Discovering Computers (c)2017, Jennifer Campbell, Mark Frydenberg, … (Paperback)