Drawing on the innovative concept of Organizational IQ and a study of companies in seventeen countries, Survival of the Smartest charts a course for managers to follow into the twenty-first century. At the heart of the book is the authors' tool for assessing an organization's future health, which they call Organizational IQ. It measures a company's ability to quickly process information and make effective decisions. As industry clock speeds accelerate everywhere, a high IQ has become a prerequisite for survival; the low-IQ companies that the authors studied have already vanished. Case studies from Hewlett-Packard, British Petroleum, Sun Microsystems, and Chrysler, among others, illustrate how companies can improve their Organizational IQs. How did Hewlett-Packard become the dominant player in printing? How did British Petroleum transform itself from a stodgy behemoth into the most agile and competitive player in the oil industry? How did Chrysler rise from the brink of bankruptcy to become the auto industry's prized asset? In these companies, technology by itself played only a secondary role: to be successful, the entire organization had to become smarter. The authors show how key strategic decisions turned around these companies' Organizational IQs—and with them, their fortunes. A detailed company case study takes you in slow motion through the different steps you can take to improve the IQ of your own organization.

Survival of the Smartest offers a rare blend of a coherent framework, in-depth company case studies, a sound research base, and a detailed, step-by-step implementation example. Based on a landmark study of 164 organizations worldwide, conducted as part of a partnership between Stanford University, McKinsey & Company, and the University of Augsburg, Organizational IQ is proving to be the acid test for the success or failure of companies around the world.

"Survival of the Smartest is a practical and action-oriented road map for managing in the Information Age. It can save you much pain by tapping into the experience of managers who have done it before."—Eric Benhamou, CEO, 3Com

"Most executives acknowledge that they need to prepare their organizations for the challenges of the 21st century. But how do you know where you stand and what are the most important levers for improvement? Survival of the Smartest is a must-read for any manager who is looking for answers to these questions."—Guenter Mooshammer, Senior Vice President, Vishay Semiconductor

"Mendelson and Ziegler brilliantly capture the character of organizations smart enough to function successfully in a turbulent world. Loaded with powerful concepts and examples of how companies have successfully applied them, Survival of the Smartest will teach you how to create a "High-IQ" company and avoid the traps that make good companies mediocre."—Jerry Porras, Lane Professor of Organizational Behavior, Stanford Business School, and coauthor, Built to Last

"Professor Mendelson and Dr. Ziegler make it abundantly clear that if your organization is not on the path to becoming a high-IQ company, then it is probably on the road to extinction. This book packs a lot of wisdom that is eminently useful to companies large and small, hi-tech and low-tech. With a solid research base at Stanford Business School and loads of real-world experience, Survival of the Smartest captures essential truths for management in the Information Age."—Charles Fine, Professor, Sloan School of Management, MIT, and author of Clockspeed: Winning Industry Control in the Age of Temporary Advantage

"This book shows how winning information-age firms can turn speed from a potential problem into a weapon and turn a flood of information into a competitive advantage."—Scott Cook, cofounder, Intuit, Inc.

"Survival of the Smartest is a comprehensive set of tools for measuring the IQ of your organization and systematically improving an organization's ability to learn. I found the book to be both a pragmatic and practical way to hone the competitiveness of any company."—John McHugh, General Manager, Hewlett-Packard ProCurve Networking Business
Digital transformation is a multidimensional concept and involves many moving parts. Successful digital transformation requires a fresh approach to harnessing people, processes, technology, and data to develop new business models and digital ecosystems. One main barrier can be an overemphasis on applying technology to expand the business rather than transforming people's mindsets to do things differently. It is therefore important to develop a holistic view of these parts and assemble them to foster the right conditions for digital transformation to happen. Business leaders and executives must be equipped with a wide range of digital competencies to thrive in a rapidly changing digital environment. Digital Transformation: Strategy, Execution, and Technology provides an overall view of the strategy, execution, and technology for organizations aiming to transform digitally. It offers insights on how to become more successful in the digital age by explaining the importance and relevance of the various building blocks which form the foundation of a digital organization, and it shows the reader how to develop these building blocks as part of the digital transformation journey from both a business and a technical perspective. Highlights of the book include:
* Digital transformation strategy
* Digital governance and risk management
* Digital organization and change management
* Experimental learning and design thinking
* Digital product management
* Agile and DevSecOps
* Digital enterprise architecture
* Business applications of digital technology
This practical guide is written with business and information technology professionals and digital transformation practitioners in mind. It is also suitable for students pursuing postgraduate degrees and participants attending executive education programs in business and information technology.
This volume presents a programming model, similar to object-oriented programming, that imposes a strict discipline on the form of the constituent objects and interactions among them. Concurrency considerations have been eliminated from the model itself and are introduced only during implementation, thereby freeing programmers from dealing with concurrency explicitly. Moreover, the resulting software designs are typically more modular and easier to analyze than the more traditional ones. Numerous examples illustrate various aspects of the model and reveal that a few simple, integrated features are adequate for designing complex applications. Topics and features:
* Presents a simple, easy-to-understand multiprogramming model
* Provides extensive development of the underlying theory
* Emphasizes program composition, thereby making possible programming of large systems through modular designs
* Eliminates explicit concurrency considerations during program design
* Supplies efficient implementation schemes for distributed platforms
This book addresses the problem of developing complex distributed applications on wide-area networks, such as the Internet and World Wide Web, by using effective program design principles. Computer scientists, computer engineers, and software engineers will find the book an authoritative guide to large-scale multiprogramming.
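The blurb's central idea, writing objects as ordinary sequential code and introducing concurrency only at the implementation layer, can be illustrated with a small generic sketch. The Python example below is not the book's model or notation; the Monitor base class, the atomic_action decorator, and the Counter object are hypothetical names invented purely to illustrate the principle.

```python
import threading
from functools import wraps

class Monitor:
    """Implementation-layer base class: gives every object a private lock."""
    def __init__(self):
        self._lock = threading.Lock()

def atomic_action(method):
    """Runs a method as a single atomic action under the object's lock.

    The method body is written as ordinary sequential code; mutual
    exclusion is supplied here, at implementation time.
    """
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        with self._lock:
            return method(self, *args, **kwargs)
    return wrapper

class Counter(Monitor):
    """Authored with no explicit concurrency considerations."""
    def __init__(self):
        super().__init__()
        self.value = 0

    @atomic_action
    def increment(self):
        self.value += 1  # plain sequential read-modify-write

if __name__ == "__main__":
    c = Counter()
    workers = [threading.Thread(target=lambda: [c.increment() for _ in range(10_000)])
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(c.value)  # 40000 despite concurrent callers
```

The point of the sketch is the division of labour: Counter.increment contains no locks or threads, while the implementation-level wrapper supplies mutual exclusion when the object is deployed in a concurrent setting.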
* Covers fundamental concepts and framework of CPS
* Reviews CPS and construction project management
* Reviews CPS applications in construction sector
* Covers CPS and IoT integrated network
* Reviews challenges and security aspects of construction CPS
Explores how social media defines consumer behaviour. Discovers how social media works to keep the user always on. Reviews why social media can shape a more extreme political and cultural ideology in users. Studies how social media algorithms can shape a predictable and homogeneous culture. Develops critical and multidisciplinary thinking about the impact of social media in shaping a predictable society.
This book is the first of its kind to introduce the integration of ethics, laws, risks, and policies in cyberspace. The book provides an understanding of the ethical and legal aspects of cyberspace along with the risks involved. It also addresses current and proposed cyber policies, serving as a summary of state-of-the-art cyber laws in the United States. It also, importantly, incorporates various risk management and security strategies from a number of organizations. Using easy-to-understand language and incorporating case studies, the authors begin with the consideration of ethics and law in cybersecurity and then go on to take into account risks and security policies. The section on risk covers identification, analysis, assessment, management, and remediation. The very important topic of cyber insurance is covered as well: its benefits, types, coverage, etc. The section on cybersecurity policy acquaints readers with the role of policies in cybersecurity and how they are being implemented by means of frameworks. The authors provide a policy overview followed by discussions of several popular cybersecurity frameworks, such as NIST, COBIT, PCI/DSS, the ISO series, etc.
Teaching at Scale explores the characteristics and parameters of large-scale online learning by identifying, in its perceived drawbacks, a wealth of educational opportunities in disguise. Scalable learning platforms have exploded in popularity over recent years, with MOOCs (massive open online courses), online degree programs, informal learning communities, and alternative credentials all drawing significant enrollments. But, as many educators are asking, are the challenges to delivering education at scale too great and the compromises too many? This book guides instructors to leverage their complex responsibilities - open-ended assessments at scale, individuated feedback to students, academic integrity in less controlled environments, and more - into significant assets. Informed by real-world institutional experience as well as key research in cognitive science and the learning sciences, each chapter provides practical strategies for educators and administrators seeking to solve problems and fulfill the high-quality, broad-access potential of large-scale instruction for lifelong learners.
This book offers a straightforward guide to the fundamental work of governing bodies and the people who serve on them. The aim of the book is to help every member serving on a governing body understand and improve their contribution to the entity and governing body they serve. The book is rooted in research, including five years' work by the author as a Research Fellow of Nuffield College, Oxford.
Guides the reader into the mysteries of water. Provides a state-of-the-art overview in computer simulations and experiments on water. Brings together leading scientists in the field of water.
The text provides a comprehensive overview of the design aspects of internet of things devices and covers the fundamentals of big data and data science. It explores various scenarios, such as what middleware and frameworks are available and how to build a stable, standards-based, and secure internet of things device. It discusses important concepts including embedded programming techniques, machine-to-machine architecture, and the internet of things for smart city applications. It will serve as an ideal design book for professionals, senior undergraduate students, and graduate students in fields including electrical engineering, electronics and communication engineering, and computer engineering. The book:
* Covers applications and architecture needed to deliver solutions to end customers and readers.
* Discusses practical aspects of implementing the internet of things in diverse areas including manufacturing and software development.
* Highlights big data concepts and embedded programming techniques.
* Presents technologies including machine to machine, integrated sensors, and radio-frequency identification.
* Introduces the global system for mobile communication and precise details of standards based on internet of things architecture models.
The book focuses on practical design aspects, such as how to finalize a processor integrated circuit and which operating system to use, in a single volume. It will serve as an ideal text for professionals, senior undergraduate students, and graduate students in diverse engineering domains including electrical, electronics and communication, and computer engineering.
The availability of packaged clustering programs means that anyone with data can easily do cluster analysis on it. But many users of this technology don't fully appreciate its many hidden dangers. In today's world of "grab and go algorithms," part of my motivation for writing this book is to provide users with a set of cautionary tales about cluster analysis, for it is very much an art as well as a science, and it is easy to stumble if you don't understand its pitfalls. Indeed, it is easy to trip over them even if you do! The parenthetical word "usually" in the title is very important, because all clustering algorithms can and do fail from time to time. Modern cluster analysis has become so technically intricate that it is often hard for the beginner or the non-specialist to appreciate and understand its many hidden dangers. Here's how Yogi Berra put it, and he was right: "In theory there's no difference between theory and practice. In practice, there is."

This book is a step backwards, to four classical methods for clustering in small, static data sets that have all withstood the test of time. The youngest of the four methods is now almost 50 years old:
* Gaussian Mixture Decomposition (GMD, 1898)
* SAHN Clustering (principally single linkage (SL, 1909))
* Hard c-means (HCM, 1956, also widely known as "k-means")
* Fuzzy c-means (FCM, 1973, reduces to HCM in a certain limit)
The dates are the first known writing (to me, anyway) about these four models. I am (with apologies to Marvel Comics) very comfortable in calling HCM, FCM, GMD, and SL the Fantastic Four.

Cluster analysis is a vast topic. The overall picture in clustering is quite overwhelming, so any attempt to swim at the deep end of the pool in even a very specialized subfield requires a lot of training. But we all start out at the shallow end (or at least that's where we should start!), and this book is aimed squarely at teaching toddlers not to be afraid of the water. There is no section of this book that, if explored in real depth, cannot be expanded into its own volume. So, if your needs are for an in-depth treatment of all the latest developments in any topic in this volume, the best I can do - what I will try to do anyway - is lead you to the pool, and show you where to jump in.
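Two of the book's Fantastic Four lend themselves to a compact illustration. The sketch below is a generic fuzzy c-means loop written in Python with NumPy for this listing; it is not taken from the book, and the function name, synthetic data, and parameter choices are assumptions. As the fuzzifier m approaches 1, the membership matrix becomes essentially crisp, which is the sense in which FCM reduces to hard c-means (k-means).

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means (FCM): alternate between membership and center
    updates. A teaching sketch, not a library-grade implementation."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial membership matrix U (n points x c clusters), rows sum to 1.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]           # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                            # inverse-distance weights
        U /= U.sum(axis=1, keepdims=True)                         # renormalize each row
    return centers, U

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two well-separated synthetic blobs.
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    centers, U = fuzzy_c_means(X, c=2, m=2.0)
    print("centers:\n", centers)
    print("hard labels (argmax of memberships):", np.argmax(U, axis=1)[:10])
```

Taking the argmax of each row of U recovers the hard partition that HCM would produce on well-separated data; keeping the full membership matrix is what distinguishes the fuzzy variant.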
Project or program health checks provide tremendous value to businesses and pay for themselves by multiples of magnitude. No matter how well a project or program is performing, there are always activities that can provide better value, reduce costs, or introduce more innovation. IT project and program health checks can help organizations reach their goals and dramatically improve Return on Investment (ROI). IT Project Health Checks: Driving Successful Implementation and Multiples of Business Value offers a proven approach for evaluating IT projects or programs in order to determine how they are performing and how the eventual outcome for the initiative is currently trending. The project or program health checks provide a set of techniques that produce actionable recommendations that can be applied for any combination of the following outcomes:
* Drive more business and technical value from a program
* Set a project or program back on track for successful implementation as defined by executive management
* Rescue a program that is heading towards failure
* Act as additional insurance for initiatives that are too important to fail
* Protect executive careers by creating transparency within the inner workings of complex initiatives
The book shows how a review can quickly identify whether an initiative needs to be rescued even when the project team is not aware that it is hurtling towards failure. It also provides techniques for driving business value even when a project team believes it's been stretched as much as possible. Other outcomes covered in this book include:
* Objectively develop a project Health-Check Scorecard that establishes how well a project is doing and the direction it is headed
* Demonstrate how to drive business value from an IT program regardless of how well or badly it is tracking
* Provide surgical advice to improve a project's outcome
* How to use the many templates and sample deliverables to get a quick start on your own health check
Designed to provide significant value to any member of a project team, program team, stakeholders, sponsors, business users, system integrators, trainers, and IT professionals, this book can help find opportunities to drive multiples of business value and exceed project success metrics.
* Highlights solutions for security and privacy of Big Data
* Offers use-case studies for trustworthy computing
* Discusses different secure Big Data architectures
* Covers aspects and analysis of Big Data platforms
* Provides a trans-disciplinary approach for distributed systems and Big Data
The Lean Approach to Digital Transformation: From Customer to Code and From Code to Customer is organized into three parts that expose and develop the three capabilities that are essential for a successful digital transformation:
1. Understanding how to co-create digital services with users, whether they are customers or future customers. This ability combines observation, dialogue, and iterative experimentation. The approach proposed in this book is based on the Lean Startup approach, according to an extended vision that combines Design Thinking and Growth Hacking. Companies must become truly "customer-centric", from observation and listening to co-development. The revolution of the digital age of the 21st century is that customer orientation is more imperative than ever -- because of the era of abundance, the pace of change in usage, the complexity of experiences, and the shift of power towards communities -- and easier to achieve, using digital tools and digital communities.
2. Developing an information system (IS) that is the backbone of the digital transformation -- called an "exponential information system" to designate an open IS (in particular at its borders), capable of interfacing and combining with external services, positioned as a player in software ecosystems, and built for processing scalable and dynamic data flows. The exponential information system is constantly changing and continuously absorbs the best of information processing technology, such as Artificial Intelligence and Machine Learning.
3. Building software "micro-factories" that produce service platforms, which are called "Lean software factories." This "software factory" concept covers the integration of agile methods, tooling and continuous integration and deployment practices, a customer-oriented product approach, and a platform approach based on modularity, as well as API-based architecture and openness to external stakeholders. This software micro-factory is the foundation that continuously produces and provides constantly evolving services.
These three capabilities are not unique or specific to this book; they are linked to other concepts such as agile methods, product development according to lean principles, and software production approaches such as CI/CD (continuous integration and deployment) or DevOps. This book weaves a common frame of reference for all these approaches to derive more value from the digital transformation and to facilitate its implementation. The title of the book refers to the "lean approach to digital transformation" because the two underlying frameworks, Lean Startup and Lean Software Factory, are directly inspired by Lean, in the sense of the Toyota Way. The Lean approach is present from the beginning to the end of this book -- it provides the framework for customer orientation and the love of a job well done, which are the conditions for the success of a digital transformation.
Tackling the cybersecurity challenge is a matter of survival for society at large. Cyber attacks are rapidly increasing in sophistication and magnitude, and in their destructive potential. New threats emerge regularly, the last few years having seen a ransomware boom and distributed denial-of-service attacks leveraging the Internet of Things. For organisations, the use of cybersecurity risk management is essential in order to manage these threats. Yet current frameworks have drawbacks which can lead to the suboptimal allocation of cybersecurity resources. Cyber insurance has been touted as part of the solution - based on the idea that insurers can incentivize companies to improve their cybersecurity by offering premium discounts - but cyber insurance levels remain limited. This is because companies have difficulty determining which cyber insurance products to purchase, and insurance companies struggle to accurately assess cyber risk and thus develop cyber insurance products. To deal with these challenges, this volume presents new models for cybersecurity risk management, partly based on the use of cyber insurance. It contains a set of mathematical models for cybersecurity risk management, including (i) a model to assist companies in determining their optimal budget allocation between security products and cyber insurance and (ii) a model to assist insurers in designing cyber insurance products. The models use adversarial risk analysis to account for the behavior of threat actors (as well as the behavior of companies and insurers). To inform these models, the authors draw on psychological and behavioural economics studies of decision-making by individuals regarding cybersecurity and cyber insurance, as well as on organizational decision-making studies involving cybersecurity and cyber insurance. The book's theoretical and methodological findings will appeal to researchers across a wide range of cybersecurity-related disciplines, including risk and decision analysis, analytics, technology management, actuarial sciences, behavioural sciences, and economics. Its practical findings will help cybersecurity professionals and insurers enhance cybersecurity and cyber insurance, thus benefiting society as a whole. This book grew out of a two-year European Union-funded project under Horizon 2020, called CYBECO (Supporting Cyber Insurance from a Behavioral Choice Perspective).
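As a deliberately simplified illustration of the first kind of model, the sketch below splits a fixed budget between security spending and an insurance premium so as to minimize expected annual cost. It is a toy expected-loss calculation written for this listing; the breach-probability curve, the premium-to-coverage schedule, and all the numbers are invented assumptions and are not the adversarial risk analysis models developed in the book.

```python
import numpy as np

# Toy assumptions (illustrative only):
# - annual breach probability falls with security spend s: p(s) = P0 * exp(-s / K)
# - a breach costs LOSS; a premium q buys a covered fraction q / (q + C)
P0, K, LOSS, C = 0.30, 50_000.0, 1_000_000.0, 40_000.0
BUDGET = 100_000.0

def expected_cost(security_spend, premium):
    """Total expected annual cost: spend + premium + expected uninsured loss."""
    p_breach = P0 * np.exp(-security_spend / K)
    coverage = premium / (premium + C)          # diminishing returns on coverage
    uninsured_loss = LOSS * (1.0 - coverage)
    return security_spend + premium + p_breach * uninsured_loss

# Brute-force search over budget splits in 1,000-unit steps.
splits = np.arange(0.0, BUDGET + 1, 1_000.0)
costs = [expected_cost(s, BUDGET - s) for s in splits]
best = int(np.argmin(costs))
print(f"security spend: {splits[best]:,.0f}  premium: {BUDGET - splits[best]:,.0f}  "
      f"expected total cost: {costs[best]:,.0f}")
```

Even this toy version exhibits an interior optimum: spending the whole budget on either security or insurance is worse than a mix, which is the kind of trade-off the book's models formalize with far more realism.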
This book focuses on the application of soft computing in the materials and manufacturing sectors, with the objective of offering an intelligent approach to improve manufacturing processes, material selection, and characterization techniques for developing advanced new materials. It unveils different models and soft computing techniques applicable in the field of advanced materials and solves problems to help industry and scientists develop sustainable materials for all purposes. The book focuses on the overall well-being of the environment for better sustenance and livelihood. Firstly, the authors discuss the implementation of soft computing in the various areas of engineering materials. They also review the latest intelligent technologies and algorithms related to the state-of-the-art methodologies of monitoring and effective implementation of sustainable engineering practices. Finally, the authors examine the future generation of sustainable and intelligent monitoring techniques beneficial for manufacturing, and cover novel soft computing techniques for the purpose of effective manufacturing processes in line with the standards laid down by the International Organization for Standardization (ISO). This book is intended for academics and researchers from all fields of engineering interested in joining interdisciplinary initiatives on soft computing techniques for advanced materials and manufacturing.
ERP Systems for Manufacturing Supply Chains: Applications, Configuration, and Performance provides insight into the core architecture, modules, and process support of ERP systems used in a manufacturing supply chain. This book explains the building blocks of an ERP system and how they can be used to increase the performance of manufacturing supply chains. Starting with an overview of basic concepts of supply chains and ERP systems, the book delves into the core ERP modules that support manufacturing facilities and organizations. It examines each module's structure and functionality as well as the process support the module provides. Cases illustrate how the modules can be applied in manufacturing environments. Also covered is how the ERP modules can be configured to support manufacturing supply chains. Setting up an ERP system to support the supply chain within a single manufacturing facility provides insight into how an ERP system is used in the smallest of manufacturing enterprises, as well as laying the foundation for ERP systems in manufacturing organizations. The book then supplies strategies for larger manufacturing enterprises and discusses how ERP systems can be used to support a complete manufacturing supply chain across different facilities and companies. The ERP systems on the market today tend to use common terminology and naming for describing specific functions and data units in the software; however, there are differences among packages. The book discusses various data and functionalities found in different ERP software packages and uses generic and descriptive terms as often as possible to make these valid for as many ERP systems as possible. Filled with insight into ERP systems' core modules and functions, this book shows how ERP systems can be applied to support a supply chain in the smallest of manufacturing organizations that consist of only a single manufacturing facility, as well as in large enterprises where the manufacturing supply chain crosses multiple facilities and companies.
The philosophy of computer science is concerned with issues that arise from reflection upon the nature and practice of the discipline of computer science. This book presents an approach to the subject that is centered upon the notion of computational artefact. It provides an analysis of the things of computer science as technical artefacts. Seeing them in this way enables the application of the analytical tools and concepts from the philosophy of technology to the technical artefacts of computer science. With this conceptual framework the author examines some of the central philosophical concerns of computer science including the foundations of semantics, the logical role of specification, the nature of correctness, computational ontology and abstraction, formal methods, computational epistemology and explanation, the methodology of computer science, and the nature of computation. The book will be of value to philosophers and computer scientists.
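One distinction the book examines, the logical role of specification versus the correctness of an implementation, can be made concrete with a small sketch. The Python example below is not drawn from the book; the sorting specification, the insertion-sort artefact, and the sampling check are illustrative assumptions. It shows a specification stated as a predicate over inputs and outputs, an implementation claimed to satisfy it, and the gap between sampled testing and a proof of correctness over all inputs.

```python
import random

def satisfies_sort_spec(inp: list[int], out: list[int]) -> bool:
    """Specification: the output is ordered and is a permutation of the input."""
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    permutation = sorted(inp) == sorted(out)
    return ordered and permutation

def insertion_sort(xs: list[int]) -> list[int]:
    """One concrete computational artefact claimed to meet the specification."""
    result: list[int] = []
    for x in xs:
        i = 0
        while i < len(result) and result[i] <= x:
            i += 1
        result.insert(i, x)
    return result

if __name__ == "__main__":
    # Testing can only sample the specification; establishing it for every
    # possible input is where formal methods and the book's questions about
    # the nature of correctness come in.
    for _ in range(1000):
        xs = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
        assert satisfies_sort_spec(xs, insertion_sort(xs))
    print("sampled inputs satisfy the specification")
```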
Circuits, Packets, and Protocols tells the story of the engineers, entrepreneurs, investors, and visionaries who laid the groundwork and built the foundations of the Internet. In the late 1960s, two American corporate behemoths were poised to dominate the rapidly converging industries of computing and communications-the computer giant, IBM, and the regulated telecommunications monopoly, AT&T. But in 1968, a key ruling by the Federal Communications Commission gave small businesses a doorway into an emerging market for communication devices that could transmit computer data over telephone lines. In the two decades that followed, an industry of networking technology emerged that would impact human history in profound and unfathomable ways. Circuits, Packets, and Protocols is a groundbreaking study of the men and women in the engineering labs, board rooms, and regulatory agencies whose decisions determined the evolution of our modern digital communication networks. Unlike histories that glorify the dominant players with the benefit of hindsight, this is a history of a pivotal era as it happened. Drawing on more than 80 interviews recorded in 1988, the book features insights from now-famous individuals such as Paul Baran, JCR Licklider, Vint Cerf, Louis Pouzin, and Robert Metcalfe. Inspired by innovations from government-sponsored Cold War defense projects and the birth of the modern venture capital industry, these trailblazers and many others built the technologies and companies that became essential building blocks in the development of today's Internet. Many of the companies and products failed, even while they helped propel the industry forward at breakneck speed. Equal parts academic history and thrilling startup drama, Circuits, Packets, and Protocols gives the reader a vivid picture of what it was like to take part in one of the most exciting periods of technological advance in our time.
Unique selling point:
* Set up to be used as a college textbook with a complete "Case Study" that involves the use of Python (a very key programming language at this time)
Core audience:
* Cyber security professionals, college students in a cyber forensics class, and individuals interested in cyber crime
Place in the market:
* Will build on the success of the previous two editions
Low-Voltage CMOS Operational Amplifiers: Theory, Design and Implementation discusses both single and two-stage architectures. Opamps with a constant-gm input stage are designed and their excellent performance over the rail-to-rail input common-mode range is demonstrated. The first set of CMOS constant-gm input stages was introduced by a group from Technische Universiteit Delft and Universiteit Twente, the Netherlands. These earlier versions of the circuits are discussed, along with new circuits developed at the Ohio State University. The design, fabrication (MOSIS Tiny Chips), and characterization of the new circuits are now complete. Basic analog integrated circuit design concepts should be understood in order to fully appreciate the work presented. However, the topics are presented in a logical order and the circuits are explained in great detail, so that Low-Voltage CMOS Operational Amplifiers can be read and enjoyed by those without much experience in analog circuit design. It is an invaluable reference book, and may be used as a text for advanced courses on the subject.
Are we being manipulated online? If so, is being manipulated by online technologies and algorithmic systems notably different from human forms of manipulation? And what is under threat exactly when people are manipulated online? This volume provides philosophical and conceptual depth to debates in digital ethics about online manipulation. The contributions explore the ramifications of our increasingly consequential interactions with online technologies such as online recommender systems, social media, user-friendly design, microtargeting, default settings, gamification, and real-time profiling. The authors in this volume address four broad and interconnected themes:
* What is the conceptual nature of online manipulation? And how, methodologically, should the concept be defined?
* Does online manipulation threaten autonomy, freedom, and meaning in life, and if so, how?
* What are the epistemic, affective, and political harms and risks associated with online manipulation?
* What are legal and regulatory perspectives on online manipulation?
This volume brings these various considerations together to offer philosophically robust answers to critical questions concerning our online interactions with one another and with autonomous systems. The Philosophy of Online Manipulation will be of interest to researchers and advanced students working in moral philosophy, digital ethics, philosophy of technology, and the ethics of manipulation.