This book explains the requirements of ISO 9001 for establishing a quality management system (QMS) for an organization. The requirements are illustrated with industry examples to help readers understand them and prepare QMS documents with clarity. Methods of integrating ISO 9001 requirements with enterprise resource planning (ERP) software are presented. The software-integrated approach enables process owners to focus on their core task of achieving the planned outputs of processes, while the software generates quality records automatically.
This textbook addresses the conceptual and practical aspects of the various phases of the lifecycle of service systems, ranging from service ideation and design to implementation, analysis, improvement, and trading, all within the scope of service systems engineering. Written by leading experts in the field, this indispensable textbook will enable a new wave of future professionals to think in a service-focused way with the right balance of competencies in computer science, engineering, and management. Fundamentals of Service Systems is a centerpiece for a course syllabus on service systems. Each chapter includes a summary, a list of learning objectives, an opening case, and a review section with questions, a project description, a list of key terms, and a list of further reading. All these elements enable students to learn at a faster and more comfortable pace. For researchers, teachers, and students who want to learn about this emerging science, Fundamentals of Service Systems provides an overview of the core disciplines underlying the study of service systems. It is aimed at students of information systems, information technology, and business and economics. It also targets business and IT practitioners, especially those who are looking for better ways of innovating, designing, modeling, analyzing, and optimizing service systems.
This book discusses action-oriented, concise and easy-to-communicate goals and challenges related to quality, reliability, infocomm technology and business operations. It brings together groundbreaking research in the area of software reliability, e-maintenance and big data analytics, highlighting the importance of maintaining the current growth in information technology (IT) adoption in businesses, while at the same time proposing process innovations to ensure sustainable development in the immediate future. In its thirty-seven chapters, it covers various areas of e-maintenance solutions, software architectures, patching problems in software reliability, preventive maintenance, industrial big data and reliability applications in electric power systems. The book reviews the ways in which countries currently attempt to resolve the conflicts and opportunities related to quality, reliability, IT and business operations, and proposes that internationally coordinated research plans are essential for effective and sustainable development, with research being most effective when it uses evidence-based decision-making frameworks resulting in clear management objectives, and is organized within adaptive management frameworks. Written by leading experts, the book is of interest to researchers, academicians, practitioners and policy makers alike who are working towards the common goal of making business operations more effective and sustainable.
This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today's software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. However, the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book's individual chapters address adaptive process management, case management processes, autonomically capable processes, process-oriented information logistics, process recommendations, reasoning over process models, process portability, and business process intelligence. The primary target groups are researchers and PhD and Master's students in the field of information systems.
The rate of failure of IT projects has changed little in survey after survey over the past 15 to 20 years: it remains above 40 to 50 percent. This has happened in spite of new technology, innovative methods and tools, and different management methods. Why does this happen? Why can't the situation be better? One reason is that many think of each IT effort as unique. In reality, many IT projects are very similar at a high, strategic level. Where they differ is in the people and the exact events: the detail. If you read the literature or have been in information systems or IT for some time, you have seen the same reasons for failure and the same problems and issues recur again and again. In this book, IT management experts Ben Lientz and Lee Larssen show you how to identify and track the recurring issues leading to failure in IT projects and provide a proven, modern method for addressing them. By following the recommendations in this book, readers can significantly reduce the risk of IT failures and increase the rate of success. Benefits of using this approach:
* Issues are identified earlier, giving more time for solutions and action.
* Issues are resolved more consistently, since the approach tracks their repetition.
* You get an early warning of problems in IT work, before the budget or schedule falls apart.
* Management tends to have more realistic expectations with an awareness of issues.
* Users and managers have greater confidence in IT due to the improved handling of issues.
* Since the number of issues tends to stabilize in an organization, the IT organization and management get better at detecting, preventing, and dealing with issues over time, a cumulative improvement.
* Giving attention to issues makes users more realistic in their requests and deters requirement changes and scope creep.
This book examines trends and challenges in research on IT governance in public organizations, reporting innovative research and new insights into the theories, models, and practices in this area. IT governance plays an important role in generating value from an organization's IT investments. However, researchers studying IT governance in public organizations face particular challenges owing to differences in the political and administrative structures and practices of these organizations. The first section of the book looks at Management issues, including an introduction to IT governance in public organizations; a systematic review of IT alignment research in public organizations; the role of middle managers in aligning strategy and IT in public service organizations; and an analysis of alignment and governance with regard to IT-related policy decisions. The second section examines Modelling, including a consideration of the challenges faced by public administration; a discussion of a framework for IT governance implementation suitable for improving alignment and communication between stakeholders of IT services; the design and implementation of IT architecture; and the adoption of enterprise architecture in public organizations. Finally, section three presents Case Studies, including IT governance in the context of e-government strategy implementation in the Caribbean; the relationship of IT organizational structure and IT governance performance in the IT department of a public research and education organization in a developing country; the relationship between organizational ambidexterity and IT governance through a study of the Swedish Tax Authorities; and the role of institutional logics in IT project activities and interactions in a large Swedish hospital.
How do we define the nature of our business, gather everything that we know about it, and then centralize our information in one easily accessed place within the organization? Breslin and McGann call such knowledge our ways of working, and the place where it will be found a business knowledge repository. All of a company's accumulated operations data, its manuals and procedures, its records of compliance with myriad regulations, its audits, and its disaster recovery plans are essential information that today's management needs at its fingertips, and information that tomorrow's management must be sure can easily be found. Breslin and McGann show clearly and comprehensively how business knowledge repositories can be established and maintained, what should go into them and how to get it out, who should have access, and all of the other details that management needs to make the most of this valuable resource and means of doing business. It is an essential study and guide for management at upper levels in all types of organizations, both public and private. Breslin and McGann show that once an organization's knowledge of itself is formulated into its ways of working, its so-called object orientation makes it easy to maintain. The repository approach to organizing and consolidating knowledge makes it possible for all of its potential users to access it easily, without having to go to one source for one thing they need and to another for another, a tedious and costly procedure in many organizations that have allowed their information and knowledge resources not only to grow but also to become duplicated. The repository approach also makes it possible for management to organize and access information by job function, and to make it available to employees more easily in training situations. Regulators and auditors are also more easily served; as a result, CFOs will find their annual audit and various compliance fees considerably reduced. Breslin and McGann's book is thus a blueprint for the creation of knowledge repositories and a discussion of how graphical communication between information systems creators and their client end users can be made to flow smoothly and efficiently.
This book deals with leadership trends in the next decade and beyond. It critically examines how knowledge management can be used to address emerging societal and business issues, such as sustaining complex product quality, controlling automation-generated unemployment, increasing cyber insecurity in virtual workforce environments, and unstable government and market trends. These issues require unique leadership qualities to be effective in extremely challenging business and socio-political environments. Among the topics explored by the authors are: investment in the development of diverse human capital, the use of data analytics for performance improvement, declining demographic dividends in population-deficient areas, and the globally increasing education and employment of women and minorities. Scholars in business and economics, and managers in industry and government, will find this book a valuable resource for exploring new directions in the future development of leadership.
This edited volume focuses on the implications of big data for computational social science and the humanities, from management to usage. The first part of the book covers geographic data, text corpus data, and social media data, and exemplifies their concrete applications in a wide range of fields including anthropology, economics, finance, geography, history, linguistics, political science, psychology, public health, and mass communications. The second part of the book provides a panoramic view of the development of big data in the computational social sciences and humanities. The following questions are addressed: Why is there a need for novel data governance for this new type of data? Why is big data important for social scientists? And how will it revolutionize the way social scientists conduct research? With the advent of the information age and technologies such as Web 2.0, ubiquitous computing, wearable devices, and the Internet of Things, digital society has fundamentally changed what we now know as "data", the very use of this data, and what we now call "knowledge". Big data has become the standard in the social sciences and has made these sciences more computational. Big Data in Computational Social Science and Humanities will appeal to graduate students and researchers working in the many subfields of the social sciences and humanities.
This book presents a unique approach to stream data mining. Unlike the vast majority of previous approaches, which are largely based on heuristics, it highlights methods and algorithms that are mathematically justified. First, it describes how to adapt static decision trees to accommodate data streams; in this regard, new splitting criteria are developed to guarantee that they are asymptotically equivalent to the classical batch tree. Moreover, new decision trees are designed, leading to the original concept of hybrid trees. In turn, nonparametric techniques based on Parzen kernels and orthogonal series are employed to address concept drift in non-stationary regression and classification in a time-varying environment. Lastly, an extremely challenging problem that involves designing ensembles and automatically choosing their sizes is described and solved. Given its scope, the book is intended for a professional audience of researchers and practitioners who deal with stream data, e.g. in telecommunications, banking, and sensor networks.
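To give a rough sense of the kind of split decision a streaming decision tree must make, the sketch below uses the widely known Hoeffding bound to decide, from statistics accumulated so far at a leaf, whether the best splitting attribute can already be fixed. This is a stand-in heuristic for illustration only; the book's own, mathematically justified criteria differ, and all names and thresholds here are assumptions.

```python
import math

# Illustrative Hoeffding-bound split check for a streaming decision tree.
# This is NOT the book's method; it only shows the shape of the problem.

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Deviation bound for a mean estimated from n observations."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def should_split(gains, n, delta=1e-7, value_range=1.0):
    """Return the attribute to split on, or None if more data is needed.

    gains: estimated split gain (e.g. information gain) per attribute,
           computed from the n examples seen so far at this leaf.
    """
    if len(gains) < 2:
        return None
    ranked = sorted(gains.items(), key=lambda kv: kv[1], reverse=True)
    (best_attr, best_gain), (_, second_gain) = ranked[0], ranked[1]
    eps = hoeffding_bound(value_range, delta, n)
    # Split only when the observed advantage of the best attribute exceeds
    # the uncertainty of the gain estimates.
    if best_gain - second_gain > eps:
        return best_attr
    return None

# Hypothetical example: after 5000 examples, 'age' is clearly ahead.
print(should_split({"age": 0.31, "income": 0.22, "region": 0.05}, n=5000))
```

In a real stream miner this check would run each time a leaf accumulates a batch of new examples, so the tree grows only where the data already supports a confident split.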
This handbook provides a unique and in-depth survey of the current state of the art in software engineering, covering its major topics and the conceptual genealogy of each subfield, and discussing future research directions. Subjects include foundational areas of software engineering (e.g. software processes, requirements engineering, software architecture, software testing, formal methods, software maintenance) as well as emerging areas (e.g. self-adaptive systems, software engineering in the cloud, coordination technology). Each chapter includes an introduction to central concepts and principles, a guided tour of seminal papers and key contributions, and promising future research directions. The authors of the individual chapters are all acknowledged experts in their field and include many who have pioneered the techniques and technologies discussed. Readers will find an authoritative and concise review of each subject, and will also learn how software engineering technologies have evolved and are likely to develop in the years to come. This book will be especially useful for researchers who are new to software engineering, and for practitioners seeking to enhance their skills and knowledge.
This proceedings volume highlights the role and importance of Operational Research (OR) in the digital era and the underlying ICT challenges. The selected papers cover recent advances in all branches of operational research, mathematical modeling, and decision making, and span a wide range of key areas from the digital economy to supply chain management and finance. The book adopts an applied perspective that covers the contributions of OR to the broad field of business and economics linked with the discipline of computer science. The chapters are based on papers presented at the 6th International Symposium & 28th National Conference on Operational Research. Although the conference is organized by the Hellenic Operational Research Society (HELORS), the contributions in this book promote international cooperation among researchers and practitioners working in the field.
Information Systems Development: Reflections, Challenges and New Directions is the collected proceedings of the 20th International Conference on Information Systems Development held in Edinburgh, Scotland, August 24-26, 2011. It follows in the tradition of previous conferences in the series in exploring the connections between industry, research, and education. These proceedings represent ongoing reflections within the academic community on established information systems topics and emerging concepts, approaches, and ideas. It is hoped that the papers herein contribute towards disseminating research and improving practice.
This book presents recent research on recognizing vulnerabilities of national systems and assets, a topic that has gained special attention for critical infrastructures over the last two decades. It concentrates on R&D activities related to critical infrastructures, focusing on enhancing both the performance of services and the level of security. The objectives of the book are based on a project entitled "Critical Infrastructure Protection Researches" (TAMOP-4.2.1.B-11/2/KMR-2011-0001), which concentrated on innovative UAV solutions, robotics, cybersecurity, surface engineering, and mechatronics, as well as technologies providing safe operation of essential assets. The book summarizes the methodologies and efforts undertaken to fulfill the defined goals. The project was carried out by a consortium of Obuda University and the National University of Public Service.
Seeking to define a new approach to data management at the enterprise level, this work takes the reader beyond information management to information control, where the methods of data capture and manipulation matter more than data quantity. Using the metadata approach ensures long-term, universal control of all data characteristics and improves the effectiveness of IT as a corporate function by minimizing the potential for errors and improving communication and understanding between IT and other disciplines. By describing how to establish metadata management within an organization, this volume provides examples of data structure architectures and reviews issues associated with metadata management in relation to the Internet and data warehousing. It helps the reader control the factors that make data usable throughout an organization and manage data so that it becomes a valuable corporate asset. The book examines real-world business departments that can benefit from this approach, and ways in which sets of metadata can be both autonomous and overlapping.
This book examines the concepts of open innovation, crowdsourcing, and co-creation from a holistic point of view and analyzes their suitability for the tourism industry. Methods, theories, and models are discussed and examined regarding their practical applicability in tourism. The book illustrates the theoretical mechanisms and principles of open innovation, crowdsourcing, and co-creation with case studies and best-practice examples. In addition to the scientific target group, the book is a useful resource for managers across the tourism industry. First, the book presents the theoretical fundamentals and concepts in 11 specific chapters. This basis is then enriched by three parts with case studies, focusing on information, creation, and provision respectively. Finally, in a concluding part, the editors sum up the book and give an outlook on the implications, lessons learned, and future perspectives of open innovation, crowdsourcing, and collaborative consumption in the tourism industry.
The Media Convergence Handbook sheds new light on the complexity of media convergence and the related business challenges. Approaching the topic from managerial, technological, and end-consumer perspectives, it acts as a reference book and educational resource in the field. Media convergence at the business level may imply transforming business models and using multiplatform content production and distribution tools. However, it is shown that the implementation of convergence strategies can only succeed when the expectations and aspirations of every actor involved are taken into account. Media consumers, content producers, and managers face different challenges in the process of media convergence. Volume II of the Media Convergence Handbook tackles these challenges by discussing media business models, production, and users' experience and perspectives from a technological convergence viewpoint.
Dynamic Systems Modelling and Optimal Control explores applications in oil field development, energy system modelling, resource modelling, time-varying control of the dynamic system of a national economy, and investment planning.
This informative book goes beyond the technical aspects of data management to provide detailed analyses of quality problems and their impacts, potential solutions and how they are combined to form an overall data quality program, senior management's role, methods used to make improvements, and the life-cycle of data quality. It concludes with case studies, summaries of main points, roles and responsibilities for each individual, and a helpful listing of "dos and don'ts".
This book systematically examines and quantifies industrial problems by assessing the complexity and safety of large systems. It includes chapters on system performance management, software reliability assessment, testing, quality management, analysis using soft computing techniques, management analytics, and business analytics, with a clear focus on exploring real-world business issues. Through contributions from researchers working in the areas of performance, management, and business analytics, it explores the development of new methods and approaches to improve business by gaining knowledge from bulk data. With system performance analytics, companies can now drive performance and provide actionable insights for every level and role, using key indicators, mobile-enabled scorecards, time series-based analyses with charts, and dashboards. In the current dynamic environment, a viable tool known as multi-criteria decision analysis (MCDA) is increasingly being adopted to deal with complex business decisions. MCDA is an important decision support tool for analyzing goals and providing optimal solutions and alternatives. It comprises several distinct techniques, which are implemented by specialized decision-making packages. This book addresses a number of important MCDA methods, such as DEMATEL, TOPSIS, AHP, MAUT, and Intuitionistic Fuzzy MCDM, which make it possible to derive maximum utility in the area of analytics. As such, it is a valuable resource for researchers and academicians, as well as practitioners and business experts.
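As a rough illustration of one family of MCDA methods named above, the sketch below implements a basic TOPSIS ranking, which scores each alternative by its relative closeness to an ideal solution. The decision matrix, weights, and benefit/cost flags are hypothetical assumptions for demonstration only, not data or a method taken from the book.

```python
import numpy as np

# Illustrative TOPSIS sketch: rank alternatives by closeness to an ideal
# solution. All inputs below are made up for demonstration.

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria scores; weights should sum to 1;
    benefit[j] is True if higher is better for criterion j."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = (m / np.linalg.norm(m, axis=0)) * w
    # Ideal (best) and anti-ideal (worst) value per criterion.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)  # closeness coefficient in [0, 1]

# Hypothetical example: three suppliers scored on cost (lower is better),
# quality, and delivery reliability (both higher is better).
scores = topsis(
    matrix=[[250, 7, 0.95], [300, 9, 0.90], [200, 6, 0.85]],
    weights=[0.4, 0.4, 0.2],
    benefit=[False, True, True],
)
print(scores.round(3))  # a larger value means a better-ranked supplier
```

Other MCDA methods mentioned in the blurb, such as AHP or DEMATEL, differ in how they derive weights and model interdependencies, but they address the same basic problem of choosing among alternatives under multiple criteria.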
Responsible Management of Information Systems discusses the question of how information systems can be used and managed in a responsible manner. It does so by first defining the central concept of information systems as the business use of information technology, together with the underlying concepts of ethics and morality. The term responsibility is introduced as a mediation of ethics and morality and a promising approach to normative questions. After demonstrating that the traditional notion of responsibility runs into many problems when applied to information systems, the book develops a new, reflective theory of responsibility. This theory, which emphasizes the central characteristics of responsibility, namely openness, consequentialism, and teleology, is then applied to normative problems in information systems. It is shown that, with the use of this theory, the central moral and legal problems of information systems, such as privacy or intellectual property, can be successfully addressed.
The book gives a systematic and detailed description of a new integrated product and process development approach for sheet metal manufacturing. Special attention is given to manufacturing that unites the multidisciplinary competences of product design, material science, and production engineering, as well as mathematical optimization and computer-based information technology. The authors use the case study of integral sheet metal structures to introduce results related to the recent manufacturing technologies of linear flow splitting and bend splitting, and the corresponding integrated process chains for sheet metal structures.
This book identifies, analyzes and discusses the current trends of digitalized, decentralized, and networked physical value creation by focusing on the particular example of 3D printing. In addition to evaluating 3D printing's disruptive potentials against a broader economic background, it also addresses the technology's potential impacts on sustainability and emerging modes of bottom-up and community-based innovation. Emphasizing these topics from economic, technical, social and environmental perspectives, the book offers a multifaceted overview that scrutinizes the scenario of a fundamental transition: from a centralized to a far more decentralized system of value creation.
This book discusses the unique nature and complexity of fog data analytics (FDA) and develops a comprehensive taxonomy abstracted into a process model. The exponential increase in sensors and smart gadgets (collectively referred to as smart devices or Internet of Things (IoT) devices) has generated a significant amount of heterogeneous and multimodal data, known as big data. Dealing with this big data requires efficient and effective solutions, such as data mining, data analytics, and data reduction, deployed on fog devices at the edge of the cloud. Current research and development efforts generally focus on big data analytics and overlook the difficulty of facilitating fog data analytics. The book presents a model that addresses various research challenges, such as accessibility, scalability, fog node communication, nodal collaboration, heterogeneity, reliability, and quality of service (QoS) requirements, and includes case studies demonstrating its implementation. Focusing on FDA in IoT and on requirements related to Industry 4.0, it covers all aspects required to manage the complexity of FDA for IoT applications.
This book develops a common understanding between the client and the provider in each of the four stages of strategic outsourcing. These stages range from discovery, where the parties envision their future collaboration; through planning, where they lay the groundwork for the contract and the project; and building, where they effectively carry out the work; to running, where they orchestrate the relationship on a daily basis to ensure that the new, enlarged company achieves the results sought. In a simple yet direct style, it highlights the dos and don'ts the parties should bear in mind at each stage of the process, and it combines both the client's and the provider's perspectives by comparing their respective involvement at each stage and considering, equally, their obligations in establishing a balanced relationship. The book is primarily intended for those in the private sector with experience of dealing with complex outsourcing situations who are looking for the small or bigger differentiators that will support their decisions and actions. The target audiences include, on the client side, CCOs, CIOs, lawyers, procurement managers, outsourcing consultants, and IT service managers, and, on the provider side, account managers, bid managers, outsourcing project managers, operations managers, and service managers. However, it is also useful for anybody involved in outsourcing who seeks to develop a global understanding of the main processes and roles upstream and downstream in the chain.