Ada's Legacy illustrates the depth and diversity of writers, thinkers, and makers who have been inspired by Ada Lovelace, the English mathematician and writer. The volume, which commemorates the bicentennial of Ada's birth in December 1815, celebrates Lovelace's many achievements as well as the impact of her life and work, which has reverberated widely since the late nineteenth century. In the twenty-first century we have seen a resurgence in Lovelace scholarship, thanks to the growth of interdisciplinary thinking and the expanding influence of women in science, technology, engineering and mathematics. Ada's Legacy is a unique contribution to this scholarship, thanks to its combination of papers on Ada's collaboration with Charles Babbage, Ada's position in the Victorian and Steampunk literary genres, Ada's representation in and inspiration of contemporary art and comics, and Ada's continued relevance in discussions about gender and technology in the digital age. With the 200th anniversary of Ada Lovelace's birth falling on December 10, 2015, we believe the timing is perfect for publishing this collection of papers. Because of its broad focus on subjects that reach far beyond the life and work of Ada herself, Ada's Legacy will appeal to readers who are curious about Ada's enduring importance in computing and the wider world.
Section 1: The Internet as a Business Platform
- Electronic Coordination of Inter-Organizational Business Processes between Private Households and Insurers in the Context of Tele-Insuring
- Internet Use in the Business-to-Business Sector: State of Development, Typology and Application Examples
- Building an Electronic Marketplace for Java Applets
- Pricing Strategies for an Integrated Universal Internet
Section 2: Optimization of Business Processes
- Value-Based Management of Inter-Organizational Business Processes
- A Hierarchical Planning Procedure Supporting the Selection of Service Providers in Outtasking Decisions
- A Multiagent System-Approach for the Design of Information Systems in Virtual Organizations
- Office-Automation in Municipal and County Administration with an Integrated Workflow Based Information System
- Efficiency and Cost Implications of Capital Allocation Mechanisms: A Contribution to the Market-versus-Hierarchy-Discussion
Section 3: Groupware and Workflow Strategies
- Organizational Change in the Introduction of Groupware
- Increased Competitiveness using a Groupware based Project Controlling System
- An Architecture for the Information-Technology Support of Cooperation
- Enterprise Knowledge Medium (EKM): Design and Use of a Computer-Supported Planning and Control System in the Process-Oriented Enterprise
- Supporting Workflow Development with an Enterprise-Wide Repository of Business Process Implementations
Section 4: Groupware Applications
- New perspectives for higher education processes as a team-based approach - Back-office information technology for higher education and training
- Configuring the Information Service in Groupware
Section 5: Economical Software Development
- Global Production: The Case of Offshore Programming
- Metrics for IS Diagnosis: Concept and Prototype Implementation
- Product Information Systems in Business Networks
Section 6: SAP and Client-Server Integration
- System Migration and System Integration: Two SAP Cases
- Enterprise-Wide Data Consistency through Integration of Existing Information Systems
- Client/Server Architecture: what it promises - what it really provides
- IS Project Risk in Polish Organizations
Section 7: Organization and Data Management
- Flexible Organizations Through Object-oriented and Transaction-oriented Information Systems
- Determinants and Outcomes of Electronic Data Interchange Integration
- Reference Information Models for Retail: Concept, Benefits and Recommendations for the Design and Company-Specific Adaptation of Reference Models
- Developing a Data Warehouse for Production Controlling: Concepts and Experiences
Section 8: New Opportunities through Multimedia
- The Market for Interactive Electronic Media from an Economic Perspective
Section 9: Internet/Intranet Applications
- Improving Competitiveness of Direct Banking via IT-Enabled Incentive Schemes
- Using Internet Services for Electronic Data Interchange: Architecture Variants and an Application Scenario
Section 10: Organization and Workflow
- New Organizational Forms and IT: A Challenge for Enterprise Designers
- INCOME/WF - A Petri Net Based Approach to Workflow Management
- On the Object-Oriented Modelling of Distributed Workflow Applications
Section 11: Reorganizing the Enterprise
- Synthesizing Business and Information Systems (IS): Towards a Common Business-IS Model based on Agents
- Planning and Control Models for Managing Process-Oriented Organizations on the Basis of an Intranet Application
- New Core Processes for Insurers with Agency Networks
List of Sponsors
Index of Authors and Addresses
The book focuses on system dependability modeling and calculation, considering the impact of s-dependency and uncertainty. The approaches best suited to practical system dependability modeling and calculation, namely (1) the minimal cut approach, (2) the Markov process approach, and (3) the Markov minimal cut approach, which combines (1) and (2), are described in detail and applied to several examples. The consistent use of Boolean logic throughout the development of the approaches is the key to combining them on a common basis. For large and complex systems, efficient approximation approaches, e.g. the probable Markov path approach, have been developed that can take s-dependencies between components of complex system structures into account. A comprehensive analysis of aleatory uncertainty (due to randomness) and epistemic uncertainty (due to lack of knowledge), and of their combination, is carried out on the basis of basic reliability indices and evaluated with the Monte Carlo simulation method. The impact of uncertainty on system dependability is investigated and discussed using several examples with different levels of difficulty. The applications cover a wide variety of large and complex (real-world) systems. Up-to-date definitions of terms from the IEC 60050-192:2015 standard, as well as the dependability indices, are used uniformly in all six chapters of the book.
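To give a rough flavour of the minimal cut idea combined with Monte Carlo evaluation (a hypothetical sketch with made-up component unavailabilities and cut sets, not an example taken from the book): the system is considered failed whenever every component of at least one minimal cut set is failed, and a simulation loop can estimate the resulting system unavailability.

/* Hypothetical sketch: Monte Carlo estimate of system unavailability
 * from minimal cut sets. Components 0..3 with assumed unavailabilities;
 * minimal cut sets {0,1} and {2,3} (all components of a cut must fail). */
#include <stdio.h>
#include <stdlib.h>

#define N_COMP   4
#define N_CUTS   2
#define N_TRIALS 1000000

int main(void)
{
    const double q[N_COMP] = {0.01, 0.02, 0.05, 0.03};  /* assumed values */
    const int cut[N_CUTS][2] = {{0, 1}, {2, 3}};         /* minimal cuts   */
    long failures = 0;

    srand(42);
    for (long t = 0; t < N_TRIALS; t++) {
        int down[N_COMP];
        for (int i = 0; i < N_COMP; i++)
            down[i] = ((double)rand() / RAND_MAX) < q[i];

        for (int c = 0; c < N_CUTS; c++) {
            if (down[cut[c][0]] && down[cut[c][1]]) {  /* whole cut failed */
                failures++;
                break;
            }
        }
    }
    printf("Estimated system unavailability: %.6f\n",
           (double)failures / N_TRIALS);
    return 0;
}

In this toy setting the exact result can also be obtained from the cut-set probabilities directly; the simulation form is what generalizes to the s-dependent and uncertain cases the book treats.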
An introduction to parallel programming with Open MPI using C. It is written so that someone with even a basic understanding of programming can begin to write MPI-based parallel programs.
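A minimal example of the kind of program such a book starts from (a hypothetical sketch, not code from the book): each process reports its rank, and a reduction sums a value across all ranks.

/* Minimal MPI sketch in C (hypothetical example).
 * Compile: mpicc hello_sum.c -o hello_sum
 * Run:     mpirun -np 4 ./hello_sum                 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, local, total;

    MPI_Init(&argc, &argv);                    /* start the MPI runtime        */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);      /* this process's id            */
    MPI_Comm_size(MPI_COMM_WORLD, &size);      /* number of processes          */

    printf("Hello from rank %d of %d\n", rank, size);

    local = rank + 1;                          /* each rank contributes rank+1 */
    MPI_Reduce(&local, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Sum over all ranks: %d\n", total);

    MPI_Finalize();
    return 0;
}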
This is an examination of the various technical and organisational elements that impact services management, business management, risk management, and customer relationship management.
This book presents the state of the art in the fields of formal logic pioneered by Graham Priest. It includes advanced technical work on the model and proof theories of paraconsistent logic, in contributions from top scholars in the field. Graham Priest's research has had a considerable influence on the field of philosophical logic, especially with respect to the themes of dialetheism (the thesis that there exist true but inconsistent sentences) and paraconsistency (an account of deduction in which contradictory premises do not entail the truth of arbitrary sentences). Priest's work has regularly challenged researchers to reappraise many assumptions about rationality, ontology, and truth. This book collects original research by some of the most esteemed scholars working in philosophical logic, whose contributions explore and appraise Priest's work on logical approaches to problems in philosophy, linguistics, computation, and mathematics. They provide fresh analyses, critiques, and applications of Priest's work and attest to its continued relevance and topicality. The book also includes Priest's responses to the contributors, providing a further layer to the development of these themes.
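As a small formal illustration of the paraconsistency theme (my own sketch of a standard fact about Priest's Logic of Paradox, LP, not a passage from this volume): with three truth values and two of them designated, the explosion principle fails.

% In LP, formulas take values in {t, b, f}, with t and b designated
% (i.e. counting as "true enough" for entailment).
% Take a valuation v with v(p) = b and v(q) = f. Then v(\neg p) = b and
% v(p \wedge \neg p) = b, which is designated, while v(q) = f is not, so
\[
  p \wedge \neg p \;\not\models_{\mathrm{LP}}\; q ,
\]
% i.e. contradictory premises do not entail an arbitrary sentence,
% which is exactly the behavior paraconsistency requires.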
A new set of major changes is reshaping the economy and creating challenges that are testing the mettle and talents of organizations and their employees. Unless organizations and their employees develop the skills they need to cope with these challenges, many will become casualties of their own deficiencies. "Keys to Employee Success in Coming Decades" seeks to prepare employees for future success in an increasingly demanding and competitive global environment. Sims, Veres, and their contributors are careful to focus on what employees at different levels of the organization will need to do to be successful in the twenty-first century. Mastery of the knowledge, skills, attitudes, and behaviors discussed by the contributors will lead to enhanced employee performance as the new decade approaches. The requirements placed on new employees, and on those whose roles are being redesigned, are changing quickly. The organizations of tomorrow will expect employees who understand the importance of success; who welcome change, accept it, master it, and deliberately cause it; who are proactive innovators; who confront constraints and the limitations they impose on action; who take risks; and who continue to develop themselves professionally, technically, and personally. Written clearly, concisely, and with a minimum of academic jargon, the book will be important reading for specialists in human resource management, training and development, and others with critical responsibilities throughout the organization.
This volume is the first extensive study of the historical and philosophical connections between technology and mathematics. Coverage includes the use of mathematics in ancient as well as modern technology, devices and machines for computation, cryptology, mathematics in technological education, the epistemology of computer-mediated proofs, and the relationship between technological and mathematical computability. The book also examines the work of such historical figures as Gottfried Wilhelm Leibniz, Charles Babbage, Ada Lovelace, and Alan Turing.
This classroom-tested textbook describes the design and implementation of software for distributed real-time systems, using a bottom-up approach. The text addresses common challenges faced in software projects involving real-time systems, and presents a novel method for simply and effectively performing all of the software engineering steps. Each chapter opens with a discussion of the core concepts, together with a review of the relevant methods and available software. This is then followed with a description of the implementation of the concepts in a sample kernel, complete with executable code. Topics and features: introduces the fundamentals of real-time systems, including real-time architecture and distributed real-time systems; presents a focus on the real-time operating system, covering the concepts of task, memory, and input/output management; provides a detailed step-by-step construction of a real-time operating system kernel, which is then used to test various higher level implementations; describes periodic and aperiodic scheduling, resource management, and distributed scheduling; reviews the process of application design from high-level design methods to low-level details of design and implementation; surveys real-time programming languages and fault tolerance techniques; includes end-of-chapter review questions, extensive C code, numerous examples, and a case study implementing the methods in real-world applications; supplies additional material at an associated website. Requiring only a basic background in computer architecture and operating systems, this practically-oriented work is an invaluable study aid for senior undergraduate and graduate-level students of electrical and computer engineering, and computer science. The text will also serve as a useful general reference for researchers interested in real-time systems.
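To give a flavour of the periodic-scheduling material mentioned above (a hypothetical sketch, not code from the book's sample kernel): the classic Liu-Layland utilization bound test for rate-monotonic scheduling declares a periodic task set schedulable if its total utilization does not exceed n(2^(1/n) - 1).

/* Hypothetical sketch: Liu-Layland rate-monotonic schedulability test.
 * A set of n periodic tasks with worst-case execution time C_i and
 * period T_i is schedulable under RM if sum(C_i/T_i) <= n*(2^(1/n) - 1).
 * The test is sufficient, not necessary.  Link with -lm.              */
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Assumed example task set: {C, T} pairs in milliseconds. */
    const double task[][2] = {{1.0, 4.0}, {2.0, 8.0}, {3.0, 20.0}};
    const int n = sizeof(task) / sizeof(task[0]);

    double u = 0.0;
    for (int i = 0; i < n; i++)
        u += task[i][0] / task[i][1];                /* total utilization */

    double bound = n * (pow(2.0, 1.0 / n) - 1.0);    /* Liu-Layland bound */

    printf("U = %.3f, bound = %.3f -> %s\n", u, bound,
           u <= bound ? "schedulable under RM"
                      : "bound inconclusive (exact analysis needed)");
    return 0;
}

For the assumed task set the utilization is 0.65 against a bound of about 0.78, so the set passes the sufficient test; sets that fail it require the exact response-time analysis the book develops.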
E-Business covers a broad spectrum of businesses based on the Internet, including e-commerce, e-healthcare, e-government and e-tailing. While substantial attention is being given to the planning and development of e-business applications, the efficiency and effectiveness of e-business systems will largely depend on management solutions. These management solutions demand a good grasp of both the technical and business perspectives of an e-business service. There have been many books on Internet-based e-commerce, Internet protocols, distributed components, and so on. However, none of these books addresses the problem of managing e-business as a set of networked services, and they do not link enterprise management with network and systems management. This book provides an overview of the emerging techniques for IT service management from a business perspective, with case studies from the telecommunication and healthcare sectors. It integrates the business perspective with relevant technical standards, such as SNMP, WBEM and DMI, and presents concepts and methodologies that enable the development of effective and efficient management systems for networked services. The book is intended to familiarize practicing managers, engineers, and graduate-level students with networked service management concepts, architectures and methodologies with reference to evolving standards. It should be useful in a number of disciplines, such as business management, information systems, computers and networking, and telecommunications. Appendix 2 is based on the TeleManagement (TM) Forum's documents on TOM (GB921, GB910 and GB908). While this appendix explains the basic management concept of an e-telco, the TM Forum now recommends the use of eTOM, as explained at www.tmforum.com. An overview of eTOM is available in the report "The TeleManagement Forum's enhanced Telecom Operations Map (eTOM)" by Michael Kelly, which appeared in the Journal of Network and Systems Management in March 2003.
China Satellite Navigation Conference (CSNC 2021) Proceedings presents selected research papers from CSNC 2021, held from 22 to 25 May 2021 in Nanchang, China. These papers discuss the technologies and applications of the Global Navigation Satellite System (GNSS), and especially the latest progress made in China's BeiDou System (BDS). They are divided into 10 topics to match the corresponding sessions of CSNC 2021, which broadly covered key topics in GNSS. Readers can learn about the BDS and keep abreast of the latest advances in GNSS techniques and applications.
The financial meltdown resulting from the subprime mortgage fiasco culminated in the most dramatic economic slowdown since the Great Depression. The global economic crisis raised the debate about the role of financial institutions and the role of regulators in an increasingly interconnected and rapidly changing world. It also altered the marketplace's perception of historically trusted financial institutions. Over the years, geopolitical, economic and technical trends have had a subtle, but very powerful, impact on the basic business model for financial institutions worldwide and on their interactions with accountholders. Add to that increased margin pressures, regulatory and compliance issues, fraud and compliance concerns, and competitive threats, and it becomes obvious that the old business model simply won't work going forward. At the same time, the financial industry is littered with some of the oldest technologies of any industry, which contributed to the poor credit decisions that fueled the crisis. A recognized entrepreneur and award-winning innovator, Louis Hernandez, Jr., using historical examples, points out that the rate of change impacting the financial services industry is accelerating. The industry has been slow to respond to change, and the focus on the recent crisis has uncovered fundamental problems that financial institutions have been avoiding. Hernandez outlines a process to map the future direction of individual institutions and the industry in a way that addresses near-term issues and overarching global changes, such as a re-emergent Asia and the dynamics of a knowledge economy. He points out that the "Too Big to Fail" thesis has given way to the seemingly more prudent, community-based institutions that largely avoided the subprime crisis. These institutions have demonstrated that they represent a unique pillar of economic stability. Now, he says, is the perfect time for the leaders of these community-based institutions to seize the day and lead the financial services industry back to the center of economic vitality and drive global economic growth, one community at a time. In Too Small to Fail, Hernandez issues the call to action, "Do you have the extraordinary drive it will take to inspire the industry and bring financial institutions back to their place as trusted intermediaries?"
The acceleration of the Internet and the growing importance of ICT in globalized markets have played a vital role in the increasingly difficult standardization work facing ICT companies. Given the economic importance of standards, companies and organizations are bringing their own ideas and technologies into the Internet's standards settings. Innovations in Organizational IT Specification and Standards Development provides advancing research on all current aspects of IT standards and standardization. This book aims to be a useful source of knowledge for IT researchers, scholars, and practitioners alike.
This book takes a foundational approach to the semantics of probabilistic programming. It elaborates a rigorous Markov chain semantics for the probabilistic typed lambda calculus, which is the typed lambda calculus with recursion plus probabilistic choice. The book starts with a recapitulation of the basic mathematical tools needed throughout, in particular Markov chains, graph theory and domain theory, and also explores the topic of inductive definitions. It then defines the syntax and establishes the Markov chain semantics of the probabilistic lambda calculus and, furthermore, both a graph and a tree semantics. Based on that, it investigates the termination behavior of probabilistic programs. It introduces the notions of termination degree, bounded termination and path stoppability and investigates their mutual relationships. Lastly, it defines a denotational semantics of the probabilistic lambda calculus, based on continuous functions over probability distributions as domains. The work will mainly appeal to researchers in theoretical computer science focusing on probabilistic programming, randomized algorithms, or programming language theory.
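As an informal illustration of the termination-degree idea (a hypothetical sketch under my own assumptions, not the book's formal semantics): a program that keeps looping with probability p and stops otherwise terminates with probability 1 whenever p < 1, and its empirical termination frequency within a fixed step bound can be estimated by simulation.

/* Hypothetical sketch: estimating the termination degree of a simple
 * probabilistic program "loop: with probability p continue, else stop",
 * counting the fraction of runs that halt within a fixed step bound.  */
#include <stdio.h>
#include <stdlib.h>

#define N_RUNS     100000
#define MAX_STEPS  1000
#define P_CONTINUE 0.5     /* assumed probabilistic-choice parameter */

int main(void)
{
    long halted = 0;

    srand(7);
    for (long r = 0; r < N_RUNS; r++) {
        int steps = 0;
        while (steps < MAX_STEPS &&
               ((double)rand() / RAND_MAX) < P_CONTINUE)
            steps++;
        if (steps < MAX_STEPS)   /* run stopped before hitting the bound */
            halted++;
    }
    printf("Empirical termination degree (within %d steps): %.4f\n",
           MAX_STEPS, (double)halted / N_RUNS);
    return 0;
}

The Markov chain semantics developed in the book makes this notion precise without resorting to sampling; the sketch is only meant to convey what "termination degree" measures.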
Constant improvements in technological applications have allowed for more opportunities to develop automated systems. This not only leads to higher success in smart data analysis, but also ensures that technological progression will continue. Ubiquitous Machine Learning and its Applications is a pivotal reference source for the latest research on the issues and challenges machines face in the new millennium. Featuring extensive coverage on relevant areas such as computational advertising, software engineering, and bioinformatics, this publication is an ideal resource for academicians, graduate students, engineering professionals, and researchers interested in discovering how they can apply these advancements to various disciplines.
The work presented here is generally intended for engineers, educators at all levels, industrialists, managers, researchers and political representatives. Offering a snapshot of the various types of research conducted within the field of TRIZ in France, it represents a unique resource. It has been two decades since the TRIZ theory originating in Russia spread across the world. Every continent adopted it in a different manner: sometimes by glorifying its potential and its perspectives (the American way); sometimes by viewing it with mistrust and suspicion (the European way); and sometimes by adopting it as-is, without questioning it further (the Asian way). However, none of these models of adoption truly succeeded. Today, an assessment of TRIZ practices in education, industry and research is necessary. TRIZ has expanded to many different scientific disciplines and has allowed young researchers to reexamine the state of research in their field. To this end, a call was sent out to all known francophone research laboratories producing regular research about TRIZ. Eleven of them agreed to send one or more of their postdoctoral researchers to present their work during a seminar, regardless of the maturity or completeness of their efforts. The seminar was followed by this book project, which presents one chapter for each current thesis in order to reveal the breadth, the richness and the perspectives that research on the TRIZ theory can offer our society. The topics addressed include, for example, the development of new methods inspired by TRIZ, educational practices, and the measurement of team impact.
This book describes RTL design using Verilog, synthesis and timing closure for System on Chip (SOC) design blocks. It covers complex RTL design scenarios and challenges for SOC designs and provides practical information on performance improvements in SOC as well as Application Specific Integrated Circuit (ASIC) designs. Prototyping using modern high-density Field Programmable Gate Arrays (FPGAs) is discussed with practical examples and case studies. The book discusses SOC design, performance improvement techniques, testing and system-level verification, while also describing modern Intel and Xilinx FPGA architectures and their use in SOC prototyping. Further, the book covers Synopsys Design Compiler (DC) and PrimeTime (PT) commands, and how they can be used to optimize complex ASIC/SOC designs. The contents of this book will be useful to students and professionals alike.
Business processes are becoming increasingly complex and dynamic as they seek to cope with a wide range of internal and external interactions and changes. "The Handbook of Research on Complex Dynamic Process Management: Techniques for Adaptability in Turbulent Environments" investigates the nature and history of dynamic processes, which are essential to understanding the need for flexibility and adaptability as well as the requirements for improved solutions. This innovative collection covers the development of various strategies, architectures, and techniques for achieving adaptive processes in turbulent environments.
This volume contains the proceedings of two conferences held in Toronto (Canada) and Kozhikode (India) in 2016 in honor of the 60th birthday of Professor Kumar Murty. The meetings focused on several aspects of number theory: the theory of automorphic forms and their associated L-functions; arithmetic geometry, with special emphasis on algebraic cycles, Shimura varieties, and explicit methods in the theory of abelian varieties; and the emerging applications of number theory in information technology. Kumar Murty has been a substantial influence in these topics, and the two conferences were aimed at honoring his many contributions to number theory, arithmetic geometry, and information technology.
In this book, Professor Emeritus Morton Wagman gives a broad, structured, and detailed account of advancing intellectual developments in both psychological and computational theories of the nature of problem solving. Known for originating the PLATO computer-based Dilemma Counseling System, psychologist Wagman is the author of 17 books, including "Scientific Discovery Processes in Humans and Computers" (Praeger, 2000). Of special interest to readers will be Wagman's conclusion that artificial intelligence problem-solving systems are deepening and broadening theories of human problem solving, from scientific to everyday approaches. Scholars and professionals in psychology, artificial intelligence, and cognitive science will consider this volume a valuable addition to their collections.
Recent improvements in healthcare delivery due to innovative technological advancements have redefined the fields of biomedical science, now allowing for enhanced information management, resource allocation, and quality assurance. Biocomputation and Biomedical Informatics: Case Studies and Applications provides a compendium of terms, definitions, and explanations of concepts, processes, and acronyms in this significant field of medical study. Featuring chapters authored by leading international experts, this unsurpassed collection provides a defining body of research indispensable to medical libraries, researchers, and institutions worldwide.
You may like...
Dynamic Web Application Development…
David Parsons, Simon Stobart
Paperback
Information Systems, International…
Ralph Stair, George Reynolds
Paperback
Discovering Computers, Essentials…
Susan Sebok, Jennifer Campbell, …
Paperback