Although the transitions between the first three industrial revolutions took more than a century, Industry 4.0 is progressing quickly. The emergence of digitalization has been rapid thanks to the development of cutting-edge technologies. Though we are witnessing this rapid technological decentralization and interconnectivity at present, organizations and researchers are already discussing Industry 5.0, where full integration of the human side of business with intelligent systems is expected. In this scenario, it is essential to design strategic workplaces that combine a high degree of automation with the cognitive skills of business leaders. Managing Technology Integration for Human Resources in Industry 5.0 examines the impact of Industry 4.0 on human resources and offers insights to both industry and academia for teaching and training the next generation of leaders through universities and corporate training. Covering topics such as business performance, human-technology integration, and digitalization, this premier reference source is an essential resource for human resource managers, IT managers, organizational executives and leaders, entrepreneurs, students and educators in higher education, librarians, researchers, and academicians.
Computer science has emerged as a key driver of innovation in the 21st century. Yet preparing teachers to teach computer science, or to integrate computer science content into K-12 curricula, remains an enormous challenge. Recent policy reports have suggested the need to prepare future teachers to teach computer science through pre-service teacher education programs. In order to prepare a generation of teachers capable of delivering computer science to students, however, the field must identify research-based examples, pedagogical strategies, and policies that can facilitate changes in teacher knowledge and practices. The purpose of this book is to provide examples that can help guide the design and delivery of effective teacher preparation for the teaching of computer science. The book identifies promising pathways, pedagogical strategies, and policies that will help teacher education faculty and pre-service teachers infuse computer science content into their curricula as well as teach stand-alone computing courses. Specifically, the book focuses on pedagogical practices for developing and assessing pre-service teacher knowledge of computer science, course design models for pre-service teachers, and policies that can support the teaching of computer science. The primary audience of the book is students and faculty in educational technology, educational or cognitive psychology, learning theory, teacher education, curriculum and instruction, computer science, instructional systems, and the learning sciences.
Advances in Computers carries on a tradition of excellence, presenting detailed coverage of innovations in computer hardware, software, theory, design, and applications. The book provides contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles typically allow. The articles included in this book will become standard references, with lasting value in this rapidly expanding field.
Fog computing is rapidly expanding in its applications and uses. As it continues to grow, different types of virtualization technologies can push this branch of computing further into mainstream use. The Handbook of Research on Cloud and Fog Computing Infrastructures for Data Science is a key reference volume on the latest research on the role of next-generation systems and devices capable of self-learning, and on how those devices will impact society. Featuring wide-ranging coverage across a variety of relevant views and themes, such as cognitive analytics, data mining algorithms, and the Internet of Things, this publication is ideally designed for programmers, IT professionals, students, researchers, and engineers looking for innovative research on software-defined cloud infrastructures and domain-specific analytics.
This book is a celebration of Leslie Lamport's work on concurrency, interwoven with four and a half decades of an evolving industry: from the introduction of the first personal computer to an era when parallel and distributed multiprocessors are abundant. His works lay formal foundations for concurrent computations executed by interconnected computers. Some of his algorithms have become standard engineering practice for fault-tolerant distributed computing - distributed systems that continue to function correctly despite failures of individual components. He also developed a substantial body of work on the formal specification and verification of concurrent systems, and has contributed to the development of automated tools applying these methods. Part I consists of technical chapters and a biography. The technical chapters present a retrospective on Lamport's original ideas from experts in the field and, through this lens, portray their long-lasting impact. The chapters cover timeless notions Lamport introduced: the Bakery algorithm, atomic shared registers and sequential consistency; causality and logical time; Byzantine Agreement; state machine replication and Paxos; the temporal logic of actions (TLA). The professional biography tells of Lamport's career, providing the context in which his work arose and broke new ground, and discusses LaTeX - perhaps Lamport's most influential contribution outside the field of concurrency. This chapter gives a voice to the people behind the achievements, notably Lamport himself, as well as the colleagues around him who inspired, collaborated with, and helped him drive worldwide impact. Part II consists of a selection of Leslie Lamport's most influential papers. This book touches on a lifetime of contributions by Leslie Lamport to the field of concurrency and on the extensive influence he has had on people working in the field. It will be of value to historians of science, and to researchers and students who work in the area of concurrency and are interested in reading about the work of one of the most influential researchers in this field.
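To make one of the notions listed above concrete, here is a minimal sketch (ours, not taken from the book or Lamport's papers) of the logical clock rule: a process increments its counter on each local event, stamps outgoing messages with it, and on receipt advances to one past the maximum of its own and the received timestamp.

```python
class LamportClock:
    """Minimal Lamport logical clock: a counter whose ordering of
    events is consistent with causality (the happened-before relation)."""

    def __init__(self):
        self.time = 0

    def tick(self):
        """Local event: advance the clock."""
        self.time += 1
        return self.time

    def send(self):
        """Stamp an outgoing message with the incremented clock."""
        return self.tick()

    def receive(self, msg_time):
        """On receipt, jump past both clocks: max(local, received) + 1."""
        self.time = max(self.time, msg_time) + 1
        return self.time

# Two processes exchange one message; the receive is ordered after the send.
p, q = LamportClock(), LamportClock()
t = p.send()         # p's clock becomes 1
print(q.receive(t))  # q's clock becomes 2, causally after the send
```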
Combinatorial optimization is a multidisciplinary scientific area, lying at the interface of three major scientific domains: mathematics, theoretical computer science, and management. The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area. These topics deal with fundamental notions and approaches as well as with several classical applications of combinatorial optimization. Concepts of Combinatorial Optimization is divided into three parts: - On the complexity of combinatorial optimization problems, presenting basics of worst-case and randomized complexity; - Classical solution methods, presenting the two best-known methods for solving hard combinatorial optimization problems: Branch-and-Bound and Dynamic Programming; - Elements from mathematical programming, presenting fundamentals of mathematical programming-based methods that have been at the heart of Operations Research since the origins of the field.
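As a small, self-contained illustration of one of the two classical methods named above, the sketch below (ours, not from the book) applies dynamic programming to the 0/1 knapsack problem, a standard hard combinatorial optimization problem; the instance at the bottom is made up for demonstration.

```python
def knapsack_max_value(values, weights, capacity):
    """Dynamic programming for the 0/1 knapsack problem.

    dp[c] is the best total value achievable with capacity c using the
    items considered so far; iterating capacities downward ensures each
    item is taken at most once.
    """
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        for c in range(capacity, weight - 1, -1):
            dp[c] = max(dp[c], dp[c - weight] + value)
    return dp[capacity]

# Illustrative instance: three items, knapsack capacity 5.
print(knapsack_max_value([60, 100, 120], [1, 2, 3], 5))  # -> 220
```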
Text analysis tools aid in extracting meaning from digital content. As digital text becomes more and more complex, new techniques are needed to understand conceptual structure. Concept Parsing Algorithms (CPA) for Textual Analysis and Discovery: Emerging Research and Opportunities provides an innovative perspective on the application of algorithmic tools to study unstructured digital content. Highlighting pertinent topics such as semantic tools, semiotic systems, and pattern detection, this book is ideally designed for researchers, academics, students, professionals, and practitioners interested in developing a better understanding of digital text analysis.
Communities of Computing is the first book-length history of the Association for Computing Machinery (ACM), founded in 1947 and with a membership today of 100,000 worldwide. It profiles ACM's notable SIGs, active chapters, and individual members, setting ACM's history in a rich social and political context. The book's 12 core chapters are organized into three thematic sections. "Defining the Discipline" examines the 1960s and 1970s, when the field of computer science was taking form at the National Science Foundation, Stanford University, and through ACM's notable efforts in education and curriculum standards. "Broadening the Profession" looks outward into the wider society as ACM engaged with social and political issues - and as members struggled with balancing a focus on scientific issues and awareness of the wider world. Chapters examine the social turbulence surrounding the Vietnam War, debates about the women's movement, efforts for computing and community education, and international issues including professionalization and the Cold War. "Expanding Research Frontiers" profiles three areas of research activity where ACM members and ACM itself shaped notable advances in computing: computer graphics, computer security, and hypertext. Featuring insightful profiles of notable ACM leaders, such as Edmund Berkeley, George Forsythe, Jean Sammet, Peter Denning, and Kelly Gotlieb, and honest assessments of controversial episodes, the volume deals with compelling and complex issues involving ACM and computing. It is not a narrow organizational history of ACM committees and SIGs, although much information about them is given. All chapters are original works of research. Many chapters draw on archival records of ACM's headquarters, ACM SIGs, and ACM leaders. This volume makes a permanent contribution to documenting the history of ACM and understanding its central role in the history of computing.
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
As real-time and integrated systems become increasingly sophisticated, issues related to development life cycles, non-recurring engineering costs, and poor synergy between development teams will arise. The Handbook of Research on Embedded Systems Design provides insights from the computer science community on integrated systems research projects taking place in the European region. This premier reference work looks at the diverse range of design principles covered by these projects, from specification at high abstraction levels using standards such as UML and related profiles to intermediate design phases. This work will be invaluable to designers of embedded software, academicians, students, practitioners, professionals, and researchers working in the computer science industry.
Despite advancements in technological and engineering fields, there is still a digital gender divide in the adoption, use, and development of information and communication technology (ICT) services. This divide is also evident in educational environments and careers, specifically in the STEM fields. To mitigate this divide, policy approaches must be addressed and improved to encourage the inclusion of women in ICT disciplines. Gender Gaps and the Social Inclusion Movement in ICT provides emerging research exploring the theoretical and practical aspects of gender and policy, from developed and developing country perspectives, and their applications within ICT, through various forms of research including case studies. Featuring coverage of a broad range of topics such as digital identity, human rights, and social inclusion, this book is ideally designed for policymakers, academicians, researchers, students, and technology developers seeking current research on gender inequality in ICT environments.
The book is addressed to young people interested in computer technologies and computer science. Its objective is to provide readers with all the elements necessary to get started in the modern field of informatics and to make them aware of the relationships between key areas of computer science. The book is addressed not only to future software developers, but to everyone interested in computing in a broad sense. The authors also expect that some computer professionals will want to review this book to lift themselves above the daily grind and embrace the excellence of the whole field of computer science. Unlike existing books, this one bypasses issues concerning the construction of computers and focuses only on information processing. Recognizing the importance of the human factor in information processing, the authors present the theoretical foundations of computer science, software development rules, and some business aspects of informatics in non-technocratic, humanistic terms.
"Extended Finite Element Method" provides an introduction to the extended finite element method (XFEM), a novel computational method which has been proposed to solve complex crack propagation problems. The book helps readers understand the method and make effective use of the XFEM code and software plugins now available to model and simulate these complex problems. The book explores the governing equation behind XFEM, including
level set method and enrichment shape function. The authors outline
a new XFEM algorithm based on the continuum-based shell and
consider numerous practical problems, including planar
discontinuities, arbitrary crack propagation in shells and dynamic
response in 3D composite materials.
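To give a flavor of the enrichment idea described above, the following sketch (ours, not code from the book) evaluates a Heaviside-enriched XFEM displacement on a single quadrilateral element cut by a straight crack, with the crack described by a signed-distance level set; all names and the element setup are illustrative assumptions.

```python
import numpy as np

def level_set(point, crack_y=0.5):
    """Signed distance from a point to a straight horizontal crack
    along the line y = crack_y (positive above, negative below)."""
    return point[1] - crack_y

def heaviside(phi):
    """Generalized Heaviside enrichment: +1 on one side of the crack,
    -1 on the other, producing a displacement jump across it."""
    return 1.0 if phi >= 0.0 else -1.0

def enriched_displacement(shape_funcs, u_std, a_enr, point):
    """Heaviside-enriched XFEM approximation at one point:
        u(x) = sum_i N_i(x) u_i  +  sum_i N_i(x) H(phi(x)) a_i
    where a_i are the extra degrees of freedom of the enriched nodes."""
    N = np.array([f(point) for f in shape_funcs])
    H = heaviside(level_set(point))
    return N @ np.asarray(u_std) + H * (N @ np.asarray(a_enr))

# Bilinear shape functions on the unit square (nodes at the corners).
shape_funcs = [
    lambda p: (1 - p[0]) * (1 - p[1]),
    lambda p: p[0] * (1 - p[1]),
    lambda p: p[0] * p[1],
    lambda p: (1 - p[0]) * p[1],
]
u_std = np.zeros(4)  # no standard displacement
a_enr = np.ones(4)   # uniform enriched DOFs -> a pure jump across the crack
print(enriched_displacement(shape_funcs, u_std, a_enr, np.array([0.5, 0.75])))  # ->  1.0
print(enriched_displacement(shape_funcs, u_std, a_enr, np.array([0.5, 0.25])))  # -> -1.0
```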
The universe is considered an expansive informational field subject to a general organizational law. The organization of its deployment results in the emergence of an autonomous organization of spatial and material elements endowed with permanence, generated on an informational substratum where an organizational law is exercised at all scales. The initial action of a generating informational element produces a quantity of basic informational elements, which multiply to form other informational elements that are either neutral, constituting the basic spatial elements, or active, forming quantum elements. The neutral basic elements form space by continuous aggregation and represent the substrate of the informational links, allowing the active informational elements to communicate in order to aggregate and organize themselves. Every active element is immersed in an informational envelope, allowing it to continue its organization through constructive communications. The organizational law drives the active quantum elements to aggregate and produce new, more complex quantum elements, then molecular elements, massive elements, suns, and planets. Gravity is then the force of attraction exerted by the informational envelopes of the aggregates, in proportion to their mass, to develop them by the acquisition of new aggregates. The organizational communication of the informational envelopes of all physical material elements on Earth enables the organization of living things, with reproduction managed by communications between the informational envelopes of the elements, realizing a continuous and powerful evolution.
Due to the growing use of web applications and communication devices, the use of data has increased throughout various industries, and new techniques for managing data must be developed to ensure adequate usage. The Handbook of Research on Pattern Engineering System Development for Big Data Analytics is a critical scholarly resource that examines the incorporation of pattern management into business technologies, as well as decision-making and prediction processes, through the use of data management and analysis. Featuring coverage of a broad range of topics such as business intelligence, feature extraction, and data collection, this publication is geared towards professionals, academicians, practitioners, and researchers seeking current research on the development of pattern management systems for business applications.