Emerging Trends in Applications and Infrastructures for Computational Biology, Bioinformatics, and Systems Biology: Systems and Applications covers the latest trends in the field with special emphasis on their applications. The first part covers the major areas of computational biology: the development and application of data-analytical and theoretical methods, mathematical modeling, and computational simulation techniques for the study of biological and behavioral systems. The second part covers bioinformatics, an interdisciplinary field concerned with methods for storing, retrieving, organizing, and analyzing biological data. The book also explores the software tools used to generate useful biological knowledge. The third part, on systems biology, explores how to obtain, integrate, and analyze complex datasets from multiple experimental sources using interdisciplinary tools and techniques. The final section focuses on big data: datasets so large and complex that they are difficult to process with conventional database management systems or traditional data-processing applications.
The highly dynamic world of information technology service management stresses the benefits of the quick and correct implementation of IT services. A disciplined approach relies on a different set of assumptions and principles than an agile approach, and both have complicated implementation processes as well as substantial benefits. Combining these two approaches to enhance the effectiveness of each, while difficult, can yield exceptional dividends. Balancing Agile and Disciplined Engineering and Management Approaches for IT Services and Software Products is an essential publication that focuses on clarifying the theoretical foundations of balanced design methods with conceptual frameworks and empirical cases. Highlighting a broad range of topics including business trends, IT service, and software development, this book is ideally designed for software engineers, software developers, programmers, information technology professionals, researchers, academicians, and students.
Computer science has emerged as a key driver of innovation in the 21st century. Yet preparing teachers to teach computer science or integrate computer science content into K-12 curricula remains an enormous challenge. Recent policy reports have suggested the need to prepare future teachers to teach computer science through pre-service teacher education programs. In order to prepare a generation of teachers who are capable of delivering computer science to students, however, the field must identify research-based examples, pedagogical strategies, and policies that can facilitate changes in teacher knowledge and practices. The purpose of this book is to provide examples that could help guide the design and delivery of effective teacher preparation on the teaching of computer science. This book identifies promising pathways, pedagogical strategies, and policies that will help teacher education faculty and preservice teachers infuse computer science content into their curricula as well as teach stand-alone computing courses. Specifically, the book focuses on pedagogical practices for developing and assessing pre-service teacher knowledge of computer science, course design models for pre-service teachers, and discussion of policies that can support the teaching of computer science. The primary audience of the book is students and faculty in educational technology, educational or cognitive psychology, learning theory, teacher education, curriculum and instruction, computer science, instructional systems, and learning sciences.
Advances in Computers carries on a tradition of excellence, presenting detailed coverage of innovations in computer hardware, software, theory, design, and applications. The book provides contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles typically allow. The articles included in this book will become standard references, with lasting value in this rapidly expanding field.
The applications and uses of fog computing are quickly advancing to the next level. As it continues to grow, different types of virtualization technologies can thrust this branch of computing further into mainstream use. The Handbook of Research on Cloud and Fog Computing Infrastructures for Data Science is a key reference volume on the latest research on the role of next-generation systems and devices that are capable of self-learning and how those devices will impact society. Featuring wide-ranging coverage across a variety of relevant views and themes such as cognitive analytics, data mining algorithms, and the internet of things, this publication is ideally designed for programmers, IT professionals, students, researchers, and engineers looking for innovative research on software-defined cloud infrastructures and domain-specific analytics.
This book is a celebration of Leslie Lamport's work on concurrency, interwoven in four-and-a-half decades of an evolving industry: from the introduction of the first personal computer to an era when parallel and distributed multiprocessors are abundant. His works lay formal foundations for concurrent computations executed by interconnected computers. Some of the algorithms have become standard engineering practice for fault tolerant distributed computing - distributed systems that continue to function correctly despite failures of individual components. He also developed a substantial body of work on the formal specification and verification of concurrent systems, and has contributed to the development of automated tools applying these methods. Part I consists of the book's technical chapters and a biography. The technical chapters present a retrospective on Lamport's original ideas from experts in the field. Through this lens, they portray the ideas' long-lasting impact. The chapters cover timeless notions Lamport introduced: the Bakery algorithm, atomic shared registers and sequential consistency; causality and logical time; Byzantine Agreement; state machine replication and Paxos; temporal logic of actions (TLA). The professional biography tells of Lamport's career, providing the context in which his work arose and broke new ground, and discusses LaTeX - perhaps Lamport's most influential contribution outside the field of concurrency. This chapter gives a voice to the people behind the achievements, notably Lamport himself, and additionally the colleagues around him, who inspired, collaborated, and helped him drive worldwide impact. Part II consists of a selection of Leslie Lamport's most influential papers. This book touches on a lifetime of contributions by Leslie Lamport to the field of concurrency and on the extensive influence he had on people working in the field.
It will be of value to historians of science, and to researchers and students who work in the area of concurrency and who are interested in reading about the work of one of the most influential researchers in this field.
Combinatorial optimization is a multidisciplinary scientific area lying at the interface of three major scientific domains: mathematics, theoretical computer science, and management. The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area, dealing with fundamental notions and approaches as well as with several classical applications of combinatorial optimization. Concepts of Combinatorial Optimization is divided into three parts: on the complexity of combinatorial optimization problems, presenting basics of worst-case and randomized complexity; classical solution methods, presenting the two best-known methods for solving hard combinatorial optimization problems, namely Branch-and-Bound and Dynamic Programming; and elements of mathematical programming, presenting fundamentals of mathematical-programming-based methods that have been at the heart of Operations Research since the origins of the field.
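To give a flavor of one of the solution methods the book covers, here is a minimal sketch of Dynamic Programming applied to a classic hard combinatorial optimization problem, the 0/1 knapsack. The example is illustrative only and is not taken from the book itself.

```python
def knapsack(values, weights, capacity):
    """Return the maximum total value achievable within the capacity,
    where each item may be taken at most once (0/1 knapsack)."""
    # best[c] = best value achievable with capacity c using items seen so far
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities downward so each item is used at most once
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack([60, 100, 120], [10, 20, 30], 50))  # -> 220
```

The downward iteration over capacities is what distinguishes the 0/1 variant from the unbounded one: it guarantees that `best[c - w]` still reflects the state before the current item was considered.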
Communities of Computing is the first book-length history of the Association for Computing Machinery (ACM), founded in 1947 and with a membership today of 100,000 worldwide. It profiles ACM's notable SIGs, active chapters, and individual members, setting ACM's history into a rich social and political context. The book's 12 core chapters are organized into three thematic sections. "Defining the Discipline" examines the 1960s and 1970s when the field of computer science was taking form at the National Science Foundation, Stanford University, and through ACM's notable efforts in education and curriculum standards. "Broadening the Profession" looks outward into the wider society as ACM engaged with social and political issues - and as members struggled with balancing a focus on scientific issues and awareness of the wider world. Chapters examine the social turbulence surrounding the Vietnam War, debates about the women's movement, efforts for computing and community education, and international issues including professionalization and the Cold War. "Expanding Research Frontiers" profiles three areas of research activity where ACM members and ACM itself shaped notable advances in computing, including computer graphics, computer security, and hypertext. Featuring insightful profiles of notable ACM leaders, such as Edmund Berkeley, George Forsythe, Jean Sammet, Peter Denning, and Kelly Gotlieb, and honest assessments of controversial episodes, the volume deals with compelling and complex issues involving ACM and computing. It is not a narrow organizational history of ACM committees and SIGs, although much information about them is given. All chapters are original works of research. Many chapters draw on archival records of ACM's headquarters, ACM SIGs, and ACM leaders. This volume makes a permanent contribution to documenting the history of ACM and understanding its central role in the history of computing.
Text analysis tools aid in extracting meaning from digital content. As digital text becomes more and more complex, new techniques are needed to understand conceptual structure. Concept Parsing Algorithms (CPA) for Textual Analysis and Discovery: Emerging Research and Opportunities provides an innovative perspective on the application of algorithmic tools to study unstructured digital content. Highlighting pertinent topics such as semantic tools, semiotic systems, and pattern detection, this book is ideally designed for researchers, academics, students, professionals, and practitioners interested in developing a better understanding of digital text analysis.
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
As real-time and integrated systems become increasingly sophisticated, issues related to development life cycles, non-recurring engineering costs, and poor synergy between development teams will arise. The Handbook of Research on Embedded Systems Design provides insights from the computer science community on integrated systems research projects taking place in the European region. This premier reference work examines the diverse range of design principles covered by these projects, from specification at high abstraction levels using standards such as UML and related profiles to intermediate design phases. This work will be invaluable to designers of embedded software, academicians, students, practitioners, professionals, and researchers working in the computer science industry.
Despite advancements in technological and engineering fields, there is still a digital gender divide in the adoption, use, and development of information communication technology (ICT) services. This divide is also evident in educational environments and careers, specifically in the STEM fields. To mitigate this divide, policy approaches must be addressed and improved to encourage the inclusion of women in ICT disciplines. Gender Gaps and the Social Inclusion Movement in ICT provides emerging research exploring the theoretical and practical aspects of gender and policy from developed and developing country perspectives and their applications within ICT through various forms of research, including case studies. Featuring coverage of a broad range of topics such as digital identity, human rights, and social inclusion, this book is ideally designed for policymakers, academicians, researchers, students, and technology developers seeking current research on gender inequality in ICT environments.
"Extended Finite Element Method" provides an introduction to the extended finite element method (XFEM), a novel computational method proposed to solve complex crack propagation problems. The book helps readers understand the method and make effective use of the XFEM code and software plugins now available to model and simulate these complex problems. The book explores the governing equation behind XFEM, including the level set method and enrichment shape functions. The authors outline a new XFEM algorithm based on the continuum-based shell and consider numerous practical problems, including planar discontinuities, arbitrary crack propagation in shells, and dynamic response in 3D composite materials.
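The enrichment idea at the core of XFEM can be summarized by the standard enriched displacement approximation (a generic sketch of the method as commonly presented in the literature, not an equation taken from this book):

```latex
u^h(\mathbf{x}) = \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i
  + \sum_{j \in J} N_j(\mathbf{x})\,H\!\big(\phi(\mathbf{x})\big)\,\mathbf{a}_j
  + \sum_{k \in K} N_k(\mathbf{x}) \sum_{\alpha=1}^{4} F_\alpha(r,\theta)\,\mathbf{b}_{k\alpha}
```

Here the $N_i$ are the ordinary finite element shape functions, $H$ is a Heaviside function of the level set $\phi$ that captures the displacement jump across the crack faces for nodes $J$ whose support the crack cuts, and the $F_\alpha(r,\theta)$ are crack-tip branch functions enriching the nodes $K$ near the tip; $\mathbf{a}_j$ and $\mathbf{b}_{k\alpha}$ are the additional enriched degrees of freedom.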
The universe is considered an expansive informational field subjected to a general organizational law. The organization of the deployment results in the emergence of an autonomous organization of spatial and material elements endowed with permanence, which are generated on an informational substratum where an organizational law is exercised at all scales. The initial action of a generating informational element produces a quantity of basic informational elements that multiply to form other informational elements that will either be neutral, constituting the basic spatial elements, or active, forming quantum elements. The neutral basic elements will form the space by a continuous aggregation and will represent the substrate of the informational links, allowing the active informational elements to communicate, in order to aggregate and organize themselves. Every active element is immersed in an informational envelope, allowing it to continue its organization through constructive communications. The organizational law engages the active quantum elements to aggregate and produce new and more complex quantum elements, then molecular elements, massive elements, suns and planets. Gravity will then be the force of attraction exerted by the informational envelopes of the aggregates depending on their mass, to develop them by acquisition of new aggregates. The organizational communication of the informational envelopes of all of the physical material elements on Earth will enable the organization of living things, with reproduction managed by communications between the informational envelopes of the elements, realizing a continuous and powerful evolution.