Whether you're new to systems analysis, or have been there, done that, and seen it all, and especially if you want to ponder the significance of information systems analysis in the scheme of the universe, this book is for you. The author brings a unique perspective to the problems of computer system analysis.
This book will provide a comprehensive overview of business analytics, for those who have either a technical background (quantitative methods) or a practitioner business background. Business analytics, in the context of the 4th Industrial Revolution, is the "new normal" for businesses that operate in this digital age. This book provides a comprehensive primer and overview of the field (and related fields such as Business Intelligence and Data Science). It will discuss the field as it applies to financial institutions, with some minor departures to other industries. Readers will gain understanding and insight into the field of data science, including traditional as well as emerging techniques. Further, many chapters are dedicated to the establishment of a data-driven team - from executive buy-in and corporate governance to managing and quantifying the return of data-driven projects.
"Distributed Programming: Theory and Practice" presents a practical and rigorous method to develop distributed programs that correctly implement their specifications. The method also covers how to write specifications and how to use them. Numerous examples such as bounded buffers, distributed locks, message-passing services, and distributed termination detection illustrate the method. Larger examples include data transfer protocols, distributed shared memory, and TCP network sockets. "Distributed Programming: Theory and Practice" bridges the gap between books that focus on specific concurrent programming languages and books that focus on distributed algorithms. Programs are written in a "real-life" programming notation, along the lines of Java and Python, with explicit instantiation of threads and programs. Students and programmers will see these as programs and not "merely" algorithms in pseudo-code. The programs implement interesting algorithms and solve problems that are large enough to serve as projects in programming classes and software engineering classes. Exercises and examples are included at the end of each chapter, with online access to the solutions. "Distributed Programming: Theory and Practice" is designed as an advanced-level textbook for students in computer science and electrical engineering. Programmers, software engineers, and researchers working in this field will also find this book useful.
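The bounded buffer named among the book's examples can be sketched in ordinary Python with explicit threads. This toy version is my own illustration of the classic monitor-style pattern, not the notation used in the book:

```python
# A minimal bounded buffer guarded by a lock and two condition variables.
import threading

class BoundedBuffer:
    def __init__(self, capacity):
        self.buf = []
        self.capacity = capacity
        self.lock = threading.Lock()
        self.not_full = threading.Condition(self.lock)
        self.not_empty = threading.Condition(self.lock)

    def put(self, item):
        with self.not_full:
            while len(self.buf) >= self.capacity:
                self.not_full.wait()   # block producers while full
            self.buf.append(item)
            self.not_empty.notify()    # wake one waiting consumer

    def get(self):
        with self.not_empty:
            while not self.buf:
                self.not_empty.wait()  # block consumers while empty
            item = self.buf.pop(0)
            self.not_full.notify()     # wake one waiting producer
            return item

# One producer and one consumer exchanging ten items through a buffer of size 3.
buf = BoundedBuffer(3)
out = []
producer = threading.Thread(target=lambda: [buf.put(i) for i in range(10)])
consumer = threading.Thread(target=lambda: [out.append(buf.get()) for _ in range(10)])
producer.start(); consumer.start()
producer.join(); consumer.join()
print(out)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

With a single producer and a single consumer the FIFO buffer preserves order, so the consumer receives the items exactly as sent.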
Podcasting Success in a Day: Beginner's Guide to Fast, Easy, and Efficient Learning of Podcasting. What is podcasting? Want to take your online marketing to the next level and offer podcasts? Need a simple guide to getting started with podcasting? Looking for cheap and easy podcasting solutions? Wondering what equipment to use, or how to build a successful podcasting strategy? Want tips on how to get consumers to subscribe to your podcast? It all starts today, right here, with one click!
This book considers logical proof systems from the point of view of their space complexity. After an introduction to propositional proof complexity, the author structures the book into three main parts. Part I contains two chapters on resolution, one containing results already known in the literature before this work and one focused on space in resolution; the author then moves on to polynomial calculus and its space complexity, with a focus on the combinatorial technique for proving monomial space lower bounds. The first chapter in Part II addresses the proof complexity and space complexity of the pigeonhole principles. Then there is an interlude on a new type of game, defined on bipartite graphs, essentially independent from the rest of the book, collecting some results on graph theory. Finally, Part III analyzes the size of resolution proofs in connection with the Strong Exponential Time Hypothesis (SETH) in complexity theory. The book is appropriate for researchers in theoretical computer science, in particular computational complexity.
This monograph describes the latest advances in discriminative learning methods for biometric recognition. Specifically, it focuses on three representative categories of methods: sparse representation-based classification, metric learning, and discriminative feature representation, together with their applications in palmprint authentication, face recognition and multi-biometrics. The ideas, algorithms, experimental evaluation and underlying rationales are also provided for a better understanding of these methods. Lastly, it discusses several promising research directions in the field of discriminative biometric recognition.
This book covers the important aspects involved in making cognitive radio devices portable, mobile and green, while also extending their service life. At the same time, it presents a variety of established theories and practices concerning cognitive radio from academia and industry. Cognitive radio can be utilized as a backbone communication medium for wireless devices. To effectively achieve its commercial application, various aspects of quality of service and energy management need to be addressed. The topics covered in the book include energy management and quality of service provisioning at Layer 2 of the protocol stack from the perspectives of medium access control, spectrum selection, and self-coexistence for cognitive radio networks.
These are the proceedings of the 20th international conference on domain decomposition methods in science and engineering. Domain decomposition methods are iterative methods for solving the often very large linear or nonlinear systems of algebraic equations that arise when various problems in continuum mechanics are discretized using finite elements. They are designed for massively parallel computers and take the memory hierarchy of such systems into account. This is essential for approaching peak floating point performance. There is an increasingly well developed theory which is having a direct impact on the development and improvement of these algorithms.
This series, which published its first volume in 1960 and is now the oldest such series still in publication, covers new developments in computer technology. Each volume contains from 5 to 7 chapters, and 3 volumes are produced annually. Most chapters present an overview of a current subfield within computer science, include many citations, and often describe new developments contributed by the chapter authors. Topics include hardware, software, web technology, communications, theoretical underpinnings of computing, and novel applications of computers. The book series is a valuable addition to university courses that emphasize the topics under discussion in that particular volume, and it also belongs on the bookshelf of industrial practitioners who need to implement many of the technologies that are described.
Volume 5, Reviews in Computational Chemistry, Kenny B. Lipkowitz and Donald B. Boyd. A valuable resource for novices and practitioners alike, this series features detailed treatments of the latest advances in computational methods for organic, pharmaceutical, physical, and biological chemistry. Balancing academic and industrial interests, Volume 5 presents tutorials on post-Hartree-Fock methods, electron population analysis, Brownian dynamics, lipid simulations, distance geometry in molecular modeling, and computer-aided drug design. A history traces the field's growth and its relationship to funding agencies. An enlarged compendium of software serves as a valuable buyer's guide. From reviews of the series: "Many of the articles are indeed accessible to any interested nonspecialist, even without theoretical background." (Journal of the American Chemical Society) "This book serves beginners as well as experts looking for new perspectives in the field and is highly recommended." (Journal of Molecular Graphics)
In this book you will learn all of the basics required to rig any character in Maya. The book covers everything from joints to wires, from the connection editor to pruning small weights. With over 30 example files, 200 images, and countless step-by-step tutorials, you will be shown exactly how to rig a foot and leg, a hand, and much more. The thing that separates this book from the competition is that it answers the question "Why?" The book covers exactly what a gimbal lock is and how to avoid it, which axis to use as your primary one, how to add an influence object to fix a rig already in progress, and why you would use a joint in some cases. Knowing why things are done the way they are will allow you to innovate and create new rigs. Knowing why will allow you to troubleshoot someone else's rig, and progress far beyond being a beginner. This book will help the beginner build a solid foundation and is a great addition for any character rigger using Maya. Proceeds donated to charity.
Pursuing an interdisciplinary approach, this book offers detailed insights into the empirical relationships between overall social key figures of states and cultures in the fields of information and communication technology (ICT) (digital divide/inequality), the economy, education and religion. Its goal is to bridge the 'cultural gap' between computer scientists, engineers, economists, social and political scientists by providing a mutual understanding of the essential challenges posed and opportunities offered by a global information and knowledge society. In a sense, the historically unprecedented technical advances in the field of ICT are shaping humanity at different levels and forming a hybrid (intelligent) human-technology system, a so-called global superorganism. The main innovation is the combined study of digitization and globalization in the context of growing social inequalities, collapse, and sustainable development, and how a convergence towards a kind of global culture could take place. Accordingly, the book discusses the spread of ICT, Internet Governance, the balance between the central concentration of power and the extent of decentralized power distribution, the inclusion or exclusion of people and states in global communication processes, and the capacity for global empathy or culture.
This book covers reliability assessment and prediction of new technologies such as next generation networks that use cloud computing, Network Function Virtualization (NFV), Software-Defined Networking (SDN), next generation transport, evolving wireless systems, digital VoIP telephony, and reliability testing techniques specific to Next Generation Networks (NGN). This book introduces each technology to the reader first, followed by advanced reliability techniques applicable to both hardware and software reliability analysis. The book covers methodologies that can predict reliability from component failure rates up to system-level downtimes. The book's goal is to familiarize the reader with the analytical techniques, tools, and methods necessary for analyzing very complex networks built on very different technologies. The book lets readers quickly learn the technologies behind currently evolving NGN and apply advanced Markov modeling and Software Reliability Engineering (SRE) techniques for assessing their operational reliability. Covers reliability analysis of advanced networks and provides basic mathematical tools, analysis techniques, and methodology for reliability and quality assessment; develops Markov and software engineering models to predict reliability; covers both hardware and software reliability for next generation technologies.
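To give a flavor of the Markov techniques involved, the simplest case is a two-state (up/down) availability model, whose steady state has a closed form. The failure and repair rates below are illustrative assumptions of my own, not values from the book:

```python
# Two-state continuous-time Markov availability model:
# the system fails at rate lambda and is repaired at rate mu.
failure_rate = 1e-4   # lambda, failures per hour (illustrative)
repair_rate = 0.5     # mu, repairs per hour (illustrative)

# Steady-state probability of the "up" state: A = mu / (lambda + mu)
availability = repair_rate / (failure_rate + repair_rate)

# Expected downtime per year in minutes (365 days * 24 h * 60 min)
downtime_min_per_year = (1 - availability) * 365 * 24 * 60

print(round(availability, 6))          # ~0.9998
print(round(downtime_min_per_year, 1)) # ~105.1 minutes/year
```

The same balance-equation approach scales to larger state spaces (redundant components, imperfect coverage), which is where the book's Markov modeling machinery comes in.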
This accessible compendium examines a collection of significant technology firms that have helped to shape the field of computing and its impact on society. Each company is introduced with a brief account of its history, followed by a concise account of its key contributions. The selection covers a diverse range of historical and contemporary organizations from pioneers of e-commerce to influential social media companies. Features: presents information on early computer manufacturers; reviews important mainframe and minicomputer companies; examines the contributions to the field of semiconductors made by certain companies; describes companies that have been active in developing home and personal computers; surveys notable research centers; discusses the impact of telecommunications companies and those involved in the area of enterprise software and business computing; considers the achievements of e-commerce companies; provides a review of social media companies.
Developments in industry in recent years have made employee learning a critical factor in organizations' success. The ever-faster pace of technological development and the variety of tasks that business professionals must perform mean that on-the-job learning is a constant, too quick and vital to be left to training departments. And yet, management knows too little about how workers learn on the job and does not give sufficient time and effort to understanding this process. As learning is largely left to chance, it is amazing that it happens at all, and well enough to enable workers to be productive and not to destroy each other's work. This book explores the daily work lives and learning experiences of programmers and other professionals in the computer-software industry. The book focuses on the staff of one small software firm, allowing workers to tell their own stories, describing their work and their use of all the resources available to them in learning the complex systems they are required to develop and maintain. Based on qualitative sociological methods, it is an ethnography of a business setting as well as a study of learning. After describing the professional world in which programmers work, the book introduces the company to be discussed and the backgrounds of the participants in the study. Then, proceeding from the environment to the systems to be learned, the author schematizes all of the resources professionals use on the job as learning tools: their experiences and thought processes, documentation, their colleagues, the computer, and the software system itself. All of this material is then related to academic models of learning style, which are mostly found not to be very relevant, as they are not grounded in the life experiences of workers.
The author advocates that professionals' learning be modeled in context, that training be developed from experience rather than from theory, and that management strive to build a workplace and an organizational culture as conducive as possible to employees' continual learning.
Social insects such as ants and termites can be viewed as powerful problem-solving systems with sophisticated collective intelligence. Composed of simple interacting agents, this intelligence lies in the networks of interactions among individuals and between individuals and the environment. Social insects are also a powerful metaphor for artificial intelligence. The problems they solve - for instance, finding food, dividing labor among nestmates, building nests, and responding to external challenges - have important counterparts in engineering and computer science. This book provides a detailed look at models of social insect behaviour and how these can be applied in the design of complex systems. It draws upon a complementary blend of biology and computer science, including artificial intelligence, robotics, operations research, information display, and computer graphics. The book should appeal to a broadly interdisciplinary audience of modellers, engineers, neuroscientists, and computer scientists, as well as some biologists and ecologists.
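The foraging behavior described above is often introduced via the "double bridge" model: ants choose between two paths in proportion to pheromone levels, and shorter paths accumulate pheromone faster, creating positive feedback. The deterministic mean-field sketch below is my own illustration of that feedback loop; the update rule and parameters are assumptions, not taken from the book:

```python
# Mean-field sketch of pheromone-based path choice between two branches.
# Each round, traffic splits in proportion to pheromone, and each branch
# receives a deposit inversely proportional to its length, so the shorter
# branch is reinforced faster and eventually dominates.
pheromone = {"short": 1.0, "long": 1.0}
length = {"short": 1.0, "long": 2.0}

for _ in range(100):
    total = sum(pheromone.values())
    # Deposit on each branch: (fraction of ants choosing it) / (branch length)
    deposits = {p: (pheromone[p] / total) / length[p] for p in pheromone}
    for p in pheromone:
        pheromone[p] += deposits[p]

print(pheromone["short"] > pheromone["long"])  # True: colony converges on the short path
```

Real ant-inspired algorithms add stochastic choice and pheromone evaporation, but the core mechanism is exactly this reinforcement loop.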
Healthcare Informatics: Improving Efficiency and Productivity examines the complexities involved in managing resources in our healthcare system and explains how management theory and informatics applications can increase efficiencies in various functional areas of healthcare services. Delving into data and project management and advanced analytics, this book details and provides supporting evidence for the strategic concepts that are critical to achieving successful healthcare information technology (HIT), information management, and electronic health record (EHR) applications. This includes the vital importance of involving nursing staff in rollouts, engaging physicians early in any process, and developing an organizational culture more receptive to digital information and systems adoption. "We owe it to ourselves and future generations to do all we can to make our healthcare systems work smarter, be more effective, and reach more people. The power to know is at our fingertips; we need only embrace it." (From the foreword by James H. Goodnight, PhD, CEO, SAS) Bridging the gap from theory to practice, the book discusses actual informatics applications that have been incorporated by various healthcare organizations and the corresponding management strategies that led to their successful employment.
Offering a wealth of detail, it describes several working projects, including:
- A computerized physician order entry (CPOE) system project at a North Carolina hospital
- E-commerce self-service patient check-in at a New Jersey hospital
- The informatics project that turned a healthcare system's paper-based resources into digital assets
- Projects at one hospital that helped reduce excess length of stay, improved patient safety, and improved efficiency with an ADE alert system
- A healthcare system's use of algorithms to identify patients at risk for hepatitis
Offering the guidance that healthcare specialists need to make use of various informatics platforms, this book provides the motivation and the proven methods that can be adapted and applied to any number of staff, patient, or regulatory concerns.
Weighted finite automata are classical nondeterministic finite automata in which the transitions carry weights. These weights may model, for example, the cost involved when executing a transition, the resources or time needed for this, or the probability or reliability of its successful execution. Weights can also be added to classical automata with infinite state sets like pushdown automata, and this extension constitutes the general concept of weighted automata. Since their introduction in the 1960s they have stimulated research in related areas of theoretical computer science, including formal language theory, algebra, logic, and discrete structures. Moreover, weighted automata and weighted context-free grammars have found application in natural-language processing, speech recognition, and digital image compression. This book covers all the main aspects of weighted automata and formal power series methods, ranging from theory to applications. The contributors are the leading experts in their respective areas, and each chapter presents a detailed survey of the state of the art and pointers to future research. The chapters in Part I cover the foundations of the theory of weighted automata, specifically addressing semirings, power series, and fixed point theory. Part II investigates different concepts of weighted recognizability. Part III examines alternative types of weighted automata and various discrete structures other than words. Finally, Part IV deals with applications of weighted automata, including digital image compression, fuzzy languages, model checking, and natural-language processing. Computer scientists and mathematicians will find this book an excellent survey and reference volume, and it will also be a valuable resource for students exploring this exciting research area.
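The idea of transitions carrying weights can be made concrete with a small sketch: over the tropical (min, +) semiring, the weight of a word is the cheapest accepting run, with transition weights read as costs. The toy automaton and names below are my own illustration, not an example from the book:

```python
# Minimal weighted finite automaton over the tropical (min, +) semiring:
# path weight = sum of transition weights, word weight = min over paths.
INF = float("inf")

# transitions[state][symbol] -> list of (next_state, weight)
transitions = {
    0: {"a": [(0, 1.0), (1, 3.0)]},
    1: {"b": [(1, 0.5)]},
}
initial = {0: 0.0}   # initial weight of each start state
final = {1: 0.0}     # final weight of each accepting state

def word_weight(word):
    """Cheapest accepting run on `word`, or infinity if none exists."""
    costs = dict(initial)            # best cost to reach each state so far
    for sym in word:
        nxt = {}
        for state, c in costs.items():
            for q, w in transitions.get(state, {}).get(sym, []):
                nxt[q] = min(nxt.get(q, INF), c + w)  # semiring "plus" is min
        costs = nxt
    return min((c + final[q] for q, c in costs.items() if q in final),
               default=INF)

print(word_weight("ab"))   # 3.5: 0 -a(3.0)-> 1 -b(0.5)-> 1
print(word_weight("aab"))  # 4.5: 0 -a(1.0)-> 0 -a(3.0)-> 1 -b(0.5)-> 1
print(word_weight("b"))    # inf: no accepting run
```

Swapping min/+ for other semiring operations (e.g. +/* for probabilities) changes the interpretation without changing the algorithm, which is the generality the power series framework captures.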