Welcome to Loot.co.za!
This book features a systematic description of microelectronic device design ranging from the basics to current topics, such as low-power/ultralow-voltage designs including subthreshold current reduction, memory subsystem designs for modern DRAMs and various on-chip supply-voltage conversion techniques. It also covers process and device issues as well as design issues relating to systems, circuits, devices and processes, such as signal-to-noise and redundancy.
This practical new book offers the distributed-computing fundamentals that individuals need to connect with one another in a more secure and efficient way than with traditional blockchains. These new forms of secure, scalable blockchains promise to replace centralized institutions, connecting individuals without the risks of user manipulation or data extortion. The techniques taught herein enhance blockchain security and make blockchains scalable by relying on the observation that no blockchain can exist without solving the consensus problem. First, the state of the art in consensus protocols is analyzed, motivating the need for a new family of consensus protocols offering strong (deterministic) guarantees. Second, a didactic series of classic blockchain vulnerabilities is presented to illustrate the importance of novel designs better suited to the adversarial environment of open networks. These cutting-edge solutions are illustrated through the Redbelly blockchain design, which solves a different problem from the classic Byzantine consensus problem of 1982 and which delivers, in the modern blockchain context, high performance at large scale. Topics and features:
* Covers the combination of security and distributed computing to devise the new generation of blockchains
* Shows how blockchain has shed new light on decades of research in distributed systems
* Provides instruction on the security needed by the industry to use blockchains in production
* Explains didactically the necessary ingredients to make blockchains efficient at large scale
* Helps fill the knowledge gap in the highly demanded blockchain sector
This unique volume contains the building blocks needed to design secure and scalable blockchains. As such, it is dedicated to developers, application designers, and computer scientists, and requires only undergraduate-level mathematics and computer science.
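The Byzantine consensus problem mentioned above comes with a classic quantitative bound. As a minimal illustrative sketch (not code from the book, and using hypothetical helper names), the standard result is that n >= 3f + 1 replicas are needed to tolerate f Byzantine failures, because any two quorums of 2f + 1 replicas must overlap in at least one honest replica:

```python
# Illustrative sketch of the classic Byzantine fault-tolerance bound
# (helper names are hypothetical, not from the book).

def min_replicas(f: int) -> int:
    """Smallest replica count that tolerates f Byzantine faults: n = 3f + 1."""
    return 3 * f + 1

def quorum_size(n: int, f: int) -> int:
    """A quorum of 2f + 1 out of n = 3f + 1 replicas ensures any two
    quorums intersect in at least f + 1 replicas, hence one honest one."""
    return 2 * f + 1

for f in range(1, 4):
    n = min_replicas(f)
    q = quorum_size(n, f)
    overlap = 2 * q - n          # pigeonhole: two quorums share at least this many
    assert overlap >= f + 1      # so at least one shared replica is honest
    print(f"f={f}: n={n}, quorum={q}, guaranteed overlap={overlap}")
```

This arithmetic is the reason deterministic guarantees hinge on the fraction of honest participants, which is exactly the adversarial setting the blurb's "open networks" refers to.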
Vincent Gramoli is an Australian Research Council Future Fellow at the University of Sydney and the Chief Technology Officer of Redbelly Network. He teaches the Blockchain Scalability course on Coursera.
Test Resource Partitioning for System-on-a-Chip is about test resource partitioning and optimization techniques for plug-and-play system-on-a-chip (SOC) test automation. Plug-and-play refers to the paradigm in which core-to-core interfaces as well as core-to-SOC logic interfaces are standardized, such that cores can be easily plugged into "virtual sockets" on the SOC design, and core tests can be plugged into the SOC during test without substantial effort on the part of the system integrator. The goal of the book is to position test resource partitioning in the context of SOC test automation, as well as to generate interest and motivate research on this important topic. SOC integrated circuits composed of embedded cores are now commonplace. Nevertheless, there remain several roadblocks to rapid and efficient system integration. Test development is seen as a major bottleneck in SOC design, and test challenges are a major contributor to the widening gap between design capability and manufacturing capacity. Testing SOCs is especially challenging in the absence of standardized test structures, test automation tools, and test protocols. Test Resource Partitioning for System-on-a-Chip responds to a pressing need for a structured methodology for SOC test automation. It presents new techniques for the partitioning and optimization of the three major SOC test resources: test hardware, testing time, and test data volume. The book paves the way for a powerful integrated framework to automate the test flow for a large number of cores in an SOC in a plug-and-play fashion. The framework presented allows the system integrator to reduce test cost and meet short time-to-market requirements.
Analytics is changing the landscape of businesses across sectors globally, stimulating the interest of scholars and practitioners worldwide in this domain. The emergence of 'big data' has fueled the use of machine learning techniques and the acceptance of 'Analytics Enabled Decision Making'. This book provides a holistic theoretical perspective combined with the application of such theories, drawing on the experiences of industry professionals and academicians from around the world. The book discusses several paradigms, including pattern mining, clustering, classification, and data analysis, to name a few. Its main objective is to offer insight into the process of decision-making that is accelerated and made more precise with the help of analytics.
The International Conference on Informatics and Management Science (IMS) 2012 will be held on November 16-19, 2012, in Chongqing, China. It is organized by Chongqing Normal University, Chongqing University, Shanghai Jiao Tong University, Nanyang Technological University, the University of Michigan, and Chongqing University of Arts and Sciences, and sponsored by the National Natural Science Foundation of China (NSFC). The objective of IMS 2012 is to facilitate an exchange of information on best practices for the latest research advances in a range of areas. "Informatics and Management Science" contains over 600 contributions that suggest and inspire solutions and methods drawing on multiple disciplines, including: Computer Science, Communications and Electrical Engineering, Management Science, Service Science, and Business Intelligence.
What do philosophy and computer science have in common? It turns out, quite a lot! In providing an introduction to computer science (using Python), Daniel Lim presents in this book key philosophical issues, ranging from external world skepticism to the existence of God to the problem of induction. These issues, and others, are introduced through the use of critical computational concepts, ranging from image manipulation to recursive programming to elementary machine learning techniques. In illuminating some of the overlapping conceptual spaces of computer science and philosophy, Lim teaches the reader fundamental programming skills and also allows her to develop the critical thinking skills essential for examining some of the enduring questions of philosophy. Key features:
* Teaches readers actual computer programming, not merely ideas about computers
* Includes fun programming projects (like digital image manipulation and a Game of Life simulation), allowing the reader to develop the ability to write larger computer programs that require decomposition, abstraction, and algorithmic thinking
* Uses computational concepts to introduce, clarify, and develop a variety of philosophical issues
* Covers various aspects of machine learning and relates them to philosophical issues involving science and induction, as well as to ethical issues
* Provides a framework to critically analyze arguments in classic and contemporary philosophical debates
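The Game of Life project the blurb mentions is compact enough to sketch. The following is an illustrative Python version (not the book's own code) of the update rule: a cell survives with 2 or 3 live neighbours, and a dead cell is born with exactly 3:

```python
from collections import Counter

# Illustrative sketch (not from the book) of one generation of Conway's
# Game of Life on an unbounded grid, with live cells stored as a set.

def step(live):
    """Return the next generation given a set of live (x, y) cells."""
    # Count, for every cell adjacent to a live cell, its live neighbours.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a horizontal and a vertical bar of three:
blinker = {(0, 1), (1, 1), (2, 1)}
print(sorted(step(blinker)))  # [(1, 0), (1, 1), (1, 2)]
```

Even this tiny program exercises the decomposition and abstraction skills the blurb highlights: the rule, the neighbourhood, and the grid representation are each isolated decisions.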
Recent years have seen rapid strides in the sophistication of VLSI circuits. On the performance front, there is a vital need for techniques to design fast, low-power chips with minimum area for increasingly complex systems, while on the economic side there is the vastly increased pressure of time-to-market. These pressures have made the use of CAD tools mandatory in designing complex systems. Timing Analysis and Optimization of Sequential Circuits describes CAD algorithms for analyzing and optimizing the timing behavior of sequential circuits, with special reference to performance parameters such as power and area. A unified approach to performance analysis and optimization of sequential circuits is presented. The state of the art in timing analysis and optimization techniques is described for circuits using edge-triggered or level-sensitive memory elements. Specific emphasis is placed on two methods that are true sequential timing optimization techniques: retiming and clock skew optimization. Timing Analysis and Optimization of Sequential Circuits covers the following topics:
* Algorithms for sequential timing analysis
* Fast algorithms for clock skew optimization and their applications
* Efficient techniques for retiming large sequential circuits
* Coupling sequential and combinational optimizations
The book is written for graduate students, researchers, and professionals in the area of CAD for VLSI and VLSI circuit design.
This book is a guide to kinetic studies of reaction mechanisms. It reviews conventional reactor types and data collection methods, and introduces a new methodology for data collection using Temperature Scanning Reactors (TSR). It provides a theoretical and practical approach to temperature scanning (TS) methodology and supports a revival of kinetic studies as a useful approach to the fundamental understanding of chemical reaction mechanisms and the consequential reaction kinetics.
The industrial society is fast becoming an information society. As a result, many companies are experiencing serious difficulties in developing the new internal structures required. The increasing use of information technology has a profound effect on markets, products, and processes, as well as the management of and co-operation between companies. Recognising the possibilities and grasping the emerging potential is an important challenge for today's management, if organisations and systems are to develop over the next twenty years.
Office automation and associated hardware and software technologies are producing significant changes in traditional typing, printing, and publishing techniques and strategies. The long-term impact of current developments is likely to be even more far-reaching, as falling hardware costs, improved human-computer interfacing, uniformity through standardization, and sophisticated software facilities will all combine to provide systems of great power, capability, and flexibility. The configuration of the system can be matched to the requirements of the user, whether typist, clerk, secretary, scientist, manager, director, or publisher. Enormous advances are currently being made in publication systems, in bringing together text and pictures, and in the aggregation of a greater variety of multimedia documents. Advances in technology and reductions in cost and size have brought many 'desk-top' publishing systems to the market. More sophisticated systems are targeted at the high end of the market for newspaper production and quality color output. Outstanding issues in desk-top publishing systems include interactive editing of structured documents, integration of text and graphics, page description languages, standards, and the human-computer interface to documentation systems. The latter area is becoming increasingly important: usability by non-specialists and flexibility across application areas are two current concerns. One of the objectives of current work is to bring the production of high-quality documents within the capability of naive users as well as experts.
Simulating Fuzzy Systems demonstrates how many systems naturally become fuzzy systems and shows how regular (crisp) simulation can be used to estimate the alpha-cuts of the fuzzy numbers used to analyze the behavior of the fuzzy system. This monograph presents a concise introduction to fuzzy sets, fuzzy logic, fuzzy estimation, fuzzy probabilities, fuzzy systems theory, and fuzzy computation. It also presents a wide selection of simulation applications ranging from emergency rooms to machine shops to project scheduling, showing the varieties of fuzzy systems.
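The core idea the blurb describes, estimating alpha-cuts of fuzzy outputs by running crisp evaluations, can be sketched briefly. This is an illustrative example only (the function names and the utilization model are assumptions, not the monograph's code), valid for a monotone crisp model where the output interval is bounded by evaluating at the input interval's endpoints:

```python
# Illustrative sketch (not from the monograph): alpha-cuts of a triangular
# fuzzy number, propagated through a crisp model by endpoint evaluation.

def alpha_cut(a, b, c, alpha):
    """Alpha-cut [left, right] of the triangular fuzzy number (a, b, c):
    the crisp interval of all values with membership >= alpha."""
    return (a + alpha * (b - a), c - alpha * (c - b))

def fuzzy_response(model, tri, alphas=(0.0, 0.5, 1.0)):
    """Estimate alpha-cuts of the output by running the crisp model at the
    endpoints of each input alpha-cut (valid for a monotone model)."""
    return {alpha: tuple(sorted((model(lo), model(hi))))
            for alpha in alphas
            for lo, hi in [alpha_cut(*tri, alpha)]}

# Example: a service rate of "about 2.0" feeding a crisp utilization model.
util = lambda mu: 1.8 / mu   # utilization = arrival rate / service rate
print(fuzzy_response(util, (1.5, 2.0, 2.5)))
```

At alpha = 1 the cut collapses to the core value 2.0, so the output is a single point; lower alphas yield progressively wider crisp intervals, which together describe the fuzzy output.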
Industrial Engineering (IE) is concerned with the design, improvement, and installation of integrated systems of people, material, equipment, and energy. Industrial engineers face many problems with incomplete and vague information in these systems, since the characteristics of these problems often require this kind of information. Fuzzy set approaches are usually most appropriate when human evaluations and the modeling of human knowledge are needed. IE brings a significant number of applications of fuzzy set theory. After an introductory chapter explaining the recent status of fuzzy sets in IE, this volume contains application chapters on the seven major areas of IE to which fuzzy set theory can contribute: Control and Reliability, Engineering Economics and Investment Analysis, Group and Multi-criteria Decision-making, Human Factors Engineering and Ergonomics, Manufacturing Systems and Technology Management, Optimization Techniques, and Statistical Decision-making. Under these major areas, every chapter includes didactic numerical applications.
Reasoning in Boolean Networks provides a detailed treatment of recent research advances in algorithmic techniques for logic synthesis, test generation, and formal verification of digital circuits. The book presents the central idea of approaching design automation problems for logic-level circuits by specific Boolean reasoning techniques. While Boolean reasoning techniques have been a central element of two-level circuit theory for many decades, Reasoning in Boolean Networks describes a basic reasoning methodology for multi-level circuits. This leads to a unified view of two-level and multi-level logic synthesis. The presented reasoning techniques are applied to various CAD problems to demonstrate their usefulness for today's industrially relevant problems. Reasoning in Boolean Networks provides lucid descriptions of basic algorithmic concepts in automatic test pattern generation, logic synthesis, and verification, and elaborates their intimate relationship to provide further intuition and insight into the subject. Numerous examples are provided for ease of understanding. Reasoning in Boolean Networks is intended for researchers in logic synthesis, VLSI testing, and formal verification, as well as for integrated circuit designers who want to enhance their understanding of basic CAD methodologies.
This book explores clustering operations in the context of social networks and consensus-reaching paths that take non-cooperative behaviors into account. It focuses on the two key issues in large-scale group decision-making: clustering and consensus building. Clustering aims to reduce the dimension of a large group. Consensus reaching requires that the divergent individual opinions of the decision makers converge to the group opinion. The book emphasizes similarity of opinions and social relationships as important measurement attributes of clustering, which distinguishes it from traditional clustering methods that divide the original large group by a single attribute without combining these two attributes. The proposed consensus models focus on the treatment of non-cooperative behaviors in the consensus-reaching process and explore the influence of trust loss on that process. The logic is as follows: first, a clustering algorithm is adopted to reduce the dimension of decision makers; then, based on the clusters' opinions obtained, a consensus-reaching process is carried out to obtain a decision result acceptable to the majority of decision makers. Graduates and researchers in the fields of management science, computer science, information management, engineering technology, etc., who are interested in large-scale group decision-making and consensus building are the potential audience of this book. It helps readers gain a deeper and more comprehensive understanding of clustering analysis and consensus building in large-scale group decision-making.
This reference blends the concepts of optics and microwave theory. It is logically organized in two main parts: the first deals with network analysis, while the second concentrates on signal analysis. As a whole, the text focuses on the fundamental aspects of optical networks. Methodology, rather than analysis, is the focus of the book; the discussion provides the tools you need to perform your own in-depth analysis of optical networks.
Business organizations and governments are nowadays developing and providing internet-based electronic services (e-services) featuring various intelligent functions. This book offers a thorough introduction and systematic overview of the new field of e-service intelligence. It covers the state of the art of e-service intelligence, including both theory and applications, and discusses a broad range of topics.
Linear algebra is growing in importance. 3D entertainment, animations in movies, and video games are developed using linear algebra. Animated characters are generated using equations straight out of this book. Linear algebra is also used to extract knowledge from the massive amounts of data generated by modern technology. The Fourth Edition of this popular text introduces linear algebra in a comprehensive, geometric, and algorithmic way. The authors start with the fundamentals in 2D and 3D, then move on to higher dimensions, expanding on the fundamentals and introducing new topics, which are necessary for many real-life applications and the development of abstract thought. Applications are introduced to motivate topics. The subtitle, A Geometry Toolbox, hints at the book's geometric approach, which is supported by many sketches and figures. Furthermore, the book covers applications of triangles, polygons, conics, and curves, and examples demonstrate each topic in action. This practical approach to a linear algebra course, whether through classroom instruction or self-study, is unique to this book. New to the Fourth Edition:
* Ten new application sections
* A new section on change of basis; this concept now appears in several places
* Chapters 14-16 on higher dimensions are notably revised
* A deeper look at polynomials in the gallery of spaces
* Introduces the QR decomposition and its relevance to least squares
* Similarity and diagonalization are given more attention, as are eigenfunctions
* A longer thread on least squares, running from orthogonal projections to a solution via the SVD and the pseudoinverse
* More applications for PCA have been added
* More examples and exercises, and more on the kernel and general linear spaces
* A list of applications has been added in Appendix A
The book gives instructors the option of tailoring the course to the primary interests of their students: mathematics, engineering, science, computer graphics, and geometric modeling.
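The least-squares thread the blurb describes, from orthogonal projections to a solution via the SVD and the pseudoinverse, can be sketched in a few lines. This is an illustrative example with made-up data, not material from the book; NumPy's `pinv` is SVD-based, so applying it to an overdetermined system gives the minimum-norm least-squares solution:

```python
import numpy as np

# Illustrative sketch (data made up): fit a line y = c0 + c1*t to three
# samples by minimizing ||Ax - b|| with the SVD-based pseudoinverse.

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])          # columns: constant term and t
b = np.array([1.0, 2.0, 2.0])       # samples at t = 0, 1, 2

x = np.linalg.pinv(A) @ b           # pseudoinverse solution (SVD under the hood)

# The library least-squares routine reaches the same point, the
# orthogonal projection of b onto the column space of A:
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x)                            # best-fit intercept and slope
```

Here the residual A @ x - b is orthogonal to both columns of A, which is the orthogonal-projection picture the book's geometric approach builds the algebra on.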
This book brings together some of the most impactful researchers in the field of Genetic Programming (GP), each one working on unique and interesting intersections of theoretical development and practical applications of this evolutionary machine learning paradigm. Topics of particular interest for this year's book include powerful modeling techniques through GP-based symbolic regression, novel selection mechanisms that help guide the evolutionary process, modular approaches to GP, and applications in cybersecurity, biomedicine, and program synthesis, as well as papers by practitioners of GP that focus on usability and real-world results. In summary, readers will get a glimpse of the current state of the art in GP research.
* Discusses concepts such as Basic Programming Principles, OOP Principles, Database Programming, GUI Programming, Network Programming, Data Analytics and Visualization, Statistical Analysis, Virtual Reality, Web Development, Machine Learning, and Deep Learning
* Provides the code and the output for all the concepts discussed
* Includes a case study at the end of each chapter
Paralleling emerging trends in cyber-health technology, concerns are mounting about racial and ethnic disparities in health care utilization and outcomes. This book brings these themes together, challenging readers to use, promote, and develop new technology-based methods for closing these gaps. Edited by a leading urban health advocate and featuring 16 expert contributors, the book examines cyber-strategies with the greatest potential toward effective, equitable care, improved service delivery and better health outcomes for all. The rise of e-Patients and the transformation of the doctor-patient relationship are also discussed.
Modeling by Object-Driven Linear Elemental Relations (MODLER) is a computer language for representing linear programming models, completely separate from instances defined by data realizations. It also includes representations of binary variables and logical constraints, which arise naturally in large-scale planning and operational decision support. The basic input to MODLER is a model file, and its basic output is a matrix file in the standard (MPS) format accepted by most optimizers and by ANALYZE and RANDMOD. MODLER can also generate a syntax file for ANALYZE to enable automatic translation of activities and constraints into English for intelligent analysis support. The book is accompanied by a DOS version of MODLER on 3.5-inch diskettes; A Laboratory Manual for Teaching Linear Programming is available upon request.
The European Computing Conference offers a unique forum for establishing new collaborations within present or upcoming research projects, exchanging useful ideas, presenting recent research results, participating in discussions, and linking universities with industry. Engineers and scientists working in various areas of systems theory, applied mathematics, simulation, numerical and computational methods, and parallel computing present the latest findings, advances, and current trends on a wide range of topics. This proceedings volume will be of interest to students, researchers, and practicing engineers.
Strengthen your students' understanding and upgrade their confidence and exam skills with our OCR Computer Science workbooks, full of self-contained exercises to consolidate knowledge and exam practice questions to improve performance. Written by an experienced Computer Science author, these full colour workbooks provide stimulus materials on all AS and A-level topics, followed by sets of questions designed to develop and test skills in the unit.
* Thoroughly prepares students for their examinations as they work through numerous practice questions that cover every question type in the specification.
* Helps students identify their revision needs and see how to target the top grades using online answers for each question.
* Encourages ongoing revision throughout the course as students progressively develop their skills in class and at home.
* Packed full of consolidation and exam practice questions, these workbooks can save valuable preparation time and expense, with self-contained exercises that don't need photocopying and provide instant lesson and homework solutions for specialist and non-specialist teachers.
* Ensures that students feel confident tackling their exams as they know what to expect in each section.