ClearRevise is all about making your revision easy. At the end of the course, doing practice papers is useful - but an exam tutor can make a big difference. This book provides support from both angles and will really help you to ace the exam. The first section is your exam tutor: it shows you example questions with model answers and, just like a tutor, gives you exam tips and lets you know what the examiner is looking for. You are then given similar questions from the same topic to have a go at yourself, applying your knowledge and the tips. With over 400 marks in this section and all the answers provided, you'll easily revise the topics as you go. Lastly, there are two complete exam papers, written in the same style as the live OCR papers, for you to try. They're exactly the same length and marks as the real exam, providing a realistic experience and a great opportunity to show how much you've progressed.
This classroom-tested textbook describes the design and implementation of software for distributed real-time systems, using a bottom-up approach. The text addresses common challenges faced in software projects involving real-time systems, and presents a novel method for simply and effectively performing all of the software engineering steps. Each chapter opens with a discussion of the core concepts, together with a review of the relevant methods and available software. This is then followed with a description of the implementation of the concepts in a sample kernel, complete with executable code. Topics and features: introduces the fundamentals of real-time systems, including real-time architecture and distributed real-time systems; presents a focus on the real-time operating system, covering the concepts of task, memory, and input/output management; provides a detailed step-by-step construction of a real-time operating system kernel, which is then used to test various higher level implementations; describes periodic and aperiodic scheduling, resource management, and distributed scheduling; reviews the process of application design from high-level design methods to low-level details of design and implementation; surveys real-time programming languages and fault tolerance techniques; includes end-of-chapter review questions, extensive C code, numerous examples, and a case study implementing the methods in real-world applications; supplies additional material at an associated website. Requiring only a basic background in computer architecture and operating systems, this practically-oriented work is an invaluable study aid for senior undergraduate and graduate-level students of electrical and computer engineering, and computer science. The text will also serve as a useful general reference for researchers interested in real-time systems.
The acceleration of the Internet and the growing importance of ICT in globalized markets have played a vital role in the increasingly complex standardization work of ICT companies. Given the economic importance of standards, companies and organizations are bringing their own ideas and technologies into the Internet's standards settings. Innovations in Organizational IT Specification and Standards Development provides advancing research on all current aspects of IT standards and standardization, and aims to be a useful source of knowledge for IT researchers, scholars, and practitioners alike.
This book contains a selection of papers presented during a special workshop on Complexity Science organized as part of the 9th International Conference on GIScience 2016. Expert researchers in the areas of Agent-Based Modeling, Complexity Theory, Network Theory, Big Data, and emerging methods of Analysis and Visualization for new types of data explore novel complexity science approaches to dynamic geographic phenomena and their applications, addressing challenges and enriching research methodologies in geography in a Big Data Era.
This book explains the development of theoretical computer science in its early stages, specifically from 1965 to 1990. The author is among the pioneers of theoretical computer science, and he guides the reader through the early stages of development of this new discipline. He explains the origins of the field, arising from disciplines such as logic, mathematics, and electronics, and he describes the evolution of the key principles of computing in strands such as computability, algorithms, and programming. But mainly it's a story about people: pioneers with diverse backgrounds and characters who came together to overcome philosophical and institutional challenges and build a community. They collaborated on research efforts, they established schools and conferences, they developed the first related university courses, they taught generations of future researchers and practitioners, and they set up the key publications to communicate and archive their knowledge. The book is a fascinating insight into the field as it existed and evolved, and it will be valuable reading for anyone interested in the history of computing.
Ada's Legacy illustrates the depth and diversity of writers, thinkers, and makers who have been inspired by Ada Lovelace, the English mathematician and writer. The volume, which commemorates the bicentennial of Ada's birth in December 1815, celebrates Lovelace's many achievements as well as the impact of her life and work, which have reverberated widely since the late nineteenth century. In the 21st century we have seen a resurgence in Lovelace scholarship, thanks to the growth of interdisciplinary thinking and the expanding influence of women in science, technology, engineering, and mathematics. Ada's Legacy is a unique contribution to this scholarship, thanks to its combination of papers on Ada's collaboration with Charles Babbage, Ada's position in the Victorian and Steampunk literary genres, Ada's representation in and inspiration of contemporary art and comics, and Ada's continued relevance in discussions around gender and technology in the digital age. With the 200th anniversary of Ada Lovelace's birth on December 10, 2015, we believe that the timing is perfect to publish this collection of papers. Because of its broad focus on subjects that reach far beyond the life and work of Ada herself, Ada's Legacy will appeal to readers who are curious about Ada's enduring importance in computing and the wider world.
There are many proposed aims for scientific inquiry - to explain or predict events, to confirm or falsify hypotheses, or to find hypotheses that cohere with our other beliefs in some logical or probabilistic sense. This book is devoted to a different proposal - that the logical structure of the scientist's method should guarantee eventual arrival at the truth, given the scientist's background assumptions. Interest in this methodological property, called "logical reliability", stems from formal learning theory, which draws its insights not from the theory of probability, but from the theory of computability. Kelly first offers an accessible explanation of formal learning theory, then goes on to develop and explore a systematic framework in which various standard learning-theoretic results can be seen as special cases of simpler and more general considerations. Finally, Kelly clarifies the relationship between the resulting framework and other standard issues in the philosophy of science, such as probability, causation, and relativism. Extensively illustrated with figures by the author, The Logic of Reliable Inquiry assumes only introductory knowledge of basic logic and computability theory. It is a major contribution to the literature and will be essential reading for scientists, statisticians, psychologists, linguists, logicians, and philosophers.
A self-study tutorial which presents the fundamental principles and rigorous numerical validations of a major contemporary branch in frequency-domain computational electromagnetics.
Volume 6, Reviews in Computational Chemistry, Kenny B. Lipkowitz and Donald B. Boyd. This series brings together respected experts in the field of computer-aided molecular research. Computational chemistry is increasingly used in conjunction with organic, inorganic, medicinal, biological, physical, and analytical chemistry, biotechnology, materials science, and chemical physics. This volume examines quantum chemistry of solvated molecules, molecular mechanics of inorganics and organometallics, modeling of polymers, the technology of massively parallel computing, and the productivity of modeling software. A guide to force field parameters and a new software compendium round out this volume. From reviews of the series: "The book transfers a working knowledge of existing computational methods and programs to an interested reader and potential user." (Structural Chemistry) "It can be recommended for everyone who wants to learn about the present state of development in computational chemistry." (Angewandte Chemie, International Edition in English)
This book covers the basic fundamentals of logic design and advanced RTL design concepts using VHDL. The book is organized to describe both simple and complex RTL design scenarios using VHDL. It gives practical information on ASIC prototyping using FPGAs, the design challenges involved, and how to overcome common issues and concerns. It describes how to write efficient RTL code using VHDL and how to improve design performance. Design guidelines for VHDL are also explained with practical examples in this book. The book also covers the ALTERA and XILINX FPGA architectures and the design flow for PLDs. The contents of this book will be useful to students, researchers, and professionals working in hardware design and optimization. The book can also be used as a text for graduate and professional development courses.
This book features a selection of articles from the second edition of the conference Europe Middle East & North Africa Information Systems and Technologies to Support Learning 2018 (EMENA-ISTL'18), held in Fez, Morocco between 25th and 27th October 2018. EMENA-ISTL'18 was a global forum for researchers and practitioners to present and discuss recent findings and innovations, current trends, professional experiences and challenges in information systems & technologies to support learning. The main topics covered are: a) information systems and technologies to support education; b) education in science, technology, engineering, and mathematics; c) emerging technologies in education and learning innovation in the digital age; d) software systems, architectures, applications, and tools; e) multimedia systems and applications; f) computer communications and networks; g) IoT, smart cities and people, wireless, sensor, and ad-hoc networks; h) organizational models and information systems and technologies; i) human-computer interaction; j) computers and security, ethics, and data forensics; k) health informatics and medical informatics security; l) information and knowledge management; m) big data analytics and applications, intelligent data systems, and machine learning; n) artificial intelligence and high-performance computing; o) mobile, embedded, and ubiquitous systems; p) language and image processing, computer graphics, and vision; and q) the interdisciplinary field of fuzzy logic and data mining.
In today's business world, your success relies directly upon your ability to make your mark online. An effective website is one that can sell your products or services 24 hours a day, 7 days a week. Many businesses turn to online marketing experts to help them navigate the choppy waters of online marketing. Web service providers can help make your website the "go-to" resource for your customers - but how do you know who to hire? Online marketing providers come in many different price categories and levels of competency. Without doing your due diligence, you'll end up placing the viability of your company's website in the wrong hands. In this book, SEO services expert Jeev Trika will walk you through multiple categories of search engine marketing that your business will need in order to have an effective presence online. Each chapter looks at an industry in depth and shows you what to look for in an excellent service provider or software package. The categories covered include: search engine optimization, pay per click management services, link building, content services, social media, landing page optimization, video SEO, affiliate marketing, local SEO, mobile optimization, virtual spokesperson, site audit services, hosting, training programs, PSD to HTML conversion services, press release distribution services, SEO shopping cart software, PPC bid management software, email marketing services, web analytics software, and marketing automation software. In each chapter, you'll learn the basics of each service or software and see real world examples of how actual customers have been helped by professionals in the field. Armed with this information, you'll be able to confidently hire and work with a web services professional or company to get your website where it needs to be.
This textbook for a one-semester course in Digital Systems Design describes the basic methods used to develop "traditional" Digital Systems, based on the use of logic gates and flip flops, as well as more advanced techniques that enable the design of very large circuits, based on Hardware Description Languages and Synthesis tools. It was originally designed to accompany a MOOC (Massive Open Online Course) created at the Autonomous University of Barcelona (UAB), currently available on the Coursera platform. Readers will learn what a digital system is and how it can be developed, preparing them for steps toward other technical disciplines, such as Computer Architecture, Robotics, Bionics, Avionics and others. In particular, students will learn to design digital systems of medium complexity, describe digital systems using high level hardware description languages, and understand the operation of computers at their most basic level. All concepts introduced are reinforced by plentiful illustrations, examples, exercises, and applications. For example, as an applied example of the design techniques presented, the authors demonstrate the synthesis of a simple processor, leaving the student in a position to enter the world of Computer Architecture and Embedded Systems.
This book presents a design methodology that is practically applicable to the architectural design of a broad range of systems. It is based on fundamental design concepts to conceive and specify the required functional properties of a system, while abstracting from the specific implementation functions and technologies that can be chosen to build the system. Abstraction and precision are indispensable when it comes to understanding complex systems and precisely creating and representing them at a high functional level. Once understood, these concepts appear natural, self-evident and extremely powerful, since they can directly, precisely and concisely reflect what is considered essential for the functional behavior of a system. The first two chapters present the global views on how to design systems and how to interpret terms and meta-concepts. This informal introduction provides the general context for the remainder of the book. On a more formal level, Chapters 3 through 6 present the main basic design concepts, illustrating them with examples. Language notations are introduced along with the basic design concepts. Lastly, Chapters 7 to 12 discuss the more intricate basic design concepts of interactive systems by focusing on their common functional goal. These chapters are recommended to readers who have a particular interest in the design of protocols and interfaces for various systems. The didactic approach makes it suitable for graduate students who want to develop insights into and skills in developing complex systems, as well as practitioners in industry and large organizations who are responsible for the design and development of large and complex systems. It includes numerous tangible examples from various fields, and several appealing exercises with their solutions.
This book precisely formulates and simplifies the presentation of Instruction Level Parallelism (ILP) compilation techniques. It uniquely offers consistent and uniform descriptions of the code transformations involved. Due to the ubiquitous nature of ILP in virtually every processor built today, from general purpose CPUs to application-specific and embedded processors, this book is useful to the student, the practitioner and also the researcher of advanced compilation techniques. With an emphasis on fine-grain instruction level parallelism, this book will also prove interesting to researchers and students of parallelism at large, in as much as the techniques described yield insights that go beyond superscalar and VLIW (Very Long Instruction Word) machines compilation and are more widely applicable to optimizing compilers in general. ILP techniques have found wide and crucial application in Design Automation, where they have been used extensively in the optimization of performance as well as area and power minimization of computer designs.
This book covers all of the concepts required to tackle second-order cone programs (SOCPs), in order to provide the reader a complete picture of SOC functions and their applications. SOCPs have attracted considerable attention, due to their wide range of applications in engineering, data science, and finance. To deal with this special group of optimization problems involving second-order cones (SOCs), we most often need to employ the following crucial concepts: (i) spectral decomposition associated with SOCs, (ii) analysis of SOC functions, and (iii) SOC-convexity and -monotonicity. Moreover, we can roughly classify the related algorithms into two categories. One category includes traditional algorithms that do not use complementarity functions. Here, SOC-convexity and SOC-monotonicity play a key role. In contrast, complementarity functions are employed for the other category. In this context, complementarity functions are closely related to SOC functions; consequently, the analysis of SOC functions can help with these algorithms.
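As a pointer to what concept (i) above refers to, here is the standard spectral decomposition associated with the second-order cone, as it appears in the SOCP literature generally (this sketch is not quoted from the book itself):

```latex
% For $x=(x_1,x_2)\in\mathbb{R}\times\mathbb{R}^{n-1}$, the spectral decomposition
% associated with the second-order cone $\mathcal{K}^n$ is
%   x = \lambda_1(x)\,u_x^{(1)} + \lambda_2(x)\,u_x^{(2)},
% with spectral values and spectral vectors
%   \lambda_i(x) = x_1 + (-1)^i \|x_2\|, \qquad
%   u_x^{(i)} = \tfrac{1}{2}\bigl(1,\ (-1)^i \tfrac{x_2}{\|x_2\|}\bigr), \quad i=1,2,
% where $x_2/\|x_2\|$ is replaced by any fixed unit vector $w$ when $x_2 = 0$.
% An SOC function is then defined componentwise on the spectral values:
%   f^{\mathrm{soc}}(x) = f(\lambda_1(x))\,u_x^{(1)} + f(\lambda_2(x))\,u_x^{(2)}.
```

In particular, $x \in \mathcal{K}^n$ exactly when both spectral values are nonnegative, which is why this decomposition underpins the analysis of SOC functions and of SOC-convexity and -monotonicity mentioned above.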
Intelligent prediction and decision support systems are based on signal processing, computer vision (CV), machine learning (ML), software engineering (SE), knowledge-based systems (KBS), data mining, and artificial intelligence (AI), and include several systems developed from the study of expert systems (ES), genetic algorithms (GA), artificial neural networks (ANN), and fuzzy-logic systems. The use of automatic decision support systems in the design and manufacturing industry, healthcare, and commercial software development has the following benefits: cost savings in companies through the employment of expert system technology; fast decision making, on-time completion of projects, and development of new products; improvement in decision-making capability and quality; use of knowledge databases and preservation of the expertise of individuals; and easing of complex decision problems, for example diagnosis in healthcare. To address the issues and challenges related to the development, implementation, and application of automatic and intelligent prediction and decision support systems in domains such as manufacturing, healthcare, and software product design, development, and optimization, this book aims to collect and publish a wide range of quality articles, including original research contributions, methodological reviews, survey papers, case studies, and reports covering intelligent systems, expert prediction systems, evaluation models, decision support systems, and computer-aided diagnosis (CAD).
Computational Frameworks: Systems, Models and Applications provides an overview of advanced perspectives that bridges the gap between frontline research and practical efforts. It is unique in showing the interdisciplinary nature of this area and the way in which it interacts with emerging technologies and techniques. As computational systems are a dominant part of daily life and a required support for most of the engineering sciences, this book explores their usage (e.g. big data, high performance clusters, databases and information systems, integrated and embedded hardware/software components, smart devices, mobile and pervasive networks, cyber physical systems, etc.).
Now, for the first time, comes the publication of the landmark work in backpropagation. Scientists, engineers, statisticians, operations researchers, and other investigators involved in neural networks have long sought direct access to Paul Werbos's groundbreaking, much-cited 1974 Harvard doctoral thesis, The Roots of Backpropagation, which laid the foundation of backpropagation. Now, with the publication of its full text, these practitioners can go straight to the original material and gain a deeper, practical understanding of this unique mathematical approach to social studies and related fields. In addition, Werbos has provided three more recent research papers, which were inspired by his original work, and a new guide to the field. Originally written for readers who lacked any knowledge of neural nets, The Roots of Backpropagation firmly established both its historical and continuing significance.