Until now, there has been a lack of a complete knowledge base to fully comprehend low power (LP) design and power aware (PA) verification techniques and methodologies and deploy them all together in a real design verification and implementation project. This book is a first approach to establishing a comprehensive PA knowledge base. LP design, PA verification, and Unified Power Format (UPF) or IEEE 1801 power format standards are no longer special features. These technologies and methodologies are now part of industry-standard design, verification, and implementation flows (DVIF). Almost every chip design today incorporates some kind of low power technique, either through power management on chip (by dividing the design into different voltage areas and controlling the voltages), through PA dynamic and PA static verification, or a combination of these. The entire LP design and PA verification process involves thousands of techniques, tools, and methodologies, employed from the register transfer level (RTL) of design abstraction down to the synthesis or place-and-route levels of physical design. These techniques, tools, and methodologies are evolving every day through the progression of design-verification complexity and more intelligent ways of handling that complexity by engineers, researchers, and corporate engineering policy makers.
This book illustrates the program of Logical-Informational Dynamics. Rational agents exploit the information available in the world in delicate ways, adopt a wide range of epistemic attitudes, and in that process, constantly change the world itself. Logical-Informational Dynamics is about logical systems putting such activities at center stage, focusing on the events by which we acquire information and change attitudes. Its contributions show many current logics of information and change at work, often in multi-agent settings where social behavior is essential, and often stressing Johan van Benthem's pioneering work in establishing this program. However, this is not a Festschrift, but a rich tapestry for a field with a wealth of strands of its own. The reader will see the state of the art in such topics as information update, belief change, preference, learning over time, and strategic interaction in games. Moreover, no tight boundary has been enforced, and some chapters add more general mathematical or philosophical foundations or links to current trends in computer science.
Thus, very much in line with van Benthem's work over many decades, the volume shows how all these disciplines form a natural unity in the perspective of dynamic logicians (broadly conceived) exploring their new themes today. And at the same time, in doing so, it offers a broader conception of logic with a certain grandeur, moving its horizons beyond the traditional study of consequence relations.
This book collates the key security and privacy concerns faced by individuals and organizations who use various social networking sites. This includes activities such as connecting with friends, colleagues, and family; sharing and posting information; managing audio, video, and photos; and all other aspects of using social media sites both professionally and personally. In the setting of the Internet of Things (IoT) that can connect millions of devices at any one time, the security of such actions is paramount. Securing Social Networks in Cyberspace discusses user privacy and trust, location privacy, protecting children, managing multimedia content, cyberbullying, and much more. Current state-of-the-art defense mechanisms that can bring long-term solutions to tackling these threats are considered in the book. This book can be used as a reference for an easy understanding of complex cybersecurity issues in social networking platforms and services. It is beneficial for academicians and graduate-level researchers. General readers may find it beneficial in protecting their social-media-related profiles.
Contemporary High Performance Computing: From Petascale toward Exascale focuses on the ecosystems surrounding the world's leading centers for high performance computing (HPC). It covers many of the important factors involved in each ecosystem: computer architectures, software, applications, facilities, and sponsors. The first part of the book examines significant trends in HPC systems, including computer architectures, applications, performance, and software. It discusses the growth from terascale to petascale computing and the influence of the TOP500 and Green500 lists. The second part of the book provides a comprehensive overview of 18 HPC ecosystems from around the world. Each chapter in this section describes programmatic motivation for HPC and their important applications; a flagship HPC system overview covering computer architecture, system software, programming systems, storage, visualization, and analytics support; and an overview of their data center/facility. The last part of the book addresses the role of clouds and grids in HPC, including chapters on the Magellan, FutureGrid, and LLGrid projects. With contributions from top researchers directly involved in designing, deploying, and using these supercomputing systems, this book captures a global picture of the state of the art in HPC.
If you need to learn CUDA but don't have experience with parallel computing, "CUDA Programming: A Developer's Introduction" offers a detailed guide to CUDA with a grounding in parallel fundamentals. It starts by introducing CUDA, bringing you up to speed on GPU parallelism and hardware, and then delves into CUDA installation. Chapters on core concepts, including threads, blocks, grids, and memory, focus on both parallel and CUDA-specific issues. Later, the book demonstrates CUDA in practice for optimizing applications, adjusting to new hardware, and solving common problems.
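The thread, block, and grid hierarchy mentioned above can be pictured with a minimal vector-addition kernel. The book itself teaches CUDA C/C++; the sketch below is only a rough Python analogue using Numba's CUDA support, and it assumes the numba package and a CUDA-capable GPU are available (the array size and launch configuration are arbitrary choices, not examples from the book).

```python
import numpy as np
from numba import cuda

@cuda.jit
def vec_add(a, b, out):
    i = cuda.grid(1)            # global thread index across the whole grid
    if i < out.shape[0]:        # guard: the grid may be larger than the data
        out[i] = a[i] + b[i]

n = 1 << 20
a = np.arange(n, dtype=np.float32)
b = np.ones(n, dtype=np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks_per_grid = (n + threads_per_block - 1) // threads_per_block
# Launch configuration: [grid size, block size]; Numba copies the host arrays
# to the device and back around the kernel call.
vec_add[blocks_per_grid, threads_per_block](a, b, out)
print(out[:4])
```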
- Explores how mixed, virtual and augmented reality technologies can enable designers to create immersive experiences and expand the aesthetic potential of the medium
- Curated selection of projects and essays by leading international architects and designers, including those from Zaha Hadid Architects and MVRDV
- Illustrated with over 150 images
IOT: Security and Privacy Paradigm covers the evolution of security and privacy issues in the Internet of Things (IoT). It focuses on bringing all security- and privacy-related technologies into one source, so that students, researchers, and practitioners can refer to this book for an easy understanding of IoT security and privacy issues. This edited book uses Security Engineering and Privacy-by-Design principles to design a secure IoT ecosystem and to implement cyber-security solutions. It takes readers on a journey that begins with understanding the security issues in IoT-enabled technologies and how these technologies can be applied in various contexts. It walks readers through engaging with security challenges and building a safe infrastructure for IoT devices. The book helps readers gain an understanding of security architecture for IoT and describes the state of the art in IoT countermeasures. It also differentiates security threats in IoT-enabled infrastructure from those in traditional ad hoc or infrastructural networks, and provides a comprehensive discussion of the security challenges and solutions in RFID, WSNs, and IoT. The book aims to convey the concepts of the related technologies and the novel findings of researchers through its chapter organization. The primary audience includes specialists, researchers, graduate students, designers, experts and engineers who are focused on research and security-related issues.
Souvik Pal, PhD, has worked as Assistant Professor at Nalanda Institute of Technology, Bhubaneswar, and JIS College of Engineering, Kolkata (NAAC "A" Accredited College). He is the organizing chair and a plenary speaker of the RICE Conference in Vietnam, and organizing co-convener of ICICIT, Tunisia. He has served many conferences as chair and keynote speaker, and has chaired international conference sessions and presented session talks internationally. His research areas include Cloud Computing, Big Data, Wireless Sensor Networks (WSN), the Internet of Things, and Data Analytics. Vicente Garcia-Diaz, PhD, is an Associate Professor in the Department of Computer Science at the University of Oviedo (Languages and Computer Systems area). He is also the editor of several special issues in prestigious journals such as Scientific Programming and the International Journal of Interactive Multimedia and Artificial Intelligence. His research interests include eLearning, machine learning and the use of domain-specific languages in different areas. Dac-Nhuong Le, PhD, is Deputy Head of the Faculty of Information Technology and Vice-Director of the Information Technology Apply and Foreign Language Training Center, Haiphong University, Vietnam. His areas of research include evaluation computing and approximate algorithms, network communication, security and vulnerability, network performance analysis and simulation, cloud computing, IoT, and biomedical image processing. Presently, he is serving on the editorial boards of several international journals and has authored nine computer science books published by Springer, Wiley, CRC Press, Lambert Publication, and Scholar Press.
Programming is now parallel programming. Much as structured programming revolutionized traditional serial programming decades ago, a new kind of structured programming, based on patterns, is relevant to parallel programming today. Parallel computing experts and industry insiders Michael McCool, Arch Robison, and James Reinders describe how to design and implement maintainable and efficient parallel algorithms using a pattern-based approach. They present both theory and practice, and give detailed concrete examples using multiple programming models. Examples are primarily given using two of the most popular and cutting-edge programming models for parallel programming: Threading Building Blocks and Cilk Plus. These architecture-independent models enable easy integration into existing applications, preserve investments in existing code, and speed the development of parallel applications. Examples from realistic contexts illustrate patterns and themes in parallel algorithm design that are widely applicable regardless of implementation technology.
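As a rough illustration of the pattern-based idea described above (the book's own examples use Threading Building Blocks and Cilk Plus in C++), the sketch below applies the classic "map" pattern with a Python process pool; the per-element workload and chunk size are hypothetical choices, not examples from the book.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def work(x):
    # Independent per-element computation: the "map" pattern has no
    # cross-element dependencies, so it parallelizes trivially.
    return math.sqrt(x) * math.sin(x)

if __name__ == "__main__":
    data = range(1_000_000)
    with ProcessPoolExecutor() as pool:
        # chunksize batches elements per task to amortize scheduling overhead,
        # much like choosing a grain size in TBB.
        results = list(pool.map(work, data, chunksize=10_000))
    print(results[:3])
```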
"This book fills a void. Most books address only portions of the embedded problem. As the title indicates, this one is uniquely comprehensive. I consider it a must-read" --Review of the 1st Edition. Jack Ganssle, embedded author, lecturer, consultant. Embedded Systems Architecture is a practical and technical guide to understanding the components that make up an embedded system s architecture. This book is perfect for those starting out as technical professionals such as engineers, programmers and designers of embedded systems; and also for students of computer science, computer engineering and electrical engineering. It gives a much-needed big picture for recently graduated engineers grappling with understanding the design of real-world systems for the first time, and provides professionals with a systems-level picture of the key elements that can go into an embedded design, providing a firm foundation on which to build their skills. * Real-world approach to the fundamentals, as well as the design and architecture process, makes this book a popular reference for the daunted or the inexperienced: if in doubt, the answer is in here * Fully updated with new coverage of FPGAs, testing, middleware and the latest programming techniques in C, plus complete source code and sample code, reference designs and tools online make this the complete package * Visit the companion web site at http: //booksite.elsevier.com/9780123821966/ for source code, design examples, data sheets and more. A true introductory book, provides a comprehensive get up and
running reference for those new to the field, and updating skills:
assumes no prior knowledge beyond undergrad level electrical
engineering.
Ever-changing business needs have prompted large companies to rethink their enterprise IT. Today, businesses must allow interaction with their customers, partners, and employees at more touch points and at a depth never thought of previously. At the same time, rapid advances in information technologies, like business digitization, cloud computing, and Web 2.0, demand fundamental changes in enterprises' management practices. These changes have a drastic effect not only on IT and business, but also on policies, processes, and people. Many companies therefore embark on enterprise-wide transformation initiatives. The role of Enterprise Architecture (EA) is to architect and supervise this transformational journey. Unfortunately, today's EA is often a ponderous and detached exercise, with most EA initiatives failing to create visible impact. Enterprises need an EA that is agile and responsive to business dynamics. "Collaborative Enterprise Architecture" provides the innovative solutions today's enterprises require, informed by real-world experiences and experts' insights. In its first part, the book provides a systematic compendium of current best practices in EA, analyzes current ways of doing EA, and identifies its constraints and shortcomings. In the second part, it leaves the beaten tracks of EA by introducing Lean, Agile, and Enterprise 2.0 concepts into traditional EA methods. This blended approach to EA focuses on practical aspects, with recommendations derived from real-world experiences. A truly thought-provoking and pragmatic guide to managing EA, "Collaborative Enterprise Architecture" effectively merges the long-term-oriented top-down approach with pragmatic bottom-up thinking, and in that way offers real solutions to businesses undergoing enterprise-wide change.
This book provides readers with a detailed reference on two of the most important long-term reliability and aging effects in nanometer integrated systems: electromigration (EM) for interconnect and bias temperature instability (BTI) for CMOS devices. The authors discuss in detail recent developments in the modeling, analysis and optimization of the reliability effects of EM- and BTI-induced failures at the circuit, architecture and system levels of abstraction. Readers will benefit from a focus on topics such as recently developed, physics-based EM modeling, EM modeling for multi-segment wires, new EM-aware power grid analysis, and system-level EM-induced reliability optimization and management techniques. The book:
- Reviews classic electromigration (EM) models as well as existing EM failure models, and discusses the limitations of those models;
- Introduces a dynamic EM model to address transient stress evolution, in which wires are stressed under time-varying current flows, and EM recovery effects, and includes new, parameterized equivalent-DC-current-based EM models to address the recovery and transient effects;
- Presents a cross-layer approach to transistor aging modeling, analysis and mitigation, spanning multiple abstraction levels;
- Equips readers for EM-induced dynamic reliability management and energy or lifetime optimization techniques for many-core dark silicon microprocessors, embedded systems, low-power many-core processors and datacenters.
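For orientation, the classic EM models reviewed in such work are commonly summarized by Black's equation for mean time to failure. The sketch below simply evaluates that equation in Python; the constants A, n and Ea are illustrative, process-dependent assumptions, not values from the book.

```python
import math

# Black's equation for EM mean time to failure (MTTF):
#   MTTF = A * J^(-n) * exp(Ea / (k_B * T))
k_B = 8.617e-5            # Boltzmann constant in eV/K

def black_mttf(J, T, A=1e3, n=2.0, Ea=0.9):
    """J: current density (A/cm^2), T: temperature (K). A, n, Ea are assumed."""
    return A * J ** (-n) * math.exp(Ea / (k_B * T))

# Higher temperature or current density sharply shortens the predicted lifetime.
print(black_mttf(1e6, 350.0), black_mttf(1e6, 400.0))
```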
This two-volume book describes the most common IP routing protocols used today, explaining the underlying concepts of each protocol and how the protocol components and processes fit within the typical router. Unlike other books, this title is not vendor focused. Volume 1 discusses fundamental concepts of IP routing and distance-vector routing protocols (RIPv2 and EIGRP). Volume 2 focuses on link-state routing protocols (OSPF and IS-IS) and the only path-vector routing protocol in use today (BGP). The volumes explain the types of databases each routing protocol uses, how the databases are constructed and managed, and how the various protocol components and processes relate and interact with the databases. They also describe the routing protocols from a systems perspective, recognizing the most important routing and packet forwarding components and functions of a router. An illustrated description of IP routing protocols is given using real-world network examples. The books are presented from a practicing engineer's perspective, linking theory and fundamental concepts to common practices and real-world examples. The discussion is presented in a simple style to make it comprehensible and appealing to undergraduate and graduate level students, research and practicing engineers, scientists, IT personnel, and network engineers.
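As a toy illustration of the distance-vector mechanism underlying RIPv2 in Volume 1, the sketch below lets routers in a small, made-up topology repeatedly merge their neighbors' tables (a Bellman-Ford-style exchange). The topology and link costs are hypothetical and not taken from the book.

```python
INF = float("inf")

# Directly connected link costs: neighbors[node] = {neighbor: cost}
neighbors = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 2, "D": 7},
    "C": {"A": 4, "B": 2, "D": 3},
    "D": {"B": 7, "C": 3},
}
nodes = list(neighbors)

# Each router initially knows only itself and its direct links.
tables = {n: {d: (0 if d == n else neighbors[n].get(d, INF)) for d in nodes}
          for n in nodes}

changed = True
while changed:                      # iterate until no router updates its table
    changed = False
    for n in nodes:
        for nb, link_cost in neighbors[n].items():
            for dest, dist in tables[nb].items():
                # Adopt a neighbor's route if going through it is cheaper.
                if link_cost + dist < tables[n][dest]:
                    tables[n][dest] = link_cost + dist
                    changed = True

print(tables["A"])   # converged costs from A to every destination
```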
The field of SMART technologies is an interdependent discipline. It involves pressing issues ranging from machine learning, cloud computing, optimisation and modelling techniques to the Internet of Things, data analytics, and Smart Grids, all of them emerging fields. It is an applied and multi-disciplinary subject with a focus on Specific, Measurable, Achievable, Realistic & Timely system operations combined with machine intelligence and real-time computing. It is not possible for any one person to comprehensively cover all aspects relevant to SMART computing in a work of limited extent. These conference proceedings therefore address various issues through deliberations by distinguished professors and researchers. The SMARTCOM 2020 proceedings contain tracks dedicated to different areas of smart technologies, such as Smart System and Future Internet, Machine Intelligence and Data Science, Real-Time and VLSI Systems, and Communication and Automation Systems. The proceedings can be used as an advanced reference for research and for graduate-level courses in smart technologies.
This book introduces fundamental concepts and principles of 2D and 3D graphics and is written for undergraduate and postgraduate students of computer science, graphics, multimedia, and data science. It demonstrates the use of MATLAB® programming for solving problems related to graphics and discusses a variety of visualization tools to generate graphs and plots. The book covers important concepts like transformation, projection, surface generation, parametric representation, curve fitting, interpolation, vector representation, and texture mapping, all of which can be used in a wide variety of educational and research fields. Theoretical concepts are illustrated using a large number of practical examples and programming codes, which can be used to visualize and verify the results. Key features:
- Covers fundamental concepts and principles of 2D and 3D graphics
- Demonstrates the use of MATLAB® programming for solving problems on graphics
- Provides MATLAB® codes as answers to specific numerical problems
- Provides codes in a simple copy-and-execute format for the novice learner
- Focuses on learning through visual representation with extensive use of graphs and plots
- Helps the reader gain in-depth knowledge about the subject matter through practical examples
- Contains review questions and practice problems with answers for self-evaluation
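The transformation material above maps directly onto matrix arithmetic. The book works in MATLAB; purely as a language-neutral illustration, the sketch below rotates a unit square with a homogeneous 2D rotation matrix in Python/NumPy (the square and the angle are arbitrary choices, not examples from the book).

```python
import numpy as np

theta = np.radians(30)
# Homogeneous 2D rotation about the origin.
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Unit square corners as homogeneous column vectors [x, y, 1].
square = np.array([[0, 0, 1],
                   [1, 0, 1],
                   [1, 1, 1],
                   [0, 1, 1]], dtype=float).T

rotated = R @ square
print(rotated[:2].T)   # rotated (x, y) coordinates of each corner
```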
An uncoded multimedia transmission (UMT) system is one that skips quantization and entropy coding in compression and all subsequent binary operations, including channel coding and bit-to-symbol mapping of modulation. By directly transmitting non-binary symbols with amplitude modulation, the uncoded system avoids the cliff effect observed in coded transmission systems. This advantage makes uncoded transmission better suited both to unicast in varying channel conditions and to multicast to heterogeneous users. In the first part of Uncoded Multimedia Transmission, we consider how to improve the efficiency of uncoded transmission and make it on par with coded transmission. We then address issues and challenges regarding how to better utilize the temporal and spatial correlation of images and video in uncoded transmission, to achieve optimal transmission performance. Next, we investigate the resource allocation problem for uncoded transmission, including subchannel, bandwidth and power allocation. By properly allocating these resources, uncoded transmission can achieve higher efficiency and more robust performance. Subsequently, we consider image and video delivery in MIMO broadcasting networks with diverse channel quality and varying numbers of antennas across receivers. Finally, we investigate cases where uncoded transmission can be used in conjunction with digital transmission for a balance of efficiency and adaptation capability. This book is the first monograph in the general area of uncoded multimedia transmission written in a self-contained format. It addresses both the fundamentals and the applications of uncoded transmission. It gives a systematic introduction to the fundamental theory and concepts in this field, and at the same time presents specific applications that reveal the great potential and impact of the technologies generated from research in this field. By concentrating several important studies and developments in uncoded transmission into a single source, this book can reduce the time and cost required to learn and improve skills and knowledge in the field. The authors have been actively working in this field for years, and this book is the distillation of their years of research. The book may be used as a collection of research notes for researchers in this field, a reference book for practitioners or engineers, and a textbook for a graduate advanced seminar in this or any related field. The references collected in this book may be used as further reading lists or references for the readers.
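The core idea of skipping quantization, entropy coding and channel coding can be sketched in a few lines: samples are scaled directly to channel amplitudes, sent over an additive-noise channel, and rescaled at the receiver, so distortion degrades gracefully with SNR rather than hitting a coding cliff. The sketch below is a simplified illustration under assumed power and SNR values, not a model from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=1000).astype(float)   # stand-in "image" samples

power = 1.0                                   # transmit power budget (assumed)
g = np.sqrt(power / np.mean(pixels ** 2))     # scale samples to meet the budget
tx = g * pixels                               # amplitude-modulated channel symbols

snr_db = 20.0
noise_var = power / (10 ** (snr_db / 10))
rx = tx + rng.normal(0.0, np.sqrt(noise_var), tx.shape)   # AWGN channel

recovered = np.clip(rx / g, 0, 255)
print("MSE:", np.mean((recovered - pixels) ** 2))
# Distortion shrinks smoothly as snr_db grows -- no coding "cliff".
```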
The QL&SC 2012 is a major symposium for scientists and practitioners from all around the world to present their latest research results, ideas, developments and applications in areas such as quantitative logic, many-valued logic, fuzzy logic, quantification of software, artificial intelligence, fuzzy sets and systems, and soft computing. This invaluable book provides a broad introduction to fuzzy reasoning and soft computing. One should not go too far in approximation and optimization; a certain degree must be kept in mind. This is the essential idea of quantitative logic and soft computing. The explanations in the book are complete enough to provide the background material needed to go further into the subject and explore the research literature. It is suitable reading for graduate students, and it provides a platform for mutual exchange among top experts and scholars around the world in this field.
This volume shows how ICT (information and communications technology) can play the role of a driver of business process reengineering (BPR). ICT can aid in enabling improvement in BPR activity cycles as it provides many components that enhance performance that can lead to competitive advantages. IT can interface with BPR to improve business processes in terms of communication, inventory management, data management, management information systems, customer relationship management, computer-aided design, computer-aided manufacturing (CAM), and computer-aided engineering. This volume explores these issues in depth.
Ecosystems and Technology: Idea Generation and Content Model Processing presents important new innovations in the area of management and computing. Innovation is the generation and application of new ideas and skills to produce new products, processes, and services that improve economic and social prosperity. This includes management and design policy decisions and encompasses innovation research, analysis, and best practice in enterprises, public and private sector service organizations, government, regional societies and economies. The book, the first volume in the Innovation Management and Computing book series, looks at technology that improves efficiency and idea generation, including systems for business, medical/health, education, and more. The book provides detailed examples covering current issues, including:
- Venture planning for innovations
- New technologies supporting innovation systems
- Competitive business modeling
- Context-driven innovation modeling
- Faster generation of ideas
- Measurement of relevant data
- Virtual interfaces
- Business intelligence and content processing
- Predictive modeling
- Haptic expression and emotion recognition innovations, with applications to neurocognitive medical science
This book provides a wealth of information that will be useful for IT and business professionals, educators, and students in many fields.
This book highlights how to integrate and realize Service Oriented Architecture (SOA) with web services, one of the emerging technologies in IT. It also focuses on the latest technologies, such as metadata management, security issues, quality of service and its commercialization. A chapter is also devoted to the study of emerging standards and development tools for Enterprise Application Integration. Most enterprises have made extensive investments in system resources over the course of many years. Such enterprises have an enormous amount of data stored in legacy enterprise information systems (EIS), so it is not practical to discard existing systems; it is more cost-effective to evolve and enhance EIS. This can be done with the help of SOA realized with web services, an emerging field in information technology. SOA is usually realized through web services, but web services specifications may add to the confusion of how best to utilize SOA to solve business problems. A smooth transition to SOA calls for an architectural style that helps in realizing web services through SOA. The book concentrates on this architecture and the realization and integration of SOA with web services. It consists of 12 chapters and is recommended for all postgraduate computer science students.
This book brings together a selection of the best papers from the thirteenth edition of the Forum on specification and Design Languages Conference (FDL), which was held in Southampton, UK in September 2010. FDL is a well established international forum devoted to dissemination of research results, practical experiences and new ideas in the application of specification, design and verification languages to the design, modelling and verification of integrated circuits, complex hardware/software embedded systems, and mixed-technology systems.
Information services are economic and organizational activities for informing people. Because informing is changing rapidly under the influence of internet-technologies, this book presents in Chapter 1 fundamental notions of information and knowledge, based on philosopher C.W. Churchman's inquiring systems. This results in the identification of three product-oriented design theory aspects: content, use value and revenue. Chapter 2 describes how one can cope with these aspects by presenting process-oriented design theory. Both design theory insights are applied in chapters on information services challenges, their business concepts and processes, their architectures and exploitation. The final chapter discusses three case studies that integrate the insights from previous chapters, and it discusses some ideas for future research. This book gives students a coherent start to the topic of information services from a design science perspective, with a balance between technical and managerial aspects. Therefore, this book is useful for modern curricula of management, communication science and information systems. Because of its design science approach, it also explains design science principles. The book also serves professionals and academics in search of a foundational understanding of informing as a science and management practice.
In the past two decades, breakthroughs in computer technology have made a tremendous impact on optimization. In particular, the availability of parallel computers has created substantial interest in exploring the use of parallel processing for solving discrete and global optimization problems. The chapters in this volume cover a broad spectrum of recent research in parallel processing of discrete and related problems. The topics discussed include distributed branch-and-bound algorithms, parallel genetic algorithms for large-scale discrete problems, simulated annealing, parallel branch-and-bound search under limited-memory constraints, parallelization of greedy randomized adaptive search procedures, parallel optical models of computing, randomized parallel algorithms, general techniques for the design of parallel discrete algorithms, and parallel algorithms for the solution of quadratic assignment and satisfiability problems. The book will be a valuable source of information to faculty, students and researchers in combinatorial optimization and related areas.
This book introduces several novel approaches to pave the way for the next generation of integrated circuits, which can be successfully and reliably integrated even in safety-critical applications. The authors describe new measures to address the rising challenges in the field of design for testability, debug, and reliability, as strictly required for state-of-the-art circuit designs. In particular, this book combines formal techniques, such as Boolean satisfiability (SAT) solving and bounded model checking (BMC), to address the arising challenges concerning the increase in test data volume, test application time, and the required reliability. All methods are discussed in detail and evaluated extensively while considering industry-relevant benchmark candidates. All measures have been integrated into a common framework, which implements standardized software/hardware interfaces.
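To ground the formal-techniques angle, the SAT problem asks whether a Boolean formula in conjunctive normal form has a satisfying assignment. The sketch below checks a tiny, made-up formula by brute-force enumeration in Python; real SAT-based test and verification flows rely on dedicated solvers rather than enumeration.

```python
from itertools import product

# Made-up CNF: (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3).
# Positive integers denote variables, negative integers their negations.
cnf = [[1, -2], [2, 3], [-1, -3]]
num_vars = 3

def satisfiable(clauses, num_vars):
    for bits in product([False, True], repeat=num_vars):
        assignment = {v + 1: bits[v] for v in range(num_vars)}
        # Every clause must contain at least one satisfied literal.
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment          # a satisfying assignment exists
    return None                        # formula is unsatisfiable

print(satisfiable(cnf, num_vars))
```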
A unique feature of this open access textbook is that it provides a comprehensive introduction to the fundamental knowledge in embedded systems, with applications in cyber-physical systems and the Internet of Things. It starts with an introduction to the field and a survey of specification models and languages for embedded and cyber-physical systems. It provides a brief overview of hardware devices used for such systems and presents the essentials of system software for embedded systems, including real-time operating systems. The author also discusses evaluation and validation techniques for embedded systems and provides an overview of techniques for mapping applications to execution platforms, including multi-core platforms. Embedded systems have to operate under tight constraints and, hence, the book also contains a selected set of optimization techniques, including software optimization techniques. The book closes with a brief survey on testing. This fourth edition has been updated and revised to reflect new trends and technologies, such as the importance of cyber-physical systems (CPS) and the Internet of Things (IoT), the evolution of single-core processors to multi-core processors, and the increased importance of energy efficiency and thermal issues.