This book records the author's years of experience in the software industry. In his own practice, the author has found that the distributed work pattern has become increasingly common, whether between vendors and customers or between different teams inside a company. This means that all practitioners in the software industry need to adapt to this new way of communicating and collaborating and build the skills needed to meet the greater challenge of integrating the distributed work pattern with agile software delivery. Centering on the difficulties in communication and collaboration between distributed teams, this book digs into the reasons why so many remote delivery projects fall short of expectations and offers solutions for readers' reference. It also cites successful cases of promoting agile development in distributed teams, which has been a vexing problem for many software development companies. In addition, readers can find suggestions and measures for building self-managing teams. Remote Delivery: A Guide to Software Delivery through Collaboration between Distributed Teams is a very practical guide for software delivery teams whose members are distributed across different places, and for companies engaged in software customization. Developers, QAs, product managers, and project leaders can also draw inspiration from this book.
The International Workshop on "The Use of Supercomputers in Theoretical Science" took place on January 24 and 25, 1991, at the University of Antwerp (UIA), Antwerpen, Belgium. It was the sixth in a series of workshops, the first of which took place in 1984. The principal aim of these workshops is to present the state of the art in scientific large-scale and high-speed computation. Computational science has developed into a third methodology, now equally as important as its theoretical and experimental companions. Gradually academic researchers acquired access to a variety of supercomputers, and as a consequence computational science has become a major tool for their work. It is a pleasure to thank the Belgian National Science Foundation (NFWO-FNRS) and the Ministry of Scientific Affairs for sponsoring the workshop. It was organized both in the framework of the Third Cycle "Vectorization, Parallel Processing and Supercomputers" and the "Governmental Program in Information Technology." We also very much would like to thank the University of Antwerp (Universitaire Instelling Antwerpen - UIA) for financial and material support. Special thanks are due to Mrs. H. Evans for the typing and editing of the manuscripts and for the preparation of the author and subject indexes. J.T. Devreese, P.E. Van Camp, University of Antwerp, July 1991.
The book provides the complete strategic understanding required to create and use the RMF process recommendations for risk management. This applies both to applications of the RMF in corporate training situations and to any individual who wants to obtain specialized knowledge in organizational risk management. It is an all-purpose roadmap of sorts, aimed at the practical understanding and implementation of the risk management process as a standard entity. It will enable an "application" of the risk management process as well as the fundamental elements of control formulation within an applied context.
The interplay between words, computability, algebra and arithmetic has now proved its relevance and fruitfulness. Indeed, the cross-fertilization between formal logic and finite automata (such as that initiated by J.R. Büchi) or between combinatorics on words and number theory has paved the way to recent dramatic developments, for example, the transcendence results for the real numbers having a "simple" binary expansion, by B. Adamczewski and Y. Bugeaud. This book is at the heart of this interplay through a unified exposition. Objects are considered with a perspective that comes both from theoretical computer science and mathematics. Theoretical computer science offers here topics such as decision problems and recognizability issues, whereas mathematics offers concepts such as discrete dynamical systems. The main goal is to give quick access, for students and researchers in mathematics or computer science, to current research topics at the intersection of automata and formal language theory, number theory and combinatorics on words. The second of two volumes on this subject, this book covers regular languages, numeration systems, and formal methods applied to decidability issues about infinite words and sets of numbers.
Automatic transformation of a sequential program into a parallel form is a subject that presents a great intellectual challenge and promises great practical rewards. There is a tremendous investment in existing sequential programs, and scientists and engineers continue to write their application programs in sequential languages (primarily in Fortran), but the demand for increasing speed is constant. The job of a restructuring compiler is to discover the dependence structure of a given program and transform the program in a way that is consistent with both that dependence structure and the characteristics of the given machine. Much attention in this field of research has been focused on the Fortran do loop. This is where one expects to find major chunks of computation that need to be performed repeatedly for different values of the index variable. Many loop transformations have been designed over the years, and several of them can be found in any parallelizing compiler currently in use in industry or at a university research facility. Loop Transformations for Restructuring Compilers: The Foundations provides a rigorous theory of loop transformations. The transformations are developed in a consistent mathematical framework using objects like directed graphs, matrices and linear equations. The algorithms that implement the transformations can then be precisely described in terms of certain abstract mathematical algorithms. The book provides the general mathematical background needed for loop transformations (including those basic mathematical algorithms), discusses data dependence, and introduces the major transformations. The next volume will build a detailed theory of loop transformations based on the material developed here. Loop Transformations for Restructuring Compilers: The Foundations presents a theory of loop transformations that is rigorous and yet reader-friendly.
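To make the idea concrete, here is a small, hedged illustration (not taken from the book, and written in Python rather than Fortran) of the kind of reasoning a restructuring compiler applies to a loop nest: the dependence carried by the inner j-loop has distance vector (0, 1), so interchanging the loops is legal, and after the interchange the new inner i-loop carries no dependence and becomes a candidate for parallel execution.

# Hedged sketch only: a row-wise prefix-sum loop nest before and after interchange.
N = 4
a = [[float(i + j) for j in range(N)] for i in range(N)]

# original loop nest: a[i][j] depends on a[i][j-1], distance vector (0, 1),
# so the dependence is carried by the inner j-loop
for i in range(N):
    for j in range(1, N):
        a[i][j] = a[i][j-1] + a[i][j]

# after interchange the dependence is still respected (it is now carried by the
# outer j-loop), but for a fixed j the i-iterations are independent and could
# run in parallel
b = [[float(i + j) for j in range(N)] for i in range(N)]
for j in range(1, N):
    for i in range(N):
        b[i][j] = b[i][j-1] + b[i][j]

assert a == b   # both iteration orders compute the same result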
Today's organizations find themselves in a race to adopt new technologies in order to keep up with their competition. However, two questions must be answered: Are these organizations ready for new technological advancements, and are these new technologies appropriate for every organization? Technological Challenges and Management: Matching Human and Business Needs focuses on the new advances and challenges that today's organizations face in the areas of human resources and business, resulting from continuous and highly complex changes in technological resources. Organizations need to implement more proactive and flexible management, matching their human and business needs. Given this reality, it is important to study and understand the varied contributions made by researchers, academics, and practitioners in this field of study worldwide. With this reality in focus, the book exchanges experiences and perspectives about the state of technological challenges and management research, and future directions for this field of study. It also takes into account the deep implications that these challenges have for the organization of human resources. The authors support academics, researchers, and those operating in the management field in dealing with the different challenges that organizations face today. This is especially true concerning the relationship between technological change, human resources management, and business. They propose the sharing of knowledge, through debate and information exchange, about technological challenges and management, matching the critical items of human and business needs. The book is divided into seven chapters that span from evaluating new technologies to finding the perfect fit.
Amid recent interest in Clifford algebra for dual quaternions as a more suitable method for Computer Graphics than standard matrix algebra, this book presents dual quaternions and their associated Clifford algebras in a new light, accessible to and geared towards the Computer Graphics community. Collating all the associated formulas and theorems in one place, this book provides an extensive and rigorous treatment of dual quaternions, as well as showing how two models of Clifford algebras emerge naturally from the theory of dual quaternions. Each chapter comes complete with a set of exercises to help readers sharpen and practice their knowledge. This book is accessible to anyone with a basic knowledge of quaternion algebra and is of particular use to forward-thinking members of the Computer Graphics community.
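As a taste of the algebra involved, here is a minimal, hedged Python sketch (illustrative code, not taken from the book; function names are made up): a dual quaternion is represented as a pair of ordinary quaternions multiplied under the rule epsilon squared = 0, and a rigid motion is encoded in the common (r, 0.5*t*r) form so that composing two transforms is just dual-quaternion multiplication.

import math

def qmul(p, q):
    # Hamilton product of two quaternions (w, x, y, z)
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def qconj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def dq_from_rt(r, t):
    # encode the rigid motion x -> R x + t as (r, 0.5 * t * r),
    # with r a unit rotation quaternion and t a translation vector
    tq = (0.0, *t)
    return (r, tuple(0.5 * c for c in qmul(tq, r)))

def dq_mul(a, b):
    # (a_r + eps a_d)(b_r + eps b_d) = a_r b_r + eps (a_r b_d + a_d b_r)
    ar, ad = a
    br, bd = b
    return (qmul(ar, br),
            tuple(x + y for x, y in zip(qmul(ar, bd), qmul(ad, br))))

def dq_translation(dq):
    # recover t as 2 * q_d * conj(q_r) when the real part is a unit quaternion
    r, d = dq
    return qmul(tuple(2.0 * c for c in d), qconj(r))[1:]

rz = (math.cos(math.pi/4), 0.0, 0.0, math.sin(math.pi/4))   # 90 degrees about z
a = dq_from_rt(rz, (1.0, 0.0, 0.0))                         # rotate, then shift in x
b = dq_from_rt((1.0, 0.0, 0.0, 0.0), (0.0, 2.0, 0.0))       # pure shift in y
print(dq_translation(dq_mul(b, a)))   # approximately (1.0, 2.0, 0.0) for "b after a"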
Computer Systems and Software Engineering is a compilation of sixteen state-of-the-art lectures and keynote speeches given at the COMPEURO '92 conference. The contributions are from leading researchers, each of whom gives a new insight into subjects ranging from hardware design through parallelism to computer applications. The pragmatic flavour of the contributions makes the book a valuable asset for researchers and designers alike. The book covers the following subjects: Hardware Design: memory technology, logic design, algorithms and architecture; Parallel Processing: programming, cellular neural networks and load balancing; Software Engineering: machine learning, logic programming and program correctness; Visualization: the graphical computer interface.
Thinking Machines: Machine Learning and Its Hardware Implementation covers the theory and application of machine learning, neuromorphic computing and neural networks. This is the first book that focuses on machine learning accelerators and hardware development for machine learning. It presents not only a summary of the latest trends and examples of machine learning hardware and basic knowledge of machine learning in general, but also the main issues involved in its implementation. Readers will learn what is required for the design of machine learning hardware for neuromorphic computing and/or neural networks. This is a recommended book for those who have basic knowledge of machine learning or those who want to learn more about the current trends of machine learning.
Intelligent Image and Video Compression: Communicating Pictures, Second Edition explains the requirements, analysis, design and application of a modern video coding system. It draws on the authors' extensive academic and professional experience in this field to deliver a text that is algorithmically rigorous yet accessible, relevant to modern standards and practical. It builds on a thorough grounding in mathematical foundations and visual perception to demonstrate how modern image and video compression methods can be designed to meet the rate-quality performance levels demanded by today's applications and users, in the context of prevailing network constraints. "David Bull and Fan Zhang have written a timely and accessible book on the topic of image and video compression. Compression of visual signals is one of the great technological achievements of modern times, and has made possible the great successes of streaming and social media and digital cinema. Their book, Intelligent Image and Video Compression covers all the salient topics ranging over visual perception, information theory, bandpass transform theory, motion estimation and prediction, lossy and lossless compression, and of course the compression standards from MPEG (ranging from H.261 through the most modern H.266, or VVC) and the open standards VP9 and AV-1. The book is replete with clear explanations and figures, including color where appropriate, making it quite accessible and valuable to the advanced student as well as the expert practitioner. The book offers an excellent glossary and as a bonus, a set of tutorial problems. Highly recommended!" --Al Bovik
This book discusses the design principles of physically unclonable functions (PUFs) and how these can be employed in hardware-based security applications. In particular, the book provides readers with a comprehensive overview of security threats and existing countermeasures. This book has many features that make it a unique source for students, engineers and educators, including more than 80 problems and worked exercises, in addition to approximately 200 references, which give extensive direction for further reading.
This book presents novel hybrid encryption algorithms that possess many different characteristics. In particular, "Hybrid Encryption Algorithms over Wireless Communication Channels" examines encrypted image and video data for the purpose of secure wireless communications. Two different families of encryption schemes are studied: namely, permutation-based and diffusion-based schemes. The objective of the book is to help the reader select the scheme best suited to the transmission of encrypted images and videos over wireless communication channels, with the aid of encryption and decryption quality metrics. This is achieved by applying number-theory-based encryption algorithms, such as chaotic theory with different modes of operation, the Advanced Encryption Standard (AES), and RC6, in a pre-processing step in order to achieve the required permutation and diffusion. The Rubik's cube is used afterwards in order to maximize the number of permutations. Transmission of images and videos is vital in today's communication systems; hence, effective encryption and modulation schemes are a must. The author adopts Orthogonal Frequency Division Multiplexing (OFDM) as the multicarrier transmission choice for wideband communications. For completeness, the author addresses the sensitivity of the encrypted data to wireless channel impairments and the effect of channel equalization on the quality of the received images and videos. Complete simulation experiments with MATLAB(R) codes are included. The book will help the reader obtain the understanding required for selecting the encryption method that best fulfills the application requirements.
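For illustration only, the following minimal Python sketch (not the book's MATLAB implementation; the key x0 and map parameter r are arbitrary placeholders) shows the flavor of a permutation-based step: a logistic-map keystream is sorted to derive a pixel permutation, which scrambles positions without altering values, whereas a diffusion-based step would instead modify the pixel values themselves.

def chaotic_permutation(n, x0=0.3141, r=3.99):
    # iterate the logistic map x -> r*x*(1-x) and use the sort order of the
    # samples as a permutation of n pixel positions (x0 and r act as the key)
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return sorted(range(n), key=lambda i: xs[i])

def permute(pixels, perm):
    return [pixels[p] for p in perm]

def inverse_permute(scrambled, perm):
    out = [0] * len(perm)
    for dst, src in enumerate(perm):
        out[src] = scrambled[dst]
    return out

image = list(range(16))                 # stand-in for a flattened 4x4 image
perm = chaotic_permutation(len(image))
cipher = permute(image, perm)
assert inverse_permute(cipher, perm) == image   # the receiver with the key recovers the image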
As human activities moved to the digital domain, so did all the well-known malicious behaviors, including fraud, theft, and other trickery. There is no silver bullet, and each security threat calls for a specific answer. One specific threat is that applications accept malformed inputs, and in many cases it is possible to craft inputs that let an intruder take full control over the target computer system. The nature of systems programming languages lies at the heart of the problem. Rather than rewriting decades of well-tested functionality, this book examines ways to live with the (programming) sins of the past while shoring up security in the most efficient manner possible. We explore a range of different options, each making significant progress towards securing legacy programs from malicious inputs. The solutions explored include enforcement-type defenses, which exclude certain program executions because they never arise during normal operation. Another strand explores the idea of presenting adversaries with a moving target that unpredictably changes its attack surface thanks to randomization. We also cover tandem execution ideas, where the compromise of one executing clone causes it to diverge from another, thus revealing adversarial activities. The main purpose of this book is to provide readers with some of the most influential works on run-time exploits and defenses. We hope that the material in this book will inspire readers and generate new ideas and paradigms.
With this book, managers and decision makers are given the tools to make more informed decisions about big data purchasing initiatives. Big Data Analytics: A Practical Guide for Managers not only supplies descriptions of common tools, but also surveys the various products and vendors that supply the big data market. Comparing and contrasting the different types of analysis commonly conducted with big data, this accessible reference presents clear-cut explanations of the general workings of big data tools. Instead of spending time on HOW to install specific packages, it focuses on the reasons WHY readers would install a given package. The book provides authoritative guidance on a range of tools, including open source and proprietary systems. It details the strengths and weaknesses of incorporating big data analysis into decision-making and explains how to leverage the strengths while mitigating the weaknesses.
* Describes the benefits of distributed computing in simple terms
* Includes substantial vendor/tool material, especially for open source decisions
* Covers prominent software packages, including Hadoop and Oracle Endeca
* Examines GIS and machine learning applications
* Considers privacy and surveillance issues
The book further explores basic statistical concepts that, when misapplied, can be the source of errors. Time and again, big data is treated as an oracle that discovers results nobody would have imagined. While big data can serve this valuable function, all too often these results are incorrect, yet are still reported unquestioningly. The probability of having erroneous results increases as a larger number of variables are compared unless preventative measures are taken. The approach taken by the authors is to explain these concepts so managers can ask better questions of their analysts and vendors as to the appropriateness of the methods used to arrive at a conclusion. Because the world of science and medicine has been grappling with similar issues in the publication of studies, the authors draw on their efforts and apply them to big data.
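The multiple-comparisons pitfall mentioned above can be made concrete with a short, hedged simulation (not from the book): under the null hypothesis a p-value is uniform on [0, 1], so screening 100 unrelated variables at the usual 0.05 threshold almost guarantees a spurious "discovery", while a Bonferroni-corrected threshold keeps the family-wise error rate near 5%.

import random

random.seed(42)
alpha, m, trials = 0.05, 100, 10_000

naive = bonferroni = 0
for _ in range(trials):
    # simulate one "study": m tests of variables that have no real relationship,
    # so each p-value is just a uniform random draw
    pvalues = [random.random() for _ in range(m)]
    if min(pvalues) < alpha:
        naive += 1                  # at least one spurious "significant" result
    if min(pvalues) < alpha / m:
        bonferroni += 1             # same data, Bonferroni-corrected threshold

print(f"false discovery rate, naive threshold:      {naive / trials:.2%}")
print(f"false discovery rate, Bonferroni threshold: {bonferroni / trials:.2%}")
# expected: roughly 1 - 0.95**100 (about 99%) versus about 5%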
Why learn functional programming? Isn't that some complicated ivory-tower technique used only in obscure languages like Haskell? In fact, functional programming is actually very simple. It's also very powerful, as Haskell demonstrates by throwing away all the conventional programming tools and using only functional programming features. But it doesn't have to be done that way. Functional programming is a power tool that you can use in addition to all your usual tools, to whatever extent your current mainstream language supports it. Most languages have at least basic support. In this book we use Python and Java and, as a bonus, Scala. If you prefer another language, there will be minor differences in syntax, but the concepts are the same. Give functional programming a try. You may be surprised how much a single power tool can help you in your day-to-day programming.
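As a small, hedged taste of what that looks like in Python (an illustrative example, not one of the book's exercises; the data and names are made up): small pure functions are composed into a pipeline and combined with filter and reduce instead of an explicit loop over mutable state.

from functools import reduce

def compose(f, g):
    # function composition: compose(f, g)(x) == f(g(x))
    return lambda x: f(g(x))

orders = [{"item": "book", "price": 250.0},
          {"item": "cable", "price": 80.0},
          {"item": "ssd", "price": 1200.0}]

price = lambda o: o["price"]
with_vat = lambda p: p * 1.15
rounded = lambda p: round(p, 2)

# build a pipeline out of small pure functions instead of writing an explicit loop
gross = compose(rounded, compose(with_vat, price))
expensive_total = reduce(lambda acc, o: acc + gross(o),
                         filter(lambda o: price(o) > 100.0, orders),
                         0.0)
print(expensive_total)   # 1667.5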
Tactile Internet with Human-in-the-Loop describes the change from the current Internet, which focuses on the democratization of information independent of location or time, to the Tactile Internet, which democratizes skills to promote equity that is independent of age, gender, sociocultural background or physical limitations. The book promotes the concept of the Tactile Internet for remote closed-loop human-machine interaction and describes the main challenges and key technologies. Current standardization activities in the field for IEEE and IETF are also described, making this book an ideal resource for researchers, graduate students, and industry R&D engineers in communications engineering, electronic engineering, and computer engineering.
Your no-fluff, fast-paced guide to everything Windows 10
This handy, jargon-free guide is designed to help you quickly learn whatever you need to know about Windows 10. Perfect for novices and experienced users alike, you'll get tips, tricks, and savvy advice on how to install programs, set up user accounts, play music and other media files, download photos from your digital camera, go online, set up and secure an email account, and much, much more.
* Shows how to perform more than 150 Windows tasks, including working with files, digital images, and media; customizing Windows; optimizing performance; and sharing a computer with multiple users
* Covers installing and repairing applications, system maintenance, setting up password-protected accounts, downloading photos to your computer, and staying safe online
With concise, easy-to-follow instructions, and its small, portable size, this is the ideal, on-the-go guide for Windows 10 users everywhere.
The emergence of highly promising and potent technologies has enabled the transition of ordinary objects into smart artifacts, providing wider connectivity of digitized entities that can facilitate the building of connected cities. This book provides readers with a solid foundation on the latest technologies and tools required to develop and enhance smart cities around the world. The book begins by examining the rise of the cloud as the fundamental technology for establishing and sustaining smart cities and enterprises. Explaining the principal technologies and platform solutions for implementing intelligent cities, the book details the role of various technologies, standards, protocols, and tools in establishing flexible homes and the buildings of the future.
* Examines IT platforms and tools from various product vendors
* Considers service-oriented architecture and event-driven architecture for smart city applications
* Explains how to leverage big data analytics for smart city enhancement and improved decision making
* Includes case studies of intelligent cities, smart homes, buildings, transports, healthcare systems, and airports
The authors explore the convergence of cloud computing and enterprise architecture and present valuable information on next-generation cloud computing. They also cover the various architectural types, including enterprise-scale integration, security, management, and governance. The book concludes by explaining the various security requirements of intelligent cities as well as the threats and vulnerabilities of the various components that form the basis of the intelligent city framework, including cloud, big data, Internet of Things, and mobile technologies.
The go-to guide to getting started with micro:bit and exploring all of the mini-computer's amazing capabilities
The micro:bit is a pocket-sized electronic development platform built with education in mind. It was developed by the BBC in partnership with Microsoft and other major tech companies to provide kids with a fun, easy, inexpensive way to develop their digital skills. With it, kids (and grownups) can learn basic programming and coding while having fun making virtual pets, developing games, and a whole lot more. Written by internationally bestselling tech author Gareth Halfacree and endorsed by the Micro:bit Foundation, the micro:bit User Guide contains what you need to know to get up and running fast with the micro:bit. Learn everything from taking your first steps with the software to writing your own programs. You'll also learn how to expand its capabilities with add-ons through easy-to-follow, step-by-step instructions.
* Configure your micro:bit and develop your digital skills
* Write code in Microsoft PXT, Python, JavaScript, and more (see the short example after this list)
* Discover the motion detector and compass
* Connect the micro:bit to a computer, Raspberry Pi, or your smartphone
* Build your own circuits and create hardware
The micro:bit User Guide is your go-to source for learning all the secrets of the micro:bit. Whether you're just beginning or have some experience, this book allows you to dive right in and experience everything the micro:bit has to offer.
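As a flavor of the first steps such a guide walks through, here is a short, hedged MicroPython sketch using the standard microbit module (it is meant to be flashed to the board rather than run on a desktop interpreter, and the tilt threshold is an arbitrary illustrative value): it scrolls a greeting, calibrates the compass, and then reacts to button presses and tilt.

from microbit import accelerometer, button_a, compass, display, sleep, Image

display.scroll("Hello")      # scroll a greeting across the 5x5 LED grid
compass.calibrate()          # run the on-screen calibration routine before reading headings

while True:
    if button_a.is_pressed():
        display.scroll(str(compass.heading()))   # compass heading in degrees, 0-359
    elif accelerometer.get_x() > 400:            # tilted to the right (milli-g)
        display.show(Image.ARROW_E)
    else:
        display.show(Image.HAPPY)
    sleep(200)                                   # pause 200 ms between updates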
This book provides insights into the 3rd International Conference on Communication, Devices and Computing (ICCDC 2021), which was held in Haldia, India, on August 16-18, 2021. It covers new ideas, applications, and the experiences of research engineers, scientists, industrialists, scholars, and students from around the globe. The proceedings highlight cutting-edge research on communication, electronic devices, and computing and address diverse areas such as 5G communication, spread spectrum systems, wireless sensor networks, signal processing for secure communication, error control coding, printed antennas, analysis of wireless networks, antenna array systems, analog and digital signal processing for communication systems, frequency selective surfaces, radar communication, and substrate integrated waveguide and microwave passive components, which are key to state-of-the-art innovations in communication technologies.
Relationships abound in the library and information science (LIS) world. Those relationships may be social in nature, as, for instance, when we deal with human relationships among library personnel or relationships (i.e., "public relations") between an information center and its clientele. The relationships may be educational, as, for example, when we examine the relationship between the curriculum of an accredited school and the needs of the work force it is preparing students to join. Or the relationships may be economic, as when we investigate the relationship between the cost of journals and the frequency with which they are cited. Many of the relationships of concern to us reflect phenomena entirely internal to the field: the relationship between manuscript collections, archives, and special collections; the relationship between end user search behavior and the effectiveness of searches; the relationship between access to and use of information resources; the relationship between recall and precision; the relationship between various bibliometric laws; etc. The list of such relationships could go on and on. The relationships addressed in this volume are restricted to those involved in the organization of recorded knowledge, which tend to have a conceptual or semantic basis, although statistical means are sometimes used in their discovery.
Tremendous achievements in the area of semiconductor electronics have turned microelectronics into nanoelectronics. Actually, we observe a real technical boom connected with achievements in nanoelectronics. It results in the development of very complex integrated circuits, particularly field programmable logic devices (FPLD). Present-day FPLD chips are so huge that a single chip is enough to implement a really complex digital system including a datapath and a control unit. Because of the extreme complexity of modern microchips, it is very important to develop effective design methods oriented to the particular properties of logic elements. The development of digital systems with FPLD microchips is not possible without the use of different hardware description languages (HDL), such as VHDL and Verilog. Different computer-aided design (CAD) tools are widely used to develop digital system hardware. As the majority of researchers point out, the design process is now very similar to the process of program development. It allows a researcher to pay more attention to specific problems for which there are no standard formal methods of solution. But the application of all these achievements does not per se guarantee the development of a competitive electronic product, especially within an acceptable time-to-market. Solving this problem is possible only if a researcher possesses fundamental knowledge of the design process and knows exactly the mode of operation of the industrial CAD tools in use. As is known, any digital system can be represented as a composition of a datapath and a control unit.
This book reviews and presents a number of approaches to fuzzy-based system safety and reliability assessment. For each proposed approach, it provides case studies demonstrating their applicability, which will enable readers to implement them into their own risk analysis process. The book begins by giving a review of the use of linguistic terms in system safety and reliability analysis methods and their extension by fuzzy sets. It then progresses in a logical fashion, dedicating a chapter to each approach, including the 2-tuple fuzzy-based linguistic term set approach, fuzzy bow-tie analysis, optimizing the allocation of risk control measures using a fuzzy MCDM approach, fuzzy sets theory and human reliability, and a fuzzy-expert-aided disaster management system for emergency decision making. This book will be of interest to professionals and researchers working in the field of system safety and reliability, as well as postgraduate and undergraduate students studying applications of fuzzy systems.
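To ground the idea of linguistic terms extended by fuzzy sets, here is a minimal, hedged Python sketch (not any specific method from the book; the 0-10 likelihood scale and the breakpoints are illustrative assumptions): the terms low, medium and high are modelled as triangular membership functions, and a numeric score maps to a degree of membership in each term rather than to a single crisp category.

def triangular(x, a, b, c):
    # membership rises linearly from a to the peak at b, then falls to c
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

TERMS = {
    "low":    (0.0, 0.0, 4.0),
    "medium": (2.0, 5.0, 8.0),
    "high":   (6.0, 10.0, 10.0),
}

def memberships(score):
    # degree to which the score belongs to each linguistic term
    return {term: round(triangular(score, *abc), 2) for term, abc in TERMS.items()}

print(memberships(3.0))   # {'low': 0.25, 'medium': 0.33, 'high': 0.0}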
Many professionals and students in engineering, science, business, and other application fields need to develop Windows-based and web-enabled information systems to store and use data for decision support, without help from professional programmers. However, few books are available to train professionals and students who are not professional programmers to develop these information systems. Developing Windows-Based and Web-Enabled Information Systems fills this gap, providing a self-contained, easy-to-understand, and well-illustrated text that explores current concepts, methods, and software tools for developing Windows-based and web-enabled information systems that store and use data. It walks through concepts and implementation details with small, easy-to-understand examples and complements them with large-scale case studies. The book describes data modeling methods, including entity-relationship modeling, relational modeling and normalization, and object-oriented data modeling, for developing the data models of a database. The author covers how to use software tools in the Microsoft application development environment, including Microsoft Access, MySQL, SQL, Visual Studio, Visual Basic, VBA, HTML, and XML, to implement databases and develop Windows-based and web-enabled applications with database, graphical user interface, and program components. The book takes you through the entire process of developing a computer and network application for an information system, highlighting concepts and operational details. In each chapter, small data examples are used to manually walk through concepts and operational details. These features give you the conceptual understanding and practical skill required, even if you don't have a computer science background, to develop Windows-based or web-enabled applications for your specialized information system.
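Purely as an illustration of the relational ideas such a text teaches (the book itself works with Microsoft Access, MySQL, SQL, and VBA; this sketch uses Python's built-in sqlite3, and the table and column names are made up): two normalized tables linked by a foreign key are created, populated, and queried with a join to produce a simple decision-support summary.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE purchase (
        purchase_id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        item        TEXT NOT NULL,
        amount      REAL NOT NULL
    );
""")
con.execute("INSERT INTO customer VALUES (1, 'Ada'), (2, 'Grace')")
con.executemany("INSERT INTO purchase VALUES (?, ?, ?, ?)",
                [(1, 1, 'keyboard', 350.0),
                 (2, 1, 'monitor', 2200.0),
                 (3, 2, 'mouse', 150.0)])

# total spend per customer: a small decision-support query over the joined tables
for name, total in con.execute("""
        SELECT c.name, SUM(p.amount)
        FROM customer c JOIN purchase p ON p.customer_id = c.customer_id
        GROUP BY c.name ORDER BY c.name"""):
    print(name, total)   # Ada 2550.0, Grace 150.0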
Computer vision falls short of human vision in two respects: execution time and intelligent interpretation. This book addresses the question of execution time. It is based on a workshop on specialized processors for real-time image analysis, held as part of the activities of an ESPRIT Basic Research Action, the Working Group on Vision. The aim of the book is to examine the state of the art in vision-oriented computers. Two approaches are distinguished: multiprocessor systems and fine-grain massively parallel computers. The development of fine-grain machines has become more important over the last decade, but one of the main conclusions of the workshop is that this does not imply the replacement of multiprocessor machines. The book is divided into four parts. Part 1 introduces different architectures for vision: associative and pyramid processors as examples of fine-grain machines and a workstation with bus-oriented network topology as an example of a multiprocessor system. Parts 2 and 3 deal with the design and development of dedicated and specialized architectures. Part 4 is mainly devoted to applications, including road segmentation, mobile robot guidance and navigation, reconstruction and identification of 3D objects, and motion estimation. |