As the world moves rapidly into the Information Age, it encounters both successes and challenges, and it is well recognized that Intelligent Information Processing provides the key to the Information Age and to mastering many of these challenges. Intelligent Information Processing underpins the most advanced productive tools, tools said to be capable of changing human life and the world itself. The path, however, is never a straight one, and every new technology brings with it a spate of new research problems to be tackled by researchers. As such, the demand for research in intelligent information processing is ever-increasing. This book presents the proceedings of the 4th IFIP International Conference on Intelligent Information Processing. The conference provides a forum for engineers and scientists in academia and industry to present their latest research findings in all aspects of Intelligent Information Processing.
Collaborative Networks: Becoming a Pervasive Paradigm. In recent years the area of collaborative networks has been consolidating as a new discipline (Camarinha-Matos, Afsarmanesh, 2005) that encompasses and gives more structured support to a large diversity of collaboration forms. In terms of applications, besides the "traditional" sectors represented by advanced supply chains, virtual enterprises, virtual organizations, virtual teams, and their breeding environments, new forms of collaborative structures are emerging in all sectors of society. Examples can be found in e-government, intelligent transportation systems, collaborative virtual laboratories, agribusiness, elderly care, the silver economy, etc. In some cases these developments tend to adopt terminology that is specific to that domain, and the actors involved in a given domain are often not fully aware of developments in the mainstream research on collaborative networks. For instance, the grid community adopted the term "virtual organization" but focused mainly on the resource-sharing perspective, ignoring most of the other aspects involved in collaboration. The European enterprise interoperability community, which was initially focused on intra-enterprise aspects, is moving towards inter-enterprise collaboration. Collaborative networks are thus becoming a pervasive paradigm, providing the basis for new socio-organizational structures.
Underwater Robots reports on the latest progress in underwater robotics. In spite of its importance, the ocean is generally overlooked, since we focus more of our attention on land and atmospheric issues. We have not yet been able to explore the full depths of the ocean and its resources. The deepest regions of the ocean lie between 19,000 and 36,000 feet. At a depth of just 33 feet, the pressure is already 29.4 psi, twice normal atmospheric pressure. This obstacle, compounded with the other complexities of an unstructured and hazardous environment, makes it difficult to travel in the ocean even though today's technologies allow humans to land on the moon. Only recently did we discover, using manned submersibles, that a large amount of carbon dioxide comes from the sea floor and that extraordinary groups of organisms live in hydrothermal vent areas. On March 24, 1995, Kaiko (a remotely operated vehicle) navigated the deepest region of the ocean, the Mariana Trench. The vehicle successfully dived to a depth of 33,000 feet and immediately relayed scenes from the trench through a video camera. New tools like this enable us to gain knowledge of mysterious places. However, extensive use of manned submersibles and remotely operated vehicles is limited to a few applications because of very high operational costs, operator fatigue and safety issues. In spite of these hindrances, the demand for advanced underwater robot technologies is growing and will eventually lead to fully autonomous, specialized, reliable underwater robotic vehicles. Underwater Robots is an edited volume of peer-reviewed original research comprising thirteen invited contributions by leading researchers. This research has also been published as a special issue of Autonomous Robots (Volume 3, Numbers 2 and 3).
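The pressure figures quoted above follow from the hydrostatic relation P = P_atm + rho*g*h. The short sketch below is not from the book; the seawater density and unit constants are assumed standard values, and it simply reproduces the arithmetic behind the blurb:

```python
# Illustrative sketch of the hydrostatic-pressure arithmetic in the blurb.
# Assumed constants: standard atmosphere (14.7 psi), typical seawater density.
ATM_PSI = 14.696            # standard atmospheric pressure, psi
PA_TO_PSI = 1 / 6894.757    # pascals to psi
RHO_SEAWATER = 1025.0       # kg/m^3 (assumed typical value)
G = 9.81                    # m/s^2
FT_TO_M = 0.3048

def absolute_pressure_psi(depth_ft: float) -> float:
    """Atmospheric pressure plus the weight of the overlying water column."""
    gauge_pa = RHO_SEAWATER * G * depth_ft * FT_TO_M
    return ATM_PSI + gauge_pa * PA_TO_PSI

for depth in (33, 33000):
    print(f"{depth:>6} ft -> {absolute_pressure_psi(depth):>9.1f} psi")
# 33 ft gives about 29.4 psi (twice atmospheric); 33,000 ft gives roughly 14,700 psi.
```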
Global competition, sluggish economies and the potential offered by emerging technologies have pushed firms to fundamentally rethink their business processes. Business Process Reengineering (BPR) has become recognized as a means to restructure aging, bureaucratized processes to achieve the strategic objectives of increased efficiency, reduced costs, improved quality and greater customer satisfaction. Business Process Change: Reengineering Concepts, Methods and Technologies provides extensive coverage of the organizational, managerial and technical concepts related to business process change. Topics covered in this book include: process change components; enablers of process change; methodologies, techniques and tools; team-based management; and effective adoption of BPR.
Our understanding of nature often comes through nonuniform observations in space or time. In space, one normally observes the important features of an object, such as edges, and interpolates the less important features. History is a collection of important events that are nonuniformly spaced in time: historians infer what happened between events (interpolation), while politicians and stock market analysts forecast the future from past and present events (extrapolation). The 20 chapters of Nonuniform Sampling: Theory and Practice contain contributions by leading researchers in nonuniform and Shannon sampling, zero crossing, and interpolation theory. Practical applications include NMR, seismology, speech and image coding, modulation and coding, optimal content, array processing, and digital filter design. The book takes a tutorial approach for practising engineers and advanced students in science, engineering, and mathematics, and is also a useful reference for scientists and engineers working in medical imaging, geophysics, astronomy, biomedical engineering, computer graphics, digital filter design, speech and video processing, and phased array radar. A special feature of the package is a CD-ROM containing C code, Matlab and Mathcad programs for the algorithms presented.
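To make the interpolation idea concrete, here is a minimal sketch (not taken from the book's CD-ROM; the test signal and sample counts are illustrative assumptions) that resamples nonuniformly spaced observations onto a uniform grid by linear interpolation:

```python
# Reconstruct a uniformly sampled signal from nonuniformly spaced observations.
import numpy as np

rng = np.random.default_rng(0)

# Nonuniform sample instants on [0, 1] and the values observed there.
t_nonuniform = np.sort(rng.uniform(0.0, 1.0, size=64))
x_observed = np.sin(2 * np.pi * 3 * t_nonuniform)

# Interpolate between the known samples to obtain a uniform grid.
t_uniform = np.linspace(0.0, 1.0, 256)
x_uniform = np.interp(t_uniform, t_nonuniform, x_observed)

# Compare against the underlying signal to gauge the reconstruction error.
error = np.max(np.abs(x_uniform - np.sin(2 * np.pi * 3 * t_uniform)))
print(f"max reconstruction error: {error:.3f}")
```

Linear interpolation is the crudest option; the book's chapters cover far more accurate reconstructions based on Shannon-type sampling theorems.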
The design process of digital circuits is often carried out in individual steps, such as logic synthesis, mapping, and routing. Because the complete process was originally too complex, it was split into several more or less independent phases. Over the last 40 years powerful algorithms have been developed to find optimal solutions for each of these steps. However, the interaction between these different algorithms was not considered for a long time. This leads to quality loss, e.g. in cases where highly optimized netlists fit badly onto the target architecture. Since the resulting circuits are often far from optimal and insufficient with respect to optimization criteria such as area and delay, several iterations of the complete design process have to be carried out to obtain high-quality results. This is a very time-consuming and costly process. For this reason, the idea of one-pass synthesis emerged some years ago. There are two main approaches to guaranteeing that a design comes out "first time right": combining levels that were previously split, e.g. using layout information already during the logic synthesis phase; and restricting the optimization in one level so that it better fits the next one. Several approaches in these two directions have been presented so far, and new techniques are under development. In Towards One-Pass Synthesis we describe the new paradigm used in one-pass synthesis and present examples of the two techniques above. Theoretical and practical aspects are discussed and minimization algorithms are given. This will help people working with synthesis tools and circuit design in general (in industry and academia) to stay informed about recent developments and new trends in this area.
In this book, the author traces the origin of the present information technology revolution, the technological features that underlie its impact, and the organizations, companies and technologies governing current and future growth. The book explains how the technology works, how it fits together, how the industry is structured and what the future might bring.
The consequences of recent floods and flash floods in many parts of the world have been devastating. One way to improve flood management practice is to invest in data collection and modelling activities that enable an understanding of how a system functions and the selection of optimal mitigation measures. A Digital Terrain Model (DTM) provides the most essential information for flood managers. Light Detection and Ranging (LiDAR) surveys, which capture spot heights at a spacing of 0.5 m to 5 m with a horizontal accuracy of 0.3 m and a vertical accuracy of 0.15 m, can be used to develop high-accuracy DTMs, but the raw data needs careful processing before it can be used in any application. This book presents the augmentation of an existing progressive morphological filtering algorithm for processing raw LiDAR data to support a 1D/2D urban flood modelling framework. The key characteristics of this improved algorithm are: (1) the ability to deal with different kinds of buildings; (2) the ability to detect elevated road/rail lines and represent them in accordance with reality; (3) the ability to deal with bridges and riverbanks; and (4) the ability to recover curbs and to apply appropriate Manning's roughness coefficients to represent close-to-earth vegetation (e.g. grass and small bushes).
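For readers unfamiliar with the underlying technique, the sketch below shows a much-simplified 1-D version of the progressive morphological filtering idea that the book extends: the window sizes, thresholds and toy elevation profile are illustrative assumptions, not the book's parameters.

```python
# Simplified 1-D progressive morphological filter: separate ground from
# non-ground cells in a gridded elevation profile by opening the surface
# with progressively larger windows and flagging large elevation drops.
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_ground_mask(z, windows=(3, 5, 9, 17),
                                          thresholds=(0.3, 0.5, 1.0, 2.0)):
    """Return a boolean mask of cells kept as ground."""
    ground = np.ones_like(z, dtype=bool)
    surface = z.astype(float).copy()
    for w, dh in zip(windows, thresholds):
        opened = grey_opening(surface, size=w)   # erode-then-dilate removes bumps narrower than w
        ground &= (surface - opened) <= dh       # large drops indicate buildings or vegetation
        surface = opened                         # next pass works on the smoothed surface
    return ground

# Toy profile: gently sloping terrain with a 10 m high "building" in the middle.
z = np.linspace(0.0, 2.0, 100)
z[40:55] += 10.0
mask = progressive_morphological_ground_mask(z)
print("non-ground cells:", np.flatnonzero(~mask))
```

Production implementations work on 2-D grids and tie the elevation threshold to terrain slope; the book's refinements for buildings, bridges, elevated roads and curbs build on that foundation.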
This book describes the emerging point-of-care (POC) technologies that are paving the way to next-generation healthcare monitoring and management. It provides readers with comprehensive, up-to-date information about emerging technologies, such as smartphone-based mobile healthcare technologies, smart devices, commercial personalized POC technologies, paper-based immunoassays (IAs), lab-on-a-chip (LOC)-based IAs, and multiplex IAs. The book also provides guided insights into POC diabetes management software and smart applications, and the statistical determination of various bioanalytical parameters. Additionally, the authors discuss future trends in POC technologies and in personalized and integrated healthcare solutions for chronic diseases, such as diabetes, stress, obesity, and cardiovascular disorders. Each POC technology is described comprehensively and analyzed critically with respect to its characteristic features, bioanalytical principles, applications, advantages, limitations, and future trends. This book will be a very useful resource and teaching aid for professionals working in the fields of POC technologies, in vitro diagnostics (IVD), mobile healthcare, Big Data, smart technology, software, smart applications, biomedical engineering, biosensors, personalized healthcare, and other disciplines.
Over the past twenty years, the conventional knowledge management approach has evolved into a strategic management approach that has found applications and opportunities outside of business, in society at large, through education, urban development, governance, and healthcare, among others. Knowledge-Based Development for Cities and Societies: Integrated Multi-Level Approaches illuminates the concepts and challenges of knowledge management for both urban environments and entire regions, enhancing the expertise of scholars, researchers, practitioners, managers and urban developers in the creation of successful knowledge-based development policies, knowledge cities and prosperous knowledge societies. This reference creates a large knowledge base for scholars, managers and urban developers and raises awareness of the role of knowledge cities and knowledge societies in the knowledge era, as well as of the challenges and opportunities for future research.
This book presents the proceedings of the Third International Conference on Electrical Engineering and Control (ICEECA2017). It covers new control system models and troubleshooting tips, and also addresses complex system requirements, such as increased speed, precision and remote capabilities, bridging the gap between the complex, math-heavy controls theory taught in formal courses, and the efficient implementation required in real-world industry settings. Further, it considers both the engineering aspects of signal processing and the practical issues in the broad field of information transmission and novel technologies for communication networks and modern antenna design. This book is intended for researchers, engineers, and advanced postgraduate students in control and electrical engineering, computer science, signal processing, as well as mechanical and chemical engineering.
Educational initiatives attempt to introduce or promote a culture of quality within education by raising concerns about student learning, providing services related to assessment, supporting the professional development of teachers, shaping curriculum and pedagogy, and influencing educational policy in the realm of technology.
Adaptive Learning of Polynomial Networks delivers theoretical and practical knowledge for the development of algorithms that infer linear and non-linear multivariate models, providing a methodology for the inductive learning of polynomial neural network (PNN) models from data. The empirical investigations detailed here demonstrate that PNN models evolved by genetic programming and improved by backpropagation are successful at solving real-world tasks. The text emphasizes the model identification process.
This volume is an essential reference for researchers and practitioners interested in the fields of evolutionary computation, artificial neural networks and Bayesian inference, and will also appeal to postgraduate and advanced undergraduate students of genetic programming. Readers will strengthen their skills in creating both efficient model representations and learning operators that efficiently sample the search space, navigating the search process through the design of objective fitness functions, and examining the search performance of the evolutionary system.
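As a rough illustration of the two-stage idea described above, a model structure found by search followed by coefficient refinement with gradient descent, here is a minimal sketch. It is not the book's PNN algorithm; the model structure, toy data and learning rate are assumptions made purely for illustration.

```python
# Refine the coefficients of a fixed polynomial model by gradient descent,
# the "improve by backpropagation" step applied after a structure is chosen.
import numpy as np

rng = np.random.default_rng(1)

# Toy data from a noisy quadratic.
x = rng.uniform(-1, 1, size=200)
y = 1.5 * x**2 - 0.7 * x + 0.2 + 0.05 * rng.standard_normal(200)

# Assumed structure: y ~ w0 + w1*x + w2*x^2 (in a full PNN system this
# structure would itself be evolved, e.g. by genetic programming).
features = np.column_stack([np.ones_like(x), x, x**2])
w = np.zeros(3)

learning_rate = 0.1
for _ in range(2000):
    residual = features @ w - y
    grad = features.T @ residual / len(y)   # gradient of the mean squared error
    w -= learning_rate * grad

print("learned coefficients (w0, w1, w2):", np.round(w, 3))
```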
Over the past two decades, many advances have been made in the decision support system (DSS) field. They range from progress in fundamental concepts, to improved techniques and methods, to widespread use of commercial software for DSS development. Still, the depth and breadth of the DSS field continues to grow, fueled by the need to better support decision making in a world that is increasingly complex in terms of volume, diversity, and interconnectedness of the knowledge on which decisions can be based. This continuing growth is facilitated by increasing computer power and decreasing per-unit computing costs. But, it is spearheaded by the multifaceted efforts of DSS researchers. The collective work of these researchers runs from the speculative to the normative to the descriptive. It includes analysis of what the field needs, designs of means for meeting recognized needs, and implementations for study. It encompasses theoretical, empirical, and applied orientations. It is concerned with the invention of concepts, frameworks, models, and languages for giving varied, helpful perspectives. It involves the discovery of principles, methods, and techniques for expeditious construction of successful DSSs. It aims to create computer-based tools that facilitate DSS development. It assesses DSS efficacy by observing systems, their developers, and their users. This growing body of research continues to be fleshed out and take shape on a strong, but still-developing, skeletal foundation.
Unconventional computing is a niche for interdisciplinary science, a cross-breeding of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aim of this book is to uncover and exploit the principles and mechanisms of information processing in, and the functional properties of, physical, chemical and living systems in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This second volume presents experimental laboratory prototypes and applied computing implementations. Emergent molecular computing is represented by enzymatic logic gates and circuits, and DNA nano-devices. Reaction-diffusion chemical computing is exemplified by logic circuits in the Belousov-Zhabotinsky medium and geometrical computation in precipitating chemical reactions. Logic circuits realised with solitons and impulses in polymer chains show advances in collision-based computing. Photo-chemical and memristive devices give us a glimpse of hot topics in novel hardware. Practical computing is represented by algorithms of collective and immune computing and nature-inspired optimisation. Living computing devices are implemented in real and simulated cells, regenerating organisms, plant roots and slime mould. The book is an encyclopedia, the first complete authoritative account of theoretical and experimental findings in unconventional computing, written by world leaders in the field. All chapters are self-contained; no specialist background is required to appreciate the ideas, findings, constructs and designs presented. This treatise on unconventional computing appeals to readers from all walks of life, from high-school pupils to university professors, from mathematicians, computer scientists and engineers to chemists and biologists.
This book describes the emerging practice of e-mail tutoring: one-to-one correspondence between college students and writing tutors conducted over electronic mail. It reviews the history of Composition Studies, paying special attention to the ways in which writing centers and computers and composition have previously been hailed within a narrative of functional literacy and quick-fix solutions. The author suggests a new methodology for tutoring, and a new mandate for the writing center: a strong connection between the rhythms of extended, asynchronous writing and dialogic literacy. The electronic writing center can become a site for informed resistance to functional literacy.
This book investigates in detail the emerging deep learning (DL) techniques in computational physics, assessing their promising potential to substitute for conventional numerical solvers in calculating fields in real time. After proper training, the proposed architectures can solve both the forward computation and the inverse retrieval problems. Pursuing a holistic perspective, the book covers the following areas. The first chapter discusses basic DL frameworks. The steady heat conduction problem is then solved by the classical U-net in Chapter 2, covering both the passive and active cases. Afterwards, the sophisticated heat flux on a curved surface is reconstructed by the presented Conv-LSTM, exhibiting high accuracy and efficiency. The electromagnetic parameters of complex media, such as permittivity and conductivity, are retrieved by a cascaded framework in Chapter 4. Additionally, a physics-informed DL structure along with a nonlinear mapping module is employed to obtain the space/temperature/time-dependent thermal conductivity from transient temperatures in Chapter 5. Finally, Chapter 6 introduces a series of the latest advanced frameworks and their corresponding physics applications. As deep learning techniques undergo vigorous development in computational physics, demand for related reading material is growing. This book is intended for graduate students, professional practitioners, and researchers who are interested in DL for computational physics.
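For context, the sketch below shows the kind of conventional numerical solver such surrogate networks are trained to replace: a finite-difference solution of the 1-D steady heat conduction (Laplace) problem with fixed end temperatures. The grid size and boundary values are illustrative assumptions, not examples from the book.

```python
# Conventional solver for 1-D steady heat conduction: discretise d^2T/dx^2 = 0
# with fixed boundary temperatures and solve the resulting tridiagonal system.
import numpy as np

n = 101                        # grid points
T_left, T_right = 100.0, 0.0   # boundary temperatures (assumed values)

# Interior equations T[i-1] - 2*T[i] + T[i+1] = 0 as a linear system A T = b.
A = (np.diag(-2.0 * np.ones(n - 2))
     + np.diag(np.ones(n - 3), 1)
     + np.diag(np.ones(n - 3), -1))
b = np.zeros(n - 2)
b[0] -= T_left
b[-1] -= T_right

T_interior = np.linalg.solve(A, b)
T = np.concatenate(([T_left], T_interior, [T_right]))
print("temperature at midpoint:", T[n // 2])   # linear profile, so about 50.0
```

A trained surrogate would map the boundary conditions (and, in the active case, source terms) directly to the temperature field, avoiding the linear solve at inference time.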
System-Level Synthesis deals with the concurrent design of electronic applications, including both hardware and software. The issue has become the bottleneck in the design of electronic systems in several major industrial fields, including telecommunications, automotive and aerospace engineering. The major difficulty with the subject is that it demands contributions from several research fields, including system specification, system architecture, hardware design, and software design. Most existing books cover only a few aspects of system-level synthesis well. The present volume offers a comprehensive discussion of all aspects of system-level synthesis, with each topic covered by a contribution written by an international authority on the subject.
Taxonomy for the Technology Domain suggests a new classification system that includes literacy, collaboration, decision-making, infusion, integration, and technology. As with most taxonomies, each step offers a progressively more sophisticated level of complexity by constructing increasingly multifaceted objectives addressing increasingly complex student learning outcomes. Taxonomy for the Technology Domain affects all aspects of how technology is used in elementary and secondary classrooms, corporate training rooms, and higher education classrooms.
Molecular networks provide descriptions of the organization of various biological processes, including cellular signaling, metabolism, and genetic regulation. Knowledge of molecular networks is commonly used for systems-level analysis of biological function; research and method development in this area has grown tremendously in the past few years. This book will provide a detailed review of existing knowledge on the functional characterization of biological networks. In 15 chapters authored by an international group of prolific systems biology and bioinformatics researchers, it will organize, conceptualize, and summarize the existing core of research results and computational methods for understanding biological function from a network perspective.
The size of technically producible integrated circuits increases continuously, but the ability to design and verify these circuits does not keep up. Today's design flow therefore has to be improved. Taking a visionary approach, this book analyzes the current design and verification methodology, identifies a number of deficiencies, and suggests solutions. Improvements in the methodology as well as in the underlying algorithms are proposed.
'Symbolic Boolean manipulation using binary decision diagrams (BDDs) has been successfully applied to a wide variety of tasks, particularly in very large scale integration (VLSI) computer-aided design (CAD). The concept of decision graphs as an abstract representation of Boolean functions dates back to the early work by Lee and Akers. In the last ten years, BDDs have found widespread use as a concrete data structure for symbolic Boolean manipulation. With BDDs, functions can be constructed, manipulated, and compared by simple and efficient graph algorithms. Since Boolean functions can represent not just digital circuit functions, but also such mathematical domains as sets and relations, a wide variety of CAD problems can be solved using BDDs. Binary Decision Diagrams and Applications for VLSI CAD provides valuable information both for those who are new to BDDs and for long-time aficionados.' (from the Foreword by Randal E. Bryant) 'Over the past ten years ... BDDs have attracted the attention of many researchers because of their suitability for representing Boolean functions. They are now widely used in many practical VLSI CAD systems. ... this book can serve as an introduction to BDD techniques and ... it presents several new ideas on BDDs and their applications. ... many computer scientists and engineers will be interested in this book since Boolean function manipulation is a fundamental technique not only in digital system design but also in exploring various problems in computer science.' (from the Preface by Shin-ichi Minato)
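To illustrate the kind of "simple and efficient graph algorithms" the foreword refers to, here is a deliberately tiny BDD sketch (not code from the book): nodes are hash-consed so that equivalent functions share one canonical graph, and Boolean operations are applied by Shannon expansion on the top variable.

```python
# Minimal reduced-ordered-BDD sketch: hash-consed nodes plus an 'apply'
# built on Shannon expansion. Variable order is the variable index; the
# terminals are the Python booleans True and False.
_unique = {}

def mk(var_index, low, high):
    """Return a shared node, skipping redundant tests (low is high)."""
    if low is high:
        return low
    key = (var_index, id(low), id(high))
    if key not in _unique:
        _unique[key] = (var_index, low, high)
    return _unique[key]

def var(i):
    """BDD for the single variable x_i."""
    return mk(i, False, True)

def apply_op(op, f, g):
    """Combine two BDDs with a Boolean operator via Shannon expansion."""
    if isinstance(f, bool) and isinstance(g, bool):
        return op(f, g)
    f_var = f[0] if not isinstance(f, bool) else float("inf")
    g_var = g[0] if not isinstance(g, bool) else float("inf")
    top = min(f_var, g_var)
    f_low, f_high = (f[1], f[2]) if f_var == top else (f, f)
    g_low, g_high = (g[1], g[2]) if g_var == top else (g, g)
    return mk(top, apply_op(op, f_low, g_low), apply_op(op, f_high, g_high))

AND = lambda a, b: a and b
OR = lambda a, b: a or b

x0, x1, x2 = var(0), var(1), var(2)
lhs = apply_op(AND, x0, apply_op(OR, x1, x2))                     # x0 & (x1 | x2)
rhs = apply_op(OR, apply_op(AND, x0, x1), apply_op(AND, x0, x2))  # (x0 & x1) | (x0 & x2)
print("equivalent:", lhs is rhs)   # canonical form makes equivalence a pointer check
```

Because construction is canonical, checking whether two formulas compute the same function reduces to a pointer comparison, which is exactly what makes BDDs attractive for verification tasks.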
Whether you are an experienced security or system administrator or a newcomer to the industry, you will learn how to use native, "out-of-the-box" operating system capabilities to secure your UNIX environment. There is no need for third-party software or freeware tools to become and stay secure. This book will help you ensure that your system is protected from unauthorized users and show you how to conduct intrusion traces to identify intruders if a breach does occur. It provides practical information on using the native OS security capabilities without the need for a third-party security application. Also included are hundreds of security tips, tricks, ready-to-use scripts and configuration files that will be a valuable resource in your endeavor to secure your UNIX systems.
Digital Intermediation offers a new framework for understanding content creation and distribution across automated media platforms - a new mediatisation process. The book draws on empirical and theoretical research to carefully identify and describe a number of unseen digital infrastructures that contribute to a predictive media production process through technologies, institutions and automation. Field data is drawn from several international sites, including Los Angeles, San Francisco, Portland, London, Amsterdam, Munich, Berlin, Hamburg, Sydney and Cartagena. By highlighting an increasingly automated content production and distribution process, the book responds to a number of regulatory debates on the societal impact of social media platforms. It highlights emerging areas of key importance that shape the production and distribution of social media content, including micro-platformization and digital first personalities. The book explains how technologies, institutions and automation are used within agencies to increase exposure for the talent they manage, while providing inside access to the processes and requirements of producers who create content for platform algorithms. Finally, it outlines user agency as a strategy for those who seek diversity in the information they access on automated social media content distribution platforms. The findings in this book provide key recommendations for policymakers working within digital media platforms, and will be invaluable reading for students and academics interested in automated media environments.
Although the origins of parallel computing go back to the last century, it was only in the 1970s that parallel and vector computers became available to the scientific community. The first of these machines, the 64-processor Illiac IV and the vector computers built by Texas Instruments, Control Data Corporation, and then Cray Research, had a somewhat limited impact. They were few in number and available mostly to workers in a few government laboratories. By now, however, the trickle has become a flood. There are over 200 large-scale vector computers now installed, not only in government laboratories but also in universities and in an increasing diversity of industries. Moreover, the National Science Foundation's Supercomputing Centers have made large vector computers widely available to the academic community. In addition, smaller, very cost-effective vector computers are being manufactured by a number of companies. Parallelism in computers has also progressed rapidly. The largest supercomputers now consist of several vector processors working in parallel. Although the number of processors in such machines is still relatively small (up to 8), it is expected that an increasing number of processors will be added in the near future (to a total of 16 or 32). Moreover, there are a myriad of research projects to build machines with hundreds, thousands, or even more processors. Indeed, several companies are now selling parallel machines, some with as many as hundreds, or even tens of thousands, of processors.