Adaptive Learning of Polynomial Networks delivers theoretical and practical knowledge for the development of algorithms that infer linear and non-linear multivariate models, providing a methodology for inductive learning of polynomial neural network (PNN) models from data. The empirical investigations detailed here demonstrate that PNN models evolved by genetic programming and improved by backpropagation are successful at solving real-world tasks. The text emphasizes the model identification process.
This volume is an essential reference for researchers and practitioners interested in the fields of evolutionary computation, artificial neural networks, and Bayesian inference, and will also appeal to postgraduate and advanced undergraduate students of genetic programming. Readers will strengthen their skills in creating both efficient model representations and learning operators that efficiently sample the search space, in navigating the search process through the design of objective fitness functions, and in examining the search performance of the evolutionary system.
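As a rough illustration of the building block behind such models (a sketch made up for this description, not code from the book), the snippet below fits the classic two-input quadratic transfer polynomial of a polynomial network node by ordinary least squares; the synthetic data, coefficients, and variable names are all assumptions.

```python
# Minimal sketch: fit the two-input quadratic transfer polynomial often used
# as a node in polynomial networks. Data and coefficients are invented here.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 200)
x2 = rng.uniform(-1, 1, 200)
y = 0.5 + 1.2*x1 - 0.7*x2 + 0.3*x1*x2 + 0.1*x1**2 + rng.normal(0, 0.05, 200)

# Design matrix for: y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coeffs, 3))  # approximately [0.5, 1.2, -0.7, 0.3, 0.1, 0.0]
```

In a full polynomial network, many such low-order nodes would be composed layer by layer and their structure searched, e.g. by genetic programming, rather than fixed in advance as here.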
Global competition, sluggish economies and the potential offered by emerging technologies have pushed firms to fundamentally rethink their business processes. Business Process Reengineering (BPR) has become recognized as a means to restructure aging bureaucratized processes to achieve the strategic objectives of increased efficiency, reduced costs, improved quality and greater customer satisfaction. Business Process Change: Reengineering Concepts, Methods and Technologies provides extensive coverage of the organizational, managerial and technical concepts related to business process change. Among the topics included in this book are: process change components; enablers of process change; methodologies, techniques and tools; team-based management; and effective adoption of BPR.
Over the past two decades, many advances have been made in the decision support system (DSS) field. They range from progress in fundamental concepts, to improved techniques and methods, to widespread use of commercial software for DSS development. Still, the depth and breadth of the DSS field continue to grow, fueled by the need to better support decision making in a world that is increasingly complex in terms of volume, diversity, and interconnectedness of the knowledge on which decisions can be based. This continuing growth is facilitated by increasing computer power and decreasing per-unit computing costs. But it is spearheaded by the multifaceted efforts of DSS researchers. The collective work of these researchers runs from the speculative to the normative to the descriptive. It includes analysis of what the field needs, designs of means for meeting recognized needs, and implementations for study. It encompasses theoretical, empirical, and applied orientations. It is concerned with the invention of concepts, frameworks, models, and languages for giving varied, helpful perspectives. It involves the discovery of principles, methods, and techniques for expeditious construction of successful DSSs. It aims to create computer-based tools that facilitate DSS development. It assesses DSS efficacy by observing systems, their developers, and their users. This growing body of research continues to be fleshed out and take shape on a strong, but still-developing, skeletal foundation.
In this book, the author traces the origin of the present information technology revolution, the technological features that underlie its impact, and the organizations, companies, and technologies that are governing current and future growth. The book explains how the technology works, how it fits together, how the industry is structured, and what the future might bring.
Taxonomy for the Technology Domain suggests a new classification system that includes literacy, collaboration, decision-making, infusion, integration, and technology. As with most taxonomies, each step offers a progressively more sophisticated level of complexity, constructing increasingly multifaceted objectives that address increasingly complex student learning outcomes. Taxonomy for the Technology Domain affects all aspects of how technology is used in elementary and secondary classrooms, corporate training rooms, and higher education classrooms.
Our understanding of nature is often through nonuniform observations in space or time. In space, one normally observes the important features of an object, such as edges. The less important features are interpolated. History is a collection of important events that are nonuniformly spaced in time. Historians infer between events (interpolation) and politicians and stock market analysts forecast the future from past and present events (extrapolation). The 20 chapters of Nonuniform Sampling: Theory and Practice contain contributions by leading researchers in nonuniform and Shannon sampling, zero crossing, and interpolation theory. Its practical applications include NMR, seismology, speech and image coding, modulation and coding, optimal content, array processing, and digital filter design. It has a tutorial outlook for practising engineers and advanced students in science, engineering, and mathematics. It is also a useful reference for scientists and engineers working in the areas of medical imaging, geophysics, astronomy, biomedical engineering, computer graphics, digital filter design, speech and video processing, and phased array radar. A special feature of the package is a CD-ROM containing C-codes, Matlab and Mathcad programs for the algorithms presented.
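As a toy illustration of the interpolation problem the book addresses (not one of its algorithms), the sketch below resamples a signal observed at irregular times onto a uniform grid with simple piecewise-linear interpolation; the signal, grid sizes, and names are assumptions made for the example, and the book's reconstruction methods are far more sophisticated.

```python
# Toy setup: a sinusoid observed at nonuniform times, reconstructed on a
# uniform grid by piecewise-linear interpolation (np.interp).
import numpy as np

rng = np.random.default_rng(1)
t_nonuniform = np.sort(rng.uniform(0, 1, 40))    # irregular sample times
samples = np.sin(2 * np.pi * 3 * t_nonuniform)   # values observed there

t_uniform = np.linspace(0, 1, 200)               # desired uniform grid
reconstructed = np.interp(t_uniform, t_nonuniform, samples)

error = np.max(np.abs(reconstructed - np.sin(2 * np.pi * 3 * t_uniform)))
print(f"max reconstruction error: {error:.3f}")
```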
Protocols that remain zero-knowledge when many instances are executed concurrently are called concurrent zero-knowledge, and this book is devoted to their study. The book presents constructions of concurrent zero-knowledge protocols, along with proofs of security. It also shows why "traditional" proof techniques (i.e., black-box simulation) are not suitable for establishing the concurrent zero-knowledge property of "message-efficient" protocols.
Mobile devices are the 'it' technology, and everyone wants to know how to apply them to their environments. This book brings together the best examples and insights for implementing mobile technology in libraries. Chapters cover a wide variety of the most important tools and procedures, from developing applications to marketing and augmented reality. Readers of this volume will get complete and timely knowledge of library applications for handheld devices. The Handheld Librarian conferences have been a centrepiece of learning about how to apply mobile technologies to library services and collections, as well as a forum for sharing examples and lessons learned. The conferences have brought our profession forward into the trend and kept us up to date with ongoing advances. This volume brings together the best from that rich story and presents librarians with the basic information they need to successfully make the case for and implement programs leveraging mobile devices in their libraries. The authors of these diverse, practical, and well-researched pieces come from all types of libraries and segments of the profession. This wide representation ensures that front-line librarians, library administrators, systems staff, and even library professors will find this volume perfectly geared to their needs. This book was published as a special issue of The Reference Librarian.
The thesis work was in two major parts: development and testing of a new approach to detecting and …
The design process of digital circuits is often carried out in individual steps, such as logic synthesis, mapping, and routing. Because the complete process was originally too complex, it was split into several more or less independent phases. Over the last 40 years, powerful algorithms have been developed to find optimal solutions for each of these steps. However, the interaction of these different algorithms was not considered for a long time. This leads to quality loss, for example in cases where highly optimized netlists fit badly onto the target architecture. Since the resulting circuits are often far from optimal and insufficient with regard to optimization criteria such as area and delay, several iterations of the complete design process have to be carried out to get high-quality results. This is a very time-consuming and costly process. For this reason, the idea of one-pass synthesis came up some years ago. There are two main approaches to guaranteeing that a design gets "first time right": combining levels that were previously split, e.g. using layout information already during the logic synthesis phase; and restricting the optimization at one level so that it better fits the next one. So far, several approaches in these two directions have been presented, and new techniques are under development. In Towards One-Pass Synthesis we describe the new paradigm used in one-pass synthesis and present examples of the two techniques above. Theoretical and practical aspects are discussed and minimization algorithms are given. This will help people working with synthesis tools and circuit design in general (in industry and academia) to stay informed about recent developments and new trends in this area.
It has long been apparent to academic library administrators that the current technical services operations within libraries need to be redirected and refocused in terms of both format priorities and human resources. A number of developments and directions have made this reorganization imperative, many of which have been accelerated by the current economic crisis. All of the chapters detail some aspect of technical services reorganization due to downsizing and/or reallocation of human resources, retooling professional and support staff for higher-level duties and/or non-MARC metadata, "value-added" metadata opportunities, outsourcing redundant activities, and shifting resources from analog to digital object organization and description. This book will assist both catalogers and library administrators with concrete examples of moving technical services operations and personnel from the analog to the digital environment. This book was published as a special double issue of Cataloging & Classification Quarterly.
Fuzzy Sets in the Management of Uncertainty presents an overview of current problems in business management, primarily for situations involving decision making of an economic-financial nature. The monograph discusses problems of planning, programming, and control, and sheds light on the entire financial network in its three phases: raising funds, analysis, and investment. Special attention is paid to production processes and the marketing of products and services. This monograph is a highly readable overview and introduction for scientists, professionals, graduate students, managers, and consultants in the growing field of applications of fuzzy logic in management.
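As a minimal illustration of the kind of fuzzy arithmetic such economic-financial models rest on (an assumption for this example, not the book's own notation), the sketch below adds two triangular fuzzy numbers representing uncertain cash-flow estimates; addition of triangular fuzzy numbers is component-wise.

```python
# Triangular fuzzy numbers as (pessimistic, most likely, optimistic) triples.
# The figures and names are invented for illustration.
def tfn_add(a, b):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(x + y for x, y in zip(a, b))

revenue = (90, 100, 120)       # uncertain revenue estimate, in k$
other_income = (5, 10, 12)     # uncertain secondary income, in k$
print(tfn_add(revenue, other_income))   # (95, 110, 132)
```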
This book presents the proceedings of the Third International Conference on Electrical Engineering and Control (ICEECA2017). It covers new control system models and troubleshooting tips, and also addresses complex system requirements, such as increased speed, precision and remote capabilities, bridging the gap between the complex, math-heavy controls theory taught in formal courses, and the efficient implementation required in real-world industry settings. Further, it considers both the engineering aspects of signal processing and the practical issues in the broad field of information transmission and novel technologies for communication networks and modern antenna design. This book is intended for researchers, engineers, and advanced postgraduate students in control and electrical engineering, computer science, signal processing, as well as mechanical and chemical engineering.
The growth of mobile technology has caused considerable changes in the way we interact with one another within both personal and business environments. Advancements in mobile computing and mobile multimedia resonate with engineers, strategists, developers, and managers while also determining the behavior and interaction of end users. Advancing the Next-Generation of Mobile Computing: Emerging Technologies offers historical perspectives on mobile computing, as well as new frameworks and methodologies for mobile networks, intelligent mobile applications, and mobile computing applications. This collection of research aims to inform researchers, designers, and users of mobile technology and promote awareness of new trends and tools in this growing field of study.
This book takes a formal approach to teaching software engineering, using not only UML but also the Object Constraint Language (OCL) for specification and analysis of designed models. Employing technical details typically missing from existing textbooks on software engineering, the author shows how precise specifications lead to static verification of software systems. In addition, data management is given the attention required to produce a successful software project.

- Uses constraints in all phases of software development
- Follows recent developments in software technologies
- Technical coverage of data management issues and software verification
- Illustrated throughout to present analysis, specification, implementation, and verification of multiple applications
- Includes end-of-chapter exercises and Instructor Presentation Slides
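To give a flavor of what an OCL constraint expresses, the hedged sketch below mirrors the invariant `context Account inv: self.balance >= 0` as a runtime check in Python; the `Account` class and its fields are hypothetical, and the book's approach is static verification of such constraints, not runtime assertions.

```python
# Hypothetical example: a class invariant in the spirit of OCL, checked at
# runtime. OCL form: context Account inv: self.balance >= 0
from dataclasses import dataclass

@dataclass
class Account:
    balance: float = 0.0

    def _check_invariant(self) -> None:
        # mirrors the OCL invariant self.balance >= 0
        assert self.balance >= 0, "invariant violated: balance must be non-negative"

    def withdraw(self, amount: float) -> None:
        self.balance -= amount
        self._check_invariant()

acct = Account(balance=100.0)
acct.withdraw(30.0)
print(acct.balance)        # 70.0
# acct.withdraw(200.0)     # would raise AssertionError
```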
This book describes the emerging practice of e-mail tutoring: one-to-one correspondence between college students and writing tutors conducted over electronic mail. It reviews the history of Composition Studies, paying special attention to the ways in which writing centers, and computers and composition, have previously been hailed within a narrative of functional literacy and quick-fix solutions. The author suggests a new methodology for tutoring, and a new mandate for the writing center: a strong connection between the rhythms of extended, asynchronous writing and dialogic literacy. The electronic writing center can become a site for informed resistance to functional literacy.
System-Level Synthesis deals with the concurrent design of electronic applications, including both hardware and software. The issue has become the bottleneck in the design of electronic systems in several major industrial fields, including telecommunications, automotive, and aerospace engineering. The major difficulty with the subject is that it demands contributions from several research fields, including system specification, system architecture, hardware design, and software design. Most existing books cover only a few aspects of system-level synthesis well. This volume presents a comprehensive discussion of all aspects of system-level synthesis, with each topic covered by a contribution written by an international authority on the subject.
Although the origins of parallel computing go back to the last century, it was only in the 1970s that parallel and vector computers became available to the scientific community. The first of these machines (the 64-processor Illiac IV and the vector computers built by Texas Instruments, Control Data Corporation, and then Cray Research Corporation) had a somewhat limited impact. They were few in number and available mostly to workers in a few government laboratories. By now, however, the trickle has become a flood. There are over 200 large-scale vector computers now installed, not only in government laboratories but also in universities and in an increasing diversity of industries. Moreover, the National Science Foundation's Supercomputing Centers have made large vector computers widely available to the academic community. In addition, smaller, very cost-effective vector computers are being manufactured by a number of companies. Parallelism in computers has also progressed rapidly. The largest supercomputers now consist of several vector processors working in parallel. Although the number of processors in such machines is still relatively small (up to 8), it is expected that an increasing number of processors will be added in the near future (to a total of 16 or 32). Moreover, there are a myriad of research projects to build machines with hundreds, thousands, or even more processors. Indeed, several companies are now selling parallel machines, some with as many as hundreds, or even tens of thousands, of processors.
This book provides systematic solutions ranging from formal test theory to automated test description methods and the automated construction of simulation test environments, and it verifies the effectiveness of the theories, technologies, and methods presented.
The size of technically producible integrated circuits increases continuously, but the ability to design and verify these circuits does not keep up. Today's design flow therefore has to be improved. Taking a visionary approach, this book analyzes the current design and verification methodologies, identifies a number of deficiencies, and suggests solutions. Improvements in the methodology as well as in the underlying algorithms are proposed.
'Symbolic Boolean manipulation using binary decision diagrams (BDDs) has been successfully applied to a wide variety of tasks, particularly in very large scale integration (VLSI) computer-aided design (CAD). The concept of decision graphs as an abstract representation of Boolean functions dates back to the early work by Lee and Akers. In the last ten years, BDDs have found widespread use as a concrete data structure for symbolic Boolean manipulation. With BDDs, functions can be constructed, manipulated, and compared by simple and efficient graph algorithms. Since Boolean functions can represent not just digital circuit functions, but also such mathematical domains as sets and relations, a wide variety of CAD problems can be solved using BDDs. Binary Decision Diagrams and Applications for VLSI CAD provides valuable information for both those who are new to BDDs and long-time aficionados.' - from the Foreword by Randal E. Bryant. 'Over the past ten years ... BDDs have attracted the attention of many researchers because of their suitability for representing Boolean functions. They are now widely used in many practical VLSI CAD systems. ... this book can serve as an introduction to BDD techniques and ... it presents several new ideas on BDDs and their applications. ... many computer scientists and engineers will be interested in this book, since Boolean function manipulation is a fundamental technique not only in digital system design but also in exploring various problems in computer science.' - from the Preface by Shin-ichi Minato.
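To make the "compared by simple and efficient graph algorithms" point concrete, here is a minimal sketch (not from the book) of a reduced, ordered BDD built with a unique table (hash-consing) under an assumed fixed variable order; with canonical nodes, two Boolean functions are equivalent exactly when they are the same node.

```python
# Minimal reduced, ordered BDD with hash-consing. Illustrative only: real BDD
# packages build nodes via ITE/apply rather than enumerating assignments.
TRUE, FALSE = ("1",), ("0",)    # terminal nodes (module-level singletons)
_unique = {}                    # unique table: (var, low, high) -> node

def mk(var, low, high):
    if low is high:             # reduction rule: both branches equal, drop test
        return low
    return _unique.setdefault((var, low, high), (var, low, high))

def build(f, n, i=0, env=()):
    """Build the BDD of Boolean function f over n variables by Shannon expansion."""
    if i == n:
        return TRUE if f(*env) else FALSE
    low = build(f, n, i + 1, env + (False,))   # cofactor with x_i = 0
    high = build(f, n, i + 1, env + (True,))   # cofactor with x_i = 1
    return mk(i, low, high)

# Two syntactically different but logically equal functions share one node:
f1 = build(lambda a, b, c: (a and b) or c, 3)
f2 = build(lambda a, b, c: not (not (a and b) and not c), 3)
print(f1 is f2)   # True: the equivalence check is a pointer comparison
```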
Whether you are an experienced security or system administrator or a newbie to the industry, you will learn how to use native, "out-of-the-box" operating system capabilities to secure your UNIX environment. There is no need for third-party software or freeware tools to become and stay secure! This book will help you ensure that your system is protected from unauthorized users, and show you how to conduct intrusion traces to identify the intruders if a break-in does occur. It provides practical information on using the native OS security capabilities without the need for a third-party security application. Also included are hundreds of security tips, tricks, ready-to-use scripts, and configuration files that will be a valuable resource in your endeavor to secure your UNIX systems.
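In the same spirit of relying only on what ships with the system, here is a hedged sketch (not one of the book's scripts) that audits a directory tree for world-writable regular files using nothing but the Python standard library; the path and function name are assumptions for the example.

```python
# Illustrative audit: find world-writable regular files under a directory,
# using only stdlib facilities (os.walk, os.lstat, stat flags).
import os
import stat

def world_writable(root="."):
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.lstat(path).st_mode
            except OSError:
                continue                         # unreadable entry: skip it
            if stat.S_ISREG(mode) and mode & stat.S_IWOTH:
                yield path

for risky in world_writable("/tmp"):
    print("world-writable:", risky)
```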
The evaluation of IT and its business value has recently been the subject of many academic and business discussions, as business managers, management consultants, and researchers regularly question whether and how the contribution of IT to business performance can be evaluated effectively. Investments in IT are growing extensively, and business managers worry that the benefits of IT investments might not be as high as expected. This phenomenon is often called the IT investment paradox or the IT Black Hole: large sums are invested in IT that seem to be swallowed by a large black hole without rendering many returns. Information Systems Evaluation Management discusses these issues, among others, through its presentation of the most current research in the field of IS evaluation. It is an area of study that touches upon a variety of businesses and organizations; essentially, all those who involve IT in their business practices.
This book focuses on aspects related to the parallelization of evolutionary computation, such as parallel genetic operators, parallel fitness evaluation, distributed genetic algorithms, and parallel hardware implementations, as well as on their impact on several applications. It offers a wide spectrum of sample works from leading research on parallel implementations of efficient techniques at the heart of computational intelligence.
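As a minimal sketch of one of the techniques named above, parallel fitness evaluation, the snippet below scores a population across worker processes with multiprocessing.Pool; the OneMax fitness function and bit-string encoding are assumptions chosen for the example, not taken from the book.

```python
# Hedged sketch of parallel fitness evaluation: fan the scoring of a
# population out to worker processes. Encoding and objective are invented.
import random
from multiprocessing import Pool

def fitness(individual):
    # toy objective (OneMax): maximize the number of 1-bits
    return sum(individual)

if __name__ == "__main__":
    random.seed(0)
    population = [[random.randint(0, 1) for _ in range(64)] for _ in range(100)]
    with Pool() as pool:
        scores = pool.map(fitness, population)   # evaluated in parallel
    best_score, _best_individual = max(zip(scores, population))
    print("best fitness:", best_score)
```

In a full parallel genetic algorithm, this evaluation step would sit inside the generation loop, with selection, crossover, and mutation applied between evaluations.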
Analog Design Issues in Digital VLSI Circuits and Systems brings together in one place important contributions and up-to-date research results in this fast-moving area. It serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
You may like...

- Noise and Vibration Mitigation for Rail… by Tatsuo Maeda, Pierre-Etienne Gautier, … (Hardcover, R5,929)
- Foundations and Methods in Combinatorial… by Israel Cesar Lerman (Hardcover, R4,488)
- Field and Service Robotics - Results of… by Genya Ishigami, Kazuya Yoshida (Hardcover, R7,138)
- Geometric Complex Analysis - In Honor of… by Jisoo Byun, Hong Rae Cho, … (Hardcover, R4,402)
- Modern Differential Geometry of Curves… by Alfred Gray, Elsa Abbena, … (Hardcover, R4,583)
- The Classification of the Finite Simple… by Inna Capdeboscq, Daniel Gorenstein, … (Paperback, R2,661)