Fluids play an important role in environmental systems, appearing as surface water in rivers, lakes, and coastal regions, in the subsurface, and in the atmosphere. The mechanics of environmental fluids is concerned with fluid motion and the associated mass and heat transport, as well as with deformation processes in subsurface systems. This textbook describes the fundamental modelling approaches, based on continuum mechanics, for fluids in the environment, including porous media and turbulence. Numerical methods for solving the governing equations, together with their object-oriented computer implementation, are discussed and illustrated with examples. Finally, the application of computer models in civil and environmental engineering is demonstrated.
This book is the essential guide for any student undertaking a computing/IS project, and will give you everything you need to achieve outstanding results. Undertaking a project is a key component of nearly all computing/information systems degree programmes at both undergraduate and postgraduate levels. Projects in Computing and Information Systems covers the four key aspects of project work (planning, conducting, presenting and taking the project further) in chronological fashion, and provides the reader with the skills to excel.
There has been a common perception that computational complexity is
a theory of "bad news" because its most typical results assert that
various real-world and innocent-looking tasks are infeasible. In
fact, "bad news" is a relative term, and, indeed, in some
situations (e.g., in cryptography), we want an adversary to not be
able to perform a certain task. However, a "bad news" result does
not automatically become useful in such a scenario. For this to
happen, its hardness features have to be quantitatively evaluated
and shown to manifest extensively.
Software design is becoming increasingly complex and difficult as we move to applications that support people interacting with information and with each other over networks. Computer supported cooperative work applications are a typical example of this. The problems to be solved are no longer just technical; they are also social: how do we build systems that meet the real needs of the people who are asked to use them and that fit into their contexts of use? We can characterise these as wicked problems, where our traditional software engineering techniques for understanding requirements and driving these through into design are no longer adequate. This book presents the Locales Framework - and its five aspects of locale foundations, civic structures, individual views, interaction trajectory and mutuality - as a way of dealing with the intertwined problem-solution space of wicked problems. A locale is based on a metaphor of place as the lived relationship between people and the spaces and resources they use in their interactions. The Locales Framework provides a coherent mediating framework for ethnographers, designers, and software engineers to facilitate both understanding the requirements of complex social situations and designing solutions to support these situations in all their complexity.
This book presents an updated selection of the most representative contributions to the 2nd and 3rd IEEE Workshops on Signal Propagation on Interconnects (SPI), which were held in Travemünde (Baltic Sea), Germany, May 13-15, 1998, and in Titisee-Neustadt (Black Forest), Germany, May 19-21, 1999. Interconnects in VLSI Design addresses the needs of developers and researchers in the field of VLSI chip and package design. It offers a survey of current problems regarding the influence of interconnect effects on the electrical performance of electronic circuits and suggests innovative solutions. In this sense Interconnects in VLSI Design represents a continuation of and a supplement to the first book, Signal Propagation on Interconnects, Kluwer Academic Publishers, 1998. The papers in Interconnects in VLSI Design cover a wide area of research directions. Apart from describing general trends they deal with the solution of signal integrity problems, the modeling of interconnects, parameter extraction using calculations and measurements and, last but not least, actual problems in the field of optical interconnects.
This book presents state-of-the-art developments in the area of computationally intelligent methods applied to various aspects and ways of Web exploration and Web mining. Some novel data mining algorithms that can lead to more effective and intelligent Web-based systems are also described. Scientists, engineers, and research students can expect to find many inspiring ideas in this volume.
As a progressive field of study, end-user computing is continually becoming a significant focus area for businesses, since refining end-user practices to enhance their productivity contributes greatly to positioning organizations for strategic and competitive advantage in the global economy. "Evolutionary Concepts in End User Productivity and Performance: Applications for Organizational Progress" represents the most current investigations into a wide range of end-user computing issues. This book enhances the field with new insights useful for researchers, educators, and professionals in the end-user domain.
Queueing network models have been widely applied as a powerful tool for modelling, performance evaluation, and prediction of discrete flow systems, such as computer systems, communication networks, production lines, and manufacturing systems. Queueing network models with finite capacity queues and blocking have been introduced and applied as even more realistic models of systems with finite capacity resources and with population constraints. In recent years, research in this field has grown rapidly. Analysis of Queueing Networks with Blocking introduces queueing network models with finite capacity and various types of blocking mechanisms. It gives a comprehensive definition of the analytical model underlying these blocking queueing networks. It surveys exact and approximate analytical solution methods and algorithms and their relevant properties. It also presents various application examples of queueing networks to model computer systems and communication networks. This book is organized in three parts. Part I introduces queueing networks with blocking and various application examples. Part II deals with exact and approximate analysis of queueing networks with blocking and the condition under which the various techniques can be applied. Part III presents a review of various properties of networks with blocking, describing several equivalence properties both between networks with and without blocking and between different blocking types. Approximate solution methods for the buffer allocation problem are presented.
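For a single finite-capacity queue, the simplest case of the finite-buffer models this book surveys, the probability that an arrival is blocked has a closed form. A minimal sketch (the parameter values are arbitrary illustrations, not taken from the book):

```python
# Blocking probability of an M/M/1/K queue: Poisson arrivals at rate lam,
# exponential service at rate mu, and room for at most K customers.
# An arrival that finds the system full is blocked (lost).

def mm1k_blocking(lam: float, mu: float, K: int) -> float:
    """Probability that an arriving customer finds all K places occupied."""
    rho = lam / mu
    if rho == 1.0:
        # Uniform stationary distribution over the K + 1 states
        return 1.0 / (K + 1)
    # Stationary probability of state K (truncated geometric distribution)
    return (1 - rho) * rho**K / (1 - rho**(K + 1))

# Heavier load or a smaller buffer increases blocking.
print(mm1k_blocking(lam=0.9, mu=1.0, K=5))   # approx. 0.126
print(mm1k_blocking(lam=0.9, mu=1.0, K=10))  # approx. 0.051
```

In a network of such queues, blocking at one node couples the nodes' state spaces, which is exactly why the exact and approximate solution methods surveyed in Parts II and III are needed.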
Information communication technologies in human services are of increasing interest and concern to health and welfare educators, managers, and practitioners due to their useful information management and teaching capabilities. "Information Communication Technologies for Human Services Education and Delivery: Concepts and Cases" significantly contributes to the growing area of ICT application in education and human service delivery. Containing expert international contributions, this Premier Reference Source showcases innovative practices, addresses ethical and logistic concerns, and provides relevant theoretical frameworks and the latest empirical research findings.
A fundamental understanding of algorithmic bioprocesses is key to learning how information processing occurs in nature at the cell level. The field is concerned with the interactions between computer science on the one hand and biology, chemistry, and DNA-oriented nanoscience on the other. In particular, this book offers a comprehensive overview of research into algorithmic self-assembly, RNA folding, the algorithmic foundations for biochemical reactions, and the algorithmic nature of developmental processes. The editors of the book invited 36 chapters, written by the leading researchers in this area, and their contributions include detailed tutorials on the main topics, surveys of the state of the art in research, experimental results, and discussions of specific research goals. The main subjects addressed are sequence discovery, generation, and analysis; nanoconstructions and self-assembly; membrane computing; formal models and analysis; process calculi and automata; biochemical reactions; and other topics from natural computing, including molecular evolution, regulation of gene expression, light-based computing, cellular automata, realistic modelling of biological systems, and evolutionary computing. This subject is inherently interdisciplinary, and this book will be of value to researchers in computer science and biology who study the impact of the exciting mutual interaction between our understanding of bioprocesses and our understanding of computation.
According to the Semiconductor Industry Association's 1999 International Technology Roadmap for Semiconductors, by the year 2008 the integration of more than 500 million transistors will be possible on a single chip. Integrating transistors on silicon will depend increasingly on design reuse. Design reuse techniques have become the subject of books, conferences, and podium discussions over the last few years. However, most discussions focus on higher-level abstraction like RTL descriptions, which can be synthesized. Design reuse is often seen as an add-on to normal design activity, or a special design task that is not an integrated part of the existing design flow. This may all be true for the ASIC world, but not for high-speed, high-performance microprocessors. In the field of high-speed microprocessors, design reuse is an
integrated part of the design flow. The method of choice in this
demanding field was, and still is, physical design reuse at the
layout level. In the past, the practical implementations of this
method were linear shrinks and the lambda approach. With the
scaling of process technology down to 0.18 micron and below, this
approach lost steam and became inefficient. Automatic Layout Modification, Including Design Reuse of the Alpha CPU in 0.13 Micron SOI Technology is a welcome effort to improve some of the practices in chip design today. It is a comprehensive reference work on automatic layout modification which will be valuable to VLSI courses at universities, and to CAD and circuit engineers and engineering managers.
This book presents the proceedings of the 8th International Symposium "Information Fusion and Intelligent Geographic Information Systems 2017" (IF&GIS'2017), which took place at Shanghai Maritime University, China, from May 10 to 12, 2017. The goal of the symposium was to bring together leading global experts in the field of spatial information integration and intelligent GIS (IGIS) to exchange cutting-edge research ideas and experiences, and to discuss perspectives on the fast-paced development of geospatial information theory, methods and models, in order to demonstrate the latest advances in IGIS and discover new ways of collaboration. The topics focus on IGIS fundamentals, models, technologies and services in maritime research, such as underwater acoustics, radiolocation, navigation, marine energy, logistics, environmental management, seafood, safety of maritime navigation and others. In addition, the book discusses the integration of IGIS technologies in the emerging field of digital humanities research.
This third volume of the Handbook of Formal Languages discusses language theory beyond linear or string models: trees, graphs, grids, pictures, computer graphics. Many chapters offer an authoritative self-contained exposition of an entire area. Special emphasis is on interconnections with logic.
Business-to-business (B2B) integration is a buzzword which has been used a lot in recent years, with a variety of meanings. Starting with a clear technical definition of this term and its relation to topics like A2A (Application-to-Application), ASP (Application Service Provider), and B2C (Business-to-Consumer), Christoph Bussler outlines a complete and consistent B2B integration architecture based on a coherent conceptual model. He shows that B2B integration not only requires the exchange of business events between distributed trading partners across networks like the Internet, but also demands back-end application integration within business processes, and thus goes far beyond traditional approaches to enterprise application integration. His detailed presentation describes how B2B integration standards like RosettaNet or SWIFT, the application integration standard J2EE Connector Architecture, and basic standards like XML act together to enable business process integration. The book is the first of its kind that discusses B2B concepts and architectures independent of specific and short-term industrial or academic approaches, and thus provides solid and long-lasting knowledge for researchers, students, and professionals interested in the field of B2B integration.
This lively and fascinating text traces the key developments in computation - from 3000 B.C. to the present day - in an easy-to-follow and concise manner. Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, exercises, and a glossary; presents detailed information on major figures in computing, such as Boole, Babbage, Shannon, Turing, Zuse and Von Neumann; reviews the history of software engineering and of programming languages, including syntax and semantics; discusses the progress of artificial intelligence, with extension to such key disciplines as philosophy, psychology, linguistics, neural networks and cybernetics; examines the impact on society of the introduction of the personal computer, the World Wide Web, and the development of mobile phone technology; follows the evolution of a number of major technology companies, including IBM, Microsoft and Apple.
Volume 55 covers some particularly hot topics. Linda Harasim writes
about education and the Web in "The Virtual University: A State of
the Art." She discusses the issues that will need to be addressed
if online education is to live up to expectations. Neville Holmes
covers a related subject in his chapter "The Net, the Web, and the
Children." He argues that the Web is an evolutionary, rather than
revolutionary, development and highlights the division between the
rich and the poor within and across nations. Continuing the WWW
theme, George Mihaila, Louiqa Raschid, and Maria-Esther Vidal look
at the problems of using the Web and finding the information you
want.
CAO (computer-assisted ordering) is one of the most misunderstood and underutilized weapons available to retailers today. International consultant Barbara Anderson makes clear that only in a limited sense does CAO replace manual ordering. In its full sense it is much more: the optimization of manufacturer, supplier, and retailer distribution to the retail store, based on consumer and store data and corporate policy. Anderson thus provides a framework and checklist for implementing CAO, an understanding of key terminology, solutions to likely problems, and ways to make CAO implementation successful, and in doing so she covers the full spectrum of retailing. A readable, easily grasped, comprehensive, unique book for retailing management and for their colleagues teaching it in colleges and universities. Anderson points out that CAO is not an off-the-shelf system but an ongoing project, each phase with its own unique set of benefits and cost justification. Retail systems must support a vision where a product may bypass the store on the way to the consumer, or even the distribution center on the way to the stores. Consumers have a wide range of choices, not only of where to shop but of how to shop, and this demands ever greater levels of service. CAO systems help assure that the correct product is available at the store, that it can be located throughout the supply chain, and that it can be moved easily from any location. In CAO, all levels of operation work with real-time information, using decision-making tools that react to and learn from new information. Her book thus shows that there is no one right system, product, or approach for successful CAO; that it is too big a leap to make in one step, consisting instead of modules and functions that can grow in sophistication over time; and that not all retailers, nor all categories within one retailer, will use the same methods for forecasting and ordering.
She also shows that the distinct separation of replenishment product from planning product is artificially imposed, and that the separation of headquarters from stores is also artificial. Indeed, integration does not mean the integration of separate systems, but rather of the business functions themselves. Readers will thus get not only a knowledgeable discussion of what CAO should be, what it is, and how it works, but an immediately useful understanding of how to make it work in their own companies.
Rule-based evolutionary online learning systems, often referred to as Michigan-style learning classifier systems (LCSs), were proposed nearly thirty years ago (Holland, 1976; Holland, 1977), originally under the name cognitive systems. LCSs combine the strength of reinforcement learning with the generalization capabilities of genetic algorithms, promising a flexible, online generalizing, solely reinforcement-dependent learning system. However, despite several initial successful applications of LCSs and their interesting relations with animal learning and cognition, understanding of the systems remained somewhat obscured. Questions concerning learning complexity or convergence remained unanswered. Performance in different problem types, problem structures, concept spaces, and hypothesis spaces stayed nearly unpredictable. This book has the following three major objectives: (1) to establish a facetwise theory approach for LCSs that promotes system analysis, understanding, and design; (2) to analyze, evaluate, and enhance the XCS classifier system (Wilson, 1995) by means of the facetwise approach, establishing a fundamental XCS learning theory; (3) to identify both the major advantages of an LCS-based learning approach and the most promising potential application areas. Achieving these three objectives leads to a rigorous understanding of LCS functioning that enables the successful application of LCSs to diverse problem types and problem domains. The quantitative analysis of XCS shows that the interactive, evolutionary-based online learning mechanism is competitive with other machine learning approaches, yielding a low-order polynomial learning complexity. Moreover, the facetwise analysis approach facilitates the successful design of more advanced LCSs, including Holland's originally envisioned cognitive systems. Martin V.
History of the Book: The last three decades have witnessed an explosive development in integrated circuit fabrication technologies. The complexities of current CMOS circuits are reaching beyond the 100 nanometer feature size and multi-hundred million transistors per integrated circuit. To fully exploit this technological potential, circuit designers use sophisticated Computer-Aided Design (CAD) tools. While supporting the talents of innumerable microelectronics engineers, these CAD tools have become the enabling factor responsible for the successful design and implementation of thousands of high performance, large scale integrated circuits. This research monograph originated from a body of doctoral dissertation research completed by the first author at the University of Rochester from 1994 to 1999 under the supervision of Prof. Eby G. Friedman. This research focuses on issues in the design of the clock distribution network in large scale, high performance digital synchronous circuits, and particularly on algorithms for non-zero clock skew scheduling. During the development of this research, it has become clear that incorporating timing issues into the successful integrated circuit design process is of fundamental importance, particularly in that advanced theoretical developments in this area have been slow to reach the designers' desktops.
This book presents scientific metrics and their applications for approaching scientific findings in the fields of physics, economics and scientometrics. Based on a collection of the author's publications in these fields, the book reveals the profound links between the measures and the findings in the natural laws, from micro-particles to macro-cosmos, in the economic rules of human society, and in the core knowledge among mass information. With this book the readers can gain insights and ideas on addressing the questions of how to measure the physical world, economic processes and human knowledge from the perspective of scientific metrics. The book is also useful to scientists, particularly to specialists in physics, economics and scientometrics, for promoting and stimulating their creative ideas based on scientific metrics.
For almost four decades, Software Engineering: A Practitioner's Approach (SEPA) has been the world's leading textbook in software engineering. The ninth edition represents a major restructuring and update of previous editions, solidifying the book's position as the most comprehensive guide to this important subject.
This book is an introduction to the fundamental concepts and tools needed for solving problems of a geometric nature using a computer. It attempts to fill the gap between standard geometry books, which are primarily theoretical, and applied books on computer graphics, computer vision, robotics, or machine learning. This book covers the following topics: affine geometry, projective geometry, Euclidean geometry, convex sets, SVD and principal component analysis, manifolds and Lie groups, quadratic optimization, basics of differential geometry, and a glimpse of computational geometry (Voronoi diagrams and Delaunay triangulations). Some practical applications of the concepts presented in this book include computer vision, more specifically contour grouping, motion interpolation, and robot kinematics. In this extensively updated second edition, more material on convex sets, Farkas's lemma, quadratic optimization and the Schur complement has been added. The chapter on SVD has been greatly expanded and now includes a presentation of PCA. The book is well illustrated and has chapter summaries and a large number of exercises throughout. It will be of interest to a wide audience including computer scientists, mathematicians, and engineers. Reviews of the first edition: "Gallier's book will be a useful source for anyone interested in applications of geometrical methods to solve problems that arise in various branches of engineering. It may help to develop the sophisticated concepts from the more advanced parts of geometry into useful tools for applications." (Mathematical Reviews, 2001) "...it will be useful as a reference book for postgraduates wishing to find the connection between their current problem and the underlying geometry." (The Australian Mathematical Society, 2001)
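The SVD-to-PCA connection highlighted in the expanded chapter can be sketched in a few lines: center the data, take the SVD, and the rows of V^T are the principal directions. A minimal illustration using NumPy (the synthetic data and function name are this sketch's own, not the book's):

```python
# PCA via SVD: for centered data Xc = U S V^T, the rows of V^T are the
# principal directions and Xc projected onto the top k of them gives the
# k-dimensional PCA representation.
import numpy as np

def pca(X: np.ndarray, k: int):
    """Return the top-k principal directions and the projected data."""
    Xc = X - X.mean(axis=0)                 # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                     # rows: orthonormal directions
    return components, Xc @ components.T    # scores in the k-dim subspace

# Synthetic data with one dominant, one medium, one tiny variance direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ np.diag([3.0, 1.0, 0.1])
comps, scores = pca(X, 2)
print(comps.shape, scores.shape)  # (2, 3) (100, 2)
```

Because the singular values come out sorted, the first k right-singular vectors are exactly the directions of maximal variance, which is the link between the SVD and PCA chapters.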
Auctions have long been a popular method for allocation and procurement of products and services. Traditional auctions are constrained by time, place, number of bidders, number of bids, and the bidding experience. With the advent of internet communication technologies, the online auction environment has blossomed to support a bustling enterprise. Up until this time, the functional inner workings of these online exchange mechanisms have only been described using anecdotal accounts. Best Practices for Online Procurement Auctions offers a systematic approach to auction examination that will become invaluable to both practitioners and researchers alike.