The growing commercial market of the microwave/millimeter wave industry over the past decade has led to an explosion of interest and opportunities for the design and development of microwave components. The design of most microwave components requires the use of commercially available electromagnetic (EM) simulation tools for their analysis. In the design process, simulations are carried out by varying the design parameters until the desired response is obtained. The optimization of design parameters by manual searching is a cumbersome and time-consuming process. Soft computing methods such as Genetic Algorithms (GA), Artificial Neural Networks (ANN), and Fuzzy Logic (FL) have been widely used by EM researchers for microwave design over the last decade. The aim of these methods is to tolerate imprecision, uncertainty, and approximation to achieve robust, low-cost solutions in a small time frame. Modeling and optimization are essential parts of, and powerful tools for, microwave/millimeter wave design. This book deals with the development and use of soft computing methods for tackling challenging design problems in the microwave/millimeter wave domain. The aim in developing these methods is to obtain the design in a small time frame while improving its accuracy for a wide range of applications. To achieve this goal, a few diverse design problems of the microwave field, representing varied design challenges, such as different microstrip antennas, microwave filters, a microstrip via, and also some critical high-power components such as nonlinear tapers and RF windows, are considered as case-study design problems. Different design methodologies are developed for these applications. The book presents soft computing methods, a review of their use for microwave/millimeter wave design problems, and specific case-study problems to give better insight into and understanding of the subject.
Fluids play an important role in environmental systems, appearing as surface water in rivers, lakes, and coastal regions, in the subsurface, and in the atmosphere. The mechanics of environmental fluids is concerned with fluid motion and the associated mass and heat transport, in addition to deformation processes in subsurface systems. This textbook describes the fundamental modelling approaches, based on continuum mechanics, for fluids in the environment, including porous media and turbulence. Numerical methods for solving the governing equations of these processes, and their object-oriented computer implementation, are discussed and illustrated with examples. Finally, the application of computer models in civil and environmental engineering is demonstrated.
This book presents the essential background for understanding semantic theories of mood. Mood as a category is widely used in the description of languages and the formal analysis of their grammatical properties. It typically refers to the features of a sentence (individual morphemes or grammatical patterns) that reflect how the sentence contributes to the modal meaning of a larger phrase, or that indicate the type of fundamental pragmatic function it has in conversation. In this volume, Paul Portner discusses the most significant semantic theories relating to the two main subtypes of mood: verbal mood, including the categories of indicative and subjunctive subordinate clauses, and sentence mood, encompassing declaratives, interrogatives, and imperatives. He evaluates those theories, compares them, and draws connections between seemingly disparate approaches, and he formalizes some of the literature's most important ideas in new ways in order to draw out their most significant insights. Ultimately, this work shows that there are crucial connections between verbal mood and sentence mood which point the way towards a more general understanding of how mood works and its relation to other topics in linguistics; it also outlines the type of semantic and pragmatic theory which will make it possible to explain these relations. The book will be a valuable resource for researchers and students from advanced undergraduate level upwards in the fields of semantics and pragmatics, philosophy, computer science, and psychology.
As a progressive field of study, end-user computing is continually becoming a significant focus area for businesses, since refining end-user practices to enhance their productivity contributes greatly to positioning organizations for strategic and competitive advantage in the global economy. "Evolutionary Concepts in End User Productivity and Performance: Applications for Organizational Progress" represents the most current investigations into a wide range of end-user computing issues. This book enhances the field with new insights useful for researchers, educators, and professionals in the end-user domain.
Queueing network models have been widely applied as a powerful tool for modelling, performance evaluation, and prediction of discrete flow systems, such as computer systems, communication networks, production lines, and manufacturing systems. Queueing network models with finite capacity queues and blocking have been introduced and applied as even more realistic models of systems with finite capacity resources and with population constraints. In recent years, research in this field has grown rapidly. Analysis of Queueing Networks with Blocking introduces queueing network models with finite capacity and various types of blocking mechanisms. It gives a comprehensive definition of the analytical model underlying these blocking queueing networks. It surveys exact and approximate analytical solution methods and algorithms and their relevant properties. It also presents various application examples of queueing networks to model computer systems and communication networks. This book is organized in three parts. Part I introduces queueing networks with blocking and various application examples. Part II deals with exact and approximate analysis of queueing networks with blocking and the condition under which the various techniques can be applied. Part III presents a review of various properties of networks with blocking, describing several equivalence properties both between networks with and without blocking and between different blocking types. Approximate solution methods for the buffer allocation problem are presented.
This title provides a survey on approaches to information systems supporting sustainable development in the private or public sector. It also documents and encourages the first steps of environmental information processing towards this more comprehensive goal.
According to the Semiconductor Industry Association's 1999 International Technology Roadmap for Semiconductors, by the year 2008 the integration of more than 500 million transistors will be possible on a single chip. Integrating transistors on silicon will depend increasingly on design reuse. Design reuse techniques have become the subject of books, conferences, and podium discussions over the last few years. However, most discussions focus on higher-level abstractions like RTL descriptions, which can be synthesized. Design reuse is often seen as an add-on to normal design activity, or a special design task that is not an integrated part of the existing design flow. This may all be true for the ASIC world, but not for high-speed, high-performance microprocessors. In the field of high-speed microprocessors, design reuse is an integrated part of the design flow. The method of choice in this demanding field was, and remains, physical design reuse at the layout level. In the past, the practical implementations of this method were linear shrinks and the lambda approach. With the scaling of process technology down to 0.18 micron and below, this approach lost steam and became inefficient. Automatic Layout Modification, Including Design Reuse of the Alpha CPU in 0.13 Micron SOI Technology is a welcome effort toward improving some of the practices in chip design today. It is a comprehensive reference work on automatic layout modification which will be valuable for VLSI courses at universities, and to CAD and circuit engineers and engineering managers.
Agda is an advanced programming language based on Type Theory. Agda's type system is expressive enough to support full functional verification of programs, in two styles. In external verification, we write pure functional programs and then write proofs of properties about them. The proofs are separate external artifacts, typically using structural induction. In internal verification, we specify properties of programs through rich types for the programs themselves. This often necessitates including proofs inside code, to show the type checker that the specified properties hold. The power to prove properties of programs in these two styles is a profound addition to the practice of programming, giving programmers the power to guarantee the absence of bugs, and thus improve the quality of software more than previously possible. Verified Functional Programming in Agda is the first book to provide a systematic exposition of external and internal verification in Agda, suitable for undergraduate students of Computer Science. No familiarity with functional programming or computer-checked proofs is presupposed. The book begins with an introduction to functional programming through familiar examples like booleans, natural numbers, and lists, and techniques for external verification. Internal verification is considered through the examples of vectors, binary search trees, and Braun trees. More advanced material on type-level computation, explicit reasoning about termination, and normalization by evaluation is also included. The book also includes a medium-sized case study on Huffman encoding and decoding.
This book presents an updated selection of the most representative contributions to the 2nd and 3rd IEEE Workshops on Signal Propagation on Interconnects (SPI), which were held in Travemünde (Baltic Sea), Germany, May 13-15, 1998, and in Titisee-Neustadt (Black Forest), Germany, May 19-21, 1999. Interconnects in VLSI Design addresses the needs of developers and researchers in the field of VLSI chip and package design. It offers a survey of current problems regarding the influence of interconnect effects on the electrical performance of electronic circuits and suggests innovative solutions. In this sense Interconnects in VLSI Design represents a continuation of and a supplement to the first book, Signal Propagation on Interconnects, Kluwer Academic Publishers, 1998. The papers in Interconnects in VLSI Design cover a wide area of research directions. Apart from describing general trends they deal with the solution of signal integrity problems, the modeling of interconnects, parameter extraction using calculations and measurements and, last but not least, actual problems in the field of optical interconnects.
Software design is becoming increasingly complex and difficult as we move to applications that support people interacting with information and with each other over networks. Computer supported cooperative work applications are a typical example of this. The problems to be solved are no longer just technical; they are also social: how do we build systems that meet the real needs of the people who are asked to use them and that fit into their contexts of use? We can characterise these as wicked problems, where our traditional software engineering techniques for understanding requirements and driving these through into design are no longer adequate. This book presents the Locales Framework, and its five aspects of locale foundations, civic structures, individual views, interaction trajectory, and mutuality, as a way of dealing with the intertwined problem-solution space of wicked problems. A locale is based on a metaphor of place as the lived relationship between people and the spaces and resources they use in their interactions. The Locales Framework provides a coherent mediating framework for ethnographers, designers, and software engineers to facilitate both understanding the requirements of complex social situations and designing solutions to support these situations in all their complexity.
This book presents state-of-the-art developments in the area of computationally intelligent methods applied to various aspects and ways of Web exploration and Web mining. Some novel data mining algorithms that can lead to more effective and intelligent Web-based systems are also described. Scientists, engineers, and research students can expect to find many inspiring ideas in this volume.
A fundamental understanding of algorithmic bioprocesses is key to learning how information processing occurs in nature at the cell level. The field is concerned with the interactions between computer science on the one hand and biology, chemistry, and DNA-oriented nanoscience on the other. In particular, this book offers a comprehensive overview of research into algorithmic self-assembly, RNA folding, the algorithmic foundations for biochemical reactions, and the algorithmic nature of developmental processes. The editors of the book invited 36 chapters, written by the leading researchers in this area, and their contributions include detailed tutorials on the main topics, surveys of the state of the art in research, experimental results, and discussions of specific research goals. The main subjects addressed are sequence discovery, generation, and analysis; nanoconstructions and self-assembly; membrane computing; formal models and analysis; process calculi and automata; biochemical reactions; and other topics from natural computing, including molecular evolution, regulation of gene expression, light-based computing, cellular automata, realistic modelling of biological systems, and evolutionary computing. This subject is inherently interdisciplinary, and this book will be of value to researchers in computer science and biology who study the impact of the exciting mutual interaction between our understanding of bioprocesses and our understanding of computation.
History of the Book: The last three decades have witnessed an explosive development in integrated circuit fabrication technologies. The complexities of current CMOS circuits are reaching beyond the 100 nanometer feature size and multi-hundred million transistors per integrated circuit. To fully exploit this technological potential, circuit designers use sophisticated Computer-Aided Design (CAD) tools. While supporting the talents of innumerable microelectronics engineers, these CAD tools have become the enabling factor responsible for the successful design and implementation of thousands of high performance, large scale integrated circuits. This research monograph originated from a body of doctoral dissertation research completed by the first author at the University of Rochester from 1994 to 1999 under the supervision of Prof. Eby G. Friedman. This research focuses on issues in the design of the clock distribution network in large scale, high performance digital synchronous circuits, and particularly on algorithms for non-zero clock skew scheduling. During the development of this research, it has become clear that incorporating timing issues into the integrated circuit design process is of fundamental importance, particularly in that advanced theoretical developments in this area have been slow to reach designers' desktops.
This book presents the proceedings of the 8th International Symposium "Information Fusion and Intelligent Geographic Information Systems 2017" (IF&IGIS'2017), which took place at Shanghai Maritime University, China, from May 10 to 12, 2017. The goal of the symposium was to bring together leading global experts in the field of spatial information integration and intelligent GIS (IGIS) to exchange cutting-edge research ideas and experiences, and to discuss perspectives on the fast-paced development of geospatial information theory, methods and models in order to demonstrate the latest advances in IGIS and discover new ways of collaboration. The topics focus on IGIS fundamentals, models, technologies and services in maritime research, such as underwater acoustics, radiolocation, navigation, marine energy, logistics, environmental management, seafood, safety of maritime navigation and others. In addition, the book discusses the integration of IGIS technologies in the emerging field of digital humanities research.
Information communication technologies in human services are of increasing interest and concern to health and welfare educators, managers, and practitioners due to their useful information management and teaching capabilities. "Information Communication Technologies for Human Services Education and Delivery: Concepts and Cases" significantly contributes to the growing area of ICT application in education and human service delivery. Containing expert international contributions, this Premier Reference Source showcases innovative practices, addresses ethical and logistic concerns, and provides relevant theoretical frameworks and the latest empirical research findings.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears adequate to real situations in which coalitional bargaining anticipates a proper realization of the game with strategic behaviour of the players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game theoretical models of human behaviour.
There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively.
This book introduces a promising design for the future Internet, the Smart Collaborative Identifier NETwork (SINET). By examining cutting-edge research from around the world, it is the first book to provide a comprehensive survey of SINET, including its basic theories and principles, a broad range of architectures, protocols, standards, and future research directions. The book also provides readers with an experimental analysis of SINET to promote further, independent research. The second part of the book presents in detail key technologies in SINET such as scalable routing, efficient mapping systems, mobility management and security issues. In turn, the last part presents various implementations of SINET, assessing its merits. The authors believe SINET will greatly benefit researchers involved in designing the future Internet thanks to its high degree of flexibility, security, manageability, mobility support and efficient resource utilization.
Business-to-business (B2B) integration is a buzzword which has been used a lot in recent years, with a variety of meanings. Starting with a clear technical definition of this term and its relation to topics like A2A (Application-to-Application), ASP (Application Service Provider), and B2C (Business-to-Consumer), Christoph Bussler outlines a complete and consistent B2B integration architecture based on a coherent conceptual model. He shows that B2B integration not only requires the exchange of business events between distributed trading partners across networks like the Internet, but also demands back-end application integration within business processes, and thus goes far beyond traditional approaches to enterprise application integration. His detailed presentation describes how B2B integration standards like RosettaNet or SWIFT, the application integration standard J2EE Connector Architecture, and basic standards like XML act together to enable business process integration. The book is the first of its kind that discusses B2B concepts and architectures independent of specific and short-term industrial or academic approaches, and thus provides solid and long-lasting knowledge for researchers, students, and professionals interested in the field of B2B integration.
This book is the essential guide for any student undertaking a computing/IS project, and will give you everything you need to achieve outstanding results. Undertaking a project is a key component of nearly all computing/information systems degree programmes at both undergraduate and postgraduate levels. Projects in Computing and Information Systems covers the four key aspects of project work (planning, conducting, presenting and taking the project further) in chronological fashion, and provides the reader with the skills to excel.
Most of the intriguing social phenomena of our time, such as international terrorism, social inequality, and urban ethnic segregation, are consequences of complex forms of agent interaction that are difficult to observe methodically and experimentally. This book looks at a new research stream that makes use of advanced computer simulation modelling techniques to spotlight agent interaction that allows us to explain the emergence of social patterns. It presents a method to pursue analytical sociology investigations that look at relevant social mechanisms in various empirical situations, such as markets, urban cities, and organisations. This book:
- Provides a comprehensive introduction to epistemological, theoretical and methodological features of agent-based modelling in sociology through various discussions and examples.
- Presents the pros and cons of using agent-based models in sociology.
- Explores agent-based models in combining quantitative and qualitative aspects, and micro- and macro levels of analysis.
- Looks at how to pose an agent-based research question, identifying the model building blocks, and how to validate simulation results.
- Features examples of agent-based models that look at crucial sociology issues.
- Is supported by an accompanying website featuring data sets and code for the models included in the book.
"Agent-Based Computational Sociology" is written in a common sociological language and features examples of models that look at all the traditional explanatory challenges of sociology. Researchers and graduate students involved in the field of agent-based modelling and computer simulation in areas such as social sciences, cognitive sciences and computer sciences will benefit from this book.
Volume 11, Reviews in Computational Chemistry, Kenny B. Lipkowitz and Donald B. Boyd. The theme of this eleventh volume is computer-aided ligand design and modeling of biomolecules. A stellar group of scientists from around the world join in this volume to provide tutorials for beginners and experts. Chapters 1 and 2 take a detailed look at de novo design methodologies for discovering new ligands which may become pharmaceuticals. Chapters 3 and 4 cover the methods and applications of three-dimensional quantitative structure-activity relationships (3D-QSAR) currently used in drug discovery. Ways to compute the correct lipophilic/hydrophilic behavior of molecules are taught in Chapter 5. Chapter 6 is an exposition of realistically simulating DNA in the complex milieu of ions that surround it. An appendix to this volume gives a compendium of software and Internet tools for computational chemistry. From reviews of the series: "This well-respected series continues the fine selection of topics and presentation qualities set forth by the previous volumes. For example, each chapter contains thorough treatment of the theory behind the topic being covered. Moreover, the background material is followed by ample timely examples culled from recent literature." Journal of Medicinal Chemistry