High-Performance Digital VLSI Circuit Design is the first book devoted entirely to the design of digital high-performance VLSI circuits. CMOS, BiCMOS and bipolar circuits are covered in depth, including state-of-the-art circuit structures. Recent advances in both the computer and telecommunications industries demand high-performance VLSI digital circuits. Digital processing of signals demands high-speed circuit techniques for the GHz range. The design of such circuits represents a great challenge; one that is amplified when the power supply is scaled down to 3.3 V. Moreover, the requirements of low-power/high-performance circuits add an extra dimension to the design of such circuits. High-Performance Digital VLSI Circuit Design is a self-contained text, introducing the subject of high-performance VLSI circuit design and explaining the speed/power tradeoffs. The first few chapters of the book discuss the necessary background material in the areas of device design and device modeling, respectively. High-performance CMOS circuits are then covered, especially the new all-N-logic dynamic circuits. Propagation delay times of high-speed bipolar CML and ECL are developed analytically to give a thorough understanding of various interacting process, device and circuit parameters. High-current phenomena of bipolar devices are also addressed, as these devices typically operate at maximum currents for a limited device area. Different new high-performance BiCMOS circuits are presented and compared to their conventional counterparts. These new circuits find direct applications in the areas of high-speed adders, frequency dividers, sense amplifiers, level-shifters, input/output clock buffers and PLLs. The book concludes with a few system application examples of digital high-performance VLSI circuits. Audience: A vital reference for practicing IC designers. Can be used as a text for graduate and senior undergraduate students in the area.
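As a rough flavor of the analytical delay modeling the blurb mentions (a generic first-order estimate, not the book's CML/ECL derivations): a gate's propagation delay can be approximated from its effective driving resistance and load capacitance; the numbers below are hypothetical.

```python
# First-order RC estimate of propagation delay: t_pd ~ ln(2)*R*C, the time
# for an RC step response to reach its 50% point. Real high-speed bipolar
# delay models involve many more interacting process, device and circuit
# parameters, which is what the book develops analytically.
R_drive = 5e3     # effective driving resistance in ohms (hypothetical)
C_load = 20e-15   # load capacitance in farads (hypothetical)

t_pd = 0.69 * R_drive * C_load
print(f"estimated propagation delay: {t_pd * 1e12:.1f} ps")  # ~69 ps
```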
The relation between logic and knowledge has provoked a heated debate since the 1960s. The epistemic approaches found their formal argument in the mathematics of Brouwer and intuitionistic logic and, following Michael Dummett, started to call themselves 'antirealists'. Others persisted with the formal background of the Frege-Tarski tradition, where Cantorian set theory is linked via model theory to classical logic. Jaakko Hintikka tried to join both traditions by means of what is now known as 'explicit epistemic logic'. Here the epistemic content is introduced into the object language as an operator which yields propositions from propositions, rather than as a metalogical constraint on the notion of inference. The Realism-Antirealism debate thus had three players: classical logicians, intuitionists and explicit epistemic logicians. The editors of the present volume think that in this day and age of Alternative Logics, where manifold developments in logic happen at a breathtaking pace, this debate should be revisited. Using the most recent logical and epistemological tools, this book provides a novel and refreshing view on the most important topics of the Realism vs. Antirealism debate, showing how the most recent developments in philosophical logic deal with problems inherited from it. It is meant for researchers and advanced students in philosophy, logic and formal methods. It is a complete collection with a variety of approaches, written by leading authors in the field, and every chapter is self-contained.
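To make the contrast concrete, here is a standard textbook formulation (not drawn from this volume) of how an explicit epistemic operator sits inside the object language:

```latex
% K_a is an object-language operator: from a proposition \varphi it forms
% another proposition K_a\varphi ("agent a knows that \varphi"), which can
% itself be embedded, as in the positive-introspection axiom:
K_a\varphi \rightarrow K_a K_a\varphi
% A metalogical constraint, by contrast, is stated about the logic
% ("\varphi is assertible only if a proof of \varphi is available") and is
% not itself a formula of the object language.
```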
Practically every crime now involves some aspect of digital evidence. This is the most recent volume in the Advances in Digital Forensics series. It describes original research results and innovative applications in the emerging discipline of digital forensics. In addition, it highlights some of the major technical and legal issues related to digital evidence and electronic crime investigations. This book contains a selection of twenty-eight edited papers from the Fourth Annual IFIP WG 11.9 Conference on Digital Forensics, held at Kyoto University, Kyoto, Japan in the spring of 2008.
The present textbook contains the records of a two-semester course on queueing theory, including an introduction to matrix-analytic methods. This course comprises four hours of lectures and two hours of exercises per week and has been taught at the University of Trier, Germany, for about ten years in sequence. The course is directed to last-year undergraduate and first-year graduate students of applied probability and computer science, who have already completed an introduction to probability theory. Its purpose is to present material that is close enough to concrete queueing models and their applications, while providing a sound mathematical foundation for the analysis of these. Thus the goal of the present book is two-fold. On the one hand, students who are mainly interested in applications easily feel bored by elaborate mathematical questions in the theory of stochastic processes. The presentation of the mathematical foundations in our courses is chosen to cover only the necessary results, which are needed for a solid foundation of the methods of queueing analysis. Further, students oriented towards applications expect to have a justification for their mathematical efforts in terms of immediate use in queueing analysis. This is the main reason why we have decided to introduce new mathematical concepts only when they will be used in the immediate sequel. On the other hand, students of applied probability do not want any heuristic derivations just for the sake of yielding fast results for the model at hand.
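For readers new to the area, a minimal sketch of the kind of closed-form result such a course builds toward (the standard M/M/1 mean-value formulas, not material specific to this book):

```python
# Stationary M/M/1 queue: Poisson arrivals at rate lam, exponential service
# at rate mu, one server. Stable only when utilization rho = lam/mu < 1.
def mm1_metrics(lam: float, mu: float) -> dict:
    rho = lam / mu
    if rho >= 1.0:
        raise ValueError("unstable: arrival rate must be below service rate")
    L = rho / (1.0 - rho)    # mean number of customers in the system
    W = 1.0 / (mu - lam)     # mean sojourn time; Little's law gives L = lam * W
    Wq = rho / (mu - lam)    # mean waiting time in the queue
    return {"utilization": rho, "mean_in_system": L, "mean_sojourn": W, "mean_wait": Wq}

print(mm1_metrics(lam=0.8, mu=1.0))  # rho = 0.8 -> L = 4, W = 5, Wq = 4
```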
This volume contains the proceedings of IFIPTM 2008, the Joint iTrust and PST Conferences on Privacy, Trust Management and Security, held in Trondheim, Norway from June 18 to June 20, 2008. IFIPTM 2008 provides a truly global platform for the reporting of research, development, policy and practice in the interdependent areas of privacy, security, and trust. Following the traditions inherited from the highly successful iTrust and PST conference series, IFIPTM 2008 focuses on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion about relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2008 is an open IFIP conference, which only accepts contributed papers, so all papers in these proceedings have passed strict peer review. The program of the conference features both theoretical research papers and reports of real-world case studies. IFIPTM 2008 received 62 submissions. The program committee selected 22 papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include 3 demo descriptions. The highlights of IFIPTM 2008 include invited talks and tutorials by industrial and academic experts in the fields of trust management, privacy and security, including Jon Bing and Michael Steiner.
Offering a deep insight into the venture capital deal-making process, "Raising Venture Capital" also provides a valuable introduction to the subject. The book is practical in focus but based on sound academic theory, research and teaching materials gathered over the last four years at Tanaka Business School. Part one covers the history of the venture capital industry, shows why entrepreneurs need venture capital finance, and looks at how venture capitalists raise and structure their funds. It also covers valuation methods for venture capital investments, and portfolio management. Part two illustrates how successful entrepreneurs raise finance from venture capitalists, and gives details on how to approach venture capitalists, how to choose the right venture capital firm, and how venture capitalists and entrepreneurs work together after the deal is done. Part three gives a blow-by-blow account of the structure of a venture capital deal.
Fuzzy modeling has become one of the most productive and successful results of fuzzy logic. Among others, it has been applied to knowledge discovery, automatic classification, long-term prediction, and medical and engineering analysis. The research developed in the topic during the last two decades has been mainly focused on exploiting the fuzzy model's flexibility to obtain the highest accuracy. This approach usually sets aside the interpretability of the obtained models. However, we should remember the initial philosophy of fuzzy set theory, intended to serve as a bridge between human understanding and machine processing. In meeting this challenge, the ability of fuzzy models to express the behavior of the real system in a comprehensible manner acquires great importance. This book collects the works of a group of experts in the field who advocate interpretability improvements as a mechanism to obtain well-balanced fuzzy models.
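A minimal sketch of what interpretability means in this setting (an illustration under assumed linguistic labels, not an example from the book): terms defined by simple membership functions, combined in rules a human can read.

```python
# One readable fuzzy rule: "IF temperature IS high THEN fan IS fast".
# Triangular membership functions keep the linguistic terms easy to inspect.
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def temperature_is_high(t_celsius: float) -> float:
    return tri(t_celsius, 25.0, 35.0, 45.0)  # hypothetical label definition

print(f"rule fires at degree {temperature_is_high(32.0):.2f}")  # 0.70
```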
The lectures contained in this book were presented at Harvard University in June 1979. The workshop at which they were presented was the third such on algebro-geometric methods. The first was held in 1973 in London and the emphasis was largely on geometric methods. The second was held at Ames Research Center-NASA in 1976. There again the emphasis was on geometric methods, but algebraic geometry was becoming a dominant theme. In the two years after the Ames meeting there was tremendous growth in the applications of algebraic geometry to systems theory and it was becoming clear that much of the algebraic systems theory was very closely related to the geometric systems theory. On this basis we felt that this was the right time to devote a workshop to the applications of algebra and algebraic geometry to linear systems theory. The lectures contained in this volume represent all but one of the tutorial lectures presented at the workshop. The lecture of Professor Murray Wonham is not contained in this volume and we refer the interested reader to the archival literature. This workshop was jointly sponsored by a grant from Ames Research Center-NASA and a grant from the Advanced Study Institute Program of NATO. We greatly appreciate the financial support rendered by these two organizations. The American Mathematical Society hosted this meeting as part of their Summer Seminars in Applied Mathematics and will publish the companion volume of contributed papers.
The idea that games can have positive impacts upon critical thinking and problem solving is widely accepted in today's digital society, yet the effect of video games on human cognition is still largely unexplored. Gaming and Cognition: Theories And Practice From The Learning Sciences applies the principles of research in the study of human cognition to video games, providing a critical examination of the rigor and design of the experiments in the study of cognition and gaming. Combining many aspects of the learning sciences such as psychology, instructional design, and education into one coherent whole, this book presents historical, theoretical, and practical perspectives.
30 tutorials and more than 100 exercises in chemoinformatics, supported by online software and data sets. Chemoinformatics is widely used in both academic and industrial chemical and biochemical research worldwide. Yet, until this unique guide, there were no books offering practical exercises in chemoinformatics methods. Tutorials in Chemoinformatics contains more than 100 exercises in 30 tutorials exploring key topics and methods in the field. It takes an applied approach to the subject with a strong emphasis on problem-solving and computational methodologies. Each tutorial is self-contained and contains exercises for students to work through using a variety of software packages. The majority of the tutorials are divided into three sections devoted to theoretical background, algorithm description and software applications, respectively, with the latter section providing step-by-step software instructions. Throughout, three types of software tools are used: in-house programs developed by the authors, open-source programs and commercial programs which are available for free or at a modest cost to academics. The in-house software and data sets are available on a dedicated companion website. Key topics and methods covered in Tutorials in Chemoinformatics include: * Data curation and standardization * Development and use of chemical databases * Structure encoding by molecular descriptors, text strings and binary fingerprints * The design of diverse and focused libraries * Chemical data analysis and visualization * Structure-property/activity modeling (QSAR/QSPR) * Ensemble modeling approaches, including bagging, boosting, stacking and random subspaces * 3D pharmacophore modeling and pharmacological profiling using shape analysis * Protein-ligand docking * Implementation of algorithms in a high-level programming language. Tutorials in Chemoinformatics is an ideal supplementary text for advanced undergraduate and graduate courses in chemoinformatics, bioinformatics, computational chemistry, computational biology, medicinal chemistry and biochemistry. It is also a valuable working resource for medicinal chemists, academic researchers and industrial chemists looking to enhance their chemoinformatics skills.
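As a taste of the fingerprint material (a generic sketch; the book's own exercises use its companion software and data sets): the Tanimoto coefficient is the standard similarity measure for binary molecular fingerprints.

```python
# Tanimoto similarity between two binary fingerprints, represented here as
# the sets of their "on" bit positions (toy data, not real molecules).
def tanimoto(fp_a: set, fp_b: set) -> float:
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 1.0  # two empty prints count as identical

mol_a = {1, 5, 9, 42, 77}   # bits set for molecule A (hypothetical)
mol_b = {1, 5, 10, 42, 80}  # bits set for molecule B (hypothetical)
print(f"Tanimoto similarity: {tanimoto(mol_a, mol_b):.2f}")  # 3/7, about 0.43
```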
This is a supplementary volume to the major three-volume Handbook of Combinatorial Optimization set. It can also be regarded as a stand-alone volume presenting chapters dealing with various aspects of the subject in a self-contained way.
Modeling and Simulation of High Speed VLSI Interconnects brings together in one place important contributions and state-of-the-art research results in this rapidly advancing area. Modeling and Simulation of High Speed VLSI Interconnects serves as an excellent reference, providing insight into some of the most important issues in the field.
The dynamics of the atmosphere, ocean, and climate are inherently nonlinear and complex, making computer models ideal for developing an accurate, complete understanding of these systems. In the process of building and using models, the reader of this book will learn how the different components of climate systems function, interact with each other, and vary over time. Topics covered include the stability of climate, earth's energy balance, parcel dynamics in the atmosphere, the mechanisms of heat transport in the climate system, and mechanisms of climate variability. Special attention is given to the effects of climate change. The book is accompanied by a cross-platform CD containing models and a run-time version of STELLA (R) software. Walter A. Robinson is Associate Professor of Atmospheric Sciences at the University of Illinois at Urbana-Champaign.
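A minimal sketch of the simplest model of earth's energy balance (the standard zero-dimensional treatment, not one of the book's STELLA models): absorbed solar radiation balances outgoing blackbody emission.

```python
# Zero-dimensional energy balance: (S0/4) * (1 - albedo) = sigma * T**4,
# solved for the planet's effective radiating temperature T.
S0 = 1361.0      # solar constant, W/m^2
albedo = 0.3     # planetary albedo
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

T_eff = ((S0 / 4.0) * (1.0 - albedo) / sigma) ** 0.25
print(f"effective temperature: {T_eff:.1f} K")
# ~255 K; the gap to the observed ~288 K surface mean is the greenhouse effect.
```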
This monograph provides novel insights into the cognitive mechanisms underlying the processing of sound and music in different environments. A solid understanding of these mechanisms is vital for numerous technological applications, such as information retrieval from distributed musical databases or building expert systems. In order to investigate the cognitive mechanisms of music perception, fundamentals of hearing psychophysiology and principles of music perception are presented. In addition, some computational intelligence methods are reviewed, such as rough sets, fuzzy logic, artificial neural networks, decision trees and genetic algorithms. The applications of hybrid decision systems to problem solving in music and acoustics are exemplified and discussed on the basis of the experimental results obtained.
Make sure your students get the most from their online learning experiences. Even though nearly every K-12 public school in the United States has broadband Internet access, the Web's vast potential as a teaching and learning tool has still not been realized. Web-based learning opportunities have been expensive, slow to develop, and time-consuming to implement, despite pressure on schools to adopt technology solutions that will cure their educational ills. Web-Based Learning in K-12 Classrooms: Opportunities and Challenges chronicles the ups and downs of online learning and offers unique insights into its future, providing a comprehensive, curriculum-wide treatment of K-12 content areas (reading, science, mathematics, social studies), special education, counseling, virtual schools, exemplary schools, implementation issues, and educational Web sites. The Internet represents a powerful, complex set of technologies that offers your students access to unlimited knowledge, but that access doesn't replace the human interactions found in classrooms. Placing a student in front of a computer monitor is a supplement to classroom learning, not a substitute for it. Academics and education professionals address questions surrounding the key issues involved in successfully incorporating the wide range of Web-based learning opportunities (formal courses, demonstrations, simulations, collaborations, searches) into the classroom, including technology, content, and implementation. Web-Based Learning in K-12 Classrooms examines: inquiry-based learning; online interaction; displaying student work online; Internet accessibility for students with disabilities; initiating school counselors into e-learning technologies; the role of government in virtual schools; Web-based schools in California, Virginia, Pennsylvania, Vermont, and Texas; a 13-category classification system for online educational resources; the ATLAS model for program implementation; and evaluations of more than 1,000 pieces of online information (articles, research, reports, news, and statistics) and 900 Web applications (tutorials, drills, games, and tests), with evaluation criteria. Web-Based Learning in K-12 Classrooms is a vital resource for educators interested in online learning applications across the K-12 curriculum.
Spectral analysis requires subjective decisions which influence the final estimate and mean that different analysts can obtain different results from the same stationary stochastic observations. Statistical signal processing can overcome this difficulty, producing a unique solution for any set of observations, but that solution is only acceptable if it is close to the best attainable accuracy for most types of stationary data. This book describes a method which fulfils this near-optimal-solution criterion, taking advantage of greater computing power and robust algorithms to produce enough candidate models to be sure of providing a suitable candidate for given data.
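A minimal sketch of the candidate-model idea (generic time-series practice, not the author's specific algorithm): fit autoregressive models over a range of orders and let an order-selection criterion such as AIC choose among them, removing the analyst's subjective decision.

```python
import numpy as np

def select_ar_order(x: np.ndarray, max_order: int) -> int:
    """Fit AR(k) by least squares for k = 1..max_order; return the AIC-best k."""
    x = x - x.mean()
    best_k, best_aic = 1, np.inf
    for k in range(1, max_order + 1):
        # Regression matrix of lagged values: predict x[t] from x[t-1..t-k].
        X = np.column_stack([x[k - i - 1:-i - 1] for i in range(k)])
        y = x[k:]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid_var = np.mean((y - X @ coeffs) ** 2)
        aic = len(y) * np.log(resid_var) + 2 * k   # penalize extra parameters
        if aic < best_aic:
            best_k, best_aic = k, aic
    return best_k

rng = np.random.default_rng(0)
x = np.zeros(2000)
for t in range(2, 2000):                           # simulate a known AR(2) process
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
print("selected AR order:", select_ar_order(x, max_order=10))  # expect 2
```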
Digital technology determines today's world and will be one of the key technologies of the future. Successful technology development, introduction and management are not only a question of technical issues; due to their complexity, a close cooperation between different scientific disciplines is required to discuss the various consequences, chances and risks from manifold points of view as a starting point for the design of adequate solutions. The ability to integrate business and technology decisions will become a crucial core competence. The aim of this volume is to highlight a selection of important current research topics in the field of digital technology and management, illustrating the variety of aspects which have to be considered in the development and application of digital technologies. Main topics of this book are the design of the innovation process; digital rights management; mobile, location-based and ubiquitous services; IT service management; and future communication networks.
This book is for people who work in the tech industry: computer and data scientists, software developers and engineers, designers, and people in business, marketing or management roles. It is also for people who are involved in the procurement and deployment of advanced applications, algorithms, and AI systems, and in policy making. Together, they create the digital products, services, and systems that shape our societies and daily lives. The book's aim is to empower people to take responsibility, to 'upgrade' their skills for ethical reflection, inquiry, and deliberation. It introduces ethics in an accessible manner with practical examples, outlines of different ethical traditions, and practice-oriented methods. Additional online resources are available at: ethicsforpeoplewhoworkintech.com.
Embodied conversational agents (ECAs) are autonomous software entities with human-like appearance and communication skills. These agents can take on a number of different roles, for example, as an assistant, tutor, information provider, or customer service agent. They may also simply represent or entertain a user. The precise nature and benefits of different characteristics of ECAs require careful investigation. Questions range from the function of an eyebrow raise to mechanisms for assessing and improving ECA trustworthiness. This book will help experts and designers in the specification and development of applications incorporating ECAs. Part 1 provides guidelines for evaluation methodologies and the identification of design and evaluation parameters. Part 2 demonstrates the importance of considering the user's perspective and interaction experience. Part 3 addresses issues in fine-tuning design parameters of ECAs and verifying the perceived effect. Finally, in Part 4 lessons learned from a number of application case studies are presented. The book is intended for both ECA researchers in academia and industry, and developers and designers interested in applying the technology.
Standard voltages used in today's ICs may vary from about 1.3V to more than 100V, depending on the technology and the application. High voltage is therefore a relative notion. High Voltage Devices and Circuits in Standard CMOS Technologies is mainly focused on standard CMOS technologies, where high voltage (HV) is defined as any voltage higher than the nominal (low) voltage, i.e. 5V, 3.3V, or even lower. In this standard CMOS environment, IC designers are more and more frequently confronted with HV problems, particularly at the I/O level of the circuit. In the first group of applications, a large range of industrial or consumer circuits either require HV driving capabilities, or are supposed to work in a high-voltage environment. This includes ultrasonic drivers, flat panel displays, robotics, automotive, etc. On the other hand, in the emerging field of integrated microsystems, MEMS actuators mainly make use of electrostatic forces involving voltages in the typical range of 30 to 60V. Last but not least, with the advent of deep sub-micron and/or low-power technologies, the operating voltage tends towards levels ranging from 1V to 2.5V, while the interface needs to be compatible with higher voltages, such as 5V. For all these categories of applications, it is usually preferable to perform most of the signal processing at low voltage, while the resulting output rises to a higher voltage level. Solving this problem requires some special actions at three levels: technology, circuit design and layout. High Voltage Devices and Circuits in Standard CMOS Technologies addresses these topics in a clear and organized way. The theoretical background is supported by practical information and design examples. It is an invaluable reference for researchers and professionals in both the design and device communities.
I3E 2006, the 6th in this series of IFIP conferences, marked the congregation of researchers and practitioners in the areas of e-Commerce, e-Business, and e-Government. The conference was sponsored by IFIP TC 6 in cooperation with TC 8 and TC 11. The conference provided a forum for researchers, engineers and interested users in academia, industry, and government to discuss the latest research, cutting-edge practice and upcoming trends in the growing areas of e-Commerce, e-Business, and particularly e-Government. Sophisticated applications as well as the underlying technology that supports such applications were discussed and demonstrated. The conference attracted a wide range of participants representing a significant community of researchers and practitioners from a broad range of countries. The conference was organized along parallel tracks, each track focusing on specific aspects of current research, industry applications, and public administration.
Developments in electronic hardware, particularly microprocessors and solid-state cameras, have resulted in a vast explosion in the range and variety of applications to which intelligent processing may be applied to yield cost-effective automation. Typical examples include automated visual inspection and repetitive assembly. The technology required is recent and specialized, and is thus not widely known. VISION AND INFORMATION PROCESSING FOR AUTOMATION has arisen from a short course given by the authors to introduce potential users to the technology. Its content is a development and extension of material presented in the course. The objective of the book is to introduce readers to modern concepts and techniques basic to intelligent automation, and explain how these are applied to practical problems. Its emphasis is on machine vision. Intelligent instrumentation is concerned with processing information, and an appreciation of the nature of information is essential in configuring instrumentation to handle it efficiently. An understanding of the fundamental principles of efficient computation and of the way in which machines make decisions is vital for the same reasons. Selection of appropriate sensing (e.g., camera type and configuration), of illumination, of hardware for processing (microchip or parallel processor?) to give most effective information flow, and of the most appropriate processing algorithms is critical in obtaining an optimal solution. Analysis of performance, to demonstrate that requirements have been met, and to identify the causes if they have not, is also important. All of these topics are covered in this volume.
This book provides a theoretical and application oriented analysis of deterministic scheduling problems arising in computer and manufacturing environments. In such systems processors (machines) and possibly other resources are to be allocated among tasks in such a way that certain scheduling objectives are met. Various scheduling problems are discussed where different problem parameters such as task processing times, urgency weights, arrival times, deadlines, precedence constraints, and processor speed factor are involved. Polynomial and exponential time optimization algorithms as well as approximation and heuristic approaches (including tabu search, simulated annealing, genetic algorithms, and ejection chains) are presented and discussed. Moreover, resource-constrained, imprecise computation, flexible flow shop and dynamic job shop scheduling, as well as flexible manufacturing systems, are considered.
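As a minimal example of the flavor of results discussed (a classical single-machine fact, not tied to this book's notation): sequencing jobs in shortest-processing-time order minimizes the sum of completion times.

```python
# SPT rule: sort jobs by processing time; this provably minimizes total
# (equivalently, mean) completion time on a single machine.
def spt_schedule(times: list) -> tuple:
    order = sorted(range(len(times)), key=lambda j: times[j])
    total, clock = 0, 0
    for j in order:
        clock += times[j]   # completion time of job j
        total += clock
    return order, total

jobs = [7, 2, 5, 3]         # hypothetical processing times
order, total = spt_schedule(jobs)
print(f"SPT order: {order}, total completion time: {total}")  # [1, 3, 2, 0], 34
```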
Current practice dictates the separation of the hardware and software development paths early in the design cycle. These paths remain independent with very little interaction occurring between them until system integration. In particular, hardware is often specified without fully appreciating the computational requirements of the software. Also, software development does not influence hardware development and does not track changes made during the hardware design phase. Thus, the ability to explore hardware/software tradeoffs is restricted, such as the movement of functionality from the software domain to the hardware domain (and vice-versa) or the modification of the hardware/software interface. As a result, problems that are encountered during system integration may require modification of the software and/or hardware, resulting in potentially significant cost increases and schedule overruns. To address the problems described above, a cooperative design approach, one that utilizes a unified view of hardware and software, is described. This approach is called hardware/software codesign. The Codesign of Embedded Systems develops several fundamental hardware/software codesign concepts and a methodology that supports them. A unified representation, referred to as a decomposition graph, is presented which can be used to describe hardware or software using either functional abstractions or data abstractions. Using a unified representation based on functional abstractions, an abstract hardware/software model has been implemented in a common simulation environment called ADEPT (Advanced Design Environment Prototyping Tool). This model permits early hardware/software evaluation and tradeoff exploration. Techniques have been developed which support the identification of software bottlenecks and the evaluation of design alternatives with respect to multiple metrics. The application of the model is demonstrated on several examples. A unified representation based on data abstractions is also explored. This work leads to investigations regarding the application of object-oriented techniques to hardware design. The Codesign of Embedded Systems: A Unified Hardware/Software Representation describes a novel approach to a topic of immense importance to CAD researchers and designers alike.
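A minimal sketch of how a unified representation might look in code (a hypothetical illustration; the book's decomposition graphs live in the ADEPT environment and carry far more structure):

```python
# Hypothetical decomposition-graph fragment: each node is a functional
# abstraction whose implementation domain can be flipped during tradeoff
# exploration without changing the graph's structure.
class FunctionNode:
    def __init__(self, name: str, domain: str = "software"):
        assert domain in ("hardware", "software")
        self.name, self.domain, self.successors = name, domain, []

    def connect(self, other: "FunctionNode") -> None:
        self.successors.append(other)

    def move_to(self, domain: str) -> None:
        """Reassign this function's domain to explore a hardware/software tradeoff."""
        assert domain in ("hardware", "software")
        self.domain = domain

fir = FunctionNode("fir_filter")   # starts in software
ctl = FunctionNode("controller")
fir.connect(ctl)
fir.move_to("hardware")            # e.g., after profiling flags it as a bottleneck
print(fir.name, "->", fir.domain)
```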
Computational hydraulics and hydrologic modeling are rapidly developing fields with a wide range of applications in areas ranging from wastewater disposal and stormwater management to civil and environmental engineering. The fields are full of promise, but while an abundance of literature now exists, it contains a plethora of new terms that are not always defined.