This book approaches economic problems from a systems thinking and feedback perspective. By introducing system dynamics methods (including qualitative and quantitative techniques) and computer simulation models, the respective contributions apply feedback analysis and dynamic simulation modeling to important local, national, and global economic issues and concerns. Topics covered include: an introduction to macro modeling using a system dynamics framework; a system dynamics translation of the Phillips machine; a re-examination of classical economic theories from a feedback perspective; analyses of important social, ecological, and resource issues; the development of a biophysical economics module for global modelling; contributions to monetary and financial economics; analyses of macroeconomic growth, income distribution and alternative theories of well-being; and a re-examination of scenario macro modeling. The contributions also examine the philosophical differences between the economics and system dynamics communities in an effort to bridge existing gaps and compare methods. Many models and other supporting information are provided as online supplementary files. Consequently, the book appeals to students and scholars in economics, as well as to practitioners and policy analysts interested in using systems thinking and system dynamics modeling to understand and improve economic systems around the world. "Clearly, there is much space for more collaboration between the advocates of post-Keynesian economics and system dynamics! More generally, I would like to recommend this book to all scholars and practitioners interested in exploring the interface and synergies between economics, system dynamics, and feedback thinking." Comments in the Foreword by Marc Lavoie, Emeritus Professor, University of Ottawa and University of Sorbonne Paris Nord
The theory of parsing is an important application area of the theory of formal languages and automata. The evolution of modern high-level programming languages created a need for a general and theoretically clean methodology for writing compilers for these languages. It was perceived that the compilation process had to be "syntax-directed," that is, the functioning of a programming language compiler had to be defined completely by the underlying formal syntax of the language. A program text to be compiled is "parsed" according to the syntax of the language, and the object code for the program is generated according to the semantics attached to the parsed syntactic entities. Context-free grammars were soon found to be the most convenient formalism for describing the syntax of programming languages, and accordingly methods for parsing context-free languages were developed. Practical considerations led to the definition of various kinds of restricted context-free grammars that are parsable by means of efficient deterministic linear-time algorithms.
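The closing point above, that suitably restricted context-free grammars admit deterministic linear-time parsing, can be sketched in a few lines. The grammar and code below are illustrative only (not taken from the book): an LL(1) recursive-descent parser for a tiny expression grammar, in which each token is consumed exactly once and a single token of lookahead selects the next rule.

```python
# Grammar:  E -> T (('+'|'-') T)*   T -> NUMBER
# Each token is examined once, so parsing runs in linear time.

def parse_expr(tokens):
    """Parse and evaluate a token list, consuming each token exactly once."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def parse_term():
        nonlocal pos
        tok = peek()
        if tok is None or not tok.isdigit():
            raise SyntaxError(f"expected number at position {pos}")
        pos += 1
        return int(tok)

    value = parse_term()
    while peek() in ('+', '-'):   # one token of lookahead picks the rule
        op = tokens[pos]
        pos += 1
        rhs = parse_term()
        value = value + rhs if op == '+' else value - rhs
    if pos != len(tokens):
        raise SyntaxError(f"unexpected token {tokens[pos]!r}")
    return value

print(parse_expr(['3', '+', '4', '-', '2']))  # prints 5
```

A table-driven LR parser gives the same linear-time guarantee for a larger class of grammars; the recursive-descent form simply makes the "syntax-directed" structure visible.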
This book discusses various applications of machine learning using a new approach, the dynamic wavelet fingerprint technique, to identify features for machine learning and pattern classification in time-domain signals. Whether for medical imaging or structural health monitoring, it develops analysis techniques and measurement technologies for the quantitative characterization of materials, tissues and structures by non-invasive means. Intelligent Feature Selection for Machine Learning using the Dynamic Wavelet Fingerprint begins by providing background information on machine learning and the wavelet fingerprint technique. It then progresses through six technical chapters, applying the methods discussed to particular real-world problems. These chapters are presented in such a way that they can be read on their own, depending on the reader's area of interest, or read together to provide a comprehensive overview of the topic. Given its scope, the book will be of interest to practitioners, engineers and researchers seeking to leverage the latest advances in machine learning in order to develop solutions to practical problems in structural health monitoring, medical imaging, autonomous vehicles, wireless technology, and historical conservation.
During the reception of a piece of information, we are never passive. Depending on its origin and content, and on our personal beliefs and convictions, we bestow upon this piece of information, spontaneously or after reflection, a certain amount of confidence. Too much confidence shows a degree of naivety, whereas an absolute lack of it condemns us as paranoid. These two attitudes are symmetrically detrimental, not only to the proper perception of this information but also to its use. Between these two extremes, each person generally adopts an intermediate position when receiving information, depending on its provenance and credibility. We still need to understand and explain how these judgements are formed, in what context and to what end. Spanning the approaches offered by philosophy, military intelligence, algorithmics and information science, this book presents the concepts of information and the confidence placed in it, the methods that militaries, the first to be aware of the need, have or should have adopted, the tools that can help them, and the prospects they have opened up. Beyond the military context, the book reveals ways to evaluate information for the good of other fields such as economic intelligence and, more globally, informational monitoring by governments and businesses.
Contents:
1. Information: Philosophical Analysis and Strategic Applications, Mouhamadou El Hady Ba and Philippe Capet.
2. Epistemic Trust, Gloria Origgi.
3. The Fundamentals of Intelligence, Philippe Lemercier.
4. Information Evaluation in the Military Domain: Doctrines, Practices and Shortcomings, Philippe Capet and Adrien Revault d'Allonnes.
5. Multidimensional Approach to Reliability Evaluation of Information Sources, Frédéric Pichon, Christophe Labreuche, Bertrand Duqueroie and Thomas Delavallade.
6. Uncertainty of an Event and its Markers in Natural Language Processing, Mouhamadou El Hady Ba, Stéphanie Brizard, Tanneguy Dulong and Bénédicte Goujon.
7. Quantitative Information Evaluation: Modeling and Experimental Evaluation, Marie-Jeanne Lesot, Frédéric Pichon and Thomas Delavallade.
8. When Reported Information Is Second Hand, Laurence Cholvy.
9. An Architecture for the Evolution of Trust: Definition and Impact of the Necessary Dimensions of Opinion Making, Adrien Revault d'Allonnes.
About the Authors: Philippe Capet is a project manager and research engineer at Ektimo, working mainly on information management and control in military contexts. Thomas Delavallade is an advanced studies engineer at Thales Communications & Security, working on social media mining in the context of crisis management, cybersecurity and the fight against cybercrime.
This book discusses the study and analysis of the physical aspects of social systems and models, inspired by the analogy with familiar models of physical systems and possible applications of statistical physics tools. Unlike the traditional analysis of the physics of macroscopic many-body or condensed matter systems, which is now an established and mature subject, the upsurge in the physical analysis and modelling of social systems, which are clearly many-body dynamical systems, is a recent phenomenon. Though the major developments in sociophysics have taken place only recently, the earliest attempts of proposing "Social Physics" as a discipline are more than one and a half centuries old. Various developments in the mainstream physics of condensed matter systems have inspired and induced the recent growth of sociophysical analysis and models. In spite of the tremendous efforts of many scientists in recent years, the subject is still in its infancy and major challenges are yet to be taken up. An introduction to these challenges is the main motivation for this book.
This book presents a comprehensive study covering the design and application of models and algorithms for assessing the joint device failures of telecommunication backbone networks caused by large-scale regional disasters. First, failure models are developed to make use of the best data available; in turn, a set of fast algorithms for determining the resulting failure lists is described; further, a theoretical analysis of the complexity of the algorithms and the properties of the failure lists is presented, and relevant practical case studies are investigated. Merging concepts and tools from complexity theory, combinatorial and computational geometry, and probability theory, a comprehensive set of models is developed for translating the disaster hazard into informative yet concise data structures. The information available on the network topology and the disaster hazard is then used to calculate the possible (probabilistic) network failures. The resulting sets of resources that are expected to break down simultaneously are modeled as a collection of Shared Risk Link Groups (SRLGs), or Probabilistic SRLGs. Overall, this book presents improved theoretical methods that can help predict disaster-caused network malfunctions, identify vulnerable regions, and precisely assess the availability of internet services, among other applications.
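The SRLG concept above can be made concrete with a toy sketch (not the book's models): each circular disaster region induces one group of links expected to fail together. For simplicity, a link is assumed here to fail when either endpoint falls inside the disaster disk; the book's geometric failure models are considerably more refined.

```python
# Toy SRLG construction: one Shared Risk Link Group per disaster region.
# Simplifying assumption: a link fails if either endpoint is in the disk.
from math import hypot

def srlg_list(links, disasters):
    """links: {name: ((x1,y1),(x2,y2))}; disasters: [((cx,cy), radius)]."""
    groups = []
    for (cx, cy), r in disasters:
        group = frozenset(
            name for name, (a, b) in links.items()
            if hypot(a[0]-cx, a[1]-cy) <= r or hypot(b[0]-cx, b[1]-cy) <= r
        )
        if group:  # keep only disasters that actually break something
            groups.append(group)
    return groups

links = {"AB": ((0, 0), (2, 0)), "BC": ((2, 0), (4, 0)), "CD": ((4, 0), (6, 0))}
# A disaster centred on node B takes out both links incident to it:
print(srlg_list(links, [((2, 0), 1.0)]))
```

A probabilistic SRLG would additionally attach a failure probability to each group, derived from the hazard model.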
- Comprehensive coverage of emerging and current technology dealing with heterogeneous sources of information, including data, design hints, reinforcement signals from external datasets, and related topics
- Covers all necessary prerequisites and, where necessary, additional explanations of more advanced topics, to make abstract concepts more tangible
- Includes illustrative material and well-known experiments to offer hands-on experience
Managing information technology (IT) on a global scale presents a number of opportunities and challenges. IT can drive the change in global business strategies and improve international coordination. At the same time, IT can be an impediment to achieving globalization. IT as an enabler of and inhibitor to globalization raises interesting questions. Global Perspective of Information Technology Management provides a collection of research works that address relevant IT management issues from a global perspective. As the world economy becomes more interdependent and competition for business continues to be more globally oriented, it has, likewise, become necessary to address the issues of IT management from a broader global focus.
The purpose of this book is to question the relationships involved in decision making and the systems designed to support it: decision support systems (DSS). The focus is on how these systems are engineered: on stopping to think about the questions to be asked throughout the engineering process and, in particular, about the impact designers' choices have on these systems.
At last, an up-to-the-minute volume on a topic of huge national and international importance. As governments around the world battle voter apathy, the need for new and modernized methods of involvement in the polity is becoming acute. This work provides information on advanced research and case studies that survey the field of digital government. Successful applications in a variety of government settings are delineated, while the authors also analyse the implications for current and future policy-making. Each chapter has been prepared and carefully edited within a structured format by a known expert on the individual topic.
by Kurt Keutzer. Those looking for a quick overview of the book should fast-forward to the Introduction in Chapter 1. What follows is a personal account of the creation of this book. The challenge from Earl Killian, formerly an architect of the MIPS processors and at that time Chief Architect at Tensilica, was to explain the significant performance gap between ASICs and custom circuits designed in the same process generation. The relevance of the challenge was amplified shortly thereafter by Andy Bechtolsheim, founder of Sun Microsystems and ubiquitous investor in the EDA industry. At a dinner talk at the 1999 International Symposium on Physical Design, Andy stated that the greatest near-term opportunity in CAD was to develop tools to bring the performance of ASIC circuits closer to that of custom designs. There seemed to be some synchronicity that two individuals so different in concern and character would be preoccupied with the same problem. Intrigued by Earl and Andy's comments, I found the game was afoot. Earl Killian and other veterans of microprocessor design were helpful with clues as to the sources of the performance discrepancy: layout, circuit design, clocking methodology, and dynamic logic. I soon realized that I needed help in tracking down clues. Only at a wonderful institution like the University of California at Berkeley could I so easily commandeer an able-bodied graduate student like David Chinnery with a knowledge of architecture, circuits, computer-aided design and algorithms.
Design automation of electronic and hybrid systems is a steadily growing field of interest and a permanent challenge for researchers in Electronics, Computer Engineering and Computer Science. System Design Automation presents some recent results in design automation of different types of electronic and mechatronic systems. It deals with various topics of design automation, ranging from high-level digital system synthesis, through analog and heterogeneous system analysis and design, up to system modeling and simulation. Design automation is treated from the aspects of its theoretical fundamentals, its basic approach and its methods and tools. Several application cases are presented in detail. The book consists of three chapters: High-Level System Synthesis (Digital Hardware/Software Systems), in which embedded systems, distributed systems and processor arrays as well as hardware-software codesign are treated, and three special application cases are discussed in detail; Analog and Heterogeneous System Design (System Approach and Methodology), which deals with the analysis and design of hybrid systems comprised of analog and digital, electronic and mechanical components; and System Simulation and Evaluation (Methods and Tools), which considers object-oriented modelling, analog system simulation including fault simulation, parameter optimization and system validation. The contents of the book are based on material presented at the Workshop System Design Automation (SDA 2000), organised by the Sonderforschungsbereich 358 of the Deutsche Forschungsgemeinschaft at TU Dresden.
This book discusses current research on public key cryptosystems. It begins with an introduction to the basic concepts of multivariate cryptography and the history of this field. The authors provide a detailed description and security analysis of the most important multivariate public key schemes, including the four multivariate signature schemes participating as second-round candidates in the NIST standardization process for post-quantum cryptosystems. Furthermore, this book covers the Simple Matrix encryption scheme, which is currently the most promising multivariate public key encryption scheme. This book also covers the current state of security analysis methods for Multivariate Public Key Cryptosystems, including the algorithms and theory of solving systems of multivariate polynomial equations over finite fields. Through the book's website, interested readers can find source code for the algorithms handled in this book. In 1994, Dr. Peter Shor from Bell Laboratories proposed a quantum algorithm solving the Integer Factorization and the Discrete Logarithm problems in polynomial time, thus making all of the currently used public key cryptosystems, such as RSA and ECC, insecure. Therefore, there is an urgent need for alternative public key schemes which are resistant against quantum computer attacks. Researchers worldwide, as well as companies and governmental organizations, have put a tremendous effort into the development of post-quantum public key cryptosystems to meet this challenge. One of the most promising candidates is Multivariate Public Key Cryptosystems (MPKCs). The public key of an MPKC is a set of multivariate polynomials over a small finite field. Especially for digital signatures, numerous well-studied multivariate schemes offering very short signatures and high efficiency exist.
The fact that these schemes work over small finite fields makes them suitable not only for interconnected computer systems, but also for small devices with limited resources, as used in ubiquitous computing. This book gives a systematic introduction to the field of Multivariate Public Key Cryptosystems (MPKC), and presents the most promising multivariate schemes for digital signatures and encryption. Although this book was written more from a computational perspective, the authors try to provide the necessary mathematical background. Therefore, this book is suitable for a broad audience: researchers working in either computer science or mathematics interested in this exciting new field, or beginning graduate students in mathematics or computer science using it as a secondary textbook for a course in MPKC. Information security experts in industry, computer scientists and mathematicians would also find this book valuable as a guide for understanding the basic mathematical structures necessary to implement multivariate cryptosystems for practical applications.
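The central object described above, a public key consisting of multivariate polynomials over a small finite field, can be illustrated with a toy and deliberately insecure sketch. All coefficients below are invented for the example; a real scheme hides an easily invertible map behind secret affine transformations, which is what makes signing (inverting the map) feasible for the key holder only.

```python
# Toy MPKC public key: two quadratic polynomials over GF(7).
# Coefficients invented for illustration; this is NOT a secure scheme.
P = 7  # field size

def public_map(x):
    """Evaluate the public polynomials f0, f1 at x = (x0, x1) over GF(7)."""
    x0, x1 = x
    f0 = (3*x0*x0 + 2*x0*x1 + x1 + 5) % P   # f0 = 3*x0^2 + 2*x0*x1 + x1 + 5
    f1 = (x0*x1 + 4*x1*x1 + 6*x0) % P       # f1 = x0*x1 + 4*x1^2 + 6*x0
    return (f0, f1)

# Verifying a signature s against a digest d is just: public_map(s) == d.
# Signing requires inverting the map, which only the secret key makes easy.
signature = (2, 3)
print(public_map(signature))  # prints (4, 5)
```

Note how cheap verification is: a handful of multiplications in a tiny field, which is why such schemes suit resource-limited devices.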
"Beautiful C++ presents the C++ Core Guidelines from a developer's point of view with an emphasis on what benefits can be obtained from following the rules and what nightmares can result from ignoring them. For true geeks, it is an easy and entertaining read. For most software developers, it offers something new and useful." --Bjarne Stroustrup, inventor of C++ and co-editor of the C++ Core Guidelines Writing great C++ code needn't be difficult. The C++ Core Guidelines can help every C++ developer design and write C++ programs that are exceptionally reliable, efficient, and well-performing. But the Guidelines are so jam-packed with excellent advice that it's hard to know where to start. Start here, with Beautiful C++. Expert C++ programmers Guy Davidson and Kate Gregory identify 30 Core Guidelines you'll find especially valuable and offer detailed practical knowledge for improving your C++ style. For easy reference, this book is structured to align closely with the official C++ Core Guidelines website. Throughout, Davidson and Gregory offer useful conceptual insights and expert sample code, illuminate proven ways to use both new and longstanding language features more successfully, and show how to write programs that are more robust and performant by default.
A Timely Exploration of Multiuser Detection in Wireless Networks. During the past decade, the design and development of current and emerging wireless systems have motivated many important advances in multiuser detection. This book fills an important need by providing a comprehensive overview of crucial recent developments that have occurred in this active research area. Each chapter is contributed by noted experts and is meant to serve as a self-contained treatment of the topic. Coverage includes:
- Linear and decision feedback methods
- Iterative multiuser detection and decoding
- Multiuser detection in the presence of channel impairments
- Performance analysis with random signatures and channels
- Joint detection methods for MIMO channels
- Interference avoidance methods at the transmitter
- Transmitter precoding methods for the MIMO downlink
This book is an ideal entry point for exploring ongoing research in multiuser detection and for learning about the field's existing unsolved problems and issues. It is a valuable resource for researchers, engineers, and graduate students who are involved in the area of digital communications.
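Of the linear methods listed above, the decorrelating detector admits a compact sketch (an illustration under textbook assumptions, not code from this book): for two synchronous users with signature cross-correlation rho, the matched-filter outputs are y = R b + noise with R = [[1, rho], [rho, 1]], and the detector applies R^{-1} before slicing, removing multiuser interference at the price of some noise enhancement.

```python
# Decorrelating detector for two synchronous users (textbook illustration).

def decorrelate(y, rho):
    """Return bit decisions sign(R^{-1} y) for R = [[1, rho], [rho, 1]]."""
    det = 1 - rho * rho
    # Explicit 2x2 inverse: R^{-1} = (1/det) * [[1, -rho], [-rho, 1]]
    z0 = (y[0] - rho * y[1]) / det
    z1 = (-rho * y[0] + y[1]) / det
    return (1 if z0 >= 0 else -1, 1 if z1 >= 0 else -1)

# Noiseless example: bits b = (+1, -1), strong cross-correlation rho = 0.7.
rho = 0.7
b = (1, -1)
y = (b[0] + rho * b[1], rho * b[0] + b[1])   # matched-filter outputs y = R b
print(decorrelate(y, rho))  # prints (1, -1): both bits recovered
```

A naive sign detector on y would be dragged toward errors by the interference term; applying R^{-1} cancels it exactly, which is the defining property of the decorrelator.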
This book contains selected papers presented at the seventh conference of Working Group 3.7 of the International Federation for Information Processing. The focus of Working Group 3.7 is on ITEM: Information Technology in Educational Management. The event took place in Hamamatsu, Japan, and enabled the exchange of findings and ideas between researchers in educational management and information technology, policy-makers in the field of education, developers of ITEM systems, and vendors. The overall goal of the conference was to demonstrate and explore directions for developing and improving all types of educational institutions through ITEM. Contributions to this conference hailed from all over the world. The papers in this volume investigate how ITEM can support and improve educational practice at the level of instruction as well as at the institutional level, analyse the history of ITEM as a field of study and practice, report results of research on how training can promote the implementation of ITEM systems, and offer theoretical analyses of the conditions under which ITEM will have the strongest impact. Moreover, all sectors of educational systems (from schools to universities) are represented by the various chapters of this book.
Originally written by a team of Certified Protection Professionals (CPPs), The Complete Guide for CPP Examination Preparation receives valuable updates from Anthony DiSalvatore in this new edition. It contains an overview of the fundamental concepts and practices of security management while offering important insights into the CPP exam. Until recently the security profession was regarded as a "necessary evil." This book is a comprehensive guide to a profession that is now considered critical to our well-being in the wake of 9/11. It presents a practical approach drawn from decades of combined experience shared by the authors, prepares the reader for the CPP exam, and walks them through the certification process. This edition gives revised and updated treatment of every subject in the CPP exam, encourages and outlines a three-part program for you to follow, and includes sample questions at the end of each area of study. Although these are not questions that appear on the actual exam, they convey the principles and concepts that the exam emphasizes and are valuable in determining if you have mastered the information. The book also includes a security survey that covers all facets of external and internal security, as well as fire prevention. The Complete Guide for CPP Examination Preparation, Second Edition allows you to move steadily forward along your path to achieving one of the most highly regarded certifications in the security industry.
Globalization is defined in economic terms to mean flows of trade, foreign direct investment and finance, and liberalization of trade and investment policies. The impacts of globalization and information technology are examined in this text in terms of growth and productivity, poverty and income distribution, and employment. The experiences of Africa, East and Southeast Asia, South Asia and Latin America in the era of globalization are discussed. It is argued that the benefits of freer trade and capital flows need to be managed carefully in order to minimize the costs and maximize the gains.
This book provides in-depth insights into use cases implementing artificial intelligence (AI) applications at the edge. It covers new ideas, concepts, research, and innovation to enable the development and deployment of AI, the industrial internet of things (IIoT), edge computing, and digital twin technologies in industrial environments. The work is based on the research results and activities of the AI4DI (ECSEL JU) project, including an overview of industrial use cases, research, technological innovation, validation, and deployment. This book's sections build on the research, development, and innovative ideas elaborated for applications in five industries: automotive, semiconductor, industrial machinery, food and beverage, and transportation. The articles included under each of these five industrial sectors discuss AI-based methods, techniques, models, algorithms, and supporting technologies, such as IIoT, edge computing, digital twins, collaborative robots, silicon-born AI circuit concepts, neuromorphic architectures, and augmented intelligence, that are anticipating the development of Industry 5.0. Automotive applications cover use cases addressing AI-based solutions for inbound logistics and assembly process optimisation, autonomous reconfigurable battery systems, virtual AI training platforms for robot learning, autonomous mobile robotic agents, and predictive maintenance for machines on the level of a digital twin. AI-based technologies and applications in the semiconductor manufacturing industry address use cases related to AI-based failure modes and effects analysis assistants, neural networks for predicting critical 3D dimensions in MEMS inertial sensors, machine vision systems developed in the wafer inspection production line, semiconductor wafer fault classifications, automatic inspection of scanning electron microscope cross-section images for technology verification, anomaly detection on wire bond process trace data, and optical inspection. 
The use cases presented for machinery and industrial equipment industry applications cover topics related to wood machinery, with the perception of the surrounding environment and intelligent robot applications. AI, IIoT, and robotics solutions are highlighted for the food and beverage industry, presenting use cases addressing novel AI-based environmental monitoring; autonomous environment-aware, quality control systems for Champagne production; and production process optimisation and predictive maintenance for soybeans manufacturing. For the transportation sector, the use cases presented cover the mobility-as-a-service development of AI-based fleet management for supporting multimodal transport. This book highlights the significant technological challenges that AI application developments in industrial sectors are facing, presenting several research challenges and open issues that should guide future development for evolution towards an environment-friendly Industry 5.0. The challenges presented for AI-based applications in industrial environments include issues related to complexity, multidisciplinary and heterogeneity, convergence of AI with other technologies, energy consumption and efficiency, knowledge acquisition, reasoning with limited data, fusion of heterogeneous data, availability of reliable data sets, verification, validation, and testing for decision-making processes.
Management of new technologies is a critical factor in achieving global competitiveness. A recent survey of managers in the United States revealed that the implementation of new and advanced technologies is the most popular strategy in achieving global competitiveness. This book explores the role of technology in that context. The book identifies the role of new technologies in improving quality and shows that adopting strategic total quality management will, in fact, lead to improved productivity and survivability of the firm. A thorough comparison of Japanese and American production management practices is presented. This in-depth analysis helps to identify the problems of managing new technologies and shows that human resources management is a critical factor that should not be overlooked. Other strategies for improving global competitiveness are presented. Each of the five sections of the book deals with a major thrust that confronts management of new technologies. The book also discusses information system management and product design. The book uses real-life cases, models, and conceptual frameworks to support the views presented. Productivity, quality, and competitiveness are all related to technology. The success of Japanese corporations in achieving quality management has impelled U.S. executives to listen and re-evaluate their management practices. Increasingly, many managers believe that new, advanced technologies can contribute to improving the productivity, quality, and competitiveness of a firm. However, simply adopting a new technology will not put an end to productivity and/or quality problems. This fact is most apparent in the case of computer-integrated manufacturing. The islands of automation that have resulted have convinced many that effective management of new technologies is necessary in order to exploit any potential benefits.
This book focuses on efficient management techniques and looks at the critical areas that can enhance the performance of a firm as a result of the adoption of new technologies. The book is divided into five sections. The total quality management section contains four chapters that present a comparison between Japanese and American production practices. This section also presents a new way to measure the performance of a firm: not just by the direct quality of the product or service produced, but also by the sensitivity and responsibility of the firm to environmental and greening issues. The selection and implementation of new technologies section discusses the problems associated with cost-accounting techniques in justifying new technologies and uses a multicriteria decision framework to show how this decision could be made. The strategic management section presents issues on product innovation and performance. The knowledge-based techniques section investigates the role of artificial intelligence and expert systems in the management of new technologies. Finally, the product design and inventory management section discusses the role of product design and reduced lot sizes in achieving a competitive advantage.
With a global view and a vision of our digital future, we should move forward quickly toward an understanding of data rights legislation. The earlier we set the value norms around data in this digital long-distance race, the more likely we are to grasp the opportunities therein and embrace a future of commonly understood values. Looking ahead, the branch of Chinese law most likely to lead the world is that related to the digital economy. At the same time, if China wants to be among the world's leading digital economies, what most needs to be understood and promoted is higher-quality, fairer and more sustainable institutional protection for data rights and the interests of data subjects, together with the ability to offer systematic and accurate legal rules within the various digital disciplines.
Are you tired of squinting at the tiny color-coded tables and difficult-to-read text you find on the typical laminated reference card or cheat sheet that you keep with you when you're in the field or on location? DAVID BUSCH'S COMPACT GUIDE FOR THE NIKON D5000 is your solution! This new, lay-flat, spiral bound, reference guide condenses all the must-have information you need while shooting into a portable book you'll want to permanently tuck into your camera bag. You'll find every settings option for your Nikon D5000 listed, along with advice on why you should use--or not use--each adjustment. Useful tables provide recommended settings for a wide variety of shooting situations, including landscapes, portraits, sports, close-ups, and travel. With this guide on hand you have all the information you need at your fingertips so you can confidently use your camera on-the-go.
Textiles and computing have long been associated. High volume and low profit margins of textile products have driven the industry to invest in high technology, particularly in the area of data interpretation and analysis. Thus, it is virtually inevitable that soft computing has found a home in the textile industry. Contained in this volume are six chapters discussing various aspects of soft computing in the field of textiles and apparel.
This book provides analytical and numerical methods for the estimation of dimension characteristics (Hausdorff, Fractal, Carathéodory dimensions) for attractors and invariant sets of dynamical systems and cocycles generated by smooth differential equations or maps in finite-dimensional Euclidean spaces or on manifolds. It also discusses stability investigations using estimates based on Lyapunov functions and adapted metrics. Moreover, it introduces various types of Lyapunov dimensions of dynamical systems with respect to an invariant set, based on local, global and uniform Lyapunov exponents, and derives analytical formulas for the Lyapunov dimension of the attractors of the Hénon and Lorenz systems. Lastly, the book presents estimates of the topological entropy for general dynamical systems in metric spaces and estimates of the topological dimension for orbit closures of almost periodic solutions to differential equations.
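Among the notions above, the Lyapunov dimension is commonly estimated with the Kaplan-Yorke formula: with exponents sorted in decreasing order, d = j + (L1 + ... + Lj)/|L(j+1)|, where j is the largest index keeping the partial sum non-negative. The sketch below is not drawn from the book; it applies the formula to approximate literature values of the Lorenz-attractor exponents.

```python
# Kaplan-Yorke estimate of the Lyapunov dimension from Lyapunov exponents.

def kaplan_yorke(exponents):
    """d = j + (L1 + ... + Lj) / |L(j+1)|, exponents sorted decreasing."""
    lams = sorted(exponents, reverse=True)
    total, j = 0.0, 0
    for lam in lams:
        if total + lam < 0:
            break
        total += lam
        j += 1
    if j == len(lams):   # sum never goes negative: dimension = phase-space dim
        return float(j)
    return j + total / abs(lams[j])

# Approximate literature values for the classic Lorenz attractor:
print(round(kaplan_yorke([0.906, 0.0, -14.572]), 3))  # prints 2.062
```

The resulting value just above 2 matches the familiar picture of the Lorenz attractor as a fractal set slightly thicker than a surface.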