Welcome to Loot.co.za!
This book gathers the insights of leading experts on corruption and anti-corruption studies working at the scientific frontier of this phenomenon using the multidisciplinary tools of data and network science, presenting current theoretical, empirical, and operational efforts to curb the problem. The research results strengthen the case for evidence-based approaches in the fight against corruption in all its forms, and foster discussion about the best ways to convert the knowledge obtained into public policy. The contributed chapters provide comprehensive, multidisciplinary approaches to handling the non-trivial structural and dynamical aspects that characterize the modern social, economic, political and technological systems in which corruption takes place. This book will serve a broad multidisciplinary audience, from natural and social scientists to applied mathematicians, law scholars and policymakers.
Cyberspace is a critical part of our lives. Although we all use cyberspace for work, entertainment, and social life, much of its infrastructure and operation is invisible to us. We spend a big part of our lives in an environment that is almost an essential service but is full of potential dangers: a place where criminals can commit new kinds of crimes, where governments can exert political pressure, and where we can be hurt by the unthinking actions of the bored and careless. Making cyberspace more secure is one of the challenges of our times. This is not only (or perhaps even primarily) a technical challenge. It requires actions by governments and businesses to encourage security whenever possible, and to make sure that their own actions do not undermine it. Unfortunately, many of those in a position to do something about cybersecurity do not have the background to understand the issues fully. Cybersecurity for Everyone will help by describing the issues in a way that is accessible to anyone, but especially those from non-technical backgrounds.
This book is about the most significant developments in the field of microlearning in the teaching of programming. In particular, the book covers the creation of content and the use of microlearning activities for automatically evaluating programming assignments. These critical components of microlearning represent a significant contribution both to fulfilling individual project objectives and to improving computer programming education in general. The book is interdisciplinary, examining both computer science and education. Specific topics explored include: the development of distance courses, the creation of microcourses, the fostering of interdisciplinary knowledge spanning IT and management, and the theoretical, methodological and practical aspects of implementing microlearning. Additionally, a comprehensive analysis of the scientific literature (monographs, articles, proceedings) on the subject of the project and the research conducted is provided.
L systems are language-theoretic models for developmental biology. They were introduced in 1968 by Aristid Lindenmayer (1925-1989) and have proved to be among the most beautiful examples of interdisciplinary science, where work in one area induces fruitful ideas and results in other areas. L systems are based on relational and set-theoretic concepts, which are more suitable for the discrete and combinatorial structures of biology than mathematical models based on calculus or statistics. L systems have stimulated new work not only in the realistic simulation of developing organisms but also in the theory of automata and formal languages, formal power series, computer graphics, and combinatorics of words. This book contains research papers by almost all leading authorities and by many of the most promising young researchers in the field. The 28 contributions are organized in sections on basic L systems, computer graphics, graph grammars and map L systems, biological aspects and models, and variations and generalizations of L systems. The introductory paper by Lindenmayer and Jürgensen was written for a wide audience and is accessible to the non-specialist reader. The volume documents the state of the art in the theory of L systems and their applications. It will interest researchers and advanced students in theoretical computer science and developmental biology as well as professionals in computer graphics.
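The parallel-rewriting mechanism that defines an L system can be sketched in a few lines. The sketch below is illustrative and not taken from the book; the rules are Lindenmayer's original "algae" example (A produces AB, B produces A), and the function itself is generic:

```python
def lsystem(axiom: str, rules: dict[str, str], steps: int) -> str:
    """Rewrite every symbol of the string in parallel, `steps` times.

    Symbols without a production rule are left unchanged, which is the
    usual convention for constants in an L system.
    """
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Lindenmayer's "algae" system: A -> AB, B -> A
algae = {"A": "AB", "B": "A"}
print(lsystem("A", algae, 4))  # -> ABAABABA
```

Note that all symbols are rewritten simultaneously at each step, which is what distinguishes L systems from the sequential rewriting of ordinary grammars; the string lengths here grow as the Fibonacci numbers.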
Communication and commerce continue to connect people from different cultures through information technology. "Information Technology Ethics: Cultural Perspectives" is the single reference source to take a global approach to the diverse ethical issues evoked by information and communication technologies and their possible resolutions. The comprehensive chapters contained in this book describe the problems and possibilities of genuinely global information ethics, which are urgently needed as information and communication technologies continue their exponential growth. International experts from diverse backgrounds address both theoretical and culture-specific issues in explicit detail. This Premier Reference Source provides the most thorough examination of the information technology ethics field.
IT-Enabled Strategic Management: Increasing Returns for the Organization includes outstanding contributions from leading intellectuals, exploring the crossroads of information technology systems and strategic management. This book presents research on: specific market enhancements, information management for decision making, inter-organizational efficiencies, the importance of maintaining crucial relationships, and broader societal and global ramifications. "IT-Enabled Strategic Management: Increasing Returns for the Organization" encourages continued deliberate cross-fertilization in these critical research areas at a very strategic point in history. Although the parallel development of these fields continues, globalization and technological advances have set the stage for convergence of these fields in a number of arenas.
AI in combination with other innovative technologies promises to bring unprecedented opportunities to all aspects of life. These technologies, however, hold great dangers, especially for the manipulation of the human mind, which have given rise to serious ethical concerns. Apart from some sectoral regulatory efforts to address these concerns, no regulatory framework for AI has yet been adopted, though in 2021 the European Commission published a draft Act on Artificial Intelligence and UNESCO followed suit with a Recommendation on the Ethics of Artificial Intelligence. The book contextualises the future regulation of AI, specifically addressing the regulatory challenges relating to the planned prohibition of the use of AI systems that deploy subliminal techniques. The convergence of AI with various related technologies, such as brain-computer interfaces, functional magnetic resonance imaging, robotics and big data, already allows for "mind reading" or "dream hacking" through brain spyware, as well as other practices that intrude on cognition and the right to freedom of thought. Future innovations will enhance the possibilities for manipulating thoughts and behaviour, and they threaten to cause serious harm to individuals as well as to society as a whole. The issue of subliminal perception and the ability to deceive and manipulate the mind below the threshold of awareness causes severe difficulties for law and democracy and raises important questions for the future of society. This book shows how cognitive, technological, and legal questions are intrinsically interwoven, and aims to stimulate an urgently needed transdisciplinary and transnational debate between students, academics, practitioners, policymakers and citizens interested not only in the law but also in disciplines including computer science, neuroscience, sociology, political science, marketing and psychology.
The Lean Approach to Digital Transformation: From Customer to Code and From Code to Customer is organized into three parts that expose and develop the three capabilities essential for a successful digital transformation:
1. Understanding how to co-create digital services with users, whether they are customers or future customers. This ability combines observation, dialogue, and iterative experimentation. The approach proposed in this book is based on the Lean Startup approach, in an extended vision that combines Design Thinking and Growth Hacking. Companies must become truly "customer-centric", from observation and listening through to co-development. The revolution of the 21st-century digital age is that customer orientation is both more imperative -- because of abundance, the pace of change in usage, the complexity of experiences, and the shift of power towards communities -- and easier to achieve, thanks to digital tools and digital communities.
2. Developing an information system (IS) that is the backbone of the digital transformation -- an "exponential information system": an IS that is open (in particular at its borders), capable of interfacing and combining with external services, positioned as a player in software ecosystems, and built for processing scalable and dynamic data flows. The exponential information system is constantly changing and continuously absorbs the best of information processing technology, such as Artificial Intelligence and Machine Learning.
3. Building software "micro-factories" that produce service platforms, which are called "Lean software factories." This "software factory" concept covers the integration of agile methods, tooling and continuous integration and deployment practices, a customer-oriented product approach, and a platform approach based on modularity, as well as API-based architecture and openness to external stakeholders.
This software micro-factory is the foundation that continuously produces and provides constantly evolving services. These three capabilities are not unique or specific to this book; they are linked to other concepts such as agile methods, product development according to lean principles, and software production approaches such as CI/CD (continuous integration and deployment) or DevOps. This book weaves a common frame of reference for all these approaches, to derive more value from the digital transformation and to facilitate its implementation. The title of the book refers to the "lean approach to digital transformation" because the two underlying frameworks, Lean Startup and Lean Software Factory, are directly inspired by Lean in the sense of the Toyota Way. The Lean approach is present from the beginning to the end of this book -- it provides the framework for customer orientation and the love of a job well done, which are the conditions for the success of a digital transformation.
Optical media are now widely used in telecommunication networks, and the evolution of optical and optoelectronic technologies suggests that their wide range of techniques could be successfully introduced in shorter-distance interconnection systems. This book bridges the existing gap between research in optical interconnects and research in high-performance computing and communication systems, of which parallel processing is just an example. It also provides a more comprehensive understanding of the advantages and limitations of optics as applied to high-speed communications. Audience: The book will be a vital resource for researchers and graduate students of optical interconnects, computer architectures and high-performance computing and communication systems who wish to understand the trends in the newest technologies, models and communication issues in the field.
This "hands-on" book is for people who are interested in immediately putting Maple to work. The reader is provided with a compact, fast and surveyable guide that introduces them to the extensive capabilities of the software. The book is sufficient for standard use of Maple and will provide techniques for extending Maple for more specialized work. The author discusses the reliability of results systematically and presents ways of testing questionable results. The book allows a reader to become a user almost immediately and helps him/her to grow gradually to a broader and more proficient use. As a consequence, some subjects are dealt with in an introductory way early in the book, with references to a more detailed discussion later on.
Technology is essential for access to learning and development of a knowledge society. Cases on Interactive Technology Environments and Transnational Collaboration: Concerns and Perspectives provides a comparative and comprehensive analysis of technologically enabled educational environments and various issues concerning education and collaborations across the world while also focusing on best practices and experiences from a varied range of countries.
Containing practical tools and proven techniques for managing virtual teams for optimum performance, this text uses real-world examples and specific guidelines to show the reader how distributed work groups can be even more productive, effective and flexible than traditional co-located teams. The book looks at communication, control, monitoring, team building, cultural differences, and legal and process issues. There are step-by-step procedures for "transitioning teams", and tips on using high-speed networks and groupware as tools for solving problems. The accompanying software serves as an assessment tool to help make the distributed team more effective.
In today's increasingly complex and uncertain business environment, financial analysis is ever more critical to business managers who tackle problems of an economic or business nature. Knowledge based on formal logic and even experience becomes less sufficient. This volume systematically sets out the basic elements on which to base financial analysis for business in the new century. It incorporates a previous work that can serve as the basis and foundation for the new contributions that are now being made in the field of financial economy, and aims to provide business with instruments and models suitable for dealing with the new economic context. In dealing with rapid and unpredictable changes in technological and business conditions, it postulates a growing reliance on the opinions of experts instead of past data or probabilistic forecasts, which is a radical change but may yield fruitful results. For this reason, much emphasis is devoted to the problem of aggregating the opinions of experts in the financial field, with the object of limiting, wherever possible, the subjective component of the opinions and making sure that the decisions have the best guarantee of reaching the desired objectives.
An accessible introduction to probability, stochastic processes, and statistics for computer science and engineering applications. This updated and revised edition of the popular classic relates fundamental concepts in probability and statistics to the computer sciences and engineering. The author uses Markov chains and other statistical tools to illustrate processes in reliability of computer systems and networks, fault tolerance, and performance. This edition features an entirely new section on stochastic Petri nets, as well as new sections on system availability modeling, wireless system modeling, numerical solution techniques for Markov chains, and software reliability modeling, among other subjects. Extensive revisions take new developments in solution techniques and applications into account and bring this work totally up to date. It includes more than 200 worked examples and self-study exercises for each section. Probability and Statistics with Reliability, Queuing and Computer Science Applications, Second Edition offers a comprehensive introduction to probability, stochastic processes, and statistics for students of computer science, electrical and computer engineering, and applied mathematics. Its wealth of practical examples and up-to-date information makes it an excellent resource for practitioners as well.
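The kind of availability modeling the blurb mentions can be illustrated with the simplest possible Markov chain: a system that fails at rate lam and is repaired at rate mu, whose long-run availability has the closed form mu / (lam + mu). This is a generic textbook sketch, not an example taken from the book:

```python
def steady_state_availability(lam: float, mu: float) -> float:
    """Long-run fraction of time a two-state (up/down) Markov model
    spends in the 'up' state, given failure rate lam and repair rate mu
    (both in events per unit time)."""
    return mu / (lam + mu)

# A machine that fails on average once per 1000 hours and takes
# 10 hours to repair (rates are the reciprocals of those times):
A = steady_state_availability(lam=1 / 1000, mu=1 / 10)
print(f"{A:.4f}")  # -> 0.9901
```

The closed form follows from balance between the two states: in steady state the flow up-to-down (p_up * lam) equals the flow down-to-up (p_down * mu), and the two probabilities sum to one.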
Designing Secure IoT devices with the Arm Platform Security Architecture and Cortex-M33 explains how to design and deploy secure IoT devices based on the Cortex-M23/M33 processor. The book is split into three parts. First, it introduces the Cortex-M33 and its architectural design and major processor peripherals. Second, it shows how to design secure software and secure communications to minimize the threat of both hardware and software hacking. And finally, it examines common IoT cloud systems and how to design and deploy a fleet of IoT devices. Example projects are provided for the Keil MDK-ARM and NXP LPCXpresso tool chains. Since their inception, microcontrollers have been designed as functional devices with a CPU, memory and peripherals that can be programmed to accomplish a huge range of tasks. With the growth of internet connected devices and the Internet of Things (IoT), "plain old microcontrollers" are no longer suitable as they lack the features necessary to create both a secure and functional device. The recent development by Arm of the Cortex-M23 and Cortex-M33 architectures is intended for today's IoT world.
This book approaches economic problems from a systems thinking and feedback perspective. By introducing system dynamics methods (including qualitative and quantitative techniques) and computer simulation models, the respective contributions apply feedback analysis and dynamic simulation modeling to important local, national, and global economic issues and concerns. Topics covered include: an introduction to macro modeling using a system dynamics framework; a system dynamics translation of the Phillips machine; a re-examination of classical economic theories from a feedback perspective; analyses of important social, ecological, and resource issues; the development of a biophysical economics module for global modelling; contributions to monetary and financial economics; analyses of macroeconomic growth, income distribution and alternative theories of well-being; and a re-examination of scenario macro modeling. The contributions also examine the philosophical differences between the economics and system dynamics communities in an effort to bridge existing gaps and compare methods. Many models and other supporting information are provided as online supplementary files. Consequently, the book appeals to students and scholars in economics, as well as to practitioners and policy analysts interested in using systems thinking and system dynamics modeling to understand and improve economic systems around the world. "Clearly, there is much space for more collaboration between the advocates of post-Keynesian economics and system dynamics! More generally, I would like to recommend this book to all scholars and practitioners interested in exploring the interface and synergies between economics, system dynamics, and feedback thinking." Comments in the Foreword by Marc Lavoie, Emeritus Professor, University of Ottawa and University of Sorbonne Paris Nord
Positive change in society depends highly on a variety of innovative technologies. Sustainability and transformation of the knowledge society helps create developing nations that can survive during times of global turbulence. Sustainable Economic Development and the Influence of Information Technologies: Dynamics of Knowledge Society Transformation provides relevant theoretical frameworks and the latest empirical research findings in the area of information technology as it relates to sustainable economic development and the development of knowledge societies. This innovative publication highlights the influence of information technologies and the significance of the knowledge society on economic development in the 21st century.
This book discusses various applications of machine learning using a new approach, the dynamic wavelet fingerprint technique, to identify features for machine learning and pattern classification in time-domain signals. Whether for medical imaging or structural health monitoring, it develops analysis techniques and measurement technologies for the quantitative characterization of materials, tissues and structures by non-invasive means. Intelligent Feature Selection for Machine Learning using the Dynamic Wavelet Fingerprint begins by providing background information on machine learning and the wavelet fingerprint technique. It then progresses through six technical chapters, applying the methods discussed to particular real-world problems. These chapters are presented in such a way that they can be read on their own, depending on the reader's area of interest, or read together to provide a comprehensive overview of the topic. Given its scope, the book will be of interest to practitioners, engineers and researchers seeking to leverage the latest advances in machine learning in order to develop solutions to practical problems in structural health monitoring, medical imaging, autonomous vehicles, wireless technology, and historical conservation.
This book discusses the study and analysis of the physical aspects of social systems and models, inspired by the analogy with familiar models of physical systems and possible applications of statistical physics tools. Unlike the traditional analysis of the physics of macroscopic many-body or condensed matter systems, which is now an established and mature subject, the upsurge in the physical analysis and modelling of social systems, which are clearly many-body dynamical systems, is a recent phenomenon. Though the major developments in sociophysics have taken place only recently, the earliest attempts of proposing "Social Physics" as a discipline are more than one and a half centuries old. Various developments in the mainstream physics of condensed matter systems have inspired and induced the recent growth of sociophysical analysis and models. In spite of the tremendous efforts of many scientists in recent years, the subject is still in its infancy and major challenges are yet to be taken up. An introduction to these challenges is the main motivation for this book.
The theory of parsing is an important application area of the theory of formal languages and automata. The evolution of modern high-level programming languages created a need for a general and theoretically clean methodology for writing compilers for these languages. It was perceived that the compilation process had to be "syntax-directed," that is, the functioning of a programming language compiler had to be defined completely by the underlying formal syntax of the language. A program text to be compiled is "parsed" according to the syntax of the language, and the object code for the program is generated according to the semantics attached to the parsed syntactic entities. Context-free grammars were soon found to be the most convenient formalism for describing the syntax of programming languages, and accordingly methods for parsing context-free languages were developed. Practical considerations led to the definition of various kinds of restricted context-free grammars that are parsable by means of efficient deterministic linear-time algorithms.
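The deterministic linear-time, syntax-directed processing described above can be sketched with a tiny LL(1) recursive-descent parser: each rule of a restricted context-free grammar becomes one function, and the "semantics" attached to the parsed entities here simply evaluates the expression. This is an illustrative sketch of the general technique, not an example from the book:

```python
# Grammar (LL(1)):  E -> T ('+' T)*   T -> F ('*' F)*   F -> digit | '(' E ')'
def parse(src: str) -> int:
    pos = 0

    def peek() -> str:
        return src[pos] if pos < len(src) else ""

    def eat(ch: str) -> None:
        nonlocal pos
        assert peek() == ch, f"expected {ch!r} at position {pos}"
        pos += 1

    def factor() -> int:               # F -> digit | '(' E ')'
        if peek() == "(":
            eat("(")
            v = expr()
            eat(")")
            return v
        d = peek()
        eat(d)
        return int(d)

    def term() -> int:                 # T -> F ('*' F)*
        v = factor()
        while peek() == "*":
            eat("*")
            v *= factor()
        return v

    def expr() -> int:                 # E -> T ('+' T)*
        v = term()
        while peek() == "+":
            eat("+")
            v += term()
        return v

    result = expr()
    assert pos == len(src), "trailing input"
    return result

print(parse("2+3*(4+1)"))  # -> 17
```

Each call site needs only one token of lookahead (`peek`) to choose a production, which is exactly the property that makes such grammars parsable in a single deterministic left-to-right pass.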
This book presents a comprehensive study covering the design and application of models and algorithms for assessing the joint device failures of telecommunication backbone networks caused by large-scale regional disasters. At first, failure models are developed to make use of the best data available; in turn, a set of fast algorithms for determining the resulting failure lists are described; further, a theoretical analysis of the complexity of the algorithms and the properties of the failure lists is presented, and relevant practical case studies are investigated. Merging concepts and tools from complexity theory, combinatorial and computational geometry, and probability theory, a comprehensive set of models is developed for translating the disaster hazard into informative yet concise data structures. The information available on the network topology and the disaster hazard is then used to calculate the possible (probabilistic) network failures. The resulting sets of resources that are expected to break down simultaneously are modeled as a collection of Shared Risk Link Groups (SRLGs), or Probabilistic SRLGs. Overall, this book presents improved theoretical methods that can help predict disaster-caused network malfunctions, identify vulnerable regions, and precisely assess the availability of internet services, among other applications.
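A minimal toy version of the SRLG construction can make the idea concrete: model a regional disaster as a disk, model each link as a straight segment between its endpoints, and take the SRLG to be the set of links the disk intersects. This is a simplified sketch for illustration only; the book's models are far richer (probabilistic hazards, realistic topologies, and fast enumeration algorithms):

```python
import math

def seg_dist(p, a, b) -> float:
    """Euclidean distance from point p to the segment from a to b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0:                       # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Clamp the projection of p onto the segment to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def srlg(links, center, radius) -> set:
    """The Shared Risk Link Group of a disk-shaped disaster: every link
    whose segment comes within `radius` of `center` fails together."""
    return {name for name, (a, b) in links.items()
            if seg_dist(center, a, b) <= radius}

# A triangle network with hypothetical coordinates:
links = {"AB": ((0, 0), (4, 0)), "AC": ((0, 0), (0, 4)), "BC": ((4, 0), (0, 4))}
print(sorted(srlg(links, center=(2, 0.5), radius=1.2)))  # -> ['AB', 'BC']
```

Enumerating such sets over a family of candidate disaster disks yields the failure lists the book studies; the probabilistic SRLGs additionally attach an occurrence probability to each list.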
During the reception of a piece of information, we are never passive. Depending on its origin and content, and in light of our personal beliefs and convictions, we bestow upon this piece of information, spontaneously or after reflection, a certain amount of confidence. Too much confidence shows a degree of naivety, whereas an absolute lack of it condemns us as being paranoid. These two attitudes are symmetrically detrimental, not only to the proper perception of this information but also to its use. Beyond these two extremes, each person generally adopts an intermediate position when faced with the reception of information, depending on its provenance and credibility. We still need to understand and explain how these judgements are formed, in what context and to what end. Spanning the approaches offered by philosophy, military intelligence, algorithmics and information science, this book presents the concepts of information and the confidence placed in it, the methods that militaries, the first to be aware of the need, have or should have adopted, the tools that can help them, and the prospects that these have opened up. Beyond the military context, the book reveals ways to evaluate information for the good of other fields such as economic intelligence and, more globally, informational monitoring by governments and businesses. Contents: 1. Information: Philosophical Analysis and Strategic Applications, Mouhamadou El Hady Ba and Philippe Capet. 2. Epistemic Trust, Gloria Origgi. 3. The Fundamentals of Intelligence, Philippe Lemercier. 4. Information Evaluation in the Military Domain: Doctrines, Practices and Shortcomings, Philippe Capet and Adrien Revault d'Allonnes. 5. Multidimensional Approach to Reliability Evaluation of Information Sources, Frederic Pichon, Christophe Labreuche, Bertrand Duqueroie and Thomas Delavallade. 6. Uncertainty of an Event and its Markers in Natural Language Processing, Mouhamadou El Hady Ba, Stephanie Brizard, Tanneguy Dulong and Benedicte Goujon. 7. Quantitative Information Evaluation: Modeling and Experimental Evaluation, Marie-Jeanne Lesot, Frederic Pichon and Thomas Delavallade. 8. When Reported Information Is Second Hand, Laurence Cholvy. 9. An Architecture for the Evolution of Trust: Definition and Impact of the Necessary Dimensions of Opinion Making, Adrien Revault d'Allonnes. About the Authors: Philippe Capet is a project manager and research engineer at Ektimo, working mainly on information management and control in military contexts. Thomas Delavallade is an advanced studies engineer at Thales Communications & Security, working on social media mining in the context of crisis management, cybersecurity and the fight against cybercrime.