This edited text draws together the insights of numerous eminent academics from around the world to evaluate the condition of predictive policing and artificial intelligence (AI) as interlocked policy areas. Predictive and AI technologies are growing in prominence and at an unprecedented rate. Powerful digital crime mapping tools are being used to identify crime hotspots in real time, as pattern-matching and search algorithms sort through huge police databases populated by growing volumes of data in an effort to identify people liable to experience (or commit) crime, places likely to host it, and variables associated with its solvability. Facial and vehicle recognition cameras are locating criminals as they move, while police services develop strategies informed by machine learning and other kinds of predictive analytics. Many of these innovations are features of modern policing in the UK, the US and Australia, among other jurisdictions. AI promises to reduce unnecessary labour, speed up various forms of police work, encourage police forces to apportion their resources more efficiently, and enable police officers to prevent crime and protect people from a variety of future harms. However, the promises of predictive and AI technologies and innovations do not always match reality. They often have significant weaknesses, come at a considerable cost and require challenging trade-offs to be made. Focusing on the UK, the US and Australia, this book explores themes of choice architecture, decision-making, human rights, accountability and the rule of law, as well as future uses of AI and predictive technologies in various policing contexts. The text contributes to ongoing debates on the benefits and biases of predictive algorithms, big data sets, machine learning systems, and broader policing strategies and challenges.
Written in a clear and direct style, this book will appeal to students and scholars of policing, criminology, crime science, sociology, computer science, cognitive psychology and all those interested in the emergence of AI as a feature of contemporary policing.
Parallel and distributed computing is one of the foremost technologies for shaping future research and development activities in academia and industry. New Horizons of Parallel and Distributed Computing is a collection of self-contained chapters written by pioneering researchers to provide solutions for newly emerging problems in this field. This volume will not only provide novel ideas, work in progress and state-of-the-art techniques in the field, but will also stimulate future research activities in the area of parallel and distributed computing with applications. New Horizons of Parallel and Distributed Computing is intended for industry researchers and developers, as well as for academic researchers and advanced-level students in computer science and electrical engineering. A valuable reference work, it is also suitable as a textbook.
Hybrid dynamical systems, which combine continuous and discrete dynamics and variables, have attracted considerable interest recently. This emerging area lies at the interface of control theory and computer engineering, focusing on the analogue and digital aspects of systems and devices, and is essential for advances in modern digital-controller technology. "Qualitative Theory of Hybrid Dynamical Systems" provides a thorough development and systematic presentation of the foundations and framework for hybrid dynamical systems. The presentation offers an accessible, but precise, development of the mathematical models, conditions for the existence of limit cycles, and criteria for their stability. The book largely concentrates on the case of discretely controlled continuous-time systems and their relevance for modeling aspects of flexible manufacturing systems and dynamically routed queuing networks. Features and topics:
* differential automata
* development and use of the concept of "cyclic linear differential automata" (CLDA)
* coverage of switched single-server flow networks
* application to specific models of manufacturing systems and queuing networks
* a select collection of open problems for the subject
* self-contained presentation of topics, with the necessary background
This book is an excellent resource for the study and analysis of hybrid dynamical systems used in systems and control engineering. Researchers, postgraduates and professionals in control engineering and computer engineering will find the book an up-to-date development of the relevant new concepts and tools.
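The idea of a discretely controlled continuous-time system settling into a limit cycle can be made concrete with a minimal sketch (ours, not the book's): a thermostat with hysteresis, in which the temperature evolves continuously while a discrete heater mode switches at two thresholds. All constants here are illustrative assumptions.

```python
# Sketch of a hybrid system: continuous dynamics plus a discrete mode.
# Trajectories settle into a limit cycle between the two thresholds.

def simulate(T0=15.0, lo=19.0, hi=21.0, dt=0.01, steps=20000):
    T, heater = T0, True
    trace = []
    for _ in range(steps):
        # continuous dynamics: Newton cooling toward 10 C, plus heater power
        dT = -0.1 * (T - 10.0) + (2.0 if heater else 0.0)
        T += dT * dt
        # discrete switching logic (the "automaton" part of the hybrid system)
        if heater and T >= hi:
            heater = False
        elif not heater and T <= lo:
            heater = True
        trace.append(T)
    return trace

trace = simulate()
tail = trace[len(trace) // 2:]   # after the initial transient dies out
print(min(tail), max(tail))      # oscillates roughly between lo and hi
```

Whatever the starting temperature, the trajectory is drawn into the same periodic orbit; proving existence and stability of such cycles for broader classes of systems is the kind of question the book addresses.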
The advancement of technology in today's world has led to the progression of several professional fields. This includes the classroom, as teachers have begun using new technological strategies to increase student involvement and motivation. ICT innovation, including virtual reality and blended learning methods, has changed the scope of classroom environments across the globe; however, significant research is lacking in this area. ICTs and Innovation for Didactics of Social Sciences is a fundamental reference focused on the didactics of social sciences and on ICTs, including issues related to innovation, resources, and strategies for teachers that can link to the transformation of social sciences teaching and learning as well as societal transformation. While highlighting topics such as blended learning, augmented reality, and virtual classrooms, this book is ideally designed for researchers, administrators, educators, practitioners, and students interested in understanding current ICT resources and innovative strategies for the didactics of social sciences, and in didactic possibilities in relation to concrete conceptual contents, problem solving, planning, decision making, the development of social skills, and attention and motivation, promoting a necessary technological literacy.
Today more than 90% of all programmable processors are employed in embedded systems. This number is actually not surprising, considering that in a typical home you might find one or two PCs equipped with high-performance standard processors, and probably dozens of embedded systems, including electronic entertainment, household, and telecom devices, each of them equipped with one or more embedded processors. The question arises why programmable processors are so popular in embedded system design. The answer lies in the fact that they help to narrow the gap between chip capacity and designer productivity. Embedded processor cores are one step further towards improved design reuse, just along the lines of standard cells in logic synthesis and macrocells in RTL synthesis in earlier times of IC design. Additionally, programmable processors make it possible to migrate functionality from hardware to software, resulting in a further improved reuse factor as well as greatly increased flexibility. The LISA processor design platform (LPDP) presented in Architecture Exploration for Embedded Processors with LISA addresses recent design challenges and results in highly satisfactory solutions. The LPDP covers all major high-level phases of embedded processor design and is capable of automatically generating almost all required software development tools from processor models in the LISA language. It supports a profiling-based, stepwise refinement of processor models down to cycle-accurate and even RTL synthesis models. Moreover, it elegantly avoids model inconsistencies otherwise omnipresent in traditional design flows. The next step in design reuse is already in sight: SoC platforms, i.e., partially pre-designed multi-processor templates that can be quickly tuned towards given applications, thereby guaranteeing a high degree of hardware/software reuse in system-level design. Consequently, the LPDP approach goes even beyond processor architecture design.
The LPDP solution explicitly addresses SoC integration issues by offering comfortable APIs for external simulation environments as well as clever solutions for the problem of both efficient and user-friendly heterogeneous multiprocessor debugging.
This title features expert advice from a professional UK photographer, useful hints & tips plus clear diagrams & charts. This new series of "The Expanded Guides" focuses on photographic techniques to give you a comprehensive grounding in the subject and takes you a step further to enable you to get much more from your photography. This is an invaluable guide to taking better photographs using today's sophisticated digital SLR and compact digital cameras. Aimed at both the novice and more experienced amateur photographer, jargon-free text explains the theory behind digital photography, how light metering affects exposure and light's relationship to colour, colour temperature and white balance, focal points and the expression of mood. Aperture, depth of field and shutter speed are also thoroughly covered, along with chapters on ISO speeds, dynamic range, use of filters and making in-camera adjustments. Post processing techniques round off this invaluable guide to getting the best results from your photography.
This book aims at providing a view of the current trends in the development of research on Synthesis and Control of Discrete Event Systems. Papers collected in this volume are based on a selection of talks given in June and July 2001 at two independent meetings: the Workshop on Synthesis of Concurrent Systems, held in Newcastle upon Tyne as a satellite event of ICATPN/ICACSD and organized by Ph. Darondeau and L. Lavagno, and the Symposium on the Supervisory Control of Discrete Event Systems (SCODES), held in Paris as a satellite event of CAV and organized by B. Caillaud and X. Xie. Synthesis is a generic term that covers all procedures aiming to construct, from specifications given as input, objects matching these specifications. Theories and applications of synthesis have long been studied and developed in connection with logics, programming, automata, discrete event systems, and hardware circuits. Logics and programming are outside the scope of this book, whose focus is on Discrete Event Systems and Supervisory Control. The stress today in this field is on a better applicability of theories and algorithms to practical systems design. Coping with decentralization or distribution and caring for an efficient realization of the synthesized systems or controllers are of the utmost importance in areas as diverse as the supervision of embedded or manufacturing systems, or the implementation of protocols in software or in hardware.
Identifying Emerging Trends in Technological Innovation: Doctoral programs in science and engineering are important sources of innovative ideas and techniques that might lead to new products and technological innovation. Certainly, most PhD students are not experienced researchers and are in the process of learning how to do research. Nevertheless, a number of empirical studies also show that a high number of technological innovation ideas are produced in the early careers of researchers. The combination of the eagerness of young doctoral students to try new approaches and directions with the experience and broad knowledge of their supervisors is likely to result in an important pool of innovation potential. The DoCEIS doctoral conference on Computing, Electrical and Industrial Engineering aims at creating a space for sharing and discussing ideas and results from doctoral research in these inter-related areas of engineering. Innovative ideas and hypotheses can be better enhanced when presented and discussed in an encouraging and open environment. DoCEIS aims to provide such an environment, releasing PhD students from the pressure of presenting their propositions in more formal contexts.
This volume is a how-to guide to the use of computers in library-based adult literacy programs. Since the commitment to literacy training has become an integral part of libraries' efforts to offer equal access to information, Linda Main and Char Whitaker provide a comprehensive study of the efficacious role the computer can play in achieving this objective. The problems and successes associated with the introduction of computers into library literacy programs, as well as financial requirements, space, furniture, training, and the effect on other library operations are central to the study. The text also features a design for an ideal computerized literacy lab, an overview of compatible software, both existing and proposed, and a look at the rewards and challenges facing librarians, professional educators, and literacy program directors in the future. Appendixes provide country-wide information on libraries currently involved in automating literacy, main suppliers of literacy software, and consulting personnel.
Whilst Information Systems has the potential to widen our view of the world, it often has the opposite effect by limiting our ability to interact, facilitating managerial and state surveillance or instituting strict hierarchies and personal control. In this book, Bernd Stahl offers an alternative and critical perspective on the subject, arguing that the ongoing problems in this area could be caused by the misconceptualization of the nature and role of IS. Stahl discusses the question of how IS can be used to actually overcome oppression and promote emancipation, breaking the book into four sections. The first section covers the theory of critical research in IS, giving a central place for the subject of ethics. The second section discusses the philosophical underpinnings of this critical research. The third and largest section gives examples of the application of critical work in IS. The final section then reflects on the approach and suggests ways for further development.
Images have always been very important in human life. Their applications range from primitive communication between humans of all ages to advanced technologies in the industrial, medical and military fields. The increased possibilities to capture and analyze images have helped the scientific field of "image processing" grow into the large discipline it is today. Many techniques are being applied, including soft computing. "Soft Computing in Image Processing: Recent Advances" follows the edited volumes "Fuzzy Techniques in Image Processing" (volume 52, published in 2000) and "Fuzzy Filters for Image Processing" (volume 122, published in 2003), and covers a wide range of both practical and theoretical applications of soft computing in image processing. The 16 excellent chapters of the book have been grouped into five parts: Applications in Remote Sensing, Applications in Image Retrieval, Applications in Image Analysis, Other Applications, and Theoretical Contributions. The focus of the book is on practical applications, which makes it interesting for every researcher who is involved with soft computing, image processing, or both scientific branches.
Tabu Search (TS) and, more recently, Scatter Search (SS) have proved highly effective in solving a wide range of optimization problems, and have had a variety of applications in industry, science, and government. The goal of Metaheuristic Optimization via Memory and Evolution: Tabu Search and Scatter Search is to report original research on algorithms and applications of tabu search, scatter search or both, as well as variations and extensions having "adaptive memory programming" as a primary focus. Individual chapters identify useful new implementations or new ways to integrate and apply the principles of TS and SS, or that prove new theoretical results, or describe the successful application of these methods to real world problems.
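The "adaptive memory" idea at the heart of tabu search can be illustrated with a minimal sketch (ours, not from the book): a local search over bit vectors that keeps a short-term memory of recently flipped positions so it does not immediately undo its own moves, with the standard aspiration criterion of overriding the memory when a move beats the best solution found so far. The objective function and all parameters are illustrative assumptions.

```python
# Minimal tabu search sketch: flip one bit per move, remember recent
# flips in a tabu list, and allow tabu moves only if they improve on
# the best solution seen so far (aspiration criterion).
import random

def tabu_search(f, n_bits, iters=200, tenure=5, seed=0):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best, best_val = x[:], f(x)
    tabu = {}                      # bit index -> iteration until which it is tabu
    for it in range(iters):
        # evaluate all single-bit-flip neighbours, best first
        moves = []
        for i in range(n_bits):
            y = x[:]
            y[i] ^= 1
            moves.append((f(y), i, y))
        moves.sort(key=lambda m: m[0])
        for val, i, y in moves:
            # accept the best non-tabu move, or a tabu move that sets a record
            if tabu.get(i, -1) < it or val < best_val:
                x = y
                tabu[i] = it + tenure
                if val < best_val:
                    best, best_val = y[:], val
                break
    return best, best_val

# toy objective: number of bits disagreeing with a hidden target vector
target = [1, 0, 1, 1, 0, 0, 1, 0]
f = lambda x: sum(a != b for a, b in zip(x, target))
sol, val = tabu_search(f, 8)
print(sol, val)   # recovers the target with objective value 0
```

The tabu list is exactly the short-term "adaptive memory" the blurb refers to; scatter search replaces this single-trajectory memory with a maintained reference set of diverse high-quality solutions that are recombined.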
The emergence and widespread use of personal computers and network technologies have seen the development of interest in the use of computers to support cooperative work. This volume presents the proceedings of the tenth European conference on Computer Supported Cooperative Work (CSCW). This is a multidisciplinary area that embraces the development of new technologies grounded in actual cooperative practices. These proceedings contain a collection of papers addressing novel interaction technologies for CSCW systems, new models and architectures for groupware systems, studies of communication and coordination among mobile actors, studies of cooperative work in complex settings, studies of groupware systems in actual use in real-world settings, and theories and techniques to support the development of cooperative applications. The papers present emerging technologies alongside new methods and approaches to the development of this important class of applications.
Document Processing and Retrieval: TEXPROS focuses on the design and implementation of a personal, customizable office information and document processing system called TEXPROS (a TEXt PROcessing System). TEXPROS is a personal, intelligent office information and document processing system for text-oriented documents. This system supports the storage, classification, categorization, retrieval and reproduction of documents, as well as extracting, browsing, retrieving and synthesizing information from a variety of documents. When using TEXPROS in a multi-user or distributed environment, it requires specific protocols for extracting, storing, transmitting and exchanging information. The authors have used a variety of techniques to implement TEXPROS, such as Object-Oriented Programming, Tcl/Tk, X-Windows, etc. The system can be used for many different purposes in many different applications, such as digital libraries, software documentation and information delivery. Audience: Provides in-depth, state-of-the-art coverage of information processing and retrieval, and documentation for such professionals as database specialists, information systems and software developers, and information providers.
From the Foreword..... Modern digital signal processing applications provide a large challenge to the system designer. Algorithms are becoming increasingly complex, and yet they must be realized with tight performance constraints. Nevertheless, these DSP algorithms are often built from many constituent canonical subtasks (e.g., IIR and FIR filters, FFTs) that can be reused in other subtasks. Design is then a problem of composing these core entities into a cohesive whole to provide both the intended functionality and the required performance. In order to organize the design process, there have been two major approaches. The top-down approach starts with an abstract, concise, functional description which can be quickly generated. On the other hand, the bottom-up approach starts from a detailed low-level design where performance can be directly assessed, but where the requisite design and interface detail take a long time to generate. In this book, the authors show a way to effectively resolve this tension by retaining the high-level conciseness of VHDL while parameterizing it to get good fit to specific applications through reuse of core library components. Since they build on a pre-designed set of core elements, accurate area, speed and power estimates can be percolated to high-level design routines which explore the design space. Results are impressive, and the cost model provided will prove to be very useful. Overall, the authors have provided an up-to-date approach, doing a good job at getting performance out of high-level design. The methodology provided makes good use of extant design tools, and is realistic in terms of the industrial design process. The approach is interesting in its own right, but is also of direct utility, and it will give the existing DSP CAD tools a highly competitive alternative.
The techniques described have been developed within ARPA's RASSP (Rapid Prototyping of Application Specific Signal Processors) project, and should be of great interest there, as well as to many industrial designers. Professor Jonathan Allen, Massachusetts Institute of Technology
Field-Programmable Gate Arrays (FPGAs) have emerged as an attractive means of implementing logic circuits, providing instant manufacturing turnaround and negligible prototype costs. They hold the promise of replacing much of the VLSI market now held by mask-programmed gate arrays. FPGAs offer an affordable solution for customized VLSI, over a wide variety of applications, and have also opened up new possibilities in designing reconfigurable digital systems. Field-Programmable Gate Arrays discusses the most important aspects of FPGAs in a textbook manner. It provides the reader with a focused view of the key issues, using a consistent notation and style of presentation. It provides detailed descriptions of commercially available FPGAs and an in-depth treatment of the FPGA architecture and CAD issues that are the subjects of current research. The material presented is of interest to a variety of readers, including those who are not familiar with FPGA technology, but wish to be introduced to it, as well as those who already have an understanding of FPGAs, but who are interested in learning about the research directions that are of current interest.
* Provides evidence, examples, and explanation of the developing tactics, illustrated recently in politics in particular, of embedding internal saboteurs bent on dismantling their own institutions from within
* Presents numerous case studies to examine instances of insider compromises, including the circumstances and warning signs that led to events
* Outlines solutions on how to train organizations and individuals to recognize, report, mitigate, and deter insider threats
In probability and statistics we often have to estimate probabilities and parameters in probability distributions using a random sample. Instead of using a point estimate calculated from the data we propose using fuzzy numbers which are constructed from a set of confidence intervals. In probability calculations we apply constrained fuzzy arithmetic because probabilities must add to one. Fuzzy random variables have fuzzy distributions. A fuzzy normal random variable has the normal distribution with fuzzy number mean and variance. Applications are to queuing theory, Markov chains, inventory control, decision theory and reliability theory.
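The construction the blurb describes, a fuzzy number built by stacking confidence intervals, can be sketched as follows (our illustration, not the book's code; it assumes a normal-based two-sided interval for a mean with known-form standard error). Each confidence level 1 - alpha contributes an interval that serves as the alpha-cut of the fuzzy estimator, and the point estimate is the core at alpha = 1.

```python
# Sketch: a fuzzy estimator of a mean as a stack of confidence intervals.
# Lower alpha -> higher confidence -> wider cut; alpha = 1 is the point estimate.
from statistics import NormalDist, mean, stdev

def fuzzy_mean(sample, alphas=(0.01, 0.2, 0.5, 0.8, 1.0)):
    """Return {alpha: (lo, hi)} alpha-cuts of the fuzzy estimate of the mean."""
    n = len(sample)
    m, s = mean(sample), stdev(sample)
    cuts = {}
    for a in alphas:
        if a >= 1.0:
            cuts[a] = (m, m)                        # the core: the point estimate
        else:
            z = NormalDist().inv_cdf(1 - a / 2)     # two-sided critical value
            half = z * s / n ** 0.5
            cuts[a] = (m - half, m + half)
    return cuts

cuts = fuzzy_mean([4.1, 5.0, 4.7, 5.3, 4.9, 5.1, 4.8, 5.2])
for a, (lo, hi) in sorted(cuts.items()):
    print(f"alpha={a:.2f}: [{lo:.3f}, {hi:.3f}]")
```

The nested cuts trace out the triangular-shaped membership function of the fuzzy estimate; arithmetic on such fuzzy probabilities then has to be constrained, as the blurb notes, so that probabilities still add to one.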
The book presents topics in discrete biomathematics. Mathematics has been widely used in modeling biological phenomena. However, the molecular and discrete nature of basic life processes suggests that their logic follows principles that are intrinsically based on discrete and informational mechanisms. The ultimate rationale of polymers, as key elements of life, is directly based on the computational power of strings, and the intrinsic necessity of metabolism is related to the mathematical notion of a multiset. Switching the two roots of bioinformatics suggests a change of perspective. In bioinformatics, the biologists ask computer scientists to assist them in processing biological data. Conversely, in infobiotics mathematicians and computer scientists investigate principles and theories yielding new interpretation keys of biological phenomena. Life is too important to be investigated by biologists alone, and though computers are essential to process data from biological laboratories, many fundamental questions about life can be appropriately answered by a perspicacious intervention of mathematicians, computer scientists, and physicists, who will complement the work of chemists, biochemists, biologists, and medical investigators. The volume is organized in seven chapters. The first part is devoted to research topics (Discrete Information and Life, Strings and Genomes, Algorithms and Biorhythms, Life Strategies), the second one to mathematical backgrounds (Numbers and Measures, Languages and Grammars, Combinations and Chances).
The International Federation for Information Processing (IFIP) is a non-profit umbrella organization for national societies working in the field of information processing. It was founded in 1960 under the auspices of UNESCO. It is organized into several technical committees. This book represents the proceedings of the 2006 conference of technical committee 8 (TC8), which covers the field of information systems. This conference formed part of IFIP's World Computer Congress in Chile. The occasion celebrated the 30th anniversary of IFIP TC8 by looking at the past, present and future of information systems. The proceedings reflect not only the breadth and depth of the work of TC8, but also the international nature of the group, with authors from 18 countries being represented in the 21 papers (including two invited papers) and 2 panels. All submissions were rigorously refereed by at least two reviewers and an associate editor and following the review and resubmission process nearly 50% of submissions were accepted. This paper introduces the papers and panels presented at the conference and published in this volume. It is never straightforward to classify a set of papers but we have made an attempt and this classification is also reflected in the sessions of the conference itself. The classification for the papers is as follows: the world of information systems - early pioneers; developing improved information systems; information systems in their domains of application; the discipline of information systems; issues of production; IT impacts on the organization; tools and modeling and new directions.
The five digital forces (mobility and pervasive computing, cloud, big data, artificial intelligence and robotics, and social media) are poised to bring great academic and industrial breakthroughs. All stakeholders want to understand how to best harness these forces to their advantage. While literature exists for understanding each force independently, there is a lack of knowledge on how to utilize all the forces together to realize future enterprises. Advanced Digital Architectures for Model-Driven Adaptive Enterprises is an essential reference source that explores the potential in unifying the five digital forces to achieve increased levels of agility, efficiency, and scale. Featuring coverage on a wide range of topics including socio-technical systems, adaptive architectures, and enterprise modeling, this book is ideally designed for managers, executives, programmers, designers, computer engineers, entrepreneurs, tool builders, digital practitioners, researchers, academicians, and students at the graduate level.
'Et moi, ..., si j'avait su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to come back, I would never have gone.') Jules Verne

One service mathematics has rendered the human race: it has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled 'discarded nonsense'. Eric T. Bell

The series is divergent; therefore we may be able to do something with it. O. Heaviside

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote on the right above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
This book investigates the characteristics of simple versus complex systems, and what the properties of a cyber-physical system design are that contribute to an effective implementation and make the system understandable, simple to use, and easy to maintain. The targeted audience is engineers, managers and advanced students who are involved in the design of cyber-physical systems and are willing to spend some time outside the silo of their daily work in order to widen their background and appreciation for the pervasive problems of system complexity. In the past, design of a process-control system (now called cyber-physical systems) was more of an art than an engineering endeavor. The software technology of that time was concerned primarily with functional correctness and did not pay much attention to the temporal dimension of program execution, which is as important as functional correctness when a physical process must be controlled. In the ensuing years, many problems in the design of cyber-physical systems were simplified. But with an increase in the functional requirements and system size, the complexity problems have appeared again in a different disguise. A sound understanding of the complexity problem requires some insight in cognition, human problem solving, psychology, and parts of philosophy. This book presents the essence of the author's thinking about complexity, accumulated over the past forty years.
ED-L2L, Learning to Live in the Knowledge Society, is one of the co-located conferences of the 20th World Computer Congress (WCC2008). The event is organized under the auspices of IFIP (International Federation for Information Processing) and is to be held in Milan from 7th to 10th September 2008. ED-L2L is devoted to themes related to ICT for education in the knowledge society. It provides an international forum for professionals from all continents to discuss research and practice in ICT and education. The event brings together educators, researchers, policy makers, curriculum designers, teacher educators, members of academia, teachers and content producers. ED-L2L is organised by the IFIP Technical Committee 3, Education, with the support of the Institute for Educational Technology, part of the National Research Council of Italy. The Institute is devoted to the study of educational innovation brought about through the use of ICT. Submissions to ED-L2L are published in this conference book. The published papers are devoted to the conference themes:
* Developing digital literacy for the knowledge society: information problem solving; creating, capturing and transferring knowledge; commitment to lifelong learning
* Teaching and learning in the knowledge society; playful and fun learning at home and in the school
* New models, processes and systems for formal and informal learning environments and organisations
* Developing a collective intelligence; learning together and sharing knowledge
* ICT issues in education: ethics, equality, inclusion and parental role
* Educating ICT professionals for the global knowledge society
* Managing the transition to the knowledge society