This book introduces readers to selected issues in distributed systems, and primarily focuses on principles, not on technical details. Though the systems discussed are based on existing (von Neumann) computer architectures, the book also touches on emerging processing paradigms. Uniquely, it approaches system components not only as static constructs, but also "in action," exploring the different states they pass through. The author's teaching experience shows that newcomers to the field, whether students or IT professionals, can grasp the essence of distributed algorithmic structures far more readily in action than on the basis of static descriptions.
This book centres on Webcam Child Sex Tourism and the Sweetie Project initiated by the children's rights organization Terre des Hommes in 2013 in response to the exponential increase of online child abuse. Webcam child sex tourism is a growing international problem, which not only encourages the abuse and sexual exploitation of children and provides easy access to child-abuse images, but which is also a crime involving a relatively low risk for offenders, as live-streamed webcam performances leave few traces that law enforcement can use. Moreover, webcam child sex tourism often has a cross-border character, which leads to jurisdictional conflicts and makes it even harder to obtain evidence, launch investigations or prosecute suspects. Terre des Hommes set out to actively tackle webcam child sex tourism by employing a virtual 10-year-old Filipino girl named Sweetie, a so-called chatbot, to identify offenders in chatrooms. Sweetie 1.0 could be deployed only if police officers participated in chats, and thus was limited in dealing with the large number of offenders. With this in mind, a more proactive and preventive approach was adopted to tackle the issue. Sweetie 2.0 was developed with an automated chat function to track, identify and deter individuals using the internet to sexually abuse children. Using chatbots allows the monitoring of larger parts of the internet to locate and identify (potential) offenders, and to send them messages warning of the legal consequences should they proceed further. But using artificial intelligence raises serious legal questions. For instance, is sexually interacting with a virtual child actually a criminal offence? How do rules of criminal procedure apply to Sweetie as investigative software? Does using Sweetie 2.0 constitute entrapment?
This book, the outcome of a comparative law research initiative by Leiden University's Center for Law and Digital Technologies (eLaw) and the Tilburg Institute for Law, Technology, and Society (TILT), addresses the application of substantive criminal law and criminal procedure to Sweetie 2.0 within various jurisdictions around the world. This book is especially relevant for legislators and policy-makers, legal practitioners in criminal law, and all lawyers and academics interested in internet-related sexual offences and in Artificial Intelligence and law. Professor Simone van der Hof is General Director of Research at the Center for Law and Digital Technologies (eLaw) of the Leiden Law School at Leiden University, The Netherlands. Ilina Georgieva, LL.M., is a PhD researcher at the Faculty of Governance and Global Affairs at Leiden University; Bart Schermer is an associate professor at the Center for Law and Digital Technologies (eLaw) of the Leiden Law School; and Professor Bert-Jaap Koops is Professor of Regulation and Technology at the Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University, The Netherlands.
This book explains aspects of social networks, ranging from the development and application of new artificial intelligence and computational intelligence techniques for social networks to understanding the impact of social networks. Chapters 1 and 2 deal with basic strategies towards social networks, such as mining text from such networks and applying social network metrics using a hybrid approach; Chapters 3 to 8 focus on the prime research areas in social networks: community detection, influence maximization and opinion mining. Chapters 9 to 13 concentrate on studying the impact and use of social networks in society, primarily in education, commerce, and crowdsourcing. The contributions provide a multidimensional approach, and the book will serve graduate students and researchers as a reference in computer science, electronics engineering, communications, and information technology.
This book includes carefully selected papers presented at the 10th International Conference on Knowledge, Information and Creativity Support Systems (KICCS 2015), which was held in Phuket, Thailand, on November 12-14, 2015. Most of the papers are extended versions with the latest results added, representing virtually all topics covered by the conference. The KICCS 2015 focus theme, "Looking into the Future of Creativity and Decision Support Systems", highlighted the field's growing complexity and called for deeper, insightful discussions about the future, complemented with an exposition of current developments that have proven their value and usefulness. As such, the book addresses topics concerning future-oriented fields of research, such as anticipatory networks and systems; foresight support systems; and relevant newly emerging applications, exemplified by autonomous creative systems. It also focuses on cognitive and collaborative aspects of creativity.
This two-volume book contains the proceedings of the International Conference on Advanced Computing and Intelligent Engineering, focusing on both theory and applications in the broad areas of communication technology, computer science and information security. The volumes bring together academic scientists, professors, research scholars and students to share and disseminate knowledge and scientific research related to computing, networking, and informatics, and to discuss the practical challenges encountered and the solutions adopted. The book also promotes the translation of basic research into applied investigation, and of applied investigation into practice.
Written by world-class leaders in type-2 fuzzy logic control, this book offers a self-contained reference for both researchers and students. The coverage provides both background and an extensive literature survey on fuzzy logic and related type-2 fuzzy control. It also includes research questions, experiment and simulation results, and downloadable computer programs on an associated website. This key resource will prove useful to students and engineers wanting to learn type-2 fuzzy control theory and its applications.
This book reports on advances in sensing, modeling and control methods for different robotic platforms such as multi-degree-of-freedom robotic arms, unmanned aerial vehicles and autonomous mobile platforms. Based on the 2018 Symposium on Mechatronics, Robotics, and Control (SMTRC'18), held as part of the 2018 CSME International Congress at York University, Toronto, Canada, the book covers a variety of topics, from filtering and state estimation to adaptive control of reconfigurable robots and more. Next-generation systems with advanced control, planning, perception and interaction capabilities will achieve functionalities far beyond today's technology. Two key challenges remaining for advanced robot technologies are related to sensing and control in robotic systems. Advanced perception is needed to navigate changing environments. Adaptive and intelligent control systems must be developed to enable operation in unstructured and dynamic environments. The selected chapters in this book focus on both of the aforementioned areas and highlight the main trends and challenges in robot sensing and control. The first part of the book introduces chapters which focus on advanced perception and sensing for robotics applications, including sensor filtering and state estimation for bipedal robots and motion capture systems analysis. The second part focuses on different modeling and control methods for robotic systems, including flight control for UAVs, multi-variable robust control for modular and reconfigurable robots, and control for precision micromanipulation.
Artificial intelligence has long been a mainstay of science fiction and increasingly it feels as if AI is entering our everyday lives, with technology like Apple’s Siri now prominent, and self-driving cars almost upon us.
But what do we actually mean when we talk about ‘AI’? Are the sentient machines of 2001 or The Matrix a real possibility or will real-world artificial intelligence look and feel very different? What has it done for us so far? And what technologies could it yield in the future?
AI expert Yorick Wilks takes a journey through the history of artificial intelligence up to the present day, examining its origins, controversies and achievements, as well as looking into just how it works. He also considers the future, assessing whether these technologies could menace our way of life, but also how we are all likely to benefit from AI applications in the years to come.
Entertaining, enlightening, and keenly argued, this is the essential one-stop guide to the AI debate.
This book proposes complex hierarchical deep architectures (HDA) for predicting bankruptcy, a topical issue for business and corporate institutions that in the past has been tackled using statistical, market-based and machine-intelligence prediction models. The HDA are formed through fuzzy rough tensor deep stacking networks (FRTDSN) with structured, hierarchical rough Bayesian (HRB) models. FRTDSN is formalized through TDSN and fuzzy rough sets, and HRB is formed by incorporating probabilistic rough sets in a structured hierarchical Bayesian model. FRTDSN is then integrated with HRB to form the compound FRTDSN-HRB model, with HRB enhancing the prediction accuracy of the FRTDSN-HRB model. The experimental datasets are adopted from Korean construction companies and American and European non-financial companies, and the research presented focuses on the impact of the choice of cut-off points, sampling procedures and business cycle on the accuracy of bankruptcy prediction models. The book also highlights the fact that misclassification can result in erroneous predictions leading to prohibitive costs to investors and the economy, and shows that the choice of cut-off point and sampling procedures affects the rankings of the various models. It also suggests that empirical cut-off points estimated from training samples result in the lowest misclassification costs for all the models. The book confirms that FRTDSN-HRB achieves superior performance compared to other statistical and soft-computing models. The experimental results are given in terms of several important statistical parameters spanning different business cycles and sub-cycles for the datasets considered, and are of immense benefit to researchers working in this area.
This book contains the proceedings of the 11th FSR (Field and Service Robotics), the leading single-track conference on applications of robotics in challenging environments, held in Zurich, Switzerland, from 12 to 15 September 2017. The book contains 45 full-length, peer-reviewed papers organized into a variety of topics: Control, Computer Vision, Inspection, Machine Learning, Mapping, Navigation and Planning, and Systems and Tools. The goal of the book and the conference is to report and encourage the development and experimental evaluation of field and service robots, and to generate a vibrant exchange and discussion in the community. Field robots are non-factory robots, typically mobile, that operate in complex and dynamic environments: on the ground (Earth or other planets), under the ground, underwater, in the air or in space. Service robots are those that work closely with humans to help them with their lives. The first FSR was held in Canberra, Australia, in 1997. Since that first meeting, FSR has been held roughly every two years, cycling through Asia, the Americas, and Europe.
This book is a tribute to Julian Francis Miller's ideas and achievements in computer science, evolutionary algorithms and genetic programming, electronics, unconventional computing, artificial chemistry and theoretical biology. Leading international experts in computing inspired by nature offer their insights into the principles of information processing and optimisation in simulated and experimental living, physical and chemical substrates. Miller invented Cartesian Genetic Programming (CGP) in 1999, from a representation of electronic circuits he devised with Thomson a few years earlier. The book presents a number of CGP's wide-ranging applications, including multi-step-ahead forecasting, challenging artificial neural network dogma, approximate computing, medical informatics, control engineering, evolvable hardware, and multi-objective evolutionary optimisation. The book addresses in depth the technique of 'Evolution in Materio', a term coined by Miller and Downing, using a range of examples of experimental prototypes of computing in disordered ensembles of graphene nanotubes, slime mould, plants, and reaction-diffusion chemical systems. Advances in sub-symbolic artificial chemistries, artificial bio-inspired development, code evolution with genetic programming, and the use of Reed-Muller expansions in the synthesis of Boolean quantum circuits add a unique flavour to the content. The book is a pleasure to explore for readers from all walks of life, from undergraduate students to university professors, from mathematicians, computer scientists and engineers to chemists and biologists.
Work with advanced topics in deep learning, such as optimization algorithms, hyper-parameter tuning, dropout, and error analysis, as well as strategies to address typical problems encountered when training deep neural networks. You'll begin by studying the activation functions (ReLU, sigmoid, and Swish), mostly with a single neuron, seeing how to perform linear and logistic regression using TensorFlow, and choosing the right cost function. The next section covers more complicated neural network architectures with several layers and neurons and explores the problem of random initialization of weights. An entire chapter is dedicated to a complete overview of neural network error analysis, giving examples of solving problems originating from variance, bias, overfitting, and datasets coming from different distributions. Applied Deep Learning also discusses how to implement logistic regression completely from scratch without using any Python library except NumPy, to let you appreciate how libraries such as TensorFlow allow quick and efficient experiments. Case studies for each method are included to put all the theoretical information into practice. You'll discover tips and tricks for writing optimized Python code (for example, vectorizing loops with NumPy).

What You Will Learn:
- Implement advanced techniques in the right way in Python and TensorFlow
- Debug and optimize advanced methods (such as dropout and regularization)
- Carry out error analysis (to determine whether you have a bias problem, a variance problem, a data offset problem, and so on)
- Set up a machine learning project focused on deep learning on a complex dataset

Who This Book Is For: Readers with a medium understanding of machine learning, linear algebra, calculus, and basic Python programming.
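To give a flavour of the "logistic regression from scratch with only NumPy" exercise the blurb describes, here is a minimal sketch (not the book's own code): a sigmoid unit trained by batch gradient descent on the cross-entropy loss, with vectorized NumPy operations instead of Python loops. The toy dataset and hyper-parameters are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    """Fit weights w and bias b by gradient descent on cross-entropy loss."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)       # predicted probabilities, vectorized
        grad_w = X.T @ (p - y) / n   # gradient of the loss w.r.t. w
        grad_b = np.mean(p - y)      # gradient of the loss w.r.t. b
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: label 1 whenever the two features sum to a positive number
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)

w, b = train_logistic_regression(X, y)
accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

The whole update step is expressed as matrix operations (`X @ w`, `X.T @ (p - y)`), which is exactly the kind of loop vectorization the book's tips-and-tricks material is about.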
This book highlights practical quantum key distribution systems and research on the implementations of next-generation quantum communication, as well as photonic quantum device technologies. It discusses how the advances in quantum computing and quantum physics have allowed the building, launching and deploying of space exploration systems that are capable of more and more as they become smaller and lighter. It also presents theoretical and experimental research on the potential and limitations of secure communication and computation with quantum devices, and explores how security can be preserved in the presence of a quantum computer, and how to achieve long-distance quantum communication. The development of a real quantum computer is still in the early stages, but a number of research groups have investigated the theoretical possibilities of such computers.
This book starts with the proposition that digital media invite play and indeed need to be played by their everyday users. Play is probably one of the most visible and powerful ways to appropriate the digital world. The diverse, emerging practices of digital media appear to be essentially playful: users are involved and active, produce form and content, spread, exchange and consume it, take risks, are conscious of their own goals and the possibilities of achieving them, are skilled and know how to acquire more skills. They share a can-do perspective and a curiosity about what happens next. Play can be observed in social, economic, political, artistic, educational and criminal contexts and endeavours. It is employed as a (counter) strategy, for tacit or open resistance, as a method and productive practice, and as something people do for fun. The book aims to define a particular contemporary attitude, a playful approach to media. It identifies some common ground and key principles in this novel terrain. Instead of looking at play and how it branches into different disciplines like business and education, the phenomenon of play in digital media is approached unconstrained by disciplinary boundaries. The contributions in this book provide a glimpse of a playful technological revolution that is a joyful celebration of the possibilities that new media afford. This book is not a practical guide on how to hack a system or to pirate music, but provides critical insights into the unintended, artistic, fun, subversive, and sometimes dodgy applications of digital media.
Contributions from Chris Crawford, Mathias Fuchs, Rilla Khaled, Sybille Lammes, Eva and Franco Mattes, Florian 'Floyd' Mueller, Michael Nitsche, Julian Oliver, and others cover and address topics such as reflective game design, identity and people's engagement in online media, conflicts and challenging opportunities for play, playing with cartographical interfaces, player-emergent production practices, the re-purposing of data, game creation as an educational approach, the ludification of society, the creation of meaning within and without play, the internalisation and subversion of roles through play, and the boundaries of play.
This book investigates observer-based fault estimation techniques in detail, while also highlighting recent research and findings regarding fault estimation. Many practical control systems are subject to possible malfunctions, which may cause significant performance loss or even system instability. To improve the reliability, performance and safety of dynamical systems, fault diagnosis techniques are now receiving considerable attention, both in research and applications, and have been the subject of intensive investigation. Fault detection - the essential first step in fault diagnosis - is a binary decision-making process used to determine whether or not a fault has occurred. In turn, fault isolation is used to identify the location of the faulty component, while fault estimation is used to identify the size of the fault online. Compared with the problems involved in fault detection and isolation, fault estimation is considerably more challenging.
This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive systems which infer their context through pattern recognition, the author provides readers with a gentle yet robust foundation of knowledge to this growing field of research. The author explores a range of topics including data acquisition, signal processing, control theory, machine learning and system engineering explaining, with the use of simple mathematical concepts, the core principles underlying pervasive computing systems. Real-life examples are applied throughout, including self-driving cars, automatic insulin pumps, smart homes, and social robotic companions, with each chapter accompanied by a set of exercises for the reader. Practical tutorials are also available to guide enthusiastic readers through the process of building a smart system using cameras, microphones and robotic kits. Due to the power of MATLAB (TM), this can be achieved with no previous programming or robotics experience. Although Pervasive Computing is primarily for undergraduate students, the book is accessible to a wider audience of researchers and designers who are interested in exploring pervasive computing further.
This book presents the latest research advances in complex network structure analytics based on computational intelligence (CI) approaches, particularly evolutionary optimization. Most if not all network issues are actually optimization problems, which are mostly NP-hard and challenge conventional optimization techniques. To effectively and efficiently solve these hard optimization problems, CI based network structure analytics offer significant advantages over conventional network analytics techniques. Meanwhile, using CI techniques may facilitate smart decision making by providing multiple options to choose from, while conventional methods can only offer a decision maker a single suggestion. In addition, CI based network structure analytics can greatly facilitate network modeling and analysis. And employing CI techniques to resolve network issues is likely to inspire other fields of study such as recommender systems, system biology, etc., which will in turn expand CI's scope and applications. As a comprehensive text, the book covers a range of key topics, including network community discovery, evolutionary optimization, network structure balance analytics, network robustness analytics, community-based personalized recommendation, influence maximization, and biological network alignment. Offering a rich blend of theory and practice, the book is suitable for students, researchers and practitioners interested in network analytics and computational intelligence, both as a textbook and as a reference work.
This volume comprises a selection of works presented at the Numerical and Evolutionary Optimization (NEO 2016) workshop held in September 2016 in Tlalnepantla, Mexico. The development of powerful search and optimization techniques is of great importance in today's world and requires researchers and practitioners to tackle a growing number of challenging real-world problems. In particular, there are two well-established and widely known fields that are commonly applied in this area: (i) traditional numerical optimization techniques and (ii) comparatively recent bio-inspired heuristics. Both paradigms have their unique strengths and weaknesses, allowing them to solve some challenging problems while still failing in others. The goal of the NEO workshop series is to bring together experts from these and related fields to discuss, compare and merge their complementary perspectives in order to develop fast and reliable hybrid methods that maximize the strengths and minimize the weaknesses of the underlying paradigms. In doing so, NEO promotes the development of new techniques that are applicable to a broader class of problems. Moreover, NEO fosters the understanding and adequate treatment of real-world problems particularly in emerging fields that affect all of us, such as healthcare, smart cities, big data, among many others. The extended papers presented in the book contribute to achieving this goal.
This book explains how to teach better, presenting the latest research on processing educational data using traditional statistical techniques as well as probabilistic, interval, and fuzzy approaches. Teaching is a very rewarding activity; it is also a very difficult one - because it is largely an art. There is a lot of advice on teaching available, but it is usually informal and not easy to follow. To remedy this situation, it is reasonable to use techniques specifically designed to handle such imprecise knowledge: fuzzy logic techniques. Since there are a large number of statistical studies of different teaching techniques, the authors combined statistical and fuzzy approaches to process the educational data in order to provide insights into improving all the stages of the education process: from forming a curriculum, to deciding in which order to present the material, to grading the assignments and exams. The authors do not claim to have solved all the problems of education. Instead they show, using numerous examples, that an innovative combination of different uncertainty techniques can improve teaching. The book offers teachers and instructors valuable advice and provides researchers in pedagogical and fuzzy areas with techniques to further advance teaching.
A new field of collective intelligence has emerged in the last few years, prompted by a wave of digital technologies that make it possible for organizations and societies to think at large scale. This "bigger mind"--human and machine capabilities working together--has the potential to solve the great challenges of our time. So why do smart technologies not automatically lead to smart results? Gathering insights from diverse fields, including philosophy, computer science, and biology, Big Mind reveals how collective intelligence can guide corporations, governments, universities, and societies to make the most of human brains and digital technologies. Geoff Mulgan explores how collective intelligence has to be consciously organized and orchestrated in order to harness its powers. He looks at recent experiments mobilizing millions of people to solve problems, and at groundbreaking technology like Google Maps and Dove satellites. He also considers why organizations full of smart people and machines can make foolish mistakes--from investment banks losing billions to intelligence agencies misjudging geopolitical events--and shows how to avoid them. Highlighting differences between environments that stimulate intelligence and those that blunt it, Mulgan shows how human and machine intelligence could solve challenges in business, climate change, democracy, and public health. But for that to happen we'll need radically new professions, institutions, and ways of thinking. Informed by the latest work on data, web platforms, and artificial intelligence, Big Mind shows how collective intelligence could help us survive and thrive.
This book gathers the proceedings of a symposium on the role of Internet technologies and how they can transform and improve people's lives. The Internet is essentially a massive database where all types of information can be shared and transmitted. This can be done passively in the form of non-interactive websites and blogs; or it can be done actively in the form of file sharing and document up- and downloading. Thanks to these technologies, a wealth of information is now available to anyone who can access the Internet. Moreover, Internet technologies are constantly improving: growing faster, offering more diverse information, and supporting processes that would have been impossible in the past. As a result, they have changed, and will continue to change, the way that the world does business and how people interact in their day-to-day lives. In conclusion, the symposium and these proceedings provide a valuable opportunity for leading researchers, engineers and professionals around the globe to discuss the latest advances that are helping the world move forward. They also facilitate the exchange of new ideas in the fields of communication technology to create a dialogue between these groups concerning the latest innovations, trends and concerns, practical challenges and potential solutions in the field of Internet technologies.
Looking beyond the communications technology horizon and projecting future competency-specific employment demand, this book presents an evaluation of desirable information systems enhancements by integrating two disparate-domain computer ontologies. It provides readers with a fresh solutions approach based on dynamic modeling, and offers methodological contributions to philosophical and assistive communications system development in healthcare, addressing the need for both demand intelligence and practical work environment support. The pace of change in redefining occupation-specific employee resourcing needs is unrelenting and continues to accelerate, and the exponential growth in the demand for healthcare service delivery is correspondingly daunting. As such, the public and private sectors are faced with the challenge of sustaining credible and relevant demand intelligence and recruitment practices, while the integration, expansion and enrichment of ostensibly unconnected ontologies represent key R&D issues.
This book presents a fascinating, state-of-the-art collection of papers on the recent advances in human-computer systems interaction (H-CSI). It offers a detailed description of the status quo in the H-CSI field and also provides a solid base for further development and research in the area. The content is divided into three parts: I. Aid systems for disabled people; II. Decision-making support systems; and III. Information and communication systems. It is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in human-computer systems interaction, and the combination of general and specific papers offers readers deeper insights than might be gleaned from research papers or talks at conferences. It touches on all the current hot topics in the field of H-CSI.
This book employs a new eco-cognitive model of abduction to underline the distributed and embodied nature of scientific cognition. Its main focus is on the knowledge-enhancing virtues of abduction and on the productive role of scientific models. What are the distinctive features that define the kind of knowledge produced by science? To provide an answer to this question, the book first addresses the ideas of Aristotle, who stressed the essential inferential and distributed role of external cognitive tools and epistemic mediators in abductive cognition. This is analyzed in depth from both a naturalized logic and an ecology of cognition perspective. It is shown how the maximization of cognition, and of abducibility - two typical goals of science - are related to a number of fundamental aspects: the optimization of the eco-cognitive situatedness; the maximization of changeability for both the input and the output of the inferences involved; a high degree of information-sensitiveness; and the need to record the "past life" of abductive inferential practices. Lastly, the book explains how some impoverished epistemological niches - the result of a growing epistemic irresponsibility associated with the commodification and commercialization of science - are now seriously jeopardizing the flourishing development of human creative abduction.
You may like...
- Artificial Intelligence in Finance - A… (Yves Hilpisch, Paperback)
- Girl Decoded - My Quest to Make… (Rana El Kaliouby, Paperback)
- Sentiment Analysis - Mining Opinions… (Bing Liu, Hardcover)
- The Performance Cortex - How… (Zach Schonbrun, Paperback)
- T-Minus AI - Humanity's Countdown to… (Michael Kanaan, CD)
- (Gemma Fowler, Paperback)
- Collective Agency and Cooperation in… (Catrin Misselhorn, Hardcover)
- Artificial Intelligence - A Guide for… (Melanie Mitchell, Paperback)
- The Technology Trap - Capital, Labor… (Carl Benedikt Frey, Paperback)
- Neural Approximations for Optimal… (Riccardo Zoppoli, Marcello Sanguineti, …, Hardcover)