This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and to characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science. It is intended to be accessible to postgraduate students in Computer Science and Mathematics, and can also be used by researchers and practitioners either for advanced study or for technical reference.
(Preliminary) The book is a comprehensive collection of the most recent and significant research and applications in the field of fuzzy logic. It covers fuzzy structures, systems, rules and operations, as well as important applications, e.g. in decision making, environmental prediction and prevention, and communication. It is dedicated to Enric Trillas in acknowledgement of his pioneering research in the field. The book includes a foreword by Lotfi A. Zadeh.
From 9/11 to Charlie Hebdo, along with the Sony-pocalypse and DARPA's $2 million Cyber Grand Challenge, this book examines counterterrorism and cyber security history, strategies and technologies from a thought-provoking perspective that encompasses personal experiences, investigative journalism, historical and current events, ideas from thought leaders and the make-believe of Hollywood, such as 24, Homeland and The Americans. As President Barack Obama said in his 2015 State of the Union address, "We are making sure our government integrates intelligence to combat cyber threats, just as we have done to combat terrorism." This new edition contains seven completely new chapters, including three contributed chapters by healthcare chief information security officer Ray Balut and Jean C. Stanford, DEF CON speaker Philip Polstra, and security engineer and Black Hat speaker Darren Manners, as well as new commentaries by communications expert Andy Marken and DEF CON speaker Emily Peed. The book offers practical advice for businesses, governments and individuals to better secure the world and protect cyberspace.
This book contains an interesting and state-of-the-art collection of chapters presenting several examples of attempts to develop modern tools utilizing computational intelligence for different real-life problems encountered by humans. Reasoning, prediction, modeling, optimization, decision making and related tasks need modern, soft and intelligent algorithms, methods and methodologies to efficiently solve the problems arising in human activity. The book is divided into two parts. Part I, consisting of four chapters, is devoted to selected links between computational intelligence, medicine, health care and biomechanics. Several problems are considered: estimation of healthcare system reliability, classification of ultrasound thyroid images, application of fuzzy logic to measure weight status and central fatness, and deriving kinematics directly from video records. Part II, also consisting of four chapters, is devoted to selected links between computational intelligence and biology. The common denominator of three chapters is Physarum polycephalum, a single-celled organism able to build complex networks for solving different computational tasks. One chapter focuses on a novel device, the memristor, which has possible uses both in the creation of hardware neural nets for artificial intelligence and as the connection between a hardware neural net and a living neuronal cell network in the treatment and monitoring of neurological disease. This book is intended for a wide audience of readers who are interested in various aspects of computational intelligence.
This book contains papers presented at the 2014 MICCAI Workshop on Computational Diffusion MRI, CDMRI’14. Detailing new computational methods applied to diffusion magnetic resonance imaging data, it offers readers a snapshot of the current state of the art and covers a wide range of topics from fundamental theoretical work on mathematical modeling to the development and evaluation of robust algorithms and applications in neuroscientific studies and clinical practice. Inside, readers will find information on brain network analysis, mathematical modeling for clinical applications, tissue microstructure imaging, super-resolution methods, signal reconstruction, visualization, and more. Contributions include both careful mathematical derivations and a large number of rich full-color visualizations. Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic. This volume will offer a valuable starting point for anyone interested in learning computational diffusion MRI. It also offers new perspectives and insights on current research challenges for those currently in the field. The book will be of interest to researchers and practitioners in computer science, MR physics, and applied mathematics.
This book presents recent research using cognitive science to apprehend risk situations and to elaborate new organizations, new systems and new methodological tools in response. The book demonstrates the reasons, advantages and implications of associating the concepts of cognition and risk. It is shown that this association has strong consequences for how to apprehend critical situations that emerge within various activity domains, and how to elaborate responses to these critical situations. The following topics are covered: the influence of culture in risk management; the influence of risk communication in risk management; user-centred design to improve risk situation management; the design of new tools to assist risk situation management; and risk prevention in industrial activities.
This is the third book presenting selected results of research on the further development of the shape understanding system (SUS), carried out by the authors in the newly founded Queen Jadwiga Research Institute of Understanding. The book introduces the new term Machine Understanding, referring to a new area of research that investigates the possibility of building machines with the ability to understand. It argues that SUS needs, to some extent, to mimic human understanding, and for this reason machines are evaluated according to the rules applied to the evaluation of human understanding. The book shows how to formulate problems and how to test whether the machine is able to solve them.
The present book is the result of a three-year research project which investigated the creative act of composing by means of algorithmic composition. Central to the investigation are the compositional strategies of 12 composers, which were documented through a dialogic and cyclic process of modelling and evaluating musical materials. The aesthetic premises and compositional approaches configure a rich spectrum of diverse positions, which is also reflected in the kinds of approaches and methods used. These include the generation and evaluation of chord sequences using genetic algorithms, the application of morphing strategies to research harmonic transformations, the automatic classification of personal preferences via machine learning, and the application of mathematical music theory to the analysis and resynthesis of musical material. The second part of the book features contributions by Sandeep Bhagwati, William Brooks, David Cope, Darla Crispin, Nicolas Donin, and Guerino Mazzola. These authors variously consider the project from different perspectives, offer independent approaches, or provide more general reflections from their respective research fields.
This pioneering book develops definitions and concepts related to Quality of Experience in the context of multimedia- and telecommunications-related applications, systems and services, and applies these to various fields of communication and media technologies. The editors bring together numerous key protagonists of the new discipline "Quality of Experience" and combine state-of-the-art knowledge in a single volume.
This book presents interesting and important unsolved problems in the mathematical and computational sciences. The contributing authors are leading researchers in their fields, and they explain outstanding challenges in their domains, first by offering basic definitions, explaining the context, and summarizing related algorithms, theorems, and proofs, and then by suggesting creative solutions. The authors aim to excite deep research and discussion in the mathematical and computational sciences community, and the book will be of value to postgraduate students and researchers in the areas of theoretical computer science, discrete mathematics, engineering, and cryptology.
This volume gathers the peer-reviewed papers presented at the 4th edition of the International Workshop "Service Orientation in Holonic and Multi-agent Manufacturing – SOHOMA'14", organized and hosted on November 5-6, 2014 by the University of Lorraine, France, in collaboration with the CIMR Research Centre of the University Politehnica of Bucharest and the TEMPO Laboratory of the University of Valenciennes and Hainaut-Cambrésis. The book is structured in six parts, each covering a specific research line which represents a trend in future manufacturing: (1) Holonic and Agent-based Industrial Automation Systems; (2) Service-oriented Management and Control of Manufacturing Systems; (3) Distributed Modelling for Safety and Security in Industrial Systems; (4) Complexity, Big Data and Virtualization in Computing-oriented Manufacturing; (5) Adaptive, Bio-inspired and Self-organizing Multi-Agent Systems for Manufacturing; and (6) Physical Internet Simulation, Modelling and Control. The SOHOMA'14 workshop is clearly oriented towards complexity, a theme common to all six parts. There is a need for a framework allowing the development of manufacturing cyber-physical systems, including capabilities for complex event processing and data analytics, which are expected to move the manufacturing domain closer to cloud manufacturing within contextual enterprises. Recent advances in sensor, communication and intelligent computing technologies have made possible the Internet connectivity of the physical world: the Physical Internet, where not only documents and images are created, shared, or modified in cyberspace, but physical resources and products also interact over the Internet and make decisions based on shared communication.
Software and systems quality plays an increasingly important role in the growth of almost all organisations, profit and non-profit alike. Quality is vital to the success of enterprises in their markets. Most small trade and repair businesses use software systems in their administration and marketing processes. Every doctor's surgery manages its patients using software. Banking is no longer conceivable without software. Aircraft, trucks and cars use more and more software to handle their increasingly complex technical systems. Innovation, competition and cost pressure are always present in ongoing business decisions. The question facing all these organisations is how to achieve the right quality of their software-based systems and products: how to get the required level of quality, a level that the market will reward, a level that mitigates the organisation's risks and a level that the organisation is willing to pay for. Although a number of good practices are in place, there is still room for huge improvements. Thus, let us take a look into the two worlds of "embedded systems" and "ICT systems" and learn from both worlds, from their overlaps and from their individual solutions. The next step towards industrialisation in the software industry is required now. Hence, this book focuses on three pillars: (1) a fundamental notion of right software and systems quality (RiSSQ); (2) portfolio management, quality governance, quality management, and quality engineering as a holistic approach over the three layers of an enterprise, i.e. the strategic, tactical, and operational layers; and (3) an industrialisation framework for implementing our approach.
This book investigates the functional adequacy as well as the affective impression made by feedback messages on mobile devices. It presents an easily adoptable experimental setup to examine context effects on various feedback messages and applies it to auditory, tactile and auditory-tactile feedback messages. This approach provides insights into the relationship between the affective impression and functional applicability of these messages as well as an understanding of the influence of unimodal components on the perception of multimodal feedback messages. The developed paradigm can also be extended to investigate other aspects of context and used to investigate feedback messages in modalities other than those presented. The book uses questionnaires implemented on a Smartphone, which can easily be adopted for field studies to broaden the scope even wider. Finally, the book offers guidelines for the design of system feedback.
This book contains the full papers presented at ICCEBS 2013 – the 1st International Conference on Computational and Experimental Biomedical Sciences, which was organized in Azores, in October 2013. The included papers present and discuss new trends in those fields, using several methods and techniques, including active shape models, constitutive models, isogeometric elements, genetic algorithms, level sets, material models, neural networks, optimization and the finite element method, in order to address more efficiently different and timely applications involving biofluids, computer simulation, computational biomechanics, image based diagnosis, image processing and analysis, image segmentation, image registration, scaffolds, simulation and surgical planning. The main audience for this book consists of researchers, Ph.D students and graduate students with multidisciplinary interests related to the areas of artificial intelligence, bioengineering, biology, biomechanics, computational fluid dynamics, computational mechanics, computational vision, histology, human motion, imagiology, applied mathematics, medical image, medicine, orthopaedics, rehabilitation, speech production and tissue engineering.
The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economy, finance, marketing, behavioral economics and risk analysis. This book is of special interest to engineers, economists and researchers who are developing new advances in engineering management but also to practitioners working on this subject.
This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters divided into three parts. The first part provides background information and some theoretical foundations of the CI domain, the second part deals with adaptation in CI algorithms, while the third part focuses on hybridization in CI. The book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms, such as Artificial Neural Networks, Evolutionary Algorithms and Swarm Intelligence-based algorithms.
This work provides an assessment of the current state of near field communication (NFC) security, reports on new attack scenarios, and offers concepts and solutions to overcome any unresolved issues. It describes application-specific security aspects of NFC based on exemplary use-case scenarios and uses these to focus on the interaction with NFC tags and on card emulation. The current security architectures of NFC-enabled cellular phones are evaluated with regard to the identified security aspects.
This book presents a study of knowledge discovery in data, with knowledge understood as a set of relations among objects and their properties. The relations in this case are implicative decision rules, and the paradigm in which they are induced is that of computing with granules defined by rough inclusions, the latter introduced and studied within rough mereology, the fuzzified version of mereology. The book defines the basic classes of rough inclusions and, based on them, highlights methods for inducing granular structures from data. The resulting granular structures are subjected to classifying algorithms, notably k-nearest neighbors and Bayesian classifiers. Experimental results are given in detail, in both tabular and visualized form, for fourteen data sets from the UCI data repository. A striking feature of the granular classifiers obtained by this approach is that, while preserving the accuracy achieved on the original data, they substantially reduce the size of the granulated data set as well as the set of granular decision rules. This makes the approach attractive in cases where a small number of rules providing high classification accuracy is desirable. As the basic algorithms used throughout the text are explained and illustrated with worked examples, the book may also serve as a textbook.
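The k-nearest-neighbors classifier mentioned above is a standard algorithm; as a minimal illustration (a hypothetical Python sketch, not the book's code or its granular variant), a plain k-NN classifier of the kind applied to the granulated data sets might look like this:

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify point x by majority vote among its k nearest training points."""
    # Pair each training point with its Euclidean distance to x, then sort.
    dists = sorted((math.dist(p, x), y) for p, y in zip(train, labels))
    # Majority vote among the k closest labels.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two clusters with labels "a" and "b".
train = [(0, 0), (0, 1), (5, 5), (6, 5)]
labels = ["a", "a", "b", "b"]
print(knn_predict(train, labels, (0.2, 0.3)))  # nearest neighbours carry label "a"
```

A granular classifier, as the book describes, would first replace the raw training set with a much smaller set of granules before applying such a vote.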
The purpose of law is to protect society from harm by declaring what conduct is criminal and prescribing the punishment to be imposed for such conduct. The pervasiveness of the internet and its anonymous nature make cyberspace a lawless frontier where anarchy prevails. Historically, economic value has been assigned to visible and tangible assets. With the increasing appreciation that intangible data disseminated through an intangible medium can possess economic value, cybercrime is also being recognized as an economic crime. Cybercrime, Digital Forensics and Jurisdiction disseminates knowledge to everyone involved in understanding and preventing cybercrime: business entities, private citizens, and government agencies. The book is firmly rooted in the law, demonstrating that a viable strategy to confront cybercrime must be international in scope.
This book describes recent theoretical findings relevant to bilevel programming in general, and in mixed-integer bilevel programming in particular. It describes recent applications in energy problems, such as the stochastic bilevel optimization approaches used in the natural gas industry. New algorithms for solving linear and mixed-integer bilevel programming problems are presented and explained.
This book documents recent advances in the field of modeling, simulation, control, security and reliability of Cyber-Physical Systems (CPS) in power grids. Its aim is to help the reader gain insights into the working of CPSs and understand their potential in transforming the power grids of tomorrow. The book will be useful for all those who are interested in the design of cyber-physical systems, be they students or researchers in power systems, CPS modeling software developers, technical marketing professionals or business policy-makers.
The aim of this book is to explain to high-performance computing (HPC) developers how to utilize the Intel® Xeon Phi™ series products efficiently. To that end, it introduces some computing grammar, programming technology and optimization methods for using many-integrated-core (MIC) platforms and also offers tips and tricks for actual use, based on the authors’ first-hand optimization experience. The material is organized in three sections. The first section, “Basics of MIC”, introduces the fundamentals of MIC architecture and programming, including the specific Intel MIC programming environment. Next, the section on “Performance Optimization” explains general MIC optimization techniques, which are then illustrated step-by-step using the classical parallel programming example of matrix multiplication. Finally, “Project development” presents a set of practical and experience-driven methods for using parallel computing in application projects, including how to determine if a serial or parallel CPU program is suitable for MIC and how to transplant a program onto MIC. This book appeals to two main audiences: First, software developers for HPC applications – it will enable them to fully exploit the MIC architecture and thus achieve the extreme performance usually required in biological genetics, medical imaging, aerospace, meteorology and other areas of HPC. Second, students and researchers engaged in parallel and high-performance computing – it will guide them on how to push the limits of system performance for HPC applications.
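The matrix-multiplication example cited above is the classical vehicle for teaching parallel decomposition. As a rough illustration of the row-blocking idea behind it (a hypothetical Python/NumPy sketch, not the book's MIC code, which targets Intel's offload programming model), the product can be split into independent row blocks, one per worker:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def parallel_matmul(A, B, n_workers=4):
    """Compute C = A @ B by splitting A's rows into blocks, one per worker."""
    # Contiguous row-index blocks; each block's slice of C is independent.
    row_blocks = np.array_split(np.arange(A.shape[0]), n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        parts = ex.map(lambda rows: A[rows] @ B, row_blocks)
    # Stack the partial results back into the full product.
    return np.vstack(list(parts))

A = np.random.rand(8, 5)
B = np.random.rand(5, 3)
assert np.allclose(parallel_matmul(A, B), A @ B)  # matches the serial product
```

On a MIC platform the same decomposition would be expressed with OpenMP threads or offload pragmas rather than a Python thread pool; the point here is only the independence of the row blocks.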
This is a comprehensive description of the cryptographic hash function BLAKE, one of the five final contenders in the NIST SHA3 competition, and of BLAKE2, an improved version popular among developers. It describes how BLAKE was designed and why BLAKE2 was developed, and it offers guidelines on implementing and using BLAKE, with a focus on software implementation. In the first two chapters, the authors offer a short introduction to cryptographic hashing, the SHA3 competition and BLAKE. They review applications of cryptographic hashing, they describe some basic notions such as security definitions and state-of-the-art collision search methods and they present SHA1, SHA2 and the SHA3 finalists. In the chapters that follow, the authors give a complete description of the four instances BLAKE-256, BLAKE-512, BLAKE-224 and BLAKE-384; they describe applications of BLAKE, including simple hashing with or without a salt and HMAC and PBKDF2 constructions; they review implementation techniques, from portable C and Python to AVR assembly and vectorized code using SIMD CPU instructions; they describe BLAKE’s properties with respect to hardware design for implementation in ASICs or FPGAs; they explain BLAKE's design rationale in detail, from NIST’s requirements to the choice of internal parameters; they summarize the known security properties of BLAKE and describe the best attacks on reduced or modified variants; and they present BLAKE2, the successor of BLAKE, starting with motivations and also covering its performance and security aspects. The book concludes with detailed test vectors, a reference portable C implementation of BLAKE, and a list of third-party software implementations of BLAKE and BLAKE2. The book is oriented towards practice – engineering and craftsmanship – rather than theory. 
It is suitable for developers, engineers and security professionals engaged with BLAKE and cryptographic hashing in general and for applied cryptography researchers and students who need a consolidated reference and a detailed description of the design process, or guidelines on how to design a cryptographic algorithm.
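For a small practical taste of the material, Python's standard library ships BLAKE2 (the successor covered in the book's final chapters) in hashlib; a minimal sketch of simple, keyed and salted hashing follows. The messages, key and salt are illustrative values, not test vectors from the book:

```python
import hashlib

# Simple hashing with an explicit digest size (BLAKE2b supports 1-64 bytes).
h = hashlib.blake2b(b"hello world", digest_size=32)
print(h.hexdigest())

# Keyed hashing: BLAKE2's built-in MAC mode, an alternative to HMAC.
mac = hashlib.blake2b(b"message", key=b"secret key", digest_size=16)

# Salted hashing (the salt is at most 16 bytes for BLAKE2b).
salted = hashlib.blake2b(b"password", salt=b"unique-salt-1234")
```

The keyed mode is notable because, unlike SHA2, BLAKE2 does not need the HMAC construction to be used safely as a MAC.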
This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and code generation to RSL. The book is structured in eight chapters. The first two chapters present the main concepts and give an introduction to requirements modelling in RSL. The next two chapters concentrate on presenting RSL in a formal way, suitable for automated processing. Subsequently, chapters 5 and 6 concentrate on model transformations with the emphasis on those involving RSL and UML. Finally, chapters 7 and 8 provide a summary in the form of a systematic methodology with a comprehensive case study. Presenting technical details of requirements modelling and model transformations for requirements, this book is of interest to researchers, graduate students and advanced practitioners from industry. While researchers will benefit from the latest results and possible research directions in MDRE, students and practitioners can exploit the presented information and practical techniques in several areas, including requirements engineering, architectural design, software language construction and model transformation. Together with a tool suite available online, the book supplies the reader with what it promises: the means to get from requirements to code “in a snap”.
The book Intelligent Systems for Science and Information is a remarkable collection of extended chapters from selected papers that were published in the proceedings of the Science and Information (SAI) Conference 2013. It contains twenty-four chapters in the field of Intelligent Systems, which received highly positive feedback during the SAI Conference 2013 review process. All chapters have gone through substantial extension and consolidation and were subject to another round of rigorous review and additional modification. These chapters represent the state of the art in cutting-edge research and technologies in related areas, and can help inform relevant research communities and individuals of future developments in Science and Information.