This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in many VR-based simulation systems, the book will be of particular interest to researchers and professionals in the areas of surgical simulation, rehabilitation, virtual assembly, and inspection and maintenance.
This book has the unique intention of returning the mathematical tools of neural networks to the biological realm of the nervous system, where they originated a few decades ago. It aims to introduce, in a didactic manner, two relatively recent developments in neural network methodology, namely recurrence in the architecture and the use of spiking or integrate-and-fire neurons. In addition, the neuro-anatomical processes of synapse modification during development, training, and memory formation are discussed as realistic bases for weight adjustment in neural networks. While neural networks have many applications outside biology, where it is irrelevant precisely which architecture and which algorithms are used, a network that models or simulates a neuro-biological phenomenon must bear a close relationship to the properties of that phenomenon. A recurrent architecture, the use of spiking neurons, and appropriate weight-update rules all contribute to the plausibility of a neural network in such a case. The first half of this book therefore lays the foundations for applying neural networks as models of the various biological phenomena treated in the second half, including neural network models of sensory and motor control tasks that implement one or several of the requirements for biological plausibility.
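As a rough illustration of the integrate-and-fire neurons mentioned above (a generic textbook model, not code from the book), the following sketch simulates a single leaky integrate-and-fire unit; the membrane constants and the input current are illustrative assumptions.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=-0.065,
                 v_reset=-0.065, v_threshold=-0.050, r_m=1e7):
    """Simulate a single leaky integrate-and-fire neuron.

    input_current: injected current (A), one value per time step of length dt.
    Returns the membrane-potential trace and the indices of spike times.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Integrate the membrane equation: tau * dV/dt = -(V - V_rest) + R * I
        v += (-(v - v_rest) + r_m * i_in) * dt / tau
        if v >= v_threshold:          # threshold crossed: emit a spike
            spikes.append(t)
            v = v_reset               # reset after the spike
        trace.append(v)
    return np.array(trace), spikes

# Constant 2 nA input for 200 ms produces a regular spike train.
current = np.full(200, 2e-9)
potential, spike_steps = simulate_lif(current)
print(f"{len(spike_steps)} spikes in {len(current)} ms")
```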
As the visual effects industry has diversified, so too have the books written to serve the needs of this industry. Today there are hundreds of highly specialized titles focusing on particular aspects of film and broadcast animation, computer graphics, stage photography, miniature photography, color theory, and many others.
This book is a status report. It provides a broad overview of the most recent developments in the field, spanning a wide range of topical areas in simulational condensed matter physics. These areas include recent developments in simulations of classical statistical mechanics models, electronic structure calculations, quantum simulations, and simulations of polymers. Both new physical results and novel simulational and data analysis methods are presented. Some of the highlights of this volume include detailed accounts of recent theoretical developments in electronic structure calculations, novel quantum simulation techniques and their applications to strongly interacting lattice fermion models, and a wide variety of applications of existing methods as well as novel methods in the simulation of classical statistical mechanics models, including spin glasses and polymers.
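For readers unfamiliar with the classical statistical-mechanics simulations referred to above, the sketch below shows a generic Metropolis Monte Carlo update for a 2D Ising model; the lattice size, temperature, and sweep count are illustrative choices, not values taken from the volume.

```python
import numpy as np

def metropolis_ising(L=32, temperature=2.3, sweeps=200, rng=None):
    """Metropolis Monte Carlo for a 2D Ising model on an L x L periodic lattice.

    Returns the magnetisation per spin after the requested number of sweeps.
    """
    rng = rng or np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(L, L))
    beta = 1.0 / temperature                      # units with k_B = 1, J = 1
    for _ in range(sweeps):
        for _ in range(L * L):                    # one sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            # Sum of the four nearest neighbours (periodic boundaries).
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                  spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            d_e = 2.0 * spins[i, j] * nn          # energy cost of flipping spin (i, j)
            if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
                spins[i, j] *= -1                 # accept the flip
    return spins.mean()

print("magnetisation per spin:", metropolis_ising())
```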
Agent-based modelling on a computer appears to have a special role to play in the development of social science. It offers a means of discovering general and applicable social theory, and grounding it in precise assumptions and derivations, whilst addressing those elements of individual cognition that are central to human society. However, there are important questions to be asked and difficulties to overcome in achieving this potential. What differentiates agent-based modelling from traditional computer modelling? Which model types should be used under which circumstances? If it is appropriate to use a complex model, how can it be validated? Is social simulation research to adopt a realist epistemology, or can it operate within a social constructionist framework? What are the sociological concepts of norms and norm processing that could either be used for planned implementation or for identifying equivalents of social norms among co-operative agents? Can sustainability be achieved more easily in a hierarchical agent society than in a society of isolated agents? What examples are there of hybrid forms of interaction between humans and artificial agents? These are some of the sociological questions that are addressed.
Part II, Challenges in Data Mapping, deals with one of the most challenging tasks in Interactive Visualization: mapping and teasing out information from large complex datasets and generating visual representations. This section consists of four chapters. Binh Pham, Alex Streit, and Ross Brown provide a comprehensive requirement analysis of information uncertainty visualizations. They examine the sources of uncertainty, review aspects of its complexity, introduce typical models of uncertainty, and analyze major issues in the visualization of uncertainty from various user and task perspectives. Alfred Inselberg examines challenges in multivariate data analysis. He explains how relations among multiple variables can be mapped uniquely into subsets of the parallel-coordinate space having geometrical properties, and introduces the Parallel Coordinates methodology for the unambiguous visualization and exploration of multidimensional geometry and multivariate relations. Christiaan Gribble describes two alternative approaches to interactive particle visualization: one targeting desktop systems equipped with programmable graphics hardware and the other targeting moderately sized multicore systems using packet-based ray tracing. Finally, Christof Rezk Salama reviews state-of-the-art strategies for the assignment of visual parameters in scientific visualization systems. He explains the process of mapping abstract data values to visual parameters based on transfer functions, clarifies the terms pre- and postclassification, and introduces state-of-the-art user interfaces for the design of transfer functions.
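As a minimal illustration of the parallel-coordinates idea described above (not code from the book), the following sketch renders a small synthetic four-variable dataset with pandas' built-in parallel_coordinates helper; the data and class labels are invented.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Small synthetic multivariate dataset: two groups, four variables.
rng = np.random.default_rng(1)
data = rng.normal(size=(40, 4))
data[20:, :2] += 2.0                      # shift two variables for the second group
frame = pd.DataFrame(data, columns=list("ABCD"))
frame["cls"] = ["group 1"] * 20 + ["group 2"] * 20

# Each row becomes a polyline across the four parallel axes; relations
# between variables show up as characteristic line patterns.
parallel_coordinates(frame, class_column="cls", colormap="coolwarm")
plt.title("Parallel coordinates view of a 4-variable dataset")
plt.show()
```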
Confidently shepherd your organization's implementation of Microsoft Dynamics 365 to a successful conclusion. In Mastering Microsoft Dynamics 365 Implementations, accomplished executive, project manager, and author Eric Newell delivers a holistic, step-by-step reference to implementing Microsoft's cloud-based ERP and CRM business applications. You'll find the detailed and concrete instructions you need to take your implementation project all the way to the finish line, on time and on budget. You'll learn: the precise steps to take, in the correct order, to bring your Dynamics 365 implementation to life; what to do before you begin the project, including identifying stakeholders and building your business case; how to deal with change management throughout the lifecycle of your project; and how to manage conference room pilots (CRPs) and what to expect during the sessions. Perfect for CIOs, technology VPs, CFOs, operations leaders, application directors, business analysts, ERP/CRM specialists, and project managers, Mastering Microsoft Dynamics 365 Implementations is an indispensable and practical reference for guiding your real-world Dynamics 365 implementation from planning to completion.
This book presents the selected results of the XI Scientific Conference Selected Issues of Electrical Engineering and Electronics (WZEE), which was held in Rzeszow and Czarna, Poland, on September 27-30, 2013. The main aim of the Conference was to provide a forum for academia and industry to discuss and present the latest technological advances and research results, and to integrate a new interdisciplinary scientific circle in the field of electrical engineering, electronics and mechatronics. The Conference was organized by the Rzeszow Division of the Polish Association of Theoretical and Applied Electrical Engineering (PTETiS) in cooperation with Rzeszow University of Technology, the Faculty of Electrical and Computer Engineering, and Rzeszow University, the Faculty of Mathematics and Natural Sciences.
This book examines the historical roots and evolution of simulation from an epistemological, institutional and technical perspective. Rich case studies go far beyond documenting simulation's capacity for application in many domains; they also explore the "functional" and "structural" debate that continues to traverse simulation thought and action. This book is an essential contribution to the assessment of simulation as a scientific instrument.
Fundamental solutions in understanding information have been elusive for a long time. The field of Artificial Intelligence proposed the Turing Test as a way to test for the "smart," human-like behaviors of computer programs. The equivalent challenge for the field of Human Information Interaction (HII) in the Web era is getting information to the people who need it and helping them understand it. In a short amount of time, the infrastructure of the Web became ubiquitous, not just in terms of protocols and transcontinental cables but also in terms of everyday devices capable of recalling network-stored data, sometimes wirelessly. As these infrastructures become reality, our attention on HII issues needs to shift from information access to information sensemaking, a relatively new term coined to describe the process of digesting information and understanding its structure and intricacies so as to make decisions and take action.
The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning solutions for the major difficulties. It is a valuable resource for those working in machine learning for natural language processing as well as anyone studying time in language, or involved in annotating the structure of time in documents.
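As a toy sketch of the pairwise machine-learning formulation commonly used for temporal ordering (not the specific system evaluated in the book), the example below classifies an event pair as BEFORE or AFTER from a few invented features such as tense and surface order.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training instances: each event pair is described by simple features
# (tense of each event, which event is mentioned first) and labelled BEFORE/AFTER.
pairs = [
    {"e1_tense": "past", "e2_tense": "present", "e1_first": True},
    {"e1_tense": "present", "e2_tense": "past", "e1_first": False},
    {"e1_tense": "past", "e2_tense": "past", "e1_first": True},
    {"e1_tense": "future", "e2_tense": "past", "e1_first": True},
]
labels = ["BEFORE", "AFTER", "BEFORE", "AFTER"]

# Vectorize the feature dictionaries and fit a simple linear classifier.
model = make_pipeline(DictVectorizer(sparse=False), LogisticRegression())
model.fit(pairs, labels)

test_pair = {"e1_tense": "past", "e2_tense": "future", "e1_first": True}
print(model.predict([test_pair])[0])
```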
This book presents selected papers from the Sixteenth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, held in conjunction with the Thirteenth International Conference on Frontiers of Information Technology, Applications and Tools on November 5-7, 2020, in Ho Chi Minh City, Vietnam. It is divided into two volumes and discusses the latest research outcomes in the field of Information Technology (IT), including information hiding, multimedia signal processing, big data, data mining, bioinformatics, databases, industrial and Internet of Things applications.
Social media is now ubiquitous on the internet, generating both new possibilities and new challenges in information analysis and retrieval. This comprehensive text/reference examines in depth the synergy between multimedia content analysis, personalization, and next-generation networking. The book demonstrates how this integration can result in robust, personalized services that provide users with an improved multimedia-centric quality of experience. Each chapter offers a practical step-by-step walkthrough for a variety of concepts, components and technologies relating to the development of applications and services. Topics and features: provides contributions from an international and interdisciplinary selection of experts in their fields; introduces the fundamentals of social media retrieval, presenting the most important areas of research in this domain; examines the important topic of multimedia tagging in social environments, including geo-tagging; discusses issues of personalization and privacy in social media; reviews advances in encoding, compression and network architectures for the exchange of social media information; describes a range of applications related to social media. Researchers and students interested in social media retrieval will find this book a valuable resource, covering a broad overview of state-of-the-art research and emerging trends in this area. The text will also be of use to practicing engineers involved in envisioning and building innovative social media applications and services.
The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions forming the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools. The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1; to reduce the run time, different interconnect planning approaches are applied in different temperature ranges. Chapter 2 introduces a new design methodology, the interconnect-centric design methodology, and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
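As a rough sketch of the multi-stage annealing idea mentioned for Chapter 1 (not the authors' algorithm), the example below anneals a toy block ordering and switches from a cheap cost estimate to a more detailed one as the temperature drops; the cost functions, move generator, and schedule parameters are all invented stand-ins.

```python
import math
import random

def anneal(initial, neighbour, coarse_cost, detailed_cost,
           t_start=100.0, t_end=0.01, alpha=0.95, moves_per_t=200):
    """Simulated annealing with a two-stage cost model: a cheap estimate at
    high temperatures, a detailed one once the system has cooled."""
    state, t = initial, t_start
    while t > t_end:
        cost_fn = coarse_cost if t >= 0.1 * t_start else detailed_cost
        for _ in range(moves_per_t):
            cand = neighbour(state)
            delta = cost_fn(cand) - cost_fn(state)
            # Accept improvements; accept uphill moves with Boltzmann probability.
            if delta < 0 or random.random() < math.exp(-delta / t):
                state = cand
        t *= alpha                         # geometric cooling schedule
    return state, detailed_cost(state)

# Toy "floorplan": an ordering of 8 blocks; the cost is a stand-in wire-length proxy.
random.seed(0)
blocks = list(range(8))
random.shuffle(blocks)

def swap_two(order):
    i, j = random.sample(range(len(order)), 2)
    new = order[:]
    new[i], new[j] = new[j], new[i]
    return new

wirelength = lambda order: sum(abs(a - b) for a, b in zip(order, order[1:]))
print(anneal(blocks, swap_two, wirelength, wirelength))
```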
This book explores inductive inference using the minimum message length (MML) principle, a Bayesian method which is a realisation of Ockham's Razor based on information theory. Accompanied by a library of software, the book can assist an applications programmer, student or researcher in the fields of data analysis and machine learning to write computer programs based upon this principle. MML inference has been around for 50 years and yet only one highly technical book has been written about the subject. The majority of research in the field has been backed by specialised one-off programs but this book includes a library of general MML-based software, in Java. The Java source code is available under the GNU GPL open-source license. The software library is documented using Javadoc which produces extensive cross referenced HTML manual pages. Every probability distribution and statistical model that is described in the book is implemented and documented in the software library. The library may contain a component that directly solves a reader's inference problem, or contain components that can be put together to solve the problem, or provide a standard interface under which a new component can be written to solve the problem. This book will be of interest to application developers in the fields of machine learning and statistics as well as academics, postdocs, programmers and data scientists. It could also be used by third year or fourth year undergraduate or postgraduate students.
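As a schematic illustration of the two-part message idea behind MML (not the book's Java library), the sketch below compares the total message length, in bits, of a "fair coin" hypothesis against a "biased coin whose parameter must be stated" hypothesis; the fixed parameter-precision cost is an assumed constant for illustration.

```python
import math

def data_length_bits(heads, tails, p):
    """-log2 likelihood of the observed tosses under bias p (second message part)."""
    return -(heads * math.log2(p) + tails * math.log2(1 - p))

def mml_compare(heads, tails, parameter_precision_bits=6):
    """Two-part message lengths (in bits) for two hypotheses about a coin.

    Hypothesis A: the coin is fair (nothing extra stated in the first part).
    Hypothesis B: the coin has bias p, stated to a fixed precision in the
    first part of the message, then used to encode the data in the second.
    Assumes 0 < heads < heads + tails so the estimated bias is a valid probability.
    """
    n = heads + tails
    fair = data_length_bits(heads, tails, 0.5)
    p_hat = heads / n
    biased = parameter_precision_bits + data_length_bits(heads, tails, p_hat)
    return {"fair coin": fair, "biased coin": biased}

# 70 heads in 100 tosses: the shorter total message identifies the better explanation.
print(mml_compare(heads=70, tails=30))
```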
Soft City Culture and Technology: The Betaville Project discusses the complete cycle of conception, development, and deployment of the Betaville platform. Betaville is a massively participatory online environment for distributed 3D design and development of proposals for changes to the built environment: an experimental integration of art, design, and software development for the public realm. Through a detailed account of Betaville from a Big Crazy Idea to a working "deep social medium," the author examines the current conditions of performance and accessibility of hardware, software, networks, and skills that can be brought together into a new form of open public design and deliberation space, one spanning and integrating the disparate spheres of art, architecture, social media, and engineering. Betaville is an ambitious enterprise of building compelling and constructive working relationships in situations where roles and disciplinary boundaries must be as agile as the development process of the software itself. Through a considered account and analysis of the interdependencies between Betaville's project design, development methods, and deployment, the reader can gain a deeper understanding of the potential socio-technical forms of New Soft Cities: blended virtual-physical worlds whose "public works" must ultimately serve and succeed as massively collaborative works of art and infrastructure.
Multimedia Content Analysis: Theory and Applications covers the latest in multimedia content analysis and applications based on such analysis. As research has progressed, it has become clear that this field has to appeal to other disciplines such as psycho-physics, media production, etc. This book consists of invited chapters that cover the entire range of the field. Some of the topics covered include low-level audio-visual analysis based retrieval and indexing techniques, the TRECVID effort, video browsing interfaces, content creation and content analysis, and multimedia analysis-based applications, among others. The chapters are written by leading researchers in the multimedia field.
This book brings together two major trends: data science and blockchains. It is one of the first books to systematically cover the analytics aspects of blockchains, with the goal of linking traditional data mining research communities with novel data sources. Data science and big data technologies can be considered cornerstones of the data-driven digital transformation of organizations and society. The concept of blockchain is predicted to enable and spark transformation on a par with that associated with the invention of the Internet, and cryptocurrencies are the first successful use case of highly distributed blockchains, much as the World Wide Web was for the Internet. The book takes the reader through basic data exploration topics, proceeding systematically, method by method, through supervised and unsupervised learning approaches and information visualization techniques, all the way to understanding blockchain data from the network science perspective. Chapters introduce the cryptocurrency blockchain data model and methods to explore it using structured query language, association rules, clustering, classification, visualization, and network science. Each chapter introduces basic concepts, presents examples with real cryptocurrency blockchain data, and offers exercises and questions for further discussion. This approach is intended to serve as a good starting point for undergraduate and graduate students learning data science topics through cryptocurrency blockchain examples. It is also aimed at researchers and analysts who already possess good analytical and data skills, but who do not yet have the specific knowledge to tackle analytic questions about blockchain transactions. Readers will improve their knowledge of the essential data science techniques needed to turn mere transactional information into social, economic, and business insights.
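In the spirit of the exploratory chapters described above, but using synthetic rather than real blockchain data, the sketch below clusters per-address transaction features with k-means; the feature names and distributions are invented for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for per-address summary features that would normally be
# derived from parsed blocks of a cryptocurrency blockchain.
rng = np.random.default_rng(7)
addresses = pd.DataFrame({
    "tx_count":    rng.poisson([5] * 80 + [200] * 20),
    "total_value": rng.gamma(2.0, [0.5] * 80 + [50.0] * 20),
    "mean_fee":    rng.normal([0.0002] * 80 + [0.001] * 20, 0.0001),
})

# Standardise the features, then group addresses with similar behaviour.
features = StandardScaler().fit_transform(addresses)
addresses["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(addresses.groupby("cluster").mean())
```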
It was a pleasure to provide an introduction to a new volume on user experience evaluation in games. The scope, depth, and diversity of the work here are amazing. They attest to the growing popularity of games and the increasing importance of developing a range of theories, methods, and scales to evaluate them. This evolution is driven by the cost and complexity of games being developed today, and by the need to broaden the appeal of games. Many of the approaches described here are enabled by new tools and techniques. This book (along with a few others) represents a watershed in game evaluation and understanding. The field of game evaluation has truly "come of age." The broader field of HCI can begin to look toward game evaluation for fresh, critical, and sophisticated thinking about design evaluation and product development, and to games for groundbreaking case studies of product evaluation. I'll briefly summarize each chapter below and provide some commentary; in conclusion, I will mention a few common themes and offer some challenges. In Chapter 1, User Experience Evaluation in Entertainment, Bernhaupt gives an overview and presents a general framework on methods currently used for user experience evaluation. The methods presented in the following chapters are summarized, allowing the reader to quickly assess the right set of methods to evaluate the game under development.
The book focusses on questions of individual and collective action, the emergence and dynamics of social norms, and the feedback between individual behaviour and social phenomena. It discusses traditional modelling approaches to social norms and shows the usefulness of agent-based modelling for the study of these micro-macro interactions. Existing agent-based models of social norms are discussed, and it is shown that so far too much priority has been given to parsimonious models and to questions of the emergence of norms, with many aspects of social norms, such as norm change, not being modelled. Juvenile delinquency, group radicalisation, and moral decision making are used as case studies for agent-based models of collective action, extending existing models by providing an embedding into social networks, social influence via argumentation, and a causal action theory of moral decision making. The major contribution of the book is to highlight the multifaceted nature of the dynamics of social norms, consisting not only of emergence, and the importance of embedding agent-based models in existing theory.
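As a deliberately parsimonious example of the kind of agent-based norm model discussed above (not one of the book's case studies), the sketch below lets agents on a ring imitate a neighbour's convention with occasional deviation; the number of agents, noise level, and step count are illustrative assumptions.

```python
import random

def simulate_norm_emergence(n_agents=50, steps=2000, noise=0.02, seed=3):
    """Minimal agent-based model: agents imitate a random neighbour's choice.

    Each agent holds one of two conventions (0 or 1). At every step a random
    agent copies a neighbour on the ring, with a small chance of deviating.
    Returns the share of agents holding convention 1 after each step.
    """
    random.seed(seed)
    choices = [random.randint(0, 1) for _ in range(n_agents)]
    shares = []
    for _ in range(steps):
        i = random.randrange(n_agents)
        neighbour = (i + random.choice([-1, 1])) % n_agents
        if random.random() < noise:
            choices[i] = random.randint(0, 1)      # occasional deviation
        else:
            choices[i] = choices[neighbour]        # conform to the neighbour
        shares.append(sum(choices) / n_agents)
    return shares

trace = simulate_norm_emergence()
print("final share of convention 1:", trace[-1])
```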
Recent trends have shown increasing privatization of standardization activities under various corporations, trade associations, and consortia, raising significant public policy issues about how the public interest may be represented. "Standardization and Digital Enclosure: The Privatization of Standards, Knowledge, and Policy in the Age of Global Information Technology" establishes a framework of analysis for public policy discussion and debate. Discussing topics such as social practices and political economic discourse, this book offers a truly interdisciplinary approach to standardization and privatization valuable to technical, economic, and political researchers and practitioners, as well as academicians involved in related fields.
You may like...
Developing Churn Models Using Data… by Goran Klepac, Robert Kopal, … (Hardcover, R5,037)
Artificial Intelligence and Heuristics… by Chandrasekar Vuppalapati (Hardcover, R4,563)
Emergent Knowledge Strategies… by Ettore Bolisani, Constantin Bratianu (Hardcover, R4,172)
Contemporary Perspectives in Data Mining… by Kenneth D. Lawrence, Ronald K. Klimberg (Hardcover)
Painting by Numbers - Data-Driven… by Diana Seave Greenwald (Hardcover)