With the fast growth of multimedia information, content-based video analysis, indexing and representation have attracted increasing attention in recent years. Many applications have emerged in these areas, such as video-on-demand, distributed multimedia systems, digital video libraries, distance learning/education, entertainment, surveillance and geographical information systems. The need for content-based video indexing and retrieval was also recognized by ISO/MPEG, and a new international standard called "Multimedia Content Description Interface" (or, in short, MPEG-7) was initiated in 1998 and finalized in September 2001. In this context, a systematic and thorough review of existing approaches as well as the state-of-the-art techniques in video content analysis, indexing and representation is presented in this book. In addition, we specifically elaborate on a system which analyzes, indexes and abstracts movie content based on the integration of multiple media modalities. The content of each part of this book is briefly previewed below. In the first part, we segment a video sequence into a set of cascaded shots, where a shot consists of one or more continuously recorded image frames. Both raw and compressed video data are investigated. Moreover, considering that there are always non-story units in real TV programs, such as commercials, a novel commercial break detection/extraction scheme is developed which exploits both audio and visual cues to achieve robust results. Specifically, we first employ visual cues such as the video data statistics, the camera cut frequency, and the existence of delimiting black frames between commercials and programs to obtain coarse-level detection results.
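To give a flavour of the kind of visual cue such a coarse-level detector relies on, the sketch below flags near-black frames by thresholding each frame's mean luminance and its spread. It is a minimal illustration only: the threshold values and the use of OpenCV are assumptions for demonstration, not the book's actual implementation.

```python
# Illustrative sketch: flag near-black frames that often delimit commercial breaks.
# Thresholds and the use of OpenCV are assumptions for demonstration only.
import cv2

def find_black_frames(video_path, max_mean=16.0, max_std=8.0):
    """Return indices of frames whose luminance is uniformly near zero."""
    cap = cv2.VideoCapture(video_path)
    black_frames, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if gray.mean() < max_mean and gray.std() < max_std:
            black_frames.append(index)
        index += 1
    cap.release()
    return black_frames
```

Runs of such frames, combined with audio cues and cut-frequency statistics, would then be grouped into candidate commercial segments.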
* With Oracle 10g, for the first time, much of the Spatial functionality is provided for free (rather than as a priced option) in the database, thus massively increasing the potential audience.
* Shows how any Oracle application that has a spatial element (e.g. postcode) can take advantage of Spatial functionality.
* Contains case studies of more advanced applications of Spatial in healthcare, telecom, retail, and distribution.
* Oracle Spatial is recognized to be the standard platform for enterprise land management, mapping, telecom, transportation, and utility applications. Every major GIS tool vendor supports Oracle Spatial and all major map data providers deliver their data in Oracle Spatial format.
* The book will be based on extensive feedback from training courses, discussion lists, and customers. It will recommend best-practice approaches to the most common problems with which developers struggle.
* The authors are all experienced and well-respected experts. The Oracle personnel contributing have a decade of experience with Spatial and in helping partners and customers fully leverage its capabilities. The technical reviewers include lead developers of the product.
* Rather than simplified code snippets, the book provides real solutions that people can then build upon themselves.
This book brings together some of the most influential pieces of research undertaken around the world in design synthesis. It is the first comprehensive work of this kind and covers all three aspects of research in design synthesis: understanding what constitutes and influences synthesis; the major approaches to synthesis; and the diverse range of tools that are created to support this crucial design task. The chapters comprise cutting-edge research and established methods, written by the originators of this growing field of research. They cover all major generic synthesis approaches, i.e., composition, retrieval, change and repair, and tackle problems that come from a wide variety of domains within architecture and engineering, as well as areas of application including clocks, sensors and medical devices. The book contains an editorial introduction to the chapters and the broader context of research they represent. With its range of tools and methods covered, it is an ideal introduction to design synthesis for those intending to research in this area, as well as being a valuable source of ideas for educators and practitioners of engineering design.
The third entry in the Jim Blinn's Corner series, this is, like the
others, a handy compilation of selected installments of his
influential column. But here, for the first time, you get the
"Director's Cut" of the articles: revised, expanded, and enhanced
versions of the originals. What's changed? Improved mathematical
notation, more diagrams, new solutions. What remains the same? All
the things you've come to rely on: straight answers, irreverent
style, and innovative thinking. This is Jim Blinn at his best, now
even better.
This book has the unique intention of returning the mathematical tools of neural networks to the biological realm of the nervous system, where they originated a few decades ago. It aims to introduce, in a didactic manner, two relatively recent developments in neural network methodology, namely recurrence in the architecture and the use of spiking or integrate-and-fire neurons. In addition, the neuro-anatomical processes of synapse modification during development, training, and memory formation are discussed as realistic bases for weight adjustment in neural networks. While neural networks have many applications outside biology, where it is irrelevant precisely which architecture and which algorithms are used, a close relationship between the network's properties and those of the neuro-biological phenomenon being modelled or simulated is essential. A recurrent architecture, the use of spiking neurons and appropriate weight update rules contribute to the plausibility of a neural network in such a case. Therefore, in the first half of this book the foundations are laid for the application of neural networks as models for the various biological phenomena that are treated in the second half. These include various neural network models of sensory and motor control tasks that implement one or several of the requirements for biological plausibility.
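As a minimal illustration of the integrate-and-fire neurons mentioned above (not a model taken from the book itself), the sketch below simulates a leaky integrate-and-fire neuron with simple Euler integration; all parameter values are arbitrary assumptions chosen to produce spiking under a constant input current.

```python
# Minimal leaky integrate-and-fire neuron (illustrative only; parameters are arbitrary).
import numpy as np

def simulate_lif(current, dt=1e-4, tau=0.02, v_rest=-0.070,
                 v_reset=-0.070, v_threshold=-0.054, resistance=1e7):
    """Integrate dV/dt = (-(V - v_rest) + R*I) / tau and record spike times."""
    v = v_rest
    voltages, spike_times = [], []
    for step, i_in in enumerate(current):
        v += (-(v - v_rest) + resistance * i_in) * dt / tau
        if v >= v_threshold:           # threshold crossed: emit a spike and reset
            spike_times.append(step * dt)
            v = v_reset
        voltages.append(v)
    return np.array(voltages), spike_times

# Constant 2 nA input for 200 ms produces a regular spike train.
volts, spikes = simulate_lif(np.full(2000, 2e-9))
```

Recurrence and biologically motivated weight update rules of the kind the book discusses would then operate on networks of such units rather than on the single neuron shown here.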
Even as developments in photorealistic computer graphics continue
to affect our work and leisure activities, practitioners and
researchers are devoting more and more attention to
non-photorealistic (NPR) techniques for generating images that
appear to have been created by hand. These efforts benefit every
field in which illustrations, thanks to their ability to clarify,
emphasize, and convey very precise meanings, offer advantages over
photographs. These fields include medicine, architecture,
entertainment, education, geography, publishing, and visualization.
Alternate Reality Games (ARGs) challenge what players understand as "real." Alternate Reality Games and the Cusp of Digital Gameplay is the first collection to explore and define the possibilities of ARGs. Though prominent examples have existed for more than two decades, only recently have ARGs come to prominence as a unique and highly visible digital game genre. Adopting many of the same strategies as online video games, ARGs blur the distinction between real and fictional. With ARGs continuing to occupy an important and blurred space between digital and physical gameplay, this volume offers clear analysis of game design, implementation, and ramifications for game studies. Divided into three distinct sections, the contributions include first-hand accounts by leading ARG creators, scholarly analysis of the meaning behind ARGs, and explorations of how ARGs are extending digital tools for analysis. By balancing the voices of designers, players, and researchers, this collection highlights how the Alternate Reality Game genre is transforming the ways we play and interact today.
Agent-based modelling on a computer appears to have a special role to play in the development of social science. It offers a means of discovering general and applicable social theory, and grounding it in precise assumptions and derivations, whilst addressing those elements of individual cognition that are central to human society. However, there are important questions to be asked and difficulties to overcome in achieving this potential. What differentiates agent-based modelling from traditional computer modelling? Which model types should be used under which circumstances? If it is appropriate to use a complex model, how can it be validated? Is social simulation research to adopt a realist epistemology, or can it operate within a social constructionist framework? What are the sociological concepts of norms and norm processing that could either be used for planned implementation or for identifying equivalents of social norms among co-operative agents? Can sustainability be achieved more easily in a hierarchical agent society than in a society of isolated agents? What examples are there of hybrid forms of interaction between humans and artificial agents? These are some of the sociological questions that are addressed.
This book presents selected results of the XI Scientific Conference Selected Issues of Electrical Engineering and Electronics (WZEE), which was held in Rzeszow and Czarna, Poland, on September 27-30, 2013. The main aim of the Conference was to provide academia and industry with a forum to discuss and present the latest technological advances and research results, and to integrate a new interdisciplinary scientific community in the field of electrical engineering, electronics and mechatronics. The Conference was organized by the Rzeszow Division of the Polish Association of Theoretical and Applied Electrical Engineering (PTETiS) in cooperation with Rzeszow University of Technology, the Faculty of Electrical and Computer Engineering, and Rzeszow University, the Faculty of Mathematics and Natural Sciences.
As the visual effects industry has diversified, so too have the
books written to serve the needs of this industry. Today there are
hundreds of highly specialized titles focusing on particular
aspects of film and broadcast animation, computer graphics, stage
photography, miniature photography, color theory, and many
others.
Introduction: The exponential scaling of feature sizes in semiconductor technologies has side-effects on layout optimization, related to effects such as interconnect delay, noise and crosstalk, signal integrity, parasitic effects, and power dissipation, that invalidate the assumptions that form the basis of previous design methodologies and tools. This book is intended to sample the most important, contemporary, and advanced layout optimization problems emerging with the advent of very deep submicron technologies in semiconductor processing. We hope that it will stimulate more people to perform research that leads to advances in the design and development of more efficient, effective, and elegant algorithms and design tools.
Organization of the Book: The book is organized as follows. A multi-stage simulated annealing algorithm that integrates floorplanning and interconnect planning is presented in Chapter 1. To reduce the run time, different interconnect planning approaches are applied in different ranges of temperatures. Chapter 2 introduces a new design methodology, the interconnect-centric design methodology, and its centerpiece, interconnect planning, which consists of physical hierarchy generation, floorplanning with interconnect planning, and interconnect architecture planning. Chapter 3 investigates a net-cut minimization based placement tool, Dragon, which integrates state-of-the-art partitioning and placement techniques.
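For readers unfamiliar with the annealing framework underlying Chapter 1, the following is a generic simulated annealing skeleton in which a cheaper cost estimate is used at high temperatures and a more accurate one near convergence. The cost functions, cooling schedule, and the exact way evaluators are switched by temperature range are simplified assumptions, not the book's multi-stage floorplanning algorithm.

```python
# Generic simulated annealing skeleton (illustrative; not the multi-stage
# floorplanner described in the book). Cost functions and schedule are assumptions.
import math
import random

def anneal(initial, neighbor, cheap_cost, accurate_cost,
           t_start=1000.0, t_end=0.1, alpha=0.95, moves_per_temp=200):
    """Minimize a cost function, using a cheaper estimate at high temperatures."""
    def cost_of(state, temp):
        # Coarse evaluation while the search is still "hot", accurate later.
        return cheap_cost(state) if temp > 0.1 * t_start else accurate_cost(state)

    state, temp = initial, t_start
    best, best_cost = state, cost_of(state, temp)
    while temp > t_end:
        for _ in range(moves_per_temp):
            candidate = neighbor(state)
            delta = cost_of(candidate, temp) - cost_of(state, temp)
            # Accept improving moves always; worsening moves with Boltzmann probability.
            if delta < 0 or random.random() < math.exp(-delta / temp):
                state = candidate
                if cost_of(state, temp) < best_cost:
                    best, best_cost = state, cost_of(state, temp)
        temp *= alpha  # geometric cooling schedule
    return best
```

In a floorplanning setting, `neighbor` would perturb module placement and the cost functions would combine area and interconnect estimates.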
This book is a status report. It provides a broad overview of the most recent developments in the field, spanning a wide range of topical areas in simulational condensed matter physics. These areas include recent developments in simulations of classical statistical mechanics models, electronic structure calculations, quantum simulations, and simulations of polymers. Both new physical results and novel simulational and data analysis methods are presented. Some of the highlights of this volume include detailed accounts of recent theoretical developments in electronic structure calculations, novel quantum simulation techniques and their applications to strongly interacting lattice fermion models, and a wide variety of applications of existing methods as well as novel methods in the simulation of classical statistical mechanics models, including spin glasses and polymers.
II Challenges in Data Mapping: Part II deals with one of the most challenging tasks in Interactive Visualization: mapping and teasing out information from large complex datasets and generating visual representations. This section consists of four chapters. Binh Pham, Alex Streit, and Ross Brown provide a comprehensive requirement analysis of information uncertainty visualizations. They examine the sources of uncertainty, review aspects of its complexity, introduce typical models of uncertainty, and analyze major issues in the visualization of uncertainty from various user and task perspectives. Alfred Inselberg examines challenges in multivariate data analysis. He explains how relations among multiple variables can be mapped uniquely into 2-D space subsets having geometrical properties, and introduces the Parallel Coordinates methodology for the unambiguous visualization and exploration of a multidimensional geometry and multivariate relations. Christiaan Gribble describes two alternative approaches to interactive particle visualization: one targeting desktop systems equipped with programmable graphics hardware and the other targeting moderately sized multicore systems using packet-based ray tracing. Finally, Christof Rezk Salama reviews state-of-the-art strategies for the assignment of visual parameters in scientific visualization systems. He explains the process of mapping abstract data values into visual parameters based on transfer functions, clarifies the terms pre- and postclassification, and introduces state-of-the-art user interfaces for the design of transfer functions.
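To make the transfer-function idea concrete, here is a small sketch that maps scalar data values to RGBA colors by piecewise-linear interpolation between user-defined control points. The control points and the piecewise-linear form are illustrative assumptions, not a description of any particular system covered in the chapter.

```python
# Illustrative 1-D transfer function: map scalar values to RGBA by piecewise-linear
# interpolation between control points. Control points are arbitrary assumptions.
import numpy as np

def make_transfer_function(control_points):
    """control_points: list of (scalar_value, (r, g, b, a)), sorted by scalar_value."""
    xs = np.array([p[0] for p in control_points], dtype=float)
    rgba = np.array([p[1] for p in control_points], dtype=float)

    def apply(values):
        values = np.asarray(values, dtype=float)
        channels = [np.interp(values, xs, rgba[:, c]) for c in range(4)]
        return np.stack(channels, axis=-1)

    return apply

# Example: low values map to transparent blue, high values to opaque red.
tf = make_transfer_function([(0.0, (0, 0, 1, 0.0)),
                             (0.5, (0, 1, 0, 0.3)),
                             (1.0, (1, 0, 0, 1.0))])
colors = tf(np.linspace(0.0, 1.0, 5))
```

Whether such a function is applied to raw samples before interpolation (preclassification) or to interpolated values (postclassification) is exactly the distinction the chapter clarifies.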
Computational finance deals with the mathematics of computer programs that realize financial models or systems. This book outlines the epistemic risks associated with the current valuations of different financial instruments and discusses the corresponding risk management strategies. It covers most of the research and practical areas in computational finance. Starting from traditional fundamental analysis and using algebraic and geometric tools, it is guided by the logic of science to extract information from financial data without prejudice. In fact, this book has the unique feature of being structured around the simple requirement of objective science: the geometric structure of the data = the information contained in the data.
Fundamental solutions in understanding information have been elusive for a long time. The field of Artificial Intelligence has proposed the Turing Test as a way to test for the "smart" behaviors of computer programs that exhibit human-like qualities. As the equivalent of the Turing Test for the field of Human Information Interaction (HII), getting information to the people that need it and helping them to understand the information is the new challenge of the Web era. In a short amount of time, the infrastructure of the Web became ubiquitous, not just in terms of protocols and transcontinental cables but also in terms of everyday devices capable of recalling network-stored data, sometimes wirelessly. Therefore, as these infrastructures become reality, our attention on HII issues needs to shift from information access to information sensemaking, a relatively new term coined to describe the process of digesting information and understanding its structure and intricacies so as to make decisions and take action.
This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in many VR-based simulation systems, the book will be of particular interest to researchers and professionals in the areas of surgical simulation, rehabilitation, virtual assembly, and inspection and maintenance.
This up-to-date quick reference guides the reader through the most popular SAP module (myERP Financial 6.0). It thoroughly covers all of the submodules of ERP Financials, including FICO, FSCM, New GL functionality, SAP integration points, and Report Painter. Unlike other books that only provide questions and answers for certification preparation, this book covers both configurations and end-user transactions for validating the implementation methods. A companion CD-ROM with FICO templates, shortcuts, and color figures is included.
Features:
* Includes both configurations and end-user transactions for validation
* Uses a quick-reference style for finding information quickly
* Covers the latest account configurations for New GL
* Includes a CD-ROM with FICO templates, shortcuts, and color figures
The User Experience Team of One prescribes a range of approaches that have big impact and take less time and fewer resources than the standard lineup of UX deliverables. Whether you want to cross over into user experience or you're a seasoned practitioner trying to drag your organization forward, this book gives you tools and insight for doing more with less.
Soft City Culture and Technology: The Betaville Project discusses the complete cycle of conception, development, and deployment of the Betaville platform. Betaville is a massively participatory online environment for distributed 3D design and development of proposals for changes to the built environment: an experimental integration of art, design, and software development for the public realm. Through a detailed account of Betaville from a Big Crazy Idea to a working "deep social medium," the author examines the current conditions of performance and accessibility of hardware, software, networks, and skills that can be brought together into a new form of open public design and deliberation space, for, spanning, and integrating the disparate spheres of art, architecture, social media, and engineering. Betaville is an ambitious enterprise of building compelling and constructive working relationships in situations where roles and disciplinary boundaries must be as agile as the development process of the software itself. Through a considered account and analysis of the interdependencies between Betaville's project design, development methods, and deployment, the reader can gain a deeper understanding of the potential socio-technical forms of New Soft Cities: blended virtual-physical worlds whose "public works" must ultimately serve and succeed as massively collaborative works of art and infrastructure.
The book focusses on questions of individual and collective action, the emergence and dynamics of social norms, and the feedback between individual behaviour and social phenomena. It discusses traditional modelling approaches to social norms and shows the usefulness of agent-based modelling for the study of these micro-macro interactions. Existing agent-based models of social norms are discussed, and it is shown that so far too much priority has been given to parsimonious models and questions of the emergence of norms, with many aspects of social norms, such as norm change, not being modelled. Juvenile delinquency, group radicalisation and moral decision making are used as case studies for agent-based models of collective action, extending existing models by providing an embedding into social networks, social influence via argumentation, and a causal action theory of moral decision making. The major contribution of the book is to highlight the multifaceted nature of the dynamics of social norms, which consists of more than emergence alone, and the importance of embedding agent-based models in existing theory.
The objective of this monograph is to improve the performance of the sentiment analysis model by incorporating semantic, syntactic and common-sense knowledge. This book proposes a novel semantic concept extraction approach that uses dependency relations between words to extract features from the text. The proposed approach combines semantic and common-sense knowledge for a better understanding of the text. In addition, the book aims to extract prominent features from unstructured text by eliminating noisy, irrelevant and redundant features. Readers will also discover a proposed method for efficient dimensionality reduction to alleviate the data sparseness problem faced by machine learning models. The authors draw attention to the four main findings of the book:
- Performance of sentiment analysis can be improved by reducing the redundancy among the features. Experimental results show that the minimum Redundancy Maximum Relevance (mRMR) feature selection technique improves the performance of sentiment analysis by eliminating redundant features.
- The Boolean Multinomial Naive Bayes (BMNB) machine learning algorithm with the mRMR feature selection technique performs better than the Support Vector Machine (SVM) classifier for sentiment analysis.
- The problem of data sparseness is alleviated by semantic clustering of features, which in turn improves the performance of sentiment analysis.
- Semantic relations among the words in the text provide useful cues for sentiment analysis. Common-sense knowledge in the form of the ConceptNet ontology provides a better understanding of the text, which improves the performance of sentiment analysis.
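As a rough illustration of the Boolean Multinomial Naive Bayes setup named in the findings (not the book's exact pipeline), one might binarize term counts and feed them to a multinomial classifier as below. The tiny corpus is made up, and a generic chi-squared feature selector stands in for mRMR, which is not provided by scikit-learn.

```python
# Illustrative sketch of a Boolean (term presence/absence) Multinomial Naive Bayes
# sentiment classifier. SelectKBest(chi2) is a stand-in for mRMR; the corpus is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

docs = ["a wonderful, moving film", "dull plot and terrible acting",
        "great performances throughout", "boring and far too long"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

pipeline = Pipeline([
    ("bool_counts", CountVectorizer(binary=True)),  # Boolean term presence features
    ("select", SelectKBest(chi2, k=5)),             # stand-in for mRMR feature selection
    ("bmnb", MultinomialNB()),
])
pipeline.fit(docs, labels)
print(pipeline.predict(["a terrible, boring film"]))
```

The monograph's dependency-based concept extraction and ConceptNet-derived features would replace the plain bag-of-words step shown here.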