From the very beginning of their investigation of human reasoning, philosophers have identified two other forms of reasoning, besides deduction, which we now call abduction and induction. Deduction is now fairly well understood, but abduction and induction have eluded a similar level of understanding. The papers collected here address the relationship between abduction and induction and their possible integration. The approach is sometimes philosophical, sometimes that of pure logic, and some papers adopt the more task-oriented approach of AI. The book will command the attention of philosophers, logicians, AI researchers and computer scientists in general.
The evolution of technology has set the stage for the rapid growth of the video Web: broadband Internet access is ubiquitous, and streaming media protocols, systems, and encoding standards are mature. In addition to Web video delivery, users can easily contribute content captured on low cost camera phones and other consumer products. The media and entertainment industry no longer views these developments as a threat to their established business practices, but as an opportunity to provide services for more viewers in a wider range of consumption contexts. The emergence of IPTV and mobile video services offers unprecedented access to an ever growing number of broadcast channels and provides the flexibility to deliver new, more personalized video services. Highly capable portable media players allow us to take this personalized content with us, and to consume it even in places where the network does not reach. Video search engines enable users to take advantage of these emerging video resources for a wide variety of applications including entertainment, education and communications. However, the task of information extraction from video for retrieval applications is challenging, providing opportunities for innovation. This book aims to first describe the current state of video search engine technology and second to inform those with the requisite technical skills of the opportunities to contribute to the development of this field. Today's Web search engines have greatly improved the accessibility and therefore the value of the Web.
Recently many researchers have been working on cluster analysis as a main tool for exploratory data analysis and data mining. A notable feature is that specialists in different fields of science consider the tool of data clustering to be useful. A major reason is that clustering algorithms and software are flexible, in the sense that different mathematical frameworks are employed in the algorithms and a user can select a suitable method according to his application. Moreover, clustering algorithms have different outputs, ranging from the old dendrograms of agglomerative clustering to more recent self-organizing maps. Thus a researcher or user can choose an appropriate output suited to his purpose, which is another flexibility of the methods of clustering. An old and still most popular method is K-means, which uses K cluster centers: a group of data is gathered around a cluster center and thus forms a cluster. The main subject of this book is the fuzzy c-means proposed by Dunn and Bezdek and their variations, including recent studies. A main reason why we concentrate on fuzzy c-means is that most methodology and application studies in fuzzy clustering use fuzzy c-means, and fuzzy c-means should be considered a major technique of clustering in general, regardless of whether one is interested in fuzzy methods or not. Moreover, recent advances in clustering techniques are rapid, and we require a new textbook that includes recent algorithms. We should also note that several books have recently been published, but their contents do not include some of the methods studied here.
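The K-means idea mentioned in the blurb can be sketched in a few lines (an illustrative toy implementation, not code from the book; the data points and parameters are invented for the example): each point is assigned to its nearest center, and each center is then recomputed as the mean of its assigned points.

```python
import random

def kmeans(points, k, iters=20):
    # Each cluster is represented by a center; every point is assigned
    # to its nearest center, then centers are recomputed as the mean
    # of the points assigned to them.
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(p, centers[j])))
            clusters[nearest].append(p)
        centers = [tuple(sum(coord) / len(cl) for coord in zip(*cl))
                   if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

random.seed(0)  # fixed seed so the toy run is repeatable
points = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
centers, clusters = kmeans(points, k=2)
```

Fuzzy c-means, the book's main subject, generalizes this by replacing the hard nearest-center assignment with fractional membership degrees for every point in every cluster.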
This thesis primarily focuses on how to carry out intelligent sensing and understand the high-dimensional and low-quality visual information. After exploring the inherent structures of the visual data, it proposes a number of computational models covering an extensive range of mathematical topics, including compressive sensing, graph theory, probabilistic learning and information theory. These computational models are also applied to address a number of real-world problems including biometric recognition, stereo signal reconstruction, natural scene parsing, and SAR image processing.
In software engineering there is a growing need for formalization as a basis for developing powerful computer assisted methods. This volume contains seven extensive lectures prepared for a series of IFIP seminars on the Formal Description of Programming Concepts. The authors are experts in their fields and have contributed substantially to the state of the art in numerous publications. The lectures cover a wide range in the theoretical foundations of programming and give an up-to-date account of the semantic models and the related tools which have been developed in order to allow a rigorous discussion of the problems met in the construction of correct programs. In particular, methods for the specification and transformation of programs are considered in detail. One lecture is devoted to the formalization of concurrency and distributed systems and reflects their great importance in programming. Further topics are the verification of programs and the use of sophisticated type systems in programming. This compendium on the theoretical foundations of programming is also suitable as a textbook for special seminars on different aspects of this broad subject.
Grammatical Evolution: Evolutionary Automatic Programming in an Arbitrary Language provides the first comprehensive introduction to Grammatical Evolution, a novel approach to Genetic Programming that adopts principles from molecular biology in a simple and useful manner, coupled with the use of grammars to specify legal structures in a search. Grammatical Evolution's rich modularity gives a unique flexibility, making it possible to use alternative search strategies - whether evolutionary, deterministic or some other approach - and to even radically change its behavior by merely changing the grammar supplied. This approach to Genetic Programming represents a powerful new weapon in the Machine Learning toolkit that can be applied to a diverse set of problem domains.
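The genotype-to-phenotype mapping at the heart of Grammatical Evolution can be illustrated with a minimal sketch (a simplified illustration over an invented toy grammar, not the book's implementation): each integer codon selects, modulo the number of available productions, a rule for the leftmost non-terminal.

```python
# Toy grammar: an <expr> is either a sum of two expressions, "x", or "1".
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["x"], ["1"]],
}

def map_genotype(codons):
    # Expand the start symbol left to right; each codon picks a production
    # (modulo the rule count). Indexing wraps around the codon list, as in
    # Grammatical Evolution's "wrapping" of short genotypes.
    symbols = ["<expr>"]
    output = []
    i = 0
    while symbols:
        sym = symbols.pop(0)
        if sym in GRAMMAR:
            rules = GRAMMAR[sym]
            choice = rules[codons[i % len(codons)] % len(rules)]
            i += 1
            symbols = choice + symbols
        else:
            output.append(sym)
    return " ".join(output)
```

Because the search operates on the integer genotype while the grammar guarantees syntactic legality, swapping in a different grammar changes the language of evolved programs without touching the search machinery.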
This book describes recent innovations in 3D media and technologies, with coverage of 3D media capturing, processing, encoding, and adaptation, networking aspects for 3D Media, and quality of user experience (QoE). The contributions are based on the results of the FP7 European Project ROMEO, which focuses on new methods for the compression and delivery of 3D multi-view video and spatial audio, as well as the optimization of networking and compression jointly across the future Internet. The delivery of 3D media to individual users remains a highly challenging problem due to the large amount of data involved, diverse network characteristics and user terminal requirements, as well as the user's context such as their preferences and location. As the number of visual views increases, current systems will struggle to meet the demanding requirements in terms of delivery of consistent video quality to fixed and mobile users. ROMEO will present hybrid networking solutions that combine the DVB-T2 and DVB-NGH broadcast access network technologies together with a QoE aware Peer-to-Peer (P2P) distribution system that operates over wired and wireless links. Live streaming 3D media needs to be received by collaborating users at the same time or with imperceptible delay to enable them to watch together while exchanging comments as if they were all in the same location. This book is the second of a series of three annual volumes devoted to the latest results of the FP7 European Project ROMEO. The present volume provides state-of-the-art information on immersive media, 3D multi-view video, spatial audio, cloud-based media, networking protocols for 3D media, P2P 3D media streaming, and 3D Media delivery across heterogeneous wireless networks among other topics. Graduate students and professionals in electrical engineering and computer science with an interest in 3D Future Internet Media will find this volume to be essential reading. 
* Describes the latest innovations in 3D technologies and Future Internet Media
* Focuses on research to facilitate application scenarios such as social TV and high-quality, real-time collaboration
* Discusses QoE for 3D
* Represents the last of a series of three volumes devoted to contributions from FP7 projects in the area of 3D and networked media
Time is a fascinating subject and has long since captured mankind's imagination, from the ancients to modern man, both adult and child alike. It has been studied across a wide range of disciplines, from the natural sciences to philosophy and logic. Today, thirty-plus years since Prior's work in laying out foundations for temporal logic, and two decades on from Pnueli's seminal work applying temporal logic to the specification and verification of computer programs, temporal logic has a strong and thriving international research community within the broad disciplines of computer science and artificial intelligence. Areas of activity include, but are certainly not restricted to: Pure Temporal Logic, e.g. temporal systems, proof theory, model theory, expressiveness and complexity issues, algebraic properties, application of game theory; Specification and Verification, e.g. of reactive systems, of real-time components, of user interaction, of hardware systems, techniques and tools for verification, execution and prototyping methods; Temporal Databases, e.g. temporal representation, temporal querying, granularity of time, update mechanisms, active temporal databases, hypothetical reasoning; Temporal Aspects in AI, e.g. modelling temporal phenomena, interval temporal calculi, temporal nonmonotonicity, interaction of temporal reasoning with action/knowledge/belief logics, temporal planning; Tense and Aspect in Natural Language, e.g. models, ontologies, temporal quantifiers, connectives, prepositions, processing temporal statements; Temporal Theorem Proving, e.g. translation methods, clausal and non-clausal resolution, tableaux, automata-theoretic approaches, tools and practical systems.
This book explains how to write .NET 2.0 applications and services. It provides you with a clean slate, erasing the need for developing the COM, DCOM, COM+, or ActiveX components that used to be a necessity. Instead, you'll learn how to write .NET applications using C++/CLI. This book is based on its highly successful predecessor, and bridges the gap between classic C++ and C++/CLI. Furthermore, this edition is based on the newest version of Visual Studio .NET (2005) and .NET Platform version 2.0. And all topic areas include specific code examples. By the end of the book, you will be proficient in developing .NET applications and services for both the Windows desktop and the Web.
Key to our culture is that we can disseminate information, and then maintain and access it over time. While we are rapidly advancing from vulnerable physical solutions to superior, digital media, preserving and using data over the long term involves complicated research challenges and organization efforts. Uwe Borghoff and his coauthors address the problem of storing, reading, and using digital data for periods longer than 50 years. They briefly describe several markup and document description languages like TIFF, PDF, HTML, and XML, explain the most important techniques such as migration and emulation, and present the OAIS (Open Archival Information System) Reference Model. To complement this background information on the technology issues the authors present the most relevant international preservation projects, such as the Dublin Core Metadata Initiative, and experiences from sample projects run by the Cornell University Library and the National Library of the Netherlands. A rated survey list of available systems and tools completes the book. With this broad overview, the authors address librarians who preserve our digital heritage, computer scientists who develop technologies that access data, and information managers engaged with the social and methodological requirements of long-term information access.
I am very happy to have this opportunity to introduce Luca Vigano's book on Labelled Non-Classical Logics. I put forward the methodology of labelled deductive systems to the participants of Logic Colloquium '90 (Labelled Deductive Systems, a Position Paper, in J. Oikkonen and J. Vaananen, editors, Logic Colloquium '90, Volume 2 of Lecture Notes in Logic, pages 66-68, Springer, Berlin, 1993), in an attempt to establish labelling as a recognised and significant component of our logic culture. It was a response to earlier isolated uses of labels by various distinguished authors, as a means to achieve local proof-theoretic goals. Labelling was used in many different areas such as resource labelling in relevance logics, prefix tableaux in modal logics, annotated logic programs in logic programming, proof tracing in truth maintenance systems, and various side annotations in higher-order proof theory, arithmetic and analysis. This widespread local use of labels was an indication of an underlying logical pattern, namely the simultaneous side-by-side manipulation of several kinds of logical information. It was clear that there was a need to establish the labelled deductive systems methodology. Modal logic is one major area where labelling can be developed quickly and systematically with a view to demonstrating its power and significant advantage. In modal logic the labels can play a double role.
Pro Java ME MMAPI: Mobile Media API is the first and only Java book that explores the Mobile Media API in great detail. The API lets mobile game and other mobile application developers add multimedia features for more dynamic multimedia functionality on cell phones, PDAs and other mobile devices.
Proceedings of the FISITA 2012 World Automotive Congress are selected from nearly 2,000 papers submitted to the 34th FISITA World Automotive Congress, which was held by the Society of Automotive Engineers of China (SAE-China) and the International Federation of Automotive Engineering Societies (FISITA). These proceedings focus on solutions for sustainable mobility in all areas of passenger car, truck and bus transportation. Volume 6: Vehicle Electronics focuses on:
* Engine/Chassis/Body Electronic Control
* Electrical and Electronic System
* Software and Hardware Development
* Electromagnetic Compatibility (EMC)
* Vehicle Sensor and Actuator
* In-Vehicle Network
* Multi-Media/Infotainment System
Above all, researchers, professional engineers and graduates in the fields of automotive engineering, mechanical engineering and electronic engineering will benefit from this book. SAE-China is a national academic organization composed of enterprises and professionals who focus on research, design and education in the fields of automotive and related industries. FISITA is the umbrella organization for the national automotive societies in 37 countries around the world. It was founded in Paris in 1948 with the purpose of bringing engineers from around the world together in a spirit of cooperation to share ideas and advance the technological development of the automobile.
It is said that business re-engineering is part of our transition to a post-industrial society. The purpose of this book is to present an approach to reorganizing businesses using the discipline of software engineering as a guiding paradigm. The author's thesis is that software engineering provides the necessary analytical expertise for defining business processes and the tools to transform process descriptions into support systems.
The goal of this guide and manual is to provide a practical and brief overview of the theory of computerized adaptive testing (CAT) and multistage testing (MST) and to illustrate the methodologies and applications using the open source language R and several data examples. Implementation relies on the R packages catR and mstR, which have been developed (or are still being developed) by the first author and his team and which include some of the newest research algorithms on the topic. The book covers many topics along with the R code: the basics of R, a theoretical overview of CAT and MST, CAT designs, CAT assembly methodologies, CAT simulations, the catR package, CAT applications, MST designs, IRT-based MST methodologies, tree-based MST methodologies, the mstR package, and MST applications. CAT has been used in many large-scale assessments over recent decades, and MST has become very popular in recent years. R has also become one of the most useful open source tools for applications in almost all fields, including business and education. Though very useful and popular, R is a difficult language to learn, with a steep learning curve. Given the obvious need for CAT and MST, and the complexity of implementing them, it is very difficult for users to simulate or implement either on their own. Until this manual, there has been no book enabling users to design and use CAT and MST easily and without expense, i.e., by using the free R software. All examples and illustrations are generated using predefined scripts in the R language, available for free download from the book's website.
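As a flavour of what CAT item selection involves (a hedged illustration written in Python rather than R, not code from catR; the item difficulties are invented), a common rule under the Rasch (1PL) IRT model is to administer the not-yet-seen item whose difficulty maximizes Fisher information at the current ability estimate:

```python
import math

def rasch_prob(theta, b):
    # Probability of a correct response under the Rasch model,
    # for ability theta and item difficulty b.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, difficulties, administered):
    # For the Rasch model, Fisher information at theta is p * (1 - p),
    # which peaks for items whose difficulty is closest to theta.
    def info(i):
        p = rasch_prob(theta, difficulties[i])
        return p * (1 - p)
    candidates = [i for i in range(len(difficulties)) if i not in administered]
    return max(candidates, key=info)

# With ability estimate 0.0, the item with difficulty nearest 0 is chosen.
items = [-2.0, -0.5, 0.1, 1.5]
chosen = next_item(0.0, items, set())
```

A real CAT alternates this selection step with re-estimation of theta after each response, which is the loop packages such as catR automate.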
* Reviews different machine learning and deep learning techniques from a biomedical perspective
* Provides relevant case studies that demonstrate the applicability of different AI techniques
* Explains different kinds of inputs, such as various image modalities and biomedical signal types
* Covers the latest trends in AI-based biomedical domains, including IoT, drug discovery, biomechanics, robotics and electronic health records
* Discusses the research challenges and opportunities in the AI and biomedical domains
Enterprise Information Systems Design, Implementation and Management: Organizational Applications investigates the creation and implementation of enterprise information systems. Covering a wide array of topics such as flow-shop scheduling, information systems outsourcing, ERP systems utilization, Dietz transaction methodology, and advanced planning systems, it is an essential reference source for researchers and professionals alike.
Data compression is now indispensable to products and services of many industries including computers, communications, healthcare, publishing and entertainment. This invaluable resource introduces this area to information system managers and others who need to understand how it is changing the world of digital systems. For those who know the technology well, it reveals what happens when data compression is used in real-world applications and provides guidance for future technology development.
Solders have given the designer of modern consumer, commercial, and military electronic systems a remarkable flexibility to interconnect electronic components. The properties of solder have facilitated broad assembly choices that have fueled creative applications to advance technology. Solder is the electrical and mechanical "glue" of electronic assemblies. This pervasive dependency on solder has stimulated new interest in applications as well as a more concerted effort to better understand materials properties. We need not look far to see solder being used to interconnect ever finer geometries. Assembly of micropassive discrete devices that are hardly visible to the unaided eye, of silicon chips directly to ceramic and plastic substrates, and of very fine peripheral leaded packages constitute a few of solder's uses. There has been a marked increase in university research related to solder. New electronic packaging centers stimulate applications, and materials engineering and science departments have demonstrated a new vigor to improve both the materials and our understanding of them. Industrial research and development continues to stimulate new applications, and refreshing new packaging ideas are emerging. New handbooks have been published to help both the neophyte and seasoned packaging engineer.
Software is difficult to develop, maintain, and reuse. Two factors that contribute to this difficulty are the lack of modular design and good program documentation. The first makes software changes more difficult to implement. The second makes programs more difficult to understand and to maintain. Formal Specification Techniques for Engineering Modular C Programs describes a novel approach to promoting program modularity. The book presents a formal specification language that promotes software modularity through the use of abstract data types, even though the underlying programming language may not have such support. This language is structured to allow useful information to be extracted from a specification, which is then used to perform consistency checks between the specification and its implementation. Formal Specification Techniques for Engineering Modular C Programs also describes a specification-driven, software re-engineering process model for improving existing programs. The aim of this process is to make existing programs easier to maintain and reuse while keeping their essential functionalities unchanged. Audience: Suitable as a secondary text for graduate level courses in software engineering, and as a reference for researchers and practitioners in industry.
The research presented in this book discusses how to efficiently retrieve track and trace information for an item of interest that took a certain path through a complex network of manufacturers, wholesalers, retailers, and consumers. To this end, a super-ordinate system called "Discovery Service" is designed that has to handle large amounts of data, high insert-rates, and a high number of queries that are submitted to the discovery service. An example that is used throughout this book is the European pharmaceutical supply chain, which faces the challenge that more and more counterfeit medicinal products are being introduced. Between October and December 2008, more than 34 million fake drug pills were detected at customs control at the borders of the European Union. These fake drugs can put lives in danger as they were supposed to fight cancer, take effect as painkiller or antibiotics, among others. The concepts described in this book can be adopted for supply chain management use cases other than track and trace, such as recall, supply chain optimization, or supply chain analytics.
As games become increasingly embedded into everyday life, understanding the ethics of their creation and use, as well as their potential for practicing ethical thinking, becomes more relevant. Designing Games for Ethics: Models, Techniques and Frameworks brings together the diverse and growing community of voices and begins to define the field, identify its primary challenges and questions, and establish the current state of the discipline. Such a rigorous, collaborative, and holistic foundation for the study of ethics and games is necessary to appropriately inform future games, policies, standards, and curricula.
No other area of biology has grown as fast and become as relevant over the last decade as virology. It is with no little amazement that, the more we learn about fundamental biological questions and mechanisms of disease, the more obvious it becomes that viruses permeate all facets of our lives. While on one hand viruses are known to cause acute and chronic, mild and fatal, focal and generalized diseases, on the other hand they are used as tools for gaining an understanding of the structure and function of higher organisms, and as vehicles for carrying protective or curative therapies. The wide scope of approaches to different biological and medical virological questions was well represented by the speakers who participated in this year's Symposium. While the epidemic of human immunodeficiency virus type 1 continues to spread without hope for much relief in sight, intriguing questions and answers in the areas of diagnostics, clinical manifestations and therapeutic approaches to viral infections are unveiled daily. Let us hope that, with the increasing awareness by our society of the role played by viruses, not only as causative agents of disease but also as models for better understanding basic biological principles, more efforts and resources are placed into their study. Luis M. de la Maza, Irvine, California; Ellena M.
You may like...
* Introductory Readings In Geographic… by D. J. Peuquet, D. F. Marble (Paperback, R2,277)
* Advancements in Security and Privacy… by Ashwani Kumar, Seelam Sai Satyanarayana Reddy (Hardcover, R5,924)
* Advances in Intelligent Information… by Jeng-Shyang Pan, Pei-Wei Tsai, … (Hardcover, R6,490)
* Computer Graphics with Open GL - Pearson… by Donald Hearn, Pauline Baker, … (Paperback, R2,239)
* Mobile Multimedia Communications by David J. Goodman, Dipankar Raychaudhuri (Hardcover, R4,264)