Based on the Fourth International Conference on Quantum Communication, Measurement and Computing, this volume brings together scientists working in the interdisciplinary fields of quantum communication science and technology. Topics include quantum information theory, quantum computing, stochastic processes and filtering, and quantum measurement theory.
Mobile communications have permeated the globe in both business and social cultures. In only a few short years, Japan alone has had more than ten million subscribers enter the mobile market. Such explosive popularity is an indication of a strong commercial demand for communications in both the tethered and tetherless environments. Accompanying the vibrant growth in mobile communications is the growth in multimedia communications, including the Internet. Mobile and multimedia communications technologies are merging, making mobile computing a key phrase in the coming advanced information communication era. The growth in these dynamic industries shows that a change in our chosen method of communications is already well advanced. Reading e-mail and connecting to various information feeds have already become a part of daily business activities. We are trying to grasp the overall picture of mobile computing. Its shape and form are just starting to appear as personal digital assistants (PDA), handheld personal computers (HPC), wireless data communication services, and commercial software designed for mobile environments. We are at the cusp of vast popularization of "computers on the go." "Any Time Anywhere Computing" provides the reader with an understandable explanation of the current developments and commercialization of mobile computing. The core technologies and applications needed to understand the industry are comprehensively addressed. The book emphasizes three infrastructures: (1) wireless communication network infrastructure, (2) terminal devices (or "computers on the go"), and (3) software middleware and architectures that support wireless and mobile computing.
This book introduces 'functional networks', a novel neural-based paradigm, and shows that functional network architectures can be efficiently applied to solve many interesting practical problems. Included is an introduction to neural networks, a description of functional networks, examples of applications, and computer programs in Mathematica and Java languages implementing the various algorithms and methodologies. Special emphasis is given to applications in several areas such as: * Box-Jenkins AR(p), MA(q), ARMA(p, q), and ARIMA(p, d, q) models with application to real-life economic problems such as the consumer price index, electric power consumption and international airlines' passenger data. Random time series and chaotic series are considered in relation to the Henon, Lozi, Holmes and Burger maps, as well as the problems of noise reduction and information masking. * Learning differential equations from data and deriving the corresponding equivalent difference and functional equations. Examples of a mass supported by two springs and a viscous damper or dashpot, and a loaded beam, are used to illustrate the concepts. * The problem of obtaining the most general family of implicit, explicit and parametric surfaces as used in Computer Aided Design (CAD). * Applications of functional networks to obtain general nonlinear regression models are given and compared with standard techniques. Functional Networks with Applications: A Neural-Based Paradigm will be of interest to individuals who work in computer science, physics, engineering, applied mathematics, statistics, economics, and other neural networks and data analysis related fields.
This book constitutes the refereed proceedings of the 11th European Conference on Evolutionary Computation, Machine Learning and Data Mining in Bioinformatics, EvoBIO 2013, held in Vienna, Austria, in April 2013, colocated with the Evo* 2013 events EuroGP, EvoCOP, EvoMUSART and EvoApplications. The 10 revised full papers presented together with 9 poster papers were carefully reviewed and selected from numerous submissions. The papers cover a wide range of topics in the field of biological data analysis and computational biology. They address important problems in biology, from the molecular and genomic dimension to the individual and population level, often drawing inspiration from biological systems in order to produce solutions to biological problems.
The NSF Center for Intelligent Information Retrieval (CIIR) was formed in the Computer Science Department of the University of Massachusetts, Amherst, in 1992. Through its efforts in basic research, applied research, and technology transfer, the CIIR has become known internationally as one of the leading research groups in the area of information retrieval. The CIIR focuses on research that results in more effective and efficient access and discovery in large, heterogeneous, distributed text and multimedia databases. The scope of the work that is done in the CIIR is broad and goes significantly beyond 'traditional' areas of information retrieval such as retrieval models, cross-lingual search, and automatic query expansion. The research includes both low-level systems issues such as the design of protocols and architectures for distributed search, as well as more human-centered topics such as user interface design, visualization and data mining with text, and multimedia retrieval. Advances in Information Retrieval: Recent Research from the Center for Intelligent Information Retrieval is a collection of papers that covers a wide variety of topics in the general area of information retrieval. Together, they represent a snapshot of the state of the art in information retrieval at the turn of the century and at the end of a decade that has seen the advent of the World-Wide Web. The papers provide overviews and in-depth analysis of theory and experimental results. This book can be used as source material for graduate courses in information retrieval, and as a reference for researchers and practitioners in industry.
Mining Spatio-Temporal Information Systems, an edited volume, is composed of chapters from leading experts in the field of Spatial-Temporal Information Systems and addresses the many issues in support of modeling, creation, querying, visualizing and mining. Mining Spatio-Temporal Information Systems is intended to bring together a coherent body of recent knowledge relating to STIS data modeling, design, implementation and STIS in knowledge discovery. In particular, the reader is exposed to the latest techniques for the practical design of STIS, essential for complex query processing. Mining Spatio-Temporal Information Systems is structured to meet the needs of practitioners and researchers in industry and graduate-level students in Computer Science.
Diversity is characteristic of the information age and also of statistics. To date, the social sciences have contributed greatly to the development of handling data under the rubric of measurement, while the statistical sciences have made phenomenal advances in theory and algorithms. Measurement and Multivariate Analysis promotes an effective interplay between those two realms of research: diversity with unity. The union and the intersection of those two areas of interest are reflected in the papers in this book, drawn from an international conference in Banff, Canada, with participants from 15 countries. In five major categories - scaling, structural analysis, statistical inference, algorithms, and data analysis - readers will find a rich variety of topics of current interest in the extended statistical community.
IFIP/SEC2000, being part of the 16th IFIP World Computer Congress (WCC2000), is being held in Beijing, China from August 21 to 25, 2000. SEC2000 is the annual conference of TC11 (Information Security) of the International Federation of Information Processing. The conference focuses on the seamless integration of information security services as an integral part of the Global Information Infrastructure in the new millennium. SEC2000 is sponsored by the China Computer Federation (CCF), IFIP/TC11, and the Engineering Research Centre for Information Security Technology, Chinese Academy of Sciences (ERCIST, CAS). There were 180 papers submitted for inclusion; 50 of them have been accepted as long papers and included in this proceedings, and 81 papers have been accepted as short papers and published in another proceedings. All papers presented in this conference were reviewed blindly by a minimum of two international reviewers. The authors' affiliations of the 180 submissions and the accepted 131 papers range over 26 and 25 countries or regions, respectively. We would like to thank all who have submitted papers to IFIP/SEC2000, and the authors of accepted papers for their on-time preparation of camera-ready final versions. Without their contribution there would be no conference. We wish to express our gratitude to all program committee members and other reviewers for their hard work in reviewing the papers in a short time and for contributing to the conference in different ways. We would like to thank Rein Venter for his time and expertise in compiling the final version of the proceedings.
This book constitutes the refereed proceedings of the 13th European Conference on Evolutionary Computation in Combinatorial Optimization, EvoCOP 2013, held in Vienna, Austria, in April 2013, colocated with the Evo* 2013 events EuroGP, EvoBIO, EvoMUSART, and EvoApplications. The 23 revised full papers presented were carefully reviewed and selected from 50 submissions. The papers present the latest research and discuss current developments and applications in metaheuristics - a paradigm to effectively solve difficult combinatorial optimization problems appearing in various industrial, economic, and scientific domains. Prominent examples of metaheuristics are ant colony optimization, evolutionary algorithms, greedy randomized adaptive search procedures, iterated local search, simulated annealing, tabu search, and variable neighborhood search. Applications include scheduling, timetabling, network design, transportation and distribution, vehicle routing, the travelling salesman problem, packing and cutting, satisfiability, and general mixed integer programming.
Real-Time Systems Engineering and Applications is a well-structured collection of chapters pertaining to present and future developments in real-time systems engineering. After an overview of real-time processing, theoretical foundations are presented. The book then introduces useful modeling concepts and tools. This is followed by concentration on the more practical aspects of real-time engineering with a thorough overview of the present state of the art, both in hardware and software, including related concepts in robotics. Examples are given of novel real-time applications which illustrate the present state of the art. The book concludes with a focus on future developments, giving direction for new research activities and an educational curriculum covering the subject. This book can be used as a source for academic and industrial researchers as well as a textbook for computing and engineering courses covering the topic of real-time systems engineering.
Video segmentation is the most fundamental process for appropriate indexing and retrieval of video intervals. In general, video streams are composed of shots delimited by physical shot boundaries. Substantial work has been done on how to detect such shot boundaries automatically (Arman et al., 1993) (Zhang et al., 1993) (Zhang et al., 1995) (Kobla et al., 1997). Through the integration of technologies such as image processing, speech/character recognition and natural language understanding, keywords can be extracted and associated with these shots for indexing (Wactlar et al., 1996). A single shot, however, rarely carries enough information to be meaningful by itself. Usually, it is a semantically meaningful interval that most users are interested in retrieving. Generally, such meaningful intervals span several consecutive shots. There hardly exists any efficient and reliable technique, either automatic or manual, to identify all semantically meaningful intervals within a video stream. Works by (Smith and Davenport, 1992) (Oomoto and Tanaka, 1993) (Weiss et al., 1995) (Hjelsvold et al., 1996) suggest manually defining all such intervals in the database in advance. However, even an hour-long video may have an indefinite number of meaningful intervals. Moreover, video data is multi-interpretative. Therefore, given a query, what is a meaningful interval to an annotator may not be meaningful to the user who issues the query. In practice, manual indexing of meaningful intervals is labour intensive and inadequate.
This book contains the thoroughly refereed post-conference proceedings of the 14th Information Hiding Conference, IH 2012, held in Berkeley, CA, USA, in May 2012. The 18 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on multimedia forensics and counter-forensics, steganalysis, data hiding in unusual content, steganography, covert channels, anonymity and privacy, watermarking, and fingerprinting.
The SGML FAQ Book: Understanding the Foundation of HTML and XML is similar, but not quite the same kind of thing as an online FAQ or 'Frequently Asked Questions' list. It addresses questions from people who already actually use SGML in some way (including HTML authors), and people who are about to use it. It deals mainly with issues that arise when using SGML in practice. A very brief introduction to SGML is included as Appendix A. The questions discussed in The SGML FAQ Book are repeatedly heard by people who make their living serving the SGML community. SGML experts spend many hours teaching these details, sometimes repeatedly because some questions do not seem important - until you run into them. So one benefit of this book is learning more of the art of document creation and management, both by general reading before questions arise and by specific reference when a question arises. For the latter use, the appendices, glossary, and index are particularly important. A second benefit of this book is that it provides a common theme to its answers that you can apply in your use of SGML, HTML and related languages in general.The fundamental answer to many of the questions boils down to 'simplify': many questions do not show up if you use the simple, elegant core of SGML without worrying about optional features. The credo of this book is simply, 'SGML doesn't need to be complicated'. SGML has the potential for complexity at certain points. But much of the complexity comes from optional parts and can be avoided. SGML methodology and its primary benefits suffer no loss even if you skip many features, which speaks well for the quality of SGML's overall design. Many of the questions discussed involve those optional parts, and therefore can be avoided by judicious designers and authors. 
The two key goals of the book are (1) to answer questions that you may actually encounter as an SGML user, and to help you get 'unstuck' and be as productive as possible in using the language and (2) to show proactive ways you can simplify your use of SGML, and get its very substantial benefits with minimal complexity.
Advanced Communications and Multimedia Security presents a state-of-the-art review of current perspectives as well as the latest developments in the area of communications and multimedia security. It examines requirements, issues and solutions pertinent to securing information networks, and identifies future security-related research challenges. A wide spectrum of topics is discussed, including: -Applied cryptography; -Biometry; -Communication systems security; -Applications security; -Mobile security; -Distributed systems security; -Digital watermarking and digital signatures. This volume comprises the proceedings of the sixth Joint Working Conference on Communications and Multimedia Security (CMS'02), which was sponsored by the International Federation for Information Processing (IFIP) and held in September 2002 in Portoroz, Slovenia. It constitutes essential reading for information security specialists, researchers and professionals working in the area of computer science and communication systems.
Object-Oriented Behavioral Specifications encourages builders of complex information systems to accelerate their move to using the approach of a scientific discipline in analysis rather than the approach of a craft. The focus is on understanding customers' needs and on precise specification of understanding gained through analysis. Specifications must bridge any gaps in understanding about business rules among customers, Subject Matter Experts, and `computer people', must inform decisions about reuse of software and systems, and must enable review of semantics over time. Specifications need to describe semantics rather than syntax, and to do that in an abstract and precise manner, in order to create software systems that satisfy business rules. The papers in this book show various ways of designing elegant and clear specifications which are reusable, lead to savings of intellectual effort, time, and money, and which contribute to the reliability of software and systems. Object-Oriented Behavioral Specifications offers a fresh treatment of the object-oriented paradigm by examining the limitations of traditional OO methodologies and by describing the significance of competing trends in OO modeling. The book builds on four years of successful OOPSLA workshops (1991-1995) on behavior semantics. This book deals with precise specifications of `what' is accomplished by the business and `what' is to be done by a system. The book includes descriptions of successful use of abstract and precise specification in industry. It draws on the experience of experts from industrial and academic settings and benefits from international participation. Collective behavior, neglected in some treatment of the OO paradigm, is addressed explicitly in this book. The book does not take `reuse' of specifications or software for granted, but furnishes a foundation for taking as rigorous an approach to reuse decisions as to precise specifications in original developments.
This is an English version of the book in two volumes, entitled "Keijo Shori Kogaku (1), (2)" (Nikkan Kogyo Shinbun Co.) written in Japanese. The purpose of the book is a unified and systematic exposition of the wealth of research results in the field of mathematical representation of curves and surfaces for computer aided geometric design that have appeared in the last thirty years. The material for the book started life as a set of notes for computer aided geometric design courses which I had at the graduate schools of both computer science, the University of Utah in U.S.A., and Kyushu Institute of Design in Japan. The book has been used extensively as a standard textbook of curves and surfaces for students, practical engineers and researchers. With the aim of systematic exposition, the author has arranged the book in 8 chapters: Chapter 0: The significance of mathematical representations of curves and surfaces is explained and historical research developments in this field are reviewed. Chapter 1: Basic mathematical theories of curves and surfaces are reviewed and summarized. Chapter 2: A classical interpolation method, the Lagrange interpolation, is discussed. Although its use is uncommon in practice, this chapter is helpful in understanding Chaps. 4 and 6. Chapter 3: This chapter discusses the Coons surface in detail, which is one of the most important contributions in this field. Chapter 4: The fundamentals of spline functions, spline curves and surfaces are discussed in some detail.
This year marks the 10th anniversary of the IFIP International Workshop on Protocols for High-Speed Networks (PfHSN). It began in May 1989, on a hillside overlooking Lake Zurich in Switzerland, and arrives now in Salem, Massachusetts, 6,000 kilometers away and 10 years later, in its sixth incarnation, but still with a waterfront view (the Atlantic Ocean). In between, it has visited some picturesque views of other lakes and bays of the world: Palo Alto (1990 - San Francisco Bay), Stockholm (1993 - Baltic Sea), Vancouver (1994 - the Strait of Georgia and the Pacific Ocean), and Sophia Antipolis / Nice (1996 - the Mediterranean Sea). PfHSN is a workshop providing an international forum for the exchange of information on high-speed networks. It is a relatively small workshop, limited to 80 participants or less, to encourage lively discussion and the active participation of all attendees. A significant component of the workshop is interactive in nature, with a long history of significant time reserved for discussions. This was enhanced in 1996 by Christophe Diot and Walid Dabbous with the institution of Working Sessions chaired by an "animator," who is a distinguished researcher focusing on topical issues of the day. These sessions are an audience participation event, and are one of the things that makes PfHSN a true "working conference."
This book constitutes the proceedings of the 33rd Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2014, held in Copenhagen, Denmark, in May 2014. The 38 full papers included in this volume were carefully reviewed and selected from 197 submissions. They deal with public key cryptanalysis, identity-based encryption, key derivation and quantum computing, secret-key analysis and implementations, obfuscation and multilinear maps, authenticated encryption, symmetric encryption, multi-party encryption, side-channel attacks, signatures and public-key encryption, functional encryption, foundations and multi-party computation.
This two-volume-set (LNCS 8384 and 8385) constitutes the refereed proceedings of the 10th International Conference of Parallel Processing and Applied Mathematics, PPAM 2013, held in Warsaw, Poland, in September 2013. The 143 revised full papers presented in both volumes were carefully reviewed and selected from numerous submissions. The papers cover important fields of parallel/distributed/cloud computing and applied mathematics, such as numerical algorithms and parallel scientific computing; parallel non-numerical algorithms; tools and environments for parallel/distributed/cloud computing; applications of parallel computing; applied mathematics, evolutionary computing and metaheuristics.
This book constitutes the refereed proceedings of the 5th International Symposium on Engineering Secure Software and Systems, ESSoS 2013, held in Paris, France, in February/March 2013. The 13 revised full papers presented together with two idea papers were carefully reviewed and selected from 62 submissions. The papers are organized in topical sections on secure programming, policies, proving, formal methods, and analyzing.
The growth of the Internet and the availability of enormous volumes of data in digital form have necessitated intense interest in techniques for assisting the user in locating data of interest. The Internet has over 350 million pages of data and is expected to reach over one billion pages by the year 2000. Buried on the Internet are both valuable nuggets for answering questions as well as large quantities of information the average person does not care about. The Digital Library effort is also progressing, with the goal of migrating from the traditional book environment to a digital library environment. Information Retrieval Systems: Theory and Implementation provides a theoretical and practical explanation of the latest advancements in information retrieval and their application to existing systems. It takes a system approach, discussing all aspects of an Information Retrieval System. The importance of the Internet and its associated hypertext-linked structure is put into perspective as a new type of information retrieval data structure. The total system approach also includes discussion of the human interface and the importance of information visualization for identification of relevant information. The theoretical metrics used to describe information systems are expanded to discuss their practical application in the uncontrolled environment of real world systems. Information Retrieval Systems: Theory and Implementation is suitable as a textbook for a graduate-level course on information retrieval, and as a reference for researchers and practitioners in industry.
Around the globe, nations face the problem of protecting their Critical Information Infrastructure, normally referred to as Cyber Space. In this monograph, we capture FIVE different aspects of the problem; High speed packet capture, Protection through authentication, Technology Transition, Test Bed Simulation, and Policy and Legal Environment. The monograph is the outcome of over three years of cooperation between India and Australia.
This book constitutes the thoroughly refereed post-conference proceedings of the workshops of the 19th International Conference on Parallel Computing, Euro-Par 2013, held in Aachen, Germany in August 2013. The 99 papers presented were carefully reviewed and selected from 145 submissions. The papers include seven workshops that have been co-located with Euro-Par in the previous years: - BigDataCloud (Second Workshop on Big Data Management in Clouds) - HeteroPar (11th Workshop on Algorithms, Models and Tools for Parallel Computing on Heterogeneous Platforms) - HiBB (Fourth Workshop on High Performance Bioinformatics and Biomedicine) - OMHI (Second Workshop on On-chip Memory Hierarchies and Interconnects) - PROPER (Sixth Workshop on Productivity and Performance) - Resilience (Sixth Workshop on Resiliency in High Performance Computing with Clusters, Clouds, and Grids) - UCHPC (Sixth Workshop on UnConventional High Performance Computing) as well as six newcomers: - DIHC (First Workshop on Dependability and Interoperability in Heterogeneous Clouds) - FedICI (First Workshop on Federative and Interoperable Cloud Infrastructures) - LSDVE (First Workshop on Large Scale Distributed Virtual Environments on Clouds and P2P) - MHPC (Workshop on Middleware for HPC and Big Data Systems) - PADABS (First Workshop on Parallel and Distributed Agent Based Simulations) - ROME (First Workshop on Runtime and Operating Systems for the Many-core Era) All these workshops focus on promotion and advancement of all aspects of parallel and distributed computing.
This book constitutes the thoroughly refereed post-conference proceedings of the Third International Conference on Agents and Artificial Intelligence, ICAART 2011, held in Rome, Italy, in January 2011. The 26 revised full papers presented together with two invited papers were carefully reviewed and selected from 367 submissions. The papers are organized in two topical sections on artificial intelligence and on agents.
This book constitutes the thoroughly refereed post-conference proceedings of the 24th International Workshop on Languages and Compilers for Parallel Computing, LCPC 2011, held in Fort Collins, CO, USA, in September 2011. The 19 revised full papers presented and 19 poster papers were carefully reviewed and selected from 52 submissions. The scope of the workshop spans the theoretical and practical aspects of parallel and high-performance computing, and targets parallel platforms including concurrent, multithreaded, multicore, accelerator, multiprocessor, and cluster systems.