Many emerging technologies such as video conferencing, video-on-demand, and digital libraries require the efficient delivery of compressed video streams. For applications that require the delivery of compressed stored multimedia streams, the a priori knowledge available about these compressed streams can aid in the allocation of server and network resources. By using a client-side buffer, the resource requirements from the server and network can be minimized. Buffering Techniques for Delivery of Compressed Video in Video-on-Demand Systems presents a comprehensive description of buffering techniques for the delivery of compressed, prerecorded multimedia data. While these techniques can be applied to any compressed data streams, this book focuses primarily on the delivery of video streams because of the large resources they consume. The book first describes buffering techniques for the continuous playback of stored video sources. In particular, several bandwidth smoothing (or buffering) algorithms that are provably optimal under certain conditions are presented. To provide a well-rounded discussion, the book then describes extensions that aid in the ability to provide interactive delivery of video across networks. Specifically, reservation techniques that take into account interactive functions such as fast-forward and rewind are described. In addition, extensions to the bandwidth smoothing algorithms presented in the first few chapters are described. These algorithms are designed with interactive, continuous playback of stored video in mind and are also provably optimal under certain constraints. Buffering Techniques for Delivery of Compressed Video in Video-on-Demand Systems serves as an excellent resource for designers of multimedia systems, networks and video-on-demand services, and may be used as a text for advanced courses on the topic.
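To make the buffering idea concrete, here is a minimal sketch (not one of the book's provably optimal smoothing algorithms; the binary-search formulation and all names are illustrative assumptions) of how a client-side buffer lets a server replace a bursty frame-by-frame schedule with a single smooth transmission rate:

```python
# A minimal sketch: find the smallest constant transmission rate, in bytes
# per frame interval, at which a client-side buffer of `buffer_size` bytes
# can play back `frames` (per-frame sizes in bytes) without underflowing.

def playback_feasible(rate, frames, buffer_size):
    buffered = 0.0
    for size in frames:
        # The sender ships up to `rate` bytes per interval but idles rather
        # than overflow the client buffer.
        buffered = min(buffered + rate, buffer_size)
        if buffered < size:          # underflow: frame missed its deadline
            return False
        buffered -= size             # client consumes one frame
    return True

def min_smooth_rate(frames, buffer_size, tol=1e-3):
    assert buffer_size >= max(frames), "buffer must hold the largest frame"
    lo, hi = 0.0, float(max(frames)) # sending at the peak frame size suffices
    while hi - lo > tol:             # feasibility is monotone in the rate
        mid = (lo + hi) / 2
        if playback_feasible(mid, frames, buffer_size):
            hi = mid
        else:
            lo = mid
    return hi

bursty = [200, 1500, 300, 100, 2200, 250]        # hypothetical frame sizes
print(min_smooth_rate(bursty, buffer_size=3000)) # well below the 2200 peak
```

For this sample trace the smoothed rate settles near 860 bytes per interval rather than the 2200-byte peak, which illustrates the kind of saving in server and network reservation described above.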
It is important to understand what came before and how to meld new products with legacy systems. Network managers need to understand the context and origins of the systems they are using. Programmers need an understanding of the reasons behind the interfaces they must satisfy and the relationship of the software they build to the whole network. And finally, sales representatives need to see the context into which their products must fit.
Real-Time Systems Engineering and Applications is a well-structured collection of chapters pertaining to present and future developments in real-time systems engineering. After an overview of real-time processing, theoretical foundations are presented. The book then introduces useful modeling concepts and tools, and goes on to concentrate on the more practical aspects of real-time engineering, with a thorough overview of the present state of the art in both hardware and software, including related concepts in robotics. Examples are given of novel real-time applications that illustrate the present state of the art. The book concludes with a focus on future developments, giving direction for new research activities and an educational curriculum covering the subject. This book can be used as a source for academic and industrial researchers as well as a textbook for computing and engineering courses covering the topic of real-time systems engineering.
Content protection and digital rights management (DRM) are fields that receive a lot of attention: content owners require systems that protect and maximize their revenues; consumers want backwards compatibility, while they fear that content owners will spy on their viewing habits; and academics are afraid that DRM may be a barrier to knowledge sharing. DRM technologies have a poor reputation and are not yet trusted. This book describes the key aspects of content protection and DRM systems, the objective being to demystify the technology and techniques. In the first part of the book, the author builds the foundations, with sections that cover the rationale for protecting digital video content; video piracy; current toolboxes that employ cryptography, watermarking, tamper resistance, and rights expression languages; different ways to model video content protection; and DRM. In the second part, he describes the main existing deployed solutions, including video ecosystems; how video is protected in broadcasting; descriptions of DRM systems, such as Microsoft's DRM and Apple's FairPlay; techniques for protecting prerecorded content distributed using DVDs or Blu-ray; and future methods used to protect content within the home network. The final part of the book looks towards future research topics, and the key problem of interoperability. While the book focuses on protecting video content, the DRM principles and technologies described are also used to protect many other types of content, such as ebooks, documents and games. The book will be of value to industrial researchers and engineers developing related technologies, academics and students in information security, cryptography and media systems, and engaged consumers.
Welcome to the third International Conference on Management of Multimedia Networks and Services (MMNS'2000) in Fortaleza (Brazil)! The first MMNS was held in Montreal (Canada) in July 1997 and the second in Versailles (France) in November 1998. The MMNS conference takes place every year and a half and aims to be a truly international event, bringing together researchers and practitioners from all around the world and organising the conference each time in a different continent/country. Over the past several years, there has been a considerable amount of research within the fields of multimedia networking and network management. Much of that work has taken place within the context of managing Quality of Service in broadband integrated services digital networks such as ATM, and more recently in IP-based networks, to respond to the requirements of emerging multimedia applications. ATM networks were designed to support multimedia traffic with diverse characteristics and can be used as the transfer mode for both wired and wireless networks. A new set of Internet protocols is being developed to provide better quality of service, which is a prerequisite for supporting multimedia applications. Multimedia applications have a different set of requirements, which impacts the design of the underlying communication network as well as its management. Several QoS management mechanisms intervening at different layers of the communication network are required, including QoS routing, QoS-based transport, QoS negotiation, QoS adaptation, FCAPS management, and mobility management.
Advanced Communications and Multimedia Security presents a state-of-the-art review of current perspectives as well as the latest developments in the area of communications and multimedia security. It examines requirements, issues and solutions pertinent to securing information networks, and identifies future security-related research challenges. A wide spectrum of topics is discussed, including: -Applied cryptography; -Biometry; -Communication systems security; -Applications security; -Mobile security; -Distributed systems security; -Digital watermarking and digital signatures. This volume comprises the proceedings of the sixth Joint Working Conference on Communications and Multimedia Security (CMS'02), which was sponsored by the International Federation for Information Processing (IFIP) and held in September 2002 in Portoroz, Slovenia. It constitutes essential reading for information security specialists, researchers and professionals working in the area of computer science and communication systems.
Research Directions in Data and Applications Security describes original research results and innovative practical developments, all focused on maintaining security and privacy in database systems and applications that pervade cyberspace. The areas of coverage include: -Role-Based Access Control; -Database Security; -XML Security; -Data Mining and Inference; -Multimedia System Security; -Network Security; -Public Key Infrastructure; -Formal Methods and Protocols; -Security and Privacy.
This book constitutes the proceedings of the 16th International Conference on Integer Programming and Combinatorial Optimization, IPCO 2013, held in Valparaiso, Chile, in March 2013. The 33 full papers presented were carefully reviewed and selected from 98 submissions. The conference is a forum for researchers and practitioners working on various aspects of integer programming and combinatorial optimization with the aim to present recent developments in theory, computation, and applications. The scope of IPCO is viewed in a broad sense, to include algorithmic and structural results in integer programming and combinatorial optimization as well as revealing computational studies and novel applications of discrete optimization to practical problems.
High Performance Networking is a state-of-the-art book that deals with issues relating to the fast-paced evolution of public, corporate and residential networks. It focuses on the practical and experimental aspects of high performance networks and introduces novel approaches and concepts aimed at improving the performance, usability, interoperability and scalability of such systems. Among others, the topics covered include: * Java applets and applications; * distributed virtual environments; * new internet streaming protocols; * web telecollaboration tools; * Internet, Intranet; * real-time services like multimedia; * quality of service; * mobility. High Performance Networking comprises the proceedings of the Eighth International Conference on High Performance Networking, sponsored by the International Federation for Information Processing (IFIP) and held at Vienna University of Technology, Vienna, Austria, in September 1998. High Performance Networking is suitable as a secondary text for a graduate level course on high performance networking, and as a reference for researchers and practitioners in industry.
This book is the outcome of the Dagstuhl Seminar 13201 on Information Visualization - Towards Multivariate Network Visualization, held in Dagstuhl Castle, Germany in May 2013. The goal of this Dagstuhl Seminar was to bring together theoreticians and practitioners from Information Visualization, HCI and Graph Drawing with a special focus on multivariate network visualization, i.e., on graphs where the nodes and/or edges have additional (multidimensional) attributes. The integration of multivariate data into complex networks and their visual analysis is one of the big challenges not only in visualization, but also in many application areas. Thus, in order to support discussions related to the visualization of real-world data, researchers from selected application areas, especially bioinformatics, the social sciences and software engineering, were also invited. The unique "Dagstuhl climate" ensured an open and undisturbed atmosphere in which to discuss the state of the art, new directions and open challenges of multivariate network visualization.
Structure of Solutions of Variational Problems is devoted to recent progress made in the study of the structure of approximate solutions of variational problems considered on subintervals of a real line. Results on properties of approximate solutions which are independent of the length of the interval, for all sufficiently large intervals, are presented in a clear manner. Solutions, new approaches, techniques and methods for a number of difficult problems in the calculus of variations are illustrated throughout this book. The book also contains significant results and information about the turnpike property of variational problems. This well-known property is a general phenomenon which holds for large classes of variational problems. The author examines the turnpike property through individual (non-generic) turnpike results, sufficient and necessary conditions for the turnpike phenomenon, and the non-intersection property for extremals of variational problems. This book appeals to mathematicians working in optimal control and the calculus of variations, as well as to graduate students.
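As a rough illustration of the setting (a schematic formulation under assumed notation, not the author's exact hypotheses), the prototypical problem and the turnpike property can be written as:

```latex
% Variational problem on a subinterval [0, T] of the real line:
\min_{x(\cdot)} \; \int_{0}^{T} f\bigl(x(t), x'(t)\bigr)\, dt,
\qquad x(0) = y, \quad x(T) = z.
% Turnpike property (informal): there is a point \bar{x} such that for each
% \varepsilon > 0 there exists \tau > 0 with the following property: for all
% sufficiently large T, every approximate solution x(\cdot) satisfies
%   |x(t) - \bar{x}| \le \varepsilon \quad \text{for all } t \in [\tau, T - \tau],
% i.e., the solution hugs the "turnpike" \bar{x} except near the endpoints.
```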
Diversity is characteristic of the information age and also of statistics. To date, the social sciences have contributed greatly to the development of handling data under the rubric of measurement, while the statistical sciences have made phenomenal advances in theory and algorithms. Measurement and Multivariate Analysis promotes an effective interplay between those two realms of research: diversity with unity. The union and the intersection of those two areas of interest are reflected in the papers in this book, drawn from an international conference in Banff, Canada, with participants from 15 countries. In five major categories - scaling, structural analysis, statistical inference, algorithms, and data analysis - readers will find a rich variety of topics of current interest in the extended statistical community.
IFIP/SEC2000, being part of the 16th IFIP World Computer Congress (WCC2000), is being held in Beijing, China from August 21 to 25, 2000. SEC2000 is the annual conference of TC11 (Information Security) of the International Federation for Information Processing. The conference focuses on the seamless integration of information security services as an integral part of the Global Information Infrastructure in the new millennium. SEC2000 is sponsored by the China Computer Federation (CCF), IFIP/TC11, and the Engineering Research Centre for Information Security Technology, Chinese Academy of Sciences (ERCIST, CAS). There were 180 papers submitted for inclusion; 50 of them have been accepted as long papers and included in these proceedings, and 81 have been accepted as short papers and published in a separate proceedings volume. All papers presented at this conference were reviewed blindly by a minimum of two international reviewers. The authors' affiliations of the 180 submissions and the accepted 131 papers range over 26 and 25 countries or regions, respectively. We would like to thank all who submitted papers to IFIP/SEC2000, and the authors of accepted papers for their on-time preparation of camera-ready final versions. Without their contribution there would be no conference. We wish to express our gratitude to all program committee members and other reviewers for their hard work in reviewing the papers in a short time and for contributing to the conference in different ways. We would like to thank Rein Venter for his time and expertise in compiling the final version of the proceedings.
This book constitutes the refereed proceedings of the 5th International Conference on Pairing-Based Cryptography, Pairing 2012, held in Cologne, Germany, in May 2012.
Video segmentation is the most fundamental process for appropriate indexing and retrieval of video intervals. In general, video streams are composed of shots delimited by physical shot boundaries. Substantial work has been done on how to detect such shot boundaries automatically (Arman et al., 1993) (Zhang et al., 1993) (Zhang et al., 1995) (Kobla et al., 1997). Through the integration of technologies such as image processing, speech/character recognition and natural language understanding, keywords can be extracted and associated with these shots for indexing (Wactlar et al., 1996). A single shot, however, rarely carries enough information to be meaningful by itself. Usually, it is a semantically meaningful interval that most users are interested in retrieving. Generally, such meaningful intervals span several consecutive shots. There hardly exists any efficient and reliable technique, either automatic or manual, to identify all semantically meaningful intervals within a video stream. Works by (Smith and Davenport, 1992) (Oomoto and Tanaka, 1993) (Weiss et al., 1995) (Hjelsvold et al., 1996) suggest manually defining all such intervals in the database in advance. However, even an hour-long video may have an indefinite number of meaningful intervals. Moreover, video data is multi-interpretative. Therefore, given a query, what is a meaningful interval to an annotator may not be meaningful to the user who issues the query. In practice, manual indexing of meaningful intervals is labour-intensive and inadequate.
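As a concrete illustration of the automatic shot-boundary detection cited above, a common histogram-comparison baseline can be sketched as follows (an illustrative toy, not any particular cited method; the threshold value is an assumption):

```python
# Illustrative baseline for shot-boundary detection: flag a boundary
# wherever the intensity histograms of consecutive frames differ sharply.

def detect_shot_boundaries(histograms, threshold=0.4):
    """histograms: one normalized (sums to 1) intensity histogram per frame."""
    boundaries = []
    for i in range(1, len(histograms)):
        prev, cur = histograms[i - 1], histograms[i]
        # L1 distance between normalized histograms lies in [0, 2].
        diff = sum(abs(a - b) for a, b in zip(prev, cur))
        if diff > threshold:
            boundaries.append(i)      # frame i opens a new shot
    return boundaries

# Two near-identical dark frames, then a hard cut to a bright frame.
hists = [[0.9, 0.1, 0.0], [0.88, 0.12, 0.0], [0.05, 0.15, 0.8]]
print(detect_shot_boundaries(hists))  # -> [2]
```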
This book constitutes the refereed proceedings of the 13th European Conference on Evolutionary Computation in Combinatorial Optimization, EvoCOP 2013, held in Vienna, Austria, in April 2013, colocated with the Evo* 2013 events EuroGP, EvoBIO, EvoMUSART, and EvoApplications. The 23 revised full papers presented were carefully reviewed and selected from 50 submissions. The papers present the latest research and discuss current developments and applications in metaheuristics - a paradigm to effectively solve difficult combinatorial optimization problems appearing in various industrial, economic, and scientific domains. Prominent examples of metaheuristics are ant colony optimization, evolutionary algorithms, greedy randomized adaptive search procedures, iterated local search, simulated annealing, tabu search, and variable neighborhood search. Applications include scheduling, timetabling, network design, transportation and distribution, vehicle routing, the travelling salesman problem, packing and cutting, satisfiability, and general mixed integer programming.
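For readers new to the paradigm, here is what one of the metaheuristics named above might look like in miniature: a simulated annealing loop for a toy travelling salesman instance (purely illustrative; not code from the proceedings):

```python
import math
import random

# Toy simulated annealing for a small travelling salesman instance.
def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing(dist, temp=10.0, cooling=0.999, steps=20000):
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    cur_len = tour_length(tour, dist)
    best, best_len = tour[:], cur_len
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
        cand_len = tour_length(cand, dist)
        # Always accept improvements; accept worsenings with a probability
        # that shrinks as the temperature cools.
        if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / temp):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        temp = max(temp * cooling, 1e-9)
    return best, best_len

dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
print(simulated_annealing(dist))  # optimal tour here has length 18
```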
The NSF Center for Intelligent Information Retrieval (CIIR) was formed in the Computer Science Department of the University of Massachusetts, Amherst, in 1992. Through its efforts in basic research, applied research, and technology transfer, the CIIR has become known internationally as one of the leading research groups in the area of information retrieval. The CIIR focuses on research that results in more effective and efficient access and discovery in large, heterogeneous, distributed text and multimedia databases. The scope of the work that is done in the CIIR is broad and goes significantly beyond 'traditional' areas of information retrieval such as retrieval models, cross-lingual search, and automatic query expansion. The research includes both low-level systems issues such as the design of protocols and architectures for distributed search, as well as more human-centered topics such as user interface design, visualization and data mining with text, and multimedia retrieval. Advances in Information Retrieval: Recent Research from the Center for Intelligent Information Retrieval is a collection of papers that covers a wide variety of topics in the general area of information retrieval. Together, they represent a snapshot of the state of the art in information retrieval at the turn of the century and at the end of a decade that has seen the advent of the World-Wide Web. The papers provide overviews and in-depth analysis of theory and experimental results. This book can be used as source material for graduate courses in information retrieval, and as a reference for researchers and practitioners in industry.
The vast area of Scientific Computing, which is concerned with the computer-aided simulation of various processes in engineering, natural, economic, or social sciences, now enjoys rapid progress owing to the development of new efficient symbolic, numeric, and symbolic/numeric algorithms. It has long been recognized worldwide that the mathematical term algorithm takes its origin from the Latin word algoritmi, which is in turn a Latin transliteration of the Arab name "Al Khoresmi" of the Khoresmian mathematician Moukhammad Khoresmi, who lived in the Khoresm khanate during the years 780-850. The Khoresm khanate took significant parts of the territories of present-day Turkmenistan and Uzbekistan. Such towns of the Khoresm khanate as Bukhara and Marakanda (the present-day Samarkand) were the centers of mathematical science and astronomy. The great Khoresmian mathematician M. Khoresmi introduced the Indian decimal positional system into everyday life; this system is based on using the familiar digits 1, 2, 3, 4, 5, 6, 7, 8, 9, 0. M. Khoresmi presented the arithmetic in the decimal positional calculus (prior to him, the Indian positional system was the subject only of jokes and witty disputes). Khoresmi's Book of Addition and Subtraction by Indian Method (Arithmetic) differs little from present-day arithmetic. This book was translated into Latin in 1150; the last reprint was produced in Rome in 1957.
As computer power grows and data collection technologies advance, a plethora of data is generated in almost every field where computers are used. The computer-generated data should be analyzed by computers; without the aid of computing technologies, it is certain that huge amounts of the data collected will never be examined, let alone be used to our advantage. Even with today's advanced computer technologies (e.g., machine learning and data mining systems), discovering knowledge from data can still be fiendishly hard due to the characteristics of computer-generated data. In its simplest form, raw data are represented as feature-values. The size of a dataset can be measured in two dimensions: number of features (N) and number of instances (P). Both N and P can be enormously large. This enormity may cause serious problems for many data mining systems. Feature selection is one of the long-established methods that deal with these problems. Its objective is to select a minimal subset of features according to some reasonable criteria so that the original task can be achieved equally well, if not better. By choosing a minimal subset of features, irrelevant and redundant features are removed according to the criterion. When N is reduced, the data space shrinks and, in a sense, the dataset is now a better representative of the whole data population. If necessary, the reduction of N can also give rise to the reduction of P by eliminating duplicates.
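A minimal filter-style sketch of the idea (the variance criterion, threshold and names are illustrative assumptions, not a method from this book) shows how choosing a subset of features reduces N, and how that can in turn reduce P by exposing duplicate instances:

```python
# Illustrative feature selection: keep features whose variance exceeds a
# threshold (one simple "reasonable criterion"), then drop the duplicate
# instances that the reduced feature set exposes.

def variance_filter(data, threshold=0.01):
    """data: list of equal-length feature-value rows."""
    n_features = len(data[0])
    keep = []
    for j in range(n_features):
        col = [row[j] for row in data]
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        if var > threshold:
            keep.append(j)                     # feature j is informative
    reduced = [tuple(row[j] for j in keep) for row in data]
    return keep, sorted(set(reduced))          # reducing N can also reduce P

data = [(1.0, 5.0, 0.0), (2.0, 5.0, 0.0), (1.0, 5.0, 0.1)]
kept, rows = variance_filter(data)
print(kept, rows)  # keeps feature 0 only; 3 instances collapse to 2
```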
This year marks the 10th anniversary of the IFIP International Workshop on Protocols for High-Speed Networks (PfHSN). It began in May 1989, on a hillside overlooking Lake Zurich in Switzerland, and arrives now in Salem, Massachusetts, 6,000 kilometers away and 10 years later, in its sixth incarnation, but still with a waterfront view (the Atlantic Ocean). In between, it has visited some picturesque views of other lakes and bays of the world: Palo Alto (1990 - San Francisco Bay), Stockholm (1993 - the Baltic Sea), Vancouver (1994 - the Strait of Georgia and the Pacific Ocean), and Sophia Antipolis / Nice (1996 - the Mediterranean Sea). PfHSN is a workshop providing an international forum for the exchange of information on high-speed networks. It is a relatively small workshop, limited to 80 participants or fewer, to encourage lively discussion and the active participation of all attendees. A significant component of the workshop is interactive in nature, with a long history of significant time reserved for discussions. This was enhanced in 1996 by Christophe Diot and Walid Dabbous with the institution of Working Sessions chaired by an "animator," a distinguished researcher focusing on topical issues of the day. These sessions are an audience participation event, and are one of the things that makes PfHSN a true "working conference."
This book constitutes the thoroughly refereed proceedings of the 10th Theory of Cryptography Conference, TCC 2013, held in Tokyo, Japan, in March 2013. The 36 revised full papers presented were carefully reviewed and selected from 98 submissions. The papers cover topics such as study of known paradigms, approaches, and techniques, directed towards their better understanding and utilization; discovery of new paradigms, approaches and techniques that overcome limitations of the existing ones; formulation and treatment of new cryptographic problems; study of notions of security and relations among them; modeling and analysis of cryptographic algorithms; and study of the complexity assumptions used in cryptography.
This book constitutes the refereed proceedings of the Second IFIP TC 5/8 International Conference on Information and Communication Technology, ICT-EurAsia 2014, with the co-location of Asia ARES 2014 as a special track on Availability, Reliability and Security, held in Bali, Indonesia, in April 2014. The 70 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers have been organized in the following topical sections: applied modeling and simulation; mobile computing; advanced urban-scale ICT applications; semantic web and knowledge management; cloud computing; image processing; software engineering; collaboration technologies and systems; e-learning; data warehousing and data mining; e-government and e-health; biometric and bioinformatics systems; network security; dependable systems and applications; privacy and trust management; cryptography; multimedia security and dependable systems and applications.
Around the globe, nations face the problem of protecting their Critical Information Infrastructure, normally referred to as Cyber Space. In this monograph, we capture five different aspects of the problem: high-speed packet capture, protection through authentication, technology transition, test-bed simulation, and the policy and legal environment. The monograph is the outcome of over three years of cooperation between India and Australia.
This book constitutes the refereed proceedings of the 5th International Symposium on Engineering Secure Software and Systems, ESSoS 2013, held in Paris, France, in February/March 2013. The 13 revised full papers presented together with two idea papers were carefully reviewed and selected from 62 submissions. The papers are organized in topical sections on secure programming, policies, proving, formal methods, and analyzing.
The growth of the Internet and the availability of enormous volumes of data in digital form have necessitated intense interest in techniques for assisting the user in locating data of interest. The Internet has over 350 million pages of data and is expected to reach over one billion pages by the year 2000. Buried on the Internet are both valuable nuggets for answering questions as well as large quantities of information the average person does not care about. The Digital Library effort is also progressing, with the goal of migrating from the traditional book environment to a digital library environment. Information Retrieval Systems: Theory and Implementation provides a theoretical and practical explanation of the latest advancements in information retrieval and their application to existing systems. It takes a system approach, discussing all aspects of an Information Retrieval System. The importance of the Internet and its associated hypertext-linked structure is put into perspective as a new type of information retrieval data structure. The total system approach also includes discussion of the human interface and the importance of information visualization for identification of relevant information. The theoretical metrics used to describe information systems are expanded to discuss their practical application in the uncontrolled environment of real-world systems. Information Retrieval Systems: Theory and Implementation is suitable as a textbook for a graduate-level course on information retrieval, and as a reference for researchers and practitioners in industry.
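As a concrete glimpse of the kind of data structure such systems are built on, here is a toy inverted index, the classic information retrieval structure (an illustrative sketch, not the book's implementation; all names are assumptions):

```python
# A toy inverted index: a map from each term to the set of documents
# containing it, supporting simple conjunctive (AND) queries.
from collections import defaultdict

def build_index(docs):
    """docs: mapping of document id -> text."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return documents containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = set(index.get(terms[0], set()))
    for term in terms[1:]:
        result &= index.get(term, set())
    return result

docs = {1: "compressed video delivery", 2: "video retrieval systems"}
idx = build_index(docs)
print(search(idx, "video delivery"))  # -> {1}
```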