Content Licensing is a wide-ranging and comprehensive guide to providing content for electronic dissemination. It offers a step-by-step introduction to the why, the how, and the frequently asked questions of digital content and its licensing, and it examines the context in which licensing takes place. What makes the book unique is that it examines licensing from a range of perspectives.
 
 
 
 
The typesetting system TeX, invented by Donald Knuth, turned thirty-two years old in 2010. For this anniversary, a commemorative book has been prepared containing example papers by Knuth and the Stanford graduate students who helped develop TeX. These papers were all selected from the TeX Users Group's TUGboat journal archive.
 The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning solutions for the major difficulties. It is a valuable resource for those working in machine learning for natural language processing as well as anyone studying time in language, or involved in annotating the structure of time in documents. 
The objective of this monograph is to improve the performance of sentiment analysis models by incorporating semantic, syntactic and common-sense knowledge. The book proposes a novel semantic concept extraction approach that uses dependency relations between words to extract features from text; the proposed approach combines semantic and common-sense knowledge for a better understanding of the text. In addition, the book aims to extract prominent features from unstructured text by eliminating noisy, irrelevant and redundant features. Readers will also discover a proposed method for efficient dimensionality reduction to alleviate the data sparseness problem faced by machine learning models. The authors highlight four main findings:
- The performance of sentiment analysis can be improved by reducing redundancy among the features. Experimental results show that the minimum Redundancy Maximum Relevance (mRMR) feature selection technique improves performance by eliminating redundant features.
- The Boolean Multinomial Naive Bayes (BMNB) machine learning algorithm with mRMR feature selection performs better than a Support Vector Machine (SVM) classifier for sentiment analysis.
- The problem of data sparseness is alleviated by semantic clustering of features, which in turn improves the performance of sentiment analysis.
- Semantic relations among the words in a text provide useful cues for sentiment analysis. Common-sense knowledge in the form of the ConceptNet ontology provides a better understanding of the text, which further improves performance.
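The pipeline described above, Boolean term features plus a Multinomial Naive Bayes classifier with relevance-based feature selection, can be sketched in a few lines of scikit-learn. The sketch below is illustrative only and is not the book's code: scikit-learn has no built-in mRMR, so mutual information is used as a stand-in relevance score (true mRMR additionally penalizes redundancy between selected features), and the corpus is a toy example.

```python
# Hedged sketch of a BMNB-style sentiment classifier with relevance-based
# feature selection. Not the book's implementation; mutual_info_classif only
# approximates the "maximum relevance" half of mRMR.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = [  # toy corpus, for illustration only
    "the film was wonderful and moving",
    "a dull, redundant and boring plot",
    "great acting and a wonderful story",
    "boring dialogue, a complete waste of time",
]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

model = make_pipeline(
    CountVectorizer(binary=True),           # Boolean term-presence features
    SelectKBest(mutual_info_classif, k=5),  # keep the most relevant terms
    MultinomialNB(),                        # the "BMNB" classifier
)
model.fit(docs, labels)
print(model.predict(["a wonderful, moving story", "what a boring waste"]))
```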
The first comprehensive guide to explore the growing field of electronic information, The Text in the Machine: Electronic Texts in the Humanities will help you create and use electronic texts. This book explains the processes involved in developing computerized books on library Web sites, CD-ROMs, or your own Web site. With the information provided by The Text in the Machine, you'll be able to successfully transfer written words to a digitized form and increase access to any kind of information. Keeping the perspectives of scholars, students, librarians, users, and publishers in mind, this book outlines the necessary steps for electronic conversion in a comprehensive manner. The Text in the Machine addresses many variables that need to be taken into consideration to help you digitize texts, such as:
- defining types of markup, markup systems, and their uses
- identifying characteristics of the written text, such as its linguistic and physical nature, before choosing a markup scheme
- ensuring accuracy in electronic texts by keying in information up to three times and choosing software that is compatible with the markup systems you are using
- examining the best file formats for scanning written texts and converting them to digital form
- explaining the delivery systems available for electronic texts, such as CD-ROMs, the Internet, magnetic tape, and the variety of software that will interpret these interfaces
- designing the structure of electronic texts with linear presentation, segmented text, or image files to increase readability and accessibility
Containing lists of suggested readings and examples of electronic text Web sites, this book provides you with the opportunity to see how other libraries and scholars are creating and publishing digital texts. From The Text in the Machine, you'll receive the knowledge to make this medium of information accessible and beneficial to patrons and scholars around the world.
1.1 What is typography?; 1.2 Typography as a craft; 1.3 Type.
Type: 2.1 The character; 2.2 Character dimensions; 2.3 Serifs; 2.4 The individual character (the letter); 2.5 Classifying typefaces; 2.6 Typeface names within a family of cuts; 2.7 Special characters and special cuts; 2.8 Tracking and letter spacing; 2.8.1 Kerning; 2.8.2 Ligatures; 2.9 Emphasis; 2.10 Word spacing; 2.11 Line spacing and leading; 2.12 Alignment; 2.13 Initials; 2.14 Shaped text setting.
Measurements in typography: 3.1 Typographic units; 3.1.1 Body and cap heights; 3.1.2 Measurements in DTP; 3.1.3 Further typographic measurements.
The type area: 4.1 Page proportions; 4.2 Defining the type area; 4.2.1 Margins of the type area; 4.2.2 Layout grids; 4.3 Style elements within the type area; 4.3.1 Type sizes; 4.3.2 Paragraph subdivision; 4.3.3 The grey value of a page; 4.3.4 Footnotes and marginal notes; 4.4 Fine corrections; 4.4.1 Paragraph breaks; 4.4.2 Hyphenation; 4.4.3 Aesthetics programs.
Choosing type for the text: 5.1 What a typeface expresses; 5.2 Text faces; 5.3 Headings/headlines; 5.4 Which typeface for which purpose?; 5.5 Mixing typefaces; 5.6 Exceptions.
Writing rules: 6.1 Setting numbers; 6.2 Paragraph numbering; 6.3 Abbreviations; 6.4 Different quotation marks; 6.5 Spaces; 6.6 Different dashes and rules.
Table setting: 7.1 Tabbed columns or table; 7.2 Table components; 7.3 Table design; 7.4 Table structure; 7.5 Special situations; 7.6 A chart instead of a table; 7.7 A table instead of a list.
Illustrations: 8.1 Arranging illustrations; 8.2 Line weights; 8.3 A suitable level of detail; 8.4 Type in illustrations; 8.5 Bleed; 8.6 Halftone images and screens; 8.6.1 Screening; 8.6.2 Image quality and colour depth; 8.6.3 Technical screens; 8.6.4 Dot gain; 8.7 Compression, done right; 8.8 Colour in documents; 8.8.1 Using colour; 8.8.2 The mood and effect of colours; 8.8.3 Colours in the colour wheel; 8.8.4 Colour harmony; 8.8.5 Colours in charts and graphics; 8.8.6 Adapting colours to the output medium.
From numbers to charts: 9.1 Different chart types; 9.1.1 Pie charts; 9.1.2 Bar charts; 9.1.3 Column charts; 9.1.4 Pictorial charts; 9.1.5 Line charts; 9.1.6 Radar charts; 9.2 Three-dimensional charts; 9.3 Scales; 9.4 Further rules for charts.
Presentations: 10.1 Preliminary considerations; 10.2 From information to presentation; 10.3 Presentation media; 10.4 Structuring the slides; 10.5 Type on slides; 10.6 Macro- and micro-typography in presentations; 10.6.1 Typo-orthography and writing rules; 10.6.2 A careful template is half the work; 10.7 Graphics on slides; 10.7.1 Images as a symbolic background; 10.7.2 Functional graphics; 10.7.3 Further principles for graphics; 10.8 Transitions and animations; 10.9 Further tips for presentations.
The standard letter: 11.1 The DIN letter; 11.2 Envelopes and fold types.
From page to book: 12.1 The workflow of a publication; 12.2 The parts of a book; 12.3 Front matter; 12.3.1 The half title; 12.3.2 The main title; 12.3.3 The imprint; 12.3.4 The table of contents; 12.3.5 Preface or foreword; 12.4 Back matter; 12.4.1 Bibliography; 12.4.2 Glossary; 12.4.3 Index; 12.5 The binding.
Typesetting and proofreading: 13.1 Typesetting instructions; 13.2 Corrections and proof marks.
Imagesetting: 14.1 The exposure; 14.2 PostScript, PDF or document file?; 14.2.1 PostScript files; 14.2.2 Adobe Acrobat and PDF; 14.2.3 Document files; 14.3 Fonts when imagesetting; 14.4 Graphics and images; 14.5 Imagesetting specifications; 14.6 Screens when imagesetting; 14.7 Imagesetting colours; 14.8 The imagesetting order form; 14.9 Proofs.
Printing and binding: 15.1 Different printing processes; 15.2 Preparing for printing; 15.2.1 Exposing the film; 15.2.2 Imposition.
 
The explosion of information technology has led to substantial growth of web-accessible linguistic data in terms of quantity, diversity and complexity. These resources become even more useful when interlinked with each other to generate network effects. The general trend of providing data online is thus accompanied by newly developing methodologies to interconnect linguistic data and metadata. This includes linguistic data collections, general-purpose knowledge bases (e.g., DBpedia, a machine-readable edition of Wikipedia), and repositories with specific information about languages, linguistic categories and phenomena. The Linked Data paradigm provides a framework for interoperability and access management, and thereby makes it possible to integrate information from such a diverse set of resources. The contributions assembled in this volume illustrate the breadth of applications of the Linked Data paradigm for representative types of language resources. They cover lexical-semantic resources, annotated corpora, typological databases as well as terminology and metadata repositories. The book includes representative applications from diverse fields, ranging from academic linguistics (e.g., typology and corpus linguistics) through applied linguistics (e.g., lexicography and translation studies) to technical applications (in computational linguistics, Natural Language Processing and information technology). This volume accompanies the Workshop on Linked Data in Linguistics 2012 (LDL-2012) in Frankfurt/M., Germany, organized by the Open Linguistics Working Group (OWLG) of the Open Knowledge Foundation (OKFN). It assembles contributions of the workshop participants and, beyond this, it summarizes initial steps in the formation of a Linked Open Data cloud of linguistic resources, the Linguistic Linked Open Data cloud (LLOD).
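As a concrete illustration of the interlinking idea, and not an example from the volume itself, the short Python sketch below uses the rdflib library to publish a single lexical entry as RDF and link it to DBpedia via owl:sameAs; the http://example.org/lexicon/ namespace and the LexicalEntry class are invented for the illustration.

```python
# Hypothetical sketch: expose one lexical entry as Linked Data and link it
# to a general-purpose knowledge base (DBpedia). Requires the rdflib package.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/lexicon/")  # invented namespace
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

entry = EX["Frankfurt"]
g.add((entry, RDF.type, EX.LexicalEntry))  # invented class
g.add((entry, RDFS.label, Literal("Frankfurt", lang="de")))
# The link that creates the network effect: same referent in DBpedia.
g.add((entry, OWL.sameAs, URIRef("http://dbpedia.org/resource/Frankfurt")))

print(g.serialize(format="turtle"))
```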
Text classification is becoming a crucial task to analysts in different areas. In the last few decades, the production of textual documents in digital form has increased exponentially. Their applications range from web pages to scientific documents, including emails, news and books. Despite the widespread use of digital texts, handling them is inherently difficult: the large amount of data necessary to represent them and the subjectivity of classification complicate matters. This book gives a concise view of how to use kernel approaches for inductive inference in large scale text classification; it presents a series of new techniques to enhance, scale and distribute text classification tasks. It is not intended to be a comprehensive survey of the state-of-the-art of the whole field of text classification. Its purpose is less ambitious and more practical: to explain and illustrate some of the important methods used in this field, in particular kernel approaches and techniques.
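As a minimal illustration of the kernel approach, and not one of the book's own scaling or distribution techniques, the sketch below trains a support vector machine on TF-IDF vectors with scikit-learn; the two-class corpus is invented, and the linear kernel can be swapped for any other kernel function.

```python
# Illustrative kernel-based text classifier: TF-IDF features + SVM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

docs = [  # toy two-class corpus
    "stock markets fell sharply on inflation fears",
    "the striker scored twice in the final match",
    "the central bank raised interest rates again",
    "the home team clinched the league title",
]
labels = ["finance", "sport", "finance", "sport"]

# A linear kernel is the usual starting point for sparse text vectors;
# kernel="rbf" or a custom kernel function is a one-line change.
classifier = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
classifier.fit(docs, labels)
print(classifier.predict(["rates and markets", "a late goal won the match"]))
```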
We are living in a multilingual world, and the diversity of languages used to interact with information access systems has generated a wide variety of challenges to be addressed by computer and information scientists. The growing amount of non-English information accessible globally and the increased worldwide exposure of enterprises also necessitate the adaptation of Information Retrieval (IR) methods to new, multilingual settings. Peters, Braschler and Clough present a comprehensive description of the technologies involved in designing and developing systems for Multilingual Information Retrieval (MLIR). They provide readers with broad coverage of the various issues involved in creating systems to make accessible digitally stored materials regardless of the language(s) they are written in. Details on Cross-Language Information Retrieval (CLIR) are also covered, helping readers to understand how to develop retrieval systems that cross language boundaries. Their work is divided into six chapters and accompanies the reader step by step through the various stages involved in building, using and evaluating MLIR systems. The book concludes with some examples of recent applications that utilise MLIR technologies. Some of the techniques described have recently started to appear in commercial search systems, while others have the potential to be part of future incarnations. The book is intended for graduate students, scholars, and practitioners with a basic understanding of classical text retrieval methods. It offers guidelines and information on all aspects that need to be taken into consideration when building MLIR systems, while avoiding too many 'hands-on details' that could rapidly become obsolete. Thus it bridges the gap between the material covered by most of the classical IR textbooks and the novel requirements related to the acquisition and dissemination of information in whatever language it is stored.
Electronic Multimedia Publishing brings together in one place important contributions and up-to-date research results in this fast-moving area. Electronic Multimedia Publishing serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.
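One classic building block that practically oriented treatments of grammatical inference start from is the prefix tree acceptor (PTA) built from positive example strings; state-merging algorithms such as RPNI then generalize it by merging compatible states. The Python sketch below is a self-contained illustration of that first step and is not code taken from the book.

```python
# Build a prefix tree acceptor (PTA) from positive samples and test membership.
# Illustrative only; state 0 is the start state.

def build_pta(samples):
    """Return (transitions, accepting) for the PTA of the given strings."""
    transitions, accepting, next_state = {}, set(), 1
    for word in samples:
        state = 0
        for symbol in word:
            if (state, symbol) not in transitions:
                transitions[(state, symbol)] = next_state
                next_state += 1
            state = transitions[(state, symbol)]
        accepting.add(state)  # the state reached by a complete sample accepts
    return transitions, accepting

def accepts(transitions, accepting, word):
    """True if the PTA accepts the word (only the training strings, so far)."""
    state = 0
    for symbol in word:
        if (state, symbol) not in transitions:
            return False
        state = transitions[(state, symbol)]
    return state in accepting

trans, acc = build_pta(["ab", "abb", "ba"])
print(accepts(trans, acc, "abb"), accepts(trans, acc, "bb"))  # True False
```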
This book explains how to turn your manuscript into an ebook, and use various market channels to sell your book efficiently and effectively. The first few chapters explain how to improve your book's chance of commercial success by writing hooks in your book, proofreading your own work, formatting and typesetting your book, converting it into a digital download and uploading it for sale on Amazon Kindle. This book explains how to use Facebook, LinkedIn, Twitter, Pinterest and other social networking sites to promote your book to reach a global market, and explains how to successfully build an author brand, develop an Amazon Author Page and participate on Kindleboards. Being savvy online is an important part of publishing and promoting your ebook. Tweeting and retweeting on Twitter, linking to Pinterest and other social bookmarking sites, and securing positive reviews from experts on Goodreads and other social cataloguing websites are just some of the useful online tips this book covers. It also explains how to seek reviews and high-level endorsements for your book, how to set yourself up as a reviewer and how to provide reciprocal reviews of other authors' books. Want to raise your literary profile and build up a readership and fan base but don't know where to start? This book walks you through blogging, the ins and outs of YouTube and the importance of an author website to help get your presence out there and your work known. It even discusses how to sell the foreign rights to your book and seek traditional publication once your book has garnered local and regional notoriety. Turn your book into a bestseller: 'Publish and Promote Your Ebook IN A DAY' will show you how.
RagTime 5.6 for Windows and MacOS is particularly well suited to producing exposés, business reports, analyses, presentations, catalogues and periodicals. This versatility stems from the software's layout capabilities and its integrated office functions, such as spreadsheets, word processing, graphics and drawing tools, numerous import and export options, professional ICC-standard colour management (commercial version) and extended functionality through add-on modules. Following a concise introduction to RagTime's concepts and operation, this workbook uses numerous example projects from the applications mentioned to offer valuable tips, tricks and techniques for novices and experienced users alike. The accompanying hybrid CD-ROM for MacOS and Windows contains the full version of RagTime 5.6.1 privat, useful tools, information and the authors' examples.
How many hours have you spent trying to get LaTeX to do what you want? If the answer is 'too many', then you need LaTeX Line by Line. Written for LaTeX novices, this book starts from the very basics, providing clear explanations and realistic examples. The book is designed to help you find solutions to specific tasks, such as indexing, setting complicated mathematics and producing simple line diagrams. It also aims to guide you through the process of creating documents as simple as letters and articles or as complex as books, all to a professional presentational quality. The glossary provides guidance on every standard command, and a few non-standard ones too. Although the book provides a comprehensive introduction to the latest version of LaTeX, namely LaTeX 2ε, an appendix clearly explains the differences between LaTeX 2ε and the original version of LaTeX. In this way, the book can profitably be used by people who still have the older version. Many of the examples of typesetting that are given in the book are coded as templates and are available on the accompanying website. They can be used as recipes or skeletons which can be customized by making minor adjustments to the commands. Full explanations of how these documents have been constructed are given for those wanting to better understand the inner workings of this flexible and powerful package.
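For readers who have never seen LaTeX source, a generic minimal document, not one of the book's own templates, looks like the sketch below; it shows the LaTeX 2ε article skeleton together with a numbered, cross-referenced equation of the kind the book teaches you to set.

```latex
% Generic minimal LaTeX 2e example (not from the book's template collection).
\documentclass[11pt]{article}
\usepackage{amsmath}  % richer mathematics environments

\begin{document}

\section{A displayed equation}
Euler's identity can be set as a numbered display:
\begin{equation}
  e^{i\pi} + 1 = 0.
  \label{eq:euler}
\end{equation}
Equation~\eqref{eq:euler} is then referenced automatically.

\end{document}
```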
This issue represents a broad synopsis of the past, present, and future of electronic publishing. The contributors explore the opportunities and challenges related to this new distribution channel, and the effect of this change on publishers, authors/editors, distributors, and consumers. Holding the key to the "new world," publishers will be faced with new opportunities and nagging issues related to new competition, content control, and protection of revenue streams, requiring strategies that stress rationalization of distribution systems, cross-promotion, strategic pricing, and leveraging new revenue sources. In addition, this issue also highlights the objections of consumers to these types of change, the benefits of the new technology for consumers, and the adaptation of the publishing industry as a whole.
 
Comprehensive, cross-platform, DIY guide to the creation of a wide range of graphic effects: from the scanning and manipulation of photographs to exciting 3D graphics and the creative use of typography. Benefit from a design professional's experience, not the software vendors'!
Part One leads you through a summary of the rapid advances in graphic design software and hardware now available to the PC or Mac user, followed by a structured overview of the rich array of resources available to the digital designer in the form of drawing, painting and 3D applications, clipart, photolibraries, scanned images, digital photographs and new Internet sources.
Part Two is structured in the form of a series of Workshop sessions. Each session explains in simple language the methods and techniques used to create the wide variety of over 300 graphic design examples included in the book. The examples are based on a wide range of popular PC and Mac applications, covering vector drawing, painting, scanning, photo editing, use of special effect filters and the creation of 3D effects.
Ken Pender is a freelance graphic arts professional. He has also worked for 25 years with IBM and was Manager of their European Computer Integrated Manufacturing Technology Centre in Germany.
New Subediting gives a detailed account of modern editing and production techniques. Its aim is both to help the young subeditor and to spell out to the newcomer to newspaper journalism what happens between the writing of news stories and features and their appearance in the newspaper when it comes off the press. In this age of technological change the quality of the subbing has never been more important to a successful newspaper. The careful use of typography, pictures, graphics and compelling headlines, and the skillful handling of text coupled with good page planning, all help to give character, style and readability. This book examines, and draws lessons from, work in contemporary newspapers in editing and presentation; it defines the varied techniques of copytasting, of editing news stories and features, of styles of headline writing and the use of typography to guide and draw the attention of the reader. It takes into account developments in the use of English as a vehicle of mass communication in two important chapters on structure and word use; and it shows how to get the best out of the electronic tools now available to subeditors. It also reminds journalists that, however advanced the tools, a newspaper is only as good as the creative skills of those who write, edit and put it together.
Donald Knuth's influence in computer science ranges from the invention of literate programming to the development of the TeX typesetting system. One of the foremost figures in the field of mathematical sciences, Knuth has written papers which stand as milestones of development over a wide range of topics. In this collection, the second in the series, Knuth explores the relationship between computers and typography. The present volume, in the words of the author, is the legacy of all the work he has done on typography. When type designers, punch cutters, typographers, book historians, and scholars visited the University while Knuth was working in this field, it gave Stanford what some consider to be its golden age of digital typography. By the author's own admission, the present work is one of the most difficult books that he has prepared. This is truly a work that only Knuth could have produced.
TYPOGRAPHIC PROJECTS TO SHARPEN YOUR CREATIVE SKILLS & DIVERSIFY YOUR PORTFOLIO
Whether you're a seasoned pro looking to brush up your portfolio, or a novice with a laptop full of design software you haven't yet mastered, this book has you covered. In dozens of projects, the authors guide you through the nitty-gritty details of book design, magazine layout, poster production, and all manner of print projects, from start to finish. The Type Project Book is loaded with tips and insider knowledge that will help you hone your design skills, deepen your type knowledge, and nerd out on the history of graphic design. Each section is a deep dive into real-world design projects from working designers: a cookbook; a letterpress gig poster; an animated web banner; an infographic; even the humble business card is explored. Along the way, wisdom is offered, tips and time-saving tricks are shared, the secrets of working graphic designers are revealed, all with the requisite doses of wit one expects from seasoned professionals with decades of experience.
THE TYPE PROJECT BOOK PROVIDES:
- A wide variety of typography-focused projects ranging from a single letter to a book of several hundred pages
- An understanding of the design principles involved in creating impactful graphic design
- Immersion into the wider world of type and lettering and its use for artistic expression
- Tips and techniques for the most efficient working practices
Learning Experience Design (LXD) offers a powerful new approach to creating memorable learning experiences that deliver superior outcomes, bridging the gap between creative design disciplines and the world of learning. Now, one of the field's leading pioneers has written the definitive guide to LXD: what it is, how it works, what's better about it, and how you can make the most of it. Drawing on over a decade of experience defining, applying, and teaching LXD, Niels Floor covers LXD mindsets, methods, skills, and tools: everything you need to succeed. Floor guides you step by step through every stage of the LXD process, from preliminary questions and ideas to focused design research, and from prototyping to final design. Floor also introduces the world's #1 LXD tool, his own Learning Experience Canvas, together with key tools such as personas, empathy maps, and experience maps. Whether you're a professional learning designer, course developer, or corporate training specialist, you'll find this guide invaluable. And if you're a creative professional, it'll open new vistas of opportunity in the fast-growing marketplace for learning solutions.