This book constitutes the refereed proceedings of the 8th Metadata and Semantics Research Conference, MTSR 2014, held in Karlsruhe, Germany, in November 2014. The 23 full papers and 9 short papers presented were carefully reviewed and selected from 57 submissions. The papers are organized in several sessions and tracks. They cover the following topics: metadata and linked data: tools and models; (meta) data quality assessment and curation; semantic interoperability, ontology-based data access and representation; big data and digital libraries in health, science and technology; metadata and semantics for open repositories, research information systems and data infrastructure; metadata and semantics for cultural collections and applications; semantics for agriculture, food and environment.
This book constitutes the thoroughly refereed post-conference proceedings of the 10th International Workshop on Graphics Recognition, GREC 2013, held in Bethlehem, PA, USA, in August 2013. The 20 revised full papers presented were carefully reviewed and selected from 32 initial submissions. Graphics recognition is a subfield of document image analysis that deals with graphical entities in engineering drawings, sketches, maps, architectural plans, musical scores, mathematical notation, tables, and diagrams. Accordingly the conference papers are organized in 5 topical sessions on symbol spotting and retrieval, graphics recognition in context, structural and perceptual based approaches, low level processing, and performance evaluation and ground truthing.
Data mining, an interdisciplinary field combining methods from artificial intelligence, machine learning, statistics and database systems, has grown tremendously over the last 20 years and produced core results for applications like business intelligence, spatio-temporal data analysis, bioinformatics, and stream data processing. The fifteen contributors to this volume are successful and well-known data mining scientists and professionals. Although by no means an exhaustive list, all of them have helped the field to gain the reputation and importance it enjoys today, through the many valuable contributions they have made. Mohamed Medhat Gaber has asked them (and many others) to write down their journeys through the data mining field, trying to answer the following questions: 1. What are your motives for conducting research in the data mining field? 2. Describe the milestones of your research in this field. 3. What are your notable success stories? 4. How did you learn from your failures? 5. Have you encountered unexpected results? 6. What are the current research issues and challenges in your area? 7. Describe your research tools and techniques. 8. How would you advise a young researcher to make an impact? 9. What do you predict for the next two years in your area? 10. What are your expectations in the long term? In order to maintain the informal character of their contributions, they were given complete freedom as to how to organize their answers. This narrative presentation style provides PhD students and novices who are eager to find their way to successful research in data mining with valuable insights into career planning. In addition, everyone else interested in the history of computer science may be surprised about the stunning successes and possible failures computer science careers (still) have to offer.
This book constitutes the refereed proceedings of the 4th International Workshop on Controlled Natural Language, CNL 2014, held in Galway, Ireland, in August 2014. The 17 full papers and one invited paper presented were carefully reviewed and selected from 26 submissions. The topics include simplified language, plain language, formalized language, processable language, fragments of language, phraseologies, conceptual authoring, language generation, and guided natural language interfaces.
"Practical LaTeX" covers the material that is needed for everyday LaTeX documents. This accessible manual is friendly, easy to read, and is designed to be as portable as LaTeX itself. A short chapter, "Mission Impossible," introduces LaTeX
documents and presentations. Read these 30 pages; you then should
be able to compose your own work in LaTeX. The remainder of the
book delves deeper into the topics outlined in "Mission Impossible"
while avoiding technical subjects. Chapters on presentations and
illustrations are a highlight, as is the introduction of LaTeX on
an iPad. Amazon.com, Best of 2000, Editors Choice Review of Astronomical Tools
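For readers who have never seen a LaTeX source file, a minimal sketch of the kind of document this book teaches you to compose might look like the following; the title, author, and section name are placeholders, not examples taken from the book.

    \documentclass{article}

    \title{A First Document}   % placeholder title, not from the book
    \author{Your Name}
    \date{\today}

    \begin{document}
    \maketitle

    \section{Introduction}
    Text and mathematics mix freely; for instance, the quadratic formula is
    \[
      x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}.
    \]

    \end{document}

Running this file through pdflatex produces a typeset PDF, which is the kind of first document the "Mission Impossible" chapter aims to get you writing.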
Automatic Indexing and Abstracting of Document Texts summarizes the latest techniques of automatic indexing and abstracting, and the results of their application. It also places the techniques in the context of the study of text, manual indexing and abstracting, and the use of the indexing descriptions and abstracts in systems that select documents or information from large collections. Important sections of the book consider the development of new techniques for indexing and abstracting. The techniques involve the following: using text grammars, learning of the themes of the texts including the identification of representative sentences or paragraphs by means of adequate cluster algorithms, and learning of classification patterns of texts. In addition, the book is an attempt to illuminate new avenues for future research. Automatic Indexing and Abstracting of Document Texts is an excellent reference for researchers and professionals working in the field of content management and information retrieval.
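As a very rough illustration of what "identification of representative sentences" can mean in practice, the sketch below scores each sentence of a text by the frequency of its content words and keeps the highest-scoring ones as a crude extract. It is a generic toy example in Python, not an algorithm taken from the book.

    # Toy extractive abstracting: score each sentence by the frequency of its
    # content words and keep the highest-scoring ones as a crude abstract.
    # Generic illustration only; not a method from the book.
    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "are", "for", "by", "it"}

    def summarize(text, k=2):
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
        freq = Counter(words)
        def score(s):
            toks = [w for w in re.findall(r"[a-z]+", s.lower()) if w not in STOPWORDS]
            return sum(freq[w] for w in toks) / (len(toks) or 1)
        # Keep the k best sentences, restored to their original order.
        best = sorted(sorted(sentences, key=score, reverse=True)[:k], key=sentences.index)
        return " ".join(best)

    print(summarize("Automatic abstracting selects sentences. It scores them by word "
                    "frequency. Frequent content words indicate the theme of the text. "
                    "Rare asides score lower and are dropped."))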
Delivering MPEG-4 Based Audio-Visual Services investigates the different aspects of end-to-end multimedia services; content creation, server and service provider, network, and the end-user terminal. Part I provides a comprehensive introduction to digital video communications, MPEG standards, and technologies, and deals with system level issues including standardization and interoperability, user interaction, and the design of a distributed video server. Part II investigates the systems in the context of object-based multimedia services and presents a design for an object-based audio-visual terminal, some of these features having been adopted by the MPEG-4 Systems specification. The book goes on to study the requirements for a file format to represent object-based audio-visual content and the design of one such format. The design introduces new concepts such as direct streaming that are essential for scalable servers. The final part of the book examines the delivery of object-based multimedia presentations and gives optimal algorithms for multiplex-scheduling of object-based audio-visual presentations, showing that the audio-visual object scheduling problem is NP-complete in the strong sense. The problem of scheduling audio-visual objects is similar to the problem of sequencing jobs on a single machine. The book compares these problems and adapts job-sequencing results to audio-visual object scheduling, and provides optimal algorithms for scheduling presentations under resource constraints, such as bandwidth (network constraints) and buffer (terminal constraints). In addition, the book presents algorithms that minimize the resources required for scheduling presentations and the auxiliary capacity required to support interactivity in object-based audio-visual presentations. Delivering MPEG-4 Based Audio-Visual Services is essential reading for researchers and practitioners in the areas of multimedia systems engineering and multimedia computing, network professionals, service providers, and all scientists and technical managers interested in the most up-to-date MPEG standards and technologies.
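The scheduling problem described above can be made concrete with a small sketch: objects with sizes and presentation deadlines are sent over a fixed-bandwidth channel, earliest deadline first. The book proves the general problem NP-complete and develops optimal algorithms under bandwidth and buffer constraints; the greedy heuristic below, with invented field names such as size_bits and deadline_s, is only meant to show the shape of the problem.

    # Earliest-deadline-first sketch for sending audio-visual objects over a
    # fixed-bandwidth channel. Illustrative only; not an algorithm from the book.
    from dataclasses import dataclass

    @dataclass
    class AVObject:
        name: str
        size_bits: int      # how much data must arrive
        deadline_s: float   # when the object must be at the terminal

    def schedule_edf(objects, bandwidth_bps):
        """Return (start, finish) times per object, or None if a deadline is missed."""
        t = 0.0
        plan = {}
        for obj in sorted(objects, key=lambda o: o.deadline_s):
            start = t
            t += obj.size_bits / bandwidth_bps   # transmission time on the channel
            if t > obj.deadline_s:
                return None                      # infeasible at this bandwidth
            plan[obj.name] = (start, t)
        return plan

    print(schedule_edf([AVObject("logo", 400_000, 1.0),
                        AVObject("audio", 1_200_000, 2.5)], bandwidth_bps=1_000_000))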
This book covers all aspects of computer document preparation: text processing and printing. Computers are being used increasingly in the processing of documents, from simple textual material, such as letters and memos, to complete books with mathematical formulae and graphics. The material may be extensively edited and manipulated on the computer before subsequent output on media such as typewriters, laser printers or photocomposers. This volume contains contributions from several established leaders in the field, and a number of research articles refereed by an international programme committee. As such, the book gives a good impression of the state of the art in this area, which is of major importance in this 'electronic age', where online information retrieval and electronic publishing will increasingly affect our everyday life.
Renowned designers from a wide range of disciplines present very different positions on design. These highly personal conversations with nationally and internationally known designers give you a fascinating insight into the many-sided design debate of the 1990s.
This book constitutes the refereed proceedings of the Second International Workshop on Controlled Natural Language, CNL 2010, held in Marettimo Island, Italy, in September 2010. The 9 revised papers presented in this volume, together with 1 tutorial, were carefully reviewed and selected from 17 initial submissions. They broadly cover the field of controlled natural language, stressing theoretical and practical aspects of CNLs, relations to other knowledge representation languages, tool support, and applications.
This book constitutes the refereed proceedings of the 11th International Conference on Intelligent Tutoring Systems, ITS 2012, held in Chania, Crete, Greece, in June 2012. The 28 revised full papers, 50 short papers, and 56 posters presented were carefully reviewed and selected from 177 submissions. The specific theme of the ITS 2012 conference was co-adaptation between technologies and human learning. Beyond that, the highly interdisciplinary ITS conferences bring together researchers in computer science, informatics, and artificial intelligence on the one side and cognitive science, educational psychology, and linguistics on the other. The papers are organized in topical sections on affect/emotions, affect/signals, games/motivation and design, games/empirical studies, content representation, feedback, non-conventional approaches, conceptual content representation, assessment constraints, dialogue, dialogue/questions, learner modeling, learning detection, interaction strategies for games, and empirical studies in general.
A guide to the use of SVMs in pattern classification, including a rigorous performance comparison of classifiers and regressors. The book presents architectures for multiclass classification and function approximation problems, as well as evaluation criteria for classifiers and regressors. Features: Clarifies the characteristics of two-class SVMs; Discusses kernel methods for improving the generalization ability of neural networks and fuzzy systems; Contains ample illustrations and examples; Includes performance evaluation using publicly available data sets; Examines Mahalanobis kernels, empirical feature space, and the effect of model selection by cross-validation; Covers sparse SVMs, learning using privileged information, semi-supervised learning, multiple classifier systems, and multiple kernel learning; Explores incremental-training-based batch training and active-set training methods, and decomposition techniques for linear programming SVMs; Discusses variable selection for support vector regressors.
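For orientation, the snippet below shows what one of the listed topics, model selection by cross-validation for a two-class SVM, looks like in practice. It uses scikit-learn and a standard toy data set and is not code from the book.

    # Cross-validated model selection for a two-class RBF-kernel SVM.
    # Generic scikit-learn illustration; not a method specific to this book.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)          # a two-class problem
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    grid = GridSearchCV(pipe,
                        {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]},
                        cv=5)
    grid.fit(X_tr, y_tr)
    print(grid.best_params_, grid.score(X_te, y_te))    # chosen model and test accuracy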
Foreword: SGML is misunderstood and underestimated. I have always wanted to write this book. I am pleased that two people with whom I have had the pleasure to work were finally able to do so. Since I have always been a bit of an evangelist, I feel pride when my "students" become recognized "teachers." In the early years of SGML we struggled to define a language that would bring the information to its rightful place. We succeeded. Then we had to explain these ideas to technical adopters. Again, I think we have succeeded. We have learned much about SGML in the process of implementing it. These experiences must now also be shared, along with comprehensible information on the language itself. The word must move out of the lab and the computer center and reach the business people, the users, the movers and shakers. The next generation will do things with SGML that we can't even imagine yet; it is that versatile.
The two-volume set LNCS 6854/6855 constitutes the refereed proceedings of the International Conference on Computer Analysis of Images and Patterns, CAIP 2011, which took place in Seville, Spain, August 29-31, 2011. The 138 papers presented together with 2 invited talks were carefully reviewed and selected from 286 submissions. The papers are organized in topical sections on motion analysis, image and shape models, segmentation and grouping, shape recovery, kernel methods, medical imaging, structural pattern recognition, biometrics, image and video processing, calibration, and tracking and stereo vision.
This volume contains the proceedings of two recent conferences in the field of electronic publishing and digital documents: DDEP 2000, the 8th International Conference on Digital Documents and Electronic Publishing, the successor to the EP conference series; and PODDP 2000, the 5th International Workshop on the Principles of Digital Document Processing. Both conferences were held at the Technische Universität München, Munich, Germany, in September 2000. DDEP 2000 was the eighth in a biennial series of international conferences organized to promote the exchange of novel ideas concerning the computer production, manipulation and dissemination of documents. This conference series has attempted to reflect the evolving nature and usage of documents by treating digital documents and electronic publishing as a broad topic covering many aspects. These aspects have included document models, document representation and document dissemination, dynamic and hyper-documents, document analysis and management, and wide-ranging applications. The papers presented at DDEP 2000 and in this volume reflect this broad view, and cover such diverse topics as hypermedia structure and design, multimedia authoring techniques and systems, document structure inference, typography, document management and adaptation, document collections and Petri nets. All papers were refereed by an international programme committee.
Adobe Acrobat 6.0: The Professional User's Guide discusses the latest version of the program from the perspective of the professional user. To meet the needs and requirements of the professional, the program is presented from a functional perspective. The technology and background of an area are discussed, followed by a description of methods and processes for using the feature, backed up by real-world tutorials and projects. Threaded projects with a specific focus on design, engineering, and business run through the book, so that by its end three projects in three major areas are completed.
"Adobe Acrobat 5: The Professional User's Guide" is designed for professionals, covering all of the programs major components, and providing thorough instruction on how to use Acrobat as effectively as possible. Throughout the book, renowned author Donna Baker includes a series of "Workflow Tips" designed to give you immediate direction on how to use Acrobat's features, how to make planning decisions, and how to avoid common mistakes. This book also includes a comprehensive project chapter that illustrates a real-life scenario involving project planning and form design processes. The book is organized into functional sections for ease of use. After a general introduction to Acrobat 5, the book moves on to creation and security issues, and then covers output options, with several chapters devoted to different forms of output. An extensive chapter on Acrobat JavaScript is also included for reference. All topical chapters have projects, tutorials, and demonstrations. The accompanying CD-ROM includes complete source files from the books projects and tutorials, as well as completed versions of the project files for reference and troubleshooting.
The growing spread and importance of the new media, especially the Internet, is changing our reading and writing habits. Instead of reading longer pieces of continuous text, users of the World Wide Web click their way through hypermedia texts in which text, images, sound and video are linked to one another and to other documents on the WWW. For journalistic work, hypermedia texts offer many opportunities: content can be updated quickly, and new modules and links can be added without difficulty. Reader forums, guest books and chats allow new forms of reader participation and contact. However, hypermedia texts are not only read differently; they must also be planned and written differently. This book offers a comprehensive introduction to the subject of online texts, taking into account on-screen reading habits as well as aspects of the psychology of perception and the fundamentals of journalism.
Contents (translated from the German):
1 Camera setup: basic camera settings and recommendations; image resolution; image compression; test shots for resolution and compression; sharpening; color space; digital zoom; viewfinder; monitor; exposure (metering, exposure programs, exposure variants, exposure compensation); autofocus (sensor selection, static and dynamic autofocus, release vs. focus priority, manual focusing); sensitivity; white balance (light color, automatic and manual white balance); date and time; data imprinting; TV signal; audio signals; optimized camera setup in brief (standard situations, highest demands, available-light situations, close-up and macro shots, fast photography).
2 Exposure: exposure metering and its characteristics; exposure programs (program auto, program shift, scene programs, aperture priority, shutter priority, manual exposure); exposure interventions (exposure compensation, substitute metering, contrast metering, bracketing).
3 Digital photography in practice: advantages; remarks; preparing to shoot (power supply, storage media, camera setup, exposure metering, useful items for the camera bag, packing the camera bag); composing photos (the golden ratio); composing with focal length (wide-angle, normal and telephoto focal lengths, depth of field, focal-length effects); light in photography (daylight, fill light, available light, artificial light, mixed light, long exposures).
4 Flash photography: flash light; fundamentals (flash output, the inverse-square law, flash synchronization, flash range); flash techniques (pre-flash, fill flash, full flash, high-speed and slow synchronization, flash exposure compensation, ultra-short-exposure photography, wireless flash control); flash automation (flash with program auto, flash with aperture or shutter priority, manual flash); better flash light (bounce flash, off-camera flash, flash reflectors, slave flash).
5 Subjects and themes: craft; visual notes; landscape photography; architectural photography; close-up and macro photography; people; travel and reportage; infrared photography; ultraviolet photography; experiments; panoramic photography.
6 Studio photography: product shots; still life; reproduction; light sources in the studio (daylight, flash, cold light, tungsten light, portable studio flash, studio flash units and triggering them); setting up the studio (the room, reflectors and flags, background, shooting table); lighting (light position, light characteristics, setting up the light, lighting for close-ups, bright-field and dark-field lighting, light tent, polarized light); suggested equipment; a setup for product shots.
7 Digitizing photos: digitizing analog originals; digitizing with the camera (reflective and transparent originals); film scanners; flatbed scanners; drum scanners; scanning services via Photo CD; notes on digitizing.
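The inverse-square law mentioned under the flash fundamentals and the standard guide-number relation can be written out as follows; these are general photographic practice, not formulas quoted from the book:

    \[
      E \propto \frac{1}{d^{2}},
      \qquad
      N = \frac{GN}{d}
    \]

Here d is the flash-to-subject distance, E the illuminance at the subject, GN the guide number quoted for a given ISO setting, and N the f-number that gives correct exposure at that distance.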
This comprehensive guide is directed at Linux and UNIX users but is also the best how-to book on the use of LaTeX in preparing articles, books and theses. Unlike other LaTeX books, this one is particularly suitable for anyone coming to LaTeX for the first time.
SGML and XML are fundamental concepts for storing, processing and distributing long-lived information today and in the future, particularly in view of worldwide networking through the Internet. Starting from the technical and economic challenges of our information society, the authors present the important developments and report on current projects with SGML and XML applications in practice. The book is aimed at the management of information-processing and information-producing companies, at those responsible for technology, at SGML/XML users, and at students of information-processing disciplines and of computer science.
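To make the idea of long-lived, structured information concrete, here is a minimal example of XML markup being read with Python's standard library. The element names are invented for this illustration and do not come from the book.

    # Content is tagged by meaning, so it can outlive any particular layout
    # or application. Element names below are invented for this example.
    import xml.etree.ElementTree as ET

    doc = """<article id="a-17">
      <title>Markup that outlives its software</title>
      <author>Example Author</author>
      <abstract>Structured documents separate content from presentation.</abstract>
    </article>"""

    root = ET.fromstring(doc)
    print(root.get("id"), "-", root.findtext("title"))
    for child in root:
        print(f"{child.tag}: {child.text.strip()}")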
The complexity and scope of today's websites are constantly increasing, and graphical presentation and forms of interaction continue to evolve. To ensure usability on the one hand and innovative concepts on the other, information architecture is establishing itself in agencies and design studios as part of the development process. The successful work of information architects rests not only on its own methods but also depends on integration into an overarching workflow. This book presents flexibly applicable methods and techniques that are suited to developing high-quality information architectures while drawing on the know-how of the other disciplines involved. The result is websites that meet all user needs and at the same time fulfil the client's demands for currency and innovation.
Although the World Wide Web is enjoying enormous growth rates, many Web publishers have discovered that HTML is not up to the requirements of modern corporate communication. For them, Adobe Acrobat offers a wealth of design possibilities. The close integration of Acrobat in the World Wide Web unites the structural advantages of HTML with the comprehensive layout possibilities of Portable Document Format (PDF). On the basis of practical examples and numerous tricks, this book describes how to produce PDF documents efficiently. Numerous tips on integrating Acrobat into CGI, JavaScript, VBScript, Active Server Pages, search engines, and so on make the book a mine of information for all designers and administrators of Web sites.
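One of the integrations mentioned above, serving a PDF from a CGI script so that the browser hands it to Acrobat or an embedded viewer, can be sketched in a few lines. The script and file name below are placeholders and are not taken from the book.

    #!/usr/bin/env python3
    # Minimal CGI-style sketch: emit the PDF MIME type, then stream the file.
    # PDF_PATH is a placeholder; point it at any existing PDF.
    import sys

    PDF_PATH = "report.pdf"

    sys.stdout.write("Content-Type: application/pdf\r\n")
    sys.stdout.write("Content-Disposition: inline; filename=report.pdf\r\n\r\n")
    sys.stdout.flush()                       # headers out before the binary body
    with open(PDF_PATH, "rb") as f:
        sys.stdout.buffer.write(f.read())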
New Subediting gives a detailed account of modern editing and production techniques. Its aim is both to help the young subeditor and to spell out to the newcomer to newspaper journalism what happens between the writing of news stories and features and their appearance in the newspaper when it comes off the press.