This book traces the history of language technology from writing -
the first technology specifically designed for language - to
digital speech and other contemporary language systems. The book
describes the social impact of technological developments over five
millennia, and addresses topics such as the ways in which literacy
has influenced cognitive and scientific development; the social
impact of modern speech technology; the influence of various
printing technologies; the uses and limitations of machine
translation; how far mass information access is a means for
exploitation or enlightenment; the deciphering of ancient scripts;
and technical aids for people with language disabilities.
This book introduces the most important problems of reference and
considers the solutions that have been proposed to explain them.
Reference is at the centre of debate among linguists and
philosophers and, as Barbara Abbott shows, this has been the case
for centuries. She begins by examining the basic issue of how far
reference is a two-place (words-world) or a three-place
(speakers-words-world) relation. She then discusses the main
aspects of the field and the issues associated with them, including
those concerning proper names; direct reference and individual
concepts; the difference between referential and quantificational
descriptions; pronouns and indexicality; concepts like definiteness
and strength; and noun phrases in discourse.
This book brings together selected revised papers representing a multidisciplinary approach to language, music, and gesture, and to their interaction. Among the many multidisciplinary and comparative studies of the structure and organization of language and music, this book broadens the scope by including problems of gesture in the analyzed spectrum. A unique feature of the collection is that the papers, compiled in one volume, allow readers to see similarities and differences between gesture as an element of non-verbal communication and gesture as the main element of dance. In addition, the data on the perception and comprehension of speech, music, and dance, both in their functioning in natural situations and in their reflection in various forms of the performing arts, make this collection extremely useful for those interested in human cognitive abilities and performing skills. The book begins with a philosophical overview of recent neurophysiological studies reflecting the complexity of higher cognitive functions, which references the idea of the baroque style in art as neither linear nor stable. The remaining papers are divided into five sections. The papers of the section "Language-Music-Gesture as Semiotic Systems" discuss symbolic and semiotic aspects of language, music, and gesture, including the perspective of their notation. These are followed by the sections "Language-Music-Gesture Onstage", on interaction within the idea of the "World as a Text", and "Teaching Language and Music", which presents new teaching methods that take into account the interaction of all the cognitive systems examined.
The papers of the last two sections focus on issues related primarily to language: the section "Verbalization of Music and Gesture" considers the problem of describing musical text and non-verbal behavior in language, and the papers of the final section, "Emotions in Linguistics and AI Communication Systems", analyze the ways emotions are expressed in speech and the problems of organizing emotional communication with computer agents.
This book explores the empirical and theoretical aspects of
constituent structure in natural language syntax. It surveys a wide
variety of functionalist and formalist theoretical approaches, from
dependency grammars and Relational Grammar to Lexical Functional
Grammar, Head-driven Phrase Structure Grammar, and Minimalism. It
describes the traditional tests for constituency and the formal
means for representing them in phrase structure grammars, extended
phrase structure grammars, X-bar theory, and set theoretic bare
phrase structure. In doing so it provides a clear, thorough, and
rigorous axiomatic description of the structural properties of
constituent trees.
Specifically designed for linguists, this book provides an introduction to programming using Python for those with little to no experience of coding. Python is one of the most popular and widely used programming languages; it is available for free and runs on any operating system. All examples in the text involve language data and can be adapted or used directly for language research. The text focuses on key language-related issues: searching, text manipulation, text encoding, and internet data, providing an excellent resource for language research. More experienced users of Python will also benefit from the advanced chapters on graphical user interfaces and functional programming.
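A minimal sketch of the kind of text-searching task such a book covers: counting word frequencies in a text with the standard library. The tokenization rule (lowercased runs of letters) is a deliberate simplification, not a method taken from the book.

```python
import re
from collections import Counter

def word_frequencies(text):
    """Tokenize on lowercase letter runs (a simplification) and count occurrences."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(tokens)

sample = "The cat sat on the mat. The cat slept."
freqs = word_frequencies(sample)
print(freqs.most_common(2))  # [('the', 3), ('cat', 2)]
```

Because `Counter` is a dictionary, individual counts can then be queried directly, e.g. `freqs["cat"]`.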
This open access book introduces vector semantics, which links the formal theory of word vectors to the cognitive theory of linguistics. The computational linguists and deep learning researchers who developed word vectors have relied primarily on the ever-increasing availability of large corpora and of computers with highly parallel GPU and TPU compute engines, and their focus is on endowing computers with natural language capabilities for practical applications such as machine translation or question answering. Cognitive linguists investigate natural language from the perspective of human cognition, the relation between language and thought, and questions about conceptual universals, relying primarily on in-depth investigation of language in use. Although these two schools both have 'linguistics' in their name, so far there has been very limited communication between them, as their historical origins, data collection methods, and conceptual apparatuses are quite different. Vector semantics bridges the gap by presenting a formal theory, cast in terms of linear polytopes, that generalizes both word vectors and conceptual structures, by treating each dictionary definition as an equation and the entire lexicon as a set of equations mutually constraining all meanings.
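The core operation behind word vectors can be illustrated with cosine similarity. The three-dimensional vectors below are invented for illustration only; real word vectors are learned from corpora and have hundreds of dimensions, and this sketch is not the book's polytope formalism.

```python
import math

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of vector norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy vectors (illustrative only, not from any trained model).
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}
print(cosine(vectors["king"], vectors["queen"]))  # close to 1: similar words
print(cosine(vectors["king"], vectors["apple"]))  # much lower: dissimilar words
```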
This book considers how people talk about the location of objects and places. Spatial language has occupied many researchers across diverse fields, such as linguistics, psychology, GIScience, architecture, and neuroscience. However, the vast majority of work in this area has examined spatial language in monologue situations, and often in highly artificial and restricted settings. Yet there is a growing recognition in the language research community that dialogue rather than monologue should be a starting point for language understanding. Hence, the current zeitgeist in both language research and robotics/AI demands an integrated examination of spatial language in dialogue settings. The present volume provides such integration for the first time and reports on the latest developments in this important field. Written in a way that will appeal to researchers across disciplines from graduate level upwards, the book sets the agenda for future research in spatial conceptualization and communication.
In this book, Peter Culicover introduces the analysis of natural
language within the broader question of how language works - of how
people use languages to configure words and morphemes in order to
express meanings. He focuses both on the syntactic and
morphosyntactic devices that languages use, and on the conceptual
structures that correspond to particular aspects of linguistic
form. He seeks to explain linguistic forms and in the process to
show how these correspond with meanings.
This is a book about semantic theories of modality. Its main goal
is to explain and evaluate important contemporary theories within
linguistics and to discuss a wide range of linguistic phenomena
from the perspective of these theories. The introduction describes
the variety of grammatical phenomena associated with modality,
explaining why modal verbs, adjectives, and adverbs represent the
core phenomena. Chapters are then devoted to the possible worlds
semantics for modality developed in modal logic; current theories
of modal semantics within linguistics; and the most important
empirical areas of research. The author concludes by discussing the
relation between modality and other topics, especially tense,
aspect, mood, and discourse meaning.
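The possible worlds semantics mentioned above can be sketched in a few lines: a model pairs a set of worlds with an accessibility relation and a valuation, and "must" and "might" quantify over accessible worlds. The worlds, relation, and propositions below are invented examples, not an analysis from the book.

```python
# A minimal possible-worlds model: which worlds each world can "see",
# and which propositions hold at which worlds (all names are invented).
access = {"w1": {"w1", "w2"}, "w2": {"w2"}, "w3": {"w1", "w3"}}
holds = {"rain": {"w1", "w2"}, "snow": {"w2"}}

def necessarily(p, w):
    """'Must p' at w: p holds in every world accessible from w."""
    return all(v in holds[p] for v in access[w])

def possibly(p, w):
    """'Might p' at w: p holds in at least one world accessible from w."""
    return any(v in holds[p] for v in access[w])

print(necessarily("rain", "w1"))  # True: rain holds in both w1 and w2
print(necessarily("snow", "w1"))  # False: snow fails in w1
print(possibly("snow", "w1"))     # True: snow holds in w2
```

Linguistic theories of modality refine this picture, for example by making the accessibility relation depend on a contextually supplied modal base, but the quantificational core is the same.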
This book is the first comprehensive presentation of Functional
Discourse Grammar, a new and important theory of language
structure. The authors set out its nature and origins and show how
it relates to contemporary linguistic theory. They demonstrate and
test its explanatory power and descriptive utility against
linguistic facts from over 150 languages across a wide range of
linguistic families.
This important contribution to the Minimalist Program offers a
comprehensive theory of locality and new insights into phrase
structure and syntactic cartography. It unifies central components
of the grammar and increases the symmetry in syntax. Its central
hypothesis has broad empirical application and at the same time
reinforces the central premise of minimalism that language is an
optimal system.
The book covers theoretical work, approaches, applications, and techniques for computational models of information, language, and reasoning. Computational and technological developments that incorporate natural language are proliferating, and adequate coverage of natural language processing in artificial intelligence requires the development of specialized computational approaches and algorithms. Many difficulties are due to ambiguities in natural language and the dependency of interpretations on contexts and agents. Classical approaches proceed with relevant updates, and new developments emerge in theories of formal and natural languages, computational models of information and reasoning, and related computerized applications. The focus is on the computational processing of human language and of relevant medium languages, which can be formal theoretical languages or languages for the programming and specification of computational systems. The goal is to promote intelligent natural language processing, along with models of computation, language, reasoning, and other cognitive processes.
This open access book provides an in-depth description of the EU project European Language Grid (ELG). Its motivation lies in the fact that Europe is a multilingual society with 24 official European Union Member State languages and dozens of additional languages including regional and minority languages. The only meaningful way to enable multilingualism and to benefit from this rich linguistic heritage is through Language Technologies (LT) including Natural Language Processing (NLP), Natural Language Understanding (NLU), Speech Technologies and language-centric Artificial Intelligence (AI) applications. The European Language Grid provides a single umbrella platform for the European LT community, including research and industry, effectively functioning as a virtual home, marketplace, showroom, and deployment centre for all services, tools, resources, products and organisations active in the field. Today the ELG cloud platform already offers access to more than 13,000 language processing tools and language resources. It enables all stakeholders to deposit, upload and deploy their technologies and datasets. The platform also supports the long-term objective of establishing digital language equality in Europe by 2030 - to create a situation in which all European languages enjoy equal technological support. This is the very first book dedicated to Language Technology and NLP platforms. Cloud technology has only recently matured enough to make the development of a platform like ELG feasible on a larger scale. The book comprehensively describes the results of the ELG project. Following an introduction, the content is divided into four main parts: (I) ELG Cloud Platform; (II) ELG Inventory of Technologies and Resources; (III) ELG Community and Initiative; and (IV) ELG Open Calls and Pilot Projects.
This book collects and introduces some of the best and most useful
work in practical lexicography. It has been designed as a resource
for students and scholars of lexicography and lexicology and to be
an essential reference for professional lexicographers. It focusses
on central issues in the field and covers topics hotly debated in
lexicography circles. After a full contextual introduction Thierry
Fontenelle divides the book into twelve parts - theoretical
perspectives, corpus design, lexicographical evidence, word senses
and polysemy, collocations and idioms, definitions, examples,
grammar and usage, bilingual lexicography, tools and methods,
semantic networks, and how dictionaries are used. The book is fully
referenced and indexed.
When viewed through a political lens, the act of defining terms in natural language arguably transforms knowledge into values. This unique volume explores how corporate, military, academic, and professional values shaped efforts to define computer terminology and establish an information engineering profession as a precursor to what would become computer science. As the Cold War heated up, U.S. federal agencies increasingly funded university researchers and labs to develop technologies, like the computer, that would ensure that the U.S. maintained economic prosperity and military dominance over the Soviet Union. At the same time, private corporations saw opportunities for partnering with university labs and military agencies to generate profits as they strengthened their business positions in civilian sectors. They needed a common vocabulary and principles of streamlined communication to underpin the technology development that would ensure national prosperity and military dominance.
This volume:
- investigates how language standardization contributed to the professionalization of computer science as separate from mathematics, electrical engineering, and physics
- examines traditions of language standardization in earlier eras of rapid technology development around electricity and radio
- highlights the importance of the analogy "the computer is like a human" to early explanations of computer design and logic
- traces the design and development of electronic computers within political and economic contexts
- foregrounds the importance of human relationships in decisions about computer design
This in-depth humanistic study argues for the importance of natural language in shaping what people come to think of as possible and impossible relationships between computers and humans. The work is a key reference in the history of technology and serves as a source textbook on the human-level history of computing.
In addition, it addresses those with interests in sociolinguistic questions around technology studies, as well as technology development at the nexus of politics, business, and human relations.
The topic of this book is the theoretical foundations of Lexical Semantic Language Theory (LSLT) and its implementation in the system for text analysis and understanding called GETARUN, developed at the University of Venice, Laboratory of Computational Linguistics, Department of Language Sciences. LSLT encompasses a psycholinguistic theory of the way the language faculty works; a grammatical theory of the way in which sentences are analysed and generated, for which Lexical-Functional Grammar is used; a semantic theory of the way in which meaning is encoded and expressed in utterances, for which Situation Semantics is used; and a parsing theory of the way in which the components of the theory interact in a common architecture to produce the language representation to be eventually spoken aloud or interpreted by the phonetic/acoustic language interface. LSLT is then put to use to show how discourse relations are mapped automatically from text using the tools available in the four sub-theories, with a particular focus on causal relations, showing how the various sub-theories contribute to addressing different types of causality.
The book features recent attempts to construct corpora for specific purposes - e.g. the multifactorial Dutch (parallel) corpus, the Geasy Easy Language Corpus (intralingual), and the HK LegCo interpreting corpus - and showcases sophisticated and innovative corpus analysis methods. It proposes new approaches to classical themes - i.e. translation pedagogy, translation norms and equivalence, and principles of translation - and brings in interdisciplinary perspectives - e.g. contrastive linguistics, cognition, and metaphor studies - to cast new light on them. It is a timely reference for researchers as well as postgraduate students interested in the applications of corpus technology to solving translation and interpreting problems.
The relation between ontologies and language is currently at the forefront of natural language processing (NLP). Ontologies, as widely used models in semantic technologies, have much in common with the lexicon. A lexicon organizes words as a conventional inventory of concepts, while an ontology formalizes concepts and their logical relations. A shared lexicon is the prerequisite for knowledge-sharing through language, and a shared ontology is the prerequisite for knowledge-sharing through information technology. In building models of language, computational linguists must be able to accurately map the relations between words and the concepts that they can be linked to. This book focuses on the technology involved in enabling integration between lexical resources and semantic technologies. It will be of interest to researchers and graduate students in NLP, computational linguistics, and knowledge engineering, as well as in semantics, psycholinguistics, lexicology and morphology/syntax.
One of the challenges brought on by the digital revolution of recent decades is how the information carried by texts can be extracted in order to access their contents. The processing of named entities remains a very active area of research and plays a central role in natural language processing technologies and their applications. Named entity recognition, a tool used in information extraction tasks, focuses on recognizing small pieces of information in order to extract information on a larger scale. The authors use written text and examples in French and English to present the elements readers need to familiarize themselves with the main concepts related to named entities, to discover the problems associated with them, and to learn the methods available in practice for solving these issues.
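The task can be illustrated with a toy pattern-based recognizer. The patterns below are invented for illustration; practical named entity recognition systems use gazetteers, statistical models, or neural networks rather than hand-written regular expressions.

```python
import re

# Toy patterns (illustrative only): a date, a title-prefixed person name,
# and a company name with a legal suffix.
PATTERNS = [
    ("DATE",   re.compile(r"\b\d{1,2} (?:January|February|March|April|May|June|July"
                          r"|August|September|October|November|December) \d{4}\b")),
    ("PERSON", re.compile(r"\b(?:Mr|Mrs|Dr)\. [A-Z][a-z]+\b")),
    ("ORG",    re.compile(r"\b[A-Z][a-z]+ (?:Inc|Ltd|Corp)\.")),
]

def find_entities(text):
    """Return (label, matched span) pairs for every pattern hit in the text."""
    found = []
    for label, pattern in PATTERNS:
        for m in pattern.finditer(text):
            found.append((label, m.group()))
    return found

text = "Dr. Dupont joined Acme Inc. on 3 March 2021."
print(find_entities(text))
# [('DATE', '3 March 2021'), ('PERSON', 'Dr. Dupont'), ('ORG', 'Acme Inc.')]
```

Even this sketch exposes the classic difficulties: ambiguity ("May" as month or name), nesting, and the impossibility of enumerating all entity forms by rule.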
This book is about the nature of expression in speech. It is a comprehensive exploration of how such expression is produced and understood, and of how the emotional content of spoken words may be analysed, modelled, tested, and synthesized. Listeners can interpret tone-of-voice, assess emotional pitch, and effortlessly detect the finest modulations of speaker attitude; yet these processes present almost intractable difficulties to the researchers seeking to identify and understand them. In seeking to explain the production and perception of emotive content, Mark Tatham and Katherine Morton review the potential of biological and cognitive models. They examine how the features that make up the speech production and perception systems have been studied by biologists, psychologists, and linguists, and assess how far biological, behavioural, and linguistic models generate hypotheses that provide insights into the nature of expressive speech. The authors use recent techniques in speech synthesis and automatic speech recognition as a test bed for models of expression in speech. Acknowledging that such testing presupposes a comprehensive computational model of speech production, they put forward original proposals for its foundations and show how the relevant data structures may be modelled within its framework. This pioneering book will be of central interest to researchers in linguistics and in speech science, pathology, and technology. It will also be valuable for behavioural and cognitive scientists wanting to know more about this vital and elusive aspect of human behaviour.
The two-volume proceedings, LNCS 13249 and 13250, constitute the thoroughly refereed post-workshop proceedings of the 22nd Chinese Lexical Semantics Workshop, CLSW 2021, held in Nanjing, China, in May 2021. The 68 full papers and 4 short papers were carefully reviewed and selected from 261 submissions. They are organized in the following topical sections: Lexical Semantics and General Linguistics; Natural Language Processing and Language Computing; Cognitive Science and Experimental Studies; Lexical Resources and Corpus Linguistics.
This work presents a discourse-aware Text Simplification approach that splits and rephrases complex English sentences within the semantic context in which they occur. Based on a linguistically grounded transformation stage, complex sentences are transformed into shorter utterances with a simple canonical structure that can be easily analyzed by downstream applications. To avoid breaking down the input into a disjointed sequence of statements that is difficult to interpret, the author incorporates the semantic context between the split propositions in the form of hierarchical structures and semantic relationships, thus generating a novel representation of complex assertions that puts a semantic layer on top of the simplified sentences. In a second step, she leverages the semantic hierarchy of minimal propositions to improve the performance of Open IE frameworks. She shows that such systems benefit in two dimensions. First, the canonical structure of the simplified sentences facilitates the extraction of relational tuples, leading to an improved precision and recall of the extracted relations. Second, the semantic hierarchy can be leveraged to enrich the output of existing Open IE approaches with additional meta-information, resulting in a novel lightweight semantic representation for complex text data in the form of normalized and context-preserving relational tuples.
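The splitting step can be caricatured with a naive rule for non-restrictive relative clauses. This is only a toy illustration of the idea of split-and-rephrase, not the author's discourse-aware approach, which relies on a linguistically grounded transformation stage rather than a single regular expression.

```python
import re

def split_relative_clause(sentence):
    """Naively split 'X, which Y.' into 'X.' plus '<antecedent> Y.'
    (a toy sketch; real systems use syntactic parsers, and the crude
    'last word is the antecedent' guess often fails)."""
    m = re.match(r"(.+?), which (.+)\.", sentence)
    if not m:
        return [sentence]
    head, clause = m.groups()
    antecedent = head.split()[-1]  # crude antecedent guess
    return [head + ".", antecedent.capitalize() + " " + clause + "."]

result = split_relative_clause(
    "The committee approved the proposal, which surprised everyone.")
print(result)
# ['The committee approved the proposal.', 'Proposal surprised everyone.']
```

The author's point is precisely that such naive splitting loses the semantic link between the two propositions; her representation restores it as an explicit hierarchical relation between the minimal propositions.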
This book is an advanced introduction to semantics that presents this crucial component of human language through the lens of the 'Meaning-Text' theory - an approach that treats linguistic knowledge as a huge inventory of correspondences between thought and speech. Formally, semantics is viewed as an organized set of rules that connect a representation of meaning (Semantic Representation) to a representation of the sentence (Deep-Syntactic Representation). The approach is particularly interesting for computer assisted language learning, natural language processing and computational lexicography, as our linguistic rules easily lend themselves to formalization and computer applications. The model combines abstract theoretical constructions with numerous linguistic descriptions, as well as multiple practice exercises that provide a solid hands-on approach to learning how to describe natural language semantics.
In this pioneering book Katarzyna Jaszczolt lays down the
foundations of an original theory of meaning in discourse, reveals
the cognitive foundations of discourse interpretation, and puts
forward a new basis for the analysis of discourse processing. She
provides a step-by-step introduction to the theory and its
application, and explains new terms and formalisms as required. Dr.
Jaszczolt unites the precision of truth-conditional, dynamic
approaches with insights from neo-Gricean pragmatics into the role
of speaker's intentions in communication. She shows that the
compositionality of meaning may be understood as merger
representations combining information from various sources
including word meaning and sentence structure, various kinds of
default interpretations, and conscious pragmatic inference.