This book combines ideas about the architecture of grammar and
language acquisition, processing, and change to explain why
languages show regular patterns when there is so much irregularity
in their use and so much complexity when there is such regularity
in linguistic phenomena. Peter Culicover argues that the structure
of language can be understood and explained in terms of two kinds
of complexity: firstly, that of the correspondence between form and
meaning; and secondly, that of the real-time processes involved in the
construction of meanings in linguistic expressions. Mainstream
syntactic theory has focused largely on regularities within and
across languages, relegating to the periphery exceptional and
idiosyncratic phenomena. But, the author argues, a language's
irregular and unique features offer fundamental insights into the
nature of language, how it changes, and how it is produced and
understood. Peter Culicover's new book offers a pertinent and
original contribution to key current debates in linguistic theory.
It will interest scholars and advanced students of linguistics of all
theoretical persuasions.
This book brings together many of Peter Culicover's most
significant observations on the nature of syntax and its place
within the architecture of human language. Over four decades he has
sought to understand the cognitive foundations of linguistic theory
and the place of syntactic theory in explaining how language works.
This has led him to specific proposals regarding the proper scope
of syntactic theory and to a re-examination of the empirical basis
of syntactic analyses, which rest on judgements reflecting not only
linguistic competence but also the complexity of the computations
involved in acquiring and using language. After a brief
retrospective, the author opens the book with the Simpler Syntax
Hypothesis, an article written with Ray Jackendoff which proposes
significant restrictions on the scope of the syntactic component of
the grammar. The work is then divided into parts concerned broadly
with representations, structures, and computation. The chapters are
provided with contextual headnotes and footnote references to
subsequent work, but are otherwise printed essentially as they
first appeared. Peter Culicover's lively and original perspectives
on syntax and grammar will appeal to all theoretical linguists and
their advanced students.
This groundbreaking book offers a new and compelling perspective on
the structure of human language. The fundamental issue it addresses
is the proper balance between syntax and semantics, between
structure and derivation, and between rule systems and lexicon. It
argues that the balance struck by mainstream generative grammar is
wrong. It puts forward a new basis for syntactic theory, drawing on
a wide range of frameworks, and charts new directions for research.
In the past four decades, theories of syntactic structure have
become more abstract, and syntactic derivations have become ever
more complex. Peter Culicover and Ray Jackendoff trace this
development through the history of contemporary syntactic theory,
showing how much it has been driven by theory-internal rather than
empirical considerations. They develop an alternative that is
responsive to linguistic, cognitive, computational, and biological
concerns. Simpler Syntax is addressed to linguists of all
persuasions. It will also be of central interest to those concerned
with language in psychology, human biology, evolution,
computational science, and artificial intelligence.
In this book, Peter Culicover introduces the analysis of natural
language within the broader question of how language works - of how
people use languages to configure words and morphemes in order to
express meanings. He focuses both on the syntactic and
morphosyntactic devices that languages use, and on the conceptual
structures that correspond to particular aspects of linguistic
form. He seeks to explain linguistic forms and in the process to
show how these correspond with meanings.
The book's clear, step-by-step exposition is presented within the
Simpler Syntax framework whose development has been led by the
author and Ray Jackendoff over the last fifteen years. This
integrates syntactic theory with the representation of conceptual
structure and casts fresh light on the interface between syntax and
semantics. It also enables elegant and economical analyses of
natural language phenomena without recourse to such abstract
devices as functional heads and uniform binary branching.
Peter Culicover opens his account with an overview of the nature
of language and the aims of its analysis. He then divides the book
into parts devoted to syntactic categories, syntactic structure and
argument structure, argument realization, unbounded dependencies,
and clausal structure. He provides exercises, problems, and
suggestions for further reading throughout the book.
This text explores the consequences for language acquisition,
language evolution and linguistic theory of taking the underlying
architecture of the language faculty to be that of a dynamical
system. The authors investigate whether it is possible for a
complex adaptive system to identify the categories, structures and
rules of a language given access only to instances of grammatical
utterances of that language. The linguistic tradition says that
this is impossible, but there is a growing body of literature in
psychology and computer science arguing that grammar can be
uncovered using purely statistical techniques applied to the
distribution of forms in a string of words. The book goes on to
discuss whether a learner requires information about structure that
goes beyond the information that is contained in the meaning. Does
the learner have to have knowledge of grammar per se prior to
language acquisition, as has been traditionally assumed? The
authors ask whether it is possible to adequately describe and
explain linguistic phenomena if we restrict ourselves to the
relatively impoverished apparatus that we require for language
acquisition. They explore the consequences of adopting a radical
form of minimalism to try to reconcile the linguistic facts with
the book's perspective on language acquisition. Culicover and Nowak
investigate to what extent it is possible to account for language
variation in dynamical terms, as a consequence of the behaviour of
the complex social network in which languages and the properties of
languages are acquired by learners through interactions with other
speakers over time.
This book investigates the architecture of the language faculty by
considering what the properties of language reveal about the mental
abilities and processes involved in language acquisition. The
language faculty, the author argues, must be able not only to
accommodate what is general, exceptionless, and universal in
language, but must also be capable of dealing with what is
irregular, exceptional, and idiosyncratic. In Syntactic Nuts Peter
Culicover shows that this is true not only of the lexicon but also of
syntax. Marginal and exceptional cases, where there is no
straightforward form-meaning correspondence, are dealt with by the
language faculty as easily and precisely as the general cases. In
considering how and why this should be so, the author argues against
the prevailing trend in generative grammar, which takes the learner
as either incorporating maximally global generalizations as part of
its innate capacity for language, or projecting global
generalizations from a very limited input on the basis of innate
mechanisms. He suggests that the learning mechanism does not
generalize significantly beyond the evidence presented to it, and
further that it seeks to form generalizations based on all and only
the evidence presented to it. Syntactic Nuts makes a fundamental
contribution to generative grammar and syntactic theory. It
situates syntactic theory within cognitive science in a novel way.
It contributes to an alternative, and yet in many ways traditional,
perspective on the manner in which knowledge is represented and
processed in the mind.
A work that reveals the profound links between the evolution,
acquisition, and processing of language, and proposes a new
integrative framework for the language sciences. Language is a
hallmark of the human species; the flexibility and unbounded
expressivity of our linguistic abilities are unique in the
biological world. In this book, Morten Christiansen and Nick Chater
argue that to understand this astonishing phenomenon, we must
consider how language is created: moment by moment, in the
generation and understanding of individual utterances; year by
year, as new language learners acquire language skills; and
generation by generation, as languages change, split, and fuse
through the processes of cultural evolution. Christiansen and
Chater propose a revolutionary new framework for understanding the
evolution, acquisition, and processing of language, offering an
integrated theory of how language creation is intertwined across
these multiple timescales. Christiansen and Chater argue that
mainstream generative approaches to language do not provide
compelling accounts of language evolution, acquisition, and
processing. Their own account draws on important developments from
across the language sciences, including statistical natural
language processing, learnability theory, computational modeling,
and psycholinguistic experiments with children and adults.
Christiansen and Chater also consider some of the major
implications of their theoretical approach for our understanding of
how language works, offering alternative accounts of specific
aspects of language, including the structure of the vocabulary, the
importance of experience in language processing, and the nature of
recursive linguistic structure.
How are native speakers of a language instinctively able to make precise linguistic judgements about marginal syntactic matters? What does this tell us about both the structure of language and our innate language ability as humans? These questions form the focus of Professor Culicover's in-depth study which will appeal to both graduate students and professionals within the fields of linguistic theory and cognitive science.
This volume explores how human languages become what they are, why
they differ from one another in certain ways but not in others, and
why they change in the ways that they do. Given that language is a
universal creation of the human mind, the puzzle is why there are
different languages at all: why do we not all speak the same
language? Moreover, while there is considerable variation, in some
ways grammars do show consistent patterns: why are languages
similar in those respects, and why are those particular patterns
preferred? Peter Culicover proposes that the solution to these
puzzles is a constructional one. Grammars consist of constructions
that carry out the function of expressing universal conceptual
structure. While there are in principle many different ways of
accomplishing this task, languages are under pressure to reduce
constructional complexity. The result is that there is
constructional change in the direction of less complexity, and
grammatical patterns emerge that more efficiently reflect
conceptual universals. The volume is divided into three parts: the
first establishes the theoretical foundations; the second explores
variation in argument structure, grammatical functions, and A-bar
constructions, drawing on data from a variety of languages
including English and Plains Cree; and the third examines
constructional change, focusing primarily on Germanic. The study
ends with observations and speculations on parameter theory,
analogy, the origins of typological patterns, and Greenbergian
'universals'.
Dynamical Grammar explores the consequences for language acquisition, language evolution, and linguistic theory of taking the underlying architecture of the language faculty to be that of a complex adaptive dynamical system. It contains the first results of a new and complex model of language acquisition which the authors have developed to measure how far language input is reflected in language output and thereby get a better idea of just how far the human language faculty is hard-wired.
This groundbreaking book offers a new and compelling perspective on
the structure of human language. The fundamental issue it addresses
is the proper balance between syntax and semantics, between
structure and derivation, and between rule systems and lexicon. It
argues that the balance struck by mainstream generative grammar is
wrong. It puts forward a new basis for syntactic theory, drawing on
a wide range of frameworks, and charts new directions for research.
In the past four decades, theories of syntactic structure have
become more abstract, and syntactic derivations have become ever
more complex. Peter Culicover and Ray Jackendoff trace this
development through the history of contemporary syntactic theory,
showing how much it has been driven by theory-internal rather than
empirical considerations. They develop an alternative that is
responsive to linguistic, cognitive, computational, and biological
concerns. At the core of this alternative is the Simpler Syntax
Hypothesis: the most explanatory syntactic theory is one that
imputes the minimum structure necessary to mediate between
phonology and meaning. A consequence of this hypothesis is a far
richer mapping between syntax and semantics than is generally
assumed. Through concrete analyses of numerous grammatical
phenomena, some well studied and some new, the authors demonstrate
the empirical and conceptual superiority of the Simpler Syntax
approach. Simpler Syntax is addressed to linguists of all
persuasions. It will also be of central interest to those concerned
with language in psychology, human biology, evolution,
computational science, and artificial intelligence.
A comprehensive and up-to-date survey of the most widely taught theory of syntax. Concentrating on Principles and Parameters Theory, the book places particular emphasis on conceptual and methodological foundations. It connects earlier versions of the theory to Chomsky's recent proposals for a 'minimalist' syntactic theory.