The Chomskian revolution in linguistics gave rise to a new
orthodoxy about mind and language. Michael Devitt throws down a
provocative challenge to that orthodoxy. What is linguistics about?
What role should linguistic intuitions play in constructing
grammars? What is innate about language? Is there a 'language
faculty'? These questions are crucial to our developing
understanding of ourselves; Michael Devitt offers refreshingly
original answers. He argues that linguistics is about linguistic
reality and is not part of psychology; that linguistic rules are
not represented in the mind; that speakers are largely ignorant of
their language; that speakers' intuitions do not reflect
information supplied by the language faculty and are not the main
evidence for grammars; that the rules of 'Universal Grammar' are
largely, if not entirely, innate structure rules of thought;
indeed, that there is little or nothing to the language faculty.
Devitt's controversial theses will prove highly stimulating to
anyone working on language and the mind.
The metaphysical part of this book is largely concerned with
realism issues. Michael Devitt starts with realism about
universals, dismissing Plato's notorious 'one over many' problem.
Several chapters argue for a fairly uncompromisingly realist view
of the external physical world of commonsense and science. Both the
nonfactualism of moral noncognitivism and positivistic
instrumentalism, and deflationism about truth, are found to rest
on an antirealism that is hard to characterize. A case is presented
for moral realism. Various biological realisms are considered.
Finally, an argument is presented for an unfashionable biological
essentialism.
The second part of the book is epistemological. Devitt argues
against the a priori and for a Quinean naturalism. The intuitions
that so dominate "armchair philosophy" are empirical not a priori.
Throughout the book there is an emphasis on distinguishing
metaphysical issues about what there is and what it's like from
semantic issues about meaning, truth, and reference. Another
central theme, captured in the title, is that we should "put
metaphysics first." We should approach epistemology and semantics
from a metaphysical perspective rather than vice versa. The
epistemological turn in modern philosophy, and the linguistic turn
in contemporary philosophy, were each something of a disaster.
The "Blackwell Guide to Philosophy of Language" is a collection of
twenty new essays in a cutting-edge and wide-ranging field.
It surveys central issues in contemporary philosophy of language
while examining foundational topics, provides pedagogical tools
such as abstracts and suggestions for further reading, and
addresses topics including the nature of meaning, speech acts and
pragmatics, figurative language, and naturalistic theories of
reference.
This book criticizes the methodology of the recent
semantics-pragmatics debate in the theory of language and proposes
an alternative. It applies this methodology to argue for a
traditional view against a group of "contextualists" and
"pragmatists", including Sperber and Wilson, Bach, Carston,
Recanati, Neale, and many others. The author disagrees with these
theorists, who hold that the meaning of a sentence in an utterance
never, or hardly ever, yields its literal truth-conditional
content, even after disambiguation and reference fixing, and so
must be pragmatically supplemented in context. The standard
methodology of this debate is to consult intuitions. The book
argues that theories should be tested against linguistic usage.
Theoretical distinctions, however intuitive, need to be
scientifically motivated. Also, we should not be guided by Grice's
"Modified Occam's Razor", Ruhl's "Monosemantic Bias", or other such
strategies for "meaning denialism". From this novel perspective,
the striking examples of context relativity that motivate
contextualists and pragmatists typically exemplify semantic rather
than pragmatic properties. In particular, polysemous phenomena
should typically be treated as semantic ambiguity. The author
argues that conventions have been overlooked, that there's no
extensive "semantic underdetermination" and that the new
theoretical framework of "truth-conditional pragmatics" is a
mistake.
Landmark Essays on Rhetorical Genre Studies gathers major works
that have contributed to the recent rhetorical reconceptualization
of genre. A lively and complex field developed over the past 30
years, Rhetorical Genre Studies is central to many current research
and teaching agendas. This collection, which is organized both
thematically and chronologically, explores genre research across a
range of disciplinary interests but with a specific focus on
rhetoric and composition. With introductions by the co-editors to
frame and extend each section, this volume helps readers understand
and contextualize both the foundations of the field and the central
themes and insights that have emerged. It will be of particular
interest to students and scholars working on topics related to
composition, rhetoric, professional and technical writing, and
applied linguistics.
In the latter half of the 19th century, Gustave Pierre Trouve, a
modest but brilliant Parisian electrical engineer, conceived and
patented some 75 inventions, including the endoscope, the electric
car and the frontal headlamp. He also designed an electric boat -
complete with outboard motor, headlight and horn - an electric
rifle, an electric piano and a luminous fountain, and developed
wearable technology and ultraviolet light (PUVA) therapy. Unlike
his famous contemporary Nikola Tesla, who worked for Thomas Edison
and was patronised by George Westinghouse, Trouve never came to
America. A confirmed bachelor uninterested in industrialisation,
he was gradually forgotten following his accidental death in
1902. This first-ever biography of Trouve details the fascinating
life of the Chevalier of the Legion of Honor once dubbed "the
French Edison."
This book is an excellent best-practice guide for senior managers
and directors with innovation responsibilities. It describes how
organisations of all sizes and sectors can apply design thinking
principles, coupled with commercial awareness, to their innovation
agenda. It explains how to keep the customer experience at the
centre of innovation efforts and when to apply the range of
available practices. It provides a clear, extensive rationale for
all advice and techniques offered. Design thinking has become the
number one innovation methodology for many businesses, but there
has been a lack of clarity about how best to adopt it. It often
requires significant mindset and behavioural changes and managers
must have a coherent and integrated understanding in order to guide
its adoption effectively. Many design thinking implementations are
inadequate or sub-optimal because they focus too much on the
details of individual methods or remain too abstract, with
ill-defined objectives. This book uniquely provides integrated
clarity and
rationale across all levels of design thinking practice and
introduces the ARRIVE framework for design thinking in business
innovation, which the authors have developed over ten years of
practice and research. ARRIVE = Audit – Research – Reframe –
Ideate – Validate – Execute. The book contains a chapter for
each of A-R-R-I-V-E, each of which has explanatory background and
step-by-step methods instruction in a clear and standard format.
Using the ARRIVE framework, the book provides high-level
understanding, rationale and step-by-step guidance for CEOs, senior
innovation leaders, innovation project managers and design
practitioners in diverse public and private sectors. It applies
equally well to innovation of products, services or systems.
This open access book serves as a comprehensive guide to digital
writing technology, featuring contributions from over 20 renowned
researchers from various disciplines around the world. The book is
designed to provide a state-of-the-art synthesis of the
developments in digital writing in higher education, making it an
essential resource for anyone interested in this rapidly evolving
field. In the first part of the book, the authors offer an overview
of the impact that digitalization has had on writing, covering more
than 25 key technological innovations and their implications for
writing practices and pedagogical uses. Drawing on these chapters,
the second part of the book explores the theoretical underpinnings
of digital writing technology such as writing and learning, writing
quality, formulation support, writing and thinking, and writing
processes. The authors provide insightful analysis on the impact of
these developments and offer valuable insights into the future of
writing. Overall, this book provides a cohesive and consistent
theoretical view of the new realities of digital writing,
complementing existing literature on the digitalization of writing.
It is an essential resource for scholars, educators, and
practitioners interested in the intersection of technology and
writing.
This book offers a new view of the linguistic process of
standardization, the movement of specific language features towards
uniformity. Drawing on theoretical arguments and empirical data, it
examines the way in which linguistic conformity develops out of
variation, and the textual and social factors which influence this
process. After defining and clarifying the general theoretical
issues involved, Professor Devitt takes as a specific case study
the standardization of written English in Scotland in the sixteenth
and seventeenth centuries, and shows that standardization is a
gradual process, that it encompasses periods of great variation and
that it occurs concurrently with sociopolitical shifts. The
interrelationship of linguistic features, genres and social
pressures shapes the nature and direction of standardization. This
is a readable and accessible book which will appeal to those
involved in the study of Scots-English, and is of importance for
linguistic methodology and the study and teaching of literacy.
Maple V Mathematics Programming Guide is the fully updated language
and programming reference for Maple V Release 5. It presents a
detailed description of Maple V Release 5 - the latest release of
the powerful, interactive computer algebra system used worldwide as
a tool for problem-solving in mathematics, the sciences,
engineering, and education. This manual describes the use of both
numeric and symbolic expressions, the data types available, and the
programming language statements in Maple. It shows how the system
can be extended or customized through user defined routines and
gives complete descriptions of the system's user interface and 2D
and 3D graphics capabilities.
Maple V Mathematics Learning Guide is the fully revised
introductory documentation for Maple V Release 5. It shows how to
use Maple V as a calculator with instant access to hundreds of
high-level math routines and as a programming language for more
demanding or specialized tasks. Topics include the basic data types
and statements in the Maple V language. The book serves as a
tutorial introduction and explains the difference between numeric
computation and symbolic computation, illustrating how both are
used in Maple V Release 5. Extensive "how-to" examples are
presented throughout the text to show how common types of
calculations can be easily expressed in Maple. Graphics examples
are used to illustrate the way in which 2D and 3D graphics can aid
in understanding the behaviour of problems.
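The numeric-versus-symbolic distinction the Learning Guide explains is not unique to Maple. As a rough illustration, in Python rather than Maple syntax, exact rational arithmetic preserves values precisely, the way a computer algebra system does, while floating-point numeric evaluation can only approximate them:

```python
from fractions import Fraction

# Numeric computation: binary floating point can only approximate
# 1/3 and 1/6, so intermediate values carry rounding error.
numeric = 1 / 3 + 1 / 6

# Exact computation: rational arithmetic keeps values exact, the way
# a computer algebra system keeps 1/3 + 1/6 as exactly 1/2.
exact = Fraction(1, 3) + Fraction(1, 6)

print(numeric)  # approximately 0.5, subject to rounding
print(exact)    # 1/2, exactly
```

This only sketches the exact-arithmetic side of symbolic computation; a full computer algebra system such as Maple also manipulates unevaluated symbolic expressions, which plain Python does not do out of the box.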
Health Politics in Europe: A Handbook is a major new reference
work, which provides historical background and up-to-date
information and analysis on health politics and health systems
throughout Europe. In particular, it captures developments that
have taken place since the end of the Cold War, a turning point for
many European health systems, with most post-communist transition
countries privatizing their state-run health systems, and many
Western European health systems experimenting with new public
management and other market-oriented health reforms. Following
three introductory, stage-setting chapters, the handbook offers
country cases divided into seven regional sections, each of which
begins with a short regional outlook chapter that highlights the
region's common characteristics and divergent paths taken by the
separate countries, including comparative data on health system
financing, healthcare access, and the political salience of health.
Each regional section contains at least one detailed main case,
followed by shorter treatments of the other countries in the
region. Country chapters feature a historical overview focusing on
the country's progression through a series of political regimes and
the consequences of this history for the health system; an overview
of the institutions and functioning of the contemporary health
system; and a political narrative tracing the politics of health
policy since 1989. This political narrative, the core of each
country case, examines key health reforms in order to understand
the political motivations and dynamics behind them and their impact
on public opinion and political legitimacy. The handbook's
systematic structure makes it useful for country-specific,
cross-national, and topical research and analysis.
This book constitutes the refereed proceedings of the 8th
International Conference on Reversible Computation, RC 2016, held
in Bologna, Italy, in July 2016. The 18 full and 5 short papers
included in this volume were carefully reviewed and selected from
38 submissions. The papers are organized in topical sections named:
process calculi; reversible models; programming languages; quantum
computing; quantum programming; circuit theory; and syntheses.
Michael Devitt is a distinguished philosopher of language. In this
book he takes up one of the most important difficulties that must
be faced by philosophical semantics: namely, the threat posed by
holism. Three important questions lie at the core of this book:
what are the main objectives of semantics; why are they worthwhile;
how should we accomplish them? Devitt answers these
'methodological' questions naturalistically and explores what
semantic programme arises from the answers. The approach is
anti-Cartesian, rejecting the idea that linguistic or conceptual
competence yields any privileged access to meanings. This new
methodology is used first against holism. Devitt argues for a
truth-referential localism, and in the process rejects
direct-reference, two-factor, and verificationist theories. The
book concludes by arguing against revisionism, eliminativism, and
the idea that we should ascribe narrow meanings to explain
behaviour.