Apposition in Contemporary English is the first full-length treatment of apposition. It provides detailed discussion of its linguistic characteristics and of its usage in various kinds of speech and writing, derived from the data of British and American computer corpora. Charles Meyer demonstrates the inadequacies of previous studies and argues that apposition is a grammatical relation realized by constructions having particular syntactic, semantic and pragmatic characteristics, of which certain are dominant. The language of press reportage, fiction, learned writing and spontaneous conversation is analyzed.
Faces of English explores the increasing number of dialects,
varieties, and creoles, even as the spread of globalization
supports an apparently growing uniformity among nations. The book's
chapters supply descriptions of Jamaican English in Toronto,
English as an L2 in a South African mining township, Chinese and
English contact in Singapore, unexpected, emergent variants in
Canadian English, and innovations in the English of West Virginia.
Further, the book offers some perspective on internet English as
well as on abiding uniformities in the lexicon and grammar of
standard varieties. In the analyses of this heterogeneous growth,
such considerations as speakers' sociolinguistic profiles,
phonological, morpho-syntactic, and lexical variables, frequencies,
and typological patterns provide ample insight into the current
status of English in both oral and electronic communities. The
opening chapter presents a theoretical framework that argues for
linguistic typology as conceptually resourceful in accommodating
techniques of analysis and in distinguishing the wide array of
English varieties found throughout the globe. One clear function for Faces of
English is that of a catalyst: to spur studies of diversities in
English (and in other languages), to suggest approaches to adapt,
to invite counterargument and developments in analysis.
This collection of essays sheds new light on the verb in English. The authors illustrate that verbs can only be properly understood if studied from both a theoretical and descriptive perspective. In Part One, the authors explore topics such as the terminological problems of classification, verb complementation, the semantics and pragmatics of verbs and verbal combinations, and the notions of tense, aspect, voice and modality. In Part Two, computer corpora are used to study various types of verb complements and collocations, to trace the development in English of certain verb forms, and to detail the usage of verbs in different varieties and genres of English.
Computers are currently used in a variety of critical applications,
including systems for nuclear reactor control, flight control (both
aircraft and spacecraft), and air traffic control. Moreover,
experience has shown that the dependability of such systems is
particularly sensitive to that of their software components, both the
system software of the embedded computers and the application
software they support. Software Performability: From Concepts to
Applications addresses the construction and solution of analytic
performability models for critical-application software. The book
includes a review of general performability concepts along with
notions which are peculiar to software performability. Since fault
tolerance is widely recognized as a viable means for improving the
dependability of computer systems (beyond what can be achieved by
fault prevention), the examples considered are fault-tolerant
software systems that incorporate particular methods of design
diversity and fault recovery. Software Performability: From
Concepts to Applications will be of direct benefit to both
practitioners and researchers in the areas of performance and
dependability evaluation, fault-tolerant computing, and dependable
systems for critical applications. For practitioners, it supplies a
basis for defining combined performance-dependability criteria (in
the form of objective functions) that can be used to enhance the
performability (performance/dependability) of existing software
designs. For those with research interests in model-based
evaluation, the book provides an analytic framework and a variety
of performability modeling examples in an application context of
recognized importance. The material contained in this book will
both stimulate future research on related topics and, for teaching
purposes, serve as a reference text in courses on computer system
evaluation, fault-tolerant computing, and dependable
high-performance computer systems.
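As an illustration of what a combined performance-dependability ("performability") measure can look like, here is a minimal sketch of a two-state Markov reward model. It is not one of the book's models; the failure rate, recovery rate, reward values, and state names are invented for the example.

```python
# A minimal sketch (not the book's models): a two-state continuous-time Markov
# reward model as a toy performability objective. State 0 = nominal service,
# state 1 = degraded service after a software fault; the system degrades at
# rate lam and recovers at rate mu (both hypothetical parameters).
import numpy as np
from scipy.linalg import expm

lam, mu = 0.01, 1.0                 # fault and recovery rates (per hour, assumed)
reward = np.array([100.0, 20.0])    # service delivered (e.g. jobs/hour) in each state

# Generator matrix of the CTMC (rows sum to zero).
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])

def expected_accumulated_reward(t_end, steps=1000):
    """Approximate E[Y(T)] = integral over [0, T] of p(t) . reward, starting nominal."""
    ts = np.linspace(0.0, t_end, steps)
    p0 = np.array([1.0, 0.0])                          # start in the nominal state
    rates = [p0 @ expm(Q * t) @ reward for t in ts]    # expected reward rate at time t
    return np.trapz(rates, ts)

print(expected_accumulated_reward(24.0))   # expected work delivered over 24 hours
```

The expected accumulated reward plays the role of an objective function: a design change that improves either the reward rates (performance) or the recovery behaviour (dependability) raises the same single number.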
Polymer Microscopy, 3rd Edition, is a comprehensive and practical
guide to the study of the microstructure of polymers, and is the
result of the authors' many years of academic and industrial
experience. To address the needs of students and professionals from
a variety of backgrounds, introductory chapters deal with the basic
concepts of polymer morphology and processing, as well as of
microscopy and imaging theory. The core of the book is more applied,
with many examples of specimen preparation and image interpretation
leading to materials characterization. Microscopy is applied to the
characterization of a wide range of polymer systems, including
fibers, films, engineering resins and plastics, composites,
nanocomposites, polymer blends, emulsions and liquid crystalline
polymers. Light microscopy, atomic force microscopy, scanning and
transmission electron microscopy techniques are all considered, as
are emerging techniques such as compositional mapping in which
microscopy is combined with spectroscopy. This extensively updated
and revised third edition closes with a problem-solving guide,
which gives a systematic framework for deciding on suitable
approaches to the characterization of polymers.
Inaccuracies in Children's Testimony combines the literature on
obedience to authority with that on suggestibility to create a
third literature. This book examines children's testimony from
several perspectives and gives you insightful suggestions for
increasing children's abilities to testify accurately about
traumatic things that have happened to them. In doing so, you'll
learn how to ensure that those who abuse or sexually exploit
children are brought to justice while those falsely accused are
adequately protected. How children are questioned to learn what they
have witnessed is crucial due to the effects the questioning
sessions may have on their testimonies--improper questioning may
lead to inaccurate answers. This is just one of the many areas of
children's testimony covered in Inaccuracies in Children's
Testimony. In each of the chapters you'll discover new ways for
increasing the accuracy and dependability of children's testimony
as you read about:
- factors that affect children's testimonies
- suggestibility--definition and research, including sources of suggestibility
- how obedience to authority can explain children's behavior as witnesses
- children's memory in the courtroom and what they are able to remember
- how children's involvement in the courts can be problematic
- free versus prompted recall--which is more accurate and why the "worst" method is often used with children
- Milgram's theory of obedience to authority tied to children as witnesses
- review of the literature on the effects of stress, prompting, and imagination on children's recall
- ideas for future research
Experts in the field of legal testimony, legal personnel, child
counselors, psychologists, social workers, and faculty and students
of related courses will find Inaccuracies in Children's Testimony an
essential resource for understanding the importance of making the
child victim/witness more believable and reliable.
Accretion disks in astrophysics represent the characteristic flow
by which compact bodies accrete mass from their environment. Their
intrinsically high luminosity, and recent progress in observational
accessibility at all wavelength bands, have led to rapidly growing
awareness of their importance and made them the object of intense
research on widely different scales, ranging from binary stars to
young stellar objects and active galactic nuclei. This book
contains the proceedings of the NATO Advanced Workshop on 'Theory of
Accretion Disks 2', for which some of the most active researchers in
the different fields came together at the Max Planck Institute for
Astrophysics in Garching in March 1993. Its reviews and
contributions give an up-to-date account of the present status of
our understanding and provide a stimulating challenge in
discussions of open questions in a rapidly developing field.
This step-by-step guide to creating and analyzing linguistic corpora discusses the role that corpus linguistics plays in linguistic theory. It demonstrates that corpora have proven to be very useful resources for linguists who believe that their theories and descriptions of English should be based on real rather than contrived data. The author shows how to collect and computerize data for inclusion in a corpus and how to annotate and conduct a linguistic analysis once the corpus has been created.
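To give a flavour of the kind of corpus-based analysis the book describes, here is a minimal sketch using an invented two-sample "corpus"; the genre labels, the texts, and the per-thousand-words measure are illustrative and not taken from the book.

```python
# A minimal sketch, not the book's own workflow: a tiny "corpus" of two text
# samples, tokenized with a simple regular expression, then queried for the
# normalized frequency of one word (occurrences per 1,000 words), a common way
# of comparing usage across samples of different sizes.
import re
from collections import Counter

corpus = {
    "conversation": "well I mean you know it was kind of odd really",
    "press":        "The minister said the decision was final and would not be reviewed",
}

def tokenize(text):
    """Lower-case word tokenizer; real corpora need far more careful handling."""
    return re.findall(r"[a-z']+", text.lower())

for genre, text in corpus.items():
    tokens = tokenize(text)
    counts = Counter(tokens)
    per_thousand = 1000 * counts["was"] / len(tokens)
    print(f"{genre}: {len(tokens)} tokens, 'was' occurs {per_thousand:.1f} per 1,000 words")
```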
This volume contains the papers presented at the Second
International Working Conference on Dependable Computing for
Critical Applications, sponsored by IFIP Working Group 10.4 and held
in Tucson, Arizona on February 18-20, 1991. In keeping with the
first such conference on this topic, which took place at the
University of California, Santa Barbara in 1989, this meeting was
likewise concerned with an important basic question: Can we rely
on computers? In more precise terms, it addressed various aspects
of computer system dependability, a broad concept defined as the
trustworthiness of computer service such that reliance can
justifiably be placed on this service. Given that this term
includes attributes such as reliability, availability, safety, and
security, it is our hope that these papers will contribute to
further integration of these ideas in the context of critical
applications. The program consisted of 20 papers and three panel
sessions. The papers were selected from a total of 61 submissions
at a November 1990 meeting of the Program Committee in Ann Arbor,
Michigan. We were very fortunate to have a broad spectrum of
interests represented, with papers in the final program coming from
seven different countries, representing work at universities,
corporations, and government agencies. The process was greatly
facilitated by the diligent work of the Program Committee and the
quality of reviews provided by outside referees. In addition to the
paper presentations, there were three panel sessions organized to
examine particular topics in detail.
With the advent of space observatories and modern developments in
ground-based astronomy, and concurrent progress in the theoretical
understanding of these observations, it has become clear that
accretion of material onto compact objects is a ubiquitous
mechanism powering very diverse astrophysical sources ranging in
size and luminosity by many orders of magnitude. A problem common
to these systems is that the material accreted must in general get
rid of its angular momentum, and this leads to the formation of an
accretion disk, which allows angular momentum redistribution and
converts potential energy into radiation with an efficiency which
can be higher than the nuclear burning yield. These systems range
in size from quasars and active galactic nuclei to accretion disks
around forming stars and the early solar system and to compact
binaries such as cataclysmic variables and low-mass X-ray binaries.
Other objects that should be mentioned in this context are SS 433,
the black hole binary candidates, and possibly gamma-ray burst
sources. Observations of these systems have provided important
constraints for theoretical accretion disk models on widely
differing scales, luminosities, mass-transfer rates and physical
environments.
Martin F. Meyer examines the development of life-science thinking
from its earliest beginnings to the birth of scientific biology with
Aristotle. In the first part, the author shows how central
biological concepts (life, living being, human, animal, plant)
developed in early Greek thought, among the Presocratics, and in
so-called Hippocratic medicine. In the second part, he examines the
aims, methods, and systematics of the biology founded by Aristotle
within the context of his program of natural science.
Are you looking for a genuine introduction to the linguistics of
English that provides a broad overview of the subject while
sustaining students' interest and avoiding excessive detail? Introducing English
Linguistics accomplishes this goal in two ways. First, it takes a
top-down approach to language, beginning with the largest unit of
linguistic structure, the text, and working its way down through
successively smaller structures (sentences, words, and finally
speech sounds). The advantage of presenting language this way is
that students are first given the larger picture - they study
language in context - and then see how the smaller pieces of
language are a consequence of the larger goals of linguistic
communication. Second, the book does not contain invented examples,
as is the case with most comparable texts, but instead takes its
sample materials from the major computerised databases of spoken
and written English, giving students a more realistic view of
language.
This collection of essays by some of the leading scholars in the
field sheds new light on the verb in English. The central concern
of the volume is to illustrate that verbs can only be adequately
and properly understood if studied from both a theoretical and
descriptive perspective. In part one, theoretical topics are
explored: terminological problems of classifying verbs and
verb-related elements, the 'determining' properties of verbs, verb
complementation, the semantics and pragmatics of verbs and verbal
combinations, and the notions of tense, aspect, voice and modality.
In part two, computer corpora are used to study various types of
verb complements and collocations, to trace the development in
English of certain verb forms and to detail the usage of verbs in
different varieties and genres of English.