Unlike some other reproductions of classic texts, (1) we have not used OCR (Optical Character Recognition), as this leads to poor-quality books with introduced typos. (2) In books containing images such as portraits, maps, and sketches, we have endeavoured to preserve the quality of these images so that they accurately represent the original artefact. Although these old texts may occasionally contain imperfections, we feel they deserve to be made available for future generations to enjoy.
The B-minor Mass has always represented a fascinating challenge to
musical scholarship. Composed over the course of Johann Sebastian
Bach's life, it is considered by many to be the composer's greatest
and most complex work. The fourteen essays assembled in this volume
originate from the International Symposium 'Understanding Bach's B-minor Mass', at which scholars from eighteen countries gathered to
debate the latest topics in the field. In revised and updated form,
they comprise a thorough and systematic study of Bach's Opus
Ultimum, including a wide range of discussions relating to the
Mass's historical background and contexts, structure and
proportion, sources and editions, and the reception of the work in
the late eighteenth and early nineteenth centuries. In the light of
important new developments in the study of the piece, this
collection demonstrates the innovation and rigour for which Bach
scholarship has become known.
This book presents current research on mobile Internet society.
Past research lacked a clear analytical framework and was therefore unable to close in on the fundamental changes in that society. This book, however, analyzes mobile Internet society by
introducing the concept of "doubling of time and place" and the
analytical framework of the "second offline." The emergence of the
smartphone has made Internet use easier, and now, people are
constantly using online information in the midst of their daily
lives. Our society is transitioning from the first offline society, one not connected to the Internet, to the second offline society, in which users are connected to the Internet at all times. In this second offline society, our sense of time and place is beginning to change. Broadcast and communication media have
made possible the overlapping of different places, which has been
called the doubling of place. Furthermore, virtual reality (VR) and
augmented reality (AR) technologies have enabled the overlap of
different times, which this book calls the doubling of time. The
smartphone makes both possible. With the second offline and the
doubling of time and place as keywords, the book takes into
consideration research that includes, among other topics, the media
usage of young adults, selfies, education, social media usage,
mobile games, work stations, and consumer activity in the mobile
Internet society.
Because stroke is essentially a disease of the vessels and blood
flow, the most fundamental aspects of ischemic blood flow in the
brain are under investigation by researchers. Their work was the
focus of the sixth in the series of Keio University International
Symposia for Life Sciences and Medicine, held in Tokyo in 1999.
Selected here are 55 papers from the symposium, covering the buffy
coat (glycocalyx) of endothelial cells, the blood-brain barrier and
permeability, gene expression, vascular reactivity, dysregulation,
inflammatory deterioration, cortical spreading depression, edema,
microvascular derangement, and pathology, in ten major sections.
The book includes the thought-provoking discussions that followed
the presentations, thus providing an invaluable source of
up-to-date information not only for researchers investigating
microcirculation but also for clinicians implementing the most
effective treatment for stroke patients.
Metabolism is the sum of the chemical reactions in cells that
produce life-sustaining chemical energy and metabolites. In the
post-genome era, metabolism has taken on new significance for
biological scientists: metabolites are the chemical basis of
phenotypes that are final expressions of genomic information. This
book covers research on metabolomics, ranging from the development
of specialized chemical analytical techniques to the construction
of databases and methods for metabolic simulation. The authors have
been directly involved in the development of all the subject areas,
including capillary electrophoresis, liquid chromatography, mass
spectrometry, metabolic databases, and metabolic simulation.
Breakthrough achievements and the future of metabolome studies are
described, making this book a valuable source for researchers in
metabolomics in diverse fields, such as plant, animal, cellular,
microbial, pharmaceutical, medical, and genetic sciences.
This book presents theories and techniques for perception of
textures by computer. Texture is a homogeneous visual pattern that
we perceive on the surfaces of objects such as textiles, tree bark, or
stones. Texture analysis is one of the first important steps in
computer vision since texture provides important cues to recognize
real-world objects. A major part of the book is devoted to
two-dimensional analysis of texture patterns by extracting
statistical and structural features. It also deals with the
shape-from-texture problem which addresses recovery of the
three-dimensional surface shapes based on the geometry of
projection of the surface texture to the image plane. Perception is
still largely mysterious. Realizing a computer vision system that can work in the real world requires more research and experimentation. The capability of textural perception is a key component. We hope this book will contribute to the advancement of computer vision toward robust, useful systems. We would like to express our appreciation to Professor Takeo Kanade at Carnegie Mellon University for his encouragement and help in writing this book; to the members of the Computer Vision Section at the Electrotechnical Laboratory for providing an excellent research environment; and to Carl W. Harris at Kluwer Academic Publishers for his help in preparing the manuscript.
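As an aside, one classical instance of the "statistical features" the blurb above mentions is the gray-level co-occurrence matrix (GLCM). The sketch below is an illustrative assumption, not an example taken from the book: the two-level image patches are made up, and the functions show only the basic idea of counting horizontal neighbor pairs and deriving a contrast statistic from the counts.

```python
# Minimal GLCM sketch: count horizontal neighbor pairs of gray levels,
# then compute a contrast statistic over the normalized counts.
# A uniform patch yields zero contrast; an alternating patch yields
# high contrast, which is the kind of cue texture classifiers use.

def glcm(image, levels):
    """Count horizontal neighbor pairs (i, j) of gray levels."""
    counts = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):
            counts[a][b] += 1
    return counts

def contrast(counts):
    """Sum of (i - j)^2 weighted by normalized co-occurrence frequency."""
    total = sum(sum(row) for row in counts) or 1
    return sum((i - j) ** 2 * c / total
               for i, row in enumerate(counts)
               for j, c in enumerate(row))

flat = [[1, 1, 1], [1, 1, 1]]      # uniform patch
stripes = [[0, 1, 0], [0, 1, 0]]   # alternating patch
print(contrast(glcm(flat, 2)), contrast(glcm(stripes, 2)))  # prints: 0.0 1.0
```

Real systems compute such matrices over several offsets and directions and combine many statistics (contrast, energy, entropy), but the counting-and-weighting pattern is the same.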
A sequel to Tomita's A Bibliographical Catalogue of Italian Books
Printed in England 1558-1603, this volume provides the data for the
succeeding 40 years (during the reigns of King James I and King Charles I) and contributes to the study of Anglo-Italian relations in
literature through entries on 187 Italian books (335 editions)
printed in England. The Catalogue starts with the books published
immediately after the death of Queen Elizabeth I on 24 March 1603,
and ends in 1642 with the closing of English theatres. It also
contains 45 Elizabethan books (75 editions), which did not feature
in the previous volume. Formatted along the lines of Mary Augusta
Scott's Elizabethan Translations from the Italian (1916), and
adopting Philip Gaskell's scientific method of bibliographical
description, this volume provides reliable and comprehensive
information about books and their publication, viewed in a general
perspective of Anglo-Italian transactions in Jacobean and part of
Caroline England.
Through entries on 291 Italian books (451 editions) published in
England during the reign of Queen Elizabeth I, covering the years
1558-1603, this catalogue represents a summary of current research
and knowledge of diffusion of Italian culture on English literature
in this period. It also provides a foundation for new work on
Anglo-Italian relations in Elizabethan England. Mary Augusta
Scott's 1916 Elizabethan Translations from the Italian forms the
basis for the catalogue; Soko Tomita adds 59 new books and
eliminates 23 of Scott's original entries. The information here is
presented in a user-friendly and uncluttered manner, guided by
Philip Gaskell's principles of bibliographical description; the
volume includes bibliographical descriptions, tables, graphs,
images, and two indices (general and title). In an attempt to
restore each book to its original status, each entry is concerned
not only with the physical book, but with the human elements
guiding it through production: the relationship with the author,
editor, translator, publisher, book-seller, and patron are all
recounted as important players in the exploration of cultural
significance. Renaissance Anglo-Italian relations were marked by
both patriotism and xenophobia; this catalogue provides reliable
and comprehensive information about books and publication as well
as concrete evidence of what elements of Italian culture the
English responded to and how Italian culture was acclimatized into
Elizabethan England.
The Jazz Rhythm Section introduces the basics of this very important part of the jazz ensemble. The rhythm section is the foundation of any jazz group, so improving it will result in a stronger-sounding band. This book is intended to be a practical
guide with chapters on each of the primary instruments in the
rhythm section: bass, drums, piano and guitar. Key topics include:
Equipment and setup issues: from drum heads to bass amps to guitar
pickups to mic'ing a piano on stage, each instrument presents
unique equipment issues that the director must face. Performance
practice: including tips on constructing walking bass lines,
learning voicings and comping rhythms and creating drummer's setup
fills. Additionally, there is a chapter on the rhythm section
itself that details all of the inter-relationships, suggestions for
count-offs and metronome exercises that will help improve your
band. The Jazz Rhythm Section is intended for novice directors, but
directors of all levels will benefit as well.
The Generalized LR parsing algorithm (some call it "Tomita's
algorithm") was originally developed in 1985 as part of my Ph.D.
thesis at Carnegie Mellon University. When I was a graduate student
at CMU, I tried to build a couple of natural language systems based
on existing parsing methods. Their parsing speed, however, always
bothered me. I sometimes wondered whether it was ever possible to
build a natural language parser that could parse reasonably long
sentences in a reasonable time without help from large mainframe
machines. At the same time, I was always amazed by the speed of
programming language compilers, because they can parse very long
sentences (i.e., programs) very quickly even on workstations. There
are two reasons. First, programming languages are considerably
simpler than natural languages. And secondly, they have very
efficient parsing methods, most notably LR. The LR parsing
algorithm first precompiles a grammar into an LR parsing table, and
at the actual parsing time, it performs shift-reduce parsing guided
deterministically by the parsing table. So, the key to the LR
efficiency is the grammar precompilation; something that had never
been tried for natural languages in 1985. Of course, there was a
good reason why LR had never been applied for natural languages; it
was simply impossible. If your context-free grammar is sufficiently
more complex than programming languages, its LR parsing table will
have multiple actions in some table entries, and deterministic parsing will no longer be possible.
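The shift-reduce loop described above can be illustrated with a toy sketch. The grammar E -> E '+' 'n' | 'n' and the function below are invented for this illustration (this is not Tomita's GLR implementation): a real LR parser drives the same loop from a precompiled state table, and GLR handles the multiple-action case by forking the parse stack instead of failing.

```python
# Toy shift-reduce parser for the grammar
#     E -> E '+' 'n'  |  'n'
# Because no stack/input situation here ever matches two rules at once,
# parsing is deterministic. A grammar whose table had a cell with
# multiple actions would force a GLR parser to pursue both in parallel.

def shift_reduce_parse(tokens):
    """Return the start symbol "E" if `tokens` derives from E, else raise."""
    stack, tokens = [], list(tokens)
    while tokens or stack != ["E"]:
        if stack[-3:] == ["E", "+", "n"]:
            stack[-3:] = ["E"]           # reduce by E -> E '+' 'n'
        elif stack[-1:] == ["n"]:
            stack[-1:] = ["E"]           # reduce by E -> 'n'
        elif tokens:
            stack.append(tokens.pop(0))  # shift the next input token
        else:
            raise ValueError("parse error, stack: %r" % stack)
    return stack[0]

print(shift_reduce_parse(list("n+n+n")))  # prints: E
```

Malformed input such as "n+" leaves the loop with nothing to shift and nothing to reduce, which is exactly the error case a parsing table would signal with an empty cell.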
Parsing efficiency is crucial when building practical natural language systems. This is especially the case for interactive
systems such as natural language database access, interfaces to
expert systems and interactive machine translation. Despite its
importance, parsing efficiency has received little attention in the
area of natural language processing. In the areas of compiler
design and theoretical computer science, on the other hand, parsing
algorithms have been evaluated primarily in terms of theoretical worst-case analysis (e.g., O(n³)), and very few practical comparisons have been made. This book introduces a context-free
parsing algorithm that parses natural language more efficiently
than any other existing parsing algorithms in practice. Its
feasibility for use in practical systems is being proven in its
application to Japanese language interface at Carnegie Group Inc.,
and to the continuous speech recognition project at Carnegie-Mellon
University. This work was done while I was pursuing a Ph.D. degree
at Carnegie-Mellon University. My advisers, Herb Simon and Jaime
Carbonell, deserve many thanks for their unfailing support, advice
and encouragement during my graduate studies. I would like to thank
Phil Hayes and Ralph Grishman for their helpful comments and
criticism that in many ways improved the quality of this book. I
wish also to thank Steven Brooks for insightful comments on
theoretical aspects of the book (chapter 4, appendices A, B and C),
and Rich Thomason for improving the linguistic part of the book
(the very beginning of section 1.1).
Introducing Zom-BL! A truck pulls into a sleepy roadside diner. In it, a pair of young men argue over something before stepping into
the restaurant for some grub and some assistance. Apparently their
car is having engine trouble and they need a new car battery. The
cook offers his used battery and hands one of them the keys to his
car before he starts preparing a pair of chili dogs. While he is
over the grill, he is told a tale of chaos, violence and disorder
perpetuated by undying people(?)...
With the spread of mobile augmented reality, it has become very
difficult to consider digital space and physical space
independently. In this book, the authors identify and discuss the
state 'Second Offline' which refers to a real-world environment
whose elements are augmented by virtual information and one in
which individuals are constantly referring to the online world.
'Second Offline' is observed across a wide range of social contexts
and the relationship between superimposed digital online
information and physical offline information is increasingly
important. This book analyses the cooperative relationship between
online and offline and also examines situations where there may be
a conflict between these realities. Furthermore, the authors
discuss the possibility that in addition to influencing the
physical space, the digital world actually causes some of the
physical world to be lost. Offering a discussion of the
implications of a post-mobile society in which second offline is
widespread, this edited collection will be of interest to students,
scholars and practitioners working in sociology, mobile media and
cultural studies more generally.
For nearly two centuries, Johann Sebastian Bach has been regarded
as a cornerstone of Western musical culture. His music inspired
subsequent generations of composers and philosophers alike, and
continues to capture our imaginations in many ways. Bach studies is
part of this picture, often seen as providing excellent examples of
musicological scholarship. For The Baroque Masters: Bach, the
editor has chosen thirty-three published articles which, in his
view, not only represent a broad spectrum of the scholarly
discussions on Bach's life and works, but will also facilitate the
on-going study of Bach's creative genius. The articles have been
selected to ensure that this volume will be useful not only for students currently engaged in Bach studies at universities but also for more seasoned Bach scholars as they consider the future direction of Bach studies.
The interdisciplinary field of molecular systems biology aims to
understand the behavior and mechanisms of biological processes
composed of individual molecular components. As we gain more
qualitative and quantitative information of complex intracellular
processes, biochemical modeling and simulation become indispensable
not only to uncover the molecular mechanisms of the processes, but
to perform useful predictions. To this end, the E-Cell System, a
multi-algorithm, multi-timescale object-oriented simulation
platform, can be used to construct predictive virtual biological
systems. Gene regulatory and biochemical networks that constitute a
sub- or a whole cellular system can be constructed using the E-Cell
System to perform qualitative and quantitative analyses. The
purpose of E-Cell System: Basic Concepts and Applications is to
provide a comprehensive guide for the E-Cell System version 3 in
terms of the software features and its usage. While the publicly
available E-Cell Simulation Environment version 3 User's Manual
provides the technical details of model building and scripting, it
does not describe some of the underlying concepts of the E-Cell
System. The first part of the book addresses this issue by
providing the basic concepts of modeling and simulation with the
E-Cell System.
This book constitutes the thoroughly refereed conference
proceedings of the 9th International Workshop on Algorithms and
Computation, WALCOM 2015, held in Dhaka, Bangladesh, in February
2015. The 26 revised full papers presented together with 3 invited
talks were carefully reviewed and selected from 85 submissions. The
papers are organized in topical sections on approximation
algorithms, data structures and algorithms, computational geometry,
combinatorial algorithms, distributed and online algorithms, graph
drawing and algorithms, combinatorial problems and complexity, and
graph enumeration and algorithms.
These letters tell the story of a young American woman of Japanese
descent who, along with over 10,000 other Japanese Americans, was
stranded in Japan during World War II.