Electromagnetic homogenization is the process of estimating the
effective electromagnetic properties of composite materials in the
long-wavelength regime, wherein the length scales of
nonhomogeneities are much smaller than the wavelengths involved.
This is a bird's-eye view of currently available homogenization
formalisms for particulate composite materials. It presents
analytical methods only, with a focus on the general settings of
anisotropy and bianisotropy. The authors largely concentrate on
'effective' materials as opposed to 'equivalent' materials, and
emphasize the fundamental (but sometimes overlooked) differences
between these two categories of homogenized composite materials.
The properties of an 'effective' material represent those of its
composite material, regardless of the geometry and dimensions of
the bulk materials and regardless of the orientations and
polarization states of the illuminating electromagnetic fields. In
contrast, the properties of 'equivalent' materials only represent
those of their corresponding composite materials under certain
restrictive circumstances.
Today, air-to-surface vessel (ASV) radars, or more generally
maritime surveillance radars, are installed on maritime
reconnaissance aircraft for long-range detection, tracking and
classification of surface ships (ASuW - anti-surface warfare) and
for hunting submarines (ASW - anti-submarine warfare). Such radars
were first developed in the UK during WWII as part of the response
to the threat to shipping from German U-Boats. This book describes
the ASV radars developed in the UK after WWII (1946-2000) and used
by the RAF for long-range maritime surveillance.
Domain theory, a subject that arose as a response to natural
concerns in the semantics of computation, studies ordered sets
which possess an unusual amount of mathematical structure. This
book explores its connection with quantum information science and
the concept that relates them: disorder. This is not a literary
work. It can be argued that its subject, domain theory and quantum
information science, does not even really exist, which makes the
scope of this alleged 'work' irrelevant. BUT, it does have a
purpose and to some extent, it can also be said to have a method. I
leave the determination of both of those largely to you, the
reader. Except to say, I am hoping to convince the uninitiated to
take a look. A look at what? Twenty years ago, I failed to
satisfactorily prove a claim that I still believe: that there is
substantial domain theoretic structure in quantum mechanics and
that we can learn a lot from it. One day it will be proven to the
point that people will be comfortable dismissing it as a
'well-known' idea that many (possibly including themselves) had
long suspected but simply never bothered to write down. They may
even call it "obvious!" I will not bore you with a brief history
lesson on why it is not obvious, except to say that we have never
been interested in the difficulty of proving the claim, only in
establishing its validity. This book then documents various
attempts on my part to do just that.
Monitoring of patients with critical neurologic illness has
expanded significantly over the past several decades. Prior to the
advent and application of technologies such as continuous EEG
(electroencephalogram), intracranial pressure monitoring, brain
tissue oxygenation and multimodal monitoring, the care of these
critically ill patients relied on frequent clinical examinations to
detect subtle changes that may signal an acute neurologic
deterioration. This type of monitoring was limited by the
availability of highly trained clinicians and nursing staff. The
severity of the patient's illness can also obscure clinical
changes, and the interventions taken to treat the illness, such as
induced coma for status epilepticus or intracranial hypertension,
can further mask the clinical signs needed to detect an acute
change.
This is an introductory textbook on computational methods and
techniques intended for undergraduates at the sophomore or junior
level in the fields of science, mathematics, and engineering. It
provides an introduction to programming languages such as FORTRAN
90/95/2000 and covers numerical techniques such as differentiation,
integration, root finding, and data fitting. The textbook also
covers the use of the Linux/Unix operating system and other
relevant software such as plotting programs, text editors, and
markup languages such as LaTeX. It includes multiple homework
assignments.
The book is about the post-relativity philosophy of time as
championed by Bertrand Russell and Einstein. It argues that the
notion of time as past, present and future is an illusion. The sun,
as daylight, shines constantly, with no temporal past and future,
except perhaps in chemistry. Only the earth's revolutions bring
alternating days and nights. So the Bertrand Russell notion that
under relativity man constructs his own time is logically
unassailable (days, weeks, months and years are all human
concepts). Relativity allows time to begin from anywhere. So the
revolutionary view is that there are, or can be, as many times as
there are frames, or planets - a world-changing idea, but one that
is true because it is based on objective, physical experiments,
though generally ignored.
Jean-Henri Fabre was a famous French entomologist whose
observations of insects were praised - this examination of various
beetles is characteristic of his meticulous yet engrossing
descriptions. Fabre's greatest talent was rooted in his genuine
passion for entomology: a natural ability to observe the quirks and
habits of small creatures and to describe them to others in a plain
but lively way. As demonstrated in this book, he wrote about
insects as if they were his friends, seeing their lives play out;
it is thus that qualities of biography are found alongside the
scientific value of this work. In life, Fabre met with backlash for
his unique style: the formal schools, which he in turn criticized
for dryness of tutoring, considered his books long-winded, or even
frivolous. Nevertheless, he managed to connect atmospheric pressure
to the behavior of certain insects, while contemporaries such as
Charles Darwin held Fabre in high esteem, to the point of finding
his studies inspirational.
"Know Thyself." Such was the advice constantly offered over 2,000
years ago by the famed Greek Oracle of Apollo at Delphi. It was
given in response to those who sought her counsel regarding the
course their destiny was likely to take. It is still sound advice
for most of us in the modern world. To come to "really" know
oneself - to discover one's distinctive temperament and
character - requires frequent self-scrutiny. It is well nigh
impossible to know what makes one "tick" without recognizing the
nature of one's attitudes and responses to life in the outside
world, while also acknowledging the highly personal inner
psychological drives of feeling, thought and imagination. The
consciousness that impels us is psychologically deep and
wide-ranging. The search for the essential Self requires a
"Sherlock Holmes" mentality and discipline: it's a hell of a job to
unify outer and inner "consciousnesses." This book should help.
Every chapter can be seen and read as its own "story" describing an
especially significant aspect of consciousness. Cumulatively, they
are meant to help readers attain a sense of their own
body-mind-spirit complexes and "who" they are as entities unto
themselves. And then to ask where "reality" is to be found: in the
mental life of thoughts and feelings... or in physical encounters
with the material world of time and space?
This textbook describes the basics of research in medical,
clinical, and biomedical settings as well as the concepts and
application of epidemiologic designs in research conduct. Design
transcends statistical techniques: no matter how sophisticated the
statistical modeling, errors of design and sampling cannot be
corrected. The authors of this textbook have presented a complex
field in a very simplified and reader-friendly manner with the
intent that such presentation will facilitate the understanding of
design process and epidemiologic thinking in clinical and
biomedical research. The book covers these relevant topics in
epidemiology:
- Case-Cohort Design
- Prospective Case-Control
- Quantitative Evidence Synthesis (QES)
- Instant Cohort Design & Case-Crossover Design
- Effect Modification & Interaction
- Epidemiologic Tree - Molecular Epidemiology & Health Disparities
- Epidemiologic Challenge - "Big Data," mHealth, Social Media
- 3 "Ts" - Team Science, Transdisciplinary Research, Translational Research
- Bias, Random Error, Confounding
- Systems Science & Evidence Discovery
Research is presented as an exercise around measurement, with
measurement error inevitable in its conduct; hence the inherent
uncertainties of all findings in clinical and biomedical research.
Concise Epidemiologic Principles and Concepts covers research
conceptualization, namely research objectives, questions,
hypothesis, design, implementation, data collection, analysis,
results, and interpretation. While the primary focus of
epidemiology is to assess the relationship between exposure (risk
or predisposing factor) and outcome (disease or health-related
event), causal association is presented in a simplified manner,
including the role of quantitative evidence synthesis
(meta-analysis) in causal inference. Epidemiology has evolved over
the past three decades, resulting in the development of several subfields.
This text presents in brief the perspectives and future of
epidemiology in the era of the molecular basis of medicine. With
molecular epidemiology, we are better equipped with tools to
identify molecular biologic indicators of risk as well as biologic
alterations in the early stages of disease.