This book gives a practical introduction to model-based testing,
showing how to write models for testing purposes and how to use
model-based testing tools to generate test suites. It is aimed at
testers and software developers who wish to use model-based
testing, rather than at tool-developers or academics.
The book focuses on the mainstream practice of functional black-box
testing and covers different styles of models, especially
transition-based models (UML state machines) and pre/post models
(UML/OCL specifications and B notation). The steps of applying
model-based testing are demonstrated on examples and case studies
from a variety of software domains, including embedded software and
information systems.
From this book you will learn:
* The basic principles and terminology of model-based testing
* How model-based testing differs from other testing
processes
* How model-based testing fits into typical software lifecycles
such as agile methods and the Unified Process
* The benefits and limitations of model-based testing, its cost
effectiveness and how it can reduce time-to-market
* A step-by-step process for applying model-based testing
* How to write good models for model-based testing
* How to use a variety of test selection criteria to control the
tests that are generated from your models
* How model-based testing can connect to existing automated test
execution platforms such as Mercury Test Director, Java JUnit, and
proprietary test execution environments
* Presents the basic principles and terminology of model-based
testing
* Shows how model-based testing fits into the software lifecycle,
its cost-effectiveness, and how it can reduce time to market
* Offers guidance on how to use different kinds of modeling
techniques, useful test generation strategies, how to apply
model-based testing techniques to real applications using case
studies
Geometric Function Theory is the part of Complex Analysis which
covers the theory of conformal and quasiconformal mappings.
Beginning with the classical Riemann mapping theorem, there are
many existence theorems for canonical conformal mappings. On the
other hand, there is an extensive theory of qualitative properties
of conformal and quasiconformal mappings, concerning mainly a priori
estimates, the so-called distortion theorems (including the
Bieberbach conjecture with the proof by de Branges). Here the
starting point was the classical Schwarz lemma, followed by Koebe's
distortion theorem.
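For reference, the distortion theorems mentioned above can be stated
concretely for the normalised class S of functions univalent on the
unit disk with f(0) = 0 and f'(0) = 1; Koebe's distortion theorem
and de Branges' theorem (the former Bieberbach conjecture) read:

```latex
% Koebe distortion theorem: growth and derivative bounds for f in S
\frac{|z|}{(1+|z|)^2} \;\le\; |f(z)| \;\le\; \frac{|z|}{(1-|z|)^2},
\qquad
\frac{1-|z|}{(1+|z|)^3} \;\le\; |f'(z)| \;\le\; \frac{1+|z|}{(1-|z|)^3}.

% de Branges' theorem (Bieberbach conjecture): coefficient bounds
|a_n| \le n \quad (n \ge 2), \qquad
f(z) = z + \sum_{n \ge 2} a_n z^n \in S.
```

Both bounds are sharp: equality holds for rotations of the Koebe
function k(z) = z/(1-z)^2.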
There are several connections to mathematical physics, because of
the relations to potential theory (in the plane). The Handbook of
Geometric Function Theory also contains an article about
constructive methods and a bibliography including applications,
e.g. to electrostatic problems, heat conduction, and potential
flows (in the plane).
- A collection of independent survey articles in the field of
Geometric Function Theory
- Existence theorems and qualitative properties of conformal and
quasiconformal mappings
- A bibliography, including many hints to applications in
electrostatics, heat conduction, potential flows (in the plane).
Indispensable for food, chemical, mechanical, and packaging
engineers, "Handbook of Farm, Dairy, and Food Machinery" covers in
one comprehensive volume fundamental food engineering principles in
the design of food industry machinery. The handbook provides broad,
yet technically detailed coverage of food safety, regulations,
product processing systems, packaging, facilities, waste
management, and machinery design topics in a "farm to the fork"
organization.
The 22 chapters, contributed by leading experts worldwide, contain
numerous illustrations, tables, and references. The book includes
the new USDA regulations for "certified organic" processing, as
well as state-of-the-art technologies for equipment both on the
farm and in the plant.
This book presents an overview of the physics of radiation
detection and its applications. It covers the origins and
properties of different kinds of ionizing radiation, their
detection and measurement, and the procedures used to protect
people and the environment from their potentially harmful effects.
It details the experimental techniques and instrumentation used in
different detection systems in a very practical way without
sacrificing the physics content. It provides useful formulae and
explains methodologies to solve problems related to radiation
measurements. With an abundance of worked-out examples and
end-of-chapter problems, this book enables the reader to understand
the underlying physical principles and their applications. Detailed
discussions on different detection media, such as gases, liquids,
liquefied gases, semiconductors, and scintillators make this book
an excellent source of information for students as well as
professionals working in related fields. Chapters on statistics,
data analysis techniques, software for data analysis, and data
acquisition systems provide the reader with necessary skills to
design and build practical systems and perform data analysis.
* Covers the modern techniques involved in detection and
measurement of radiation and the underlying physical principles
* Illustrates theoretical and practical details with an abundance
of practical, worked-out examples
* Provides practice problems at the end of each chapter
An integrative introduction to the theories and themes in research
on creativity, this book is both a reference work and text for
courses in this burgeoning area of research. The book begins with a
discussion of the theories of creativity (Person, Product, Process,
Place), the general question of whether creativity is influenced by
nature or nurture, what research has indicated of the personality
and style of creative individuals from a personality analysis
standpoint, how social context affects creativity, and then
coverage of issues like gender differences, whether creativity can
be enhanced, if creativity is related to poor mental or physical
health, etc.
The book contains boxes covering special interest items including
one page biographies of famous creative individuals and activities
for a group or individual to test and/or encourage creativity, as
well as references to internet sites relating to creativity.
*Breaks down the major theories about creativity but doesn't
restrict itself to a single perspective
*Includes extensive citations of existing literature
*Textbook features included (i.e., key terms defined)
Currently, few drugs are available for the effective treatment of
neurodegenerative diseases and neurodevelopmental disorders. Recent
advances in neuroscience research offer hope that future strategies
for treating these brain disorders will include neurogenesis and
neuroenhancement as therapeutic endpoints. This volume reviews
cutting-edge findings related to the pharmacological aspects of
neurogenesis and neuroprotection. A broad range of topics are
covered from basic lab bench research to drug discovery efforts and
important clinical issues. This collection of reviews is a perfect
way to become acquainted with these exciting new fields in the
space of a single volume. Chapters are written with a general
audience in mind, but with enough high-level discussion to appeal
to specialists and experts as well. The authors have done an
excellent job of challenging current paradigms and pushing the
boundaries of exploration in keeping with the pioneering spirit
that gave rise to these emerging areas of research. Consequently,
this will be an indispensable resource for many years to come.
* Provides state-of-the-art reviews spanning significant emerging
fields
* Discusses future directions and questions for future
studies
* Includes informative illustrations
Nuclear magnetic resonance (NMR) is an analytical tool used by
chemists and physicists to study the structure and dynamics of
molecules. In recent years, no other technique has grown to such
importance as NMR spectroscopy. It is used in all branches of
science where precise structural determination is required and
where the nature of interactions and reactions in solution is being
studied. Annual Reports on NMR has established itself as a premier
means for the specialist and nonspecialist alike to become familiar
with new techniques and applications of NMR spectroscopy.
* Includes comprehensive review articles on NMR Spectroscopy
* NMR is used in all branches of science
* No other technique has grown to such importance as NMR
Spectroscopy in recent years
The first edition of Sound and Structural Vibration was written in
the early 1980s. Since then, two major developments have taken
place in the field of vibroacoustics. Powerful computational
methods and procedures for the numerical analysis of structural
vibration, acoustical fields and acoustical interactions between
fluids and structures have been developed and these are now
universally employed by researchers, consultants and industrial
organisations. Advances in signal processing systems and
algorithms, in transducers, and in structural materials and forms
of construction, have facilitated the development of practical
means of applying active and adaptive control systems to structures
for the purposes of reducing or modifying structural vibration and
the associated sound radiation and transmission.
In this greatly expanded and extensively revised edition, the
authors have retained most of the analytically based material that
forms the pedagogical content of the first edition, and have
expanded it to present the theoretical foundations of modern
numerical analysis. Application of the latter is illustrated by
examples that have been chosen to complement the analytical
approaches to solving fairly simple problems of sound radiation,
transmission and fluid-structural coupling that are presented in
the first edition. The number of examples of experimental data that
relate to the theoretical content, and illustrate important
features of vibroacoustic interaction, has been augmented by the
inclusion of a selection from the vast amount of material published
during the past twenty five years. The final chapter on the active
control of sound and vibration has no precursor in the first
edition.
* Covers theoretical approaches to modeling and analysis
* Highly applicable to challenges in industry and academia
* For engineering students to use throughout their career
With Psycholinguistics in its fifth decade of existence, the second
edition of the Handbook of Psycholinguistics represents a
comprehensive survey of psycholinguistic theory, research and
methodology, with special emphasis on the very best empirical
research conducted in the past decade. Thirty leading experts have
been brought together to present the reader with both broad and
detailed current issues in Language Production, Comprehension and
Development.
The handbook is an indispensable single-source guide for
professional researchers, graduate students, advanced
undergraduates, university and college teachers, and other
professionals in the fields of psycholinguistics, language
comprehension, reading, neuropsychology of language, linguistics,
language development, and computational modeling of language. It
will also be a general reference for those in neighboring fields
such as cognitive and developmental psychology and education.
*Provides a complete account of psycholinguistic theory, research,
and methodology
*30 of the field's foremost experts have contributed to this
edition
*An invaluable single-source reference
Ian Sinclair's Practical Electronics Handbook combines a wealth of
useful day-to-day electronics information, concise explanations and
practical guidance in this essential companion to anyone involved
in electronics design and construction. The compact collection of
key data, fundamental principles and circuit design basics provides
an ideal reference for a wide range of students, enthusiasts,
technicians and practitioners of electronics who have progressed
beyond the basics.
The sixth edition is updated throughout with new material on
microcontrollers and computer assistance, and a new chapter on
digital signal processing.
. Invaluable handbook and reference for hobbyists, students and
technicians
. Essential day-to-day electronics information, clear explanations
and practical guidance in one compact volume
. Assumes some previous electronics knowledge but offers coverage
to interest beginners and professionals alike
This thematic issue devoted to 'Template Effects and Molecular
Organization' is based on a special symposium recently organized by
the American Chemical Society in Philadelphia. The authors
contributed oral presentations and are experts in their fields.
* Each chapter is fully referenced
* Contains comprehensive reviews written by leading experts in the
field
* Includes new information on the important advances in inorganic
and bioinorganic chemistry
This book describes, analyzes, and recommends traffic engineering
(TE) and quality of service (QoS) optimization methods for
integrated voice/data dynamic routing networks. These functions
control a network's response to traffic demands and other stimuli,
such as link failures or node failures. TE and QoS optimization is
concerned with measurement, modeling, characterization, and control
of network traffic, and the application of techniques to achieve
specific performance objectives. The scope of the analysis and
recommendations include dimensioning, call/flow and connection
routing, QoS resource management, routing table management, dynamic
transport routing, and operational requirements. Case studies are
included which provide the reader with a concrete way into the
technical details and highlight why and how to use the techniques
described in the book.
* Includes Case Studies of MPLS & GMPLS Network
Optimization
* Presents state-of-the-art traffic engineering and quality of
service optimization methods and illustrates the tradeoffs between
the various methods discussed
* Contains practical Case Studies based on large-scale service
provider implementations and architecture plans
* Written by a highly respected and well known active expert in
traffic engineering and quality of service
For a long time, conventional reliability analyses have been
oriented towards selecting the more reliable system and preoccupied
with maximising the reliability of engineering systems. On the
basis of counterexamples however, we demonstrate that selecting the
more reliable system does not necessarily mean selecting the system
with the smaller losses from failures! As a result, reliability
analyses should necessarily be risk-based, linked with the losses
from failures.
Accordingly, a theoretical framework and models are presented which
form the foundations of the reliability analysis and reliability
allocation linked with the losses from failures.
An underlying theme in the book is the basic principle for a
risk-based design: the larger the cost of failure associated with a
component, the larger its minimum necessary reliability level. Even
identical components should be designed to different reliability
levels if their failures are associated with different losses.
According to a classical definition, the risk of failure is a
product of the probability of failure and the cost given failure.
This risk measure however cannot describe the risk of losses
exceeding a maximum acceptable limit. Traditionally the losses from
failures have been 'accounted for' by the average production
availability (the ratio of the actual production capacity and the
maximum production capacity). As demonstrated in the book by using
a simple counterexample, two systems with the same production
availability can be characterised by very different losses from
failures.
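The counterexample described above (two systems with identical
production availability but very different losses from failures) can
be sketched numerically. All figures below are hypothetical
illustrations chosen for the sketch, not data from the book:

```python
# Classical risk measure: risk = probability of failure x cost given failure.
def classical_risk(p_failure, cost_given_failure):
    return p_failure * cost_given_failure

MAX_OUTPUT = 1000.0  # hypothetical maximum production capacity (units/year)

def availability(failures_per_year, lost_output_per_failure):
    """Production availability: actual capacity / maximum capacity."""
    actual = MAX_OUTPUT - failures_per_year * lost_output_per_failure
    return actual / MAX_OUTPUT

def expected_loss(failures_per_year, cost_per_failure):
    """Expected yearly monetary loss from failures."""
    return failures_per_year * cost_per_failure

# System A: frequent, cheap failures.  System B: rare, expensive failures.
avail_a = availability(10, 5.0)      # (1000 - 10*5) / 1000 = 0.95
avail_b = availability(1, 50.0)      # (1000 - 1*50) / 1000 = 0.95
loss_a = expected_loss(10, 1_000.0)  # frequent small losses
loss_b = expected_loss(1, 40_000.0)  # one large loss

print(avail_a, avail_b)  # identical production availability
print(loss_a, loss_b)    # very different losses from failures
```

The two systems are indistinguishable by availability alone, which
is why the book argues for a risk measure built on the distribution
of the potential losses rather than on availability ratios.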
As an alternative, a new aggregated risk measure based on the
cumulative distribution of the potential losses has been
introduced, and the theoretical framework for risk analysis based
on the concept of potential losses has also been developed. This new risk
measure incorporates the uncertainty associated with the exposure
to losses and the uncertainty in the consequences given the
exposure. For repairable systems with complex topology, the
distribution of the potential losses can be revealed by simulating
the behaviour of systems during their life-cycle. For this purpose,
fast discrete event-driven simulators are presented capable of
tracking the potential losses for systems with complex topology,
composed of a large number of components. The simulators are based
on new, very efficient algorithms for system reliability analysis
of systems comprising thousands of components.
An important theme in the book is the generic principles and
techniques for reducing technical risk. These have been classified
into three major categories: preventive (reducing the likelihood of
failure), protective (reducing the consequences from failure) and
dual (reducing both the likelihood and the consequences from
failure). Many of these principles (for example: avoiding
clustering of events, deliberately introducing weak links, reducing
sensitivity, introducing changes with opposite sign, etc.) are
discussed in the reliability literature for the first time.
Significant space has been allocated to component reliability. The
last chapter of the book discusses several applications of a
powerful equation which constitutes the core of a new theory of
locally initiated component failure by flaws whose number is a
random variable.
This book has been written with the intention to fill two big gaps
in the reliability and risk literature: the risk-based reliability
analysis as a powerful alternative to the traditional reliability
analysis and the generic principles for reducing technical risk. I
hope that the principles, models and algorithms presented in the
book will help to fill these gaps and make the book useful to
reliability and risk-analysts, researchers, consultants, students
and practising engineers.
- Offers a shift in the existing paradigm for conducting
reliability analyses.
- Covers risk-based reliability analysis and generic principles for
reducing risk.
- Provides a new measure of risk based on the distribution of the
potential losses from failure as well as the basic principles for
risk-based design.
- Incorporates fast algorithms for system reliability analysis and
discrete-event simulators.
- Includes the probability of failure of a structure with complex
shape expressed with a simple equation.
IPv6 was introduced in 1994 and has been in development at the
IETF for over 10 years. It has now reached the deployment stage.
KAME, the de-facto open-source reference implementation of the IPv6
standards, played a significant role in the acceptance and the
adoption of the IPv6 technology. The adoption of KAME by key
companies in a wide spectrum of commercial products is a
testimonial to the success of the KAME project, which concluded not
long ago.
This book is the first and only one of its kind: it reveals all
of the details of the KAME IPv6 protocol stack, explaining exactly
what every line of code does and why it was designed that way.
Through the dissection of both the code and its
design, the authors illustrate how IPv6 and its related protocols
have been interpreted and implemented from the specifications. This
reference will demystify those ambiguous areas in the standards,
which are open to interpretation and problematic in deployment, and
presents solutions offered by KAME in dealing with these
implementation challenges.
* Covers a snapshot version of KAME dated April 2003, based on
FreeBSD 4.8
* Extensive line-by-line code listings with meticulous explanation
of their rationale and use for the KAME snapshot implementation,
which is generally applicable to most recent versions of the KAME
IPv6 stack, including those in recent releases of BSD variants
* Numerous diagrams and illustrations help in visualizing the
implementation
* In-depth discussion of the standards provides intrinsic
understanding of the specifications
Gastroesophageal Reflux Disease (GERD) is one of the most common
maladies of mankind. Approximately 40% of the adult population of
the USA suffers from significant heartburn, and the numerous
antacids advertised incessantly on national television represent an
$8 billion-per-year drug market. The ability to control acid
secretion with increasingly effective acid-suppressive agents
such as the H2 blockers (Pepcid, Zantac) and proton pump
inhibitors (Nexium, Prevacid) has given physicians an excellent
method of treating the symptoms of acid reflux.
Unfortunately, this has not eradicated reflux disease. It has just
changed its nature. While heartburn, ulceration and strictures have
become rare, reflux-induced adenocarcinoma of the esophagus is
becoming increasingly common. Adenocarcinoma of the esophagus and
gastric cardia is now the most rapidly increasing cancer type in
the Western world.
The increasing incidence of esophageal adenocarcinoma has created
an enormous interest and stimulus for research in this area. GERD
brings together a vast amount of disparate literature and presents
the entire pathogenesis of reflux disease in one place. In addition
to providing a new concept of how gastroesophageal reflux causes
cellular changes in the esophagus, GERD also offers a complete
solution to a problem that has confused physicians for over a
century. Both clinical and pathological information about reflux
disease and its treatment are presented. GERD is meant to be used
as a comprehensive reference for gastroenterologists, esophageal
surgeons, and pathologists alike.
*Outlines how gastroesophageal reflux causes cellular changes in
the esophagus
*Brings together the pathogenesis of the disease in one source and
applies it toward clinical treatment
*Tom DeMeester is THE leading international expert on reflux
disease; Parakrama Chandrasoma is one of the leading pathologists
in the area
*Book contains approximately 350 illustrations
*Ancillary web site features color illustrations:
www.chandrasoma.com
The most significant articles from each of the fields represented
at the conference on Work with Display Units 1992 are presented in
this volume. The topics include:
- The newest occupational health research results, partially backed
by extensive epidemiological studies, concerning radiation, eye
fatigue, stressors and diseases of the musculoskeletal apparatus
and the medical surveillance of workers with display units.
- Ergonomic studies pertaining to design improvements.
- Support of the human operator by intelligent user interfaces,
demonstrated by experimental results.
- The latest developments in input devices including virtual
reality as well as multi-media, multi-screen and multi-language
human computer interaction.
- Concepts of group work and organizational stress in combination
with its psychophysical and psychophysiological evaluation,
especially the problem of stress-strain regulation, adaptation and
long-term effects on recovery and recreation.
- Discussion of hypertext and hypermedia as well as applications of
computer aided techniques for knowledge acquisition, knowledge
structuring and knowledge use for decisions and working processes
in different contexts.
- New organizational concepts such as Chaos Theory and new methods
of work design in industry such as Kansei Engineering.
- Discussion of the International Standardization and EC
regulations.
Microirrigation has become the fastest growing segment of the
irrigation industry worldwide and has the potential to increase the
quality of the food supply through improved water and fertilizer
efficiency. This book is meant to update the text "Trickle
Irrigation, Design, Operation and Management." This text offers the
most current understanding of the management criteria needed to
obtain maximum water and fertilization efficiency.
* Presents a detailed explanation of system design, operation, and
management specific to various types of MI systems
* Analyzes the proper use of irrigation technology and its effect
on increasing efficiency
* Provides an understanding of the basic science needed to
comprehend operation and management
* Over 150 figures of designs and charts of systems, including
surface drip, subsurface drip, spray/microsprinkler, and more
Customizable processors have been described as the next natural
step in the evolution of the microprocessor business: a step in the
life of a new technology where top performance alone is no longer
sufficient to guarantee market success. Other factors become
fundamental, such as time to market, convenience, energy
efficiency, and ease of customization.
This book is the first to explore comprehensively one of the most
fundamental trends which emerged in the last decade: to treat
processors not as rigid, fixed entities, which designers include
"as is" in their products; but rather, to build sound methodologies
to tailor-fit processors to the specific needs of such products.
This book addresses the goal of maintaining a very large family of
processors, with a wide range of features, at a cost comparable to
that of maintaining a single processor.
- First book to present comprehensively the major ASIP design
methodologies and tools without any particular bias.
- Written by most of the pioneers and top international experts of
this young domain.
- Unique mix of management perspective, technical detail, research
outlook, and practical implementation.
The design of today's semiconductor chips for applications such as
telecommunications poses significant challenges due to the
complexity of these systems. These highly complex systems-on-chips
demand new approaches to connect and manage the communication
between on-chip processing and storage components, and networks on
chips (NoCs) provide a powerful solution.
This book is the first to provide a unified overview of NoC
technology. It includes in-depth analysis of all the on-chip
communication challenges, from physical wiring implementation up to
software architecture, and a complete classification of the
various Network-on-Chip approaches and solutions.
* Leading-edge research from world-renowned experts in academia and
industry with state-of-the-art technology
implementations/trends
* An integrated presentation not currently available in any other
book
* A thorough introduction to current design methodologies and chips
designed with NoCs
This book is a comprehensive guide to new DFT methods that will
show the readers how to design a testable and quality product,
drive down test cost, improve product quality and yield, and speed
up time-to-market and time-to-volume.
. Most up-to-date coverage of design for testability.
. Coverage of industry practices commonly found in commercial DFT
tools but not discussed in other books.
. Numerous, practical examples in each chapter illustrating basic
VLSI test principles and DFT architectures.
. Lecture slides and exercise solutions for all chapters are now
available.
. Instructors are also eligible to download PPT slide files and
MS Word solution files from the manual's website.
Gene regulatory networks are the most complex, extensive control
systems found in nature. The interaction between biology and
evolution has been the subject of great interest in recent years.
The author, Eric Davidson, has been instrumental in elucidating
this relationship. He is a world renowned scientist and a major
contributor to the field of developmental biology.
The Regulatory Genome beautifully explains the control of animal
development in terms of structure/function relations of inherited
regulatory DNA sequence, and the emergent properties of the gene
regulatory networks composed of these sequences. New insights into
the mechanisms of body plan evolution are derived from
considerations of the consequences of change in developmental gene
regulatory networks. Examples of crucial evidence underscore each
major concept. The clear writing style explains regulatory
causality without requiring a sophisticated background in
descriptive developmental biology. This unique text supersedes
anything currently available in the market.
* The only book in the market that is solely devoted to the genomic
regulatory code for animal development
* Written at a conceptual level, including many novel synthetic
concepts that ultimately simplify understanding
* Presents a comprehensive treatment of molecular control elements
that determine the function of genes
* Provides a comparative treatment of development, based on
principles rather than description of developmental processes
* Considers the evolutionary processes in terms of the structural
properties of gene regulatory networks
* Includes 42 full-color descriptive figures and diagrams
Calculation and optimisation of flight performance is required to
design or select new aircraft, efficiently operate existing
aircraft, and upgrade aircraft. It provides critical data for
aircraft certification, accident investigation, fleet management,
flight regulations and safety. This book presents an unrivalled
range of advanced flight performance models for both transport and
military aircraft, including the unconventional ends of the
envelopes. Topics covered include the numerical solution of
supersonic acceleration, transient roll, optimal climb of propeller
aircraft, propeller performance, long-range flight with en-route
stop, fuel planning, zero-gravity flight in the atmosphere, VSTOL
operations, ski jump from aircraft carrier, optimal flight paths at
subsonic and supersonic speed, range-payload analysis of fixed- and
rotary wing aircraft, performance of tandem helicopters,
lower-bound noise estimation, sonic boom, and more. This book will
be a valuable text for undergraduate and post-graduate level
students of aerospace engineering. It will also be an essential
reference and resource for practicing aircraft engineers, aircraft
operations managers and organizations handling air traffic control,
flight and flying regulations, standards, safety, environment, and
the complex financial aspects of flying aircraft.
This is a hands-on reference guide for the maintenance or
reliability engineer and plant manager. As the third volume in the
"Life Cycle Engineering" series, this book takes the guiding
principles of Lean Manufacturing and Maintenance and applies these
concepts to everyday planning and scheduling tasks allowing
engineers to keep their equipment running smoothly, while
decreasing downtime. The authors offer invaluable advice on the
effective use of work orders and schedules and how they fit into
the overall maintenance plan.
There are not many books out there on planning and scheduling that
go beyond the theory and show the engineer, in a hands-on way, how
to use planning and scheduling techniques to improve performance,
cut costs, and extend the life of their plant machinery.
* The only book that takes a direct look at streamlining planning
and scheduling for a Lean Manufacturing Environment
* This book shows the engineer how to create and stick to effective
schedules
* Gives examples and templates in the back of the book for use in
day-to-day scheduling and calculations
Metallic nanoparticles display fascinating properties that are
quite different from those of individual atoms, surfaces or bulk
materials. They are a focus of interest for fundamental science
and, because of their huge potential in nanotechnology, they are
the subject of intense research effort in a range of disciplines.
Applications, or potential applications, are diverse and
interdisciplinary. They include, for example, use in biochemistry,
in catalysis, and as chemical and biological sensors; as systems
for nanoelectronics and nanostructured magnetism (e.g. data storage
devices), where the drive for further miniaturization poses
tremendous technological challenges; and, in medicine, as potential
agents for drug delivery.
The book describes the structure of metallic nanoparticles, the
experimental and theoretical techniques by which this is
determined, and the models employed to facilitate understanding.
The various methods for the production of nanoparticles are
outlined. It surveys the properties of clusters and the methods of
characterisation, such as photoionization, optical spectroscopy,
chemical reactivity and magnetic behaviour, and discusses
element-specific information that can be extracted by
synchrotron-based techniques such as EXAFS, XMCD and XMLD. The
properties of clusters can vary depending on whether they are free,
deposited on a surface or embedded in a matrix of another material;
these issues are explored. Clusters on a surface can be formed by
the diffusion and aggregation of atoms; ways of modelling these
processes are described. Finally we look at nanotechnology and
examine the science behind the potential of metallic nanoparticles
in chemical synthesis, catalysis, the magnetic separation of
biomolecules, the detection of DNA, the controlled release of
molecules and their relevance to data storage.
The book addresses a wide audience. The subject developed rapidly
from the mid-1980s, when researchers began to study the properties
of free nanoparticles and models were developed to describe the
observations. The newcomer is introduced to the established models
and techniques of the field, making the material accessible without
the need to refer to other sources. It then
takes the reader through to the latest research and provides a
comprehensive list of references for those who wish to pursue
particular aspects in more detail. It will also be an invaluable
handbook for the expert in a particular aspect of nanoscale
research who wishes to acquire knowledge of other areas.
The authors are specialists in different aspects of the subject
with expertise in physics and chemistry, experimental techniques
and computational modelling, and in interdisciplinary research.
They have collaborated in research. They have also collaborated in
writing this book, with the aim from the outset of making it a
coherent whole rather than a series of independent, loosely
connected articles.
* Appeals to a wide audience
* Provides an introduction to established models and techniques in
the field
* Comprehensive list of references