Econometric models are made up of assumptions which never exactly
match reality. Among the most contested is the requirement
that the coefficients of an econometric model remain stable over
time. Recent years have therefore seen numerous attempts to test
for such stability, or to model possible structural change when it
can no longer be ignored. This collection of papers from Empirical
Economics mirrors part of this development. The point of departure
of most studies in this volume is the standard linear regression
model y_t = x_t'β_t + u_t (t = 1, ..., T), where the notation is
obvious and where the index t emphasises the fact that structural
change is mostly discussed and encountered in a time series
context. It is much less of a problem for cross-section data,
although many tests apply there as well. The null hypothesis of
most tests for structural change is that β_t = β_0 for all t, i.e.
that the same regression applies to all time periods in the sample
and that the disturbances u_t are well behaved. The well-known Chow
test, for instance, assumes that there is a single structural shift
at a known point in time, i.e. that β_t = β_0 (t < t*) and
β_t = β_0 + Δβ (t ≥ t*), where t* is known.
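The Chow test just described can be sketched in a few lines of Python. This is a minimal illustration with made-up data, not the volume's own code: the function names, the deterministic "noise", and the break point t* = 10 are all hypothetical, and only a single regressor plus intercept (k = 2 parameters) is handled.

```python
def ssr_simple_ols(x, y):
    """Sum of squared residuals from an OLS fit of y on a constant and x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope estimate
    a = my - b * mx        # intercept estimate
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def chow_statistic(x, y, t_star, k=2):
    """Chow F statistic for a single break at a known point t_star."""
    ssr_pooled = ssr_simple_ols(x, y)            # restricted: one regression
    ssr_u = (ssr_simple_ols(x[:t_star], y[:t_star])   # unrestricted:
             + ssr_simple_ols(x[t_star:], y[t_star:]))  # separate fits
    n = len(x)
    return ((ssr_pooled - ssr_u) / k) / (ssr_u / (n - 2 * k))

# Illustrative data with a deliberate break at t* = 10: both the
# intercept and the slope shift, plus a small deterministic perturbation.
x = [float(t) for t in range(20)]
noise = [(((3 * t) % 7) - 3) * 0.05 for t in range(20)]
y = [(1.0 + 0.5 * x[t] if t < 10 else 4.0 + 1.5 * x[t]) + noise[t]
     for t in range(20)]

f_stat = chow_statistic(x, y, t_star=10)
print(f_stat > 10.0)   # True: the break is easily detected
```

Under the null of no break, the statistic follows an F distribution with k and n - 2k degrees of freedom; here the fabricated break makes the statistic far exceed any conventional critical value.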
Scan 2000, the GAMM-IMACS International Symposium on Scientific
Computing, Computer Arithmetic, and Validated Numerics, and
Interval 2000, the International Conference on Interval Methods in
Science and Engineering, were jointly held in Karlsruhe, September
19-22, 2000. The joint conference continued the series of seven
previous SCAN symposia under the joint sponsorship of GAMM and
IMACS. These conferences have traditionally covered the numerical
and algorithmic aspects of scientific computing, with a strong
emphasis on validation and verification of computed results as well
as on arithmetic, programming, and algorithmic tools for this
purpose. The conference further continued the series of four former
Interval conferences focusing on interval methods and their
application in science and engineering. The objectives are to
propagate current applications and research as well as to promote a
greater understanding and increased awareness of the subject
matters. The symposium was held in Karlsruhe, the European cradle
of interval arithmetic and self-validating numerics, and attracted
193 researchers from 33 countries; 12 invited and 153 contributed
talks were given. But it was not only the quantity that was
overwhelming: we were also deeply impressed by the emerging
maturity of our discipline. There were many talks discussing a wide
variety of serious applications stretching all parts of
mathematical modelling. New efficient, publicly available or even
commercial tools were proposed or presented, and the foundations of
the theory of intervals and reliable computations were also
considerably strengthened.
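The self-validating numerics at the heart of these conferences rest on enclosure: every operation returns an interval guaranteed to contain the exact real result. A minimal, hypothetical Python sketch follows; it uses one-ulp outward rounding via `math.nextafter` (coarser than the directed hardware rounding a real interval library would use) and implements only addition.

```python
import math

class Interval:
    """Closed interval [lo, hi]; a minimal sketch of validated arithmetic."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Nudge the lower bound down and the upper bound up by one ulp,
        # covering any rounding error in the float additions.
        return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                        math.nextafter(self.hi + other.hi, math.inf))

# The real number 0.1 is not representable in binary floating point,
# so enclose it in a tiny interval around the nearest double.
tenth = Interval(math.nextafter(0.1, -math.inf),
                 math.nextafter(0.1, math.inf))

total = Interval(0.0, 0.0)
for _ in range(10):
    total = total + tenth

# The enclosure provably contains the exact sum 1.0, even though the
# plain floating-point sum of ten 0.1s misses it.
print(total.lo <= 1.0 <= total.hi)   # True
print(sum([0.1] * 10) == 1.0)        # False
```

The point of the sketch is the guarantee, not the tightness: a production implementation would also bound multiplication, division, and elementary functions, and would keep the intervals as narrow as the rounding allows.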
This Festschrift is dedicated to Goetz Trenkler on the occasion of
his 65th birthday. As can be seen from the long list of
contributions, Goetz has had and still has an enormous range of
interests, and colleagues to share these interests with. He is a
leading expert in linear models with a particular focus on matrix
algebra in its relation to statistics. He has published in almost
all major statistics and matrix theory journals. His research
activities also include other areas (like nonparametrics,
statistics and sports, combination of forecasts, and magic squares,
just to mention a few). Goetz Trenkler was born in Dresden in 1943.
After his school years in East Germany and West Berlin, he obtained
a Diploma in Mathematics from the Free University of Berlin (1970),
where he also discovered his interest in Mathematical Statistics.
In 1973, he completed his Ph.D. with a thesis titled: On a
distance-generating function of probability measures. He then moved
on to the University of Hannover to become Lecturer and to write a
habilitation thesis (submitted 1979) on alternatives to the
Ordinary Least Squares estimator in the Linear Regression Model, a
topic that would become his predominant field of research in the
years to come.
Numerical Validation in Current Hardware Architectures - International Dagstuhl Seminar, Dagstuhl Castle, Germany, January 6-11, 2008, Revised Papers (Paperback, 2009 ed.)
Annie A. M. Cuyt, Walter Kramer, Wolfram Luther, Peter Markstein
The major emphasis of the Dagstuhl Seminar on "Numerical Validation
in Current Hardware Architectures" lay on numerical validation in
current hardware architectures and software environments. The
general idea was to bring together experts who are concerned with
computer arithmetic in systems with actual processor architectures
and scientists who develop, use, and need techniques from verified
computation in their applications. Topics of the seminar therefore
included:
- the ongoing revision of the IEEE 754/854 standard for
floating-point arithmetic
- feasible ways to implement multiple-precision (multiword)
arithmetic and to compute the actual precision at run time
according to the needs of the input data
- the achievement of similar behaviour of fixed-point,
floating-point, and interval arithmetic across language-compliant
implementations
- the design of robust and efficient numerical programs portable
from diverse computers to those that adhere to the IEEE standard
- the development and propagation of validated special-purpose
software in different application areas
- error analysis in several contexts
- certification of numerical programs, verification and validation
assessment
Computer arithmetic plays an important role at the hardware and
software level when microprocessors, embedded systems, or grids are
designed. The reliability of numerical software strongly depends on
compliance with the corresponding floating-point norms. Standard
CISC processors follow the 1985 IEEE norm 754, which is currently
under revision, but the new high-performance Cell processor is not
fully IEEE compliant.
This anthology shows, through selected examples, how exciting and
varied statistical research can be. Whether the aim is to give
hearing-impaired people a good musical experience, to extract
meaningful quantitative data from texts, or to model flood
disasters and thereby get them better under control: most of the
insights presented in this book cannot be found in textbooks; they
come straight from the research front and invite the reader to
marvel and discover. Technical jargon and formalism are avoided as
far as possible in the presentation, so the book is aimed at anyone
interested in current research in applied statistics. It offers an
unobstructed view of a thoroughly fascinating science, without the
individual analyses having to be followed in every detail. The book
can help students develop enthusiasm for statistical questions and
methods, or even provide inspiration for their own careers. A large
part of the contributions originated at the Faculty of Statistics
of TU Dortmund, the only independent statistics faculty in the
entire German-speaking world, as well as within DFG collaborative
research centres (Sonderforschungsbereiche) attached to this
faculty.
Over the years, the SAS software package has established itself as
a standard program for statistical data analysis. A confident
command of statistical methods and their practical implementation
in SAS therefore offers an invaluable advantage in the data
analyst's daily work. In this book, the reader first learns the
basics of programming. A wide selection of statistical procedures
and their implementation as SAS programs is then presented, with
close attention paid to the graphical aspects of statistical data
analysis. An additional part on programming with IML and macros, as
well as on helpful assistants in SAS, rounds off the presentation.
With its comprehensive choice of topics, the book is suitable as an
introduction, but also as a reference work for the more advanced
reader.
This volume offers a generally accessible overview of 100 years of
the Deutsche Statistische Gesellschaft (DStatG). In 17 chapters,
recognised experts describe how the DStatG has contributed to the
foundation and further development of German economic and social
statistics and to methodological innovations such as newer
time-series, price-index, and sampling methods. Further topics are
the role of the DStatG in merging East and West German statistics,
as well as the preparation and execution of the last and the
current census.
The book takes the current economic crisis as an occasion to ask
about the quality of the press, in particular of daily newspapers.
At the heart of the book are current examples illustrating that
journalists too often ignore readers' needs and produce
hard-to-understand texts or headlines that frequently are not even
interesting. Anyone who has ever wanted to know why business
articles put them off will find answers here. The two authors point
out common selection errors and criticise the tendency to dramatise
and the primacy of entertainment in journalism, as well as the
technical gibberish and bureaucratic German rampant in the business
press. The authors see the book as a plea for good journalism, so
that disenchantment with the media does not spread even further.
Karsten Webel provides an overview of the explanations for long
memory established in econometrics and shows for the first time
that a process which is almost unknown in this field can, under
certain conditions, also generate long memory.
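"Long memory" means that autocorrelations decay hyperbolically rather than geometrically. A small, self-contained Python sketch (hypothetical parameter d = 0.4, not taken from the book) contrasts the theoretical autocorrelations of fractional noise, ARFIMA(0, d, 0), with an AR(1) process matched to the same lag-1 autocorrelation.

```python
# Theoretical autocorrelations of fractional noise with memory
# parameter d, via the recursion rho(k) = rho(k-1) * (k - 1 + d) / (k - d).
d = 0.4
rho_frac = [1.0]
for k in range(1, 101):
    rho_frac.append(rho_frac[-1] * (k - 1 + d) / (k - d))

# AR(1) with the same lag-1 autocorrelation phi = rho(1) = d / (1 - d).
phi = rho_frac[1]
rho_ar1 = [phi ** k for k in range(101)]

# Long memory: the fractional autocorrelations decay like k**(2d - 1)
# and are still sizable at lag 100, while the AR(1) autocorrelations
# have decayed geometrically to essentially zero.
print(rho_frac[100] > 0.2)    # True
print(rho_ar1[100] < 1e-10)   # True
```

The same slow hyperbolic decay is the signature that the explanatory mechanisms surveyed in the monograph are trying to account for.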
Only Walter Krämer can explain statistics and its application in
our lives briefly, concisely, and understandably in 101 keywords.
Whether it concerns the composition of the unemployment rate, share
prices, election forecasts, the intelligence quotient, police crime
statistics, or clinical trials and big data: the reader receives
exactly the information needed to deal sensibly with statistics in
everyday life. For this, one need not be a whiz at arithmetic or
have studied mathematics. Common sense and the willingness to face
the facts without prejudice are entirely sufficient to learn to
appreciate the art of statistics: separating appearance from
reality and finding the needle in the haystack. This book is an
equally understandable, fascinating, amusing, and helpful guide to
our daily dealings with statistics, for only those who understand
can join the conversation and expose what is false.