
Maximum Penalized Likelihood Estimation - Volume II: Regression (Paperback, 2009 ed.)
Paul P. Eggermont, Vincent N. Lariccia
R3,412 Discovery Miles 34 120 Ships in 18 - 22 working days

Unique blend of asymptotic theory and small sample practice through simulation experiments and data analysis.

Novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines.

Exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order.
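
As a rough illustration of the smoothing-spline regression described above (not the book's reproducing kernel Hilbert space or Kalman-filter machinery), the following minimal Python sketch fits a cubic smoothing spline to simulated noisy data; the data and smoothing parameter are made up for the example.

```python
# Minimal, assumed illustration of smoothing-spline regression.
# scipy's UnivariateSpline is used as a stand-in; it is not the book's
# Kalman-filter algorithm, and the smoothing parameter s is chosen ad hoc.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)  # noisy mean function

spline = UnivariateSpline(x, y, k=3, s=len(x) * 0.3**2)  # cubic smoothing spline
y_hat = spline(x)                                        # estimated mean function
print("residual sum of squares:", float(np.sum((y - y_hat) ** 2)))
```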

Econometric Modeling in Economic Education Research (Paperback, Softcover reprint of the original 1st ed. 1987)
William E. Becker Jr., Rolf A. Walstad
R4,006 Discovery Miles 40 060 Ships in 18 - 22 working days

Since its establishment in the 1950s the American Economic Association's Committee on Economic Education has sought to promote improved instruction in economics and to facilitate this objective by stimulating research on the teaching of economics. These efforts are most apparent in the sessions on economic education that the Committee organizes at the Association's annual meetings. At these sessions economists interested in economic education have opportunities to present new ideas on teaching and research and also to report the findings of their research. The record of this activity can be found in the Proceedings of the American Economic Review. The Committee on Economic Education and its members have been actively involved in a variety of other projects. In the early 1960s it organized the National Task Force on Economic Education that spurred the development of economics teaching at the precollege level. This in turn led to the development of a standardized research instrument, a high school test of economic understanding. This was followed later in the 1960s by the preparation of a similar test of understanding college economics. The development of these two instruments greatly facilitated research on the impact of economics instruction, opened the way for application of increasingly sophisticated statistical methods in measuring the impact of economic education, and initiated a steady stream of research papers on a subject that previously had not been explored.

Spatial Statistics and Modeling (Paperback, 2010 ed.)
Carlo Gaetan, Xavier Guyon
R4,696 Discovery Miles 46 960 Ships in 18 - 22 working days

Spatial statistics are useful in subjects as diverse as climatology, ecology, economics, environmental and earth sciences, epidemiology, image analysis and more. This book covers the best-known spatial models for three types of spatial data: geostatistical data (stationarity, intrinsic models, variograms, spatial regression and space-time models), areal data (Gibbs-Markov fields and spatial auto-regression) and point pattern data (Poisson, Cox, Gibbs and Markov point processes). The level is relatively advanced, and the presentation concise but complete.

The most important statistical methods and their asymptotic properties are described, including estimation in geostatistics, autocorrelation and second-order statistics, maximum likelihood methods, approximate inference using the pseudo-likelihood or Monte-Carlo simulations, statistics for point processes and Bayesian hierarchical models. A chapter is devoted to Markov Chain Monte Carlo simulation (Gibbs sampler, Metropolis-Hastings algorithms and exact simulation).
A large number of real examples are studied with R, and each chapter ends with a set of theoretical and applied exercises. While a foundation in probability and mathematical statistics is assumed, three appendices introduce some necessary background. The book is accessible to senior undergraduate students with a solid math background and Ph.D. students in statistics. Furthermore, experienced statisticians and researchers in the above-mentioned fields will find the book valuable as a mathematically sound reference.

This book is the English translation of Modélisation et Statistique Spatiales published by Springer in the series Mathématiques & Applications, a series established by the Société de Mathématiques Appliquées et Industrielles (SMAI).
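
The description above mentions variograms for geostatistical data. As a hedged illustration (not code from the book, which works in R), here is a minimal Python sketch of the classical empirical variogram with simple distance binning; the site locations and values are simulated placeholders.

```python
# Assumed sketch: classical (Matheron) empirical variogram for 2-D
# geostatistical data, with simple distance binning. Not code from the book.
import numpy as np

def empirical_variogram(coords, values, n_bins=15):
    """coords: (n, 2) site locations; values: (n,) observations."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))              # pairwise distances
    sq = (values[:, None] - values[None, :]) ** 2    # squared value differences
    iu = np.triu_indices(len(values), k=1)           # each pair counted once
    d, s = dist[iu], sq[iu]
    edges = np.linspace(0, d.max(), n_bins + 1)
    centers, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (d >= lo) & (d < hi)
        if m.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(0.5 * s[m].mean())          # semivariance in the bin
    return np.array(centers), np.array(gamma)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(100, 2))
values = np.sin(coords[:, 0]) + rng.normal(scale=0.2, size=100)
lags, semivariances = empirical_variogram(coords, values)
```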

Survival Analysis: State of the Art (Paperback, Softcover reprint of hardcover 1st ed. 1992)
John P. Klein, P.K. Goel
R2,698 Discovery Miles 26 980 Ships in 18 - 22 working days

Survival analysis is a highly active area of research with applications spanning the physical, engineering, biological, and social sciences. In addition to statisticians and biostatisticians, researchers in this area include epidemiologists, reliability engineers, demographers and economists. Economists know survival analysis by the names of duration analysis and the analysis of transition data. We attempted to bring together leading researchers, with a common interest in developing methodology in survival analysis, at the NATO Advanced Research Workshop. The research works collected in this volume are based on the presentations at the Workshop. Analysis of survival experiments is complicated by issues of censoring, where only partial observation of an individual's life length is available, and left truncation, where individuals enter the study group only if their life lengths exceed a given threshold time. Application of the theory of counting processes to survival analysis, as developed by the Scandinavian School, has allowed for substantial advances in the procedures for analyzing such experiments. The increased use of computer-intensive solutions to inference problems in survival analysis, in both the classical and Bayesian settings, is also evident throughout the volume. Several areas of research have received special attention in the volume.
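
The censoring issue mentioned above can be made concrete with the standard Kaplan-Meier estimator. The following bare-bones Python sketch is only an assumed illustration, not material from the volume; real analyses would rely on a dedicated survival library.

```python
# Assumed illustration of right censoring: a bare-bones Kaplan-Meier estimator.
import numpy as np

def kaplan_meier(times, events):
    """times: observed times; events: 1 = event observed, 0 = right-censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, at_risk = 1.0, len(times)
    curve = []
    for t in np.unique(times):                       # unique times in increasing order
        d = int(((times == t) & (events == 1)).sum())  # events at t
        n_t = int((times == t).sum())                  # subjects leaving the risk set at t
        if d > 0:
            surv *= 1.0 - d / at_risk
            curve.append((float(t), surv))
        at_risk -= n_t
    return curve

print(kaplan_meier([2, 3, 3, 5, 8, 8, 9], [1, 1, 0, 1, 0, 1, 1]))
```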

The Measurement of Efficiency of Production (Paperback, Softcover reprint of the original 1st ed. 1985)
Rolf Fare, Shawna Grosskopf, C.A. Knox Lovell
R4,059 Discovery Miles 40 590 Ships in 18 - 22 working days

European Regional Growth (Paperback, Softcover reprint of hardcover 1st ed. 2003)
Bernard Fingleton
R4,179 Discovery Miles 41 790 Ships in 18 - 22 working days

European Regional Growth is the result of three major influences. First, the ongoing integration of the European regional economies and the need to understand what this means for European economic and social cohesion. Second, the development of geo-economic theories. Third, the development of techniques of spatial data analysis, simulation, data visualization and spatial econometrics. The outcome is a collection of chapters that apply these methods, motivated by a variety of theoretical positions. The book provides powerful and detailed analyses of the causes of income, productivity and employment variations across Europe's regions, and insights into their future prospects.

Elicitation of Preferences (Paperback, Softcover reprint of hardcover 1st ed. 2000)
Baruch Fischhoff, Charles F. Manski
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

Economists and psychologists have, on the whole, exhibited sharply different perspectives on the elicitation of preferences. Economists, who have made preference the central primitive in their thinking about human behavior, have for the most part rejected elicitation and have instead sought to infer preferences from observations of choice behavior. Psychologists, who have tended to think of preference as a context-determined subjective construct, have embraced elicitation as their dominant approach to measurement. This volume, based on a symposium organized by Daniel McFadden at the University of California at Berkeley, provides a provocative and constructive engagement between economists and psychologists on the elicitation of preferences.

Hidden Markov Models - Applications to Financial Economics (Paperback, Softcover reprint of the original 1st ed. 2004)
Ramaprasad Bhar, Shigeyuki Hamori
R2,623 Discovery Miles 26 230 Ships in 18 - 22 working days

Markov chains have increasingly become a useful way of capturing the stochastic nature of many economic and financial variables. Although hidden Markov processes have been widely employed for some time in many engineering applications, e.g. speech recognition, their effectiveness has now been recognized in areas of social science research as well. The main aim of Hidden Markov Models: Applications to Financial Economics is to make such techniques available to more researchers in financial economics. As such, we cover only the necessary theoretical aspects in each chapter while focusing on real-life applications using contemporary data, mainly from the OECD group of countries. The underlying assumption is that researchers in financial economics would be familiar with such applications, although their empirical techniques would be more traditional econometrics. Keeping the applications at a familiar level, we focus on the methodology based on hidden Markov processes. This will, we believe, help the reader develop a more in-depth understanding of the modeling issues, thereby benefiting their future research.
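
To make the hidden Markov setup concrete, here is a hedged Python sketch of the forward algorithm evaluating the log-likelihood of a return series under a two-state Gaussian regime model. All parameters and the return series are illustrative assumptions, not estimates or code from the book.

```python
# Assumed sketch: scaled forward algorithm for a 2-state Gaussian HMM,
# evaluating the log-likelihood of a (placeholder) return series.
import numpy as np
from scipy.stats import norm

def hmm_loglik(x, pi, A, means, sds):
    """x: observations; pi: initial state probs; A: transition matrix; means/sds: per state."""
    B = norm.pdf(x[:, None], loc=means, scale=sds)   # emission densities, shape (T, K)
    alpha = pi * B[0]
    loglik = 0.0
    for t in range(len(x)):
        if t > 0:
            alpha = (alpha @ A) * B[t]               # predict with A, then update with B[t]
        c = alpha.sum()                              # scaling factor
        loglik += np.log(c)
        alpha /= c
    return loglik

rng = np.random.default_rng(2)
returns = rng.normal(0, 1, size=250)                 # placeholder return series
pi = np.array([0.5, 0.5])
A = np.array([[0.95, 0.05], [0.10, 0.90]])           # persistent regimes (assumed)
print(hmm_loglik(returns, pi, A, means=np.array([0.05, -0.10]), sds=np.array([0.8, 2.0])))
```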

Industrial Price, Quantity, and Productivity Indices - The Micro-Economic Theory and an Application (Paperback, Softcover reprint of hardcover 1st ed. 1998)
Bert M. Balk
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

Industrial Price, Quantity, and Productivity Indices: The Micro-Economic Theory and an Application gives a comprehensive account of the micro-economic foundations of industrial price, quantity, and productivity indices. The various results available from the literature have been brought together into a consistent framework, based upon modern duality theory. This integration also made it possible to generalize several of these results. Thus, this book will be an important resource for theoretically as well as empirically-oriented researchers who seek to analyse economic problems with the help of index numbers. Although this book's emphasis is on micro-economic theory, it is also intended as a practical guide. A full chapter is therefore devoted to an empirical application. Three different approaches are pursued: a straightforward empirical approach, a non-parametric estimation approach, and a parametric estimation approach. As well as illustrating some of the more important concepts explored in this book, and showing to what extent different computational approaches lead to different outcomes for the same measures, this chapter also makes a powerful case for the use of enterprise micro-data in economic research.
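
For readers new to index numbers, the following short Python sketch computes the textbook Laspeyres, Paasche and Fisher price indices for two periods. These are the standard formulas only, not the book's duality-based industrial measures, and the price/quantity vectors are invented for illustration.

```python
# Assumed sketch: textbook Laspeyres, Paasche and Fisher price indices.
import numpy as np

def price_indices(p0, q0, p1, q1):
    """p0, q0: base-period prices/quantities; p1, q1: comparison period."""
    laspeyres = np.dot(p1, q0) / np.dot(p0, q0)   # base-period quantity weights
    paasche = np.dot(p1, q1) / np.dot(p0, q1)     # comparison-period quantity weights
    fisher = np.sqrt(laspeyres * paasche)         # geometric mean of the two
    return laspeyres, paasche, fisher

p0, q0 = np.array([1.0, 2.0, 3.0]), np.array([10.0, 5.0, 2.0])
p1, q1 = np.array([1.1, 2.5, 2.8]), np.array([9.0, 6.0, 3.0])
print(price_indices(p0, q0, p1, q1))
```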

Game-Theoretic Methods in General Equilibrium Analysis (Paperback, Softcover reprint of the original 1st ed. 1994)
J. F. Mertens, S. Sorin
R5,143 Discovery Miles 51 430 Ships in 18 - 22 working days

JEAN-FRANÇOIS MERTENS This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.

Game Theory, Experience, Rationality - Foundations of Social Sciences, Economics and Ethics in honor of John C. Harsanyi (Paperback, Softcover reprint of hardcover 1st ed. 1998)
W. Leinfellner, Eckehart Koehler
R4,060 Discovery Miles 40 600 Ships in 18 - 22 working days

When von Neumann's and Morgenstern's Theory of Games and Economic Behavior appeared in 1944, one thought that a complete theory of strategic social behavior had appeared out of nowhere. However, game theory has, to this very day, remained a fast-growing assemblage of models which have gradually been united in a new social theory - a theory that is far from being completed even after recent advances in game theory, as evidenced by the work of the three Nobel Prize winners, John F. Nash, John C. Harsanyi, and Reinhard Selten. Two of them, Harsanyi and Selten, have contributed important articles to the present volume. This book leaves no doubt that the game-theoretical models are on the right track to becoming a respectable new theory, just like the great theories of the twentieth century originated from formerly separate models which merged in the course of decades. For social scientists, the age of great discoveries is not over. The recent advances of today's game theory surpass by far the results of traditional game theory. For example, modern game theory has a new empirical and social foundation, namely, societal experiences; this has changed its methods, its "rationality." Morgenstern (I worked together with him for four years) dreamed of an encompassing theory of social behavior. With the inclusion of the concept of evolution in mathematical form, this dream will become true. Perhaps the new foundation will even lead to a new name, "conflict theory" instead of "game theory."

Arrovian Aggregation Models (Paperback, Softcover reprint of hardcover 1st ed. 1999)
Fuad T. Aleskerov
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

Aggregation of individual opinions into a social decision is a problem widely observed in everyday life. For centuries people tried to invent the 'best' aggregation rule. In 1951 the young American scientist and future Nobel Prize winner Kenneth Arrow formulated the problem in an axiomatic way, i.e., he specified a set of axioms which every reasonable aggregation rule has to satisfy, and showed that these axioms are inconsistent. This result, often called Arrow's Paradox or the General Impossibility Theorem, became a cornerstone of social choice theory. The main condition used by Arrow was his famous Independence of Irrelevant Alternatives. This very condition pre-defines the 'local' treatment of the alternatives (or pairs of alternatives, or sets of alternatives, etc.) in aggregation procedures. Remaining within the framework of the axiomatic approach and based on the consideration of local rules, Arrovian Aggregation Models investigates three formulations of the aggregation problem according to the form in which the individual opinions about the alternatives are defined, as well as to the form of the desired social decision. In other words, we study three aggregation models. What is common between them is that in all models some analogue of the Independence of Irrelevant Alternatives condition is used, which is why we call these models Arrovian aggregation models. Chapter 1 presents a general description of the problem of axiomatic synthesis of local rules, and introduces problem formulations for various versions of formalization of individual opinions and collective decision. Chapter 2 formalizes precisely the notion of 'rationality' of individual opinions and social decision. Chapter 3 deals with the aggregation model for the case of individual opinions and social decisions formalized as binary relations. Chapter 4 deals with Functional Aggregation Rules, which transform individual opinions defined as choice functions into a social choice function. Chapter 5 considers another model, Social Choice Correspondences, where the individual opinions are formalized as binary relations and the collective decision is looked for as a choice function. Several new classes of rules are introduced and analyzed.
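
A related, simpler illustration of why pairwise aggregation is delicate is the Condorcet paradox (not Arrow's theorem itself). The toy Python sketch below uses an assumed three-voter profile and shows that pairwise majority voting can yield a cyclic, intransitive social preference.

```python
# Assumed toy illustration related to Arrow's result: the Condorcet paradox.
# Three voters with the classic cyclic profile; pairwise majority voting
# produces an intransitive social preference.
from itertools import permutations

profile = [("a", "b", "c"), ("b", "c", "a"), ("c", "a", "b")]  # rankings, best first

def majority_prefers(x, y):
    votes = sum(1 if r.index(x) < r.index(y) else -1 for r in profile)
    return votes > 0

for x, y in permutations("abc", 2):
    if majority_prefers(x, y):
        print(f"majority prefers {x} over {y}")
# Output shows a > b, b > c and c > a: no transitive social ranking exists.
```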

Multivariate Statistical Analysis - A High-Dimensional Approach (Paperback, Softcover reprint of hardcover 1st ed. 2000)
V.I. Serdobolskii
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

Multivariate Statistical Analysis

Computational Solution of Large-Scale Macroeconometric Models (Paperback, Softcover reprint of hardcover 1st ed. 1997)
Giorgio Pauletto
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

This book is the result of my doctoral dissertation research at the Department of Econometrics of the University of Geneva, Switzerland. This research was also partially financed by the Swiss National Science Foundation (grants 12-31072.91 and 12-40300.94). First and foremost, I wish to express my deepest gratitude to Professor Manfred Gilli, my thesis supervisor, for his constant support and help. I would also like to thank the president of my jury, Professor Fabrizio Carlevaro, as well as the other members of the jury, Professor Andrew Hughes Hallett, Professor Jean-Philippe Vial and Professor Gerhard Wanner. I am grateful to my colleagues and friends of the Department of Econometrics, especially David Miceli who provided constant help and kind understanding during all the stages of my research. I would also like to thank Pascale Mignon for proofreading my text and improving my English. Finally, I am greatly indebted to my parents for their kindness and encouragements without which I could never have achieved my goals. Giorgio Pauletto, Department of Econometrics, University of Geneva, Geneva, Switzerland. Chapter 1 Introduction: The purpose of this book is to present the available methodologies for the solution of large-scale macroeconometric models. This work reviews classical solution methods and introduces more recent techniques, such as parallel computing and nonstationary iterative algorithms.

Econometrics of Information and Efficiency (Paperback, Softcover reprint of hardcover 1st ed. 1993)
Jati Sengupta
R4,015 Discovery Miles 40 150 Ships in 18 - 22 working days

Econometrics as an applied discipline attempts to use information in a most efficient manner, yet the information theory and entropy approach developed by Shannon and others has not played much of a role in applied econometrics. Econometrics of Information and Efficiency bridges the gap. Broadly viewed, information theory analyzes the uncertainty of a given set of data and its probabilistic characteristics. Whereas the economic theory of information emphasizes the value of information to agents in a market, the entropy theory stresses the various aspects of imprecision of data and their interactions with the subjective decision processes. The tools of information theory, such as the maximum entropy principle, mutual information and the minimum discrepancy are useful in several areas of statistical inference, e.g., Bayesian estimation, expected maximum likelihood principle, the fuzzy statistical regression. This volume analyzes the applications of these tools of information theory to the most commonly used models in econometrics. The outstanding features of Econometrics of Information and Efficiency are: A critical survey of the uses of information theory in economics and econometrics; An integration of applied information theory and economic efficiency analysis; The development of a new economic hypothesis relating information theory to economic growth models; New lines of research are emphasized.
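
The entropy and mutual information tools named above have standard definitions that are easy to compute from a discretized joint distribution. The Python sketch below is an assumed illustration of those textbook quantities only, not of the book's econometric applications; the contingency table is made up.

```python
# Assumed sketch: Shannon entropy and mutual information from an empirical
# joint distribution of two discretized variables. Illustrative only.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    joint = joint / joint.sum()              # empirical joint distribution
    px = joint.sum(axis=1)                   # marginal of X
    py = joint.sum(axis=0)                   # marginal of Y
    return entropy(px) + entropy(py) - entropy(joint.ravel())   # I = H(X)+H(Y)-H(X,Y)

counts = np.array([[30.0, 5.0], [10.0, 55.0]])   # made-up contingency table of X vs Y
print("I(X;Y) =", mutual_information(counts), "bits")
```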

Non-Parametric Statistical Diagnosis - Problems and Methods (Paperback, Softcover reprint of hardcover 1st ed. 2000)
E. Brodsky, B.S. Darkhovsky
R5,191 Discovery Miles 51 910 Ships in 18 - 22 working days

Non-Parametric Statistical Diagnosis

Monetary Policy - A Theoretical and Econometric Approach (Paperback, Softcover reprint of hardcover 1st ed. 1990)
Y. Barroux; Edited by P. Artus
R4,003 Discovery Miles 40 030 Ships in 18 - 22 working days

Patrick Artus and Yves Barroux The Applied Econometric Association organised an international conference on "Monetary and Financial Models" in Geneva in January 1987. The purpose of this book is to make available to the public a choice of the papers that were presented at the conference. The selected papers all deal with the setting of monetary targets and the effects of monetary policy on the economy as well as with the analysis of the financial behaviours of economic agents. Other papers presented at the same conference but dealing with the external aspects of monetary policy (exchange rate policy, international coordination of economic policies, international transmission of business cycles, ...) are the matter of a distinct publication. The papers put together to make up this book are either theoretical research contributions or applied statistical or econometric work. It seemed more logical to start with the more theoretical papers. The topics tackled in the first two parts of the book have in common the fact that they appeared only recently in the field of economic research and deal with the analysis of the behaviour of Central Banks. They analyse this behaviour so as to exhibit its major determinants as well as the revealed preferences of Central Banks: this topic comes under the caption "optimal monetary policy and reaction function of the monetary authorities."

Dynamics of Data Envelopment Analysis - Theory of Systems Efficiency (Paperback, Softcover reprint of hardcover 1st ed. 1995)
Jati Sengupta
R4,013 Discovery Miles 40 130 Ships in 18 - 22 working days

Data envelopment analysis develops a set of nonparametric and semiparametric techniques for measuring economic efficiency among firms and nonprofit organizations. Over the past decade this technique has found its most widespread applications in public sector organizations. However, these applications have been mostly static. This monograph extends this static framework of efficiency analysis in several new directions. These include but are not limited to the following: (1) a dynamic view of the production and cost frontier, where capital inputs are treated differently from the current inputs, (2) a direct role for technological progress and regress, which is so often stressed in total factor productivity discussions in modern growth theory in economics, (3) stochastic efficiency in a dynamic setting, where reliability improvement competes with technical efficiency, (4) flexible manufacturing systems, where flexibility of the production process and the economies of scope play an important role in efficiency analysis, and (5) the role of economic factors such as externalities and input interdependences. Efficiency is viewed here in the framework of a general systems theory model. Such a view is intended to broaden the scope of applications of this promising new technique of data envelopment analysis. The monograph stresses the various applied aspects of the dynamic theory, so that it can be empirically implemented in different situations. As far as possible, abstract mathematical treatments are avoided and emphasis is placed on statistical examples and empirical illustrations.
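
For readers unfamiliar with the static starting point that the monograph extends, the Python sketch below solves the standard input-oriented CCR envelopment linear program with scipy. It is only an assumed textbook formulation with toy data, not the book's dynamic or stochastic extensions.

```python
# Assumed sketch: input-oriented CCR (constant returns) DEA scores via LP.
# X: inputs (n_dmu x m), Y: outputs (n_dmu x s). Toy data, textbook model only.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                      # minimize theta
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])     # sum_j lam_j x_ij <= theta x_io
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])      # sum_j lam_j y_rj >= y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])       # toy inputs
Y = np.array([[1.0], [1.0], [1.0]])                      # single unit output
print(dea_ccr_input(X, Y))                               # score 1.0 = efficient
```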

Simulation and Inference for Stochastic Differential Equations - With R Examples (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Stefano M. Iacus
R3,108 Discovery Miles 31 080 Ships in 18 - 22 working days

This book covers a highly relevant and timely topic that is of wide interest, especially in finance, engineering and computational biology. The introductory material on simulation and stochastic differential equations is very accessible and will prove popular with many readers. While there are several recent texts available that cover stochastic differential equations, the concentration here on inference makes this book stand out. No other direct competitors are known to date. With an emphasis on the practical implementation of the simulation and estimation methods presented, the text will be useful to practitioners and students with minimal mathematical background. What's more, because of the many R programs, the information here is appropriate for many mathematically well-educated practitioners, too.
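
The simulation side of the topic can be illustrated with the Euler-Maruyama scheme for geometric Brownian motion. The book works in R; the Python sketch below is only the standard scheme with assumed parameters, not code from the book.

```python
# Assumed sketch: Euler-Maruyama simulation of geometric Brownian motion
# dX_t = mu * X_t dt + sigma * X_t dW_t, with illustrative parameters.
import numpy as np

def euler_maruyama_gbm(x0=1.0, mu=0.05, sigma=0.2, T=1.0, n_steps=250, seed=3):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt))               # Brownian increment
        x[i + 1] = x[i] + mu * x[i] * dt + sigma * x[i] * dw
    return x

path = euler_maruyama_gbm()
print("terminal value:", path[-1])
```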

International Applications of Productivity and Efficiency Analysis - A Special Issue of the Journal of Productivity Analysis (Paperback, Softcover reprint of hardcover 1st ed. 1992)
Thomas R. Gulledge, C.A. Knox Lovell
R2,672 Discovery Miles 26 720 Ships in 18 - 22 working days

International Applications of Productivity and Efficiency Analysis features a complete range of techniques utilized in frontier analysis, including extensions of existing techniques and the development of new techniques. Another feature is that most of the contributions use panel data in a variety of approaches. Finally, the range of empirical applications is at least as great as the range of techniques, and many of the applications are of considerable policy relevance.

Models for Analyzing Comparative Advantage (Paperback, Softcover reprint of the original 1st ed. 1990)
David Andrew Kendrick
R3,983 Discovery Miles 39 830 Ships in 18 - 22 working days

Recent economic history suggests that a key element in economic growth and development for many countries has been an aggressive export policy and a complementary import policy. Such policies can be very effective provided that resources are used wisely to encourage exports from industries that can be competitive in the international arena. Also, import protection must be used carefully so that it encourages infant industries instead of providing rents to industries that are not competitive. Policy makers may use a variety of methods of analysis in planning trade policy. As computing power has grown in recent years, increasing attention has been given to economic models as one of the most powerful aids to policy making. These models can be used on the one hand to help in selecting export industries to encourage and infant industries to protect, and on the other hand to chart the larger effects of trade policy on the entire economy. While many models have been developed in recent years, there has not been any analysis of the strengths and weaknesses of the various types of models. Therefore, this monograph provides a review and analysis of the models which can be used to analyze dynamic comparative advantage.

Contributions to Modern Econometrics - From Data Analysis to Economic Policy (Paperback, Softcover reprint of the original 1st ed. 2002)
Ingo Klein, Stefan Mittnik
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

The field of econometrics has gone through remarkable changes during the last thirty-five years. Widening its earlier focus on testing macroeconomic theories, it has become a rather comprehensive discipline concerned with the development of statistical methods and their application to the whole spectrum of economic data. This development becomes apparent when looking at the biography of an econometrician whose illustrious research and teaching career started about thirty-five years ago and who will retire very soon after his 65th birthday. This is Gerd Hansen, professor of econometrics at the Christian Albrechts University at Kiel, to whom this volume with contributions from colleagues and students is dedicated. He has shaped the econometric landscape in and beyond Germany throughout these thirty-five years. At the end of the 1960s he developed one of the first econometric models for the German economy, which adhered closely to the traditions put forth by the Cowles Commission.

A Modern Approach to Regression with R (Paperback, Softcover reprint of hardcover 1st ed. 2009)
Simon Sheather
R1,665 Discovery Miles 16 650 Ships in 18 - 22 working days

This book focuses on tools and techniques for building regression models using real-world data and assessing their validity. A key theme throughout the book is that it makes sense to base inferences or conclusions only on valid models. Plots are shown to be an important tool for both building regression models and assessing their validity. We shall see that deciding what to plot and how each plot should be interpreted will be a major challenge. In order to overcome this challenge we shall need to understand the mathematical properties of the fitted regression models and associated diagnostic procedures. As such, this will be an area of focus throughout the book. In particular, we shall carefully study the properties of residuals in order to understand when patterns in residual plots provide direct information about model misspecification and when they do not. The regression output and plots that appear throughout the book have been generated using R. The output from R that appears in this book has been edited in minor ways. On the book web site you will find the R code used in each example in the text.
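
The book's workflow of fitting a model and inspecting residual plots has a close parallel in Python, sketched below as an assumed illustration with simulated data (the book itself uses R, and this is not its code).

```python
# Assumed parallel in Python: fit a simple linear model with statsmodels and
# plot residuals against fitted values to check model validity.
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=100)   # simulated data

model = sm.OLS(y, sm.add_constant(x)).fit()           # intercept + slope
print(model.params)

plt.scatter(model.fittedvalues, model.resid)          # residuals vs fitted values
plt.axhline(0.0, linestyle="--")
plt.xlabel("fitted values")
plt.ylabel("residuals")
plt.show()
```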

Multiscale Modeling - A Bayesian Perspective (Paperback, Softcover reprint of hardcover 1st ed. 2007)
Marco A. R. Ferreira, Herbert K.H. Lee
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

This highly useful book contains methodology for the analysis of data that arise from multiscale processes. It brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. These methods can handle different amounts of prior knowledge at different scales, as often occurs in practice.

Finite Mixture and Markov Switching Models (Paperback, Softcover reprint of hardcover 1st ed. 2006)
Sylvia Fruhwirth-Schnatter
R5,200 Discovery Miles 52 000 Ships in 18 - 22 working days

WINNER OF THE 2007 DEGROOT PRIZE

The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models.

For more than a hundred years, since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools emerged for dealing with these models which combine a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods.

It is the first time that the Bayesian perspective of finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach.

The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers and any other frequent users of statistical models. This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Previous notions of Bayesian inference and Monte Carlo simulation are useful but not needed.
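
The two-component normal mixture recalled above (Pearson's 1894 problem) can be fitted in a few lines today. The Python sketch below uses maximum likelihood via EM in scikit-learn on simulated data; it is an assumed illustration, not Pearson's method of moments and not the Bayesian MCMC approach developed in the book.

```python
# Assumed sketch: fitting a two-component normal mixture by EM (scikit-learn).
# Maximum likelihood, not the method of moments nor the book's Bayesian MCMC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
data = np.r_[rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)].reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(data)
print("weights:  ", gm.weights_)
print("means:    ", gm.means_.ravel())
print("variances:", gm.covariances_.ravel())
```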
