Econometric models are made up of assumptions which never exactly match reality. Among the most contested is the requirement that the coefficients of an econometric model remain stable over time. Recent years have therefore seen numerous attempts to test for coefficient stability or to model possible structural change when it can no longer be ignored. This collection of papers from Empirical Economics mirrors part of this development. The point of departure of most studies in this volume is the standard linear regression model y_t = x_t'β_t + u_t (t = 1, ..., T), where the notation is obvious and where the index t emphasises the fact that structural change is mostly discussed and encountered in a time-series context. It is much less of a problem for cross-section data, although many tests apply there as well. The null hypothesis of most tests for structural change is that β_t = β_0 for all t, i.e. that the same regression applies to all time periods in the sample and that the disturbances u_t are well behaved. The well-known Chow test, for instance, assumes that there is a single structural shift at a known point in time, i.e. that β_t = β_0 (t < t*) and β_t = β_0 + Δβ (t ≥ t*), where t* is known.
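The Chow test described above can be sketched numerically: fit one pooled regression under the null and two split regressions under the alternative, then compare their residual sums of squares. A minimal illustration on simulated data with a known break point t*; the function name `chow_test` and all parameter values are hypothetical choices for this sketch, not taken from the volume.

```python
import numpy as np

def chow_test(y, X, t_star):
    """F-statistic for a single structural break at a known time t_star.

    H0: the same coefficients beta_0 apply before and after t_star.
    """
    def ssr(y, X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return resid @ resid

    T, k = X.shape
    ssr_pooled = ssr(y, X)                               # restricted: one regression
    ssr_split = ssr(y[:t_star], X[:t_star]) + ssr(y[t_star:], X[t_star:])
    return ((ssr_pooled - ssr_split) / k) / (ssr_split / (T - 2 * k))

# Simulated data with a shift in the slope coefficient at t* = 100
rng = np.random.default_rng(0)
T, t_star = 200, 100
x = rng.normal(size=T)
X = np.column_stack([np.ones(T), x])
beta0, dbeta = np.array([1.0, 2.0]), np.array([0.0, 1.5])
y = X @ beta0 + rng.normal(scale=0.5, size=T)
y[t_star:] += X[t_star:] @ dbeta                         # structural shift for t >= t*
print(chow_test(y, X, t_star))                           # large F => reject stability
```

Under the null the statistic is F-distributed with (k, T - 2k) degrees of freedom; here the induced slope shift makes it very large.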
Floro Ernesto Caroleo and Francesco Pastore This book was conceived to collect selected essays presented at the session on "The Labour Market Impact of the European Union Enlargements. A New Regional Geography of Europe?" of the XXII Conference of the Italian Association of Labour Economics (AIEL). The session aimed to stimulate the debate on the continuity/fracture of regional patterns of development and employment in old and new European Union (EU) regions. In particular, we asked whether and how the causes of the emergence and evolution of regional imbalances in the new EU members of Central and Eastern Europe (CEE) differ from those in the old EU members. Several contributions in this book suggest that a factor common to all backward regions, often neglected in the literature, is to be found in their higher than average degree of structural change or, more precisely, in the hardship they experience in coping with the process of structural change typical of all advanced economies. In the new EU members of CEE, structural change is still a consequence of the continuing process of transition from central planning to a market economy, but also of what Fabrizio et al. (2009) call the "second transition," namely that related to the run-up to and entry into the EU.
This important new dictionary - the first of its kind, now available in paperback - presents an accessible source of reference on the main concepts and techniques in econometrics. Featuring entries on all the major areas in theoretical econometrics, the dictionary will be used by students, both undergraduate and postgraduate, to aid their understanding of the subject. Arranged in alphabetical order, each entry is a short essay designed to present the essential points of a particular concept or technique and to offer a concise guide to other relevant literature. Written in an accessible and discursive style, the book adopts non-technical language to make the topics accessible to those who need to know more about applied econometrics and the underlying econometric theory. It will be widely welcomed as an indispensable supplement to the standard textbook literature and is particularly well suited to students following modular courses. An essential source of reference for both undergraduate and postgraduate students, the dictionary will also be useful for professional economists seeking to keep abreast of the latest developments in econometrics.
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful applications of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book. Current research trends and future perspectives in the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic-based fitness models, the use of belief propagation algorithms in EDAs and the application of Markov network-based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners from areas such as optimization, evolutionary computation, and machine learning.
The Analytic Hierarchy Process (AHP) is a prominent and powerful tool for making decisions in situations involving multiple objectives. Models, Methods, Concepts and Applications of the Analytic Hierarchy Process, 2nd Edition applies the AHP in order to solve problems focused on the following three themes: economics, the social sciences, and the linking of measurement with human values. For economists, the AHP offers a substantially different approach to dealing with economic problems through ratio scales. Psychologists and political scientists can use the methodology to quantify and derive measurements for intangibles. Meanwhile, researchers in the physical and engineering sciences can apply the AHP methods to help resolve the conflicts between hard measurement data and human values. Throughout the book, each of these topics is explored utilizing real-life models and examples relevant to problems in today's society. This new edition has been updated and includes five new chapters that include discussions of the following: - The eigenvector and why it is necessary - A summary of ongoing research in the Middle East that brings together Israeli and Palestinian scholars to develop concessions from both parties - A look at the Medicare Crisis and how AHP can be used to understand the problems and help develop ideas to solve them.
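The eigenvector mentioned among the new chapters is the standard AHP device for deriving priority weights: the principal eigenvector of a positive reciprocal pairwise-comparison matrix, with consistency judged by how far the principal eigenvalue exceeds the matrix order. A minimal sketch; the comparison matrix below is a made-up example on Saaty's 1-9 scale, not one from the book.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix: entry (i, j) says how strongly
# alternative i is preferred to j; entries below the diagonal are reciprocals.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)                  # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                                 # normalised priority weights

n = A.shape[0]
lam_max = eigvals.real[i]
ci = (lam_max - n) / (n - 1)                 # consistency index; ~0 if consistent
print(w, ci)
```

For a perfectly consistent matrix lambda_max equals n, so a small consistency index (conventionally, a consistency ratio below 0.1) signals that the judgments are acceptably coherent.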
This tutorial presents a hands-on introduction to a new discrete choice modeling approach based on the behavioral notion of regret minimization. This so-called Random Regret Minimization approach (RRM) forms a counterpart to the Random Utility Maximization approach (RUM) to discrete choice modeling, which has for decades dominated the field of choice modeling and adjacent fields such as transportation, marketing and environmental economics. Being as parsimonious as conventional RUM models and compatible with popular software packages, the RRM approach provides an alternative and appealing account of choice behavior. Rather than providing highly technical discussions as usually encountered in scholarly journals, this tutorial aims to allow readers to explore the RRM approach and its potential and limitations hands-on, based on a detailed discussion of examples. This tutorial is written for students, scholars and practitioners who have a basic background in choice modeling in general and RUM modeling in particular. Care has been taken to ensure that all concepts and results are clear to readers who do not have an advanced knowledge of econometrics.
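For readers curious what regret minimization looks like computationally, here is a minimal sketch of the classical RRM regret function and its logit-type choice probabilities. This is not code from the tutorial: the attribute values and taste parameters are invented, and the regret specification used is the widely cited ln(1 + exp(.)) form.

```python
import numpy as np

def rrm_probabilities(X, beta):
    """Choice probabilities under a classical RRM specification.

    Regret of alternative i: sum over competitors j != i and attributes m of
    ln(1 + exp(beta_m * (x_jm - x_im))) -- only being outperformed hurts.
    Choices follow a logit on negative regret.
    """
    n_alt = X.shape[0]
    R = np.zeros(n_alt)
    for i in range(n_alt):
        for j in range(n_alt):
            if j != i:
                R[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))
    expnR = np.exp(-R)
    return expnR / expnR.sum()

# Hypothetical choice set: 3 alternatives x 2 attributes (higher is better)
X = np.array([[1.0, 0.5],
              [0.8, 0.9],
              [0.2, 0.1]])
beta = np.array([1.0, 1.0])        # assumed taste parameters
p = rrm_probabilities(X, beta)
print(p)                           # probabilities sum to one
```

Note the behavioral contrast with RUM: an alternative's regret depends on pairwise comparisons with every competitor, so adding or changing a non-chosen alternative shifts all probabilities, not just its own.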
This volume evolved from a conference on "Financial Markets Econometrics" held at the ZEW (Zentrum für Europäische Wirtschaftsforschung) in Mannheim, Germany in February 1992. However, not all papers included in this volume were presented at the conference; in some cases the papers are follow-ups to the ones presented. The purpose of the conference was to bring together researchers from several European countries to discuss their applications of recent econometric methods to the analysis of financial markets. From a methodological point of view, the main emphasis of the conference papers was on cointegration analysis and ARCH modelling. In cointegration analysis the links between long-run components of time series are studied, and the methods can be applied to the determination of equilibrium relationships between the variables, whereas ARCH models (ARCH is the acronym of autoregressive conditional heteroskedasticity) are concerned with the measurement and analysis of changing variances in time series. These two models have been the most significant innovations for the empirical analysis of financial time series in recent years. Six papers of this volume apply cointegration analysis (the papers by MacDonald/Marsh, Hansen, Ronning, Garbers, Kirchgassner/Wolters, and Kunst/Polasek) and seven papers deal with ARCH models (Kramer/Runde, Drost, Kunst/Polasek, Kugler, Eggington/Hall, Koedijk/Stork/deVries, and Demos/Sentana/Shah). Other econometric methods and models applied in the papers include factor analysis (Eggington/Hall and Demos/Sentana/Shah), vector autoregressions (Kirchgassner/Wolters and Kunst/Polasek), Markov-switching models (Garbers and Kaehler/Marnet), spectral analysis (Kirchgassner/Wolters), stable Paretian distributions (Kramer/Runde and Drost) and ARFIMA models (Drost).
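The ARCH idea described above, changing conditional variances, is easy to illustrate by simulation. A minimal sketch of an ARCH(1) process with illustrative parameter values of my own choosing, showing the hallmark property that the series itself is serially uncorrelated while its squares are not (volatility clustering):

```python
import numpy as np

def simulate_arch1(T, omega=0.2, alpha=0.5, seed=0):
    """Simulate ARCH(1): u_t = sigma_t * e_t with e_t ~ N(0, 1),
    and conditional variance sigma_t^2 = omega + alpha * u_{t-1}^2."""
    rng = np.random.default_rng(seed)
    u = np.zeros(T)
    sigma2 = omega / (1 - alpha)        # start at the unconditional variance
    for t in range(T):
        u[t] = np.sqrt(sigma2) * rng.normal()
        sigma2 = omega + alpha * u[t] ** 2
    return u

u = simulate_arch1(20_000)
# Returns are (nearly) uncorrelated, but squared returns are autocorrelated.
corr_u = np.corrcoef(u[:-1], u[1:])[0, 1]
corr_u2 = np.corrcoef(u[:-1] ** 2, u[1:] ** 2)[0, 1]
print(corr_u, corr_u2)
```

This separation, no linear predictability in levels but strong predictability in variances, is exactly why ARCH models became the workhorse for financial time series that the blurb describes.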
Charles de Gaulle begins his Mémoires d'Espoir thus: "France comes from the depths of the ages. She lives. The centuries call to her. Yet she remains herself through time. Her borders may change without altering the relief, the climate, the rivers and the seas that mark her indefinitely. She is inhabited by peoples who, over the course of history, are gripped by the most diverse trials, but whom the nature of things, shaped by politics, ceaselessly moulds into a single nation. This nation has embraced many generations. It currently comprises several. It will give birth to many more. But by the geography of the country that is hers, by the genius of the peoples who compose her, and by the neighbours that surround her, she takes on a constant character that makes the French of every era depend on their fathers and binds them to their descendants. Unless it breaks apart, this human whole, on this territory, within this universe, thus carries a past, a present and a future that are indissoluble. Hence the State, which answers for France, is charged at once with her heritage of yesterday, her interests of today and her hopes of tomorrow." In the light of this idea of the nation, it is clear that dialogue between nations is eminently important, and that the Semaine Universitaire Franco-Néerlandaise is an institution for stimulating that dialogue.
The book details the innovative TERM (The Enormous Regional Model) approach to regional and national economic modeling, and explains the conversion from a comparative-static to a dynamic model. It moves on to an adaptation of TERM to water policy, including the additional theoretical and database requirements of the dynamic TERM-H2O model. In particular, it examines the contrasting economic impacts of water buyback policy and recurring droughts in the Murray-Darling Basin. South-east Queensland, where climate uncertainty has been borne out by record-breaking drought and the worst floods in living memory, provides a chapter-length case study. The exploration of the policy background and implications of TERM's dynamic modeling will provide food for thought in policy making circles worldwide, where there is a pressing need for solutions to similarly intractable problems in water management.
For courses in introductory econometrics. This package includes Pearson MyLab Economics. Engaging applications bring the theory and practice of modern econometrics to life Ensure students grasp the relevance of econometrics with Introduction to Econometrics -- the text that connects modern theory and practice with motivating, engaging applications. The 4th Edition, Global Edition, maintains a focus on currency, while building on the philosophy that applications should drive the theory, not the other way around. The text incorporates real-world questions and data, and methods that are immediately relevant to the applications. With very large data sets increasingly being used in economics and related fields, a new chapter dedicated to Big Data helps students learn about this growing and exciting area. This coverage and approach make the subject come alive for students and helps them to become sophisticated consumers of econometrics. Reach every student by pairing this text with Pearson MyLab Economics MyLab (TM) is the teaching and learning platform that empowers you to reach every student. By combining trusted author content with digital tools and a flexible platform, MyLab personalizes the learning experience and improves results for each student. Pearson MyLab Economics should only be purchased when required by an instructor. Please be sure you have the correct ISBN and Course ID. Instructors, contact your Pearson representative for more information.
Max-Min problems are two-step allocation problems in which one side must make his move knowing that the other side will then learn what the move is and optimally counter. They are fundamental in particular to military weapons-selection problems involving large systems such as Minuteman or Polaris, where the systems in the mix are so large that they cannot be concealed from an opponent. One must then expect the opponent to determine an optimal mixture of, in the case mentioned above, anti-Minuteman and anti-submarine effort. The author's first introduction to a problem of Max-Min type occurred at The RAND Corporation about 1951. One side allocates anti-missile defenses to various cities. The other side observes this allocation and then allocates missiles to those cities. If F(x, y) denotes the total residual value of the cities after the attack, with x denoting the defender's strategy and y the attacker's, the problem is then to find Max_x Min_y F(x, y).
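The Max_x Min_y structure can be made concrete with a tiny discrete example: for each defender strategy x, the attacker (who observes x) picks the y minimizing F; the defender then picks the x whose worst case is best. The payoff matrix below is invented purely for illustration.

```python
import numpy as np

# Hypothetical payoffs: F[x, y] = residual value of the cities when the
# defender plays pure strategy x and the attacker, having observed x, plays y.
F = np.array([
    [8.0, 3.0, 6.0],
    [5.0, 4.0, 4.0],
    [9.0, 2.0, 7.0],
])

inner = F.min(axis=1)          # attacker minimises over y for each observed x
x_star = inner.argmax()        # defender anticipates this and maximises
value = inner.max()            # Max_x Min_y F(x, y)
print(x_star, value)           # defender strategy 1 guarantees residual value 4.0
```

Note that strategy 2 has the highest single payoff (9.0) but the worst guarantee (2.0); the sequential, observed-move structure is what makes the cautious row optimal.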
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
In his "Prime ricerche sulla rivoluzione dei prezzi in Firenze" (1939), Giuseppe Parenti, regarded by Fernand Braudel as an author who "ranked, from the outset and beyond any possible discussion, at the very level of Earl Jefferson Hamilton...", begins his opening lines with a description/definition of the price revolution that took place in sixteenth-century Europe as "that extraordinary enhancement of all things that occurred in European countries around the second half of the sixteenth century; a revolution in the true meaning of the word, as not only, like any strong price increase, did it modify the wealth distribution process and change the relative position of the various social categories and of the different functions of economic activity, but it also affected, in a way that has not yet been sufficiently studied, the relative evolution of the various national economies, and finally... certainly contributed to the birth, or at least to the dissemination, of the new naturalistic economic ideas from which economic science would spring." A definition that can be taken as the founding metaphor of this volume.
An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. Design of Observational Studies is divided into four parts. Chapters 2, 3, and 5 of Part I cover concisely, in about one hundred pages, many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates. Part II includes a chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies, "make your theories elaborate." The second edition of his book, Observational Studies, was published by Springer in 2002.
Spatial statistics are useful in subjects as diverse as climatology, ecology, economics, environmental and earth sciences, epidemiology, image analysis and more. This book covers the best-known spatial models for three types of spatial data: geostatistical data (stationarity, intrinsic models, variograms, spatial regression and space-time models), areal data (Gibbs-Markov fields and spatial auto-regression) and point pattern data (Poisson, Cox, Gibbs and Markov point processes). The level is relatively advanced, and the presentation concise but complete. The most important statistical methods and their asymptotic properties are described, including estimation in geostatistics, autocorrelation and second-order statistics, maximum likelihood methods, approximate inference using the pseudo-likelihood or Monte-Carlo simulations, statistics for point processes and Bayesian hierarchical models. A chapter is devoted to Markov Chain Monte Carlo simulation (Gibbs sampler, Metropolis-Hastings algorithms and exact simulation). This book is the English translation of Modélisation et Statistique Spatiales, published by Springer in the series Mathématiques & Applications, a series established by the Société de Mathématiques Appliquées et Industrielles (SMAI).
On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TEX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.
A. Dogramaci and N.R. Adam Productivity of a firm is influenced both by economic forces which act at the macro level and impose themselves on the individual firm, as well as by internal factors that result from decisions and processes which take place within the boundaries of the firm. Efforts towards increasing the productivity level of firms need to be based on a sound understanding of how the above processes take place. Our objective in this volume is to present some of the recent research work in this field. The volume consists of three parts. In Part I, two macro issues are addressed (taxation and inflation) and their relation to productivity is analyzed. The second part of the volume focuses on methods for productivity analysis within the firm. Finally, the third part of the book deals with two additional productivity analysis techniques and their applications to public utilities. The objective of the volume is not to present a unified point of view, but rather to cover a sample of different methodologies and perspectives through original, scholarly papers.
Articles on econometric methodology, with special reference to the quantification of poverty and economic inequality, are presented in this book. Poverty and inequality measurement present special problems to the econometrician, and most of these papers analyze how to attack those problems.
The proliferation of the internet has often been referred to as the fourth technological revolution. This book explores the diffusion of radical new communication technologies, and the subsequent transformation not only of products, but also of the organisation of production and business methods.
This volume addresses profound issues in international economics, with contributions from leading researchers on the implications of trade. Empirical studies address preferential trading arrangements, global imbalances and exchange rates, facilitating an understanding of how the economy functions and enabling detailed policy evaluation.
This book extends Thirlwall's model and adapts its implications to the current problems facing developed and emerging economies. In this context, this book combines theoretical models and empirical applications, unveiling new results and highlighting the importance of the balance of payments as a constraint to growth.
A collection of papers from leading thinkers to celebrate the work of the late Wynne Godley, and his enormous contribution to the field of monetary economics. Chapters include in-depth discussions of the revolutionary economic modelling systems that Godley introduced, as well as his prescient concerns about the global financial crash.
The subject theory is important in finance, economics, investment strategies, health sciences, environment, industrial engineering, etc.
This book was mainly written while I stayed at the Catholic University of Louvain. Professor Anton P. Barten was the one who not only gave me a warm welcome in Louvain, but also supported my research with most valuable comments and constructive criticisms. In addition I benefitted from discussions with Erik Schokkaert, Denis de Crombrugghe and Jo Baras on various subjects, such as the small-sample correction of Chapter 9. The arduous task of transferring my neat handwriting into a readable typescript was excellently taken care of by Mrs. E. Crabbe and notably Mrs. F. Duijsens, even after working hours. Mrs. A. Molders prevented me from making serious abuse of the English language. My admiration for Carien, finally, is an exponential function of the patience and enthusiasm with which she supported my research. Chapter 1 is a general introduction to the subject of linkage models, and it contains few mathematical elaborations. Chapters 2 to 4 use more, but elementary, mathematics, and treat several aspects related to the derivation, interpretation and estimation of linkage models. Chapter 2 deals with the theory of import allocation models, Chapter 3 treats the problem of defining and interpreting elasticities of substitution, while Chapter 4 is concerned with the econometric problems related to the estimation of multivariate models with linear restrictions, such as import allocation models.