This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage of game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, and provides a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of those in the first edition. The authors make full use of the book's full-color format with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already taken an introductory course in microeconomics and a calculus course.
Presenting an economic perspective of deforestation in the Brazilian Amazon, this study utilizes economic and ecological data from 1970 to 1996. It examines the extent to which land clearing promotes economic activity and growth and analyzes policies such as road building and subsidized credit. It explores whether the economic benefits of land clearing surpass the ecological costs and considers the viability of extractivism as an alternative to deforestation.
This book develops a machine-learning framework for predicting economic growth. It can also be considered a primer for using machine learning (also known as data mining or data analytics) to answer economic questions. While machine learning itself is not a new idea, advances in computing technology, combined with a dawning realization of its applicability to economic questions, make it a new tool for economists.
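As a rough illustration of the kind of workflow such a framework involves (not taken from the book), the following Python sketch fits a tree-ensemble regressor to simulated country-level predictors of growth and reports cross-validated fit; the feature set and data-generating process are purely hypothetical.

```python
# Minimal sketch (not from the book): predicting growth with a tree ensemble.
# Features and data are illustrative placeholders, not real country data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: investment share, schooling, trade openness, inflation
X = rng.normal(size=(n, 4))
# Hypothetical GDP growth with a nonlinear signal plus noise
y = 1.5 * X[:, 0] - 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=n)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("Cross-validated R^2:", scores.mean())
```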
Petri Nets (PN) were defined for the study of discrete event systems and later extended for many purposes, including dependability assessment. To our knowledge, no book deals specifically with the use of different types of PN for dependability. In addition, we focus on the suitability of particular Petri net types for studying various problems related to dependability, such as risk analysis and probabilistic assessment. In the first part, the basic models of PN and some useful extensions are briefly recalled. In the second part, PN are used as a formal model to describe the evolution process of critical systems within the framework of an ontological approach. The third part focuses on stochastic Petri Nets (SPN) and their use in dependability assessment. Different formal models of SPN are presented (semantics, evolution rules, and so on), together with their equivalence to the corresponding classes of Markov processes, which yields an analytical assessment of dependability. Simplification methods are proposed in order to reduce the size of the analytical model and make it more tractable. The introduction of some concepts specific to high-level PN also allows complex systems to be considered. A few applications in the field of instrumentation and control (I&C) systems and safety instrumented systems (SIS) emphasize the benefits of SPN for dependability assessment.
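As a minimal illustration of the SPN-to-Markov correspondence referred to above (not taken from the book), the sketch below evaluates the steady-state availability of a two-state up/down system with exponential failure and repair transitions; the rates are illustrative.

```python
# Minimal illustration (not from the book): the Markov chain underlying a
# two-place stochastic Petri net (Up/Down) with exponential failure and
# repair transitions. Rates are illustrative.
import numpy as np

lam, mu = 1e-3, 1e-1          # failure rate, repair rate (per hour)
Q = np.array([[-lam,  lam],   # generator matrix: state 0 = Up, state 1 = Down
              [  mu,  -mu]])

# Steady-state distribution pi solves pi @ Q = 0 with pi summing to 1.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("Asymptotic availability:", pi[0])   # equals mu / (lam + mu)
```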
This is the second volume in a two-part series on frontiers in regional research. It identifies methodological advances as well as trends and future developments in regional systems modelling and open science. Building on recent methodological and modelling advances, as well as on extensive policy-analysis experience, top international regional scientists identify and evaluate emerging new conceptual and methodological trends and directions in regional research. Topics such as dynamic interindustry modelling, computable general equilibrium models, exploratory spatial data analysis, geographic information science, spatial econometrics and other advanced methods are the central focus of this book. The volume provides insights into the latest developments in object orientation, open source, and workflow systems, all in support of open science. It will appeal to a wide readership, from regional scientists and economists to geographers, quantitatively oriented regional planners and researchers in related disciplines. It offers a source of relevant information for academic researchers and policy analysts in government, and is also suitable for advanced teaching courses on regional and spatial science, economics and political science.
This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that apply directly to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shift time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long- or short-range dependence (LRD/SRD) and gives examples of LRD behaviours. More general techniques (central limit theorems) are described under SRD; mixing and weak dependence are also reviewed. In closing, it describes moment techniques together with their relation to cumulant sums, as well as an application to kernel-type estimation. The appendix reviews basic facts of probability theory, discusses useful laws stemming from the Gaussian laws, and is completed by the R scripts used for the figures. Richly illustrated with examples and simulations, the book is recommended for advanced master's courses for mathematicians just entering the field of time series, and for statisticians who want more mathematical insight into the background of non-linear time series.
Today, most money is credit money, created by commercial banks. While credit can finance innovation, excessive credit can lead to boom/bust cycles, such as the recent financial crisis. This highlights how the organization of our monetary system is crucial to stability. One way to achieve this is by separating the unit of account from the medium of exchange, and in pre-modern Europe such a separation existed. This new volume examines the idea of monetary separation and the history of monetary arrangements in the North and Baltic Seas region, from the Hanseatic League onwards. The book provides a theoretical analysis of four historical cases in the Baltic and North Seas region, with a view to examining the evolution of monetary arrangements from a new monetary economics perspective. Since the objective exchange value of money (its purchasing power) reflects subjective individual valuations of commodities, the author assesses these historical cases by means of exchange rates. Using theories from new monetary economics, the book explores how the units of account and their media of exchange evolved as social conventions, and offers new insight into the separation between the two. Through this exploration, it puts forward that money is a social institution, a clearing device for the settlement of accounts, and that the value of money, or a separate unit of account, ultimately results from the size of its network of users. The History of Money and Monetary Arrangements offers a highly original new insight into monetary arrangements as an evolutionary process. It will be of great interest to an international audience of scholars and students, including those with an interest in economic history, evolutionary economics and new monetary economics.
Principles of Econometrics, 4th Edition, is an introductory econometrics book for students in economics and finance, designed to provide an understanding of why econometrics is necessary and a working knowledge of basic econometric tools. This latest edition is updated to reflect the current state of economic and financial markets and provides new content on Kernel Density Fitting and Analysis of Treatment Effects. It offers new end-of-chapter questions and problems in each chapter, an updated comprehensive Glossary of Terms, and a summary of Probability and Statistics. The text applies basic econometric tools to modeling, estimation, inference, and forecasting through real-world problems, and critically evaluates the results and conclusions of others who use basic econometric tools. Furthermore, it provides a foundation and understanding for further study of econometrics and more advanced techniques.
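As a brief, hedged illustration of kernel density fitting, one of the new topics mentioned above (the example is not from the textbook), the following Python snippet estimates a density with a Gaussian kernel using SciPy.

```python
# Minimal sketch (not from the textbook): kernel density fitting with a
# Gaussian kernel on simulated return data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
returns = rng.normal(loc=0.0, scale=0.02, size=500)  # illustrative data

kde = gaussian_kde(returns)               # bandwidth chosen by Scott's rule
grid = np.linspace(returns.min(), returns.max(), 200)
density = kde(grid)                       # estimated density on the grid
print(density[:5])
```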
The conference 'Measurement Error: Econometrics and Practice' was recently hosted by Aston University and organised jointly by researchers from Aston University and Lund University to highlight the enormous problems caused by measurement error in economic and financial data, which often go largely unnoticed. Thanks to sponsorship from Eurostat, a number of distinguished researchers were invited to present keynote lectures. Professor Arnold Zellner from the University of Chicago shared his knowledge of measurement error in general; Professor William Barnett from the University of Kansas gave a lecture on the implications of measurement error for monetary policy; and Dennis Fixler shared his knowledge of how statistical agencies deal with measurement errors. This volume is the result of a selection of high-quality papers presented at the conference and is designed to draw attention to the enormous problem in econometrics of measurement error in data provided by the world's leading statistical agencies, highlighting the consequences of data error and offering solutions to deal with such problems. The volume should appeal to economists, financial analysts and practitioners interested in studying and solving economic problems and building econometric models in everyday operations.
This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach allows the discrepancy between individuals' realities and the abstract representation of those realities that results from relying on the arithmetic mean alone to be explicitly taken into consideration. In turn, the method is applied to the question of economic inequality in Germany.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary contexts, considering small-sample correction, volatility and the impact of different orders of integration. Models with expectations are considered, along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
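A minimal sketch of the kind of cointegration analysis discussed here (not drawn from the book) is an Engle-Granger test on two simulated I(1) series that share a common stochastic trend, for example using statsmodels.

```python
# Minimal sketch (not from the book): an Engle-Granger cointegration test on
# two simulated I(1) series driven by a common random-walk trend.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(2)
T = 500
trend = np.cumsum(rng.normal(size=T))          # common stochastic trend
x = trend + rng.normal(scale=0.5, size=T)
y = 2.0 * trend + rng.normal(scale=0.5, size=T)

t_stat, p_value, crit = coint(y, x)
print("EG t-statistic:", t_stat, "p-value:", p_value)
```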
"Bayesian Econometrics" illustrates the scope and diversity of modern applications, reviews some recent advances, and highlights many desirable aspects of inference and computations. It begins with an historical overview by Arnold Zellner who describes key contributions to development and makes predictions for future directions. In the second paper, Giordani and Kohn makes suggestions for improving Markov chain Monte Carlo computational strategies. The remainder of the book is categorized according to microeconometric and time-series modeling. Models considered include an endogenous selection ordered probit model, a censored treatment-response model, equilibrium job search models and various other types. These are used to study a variety of applications for example dental insurance and care, educational attainment, voter opinions and the marketing share of various brands and an aggregate cross-section production function. Models and topics considered include the potential problem of improper posterior densities in a variety of dynamic models, selection and averaging for forecasting with vector autoregressions, a consumption capital-asset pricing model and various others. Applications involve U.S. macroeconomic variables, exchange rates, an investigation of purchasing power parity, data from London Metals Exchange, international automobile production data, and data from the Asian stock market.
Davidson and MacKinnon have written an outstanding textbook for graduates in econometrics, covering both basic and advanced topics and using geometrical proofs throughout for clarity of exposition. The book offers a unified theoretical perspective, and emphasizes the practical applications of modern theory.
Nonlinear models have been used extensively in the areas of economics and finance. Recent literature on the topic has shown that a large number of series exhibit nonlinear dynamics rather than the alternative of linear dynamics. Incorporating these concepts involves deriving and estimating nonlinear time series models, which have typically taken the form of Threshold Autoregression (TAR) models, Exponential Smooth Transition Autoregressive (ESTAR) models, and Markov Switching (MS) models, among several others. This edited volume provides a timely overview of nonlinear estimation techniques, offering new methods and insights into nonlinear time series analysis. It features cutting-edge research from leading academics in economics, finance, and business management, focusing on such topics as Zero-Information-Limit-Conditions, using Markov Switching models to analyze economic series, and how best to distinguish between competing nonlinear models. The principles and techniques in this book will appeal to econometricians, finance professors teaching quantitative finance, researchers, and graduate students interested in learning how to apply advances in nonlinear time series modeling to solve complex problems in economics and finance.
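As a hedged illustration of one of the model classes named above (not taken from the volume), the following snippet simulates a simple two-regime TAR(1) process with illustrative coefficients.

```python
# Minimal sketch (not from the volume): simulating a two-regime threshold
# autoregressive (TAR) process. Below the threshold, y_t = 0.8*y_{t-1} + e_t;
# above it, y_t = -0.4*y_{t-1} + e_t. Coefficients are illustrative.
import numpy as np

rng = np.random.default_rng(3)
T, threshold = 1000, 0.0
y = np.zeros(T)
for t in range(1, T):
    phi = 0.8 if y[t - 1] <= threshold else -0.4
    y[t] = phi * y[t - 1] + rng.normal()
print(y[:5])
```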
The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses, producing standardised and highly customisable outputs. The book presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. It is a valuable resource for statisticians and practitioners working in the field of health economics who want to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or in academic and scientific publications.
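BCEA itself is an R package; purely as a language-agnostic illustration of one quantity such post-processing typically produces, the Python sketch below computes a cost-effectiveness acceptability curve from simulated posterior draws of incremental costs and effects. The draws and thresholds are placeholders, not from the book's case studies.

```python
# Minimal sketch (not the BCEA package, which is written in R): computing a
# cost-effectiveness acceptability curve from posterior draws of incremental
# effects and costs. The draws here are simulated placeholders.
import numpy as np

rng = np.random.default_rng(4)
delta_e = rng.normal(0.05, 0.02, size=5000)   # incremental effectiveness (QALYs)
delta_c = rng.normal(500.0, 200.0, size=5000) # incremental cost

wtp_grid = np.linspace(0, 50000, 101)          # willingness-to-pay thresholds
# P(incremental net benefit > 0) at each threshold k, with INB = k*delta_e - delta_c
ceac = [(k * delta_e - delta_c > 0).mean() for k in wtp_grid]
print(ceac[:5])
```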
This volume systematically details both the basic principles and new developments in Data Envelopment Analysis (DEA), offering a solid understanding of the methodology, its uses, and its potential. New material in this edition includes coverage of recent developments that have greatly extended the power and scope of DEA and have led to new directions for research and DEA uses. Each chapter accompanies its developments with simple numerical examples and discussions of actual applications. The first nine chapters cover the basic principles of DEA, while the final seven chapters provide a more advanced treatment.
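As a minimal, hedged sketch of the basic DEA building block (not an example from the book), the following Python code solves the input-oriented CCR envelopment linear program for one decision-making unit with toy data.

```python
# Minimal sketch (not from the book): the input-oriented CCR envelopment LP for
# one decision-making unit (DMU), solved with SciPy. Data are illustrative.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0],    # inputs, shape (m inputs, n DMUs)
              [3.0, 2.0, 7.0]])
Y = np.array([[1.0, 1.0, 2.0]])   # outputs, shape (s outputs, n DMUs)
m, n = X.shape
s = Y.shape[0]
k = 2                              # evaluate DMU index 2

# Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
c = np.r_[1.0, np.zeros(n)]
# Input constraints:  X @ lambda - theta * x_k <= 0
A_in = np.c_[-X[:, k], X]
b_in = np.zeros(m)
# Output constraints: -Y @ lambda <= -y_k  (i.e. Y @ lambda >= y_k)
A_out = np.c_[np.zeros(s), -Y]
b_out = -Y[:, k]

res = linprog(c,
              A_ub=np.vstack([A_in, A_out]),
              b_ub=np.r_[b_in, b_out],
              bounds=[(0, None)] * (n + 1))
print("Efficiency score of DMU", k, ":", res.x[0])
```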
This book provides the tools and concepts necessary to study the behavior of econometric estimators and test statistics in large samples. An econometric estimator is a solution to an optimization problem; that is, a problem that requires a body of techniques to determine the specific solution, within a defined set of possible alternatives, that best satisfies a selected objective function or set of constraints. Thus, this highly mathematical book investigates large-sample situations in which the assumptions of the classical linear model fail. Economists, of course, face these situations often.
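As a small illustration of this view of an estimator as the solution to an optimization problem (the example is not from the book), the Python sketch below recovers the OLS estimator by directly minimizing the sum of squared residuals on simulated data.

```python
# Minimal sketch (not from the book): the OLS estimator written explicitly as
# the solution to an optimization problem, min_b sum_t (y_t - x_t'b)^2.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 1000
X = np.c_[np.ones(n), rng.normal(size=(n, 2))]
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)

objective = lambda b: np.sum((y - X @ b) ** 2)   # the selected objective function
beta_hat = minimize(objective, x0=np.zeros(3)).x
print(beta_hat)   # close to beta_true in large samples (consistency)
```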
Presents recent developments of probabilistic assessment of systems dependability based on stochastic models, including graph theory, finite state automaton and language theory, for both dynamic and hybrid contexts.
Measurement in Economics: A Handbook aims to serve as a source, reference, and teaching supplement for quantitative empirical economics, inside and outside the laboratory. Covering an extensive range of fields in economics, including econometrics, actuarial science, experimental economics, index theory, national accounts, and economic forecasting, it is the first book to take measurement in economics as its central focus. It shows how different and sometimes distinct fields share the same kinds of measurement problems, and how the treatment of these problems in one field can serve as guidance in other fields. The volume provides comprehensive and up-to-date surveys of recent developments in economic measurement, written at a level intended for professional use by economists, econometricians, statisticians and social scientists.
This book presents a novel approach to time series econometrics, which studies the behavior of nonlinear stochastic processes. This approach allows for an arbitrary dependence structure in the increments and provides a generalization with respect to the standard linear independent increments assumption of classical time series models. The book offers a solution to the problem of a general semiparametric approach, which is given by a concept called C-convolution (convolution of dependent variables), and the corresponding theory of convolution-based copulas. Intended for econometrics and statistics scholars with a special interest in time series analysis and copula functions (or other nonparametric approaches), the book is also useful for doctoral students with a basic knowledge of copula functions wanting to learn about the latest research developments in the field.
Putting Econometrics in its Place is an original and fascinating book in which Peter Swann argues that econometrics has dominated applied economics for far too long and has displaced other essential techniques. While Swann is critical of the monopoly that econometrics currently holds in applied economics, the more important and positive contribution of the book is to propose a new direction and a new attitude for applied economics. The advance of econometrics from its early days has been a massive achievement, but it has also been problematic; practical results from the use of econometrics are often disappointing. The author argues that to get applied economics back on course, economists must use a much wider variety of research techniques, and must once again learn to respect vernacular knowledge of the economy. This vernacular includes the knowledge gathered by ordinary people from their everyday interactions with markets. While vernacular knowledge is often unsystematic and informal, it offers insights that can never be found through formal analysis alone. As a serious, original and sometimes contentious book, it will attract a varied and international readership. Scholars throughout the many fields of economics, both skilled and unskilled in econometrics, are likely to be intrigued by the serious alternative approaches outlined within the book. It will also appeal to communities of economists outside economics departments, in government, industry and business, as well as in business and management schools. Research centres for applied economics, policy research and innovation research will also find it of interest, due to its focus on getting reliable results rather than on methodological orthodoxy for its own sake.
This book discusses the problem of model choice when the statistical models are separate, also called nonnested. Chapter 1 provides an introduction, motivating examples and a general overview of the problem. Chapter 2 presents the classical or frequentist approach to the problem as well as several alternative procedures and their properties. Chapter 3 explores the Bayesian approach, the limitations of the classical Bayes factors and the proposed alternative Bayes factors to overcome these limitations. It also discusses a significance Bayesian procedure. Lastly, Chapter 4 examines the pure likelihood approach. Various real-data examples and computer simulations are provided throughout the text.
This book provides advanced theoretical and applied tools for the implementation of modern micro-econometric techniques in evidence-based program evaluation for the social sciences. The author presents a comprehensive toolbox for designing rigorous and effective ex-post program evaluation using the statistical software package Stata. For each method, a statistical presentation is developed, followed by a practical estimation of the treatment effects. By using both real and simulated data, readers become familiar with evaluation techniques such as regression adjustment, matching, difference-in-differences, instrumental variables and regression discontinuity design, and are given practical guidelines for selecting and applying suitable methods for specific policy contexts.
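The book works in Stata; purely as an illustrative analogue (not the author's code), the Python sketch below obtains a two-period difference-in-differences estimate from the interaction term in an OLS regression on simulated data with a known treatment effect.

```python
# Minimal sketch (the book works in Stata; this is a Python analogue): a
# two-period difference-in-differences estimate via the interaction term in OLS.
# Data are simulated with a true treatment effect of 2.0.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
})
df["y"] = (1.0 + 0.5 * df.treated + 0.3 * df.post
           + 2.0 * df.treated * df.post + rng.normal(size=n))

fit = smf.ols("y ~ treated * post", data=df).fit()
print(fit.params["treated:post"])   # difference-in-differences estimate
```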
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference “Marshall-Olkin Distributions: Advances in Theory and Applications,” held in Bologna on October 2-3, 2013.
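As a brief, hedged illustration of the singular component mentioned above (not an example from the volume), the following Python snippet simulates the bivariate Marshall-Olkin exponential distribution via its common-shock construction; the intensities are illustrative.

```python
# Minimal sketch (not from the volume): simulating the bivariate Marshall-Olkin
# exponential distribution, whose common-shock construction produces a
# singular component (a positive probability that X == Y).
import numpy as np

rng = np.random.default_rng(7)
lam1, lam2, lam12 = 1.0, 2.0, 0.5       # illustrative shock intensities
n = 100_000
e1 = rng.exponential(1 / lam1, n)       # shock hitting component 1 only
e2 = rng.exponential(1 / lam2, n)       # shock hitting component 2 only
e12 = rng.exponential(1 / lam12, n)     # common shock hitting both
X, Y = np.minimum(e1, e12), np.minimum(e2, e12)

print("P(X == Y) approx:", np.mean(X == Y))   # about lam12 / (lam1 + lam2 + lam12)
```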
This book assesses how efficient primary and upper primary education is across different states of India, considering both output-oriented and input-oriented measures of technical efficiency. It identifies the most important factors that could produce differential efficiency among the states, including the effects of central grants, school-specific infrastructure, social indicators and policy variables, as well as state-specific factors such as per capita net state domestic product from the service sector, inequality in the distribution of income (Gini coefficient), the percentage of people living below the poverty line and the density of population. The study covers the period 2005-06 to 2010-11 and all the states and union territories of India, which are categorized into two separate groups, namely: (i) General Category States (GCS); and (ii) Special Category States (SCS) and Union Territories (UT). It uses non-parametric Data Envelopment Analysis (DEA) and obtains the Technology Closeness Ratio (TCR), which measures whether the maximum output producible from an input bundle by a school within a given group is as high as what could be produced if the school could choose to join the other group. The major departure of this book is its approach to estimating technical efficiency (TE), which does not use a single frontier encompassing all the states and UT, as is done in the available literature. Rather, this method assumes that GCS, SCS and UT are not homogeneous and operate under different fiscal and economic conditions.