This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi-likelihood methods, sequential Markov chain Monte Carlo methods and particle filters, are also covered, providing an overall view of the available tools for parameter estimation in nonlinear models. A chapter on integer time series models based on several thinning operations, which brings together recent advances in this area, is also included. Readers should have taken a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. The book is a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time series.
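To make the thinning operation mentioned above concrete, the following is a minimal sketch of a first-order integer-valued autoregression (INAR(1)) built from binomial thinning; the Poisson innovation and the parameter values are illustrative assumptions, not an example taken from the book.

```python
# Minimal INAR(1) sketch: X_t = alpha o X_{t-1} + eps_t, where "alpha o X" is
# binomial thinning (each of the X_{t-1} counts survives with probability alpha)
# and eps_t ~ Poisson(lam). Illustrative assumptions, not the book's own example.
import numpy as np

rng = np.random.default_rng(42)

def simulate_inar1(alpha, lam, n, x0=0):
    """Simulate n observations of an INAR(1) process with Poisson innovations."""
    x = np.empty(n, dtype=int)
    prev = x0
    for t in range(n):
        survivors = rng.binomial(prev, alpha)   # binomial thinning of X_{t-1}
        prev = survivors + rng.poisson(lam)     # add an integer-valued innovation
        x[t] = prev
    return x

series = simulate_inar1(alpha=0.6, lam=2.0, n=500)
print(series[:20], "sample mean:", series.mean())  # stationary mean is lam/(1-alpha) = 5
```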
* Accessible to a general audience with some background in statistics and computing * Many examples and extended case studies * Illustrations using R and RStudio * A true blend of statistics and computer science, not just a grab bag of topics from each
The volume aims at providing an outlet for some of the best papers presented at the 15th Annual Conference of the African Econometric Society, which is one of the "chapters" of the International Econometric Society. Many of these papers represent the state of the art in financial econometrics and applied econometric modeling, and some also provide useful simulations that shed light on the models' ability to generate meaningful scenarios for forecasting and policy analysis.
The book provides an extensive discussion of asymptotic theory of M-estimators in the context of dynamic nonlinear models. The class of M-estimators contains least mean distance estimators (including maximum likelihood estimators) and generalized method of moments estimators. In addition to establishing the asymptotic properties of such estimators, the book provides a detailed discussion of the statistical and probabilistic tools necessary for such an analysis. The book also gives a careful treatment of estimators of asymptotic variance covariance matrices for dependent processes.
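For orientation, a generic M-estimator in this setting minimizes a sample criterion and, under regularity conditions, is asymptotically normal with a sandwich-form variance (standard notation, not necessarily the book's):

\[
\hat{\theta}_n = \arg\min_{\theta \in \Theta} \frac{1}{n}\sum_{t=1}^{n} q(z_t, \theta),
\qquad
\sqrt{n}\,\bigl(\hat{\theta}_n - \theta_0\bigr) \xrightarrow{d} N\bigl(0,\; A_0^{-1} B_0 A_0^{-1}\bigr),
\]

where $A_0$ is the limit of the averaged Hessian of the criterion and $B_0$ is the long-run variance of the scaled score. For dependent data, $B_0$ involves autocovariances, which is why consistent estimation of asymptotic variance-covariance matrices for dependent processes requires the careful treatment the book provides.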
Are there distinct European traditions in economics? Is modern economics homogenous and American? The volume includes case studies of the UK, Sweden, the Netherlands, Belgium, Germany, France, Italy, Portugal, Spain and Greece. Each of these examines the conditions relating to the supply of, and demand for, economists. These include: the growth of higher education, the development of postgraduate training in economics, international linkages, both within Europe and outside it, economic ideas and professionalization, and involvement in economic policy-making and public affairs. Whilst each chapter is attentive to particular national features, each also places the development of economics in the context of the postwar movement towards European integration.
This book provides the tools and concepts necessary to study the behavior of econometric estimators and test statistics in large samples. An econometric estimator is a solution to an optimization problem; that is, a problem that requires a body of techniques to determine the specific solution, within a defined set of possible alternatives, that best satisfies a selected objective function or set of constraints. Thus, this highly mathematical book investigates large-sample situations in which the assumptions of the classical linear model fail. Economists, of course, face these situations often.
Financial Asset Pricing Theory offers a comprehensive overview of the classic and the current research in theoretical asset pricing. Asset pricing is developed around the concept of a state-price deflator which relates the price of any asset to its future (risky) dividends and thus incorporates how to adjust for both time and risk in asset valuation. The willingness of any utility-maximizing investor to shift consumption over time defines a state-price deflator which provides a link between optimal consumption and asset prices that leads to the Consumption-based Capital Asset Pricing Model (CCAPM). A simple version of the CCAPM cannot explain various stylized asset pricing facts, but these asset pricing 'puzzles' can be resolved by a number of recent extensions involving habit formation, recursive utility, multiple consumption goods, and long-run consumption risks. Other valuation techniques and modelling approaches (such as factor models, term structure models, risk-neutral valuation, and option pricing models) are explained and related to state-price deflators. The book will serve as a textbook for an advanced course in theoretical financial economics in a PhD or a quantitative Master of Science program. It will also be a useful reference book for researchers and finance professionals. The presentation in the book balances formal mathematical modelling and economic intuition and understanding. Both discrete-time and continuous-time models are covered. The necessary concepts and techniques concerning stochastic processes are carefully explained in a separate chapter so that only limited previous exposure to dynamic finance models is required.
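As a compact reminder of the central relation (written here in standard notation, which may differ from the book's), a state-price deflator $\zeta$ prices any asset by discounting its risky payoff:

\[
P_t = \mathrm{E}_t\!\left[\frac{\zeta_{t+1}}{\zeta_t}\,\bigl(P_{t+1} + D_{t+1}\bigr)\right],
\]

so both the time value of money and the riskiness of the dividend stream enter through the distribution of $\zeta_{t+1}/\zeta_t$; in the consumption-based model this ratio is the investor's intertemporal marginal rate of substitution, which is what links optimal consumption to the CCAPM.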
"Advances in Econometrics and Quantitative Economics" is a comprehensive guide to the statistical methods used in econometrics and quantitative economics. Bringing together contributions from those acknowledged to be amongst the world's leading econometricians and statisticians this volume covers topics such as: * Semiparametric and non-parametric interference. The book is dedicated to Professor C. R. Rao, whose unique contribution to the subject has influenced econometricians for many years.
Understanding why so many people across the world are so poor is one of the central intellectual challenges of our time. This book provides the tools and data that will enable students, researchers and professionals to address that issue. Empirical Development Economics has been designed as a hands-on teaching tool to investigate the causes of poverty. The book begins by introducing the quantitative approach to development economics. Each section uses data to illustrate key policy issues. Part One focuses on the basics of understanding the role of education, technology and institutions in determining why incomes differ so much across individuals and countries. In Part Two, the focus is on techniques to address a number of topics in development, including how firms invest, how households decide how much to spend on their children's education, whether microcredit helps the poor, whether food aid works, who gets private schooling and whether property rights enhance investment. A distinctive feature of the book is its presentation of a range of approaches to studying development questions. Development economics has undergone a major change in focus over the last decade with the rise of experimental methods to address development issues; this book shows how these methods relate to more traditional ones. Please visit the book's website at www.empiricalde.com for online supplements including Stata files and solutions to the exercises.
Maurice Potron (1872-1942), a French Jesuit mathematician, constructed and analyzed a highly original, but virtually unknown economic model. This book presents translated versions of all his economic writings, preceded by a long introduction which sketches his life and environment based on extensive archival research and family documents. Potron had no education in economics and almost no contact with the economists of his time. His primary source of inspiration was the social doctrine of the Church, which had been updated at the end of the nineteenth century. Faced with the 'economic evils' of his time, he reacted by utilizing his talents as a mathematician and an engineer to invent and formalize a general disaggregated model in which production, employment, prices and wages are the main unknowns. He introduced four basic principles or normative conditions ('sufficient production', the 'right to rest', 'justice in exchange', and the 'right to live') to define satisfactory regimes of production and labour on the one hand, and of prices and wages on the other. He studied the conditions for the existence of these regimes, both on the quantity side and the value side, and he explored the way to implement them. This book makes it clear that Potron was the first author to develop a full input-output model, to use the Perron-Frobenius theorem in economics, to state a duality result, and to formulate the Hawkins-Simon condition. These are all techniques which now belong to the standard toolkit of economists. This book will be of interest to Economics postgraduate students and researchers, and will be essential reading for courses dealing with the history of mathematical economics in general, and linear production theory in particular.
* Starts from the basics, focusing less on proofs and the high-level math underlying regressions, and adopts an engaging tone to provide a text which is entirely accessible to students who don't have a stats background * New chapter on integrity and ethics in regression analysis * Each chapter offers boxed examples, stories, exercises and clear summaries, all of which are designed to support student learning * Optional appendix of statistical tools, providing a primer to readers who need it * Code in R and Stata, and data sets and exercises in Stata and CSV, to allow students to practice running their own regressions * Author-created videos on YouTube * PPT lecture slides and test bank for instructors
This book describes the new generation of discrete choice methods, focusing on the many advances that are made possible by simulation. Researchers use these statistical methods to examine the choices that consumers, households, firms, and other agents make. Each of the major models is covered: logit, generalized extreme value, or GEV (including nested and cross-nested logits), probit, and mixed logit, plus a variety of specifications that build on these basics. Simulation-assisted estimation procedures are investigated and compared, including maximum simulated likelihood, the method of simulated moments, and the method of simulated scores. Procedures for drawing from densities are described, including variance-reduction techniques such as antithetic and Halton draws. Recent advances in Bayesian procedures are explored, including the use of the Metropolis-Hastings algorithm and its variant, Gibbs sampling. The second edition adds chapters on endogeneity and expectation-maximization (EM) algorithms. No other book incorporates all these fields, which have arisen in the past 25 years. The procedures are applicable in many fields, including energy, transportation, environmental studies, health, labor, and marketing.
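The following is a minimal sketch of maximum simulated likelihood for a mixed logit with a single normally distributed random coefficient, estimated on synthetic data; the model, the parameter names and the simulation setup are illustrative assumptions rather than code from the book.

```python
# Maximum simulated likelihood for a mixed logit with one random coefficient
# beta_n ~ N(b, s^2); synthetic data, illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, J, R = 500, 3, 200            # decision makers, alternatives, simulation draws

x = rng.normal(size=(N, J))      # one attribute per alternative
true_b, true_s = 1.0, 0.5
beta_n = true_b + true_s * rng.normal(size=(N, 1))
util = beta_n * x + rng.gumbel(size=(N, J))   # Gumbel errors give logit kernels
choice = util.argmax(axis=1)                  # observed choices

draws = rng.normal(size=(N, R))  # fixed draws, reused across optimizer iterations

def neg_sim_loglik(theta):
    b, s = theta[0], np.exp(theta[1])         # s parametrized on the log scale
    beta = b + s * draws                      # (N, R) coefficient draws
    v = beta[:, :, None] * x[:, None, :]      # (N, R, J) systematic utilities
    p = np.exp(v - v.max(axis=2, keepdims=True))
    p /= p.sum(axis=2, keepdims=True)         # logit probabilities for each draw
    p_chosen = p[np.arange(N), :, choice]     # (N, R) probability of chosen alternative
    return -np.log(p_chosen.mean(axis=1)).sum()   # negated simulated log likelihood

res = minimize(neg_sim_loglik, x0=np.array([0.0, 0.0]), method="BFGS")
print("estimated mean:", res.x[0], "estimated sd:", np.exp(res.x[1]))
```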
Quantile regression has emerged as an essential statistical tool of contemporary empirical economics and biostatistics. Complementing classical least squares regression methods which are designed to estimate conditional mean models, quantile regression provides an ensemble of techniques for estimating families of conditional quantile models, thus offering a more complete view of the stochastic relationship among variables. This volume collects 12 outstanding empirical contributions in economics and offers an indispensable introduction to interpretation, implementation, and inference aspects of quantile regression.
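For reference, the conditional quantile estimator at level $\tau$ minimizes the asymmetric 'check' loss (standard notation, not specific to this volume):

\[
\hat{\beta}(\tau) = \arg\min_{\beta} \sum_{i=1}^{n} \rho_\tau\bigl(y_i - x_i^{\top}\beta\bigr),
\qquad
\rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\]

which reduces to median (least absolute deviations) regression at $\tau = 1/2$ and, as $\tau$ varies over $(0,1)$, traces out the family of conditional quantile functions that complements the conditional mean estimated by least squares.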
The major methodological task for modern economists has been to establish the testability of models. Too often, however, methodological assumptions can make a model virtually impossible to test even under ideal conditions, yet few theorists have examined the requirements and problems of assuring testability in economics. In The Methodology of Economic Model Building, first published in 1989, Lawrence Boland presents the results of a research project that spanned more than twenty years. He examines how economists have applied the philosophy of Karl Popper, relating methodological debates about falsifiability to wider discussions about the truth status of models in natural and social sciences. He concludes that model building in economics reflects more the methodological prescriptions of the economist Paul Samuelson than Popper's 'falsificationism'. This title will prove invaluable to both students and researchers, and represents a substantial contribution to debates about the scientific status of economics.
Environmental risk directly affects the financial stability of banks, since they bear the financial consequences of the loss of liquidity of the entities to which they lend, as well as the financial penalties imposed for failing to comply with regulations or for actions that harm the natural environment. This book explores the impact of environmental risk on the banking sector and analyzes strategies to mitigate this risk, with a special emphasis on the role of modelling. It argues that environmental risk modelling allows banks to estimate the patterns and consequences of environmental risk on their operations, and to take measures within the context of asset and liability management to minimize the likelihood of losses. An important role here is played by the environmental risk modelling methodology, as well as the software and the mathematical and econometric models used. The book examines banks' responses to macroprudential risk, particularly from the point of view of their adaptation strategies; the mechanisms of its spread; risk management and modelling; and sustainable business models. It introduces the basic concepts, definitions, and regulations concerning this type of risk, within the context of its influence on the banking industry. The book takes a combined quantitative and qualitative approach and proposes a new methodology for environmental risk management and modelling in the banking sector. As such, it will appeal to researchers, scholars, and students of environmental economics, finance and banking, sociology, law, and political sciences.
This book presents an exciting new set of econometric methods. They have been developed as a result of the increase in power and affordability of computers which allow simulations to be run. The authors have played a large role in developing the techniques.
Financial Economics and Econometrics provides an overview of the core topics in theoretical and empirical finance, with an emphasis on applications and interpreting results. Structured in five parts, the book covers financial data and univariate models; asset returns; interest rates, yields and spreads; volatility and correlation; and corporate finance and policy. Each chapter begins with a theory in financial economics, followed by econometric methodologies which have been used to explore the theory. Next, the chapter presents empirical evidence and discusses seminal papers on the topic. Boxes offer insights on how an idea can be applied to other disciplines such as management, marketing and medicine, showing the relevance of the material beyond finance. Readers are supported with plenty of worked examples and intuitive explanations throughout the book, while key takeaways, 'test your knowledge' and 'test your intuition' features at the end of each chapter also aid student learning. Digital supplements including PowerPoint slides, computer codes supplements, an Instructor's Manual and Solutions Manual are available for instructors. This textbook is suitable for upper-level undergraduate and graduate courses on financial economics, financial econometrics, empirical finance and related quantitative areas.
Advanced and Multivariate Statistical Methods, Seventh Edition provides conceptual and practical information regarding multivariate statistical techniques to students who do not necessarily need technical and/or mathematical expertise in these methods. This text has three main purposes. The first purpose is to facilitate conceptual understanding of multivariate statistical methods by limiting the technical nature of the discussion of those concepts and focusing on their practical applications. The second purpose is to provide students with the skills necessary to interpret research articles that have employed multivariate statistical techniques. Finally, the third purpose of AMSM is to prepare graduate students to apply multivariate statistical methods to the analysis of their own quantitative data or that of their institutions. New to the Seventh Edition: All references to SPSS have been updated to Version 27.0 of the software. A brief discussion of practical significance has been added to Chapter 1. New data sets have now been incorporated into the book and are used extensively in the SPSS examples. All the SPSS data sets utilized in this edition are available for download via the companion website. Additional resources on this site include several video tutorials/walk-throughs of the SPSS procedures. These "how-to" videos run approximately 5-10 minutes in length. Advanced and Multivariate Statistical Methods was written for use by students taking a multivariate statistics course as part of a graduate degree program, for example in psychology, education, sociology, criminal justice, social work, mass communication, and nursing.
This book contains a set of notes prepared by Ragnar Frisch for a lecture series that he delivered at Yale University in 1930. The lecture notes provide not only a valuable source document for the history of econometrics, but also a more systematic introduction to some of Frisch's key methodological ideas than his other works so far published in various media for the econometrics community. In particular, these notes contain a number of prescient ideas precursory to some of the most important notions developed in econometrics during the 1970s and 1980s. More remarkably, Frisch demonstrated a deep understanding of what econometric or statistical analysis could achieve in situations where correct theoretical models were not known. This volume has been rigorously edited and comes with an introductory essay from Olav Bjerkholt and Duo Qin placing the notes in their historical context.
Master key spreadsheet and business analytics skills with SPREADSHEET MODELING AND DECISION ANALYSIS: A PRACTICAL INTRODUCTION TO BUSINESS ANALYTICS, 9E, written by respected business analytics innovator Cliff Ragsdale. This edition's clear presentation, realistic examples, fascinating topics and valuable software provide everything you need to become proficient in today's most widely used business analytics techniques using the latest version of Excel® in Microsoft® Office 365 or Office 2019. Become skilled in the newest Excel functions as well as the Analytic Solver® and Data Mining add-ins. This edition helps you develop both algebraic and spreadsheet modeling skills. Step-by-step instructions and annotated, full-color screen images make examples easy to follow and show you how to apply what you learn about descriptive, predictive and prescriptive analytics to real business situations. WebAssign online tools and author-created videos further strengthen understanding.
Explains modern statistical disclosure control (SDC) techniques for data stewards and develops tools to implement them. Explains the logic behind modern privacy protections for researchers and how they may use publicly released data to generate valid statistical inferences, as well as the limitations imposed by SDC techniques.
The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians and statisticians, and in advanced graduate econometrics courses. For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/he
First published in 1985, Advances in Monetary Economics draws together papers given at the 1984 Money Study Group Conference and additional papers presented in seminars of the same year. The book includes papers on theoretical, empirical and institutional aspects of monetary economics. Each chapter displays a concern with policy in the monetary sphere, both with regard to macroeconomic questions of monetary and fiscal management, and to issues of policy at the microeconomic level towards financial institutions and markets. In doing so, the book highlights the importance of monetary economics in policy issues. Advances in Monetary Economics has enduring relevance for those with an interest in the history and development of monetary economics.
This book provides an accessible guide to price index and hedonic techniques, with a focus on how to best apply these techniques and interpret the resulting measures. One goal of this book is to provide first-hand experience at constructing these measures, with guidance on practical issues such as what the ideal data would look like and how best to construct these measures when the data are less than ideal. A related objective is to fill the wide gulf between the necessarily simplistic elementary treatments in textbooks and the very complex discussions found in the theoretical and empirical measurement literature. Here, the theoretical results are summarized in an intuitive way and their numerical importance is illustrated using data and results from existing studies. Finally, while the aim of much of the existing literature is to better understand official price indexes like the Consumer Price Index, the emphasis here is more practical: to provide the needed tools for individuals to apply these techniques on their own. As new datasets become increasingly accessible, tools like these will be needed to obtain summary price measures. Indeed, these techniques have been applied for years in antitrust cases that involve pricing, where economic experts typically have access to large, granular datasets.
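As a small illustration of the kind of index-number calculation discussed here, the following sketch computes the standard Laspeyres, Paasche and Fisher bilateral indexes on made-up matched-model data; the numbers and variable names are assumptions for illustration, not data from the book.

```python
# Standard bilateral price-index formulas on illustrative matched-model data.
import numpy as np

# Prices and quantities for the same items in a base period (0) and a comparison period (1).
p0 = np.array([10.0, 4.0, 2.5])
q0 = np.array([100.0, 250.0, 400.0])
p1 = np.array([11.0, 3.8, 2.9])
q1 = np.array([90.0, 270.0, 380.0])

laspeyres = (p1 @ q0) / (p0 @ q0)       # base-period quantity weights
paasche = (p1 @ q1) / (p0 @ q1)         # comparison-period quantity weights
fisher = np.sqrt(laspeyres * paasche)   # geometric mean of the two

print(f"Laspeyres {laspeyres:.4f}  Paasche {paasche:.4f}  Fisher {fisher:.4f}")
```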
The results of the 1959 Glasgow University investigation into British industrial profit, business saving, and investment are the subject of this book, originally published in 1965. Part 1 presents original estimates of profits in British industries 1920-1938 which, when linked with Government estimates of such profits since 1948, permit long-run studies of the fortunes of individual industries. In addition, the appropriation of profit between dividends and business saving is also estimated for manufacturing industry 1920-1938. Part 2 begins the analysis of the extensive financial data collected in the Glasgow enquiry and is concerned with the effects of the size of a firm on its financial performance. The financial performance of large companies quoted on the Stock Exchange is compared with that of a sample of small unquoted private companies and unincorporated firms.