This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth and performance. These concepts are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and the methodological approaches adopted by the analyst. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement - deterministic benchmarking and stochastic benchmarking - in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
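The blurb above mentions the output/input ratio and productivity indexes as basic performance measures. A minimal sketch of both, using invented quantities and prices (none of the data below come from the Handbook): the single-input productivity ratio, and a Tornqvist-style quantity index between two periods.

```python
import math

# single-output, single-input case: productivity is just the ratio
# (hypothetical numbers: 500 units produced with 40 labour hours)
output, labour_hours = 500.0, 40.0
productivity = output / labour_hours  # units of output per hour

# multi-output Tornqvist quantity index between period 0 and period 1:
#   ln(Q1/Q0) = sum_i 0.5*(s_i0 + s_i1) * ln(q_i1/q_i0),
# where s_it is good i's revenue share in period t (all data hypothetical)
q0, q1 = [100.0, 50.0], [110.0, 60.0]   # quantities of two goods
p0, p1 = [2.0, 4.0], [2.2, 3.8]         # prices of the two goods
r0 = [p * q for p, q in zip(p0, q0)]    # period-0 revenues
r1 = [p * q for p, q in zip(p1, q1)]    # period-1 revenues
s0 = [r / sum(r0) for r in r0]          # period-0 revenue shares
s1 = [r / sum(r1) for r in r1]          # period-1 revenue shares
log_index = sum(0.5 * (a + b) * math.log(y / x)
                for a, b, x, y in zip(s0, s1, q0, q1))
tornqvist = math.exp(log_index)
print(round(productivity, 2), round(tornqvist, 4))
```

The index weights each good's quantity growth by its average revenue share across the two periods, which is why it is often preferred to a fixed-basket index when output composition shifts.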
Petri nets (PN) were defined for the study of discrete event systems and were later extended for many purposes, including dependability assessment. To our knowledge, no book deals specifically with the use of different types of PN for dependability. This book additionally focuses on the adequacy of Petri net types for the study of various problems related to dependability, such as risk analysis and probabilistic assessment. In the first part, the basic models of PN and some useful extensions are briefly recalled. In the second part, PN are used as a formal model to describe the evolution process of critical systems within an ontological approach. The third part focuses on stochastic Petri nets (SPN) and their use in dependability assessment. Different formal models of SPN are formally presented (semantics, evolution rules, etc.), together with their equivalence to the corresponding class of Markov processes, which yields an analytical assessment of dependability. Simplification methods are proposed to reduce the size of the analytical model and make it more tractable. The introduction of some concepts specific to high-level PN also allows complex systems to be considered. A few applications in the fields of instrumentation and control (I&C) systems and safety instrumented systems (SIS) emphasize the benefits of SPN for dependability assessment.
This book proposes new solutions to the problem of poverty, beginning with analyses, most of which are set in the context of the digital era. In addition to a scientific organization, the book follows a spatial-geographical one: it analyses countries of the European Union as well as South Africa, while referring to two main variables, television and art, as agents of poverty alleviation. The book places particular focus on how poverty is understood in the framework of Industry 4.0. It introduces a new, expanded Multidimensional Poverty Index with more than 20 dimensions; moreover, it provides a mathematically based solution for the disposal of perishable food. Finally, it does not disregard a crucial aspect of the issue of poverty: that of education planning. This book is of interest to specialists in poverty research, from students to professionals and from professors to activists, without excluding engineers.
Methods for Estimation and Inference in Modern Econometrics provides a comprehensive introduction to a wide range of emerging topics, such as generalized empirical likelihood estimation and alternative asymptotics under drifting parameterizations, which have not been discussed in detail outside of highly technical research papers. The book also addresses several problems often arising in the analysis of economic data, including weak identification, model misspecification, and possible nonstationarity. The book's appendix provides a review of some basic concepts and results from linear algebra, probability theory, and statistics that are used throughout the book.
Offering a unified approach to studying econometric problems, Methods for Estimation and Inference in Modern Econometrics links most of the existing estimation and inference methods in a general framework to help readers synthesize all aspects of modern econometric theory. Various theoretical exercises and suggested solutions are included to facilitate understanding.
Complex dynamics constitute a growing and increasingly important area as they offer a strong potential to explain and formalize natural, physical, financial and economic phenomena. This book pursues the ambitious goal to bring together an extensive body of knowledge regarding complex dynamics from various academic disciplines. Beyond its focus on economics and finance, including for instance the evolution of macroeconomic growth models towards nonlinear structures as well as signal processing applications to stock markets, fundamental parts of the book are devoted to the use of nonlinear dynamics in mathematics, statistics, signal theory and processing. Numerous examples and applications, almost 700 illustrations and numerical simulations based on the use of Matlab make the book an essential reference for researchers and students from many different disciplines who are interested in the nonlinear field. An appendix recapitulates the basic mathematical concepts required to use the book.
This book addresses the ultimate goal of economic studies: to predict how the economy develops, and what will happen if we implement different policies. To do that, we need a good understanding of what causes what in economics. Prediction and causality in economics are the main topics of this book's chapters; they use both more traditional and more innovative techniques, including quantum ideas, to make predictions about the world economy (international trade, exchange rates), about a country's economy (gross domestic product, stock index, inflation rate), and about individual enterprises, banks, and micro-finance institutions: their future performance (including the risk of bankruptcy), their stock prices, and their liquidity. Several papers study how COVID-19 has influenced the world economy. This book helps practitioners and researchers learn more about prediction and causality in economics, and to further develop this important research direction.
Principles of Econometrics, 4th Edition, is an introductory book on econometrics for economics and finance, designed to provide an understanding of why econometrics is necessary and a working knowledge of basic econometric tools. This latest edition is updated to reflect the current state of economic and financial markets and provides new content on Kernel Density Fitting and the Analysis of Treatment Effects. It offers new end-of-chapter questions and problems in each chapter, an updated comprehensive Glossary of Terms, and a summary of Probability and Statistics. The text applies basic econometric tools to modeling, estimation, inference, and forecasting through real-world problems, and critically evaluates the results and conclusions of others who use basic econometric tools. Furthermore, it provides a foundation and understanding for further study of econometrics and more advanced techniques.
This collection of original articles, eight years in the making, shines a bright light on recent advances in financial econometrics. From a survey of mathematical and statistical tools for understanding nonlinear Markov processes to an exploration of the time-series evolution of the risk-return tradeoff for stock market investment, noted scholars Yacine Ait-Sahalia and Lars Peter Hansen benchmark the current state of knowledge while contributors build a framework for its growth. Whether in the presence of statistical uncertainty or the proven advantages and limitations of value at risk models, readers will discover that they can set few constraints on the value of this long-awaited volume.
This advanced undergraduate/graduate textbook teaches students in finance and economics how to use R to analyse financial data and implement financial models. It demonstrates how to take publicly available data, manipulate it, implement models, and generate outputs typical of particular analyses. A wide spectrum of timely and practical issues in financial modelling is covered, including return and risk measurement, portfolio management, option pricing and fixed income analysis. This new edition updates and expands upon the existing material, providing updated examples and new chapters on equities, simulation and trading strategies, including machine learning techniques. Select data sets are available online.
The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world's best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists - especially those interested in macroeconomics and the history of economic thought - with the first in-depth analysis of Oxford economics.
Presents recent developments of probabilistic assessment of systems dependability based on stochastic models, including graph theory, finite state automaton and language theory, for both dynamic and hybrid contexts.
This book overviews the latest ideas and developments in financial econometrics, with an emphasis on how to best use prior knowledge (e.g., the Bayesian way) and how to best use successful data processing techniques from other application areas (e.g., from quantum physics). The book also covers applications to economy-related phenomena ranging from traditionally analyzed phenomena such as manufacturing, the food industry, and taxes, to newer-to-analyze phenomena such as cryptocurrencies, influencer marketing, the COVID-19 pandemic, financial fraud detection, corruption, and the shadow economy. This book will inspire practitioners to learn how to apply state-of-the-art Bayesian, quantum, and related techniques to economic and financial problems, and inspire researchers to further improve the existing techniques and come up with new techniques for studying economic and financial phenomena. The book will also be of interest to students interested in the latest ideas and results.
This handbook presents emerging research exploring the theoretical and practical aspects of econometric techniques for the financial sector and their applications in economics. By doing so, it offers invaluable tools for predicting and weighing the risks of multiple investments by incorporating data analysis. Throughout the book the authors address a broad range of topics such as predictive analysis, monetary policy, economic growth, systemic risk and investment behavior. This book is a must-read for researchers, scholars and practitioners in the field of economics who are interested in a better understanding of current research on the application of econometric methods to financial sector data.
How might one determine if a financial institution is taking risk in a balanced and productive manner? A powerful tool to address this question is economic capital, which is a model-based measure of the amount of equity that an entity must hold to satisfactorily offset its risk-generating activities. This book, with a particular focus on the credit-risk dimension, pragmatically explores real-world economic-capital methodologies and applications. It begins with the thorny practical issues surrounding the construction of an (industrial-strength) credit-risk economic-capital model, defensibly determining its parameters, and ensuring its efficient implementation. It then broadens its gaze to examine various critical applications and extensions of economic capital; these include loan pricing, the computation of loan impairments, and stress testing. Along the way, typically working from first principles, various possible modelling choices and related concepts are examined. The end result is a useful reference for students and practitioners wishing to learn more about a centrally important financial-management device.
This new textbook by Urs Birchler and Monika Butler is an introduction to the study of how information affects economic relations. The authors provide a narrative treatment of the more formal concepts of Information Economics, using easy-to-understand and lively illustrations from film and literature, and nutshell examples. The book also comes with a supporting website (www.alicebob.info), maintained by the authors.
Introduction to RATS. Stationary Time-Series. Modeling Volatility. Tests for Trends and Unit Roots. Vector Autoregression Analysis. Cointegration and Error Correction. Statistical Tables. References and Additional Readings.
This book offers a fresh perspective on the early history of macroeconomics, by examining the macro-dynamic models developed from the late 1920s to the late 1940s, and their treatment of economic instability. It first explores the differences and similarities between the early mathematical business cycle models developed by Ragnar Frisch, Michal Kalecki, Jan Tinbergen and others, which were presented at meetings of the Econometric Society and discussed in private correspondence. By doing so, it demonstrates the diversity of models representing economic phenomena and especially economic crises and instability. Jan Tinbergen emerged as one of the most original and pivotal economists of this period, before becoming a leader of the macro-econometric movement, a role for which he is better known. His emphasis on economic policy was later mirrored in the United States in Paul Samuelson's early work on business cycle analysis, which, drawing on Alvin Hansen, aimed at interpreting the 1937-1938 recession. The authors then show that the subsequent shift in Samuelson's approach, from the study of business cycle trajectories to the comparison of equilibrium points, provided a response to the econometricians' critique of early Keynesian models. In the early 1940s, Samuelson was able to link together the tools that had been developed by the econometricians and the economic content that was at the heart of the so-called Keynesian revolution. The problem then shifted from business cycle trajectories to the disequilibrium between economic aggregates, and the issues raised by the global stability of full employment equilibrium. This was addressed by Oskar Lange, who presented an analysis of market coordination failures, and Lawrence Klein, Samuelson's first PhD student, who pursued empirical work in this direction.
The book highlights the various visions and approaches embedded in these macro-dynamic models, and shows that their originality is of interest to today's model builders as well as to students and anyone interested in how new economic ideas come to be developed.
This book is a companion to Baltagi's (2008) leading graduate econometrics textbook on panel data, Econometric Analysis of Panel Data, 4th Edition. The book guides the student of panel data econometrics by solving exercises in a logical and pedagogical manner, helping the reader understand, learn and apply panel data methods. It is also a helpful tool for those who like to learn by solving exercises and running software to replicate empirical studies. It works as a complementary study guide to Baltagi (2008) and also as a stand-alone book that builds up the reader's confidence in working out difficult exercises in panel data econometrics and applying these methods to empirical work. The exercises start by providing some background information on partitioned regressions and the Frisch-Waugh-Lovell theorem. They then go through the basic material on fixed and random effects in one-way and two-way error components models: basic estimation, hypothesis testing and prediction. This includes maximum likelihood estimation, testing for poolability of the data, testing for the significance of individual and time effects, as well as Hausman's test for correlated effects. It also provides extensions of panel data techniques to serial correlation, spatial correlation, heteroskedasticity, seemingly unrelated regressions, simultaneous equations, dynamic panel models, incomplete panels, measurement error, count panels, rotating panels, limited dependent variables, and non-stationary panels. The book provides several empirical examples that are useful to applied researchers, illustrating them using Stata and EViews and showing the reader how to replicate these studies. The data sets are provided on the Wiley web site: www.wileyeurope.com/college/baltagi.
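The blurb above mentions the Frisch-Waugh-Lovell theorem as background material. A small numerical check of the theorem on simulated data (the data-generating process below is invented for illustration, not taken from the book): the OLS coefficient on x2 in a regression of y on a constant, x1 and x2 equals the coefficient from regressing the residuals of y-on-[1, x1] on the residuals of x2-on-[1, x1].

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)        # x2 correlated with x1
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

# full regression: y on [1, x1, x2]
ones = np.ones(n)
X_full = np.column_stack([ones, x1, x2])
beta_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# FWL route: partial x1 (and the constant) out of both y and x2,
# then regress residual on residual
X_short = np.column_stack([ones, x1])
def resid(v):
    return v - X_short @ np.linalg.lstsq(X_short, v, rcond=None)[0]
beta_fwl = np.linalg.lstsq(resid(x2)[:, None], resid(y), rcond=None)[0][0]

print(beta_full[2], beta_fwl)  # the two coefficients coincide
```

The equality holds exactly (up to floating-point error) for any sample, which is what makes the theorem such a useful device for partitioned regressions.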
Statistical Theories and Methods with Applications to Economics and Business highlights recent advances in statistical theory and methods that benefit econometric practice. It deals with exploratory data analysis, a prerequisite to statistical modelling and part of data mining. It provides recently developed computational tools useful for data mining, analysing the reasons to do data mining and the best techniques to use in a given situation.
* Provides a detailed description of computer algorithms.
* Provides recently developed computational tools useful for data mining.
* Highlights recent advances in statistical theory and methods that benefit econometric practice.
* Features examples with real-life data.
* Accompanying software featuring DASC (Data Analysis and Statistical Computing).
Essential reading for practitioners in any area of econometrics; business analysts involved in economics and management; and graduate students and researchers in economics and statistics.
This book presents the Multiple Criteria Decision Making (MCDM) paradigm for modelling agricultural decision-making in three parts. The first part, comprising two chapters, is philosophical in nature and deals with the concepts that define the underlying structure of the MCDM paradigm. The second part, the largest, consists of five chapters, each of which presents the logic of a specific MCDM technique and demonstrates how it can be used to model a particular decision problem. In the final part, some selected applications of the MCDM techniques to agricultural problems are presented, reinforcing the reader's understanding of the MCDM paradigm.
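A minimal sketch of one common MCDM technique, the weighted-sum model, applied to a hypothetical agricultural decision (crop choice). The crops, criteria, performance scores and weights below are all invented for illustration and are not taken from the book; each alternative gets a score equal to the weighted sum of its normalized performance on each criterion.

```python
criteria = ["profit", "water_use", "soil_health"]
weights = {"profit": 0.4, "water_use": 0.35, "soil_health": 0.25}
# water use is a cost-type criterion: lower is better
benefit = {"profit": True, "water_use": False, "soil_health": True}

# raw performance of each crop on each criterion (hypothetical data)
performance = {
    "wheat":  {"profit": 400.0, "water_use": 250.0, "soil_health": 7.0},
    "maize":  {"profit": 550.0, "water_use": 420.0, "soil_health": 5.0},
    "lentil": {"profit": 350.0, "water_use": 180.0, "soil_health": 9.0},
}

def normalize(crit):
    # min-max normalization to [0, 1], inverted for cost-type criteria
    vals = [performance[a][crit] for a in performance]
    lo, hi = min(vals), max(vals)
    return {a: ((performance[a][crit] - lo) / (hi - lo) if benefit[crit]
                else (hi - performance[a][crit]) / (hi - lo))
            for a in performance}

norm = {c: normalize(c) for c in criteria}
scores = {a: sum(weights[c] * norm[c][a] for c in criteria)
          for a in performance}
best = max(scores, key=scores.get)
print(best, {a: round(s, 3) for a, s in scores.items()})
```

The weighted-sum model is the simplest member of the MCDM family; techniques covered in books like this one (goal programming, compromise programming, multi-attribute utility) refine the same idea of trading off conflicting criteria.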
This book presents covariance matrix estimation and related aspects of random matrix theory. It focuses on the sample covariance matrix estimator and provides a holistic description of its properties under two asymptotic regimes: the traditional one, and the high-dimensional regime that better fits the big data context. It draws attention to the deficiencies of standard statistical tools when used in the high-dimensional setting, and introduces the basic concepts and major results related to spectral statistics and random matrix theory under high-dimensional asymptotics in an understandable and reader-friendly way. The aim of this book is to inspire applied statisticians, econometricians, and machine learning practitioners who analyze high-dimensional data to apply the recent developments in their work.
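A minimal illustration (an assumed setup, not an example from the book) of the deficiency the blurb alludes to: with true covariance equal to the identity, every population eigenvalue is 1, yet when the ratio p/n is not small the sample covariance eigenvalues spread out, roughly over the Marchenko-Pastur interval [(1 - sqrt(p/n))^2, (1 + sqrt(p/n))^2].

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 400, 200                      # concentration ratio p/n = 0.5
X = rng.normal(size=(n, p))          # rows: iid N(0, I_p) observations
S = X.T @ X / n                      # sample covariance estimator
eigs = np.linalg.eigvalsh(S)         # all true eigenvalues are 1

c = p / n
mp_low, mp_high = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
print(eigs.min(), eigs.max(), (mp_low, mp_high))
```

With these dimensions the smallest sample eigenvalue falls far below 1 and the largest far above it, so naive plug-in uses of S (e.g., inverting it for portfolio weights) can be badly distorted; this is exactly the high-dimensional regime the book targets.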
Essentials of Applied Econometrics prepares students for a world in which more data surround us every day and in which econometric tools are put to diverse uses. Written for students in economics and for professionals interested in continuing an education in econometrics, this succinct text not only teaches best practices and state-of-the-art techniques, but uses vivid examples and data obtained from a variety of real world sources. The book's emphasis on application uniquely prepares the reader for today's econometric work, which can include analyzing causal relationships or correlations in big data to obtain useful insights.
The individual risks faced by banks, insurers, and marketers are less well understood than aggregate risks such as market-price changes. But the risks incurred or carried by individual people, companies, insurance policies, or credit agreements can be just as devastating as macroevents such as share-price fluctuations. A comprehensive introduction, The Econometrics of Individual Risk is the first book to provide a complete econometric methodology for quantifying and managing this underappreciated but important variety of risk. The book presents a course in the econometric theory of individual risk illustrated by empirical examples. And, unlike other texts, it is focused entirely on solving the actual individual risk problems businesses confront today. Christian Gourieroux and Joann Jasiak emphasize the microeconometric aspect of risk analysis by extensively discussing practical problems such as retail credit scoring, credit card transaction dynamics, and profit maximization in promotional mailing. They address regulatory issues in sections on computing the minimum capital reserve for coverage of potential losses, and on the credit-risk measure CreditVar. The book will interest graduate students in economics, business, finance, and actuarial studies, as well as actuaries and financial analysts.