Factor models have become the most successful tool in the analysis and forecasting of high-dimensional time series. This monograph provides an extensive account of the so-called General Dynamic Factor Model methods. The topics covered include: asymptotic representation problems, estimation, forecasting, identification of the number of factors, identification of structural shocks, volatility analysis, and applications to macroeconomic and financial data.
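For orientation (a background sketch using the standard notation of this literature, not a quotation from the monograph), the general dynamic factor model splits each series in a large panel into a common and an idiosyncratic component:

$$ x_{it} = \chi_{it} + \xi_{it}, \qquad \chi_{it} = b_{i1}(L)\,u_{1t} + \cdots + b_{iq}(L)\,u_{qt}, $$

where the $q$ common shocks $u_{jt}$ are loaded through dynamic filters $b_{ij}(L)$ and the idiosyncratic components $\xi_{it}$ are only weakly correlated across series.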
A number of clubs in professional sports leagues exhibit winning streaks over a number of consecutive seasons that do not conform to the standard economic model of a professional sports league developed by El Hodiri and Quirk (1994) and Fort and Quirk (1995). These clubs seem to display what we call "unsustainable runs": a period of two to four seasons in which a club acquires expensive talent and attempts to win a league championship despite lacking the market size to sustain such a competitive position in the long run. The standard model predicts that clubs located in large economic markets will tend to acquire more talent and achieve more success on the field and at the box office than clubs located in small markets. This book builds a model that allows for unsustainable runs yet retains most of the features of the standard model, and then subjects it to empirical verification. The new model developed in the book has as its central feature the ability to generate two equilibria for a club under certain conditions. In the empirical sections, we use time-series analysis to test for the presence of unsustainable runs in historical data from the National Football League (NFL), the National Basketball Association (NBA), the National Hockey League (NHL), and Major League Baseball (MLB). The multiple-equilibria model retains all of the features of the standard model of a professional sports league, which is accepted almost universally by economists, yet it offers a much richer approach by including an exploration of the effects of revenues earned at the league level (television, apparel, naming rights, etc.) and then shared by all member clubs, making this book unique and of great interest to scholars in a variety of fields in economics.
This book presents eleven classic papers by the late Professor Suzanne Scotchmer, with introductions by leading economists and legal scholars. It introduces Scotchmer's life and work; analyses her pioneering contributions to the economics of patents and innovation incentives, with a special focus on the modern theory of cumulative innovation; and describes her pioneering work on law and economics, evolutionary game theory, and general equilibrium/club theory. It also provides a self-contained introduction for students who want to learn more about the various fields that Professor Scotchmer worked in, with a particular focus on patent incentives and cumulative innovation.
This book develops a machine-learning framework for predicting economic growth. It can also serve as a primer for using machine learning (also known as data mining or data analytics) to answer economic questions. While machine learning itself is not a new idea, advances in computing technology, combined with a dawning realization of its applicability to economic questions, make it a new tool for economists.
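As a hedged illustration of the kind of workflow such a primer covers (not the book's own code; the predictors and synthetic data below are assumptions for illustration), here is a minimal sketch that fits a tree ensemble to hypothetical country-level features of growth:

```python
# Minimal sketch: predicting a growth-like target with a tree ensemble.
# Synthetic data; illustrative only, not the book's framework.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 500
# Assumed predictors: e.g., investment share, schooling, openness, inflation.
X = rng.normal(size=(n, 4))
# Synthetic "growth" with a nonlinear signal plus noise.
y = 1.5 * X[:, 0] + np.sin(X[:, 1]) - 0.5 * X[:, 2] * X[:, 3] \
    + rng.normal(scale=0.5, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```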
Davidson and MacKinnon have written an outstanding textbook for graduates in econometrics, covering both basic and advanced topics and using geometrical proofs throughout for clarity of exposition. The book offers a unified theoretical perspective, and emphasizes the practical applications of modern theory.
This is the perfect (and essential) supplement for all econometrics classes, from a rigorous first undergraduate course to a first master's course to a PhD course.
This lively book lays out a methodology of confidence distributions and puts them through their paces. Among other merits, they lead to optimal combinations of confidence from different sources of information, and they can make complex models amenable to objective and indeed prior-free analysis for less subjectively inclined statisticians. The generous mixture of theory, illustrations, applications and exercises is suitable for statisticians at all levels of experience, as well as for data-oriented scientists. Some confidence distributions are less dispersed than their competitors. This concept leads to a theory of risk functions and comparisons for distributions of confidence. Neyman-Pearson type theorems leading to optimal confidence are developed and richly illustrated. Exact and optimal confidence distributions are the gold standard for inferred epistemic distributions. Confidence distributions and likelihood functions are intertwined, allowing prior distributions to be made part of the likelihood. Meta-analysis in likelihood terms is developed and taken beyond traditional methods, suiting it in particular to combining information across diverse data sources.
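For concreteness (a standard textbook example, not taken from this book): for a normal sample with known standard deviation $\sigma$, the canonical confidence distribution for the mean is

$$ C(\mu) = \Phi\!\left(\frac{\sqrt{n}\,(\mu - \bar{x})}{\sigma}\right), $$

which is uniformly distributed at the true mean and whose level sets reproduce the usual confidence intervals.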
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference "Marshall-Olkin Distributions: Advances in Theory and Applications", held in Bologna on October 2-3, 2013.
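As background (the classical definition, not quoted from the volume), the bivariate Marshall-Olkin exponential distribution has joint survival function

$$ \bar{F}(x, y) = \Pr(X > x,\ Y > y) = \exp\bigl(-\lambda_1 x - \lambda_2 y - \lambda_{12}\max(x, y)\bigr), \qquad x, y \ge 0, $$

where the $\max(x,y)$ term, generated by a shock common to both components, produces exactly the singular component (a positive probability that $X = Y$) and the tail dependence mentioned above.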
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
Inequality is a charged topic. Measures of income inequality rose in the USA in the 1990s to levels not seen since 1929, reviving the suspicion, not for the first time, of a link between radical inequality and financial instability, with a resulting crisis under capitalism. Professional macroeconomists have generally taken little interest in inequality because, within the parameters of traditional economic theory, the economy will stabilize itself at full employment, and enlightened economists could enact stabilizing measures to manage any imbalances. The dominant voices among academic economists were unable to interpret the causal forces at work during both the Great Depression and the recent global financial crisis. In Inequality and Instability, James K. Galbraith argues that since there has been no serious work done on the macroeconomic effects of inequality, new sources of evidence are required. Galbraith offers for the first time a vast expansion of the capacity to calculate measures of inequality at both lower and higher levels of aggregation. Instead of measuring inequality as traditionally done, by country, Galbraith insists that to understand real differences that have real effects, inequality must be examined through both smaller and larger administrative units: sub-national levels within and between states and provinces, multinational continental economies, and the world. He points out that measures across administrative boundaries can capture data on the more specific groups to which people belong. For example, in China, economic inequality reflects the difference in average income levels between city and countryside, or between coastal regions and the interior, and a simple ratio of averages would be an indicator of trends in inequality over the country as a whole. In a comprehensive presentation of this new method of using data, Inequality and Instability offers a look at the US economy and various global economies that was not accessible before, providing a more sophisticated and more accurate picture of inequality around the world and showing how inequality is one of the most basic sources of economic instability.
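Measures of this kind are commonly built from grouped decompositions; one standard instrument (stated here as background, in our notation rather than the book's) is the between-group component of the Theil index,

$$ T_B = \sum_{g} \frac{n_g\,\mu_g}{N\mu}\,\ln\frac{\mu_g}{\mu}, $$

where group $g$ (a region, sector, or province) has population $n_g$ and mean income $\mu_g$, $N$ is the total population, and $\mu$ the overall mean, so that inequality trends can be tracked from administrative data on group averages alone.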
Pioneered by American economist Paul Samuelson, revealed preference theory is based on the idea that the preferences of consumers are revealed in their purchasing behavior. Researchers in this field have developed complex and sophisticated mathematical models to capture the preferences that are 'revealed' through consumer choice behavior. This study of consumer demand and behavior is closely tied up with econometrics (especially nonparametric econometrics), where testing the validity of different theoretical models is an important aspect of research. The theory of revealed preference has a very long and distinguished tradition in economics, but there was no systematic presentation of the theory until now. This book deals with basic questions in economic theory, such as the relation between theory and data, and studies the situations in which empirical observations are consistent or inconsistent with some of the best known theories in economics.
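As a concrete illustration of such testing (a generic sketch in the spirit of Varian's nonparametric tests, not code from this book), the following checks whether observed prices and chosen bundles are consistent with the Generalized Axiom of Revealed Preference (GARP):

```python
# Minimal GARP check in the spirit of Varian (1982): the data admit a
# well-behaved utility rationalization iff no revealed-preference cycle
# contains a strict relation. Rows of P and Q are observations.
import numpy as np

def violates_garp(P, Q):
    """P, Q: (T, n) arrays of observed prices and chosen bundles."""
    E = P @ Q.T                        # E[t, s] = p_t . q_s
    own = np.diag(E)                   # expenditure at each observation
    R = own[:, None] >= E              # q_t directly revealed preferred to q_s
    S = own[:, None] > E               # ... strictly preferred
    for k in range(len(Q)):            # Boolean Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    return bool((R & S.T).any())       # preferred both ways, strictly once

# Each bundle was strictly cheaper when the other was chosen -> violation.
P = np.array([[2.0, 1.0], [1.0, 2.0]])
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
print(violates_garp(P, Q))             # True
```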
In recent years nonlinearities have gained increasing importance in economic and econometric research, particularly after the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities in econometric models and apply them to real economic problems. It is intended to inspire researchers to take potential nonlinearities into account: researchers should be wary of spuriously applying linear model types to problems with nonlinear features, since using the correct model type is indispensable for avoiding biased recommendations for economic policy.
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview of the field.
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website http://iwu.edu/~bleekley. The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics as well as make inferences from samples, especially about relationships.
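As a minimal illustration of the simple-regression topic (illustrative numbers, not an example from the text), the least-squares line can be computed in a few lines:

```python
# Fitting the line y = b0 + b1*x by ordinary least squares
# on illustrative data (e.g., advertising spend vs. sales).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)   # slope
b0 = y.mean() - b1 * x.mean()                         # intercept
print(f"intercept={b0:.3f}, slope={b1:.3f}")          # slope is about 2
```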
This publication provides insight into the agricultural sector. It illustrates new tendencies in agricultural economics and dynamics (interrelationships with other sectors in rural zones and multifunctionality) and the implications of the World Trade Organization negotiations for international trade in agricultural products. Due to environmental problems, budget constraints, consumer preferences for food safety, and pressure from the World Trade Organization, the agricultural sector is undergoing many changes; this book addresses those new developments and provides insights into possible future ones. Agriculture is an economic sector fundamental to the sustainable economic growth of every country. However, it has many particularities, notably structural problems (many farms of small size, farmers who sometimes lack vocational training, and difficulties in bringing farmers together in associations and cooperatives), variations in production and prices over the year, and environmental problems arising from the use of pesticides and fertilizers.
In the era of Big Data, our society has a unique opportunity to understand the inner dynamics and behavior of complex socio-economic systems. Advances in the availability of very large databases, in capabilities for massive data mining, and in complex systems theory, multi-agent simulation, and computational social science open the possibility of modeling phenomena never before successfully captured. This contributed volume from the Perm Winter School addresses the mechanisms and statistics of socio-economic system evolution, with a focus on financial markets as studied through high-frequency data analysis.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement them. Of particular appeal are the instructions on (i) how to write the code for different SFA models in Stata, (ii) how to write a VBA macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
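For readers who want a feel for the DEA computation (a generic sketch of the input-oriented, constant-returns envelopment linear program, not code from the volume, which works in Excel Solver and Stata):

```python
# Input-oriented CCR DEA sketch (constant returns to scale), solved
# as one linear program per decision-making unit (DMU).
# The single-input, single-output data below are illustrative.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0, 9.0]])   # inputs:  m x n (here m=1, n=4 DMUs)
Y = np.array([[1.0, 2.0, 3.0, 4.0]])   # outputs: s x n
m, n = X.shape
s = Y.shape[0]

for j0 in range(n):
    # Variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_in = np.c_[-X[:, [j0]], X]
    # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0
    A_out = np.c_[np.zeros((s, 1)), -Y]
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (n + 1))
    print(f"DMU {j0}: efficiency = {res.x[0]:.3f}")   # DMU 1 scores 1.0
```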
Economists can use computer algebra systems to manipulate symbolic models, derive numerical computations, and analyze empirical relationships among variables. Maxima is an open-source, multi-platform computer algebra system that rivals proprietary software. Maxima's symbolic and computational capabilities enable economists and financial analysts to develop a deeper understanding of models by allowing them to explore the implications of differences in parameter values, by providing numerical solutions to problems that would otherwise be intractable, and by providing graphical representations that can guide analysis. This book provides a step-by-step tutorial for using this program to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques. Readers learn how to phrase the relevant analysis and how symbolic expressions, numerical computations, and graphical representations can be used to learn from microeconomic models. In particular, comparative statics analysis is facilitated. Little has been published on Maxima and its applications in economics and finance, and this volume will appeal to advanced undergraduates, graduate-level students studying microeconomics, academic researchers in economics and finance, economists, and financial analysts.
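A Python/SymPy analogue of the kind of symbolic comparative statics the book carries out in Maxima (an illustration in our own notation, not the book's code):

```python
# Symbolic comparative statics for a linear supply/demand market.
import sympy as sp

p, a, b, c, d = sp.symbols('p a b c d', positive=True)
demand = a - b * p          # quantity demanded
supply = c + d * p          # quantity supplied
p_star = sp.solve(sp.Eq(demand, supply), p)[0]
print(p_star)               # (a - c)/(b + d)
# Equilibrium price rises with the demand shifter a:
print(sp.diff(p_star, a))   # 1/(b + d) > 0
```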
This ambitious book looks 'behind the model' to reveal how economists use formal models to generate insights into the economy. Drawing on recent work in the philosophy of science and economic methodology, the book presents a novel framework for understanding the logic of economic modeling. It also reveals the ways in which economic models can mislead rather than illuminate. Importantly, the book goes beyond purely negative critique, proposing a concrete program of methodological reform to better equip economists to detect potential mismatches between their models and the targets of their inquiry. Ranging across economics, philosophy, and social science methods, and drawing on a variety of examples, including the recent financial crisis, Behind the Model will be of interest to anyone who has wondered how economics works - and why it sometimes fails so spectacularly.
The individual risks faced by banks, insurers, and marketers are less well understood than aggregate risks such as market-price changes. But the risks incurred or carried by individual people, companies, insurance policies, or credit agreements can be just as devastating as macro-level events such as share-price fluctuations. A comprehensive introduction, The Econometrics of Individual Risk is the first book to provide a complete econometric methodology for quantifying and managing this underappreciated but important variety of risk. The book presents a course in the econometric theory of individual risk illustrated by empirical examples. And, unlike other texts, it is focused entirely on solving the actual individual risk problems businesses confront today. Christian Gourieroux and Joann Jasiak emphasize the microeconometric aspect of risk analysis by extensively discussing practical problems such as retail credit scoring, credit card transaction dynamics, and profit maximization in promotional mailing. They address regulatory issues in sections on computing the minimum capital reserve for coverage of potential losses, and on the credit-risk measure CreditVaR. The book will interest graduate students in economics, business, finance, and actuarial studies, as well as actuaries and financial analysts.
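As a generic illustration of retail credit scoring, one of the problems the authors discuss (a plain logistic model on synthetic data, not the authors' methodology):

```python
# Generic credit-scoring sketch: logistic model of default probability.
# Features, coefficients, and data are synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
income = rng.normal(50, 15, n)        # assumed borrower income (thousands)
debt_ratio = rng.uniform(0, 1, n)     # assumed debt-to-income ratio
# Synthetic defaults: higher debt ratio and lower income -> riskier.
logit = -2.0 + 3.0 * debt_ratio - 0.03 * income
default = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([income, debt_ratio])
clf = LogisticRegression().fit(X, default)
# Score a hypothetical applicant: income 40, debt ratio 0.8.
print("P(default):", clf.predict_proba([[40.0, 0.8]])[0, 1])
```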
Many economic theories depend on the presence or absence of a unit root for their validity, and econometric and statistical theory undergoes considerable changes when unit roots are present. Knowledge of unit roots has thus become important enough to warrant an extensive, compact, and nontechnical book on the subject. Resting on this motivation, this book introduces the literature on unit roots in a comprehensive manner to both empirical and theoretical researchers in economics and other areas. Providing a clear, complete, and critical discussion of the unit root literature, In Choi covers a wide range of topics, including uniform confidence interval construction, unit root tests allowing for structural breaks, mildly explosive processes, exuberance testing, fractionally integrated processes, seasonal unit roots, and panel unit root testing. Extensive, up to date, and readily accessible, this book is a comprehensive reference source on unit roots for both students and applied workers.
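As background (standard material rather than a quotation from the book), the workhorse augmented Dickey-Fuller test runs the regression

$$ \Delta y_t = \alpha + \beta t + \rho\,y_{t-1} + \sum_{i=1}^{k} \gamma_i\,\Delta y_{t-i} + \varepsilon_t, $$

and tests $H_0\!: \rho = 0$ (a unit root) against $\rho < 0$, using Dickey-Fuller critical values rather than the usual $t$ tables.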
Three leading experts have produced a landmark work based on a set of working papers published by the Center for Operations Research and Econometrics (CORE) at the Universite Catholique de Louvain in 1994 under the title 'Repeated Games', which holds almost mythic status among game theorists. Jean-Francois Mertens, Sylvain Sorin and Shmuel Zamir have significantly elevated the clarity and depth of presentation with many results presented at a level of generality that goes far beyond the original papers - many written by the authors themselves. Numerous results are new, and many classic results and examples are not to be found elsewhere. Most remain state of the art in the literature. This book is full of challenging and important problems that are set up as exercises, with detailed hints provided for their solutions. A new bibliography traces the development of the core concepts up to the present day.
The productivity of a business exerts an important influence on its financial performance. A similar influence exists for industries and economies: those with superior productivity performance thrive at the expense of others. Productivity performance helps explain the growth and demise of businesses and the relative prosperity of nations. Productivity Accounting: The Economics of Business Performance offers an in-depth analysis of variation in business performance, providing the reader with an analytical framework within which to account for this variation and its causes and consequences. The primary focus is the individual business, and the principal consequence of business productivity performance is business financial performance. Alternative measures of financial performance are considered, including profit, profitability, cost, unit cost, and return on assets. Combining analytical rigor with empirical illustrations, the analysis draws on wide-ranging literatures, both historical and current, from business and economics, and explains how businesses create value and distribute it.
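A central identity in this literature (sketched here as background, in our notation rather than necessarily the book's) decomposes profitability into a productivity effect and a price-recovery effect: with revenue $R = P\,Y$ and cost $C = W\,X$ expressed through aggregate quantity indexes $Y, X$ and price indexes $P, W$,

$$ \text{profitability} \;=\; \frac{R}{C} \;=\; \frac{P\,Y}{W\,X} \;=\; \underbrace{\frac{Y}{X}}_{\text{productivity}} \times \underbrace{\frac{P}{W}}_{\text{price recovery}}, $$

so any change in profitability can be attributed to a quantity (productivity) component and a price component.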
You may like...
- Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu, Gordana Djurovic (Hardcover) - R3,581 (Discovery Miles 35 810)
- The Handbook of Historical Economics by Alberto Bisin, Giovanni Federico (Paperback) - R2,567 (Discovery Miles 25 670)
- Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover) - R4,258 (Discovery Miles 42 580)
- Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
- Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover) - R3,286 (Discovery Miles 32 860)
- Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover) - R3,567 (Discovery Miles 35 670)
- Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover) - R2,160 (Discovery Miles 21 600)