Welcome to Loot.co.za!
The authors present a basic model of the Bayesian implementation problem and then consider its application in areas including classical pure exchange economies, public goods provision, auctions and bargaining.
Maximising economic growth is the main objective of politicians and heavily drives political policy and decision-making. Critics of the maximisation of growth as the central aim of economic policy have argued that growth in itself is not necessarily a good thing, particularly for the environment; however, what would replace the system and how it would be measured are questions that have rarely been answered satisfactorily. First published in 1991, this book was the first to lay out an entirely new set of practical proposals for developing new economic measurement tools, with the aim of being sustainable, 'green' and human-centred. Victor Anderson proposes that a whole set of indicators, rather than a single one, should play all the roles that GNP (Gross National Product) is responsible for. With a detailed overview of the central debates between the advocates and opponents of continued economic growth and an analysis of the various proposals for modification, this title will be of particular value to students interested in the diversity of measurement tools and the notion that economies should also be evaluated by their social and environmental consequences.
The development of economics changed dramatically during the twentieth century with the emergence of econometrics, macroeconomics and a more scientific approach in general. One of the key individuals in the transformation of economics was Ragnar Frisch, professor at the University of Oslo and the first Nobel Laureate in economics in 1969. He was a co-founder of the Econometric Society in 1930 (after having coined the word econometrics in 1926) and edited the journal Econometrica for twenty-two years. The discovery of the manuscripts of a series of eight lectures given by Frisch at the Henri Poincaré Institute in March-April 1933 on The Problems and Methods of Econometrics will enable economists to understand his overall vision of econometrics more fully. This book is a rare exhibition of Frisch's overview of econometrics and is published here in English for the first time. Edited and with an introduction by Olav Bjerkholt and Ariane Dupont-Kieffer, Frisch's eight lectures provide an accessible and astute discussion of econometric issues from philosophical foundations to practical procedures. Concerning the development of economics in the twentieth century and the broader visions about economic science in general and econometrics in particular held by Ragnar Frisch, this book will appeal to anyone with an interest in the history of economics and econometrics.
'Big data' is now readily available to economic historians, thanks to the digitisation of primary sources, collaborative research linking different data sets, and the publication of databases on the internet. Key economic indicators, such as the consumer price index, can be tracked over long periods, and qualitative information, such as land use, can be converted to a quantitative form. In order to fully exploit these innovations it is necessary to use sophisticated statistical techniques to reveal the patterns hidden in datasets, and this book shows how this can be done. A distinguished group of economic historians have teamed up with younger researchers to pilot the application of new techniques to 'big data'. Topics addressed in this volume include prices and the standard of living, money supply, credit markets, land values and land use, transport, technological innovation, and business networks. The research spans the medieval, early modern and modern periods. Research methods include simultaneous equation systems, stochastic trends and discrete choice modelling. This book is essential reading for doctoral and post-doctoral researchers in business, economic and social history. The case studies will also appeal to historical geographers and applied econometricians.
This title provides a comprehensive, critical coverage of the progress and development of mathematical modelling within urban and regional economics over four decades.
This volume provides a coherent analysis of the economic, monetary and political aspects of growth dynamics in the Euro area. The different relevant aspects in this debate, presented and discussed by leading scholars and representatives of international organizations, include an assessment of the newest theoretical growth models for open economies, and empirical investigation of: the growth divergence between the US and Europe; the extent to which fiscal co-ordination is desirable in a monetary union; the role of product and labor market reforms; the complex relationships between exchange rates and growth; the contribution of monetary policy to economic growth; and the prospects for economic growth in monetary unions. Although primarily focused on the Euro area, the analysis is equally relevant to all other common currency areas and will be welcomed by academics and students with an interest in European studies and financial economics, as well as policy and decision makers in international organisations, national institutions and central banks.
Portfolio theory and much of asset pricing, as well as many empirical applications, depend on the use of multivariate probability distributions to describe asset returns. Traditionally, this has meant the multivariate normal (or Gaussian) distribution. More recently, theoretical and empirical work in financial economics has employed the multivariate Student (and other) distributions which are members of the elliptically symmetric class. There is also a growing body of work which is based on skew-elliptical distributions. These probability models all exhibit the property that the marginal distributions differ only by location and scale parameters or are restrictive in other respects. Very often, such models are not supported by the empirical evidence that the marginal distributions of asset returns can differ markedly. Copula theory is a branch of statistics which provides powerful methods to overcome these shortcomings. This book provides a synthesis of the latest research in the area of copulae as applied to finance and related subjects such as insurance. Multivariate non-Gaussian dependence is a fact of life for many problems in financial econometrics. This book describes the state of the art in tools required to deal with these observed features of financial data. This book was originally published as a special issue of the European Journal of Finance.
"Transportation Economics" explores the efficient use of society's
scarce resources for the movement of people and goods. This book
carefully examines transportation markets and standard economic
tools, how these resources are used, and how the allocation of
society resources affects transportation activities. This textbook is unique in that it uses a detailed analysis of
econometric results from current transportation literature to
provide an integrated collection of theory and application. Its
numerous case studies illustrate the economic principles, discuss
testable hypotheses, analyze econometric results, and examine each
study's implications for public policy. These features make this a
well-developed introduction to the foundations of transportation
economics. Additional case studies on a spectrum of domestic and
international transportation topics available at http:
//www.blackwellpublishers.co.uk/mccarthy in order to keep students
abreast of recent developments in the field and their implications
for public policy. The paperback edition of this book is not available from Blackwell in the US or Canda.
First published in 1970. Econometric model-building, on the other hand, has been largely confined to the advanced industrialised countries. In the few cases where macro-models have been built for underdeveloped countries (e.g. the Narasimham model (112) for India), the underlying assumptions have been largely of the Keynesian type, and thus in the author's opinion unconnected with the theory of economic development. This study is a modest attempt at econometric model-building on the basis of a model of development of an underdeveloped country.
Algorithmic Trading and Quantitative Strategies provides an in-depth overview of this growing field with a unique mix of quantitative rigor and practitioner's hands-on experience. The focus on empirical modeling and practical know-how makes this book a valuable resource for students and professionals. The book starts with the often overlooked context of why and how we trade via a detailed introduction to market structure and quantitative microstructure models. The authors then present the necessary quantitative toolbox including more advanced machine learning models needed to successfully operate in the field. They next discuss the subject of quantitative trading, alpha generation, active portfolio management and more recent topics like news and sentiment analytics. The last main topic of execution algorithms is covered in detail with emphasis on the state of the field and critical topics including the elusive concept of market impact. The book concludes with a discussion of the technology infrastructure necessary to implement algorithmic strategies in large-scale production settings. A GitHub repository includes data sets and explanatory/exercise Jupyter notebooks. The exercises involve adding the correct code to solve the particular analysis/problem.
When Harold Fried, et al. published The Measurement of Productive Efficiency: Techniques and Applications with OUP in 1993, the book received a great deal of professional interest for its accessible treatment of the rapidly growing field of efficiency and productivity analysis. The first several chapters, providing the background, motivation, and theoretical foundations for this topic, were the most widely recognized. In this tight, direct update, these same editors have compiled over ten years of the most recent research in this changing field, and expanded on those seminal chapters. The book will guide readers from the basic models to the latest, cutting-edge extensions, and will be reinforced by references to classic and current theoretical and applied research. It is intended for professors and graduate students in a variety of fields, ranging from economics to agricultural economics, business administration, management science, and public administration. It should also appeal to public servants and policy makers engaged in business performance analysis or regulation.
Economic Time Series: Modeling and Seasonality is a focused resource on analysis of economic time series as pertains to modeling and seasonality, presenting cutting-edge research that would otherwise be scattered throughout diverse peer-reviewed journals. This compilation of 21 chapters showcases the cross-fertilization between the fields of time series modeling and seasonal adjustment, as is reflected both in the contents of the chapters and in their authorship, with contributors coming from academia and government statistical agencies. For easier perusal and absorption, the contents have been grouped into seven topical sections:
Section I deals with periodic modeling of time series, introducing, applying, and comparing various seasonally periodic models.
Section II examines the estimation of time series components when models for series are misspecified in some sense, and the broader implications this has for seasonal adjustment and business cycle estimation.
Section III examines the quantification of error in X-11 seasonal adjustments, with comparisons to error in model-based seasonal adjustments.
Section IV discusses some practical problems that arise in seasonal adjustment: developing asymmetric trend-cycle filters, dealing with both temporal and contemporaneous benchmark constraints, detecting trading-day effects in monthly and quarterly time series, and using diagnostics in conjunction with model-based seasonal adjustment.
Section V explores outlier detection and the modeling of time series containing extreme values, developing new procedures and extending previous work.
Section VI examines some alternative models and inference procedures for analysis of seasonal economic time series.
Section VII deals with aspects of modeling, estimation, and forecasting for nonseasonal economic time series.
By presenting new methodological developments as well as pertinent empirical analyses and reviews of established methods, the book provides much that is stimulating and practically useful for the serious researcher and analyst of economic time series.
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website http://iwu.edu/~bleekley The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics as well as make inferences from samples, especially about relationships.
A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policymakers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyse patterns of growth and related long term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented.
This book constitutes the first serious attempt to explain the basics of econometrics and its applications in the clearest and simplest manner possible. Recognising the fact that a good level of mathematics is no longer a necessary prerequisite for economics/financial economics undergraduate and postgraduate programmes, it introduces this key subdivision of economics to an audience who might otherwise have been deterred by its complex nature.
World Inequality Report 2018 is the most authoritative and up-to-date account of global trends in inequality. Researched, compiled, and written by a team of the world’s leading economists of inequality, it presents—with unrivaled clarity and depth—information and analysis that will be vital to policy makers and scholars everywhere. Inequality has taken center stage in public debate as the wealthiest people in most parts of the world have seen their share of the economy soar relative to that of others, many of whom, especially in the West, have experienced stagnation. The resulting political and social pressures have posed harsh new challenges for governments and created a pressing demand for reliable data. The World Inequality Lab at the Paris School of Economics and the University of California, Berkeley, has answered this call by coordinating research into the latest trends in the accumulation and distribution of income and wealth on every continent. This inaugural report analyzes the Lab’s findings, which include data from major countries where information has traditionally been difficult to acquire, such as China, India, and Brazil. Among nations, inequality has been decreasing as traditionally poor countries’ economies have caught up with the West. The report shows, however, that inequality has been steadily deepening within almost every nation, though national trajectories vary, suggesting the importance of institutional and policy frameworks in shaping inequality. World Inequality Report 2018 will be a key document for anyone concerned about one of the most imperative and contentious subjects in contemporary politics and economics.
This two-volume set is a collection of 30 classic papers presenting ideas which have now become standard in the field of Bayesian inference. Topics covered include the central field of statistical inference as well as applications to areas of probability theory, information theory, utility theory and computational theory. It is organized into seven sections: foundations; information theory and prior distributions; robustness and outliers; hierarchical, multivariate and non-parametric models; asymptotics; computations and Monte Carlo methods; and Bayesian econometrics.
Law and economics research has had an enormous impact on the laws of contracts, torts, property, crimes, corporations, and antitrust, as well as public regulation and fundamental rights. The Law and Economics of Patent Damages, Antitrust, and Legal Process examines several areas of important research by a variety of international scholars. It contains technical papers on the appropriate way to estimate damages in patent disputes, as well as methods for evaluating relevant markets and vertically integrated firms when determining the competitive effects of mergers and other actions. There are also papers on the implication of different legal processes, regulations, and liability rules on consumer welfare, which range from the impact of delays in legal decisions in labour cases in France to issues of criminal liability related to the use of artificial intelligence. This volume of Research in Law and Economics is a must-read for researchers and professionals of patent damages, antitrust, labour, and legal process.
Thoroughly classroom tested, this introductory text covers all the topics that constitute a foundation for basic econometrics, with concise and intuitive explanations of technical material. Important proofs are shown in detail; however, the focus is on developing regression models and understanding the residuals.
This volume in Advances in Econometrics showcases fresh methodological and empirical research on the econometrics of networks. Comprising theoretical, empirical and policy papers, the authors bring together a wide range of perspectives to facilitate a dialogue between academics and practitioners for a better understanding of this groundbreaking field and its role in policy discussions. This edited collection includes thirteen chapters covering topics such as identification of network models, network formation, networks and spatial econometrics, and applications of financial networks. Readers can also learn about network models with different types of interactions, sample selection in social networks, trade networks, stochastic dynamic programming in space, spatial panels, survival and networks, financial contagion, spillover effects, interconnectedness on consumer credit markets and a financial risk meter. The topics covered in the book, centered on the econometrics of data and models, are a valuable resource for graduate students and researchers in the field. The collection is also useful for industry professionals and data scientists due to its focus on theoretical and applied works.
Volume 39A of Research in the History of Economic Thought and Methodology features a selection of essays presented at the 2019 Conference of the Latin American Society for the History of Economic Thought (ALAHPE), edited by Felipe Almeida and Carlos Eduardo Suprinyak, as well as a new general-research essay by Daniel Kuehn, an archival discovery by Katia Caldari and Luca Fiorito, and a book review by John Hall.
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insights on what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution is its coverage of advancements in panel data analysis, a statistical method widely used to analyze two- or higher-dimensional panel data. The topics discussed in early editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. This book, featured in Cambridge's Econometric Society Monographs series, has been the leader in the field since the first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
This major volume of essays by Kenneth F. Wallis features 28 articles published over a quarter of a century on the statistical analysis of economic time series, large-scale macroeconometric modelling, and the interface between them. The first part deals with time-series econometrics and includes significant early contributions to the development of the LSE tradition in time-series econometrics, which is the dominant British tradition and has considerable influence worldwide. Later sections discuss theoretical and practical issues in modelling seasonality and forecasting, with applications in both large-scale and small-scale models. The final section summarizes the research programme of the ESRC Macroeconomic Modelling Bureau, a unique comparison project among economy-wide macroeconometric models. Professor Wallis has written a detailed introduction to the papers in this volume in which he explains the background to these papers and comments on subsequent developments.