Distributional issues may not always have been among the main concerns of the economics profession. Today, at the beginning of the 2000s, the position is different. During the last quarter of a century, economic growth proved to be unsteady and rather slow on average. The situation of those at the bottom ceased to improve regularly, as it had in the preceding period of fast growth and full employment. Europe has seen prolonged unemployment, and wage dispersion has widened in a number of OECD countries. Rising affluence in rich countries coexists, in a number of those countries, with the persistence of poverty. As a consequence, it is difficult nowadays to think of an issue ranking high in the public economic debate without strong explicit distributive implications. Monetary policy, fiscal policy, taxes, trade-union regulation, privatisation, price and competition regulation, and the future of the Welfare State are all issues which are now often perceived as conflictual because of their strong redistributive content.
For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/hes
First published in 2000. Routledge is an imprint of Taylor & Francis, an informa company.
Originally published in 1984. This book addresses the economics of the changing mineral industry, which is highly affected by energy economics. The study estimates, in quantitative terms, the short- to mid-term consequences of rising energy prices alongside falling ore quality for the copper and aluminum industries. The effects of changing cost factors on substitution between metals are assessed, as is the potential for relying on increased recycling. Copper and aluminum industry problems should be representative of those faced by the mineral processing sector as a whole. Two complex econometric models presented here produce forecasts for the industries, and the book discusses and reviews other econometric commodity models.
Originally published in 1976, with a second edition published in 1984. This book established itself as the first genuinely introductory text on econometric methods, assuming no formal background on the part of the reader. The second edition maintains this distinctive feature. Fundamental concepts are carefully explained and, where possible, techniques are developed by verbal reasoning rather than formal proof. It provides all the material for a basic course, and is also ideal for a student working alone. Very little knowledge of maths and statistics is assumed, and the logic of statistical method is carefully stated. There are numerous exercises, designed to help the student assess individual progress. Methods are described with computer solutions in mind, and the author shows how a variety of different calculations can be performed with relatively simple programs. This new edition also includes much new material: statistical tables are now included and their use carefully explained.
Originally published in 1979. An input/output database is an information system carrying current data on the intermediate consumption of any product or service by all the specified major firms that consume it. This book begins with a survey of how the interrelationships of an economic system can be represented in a two-dimensional model which traces the output of each economic sector to all other sectors. It discusses how the use of such databases to identify major buyers and sellers can illuminate problems of economic policy at the national, regional, and corporate level and aid in analyzing factors affecting the control of inflation, energy use, transportation, and environmental pollution. The book also discusses how advances in database technology have brought to the fore such issues as the right to individual privacy, corporate secrecy, the public's right of access to stored data, and the use of such information for national planning in a free-enterprise society.
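The sector-by-sector accounting described above is the classic Leontief input/output model: gross output x must cover both intermediate use Ax and final demand d, so x solves x = Ax + d. A minimal sketch, where the coefficient matrix and final-demand figures are invented purely for illustration:

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix:
# A[i, j] = input from sector i needed per unit of sector j's output
A = np.array([[0.1, 0.2, 0.0],
              [0.3, 0.1, 0.2],
              [0.1, 0.3, 0.2]])
d = np.array([100.0, 50.0, 80.0])   # final demand by sector

# Leontief solution: gross output x satisfying x = A @ x + d,
# i.e. (I - A) x = d
x = np.linalg.solve(np.eye(3) - A, d)
```

Gross output exceeds final demand in every sector, since each sector must also supply the intermediate inputs the other sectors consume.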
Originally published in 1991. The dilemma of solid and hazardous waste disposal in an environmentally safe manner has become a global problem. This book presents a modern approach to economic and operations research modelling in urban and regional waste management with an international perspective. Location and space economics are discussed along with transportation, technology, health hazards, capacity levels, political realities and the linkage with general global economic systems. The algorithms and models developed are then applied to two major cities in the world by way of case study example of the use of these systems.
Originally published in 1979. This study focuses primarily on the development of a structural model for the U.S. Government securities market, i.e. the specification and estimation of the demands for disaggregated maturity classes of U.S. Government securities by the individual investor groups participating in the market. A particularly important issue addressed involves the extent of the substitution relationship among different maturity classes of U.S. Government securities.
This book brings together cutting edge contributions in the fields of international economics, micro theory, welfare economics and econometrics, with contributions from Donald R. Davis, Avinash K. Dixit, Tadashi Inoue, Ronald W. Jones, Dale W. Jorgenson, K. Rao Kadiyala, Murray C. Kemp, Kenneth M. Kletzer, Anne O. Krueger, Mukul Majumdar, Daniel McFadden, Lionel McKenzie, James R. Melvin, James C. Moore, Takashi Negishi, Yoshihiko Otani, Raymond Riezman, Paul A. Samuelson, Joaquim Silvestre and Marie Thursby.
This book makes indicators more accessible, in terms of what they are, who created them and how they are used. It examines the subjectivity and human frailty behind these quintessentially 'hard' and technical measures of the world. To achieve this goal, The Rise and Rise of Indicators presents the world in terms of a selected set of indicators. The emphasis is upon the origins of the indicators and the motivation behind their creation and evolution. The ideas and assumptions behind the indicators are made transparent to demonstrate how changes to them can dramatically alter the ranking of countries that emerge. They are, after all, human constructs and thus embody human biases. The book concludes by examining the future of indicators and the author sets out some possible trajectories, including the growing emphasis on indicators as important tools in the Sustainable Development Goals that have been set for the world up until 2030. This is a valuable resource for undergraduate and postgraduate students in the areas of economics, sociology, geography, environmental studies, development studies, area studies, business studies, politics and international relations.
Statistical Methods in Econometrics is appropriate for beginning
graduate courses in mathematical statistics and econometrics in
which the foundations of probability and statistical theory are
developed for application to econometric methodology. Because
econometrics generally requires the study of several unknown
parameters, emphasis is placed on estimation and hypothesis testing
involving several parameters. Accordingly, special attention is
paid to the multivariate normal and the distribution of quadratic
forms. Lagrange multiplier tests are discussed in considerable
detail, along with the traditional likelihood ratio and Wald
tests. Characteristic functions and their properties are fully
exploited. Asymptotic distribution theory, usually given only cursory treatment, is also discussed in detail.
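The likelihood ratio test mentioned above can be sketched numerically. A minimal example, assuming simulated Gaussian data and a null hypothesis on the mean (all numbers below are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=0.5, scale=1.0, size=200)   # true mean is 0.5

# Unrestricted MLE: mean and variance both free
mu_hat, sigma2_hat = x.mean(), x.var()
ll_unres = stats.norm.logpdf(x, mu_hat, np.sqrt(sigma2_hat)).sum()

# Restricted MLE under H0: mu = 0 (variance re-estimated under the null)
sigma2_0 = (x ** 2).mean()
ll_res = stats.norm.logpdf(x, 0.0, np.sqrt(sigma2_0)).sum()

# LR statistic is chi-square(1) under H0; here it should reject
lr = 2 * (ll_unres - ll_res)
p = stats.chi2.sf(lr, df=1)
```

The Wald and Lagrange multiplier tests evaluate the same hypothesis from the unrestricted and restricted fits respectively, and all three are asymptotically equivalent.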
The aim of this book is an applied and unified introduction to parametric, non- and semiparametric regression that closes the gap between theory and application. The most important models and methods in regression are presented on a solid formal basis, and their appropriate application is shown through many real data examples and case studies. Availability of (user-friendly) software has been a major criterion for the methods selected and presented. Thus, the book primarily targets an audience that includes students, teachers and practitioners in social, economic, and life sciences, as well as students and teachers in statistics programs, and mathematicians and computer scientists with interests in statistical modeling and data analysis. It is written on an intermediate mathematical level and assumes only knowledge of basic probability, calculus, and statistics. The most important definitions and statements are concisely summarized in boxes. Two appendices describe required matrix algebra, as well as elements of probability calculus and statistical inference.
Many economic and social surveys are designed as panel studies, which provide important data for describing social changes and testing causal relations between social phenomena. This textbook shows how to manage, describe, and model these kinds of data. It presents models for continuous and categorical dependent variables, focusing either on the level of these variables at different points in time or on their change over time. It covers fixed and random effects models, models for change scores and event history models. All statistical methods are explained in an application-centered style using research examples from scholarly journals, which can be replicated by the reader through data provided on the accompanying website. As all models are compared to each other, it provides valuable assistance with choosing the right model in applied research. The textbook is directed at master and doctoral students as well as applied researchers in the social sciences, psychology, business administration and economics. Readers should be familiar with linear regression and have a good understanding of ordinary least squares estimation.
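The fixed effects model mentioned above can be sketched with the within estimator: demean each variable inside each panel unit, which sweeps out the unobserved unit effects. A minimal illustration on a simulated panel where pooled OLS is biased (all names and parameter values are invented for the example):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_units, n_periods, beta = 50, 5, 2.0
unit = np.repeat(np.arange(n_units), n_periods)

alpha = rng.normal(0, 3, n_units)[unit]        # unobserved unit effects
x = alpha + rng.normal(size=unit.size)         # regressor correlated with effects
y = alpha + beta * x + rng.normal(size=unit.size)
df = pd.DataFrame({"unit": unit, "x": x, "y": y})

# Pooled OLS (with intercept) ignores the unit effects and is biased upward
xc, yc = df["x"] - df["x"].mean(), df["y"] - df["y"].mean()
beta_pooled = (xc * yc).sum() / (xc ** 2).sum()

# Within (fixed effects) estimator: demean within each unit, then OLS
dm = df.groupby("unit")[["x", "y"]].transform(lambda s: s - s.mean())
beta_fe = (dm["x"] * dm["y"]).sum() / (dm["x"] ** 2).sum()
```

The within estimator recovers the true slope of 2.0, while pooled OLS absorbs the correlation between the regressor and the unit effects.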
This study, first published in 1979, examines and contrasts two concepts of credit rationing. The first concept takes the relevant price of credit to be the explicit interest rate on the loan and defines the demand for credit as the amount an individual borrower would like to receive at that rate. Under the alternative definition, the price of credit consists of the complete set of loan terms confronting a class of borrowers with given characteristics, while the demand for credit equals the total number of loans which members of the class would like to receive at those terms. This title will be of interest to students of monetary economics.
This study, first published in 1994, is intended to deepen the reader's understanding of the phenomenon of equilibrium credit rationing in two areas. The first area concerns the form that equilibrium credit rationing assumes and its importance in determining the behaviour of interest rates. The second concerns the role of equilibrium credit rationing in transmitting monetary shocks to the real sector. This title will be of interest to students of monetary economics.
The object of this work, first published in 1977, is to examine the history of the economic and monetary union (EMU) in the European Community, the policies of the parties involved and the conflicts of interest created in the political and economic environment within which all this has taken place. This title will be of interest to students of monetary economics and finance.
Notions of probability and uncertainty have been increasingly prominent in modern economics. This book considers the
philosophical and practical difficulties inherent in integrating
these concepts into realistic economic situations. It outlines and
evaluates the major developments, indicating where further work is
needed.
The analysis, prediction and interpolation of economic and other time series has a long history and many applications. Major new developments are taking place, driven partly by the need to analyze financial data. The five papers in this book describe those new developments from various viewpoints and are intended to be an introduction accessible to readers from a range of backgrounds. The book arises out of the second Séminaire Européen de Statistique (SEMSTAT) held in Oxford in December 1994. This brought together young statisticians from across Europe, and a series of introductory lectures were given on topics at the forefront of current research activity. The lectures form the basis for the five papers contained in the book. The papers by Shephard and Johansen deal respectively with time series models for volatility, i.e. variance heterogeneity, and with cointegration. Clements and Hendry analyze the nature of prediction errors. A complementary review paper by Laird gives a biometrical view of the analysis of short time series. Finally, Astrup and Nielsen give a mathematical introduction to the study of option pricing. Whilst the book draws its primary motivation from financial series and from multivariate econometric modelling, the applications are potentially much broader.
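Variance heterogeneity of the kind surveyed in the volatility paper can be illustrated with a simple recursive variance estimate. A RiskMetrics-style EWMA filter on simulated returns is a minimal sketch (the return series, break point, and smoothing constant are all invented for the example):

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulated returns with a variance break: calm first half, turbulent second
r = np.concatenate([rng.normal(0, 0.01, 500), rng.normal(0, 0.03, 500)])

# EWMA variance recursion: sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2
lam = 0.94
sigma2 = np.empty_like(r)
sigma2[0] = r[:50].var()
for t in range(1, len(r)):
    sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2
vol = np.sqrt(sigma2)
```

The filtered volatility tracks the regime shift: it stays near 0.01 in the calm period and rises toward 0.03 after the break, which is the basic phenomenon GARCH-type and stochastic volatility models formalize.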
Portfolio theory and much of asset pricing, as well as many empirical applications, depend on the use of multivariate probability distributions to describe asset returns. Traditionally, this has meant the multivariate normal (or Gaussian) distribution. More recently, theoretical and empirical work in financial economics has employed the multivariate Student (and other) distributions which are members of the elliptically symmetric class. There is also a growing body of work which is based on skew-elliptical distributions. These probability models all exhibit the property that the marginal distributions differ only by location and scale parameters or are restrictive in other respects. Very often, such models are not supported by the empirical evidence that the marginal distributions of asset returns can differ markedly. Copula theory is a branch of statistics which provides powerful methods to overcome these shortcomings. This book provides a synthesis of the latest research in the area of copulae as applied to finance and related subjects such as insurance. Multivariate non-Gaussian dependence is a fact of life for many problems in financial econometrics. This book describes the state of the art in tools required to deal with these observed features of financial data. This book was originally published as a special issue of the European Journal of Finance.
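The copula idea described above, separating the dependence structure from the marginal distributions, can be sketched in a few lines. The correlation, degrees of freedom, and choice of marginals below are arbitrary illustrative values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]

# Gaussian copula: draw correlated normals, map each margin to Uniform(0, 1)
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
u = stats.norm.cdf(z)

# Impose very different marginals on the same dependence structure
asset_a = stats.t.ppf(u[:, 0], df=4)          # heavy-tailed returns
asset_b = stats.lognorm.ppf(u[:, 1], s=0.5)   # skewed returns

# Rank correlation survives the change of marginals
rho_s, _ = stats.spearmanr(asset_a, asset_b)
```

Because the copula captures dependence on the uniform scale, the two assets keep their rank correlation even though one marginal is heavy-tailed and the other skewed, which is exactly the flexibility the elliptical families lack.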
This book discusses market microstructure environment within the context of the global financial crisis. In the first part, the market microstructure theory is recalled and the main microstructure models and hypotheses are discussed. The second part focuses on the main effects of the financial downturn through an examination of market microstructure dynamics. In particular, the effects of market imperfections and the limitations associated with microstructure models are discussed. Finally, the new regulations and recent developments for financial markets that aim to improve the market microstructure are discussed. Well-known experts on the subject contribute to the chapters in the book. A must-read for academic researchers, students and quantitative practitioners.
This is the second of three volumes surveying the state of the art
in Game Theory and its applications to many and varied fields, in
particular to economics. The chapters in the present volume are
contributed by outstanding authorities, and provide comprehensive
coverage and precise statements of the main results in each area.
The applications include empirical evidence. The following topics
are covered: communication and correlated equilibria, coalitional
games and coalition structures, utility and subjective probability,
common knowledge, bargaining, zero-sum games, differential games,
and applications of game theory to signalling, moral hazard,
search, evolutionary biology, international relations, voting
procedures, social choice, public economics, politics, and cost
allocation. This handbook will be of interest to scholars in
economics, political science, psychology, mathematics and biology.
For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/hes
This is the fourth volume of the Handbook of Econometrics. The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians, statisticians, and in advanced graduate econometrics courses.
Concepts of probability are an integral component of economic theory. However, there are many theories of probability, and these are manifested in different approaches to economic theory itself. This text offers a clear and informative survey of the area, serving to standardize terminology and so to integrate probability into a discussion of the foundations of economic theory. Having summarized the three main, competing interpretations of probability, the author explains its fundamental importance in economics, and illustrates this with a comparison of Knight's and Keynes's very different conceptions. Finally, he examines the Austrian, Keynesian and New Classical/Rational Expectation schools of thought.
This book studies the information spillover among financial markets and explores the intraday effect and ACD models with high frequency data. This book also contributes theoretically by providing a new statistical methodology with comparative advantages for analyzing co-movements between two time series. It explores this new method by testing the information spillover between the Chinese stock market and the international market, futures market and spot market. Using the high frequency data, this book investigates the intraday effect and examines which type of ACD model is particularly suited in capturing financial duration dynamics. The book will be of invaluable use to scholars and graduate students interested in co-movements among different financial markets and financial market microstructure and to investors and regulation departments looking to improve their risk management.
This book undertakes a theoretical and econometric analysis of intense economic growth in selected European countries during the end of the twentieth century and the beginning of the twenty-first. Focusing on the accelerated economic growth that occurred in Ireland, the Netherlands, Spain, and Turkey, this book investigates the determinants and consequences of this "miracle" growth and discusses them in the context of growth and development processes observed in European market-type economies after World War II. Using imperfect knowledge economics (IKE) as a theoretical framework to interpret the empirical results, this book provides a fresh theoretical perspective in comparison with current Neo-classical, Keynesian and institutional paradigms. With this systematic approach, the authors seek to provide a unified methodology for evaluating the phenomenon of intense economic growth that has heretofore been missing from the discipline. Combining diverse theoretical and methodological strategies to provide a holistic understanding of the historical process of economic change, this volume will be of interest to students and scholars of economic growth, econometrics, political economy, and the new institutional economics, as well as policymakers.
This book provides a coherent description of the main concepts and statistical methods used to analyse economic performance. The focus is on measures of performance that are of practical relevance to policy makers. Most, if not all, of these measures can be viewed as measures of productivity and/or efficiency. Linking fields as diverse as index number theory, data envelopment analysis and stochastic frontier analysis, the book explains how to compute measures of input and output quantity change that are consistent with measurement theory. It then discusses ways in which meaningful measures of productivity change can be decomposed into measures of technical progress, environmental change, and different types of efficiency change. The book is aimed at graduate students, researchers, statisticians, accountants and economists working in universities, regulatory authorities, government departments and private firms. The book contains many numerical examples. Computer codes and datasets are available on a companion website.
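The index-number computations described above follow a standard pattern: a Laspeyres index weights quantity change by base-period prices, a Paasche index by current-period prices, and the Fisher index is their geometric mean. A minimal sketch on invented price and quantity data:

```python
import numpy as np

# Prices and quantities for three outputs in two periods (hypothetical data)
p0, q0 = np.array([10.0, 5.0, 8.0]), np.array([100.0, 200.0, 50.0])
p1, q1 = np.array([12.0, 5.0, 9.0]), np.array([110.0, 190.0, 70.0])

laspeyres = (p0 @ q1) / (p0 @ q0)        # base-period price weights
paasche = (p1 @ q1) / (p1 @ q0)          # current-period price weights
fisher = np.sqrt(laspeyres * paasche)    # geometric mean of the two
```

The Fisher index is often preferred because it satisfies more of the axiomatic tests of index number theory than either of its components alone.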