The object of this work, first published in 1977, is to examine the history of the economic and monetary union (EMU) in the European Community, the policies of the parties involved and the conflicts of interest created in the political and economic environment within which all this has taken place. This title will be of interest to students of monetary economics and finance.
Notions of probability and uncertainty have been increasingly prominent in modern economics. This book considers the philosophical and practical difficulties inherent in integrating these concepts into realistic economic situations. It outlines and evaluates the major developments, indicating where further work is needed.
The analysis, prediction and interpolation of economic and other time series has a long history and many applications. Major new developments are taking place, driven partly by the need to analyze financial data. The five papers in this book describe those new developments from various viewpoints and are intended to be an introduction accessible to readers from a range of backgrounds. The book arises out of the second Séminaire Européen de Statistique (SEMSTAT), held in Oxford in December 1994. This brought together young statisticians from across Europe, and a series of introductory lectures were given on topics at the forefront of current research activity. The lectures form the basis for the five papers contained in the book. The papers by Shephard and Johansen deal respectively with time series models for volatility, i.e. variance heterogeneity, and with cointegration. Clements and Hendry analyze the nature of prediction errors. A complementary review paper by Laird gives a biometrical view of the analysis of short time series. Finally, Astrup and Nielsen give a mathematical introduction to the study of option pricing. Whilst the book draws its primary motivation from financial series and from multivariate econometric modelling, the applications are potentially much broader.
Portfolio theory and much of asset pricing, as well as many empirical applications, depend on the use of multivariate probability distributions to describe asset returns. Traditionally, this has meant the multivariate normal (or Gaussian) distribution. More recently, theoretical and empirical work in financial economics has employed the multivariate Student (and other) distributions which are members of the elliptically symmetric class. There is also a growing body of work based on skew-elliptical distributions. These probability models all exhibit the property that the marginal distributions differ only by location and scale parameters, or are restrictive in other respects. Such models are often contradicted by the empirical evidence that the marginal distributions of asset returns can differ markedly. Copula theory is a branch of statistics which provides powerful methods to overcome these shortcomings. This book provides a synthesis of the latest research in the area of copulae as applied to finance and related subjects such as insurance. Multivariate non-Gaussian dependence is a fact of life for many problems in financial econometrics, and the book describes the state of the art in tools required to deal with these observed features of financial data. It was originally published as a special issue of the European Journal of Finance.
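The flexibility the blurb describes can be illustrated with a minimal sketch (not from the book; all parameter values are hypothetical) of sampling from a Gaussian copula whose two marginals are deliberately different — one heavy-tailed, one Gaussian:

```python
# Illustrative sketch of Gaussian-copula sampling with non-identical margins.
# All parameter values below are hypothetical, chosen only for demonstration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.6                                    # assumed dependence parameter
cov = np.array([[1.0, rho], [rho, 1.0]])

# 1. Draw correlated standard normals (the copula's latent Gaussian layer).
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)

# 2. Map to uniforms via the normal CDF: this is the copula sample itself.
u = stats.norm.cdf(z)

# 3. Apply arbitrary inverse marginal CDFs -- the margins may differ freely,
#    which is exactly the shortcoming of elliptical models that copulas fix.
ret1 = stats.t.ppf(u[:, 0], df=4)            # heavy-tailed Student-t margin
ret2 = stats.norm.ppf(u[:, 1], scale=2.0)    # Gaussian margin, wider scale

# Rank dependence survives the marginal transforms: Kendall's tau should be
# close to (2/pi)*arcsin(rho) regardless of which margins are plugged in.
tau, _ = stats.kendalltau(ret1, ret2)
```

Because the dependence structure lives entirely in step 2, the marginals in step 3 can be swapped for any distributions the data call for without changing the joint dependence.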
This book discusses the market microstructure environment within the context of the global financial crisis. In the first part, market microstructure theory is reviewed and the main microstructure models and hypotheses are discussed. The second part focuses on the main effects of the financial downturn through an examination of market microstructure dynamics, in particular the effects of market imperfections and the limitations associated with microstructure models. Finally, the new regulations and recent developments for financial markets that aim to improve the market microstructure are discussed. Well-known experts on the subject contribute the chapters of the book. A must-read for academic researchers, students and quantitative practitioners.
This is the second of three volumes surveying the state of the art in Game Theory and its applications to many and varied fields, in particular to economics. The chapters in the present volume are contributed by outstanding authorities, and provide comprehensive coverage and precise statements of the main results in each area. The applications include empirical evidence. The following topics are covered: communication and correlated equilibria, coalitional games and coalition structures, utility and subjective probability, common knowledge, bargaining, zero-sum games, differential games, and applications of game theory to signalling, moral hazard, search, evolutionary biology, international relations, voting procedures, social choice, public economics, politics, and cost allocation. This handbook will be of interest to scholars in economics, political science, psychology, mathematics and biology. For more information on the Handbooks in Economics series, please see our home page on http://www.elsevier.nl/locate/hes
This is the fourth volume of the Handbook of Econometrics. The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians, statisticians, and in advanced graduate econometrics courses.
Concepts of probability are an integral component of economic theory. However, there are many theories of probability, and these are manifested in different approaches to economic theory itself. This text offers a clear and informative survey of the area, serving to standardize terminology and so to integrate probability into a discussion of the foundations of economic theory. Having summarized the three main, competing interpretations of probability, the author explains its fundamental importance in economics, and illustrates this with a comparison of Knight's and Keynes's very different conceptions. Finally, he examines the Austrian, Keynesian and New Classical/Rational Expectations schools of thought.
This book studies information spillover among financial markets and explores the intraday effect and ACD models with high-frequency data. It also contributes theoretically by providing a new statistical methodology, with comparative advantages, for analyzing co-movements between two time series. It explores this new method by testing the information spillover between the Chinese stock market and the international market, and between the futures market and the spot market. Using high-frequency data, the book investigates the intraday effect and examines which type of ACD model is particularly suited to capturing financial duration dynamics. It will be of invaluable use to scholars and graduate students interested in co-movements among different financial markets and in financial market microstructure, and to investors and regulation departments looking to improve their risk management.
This book undertakes a theoretical and econometric analysis of intense economic growth in selected European countries during the end of the twentieth century and the beginning of the twenty-first. Focusing on the accelerated economic growth that occurred in Ireland, the Netherlands, Spain, and Turkey, this book investigates the determinants and consequences of this "miracle" growth and discusses them in the context of growth and development processes observed in European market-type economies after World War II. Using imperfect knowledge economics (IKE) as a theoretical framework to interpret the empirical results, this book provides a fresh theoretical perspective in comparison with current Neo-classical, Keynesian and institutional paradigms. With this systematic approach, the authors seek to provide a unified methodology for evaluating the phenomenon of intense economic growth that has heretofore been missing from the discipline. Combining diverse theoretical and methodological strategies to provide a holistic understanding of the historical process of economic change, this volume will be of interest to students and scholars of economic growth, econometrics, political economy, and the new institutional economics, as well as policymakers.
This book provides a coherent description of the main concepts and statistical methods used to analyse economic performance. The focus is on measures of performance that are of practical relevance to policy makers. Most, if not all, of these measures can be viewed as measures of productivity and/or efficiency. Linking fields as diverse as index number theory, data envelopment analysis and stochastic frontier analysis, the book explains how to compute measures of input and output quantity change that are consistent with measurement theory. It then discusses ways in which meaningful measures of productivity change can be decomposed into measures of technical progress, environmental change, and different types of efficiency change. The book is aimed at graduate students, researchers, statisticians, accountants and economists working in universities, regulatory authorities, government departments and private firms. The book contains many numerical examples. Computer codes and datasets are available on a companion website.
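As a toy illustration of the index-number side of this material (the figures below are invented, not taken from the book), the classical Laspeyres, Paasche and Fisher output-quantity indexes can be computed directly from price and quantity vectors:

```python
# Hypothetical two-period, two-output example of output-quantity indexes.
# All prices and quantities are invented for illustration only.
import numpy as np

p0 = np.array([10.0, 4.0])    # base-period prices (assumed)
q0 = np.array([100.0, 50.0])  # base-period quantities (assumed)
p1 = np.array([11.0, 5.0])    # comparison-period prices (assumed)
q1 = np.array([104.0, 60.0])  # comparison-period quantities (assumed)

# Laspeyres weights quantities with base-period prices; Paasche uses
# comparison-period prices; Fisher is their geometric mean.
laspeyres = (p0 @ q1) / (p0 @ q0)
paasche   = (p1 @ q1) / (p1 @ q0)
fisher    = np.sqrt(laspeyres * paasche)   # lies between the other two
```

The Fisher index is often preferred in productivity measurement precisely because it averages away the opposite weighting biases of its two components.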
Originally published in 1984. This book addresses the economics of the changing mineral industry, which is highly affected by energy economics. The study estimates, in quantitative terms, the short- to mid-term consequences of rising energy prices alongside falling ore quality for the copper and aluminum industries. The effects of changing cost factors on substitution between metals are assessed, as is the potential for relying on increased recycling. Copper and aluminum industry problems should be representative of those faced by the mineral processing sector as a whole. Two complex econometric models presented here produce forecasts for the industries, and the book discusses and reviews other econometric commodity models.
Originally published in 1979. This study focuses primarily on the development of a structural model for the U.S. Government securities market, i.e. the specification and estimation of the demands for disaggregated maturity classes of U.S. Government securities by the individual investor groups participating in the market. A particularly important issue addressed involves the extent of the substitution relationship among different maturity classes of U.S. Government securities.
Originally published in 1974. This book provides a rigorous and detailed introductory treatment of the theory of difference equations and their applications in the construction and analysis of dynamic economic models. It explains the theory of linear difference equations and then analyses various types of dynamic economic models. Including plenty of examples of application throughout the text, it will be of use to those working in macroeconomics and econometrics.
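The stability result at the heart of such dynamic models can be sketched in a few lines (the parameter values are hypothetical, not drawn from the book): a first-order linear difference equation y_{t+1} = a + b*y_t converges to the steady state y* = a / (1 - b) whenever |b| < 1.

```python
# Minimal illustration of first-order linear difference-equation dynamics.
# Parameters a, b and the initial condition are arbitrary assumed values.
a, b = 2.0, 0.5
y = 10.0                       # arbitrary initial condition

# Iterate y_{t+1} = a + b*y_t; with |b| < 1 deviations from the steady
# state shrink by the factor b each period.
for _ in range(60):
    y = a + b * y

steady_state = a / (1.0 - b)   # the fixed point of the map
```

With b = 0.5 the initial gap of 6 is halved every period, so after 60 iterations the path is numerically indistinguishable from the steady state.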
Originally published in 1991. The dilemma of solid and hazardous waste disposal in an environmentally safe manner has become a global problem. This book presents a modern approach to economic and operations research modelling in urban and regional waste management with an international perspective. Location and space economics are discussed along with transportation, technology, health hazards, capacity levels, political realities and the linkage with general global economic systems. The algorithms and models developed are then applied to two major world cities, by way of case studies illustrating the use of these systems.
Reissuing works originally published between 1929 and 1991, this collection of 17 volumes presents a variety of considerations on econometrics, from introductions to specific research on particular industries. Some volumes cover models for macroeconomics and international economies, making this a set of broad economic interest. Input-output methods and databases are examined in some volumes, while others look at Bayesian techniques and linear and non-linear models. This set will be of use to those in industry and business studies, geography and sociology, as well as politics and economics.
Leonid Hurwicz (1917-2008) was a major figure in modern theoretical economics whose contributions over sixty-five years spanned at least five areas: econometrics, nonlinear programming, decision theory, microeconomic theory, and mechanism design. In 2007, at age ninety, he received the Nobel Memorial Prize in Economics (shared with Eric Maskin and Roger Myerson) for pioneering the field of mechanism design and incentive compatibility. Hurwicz made seminal contributions in the other areas as well. In nonlinear programming, he contributed to the understanding of Lagrange-Kuhn-Tucker problems (along with co-authors Kenneth Arrow and Hirofumi Uzawa). In econometrics, the Hurwicz bias in the least-squares analysis of time series is a fundamental and commonly cited benchmark. In decision theory, the Hurwicz criterion for decision-making under ambiguity is routinely invoked, sometimes without a citation since his original paper was never published. In microeconomic theory, Hurwicz (along with Arrow and H.D. Block) initiated the study of stability of the market mechanism, and (with Uzawa) solved the classic integrability of demand problem, a core result in neoclassical consumer theory. While some of Hurwicz's works were published in journals, many remain scattered as chapters in books which are difficult to access; yet others were never published at all. The Collected Papers of Leonid Hurwicz is the first volume in a series of four that will bring his oeuvre together in one place, bringing to light the totality of his intellectual output and documenting his contribution to economics and the extent of his legacy, with the express purpose of making it easily available for future generations of researchers to build upon.
The global financial crisis saw many Eurozone countries bearing excessive public debt. This led the government bond yields of some peripheral countries to rise sharply, resulting in the outbreak of the European sovereign debt crisis. The debt crisis is characterized by its immediate spread from Greece, the country of origin, to its neighbouring countries, and by the connection between the Eurozone banking sector and public sector debt. Addressing these interesting features, this book sheds light on the impacts of the crisis on various financial markets in Europe. This book is among the first to conduct a thorough empirical analysis of the European sovereign debt crisis. It analyses, using advanced econometric methodologies, why the crisis escalated so prominently, having significant impacts on a wide range of financial markets rather than being limited to government bond markets. The book also allows one to understand the consequences and the overall impact of such a debt crisis, enabling investors and policymakers to formulate diversification strategies and create suitable regulatory frameworks.
First published in 1992, The Efficiency of New Issue Markets provides a theoretical discussion of the adverse selection model of the new issue market. It addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. In doing this, the book tests the efficiency of the Offer for Sale new issue market, demonstrating the validity of the adverse selection model and contradicting the monopsony power hypothesis. It then examines the relative efficiency of the new issue markets, which in turn demonstrates the importance of distribution in determining relative efficiency. The book provides a comprehensive overview of under-pricing and through this assesses the efficiency of new issue markets.
This title was first published in 2003. This book provides a much-needed comprehensive and up-to-date treatise on financial distress modelling. Since many of the challenges facing researchers of financial distress can only be addressed by a totally new research design and modelling methodology, this book concentrates on extending the potential for bankruptcy analysis from single-equation modelling to multi-equation analysis. Essentially, the work provides an innovative new approach by comparing each firm with itself over time rather than testing specific hypotheses or improving predictive and classificatory accuracy. Added to this new design, a whole new methodology - or way of modelling the process - is applied in the form of a family of models of which the traditional single-equation logit or MDA model is just a special case. Preliminary two-equation and three-equation models are presented and tested in the final chapters as a taste of things to come. The groundwork for a full treatise on these sorts of multi-equation systems is laid for further study - this family of models could be used as a basis for more specific applications to different industries and to test hypotheses concerning variables influencing bankruptcy risk.
The Handbook of Mathematical Economics aims to provide a definitive source, reference, and teaching supplement for the field of mathematical economics. It surveys, as of the late 1970s, the state of the art of mathematical economics. This is a constantly developing field and all authors were invited to review and to appraise the current status and recent developments in their presentations. In addition to its use as a reference, it is intended that this Handbook will assist researchers and students working in one branch of mathematical economics to become acquainted with other branches of this field. The emphasis of this fourth volume of the Handbook of Mathematical Economics is on choice under uncertainty, general equilibrium analysis under conditions of uncertainty, economies with an infinite number of consumers or commodities, and dynamical systems. The book thus reflects some of the ideas that have been most influential in mathematical economics since the appearance of the first three volumes of the Handbook. Researchers, students, economists and mathematicians will all find this Handbook to be an indispensable reference source. It surveys the entire field of mathematical economics, critically reviewing recent developments. The chapters (which can be read independently) are written at an advanced level suitable for professional, teaching and graduate-level use. For more information on the Handbooks in Economics series, please see our home page on http://www.elsevier.nl/locate/hes
Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences with emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; however, applied mathematicians, industry practitioners, and a vast number of interdisciplinary academics will find the presentation highly useful. Core topics of this text are:
* Economic growth and technological development
* Population dynamics and human impact on the environment
* Resource extraction and scarcity
* Air and water contamination
* Rational management of the economy and environment
* Climate change and global dynamics
The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar investigation techniques are applicable to analyze various models. Instructors will appreciate the substantial flexibility that this text allows while designing their own syllabus. Chapters are essentially self-contained and may be covered in full, in part, and in any order. Appropriate one- and two-semester courses include, but are not limited to, Applied Mathematical Modeling, Mathematical Methods in Economics and Environment, Models of Biological Systems, Applied Optimization Models, and Environmental Models. Prerequisites for the courses are Calculus and, preferably, Differential Equations.
This second edition of Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is organized into five parts. Chapters 2, 3, and 5 of Part I cover concisely many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates, and includes an updated chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV is new to this edition; it discusses evidence factors and the computerized construction of more than one comparison group. Part V discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies: "make your theories elaborate." This new edition features updated exploration of causal inference, with four new chapters, a new R package DOS2 designed as a companion for the book, and discussion of several of the latest matching packages for R. In particular, DOS2 allows readers to reproduce many analyses from Design of Observational Studies.
This book provides a practical introduction to mathematics for economics using R software. Using R as a basis, this book guides the reader through foundational topics in linear algebra, calculus, and optimization. The book is organized in order of increasing difficulty, beginning with a rudimentary introduction to R and progressing through exercises that require the reader to code their own functions in R. All chapters include applications for topics in economics and econometrics. As a fully reproducible book, this volume gives readers the opportunity to learn by doing and develop research skills as they go. As such, it is appropriate for students in economics and econometrics.
Originally published in 1984. This book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics of re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular, the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. Approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
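The first of these approaches can be sketched in a toy simulation (all series below are synthetic; this is not the book's data or exact specification): under the joint null of market efficiency and the equilibrium model, regressing the spot rate on the lagged forward rate should give an intercept near zero and a slope near one.

```python
# Synthetic illustration of the spot-on-lagged-forward efficiency regression.
# The data-generating process and all parameters are assumed for the sketch.
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Forward rate as a random walk around an arbitrary level of 1.5.
forward = np.cumsum(rng.normal(0.0, 0.01, n)) + 1.5
# Under the simulated null, next period's spot equals the forward rate
# plus an unpredictable forecast error.
spot_next = forward + rng.normal(0.0, 0.005, n)

# OLS of s_t on a constant and f_{t-1}: coefficients (alpha, beta).
X = np.column_stack([np.ones(n), forward])
alpha, beta = np.linalg.lstsq(X, spot_next, rcond=None)[0]
```

A slope estimate far from one, or a nonzero intercept, would reject the joint hypothesis; the test cannot by itself say whether efficiency or the equilibrium model is at fault, which is the identification issue the book emphasizes.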