Welcome to Loot.co.za!
This book provides a coherent description of the main concepts and statistical methods used to analyse economic performance. The focus is on measures of performance that are of practical relevance to policy makers. Most, if not all, of these measures can be viewed as measures of productivity and/or efficiency. Linking fields as diverse as index number theory, data envelopment analysis and stochastic frontier analysis, the book explains how to compute measures of input and output quantity change that are consistent with measurement theory. It then discusses ways in which meaningful measures of productivity change can be decomposed into measures of technical progress, environmental change, and different types of efficiency change. The book is aimed at graduate students, researchers, statisticians, accountants and economists working in universities, regulatory authorities, government departments and private firms. The book contains many numerical examples. Computer codes and datasets are available on a companion website.
Originally published in 1984. This book addresses the economics of the changing mineral industry, which is highly affected by energy economics. The study estimates, in quantitative terms, the short- to mid-term consequences of rising energy prices alongside falling ore quality for the copper and aluminum industries. The effects of changing cost factors on substitution between metals are assessed, as is the potential for relying on increased recycling. Copper and aluminum industry problems should be representative of those faced by the mineral processing sector as a whole. Two complex econometric models presented here produce forecasts for the industries, and the book discusses and reviews other econometric commodity models.
Originally published in 1979. This study focuses primarily on the development of a structural model for the U.S. Government securities market, i.e. the specification and estimation of the demands for disaggregated maturity classes of U.S. Government securities by the individual investor groups participating in the market. A particularly important issue addressed involves the extent of the substitution relationship among different maturity classes of U.S. Government securities.
Originally published in 1974. This book provides a rigorous and detailed introductory treatment of the theory of difference equations and their applications in the construction and analysis of dynamic economic models. It explains the theory of linear difference equations, and various types of dynamic economic models are then analysed. With plenty of examples of application throughout the text, it will be of use to those working in macroeconomics and econometrics.
Originally published in 1991. The dilemma of disposing of solid and hazardous waste in an environmentally safe manner has become a global problem. This book presents a modern approach to economic and operations research modelling in urban and regional waste management with an international perspective. Location and space economics are discussed along with transportation, technology, health hazards, capacity levels, political realities and the linkage with general global economic systems. The algorithms and models developed are then applied to two major world cities in case studies illustrating the use of these systems.
Reissuing works originally published between 1929 and 1991, this collection of 17 volumes presents a variety of considerations on Econometrics, from introductions to specific research works on particular industries. With some volumes on models for macroeconomics and international economies, this is a set of economic texts of wide interest. Input/Output methods and databases are examined in some volumes, while others look at Bayesian techniques and linear and non-linear models. This set will be of use to those in industry and business studies, geography and sociology as well as politics and economics.
The global financial crisis saw many Eurozone countries burdened with excessive public debt. This led the government bond yields of some peripheral countries to rise sharply, resulting in the outbreak of the European sovereign debt crisis. The debt crisis is characterized by its immediate spread from Greece, the country of origin, to its neighbouring countries, and by the connection between the Eurozone banking sector and public sector debt. Addressing these interesting features, this book sheds light on the impacts of the crisis on various financial markets in Europe. This book is among the first to conduct a thorough empirical analysis of the European sovereign debt crisis. It analyses, using advanced econometric methodologies, why the crisis escalated so prominently, having significant impacts on a wide range of financial markets rather than being limited to government bond markets. The book also allows one to understand the consequences and the overall impact of such a debt crisis, enabling investors and policymakers to formulate diversification strategies and create suitable regulatory frameworks.
First published in 1992, The Efficiency of New Issue Markets provides a theoretical discussion of the adverse selection model of the new issue market. It addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. In doing this, the book tests the efficiency of the Offer for Sale new issue market, which demonstrates the validity of the adverse selection model and contradicts the monopsony power hypothesis. It examines the relative efficiency of the new issue markets and in turn demonstrates the importance of distribution in determining relative efficiency. The book provides a comprehensive overview of under-pricing and through this assesses the efficiency of new issue markets.
This title was first published in 2003. This book provides a much-needed comprehensive and up-to-date treatise on financial distress modelling. Since many of the challenges facing researchers of financial distress can only be addressed by a totally new research design and modelling methodology, this book concentrates on extending the potential for bankruptcy analysis from single-equation modelling to multi-equation analysis. Essentially, the work provides an innovative new approach by comparing each firm with itself over time rather than testing specific hypotheses or improving predictive and classificatory accuracy. Added to this new design, a whole new methodology - or way of modelling the process - is applied in the form of a family of models of which the traditional single-equation logit or MDA model is just a special case. Preliminary two-equation and three-equation models are presented and tested in the final chapters as a taste of things to come. The groundwork for a full treatise on these sorts of multi-equation systems is laid for further study - this family of models could be used as a basis for more specific applications to different industries and to test hypotheses concerning variables influencing bankruptcy risk.
The Handbook of Mathematical Economics aims to provide a definitive source, reference, and teaching supplement for the field of mathematical economics. It surveys, as of the late 1970s, the state of the art of mathematical economics. This is a constantly developing field and all authors were invited to review and to appraise the current status and recent developments in their presentations. In addition to its use as a reference, it is intended that this Handbook will assist researchers and students working in one branch of mathematical economics to become acquainted with other branches of this field. The emphasis of this fourth volume of the Handbook of Mathematical Economics is on choice under uncertainty, general equilibrium analysis under conditions of uncertainty, economies with an infinite number of consumers or commodities, and dynamical systems. The book thus reflects some of the ideas that have been most influential in mathematical economics since the appearance of the first three volumes of the Handbook. Researchers, students, economists and mathematicians will all find this Handbook to be an indispensable reference source. It surveys the entire field of mathematical economics, critically reviewing recent developments. The chapters (which can be read independently) are written at an advanced level suitable for professional, teaching and graduate-level use. For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/hes
Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences with emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; however, applied mathematicians, industry practitioners, and a vast number of interdisciplinary academics will find the presentation highly useful. Core topics of this text are:
* Economic growth and technological development
* Population dynamics and human impact on the environment
* Resource extraction and scarcity
* Air and water contamination
* Rational management of the economy and environment
* Climate change and global dynamics
The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar investigation techniques are applicable to analyze various models. Instructors will appreciate the substantial flexibility that this text allows while designing their own syllabus. Chapters are essentially self-contained and may be covered in full, in part, and in any order. Appropriate one- and two-semester courses include, but are not limited to, Applied Mathematical Modeling, Mathematical Methods in Economics and Environment, Models of Biological Systems, Applied Optimization Models, and Environmental Models. Prerequisites for the courses are Calculus and, preferably, Differential Equations.
This second edition of Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is organized into five parts. Chapters 2, 3, and 5 of Part I cover concisely many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates, and includes an updated chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV is new to this edition; it discusses evidence factors and the computerized construction of more than one comparison group. Part V discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies: "make your theories elaborate." This new edition features updated exploration of causal inference, with four new chapters, a new R package DOS2 designed as a companion for the book, and discussion of several of the latest matching packages for R. In particular, DOS2 allows readers to reproduce many analyses from Design of Observational Studies.
This book provides a practical introduction to mathematics for economics using R software. Using R as a basis, this book guides the reader through foundational topics in linear algebra, calculus, and optimization. The book is organized in order of increasing difficulty, beginning with a rudimentary introduction to R and progressing through exercises that require the reader to code their own functions in R. All chapters include applications for topics in economics and econometrics. As a fully reproducible book, this volume gives readers the opportunity to learn by doing and develop research skills as they go. As such, it is appropriate for students in economics and econometrics.
Originally published in 1984. This book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics towards re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. Approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
Originally published in 1981, this book considers one particular area of econometrics, the linear model, where significant recent advances have been made. It considers both single and multi-equation models with varying coefficients, explains the various theories and techniques connected with these and goes on to describe the various applications of the models. Whilst the detailed explanation of the models will interest primarily econometrics specialists, the implications of the advances outlined and the applications of the models will interest a wide range of economists.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of the size of countries, reflecting the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration on the size of countries, using a data set of 218 countries, 45 of which are European.
'Big data' is now readily available to economic historians, thanks to the digitisation of primary sources, collaborative research linking different data sets, and the publication of databases on the internet. Key economic indicators, such as the consumer price index, can be tracked over long periods, and qualitative information, such as land use, can be converted to a quantitative form. In order to fully exploit these innovations it is necessary to use sophisticated statistical techniques to reveal the patterns hidden in datasets, and this book shows how this can be done. A distinguished group of economic historians have teamed up with younger researchers to pilot the application of new techniques to 'big data'. Topics addressed in this volume include prices and the standard of living, money supply, credit markets, land values and land use, transport, technological innovation, and business networks. The research spans the medieval, early modern and modern periods. Research methods include simultaneous equation systems, stochastic trends and discrete choice modelling. This book is essential reading for doctoral and post-doctoral researchers in business, economic and social history. The case studies will also appeal to historical geographers and applied econometricians.
This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging when based on conventional methods. Among other methods, the book introduces a modern text-mining method called dynamic topic modeling in detail and applies it to the message board of Bitcoins. The unique synthesis of theory and practice supported by computational tools is reflected not only in the selection of topics, but also in the fine balance of scientific contributions on practical implementation and theoretical concepts. This link between theory and practice offers theoreticians insights into considerations of applicability and, vice versa, provides practitioners convenient access to new techniques in quantitative finance. Hence the book will appeal both to researchers, including master and PhD students, and practitioners, such as financial engineers. The results presented in the book are fully reproducible and all quantlets needed for calculations are provided on an accompanying website.
The Quantlet platform (quantlet.de, quantlet.com, quantlet.org) is an integrated QuantNet environment consisting of different types of statistics-related documents and program codes. Its goal is to promote reproducibility and offer a platform for sharing validated knowledge native to the social web. QuantNet and the corresponding Data-Driven Documents-based visualization allow readers to reproduce the tables, pictures and calculations inside this Springer book.
An accessible, contemporary introduction to the methods for determining cause and effect in the social sciences. "Causation versus correlation has been the basis of arguments, economic and otherwise, since the beginning of time. Causal Inference: The Mixtape uses legit real-world examples that I found genuinely thought-provoking. It's rare that a book prompts readers to expand their outlook; this one did for me." (Marvin Young (Young MC)) Causal inference encompasses the tools that allow social scientists to determine what causes what. In a messy world, causal inference is what helps establish the causes and effects of the actions being studied: for example, the impact (or lack thereof) of increases in the minimum wage on employment, the effects of early childhood education on incarceration later in life, or the influence on economic growth of introducing malaria nets in developing regions. Scott Cunningham introduces students and practitioners to the methods necessary to arrive at meaningful answers to the questions of causation, using a range of modeling techniques and coding instructions for both the R and the Stata programming languages.
Essays in Economic Theory, first published in 1983, combines two essays on game theory and its applications in economics. The first, "Learning Behavior and the Noncooperative Equilibrium", considers whether an adaptive justification, like those commonly available for the optimization models frequently employed elsewhere in economics, can be found for the Nash noncooperative equilibrium. The second essay, "A Game of Fair Division", was motivated by the desire to find attractive methods for solving allocation problems and bargaining disputes that are simple enough to provide useful alternatives to existing methods. It studies in detail one such simple method: the classical "divide-and-choose" procedure. This book will be of interest to students of economics.
This book, first published in 1992, examines the subject of foreign exchange market efficiency and, in particular, the effectiveness of central bank intervention in the market. This book is ideal for students of economics.
The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians, statisticians, and in advanced graduate econometrics courses. For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/hes
The 30th Volume of Advances in Econometrics is in honor of the two individuals whose hard work has helped ensure thirty successful years of the series, Thomas Fomby and R. Carter Hill. The volume begins with a history of the Advances series by Asli Ogunc and Randall Campbell summarizing the prior volumes. Tom Fomby and Carter Hill both provide discussions of the role of Advances over the years. The remaining articles include contributions by a number of authors who have played key roles in the series over the years and in the careers of Fomby and Hill. Overall, this leads to a more diverse mix of papers than a typical volume of Advances in Econometrics.
Economic Modeling Using Artificial Intelligence Methods examines the application of artificial intelligence methods to model economic data. Traditionally, economic modeling has been conducted in the linear domain, where the principles of superposition are valid. The application of artificial intelligence for economic modeling allows for flexible multi-order non-linear modeling. In addition, game theory has largely been applied in economic modeling. However, the inherent limitation of game theory when dealing with many-player games encourages the use of multi-agent systems for modeling economic phenomena. The artificial intelligence techniques used to model economic data include:
* multi-layer perceptron neural networks
* radial basis functions
* support vector machines
* rough sets
* genetic algorithms
* particle swarm optimization
* simulated annealing
* multi-agent systems
* incremental learning
* fuzzy networks
Signal processing techniques are explored to analyze economic data; these techniques are the time domain methods, time-frequency domain methods and fractal dimension approaches. Interesting economic problems such as causality versus correlation, simulating the stock market, modeling and controlling inflation, option pricing, modeling economic growth as well as portfolio optimization are examined. The relationship between economic dependency and interstate conflict is explored, and knowledge on how economics is useful to foster peace - and vice versa - is investigated. Economic Modeling Using Artificial Intelligence Methods deals with the issue of causality in the non-linear domain and applies automatic relevance determination, the evidence framework, the Bayesian approach and Granger causality to understand causality and correlation. Economic Modeling Using Artificial Intelligence Methods makes an important contribution to the area of econometrics, and is a valuable source of reference for graduate students, researchers and financial practitioners.
In the memorable words of Ragnar Frisch, econometrics is 'a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems'. Beginning to take shape in the 1930s and 1940s, econometrics is now recognized as a vital subdiscipline supported by a vast, and still rapidly growing, body of literature. Following the positive reception of The Rise of Econometrics (2013) (978-0-415-61678-2), Routledge now announces a new collection bringing together the best that has been published on the practical application and functional use of economic metrics and measurements. With a comprehensive introduction, newly written by the editor, which places the assembled materials in their historical and intellectual context, Applied Econometrics is an essential work of reference. This fully indexed collection will be particularly useful as an indispensable database allowing scattered and often fugitive material to be easily located. It will also be welcomed as a crucial tool permitting rapid access to less familiar, and sometimes overlooked, texts. For researchers and students, as well as economic policy-makers, it is a vital one-stop research and pedagogic resource.
You may like...
* Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
* The Oxford Handbook of Applied Bayesian… by Anthony O'Hagan, Mike West (Hardcover, R4,188)
* Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,286)
* Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,567)
* Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
* The Oxford Handbook of the Economics of… by Yann Bramoulle, Andrea Galeotti, … (Hardcover, R5,455)
* Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)
* Applied Econometric Analysis - Emerging… by Brian W Sloboda, Yaya Sissoko (Hardcover, R5,351)