Convex optimization problems arise frequently in many different fields. This book provides a comprehensive introduction to the subject, and shows in detail how such problems can be solved numerically with great efficiency. The book begins with the basic elements of convex sets and functions, and then describes various classes of convex optimization problems. Duality and approximation techniques are then covered, as are statistical estimation techniques. Various geometrical problems are then presented, and there is detailed discussion of unconstrained and constrained minimization problems, and interior-point methods. The focus of the book is on recognizing convex optimization problems and then finding the most appropriate technique for solving them. It contains many worked examples and homework exercises and will appeal to students, researchers and practitioners in fields such as engineering, computer science, mathematics, statistics, finance and economics.
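As a rough illustration of the kind of problem the book teaches readers to recognize, the sketch below solves a small nonnegative least-squares problem, one of the standard convex classes, using the open-source CVXPY modeling library. CVXPY and the problem data are illustrative assumptions, not material from the book itself.

```python
# Minimal sketch: nonnegative least-squares, a standard convex
# problem class. CVXPY is an illustrative choice of tool.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))   # data matrix (illustrative)
b = rng.standard_normal(30)         # observations (illustrative)

x = cp.Variable(10)
objective = cp.Minimize(cp.sum_squares(A @ x - b))  # convex objective
constraints = [x >= 0]                              # convex constraint set
prob = cp.Problem(objective, constraints)
prob.solve()   # a conic / interior-point solver runs under the hood

print(prob.status, prob.value)   # 'optimal' and the minimal objective
```

Once a problem is recognized as convex and written in this form, solving it is largely mechanical, which is exactly the workflow the book emphasizes.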
In the memorable words of Ragnar Frisch, econometrics is 'a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems'. Beginning to take shape in the 1930s and 1940s, econometrics is now recognized as a vital subdiscipline supported by a vast, and still rapidly growing, body of literature. Following the positive reception of The Rise of Econometrics (2013) (978-0-415-61678-2), Routledge now announces a new collection from its Critical Concepts in Economics series. With a comprehensive introduction, newly written by the editor, which places the assembled materials in their historical and intellectual context, Time Series Econometrics is an essential work of reference. This fully indexed collection will be particularly useful as an essential database allowing scattered and often fugitive material to be easily located. It will also be welcomed as a crucial tool permitting rapid access to less familiar, and sometimes overlooked, texts. For researchers and students, as well as economic policy-makers, it is a vital one-stop research and pedagogic resource.
This book studies the information spillover among financial markets and explores the intraday effect and ACD models with high frequency data. This book also contributes theoretically by providing a new statistical methodology with comparative advantages for analyzing co-movements between two time series. It explores this new method by testing the information spillover between the Chinese stock market and the international market, futures market and spot market. Using the high frequency data, this book investigates the intraday effect and examines which type of ACD model is particularly suited in capturing financial duration dynamics. The book will be of invaluable use to scholars and graduate students interested in co-movements among different financial markets and financial market microstructure and to investors and regulation departments looking to improve their risk management.
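For readers unfamiliar with ACD models, the sketch below simulates the basic ACD(1,1) specification, in which the duration between trades is the product of a conditional expected duration and a unit-mean innovation. The parameter values are illustrative assumptions, not estimates from the book.

```python
# Sketch of the ACD(1,1) duration model: x_i = psi_i * eps_i, with
# psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1} and eps_i i.i.d.
# unit-mean exponential. All parameter values are illustrative.
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.exponential(1.0, n)          # unit-mean innovations
    x = np.empty(n)                        # observed durations
    psi = np.empty(n)                      # conditional expected durations
    psi[0] = omega / (1 - alpha - beta)    # unconditional mean as start value
    x[0] = psi[0] * eps[0]
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * eps[i]
    return x, psi

durations, _ = simulate_acd(10_000)
print(durations.mean())   # close to omega / (1 - alpha - beta) = 1.0
```

The different ACD variants the book compares change the innovation distribution or the recursion for psi, while this multiplicative structure stays fixed.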
Most economists assume that the mathematical and quantitative sides of their science are relatively recent developments. Measurement, Quantification and Economic Analysis shows that this is a misconception. Its authors argue that economists have long relied on measurement and quantification as essential tools.
This title, first published in 1979, presents the Ph.D. thesis of the world-renowned economist and financial expert, Willem Buiter. In Part I, three alternative specifications of temporary equilibria in asset markets, including their implications for macroeconomic models, are discussed; Part II examines the long-term implications of some short-term macroeconomic models. The analysis of the theoretical foundations of 'direct crowding out' and 'indirect crowding out' is particularly prominent, with the result that a synthesis of short-term macroeconomic analysis and long-term growth theory is formulated. The traditional tools of comparative dynamics and stability analysis are employed frequently. However, it is also argued that the true scope of government policy can only be adequately evaluated with the aid of concepts such as dynamic and static controllability. Temporary Equilibrium and Long-Run Equilibrium is a valuable study, relevant for all serious students of modern economic theory.
This book brings together the latest research in the areas of market microstructure and high-frequency finance along with new econometric methods to address critical practical issues in these areas of research. Thirteen chapters, each of which makes a valuable and significant contribution to the existing literature, have been brought together, spanning a wide range of topics including information asymmetry and the information content in limit order books, high-frequency return distribution models, multivariate volatility forecasting, analysis of individual trading behaviour, the analysis of liquidity, price discovery across markets, market microstructure models and the information content of order flow. These issues are central both to the rapidly expanding practice of high-frequency trading in financial markets and to the further development of the academic literature in this area. The volume will therefore be of immediate interest to practitioners and academics. This book was originally published as a special issue of the European Journal of Finance.
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore various disciplines from which the concepts and methods of artificial economics originate; for example, cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions on several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophies of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting in a balanced way its occasionally controversial aspects.
First published in 1994. Concepts of probability are an integral component of economic theory. However, there is a wide range of theories of probability, and these are manifested in different approaches to economic theory itself. In this book Charles McCann, Jr. provides a clear and informative survey of the area which serves to standardize terminology and so integrate probability into a discussion of the foundations of economic theory. This is illustrated by examples from Austrian, Keynesian and New Classical Economics.
The authors present a basic model of the Bayesian implementation problem and then consider its application in areas including classical pure exchange economies, public goods provision, auctions and bargaining.
This book contains a set of notes prepared by Ragnar Frisch for a lecture series that he delivered at Yale University in 1930. The lecture notes provide not only a valuable source document for the history of econometrics, but also a more systematic introduction to some of Frisch's key methodological ideas than his other works so far published in various media for the econometrics community. In particular, these notes contain a number of prescient ideas precursory to some of the most important notions developed in econometrics during the 1970s and 1980s. More remarkably, Frisch demonstrated a deep understanding of what econometric or statistical analysis could achieve in situations where correct theoretical models were not known. This volume has been rigorously edited and comes with an introductory essay from Olav Bjerkholt and Duo Qin placing the notes in their historical context.
'Big data' is now readily available to economic historians, thanks to the digitisation of primary sources, collaborative research linking different data sets, and the publication of databases on the internet. Key economic indicators, such as the consumer price index, can be tracked over long periods, and qualitative information, such as land use, can be converted to a quantitative form. In order to fully exploit these innovations it is necessary to use sophisticated statistical techniques to reveal the patterns hidden in datasets, and this book shows how this can be done. A distinguished group of economic historians have teamed up with younger researchers to pilot the application of new techniques to 'big data'. Topics addressed in this volume include prices and the standard of living, money supply, credit markets, land values and land use, transport, technological innovation, and business networks. The research spans the medieval, early modern and modern periods. Research methods include simultaneous equation systems, stochastic trends and discrete choice modelling. This book is essential reading for doctoral and post-doctoral researchers in business, economic and social history. The case studies will also appeal to historical geographers and applied econometricians.
In this landmark collection, the editor has selected the most influential papers on the econometrics of panel data published from 1992 to 2001, thus providing an update on developments in the field since the two volumes edited by G.S. Maddala in 1993, which covered the period from 1966 to 1992. Topics covered in these latest volumes include core articles on dynamic panels and the generalized method of moments, heterogeneous panels, non-stationary panels including spurious regression, unit roots and tests for cointegration in panels, limited dependent variable models using panel data including models with censored endogenous variables and sample selection, non-linear panel data models, unbalanced panels, pseudo-panels and specification tests in panels.
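As background to these papers, the sketch below shows the fixed-effects (within) estimator, one of the basic panel-data tools the collected literature builds on. The data-generating process is simulated and every number in it is an illustrative assumption.

```python
# Sketch: the fixed-effects (within) estimator on a simulated panel.
# Dimensions, coefficients, and the DGP are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 10                       # individuals, time periods
alpha = rng.standard_normal(N)       # unobserved individual effects
x = rng.standard_normal((N, T)) + alpha[:, None]   # regressor correlated with effects
y = 2.0 * x + alpha[:, None] + rng.standard_normal((N, T))

# Within transformation: demean each individual's series, removing alpha_i.
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)

beta_fe = (x_dm * y_dm).sum() / (x_dm ** 2).sum()
print(beta_fe)   # consistent for the true slope 2.0; pooled OLS would be biased here
```

The dynamic-panel and GMM papers in these volumes start from exactly this setting and ask what happens when a lagged dependent variable joins the regressors, where the within estimator breaks down.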
Non-market valuation has become a broadly accepted and widely practiced means of measuring the economic values of the environment and natural resources. In this book, the authors provide a guide to the statistical and econometric practices that economists employ in estimating non-market values. The authors develop the econometric models that underlie the basic methods: contingent valuation, travel cost models, random utility models and hedonic models. They analyze the measurement of non-market values as a procedure with two steps: the estimation of parameters of demand and preference functions and the calculation of benefits from the estimated models. Each of the models is carefully developed from the preference function to the behavioral or response function that researchers observe. The models are then illustrated with datasets that characterize the kinds of data researchers typically deal with. The real-world data and clarity of writing in this book will appeal to environmental economists, students, researchers and practitioners in multilateral banks and government agencies.
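A minimal sketch of that two-step procedure, using a single-site travel cost model with a semi-log (Poisson) trip demand. The data are simulated and every value is an illustrative assumption, not one of the authors' datasets.

```python
# Sketch of the two-step procedure: (1) estimate demand parameters,
# (2) compute welfare from them. Simulated single-site travel cost data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
travel_cost = rng.uniform(5, 50, n)
# Semi-log trip demand: E[trips] = exp(b0 + b1 * cost), with b1 < 0.
trips = rng.poisson(np.exp(2.0 - 0.05 * travel_cost))

# Step 1: estimate the demand (response) function.
X = sm.add_constant(travel_cost)
fit = sm.GLM(trips, X, family=sm.families.Poisson()).fit()
b_cost = fit.params[1]

# Step 2: calculate benefits. For semi-log demand the consumer
# surplus per trip is -1 / b_cost (a standard travel-cost result).
cs_per_trip = -1.0 / b_cost
print(f"cost coefficient {b_cost:.3f}, CS per trip {cs_per_trip:.2f}")
```

The estimation step changes with each method the book covers (contingent valuation, random utility, hedonics), but the estimate-then-calculate-welfare structure carries through.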
Nonlinear modelling has become increasingly important and widely used in economics. This valuable book brings together recent advances in the area including contributions covering cross-sectional studies of income distribution and discrete choice models, time series models of exchange rate dynamics and jump processes, and artificial neural network and genetic algorithm models of financial markets. Attention is given to the development of theoretical models as well as estimation and testing methods with a wide range of applications in micro and macroeconomics, labour and finance. The book provides valuable introductory material that is accessible to students and scholars interested in this exciting research area, as well as presenting the results of new and original research. Nonlinear Economic Models provides a sequel to Chaos and Nonlinear Models in Economics by the same editors.
The development of economics changed dramatically during the twentieth century with the emergence of econometrics, macroeconomics and a more scientific approach in general. One of the key individuals in the transformation of economics was Ragnar Frisch, professor at the University of Oslo and the first Nobel Laureate in economics in 1969. He was a co-founder of the Econometric Society in 1930 (after having coined the word econometrics in 1926) and edited the journal Econometrica for twenty-two years. The discovery of the manuscripts of a series of eight lectures given by Frisch at the Henri Poincaré Institute in March-April 1933 on The Problems and Methods of Econometrics will enable economists to more fully understand his overall vision of econometrics. This book is a rare exhibition of Frisch's overview of econometrics and is published here in English for the first time. Edited and with an introduction by Olav Bjerkholt and Ariane Dupont-Kieffer, Frisch's eight lectures provide an accessible and astute discussion of econometric issues, from philosophical foundations to practical procedures. Covering the development of economics in the twentieth century and Frisch's broader visions of economic science in general and econometrics in particular, this book will appeal to anyone with an interest in the history of economics and econometrics.
Portfolio theory and much of asset pricing, as well as many empirical applications, depend on the use of multivariate probability distributions to describe asset returns. Traditionally, this has meant the multivariate normal (or Gaussian) distribution. More recently, theoretical and empirical work in financial economics has employed the multivariate Student (and other) distributions which are members of the elliptically symmetric class. There is also a growing body of work which is based on skew-elliptical distributions. These probability models all exhibit the property that the marginal distributions differ only by location and scale parameters or are restrictive in other respects. Very often, such models are not supported by the empirical evidence that the marginal distributions of asset returns can differ markedly. Copula theory is a branch of statistics which provides powerful methods to overcome these shortcomings. This book provides a synthesis of the latest research in the area of copulae as applied to finance and related subjects such as insurance. Multivariate non-Gaussian dependence is a fact of life for many problems in financial econometrics. This book describes the state of the art in tools required to deal with these observed features of financial data. This book was originally published as a special issue of the European Journal of Finance.
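The sketch below shows the basic copula construction the book builds on: correlated uniforms are generated from a Gaussian copula and then pushed through arbitrary inverse CDFs, so the marginal distributions can differ markedly while the dependence structure is preserved. The correlation value and the choice of marginals are illustrative assumptions.

```python
# Sketch: sampling from a Gaussian copula with non-Gaussian marginals.
# rho and the marginal distributions are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])

# 1. Draw correlated standard normals.
z = rng.multivariate_normal(mean=[0, 0], cov=cov, size=10_000)
# 2. Map to uniforms through the normal CDF: this pair is the copula.
u = stats.norm.cdf(z)
# 3. Impose arbitrary marginals via inverse CDFs, e.g. heavy-tailed
#    Student-t returns and exponential (nonnegative) losses.
returns = stats.t.ppf(u[:, 0], df=4)
losses = stats.expon.ppf(u[:, 1])

# The dependence survives even though neither marginal is Gaussian.
print(stats.spearmanr(returns, losses)[0])
```

Replacing step 1 with a t or skew-elliptical copula changes only the dependence structure, which is what makes the approach flexible enough for the non-Gaussian features the book documents.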
This title provides a comprehensive, critical coverage of the progress and development of mathematical modelling within urban and regional economics over four decades.
"A collection of proofs of fundamental theorems, this volume utilizes a format that is exhaustive and consistent. Every result covered in Econometrics''is proved as well as stated. One notation system is used throughout the volume. The topics included in the book cover such areas as estimations and testing in linear regression models under various sets of assumptions, and estimation and testing in simultaneous equations models. The latter subject is treated more extensively than in most econometrics books, and the entire volume is characterized by its rigorous level of examination. "
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
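As a small taste of the systems the book approximates, the sketch below simulates waiting times in a single-server queue via the Lindley recursion. The utilization level and run length are illustrative assumptions.

```python
# Sketch: waiting times in an M/M/1 queue via the Lindley recursion
# W_{k+1} = max(0, W_k + S_k - A_{k+1}), the raw process behind the
# fluid and diffusion approximations. rho and n are illustrative.
import numpy as np

def mm1_waits(n, rho, mu=1.0, seed=0):
    rng = np.random.default_rng(seed)
    inter = rng.exponential(1.0 / (rho * mu), n)  # interarrival times A_k
    serv = rng.exponential(1.0 / mu, n)           # service times S_k
    w = np.zeros(n)
    for k in range(n - 1):
        w[k + 1] = max(0.0, w[k] + serv[k] - inter[k + 1])
    return w

rho = 0.95
w = mm1_waits(500_000, rho)         # long run needed this close to rho = 1
print(w.mean(), rho / (1.0 - rho))  # simulation vs. exact mean wait (mu = 1)
```

Mean waits grow like 1/(1 - rho) as utilization approaches one, which is precisely the regime where the book's diffusion (heavy-traffic) approximations take over from brute-force simulation.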
First published in 1970. Econometric model-building, on the other hand, has been largely confined to the advanced industrialised countries. In the few cases where macro-models have been built for underdeveloped countries (e.g. the Narasimham model (112) for India) the underlying assumptions have been largely of the Keynesian type, and thus in the authors' opinion unconnected with the theory of economic development. This study is a modest attempt at econometric model-building on the basis of a model of development of an underdeveloped country.
Thoroughly classroom tested, this introductory text covers all the topics that constitute a foundation for basic econometrics, with concise and intuitive explanations of technical material. Important proofs are shown in detail; however, the focus is on developing regression models and understanding the residuals.
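A minimal sketch of the workflow such a text centers on: fit a regression by least squares and inspect the residuals. The data and dimensions are illustrative assumptions.

```python
# Sketch: ordinary least squares on simulated data, then residual checks.
# True coefficients and sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + rng.normal(0, 1, n)    # true intercept 1.0, slope 0.5

X = np.column_stack([np.ones(n), x])        # design matrix with constant
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print(beta)               # close to [1.0, 0.5]
print(residuals.mean())   # ~0 by construction when an intercept is included
```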
This book constitutes the first serious attempt to explain the basics of econometrics and its applications in the clearest and simplest manner possible. Recognising the fact that a good level of mathematics is no longer a necessary prerequisite for economics/financial economics undergraduate and postgraduate programmes, it introduces this key subdivision of economics to an audience who might otherwise have been deterred by its complex nature.
In this book, Nancy and Richard Ruggles demonstrate their unique grasp of the measurement and analysis of macro and micro data and elucidate ways of integrating the two data sets. Their analysis of macrodata is used to examine the economic growth of the United States from the 1920s to the present day. They focus particularly on recession and recovery between 1929 and 1974 and the measurement of short-run economic growth. They also examine the measurement of saving, investment and capital formation in the United States. On a microeconomic level, they analyse economic intelligence in World War II, offer a study of fertility in the United States in the pre-war era and analyse longitudinal establishment data. Finally, they integrate the two approaches to provide a more complete picture of social and economic performance.
"Transportation Economics" explores the efficient use of society's
scarce resources for the movement of people and goods. This book
carefully examines transportation markets and standard economic
tools, how these resources are used, and how the allocation of
society resources affects transportation activities. This textbook is unique in that it uses a detailed analysis of
econometric results from current transportation literature to
provide an integrated collection of theory and application. Its
numerous case studies illustrate the economic principles, discuss
testable hypotheses, analyze econometric results, and examine each
study's implications for public policy. These features make this a
well-developed introduction to the foundations of transportation
economics. Additional case studies on a spectrum of domestic and
international transportation topics available at http:
//www.blackwellpublishers.co.uk/mccarthy in order to keep students
abreast of recent developments in the field and their implications
for public policy. The paperback edition of this book is not available from Blackwell in the US or Canda.
This two-volume set is a collection of 30 classic papers presenting ideas which have now become standard in the field of Bayesian inference. Topics covered include the central field of statistical inference as well as applications to areas of probability theory, information theory, utility theory and computational theory. It is organized into seven sections: foundations; information theory and prior distributions; robustness and outliers; hierarchical, multivariate and non-parametric models; asymptotics; computations and Monte Carlo methods; and Bayesian econometrics.
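To make the computations and Monte Carlo theme concrete, the sketch below works through the simplest conjugate Bayesian update, Beta-Binomial, and checks the analytic posterior mean against a self-normalized importance-sampling estimate built from prior draws. The prior, data, and sample sizes are illustrative assumptions.

```python
# Sketch: Beta-Binomial conjugate update, verified by Monte Carlo.
# Prior parameters and observed data are illustrative assumptions.
import numpy as np
from scipy import stats

a, b = 2.0, 2.0          # Beta(2, 2) prior on a success probability
k, n = 7, 10             # observed 7 successes in 10 trials

# Conjugacy: the posterior is Beta(a + k, b + n - k).
posterior = stats.beta(a + k, b + n - k)
print(posterior.mean())   # analytic posterior mean

# Monte Carlo check: weight prior draws by the likelihood
# (self-normalized importance sampling).
theta = stats.beta(a, b).rvs(size=200_000, random_state=0)
w = theta**k * (1 - theta)**(n - k)
print((w * theta).sum() / w.sum())   # approaches the analytic mean
```

When no conjugate form exists, the weighting step is replaced by the Monte Carlo machinery surveyed in the computations section of the collection.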