Welcome to Loot.co.za!
This book provides a practical introduction to mathematics for economics using R software. Using R as a basis, it guides the reader through foundational topics in linear algebra, calculus, and optimization. The book is organized in order of increasing difficulty, beginning with a rudimentary introduction to R and progressing through exercises that require the reader to code their own functions in R. All chapters include applications for topics in economics and econometrics. As a fully reproducible book, this volume gives readers the opportunity to learn by doing and to develop research skills as they go. As such, it is appropriate for students in economics and econometrics.
Originally published in 1984, this book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics of re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular, the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. The approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
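The efficiency regression this blurb describes can be sketched on synthetic data. Under market efficiency the lagged forward rate is an unbiased predictor of the spot rate, so regressing s_t on f_{t-1} should give an intercept near 0 and a slope near 1. Everything below (the simulated rates, the noise scale, the sample size) is invented for illustration, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: under efficiency, s_t = f_{t-1} + unpredictable noise.
n = 500
f = np.cumsum(rng.normal(0, 0.01, n)) + 1.5   # hypothetical forward rates
s = np.empty(n)
s[1:] = f[:-1] + rng.normal(0, 0.005, n - 1)  # spot tracks the lagged forward
s[0] = f[0]

# Efficiency regression: s_t = a + b * f_{t-1}; efficiency implies a = 0, b = 1.
X = np.column_stack([np.ones(n - 1), f[:-1]])
y = s[1:]
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(a, 3), round(b, 3))
```

On data generated this way the estimates land close to (0, 1); with real exchange-rate data, the book's point is precisely that a joint test of efficiency and the equilibrium model is needed before interpreting deviations.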
Originally published in 1981, this book considers one particular area of econometrics - the linear model - where significant recent advances have been made. It considers both single and multi-equation models with varying coefficients, explains the various theories and techniques connected with these, and goes on to describe the various applications of the models. Whilst the detailed explanation of the models will interest primarily econometrics specialists, the implications of the advances outlined and the applications of the models will interest a wide range of economists.
This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and of cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging when analysed with conventional methods. Among other methods, the book introduces a modern text-mining technique called dynamic topic modeling in detail and applies it to a Bitcoin message board. The unique synthesis of theory and practice supported by computational tools is reflected not only in the selection of topics, but also in the fine balance of scientific contributions on practical implementation and theoretical concepts. This link between theory and practice offers theoreticians insights into considerations of applicability and, vice versa, provides practitioners convenient access to new techniques in quantitative finance. Hence the book will appeal both to researchers, including master's and PhD students, and to practitioners, such as financial engineers. The results presented in the book are fully reproducible, and all quantlets needed for the calculations are provided on an accompanying website.
The Quantlet platform (quantlet.de, quantlet.com, quantlet.org) is an integrated QuantNet environment consisting of different types of statistics-related documents and program codes. Its goal is to promote reproducibility and offer a platform for sharing validated knowledge native to the social web. QuantNet and the corresponding Data-Driven Documents-based visualization allow readers to reproduce the tables, pictures and calculations inside this Springer book.
'Big data' is now readily available to economic historians, thanks to the digitisation of primary sources, collaborative research linking different data sets, and the publication of databases on the internet. Key economic indicators, such as the consumer price index, can be tracked over long periods, and qualitative information, such as land use, can be converted to a quantitative form. In order to fully exploit these innovations it is necessary to use sophisticated statistical techniques to reveal the patterns hidden in datasets, and this book shows how this can be done. A distinguished group of economic historians have teamed up with younger researchers to pilot the application of new techniques to 'big data'. Topics addressed in this volume include prices and the standard of living, money supply, credit markets, land values and land use, transport, technological innovation, and business networks. The research spans the medieval, early modern and modern periods. Research methods include simultaneous equation systems, stochastic trends and discrete choice modelling. This book is essential reading for doctoral and post-doctoral researchers in business, economic and social history. The case studies will also appeal to historical geographers and applied econometricians.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of the size of countries, reflecting the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration by country size, using a data set of 218 countries, 45 of which are European.
An accessible, contemporary introduction to the methods for determining cause and effect in the social sciences "Causation versus correlation has been the basis of arguments-economic and otherwise-since the beginning of time. Causal Inference: The Mixtape uses legit real-world examples that I found genuinely thought-provoking. It's rare that a book prompts readers to expand their outlook; this one did for me."-Marvin Young (Young MC) Causal inference encompasses the tools that allow social scientists to determine what causes what. In a messy world, causal inference is what helps establish the causes and effects of the actions being studied-for example, the impact (or lack thereof) of increases in the minimum wage on employment, the effects of early childhood education on incarceration later in life, or the influence on economic growth of introducing malaria nets in developing regions. Scott Cunningham introduces students and practitioners to the methods necessary to arrive at meaningful answers to the questions of causation, using a range of modeling techniques and coding instructions for both the R and the Stata programming languages.
Applied econometrics, known to aficionados as 'metrics, is the original data science. 'Metrics encompasses the statistical methods economists use to untangle cause and effect in human affairs. Through accessible discussion and with a dose of kung fu-themed humor, Mastering 'Metrics presents the essential tools of econometric research and demonstrates why econometrics is exciting and useful. The five most valuable econometric methods, or what the authors call the Furious Five - random assignment, regression, instrumental variables, regression discontinuity designs, and differences in differences - are illustrated through well-crafted real-world examples (vetted for awesomeness by Kung Fu Panda's Jade Palace). Does health insurance make you healthier? Randomized experiments provide answers. Are expensive private colleges and selective public high schools better than more pedestrian institutions? Regression analysis and a regression discontinuity design reveal the surprising truth. When private banks teeter, and depositors take their money and run, should central banks step in to save them? Differences-in-differences analysis of a Depression-era banking crisis offers a response. Could arresting O. J. Simpson have saved his ex-wife's life? Instrumental variables methods instruct law enforcement authorities in how best to respond to domestic abuse. Wielding econometric tools with skill and confidence, Mastering 'Metrics uses data and statistics to illuminate the path from cause to effect.
* Shows why econometrics is important
* Explains econometric research through humorous and accessible discussion
* Outlines empirical methods central to modern econometric practice
* Works through interesting and relevant real-world examples
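Of the Furious Five, differences-in-differences is the easiest to illustrate with a toy two-group, two-period calculation: the change in the treated group minus the change in the control group nets out the common time trend. The numbers below are hypothetical, not drawn from the book:

```python
# Hypothetical group means before and after treatment (all numbers invented).
pre = {"treated": 10.0, "control": 8.0}
post = {"treated": 15.0, "control": 11.0}

# Difference-in-differences estimate: (treated change) - (control change).
# The control group's change (11 - 8 = 3) stands in for the trend the treated
# group would have followed without treatment.
did = (post["treated"] - pre["treated"]) - (post["control"] - pre["control"])
print(did)  # 2.0
```

The identifying assumption, which the book discusses at length, is that the two groups would have moved in parallel absent the intervention.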
Essays in Economic Theory, first published in 1983, combines two essays on game theory and its applications in economics. The first, "Learning Behavior and the Noncooperative Equilibrium", considers whether an adaptive justification, like those commonly available for the optimization models frequently employed elsewhere in economics, can be found for the Nash noncooperative equilibrium. The second essay, "A Game of Fair Division", was motivated by the desire to find attractive methods for solving allocation problems and bargaining disputes that are simple enough to provide useful alternatives to existing methods. It studies in detail one such simple method: the classical "divide-and-choose" procedure. This book will be of interest to students of economics.
This book, first published in 1992, examines the subject of foreign exchange market efficiency and, in particular, the effectiveness of central bank intervention in the market. This book is ideal for students of economics.
The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians, statisticians, and in advanced graduate econometrics courses. For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/hes
Economic Modeling Using Artificial Intelligence Methods examines the application of artificial intelligence methods to model economic data. Traditionally, economic data has been modeled in the linear domain, where the principles of superposition are valid. The application of artificial intelligence to economic modeling allows for flexible multi-order non-linear modeling. In addition, game theory has largely been applied in economic modeling; however, the inherent limitation of game theory when dealing with many-player games encourages the use of multi-agent systems for modeling economic phenomena. The artificial intelligence techniques used to model economic data include multi-layer perceptron neural networks, radial basis functions, support vector machines, rough sets, genetic algorithms, particle swarm optimization, simulated annealing, multi-agent systems, incremental learning and fuzzy networks. Signal processing techniques are explored to analyze economic data; these techniques are the time domain methods, time-frequency domain methods and fractal dimension approaches. Interesting economic problems such as causality versus correlation, simulating the stock market, modeling and controlling inflation, option pricing, modeling economic growth and portfolio optimization are examined. The relationship between economic dependency and interstate conflict is explored, and knowledge of how economics is useful to foster peace - and vice versa - is investigated. The book deals with the issue of causality in the non-linear domain and applies automatic relevance determination, the evidence framework, the Bayesian approach and Granger causality to understand causality and correlation. Economic Modeling Using Artificial Intelligence Methods makes an important contribution to the area of econometrics, and is a valuable source of reference for graduate students, researchers and financial practitioners.
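The Granger causality test mentioned in this blurb can be sketched in miniature: fit a restricted autoregression of y on its own past, then an unrestricted one that also includes past x, and compare residual sums of squares with an F statistic. All numbers below (lag coefficients, noise scale, sample size) are assumptions chosen for the demonstration, not values from the book:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate x Granger-causing y: past x helps predict current y.
n = 400
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.1)

yt, y1, x1 = y[1:], y[:-1], x[:-1]

def ssr(X, target):
    # Residual sum of squares from an OLS fit.
    beta = np.linalg.lstsq(X, target, rcond=None)[0]
    resid = target - X @ beta
    return resid @ resid

ones = np.ones(n - 1)
ssr_restricted = ssr(np.column_stack([ones, y1]), yt)        # y's own past only
ssr_unrestricted = ssr(np.column_stack([ones, y1, x1]), yt)  # plus past x

# F statistic for adding one lag of x (1 restriction, 3 estimated parameters);
# large values reject the null that x does not Granger-cause y.
F = (ssr_restricted - ssr_unrestricted) / (ssr_unrestricted / (n - 1 - 3))
print(F > 10.0)
```

As the blurb notes, Granger causality is a statement about predictive content, not structural causation, which is why the book pairs it with automatic relevance determination and Bayesian evidence methods.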
The 30th volume of Advances in Econometrics is in honor of the two individuals whose hard work has helped ensure thirty successful years of the series, Thomas Fomby and R. Carter Hill. The volume begins with a history of the Advances series by Asli Ogunc and Randall Campbell, summarizing the prior volumes. Tom Fomby and Carter Hill both provide discussions of the role of Advances over the years. The remaining articles include contributions by a number of authors who have played key roles in the series over the years and in the careers of Fomby and Hill. Overall, this leads to a more diverse mix of papers than a typical volume of Advances in Econometrics.
In the memorable words of Ragnar Frisch, econometrics is 'a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems'. Beginning to take shape in the 1930s and 1940s, econometrics is now recognized as a vital subdiscipline supported by a vast-and still rapidly growing-body of literature. Following the positive reception of The Rise of Econometrics (2013) (978-0-415-61678-2), Routledge now announces a new collection bringing together the best that has been published on the practical application and functional use of economic metrics and measurements. With a comprehensive introduction, newly written by the editor, which places the assembled materials in their historical and intellectual context, Applied Econometrics is an essential work of reference. This fully indexed collection will be particularly useful as an indispensable database allowing scattered and often fugitive material to be easily located. It will also be welcomed as a crucial tool permitting rapid access to less familiar-and sometimes overlooked-texts. For researchers and students, as well as economic policy-makers, it is a vital one-stop research and pedagogic resource.
This monograph addresses the methodological and empirical issues relevant for the development of sustainable agriculture, with a particular focus on Eastern Europe. It relates economic growth to the other dimensions of sustainability by applying integrated methods. The book comprises five chapters dedicated to the theoretical approaches towards sustainable rural development, productivity analysis, structural change analysis and environmental footprint. The book focuses on the transformations of the agricultural sector while taking into account economic, environmental, and social dynamics. The importance of agricultural transformations to the livelihood of the rural population and food security are highlighted. Further, advanced methodologies and frameworks are presented to fathom the underlying trends in different facets of agricultural production. The authors present statistical methods used for the analysis of agricultural sustainability along with applications for agriculture in the European Union. Additionally, they discuss the measures of efficiency, methodological approaches and empirical models. Finally, the book applies econometric and optimization techniques, which are useful for the estimation of the production functions and other representations of technology in the case of the European Union member states. Therefore, the book is a must-read for researchers and students of agricultural and production economics, as well as policy-makers and academia in general.
In this landmark collection, the editor has selected the most influential papers on the econometrics of panel data published in the period from 1992-2001, thus providing an update on developments in the field since the two volumes edited by G.S. Maddala in 1993, which covered the period from 1966-1992. Topics covered in these latest volumes include core articles on dynamic panels and the generalized method of moments, heterogeneous panels, non-stationary panels including spurious regression, unit roots and tests for cointegration in panels, limited dependent variable models using panel data including models with censored endogenous variables and sample selection, non-linear panel data models, unbalanced panels, pseudo-panels and specification tests in panels.
Bringing together the proceedings of the 1979 and 1980 annual conferences of the Association of University Teachers of Economics, the papers in this volume discuss the effect of social security on private saving; an analysis of aggregate consumer behaviour; the philosophy and objectives of econometrics; and other topics in macroeconomic and econometric analysis.
This volume deals with a range of contemporary issues in Indian and other world economies, with a focus on economic theory and policy and their longstanding implications. It analyses and predicts the mechanisms that can come into play to determine the function of institutions and the impact of public policy.
This book is a collection of selected papers presented at the Annual Meeting of the European Academy of Management and Business Economics (AEDEM), held at the Faculty of Economics and Business of the University of Barcelona, 5-7 June 2012. This edition of the conference was presented under the slogan 'Creating new opportunities in an uncertain environment'. There are different ways of assessing uncertainty in management, but this book focuses mainly on soft computing theories and their role in assessing uncertainty in a complex world. The present book gives a comprehensive overview of general management topics and discusses some of the most recent developments in all areas of business and management, including management, marketing, business statistics, innovation and technology, finance, sports and tourism. This book will be of great interest to anyone working in the area of management and business economics, and should be especially useful for scientists and graduate students doing research in these fields.
In the memorable words of Ragnar Frisch, econometrics is 'a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems'. Beginning to take shape in the 1930s and 1940s, econometrics is now recognized as a vital subdiscipline supported by a vast-and still rapidly growing-body of literature. Following the positive reception of The Rise of Econometrics (2013) (978-0-415-61678-2), Routledge now announces a new collection from its Critical Concepts in Economics series. With a comprehensive introduction, newly written by the editor, which places the assembled materials in their historical and intellectual context, Time Series Econometrics is an essential work of reference. This fully indexed collection will be particularly useful as an essential database allowing scattered and often fugitive material to be easily located. It will also be welcomed as a crucial tool permitting rapid access to less familiar-and sometimes overlooked-texts. For researchers and students, as well as economic policy-makers, it is a vital one-stop research and pedagogic resource.
A collection of proofs of fundamental theorems, this volume utilizes a format that is exhaustive and consistent. Every result covered in Econometrics is proved as well as stated. One notation system is used throughout the volume. The topics included in the book cover such areas as estimation and testing in linear regression models under various sets of assumptions, and estimation and testing in simultaneous equations models. The latter subject is treated more extensively than in most econometrics books, and the entire volume is characterized by its rigorous level of examination.
This book brings together the latest research in the areas of market microstructure and high-frequency finance along with new econometric methods to address critical practical issues in these areas of research. Thirteen chapters, each of which makes a valuable and significant contribution to the existing literature, have been brought together, spanning a wide range of topics including information asymmetry and the information content in limit order books, high-frequency return distribution models, multivariate volatility forecasting, analysis of individual trading behaviour, the analysis of liquidity, price discovery across markets, market microstructure models and the information content of order flow. These issues are central both to the rapidly expanding practice of high-frequency trading in financial markets and to the further development of the academic literature in this area. The volume will therefore be of immediate interest to practitioners and academics. This book was originally published as a special issue of the European Journal of Finance.
This is the seventh book in a series of discussions about the great minds in the history and theory of finance. While the series addresses the contributions of scholars in our understanding of financial decisions and markets, this seventh book describes how econometrics developed and how its underlying assumptions created the underpinning of much of modern financial theory. The author shows that the theorists of econometrics were a mix of mathematicians and cosmologists, entrepreneurs, economists and financial scholars. The author demonstrates that by laying down the foundation of empirical analysis, they also forever determined the way in which we think about financial returns and the vocabulary we employ to describe them. Through this volume, the reader can discover the life stories, inspirations, and theories of Carl Friedrich Gauss, Francis Galton, Karl Pearson, Ronald Aylmer Fisher, Harold Hotelling, Alfred Cowles III, Ragnar Frisch, and Trygve Haavelmo, specifically. We learn how each theorist made an intellectual leap simply by thinking about a conventional problem in an unconventional way.
This book studies the information spillover among financial markets and explores the intraday effect and ACD models with high frequency data. This book also contributes theoretically by providing a new statistical methodology with comparative advantages for analyzing co-movements between two time series. It explores this new method by testing the information spillover between the Chinese stock market and the international market, futures market and spot market. Using the high frequency data, this book investigates the intraday effect and examines which type of ACD model is particularly suited in capturing financial duration dynamics. The book will be of invaluable use to scholars and graduate students interested in co-movements among different financial markets and financial market microstructure and to investors and regulation departments looking to improve their risk management.