This book is about the concept of "quality of life": what is necessary for quality of life, and how can it be measured? The approach taken is a multicriteria scheme reduction that loses as little information as possible when shifting from the set of partial criteria to their convolution. The book is written for researchers, analysts, and graduate and postgraduate students of mathematics and economics.
This book provides a comprehensive and concrete illustration of time series analysis focusing on the state-space model, which has recently attracted increasing attention in a broad range of fields. The major feature of the book lies in its consistent Bayesian treatment of the whole combination of batch and sequential solutions for linear Gaussian and general state-space models: MCMC and the Kalman/particle filter. The reader is given insight into flexible modeling in modern time series analysis. The main topics center on the state-space model, covered extensively from introductory and exploratory methods to the latest advanced topics, such as real-time structural change detection. Additionally, a practical exercise using R/Stan based on real data promotes understanding and enhances the reader's analytical capability.
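To make the linear Gaussian state-space model mentioned above concrete, here is a minimal sketch (in Python rather than the book's R/Stan) of a Kalman filter for the simplest such model, the local level model. The noise variances, seed, and simulated data are illustrative assumptions, not the book's own example.

```python
import numpy as np

# Local-level (random-walk) state-space model:
#   x_t = x_{t-1} + w_t,  w_t ~ N(0, q)   (state equation)
#   y_t = x_t + v_t,      v_t ~ N(0, r)   (observation equation)
def kalman_filter(y, q=0.1, r=1.0, x0=0.0, p0=1.0):
    """Return the filtered state means for observations y."""
    x, p = x0, p0
    means = []
    for obs in y:
        p = p + q                # predict: state variance grows by q
        k = p / (p + r)          # Kalman gain
        x = x + k * (obs - x)    # update mean toward the observation
        p = (1 - k) * p          # update variance
        means.append(x)
    return means

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0, 0.3, 100))   # simulated random-walk state
y = truth + rng.normal(0, 1.0, 100)          # noisy observations
est = kalman_filter(y, q=0.09, r=1.0)
```

The filtered means track the hidden state with far smaller error than the raw observations, which is the basic payoff of the state-space formulation.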
Understanding why so many people across the world are so poor is one of the central intellectual challenges of our time. This book provides the tools and data that will enable students, researchers and professionals to address that issue. Empirical Development Economics has been designed as a hands-on teaching tool to investigate the causes of poverty. The book begins by introducing the quantitative approach to development economics. Each section uses data to illustrate key policy issues. Part One focuses on the basics of understanding the role of education, technology and institutions in determining why incomes differ so much across individuals and countries. In Part Two, the focus is on techniques to address a number of topics in development, including how firms invest, how households decide how much to spend on their children's education, whether microcredit helps the poor, whether food aid works, who gets private schooling and whether property rights enhance investment. A distinctive feature of the book is its presentation of a range of approaches to studying development questions. Development economics has undergone a major change in focus over the last decade with the rise of experimental methods to address development issues; this book shows how these methods relate to more traditional ones. Please visit the book's website at www.empiricalde.com for online supplements including Stata files and solutions to the exercises.
Highlights include:
- A thorough presentation of the problem of portfolio optimization, leading in a natural way to the Capital Market Theory
- Dynamic programming and the optimal portfolio selection-consumption problem through time
- An intuitive approach to Brownian motion and stochastic integral models for continuous-time problems
- The Black-Scholes equation for simple European option values, derived in several different ways
- A chapter on several types of exotic options, and one on the management of risk in several contexts
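Since the Black-Scholes value for a simple European option is a centerpiece of that text, a short sketch of the standard closed-form call price may help; the sample parameters (spot 100, strike 100, one year to expiry, 5% rate, 20% volatility) are illustrative assumptions.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

price = bs_call(s=100.0, k=100.0, t=1.0, r=0.05, sigma=0.2)  # about 10.45
```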
A Guide to Modern Econometrics, Fifth Edition, has become established as a highly successful textbook. It serves as a guide to alternative techniques in econometrics, with an emphasis on intuition and the practical implementation of these approaches. This fifth edition builds upon the success of its predecessors. The text has been carefully checked and updated, taking into account recent developments and insights. It includes new material on causal inference, the use and limitations of p-values, instrumental variables estimation and its implementation, regression discontinuity design, standardized coefficients, and the presentation of estimation results.
The volume aims at providing an outlet for some of the best papers presented at the 15th Annual Conference of the African Econometric Society, which is one of the "chapters" of the International Econometric Society. Many of these papers represent the state of the art in financial econometrics and applied econometric modeling, and some also provide useful simulations that shed light on the models' ability to generate meaningful scenarios for forecasting and policy analysis.
The Handbook of Mathematical Economics aims to provide a definitive source, reference, and teaching supplement for the field of mathematical economics. It surveys, as of the late 1970s, the state of the art of mathematical economics. This is a constantly developing field, and all authors were invited to review and to appraise the current status and recent developments in their presentations. In addition to its use as a reference, it is intended that this Handbook will assist researchers and students working in one branch of mathematical economics to become acquainted with other branches of the field. Volume 2 elaborates on "Mathematical Approaches to Microeconomic Theory," including consumer, producer, oligopoly, and duality theory, as well as "Mathematical Approaches to Competitive Equilibrium," including such aspects of competitive equilibrium as existence, stability, uncertainty, the computation of equilibrium prices, and the core of an economy. For more information on the Handbooks in Economics series, please see our home page at http://www.elsevier.nl/locate/hes
This book presents an exciting new set of econometric methods. They have been developed as a result of the increase in power and affordability of computers which allow simulations to be run. The authors have played a large role in developing the techniques.
Maurice Potron (1872-1942), a French Jesuit mathematician, constructed and analyzed a highly original, but virtually unknown economic model. This book presents translated versions of all his economic writings, preceded by a long introduction which sketches his life and environment based on extensive archival research and family documents. Potron had no education in economics and almost no contact with the economists of his time. His primary source of inspiration was the social doctrine of the Church, which had been updated at the end of the nineteenth century. Faced with the 'economic evils' of his time, he reacted by utilizing his talents as a mathematician and an engineer to invent and formalize a general disaggregated model in which production, employment, prices and wages are the main unknowns. He introduced four basic principles or normative conditions ('sufficient production', the 'right to rest', 'justice in exchange', and the 'right to live') to define satisfactory regimes of production and labour on the one hand, and of prices and wages on the other. He studied the conditions for the existence of these regimes, both on the quantity side and the value side, and he explored the way to implement them. This book makes it clear that Potron was the first author to develop a full input-output model, to use the Perron-Frobenius theorem in economics, to state a duality result, and to formulate the Hawkins-Simon condition. These are all techniques which now belong to the standard toolkit of economists. This book will be of interest to Economics postgraduate students and researchers, and will be essential reading for courses dealing with the history of mathematical economics in general, and linear production theory in particular.
The major methodological task for modern economists has been to establish the testability of models. Too often, however, methodological assumptions can make a model virtually impossible to test even under ideal conditions, yet few theorists have examined the requirements and problems of assuring testability in economics. In The Methodology of Economic Model Building, first published in 1989, Lawrence Boland presents the results of a research project that spanned more than twenty years. He examines how economists have applied the philosophy of Karl Popper, relating methodological debates about falsifiability to wider discussions about the truth status of models in natural and social sciences. He concludes that model building in economics reflects more the methodological prescriptions of the economist Paul Samuelson than Popper's 'falsificationism'. This title will prove invaluable to both students and researchers, and represents a substantial contribution to debates about the scientific status of economics.
Most economists assume that the mathematical and quantitative sides of their science are relatively recent developments. Measurement, Quantification and Economic Analysis shows that this is a misconception. Its authors argue that economists have long relied on measurement and quantification as essential tools.
This title, first published in 1979, presents the Ph.D. thesis of the world-renowned economist and financial expert, Willem Buiter. In Part I, three alternative specifications of temporary equilibria in asset markets, including their implications for macroeconomic models, are discussed; Part II examines the long-term implications of some short-term macroeconomic models. The analysis of the theoretical foundations of 'direct crowding out' and 'indirect crowding out' is particularly prominent, with the result that a synthesis of short-term macroeconomic analysis and long-term growth theory is formulated. The traditional tools of comparative dynamics and stability analysis are employed frequently. However, it is also argued that the true scope of government policy can only be adequately evaluated with the aid of concepts such as dynamic and static controllability. Temporary Equilibrium and Long-Run Equilibrium is a valuable study, and relevant for all serious students of modern economic theory.
This book contains a set of notes prepared by Ragnar Frisch for a lecture series that he delivered at Yale University in 1930. The lecture notes provide not only a valuable source document for the history of econometrics, but also a more systematic introduction to some of Frisch's key methodological ideas than his other works so far published in various media for the econometrics community. In particular, these notes contain a number of prescient ideas precursory to some of the most important notions developed in econometrics during the 1970s and 1980s. More remarkably, Frisch demonstrated a deep understanding of what econometric or statistical analysis could achieve in situations where no correct theoretical model was known. This volume has been rigorously edited and comes with an introductory essay from Olav Bjerkholt and Duo Qin placing the notes in their historical context.
The development of economics changed dramatically during the twentieth century with the emergence of econometrics, macroeconomics and a more scientific approach in general. One of the key individuals in the transformation of economics was Ragnar Frisch, professor at the University of Oslo and the first Nobel Laureate in economics in 1969. He was a co-founder of the Econometric Society in 1930 (after having coined the word econometrics in 1926) and edited the journal Econometrica for twenty-two years. The discovery of the manuscripts of a series of eight lectures given by Frisch at the Henri Poincare Institute in March-April 1933 on The Problems and Methods of Econometrics will enable economists to more fully understand his overall vision of econometrics. This book is a rare exhibition of Frisch's overview of econometrics and is published here in English for the first time. Edited and with an introduction by Olav Bjerkholt and Ariane Dupont-Kieffer, Frisch's eight lectures provide an accessible and astute discussion of econometric issues, from philosophical foundations to practical procedures. Covering the development of economics in the twentieth century and Frisch's broader vision of economic science in general and econometrics in particular, this book will appeal to anyone with an interest in the history of economics and econometrics.
This book explores widely used seasonal adjustment methods and recent developments in real time trend-cycle estimation. It discusses in detail the properties and limitations of X12ARIMA, TRAMO-SEATS and STAMP - the main seasonal adjustment methods used by statistical agencies. Several real-world cases illustrate each method and real data examples can be followed throughout the text. The trend-cycle estimation is presented using nonparametric techniques based on moving averages, linear filters and reproducing kernel Hilbert spaces, taking recent advances into account. The book provides a systematic treatment of results that to date have been scattered throughout the literature. Seasonal adjustment and real time trend-cycle prediction play an essential part at all levels of activity in modern economies. They are used by governments to counteract cyclical recessions, by central banks to control inflation, by decision makers for better modeling and planning and by hospitals, manufacturers, builders, transportation, and consumers in general to decide on appropriate action. This book appeals to practitioners in government institutions, finance and business, macroeconomists, and other professionals who use economic data as well as academic researchers in time series analysis, seasonal adjustment methods, filtering and signal extraction. It is also useful for graduate and final-year undergraduate courses in econometrics and time series with a good understanding of linear regression and matrix algebra, as well as ARIMA modelling.
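As a small illustration of the moving-average trend filters discussed here (my own sketch, not the book's code): the classic centered 12-term moving average, with half weights at the two ends, is the standard linear filter for extracting trend-cycle from monthly data, and it reproduces a purely linear trend exactly.

```python
import numpy as np

def centered_ma_12(y):
    """Centered 12-term moving average (2x12 MA) for monthly data.

    The 13-tap kernel puts half weight on the two end points so the
    filter is symmetric and centered on an observation.
    """
    w = np.array([0.5] + [1.0] * 11 + [0.5]) / 12.0
    return np.convolve(y, w, mode="valid")

# Applied to a pure linear series, the filter returns the series itself
# (shifted to the filter's center), confirming it passes linear trends.
trend = centered_ma_12(np.arange(24.0))
```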
This book focuses on the interaction between equilibrium real exchange rates, optimal external debt, endogenous optimal growth and current account balances, in a world of uncertainty. The theoretical parts result from interdisciplinary research between economics and applied mathematics. From economic theory and the mathematics of stochastic optimal control, the author derives benchmarks for the optimal debt and the equilibrium real exchange rate in an environment where both the return on capital and the real rate of interest are stochastic variables. The theoretically derived equilibrium real exchange rate - the "natural real exchange rate" (NATREX) - is where the real exchange rate is heading. These benchmarks are applied to answer the following questions:
* What is a theoretically based empirical measure of a "misaligned" exchange rate that increases the probability of a significant depreciation or a currency crisis?
* What is a theoretically based empirical measure of an "excess" debt that increases the probability of a debt crisis?
* What is the interaction between an excess debt and a misaligned exchange rate?
The theory is applied to evaluate the Euro exchange rate and the exchange rates of the transition economies, to assess the sustainability of U.S. current account deficits, and to derive warning signals of the Asian crises and debt crises in emerging markets.
This title provides a comprehensive, critical coverage of the progress and development of mathematical modelling within urban and regional economics over four decades.
Leverage the full power of Bayesian analysis for competitive advantage. Bayesian methods can solve problems you can't reliably handle any other way. Building on your existing Excel analytics skills and experience, Microsoft Excel MVP Conrad Carlberg helps you make the most of Excel's Bayesian capabilities and move toward R to do even more. Step by step, with real-world examples, Carlberg shows you how to use Bayesian analytics to solve a wide array of real problems. Carlberg clarifies terminology that often bewilders analysts, and offers sample R code to take advantage of the rethinking package in R and its gateway to Stan. As you incorporate these Bayesian approaches into your analytical toolbox, you'll build a powerful competitive advantage for your organization and for yourself. Coverage includes:
- Explore key ideas and strategies that underlie Bayesian analysis
- Distinguish prior, likelihood, and posterior distributions, and compare algorithms for driving sampling inputs
- Use grid approximation to solve simple univariate problems, and understand its limits as parameters increase
- Perform complex simulations and regressions with quadratic approximation and Richard McElreath's quap function
- Manage text values as if they were numeric
- Learn today's gold-standard Bayesian sampling technique: Markov Chain Monte Carlo (MCMC)
- Use MCMC to optimize execution speed in high-complexity problems
- Discover when frequentist methods fail and Bayesian methods are essential, and when to use both in tandem
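The grid approximation mentioned above is easy to show in a few lines. The sketch below (in Python rather than the book's Excel/R; the coin-toss data and grid size are my assumptions) computes the posterior for a coin's heads probability after 6 heads in 9 tosses under a flat prior, where the exact answer is a Beta(7, 4) distribution with mean 7/11.

```python
import numpy as np

# Grid approximation of a univariate posterior: discretize the parameter,
# evaluate prior x likelihood at each grid point, then normalize.
grid = np.linspace(0.0, 1.0, 1001)            # candidate values of p
prior = np.ones_like(grid)                    # flat (uniform) prior
likelihood = grid ** 6 * (1 - grid) ** 3      # binomial kernel: 6 heads, 3 tails
posterior = prior * likelihood
posterior /= posterior.sum()                  # normalize over the grid

posterior_mean = float((grid * posterior).sum())   # close to 7/11
```

The same recipe works for any one-parameter model, but as the blurb notes, the grid's cost grows exponentially with the number of parameters, which is what motivates quadratic approximation and MCMC.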
The revised edition of this book captures new developments in economics and finance. Turning its focus towards the application of Engle's (1982) autoregressive conditional heteroscedasticity (ARCH) in cutting-edge research and a discussion of whether energy prices reflect long memory, this book will keep readers up-to-date with current developments in the literature. It presents twenty-one empirical studies of econometric time series analysis of crude oil, natural gas and electricity markets in the face of the rapidly changing dynamics of the energy markets. Amongst them, several studies employ nonlinear time series methods, unlike the standard linear approach commonly used, to reflect the nonlinear nature of the economic system. Two new chapters are included, extending beyond the leading-edge research and innovative energy markets econometrics detailed in the first edition: Chapter 17 examines the effects of oil price changes and speculations on economic activity and Chapter 20 re-evaluates empirical evidence for random walk type behavior in energy futures prices using a statistical physics approach.
First published in 1987, this is an analysis of the contemporary breakdown of political and economic systems within the Eastern European communist countries. Rather than passively following the developments of this crisis, the author seeks instead to identify the reasons for failure and to examine alternative policies that offer solutions to these problems. Jan Winiecki's work offers a comparative study of the Soviet-type economies of the East with the market economies of the West; providing a cause and effect analysis of each model, with possible scenarios for their future prospects.
High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. It is the first to integrate theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and it covers both classical results such as Hoeffding's and Chernoff's inequalities and modern developments such as the matrix Bernstein's inequality. It then introduces the powerful methods based on stochastic processes, including such tools as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
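The concentration inequalities at the core of that book can be checked numerically. As a hedged sketch (the sample size, threshold, and simulation setup are illustrative assumptions), Hoeffding's inequality bounds the tail of the sample mean of i.i.d. variables taking values in [0, 1] by 2*exp(-2*n*t^2), and a simulation shows the empirical tail sits well inside the bound.

```python
import numpy as np

# Hoeffding's inequality for i.i.d. X_i in [0, 1] with mean p:
#   P(|mean_n - p| >= t) <= 2 * exp(-2 * n * t**2)
n, t, p = 200, 0.1, 0.5
bound = 2.0 * np.exp(-2.0 * n * t ** 2)       # about 0.037

rng = np.random.default_rng(1)
# 20,000 independent sample means of n fair coin flips each
means = rng.binomial(1, p, size=(20000, n)).mean(axis=1)
empirical = float(np.mean(np.abs(means - p) >= t))
```

The empirical tail probability (well under 1%) respects the Hoeffding bound, as the inequality guarantees for any bounded distribution.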
This book examines the measurement and econometric effects of ethnic diversity. This issue is of great relevance to research and policy and is currently being discussed a great deal in the literature. In particular, a sizable literature has suggested that ethnic diversity constitutes a significant barrier to economic development. The precise measurement and interpretation of these results are a matter of substantial controversy. In this book, the dynamics of ethnic diversity are being empirically analyzed for the first time. Furthermore, it develops and applies a new measure of ethnic diversity which takes the distance between groups into account, thus focusing on diversity rather than mere fragmentation. This book convincingly confronts theoretical considerations with (new) data and thereby provides a good mix of theory and empirics, making significant contributions to the current debates.
The design of trading algorithms requires sophisticated mathematical models backed up by reliable data. In this textbook, the authors develop models for algorithmic trading in contexts such as executing large orders, market making, targeting VWAP and other schedules, trading pairs or collections of assets, and executing in dark pools. These models are grounded in how the exchanges work, whether the algorithm is trading with better informed traders (adverse selection), and the type of information available to market participants at both ultra-high and low frequency. Algorithmic and High-Frequency Trading is the first book that combines sophisticated mathematical modelling, empirical facts and financial economics, taking the reader from basic ideas to cutting-edge research and practice. If you need to understand how modern electronic markets operate, what information provides a trading edge, and how other market participants may affect the profitability of the algorithms, then this is the book for you.
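Of the execution targets mentioned, VWAP is the simplest to state: the volume-weighted average price of the trades over an interval, which a VWAP-targeting algorithm tries to match with its own fills. A minimal sketch with assumed toy trade data:

```python
import numpy as np

# VWAP: sum(price * volume) / sum(volume) over an interval's trades.
prices = np.array([100.0, 100.5, 99.8, 100.2])   # illustrative trade prices
volumes = np.array([200, 150, 300, 100])         # matching trade sizes

vwap = float((prices * volumes).sum() / volumes.sum())
```

Because the weights are volumes, large trades pull the benchmark more than small ones, which is why execution algorithms slice orders to track the market's volume profile.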
Analytics is one of a number of terms used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, identify new opportunities, and often enables many millions of dollars to be added to the bottom line for the organization. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic review of the field, this text emphasizes the strategic role that analytics is playing in globally competitive corporations today. The chapters of this book are organized in two main parts. The first part introduces a problem area and presents some basic analytical concepts that have been successfully used to address the problem area. The objective of this material is to provide the student, the manager of the future, with a general understanding of the tools and techniques used by the analyst.
Explains modern statistical disclosure control (SDC) techniques for data stewards and develops tools to implement them. Explains the logic behind modern privacy protections for researchers and how they may use publicly released data to generate valid statistical inferences, as well as the limitations imposed by SDC techniques.