Stochastic differential equations are differential equations whose solutions are stochastic processes. They exhibit appealing mathematical properties that are useful in modeling uncertainties and noisy phenomena in many disciplines. This book is motivated by applications of stochastic differential equations in target tracking and medical technology and, in particular, their use in methodologies such as filtering, smoothing, parameter estimation, and machine learning. It builds an intuitive hands-on understanding of what stochastic differential equations are all about, but also covers the essentials of Ito calculus, the central theorems in the field, and such approximation schemes as stochastic Runge-Kutta. Greater emphasis is given to solution methods than to analysis of theoretical properties of the equations. The book's practical approach assumes only prior understanding of ordinary differential equations. The numerous worked examples and end-of-chapter exercises include application-driven derivations and computational assignments. MATLAB/Octave source code is available for download, promoting hands-on work with the methods.
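The blurb above emphasizes hands-on work with solution schemes; the book's own downloadable code is MATLAB/Octave, but the flavour of such a scheme can be sketched in a few lines of Python. The following is a minimal, illustrative Euler-Maruyama simulation (a simpler relative of the stochastic Runge-Kutta schemes mentioned), not taken from the book; the drift and diffusion functions and all parameter values are arbitrary choices for the example.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_end, n_steps, rng):
    """Simulate one path of dX = drift(X, t) dt + diffusion(X, t) dW
    using the Euler-Maruyama scheme on a uniform time grid."""
    dt = t_end / n_steps
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    t = 0.0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over dt
        xs[i + 1] = xs[i] + drift(xs[i], t) * dt + diffusion(xs[i], t) * dw
        t += dt
    return xs

# Example: geometric Brownian motion, dX = mu*X dt + sigma*X dW
rng = np.random.default_rng(0)
path = euler_maruyama(lambda x, t: 0.05 * x,   # drift, mu = 0.05
                      lambda x, t: 0.2 * x,    # diffusion, sigma = 0.2
                      x0=1.0, t_end=1.0, n_steps=1000, rng=rng)
```

The scheme simply discretizes time and replaces the Brownian motion with independent Gaussian increments of variance dt; higher-order schemes such as stochastic Runge-Kutta refine exactly this idea.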
In the last 20 years, econometric theory on panel data has developed rapidly, particularly for analyzing common behaviors among individuals over time. Meanwhile, the statistical methods employed by applied researchers have not kept pace. This book attempts to fill this gap by teaching researchers how to use the latest panel estimation methods correctly. Almost all applied economics articles use panel data or panel regressions, yet many typical panel data analyses are not correctly executed. This book aims to help applied researchers run panel regressions correctly and avoid common mistakes. It explains how to model cross-sectional dependence, how to estimate a few key common variables, and how to identify them. It also provides guidance on how to separate out the long-run relationship and the common and idiosyncratic dynamic relationships from a set of panel data. Aimed at applied researchers who want to learn panel data econometrics by running statistical software, this book provides clear guidance and is supported by a full range of online teaching and learning materials. It includes practice sections on MATLAB, STATA, and GAUSS throughout, along with short and simple econometric theory on basic panel regressions for those who are unfamiliar with the theory of traditional panel regressions.
Applied Econometrics: A Practical Guide is an extremely user-friendly and application-focused book on econometrics. Unlike many econometrics textbooks that are heavily theoretical and abstract, this book is perfect for beginners, bringing simplicity and practicality to the understanding of econometric models. Written in an easy-to-read manner, the book begins with hypothesis testing and moves on to simple and multiple regression models. It also covers advanced topics: endogeneity and two-stage least squares; simultaneous equations models; panel data models; qualitative and limited dependent variable models; vector autoregressive (VAR) models; autocorrelation and ARCH/GARCH models; and unit roots and cointegration. The book also illustrates the use of computer software (EViews, SAS, and R) for economic estimation and modeling. Its practical applications make the book an instrumental, go-to guide for a solid foundation in the fundamentals of econometrics. In addition, this book includes excerpts from relevant articles published in top-tier academic journals. This integration of published articles helps readers to understand how econometric models are applied to real-world use cases.
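As a taste of the simple regression models such a book begins with, here is a minimal ordinary least squares fit in Python. The simulated data and coefficient values are invented for illustration and are not drawn from the book, which works in EViews, SAS, and R.

```python
import numpy as np

# Simulated data from a known linear model: y = 2 + 3*x + noise
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=200)
y = 2.0 + 3.0 * x + rng.normal(0, 1, size=200)

# Design matrix with an intercept column of ones
X = np.column_stack([np.ones_like(x), x])

# OLS estimate: beta_hat minimizes ||y - X beta||^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta_hat
```

With 200 observations and modest noise, the estimates land close to the true values of 2 and 3, which is the basic consistency property a first regression course builds on.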
"Econometric Theory" presents a modern approach to the theory of econometric estimation and inference, with particular applications to time series. An ideal reference for practitioners and researchers, the book is also suited for advanced two-semester econometrics courses and one-semester regression courses. Based on lectures originally given to graduates at the London School of Economics, the book applies recent developments in asymptotic theory to derive the properties of estimators when the model is only partially specified. Topics covered in depth include the linear regression model, dynamic modeling, simultaneous equations, optimization estimators, hypothesis testing, and the theory of nonstationary time series and cointegration.
Stata is one of the most popular statistical software packages in the world, suited for all kinds of users, from absolute beginners to experienced veterans. This book offers a clear and concise introduction to the usage and workflow of Stata. Topics include importing and managing datasets, cleaning and preparing data, creating and manipulating variables, and producing descriptive statistics and meaningful graphs, as well as central quantitative methods such as linear (OLS) and binary logistic regression and matching. Additional information about diagnostic tests ensures that these methods yield valid and correct results that live up to academic standards. Furthermore, users are instructed how to export results that can be used directly in popular software like Microsoft Word for seminar papers and publications. Lastly, the book offers a short yet focused introduction to scientific writing, which should guide readers through the process of writing a first quantitative seminar paper or research report. The book underlines correct usage of the software and a productive workflow, and also introduces aspects like replicability and general standards for academic writing. While absolute beginners will enjoy the easy-to-follow point-and-click interface, more experienced users will benefit from the information about do-files and the syntax that makes Stata so popular. Finally, a wide range of user-contributed software ("Ados") is introduced, which further improves the general workflow and guarantees the availability of state-of-the-art statistical methods.
Leonid Hurwicz (1917-2008) was a major figure in modern theoretical economics whose contributions over sixty-five years spanned at least five areas: econometrics, nonlinear programming, decision theory, microeconomic theory, and mechanism design. In 2007, at age ninety, he received the Nobel Memorial Prize in Economics (shared with Eric Maskin and Roger Myerson) for pioneering the field of mechanism design and incentive compatibility. Hurwicz made seminal contributions in the other areas as well. In nonlinear programming, he contributed to the understanding of Lagrange-Kuhn-Tucker problems (along with co-authors Kenneth Arrow and Hirofumi Uzawa). In econometrics, the Hurwicz bias in the least-squares analysis of time series is a fundamental and commonly cited benchmark. In decision theory, the Hurwicz criterion for decision-making under ambiguity is routinely invoked, sometimes without a citation since his original paper was never published. In microeconomic theory, Hurwicz (along with Arrow and H.D. Block) initiated the study of the stability of the market mechanism, and (with Uzawa) solved the classic integrability-of-demand problem, a core result in neoclassical consumer theory. While some of Hurwicz's works were published in journals, many remain scattered as chapters in books that are difficult to access; yet others were never published at all. The Collected Papers of Leonid Hurwicz is the first volume in a series of four that will bring his oeuvre together in one place, to bring to light the totality of his intellectual output and to document his contribution to economics and the extent of his legacy, with the express purpose of making it easily available for future generations of researchers to build upon.
This book reports the results of five empirical studies undertaken in the early seventies by a team headed by Professor Morishima. It deals with applications of the general equilibrium models whose theoretical aspects have been one of Professor Morishima's main interests. Four main econometric models are constructed for the USA, the UK, and Japan. These are used as a basis for the discussion of various topics in economic theory, such as: the existence and stability or instability of the neoclassical path of full-employment growth equilibrium and a von Neumann-type path of balanced growth at constant prices; the antinomy between price stability and full employment; the Samuelson-Le Chatelier principle; the theory of the balanced-budget multiplier; the three Hicksian laws of the gross substitutes system; the Brown-Jones super-multipliers of international trade; and so on. In addition, this 1972 work makes a quantitative evaluation for the US economy of monetary and fiscal policies as short-run measures for achieving full employment; the effectiveness of the built-in flexibility of taxes in the UK economy is discussed; and estimates are made of the rapid decrease in disguised unemployment in post-war Japan.
This book is the first of its kind to systematically analyze and apply Lim Chong Yah's S-Curve Hypothesis to the various facets of economic growth and economic transition. By augmenting the mathematical and economic sophistication of the hypothesis, this book extends the S-Curve hypothesis to provide further insight into economic growth and transition. It also constructs a stochastic growth model that provides the microeconomic foundation for the S-Curve hypothesis. This model resolves the puzzle of why some developing countries experience economic take-off while others do not. The book analyzes and extends the discussion on the S-Curve, and also applies the S-Curve hypothesis to predict long-term growth in Japan and Singapore. It serves as an excellent resource for people interested in Lim's growth theory.
Ragnar Frisch (1895-1973) received the first Nobel Memorial Prize in Economic Science together with Jan Tinbergen in 1969 for having played an important role in ensuring that mathematical techniques figure prominently in modern economic analysis. Frisch was also a co-founder of the Econometric Society in 1930, the inaugural editor of its journal Econometrica for over 20 years, and a major figure in Norwegian academic life. This collection of essays derived from the centennial symposium which marked Frisch's birth explores his contributions to econometrics and other key fields in the discipline as well as the results of new research. Contributors include eminent scholars from Europe, the United Kingdom and North America who investigate themes in utility measurement, production theory, microeconomic policy, econometric methods, macrodynamics, and macroeconomic planning.
Reflecting the fast pace and ever-evolving nature of the financial industry, the Handbook of High-Frequency Trading and Modeling in Finance details how high-frequency analysis presents new systematic approaches to implementing quantitative activities with high-frequency financial data. Introducing new and established mathematical foundations necessary to analyze realistic market models and scenarios, the handbook begins with a presentation of the dynamics and complexity of futures and derivatives markets as well as a portfolio optimization problem using quantum computers. Subsequently, the handbook addresses estimating complex model parameters using high-frequency data. Finally, the handbook focuses on the links between models used in financial markets and models used in other research areas such as geophysics, fossil records, and earthquake studies. The Handbook of High-Frequency Trading and Modeling in Finance also features:
* Contributions by well-known experts within the academic, industrial, and regulatory fields
* A well-structured outline on the various data analysis methodologies used to identify new trading opportunities
* Newly emerging quantitative tools that address growing concerns relating to high-frequency data, such as stochastic volatility and volatility tracking; stochastic jump processes for limit-order books and broader market indicators; and options markets
* Practical applications using real-world data to help readers better understand the presented material
The Handbook of High-Frequency Trading and Modeling in Finance is an excellent reference for professionals in the fields of business, applied statistics, econometrics, and financial engineering. The handbook is also a good supplement for graduate and MBA-level courses on quantitative finance, volatility, and financial econometrics. Ionut Florescu, PhD, is Research Associate Professor in Financial Engineering and Director of the Hanlon Financial Systems Laboratory at Stevens Institute of Technology.
His research interests include stochastic volatility, stochastic partial differential equations, Monte Carlo methods, and numerical methods for stochastic processes. Dr. Florescu is the author of Probability and Stochastic Processes, the coauthor of Handbook of Probability, and the coeditor of Handbook of Modeling High-Frequency Data in Finance, all published by Wiley. Maria C. Mariani, PhD, is Shigeko K. Chan Distinguished Professor in Mathematical Sciences and Chair of the Department of Mathematical Sciences at The University of Texas at El Paso. Her research interests include mathematical finance, applied mathematics, geophysics, nonlinear and stochastic partial differential equations, and numerical methods. Dr. Mariani is the coeditor of Handbook of Modeling High-Frequency Data in Finance, also published by Wiley. H. Eugene Stanley, PhD, is William Fairfield Warren Distinguished Professor at Boston University. Stanley is one of the key founders of the new interdisciplinary field of econophysics, and has an ISI Hirsch index H=128 based on more than 1200 papers. In 2004 he was elected to the National Academy of Sciences. Frederi G. Viens, PhD, is Professor of Statistics and Mathematics and Director of the Computational Finance Program at Purdue University. He holds more than two dozen local, regional, and national awards and travels extensively worldwide to deliver lectures on his research interests, which range from quantitative finance to climate science and agricultural economics. A Fellow of the Institute of Mathematical Statistics, Dr. Viens is the coeditor of Handbook of Modeling High-Frequency Data in Finance, also published by Wiley.
At the intersection between statistical physics and rigorous econometric analysis, this powerful new framework sheds light on how innovation and competition shape the growth and decline of companies and industries. Analyzing various sources of data, including a unique micro-level database that collects historic data on the sales of more than 3,000 firms and 50,000 products in 20 countries, the authors introduce and test a model of innovation and proportional growth, which relies on minimal assumptions and accounts for the empirically observed regularities. Through a combination of extensive stochastic simulations and statistical tests, the authors investigate to what extent their simple assumptions are falsified by empirically observable facts. Physicists looking for applications of their mathematical and modelling skills to relevant economic problems, as well as economists interested in the explorative analysis of extensive data sets and in a physics-orientated way of thinking, will find this book a key reference.
This book examines whether continuous-time models in frictionless financial economies can be well approximated by discrete-time models. It specifically looks to answer the question: in what sense and to what extent does the famous Black-Scholes-Merton (BSM) continuous-time model of financial markets idealize more realistic discrete-time models of those markets? While it is well known that the BSM model is an idealization of discrete-time economies where the stock price process is driven by a binomial random walk, it is less known that the BSM model idealizes discrete-time economies whose stock price process is driven by more general random walks. Starting with the basic foundations of discrete-time and continuous-time models, David M. Kreps takes the reader through to this important insight with the goal of lowering the entry barrier for many mainstream financial economists, thus bringing less-technical readers to a better understanding of the connections between BSM and nearby discrete-economies.
This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long- or short-range dependence (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit theorems) are described under SRD; mixing and weak dependence are also reviewed. In closing, it describes moment techniques together with their relations to cumulant sums, as well as an application to kernel-type estimation. The appendix reviews basic probability theory facts and discusses useful laws stemming from the Gaussian laws as well as the basic principles of probability, and is completed by R scripts used for the figures. Richly illustrated with examples and simulations, the book is recommended for advanced master courses for mathematicians just entering the field of time series, and for statisticians who want more mathematical insight into the background of non-linear time series.
The book first discusses in depth various aspects of the well-known inconsistency that arises when explanatory variables in a linear regression model are measured with error. Despite this inconsistency, the region where the true regression coefficients lie can sometimes be characterized in a useful way, especially when bounds are known on the measurement error variance, but also when such information is absent. Wage discrimination with imperfect productivity measurement is discussed as an important special case.
"A Companion to Theoretical Econometrics" provides a comprehensive reference to the basics of econometrics. It focuses on the foundations of the field and at the same time integrates popular topics often encountered by practitioners. The chapters are written by international experts and provide up-to-date research in areas not usually covered by standard econometric texts. This book is an exceptional reference for readers who require quick access to the foundational theories in this field. Chapters are organized to provide clear information and to point to further readings on the topics covered.
This book presents the principles and methods for the practical analysis and prediction of economic and financial time series. It covers decomposition methods, autocorrelation methods for univariate time series, volatility and duration modeling for financial time series, and multivariate time series methods, such as cointegration and recursive state space modeling. It also includes numerous practical examples to demonstrate the theory using real-world data, as well as exercises at the end of each chapter to aid understanding. This book serves as a reference text for researchers, students and practitioners interested in time series, and can also be used for university courses on econometrics or computational finance.
This book makes indicators more accessible, in terms of what they are, who created them and how they are used. It examines the subjectivity and human frailty behind these quintessentially 'hard' and technical measures of the world. To achieve this goal, The Rise and Rise of Indicators presents the world in terms of a selected set of indicators. The emphasis is upon the origins of the indicators and the motivation behind their creation and evolution. The ideas and assumptions behind the indicators are made transparent to demonstrate how changes to them can dramatically alter the ranking of countries that emerge. They are, after all, human constructs and thus embody human biases. The book concludes by examining the future of indicators and the author sets out some possible trajectories, including the growing emphasis on indicators as important tools in the Sustainable Development Goals that have been set for the world up until 2030. This is a valuable resource for undergraduate and postgraduate students in the areas of economics, sociology, geography, environmental studies, development studies, area studies, business studies, politics and international relations.
VENKATARAMA KRISHNAN, PhD, is Professor Emeritus in the Department of Electrical Engineering at the University of Massachusetts Lowell. Previously, he has taught at the Indian Institute of Science, Polytechnic University, the University of Pennsylvania, Princeton University, Villanova University, and Smith College. He also worked for two years (1974-1976) as a senior systems analyst for Dynamics Research Corporation on estimation problems associated with navigation and guidance and continued as their consultant for more than a decade. Professor Krishnan's research interests include estimation of steady-state queue distributions, tomographic imaging, biosystems, and digital, aerospace, control, communications, and stochastic systems. As a senior member of IEEE, Dr. Krishnan has authored three other books in addition to technical publications.
Including contributions spanning a variety of theoretical and applied topics in econometrics, this volume of Advances in Econometrics is published in honour of Cheng Hsiao. In the first few chapters of this book, new theoretical panel and time series results are presented, exploring JIVE estimators, HAC, HAR and various sandwich estimators, as well as asymptotic distributions for using information criteria to distinguish between the unit root model and explosive models. Other chapters address topics such as structural breaks or growth empirics; auction models; and semiparametric methods testing for common vs. individual trends. Three chapters provide novel empirical approaches to applied problems, such as estimating the impact of survey mode on responses, or investigating how cross-sectional and spatial dependence of mortgages varies by default rates and geography. In the final chapters, Cheng Hsiao offers a forward-focused discussion of the role of big data in economics. For any researcher of econometrics, this is an unmissable volume of the most current and engaging research in the field.
It is impossible to understand modern economics without knowledge of the basic tools of game theory and mechanism design. This book provides a graduate-level introduction to the economic modeling of strategic behavior. The goal is to teach economics doctoral students the tools of game theory and mechanism design that all economists should know.
Discover the secrets to applying simple econometric techniques to improve forecasting. Equipping analysts, practitioners, and graduate students with a statistical framework to make effective decisions based on the application of simple economic and statistical methods, Economic and Business Forecasting offers a comprehensive and practical approach to quantifying and accurately forecasting key variables. Using simple econometric techniques, author John E. Silvia focuses on a select set of major economic and financial variables, revealing how to optimally use statistical software as a template to apply to your own variables of interest.
* Presents the economic and financial variables that offer unique insights into economic performance
* Highlights the econometric techniques that can be used to characterize variables
* Explores the application of SAS software, complete with simple explanations of SAS code and output
* Identifies key econometric issues with practical solutions to those problems
Presenting the "ten commandments" for economic and business forecasting, this book provides you with a practical forecasting framework you can use for important everyday business applications.
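To give a flavour of the kind of simple econometric technique such a forecasting book describes (the book itself works in SAS; this sketch uses Python instead), here is a minimal AR(1) model fitted by ordinary least squares with a one-step-ahead point forecast. The simulated series and all parameter values are invented for the example.

```python
import numpy as np

# Simulate a stationary AR(1) series: y_t = c + phi*y_{t-1} + e_t
rng = np.random.default_rng(1)
n = 500
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    y[t] = 1.0 + 0.7 * y[t - 1] + rng.normal(0, 0.5)

# Fit by OLS: regress y_t on a constant and its own lag y_{t-1}
X = np.column_stack([np.ones(n - 1), y[:-1]])
c_hat, phi_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]

# One-step-ahead point forecast from the last observed value
forecast = c_hat + phi_hat * y[-1]
```

Regressing a variable on its own lag is about the simplest forecasting regression there is, yet it already illustrates the workflow the blurb describes: characterize the variable, estimate, then forecast.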
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines, including economics, nutrition, psychology, and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan, and extending the analytical framework used in economics and biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the relationships among socioeconomic, nutritional, and behavioural variables on issues such as cognitive development in children and labour productivity in the developing world. These unique insights, combined with a multi-disciplinary approach, forge the way for a more refined and effective approach to food policy formation going forward. A chapter on the growing obesity epidemic is also included, highlighting the new set of problems facing not only developed but also developing countries. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.