Heavy tails - extreme events or values more common than expected - emerge everywhere: the economy, natural events, and social and information networks are just a few examples. Yet after decades of progress, they are still treated as mysterious, surprising, and even controversial, primarily because the necessary mathematical models and statistical methods are not widely known. This book, for the first time, provides a rigorous introduction to heavy-tailed distributions accessible to anyone who knows elementary probability. It tackles and tames the zoo of terminology for models and properties, demystifying topics such as the generalized central limit theorem and regular variation. It tracks the natural emergence of heavy-tailed distributions from a wide variety of general processes, building intuition. And it reveals the controversy surrounding heavy tails to be the result of flawed statistics, then equips readers to identify and estimate with confidence. Over 100 exercises complete this engaging package.
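As a rough illustration of the kind of estimation this book motivates (not an excerpt from it), the following Python sketch draws from a Pareto distribution and applies the classical Hill estimator of the tail index; the sample size, tail index, and cutoff k are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from a Pareto distribution with tail index alpha = 1.5
# (Generator.pareto draws a Lomax variate; adding 1 gives a classical Pareto on [1, inf)).
alpha = 1.5
x = 1.0 + rng.pareto(alpha, size=100_000)

def hill_estimator(sample, k):
    """Hill estimate of the tail index from the k largest observations."""
    order = np.sort(sample)[::-1]               # descending order statistics
    return 1.0 / np.mean(np.log(order[:k] / order[k]))

print("true tail index :", alpha)
print("Hill estimate   :", round(hill_estimator(x, k=2_000), 3))

# Heavy tails in action: a single observation can carry a visible share of the total.
print("largest observation / sum:", round(x.max() / x.sum(), 4))
```

The last line hints at why naive statistics mislead with heavy tails: one extreme draw can dominate sample averages.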
Against the backdrop of the impressive progress made by the Indian economy during the last two decades, after the large-scale economic reforms of the early 1990s, this book evaluates the performance of the economy on some income and non-income dimensions of development at the national, state and sectoral levels. It examines regional economic growth and inequality in income originating from agriculture, industry and services. In view of the importance of the agricultural sector, despite its declining share in gross domestic product, it evaluates the performance of agricultural production and the impact of agricultural reforms on spatial integration of food grain markets. It studies rural poverty, analyzing the trend in employment, the trickle-down process and the inclusiveness of growth in rural India. It also evaluates the impact of microfinance, as an instrument of financial inclusion, on the socio-economic conditions of rural households. Lastly, it examines the relative performance of fifteen major states of India in terms of education, health and human development. An important feature of the book is that it approaches these issues by rigorously applying advanced econometric methods, focusing primarily on regional disparities in the post-reform period vis-a-vis the pre-reform period. It offers important results to guide policies for future development.
Building on the strength of the first edition, Quantitative Methods for Business and Economics provides a simple introduction to the mathematical and statistical techniques needed in business. This book is accessible and easy to use, with the emphasis clearly on how to apply quantitative techniques to business situations. It includes numerous real world applications and many opportunities for student interaction. It is clearly focused on business, management and economics students taking a single module in Quantitative Methods.
This volume of Advances in Econometrics contains articles that examine key topics in the modeling and estimation of dynamic stochastic general equilibrium (DSGE) models. Because DSGE models combine micro- and macroeconomic theory with formal econometric modeling and inference, over the past decade they have become an established framework for analyzing a variety of issues in empirical macroeconomics. The research articles make contributions in several key areas in DSGE modeling and estimation. In particular, papers cover the modeling and role of expectations, the study of optimal monetary policy in two-country models, and the problem of non-invertibility. Other interesting areas of inquiry include the analysis of parameter identification in new open economy macroeconomic models and the modeling of trend inflation shocks. The second part of the volume is devoted to articles that offer innovations in econometric methodology. These papers advance new techniques for addressing major inferential problems and include discussion and applications of Laplace-type, frequency domain, empirical likelihood and method of moments estimators.
Across the social sciences there has been increasing focus on reproducibility, i.e., the ability to examine a study's data and methods to ensure accuracy by reproducing the study. Reproducible Econometrics Using R combines an overview of key issues and methods with an introduction to applying them using open source software (R) and recently developed tools (R Markdown and bookdown) that allow the reader to engage in reproducible econometric research. Jeffrey S. Racine provides a step-by-step approach and covers five sets of topics: i) linear time series models, ii) robust inference, iii) robust estimation, iv) model uncertainty, and v) advanced topics. The time series material highlights the difference between time-series analysis, which focuses on forecasting, and cross-sectional analysis, where the focus is typically on model parameters that have economic interpretations. For the time series material, the reader begins with a discussion of random walks, white noise, and non-stationarity. The reader is next exposed to the pitfalls of using standard inferential procedures, popular in cross-sectional settings, when modelling time series data, and is introduced to alternative procedures that form the basis for linear time series analysis. For the robust inference material, the reader is introduced to the potential advantages of bootstrapping and jackknifing over the use of asymptotic theory, and a range of numerical approaches are presented. For the robust estimation material, the reader is presented with a discussion of issues surrounding outliers in data and methods for addressing their presence. Finally, the model uncertainty material outlines two dominant approaches for dealing with model uncertainty, namely model selection and model averaging. Throughout the book there is an emphasis on the benefits of using R and other open source tools for ensuring reproducibility. The advanced material covers machine learning methods (support vector machines, useful for classification) and nonparametric kernel regression, which provide the reader with more advanced methods for confronting model uncertainty. The book is well suited for advanced undergraduate and graduate students alike. Assignments, exams, slides, and a solution manual are available for instructors.
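The book itself works in R; purely to illustrate the bootstrap-versus-jackknife-versus-asymptotics comparison it describes, here is a minimal Python sketch for the standard error of a sample mean. The data-generating process and replication counts are arbitrary choices for the example, not material from the book.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.exponential(scale=2.0, size=200)   # a skewed sample

# Asymptotic standard error of the sample mean: s / sqrt(n)
se_asym = y.std(ddof=1) / np.sqrt(len(y))

# Nonparametric bootstrap: resample with replacement, recompute the mean
boot_means = np.array([rng.choice(y, size=len(y), replace=True).mean()
                       for _ in range(2_000)])
se_boot = boot_means.std(ddof=1)

# Jackknife: leave one observation out at a time
n = len(y)
jack_means = np.array([np.delete(y, i).mean() for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((jack_means - jack_means.mean()) ** 2))

print(f"asymptotic SE: {se_asym:.4f}")
print(f"bootstrap SE : {se_boot:.4f}")
print(f"jackknife SE : {se_jack:.4f}")
```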
Sociological theories of crime include: theories of strain, which blame crime on personal stressors; theories of social learning, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and theories of control, which look at crime as natural and rewarding, and explore the formation of institutions that control crime. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades resources with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption do not involve an explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, a rational-choice-based approach. Simulation-based approaches take into account greater complexities of interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback, and agent-based simulation, which can go a step further by computing new social structures not previously identified in theory. The latter include cognitive agent models, in which agents learn how to perceive their environment and act upon the perceptions of their individual experiences, and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects within existing societal structures. For example, NNL is a cognitive agent model based on the REPAST Simphony toolkit.
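The toy Python sketch below is not the NNL model and uses no REPAST Simphony code; it is only a minimal illustration of what a reactive-agent simulation with a crude institutional feedback loop can look like. All parameters (offence propensities, the enforcement update rule) are invented for the example.

```python
import random

random.seed(42)

class Agent:
    """Minimal reactive agent: offends when perceived reward beats perceived risk."""
    def __init__(self):
        self.propensity = random.random()      # taste for the rewards of crime

    def act(self, enforcement):
        reward = self.propensity
        risk = enforcement * random.random()
        return reward > risk                   # True = offend this period

def simulate(n_agents=1_000, periods=50, enforcement=0.5):
    agents = [Agent() for _ in range(n_agents)]
    rates = []
    for _ in range(periods):
        rate = sum(a.act(enforcement) for a in agents) / n_agents
        rates.append(rate)
        # crude institutional feedback: enforcement tightens when crime rises
        enforcement = min(1.0, enforcement + 0.1 * (rate - 0.3))
    return rates

rates = simulate()
print("first 5 offence rates:", [round(r, 3) for r in rates[:5]])
print("final offence rate   :", round(rates[-1], 3))
```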
How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems in the analysis of social interactions is used to develop multi-level theories that examine the interplay between individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually addressed by game theory, especially focusing on the formation of groups and the role of social norms in maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the ethical principles embraced by a society.
Financial models are an inescapable feature of modern financial markets. Yet over-reliance on these models, and the failure to test them properly, is now widely recognized as one of the main causes of the financial crisis of 2007-2011. Since this crisis, there has been an increase in the amount of scrutiny and testing applied to such models, and validation has become an essential part of model risk management at financial institutions. The book covers all of the major risk areas that a financial institution is exposed to and uses models for, including market risk, interest rate risk, retail credit risk, wholesale credit risk, compliance risk, and investment management. It discusses current practices and pitfalls that model risk users need to be aware of, and identifies areas where validation can be advanced in the future, providing the first unified framework for validating risk management models.
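As one small, generic illustration of model validation (not taken from this book), the sketch below backtests a hypothetical 99% value-at-risk forecast by counting exceedances and applying a Kupiec-style unconditional coverage test. The P&L series and the flat VaR forecast are simulated assumptions made for the example.

```python
import math
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily P&L and a model's 99% value-at-risk forecast (both made up).
n = 500
pnl = rng.normal(size=n)
var_99 = -2.33                       # model claims losses beyond 2.33 on only 1% of days

k = int((pnl < var_99).sum())        # observed exceedances
p0, p_hat = 0.01, max(k, 1) / n      # guard against k = 0 in this toy check

# Kupiec unconditional-coverage likelihood ratio, chi-square(1) under the null.
lr = -2 * (k * math.log(p0) + (n - k) * math.log(1 - p0)
           - k * math.log(p_hat) - (n - k) * math.log(1 - p_hat))
p_value = math.erfc(math.sqrt(lr / 2))   # survival function of chi-square(1)

print(f"exceedances: {k} (about {p0 * n:.0f} expected)")
print(f"Kupiec LR: {lr:.2f}, p-value: {p_value:.3f}")
```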
Economists are regularly confronted with the results of quantitative economic research. Econometrics: Theory and Applications with EViews provides a broad introduction to quantitative economic methods: how models arise, their underlying assumptions, and how estimates of parameters or other economic quantities are computed. The author combines econometric theory with practice by demonstrating its use with the software package EViews through extensive use of screen shots. The emphasis is on understanding how to select the right method of analysis for a given situation, and how to apply the theoretical methodology correctly. The EViews software package is available from Quantitative Micro Software. Written for any undergraduate or postgraduate course in econometrics.
A ground-breaking book that reveals why our human biases affect the way we receive and interpret information.
This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth and performance. These concepts are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and the methodological approaches adopted by the analyst. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement - deterministic benchmarking and stochastic benchmarking - in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
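A toy example of the deterministic, ratio-based end of this spectrum (not material from the handbook): computing output/input ratios for a few hypothetical firms and expressing each relative to the best observed ratio in the sample. Firm names and figures are invented.

```python
# Toy deterministic benchmarking: single-input, single-output efficiency ratios,
# measured relative to the best observed output/input ratio in the sample.
firms = {
    "A": {"input": 10.0, "output": 8.0},
    "B": {"input": 12.0, "output": 12.0},
    "C": {"input": 9.0,  "output": 6.3},
}

ratios = {name: f["output"] / f["input"] for name, f in firms.items()}
best = max(ratios.values())

for name, r in ratios.items():
    print(f"firm {name}: output/input = {r:.2f}, relative efficiency = {r / best:.2f}")
```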
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insights on what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution will be the coverage of advancements in panel data analysis, a statistical method widely used to analyze two- or higher-dimensional panel data. The topics discussed in early editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. This book, to be featured in Cambridge's Econometric Society Monographs series, has been the leader in the field since the first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
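As a generic illustration of one workhorse panel method (not drawn from the book), the sketch below simulates a short panel with individual fixed effects and compares the within (fixed-effects) estimator with pooled OLS; the data-generating process is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a small panel: N individuals, T periods, individual fixed effects.
N, T, beta = 100, 5, 1.5
alpha_i = rng.normal(size=N).repeat(T)            # unobserved heterogeneity
x = rng.normal(size=N * T) + 0.5 * alpha_i        # regressor correlated with it
y = alpha_i + beta * x + rng.normal(size=N * T)
ids = np.arange(N).repeat(T)

# Within transformation: demean x and y by individual, then run OLS.
def demean(v, groups):
    means = np.bincount(groups, weights=v) / np.bincount(groups)
    return v - means[groups]

x_w, y_w = demean(x, ids), demean(y, ids)
beta_fe = (x_w @ y_w) / (x_w @ x_w)

# Pooled OLS for contrast (biased here because x is correlated with alpha_i).
beta_pooled = (x @ y) / (x @ x)                   # no intercept, toy comparison

print(f"true beta    : {beta}")
print(f"fixed effects: {beta_fe:.3f}")
print(f"pooled OLS   : {beta_pooled:.3f}")
```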
Score your highest in econometrics? Easy. Econometrics can prove challenging for many students unfamiliar with the terms and concepts discussed in a typical econometrics course. Econometrics For Dummies eliminates that confusion with easy-to-understand explanations of important topics in the study of economics. It breaks down this complex subject and provides you with an easy-to-follow course supplement to further refine your understanding of how econometrics works and how it can be applied in real-world situations. An excellent resource for anyone taking a college or graduate-level econometrics course, it provides an easy-to-follow introduction to the techniques and applications of econometrics and helps you score high on exam day. If you're seeking a degree in economics and looking for a plain-English guide to this often-intimidating course, Econometrics For Dummies has you covered.
We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of the economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe, but instead the world is measured against them, found wanting and made to conform.
This book is dedicated to the study of the term structures of the yields of zero-coupon bonds. The methods it describes differ from those usually found in the literature in that the time variable is not the term to maturity but the interest rate duration, or another convenient non-linear transformation of terms. This makes it possible to consider yield curves not only for a limited interval of term values, but also for the entire positive semiaxis of terms. The main focus is the comparative analysis of yield curves and forward curves and the analytical study of their features. Generalizations of yield term structures are studied where the dimension of the state space of the financial market is increased. In cases where the analytical approach is too cumbersome, or impossible, numerical techniques are used. This book will be of interest to financial analysts, financial market researchers, graduate students and PhD students.
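The book parameterizes curves by duration rather than term to maturity; the short sketch below only illustrates the standard textbook relationship between zero-coupon bond prices, spot yields, and implied forward rates under continuous compounding, with made-up prices.

```python
import numpy as np

# Zero-coupon bond prices (per 1 unit of face value) at a few terms, in years.
terms = np.array([1.0, 2.0, 5.0, 10.0])
prices = np.array([0.97, 0.93, 0.82, 0.64])

# Continuously compounded spot yields: P(t) = exp(-y(t) * t)
yields = -np.log(prices) / terms

# Implied forward rates between adjacent terms:
# f(t1, t2) = (y2 * t2 - y1 * t1) / (t2 - t1)
forwards = (yields[1:] * terms[1:] - yields[:-1] * terms[:-1]) / np.diff(terms)

for t, y in zip(terms, yields):
    print(f"spot yield {t:>4.0f}y : {y:.4f}")
for t1, t2, f in zip(terms[:-1], terms[1:], forwards):
    print(f"forward {t1:.0f}y-{t2:.0f}y : {f:.4f}")
```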
Connections among different assets, asset classes, portfolios, and the stocks of individual institutions are critical in examining financial markets. Interest in financial markets implies interest in underlying macroeconomic fundamentals. In Financial and Macroeconomic Connectedness, Frank Diebold and Kamil Yilmaz propose a simple framework for defining, measuring, and monitoring connectedness, which is central to finance and macroeconomics. These measures of connectedness are theoretically rigorous yet empirically relevant. The approach to connectedness proposed by the authors is intimately related to the familiar econometric notion of variance decomposition. The full set of variance decompositions from vector autoregressions produces the core of the 'connectedness table.' The connectedness table makes clear how one can begin with the most disaggregated pairwise directional connectedness measures and aggregate them in various ways to obtain total connectedness measures. The authors also show that variance decompositions define weighted, directed networks, so that these proposed connectedness measures are intimately related to key measures of connectedness used in the network literature. After describing their methods in the first part of the book, the authors proceed to characterize daily return and volatility connectedness across major asset (stock, bond, foreign exchange and commodity) markets, as well as across the financial institutions within the U.S. and across countries, since the late 1990s. These measures of volatility connectedness show that stock markets played a critical role in spreading volatility shocks from the U.S. to other countries. Furthermore, while the return connectedness across stock markets increased gradually over time, the volatility connectedness measures were subject to significant jumps during major crisis events. The book examines not only financial connectedness, but also real fundamental connectedness. In particular, the authors show that global business cycle connectedness is economically significant and time-varying, that the U.S. has disproportionately high connectedness to others, and that pairwise country connectedness is inversely related to bilateral trade surpluses.
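As a stripped-down illustration of the mechanics (not the authors' code, and using a Cholesky-orthogonalized rather than the generalized variance decomposition used in parts of the book), the sketch below fits a two-variable VAR(1) to simulated data, computes a 10-step forecast-error variance decomposition, and reads off a tiny connectedness table. All parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a two-variable VAR(1): y_t = A @ y_{t-1} + e_t
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
T = 1_000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=[1.0, 0.5])

# OLS estimate of the VAR(1) coefficient matrix and residual covariance
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
resid = Y - X @ A_hat.T
sigma = resid.T @ resid / (T - 1)

# H-step forecast-error variance decomposition with Cholesky-orthogonalized shocks
H = 10
P = np.linalg.cholesky(sigma)
contrib = np.zeros((2, 2))           # contrib[i, j]: share of shock j in variable i
Psi = np.eye(2)
for _ in range(H):
    contrib += (Psi @ P) ** 2
    Psi = A_hat @ Psi
fevd = contrib / contrib.sum(axis=1, keepdims=True)

# A 2x2 "connectedness table": off-diagonal entries are cross-variable spillovers.
print("variance decomposition (rows sum to 1):")
print(np.round(fevd, 3))
off_diag = fevd[0, 1] + fevd[1, 0]
print("total connectedness (mean off-diagonal share):", round(off_diag / 2, 3))
```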
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in concepts and issues across several disciplines, including economics, nutrition, psychology and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan, and extending the analytical framework used in economics and the biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the relationships between socioeconomic, nutritional, and behavioural variables and outcomes such as cognitive development in children and labour productivity in the developing world. These unique insights, combined with a multi-disciplinary approach, forge the way for a more refined and effective approach to food policy going forward. A chapter on the growing obesity epidemic is also included, highlighting a new set of problems facing developed and developing countries alike. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
This report is a partial result of the China Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasting and macroeconomic policy analysis, and it started to develop the CQMM for the purpose of short-term forecasting, policy analysis, and simulation in 2005. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. The conference was jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore various disciplines from which the concepts and methods of artificial economics originate, for example cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions of several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophies of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting its occasionally controversial aspects in a balanced way.
In this book, different quantitative approaches to the study of electoral systems are developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization ones. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool to detect inconsistencies or poor performance in actual systems. Applications to concrete settings such as the EU, the American Congress, and regional and committee voting are discussed.
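As one concrete example of the combinatorial/game-theoretic strand (not reproduced from the book), the sketch below computes the normalized Banzhaf power index for a hypothetical four-party weighted voting game; the seat weights and quota are invented for the example.

```python
from itertools import combinations

def banzhaf(weights, quota):
    """Normalized Banzhaf index: share of swings in which each player is pivotal."""
    players = list(weights)

    def wins(coalition):
        return sum(weights[p] for p in coalition) >= quota

    swings = {p: 0 for p in players}
    for r in range(len(players) + 1):
        for coalition in combinations(players, r):
            if not wins(coalition):
                continue
            for p in coalition:
                # p is pivotal if removing it turns a winning coalition into a losing one
                if not wins(tuple(q for q in coalition if q != p)):
                    swings[p] += 1
    total = sum(swings.values())
    return {p: swings[p] / total for p in players}

# Toy weighted voting game: four parties, seat weights, majority quota of 51.
print(banzhaf({"A": 40, "B": 30, "C": 20, "D": 10}, quota=51))
```

In this toy game, B and C end up with identical power despite different seat weights, a standard illustration that voting power need not be proportional to seats.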
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
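As a toy illustration of the supermarket model mentioned above (not code from the book), the sketch below simulates join-the-shortest-of-d-queues routing in discrete time and compares purely random routing (d = 1) with two choices (d = 2). The arrival and service rates are arbitrary assumptions for the example.

```python
import random

random.seed(0)

def simulate(n_servers=100, arrivals_per_step=90, steps=2_000, d=2):
    """Discrete-time supermarket model: each arrival samples d queues and joins the
    shortest; each non-empty queue completes one job per step (load here is 0.9)."""
    queues = [0] * n_servers
    for _ in range(steps):
        for _ in range(arrivals_per_step):
            choices = random.sample(range(n_servers), d)
            queues[min(choices, key=lambda i: queues[i])] += 1
        for i in range(n_servers):
            if queues[i] > 0:
                queues[i] -= 1
    return sum(queues) / n_servers

for d in (1, 2):
    print(f"d = {d}: average queue length about {simulate(d=d):.2f}")
```

Even in this crude setup, sampling two queues instead of one shortens queues dramatically, which is the load-balancing effect the many-server analysis makes precise.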
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small-sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman filter and structural time series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
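As a minimal, generic illustration of the single-equation route into cointegration (not an example from the book), the sketch below simulates two cointegrated series and runs the two Engle-Granger steps; as the comments note, formal inference requires the appropriate Engle-Granger/Dickey-Fuller critical values, which this sketch does not compute.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulate two cointegrated I(1) series: x is a random walk, y = 2x + stationary error.
T = 500
x = np.cumsum(rng.normal(size=T))
y = 2.0 * x + rng.normal(scale=0.5, size=T)

# Step 1 (Engle-Granger): estimate the cointegrating regression y = b*x + u by OLS.
b_hat = (x @ y) / (x @ x)
u = y - b_hat * x

# Step 2: check mean reversion of the residuals via the regression
# delta u_t = rho * u_{t-1} + e_t (rho well below zero suggests cointegration;
# proper testing uses Engle-Granger/Dickey-Fuller critical values).
du, u_lag = np.diff(u), u[:-1]
rho = (u_lag @ du) / (u_lag @ u_lag)

print(f"estimated cointegrating coefficient: {b_hat:.3f} (true value 2.0)")
print(f"residual mean-reversion coefficient rho: {rho:.3f}")
```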