This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well as, or better than, accommodative policy. The book is intended to be the largest-scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy, covering different parts or all of the period 1960-2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem, and on whether natural growth in the economy can do the same. Employing the best scientific methods available to economists for this type of problem, the book concludes that accommodative monetary policy could have offset crowd out, but that until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the science part of the answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume, Why Fiscal Stimulus Programs Fail, explores the policy implications of these results.
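As a flavor of the general form such a test could take, here is a minimal Python sketch of a crowd-out regression. It is not the book's actual specification; the variable names are hypothetical and the data are simulated.

```python
# A minimal, hypothetical sketch of a crowd-out test: does deficit spending
# reduce private investment, and does growth in loanable funds offset it?
# All data are simulated; this is not the book's specification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 51  # e.g., annual observations, 1960-2010

deficit = rng.normal(size=n)           # government deficit (fiscal stimulus)
loanable = rng.normal(size=n)          # growth in the pool of loanable funds
investment = 1.0 - 0.5 * deficit + 0.4 * loanable + rng.normal(scale=0.5, size=n)

X = sm.add_constant(np.column_stack([deficit, loanable]))
fit = sm.OLS(investment, X).fit()
print(fit.summary())
# A significantly negative deficit coefficient indicates crowd out; a positive
# loanable-funds coefficient indicates the kind of offset the book tests for.
```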
This handbook presents emerging research exploring the theoretical and practical aspects of econometric techniques for the financial sector and their applications in economics. By doing so, it offers invaluable tools for predicting and weighing the risks of multiple investments by incorporating data analysis. Throughout the book the authors address a broad range of topics such as predictive analysis, monetary policy, economic growth, systemic risk and investment behavior. This book is a must-read for researchers, scholars and practitioners in the field of economics who are interested in a better understanding of current research on the application of econometric methods to financial sector data.
Reflecting the fast pace and ever-evolving nature of the financial industry, the Handbook of High-Frequency Trading and Modeling in Finance details how high-frequency analysis presents new systematic approaches to implementing quantitative activities with high-frequency financial data. Introducing new and established mathematical foundations necessary to analyze realistic market models and scenarios, the handbook begins with a presentation of the dynamics and complexity of futures and derivatives markets as well as a portfolio optimization problem using quantum computers. Subsequently, the handbook addresses the estimation of complex model parameters using high-frequency data. Finally, it explores the links between models used in financial markets and models used in other research areas such as geophysics, fossil records, and earthquake studies. The Handbook of High-Frequency Trading and Modeling in Finance also features:
* Contributions by well-known experts within the academic, industrial, and regulatory fields
* A well-structured outline of the various data analysis methodologies used to identify new trading opportunities
* Newly emerging quantitative tools that address growing concerns relating to high-frequency data, such as stochastic volatility and volatility tracking; stochastic jump processes for limit-order books and broader market indicators; and options markets
* Practical applications using real-world data to help readers better understand the presented material
The Handbook of High-Frequency Trading and Modeling in Finance is an excellent reference for professionals in the fields of business, applied statistics, econometrics, and financial engineering, and a good supplement for graduate and MBA-level courses on quantitative finance, volatility, and financial econometrics.
Ionut Florescu, PhD, is Research Associate Professor in Financial Engineering and Director of the Hanlon Financial Systems Laboratory at Stevens Institute of Technology. His research interests include stochastic volatility, stochastic partial differential equations, Monte Carlo methods, and numerical methods for stochastic processes. Dr. Florescu is the author of Probability and Stochastic Processes, the coauthor of Handbook of Probability, and the coeditor of Handbook of Modeling High-Frequency Data in Finance, all published by Wiley. Maria C. Mariani, PhD, is Shigeko K. Chan Distinguished Professor in Mathematical Sciences and Chair of the Department of Mathematical Sciences at The University of Texas at El Paso. Her research interests include mathematical finance, applied mathematics, geophysics, nonlinear and stochastic partial differential equations, and numerical methods. Dr. Mariani is the coeditor of Handbook of Modeling High-Frequency Data in Finance, also published by Wiley. H. Eugene Stanley, PhD, is William Fairfield Warren Distinguished Professor at Boston University. Stanley is one of the key founders of the new interdisciplinary field of econophysics and has an ISI Hirsch index H=128 based on more than 1200 papers. In 2004 he was elected to the National Academy of Sciences. Frederi G. Viens, PhD, is Professor of Statistics and Mathematics and Director of the Computational Finance Program at Purdue University. He holds more than two dozen local, regional, and national awards, and he travels extensively to deliver lectures on his research interests, which range from quantitative finance to climate science and agricultural economics. A Fellow of the Institute of Mathematical Statistics, Dr. Viens is the coeditor of Handbook of Modeling High-Frequency Data in Finance, also published by Wiley.
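As a self-contained taste of one staple of high-frequency analysis, the sketch below estimates daily realized volatility from intraday returns. It is an illustration, not an excerpt from the handbook; the price path is simulated.

```python
# A minimal illustration of realized variance estimation from high-frequency
# data: sum the squared intraday log returns. Prices here are simulated.
import numpy as np

rng = np.random.default_rng(1)
n_ticks = 390  # e.g., one price observation per minute of a trading day
log_price = np.cumsum(rng.normal(scale=0.0005, size=n_ticks))

returns = np.diff(log_price)               # intraday log returns
realized_variance = np.sum(returns ** 2)   # realized variance for the day
realized_vol = np.sqrt(realized_variance)  # daily realized volatility

print(f"daily realized volatility: {realized_vol:.4%}")
```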
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques that have "space" as a core dimension of the analysis. This edited volume addresses the complex issues of Spatial Economics from an applied point of view. It is part of a larger project that includes a companion volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
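For readers new to the field, a workhorse statistic in applied spatial econometrics is Moran's I, which measures whether similar values cluster in space. The sketch below is purely illustrative; the four-region weight matrix and data are invented.

```python
# A minimal sketch of Moran's I for spatial autocorrelation.
# The weight matrix and regional data are invented for illustration.
import numpy as np

x = np.array([10.0, 12.0, 3.0, 2.0])   # e.g., regional income per capita
W = np.array([[0, 1, 0, 0],            # w[i, j] = 1 if regions i and j are
              [1, 0, 1, 0],            # neighbours, 0 otherwise
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

z = x - x.mean()
n, S0 = len(x), W.sum()
morans_I = (n / S0) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {morans_I:.3f}")   # > 0: similar values cluster in space
```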
This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth and performance. These concepts are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and the methodological approaches adopted in the analysis. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement - deterministic benchmarking and stochastic benchmarking - in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
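To make "deterministic benchmarking" concrete, here is a minimal sketch of corrected OLS (COLS), one simple member of that family: fit an average production function, shift it up to envelop the data, and measure efficiency against the shifted frontier. The data are simulated and the example is not taken from the Handbook.

```python
# A minimal corrected OLS (COLS) deterministic frontier sketch.
# Data are simulated; efficiency = 1.0 means the unit sits on the frontier.
import numpy as np

rng = np.random.default_rng(2)
log_x = rng.uniform(0, 2, size=50)                         # log input
log_y = 0.3 + 0.7 * log_x - rng.exponential(0.2, size=50)  # log output below frontier

A = np.column_stack([np.ones_like(log_x), log_x])
beta, *_ = np.linalg.lstsq(A, log_y, rcond=None)
resid = log_y - A @ beta
efficiency = np.exp(resid - resid.max())   # shift intercept by max residual
print(f"mean technical efficiency: {efficiency.mean():.2f}")
```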
This text prepares first-year graduate students and advanced undergraduates for empirical research in economics, and also equips them for specialization in econometric theory, business, and sociology. "A Course in Econometrics" is likely to be the text most thoroughly attuned to the needs of your students. Derived from the course taught by Arthur S. Goldberger at the University of Wisconsin-Madison and at Stanford University, it is specifically designed for use over two semesters, offers students the most thorough grounding in introductory statistical inference, and offers a substantial amount of interpretive material. The text brims with insights, strikes a balance between rigor and intuition, and provokes students to form their own critical opinions. "A Course in Econometrics" thoroughly covers the fundamentals--classical regression and simultaneous equations--and offers clear and logical explorations of asymptotic theory and nonlinear regression. To accommodate students with various levels of preparation, the text opens with a thorough review of statistical concepts and methods, then proceeds to the regression model and its variants. Bold subheadings introduce and highlight key concepts throughout each chapter. Each chapter concludes with a set of exercises specifically designed to reinforce and extend the material covered. Many of the exercises include real micro-data analyses, and all are ideally suited to use as homework and test questions.
This monograph addresses the methodological and empirical issues relevant for the development of sustainable agriculture, with a particular focus on Eastern Europe. It relates economic growth to the other dimensions of sustainability by applying integrated methods. The book comprises five chapters dedicated to the theoretical approaches towards sustainable rural development, productivity analysis, structural change analysis and environmental footprint. The book focuses on the transformations of the agricultural sector while taking into account economic, environmental, and social dynamics. The importance of agricultural transformations to the livelihood of the rural population and food security are highlighted. Further, advanced methodologies and frameworks are presented to fathom the underlying trends in different facets of agricultural production. The authors present statistical methods used for the analysis of agricultural sustainability along with applications for agriculture in the European Union. Additionally, they discuss the measures of efficiency, methodological approaches and empirical models. Finally, the book applies econometric and optimization techniques, which are useful for the estimation of the production functions and other representations of technology in the case of the European Union member states. Therefore, the book is a must-read for researchers and students of agricultural and production economics, as well as policy-makers and academia in general.
In this book, Nancy and Richard Ruggles demonstrate their unique grasp of the measurement and analysis of macro and micro data and elucidate ways of integrating the two data sets. Their analysis of macrodata is used to examine the economic growth of the United States from the 1920s to the present day. They focus particularly on recession and recovery between 1929 and 1974 and the measurement of short-run economic growth. They also examine the measurement of saving, investment and capital formation in the United States. On a microeconomic level, they analyse economic intelligence in World War II, offer a study of fertility in the United States in the pre-war era and analyse longitudinal establishment data. Finally, they integrate the two approaches to provide a more complete picture of social and economic performance.
Building on the strength of the first edition, Quantitative Methods for Business and Economics provides a simple introduction to the mathematical and statistical techniques needed in business. This book is accessible and easy to use, with the emphasis clearly on how to apply quantitative techniques to business situations. It includes numerous real world applications and many opportunities for student interaction. It is clearly focused on business, management and economics students taking a single module in Quantitative Methods.
The volume examines the state-of-the-art of productivity and efficiency analysis. It brings together a selection of the best papers from the 10th North American Productivity Workshop. By analyzing world-wide perspectives on challenges that local economies and institutions may face when changes in productivity are observed, readers can quickly assess the impact of productivity measurement, productivity growth, dynamics of productivity change, measures of labor productivity, measures of technical efficiency in different sectors, frontier analysis, measures of performance, industry instability and spillover effects. The contributions in this volume focus on the theory and application of economics, econometrics, statistics, management science and operational research related to problems in the areas of productivity and efficiency measurement. Popular techniques and methodologies including stochastic frontier analysis and data envelopment analysis are represented. Chapters also cover broader issues related to measuring, understanding, incentivizing and improving the productivity and performance of firms, public services, and industries.
This book is dedicated to the study of the term structures of the yields of zero-coupon bonds. The methods it describes differ from those usually found in the literature in that the time variable is not the term to maturity but the interest rate duration, or another convenient non-linear transformation of terms. This makes it possible to consider yield curves not only for a limited interval of term values, but also for the entire positive semiaxis of terms. The main focus is the comparative analysis of yield curves and forward curves and the analytical study of their features. Generalizations of yield term structures are studied where the dimension of the state space of the financial market is increased. In cases where the analytical approach is too cumbersome, or impossible, numerical techniques are used. This book will be of interest to financial analysts, financial market researchers, graduate students and PhD students.
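The book parameterizes curves by duration rather than term to maturity; as general background only, the sketch below illustrates the standard textbook link between a zero-coupon yield curve y(t) and the instantaneous forward curve f(t) = y(t) + t y'(t). The curve shape is invented and this is not the book's method.

```python
# A minimal numerical illustration of the standard yield/forward relation
# f(t) = y(t) + t * y'(t) for continuously compounded zero-coupon yields.
import numpy as np

t = np.linspace(0.25, 30, 120)            # terms in years
y = 0.05 - 0.02 * np.exp(-t / 4)          # an invented upward-sloping yield curve

dy_dt = np.gradient(y, t)                 # numerical derivative y'(t)
f = y + t * dy_dt                         # instantaneous forward curve

print(f"10y yield   ~ {np.interp(10, t, y):.4f}")
print(f"10y forward ~ {np.interp(10, t, f):.4f}")
```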
Econometrics, Macroeconomics and Economic Policy presents eighteen papers by Carl Christ focusing on econometric models, their evaluation and history, and the interactions between monetary and fiscal policy. Professor Christ's pioneering contributions to econometrics, monetary and fiscal policies and the government's budget constraint are thoroughly covered in this volume. Other areas addressed include monetary economics, monetary policy, macroeconomic model building, and the role of the economist in economic policy making. The book also features an original new introduction by the author and a detailed bibliography. Econometricians and macroeconomists will welcome this outstanding volume in which Professor Christ argues firmly for the importance of testing econometric equations and models against new data, as well as for exploring the impact of the policies of central government.
This volume of Advances in Econometrics contains a selection of papers presented at the "Econometrics of Complex Survey Data: Theory and Applications" conference organized by the Bank of Canada, Ottawa, Canada, on October 19-20, 2017. The papers included in this volume span a range of methodological and practical topics, including survey collection comparisons, imputation mechanisms, the bootstrap, nonparametric techniques, specification tests, and empirical likelihood estimation using complex survey data. For academics and students with an interest in econometrics and the ways in which complex survey data can be used and evaluated, this volume is essential.
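As a simple taste of one topic in the volume, the sketch below bootstraps a survey-weighted mean. It is a deliberately naive illustration with simulated weights; real complex survey designs require resampling schemes that respect strata and clusters.

```python
# A minimal (naive) bootstrap for a survey-weighted mean. Data and weights
# are simulated; this ignores strata/cluster structure for brevity.
import numpy as np

rng = np.random.default_rng(3)
n = 200
y = rng.normal(50, 10, size=n)       # survey responses
w = rng.uniform(0.5, 2.0, size=n)    # survey weights

est = np.average(y, weights=w)       # weighted point estimate

B = 1000
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)              # resample units with replacement
    boot[b] = np.average(y[idx], weights=w[idx])

print(f"weighted mean {est:.2f}, bootstrap s.e. {boot.std(ddof=1):.2f}")
```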
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques that have "space" as a core dimension of the analysis. This edited volume addresses the complex issues of Spatial Economics from a theoretical point of view. It is part of a larger project that includes a companion volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
This book systematically provides a prospective integrated approach to complexity social science from the viewpoint of statistical physics and mathematics, with an impressive collection of the knowledge and expertise of leading researchers from all over the world. The book covers both finitary methods of statistical equilibrium and data-driven analysis from econophysics.

The late Professor Masanao Aoki of UCLA, who passed away at the end of July 2018, in his later years dedicated himself to the reconstruction of macroeconomics mainly in terms of statistical physics. Professor Aoki, already an IEEE Fellow, was also named an Econometric Society Fellow in 1979. Until the early 1990s, however, his contributions were focused on the development of a novel algorithm for the time series model and its applications to economic data; those contributions were undoubtedly equivalent to the Nobel Prize-winning work of Granger's "co-integration method". After the publication of his New Approaches to Macroeconomic Modeling and Modeling Aggregate Behavior and Fluctuations in Economics, both published by Cambridge University Press, in 1996 and 2002, respectively, his contributions rapidly became known and spread throughout the field. In short, these new works challenged econophysicists to develop evolutionary stochastic dynamics, multiple equilibria, and externalities as field effects, and revolutionized the stochastic views of interacting agents. In particular, the publication of Reconstructing Macroeconomics, also by Cambridge University Press (2007), in cooperation with Hiroshi Yoshikawa, further sharpened the process of embodying "a perspective from statistical physics and combinatorial stochastic processes" in economic modeling.

Interestingly, almost concurrently with Prof. Aoki's newest developments, similar approaches were appearing elsewhere, and those who were working in the same context around the world came together, exchanging their results over the past decade. In memory of Prof. Aoki, this book was planned by authors who followed his work, presenting the most advanced outcomes of his legacy.
We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe, but instead the world is measured against them, found wanting and made to conform.
Discover the secrets to applying simple econometric techniques to improve forecasting. Equipping analysts, practitioners, and graduate students with a statistical framework to make effective decisions based on the application of simple economic and statistical methods, Economic and Business Forecasting offers a comprehensive and practical approach to quantifying and accurately forecasting key variables. Using simple econometric techniques, author John E. Silvia focuses on a select set of major economic and financial variables, revealing how to optimally use statistical software as a template to apply to your own variables of interest. The book:
* Presents the economic and financial variables that offer unique insights into economic performance
* Highlights the econometric techniques that can be used to characterize variables
* Explores the application of SAS software, complete with simple explanations of SAS code and output
* Identifies key econometric issues, with practical solutions to those problems
Presenting the "ten commandments" for economic and business forecasting, this book provides a practical forecasting framework you can use for important everyday business applications.
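The book's worked examples use SAS; purely as an illustration of the kind of simple technique it advocates, here is a comparable sketch in Python: an AR(1) regression used to produce a one-step-ahead forecast. The series is simulated and the coefficients are invented.

```python
# A minimal AR(1) forecasting sketch: regress y_t on y_{t-1}, then
# forecast one step ahead. The series is simulated.
import numpy as np

rng = np.random.default_rng(4)
T = 120
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):                  # simulate y_t = 0.5 + 0.8 y_{t-1} + e_t
    y[t] = 0.5 + 0.8 * y[t - 1] + rng.normal(scale=0.3)

A = np.column_stack([np.ones(T - 1), y[:-1]])     # regress y_t on y_{t-1}
(c, phi), *_ = np.linalg.lstsq(A, y[1:], rcond=None)

forecast = c + phi * y[-1]                        # one step ahead
print(f"estimated AR(1): c={c:.2f}, phi={phi:.2f}, forecast={forecast:.2f}")
```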
Now in its third edition, Essential Econometric Techniques: A Guide to Concepts and Applications is a concise, student-friendly textbook which provides an introductory grounding in econometrics, with an emphasis on the proper application and interpretation of results. Drawing on the author's extensive teaching experience, this book offers intuitive explanations of concepts such as heteroskedasticity and serial correlation, and provides step-by-step overviews of each key topic. This new edition contains more applications, brings in new material including a dedicated chapter on panel data techniques, and moves the theoretical proofs to appendices. After Chapter 7, students will be able to design and conduct rudimentary econometric research. The next chapters cover multicollinearity, heteroskedasticity, and autocorrelation, followed by techniques for time-series analysis and panel data. Excel data sets for the end-of-chapter problems are available as a digital supplement. A solutions manual is also available for instructors, as well as PowerPoint slides for each chapter. Essential Econometric Techniques shows students how economic hypotheses can be questioned and tested using real-world data, and is the ideal supplementary text for all introductory econometrics courses.
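As an example of the diagnostics covered in textbooks at this level, the sketch below computes the Durbin-Watson statistic for serial correlation in regression residuals. The residual series is simulated; this is a generic illustration, not an exercise from the book.

```python
# A minimal Durbin-Watson sketch: values near 2 suggest no first-order
# autocorrelation in the residuals. Residuals here are simulated.
import numpy as np

rng = np.random.default_rng(5)
e = np.empty(100)
e[0] = rng.normal()
for t in range(1, 100):                # residuals with positive autocorrelation
    e[t] = 0.6 * e[t - 1] + rng.normal(scale=0.5)

dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(f"Durbin-Watson = {dw:.2f}")     # well below 2 here, flagging autocorrelation
```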
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines, including economics, nutrition, psychology and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan, and extending the analytical framework used in economics and biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the relationships among socioeconomic, nutritional, and behavioural variables on issues such as cognitive development in children and labour productivity in the developing world. These unique insights, combined with a multi-disciplinary approach, pave the way for a more refined and effective approach to food policy formation. A chapter on the growing obesity epidemic is also included, highlighting the new set of problems facing both developed and developing countries. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
This book surveys the state-of-the-art in efficiency and productivity analysis, examining advances in both the analytical foundations and empirical applications. The analytical techniques developed in this book for efficiency provide alternative ways of defining optimum outcome sets, typically as a (technical) production frontier or as an (economic) cost, revenue or profit frontier, and alternative ways of measuring efficiency relative to an appropriate frontier. These techniques extend directly to productivity analysis, providing alternative methods for estimating productivity levels, productivity change through time, and productivity variation across producers. The book includes chapters using data envelopment analysis (DEA) or stochastic frontier analysis (SFA) as quantitative techniques capable of measuring efficiency and productivity. Across its 15 chapters, the book extends broadly into popular application areas including agriculture, banking and finance, and municipal performance, and into relatively new application areas including corporate social responsibility, the value of intangible assets, land consolidation, and the measurement of economic well-being. The chapters also cover topics such as permutation tests for production frontier shifts, new indices of total factor productivity, and randomized controlled trials and production frontiers.
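To make DEA concrete, here is a minimal input-oriented CCR DEA sketch using scipy's linear programming solver: for each decision-making unit, find the smallest theta such that a combination of peers uses at most theta times its inputs while producing at least its outputs. The three-unit data set is invented, and this is a sketch of the generic technique rather than any chapter's method.

```python
# A minimal input-oriented CCR DEA sketch via linear programming.
# Variables per LP: [theta, lambda_1..lambda_n]; minimize theta.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [6.0, 5.0]])  # inputs, one row per unit
Y = np.array([[1.0], [1.0], [2.0]])                 # outputs, one row per unit

def ccr_efficiency(k):
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                     # objective: minimize theta
    # sum_j lambda_j * x_j - theta * x_k <= 0   (one row per input)
    A_in = np.hstack([-X[k][:, None], X.T])
    # -sum_j lambda_j * y_j <= -y_k             (one row per output)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

for k in range(len(X)):
    print(f"unit {k}: efficiency = {ccr_efficiency(k):.2f}")
```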
Gary Madden was a renaissance man with respect to the nexus between information and communications technology (ICT) and economics. He contributed to a variety of fields in ICT: applied econometrics, forecasting, internet governance and policy. This series of essays, two of which were co-authored by Professor Madden prior to his untimely death, covers the range of his research interests. While the essays focus on a number of ICT issues, they are on the frontier of research in the sector. Gerard Faulhaber provides a broad overview of how we have reached the digital age and its implications. The applied econometrics section brings the latest research in the area; for example, Lester Taylor illustrates how own-price, cross-price and income elasticities can be calculated from survey data and translated into real income effects. The forecasting section ranges from forecasting online political participation to broadband's impact on economic growth. The final section covers aspects of governance and regulation of the ICT sector.
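As background to the elasticity theme, the sketch below shows the textbook idea that in a log-log demand regression the price coefficient is the own-price elasticity. The data are simulated with a true elasticity of -0.8; this is a generic illustration, not Taylor's method.

```python
# A minimal log-log demand regression: the slope on log price estimates
# the own-price elasticity. Survey-style data are simulated.
import numpy as np

rng = np.random.default_rng(6)
log_p = rng.normal(size=300)                          # log price across respondents
log_q = 2.0 - 0.8 * log_p + rng.normal(scale=0.3, size=300)

A = np.column_stack([np.ones_like(log_p), log_p])
(alpha, elasticity), *_ = np.linalg.lstsq(A, log_q, rcond=None)
print(f"estimated own-price elasticity: {elasticity:.2f}")
```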
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore the various disciplines from which the concepts and methods of artificial economics originate, such as cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions of several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophies of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting in a balanced way its occasionally controversial aspects.
In this book, different quantitative approaches to the study of electoral systems are developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization approaches. All the authors are prominent scholars in these disciplines. Quantitative approaches offer a powerful tool for detecting inconsistencies or poor performance in actual systems. Applications to concrete settings such as the EU, the American Congress, and regional and committee voting are discussed.
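In the combinatorial spirit of the book, here is a minimal sketch that computes the (non-normalized) Banzhaf power index of each voter in a small weighted voting game by enumerating all coalitions. The weights and quota are invented for illustration.

```python
# A minimal Banzhaf power index sketch: a voter's score is the fraction of
# coalitions containing them in which they are pivotal ("swing").
from itertools import combinations

weights = [4, 3, 2, 1]   # voting weights (invented)
quota = 6                # votes needed to pass

n = len(weights)
swings = [0] * n
for r in range(n + 1):
    for coalition in combinations(range(n), r):
        total = sum(weights[i] for i in coalition)
        for i in coalition:
            # i is a swing if the coalition wins with i but loses without i
            if total >= quota and total - weights[i] < quota:
                swings[i] += 1

for i, s in enumerate(swings):
    print(f"voter {i} (weight {weights[i]}): Banzhaf score {s / 2 ** (n - 1):.3f}")
```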
The Regression Discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. This volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters touch on foundational methodological issues such as identification, interpretation, implementation, falsification testing, estimation and inference, while others focus on more recent and related topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only give new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
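For readers unfamiliar with RD, the sketch below shows the basic sharp RD estimator in its simplest local linear form: within a bandwidth around the cutoff, regress the outcome on a treatment dummy, the centered running variable, and their interaction. The data are simulated with a true jump of 2; this illustrates the generic design, not any specific chapter's method.

```python
# A minimal sharp RD sketch: the coefficient on the treatment dummy in a
# local linear regression estimates the jump at the cutoff. Data simulated.
import numpy as np

rng = np.random.default_rng(7)
n, cutoff, h = 2000, 0.0, 0.5        # sample size, cutoff, bandwidth
x = rng.uniform(-1, 1, size=n)       # running variable
d = (x >= cutoff).astype(float)      # sharp treatment assignment
y = 1.0 + 2.0 * d + 0.8 * x + rng.normal(scale=0.5, size=n)  # true jump = 2

keep = np.abs(x - cutoff) <= h       # local sample around the cutoff
xc = x[keep] - cutoff
A = np.column_stack([np.ones(keep.sum()), d[keep], xc, d[keep] * xc])
beta, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
print(f"estimated treatment effect at the cutoff: {beta[1]:.2f}")
```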
A provocative new analysis of immigration's long-term effects on a nation's economy and culture. Over the last two decades, as economists began using big datasets and modern computing power to reveal the sources of national prosperity, their statistical results kept pointing toward the power of culture to drive the wealth of nations. In The Culture Transplant, Garett Jones documents the cultural foundations of cross-country income differences, showing that immigrants import cultural attitudes from their homelands—toward saving, toward trust, and toward the role of government—that persist for decades, and likely for centuries, in their new national homes. Full assimilation in a generation or two, Jones reports, is a myth. And the cultural traits migrants bring to their new homes have enduring effects upon a nation's economic potential. Built upon mainstream, well-reviewed academic research that hasn't pierced the public consciousness, this book offers a compelling refutation of an unspoken consensus that a nation's economic and political institutions won't be changed by immigration. Jones refutes the common view that we can discuss migration policy without considering whether migration can, over a few generations, substantially transform the economic and political institutions of a nation. And since most of the world's technological innovations come from just a handful of nations, Jones concludes, the entire world has a stake in whether migration policy will help or hurt the quality of government and thus the quality of scientific breakthroughs in those rare innovation powerhouses.
You may like...
Energy Sector: A Systemic Analysis of… by Oleg V. Inshakov, Agnessa O. Inshakova, … (Hardcover, R3,407)
On Creating Competition and Strategic… by Emiel F.M. Wubben, William Hulsink (Hardcover, R3,176)
Performance Measurement and Regulation… by Tim Coelli, Denis Lawrence (Hardcover, R4,080)