Emphasizing the impact of computer software and computational technology on econometric theory and development, this text presents recent advances in the application of computerized tools to econometric techniques and practices, focusing on current innovations in Monte Carlo simulation, computer-aided testing, model selection, and Bayesian methodology for improved econometric analyses.
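To give a concrete sense of the kind of Monte Carlo experiment referred to above, the following sketch (a generic illustration, not code from the book; the sample size, replication count, and data-generating process are arbitrary choices) estimates the empirical rejection rate of a two-sided t-test on a regression slope when the true slope is zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps, alpha = 50, 5000, 0.05   # sample size, replications, nominal level
rejections = 0

for _ in range(reps):
    x = rng.normal(size=n)
    y = 1.0 + 0.0 * x + rng.normal(size=n)        # true slope is zero
    slope, intercept, r, p, se = stats.linregress(x, y)
    if p < alpha:                                  # two-sided t-test on the slope
        rejections += 1

print(f"empirical size: {rejections / reps:.3f} (nominal {alpha})")
```

Under correct specification the empirical rejection rate should be close to the nominal 5% level; the same template is used to study size and power of more elaborate tests.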
Model a Wide Range of Count Time Series
Handbook of Discrete-Valued Time Series presents state-of-the-art methods for modeling time series of counts and incorporates frequentist and Bayesian approaches for discrete-valued spatio-temporal data and multivariate data. While the book focuses on time series of counts, some of the techniques discussed can be applied to other types of discrete-valued time series, such as binary-valued or categorical time series.
Explore a Balanced Treatment of Frequentist and Bayesian Perspectives
Accessible to graduate-level students who have taken an elementary class in statistical time series analysis, the book begins with the history and current methods for modeling and analyzing univariate count series. It next discusses diagnostics and applications before proceeding to binary and categorical time series. The book then provides a guide to modern methods for discrete-valued spatio-temporal data, illustrating how far modern applications have evolved from their roots. The book ends with a focus on multivariate and long-memory count series.
Get Guidance from Masters in the Field
Written by a cohesive group of distinguished contributors, this handbook provides a unified account of the diverse techniques available for observation- and parameter-driven models. It covers likelihood and approximate likelihood methods, estimating equations, simulation methods, and a Bayesian approach for model fitting.
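One of the simplest observation-driven count models in this literature is the Poisson autoregression, often called INGARCH(1,1), in which the conditional mean depends on the previous count and the previous mean. The sketch below is a minimal, generic illustration of simulating such a series and evaluating its conditional Poisson log-likelihood; the parameter values and initialization are arbitrary assumptions, not material from the handbook.

```python
import numpy as np
from scipy.stats import poisson

def simulate_ingarch(omega, alpha, beta, n, rng):
    """Simulate y_t ~ Poisson(lam_t) with lam_t = omega + alpha*y_{t-1} + beta*lam_{t-1}."""
    y = np.zeros(n, dtype=int)
    lam = np.zeros(n)
    lam[0] = omega / (1 - alpha - beta)      # stationary mean as a starting value
    y[0] = rng.poisson(lam[0])
    for t in range(1, n):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
        y[t] = rng.poisson(lam[t])
    return y, lam

def loglik(params, y):
    """Conditional Poisson log-likelihood of an INGARCH(1,1) model."""
    omega, alpha, beta = params
    lam = np.empty(len(y))
    lam[0] = y.mean()                        # simple initialization
    for t in range(1, len(y)):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
    return poisson.logpmf(y, lam).sum()

rng = np.random.default_rng(1)
y, _ = simulate_ingarch(omega=1.0, alpha=0.3, beta=0.4, n=500, rng=rng)
print(loglik((1.0, 0.3, 0.4), y))
```

Maximizing this log-likelihood over (omega, alpha, beta) gives the usual likelihood-based fit; parameter-driven alternatives instead place the dynamics in a latent process and require the approximate-likelihood or simulation methods the handbook surveys.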
A Guide to Modern Econometrics, Fifth Edition has become established as a highly successful textbook. It serves as a guide to alternative techniques in econometrics with an emphasis on intuition and the practical implementation of these approaches. This fifth edition builds upon the success of its predecessors. The text has been carefully checked and updated, taking into account recent developments and insights. It includes new material on causal inference, the use and limitations of p-values, instrumental variables estimation and its implementation, regression discontinuity design, standardized coefficients, and the presentation of estimation results.
The new research method presented in this book ensures that all economic theories are falsifiable and that unrefuted theories are scientifically sound. Figueroa combines the logically consistent aspects of Popperian and process epistemologies in his alpha-beta method to address the widespread problem of too-general empirical research methods used in economics. He argues that scientific rules can be applied to economics to make sense of society, but that they must address the complexity of reality as well as the simplicity of the abstract on which the hard sciences can rely. Furthermore, because the alpha-beta method combines approaches to address the difficulties of scientifically analyzing a complex society, it also extends to other social sciences that have historically relied on empirical methods. This groundbreaking Pivot is ideal for students and researchers dedicated to promoting the progress of scientific research in all social sciences.
Financial, Macro and Micro Econometrics Using R, Volume 42, provides state-of-the-art information on important topics in econometrics, including multivariate GARCH, stochastic frontiers, fractional responses, specification testing and model selection, exogeneity testing, causal analysis and forecasting, GMM models, asset bubbles and crises, corporate investments, classification, forecasting, nonstandard problems, cointegration, financial market jumps and co-jumps, among other topics.
This is an essential how-to guide on the application of structural equation modeling (SEM) techniques with the AMOS software, focusing on the practical applications of both simple and advanced topics. Written in an easy-to-understand conversational style, the book covers everything from data collection and screening to confirmatory factor analysis, structural model analysis, mediation, moderation, and more advanced topics such as mixture modeling, censored data, and non-recursive models. Through step-by-step instructions, screenshots, and suggested guidelines for reporting, Collier cuts through abstract definitional perspectives to give insight into how to actually run the analysis. Unlike other SEM books, the examples used will often start in SPSS and then transition to AMOS so that the reader can have full confidence in running the analysis from beginning to end. Best practices are also included on topics like how to determine if your SEM model is formative or reflective, making it not just an explanation of SEM topics, but a guide for researchers on how to develop a strong methodology while studying their respective phenomenon of interest. With a focus on practical applications of both basic and advanced topics, and with detailed work-through examples throughout, this book is ideal for experienced researchers and beginners across the behavioral and social sciences.
This textbook discusses central statistical concepts and their use in business and economics. To endure the hardship of abstract statistical thinking, business and economics students need to see interesting applications at an early stage. Accordingly, the book predominantly focuses on exercises, several of which draw on simple applications of non-linear theory. The main body presents central ideas in a simple, straightforward manner; the exposition is concise, without sacrificing rigor. The book bridges the gap between theory and applications, with most exercises formulated in an economic context. Its simplicity of style makes the book suitable for students at any level, and every chapter starts out with simple problems. Several exercises, however, are more challenging, as they are devoted to the discussion of non-trivial economic problems where statistics plays a central part.
This book covers recent advances in efficiency evaluations, most notably Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. It introduces the underlying theories, shows how to make the relevant calculations, and discusses applications. The aim is to make the reader aware of the pros and cons of the different methods and to show how to use these methods in both standard and non-standard cases. Several software packages have been developed to solve some of the most common DEA and SFA models. This book relies on R, a free, open source software environment for statistical computing and graphics. This enables the reader to solve not only standard problems, but also many other problem variants. Using R, one can focus on understanding the context and developing a good model; one is not restricted to predefined model variants and a one-size-fits-all approach. To facilitate the use of R, the authors have developed an R package called Benchmarking, which implements the main methods within both DEA and SFA. The book uses mathematical formulations of models and assumptions, but it de-emphasizes formal proofs, in part by placing them in appendices and in part by referring to the original sources. Moreover, the book emphasizes the usage of the theories and the interpretations of the mathematical formulations. It includes a series of small examples, graphical illustrations, simple extensions and questions to think about. It also combines the formal models with less formal economic and organizational thinking. Last but not least, it discusses some larger applications with significant practical impacts, including the design of benchmarking-based regulations of energy companies in different European countries, and the development of merger control programs for competition authorities.
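The book itself works in R with the authors' Benchmarking package. Purely as a language-neutral sketch of the core calculation that DEA software automates, the following code computes input-oriented, constant-returns-to-scale (CCR) efficiency scores by solving one linear program per firm; the data are made up and the function name is illustrative.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up data: rows are firms (DMUs), columns are inputs / outputs.
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [6.0, 4.0]])   # inputs
Y = np.array([[1.0], [2.0], [2.5], [3.0]])                       # outputs

def ccr_input_efficiency(X, Y):
    """Input-oriented CRS DEA: min theta s.t. sum_j lam_j*x_j <= theta*x_o and sum_j lam_j*y_j >= y_o."""
    n, m = X.shape          # number of firms, number of inputs
    s = Y.shape[1]          # number of outputs
    scores = []
    for o in range(n):
        # Decision variables: [theta, lam_1, ..., lam_n]
        c = np.r_[1.0, np.zeros(n)]
        # Input constraints:  -theta*x_io + sum_j lam_j*x_ij <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Output constraints: -sum_j lam_j*y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)

print(ccr_input_efficiency(X, Y))   # a score of 1.0 marks a firm on the efficient frontier
```

Packages such as Benchmarking wrap this kind of linear program and add the many DEA variants (variable returns to scale, output orientation, super-efficiency) along with SFA estimation.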
Complex-Valued Modeling in Economics and Finance outlines the theory, methodology, and techniques behind modeling economic processes using complex variables theory. The theory of functions of complex variables is widely used in many scientific fields, since work with complex variables can appropriately describe different complex real-life processes. Many economic indicators and factors reflecting the properties of the same object can be represented in the form of complex variables. By describing the relationship between various indicators using the functions of these variables, new economic and financial models can be created which are often more accurate than models of real variables. This book pays critical attention to complex-variable production functions, stock market modeling, modeling the illegal economy, time series forecasting, complex autoregressive models, and economic dynamics modeling. Very little has been published on this topic and its applications within the fields of economics and finance, and this volume appeals to graduate-level students studying economics, academic researchers in economics and finance, and economists.
This book brings together presentations of some of the fundamental new research that has begun to appear in the areas of dynamic structural modeling, nonlinear structural modeling, time series modeling, nonparametric inference, and chaotic attractor inference. The contents of this volume comprise the proceedings of the third of a conference series entitled International Symposia in Economic Theory and Econometrics. This conference was held at the IC² (Innovation, Creativity and Capital) Institute at the University of Texas at Austin on May 22-23, 1986.
This volume is dedicated to two recent intensive areas of research in the econometrics of panel data, namely nonstationary panels and dynamic panels. It includes a comprehensive survey of the nonstationary panel literature, including panel unit root tests, spurious panel regressions, and panel cointegration.
This book brings together studies using different data types (panel data, cross-sectional data, and time series data) and different methods (for example, panel regression, nonlinear time series, chaos approaches, deep learning, and machine learning techniques, among others), and aims to create a source for those interested in these topics and methods by addressing selected applied econometrics topics developed in recent years. It offers a common meeting ground for scientists who teach econometrics in Turkey and helps bring the authors' knowledge to interested readers. The book can also serve as source material for postgraduate "Applied Economics and Econometrics" courses.
Over the last decade, dynamical systems theory and related nonlinear methods have had a major impact on the analysis of time series data from complex systems. Recent developments in mathematical methods of state-space reconstruction, time-delay embedding, and surrogate data analysis, coupled with readily accessible and powerful computational facilities used in gathering and processing massive quantities of high-frequency data, have provided theorists and practitioners unparalleled opportunities for exploratory data analysis, modelling, forecasting, and control.
An understanding of the behaviour of financial assets and the evolution of economies has never been as important as it is today. This book looks at these complex systems from the perspective of the physicist. So-called 'econophysics' and its application to finance has made great strides in recent years. Less emphasis has been placed on the broader subject of macroeconomics, and many economics students are still taught traditional neo-classical economics. The reader is given a general primer in statistical physics, probability theory, and the use of correlation functions. Much of the mathematics that is developed is frequently no longer included in undergraduate physics courses. The statistical physics of Boltzmann and Gibbs is one of the oldest disciplines within physics, and it can be argued that it was first applied to ensembles of molecules as opposed to social agents only by way of historical accident. The authors argue by analogy that the theory can be applied directly to economic systems comprising assemblies of interacting agents. The necessary tools and mathematics are developed in a clear and concise manner. The body of work, now termed econophysics, is then developed. The authors show where traditional methods break down and show how the probability distributions and correlation functions can be properly understood using high frequency data. Recent work by the physics community on risk and market crashes is discussed together with new work on betting markets as well as studies of speculative peaks that occur in housing markets. The second half of the book continues the empirical approach, showing how, by analogy with thermodynamics, a self-consistent attack can be made on macroeconomics. This leads naturally to economic production functions being equated to entropy functions - a new concept for economists. Issues relating to non-equilibrium naturally arise during the development and application of this approach to economics. These are discussed in the context of superstatistics and adiabatic processes. As a result, it does seem ultimately possible to reconcile the approach with non-equilibrium systems, and the ideas are applied to study income and wealth distributions, which with their power law distribution functions have puzzled many researchers ever since Pareto discovered them over 100 years ago. This book takes a pedagogical approach to these topics and is aimed at final year undergraduate and beginning graduate or post-graduate students in physics, economics, and business. However, the experienced researcher and quant should also find much of interest.
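The Pareto (power-law) tails of income and wealth distributions mentioned above are easy to illustrate numerically. The sketch below, a generic example rather than anything from the book, simulates Pareto-distributed incomes and recovers the tail exponent with the Hill estimator; the true exponent, sample size, and tail fraction are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true, x_min, n = 1.5, 1.0, 100_000

# Pareto-distributed "incomes": P(X > x) = (x_min / x)**alpha for x >= x_min,
# drawn by inverting the survival function.
incomes = x_min * (1.0 - rng.random(n)) ** (-1.0 / alpha_true)

# Hill estimator of the tail exponent based on the k largest observations.
k = 1_000
xs = np.sort(incomes)
threshold = xs[-(k + 1)]                       # (k+1)-th largest observation
alpha_hill = k / np.sum(np.log(xs[-k:] / threshold))

print(f"true tail exponent {alpha_true}, Hill estimate {alpha_hill:.3f}")
```

The estimate stabilizes only for a sensible choice of the tail fraction k/n, which is exactly the kind of practical issue that empirical studies of wealth distributions have to confront.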
Quants, physicists working on Wall Street as quantitative analysts, have been widely blamed for triggering financial crises with their complex mathematical models. Their formulas were meant to allow Wall Street to prosper without risk. But in this penetrating insider's look at the recent economic collapse, Emanuel Derman--former head quant at Goldman Sachs--explains the collision between mathematical modeling and economics and what makes financial models so dangerous. Though such models imitate the style of physics and employ the language of mathematics, theories in physics aim for a description of reality--but in finance, models can shoot only for a very limited approximation of reality. Derman uses his firsthand experience in financial theory and practice to explain the complicated tangles that have paralyzed the economy. "Models.Behaving.Badly." exposes Wall Street's love affair with models, and shows us why nobody will ever be able to write a model that can encapsulate human behavior.
The Handbook of U.S. Labor Statistics is recognized as an authoritative resource on the U.S. labor force. It continues and enhances the Bureau of Labor Statistics' (BLS) discontinued publication, Labor Statistics. It allows the user to understand recent developments as well as to compare today's economy with past history. This edition includes new tables on occupational safety and health and on income in the United States. The Handbook is a comprehensive reference providing an abundance of data on a variety of topics, including: employment and unemployment; earnings; prices; productivity; consumer expenditures; occupational safety and health; union membership; the working poor; and much more. In addition to over 215 tables that present practical data, the Handbook provides: introductory material for each chapter that contains highlights of salient data and figures that call attention to noteworthy trends in the data; notes and definitions, which contain concise descriptions of the data sources, concepts, definitions, and methodology from which the data are derived; and references to more comprehensive reports which provide additional data and more extensive descriptions of estimation methods, sampling, and reliability measures.
The first part of this book discusses institutions and mechanisms of algorithmic trading, market microstructure, high-frequency data and stylized facts, time and event aggregation, order book dynamics, trading strategies and algorithms, transaction costs, market impact and execution strategies, and risk analysis and management. The second part covers market impact models, network models, multi-asset trading, machine learning techniques, and nonlinear filtering. The third part discusses electronic market making, liquidity, systemic risk, and recent developments and debates on the subject.
Since the financial crisis, the issue of the 'one percent' has become the centre of intense public debate, unavoidable even for members of the elite themselves. Moreover, inquiring into elites has taken centre-stage once again in both journalistic investigations and academic research. New Directions in Elite Studies attempts to move the social scientific study of elites beyond economic analysis, which has greatly improved our knowledge of inequality, but is restricted to income and wealth. In contrast, this book mobilizes a broad scope of research methods to uncover the social composition of the power elite - the 'field of power'. It reconstructs processes through which people gain access to positions in this particular social space, examines the various forms of capital they mobilize in the process - economic, but also cultural and social capital - and probes changes over time and variations across national contexts. Bringing together the most advanced research into elites by a European and multidisciplinary group of scholars, this book presents an agenda for the future study of elites. It will appeal to all those interested in the study of elites, inequality, class, power, and gender inequality.
The book aims at perfecting the national governance system and improving national governance ability. It evaluates the balance sheets of the state and residents, non-financial corporations, financial institutions and the central bank, the central government, local government and external sectors - the goal being to provide a systematic analysis of the characteristics and trajectory of China's economic expansion and structural adjustment, as well as objective assessments of short and long-term economic operations, debt risks and financial risks with regard to the institutional and structural characteristics of economic development in market-oriented reform. It puts forward a preliminary analysis of China's national and sectoral balance sheets on the basis of scientific estimates of various kinds of data, analyzes from a new perspective the major issues that are currently troubling China - development sustainability, government transformation, local government debt, welfare reform, and the financial opening-up and stability - and explores corresponding policies, measures, and institutional arrangements.
Microsimulation models provide an exciting new tool for analysing the distributional impact and cost of government policy changes. They can also be used to analyse the current or future structure of society. This volume contains papers describing new developments at the frontiers of microsimulation modelling, and draws upon experiences in a wide range of countries. Some papers aim to share with other modellers experience gained in designing and running microsimulation models and using them in government policy formulation. They also examine issues at the frontiers of the discipline, such as how to include usage of health, education and welfare services in models. Other chapters focus upon describing the innovative new approaches being taken in dynamic microsimulation modelling. They describe some of the policy applications for which dynamic models are being used in Europe, Australia and New Zealand. Topics covered include retirement income modelling, pension reform, the behavioural impact of tax changes, child care demand, and the inclusion of government services within models. Attention is also given to validating the results of models and estimating their statistical reliability.
First published in 1992, The Efficiency of New Issue Markets provides a comprehensive overview of under-pricing and, through this, assesses the efficiency of new issue markets. The book provides a further theoretical development of the adverse selection model of the new issue market and addresses the hypothesis that the method of distribution of new issues has an important bearing on the efficiency of these markets. In doing this, the book tests the efficiency of the Offer for Sale new issue market, which demonstrates the validity of the adverse selection model and contradicts the monopsony power hypothesis. It also examines the relative efficiency of the new issue markets, demonstrating the importance of distribution in determining relative efficiency.
This book explores the possibility of using social media data for detecting socio-economic recovery activities. In the last decade, there have been intensive research activities focusing on social media during and after disasters. This approach, which views people's communication on social media as a sensor for real-time situations, has been widely adopted as the "people as sensors" approach. Furthermore, to improve recovery efforts after large-scale disasters, detecting communities' real-time recovery situations is essential, since conventional socio-economic recovery indicators, such as governmental statistics, are not published in real time. Thanks to its timeliness, using social media data can fill the gap. Motivated by this possibility, this book especially focuses on the relationships between people's communication on Twitter and Facebook pages, and socio-economic recovery activities as reflected in the used-car market data and the housing market data in the case of two major disasters: the Great East Japan Earthquake and Tsunami of 2011 and Hurricane Sandy in 2012. The book pursues an interdisciplinary approach, combining, for example, disaster recovery studies, crisis informatics, and economics. In terms of its contributions, firstly, the book sheds light on the "people as sensors" approach for detecting socio-economic recovery activities, which has not been thoroughly studied to date but has the potential to improve situation awareness during the recovery phase. Secondly, the book proposes new socio-economic recovery indicators: used-car market data and housing market data. Thirdly, in the context of using social media during the recovery phase, the results demonstrate the importance of distinguishing between social media data posted by people at or near disaster-stricken areas and data posted by those who are farther away.
Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Ito integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addition, the author discusses net premiums for various insurance policies. Mathematical statements are carefully and clearly formulated and proved while avoiding excessive technicalities as much as possible. The book illustrates how these statements help solve numerous statistical problems. It also includes more than 70 exercises.
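As a small worked example of the net-premium calculations mentioned in the description (with made-up mortality rates and interest rate, not figures from the book), the following sketch computes the net single premium of a whole life insurance paying a unit benefit at the end of the year of death.

```python
import numpy as np

# Made-up one-year death probabilities q_x for ages x, x+1, ... (illustrative only;
# the final 1.0 closes the table so the probabilities of death sum to one).
q = np.array([0.01, 0.012, 0.015, 0.019, 0.024, 0.030, 0.038, 0.048, 0.060, 1.0])
i = 0.03                       # annual interest rate
v = 1.0 / (1.0 + i)            # one-year discount factor

# Probability of surviving k years and then dying in year k+1: (prod of p's) * q_{x+k}
p = 1.0 - q
survival = np.concatenate(([1.0], np.cumprod(p[:-1])))
deferred_death = survival * q

# Net single premium for a unit benefit paid at the end of the year of death:
# A_x = sum_k v**(k+1) * kp_x * q_{x+k}
A_x = np.sum(v ** (np.arange(len(q)) + 1) * deferred_death)
print(f"net single premium A_x = {A_x:.4f}")
```

The same building blocks (survival probabilities and discounting) underlie the premiums for the other policy types the book discusses, such as term insurance and annuities.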
This book provides the tools and concepts necessary to study the behavior of econometric estimators and test statistics in large samples. An econometric estimator is a solution to an optimization problem; that is, a problem that requires a body of techniques to determine a specific solution in a defined set of possible alternatives that best satisfies a selected objective function or set of constraints. Thus, this highly mathematical book investigates situations concerning large numbers, in which the assumptions of the classical linear model fail. Economists, of course, face these situations often.
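As a concrete, entirely generic illustration of the large-sample behavior such a book studies, the sketch below shows the OLS slope estimator, the solution of a least-squares minimization problem, concentrating around the true parameter as the sample size grows; the design, error distribution, and replication count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
true_beta = 2.0

for n in (50, 500, 5000, 50000):
    estimates = []
    for _ in range(200):
        x = rng.normal(size=n)
        y = true_beta * x + rng.standard_t(df=5, size=n)   # non-normal errors
        # OLS slope: the argmin of sum (y - b*x)**2, available in closed form
        estimates.append((x @ y) / (x @ x))
    est = np.array(estimates)
    print(f"n={n:6d}  mean={est.mean():.4f}  sd={est.std():.4f}")
```

The shrinking spread of the estimates illustrates consistency, and the roughly root-n rate of that shrinkage is the kind of asymptotic property the book develops formally.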
You may like...
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
Operations And Supply Chain Management by David Collier, James Evans (Hardcover)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Operations and Supply Chain Management by James Evans, David Collier (Hardcover)