Although interest in spatial regression models has surged in recent years, a comprehensive, up-to-date text on these approaches does not exist. Filling this void, Introduction to Spatial Econometrics presents a variety of regression methods used to analyze spatial data samples that violate the traditional assumption of independence between observations. It explores a wide range of alternative topics, including maximum likelihood and Bayesian estimation, various types of spatial regression specifications, and applied modeling situations involving different circumstances. Leaders in this field, the authors clarify the often-mystifying phenomenon of simultaneous spatial dependence. By presenting new methods, they help with the interpretation of spatial regression models, especially ones that include spatial lags of the dependent variable. The authors also examine the relationship between spatiotemporal processes and long-run equilibrium states that are characterized by simultaneous spatial dependence. MATLAB® toolboxes useful for spatial econometric estimation are available on the authors' websites. This work covers spatial econometric modeling as well as numerous applied illustrations of the methods. It encompasses many recent advances in spatial econometric models, including some previously unpublished results.
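The "spatial lag of the dependent variable" the blurb refers to is easiest to see in miniature. Below is a minimal Python sketch (not the authors' MATLAB toolboxes) of a first-order spatial autoregressive model, assuming an illustrative row-standardized neighbour matrix; all names and data are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Row-standardized weight matrix W (illustrative): each observation's
# neighbours are the two adjacent indices on a line.
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)

# Spatial autoregressive (SAR) model: y = rho*W@y + X@beta + eps.
# Solving for y gives the reduced form y = (I - rho*W)^{-1} (X@beta + eps),
# which is why a shock to one observation spills over to its neighbours:
# this is the simultaneous spatial dependence the book interprets.
rho, beta = 0.6, np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = rng.normal(scale=0.5, size=n)
A = np.eye(n) - rho * W
y = np.linalg.solve(A, X @ beta + eps)

# Naive OLS that ignores the spatial lag is biased for beta.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("OLS estimate ignoring spatial dependence:", b_ols)
```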
World Statistics on Mining and Utilities 2018 provides a unique biennial overview of the role of mining and utility activities in the world economy. This extensive resource from UNIDO provides detailed time series data on the level, structure and growth of international mining and utility activities by country and sector. Country-level data are clearly presented on the number of establishments, employment and output of activities such as: coal, iron ore and crude petroleum mining as well as production and supply of electricity, natural gas and water. This unique and comprehensive source of information meets the growing demand of data users who require detailed and reliable statistical information on the primary industry and energy producing sectors. The publication provides internationally comparable data to economic researchers, development strategists and business communities who influence the policy of industrial development and its environmental sustainability.
Covering a broad range of topics, this text provides a comprehensive survey of the modeling of chaotic dynamics and complexity in the natural and social sciences. Its attention to models in both the physical and social sciences and the detailed philosophical approach make this a unique text in the midst of many current books on chaos and complexity. Including an extensive index and bibliography along with numerous examples and simplified models, this is an ideal course text.
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses. This handbook is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First-edition contributors have returned to update their work, and the second edition includes updated versions of selected first-edition chapters. New chapters have been added on: approaches that need no a priori choice of the weights (called multipliers) that reflect meaningful trade-offs; the construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures; network DEA that can be used for measuring supply chain operations; and a selection of DEA applications in the service sector, with a focus on building a conceptual framework, research design, and interpreting results.
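For readers new to DEA, here is a hedged Python sketch of the basic input-oriented CCR model that the handbook's opening chapters review, solved as a linear program with SciPy; the toy inputs, outputs, and function name are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data (illustrative): 5 DMUs, 2 inputs, 1 output.
X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0], [5.0, 2.0], [6.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.5], [1.2], [1.4]])

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o: minimize theta subject to
    sum_j lam_j * x_j <= theta * x_o and sum_j lam_j * y_j >= y_o, lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                     # objective: minimize theta
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])   # X.T@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # -Y.T@lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

An efficiency of 1 marks a DMU on the frontier; values below 1 give the proportional input contraction the frontier would allow.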
Originally published in 1984. This book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics toward re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular, the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. The approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
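The first of the two approaches, regressing spot rates on lagged forward rates, can be sketched briefly; this Python example uses simulated series rather than the Canadian, UK, Dutch, Swiss, or German data analysed in the book, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500

# Simulated (log) spot rate as a random walk. Under the joint hypothesis of
# market efficiency and the equilibrium model, the lagged forward rate f_t
# is an unbiased predictor of the next spot rate:
# s_{t+1} = alpha + beta*f_t + eps, with alpha = 0 and beta = 1.
s = np.cumsum(rng.normal(scale=0.01, size=T + 1))
f = s[:-1] + rng.normal(scale=0.002, size=T)   # forward rate: spot plus noise
s_next = s[1:]

# OLS of s_{t+1} on a constant and f_t, with conventional standard errors.
Z = np.column_stack([np.ones(T), f])
coef, *_ = np.linalg.lstsq(Z, s_next, rcond=None)
resid = s_next - Z @ coef
sigma2 = resid @ resid / (T - 2)
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Z.T @ Z)))
print(f"alpha = {coef[0]:+.4f} (se {se[0]:.4f}), "
      f"beta = {coef[1]:.4f} (se {se[1]:.4f})")
```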
Originally published in 1979. This book addresses three questions regarding uncertainty in economic life: how we can define uncertainty and use the concept meaningfully to draw conclusions; how the level of uncertainty associated with a particular variable of economic interest can be measured; and whether experience provides any support for the view that uncertainty really matters. It develops a theory of the effect of price uncertainty on production and trade, takes a graphical approach to the effects of a mean-preserving spread to create rules for ordering distributions, and finishes with an econometric analysis of the effects of Brazil's adoption of a crawling peg in reducing real exchange rate uncertainty. This is an important early study into the significance of uncertainty.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small-sample correction, volatility, and the impact of different orders of integration. Models with expectations are considered, along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman filter, and structural time series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
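As a minimal illustration of the notion of cointegration around which the book is organized, here is a hedged Engle-Granger-style two-step sketch in Python on simulated data; the book's multivariate treatment goes well beyond this, and the data here are invented.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
T = 400

# Step 0: simulate a cointegrated pair. x is I(1); y shares its stochastic
# trend, so y - 2x is stationary even though y and x individually are not.
x = np.cumsum(rng.normal(size=T))
y = 2.0 * x + rng.normal(size=T)

# Step 1: estimate the long-run (cointegrating) regression y = a + b*x + u.
Z = np.column_stack([np.ones(T), x])
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
u = y - Z @ coef

# Step 2: unit-root test on the residuals. A rejection suggests
# cointegration. (The proper critical values differ from the standard
# ADF ones, so this is only a sketch of the idea.)
stat, pvalue, *_ = adfuller(u)
print(f"cointegrating slope = {coef[1]:.3f}, "
      f"ADF stat on residuals = {stat:.2f}, p = {pvalue:.4f}")
```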
Environmental risk directly affects the financial stability of banks, since banks bear the financial consequences of the loss of liquidity of the entities to which they lend, of the financial penalties imposed for failure to comply with regulations, and of actions taken that are harmful to the natural environment. This book explores the impact of environmental risk on the banking sector and analyzes strategies to mitigate this risk, with a special emphasis on the role of modelling. It argues that environmental risk modelling allows banks to estimate the patterns and consequences of environmental risk on their operations, and to take measures within the context of asset and liability management to minimize the likelihood of losses. An important role here is played by the environmental risk modelling methodology, as well as by the software and the mathematical and econometric models used. The book examines banks' responses to macroprudential risk, particularly from the point of view of their adaptation strategies; the mechanisms of the risk's spread; risk management and modelling; and sustainable business models. It introduces the basic concepts, definitions, and regulations concerning this type of risk, within the context of its influence on the banking industry. The book is primarily based on a quantitative and qualitative approach and proposes the delivery of a new methodology of environmental risk management and modelling in the banking sector. As such, it will appeal to researchers, scholars, and students of environmental economics, finance and banking, sociology, law, and political sciences.
Advanced Statistics for Kinesiology and Exercise Science is the first textbook to cover advanced statistical methods in the context of the study of human performance. Divided into three distinct sections, the book introduces and explores in depth both analysis of variance (ANOVA) and regression analyses, including chapters on: preparing data for analysis; one-way, factorial, and repeated-measures ANOVA; analysis of covariance and multiple analyses of variance and covariance; diagnostic tests; regression models for quantitative and qualitative data; model selection and validation; and logistic regression. Drawing clear lines between the use of IBM SPSS Statistics software and the interpretation and analysis of results, and illustrated throughout with sport and exercise science-specific sample data and results sections, the book offers an unparalleled level of detail in explaining advanced statistical techniques to kinesiology students. Advanced Statistics for Kinesiology and Exercise Science is an essential text for any student studying advanced statistics or research methods as part of an undergraduate or postgraduate degree programme in kinesiology, sport and exercise science, or health science.
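As a taste of the book's starting point, here is a minimal one-way ANOVA in Python with SciPy rather than the IBM SPSS Statistics software the book uses; the sprint-time data and group labels are invented for illustration.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)

# Illustrative data: 100 m sprint times (seconds) for three training groups.
group_a = rng.normal(12.0, 0.5, size=20)
group_b = rng.normal(11.6, 0.5, size=20)
group_c = rng.normal(11.2, 0.5, size=20)

# One-way ANOVA: does mean performance differ across the three groups?
F, p = f_oneway(group_a, group_b, group_c)
print(f"F = {F:.2f}, p = {p:.4f}")
# A small p-value indicates that at least one group mean differs; post hoc
# pairwise comparisons would then identify which groups differ.
```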
Factor Analysis and Dimension Reduction in R provides coverage, with worked examples, of a large number of dimension reduction procedures, along with model performance metrics to compare them. Factor analysis in the form of principal components analysis (PCA) or principal factor analysis (PFA) is familiar to most social scientists. However, what is less familiar is understanding that factor analysis is a subset of the more general statistical family of dimension reduction methods. The social scientist's toolkit for factor analysis problems can be expanded to include the range of solutions this book presents. In addition to covering FA and PCA with orthogonal and oblique rotation, this book's coverage includes higher-order factor models, bifactor models, models based on binary and ordinal data, models based on mixed data, generalized low-rank models, cluster analysis with GLRM, models involving supplemental variables or observations, Bayesian factor analysis, regularized factor analysis, testing for unidimensionality, and prediction with factor scores. The second half of the book deals with other procedures for dimension reduction. These include coverage of kernel PCA, factor analysis with multidimensional scaling, locally linear embedding models, Laplacian eigenmaps, diffusion maps, force-directed methods, t-distributed stochastic neighbor embedding, independent component analysis (ICA), dimensionality reduction via regression (DRR), non-negative matrix factorization (NNMF), Isomap, autoencoders, uniform manifold approximation and projection (UMAP) models, neural network models, and longitudinal factor analysis models. In addition, a special chapter covers metrics for comparing model performance. Features of this book include: numerous worked examples with replicable R code; explicit, comprehensive coverage of data assumptions; adaptation of factor methods to binary, ordinal, and categorical data; residual and outlier analysis; visualization of factor results; and final chapters that treat the integration of factor analysis with neural network and time series methods. Presented in color, with R code and an introduction to R and RStudio, the book will be suitable for graduate-level courses and optional modules for social scientists, as well as for courses on quantitative methods and multivariate statistics.
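Although the book works in R, the core PCA step it builds on can be sketched in a few lines of Python; the latent-factor data below are simulated purely for illustration, and all variable names are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative data: 200 observations of 6 correlated indicators driven by
# 2 latent factors, the classic setting for PCA/factor analysis.
n, p, k = 200, 6, 2
loadings = rng.normal(size=(p, k))
scores = rng.normal(size=(n, k))
data = scores @ loadings.T + 0.3 * rng.normal(size=(n, p))

# PCA via SVD of the centred data: rows of Vt are component directions,
# and the squared singular values give the variance each one explains.
centred = data - data.mean(axis=0)
U, svals, Vt = np.linalg.svd(centred, full_matrices=False)
explained = svals**2 / np.sum(svals**2)
print("proportion of variance explained:", np.round(explained, 3))

# Scores on the first k components (the reduced-dimension representation).
pc_scores = centred @ Vt.T[:, :k]
```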
Introduction to Financial Mathematics: Option Valuation, Second Edition is a well-rounded primer on the mathematics and models used in the valuation of financial derivatives. The book consists of fifteen chapters, the first ten developing option valuation techniques in discrete time and the last five describing the theory in continuous time. The first half of the textbook develops basic finance and probability. The author then treats the binomial model as the primary example of discrete-time option valuation. The final part of the textbook examines the Black-Scholes model. The book is written to provide a straightforward account of the principles of option pricing and examines these principles in detail using standard discrete and stochastic calculus models. Additionally, the second edition has new exercises and examples, and includes many tables and graphs generated by over 30 MS Excel VBA modules available on the author's webpage https://home.gwu.edu/~hdj/.
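A minimal sketch of the binomial (Cox-Ross-Rubinstein) valuation that the discrete-time chapters develop, written in Python rather than the author's Excel VBA modules; the function name and parameter values are illustrative.

```python
import math

def binomial_call(S0, K, r, sigma, T, n):
    """European call price in an n-step Cox-Ross-Rubinstein binomial tree.
    As n grows, the price converges to the Black-Scholes value."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))    # up factor per step
    d = 1 / u                              # down factor per step
    q = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)               # one-step discount factor
    # Terminal payoffs (j = number of up moves), then backward induction
    # under the risk-neutral measure.
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for step in range(n, 0, -1):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(step)]
    return values[0]

# With these inputs the Black-Scholes value is about 10.45.
print(binomial_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500))
```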
In order to make informed decisions, there are three important elements: intuition, trust, and analytics. Intuition is based on experiential learning and recent research has shown that those who rely on their "gut feelings" may do better than those who don't. Analytics, however, are important in a data-driven environment to also inform decision making. The third element, trust, is critical for knowledge sharing to take place. These three elements-intuition, analytics, and trust-make a perfect combination for decision making. This book gathers leading researchers who explore the role of these three elements in the process of decision-making.
Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of stochastic processes with continuous and discontinuous paths. It also covers a wide selection of popular models in finance and insurance, from Black-Scholes and stochastic volatility models to interest rate and dynamic mortality models. Through its many numerical and graphical illustrations and simple, insightful examples, this book provides a deep understanding of the scope of Monte Carlo methods and their use in various financial situations. The intuitive presentation encourages readers to implement and further develop the simulation methods.
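The plain Monte Carlo estimator that underlies the more advanced schemes named above can be sketched briefly. This Python example prices a European call under Black-Scholes dynamics and compares the estimate with the closed-form value; it is a toy baseline with illustrative parameters, not one of the book's multilevel or Romberg methods.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Monte Carlo price of a European call under Black-Scholes dynamics:
# S_T = S0 * exp((r - sigma^2/2)*T + sigma*sqrt(T)*Z), Z ~ N(0, 1).
S0, K, r, sigma, T, n = 100.0, 100.0, 0.05, 0.2, 1.0, 200_000
Z = rng.standard_normal(n)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
mc_price = payoff.mean()
mc_se = payoff.std(ddof=1) / np.sqrt(n)   # Monte Carlo standard error

# Closed-form Black-Scholes value for comparison.
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(f"Monte Carlo: {mc_price:.3f} +/- {1.96 * mc_se:.3f}, "
      f"Black-Scholes: {bs_price:.3f}")
```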
There is no shortage of incentives to study and reduce poverty in our societies. Poverty is studied in economics and political sciences, and population surveys are an important source of information about it. The design and analysis of such surveys is principally a statistical subject matter, and the computer is essential for their data compilation and processing. Focusing on the European Union Statistics on Income and Living Conditions (EU-SILC), a program of annual national surveys which collect data related to poverty and social exclusion, Statistical Studies of Income, Poverty and Inequality in Europe: Computing and Graphics in R presents a set of statistical analyses pertinent to the general goals of EU-SILC. The contents of the volume are biased toward computing and statistics, with reduced attention to economics, political science, and other social sciences. The emphasis is on methods and procedures as opposed to results, because data from the annual surveys released since publication, and those to come, will erode the novelty of the particular data used and of the results derived in this volume. The aim of the volume is not to propose specific methods of analysis, but to open up the analytical agenda and address the aspects of the key definitions in the subject of poverty assessment that entail nontrivial elements of arbitrariness. The presented methods do not exhaust the range of analyses suitable for EU-SILC, but will stimulate the search for new methods and the adaptation of established methods that cater to the identified purposes.
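One key definition with a nontrivial element of arbitrariness is the at-risk-of-poverty threshold, conventionally set at 60% of median equivalised income in EU-SILC work. The hedged Python sketch below (the book itself works in R, on real survey data) shows how sensitive the headline rate is to that cutoff; the income sample here is simulated.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative income sample standing in for survey microdata.
income = rng.lognormal(mean=10.0, sigma=0.6, size=5000)

# EU convention: at-risk-of-poverty threshold = 60% of the median
# equivalised disposable income. The 60% figure is exactly the kind of
# arbitrary element in the definitions that the volume scrutinizes, so
# we report the rate at neighbouring cutoffs as well.
for share in (0.5, 0.6, 0.7):
    threshold = share * np.median(income)
    rate = np.mean(income < threshold)
    print(f"threshold = {share:.0%} of median -> poverty rate {rate:.1%}")
```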
Global econometric models have a long history. From the early 1970s to the present, as modeling techniques have advanced, different modeling paradigms have emerged and been used to support national and international policy making. One purpose of this volume, which is based on a conference in recognition of the seminal impact of Lawrence R. Klein, Nobel Prize winner in Economic Sciences, whose pioneering work has spawned the field of international econometric modeling, is to survey these developments from today's perspective. A second objective of the volume is to shed light on the wide range of attempts to broaden the scope of modeling on an international scale. Beyond new developments in the traditional areas of trade and financial flows, the volume reviews new approaches to the modeling of linkages between macroeconomic activity and individual economic units, new research on the analysis of trends in income distribution and economic wellbeing on a global scale, and innovative ideas about modeling the interactions between economic development and the environment. With the expansion of elaborated economic linkages, this volume makes an important contribution to the evolving literature on global econometric models.
The main purpose of this book is to resolve deficiencies and limitations that currently exist when using Technical Analysis (TA). In particular, TA is used either by academics as an "economic test" of the weak-form Efficient Market Hypothesis (EMH) or by practitioners as a main or supplementary tool for deriving trading signals. This book approaches TA in a systematic way, utilizing all the available estimation theory and tests. This is achieved through the development of novel rule-based pattern recognizers and the implementation of statistical tests for assessing the significance of realized returns. Particular emphasis is given to technical patterns where subjectivity in the identification process is apparent. Our proposed methodology is based on algorithmic, and thus unbiased, pattern recognition. The unified methodological framework presented in this book can serve as a benchmark both for future academic studies that test the null hypothesis of the weak-form EMH and for practitioners who want to embed TA within their trading/investment decision-making processes.
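To give a flavour of rule-based trading signals, the hedged Python sketch below implements a simple moving-average crossover on a simulated price series; this particular rule and the data are illustrative, not the book's pattern recognizers, and the raw return comparison it prints is exactly what the book's statistical tests would then assess formally.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated price path (geometric random walk) standing in for market data.
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=1000)))

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="valid")

short, long_ = 20, 50
ma_s = moving_average(prices, short)[long_ - short:]  # align with long MA
ma_l = moving_average(prices, long_)
signal = np.where(ma_s > ma_l, 1.0, 0.0)  # long when short MA is above long MA

# The signal known at day t is applied to the return from t to t+1,
# so the rule uses no look-ahead information.
rets = np.diff(np.log(prices))
strat = signal[:-1] * rets[long_ - 1:]
print(f"buy-and-hold mean daily return: {rets.mean():+.5f}")
print(f"crossover-rule mean daily return: {strat.mean():+.5f}")
```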
Modern marketing managers need intuitive and effective tools not just for designing strategies but also for general management. This hands-on book introduces a range of contemporary management and marketing tools and concepts with a focus on forecasting, creating stimulating processes, and implementation. Topics addressed range from creating a clear vision, setting goals, and developing strategies, to implementing strategic analysis tools, consumer value models, budgeting, strategic and operational marketing plans. Special attention is paid to change management and digital transformation in the marketing landscape. Given its approach and content, the book offers a valuable asset for all professionals and advanced MBA students looking for 'real-life' tools and applications.
Handbook of Computational Economics: Heterogeneous Agent Modeling, Volume Four, focuses on heterogeneous agent models, emphasizing recent advances in macroeconomics (including DSGE), finance, empirical validation and experiments, networks and related applications. Capturing the advances made since the publication of Volume Two (Tesfatsion & Judd, 2006), it provides high-level literature with sections devoted to Macroeconomics, Finance, Empirical Validation and Experiments, Networks, and other applications, including Innovation Diffusion in Heterogeneous Populations, Market Design and Electricity Markets, and a final section on Perspectives on Heterogeneity.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
In these two volumes, a group of distinguished economists debate the way in which evidence, in particular econometric evidence, can and should be used to relate macroeconomic theories to the real world. Topics covered include the business cycle, monetary policy, economic growth, the impact of new econometric techniques, the IS-LM model, the labour market, new Keynesian macroeconomics, and the use of macroeconomics in official documents.
This volume investigates the accuracy and dynamic performance of a high-frequency forecast model for the Japanese and United States economies based on the Current Quarter Model (CQM) or High Frequency Model (HFM) developed by the late Professor Emeritus Lawrence R. Klein. It also presents a survey of recent developments in high-frequency forecasts and gives an example application of the CQM model in forecasting Gross Regional Products (GRPs).
This volume is a collection of methodological developments and applications of simulation-based methods that were presented at a workshop at Louisiana State University in November 2009. The first two papers are extensions of the GHK simulator: one reconsiders the computation of the probabilities in a discrete choice model, while the other uses an adaptive version of sparse-grids integration (SGI) instead of simulation. Two studies focus specifically on methodology: the first compares the performance of the maximum simulated likelihood (MSL) approach with a proposed composite marginal likelihood (CML) approach in multivariate ordered-response situations, while the second examines methods of testing for the presence of heterogeneity in the heterogeneity model. Further topics examined include: education savings accounts, parent contributions and education attainment; estimating the effect of exchange rate flexibility on financial account openness; estimating a fractional response model with a count endogenous regressor; and modelling and forecasting volatility in a Bayesian approach.
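Since the first two papers build on the GHK simulator, a bare-bones textbook version is sketched below in Python; it estimates a rectangle probability for a multivariate normal by sequential truncated-normal sampling. The function name, seed, and toy covariance matrix are illustrative, not taken from the volume.

```python
import numpy as np
from scipy.stats import norm

def ghk(a, b, Sigma, n_draws=20_000, seed=8):
    """GHK estimate of P(a <= X <= b) for X ~ N(0, Sigma)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Sigma)   # X = L @ eta with eta ~ N(0, I)
    d = len(a)
    weights = np.ones(n_draws)
    eta = np.zeros((n_draws, d))
    for i in range(d):
        drift = eta[:, :i] @ L[i, :i]                # effect of earlier draws
        lo = norm.cdf((a[i] - drift) / L[i, i])
        hi = norm.cdf((b[i] - drift) / L[i, i])
        weights *= hi - lo                           # P(component i in band)
        u = rng.uniform(size=n_draws)
        eta[:, i] = norm.ppf(lo + u * (hi - lo))     # truncated-normal draw
    return weights.mean()

# Bivariate normal with correlation 0.5: estimate P(|X1| <= 1, |X2| <= 1).
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
print(ghk(a=np.array([-1.0, -1.0]), b=np.array([1.0, 1.0]), Sigma=Sigma))
```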
This book presents Professor Lawrence R. Klein and his group's last quarterly econometric model of the United States economy, produced at the University of Pennsylvania. It is the last econometric model that Lawrence Klein and his disciples left after some 50 years of cumulative effort in constructing models of the US economy, carried up to around 2000. Widely known as the WEFA Econometric Model Mark 10, it is the culmination of Professor Klein's research spanning more than 70 years, and it will please not only Professor Klein's former students and colleagues but also younger students who have heard so much of Klein models yet have never seen the latest model in its complete, printed form.
Since the middle of the twentieth century, economists have invested great resources into using statistical evidence to relate macroeconomic theories to the real world, and many new econometric techniques have been employed. In these two volumes, a distinguished group of economic theorists, econometricians, and economic methodologists examine how evidence has been used and how it should be used to understand the real world. Volume 1 focuses on the contribution of econometric techniques to understanding the macroeconomic world. It covers the use of evidence to understand the business cycle, the operation of monetary policy, and economic growth. A further section offers assessments of the overall impact of recent econometric techniques such as cointegration and unit roots. Volume 2 focuses on the labour market and economic policy, with sections covering the IS-LM model, the labour market, new Keynesian macroeconomics, and the use of macroeconomics in official documents (in both the USA and the EU). These volumes will be valuable to advanced undergraduates, graduate students, and practitioners for their clear presentation of opposing perspectives on macroeconomics and how evidence should be used. The chapters are complemented by discussion sections revealing the perspectives of other contributors on the methodological issues raised.
Originally published in 1984. Since the logic underlying economic theory can only be grasped fully by a thorough understanding of the mathematics, this book will be invaluable to economists wishing to understand vast areas of important research. It provides a basic introduction to the fundamental mathematical ideas of topology and calculus, and uses these to present modern singularity theory and recent results on the generic existence of isolated price equilibria in exchange economies. |
You may like...
Pricing Decisions in the Euro Area - How… (Silvia Fabiani, Claire Loupias, …) Hardcover, R2,229 (Discovery Miles 22 290)
Design and Analysis of Time Series… (Richard McCleary, David McDowall, …) Hardcover, R3,403 (Discovery Miles 34 030)
Handbook of Experimental Game Theory (C. M. Capra, Rachel T. A. Croson, …) Hardcover, R6,583 (Discovery Miles 65 830)
Introductory Econometrics - A Modern… (Jeffrey Wooldridge) Hardcover
Handbook of Research Methods and… (Nigar Hashimzade, Michael A. Thornton) Hardcover, R8,102 (Discovery Miles 81 020)
Operations and Supply Chain Management (James Evans, David Collier) Hardcover