Design and Analysis of Time Series Experiments presents the elements of statistical time series analysis while also addressing recent developments in research design and causal modeling. A distinguishing feature of the book is its integration of the design and analysis of time series experiments. Drawing examples from criminology, economics, education, pharmacology, public policy, program evaluation, public health, and psychology, Design and Analysis of Time Series Experiments is addressed to researchers and graduate students in a wide range of behavioral, biomedical and social sciences. Readers learn not only how-to skills but also the underlying rationales for the design features and the analytical methods. ARIMA algebra, Box-Jenkins-Tiao models and model-building strategies, forecasting, and Box-Tiao impact models are developed in separate chapters. The presentation of the models and model-building assumes only exposure to an introductory statistics course, with more difficult mathematical material relegated to appendices. Separate chapters cover threats to statistical conclusion validity, internal validity, construct validity, and external validity, with an emphasis on how these threats arise in time series experiments. Design structures for controlling the threats are presented and illustrated through examples. The chapters on statistical conclusion validity and internal validity introduce Bayesian methods, counterfactual causality and synthetic control group designs. Building on the authors' earlier work, Design and Analysis of Time Series Experiments includes more recent developments in modeling and considers design issues in greater detail than any existing work. Additionally, the book appeals to those who want to conduct or interpret time series experiments, as well as to those interested in research designs for causal inference.
This book presents recent research on robustness in econometrics. Its main focus is robust data processing techniques, i.e., techniques that yield results minimally affected by outliers, and their applications to real-life economic and financial situations. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-to-day data, we often encounter outliers that do not reflect long-term economic trends, e.g., unexpected and abrupt fluctuations. It is therefore important to develop robust data processing techniques that can accommodate these fluctuations.
Scenario planning comprises the principles, methods, and techniques for looking into the future and trying to anticipate and influence what comes next. This book provides students and line managers in organizations with the means to create better scenarios and to use them to craft winning business strategies. Its purpose is to shed new light on scenarios and scenario-like thinking in organizations for managers at every level within a company. The book covers scenarios such as economic outlooks, political environments, acquisitions, downsizing, and more.
This proceedings volume presents the latest scientific research and trends in experimental economics, with particular focus on neuroeconomics. Derived from the 2016 Computational Methods in Experimental Economics (CMEE) conference held in Szczecin, Poland, this book features research and analysis of novel computational methods in neuroeconomics. Neuroeconomics is an interdisciplinary field that combines neuroscience, psychology and economics to build a comprehensive theory of decision making. At its core, neuroeconomics analyzes the decision-making process not only in terms of external conditions or psychological aspects, but also from the neuronal point of view, by examining the cerebral conditions of decision making. The application of IT enhances the possibilities for conducting such analyses. Such studies are now performed with software that supports interaction among all the participants and makes it possible to register their reactions more accurately. This book examines some of these applications and methods. Featuring contributions on both theory and application, it is of interest to researchers, students, academics and professionals interested in experimental economics, neuroeconomics and behavioral economics.
This book investigates several competing forecasting models for interest rates, financial returns, and realized volatility, addresses the usefulness of nonlinear models for hedging purposes, and proposes new computational techniques to estimate financial processes.
Various imperfections in existing market systems prevent the free market from serving as a truly efficient allocation mechanism, but optimization of economic activities provides an effective remedial measure. Cooperative optimization claims that socially optimal and individually rational solutions to decision problems involving strategic action over time exist. To ensure that cooperation will last throughout the agreement period, however, the stringent condition of subgame consistency is required. This textbook presents a study of subgame consistent economic optimization, developing game-theoretic optimization techniques to establish the foundation for an effective policy menu to tackle the suboptimal behavior that the conventional market mechanism fails to resolve.
Econometric theory, as presented in textbooks and the econometric literature generally, is a somewhat disparate collection of findings. In essence, it is a set of demonstrated results that accumulates over time, each logically based on a specific set of axioms or assumptions, yet at any given moment these results form an inevitably incomplete body of knowledge rather than a finished work. The practice of econometric theory consists of selecting from, applying, and evaluating this literature, so as to test its applicability and range. The creation, development, and use of computer software has led applied economic research into a new age. This book describes the history of econometric computation from 1950 to the present day, based upon an interactive survey involving the collaboration of the many econometricians who have designed and developed this software. It identifies each of the econometric software packages that are made available to and used by economists and econometricians worldwide.
This textbook discusses central statistical concepts and their use in business and economics. To endure the hardship of abstract statistical thinking, business and economics students need to see interesting applications at an early stage. Accordingly, the book predominantly focuses on exercises, several of which draw on simple applications of non-linear theory. The main body presents central ideas in a simple, straightforward manner; the exposition is concise, without sacrificing rigor. The book bridges the gap between theory and applications, with most exercises formulated in an economic context. Its simplicity of style makes the book suitable for students at any level, and every chapter starts out with simple problems. Several exercises, however, are more challenging, as they are devoted to the discussion of non-trivial economic problems where statistics plays a central part.
Empirical Studies In Applied Economics presents nine previously unpublished analyses in monograph form. In this work, the topics are presented so that each chapter stands on its own. The emphasis is on the applications but attention is also given to the econometric and statistical issues for advanced readers. Econometric methods include multivariate regression analysis, limited dependent variable analysis, and other maximum likelihood techniques. The empirical topics include the measurement of competition and market power in natural gas transportation markets and in the pharmaceutical market for chemotherapy drugs. Additional topics include an empirical analysis of NFL football demand, the accuracy of an econometric model for mail demand, and the allocation of police services in rural Alaska. Other chapters consider the valuation of technology patents and the determination of patent scope, duration, and reasonable royalty, and the reaction of financial markets to health scares in the fast-food industry. Finally, two chapters are devoted to the theory and testing of synergistic health effects from the combined exposure to asbestos and cigarette smoking.
This edited book contains several state-of-the-art papers devoted to the econometrics of risk. Some papers provide theoretical analysis of the corresponding mathematical, statistical, computational, and economic models. Other papers describe applications of novel risk-related econometric techniques to real-life economic situations. The book presents recently developed methods, in particular methods using non-Gaussian heavy-tailed distributions, methods using non-Gaussian copulas to properly take into account dependence between different quantities, methods taking into account imprecise ("fuzzy") expert knowledge, and many other innovative techniques. This versatile volume helps practitioners learn how to apply new techniques of the econometrics of risk, and researchers to further improve the existing models and to come up with new ideas on how to best take economic risks into account.
This book proposes new methods to value equity and model the Markowitz efficient frontier using Markov switching models, and provides new evidence and solutions to capture the persistence observed in stock returns across developed and emerging markets.
The new research method presented in this book ensures that all economic theories are falsifiable and that irrefutable theories are scientifically sound. Figueroa combines the logically consistent aspects of Popperian and process epistemologies in his alpha-beta method to address the widespread problem of too-general empirical research methods used in economics. He argues that scientific rules can be applied to economics to make sense of society, but that they must address the complexity of reality as well as the simplicity of the abstract on which hard sciences can rely. Furthermore, because the alpha-beta method combines approaches to address the difficulties of scientifically analyzing complex society, it also extends to other social sciences that have historically relied on empirical methods. This groundbreaking Pivot is ideal for students and researchers dedicated to promoting the progress of scientific research in all social sciences.
This book proposes new tools and models to price options, assess market volatility, and investigate the market efficiency hypothesis. In particular, it considers new models for hedge funds and derivatives of derivatives, and adds to the literature of testing for the efficiency of markets both theoretically and empirically.
This book offers a unique blend of asymptotic theory and small-sample practice through simulation experiments and data analysis; novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines; and an exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order.
provide models that could be used by do-it-yourselfers and also can be used to provide understanding of the background issues so that one can do a better job of working with the (proprietary) algorithms of the software vendors. In this book we strive to provide models that capture many of the details faced by firms operating in a modern supply chain, but we stop short of proposing models for economic analysis of the entire multi-player chain. In other words, we produce models that are useful for planning within a supply chain rather than models for planning the supply chain. The usefulness of the models is enhanced greatly by the fact that they have been implemented using computer modeling languages. Implementations are shown in Chapter 7, which allows solutions to be found using a computer. A reasonable question is: why write the book now? It is a combination of opportunities that have recently become available. The availability of modeling languages and computers provides the opportunity to make practical use of the models that we develop. Meanwhile, software companies are providing software for optimized production planning in a supply chain. The opportunity to make use of such software gives rise to a need to understand some of the issues in computational models for optimized planning. This is best done by considering simple models and examples.
Spatial econometrics is a rapidly evolving field born from the joint efforts of economists, statisticians, econometricians and regional scientists. The book provides the reader with a broad view of the topic by including both methodological and application papers. Indeed, the application papers relate to a number of diverse scientific fields, ranging from hedonic models of house pricing to demography, from health care to regional economics, and from the analysis of R&D spillovers to the study of retail market spatial characteristics. Particular emphasis is given to regional economic applications of spatial econometric methods, with a number of contributions specifically focused on the spatial concentration of economic activities and agglomeration, regional paths of economic growth, regional convergence of income and productivity, and the evolution of regional employment. Most of the papers appearing in this book were solicited from the International Workshop on Spatial Econometrics and Statistics held in Rome (Italy) in 2006.
This book is a collection of selected papers presented at the Annual Meeting of the European Academy of Management and Business Economics (AEDEM), held at the Faculty of Economics and Business of the University of Barcelona, 5-7 June 2012. This edition of the conference was presented under the slogan "Creating new opportunities in an uncertain environment". There are different ways of assessing uncertainty in management, but this book focuses mainly on soft computing theories and their role in assessing uncertainty in a complex world. The book gives a comprehensive overview of general management topics and discusses some of the most recent developments in all areas of business and management, including management, marketing, business statistics, innovation and technology, finance, sports and tourism. This book should be of great interest to anyone working in the area of management and business economics, and may be especially useful for scientists and graduate students doing research in these fields.
Exploring and understanding the analysis of economic development is essential as global economies continue to experience extreme fluctuation. Econometrics brings statistical methods to bear on practical economic content and relations. Econometric Methods for Analyzing Economic Development is a comprehensive collection that focuses on various regions and their economies at a pivotal time when the majority of nations are struggling to stabilize their economies. Outlining areas such as employment rates, utilization of natural resources, and regional impacts, this collection of research is an excellent tool for scholars, academics, and professionals looking to expand their knowledge of today's turbulent and changing economy.
This book proposes new methods to build optimal portfolios and to analyze market liquidity and volatility under market microstructure effects, as well as new financial risk measures using parametric and non-parametric techniques. In particular, it investigates the market microstructure of foreign exchange and futures markets.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears to be adequate to real situations in which the coalitional bargaining anticipates a proper realization of the game with strategic behaviour of players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision-making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which has recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set-theoretical approaches to game-theoretical models of human behaviour.
Floro Ernesto Caroleo and Francesco Pastore. This book was conceived to collect selected essays presented at the session on "The Labour Market Impact of the European Union Enlargements. A New Regional Geography of Europe?" of the XXII Conference of the Italian Association of Labour Economics (AIEL). The session aimed to stimulate debate on the continuity or fracture of regional patterns of development and employment in old and new European Union (EU) regions. In particular, we asked whether, and how, the causes of the emergence and evolution of regional imbalances in the new EU members of Central and Eastern Europe (CEE) differ from those in the old EU members. Several contributions in this book suggest that a factor common to all backward regions, often neglected in the literature, is their higher than average degree of structural change or, more precisely, the hardship they experience in coping with the process of structural change typical of all advanced economies. In the new EU members of CEE, structural change is still a consequence of the continuing process of transition from central planning to a market economy, but also of what Fabrizio et al. (2009) call the "second transition," namely that related to the run-up to and entry into the EU.
This book covers recent advances in efficiency evaluation, most notably Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. It introduces the underlying theories, shows how to make the relevant calculations, and discusses applications. The aim is to make the reader aware of the pros and cons of the different methods and to show how to use them in both standard and non-standard cases. Several software packages have been developed to solve some of the most common DEA and SFA models. This book relies on R, a free, open-source software environment for statistical computing and graphics. This enables the reader to solve not only standard problems, but also many other problem variants. Using R, one can focus on understanding the context and developing a good model; one is not restricted to predefined model variants and a one-size-fits-all approach. To facilitate the use of R, the authors have developed an R package called Benchmarking, which implements the main methods within both DEA and SFA. The book uses mathematical formulations of models and assumptions, but it de-emphasizes formal proofs, in part by placing them in appendices or by referring to the original sources. Moreover, the book emphasizes the use of the theories and the interpretation of the mathematical formulations. It includes a series of small examples, graphical illustrations, simple extensions and questions to think about, and it combines the formal models with less formal economic and organizational thinking. Last but not least, it discusses some larger applications with significant practical impact, including the design of benchmarking-based regulation of energy companies in different European countries and the development of merger control programs for competition authorities.
The editors are pleased to offer the following papers to the reader in recognition and appreciation of the contributions to our literature made by Robert Engle and Sir Clive Granger, winners of the 2003 Nobel Prize in Economics. The basic themes of this part of Volume 20 of Advances in Econometrics are time-varying betas of the capital asset pricing model, analysis of predictive densities of nonlinear models of stock returns, modelling multivariate dynamic correlations, flexible seasonal time series models, estimation of long-memory time series models, the application of the technique of boosting in volatility forecasting, the use of different time scales in GARCH modelling, out-of-sample evaluation of the Fed Model in stock price valuation, structural change as an alternative to long memory, the use of smooth transition autoregressions in stochastic volatility modelling, the analysis of the balancedness of regressions analyzing Taylor-type rules of the Fed Funds rate, a mixture-of-experts approach for the estimation of stochastic volatility, a modern assessment of Clive's first published paper on sunspot activity, and a new class of models of tail dependence in time series subject to jumps.
The book aims to contribute to perfecting the national governance system and improving national governance capacity. It evaluates the balance sheets of the state and of residents, non-financial corporations, financial institutions and the central bank, the central government, local governments and external sectors, the goal being to provide a systematic analysis of the characteristics and trajectory of China's economic expansion and structural adjustment, as well as objective assessments of short- and long-term economic operations, debt risks and financial risks with regard to the institutional and structural characteristics of economic development under market-oriented reform. It puts forward a preliminary analysis of China's national and sectoral balance sheets on the basis of scientific estimates of various kinds of data; analyzes from a new perspective the major issues currently troubling China, namely development sustainability, government transformation, local government debt, welfare reform, and financial opening-up and stability; and explores corresponding policies, measures, and institutional arrangements.
In the 2nd edition some sections of Part I are omitted for better readability, and a brand new chapter is devoted to volatility risk. As a consequence, hedging of plain-vanilla options and valuation of exotic options are no longer limited to the Black-Scholes framework with constant volatility. In the 3rd printing of the 2nd edition, the second chapter, on discrete-time markets, has been extensively revised. Proofs of several results are simplified, and completely new sections on optimal stopping problems and Dynkin games are added. Applications to the valuation and hedging of American-style and game options are presented in some detail. The theme of stochastic volatility also reappears systematically in the second part of the book, which has been revised fundamentally and presents much more detailed analyses of the various interest-rate models available. The authors' perspective throughout is that the choice of a model should be based on the reality of how a particular sector of the financial market functions, never neglecting to examine liquid primary and derivative assets and to identify the associated sources of trading risk. This long-awaited new edition of an outstandingly successful, well-established book, concentrating on the most pertinent and widely accepted modelling approaches, provides the reader with a text focused on practical rather than theoretical aspects of financial modelling.