Books > Business & Economics > Economics > Econometrics
This is the first of two volumes containing papers and commentaries presented at the Eleventh World Congress of the Econometric Society, held in Montreal, Canada in August 2015. These papers provide state-of-the-art guides to the most important recent research in economics. The book includes surveys and interpretations of key developments in economics and econometrics, and discussion of future directions for a wide variety of topics, covering both theory and application. These volumes provide a unique, accessible survey of progress in the discipline, written by leading specialists in their fields. The first volume includes theoretical and applied papers addressing topics such as dynamic mechanism design, agency problems, and networks.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
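The impulse-response analysis highlighted above can be sketched in a few lines. The following is a minimal illustration, assuming the statsmodels library; the two-variable setup, variable names, and parameter values are invented for the example and are not from the book.

```python
# Hedged sketch: impulse responses from a small VAR fitted to simulated
# stationary data. Series names ("output", "inflation") are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 400
# Simulate a stationary bivariate VAR(1): y_t = A y_{t-1} + e_t
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=2)

data = pd.DataFrame(y, columns=["output", "inflation"])
res = VAR(data).fit(maxlags=4, ic="aic")   # lag order chosen by AIC
irf = res.irf(10)                          # impulse responses, 10 steps ahead
print(irf.irfs.shape)                      # (11, 2, 2): horizon x response x shock
```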
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
Coverage has been extended to include recent topics. The book again presents a unified treatment of econometric theory, with the method of maximum likelihood playing a key role in both estimation and testing. Exercises are included, and the book is suitable as a general text for final-year undergraduate and postgraduate students.
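Since maximum likelihood is the organizing method here, a minimal worked sketch may help: estimating a linear model with normal errors by direct likelihood maximization. This is a generic illustration on simulated data, not an example from the book.

```python
# Minimal MLE sketch: fit y = a + b*x + N(0, s^2) by maximizing the
# log-likelihood numerically. All data and starting values are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.8, size=n)

def neg_loglik(theta):
    a, b, log_s = theta
    s = np.exp(log_s)                  # parameterize via log(s) so s > 0
    resid = y - a - b * x
    return 0.5 * n * np.log(2 * np.pi * s**2) + np.sum(resid**2) / (2 * s**2)

fit = minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
a_hat, b_hat, s_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(a_hat, b_hat, s_hat)             # ML estimates of intercept, slope, sigma
```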
Applied Time Series Modelling and Forecasting provides a relatively non-technical introduction to applied time series econometrics and forecasting involving non-stationary data. The emphasis is very much on the why and how and, as much as possible, the authors confine technical material to boxes or point to the relevant sources for more detailed information. This book is based on an earlier title, Using Cointegration Analysis in Econometric Modelling by Richard Harris. As well as updating material covered in the earlier book, there are two major additions involving panel tests for unit roots and cointegration, and forecasting of financial time series. Harris and Sollis have also incorporated as many of the latest techniques in the area as possible, including: testing for periodic integration and cointegration; GLS detrending when testing for unit roots; structural breaks and seasonal unit root testing; testing for cointegration with a structural break; asymmetric tests for cointegration; testing for super-exogeneity; seasonal cointegration in multivariate models; and approaches to structural macroeconomic modelling. In addition, the discussion of certain topics, such as testing for unique vectors, has been simplified. Applied Time Series Modelling and Forecasting has been written for students taking courses in financial economics and forecasting, applied time series, and econometrics at advanced undergraduate and postgraduate levels. It will also be useful for practitioners, such as financial brokers, who wish to understand the application of time series modelling. Data sets and econometric code for implementing some of the more recent procedures covered in the book can be found at www.wiley.co.uk/harris
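Since the book points readers to econometric code for such procedures, here is a hedged sketch of the two workhorse single-equation tests it builds on: an augmented Dickey-Fuller unit-root test and an Engle-Granger cointegration test via statsmodels. The series are simulated and the names invented; this is not the book's own code.

```python
# Unit-root and cointegration testing on a simulated cointegrated pair.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(2)
T = 500
common = np.cumsum(rng.normal(size=T))        # shared I(1) stochastic trend
y1 = common + rng.normal(scale=0.5, size=T)   # y1 and y2 are each I(1),
y2 = common + rng.normal(scale=0.5, size=T)   # but y1 - y2 is stationary

adf_stat, adf_p, *_ = adfuller(y1)            # should NOT reject a unit root
print(f"ADF p-value for y1: {adf_p:.3f}")

eg_stat, eg_p, _ = coint(y1, y2)              # residual-based Engle-Granger test
print(f"Engle-Granger p-value: {eg_p:.3f}")   # small p-value -> cointegration
```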
How the obsession with quantifying human performance threatens business, medicine, education, government, and the quality of our lives. Today, organizations of all kinds are ruled by the belief that the path to success is quantifying human performance, publicizing the results, and dividing up the rewards based on the numbers. But in our zeal to instill the evaluation process with scientific rigor, we've gone from measuring performance to fixating on measuring itself, and this tyranny of metrics now threatens the quality of our organizations and lives. In this brief, accessible, and powerful book, Jerry Muller uncovers the damage metrics are causing and shows how we can begin to fix the problem. Filled with examples from business, medicine, education, government, and other fields, the book explains why paying for measured performance doesn't work, why surgical scorecards may increase deaths, and much more. But Muller also shows that, when used as a complement to judgment based on personal experience, metrics can be beneficial, and he includes an invaluable checklist of when and how to use them. The result is an essential corrective to a harmful trend that increasingly affects us all.
The Regression Discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. Volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters touch on foundational methodological issues such as identification, interpretation, implementation, falsification testing, estimation and inference, while others focus on more recent and related topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only give new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
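To fix ideas, here is a minimal sketch of a sharp RD estimate via local linear regression on each side of the cutoff, on simulated data with a known treatment effect of 2.0. It is not drawn from any chapter; the bandwidth is fixed by hand, whereas in practice it would come from a data-driven rule.

```python
# Sharp RD: difference of local-linear intercepts at the cutoff (0).
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(-1, 1, n)                 # running variable, cutoff at 0
treated = (x >= 0).astype(float)
y = 1.0 + 0.8 * x + 2.0 * treated + rng.normal(scale=0.5, size=n)

h = 0.3                                   # hand-picked bandwidth

def side_intercept(mask):
    # OLS of y on (1, x) within the window; the intercept is the fitted
    # value of y at the cutoff from that side
    X = np.column_stack([np.ones(mask.sum()), x[mask]])
    beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    return beta[0]

left = side_intercept((x < 0) & (x > -h))
right = side_intercept((x >= 0) & (x < h))
print(f"RD estimate at cutoff: {right - left:.3f}")   # close to 2.0
```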
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore the various disciplines from which the concepts and methods of artificial economics originate: for example, cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions of several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophy of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting in a balanced way its occasionally controversial aspects.
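A toy sketch of the simulation approach described here: artificial agents interact under a simple symmetric rule and a skewed aggregate pattern emerges. This random pairwise-exchange economy is a standard classroom example, not a model from the book.

```python
# Minimal agent-based simulation: random pairwise wealth exchanges.
import numpy as np

rng = np.random.default_rng(4)
n_agents, n_rounds = 1000, 50_000
wealth = np.full(n_agents, 100.0)          # everyone starts equal

for _ in range(n_rounds):
    i, j = rng.integers(n_agents, size=2)
    if i != j and wealth[i] > 0:
        transfer = rng.uniform(0, wealth[i])   # i gives a random share to j
        wealth[i] -= transfer
        wealth[j] += transfer

# Aggregate outcome: a highly unequal distribution emerges from
# symmetric individual rules.
print(f"mean {wealth.mean():.1f}, median {np.median(wealth):.1f}, "
      f"max {wealth.max():.1f}")
```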
Non-market valuation has become a broadly accepted and widely practiced means of measuring the economic values of the environment and natural resources. In this book, the authors provide a guide to the statistical and econometric practices that economists employ in estimating non-market values. The authors develop the econometric models that underlie the basic methods: contingent valuation, travel cost models, random utility models and hedonic models. They analyze the measurement of non-market values as a procedure with two steps: the estimation of parameters of demand and preference functions and the calculation of benefits from the estimated models. Each of the models is carefully developed from the preference function to the behavioral or response function that researchers observe. The models are then illustrated with datasets that characterize the kinds of data researchers typically deal with. The real world data and clarity of writing in this book will appeal to environmental economists, students, researchers and practitioners in multilateral banks and government agencies.
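Of the methods named, the hedonic model is the easiest to sketch: regress log house price on attributes, then read the implicit marginal price of an environmental amenity off the fitted coefficient. The data, variable names, and coefficients below are simulated for illustration.

```python
# Hedged hedonic-regression sketch on simulated housing data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
sqft = rng.normal(1500, 300, n)
air_quality = rng.normal(50, 10, n)        # the non-market attribute
log_price = (10 + 0.0004 * sqft + 0.004 * air_quality
             + rng.normal(scale=0.1, size=n))

X = sm.add_constant(np.column_stack([sqft, air_quality]))
res = sm.OLS(log_price, X).fit()

# In a semi-log hedonic, the marginal implicit price of an attribute is
# roughly its coefficient times the price level.
mean_price = np.exp(log_price).mean()
print(f"implicit price per unit of air quality: "
      f"{res.params[2] * mean_price:.0f}")
```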
A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policymakers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyse patterns of growth and related long term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented.
'Experiments in Organizational Economics' highlights the importance of replicating previous economic experiments. Replication enables experimental findings to be subjected to rigorous scrutiny. Despite this obvious advantage, direct replication remains relatively scant in economics. One possible explanation for this situation is that publication outlets favor novel work over tests of robustness. Readers will gain a better understanding of the role that replication plays in economic discovery as well as valuable insights into the robustness of previously reported findings.
Technical analysis holds that the best source of information for beating the market is the price itself. Introducing readers to technical analysis in a succinct and practical way, Ramlall focuses on its key aspects, benefits, drawbacks, and main tools. Chart Patterns, Point & Figure, Stochastics, sentiment indicators, Elliott Wave Theory, RSI, R, Candlesticks and more are covered, including both the concepts and their practical applications. The book also covers programming technical analysis tools, making it a valuable resource for both researchers and practitioners.
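As a minimal sketch of one indicator covered, here is a 14-period RSI computed with simple moving averages (Wilder's original exponential smoothing is a common variant). The price series is simulated; nothing here is from the book.

```python
# RSI = 100 - 100 / (1 + RS), where RS = average gain / average loss.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
price = pd.Series(100 + np.cumsum(rng.normal(0, 1, 300)))

delta = price.diff()
gain = delta.clip(lower=0).rolling(14).mean()     # average up-move
loss = (-delta.clip(upper=0)).rolling(14).mean()  # average down-move
rs = gain / loss
rsi = 100 - 100 / (1 + rs)   # conventional reading: >70 overbought, <30 oversold
print(rsi.tail())
```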
The last decade has brought dramatic changes in the way that researchers analyze economic and financial time series. This book synthesizes these recent advances and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. "Time Series Analysis" fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers.
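One of the tools listed, the Kalman filter, can be written out in a few lines for the local level model y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t. The sketch below treats the noise variances as known for simplicity; it illustrates the filter's mechanics, not any example from the book.

```python
# Kalman filter for a simulated local level (random walk plus noise) model.
import numpy as np

rng = np.random.default_rng(7)
T = 200
sig_eps, sig_eta = 1.0, 0.3
mu_true = np.cumsum(rng.normal(0, sig_eta, T))   # latent level
y = mu_true + rng.normal(0, sig_eps, T)          # noisy observations

mu, P = 0.0, 10.0                 # diffuse-ish initial state mean and variance
filtered = np.empty(T)
for t in range(T):
    P = P + sig_eta**2            # predict: state variance grows
    K = P / (P + sig_eps**2)      # Kalman gain
    mu = mu + K * (y[t] - mu)     # update with the new observation
    P = (1 - K) * P
    filtered[t] = mu

print(np.mean((filtered - mu_true)**2))   # filter MSE against the true level
```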
This textbook provides future data analysts with the tools, methods, and skills needed to answer data-focused, real-life questions; to carry out data analysis; and to visualize and interpret results to support better decisions in business, economics, and public policy. Data wrangling and exploration, regression analysis, machine learning, and causal analysis are comprehensively covered, as well as when, why, and how the methods work, and how they relate to each other. As the most effective way to communicate data analysis, running case studies play a central role in this textbook. Each case starts with an industry-relevant question and answers it by using real-world data and applying the tools and methods covered in the textbook. Learning is then consolidated by 360 practice questions and 120 data exercises. Extensive online resources, including raw and cleaned data and code for all analyses in Stata, R, and Python, can be found at www.gabors-data-analysis.com.
* A useful guide to financial product modeling and to minimizing business risk and uncertainty
* Looks at a wide range of financial assets and markets and correlates them with enterprises' profitability
* Introduces advanced and novel machine learning techniques in finance, such as Support Vector Machines, Neural Networks, Random Forests, K-Nearest Neighbors, Extreme Learning Machines, and Deep Learning approaches, and applies them to analyze financial data sets (see the sketch after this list)
* Real-world examples to further understanding
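As a hedged illustration of one listed technique, here is a random forest applied to a simulated "next-day direction" classification task. The features and target are invented; nothing here comes from the book's data sets.

```python
# Random forest classifier on synthetic finance-style features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n = 2000
X = rng.normal(size=(n, 5))       # e.g. lagged returns, volume, spreads (invented)
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"out-of-sample accuracy: {clf.score(X_te, y_te):.3f}")
```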
If you know a little bit about financial mathematics but don't yet know a lot about programming, then C++ for Financial Mathematics is for you. C++ is an essential skill for many jobs in quantitative finance, but learning it can be a daunting prospect. This book gathers together everything you need to know to price derivatives in C++ without unnecessary complexities or technicalities. It leads the reader step-by-step from programming novice to writing a sophisticated and flexible financial mathematics library. At every step, each new idea is motivated and illustrated with concrete financial examples. As employers understand, there is more to programming than knowing a computer language. As well as covering the core language features of C++, this book teaches the skills needed to write truly high quality software. These include topics such as unit tests, debugging, design patterns and data structures. The book teaches everything you need to know to solve realistic financial problems in C++. It can be used for self-study or as a textbook for an advanced undergraduate or master's level course.
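The book builds its pricers in C++; as a language-neutral sketch of the kind of computation it works up to, here is a Monte Carlo price for a European call under geometric Brownian motion, checked against the Black-Scholes closed form. All parameter values are illustrative.

```python
# Monte Carlo pricing of a European call under GBM, with an analytic check.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
rng = np.random.default_rng(9)
n_paths = 200_000

# Terminal price under risk-neutral GBM, then discounted average payoff
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
mc_price = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

# Black-Scholes closed form for comparison
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
print(f"Monte Carlo {mc_price:.3f} vs Black-Scholes {bs_price:.3f}")
```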
Nanak Kakwani and Hyun Hwa Son make use of social welfare functions to derive indicators of development relevant to specific social objectives, such as poverty- and inequality-reduction. Arguing that the measurement of development cannot be value-free, the authors assert that if indicators of development are to have policy relevance, they must be assessed on the basis of the social objectives in question. This study develops indicators that are sensitive to both the level and the distribution of individuals' capabilities. The idea of the social welfare function, defined in income space, is extended to the concept of the social well-being function, defined in capability space. Through empirical analysis from selected developing countries, with a particular focus on Brazil, the authors shape techniques appropriate to the analysis of development in different dimensions. The focus of this evidence-based policy analysis is to evaluate alternative policies affecting the capacities of people to enjoy a better life.
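A classic distribution-sensitive welfare indicator in the same spirit is Sen's welfare function W = mu(1 - G): mean income discounted by the Gini coefficient. The sketch below is a textbook example, not the authors' specific measures, and the income vectors are invented.

```python
# Sen-style welfare index: same mean, different inequality, different W.
import numpy as np

def gini(x):
    # Cumulative-sum formulation of the Gini coefficient (sorted ascending)
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

incomes_a = np.array([10, 20, 30, 40, 50], dtype=float)   # less equal
incomes_b = np.array([28, 29, 30, 31, 32], dtype=float)   # same mean, more equal

for name, inc in [("A", incomes_a), ("B", incomes_b)]:
    W = inc.mean() * (1 - gini(inc))
    print(f"economy {name}: mean {inc.mean():.0f}, "
          f"Gini {gini(inc):.3f}, W {W:.1f}")
```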
This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features of the book include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi-likelihood methods, sequential Markov chain Monte Carlo methods and particle filters, are also included so as to provide an overall view of the available tools for parameter estimation for nonlinear models. A chapter on integer time series models based on several thinning operations, which brings together all recent advances made in this area, is also included. Readers should have attended a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. This book offers a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time series.
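The thinning idea behind integer-valued time series models admits a very short sketch: an INAR(1) process X_t = alpha o X_{t-1} + eps_t, where "o" is binomial thinning (each of the X_{t-1} counts survives with probability alpha) and eps_t is a Poisson innovation. Parameter values below are illustrative.

```python
# Simulate a Poisson INAR(1) process via binomial thinning.
import numpy as np

rng = np.random.default_rng(10)
alpha, lam, T = 0.6, 2.0, 500
x = np.zeros(T, dtype=int)
x[0] = rng.poisson(lam / (1 - alpha))          # start near the stationary mean
for t in range(1, T):
    survivors = rng.binomial(x[t - 1], alpha)  # thinning of last period's count
    x[t] = survivors + rng.poisson(lam)        # plus new Poisson arrivals

# Stationary mean of a Poisson INAR(1) is lam / (1 - alpha) = 5 here
print(x.mean())
```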
Focusing on Bayesian approaches and computations using analytic and simulation-based methods for inference, Time Series: Modeling, Computation, and Inference, Second Edition integrates mainstream approaches for time series modeling with significant recent developments in methodology and applications of time series analysis. It encompasses a graduate-level account of Bayesian time series modeling, analysis and forecasting, a broad range of references to state-of-the-art approaches to univariate and multivariate time series analysis, and contact with research frontiers in multivariate time series modeling and forecasting. It presents overviews of several classes of models and related methodology for inference, statistical computation for model fitting and assessment, and forecasting. It explores the connections between time- and frequency-domain approaches and develops various models and analyses using Bayesian formulations and computation, including computations based on Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods. It illustrates the models and methods with examples and case studies from a variety of fields, including signal processing, biomedicine, environmental science, and finance. Along with core models and methods, the book presents state-of-the-art approaches to analysis and forecasting in challenging time series problems, demonstrates the growth of time series analysis into new application areas in recent years, and engages recent and relevant modeling developments and research challenges. New in the second edition: expanded treatment of core model theory and methodology; multiple new examples and exercises; detailed development of dynamic factor models; and updated discussion of, and connections with, recent and current research frontiers.
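The MCMC machinery mentioned here can be shown at toy scale. Below is a minimal random-walk Metropolis sampler for the coefficient of a simulated AR(1) series, assuming known noise variance and a flat prior on (-1, 1); it illustrates the mechanics only and is not an example from the book.

```python
# Random-walk Metropolis for the AR(1) coefficient phi.
import numpy as np

rng = np.random.default_rng(11)
T, phi_true = 300, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal()

def loglik(phi):
    # Conditional Gaussian likelihood with sigma = 1
    resid = y[1:] - phi * y[:-1]
    return -0.5 * np.sum(resid**2)

draws, phi = [], 0.0
for _ in range(5000):
    prop = phi + rng.normal(scale=0.1)      # random-walk proposal
    # Accept with probability min(1, posterior ratio); flat prior on (-1, 1)
    if abs(prop) < 1 and np.log(rng.uniform()) < loglik(prop) - loglik(phi):
        phi = prop
    draws.append(phi)

print(np.mean(draws[1000:]))    # posterior mean after burn-in, close to 0.7
```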
The generalized method of moments (GMM) has emerged over the past decade as a ready-to-use, flexible estimation tool that can be applied to a large number of econometric and economic models while relying on mild, plausible assumptions. The principal objective of this volume, the first devoted entirely to the GMM methodology, is to offer a complete and up-to-date presentation of the theory of GMM estimation as well as insights into the use of these methods in empirical studies. It is also designed to serve as a unified framework for teaching estimation theory in econometrics. Contributors to the volume include well-known authorities in the field based in North America, the UK/Europe, and Australia.
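As a rough sketch of the estimator itself (not drawn from the volume), here is two-step linear GMM with instruments on simulated data: the first-step weight matrix reproduces 2SLS, and the second step re-weights with the inverse of the estimated moment covariance. All names and parameter values are invented.

```python
# Two-step linear GMM for an endogenous regressor with two instruments.
import numpy as np

rng = np.random.default_rng(12)
n = 5000
z = rng.normal(size=(n, 2))                      # instruments
u = rng.normal(size=n)
x = z @ np.array([1.0, 0.5]) + 0.8 * u + rng.normal(size=n)  # endogenous x
y = 2.0 * x + u                                  # true beta = 2.0

X = x[:, None]

def gmm_beta(W):
    # beta = (X'Z W Z'X)^(-1) X'Z W Z'y
    XZ, Zy = X.T @ z, z.T @ y
    return np.linalg.solve(XZ @ W @ XZ.T, XZ @ W @ Zy)

W1 = np.linalg.inv(z.T @ z)                      # first-step weight (2SLS)
b1 = gmm_beta(W1)
e = y - X @ b1                                   # first-step residuals
S = (z * e[:, None]).T @ (z * e[:, None]) / n    # estimated moment covariance
b2 = gmm_beta(np.linalg.inv(S))                  # efficient two-step GMM
print(b1, b2)                                    # both close to 2.0
```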
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques that have "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. It is part of a larger project that includes a companion volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. From its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques that have "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. It is part of a larger project that includes a companion volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
Business analytics has grown to be a key topic in business curricula, and there is a need for stronger quantitative skills and understanding of fundamental concepts. This book presents key concepts related to quantitative analysis in business. It is targeted at undergraduate and graduate business students taking an introductory core course. Topics covered include knowledge management, visualization, sampling and hypothesis testing, regression (simple, multiple, and logistic), and optimization modeling. It concludes with a brief overview of data mining. Concepts are demonstrated with worked examples.