Herbert Scarf is a highly esteemed American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained worldwide recognition for his classic study on the stability of Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study of increasing returns and models of production in the presence of indivisibilities. Above all, however, the name of Scarf is remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics, termed Applied General Equilibrium Analysis, and a corresponding area in operations research known as Simplicial Fixed Point Methods. This collection of all his research articles consists of four volumes; this volume collects Herbert Scarf's papers in the area of Operations Research and Management.
Economic history is the most quantitative branch of history, reflecting the interests and profiting from the techniques and concepts of economics. This essay, first published in 1977, provides an extensive contribution to quantitative historiography by delivering a critical guide to the sources of the numerical data of the period 1700 to 1850. This title will be of interest to students of history, finance and economics.
"Students of econometrics and their teachers will find this book to be the best introduction to the subject at the graduate and advanced undergraduate level. Starting with least squares regression, Hayashi provides an elegant exposition of all the standard topics of econometrics, including a detailed discussion of stationary and non-stationary time series. The particular strength of the book is the excellent balance between econometric theory and its applications, using GMM as an organizing principle throughout. Each chapter includes a detailed empirical example taken from classic and current applications of econometrics."--Dale Jorgensen, Harvard University ""Econometrics" will be a very useful book for intermediate and advanced graduate courses. It covers the topics with an easy to understand approach while at the same time offering a rigorous analysis. The computer programming tips and problems should also be useful to students. I highly recommend this book for an up-to-date coverage and thoughtful discussion of topics in the methodology and application of econometrics."--Jerry A. Hausman, Massachusetts Institute of Technology ""Econometrics" covers both modern and classic topics without shifting gears. The coverage is quite advanced yet the presentation is simple. Hayashi brings students to the frontier of applied econometric practice through a careful and efficient discussion of modern economic theory. The empirical exercises are very useful. . . . The projects are carefully crafted and have been thoroughly debugged."--Mark W. Watson, Princeton University ""Econometrics" strikes a good balance between technical rigor and clear exposition. . . . The use of empiricalexamples is well done throughout. I very much like the use of old 'classic' examples. It gives students a sense of history--and shows that great empirical econometrics is a matter of having important ideas and good data, not just fancy new methods. . . . The style is just great, informal and engaging."--James H. Stock, John F. Kennedy School of Government, Harvard University
This book focuses on the application of the partial hedging approach from modern mathematical finance to equity-linked life insurance contracts. It provides an accessible, up-to-date introduction to quantifying financial and insurance risks. The book also explains how to price innovative financial and insurance products from partial hedging perspectives. Each chapter presents the problem, the mathematical formulation, theoretical results, derivation details, numerical illustrations, and references to further reading.
A Handbook of Statistical Analyses Using SPSS clearly describes how to conduct a range of univariate and multivariate statistical analyses using the latest version of the Statistical Package for the Social Sciences, SPSS 11. Each chapter addresses a different type of analytical procedure applied to one or more data sets, primarily from the social and behavioral sciences. Each chapter also contains exercises relating to the data sets introduced, providing readers with a means to develop both their SPSS and statistical skills. Model answers to the exercises are also provided. Readers can download all of the data sets from a companion Web site furnished by the authors.
Financial econometrics is one of the great ongoing success stories of recent decades, having become one of the most active areas of research in econometrics. In this book, Michael Clements presents a clear and logical explanation of the key concepts and ideas behind forecasts of economic and financial variables. He shows that forecasts of the single most likely outcome of an economic or financial variable are of limited value; forecasts that provide more information on the expected range of likely outcomes are more relevant. This book provides a comprehensive treatment of the evaluation of different types of forecasts and draws out the parallels between the different approaches. It describes the methods of evaluating these more complex forecasts, which provide a fuller description of the range of possible future outcomes.
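To make the contrast between point and interval forecasts concrete, here is a small illustrative sketch (not taken from Clements' book) that scores a nominal 90% prediction interval by its empirical coverage and by the pinball (quantile) loss of its endpoints; the data and the forecast quantiles are invented.

```python
# Hedged sketch: evaluating interval forecasts rather than point forecasts.
import numpy as np

def pinball_loss(y, q, tau):
    """Average check (quantile) loss of forecast quantile q at level tau."""
    u = y - q
    return np.mean(np.maximum(tau * u, (tau - 1) * u))

rng = np.random.default_rng(1)
y = rng.normal(size=10_000)     # realized outcomes (toy data)
lo, hi = -1.645, 1.645          # forecast 5% and 95% quantiles of a N(0,1)

coverage = np.mean((y >= lo) & (y <= hi))
print(f"empirical coverage of nominal 90% interval: {coverage:.3f}")
print(f"pinball loss at tau=0.05: {pinball_loss(y, lo, 0.05):.4f}")
print(f"pinball loss at tau=0.95: {pinball_loss(y, hi, 0.95):.4f}")
```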
The interaction between mathematicians, statisticians and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4th-6th April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural finance; carbon and green finance; credit risk methods and models; dynamic optimization in finance; financial econometrics; forecasting of dynamical actuarial and financial phenomena; fund performance evaluation; insurance portfolio risk analysis; interest rate models; longevity risk; machine learning and soft-computing in finance; management in insurance business; models and methods for financial time series analysis; models for financial derivatives; multivariate techniques for financial markets analysis; optimization in insurance; pricing; probability in actuarial sciences, insurance and finance; real world finance; risk management; solvency analysis; sovereign risk; static and dynamic portfolio selection and management; trading systems. This book is a valuable resource for academics, PhD students, practitioners, professionals and researchers, and is also of interest to other readers with quantitative background knowledge.
Numerical analysis is the study of computation: its accuracy, its stability, and its implementation on a computer. This book focuses on the principles of numerical analysis and is intended to equip readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
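A classic instance of the accuracy and stability issues such a book addresses is the one-pass "textbook" variance formula, which suffers catastrophic cancellation when the mean is large relative to the spread. The sketch below (illustrative only) compares it with the stable two-pass formula.

```python
# Hedged sketch: two ways to compute a variance, one unstable, one stable.
import numpy as np

rng = np.random.default_rng(2)
x = 1e8 + rng.normal(size=10_000)   # huge mean, unit variance

naive = np.sum(x**2) / len(x) - (np.sum(x) / len(x))**2   # cancels catastrophically
two_pass = np.sum((x - x.mean())**2) / len(x)             # numerically stable

print(naive, two_pass)   # naive may be wildly off or even negative; two_pass ~ 1.0
```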
The book provides an up-to-date survey of statistical and econometric techniques for the analysis of count data, with a focus on conditional distribution models. It starts with a presentation of the benchmark Poisson regression model. Alternative models address unobserved heterogeneity, state dependence, selectivity, endogeneity, underreporting, and clustered sampling. Estimation and testing are discussed. Finally, applications in various fields are reviewed.
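As a hedged illustration of the benchmark model the survey starts from, the snippet below fits a Poisson regression by maximum likelihood using statsmodels' GLM interface on simulated data; the coefficient values are invented.

```python
# Hedged sketch: the benchmark Poisson regression with a log link.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
x = rng.normal(size=n)
mu = np.exp(0.5 + 0.8 * x)      # log link: E[y|x] = exp(b0 + b1 * x)
y = rng.poisson(mu)

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(fit.params)               # roughly [0.5, 0.8]
```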
This book employs a computable general equilibrium (CGE) model - a widely used economic model which uses actual data to provide economic analysis and policy assessment - and applies it to economic data on Singapore's tourism industry. The authors set out to demonstrate how a novice modeller can acquire the necessary skills and knowledge to successfully apply general equilibrium models to tourism studies. The chapters explain how to build a computable general equilibrium model for tourism, how to conduct simulation and, most importantly, how to analyse modelling results. This applied study acts as a modelling book at both introductory and intermediate levels, specifically targeting students and researchers who are interested in and wish to learn computable general equilibrium modelling. The authors offer insightful analysis of Singapore's tourism industry and provide both students and researchers with a guide on how to apply general equilibrium models to actual economic data and draw accurate conclusions.
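Full CGE models are calibrated to social accounting data, but the core computation, finding prices that clear all markets, can be shown in miniature. The sketch below is a toy pure-exchange economy, not the authors' Singapore model: two goods, two Cobb-Douglas consumers, and every parameter made up for illustration.

```python
# Toy equilibrium computation: solve for the market-clearing relative price.
from scipy.optimize import brentq

a1, a2 = 0.3, 0.6   # expenditure shares on good 1 for consumers 1 and 2 (invented)
# Endowments: consumer 1 owns one unit of good 1, consumer 2 one unit of good 2.
# With good 2 as numeraire (p2 = 1), incomes are m1 = p1 and m2 = 1.

def excess_demand_good1(p1):
    m1, m2 = p1, 1.0                       # incomes from selling endowments
    demand = a1 * m1 / p1 + a2 * m2 / p1   # Cobb-Douglas demands a_i * m_i / p1
    return demand - 1.0                    # total endowment of good 1 is one unit

p1_star = brentq(excess_demand_good1, 0.01, 100.0)
print(p1_star)   # 0.857... = a2 / (1 - a1); by Walras' law good 2 also clears
```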
Introduction to Statistics with SPSS offers an introduction to statistics that can be used before, during or after a course on statistics. Covering a wide range of terms and techniques, including simple and multiple regressions, this book guides the student to enter data from a simple research project into a computer, provide an adequate analysis of the data and present a report on the findings.
Spatial Econometrics provides a modern, powerful and flexible skillset to early career researchers interested in entering this rapidly expanding discipline. It articulates the principles and current practice of modern spatial econometrics and spatial statistics, combining rigorous presentation with unusual breadth of coverage. Introducing and formalizing the principles of, and 'need' for, models which define spatial interactions, the book provides a comprehensive framework applicable across the many branches of modern science that work with spatially referenced data. Subjects covered at length include spatial regression models, weighting matrices, estimation procedures and the complications associated with their use. The work particularly focuses on models of uncertainty and on estimation under various complications relating to model specification, data problems and tests of hypotheses, along with systems and panel data extensions, which are covered in exhaustive detail. Extensions discussing pre-test procedures and Bayesian methodologies are provided at length. Throughout, direct applications of spatial models are described in detail, with copious illustrative empirical examples demonstrating how readers might implement spatial analysis in research projects. Designed as a textbook and reference companion, every chapter concludes with a set of questions for formal or self-study. Finally, the book includes extensive supplementary material on large-sample theory, with supporting code in the R programming language, for early career econometricians interested in implementing the statistical procedures covered.
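As a small, hedged illustration of the raw ingredients of spatial econometrics (not code from the book, and in Python rather than the book's R), the snippet below builds a row-standardized contiguity weights matrix for five regions on a line and computes Moran's I, the canonical spatial autocorrelation statistic.

```python
# Hedged sketch: a spatial weights matrix and Moran's I in plain NumPy.
import numpy as np

rng = np.random.default_rng(4)
n = 5                                   # tiny "map": five regions on a line
W = np.zeros((n, n))
for i in range(n - 1):                  # neighbors are adjacent regions
    W[i, i + 1] = W[i + 1, i] = 1.0
W = W / W.sum(axis=1, keepdims=True)    # row-standardize

y = rng.normal(size=n)                  # an attribute observed in each region
z = y - y.mean()
I = (n / W.sum()) * (z @ W @ z) / (z @ z)   # Moran's I
print(I)
```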
Developing countries in Asia and Africa face many problems of poverty, inequality and growth. Policy makers at the national level, and at international institutions such as the United Nations, World Bank and International Monetary Fund, have implemented various policies to reduce poverty and inequality. This book provides empirical observations on countries in Asia and Africa. Each chapter offers theoretical and empirical analysis of regional case studies, with an emphasis on policy implications. The book will be of use to those who wish to assess and improve policies in developing countries, mitigate poverty and inequality, and stimulate growth by drawing on relevant empirical research and economic theories. Clearly, there have been numerous policy failures, and the book aims to provide a basis for improving policies and outcomes based on relevant empirical observations.
This short book introduces the main ideas of statistical inference in a way that is both user friendly and mathematically sound. Particular emphasis is placed on the common foundation of many models used in practice. In addition, the book focuses on the formulation of appropriate statistical models to study problems in business, economics, and the social sciences, as well as on how to interpret the results from statistical analyses. The book will be useful to students who are interested in rigorous applications of statistics to problems in business, economics and the social sciences, as well as students who have studied statistics in the past, but need a more solid grounding in statistical techniques to further their careers. Jacco Thijssen is professor of finance at the University of York, UK. He holds a PhD in mathematical economics from Tilburg University, Netherlands. His main research interests are in applications of optimal stopping theory, stochastic calculus, and game theory to problems in economics and finance. Professor Thijssen has earned several awards for his statistics teaching.
The purpose of this book is to introduce novice researchers to the tools of meta-analysis and meta-regression analysis and to summarize the state of the art for existing practitioners. Meta-regression analysis addresses the rising "Tower of Babel" that current economics and business research has become. Meta-analysis is the statistical analysis of previously published, or reported, research findings on a given hypothesis, empirical effect, phenomenon, or policy intervention. It is a systematic review of all the relevant scientific knowledge on a specific subject and is an essential part of the evidence-based practice movement in medicine, education and the social sciences. However, research in economics and business is often fundamentally different from what is found in the sciences and thereby requires different methods for its synthesis: meta-regression analysis. This book develops, summarizes, and applies these meta-analytic methods.
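The computational core of a fixed-effect meta-analysis is inverse-variance pooling, which the sketch below illustrates on invented effect sizes and standard errors; meta-regression would go one step further and regress the effects on study characteristics.

```python
# Hedged sketch: inverse-variance (fixed-effect) pooling of study estimates.
import numpy as np

effects = np.array([0.30, 0.12, 0.25, 0.40])   # reported estimates (toy values)
ses     = np.array([0.10, 0.08, 0.15, 0.20])   # their standard errors (toy values)

w = 1.0 / ses**2                               # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(f"pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
```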
This book brings together cutting-edge contributions in the fields of international economics, micro theory, welfare economics and econometrics, with contributions from Donald R. Davis, Avinash K. Dixit, Tadashi Inoue, Ronald W. Jones, Dale W. Jorgenson, K. Rao Kadiyala, Murray C. Kemp, Kenneth M. Kletzer, Anne O. Krueger, Mukul Majumdar, Daniel McFadden, Lionel McKenzie, James R. Melvin, James C. Moore, Takashi Negishi, Yoshihiko Otani, Raymond Riezman, Paul A. Samuelson, Joaquim Silvestre and Marie Thursby.
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research in order to predict whether companies will fall into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance, and discusses what the determinants of accounting disclosures are. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." Wolfgang Marty, Senior Investment Strategist, AgaNola AG
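As a hedged sketch of the kind of microeconometric exercise described above, the snippet below fits a binary logit predicting financial distress from two firm-level ratios; the data are simulated and the variable names (leverage, liquidity) are purely illustrative.

```python
# Hedged sketch: a binary logit for financial distress on simulated firm data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 2000
leverage = rng.normal(size=n)        # illustrative firm ratios, simulated
liquidity = rng.normal(size=n)
index = -1.0 + 1.5 * leverage - 1.0 * liquidity
p = 1.0 / (1.0 + np.exp(-index))
distress = rng.binomial(1, p)        # 1 = firm in distress (toy outcome)

X = sm.add_constant(np.column_stack([leverage, liquidity]))
fit = sm.Logit(distress, X).fit(disp=0)
print(fit.params)                    # roughly [-1.0, 1.5, -1.0]
```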
Prepares readers to analyze data and interpret statistical results using the increasingly popular R more quickly than other texts, through lessR extensions which remove the need to program. By introducing R through lessR, readers learn how to organize data for analysis, read the data into R, and produce output without first writing numerous functions and programs. Readers can select the necessary procedure and change the relevant variables without programming. Quick Starts introduce readers to the concepts and commands reviewed in the chapters. Margin notes define, illustrate, and cross-reference the key concepts; when readers encounter a term previously discussed, the margin notes identify the page number of its initial introduction. Scenarios highlight the use of a specific analysis, followed by the corresponding R/lessR input and an interpretation of the resulting output. Numerous examples of output from psychology, business, education, and other social sciences demonstrate how to interpret results, and worked problems help readers test their understanding. The www.lessRstats.com website features the lessR program; the book's two data sets in standard text and SPSS formats, so readers can practice using R/lessR by working through the text examples and worked problems; PDF slides for each chapter; solutions to the book's worked problems; links to R/lessR videos to help readers better understand the program; and more. New to this edition: upgraded functionality and data visualizations of the lessR package, now aesthetically on par with the ggplot2 standard, and new features that replace and extend previous content, such as aggregating data with pivot tables via a simple lessR function call.
Herbert Scarf is a highly esteemed American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained worldwide recognition for his classic study on the stability of Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study of increasing returns and models of production in the presence of indivisibilities. Above all, however, the name of Scarf is remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics, termed Applied General Equilibrium Analysis, and a corresponding area in operations research known as Simplicial Fixed Point Methods. This collection of all his research articles consists of four volumes; this volume collects Herbert Scarf's papers in the area of Economics and Game Theory.
Econometric Model Specification reviews and extends the author's papers on consistent model specification testing and semi-nonparametric modeling and inference. This book consists of two parts. The first part discusses consistent tests of functional form of regression and conditional distribution models, including a consistent test of the martingale difference hypothesis for time series regression errors. In the second part, semi-nonparametric modeling and inference for duration and auction models are considered, as well as a general theory of the consistency and asymptotic normality of semi-nonparametric sieve maximum likelihood estimators. The volume also contains addenda and appendices that provide detailed proofs and extensions of all the results. It is uniquely self-contained and is a useful source for students and researchers interested in model specification issues.
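In the spirit of consistent functional-form testing, though not one of the book's own tests, a RESET-type check augments a fitted linear model with powers of its fitted values and F-tests their joint significance. The sketch below does this on simulated data where the true model is quadratic.

```python
# Hedged sketch: a RESET-type functional-form test on a misspecified linear fit.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(size=n)   # true model is quadratic

X = sm.add_constant(x)
restricted = sm.OLS(y, X).fit()                        # misspecified linear model
fhat = restricted.fittedvalues
X_aug = sm.add_constant(np.column_stack([x, fhat**2, fhat**3]))
unrestricted = sm.OLS(y, X_aug).fit()

f_stat, p_value, df_diff = unrestricted.compare_f_test(restricted)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")          # small p flags misspecification
```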
The objective of this book is the discussion and the practical illustration of techniques used in applied macroeconometrics. There are currently three competing approaches: the LSE (London School of Economics) approach, the VAR approach, and the intertemporal optimization/Real Business Cycle approach. This book discusses and illustrates the empirical research strategy of these three alternative approaches, pairing them with extensive discussions and replications of the relevant empirical work. Common benchmarks are used to evaluate the alternative approaches.
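Of the three approaches, the VAR leg is the easiest to show in miniature. The hedged sketch below simulates a two-variable VAR(1) and recovers its coefficient matrix with statsmodels; all parameter values are invented.

```python
# Hedged sketch: simulate and estimate a two-variable VAR(1).
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])            # true VAR(1) coefficient matrix (invented)
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

results = VAR(y).fit(maxlags=1)
print(results.coefs[0])               # estimate of A, roughly the matrix above
```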
Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of stochastic processes with continuous and discontinuous paths. It also covers a wide selection of popular models in finance and insurance, from Black-Scholes to stochastic volatility to interest rate to dynamic mortality. Through its many numerical and graphical illustrations and simple, insightful examples, this book provides a deep understanding of the scope of Monte Carlo methods and their use in various financial situations. The intuitive presentation encourages readers to implement and further develop the simulation methods.
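As a minimal sketch in the book's territory (illustrative parameters, not an example from the text), the snippet below prices a European call under Black-Scholes by plain Monte Carlo and checks the estimate against the closed-form formula.

```python
# Hedged sketch: Monte Carlo pricing of a European call vs. the closed form.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # illustrative parameters
rng = np.random.default_rng(8)
n = 1_000_000

Z = rng.standard_normal(n)                           # simulate terminal prices
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
mc_price = np.exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))

d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
print(mc_price, bs_price)   # the two agree to a few decimal places
```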
This book presents some of Arnold Zellner's outstanding contributions to the philosophy, theory and application of Bayesian analysis, particularly as it relates to statistics, econometrics and economics. The volume contains both previously published and new material which cite and discuss the work of Bayesians who have made a contribution by helping researchers and analysts in many professions to become more effective in learning from data and making decisions. Bayesian and non-Bayesian approaches are compared in several papers. Other articles include theoretical and applied results on estimation, model comparison, prediction, forecasting, prior densities, model formulation and hypothesis testing. In addition, a new information processing approach is presented that yields Bayes's Theorem as a perfectly efficient information processing rule. This volume will be essential reading for academics and students interested in quantitative methods as well as industrial analysts and government officials.
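Bayes's Theorem as a rule for learning from data can be shown in its simplest closed form: a conjugate Beta prior updated by binomial evidence. The sketch below is illustrative only; the prior and the data are invented.

```python
# Hedged sketch: conjugate Beta-binomial updating, Bayes's Theorem in closed form.
from scipy.stats import beta

a0, b0 = 2.0, 2.0               # Beta(2, 2) prior on a success probability (invented)
successes, failures = 30, 10    # observed binomial evidence (invented)

posterior = beta(a0 + successes, b0 + failures)   # posterior is Beta(32, 12)
print(posterior.mean())                           # about 0.727
print(posterior.interval(0.95))                   # central 95% credible interval
```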
This peer-reviewed volume is part of an annual series dedicated to the presentation and discussion of state-of-the-art studies in the application of management science to the solution of significant managerial decision-making problems. It is hoped that this research annual will significantly aid in the dissemination of actual applications of management science in both the public and private sectors. Volume 11 is directed toward the applications of mathematical programming to (1) multi-criteria decision making, (2) supply chain management, (3) performance management, and (4) risk analysis. It can be used in university classes in management science and operations research (in both management and engineering schools), and will also be of value to researchers and practitioners of management science and operations research.
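As a toy instance of the mathematical programming such a volume applies (not a problem from the book), the sketch below solves a two-product production-mix linear program with SciPy; all coefficients are invented.

```python
# Hedged sketch: a small production-mix linear program.
from scipy.optimize import linprog

# Maximize profit 3x + 5y subject to x + 2y <= 14, 3x - y >= 0, x - y <= 2,
# x, y >= 0. linprog minimizes, so the objective is negated.
c = [-3.0, -5.0]
A_ub = [[1.0, 2.0],     # x + 2y <= 14
        [-3.0, 1.0],    # 3x - y >= 0, rewritten as -3x + y <= 0
        [1.0, -1.0]]    # x - y <= 2
b_ub = [14.0, 0.0, 2.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimal mix (6, 4) with profit 38
```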
Economic Models for Industrial Organization focuses on the specification and estimation of econometric models for research in industrial organization. In recent decades, empirical work in industrial organization has moved towards dynamic and equilibrium models, involving econometric methods which have features distinct from those used in other areas of applied economics. These lecture notes, aimed for a first or second-year PhD course, motivate and explain these econometric methods, starting from simple models and building to models with the complexity observed in typical research papers. The covered topics include discrete-choice demand analysis, models of dynamic behavior and dynamic games, multiple equilibria in entry games and partial identification, and auction models.
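As a hedged sketch of the discrete-choice building block such lecture notes begin with, the snippet below computes multinomial logit choice probabilities as a softmax of deterministic utilities; the products, prices and taste parameters are all invented.

```python
# Hedged sketch: multinomial logit choice probabilities over J products.
import numpy as np

# Utility of product j: v_j = alpha * price_j + beta * quality_j (numbers made up).
prices = np.array([1.0, 1.5, 2.0])
quality = np.array([0.5, 1.0, 1.8])
alpha, beta = -1.2, 1.0
v = alpha * prices + beta * quality

expv = np.exp(v - v.max())      # numerically stabilized softmax
probs = expv / expv.sum()
print(probs, probs.sum())       # choice probabilities, summing to 1
```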