This book describes a system of mathematical models and methods that can be used to analyze real economic and managerial decisions and to improve their effectiveness. Application areas include: management of development and operation budgets, assessment and management of economic systems using an energy-entropy approach, exchange rate equations and forecasting of foreign exchange operations, evaluation of innovative projects, monitoring of governmental programs, risk management of investment processes, decisions on the allocation of resources, and identification of competitive industrial clusters. The proposed methods and models were tested on the example of Kazakhstan's economy, but the resulting solutions will be useful for applications at other levels and in other countries.

"Regarding your book Mathematical Methods and Models in Economics, I am impressed, because now is the time when econometrics is becoming more appreciated by economists and by the schools that host or employ modern economists. ... Your presented results really impressed me." John F. Nash, Jr., Princeton University, Nobel Memorial Prize in Economic Sciences

"The book is within my scope of interest because of its novelty and practicality. First, there is a need for realistic modeling of complex systems, both natural and artificial, including computer and economic systems. There has been an ongoing effort to develop models dealing with complexity and incomplete knowledge. Consequently, it is easy to recognize Mutanov's contribution in encapsulating economic modeling with an emphasis on budgeting and innovation. Second, the method proposed by Mutanov has been verified by applying it to the case of the Republic of Kazakhstan, with its vibrant emerging economy. Third, Chapter 5 of the book is of particular interest to the computer technology community because it deals with innovation. In summary, Mutanov's book should become one of the outstanding, recognized pragmatic guides for dealing with innovative systems." Andrzej Rucinski, University of New Hampshire

"This book is unique in its theoretical findings and practical applicability. The book is an illuminating study based on an applied mathematical model which uses methods such as linear programming and input-output analysis. Moreover, this work demonstrates the author's great insight and academic brilliance in the fields of finance, technological innovations and marketing vis-a-vis the market economy. From both theoretical and practical standpoints, this work is indeed a great achievement." Yeon Cheon Oh, President of Seoul National University
The beginning of the age of artificial intelligence and machine learning has created new challenges and opportunities for data analysts, statisticians, mathematicians, econometricians, computer scientists and many others. At the root of these techniques are algorithms and methods for clustering and classifying different types of large datasets, including time series data. Time Series Clustering and Classification includes relevant developments on observation-based, feature-based and model-based traditional and fuzzy clustering methods, feature-based and model-based classification methods, and machine learning methods. It presents a broad and self-contained overview of techniques for both researchers and students.

Features:
- Provides an overview of the methods and applications of pattern recognition for time series
- Covers a wide range of techniques, including unsupervised and supervised approaches
- Includes a range of real examples from medicine, finance, environmental science, and more
- R and MATLAB code and relevant data sets are available on a supplementary website
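As a minimal illustration of the feature-based clustering approach the book surveys, here is a sketch in Python (not the book's R/MATLAB code; the series are simulated): each time series is reduced to a small feature vector, which is then grouped with plain k-means.

```python
import math
import random

def sqdist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def features(ts):
    """Feature-based representation: mean, standard deviation, lag-1 autocorrelation."""
    n = len(ts)
    mean = sum(ts) / n
    var = sum((x - mean) ** 2 for x in ts) / n
    acf1 = 0.0 if var == 0 else sum(
        (ts[i] - mean) * (ts[i + 1] - mean) for i in range(n - 1)) / (n * var)
    return (mean, math.sqrt(var), acf1)

def kmeans(points, k, iters=20):
    """Plain k-means with deterministic farthest-first initialization."""
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(sqdist(p, c) for c in centers)))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda c: sqdist(p, centers[c]))].append(p)
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [min(range(k), key=lambda c: sqdist(p, centers[c])) for p in points]

# Two smooth (sinusoidal) series vs. two white-noise series
rng = random.Random(1)
smooth = [[math.sin(t / 3) + 0.05 * rng.gauss(0, 1) for t in range(200)] for _ in range(2)]
noise = [[rng.gauss(0, 1) for _ in range(200)] for _ in range(2)]
labels = kmeans([features(ts) for ts in smooth + noise], k=2)
print(labels)  # the two smooth series share one label, the two noisy series the other
```

The lag-1 autocorrelation feature is what separates the groups here: it is near one for the smooth series and near zero for white noise.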
The aim of this volume is to present new developments in the methodology and practice of CGE techniques as they apply to recent issues in international trade policy. The volume will be of interest to academic researchers working in trade policy analysis and applied general equilibrium, advanced graduate students in international economics, applied researchers in multilateral organizations, and policymakers who need to work with and interpret the results of CGE analysis.
The goal of this book is to assess the efficacy of India's financial deregulation programme by analyzing the developments in cost efficiency and total factor productivity growth across different ownership types and size classes in the banking sector over the post-deregulation years. The work also gauges the impact of inclusion or exclusion of a proxy for non-traditional activities on the cost efficiency estimates for Indian banks, and ranking of distinct ownership groups. It also investigates the hitherto neglected aspect of the nature of returns-to-scale in the Indian banking industry. In addition, the work explores the key bank-specific factors that explain the inter-bank variations in efficiency and productivity growth. Overall, the empirical results of this work allow us to ascertain whether the gradualist approach to reforming the banking system in a developing economy like India has yielded the most significant policy goal of achieving efficiency and productivity gains. The authors believe that the findings of this book could give useful policy directions and suggestions to other developing economies that have embarked on a deregulation path or are contemplating doing so.
In many applications of econometrics and economics, a large proportion of the questions of interest concern identification. An economist may be interested in uncovering the true signal when the data are very noisy, as in time-series spurious regression and weak instruments problems, to name a few. In this book, High-Dimensional Econometrics and Identification, we illustrate that the true signal, and hence identification, can be recovered even from noisy high-dimensional data, e.g., large panels. High-dimensional data in econometrics is the rule rather than the exception, and one of the tools for analyzing large, high-dimensional data is the panel data model. High-Dimensional Econometrics and Identification grew out of research work on identification and high-dimensional econometrics that we have collaborated on over the years. It aims to provide an up-to-date presentation of the issues of identification and high-dimensional econometrics, as well as insights into the use of these results in empirical studies. This book is designed for high-level graduate courses in econometrics and statistics, and can also be used as a reference for researchers.
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses, and is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First-edition contributors have returned to update their work, and the second edition includes updated versions of selected first-edition chapters. New chapters have been added on: approaches that need no a priori choice of weights (called multipliers) reflecting meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures; network DEA that can be used for measuring supply chain operations; and a selection of DEA applications in the service sector with a focus on building a conceptual framework, research design and interpreting results.
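The basic DEA models the handbook reviews are small linear programs. A hedged sketch of the input-oriented CCR (envelopment-form) efficiency score, on made-up toy data and using SciPy's `linprog` rather than any code from the handbook:

```python
# Input-oriented CCR DEA efficiency score via linear programming.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Efficiency of DMU j0.  X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Solves: min theta  s.t.  X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                 # minimize theta
    A_in = np.hstack([-X[:, [j0]], X])         # X lam - theta * x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

X = np.array([[2.0, 4.0, 6.0]])  # one input, three DMUs
Y = np.array([[1.0, 4.0, 3.0]])  # one output
scores = [round(ccr_efficiency(X, Y, j), 3) for j in range(3)]
print(scores)  # DMU 1 (input 4, output 4) has the best output/input ratio, score 1.0
```

Under constant returns to scale, each score equals the DMU's output/input ratio relative to the best ratio in the sample, which is why the first and third DMUs score 0.5 here.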
Volume 1 covers statistical methods related to unit roots, trend breaks and their interplay. Testing for unit roots has been a topic of wide interest, and the author was at the forefront of this research. The book covers important topics such as the Phillips-Perron unit root test and theoretical analyses of its properties, how this and other tests could be improved, the ingredients needed to achieve better tests, and the proposal of a new class of tests. Also included are theoretical studies related to time series models with unit roots and the effect of span versus sampling interval on the power of the tests. Moreover, this book deals with the issue of trend breaks and their effect on unit root tests. The research agenda fostered by the author showed that trend breaks and unit roots can easily be confused; hence the need for the new testing procedures, which are covered.

Volume 2 is about statistical methods related to structural change in time series models. The approach adopted is off-line, whereby one tests for structural change using a historical dataset and performs hypothesis testing. A distinctive feature is the allowance for multiple structural changes. The methods discussed have been, and continue to be, applied in a variety of fields including economics, finance, life science, physics and climate change. The articles included address issues of estimation, testing and/or inference in a variety of models: short-memory regressors and errors, trends with integrated and/or stationary errors, autoregressions, cointegrated models, multivariate systems of equations, endogenous regressors, long-memory series, among others. Other issues covered include the problem of non-monotonic power and the pitfalls of adopting a local asymptotic framework. Empirical analyses are provided for the US real interest rate, US GDP, the volatility of asset returns and climate change.
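For readers new to the area, the idea behind unit root testing can be sketched with the simpler Dickey-Fuller regression (an illustrative textbook example on simulated data, not the Phillips-Perron statistic itself, which adds a nonparametric correction for serial correlation):

```python
# Dickey-Fuller regression: regress the first difference on the lagged level
# and inspect the t-statistic on the lagged-level coefficient.
import numpy as np

def dickey_fuller_t(y):
    """t-statistic for rho in  dy_t = alpha + rho * y_{t-1} + e_t.
    Large negative values are evidence against a unit root."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
e = rng.standard_normal(500)
random_walk = np.cumsum(e)          # has a unit root: t-stat stays near zero
stationary = np.empty(500)          # AR(1) with phi = 0.5: strongly negative t-stat
stationary[0] = e[0]
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + e[t]
print(round(dickey_fuller_t(random_walk), 2), round(dickey_fuller_t(stationary), 2))
```

Note that under the null of a unit root this t-statistic follows the nonstandard Dickey-Fuller distribution, not the usual normal, which is part of what the theoretical work in this volume addresses.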
Four years ago Research in Experimental Economics published experimental evidence on fundraising and charitable contributions. This volume returns to the intriguing subject of philanthropy. Employing a mixture of laboratory and field experiments as well as theoretical research, we present this new volume, Charity with Choice. New waves of experiments are taking advantage of well-calibrated environments established by past efforts to add new features, such as endogeneity and self-selection, to experiments. Adventurous new research programs are popping up, and some of them are captured in this volume. Among the major themes in which the tools of choice, endogeneity, and self-selection are employed are: What increases or decreases charitable activity? And how do organizational and managerial issues affect the performance of non-profit organizations?
Herbert Scarf is a highly esteemed and distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. Above all, however, the name of Scarf is remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. The four-volume set comprises all his research articles; this volume collects Herbert Scarf's papers in the area of Applied Equilibrium Analysis.
The book describes the structure of the Keynes-Leontief Model (KLM) of Japan and discusses how the Japanese economy can overcome the long-term deflation that has persisted since the mid-1990s. Large-scale econometric models and their analysis have been important for planning policy measures and examining the economic structure of a country; however, developing and maintaining a KLM can be very costly. The book discusses how the KLM is developed and employed for policy analyses.
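The input-output core of any Keynes-Leontief-type model reduces to solving x = Ax + d for gross output x given final demand d, i.e. x = (I - A)^(-1) d. A toy two-sector sketch with made-up coefficients (not the Japanese KLM itself):

```python
# Leontief input-output solution: gross output x satisfying x = A x + d.
import numpy as np

A = np.array([[0.2, 0.3],    # A[i, j]: input from sector i per unit of sector j output
              [0.1, 0.4]])
d = np.array([100.0, 50.0])  # final demand by sector

x = np.linalg.solve(np.eye(2) - A, d)
print(np.round(x, 1))  # gross output needed to meet final demand, intermediates included
```

Gross output exceeds final demand because each sector must also produce the intermediate inputs the other sectors consume; the Keynesian side of a KLM then closes the model by making final demand itself endogenous.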
This volume of Advances in Econometrics contains a selection of papers initially presented at the 7th Annual Advances in Econometrics Conference held on the LSU campus in Baton Rouge, Louisiana, during November 14-16, 2008. The theme of the conference was 'Nonparametric Econometric Methods', and the papers selected for inclusion span a range of nonparametric techniques, including kernel smoothing, empirical copulas, series estimators, and smoothing splines, along with a variety of semiparametric methods. The papers cover topics of interest to those who wish to familiarize themselves with current nonparametric methodology, and many also identify areas deserving of future attention. There are survey papers devoted to recent developments in nonparametric finance, constrained nonparametric regression, semiparametric/nonparametric environmental econometrics and nonparametric models with non-stationary data. There are theoretical papers dealing with novel approaches for partial identification of the distribution of treatment effects, fixed effects semiparametric panel data models, functional coefficient models with time series data, exponential series estimators of empirical copulas, estimation of multivariate CDFs and bias-reduction methods for density estimation. There are also a number of applications that analyze returns to education, the evolution of income and life expectancy, the role of governance in growth, farm production, city size and unemployment rates, derivative pricing, and environmental pollution and economic growth. In short, this volume contains a range of theoretical developments, surveys, and applications of interest to those who wish to keep abreast of some of the most important current developments in the field of nonparametric estimation.
The conference 'Measurement Error: Econometrics and Practice' was recently hosted by Aston University and organised jointly by researchers from Aston University and Lund University to highlight the enormous problems caused by measurement error in economic and financial data, which often go largely unnoticed. Thanks to sponsorship from Eurostat, a number of distinguished researchers were invited to present keynote lectures. Professor Arnold Zellner from the University of Chicago shared his knowledge of measurement error in general; Professor William Barnett from the University of Kansas gave a lecture on the implications of measurement error for monetary policy; and Dennis Fixler shared his knowledge of how statistical agencies deal with measurement errors. This volume is the result of a selection of high-quality papers presented at the conference and is designed to draw attention to the enormous problem in econometrics of measurement error in data provided by the world's leading statistical agencies, highlighting the consequences of data error and offering solutions to deal with such problems. The volume should appeal to economists, financial analysts and practitioners interested in studying and solving economic problems and building econometric models in everyday operations.
It is impossible to understand modern economics without knowledge of the basic tools of game theory and mechanism design. This book provides a graduate-level introduction to the economic modeling of strategic behavior. The goal is to teach economics doctoral students the tools of game theory and mechanism design that all economists should know.
Exotic Betting at the Racetrack is unique in covering the efficient/inefficient-markets strategy for pricing and finding profitable racetrack bets, along with handicapping, and it presents actual bets made by the author on essentially all of the major wagers offered at US racetracks. The book starts with efficiency, the accuracy of the win odds, arbitrage, and optimal betting strategies. Examples and actual bets are shown for various wagers, including win, place and show, exacta, quinella, double, trifecta, superfecta, Pick 3, 4 and 6, and rainbow Pick 5 and 6. There are discussions of major races, including the Breeders' Cup, Pegasus, Dubai World Cup and the US Triple Crown from 2012-2018. Dosage analysis is also described and used. An additional feature concerns great horses, such as the great mares Rachel Alexandra, Zenyatta, Goldikova, Treve, Beholder and Song Bird. There is a discussion of horse ownership and a tour through the Italian stables and horses of Federico Tesio, arguably the world's top trainer.
Originally published in 1971, this is a rigorous analysis of the economic aspects of the efficiency of public enterprises at the time. The author first restates and extends the relevant parts of welfare economics, and then illustrates its application to particular cases, drawing on the work of the National Board for Prices and Incomes, of which he was Deputy Chairman. The analysis is developed stage by stage, with the emphasis on applicability and ease of comprehension, rather than on generality or mathematical elegance. Financial performance, the second-best, the optimal degree of complexity of price structures and problems of optimal quality are first discussed in a static framework. Time is next introduced, leading to a marginal cost concept derived from a multi-period optimizing model. The analysis is then related to urban transport, shipping, gas and coal. This is likely to become a standard work of more general scope than the author's earlier book on electricity supply. It rests, however, on a similar combination of economic theory and high-level experience of the real problems of public enterprises.
Non-market valuation has become a broadly accepted and widely practiced means of measuring the economic values of the environment and natural resources. In this book, the authors provide a guide to the statistical and econometric practices that economists employ in estimating non-market values. The authors develop the econometric models that underlie the basic methods: contingent valuation, travel cost models, random utility models and hedonic models. They analyze the measurement of non-market values as a procedure with two steps: the estimation of parameters of demand and preference functions, and the calculation of benefits from the estimated models. Each of the models is carefully developed from the preference function to the behavioral or response function that researchers observe. The models are then illustrated with datasets that characterize the kinds of data researchers typically deal with. The real-world data and clarity of writing in this book will appeal to environmental economists, students, researchers and practitioners in multilateral banks and government agencies.
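The two-step procedure described above (estimate a demand or preference function, then compute benefits from the estimates) can be sketched for a simple linear travel cost model. The data and parameters below are simulated for illustration; they are not from the book's datasets, and real travel cost studies typically use count-data models rather than OLS:

```python
# Two-step non-market valuation with a linear travel cost demand function.
import numpy as np

rng = np.random.default_rng(3)
cost = rng.uniform(5, 50, 400)  # travel cost per trip for 400 visitors
trips = np.maximum(10 - 0.15 * cost + rng.standard_normal(400), 0)

# Step 1: estimate the trip-demand function  trips = a + b * cost  by OLS.
X = np.column_stack([np.ones(400), cost])
(a, b), *_ = np.linalg.lstsq(X, trips, rcond=None)

# Step 2: compute benefits from the estimated model. For linear demand,
# per-person consumer surplus is the triangle  trips^2 / (2 * |b|).
surplus = trips.mean() ** 2 / (2 * abs(b))
print(round(b, 3), round(surplus, 1))
```

The welfare number in step 2 is entirely a function of the parameters estimated in step 1, which is why the book devotes so much attention to getting that first-stage econometrics right.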
Learn more about modern econometrics with this comprehensive introduction to the field, featuring engaging applications that bring contemporary theories to life. Introduction to Econometrics, 4th Edition, Global Edition by Stock and Watson is an introductory guide that connects modern theory with motivating, engaging applications. The text ensures you get a solid grasp of this challenging subject's theoretical background, building on the philosophy that applications should drive the theory, not the other way around. The latest edition maintains its focus on currency and empirical analysis, incorporating real-world questions and data and using results directly relevant to the applications. The text contextualises the study of econometrics with a comprehensive introduction and review of economics, data, and statistics before proceeding to an extensive treatment of regression analysis, covering the different variables and regression parameters. With large data sets increasingly used in economics and related fields, a new chapter dedicated to Big Data will help you learn more about this growing and exciting area. Offering a variety of resources and tools to support your understanding and critical thinking, such as General Interest boxes, end-of-chapter summaries, and empirical exercises, this industry-leading text will help you acquire a sophisticated knowledge of this fascinating subject. Reach every student by pairing this text with Pearson MyLab(R) Economics, the teaching and learning platform that empowers you to reach every student. By combining trusted author content with digital tools and a flexible platform, MyLab personalises the learning experience and improves results for each student.
If you would like to purchase both the physical text and MyLab Economics, search for 9781292264561, Introduction to Econometrics, 4th Edition, Global Edition with MyLab Economics. The package consists of:
- 9781292264455 Introduction to Econometrics, 4th Edition, Global Edition
- 9781292264516 Introduction to Econometrics, 4th Edition, Global Edition MyLab Economics
- 9780136879787 Introduction to Econometrics, 4th Edition, Global Edition Pearson eText
Pearson MyLab(R) Economics is not included with the standalone text. Students: if Pearson MyLab Economics is a recommended or mandatory component of the course, please ask your instructor for the correct ISBN; it should only be purchased when required by an instructor. Instructors: contact your Pearson representative for more information.
Originally published in 1984. This book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the macroeconomic trend of re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. The approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
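The first approach mentioned, regressing spot rates on lagged forward rates, tests whether the forward rate is an unbiased predictor of the next spot rate (slope near one). A hedged sketch on simulated data, not the book's Canadian or European series:

```python
# Spot-on-lagged-forward regression:  s_t = a + b * f_{t-1} + e_t.
# Under the joint efficiency/unbiasedness hypothesis, b should be close to 1.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
spot = np.empty(n + 1)
spot[0] = 1.5
for t in range(n):
    spot[t + 1] = spot[t] + 0.01 * rng.standard_normal()  # spot rate random walk
# Simulated forward rate: the market's (unbiased) forecast plus small noise.
forward = spot[:-1] + 0.002 * rng.standard_normal(n)

# OLS of s_t on f_{t-1}
X = np.column_stack([np.ones(n), forward])
(a, b), *_ = np.linalg.lstsq(X, spot[1:], rcond=None)
print(round(b, 2))  # near 1 when the forward rate is an unbiased predictor
```

In real data the joint-hypothesis problem discussed in the book means a slope far from one can reflect either market inefficiency or a failure of the assumed equilibrium model.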
World Statistics on Mining and Utilities 2018 provides a unique biennial overview of the role of mining and utility activities in the world economy. This extensive resource from UNIDO provides detailed time series data on the level, structure and growth of international mining and utility activities by country and sector. Country level data is clearly presented on the number of establishments, employment and output of activities such as: coal, iron ore and crude petroleum mining as well as production and supply of electricity, natural gas and water. This unique and comprehensive source of information meets the growing demand of data users who require detailed and reliable statistical information on the primary industry and energy producing sectors. The publication provides internationally comparable data to economic researchers, development strategists and business communities who influence the policy of industrial development and its environmental sustainability.
Originating in economics but now used in a variety of disciplines, including medicine, epidemiology and the social sciences, this book provides accessible coverage of the theoretical foundations of the Logit model as well as its applications to concrete problems. It is written not only for economists but for researchers working in disciplines where it is necessary to model qualitative random variables. J.S. Cramer has also provided data sets on which to practice Logit analysis.
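For readers who want to see the mechanics, here is a minimal logit fit by Newton-Raphson on simulated data (an illustrative sketch, not Cramer's datasets or code):

```python
# Maximum-likelihood logit:  P(y = 1 | x) = 1 / (1 + exp(-x @ beta)).
import numpy as np

def fit_logit(X, y, iters=25):
    """Newton-Raphson maximization of the logit log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        grad = X.T @ (y - p)           # score vector
        info = X.T @ (X * W[:, None])  # Fisher information (negative Hessian)
        beta = beta + np.linalg.solve(info, grad)
    return beta

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
X = np.column_stack([np.ones(2000), x])
true_beta = np.array([-0.5, 1.5])
y = (rng.random(2000) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
beta_hat = fit_logit(X, y)
print(np.round(beta_hat, 2))  # estimates close to the true (-0.5, 1.5)
```

Because the logit log-likelihood is globally concave, Newton-Raphson from a zero start converges reliably here; statistical packages implement essentially this iteration.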
This book is based on two Sir Richard Stone lectures at the Bank of England and the National Institute for Economic and Social Research. Largely non-technical, the first part of the book covers some of the broader issues involved in Stone's and others' work in statistics. It explores the more philosophical issues attached to statistics, econometrics and forecasting and describes the paradigm shift back to the Bayesian approach to scientific inference. The first part concludes with simple examples from the different worlds of educational management and golf clubs. The second, more technical part covers in detail the structural econometric time series analysis (SEMTSA) approach to statistical and econometric modeling.
Introduction to Financial Mathematics: Option Valuation, Second Edition is a well-rounded primer to the mathematics and models used in the valuation of financial derivatives. The book consists of fifteen chapters, the first ten of which develop option valuation techniques in discrete time, the last five describing the theory in continuous time. The first half of the textbook develops basic finance and probability. The author then treats the binomial model as the primary example of discrete-time option valuation. The final part of the textbook examines the Black-Scholes model. The book is written to provide a straightforward account of the principles of option pricing and examines these principles in detail using standard discrete and stochastic calculus models. Additionally, the second edition has new exercises and examples, and includes many tables and graphs generated by over 30 MS Excel VBA modules available on the author's webpage https://home.gwu.edu/~hdj/.
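The binomial model the text develops as its primary discrete-time example prices an option by backward induction over a recombining tree. A standard Cox-Ross-Rubinstein sketch (not code from the book or its VBA modules):

```python
# Cox-Ross-Rubinstein binomial price of a European call.
import math

def crr_call(S0, K, r, sigma, T, n):
    """European call priced on an n-step CRR tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1 / u                             # down factor
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs, then discount back one step at a time
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for step in range(n, 0, -1):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j]) for j in range(step)]
    return values[0]

price = crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500)
print(round(price, 2))  # converges to the Black-Scholes value, about 10.45
```

As the number of steps grows, the binomial price converges to the continuous-time Black-Scholes price, which is exactly the bridge the book builds between its discrete and continuous halves.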
Originally published in 1979. This book addresses three questions regarding uncertainty in economic life: how do we define uncertainty and use the concept meaningfully to draw conclusions; how can the level of uncertainty associated with a particular variable of economic interest be measured; and does experience support the view that uncertainty really matters? It develops a theory of the effect of price uncertainty on production and trade, takes a graphical approach to the effects of a mean-preserving spread to derive rules for ordering distributions, and finishes with an econometric analysis of the effect of Brazil's adoption of a crawling peg in reducing real exchange rate uncertainty. This is an important early study of the significance of uncertainty.
The theme of this book is health outcomes in India, in particular outcomes relating to its caste and religious groups and, within these groups, to their women and children. The book's tenor is analytical, based upon a rigorous examination of recent data from both government and non-government sources. The major areas covered are sanitation, mothers' use of the government's child development services, child malnutrition, deaths in families, gender discrimination, and the measurement of welfare.
In this testament to the distinguished career of H.S. Houthakker, a number of Professor Houthakker's friends, former colleagues and former students offer essays that build upon and extend his many contributions to economics in aggregation, consumption, growth and trade. Among the many distinguished contributors are Paul Samuelson, Werner Hildenbrand, John Muellbauer and Lester Telser. The book also includes four previously unpublished papers and notes by its distinguished dedicatee.
You may like...
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
Operations And Supply Chain Management by David Collier, James Evans (Hardcover)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,567)