The manuscript reviews some key ideas about artificial intelligence and relates them to economics. These include its relation to robotics, and the concepts of synthetic emotions, consciousness, and life. The economic implications of the advent of artificial intelligence, such as its effect on prices and wages, appropriate patent policy, and the possibility of accelerating productivity, are discussed. The growing field of artificial economics and the use of artificial agents in experimental economics are also considered.
Macroeconomic Modelling has undergone radical changes in the last few years. There has been considerable innovation in developing robust solution techniques for the new breed of increasingly complex models. Similarly there has been a growing consensus on their long run and dynamic properties, as well as much development on existing themes such as modelling expectations and policy rules. This edited volume focuses on those areas which have undergone the most significant and imaginative developments and brings together the very best of modelling practice. We include specific sections on (I) Solving Large Macroeconomic Models, (II) Rational Expectations and Learning Approaches, (III) Macro Dynamics, and (IV) Long Run and Closures. All of the contributions offer new research whilst putting their developments firmly in context and as such will influence much future research in the area. It will be an invaluable text for those in policy institutions as well as academics and advanced students in the fields of economics, mathematics, business and government. Our contributors include researchers working in central banks, the IMF and the European Commission, as well as established academics.
Econometric models are made up of assumptions which never exactly match reality. Among the most contested ones is the requirement that the coefficients of an econometric model remain stable over time. Recent years have therefore seen numerous attempts to test for stability or to model possible structural change when it can no longer be ignored. This collection of papers from Empirical Economics mirrors part of this development. The point of departure of most studies in this volume is the standard linear regression model $y_t = x_t'\beta_t + u_t$ $(t = 1, \ldots, T)$, where the notation is obvious and where the index t emphasises the fact that structural change is mostly discussed and encountered in a time series context. It is much less of a problem for cross section data, although many tests apply there as well. The null hypothesis of most tests for structural change is that $\beta_t = \beta_0$ for all t, i.e. that the same regression applies to all time periods in the sample and that the disturbances $u_t$ are well behaved. The well known Chow test, for instance, assumes that there is a single structural shift at a known point in time, i.e. that $\beta_t = \beta_0$ $(t < t^*)$ and $\beta_t = \beta_0 + \Delta\beta$ $(t \ge t^*)$, where $t^*$ is known.
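To make the single-break test described above concrete, here is a minimal sketch of a Chow test in Python; the function and variable names (chow_test, y, X, t_star) are illustrative rather than taken from the book, and the break point t* is assumed known, as in the classical formulation.

    # Chow test for a single structural break at a known point t_star:
    # compare the pooled SSR against the sum of pre- and post-break SSRs.
    import numpy as np
    from scipy import stats

    def chow_test(y, X, t_star):
        def ssr(y_sub, X_sub):
            beta, *_ = np.linalg.lstsq(X_sub, y_sub, rcond=None)
            resid = y_sub - X_sub @ beta
            return resid @ resid

        n, k = X.shape
        ssr_pooled = ssr(y, X)                  # restricted: beta_t = beta_0 for all t
        ssr_pre = ssr(y[:t_star], X[:t_star])   # separate fit before the break
        ssr_post = ssr(y[t_star:], X[t_star:])  # separate fit from the break onward
        f_stat = ((ssr_pooled - ssr_pre - ssr_post) / k) / \
                 ((ssr_pre + ssr_post) / (n - 2 * k))
        p_value = stats.f.sf(f_stat, k, n - 2 * k)
        return f_stat, p_value

Under the null of no break, f_stat follows an F(k, n - 2k) distribution, so a small p_value is evidence that the coefficients shifted at t*.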
This important collection brings together leading econometricians to discuss advances in the econometrics of panel data. The papers in this collection can be grouped into two categories. The first, which includes chapters by Amemiya, Baltagi, Arellano, Bover and Labeaga, primarily deals with different aspects of limited dependent variables and sample selectivity. The second group of papers, including those by Nerlove, Schmidt and Ahn, Kiviet, Davies and Lahiri, considers issues that arise in the estimation of dynamic (possibly) heterogeneous panel data models. Overall, the contributors focus on the issues of simplifying complex real-world phenomena into easily generalisable inferences from individual outcomes. As the contributions of G. S. Maddala in the fields of limited dependent variables and panel data were particularly influential, it is a fitting tribute that this volume is dedicated to him.
Swaps, futures, options, structured instruments - a wide range of derivative products is traded in today's financial markets. Analyzing, pricing and managing such products often requires fairly sophisticated quantitative tools and methods. This book serves as an introduction to financial mathematics with special emphasis on aspects relevant in practice. In addition to numerous illustrative examples, algorithmic implementations are demonstrated using "Mathematica" and the software package "UnRisk" (available for both students and teachers). The content is organized in 15 chapters that can be treated as independent modules. In particular, the exposition is tailored for classroom use in a Bachelor or Master program course, as well as for practitioners who wish to further strengthen their quantitative background.
This book reports the results of five empirical studies undertaken in the early seventies by a collaboration headed by Professor Morishima. It deals with applications of the general equilibrium models whose theoretical aspects have been one of Professor Morishima's main interests. Four main econometric models are constructed for the USA, the UK, and Japan. These are used as a basis for the discussion of various topics in economic theory, such as: the existence and stability or instability of the neoclassical path of full employment growth equilibrium and a von Neumann-type path of balanced growth at constant prices; the antinomy between price stability and full employment; the Samuelson-Le Chatelier principle; the theory of the balanced-budget multiplier; the three Hicksian laws of the gross substitutes system; the Brown-Jones super-multipliers of international trade, and so on. In addition, this 1972 work makes a quantitative evaluation for the US economy of monetary and fiscal policies as short-run measures for achieving full employment; the effectiveness of built-in flexibility of taxes in the UK economy is discussed; and estimates are made of the rapid decrease in disguised unemployment in post-war Japan.
This tutorial presents a hands-on introduction to a new discrete choice modeling approach based on the behavioral notion of regret-minimization. This so-called Random Regret Minimization approach (RRM) forms a counterpart of the Random Utility Maximization approach (RUM) to discrete choice modeling, which has for decades dominated the field of choice modeling and adjacent fields such as transportation, marketing and environmental economics. Being as parsimonious as conventional RUM models and compatible with popular software packages, the RRM approach provides an alternative and appealing account of choice behavior. Rather than providing the highly technical discussions usually encountered in scholarly journals, this tutorial aims to allow readers to explore the RRM approach and its potential and limitations hands-on, based on a detailed discussion of examples. This tutorial is written for students, scholars and practitioners who have a basic background in choice modeling in general and RUM modeling in particular. Care has been taken to ensure that all concepts and results are clear to readers who do not have an advanced knowledge of econometrics.
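To give a feel for what distinguishes regret-based from utility-based choice probabilities, here is a minimal sketch in Python of one common RRM formulation (attribute-level regret summed over competing alternatives, in the spirit of Chorus's model); the arrays x and beta are illustrative assumptions, not data from the tutorial.

    # Random Regret Minimization choice probabilities: an alternative incurs
    # regret whenever a competitor beats it on an attribute; lower total
    # regret means a higher choice probability.
    import numpy as np

    def rrm_choice_probabilities(x, beta):
        # x: (n_alternatives, n_attributes) attribute matrix
        # beta: attribute taste parameters
        n_alt = x.shape[0]
        regret = np.zeros(n_alt)
        for i in range(n_alt):
            for j in range(n_alt):
                if j != i:
                    # attribute-by-attribute regret of i relative to j
                    regret[i] += np.sum(np.log1p(np.exp(beta * (x[j] - x[i]))))
        exp_neg = np.exp(-regret)   # logit over negative regret
        return exp_neg / exp_neg.sum()

    # illustrative use: three alternatives described by two attributes
    probs = rrm_choice_probabilities(
        np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]]),
        np.array([0.5, -0.8]))

Because regret depends on comparisons with all competing alternatives, RRM probabilities can exhibit compromise effects that a standard RUM logit does not.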
Charles de Gaulle begins his Mémoires d'Espoir thus: 'France comes from the depths of the ages. She lives. The centuries call to her. Yet she remains herself through time. Her borders may shift without any change in the relief, the climate, the rivers and the seas that mark her indefinitely. There dwell peoples who, over the course of history, have been gripped by the most diverse trials, but whom the nature of things, harnessed by politics, ceaselessly moulds into a single nation. This nation has embraced many generations. It currently comprises several. It will give birth to many more. But by the geography of the country that is hers, by the genius of the peoples who compose her, and by the neighbours that surround her, she takes on a constant character which makes the French of each era dependent on their fathers and commits them to their descendants. Unless it breaks apart, this human whole, on this territory, within this universe, therefore carries a past, a present and a future that are indissoluble. And so the State, which answers for France, is charged at once with her heritage of yesterday, her interests of today, and her hopes for tomorrow.' In the light of this idea of the nation, it is clear that dialogue between nations is eminently important, and that the Semaine Universitaire Franco-Néerlandaise is an institution for stimulating that dialogue.
This book was first published in 1989. Inference and prediction in human affairs are characterised by a cognitive and reactive sample space, the elements of which are aware both of the statistician and of each other. It is therefore not surprising that methodologies borrowed from classical statistics and the physical sciences have yielded disappointingly few lasting empirical insights and have sometimes failed in predictive mode. This book puts the underlying methodology of socioeconomic statistics on a firmer footing by placing it within the ambit of inferential and predictive games. It covers such problems as learning, publication, non-response, strategic response, the nature and possibility of rational expectations, time inconsistency, intrinsic nonstationarity, and the existence of probabilities. Ideas are introduced such as real-time survey schemes, argument instability and reaction-proof forecasting based on stochastic approximation. Applications are canvassed to such topics as attitude measurement, political polling, econometric modelling under heterogeneous information, and the forecasting of hallmark events.
Game Theory has provided an extremely useful tool in enabling economists to venture into unknown areas. Its concepts of conflict and cooperation apply whenever the actions of several agents are interdependent, providing a language with which to formulate, structure, analyze, and understand strategic scenarios. Economic Behavior, Game Theory, and Technology in Emerging Markets explores game theory and its deep impact on development economics, specifically the manner in which it provides a way of formalizing institutions. This is particularly important for emerging economies, which have not yet received much attention in the academic world. This publication is useful for academics, professors, and researchers in this field, but it has also been compiled to meet the needs of non-specialists.
This book was first published in 1995. The problem of disparities between different estimates of GDP is well known and widely discussed. Here, the authors describe a method for examining the discrepancies using a technique allocating them with reference to data reliability. The method enhances the reliability of the underlying data and leads to maximum-likelihood estimates. It is illustrated by application to the UK national accounts for the period 1920-1990. The book includes a full set of estimates for this period, including runs of industrial data for the period 1948-1990, which are longer than those available from any other source. The statistical technique allows estimates of standard errors of the data to be calculated and verified; these are presented both for data in levels and for changes in variables over 1-, 2- and 5-year periods.
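The allocation-by-reliability idea can be sketched with a generic least-squares balancing scheme (a sketch of the general form of such methods, not necessarily the authors' exact formulation): let x̂ be the vector of initial estimates, V the covariance matrix describing their reliability, and Ax = b the accounting identities the final estimates must satisfy. The balanced estimates, which are maximum-likelihood under normal errors, solve

    \[
      \min_x \;(x - \hat{x})^\top V^{-1} (x - \hat{x})
      \quad \text{s.t.} \quad A x = b,
      \qquad\Longrightarrow\qquad
      x^{\ast} = \hat{x} - V A^\top (A V A^\top)^{-1} (A \hat{x} - b),
    \]

so that less reliable series (larger variances in V) absorb more of each discrepancy.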
New Directions in Computational Economics brings together for the first time a diverse selection of papers, sharing the underlying theme of the application of computing technology as a tool for achieving solutions to realistic problems in computational economics and related areas in the environmental, ecological and energy fields. Part I of the volume addresses experimental and computational issues in auction mechanisms, including a survey of recent results for sealed bid auctions. The second contribution uses neural networks as the basis for estimating bid functions for first price sealed bid auctions. Also presented is the 'smart market' computational mechanism, which better matches bids and offers for natural gas. Part II consists of papers that formulate and solve models of economic systems. Amman and Kendrick's paper deals with control models and the computational difficulties that result from nonconvexities. Using goal programming, Nagurney, Thore and Pan formulate spatial resource allocation models to analyze various policy issues. Thompson and Thrall next present a rigorous mathematical analysis of the relationship between efficiency and profitability. The problem of matching uncertain streams of assets and liabilities is solved using stochastic optimization techniques in the following paper in this section. Finally, Part III applies economic concepts to issues in computer science in addition to using computational techniques to solve economic models.
First published in 2004, this is a rigorous but user-friendly book on the application of stochastic control theory to economics. A distinctive feature of the book is that mathematical concepts are introduced in a language and terminology familiar to graduate students of economics. The standard topics of many mathematics, economics and finance books are illustrated with real examples documented in the economic literature. Moreover, the book emphasises the dos and don'ts of stochastic calculus, cautioning the reader that certain results and intuitions cherished by many economists do not extend to stochastic models. A special chapter (Chapter 5) is devoted to exploring various methods of finding a closed-form representation of the value function of a stochastic control problem, which is essential for ascertaining the optimal policy functions. The book also includes many practice exercises for the reader. Notes and suggested readings are provided at the end of each chapter for more references and possible extensions.
All humans eventually die, but life expectancies differ over time and among different demographic groups. Teasing out the various causes and correlates of death is a challenge, and it is one we take on in this book. A look at the data on mortality is both interesting and suggestive of some possible relationships. In 1900 life expectancies at birth were 46.3 and 48.3 years for men and women respectively, a gender differential of a bit less than 5 percent. Life expectancies for whites then were about 0.3 years longer than that of the whole population, but life expectancies for blacks were only about 33 years for men and women. At age 65, the remaining life expectancies were about 12 and 11 years for whites and blacks respectively. Fifty years later, life expectancies at birth had grown to 66 and 71 years for males and females respectively. The percentage differential between the sexes was now almost up to 10 percent. The life expectancies of whites were about one year longer than that for the entire population. The big change was for blacks, whose life expectancy had grown to over 60 years with black females living about 5 percent longer than their male counterparts. At age 65 the remaining expected life had increased about two years with much larger percentage gains for blacks.
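As a quick arithmetic check of the 1900 gender differential quoted above:

    \[
      \frac{48.3 - 46.3}{46.3} \approx 0.043,
    \]

i.e. women's life expectancy at birth exceeded men's by roughly 4.3 percent, consistent with 'a bit less than 5 percent'.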
Cost Structure and the Measurement of Economic Performance is designed to provide a comprehensive guide for students, researchers or consultants who wish to model, construct, interpret, and use economic performance measures. The topical emphasis is on productivity growth and its dependence on the cost structure. The methodological focus is on application of the tools of economic analysis - the 'thinking structure' provided by microeconomic theory - to measure technological or cost structure, and link it with market and regulatory structure. This provides a rich basis for evaluation of economic performance and its determinants. The format of the book stresses topics or questions of interest rather than the theoretical tools for analysis. Traditional productivity growth modeling and measurement practices that result in a productivity residual often called the 'measure of our ignorance' are initially overviewed, and then the different aspects of technological, market and regulatory structure that might underlie this residual are explored. The ultimate goal is to decompose or explain the residual, by modeling and measuring a multitude of impacts that determine the economic performance of firms, sectors, and economies. The chapters are organized with three broad goals in mind. The first is to introduce the overall ideas involved in economic performance measurement and traditional productivity growth analysis. Issues associated with different types of (short and long run, internal and external) cost economies, market and regulatory impacts, and other general cost efficiencies that might impact these measures are then explored. Finally, some of the theoretical, data construction and econometric tools necessary to justify and implement these models are emphasized.
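For orientation, the 'measure of our ignorance' mentioned above is conventionally written, in standard growth-accounting notation (a textbook identity rather than this book's own formulation), as output growth left unexplained by share-weighted input growth:

    \[
      \underbrace{\frac{\dot{A}}{A}}_{\text{productivity residual}}
      \;=\; \frac{\dot{Y}}{Y} \;-\; s_K \frac{\dot{K}}{K} \;-\; s_L \frac{\dot{L}}{L},
    \]

where Y is output, K and L are capital and labour inputs, and s_K and s_L are their cost shares; decomposing this residual into cost-structure, market and regulatory effects is the exercise the book pursues.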
Supply chain management (SCM) strives to create competitive advantage and value for customers by integrating business processes from end users through to original suppliers. However, the question of how SCM influences the value of a firm is not fully answered. Various conceptual frameworks that explain the coherence of SCM and company value, comprehended as value-based SCM, are well accepted in scientific research, but quantitative approaches to value-based SCM are found rather seldom. The book contributes to this research gap by proposing quantitative models that allow for assessing influences of SCM on the value of a firm. Opposed to existing models that limit the observation to chosen facets of SCM or selected value drivers, this holistic approach is adequate to
* reflect configurational and operational aspects of SCM,
* cover all phases of the product life cycle,
* financially compare value impacts of profitability-related and asset-related value drivers, and
* assess influences of dynamics and uncertainties on company value.
A. Dogramaci and N.R. Adam: Productivity of a firm is influenced both by economic forces which act at the macro level and impose themselves on the individual firm, and by internal factors that result from decisions and processes which take place within the boundaries of the firm. Efforts towards increasing the productivity level of firms need to be based on a sound understanding of how the above processes take place. Our objective in this volume is to present some of the recent research work in this field. The volume consists of three parts. In Part I, two macro issues are addressed (taxation and inflation) and their relation to productivity is analyzed. The second part of the volume focuses on methods for productivity analysis within the firm. Finally, the third part of the book deals with two additional productivity analysis techniques and their applications to public utilities. The objective of the volume is not to present a unified point of view, but rather to cover a sample of different methodologies and perspectives through original, scholarly papers.
Studies in Consumer Demand - Econometric Methods Applied to Market Data contains eight previously unpublished studies of consumer demand. Each study stands on its own as a complete econometric analysis of demand for a well-defined consumer product. The econometric methods range from simple regression techniques applied in the first four chapters, to the use of logit and multinomial logit models used in chapters 5 and 6, to the use of nested logit models in chapters 6 and 7, and finally to the discrete/continuous modeling methods used in chapter 8. Emphasis is on applications rather than econometric theory. In each case, enough detail is provided for the reader to understand the purpose of the analysis, the availability and suitability of data, and the econometric approach to measuring demand.
This friendly guide is the companion you need to convert pure mathematics into understanding and facility with a host of probabilistic tools. The book provides a high-level view of probability and its most powerful applications. It begins with the basic rules of probability and quickly progresses to some of the most sophisticated modern techniques in use, including Kalman filters, Monte Carlo techniques, machine learning methods, Bayesian inference and stochastic processes. It draws on thirty years of experience in applying probabilistic methods to problems in computational science and engineering, and numerous practical examples illustrate where these techniques are used in the real world. Topics of discussion range from carbon dating to Wasserstein GANs, one of the most recent developments in Deep Learning. The underlying mathematics is presented in full, but clarity takes priority over complete rigour, making this text a starting reference source for researchers and a readable overview for students.
All former Soviet Union countries experience their past as a heavy burden. It led to the centralisation of scientific personnel, the separation of research from teaching at universities, and a concentration of certain branches of technology in different parts of the Union. This has given rise to a one-sided technology and science potential which frequently cannot be sufficiently supported due to a lack of adequate finance. Cooperation between the Baltic States themselves is often hampered by an exaggerated sense of national identity, and international cooperation can be made difficult by linguistic problems. A critical issue is finance. The Baltic States themselves are experiencing budgetary constraints, and the West is cutting back on funding. The analytical issues dealt with here include specific questions, such as in the sectors of energy policy, electrical equipment and electronics, and environmental considerations. The transfer of technology is also discussed, as is security: there is the possibility that science and scientific results can be obtained from the former Soviet Union at low cost by the criminal community.
The finite-dimensional nonlinear complementarity problem (NCP) is a system of finitely many nonlinear inequalities in finitely many nonnegative variables along with a special equation that expresses the complementary relationship between the variables and corresponding inequalities. This complementarity condition is the key feature distinguishing the NCP from a general inequality system, lies at the heart of all constrained optimization problems in finite dimensions, provides a powerful framework for the modeling of equilibria of many kinds, and exhibits a natural link between smooth and nonsmooth mathematics. The finite-dimensional variational inequality (VI), which is a generalization of the NCP, provides a broad unifying setting for the study of optimization and equilibrium problems and serves as the main computational framework for the practical solution of a host of continuum problems in the mathematical sciences. The systematic study of the finite-dimensional NCP and VI began in the mid-1960s; in a span of four decades, the subject has developed into a very fruitful discipline in the field of mathematical programming. The developments include a rich mathematical theory, a host of effective solution algorithms, a multitude of interesting connections to numerous disciplines, and a wide range of important applications in engineering and economics. As a result of their broad associations, the literature of the VI/CP has benefited from contributions made by mathematicians (pure, applied, and computational), computer scientists, engineers of many kinds (civil, chemical, electrical, mechanical, and systems), and economists of diverse expertise (agricultural, computational, energy, financial, and spatial).
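In the standard notation of the literature (not a formulation quoted from this book), the complementarity condition and its VI generalization read:

    \[
      \text{NCP}(F):\quad \text{find } x \in \mathbb{R}^n
      \text{ such that } x \ge 0,\;\; F(x) \ge 0,\;\; x^\top F(x) = 0;
    \]
    \[
      \text{VI}(K, F):\quad \text{find } x \in K
      \text{ such that } F(x)^\top (y - x) \ge 0 \;\text{ for all } y \in K,
    \]

with the NCP recovered as the special case K = \mathbb{R}^n_+. Componentwise, x_i > 0 forces F_i(x) = 0: this is the 'special equation' expressing complementarity between each variable and its corresponding inequality.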
The basic characteristic of Modern Linear and Nonlinear Econometrics is that it presents a unified approach to modern linear and nonlinear econometrics in a concise and intuitive way. It covers four major parts of modern econometrics: linear and nonlinear estimation and testing; time series analysis; models with categorical and limited dependent variables; and, finally, a thorough analysis of linear and nonlinear panel data modeling. Distinctive features of this handbook are:
- A unified approach to both linear and nonlinear econometrics, with an integration of theory and practice in modern econometrics, an emphasis on sound theoretical and empirical relevance and intuition, and a focus on econometric and statistical methods for the analysis of linear and nonlinear processes in economics and finance, including computational methods and numerical tools.
- Completely worked-out empirical illustrations throughout, whose macroeconomic and microeconomic (household and firm level) data sets are available from the internet; these illustrations are taken from finance (e.g. CAPM and derivatives), international economics (e.g. exchange rates), innovation economics (e.g. patenting), business cycle analysis, monetary economics, housing economics, labor and educational economics (e.g. demand for teachers according to gender) and many others.
- Exercises added to the chapters, with a focus on the interpretation of results; several of these exercises involve the use of actual data that are typical of current empirical work and that are made available on the internet.
A further distinctive feature of Modern Linear and Nonlinear Econometrics is that every major topic has a number of examples, exercises or case studies. By this 'learning by doing' method the intention is to prepare the reader to design, develop and successfully complete his or her own research and/or solve real-world problems.
Sample data alone never suffice to draw conclusions about populations. Inference always requires assumptions about the population and sampling process. Statistical theory has revealed much about how the strength of assumptions affects the precision of point estimates, but has had much less to say about how it affects the identification of population parameters. Indeed, it has been commonplace to think of identification as a binary event - a parameter is either identified or not - and to view point identification as a pre-condition for inference. Yet there is enormous scope for fruitful inference using data and assumptions that partially identify population parameters. This book explains why and shows how. The book presents in a rigorous and thorough manner the main elements of Charles Manski's research on partial identification of probability distributions. One focus is prediction with missing outcome or covariate data. Another is decomposition of finite mixtures, with application to the analysis of contaminated sampling and ecological inference. A third major focus is the analysis of treatment response. Whatever the particular subject under study, the presentation follows a common path. The author first specifies the sampling process generating the available data and asks what may be learned about population parameters using the empirical evidence alone. He then asks how the (typically) set-valued identification regions for these parameters shrink if various assumptions are imposed. The approach to inference that runs throughout the book is deliberately conservative and thoroughly nonparametric. Conservative nonparametric analysis enables researchers to learn from the available data without imposing untenable assumptions. It enables establishment of a domain of consensus among researchers who may hold disparate beliefs about what assumptions are appropriate. Charles F. Manski is Board of Trustees Professor at Northwestern University. He is author of Identification Problems in the Social Sciences and Analog Estimation Methods in Econometrics. He is a Fellow of the American Academy of Arts and Sciences, the American Association for the Advancement of Science, and the Econometric Society.
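As a minimal worked example of the book's theme, consider prediction with missing outcome data under worst-case (no-assumption) bounds, stated generically for an outcome y known to lie in [0, 1] with z = 1 indicating that y is observed. The law of total expectation gives

    \[
      E[y] = E[y \mid z=1]\,P(z=1) + E[y \mid z=0]\,P(z=0),
    \]

and since E[y | z=0] is unobserved but bounded in [0, 1], the empirical evidence alone identifies only the interval

    \[
      E[y] \in \Bigl[\, E[y \mid z=1]\,P(z=1),\;\;
      E[y \mid z=1]\,P(z=1) + P(z=0) \,\Bigr],
    \]

an identification region of width P(z=0) that shrinks only as assumptions about the missing data are imposed.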
This conference brought together an international group of fisheries economists from academia, business, government, and inter-governmental agencies to consider a coordinated project to build an econometric model of the world trade in groundfish. A number of the conference participants had just spent up to six weeks at Memorial University of Newfoundland working and preparing papers on the project. This volume presents the papers that these scholars produced, plus additional papers prepared by other conference participants. In addition, various lectures and discussions which were transcribed from tapes made of the proceedings are included. The introductory essay explains the genesis of the conference, describes the approach taken to modelling the groundfish trade, very briefly summarizes the technical papers, and describes future plans. The project is continuing as planned, and a second conference was held in St. John's in August 1990. The conference was a NATO Advanced Research Workshop and we wish to thank the Scientific Affairs Division of NATO for their financial support. Additional financial support was received from the Canadian Centre for Fisheries Innovation in St. John's, the Department of Fisheries and Oceans of the Government of Canada, the Department of Fisheries of the Government of Newfoundland and Labrador, Memorial University of Newfoundland and Air Nova; we acknowledge with appreciation their help.
You may like...
The Oxford Handbook of Applied Bayesian… by Anthony O'Hagan, Mike West (Hardcover), R4,188 (Discovery Miles 41 880)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
The Multi-Agent Transport Simulation… by Andreas Horni, Kai Nagel, … (Hardcover), R1,633 (Discovery Miles 16 330)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover), R3,567 (Discovery Miles 35 670)
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover), R3,286 (Discovery Miles 32 860)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover), R2,970 (Discovery Miles 29 700)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover), R4,258 (Discovery Miles 42 580)
The Handbook of Historical Economics by Alberto Bisin, Giovanni Federico (Paperback), R2,567 (Discovery Miles 25 670)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover), R2,160 (Discovery Miles 21 600)
Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu, Gordana Djurovic (Hardcover), R3,581 (Discovery Miles 35 810)