Control theory methods in economics have historically developed over three phases. The first basically involved feedback control rules in a deterministic framework, which were applied in macrodynamic models for analyzing stabilization policies. The second phase raised the issues of various types of inconsistencies in deterministic optimal control models due to changing information and other aspects of stochasticity. Rational expectations models have been extensively used in this phase to resolve some of the inconsistency problems. The third phase has recently focused on the various aspects of adaptive control, where stochasticity and information adaptivity are introduced in diverse ways, e.g. risk adjustment and risk sensitivity of optimal control, recursive updating rules via Kalman filtering and weighted recursive least squares, and variable structure control methods in a nonlinear framework. Problems of efficient econometric estimation of optimal control models have now acquired significant importance. This monograph provides an integrated view of control theory methods, synthesizing the three phases from feedback control to stochastic control and from stochastic control to adaptive control. Aspects of econometric estimation are strongly emphasized here, since these are very important in empirical applications in economics.
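As an illustration of the recursive updating rules mentioned in the third phase, here is a minimal sketch of a weighted recursive least squares step; the forgetting factor and toy data are illustrative assumptions, not taken from the monograph.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.99):
    """One weighted recursive least squares step.

    theta : current parameter estimate (k,)
    P     : current inverse-information matrix (k, k)
    x     : new regressor vector (k,)
    y     : new scalar observation
    lam   : forgetting factor in (0, 1]; lam < 1 downweights old data
    """
    x = x.reshape(-1, 1)
    # Gain vector
    K = P @ x / (lam + x.T @ P @ x)
    # Prediction error and parameter update
    err = y - float(x.T @ theta.reshape(-1, 1))
    theta = theta + K.flatten() * err
    # Inverse-information (covariance) update
    P = (P - K @ x.T @ P) / lam
    return theta, P

# Toy usage: recover beta = (1.0, -2.0) from noisy simulated data
rng = np.random.default_rng(0)
theta, P = np.zeros(2), np.eye(2) * 100.0
for _ in range(500):
    x = rng.normal(size=2)
    y = x @ np.array([1.0, -2.0]) + 0.1 * rng.normal()
    theta, P = rls_update(theta, P, x, y)
print(theta)  # approximately [1.0, -2.0]
```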
All humans eventually die, but life expectancies differ over time and among different demographic groups. Teasing out the various causes and correlates of death is a challenge, and it is one we take on in this book. A look at the data on mortality is both interesting and suggestive of some possible relationships. In 1900 life expectancies at birth were 46.3 and 48.3 years for men and women respectively, a gender differential of a bit less than 5 percent. Life expectancies for whites then were about 0.3 years longer than that of the whole population, but life expectancies for blacks were only about 33 years for men and women. At age 65, the remaining life expectancies were about 12 and 11 years for whites and blacks respectively. Fifty years later, life expectancies at birth had grown to 66 and 71 years for males and females respectively. The percentage differential between the sexes was now almost up to 10 percent. The life expectancies of whites were about one year longer than that for the entire population. The big change was for blacks, whose life expectancy had grown to over 60 years, with black females living about 5 percent longer than their male counterparts. At age 65 the remaining expected life had increased about two years, with much larger percentage gains for blacks.
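The quoted differential is easy to verify from the numbers in the passage; a quick sketch using only those figures:

```python
# Life expectancies at birth in 1900 (years), as quoted in the passage above
men, women = 46.3, 48.3
gap_pct = (women - men) / men * 100
print(f"Gender differential in 1900: {gap_pct:.1f}%")  # about 4.3%, i.e. a bit less than 5 percent
```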
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines what distinguishes data that represent a set of interacting individuals from the data assumed in traditional econometrics, which treats locational choices as exogenous and economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information for statistical inference, and how exploiting them can improve a model's predictive performance. The book presents the theory, clarifies the concepts and instructs readers on how to perform their own analyses, describing in detail the code needed in the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
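The book's own code is in R; purely as an illustration of the kind of spatial dependence measure involved, here is a minimal sketch of Moran's I in Python (the locations, weights and firm-level variable are invented):

```python
import numpy as np

def morans_i(x, W):
    """Moran's I for values x under a spatial weight matrix W."""
    n = len(x)
    z = x - x.mean()
    num = n * (z @ W @ z)
    den = W.sum() * (z @ z)
    return num / den

# Invented example: 5 firms located on a line, weights = inverse distance
coords = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
W = np.zeros((5, 5))
for i in range(5):
    for j in range(5):
        if i != j:
            W[i, j] = 1.0 / abs(coords[i] - coords[j])

x = np.array([2.0, 2.5, 3.0, 5.0, 5.5])   # e.g. firm productivity
print(morans_i(x, W))  # a positive value suggests spatial clustering
```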
This friendly guide is the companion you need to convert pure mathematics into understanding and facility with a host of probabilistic tools. The book provides a high-level view of probability and its most powerful applications. It begins with the basic rules of probability and quickly progresses to some of the most sophisticated modern techniques in use, including Kalman filters, Monte Carlo techniques, machine learning methods, Bayesian inference and stochastic processes. It draws on thirty years of experience in applying probabilistic methods to problems in computational science and engineering, and numerous practical examples illustrate where these techniques are used in the real world. Topics of discussion range from carbon dating to Wasserstein GANs, one of the most recent developments in Deep Learning. The underlying mathematics is presented in full, but clarity takes priority over complete rigour, making this text a starting reference source for researchers and a readable overview for students.
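Of the techniques listed, Monte Carlo estimation is the quickest to demonstrate; a minimal sketch, with an arbitrary illustrative integrand:

```python
import numpy as np

# Monte Carlo estimate of E[g(X)] with X ~ N(0, 1), here g(x) = exp(-x**2)
rng = np.random.default_rng(42)
samples = rng.normal(size=100_000)
values = np.exp(-samples**2)
estimate = values.mean()
std_error = values.std(ddof=1) / np.sqrt(len(samples))
print(f"estimate = {estimate:.4f} +/- {std_error:.4f}")
# The exact value is 1/sqrt(3) ~ 0.5774, so the estimate should land close to that.
```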
Macroeconomic Modelling has undergone radical changes in the last few years. There has been considerable innovation in developing robust solution techniques for the new breed of increasingly complex models. Similarly there has been a growing consensus on their long run and dynamic properties, as well as much development on existing themes such as modelling expectations and policy rules. This edited volume focuses on those areas which have undergone the most significant and imaginative developments and brings together the very best of modelling practice. We include specific sections on (I) Solving Large Macroeconomic Models, (II) Rational Expectations and Learning Approaches, (III) Macro Dynamics, and (IV) Long Run and Closures. All of the contributions offer new research whilst putting their developments firmly in context and as such will influence much future research in the area. It will be an invaluable text for those in policy institutions as well as academics and advanced students in the fields of economics, mathematics, business and government. Our contributors include those working in central banks, the IMF, European Commission and established academics.
Testing for a unit root is now an essential part of time series analysis. Indeed, no time series study in economics, or in other disciplines that use time series observations, can ignore the crucial issue of nonstationarity caused by a unit root. However, the literature on the topic is large and often technical, making it difficult to understand the key practical issues.
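To see the practical side immediately, here is a minimal sketch of an augmented Dickey-Fuller unit root test on a simulated random walk; statsmodels is assumed to be available, and the data are simulated rather than taken from the book.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# Simulate a random walk, which has a unit root by construction
rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=500))

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="c", autolag="AIC")
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# A large p-value means the unit root null cannot be rejected, as expected here.
```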
Who decides how official statistics are produced? Do politicians have control or are key decisions left to statisticians in independent statistical agencies? Interviews with statisticians in Australia, Canada, Sweden, the UK and the USA were conducted to get insider perspectives on the nature of decision making in government statistical administration. While the popular adage suggests there are 'lies, damned lies and statistics', this research shows that official statistics in liberal democracies are far from mistruths; they are consistently insulated from direct political interference. Yet, a range of subtle pressures and tensions exist that governments and statisticians must manage. The power over statistics is distributed differently in different countries, and this book explains why. Differences in decision-making powers across countries are the result of shifting pressures politicians and statisticians face to be credible, and the different national contexts that provide distinctive institutional settings for the production of government numbers.
Game Theory has provided an extremely useful tool in enabling economists to venture into unknown areas. Its concepts of conflict and cooperation apply whenever the actions of several agents are interdependent, providing a language with which to formulate, structure, analyze and understand strategic scenarios. Economic Behavior, Game Theory, and Technology in Emerging Markets explores game theory and its deep impact on developmental economics, specifically the manner in which it provides a way of formalizing institutions. This is particularly important for emerging economies, which have not yet received much attention in the academic world. This publication is useful for academics, professors, and researchers in this field, but it has also been compiled to meet the needs of non-specialists.
Swaps, futures, options, structured instruments - a wide range of derivative products is traded in today's financial markets. Analyzing, pricing and managing such products often requires fairly sophisticated quantitative tools and methods. This book serves as an introduction to financial mathematics with special emphasis on aspects relevant in practice. In addition to numerous illustrative examples, algorithmic implementations are demonstrated using "Mathematica" and the software package "UnRisk" (available for both students and teachers). The content is organized in 15 chapters that can be treated as independent modules. In particular, the exposition is tailored for classroom use in a Bachelor or Master program course, as well as for practitioners who wish to further strengthen their quantitative background.
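As a language-neutral illustration of the kind of pricing formula such tools implement (the book itself uses Mathematica and UnRisk), here is a minimal Black-Scholes call price sketch; the parameter values are invented:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend-paying asset."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Invented example: spot 100, strike 100, 1 year to expiry, 2% rate, 20% volatility
print(round(bs_call(100, 100, 1.0, 0.02, 0.20), 2))  # roughly 8.9
```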
Accessible to a general audience with some background in statistics and computing; many examples and extended case studies; illustrations using R and RStudio; a true blend of statistics and computer science, not just a grab bag of topics from each.
This tutorial presents a hands-on introduction to a new discrete choice modeling approach based on the behavioral notion of regret minimization. This so-called Random Regret Minimization (RRM) approach forms a counterpart to the Random Utility Maximization (RUM) approach to discrete choice modeling, which has for decades dominated the field of choice modeling and adjacent fields such as transportation, marketing and environmental economics. Being as parsimonious as conventional RUM models and compatible with popular software packages, the RRM approach provides an alternative and appealing account of choice behavior. Rather than providing the highly technical discussions usually encountered in scholarly journals, this tutorial aims to let readers explore the RRM approach and its potential and limitations hands-on, based on a detailed discussion of examples. It is written for students, scholars and practitioners who have a basic background in choice modeling in general and RUM modeling in particular. Care has been taken to ensure that all concepts and results are clear to readers who do not have an advanced knowledge of econometrics.
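As a hands-on taste of the approach, a minimal sketch of RRM choice probabilities using the standard regret specification from the literature; the attribute levels and taste parameters are invented:

```python
import numpy as np

def rrm_probs(X, beta):
    """Random regret minimization choice probabilities.

    X    : (alternatives, attributes) matrix of attribute levels
    beta : (attributes,) taste parameters
    Regret of alternative i sums ln(1 + exp(beta_m * (x_jm - x_im)))
    over all rival alternatives j and attributes m.
    """
    n = X.shape[0]
    regret = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if j != i:
                regret[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))
    expneg = np.exp(-regret)
    return expneg / expneg.sum()

# Invented example: three travel alternatives described by (cost, time)
X = np.array([[2.0, 40.0],
              [4.0, 30.0],
              [6.0, 25.0]])
beta = np.array([-0.4, -0.1])   # both attributes are "bads"
print(rrm_probs(X, beta))       # probabilities sum to one
```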
In his "Prime ricerche sulla rivoluzione dei prezzi in Firenze" (1939), Giuseppe Parenti, regarded by Fernand Braudel as an author who "placed himself, from the outset and beyond any possible dispute, at the very level of Earl Jefferson Hamilton...", opens with a description and definition of the price revolution that took place in sixteenth-century Europe as "that extraordinary enhancement of all things that occurred in European countries around the second half of the sixteenth century; a revolution in the true meaning of the word, as not only, like any strong price increase, did it modify the wealth distribution process and change the relative position of the various social categories and of the different functions of economic activity, but it also affected, in a way that has not yet been studied enough, the relative evolution of the various national economies, and finally, ..., certainly contributed to the birth, or at least to the dissemination, of the new naturalistic economic ideas from which economic science would spring." This definition can be taken as the founding metaphor of this volume.
New Directions in Computational Economics brings together for the first time a diverse selection of papers, sharing the underlying theme of application of computing technology as a tool for achieving solutions to realistic problems in computational economics and related areas in the environmental, ecological and energy fields. Part I of the volume addresses experimental and computational issues in auction mechanisms, including a survey of recent results for sealed bid auctions. The second contribution uses neural networks as the basis for estimating bid functions for first price sealed bid auctions. Also presented is the 'smart market' computational mechanism which better matches bids and offers for natural gas. Part II consists of papers that formulate and solve models of economic systems. Amman and Kendrick's paper deals with control models and the computational difficulties that result from nonconvexities. Using goal programming, Nagurney, Thore and Pan formulate spatial resource allocation models to analyze various policy issues. Thompson and Thrall next present a rigorous mathematical analysis of the relationship between efficiency and profitability. The problem of matching uncertain streams of assets and liabilities is solved using stochastic optimization techniques in the following paper in this section. Finally, Part III applies economic concepts to issues in computer science in addition to using computational techniques to solve economic models.
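As a pointer to the kind of auction theory involved, a minimal sketch of the standard risk-neutral equilibrium bid function for a first-price sealed-bid auction with independent private values drawn uniformly on [0, 1]; this is a textbook result used only for illustration, not a result from the volume:

```python
def first_price_bid(v, n):
    """Risk-neutral equilibrium bid with n bidders and values uniform on [0, 1]."""
    return (n - 1) / n * v

# With 4 bidders, a bidder with value 0.8 shades the bid down to 0.6
print(first_price_bid(0.8, 4))
```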
The manuscript reviews some key ideas about artificial intelligence and relates them to economics. These include its relation to robotics, and the concepts of synthetic emotions, consciousness, and life. The economic implications of the advent of artificial intelligence, such as its effect on prices and wages, appropriate patent policy, and the possibility of accelerating productivity, are discussed. The growing field of artificial economics and the use of artificial agents in experimental economics are also considered.
On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TeX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.
A. Dogramaci and N.R. Adam: Productivity of a firm is influenced both by economic forces that act at the macro level and impose themselves on the individual firm, and by internal factors that result from decisions and processes taking place within the boundaries of the firm. Efforts towards increasing the productivity level of firms need to be based on a sound understanding of how the above processes take place. Our objective in this volume is to present some of the recent research work in this field. The volume consists of three parts. In part I, two macro issues are addressed (taxation and inflation) and their relation to productivity is analyzed. The second part of the volume focuses on methods for productivity analysis within the firm. Finally, the third part of the book deals with two additional productivity analysis techniques and their applications to public utilities. The objective of the volume is not to present a unified point of view, but rather to cover a sample of different methodologies and perspectives through original, scholarly papers.
Max-Min problems are two-step allocation problems in which one side must make his move knowing that the other side will then learn what the move is and optimally counter. They are fundamental in particular to military weapons-selection problems involving large systems such as Minuteman or Polaris, where the systems in the mix are so large that they cannot be concealed from an opponent. One must then expect the opponent to decide on an optimal mixture of, in the case mentioned above, anti-Minuteman and anti-submarine effort. The author's first introduction to a problem of Max-Min type occurred at The RAND Corporation about 1951. One side allocates anti-missile defenses to various cities. The other side observes this allocation and then allocates missiles to those cities. If F(x, y) denotes the total residual value of the cities after the attack, with x denoting the defender's strategy and y the attacker's, the problem is then to find max_x min_y F(x, y).
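A minimal sketch of the two-step logic described above, with invented city values and a toy allocation grid: the defender commits first, the attacker observes and responds optimally, and the defender therefore maximizes the minimum residual value.

```python
# Invented example: 2 cities with values 10 and 6; 3 defense units to allocate.
# The residual value of an attacked city grows with the defense assigned to it.
values = [10.0, 6.0]

def residual(city, defense):
    """Value of `city` that survives an attack, given `defense` units assigned to it."""
    return values[city] * defense / (defense + 1)

best = None
for d0 in range(4):                      # defender's allocations (d0, 3 - d0)
    d = (d0, 3 - d0)
    # The attacker observes d and strikes the city that minimizes total residual value
    worst = min(
        sum(values) - values[c] + residual(c, d[c])   # total residual if city c is hit
        for c in range(2)
    )
    if best is None or worst > best[0]:
        best = (worst, d)

print(f"max-min residual value {best[0]:.2f} with defense allocation {best[1]}")
```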
Models, Methods, Concepts and Applications of the Analytic Hierarchy Process is a volume dedicated to selected applications of the Analytic Hierarchy Process (AHP) focused on three themes: economics, the social sciences, and the linking of measurement with human values. (1) The AHP offers economists a substantially different approach to dealing with economic problems through ratio scales. The main mathematical models on which economics has based its quantitative thinking up to now are utility theory, which uses interval scales, and linear programming. We hope that the variety of examples included here can perhaps stimulate researchers in economics to try applying this new approach. (2) The second theme is concerned with the social sciences. The AHP offers psychologists and political scientists the methodology to quantify and derive measurements for intangibles. We hope that the examples included in this book will encourage them to examine the methods of AHP in terms of the problems they seek to solve. (3) The third theme is concerned with providing people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process one needs to interpret what the measurements mean. A number is useless until someone understands what it means. It can have different meanings in different problems. Ten dollars are plenty to satisfy one's hunger but are useless by themselves in buying a new car. Such measurements are only indicators of the state of a system, but do not relate to the values of the human observers of that system. AHP methods can help resolve the conflicts between hard measurement data and human values.
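As a concrete illustration of deriving ratio-scale priorities, a minimal sketch of the standard AHP eigenvector calculation; the pairwise judgments are invented, and the random index 0.58 is the usual benchmark for a 3x3 matrix:

```python
import numpy as np

# Invented pairwise comparisons among three criteria (A[i, j] = importance of i over j)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                              # normalized priority weights

# Consistency index relative to the random-matrix benchmark (RI = 0.58 for n = 3)
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3), " consistency ratio:", round(CR, 3))
```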
The present work is an extension of my doctoral thesis done at Stanford in the early 1970s. In one clear sense it responds to the call for consilience by Edward O. Wilson. I agree with Wilson that there is a pressing need in the sciences today for the unification of the social with the natural sciences. I consider the present work to proceed from the perspective of behavioral ecology, specifically a subfield which I choose to call interpersonal behavioral ecology. Ecology, as a general field, has emerged in the last quarter of the 20th century as a major theme of concern as we have become increasingly aware that we must preserve the planet whose limited resources we share with all other earthly creatures. Interpersonal behavioral ecology, however, focuses not on the physical environment, but upon our social environment. It concerns our interpersonal behavioral interactions at all levels, from simple dyadic one-to-one personal interactions to our larger, even global, social, economic, and political interactions. Interpersonal behavioral ecology, as I see it, then, is concerned with our behavior toward each other, from the most obvious behaviors of war between nations, to excessive competition, exploitation, crime, abuse, and even to the ways in which we interact with each other as individuals in the family, in our social lives, in the workplace, and in the marketplace.
Studies in Consumer Demand - Econometric Methods Applied to Market Data contains eight previously unpublished studies of consumer demand. Each study stands on its own as a complete econometric analysis of demand for a well-defined consumer product. The econometric methods range from simple regression techniques applied in the first four chapters, to the use of logit and multinomial logit models used in chapters 5 and 6, to the use of nested logit models in chapters 6 and 7, and finally to the discrete/continuous modeling methods used in chapter 8. Emphasis is on applications rather than econometric theory. In each case, enough detail is provided for the reader to understand the purpose of the analysis, the availability and suitability of data, and the econometric approach to measuring demand.
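To fix ideas on the logit family used in the middle chapters, a minimal sketch of multinomial logit choice probabilities; this is the textbook formula with invented utility values, not the book's own code:

```python
import numpy as np

def mnl_probs(V):
    """Multinomial logit choice probabilities from systematic utilities V."""
    e = np.exp(V - V.max())          # subtract the max for numerical stability
    return e / e.sum()

# Invented example: utilities of three products for one consumer
V = np.array([0.2, 1.0, -0.5])
print(mnl_probs(V))                  # probabilities sum to one
```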
Technology Commercialization: DEA and Related Analytical Methods for Evaluating The Use and Implementation of Technical Innovation examines both general Research & Development commercialization and targeted new product innovation. New product development is a major occupation of the technical sector of the global economy and is viewed in many ways as a means of economic stability for a business, an industry, and a country. The heart of the book is a detailing of the analytical methods, with special but not exclusive emphasis on DEA methods, for evaluating and ranking the most promising R&D and technical innovation being developed. The sponsors of the research and development may include universities, countries, industries, and corporations; all of these sources are covered in the book. In addition, the trade-off between environmental problems and new product development is discussed in a section of the book. Sten Thore (editor and author) has woven together the chapter contributions by a strong group of international researchers into a book that has characteristics of both a monograph and a unified edited volume of well-written papers in DEA, technology evaluation, R&D, and environmental economics. Finally, the use of DEA as an evaluation method for product innovation is an important new development in the field of R&D commercialization.
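To make the DEA machinery concrete, a minimal sketch of the input-oriented CCR efficiency score solved as a linear program; the input/output data are invented and scipy is assumed to be available:

```python
import numpy as np
from scipy.optimize import linprog

# Invented data: 4 units, 2 inputs (rows of X), 1 output (rows of Y)
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR score of unit k: min theta s.t. X@lam <= theta*x_k, Y@lam >= y_k."""
    m, n = X.shape            # inputs x units
    s = Y.shape[0]            # outputs
    c = np.r_[1.0, np.zeros(n)]                       # minimize theta
    A_ub = np.block([[-X[:, [k]], X],                  # X@lam - theta*x_k <= 0
                     [np.zeros((s, 1)), -Y]])          # -Y@lam <= -y_k
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    bounds = [(0, None)] * (n + 1)                     # theta >= 0, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for k in range(4):
    print(f"unit {k}: efficiency {ccr_efficiency(X, Y, k):.3f}")
```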
The finite-dimensional nonlinear complementarity problem (NCP) is a system of finitely many nonlinear inequalities in finitely many nonnegative variables, along with a special equation that expresses the complementary relationship between the variables and the corresponding inequalities. This complementarity condition is the key feature distinguishing the NCP from a general inequality system, lies at the heart of all constrained optimization problems in finite dimensions, provides a powerful framework for the modeling of equilibria of many kinds, and exhibits a natural link between smooth and nonsmooth mathematics. The finite-dimensional variational inequality (VI), which is a generalization of the NCP, provides a broad unifying setting for the study of optimization and equilibrium problems and serves as the main computational framework for the practical solution of a host of continuum problems in the mathematical sciences. The systematic study of the finite-dimensional NCP and VI began in the mid-1960s; in a span of four decades, the subject has developed into a very fruitful discipline in the field of mathematical programming. The developments include a rich mathematical theory, a host of effective solution algorithms, a multitude of interesting connections to numerous disciplines, and a wide range of important applications in engineering and economics. As a result of their broad associations, the literature of the VI/CP has benefited from contributions made by mathematicians (pure, applied, and computational), computer scientists, engineers of many kinds (civil, chemical, electrical, mechanical, and systems), and economists of diverse expertise (agricultural, computational, energy, financial, and spatial).
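Restated compactly in standard notation (a sketch to fix notation, not text from the book):

```latex
% Nonlinear complementarity problem and variational inequality
\mathrm{NCP}(F):\quad \text{find } x \in \mathbb{R}^{n} \text{ with } 0 \le x \;\perp\; F(x) \ge 0,
\qquad \text{i.e. } x \ge 0,\; F(x) \ge 0,\; x^{\top} F(x) = 0.
\\[4pt]
\mathrm{VI}(K, F):\quad \text{find } x^{*} \in K \text{ with } F(x^{*})^{\top} (y - x^{*}) \ge 0 \;\; \forall\, y \in K;
\qquad \text{taking } K = \mathbb{R}^{n}_{+} \text{ recovers } \mathrm{NCP}(F).
```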
The basic characteristic of Modern Linear and Nonlinear Econometrics is that it presents a unified approach to modern linear and nonlinear econometrics in a concise and intuitive way. It covers four major parts of modern econometrics: linear and nonlinear estimation and testing; time series analysis; models with categorical and limited dependent variables; and, finally, a thorough analysis of linear and nonlinear panel data modeling. Distinctive features of this handbook are: a unified approach to both linear and nonlinear econometrics, with an integration of theory and practice in modern econometrics and an emphasis on sound theoretical and empirical relevance and intuition; a focus on econometric and statistical methods for the analysis of linear and nonlinear processes in economics and finance, including computational methods and numerical tools; completely worked-out empirical illustrations throughout, whose macroeconomic and microeconomic (household and firm level) data sets are available from the internet; these illustrations are taken from finance (e.g. CAPM and derivatives), international economics (e.g. exchange rates), innovation economics (e.g. patenting), business cycle analysis, monetary economics, housing economics, labor and educational economics (e.g. demand for teachers according to gender) and many others; and exercises added to the chapters, with a focus on the interpretation of results, several of which involve the use of actual data that are typical of current empirical work and are made available on the internet. Another distinguishing feature of Modern Linear and Nonlinear Econometrics is that every major topic is accompanied by examples, exercises or case studies. Through this 'learning by doing' method, the intention is to prepare readers to design, develop and successfully finish their own research and/or solve real-world problems.
Sample data alone never suffice to draw conclusions about populations. Inference always requires assumptions about the population and sampling process. Statistical theory has revealed much about how the strength of assumptions affects the precision of point estimates, but has had much less to say about how it affects the identification of population parameters. Indeed, it has been commonplace to think of identification as a binary event - a parameter is either identified or not - and to view point identification as a pre-condition for inference. Yet there is enormous scope for fruitful inference using data and assumptions that partially identify population parameters. This book explains why and shows how. The book presents in a rigorous and thorough manner the main elements of Charles Manski's research on partial identification of probability distributions. One focus is prediction with missing outcome or covariate data. Another is decomposition of finite mixtures, with application to the analysis of contaminated sampling and ecological inference. A third major focus is the analysis of treatment response. Whatever the particular subject under study, the presentation follows a common path. The author first specifies the sampling process generating the available data and asks what may be learned about population parameters using the empirical evidence alone. He then asks how the (typically) set-valued identification regions for these parameters shrink if various assumptions are imposed. The approach to inference that runs throughout the book is deliberately conservative and thoroughly nonparametric. Conservative nonparametric analysis enables researchers to learn from the available data without imposing untenable assumptions. It enables establishment of a domain of consensus among researchers who may hold disparate beliefs about what assumptions are appropriate. Charles F. Manski is Board of Trustees Professor at Northwestern University. He is author of Identification Problems in the Social Sciences and Analog Estimation Methods in Econometrics. He is a Fellow of the American Academy of Arts and Sciences, the American Association for the Advancement of Science, and the Econometric Society.
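To illustrate the flavor of the approach, a minimal sketch of the simplest partial-identification exercise, worst-case bounds on a population mean when a binary outcome is sometimes missing; the sample is invented:

```python
import numpy as np

# Invented sample: binary outcomes, np.nan marks a missing outcome
y = np.array([1, 0, 1, 1, np.nan, 0, np.nan, 1, np.nan, 0], dtype=float)

observed = ~np.isnan(y)
p_obs = observed.mean()
mean_obs = y[observed].mean()

# Worst-case bounds: the missing outcomes could all be 0 or all be 1
lower = mean_obs * p_obs + 0.0 * (1 - p_obs)
upper = mean_obs * p_obs + 1.0 * (1 - p_obs)
print(f"E[Y] is partially identified in [{lower:.2f}, {upper:.2f}]")
```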
You may like...
Operations And Supply Chain Management by David Collier, James Evans (Hardcover)
Tax Policy and Uncertainty - Modelling… by Christopher Ball, John Creedy, … (Hardcover), R2,987 (Discovery Miles 29 870)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover), R2,970 (Discovery Miles 29 700)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover), R4,258 (Discovery Miles 42 580)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover), R2,160 (Discovery Miles 21 600)
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover), R3,286 (Discovery Miles 32 860)