All humans eventually die, but life expectancies differ over time and among different demographic groups. Teasing out the various causes and correlates of death is a challenge, and it is one we take on in this book. A look at the data on mortality is both interesting and suggestive of some possible relationships. In 1900 life expectancies at birth were 46.3 and 48.3 years for men and women respectively, a gender differential of a bit less than 5 percent. Life expectancies for whites then were about 0.3 years longer than that of the whole population, but life expectancies for blacks were only about 33 years for men and women. At age 65, the remaining life expectancies were about 12 and 11 years for whites and blacks respectively. Fifty years later, life expectancies at birth had grown to 66 and 71 years for males and females respectively. The percentage differential between the sexes was now almost up to 10 percent. The life expectancies of whites were about one year longer than that for the entire population. The big change was for blacks, whose life expectancy had grown to over 60 years, with black females living about 5 percent longer than their male counterparts. At age 65 the remaining expected life had increased about two years, with much larger percentage gains for blacks.
Econophysics applies the methodology of physics to the study of economics. However, whilst physicists have a good understanding of statistical physics, they may be unfamiliar with recent advances in statistical inference, including Bayesian and predictive methods. Equally, economists with a knowledge of probabilities do not have a background in statistical physics and agent-based models. Proposing a unified view for a dynamic probabilistic approach, this book is useful for advanced undergraduate and graduate students as well as researchers in physics, economics and finance. The book takes a finitary approach to the subject, discussing the essentials of applied probability, and covering finite Markov chain theory and its applications to real systems. Each chapter ends with a summary, suggestions for further reading, and exercises with solutions at the end of the book.
Econophysics is an emerging interdisciplinary field that takes advantage of the concepts and methods of statistical physics to analyse economic phenomena. This book expands the explanatory scope of econophysics to the real economy by using methods from statistical physics to analyse the success and failure of companies. Using large data sets of companies and income-earners in Japan and Europe, a distinguished team of researchers show how these methods allow us to analyse companies, from huge corporations to small firms, as heterogeneous agents interacting at multiple layers of complex networks. They then show how successful this approach is in explaining a wide range of recent findings relating to the dynamics of companies. With mathematics kept to a minimum, the book is not only a lively introduction to the field of econophysics but also provides fresh insights into company behaviour.
Macroeconomic Modelling has undergone radical changes in the last few years. There has been considerable innovation in developing robust solution techniques for the new breed of increasingly complex models. Similarly there has been a growing consensus on their long run and dynamic properties, as well as much development on existing themes such as modelling expectations and policy rules. This edited volume focuses on those areas which have undergone the most significant and imaginative developments and brings together the very best of modelling practice. We include specific sections on (I) Solving Large Macroeconomic Models, (II) Rational Expectations and Learning Approaches, (III) Macro Dynamics, and (IV) Long Run and Closures. All of the contributions offer new research whilst putting their developments firmly in context and as such will influence much future research in the area. It will be an invaluable text for those in policy institutions as well as academics and advanced students in the fields of economics, mathematics, business and government. Our contributors include those working in central banks, the IMF, European Commission and established academics.
This 2005 volume brings together twelve papers by many of the most prominent applied general equilibrium modelers honoring Herbert Scarf, the father of equilibrium computation in economics. It deals with developments in applied general equilibrium, a field which has broadened greatly since the 1980s. The contributors discuss some traditional as well as some modern topics in the field, including non-convexities in economy-wide models, tax policy, developmental modeling and energy modeling. The book also covers a range of distinct approaches, conceptual issues and computational algorithms, such as calibration and areas of application such as macroeconomics of real business cycles and finance. An introductory chapter written by the editors maps out issues and scenarios for the future evolution of applied general equilibrium.
This 2005 volume contains the papers presented in honor of the lifelong achievements of Thomas J. Rothenberg on the occasion of his retirement. The authors of the chapters include many of the leading econometricians of our day, and the chapters address topics of current research significance in econometric theory. The chapters cover four themes: identification and efficient estimation in econometrics, asymptotic approximations to the distributions of econometric estimators and tests, inference involving potentially nonstationary time series, such as processes that might have a unit autoregressive root, and nonparametric and semiparametric inference. Several of the chapters provide overviews and treatments of basic conceptual issues, while others advance our understanding of the properties of existing econometric procedures and/or propose new ones. Specific topics include identification in nonlinear models, inference with weak instruments, tests for nonstationarity in time series and panel data, generalized empirical likelihood estimation, and the bootstrap.
Originally published in 2005, Weather Derivative Valuation covers all the meteorological, statistical, financial and mathematical issues that arise in the pricing and risk management of weather derivatives. There are chapters on meteorological data and data cleaning, the modelling and pricing of single weather derivatives, the modelling and valuation of portfolios, the use of weather and seasonal forecasts in the pricing of weather derivatives, arbitrage pricing for weather derivatives, risk management, and the modelling of temperature, wind and precipitation. Specific issues covered in detail include the analysis of uncertainty in weather derivative pricing, time-series modelling of daily temperatures, the creation and use of probabilistic meteorological forecasts and the derivation of the weather derivative version of the Black-Scholes equation of mathematical finance. Written by consultants who work within the weather derivative industry, this book is packed with practical information and theoretical insight into the world of weather derivative pricing.
Testing for a unit root is now an essential part of time series analysis. Indeed, no time series study in economics, or in other disciplines that use time series observations, can ignore the crucial issue of nonstationarity caused by a unit root. However, the literature on the topic is large and often technical, making it difficult to understand the key practical issues.
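The practical issue flagged above can be illustrated hands-on. The simplest unit-root check is the Dickey-Fuller regression of the first difference on the lagged level: a strongly negative t-statistic argues for stationarity, while a value near zero is consistent with a unit root. Below is a minimal sketch in plain Python; the simulated series, the AR coefficient of 0.5, the sample size, and the seed are illustrative assumptions, not drawn from the book, and a real application would compare against Dickey-Fuller critical values rather than the usual t table.

```python
import random

def dickey_fuller_t(y):
    """t-statistic on y[t-1] in the regression dy[t] = b * y[t-1] + e[t].

    Under the unit-root null b = 0; strongly negative values argue for
    stationarity (critical values come from the Dickey-Fuller
    distribution, not the standard t distribution).
    """
    x = y[:-1]                                  # lagged level y[t-1]
    dy = [b - a for a, b in zip(y, y[1:])]      # first difference
    sxx = sum(v * v for v in x)
    b = sum(u * v for u, v in zip(x, dy)) / sxx
    resid = [d - b * v for d, v in zip(dy, x)]
    s2 = sum(r * r for r in resid) / (len(dy) - 1)
    return b / (s2 / sxx) ** 0.5

rng = random.Random(0)
shocks = [rng.gauss(0, 1) for _ in range(500)]

# Stationary AR(1): y[t] = 0.5 * y[t-1] + e[t]
ar = [0.0]
for e in shocks:
    ar.append(0.5 * ar[-1] + e)

# Random walk (unit root): y[t] = y[t-1] + e[t]
rw = [0.0]
for e in shocks:
    rw.append(rw[-1] + e)

print(dickey_fuller_t(ar))  # strongly negative: reject the unit root
print(dickey_fuller_t(rw))  # much closer to zero: cannot reject
```

In practice one would also include an intercept (and possibly a trend) and rely on a library implementation such as statsmodels' adfuller, which supplies the appropriate critical values.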
Game Theory has provided an extremely useful tool in enabling economists to venture into unknown areas. Its concepts of conflict and cooperation apply whenever the actions of several agents are interdependent; providing language to formulate as well as to structure, analyze, and understand strategic scenarios. Economic Behavior, Game Theory, and Technology in Emerging Markets explores game theory and its deep impact in developmental economics, specifically the manner in which it provides a way of formalizing institutions. This is particularly important for emerging economies which have not yet received much attention in the academic world. This publication is useful for academics, professors, and researchers in this field, but it has also been compiled to meet the needs of non-specialists as well.
Swaps, futures, options, structured instruments - a wide range of derivative products is traded in today's financial markets. Analyzing, pricing and managing such products often requires fairly sophisticated quantitative tools and methods. This book serves as an introduction to financial mathematics with special emphasis on aspects relevant in practice. In addition to numerous illustrative examples, algorithmic implementations are demonstrated using "Mathematica" and the software package "UnRisk" (available for both students and teachers). The content is organized in 15 chapters that can be treated as independent modules. In particular, the exposition is tailored for classroom use in a Bachelor or Master program course, as well as for practitioners who wish to further strengthen their quantitative background.
This tutorial presents a hands-on introduction to a new discrete choice modeling approach based on the behavioral notion of regret-minimization. This so-called Random Regret Minimization-approach (RRM) forms a counterpart of the Random Utility Maximization-approach (RUM) to discrete choice modeling, which has for decades dominated the field of choice modeling and adjacent fields such as transportation, marketing and environmental economics. Being as parsimonious as conventional RUM-models and compatible with popular software packages, the RRM-approach provides an alternative and appealing account of choice behavior. Rather than providing highly technical discussions as usually encountered in scholarly journals, this tutorial aims to allow readers to explore the RRM-approach and its potential and limitations hands-on and based on a detailed discussion of examples. This tutorial is written for students, scholars and practitioners who have a basic background in choice modeling in general and RUM-modeling in particular. Care has been taken to ensure that all concepts and results are clear to readers who do not have an advanced knowledge of econometrics.
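To make the contrast with RUM concrete, a commonly used RRM specification lets regret for an alternative accrue whenever a competitor beats it on an attribute, and then applies a logit transform to negative regrets. The sketch below is a plain-Python illustration of that idea, not the tutorial's own code; the attribute values and taste parameters are invented for the example.

```python
import math

def random_regret(i, alts, betas):
    """Systematic regret of alternative i: regret accrues whenever a
    competing alternative j beats i on attribute m, scaled by the
    taste parameter betas[m]."""
    r = 0.0
    for j, xj in enumerate(alts):
        if j == i:
            continue
        for m, beta in enumerate(betas):
            r += math.log(1.0 + math.exp(beta * (xj[m] - alts[i][m])))
    return r

def rrm_choice_probs(alts, betas):
    """Logit-style choice probabilities over negative regrets."""
    regrets = [random_regret(i, alts, betas) for i in range(len(alts))]
    expneg = [math.exp(-r) for r in regrets]
    total = sum(expneg)
    return [e / total for e in expneg]

# Invented data: two attributes per alternative, e.g. (quality, -cost).
alts = [(3.0, -2.0), (2.0, -1.0), (4.0, -3.0)]
betas = (0.8, 0.6)
probs = rrm_choice_probs(alts, betas)
print([round(p, 3) for p in probs])  # three probabilities summing to 1
```

Note the key behavioral difference from RUM: because regret is driven by pairwise attribute comparisons, an alternative's probability depends on how it trades off against each competitor, producing the semi-compensatory behavior the tutorial discusses.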
This important collection brings together leading econometricians to discuss advances in the areas of the econometrics of panel data. The papers in this collection can be grouped into two categories. The first, which includes chapters by Amemiya, Baltagi, Arellano, Bover and Labeaga, primarily deals with different aspects of limited dependent variables and sample selectivity. The second group of papers, including those by Nerlove, Schmidt and Ahn, Kiviet, Davies and Lahiri, consider issues that arise in the estimation of dynamic (possibly) heterogeneous panel data models. Overall, the contributors focus on the issues of simplifying complex real-world phenomena into easily generalisable inferences from individual outcomes. As the contributions of G. S. Maddala in the fields of limited dependent variables and panel data were particularly influential, it is a fitting tribute that this volume is dedicated to him.
In his "Prime ricerche sulla rivoluzione dei prezzi in Firenze" (1939), Giuseppe Parenti, regarded by Fernand Braudel as an author who "ranked, from the outset and beyond any possible discussion, at the very level of Earl Jefferson Hamilton", begins his opening lines with a description and definition of the price revolution which took place in sixteenth-century Europe as "that extraordinary enhancement of all things that occurred in European countries around the second half of the sixteenth century; revolution in the true meaning of the word, as not only, like any strong price increase, it modified the wealth distribution process and changed the relative position of the various social categories and of the different functions of the economic activity, but affected too, in a way that has not yet been studied enough, the relative evolution of the various national economies, and finally ... certainly contributed to the birth, or at least to the dissemination, of the new naturalistic economic ideas, from which economic science would spring." This definition can be taken as the founding metaphor of this volume.
This book reports the results of five empirical studies undertaken in the early seventies by a collaboration headed by Professor Morishima. It deals with applications of the general equilibrium models whose theoretical aspects have been one of Professor Morishima's main interests. Four main econometric models are constructed for the USA, the UK, and Japan. These are used as a basis for the discussion of various topics in economic theory, such as: the existence and stability or instability of the neoclassical path of full employment growth equilibrium and a von Neumann-type path of balanced growth at constant prices; the antinomy between price stability and full employment; the Samuelson-LeChatelier principle; the theory of the balanced-budget multiplier; the three Hicksian laws of the gross substitutes system; the Brown-Jones super-multipliers of international trade, and so on. In addition, this 1972 work makes a quantitative evaluation for the US economy of monetary and fiscal policies as short-run measures for achieving full employment; the effectiveness of built-in flexibility of taxes in the UK economy is discussed; and estimates are made of the rapid decrease in disguised unemployment in post-war Japan.
New Directions in Computational Economics brings together for the first time a diverse selection of papers, sharing the underlying theme of application of computing technology as a tool for achieving solutions to realistic problems in computational economics and related areas in the environmental, ecological and energy fields. Part I of the volume addresses experimental and computational issues in auction mechanisms, including a survey of recent results for sealed bid auctions. The second contribution uses neural networks as the basis for estimating bid functions for first price sealed bid auctions. Also presented is the `smart market' computational mechanism which better matches bids and offers for natural gas. Part II consists of papers that formulate and solve models of economic systems. Amman and Kendrick's paper deals with control models and the computational difficulties that result from nonconvexities. Using goal programming, Nagurney, Thore and Pan formulate spatial resource allocation models to analyze various policy issues. Thompson and Thrall next present a rigorous mathematical analysis of the relationship between efficiency and profitability. The problem of matching uncertain streams of assets and liabilities is solved using stochastic optimization techniques in the following paper in this section. Finally, Part III applies economic concepts to issues in computer science in addition to using computational techniques to solve economic models.
The manuscript reviews some key ideas about artificial intelligence, and relates them to economics. These include its relation to robotics, and the concepts of synthetic emotions, consciousness, and life. The economic implications of the advent of artificial intelligence, such as its effect on prices and wages, appropriate patent policy, and the possibility of accelerating productivity, are discussed. The growing field of artificial economics and the use of artificial agents in experimental economics is considered.
On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TeX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.
A. Dogramaci and N.R. Adam. Productivity of a firm is influenced both by economic forces which act at the macro level and impose themselves on the individual firm, and by internal factors that result from decisions and processes which take place within the boundaries of the firm. Efforts towards increasing the productivity level of firms need to be based on a sound understanding of how the above processes take place. Our objective in this volume is to present some of the recent research work in this field. The volume consists of three parts. In part I, two macro issues are addressed (taxation and inflation) and their relation to productivity is analyzed. The second part of the volume focuses on methods for productivity analysis within the firm. Finally, the third part of the book deals with two additional productivity analysis techniques and their applications to public utilities. The objective of the volume is not to present a unified point of view, but rather to cover a sample of different methodologies and perspectives through original, scholarly papers.
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
Max-Min problems are two-step allocation problems in which one side must make his move knowing that the other side will then learn what the move is and optimally counter. They are fundamental in particular to military weapons-selection problems involving large systems such as Minuteman or Polaris, where the systems in the mix are so large that they cannot be concealed from an opponent. One must then expect the opponent to determine an optimal mixture of, in the case mentioned above, anti-Minuteman and anti-submarine effort. The author's first introduction to a problem of Max-Min type occurred at The RAND Corporation about 1951. One side allocates anti-missile defenses to various cities. The other side observes this allocation and then allocates missiles to those cities. If F(x, y) denotes the total residual value of the cities after the attack, with x denoting the defender's strategy and y the attacker's, the problem is then to find max over x of min over y of F(x, y).
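The city-defense story can be made concrete with a brute-force sketch: the defender commits to an allocation x, the attacker observes it and chooses y to minimize the residual value F(x, y), and the defender picks the x whose worst case is best. The city values, budgets, and the halving damage model below are invented for illustration; they are not from the book.

```python
from itertools import product

# Hypothetical values of three cities and discrete allocation levels.
CITY_VALUES = [10.0, 6.0, 4.0]
LEVELS = range(0, 4)          # units of defense / attack per city
BUDGET = 3                    # total units each side may allocate

def residual_value(defense, attack):
    """F(x, y): value surviving when 'attack' missiles face 'defense'
    interceptors at each city; each unintercepted missile destroys
    half of the remaining value (an illustrative damage model)."""
    total = 0.0
    for v, d, a in zip(CITY_VALUES, defense, attack):
        leak = max(a - d, 0)          # missiles not intercepted
        total += v * (0.5 ** leak)
    return total

def allocations(budget, n):
    """All ways to spread 'budget' units over n cities."""
    return [x for x in product(LEVELS, repeat=n) if sum(x) == budget]

def max_min(budget):
    """max over x of min over y of F(x, y): the attacker moves second."""
    n = len(CITY_VALUES)
    best = None
    for x in allocations(budget, n):
        worst = min(residual_value(x, y) for y in allocations(budget, n))
        if best is None or worst > best[0]:
            best = (worst, x)
    return best

value, defense = max_min(BUDGET)
print(value, defense)
```

With three units per side exhaustive enumeration is trivial; for problems of realistic size one would instead turn to linear programming or the iterative methods the book develops.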
This book was first published in 1989. Inference and prediction in human affairs are characterised by a cognitive and reactive sample space, the elements of which are aware both of the statistician and of each other. It is therefore not surprising that methodologies borrowed from classical statistics and the physical sciences have yielded disappointingly few lasting empirical insights and have sometimes failed in predictive mode. This book puts the underlying methodology of socioeconomic statistics on a firmer footing by placing it within the ambit of inferential and predictive games. It covers such problems as learning, publication, non-response, strategic response, the nature and possibility of rational expectations, time inconsistency, intrinsic nonstationarity, and the existence of probabilities. Ideas are introduced such as real-time survey schemes, argument instability and reaction-proof forecasting based on stochastic approximation. Applications are canvassed to such topics as attitude measurement, political polling, econometric modelling under heterogeneous information, and the forecasting of hallmark events.
The present work is an extension of my doctoral thesis done at Stanford in the early 1970s. In one clear sense it responds to the call for consilience by Edward O. Wilson. I agree with Wilson that there is a pressing need in the sciences today for the unification of the social with the natural sciences. I consider the present work to proceed from the perspective of behavioral ecology, specifically a subfield which I choose to call interpersonal behavioral ecology. Ecology, as a general field, has emerged in the last quarter of the 20th century as a major theme of concern as we have become increasingly aware that we must preserve the planet whose limited resources we share with all other earthly creatures. Interpersonal behavioral ecology, however, focuses not on the physical environment, but upon our social environment. It concerns our interpersonal behavioral interactions at all levels, from simple dyadic one-to-one personal interactions to our larger, even global, social, economic, and political interactions. Interpersonal behavioral ecology, as I see it, then, is concerned with our behavior toward each other, from the most obvious behaviors of war between nations, to excessive competition, exploitation, crime, abuse, and even to the ways in which we interact with each other as individuals in the family, in our social lives, in the workplace, and in the marketplace.
Studies in Consumer Demand - Econometric Methods Applied to Market Data contains eight previously unpublished studies of consumer demand. Each study stands on its own as a complete econometric analysis of demand for a well-defined consumer product. The econometric methods range from simple regression techniques applied in the first four chapters, to the use of logit and multinomial logit models used in chapters 5 and 6, to the use of nested logit models in chapters 6 and 7, and finally to the discrete/continuous modeling methods used in chapter 8. Emphasis is on applications rather than econometric theory. In each case, enough detail is provided for the reader to understand the purpose of the analysis, the availability and suitability of data, and the econometric approach to measuring demand.
Technology Commercialization: DEA and Related Analytical Methods for Evaluating The Use and Implementation of Technical Innovation examines both general research and development commercialization and targeted new product innovation. New product development is a major occupation of the technical sector of the global economy and is viewed in many ways as a means of economic stability for a business, an industry, and a country. The heart of the book is a detailing of the analytical methods, with special, but not exclusive, emphasis on DEA methods, for evaluating and ranking the most promising R&D and technical innovation being developed. The sponsors of the research and development may involve universities, countries, industries, and corporations; all of these sources are covered in the book. In addition, the trade-off of environmental problems vis-a-vis new product development is discussed in a section of the book. Sten Thore (editor and author) has woven together the chapter contributions by a strong group of international researchers into a book that has characteristics of both a monograph and a unified edited volume of well-written papers in DEA, technology evaluation, R&D, and environmental economics. Finally, the use of DEA as an evaluation method for product innovation is an important new development in the field of R&D commercialization.