Robust statistics is important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers, practitioners, and younger researchers. The papers cover many different aspects of robust statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms, and papers on applications and programming tools complete the volume.
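To see what robustness means in practice, consider the simplest illustrative case (a sketch on simulated data, not an example from the volume): a single gross outlier can drag the sample mean arbitrarily far, while the median, with its breakdown point of 1/2, barely reacts.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=1.0, size=99)   # clean observations
x_bad = np.append(x, 1e6)                      # one gross outlier

# The mean is dragged to ~10010; the median stays near 10.
print(np.mean(x), np.median(x))          # ~10.0   ~10.0
print(np.mean(x_bad), np.median(x_bad))  # ~10010  ~10.0
```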
The present work is an extension of my doctoral thesis done at Stanford in the early 1970s. In one clear sense it responds to the call for consilience by Edward O. Wilson. I agree with Wilson that there is a pressing need in the sciences today for the unification of the social with the natural sciences. I consider the present work to proceed from the perspective of behavioral ecology, specifically a subfield which I choose to call interpersonal behavioral ecology. Ecology, as a general field, has emerged in the last quarter of the 20th century as a major theme of concern as we have become increasingly aware that we must preserve the planet whose limited resources we share with all other earthly creatures. Interpersonal behavioral ecology, however, focuses not on the physical environment, but upon our social environment. It concerns our interpersonal behavioral interactions at all levels, from simple dyadic one-to-one personal interactions to our larger, even global, social, economic, and political interactions. Interpersonal behavioral ecology, as I see it, then, is concerned with our behavior toward each other, from the most obvious behaviors of war between nations, to excessive competition, exploitation, crime, abuse, and even to the ways in which we interact with each other as individuals in the family, in our social lives, in the workplace, and in the marketplace.
Technology Commercialization: DEA and Related Analytical Methods for Evaluating the Use and Implementation of Technical Innovation examines both general research and development commercialization and targeted new product innovation. New product development is a major occupation of the technical sector of the global economy and is viewed in many ways as a means of economic stability for a business, an industry, and a country. The heart of the book is a detailing of the analytical methods, with special but not exclusive emphasis on DEA methods, for evaluating and ranking the most promising R&D and technical innovation being developed. The sponsors of the research and development may include universities, countries, industries, and corporations; all of these sources are covered in the book. In addition, the trade-off between environmental problems and new product development is discussed in a section of the book. Sten Thore (editor and author) has woven together the chapter contributions by a strong group of international researchers into a book that has characteristics of both a monograph and a unified edited volume of well-written papers in DEA, technology evaluation, R&D, and environmental economics. Finally, the use of DEA as an evaluation method for product innovation is an important new development in the field of R&D commercialization.
Since 1993 a major research programme, "Stochastic Decision Analysis in Forest Management", has been running at the Department of Economics and Natural Resources, The Royal Veterinary and Agricultural University (KVL), Copenhagen, in collaboration with the Institute of Mathematical Statistics, University of Copenhagen (KU). The research is funded by the two universities; The Danish Agricultural and Veterinary Research Council; The Danish Research Academy; The National Forest and Nature Agency; and the Danish Informatics Network in the Agricultural Sciences (DINA). A first international workshop in the research programme was held 5-8 August 1996 at Eldrupgaard, Denmark, within the framework of a collaboration agreement between the University of California at Berkeley (UCB) and the Danish universities, and was funded by The Danish Research Academy and the Løvenholm Foundation. Having participated in the workshop, Professor Peter Berck (UCB) suggested that the papers be published along with selected papers in the same scientific field, i.e. mainly cointegration analysis of time series in forestry. The editors express their sincere appreciation to the many persons who have contributed to the realisation of the present book: participants in the research programme and the workshop, in particular Professors Søren Johansen (KU) and Peter Berck (UCB); authors outside the programme/workshop; reviewers of the papers not previously published, in particular Associate Professors Niels Haldrup (Aarhus University) and Henrik Hansen (KVL); and finally Mrs Mette Riis and Lizzie Rohde, who did the tedious work of giving the papers a uniform style. Copenhagen, October 1998.
In this book, we synthesize a rich and vast literature on econometric challenges associated with accounting choices and their causal effects. Identification and estimation of endogenous causal effects is particularly challenging, as observable data are rarely directly linked to the causal effect of interest. A common strategy is to employ logically consistent probability assessment via Bayes' theorem to connect observable data to the causal effect of interest. For example, the implications of earnings management as equilibrium reporting behavior are a centerpiece of our explorations. Rather than offering recipes or algorithms, the book surveys our experiences with accounting and econometrics. That is, we focus on why rather than how. The book can be utilized in a variety of venues. On the surface it is geared toward graduate studies, and surely this is where its roots lie. If we're serious about our studies, that is, if we tackle interesting and challenging problems, then there is a natural progression. Our research addresses problems that are not well understood, then incorporates them throughout our curricula as our understanding improves and to improve our understanding (in other words, learning and curriculum development are endogenous). For accounting to be a vibrant academic discipline, we believe it is essential that these issues be confronted in the undergraduate classroom as well as in graduate studies. We hope we've made some progress with examples which will encourage these developments.
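As a toy illustration of the Bayesian step described above (all probabilities below are invented, purely to show the mechanics of connecting an observable signal to an unobserved causal state):

```python
# Hypothetical Bayes' theorem example: update the probability of an
# unobserved state (earnings management) after seeing an observable
# report (the firm meets its forecast). All numbers are invented.
prior_manage = 0.3            # P(manage)
p_meet_given_manage = 0.9     # P(meet forecast | manage)
p_meet_given_honest = 0.4     # P(meet forecast | honest)

p_meet = (p_meet_given_manage * prior_manage
          + p_meet_given_honest * (1 - prior_manage))
posterior = p_meet_given_manage * prior_manage / p_meet
print(round(posterior, 3))    # 0.491: the observed report raises P(manage)
```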
Every advance in computer architecture and software tempts statisticians to tackle numerically harder problems. To do so intelligently requires a good working knowledge of numerical analysis. This book equips students to craft their own software and to understand the advantages and disadvantages of different numerical methods. Issues of numerical stability, accurate approximation, computational complexity, and mathematical modeling share the limelight in a broad yet rigorous overview of those parts of numerical analysis most relevant to statisticians. In this second edition, the material on optimization has been completely rewritten. There is now an entire chapter on the MM algorithm in addition to more comprehensive treatments of constrained optimization, penalty and barrier methods, and model selection via the lasso. There is also new material on the Cholesky decomposition, Gram-Schmidt orthogonalization, the QR decomposition, the singular value decomposition, and reproducing kernel Hilbert spaces. The discussions of the bootstrap, permutation testing, independent Monte Carlo, and hidden Markov chains are updated, and a new chapter on advanced MCMC topics introduces students to Markov random fields, reversible jump MCMC, and convergence analysis in Gibbs sampling. Numerical Analysis for Statisticians can serve as a graduate text for a course surveying computational statistics. With a careful selection of topics and appropriate supplementation, it can be used at the undergraduate level. It contains enough material for a graduate course on optimization theory. Because many chapters are nearly self-contained, professional statisticians will also find the book useful as a reference.
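To give a flavor of the numerical-stability issues the book treats, the following sketch (illustrative, not taken from the book) contrasts classical Gram-Schmidt orthogonalization with the Householder-based QR factorization behind numpy's np.linalg.qr on a nearly rank-deficient matrix:

```python
import numpy as np

# Classical Gram-Schmidt loses orthogonality on an ill-conditioned
# matrix, while Householder QR keeps it near machine precision.
def classical_gram_schmidt(A):
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        v = A[:, j] - Q[:, :j] @ (Q[:, :j].T @ A[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

rng = np.random.default_rng(6)
A = rng.normal(size=(50, 10))
A[:, -1] = A[:, 0] + 1e-9 * rng.normal(size=50)   # nearly dependent column

Q_cgs = classical_gram_schmidt(A)
Q_hh, _ = np.linalg.qr(A)
err = lambda Q: np.linalg.norm(Q.T @ Q - np.eye(Q.shape[1]))
print(err(Q_cgs), err(Q_hh))   # CGS error is orders of magnitude larger
```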
Control theory methods in economics have historically developed over three phases. The first involved basically the feedback control rules in a deterministic framework, which were applied in macrodynamic models for analyzing stabilization policies. The second phase raised the issues of various types of inconsistencies in deterministic optimal control models due to changing information and other aspects of stochasticity. Rational expectations models have been extensively used in this phase to resolve some of the inconsistency problems. The third phase has recently focused on the various aspects of adaptive control, where stochasticity and information adaptivity are introduced in diverse ways, e.g. risk adjustment and risk sensitivity of optimal control, recursive updating rules via Kalman filtering and weighted recursive least squares, and variable structure control methods in a nonlinear framework. Problems of efficient econometric estimation of optimal control models have now acquired significant importance. This monograph provides an integrated view of control theory methods, synthesizing the three phases from feedback control to stochastic control and from stochastic control to adaptive control. Aspects of econometric estimation are strongly emphasized here, since these are very important in empirical applications in economics.
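As a minimal sketch of the recursive updating rules mentioned above, here is a scalar Kalman filter for a local-level model; the parameter values and data are invented for the illustration:

```python
import numpy as np

# Scalar Kalman filter for the local-level model
#   state:       a_t = a_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t = a_t + v_t,      v_t ~ N(0, r)
def kalman_filter(y, q=0.1, r=1.0, a0=0.0, p0=10.0):
    a, p = a0, p0
    estimates = []
    for obs in y:
        p = p + q                # predict: state variance grows
        k = p / (p + r)          # Kalman gain
        a = a + k * (obs - a)    # update toward the new observation
        p = (1 - k) * p
        estimates.append(a)
    return np.array(estimates)

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, 0.3, 100))   # slowly drifting state
y = truth + rng.normal(0, 1.0, 100)          # noisy observations
print(kalman_filter(y)[-5:])                 # filtered state estimates
```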
This 2005 volume brings together twelve papers by many of the most prominent applied general equilibrium modelers honoring Herbert Scarf, the father of equilibrium computation in economics. It deals with developments in applied general equilibrium, a field which has broadened greatly since the 1980s. The contributors discuss some traditional as well as some modern topics in the field, including non-convexities in economy-wide models, tax policy, developmental modeling and energy modeling. The book also covers a range of distinct approaches, conceptual issues and computational algorithms, such as calibration and areas of application such as macroeconomics of real business cycles and finance. An introductory chapter written by the editors maps out issues and scenarios for the future evolution of applied general equilibrium.
This 2005 volume contains the papers presented in honor of the lifelong achievements of Thomas J. Rothenberg on the occasion of his retirement. The authors of the chapters include many of the leading econometricians of our day, and the chapters address topics of current research significance in econometric theory. The chapters cover four themes: identification and efficient estimation in econometrics; asymptotic approximations to the distributions of econometric estimators and tests; inference involving potentially nonstationary time series, such as processes that might have a unit autoregressive root; and nonparametric and semiparametric inference. Several of the chapters provide overviews and treatments of basic conceptual issues, while others advance our understanding of the properties of existing econometric procedures and/or propose new ones. Specific topics include identification in nonlinear models, inference with weak instruments, tests for nonstationarity in time series and panel data, generalized empirical likelihood estimation, and the bootstrap.
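For a concrete handle on the unit-root theme, a bare-bones Dickey-Fuller regression (a sketch on simulated data, not a procedure from the volume) looks like this:

```python
import numpy as np

# Dickey-Fuller regression for a unit autoregressive root:
# regress dy_t on y_{t-1} and test phi = 0 (unit root) against phi < 0.
# The series below is a pure random walk, so the t-statistic should
# typically stay above the ~ -2.86 Dickey-Fuller 5% critical value.
rng = np.random.default_rng(5)
y = np.cumsum(rng.normal(size=500))          # random walk: has a unit root

dy, ylag = np.diff(y), y[:-1]
X = np.column_stack([np.ones(len(ylag)), ylag])
beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
resid = dy - X @ beta
s2 = resid @ resid / (len(dy) - X.shape[1])
se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
print(beta[1] / se)   # compare with Dickey-Fuller (not normal) critical values
```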
Originally published in 2005, Weather Derivative Valuation covers all the meteorological, statistical, financial and mathematical issues that arise in the pricing and risk management of weather derivatives. There are chapters on meteorological data and data cleaning, the modelling and pricing of single weather derivatives, the modelling and valuation of portfolios, the use of weather and seasonal forecasts in the pricing of weather derivatives, arbitrage pricing for weather derivatives, risk management, and the modelling of temperature, wind and precipitation. Specific issues covered in detail include the analysis of uncertainty in weather derivative pricing, time-series modelling of daily temperatures, the creation and use of probabilistic meteorological forecasts and the derivation of the weather derivative version of the Black-Scholes equation of mathematical finance. Written by consultants who work within the weather derivative industry, this book is packed with practical information and theoretical insight into the world of weather derivative pricing.
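One of the simplest valuation ideas in this area, burn analysis, averages the payoffs a contract would have produced over past seasons. The sketch below (index history, strike, and tick size all invented) prices a heating-degree-day call this way and reports the sampling uncertainty of the estimate:

```python
import numpy as np

# Burn analysis: price a heating-degree-day (HDD) call option by
# averaging its hypothetical payoffs over historical winters.
# The index values, strike and tick size below are invented.
hdd_history = np.array([1720, 1650, 1810, 1590, 1700,
                        1760, 1680, 1630, 1740, 1695])  # seasonal HDD totals
strike, tick = 1700.0, 5000.0    # payoff = tick * max(HDD - strike, 0)

payoffs = tick * np.maximum(hdd_history - strike, 0.0)
price = payoffs.mean()                               # undiscounted burn price
stderr = payoffs.std(ddof=1) / np.sqrt(len(payoffs))
print(price, stderr)             # estimate and its sampling uncertainty
```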
In economics, many quantities are related to each other. Such economic relations are often much more complex than relations in science and engineering, where some quantities are independent and the relation between others can be well approximated by a linear function. To make economic models more adequate, we need more accurate techniques for describing dependence. Such techniques are currently being developed. This book contains descriptions of state-of-the-art techniques for modeling dependence and economic applications of these techniques.
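To see why linear approximations to dependence can mislead, compare linear (Pearson) correlation with rank (Spearman) correlation, one of the building blocks of modern dependence modelling, on a perfectly monotone but nonlinear relation (a sketch; the data are synthetic):

```python
import numpy as np

# A monotone but highly nonlinear relation: Pearson correlation
# understates the dependence, while Spearman rank correlation
# recovers it exactly.
rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 1000)
y = np.exp(10 * x)               # deterministic, monotone, nonlinear

def ranks(a):
    return np.argsort(np.argsort(a))

pearson = np.corrcoef(x, y)[0, 1]
spearman = np.corrcoef(ranks(x), ranks(y))[0, 1]
print(pearson)    # well below 1 despite perfect dependence
print(spearman)   # exactly 1: the relation is perfectly monotone
```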
The manuscript reviews some key ideas about artificial intelligence and relates them to economics. These include its relation to robotics, and the concepts of synthetic emotions, consciousness, and life. The economic implications of the advent of artificial intelligence, such as its effect on prices and wages, appropriate patent policy, and the possibility of accelerating productivity, are discussed. The growing field of artificial economics and the use of artificial agents in experimental economics are considered.
Macroeconomic Modelling has undergone radical changes in the last few years. There has been considerable innovation in developing robust solution techniques for the new breed of increasingly complex models. Similarly there has been a growing consensus on their long run and dynamic properties, as well as much development on existing themes such as modelling expectations and policy rules. This edited volume focuses on those areas which have undergone the most significant and imaginative developments and brings together the very best of modelling practice. We include specific sections on (I) Solving Large Macroeconomic Models, (II) Rational Expectations and Learning Approaches, (III) Macro Dynamics, and (IV) Long Run and Closures. All of the contributions offer new research whilst putting their developments firmly in context and as such will influence much future research in the area. It will be an invaluable text for those in policy institutions as well as academics and advanced students in the fields of economics, mathematics, business and government. Our contributors include those working in central banks, the IMF, European Commission and established academics.
Econometric models are made up of assumptions which never exactly match reality. Among the most contested ones is the requirement that the coefficients of an econometric model remain stable over time. Recent years have therefore seen numerous attempts to test for it or to model possible structural change when it can no longer be ignored. This collection of papers from Empirical Economics mirrors part of this development. The point of departure of most studies in this volume is the standard linear regression model $y_t = x_t'\beta_t + u_t$ $(t = 1, \ldots, T)$, where notation is obvious and where the index $t$ emphasises the fact that structural change is mostly discussed and encountered in a time series context. It is much less of a problem for cross section data, although many tests apply there as well. The null hypothesis of most tests for structural change is that $\beta_t = \beta_0$ for all $t$, i.e. that the same regression applies to all time periods in the sample and that the disturbances $u_t$ are well behaved. The well known Chow test for instance assumes that there is a single structural shift at a known point in time, i.e. that $\beta_t = \beta_0$ $(t < t^*)$ and $\beta_t = \beta_0 + \Delta\beta$ $(t \geq t^*)$, where $t^*$ is known.
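A minimal implementation of the Chow test just described, run on simulated data with a deliberate slope shift at a known date $t^*$ (illustrative only):

```python
import numpy as np

# Chow test for a single structural break at a known date t*:
# under H0 (beta_t = beta_0 for all t) the statistic is F(k, T - 2k).
def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

rng = np.random.default_rng(3)
T, t_star = 120, 60
x = rng.normal(size=T)
beta = np.where(np.arange(T) < t_star, 1.0, 2.0)   # slope shifts at t*
y = beta * x + rng.normal(size=T)
X = np.column_stack([np.ones(T), x])
k = X.shape[1]                                     # number of coefficients

rss_pooled = rss(X, y)                             # one regression for all t
rss_split = rss(X[:t_star], y[:t_star]) + rss(X[t_star:], y[t_star:])
F = ((rss_pooled - rss_split) / k) / (rss_split / (T - 2 * k))
print(F)   # compare with the F(k, T - 2k) critical value
```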
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
This important collection brings together leading econometricians to discuss advances in the econometrics of panel data. The papers in this collection can be grouped into two categories. The first, which includes chapters by Amemiya, Baltagi, Arellano, Bover and Labeaga, primarily deals with different aspects of limited dependent variables and sample selectivity. The second group of papers, including those by Nerlove, Schmidt and Ahn, Kiviet, Davies and Lahiri, considers issues that arise in the estimation of dynamic (possibly) heterogeneous panel data models. Overall, the contributors focus on the issues of simplifying complex real-world phenomena into easily generalisable inferences from individual outcomes. As the contributions of G. S. Maddala in the fields of limited dependent variables and panel data were particularly influential, it is a fitting tribute that this volume is dedicated to him.
Swaps, futures, options, structured instruments - a wide range of derivative products is traded in today's financial markets. Analyzing, pricing and managing such products often requires fairly sophisticated quantitative tools and methods. This book serves as an introduction to financial mathematics with special emphasis on aspects relevant in practice. In addition to numerous illustrative examples, algorithmic implementations are demonstrated using "Mathematica" and the software package "UnRisk" (available for both students and teachers). The content is organized in 15 chapters that can be treated as independent modules. In particular, the exposition is tailored for classroom use in a Bachelor or Master program course, as well as for practitioners who wish to further strengthen their quantitative background.
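While the book's own implementations use Mathematica and UnRisk, the flavor of such pricing code can be conveyed by a short Black-Scholes call pricer written here in Python (the inputs are illustrative):

```python
from math import exp, log, sqrt
from statistics import NormalDist

# Black-Scholes price of a European call option.
def bs_call(S, K, r, sigma, T):
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Spot 100, strike 100, 5% rate, 20% volatility, one year to expiry.
print(bs_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0))  # ~10.45
```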
This book reports the results of five empirical studies undertaken in the early seventies by a collaboration headed by Professor Morishima. It deals with applications of the general equilibrium models whose theoretical aspects have been one of Professor Morishima's main interests. Four main econometric models are constructed for the USA, the UK, and Japan. These are used as a basis for the discussion of various topics in economic theory, such as: the existence and stability or instability of the neoclassical path of full employment growth equilibrium and a von Neumann-type path of balanced growth at constant prices; the antinomy between price stability and full employment; the Samuelson-LeChatelier principle; the theory of the balanced-budget multiplier; the three Hicksian laws of the gross substitutes system; the Brown-Jones super-multipliers of international trade, and so on. In addition, this 1972 work makes a quantitative evaluation for the US economy of monetary and fiscal policies as short-run measures for achieving full employment; the effectiveness of built-in flexibility of taxes in the UK economy is discussed; and estimates are made of the rapid decrease in disguised unemployment in post-war Japan.
This tutorial presents a hands-on introduction to a new discrete choice modeling approach based on the behavioral notion of regret minimization. This so-called Random Regret Minimization approach (RRM) forms a counterpart to the Random Utility Maximization approach (RUM) to discrete choice modeling, which has for decades dominated the field of choice modeling and adjacent fields such as transportation, marketing and environmental economics. Being as parsimonious as conventional RUM models and compatible with popular software packages, the RRM approach provides an alternative and appealing account of choice behavior. Rather than providing highly technical discussions as usually encountered in scholarly journals, this tutorial aims to allow readers to explore the RRM approach and its potential and limitations hands-on, based on a detailed discussion of examples. This tutorial is written for students, scholars and practitioners who have a basic background in choice modeling in general and RUM modeling in particular. Care has been taken to ensure that all concepts and results are clear to readers who do not have an advanced knowledge of econometrics.
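The contrast between the two paradigms can be sketched in a few lines using the canonical RRM regret function, which accumulates ln(1 + exp(beta * (x_j - x_i))) over competing alternatives and attributes; the attribute values and coefficients below are invented for the illustration:

```python
import numpy as np

# Choice probabilities for one choice set under RUM (multinomial logit)
# and under the canonical RRM specification. All numbers are invented.
X = np.array([[3.0, 2.0],      # alternative 1: (time, cost)
              [2.0, 3.0],      # alternative 2
              [4.0, 1.0]])     # alternative 3
beta = np.array([-0.8, -0.5])  # both attributes are "bads"

# RUM: utility-maximizing logit
V = X @ beta
p_rum = np.exp(V) / np.exp(V).sum()

# RRM: regret of i sums ln(1 + exp(beta*(x_j - x_i))) over rivals j
n = len(X)
R = np.array([sum(np.log1p(np.exp(beta * (X[j] - X[i]))).sum()
                  for j in range(n) if j != i) for i in range(n)])
p_rrm = np.exp(-R) / np.exp(-R).sum()
print(p_rum, p_rrm)   # the two behavioral accounts differ
```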
Charles de Gaulle begins his Mémoires d'Espoir thus: 'France comes from the depths of the ages. She lives. The centuries call to her. But she remains herself through time. Her borders may shift without changing the relief, the climate, the rivers and the seas that mark her indelibly. She is inhabited by peoples who, over the course of history, have been gripped by the most diverse trials, but whom the nature of things, harnessed by politics, ceaselessly moulds into a single nation. That nation has embraced many generations. It currently comprises several, and it will give birth to many more. But by the geography of the country that is hers, by the genius of the races that compose her, and by the neighbours who surround her, she takes on a constant character which makes the French of each era dependent on their fathers and binds them to their descendants. Unless it breaks apart, this human whole, on this territory, within this universe, therefore carries a past, a present and a future that are indissoluble. And so the State, which answers for France, is responsible at once for her heritage of yesterday, her interests of today, and her hopes for tomorrow.' In the light of this idea of the nation, it is clear that dialogue between nations is eminently important, and that the Semaine Universitaire Franco-Néerlandaise is an institution that stimulates this dialogue.
This book was first published in 1989. Inference and prediction in human affairs are characterised by a cognitive and reactive sample space, the elements of which are aware both of the statistician and of each other. It is therefore not surprising that methodologies borrowed from classical statistics and the physical sciences have yielded disappointingly few lasting empirical insights and have sometimes failed in predictive mode. This book puts the underlying methodology of socioeconomic statistics on a firmer footing by placing it within the ambit of inferential and predictive games. It covers such problems as learning, publication, non-response, strategic response, the nature and possibility of rational expectations, time inconsistency, intrinsic nonstationarity, and the existence of probabilities. Ideas are introduced such as real-time survey schemes, argument instability and reaction-proof forecasting based on stochastic approximation. Applications are canvassed to such topics as attitude measurement, political polling, econometric modelling under heterogeneous information, and the forecasting of hallmark events.
Game theory has provided an extremely useful tool in enabling economists to venture into unknown areas. Its concepts of conflict and cooperation apply whenever the actions of several agents are interdependent, providing a language with which to formulate, structure, analyze, and understand strategic scenarios. Economic Behavior, Game Theory, and Technology in Emerging Markets explores game theory and its deep impact on developmental economics, specifically the manner in which it provides a way of formalizing institutions. This is particularly important for emerging economies, which have not yet received much attention in the academic world. This publication is useful for academics, professors, and researchers in this field, but it has also been compiled to meet the needs of non-specialists.
This third edition of Braun and Murdoch's bestselling textbook now includes discussion of the use and design principles of the tidyverse packages in R, including expanded coverage of ggplot2 and R Markdown. The expanded simulation chapter introduces the Box-Muller and Metropolis-Hastings algorithms. New examples and exercises have been added throughout. This is the only introduction you'll need to start programming in R, the computing standard for analyzing data. This book comes with real R code that teaches the standards of the language. Unlike other introductory books on the R system, this book emphasizes portable programming skills that apply to most computing languages, as well as techniques used to develop more complex projects. Solutions, datasets, and any errata are available from www.statprogr.science. Worked examples from real applications, hundreds of exercises, and downloadable code, datasets, and solutions make this a complete package for anyone working in or learning practical data science.
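The Box-Muller algorithm mentioned above converts two independent uniform draws into two independent standard normal draws. A compact sketch (written in Python here for illustration; the book itself works in R):

```python
import numpy as np

# Box-Muller transform: (u1, u2) ~ U(0,1) independent gives
# z0 = sqrt(-2 ln u1) cos(2 pi u2) and z1 = sqrt(-2 ln u1) sin(2 pi u2),
# both independent standard normals.
rng = np.random.default_rng(4)
u1, u2 = rng.uniform(size=10000), rng.uniform(size=10000)
z0 = np.sqrt(-2 * np.log(u1)) * np.cos(2 * np.pi * u2)
z1 = np.sqrt(-2 * np.log(u1)) * np.sin(2 * np.pi * u2)
print(z0.mean(), z0.std())   # ~0 and ~1, as expected for N(0,1)
```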
This book was first published in 1995. The problem of disparities between different estimates of GDP is well known and widely discussed. Here, the authors describe a method for examining the discrepancies using a technique allocating them with reference to data reliability. The method enhances the reliability of the underlying data and leads to maximum-likelihood estimates. It is illustrated by application to the UK national accounts for the period 1920-1990. The book includes a full set of estimates for this period, including runs of industrial data for the period 1948-1990, which are longer than those available from any other source. The statistical technique allows estimates of standard errors of the data to be calculated and verified; these are presented both for data in levels and for changes in variables over 1-, 2- and 5-year periods.
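A reliability-weighted allocation of this kind has, under normality, a closed-form constrained least-squares solution. The following sketch shows the mechanics in the simplest one-constraint case (all numbers invented, and not the authors' actual UK data):

```python
import numpy as np

# Balancing: initial estimates x with variance matrix V must satisfy
# the accounting identity A x = c. The maximum-likelihood adjustment is
#   x* = x - V A' (A V A')^{-1} (A x - c),
# so less reliable series absorb more of the discrepancy.
x = np.array([102.0, 99.0])   # two independent estimates of GDP
V = np.diag([4.0, 1.0])       # the first is the less reliable
A = np.array([[1.0, -1.0]])   # constraint: the two must agree
c = np.array([0.0])

lam = np.linalg.solve(A @ V @ A.T, A @ x - c)
x_star = x - V @ A.T @ lam
print(x_star)   # [99.6 99.6]: the noisier series moves four times as far
```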
You may like...
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover), R3,286 (Discovery Miles 32 860)
Applied Econometric Analysis - Emerging… by Brian W Sloboda, Yaya Sissoko (Hardcover), R5,351 (Discovery Miles 53 510)
The Handbook of Historical Economics by Alberto Bisin, Giovanni Federico (Paperback), R2,567 (Discovery Miles 25 670)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
Operations and Supply Chain Management by James Evans, David Collier (Hardcover)
Operations And Supply Chain Management by David Collier, James Evans (Hardcover)