Books > Business & Economics > Economics > Econometrics
In economics, many quantities are related to each other. Such economic relations are often much more complex than relations in science and engineering, where some quantities are independent and the relation between others can be well approximated by linear functions. To make economic models more adequate, we need more accurate techniques for describing dependence. Such techniques are currently being developed. This book contains descriptions of state-of-the-art techniques for modeling dependence and economic applications of these techniques.
This is the first textbook designed to teach statistics to students in aviation courses. All examples and exercises are grounded in an aviation context, including flight instruction, air traffic control, airport management, and human factors. Structured in six parts, the book covers the key foundational topics of descriptive and inferential statistics, including hypothesis testing, confidence intervals, z and t tests, correlation, regression, ANOVA, and chi-square. In addition, this book promotes both procedural knowledge and conceptual understanding. Detailed, guided examples are presented from the perspective of conducting a research study. Each analysis technique is clearly explained, enabling readers to understand, carry out, and report results correctly. Students are further supported by a range of pedagogical features in each chapter, including objectives, a summary, and a vocabulary check. Digital supplements comprise downloadable data sets and short video lectures explaining key concepts. Instructors also have access to PPT slides and an instructor's manual that consists of a test bank with multiple choice exams, exercises with data sets, and solutions. This is the ideal statistics textbook for aviation courses globally, especially in aviation statistics, research methods in aviation, human factors, and related areas.
The manuscript reviews some key ideas about artificial intelligence and relates them to economics. These include its relation to robotics, and the concepts of synthetic emotions, consciousness, and life. The economic implications of the advent of artificial intelligence, such as its effect on prices and wages, appropriate patent policy, and the possibility of accelerating productivity, are discussed. The growing field of artificial economics and the use of artificial agents in experimental economics are also considered.
Macroeconomic Modelling has undergone radical changes in the last few years. There has been considerable innovation in developing robust solution techniques for the new breed of increasingly complex models. Similarly there has been a growing consensus on their long run and dynamic properties, as well as much development on existing themes such as modelling expectations and policy rules. This edited volume focuses on those areas which have undergone the most significant and imaginative developments and brings together the very best of modelling practice. We include specific sections on (I) Solving Large Macroeconomic Models, (II) Rational Expectations and Learning Approaches, (III) Macro Dynamics, and (IV) Long Run and Closures. All of the contributions offer new research whilst putting their developments firmly in context and as such will influence much future research in the area. It will be an invaluable text for those in policy institutions as well as academics and advanced students in the fields of economics, mathematics, business and government. Our contributors include those working in central banks, the IMF, European Commission and established academics.
Econometric models are made up of assumptions which never exactly match reality. Among the most contested ones is the requirement that the coefficients of an econometric model remain stable over time. Recent years have therefore seen numerous attempts to test for it or to model possible structural change when it can no longer be ignored. This collection of papers from Empirical Economics mirrors part of this development. The point of departure of most studies in this volume is the standard linear regression model y_t = x_t'β_t + u_t (t = 1, ..., T), where notation is obvious and where the index t emphasises the fact that structural change is mostly discussed and encountered in a time series context. It is much less of a problem for cross-section data, although many tests apply there as well. The null hypothesis of most tests for structural change is that β_t = β_0 for all t, i.e. that the same regression applies to all time periods in the sample and that the disturbances u_t are well behaved. The well-known Chow test, for instance, assumes that there is a single structural shift at a known point in time, i.e. that β_t = β_0 (t < t*) and β_t = β_0 + Δβ (t ≥ t*), where t* is known.
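The blurb stops at the null hypothesis, so the following is a minimal Python sketch (not taken from the volume itself) of the classic Chow F-test for a single break at a known date t*; the simulated data, variable names and parameter values are our own illustrative assumptions.

```python
import numpy as np

def ols_rss(X, y):
    """Residual sum of squares from an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def chow_test(X, y, t_star):
    """Chow F-statistic for a single break at the known index t_star.

    Assumes the same k regressors enter both sub-samples and that the
    disturbances are well behaved under the null of no break.
    """
    n, k = X.shape
    rss_pooled = ols_rss(X, y)                      # beta constant over t
    rss_1 = ols_rss(X[:t_star], y[:t_star])         # before the break
    rss_2 = ols_rss(X[t_star:], y[t_star:])         # after the break
    return ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))

# Illustrative use with simulated data containing a break at t* = 60
rng = np.random.default_rng(0)
T = 120
X = np.column_stack([np.ones(T), rng.normal(size=T)])
beta0, dbeta = np.array([1.0, 0.5]), np.array([0.0, 1.0])
y = X @ beta0 + (np.arange(T) >= 60) * (X @ dbeta) + rng.normal(scale=0.5, size=T)
print(chow_test(X, y, t_star=60))   # compare with an F(k, T - 2k) critical value
```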
This important collection brings together leading econometricians to discuss advances in the econometrics of panel data. The papers in this collection can be grouped into two categories. The first, which includes chapters by Amemiya, Baltagi, Arellano, Bover and Labeaga, primarily deals with different aspects of limited dependent variables and sample selectivity. The second group of papers, including those by Nerlove, Schmidt and Ahn, Kiviet, Davies and Lahiri, considers issues that arise in the estimation of dynamic (possibly heterogeneous) panel data models. Overall, the contributors focus on the issues of simplifying complex real-world phenomena into easily generalisable inferences from individual outcomes. As the contributions of G. S. Maddala in the fields of limited dependent variables and panel data were particularly influential, it is a fitting tribute that this volume is dedicated to him.
Swaps, futures, options, structured instruments - a wide range of derivative products is traded in today's financial markets. Analyzing, pricing and managing such products often requires fairly sophisticated quantitative tools and methods. This book serves as an introduction to financial mathematics with special emphasis on aspects relevant in practice. In addition to numerous illustrative examples, algorithmic implementations are demonstrated using "Mathematica" and the software package "UnRisk" (available for both students and teachers). The content is organized in 15 chapters that can be treated as independent modules. In particular, the exposition is tailored for classroom use in a Bachelor or Master program course, as well as for practitioners who wish to further strengthen their quantitative background.
This book reports the results of five empirical studies undertaken in the early seventies by a collaboration headed by Professor Morishima. It deals with applications of the general equilibrium models whose theoretical aspects have been one of Professor Morishima's main interests. Four main econometric models are constructed for the USA, the UK, and Japan. These are used as a basis for the discussion of various topics in economic theory, such as: the existence and stability or instability of the neoclassical path of full employment growth equilibrium and a von Neumann-type path of balanced growth at constant prices; the antinomy between price stability and full employment; the Samuelson-LeChatelier principle; the theory of the balanced-budget multiplier; the three Hicksian laws of the gross substitutes system; the Brown-Jones super-multipliers of international trade, and so on. In addition, this 1972 work makes a quantitative evaluation for the US economy of monetary and fiscal policies as short-run measures for achieving full employment; the effectiveness of built-in flexibility of taxes in the UK economy is discussed; and estimates are made of the rapid decrease in disguised unemployment in post-war Japan.
How could Finance benefit from AI? How can AI techniques provide an edge? Moving well beyond simply speeding up computation, this book tackles AI for Finance from a range of perspectives, including those of business, technology, research, and students. Covering aspects like algorithms, big data, and machine learning, this book answers these and many other questions.
This tutorial presents a hands-on introduction to a new discrete choice modeling approach based on the behavioral notion of regret minimization. This so-called Random Regret Minimization (RRM) approach forms a counterpart to the Random Utility Maximization (RUM) approach to discrete choice modeling, which has for decades dominated the field of choice modeling and adjacent fields such as transportation, marketing and environmental economics. Being as parsimonious as conventional RUM models and compatible with popular software packages, the RRM approach provides an alternative and appealing account of choice behavior. Rather than providing highly technical discussions as usually encountered in scholarly journals, this tutorial aims to allow readers to explore the RRM approach and its potential and limitations hands-on, based on a detailed discussion of examples. This tutorial is written for students, scholars and practitioners who have a basic background in choice modeling in general and RUM modeling in particular. Care has been taken to ensure that all concepts and results are clear to readers who do not have an advanced knowledge of econometrics.
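The tutorial's own examples are not reproduced here, but the regret rule commonly associated with the classical RRM model, summing ln(1 + exp(β_m(x_jm − x_im))) over rival alternatives j and attributes m, can be sketched as follows; the attribute matrix, parameter values and function names are hypothetical illustrations rather than anything quoted from this book.

```python
import numpy as np

def random_regret(x, beta):
    """Systematic regret of each alternative under the classical RRM rule.

    x    : (J, M) array of attribute levels for J alternatives and M attributes
    beta : (M,) array of taste parameters (illustrative values only)
    """
    J, _ = x.shape
    regret = np.zeros(J)
    for i in range(J):
        for j in range(J):
            if j == i:
                continue
            # regret of i grows when rival j beats it on an attribute
            regret[i] += np.sum(np.log1p(np.exp(beta * (x[j] - x[i]))))
    return regret

def choice_probabilities(x, beta):
    """Logit-style choice probabilities based on negative regret."""
    r = random_regret(x, beta)
    expneg = np.exp(-(r - r.min()))   # shift for numerical stability
    return expneg / expneg.sum()

# Three hypothetical travel alternatives described by (cost, travel time)
x = np.array([[2.0, 30.0], [3.5, 20.0], [1.5, 45.0]])
beta = np.array([-0.8, -0.05])        # illustrative parameters only
print(choice_probabilities(x, beta))
```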
This third edition of Braun and Murdoch's bestselling textbook now includes discussion of the use and design principles of the tidyverse packages in R, including expanded coverage of ggplot2, and R Markdown. The expanded simulation chapter introduces the Box-Muller and Metropolis-Hastings algorithms. New examples and exercises have been added throughout. This is the only introduction you'll need to start programming in R, the computing standard for analyzing data. This book comes with real R code that teaches the standards of the language. Unlike other introductory books on the R system, this book emphasizes portable programming skills that apply to most computing languages and techniques used to develop more complex projects. Solutions, datasets, and any errata are available from www.statprogr.science. Worked examples - from real applications - hundreds of exercises, and downloadable code, datasets, and solutions make a complete package for anyone working in or learning practical data science.
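As a small taste of the simulation material mentioned above, here is a sketch of the Box-Muller transform; it is written in Python to stay consistent with the other sketches in this listing, whereas the textbook itself works in R, and the function name and seed are our own choices.

```python
import numpy as np

def box_muller(n, rng=None):
    """Draw n standard normal variates via the Box-Muller transform.

    Pairs of independent Uniform(0, 1) draws (u1, u2) are mapped to
    sqrt(-2 ln u1) * cos(2 pi u2) and sqrt(-2 ln u1) * sin(2 pi u2).
    """
    rng = rng or np.random.default_rng()
    m = (n + 1) // 2
    u1 = 1.0 - rng.random(m)          # values in (0, 1], avoids log(0)
    u2 = rng.random(m)
    r = np.sqrt(-2.0 * np.log(u1))
    z = np.concatenate([r * np.cos(2 * np.pi * u2), r * np.sin(2 * np.pi * u2)])
    return z[:n]

draws = box_muller(10_000, np.random.default_rng(1))
print(draws.mean(), draws.std())      # should be close to 0 and 1
```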
Charles de Gaulle begins his Mémoires d'Espoir thus: 'France comes from the depths of the ages. She lives. The centuries call to her. Yet she remains herself through time. Her borders may shift without changing the relief, the climate, the rivers and the seas that mark her indefinitely. She is inhabited by peoples who, over the course of history, have been gripped by the most diverse trials, but whom the nature of things, worked upon by politics, ceaselessly moulds into a single nation. That nation has embraced many generations. It currently comprises several. It will give birth to many more. But by the geography of the country that is hers, by the genius of the peoples who compose her, and by the neighbours who surround her, she takes on a constant character that makes the French of every era dependent on their fathers and binds them to their descendants. Unless it breaks apart, this human whole, on this territory, within this universe, therefore carries a past, a present and a future that are indissoluble. Thus the State, which answers for France, is charged at once with her heritage of yesterday, her interests of today and her hopes for tomorrow.' In the light of this idea of the nation, it is clear that dialogue between nations is eminently important, and that the Semaine Universitaire Franco-Néerlandaise is an institution for stimulating that dialogue.
This book was first published in 1989. Inference and prediction in human affairs are characterised by a cognitive and reactive sample space, the elements of which are aware both of the statistician and of each other. It is therefore not surprising that methodologies borrowed from classical statistics and the physical sciences have yielded disappointingly few lasting empirical insights and have sometimes failed in predictive mode. This book puts the underlying methodology of socioeconomic statistics on a firmer footing by placing it within the ambit of inferential and predictive games. It covers such problems as learning, publication, non-response, strategic response, the nature and possibility of rational expectations, time inconsistency, intrinsic nonstationarity, and the existence of probabilities. Ideas are introduced such as real-time survey schemes, argument instability and reaction-proof forecasting based on stochastic approximation. Applications are canvassed to such topics as attitude measurement, political polling, econometric modelling under heterogeneous information, and the forecasting of hallmark events.
Game Theory has provided an extremely useful tool in enabling economists to venture into unknown areas. Its concepts of conflict and cooperation apply whenever the actions of several agents are interdependent; providing language to formulate as well as to structure, analyze, and understand strategic scenarios. Economic Behavior, Game Theory, and Technology in Emerging Markets explores game theory and its deep impact in developmental economics, specifically the manner in which it provides a way of formalizing institutions. This is particularly important for emerging economies which have not yet received much attention in the academic world. This publication is useful for academics, professors, and researchers in this field, but it has also been compiled to meet the needs of non-specialists as well.
This book was first published in 1995. The problem of disparities between different estimates of GDP is well known and widely discussed. Here, the authors describe a method for examining the discrepancies using a technique allocating them with reference to data reliability. The method enhances the reliability of the underlying data and leads to maximum-likelihood estimates. It is illustrated by application to the UK national accounts for the period 1920-1990. The book includes a full set of estimates for this period, including runs of industrial data for the period 1948-1990, which are longer than those available from any other source. The statistical technique allows estimates of standard errors of the data to be calculated and verified; these are presented both for data in levels and for changes in variables over 1-, 2- and 5-year periods.
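The balancing idea can be illustrated with a short generalized-least-squares sketch: initial estimates are adjusted so that the accounting identities hold exactly, with less reliable series (larger assumed variances) absorbing more of the discrepancy. The figures, constraint matrix and function name below are illustrative assumptions, not values from the book.

```python
import numpy as np

def balance_estimates(x, variances, A, b):
    """Adjust initial estimates x so that A @ x_balanced == b.

    The discrepancy b - A @ x is spread across the series in proportion
    to their (assumed) variances, so less reliable data move more. This
    is the standard GLS balancing rule with a diagonal covariance matrix.
    """
    V = np.diag(variances)
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, b - A @ x)
    return x + correction

# Toy example: three independent measures of the same aggregate should agree.
# x = (income measure, expenditure measure, output measure), hypothetical numbers.
x = np.array([101.0, 98.0, 100.5])
variances = np.array([1.0, 4.0, 2.25])   # assumed reliabilities (variances)
A = np.array([[1.0, -1.0, 0.0],          # income - expenditure = 0
              [0.0, 1.0, -1.0]])         # expenditure - output = 0
b = np.zeros(2)
print(balance_estimates(x, variances, A, b))
```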
The idea that simplicity matters in science is as old as science itself, with the much cited example of Ockham's Razor, 'entia non sunt multiplicanda praeter necessitatem': entities are not to be multiplied beyond necessity. A problem with Ockham's razor is that nearly everybody seems to accept it, but few are able to define its exact meaning and to make it operational in a non-arbitrary way. Using a multidisciplinary perspective including philosophers, mathematicians, econometricians and economists, this 2002 monograph examines simplicity by asking six questions: what is meant by simplicity? How is simplicity measured? Is there an optimum trade-off between simplicity and goodness-of-fit? What is the relation between simplicity and empirical modelling? What is the relation between simplicity and prediction? What is the connection between simplicity and convenience? The book concludes with reflections on simplicity by Nobel Laureates in Economics.
New Directions in Computational Economics brings together for the first time a diverse selection of papers, sharing the underlying theme of application of computing technology as a tool for achieving solutions to realistic problems in computational economics and related areas in the environmental, ecological and energy fields. Part I of the volume addresses experimental and computational issues in auction mechanisms, including a survey of recent results for sealed bid auctions. The second contribution uses neural networks as the basis for estimating bid functions for first price sealed bid auctions. Also presented is the `smart market' computational mechanism which better matches bids and offers for natural gas. Part II consists of papers that formulate and solve models of economic systems. Amman and Kendrick's paper deals with control models and the computational difficulties that result from nonconvexities. Using goal programming, Nagurney, Thore and Pan formulate spatial resource allocation models to analyze various policy issues. Thompson and Thrall next present a rigorous mathematical analysis of the relationship between efficiency and profitability. The problem of matching uncertain streams of assets and liabilities is solved using stochastic optimization techniques in the following paper in this section. Finally, Part III applies economic concepts to issues in computer science in addition to using computational techniques to solve economic models.
First published in 2004, this is a rigorous but user-friendly book on the application of stochastic control theory to economics. A distinctive feature of the book is that mathematical concepts are introduced in a language and terminology familiar to graduate students of economics. The standard topics of many mathematics, economics and finance books are illustrated with real examples documented in the economic literature. Moreover, the book emphasises the dos and don'ts of stochastic calculus, cautioning the reader that certain results and intuitions cherished by many economists do not extend to stochastic models. A special chapter (Chapter 5) is devoted to exploring various methods of finding a closed-form representation of the value function of a stochastic control problem, which is essential for ascertaining the optimal policy functions. The book also includes many practice exercises for the reader. Notes and suggested readings are provided at the end of each chapter for more references and possible extensions.
All humans eventually die, but life expectancies differ over time and among different demographic groups. Teasing out the various causes and correlates of death is a challenge, and it is one we take on in this book. A look at the data on mortality is both interesting and suggestive of some possible relationships. In 1900 life expectancies at birth were 46.3 and 48.3 years for men and women respectively, a gender differential of a bit less than 5 percent. Life expectancies for whites then were about 0.3 years longer than that of the whole population, but life expectancies for blacks were only about 33 years for men and women. At age 65, the remaining life expectancies were about 12 and 11 years for whites and blacks respectively. Fifty years later, life expectancies at birth had grown to 66 and 71 years for males and females respectively. The percentage differential between the sexes was now almost up to 10 percent. The life expectancies of whites were about one year longer than that for the entire population. The big change was for blacks, whose life expectancy had grown to over 60 years with black females living about 5 percent longer than their male counterparts. At age 65 the remaining expected life had increased about two years with much larger percentage gains for blacks.
Cost Structure and the Measurement of Economic Performance is designed to provide a comprehensive guide for students, researchers or consultants who wish to model, construct, interpret, and use economic performance measures. The topical emphasis is on productivity growth and its dependence on the cost structure. The methodological focus is on application of the tools of economic analysis - the `thinking structure' provided by microeconomic theory - to measure technological or cost structure, and link it with market and regulatory structure. This provides a rich basis for evaluation of economic performance and its determinants. The format of the book stresses topics or questions of interest rather than the theoretical tools for analysis. Traditional productivity growth modeling and measurement practices that result in a productivity residual often called the `measure of our ignorance' are initially overviewed, and then the different aspects of technological, market and regulatory structure that might underlie this residual are explored. The ultimate goal is to decompose or explain the residual, by modeling and measuring a multitude of impacts that determine the economic performance of firms, sectors, and economies. The chapters are organized with three broad goals in mind. The first is to introduce the overall ideas involved in economic performance measurement and traditional productivity growth analysis. Issues associated with different types of (short and long run, internal and external) cost economies, market and regulatory impacts, and other general cost efficiencies that might impact these measures are then explored. Finally, some of the theoretical, data construction and econometric tools necessary to justify and implement these models are emphasized.
Models, Methods, Concepts and Applications of the Analytic Hierarchy Process is a volume dedicated to selected applications of the Analytic Hierarchy Process (AHP) focused on three themes: economics, the social sciences, and the linking of measurement with human values. (1) The AHP offers economists a substantially different approach to dealing with economic problems through ratio scales. The main mathematical models on which economics has based its quantitative thinking up to now are utility theory, which uses interval scales, and linear programming. We hope that the variety of examples included here can perhaps stimulate researchers in economics to try applying this new approach. (2) The second theme is concerned with the social sciences. The AHP offers psychologists and political scientists the methodology to quantify and derive measurements for intangibles. We hope that the examples included in this book will encourage them to examine the methods of AHP in terms of the problems they seek to solve. (3) The third theme is concerned with providing people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process one needs to interpret what the measurements mean. A number is useless until someone understands what it means. It can have different meanings in different problems. Ten dollars are plenty to satisfy one's hunger but are useless by themselves in buying a new car. Such measurements are only indicators of the state of a system, but do not relate to the values of the human observers of that system. AHP methods can help resolve the conflicts between hard measurement data and human values.
Supply chain management (SCM) strives for creating competitive advantage and value for customers by integrating business processes from end users through original suppliers. However, the question of how SCM influences the value of a firm is not fully answered. Various conceptual frameworks that explain the coherence of SCM and company value, comprehended as value-based SCM, are well accepted in scientific research, but quantitative approaches to value-based SCM are found rather seldom. The book contributes to this research gap by proposing quantitative models that allow for assessing influences of SCM on the value of a firm. Opposed to existing models that limit the observation to chosen facets of SCM or selected value drivers, this holistic approach is adequate to
* reflect configurational and operational aspects of SCM,
* cover all phases of the product life cycle,
* financially compare value impacts of profitability-related and asset-related value drivers, and
* assess influences of dynamics and uncertainties on company value.
A. Dogramaci and N.R. Adam. Productivity of a firm is influenced both by economic forces which act at the macro level and impose themselves on the individual firm, and by internal factors that result from decisions and processes which take place within the boundaries of the firm. Efforts towards increasing the productivity level of firms need to be based on a sound understanding of how the above processes take place. Our objective in this volume is to present some of the recent research work in this field. The volume consists of three parts. In Part I, two macro issues are addressed (taxation and inflation) and their relation to productivity is analyzed. The second part of the volume focuses on methods for productivity analysis within the firm. Finally, the third part of the book deals with two additional productivity analysis techniques and their applications to public utilities. The objective of the volume is not to present a unified point of view, but rather to cover a sample of different methodologies and perspectives through original, scholarly papers.
Studies in Consumer Demand - Econometric Methods Applied to Market Data contains eight previously unpublished studies of consumer demand. Each study stands on its own as a complete econometric analysis of demand for a well-defined consumer product. The econometric methods range from simple regression techniques applied in the first four chapters, to the use of logit and multinomial logit models used in chapters 5 and 6, to the use of nested logit models in chapters 6 and 7, and finally to the discrete/continuous modeling methods used in chapter 8. Emphasis is on applications rather than econometric theory. In each case, enough detail is provided for the reader to understand the purpose of the analysis, the availability and suitability of data, and the econometric approach to measuring demand.