In the Administration building at Linköping University we have one of Oscar Reutersvärd's "Impossible Figures" in three dimensions. I call it "Perspectives of Science." When viewed from a specific point in space there is order and structure in the 3-dimensional figure. When viewed from other points there is disorder and no structure. If a specific scientific paradigm is used, there is order and structure; otherwise there is disorder and no structure. My perspective in Transportation Science has focused on understanding the mathematical structure and the logic underlying the choice probability models in common use. My book with N. F. Stewart on the Gravity model (Erlander and Stewart 1990) was written in this perspective. The present book stems from the same desire to understand underlying assumptions and structure. It investigates how far a new way of defining Cost-Minimizing Behavior can take us. It turns out that all commonly used choice probability distributions of logit type - log-linear probability functions - follow from cost-minimizing behavior defined in the new way. In addition, some new nested models appear.
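For context, a standard textbook form of the logit-type, log-linear choice probabilities referred to above can be written as (the notation here - costs c_i, scale parameter theta, choice set C - is ours for illustration, not the book's own derivation):

\[
P(i \mid C) = \frac{e^{-\theta c_i}}{\sum_{j \in C} e^{-\theta c_j}}, \qquad \theta > 0,
\]

so that \(\log P(i \mid C)\) is linear in the cost \(c_i\) of alternative \(i\), which is what makes the distribution "log linear."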
From Robin Sickles: As I indicated to you some months ago, Professor William Horrace and I would like Springer to publish a Festschrift in Honor of Peter Schmidt, our professor. Peter's accomplishments are legendary among his students and the profession. I have a bit of that student perspective in my introductory and closing remarks on the website for the conference we had in his honor this last July. I have attached the conference program from which selected papers will come (as well as from students who were unable to attend). You will also find the names of his students (40) on the website. A top-twenty economics department could be started up from those 40 students. Papers from some festschrifts have a thematic link among the papers based on subject material. What I think is unique to this festschrift is that the theme running through the papers will be Peter's remarkable legacy left to his students: to frame a problem and then analyze and examine it in depth using rigorous techniques, but rarely just for the purpose of showcasing technical refinements per se. I think this would be a book that graduate students would find invaluable in their early research careers and seasoned scholars would find invaluable in both their and their students' research.
This work contains up-to-date coverage of the last 20 years' advances in Bayesian inference in econometrics, with an emphasis on dynamic models. It shows how to treat Bayesian inference in nonlinear models by integrating the useful developments of numerical integration techniques based on simulations (such as Markov Chain Monte Carlo methods) with the long-available analytical results of Bayesian inference for linear regression models. It thus covers a broad range of rather recent models for economic time series, such as nonlinear models, autoregressive conditional heteroskedastic regressions, and cointegrated vector autoregressive models. It also contains an extensive chapter on unit root inference from the Bayesian viewpoint. Several examples illustrate the methods. This book is intended for econometrics and statistics postgraduates, professors and researchers in economics departments, business schools, statistics departments, or any research centre in the same fields, especially econometricians.
This book offers a unique and insightful econometric evaluation of the policies used to fight transnational terrorism between 1990 and 2014. It uses the tools of modern economics, game theory and structural econometrics to analyze the roles of foreign aid, educational capital, and military intervention. Jean-Paul Azam and Veronique Thelen analyze panel data over 25 years across 124 countries. They prove that foreign aid plays a key role in inducing recipient governments to protect the donors' political and economic interests within their sphere of influence. Demonstrating that countries endowed with better educational capital export fewer terrorist attacks, they also illustrate that, in contrast, military intervention is counter-productive in abating terrorism. Recognizing the strides taken by the Obama administration to increase the role of foreign aid and reduce the use of military interventions, this book shows the significant impact this has had in reducing the number of transnational terrorist attacks per source country, and suggests further developments in this vein. Practical and timely, this book will be of particular interest to students and scholars of economics and political science, as well as those working on the wider issue of terrorism. Presenting a series of new findings, the book will also appeal to international policy makers and government officials.
This book proposes a new methodology for the selection of one model from among a set of alternative econometric models. Let us recall that a model is an abstract representation of reality which brings out what is relevant to a particular economic issue. An econometric model is also an analytical characterization of the joint probability distribution of some random variables of interest, which yields some information on how the actual economy works. This information will be useful only if it is accurate and precise; that is, the information must be far from ambiguous and close to what we observe in the real world. Thus, model selection should be performed on the basis of statistics which summarize the degree of accuracy and precision of each model. A model is accurate if it predicts correctly; it is precise if it produces tight confidence intervals. A first general approach to model selection includes those procedures based on both characteristics, precision and accuracy. A particularly interesting example of this approach is that of Hildebrand, Laing and Rosenthal (1980). See also Hendry and Richard (1982). A second general approach includes those procedures that use only one of the two dimensions to discriminate among models. In general, most of the tests we are going to examine correspond to this category.
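As a generic illustration of the two dimensions mentioned above (not the book's own selection procedure), the following minimal Python sketch, assuming NumPy and statsmodels are available, summarizes two candidate regressions by a crude accuracy measure (in-sample RMSE) and a precision measure (average confidence-interval width):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)  # data generated using x1 only

# Two candidate models: M1 uses the relevant regressor x1, M2 the irrelevant x2.
m1 = sm.OLS(y, sm.add_constant(x1)).fit()
m2 = sm.OLS(y, sm.add_constant(x2)).fit()

for name, m in [("M1", m1), ("M2", m2)]:
    rmse = float(np.sqrt(np.mean(m.resid ** 2)))             # accuracy: how well it predicts
    ci_width = float(np.diff(m.conf_int(), axis=1).mean())   # precision: tightness of intervals
    print(f"{name}: RMSE = {rmse:.3f}, mean CI width = {ci_width:.3f}")
```

In this toy example the model built on the relevant regressor scores better on both dimensions; the book's contribution is a methodology for cases where such comparisons are less clear-cut.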
This 30th volume of the International Symposia in Economic Theory and Econometrics explores the latest social and financial developments across Asian markets. Chapters cover a range of topics such as the impact of COVID-19-related events in Southeast Asia along with the determinants of capital structure before and during the pandemic; the influence of new distribution concepts at the macro and micro economic levels; as well as the effects of long-term cross-currency basis swaps on government bonds. These peer-reviewed papers touch on a variety of timely, interdisciplinary subjects such as real earnings impact and the effects of public policy. Together, Quantitative Analysis of Social and Financial Market Development is a crucial resource of current, cutting-edge research for any scholar of international finance and economics.
The advent of low-cost computation has made many previously intractable econometric models empirically feasible, and computational methods are now recognized as an integral part of the theory. This book provides graduate students and researchers not only with a sound theoretical introduction to the topic, but also allows the reader, through an internet-based interactive computing method, to move from theory to practice with the different techniques discussed in the book. Among the theoretical issues presented are linear regression analysis, univariate time series modelling with some interesting extensions such as ARCH models, and dimensionality reduction techniques. The electronic version of the book, including all computational possibilities, can be viewed at http://www.xplore-stat.de/ebooks/ebooks.html
A classic treatise that defined the field of applied demand analysis, Consumer Demand in the United States: Prices, Income, and Consumption Behavior is now fully updated and expanded for a new generation. Consumption expenditures by households in the United States account for about 70% of America's GDP. The primary focus in this book is on how households adjust these expenditures in response to changes in price and income. Econometric estimates of price and income elasticities are obtained for an exhaustive array of goods and services using data from surveys conducted by the Bureau of Labor Statistics and aggregate consumption expenditures from the National Income and Product Accounts, providing a better understanding of consumer demand. Practical models for forecasting future price and income elasticities are also demonstrated. Fully revised with over a dozen new chapters and appendices, the book revisits the original Houthakker-Taylor models while examining new material as well, such as the use of quantile regression and the stationarity of consumer preference. It also explores the emerging connection between neuroscience and consumer behavior, integrating the economic literature on demand theory with psychology literature. The most comprehensive treatment of the topic to date, this volume will be an essential resource for any researcher, student or professional economist working on consumer behavior or demand theory, as well as investors and policymakers concerned with the impact of economic fluctuations.
This trusted textbook returns in its 4th edition with even more exercises to help consolidate understanding - and a companion website featuring additional materials, including a solutions manual for instructors. Offering a unique blend of theory and practical application, it provides ideal preparation for doing applied econometric work as it takes students from a basic level up to an advanced understanding in an intuitive, step-by-step fashion. Clear presentation of economic tests and methods of estimation is paired with practical guidance on using several types of software packages. Using real world data throughout, the authors place emphasis upon the interpretation of results, and the conclusions to be drawn from them in econometric work. This book will be essential reading for economics undergraduate and master's students taking a course in applied econometrics. Its practical nature makes it ideal for modules requiring a research project. New to this Edition: - Additional practical exercises throughout to help consolidate understanding - A freshly-updated companion website featuring a new solutions manual for instructors
Places, Towns and Townships is an excellent resource for anyone in need of data for all of the nation's cities, towns, townships, villages, and census-designated places in one convenient source. It compiles essential information about places in the United States and the people who live in them, such as:
* population
* housing
* income
* education
* employment
* crime
* and much more!
In addition to the tables, Places, Towns and Townships includes text that describes key findings, figures that call attention to noteworthy trends in data, and rankings of the largest cities by various demographics. Compiled from multiple government sources, the data in this unique reference volume represents the most current and accurate information available. This data will not be updated for several years, making Places, Towns and Townships an invaluable resource in the years to come.
"Transportation Economics" explores the efficient use of society's
scarce resources for the movement of people and goods. This book
carefully examines transportation markets and standard economic
tools, how these resources are used, and how the allocation of
society resources affects transportation activities. This textbook is unique in that it uses a detailed analysis of
econometric results from current transportation literature to
provide an integrated collection of theory and application. Its
numerous case studies illustrate the economic principles, discuss
testable hypotheses, analyze econometric results, and examine each
study's implications for public policy. These features make this a
well-developed introduction to the foundations of transportation
economics. Additional case studies on a spectrum of domestic and
international transportation topics available at http:
//www.blackwellpublishers.co.uk/mccarthy in order to keep students
abreast of recent developments in the field and their implications
for public policy. The paperback edition of this book is not available from Blackwell in the US or Canda.
Reformation of Econometrics is a sequel to The Formation of Econometrics: A Historical Perspective (1993, OUP), which traces the formation of econometric theory during the period 1930-1960. This book provides an account of the advances in the field of econometrics since the 1970s. Based on original research, it focuses on the reformists' movement and the schools of thought and practices that attempted a paradigm shift in econometrics in the 1970s and 1980s. It describes the formation and consolidation of the Cowles Commission (CC) paradigm and traces and analyses the three major methodological attempts to resolve problems involved in model choice and specification within the CC paradigm. These attempts have reoriented the focus of econometric research from internal questions (how to optimally estimate a priori given structural parameters) to external questions (how to choose, design, and specify models). It also examines various modelling issues and problems through two case studies - modelling the Phillips curve and business cycles. The third part of the book delves into the development of three key aspects of model specification in detail - structural parameters, error terms, and model selection and design procedures. The final chapter uses citation analyses to study the impact of the CC paradigm over the span of three and a half decades (1970-2005). The citation statistics show that the impact has remained extensive and relatively strong in spite of certain weakening signs. It implies that the reformative attempts have fallen short of causing a paradigm shift.
This volume is in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Jean Paelinck, arguably, is the founder of modern spatial econometrics. His impact on the profession through his work in spatial econometrics, regional science, and more conventional economics can be measured in many ways: through the work of his students, his devotion to and activism in facilitating the diffusion of regional science to Poland, the formulation and development of his FLEUR model, his co-founding of the French-speaking Regional Science Association, the voluminous references to his scholarly publications, his many invitations to be a featured speaker at conferences and universities throughout the world, the offices he has held in scholarly and professional associations and at Erasmus University Rotterdam and the Netherlands Economic Institute, and the numerous honorary degrees he has been awarded. A series of special sessions in honour of Jean Paelinck were organized at the most prominent regional science meetings around the world. A number of prominent scholars in the field organized and participated in special sessions labelled 'In Honour of Professor Paelinck'. These sessions reflect the truly global reach of the techniques and methods pioneered by him. As an outgrowth of six conferences, the final versions of the selected papers are collected in this volume. The prominent ideas contained in each of the selected contributions can be traced explicitly to work by Jean Paelinck.
Studies in Consumer Demand - Econometric Methods Applied to Market Data contains eight previously unpublished studies of consumer demand. Each study stands on its own as a complete econometric analysis of demand for a well-defined consumer product. The econometric methods range from the simple regression techniques applied in the first four chapters, to the logit and multinomial logit models used in chapters 5 and 6, to the nested logit models in chapters 6 and 7, and finally to the discrete/continuous modeling methods used in chapter 8. Emphasis is on applications rather than econometric theory. In each case, enough detail is provided for the reader to understand the purpose of the analysis, the availability and suitability of data, and the econometric approach to measuring demand.
This book provides the first ever comprehensive economic evaluation of the long-standing German system of works councils and worker directors on company boards. This system of codetermination, or "Mitbestimmung," is unique in the degree of information provision, consultation, and participation ceded to employees. Addison analyzes the effects of works councils on establishment productivity, profitability, investment in physical and intangible capital, employment, training, wages, and organizational flexibility, as well as the influence of worker directors on some of the same indicators plus, critically, shareholder value. Today, works councils are in decline, while worker directors have scarcely been embraced either from within or without. This book examines these challenges and addresses the likely evolution of codetermination.
This book addresses environmental and climate change-induced migration from the vantage point of migration studies, offering a broad spectrum of approaches for considering the environment/climate/migration nexus. Research on the subject is still frequently narrowed down to climate change vulnerability and the environmental push factor. The book establishes the interconnections between societal and environmental vulnerability, and migration and capability, allowing appreciation of migration in the frame of climate as a case of spatial and social mobility, that is, as a strategy of persons and groups to deal with a grossly unequal distribution of life chances across the world. In their introduction, the editors fan out the current debate and state the need to transcend predominantly policy-oriented approaches to migration. The first section of the volume focuses on "Methodologies and Methods" and presents very distinct approaches for thinking about climate-induced migration. Subsequent chapters explore the sensitivity of existing migration flows to climate change in Ghana and Bangladesh, the complex relationship between migration, demographic change and coping capacities in Canada, the methodological challenges of a household survey on the significance of migration and remittances for adaptation in the Hindu Kush region, and an econometric study of the aftermath of the 1998 floods in Bangladesh. The second part, "Areas of Concern: Politics and Human Rights", deepens the analysis of discourses as well as of the implications of proposed and implemented policies. Contributors discuss such topics as environmental migration as a multi-causal problem, climate migration as a consequence in an alarmist discourse, and climate migration as a solution. A study of an integrated relocation program in Papua New Guinea is followed by chapters on the promise and the flaws of planned relocation policy and on global policy for the protection of environmental migrants, including both internally displaced persons and those who cross international borders. A concluding chapter places human agency at centre stage and explores the interplay between human rights, capability and migration.
A systematic treatment of dynamic decision making and performance measurement Modern business environments are dynamic. Yet, the models used to make decisions and quantify success within them are stuck in the past. In a world where demands, resources, and technology are interconnected and evolving, measures of efficiency need to reflect that environment. In Dynamic Efficiency and Productivity Measurement, Elvira Silva, Spiro E. Stefanou, and Alfons Oude Lansink look at the business process from a dynamic perspective. Their systematic study covers dynamic production environments where current production decisions impact future production possibilities. By considering practical factors like adjustments over time, this book offers an important lens for contemporary microeconomic analysis. Silva, Stefanou, and Lansink develop the analytical foundations of dynamic production technology in both primal and dual representations, with an emphasis on directional distance functions. They cover concepts measuring the production structure (economies of scale, economies of scope, capacity utilization) and performance (allocative, scale and technical inefficiency, productivity) in a methodological and comprehensive way. Through a unified approach, Dynamic Efficiency and Productivity Measurement offers a guide to how firms maximize potential in changing environments and an invaluable contribution to applied microeconomics.
This book explains inflation dynamics, using time series data from 1960 for 42 countries. These countries differ in every aspect: historically, culturally, socially, politically, institutionally, and economically. They are chosen on the basis of data availability only and cover the Middle East and North Africa (MENA) region, Africa, Asia, the Caribbean, Europe, Australasia, and the United States. Inflation reached double digits in the developed countries in the 1970s and 80s, and central banks then successfully stabilized it by anchoring inflation expectations for decades, until now. Conditional on common and country-specific shocks such as oil price shocks, financial, banking, and political crises, wars, pandemics, and natural disasters, the book tests various theoretical models of the long- and short-run relationships between money and prices, money growth and inflation, money growth and real output, expected inflation, the output gap, fiscal policy, and inflation, using a number of parametric and non-parametric methods, and pays attention to specification and estimation problems. In addition, it explains why policymakers in inflation-targeting countries, e.g. the U.S., failed to anticipate the recent sudden rise in inflation. And it examines the fallibility of Modern Monetary Theory's policy prescription to reduce inflation by raising taxes. This is a unique and innovative book, which will find an audience among students, academics, researchers, policy makers, and analysts in corporations, private and central banks, and international monetary institutions.
This book explores the novel uses and potentials of Data Envelopment Analysis (DEA) under big data. These areas are of widespread interest to researchers and practitioners alike. Considering the vast literature on DEA, one could say that DEA has been, and continues to be, a widely used technique in both performance and productivity measurement, having covered a plethora of challenges and debates within the modelling framework.
The most authoritative and up-to-date core econometrics textbook available. Econometrics is the quantitative language of economic theory, analysis, and empirical work, and it has become a cornerstone of graduate economics programs. Econometrics provides graduate and PhD students with an essential introduction to this foundational subject in economics and serves as an invaluable reference for researchers and practitioners. This comprehensive textbook teaches fundamental concepts, emphasizes modern, real-world applications, and gives students an intuitive understanding of econometrics.
* Covers the full breadth of econometric theory and methods with mathematical rigor while emphasizing intuitive explanations that are accessible to students of all backgrounds
* Draws on integrated, research-level datasets, provided on an accompanying website
* Discusses linear econometrics, time series, panel data, nonparametric methods, nonlinear econometric models, and modern machine learning
* Features hundreds of exercises that enable students to learn by doing
* Includes in-depth appendices on matrix algebra and useful inequalities and a wealth of real-world examples
* Can serve as a core textbook for a first-year PhD course in econometrics and as a follow-up to Bruce E. Hansen's Probability and Statistics for Economists
This book offers a series of statistical tests to determine if the "crowd out" problem, known to hinder the effectiveness of Keynesian economic stimulus programs, can be overcome by monetary programs. It concludes there are programs that can do this, specifically "accommodative monetary policy." They were not used to any great extent prior to the Quantitative Easing program in 2008, causing the failure of many fiscal stimulus programs through no fault of their own. The book includes exhaustive statistical tests to prove this point. There is also a policy analysis section of the book. It examines how effectively the Federal Reserve's anti-crowd-out programs have actually worked, to the extent they were undertaken at all. It finds statistical evidence that using commercial and savings banks instead of investment banks when implementing accommodative monetary policy would have markedly improved their effectiveness. This volume, with its companion volume Why Fiscal Stimulus Programs Fail, Volume 2: Statistical Tests Comparing Monetary Policy to Growth, provides 1000 separate statistical tests on the US economy to prove these assertions.
In many branches of science, relevant observations are taken sequentially over time. Bayesian Analysis of Time Series discusses how to use models that explain the probabilistic characteristics of these time series and then utilizes the Bayesian approach to make inferences about their parameters. This is done by taking the prior information and, via Bayes' theorem, implementing Bayesian inferences of estimation, testing hypotheses, and prediction. The methods are demonstrated using both R and WinBUGS. The R package is primarily used to generate observations from a given time series model, while the WinBUGS package allows one to perform a posterior analysis that provides a way to determine the characteristics of the posterior distribution of the unknown parameters. Features:
* Presents a comprehensive introduction to the Bayesian analysis of time series.
* Gives many examples over a wide variety of fields including biology, agriculture, business, economics, sociology, and astronomy.
* Contains numerous exercises at the end of each chapter, many of which use R and WinBUGS.
* Can be used in graduate courses in statistics and biostatistics, but is also appropriate for researchers, practitioners and consulting statisticians.
About the author: Lyle D. Broemeling, Ph.D., is Director of Broemeling and Associates Inc., and is a consulting biostatistician. He has been involved with academic health science centers for about 20 years and has taught and been a consultant at the University of Texas Medical Branch in Galveston, The University of Texas MD Anderson Cancer Center and the University of Texas School of Public Health. His main interest is in developing Bayesian methods for use in medical and biological problems and in authoring textbooks in statistics. His previous books for Chapman & Hall/CRC include Bayesian Biostatistics and Diagnostic Medicine, and Bayesian Methods for Agreement.
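As a rough, generic illustration of the simulate-then-infer workflow described in the blurb above (sketched here in Python rather than the book's R and WinBUGS, and using a toy AR(1) model with a flat prior and known noise scale, where the posterior is available in closed form):

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (the role R plays in the book): simulate observations from a time series model,
# here an AR(1) process y_t = phi * y_{t-1} + e_t with known noise s.d. sigma.
phi_true, sigma, n = 0.6, 1.0, 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal(scale=sigma)

# Step 2 (the role WinBUGS plays in the book): characterize the posterior of the unknown
# parameter.  With a flat prior and the conditional Gaussian likelihood, the posterior of
# phi is normal, so no MCMC is needed for this toy case.
ylag, ycur = y[:-1], y[1:]
post_mean = (ylag @ ycur) / (ylag @ ylag)
post_sd = sigma / np.sqrt(ylag @ ylag)
print(f"posterior mean of phi: {post_mean:.3f}, posterior s.d.: {post_sd:.3f}")
print(f"95% credible interval: ({post_mean - 1.96 * post_sd:.3f}, "
      f"{post_mean + 1.96 * post_sd:.3f})")
```

The book's models (and real applications) generally require simulation-based posterior analysis, which is where WinBUGS comes in; the sketch only shows the overall two-step structure.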
THE GUIDE FOR ANYONE AFRAID TO LEARN STATISTICS & ANALYTICS - UPDATED WITH NEW EXAMPLES & EXERCISES. This book discusses statistics and analytics using plain language and avoiding mathematical jargon. If you thought you couldn't learn these data analysis subjects because they were too technical or too mathematical, this book is for you! This edition delivers more everyday examples and end-of-chapter exercises and contains updated instructions for using Microsoft Excel. You'll use downloadable data sets and spreadsheet solutions - template-based solutions you can put right to work. Using this book, you will understand the important concepts of statistics and analytics, including learning the basic vocabulary of these subjects.
* Create tabular and visual summaries and learn to avoid common charting errors
* Gain experience working with common descriptive statistics measures, including the mean, median, and mode, and the standard deviation and variance, among others
* Understand the probability concepts that underlie inferential statistics
* Learn how to apply hypothesis tests, using Z, t, chi-square, ANOVA, and other techniques
* Develop skills using regression analysis, the most commonly used inferential statistical method
* Explore results produced by predictive analytics software
* Choose the right statistical or analytic techniques for any data analysis task
* Optionally, read the "Equation Blackboards," designed for readers who want to learn about the mathematical foundations of selected methods
The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world's best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists - especially those interested in macroeconomics and the history of economic thought - with the first in-depth analysis of Oxford economics. |