Macroeconomic Policy in the Canadian Economy investigates developments in Canada over the last forty years, using recent advances in applied econometrics. In particular, the book analyzes the theoretical foundations of public sector activities and evaluates several theories of government growth. Issues of convergence are also investigated as they manifest themselves in per capita income across Canadian provinces, along with the question of how successful government income equalization policies have been in furthering such convergence. Moreover, the openness of the Canadian economy is investigated in terms of the importance of exports to GDP growth and of its participation in an internationally integrated capital market. The book also analyzes monetary policy issues and investigates the role of monetary aggregates and the effectiveness of monetary policy. Finally, it addresses whether electoral and partisan cycles exist in Canada, by incorporating both fiscal and monetary principles and applying them to the lively world of Canadian politics.
Empirical Studies In Applied Economics presents nine previously unpublished analyses in monograph form. In this work, the topics are presented so that each chapter stands on its own. The emphasis is on the applications but attention is also given to the econometric and statistical issues for advanced readers. Econometric methods include multivariate regression analysis, limited dependent variable analysis, and other maximum likelihood techniques. The empirical topics include the measurement of competition and market power in natural gas transportation markets and in the pharmaceutical market for chemotherapy drugs. Additional topics include an empirical analysis of NFL football demand, the accuracy of an econometric model for mail demand, and the allocation of police services in rural Alaska. Other chapters consider the valuation of technology patents and the determination of patent scope, duration, and reasonable royalty, and the reaction of financial markets to health scares in the fast-food industry. Finally, two chapters are devoted to the theory and testing of synergistic health effects from the combined exposure to asbestos and cigarette smoking.
Written to complement the second edition of best-selling textbook Introductory Econometrics for Finance, this book provides a comprehensive introduction to the use of the Regression Analysis of Time Series (RATS) software for modelling in finance and beyond. It provides numerous worked examples with carefully annotated code and detailed explanations of the outputs, giving readers the knowledge and confidence to use the software for their own research and to interpret their own results. A wide variety of important modelling approaches are covered, including such topics as time-series analysis and forecasting, volatility modelling, limited dependent variable and panel methods, switching models and simulation methods. The book is supported by an accompanying website containing freely downloadable data and RATS instructions.
Commerce, Complexity, and Evolution is a significant contribution to the paradigm - straddling economics, finance, marketing, and management - which acknowledges that commercial systems are evolutionary, and must therefore be analysed using evolutionary tools. Evolutionary systems display complicated behaviours which are to a significant degree generated endogenously, rather than being solely the product of exogenous shocks, hence the conjunction of complexity with evolution. This volume considers a wide range of systems, from the entire economy at one extreme to the behaviour of single markets at the other. The papers are united by methodologies which at their core are evolutionary, though the techniques cover a wide range, from philosophical discourse to differential equations, genetic algorithms, multi-agent simulations and cellular automata. Issues considered include the dynamics of debt-deflation, stock management in a complex environment, interactions between consumers and their effect upon market behaviour, and nonlinear methods to profit from financial market volatility.
Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects for solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in the areas of numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The aim of this monograph is to promote research in the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
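The blurb above centers on computing least-squares estimators stably. As a point of reference (not taken from the book, and serial rather than parallel), the core step that such algorithms build on and parallelize can be sketched in Python: solving min ||y - Xb|| via the QR factorization X = QR, which avoids forming the ill-conditioned normal equations. The data here are simulated purely for illustration.

```python
import numpy as np

# Numerically stable least squares via thin QR: X = Q R, then solve the
# triangular system R b = Q^T y. A serial sketch of the building block
# that parallel least-squares algorithms decompose across processors.
def ols_qr(X, y):
    Q, R = np.linalg.qr(X)              # thin (reduced) QR of the design matrix
    return np.linalg.solve(R, Q.T @ y)  # back-substitution on the 3x3 system

# Simulated data: intercept plus two regressors, known coefficients.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
b_true = np.array([1.0, 2.0, -0.5])
y = X @ b_true + 0.01 * rng.normal(size=100)
b_hat = ols_qr(X, y)                    # should recover b_true closely
```

With low noise, the QR-based estimator recovers the true coefficients to two decimal places; the same factorization extends block-wise to the large-scale, distributed settings the monograph treats.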
The three decades which have followed the publication of Heinz Neudecker's seminal paper 'Some Theorems on Matrix Differentiation with Special Reference to Kronecker Products' in the Journal of the American Statistical Association (1969) have witnessed the growing influence of matrix analysis in many scientific disciplines. Amongst these are the disciplines to which Neudecker has contributed directly - namely econometrics, economics, psychometrics and multivariate analysis. This book aims to illustrate how powerful the tools of matrix analysis have become as weapons in the statistician's armoury. The majority of its chapters are concerned primarily with theoretical innovations, but all of them have applications in view, and some of them contain extensive illustrations of the applied techniques. This book will provide research workers and graduate students with a cross-section of innovative work in the fields of matrix methods and multivariate statistical analysis. It should be of interest to students and practitioners in a wide range of subjects which rely upon modern methods of statistical analysis. The contributors to the book are themselves practitioners of a wide range of subjects including econometrics, psychometrics, educational statistics, computation methods and electrical engineering, but they find a common ground in the methods which are represented in the book. It is envisaged that the book will serve as an important work of reference and as a source of inspiration for some years to come.
Price and quantity indices are important, much-used measuring instruments, and it is therefore necessary to have a good understanding of their properties. When it was published, this book was the first comprehensive text on index number theory since Irving Fisher's 1922 The Making of Index Numbers. The book covers intertemporal and interspatial comparisons; ratio- and difference-type measures; discrete and continuous time environments; and upper- and lower-level indices. Guided by economic insights, this book develops the instrumental or axiomatic approach. There is no role for behavioural assumptions. In addition to subject matter chapters, two entire chapters are devoted to the rich history of the subject.
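To make the subject concrete (an illustration with made-up two-good data, not an example from the book), the two classic price indices and Fisher's ideal index can be computed directly:

```python
# Hypothetical two-period, two-good data (illustrative only):
# base-period prices p0 and quantities q0, comparison-period p1 and q1.
p0, q0 = [2.0, 5.0], [10.0, 4.0]
p1, q1 = [2.5, 4.5], [9.0, 6.0]

def laspeyres(p0, p1, q0):
    # Price index weighted by base-period quantities.
    return sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0))

def paasche(p0, p1, q1):
    # Price index weighted by comparison-period quantities.
    return sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))

def fisher(p0, p1, q0, q1):
    # Fisher's "ideal" index: geometric mean of Laspeyres and Paasche.
    return (laspeyres(p0, p1, q0) * paasche(p0, p1, q1)) ** 0.5

print(laspeyres(p0, p1, q0))   # 1.075
print(paasche(p0, p1, q1))     # 1.03125
```

Because Laspeyres and Paasche bracket each other under the usual substitution behaviour, the Fisher index lies between them, which is one of the axiomatic properties the book examines.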
Structural Funds: Growth, Employment and the Environment is a book on the role of transfers designed to assist the sustainable development of less developed regions within the European Union. The book places special emphasis on the future path of the Greek economy and discusses likely outcomes, related directly to the impact of these transfers, in: * Growth and macroeconomic convergence * Employment in key sectors of the economy * Energy demand and its environmental aspect The book uses macroeconomic modelling and modern applied econometric techniques to analyze these issues, thus offering a coherent methodological framework for their presentation. To this extent, Structural Funds: Growth, Employment and the Environment can be of use to: * Academic researchers and economists in recipient countries, who can gain a better understanding of how national authorities can best design and implement the strategic allocation and utilization of these funds to maximize the benefits for the domestic economy * Policymakers in the European Union, by offering a sound and rigorously elaborated treatment which can be applied as an estimation and comparison tool for the effects of Structural Funds at both the national and the international level * Economists in Eastern European countries which are at the pre-accession stage and will be eligible for this type of transfer in the near future.
Investment is crucial to the development of a nation's economy and welfare. In contrast to the situation in the United States, investment activity in Europe has been quite modest over the past few years. This volume gathers together a number of papers by prominent researchers in the field of investment. It provides an overview of recent developments in this area and presents new empirical findings on the determinants and implications of the investment process in European countries. Among the topics examined are the roles played by taxation, uncertainty and the financial systems, as well as the relevance of corporate governance to the investment process. Two chapters are dedicated to infrastructure investment and foreign direct investment.
Financial globalization has increased the significance of methods used in the evaluation of country risk, one of the major research topics in economics and finance. Written by experts in the fields of multicriteria methodology, credit risk assessment, operations research, and financial management, this book develops a comprehensive framework for evaluating models based on several classification techniques that emerge from different theoretical directions. This book compares different statistical and data mining techniques, noting the advantages of each method, and introduces new multicriteria methodologies that are important to country risk modeling. Key topics include: (1) A review of country risk definitions and an overview of the most recent tools in country risk management, (2) In-depth analysis of statistical, econometric and non-parametric classification techniques, (3) Several real-world applications of the methodologies described throughout the text, (4) Future research directions for country risk assessment problems. This work is a useful toolkit for economists, financial managers, bank managers, operations researchers, management scientists, and risk analysts. Moreover, the book can also be used as a supplementary text for graduate courses in finance and financial risk management.
An observational study is an empiric investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. Design of Observational Studies is divided into four parts. Chapters 2, 3, and 5 of Part I cover concisely, in about one hundred pages, many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates. Part II includes a chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies, "make your theories elaborate." The second edition of his book, Observational Studies, was published by Springer in 2002.
"Mathematical Optimization and Economic Analysis" is a self-contained introduction to various optimization techniques used in economic modeling and analysis such as geometric, linear, and convex programming and data envelopment analysis. Through a systematic approach, this book demonstrates the usefulness of these mathematical tools in quantitative and qualitative economic analysis. The book presents specific examples to demonstrate each technique's advantages and applicability as well as numerous applications of these techniques to industrial economics, regulatory economics, trade policy, economic sustainability, production planning, and environmental policy. Key Features include: - A detailed presentation of both single-objective and multiobjective optimization; - An in-depth exposition of various applied optimization problems; - Implementation of optimization tools to improve the accuracy of various economic models; - Extensive resources suggested for further reading. This book is intended for graduate and postgraduate students studying quantitative economics, as well as economics researchers and applied mathematicians. Requirements include a basic knowledge of calculus and linear algebra, and a familiarity with economic modeling.
In this testament to the distinguished career of H.S. Houthakker a number of Professor Houthakker's friends, former colleagues and former students offer essays which build upon and extend his many contributions to economics in aggregation, consumption, growth and trade. Among the many distinguished contributors are Paul Samuelson, Werner Hildenbrand, John Muellbauer and Lester Telser. The book also includes four previously unpublished papers and notes by its distinguished dedicatee.
To derive rational and convincing solutions to practical decision making problems in complex and hierarchical human organizations, the decision making problems are formulated as relevant mathematical programming problems, which are solved by developing optimization techniques that exploit characteristics or structural features of the formulated problems. In particular, for resolving conflict in decision making in hierarchical managerial or public organizations, the multi-level formulation of mathematical programming problems has often been employed together with the solution concept of Stackelberg equilibrium. However, we conceive that the pair of the conventional formulation and the solution concept is not always sufficient to cope with the large variety of decision making situations in actual hierarchical organizations. The following issues should be taken into consideration in the expression and formulation of decision making problems. In the formulation of mathematical programming problems, it is tacitly supposed that decisions are made by a single person, while game theory deals with the economic behavior of multiple decision makers with fully rational judgment. Because two-level mathematical programming problems are interpreted as static Stackelberg games, multi-level mathematical programming is relevant to noncooperative game theory; in conventional multi-level mathematical programming models employing the solution concept of Stackelberg equilibrium, it is assumed that there is no communication among decision makers, or that they do not make any binding agreement even if such communication exists. However, for decision making problems in, for example, decentralized large firms with divisional independence, it is quite natural to suppose that there exists communication and some cooperative relationship among the decision makers.
The optimisation of economic systems over time, and in an uncertain environment, is central to the study of economic behaviour. The behaviour of rational decision makers, whether they are market agents, firms, or governments and their agencies, is governed by decisions designed to secure the best outcomes subject to the perceived information and economic responses (including those of other agents). Economic behaviour has therefore to be analysed in terms of the outcomes of a multiperiod stochastic optimisation process containing four main components: the economic responses (the dynamic constraints, represented by an economic model); the objective function (the goals and their priorities); the conditioning information (expected exogenous events and the expected future state of the economy); and risk management (how uncertainties are accommodated). The papers presented in this book all analyse some aspect of economic behaviour related to the objectives, information, or risk components of the decision process. While the construction of economic models obviously also has a vital role to play, that component has received much greater (or almost exclusive) attention elsewhere. These papers examine optimising behaviour in a wide range of economic problems, both theoretical and applied. They reflect a variety of concerns: economic responses under rational expectations; the Lucas critique and optimal fiscal or monetary policies; market management; partly endogenous goals; evaluating government reactions; locational decisions; uncertainty and information structures; and forecasting with endogenous reactions.
This book will interest and assist people who are dealing with the problems of prediction of time series in higher education and research. It will greatly assist people who apply time series theory to practical problems in their work, and it will also serve as a textbook for postgraduate students in statistics, economics, and related subjects.
This book links the questions people ask about why things exist, why the world is the way it is, and whether and how it is possible to change their society or world with the societal myths they develop and teach to answer those questions and organize and bring order to their communal lives. It also is about the need for change in western societies' current organizing concept, classical (Lockean) liberalism. Despite the attempts of numerous insightful political thinkers, the myth of classical liberalism has developed so many cracks that it cannot be put back together again. If not entirely failed, it is at this point unsalvageable in its present form. Never the thought of just one person, the liberal model of individual religious, political, and economic freedom developed over hundreds of years, starting with Martin Luther's dictum that every man should be his own priest. Although classical liberalism means different things to different people, at its most basic level this model sees human beings as individuals who exist prior to government and have rights over government and the social good. That is, the individual right always trumps the moral and social good, and individuals have few obligations to one another unless they actively choose to undertake them. Possibility's Parents argues that Lockean liberalism has reached the end of its logic in ways that make it unable to handle the western world's most pressing problems, and that novelists whose writing includes the form and texture of myth have important insights to offer on the way forward.
Do economics and statistics succeed in explaining human social behaviour? To answer this question, Leland Gerson Neuberg studies some pioneering controlled social experiments. Starting in the late 1960s, economists and statisticians sought to improve social policy formation with random assignment experiments such as those that provided income guarantees in the form of a negative income tax. This book explores anomalies in the conceptual basis of such experiments and in the foundations of statistics and economics more generally. Scientific inquiry always faces certain philosophical problems. Controlled experiments of human social behaviour, however, cannot avoid some methodological difficulties not evident in physical science experiments. Drawing upon several examples, the author argues that methodological anomalies prevent microeconomics and statistics from explaining human social behaviour as coherently as the physical sciences explain nature. He concludes that controlled social experiments are a frequently overrated tool for social policy improvement.
This book provides a game theoretic model of interaction among VoIP telecommunications providers regarding their willingness to enter peering agreements with one another. The author shows that the incentive to peer is generally based on savings from otherwise payable long distance fees. At the same time, termination fees can have a countering and dominant effect, resulting in an environment in which VoIP firms decide against peering. Various scenarios of peering and rules for allocation of the savings are considered. The first part covers the relevant aspects of game theory and network theory, trying to give an overview of the concepts required in the subsequent application. The second part of the book introduces first a model of how the savings from peering can be calculated and then turns to the actual formation of peering relationships between VoIP firms. The conditions under which firms are willing to peer are then described, considering the possible influence of a regulatory body.
Getting Started with a SIMPLIS Approach is particularly appropriate for those users who are not experts in statistics, but have a basic understanding of multivariate analysis that would allow them to use this handbook as a good first foray into LISREL. Part I introduces the topic, presents the study that serves as the background for the explanation of matters, and provides the basis for Parts II and III, which, in turn, explain the process of estimation of the measurement model and the structural model, respectively. In each section, we also suggest essential literature to support the utilization of the handbook. After having read the book, readers will have acquired a basic knowledge of structural equation modeling, namely using the LISREL program, and will be prepared to continue with the learning process.
Since its establishment in the 1950s the American Economic Association's Committee on Economic Education has sought to promote improved instruction in economics and to facilitate this objective by stimulating research on the teaching of economics. These efforts are most apparent in the sessions on economic education that the Committee organizes at the Association's annual meetings. At these sessions economists interested in economic education have opportunities to present new ideas on teaching and research and also to report the findings of their research. The record of this activity can be found in the Proceedings of the American Economic Review. The Committee on Economic Education and its members have been actively involved in a variety of other projects. In the early 1960s it organized the National Task Force on Economic Education that spurred the development of economics teaching at the precollege level. This in turn led to the development of a standardized research instrument, a high school test of economic understanding. This was followed later in the 1960s by the preparation of a similar test of understanding college economics. The development of these two instruments greatly facilitated research on the impact of economics instruction, opened the way for application of increasingly sophisticated statistical methods in measuring the impact of economic education, and initiated a steady stream of research papers on a subject that previously had not been explored.
Articles on econometric methodology with special reference to the quantification of poverty and economic inequality are presented in this book. Poverty and inequality measurement present special problems to the econometrician, and most of these papers analyze how to attack those problems.
Testing for a unit root is now an essential part of time series analysis. Indeed, no time series study in economics, or in other disciplines that use time series observations, can ignore the crucial issue of nonstationarity caused by a unit root. However, the literature on the topic is large and often technical, making it difficult to understand the key practical issues.
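To fix ideas (this is an illustrative sketch, not material from the book), the core of a Dickey-Fuller unit-root test is a regression of the first difference on the lagged level; a strongly negative t-statistic on the lagged level is evidence against a unit root. Practical work should use a full augmented Dickey-Fuller implementation, such as the one in the statsmodels package, which handles lag selection and the test's nonstandard critical values.

```python
import numpy as np

# Toy Dickey-Fuller regression (constant, no augmentation lags):
# regress dy_t on [1, y_{t-1}] and return the t-statistic on y_{t-1}.
def dickey_fuller_t(y):
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)           # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(42)
e = rng.normal(size=500)
random_walk = np.cumsum(e)                      # has a unit root
stationary = np.empty(500)                      # AR(1), coefficient 0.5
stationary[0] = e[0]
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + e[t]

t_rw = dickey_fuller_t(random_walk)             # modest, near DF distribution
t_st = dickey_fuller_t(stationary)              # strongly negative
```

The stationary series produces a t-statistic far below conventional Dickey-Fuller critical values, while the random walk does not, which is exactly the practical distinction the book's surveyed literature formalizes.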
This book offers: a unique blend of asymptotic theory and small-sample practice through simulation experiments and data analysis; novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines; and an exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order.
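The Kalman filter mentioned above is a general state-space recursion. As a minimal sketch of the filter itself, for the scalar local-level model rather than the book's spline-specific setting (the data and parameter values here are illustrative assumptions), the prediction/update cycle looks like this:

```python
import numpy as np

# Kalman filter for the local-level model:
#   y_t  = mu_t + eps_t,      eps_t ~ N(0, sigma_eps2)
#   mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, sigma_eta2)
def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    a, p = a0, p0                       # state mean and variance (diffuse start)
    filtered = []
    for obs in y:
        p = p + sigma_eta2              # predict: random-walk state, mean unchanged
        k = p / (p + sigma_eps2)        # Kalman gain in (0, 1)
        a = a + k * (obs - a)           # update: blend prediction and observation
        p = (1 - k) * p
        filtered.append(a)
    return np.array(filtered)

rng = np.random.default_rng(1)
level = np.cumsum(0.1 * rng.normal(size=200))   # slowly drifting true level
y = level + rng.normal(size=200)                # noisy observations
mu_hat = local_level_filter(y, sigma_eps2=1.0, sigma_eta2=0.01)
```

The filtered path tracks the underlying level with far smaller error than the raw observations; the book's contribution is running recursions of this kind to compute smoothing splines of arbitrary order efficiently.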