Welcome to Loot.co.za!
Originally published in 1979. This book addresses three questions regarding uncertainty in economic life: how do we define uncertainty and use the concept meaningfully to reach conclusions; how can the level of uncertainty associated with a particular variable of economic interest be measured; and does experience provide any support for the view that uncertainty really matters? It develops a theory of the effect of price uncertainty on production and trade, takes a graphical approach to the effects of a mean-preserving spread to derive rules for ordering distributions, and finishes with an econometric analysis of the effects of Brazil's adoption of a crawling peg in reducing real exchange rate uncertainty. This is an important early study of the significance of uncertainty.
Covering a broad range of topics, this text provides a comprehensive survey of the modeling of chaotic dynamics and complexity in the natural and social sciences. Its attention to models in both the physical and social sciences and the detailed philosophical approach make this a unique text in the midst of many current books on chaos and complexity. Including an extensive index and bibliography along with numerous examples and simplified models, this is an ideal course text.
Four years ago "Research in Experimental Economics" published experimental evidence on fundraising and charitable contributions. This new volume, "Charity with Choice," returns to the study of philanthropy, employing a mixture of laboratory and field experiments as well as theoretical research. New waves of experiments take advantage of the well-calibrated environments established by past efforts to add new features, such as endogeneity and self-selection, to experiments. Adventurous new research programs are emerging, and some of them are captured in this volume. Among the major questions addressed with the tools of choice, endogeneity, and self-selection are: What increases or decreases charitable activity? And how do organizational and managerial issues affect the performance of non-profit organizations?
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses, and is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First edition contributors have returned to update their work, and the second edition includes updated versions of selected first edition chapters. New chapters have been added on: approaches with no need for a priori choices of weights (called multipliers) that reflect meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures, including network DEA that can be used for measuring supply chain operations; and a selection of DEA applications in the service sector, with a focus on building a conceptual framework, research design, and interpreting results.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternate methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
Advanced Statistics for Kinesiology and Exercise Science is the first textbook to cover advanced statistical methods in the context of the study of human performance. Divided into three distinct sections, the book introduces and explores in depth both analysis of variance (ANOVA) and regression analyses, including chapters on: preparing data for analysis; one-way, factorial, and repeated-measures ANOVA; analysis of covariance and multiple analyses of variance and covariance; diagnostic tests; regression models for quantitative and qualitative data; model selection and validation; and logistic regression. Drawing clear lines between the use of IBM SPSS Statistics software and interpreting and analyzing results, and illustrated with sport and exercise science-specific sample data and results sections throughout, the book offers an unparalleled level of detail in explaining advanced statistical techniques to kinesiology students. Advanced Statistics for Kinesiology and Exercise Science is an essential text for any student studying advanced statistics or research methods as part of an undergraduate or postgraduate degree programme in kinesiology, sport and exercise science, or health science.
This Volume of "Advances in Econometrics" contains a selection of papers presented initially at the 7th Annual Advances in Econometrics Conference held on the LSU campus in Baton Rouge, Louisiana during November 14-16, 2008. The theme of the conference was 'Nonparametric Econometric Methods', and the papers selected for inclusion in this Volume span a range of nonparametric techniques including kernel smoothing, empirical copulas, series estimators, and smoothing splines along with a variety of semiparametric methods. The papers in this Volume cover topics of interest to those who wish to familiarize themselves with current nonparametric methodology. Many papers also identify areas deserving of future attention. There are survey papers devoted to recent developments in nonparametric finance, constrained nonparametric regression, semiparametric/nonparametric environmental econometrics and nonparametric models with non-stationary data. There are theoretical papers dealing with novel approaches for partial identification of the distribution of treatment effects, fixed effects semiparametric panel data models, functional coefficient models with time series data, exponential series estimators of empirical copulas, estimation of multivariate CDFs and bias-reduction methods for density estimation. There are also a number of applications that analyze returns to education, the evolution of income and life expectancy, the role of governance in growth, farm production, city size and unemployment rates, derivative pricing, and environmental pollution and economic growth. In short, this Volume contains a range of theoretical developments, surveys, and applications that would be of interest to those who wish to keep abreast of some of the most important current developments in the field of nonparametric estimation.
The entropy concept was developed and used by Shannon in 1948 as a measure of uncertainty in the context of information theory. In 1957 Jaynes made use of Shannon's entropy concept as a basis for estimation and inference in problems that are ill-suited for traditional statistical procedures. This volume consists of two sections. The first section contains papers developing econometric methods based on the entropy principle. An interesting array of applications is presented in the second section of the volume.
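To make the entropy concept referred to above concrete, here is a minimal sketch of Shannon entropy for a discrete probability distribution (the function name and examples are illustrative, not taken from the volume):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), measured in bits.

    Outcomes with zero probability contribute nothing to the sum,
    so they are skipped to avoid log(0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: one bit.
print(shannon_entropy([0.5, 0.5]))   # -> 1.0
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # -> 0.0
```

Entropy-based estimation, in the spirit the blurb describes, chooses the distribution that maximizes this uncertainty measure subject to whatever moment constraints the data impose.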
This is a very useful and timely book, as demand forecasting has become a crucial tool that provides important information for destinations, on which policies are created and implemented. This is especially important given the complexities arising in the aftermath of the Covid-19 pandemic. * It looks at novel and recent developments in this field, including judgement and scenario forecasting. * Offers a comprehensive approach to tourism econometrics, looking at a variety of aspects. * The authors are experts in this field and of the highest academic calibre.
The conference 'Measurement Error: Econometrics and Practice' was recently hosted by Aston University and organised jointly by researchers from Aston University and Lund University to highlight the enormous problems caused by measurement error in economic and financial data, which often go largely unnoticed. Thanks to sponsorship from Eurostat, a number of distinguished researchers were invited to present keynote lectures. Professor Arnold Zellner from the University of Chicago shared his knowledge on measurement error in general; Professor William Barnett from the University of Kansas gave a lecture on the implications of measurement error for monetary policy, whilst Dennis Fixler shared his knowledge on how statistical agencies deal with measurement errors. This volume is the result of a selection of high-quality papers presented at the conference and is designed to draw attention to the enormous problem in econometrics of measurement error in data provided by the world's leading statistical agencies, highlighting the consequences of data error and offering solutions to deal with such problems. This volume should appeal to economists, financial analysts and practitioners interested in studying and solving economic problems and building econometric models in everyday operations.
Global econometric models have a long history. From the early 1970s to the present, as modeling techniques have advanced, different modeling paradigms have emerged and been used to support national and international policy making. One purpose of this volume - based on a conference in recognition of the seminal impact of Nobel Prize winner in Economic Sciences Lawrence R Klein, whose pioneering work has spawned the field of international econometric modeling - is to survey these developments from today's perspective. A second objective of the volume is to shed light on the wide range of attempts to broaden the scope of modeling on an international scale. Beyond new developments in traditional areas of trade and financial flows, the volume reviews new approaches to the modeling of linkages between macroeconomic activity and individual economic units, new research on the analysis of trends in income distribution and economic wellbeing on a global scale, and innovative ideas about modeling the interactions between economic development and the environment. With the expansion of elaborated economic linkages, this volume makes an important contribution to the evolving literature of global econometric models.
Modern marketing managers need intuitive and effective tools not just for designing strategies but also for general management. This hands-on book introduces a range of contemporary management and marketing tools and concepts with a focus on forecasting, creating stimulating processes, and implementation. Topics addressed range from creating a clear vision, setting goals, and developing strategies, to implementing strategic analysis tools, consumer value models, budgeting, strategic and operational marketing plans. Special attention is paid to change management and digital transformation in the marketing landscape. Given its approach and content, the book offers a valuable asset for all professionals and advanced MBA students looking for 'real-life' tools and applications.
What happens to risk as the economic horizon goes to zero and risk is seen as an exposure to a change in state that may occur instantaneously at any time? All activities that have been undertaken statically at a fixed finite horizon can now be reconsidered dynamically at a zero time horizon, with arrival rates at the core of the modeling. This book, aimed at practitioners and researchers in financial risk, delivers the theoretical framework and various applications of the newly established dynamic conic finance theory. The result is a nonlinear non-Gaussian valuation framework for risk management in finance. Risk-free assets disappear and low risk portfolios must pay for their risk reduction with negative expected returns. Hedges may be constructed to enhance value by exploiting risk interactions. Dynamic trading mechanisms are synthesized by machine learning algorithms. Optimal exposures are designed for option positioning simultaneously across all strikes and maturities.
The main purpose of this book is to resolve deficiencies and limitations that currently exist when using Technical Analysis (TA). In particular, TA is being used either by academics as an "economic test" of the weak-form Efficient Market Hypothesis (EMH) or by practitioners as a main or supplementary tool for deriving trading signals. This book approaches TA in a systematic way, utilizing all the available estimation theory and tests. This is achieved through the development of novel rule-based pattern recognizers and the implementation of statistical tests for assessing the significance of realized returns. More emphasis is given to technical patterns where subjectivity in their identification process is apparent. The proposed methodology is based on algorithmic, and thus unbiased, pattern recognition. The unified methodological framework presented in this book can serve as a benchmark both for future academic studies that test the null hypothesis of the weak-form EMH and for practitioners who want to embed TA within their trading/investment decision-making processes.
How could Finance benefit from AI? How can AI techniques provide an edge? Moving well beyond simply speeding up computation, this book tackles AI for Finance from a range of perspectives, spanning business, technology, and research, and is suitable for practitioners and students alike. Covering aspects like algorithms, big data, and machine learning, it answers these and many other questions.
This volume investigates the accuracy and dynamic performance of a high-frequency forecast model for the Japanese and United States economies based on the Current Quarter Model (CQM) or High Frequency Model (HFM) developed by the late Professor Emeritus Lawrence R. Klein. It also presents a survey of recent developments in high-frequency forecasts and gives an example application of the CQM model in forecasting Gross Regional Products (GRPs).
This book presents the last quarterly econometric model of the United States economy that Professor Lawrence R Klein and his group produced at the University of Pennsylvania. It is the last econometric model that Lawrence Klein and his disciples left after some 50 years of cumulative effort constructing models of the US economy, up to around 2000. Widely known as the WEFA Econometric Model Mark 10, it is the culmination of Professor Klein's research, which spans more than 70 years, and will please not only Professor Klein's old students and colleagues, but also younger students who have heard so much of Klein models but have yet to see the latest model in its complete and printed form.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
Originally published in 1976, with a second edition in 1984. This book established itself as the first genuinely introductory text on econometric methods, assuming no formal background on the part of the reader. The second edition maintains this distinctive feature. Fundamental concepts are carefully explained and, where possible, techniques are developed by verbal reasoning rather than formal proof. It provides all the material for a basic course, and is also ideal for a student working alone. Very little knowledge of maths and statistics is assumed, and the logic of statistical method is carefully stated. There are numerous exercises, designed to help the student assess individual progress. Methods are described with computer solutions in mind, and the author shows how a variety of different calculations can be performed with relatively simple programs. This new edition also includes much new material: statistical tables are now included and their use carefully explained.
Originally published in 1984. Since the logic underlying economic theory can only be grasped fully by a thorough understanding of the mathematics, this book will be invaluable to economists wishing to understand vast areas of important research. It provides a basic introduction to the fundamental mathematical ideas of topology and calculus, and uses these to present modern singularity theory and recent results on the generic existence of isolated price equilibria in exchange economies.
Originally published in 1978. This book is designed to enable students on main courses in economics to comprehend literature which employs econometric techniques as a method of analysis, to use econometric techniques themselves to test hypotheses about economic relationships and to understand some of the difficulties involved in interpreting results. While the book is mainly aimed at second-year undergraduates undertaking courses in applied economics, its scope is sufficiently wide to take in students at postgraduate level who have no background in econometrics - it integrates fully the mathematical and statistical techniques used in econometrics with micro- and macroeconomic case studies.
Originally published in 1970, with a second edition in 1989. Empirical Bayes methods use some of the apparatus of the pure Bayes approach, but an actual prior distribution is assumed to generate the data sequence; it can be estimated, thus producing empirical Bayes estimates or decision rules. In this second edition, details are provided of the derivation and the performance of empirical Bayes rules for a variety of special models. Attention is given to the problem of assessing the goodness of an empirical Bayes estimator for a given set of prior data. Chapters also focus on alternatives to the empirical Bayes approach and actual applications of empirical Bayes methods.
Originally published in 1985. Mathematical methods and models to facilitate the understanding of the processes of economic dynamics and prediction were refined considerably over the period before this book was written. The field had grown, and many of the techniques involved became extremely complicated. Areas of particular interest include optimal control, non-linear models, game-theoretic approaches, demand analysis and time-series forecasting. This book presents a critical appraisal of developments and identifies potentially productive new directions for research. It synthesises work from mathematics, statistics and economics and includes a thorough analysis of the relationship between system understanding and predictability.
Originally published in 1960 and 1966. This is an elementary introduction to the sources of economic statistics and their uses in answering economic questions. No mathematical knowledge is assumed, and no mathematical symbols are used. The book shows - by asking and answering a number of typical questions of applied economics - what the most useful statistics are, where they are found, and how they are to be interpreted and presented. The reader is introduced to the major British, European and American official sources, to the social accounts, to index numbers and averaging, and to elementary aids to inspection such as moving averages and scatter diagrams.
Originally published in 1929. This balanced combination of fieldwork, statistical measurement, and realistic applications shows a synthesis of economics and political science in a conception of an organic relationship between the two sciences that involves functional analysis, institutional interpretation, and a more workmanlike approach to questions of organization such as division of labour and the control of industry. The treatise applies the test of fact through statistical analysis to economic and political theories for the quantitative and institutional approach in solving social and industrial problems. It constructs a framework of concepts, combining both economic and political theory, to systematically produce an original statement in general terms of the principles and methods for statistical fieldwork. The separation into Parts allows selective reading for the methods of statistical measurement; the principles and fallacies of applying these measures to economic and political fields; and the resultant construction of a statistical economics and politics. Basic statistical concepts are described for application, with each method of statistical measurement illustrated with instances relevant to the economic and political theory discussed, and a statistical glossary is included.