Now in its third edition, Essential Econometric Techniques: A Guide to Concepts and Applications is a concise, student-friendly textbook which provides an introductory grounding in econometrics, with an emphasis on the proper application and interpretation of results. Drawing on the author's extensive teaching experience, this book offers intuitive explanations of concepts such as heteroskedasticity and serial correlation, and provides step-by-step overviews of each key topic. This new edition contains more applications, brings in new material including a dedicated chapter on panel data techniques, and moves the theoretical proofs to appendices. After Chapter 7, students will be able to design and conduct rudimentary econometric research. The next chapters cover multicollinearity, heteroskedasticity, and autocorrelation, followed by techniques for time-series analysis and panel data. Excel data sets for the end-of-chapter problems are available as a digital supplement. A solutions manual is also available for instructors, as well as PowerPoint slides for each chapter. Essential Econometric Techniques shows students how economic hypotheses can be questioned and tested using real-world data, and is the ideal supplementary text for all introductory econometrics courses.
This book describes the activation functions frequently used in deep neural networks. For this purpose, 37 activation functions are explained both mathematically and visually, and their LaTeX implementations are provided, given their common use in scientific articles.
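As a sketch of the kind of material this blurb describes (the book's specific 37 functions are not reproduced here; the two functions below are simply standard, well-known examples with their usual LaTeX renderings):

```python
import math

def relu(x: float) -> float:
    """Rectified linear unit: returns max(0, x)."""
    return max(0.0, x)

def sigmoid(x: float) -> float:
    """Logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

# LaTeX snippets as they might appear in a scientific article:
RELU_LATEX = r"\operatorname{ReLU}(x) = \max(0, x)"
SIGMOID_LATEX = r"\sigma(x) = \frac{1}{1 + e^{-x}}"
```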
Based on economic knowledge and logical reasoning, this book proposes a solution to economic recessions and offers a route for societal change to end capitalism. The author starts with a brief review of the history of economics, and then questions and rejects the trend of recent decades that has seen econometrics replace economic theory. By reviewing the different schools of economic thought and by examining the limitations of existing theories of business cycles and economic growth, the author forms a new theory to explain cyclic economic growth. According to this theory, economic recessions result from innovation scarcity, which in turn results from the flawed design of the patent system. The author suggests a new design for the patent system and envisions that the new design would bring about large economic and societal changes. Under this new patent system, the synergy of the patent and capital markets would ensure that economic recessions could be avoided and that the economy would grow at its highest possible speed.
This is the perfect (and essential) supplement for all econometrics classes, from a rigorous first undergraduate course, to a first master's, to a PhD course.
Using data from the World Values Survey, this book sheds light on the link between happiness and the social group to which one belongs. The work is based on a rigorous statistical analysis of differences in the probability of happiness and life satisfaction between the predominant social group and subordinate groups. The cases of India and South Africa receive deep attention in dedicated chapters on caste and race, with other chapters considering issues such as cultural bias, religion, patriarchy, and gender. An additional chapter offers a global perspective. On top of this, the longitudinal nature of the data facilitates an examination of how world happiness has evolved between 1994 and 2014. This book will be a valuable reference for advanced students, scholars and policymakers involved in development economics, well-being, development geography, and sociology.
The development of economics changed dramatically during the twentieth century with the emergence of econometrics, macroeconomics and a more scientific approach in general. One of the key individuals in the transformation of economics was Ragnar Frisch, professor at the University of Oslo and the first Nobel Laureate in economics in 1969. He was a co-founder of the Econometric Society in 1930 (after having coined the word econometrics in 1926) and edited the journal Econometrica for twenty-two years. The discovery of the manuscripts of a series of eight lectures given by Frisch at the Henri Poincaré Institute in March-April 1933 on The Problems and Methods of Econometrics will enable economists to more fully understand his overall vision of econometrics. This book is a rare exhibition of Frisch's overview of econometrics and is published here in English for the first time. Edited and with an introduction by Olav Bjerkholt and Ariane Dupont-Kieffer, Frisch's eight lectures provide an accessible and astute discussion of econometric issues from philosophical foundations to practical procedures. Covering the development of economics in the twentieth century and Ragnar Frisch's broader visions of economic science in general and econometrics in particular, this book will appeal to anyone with an interest in the history of economics and econometrics.
The Analytic Network Process (ANP), developed by Thomas Saaty in his work on multicriteria decision making, applies network structures with dependence and feedback to complex decision making. This new edition of Decision Making with the Analytic Network Process is a selection of the latest applications of ANP to economic, social and political decisions, and also to technological design. The ANP is a methodological tool that is helpful to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify the judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The book focuses on the application of the ANP in three different areas: economics, the social sciences and the linking of measurement with human values. Economists can use the ANP as an alternative approach to economic problems, complementing the usual mathematical models on which economics bases its quantitative thinking. For psychologists, sociologists and political scientists, the ANP offers the methodology they have sought for some time to quantify and derive measurements for intangibles. Finally, the book applies the ANP to provide people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process, one is able to interpret the true meaning of measurements made on a uniform scale using a unit.
The behaviour of commodity prices never ceases to fascinate economists, financial analysts, industry experts, and policymakers. Unexpected swings in commodity prices used to occur infrequently but have now become a permanent feature of global commodity markets. This book is about modelling commodity price shocks. It is intended to provide insights into the theoretical, conceptual, and empirical modelling of the underlying causes of global commodity price shocks. Three main objectives motivated the writing of this book. First, to provide a variety of modelling frameworks for documenting the frequency and intensity of commodity price shocks. Second, to evaluate existing approaches used for forecasting large movements in future commodity prices. Third, to cover a wide range and aspects of global commodities including currencies, rare-hard-lustrous transition metals, agricultural commodities, energy, and health pandemics. Some attempts have already been made towards modelling commodity price shocks. However, most tend to narrowly focus on a subset of commodity markets, i.e., agricultural commodities market and/or the energy market. In this book, the author moves the needle forward by operationalizing different models, which allow researchers to identify the underlying causes and effects of commodity price shocks. Readers also learn about different commodity price forecasting models. The author presents the topics to readers assuming little prior or specialist knowledge. Thus, the book is accessible to industry analysts, researchers, undergraduate and graduate students in economics and financial economics, academic and professional economists, investors, and financial professionals working in different sectors of the commodity markets. Another advantage of the book's approach is that readers are not only exposed to several innovative modelling techniques to add to their modelling toolbox but are also exposed to diverse empirical applications of the techniques presented.
This book is devoted to biased sampling problems (also called choice-based sampling in econometrics parlance) and over-identified parameter estimation problems. Biased sampling problems appear in many areas of research, including medicine, epidemiology and public health, the social sciences and economics. The book addresses a range of important topics, including case and control studies, causal inference, missing data problems, meta-analysis, renewal process and length biased sampling problems, capture and recapture problems, case cohort studies, and exponential tilting genetic mixture models. The goal of this book is to make it easier for Ph.D. students and new researchers to get started in this research area. It will be of interest to all those who work in the health, biological, social and physical sciences, as well as those who are interested in survey methodology and other areas of statistical science, among others.
Advances in Austrian Economics is a research annual whose editorial policy is to publish original research articles on Austrian economics. Each volume attempts to apply the insights of Austrian economics and related approaches to topics that are of current interest in economics and cognate disciplines. Volume 21 exemplifies this focus by highlighting key research from the Austrian tradition of economics with other research traditions in economics and related areas.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulties knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong directions. The book searches for the reasons behind the emergence of false feedback. It thereby contributes to a wider discussion in the field of metascience about the practices of researchers when pursuing their daily business. The book thus offers a case study of metascience for the field of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates by many applied examples and a wide array of references, especially to philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research and also the possible solutions. The main audience of the book is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.
This book reflects the state of the art on nonlinear economic dynamics, financial market modelling and quantitative finance. It contains eighteen papers with topics ranging from disequilibrium macroeconomics, monetary dynamics, monopoly, financial market and limit order market models with boundedly rational heterogeneous agents to estimation, time series modelling and empirical analysis, and from risk management of interest-rate products, futures price volatility and American option pricing with stochastic volatility to the evaluation of risk and derivatives in electricity markets. The book illustrates some of the most recent research tools in these areas and will be of interest to economists working in economic dynamics and financial market modelling, to mathematicians who are interested in applying complexity theory to economics and finance, and to market practitioners and researchers in quantitative finance interested in limit order, futures and electricity market modelling, derivative pricing and risk management.
The book provides an integrated approach to risk sharing, risk spreading and efficient regulation through principal agent models. It emphasizes the role of information asymmetry and risk sharing in contracts as an alternative to transaction cost considerations. It examines how contracting, as an institutional mechanism to conduct transactions, spreads risks while attempting consolidation. It further highlights the shifting emphasis in contracts from Coasian transaction cost saving to risk sharing and shows how it creates difficulties associated with risk spreading, and emphasizes the need for efficient regulation of contracts at various levels. Each of the chapters is structured using a principal agent model, and all chapters incorporate adverse selection (and exogenous randomness) as a result of information asymmetry, as well as moral hazard (and endogenous randomness) due to the self-interest-seeking behavior on the part of the participants.
This book primarily addresses the optimality aspects of covariate designs. A covariate model is a combination of ANOVA and regression models. Optimal estimation of the parameters of the model using a suitable choice of designs is of great importance, as such choices allow experimenters to extract maximum information for the unknown model parameters. The main emphasis of this monograph is to start with an assumed covariate model in combination with some standard ANOVA set-ups such as CRD, RBD, BIBD, GDD, BTIBD, BPEBD, cross-over, multi-factor, split-plot and strip-plot designs, treatment control designs, etc. and discuss the nature and availability of optimal covariate designs. In some situations, optimal estimation of both the ANOVA and regression parameters is provided. Global optimality and D-optimality criteria are mainly used in selecting the design. The standard optimality results of both discrete and continuous set-ups have been adapted, and several novel combinatorial techniques have been applied for the construction of optimum designs using Hadamard matrices, the Kronecker product, the Rao-Khatri product, and mixed orthogonal arrays, to name a few.
This book addresses both theoretical developments in and practical applications of econometric techniques to finance-related problems. It includes selected edited outcomes of the International Econometric Conference of Vietnam (ECONVN2018), held at Banking University, Ho Chi Minh City, Vietnam on January 15-16, 2018. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. An extremely important part of economics is finance: a financial crisis can bring the whole economy to a standstill and, vice versa, a smart financial policy can dramatically boost economic development. It is therefore crucial to be able to apply mathematical techniques of econometrics to financial problems. Such applications are a growing field, with many interesting results - and an even larger number of challenges and open problems.
This monograph provides a unified and comprehensive treatment of an order-theoretic fixed point theory in partially ordered sets and its various useful interactions with topological structures. The material progresses systematically, by presenting the preliminaries before moving to more advanced topics. In the treatment of the applications, a wide range of mathematical theories and methods from nonlinear analysis and integration theory are applied; an outline of these is given in an appendix chapter to make the book self-contained. Graduate students and researchers in nonlinear analysis, pure and applied mathematics, game theory and mathematical economics will find this book useful.
This volume comprises the classic articles on methods of identification and estimation of simultaneous equations econometric models. It includes path-breaking contributions by Trygve Haavelmo and Tjalling Koopmans, who founded the subject and received Nobel prizes for their work. It presents original articles that developed and analysed the leading methods for estimating the parameters of simultaneous equations systems: instrumental variables, indirect least squares, generalized least squares, two-stage and three-stage least squares, and maximum likelihood. Many of the articles are not readily accessible to readers in any other form.
Advances in Econometrics is a research annual whose editorial policy is to publish original research articles that contain enough details so that economists and econometricians who are not experts in the topics will find them accessible and useful in their research. Volume 37 exemplifies this focus by highlighting key research from new developments in econometrics.
This book presents selected peer-reviewed contributions from the International Work-Conference on Time Series, ITISE 2017, held in Granada, Spain, September 18-20, 2017. It discusses topics in time series analysis and forecasting, including advanced mathematical methodology, computational intelligence methods for time series, dimensionality reduction and similarity measures, econometric models, energy time series forecasting, forecasting in real problems, online learning in time series as well as high-dimensional and complex/big data time series. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
* Includes many mathematical examples and problems for students to work directly with both standard and nonstandard models of behaviour, developing problem-solving and critical-thinking skills which are more valuable to students than memorizing content which will quickly be forgotten.
* The applications explored in the text emphasise issues of inequality, social mobility, culture and poverty to demonstrate the impact of behavioral economics in areas which students are most passionate about.
* The text has a standardized structure (6 parts, 3 chapters in each) which provides a clear and consistent roadmap for students taking the course.
The second edition of this widely acclaimed text presents a thoroughly up-to-date intuitive account of recent developments in econometrics. It continues to present the frontiers of research in an accessible form for non-specialist econometricians, advanced undergraduates and graduate students wishing to carry out applied econometric research. This new edition contains substantially revised chapters on cointegration and vector autoregressive (VAR) modelling, reflecting the developments that have been made in these important areas since the first edition. Special attention is given to the Dickey-Pantula approach and the testing for the order of integration of a variable in the presence of a structural break. For VAR models, impulse response analysis is explained and illustrated. There is also a detailed but intuitive explanation of the Johansen method, an increasingly popular technique. The text contains specially constructed and original tables of critical values for a wide range of tests for stationarity and cointegration. These tables are for Dickey-Fuller tests, Dickey-Hasza-Fuller and HEGY seasonal integration tests and the Perron 'additive outlier' integration test.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. International Financial Markets: Volume I provides a key repository on the current state of knowledge, the latest debates and recent literature on international financial markets. Against the background of the "financialization of commodities" since the 2008 sub-primes crisis, section one contains recent contributions on commodity and financial markets, pushing the frontiers of applied econometrics techniques. The second section is devoted to exchange rate and current account dynamics in an environment characterized by large global imbalances. Part three examines the latest research in the field of meta-analysis in economics and finance. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
Volume 36 of Advances in Econometrics recognizes Aman Ullah's significant contributions in many areas of econometrics and celebrates his long productive career. The volume features original papers on the theory and practice of econometrics that are related to the work of Aman Ullah. Topics include nonparametric/semiparametric econometrics; finite sample econometrics; shrinkage methods; information/entropy econometrics; model specification testing; robust inference; and panel/spatial models. Advances in Econometrics is a research annual whose editorial policy is to publish original research articles that contain enough details so that economists and econometricians who are not experts in the topics will find them accessible and useful in their research.
Following the recent publication of the award-winning and much acclaimed "The New Palgrave Dictionary of Economics," second edition, which brings together Nobel Prize winners and the brightest young scholars to survey the discipline, we are pleased to announce "The New Palgrave Economics Collection." Due to demand from the economics community, these books address key subject areas within the field. Each title is comprised of specially selected articles from the Dictionary and covers a fundamental theme within the discipline. All of the articles have been specifically chosen by the editors of the Dictionary, Steven N. Durlauf and Lawrence E. Blume, and are written by leading practitioners in the field. The Collections provide the reader with easy to access information on complex and important subject areas, and allow individual scholars and students to have their own personal reference copy.