The Socialist Industrial State (1976) examines the state-socialist system, taking as the central example the Soviet Union - where the goals and values of Marxism-Leninism and the particular institutions, the form of economy and polity, were first adopted and developed. It then considers the historical developments, differences in culture, the level of economic development and the political processes of different state-socialist countries around the globe.
Essentials of Time Series for Financial Applications serves as an agile reference for upper-level students and practitioners who desire a formal, easy-to-follow introduction to the most important time series methods applied in financial applications (pricing, asset management, quant strategies, and risk management). Real-life data and examples developed with EViews illustrate the links between the formal apparatus and the applications. The examples either directly exploit the tools that EViews makes available or use programs that employ EViews to implement specific topics or techniques. The book balances a formal framework, with as few proofs as possible, against many examples that support its central ideas. Boxes are used throughout to remind readers of technical aspects and definitions and to present examples in a compact fashion, with full details (workout files) available in an online appendix. The more advanced chapters provide discussion sections that refer to more advanced textbooks or detailed proofs.
This book explains in simple settings the fundamental ideas of financial market modelling and derivative pricing, using the no-arbitrage principle. Relatively elementary mathematics leads to powerful notions and techniques - such as viability, completeness, self-financing and replicating strategies, arbitrage and equivalent martingale measures - which are directly applicable in practice. The general methods are applied in detail to pricing and hedging European and American options within the Cox-Ross-Rubinstein (CRR) binomial tree model. A simple approach to discrete interest rate models is included, which, though elementary, has some novel features. All proofs are written in a user-friendly manner, with each step carefully explained and following a natural flow of thought. In this way the student learns how to tackle new problems.
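The Cox-Ross-Rubinstein pricing the blurb describes can be sketched in a few lines. This is a minimal illustration, not taken from the book; the function name and parameterization are my own, following the standard CRR choices (up factor e^{sigma*sqrt(dt)}, risk-neutral probability from no-arbitrage).

```python
import math

def crr_european_call(S0, K, r, sigma, T, n):
    """Price a European call on the n-step Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))    # up factor
    d = 1.0 / u                            # down factor (CRR convention)
    q = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * T)                # discount over the whole horizon
    # Risk-neutral expectation of the terminal payoff over all n-step paths
    price = 0.0
    for k in range(n + 1):
        ST = S0 * u**k * d**(n - k)
        payoff = max(ST - K, 0.0)
        price += math.comb(n, k) * q**k * (1.0 - q)**(n - k) * payoff
    return disc * price

price = crr_european_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200)
```

As n grows, the tree price converges to the Black-Scholes value (about 10.45 for these parameters), which is one way the discrete model connects to continuous-time pricing.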
This book analyzes the evolution of monetary policy in Rwanda since it was first implemented by the National Bank of Rwanda when the bank was established in 1964. It contributes to the understanding of monetary policy formulation and implementation at different stages of development of a financial system, which comprises the financial market (money market and capital market), financial intermediaries such as commercial banks, and financial sector infrastructures such as payment systems and the credit reference bureau. Through a number of case studies, the book presents applied empirical research on the assessment of the key assumptions of a monetary targeting framework, namely the stability of the money multiplier and of money demand, using time series econometrics. Presenting a detailed empirical analysis of the monetary transmission mechanism, one of the most analyzed topics in central banks in advanced economies, this book is a valuable read for central bankers and other researchers of monetary policy, particularly in developing economies.
This is the first textbook designed to teach statistics to students in aviation courses. All examples and exercises are grounded in an aviation context, including flight instruction, air traffic control, airport management, and human factors. Structured in six parts, the text covers the key foundational topics relating to descriptive and inferential statistics, including hypothesis testing, confidence intervals, z and t tests, correlation, regression, ANOVA, and chi-square. In addition, this book promotes both procedural knowledge and conceptual understanding. Detailed, guided examples are presented from the perspective of conducting a research study. Each analysis technique is clearly explained, enabling readers to understand, carry out, and report results correctly. Students are further supported by a range of pedagogical features in each chapter, including objectives, a summary, and a vocabulary check. Digital supplements comprise downloadable data sets and short video lectures explaining key concepts. Instructors also have access to PPT slides and an instructor’s manual that consists of a test bank with multiple choice exams, exercises with data sets, and solutions. This is the ideal statistics textbook for aviation courses globally, especially in aviation statistics, research methods in aviation, human factors, and related areas.
This book is intended to provide the reader with a firm conceptual and empirical understanding of basic information-theoretic econometric models and methods. Because most data are observational, practitioners work with indirect noisy observations and ill-posed econometric models in the form of stochastic inverse problems. Consequently, traditional econometric methods in many cases are not applicable for answering many of the quantitative questions that analysts wish to ask. After initial chapters deal with parametric and semiparametric linear probability models, the focus turns to solving nonparametric stochastic inverse problems. In succeeding chapters, a family of power divergence measure likelihood functions are introduced for a range of traditional and nontraditional econometric-model problems. Finally, within either an empirical maximum likelihood or loss context, Ron C. Mittelhammer and George G. Judge suggest a basis for choosing a member of the divergence family.
Both parts of Volume 44 of Advances in Econometrics pay tribute to Fabio Canova for his major contributions to economics over the last four decades. Throughout his long and distinguished career, Canova has achieved a prolific publication record and provided stellar research to the profession. His colleagues, co-authors and PhD students wish to express their deep gratitude to Fabio for his intellectual leadership and guidance, whilst showcasing the extensive advances in knowledge and theory made available by Canova for professionals in the field. Advances in Econometrics publishes original scholarly econometrics papers with the intention of expanding the use of developed and emerging econometric techniques by disseminating ideas on the theory and practice of econometrics throughout the empirical economic, business and social science literature. Annual volume themes, selected by the Series Editors, are their interpretation of important new methods and techniques emerging in economics, statistics and the social sciences.
This text covers the basic theory and computation for mathematical modeling in linear programming. It provides a strong background on how to set up mathematical proofs and high-level computation methods, and includes substantial background material and direction. Paris presents an intuitive and novel discussion of what it means to solve a system of equations, a crucial stepping stone for solving any linear program. The discussion of the simplex method for solving linear programs gives an economic interpretation to every step of the simplex algorithm. The text combines in a unique and novel way the microeconomics of production with the structure of linear programming, to give students and scholars of economics a clear notion of what it means to formulate a model of economic equilibrium and to compute opportunity cost in the presence of many outputs and inputs.
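The kind of production linear program such a text deals with can be illustrated with a toy example. The numbers below are my own, not from the book: maximize profit 3x + 5y from two outputs subject to two resource constraints, solved by enumerating the vertices of the feasible region (the fact that an optimum lies at a vertex is the stepping stone to the simplex method).

```python
from itertools import combinations

# Toy production LP (illustrative numbers): maximize c.x s.t. A x <= b, x >= 0.
c = [3.0, 5.0]                      # profit per unit of each output
A = [[1.0, 1.0], [1.0, 3.0]]        # resource use per unit of output
b = [4.0, 6.0]                      # resource endowments

def solve_2x2(M, rhs):
    """Solve a 2x2 linear system by Cramer's rule; None if singular."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    if abs(det) < 1e-12:
        return None
    x = (rhs[0] * M[1][1] - M[0][1] * rhs[1]) / det
    y = (M[0][0] * rhs[1] - rhs[0] * M[1][0]) / det
    return (x, y)

# Candidate vertices: intersections of any two of the four boundary lines
# (the two resource constraints plus the axes x = 0 and y = 0).
lines = A + [[1.0, 0.0], [0.0, 1.0]]
rhs_all = b + [0.0, 0.0]
best, best_x = None, None
for i, j in combinations(range(4), 2):
    pt = solve_2x2([lines[i], lines[j]], [rhs_all[i], rhs_all[j]])
    if pt is None:
        continue
    x, y = pt
    feasible = x >= -1e-9 and y >= -1e-9 and all(
        A[k][0] * x + A[k][1] * y <= b[k] + 1e-9 for k in range(2))
    if feasible:
        profit = c[0] * x + c[1] * y
        if best is None or profit > best:
            best, best_x = profit, (x, y)

print(best_x, best)  # optimum at (3, 1) with profit 14
```

Both constraints bind at the optimum, so the shadow prices (here 2 and 1 per unit of the two resources) measure the opportunity cost of each resource, exactly the economic interpretation the text attaches to the simplex steps.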
Although geometry has always aided intuition in econometrics, more recently differential geometry has become a standard tool in the analysis of statistical models, offering a deeper appreciation of existing methodologies and highlighting the essential issues which can be hidden in an algebraic development of a problem. Originally published in 2000, this volume was an early example of the application of these techniques to econometrics. An introductory chapter provides a brief tutorial for those unfamiliar with the tools of Differential Geometry. The topics covered in the following chapters demonstrate the power of the geometric method to provide practical solutions and insight into problems of econometric inference.
The rich, multi-faceted and multi-disciplinary field of matching-based market design is an active and important one due to its highly successful applications with economic and sociological impact. Its home is economics, but with intimate connections to algorithm design and operations research. With chapters contributed by over fifty top researchers from all three disciplines, this volume is unique in its breadth and depth, while still being a cohesive and unified picture of the field, suitable for the uninitiated as well as the expert. It explains the dominant ideas from computer science and economics underlying the most important results on market design and introduces the main algorithmic questions and combinatorial structures. Methodologies and applications from both the pre-Internet and post-Internet eras are covered in detail. Key chapters discuss the basic notions of efficiency, fairness and incentives, and the way market design seeks solutions guided by normative criteria borrowed from social choice theory.
This book explains inflation dynamics using time series data from 1960 for 42 countries. These countries differ in every aspect: historically, culturally, socially, politically, institutionally, and economically. They were chosen on the basis of data availability only and cover the Middle East and North Africa (MENA) region, Africa, Asia, the Caribbean, Europe, Australasia, and the United States. Inflation reached double digits in the developed countries in the 1970s and 80s; central banks then successfully stabilized it by anchoring inflation expectations for decades, until now. Conditional on common and country-specific shocks such as oil price shocks, financial, banking and political crises, wars, pandemics, and natural disasters, the book tests various theoretical models of the long- and short-run relationships between money and prices, money growth and inflation, money growth and real output, expected inflation, the output gap, fiscal policy, and inflation, using a number of parametric and non-parametric methods, and pays attention to specification and estimation problems. In addition, it explains why policymakers in inflation-targeting countries, e.g. the U.S., failed to anticipate the recent sudden rise in inflation, and it examines the fallibility of Modern Monetary Theory's policy prescription to reduce inflation by raising taxes. This unique and innovative book will find an audience among students, academics, researchers, policy makers, and analysts in corporations, private and central banks, and international monetary institutions.
"Econometrics textbooks see their subject as a set of techniques; Magnus and Morgan see it as a set of practices. A combination of controlled experiment and anthropology of science, Methodology and Tacit Knowledge gives a rare inside view of how econometricians work, why econometrics is an art and not a set of simple recipes, and why, like all artists, econometricians differ in their techniques and finished works. This is economic methodology at its best." Kevin Hoover, University of California, Davis "The tacit knowledge experiment was a highly commendable initiative. Its exploration of the theme of how knowledge is acquired and used in applied econometrics is unique and produced some fascinating insights into this process." Adrian Pagan, Australian National University "It is rare, perhaps unique, to find leading empirical economists face the prospect of modelling the same phenomena, with the same data within the same limited time frame. A valuable and illuminating experiment in comparative research methodologies, made all the more provocative when compared to the excellent original study by Tobin." Richard Blundell, University College London This book will be of considerable interest to economists and to econometricians concerned about the methodology of their own discipline, and will provide valuable material for researchers in science studies and for teachers of econometrics.
This comprehensive book is an introduction to multilevel Bayesian models in R using brms and the Stan programming language. Featuring a series of fully worked analyses of repeated-measures data, the book places its focus on active learning through the analysis of the progressively more complicated models presented throughout. The authors offer an introduction to statistics entirely focused on repeated-measures data, beginning with very simple two-group comparisons and ending with multinomial regression models with many 'random effects'. Across 13 well-structured chapters, readers are provided with all the code necessary to run the analyses and make the plots in the book, as well as useful examples of how to interpret and write up their own analyses. This book provides an accessible introduction for readers in any field, with any level of statistical background. Senior undergraduate students, graduate students, and experienced researchers looking to 'translate' their skills with more traditional models to a Bayesian framework will benefit greatly from the lessons in this text.
This proceedings volume presents new methods and applications in applied economics, with special interest in advanced cross-section data estimation methodology. Featuring select contributions from the 2019 International Conference on Applied Economics (ICOAE 2019) held in Milan, Italy, this book explores areas such as applied macroeconomics, applied microeconomics, applied financial economics, applied international economics, applied agricultural economics, applied marketing and applied managerial economics. The International Conference on Applied Economics (ICOAE) is an annual conference that started in 2008, designed to bring together economists from different fields of applied economic research in order to share methods and ideas. Applied economics is a rapidly growing field of economics that combines economic theory with econometrics to analyze economic problems of the real world, usually with economic policy interest. In addition, there is growing interest in the field of applied economics for cross-section data estimation methods, tests and techniques. This volume makes a contribution to the field of applied economic research by presenting the most current research. Featuring country-specific studies, this book is of interest to academics, students, researchers, practitioners, and policy makers in applied economics, econometrics and economic policy.
This book gives a thorough and systematic introduction to the latest research results on fuzzy decision-making methods based on prospect theory. It includes eight chapters: Introduction, Intuitionistic fuzzy MADM based on prospect theory, QUALIFLEX based on prospect theory with probabilistic linguistic information, Group PROMETHEE based on prospect theory with hesitant fuzzy linguistic information, Prospect consensus with probabilistic hesitant fuzzy preference information, Improved TODIM based on prospect theory and the improved TODIM with probabilistic hesitant fuzzy information, etc. This book is suitable for researchers in the fields of fuzzy mathematics, operations research, behavioral science, management science and engineering, etc. It is also useful as a textbook for postgraduate and senior-year undergraduate students of the relevant professional institutions of higher learning.
This book focuses on the competitive situation and policy outlook of China's provincial economy in the 13th five-year period. It begins with a general evaluation report on the country's provincial comprehensive economic competitiveness, followed by analyses at the international, national, regional, industrial and enterprise levels. On the basis of domestic and international research findings, it further enriches our understanding of provincial competitiveness, analyzes the domestic and international situation, explores new changes, new norms, new situations and new challenges concerning China's provincial economy in the past few years, reveals the characteristics and relative differences of different types, defines their internal competitive strengths and weaknesses, and provides valuable theoretical content to guide decision-making.
The essays in this special volume survey some of the most recent advances in the global analysis of dynamic models for economics, finance and the social sciences. They deal in particular with a range of topics from mathematical methods as well as numerous applications including recent developments on asset pricing, heterogeneous beliefs, global bifurcations in complementarity games, international subsidy games and issues in economic geography. A number of stochastic dynamic models are also analysed. The book is a collection of essays in honour of the 60th birthday of Laura Gardini.
• Self-contained chapters on the most important applications and methodologies in finance, which can easily be used for the reader’s research or as a reference for courses on empirical finance.
• Each chapter is reproducible in the sense that the reader can replicate every single figure, table, or number by simply copy-pasting the code we provide.
• A full-fledged introduction to machine learning with tidymodels, based on tidy principles, showing how factor selection and option pricing can benefit from machine learning methods.
• Chapter 2 on accessing & managing financial data shows how to retrieve and prepare the most important datasets in the field of financial economics: CRSP and Compustat. The chapter also contains detailed explanations of the most important data characteristics.
• Each chapter provides exercises that are based on established lectures and exercise classes and which are designed to help students dig deeper. The exercises can be used for self-study or as a source of inspiration for teaching exercises.
The search for symmetry is part of the fundamental scientific paradigm in mathematics and physics. Can this be valid also for economics? This book represents an attempt to explore this possibility. The behavior of price-taking producers, monopolists, monopsonists, sectoral market equilibria, behavior under risk and uncertainty, and two-person zero- and non-zero-sum games are analyzed and discussed under the unifying structure called the linear complementarity problem. Furthermore, the equilibrium problem allows for the relaxation of often-stated but unnecessary assumptions. This unifying approach offers the advantage of a better understanding of the structure of economic models. It also introduces the simplest and most elegant algorithm for solving a wide class of problems.
• Introduces the dynamics, principles and mathematics behind ten macroeconomic models allowing students to visualise the models and understand the economic intuition behind them. • Provides a step-by-step guide, and the necessary MATLAB codes, to allow readers to simulate and experiment with the models themselves.
The collapse in commodity prices since 1980 has been a major cause of the economic crisis in a large number of developing countries. This book investigates whether the commodity-producing countries, by joint action, could have prevented the price collapse through appropriate supply management. The analysis focuses on the markets for the tropical beverage crops: coffee, cocoa, and tea. Using new econometric models for each market, the impact of alternative supply management schemes on supply, consumption, prices, and export earnings is simulated for the late 1980s. The results indicate that supply management by producing countries would, indeed, have been a viable alternative to the 'free market' approach favoured by the developed countries. This has important implications for current international commodity policy and, in particular, for future joint action by producing countries to overcome persistent commodity surpluses as a complement to needed diversification.
Handbook of Field Experiments, Volume Two explains how to conduct experimental research, presents a catalog of research to date, and describes which areas remain to be explored. The new volume includes sections on field experiments in education in developing countries, how to design social protection programs, a section on how to combat poverty, and updates on data relating to the impact and determinants of health levels in low-income countries. Separating itself from circumscribed debates of specialists, this volume surpasses the many journal articles and narrowly-defined books written by practitioners. This ongoing series will be of particular interest to scholars working with experimental methods. Users will find results from politics, education, and more.
Introduction to Functional Data Analysis provides a concise textbook introduction to the field. It explains how to analyze functional data, at both exploratory and inferential levels. It also provides a systematic and accessible exposition of the methodology and the required mathematical framework. The book can be used as a textbook for a semester-long course on FDA for advanced undergraduate or MS statistics majors, as well as for MS and PhD students in other disciplines, including applied mathematics, environmental science, public health, medical research, geophysical sciences and economics. It can also be used for self-study and as a reference for researchers in those fields who wish to acquire a solid understanding of FDA methodology and practical guidance for its implementation. Each chapter contains plentiful examples of relevant R code and theoretical and data analytic problems. The material of the book can be roughly divided into four parts of approximately equal length: 1) basic concepts and techniques of FDA, 2) functional regression models, 3) sparse and dependent functional data, and 4) introduction to the Hilbert space framework of FDA. The book assumes an advanced undergraduate background in calculus, linear algebra, distributional probability theory, foundations of statistical inference, and some familiarity with R programming. Other required statistics background is provided in scalar settings before the related functional concepts are developed. Most chapters end with references to more advanced research for those who wish to gain a more in-depth understanding of a specific topic.
In this monograph the authors give a systematic approach to the probabilistic properties of the fixed point equation X = AX + B. A probabilistic study of the stochastic recurrence equation X_t = A_t X_{t-1} + B_t for real- and matrix-valued random variables A_t, where (A_t, B_t) constitute an iid sequence, is provided. The classical theory for these equations, including the existence and uniqueness of a stationary solution and the tail behavior with special emphasis on power law behavior, moments and support, is presented. The authors collect recent asymptotic results on extremes, point processes, partial sums (central limit theory with special emphasis on infinite variance stable limit theory) and large deviations, in the univariate and multivariate cases, and they further touch on the related topics of smoothing transforms, regularly varying sequences and random iterative systems. The text gives an introduction to the Kesten-Goldie theory for these stochastic recurrence equations, provides the classical results of Kesten, Goldie, Guivarc'h, and others, and gives an overview of recent results on the topic. It presents the state of the art in the field of affine stochastic recurrence equations and shows relations with non-affine recursions and multivariate regular variation.
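The recurrence X_t = A_t X_{t-1} + B_t described above is easy to simulate, which makes the existence of a stationary solution tangible. The sketch below is my own illustration, not from the monograph; the function name and samplers are assumptions. When E[log|A|] < 0 (the classical contraction condition), the chain forgets its starting point and settles into its stationary distribution.

```python
import random

def simulate_sre(n, a_sampler, b_sampler, x0=0.0, burn_in=1000):
    """Simulate the stochastic recurrence X_t = A_t * X_{t-1} + B_t,
    with (A_t, B_t) drawn iid from the given samplers.
    The first burn_in steps are discarded so the returned path is
    approximately stationary."""
    x = x0
    path = []
    for t in range(burn_in + n):
        x = a_sampler() * x + b_sampler()
        if t >= burn_in:
            path.append(x)
    return path

random.seed(42)
# A_t ~ Uniform(0, 0.5) gives E[log|A|] < 0, so a unique stationary
# solution exists; with B_t = 1 the stationary mean is
# E[B] / (1 - E[A]) = 1 / (1 - 0.25) = 4/3.
path = simulate_sre(5000, lambda: random.uniform(0.0, 0.5), lambda: 1.0)
mean = sum(path) / len(path)
```

With heavier-tailed choices of A_t (so that E[A^k] = 1 for some k > 0), the Kesten-Goldie theory predicts the power-law tails that the monograph analyzes in depth.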