To fully function in today's global real estate industry, students and professionals increasingly need to understand how to implement essential and cutting-edge quantitative techniques. This book presents an easy-to-read guide to applying quantitative analysis in real estate, aimed at non-cognate undergraduate and master's students, and meets the requirements of modern professional practice. Through case studies and examples illustrating applications using data sourced from dedicated real estate information providers and major firms in the industry, the book provides an introduction to the foundations underlying statistical data analysis, common data manipulations and descriptive statistics, before gradually building up to more advanced quantitative analysis, modelling and forecasting of real estate markets. Our examples and case studies within the chapters have been specifically compiled for this book and explicitly designed to help the reader acquire a better understanding of the quantitative methods addressed in each chapter. Our objective is to equip readers with the skills needed to confidently carry out their own quantitative analysis and to interpret empirical results from academic work and practitioner studies in the field of real estate and in other asset classes. Both undergraduate and master's-level students, as well as real estate analysts in the professions, will find this book essential reading.
This is a thorough exploration of the models and methods of financial econometrics by one of the world's leading financial econometricians and is for students in economics, finance, statistics, mathematics, and engineering who are interested in financial applications. Based on courses taught around the world, the up-to-date content covers developments in econometrics and finance over the last twenty years while ensuring a solid grounding in the fundamental principles of the field. Care has been taken to link theory and application to provide real-world context for students. Worked exercises and empirical examples have also been included to make sure complicated concepts are solidly explained and understood.
This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage on game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, and provides a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of the ones in the first edition. The authors make use of the book's full color with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already had an introductory course in microeconomics, and a calculus course.
Game theory has revolutionised our understanding of industrial organisation and the traditional theory of the firm. Despite these advances, industrial economists have tended to rely on a restricted set of tools from game theory, focusing on static and repeated games to analyse firm structure and behaviour. Luca Lambertini, a leading expert on the application of differential game theory to economics, argues that many dynamic phenomena in industrial organisation (such as monopoly, oligopoly, advertising, R&D races) can be better understood and analysed through the use of differential games. After illustrating the basic elements of the theory, Lambertini guides the reader through the main models, spanning from optimal control problems describing the behaviour of a monopolist through to oligopoly games in which firms' strategies include prices, quantities and investments. This approach will be of great value to students and researchers in economics and those interested in advanced applications of game theory.
In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, labour markets, and economic growth.
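To give a flavour of what an agent-based model looks like in code, here is a minimal sketch: a generic random-exchange economy, hypothetical and not taken from the book. Agents start with equal wealth and repeatedly pass one unit to a randomly chosen partner; inequality emerges from the interaction alone, with no heterogeneity imposed from outside.

```python
import random

def simulate_exchange(n_agents=500, steps=20000, seed=42):
    """Minimal agent-based model: identical agents repeatedly transfer
    one unit of wealth to a randomly chosen partner."""
    rng = random.Random(seed)
    wealth = [100] * n_agents
    for _ in range(steps):
        i = rng.randrange(n_agents)
        j = rng.randrange(n_agents)
        if wealth[i] > 0:       # no borrowing: wealth stays non-negative
            wealth[i] -= 1
            wealth[j] += 1
    return wealth

wealth = simulate_exchange()
# Total wealth is conserved, but the distribution becomes unequal.
```

Even this toy model shows the typical ABM workflow the volume describes: specify agent rules, simulate, then analyse the emergent distribution empirically.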
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers on the most appropriate modeling choices and on methods for estimating and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
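The reduced-form building block of this literature is easy to illustrate. The sketch below (a hypothetical example, not from the book) simulates a bivariate VAR(1), y_t = A y_{t-1} + e_t, and recovers the coefficient matrix by ordinary least squares via the normal equations:

```python
import random

def simulate_var1(A, T=5000, seed=0):
    """Simulate a bivariate VAR(1): y_t = A @ y_{t-1} + e_t."""
    rng = random.Random(seed)
    y = [[0.0, 0.0]]
    for _ in range(T):
        prev = y[-1]
        e = [rng.gauss(0, 1), rng.gauss(0, 1)]
        y.append([A[0][0]*prev[0] + A[0][1]*prev[1] + e[0],
                  A[1][0]*prev[0] + A[1][1]*prev[1] + e[1]])
    return y

def ols_var1(y):
    """OLS estimate A_hat = (sum y_t x_t')(sum x_t x_t')^{-1}, x_t = y_{t-1}."""
    Sxx = [[0.0, 0.0], [0.0, 0.0]]   # sum of x_t x_t'
    Syx = [[0.0, 0.0], [0.0, 0.0]]   # sum of y_t x_t'
    for t in range(1, len(y)):
        x, yt = y[t-1], y[t]
        for i in range(2):
            for j in range(2):
                Sxx[i][j] += x[i] * x[j]
                Syx[i][j] += yt[i] * x[j]
    det = Sxx[0][0]*Sxx[1][1] - Sxx[0][1]*Sxx[1][0]
    inv = [[ Sxx[1][1]/det, -Sxx[0][1]/det],
           [-Sxx[1][0]/det,  Sxx[0][0]/det]]
    return [[sum(Syx[i][k]*inv[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A_true = [[0.5, 0.1], [0.0, 0.3]]
A_hat = ols_var1(simulate_var1(A_true))
```

The structural step that the book focuses on — identifying economically meaningful shocks from the reduced-form residuals — requires additional assumptions (e.g. a Cholesky ordering) on top of this estimation.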
This is the first of two volumes containing papers and commentaries presented at the Eleventh World Congress of the Econometric Society, held in Montreal, Canada in August 2015. These papers provide state-of-the-art guides to the most important recent research in economics. The book includes surveys and interpretations of key developments in economics and econometrics, and discussion of future directions for a wide variety of topics, covering both theory and application. These volumes provide a unique, accessible survey of progress in the discipline, written by leading specialists in their fields. The first volume includes theoretical and applied papers addressing topics such as dynamic mechanism design, agency problems, and networks.
Factor models have become the most successful tool in the analysis and forecasting of high-dimensional time series. This monograph provides an extensive account of the so-called General Dynamic Factor Model methods. The topics covered include: asymptotic representation problems, estimation, forecasting, identification of the number of factors, identification of structural shocks, volatility analysis, and applications to macroeconomic and financial data.
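The core idea behind factor methods is that a few common components drive many series. As a minimal stand-in for the dynamic factor machinery the monograph covers (this is a hypothetical static example, not the book's General Dynamic Factor Model), the sketch below extracts the first principal-component factor loading from a synthetic panel by power iteration on the sample covariance matrix:

```python
import random

def first_principal_component(X, iters=200):
    """Power iteration on the sample covariance matrix to extract the
    loading vector of the first principal-component factor."""
    n, p = len(X), len(X[0])
    means = [sum(col)/n for col in zip(*X)]
    Xc = [[row[j] - means[j] for j in range(p)] for row in X]
    S = [[sum(Xc[t][i]*Xc[t][j] for t in range(n))/n for j in range(p)]
         for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(S[i][j]*v[j] for j in range(p)) for i in range(p)]
        norm = sum(x*x for x in w) ** 0.5
        v = [x/norm for x in w]
    return v

# Synthetic panel: every series loads equally on one common factor.
rng = random.Random(1)
T, p = 400, 6
f = [rng.gauss(0, 1) for _ in range(T)]
X = [[f[t] + 0.3*rng.gauss(0, 1) for _ in range(p)] for t in range(T)]
v = first_principal_component(X)
# Estimated loadings should be roughly equal across the six series.
```

The dynamic factor models in the monograph generalise this by letting loadings operate across leads and lags, but the estimation logic still rests on eigen-analysis of (spectral) covariance matrices.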
A number of clubs in professional sports leagues exhibit winning streaks over a number of consecutive seasons that do not conform to the standard economic model of a professional sports league developed by El Hodiri and Quirk (1994) and Fort and Quirk (1995). These clubs seem to display what we call "unsustainable runs," defined as a period of two to four seasons where the club acquires expensive talent and attempts to win a league championship despite not having the market size to sustain such a competitive position in the long run. The standard model predicts that clubs that locate in large economic markets will tend to acquire more talent and achieve more success on the field and at the box office than clubs that are located in small markets. This book builds a model that can allow for unsustainable runs yet retain most of the features of the standard model, then subjects it to empirical verification. The new model we develop in the book has as its central feature the ability to generate two equilibria for a club under certain conditions. In the empirical sections of the book, we use time-series analysis to test for the presence of unsustainable runs using historical data from the National Football League (NFL), the National Basketball Association (NBA), the National Hockey League (NHL) and Major League Baseball (MLB). The multiple equilibria model retains all of the features of the standard model of a professional sports league that is accepted quite universally by economists, yet it offers a much richer approach by including an exploration of the effects of revenues that are earned at the league level (television, apparel, naming rights, etc.) and then shared by all of the member clubs, making this book unique and of great interest to scholars in a variety of fields in economics.
This book presents eleven classic papers by the late Professor Suzanne Scotchmer with introductions by leading economists and legal scholars. It introduces Scotchmer's life and work; analyses her pioneering contributions to the economics of patents and innovation incentives, with a special focus on the modern theory of cumulative innovation; and describes her pioneering work on law and economics, evolutionary game theory, and general equilibrium/club theory. It also provides a self-contained introduction for students who want to learn more about the various fields that Professor Scotchmer worked in, with a particular focus on patent incentives and cumulative innovation.
This book develops a machine-learning framework for predicting economic growth. It can also be considered a primer for using machine learning (also known as data mining or data analytics) to answer economic questions. While machine learning itself is not a new idea, advances in computing technology combined with a dawning realization of its applicability to economic questions make it a new tool for economists.
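The basic machine-learning workflow — fit a flexible predictor on training observations, then predict out of sample — can be sketched in a few lines. The example below is a generic k-nearest-neighbour regression on made-up data (the feature names and numbers are hypothetical illustrations, not the book's framework or data):

```python
def knn_predict(X_train, y_train, x, k=3):
    """k-nearest-neighbour regression: average the targets of the k
    training points closest to x (squared Euclidean distance)."""
    order = sorted(range(len(X_train)),
                   key=lambda i: sum((a - b)**2 for a, b in zip(X_train[i], x)))
    return sum(y_train[i] for i in order[:k]) / k

# Toy data: hypothetical (investment rate, years of schooling) -> growth rate
X = [[0.10, 8], [0.12, 9], [0.20, 12], [0.22, 13], [0.30, 15]]
y = [1.0, 1.2, 2.5, 2.7, 3.8]
pred = knn_predict(X, y, [0.21, 12.5], k=2)  # averages the two closest countries
```

Nonparametric predictors of this kind trade interpretability for flexibility, which is precisely the trade-off that makes machine learning a distinctive tool relative to classical econometric specification.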
Davidson and MacKinnon have written an outstanding textbook for graduates in econometrics, covering both basic and advanced topics and using geometrical proofs throughout for clarity of exposition. The book offers a unified theoretical perspective, and emphasizes the practical applications of modern theory.
This book constitutes the first serious attempt to explain the basics of econometrics and its applications in the clearest and simplest manner possible. Recognising the fact that a good level of mathematics is no longer a necessary prerequisite for economics/financial economics undergraduate and postgraduate programmes, it introduces this key subdivision of economics to an audience who might otherwise have been deterred by its complex nature.
Central Bank Balance Sheet and Real Business Cycles argues that a deeper comprehension of changes to the central bank balance sheet can lead to more effective policymaking. Any transaction engaged in by the central bank (issuing currency, conducting foreign exchange operations, investing its own funds, intervening to provide emergency liquidity assistance and carrying out monetary policy operations) influences its balance sheet. Despite this, many central banks throughout the world have largely ignored balance sheet movements, focusing instead on interest-rate policy. In this book, Mustapha Abiodun Akinkunmi highlights the challenges and controversies faced by central banks in the past and present when implementing policies, and analyzes the links between these policies, the central bank balance sheet, and the consequences to economies as a whole. He argues that the composition and evolution of the central bank balance sheet provides a valuable basis for understanding the needs of an economy, and is an important tool in developing strategies that would most effectively achieve policy goals. This book is an important resource for anyone interested in monetary policy or whose work is affected by the actions of the policies of central banks.
This lively book lays out a methodology of confidence distributions and puts them through their paces. Among other merits, they lead to optimal combinations of confidence from different sources of information, and they can make complex models amenable to objective and indeed prior-free analysis for less subjectively inclined statisticians. The generous mixture of theory, illustrations, applications and exercises is suitable for statisticians at all levels of experience, as well as for data-oriented scientists. Some confidence distributions are less dispersed than their competitors. This concept leads to a theory of risk functions and comparisons for distributions of confidence. Neyman-Pearson type theorems leading to optimal confidence are developed and richly illustrated. Exact and optimal confidence distribution is the gold standard for inferred epistemic distributions. Confidence distributions and likelihood functions are intertwined, allowing prior distributions to be made part of the likelihood. Meta-analysis in likelihood terms is developed and taken beyond traditional methods, suiting it in particular to combining information across diverse data sources.
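For a concrete sense of the object being discussed: in the textbook normal-mean case, the confidence distribution evaluated at a candidate parameter value is the confidence level of the one-sided interval ending there. The sketch below uses the large-sample normal approximation (a simplified hypothetical illustration, not the book's more general constructions):

```python
import math

def confidence_distribution(data, theta):
    """Confidence distribution for a normal mean, using the normal
    (rather than t) approximation: C(theta) = Phi(sqrt(n)(theta - xbar)/s)."""
    n = len(data)
    xbar = sum(data) / n
    s = (sum((x - xbar)**2 for x in data) / (n - 1)) ** 0.5
    z = math.sqrt(n) * (theta - xbar) / s
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

data = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0]
c_at_mean = confidence_distribution(data, sum(data) / len(data))
# equals 0.5 at theta = xbar by construction; C is increasing in theta
```

Reading C(theta) as "confidence in the event that the parameter is below theta" is what lets confidence distributions play the role of prior-free epistemic distributions described in the blurb.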
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference “Marshall-Olkin Distributions: Advances in Theory and Applications,” held in Bologna on October 2-3, 2013.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
Inequality is a charged topic. Measures of income inequality rose in the USA in the 1990s to levels not seen since 1929 and gave rise to a suspicion, not for the first time, of a link between radical inequality and financial instability with a resulting crisis under capitalism. Professional macroeconomists have generally taken little interest in inequality because, within the parameters of traditional economic theory, the economy will stabilize itself at full employment. In addition, enlightened economists could enact stabilizing measures to manage any imbalances. The dominant voices among academic economists were unable to interpret the causal forces at work during both the Great Depression and the recent global financial crisis. In Inequality and Instability, James K. Galbraith argues that since there has been no serious work done on the macroeconomic effects of inequality, new sources of evidence are required. Galbraith offers for the first time a vast expansion of the capacity to calculate measures of inequality at both lower and higher levels of aggregation. Instead of measuring inequality as traditionally done, by country, Galbraith insists that to understand real differences that have real effects, inequality must be examined through both smaller and larger administrative units, like sub-national levels within and between states and provinces, multinational continental economies, and the world. He points out that measuring inequality across administrative boundaries captures data on the more specific groups to which people belong. For example, in China, economic inequality reflects the difference in average income levels between city and countryside, or between coastal regions and the interior, and a simple ratio of averages would be an indicator of trends in inequality over the country as a whole.
In a comprehensive presentation of this new method of using data, Inequality and Instability offers an unequaled look at the US economy and various global economies that was not accessible to us before. This provides a more sophisticated and a more accurate picture of inequality around the world, and how inequality is one of the most basic sources of economic instability.
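The two summary measures mentioned above are simple to compute. The sketch below calculates a Gini coefficient from the mean absolute difference, plus the kind of between-group ratio of average incomes the China example describes (the income figures are hypothetical, not Galbraith's data):

```python
def gini(incomes):
    """Gini coefficient via the mean-absolute-difference formula:
    G = (mean |x_i - x_j|) / (2 * mean income)."""
    n = len(incomes)
    mean = sum(incomes) / n
    mad = sum(abs(a - b) for a in incomes for b in incomes) / (n * n)
    return mad / (2.0 * mean)

# Hypothetical incomes for two administrative groups (city vs countryside).
city = [900.0, 1100.0, 1300.0, 1500.0]
countryside = [300.0, 350.0, 400.0, 450.0]

# Between-group ratio of averages: a crude indicator of overall inequality.
ratio = (sum(city)/len(city)) / (sum(countryside)/len(countryside))
g = gini(city + countryside)
```

Tracking such group-level ratios over time across administrative boundaries is the essence of the measurement strategy the book advocates.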
Pioneered by American economist Paul Samuelson, revealed preference theory is based on the idea that the preferences of consumers are revealed in their purchasing behavior. Researchers in this field have developed complex and sophisticated mathematical models to capture the preferences that are 'revealed' through consumer choice behavior. This study of consumer demand and behavior is closely tied up with econometrics (especially nonparametric econometrics), where testing the validity of different theoretical models is an important aspect of research. The theory of revealed preference has a very long and distinguished tradition in economics, but there was no systematic presentation of the theory until now. This book deals with basic questions in economic theory, such as the relation between theory and data, and studies the situations in which empirical observations are consistent or inconsistent with some of the best known theories in economics.
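Testing whether observed choices are consistent with preference maximisation is mechanical once the data are in price-quantity form. The sketch below checks the Weak Axiom of Revealed Preference on a small hypothetical dataset (the axiom and test are standard; the numbers are invented for illustration):

```python
def dot(p, x):
    return sum(a * b for a, b in zip(p, x))

def satisfies_warp(prices, bundles):
    """Weak Axiom of Revealed Preference: if bundle x_i was chosen when
    x_j was affordable, then x_i must not have been affordable when the
    (different) bundle x_j was chosen."""
    n = len(bundles)
    for i in range(n):
        for j in range(n):
            if i == j or bundles[i] == bundles[j]:
                continue
            # x_i directly revealed preferred to x_j ...
            if dot(prices[i], bundles[j]) <= dot(prices[i], bundles[i]):
                # ... so x_j must not be revealed preferred to x_i
                if dot(prices[j], bundles[i]) <= dot(prices[j], bundles[j]):
                    return False
    return True

# Consistent data: neither bundle was affordable when the other was chosen.
ok = satisfies_warp([[1, 2], [2, 1]], [[4, 1], [1, 4]])
# Violating data: each bundle was affordable when the other was chosen.
bad = satisfies_warp([[1, 1], [1, 1]], [[2, 0], [0, 2]])
```

Nonparametric revealed-preference tests of the kind the book studies generalise this pairwise check (e.g. GARP allows chains of revealed preference), but the logic is the same: search the data for cycles that no utility-maximising consumer could generate.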
In recent years nonlinearities have gained increasing importance in economic and econometric research, particularly after the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities in econometric models and apply them to real economic problems. It is intended to serve as an inspiration for researchers to take potential nonlinearities into account. Researchers should be aware of the danger of spuriously applying linear model types to problems with nonlinear features. It is indispensable to use the correct model type in order to avoid biased recommendations for economic policy.
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview of the field.
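Wavelet methods decompose a series into components at different time scales, which is what makes them attractive for business-cycle work. As a minimal illustration (generic Haar transform, not drawn from the volume), one level of the Haar wavelet transform splits a series into scaled pairwise averages (the smooth trend) and scaled pairwise differences (the fine-scale detail), and is exactly invertible:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform (length must be even):
    approximation = scaled pairwise averages, detail = scaled differences."""
    s = 2.0 ** 0.5
    approx = [(signal[i] + signal[i+1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i+1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def inverse_haar_step(approx, detail):
    """Exact inverse of haar_step: reconstruct the original series."""
    s = 2.0 ** 0.5
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_step(x)
x_back = inverse_haar_step(a, d)
# The transform preserves energy: sum(x^2) = sum(a^2) + sum(d^2).
```

Applying haar_step recursively to the approximation coefficients yields a full multi-resolution decomposition — separating, say, high-frequency noise from business-cycle-frequency movements in a macroeconomic series.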
This publication provides insight into the agricultural sector. It illustrates new tendencies in agricultural economics and dynamics (interrelationships with other sectors in rural zones and multifunctionality) and the implications of the World Trade Organization negotiations for international trade in agricultural products. Due to environmental problems, budget constraints, consumer preferences for food safety and pressure from the World Trade Organization, the agricultural sector is undergoing many changes. This book addresses those new developments and provides insights into possible future developments. Agriculture is an economic sector fundamental to the sustainable economic growth of every country. However, this sector has many particularities, namely structural problems (many farms of reduced size, a frequent lack of vocational training among farmers, and the difficulty of bringing farmers together in associations and cooperatives), variations in production and prices over the year, and environmental problems arising from the use of pesticides and fertilizers.
In the era of Big Data our society is given the unique opportunity to understand the inner dynamics and behavior of complex socio-economic systems. Advances in the availability of very large databases, in capabilities for massive data mining, as well as progress in complex systems theory, multi-agent simulation and computational social science open the possibility of modeling phenomena never before successfully captured. This contributed volume from the Perm Winter School addresses the mechanisms and statistics of socio-economic system evolution, with a focus on financial markets and high-frequency data analysis.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement it. Of particular appeal are the instructions on (i) how to write the code for different SFA models in Stata, (ii) how to write a VBA macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the code for Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
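The general DEA problem requires solving a linear program per production unit, as the Excel Solver instructions above suggest. In the special single-input, single-output case under constant returns to scale, however, the LP collapses to a comparison of output/input ratios, which makes the idea of a frontier easy to see (the branch data below are hypothetical, not from the volume):

```python
def dea_crs_efficiency(inputs, outputs):
    """Single-input, single-output DEA under constant returns to scale:
    each unit's efficiency is its output/input ratio relative to the best
    observed ratio, which defines the frontier. The multi-input/multi-output
    case requires a linear program per unit; this special case reduces
    to simple ratios."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical bank branches: staff (input) and transactions (output).
inputs = [10.0, 20.0, 30.0, 50.0]
outputs = [100.0, 150.0, 300.0, 400.0]
eff = dea_crs_efficiency(inputs, outputs)
# Units with efficiency 1.0 lie on the frontier; the rest are dominated.
```

SFA, by contrast, would fit a parametric frontier to the same data by maximum likelihood and attribute part of each unit's shortfall to random noise rather than inefficiency.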
Economists can use computer algebra systems to manipulate symbolic models, derive numerical computations, and analyze empirical relationships among variables. Maxima is an open-source multi-platform computer algebra system that rivals proprietary software. Maxima's symbolic and computational capabilities enable economists and financial analysts to develop a deeper understanding of models by allowing them to explore the implications of differences in parameter values, by providing numerical solutions to problems that would otherwise be intractable, and by providing graphical representations that can guide analysis. This book provides a step-by-step tutorial for using this program to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques. Readers learn how to frame the relevant analysis and how symbolic expressions, numerical computations, and graphical representations can be used to learn from microeconomic models. In particular, comparative statics analysis is facilitated. Little has been published on Maxima and its applications in economics and finance, and this volume will appeal to advanced undergraduates, graduate-level students studying microeconomics, academic researchers in economics and finance, economists, and financial analysts.
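The book's worked examples use Maxima itself. As a rough stand-in for the kind of comparative-statics question a CAS automates, the sketch below solves a hypothetical linear demand/supply equilibrium numerically by bisection and differentiates the equilibrium price with respect to a demand shifter by finite differences (all names and numbers are illustrative assumptions, not the book's models):

```python
def equilibrium_price(a, b=1.0, c=0.5, lo=0.0, hi=100.0):
    """Bisection for the market-clearing price where demand a - b*p
    equals supply c*p. (The closed form is p* = a/(b+c); the numerical
    solve stands in for what a CAS like Maxima does symbolically.)"""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        excess = (a - b*mid) - c*mid     # excess demand, decreasing in p
        if excess > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Comparative statics: response of p* to the demand shifter a,
# via a central finite difference. Analytically dp*/da = 1/(b+c).
h = 1e-4
a0 = 10.0
dp_da = (equilibrium_price(a0 + h) - equilibrium_price(a0 - h)) / (2.0 * h)
```

In Maxima the same result would come from solving the equilibrium condition symbolically and differentiating the solution, with the advantage that the answer remains a formula in the parameters rather than a number.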