Ranked set sampling (RSS) is an advanced survey technique which seeks to improve the likelihood that collected sample data give a good representation of the population while minimizing the costs of obtaining them. The main focus of many agricultural, ecological and environmental studies is the development of well-designed, cost-effective and efficient sampling designs, giving RSS techniques a particular place in resolving the disciplinary problems of economists in application contexts, particularly experimental economics. Ranked Set Sampling: 65 Years Improving the Accuracy in Data Gathering seeks to place RSS at the heart of economic study designs.
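The blurb describes what RSS does without showing the mechanics. A minimal sketch of the basic procedure (the function name, parameters and the use of a sortable key as a stand-in for cheap judgement ranking are illustrative assumptions, not taken from the book): in each cycle, draw m sets of m units, rank each set cheaply, and actually measure only the i-th ranked unit of the i-th set.

```python
import random

def ranked_set_sample(population, m=3, cycles=2, rank_key=None, seed=0):
    """Ranked set sampling sketch: per cycle, draw m judgement sets of m
    units each, rank every set cheaply (here via rank_key), and keep only
    the i-th ranked unit from the i-th set for full measurement."""
    rng = random.Random(seed)
    rank_key = rank_key or (lambda x: x)
    sample = []
    for _ in range(cycles):
        for i in range(m):
            judgement_set = rng.sample(population, m)  # ranked, not all measured
            judgement_set.sort(key=rank_key)
            sample.append(judgement_set[i])            # quantify i-th order statistic
    return sample

population = list(range(100))
rss = ranked_set_sample(population, m=3, cycles=2)
print(len(rss))  # m * cycles = 6 measured units
```

Only m x cycles units are fully measured, while m^2 x cycles are (cheaply) ranked, which is the source of the cost saving the blurb mentions.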
Now in its fourth edition, this landmark text provides a fresh, accessible and well-written introduction to the subject. With a rigorous pedagogical framework, which sets it apart from comparable texts, the latest edition features an expanded website providing numerous real life data sets and examples.
Today, most money is credit money, created by commercial banks. While credit can finance innovation, excessive credit can lead to boom/bust cycles, such as the recent financial crisis. This highlights how the organization of our monetary system is crucial to stability. One way to achieve this is by separating the unit of account from the medium of exchange, and in pre-modern Europe such a separation existed. This new volume examines this idea of monetary separation and the history of monetary arrangements in the North and Baltic Seas region, from the Hanseatic League onwards. The book provides a theoretical analysis of four historical cases in the Baltic and North Seas region, with a view to examining the evolution of monetary arrangements from a new monetary economics perspective. Since the objective exchange value of money (its purchasing power) reflects subjective individual valuations of commodities, the author assesses these historical cases by means of exchange rates. Using theories from new monetary economics, the book explores how the units of account and their media of exchange evolved as social conventions, and offers new insight into the separation between the two. Through this exploration, it puts forward that money is a social institution, a clearing device for the settlement of accounts, and so the value of money, or a separate unit of account, ultimately results from the size of its network of users. The History of Money and Monetary Arrangements offers a highly original new insight into monetary arrangements as an evolutionary process. It will be of great interest to an international audience of scholars and students, including those with an interest in economic history, evolutionary economics and new monetary economics.
Globalization and information and communications technology (ICT) have played a pivotal role in revolutionizing value creation through the development of human capital formation. The constantly changing needs and structure of the labour market are primarily responsible for the conversion of a traditional economy relying fundamentally on the application of physical abilities to a knowledge-based economy relying on ideas, technologies and innovations. In this economy, knowledge has to be created, acquired, developed, transmitted, preserved and utilized for the improvement of individual and social welfare. Comparative Advantage in the Knowledge Economy: A National and Organizational Resource provides a comprehensive and insightful understanding of all the dimensions of a transition from a traditional to a knowledge economy. It attempts to explain how educational achievement, skilled manpower, investment in knowledge capital and analytics will be the key to success of a nation's comparative advantage in the globalized era. The volume should be of interest to students, researchers and teachers of economics, policy makers and advanced graduate students with an interest in economic analyses and development policy.
Three leading experts have produced a landmark work based on a set of working papers published by the Center for Operations Research and Econometrics (CORE) at the Universite Catholique de Louvain in 1994 under the title 'Repeated Games', which holds almost mythic status among game theorists. Jean-Francois Mertens, Sylvain Sorin and Shmuel Zamir have significantly elevated the clarity and depth of presentation with many results presented at a level of generality that goes far beyond the original papers - many written by the authors themselves. Numerous results are new, and many classic results and examples are not to be found elsewhere. Most remain state of the art in the literature. This book is full of challenging and important problems that are set up as exercises, with detailed hints provided for their solutions. A new bibliography traces the development of the core concepts up to the present day.
Handbook of Field Experiments provides tactics on how to conduct experimental research, also presenting a comprehensive catalog of new results from research and of areas that remain to be explored. This updated addition to the series includes entire chapters on field experiments, the politics and practice of social experiments, the methodology and practice of RCTs, and the econometrics of randomized experiments. These topics apply to a wide variety of fields, from politics, to education, to firm productivity, providing readers with a resource that sheds light on timely issues, such as robustness and external validity. Separating itself from circumscribed debates of specialists, this volume surpasses in usefulness the many journal articles and narrowly-defined books written by practitioners.
This book presents some of Arnold Zellner's outstanding contributions to the philosophy, theory and application of Bayesian analysis, particularly as it relates to statistics, econometrics and economics. The volume contains both previously published and new material which cite and discuss the work of Bayesians who have made a contribution by helping researchers and analysts in many professions to become more effective in learning from data and making decisions. Bayesian and non-Bayesian approaches are compared in several papers. Other articles include theoretical and applied results on estimation, model comparison, prediction, forecasting, prior densities, model formulation and hypothesis testing. In addition, a new information processing approach is presented that yields Bayes's Theorem as a perfectly efficient information processing rule. This volume will be essential reading for academics and students interested in quantitative methods as well as industrial analysts and government officials.
Putting Econometrics in its Place is an original and fascinating book, in which Peter Swann argues that econometrics has dominated applied economics for far too long and displaced other essential techniques. While Peter Swann is critical of the monopoly that econometrics currently holds in applied economics, the more important and positive contribution of the book is to propose a new direction and a new attitude to applied economics. The advance of econometrics from its early days has been a massive achievement, but it has also been problematic; practical results from the use of econometrics are often disappointing. The author argues that to get applied economics back on course economists must use a much wider variety of research techniques, and must once again learn to respect vernacular knowledge of the economy. This vernacular includes the knowledge gathered by ordinary people from their everyday interactions with markets. While vernacular knowledge is often unsystematic and informal, it offers insights that can never be found from formal analysis alone. As a serious, original and sometimes contentious book, its readership will be varied and international. Scholars throughout the many fields of economics - both skilled and unskilled in econometrics - are likely to be intrigued by the serious alternative approaches outlined within the book. It will also appeal to communities of economists outside economics departments in government, industry and business as well as business and management schools. Research centres for applied economics, policy research and innovation research, will also find it of interest due to its focus on getting reliable results rather than methodological orthodoxy for its own sake.
This book presents the reader with new operators and matrices that arise in the area of matrix calculus. The properties of these mathematical concepts are investigated and linked with zero-one matrices such as the commutation matrix. Elimination and duplication matrices are revisited and partitioned into submatrices. Studying the properties of these submatrices facilitates achieving new results for the original matrices themselves. Different concepts of matrix derivatives are presented and transformation principles linking these concepts are obtained. One of these concepts is used to derive new matrix calculus results, some involving the new operators and others the derivatives of the operators themselves. The last chapter contains applications of matrix calculus, including optimization, differentiation of log-likelihood functions, iterative interpretations of maximum likelihood estimators, and a Lagrangian multiplier test for endogeneity.
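The blurb mentions zero-one matrices such as the commutation matrix without defining them. As a concrete illustration (a standard construction, not code from the book): the commutation matrix K of order mn satisfies K vec(A) = vec(A'), where vec stacks the columns of an m x n matrix A.

```python
import numpy as np

def commutation_matrix(m, n):
    """Return the mn x mn zero-one commutation matrix K such that
    K @ vec(A) = vec(A.T) for any m x n matrix A (column-stacking vec)."""
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            # vec(A)[j*m + i] = A[i, j]  maps to  vec(A.T)[i*n + j]
            K[i * n + j, j * m + i] = 1.0
    return K

A = np.arange(6).reshape(2, 3)             # a 2 x 3 matrix
K = commutation_matrix(2, 3)
vec = lambda M: M.reshape(-1, order="F")   # column-stacking vec operator
print(np.allclose(K @ vec(A), vec(A.T)))   # True
```

Because K merely permutes the entries of vec(A), it is orthogonal, which is what makes it so useful in matrix-calculus identities.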
Spatial Microeconometrics introduces the reader to the basic concepts of spatial statistics, spatial econometrics and the spatial behavior of economic agents at the microeconomic level. Incorporating useful examples and presenting real data and datasets on real firms, the book takes the reader through the key topics in a systematic way. The book outlines the specificities of data that represent a set of interacting individuals with respect to traditional econometrics that treat their locational choices as exogenous and their economic behavior as independent. In particular, the authors address the consequences of neglecting such important sources of information on statistical inference and how to improve the model predictive performances. The book presents the theory, clarifies the concepts and instructs the readers on how to perform their own analyses, describing in detail the codes which are necessary when using the statistical language R. The book is written by leading figures in the field and is completely up to date with the very latest research. It will be invaluable for graduate students and researchers in economic geography, regional science, spatial econometrics, spatial statistics and urban economics.
This volume gathers peer-reviewed contributions that address a wide range of recent developments in the methodology and applications of data analysis and classification tools in micro and macroeconomic problems. The papers were originally presented at the 29th Conference of the Section on Classification and Data Analysis of the Polish Statistical Association, SKAD 2020, held in Sopot, Poland, September 7-9, 2020. Providing a balance between methodological contributions and empirical papers, the book is divided into five parts focusing on methodology, finance, economics, social issues and applications dealing with COVID-19 data. It is aimed at a wide audience, including researchers at universities and research institutions, graduate and doctoral students, practitioners, data scientists and employees in public statistical institutions.
The volatility of financial returns changes over time and, for the last thirty years, Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models have provided the principal means of analyzing, modeling, and monitoring such changes. Taking into account that financial returns typically exhibit heavy tails (that is, extreme values can occur from time to time), Andrew Harvey's new book shows how a small but radical change in the way GARCH models are formulated leads to a resolution of many of the theoretical problems inherent in the statistical theory. The approach can also be applied to other aspects of volatility, such as those arising from data on the range of returns and the time between trades. Furthermore, the more general class of Dynamic Conditional Score models extends to robust modeling of outliers in the levels of time series and to the treatment of time-varying relationships. As such, there are applications not only to financial data but also to macroeconomic time series and to time series in other disciplines. The statistical theory draws on basic principles of maximum likelihood estimation and, by doing so, leads to an elegant and unified treatment of nonlinear time-series modeling. The practical value of the proposed models is illustrated by fitting them to real data sets.
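For readers unfamiliar with the baseline that Harvey's score-driven models modify, a minimal sketch of the standard GARCH(1,1) variance recursion (parameter values and the unconditional-variance initialization are illustrative choices, not from the book):

```python
import numpy as np

def garch11_variance(returns, omega=0.1, alpha=0.05, beta=0.9):
    """Filter conditional variances with the GARCH(1,1) recursion:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty(len(r))
    sigma2[0] = omega / (1.0 - alpha - beta)  # start at unconditional variance
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(0)
sig2 = garch11_variance(rng.standard_normal(500))
print(sig2.min() > 0)  # variances stay strictly positive
```

The "small but radical change" the blurb refers to is, roughly, driving this recursion by the score of the conditional (heavy-tailed) density rather than by the squared return itself, so that a single extreme observation does not distort the filtered variance.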
This Handbook provides an authoritative overview of current research in the field of cost-benefit analysis and is designed as a starting point for those interested in undertaking advanced research. The Handbook contains major contributions to the development of the field, focussing on standard microeconomic policy evaluations, the relatively neglected area of macroeconomic policy and its integration into a formal CBA framework, and dynamic considerations in CBA. Presenting insights from many influential thinkers, and edited by a leading academic in the field, this comprehensive work will prove an invaluable reference tool for economists, researchers and scholars.
The book's comprehensive coverage of the application of econometric methods to empirical analysis of economic issues is impressive. It uncovers the missing link between textbooks on economic theory and econometrics and highlights the powerful connection between economic theory and empirical analysis perfectly through examples on rigorous experimental design. The use of data sets for estimation derived with the Monte Carlo method helps facilitate the understanding of the role of hypothesis testing applied to economic models. Topics covered in the book are: consumer behavior, producer behavior, market equilibrium, macroeconomic models, qualitative-response models, panel data analysis and time-series analysis. Key econometric models are introduced, specified, estimated and evaluated. The treatment of methods of estimation in econometrics and of the discipline of hypothesis testing makes it a must-have for graduate students of economics and econometrics and aids their understanding of how to estimate economic models and evaluate the results in terms of policy implications.
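The idea of using Monte Carlo data sets to teach hypothesis testing can be made concrete with a small sketch (entirely illustrative; the model, parameter names and rejection rule are assumptions, not examples from the book): simulate a linear model under the null that the slope is zero and check that the t-test rejects at roughly its nominal rate.

```python
import numpy as np

def monte_carlo_t_rejection(n=50, beta1=0.0, reps=2000, seed=0):
    """Simulate y = 1 + beta1*x + e repeatedly and record how often the
    t-test rejects H0: beta1 = 0 at the (asymptotic) 5% level."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        x = rng.standard_normal(n)
        y = 1.0 + beta1 * x + rng.standard_normal(n)
        X = np.column_stack([np.ones(n), x])
        b = np.linalg.lstsq(X, y, rcond=None)[0]       # OLS estimates
        resid = y - X @ b
        s2 = resid @ resid / (n - 2)                   # residual variance
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
        if abs(b[1] / se) > 1.96:                      # 5% critical value
            rejections += 1
    return rejections / reps

rate = monte_carlo_t_rejection()
print(rate)  # close to the nominal 0.05 when H0 is true
```

Re-running with beta1 set away from zero turns the same experiment into a power study, which is the pedagogical payoff of Monte Carlo-generated data.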
Mechanism design is the field of economics that treats institutions and procedures as variables that can be selected in order to achieve desired objectives. An important aspect of a mechanism is the communication among its participants that it requires, which complements other design features such as incentives and complexity. A calculus-based theory of communication in mechanisms is developed in this book. The value of a calculus-based approach lies in its familiarity as well as the insight into mechanisms that it provides. Results are developed concerning (i) a first order approach to the construction of mechanisms, (ii) the range of mechanisms that can be used to achieve a given objective, as well as (iii) lower bounds on the required communication.
This book, which was first published in 1980, is concerned with one particular branch of growth theory, namely descriptive growth theory. It is typically assumed in growth theory that both the factors and goods market are perfectly competitive. In particular this implies amongst other things that the reward to each factor is identical in each sector of the economy. In this book the assumption of identical factor rewards is relaxed and the implications of an intersectoral wage differential for economic growth are analysed. There is also some discussion of the short-run and long-run effects of minimum wage legislation on growth. This book will serve as key reading for students of economics.
Although geometry has always aided intuition in econometrics, more recently differential geometry has become a standard tool in the analysis of statistical models, offering a deeper appreciation of existing methodologies and highlighting the essential issues which can be hidden in an algebraic development of a problem. Originally published in 2000, this volume was an early example of the application of these techniques to econometrics. An introductory chapter provides a brief tutorial for those unfamiliar with the tools of differential geometry. The topics covered in the following chapters demonstrate the power of the geometric method to provide practical solutions and insight into problems of econometric inference.
Understanding why so many people across the world are so poor is one of the central intellectual challenges of our time. This book provides the tools and data that will enable students, researchers and professionals to address that issue. Empirical Development Economics has been designed as a hands-on teaching tool to investigate the causes of poverty. The book begins by introducing the quantitative approach to development economics. Each section uses data to illustrate key policy issues. Part One focuses on the basics of understanding the role of education, technology and institutions in determining why incomes differ so much across individuals and countries. In Part Two, the focus is on techniques to address a number of topics in development, including how firms invest, how households decide how much to spend on their children's education, whether microcredit helps the poor, whether food aid works, who gets private schooling and whether property rights enhance investment. A distinctive feature of the book is its presentation of a range of approaches to studying development questions. Development economics has undergone a major change in focus over the last decade with the rise of experimental methods to address development issues; this book shows how these methods relate to more traditional ones. Please visit the book's website at www.empiricalde.com for online supplements including Stata files and solutions to the exercises.
Law and economics research has had an enormous impact on the laws of contracts, torts, property, crimes, corporations, and antitrust, as well as public regulation and fundamental rights. The Law and Economics of Patent Damages, Antitrust, and Legal Process examines several areas of important research by a variety of international scholars. It contains technical papers on the appropriate way to estimate damages in patent disputes, as well as methods for evaluating relevant markets and vertically integrated firms when determining the competitive effects of mergers and other actions. There are also papers on the implication of different legal processes, regulations, and liability rules on consumer welfare, which range from the impact of delays in legal decisions in labour cases in France to issues of criminal liability related to the use of artificial intelligence. This volume of Research in Law and Economics is a must-read for researchers and professionals of patent damages, antitrust, labour, and legal process.
First published in 1996, Dynamic Disequilibrium Modeling presents some surveys and developments in dynamic disequilibrium and continuous time econometric modeling along with related research from associated fields. Specific areas covered include applications in business cycles and growth, tests for nonlinearity, rationing and disequilibrium dynamics, and demographic and international applications. The contents of this volume comprise the proceedings of the ninth conference in The International Symposia in Economic Theory and Econometrics series under the general editorship of William Barnett. The proceedings volume includes the most important papers presented at a conference held at the University of Munich on August 31-September 4, 1993.
Bringing together a collection of previously published work, this book provides a discussion of major considerations relating to the construction of econometric models that explain economic phenomena well, predict future outcomes and are useful for policy-making. Analytical relations between dynamic econometric structural models and empirical time series MVARMA, VAR, transfer function, and univariate ARIMA models are established with important application for model-checking and model construction. The theory and applications of these procedures to a variety of econometric modeling and forecasting problems as well as Bayesian and non-Bayesian testing, shrinkage estimation and forecasting procedures are also presented and applied. Finally, attention is focused on the effects of disaggregation on forecasting precision, and the Marshallian Macroeconomic Model, which features demand, supply and entry equations for major sectors of economies, is analysed and described. This volume will prove invaluable to professionals, academics and students alike.
This collection brings together important contributions by leading econometricians on (i) parametric approaches to qualitative and sample selection models, (ii) nonparametric and semi-parametric approaches to qualitative and sample selection models, and (iii) nonlinear estimation of cross-sectional and time series models. The advances achieved here can have important bearing on the choice of methods and analytical techniques in applied research.
Econophysics is an emerging interdisciplinary field that takes advantage of the concepts and methods of statistical physics to analyse economic phenomena. This book expands the explanatory scope of econophysics to the real economy by using methods from statistical physics to analyse the success and failure of companies. Using large data sets of companies and income-earners in Japan and Europe, a distinguished team of researchers show how these methods allow us to analyse companies, from huge corporations to small firms, as heterogeneous agents interacting at multiple layers of complex networks. They then show how successful this approach is in explaining a wide range of recent findings relating to the dynamics of companies. With mathematics kept to a minimum, the book is not only a lively introduction to the field of econophysics but also provides fresh insights into company behaviour.