This book undertakes a theoretical and econometric analysis of intense economic growth in selected European countries at the end of the twentieth century and the beginning of the twenty-first. Focusing on the accelerated economic growth that occurred in Ireland, the Netherlands, Spain, and Turkey, the book investigates the determinants and consequences of this "miracle" growth and discusses them in the context of the growth and development processes observed in European market-type economies after World War II. Using imperfect knowledge economics (IKE) as a theoretical framework to interpret the empirical results, it provides a fresh theoretical perspective in comparison with the current Neo-classical, Keynesian and institutional paradigms. With this systematic approach, the authors seek to provide a unified methodology for evaluating the phenomenon of intense economic growth that has heretofore been missing from the discipline. Combining diverse theoretical and methodological strategies to provide a holistic understanding of the historical process of economic change, this volume will be of interest to students and scholars of economic growth, econometrics, political economy, and the new institutional economics, as well as to policymakers.
This book provides a coherent description of the main concepts and statistical methods used to analyse economic performance. The focus is on measures of performance that are of practical relevance to policy makers. Most, if not all, of these measures can be viewed as measures of productivity and/or efficiency. Linking fields as diverse as index number theory, data envelopment analysis and stochastic frontier analysis, the book explains how to compute measures of input and output quantity change that are consistent with measurement theory. It then discusses ways in which meaningful measures of productivity change can be decomposed into measures of technical progress, environmental change, and different types of efficiency change. The book is aimed at graduate students, researchers, statisticians, accountants and economists working in universities, regulatory authorities, government departments and private firms. The book contains many numerical examples. Computer codes and datasets are available on a companion website.
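To give a flavour of the index-number methods the book links together, here is a minimal sketch of a Fisher quantity index, the geometric mean of the Laspeyres and Paasche quantity indexes. The formula is the standard one, not necessarily the book's preferred measure, and the prices and quantities are invented for illustration.

```python
import numpy as np

# Fisher quantity index between periods 0 and 1: the geometric mean of
# the Laspeyres (base-period price weights) and Paasche (comparison-period
# price weights) quantity indexes.  Two goods, made-up data.
p0 = np.array([10.0, 4.0]); q0 = np.array([5.0, 20.0])   # base period
p1 = np.array([11.0, 5.0]); q1 = np.array([6.0, 18.0])   # comparison period

laspeyres = (p0 @ q1) / (p0 @ q0)   # quantity change at base-period prices
paasche = (p1 @ q1) / (p1 @ q0)     # quantity change at comparison prices
fisher = np.sqrt(laspeyres * paasche)
print(round(fisher, 4))             # aggregate output grew by about 1.1%
```

The Fisher index is often preferred because it satisfies more of the axiomatic tests of index number theory than either of its two components alone.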
The global financial crisis saw many Eurozone countries bearing excessive public debt. This led the government bond yields of some peripheral countries to rise sharply, resulting in the outbreak of the European sovereign debt crisis. The debt crisis is characterized by its immediate spread from Greece, the country of origin, to its neighbouring countries and the connection between the Eurozone banking sector and the public sector debt. Addressing these interesting features, this book sheds light on the impacts of the crisis on various financial markets in Europe. This book is among the first to conduct a thorough empirical analysis of the European sovereign debt crisis. It analyses, using advanced econometric methodologies, why the crisis escalated so prominently, having significant impacts on a wide range of financial markets, and was not just limited to government bond markets. The book also allows one to understand the consequences and the overall impact of such a debt crisis, enabling investors and policymakers to formulate diversification strategies, and create suitable regulatory frameworks.
This title, first published in 1970, provides a comprehensive account of the public finance system in Britain. As well as providing a concise outline of the monetary system as a basis for a realistic understanding of public finance, the author also describes the pattern of government expenditure and revenue in the twentieth century and goes on to give a detailed account of the taxation system up until April 1969. This title will be of interest to students of monetary economics.
This study, first published in 1994, is intended to deepen the reader's understanding of the phenomenon of equilibrium credit rationing in two areas. The first area concerns the form that equilibrium credit rationing assumes and its importance in determining the behaviour of interest rates. The second concerns the role of equilibrium credit rationing in transmitting monetary shocks to the real sector. This title will be of interest to students of monetary economics.
This short book introduces the main ideas of statistical inference in a way that is both user-friendly and mathematically sound. Particular emphasis is placed on the common foundation of many models used in practice. In addition, the book focuses on the formulation of appropriate statistical models to study problems in business, economics, and the social sciences, as well as on how to interpret the results from statistical analyses. The book will be useful to students who are interested in rigorous applications of statistics to problems in business, economics and the social sciences, as well as to students who have studied statistics in the past but need a more solid grounding in statistical techniques to further their careers. Jacco Thijssen is Professor of Finance at the University of York, UK. He holds a PhD in mathematical economics from Tilburg University in the Netherlands. His main research interests are in applications of optimal stopping theory, stochastic calculus, and game theory to problems in economics and finance. Professor Thijssen has earned several awards for his statistics teaching.
The object of this work, first published in 1977, is to examine the history of the economic and monetary union (EMU) in the European Community, the policies of the parties involved and the conflicts of interest created in the political and economic environment within which all this has taken place. This title will be of interest to students of monetary economics and finance.
'Refreshingly clear and engaging' Tim Harford 'Delightful . . . full of unique insights' Prof Sir David Spiegelhalter There's no getting away from statistics. We encounter them every day. We are all users of statistics whether we like it or not. Do missed appointments really cost the NHS GBP1bn per year? What's the difference between the mean gender pay gap and the median gender pay gap? How can we work out if a claim that we use 42 billion single-use plastic straws per year in the UK is accurate? What did the Vote Leave campaign's GBP350m bus really mean? How can we tell if the headline 'Public pensions cost you GBP4,000 a year' is correct? Does snow really cost the UK economy GBP1bn per day? But how do we distinguish statistical fact from fiction? What can we do to decide whether a number, claim or news story is accurate? Without an understanding of data, we cannot truly understand what is going on in the world around us. Written by Anthony Reuben, the BBC's first head of statistics, Statistical is an accessible and empowering guide to challenging the numbers all around us.
Written in a highly accessible style, A Factor Model Approach to Derivative Pricing lays a clear and structured foundation for the pricing of derivative securities based upon simple factor model related absence of arbitrage ideas. This unique and unifying approach provides for a broad treatment of topics and models, including equity, interest-rate, and credit derivatives, as well as hedging and tree-based computational methods, but without reliance on the heavy prerequisites that often accompany such topics. Key features:
- A single fundamental absence of arbitrage relationship based on factor models is used to motivate all the results in the book
- A structured three-step procedure is used to guide the derivation of absence of arbitrage equations and illuminate core underlying concepts
- Brownian motion and Poisson process driven models are treated together, allowing for a broad and cohesive presentation of topics
- The final chapter provides a new approach to risk neutral pricing that introduces the topic as a seamless and natural extension of the factor model approach
Whether used as a text for an intermediate-level course in derivatives, or by researchers and practitioners who are seeking a better understanding of the fundamental ideas that underlie derivative pricing, readers will appreciate the book's ability to unify many disparate topics and models under a single conceptual theme. James A. Primbs is an Associate Professor of Finance at the Mihaylo College of Business and Economics at California State University, Fullerton.
This second edition of Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies. An observational study is an empirical investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is organized into five parts. Chapters 2, 3, and 5 of Part I cover concisely many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates, and includes an updated chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV is new to this edition; it discusses evidence factors and the computerized construction of more than one comparison group. Part V discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies: "make your theories elaborate." This new edition features updated exploration of causal inference, with four new chapters, a new R package DOS2 designed as a companion for the book, and discussion of several of the latest matching packages for R. In particular, DOS2 allows readers to reproduce many analyses from Design of Observational Studies.
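To illustrate the propensity-score matching discussed in Part II, here is a minimal, self-contained Python sketch on simulated data. It is not the book's R/DOS2 code: the data, the logistic propensity model, and the one-to-one nearest-neighbour matching rule are deliberate simplifications of the general idea.

```python
import numpy as np

# Simulate a simple observational study: treatment assignment depends
# on two observed covariates (purely illustrative data).
rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=(n, 2))                          # observed covariates
logit = 0.8 * x[:, 0] - 0.5 * x[:, 1]
treated = rng.random(n) < 1 / (1 + np.exp(-logit))   # treatment indicator

# Fit a logistic model for P(treated | x) by Newton-Raphson.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (treated - p)                       # score vector
    hess = (X * (p * (1 - p))[:, None]).T @ X        # observed information
    beta += np.linalg.solve(hess, grad)
score = 1 / (1 + np.exp(-X @ beta))                  # estimated propensity score

# Match each treated unit to the control with the closest score
# (nearest neighbour, with replacement).
t_idx = np.flatnonzero(treated)
c_idx = np.flatnonzero(~treated)
dist = np.abs(score[c_idx][None, :] - score[t_idx][:, None])
matches = c_idx[dist.argmin(axis=1)]

# A matched design is judged by covariate/score balance, so the mean
# within-pair score gap should be small.
gap = np.abs(score[t_idx] - score[matches]).mean()
print(gap)
```

In practice one would go on to check balance on the covariates themselves, not only the score, which is the focus of the book's treatment of matched designs.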
The composition of portfolios is one of the most fundamental and important methods in financial engineering, used to control the risk of investments. This book provides a comprehensive overview of statistical inference for portfolios and their various applications. A variety of asset processes are introduced, including non-Gaussian stationary, nonlinear, and non-stationary processes, and the book provides a framework for statistical inference using local asymptotic normality (LAN). The approach is generalized for portfolio estimation, so that many important problems can be covered. This book can primarily be used as a reference by researchers from statistics, mathematics, finance, econometrics, and genomics. It can also be used as a textbook by senior undergraduate and graduate students in these fields.
Customer and Business Analytics: Applied Data Mining for Business Decision Making Using R explains and demonstrates, via the accompanying open-source software, how advanced analytical tools can address various business problems. It also gives insight into some of the challenges faced when deploying these tools. Extensively classroom-tested, the text is ideal for students in customer and business analytics or applied data mining as well as professionals in small- to medium-sized organizations. The book offers an intuitive understanding of how different analytics algorithms work. Where necessary, the authors explain the underlying mathematics in an accessible manner. Each technique presented includes a detailed tutorial that enables hands-on experience with real data. The authors also discuss issues often encountered in applied data mining projects and present the CRISP-DM process model as a practical framework for organizing these projects. Showing how data mining can improve the performance of organizations, this book and its R-based software provide the skills and tools needed to successfully develop advanced analytics capabilities.
Originally published in 1981, this book considers one particular area of econometrics, the linear model, where significant advances had recently been made. It considers both single- and multi-equation models with varying coefficients, explains the various theories and techniques connected with these, and goes on to describe the various applications of the models. While the detailed explanation of the models will interest primarily econometrics specialists, the implications of the advances outlined and the applications of the models will interest a wide range of economists.
Originally published in 1992, this title came out of a conference on emotion and cognition as antecedents and consequences of health and disease processes in children and adolescents. The theoretical rationale for the conference was based on the assumption that the development of emotion, cognition, health and illness are processes that influence each other throughout the life span, and that these reciprocal interactions begin in infancy. The chapters discuss developmental theories, research and implications for interventions as they relate to promoting health, preventing disease, and treating illness in children and adolescents.
This study, first published in 1979, examines and contrasts two concepts of credit rationing. The first concept takes the relevant price of credit to be the explicit interest rate on the loan and defines the demand for credit as the amount an individual borrower would like to receive at that rate. Under the alternative definition, the price of credit consists of the complete set of loan terms confronting a class of borrowers with given characteristics, while the demand for credit equals the total number of loans which members of the class would like to receive at those terms. This title will be of interest to students of monetary economics.
Originally published in 1984, this book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics of re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular, the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. The approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
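The first approach mentioned, regressing spot rates on lagged forward rates, can be sketched as follows: under the joint null of market efficiency and the equilibrium model, the intercept should be near zero and the slope near one. The data below are simulated purely for illustration and are not from the book.

```python
import numpy as np

# Forward-rate unbiasedness regression:
#   s_t = alpha + beta * f_{t-1} + e_t
# The joint null (market efficiency + equilibrium model) implies
# alpha = 0 and beta = 1.  Simulated data satisfying the null.
rng = np.random.default_rng(0)
n = 200
f_lag = rng.normal(0.0, 1.0, n)            # lagged forward rates (simulated)
spot = f_lag + rng.normal(0.0, 0.1, n)     # spot rates generated under the null

X = np.column_stack([np.ones(n), f_lag])   # design matrix: intercept + regressor
coef, *_ = np.linalg.lstsq(X, spot, rcond=None)
alpha_hat, beta_hat = coef
print(alpha_hat, beta_hat)                 # close to 0 and 1 under the null
```

In an actual efficiency test one would also compute standard errors and test the restriction (alpha, beta) = (0, 1) jointly, rather than eyeballing the point estimates.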
This title, first published in 1984, considers a temporary monetary equilibrium theory under certainty in a differentiable framework. Using the techniques of differential topology the author investigates the structure of the set of temporary monetary equilibria. Temporary Monetary Equilibrium Theory: A Differentiable Approach will be of interest to students of monetary economics.
Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences, with an emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; however, applied mathematicians, industry practitioners, and many interdisciplinary academics will also find the presentation highly useful. Core topics of this text are:
- Economic growth and technological development
- Population dynamics and human impact on the environment
- Resource extraction and scarcity
- Air and water contamination
- Rational management of the economy and environment
- Climate change and global dynamics
The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar investigation techniques are applicable to analyze various models. Instructors will appreciate the substantial flexibility that this text allows while designing their own syllabus. Chapters are essentially self-contained and may be covered in full, in part, and in any order. Appropriate one- and two-semester courses include, but are not limited to, Applied Mathematical Modeling, Mathematical Methods in Economics and Environment, Models of Biological Systems, Applied Optimization Models, and Environmental Models. Prerequisites for the courses are Calculus and, preferably, Differential Equations.
This book is about learning from data using the Generalized Additive Models for Location, Scale and Shape (GAMLSS). GAMLSS extends the Generalized Linear Models (GLMs) and Generalized Additive Models (GAMs) to accommodate large complex datasets, which are increasingly prevalent. In particular, the GAMLSS statistical framework enables flexible regression and smoothing models to be fitted to the data. The GAMLSS model assumes that the response variable has any parametric (continuous, discrete or mixed) distribution, which might be heavy- or light-tailed, and positively or negatively skewed. In addition, all the parameters of the distribution (location, scale, shape) can be modelled as linear or smooth functions of explanatory variables. Key Features:
- Provides a broad overview of flexible regression and smoothing techniques to learn from data, whilst also focusing on the practical application of methodology using the GAMLSS software in R.
- Includes a comprehensive collection of real data examples, which reflect the range of problems addressed by GAMLSS models and provide a practical illustration of the process of using flexible GAMLSS models for statistical learning.
- R code integrated into the text for ease of understanding and replication.
- Supplemented by a website with code, data and extra materials.
This book aims to help readers understand how to learn from data encountered in many fields. It will be useful for practitioners and researchers who wish to understand and use the GAMLSS models to learn from data, and also for students who wish to learn GAMLSS through practical examples.
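The model structure described above can be written compactly. The following is an illustrative sketch of the general GAMLSS specification, with notation following common usage (location mu, scale sigma, and shape parameters nu and tau), not necessarily the book's exact formulation:

```latex
% Response with up to four distribution parameters:
y_i \sim \mathcal{D}(\mu_i, \sigma_i, \nu_i, \tau_i)

% Each distribution parameter is related, through its own link
% function g_k, to linear and smooth terms in explanatory variables:
g_k(\theta_k) = X_k \beta_k + \sum_{j} s_{jk}(x_{jk}),
\qquad \theta_1 = \mu,\ \theta_2 = \sigma,\ \theta_3 = \nu,\ \theta_4 = \tau
```

Here the s_{jk} are smooth functions (for example penalized splines), which is what distinguishes GAMLSS from a GLM, where only the location parameter is modelled and only through linear terms.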
Although there are numerous books on research methodology, many fail to present a complete, hands-on, practical guide that leads college classes or individuals through the research process. We are seeing more and more scientific papers, from all research fields, that fail to meet basic criteria in terms of research methods, as well as in the structure, writing style and presentation of results. This book aims to address this gap by providing an authoritative, easy-to-follow guide to research methods and how to apply them. Qualitative Methods in Economics focuses not only on research methods and techniques but also on methodology. The main objective of this book is to discuss qualitative methods and their use in economics and social science research. Chapters identify several of the research approaches commonly used in social studies, from the importance of the role of science through to the techniques of data collection. Using an example research paper to examine the methods used to present the research, the second half of the book breaks down how to present and format your results successfully. This book will be of use to students and researchers who want to improve their research methods, read up on new and cutting-edge advances in research methods, and study ways to improve the research process.
This title, first published in 1984, is a contribution to applied international trade theory. The author explores the specification and estimation of a multisector general equilibrium model of the open economy. The model is formulated with the aim of assessing empirically the effects of three key policy variables on trade flows, domestic prices, and the trade balance. The policy variables with which the author is concerned are the rate of growth of the stock of domestic credit, commercial policy, as represented by tariffs, and, finally, the exchange rate. This title will be of interest to students of economics.
Both parts of Volume 44 of Advances in Econometrics pay tribute to Fabio Canova for his major contributions to economics over the last four decades. Throughout his long and distinguished career, Canova has produced both a prolific publication record and stellar research for the profession. His colleagues, co-authors and PhD students wish to express their deep gratitude to Fabio for his intellectual leadership and guidance, whilst showcasing the extensive advances in knowledge and theory made available by Canova for professionals in the field. Advances in Econometrics publishes original scholarly econometrics papers with the intention of expanding the use of developed and emerging econometric techniques by disseminating ideas on the theory and practice of econometrics throughout the empirical economic, business and social science literature. Annual volume themes, selected by the Series Editors, are their interpretation of important new methods and techniques emerging in economics, statistics and the social sciences.
'Big data' is now readily available to economic historians, thanks to the digitisation of primary sources, collaborative research linking different data sets, and the publication of databases on the internet. Key economic indicators, such as the consumer price index, can be tracked over long periods, and qualitative information, such as land use, can be converted to a quantitative form. In order to fully exploit these innovations it is necessary to use sophisticated statistical techniques to reveal the patterns hidden in datasets, and this book shows how this can be done. A distinguished group of economic historians have teamed up with younger researchers to pilot the application of new techniques to 'big data'. Topics addressed in this volume include prices and the standard of living, money supply, credit markets, land values and land use, transport, technological innovation, and business networks. The research spans the medieval, early modern and modern periods. Research methods include simultaneous equation systems, stochastic trends and discrete choice modelling. This book is essential reading for doctoral and post-doctoral researchers in business, economic and social history. The case studies will also appeal to historical geographers and applied econometricians.
This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging when approached with conventional methods. Among other topics, the book introduces a modern text-mining method called dynamic topic modeling in detail and applies it to a Bitcoin message board. The unique synthesis of theory and practice supported by computational tools is reflected not only in the selection of topics, but also in the fine balance of scientific contributions on practical implementation and theoretical concepts. This link between theory and practice offers theoreticians insights into considerations of applicability and, vice versa, provides practitioners convenient access to new techniques in quantitative finance. Hence the book will appeal both to researchers, including master's and PhD students, and to practitioners, such as financial engineers. The results presented in the book are fully reproducible and all quantlets needed for calculations are provided on an accompanying website.
The Quantlet platform (quantlet.de, quantlet.com, quantlet.org) is an integrated QuantNet environment consisting of different types of statistics-related documents and program codes. Its goal is to promote reproducibility and offer a platform for sharing validated knowledge native to the social web. QuantNet and the corresponding Data-Driven Documents-based visualization allow readers to reproduce the tables, pictures and calculations inside this Springer book.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of country size that reflects the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration by country size, using a data set of 218 countries, 45 of which are European.