This title was first published in 2003. This book provides a much-needed, comprehensive and up-to-date treatise on financial distress modelling. Since many of the challenges facing researchers of financial distress can only be addressed by a totally new research design and modelling methodology, the book concentrates on extending bankruptcy analysis from single-equation modelling to multi-equation analysis. Essentially, the work takes an innovative approach by comparing each firm with itself over time rather than testing specific hypotheses or improving predictive and classificatory accuracy. Added to this new design, a whole new methodology - or way of modelling the process - is applied in the form of a family of models of which the traditional single-equation logit or MDA model is just a special case. Preliminary two-equation and three-equation models are presented and tested in the final chapters as a taste of things to come. The groundwork is laid for a full treatise on these sorts of multi-equation systems: the family of models could serve as a basis for more specific applications to different industries and for testing hypotheses about the variables influencing bankruptcy risk.
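As a point of reference for the single-equation special case mentioned above, here is a minimal sketch of a traditional logit bankruptcy model, assuming synthetic data and hypothetical ratio names (leverage, liquidity); it illustrates the classic approach, not the book's multi-equation systems.

```python
# Minimal sketch of the traditional single-equation logit bankruptcy model,
# the special case that the book's multi-equation family generalizes.
# Data and variable names here are synthetic/hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
leverage = rng.normal(0.5, 0.2, n)      # hypothetical debt/assets ratio
liquidity = rng.normal(1.5, 0.5, n)     # hypothetical current ratio

# Latent distress index: higher leverage and lower liquidity raise risk
latent = -2.0 + 4.0 * leverage - 1.5 * liquidity + rng.logistic(size=n)
bankrupt = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([leverage, liquidity]))
logit_res = sm.Logit(bankrupt, X).fit(disp=False)
print(logit_res.params)                  # coefficient estimates
print(logit_res.predict(X)[:5])          # in-sample bankruptcy probabilities
```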
This book covers diverse themes, including institutions and efficiency, choice and values, law and economics, development and policy, and social and economic measurement. Written in honour of the distinguished economist Satish K. Jain, this compilation of essays should appeal not only to students and researchers of economic theory but also to those interested in the design and evaluation of institutions and policy.
This volume deals with a range of contemporary issues in Indian and other world economies, with a focus on economic theory and policy and their longstanding implications. It analyses and predicts the mechanisms that can come into play to determine the function of institutions and the impact of public policy.
Financial econometrics is one of the great ongoing success stories of recent decades, having become one of the most active areas of research in econometrics. In this book, Michael Clements presents a clear and logical explanation of the key concepts and ideas behind forecasts of economic and financial variables. He shows that forecasts of the single most likely outcome of an economic or financial variable are of limited value; forecasts that convey the expected range of likely outcomes are more relevant. The book provides a comprehensive treatment of the evaluation of different types of forecasts, draws out the parallels between the different approaches, and describes the methods for evaluating these more complex forecasts, which give a fuller description of the range of possible future outcomes.
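One standard way to evaluate forecasts that go beyond a single point is to check the empirical coverage of a forecast interval. A minimal sketch on synthetic data, assuming a nominal 90% interval:

```python
# Minimal sketch: evaluating an interval forecast by its empirical coverage,
# one of the standard checks for forecasts that go beyond a single point.
# All data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
T = 1000
outcomes = rng.normal(0.0, 1.0, T)          # realized values of the variable

# A forecaster issues a nominal 90% interval each period; here the interval
# is correctly centred and uses the right variance, so coverage should be
# close to 0.90.
z90 = 1.645                                 # 95th percentile of N(0, 1)
lower, upper = -z90 * np.ones(T), z90 * np.ones(T)

hits = (outcomes >= lower) & (outcomes <= upper)
print(f"empirical coverage: {hits.mean():.3f} (nominal 0.90)")
```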
Numerical analysis is the study of computation and of its accuracy, stability and, often, its implementation on a computer. This book focuses on the principles of numerical analysis, and is intended to equip readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
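A classic example of the accuracy and stability issues such a book treats is catastrophic cancellation in the one-pass variance formula; a minimal sketch (not drawn from the book itself):

```python
# Minimal sketch of a numerical-stability pitfall: the one-pass variance
# formula E[x^2] - E[x]^2 suffers catastrophic cancellation when the mean
# is large relative to the spread; the two-pass formula is stable.
import numpy as np

x = 1e8 + np.array([4.0, 7.0, 13.0, 16.0])   # large mean, small spread

one_pass = np.mean(x**2) - np.mean(x)**2      # unstable: cancellation
two_pass = np.mean((x - np.mean(x))**2)       # stable

print(one_pass)   # badly wrong (can even be negative)
print(two_pass)   # 22.5, the correct population variance
```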
In order to make informed decisions, there are three important elements: intuition, trust, and analytics. Intuition is based on experiential learning, and recent research has shown that those who rely on their "gut feelings" may do better than those who don't. Analytics, in turn, are essential for informing decision making in a data-driven environment. The third element, trust, is critical for knowledge sharing to take place. Together, intuition, analytics, and trust make a potent combination for decision making. This book gathers leading researchers who explore the role of these three elements in the process of decision making.
This book provides an up-to-date survey of statistical and econometric techniques for the analysis of count data, with a focus on conditional distribution models. It starts with a presentation of the benchmark Poisson regression model. Alternative models address unobserved heterogeneity, state dependence, selectivity, endogeneity, underreporting, and clustered sampling. Testing and estimation are discussed, and applications in various fields are reviewed.
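For orientation, a minimal sketch of the benchmark Poisson regression model on synthetic count data (variable names hypothetical):

```python
# Minimal sketch of the benchmark Poisson regression model for count data.
# Data are synthetic; in real applications, overdispersion, unobserved
# heterogeneity, etc. motivate the alternative models the book covers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
age = rng.uniform(20, 60, n)                     # hypothetical covariate
X = sm.add_constant((age - 40) / 10)

mu = np.exp(0.5 + 0.3 * X[:, 1])                 # conditional mean, log link
visits = rng.poisson(mu)                         # count outcome

pois_res = sm.Poisson(visits, X).fit(disp=False)
print(pois_res.params)                           # should be near [0.5, 0.3]
```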
Spatial Econometrics provides a modern, powerful and flexible skillset to early career researchers interested in entering this rapidly expanding discipline. It articulates the principles and current practice of modern spatial econometrics and spatial statistics, combining rigour of presentation with unusual breadth of coverage. Introducing and formalizing the principles of, and 'need' for, models which define spatial interactions, the book provides a comprehensive framework applicable across much of modern science. Subjects covered at length include spatial regression models, weighting matrices, estimation procedures and the complications associated with their use. The work particularly focuses on models of uncertainty and on estimation under various complications relating to model specification, data problems and tests of hypotheses, along with systems and panel data extensions, which are covered in exhaustive detail. Extensions discussing pre-test procedures and Bayesian methodologies are provided at length. Throughout, direct applications of spatial models are described in detail, with copious illustrative empirical examples demonstrating how readers might implement spatial analysis in research projects. Designed as a textbook and reference companion, every chapter concludes with a set of questions for formal or self-study. Finally, the book includes extensive supplementary material on large sample theory, with supporting code in the R programming language, for early career econometricians interested in implementing the statistical procedures covered.
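Two of the building blocks named above, the spatial weighting matrix and a basic spatial statistic, can be illustrated in a few lines. A minimal sketch in Python (rather than the book's R supplement), assuming a synthetic lattice of regions and using the classic Moran's I statistic:

```python
# Minimal sketch of two building blocks of spatial econometrics: a
# row-standardized spatial weighting matrix W and Moran's I, the classic
# statistic for spatial autocorrelation. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
k = 10                                   # k x k lattice of regions
n = k * k

# Rook-contiguity weights: regions are neighbours if they share an edge
W = np.zeros((n, n))
for i in range(k):
    for j in range(k):
        a = i * k + j
        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            ii, jj = i + di, j + dj
            if 0 <= ii < k and 0 <= jj < k:
                W[a, ii * k + jj] = 1.0
W = W / W.sum(axis=1, keepdims=True)     # row-standardization

y = rng.normal(size=n)                   # spatially random variable
z = y - y.mean()
moran_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I: {moran_I:.3f} (null mean is -1/(n-1) = {-1/(n-1):.3f})")
```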
A Handbook of Statistical Analyses Using SPSS clearly describes how to conduct a range of univariate and multivariate statistical analyses using the latest version of the Statistical Package for the Social Sciences, SPSS 11. Each chapter addresses a different type of analytical procedure applied to one or more data sets, primarily from the social and behavioral sciences areas. Each chapter also contains exercises relating to the data sets introduced, providing readers with a means to develop both their SPSS and statistical skills. Model answers to the exercises are also provided. Readers can download all of the data sets from a companion Web site furnished by the authors.
This book focuses on the application of the partial hedging approach from modern mathematical finance to equity-linked life insurance contracts. It provides an accessible, up-to-date introduction to quantifying financial and insurance risks, and explains how to price innovative financial and insurance products from the partial hedging perspective. Each chapter presents the problem, its mathematical formulation, theoretical results, derivation details, numerical illustrations, and references for further reading.
The composition of portfolios is one of the most fundamental and important methods in financial engineering, used to control the risk of investments. This book provides a comprehensive overview of statistical inference for portfolios and its various applications. A variety of asset processes are introduced, including non-Gaussian stationary processes, nonlinear processes and non-stationary processes, and the book provides a framework for statistical inference using local asymptotic normality (LAN). The approach is generalized for portfolio estimation, so that many important problems can be covered. The book can primarily be used as a reference by researchers in statistics, mathematics, finance, econometrics, and genomics, and as a textbook by senior undergraduate and graduate students in these fields.
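A simple instance of portfolio estimation is the global minimum-variance portfolio computed from a sample covariance matrix; a minimal sketch on synthetic returns (the book's LAN-based inference goes far beyond this):

```python
# Minimal sketch of a classic portfolio-estimation problem: the global
# minimum-variance weights w = S^{-1} 1 / (1' S^{-1} 1) estimated from a
# sample covariance matrix. Returns are synthetic i.i.d. Gaussian; the
# book treats far more general (non-Gaussian, non-stationary) processes.
import numpy as np

rng = np.random.default_rng(4)
T, p = 500, 4                              # observations, assets
returns = rng.multivariate_normal(
    mean=[0.01] * p,
    cov=0.02 * np.eye(p) + 0.005,          # equicorrelated covariance
    size=T,
)

S = np.cov(returns, rowvar=False)
ones = np.ones(p)
w = np.linalg.solve(S, ones)
w /= w @ ones                              # weights sum to one
print(w)
```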
The interaction between mathematicians, statisticians and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4th-6th April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural finance; carbon and green finance; credit risk methods and models; dynamic optimization in finance; financial econometrics; forecasting of dynamical actuarial and financial phenomena; fund performance evaluation; insurance portfolio risk analysis; interest rate models; longevity risk; machine learning and soft-computing in finance; management in insurance business; models and methods for financial time series analysis; models for financial derivatives; multivariate techniques for financial markets analysis; optimization in insurance; pricing; probability in actuarial sciences, insurance and finance; real world finance; risk management; solvency analysis; sovereign risk; static and dynamic portfolio selection and management; trading systems. This book is a valuable resource for academics, PhD students, practitioners, professionals and researchers, and is also of interest to other readers with quantitative background knowledge.
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research, for example to predict which companies are likely to fall into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance and discusses the determinants of accounting disclosures. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." (Wolfgang Marty, Senior Investment Strategist, AgaNola AG)
This book employs a computable general equilibrium (CGE) model - a widely used economic model which uses actual data to provide economic analysis and policy assessment - and applies it to economic data on Singapore's tourism industry. The authors set out to demonstrate how a novice modeller can acquire the necessary skills and knowledge to successfully apply general equilibrium models to tourism studies. The chapters explain how to build a computable general equilibrium model for tourism, how to conduct simulation and, most importantly, how to analyse modelling results. This applied study acts as a modelling book at both introductory and intermediate levels, specifically targeting students and researchers who are interested in and wish to learn computable general equilibrium modelling. The authors offer insightful analysis of Singapore's tourism industry and provide both students and researchers with a guide on how to apply general equilibrium models to actual economic data and draw accurate conclusions.
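A full CGE model is far too large for a snippet, but its computational core, solving for market-clearing prices, can be sketched on a toy exchange economy. The sketch below assumes hypothetical Cobb-Douglas preferences and endowments and conveys only the flavour of the general-equilibrium logic, not the book's Singapore tourism model:

```python
# Toy sketch of the computational core of general-equilibrium modelling:
# solving for a market-clearing price. Two consumers with Cobb-Douglas
# preferences trade two goods; all endowments/preferences are hypothetical.
# A real CGE model, as in the book, adds sectors, taxes and actual data.
import numpy as np
from scipy.optimize import brentq

alpha = np.array([0.3, 0.7])          # Cobb-Douglas share spent on good 1
endow1 = np.array([1.0, 2.0])         # endowments of good 1
endow2 = np.array([2.0, 1.0])         # endowments of good 2

def excess_demand_good1(p1, p2=1.0):  # good 2 is the numeraire
    income = p1 * endow1 + p2 * endow2
    demand = alpha * income / p1      # Cobb-Douglas demand for good 1
    return demand.sum() - endow1.sum()

# By Walras's law, clearing good 1 also clears good 2
p1_star = brentq(excess_demand_good1, 1e-6, 1e6)
print(f"equilibrium relative price of good 1: {p1_star:.4f}")
```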
Herbert Scarf is a highly esteemed American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis on the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Economics and Game Theory.
Econometric Model Specification reviews and extends the author's papers on consistent model specification testing and semi-nonparametric modeling and inference. The book consists of two parts. The first part discusses consistent tests of functional form of regression and conditional distribution models, including a consistent test of the martingale difference hypothesis for time series regression errors. In the second part, semi-nonparametric modeling and inference for duration and auction models are considered, as well as a general theory of the consistency and asymptotic normality of semi-nonparametric sieve maximum likelihood estimators. The volume also contains addenda and appendices that provide detailed proofs and extensions of all the results. It is uniquely self-contained and is a useful source for students and researchers interested in model specification issues.
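For readers new to specification testing, the flavour of a functional-form test can be conveyed by the classic RESET procedure, which augments a regression with powers of its fitted values and F-tests their significance; a minimal hand-rolled sketch on synthetic data (a textbook test, not the book's own consistent tests):

```python
# Minimal sketch of a functional-form specification test in the RESET
# spirit: add powers of the fitted values to the regression and F-test
# them. Data are synthetic; the book's consistent tests are more general
# than this classic textbook procedure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + 0.8 * x**2 + rng.normal(size=n)   # truth is quadratic

X = sm.add_constant(x)
restricted = sm.OLS(y, X).fit()                        # misspecified: linear

fitted = restricted.fittedvalues
X_aug = np.column_stack([X, fitted**2, fitted**3])
unrestricted = sm.OLS(y, X_aug).fit()

f_stat, p_value, df_diff = unrestricted.compare_f_test(restricted)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")          # small p: reject linearity
```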
The objective of this book is the discussion and the practical illustration of techniques used in applied macroeconometrics. There are currently three competing approaches: the LSE (London School of Economics) approach, the VAR approach, and the intertemporal optimization/Real Business Cycle approach. This book discusses and illustrates the empirical research strategy of these three alternative approaches, pairing them with extensive discussions and replications of the relevant empirical work. Common benchmarks are used to evaluate the alternative approaches.
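Of the three approaches, the VAR approach is the quickest to illustrate; a minimal sketch fitting a VAR(1) to synthetic bivariate data:

```python
# Minimal sketch of the VAR approach: fit a vector autoregression to a
# synthetic bivariate series with statsmodels. The LSE and RBC approaches
# discussed in the book follow quite different empirical strategies.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
T = 400
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])                 # stable VAR(1) coefficient matrix
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

res = VAR(y).fit(maxlags=1)
print(res.coefs[0])                        # estimates should be near A
```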
This book explores the origin of the recent banking crisis and how to preclude future crises. Shedding new light on the recent worldwide banking debacle, The Banking Crisis Handbook presents possible remedies as to what should have been done before, during, and after the crisis. With contributions from well-known academics and professionals, the book contains exclusive, new research that will assist bank executives, risk management departments, and other financial professionals to attain a clear picture of the banking crisis and prevent future banking collapses. The first part of the book explains how the crisis originated: it discusses the role of subprime mortgages, shadow banks, ineffective risk management, poor financial regulation, and hedge funds in the collapse of financial systems. The second section examines how the crisis affected the global market as well as individual countries and regions, such as Asia and Greece. In the final part, the book explores short- and long-term solutions, including government intervention, financial regulation, efficient bank default risk approaches, and methods to evaluate credit risk. It also considers when government intervention in financial markets can be ethically justified.
Economic history is the most quantitative branch of history, reflecting the interests and profiting from the techniques and concepts of economics. This essay, first published in 1977, provides an extensive contribution to quantitative historiography by delivering a critical guide to the sources of the numerical data of the period 1700 to 1850. This title will be of interest to students of history, finance and economics.
How could finance benefit from AI? How can AI techniques provide an edge? Moving well beyond simply speeding up computation, this book tackles AI for finance from a range of perspectives, including those of business, technology, research, and students. Covering topics such as algorithms, big data, and machine learning, it answers these and many other questions.
Economic Models for Industrial Organization focuses on the specification and estimation of econometric models for research in industrial organization. In recent decades, empirical work in industrial organization has moved towards dynamic and equilibrium models, involving econometric methods with features distinct from those used in other areas of applied economics. These lecture notes, aimed at a first- or second-year PhD course, motivate and explain these econometric methods, starting from simple models and building to models with the complexity observed in typical research papers. The covered topics include discrete-choice demand analysis, models of dynamic behavior and dynamic games, multiple equilibria in entry games and partial identification, and auction models.
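The discrete-choice demand analysis that opens the notes has a well-known linear special case: with logit demand, Berry's inversion makes ln(s_j) - ln(s_0) linear in product characteristics. A minimal sketch on synthetic shares, assuming exogenous prices (real IO applications instrument them):

```python
# Minimal sketch of discrete-choice demand estimation in the Berry-logit
# form: ln(s_j) - ln(s_0) = const + beta*x_j - alpha*p_j. Shares are
# synthetic and prices are treated as exogenous for simplicity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
J = 200                                     # products in one market
x = rng.normal(size=J)                      # product characteristic
p = 1.0 + rng.uniform(size=J)               # price (assumed exogenous here)

delta = 1.0 + 0.5 * x - 1.2 * p + 0.1 * rng.normal(size=J)   # mean utility
s_inside = np.exp(delta) / (1.0 + np.exp(delta).sum())        # logit shares
s0 = 1.0 - s_inside.sum()                                     # outside good

y = np.log(s_inside) - np.log(s0)           # Berry inversion: equals delta
X = sm.add_constant(np.column_stack([x, p]))
print(sm.OLS(y, X).fit().params)            # near [1.0, 0.5, -1.2]
```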
There are many problems regarding poverty, inequality and growth in the developing countries of Asia and Africa. Policy makers at the national level, and at international institutions such as the United Nations, World Bank and International Monetary Fund, have implemented various policies to reduce poverty and inequality. This book provides empirical observations on Asian countries and Africa. Each chapter offers theoretical and empirical analysis of regional case studies, with an emphasis on policy implications. The book will be of use to those who wish to assess and improve policies in developing countries, mitigating poverty and inequality and stimulating growth, by drawing on relevant empirical research and economic theory. Clearly, there have been numerous policy failures, and the book aims to provide a basis for improving policies and outcomes in light of relevant empirical observations.
'Overall, the book is highly technical, including full mathematical proofs of the results stated. Potential readers are post-graduate students or researchers in Quantitative Risk Management willing to have a manual with the state-of-the-art on portfolio diversification and risk aggregation with heavy tails, including the fundamental theorems as well as collateral (but most useful) results on majorization and copula theory.' (Quantitative Finance)
This book offers a unified approach to the study of crises, large fluctuations, dependence and contagion effects in economics and finance. It covers important topics in statistical modeling and estimation, combining the notions of copulas and heavy tails - two particularly valuable tools of today's research in economics, finance, econometrics and other fields - to provide a new way of thinking about such vital problems as diversification of risk and the propagation of crises through financial markets due to contagion phenomena, among others. The aim is to arm today's economists with a toolbox suited to analyzing multivariate data with many outliers and arbitrary dependence patterns. The methods and topics discussed include, in particular, majorization theory, heavy-tailed distributions and copula functions, all applied to study the robustness of economic, financial and statistical models and estimation methods to heavy tails and dependence.
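The two tools the book combines, copulas and heavy tails, can be put together in a short simulation: draw the dependence structure from a Gaussian copula, then impose heavy-tailed Student-t margins. A minimal sketch:

```python
# Minimal sketch combining the book's two key tools: a Gaussian copula
# for the dependence structure with heavy-tailed Student-t margins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n = 10_000
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])

# Step 1: Gaussian copula - correlated normals mapped to uniforms
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)

# Step 2: heavy-tailed margins - invert a Student-t CDF with 3 d.o.f.
x = stats.t.ppf(u, df=3)

print(np.corrcoef(x.T)[0, 1])       # dependence survives the transform
print(stats.kurtosis(x[:, 0]))      # large excess kurtosis: heavy tails
```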
The book is a collection of essays in honour of Clive Granger. The chapters are by some of the world's leading econometricians, all of whom have collaborated with or studied with (or both) Clive Granger. Central themes of Granger's work are reflected in the book with attention to tests for unit roots and cointegration, tests of misspecification, forecasting models and forecast evaluation, non-linear and non-parametric econometric techniques, and overall, a careful blend of practical empirical work and strong theory. The book shows the scope of Granger's research and the range of the profession that has been influenced by his work.
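Two of the Granger themes listed, unit roots and cointegration, have standard implementations in statsmodels; a minimal sketch on synthetic series:

```python
# Minimal sketch of two central Granger themes: an ADF unit-root test and
# an Engle-Granger cointegration test, applied to synthetic series.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(9)
T = 500
x = np.cumsum(rng.normal(size=T))           # random walk: has a unit root
y = 2.0 * x + rng.normal(size=T)            # cointegrated with x

adf_stat, adf_p = adfuller(x)[:2]
print(f"ADF p-value for x: {adf_p:.3f} (high: unit root not rejected)")

coint_stat, coint_p = coint(y, x)[:2]
print(f"Engle-Granger p-value: {coint_p:.4f} (low: cointegration found)")
```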
This book presents the methodology and applications of Data Envelopment Analysis (DEA) in measuring productivity, efficiency and effectiveness in financial services firms such as banks, bank branches, stock markets, pension funds, mutual funds, insurance firms and credit unions, as well as in areas such as risk tolerance and corporate failure prediction. Financial service DEA research includes banking; insurance businesses; hedge, pension and mutual funds; and credit unions. Significant business transactions among financial service organizations, such as bank mergers and acquisitions and the valuation of IPOs, have also been the focus of DEA research. The book surveys the range of DEA uses for financial services by presenting prior studies, examining the current capabilities reflected in the most recent research, and projecting future new uses of DEA in finance-related applications.
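DEA efficiency scores are obtained by solving a small linear program per decision-making unit; a minimal sketch of the input-oriented CCR model, assuming synthetic data for five hypothetical bank branches:

```python
# Minimal sketch of an input-oriented CCR DEA model solved as a linear
# program: min theta s.t. X @ lam <= theta * x_0, Y @ lam >= y_0, lam >= 0.
# Branch data are synthetic and purely illustrative.
import numpy as np
from scipy.optimize import linprog

# 5 hypothetical branches: 2 inputs (staff, costs), 1 output (loans)
X = np.array([[20.0, 30.0, 40.0, 20.0, 10.0],      # inputs, shape (2, n)
              [300., 200., 100., 200., 400.]])
Y = np.array([[1000., 1000., 1000., 800., 500.]])  # outputs, shape (1, n)

def ccr_efficiency(unit):
    m, n = X.shape            # inputs, units
    s = Y.shape[0]            # outputs
    c = np.r_[1.0, np.zeros(n)]                    # minimize theta
    # inputs:  X @ lam - theta * x_0 <= 0
    A_in = np.hstack([-X[:, [unit]], X])
    # outputs: -Y @ lam <= -y_0
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, unit]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                                 # efficiency in (0, 1]

print([round(ccr_efficiency(j), 3) for j in range(X.shape[1])])
```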
You may like...
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,286)
Linear and Non-Linear Financial… by Mehmet Kenan Terzioglu, Gordana Djurovic (Hardcover, R3,581)
Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
The Leading Indicators - A Short History… by Zachary Karabell (Paperback)
Operations and Supply Chain Management by James Evans, David Collier (Hardcover)