Welcome to Loot.co.za!
A provocative new analysis of immigration's long-term effects on a nation's economy and culture. Over the last two decades, as economists began using big datasets and modern computing power to reveal the sources of national prosperity, their statistical results kept pointing toward the power of culture to drive the wealth of nations. In The Culture Transplant, Garett Jones documents the cultural foundations of cross-country income differences, showing that immigrants import cultural attitudes from their homelands—toward saving, toward trust, and toward the role of government—that persist for decades, and likely for centuries, in their new national homes. Full assimilation in a generation or two, Jones reports, is a myth. And the cultural traits migrants bring to their new homes have enduring effects upon a nation's economic potential. Built upon mainstream, well-reviewed academic research that hasn't pierced the public consciousness, this book offers a compelling refutation of an unspoken consensus that a nation's economic and political institutions won't be changed by immigration. Jones refutes the common view that we can discuss migration policy without considering whether migration can, over a few generations, substantially transform the economic and political institutions of a nation. And since most of the world's technological innovations come from just a handful of nations, Jones concludes, the entire world has a stake in whether migration policy will help or hurt the quality of government and thus the quality of scientific breakthroughs in those rare innovation powerhouses.
Handbook of Empirical Economics and Finance explores the latest developments in the analysis and modeling of economic and financial data. Well-recognized econometric experts discuss the rapidly growing research in economics and finance and offer insight on the future direction of these fields. Focusing on micro models, the first group of chapters describes the statistical issues involved in the analysis of econometric models with cross-sectional data often arising in microeconomics. The book then illustrates time series models that are extensively used in empirical macroeconomics and finance. The last set of chapters explores the types of panel data and spatial models that are becoming increasingly significant in analyzing complex economic behavior and policy evaluations. This handbook brings together both background material and new methodological and applied results that are extremely important to the current and future frontiers in empirical economics and finance. It emphasizes inferential issues that transpire in the analysis of cross-sectional, time series, and panel data-based empirical models in economics, finance, and related disciplines.
This book brings together domains in financial asset pricing and valuation, financial investment theory, econometric modeling, and the empirical analysis of financial data by applying appropriate econometric techniques. These domains are highly intertwined and should be properly understood in order to correctly and effectively harness the power of data and methods for investment and financial decision-making. The book is targeted at advanced finance undergraduates and beginning professionals performing financial forecasts or empirical modeling, who will find it refreshing to see that forecasting is not simply running a least squares regression line across data points, and that there are many minefields and pitfalls to avoid, such as spurious results and incorrect interpretations.
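The spurious-regression pitfall the blurb alludes to is easy to demonstrate. A minimal sketch (not from the book; the data are simulated here): regressing one random walk on another, completely independent one, routinely yields a deceptively high R-squared.

```python
import numpy as np

# Two independent random walks -- neither has any real relationship to
# the other, yet a naive least-squares fit often looks "significant".
rng = np.random.default_rng(42)
n = 500
x = np.cumsum(rng.standard_normal(n))  # random walk 1
y = np.cumsum(rng.standard_normal(n))  # random walk 2, independent of x

# Naive OLS of y on x, with R^2 computed from the residuals.
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
r_squared = 1 - resid.var() / y.var()

print(f"R^2 = {r_squared:.3f}")  # frequently far from 0 despite independence
```

The point is not the particular value of R-squared but that it is unreliable: with non-stationary series, standard regression diagnostics no longer mean what they seem to.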
Pathwise Estimation and Inference for Diffusion Market Models discusses contemporary techniques for inferring, from options and bond prices, the market participants' aggregate view on important financial parameters such as implied volatility, discount rate, and future interest rate, as well as the uncertainty thereof. The focus is on pathwise inference methods that are applicable to a sole path of the observed prices and do not require the observation of an ensemble of such paths. This book is pitched at the level of senior undergraduate students undertaking research at honors year, and postgraduate candidates undertaking Master's or PhD degrees by research. From a research perspective, this book reaches out to academic researchers from backgrounds as diverse as mathematics and probability, econometrics and statistics, and computational mathematics and optimization, whose interests lie in the analysis and modelling of financial market data from a multi-disciplinary approach. Additionally, this book is aimed at financial market practitioners participating in capital-market-facing businesses who seek to keep abreast of, and draw inspiration from, novel approaches in market data analysis. The first two chapters of the book contain introductory material on stochastic analysis and the classical diffusion stock market models. The remaining chapters discuss more specialized stock and bond market models and special methods of pathwise inference for market parameters under different models. The final chapter describes applications of numerical methods of inference of bond market parameters to forecasting of the short rate. Nikolai Dokuchaev is an associate professor in Mathematics and Statistics at Curtin University. His research interests include mathematical and statistical finance, stochastic analysis, PDEs, control, and signal processing. Lin Yee Hin is a practitioner in the capital-market-facing industry.
His research interests include econometrics, non-parametric regression, and scientific computing.
The quantitative modeling of complex systems of interacting risks is a fairly recent development in the financial and insurance industries. Over the past decades, there has been tremendous innovation and development in the actuarial field. In addition to undertaking mortality and longevity risks in traditional life and annuity products, insurers have faced unprecedented financial risks since the introduction of equity-linked insurance in the 1960s. As the industry moves into the new territory of managing many intertwined financial and insurance risks, non-traditional problems and challenges arise, presenting great opportunities for technology development. Today's computational power and technology make it possible for the life insurance industry to develop highly sophisticated models, which were impossible just a decade ago. Nonetheless, as more industrial practices and regulations move towards dependence on stochastic models, the demand for computational power continues to grow. While the industry continues to rely heavily on hardware innovations, trying to make brute force methods faster and more palatable, we are approaching a crossroads about how to proceed. An Introduction to Computational Risk Management of Equity-Linked Insurance provides a resource for students and entry-level professionals, not only to understand the fundamentals of industrial modeling practice, but also to gain a glimpse of software methodologies for modeling and computational efficiency.
Features:
- Provides a comprehensive and self-contained introduction to quantitative risk management of equity-linked insurance, with exercises and programming samples
- Includes a collection of mathematical formulations of risk management problems, presenting opportunities and challenges to applied mathematicians
- Summarizes state-of-the-art computational techniques for risk management professionals
- Bridges the gap between the latest developments in the finance and actuarial literature and the practice of risk management for investment-combined life insurance
- Gives a comprehensive review of both Monte Carlo simulation methods and non-simulation numerical methods

Runhuan Feng is an Associate Professor of Mathematics and the Director of Actuarial Science at the University of Illinois at Urbana-Champaign. He is a Fellow of the Society of Actuaries and a Chartered Enterprise Risk Analyst. He is a Helen Corley Petit Professorial Scholar and the State Farm Companies Foundation Scholar in Actuarial Science. Runhuan received a Ph.D. degree in Actuarial Science from the University of Waterloo, Canada. Prior to joining Illinois, he held a tenure-track position at the University of Wisconsin-Milwaukee, where he was named a Research Fellow. Runhuan has received numerous grants and research contracts from the Actuarial Foundation and the Society of Actuaries. He has published a series of papers in top-tier actuarial and applied probability journals on stochastic analytic approaches in risk theory and quantitative risk management of equity-linked insurance. In recent years, he has dedicated his efforts to developing computational methods for managing market innovations in areas of investment-combined insurance and retirement planning.
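The Monte Carlo approach the blurb describes can be sketched in a few lines. This is a hypothetical illustration, not the book's own code or parameters: valuing a guaranteed minimum maturity benefit (GMMB), whose payoff is the shortfall of an equity-linked fund below a guarantee at maturity, under a risk-neutral geometric Brownian motion.

```python
import numpy as np

# Hypothetical parameters, purely illustrative.
S0, sigma, T, G, r = 100.0, 0.2, 10.0, 100.0, 0.03
n_paths = 100_000

rng = np.random.default_rng(0)
Z = rng.standard_normal(n_paths)
# Terminal fund value under risk-neutral GBM; a single step suffices
# for a maturity-only guarantee.
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

payoff = np.maximum(G - S_T, 0.0)        # shortfall below the guarantee
price = np.exp(-r * T) * payoff.mean()   # discounted expected shortfall
print(f"Monte Carlo GMMB value: {price:.2f}")
```

Brute-force simulation like this is exactly what the book's non-simulation numerical methods aim to accelerate or replace.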
Sufficient dimension reduction is a rapidly developing research field with wide applications in regression diagnostics, data visualization, machine learning, genomics, image processing, pattern recognition, and medicine, all fields that produce large datasets with a large number of variables. Sufficient Dimension Reduction: Methods and Applications with R introduces the basic theories and the main methodologies, provides practical and easy-to-use algorithms and computer code to implement these methodologies, and surveys the recent advances at the frontiers of this field.

Features:
- Provides comprehensive coverage of this emerging research field.
- Synthesizes a wide variety of dimension reduction methods under a few unifying principles, such as projection in Hilbert spaces, kernel mapping, and von Mises expansion.
- Reflects the most recent advances, such as nonlinear sufficient dimension reduction, dimension folding for tensorial data, and sufficient dimension reduction for functional data.
- Includes a set of computer codes written in R that are easily implemented by readers.
- Uses real data sets available online to illustrate the usage and power of the described methods.

Sufficient dimension reduction has undergone momentous development in recent years, partly due to the increased demand for techniques to process high-dimensional data, a hallmark of our age of Big Data. This book will serve as the perfect entry into the field for beginning researchers and a handy reference for advanced ones. The author, Bing Li, obtained his Ph.D. from the University of Chicago. He is currently a Professor of Statistics at the Pennsylvania State University. His research interests cover sufficient dimension reduction, statistical graphical models, functional data analysis, machine learning, estimating equations and quasilikelihood, and robust statistics. He is a fellow of the Institute of Mathematical Statistics and the American Statistical Association.
He is an Associate Editor for The Annals of Statistics and the Journal of the American Statistical Association.
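As a taste of the methodology, here is a minimal sketch of sliced inverse regression (SIR), one of the classical sufficient dimension reduction methods. The data, sample size, and slice count are illustrative assumptions, and the book's own examples use R rather than Python.

```python
import numpy as np

# Simulated data: the response depends on X only through one direction.
rng = np.random.default_rng(1)
n, p = 5000, 6
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0] = 1.0                                  # true direction
y = X @ beta + 0.1 * rng.standard_normal(n)

# 1. Standardize X so that its sample covariance is the identity.
Xc = X - X.mean(axis=0)
L = np.linalg.cholesky(np.cov(X.T))
Z = Xc @ np.linalg.inv(L).T

# 2. Slice observations by y and form the weighted matrix of slice means.
n_slices = 10
order = np.argsort(y)
M = np.zeros((p, p))
for s in np.array_split(order, n_slices):
    m = Z[s].mean(axis=0)
    M += (len(s) / n) * np.outer(m, m)

# 3. The top eigenvector of M estimates the central subspace (in the
#    standardized scale); map it back to the original X scale.
eta = np.linalg.eigh(M)[1][:, -1]
direction = np.linalg.inv(L).T @ eta
direction /= np.linalg.norm(direction)

print(abs(direction @ beta))  # close to 1: the true direction is recovered
```

SIR needs no model for the link function between y and the reduced predictors, which is exactly what makes sufficient dimension reduction useful for regression diagnostics and visualization.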
A fair question to ask of an advocate of subjective Bayesianism (which the author is) is "how would you model uncertainty?" In this book, the author writes about how he has done it using real problems from the past, and offers additional comments about the context in which he was working.
Proven Methods for Big Data Analysis

As big data has become standard in many application areas, challenges have arisen related to methodology and software development, including how to discover meaningful patterns in the vast amounts of data. Addressing these problems, Applied Biclustering Methods for Big and High-Dimensional Data Using R shows how to apply biclustering methods to find local patterns in a big data matrix. The book presents an overview of data analysis using biclustering methods from a practical point of view. Real case studies in drug discovery, genetics, marketing research, biology, toxicity, and sports illustrate the use of several biclustering methods. References to technical details of the methods are provided for readers who wish to investigate the full theoretical background. All the methods are accompanied by R examples that show how to conduct the analyses. The examples, software, and other materials are available on a supplementary website.
This book explores how econometric modelling can be used to provide valuable insight into international housing markets. Initially describing the role of econometric modelling in real estate market research and how it has developed in recent years, the book goes on to compare and contrast the impact of various macroeconomic factors on developed and developing housing markets. Explaining the similarities and differences in the impact of financial crises on housing markets around the world, the author's econometric analysis of housing markets across the world provides a broad and nuanced perspective on the impact of both international financial markets and the local macroeconomy on housing markets. With discussion of countries such as China, Germany, the UK, the US, and South Africa, the lessons learned will be of interest to scholars of real estate economics around the world.
Estimate and Interpret Results from Ordered Regression Models

Ordered Regression Models: Parallel, Partial, and Non-Parallel Alternatives presents regression models for ordinal outcomes, which are variables that have ordered categories but unknown spacing between the categories. The book provides comprehensive coverage of the three major classes of ordered regression models (cumulative, stage, and adjacent) as well as variations based on the application of the parallel regression assumption. The authors first introduce the three "parallel" ordered regression models before covering unconstrained partial, constrained partial, and non-parallel models. They then review existing tests for the parallel regression assumption, propose new variations of several tests, and discuss important practical concerns related to tests of the parallel regression assumption. The book also describes extensions of ordered regression models, including heterogeneous choice models, multilevel ordered models, and the Bayesian approach to ordered regression models. Some chapters include brief examples using Stata and R. This book offers a conceptual framework for understanding ordered regression models based on the probability of interest and the application of the parallel regression assumption. It demonstrates the usefulness of numerous modeling alternatives, showing you how to select the most appropriate model given the type of ordinal outcome and the restrictiveness of the parallel assumption for each variable.

Web Resource: More detailed examples are available on a supplementary website. The site also contains JAGS, R, and Stata code to estimate the models, along with syntax to reproduce the results.
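The parallel regression assumption described above can be made concrete with a small sketch. The function below (hypothetical names and coefficient values, not from the book) computes category probabilities for a cumulative logit model in which a single coefficient vector is shared across all cumulative splits:

```python
import numpy as np

def ordered_logit_probs(x, beta, cutpoints):
    """Category probabilities for a cumulative (proportional-odds) logit model.

    Under the parallel regression assumption, one coefficient vector beta
    is shared across all J-1 cumulative splits; only the cutpoints differ:
        P(Y <= j | x) = logistic(cutpoint_j - x @ beta).
    """
    eta = np.asarray(x) @ np.asarray(beta)
    cum = 1.0 / (1.0 + np.exp(-(np.asarray(cutpoints) - eta)))  # P(Y <= j)
    cum = np.concatenate([cum, [1.0]])                          # P(Y <= J) = 1
    return np.diff(cum, prepend=0.0)                            # P(Y = j)

# Hypothetical 4-category outcome with two predictors.
probs = ordered_logit_probs([0.5, -1.0], [0.8, 0.3], [-1.0, 0.0, 1.0])
print(probs, probs.sum())  # four probabilities summing to 1
```

The partial and non-parallel alternatives the book covers relax exactly this structure, letting some or all elements of beta vary across the cutpoints.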
Economic evaluation has become an essential component of clinical trial design to show that new treatments and technologies offer value to payers in various healthcare systems. Although many books exist that address the theoretical or practical aspects of cost-effectiveness analysis, this book differentiates itself from the competition by detailing how to apply health economic evaluation techniques in a clinical trial context, from both academic and pharmaceutical/commercial perspectives. It also includes a special chapter on clinical trials in cancer. Design & Analysis of Clinical Trials for Economic Evaluation & Reimbursement is not just about performing cost-effectiveness analyses. It also emphasizes the strategic importance of economic evaluation and offers guidance and advice on the complex factors at play before, during, and after an economic evaluation. Filled with detailed examples, the book bridges the gap between applications of economic evaluation in industry (mainly pharmaceutical) and what students may learn in university courses. It provides readers with access to SAS and Stata code. In addition, Windows-based software for sample size and value of information analysis is available free of charge, making it a valuable resource for students considering a career in this field or for those who simply wish to know more about applying economic evaluation techniques. The book includes coverage of trial design, case report form design, quality of life measures, sample sizes, submissions to regulatory authorities for reimbursement, Markov models, cohort models, and decision trees. Examples and case studies are provided at the end of each chapter. Presenting first-hand insights into how economic evaluations are performed from a drug development perspective, the book supplies readers with the foundation required to succeed in an environment where clinical trials and cost-effectiveness of new treatments are central.
It also includes thought-provoking exercises for use in classroom and seminar discussions.
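The core quantity in the cost-effectiveness analyses described above is the incremental cost-effectiveness ratio (ICER): extra cost per extra unit of health effect. A minimal sketch with hypothetical trial numbers:

```python
# The ICER compares a new treatment with a comparator as extra cost per
# extra unit of effect (e.g. per QALY gained). Numbers are hypothetical.
def icer(cost_new, cost_old, effect_new, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

# New treatment costs 12,000 vs 8,000 and yields 6.2 vs 5.7 QALYs.
ratio = icer(12_000, 8_000, 6.2, 5.7)
print(ratio)  # 8000.0 per QALY gained
```

A payer then compares this ratio against a willingness-to-pay threshold to decide whether the new treatment offers value.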
Since the publication of the first edition over 30 years ago, the literature related to Pareto distributions has flourished to encompass computer-based inference methods. Pareto Distributions, Second Edition provides broad, up-to-date coverage of the Pareto model and its extensions. This edition expands several chapters to accommodate recent results and reflect the increased use of more computer-intensive inference procedures.

New to the Second Edition:
- New material on multivariate inequality
- Recent ways of handling the problems of inference for Pareto models and their generalizations and extensions
- New discussions of bivariate and multivariate income and survival models

This book continues to provide researchers with a useful resource for understanding the statistical aspects of Pareto and Pareto-like distributions. It covers income models and properties of Pareto distributions, measures of inequality for studying income distributions, inference procedures for Pareto distributions, and various multivariate Pareto distributions existing in the literature.
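As a flavor of the material, a short sketch (illustrative parameters, not from the book): sampling a classical Pareto(I) distribution by inverse CDF, together with two of its closed-form properties, the mean and the Gini coefficient used as an inequality measure.

```python
import numpy as np

# Classical Pareto(I) with scale x_m and shape alpha:
#   survival function S(x) = (x_m / x)**alpha for x >= x_m.
# Inverse-CDF sampling: x = x_m * (1 - u)**(-1/alpha), u ~ Uniform(0, 1).
x_m, alpha = 1.0, 2.5
rng = np.random.default_rng(7)
u = rng.uniform(size=200_000)
samples = x_m * (1.0 - u) ** (-1.0 / alpha)

# Two classical facts about this model (both require alpha > 1):
theoretical_mean = alpha * x_m / (alpha - 1.0)   # = 5/3 here
gini = 1.0 / (2.0 * alpha - 1.0)                 # = 0.25 here

print(samples.mean(), theoretical_mean, gini)
```

The simple closed-form Gini coefficient is one reason the Pareto model remains a workhorse for studying income inequality.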
This book brings together the latest research in the areas of market microstructure and high-frequency finance, along with new econometric methods to address critical practical issues in these areas of research. Thirteen chapters, each of which makes a valuable and significant contribution to the existing literature, have been brought together, spanning a wide range of topics including information asymmetry and the information content in limit order books, high-frequency return distribution models, multivariate volatility forecasting, analysis of individual trading behaviour, the analysis of liquidity, price discovery across markets, market microstructure models, and the information content of order flow. These issues are central both to the rapidly expanding practice of high-frequency trading in financial markets and to the further development of the academic literature in this area. The volume will therefore be of immediate interest to practitioners and academics. This book was originally published as a special issue of the European Journal of Finance.
Tackling the cybersecurity challenge is a matter of survival for society at large. Cyber attacks are rapidly increasing in sophistication and magnitude, and in their destructive potential. New threats emerge regularly, the last few years having seen a ransomware boom and distributed denial-of-service attacks leveraging the Internet of Things. For organisations, the use of cybersecurity risk management is essential in order to manage these threats. Yet current frameworks have drawbacks which can lead to the suboptimal allocation of cybersecurity resources. Cyber insurance has been touted as part of the solution, based on the idea that insurers can incentivize companies to improve their cybersecurity by offering premium discounts, but cyber insurance levels remain limited. This is because companies have difficulty determining which cyber insurance products to purchase, and insurance companies struggle to accurately assess cyber risk and thus develop cyber insurance products. To deal with these challenges, this volume presents new models for cybersecurity risk management, partly based on the use of cyber insurance. It contains a set of mathematical models for cybersecurity risk management, including (i) a model to assist companies in determining their optimal budget allocation between security products and cyber insurance and (ii) a model to assist insurers in designing cyber insurance products. The models use adversarial risk analysis to account for the behaviour of threat actors (as well as the behaviour of companies and insurers). To inform these models, the authors draw on psychological and behavioural economics studies of decision-making by individuals regarding cybersecurity and cyber insurance, as well as on organizational decision-making studies involving cybersecurity and cyber insurance.
Its theoretical and methodological findings will appeal to researchers across a wide range of cybersecurity-related disciplines including risk and decision analysis, analytics, technology management, actuarial sciences, behavioural sciences, and economics. The practical findings will help cybersecurity professionals and insurers enhance cybersecurity and cyber insurance, thus benefiting society as a whole. This book grew out of a two-year European Union-funded project under Horizon 2020, called CYBECO (Supporting Cyber Insurance from a Behavioral Choice Perspective).
The book's comprehensive coverage of the application of econometric methods to the empirical analysis of economic issues is impressive. It uncovers the missing link between textbooks on economic theory and econometrics, and highlights the powerful connection between economic theory and empirical analysis through examples of rigorous experimental design. The use of data sets for estimation derived with the Monte Carlo method helps facilitate the understanding of the role of hypothesis testing applied to economic models. Topics covered in the book are: consumer behavior, producer behavior, market equilibrium, macroeconomic models, qualitative-response models, panel data analysis, and time-series analysis. Key econometric models are introduced, specified, estimated, and evaluated. The treatment of methods of estimation in econometrics and the discipline of hypothesis testing makes it a must-have for graduate students of economics and econometrics and aids their understanding of how to estimate economic models and evaluate the results in terms of policy implications.
Change of Time and Change of Measure provides a comprehensive account of two topics that are of particular significance in both theoretical and applied stochastics: random change of time and change of probability law. Random change of time is key to understanding the nature of various stochastic processes, and gives rise to interesting mathematical results and insights of importance for the modeling and interpretation of empirically observed dynamic processes. Change of probability law is a technique for solving central questions in mathematical finance, and also has a considerable role in insurance mathematics, large deviation theory, and other fields. The book comprehensively collects and integrates results from a number of scattered sources in the literature and discusses the importance of the results relative to the existing literature, particularly with regard to mathematical finance. It is invaluable as a textbook for graduate-level courses and as a handy reference for researchers and practitioners in financial mathematics and econometrics.
The contributions in this volume, by leading economists from major universities in Europe and the USA, cover research at the front line of econometric analysis and labour market applications. The volume includes several papers on equilibrium search models (a relatively new field) and job matching, seen from both a theoretical and an applied point of view. Methods for, and empirical analyses of, unemployment durations are also discussed. Finally, a large group of papers examine the structure and the dynamics of the labour market in a number of countries using panel data. This group includes papers on data quality and policy evaluation. The high unemployment in most countries makes it necessary to come up with studies and methods for analysing the impact of different elements of economic policies. This volume is intended to contribute to further development in the use of panel data in economic analyses.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. The volume is part of a larger project that includes another edited volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
Maurice Potron (1872-1942), a French Jesuit mathematician, constructed and analyzed a highly original, but virtually unknown economic model. This book presents translated versions of all his economic writings, preceded by a long introduction which sketches his life and environment based on extensive archival research and family documents. Potron had no education in economics and almost no contact with the economists of his time. His primary source of inspiration was the social doctrine of the Church, which had been updated at the end of the nineteenth century. Faced with the 'economic evils' of his time, he reacted by utilizing his talents as a mathematician and an engineer to invent and formalize a general disaggregated model in which production, employment, prices and wages are the main unknowns. He introduced four basic principles or normative conditions ('sufficient production', the 'right to rest', 'justice in exchange', and the 'right to live') to define satisfactory regimes of production and labour on the one hand, and of prices and wages on the other. He studied the conditions for the existence of these regimes, both on the quantity side and the value side, and he explored the way to implement them. This book makes it clear that Potron was the first author to develop a full input-output model, to use the Perron-Frobenius theorem in economics, to state a duality result, and to formulate the Hawkins-Simon condition. These are all techniques which now belong to the standard toolkit of economists. This book will be of interest to Economics postgraduate students and researchers, and will be essential reading for courses dealing with the history of mathematical economics in general, and linear production theory in particular.
This book offers the representative macroeconometric models and their applications for the Japanese economy in different development stages throughout the postwar years up to the present. It presents a summary of three types of macroeconometric models and analyses: Social accounting analyses of national income and related indices following the tradition of C Clark, S Kuznets, R Stone and World Bank Development Reports; Inter-industrial and inter-regional analyses of the Japanese economy a la W Leontief and the CGE (computable general equilibrium) type of applications to Comprehensive Development Plans; Macroeconometric model building for the Japanese economy and its applications with a survey of various models in Japan, including the historic Osaka University ISER (Institute of Social and Economic Research) model and present day Government models. As many Asian economies are going through the stages of development that Japan has experienced over the past few decades, this book will be extremely relevant to them and other developing countries as a reference for years to come.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. The volume is part of a larger project that includes another edited volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
This book contains a set of notes prepared by Ragnar Frisch for a lecture series that he delivered at Yale University in 1930. The lecture notes provide not only a valuable source document for the history of econometrics, but also a more systematic introduction to some of Frisch's key methodological ideas than his other works so far published in various media for the econometrics community. In particular, these notes contain a number of prescient ideas precursory to some of the most important notions developed in econometrics during the 1970s and 1980s. More remarkably, Frisch demonstrated a deep understanding of what econometric or statistical analysis could achieve in situations where correct theoretical models were not known. This volume has been rigorously edited and comes with an introductory essay by Olav Bjerkholt and Duo Qin placing the notes in their historical context.
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website http://iwu.edu/bleekley. The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics as well as make inferences from samples, especially about relationships.
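The simple regression described above reduces to two textbook formulas: slope = S_xy / S_xx and intercept = ybar - slope * xbar. A minimal sketch on a hypothetical dataset chosen so the fit is exact:

```python
# Simple (two-variable) least-squares regression from the textbook formulas.
# The tiny dataset is hypothetical and lies exactly on y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))  # S_xy
sxx = sum((x - xbar) ** 2 for x in xs)                      # S_xx

slope = sxy / sxx            # = 2.0
intercept = ybar - slope * xbar  # = 1.0
print(slope, intercept)      # 2.0 1.0
```

The same arithmetic is what a spreadsheet package performs behind its trendline feature, which is why the book can rely on spreadsheets for all calculations.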
Using unique and cutting-edge research, Schofield, a prominent author in the US for a number of years, explores the growth area of positive political economy within economics and politics. The first book to explain the spatial model of voting from mathematical, economic, and game-theoretic perspectives, it is essential reading for all those studying positive political economy.
This impressive collection from some of today's leading distributional analysts provides an overview of a wide range of economic, statistical, and sociological relationships that have been opened up for scientific study by the work of two turn-of-the-20th-century economists: C. Gini and M. O. Lorenz. The authors include such figures as Barry Arnold and Frank Cowell, and the resulting book deserves its place on the bookshelf of serious mathematical economists everywhere.