Maurice Potron (1872-1942), a French Jesuit mathematician, constructed and analyzed a highly original, but virtually unknown economic model. This book presents translated versions of all his economic writings, preceded by a long introduction which sketches his life and environment based on extensive archival research and family documents. Potron had no education in economics and almost no contact with the economists of his time. His primary source of inspiration was the social doctrine of the Church, which had been updated at the end of the nineteenth century. Faced with the 'economic evils' of his time, he reacted by utilizing his talents as a mathematician and an engineer to invent and formalize a general disaggregated model in which production, employment, prices and wages are the main unknowns. He introduced four basic principles or normative conditions ('sufficient production', the 'right to rest', 'justice in exchange', and the 'right to live') to define satisfactory regimes of production and labour on the one hand, and of prices and wages on the other. He studied the conditions for the existence of these regimes, both on the quantity side and the value side, and he explored the way to implement them. This book makes it clear that Potron was the first author to develop a full input-output model, to use the Perron-Frobenius theorem in economics, to state a duality result, and to formulate the Hawkins-Simon condition. These are all techniques which now belong to the standard toolkit of economists. This book will be of interest to Economics postgraduate students and researchers, and will be essential reading for courses dealing with the history of mathematical economics in general, and linear production theory in particular.
Over the last thirty years there has been extensive use of continuous time econometric methods in macroeconomic modelling. This monograph presents the first continuous time macroeconometric model of the United Kingdom incorporating stochastic trends. Its development represents a major step forward in continuous time macroeconomic modelling. The book describes the new model in detail and, like earlier models, it is designed in such a way as to permit a rigorous mathematical analysis of its steady-state and stability properties, thus providing a valuable check on the capacity of the model to generate plausible long-run behaviour. The model is estimated using newly developed exact Gaussian estimation methods for continuous time econometric models incorporating unobservable stochastic trends. The book also includes discussion of the application of the model to dynamic analysis and forecasting.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. It is part of a larger project that includes another edited volume (Spatial Economics Volume I: Theory), which collects original papers addressing Spatial Economics from a theoretical perspective.
The two-volume book studies the economic and industrial development of Japan and China in modern times and draws distinctions between the different paths of industrialization and economic modernization taken in the two countries, based on statistical materials, quantitative analysis, and multivariate statistical analysis. The first volume analyses the relationship between technological innovation and economic development in Japan before World War II and sheds light on technological innovation in the Japanese context, with particular emphasis on the importance of the patent system. The second volume studies the basic conditions and overall course of industrial development, chiefly during the period of the Republic of China (1912-1949), taking a comparative perspective and bringing the case of modern Japan into the discussion. The book will appeal to academics and general readers interested in economic development and the modern economic history of East Asia, development economics, and industrial and technological history.
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website http://iwu.edu/bleekley. The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics, as well as make inferences from samples, especially about relationships.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. It is part of a larger project that includes another edited volume (Spatial Economics Volume II: Applications), which collects original papers addressing Spatial Economics from an applied perspective.
This book brings together the latest research in the areas of market microstructure and high-frequency finance along with new econometric methods to address critical practical issues in these areas of research. Thirteen chapters, each of which makes a valuable and significant contribution to the existing literature, have been brought together, spanning a wide range of topics including information asymmetry and the information content in limit order books, high-frequency return distribution models, multivariate volatility forecasting, analysis of individual trading behaviour, the analysis of liquidity, price discovery across markets, market microstructure models, and the information content of order flow. These issues are central both to the rapidly expanding practice of high-frequency trading in financial markets and to the further development of the academic literature in this area. The volume will therefore be of immediate interest to practitioners and academics. This book was originally published as a special issue of the European Journal of Finance.
Meghnad Desai's work presents a significant challenge to economics as currently practised. Poverty, Famine and Economic Development brings together essays which reflect his long-standing interest in economic development. Issues discussed include econometric testing of the disguised unemployment hypothesis, theoretical and applied approaches to famine, poverty in rich as well as poor countries, poverty in Latin America and state involvement in economic development. The volume also includes a discussion of the essay by Lenin which was the basis of the 'New Economic Policy', the first attempt at Market Socialism in the Soviet Union. The volume also includes a substantial autobiographical preface, in which Lord Desai explains how he became an economist and the influences behind the development of his thought, as well as a specific introduction explaining how he came to produce the papers included in this volume.
Pathwise Estimation and Inference for Diffusion Market Models discusses contemporary techniques for inferring, from options and bond prices, the market participants' aggregate view on important financial parameters such as implied volatility, discount rate, and future interest rate, as well as the uncertainty thereof. The focus is on pathwise inference methods that are applicable to a sole path of the observed prices and do not require the observation of an ensemble of such paths. The book is pitched at the level of senior undergraduate students undertaking research at honours year, and postgraduate candidates undertaking Master's or PhD degrees by research. From a research perspective, it reaches out to academic researchers from backgrounds as diverse as mathematics and probability, econometrics and statistics, and computational mathematics and optimization, whose interests lie in the analysis and modelling of financial market data from a multi-disciplinary approach. Additionally, the book is aimed at financial market practitioners in capital-market-facing businesses who seek to keep abreast of, and draw inspiration from, novel approaches to market data analysis. The first two chapters of the book contain introductory material on stochastic analysis and the classical diffusion stock market models. The remaining chapters discuss more specialized stock and bond market models and special methods of pathwise inference for market parameters under different models. The final chapter describes applications of numerical methods of inference of bond market parameters to the forecasting of short rates. Nikolai Dokuchaev is an associate professor in Mathematics and Statistics at Curtin University. His research interests include mathematical and statistical finance, stochastic analysis, PDEs, control, and signal processing. Lin Yee Hin is a practitioner in the capital-market-facing industry.
His research interests include econometrics, non-parametric regression, and scientific computing.
Proven Methods for Big Data Analysis As big data has become standard in many application areas, challenges have arisen related to methodology and software development, including how to discover meaningful patterns in the vast amounts of data. Addressing these problems, Applied Biclustering Methods for Big and High-Dimensional Data Using R shows how to apply biclustering methods to find local patterns in a big data matrix. The book presents an overview of data analysis using biclustering methods from a practical point of view. Real case studies in drug discovery, genetics, marketing research, biology, toxicity, and sports illustrate the use of several biclustering methods. References to technical details of the methods are provided for readers who wish to investigate the full theoretical background. All the methods are accompanied by R examples that show how to conduct the analyses. The examples, software, and other materials are available on a supplementary website.
Estimate and Interpret Results from Ordered Regression Models Ordered Regression Models: Parallel, Partial, and Non-Parallel Alternatives presents regression models for ordinal outcomes, which are variables that have ordered categories but unknown spacing between the categories. The book provides comprehensive coverage of the three major classes of ordered regression models (cumulative, stage, and adjacent) as well as variations based on the application of the parallel regression assumption. The authors first introduce the three "parallel" ordered regression models before covering unconstrained partial, constrained partial, and nonparallel models. They then review existing tests for the parallel regression assumption, propose new variations of several tests, and discuss important practical concerns related to tests of the parallel regression assumption. The book also describes extensions of ordered regression models, including heterogeneous choice models, multilevel ordered models, and the Bayesian approach to ordered regression models. Some chapters include brief examples using Stata and R. This book offers a conceptual framework for understanding ordered regression models based on the probability of interest and the application of the parallel regression assumption. It demonstrates the usefulness of numerous modeling alternatives, showing you how to select the most appropriate model given the type of ordinal outcome and the restrictiveness of the parallel assumption for each variable. Web Resource: More detailed examples are available on a supplementary website. The site also contains JAGS, R, and Stata code to estimate the models, along with syntax to reproduce the results.
Drawing on unique and cutting-edge research, Schofield, a prominent author in the US for a number of years, explores the growth area of positive political economy within economics and politics. As the first book to explain the spatial model of voting from mathematical, economic, and game-theoretic perspectives, it is essential reading for all those studying positive political economy.
This impressive collection from some of today's leading distributional analysts provides an overview of a wide range of economic, statistical, and sociological relationships that have been opened up for scientific study by the work of two turn-of-the-20th-century economists: C. Gini and M. O. Lorenz. The authors include such figures as Barry Arnold and Frank Cowell, and the resulting book deserves its place on the bookshelf of serious mathematical economists everywhere.
Extreme Value Modeling and Risk Analysis: Methods and Applications presents a broad overview of statistical modeling of extreme events along with the most recent methodologies and various applications. The book brings together background material and advanced topics, eliminating the need to sort through the massive amount of literature on the subject. After reviewing univariate extreme value analysis and multivariate extremes, the book explains univariate extreme value mixture modeling, threshold selection in extreme value analysis, and threshold modeling of non-stationary extremes. It presents new results for block-maxima of vine copulas, develops time series of extremes with applications from climatology, describes max-autoregressive and moving maxima models for extremes, and discusses spatial extremes and max-stable processes. The book then covers simulation and conditional simulation of max-stable processes; inference methodologies, such as composite likelihood, Bayesian inference, and approximate Bayesian computation; and inferences about extreme quantiles and extreme dependence. It also explores novel applications of extreme value modeling, including financial investments, insurance and financial risk management, weather and climate disasters, clinical trials, and sports statistics. Risk analyses related to extreme events require the combined expertise of statisticians and domain experts in climatology, hydrology, finance, insurance, sports, and other fields. This book connects statistical/mathematical research with critical decision and risk assessment/management applications to stimulate more collaboration between these statisticians and specialists.
Economic evaluation has become an essential component of clinical trial design to show that new treatments and technologies offer value to payers in various healthcare systems. Although many books exist that address the theoretical or practical aspects of cost-effectiveness analysis, this book differentiates itself from the competition by detailing how to apply health economic evaluation techniques in a clinical trial context, from both academic and pharmaceutical/commercial perspectives. It also includes a special chapter on clinical trials in cancer. Design & Analysis of Clinical Trials for Economic Evaluation & Reimbursement is not just about performing cost-effectiveness analyses. It also emphasizes the strategic importance of economic evaluation and offers guidance and advice on the complex factors at play before, during, and after an economic evaluation. Filled with detailed examples, the book bridges the gap between applications of economic evaluation in industry (mainly pharmaceutical) and what students may learn in university courses. It provides readers with access to SAS and STATA code. In addition, Windows-based software for sample size and value of information analysis is available free of charge, making it a valuable resource for students considering a career in this field or for those who simply wish to know more about applying economic evaluation techniques. The book includes coverage of trial design, case report form design, quality of life measures, sample sizes, submissions to regulatory authorities for reimbursement, Markov models, cohort models, and decision trees. Examples and case studies are provided at the end of each chapter. Presenting first-hand insights into how economic evaluations are performed from a drug development perspective, the book supplies readers with the foundation required to succeed in an environment where clinical trials and cost-effectiveness of new treatments are central.
It also includes thought-provoking exercises for use in classroom and seminar discussions.
Since the publication of the first edition over 30 years ago, the literature related to Pareto distributions has flourished to encompass computer-based inference methods. Pareto Distributions, Second Edition provides broad, up-to-date coverage of the Pareto model and its extensions. This edition expands several chapters to accommodate recent results and reflect the increased use of more computer-intensive inference procedures.

New to the Second Edition:
- New material on multivariate inequality
- Recent ways of handling the problems of inference for Pareto models and their generalizations and extensions
- New discussions of bivariate and multivariate income and survival models

This book continues to provide researchers with a useful resource for understanding the statistical aspects of Pareto and Pareto-like distributions. It covers income models and properties of Pareto distributions, measures of inequality for studying income distributions, inference procedures for Pareto distributions, and various multivariate Pareto distributions existing in the literature.
This book contains the most complete set of estimates of Chinese national income and its components based on the system of national accounts. It points out some fundamental issues concerning the estimation of China's national income and is intended for students of China studies around the world.
This book explores how econometric modelling can be used to provide valuable insight into international housing markets. Initially describing the role of econometric modelling in real estate market research and how it has developed in recent years, the book goes on to compare and contrast the impact of various macroeconomic factors on developed and developing housing markets. Explaining the similarities and differences in the impact of financial crises on housing markets around the world, the author's econometric analysis of housing markets across the world provides a broad and nuanced perspective on the impact of both international financial markets and local macroeconomies on housing markets. With discussion of countries such as China, Germany, the UK, the US, and South Africa, the lessons learned will be of interest to scholars of real estate economics around the world.
Achille Nicolas Isnard (1749-1803), an engineer with a keen interest in political economy, is best known for demonstrating the concept of market equilibrium using a system of simultaneous equations. The breadth and depth of his work undoubtedly established him as one of the forerunners of modern mathematical economics, yet his seminal contributions to the study of economics remained largely unrecognized until the latter half of the twentieth century. This pioneering new book, the first in English, examines Isnard's life and illuminates his major contributions to political economy. It contains substantial extracts from a number of his publications, presented both in English translation and in the original French, so that Isnard can now finally achieve his place at the heart of discussion on the origins of mathematical economics. The diverse issues covered here will ensure that this book appeals not only to economists with an interest in the history of mathematical economics, but to anyone interested in the emergence of political economy and in wider social thought during the Enlightenment.
Many economic theories depend on the presence or absence of a unit root for their validity, and econometric and statistical theory undergo considerable changes when unit roots are present. Knowledge of unit roots has thus become so important as to necessitate an extensive, compact, and nontechnical book on the subject. Resting on this motivation, this book introduces the literature on unit roots in a comprehensive manner to both empirical and theoretical researchers in economics and other areas. By providing a clear, complete, and critical discussion of the unit root literature, In Choi covers a wide range of topics, including uniform confidence interval construction, unit root tests allowing structural breaks, mildly explosive processes, exuberance testing, fractionally integrated processes, seasonal unit roots, and panel unit root testing. Extensive, up to date, and readily accessible, this book is a comprehensive reference source on unit roots for both students and applied workers.
Tackling the cybersecurity challenge is a matter of survival for society at large. Cyber attacks are rapidly increasing in sophistication and magnitude, and in their destructive potential. New threats emerge regularly, the last few years having seen a ransomware boom and distributed denial-of-service attacks leveraging the Internet of Things. For organisations, cybersecurity risk management is essential in order to manage these threats. Yet current frameworks have drawbacks which can lead to the suboptimal allocation of cybersecurity resources. Cyber insurance has been touted as part of the solution, based on the idea that insurers can incentivize companies to improve their cybersecurity by offering premium discounts, but cyber insurance levels remain limited. This is because companies have difficulty determining which cyber insurance products to purchase, and insurance companies struggle to accurately assess cyber risk and thus develop cyber insurance products. To deal with these challenges, this volume presents new models for cybersecurity risk management, partly based on the use of cyber insurance. It contains a set of mathematical models for cybersecurity risk management, including (i) a model to assist companies in determining their optimal budget allocation between security products and cyber insurance and (ii) a model to assist insurers in designing cyber insurance products. The models use adversarial risk analysis to account for the behavior of threat actors (as well as the behavior of companies and insurers). To inform these models, we draw on psychological and behavioural economics studies of decision-making by individuals regarding cybersecurity and cyber insurance, as well as on organizational decision-making studies involving cybersecurity and cyber insurance.
Its theoretical and methodological findings will appeal to researchers across a wide range of cybersecurity-related disciplines, including risk and decision analysis, analytics, technology management, actuarial sciences, behavioural sciences, and economics. The practical findings will help cybersecurity professionals and insurers enhance cybersecurity and cyber insurance, thus benefiting society as a whole. This book grew out of a two-year European Union-funded project under Horizon 2020, called CYBECO (Supporting Cyber Insurance from a Behavioral Choice Perspective).
The goal of Portfolio Rebalancing is to provide mathematical and empirical analysis of the effects of portfolio rebalancing on portfolio returns and risks. The mathematical analysis answers the question of when and why fixed-weight portfolios might outperform buy-and-hold portfolios based on volatilities and returns. The empirical analysis, aided by mathematical insights, will examine the effects of portfolio rebalancing in capital markets for asset allocation portfolios and portfolios of stocks, bonds, and commodities.
The quantitative modeling of complex systems of interacting risks is a fairly recent development in the financial and insurance industries. Over the past decades, there has been tremendous innovation and development in the actuarial field. In addition to undertaking mortality and longevity risks in traditional life and annuity products, insurers have faced unprecedented financial risks since the introduction of equity-linked insurance in the 1960s. As the industry moves into the new territory of managing many intertwined financial and insurance risks, non-traditional problems and challenges arise, presenting great opportunities for technology development. Today's computational power and technology make it possible for the life insurance industry to develop highly sophisticated models, which were impossible just a decade ago. Nonetheless, as more industrial practices and regulations move towards dependence on stochastic models, the demand for computational power continues to grow. While the industry continues to rely heavily on hardware innovations, trying to make brute-force methods faster and more palatable, we are approaching a crossroads about how to proceed. An Introduction to Computational Risk Management of Equity-Linked Insurance provides a resource for students and entry-level professionals to understand the fundamentals of industrial modeling practice, and also gives a glimpse of software methodologies for modeling and computational efficiency.
Features:
- Provides a comprehensive and self-contained introduction to quantitative risk management of equity-linked insurance, with exercises and programming samples
- Includes a collection of mathematical formulations of risk management problems, presenting opportunities and challenges to applied mathematicians
- Summarizes state-of-the-art computational techniques for risk management professionals
- Bridges the gap between the latest developments in the finance and actuarial literature and the practice of risk management for investment-combined life insurance
- Gives a comprehensive review of both Monte Carlo simulation methods and non-simulation numerical methods

Runhuan Feng is an Associate Professor of Mathematics and the Director of Actuarial Science at the University of Illinois at Urbana-Champaign. He is a Fellow of the Society of Actuaries and a Chartered Enterprise Risk Analyst. He is a Helen Corley Petit Professorial Scholar and the State Farm Companies Foundation Scholar in Actuarial Science. Runhuan received a Ph.D. in Actuarial Science from the University of Waterloo, Canada. Prior to joining Illinois, he held a tenure-track position at the University of Wisconsin-Milwaukee, where he was named a Research Fellow. Runhuan has received numerous grants and research contracts from the Actuarial Foundation and the Society of Actuaries. He has published a series of papers in top-tier actuarial and applied probability journals on stochastic analytic approaches to risk theory and quantitative risk management of equity-linked insurance. In recent years, he has dedicated his efforts to developing computational methods for managing market innovations in the areas of investment-combined insurance and retirement planning.
A fair question to ask of an advocate of subjective Bayesianism (which the author is) is "how would you model uncertainty?" In this book, the author writes about how he has done it using real problems from the past, and offers additional comments about the context in which he was working.
Examining the crucial topic of race relations, this book explores the economic and social environments that play a significant role in determining economic outcomes and why racial disparities persist. With contributions from a range of international contributors, including Edward Wolff and Catherine Weinberger, the book compares how various racial groups fare and are affected in different ways by economic and social institutions. Themes covered in the book include:

This is an invaluable resource for researchers and academics across a number of disciplines, including political economy, ethnic and multicultural studies, Asian studies, and sociology.