This book explores how econometric modelling can be used to provide valuable insight into international housing markets. Initially describing the role of econometric modelling in real estate market research and how it has developed in recent years, the book goes on to compare and contrast the impact of various macroeconomic factors on developed and developing housing markets. Explaining the similarities and differences in the impact of financial crises on housing markets around the world, the author's econometric analysis of housing markets across the world provides a broad and nuanced perspective on the impact of both international financial markets and the local macroeconomy on housing markets. With discussion of countries such as China, Germany, the UK, the US and South Africa, the lessons learned will be of interest to scholars of real estate economics around the world.
Using unique and cutting-edge research, Schofield, a prominent author in the US for a number of years, explores the growth area of positive political economy within economics and politics. As the first book to explain the spatial model of voting from a mathematical, economic and game-theoretic perspective, it is essential reading for all those studying positive political economy.
The quantitative modeling of complex systems of interacting risks is a fairly recent development in the financial and insurance industries. Over the past decades, there has been tremendous innovation and development in the actuarial field. In addition to undertaking mortality and longevity risks in traditional life and annuity products, insurers have faced unprecedented financial risks since the introduction of equity-linked insurance in the 1960s. As the industry moves into the new territory of managing many intertwined financial and insurance risks, non-traditional problems and challenges arise, presenting great opportunities for technology development. Today's computational power and technology make it possible for the life insurance industry to develop highly sophisticated models, which were impossible just a decade ago. Nonetheless, as more industrial practices and regulations move towards dependence on stochastic models, the demand for computational power continues to grow. While the industry continues to rely heavily on hardware innovations, trying to make brute force methods faster and more palatable, we are approaching a crossroads about how to proceed. An Introduction to Computational Risk Management of Equity-Linked Insurance provides a resource for students and entry-level professionals to understand the fundamentals of industrial modeling practice, but also to give a glimpse of software methodologies for modeling and computational efficiency.
Features:
- Provides a comprehensive and self-contained introduction to quantitative risk management of equity-linked insurance, with exercises and programming samples
- Includes a collection of mathematical formulations of risk management problems, presenting opportunities and challenges to applied mathematicians
- Summarizes state-of-the-art computational techniques for risk management professionals
- Bridges the gap between the latest developments in the finance and actuarial literature and the practice of risk management for investment-combined life insurance
- Gives a comprehensive review of both Monte Carlo simulation methods and non-simulation numerical methods
Runhuan Feng is an Associate Professor of Mathematics and the Director of Actuarial Science at the University of Illinois at Urbana-Champaign. He is a Fellow of the Society of Actuaries and a Chartered Enterprise Risk Analyst. He is a Helen Corley Petit Professorial Scholar and the State Farm Companies Foundation Scholar in Actuarial Science. Runhuan received a Ph.D. in Actuarial Science from the University of Waterloo, Canada. Prior to joining Illinois, he held a tenure-track position at the University of Wisconsin-Milwaukee, where he was named a Research Fellow. Runhuan has received numerous grants and research contracts from the Actuarial Foundation and the Society of Actuaries. He has published a series of papers in top-tier actuarial and applied probability journals on stochastic analytic approaches in risk theory and quantitative risk management of equity-linked insurance. In recent years, he has dedicated his efforts to developing computational methods for managing market innovations in areas of investment-combined insurance and retirement planning.
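As a rough illustration of the Monte Carlo simulation methods the book reviews (a generic sketch, not the author's code; the contract design and all parameters are hypothetical), a guaranteed minimum maturity benefit can be valued by simulating the policyholder's fund under geometric Brownian motion and averaging the discounted shortfall below the guarantee:

```python
import math
import random

def gmmb_value(f0=100.0, guarantee=100.0, r=0.03, sigma=0.2,
               t=10.0, n_paths=20000, seed=42):
    """Monte Carlo value of a guaranteed minimum maturity benefit:
    the insurer pays max(G - F_T, 0) when the fund F_T ends below
    the guarantee G. The fund follows geometric Brownian motion."""
    random.seed(seed)
    disc = math.exp(-r * t)                 # risk-free discount factor
    drift = (r - 0.5 * sigma ** 2) * t      # log-return drift
    vol = sigma * math.sqrt(t)              # log-return volatility
    total = 0.0
    for _ in range(n_paths):
        f_t = f0 * math.exp(drift + vol * random.gauss(0.0, 1.0))
        total += max(guarantee - f_t, 0.0)
    return disc * total / n_paths

value = gmmb_value()
```

Because this payoff coincides with a European put, a closed-form benchmark exists for checking the simulation; in industrial practice the appeal of Monte Carlo is that it survives the path-dependent riders for which no such formula is available.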
A fair question to ask of an advocate of subjective Bayesianism (which the author is) is "how would you model uncertainty?" In this book, the author writes about how he has done it using real problems from the past, and offers additional comments about the context in which he was working.
This impressive collection from some of today's leading distributional analysts provides an overview of a wide range of economic, statistical and sociological relationships that have been opened up for scientific study by the work of two turn-of-the-twentieth-century economists: C. Gini and M. O. Lorenz. The authors include such figures as Barry Arnold and Frank Cowell, and the resulting book deserves its place on the bookshelf of serious mathematical economists everywhere.
Score your highest in econometrics? Easy. Econometrics can prove challenging for many students unfamiliar with the terms and concepts discussed in a typical econometrics course. "Econometrics For Dummies" eliminates that confusion with easy-to-understand explanations of important topics in the study of economics. It breaks down this complex subject and provides you with an easy-to-follow course supplement to further refine your understanding of how econometrics works and how it can be applied in real-world situations.
- An excellent resource for anyone taking a college- or graduate-level econometrics course
- Provides an easy-to-follow introduction to the techniques and applications of econometrics
- Helps you score high on exam day
If you're seeking a degree in economics and looking for a plain-English guide to this often-intimidating course, "Econometrics For Dummies" has you covered.
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website (http://iwu.edu/bleekley). The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics as well as make inferences from samples, especially about relationships.
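The estimation and confidence-interval material described above amounts to a few lines of spreadsheet-style arithmetic; here is a minimal sketch using the Python standard library and hypothetical weekly sales data (a large-sample z-interval with z = 1.96, rather than the t-interval a course would also cover):

```python
import math
import statistics

def mean_confidence_interval(data, z=1.96):
    """Approximate 95% confidence interval for a population mean,
    using the normal critical value (large-sample z-interval)."""
    n = len(data)
    xbar = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(n)   # standard error of the mean
    return xbar - z * se, xbar + z * se

# Hypothetical weekly sales figures
sales = [102, 98, 110, 95, 105, 99, 101, 104, 97, 108,
         100, 103, 96, 107, 99, 102, 105, 98, 101, 106]
lo, hi = mean_confidence_interval(sales)
```

The interval is interpreted in the usual frequentist way: the procedure, repeated over many samples, covers the true mean about 95% of the time.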
Achille Nicolas Isnard (1749-1803), an engineer with a keen interest in political economy, is best known for demonstrating the concept of market equilibrium using a system of simultaneous equations. The breadth and depth of his work undoubtedly established him as one of the forerunners of modern mathematical economics, yet his seminal contributions to the study of economics remained largely unrecognized until the latter half of the twentieth century. This pioneering new book, the first in English, examines Isnard's life and illuminates his major contributions to political economy. It contains substantial extracts from a number of his publications, presented both in English translation and in the original French, so Isnard can now finally achieve his place at the heart of discussion on the origins of mathematical economics. The diverse issues covered here will ensure that this book appeals not only to economists with an interest in the history of mathematical economics, but to anyone interested in the emergence of political economy and in wider social thought during the Enlightenment.
Models for Repeated Measurements will be of interest to research statisticians in agriculture, medicine, economics, and psychology, and to the many consulting statisticians who want an up-to-date expository account of this important topic. The second edition of this successful book has been completely revised and updated to take account of developments in the area over the last few years. The book is organized into four parts. In the first part, the general context of repeated measurements is presented. In the following three parts, a large number of concrete examples, including data tables, are presented to illustrate the models available. The book also provides a very extensive and updated bibliography of the repeated measurements literature.
Big data presents challenges to cybersecurity. For example, the Internet of Things (IoT) will reportedly soon generate a staggering 400 zettabytes (ZB) of data a year, and self-driving cars are predicted to churn out 4,000 GB of data per hour of driving. Big data analytics, as an emerging analytical technology, offers the capability to collect, store, process, and visualize these vast amounts of data. Big Data Analytics in Cybersecurity examines the security challenges surrounding big data and provides actionable insights that can be used to improve the current practices of network operators and administrators. Applying big data analytics in cybersecurity is critical: by exploiting data from networks and computers, analysts can discover useful information, and decision makers can make more informed decisions about what actions need to be performed and how to improve policies, guidelines, procedures, tools, and other aspects of network processes. Bringing together experts from academia, government laboratories, and industry, the book provides insight to both new and more experienced security professionals, as well as data analytics professionals with varying levels of cybersecurity expertise. It covers a wide range of topics in cybersecurity, including:
- Network forensics
- Threat analysis
- Vulnerability assessment
- Visualization
- Cyber training
In addition, emerging security domains such as the IoT, cloud computing, fog computing, mobile computing, and cyber-social networks are examined. The book first focuses on how big data analytics can be used in different aspects of cybersecurity, including network forensics, root-cause analysis, and security training. Next it discusses big data challenges and solutions in such emerging cybersecurity domains as fog computing, the IoT, and mobile app security. The book concludes by presenting the tools and datasets for future cybersecurity research.
This monograph examines the domain of classical political economy using methodologies developed in recent years both by the new discipline of econophysics and by computing science. This approach is used to re-examine the classical subdivisions of political economy: production, exchange, distribution and finance. The book begins by examining the most basic feature of economic life, production, and asks what it is about physical laws that allows production to take place. How is it that human labour is able to modify the world? It looks at the role that information has played in the process of mass production and the extent to which human labour still remains a key resource. The Ricardian labour theory of value is re-examined in the light of econophysics, with agent-based models presented in which the Ricardian theory of value appears as an emergent property. The authors present models giving rise to the class distribution of income and the long-term evolution of profit rates in market economies. Money is analysed using tools drawn both from computer science and from the recent Chartalist school of financial theory. Combining techniques from classical political economy, theoretical computer science and econophysics to produce models that deepen our understanding of economic reality, this new title will be of interest to doctoral and research students, as well as scientists working in the field of econophysics.
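To give a flavour of how distributions emerge from agent-based models of this kind, here is a generic kinetic wealth-exchange model common in the econophysics literature (an illustrative stand-in, not the authors' own model): symmetric random pairwise exchanges conserve total wealth, yet a right-skewed, roughly exponential distribution emerges from initially equal endowments:

```python
import random

def exchange_economy(n_agents=1000, steps=50000, seed=1):
    """Kinetic wealth-exchange model: at each step two random agents
    pool their wealth and split the pot at a uniformly random point.
    Total wealth is conserved, but inequality emerges endogenously."""
    random.seed(seed)
    wealth = [1.0] * n_agents        # equal endowments
    for _ in range(steps):
        i = random.randrange(n_agents)
        j = random.randrange(n_agents)
        if i == j:
            continue
        pot = wealth[i] + wealth[j]
        share = random.random()
        wealth[i], wealth[j] = share * pot, (1 - share) * pot
    return wealth

w = exchange_economy()
```

In equilibrium the median falls well below the mean, the signature of the skewed distribution that motivates the class-distribution models described above.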
Examining the crucial topic of race relations, this book explores the economic and social environments that play a significant role in determining economic outcomes and why racial disparities persist. With contributions from a range of international contributors, including Edward Wolff and Catherine Weinberger, the book compares how various racial groups fare and are affected in different ways by economic and social institutions. Themes covered in the book include:
This is an invaluable resource for researchers and academics across a number of disciplines, including political economy, ethnic and multicultural studies, Asian studies, and sociology.
The time series in the first part of this third biennial compilation show the values of each country's imports and exports, the trading blocs, and each country's most important trading partners. Next, imports and exports of selected commodities are detailed by quantity and by value over the years. Annual and cumulative trade balances are listed and a breakdown is given by Standard International Trade Classification groups. Part Four provides balance of payments figures and indicates changes in indebtedness at various times. A detailed list of tables and an alphabetical index permit quick and easy access to any information required.
Thijs ten Raa, author of the acclaimed text The Economics of Input-Output Analysis, now takes the reader to the forefront of the field. This volume collects and unifies his and his co-authors' research papers on national accounting, input-output coefficients, economic theory, dynamic models, stochastic analysis, and performance analysis. The research is driven by the task to analyze national economies. The final part of the book scrutinizes the emerging Asian economies in the light of international competition.
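The input-output coefficients central to this line of research feed the basic Leontief quantity model; as an illustrative sketch (hypothetical two-sector coefficients, not taken from the book), gross outputs x solve x = Ax + d, which for two sectors can be inverted by hand:

```python
def leontief_output(a, demand):
    """Gross outputs for a 2-sector Leontief model: x = (I - A)^-1 d,
    where A holds technical coefficients and d is final demand."""
    (a11, a12), (a21, a22) = a
    # Form I - A and invert the 2x2 matrix directly
    m11, m12 = 1.0 - a11, -a12
    m21, m22 = -a21, 1.0 - a22
    det = m11 * m22 - m12 * m21
    x1 = (m22 * demand[0] - m12 * demand[1]) / det
    x2 = (-m21 * demand[0] + m11 * demand[1]) / det
    return x1, x2

# Hypothetical technical-coefficient matrix and final demand vector
A = [[0.2, 0.3], [0.4, 0.1]]
d = [100.0, 50.0]
x = leontief_output(A, d)
```

The solution satisfies the accounting identity x = Ax + d exactly: each sector's gross output covers intermediate use by both sectors plus final demand.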
Applied financial econometrics subjects are featured in this second volume, with papers that survey important research even as they make unique empirical contributions to the literature. These subjects are familiar: portfolio choice, trading volume, the risk-return tradeoff, option pricing, bond yields, and the management, supervision, and measurement of extreme and infrequent risks. Yet their treatments are exceptional, drawing on current data and evidence to reflect recent events and scholarship. A landmark in its coverage, this volume should propel financial econometric research for years.
This two-volume work aims to present as completely as possible the methods of statistical inference with special reference to their economic applications. The reader will find a description not only of the classical concepts and results of mathematical statistics, but also of concepts and methods recently developed for the specific needs of econometrics. The authors have sought to avoid an overly technical presentation and go to some lengths to encourage an intuitive understanding of the results by providing numerous examples throughout. The breadth of approaches and the extensive coverage of the two volumes provide for a thorough and entirely self-contained course in modern econometrics. Volume 1 provides an introduction to general concepts and methods in statistics and econometrics, and goes on to cover estimation and prediction. Volume 2 focuses on testing, confidence regions, model selection, and asymptotic theory.
The development of economics changed dramatically during the twentieth century with the emergence of econometrics, macroeconomics and a more scientific approach in general. One of the key individuals in the transformation of economics was Ragnar Frisch, professor at the University of Oslo and the first Nobel Laureate in economics in 1969. He was a co-founder of the Econometric Society in 1930 (after having coined the word econometrics in 1926) and edited the journal Econometrica for twenty-two years. The discovery of the manuscripts of a series of eight lectures given by Frisch at the Henri Poincaré Institute in March-April 1933 on The Problems and Methods of Econometrics will enable economists to more fully understand his overall vision of econometrics. This book is a rare exhibition of Frisch's overview of econometrics and is published here in English for the first time. Edited and with an introduction by Olav Bjerkholt and Ariane Dupont-Kieffer, Frisch's eight lectures provide an accessible and astute discussion of econometric issues, from philosophical foundations to practical procedures. Covering the development of economics in the twentieth century and Ragnar Frisch's broader vision of economic science in general and econometrics in particular, this book will appeal to anyone with an interest in the history of economics and econometrics.
Extreme Value Modeling and Risk Analysis: Methods and Applications presents a broad overview of statistical modeling of extreme events along with the most recent methodologies and various applications. The book brings together background material and advanced topics, eliminating the need to sort through the massive amount of literature on the subject. After reviewing univariate extreme value analysis and multivariate extremes, the book explains univariate extreme value mixture modeling, threshold selection in extreme value analysis, and threshold modeling of non-stationary extremes. It presents new results for block-maxima of vine copulas, develops time series of extremes with applications from climatology, describes max-autoregressive and moving maxima models for extremes, and discusses spatial extremes and max-stable processes. The book then covers simulation and conditional simulation of max-stable processes; inference methodologies, such as composite likelihood, Bayesian inference, and approximate Bayesian computation; and inferences about extreme quantiles and extreme dependence. It also explores novel applications of extreme value modeling, including financial investments, insurance and financial risk management, weather and climate disasters, clinical trials, and sports statistics. Risk analyses related to extreme events require the combined expertise of statisticians and domain experts in climatology, hydrology, finance, insurance, sports, and other fields. This book connects statistical/mathematical research with critical decision and risk assessment/management applications to stimulate more collaboration between these statisticians and specialists.
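The block-maxima approach mentioned above can be sketched in a few lines. The example below is illustrative only (simulated data, and a method-of-moments fit of the Gumbel distribution, the light-tailed member of the GEV family, rather than the likelihood-based methods the book covers): it extracts annual maxima from simulated daily losses and estimates a 100-year return level:

```python
import math
import random
import statistics

def block_maxima(series, block_size):
    """Split a series into consecutive blocks and keep each block's maximum."""
    return [max(series[i:i + block_size])
            for i in range(0, len(series) - block_size + 1, block_size)]

def fit_gumbel(maxima):
    """Method-of-moments fit of a Gumbel(mu, beta) distribution."""
    beta = statistics.stdev(maxima) * math.sqrt(6) / math.pi
    mu = statistics.mean(maxima) - 0.5772 * beta   # Euler-Mascheroni constant
    return mu, beta

def return_level(mu, beta, period):
    """Level exceeded on average once per `period` blocks:
    solves exp(-exp(-(x - mu) / beta)) = 1 - 1/period for x."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / period))

random.seed(0)
losses = [random.gauss(0.0, 1.0) for _ in range(36500)]  # simulated daily losses
maxima = block_maxima(losses, 365)                       # "annual" maxima
mu, beta = fit_gumbel(maxima)
level_100 = return_level(mu, beta, 100)
```

The 100-year return level always lies above the location parameter, since it corresponds to the 0.99 quantile of the fitted annual-maximum distribution.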
Many of the complex problems faced by decision makers involve uncertainty as well as multiple conflicting objectives. This book provides a complete understanding of the types of objective functions that should be used in multiattribute decision making. By using tools such as preference, value, and utility functions, readers will learn state-of-the-art methods to analyze prospects to guide decision making and will develop a process that guarantees a defensible analysis to rationalize choices. Summarizing and distilling classical techniques and providing extensive coverage of recent advances in the field, the author offers practical guidance on how to make good decisions in the face of uncertainty. This text will appeal to graduate students and practitioners alike in systems engineering, operations research, business, management, government, climate change, energy, and healthcare.
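One of the simplest objective functions in this family is the additive multiattribute value function; the sketch below is a generic illustration (the attributes, weights, and scores are hypothetical, and real analyses must first verify the preferential independence conditions that justify the additive form):

```python
def additive_value(scores, weights):
    """Additive multiattribute value: weighted sum of single-attribute
    values, each already rescaled to [0, 1]; weights sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, scores))

# Hypothetical choice between two prospects scored on cost, quality, risk
weights = [0.5, 0.3, 0.2]
a = additive_value([0.8, 0.6, 0.9], weights)   # prospect A
b = additive_value([0.6, 0.9, 0.7], weights)   # prospect B
best = "A" if a > b else "B"
```

Under uncertainty the same scaffolding carries over to expected utility: replace each attribute score with a utility and average over the probability distribution of outcomes.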
Showcasing fuzzy set theory, this book highlights the enormous potential of fuzzy logic in helping to analyse the complexity of a wide range of socio-economic patterns and behaviour. The contributions to this volume explore the most up-to-date fuzzy-set methods for the measurement of socio-economic phenomena in a multidimensional and/or dynamic perspective. Thus far, fuzzy-set theory has primarily been utilised in the social sciences in the field of poverty measurement. These chapters examine the latest work in this area, while also exploring further applications including social exclusion, the labour market, educational mismatch, sustainability, quality of life and violence against women. The authors demonstrate that real-world situations are often characterised by imprecision, uncertainty and vagueness, which cannot be properly described by the classical set theory which uses a simple true-false binary logic. By contrast, fuzzy-set theory has been shown to be a powerful tool for describing the multidimensionality and complexity of social phenomena. This book will be of significant interest to economists, statisticians and sociologists utilising quantitative methods to explore socio-economic phenomena.
Pathwise Estimation and Inference for Diffusion Market Models discusses contemporary techniques for inferring, from options and bond prices, the market participants' aggregate view on important financial parameters such as implied volatility, discount rate and future interest rate, and the uncertainty thereof. The focus is on pathwise inference methods that are applicable to a sole path of the observed prices and do not require the observation of an ensemble of such paths. The book is pitched at the level of senior undergraduate students undertaking research in their honours year, and postgraduate candidates undertaking a Master's or PhD degree by research. From a research perspective, it reaches out to academic researchers from backgrounds as diverse as mathematics and probability, econometrics and statistics, and computational mathematics and optimization, whose interests lie in the analysis and modelling of financial market data from a multi-disciplinary approach. Additionally, the book is aimed at financial market practitioners in capital-market-facing businesses who seek to keep abreast of and draw inspiration from novel approaches to market data analysis. The first two chapters contain introductory material on stochastic analysis and the classical diffusion stock market models. The remaining chapters discuss more specialized stock and bond market models and methods of pathwise inference for market parameters in different models. The final chapter describes applications of numerical methods of inference of bond market parameters to forecasting of the short rate. Nikolai Dokuchaev is an associate professor in Mathematics and Statistics at Curtin University. His research interests include mathematical and statistical finance, stochastic analysis, PDEs, control, and signal processing. Lin Yee Hin is a practitioner in the capital-market-facing industry. His research interests include econometrics, non-parametric regression, and scientific computing.
Estimate and interpret results from ordered regression models. Ordered Regression Models: Parallel, Partial, and Non-Parallel Alternatives presents regression models for ordinal outcomes, which are variables that have ordered categories but unknown spacing between the categories. The book provides comprehensive coverage of the three major classes of ordered regression models (cumulative, stage, and adjacent) as well as variations based on the application of the parallel regression assumption. The authors first introduce the three "parallel" ordered regression models before covering unconstrained partial, constrained partial, and nonparallel models. They then review existing tests for the parallel regression assumption, propose new variations of several tests, and discuss important practical concerns related to tests of the parallel regression assumption. The book also describes extensions of ordered regression models, including heterogeneous choice models, multilevel ordered models, and the Bayesian approach to ordered regression models. Some chapters include brief examples using Stata and R. This book offers a conceptual framework for understanding ordered regression models based on the probability of interest and the application of the parallel regression assumption. It demonstrates the usefulness of numerous modeling alternatives, showing you how to select the most appropriate model given the type of ordinal outcome and the restrictiveness of the parallel assumption for each variable. Web Resource: More detailed examples are available on a supplementary website. The site also contains JAGS, R, and Stata code to estimate the models, along with syntax to reproduce the results.
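The parallel regression assumption at the heart of this material can be made concrete with a small sketch. Under the parallel cumulative logit (proportional-odds) model, every cumulative probability shares a single slope; only the cutpoints differ. The coefficients below are hypothetical, chosen for illustration (the book's own examples use Stata and R):

```python
import math

def cumulative_logit_probs(x, beta, cutpoints):
    """Category probabilities under the parallel cumulative logit model:
    P(Y <= j | x) = logistic(tau_j - beta * x), with one slope `beta`
    shared across all thresholds (the parallel regression assumption).
    `cutpoints` must be strictly increasing."""
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cums = [logistic(tau - beta * x) for tau in cutpoints] + [1.0]
    # Differencing adjacent cumulative probabilities yields category probabilities
    probs = [cums[0]] + [cums[j] - cums[j - 1] for j in range(1, len(cums))]
    return probs

# Hypothetical 4-category ordinal outcome: 3 cutpoints, one shared slope
probs = cumulative_logit_probs(x=1.0, beta=0.8, cutpoints=[-1.0, 0.5, 2.0])
```

A nonparallel model would instead carry a separate beta_j at each threshold; the partial models in the book sit in between, relaxing the shared slope only for selected variables.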
Model Building is the most fruitful area of economics, designed to solve real-world problems using all available methods, whether mathematical, computational or analytical, without distinction. Wherever necessary, we should not be reluctant to develop new techniques, whether mathematical or computational. That is the philosophy of this volume. The volume is divided into three distinct parts: Methods, Theory and Applications. The Methods section is in turn subdivided into Mathematical Programming and Econometrics and Adaptive Control Systems, which are widely used in econometric analysis. The impacts of fiscal policy in a regime with an independent monetary authority and dynamic models of environmental taxation are considered. In the section on "Modelling Business Organization", a model of a Japanese organization is presented. Furthermore, a model suitable for efficient budget management of a health service unit, applying the goal programming method and taking into account various socio-economic factors, is analyzed. This is followed by a section on "Modelling National Economies", in which macroeconometric models for the EU member countries are analyzed to find instruments that stabilize inflation through coordinated action.
The impact of globalization of financial markets is a highly debated topic, particularly in recent months when the issue of globalization and contagion of financial distress has become a focus of intense policy debate. The papers in this volume provide an up-to-date overview of the key issues in this debate. While most of the contributions were prepared after the initial outbreak of the current global turmoil and financial crisis, they identify the relative strengths of the risk diversification and risk transmission processes and examine the empirical evidence to date. The book considers the relative roles of banks, nonbank financial institutions and capital markets in both risk diversification and risk transmission. It then evaluates the current status of crisis resolution in a global context, and speculates where to go from here in terms of understanding, resolution, prevention and public policy.
There is no book currently available that gives a comprehensive treatment of the design, construction, and use of index numbers. However, there is a pressing need for one in view of the increasing and more sophisticated employment of index numbers in the whole range of applied economics and specifically in discussions of macroeconomic policy. In this book, R. G. D. Allen meets this need in simple and consistent terms and with comprehensive coverage. The text begins with an elementary survey of the index-number problem before turning to more detailed treatments of the theory and practice of index numbers. The binary case in which one time period is compared with another is first developed and illustrated with numerous examples. This is to prepare the ground for the central part of the text on runs of index numbers. Particular attention is paid both to fixed-weighted and to chain forms as used in a wide range of published index numbers taken mainly from British official sources. This work deals with some further problems in the construction of index numbers, problems which are both troublesome and largely unresolved. These include the use of sampling techniques in index-number design and the theoretical and practical treatment of quality changes. It is also devoted to a number of detailed and specific applications of index-number techniques to problems ranging from national-income accounting, through the measurement of inequality of incomes and international comparisons of real incomes, to the use of index numbers of stock-market prices. Aimed primarily at students of economics, whatever their age and range of interests, this work will also be of use to those who handle index numbers professionally. R. G. D. Allen (1906-1983) was Professor Emeritus at the University of London. He also served as president of the Royal Statistical Society and as Treasurer of the British Academy, of which he was a Fellow.
He is the author of "Basic Mathematics," "Mathematical Analysis for Economists," "Mathematical Economics" and "Macroeconomic Theory."
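The binary and chain forms described above reduce to short formulas; as an illustrative sketch with a hypothetical two-good basket (the standard Laspeyres, Paasche, and Fisher price indices, plus chain-linking of period-to-period links):

```python
import math

def laspeyres(p0, p1, q0):
    """Base-weighted price index: current prices valued at base quantities."""
    return sum(a * b for a, b in zip(p1, q0)) / sum(a * b for a, b in zip(p0, q0))

def paasche(p0, p1, q1):
    """Current-weighted price index: prices valued at current quantities."""
    return sum(a * b for a, b in zip(p1, q1)) / sum(a * b for a, b in zip(p0, q1))

def fisher(p0, p1, q0, q1):
    """Fisher 'ideal' index: geometric mean of Laspeyres and Paasche."""
    return math.sqrt(laspeyres(p0, p1, q0) * paasche(p0, p1, q1))

def chain(links):
    """Chain-linked index: cumulative product of period-to-period links."""
    out, level = [], 1.0
    for link in links:
        level *= link
        out.append(level)
    return out

# Hypothetical two-good basket: prices and quantities in base and current periods
p0, p1 = [10.0, 4.0], [12.0, 5.0]
q0, q1 = [3.0, 8.0], [2.0, 9.0]
L = laspeyres(p0, p1, q0)
P = paasche(p0, p1, q1)
F = fisher(p0, p1, q0, q1)
```

The Fisher index always lies between its two components, which is one reason it is preferred when base- and current-weighted forms diverge.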