Economic Modeling Using Artificial Intelligence Methods examines the application of artificial intelligence methods to model economic data. Traditionally, economic modeling has been conducted in the linear domain, where the principle of superposition is valid. The application of artificial intelligence to economic modeling allows for flexible, multi-order non-linear modeling. In addition, game theory has largely been applied in economic modeling; however, the inherent limitations of game theory when dealing with many-player games encourage the use of multi-agent systems for modeling economic phenomena. The artificial intelligence techniques used to model economic data include: multi-layer perceptron neural networks; radial basis functions; support vector machines; rough sets; genetic algorithms; particle swarm optimization; simulated annealing; multi-agent systems; incremental learning; and fuzzy networks. Signal processing techniques are also explored to analyze economic data, namely time domain methods, time-frequency domain methods and fractal dimension approaches. Interesting economic problems such as causality versus correlation, simulating the stock market, modeling and controlling inflation, option pricing, modeling economic growth, and portfolio optimization are examined. The relationship between economic dependency and interstate conflict is explored, and how economics can be used to foster peace - and vice versa - is investigated. Economic Modeling Using Artificial Intelligence Methods deals with the issue of causality in the non-linear domain and applies automatic relevance determination, the evidence framework, the Bayesian approach and Granger causality to understand causality and correlation. Economic Modeling Using Artificial Intelligence Methods makes an important contribution to the area of econometrics, and is a valuable source of reference for graduate students, researchers and financial practitioners.
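As a minimal, illustrative sketch of one technique from the list above (not code from the book), the snippet below fits a multi-layer perceptron regressor to synthetic non-linear data with scikit-learn; the data-generating process, layer sizes, and variable names are all assumptions made for the example.

```python
# Illustrative only: a small multi-layer perceptron fitted to synthetic,
# non-linear data. Everything here is an assumption, not the book's model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(500, 3))                        # three hypothetical indicators
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)  # non-linear target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print("out-of-sample R^2:", round(mlp.score(X_test, y_test), 3))
```

The same scikit-learn interface accommodates several of the other listed learners (for example, support vector machines via SVR), which makes it convenient to compare models on a common data set.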
Though globalisation of the world economy is currently a powerful force, people’s international mobility appears to still be very limited. The goal of this book is to improve our knowledge of the true effects of migration flows. It includes contributions by prominent academic researchers analysing the socio-economic impact of migration in a variety of contexts: interconnection of people and trade flows, causes and consequences of capital remittances, understanding the macroeconomic impact of migration and the labour market effects of people’s flows. The latest analytical methodologies are employed in all chapters, while interesting policy guidelines emerge from the investigations. The style of the volume makes it accessible for both non-experts and advanced readers interested in this hot topic of today’s world.
Analyze key indicators more accurately to make smarter market moves. The Economic Indicator Handbook helps investors more easily evaluate economic trends, to better inform investment decision making and other key strategic financial planning. Written by a Bloomberg Senior Economist, this book presents a visual distillation of the indicators every investor should follow, with clear explanation of how they're measured, what they mean, and how that should inform investment thinking. The focus on graphics, professional application, Bloomberg terminal functionality, and practicality makes this guide a quick, actionable read that could immediately start improving investment outcomes. Coverage includes gross domestic product, employment data, industrial production, new residential construction, consumer confidence, retail and food service sales, and commodities, plus guidance on the secret indicators few economists know or care about. Past performance can predict future results if you know how to read the indicators. Modern investing requires a careful understanding of the macroeconomic forces that lift and topple markets on a regular basis, and how they shift to move entire economies. This book is a visual guide to recognizing these forces and tracking their behavior, helping investors identify entry and exit points that maximize profit and minimize loss.
* Quickly evaluate economic trends
* Make more informed investment decisions
* Understand the most essential indicators
* Translate predictions into profitable actions
Savvy market participants know how critical certain indicators are to the formulation of a profitable, effective market strategy. A daily indicator check can inform day-to-day investing, and long-term tracking can result in a stronger, more robust portfolio. For the investor who knows that better information leads to better outcomes, The Economic Indicator Handbook is an exceptionally useful resource.
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
This book investigates the relationship between environmental degradation and income, focusing on carbon dioxide (CO2) emissions from around the world, to explore the possibility of sustainable development under global warming. Although many researchers have tackled this problem by estimating the Environmental Kuznets Curve (EKC), unlike the case of sulfur dioxide emissions, there seems to be little consensus about whether an EKC is formed with regard to CO2 emissions. Thus, the EKC is one of the most controversial issues in the field of environmental economics. This book contributes three points with academic rigor. First, an unbalanced panel dataset containing over 150 countries with the latest CO2 emission data between 1960 and 2010 is constructed. Second, based on this dataset, the CO2 emission-income relationship is analyzed using strict econometric methods such as the dynamic panel model. Third, as it is often pointed out that factors other than income affect CO2 emissions, several variables were added to the estimation model to examine the effects of changes in industrial structure, energy composition, and overseas trade on CO2 emissions.
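The Environmental Kuznets Curve discussed above is usually tested with a quadratic specification in income; a common textbook form, given here as a sketch of the general approach rather than the authors' exact panel model, is:

```latex
\ln(\mathrm{CO_2})_{it} = \alpha_i + \beta_1 \ln(\mathrm{GDP})_{it}
  + \beta_2 \left[\ln(\mathrm{GDP})_{it}\right]^2
  + \gamma' z_{it} + \varepsilon_{it},
\qquad
\mathrm{GDP}^{*} = \exp\!\left(-\frac{\beta_1}{2\beta_2}\right),
```

where z_{it} collects additional controls such as industrial structure, energy composition, and trade, and GDP* is the implied turning point; an inverted-U EKC requires beta_1 > 0 and beta_2 < 0.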
The Who, What, and Where of America is designed to provide a sampling of key demographic information. It covers the United States, every state, each metropolitan statistical area, and all the counties and cities with a population of 20,000 or more.
Who: Age, Race and Ethnicity, and Household Structure
What: Education, Employment, and Income
Where: Migration, Housing, and Transportation
Each part is preceded by highlights and ranking tables that show how areas diverge from the national norm. These research aids are invaluable for understanding data from the ACS (American Community Survey) and for highlighting what it tells us about who we are, what we do, and where we live. Each topic is divided into four tables revealing the results of the data collected from different types of geographic areas in the United States, generally with populations greater than 20,000:
·Table A. States
·Table B. Counties
·Table C. Metropolitan Areas
·Table D. Cities
In this edition, you will find social and economic estimates on the ways American communities are changing with regard to the following:
·Age and race
·Health care coverage
·Marital history
·Educational attainment
·Income and occupation
·Commute time to work
·Employment status
·Home values and monthly costs
·Veteran status
·Size of home or rental unit
This title is the latest in the County and City Extra Series of publications from Bernan Press. Other titles include County and City Extra; County and City Extra: Special Decennial Census Edition; and Places, Towns, and Townships.
The Analytic Hierarchy Process (AHP) has been one of the foremost mathematical methods for decision making with multiple criteria and has been widely studied in the operations research literature as well as applied to solve countless real-world problems. This book is meant to introduce and strengthen the readers' knowledge of the AHP, no matter how familiar they may be with the topic. This book provides a concise, yet self-contained, introduction to the AHP that uses a novel and more pedagogical approach. It begins with an introduction to the principles of the AHP, covering the critical points of the method, as well as some of its applications. Next, the book explores further aspects of the method, including the derivation of the priority vector, the estimation of inconsistency, and the use of AHP for group decisions. Each of these is introduced by relaxing initial assumptions. Furthermore, this booklet covers extensions of AHP, which are typically neglected in elementary expositions of the methods. Such extensions concern different numerical representations of preferences and the interval and fuzzy representations of preferences to account for uncertainty. During the whole exposition, an eye is kept on the most recent developments of the method.
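To make the priority vector and inconsistency estimation concrete, here is a short numerical sketch using the classical principal-eigenvector method and Saaty's consistency ratio; the 3x3 pairwise comparison matrix is invented purely for illustration.

```python
# Illustrative AHP computation: priority vector from the principal eigenvector
# and Saaty's consistency index/ratio. The comparison matrix is made up.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])            # reciprocal pairwise comparisons

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))           # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                            # normalized priority vector

n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)            # consistency index
RI = 0.58                                  # commonly tabulated random index for n = 3
print("priorities:", np.round(w, 3), "consistency ratio:", round(CI / RI, 3))
```

A consistency ratio below roughly 0.10 is the usual rule of thumb for accepting the judgments; the group-decision and fuzzy extensions mentioned above build on this same basic calculation.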
Written for a broad audience, this book offers a comprehensive account of early warning systems for hydro-meteorological disasters such as floods and storms, and for geological disasters such as earthquakes. One major theme is the increasingly important role in early warning systems played by the rapidly evolving fields of space and information technology. The authors, all experts in their respective fields, offer a comprehensive and in-depth insight into the current and future perspectives for early warning systems. The text is aimed at decision-makers in the political arena, scientists, engineers and those responsible for public communication and dissemination of warnings.
This book investigates the existence of stochastic and deterministic convergence of real output per worker and the sources of output (physical capital per worker, human capital per worker, total factor productivity (TFP), and average annual hours worked) in 21 OECD countries over the period 1970-2011. Towards this end, the authors apply a large battery of panel unit root and stationarity tests, all of which are robust to the presence of cross-sectional dependence. The results fail to provide clear-cut evidence of convergence dynamics either in real GDP per worker or in the series of the sources of output. Due to some limitations associated with second-generation panel unit root and stationarity tests, the authors further use the more flexible PANIC approach, which provides evidence that real GDP per worker, real physical capital per worker, human capital and average annual hours exhibit some degree of deterministic convergence, whereas TFP series display a high degree of stochastic convergence.
This volume addresses advanced DEA methodology and techniques developed for modeling unique new performance evaluation issues. Many numerical examples, real management cases and verbal descriptions make it very valuable for researchers and practitioners.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
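As a compact sketch of the workflow this book covers (unit-root testing, Granger causality, and vector autoregression), the example below uses statsmodels on simulated data; the series, lag lengths, and parameter values are assumptions for illustration, not taken from the text.

```python
# Illustrative time-series workflow on simulated data: ADF unit-root test,
# Granger causality test, and a small VAR. Not the book's empirical examples.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

rng = np.random.default_rng(1)
n = 300
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):                       # y depends on lagged x by construction
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()
data = pd.DataFrame({"x": x, "y": y})

print("ADF p-value for x:", adfuller(data["x"])[1])   # small p-value -> reject unit root
grangercausalitytests(data[["y", "x"]], maxlag=2)     # tests whether x Granger-causes y
var_res = VAR(data).fit(maxlags=2, ic="aic")          # lag order chosen by AIC
print(var_res.summary())
```

For nonstationary series, the same library provides cointegration tools, such as statsmodels.tsa.stattools.coint and the Johansen procedure in statsmodels.tsa.vector_ar.vecm, which lead naturally to the vector error correction models discussed above.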
Students in both social and natural sciences often seek regression methods to explain the frequency of events, such as visits to a doctor, auto accidents, or new patents awarded. This book provides the most comprehensive and up-to-date account of models and methods to interpret such data. The authors have conducted research in the field for more than twenty-five years. In this book, they combine theory and practice to make sophisticated methods of analysis accessible to researchers and practitioners working with widely different types of data and software in areas such as applied statistics, econometrics, marketing, operations research, actuarial studies, demography, biostatistics, and quantitative social sciences. The book may be used as a reference work on count models or by students seeking an authoritative overview. Complementary material in the form of data sets, template programs, and bibliographic resources can be accessed on the Internet through the authors' homepages. This second edition is an expanded and updated version of the first, with new empirical examples and more than one hundred new references added. The new material includes new theoretical topics, an updated and expanded treatment of cross-section models, coverage of bootstrap-based and simulation-based inference, expanded treatment of time series, multivariate and panel data, expanded treatment of endogenous regressors, coverage of quantile count regression, and a new chapter on Bayesian methods.
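For readers who have not worked with count models before, the short sketch below fits Poisson and negative binomial regressions to simulated event counts using statsmodels; the covariates and coefficients are invented for the example and are not drawn from the book.

```python
# Illustrative count-data regressions: Poisson and negative binomial models
# fitted to simulated counts. All numbers are invented for the example.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=(n, 2))
X = sm.add_constant(x)
mu = np.exp(0.2 + 0.5 * x[:, 0] - 0.3 * x[:, 1])    # log-linear conditional mean
y = rng.poisson(mu)                                  # simulated event counts

poisson_res = sm.Poisson(y, X).fit(disp=0)
negbin_res = sm.NegativeBinomial(y, X).fit(disp=0)   # allows for overdispersion
print("Poisson coefficients:", np.round(poisson_res.params, 2))
print("NegBin coefficients: ", np.round(negbin_res.params, 2))
```

Comparing the two fits, for instance by information criteria or by the estimated dispersion parameter, is a standard first check for overdispersion before moving to the richer panel, multivariate, or Bayesian count models that the book covers.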
One of the best known statisticians of the 20th century, Frederick Mosteller has inspired numerous statisticians and other scientists by his creative approach to statistics and its applications. This volume collects 40 of his most original and influential papers, capturing the variety and depth of his writings. It is hoped that sharing these writings with a new generation of researchers will inspire them to build upon his insights and efforts.
One of the most important features of China's economic emergence has been the role of foreign investment and foreign companies. The importance goes well beyond the USD 1.6 trillion in foreign direct investment that China has received since it started opening its economy. Using the tools of economic impact analysis, the author estimates that around one-third of China's GDP in recent years has been generated by the investments, operations, and supply chains of foreign invested companies. In addition, foreign companies have developed industries, created suppliers and distributors, introduced modern technologies, improved business practices, modernized management training, improved sustainability performance, and helped shape China's legal and regulatory systems. These impacts have helped China become the world's second largest economy, its leading exporter, and one of its leading destinations for inward investment. The book provides a powerful analysis of China's policies toward foreign investment that can inform policy makers around the world, while giving foreign companies tools to demonstrate their contributions to host countries and showing the tremendous power of foreign investment to help transform economies.
The volatility of financial returns changes over time and, for the last thirty years, Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models have provided the principal means of analyzing, modeling, and monitoring such changes. Taking into account that financial returns typically exhibit heavy tails (that is, extreme values can occur from time to time), Andrew Harvey's new book shows how a small but radical change in the way GARCH models are formulated leads to a resolution of many of the theoretical problems inherent in the statistical theory. The approach can also be applied to other aspects of volatility, such as those arising from data on the range of returns and the time between trades. Furthermore, the more general class of Dynamic Conditional Score models extends to robust modeling of outliers in the levels of time series and to the treatment of time-varying relationships. As such, there are applications not only to financial data but also to macroeconomic time series and to time series in other disciplines. The statistical theory draws on basic principles of maximum likelihood estimation and, by doing so, leads to an elegant and unified treatment of nonlinear time-series modeling. The practical value of the proposed models is illustrated by fitting them to real data sets.
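For orientation, the conventional GARCH(1,1) recursion that serves as the starting point for this discussion can be written as follows; this is the standard textbook form, not Harvey's modified specification:

```latex
r_t = \sigma_t \varepsilon_t, \qquad \varepsilon_t \sim \text{i.i.d.}(0,1), \qquad
\sigma_t^2 = \omega + \alpha\, r_{t-1}^2 + \beta\, \sigma_{t-1}^2,
\qquad \omega > 0,\ \alpha \ge 0,\ \beta \ge 0 .
```

In the dynamic conditional score approach, the squared-return driver is replaced by the score of an assumed heavy-tailed conditional density, which damps the influence of extreme observations and underlies the robustness properties described above.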
In recent years, as part of the increasing "informationization" of industry and the economy, enterprises have been accumulating vast amounts of detailed data such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on inter-firm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and "laws" akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THICCAPFA7 that was titled "New Approaches to the Analysis of Large-Scale Business and Economic Data," held in Tokyo, March 1-5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)-Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.
The book examines applications in two disparate fields linked by the importance of valuing information: public health and space. Researchers in the health field have developed some of the most innovative methodologies for valuing information, used to help determine, for example, the value of diagnostics in informing patient treatment decisions. In the field of space, recent applications of value-of-information methods are critical for informing decisions on investment in satellites that collect data about air quality, fresh water supplies, climate and other natural and environmental resources affecting global health and quality of life.
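The central quantity in most of these analyses is the expected value of perfect information; one standard formulation, given here as a general sketch rather than a method specific to this book, is:

```latex
\mathrm{EVPI} \;=\; \mathbb{E}_{\theta}\!\left[\max_{a \in A} u(a,\theta)\right]
\;-\; \max_{a \in A}\, \mathbb{E}_{\theta}\!\left[u(a,\theta)\right],
```

where a ranges over the available decisions (for example, treatment choices or satellite investments), theta is the uncertain state of the world, and u is the decision maker's payoff; the value of an imperfect signal such as a diagnostic test or a satellite observation is computed analogously by conditioning on the signal before optimizing.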
The Analytic Network Process (ANP), developed by Thomas Saaty in his work on multicriteria decision making, applies network structures with dependence and feedback to complex decision making. This new edition of Decision Making with the Analytic Network Process is a selection of the latest applications of ANP to economic, social and political decisions, and also to technological design. The ANP is a methodological tool that is helpful to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify the judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The book focuses on the application of the ANP in three different areas: economics, the social sciences and the linking of measurement with human values. Economists can use the ANP as an alternative approach for dealing with economic problems to the usual mathematical models on which economics bases its quantitative thinking. For psychologists, sociologists and political scientists, the ANP offers the methodology they have sought for some time to quantify and derive measurements for intangibles. Finally, the book applies the ANP to provide people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process, one is able to interpret the true meaning of measurements made on a uniform scale using a unit.
Up-to-date coverage of most micro-econometric topics; first half parametric, second half semi- (non-) parametric
Many empirical examples and tips in applying econometric theories to data
Essential ideas and steps shown for most estimators and tests; well-suited for both applied and theoretical readers
In macro-econometrics more attention needs to be paid to the relationships among deterministic trends of different variables, or co-trending, especially when economic growth is of concern. The number of relationships, i.e., the co-trending rank, plays an important role in evaluating the veracity of propositions, particularly relating to the Japanese economic growth in view of the structural changes involved within it. This book demonstrates how to determine the co-trending rank from a given set of time series data for different variables. At the same time, the method determines how many of the co-trending relations also represent cointegrations. This enables us to perform statistical inference on the parameters of relations among the deterministic trends. Co-trending is an important contribution to the fields of econometric methods, macroeconomics, and time series analyses.
This workbook consists of exercises taken from Likelihood-Based Inference in Cointegrated Vector Autoregressive Models by Søren Johansen, together with worked-out solutions.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of the size of countries, reflecting the key economic characteristics of economies in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration according to the size of countries, using a data set of 218 countries, 45 of which are European.
The Oxford Handbook of Computational Economics and Finance provides a survey of both the foundations of and recent advances in the frontiers of analysis and action. It is both historically and interdisciplinarily rich and also tightly connected to the rise of digital society. It begins with the conventional view of computational economics, including recent algorithmic development in computing rational expectations, volatility, and general equilibrium. It then moves from traditional computing in economics and finance to recent developments in natural computing, including applications of nature-inspired intelligence, genetic programming, swarm intelligence, and fuzzy logic. Also examined are recent developments of network and agent-based computing in economics. How these approaches are applied is examined in chapters on such subjects as trading robots and automated markets. The last part deals with the epistemology of simulation in its trinity form with the integration of simulation, computation, and dynamics. Distinctive is the focus on natural computationalism and the examination of the implications of intelligent machines for the future of computational economics and finance. Not merely individual robots, but whole integrated systems are extending their "immigration" to the world of Homo sapiens, or symbiogenesis.
Over the last thirty years there has been extensive use of continuous time econometric methods in macroeconomic modelling. This monograph presents a continuous time macroeconometric model of the United Kingdom incorporating stochastic trends. Its development represents a major step forward in continuous time macroeconomic modelling. The book describes the model in detail and, like earlier models, it is designed in such a way as to permit a rigorous mathematical analysis of its steady-state and stability properties, thus providing a valuable check on the capacity of the model to generate plausible long-run behaviour. The model is estimated using newly developed exact Gaussian estimation methods for continuous time econometric models incorporating unobservable stochastic trends. The book also includes discussion of the application of the model to dynamic analysis and forecasting.
You may like...
Operations and Supply Chain Management by James Evans, David Collier (Hardcover)
Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover) R3,567
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover) R2,970
Operations And Supply Chain Management by David Collier, James Evans (Hardcover)
Kwantitatiewe statistiese tegnieke by Swanepoel Swanepoel, Vivier Vivier, … (Book) R359
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover) R3,286