This book treats the notion of morphisms in spatial analysis, paralleling these concepts in spatial statistics (Part I) and spatial econometrics (Part II). The principal concept is the morphism (e.g., isomorphisms, homomorphisms, and allomorphisms), defined as a structure-preserving functional linkage between mathematical properties or operations in spatial statistics and spatial econometrics, among other disciplines. The purpose of this book is to present selected conceptions in both domains that are structurally the same, even though their labelling and the notation for their elements may differ. As the approaches presented here are applied to empirical materials in geography and economics, the book will also be of interest to scholars of regional science, quantitative geography and the geospatial sciences. It is a follow-up to the book "Non-standard Spatial Statistics and Spatial Econometrics" by the same authors, which was published by Springer in 2011.
Stochastic Averaging and Extremum Seeking treats methods inspired by attempts to understand the seemingly non-mathematical question of bacterial chemotaxis and their application in other environments. The text presents significant generalizations of existing stochastic averaging theory, developed from scratch because algorithms that are effective in treating these systems would otherwise violate previous theoretical assumptions. Coverage is given to four main topics. Stochastic averaging theorems are developed for the analysis of continuous-time nonlinear systems with random forcing, removing prior restrictions on nonlinearity growth and on the finiteness of the time interval. The new stochastic averaging theorems are usable not only as approximation tools but also for providing stability guarantees. Stochastic extremum-seeking algorithms are introduced for the optimization of systems without available models. Both gradient- and Newton-based algorithms are presented, offering the user a choice between simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton). The design of algorithms for non-cooperative/adversarial games is described, and the analysis of their convergence to Nash equilibria is provided. The algorithms are illustrated on models of economic competition and on problems of the deployment of teams of robotic vehicles. Bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients. Stochastic extremum seeking is shown to be a biologically plausible interpretation of chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed analysis of convergence for models of nonholonomic robotic vehicles operating in GPS-denied environments.
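The gradient-based extremum-seeking scheme described above can be sketched in a few lines: perturb the input estimate with a sinusoidal probe, demodulate the measured output to estimate the gradient of the unknown map, and integrate that estimate. The gains, probe parameters, washout filter, and the quadratic map below are all illustrative assumptions, not values from the book.

```python
import math

# Sketch of gradient-based extremum seeking on an unknown map f. A probe
# a*sin(omega*t) perturbs the estimate; multiplying the (DC-removed) output by
# the same sinusoid yields a gradient estimate, which is integrated with gain k.

def extremum_seek(f, theta0=0.0, a=0.1, omega=5.0, k=0.5, dt=0.01, steps=20000):
    """Return the input estimate after climbing toward the maximum of f."""
    theta_hat = theta0
    y_mean = f(theta0)  # washout state: slowly tracks the DC part of y
    for i in range(steps):
        t = i * dt
        y = f(theta_hat + a * math.sin(omega * t))   # probed measurement
        y_mean += 2.0 * (y - y_mean) * dt            # low-pass (washout) filter
        theta_hat += k * math.sin(omega * t) * (y - y_mean) * dt  # demodulate
    return theta_hat

# Map unknown to the algorithm, with its maximum at theta = 2.
theta = extremum_seek(lambda th: 5.0 - (th - 2.0) ** 2)
```

The washout filter removes the constant part of the measurement so that only the probe-correlated component, which carries the gradient information, drives the update.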
The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models. Stochastic Averaging and Extremum Seeking will be informative for control engineers from backgrounds in electrical, mechanical, chemical and aerospace engineering and to applied mathematicians. Economics researchers, biologists, biophysicists and roboticists will find the applications examples instructive.
This book reviews the three most popular methods (and their extensions) in applied economics and other social sciences: matching, regression discontinuity, and difference in differences. The book introduces the underlying econometric/statistical ideas, shows what is identified and how the identified parameters are estimated, and then illustrates how they are applied with real empirical examples. The book emphasizes how to implement the three methods with data: numerous datasets and programs are provided in the online appendix. All readers, whether theoretical econometricians/statisticians, applied economists/social scientists, or researchers/students, will find something useful in the book from their own perspectives.
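Of the three methods, difference in differences is the simplest to sketch: the treatment effect is estimated as the change in the treated group's mean outcome minus the change in the control group's. The outcome means below are hypothetical, not data from the book's appendix.

```python
# Two-period, two-group difference-in-differences: the control group's change
# proxies for the trend the treated group would have followed without treatment.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Return the difference-in-differences estimate of the treatment effect."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean outcomes before and after a policy change.
effect = did_estimate(treated_pre=10.0, treated_post=15.0,
                      control_pre=9.0, control_post=11.0)
print(effect)  # (15 - 10) - (11 - 9) = 3.0
```

The identifying assumption is parallel trends: absent treatment, both groups' means would have moved by the same amount.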
This book presents the works and research findings of physicists, economists, mathematicians, statisticians, and financial engineers who have undertaken data-driven modelling of market dynamics and other empirical studies in the field of Econophysics. During recent decades, the financial market landscape has changed dramatically with the deregulation of markets and the growing complexity of products. The ever-increasing speed and decreasing costs of computational power and networks have led to the emergence of huge databases. The availability of these data should permit the development of models that are better founded empirically, and econophysicists have accordingly been advocating that one should rely primarily on the empirical observations in order to construct models and validate them. The recent turmoil in financial markets and the 2008 crash appear to offer a strong rationale for new models and approaches. The Econophysics community accordingly has an important future role to play in market modelling. The Econophys-Kolkata VIII conference proceedings are devoted to the presentation of many such modelling efforts and address recent developments. A number of leading researchers from across the globe report on their recent work, comment on the latest issues, and review the contemporary literature.
A comprehensive reference work for teaching at graduate level and for research in empirical finance. The chapters cover a wide range of statistical and probabilistic methods applied to a variety of financial problems and are written by internationally renowned experts.
This handbook presents a systematic overview of the approaches to, diversity of, and problems involved in interdisciplinary rating methodologies. Historically, the purpose of ratings has been to achieve information transparency regarding a given body's activities, whether in the field of finance, banking, or sports, for example. This book focuses on commonly used rating methods in three important fields: finance, sports, and the social sector. In the world of finance, investment decisions are largely shaped by how positively or negatively economies or financial instruments are rated. Ratings have thus become a basis of trust for investors. Similarly, sports evaluation and funding are largely based on core ratings. From local communities to groups of nations, public investment and funding are also dependent on how these bodies are continuously rated against expected performance targets. As such, ratings need to reflect the consensus of all stakeholders on selected aspects of the work and how to evaluate their success. The public should also have the opportunity to participate in this process. The authors examine current rating approaches from a variety of proposals that are closest to the public consensus, analyzing the rating models and summarizing the methods of their construction. This handbook offers a valuable reference guide for managers, analysts, economists, business informatics specialists, and researchers alike.
This book provides a quantitative framework for the analysis of conflict dynamics and for estimating the economic costs associated with civil wars. The author develops modified Lotka-Volterra equations to model conflict dynamics, to yield realistic representations of battle processes, and to allow us to assess prolonged conflict traps. The economic costs of civil wars are evaluated with the help of two alternative methods: Firstly, the author employs a production function to determine how the destruction of human and physical capital stocks undermines economic growth in the medium term. Secondly, he develops a synthetic control approach, where the cost is obtained as the divergence of actual economic activity from a hypothetical path in the absence of civil war. The difference between the two approaches gives an indication of the adverse externalities impinging upon the economy in the form of institutional destruction. By using detailed time-series regarding battle casualties, local socio-economic indicators, and capital stock destruction during the Greek Civil War (1946-1949), a full-scale application of the above framework is presented and discussed.
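A modified Lotka-Volterra system of the kind described can be sketched with a simple Euler integration. The coefficients, initial force strengths, and the non-negativity clamp below are illustrative assumptions, not the author's calibrated model of the Greek Civil War.

```python
# Euler integration of a Lotka-Volterra-type system for two opposing forces:
# dx/dt = a*x - b*x*y  (x grows on its own, is attrited in encounters)
# dy/dt = -c*y + d*x*y (y decays on its own, gains from encounters)

def simulate(x0, y0, a=0.05, b=0.02, c=0.04, d=0.01, dt=0.1, steps=1000):
    """Return (x, y) after integrating the system for steps*dt time units."""
    x, y = x0, y0
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (-c * y + d * x * y) * dt
        # Clamp at zero: force strengths cannot go negative.
        x, y = max(x + dx, 0.0), max(y + dy, 0.0)
    return x, y

# Two opposing forces starting at strengths 10 and 5.
x_final, y_final = simulate(10.0, 5.0)
```

With interaction terms of this kind the trajectories cycle around an interior equilibrium, which is what makes prolonged conflict traps representable.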
The Analytic Network Process (ANP), developed by Thomas Saaty in his work on multicriteria decision making, applies network structures with dependence and feedback to complex decision making. This new edition of Decision Making with the Analytic Network Process is a selection of the latest applications of ANP to economic, social and political decisions, and also to technological design. The ANP is a methodological tool that helps to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify the judgments and derive priorities from them, and finally synthesize these diverse priorities into a single, mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The book focuses on the application of the ANP in three different areas: economics, the social sciences and the linking of measurement with human values. Economists can use the ANP as an alternative to the usual mathematical models on which economics bases its quantitative thinking. For psychologists, sociologists and political scientists, the ANP offers the methodology they have sought for some time to quantify and derive measurements for intangibles. Finally, the book applies the ANP to provide people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process, one is able to interpret the true meaning of measurements made on a uniform scale using a unit.
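The core computation behind deriving priorities in Saaty's framework is the principal eigenvector of a reciprocal pairwise-comparison matrix, which can be sketched with power iteration. The judgment matrix below is a hypothetical example, not one from the book.

```python
# Power iteration for the principal eigenvector of a positive matrix, the
# standard way priorities are derived from pairwise-comparison judgments.

def principal_eigenvector(m, iters=100):
    """Return the sum-normalized principal eigenvector of a positive matrix m."""
    n = len(m)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        v = [x / total for x in w]  # renormalize so priorities sum to 1
    return v

# Reciprocal judgment matrix: alternative 1 is judged 3x as important as
# alternative 2 and 5x as important as alternative 3, and so on.
judgments = [[1.0, 3.0, 5.0],
             [1.0 / 3.0, 1.0, 2.0],
             [1.0 / 5.0, 1.0 / 2.0, 1.0]]
priorities = principal_eigenvector(judgments)
```

The resulting vector ranks the alternatives and sums to one, so the entries read directly as relative weights.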
This book is devoted to biased sampling problems (also called choice-based sampling in econometrics parlance) and over-identified parameter estimation problems. Biased sampling problems appear in many areas of research, including medicine, epidemiology and public health, the social sciences and economics. The book addresses a range of important topics, including case-control studies, causal inference, missing data problems, meta-analysis, renewal processes and length-biased sampling problems, capture-recapture problems, case-cohort studies, exponential tilting, genetic mixture models, etc. The goal of this book is to make it easier for Ph.D. students and new researchers to get started in this research area. It will be of interest to all those who work in the health, biological, social and physical sciences, as well as those who are interested in survey methodology and other areas of statistical science, among others.
This book reflects the state of the art in nonlinear economic dynamics, financial market modelling and quantitative finance. It contains eighteen papers with topics ranging from disequilibrium macroeconomics, monetary dynamics, monopoly, and financial market and limit order market models with boundedly rational heterogeneous agents to estimation, time series modelling and empirical analysis, and from risk management of interest-rate products, futures price volatility and American option pricing with stochastic volatility to the evaluation of risk and derivatives in electricity markets. The book illustrates some of the most recent research tools in these areas and will be of interest to economists working in economic dynamics and financial market modelling, to mathematicians who are interested in applying complexity theory to economics and finance, and to market practitioners and researchers in quantitative finance interested in limit order, futures and electricity market modelling, derivative pricing and risk management.
This book offers a practical guide to agent-based economic modeling, adopting a "learning by doing" approach to help the reader master the fundamental tools needed to create and analyze agent-based models. After providing the reader with a basic "toolkit" for agent-based modeling, it presents and discusses didactic models of real financial and economic systems in detail. While stressing the main features and advantages of the bottom-up perspective inherent to this approach, the book also highlights the logic and practical steps that characterize the model-building procedure. A detailed description of the underlying codes, developed using R and C, is also provided. In addition, each didactic model is accompanied by exercises and applications designed to promote active learning on the part of the reader. Following the same approach, the book also presents several complementary tools required for the analysis and validation of the models, such as sensitivity experiments, calibration exercises, and the analysis of economic networks and statistical distributions. By the end of the book, the reader will have gained a deeper understanding of the agent-based methodology and be prepared to use the fundamental techniques required to start developing their own economic models. Accordingly, "Economics with Heterogeneous Interacting Agents" will be of particular interest to graduate and postgraduate students, as well as to academic institutions and lecturers interested in including an overview of the AB approach to economic modeling in their courses.
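In the bottom-up spirit of such models (the book's own didactic codes use R and C; this Python version and all its parameters are illustrative assumptions), a toy agent-based market can be sketched in a few lines: each noise-trader agent submits a buy or sell order, and the aggregate order imbalance moves the price.

```python
import random

# Toy bottom-up market: n_agents noise traders each submit a +1 (buy) or
# -1 (sell) order; the net imbalance shifts the price multiplicatively.

def step_price(price, rng, n_agents=100, impact=0.001):
    """One market step: random +1/-1 orders are aggregated into a price move."""
    net_demand = sum(rng.choice((-1, 1)) for _ in range(n_agents))
    return price * (1.0 + impact * net_demand)

rng = random.Random(0)  # seeded for reproducibility
price = 100.0
for _ in range(50):
    price = step_price(price, rng)
```

Even this minimal setup already produces an emergent price path from individual decisions, which is the methodological point of the bottom-up perspective.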
This book addresses both theoretical developments in and practical applications of econometric techniques to finance-related problems. It includes selected edited outcomes of the International Econometric Conference of Vietnam (ECONVN2018), held at Banking University, Ho Chi Minh City, Vietnam on January 15-16, 2018. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. An extremely important part of economics is finances: a financial crisis can bring the whole economy to a standstill and, vice versa, a smart financial policy can dramatically boost economic development. It is therefore crucial to be able to apply mathematical techniques of econometrics to financial problems. Such applications are a growing field, with many interesting results - and an even larger number of challenges and open problems.
This collection of original articles, 8 years in the making, shines a bright light on recent advances in financial econometrics. From a survey of mathematical and statistical tools for understanding nonlinear Markov processes to an exploration of the time-series evolution of the risk-return tradeoff for stock market investment, noted scholars Yacine Ait-Sahalia and Lars Peter Hansen benchmark the current state of knowledge while contributors build a framework for its growth. Whether in the presence of statistical uncertainty or the proven advantages and limitations of value at risk models, readers will discover that they can set few constraints on the value of this long-awaited volume.
This book presents selected peer-reviewed contributions from the International Work-Conference on Time Series, ITISE 2017, held in Granada, Spain, September 18-20, 2017. It discusses topics in time series analysis and forecasting, including advanced mathematical methodology, computational intelligence methods for time series, dimensionality reduction and similarity measures, econometric models, energy time series forecasting, forecasting in real problems, online learning in time series as well as high-dimensional and complex/big data time series. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
This book explores a wide range of issues related to the methodology, organization, and technologies of analytical work, showing the potential of using analytical tools and statistical indicators for studying socio-economic processes, forecasting, organizing effective companies, and improving managerial decisions. At the level of "living knowledge" in the broad context, it describes the essence of analytical technologies and the means of applying analytical and statistical work. The book is of interest to readers regardless of their specialization: scientific research, medicine, pedagogics, law, administrative work, or economic practice. Starting from the premise that readers are familiar with the theory of statistics, which has formulated the general methods and principles of establishing the quantitative characteristics of mass phenomena and processes, it describes the concepts, definitions, indicators and classifications of socio-economic statistics, taking into consideration the international standards and the present-day practice of statistics in Russia. Although concise, the book provides plenty of study material as well as questions at the end of each chapter. It is particularly useful for those interested in self-study or remote education, as well as business leaders who are interested in gaining a scientific understanding of their financial and economic activities.
This book provides the first comprehensive introduction to multi-agent, multi-choice repetitive games, such as the Kolkata Restaurant Problem and the Minority Game. It explains how the tangible formulations of these games, using stochastic strategies developed by statistical physicists employing both classical and quantum physics, have led to very efficient solutions to the problems posed. Further, it includes sufficient introductory notes on information-processing strategies employing both classical statistical physics and quantum mechanics. Games of this nature, in which agents are presented with choices, from among which their goal is to make the minority choice, offer effective means of modeling herd behavior and market dynamics and are highly relevant to assessing systemic risk. Accordingly, this book will be of interest to economists, physicists, and computer scientists alike.
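The basic round of a minority game of the kind the book formalizes can be sketched directly: agents choose one of two sides, and those on the less-crowded side win. The number of agents and the uniform-random strategy below are illustrative assumptions; the book's interest lies in smarter stochastic strategies.

```python
import random

# One round of a minority game with an odd number of agents, so there is
# always a strict minority side.

def play_round(n_agents, rng):
    """Return (winning side, number of winners) for one round."""
    choices = [rng.randint(0, 1) for _ in range(n_agents)]
    ones = sum(choices)
    minority = 1 if ones < n_agents - ones else 0
    winners = min(ones, n_agents - ones)  # agents on the minority side
    return minority, winners

rng = random.Random(42)
side, winners = play_round(101, rng)
# By construction fewer than half of 101 agents can be on the minority side.
assert winners <= 50
```

The efficiency question the book studies is how close the typical number of winners can be pushed to this 50-agent ceiling by better-than-random strategies.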
Recent advancements in data collection will affect all aspects of businesses, improving and bringing complexity to management and demanding integration of all resources, principles, and processes. The interpretation of these new technologies is essential to the advancement of management and business. The Handbook of Research on Expanding Business Opportunities With Information Systems and Analytics is a vital scholarly publication that examines technological advancements in data collection that will influence major change in many aspects of business through a multidisciplinary approach. Featuring coverage on a variety of topics such as market intelligence, knowledge management, and brand management, this book explores new complexities to management and other aspects of business. This publication is designed for entrepreneurs, business managers and executives, researchers, business professionals, data analysts, academicians, and graduate-level students seeking relevant research on data collection advancements.
This second edition sees the light three years after the first one: too short a time to feel seriously concerned to redesign the entire book, but sufficient to be challenged by the prospect of sharpening our investigation on the working of econometric dynamic models and to be inclined to change the title of the new edition by dropping the "Topics in" of the former edition. After considerable soul searching we agreed to include several results related to topics already covered, as well as additional sections devoted to new and sophisticated techniques, which hinge mostly on the latest research work on linear matrix polynomials by the second author. This explains the growth of chapter one and the deeper insight into representation theorems in the last chapter of the book. The role of the second chapter is that of providing a bridge between the mathematical techniques in the backstage and the econometric profiles in the forefront of dynamic modelling. For this purpose, we decided to add a new section where the reader can find the stochastic rationale of vector autoregressive specifications in econometrics. The third (and last) chapter improves on that of the first edition by reaping the fruits of the thorough analytic equipment previously drawn up.
This monograph provides a unified and comprehensive treatment of an order-theoretic fixed point theory in partially ordered sets and its various useful interactions with topological structures. The material progresses systematically, presenting the preliminaries before moving on to more advanced topics. In the treatment of the applications, a wide range of mathematical theories and methods from nonlinear analysis and integration theory are applied; an outline of these is given in an appendix chapter to make the book self-contained. Graduate students and researchers in nonlinear analysis, pure and applied mathematics, game theory and mathematical economics will find this book useful.
This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well as, or better than, accommodative policy. The book is intended to be the largest-scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy, testing different parts or all of the period 1960-2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem as well as natural growth in the economy can. The book, employing the best scientific methods available to economists for this type of problem, concludes that accommodative monetary policy could have worked, but until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the science part of the answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume Why Fiscal Stimulus Programs Fail explores the policy implications of these results.
Many economic problems can be formulated as constrained optimization or equilibrium problems. Various mathematical theories have supplied economists with indispensable machinery for the problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
A wide variety of processes occur on multiple scales, either naturally or as a consequence of measurement. This book contains methodology for the analysis of data that arise from such multiscale processes. The book brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. The Bayesian approach also facilitates the use of knowledge from prior experience or data, and these methods can handle different amounts of prior knowledge at different scales, as often occurs in practice. The book is aimed at statisticians, applied mathematicians, and engineers working on problems dealing with multiscale processes in time and/or space, such as in engineering, finance, and environmetrics. The book will also be of interest to those working on multiscale computation research. The main prerequisites are knowledge of Bayesian statistics and basic Markov chain Monte Carlo methods. A number of real-world examples are thoroughly analyzed in order to demonstrate the methods and to assist the readers in applying these methods to their own work. To further assist readers, the authors are making source code (for R) available for many of the basic methods discussed herein.
The book details the innovative TERM (The Enormous Regional Model) approach to regional and national economic modeling, and explains the conversion from a comparative-static to a dynamic model. It moves on to an adaptation of TERM to water policy, including the additional theoretical and database requirements of the dynamic TERM-H2O model. In particular, it examines the contrasting economic impacts of water buyback policy and recurring droughts in the Murray-Darling Basin. South-east Queensland, where climate uncertainty has been borne out by record-breaking drought and the worst floods in living memory, provides a chapter-length case study. The exploration of the policy background and implications of TERM's dynamic modeling will provide food for thought in policy making circles worldwide, where there is a pressing need for solutions to similarly intractable problems in water management.
"A Companion to Theoretical Econometrics" provides a comprehensive
reference to the basics of econometrics. It focuses on the
foundations of the field and at the same time integrates popular
topics often encountered by practitioners. The chapters are written
by international experts and provide up-to-date research in areas
not usually covered by standard econometric texts.
This book is an exceptional reference for readers who require
quick access to the foundation theories in this field. Chapters are
organized to provide clear information and to point to further
readings on the subject. Important topics covered include:
This completely restructured, updated third edition of the volume first published in 1992 provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross section and time series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone.
Part I is concerned with the fundamentals of panel data econometrics, both linear and nonlinear; Part II deals with more advanced topics such as dynamic models, simultaneity and measurement errors, unit roots and cointegration, incomplete panels and selectivity, duration and count models, etc. This volume also provides insights into the use of panel data in empirical studies. Part III deals with surveys in several major fields of applied economics, such as investment demand, foreign direct investment and international trade, production efficiency, labour supply, transitions on the labour market, etc. Six new chapters about R&D and innovation, wages, health economics, policy evaluation, growth empirics and the impact of monetary policy have been included.
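The workhorse idea of linear panel data econometrics, sweeping out unit-specific effects with the within (fixed-effects) transformation, can be sketched as demeaning each unit's time series before estimation. The panel below is illustrative, not an example from the volume.

```python
# Within (fixed-effects) transformation: subtracting each unit's own mean
# removes any time-invariant unit-specific constant from the outcome.

def within_transform(panel):
    """panel: dict mapping unit -> list of outcomes; returns demeaned lists."""
    demeaned = {}
    for unit, ys in panel.items():
        mean = sum(ys) / len(ys)
        demeaned[unit] = [y - mean for y in ys]
    return demeaned

# Hypothetical two-unit panel with unequal series lengths (an incomplete panel).
demeaned = within_transform({"A": [1.0, 2.0, 3.0], "B": [10.0, 12.0]})
print(demeaned["A"])  # [-1.0, 0.0, 1.0]
```

Regressing demeaned outcomes on demeaned regressors then yields the fixed-effects estimator, since the unit intercepts have been differenced away.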