This book extends the author's first book and serves as a guide and manual on how to specify and compute 2-, 3-, and 4-event Bayesian Belief Networks (BBN). It walks the learner through the steps of fitting and solving fifty BBN numerically, with mathematical proofs throughout. The author wrote it primarily for inexperienced learners as well as professionals, while maintaining proof-based academic rigor. His first book on the topic, a primer introducing the complexities and nuances of Bayes' theorem and inverse probability, was meant for non-statisticians unfamiliar with the theorem, as is this book. The new book expands on that approach as a prescriptive guide to building BBN for students, professionals, and executive decision-makers, so that they can start applying this inductive reasoning principle in their decision-making processes. It highlights the utility of the algorithm that served as the basis for the first book and includes fifty 2-, 3-, and 4-event BBN of numerous variants.
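As a taste of the inverse-probability computation such a book centres on, here is a minimal sketch of Bayes' theorem for a single pair of events; the numbers are hypothetical and not drawn from the book's fifty networks.

```python
# Minimal 2-event Bayes' theorem sketch (hypothetical numbers, not from the book).
# Events: A = "condition present", B = "test positive".
p_a = 0.01              # prior P(A)
p_b_given_a = 0.95      # likelihood P(B|A)
p_b_given_not_a = 0.10  # false-positive rate P(B|~A)

# Total probability of the evidence: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Inverse probability: P(A|B) = P(B|A)P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.4f}")  # ~0.0876: a positive test is still weak evidence
```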
New Directions in Computational Economics brings together for the first time a diverse selection of papers sharing the underlying theme of computing technology as a tool for solving realistic problems in computational economics and related areas in the environmental, ecological, and energy fields. Part I of the volume addresses experimental and computational issues in auction mechanisms, including a survey of recent results for sealed-bid auctions. The second contribution uses neural networks as the basis for estimating bid functions for first-price sealed-bid auctions. Also presented is the 'smart market' computational mechanism, which better matches bids and offers for natural gas. Part II consists of papers that formulate and solve models of economic systems. Amman and Kendrick's paper deals with control models and the computational difficulties that result from nonconvexities. Using goal programming, Nagurney, Thore and Pan formulate spatial resource allocation models to analyze various policy issues. Thompson and Thrall next present a rigorous mathematical analysis of the relationship between efficiency and profitability. The problem of matching uncertain streams of assets and liabilities is solved using stochastic optimization techniques in the following paper in this section. Finally, Part III applies economic concepts to issues in computer science, in addition to using computational techniques to solve economic models.
World-renowned experts in spatial statistics and spatial econometrics present the latest advances in the specification and estimation of spatial econometric models. This includes information on the development of tools and software, and various applications. The text introduces new tests and estimators for spatial regression models, including discrete choice and simultaneous equation models. The performance of the techniques is demonstrated through simulation results and a wide array of applications related to economic growth, international trade, knowledge externalities, population-employment dynamics, urban crime, land use, and environmental issues. This is an exciting new text for academics with a theoretical interest in spatial statistics and econometrics, and for practitioners looking for up-to-date techniques.
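For readers unfamiliar with the models this volume treats, a minimal sketch of the canonical spatial lag specification, y = ρWy + Xβ + ε, may help; the weight matrix and parameter values below are toy assumptions, not an example from the book.

```python
# Simulating a first-order spatial autoregressive (SAR) model, y = rho*W*y + X*b + e,
# the canonical spatial lag specification (all numbers here are hypothetical).
import numpy as np

rng = np.random.default_rng(0)
n, rho, b = 100, 0.5, 2.0

# Row-standardized "nearest neighbour" weight matrix on a ring (a toy W).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = 0.5
    W[i, (i + 1) % n] = 0.5

X = rng.normal(size=n)
e = rng.normal(size=n)
# Reduced form: y = (I - rho*W)^{-1} (X*b + e)
y = np.linalg.solve(np.eye(n) - rho * W, X * b + e)
print(y[:5])
```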
Elementary Bayesian Statistics is a thorough and easily accessible introduction to the theory and practical application of Bayesian statistics. It presents methods to assist in the collection, summary and presentation of numerical data. Bayesian statistics are becoming an increasingly important and more frequently used method for analysing statistical data. The author defines concepts and methods with a variety of examples and uses a stage-by-stage approach to coach the reader through the applied examples. Also included are a wide range of problems to challenge the reader, and the book makes extensive use of Minitab to apply computational techniques to statistical problems. Issues covered include probability, Bayes' theorem and categorical states, frequency, the Bernoulli and Poisson processes, estimation, testing hypotheses, and the normal process with known and uncertain parameters. Elementary Bayesian Statistics will be an essential resource for students as a supplementary text in traditional statistics courses. It will also be welcomed by academics, researchers and econometricians wishing to know more about Bayesian statistics.
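To illustrate the stage-by-stage Bayesian updating the book teaches (the book itself works in Minitab), here is a sketch of the standard conjugate beta-Bernoulli calculation in Python; the prior and data are hypothetical.

```python
# Conjugate beta-Bernoulli updating, a standard example of the kind the book
# treats stage by stage (illustrative only; the book itself uses Minitab).
from scipy import stats

alpha, beta = 1.0, 1.0      # uniform Beta(1, 1) prior on the success probability
successes, failures = 7, 3  # hypothetical Bernoulli data

posterior = stats.beta(alpha + successes, beta + failures)
lo, hi = posterior.interval(0.95)
print(f"posterior mean:        {posterior.mean():.3f}")  # 8/12 ~ 0.667
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```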
Economic Phenomena before and after War is the result of the author's search for a scientific explanation of modern wars by means of economic statistics: data on consumption, production, and the natural growth of population. The theory discussed assumes that a state of war in modern communities depends on the general economic equilibrium, which becomes more and more unstable as industrialization progresses. A state of war marks a turning point in the action of balancing forces; it moves the economic forces in the opposite direction and is therefore a means of stabilizing the general economic equilibrium.
This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, possibly under shape or other constraints or under long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailedness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems, including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.
This book focuses on quantitative survey methodology and on methods of data collection and cleaning. Providing starting tools for using and analyzing a data file once a survey has been conducted, it addresses topics as diverse as advanced weighting, editing, and imputation, which are not well covered in comparable survey books. Moreover, it presents numerous empirical examples from the author's extensive research experience, particularly real data sets from multinational surveys.
Economists can use computer algebra systems to manipulate symbolic models, derive numerical computations, and analyze empirical relationships among variables. Maxima is an open-source, multi-platform computer algebra system that rivals proprietary software. Maxima's symbolic and computational capabilities enable economists and financial analysts to develop a deeper understanding of models by letting them explore the implications of differences in parameter values, by providing numerical solutions to problems that would otherwise be intractable, and by producing graphical representations that can guide analysis. This book provides a step-by-step tutorial for using this program to examine the economic relationships that form the core of microeconomics, in a way that complements traditional modeling techniques. Readers learn how to phrase the relevant analysis and how symbolic expressions, numerical computations, and graphical representations can be used to learn from microeconomic models. In particular, comparative statics analysis is facilitated. Little has been published on Maxima and its applications in economics and finance, and this volume will appeal to advanced undergraduates, graduate-level students studying microeconomics, academic researchers in economics and finance, economists, and financial analysts.
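The book itself works in Maxima; as a rough analogue, the same kind of comparative statics step can be sketched with Python's sympy. The Cobb-Douglas demand function below is a standard textbook example, not one of the book's exercises.

```python
# A comparative-statics step in a computer algebra system (here sympy, as an
# open-source analogue of Maxima; hypothetical Cobb-Douglas setup).
import sympy as sp

a, m, px = sp.symbols('a m p_x', positive=True)
# Marshallian demand for good x under Cobb-Douglas utility U = x^a * y^(1-a)
x_demand = a * m / px

# Comparative statics: how demand responds to price and income
print(sp.diff(x_demand, px))  # -a*m/p_x**2  (downward-sloping demand)
print(sp.diff(x_demand, m))   # a/p_x        (normal good)
```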
Professionals are constantly searching for competitive solutions to help determine current and future economic tendencies. Econometrics uses statistical methods and real-world data to predict and establish specific trends within business and finance. This analytical method holds enormous potential, but the research professionals need in order to understand and implement it is lacking. Applied Econometric Analysis: Emerging Research and Opportunities explores the theoretical and practical aspects of detailed econometric theories and applications within economics, political science, public policy, business, and finance. Featuring coverage of a broad range of topics such as cointegration, machine learning, and time series analysis, this book is ideally designed for economists, policymakers, financial analysts, marketers, researchers, academicians, and graduate students seeking research on econometric techniques and their applications.
Anyone who wants to understand stock market cycles and develop a focused, thoughtful, and solidly grounded valuation approach to the stock market must read this book. Bolten explains the causes and patterns of the cycles and identifies the sources of stock price changes and of risk, both in the stock market as a whole and in individual stocks. He shows how the interaction of expected return and risk creates stock market cycles, discusses the industry sectors most likely to be profitable investments at each stage of the cycle, and identifies the warning signs of stock market bubbles and sinkholes. The role of the Federal Reserve in each stage of the stock market cycle is also discussed. All categories of risk, and the specific risks within them, are identified and explained. The underlying causes of long-term stock price trends and cycles are highlighted. The book is useful in many areas, including stock analysis, portfolio management, the cost of equity capital, financing strategies, business valuations, and spotting profit opportunities created by general economic and specific company changes.
Providing researchers in economics, finance, and statistics with an up-to-date introduction to applying Bayesian techniques in empirical studies, this book covers the full range of numerical techniques developed over the last thirty years, notably Monte Carlo sampling, antithetic replication, importance sampling, and Gibbs sampling. The author covers both advances in theory and modern approaches to numerical and applied problems, includes applications drawn from a variety of fields within economics, and provides a quick overview of the underlying statistical ideas of Bayesian thought. The result is a roadmap of applied economic questions that can now be addressed empirically with Bayesian methods, and many researchers will find it a highly readable survey of this growing topic.
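As a flavour of one of the techniques surveyed, here is a minimal Gibbs sampler for a bivariate normal distribution; it is a textbook toy example, not code from the book.

```python
# A minimal Gibbs sampler for a bivariate normal with correlation rho,
# illustrating one technique the book surveys (toy example, not from the book).
import numpy as np

rng = np.random.default_rng(42)
rho, n_draws = 0.8, 5000
sd = np.sqrt(1 - rho**2)  # conditional std dev of each coordinate

x = y = 0.0
draws = np.empty((n_draws, 2))
for t in range(n_draws):
    x = rng.normal(rho * y, sd)  # draw x | y from its full conditional
    y = rng.normal(rho * x, sd)  # draw y | x from its full conditional
    draws[t] = x, y

print(np.corrcoef(draws.T)[0, 1])  # should be close to rho = 0.8
```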
This book is an introductory exposition of topics that emerged in the literature as unifying themes between two fields of time-series econometrics: nonlinearity and nonstationarity. Papers on these topics have exploded over the last two decades, but they are rarely examined together. There is, undoubtedly, a variety of arguments that justify such a separation. But there are also good reasons to combine them. Those reluctant to a combined analysis might argue that nonlinearity and nonstationarity each raise non-trivial problems, so that their combination only compounds the difficulties. This argument can, however, be balanced by others of an economic nature. A predominant idea today is that a nonstationary series exhibits persistent deviations from its long-run components (either deterministic or stochastic trends). These persistent deviations are modelled in various ways: unit-root models, fractionally integrated processes, models with shifts in the time trend, and so on. However, many other behaviours inherent to nonstationary processes are not reflected in linear models. For instance, economic variables with mixture distributions, or processes that are state-dependent, undergo episodes of changing dynamics. In models with multiple long-run equilibria, moving from one equilibrium to another sometimes implies hysteresis. It is also known that certain shocks can change the economic fundamentals, reducing the possibility that an initial position is re-established after a shock (irreversibility).
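The persistence described here is easy to see in simulation. The sketch below contrasts a stationary AR(1) with a unit-root random walk; the coefficients are illustrative assumptions, not taken from the book.

```python
# Contrasting a stationary AR(1) with a unit-root random walk, the kind of
# persistence the book's nonstationary models capture (illustrative simulation).
import numpy as np

rng = np.random.default_rng(1)
T = 500
e = rng.normal(size=T)

ar1, rw = np.zeros(T), np.zeros(T)
for t in range(1, T):
    ar1[t] = 0.5 * ar1[t - 1] + e[t]  # shocks die out: mean reversion
    rw[t] = rw[t - 1] + e[t]          # unit root: shocks persist

print(f"AR(1) sample std: {ar1.std():.2f}")  # stays bounded
print(f"random-walk std:  {rw.std():.2f}")   # grows with T
```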
Major transport infrastructures are increasingly in the news as the engineering and financing possibilities come together. However, these projects have also demonstrated the inadequacy of most existing approaches to forecasting their impacts and to their overall evaluation. This collection of papers from a conference organized by the Association d'Econométrie Appliquée offers a state-of-the-art look at forecasting traffic, developing pricing strategies, and estimating impacts, in papers by leading authorities from Europe, North America and Japan.
This book combines both a comprehensive analytical framework and economic statistics that enable business decision makers to anticipate developing economic trends. The author blends recent and historical economic data with economic theory to provide important benchmarks or rules of thumb that give both economists and noneconomists enhanced understanding of unfolding economic data and their interrelationships. Through the matrix system, a disciplined approach is described for integrating readily available economic data into a comprehensive analysis without complex formulas. The extensive appendix of monthly key economic factors for 1978-1991 makes this an important reference source for economic and financial trend analysis. A new and practical method for economic trend analysis is introduced that provides more advanced knowledge than available from economic newsletters. Schaeffer begins with a general description of the business cycle and the typical behavior and effect of the credit markets, commercial banks, and the Federal Reserve. Next, fourteen key economic factors regularly reported by the business press are described, such as the capacity utilization rate and yield on three-month Treasury bills. Benchmarks for each of these key economic factors are set forth, together with an insightful discussion of the interrelationships indicating economic trends. A detailed discussion of the 1978-1991 American economy, incorporating monthly data from the historical matrix, demonstrates the practical application of the matrix system. Executives, investors, financial officers, and government policymakers will find this book useful in decision making.
This volume examines the state of the art in productivity and efficiency analysis, bringing together a selection of the best papers from the 10th North American Productivity Workshop. By analyzing worldwide perspectives on the challenges that local economies and institutions may face when changes in productivity are observed, it helps readers quickly assess the impact of productivity measurement, productivity growth, the dynamics of productivity change, measures of labor productivity and of technical efficiency in different sectors, frontier analysis, measures of performance, industry instability, and spillover effects. The contributions focus on the theory and application of economics, econometrics, statistics, management science, and operational research to problems of productivity and efficiency measurement. Popular techniques and methodologies, including stochastic frontier analysis and data envelopment analysis, are represented. Chapters also cover broader issues related to measuring, understanding, incentivizing, and improving the productivity and performance of firms, public services, and industries.
Occupational licensure, including regulation of the professions, dates back to the medieval period. While the guilds that performed this regulatory function have long since vanished, professional regulation continues to this day; in the United States, for instance, 22 per cent of workers must hold licenses simply to do their jobs. While long-established professions have more settled regulatory paradigms, the case studies in Paradoxes of Professional Regulation explore other professions, taking note of incompetent services and the serious risks they pose to the physical, mental, or emotional health, financial well-being, or legal status of uninformed consumers. Michael J. Trebilcock examines five case studies of the regulation of diverse professions: alternative medicine, mental health care provision, financial planning, immigration consulting, and legal services. Noting the widely divergent approaches to the regulation of the same professions across different jurisdictions - the paradoxes of professional regulation - the book attempts to develop a set of regulatory principles for the future. In its comparative approach, Paradoxes of Professional Regulation gets at the heart of the tensions influencing the regulatory landscape and works toward practical lessons for bringing greater coherence to the way professions are regulated.
• Introduces the dynamics, principles and mathematics behind ten macroeconomic models, allowing students to visualise the models and understand the economic intuition behind them.
• Provides a step-by-step guide, and the necessary MATLAB codes, to allow readers to simulate and experiment with the models themselves; a flavour of such a simulation is sketched below.
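The book's codes are in MATLAB; as a rough illustration of what such a simulation looks like, here is a textbook Solow growth model iterated in Python, with hypothetical parameter values (not one of the book's ten models).

```python
# A textbook Solow growth model iterated to its steady state
# (illustrative parameters; the book's own codes are in MATLAB).
import numpy as np

alpha, s, delta, T = 0.33, 0.2, 0.05, 200  # capital share, saving rate, depreciation
k = np.empty(T)
k[0] = 1.0  # initial capital per worker

for t in range(T - 1):
    # Capital accumulation: k' = s*k^alpha + (1 - delta)*k
    k[t + 1] = s * k[t] ** alpha + (1 - delta) * k[t]

k_star = (s / delta) ** (1 / (1 - alpha))  # analytical steady state
print(f"simulated k_T = {k[-1]:.3f}, steady state = {k_star:.3f}")
```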
A timely work which represents a major reappraisal of business cycle theory. It revives, with the help of modern analytical techniques, an old theme of Keynesian macroeconomics, namely that "market psychology" (i.e., volatile expectations) may be a significant cause of economic fluctuations. It is of interest not only to economists, but also to mathematicians and physicists.
This book develops the analysis of time series from its formal beginnings in the 1890s through to Box and Jenkins' watershed monograph of 1970, showing how these methods laid the foundations for the modern techniques of time series analysis in use today.
This book explores the potential for renewable energy development and the adoption of sustainable production processes in Latin America and the Caribbean. By examining the energy transition process, the impact of environmental degradation, and the relationship between renewable energy sources and economic growth, it analyses the effects of increased globalisation and liberalisation in this part of the world. Particular attention is given to renewable energy investment, the energy-growth nexus, the impact of trade openness, and the mitigation of carbon emissions. The book highlights econometric techniques that can be used to tackle issues relating to globalisation, the energy transition, and environmental degradation, and will be relevant to researchers and policymakers interested in energy and environmental economics.
This Festschrift is dedicated to Goetz Trenkler on the occasion of his 65th birthday. As can be seen from the long list of contributions, Goetz has had and still has an enormous range of interests, and colleagues to share these interests with. He is a leading expert in linear models, with a particular focus on matrix algebra in its relation to statistics. He has published in almost all major statistics and matrix theory journals. His research activities also include other areas (like nonparametrics, statistics and sports, combination of forecasts and magic squares, just to mention a few). Goetz Trenkler was born in Dresden in 1943. After his school years in East Germany and West Berlin, he obtained a Diploma in Mathematics from the Free University of Berlin (1970), where he also discovered his interest in Mathematical Statistics. In 1973, he completed his Ph.D. with a thesis titled 'On a distance-generating function of probability measures'. He then moved on to the University of Hannover to become Lecturer and to write a habilitation thesis (submitted 1979) on alternatives to the Ordinary Least Squares estimator in the Linear Regression Model, a topic that would become his predominant field of research in the years to come.
This book systematically provides a prospective integrated approach for complexity social science in its view of statistical physics and mathematics, with an impressive collection of the knowledge and expertise of leading researchers from all over the world. It mainly covers both finitary methods of statistical equilibrium and data-driven analysis by econophysics. The late Professor Masanao Aoki of UCLA, who passed away at the end of July 2018, dedicated his later years to the reconstruction of macroeconomics mainly in terms of statistical physics. Professor Aoki, already an IEEE Fellow, was also named an Econometric Society Fellow in 1979. Until the early 1990s, however, his contributions focused on the development of novel algorithms for time series models and their application to economic data, work comparable to Granger's Nobel Prize-winning work on cointegration. After the publication of his New Approaches to Macroeconomic Modeling and Modeling Aggregate Behavior and Fluctuations in Economics, both by Cambridge University Press, in 1996 and 2002 respectively, his contributions rapidly became known and spread throughout the field. In short, these new works challenged econophysicists to develop evolutionary stochastic dynamics, multiple equilibria, and externalities as field effects, and revolutionized the stochastic view of interacting agents. In particular, the publication of Reconstructing Macroeconomics, also by Cambridge University Press (2007), in cooperation with Hiroshi Yoshikawa, further sharpened the process of embodying 'a perspective from statistical physics and combinatorial stochastic processes' in economic modeling. Almost concurrently with Prof. Aoki's newest developments, similar approaches were appearing, and those working in the same context around the world came together and exchanged results over the past decade. In memory of Prof. Aoki, this book was planned by authors who followed him, to present the most advanced outcomes of his heritage.
Quants, physicists working on Wall Street as quantitative analysts, have been widely blamed for triggering financial crises with their complex mathematical models. Their formulas were meant to allow Wall Street to prosper without risk. But in this penetrating insider's look at the recent economic collapse, Emanuel Derman, former head quant at Goldman Sachs, explains the collision between mathematical modeling and economics and what makes financial models so dangerous. Though such models imitate the style of physics and employ the language of mathematics, theories in physics aim for a description of reality; in finance, models can shoot only for a very limited approximation of reality. Derman uses his firsthand experience in financial theory and practice to explain the complicated tangles that have paralyzed the economy. Models.Behaving.Badly. exposes Wall Street's love affair with models, and shows us why nobody will ever be able to write a model that can encapsulate human behavior.
You may like...
Introductory Econometrics - A Modern… (Jeffrey Wooldridge, Hardcover)
Design and Analysis of Time Series… (Richard McCleary, David McDowall, …, Hardcover, R3,491)
Operations And Supply Chain Management (David Collier, James Evans, Hardcover)