This volume is the result of an Advances in Econometrics conference held in November of 2002 at Louisiana State University in recognition of Halbert White's pioneering work published in Econometrica in 1980 and 1982 on robust variance-covariance estimation and quasi-maximum likelihood estimation. It contains 11 papers on a range of related topics including the estimation of possibly misspecified error component and fixed effects panel models, estimation and inference in possibly misspecified quantile regression models, quasi-maximum likelihood estimation of linear regression models with bounded and symmetric errors and quasi-maximum likelihood estimation of models with parameter dependencies between the mean vector and error variance-covariance matrix. Other topics include GMM, HAC, Heckit, asymmetric GARCH, Cross-Entropy, and multivariate deterministic trend estimation and testing under various possible misspecifications.
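White's 1980 robust variance-covariance estimator mentioned above can be sketched compactly. The following is an illustration only, not code from the volume; the data-generating process is invented, and the function computes the plain HC0 "sandwich" form with NumPy:

```python
import numpy as np

def hc0_cov(X, y):
    """White's (1980) heteroskedasticity-consistent (HC0) covariance:
    (X'X)^-1 X' diag(e_i^2) X (X'X)^-1, with e the OLS residuals."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS coefficients
    e = y - X @ beta                              # residuals
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = X.T @ (e[:, None] ** 2 * X)            # X' diag(e_i^2) X
    return beta, XtX_inv @ meat @ XtX_inv

# Hypothetical data whose error variance grows with the regressor
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
X = np.column_stack([np.ones(200), x])
y = 1.0 + 2.0 * x + rng.normal(0, 0.1 + x, 200)

beta, V = hc0_cov(X, y)
robust_se = np.sqrt(np.diag(V))   # heteroskedasticity-robust standard errors
```

Unlike the classical formula, the middle term uses squared residuals observation by observation, which is what makes the estimator consistent under heteroskedasticity of unknown form.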
Spatial statistics are useful in subjects as diverse as climatology, ecology, economics, environmental and earth sciences, epidemiology, image analysis and more. This book covers the best-known spatial models for three types of spatial data: geostatistical data (stationarity, intrinsic models, variograms, spatial regression and space-time models), areal data (Gibbs-Markov fields and spatial auto-regression) and point pattern data (Poisson, Cox, Gibbs and Markov point processes). The level is relatively advanced, and the presentation concise but complete. The most important statistical methods and their asymptotic
properties are described, including estimation in geostatistics,
autocorrelation and second-order statistics, maximum likelihood
methods, approximate inference using the pseudo-likelihood or
Monte-Carlo simulations, statistics for point processes and
Bayesian hierarchical models. A chapter is devoted to Markov Chain
Monte Carlo simulation (Gibbs sampler, Metropolis-Hastings
algorithms and exact simulation). This book is the English translation of Modélisation et Statistique Spatiales, published by Springer in the series Mathématiques & Applications, a series established by the Société de Mathématiques Appliquées et Industrielles (SMAI).
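The Metropolis-Hastings algorithm named above can be illustrated with a minimal random-walk sampler. This sketch is not taken from the book; the target (a standard normal, specified through its log density) and all tuning constants are chosen purely for demonstration:

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + step*N(0,1),
    accept with probability min(1, pi(x')/pi(x))."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.normal()
        # Accept/reject on the log scale for numerical stability
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        samples[i] = x
    return samples

# Sample from N(0, 1); only the unnormalised log density is needed
draws = metropolis_hastings(lambda z: -0.5 * z * z, x0=0.0, n_steps=20000)
kept = draws[1000:]   # discard a short burn-in
```

The empirical mean and standard deviation of `kept` should be close to 0 and 1; note that only the target density up to a constant is required, which is what makes the method useful for Gibbs-Markov fields and Bayesian hierarchical models.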
This book examines the numerous aspects of exchange rates and macroeconomic dynamics, focusing on the PPP puzzle, volatility and levels, with an exploration of the real exchange rate misalignment of the Central European countries using a single-equation approach, an examination of the real equilibrium exchange rate in China, exchange rate dynamics and pass-through effects in Russia and Hungary, and the impact of structural shocks on economies.
Japan is a tiny country that occupies only 0.25% of the world's total land area. However, this small country has the world's third-largest economy: as of 2012, Japanese GDP is roughly equivalent to the combined GDP of any two major European countries. This book is a first attempt to ask leaders of top Japanese companies, such as Toyota, about their thoughts on mathematics. The topics range from mathematical problems in specific areas (e.g., exploration of natural resources, communication networks, finance) to mathematical strategy that helps a leader who has to weigh many different issues and make decisions in a timely manner, and even to mathematical literacy that ensures quality control. The reader may notice that every article reflects the author's way of life and thinking, evident even in a single sentence. This book is an enlarged English edition of the Japanese book What Mathematics Can Do for You: Essays and Tips from Japanese Industry Leaders. For this edition we have invited contributions from three mathematicians who have been working to expand and strengthen the interaction between mathematics and industry. The role of mathematics is usually invisible when it is applied effectively and smoothly in science and technology, and mathematical strategy is often hidden when it is used properly and successfully. The business leaders of successful top Japanese companies are well aware of this invisible feature of mathematics in applications, aside from the intrinsic depth of mathematics. What Mathematics Can Do for You ultimately provides readers an opportunity to notice what is hidden but key to business strategy.
The entropy concept was developed and used by Shannon in 1948 as a measure of uncertainty in the context of information theory. In 1957 Jaynes made use of Shannon's entropy concept as a basis for estimation and inference in problems that are ill-suited for traditional statistical procedures. This volume consists of two sections. The first section contains papers developing econometric methods based on the entropy principle. An interesting array of applications is presented in the second section of the volume.
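Shannon's uncertainty measure referred to above is simple to state and compute. As a small illustration (not drawn from the volume), entropy is maximised by the uniform distribution, which is the intuition behind maximum-entropy estimation:

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log(p_i), with 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # drop zero cells by convention
    return float(-np.sum(p * np.log(p)) / np.log(base))

# A uniform distribution over four outcomes carries exactly 2 bits;
# any skewed distribution over the same support carries less.
h_uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])
h_skewed = shannon_entropy([0.7, 0.1, 0.1, 0.1])
```

Maximum-entropy methods in the Jaynes tradition choose, among all distributions consistent with the observed moment constraints, the one with the largest value of this functional.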
This book presents an extensive survey of the theory and empirics of international parity conditions which are critical to our understanding of the linkages between world markets and the movement of interest and exchange rates across countries. The book falls into three parts dealing with the theory, methods of econometric testing and existing empirical evidence. Although it is intended to provide a consensus view on the subject, the authors also make some controversial propositions, particularly on the purchasing power parity conditions.
The worlds of Wall Street and The City have always held a certain allure, but in recent years have left an indelible mark on the wider public consciousness and there has been a need to become more financially literate. The quantitative nature of complex financial transactions makes them a fascinating subject area for mathematicians of all types, whether for general interest or because of the enormous monetary rewards on offer. An Introduction to Quantitative Finance concerns financial derivatives - a derivative being a contract between two entities whose value derives from the price of an underlying financial asset - and the probabilistic tools that were developed to analyse them. The theory in the text is motivated by a desire to provide a suitably rigorous yet accessible foundation to tackle problems the author encountered whilst trading derivatives on Wall Street. The book combines an unusual blend of real-world derivatives trading experience and rigorous academic background. Probability provides the key tools for analysing and valuing derivatives. The price of a derivative is closely linked to the expected value of its pay-out, and suitably scaled derivative prices are martingales, fundamentally important objects in probability theory. The prerequisite for mastering the material is an introductory undergraduate course in probability. The book is otherwise self-contained and in particular requires no additional preparation or exposure to finance. It is suitable for a one-semester course, quickly exposing readers to powerful theory and substantive problems. The book may also appeal to students who have enjoyed probability and have a desire to see how it can be applied. Signposts are given throughout the text to more advanced topics and to different approaches for those looking to take the subject further.
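The link between a derivative's price and the expected value of its pay-out described above is usually introduced through a one-period binomial model. The following sketch is a standard textbook illustration, not an excerpt from the book, and the numerical parameters are invented:

```python
import math

def binomial_call_price(S, K, r, u, d):
    """One-period binomial price of a European call: the discounted
    expectation of the pay-out under the risk-neutral probability
    q = (e^r - d) / (u - d), under which the price is a martingale."""
    q = (math.exp(r) - d) / (u - d)          # risk-neutral up-probability
    payoff_up = max(S * u - K, 0.0)          # pay-out if the stock rises
    payoff_down = max(S * d - K, 0.0)        # pay-out if the stock falls
    return math.exp(-r) * (q * payoff_up + (1 - q) * payoff_down)

# Stock at 100, strike 100, 5% rate, up-factor 1.2, down-factor 0.8
price = binomial_call_price(S=100, K=100, r=0.05, u=1.2, d=0.8)
```

Note that the real-world probability of an up-move never appears: the price is an expectation under the risk-neutral measure, which is the "suitably scaled prices are martingales" property in miniature.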
This is the second of three volumes surveying the state of the art
in Game Theory and its applications to many and varied fields, in
particular to economics. The chapters in the present volume are
contributed by outstanding authorities, and provide comprehensive
coverage and precise statements of the main results in each area.
The applications include empirical evidence. The following topics
are covered: communication and correlated equilibria, coalitional
games and coalition structures, utility and subjective probability,
common knowledge, bargaining, zero-sum games, differential games,
and applications of game theory to signalling, moral hazard,
search, evolutionary biology, international relations, voting
procedures, social choice, public economics, politics, and cost
allocation. This handbook will be of interest to scholars in
economics, political science, psychology, mathematics and biology.
For more information on the Handbooks in Economics series, please
see our home page on http://www.elsevier.nl/locate/hes
This is the fourth volume of the Handbook of Econometrics. The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians, statisticians, and in advanced graduate econometrics courses.
A high school student can create deep Q-learning code to control her robot, without any understanding of the meaning of 'deep' or 'Q', or why the code sometimes fails. This book is designed to explain the science behind reinforcement learning and optimal control in a way that is accessible to students with a background in calculus and matrix algebra. A unique focus is algorithm design to obtain the fastest possible speed of convergence for learning algorithms, along with insight into why reinforcement learning sometimes fails. Advanced stochastic process theory is avoided at the start by substituting random exploration with more intuitive deterministic probing for learning. Once these ideas are understood, it is not difficult to master techniques rooted in stochastic control. These topics are covered in the second part of the book, starting with Markov chain theory and ending with a fresh look at actor-critic methods for reinforcement learning.
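The "Q" that the blurb alludes to is the state-action value table of Q-learning. As a hedged illustration only (the book's own algorithm designs are not reproduced here), tabular Q-learning on a toy deterministic chain, with a reward only at the far end, looks like this:

```python
import numpy as np

def q_learning_chain(n_states=5, episodes=500, alpha=0.5, gamma=0.9,
                     eps=0.1, seed=0):
    """Tabular Q-learning on a deterministic chain. Actions: 0 = left,
    1 = right; reward 1 only on reaching the last (terminal) state."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n_states, 2))
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # Epsilon-greedy action selection (random exploration)
            a = rng.integers(2) if rng.uniform() < eps else int(np.argmax(Q[s]))
            s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Temporal-difference update; terminal state has value 0
            target = r + gamma * np.max(Q[s2]) * (s2 < n_states - 1)
            Q[s, a] += alpha * (target - Q[s, a])
            s = s2
    return Q

Q = q_learning_chain()
policy = Q.argmax(axis=1)   # learned greedy policy
```

After training, the greedy policy moves right in every non-terminal state, and the values decay geometrically (by the factor gamma) with distance from the reward. The convergence rate of exactly this kind of update, and why it can be painfully slow, is a central topic the blurb points to.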
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference "Marshall-Olkin Distributions: Advances in Theory and Applications," held in Bologna on October 2-3, 2013.
Each chapter of Macroeconometrics is written by respected econometricians in order to provide useful information and perspectives for those who wish to apply econometrics in macroeconomics. The chapters are all written with clear methodological perspectives, making the virtues and limitations of particular econometric approaches accessible to a general readership familiar with applied macroeconomics. The real tensions in macroeconometrics are revealed in the critical comments, contributed by econometricians with alternative perspectives, that follow each chapter.
In recent years, as part of the increasing "informationization" of industry and the economy, enterprises have been accumulating vast amounts of detailed data such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on inter-firm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and "laws" akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THICCAPFA7, titled "New Approaches to the Analysis of Large-Scale Business and Economic Data," held in Tokyo, March 1-5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)-Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.
This book presents a theory of general dynamic economic equilibrium that develops the static theory of Walras and Pareto. The work builds an analytical model of the effective, current movement of an economic system, founded on the logic of individuals' changing programmes - a basis for discovering the laws of all types of endogenous and exogenous movements of the economy. Indeed, the model can be used to treat the typical problems of dynamic economics, by means of the author's method of variational dynamic analysis.
Tax policy questions may relate to specific problems, concerning perhaps the revenue implications of a particular tax, or they may involve an extensive analysis of the cost and redistributive effects of many taxes and transfer payments. This book is concerned with the ways in which tax policy design can be enhanced by the use of a behavioural tax microsimulation model capable of evaluating the effects of planned or actual tax reforms. An advantage of such a large-scale tax simulation model, which reflects the heterogeneity of the population and captures the details of the tax structure, is that it can examine detailed practical policy questions and can provide direct inputs into policy debates. After introducing behavioural models, the authors discuss the role of means testing, several hypothetical policy reforms, actual and proposed reforms and recent modelling developments. Tax Policy Design and Behavioural Microsimulation Modelling will be of interest to academics and researchers of economics, econometrics and public finance. It will also be useful reading for policymakers responsible for the formulation of taxation.
Two central problems in the pure theory of economic growth are analysed in this monograph: 1) the dynamic laws governing economic growth processes, 2) the kinematic and geometric properties of the set of solutions to the dynamic systems. With allegiance to rigor and an emphasis on the theoretical fundamentals of prototype mathematical growth models, the treatise is written in theorem-proof style. To keep the exposition orderly and as smooth as possible, the economic analysis has been separated from the purely mathematical issues, and hence the monograph is organized in two books. Regarding the scope and content of the two books, an "Introduction and Overview" has been prepared to offer both motivation and a brief account. The introduction is especially designed to give a recapitulation of the mathematical theory and results presented in Book II, which are used as the unifying mathematical framework in the analysis and exposition of the different economic growth models in Book I. Economists would probably prefer to go directly to Book I and proceed by consulting the mathematical theorems of Book II in confirming the economic theorems in Book I. Thereby, both the independence and interdependence of the economic and mathematical argumentations are respected.
Jean-Jacques Rousseau wrote in the Preface to his famous Discourse on Inequality that "I consider the subject of the following discourse as one of the most interesting questions philosophy can propose, and unhappily for us, one of the most thorny that philosophers can have to solve. For how shall we know the source of inequality between men, if we do not begin by knowing mankind?" (Rousseau, 1754). This citation of Rousseau appears in an article in Spanish where Dagum (2001), in the memory of whom this book is published, also cites Socrates, who said that the only useful knowledge is that which makes us better, and Seneca, who wrote that knowing what a straight line is, is not important if we do not know what rectitude is. These references are indeed a good illustration of Dagum's vast knowledge, which was clearly not limited to the field of Economics. For Camilo the first part of Rousseau's citation certainly justified his interest in the field of inequality, which was at the centre of his scientific preoccupations. It should however be stressed that for Camilo the second part of the citation represented a "solid argument in favor of giving macroeconomic foundations to microeconomic behavior" (Dagum, 2001). More precisely, "individualism and methodological holism complete each other in contributing to the explanation of individual and social behavior" (Dagum, 2001).
Nonlinear Time Series Analysis of Economic and Financial Data provides an examination of the flourishing interest that has developed in this area over the past decade. The constant theme throughout this work is that standard linear time series tools leave unexamined and unexploited economically significant features in frequently used data sets. The book comprises original contributions written by specialists in the field, and offers a combination of both applied and methodological papers. It will be useful to both seasoned veterans of nonlinear time series analysis and those searching for an informative panoramic look at front-line developments in the area.
The subject theory is important in finance, economics, investment strategies, the health sciences, environmental studies, industrial engineering, and other fields.
The basic characteristic of Modern Linear and Nonlinear Econometrics is that it presents a unified approach to modern linear and nonlinear econometrics in a concise and intuitive way. It covers four major parts of modern econometrics: linear and nonlinear estimation and testing, time series analysis, models with categorical and limited dependent variables, and, finally, a thorough analysis of linear and nonlinear panel data modeling. Distinctive features of this handbook are:
- A unified approach to both linear and nonlinear econometrics, integrating theory and practice in modern econometrics, with an emphasis on sound theoretical and empirical relevance and intuition.
- A focus on econometric and statistical methods for the analysis of linear and nonlinear processes in economics and finance, including computational methods and numerical tools.
- Completely worked-out empirical illustrations throughout, whose macroeconomic and microeconomic (household- and firm-level) data sets are available from the internet; these illustrations are taken from finance (e.g. CAPM and derivatives), international economics (e.g. exchange rates), innovation economics (e.g. patenting), business cycle analysis, monetary economics, housing economics, labor and educational economics (e.g. demand for teachers according to gender) and many others.
- Exercises added to each chapter, with a focus on the interpretation of results; several of these exercises involve the use of actual data that are typical of current empirical work and that are made available on the internet.
What also distinguishes Modern Linear and Nonlinear Econometrics is that every major topic has a number of examples, exercises or case studies. By this 'learning by doing' method the intention is to prepare readers to design, develop and successfully finish their own research and/or solve real-world problems.
This book provides a better understanding of how intellectual
property can improve economic and business performance. It focuses
on three particular issues: the valuation of patents, the transfer
of knowledge, and the management of innovation and intellectual
property. Scholars from leading worldwide institutions use
quantitative methods and advanced survey techniques to explore the
complex relationship between patents, innovation, venture capital
and scientific research.
During 1985-86, the acquisition editor for the humanities and social sciences division of Kluwer Academic Publishers in the Netherlands visited the University of Florida (where I was also visiting while on sabbatical leave from Wilfrid Laurier University as the McKethan-Matherly Senior Research Fellow) to discuss publishing plans of the faculty. He expressed a keen interest in publishing the proceedings of the conference of the Canadian Econometric Study Group (CESG) that was to be held the following year at WLU. This volume is the end product of his interest, endurance, and persistence. But for his persistence I would have given up on the project. Most of the papers (though not all) included in this volume are based on presentations at CESG conferences. In some cases scholars were invited to contribute to this volume where their research complemented that presented at these conferences, even though they were not conference participants. Since papers selected for presentation at the CESG conferences are generally the finished product of scholarly research and often under submission to refereed journals, it was not possible to publish the conference proceedings in their entirety. Accordingly it was decided, in consultation with the publisher, to invite a select list of authors to submit significant extensions of the papers they presented at the CESG conferences for inclusion in this volume. The editor wishes to express gratitude to all those authors who submitted their papers for evaluation by anonymous referees and for making revisions to conform to our editorial process.
The book addresses the problem of calculation of d-dimensional integrals (conditional expectations) in filter problems. It develops new methods of deterministic numerical integration, which can be used to speed up and stabilize filter algorithms. With the help of these methods, better estimates and predictions of latent variables are made possible in the fields of economics, engineering and physics. The resulting procedures are tested within four detailed simulation studies.
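One classical deterministic integration rule of the kind the book builds on is Gauss-Hermite quadrature, which replaces Monte Carlo draws with fixed nodes and weights for Gaussian expectations. This sketch is a generic illustration of the idea, not the book's specific procedures:

```python
import numpy as np

def gauss_hermite_expectation(f, mean, std, n_nodes=20):
    """Deterministic approximation of E[f(X)] for X ~ N(mean, std^2).
    The physicists' Gauss-Hermite nodes (weight e^{-t^2}) are mapped to
    the Gaussian density via x = mean + sqrt(2)*std*t."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    x = mean + np.sqrt(2.0) * std * nodes
    return float(np.sum(weights * f(x)) / np.sqrt(np.pi))

# E[X^2] for X ~ N(0, 1) is exactly 1; the rule recovers it, since
# n-node Gauss-Hermite is exact for polynomials up to degree 2n - 1.
ex2 = gauss_hermite_expectation(lambda x: x ** 2, mean=0.0, std=1.0)
```

Because the nodes are fixed, the approximation is free of simulation noise, which is the stabilising property that matters inside iterated filter recursions.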
This book contains contributions from friends of Ben Stevens,
remembering and celebrating his life and his work. Following his
untimely death, a set of special sessions was organized for the
program of the November 1998 meetings of the Regional Science
Association International, held in Santa Fe, New Mexico.
"Economics, Econometrics and The LINK" is a collection of scholarly contributions by leading scholars from the U.S., Europe, and Asia dealing with issues of economics and econometrics. The book contains a learned and erudite exposition of macroeconomics and macroeconomic modeling including national, sectoral, issues exchange rate, environment, international price competitiveness and international linkages. It presents a comprehensive perspective of econometric modeling - country-specific, sector-specific and issue-specific. The volume is a tribute to the work of Lawrence R. Klein from all his friends who share a common agenda, viz. to relate the study of economics to the studies of mankind. |