Books > Business & Economics > Economics > Econometrics
- Up-to-date coverage of most micro-econometric topics; first half parametric, second half semi- (non-) parametric
- Many empirical examples and tips on applying econometric theories to data
- Essential ideas and steps shown for most estimators and tests; well suited for both applied and theoretical readers
This book presents an empirical investigation into the relationship between companies' short-term responses to capital and labor market frictions and their performance. Two different kinds of performance measures are considered, namely innovation performance and firm performance. The author focuses on two major topics: first, the relation between innovation performance and the use of trade credit; second, the relation between firm performance and the use of temporary employment. The use of in-depth firm-level data and state-of-the-art microeconometric methods provides the scientific rigor needed to answer questions currently confronting many companies in different economies.
This volume discusses the latest techniques and their economic applications for modern industries such as computing, pharmaceuticals, banking and manufacturing. These industries are most important for a growing economy. Both econometric and mathematical programming techniques are analyzed so as to develop a synthetic approach. The industrial applications not only emphasize the various aspects of R&D spending, advertising expenditure and imperfect market structures, but also assess the economic benefits of measuring specific performance parameters in the light of policy reforms adopted in a growing economy.
To derive rational and convincing solutions to practical decision-making problems in complex and hierarchical human organizations, the decision-making problems are formulated as relevant mathematical programming problems, which are solved by developing optimization techniques that exploit characteristics or structural features of the formulated problems. In particular, for resolving conflict in decision making in hierarchical managerial or public organizations, the multi-level formulation of mathematical programming problems has often been employed together with the solution concept of Stackelberg equilibrium. However, we conceive that a pair of the conventional formulation and the solution concept is not always sufficient to cope with the large variety of decision-making situations in actual hierarchical organizations. The following issues should be taken into consideration in the expression and formulation of decision-making problems. In the formulation of mathematical programming problems, it is tacitly supposed that decisions are made by a single person, while game theory deals with the economic behavior of multiple decision makers with fully rational judgment. Because two-level mathematical programming problems are interpreted as static Stackelberg games, multi-level mathematical programming is relevant to noncooperative game theory; in conventional multi-level mathematical programming models employing the solution concept of Stackelberg equilibrium, it is assumed that there is no communication among decision makers, or that they do not make any binding agreements even if such communication exists. However, for decision-making problems in, for example, decentralized large firms with divisional independence, it is quite natural to suppose that there exists communication and some cooperative relationship among the decision makers.
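To make the Stackelberg structure described above concrete, here is a minimal sketch (not taken from the book) of a discrete leader-follower game: the leader chooses first, anticipating the follower's best response. All choice sets and payoff functions below are invented purely for illustration.

```python
# Minimal sketch (not from the book): a discrete Stackelberg game.
# The leader picks x, anticipating that the follower best-responds with y.
# All payoffs below are invented purely for illustration.

leader_choices = [0, 1, 2]
follower_choices = [0, 1, 2]

def leader_payoff(x, y):
    # hypothetical leader objective
    return 3 * x + 2 * y - x * y

def follower_payoff(x, y):
    # hypothetical follower objective, depending on the leader's choice
    return 4 * y - (y - x) ** 2

def follower_best_response(x):
    # the follower observes x and maximizes its own payoff
    return max(follower_choices, key=lambda y: follower_payoff(x, y))

def stackelberg_equilibrium():
    # the leader maximizes its payoff, anticipating the follower's reaction
    best_x = max(leader_choices,
                 key=lambda x: leader_payoff(x, follower_best_response(x)))
    return best_x, follower_best_response(best_x)

if __name__ == "__main__":
    x_star, y_star = stackelberg_equilibrium()
    print("leader:", x_star, "follower:", y_star,
          "leader payoff:", leader_payoff(x_star, y_star))
```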
A host of internationally recognized experts have been brought together to examine one of the most important sectors in today's world economy, the information sector. The study utilizes the most recent quantitative and econometric research on the media and information sectors and their markets. Most of the work presented is from two international conferences and other invited conferences.
This book presents recent research on robustness in econometrics. Robust data processing techniques - i.e., techniques that yield results minimally affected by outliers - and their applications to real-life economic and financial situations are the main focus of this book. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-to-day data, we often encounter outliers that do not reflect the long-term economic trends, e.g., unexpected and abrupt fluctuations. As such, it is important to develop robust data processing techniques that can accommodate these fluctuations.
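As a rough illustration of what "minimally affected by outliers" means in practice, the following sketch (not from the book) contrasts an ordinary least-squares slope with a Theil-Sen slope, the median of all pairwise slopes, on simulated data containing one abrupt outlier.

```python
# Minimal sketch (not from the book): how one outlier moves an OLS slope
# while a median-based (Theil-Sen) slope stays close to the true value.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, size=x.size)
y[-1] += 40.0  # one abrupt outlier, e.g. an unexpected fluctuation

# OLS slope via the usual closed form
ols_slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Theil-Sen: median of slopes over all pairs of observations
i, j = np.triu_indices(x.size, k=1)
pair_slopes = (y[j] - y[i]) / (x[j] - x[i])
ts_slope = np.median(pair_slopes)

print(f"OLS slope:       {ols_slope:.3f}")
print(f"Theil-Sen slope: {ts_slope:.3f}  (true slope is 2.0)")
```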
The estimation of the effects of treatments, endogenous variables representing everything from individual participation in a training program to national participation in a World Bank loan program, has occupied much of the theoretical and applied econometric research literature in recent years. This volume brings together a diverse collection of papers on this important topic by leaders in the field from around the world. Some of the papers offer new theoretical contributions on various estimation techniques and others provide timely empirical applications illustrating the benefits of these and other methods. All of the papers share two common themes. First, as different estimators estimate different treatment effect parameters, it is vital to know what you are estimating and to whom the estimate applies. Second, as different estimators require different identification assumptions, it is crucial to understand the assumptions underlying each estimator. In empirical applications, the researcher must also make the case that the assumptions hold based on the available data and the institutional context. The theoretical contributions range over a variety of estimators drawn from both statistics and econometrics, including matching and other non-parametric methods, panel methods, instrumental variables, and methods based on hazard rate models and principal stratification, and they draw upon both the Bayesian and classical statistical traditions. The empirical contributions focus mainly on the evaluation of active labor market programs in Europe and the United States, but also examine the effect of parenthood on wages and of the number of children on child health.
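As one concrete example of the estimators mentioned above, here is a minimal sketch (not from the volume) of a one-nearest-neighbour matching estimator of the average treatment effect on the treated (ATT), using a single covariate; the data-generating process and the constant treatment effect of 2 are invented for illustration.

```python
# Minimal sketch (not from the volume): one-nearest-neighbour matching on a
# single covariate, estimating the average treatment effect on the treated.
# All data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                                     # covariate
d = (rng.random(n) < 1 / (1 + np.exp(-x))).astype(int)     # selection on x
y = 1.0 + 0.5 * x + 2.0 * d + rng.normal(0.0, 1.0, n)      # true effect = 2

treated = np.where(d == 1)[0]
controls = np.where(d == 0)[0]

# For each treated unit, find the control with the closest covariate value
matched = controls[np.argmin(np.abs(x[treated][:, None] - x[controls][None, :]),
                             axis=1)]
att = np.mean(y[treated] - y[matched])
print(f"matching estimate of the ATT: {att:.2f} (true effect is 2.0)")
```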
Scenario planning comprises the principles, methods, and techniques for looking into the future and trying to anticipate and influence what is to come next. This book provides students and line managers in organizations with the means to create better scenarios and to use them to create winning business strategies. The purpose is to shed new light on scenarios and scenario-like thinking in organizations for managers at every level within a company. The book covers scenarios such as economic outlooks, political environments, acquisitions, downsizing, and more.
The conduct of most of social science occurs outside the laboratory. Such studies in field science explore phenomena that cannot for practical, technical, or ethical reasons be explored under controlled conditions. These phenomena cannot be fully isolated from their environment or investigated by manipulation or intervention. Yet measurement, including rigorous or clinical measurement, does provide analysts with a sound basis for discerning what occurs under field conditions, and why. Science Outside the Laboratory explores the state of measurement theory, its reliability, and the role expert judgment plays in field investigations from the perspective of the philosophy of science. Its discussion of the problems of passive observation, the calculus of observation, the two-model problem, and model-based consensus uses illustrations drawn primarily from economics. The treatment clarifies the extent to which measurement provides valid information about objects and events in field sciences, but also has implications for measurement in the laboratory.
This revised edition of Ryuzo Sato's seminal work illustrates the timeless nature of his contribution to economics. It is as pertinent today as when it was originally conceived, over twenty years ago. This book deals with a variety of topics in economic theory, ranging from the analysis of production functions to the general recoverability problem of optimal dynamic behavior. They are unified in the theme of 'transformation and invariance'. This book demonstrates the first application of the Lie theory to modern economics and provides a revealing analysis of market behavior and economic invariance. This book will be of interest to scholars of industrial economics, innovation, econometrics and microeconomics.
This book investigates several competing forecasting models for interest rates, financial returns, and realized volatility, addresses the usefulness of nonlinear models for hedging purposes, and proposes new computational techniques to estimate financial processes.
Info-metrics is the science of modeling, reasoning, and drawing inferences under conditions of noisy and insufficient information. It is at the intersection of information theory, statistical inference, and decision-making under uncertainty. It plays an important role in helping make informed decisions even when there is inadequate or incomplete information because it provides a framework to process available information with minimal reliance on assumptions that cannot be validated. In this pioneering book, Amos Golan, a leader in info-metrics, focuses on unifying information processing, modeling and inference within a single constrained optimization framework. Foundations of Info-Metrics provides an overview of modeling and inference, rather than a problem-specific model, and progresses from the simple premise that information is often insufficient to provide a unique answer for decisions we wish to make. Each decision, or solution, is derived from the available input information along with a choice of inferential procedure. The book contains numerous multidisciplinary applications and case studies, which demonstrate the simplicity and generality of the framework in real world settings. Examples include initial diagnosis at an emergency room, optimal dose decisions, election forecasting, network and information aggregation, weather pattern analyses, portfolio allocation, strategy inference for interacting entities, incorporation of prior information, option pricing, and modeling an interacting social system. Graphical representations illustrate how results can be visualized while exercises and problem sets facilitate extensions. This book is designed to be accessible for researchers, graduate students, and practitioners across the disciplines.
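The constrained-optimization idea behind info-metrics can be illustrated with the classic die example (a sketch not taken from the book): recover a distribution over the six faces from a single moment constraint, the observed mean, by maximizing entropy. The observed mean of 4.5 is an invented number, and the exponential-family solution is found by simple bisection on the Lagrange multiplier.

```python
# Small sketch (not from the book): maximum-entropy recovery of a die's face
# probabilities from one moment constraint (an observed mean of 4.5, an
# invented number). The solution is proportional to exp(lam * face).
import numpy as np

faces = np.arange(1, 7)
target_mean = 4.5

def implied_mean(lam):
    w = np.exp(lam * faces)
    p = w / w.sum()
    return p @ faces

# The implied mean is increasing in lam, so bisection finds the multiplier.
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if implied_mean(mid) < target_mean:
        lo = mid
    else:
        hi = mid

w = np.exp(lo * faces)
p = w / w.sum()
print("max-entropy probabilities:", np.round(p, 3), "mean:", round(p @ faces, 3))
```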
Econometric theory, as presented in textbooks and the econometric literature generally, is a somewhat disparate collection of findings. In essence it is a set of demonstrated results that accumulates over time, each logically based on a specific set of axioms or assumptions, yet at any given moment these results inevitably form an incomplete, rather than a finished, body of knowledge. The practice of econometric theory consists of selecting from, applying, and evaluating this literature, so as to test its applicability and range. The creation, development, and use of computer software has led applied economic research into a new age. This book describes the history of econometric computation from 1950 to the present day, based upon an interactive survey involving the collaboration of the many econometricians who have designed and developed this software. It identifies each of the econometric software packages that are made available to and used by economists and econometricians worldwide.
Empirical Studies In Applied Economics presents nine previously unpublished analyses in monograph form. In this work, the topics are presented so that each chapter stands on its own. The emphasis is on the applications but attention is also given to the econometric and statistical issues for advanced readers. Econometric methods include multivariate regression analysis, limited dependent variable analysis, and other maximum likelihood techniques. The empirical topics include the measurement of competition and market power in natural gas transportation markets and in the pharmaceutical market for chemotherapy drugs. Additional topics include an empirical analysis of NFL football demand, the accuracy of an econometric model for mail demand, and the allocation of police services in rural Alaska. Other chapters consider the valuation of technology patents and the determination of patent scope, duration, and reasonable royalty, and the reaction of financial markets to health scares in the fast-food industry. Finally, two chapters are devoted to the theory and testing of synergistic health effects from the combined exposure to asbestos and cigarette smoking.
This edited book contains several state-of-the-art papers devoted to econometrics of risk. Some papers provide theoretical analysis of the corresponding mathematical, statistical, computational, and economical models. Other papers describe applications of the novel risk-related econometric techniques to real-life economic situations. The book presents new methods developed just recently, in particular, methods using non-Gaussian heavy-tailed distributions, methods using non-Gaussian copulas to properly take into account dependence between different quantities, methods taking into account imprecise ("fuzzy") expert knowledge, and many other innovative techniques. This versatile volume helps practitioners to learn how to apply new techniques of econometrics of risk, and researchers to further improve the existing models and to come up with new ideas on how to best take into account economic risks.
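As a small illustration of why heavy-tailed modelling matters for risk, the following sketch (not from the book) compares a 1% Value-at-Risk computed from a normal fit with one computed from a Student-t fit, on simulated heavy-tailed returns.

```python
# Minimal sketch (not from the book): the 1% Value-at-Risk of simulated
# heavy-tailed returns under a normal fit versus a Student-t fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
returns = 0.01 * rng.standard_t(df=3, size=5000)   # simulated daily returns

# Fit both candidate distributions to the same data
mu, sigma = stats.norm.fit(returns)
df, loc, scale = stats.t.fit(returns)

var_normal = -stats.norm.ppf(0.01, mu, sigma)
var_student = -stats.t.ppf(0.01, df, loc, scale)
empirical = -np.quantile(returns, 0.01)

print(f"1% VaR, normal fit:    {var_normal:.4f}")
print(f"1% VaR, Student-t fit: {var_student:.4f}")
print(f"1% VaR, empirical:     {empirical:.4f}")
```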
This book proposes new tools and models to price options, assess market volatility, and investigate the market efficiency hypothesis. In particular, it considers new models for hedge funds and derivatives of derivatives, and adds to the literature of testing for the efficiency of markets both theoretically and empirically.
Spatial Econometrics is a rapidly evolving field born from the joint efforts of economists, statisticians, econometricians and regional scientists. The book provides the reader with a broad view of the topic by including both methodological and application papers. Indeed the application papers relate to a number of diverse scientific fields ranging from hedonic models of house pricing to demography, from health care to regional economics, from the analysis of R&D spillovers to the study of retail market spatial characteristics. Particular emphasis is given to regional economic applications of spatial econometrics methods with a number of contributions specifically focused on the spatial concentration of economic activities and agglomeration, regional paths of economic growth, regional convergence of income and productivity and the evolution of regional employment. Most of the papers appearing in this book were solicited from the International Workshop on Spatial Econometrics and Statistics held in Rome (Italy) in 2006.
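A basic building block behind many of these spatial applications is a measure of spatial autocorrelation. The sketch below (not drawn from the book) computes Moran's I on an invented 4x4 lattice of regional values with row-standardized rook-contiguity weights.

```python
# Brief sketch (not from the book): Moran's I for spatial autocorrelation on a
# small invented 4x4 lattice of regional values (rook-contiguity weights).
import numpy as np

side = 4
values = np.array([[1, 2, 2, 3],
                   [2, 3, 3, 4],
                   [2, 3, 4, 5],
                   [3, 4, 5, 6]], dtype=float).ravel()   # invented data
n = values.size

# Rook-contiguity weight matrix: regions sharing an edge are neighbours
W = np.zeros((n, n))
for r in range(side):
    for c in range(side):
        i = r * side + c
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < side and 0 <= cc < side:
                W[i, rr * side + cc] = 1.0
W = W / W.sum(axis=1, keepdims=True)          # row-standardize

z = values - values.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I: {moran_i:.3f}  (values near +1 mean similar neighbours cluster)")
```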
The Handbook is a definitive reference source and teaching aid for
This book is a collection of selected papers presented at the Annual Meeting of the European Academy of Management and Business Economics (AEDEM), held at the Faculty of Economics and Business of the University of Barcelona, 5-7 June 2012. This edition of the conference was presented with the slogan "Creating new opportunities in an uncertain environment". There are different ways of assessing uncertainty in management, but this book mainly focuses on soft computing theories and their role in assessing uncertainty in a complex world. The present book gives a comprehensive overview of general management topics and discusses some of the most recent developments in all areas of business and management, including management, marketing, business statistics, innovation and technology, finance, sports and tourism. This book may be of great interest to anyone working in the area of management and business economics and may be especially useful for scientists and graduate students doing research in these fields.
This book proposes new methods to value equity and to model the Markowitz efficient frontier using Markov switching models, and provides new evidence and solutions to capture the persistence observed in stock returns across developed and emerging markets.
The present book deals with coalition games in which expected pay-offs are only vaguely known. In fact, this idea about the vagueness of expectations appears to be adequate to real situations, in which coalitional bargaining anticipates a proper realization of the game with strategic behaviour by the players. The vagueness present in the expectations of profits is modelled by means of the theory of fuzzy sets and fuzzy quantities. The fuzziness of decision making and strategic behaviour attracts the attention of mathematicians, and its particular aspects are discussed in several works. One can mention in this respect in particular the book "Fuzzy and Multiobjective Games for Conflict Resolution" by Ichiro Nishizaki and Masatoshi Sakawa (referred to below as [43]), which has recently appeared in the series Studies in Fuzziness and Soft Computing published by Physica-Verlag, in which the present book is also appearing. That book, together with the one you carry in your hands, forms in a certain sense a complementary pair. They present detailed views on the two main aspects forming the core of game theory: strategic (mostly 2-person) games, and coalitional (or cooperative) games. As a pair they offer quite a wide overview of fuzzy set theoretical approaches to game theoretical models of human behaviour.
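To give a flavour of coalition games with vaguely known pay-offs, here is a compact sketch (not taken from the book, and much cruder than its fuzzy-set treatment): a three-player game whose coalition worths are intervals, with the Shapley value computed separately on the lower and upper bounds so that each player receives an interval. All numbers are invented.

```python
# Compact sketch (not from the book): Shapley values for a 3-player coalition
# game whose worths are intervals [low, high], a crude stand-in for vaguely
# known (fuzzy) expected payoffs. All numbers are invented.
from itertools import permutations

players = ("A", "B", "C")

# characteristic function: coalition (frozenset) -> (low, high) worth
v = {
    frozenset(): (0, 0),
    frozenset("A"): (1, 2), frozenset("B"): (1, 3), frozenset("C"): (2, 3),
    frozenset("AB"): (4, 6), frozenset("AC"): (5, 7), frozenset("BC"): (5, 8),
    frozenset("ABC"): (9, 12),
}

def shapley(bound):
    # average marginal contribution over all orderings, on one bound
    value = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            value[p] += v[with_p][bound] - v[coalition][bound]
            coalition = with_p
    return {p: value[p] / len(orders) for p in players}

low, high = shapley(0), shapley(1)
for p in players:
    print(f"player {p}: Shapley value in [{low[p]:.2f}, {high[p]:.2f}]")
```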
... provide models that could be used by do-it-yourselfers and also can be used to provide understanding of the background issues so that one can do a better job of working with the (proprietary) algorithms of the software vendors. In this book we strive to provide models that capture many of the details faced by firms operating in a modern supply chain, but we stop short of proposing models for economic analysis of the entire multi-player chain. In other words, we produce models that are useful for planning within a supply chain rather than models for planning the supply chain. The usefulness of the models is enhanced greatly by the fact that they have been implemented using computer modeling languages. Implementations are shown in Chapter 7, which allows solutions to be found using a computer. A reasonable question is: why write the book now? It is a combination of opportunities that have recently become available. The availability of modeling languages and computers provides the opportunity to make practical use of the models that we develop. Meanwhile, software companies are providing software for optimized production planning in a supply chain. The opportunity to make use of such software gives rise to a need to understand some of the issues in computational models for optimized planning. This is best done by considering simple models and examples.
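As a small illustration of the kind of model the book implements in modeling languages, the following sketch (not drawn from the book) states a two-product production planning problem as a linear program and solves it with scipy; all capacities, demands and unit profits are invented.

```python
# Small sketch (not from the book): a two-product, single-period production
# planning LP solved with a generic solver library. All capacities, demands
# and unit profits are invented for illustration.
from scipy.optimize import linprog

profit = [40.0, 30.0]            # profit per unit of product 1 and 2
machine_hours = [2.0, 1.0]       # hours needed per unit
labour_hours = [1.0, 1.5]        # hours needed per unit

# maximize profit  <=>  minimize its negative
c = [-p for p in profit]
A_ub = [machine_hours, labour_hours]
b_ub = [100.0, 90.0]             # available machine and labour hours
bounds = [(0, 60), (0, 50)]      # demand caps per product

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("production plan:", res.x, "profit:", -res.fun)
```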
Floro Ernesto Caroleo and Francesco Pastore. This book was conceived to collect selected essays presented at the session on "The Labour Market Impact of the European Union Enlargements. A New Regional Geography of Europe?" of the XXII Conference of the Italian Association of Labour Economics (AIEL). The session aimed to stimulate the debate on the continuity or fracture of regional patterns of development and employment in old and new European Union (EU) regions. In particular, we asked whether, and how, the causes of the emergence and evolution of regional imbalances in the new EU members of Central and Eastern Europe (CEE) differ from those in the old EU members. Several contributions in this book suggest that a factor common to all backward regions, often neglected in the literature, is to be found in their higher than average degree of structural change or, more precisely, in the hardship they experience in coping with the process of structural change typical of all advanced economies. In the new EU members of CEE, structural change is still a consequence of the continuing process of transition from central planning to a market economy, but also of what Fabrizio et al. (2009) call the "second transition," namely that related to the run-up to and entry into the EU.
- Unique blend of asymptotic theory and small-sample practice through simulation experiments and data analysis
- Novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines
- Exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order
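As a brief illustration of the smoothing splines discussed above, the sketch below (not from the book) fits a cubic smoothing spline to noisy simulated data with scipy; the smoothing parameter is chosen by hand rather than by the data-driven criteria a full treatment would develop.

```python
# Short sketch (not from the book): a cubic smoothing spline fitted to noisy
# simulated data; the smoothing parameter s is picked by hand here, not by a
# data-driven criterion.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + rng.normal(0.0, 0.3, size=x.size)   # true mean is sin(x)

spline = UnivariateSpline(x, y, k=3, s=len(x) * 0.3 ** 2)  # cubic, hand-tuned s
fitted = spline(x)

rmse = np.sqrt(np.mean((fitted - np.sin(x)) ** 2))
print(f"RMSE of the spline fit against the true mean function: {rmse:.3f}")
```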
This book proposes new methods to build optimal portfolios and to analyze market liquidity and volatility under market microstructure effects, as well as new financial risk measures using parametric and non-parametric techniques. In particular, it investigates the market microstructure of foreign exchange and futures markets.
You may like...
- Operations And Supply Chain Management, David Collier, James Evans (Hardcover)
- Introductory Econometrics - A Modern…, Jeffrey Wooldridge (Hardcover)
- Financial and Macroeconomic…, Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,567)
- Agent-Based Modeling and Network…, Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)
- Design and Analysis of Time Series…, Richard McCleary, David McDowall, … (Hardcover, R3,286)
- Spatial Analysis Using Big Data…, Yoshiki Yamagata, Hajime Seya (Paperback, R3,021)