The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of country size reflecting the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration by country size, using a data set of 218 countries, 45 of which are European.
Two central problems in the pure theory of economic growth are analysed in this monograph: 1) the dynamic laws governing the economic growth processes, 2) the kinematic and geometric properties of the set of solutions to the dynamic systems. With allegiance to rigor and the emphasis on the theoretical fundamentals of prototype mathematical growth models, the treatise is written in the theorem-proof style. To keep the exposition orderly and as smooth as possible, the economic analysis has been separated from the purely mathematical issues, and hence the monograph is organized in two books. Regarding the scope and content of the two books, an "Introduction and Overview" has been prepared to offer both motivation and a brief account. The introduction is especially designed to give a recapitulation of the mathematical theory and results presented in Book II, which are used as the unifying mathematical framework in the analysis and exposition of the different economic growth models in Book I. Economists would probably prefer to go directly to Book I and proceed by consulting the mathematical theorems of Book II in confirming the economic theorems in Book I. Thereby, both the independence and interdependence of the economic and mathematical lines of argument are respected.
Over the last decade or so, applied general equilibrium models have rapidly become a major tool for policy advice on issues regarding allocation and efficiency, most notably taxes and tariffs. This reflects the power of the general equilibrium approach to allocative questions and the capability of today's applied models to come up with realistic answers. However, it by no means implies that the theoretical, practical and empirical problems faced by researchers in applied modelling have all been solved in a satisfactory way. Rather, a promising field of research has been opened up, inviting theorists and practitioners to further explore and exploit its potential. The state of the art in applied general equilibrium modelling is reflected in this volume. The introductory Chapter (Part I) evaluates the use of economic modelling to address policy questions, and discusses the advantages and disadvantages of applied general equilibrium models. Three substantive issues are dealt with in Chapters 2-8: Tax Reform and Capital (Part II), Intertemporal Aspects and Expectations (Part III), and Taxes and the Labour Market (Part IV). While all parts contain results relevant for economic policy, it is clear that theory and applications for these areas are in different stages of development. We hope that this book will bring inspiration, insight and information to researchers, students and policy advisors.
Gini's mean difference (GMD) was first introduced by Corrado Gini in 1912 as an alternative measure of variability. GMD and the parameters which are derived from it (such as the Gini coefficient or the concentration ratio) have been in use in the area of income distribution for almost a century. In practice, the use of GMD as a measure of variability is justified whenever the investigator is not ready to impose, without questioning, the convenient world of normality. This makes the GMD of critical importance in the complex research of statisticians, economists, econometricians, and policy makers. This book focuses on imitating analyses that are based on variance by replacing variance with the GMD and its variants. In this way, the text showcases how almost everything that can be done with the variance as a measure of variability, can be replicated by using Gini. Beyond this, there are marked benefits to utilizing Gini as opposed to other methods. One of the advantages of using Gini methodology is that it provides a unified system that enables the user to learn about various aspects of the underlying distribution. It also provides a systematic method and a unified terminology. Using Gini methodology can reduce the risk of imposing assumptions that are not supported by the data on the model. With these benefits in mind the text uses the covariance-based approach, though applications to other approaches are mentioned as well.
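As a minimal sketch (the function name and data here are invented for illustration), GMD can be computed directly as the average absolute difference over all distinct pairs of observations, and set beside the classical variance for comparison:

```python
import itertools
import statistics

def gini_mean_difference(xs):
    """Gini's mean difference: the average absolute difference
    over all distinct pairs of observations."""
    pairs = list(itertools.combinations(xs, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(gini_mean_difference(data))   # GMD of the sample
print(statistics.variance(data))    # classical (sample) variance, for comparison
```

Unlike the variance, GMD involves no squared deviations from a mean, which is one reason it remains informative when normality is not assumed.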
Financial Asset Pricing Theory offers a comprehensive overview of the classic and the current research in theoretical asset pricing. Asset pricing is developed around the concept of a state-price deflator which relates the price of any asset to its future (risky) dividends and thus incorporates how to adjust for both time and risk in asset valuation. The willingness of any utility-maximizing investor to shift consumption over time defines a state-price deflator which provides a link between optimal consumption and asset prices that leads to the Consumption-based Capital Asset Pricing Model (CCAPM). A simple version of the CCAPM cannot explain various stylized asset pricing facts, but these asset pricing 'puzzles' can be resolved by a number of recent extensions involving habit formation, recursive utility, multiple consumption goods, and long-run consumption risks. Other valuation techniques and modelling approaches (such as factor models, term structure models, risk-neutral valuation, and option pricing models) are explained and related to state-price deflators. The book will serve as a textbook for an advanced course in theoretical financial economics in a PhD or a quantitative Master of Science program. It will also be a useful reference book for researchers and finance professionals. The presentation in the book balances formal mathematical modelling and economic intuition and understanding. Both discrete-time and continuous-time models are covered. The necessary concepts and techniques concerning stochastic processes are carefully explained in a separate chapter so that only limited previous exposure to dynamic finance models is required.
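As a sketch of the central relation described above (symbols chosen here for illustration, not necessarily the book's own notation), a state-price deflator \(\zeta\) links any asset's current price to its future price and dividend, and the consumption-based specialization ties the deflator to marginal utility:

```latex
P_t \;=\; \mathrm{E}_t\!\left[\frac{\zeta_{t+1}}{\zeta_t}\,\bigl(P_{t+1} + D_{t+1}\bigr)\right],
\qquad
\zeta_t \;=\; e^{-\delta t}\, u'(c_t),
```

where \(D_{t+1}\) is the dividend, \(\delta\) the subjective time-preference rate, and \(u'(c_t)\) the investor's marginal utility of optimal consumption; the second equation is the link between optimal consumption and asset prices that leads to the CCAPM.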
The interaction between mathematicians and statisticians has proved to be an effective approach to the analysis of insurance and financial problems, particularly from an operational perspective. The MAF2006 conference, held at the University of Salerno in 2006, had precisely this purpose, and the collection published here gathers some of the papers presented at the conference and subsequently revised for this volume. They cover a wide variety of subjects in the insurance and financial fields.
This is an unusual book in that it contains a great many formulas; it is a blend of monograph, textbook, and handbook. It is intended for students and researchers who need quick access to useful formulas appearing in the linear regression model and related matrix theory. This is not a regular textbook: it is supporting material for courses given in linear statistical models. Such courses are extremely common at universities with quantitative statistical analysis programs.
A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
A comprehensive, up-to-date textbook on nonparametric methods for students and researchers Until now, students and researchers in nonparametric and semiparametric statistics and econometrics have had to turn to the latest journal articles to keep pace with these emerging methods of economic analysis. Nonparametric Econometrics fills a major gap by gathering together the most up-to-date theory and techniques and presenting them in a remarkably straightforward and accessible format. The empirical tests, data, and exercises included in this textbook help make it the ideal introduction for graduate students and an indispensable resource for researchers. Nonparametric and semiparametric methods have attracted a great deal of attention from statisticians in recent decades. While the majority of existing books on the subject operate from the presumption that the underlying data is strictly continuous in nature, more often than not social scientists deal with categorical data-nominal and ordinal-in applied settings. The conventional nonparametric approach to dealing with the presence of discrete variables is acknowledged to be unsatisfactory. This book is tailored to the needs of applied econometricians and social scientists. Qi Li and Jeffrey Racine emphasize nonparametric techniques suited to the rich array of data types-continuous, nominal, and ordinal-within one coherent framework. They also emphasize the properties of nonparametric estimators in the presence of potentially irrelevant variables. Nonparametric Econometrics covers all the material necessary to understand and apply nonparametric methods for real-world problems.
In the production and service sectors we often come across situations where female workers remain largely overshadowed by males, both in terms of wages and productivity. Men are generally assigned jobs that require more physical work, while 'less' strenuous jobs are allocated to women. However, the gender dimension of the labor process in India's service sector has remained relatively unexplored. There are certain activities in the service sector where female workers are more suitable than males. Service sector activities are usually divided into Own Account Enterprises (OAE) and Establishments. In this work, an attempt has been made to separate the productivity of female labor from that of male labor on the basis of both partial and complete separability models. An estimate has also been made of the female labor supply function. The results show a downward trend for female participation in both Own Account Enterprises and Establishments: the higher the female shadow wage, the lower their supply. This lends support to the supposition that female labor participation is a type of "distress supply" rather than a positive indicator of women's empowerment. Analysis of National Sample Survey Organisation (NSSO) data indicates that in all sectors women are generally paid less than men. A micro-econometric study reveals that even in firms that employ solely female labor, the incidence of full-time employment is deplorably low. It is this feature that results in women workers' lower earnings and their deprivation.
This volume is centered around the issue of market design and the resulting market dynamics. The economic crisis of 2007-2009 has once again highlighted the importance of a proper design of market protocols and institutional details for economic dynamics and macroeconomics. Papers in this volume capture institutional details of particular markets, behavioral details of agents' decision making, as well as spillovers between markets and effects on the macroeconomy. Computational methods are used to replicate and understand market dynamics emerging from interaction of heterogeneous agents, and to develop models that have predictive power for complex market dynamics. Finally, treatments of overlapping generations models and differential games with heterogeneous actors are provided.
This is an introduction to time series that emphasizes methods and analysis of data sets. The logic and tools of model-building for stationary and non-stationary time series are developed and numerous exercises, many of which make use of the included computer package, provide the reader with ample opportunity to develop skills. Statisticians and students will learn the latest methods in time series and forecasting, along with modern computational models and algorithms.
Microsimulation models provide an exciting new tool for analysing the distributional impact and cost of government policy changes. They can also be used to analyse the current or future structure of society. This volume contains papers describing new developments at the frontiers of microsimulation modelling, and draws upon experiences in a wide range of countries. Some papers aim to share with other modellers, experience gained in designing and running microsimulation models and their use in government policy formulation. They also examine issues at the frontiers of the discipline, such as how to include usage of health, education and welfare services in models. Other chapters focus upon describing the innovative new approaches being taken in dynamic microsimulation modelling. They describe some of the policy applications for which dynamic models are being used in Europe, Australia and New Zealand. Topics covered include retirement income modelling, pension reform, the behavioural impact of tax changes, child care demand, and the inclusion of government services within models. Attention is also given to validating the results of models and estimating their statistical reliability.
This is the first textbook designed to teach statistics to students in aviation courses. All examples and exercises are grounded in an aviation context, including flight instruction, air traffic control, airport management, and human factors. Structured in six parts, the text covers the key foundational topics in descriptive and inferential statistics, including hypothesis testing, confidence intervals, z and t tests, correlation, regression, ANOVA, and chi-square. In addition, this book promotes both procedural knowledge and conceptual understanding. Detailed, guided examples are presented from the perspective of conducting a research study. Each analysis technique is clearly explained, enabling readers to understand, carry out, and report results correctly. Students are further supported by a range of pedagogical features in each chapter, including objectives, a summary, and a vocabulary check. Digital supplements comprise downloadable data sets and short video lectures explaining key concepts. Instructors also have access to PPT slides and an instructor's manual that consists of a test bank with multiple choice exams, exercises with data sets, and solutions. This is the ideal statistics textbook for aviation courses globally, especially in aviation statistics, research methods in aviation, human factors, and related areas.
This edited collection concerns nonlinear economic relations that involve time. It is divided into four broad themes that all reflect the work and methodology of Professor Timo Teräsvirta, one of the leading scholars in the field of nonlinear time series econometrics. The themes are: testing for linearity and functional form; specification testing and estimation of nonlinear time series models in the form of smooth transition models; model selection and econometric methodology; and finally applications within the area of financial econometrics. All these research fields include contributions that represent the state of the art in econometrics, such as testing for neglected nonlinearity in neural network models, time-varying GARCH and smooth transition models, STAR models and common factors in volatility modeling, semi-automatic general-to-specific model selection for nonlinear dynamic models, high-dimensional data analysis for parametric and semi-parametric regression models with dependent data, commodity price modeling, financial analysts' earnings forecasts based on asymmetric loss functions, local Gaussian correlation and dependence for asymmetric return dependence, and the use of bootstrap aggregation to improve forecast accuracy. Each chapter represents original scholarly work, and reflects the intellectual impact that Timo Teräsvirta has had, and will continue to have, on the profession.
How could Finance benefit from AI? How can AI techniques provide an edge? Moving well beyond simply speeding up computation, this book tackles AI for Finance from a range of perspectives including business, technology, research, and students. Covering aspects like algorithms, big data, and machine learning, this book answers these and many other questions.
Studies in Global Econometrics is a collection of essays on the use of cross-country data based on purchasing power parities. The two major applications are the development over time of per capita gross domestic products (including that of their inequalities among countries and regions) and the fitting of cross-country demand equations for broad groups of consumer goods. The introductory chapter provides highlights of the author's work as it relates to these developments. One of the main topics of the work is a system of demand equations for broad groups of consumer goods fitted by means of cross-country data. These data are from the International Comparison Program, which provides PPP-based figures for a number of years and countries. Similar data are used for the measurement of the dispersion of national per capita incomes between and within seven geographic regions.
From Catastrophe to Chaos: A General Theory of Economic Discontinuities presents an unusual perspective on economics and economic analysis. Current economic theory largely depends upon assuming that the world is fundamentally continuous. However, an increasing amount of economic research has been done using approaches that allow for discontinuities, such as catastrophe theory, chaos theory, synergetics, and fractal geometry. The spread of such approaches across a variety of disciplines of thought has constituted a virtual intellectual revolution in recent years. This book reviews the applications of these approaches in various subdisciplines of economics and draws upon past economic thinkers to develop an integrated view of economics as a whole from the perspective of inherent discontinuity.
Data Envelopment Analysis (DEA) was initially developed as a method for assessing the comparative efficiencies of organisational units such as the branches of a bank, schools, hospital departments or restaurants. The key feature which makes the units comparable in each case is that they perform the same function in terms of the kinds of resource they use and the types of output they produce. For example, all bank branches to be compared would typically use staff and capital assets to effect income-generating activities such as advancing loans, selling financial products and carrying out banking transactions on behalf of their clients. The efficiencies assessed in this context by DEA are intended to reflect the scope for resource conservation at the unit being assessed without detriment to its outputs, or alternatively, the scope for output augmentation without additional resources. The efficiencies assessed are comparative, or relative, because they reflect the scope for resource conservation or output augmentation at one unit relative to other comparable benchmark units, rather than in some absolute sense. We resort to relative rather than absolute efficiencies because in most practical contexts we lack sufficient information to derive the superior measures of absolute efficiency. DEA was initiated by Charnes, Cooper and Rhodes in their seminal paper, Charnes et al. (1978). That paper operationalised and extended, by means of linear programming, the production economics concepts of empirical efficiency put forth some twenty years earlier by Farrell (1957).
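In the single-input, single-output special case, relative efficiency reduces to each unit's output/input ratio scaled by the best observed ratio, so the benchmark unit scores 1.0. The general multi-input, multi-output DEA model requires linear programming, as in Charnes et al. (1978); the sketch below, with invented bank-branch data, shows only this simplest case:

```python
def relative_efficiency(units):
    """Farrell-style relative efficiency for the single-input,
    single-output case: each unit's output/input ratio divided
    by the best observed ratio (the benchmark scores 1.0)."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical bank branches: (staff employed, loans advanced)
branches = {"A": (10, 200), "B": (8, 200), "C": (12, 180)}
print(relative_efficiency(branches))  # B is the benchmark unit
```

Branch A's score says it could, in principle, produce its current output with a proportionally smaller input (resource conservation), or expand output with its current input (output augmentation), relative to benchmark B.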
This book provides an overview of three generations of spatial econometric models: models based on cross-sectional data, static models based on spatial panels and dynamic spatial panel data models. The book not only presents different model specifications and their corresponding estimators, but also critically discusses the purposes for which these models can be used and how their results should be interpreted.
Observers and Macroeconomic Systems is concerned with the computational aspects of using a control-theoretic approach to the analysis of dynamic macroeconomic systems. The focus is on using a separate model for the development of the control policies. In particular, it uses the observer-based approach whereby the separate model learns to behave in a similar manner to the economic system through output-injections. The book shows how this approach can be used to learn the forward-looking behaviour of economic actors which is a distinguishing feature of dynamic macroeconomic models. It also shows how it can be used in conjunction with low-order models to undertake policy analysis with a large practical econometric model. This overcomes some of the computational problems arising from using just the large econometric models to compute optimal policy trajectories. The work also develops visual simulation software tools that can be used for policy analysis with dynamic macroeconomic systems.
Each chapter of Macroeconometrics is written by respected econometricians in order to provide useful information and perspectives for those who wish to apply econometrics in macroeconomics. The chapters are all written with clear methodological perspectives, making the virtues and limitations of particular econometric approaches accessible to a general readership familiar with applied macroeconomics. The real tensions in macroeconometrics are revealed by the critical comments, from econometricians holding alternative perspectives, that follow each chapter.
This handbook covers DEA topics that are extensively used and solidly based. The purpose of the handbook is to (1) describe and elucidate the state of the field and (2), where appropriate, extend the frontier of DEA research. It defines the state of the art of DEA methodology and its uses. This handbook is intended to represent a milestone in the progression of DEA. Written by experts who are generally major contributors to the topics covered, it includes a comprehensive review and discussion of basic DEA models, extensions to the basic DEA methods, and a collection of DEA applications in the areas of banking, engineering, health care, and services. The handbook's chapters are organized into two categories: (i) basic DEA models, concepts, and their extensions, and (ii) DEA applications. First edition contributors have returned to update their work, and the second edition includes updated versions of selected first edition chapters. New chapters have been added on: approaches that require no a priori choice of weights (called multipliers) reflecting meaningful trade-offs; construction of static and dynamic DEA technologies; the slacks-based model and its extensions; DEA models for DMUs that have internal structures (network DEA), which can be used for measuring supply chain operations; and a selection of DEA applications in the service sector, with a focus on building a conceptual framework, research design, and interpreting results.