Welcome to Loot.co.za!
This book constitutes the first serious attempt to explain the basics of econometrics and its applications in the clearest and simplest manner possible. Recognising that a good level of mathematics is no longer a necessary prerequisite for economics/financial economics undergraduate and postgraduate programmes, it introduces this key subdivision of economics to an audience who might otherwise have been deterred by its complex nature.
This book is a volume in the Penn Press Anniversary Collection. To mark its 125th anniversary in 2015, the University of Pennsylvania Press rereleased more than 1,100 titles from Penn Press's distinguished backlist from 1899-1999 that had fallen out of print. Spanning an entire century, the Anniversary Collection offers peer-reviewed scholarship in a wide range of subject areas.
This title, first published in 1984, is a contribution to applied international trade theory. The author explores the specification and estimation of a multisector general equilibrium model of the open economy. The model is formulated with the aim of assessing empirically the effects of three key policy variables on trade flows, domestic prices, and the trade balance. The policy variables with which the author is concerned are the rate of growth of the stock of domestic credit, commercial policy, as represented by tariffs, and, finally, the exchange rate. This title will be of interest to students of economics.
Now in its third edition, Essential Econometric Techniques: A Guide to Concepts and Applications is a concise, student-friendly textbook which provides an introductory grounding in econometrics, with an emphasis on the proper application and interpretation of results. Drawing on the author's extensive teaching experience, this book offers intuitive explanations of concepts such as heteroskedasticity and serial correlation, and provides step-by-step overviews of each key topic. This new edition contains more applications, brings in new material including a dedicated chapter on panel data techniques, and moves the theoretical proofs to appendices. After Chapter 7, students will be able to design and conduct rudimentary econometric research. The next chapters cover multicollinearity, heteroskedasticity, and autocorrelation, followed by techniques for time-series analysis and panel data. Excel data sets for the end-of-chapter problems are available as a digital supplement. A solutions manual is also available for instructors, as well as PowerPoint slides for each chapter. Essential Econometric Techniques shows students how economic hypotheses can be questioned and tested using real-world data, and is the ideal supplementary text for all introductory econometrics courses.
The last decade has brought dramatic changes in the way that researchers analyze economic and financial time series. This book synthesizes these recent advances and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. "Time Series Analysis" fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers.
Stochastic Limit Theory, published in 1994, has become a standard reference in its field. Now reissued in a new edition, offering updated and improved results and an extended range of topics, Davidson surveys asymptotic (large-sample) distribution theory with applications to econometrics, with particular emphasis on the problems of time dependence and heterogeneity. The book is designed to be useful on two levels. First, as a textbook and reference work, giving definitions of the relevant mathematical concepts, statements, and proofs of the important results from the probability literature, and numerous examples; and second, as an account of recent work in the field of particular interest to econometricians. It is virtually self-contained, with all but the most basic technical prerequisites being explained in their context; mathematical topics include measure theory, integration, metric spaces, and topology, with applications to random variables, and an extended treatment of conditional probability. Other subjects treated include: stochastic processes, mixing processes, martingales, mixingales, and near-epoch dependence; the weak and strong laws of large numbers; weak convergence; and central limit theorems for nonstationary and dependent processes. The functional central limit theorem and its ramifications are covered in detail, including an account of the theoretical underpinnings (the weak convergence of measures on metric spaces), Brownian motion, the multivariate invariance principle, and convergence to stochastic integrals. This material is of special relevance to the theory of cointegration. The new edition gives updated and improved versions of many of the results and extends the coverage of many topics, in particular the theory of convergence to alpha-stable limits of processes with infinite variance.
Financial Economics and Econometrics provides an overview of the core topics in theoretical and empirical finance, with an emphasis on applications and interpreting results. Structured in five parts, the book covers financial data and univariate models; asset returns; interest rates, yields and spreads; volatility and correlation; and corporate finance and policy. Each chapter begins with a theory in financial economics, followed by econometric methodologies which have been used to explore the theory. Next, the chapter presents empirical evidence and discusses seminal papers on the topic. Boxes offer insights on how an idea can be applied to other disciplines such as management, marketing and medicine, showing the relevance of the material beyond finance. Readers are supported with plenty of worked examples and intuitive explanations throughout the book, while key takeaways, 'test your knowledge' and 'test your intuition' features at the end of each chapter also aid student learning. Digital supplements including PowerPoint slides, computer code supplements, an Instructor's Manual and Solutions Manual are available for instructors. This textbook is suitable for upper-level undergraduate and graduate courses on financial economics, financial econometrics, empirical finance and related quantitative areas.
The book, Sustainability and Resources: Theoretical Issues in Dynamic Economics, presents a collection of mathematical models dealing with sustainability and resource management. The focus in Part A is on harvesting renewable resources, while Part B explores the optimal extraction of exhaustible resources. Part C introduces models dealing with uncertainty. Some are descriptive models; others have deep roots in intertemporal welfare economics. The tools of dynamic optimization developed in the 1960s are used in a formal, rigorous presentation to address wide-ranging issues that have appeared in academic research as well as policy debates on the world stage. The book also provides a self-contained treatment that is accessible to advanced undergraduate and graduate students who are interested in dynamic models of resource allocation and social welfare, resource management, and applications of optimization theory and methods of probability theory to economics. For researchers in dynamic economics, it will be an invaluable source for formal treatment of substantive macroeconomic issues raised by policymakers. The part dealing with uncertainty and random dynamical systems (largely developed by the author and his collaborators) exposes the reader to contemporary frontiers of research on stochastic processes with novel applications to economic problems.
This Open Access Brief presents the KAPSARC Global Energy Macroeconometric Model (KGEMM). KGEMM is a policy analysis tool for examining the impacts of domestic policy measures and global economic and energy shocks on the Kingdom of Saudi Arabia. The model has eight blocks (real sector, fiscal, monetary, external sector, price, labor and wages, energy, and population and age cohorts) that interact with each other to represent the Kingdom's macroeconomy and energy linkages. It captures New Keynesian demand-side features anchored to medium-run equilibrium and long-run aggregate supply. It applies a cointegration and equilibrium correction modeling (ECM) methodology to time series data to estimate the model's behavioral equations in the framework of Autometrics, a general-to-specific econometric modeling strategy. Hence, the model combines a 'theory-driven' approach with a 'data-driven' approach. The Brief begins with an introduction to the theoretical framework of the model and the KGEMM methodology and then walks the reader through the structure of the model and its behavioral equations. The book closes with simulations showing the application of the model. Providing a detailed introduction to a cutting-edge, robust predictive model, this Brief will be of great use to researchers and policymakers interested in macroeconomics, energy economics, econometrics, and more specifically, the economy of Saudi Arabia.
This book is the first of its kind to systematically analyze and apply Lim Chong Yah's S-Curve Hypothesis to the various facets of economic growth and economic transition. By augmenting the mathematical and economic sophistication of the hypothesis, this book extends the S-Curve hypothesis to provide further insight into economic growth and transition. It also constructs a stochastic growth model to provide the microeconomic foundation for the S-Curve hypothesis. This model resolves the puzzle of why some developing countries experience economic take-off, while others do not. The book analyzes and extends discussion on the S-Curve, and also applies the S-Curve hypothesis to predict long-term growth in Japan and Singapore. It serves as an excellent resource for people interested in Lim's growth theory.
Market Analysis for Real Estate is a comprehensive introduction to how real estate markets work and the analytical tools and techniques that can be used to identify and interpret market signals. The markets for space and varied property assets, including residential, office, retail, and industrial, are presented, analyzed, and integrated into a complete understanding of the role of real estate markets within the workings of contemporary urban economies. Unlike other books on market analysis, the economic and financial theory in this book is rigorous and well integrated with the specifics of the real estate market. Furthermore, the theory is explained thoroughly, since the book assumes no previous coursework in economics or finance on the part of the reader. The theoretical discussion is backed up with numerous real estate case study examples and problems, which are presented throughout the text to assist both student and teacher.
This book unifies and extends the definition and measurement of economic efficiency and its use as a real-life benchmarking technique for actual organizations. Analytically, the book relies on the economic theory of duality as a guiding framework. Empirically, it shows how the alternative models can be implemented by way of Data Envelopment Analysis. An accompanying software package programmed in the open-source Julia language is used to solve the models. The package is a self-contained set of functions that can be used for individual learning and instruction. The source code, associated documentation, and replication notebooks are available online. The book discusses the concept of economic efficiency at the firm level, comparing observed to optimal economic performance, and its decomposition according to technical and allocative criteria. Depending on the underlying technical efficiency measure, economic efficiency can be decomposed multiplicatively or additively. Part I of the book deals with the classic multiplicative approach that decomposes cost and revenue efficiency based on radial distance functions. Subsequently, the book examines how these partial approaches can be expanded to the notion of profitability efficiency, considering both the input and output dimensions of the firm, and relying on the generalized distance function for the measurement of technical efficiency. Part II is devoted to the recent additive framework related to the decomposition of economic inefficiency defined in terms of cost, revenue, and profit. The book presents economic models for the Russell and enhanced graph Russell measures, the weighted additive distance function, the directional distance function, the modified directional distance function, and the Hölder distance function. Each model is presented in a separate chapter.
New approaches that qualify and generalize previous results are also introduced in the last chapters, including the reverse directional distance function and the general direct approach. The book concludes by highlighting the importance of benchmarking economic efficiency for all business stakeholders and recalling the main conclusions obtained from many years of research on this topic. The book offers different alternatives to measure economic efficiency based on a set of desirable properties and advises on the choice of specific economic efficiency models.
A provocative new analysis of immigration's long-term effects on a nation's economy and culture. Over the last two decades, as economists began using big datasets and modern computing power to reveal the sources of national prosperity, their statistical results kept pointing toward the power of culture to drive the wealth of nations. In The Culture Transplant, Garett Jones documents the cultural foundations of cross-country income differences, showing that immigrants import cultural attitudes from their homelands—toward saving, toward trust, and toward the role of government—that persist for decades, and likely for centuries, in their new national homes. Full assimilation in a generation or two, Jones reports, is a myth. And the cultural traits migrants bring to their new homes have enduring effects upon a nation's economic potential. Built upon mainstream, well-reviewed academic research that hasn't pierced the public consciousness, this book offers a compelling refutation of an unspoken consensus that a nation's economic and political institutions won't be changed by immigration. Jones refutes the common view that we can discuss migration policy without considering whether migration can, over a few generations, substantially transform the economic and political institutions of a nation. And since most of the world's technological innovations come from just a handful of nations, Jones concludes, the entire world has a stake in whether migration policy will help or hurt the quality of government and thus the quality of scientific breakthroughs in those rare innovation powerhouses.
Stochastic differential equations are differential equations whose solutions are stochastic processes. They exhibit appealing mathematical properties that are useful in modeling uncertainties and noisy phenomena in many disciplines. This book is motivated by applications of stochastic differential equations in target tracking and medical technology and, in particular, their use in methodologies such as filtering, smoothing, parameter estimation, and machine learning. It builds an intuitive hands-on understanding of what stochastic differential equations are all about, but also covers the essentials of Ito calculus, the central theorems in the field, and such approximation schemes as stochastic Runge-Kutta. Greater emphasis is given to solution methods than to analysis of theoretical properties of the equations. The book's practical approach assumes only prior understanding of ordinary differential equations. The numerous worked examples and end-of-chapter exercises include application-driven derivations and computational assignments. MATLAB/Octave source code is available for download, promoting hands-on work with the methods.
It is impossible to understand modern economics without knowledge of the basic tools of game theory and mechanism design. This book provides a graduate-level introduction to the economic modeling of strategic behavior. The goal is to teach Economics doctoral students the tools of game theory and mechanism design that all economists should know.
Leonid Hurwicz (1917-2008) was a major figure in modern theoretical economics whose contributions over sixty-five years spanned at least five areas: econometrics, nonlinear programming, decision theory, microeconomic theory, and mechanism design. In 2007, at age ninety, he received the Nobel Memorial Prize in Economics (shared with Eric Maskin and Roger Myerson) for pioneering the field of mechanism design and incentive compatibility. Hurwicz made seminal contributions in the other areas as well. In nonlinear programming, he contributed to the understanding of Lagrange-Kuhn-Tucker problems (along with co-authors Kenneth Arrow and Hirofumi Uzawa). In econometrics, the Hurwicz bias in the least-squares analysis of time series is a fundamental and commonly cited benchmark. In decision theory, the Hurwicz criterion for decision-making under ambiguity is routinely invoked, sometimes without a citation since his original paper was never published. In microeconomic theory, Hurwicz (along with Arrow and H.D. Block) initiated the study of the stability of the market mechanism, and (with Uzawa) solved the classic integrability of demand problem, a core result in neoclassical consumer theory. While some of Hurwicz's works were published in journals, many remain scattered as chapters in books that are difficult to access; yet others were never published at all. The Collected Papers of Leonid Hurwicz is the first volume in a series of four that will gather his oeuvre in one place, bringing to light the totality of his intellectual output and documenting his contribution to economics and the extent of his legacy, with the express purpose of making it easily available for future generations of researchers to build upon.
This third edition capitalizes on the success of the previous editions and leverages the important advancements in visualization, data analysis, and sharing capabilities that have emerged in recent years. It serves as an accelerated guide to decision support designs for consultants, service professionals and students. This 'fast track' enables a ramping up of skills in Excel for those who may have never used it to reach a level of mastery that will allow them to integrate Excel with widely available associated applications, make use of intelligent data visualization and analysis techniques, automate activity through basic VBA designs, and develop easy-to-use interfaces for customizing use. The content of this edition has been completely restructured and revised, with updates that correspond with the latest versions of software and references to contemporary add-in development across platforms. It also features best practices in design and analytical consideration, including methodical discussions of problem structuring and evaluation, as well as numerous case examples from practice.
Today, most money is credit money, created by commercial banks. While credit can finance innovation, excessive credit can lead to boom/bust cycles, such as the recent financial crisis. This highlights how the organization of our monetary system is crucial to stability. One way to achieve this is by separating the unit of account from the medium of exchange; in pre-modern Europe, such a separation existed. This new volume examines the idea of monetary separation and the history of monetary arrangements in the North and Baltic Seas region, from the Hanseatic League onwards. The book provides a theoretical analysis of four historical cases in the Baltic and North Seas region, with a view to examining the evolution of monetary arrangements from a new monetary economics perspective. Since the objective exchange value of money (its purchasing power) reflects subjective individual valuations of commodities, the author assesses these historical cases by means of exchange rates. Using theories from new monetary economics, the book explores how the units of account and their media of exchange evolved as social conventions, and offers new insight into the separation between the two. Through this exploration, it puts forward that money is a social institution, a clearing device for the settlement of accounts, and so the value of money, or a separate unit of account, ultimately results from the size of its network of users. The History of Money and Monetary Arrangements offers a highly original new insight into monetary arrangements as an evolutionary process. It will be of great interest to an international audience of scholars and students, including those with an interest in economic history, evolutionary economics and new monetary economics.
A complete resource for finance students, this textbook presents the most common empirical approaches in finance in a comprehensive and well-illustrated manner that shows how econometrics is used in practice, and includes detailed case studies to explain how the techniques are used in relevant financial contexts. Maintaining the accessible prose and clear examples of previous editions, the new edition of this best-selling textbook provides support for the main industry-standard software packages, expands the coverage of introductory mathematical and statistical techniques into two chapters for students without prior econometrics knowledge, and includes a new chapter on advanced methods. Learning outcomes, key concepts and end-of-chapter review questions (with full solutions online) highlight the main chapter takeaways and allow students to self-assess their understanding. Online resources include extensive teacher and student support materials, including EViews, Stata, R, and Python software guides.
Computable general equilibrium (CGE) models play an important role in supporting public-policy making on such issues as trade, climate change and taxation. This significantly revised volume, keeping pace with the next-generation standard CGE model, is the only undergraduate-level introduction of its kind. The volume utilizes a graphical approach to explain the economic theory underlying a CGE model, and provides results from simple, small-scale CGE models to illustrate the links between theory and model outcomes. Its eleven hands-on exercises introduce modelling techniques that are applied to real-world economic problems. Students learn how to integrate their separate fields of economic study into a comprehensive, general equilibrium perspective as they develop their skills as producers or consumers of CGE-based analysis.
Microeconometrics Using Stata, Second Edition is an invaluable reference for researchers and students interested in applied microeconometric methods. Like previous editions, this text covers all the classic microeconometric techniques ranging from linear models to instrumental-variables regression to panel-data estimation to nonlinear models such as probit, tobit, Poisson, and choice models. Each of these discussions has been updated to show the most modern implementation in Stata, and many include additional explanation of the underlying methods. In addition, the authors introduce readers to performing simulations in Stata and then use simulations to illustrate methods in other parts of the book. They even teach you how to code your own estimators in Stata. The second edition is greatly expanded—the new material is so extensive that the text now comprises two volumes. In addition to the classics, the book now teaches recently developed econometric methods and the methods newly added to Stata. Specifically, the book includes entirely new chapters on: duration models; randomized control trials and exogenous treatment effects; endogenous treatment effects; models for endogeneity and heterogeneity, including finite mixture models, structural equation models, and nonlinear mixed-effects models; spatial autoregressive models; semiparametric regression; lasso for prediction and inference; and Bayesian analysis. Anyone interested in learning classic and modern econometric methods will find this the perfect companion. And those who apply these methods to their own data will return to this reference over and over as they need to implement the various techniques described in this book.
Continuous-Time Models in Corporate Finance synthesizes four decades of research to show how stochastic calculus can be used in corporate finance. Combining mathematical rigor with economic intuition, Santiago Moreno-Bromberg and Jean-Charles Rochet analyze corporate decisions such as dividend distribution, the issuance of securities, and capital structure and default. They pay particular attention to financial intermediaries, including banks and insurance companies. The authors begin by recalling the ways that option-pricing techniques can be employed for the pricing of corporate debt and equity. They then present the dynamic model of the trade-off between taxes and bankruptcy costs and derive implications for optimal capital structure. The core chapter introduces the workhorse liquidity-management model--where liquidity and risk management decisions are made in order to minimize the costs of external finance. This model is used to study corporate finance decisions and specific features of banks and insurance companies. The book concludes by presenting the dynamic agency model, where financial frictions stem from the lack of interest alignment between a firm's manager and its financiers. The appendix contains an overview of the main mathematical tools used throughout the book. Requiring some familiarity with stochastic calculus methods, Continuous-Time Models in Corporate Finance will be useful for students, researchers, and professionals who want to develop dynamic models of firms' financial decisions.