Self-contained chapters on the most important applications and methodologies in finance, which can easily be used for the reader's research or as a reference for courses on empirical finance. Each chapter is reproducible in the sense that the reader can replicate every single figure, table, or number by simply copy-pasting the code we provide. A full-fledged introduction to machine learning with tidymodels, based on tidy principles, shows how factor selection and option pricing can benefit from machine learning methods. Chapter 2, on accessing and managing financial data, shows how to retrieve and prepare the most important datasets in the field of financial economics: CRSP and Compustat. The chapter also contains detailed explanations of the most important data characteristics. Each chapter provides exercises that are based on established lectures and exercise classes and that are designed to help students dig deeper. The exercises can be used for self-study or as a source of inspiration for teaching exercises.
In the theory and practice of econometrics, the model, the method, and the data are all interdependent links in information recovery: estimation and inference. Seldom, however, are the economic and statistical models correctly specified, the data complete or capable of being replicated, the estimation rules 'optimal', and the inferences free of distortion. Faced with these problems, Maximum Entropy Econometrics provides a new basis for learning from economic and statistical models that may be non-regular in the sense that they are ill-posed or underdetermined and the data are partial or incomplete. By extending the maximum entropy formalisms used in the physical sciences, the authors present a new set of generalized entropy techniques designed to recover information about economic systems. The authors compare the generalized entropy techniques with the performance of the relevant traditional methods of information recovery and clearly demonstrate the theory with a range of applications.
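As an illustration of the maximum entropy formalism the book builds on, here is a minimal sketch (not from the book) of Jaynes' classic die problem: recovering a probability distribution over faces 1-6 from only the underdetermined information that its mean is 4.5. The exponential form of the solution and the bisection solver are standard; all values are illustrative.

```python
import math

# Jaynes' die: maximize entropy over faces 1..6 subject only to a mean
# constraint of 4.5. The maximum entropy solution is p_i proportional to
# exp(lam * i); lam is found by bisection on the implied mean.
faces, target_mean = list(range(1, 7)), 4.5

def mean_for(lam):
    w = [math.exp(lam * i) for i in faces]
    return sum(i * wi for i, wi in zip(faces, w)) / sum(w)

lo, hi = -10.0, 10.0
for _ in range(100):              # mean_for is increasing in lam
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2
w = [math.exp(lam * i) for i in faces]
p = [wi / sum(w) for wi in w]
print([round(pi, 3) for pi in p])  # probabilities tilt toward high faces
```

The recovered distribution tilts probability toward the higher faces, which is exactly the kind of information recovery from partial data that generalized entropy methods extend to economic systems.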
The book evaluates the importance of constitutional rules and property rights for the German economy in 1990-2015. It is an economic historical study embedded in institutional economics with main references to positive constitutional economics and the property rights theory. This interdisciplinary work adopts a theoretical-empirical dimension and a qualitative-quantitative approach. Formal institutions played a fundamental role in Germany's post-reunification economic changes. They set the legal and institutional framework for the transition process of Eastern Germany and the unification, integration and convergence between the two parts of the country. Although the latter process was not completed, the effects of these formal rules were positive, especially for the former GDR.
Provides sound knowledge of optimal decision making in statistics and operations research problems. Serves as a quick reference by exploring the research literature on the subject, with commercial value-added research applications in statistics and operations research. Provides sound knowledge of optimisation and statistical techniques in the modelling of real-world problems. Reviews recent developments and contributions in optimal decision-making problems using optimisation and statistical techniques. Provides an understanding of formulations of decision-making problems and their solution procedures. Describes the latest developments in the modelling of real-world problems and their solution approaches.
As one of the first texts to take a behavioral approach to macroeconomic expectations, this book introduces a new way of doing economics. Roetheli uses cognitive psychology in a bottom-up method of modeling macroeconomic expectations. His research is based on laboratory experiments and historical data, which he extends to real-world situations. Pattern extrapolation is shown to be the key to understanding expectations of inflation and income. The quantitative model of expectations is used to analyze the course of inflation and nominal interest rates in a range of countries and historical periods. The model of expected income is applied to the analysis of business cycle phenomena such as the Great Recession in the United States. Data and spreadsheets are provided for readers to do their own computations of macroeconomic expectations. This book offers new perspectives in many areas of macro and financial economics.
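As a minimal sketch of what pattern extrapolation of expectations can look like, the rule below is a generic extrapolative forecast, not Roetheli's actual model; the weight theta is a hypothetical parameter.

```python
# Generic extrapolative expectations rule (illustrative only):
# next-period forecast = last value + theta * (most recent change).
def expected_inflation(history, theta=0.5):
    if len(history) < 2:
        return history[-1]          # nothing to extrapolate from
    last, prev = history[-1], history[-2]
    return last + theta * (last - prev)

path = [2.0, 2.4, 3.0]              # past inflation rates, in percent
print(expected_inflation(path))     # projects part of the latest rise forward
```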
This book covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students' knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical analysis through interactive examples and is suitable for undergraduate and graduate students taking their first statistics courses, as well as for undergraduate students in non-mathematical fields, e.g. economics and the social sciences.
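As a small illustration of the inferential-statistics material described above, a normal-approximation confidence interval for a sample mean can be computed with the Python standard library. For small samples a t-quantile would normally replace the normal quantile; the data here are made up.

```python
import statistics

def mean_ci(sample, confidence=0.95):
    """Normal-approximation confidence interval for the sample mean."""
    n = len(sample)
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5           # standard error
    z = statistics.NormalDist().inv_cdf((1 + confidence) / 2)
    return mean - z * se, mean + z * se

low, high = mean_ci([4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.1])
print(f"95% CI: ({low:.3f}, {high:.3f})")
```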
Since the financial crisis, the issue of the 'one percent' has become the centre of intense public debate, unavoidable even for members of the elite themselves. Moreover, inquiring into elites has taken centre-stage once again in both journalistic investigations and academic research. New Directions in Elite Studies attempts to move the social scientific study of elites beyond economic analysis, which has greatly improved our knowledge of inequality, but is restricted to income and wealth. In contrast, this book mobilizes a broad scope of research methods to uncover the social composition of the power elite - the 'field of power'. It reconstructs processes through which people gain access to positions in this particular social space, examines the various forms of capital they mobilize in the process - economic, but also cultural and social capital - and probes changes over time and variations across national contexts. Bringing together the most advanced research into elites by a European and multidisciplinary group of scholars, this book presents an agenda for the future study of elites. It will appeal to all those interested in the study of elites, inequality, class, power, and gender inequality.
This practical guide to EViews is aimed at practitioners and students in business, economics, econometrics, and finance. It uses a step-by-step approach to equip readers with a toolkit that enables them to make the most of this widely used econometric analysis software. Statistical and econometric concepts are explained visually with examples, problems, and solutions. Developed by economists, the EViews statistical software package is used most commonly for time-series-oriented econometric analysis. It allows users to quickly develop statistical relations from data and then use those relations to forecast future values of the data. The package provides convenient ways to enter or upload data series, create new series from existing ones, display and print series, carry out statistical analyses of relationships among series, and manipulate results and output. This highly hands-on resource includes more than 200 illustrative graphs and tables and tutorials throughout. Abdulkader Aljandali is Senior Lecturer at Coventry University in London. He is currently leading the Stochastic Finance Module taught as part of the Global Financial Trading MSc. His previously published work includes Exchange Rate Volatility in Emerging Markets, Quantitative Analysis, Multivariate Methods & Forecasting with IBM SPSS Statistics and Multivariate Methods and Forecasting with IBM (R) SPSS (R) Statistics. Dr Aljandali is an established member of the British Accounting and Finance Association and the Higher Education Academy. Motasam Tatahi is a specialist in the areas of macroeconomics, financial economics, and financial econometrics at the European Business School, Regent's University London, where he serves as Principal Lecturer and Dissertation Coordinator for the MSc in Global Banking and Finance.
This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, have determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.
Arthur Vogt has devoted a great deal of his scientific efforts to both the person and the work of Irving Fisher. This book, written with János Barta, gives an excellent impression of Fisher's great contributions to the theory of the price index on the one hand. On the other hand, it continues Fisher's work on this subject along the lines which several authors drew with respect to price index theory since Fisher's death fifty years ago.
This important new book presents the theoretical, econometric and applied foundations of the economics of innovation as well as offering a new approach to the measurement of technical change. The author, a leading expert in innovation economics and management, critically reviews current schools of thought and presents his own contribution to measurement techniques. Measurements of technical change have focused on the characteristics of price and quantity whilst useful theories and reliable indicators of the quality of innovation in new products have been sorely lacking. The author examines the theoretical foundations of the measurement of technical change and extends the analysis to consider the econometric and empirical perspective in the process of innovation. He outlines the key contributions to innovation research by reviewing the English-language literature and providing a very useful guide to the most important contributions in other languages. In the measurement of the quality of innovation, the techniques used in the author's contribution to new 'technometrics' are presented and explained in detail and are applied to the most important topical problems in innovation and management. This significant addition to the literature will be invaluable to graduates, scholars and managers working in the area of technical change, technology and innovation management.
• Introduces the dynamics, principles and mathematics behind ten macroeconomic models, allowing students to visualise the models and understand the economic intuition behind them.
• Provides a step-by-step guide, and the necessary MATLAB codes, to allow readers to simulate and experiment with the models themselves.
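The book's code is in MATLAB; purely as an illustration of the simulate-and-experiment approach, a basic discrete-time Solow growth model (with hypothetical parameter values, not necessarily one of the book's ten models) can be simulated in a few lines:

```python
# Discrete-time Solow growth model, simulated forward and checked against
# its analytical steady state (all parameter values are hypothetical).
alpha, s, delta = 0.33, 0.2, 0.05    # capital share, saving rate, depreciation
k = 1.0                              # initial capital per worker
for t in range(1000):
    k = s * k ** alpha + (1 - delta) * k        # capital accumulation
k_star = (s / delta) ** (1 / (1 - alpha))       # analytical steady state
print(round(k, 4), round(k_star, 4))
```

Iterating the accumulation equation long enough drives the simulated capital stock to the analytical steady state, which is the sort of numerical check the step-by-step guides encourage.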
The emergence of new firm-level data, including the European Community Innovation Survey (CIS), has led to a surge of studies on innovation and firm behaviour. This book documents progress in four interrelated fields: investigation of the use of new indicators of innovation output; investigation of determinants of innovative behavior; the role of spillovers, the public knowledge infrastructure and research and development collaboration; and the impact of innovation on firm performance. Written by an international group of contributors, the studies are based on agriculture and the manufacturing and service industries in Europe and Canada and provide new insights into the driving forces behind innovation.
Dynamic Programming in Economics is an outgrowth of a course intended for students in the first-year PhD program and for researchers in macroeconomic dynamics. It can be used by students and researchers in mathematics as well as in economics. The purpose of Dynamic Programming in Economics is twofold: (a) to provide a rigorous, but not too complicated, treatment of optimal growth models in an infinite discrete-time horizon, and (b) to train the reader in the use of optimal growth models and hence help the reader go further in research. We are convinced that there is a place for a book which stays somewhere between the "minimum tool kit" and specialized monographs leading to the frontiers of research on optimal growth.
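As a sketch of the kind of optimal growth model treated in the book, here is value-function iteration for the canonical log-utility, Cobb-Douglas case, for which the optimal policy is known in closed form (save the fraction alpha*beta of output). The grid and parameter values are illustrative, and this is a generic textbook exercise rather than code from the book:

```python
import math

# Value-function iteration for the optimal growth model with u(c) = ln c
# and f(k) = k**alpha. With these choices the optimal policy is known:
# k' = alpha * beta * k**alpha. Grid and parameters are illustrative.
alpha, beta = 0.3, 0.95
grid = [0.05 + 0.01 * i for i in range(100)]    # capital grid
v = [0.0] * len(grid)

def bellman(v):
    """One update of v(k) = max over k' of ln(k**alpha - k') + beta * v(k')."""
    out = []
    for k in grid:
        y = k ** alpha
        out.append(max(math.log(y - kp) + beta * w
                       for kp, w in zip(grid, v) if kp < y))
    return out

for _ in range(200):
    v = bellman(v)

# Recover the policy at k = 0.5 and compare with the closed form.
y = 0.5 ** alpha
policy = max((kp for kp in grid if kp < y),
             key=lambda kp: math.log(y - kp) + beta * v[grid.index(kp)])
print(round(policy, 2), round(alpha * beta * y, 3))
```

The grid-based policy matches the closed-form saving rule up to the grid resolution, which is the standard way to verify a dynamic programming implementation before moving to models without analytical solutions.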
This is the first book to investigate individuals' pessimistic and optimistic prospects for the future and their economic consequences based on sound mathematical foundations. The book focuses on fundamental uncertainty called Knightian uncertainty, where the probability distribution governing uncertainty is unknown, and it provides the reader with methods to formulate how pessimism and optimism act in an economy in a strict and unified way. After presenting decision-theoretic foundations for prudent behaviors under Knightian uncertainty, the book applies these ideas to economic models that include portfolio inertia, indeterminacy of equilibria in the Arrow-Debreu economy and in a stochastic overlapping-generations economy, learning, dynamic asset-pricing models, search, real options, and liquidity preferences. The book then proceeds to characterizations of pessimistic (ε-contaminated) and optimistic (ε-exuberant) behaviors under Knightian uncertainty and people's inherent pessimism (surprise aversion) and optimism (surprise loving). Those characterizations are shown to be useful in understanding several observed behaviors in the global financial crisis and in its aftermath. The book is highly recommended not only to researchers who wish to understand the mechanism of how pessimism and optimism affect economic phenomena, but also to policy makers contemplating effective economic policies whose success delicately hinges upon people's mindsets in the market. Kiyohiko Nishimura is Professor at the National Graduate Institute for Policy Studies (GRIPS) and Professor Emeritus and Distinguished Project Research Fellow of the Center for Advanced Research in Finance at The University of Tokyo. Hiroyuki Ozaki is Professor of Economics at Keio University.
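A minimal sketch of the pessimistic (ε-contaminated) evaluation mentioned above, under the standard formulation in which the decision maker trusts a reference prior only with weight 1 - epsilon and fears the worst case with weight epsilon; payoffs and probabilities below are made up for illustration.

```python
# Pessimistic evaluation under epsilon-contamination: the worst case over
# the set of priors {(1 - eps) * P + eps * Q} reduces to mixing expected
# utility under the reference prior P with the worst single outcome.
def contaminated_value(utils, probs, eps):
    expected = sum(u * p for u, p in zip(utils, probs))
    return (1 - eps) * expected + eps * min(utils)

utils, probs = [10.0, 4.0, 0.0], [0.5, 0.3, 0.2]
print(contaminated_value(utils, probs, 0.0))    # pure expected utility
print(contaminated_value(utils, probs, 0.2))    # pessimism lowers the value
```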
Partial least squares structural equation modeling (PLS-SEM) has become a standard approach for analyzing complex inter-relationships between observed and latent variables. Researchers appreciate the many advantages of PLS-SEM such as the possibility to estimate very complex models and the method's flexibility in terms of data requirements and measurement specification. This practical open access guide provides a step-by-step treatment of the major choices in analyzing PLS path models using R, a free software environment for statistical computing, which runs on Windows, macOS, and UNIX computer platforms. Adopting the R software's SEMinR package, which brings a friendly syntax to creating and estimating structural equation models, each chapter offers a concise overview of relevant topics and metrics, followed by an in-depth description of a case study. Simple instructions give readers the "how-tos" of using SEMinR to obtain solutions and document their results. Rules of thumb in every chapter provide guidance on best practices in the application and interpretation of PLS-SEM.
This title is concerned with the investigation of the contemporary financial issues of the e-commerce market.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulties knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong directions. The book searches for the reasons behind the emergence of false feedback. It thereby contributes to a wider discussion in the field of metascience about the practices of researchers when pursuing their daily business. The book thus offers a case study of metascience for the field of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates by many applied examples and a wide array of references, especially to philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research and also the possible solutions. The main audience of the book is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.
This is the perfect (and essential) supplement for all econometrics classes--from a rigorous first undergraduate course, to a first master's, to a PhD course.
Any enquiry into the nature, performance, role, demerits, growth, efficiency, or other aspects of financial services such as banking and insurance activities requires rigorous estimates of their economic output, i.e., the economic contributions made by these firms, as well as by the industries as a whole. Accordingly, this book condenses several theoretical, methodological, empirical, and philosophical issues in conceptualizing, measuring, and empirically operationalizing the economic output of the banking and insurance industries. The analytical focus is on both global and emerging-markets perspectives. The book synthesizes applied and conceptual evidence to locate the chosen theme's analytical patterns, consensus, and disagreements. The selected subject matter is studied within both firm-level and aggregate settings, bringing literature of varied scopes together. Contributions from various international academics, practitioners, and policymakers further enrich the narrative. The book concludes with data-driven case studies that analyze the extent to which the critical performance parameters of the banking and insurance industries in the BRIICS economies - including estimation of aggregate industry-level partial factor productivities, total factor productivity, technical efficiency, and returns to scale - vary with alternate measures of their output. The present work also provides a brief note on the measurement of inputs, followed by a discussion of limitations, future scope, and conclusions. This work will be valuable for researchers and policymakers undertaking performance analyses related to banking and insurance activities. It provides an examination of a wide range of analytical options and related issues in the theory and praxis of output measurement, organized into a single volume.
The book describes the theoretical principles of nonstatistical methods of data analysis without going deep into complex mathematics. The emphasis is on the presentation of solved examples of real data, either from the authors' laboratories or from the open literature. The examples cover a wide range of applications such as quality assurance and quality control, critical analysis of experimental data, comparison of data samples from various sources, robust linear and nonlinear regression, as well as various tasks from financial analysis. The examples are useful primarily for chemical engineers, including analytical/quality laboratories in industry and designers of chemical and biological processes. Features: Exclusive title on mathematical gnostics with multidisciplinary applications and a specific focus on chemical engineering. Clarifies the role of data-space metrics, including the right way to aggregate uncertain data. Brings a new look at data probability, information, entropy, and the thermodynamics of data uncertainty. Enables design of probability distributions for all real data samples, including smaller ones. Includes data for examples with solutions and exercises in R or Python. The book is aimed at senior undergraduate students, researchers, and professionals in chemical/process engineering, engineering physics, statistics, mathematics, materials, geotechnical and civil engineering, mining, sales, marketing and service, and finance.
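As a conventional point of comparison for the robust regression topics listed above (this is standard Huber-weighted iteratively reweighted least squares, not the book's gnostic method; the data and tuning constants are illustrative):

```python
# Robust straight-line fit by iteratively reweighted least squares with
# Huber weights; k = 1.345 is the usual Huber tuning constant.
def huber_line(xs, ys, k=1.345, iters=50):
    a, b = 0.0, 0.0                              # intercept, slope
    for _ in range(iters):
        r = [y - (a + b * x) for x, y in zip(xs, ys)]
        # robust scale from the median absolute residual (floored)
        s = max(sorted(abs(e) for e in r)[len(r) // 2] / 0.6745, 1e-8)
        w = [1.0 if abs(e) <= k * s else k * s / abs(e) for e in r]
        sw = sum(w)
        sx = sum(wi * x for wi, x in zip(w, xs))
        sy = sum(wi * y for wi, y in zip(w, ys))
        sxx = sum(wi * x * x for wi, x in zip(w, xs))
        sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
        a = (sy - b * sx) / sw
    return a, b

xs = list(range(10))
ys = [2 * x + 1 for x in xs]                     # true line y = 2x + 1
ys[7] = 60.0                                     # one gross outlier
a, b = huber_line(xs, ys)
print(round(a, 2), round(b, 2))                  # close to (1, 2)
```

Ordinary least squares would be pulled badly off the true line by the single outlier; the reweighting drives the outlier's influence toward zero.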
Generalized Method of Moments (GMM) has become one of the main statistical tools for the analysis of economic and financial data. This book is the first to provide an intuitive introduction to the method combined with a unified treatment of GMM statistical theory and a survey of recent important developments in the field. Providing a comprehensive treatment of GMM estimation and inference, it is designed as a resource for both the theory and practice of GMM: it discusses and proves formally all the main statistical results, and illustrates all inference techniques using empirical examples in macroeconomics and finance. Building from the instrumental variables estimator in static linear models, it presents the asymptotic statistical theory of GMM in nonlinear dynamic models. Within this framework it covers classical results on estimation and inference techniques, such as the overidentifying restrictions test and tests of structural stability, and reviews the finite sample performance of these inference methods. And it discusses in detail recent developments on covariance matrix estimation, the impact of model misspecification, moment selection, the use of the bootstrap, and weak instrument asymptotics.
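As a minimal sketch of the starting point the book describes, the just-identified instrumental variables estimator is the simplest GMM case: the sample analogue of the moment condition E[z(y - beta*x)] = 0 gives beta_hat = sum(z*y)/sum(z*x). The simulation below is illustrative (not from the book) and shows OLS biased by endogeneity while IV recovers the true coefficient.

```python
import random

# Just-identified IV as the simplest GMM case. Data are simulated with an
# endogenous regressor: x correlates with the error u, so OLS is biased,
# while the instrument z shifts x but is independent of u.
random.seed(0)
beta, n = 2.0, 100_000
z = [random.gauss(0, 1) for _ in range(n)]                   # instrument
u = [random.gauss(0, 1) for _ in range(n)]                   # structural error
x = [zi + ui + random.gauss(0, 1) for zi, ui in zip(z, u)]   # endogenous regressor
y = [beta * xi + ui for xi, ui in zip(x, u)]

ols = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
iv = sum(zi * yi for zi, yi in zip(z, y)) / sum(zi * xi for zi, xi in zip(z, x))
print(f"OLS: {ols:.3f}  IV: {iv:.3f}")    # OLS biased upward, IV near 2.0
```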
This book sheds new light on a recently introduced monetary tool - negative interest rate policy (NIRP). It provides in-depth insight into this phenomenon, implemented by the central banks of several economies, for example the Eurozone, Switzerland and Japan, and into its possible impact on systemic risk. Although it was introduced as a temporary policy instrument, it may remain in use for a longer period and by a greater range of central banks than initially expected, so the book explores its effects and implications for the banking sector and financial markets, with a particular focus on potentially adverse consequences. There is a strong accent on the uniqueness of negative policy rates in the context of financial stability concerns. The authors assess whether NIRP has any - or in principle a stronger - impact on systemic risk than conventional monetary policy. The book focuses on presenting and evaluating the initial experience of NIRP during normal, i.e. pre-COVID, times, rather than in periods in which pre-established macroeconomic relations are rapidly disrupted or in which the source of the disruption is not purely economic in nature, unlike in a systemic crisis. The authors adopt both theoretical and practical approaches to explore the key issues and outline the policy implications for both monetary and macroprudential authorities with respect to negative interest rate policy; the book will thus provide a useful guide for policymakers, academics, advanced students and researchers of financial economics and international finance.
Valuable software, realistic examples, clear writing, and fascinating topics help you master key spreadsheet and business analytics skills with SPREADSHEET MODELING AND DECISION ANALYSIS, 8E. You'll find everything you need to become proficient in today's most widely used business analytics techniques using Microsoft (R) Office Excel (R) 2016. Author Cliff Ragsdale -- respected innovator in business analytics -- guides you through the skills you need, using the latest Excel (R) for Windows. You gain the confidence to apply what you learn to real business situations with step-by-step instructions and annotated screen images that make examples easy to follow. The World of Management Science sections further demonstrate how each topic applies to a real company. Each new edition includes extended trial licenses for Analytic Solver Platform and XLMiner with powerful simulation and optimization tools for descriptive and prescriptive analytics and a full suite of tools for data mining in Excel.