The book evaluates the importance of constitutional rules and property rights for the German economy between 1990 and 2015. It is an economic-historical study embedded in institutional economics, with its main points of reference in positive constitutional economics and property rights theory. This interdisciplinary work combines a theoretical with an empirical dimension and a qualitative with a quantitative approach. Formal institutions played a fundamental role in Germany's post-reunification economic changes: they set the legal and institutional framework for the transition process in Eastern Germany and for the unification, integration, and convergence of the two parts of the country. Although the convergence process was not completed, the effects of these formal rules were positive, especially for the former GDR.
This book presents the principles and methods for the practical analysis and prediction of economic and financial time series. It covers decomposition methods, autocorrelation methods for univariate time series, volatility and duration modeling for financial time series, and multivariate time series methods, such as cointegration and recursive state space modeling. It also includes numerous practical examples to demonstrate the theory using real-world data, as well as exercises at the end of each chapter to aid understanding. This book serves as a reference text for researchers, students and practitioners interested in time series, and can also be used for university courses on econometrics or computational finance.
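As a brief, hedged illustration of the univariate autocorrelation methods mentioned in this description (a minimal sketch using the statsmodels library, not an excerpt from the book's own examples or data):

```python
# Minimal sketch: fit an AR(1) model to a simulated series and produce a
# short forecast, in the spirit of the univariate autocorrelation methods
# described above. Illustrative only; not taken from the book.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)

# Simulate an AR(1) process: y_t = 0.7 * y_{t-1} + e_t
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()

fit = ARIMA(y, order=(1, 0, 0)).fit()  # ARIMA(1,0,0) is an AR(1) with constant
print(fit.params)                      # estimated constant, AR coefficient, variance
print(fit.forecast(steps=5))           # five-step-ahead point forecasts
```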
Economic indicators provide invaluable insights into how different economies and financial markets are performing, enabling practitioners to adjust their investment strategies and achieve higher returns. To make the right decisions, however, you must know how to interpret the relevant indicators. Using Economic Indicators in Analysing Financial Markets provides this important guidance. The first and second parts of the book focus on short-term analysis, explaining exactly what the indicators are, why they are significant, where and when they are published, and how reliable they are. In the third part, author Bernd Krampen turns to medium- and long-term economic trends, showing how the indicators discussed earlier, together with additional market indicators such as stocks, bond yields, and commodities, can be employed as a basis for forecasting both GDP growth and inflation, including the estimation of possible future recessions. The fourth part illustrates the predominantly good forecasting properties of sentiment indicators using the real estate market as an example, and is rounded off by an introduction to psychology and behavioural finance that provides further tips and tricks for analysing financial markets. Using Economic Indicators in Analysing Financial Markets is an invaluable resource for investors, strategists, policymakers, students, and private investors worldwide who want to understand the true meaning of the latest economic trends and make the best decisions for future profits on financial markets.
The book describes the theoretical principles of nonstatistical methods of data analysis without going deeply into complex mathematics. The emphasis is on presenting solved examples of real data, drawn either from the authors' laboratories or from the open literature. The examples cover a wide range of applications, such as quality assurance and quality control, critical analysis of experimental data, comparison of data samples from various sources, robust linear and nonlinear regression, and various tasks from financial analysis. The examples are useful primarily for chemical engineers, including analytical and quality laboratories in industry, and for designers of chemical and biological processes. Features: An exclusive title on mathematical gnostics with multidisciplinary applications and a specific focus on chemical engineering. Clarifies the role of data-space metrics, including the right way to aggregate uncertain data. Offers a new look at data probability, information, entropy, and the thermodynamics of data uncertainty. Enables the design of probability distributions for all real data samples, including small ones. Includes data for the examples, with solutions and exercises in R or Python. The book is aimed at senior undergraduate students, researchers, and professionals in chemical and process engineering, engineering physics, statistics, mathematics, materials, geotechnical and civil engineering, mining, sales, marketing and service, and finance.
Military organizations around the world are typically huge producers and consumers of data. Accordingly, they stand to gain from the many benefits associated with data analytics. However, for leaders in defense organizations, whether in government or industry, accessible use cases are not always available. This book presents a diverse collection of cases that explore the realm of possibilities in military data analytics, covering topics such as context for maritime situation awareness; data analytics for electric power and energy applications; environmental data analytics in military operations; data analytics and training effectiveness evaluation; harnessing single-board computers for military data analytics; and analytics for military training in virtual reality environments. A chapter on single-board computers explores their application in a variety of domains, including wireless sensor networks, unmanned vehicles, and cluster computing. An investigation into a process for extracting and codifying expert knowledge provides a practical and useful model for soldiers that can support diagnostics, decision making, analysis of alternatives, and myriad other analytical processes. Data analytics also has a role in military learning: one chapter describes ongoing work with the United States Army Research Laboratory to apply data analytics techniques to the design of courses, the evaluation of individual and group performance, and the tailoring of the learning experience to achieve optimal learning outcomes in a minimum amount of time. Another chapter discusses how virtual reality and analytics are transforming the training of military personnel, along with monitoring, decision making, readiness, and operations. Military Applications of Data Analytics brings together a collection of technical and application-oriented use cases. It enables decision makers and technologists to make connections between data analytics and fields such as virtual reality and cognitive science that are driving military organizations around the world forward.
Dependence Modeling with Copulas covers the substantial advances that have taken place in the field during the last 15 years, including vine copula modeling of high-dimensional data. Vine copula models are constructed from a sequence of bivariate copulas. The book develops generalizations of vine copula models, including common and structured factor models that extend from the Gaussian assumption to copulas. It also discusses other multivariate constructions and parametric copula families that have different tail properties and presents extensive material on dependence and tail properties to assist in copula model selection. The author shows how numerical methods and algorithms for inference and simulation are important in high-dimensional copula applications. He presents the algorithms as pseudocode, illustrating their implementation for high-dimensional copula models. He also incorporates results to determine dependence and tail properties of multivariate distributions for future constructions of copula models.
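For concreteness, the construction from a sequence of bivariate copulas can be written out for three dimensions; the following is the standard pair-copula (D-vine) decomposition of a trivariate density, a textbook identity rather than a quotation from the book:

```latex
% Pair-copula (D-vine) decomposition of a trivariate density:
% each c is a bivariate copula density and the F's are (conditional) margins.
f(x_1, x_2, x_3) = f_1(x_1)\, f_2(x_2)\, f_3(x_3)\,
  c_{12}\bigl(F_1(x_1), F_2(x_2)\bigr)\,
  c_{23}\bigl(F_2(x_2), F_3(x_3)\bigr)\,
  c_{13|2}\bigl(F_{1|2}(x_1 \mid x_2), F_{3|2}(x_3 \mid x_2)\bigr)
```

Higher-dimensional vine models iterate this conditioning step, which is why the algorithmic, pseudocode-level treatment described above matters in practice.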
Contemporary economists, when analyzing people's economic behavior, need to draw on a diversity of research methods and modern ways of discovering knowledge. The increasing popularity of economic experiments requires IT tools and quantitative methods that facilitate the analysis of the research material obtained from these experiments and the formulation of correct conclusions. This proceedings volume presents problems in contemporary economics and provides innovative solutions using a range of quantitative and experimental tools. Featuring selected contributions presented at the 2018 Computational Methods in Experimental Economics Conference (CMEE 2018), the book provides a modern economic perspective on such important issues as sustainable development, consumption, production, national wealth, the silver economy, behavioral finance, economic and non-economic factors determining the behavior of household members, consumer preferences, social campaigns, and neuromarketing. International case studies are also offered.
The Great East Japan Earthquake of March 2011 left the entire world in a state of shock. The international community was unable to fathom how a major economic power, with one of the most extensive natural disaster preparedness programs in the world, could be laid bare to such destruction. Even other highly developed countries began questioning their own abilities to handle natural disasters. Different nations have faced disasters of varying intensity throughout history, and it is in the best interests of the global community to share experiences and wisdom in order to minimize damage wrought by future catastrophes.
Metrology is the science of measurement. Although classical economists have emphasized the importance of measurement per se, the majority of economics-based writings on the topic have taken the form of government reports related to the activities of specific national metrology laboratories. This book is the first systematic study of measurement activity at a national metrology laboratory, the laboratory studied being the U.S. National Institute of Standards and Technology (NIST) within the U.S. Department of Commerce. The primary objective of the book is to emphasize, for academic and policy audiences, the economic importance of measurement not only as an area of study but also as a tool for sustaining technological advancement as an element of economic growth. Toward this goal, the book offers an overview of the economic benefits and consequences of measurement standards; an argument for public sector support of measurement standards; a historical perspective on the measurement activities at NIST; an empirical analysis of one particular measurement activity at NIST, namely calibration testing; and a roadmap for future research on the economics of metrology.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulty knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong direction. The book searches for the reasons behind the emergence of false feedback, thereby contributing to a wider discussion in the field of metascience about the practices researchers follow in pursuing their daily business; it thus offers a metascientific case study of empirical economics. The main strength of the book lies in the numerous smaller insights it provides throughout. It delves into deep discussions of various theoretical issues, which it illustrates with many applied examples and a wide array of references, especially to the philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research, as well as of the possible solutions. The main audience of the book is applied researchers working with data, in particular those who have found certain aspects of their research practice problematic.
Volume 39A of Research in the History of Economic Thought and Methodology features a selection of essays presented at the 2019 Conference of the Latin American Society for the History of Economic Thought (ALAHPE), edited by Felipe Almeida and Carlos Eduardo Suprinyak, as well as a new general-research essay by Daniel Kuehn, an archival discovery by Katia Caldari and Luca Fiorito, and a book review by John Hall.
This book summarizes the results of the workshop "Uniform Distribution and Quasi-Monte Carlo Methods", held as part of the RICAM Special Semester on "Applications of Algebra and Number Theory" in October 2013. The survey articles in this book focus on number-theoretic point constructions, uniform distribution theory, and quasi-Monte Carlo methods. As deterministic versions of the Monte Carlo method, quasi-Monte Carlo rules enjoy increasing popularity, with many fruitful applications in mathematical practice, for example in finance, computer graphics, and biology. The goal of this book is to give an overview of recent developments in uniform distribution theory, quasi-Monte Carlo methods, and their applications, presented by leading experts in these vibrant fields of research.
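As a minimal, hedged sketch of the idea that quasi-Monte Carlo rules replace random sampling with deterministic low-discrepancy points (an illustration under assumed tooling, not material from the book), SciPy's Sobol generator can be compared with plain Monte Carlo on a toy integral:

```python
# Minimal sketch of a quasi-Monte Carlo rule versus plain Monte Carlo,
# estimating the integral of f(x, y) = x * y over the unit square
# (true value 0.25). Illustrative only; not taken from the book.
import numpy as np
from scipy.stats import qmc

n = 1024  # a power of two, as recommended for Sobol point sets

# Plain Monte Carlo: pseudo-random points
rng = np.random.default_rng(0)
u = rng.random((n, 2))
mc_estimate = np.mean(u[:, 0] * u[:, 1])

# Quasi-Monte Carlo: deterministic low-discrepancy Sobol points
sobol = qmc.Sobol(d=2, scramble=False)
q = sobol.random(n)
qmc_estimate = np.mean(q[:, 0] * q[:, 1])

print(f"MC:  {mc_estimate:.6f}  (error {abs(mc_estimate - 0.25):.2e})")
print(f"QMC: {qmc_estimate:.6f}  (error {abs(qmc_estimate - 0.25):.2e})")
```

For smooth integrands, the quasi-Monte Carlo error typically shrinks at a rate close to O(1/n) rather than the O(1/sqrt(n)) of plain Monte Carlo, which is what drives the popularity noted above.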
The second book in a set of ten on quantitative finance for practitioners. Presents the theory needed to better understand applications. Supplements previous training in mathematics. Built from the author's four decades of experience in industry, research, and teaching.
The interaction between mathematicians and statisticians has proved to be an effective approach to the analysis of insurance and financial problems, in particular from an operative perspective. The MAF2006 conference, held at the University of Salerno in 2006, had precisely this purpose, and the collection published here gathers some of the papers presented at the conference and subsequently revised for this aim. They cover a wide variety of subjects in the insurance and financial fields.
In-depth coverage of discrete-time theory and methodology. Numerous fully worked-out examples and exercises in every chapter. Mathematically rigorous and consistent, yet bridging various basic and more advanced concepts. A judicious balance of financial theory with mathematical and computational methods. Includes a guide to the material.
This book contains some of the results from the research project "Demand for Food in the Nordic Countries," which was initiated in 1988 by Professor Olof Bolin of the Agricultural University in Ultuna, Sweden, and by Professor Karl Johan Weckman of the University of Helsinki, Finland. A pilot study was carried out by Bengt Assarsson, which in 1989 led to a successful application for a research grant from the NKJ (the Nordic Contact Body for Agricultural Research) through the national research councils for agricultural research in Denmark, Finland, Norway, and Sweden. We are very grateful to Olof Bolin and Karl Johan Weckman, without whom this project would not have come about, and to the national research councils in the Nordic countries for the generous financial support we have received for this project. We have received comments and suggestions from many colleagues, and this has improved our work substantially. At the start of the project a reference group was formed, consisting of Professor Olof Bolin, Professor Anders Klevmarken, Agr. lic. Gert Aage Nielsen, Professor Karl Johan Weckman, and Cand. oecon. Per Halvor Vale. Gert Aage Nielsen left the group early in the project for a position in Landbanken and was replaced by Professor Lars Otto, while Per Halvor Vale soon joined the research staff. The reference group has given us useful suggestions and encouraged us in our work. We are very grateful to them.
Microbehavioral Econometric Methods and Environmental Studies uses microeconometric methods to model the behavior of individuals, then demonstrates the modeling approaches in addressing policy needs. It links theory and methods with applications, and it incorporates data to connect individual choices and global environmental issues. This extension of traditional environmental economics presents modeling strategies and methodological techniques, then applies them to hands-on examples. Throughout the book, readers can access chapter summaries, problem sets, multiple household survey datasets on agricultural and natural resources in Sub-Saharan Africa, South America, and India, and empirical results and solutions from the SAS software.
This book surveys big data tools used in macroeconomic forecasting and addresses related econometric issues, including how to capture dynamic relationships among variables; how to select parsimonious models; how to deal with model uncertainty, instability, non-stationarity, and mixed frequency data; and how to evaluate forecasts, among others. Each chapter is self-contained with references, and provides solid background information, while also reviewing the latest advances in the field. Accordingly, the book offers a valuable resource for researchers, professional forecasters, and students of quantitative economics.
Any enquiry into the nature, performance, role, demerits, growth, efficiency, or other aspects of financial services such as banking and insurance requires rigorous estimates of their economic output, i.e., the economic contributions made by these firms, as well as by the industries as a whole. Accordingly, this book condenses several theoretical, methodological, empirical, and philosophical issues in conceptualizing, measuring, and empirically operationalizing the economic output of the banking and insurance industries. The analytical focus is on both global and emerging markets perspectives. The book synthesizes applied and conceptual evidence to locate the analytical patterns, consensus, and disagreements of the chosen theme. The selected subject matter is studied in both firm-level and aggregate settings, bringing literature of varied scopes together. Contributions from various international academics, practitioners, and policymakers further enrich the narrative. The book concludes with data-driven case studies that analyze the extent to which the critical performance parameters of the banking and insurance industries in the BRIICS economies, including estimates of aggregate industry-level partial factor productivities, total factor productivity, technical efficiency, and returns to scale, vary across alternative measures of their output. The work also provides a brief note on the measurement of inputs, followed by a discussion of limitations, future scope, and conclusions. This work will be valuable for researchers and policymakers undertaking performance analyses of banking and insurance activities, providing them with an examination of a plethora of analytical options and related issues in the theory and praxis of output measurement, all finely organized into one single volume.
In the modern world, data is a vital asset for any organization, regardless of industry or size. The world is built upon data. However, data without knowledge is useless. The aim of this book, briefly, is to introduce new approaches that can be used to shape and forecast the future by combining the two disciplines of statistics and economics. Readers of Modeling and Advanced Techniques in Modern Economics will find valuable information from a diverse group of experts on topics such as finance, econometric models, stochastic financial models and machine learning, and the application of models to financial and macroeconomic data.
This book sheds new light on a recently introduced monetary tool: negative interest rate policy (NIRP). It provides in-depth insight into this phenomenon, as conducted by central banks in several economies, for example the Eurozone, Switzerland, and Japan, and into its possible impact on systemic risk. Although NIRP was introduced as a temporary policy instrument, it may remain in use for a longer period, and by a greater range of central banks, than initially expected; the book therefore explores its effects and implications for the banking sector and financial markets, with a particular focus on potentially adverse consequences. There is a strong accent on the uniqueness of negative policy rates in the context of financial stability concerns. The authors assess whether NIRP has any impact, or in principle a stronger impact, on systemic risk than conventional monetary policy. The book focuses on presenting and evaluating the initial experience of NIRP during normal, i.e. pre-COVID, times, rather than in periods in which pre-established macroeconomic relations are rapidly disrupted or in which the source of the disruption is not purely economic in nature, as in a systemic crisis. The authors adopt both theoretical and practical approaches to explore the key issues and outline the policy implications of negative interest rate policy for both monetary and macroprudential authorities. The book will thus provide a useful guide for policymakers, academics, advanced students, and researchers of financial economics and international finance.
Covers the key material required by students wishing to understand and analyse the core empirical issues in economics. It focuses on descriptive statistics, probability concepts, and basic econometric techniques, and has an accompanying website that contains all the data used in the examples and provides exercises for undertaking original research.
This book bridges the gap between economic theory and spatial econometric techniques. It is accessible to those with only a basic statistical background and no prior knowledge of spatial econometric methods. It provides a comprehensive treatment of the topic, motivating the reader with examples and analysis. The volume provides a rigorous treatment of the basic spatial linear model, and it discusses the violations of the classical regression assumptions that occur when dealing with spatial data.
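As a point of reference, the basic spatial linear model treated in such texts is commonly written in the spatial lag (spatial autoregressive) form below; this is the standard formulation, not a quotation from the book:

```latex
% Spatial autoregressive (spatial lag) model:
% W is an n-by-n spatial weights matrix and rho the spatial dependence parameter.
y = \rho W y + X\beta + \varepsilon, \qquad
\varepsilon \sim N(0, \sigma^2 I_n)
```

Because the spatially lagged term Wy is correlated with the error term, ordinary least squares is inconsistent for this model, which is exactly the kind of violation of the classical regression assumptions the volume discusses; maximum likelihood or instrumental variable estimators are used instead.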
Features: Accessible to readers with a basic background in probability and statistics. Covers fundamental concepts of experimental design and cause-effect relationships. Introduces classical ANOVA models, including contrasts and multiple testing. Provides an example-based introduction to mixed models. Features basic concepts of split-plot and incomplete block designs. R code available for all steps. Supplementary website with additional resources and updates.
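The book's own code is in R; the following is a rough, hedged Python analogue of the classical one-way ANOVA it introduces, using simulated data and the statsmodels library (illustrative only):

```python
# Minimal sketch of a classical one-way ANOVA: compare three treatment
# groups on a simulated response. Illustrative only; the book works in R.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], 20),
    "y": np.concatenate([
        rng.normal(10.0, 2.0, 20),  # group A
        rng.normal(12.0, 2.0, 20),  # group B
        rng.normal(12.5, 2.0, 20),  # group C
    ]),
})

fit = smf.ols("y ~ C(group)", data=df).fit()  # one-way ANOVA as a linear model
print(anova_lm(fit))                          # classical ANOVA table (F-test)
```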
You may like...
Operations and Supply Chain Management
James Evans, David Collier
Hardcover
Introductory Econometrics - A Modern…
Jeffrey Wooldridge
Hardcover
Agent-Based Modeling and Network…
Akira Namatame, Shu-Heng Chen
Hardcover
Operations And Supply Chain Management
David Collier, James Evans
Hardcover