Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects of solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The monograph aims to promote research at the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
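For orientation, the central quantity in the first part is the ordinary least-squares estimator (a standard result, stated here for context rather than quoted from the monograph):

$$\hat{\beta} = (X^\top X)^{-1} X^\top y \quad \text{for the model } y = X\beta + \varepsilon,$$

while the SURE systems of the second part stack $G$ regressions $y_i = X_i \beta_i + \varepsilon_i$ whose disturbances are contemporaneously correlated across equations, which is what makes joint (and computationally demanding) estimation worthwhile.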
The first edition of this book has been described as a landmark, the first of its kind in applied econometrics. This second edition is thoroughly revised and updated and explains how to use many recent technical developments in time series econometrics. The main objective of the book is to help applied economists with a limited background in econometric estimation theory to understand and apply widely used time series econometric techniques.
The Oxford Handbook of the Economics of Networks represents the frontier of research into how and why networks form, how they influence behavior, how they help govern outcomes in an interactive world, and how they shape collective decision making, opinion formation, and diffusion dynamics. From a methodological perspective, the contributors to this volume devote attention to theory, field experiments, laboratory experiments, and econometrics. Theoretical work in network formation, games played on networks, repeated games, and the interaction between linking and behavior is synthesized. A number of chapters are devoted to studying social processes mediated by networks. Topics here include opinion formation, diffusion of information and disease, and learning. There are also chapters devoted to financial contagion and systemic risk, motivated in part by the recent financial crises. Another section discusses communities, with applications including social trust, favor exchange, and social collateral; the importance of communities for migration patterns; and the role that networks and communities play in the labor market. A prominent role of networks, from an economic perspective, is that they mediate trade. Several chapters cover bilateral trade in networks, strategic intermediation, and the role of networks in international trade. Contributions also discuss the role of networks in organizations: one chapter examines their importance for organizational performance, while two other chapters discuss managing networks of consumers and pricing in the presence of network-based spillovers. Finally, the authors discuss the internet as a network with attention to the issue of net neutrality.
Bernan Press proudly presents the 15th edition of Employment, Hours, and Earnings: States and Areas, 2020. A special addition to Bernan Press's Handbook of U.S. Labor Statistics: Employment, Earnings, Prices, Productivity, and Other Labor Data, this reference consolidates a wealth of employment information, providing monthly and annual data on hours worked and earnings made by industry, including figures and summary information spanning several years. These data are presented for states and metropolitan statistical areas. This edition features:
- Nearly 300 tables with data on employment for each state, the District of Columbia, and the nation's seventy-five largest metropolitan statistical areas (MSAs)
- Detailed, non-seasonally adjusted industry data organized by month and year
- Hours and earnings data for each state, by industry
- An introduction for each state and the District of Columbia that notes salient data and noteworthy trends, including changes in population and the civilian labor force, industry increases and declines, employment and unemployment statistics, and a chart detailing employment percentages by industry
- Rankings of the seventy-five largest MSAs, including census population estimates, unemployment rates, and the percent change in total nonfarm employment
- Concise technical notes that explain pertinent facts about the data, including sources, definitions, and significant changes, and provide references for further guidance
- A comprehensive appendix that details the geographical components of the seventy-five largest MSAs
The employment, hours, and earnings data in this publication provide a detailed and timely picture of the fifty states, the District of Columbia, and the nation's seventy-five largest MSAs. These data can be used to analyze key factors affecting state and local economies and to compare national cyclical trends to local-level economic activity. This reference is an excellent source of information for analysts in both the public and private sectors. Readers involved in public policy can use these data to gauge the health of the economy, to identify clearly which sectors are growing and which are declining, and to determine the need for federal assistance. State and local jurisdictions can use the data to determine the need for services, including training and unemployment assistance, and for planning and budgetary purposes. In addition, the data can be used to forecast tax revenue. In private industry, the data can be used by business owners to compare their business to the economy as a whole, and to identify suitable areas when making decisions about plant locations, wholesale and retail trade outlets, and the location of a particular sector base.
Through analysis of the European Union Emissions Trading Scheme (EU ETS) and the Clean Development Mechanism (CDM), this book demonstrates how to use a variety of econometric techniques to analyze the evolving and expanding carbon markets sphere, techniques that can be extrapolated to the worldwide marketplace. It features stylized facts about carbon markets from an economics perspective, as well as covering key aspects of pricing strategies, risk and portfolio management.
Franz Ferschl is seventy. According to his birth certificate this is true, but it is hard to believe. Two of the three editors remember very well the Golden Age of Operations Research at Bonn, when Franz Ferschl worked together with Wilhelm Krelle, Martin Beckmann and Horst Albach. The importance of this fruitful cooperation is reflected by the fact that half of the contributors to this book were strongly influenced by Franz Ferschl and his colleagues at the University of Bonn. Clearly, Franz Ferschl also left his mark at the other places of his professional activity, in Vienna and Munich, as the present volume demonstrates. Born in 1929 in the Upper Austrian Mühlviertel, his scientific education brought him to Vienna, where he studied mathematics. In his early years he was attracted by statistics and operations research. During his employment at the Österreichische Bundeskammer für Gewerbliche Wirtschaft in Vienna he prepared his famous book on queueing theory and stochastic processes in economics. This work was accomplished in the scarce time left over from his duties at the Bundeskammer, mostly between 6 a.m. and midnight. Those efforts were, however, soon rewarded with the chair of statistics at Bonn University. A true Austrian, he could not be kept from returning to Vienna by the amenities of the Rhineland, and there he took the chair of statistics.
This book collects results from ad hoc surveys on firms' pricing behavior conducted in 2003 and 2004 by nine national central banks of the euro area in the context of a joint research project (the Eurosystem Inflation Persistence Network). These surveys have proved to be an efficient way to test theories on the pricing strategies of economic agents, documenting, in qualitative terms, the underlying rationale of the observed pricing patterns. The book provides an unprecedented amount of information from more than 11,000 euro area firms, addressing issues such as the relevance of nominal and real rigidities, the information set used by firms in the price setting process, the strategy followed to review prices, the frequency of both price reviews and price changes, the reasons underlying price stickiness, and asymmetries in price adjustment. It also compares results for the euro area to those obtained for other countries by similar studies. Finally, it draws the main implications for theoretical modeling and for monetary policy.
* Includes many mathematical examples and problems through which students work directly with both standard and nonstandard models of behaviour, developing problem-solving and critical-thinking skills that are more valuable than memorized content, which is quickly forgotten.
* The applications explored in the text emphasise issues of inequality, social mobility, culture and poverty, demonstrating the impact of behavioural economics in the areas students are most passionate about.
* The text has a standardized structure (six parts of three chapters each), which provides a clear and consistent roadmap for students taking the course.
In response to the damage caused by a growth-led global economy, researchers across the world have investigated the association between environmental pollution and its possible determinants using different models and techniques. Most famously, the environmental Kuznets curve hypothesizes an inverted U-shaped association between environmental degradation and gross domestic product (GDP). This book explores the latest literature on the environmental Kuznets curve, including developments in the methodology, the impacts of the pandemic, and other recent findings. Researchers have recently broadened the list of drivers of environmental pollution under consideration, which now includes variables such as foreign direct investment, trade expansion, financial development, human activities, population growth, and renewable and nonrenewable energy resources, all of which vary across countries and over time. And in addition to CO2 emissions, other proxies for environmental quality – such as water, land, and ecological footprints – have been used in recent studies. This book also analyses the relationship between economic growth and the environment during the COVID-19 crisis, presenting new empirical work on the impact of the pandemic on energy use, the financial sector, trade, and tourism. Collectively, these developments have refined the direction and scope of the environmental Kuznets curve hypothesis and broadened the basket of dependent and independent variables that may be incorporated. This book will be invaluable reading for researchers in environmental economics and econometrics.
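In its simplest reduced form (a textbook specification, offered here for orientation rather than drawn from this volume), the hypothesis is tested with a quadratic regression:

$$E_{it} = \alpha_i + \beta_1\, \mathrm{GDP}_{it} + \beta_2\, \mathrm{GDP}_{it}^2 + \gamma^\top Z_{it} + \varepsilon_{it},$$

where $E_{it}$ is an emissions or footprint proxy and $Z_{it}$ collects additional drivers such as trade or energy use. The inverted U requires $\beta_1 > 0$ and $\beta_2 < 0$, with the turning point at $\mathrm{GDP}^{*} = -\beta_1/(2\beta_2)$.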
Risk Measures and Insurance Solvency Benchmarks: Fixed-Probability Levels in Renewal Risk Models is written for academics and practitioners who are concerned about potential weaknesses of the Solvency II regulatory system. It is also intended for readers who are interested in pure and applied probability, have a taste for classical and asymptotic analysis, and are motivated to delve into rather intensive calculations. The formal prerequisite for this book is a good background in analysis. The desired prerequisite is some degree of probability training, but someone with knowledge of the classical real-variable theory, including asymptotic methods, will also find this book interesting. For those who find the proofs too complicated, it may be reassuring that most results in this book are formulated in rather elementary terms. This book can also be used as reading material for basic courses in risk measures, insurance mathematics, and applied probability. The material of this book was partly used by the author for his courses at several universities in Moscow, at Copenhagen University, and at the University of Montreal. Features:
- Requires only minimal mathematical prerequisites in analysis and probability
- Suitable for researchers and postgraduate students in related fields
- Could be used as a supplement to courses in risk measures, insurance mathematics and applied probability.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulty knowing where they stand in their inquiries, and false feedback regularly leads them in the wrong direction. The book searches for the reasons behind the emergence of false feedback, thereby contributing to a wider discussion in the field of metascience about the practices researchers follow in their daily business; it thus offers a metascientific case study of the field of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates with many applied examples and a wide array of references, especially to the philosophy of science. It puts flesh on complicated and often abstract subjects, particularly controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research as well as of the possible solutions. The book's main audience is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.
The research presented here focuses on spatial sampling of agricultural resources. The authors introduce sampling designs and methods for producing accurate estimates of crop production for harvests across different regions and countries. With the help of real and simulated examples performed with the open-source software R, readers will learn about the different phases of spatial data collection. The agricultural data analyzed in this book help policymakers and market stakeholders to monitor the production of agricultural goods and its effects on the environment and food safety.
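A central design-based tool in this setting (a standard estimator, stated for orientation rather than quoted from the book) is the Horvitz-Thompson estimator of a regional crop total:

$$\hat{T}_{HT} = \sum_{i \in s} \frac{y_i}{\pi_i},$$

where $s$ is the spatial sample, $y_i$ the production observed on unit $i$, and $\pi_i$ that unit's inclusion probability under the chosen design; spatially balanced designs choose the $\pi_i$ and the selection mechanism so that this estimator is precise at low survey cost.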
If you know a little bit about financial mathematics but don't yet know a lot about programming, then C++ for Financial Mathematics is for you. C++ is an essential skill for many jobs in quantitative finance, but learning it can be a daunting prospect. This book gathers together everything you need to know to price derivatives in C++ without unnecessary complexities or technicalities. It leads the reader step-by-step from programming novice to writing a sophisticated and flexible financial mathematics library. At every step, each new idea is motivated and illustrated with concrete financial examples. As employers understand, there is more to programming than knowing a computer language. As well as covering the core language features of C++, this book teaches the skills needed to write truly high quality software. These include topics such as unit tests, debugging, design patterns and data structures. The book teaches everything you need to know to solve realistic financial problems in C++. It can be used for self-study or as a textbook for an advanced undergraduate or master's level course.
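As a taste of where such a book leads, here is a minimal sketch of a derivative pricer of the kind it builds up to - a Monte Carlo valuation of a European call with hypothetical parameter values, not code taken from the book's library:

```cpp
// A minimal Monte Carlo pricer for a European call option.
// All parameter values below are hypothetical illustrations.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <random>

// Estimate the call price under geometric Brownian motion:
//   S_T = S_0 * exp((r - 0.5*sigma^2)*T + sigma*sqrt(T)*Z),  Z ~ N(0,1).
double monteCarloCallPrice(double s0, double strike, double r,
                           double sigma, double maturity, int nPaths) {
    std::mt19937 rng(42);  // fixed seed so runs are reproducible
    std::normal_distribution<double> standardNormal(0.0, 1.0);
    const double drift = (r - 0.5 * sigma * sigma) * maturity;
    const double vol = sigma * std::sqrt(maturity);
    double sumPayoff = 0.0;
    for (int i = 0; i < nPaths; ++i) {
        const double sT = s0 * std::exp(drift + vol * standardNormal(rng));
        sumPayoff += std::max(sT - strike, 0.0);  // call payoff at maturity
    }
    // Discount the average simulated payoff back to today.
    return std::exp(-r * maturity) * sumPayoff / nPaths;
}

int main() {
    // Hypothetical market data: spot 100, strike 100, 5% rate, 20% vol, 1 year.
    std::cout << "Call price ~ "
              << monteCarloCallPrice(100.0, 100.0, 0.05, 0.2, 1.0, 100000)
              << '\n';
}
```

A full treatment would separate the payoff, the stochastic model and the numerical method into their own abstractions; structuring such components cleanly is exactly the design-pattern and software-quality material the book covers.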
In recent years, interest in rigorous impact evaluation has grown tremendously in policy-making, economics, public health, social sciences and international relations. Evidence-based policy-making has become a recurring theme in public policy, alongside greater demands for accountability in public policies and public spending, and requests for independent and rigorous impact evaluations as policy evidence. Froelich and Sperlich offer a comprehensive and up-to-date approach to quantitative impact evaluation analysis, also known as causal inference or treatment effect analysis, illustrating the main approaches to identification and estimation: experimental studies, randomization inference and randomized controlled trials (RCTs), matching and propensity score matching and weighting, instrumental variable estimation, difference-in-differences, regression discontinuity designs, quantile treatment effects, and evaluation of dynamic treatments. The book is designed for economics graduate courses but can also serve as a manual for professionals in research institutes, governments, and international organizations evaluating the impact of a wide range of public policies in health, environment, transport and economic development.
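To fix ideas, the canonical difference-in-differences estimand in the two-period, two-group case (a standard formulation, not a specific example from the book) is

$$\hat{\tau}_{DiD} = \left(\bar{y}^{\,T}_{\mathrm{post}} - \bar{y}^{\,T}_{\mathrm{pre}}\right) - \left(\bar{y}^{\,C}_{\mathrm{post}} - \bar{y}^{\,C}_{\mathrm{pre}}\right),$$

which identifies the average treatment effect on the treated under the parallel-trends assumption that, absent treatment, the treated group $T$ would have evolved like the control group $C$.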
This book offers a unique and insightful econometric evaluation of the policies used to fight transnational terrorism between 1990 and 2014. It uses the tools of modern economics, game theory and structural econometrics to analyze the roles of foreign aid, educational capital, and military intervention. Jean-Paul Azam and Veronique Thelen analyze panel data over 25 years across 124 countries. They prove that foreign aid plays a key role in inducing recipient governments to protect the donors' political and economic interests within their sphere of influence. Demonstrating that countries endowed with better educational capital export fewer terrorist attacks, they also illustrate that, in contrast, military intervention is counter-productive in abating terrorism. Recognizing the strides taken by the Obama administration to increase the role of foreign aid and reduce the use of military interventions, this book shows the significant impact this has had in reducing the number of transnational terrorist attacks per source country, and suggests further developments in this vein. Practical and timely, this book will be of particular interest to students and scholars of economics and political science, as well as those working on the wider issue of terrorism. Presenting a series of new findings, the book will also appeal to international policy makers and government officials.
Economic Modeling Using Artificial Intelligence Methods examines the application of artificial intelligence methods to model economic data. Traditionally, economic data have been modeled in the linear domain, where the principles of superposition are valid. Applying artificial intelligence to economic modeling allows for flexible multi-order non-linear modeling. Game theory has also been widely applied in economic modeling; however, its inherent limitations when dealing with many-player games encourage the use of multi-agent systems for modeling economic phenomena. The artificial intelligence techniques used to model economic data include multi-layer perceptron neural networks, radial basis functions, support vector machines, rough sets, genetic algorithms, particle swarm optimization, simulated annealing, multi-agent systems, incremental learning, and fuzzy networks. Signal processing techniques are explored to analyze economic data, namely time-domain methods, time-frequency-domain methods and fractal-dimension approaches. Interesting economic problems such as causality versus correlation, simulating the stock market, modeling and controlling inflation, option pricing, modeling economic growth and portfolio optimization are examined. The relationship between economic dependency and interstate conflict is explored, and knowledge of how economics can be used to foster peace - and vice versa - is investigated. The book deals with the issue of causality in the non-linear domain and applies automatic relevance determination, the evidence framework, the Bayesian approach and Granger causality to understand causality and correlation. Economic Modeling Using Artificial Intelligence Methods makes an important contribution to the area of econometrics and is a valuable source of reference for graduate students, researchers and financial practitioners.
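For reference, in the familiar linear case (a standard formulation; the book extends the idea to the non-linear domain), $x$ is said to Granger-cause $y$ when lags of $x$ improve the prediction of $y$, i.e. when the restriction $\beta_1 = \cdots = \beta_p = 0$ is rejected in

$$y_t = \alpha_0 + \sum_{j=1}^{p} \alpha_j\, y_{t-j} + \sum_{j=1}^{p} \beta_j\, x_{t-j} + \varepsilon_t.$$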
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
"Game Theory for Economists" introduces economists to the game-theoretic approach of modelling economic behaviour and interaction, focusing on concepts and ideas from the vast field of game-theoretic models which find commonly used applications in economics. This careful selection of topics allows the reader to concentrate on the parts of the game which are the most relevant for the economist who does not want to become a specialist. Written at a level appropriate for a student or researcher with a solid microeconomic background, the book should provide the reader with skills necessary to formalize economic games and to make them accessible for game theoretic analysis. It offers a concise introduction to game theory which provides economists with the techniques and results necessary to follow the literature in economic theory; helps the reader formalize economic problems; and, concentrates on equilibrium concepts that are most commonly used in economics.
Master the proven, traditional methods in project management as well as the latest agile practices with Kloppenborg/Anantatmula/Wells' CONTEMPORARY PROJECT MANAGEMENT, 5E. This edition presents project management techniques and expert examples drawn from successful practice and the latest research. All content reflects the knowledge areas and processes of the 6th edition of the PMBOK (R) Guide as well as the domains and principles of the 7th edition of the PMBOK (R) Guide. The book's focused approach helps you build a strong portfolio to showcase project management skills. New features, glossary and an integrated case highlight agile practices, mindset and techniques, while PMP (R)-style questions prepare you for the new 2021 PMP (R) certification exam. You also learn to use Microsoft (R) Project to automate processes. Gain the expertise you need to become a Certified Associate in Project Management (CAPM (R)) or Certified Project Management Professional (PMP (R)) with this edition and MindTap digital resources.
A systematic treatment of dynamic decision making and performance measurement. Modern business environments are dynamic, yet the models used to make decisions and quantify success within them are stuck in the past. In a world where demands, resources, and technology are interconnected and evolving, measures of efficiency need to reflect that environment. In Dynamic Efficiency and Productivity Measurement, Elvira Silva, Spiro E. Stefanou, and Alfons Oude Lansink look at the business process from a dynamic perspective. Their systematic study covers dynamic production environments where current production decisions impact future production possibilities. By considering practical factors like adjustments over time, this book offers an important lens for contemporary microeconomic analysis. Silva, Stefanou, and Oude Lansink develop the analytical foundations of dynamic production technology in both primal and dual representations, with an emphasis on directional distance functions. They cover concepts measuring the production structure (economies of scale, economies of scope, capacity utilization) and performance (allocative, scale and technical inefficiency, productivity) in a methodological and comprehensive way. Through a unified approach, Dynamic Efficiency and Productivity Measurement offers a guide to how firms maximize potential in changing environments and an invaluable contribution to applied microeconomics.
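For reference, the directional distance function at the core of their framework is standardly defined (textbook notation, not necessarily the authors' own) as

$$\vec{D}(x, y;\, g_x, g_y) = \sup\{\beta \ge 0 : (x - \beta g_x,\; y + \beta g_y) \in T\},$$

where $T$ is the production possibility set, $x$ the input vector, $y$ the output vector, and $(g_x, g_y)$ a chosen direction; the dynamic versions studied in the book let current decisions shift the future technology $T$.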
In the Administration building at Linköping University we have one of Oscar Reutersvärd's "Impossible Figures" in three dimensions. I call it "Perspectives of Science." When viewed from a specific point in space there is order and structure in the 3-dimensional figure. When viewed from other points there is disorder and no structure. If a specific scientific paradigm is used, there is order and structure; otherwise there is disorder and no structure. My perspective in Transportation Science has focused on understanding the mathematical structure and the logic underlying the choice probability models in common use. My book with N. F. Stewart on the Gravity model (Erlander and Stewart 1990) was written in this perspective. The present book stems from the same desire to understand underlying assumptions and structure. It investigates how far a new way of defining Cost-Minimizing Behavior can take us. It turns out that all commonly used choice probability distributions of logit type - log linear probability functions - follow from cost-minimizing behavior defined in the new way. In addition, some new nested models appear.
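Concretely, the logit-type choice probabilities in question take the log-linear form (standard notation, stated for orientation rather than taken from the book)

$$P_j = \frac{e^{-\theta c_j}}{\sum_{k} e^{-\theta c_k}},$$

so that $\log P_j$ is linear in the cost $c_j$ of alternative $j$, with $\theta > 0$ governing how sharply choices concentrate on low-cost alternatives.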
The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world's best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists - especially those interested in macroeconomics and the history of economic thought - with the first in-depth analysis of Oxford economics.
This 2nd edition compendium contains and explains essential statistical formulas within an economic context. Expanded by more than 100 pages compared to the 1st edition, the compendium has been supplemented with numerous additional practical examples, which will help readers to better understand the formulas and their practical applications. This statistical formulary is presented in a practice-oriented, clear, and understandable manner, as it is needed for meaningful and relevant application in global business, as well as in the academic setting and economic practice. The topics presented include, but are not limited to: statistical signs and symbols, descriptive statistics, empirical distributions, ratios and index figures, correlation analysis, regression analysis, inferential statistics, probability calculation, probability distributions, theoretical distributions, statistical estimation methods, confidence intervals, statistical testing methods, the Peren-Clement index, and the usual statistical tables. Given its scope, the book offers an indispensable reference guide and is a must-read for undergraduate and graduate students, as well as managers, scholars, and lecturers in business, politics, and economics.
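As a flavour of the entries (a standard result, not quoted from the compendium): the two-sided confidence interval for a mean with unknown variance is

$$\bar{x} \pm t_{1-\alpha/2,\,n-1}\, \frac{s}{\sqrt{n}},$$

where $\bar{x}$ and $s$ are the sample mean and standard deviation, $n$ the sample size, and $t_{1-\alpha/2,\,n-1}$ the Student-$t$ quantile.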
You may like...
- Tax Policy and Uncertainty - Modelling… by Christopher Ball, John Creedy, … (Hardcover, R2,508)
- Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,524)
- Operations and Supply Chain Management by James Evans, David Collier (Hardcover)
- Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
- Handbook of Experimental Game Theory by C. M. Capra, Rachel T. A. Croson, … (Hardcover, R6,105)