The Who, What, and Where of America is designed to provide a sampling of key demographic information. It covers the United States, every state, each metropolitan statistical area, and all the counties and cities with a population of 20,000 or more.
* Who: Age, Race and Ethnicity, and Household Structure
* What: Education, Employment, and Income
* Where: Migration, Housing, and Transportation
Each part is preceded by highlights and ranking tables that show how areas diverge from the national norm. These research aids are invaluable for understanding data from the ACS and for highlighting what it tells us about who we are, what we do, and where we live. Each topic is divided into four tables revealing the results of the data collected from different types of geographic areas in the United States, generally with populations greater than 20,000.
* Table A. States
* Table B. Counties
* Table C. Metropolitan Areas
* Table D. Cities
In this edition, you will find social and economic estimates on the ways American communities are changing with regard to the following: age and race; health care coverage; marital history; educational attainment; income and occupation; commute time to work; employment status; home values and monthly costs; veteran status; and size of home or rental unit. This title is the latest in the County and City Extra Series of publications from Bernan Press. Other titles include County and City Extra, County and City Extra: Special Decennial Census Edition, and Places, Towns, and Townships.
Modelling Spatial and Spatial-Temporal Data: A Bayesian Approach is aimed at statisticians and quantitative social, economic and public health students and researchers who work with small-area spatial and spatial-temporal data. It assumes a grounding in statistical theory up to the standard linear regression model. The book compares both hierarchical and spatial econometric modelling, providing both a reference and a teaching text with exercises in each chapter. The book provides a fully Bayesian, self-contained, treatment of the underlying statistical theory, with chapters dedicated to substantive applications. The book includes WinBUGS code and R code and all datasets are available online. Part I covers fundamental issues arising when modelling spatial and spatial-temporal data. Part II focuses on modelling cross-sectional spatial data and begins by describing exploratory methods that help guide the modelling process. There are then two theoretical chapters on Bayesian models and a chapter of applications. Two chapters follow on spatial econometric modelling, one describing different models, the other substantive applications. Part III discusses modelling spatial-temporal data, first introducing models for time series data. Exploratory methods for detecting different types of space-time interaction are presented, followed by two chapters on the theory of space-time separable (without space-time interaction) and inseparable (with space-time interaction) models. An applications chapter includes: the evaluation of a policy intervention; analysing the temporal dynamics of crime hotspots; chronic disease surveillance; and testing for evidence of spatial spillovers in the spread of an infectious disease. A final chapter suggests some future directions and challenges. Robert Haining is Emeritus Professor in Human Geography, University of Cambridge, England. He is the author of Spatial Data Analysis in the Social and Environmental Sciences (1990) and Spatial Data Analysis: Theory and Practice (2003). He is a Fellow of the RGS-IBG and of the Academy of Social Sciences. Guangquan Li is Senior Lecturer in Statistics in the Department of Mathematics, Physics and Electrical Engineering, Northumbria University, Newcastle, England. His research includes the development and application of Bayesian methods in the social and health sciences. He is a Fellow of the Royal Statistical Society.
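The spatial econometric chapters centre on models such as the spatial lag (simultaneous autoregressive) specification. The book's own code is in WinBUGS and R; purely as a hypothetical illustration of what such a model looks like, the numpy sketch below simulates data from y = rho*W*y + X*beta + e via its reduced form. The weight matrix, sample size, and parameter values are all invented.

```python
# Hypothetical illustration (not from the book): simulating data from a
# first-order spatial lag model y = rho*W*y + X*beta + e and recovering y
# through its reduced form y = (I - rho*W)^{-1} (X*beta + e).
import numpy as np

rng = np.random.default_rng(0)
n = 50                                   # number of areal units (made up)

# Contiguity-style weight matrix for units arranged on a line:
W = np.zeros((n, n))
for i in range(n):
    if i > 0:
        W[i, i - 1] = 1.0
    if i < n - 1:
        W[i, i + 1] = 1.0
W = W / W.sum(axis=1, keepdims=True)     # row-standardise

rho, beta = 0.6, np.array([1.0, 2.0])    # illustrative parameter values
X = np.column_stack([np.ones(n), rng.normal(size=n)])
e = rng.normal(scale=0.5, size=n)

# Reduced form of the spatial lag model:
y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + e)
print(y[:5])
```

Estimation of rho and beta, by maximum likelihood or the Bayesian methods the book describes, would start from data generated in exactly this way.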
The book describes the theoretical principles of nonstatistical methods of data analysis without going deep into complex mathematics. The emphasis is on the presentation of solved examples using real data, either from the authors' laboratories or from the open literature. The examples cover a wide range of applications such as quality assurance and quality control, critical analysis of experimental data, comparison of data samples from various sources, robust linear and nonlinear regression, as well as various tasks from financial analysis. The examples are useful primarily for chemical engineers, including analytical/quality laboratories in industry and designers of chemical and biological processes. Features:
* An exclusive title on Mathematical Gnostics with multidisciplinary applications and a specific focus on chemical engineering.
* Clarifies the role of data-space metrics, including the right way to aggregate uncertain data.
* Brings a new look at data probability, information, entropy and the thermodynamics of data uncertainty.
* Enables the design of probability distributions for all real data samples, including small ones.
* Includes data for the examples, with solutions and exercises in R or Python.
The book is aimed at senior undergraduate students, researchers, and professionals in chemical/process engineering, engineering physics, statistics, mathematics, materials, geotechnical and civil engineering, mining, sales, marketing and service, and finance.
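One of the applications listed above is robust linear regression. The sketch below is not the book's mathematical-gnostic method; it is a conventional Huber M-estimator fitted with statsmodels on invented data, shown only to illustrate the kind of task the examples address.

```python
# Hypothetical sketch of one application named in the blurb: robust linear
# regression with an ordinary Huber M-estimator (NOT the gnostic estimators
# the book itself develops). Data are simulated, with a few gross outliers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=100)
y[:5] += 8.0                              # inject a few gross outliers

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                                   # pulled towards the outliers
robust = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()    # down-weights them
print(ols.params, robust.params)
```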
Military organizations around the world are normally huge producers and consumers of data. Accordingly, they stand to gain from the many benefits associated with data analytics. However, for leaders in defense organizations, whether in government or industry, accessible use cases are not always available. This book presents a diverse collection of cases that explore the realm of possibilities in military data analytics. These use cases explore such topics as:
* Context for maritime situation awareness
* Data analytics for electric power and energy applications
* Environmental data analytics in military operations
* Data analytics and training effectiveness evaluation
* Harnessing single board computers for military data analytics
* Analytics for military training in virtual reality environments
A chapter on using single board computers explores their application in a variety of domains, including wireless sensor networks, unmanned vehicles, and cluster computing. The investigation into a process for extracting and codifying expert knowledge provides a practical and useful model for soldiers that can support diagnostics, decision making, analysis of alternatives, and myriad other analytical processes. Data analytics is seen as having a role in military learning, and a chapter in the book describes the ongoing work with the United States Army Research Laboratory to apply data analytics techniques to the design of courses, evaluation of individual and group performances, and the ability to tailor the learning experience to achieve optimal learning outcomes in a minimum amount of time. Another chapter discusses how virtual reality and analytics are transforming the training of military personnel. Virtual reality and analytics are also transforming monitoring, decision making, readiness, and operations. Military Applications of Data Analytics brings together a collection of technical and application-oriented use cases. It enables decision makers and technologists to make connections between data analytics and such fields as virtual reality and cognitive science that are driving military organizations around the world forward.
Since the financial crisis, the issue of the 'one percent' has become the centre of intense public debate, unavoidable even for members of the elite themselves. Moreover, inquiring into elites has taken centre-stage once again in both journalistic investigations and academic research. New Directions in Elite Studies attempts to move the social scientific study of elites beyond economic analysis, which has greatly improved our knowledge of inequality, but is restricted to income and wealth. In contrast, this book mobilizes a broad scope of research methods to uncover the social composition of the power elite - the 'field of power'. It reconstructs processes through which people gain access to positions in this particular social space, examines the various forms of capital they mobilize in the process - economic, but also cultural and social capital - and probes changes over time and variations across national contexts. Bringing together the most advanced research into elites by a European and multidisciplinary group of scholars, this book presents an agenda for the future study of elites. It will appeal to all those interested in the study of elites, inequality, class, power, and gender inequality.
A thrilling behind-the-scenes exploration of how governments past and present have been led astray by bad data - and why it is so hard to measure things and to do it well. Our politicians make vital decisions and declarations every day that rely on official data. But should all statistics be trusted? In BAD DATA, House of Commons Library statistician Georgina Sturge draws back the curtain on how governments of the past and present have been led astray by figures littered with inconsistency, guesswork and uncertainty. Discover how a Hungarian businessman's bright idea caused half a million people to go missing from UK migration statistics. Find out why it's possible for two politicians to disagree over whether poverty has gone up or down, using the same official numbers, and for both to be right at the same time. And hear about how policies like ID cards, super-casinos and stopping ex-convicts from reoffending failed to live up to their promise because they were based on shaky data. With stories that range from the troubling to the empowering to the downright absurd, BAD DATA reveals secrets from the usually closed-off world of policy-making. It also suggests how - once we understand the human story behind the numbers - we can make more informed choices about who to trust, and when.
* Starts from the basics, focusing less on proofs and the high-level math underlying regressions, and adopts an engaging tone to provide a text which is entirely accessible to students who don't have a stats background
* New chapter on integrity and ethics in regression analysis
* Each chapter offers boxed examples, stories, exercises and clear summaries, all of which are designed to support student learning
* Optional appendix of statistical tools, providing a primer to readers who need it
* Code in R and Stata, and data sets and exercises in Stata and CSV, to allow students to practice running their own regressions (a minimal Python analogue of such an exercise is sketched below)
* Author-created videos on YouTube
* PPT lecture slides and test bank for instructors
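As a loose illustration of the kind of exercise described in the bullet on code and data sets (the text's own materials are in R and Stata), here is a hypothetical Python sketch that builds a small data set and runs a regression; the variable names and data are made up.

```python
# Hypothetical Python analogue of a practice regression: simulate a small
# data set, fit OLS with a formula, and read off the coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({"educ": rng.integers(10, 21, size=200),
                   "exper": rng.integers(0, 31, size=200)})
df["wage"] = 1.5 + 0.8 * df["educ"] + 0.2 * df["exper"] + rng.normal(size=200)

model = smf.ols("wage ~ educ + exper", data=df).fit()
print(model.summary())          # coefficients, standard errors, R-squared
```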
Originally published in 1939, this book forms the second part of a two-volume series on the mathematics required for the examinations of the Institute of Actuaries, focusing on finite differences, probability and elementary statistics. Miscellaneous examples are included at the end of the text. This book will be of value to anyone with an interest in actuarial science and mathematics.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulties knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong directions. The book searches for the reasons behind the emergence of false feedback. It thereby contributes to a wider discussion in the field of metascience about the practices of researchers when pursuing their daily business. The book thus offers a case study of metascience for the field of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates with many applied examples and a wide array of references, especially to the philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research and also of the possible solutions. The main audience of the book is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.
Metrology is the study of measurement science. Although classical economists have emphasized the importance of measurement per se, the majority of economics-based writings on the topic have taken the form of government reports related to the activities of specific national metrology laboratories. This book is the first systematic study of measurement activity at a national metrology laboratory, and the laboratory studied is the U.S. National Institute of Standards and Technology (NIST) within the U.S. Department of Commerce. The primary objective of the book is to emphasize for academic and policy audiences the economic importance of measurement not only as an area of study but also as a tool for sustaining technological advancement as an element of economic growth. Toward this goal, the book offers an overview of the economic benefits and consequences of measurement standards; an argument for public sector support of measurement standards; a historical perspective of the measurement activities at NIST; an empirical analysis of one particular measurement activity at NIST, namely calibration testing; and a roadmap for future research on the economics of metrology.
This book covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students' knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical analysis through interactive examples. The book is suitable for undergraduate and graduate students taking their first statistics courses, as well as for undergraduate students in non-mathematical fields, e.g. economics and the social sciences.
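As a small worked example of the inferential material described above, the following sketch computes a t-based 95% confidence interval for a mean on simulated data; it is illustrative only and not taken from the book.

```python
# Hypothetical worked example of one inferential topic listed above: a 95%
# t-based confidence interval for a population mean, on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = rng.normal(loc=10.0, scale=2.0, size=25)

n = sample.size
mean = sample.mean()
se = sample.std(ddof=1) / np.sqrt(n)          # estimated standard error
t_crit = stats.t.ppf(0.975, df=n - 1)         # two-sided 95% critical value
print(f"95% CI: [{mean - t_crit * se:.2f}, {mean + t_crit * se:.2f}]")
```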
The second book in a set of ten on quantitative finance for practitioners:
* Presents the theory needed to better understand applications
* Supplements previous training in mathematics
* Built from the author's four decades of experience in industry, research, and teaching
This practical guide to EViews is aimed at practitioners and students in business, economics, econometrics, and finance. It uses a step-by-step approach to equip readers with a toolkit that enables them to make the most of this widely used econometric analysis software. Statistical and econometric concepts are explained visually with examples, problems, and solutions. Developed by economists, the EViews statistical software package is used most commonly for time-series oriented econometric analysis. It allows users to quickly develop statistical relations from data and then use those relations to forecast future values of the data. The package provides convenient ways to enter or upload data series, create new series from existing ones, display and print series, carry out statistical analyses of relationships among series, and manipulate results and output. This highly hands-on resource includes more than 200 illustrative graphs and tables and tutorials throughout. Abdulkader Aljandali is Senior Lecturer at Coventry University in London. He is currently leading the Stochastic Finance Module taught as part of the Global Financial Trading MSc. His previously published work includes Exchange Rate Volatility in Emerging Markets, Quantitative Analysis, and Multivariate Methods and Forecasting with IBM SPSS Statistics. Dr Aljandali is an established member of the British Accounting and Finance Association and the Higher Education Academy. Motasam Tatahi is a specialist in the areas of macroeconomics, financial economics, and financial econometrics at the European Business School, Regent's University London, where he serves as Principal Lecturer and Dissertation Coordinator for the MSc in Global Banking and Finance at The European Business School, London.
The book evaluates the importance of constitutional rules and property rights for the German economy in 1990-2015. It is an economic historical study embedded in institutional economics with main references to positive constitutional economics and the property rights theory. This interdisciplinary work adopts a theoretical-empirical dimension and a qualitative-quantitative approach. Formal institutions played a fundamental role in Germany's post-reunification economic changes. They set the legal and institutional framework for the transition process of Eastern Germany and the unification, integration and convergence between the two parts of the country. Although the latter process was not completed, the effects of these formal rules were positive, especially for the former GDR.
Any enquiry into the nature, performance, role, demerits, growth, efficiency, or other aspects of financial services such as banking and insurance activities requires rigorous estimates of their economic output, i.e., the economic contributions made by these firms, as well as by the industries as a whole. Accordingly, this book condenses several theoretical, methodological, empirical, and philosophical issues in conceptualizing, measuring, and empirically operationalizing the economic output of the banking and insurance industries. The analytical focus is on both global and emerging-markets perspectives. The book synthesizes applied and conceptual evidence to locate the chosen theme's analytical patterns, consensus, and disagreements. The selected subject matter is studied within the firm-level and aggregate settings, bringing literature of varied scopes together. Contributions from various international academics, practitioners, and policymakers further enrich the narrative. The book concludes with data-driven case studies that analyze the extent to which the critical performance parameters of the banking and insurance industries in the BRIICS economies - including estimation of aggregate industry-level partial factor productivities, total factor productivity, technical efficiency, and returns to scale - vary across alternative measures of their output. The present work also provides a brief note on the inputs measurement dimension, following which there is a discussion on the limitations, future scope, and conclusions. This work will be valuable for researchers and policymakers undertaking performance analyses related to banking and insurance activities. It provides them with an examination of a wide range of analytical options and related issues in the theory and praxis of output measurement, all organized into a single volume.
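One of the performance parameters mentioned above is total factor productivity. As a hypothetical sketch of how such a measure can be computed, the snippet below calculates TFP growth as a Solow residual, dln(TFP) = dln(Y) - a*dln(K) - (1 - a)*dln(L), on invented output and input figures with an assumed capital share.

```python
# Hypothetical sketch: total factor productivity growth as a Solow residual.
# Output and input figures and the capital share are invented for illustration.
import numpy as np

Y = np.array([100.0, 104.0, 109.0])   # output (e.g. value added) over three years
K = np.array([200.0, 204.0, 210.0])   # capital input
L = np.array([ 50.0,  50.5,  51.0])   # labour input
alpha = 0.35                          # assumed capital share

dlnY, dlnK, dlnL = (np.diff(np.log(v)) for v in (Y, K, L))
tfp_growth = dlnY - alpha * dlnK - (1 - alpha) * dlnL
print(tfp_growth)                     # year-on-year TFP growth rates
```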
Partial least squares structural equation modeling (PLS-SEM) has become a standard approach for analyzing complex inter-relationships between observed and latent variables. Researchers appreciate the many advantages of PLS-SEM such as the possibility to estimate very complex models and the method's flexibility in terms of data requirements and measurement specification. This practical open access guide provides a step-by-step treatment of the major choices in analyzing PLS path models using R, a free software environment for statistical computing, which runs on Windows, macOS, and UNIX computer platforms. Adopting the R software's SEMinR package, which brings a friendly syntax to creating and estimating structural equation models, each chapter offers a concise overview of relevant topics and metrics, followed by an in-depth description of a case study. Simple instructions give readers the "how-tos" of using SEMinR to obtain solutions and document their results. Rules of thumb in every chapter provide guidance on best practices in the application and interpretation of PLS-SEM.
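The following sketch is not SEMinR (which is R) and not a faithful PLS-SEM estimator; it is a deliberately simplified Python illustration of the underlying idea of a path model, with fixed equal indicator weights in place of PLS-SEM's iteratively estimated weights, on invented data.

```python
# Grossly simplified, hypothetical sketch of the idea behind a PLS path model:
# latent constructs are proxied by composite scores built from their indicators,
# and structural paths are then estimated by regressing composites on each other.
# Real PLS-SEM derives the indicator weights iteratively; here they are fixed
# and equal purely for illustration.
import numpy as np

rng = np.random.default_rng(4)
n = 300

# Three indicators per construct (invented data).
sat_ind = rng.normal(size=(n, 3))                          # "satisfaction" indicators
loy_ind = 0.6 * sat_ind.mean(axis=1, keepdims=True) + rng.normal(size=(n, 3))

def composite(indicators):
    """Equal-weight composite score, standardised."""
    score = indicators.mean(axis=1)
    return (score - score.mean()) / score.std()

satisfaction, loyalty = composite(sat_ind), composite(loy_ind)

# Structural path loyalty <- satisfaction, estimated by simple regression.
path = np.polyfit(satisfaction, loyalty, deg=1)[0]
print(f"estimated path coefficient: {path:.3f}")
```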
This book sheds new light on a recently introduced monetary tool - negative interest rate policy (NIRP). It provides in-depth insight into this policy, adopted by central banks in several economies, for example the Eurozone, Switzerland and Japan, and into its possible impact on systemic risk. Although NIRP was introduced as a temporary policy instrument, it may remain widely used for a longer period and by a greater range of central banks than initially expected; the book therefore explores its effects on and implications for the banking sector and financial markets, with a particular focus on potentially adverse consequences. There is a strong accent on the uniqueness of negative policy rates in the context of financial stability concerns. The authors assess whether NIRP has any - or in principle a stronger - impact on systemic risk than conventional monetary policy. The book is targeted at presenting and evaluating the initial experiences of NIRP during normal, i.e. pre-COVID, times, rather than in periods in which pre-established macroeconomic relations are rapidly disrupted or, specifically, when the source of the disruption is not purely economic in nature, unlike in a systemic crisis. The authors adopt both theoretical and practical approaches to explore the key issues and outline the policy implications for both monetary and macroprudential authorities with respect to negative interest rate policy; the book will thus provide a useful guide for policymakers, academics, advanced students and researchers of financial economics and international finance.
This is the perfect (and essential) supplement for all econometrics classes, from a rigorous first undergraduate course, to a first master's, to a PhD course.
This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, have determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.
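Chapter seven's logit analysis suits binary outcomes for which the general linear model is a poor fit. As a hypothetical illustration (not the author's own material), the sketch below compares a linear probability model with a logit fit on simulated binary data: the logit's fitted probabilities stay inside (0, 1) while the linear model's need not.

```python
# Hypothetical sketch of why logit suits binary data better than the general
# linear model: fitted logit probabilities always lie inside (0, 1).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.normal(size=500)
p = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * x)))      # true response probabilities
y = rng.binomial(1, p)                           # binary outcome

X = sm.add_constant(x)
linear = sm.OLS(y, X).fit()                      # linear probability model
logit = sm.Logit(y, X).fit(disp=0)               # logit model

print(linear.predict(X).min(), linear.predict(X).max())  # can leave [0, 1]
print(logit.predict(X).min(), logit.predict(X).max())    # always inside (0, 1)
```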
Arthur Vogt has devoted a great deal of his scientific efforts to both the person and the work of Irving Fisher. This book, written with János Barta, gives an excellent impression of Fisher's great contributions to the theory of the price index on the one hand. On the other hand, it continues Fisher's work on this subject along the lines which several authors drew with respect to price index theory since Fisher's death fifty years ago.
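For readers unfamiliar with the price-index theory discussed here, the Fisher "ideal" index is the geometric mean of the Laspeyres and Paasche indices. The snippet below computes all three on made-up prices and quantities; the numbers are purely illustrative.

```python
# Standard price-index formulas referred to in the blurb, on made-up data:
# the Fisher "ideal" index is the geometric mean of Laspeyres and Paasche.
import numpy as np

p0 = np.array([10.0, 4.0, 2.5])    # base-period prices (invented)
p1 = np.array([11.0, 4.4, 2.4])    # current-period prices
q0 = np.array([ 5.0, 20.0, 8.0])   # base-period quantities
q1 = np.array([ 4.0, 22.0, 9.0])   # current-period quantities

laspeyres = (p1 @ q0) / (p0 @ q0)  # base-period quantity weights
paasche   = (p1 @ q1) / (p0 @ q1)  # current-period quantity weights
fisher    = np.sqrt(laspeyres * paasche)
print(laspeyres, paasche, fisher)
```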
Operations Research methods are used in almost every field of modern life, such as industry, the economy and medicine. The authors have compiled the latest advancements in these methods in this volume, which comprises what is considered one of the best collections of these new approaches. These can serve as a direct shortcut to what you may be searching for. This book provides useful applications of the new developments in OR, written by leading scientists from several international universities. Another volume on exciting applications of Operations Research is planned for the near future. We hope you enjoy and benefit from this series!
This important new book presents the theoretical, econometric and applied foundations of the economics of innovation as well as offering a new approach to the measurement of technical change. The author, a leading expert in innovation economics and management, critically reviews current schools of thought and presents his own contribution to measurement techniques. Measurements of technical change have focused on the characteristics of price and quantity whilst useful theories and reliable indicators of the quality of innovation in new products have been sorely lacking. The author examines the theoretical foundations of the measurement of technical change and extends the analysis to consider the econometric and empirical perspective in the process of innovation. He outlines the key contributions to innovation research by reviewing the English-language literature and providing a very useful guide to the most important contributions in other languages. In the measurement of the quality of innovation, the techniques used in the author's contribution to new 'technometrics' are presented and explained in detail and are applied to the most important topical problems in innovation and management. This significant addition to the literature will be invaluable to graduates, scholars and managers working in the area of technical change, technology and innovation management.
The emergence of new firm-level data, including the European Community Innovation Survey (CIS), has led to a surge of studies on innovation and firm behaviour. This book documents progress in four interrelated fields: investigation of the use of new indicators of innovation output; investigation of determinants of innovative behavior; the role of spillovers, the public knowledge infrastructure and research and development collaboration; and the impact of innovation on firm performance. Written by an international group of contributors, the studies are based on agriculture and the manufacturing and service industries in Europe and Canada and provide new insights into the driving forces behind innovation.
This is the first book to investigate individuals' pessimistic and optimistic prospects for the future and their economic consequences based on sound mathematical foundations. The book focuses on fundamental uncertainty called Knightian uncertainty, where the probability distribution governing uncertainty is unknown, and it provides the reader with methods to formulate how pessimism and optimism act in an economy in a strict and unified way. After presenting decision-theoretic foundations for prudent behaviors under Knightian uncertainty, the book applies these ideas to economic models that include portfolio inertia, indeterminacy of equilibria in the Arrow-Debreu economy and in a stochastic overlapping-generations economy, learning, dynamic asset-pricing models, search, real options, and liquidity preferences. The book then proceeds to characterizations of pessimistic (ε-contaminated) and optimistic (ε-exuberant) behaviors under Knightian uncertainty and people's inherent pessimism (surprise aversion) and optimism (surprise loving). Those characterizations are shown to be useful in understanding several observed behaviors in the global financial crisis and in its aftermath. The book is highly recommended not only to researchers who wish to understand the mechanism of how pessimism and optimism affect economic phenomena, but also to policy makers contemplating effective economic policies whose success delicately hinges upon people's mindsets in the market. Kiyohiko Nishimura is Professor at the National Graduate Institute for Policy Studies (GRIPS) and Professor Emeritus and Distinguished Project Research Fellow of the Center for Advanced Research in Finance at The University of Tokyo. Hiroyuki Ozaki is Professor of Economics at Keio University.
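The "ε-contaminated" pessimism mentioned above has a standard closed form under maxmin expected utility: with beliefs of the form (1 - ε)P + εQ, where Q ranges over all distributions, the worst-case expected utility is (1 - ε)E_P[u] + ε min u. The sketch below checks this numerically on invented payoffs; it illustrates the general concept and is not code from the book.

```python
# Hypothetical numerical sketch of epsilon-contamination pessimism: the
# worst-case (maxmin) expected utility equals (1 - eps)*E_P[u] + eps*min(u).
# Payoffs, the baseline prior P and eps are all invented for illustration.
import numpy as np

u = np.array([1.0, 2.0, 5.0])        # utility in each of three states
P = np.array([0.2, 0.5, 0.3])        # baseline prior over the states
eps = 0.1                            # degree of confidence lost in P

worst_case_eu = (1 - eps) * (P @ u) + eps * u.min()

# Check against a brute-force minimum over priors (1 - eps)*P + eps*Q:
rng = np.random.default_rng(6)
Q = rng.dirichlet(np.ones(3), size=10000)          # random candidate priors Q
candidates = ((1 - eps) * P + eps * Q) @ u
print(worst_case_eu, candidates.min())             # brute force approaches the formula
```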