A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policymakers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyse patterns of growth and related long-term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented.
This two-volume set is a collection of 30 classic papers presenting ideas which have now become standard in the field of Bayesian inference. Topics covered include the central field of statistical inference as well as applications to areas of probability theory, information theory, utility theory and computational theory. It is organized into seven sections: foundations; information theory and prior distributions; robustness and outliers; hierarchical, multivariate and non-parametric models; asymptotics; computations and Monte Carlo methods; and Bayesian econometrics.
Economic Time Series: Modeling and Seasonality is a focused resource on analysis of economic time series as pertains to modeling and seasonality, presenting cutting-edge research that would otherwise be scattered throughout diverse peer-reviewed journals. This compilation of 21 chapters showcases the cross-fertilization between the fields of time series modeling and seasonal adjustment, as is reflected both in the contents of the chapters and in their authorship, with contributors coming from academia and government statistical agencies. For easier perusal and absorption, the contents have been grouped into seven topical sections:
Section I deals with periodic modeling of time series, introducing, applying, and comparing various seasonally periodic models.
Section II examines the estimation of time series components when models for series are misspecified in some sense, and the broader implications this has for seasonal adjustment and business cycle estimation.
Section III examines the quantification of error in X-11 seasonal adjustments, with comparisons to error in model-based seasonal adjustments.
Section IV discusses some practical problems that arise in seasonal adjustment: developing asymmetric trend-cycle filters, dealing with both temporal and contemporaneous benchmark constraints, detecting trading-day effects in monthly and quarterly time series, and using diagnostics in conjunction with model-based seasonal adjustment.
Section V explores outlier detection and the modeling of time series containing extreme values, developing new procedures and extending previous work.
Section VI examines some alternative models and inference procedures for analysis of seasonal economic time series.
Section VII deals with aspects of modeling, estimation, and forecasting for nonseasonal economic time series.
By presenting new methodological developments as well as pertinent empirical analyses and reviews of established methods, the book provides much that is stimulating and practically useful for the serious researcher and analyst of economic time series.
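The seasonal-adjustment theme running through the book can be made concrete with a deliberately crude sketch: estimating a fixed seasonal component by per-season averaging. This is far simpler than the X-11 and model-based procedures the book treats; the function name and interface below are illustrative assumptions, not code from the book.

```python
import numpy as np

def seasonal_means_adjust(x, period=12):
    """Remove a fixed seasonal pattern by subtracting per-season averages.

    A toy stand-in for X-11 style and model-based adjustment: the seasonal
    component is the mean of each calendar position, centred to sum to zero
    so that the adjustment preserves the overall level of the series.
    """
    x = np.asarray(x, dtype=float)
    seasons = np.arange(len(x)) % period
    means = np.array([x[seasons == s].mean() for s in range(period)])
    means -= means.mean()  # centre the seasonal factors
    return x - means[seasons], means
```

On a series with a stable quarterly pattern, the recovered factors match the pattern and the adjusted series is flat; real series with trend and noise require the more careful filters the book discusses.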
This book presents models and statistical methods for the analysis of recurrent event data. The authors provide broad, detailed coverage of the major approaches to analysis, while emphasizing the modeling assumptions that they are based on. More general intensity-based models are also considered, as well as simpler models that focus on rate or mean functions. Parametric, nonparametric and semiparametric methodologies are all covered, with procedures for estimation, testing and model checking.
Thoroughly classroom tested, this introductory text covers all the topics that constitute a foundation for basic econometrics, with concise and intuitive explanations of technical material. Important proofs are shown in detail; however, the focus is on developing regression models and understanding the residuals.
This book covers the basics of processing and spectral analysis of monovariate discrete-time signals. The approach is practical, the aim being to acquaint the reader with the indications for and drawbacks of the various methods and to highlight possible misuses. The book is rich in original ideas, visualized in new and illuminating ways, and is structured so that parts can be skipped without loss of continuity. Many examples are included, based on synthetic data and real measurements from the fields of physics, biology, medicine, macroeconomics etc., and a complete set of MATLAB exercises requiring no previous experience of programming is provided. Prior advanced mathematical skills are not needed in order to understand the contents: a good command of basic mathematical analysis is sufficient. Where more advanced mathematical tools are necessary, they are included in an Appendix and presented in an easy-to-follow way. With this book, digital signal processing leaves the domain of engineering to address the needs of scientists and scholars in traditionally less quantitative disciplines, now facing increasing amounts of data.
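The book's exercises are in MATLAB; for a reader working in Python, the most basic spectral estimate it builds on, the raw periodogram, can be sketched in a few lines. The helper below is an illustrative analogue, not code from the book, and omits the tapering and smoothing refinements a serious analysis would add.

```python
import numpy as np

def periodogram(x):
    """Raw periodogram of a real, monovariate discrete-time signal.

    Returns frequencies (in cycles per sample) and power. The mean is
    removed first so the zero-frequency bin does not dominate the plot.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n
    freqs = np.fft.rfftfreq(n)
    return freqs, spec
```

For a pure sinusoid whose frequency falls exactly on a Fourier bin, the spectrum peaks at that frequency; off-bin frequencies leak into neighbouring bins, one of the misuses the book warns about.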
World Inequality Report 2018 is the most authoritative and up-to-date account of global trends in inequality. Researched, compiled, and written by a team of the world’s leading economists of inequality, it presents—with unrivaled clarity and depth—information and analysis that will be vital to policy makers and scholars everywhere. Inequality has taken center stage in public debate as the wealthiest people in most parts of the world have seen their share of the economy soar relative to that of others, many of whom, especially in the West, have experienced stagnation. The resulting political and social pressures have posed harsh new challenges for governments and created a pressing demand for reliable data. The World Inequality Lab at the Paris School of Economics and the University of California, Berkeley, has answered this call by coordinating research into the latest trends in the accumulation and distribution of income and wealth on every continent. This inaugural report analyzes the Lab’s findings, which include data from major countries where information has traditionally been difficult to acquire, such as China, India, and Brazil. Among nations, inequality has been decreasing as traditionally poor countries’ economies have caught up with the West. The report shows, however, that inequality has been steadily deepening within almost every nation, though national trajectories vary, suggesting the importance of institutional and policy frameworks in shaping inequality. World Inequality Report 2018 will be a key document for anyone concerned about one of the most imperative and contentious subjects in contemporary politics and economics.
Against the backdrop of the impressive progress made by the Indian economy during the two decades following the large-scale economic reforms of the early 1990s, this book evaluates the performance of the economy on some income and non-income dimensions of development at the national, state and sectoral levels. It examines regional economic growth and inequality in income originating from agriculture, industry and services. In view of the importance of the agricultural sector, despite its declining share in gross domestic product, it evaluates the performance of agricultural production and the impact of agricultural reforms on the spatial integration of food grain markets. It studies rural poverty, analyzing the trend in employment, the trickle-down process and the inclusiveness of growth in rural India. It also evaluates the impact of microfinance, as an instrument of financial inclusion, on the socio-economic conditions of rural households. Lastly, it examines the relative performance of fifteen major states of India in terms of education, health and human development. An important feature of the book is that it approaches these issues by rigorously applying advanced econometric methods, focusing primarily on regional disparities during the post-reform period vis-a-vis the pre-reform period. It offers important results to guide policies for future development.
Delivering cutting-edge coverage that includes the latest thinking and practices from the field, QUALITY AND PERFORMANCE EXCELLENCE, 8e presents the basic principles and tools associated with quality and performance excellence. Packed with relevant, real-world examples, the text thoroughly illustrates how these principles and methods have been put into effect in a variety of organizations. It also highlights the relationship between basic principles and the popular theories and models studied in management courses. The eighth edition reflects the 2015-16 Baldrige criteria and includes new boxed features, experiential exercises, and up-to-date case studies that give you practical experience working with real-world issues. Many cases focus on large and small companies in manufacturing and service industries in North and South America, Europe, and Asia-Pacific. In addition, chapters now open with a "Performance Excellence Profile" highlighting a recent Baldrige recipient.
Statistics is used in two senses, singular and plural. In the singular, it concerns the whole subject of statistics as a branch of knowledge. In the plural sense, it refers to numerical facts: data gathered systematically with some definite object in view. Thus, statistics is the science which deals with the collection, analysis and interpretation of data. An understanding of the logic and theory of statistics is essential for students of agriculture, who are expected to know the techniques of analyzing numerical data and drawing useful conclusions. It is the intention of the author to keep this practical manual at a readability level appropriate for students who do not have a mathematical background. This book has been prepared for students and teachers alike, to acquaint them with the basic concepts of statistical principles and procedures of calculation, as per the syllabi of the 5th Dean's Committee of ICAR for undergraduate courses in agriculture and allied sciences.
The present book has been prepared to meet the requirements of students of Animal and Veterinary Science, Animal Biotechnology and other related fields. It will serve as a textbook not only for students of veterinary science but also for those who want to know what statistics is all about, or who need to be familiar with at least the language and fundamental concepts of statistics. The book builds the necessary background for those who will take more advanced courses in statistics, including specialized applications. The salient features are: the book has been designed in accordance with the new VCI syllabus, 2016 (MSVE-2016); it will be very useful for students of SAUs/ICAR institutes and those preparing for JRF/SRF and various competitive examinations; and each chapter contains complete, self-explanatory theory and a fair number of solved examples. Solved examples for each topic are presented in an elegant and interesting way to make them easy to understand. The subject matter has been explained simply, so that students can easily understand it and feel encouraged to attempt the unsolved problems themselves.
Building on the strength of the first edition, Quantitative Methods for Business and Economics provides a simple introduction to the mathematical and statistical techniques needed in business. This book is accessible and easy to use, with the emphasis clearly on how to apply quantitative techniques to business situations. It includes numerous real world applications and many opportunities for student interaction. It is clearly focused on business, management and economics students taking a single module in Quantitative Methods.
"Econometric Theory" presents a modern approach to the theory of econometric estimation and inference, with particular applications to time series. An ideal reference for practitioners and researchers, the book is also suited for advanced two-semester econometrics courses and one-semester regression courses. Based on lectures originally given to graduates at the London School of Economics, the book applies recent developments in asymptotic theory to derive the properties of estimators when the model is only partially specified. Topics covered in depth include the linear regression model, dynamic modeling, simultaneous equations, optimization estimators, hypothesis testing, and the theory of nonstationary time series and cointegration.
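The linear regression model at the centre of the book's coverage can be summarized in a short, textbook-style sketch: OLS coefficients from the normal equations, with classical (homoskedastic) standard errors. This is a generic illustration in Python, not code or notation taken from the book.

```python
import numpy as np

def ols(y, X):
    """Ordinary least squares: coefficients and classical standard errors.

    X should include a column of ones if an intercept is wanted. The
    error variance is estimated with the usual degrees-of-freedom
    correction; robust alternatives are deliberately omitted here.
    """
    y, X = np.asarray(y, float), np.asarray(X, float)
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)  # unbiased error-variance estimate
    se = np.sqrt(sigma2 * np.diag(XtX_inv))
    return beta, se
```

On noiseless data the coefficients are recovered exactly and the standard errors are (numerically) zero; the book's asymptotic theory describes what happens as these assumptions are relaxed.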
This book analyzes four distinct, although not dissimilar, areas of social choice theory and welfare economics: non-strategic choice, Harsanyi's aggregation theorems, distributional ethics and strategic choice. For the aggregation of individual rankings of social states, whether the persons behave strategically or non-strategically, decision making takes place under complete certainty; in the Harsanyi framework, by contrast, uncertainty plays a significant role in the decision-making process. Another distinctive feature of the book is its discussion of ethical approaches to the evaluation of inequality arising from unequal distributions of achievements in the different dimensions of human well-being. Given its wide coverage, combined with newly added material, end-of-chapter problems and bibliographical notes, the book will be helpful for students and researchers interested in this frontline area of research. Its lucid exposition, non-technical and graphical illustration of the concepts, and use of numerical examples make it a useful text.
Law and economics research has had an enormous impact on the laws of contracts, torts, property, crimes, corporations, and antitrust, as well as public regulation and fundamental rights. The Law and Economics of Patent Damages, Antitrust, and Legal Process examines several areas of important research by a variety of international scholars. It contains technical papers on the appropriate way to estimate damages in patent disputes, as well as methods for evaluating relevant markets and vertically integrated firms when determining the competitive effects of mergers and other actions. There are also papers on the implication of different legal processes, regulations, and liability rules on consumer welfare, which range from the impact of delays in legal decisions in labour cases in France to issues of criminal liability related to the use of artificial intelligence. This volume of Research in Law and Economics is a must-read for researchers and professionals of patent damages, antitrust, labour, and legal process.
Across the social sciences there has been increasing focus on reproducibility, i.e., the ability to examine a study's data and methods to ensure accuracy by reproducing the study. Reproducible Econometrics Using R combines an overview of key issues and methods with an introduction to how to use them using open source software (R) and recently developed tools (R Markdown and bookdown) that allow the reader to engage in reproducible econometric research. Jeffrey S. Racine provides a step-by-step approach and covers five sets of topics: i) linear time series models, ii) robust inference, iii) robust estimation, iv) model uncertainty, and v) advanced topics. The time series material highlights the difference between time-series analysis, which focuses on forecasting, and cross-sectional analysis, where the focus is typically on model parameters that have economic interpretations. For the time series material, the reader begins with a discussion of random walks, white noise, and non-stationarity. The reader is next exposed to the pitfalls of using standard inferential procedures that are popular in cross-sectional settings when modeling time series data, and is introduced to alternative procedures that form the basis for linear time series analysis. For the robust inference material, the reader is introduced to the potential advantages of the bootstrap and the jackknife versus the use of asymptotic theory, and a range of numerical approaches are presented. For the robust estimation material, the reader is presented with a discussion of issues surrounding outliers in data and methods for addressing their presence. Finally, the model uncertainty material outlines two dominant approaches for dealing with model uncertainty, namely model selection and model averaging. Throughout the book there is an emphasis on the benefits of using R and other open source tools for ensuring reproducibility.
The advanced material covers machine learning methods (support vector machines that are useful for classification) and nonparametric kernel regression which provides the reader with more advanced methods for confronting model uncertainty. The book is well suited for advanced undergraduate and graduate students alike. Assignments, exams, slides, and a solution manual are available for instructors.
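The bootstrap-versus-asymptotics comparison described above can be sketched in a few lines. The book itself works in R; this Python analogue, with an illustrative `bootstrap_se` helper invented for the example, just shows the resampling idea.

```python
import numpy as np

def bootstrap_se(x, stat=np.mean, reps=2000, seed=0):
    """Nonparametric bootstrap standard error of a statistic.

    Resamples the data with replacement `reps` times, recomputes the
    statistic on each resample, and reports the standard deviation of
    those replicates as the standard-error estimate.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    draws = [stat(rng.choice(x, size=len(x), replace=True))
             for _ in range(reps)]
    return np.std(draws, ddof=1)
```

For the sample mean, the bootstrap estimate should land close to the asymptotic formula s / sqrt(n); the payoff comes for statistics whose sampling distribution has no such simple closed form.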
The chapters in this book describe various aspects of the application of statistical methods in finance. It will interest and attract statisticians to this area, illustrate some of the many ways that statistical tools are used in financial applications, and give some indication of problems which are still outstanding. Statisticians will be stimulated to learn more about the kinds of models and techniques outlined in the book - both the domain of finance and the science of statistics will benefit from increased awareness by statisticians of the problems, models, and techniques applied in financial applications. For this reason, extensive references are given. The level of technical detail varies between the chapters: some present broad non-technical overviews of an area, while others describe the mathematical niceties. This illustrates the range of possibilities the area offers statisticians, while giving a flavour of the different kinds of mathematical and statistical skills required. Whether you favour data analysis or mathematical manipulation, if you are a statistician there are problems in finance appropriate to your skills.
Volume 39A of Research in the History of Economic Thought and Methodology features a selection of essays presented at the 2019 Conference of the Latin American Society for the History of Economic Thought (ALAHPE), edited by Felipe Almeida and Carlos Eduardo Suprinyak, as well as a new general-research essay by Daniel Kuehn, an archival discovery by Katia Caldari and Luca Fiorito, and a book review by John Hall.
Sociological theories of crime include: theories of strain, which blame crime on personal stressors; theories of social learning, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and theories of control, which look at crime as natural and rewarding, and explore the formation of institutions that control crime. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades resources with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption do not involve an explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, a rational-choice-based approach. Simulation-based approaches take into account greater complexities of interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback; and agent-based simulation, which can go a step farther by computing new social structures not previously identified in theory. The latter include cognitive agent models, in which agents learn how to perceive their environment and act upon the perceptions of their individual experiences; and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects within existing societal structures. For example, NNL is a cognitive agent model based on the REPAST Simphony toolkit.
In recent years econometricians have examined the problems of diagnostic testing, specification testing, semiparametric estimation and model selection. In addition researchers have considered whether to use model testing and model selection procedures to decide the models that best fit a particular dataset. This book explores both issues with application to various regression models, including the arbitrage pricing theory models. It is ideal as a reference for statistical sciences postgraduate students, academic researchers and policy makers in understanding the current status of model building and testing techniques.
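The model-selection side of the testing-versus-selection question discussed above can be illustrated with a minimal AIC comparison between candidate regression designs. This is a generic Python sketch under Gaussian-likelihood assumptions; the `aic_select` helper and its interface are invented for illustration, not taken from the book.

```python
import numpy as np

def aic_select(y, candidates):
    """Choose among least-squares models by AIC (Gaussian likelihood).

    `candidates` maps model names to design matrices for the same
    response y. Returns the name and AIC of the lowest-AIC model;
    the 2k term penalizes extra parameters.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    best_name, best_aic = None, np.inf
    for name, X in candidates.items():
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        k = X.shape[1]
        aic = n * np.log(rss / n) + 2 * k  # Gaussian AIC up to a constant
        if aic < best_aic:
            best_name, best_aic = name, aic
    return best_name, best_aic
```

When the data are generated by a strongly curved quadratic, the quadratic design wins decisively over a linear one; distinguishing between nested models that differ only slightly is exactly where the selection-versus-testing debate becomes delicate.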
How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems in the analysis of social interactions is used to develop multi-level theories that examine the interplay between individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually addressed by game theory, especially focusing on the formation of groups and the role of social norms in maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the ethical principles embraced by a society.
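The strategic formulation mentioned above rests on games like the one-shot prisoner's dilemma, where defection is dominant unless a norm imposes a cost on defectors. The payoffs and the sanction mechanism below are toy illustrations of that idea in Python, not a model from the volume.

```python
# Illustrative payoffs for a one-shot prisoner's dilemma: C = cooperate,
# D = defect; each entry is (row player's payoff, column player's payoff).
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_response(opponent, sanction=0.0):
    """Row player's best response when defection incurs a norm-enforcement cost.

    With sanction=0 this is the standard game, where defection dominates;
    a large enough sanction makes cooperation the best response.
    """
    def payoff(action):
        p = PAYOFF[(action, opponent)][0]
        return p - sanction if action == "D" else p
    return max(["C", "D"], key=payoff)
```

With no sanction the best response to cooperation is still defection (5 > 3); a sanction of 3 flips it (5 - 3 = 2 < 3), which is the bare-bones version of norms maintaining group integrity.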
Economists are regularly confronted with results of quantitative economics research. Econometrics: Theory and Applications with EViews provides a broad introduction to quantitative economic methods, for example how models arise, their underlying assumptions and how estimates of parameters or other economic quantities are computed. The author combines econometric theory with practice by demonstrating its use with the software package EViews through extensive use of screen shots. The emphasis is on understanding how to select the right method of analysis for a given situation, and how to actually apply the theoretical methodology correctly. The EViews software package is available from 'Quantitative Micro Software'. Written for any undergraduate or postgraduate course in Econometrics.
In this book, Nancy and Richard Ruggles demonstrate their unique grasp of the measurement and analysis of macro and micro data and elucidate ways of integrating the two data sets. Their analysis of macrodata is used to examine the economic growth of the United States from the 1920s to the present day. They focus particularly on recession and recovery between 1929 and 1974 and the measurement of short-run economic growth. They also examine the measurement of saving, investment and capital formation in the United States. On a microeconomic level, they analyse economic intelligence in World War II, offer a study of fertility in the United States in the pre-war era and analyse longitudinal establishment data. Finally, they integrate the two approaches to provide a more complete picture of social and economic performance.
Volume 40 in the Advances in Econometrics series features twenty-three chapters that are split thematically into two parts. Part A presents novel contributions to the analysis of time series and panel data with applications in macroeconomics, finance, cognitive science and psychology, neuroscience, and labor economics. Part B examines innovations in stochastic frontier analysis, nonparametric and semiparametric modeling and estimation, A/B experiments, big-data analysis, and quantile regression. Individual chapters, written by both distinguished researchers and promising young scholars, cover many important topics in statistical and econometric theory and practice. Papers primarily, though not exclusively, adopt Bayesian methods for estimation and inference, although researchers of all persuasions should find considerable interest in the chapters contained in this work. The volume was prepared to honor the career and research contributions of Professor Dale J. Poirier. For researchers in econometrics, this volume includes the most up-to-date research across a wide range of topics.