Sociological theories of crime include: strain theories, which blame crime on personal stressors; social learning theories, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and control theories, which view crime as natural and rewarding and explore the formation of institutions that control it. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades them with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption involve no explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, another rational-choice approach. Simulation-based approaches capture more of the complexity of the interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback, and agent-based simulation, which can go a step further by computing new social structures not previously identified in theory. The latter includes cognitive agent models, in which agents learn how to perceive their environment and act on the perceptions of their individual experiences, and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects within existing societal structures. For example, NNL is a cognitive agent model based on the REPAST Simphony toolkit.
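To give a flavor of the reactive-agent style described above, here is a minimal sketch in Python, not the NNL model and not built on REPAST Simphony: agents offend whenever their accumulated strain exceeds the deterrence exerted by nearby guardians. Every name and parameter here is invented for illustration.

```python
import random

# Minimal reactive-agent sketch: agents offend when individual strain
# outweighs local deterrence. Hypothetical model; parameters invented.
random.seed(1)
N_AGENTS, N_GUARDIANS, STEPS, SIZE = 100, 10, 50, 20

agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE),
           "strain": random.random()} for _ in range(N_AGENTS)]
guardians = [(random.randrange(SIZE), random.randrange(SIZE))
             for _ in range(N_GUARDIANS)]

def deterrence(x, y):
    # Deterrence falls off with distance to the nearest guardian.
    d = min(abs(x - gx) + abs(y - gy) for gx, gy in guardians)
    return max(0.0, 1.0 - 0.1 * d)

offenses = 0
for _ in range(STEPS):
    for a in agents:
        # Reactive rule: no learning, just a threshold comparison.
        if a["strain"] > deterrence(a["x"], a["y"]) + 0.2:
            offenses += 1
            a["strain"] *= 0.5                # offending relieves strain
        else:
            a["strain"] = min(1.0, a["strain"] + 0.05)
        a["x"] = (a["x"] + random.choice([-1, 0, 1])) % SIZE
        a["y"] = (a["y"] + random.choice([-1, 0, 1])) % SIZE

print(f"offenses per agent per step: {offenses / (N_AGENTS * STEPS):.3f}")
```

Rerunning with more guardians lowers the offense rate, which is exactly the kind of policy comparison reactive-agent simulation supports.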
This book analyzes four distinct but related areas of social choice theory and welfare economics: non-strategic choice, Harsanyi's aggregation theorems, distributional ethics and strategic choice. In the aggregation of individual rankings of social states, whether persons behave strategically or non-strategically, decision making takes place under complete certainty; in the Harsanyi framework, by contrast, uncertainty plays a significant role in the decision-making process. Another distinctive feature of the book is its discussion of ethical approaches to the evaluation of inequality arising from unequal distributions of achievements in the different dimensions of human well-being. Given its wide coverage, combined with newly added material, end-of-chapter problems and bibliographical notes, the book will be helpful to students and researchers interested in this frontline area of research. Its lucid exposition, non-technical and graphical illustration of the concepts, and use of numerical examples make the book a useful text.
How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems in the analysis of social interactions is used to develop multi-level theories that examine the interplay between individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually addressed by game theory, focusing especially on the formation of groups and the role of social norms in maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the ethical principles embraced by a society.
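To make the game-theoretic starting point concrete, here is a minimal sketch, my own illustration rather than anything from the volume, of the one-shot prisoner's dilemma, the canonical model of the tension between individual incentives and group cooperation; the payoff numbers are the usual textbook choice.

```python
# One-shot prisoner's dilemma: mutual cooperation beats mutual
# defection, yet defection is each player's dominant strategy.
PAYOFFS = {  # (row_action, col_action) -> (row_payoff, col_payoff)
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_response(col_action):
    # The row player's best reply to a fixed column action.
    return max("CD", key=lambda a: PAYOFFS[(a, col_action)][0])

# Defection is the best response to both actions, even though (C, C)
# pays both players more than (D, D): the core cooperation problem.
print(best_response("C"), best_response("D"))  # -> D D
```

Multi-level analyses of the kind the volume pursues ask how groups and social norms change this payoff structure so that cooperation can survive.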
Financial models are an inescapable feature of modern financial markets. Yet over-reliance on these models, and the failure to test them properly, is now widely recognized as one of the main causes of the financial crisis of 2007-2011. Since the crisis, the scrutiny and testing applied to such models have increased, and validation has become an essential part of model risk management at financial institutions. The book covers all of the major risk areas that a financial institution is exposed to and uses models for, including market risk, interest rate risk, retail credit risk, wholesale credit risk, compliance risk, and investment management. It discusses current practices and pitfalls that model risk users need to be aware of, and identifies areas where validation can be advanced in the future. The result is the first unified framework for validating risk management models.
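As a small taste of the kind of out-of-sample testing validation involves, the sketch below, my illustration rather than the book's code, counts exceptions in a value-at-risk backtest: a 99% VaR model should be breached on roughly 1% of days, and a breach rate far from that is evidence against the model.

```python
import random

# Toy VaR backtest: compare the observed breach rate of a 99% VaR
# forecast with the 1% rate a well-calibrated model should produce.
# Simulated P&L stands in for real data; entirely illustrative.
random.seed(7)
DAYS, P = 1000, 0.01

pnl = [random.gauss(0.0, 1.0) for _ in range(DAYS)]  # daily P&L
var_99 = -2.326                                      # N(0,1) 1% quantile

breaches = sum(1 for x in pnl if x < var_99)
print(f"breaches: {breaches}, expected: {P * DAYS:.0f}")
# A breach count far from expected (formally assessed with, e.g.,
# Kupiec's proportion-of-failures test) flags a mis-calibrated model.
```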
Economists are regularly confronted with the results of quantitative economic research. Econometrics: Theory and Applications with EViews provides a broad introduction to quantitative economic methods: how models arise, what their underlying assumptions are, and how estimates of parameters or other economic quantities are computed. The author combines econometric theory with practice by demonstrating its use with the software package EViews, through extensive use of screenshots. The emphasis is on understanding how to select the right method of analysis for a given situation, and how to apply the theoretical methodology correctly. The EViews software package is available from Quantitative Micro Software. Written for any undergraduate or postgraduate course in econometrics.
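EViews itself is menu-driven, but the core computation such a course builds on, estimating parameters by ordinary least squares, can be sketched in a few lines. Here is an illustration in Python on simulated data; statsmodels merely stands in for EViews, and nothing below comes from the book.

```python
import numpy as np
import statsmodels.api as sm

# Ordinary least squares on simulated data: the same estimation an
# econometrics package performs behind the scenes.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)  # true intercept 1, slope 2

X = sm.add_constant(x)                  # add the intercept column
result = sm.OLS(y, X).fit()
print(result.params)                    # estimates near [1.0, 2.0]
```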
A ground-breaking book that reveals why our human biases affect the way we receive and interpret information.
This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth and performance. These concepts are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and to the methodological approaches adopted by the analyst. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement - deterministic benchmarking and stochastic benchmarking - in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
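As a minimal illustration of deterministic benchmarking, my sketch rather than the Handbook's, each unit's output/input ratio can be scored against the best ratio observed in the sample, so efficiency is measured relative to an empirical frontier. The data are invented.

```python
# Deterministic benchmarking in its simplest form: score each unit's
# output/input ratio against the best observed ratio. Data invented.
units = {"A": (10.0, 4.0), "B": (8.0, 2.0), "C": (6.0, 3.0)}  # (output, input)

ratios = {name: out / inp for name, (out, inp) in units.items()}
frontier = max(ratios.values())       # best practice in the sample

for name, r in sorted(ratios.items()):
    # Efficiency of 1.00 means the unit sits on the empirical frontier.
    print(f"{name}: efficiency = {r / frontier:.2f}")
```

Stochastic benchmarking, the second technique named above, instead fits a frontier with a random noise term so that measurement error is not mistaken for inefficiency.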
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insight into what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution is its coverage of advances in panel data analysis, a statistical approach widely used to analyze two- and higher-dimensional panel data. The topics discussed in earlier editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. This book, to be featured in Cambridge's Econometric Society Monographs series, has been the leader in the field since the first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
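A workhorse technique in this literature is the fixed-effects (within) estimator, which removes time-invariant entity heterogeneity by demeaning each entity's data. Here is a numpy/pandas sketch on simulated data, my illustration of the standard method rather than code from the book.

```python
import numpy as np
import pandas as pd

# Fixed-effects (within) estimator on a simulated panel: demeaning by
# entity removes each entity's time-invariant effect.
rng = np.random.default_rng(0)
n_entities, n_periods = 50, 10

df = pd.DataFrame({
    "entity": np.repeat(np.arange(n_entities), n_periods),
    "x": rng.normal(size=n_entities * n_periods),
})
alpha = rng.normal(size=n_entities)          # unobserved entity effects
df["y"] = (2.0 * df["x"] + alpha[df["entity"].to_numpy()]
           + rng.normal(size=len(df)))

# Within transformation: subtract entity means from y and x.
dm = df[["x", "y"]] - df.groupby("entity")[["x", "y"]].transform("mean")
beta = (dm["x"] @ dm["y"]) / (dm["x"] @ dm["x"])
print(f"within estimate of beta (true value 2.0): {beta:.3f}")
```

The within estimator is numerically identical to least squares with a dummy for every entity; demeaning is simply the cheaper route to the same coefficient.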
Score your highest in econometrics? Easy. Econometrics can prove challenging for many students unfamiliar with the terms and concepts discussed in a typical econometrics course. Econometrics For Dummies eliminates that confusion with easy-to-understand explanations of important topics in the study of econometrics. It breaks down this complex subject and provides you with an easy-to-follow course supplement to further refine your understanding of how econometrics works and how it can be applied in real-world situations.
- An excellent resource for anyone taking a college- or graduate-level econometrics course
- Provides an easy-to-follow introduction to the techniques and applications of econometrics
- Helps you score high on exam day
If you're seeking a degree in economics and looking for a plain-English guide to this often-intimidating course, Econometrics For Dummies has you covered.
We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of the economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe; instead, the world is measured against them, found wanting and made to conform.
This book is dedicated to the study of the term structures of the yields of zero-coupon bonds. The methods it describes differ from those usually found in the literature in that the time variable is not the term to maturity but the interest rate duration, or another convenient non-linear transformation of terms. This makes it possible to consider yield curves not only for a limited interval of term values, but also for the entire positive semiaxis of terms. The main focus is the comparative analysis of yield curves and forward curves and the analytical study of their features. Generalizations of yield term structures are studied where the dimension of the state space of the financial market is increased. In cases where the analytical approach is too cumbersome, or impossible, numerical techniques are used. This book will be of interest to financial analysts, financial market researchers, graduate students and PhD students.
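For readers new to the objects being compared, the forward curve is pinned down by the zero-coupon yield curve: under continuous compounding, the forward rate between terms t1 < t2 is f = (r2*t2 - r1*t1) / (t2 - t1). The sketch below makes this concrete with an invented curve; it is the standard textbook computation in maturity time, not the book's duration-transformed approach.

```python
# Forward rates implied by a zero-coupon yield curve under continuous
# compounding: f(t1, t2) = (r2*t2 - r1*t1) / (t2 - t1).
# The zero curve below is invented for illustration.
terms = [1.0, 2.0, 3.0, 5.0, 10.0]           # years to maturity
zeros = [0.020, 0.023, 0.025, 0.028, 0.030]  # zero-coupon yields

curve = list(zip(terms, zeros))
for (t1, r1), (t2, r2) in zip(curve, curve[1:]):
    fwd = (r2 * t2 - r1 * t1) / (t2 - t1)
    print(f"forward {t1:>4.1f}y -> {t2:>4.1f}y: {fwd:.4f}")
```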
High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. This book is the first to integrate theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and the book covers both classical results, such as Hoeffding's and Chernoff's inequalities, and modern developments, such as the matrix Bernstein inequality. It then introduces powerful methods based on stochastic processes, including such tools as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
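For orientation, here is the standard statement of Hoeffding's inequality, one of the classical results mentioned above (the usual textbook form, not quoted from this book): a sum of bounded independent random variables concentrates sharply around its mean.

```latex
% Hoeffding's inequality: for independent X_1, ..., X_n with
% X_i \in [a_i, b_i] and S_n = X_1 + ... + X_n,
\[
  \mathbb{P}\bigl(\lvert S_n - \mathbb{E}[S_n]\rvert \ge t\bigr)
  \;\le\; 2\exp\!\left(-\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right),
  \qquad t > 0.
\]
```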
Connections among different assets, asset classes, portfolios, and the stocks of individual institutions are critical in examining financial markets, and interest in financial markets implies interest in their underlying macroeconomic fundamentals. In Financial and Macroeconomic Connectedness, Frank Diebold and Kamil Yilmaz propose a simple framework for defining, measuring, and monitoring connectedness, which is central to finance and macroeconomics. These measures of connectedness are theoretically rigorous yet empirically relevant. The authors' approach to connectedness is intimately related to the familiar econometric notion of variance decomposition: the full set of variance decompositions from vector autoregressions produces the core of the 'connectedness table.' The connectedness table makes clear how one can begin with the most disaggregated pairwise directional connectedness measures and aggregate them in various ways to obtain total connectedness measures. The authors also show that variance decompositions define weighted, directed networks, so that the proposed connectedness measures are intimately related to key measures of connectedness used in the network literature. After describing their methods in the first part of the book, the authors proceed to characterize daily return and volatility connectedness across major asset markets (stock, bond, foreign exchange and commodity) as well as across financial institutions within the U.S. and across countries since the late 1990s. These measures of volatility connectedness show that stock markets played a critical role in spreading volatility shocks from the U.S. to other countries. Furthermore, while return connectedness across stock markets increased gradually over time, the volatility connectedness measures were subject to significant jumps during major crisis events. The book examines not only financial connectedness but also real fundamental connectedness. In particular, the authors show that global business cycle connectedness is economically significant and time-varying, that the U.S. has disproportionately high connectedness to others, and that pairwise country connectedness is inversely related to bilateral trade surpluses.
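The central computation is easy to sketch. Given a forecast-error variance decomposition matrix whose entry D[i, j] is the share of variable i's variance attributable to shocks in variable j (rows summing to one), the off-diagonal entries yield pairwise, directional, and total connectedness. The toy matrix below is invented; in the authors' framework D would come from a vector autoregression's variance decomposition.

```python
import numpy as np

# Connectedness measures from a variance-decomposition matrix D, where
# D[i, j] is the share of i's forecast-error variance due to shocks in
# j and rows sum to one. The 3x3 matrix below is invented.
D = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.70, 0.20],
    [0.05, 0.25, 0.70],
])
n = D.shape[0]
off_diag = D - np.diag(np.diag(D))

total = off_diag.sum() / n            # total connectedness
to_others = off_diag.sum(axis=0)      # directional: shocks j -> others
from_others = off_diag.sum(axis=1)    # directional: others -> variable i

print(f"total connectedness: {total:.3f}")
print("to others:  ", np.round(to_others, 3))
print("from others:", np.round(from_others, 3))
```

Interpreting these row and column sums as in-degrees and out-degrees is precisely the link to weighted, directed networks noted above.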
Algorithmic Trading and Quantitative Strategies provides an in-depth overview of this growing field, with a unique mix of quantitative rigor and practitioners' hands-on experience. The focus on empirical modeling and practical know-how makes this book a valuable resource for students and professionals. The book starts with the often overlooked context of why and how we trade, via a detailed introduction to market structure and quantitative microstructure models. The authors then present the necessary quantitative toolbox, including the more advanced machine learning models needed to operate successfully in the field. They next discuss quantitative trading, alpha generation, active portfolio management and more recent topics like news and sentiment analytics. The last main topic, execution algorithms, is covered in detail, with emphasis on the state of the field and on critical topics including the elusive concept of market impact. The book concludes with a discussion of the technology infrastructure necessary to implement algorithmic strategies in large-scale production settings. A GitHub repository includes data sets and explanatory/exercise Jupyter notebooks. The exercises involve supplying the code needed to solve each analysis or problem.
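Market impact, called elusive above, is often summarized in practice by an empirical square-root rule of thumb: expected impact grows roughly with the square root of order size relative to daily volume, scaled by volatility. The sketch below illustrates that widely cited heuristic; it is not a formula from this book, and the constant and inputs are invented.

```python
import math

# Square-root market-impact heuristic:
#   impact ~ c * sigma * sqrt(Q / V)
# with daily volatility sigma, order size Q, daily volume V, and an
# empirical constant c of order one. All numbers here are invented.
def sqrt_impact(q_shares, daily_volume, daily_vol, c=1.0):
    return c * daily_vol * math.sqrt(q_shares / daily_volume)

# A 500k-share order in a stock trading 10M shares/day at 2% daily vol:
print(f"estimated impact: {sqrt_impact(5e5, 1e7, 0.02):.4%}")
```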
For one-semester courses in introduction to business statistics. The gold standard for learning Microsoft Excel for business statistics. Statistics for Managers Using Microsoft (R) Excel (R), 9th Edition, Global Edition helps students develop the Excel skills needed in their future careers. The authors present statistics in the context of specific business fields, and now include a full chapter on business analytics. Guided by principles set forth in the ASA's Guidelines for Assessment and Instruction in Statistics Education (GAISE) reports and by the authors' diverse teaching experiences, the text continues to innovate and improve the way this course is taught to students. Current data throughout gives students valuable practice analysing the types of data they will see in their professions, and the authors' friendly writing style includes tips and learning aids throughout.
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines, including economics, nutrition, psychology and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan, and extending the analytical framework used in economics and the biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of how socioeconomic, nutritional, and behavioural variables bear on issues such as cognitive development in children and labour productivity in the developing world. These unique insights, combined with a multi-disciplinary approach, forge the way for a more refined and effective approach to food policy formation going forward. A chapter on the growing obesity epidemic is also included, highlighting a new set of problems facing not only developed but also developing countries. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
This essential reference for students and scholars in the input-output research and applications community has been fully revised and updated to reflect important developments in the field. Expanded coverage includes construction and application of multiregional and interregional models, including international models and their application to global economic issues such as climate change and international trade; structural decomposition and path analysis; linkages and key sector identification and hypothetical extraction analysis; the connection of national income and product accounts to input-output accounts; supply and use tables for commodity-by-industry accounting and models; social accounting matrices; non-survey estimation techniques; and energy and environmental applications. Input-Output Analysis is an ideal introduction to the subject for advanced undergraduate and graduate students in many scholarly fields, including economics, regional science, regional economics, city, regional and urban planning, environmental planning, public policy analysis and public management.
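At the heart of every input-output model is the Leontief system: given a technical-coefficients matrix A and final demand d, gross output solves x = (I - A)^(-1) d. The sketch below shows the standard computation on an invented two-sector economy; it illustrates the general method, not any example from the book.

```python
import numpy as np

# Leontief input-output model: x = (I - A)^(-1) d, where A[i, j] is the
# input from sector i required per unit of sector j's output and d is
# final demand. The two-sector coefficients below are invented.
A = np.array([
    [0.2, 0.3],
    [0.4, 0.1],
])
d = np.array([100.0, 50.0])

x = np.linalg.solve(np.eye(2) - A, d)  # gross output by sector
print(np.round(x, 2))                  # covers final + intermediate demand
```

Solving the linear system directly, rather than forming the inverse, is the numerically preferred route, though the Leontief inverse itself is what multiplier analysis inspects.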
Factor Analysis and Dimension Reduction in R provides coverage, with worked examples, of a large number of dimension reduction procedures, along with model performance metrics for comparing them. Factor analysis in the form of principal components analysis (PCA) or principal factor analysis (PFA) is familiar to most social scientists. What is less familiar is that factor analysis is a subset of the more general statistical family of dimension reduction methods. The social scientist's toolkit for factor analysis problems can be expanded to include the range of solutions this book presents. In addition to covering FA and PCA with orthogonal and oblique rotation, the book covers higher-order factor models, bifactor models, models based on binary and ordinal data, models based on mixed data, generalized low-rank models, cluster analysis with GLRM, models involving supplemental variables or observations, Bayesian factor analysis, regularized factor analysis, testing for unidimensionality, and prediction with factor scores. The second half of the book deals with other procedures for dimension reduction: kernel PCA, factor analysis with multidimensional scaling, locally linear embedding models, Laplacian eigenmaps, diffusion maps, force-directed methods, t-distributed stochastic neighbor embedding, independent component analysis (ICA), dimensionality reduction via regression (DRR), non-negative matrix factorization (NNMF), Isomap, autoencoders, uniform manifold approximation and projection (UMAP) models, neural network models, and longitudinal factor analysis models. In addition, a special chapter covers metrics for comparing model performance. Features of this book include:
- Numerous worked examples with replicable R code
- Explicit, comprehensive coverage of data assumptions
- Adaptation of factor methods to binary, ordinal, and categorical data
- Residual and outlier analysis
- Visualization of factor results
- Final chapters that treat integration of factor analysis with neural network and time series methods
Presented in color, with R code and an introduction to R and RStudio, the book is suitable for graduate-level courses and optional modules for social scientists, and for courses on quantitative methods and multivariate statistics.
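Since PCA is the family member most readers will already know, here is a minimal sketch of its core computation, projecting centered data onto the directions of greatest variance. The book works in R; this Python/scikit-learn version on simulated data is offered only as an illustration of the same technique.

```python
import numpy as np
from sklearn.decomposition import PCA

# PCA on simulated correlated data: one latent factor drives five
# observed variables, so the first component should dominate.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                  # one true factor
X = latent @ rng.normal(size=(1, 5)) + 0.3 * rng.normal(size=(200, 5))

pca = PCA(n_components=2)
scores = pca.fit_transform(X)                       # component scores
print(np.round(pca.explained_variance_ratio_, 3))  # first share dominates
```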
This report is a partial result of the China Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasts and macroeconomic policy analysis, and it began developing the CQMM in 2005 for the purposes of short-term forecasting, policy analysis, and simulation. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. The conference was jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
The book focuses on problem solving for practitioners and on model building for academicians in multivariate situations. It helps readers understand issues such as assessing variability, extracting patterns, building relationships, and making objective decisions. A large number of multivariate statistical models are covered. Readers will learn how a practical problem can be converted into a statistical problem and how the statistical solution can be interpreted as a practical solution. Key features:
- Links the data-generation process with statistical distributions in the multivariate domain
- Provides a step-by-step procedure for estimating the parameters of developed models
- Provides a blueprint for data-driven decision making
- Includes practical examples and case studies relevant to the intended audiences
The book will help everyone involved in data-driven problem solving, modeling and decision making.
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore the various disciplines from which the concepts and methods of artificial economics originate, for example cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions of several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophy of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting its occasionally controversial aspects in a balanced way.
This book develops a range of quantitative approaches to the study of electoral systems: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization-based. All the authors are prominent scholars in these disciplines. Quantitative approaches offer a powerful tool for detecting inconsistencies or poor performance in actual systems. Applications to concrete settings such as the EU, the American Congress, and regional and committee voting are discussed.
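One way quantitative analysis exposes such inconsistencies is by showing that reasonable voting rules can disagree on the very same ballots. The sketch below compares plurality and Borda winners on an invented preference profile; it is a standard textbook-style example, not one taken from the book.

```python
from collections import Counter

# Plurality vs Borda count on the same ballots: different rules can
# crown different winners. The profile below is invented.
ballots = (
    [("A", "B", "C")] * 4 +   # 4 voters rank A > B > C
    [("B", "C", "A")] * 3 +   # 3 voters rank B > C > A
    [("C", "B", "A")] * 2     # 2 voters rank C > B > A
)

plurality = Counter(b[0] for b in ballots)   # count first-place votes

borda = Counter()
for b in ballots:
    for points, cand in enumerate(reversed(b)):  # last place scores 0
        borda[cand] += points

print("plurality winner:", plurality.most_common(1)[0][0])  # A
print("borda winner:    ", borda.most_common(1)[0][0])      # B
```

Here A wins under plurality while B, ranked first or second by every voter, wins under Borda; deciding which outcome is defensible is exactly the kind of question the book's formal tools address.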
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text, presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max-weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality- and efficiency-driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
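The supermarket model mentioned above is simple to simulate: each arriving job samples d queues at random and joins the shortest, and even d = 2 dramatically shortens queues compared with purely random routing (d = 1). Below is a minimal discrete-time sketch of that comparison, my own toy illustration with invented parameters, not code from the book.

```python
import random

# Supermarket model ("power of d choices"): each arrival samples d
# queues and joins the shortest; each nonempty queue serves one job
# per time step. Toy discrete-time simulation; parameters invented.
def simulate(d, n_queues=100, load=0.9, steps=5_000, seed=42):
    rng = random.Random(seed)
    queues = [0] * n_queues
    for _ in range(steps):
        for _ in range(n_queues):          # ~load arrivals per queue
            if rng.random() < load:
                choices = rng.sample(range(n_queues), d)
                shortest = min(choices, key=lambda i: queues[i])
                queues[shortest] += 1
        queues = [max(0, q - 1) for q in queues]   # one service each
    return sum(queues) / n_queues

for d in (1, 2):
    print(f"d={d}: mean queue length ~ {simulate(d):.2f}")
```

The sharp drop from d = 1 to d = 2 is the load-balancing effect that many-server scaling analyzes rigorously.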
You may like...
- Quantitative statistical techniques. Swanepoel Swanepoel, Vivier Vivier, … Paperback. (2) R718 / Discovery Miles 7 180
- Operations And Supply Chain Management. David Collier, James Evans. Hardcover.
- Linear and Non-Linear Financial… Mehmet Kenan Terzioglu, Gordana Djurovic. Hardcover. R3,581 / Discovery Miles 35 810
- The Leading Indicators - A Short History… Zachary Karabell. Paperback.
- Pricing Decisions in the Euro Area - How… Silvia Fabiani, Claire Loupias, … Hardcover. R2,160 / Discovery Miles 21 600
- Introductory Econometrics - A Modern… Jeffrey Wooldridge. Hardcover.