Building on the strength of the first edition, Quantitative Methods for Business and Economics provides a simple introduction to the mathematical and statistical techniques needed in business. This book is accessible and easy to use, with the emphasis clearly on how to apply quantitative techniques to business situations. It includes numerous real-world applications and many opportunities for student interaction. It is clearly focused on business, management and economics students taking a single module in Quantitative Methods.
Reflecting the developments and new directions in the field since the publication of the successful first edition, this revised and expanded edition contains a complete set of problems and solutions. In particular, sections on nonstationary panel data analysis and a discussion of the distinction between deterministic and stochastic trends have been added. Three new chapters on long-memory discrete-time and continuous-time processes have also been created, while some chapters have been merged and some sections deleted. The first eleven chapters of the first edition have been compressed into ten chapters, with a chapter on nonstationary panels added under Part I: Analysis of Non-fractional Time Series. Chapters 12 to 14 have been newly written under Part II: Analysis of Fractional Time Series. Chapter 12 discusses the basic theory of long-memory processes by introducing ARFIMA models and the fractional Brownian motion (fBm). Chapter 13 is concerned with the computation of distributions of quadratic functionals of the fBm and its ratio. Chapter 14 introduces the fractional Ornstein-Uhlenbeck process, for which statistical inference is discussed. Finally, Chapter 15 gives a complete set of solutions to the problems posed at the end of most sections.
This new edition features: sections discussing nonstationary panel data analysis, the problem of differentiating between deterministic and stochastic trends, and nonstationary processes of local deviations from a unit root; consideration of the maximum likelihood estimator of the drift parameter, as well as asymptotics as the sampling span increases; discussion of not only nonstationary but also noninvertible time series from a theoretical viewpoint; and new topics such as the computation of limiting local powers of panel unit root tests, the derivation of the fractional unit root distribution, and unit root tests under the fBm error. Time Series Analysis: Nonstationary and Noninvertible Distribution Theory, Second Edition, is a reference for graduate students in econometrics or time series analysis. Katsuto Tanaka, PhD, is a professor in the Faculty of Economics at Gakushuin University and was previously a professor at Hitotsubashi University. He is a recipient of the Tjalling C. Koopmans Econometric Theory Prize (1996), the Japan Statistical Society Prize (1998), and the Econometric Theory Award (1999). Aside from the first edition of Time Series Analysis (Wiley, 1996), Dr. Tanaka has published five econometrics and statistics books in Japanese.
This volume of Advances in Econometrics contains articles that examine key topics in the modeling and estimation of dynamic stochastic general equilibrium (DSGE) models. Because DSGE models combine micro- and macroeconomic theory with formal econometric modeling and inference, over the past decade they have become an established framework for analyzing a variety of issues in empirical macroeconomics. The research articles make contributions in several key areas in DSGE modeling and estimation. In particular, papers cover the modeling and role of expectations, the study of optimal monetary policy in two-country models, and the problem of non-invertibility. Other interesting areas of inquiry include the analysis of parameter identification in new open economy macroeconomic models and the modeling of trend inflation shocks. The second part of the volume is devoted to articles that offer innovations in econometric methodology. These papers advance new techniques for addressing major inferential problems and include discussion and applications of Laplace-type, frequency domain, empirical likelihood and method of moments estimators.
Across the social sciences there has been increasing focus on reproducibility, i.e., the ability to examine a study's data and methods to ensure accuracy by reproducing the study. Reproducible Econometrics Using R combines an overview of key issues and methods with an introduction to applying them using open-source software (R) and recently developed tools (R Markdown and bookdown) that allow the reader to engage in reproducible econometric research. Jeffrey S. Racine provides a step-by-step approach and covers five sets of topics: i) linear time series models, ii) robust inference, iii) robust estimation, iv) model uncertainty, and v) advanced topics. The time series material highlights the difference between time-series analysis, which focuses on forecasting, and cross-sectional analysis, where the focus is typically on model parameters that have economic interpretations. For the time series material, the reader begins with a discussion of random walks, white noise, and non-stationarity. The reader is next exposed to the pitfalls of using standard inferential procedures, popular in cross-sectional settings, when modelling time series data, and is introduced to alternative procedures that form the basis for linear time series analysis. For the robust inference material, the reader is introduced to the potential advantages of bootstrapping and jackknifing over the use of asymptotic theory, and a range of numerical approaches are presented. For the robust estimation material, the reader is presented with a discussion of issues surrounding outliers in data and methods for addressing their presence. Finally, the model uncertainty material outlines two dominant approaches for dealing with model uncertainty, namely model selection and model averaging. Throughout the book there is an emphasis on the benefits of using R and other open-source tools for ensuring reproducibility.
The advanced material covers machine learning methods (support vector machines that are useful for classification) and nonparametric kernel regression which provides the reader with more advanced methods for confronting model uncertainty. The book is well suited for advanced undergraduate and graduate students alike. Assignments, exams, slides, and a solution manual are available for instructors.
Sociological theories of crime include: theories of strain, which blame crime on personal stressors; theories of social learning, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and theories of control, which look at crime as natural and rewarding, and explore the formation of institutions that control crime. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades resources with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption do not involve an explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, a rational-choice-based approach. Simulation-based approaches take into account greater complexities of interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback, and agent-based simulation, which can go a step farther by computing new social structures not previously identified in theory. The latter include cognitive agent models, in which agents learn how to perceive their environment and act upon the perceptions of their individual experiences, and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects with existing societal structures. For example, NNL is a cognitive agent model based on the REPAST Simphony toolkit.
In recent years econometricians have examined the problems of diagnostic testing, specification testing, semiparametric estimation and model selection. In addition researchers have considered whether to use model testing and model selection procedures to decide the models that best fit a particular dataset. This book explores both issues with application to various regression models, including the arbitrage pricing theory models. It is ideal as a reference for statistical sciences postgraduate students, academic researchers and policy makers in understanding the current status of model building and testing techniques.
How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems in the analysis of social interactions is used to develop multi-level theories that examine the interplay between individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually addressed by game theory, especially focusing on the formation of groups and the role of social norms in maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the ethical principles embraced by a society.
Financial models are an inescapable feature of modern financial markets. Yet over-reliance on these models, and the failure to test them properly, is now widely recognized as one of the main causes of the financial crisis of 2007-2011. Since this crisis, there has been an increase in the amount of scrutiny and testing applied to such models, and validation has become an essential part of model risk management at financial institutions. The book covers all of the major risk areas that a financial institution is exposed to and uses models for, including market risk, interest rate risk, retail credit risk, wholesale credit risk, compliance risk, and investment management. The book discusses current practices and pitfalls that model risk users need to be aware of and identifies areas where validation can be advanced in the future. It provides the first unified framework for validating risk management models.
Economists are regularly confronted with results of quantitative economics research. Econometrics: Theory and Applications with EViews provides a broad introduction to quantitative economic methods: for example, how models arise, their underlying assumptions, and how estimates of parameters or other economic quantities are computed. The author combines econometric theory with practice by demonstrating its use with the software package EViews through extensive use of screenshots. The emphasis is on understanding how to select the right method of analysis for a given situation, and how to actually apply the theoretical methodology correctly. The EViews software package is available from Quantitative Micro Software. Written for any undergraduate or postgraduate course in Econometrics.
Learn more about modern Econometrics with this comprehensive introduction to the field, featuring engaging applications and bringing contemporary theories to life. Introduction to Econometrics, 4th Edition, Global Edition by Stock and Watson is the ultimate introductory guide that connects modern theory with motivating, engaging applications. The text ensures you get a solid grasp of this challenging subject's theoretical background, building on the philosophy that applications should drive the theory, not the other way around. The latest edition maintains the focus on currency, emphasising empirical analysis and incorporating real-world questions and data by using results directly relevant to the applications. The text contextualises the study of Econometrics with a comprehensive introduction and review of economics, data, and statistics before proceeding to an extensive regression analysis studying the different variables and regression parameters. With large data sets increasingly used in Economics and related fields, a new chapter dedicated to Big Data will help you learn more about this growing and exciting area. Sharing a variety of resources and tools to aid your understanding and critical thinking about the topics introduced, such as General Interest boxes, end-of-chapter summaries, and empirical exercises, this industry-leading text will help you acquire a sophisticated knowledge of this fascinating subject. Reach every student by pairing this text with Pearson MyLab (R) Economics. MyLab is the teaching and learning platform that empowers you to reach every student. By combining trusted author content with digital tools and a flexible platform, MyLab personalises the learning experience and improves results for each student.
If you would like to purchase both the physical text and MyLab Economics, search for 9781292264561 Introduction to Econometrics, 4th Edition, Global Edition with MyLab Economics. The package consists of: 9781292264455 Introduction to Econometrics, 4th Edition, Global Edition; 9781292264516 Introduction to Econometrics, 4th Edition, Global Edition MyLab Economics; and 9780136879787 Introduction to Econometrics, 4th Edition, Global Edition Pearson eText. Pearson MyLab (R) Economics is not included. Students: if Pearson MyLab Economics is a recommended/mandatory component of the course, please ask your instructor for the correct ISBN. Pearson MyLab Economics should only be purchased when required by an instructor. Instructors: contact your Pearson representative for more information.
This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth and performance. These concepts are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and the methodological approaches adopted in the analysis. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement - deterministic benchmarking and stochastic benchmarking - in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insights on what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution is its coverage of advancements in panel data analysis, a statistical method widely used to analyze two- or higher-dimensional panel data. The topics discussed in early editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. This book, featured in Cambridge's Econometric Society Monographs series, has been the leader in the field since the first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
This book is dedicated to the study of the term structures of the yields of zero-coupon bonds. The methods it describes differ from those usually found in the literature in that the time variable is not the term to maturity but the interest rate duration, or another convenient non-linear transformation of terms. This makes it possible to consider yield curves not only for a limited interval of term values, but also for the entire positive semiaxis of terms. The main focus is the comparative analysis of yield curves and forward curves and the analytical study of their features. Generalizations of yield term structures are studied where the dimension of the state space of the financial market is increased. In cases where the analytical approach is too cumbersome, or impossible, numerical techniques are used. This book will be of interest to financial analysts, financial market researchers, graduate students and PhD students.
We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe, but instead the world is measured against them, found wanting and made to conform.
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines, including economics, nutrition, psychology and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan, and extending the analytical framework used in economics and biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the relationships among socioeconomic, nutritional, and behavioural variables on issues such as cognitive development in children and labour productivity in the developing world. These unique insights, combined with a multi-disciplinary approach, forge the way for a more refined and effective approach to food policy formation going forward. A chapter on the growing obesity epidemic is also included, highlighting the new set of problems facing not only developed but developing countries. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
This is the perfect (and essential) supplement for all econometrics classes--from a rigorous first undergraduate course, to a first master's, to a PhD course.
This report is a partial result of the China's Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecast and macroeconomic policy analysis, and it started to develop the CQMM for the purpose of short-term forecasting, policy analysis, and simulation in 2005. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. This conference was jointly held by Xiamen University and Economic Information Daily of Xinhua News Agency.
Microbehavioral Econometric Methods and Environmental Studies uses microeconometric methods to model the behavior of individuals, then demonstrates the modelling approaches in addressing policy needs. It links theory and methods with applications, and it incorporates data to connect individual choices and global environmental issues. This extension of traditional environmental economics presents modeling strategies and methodological techniques, then applies them to hands-on examples. Throughout the book, readers can access chapter summaries, problem sets, multiple household survey datasets on agricultural and natural resources in Sub-Saharan Africa, South America, and India, and empirical results and solutions from the SAS software.
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore various disciplines from which the concepts and methods of artificial economics originate; for example cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions on several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophies of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting in a balanced way its occasionally controversial aspects.
In this book, different quantitative approaches to the study of electoral systems have been developed: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization ones. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool to detect inconsistencies or poor performance in actual systems. Applications to concrete settings such as EU, American Congress, regional, and committee voting are discussed.
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small-sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance to the now vast literature facing students and graduate economists.