This practical guide to EViews is aimed at practitioners and students in business, economics, econometrics, and finance. It uses a step-by-step approach to equip readers with a toolkit that enables them to make the most of this widely used econometric analysis software. Statistical and econometric concepts are explained visually with examples, problems, and solutions. Developed by economists, the EViews statistical software package is most commonly used for time-series-oriented econometric analysis. It allows users to quickly develop statistical relations from data and then use those relations to forecast future values of the data. The package provides convenient ways to enter or upload data series, create new series from existing ones, display and print series, carry out statistical analyses of relationships among series, and manipulate results and output. This highly hands-on resource includes more than 200 illustrative graphs and tables, with tutorials throughout.
Abdulkader Aljandali is Senior Lecturer at Coventry University in London. He is currently leading the Stochastic Finance module taught as part of the Global Financial Trading MSc. His previously published work includes Exchange Rate Volatility in Emerging Markets, Quantitative Analysis, and Multivariate Methods and Forecasting with IBM(R) SPSS(R) Statistics. Dr Aljandali is an established member of the British Accounting and Finance Association and the Higher Education Academy. Motasam Tatahi is a specialist in macroeconomics, financial economics, and financial econometrics at the European Business School, Regent's University London, where he serves as Principal Lecturer and Dissertation Coordinator for the MSc in Global Banking and Finance.
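EViews itself is proprietary, but the workflow the blurb describes — entering a data series, generating new series from existing ones, and forecasting future values — can be sketched in Python for readers without an EViews licence. The `sales` series, its dates, and the linear-trend model below are hypothetical illustrations, not examples from the book:

```python
import numpy as np
import pandas as pd

# hypothetical monthly series, standing in for data one would enter in EViews
rng = np.random.default_rng(7)
idx = pd.date_range("2015-01", periods=120, freq="MS")
sales = pd.Series(100 + 0.5 * np.arange(120) + rng.normal(scale=2, size=120),
                  index=idx)

# create a new series from an existing one (analogous to EViews' genr command)
log_sales = np.log(sales)

# fit a simple linear trend and forecast the next 12 months
t = np.arange(len(sales))
slope, intercept = np.polyfit(t, sales.to_numpy(), 1)  # highest power first
future_t = np.arange(len(sales), len(sales) + 12)
forecast = intercept + slope * future_t
```

The same display, transformation, and forecasting steps are what the book walks through interactively in the EViews interface.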
Provides sound knowledge of optimal decision-making in statistics and operations research problems. Serves as a quick reference by exploring the research literature on the subject, with commercial value-added research applications in statistics and operations research. Provides sound knowledge of optimisation and statistical techniques in the modelling of real-world problems. Reviews recent developments and contributions in optimal decision-making problems using optimisation and statistical techniques. Provides an understanding of formulations of decision-making problems and their solution procedures. Describes the latest developments in the modelling of real-world problems and their solution approaches.
Generalized method of moments (GMM) estimation has become one of the main statistical tools for the analysis of economic and financial data. Designed for both theoreticians and practitioners, this book provides a comprehensive treatment of GMM estimation and inference. All the main statistical results are discussed intuitively and proved formally, and all the inference techniques are illustrated using empirical examples in macroeconomics and finance. This book is the first to provide an intuitive introduction to the method combined with a unified treatment of GMM statistical theory and a survey of recent important developments in the field.
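The core GMM idea the blurb refers to can be sketched in a few lines: choose parameters so that the sample moment conditions are as close to zero as possible, in the quadratic-form metric g(theta)' W g(theta). A minimal just-identified sketch in Python — the data, the two moment conditions, and the identity weighting matrix are illustrative assumptions, not the book's examples:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=5000)

def gbar(theta):
    # sample moment conditions: E[x] - mu and E[x^2] - (mu^2 + sigma^2)
    mu, sig2 = theta
    g1 = np.mean(x) - mu
    g2 = np.mean(x**2) - (mu**2 + sig2)
    return np.array([g1, g2])

def objective(theta, W=np.eye(2)):
    g = gbar(theta)
    return g @ W @ g  # quadratic form g' W g

res = minimize(objective, x0=[0.0, 1.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-12})
mu_hat, sig2_hat = res.x
```

In this just-identified case the minimiser simply reproduces the sample mean and variance; the over-identified models treated in the book additionally require choosing the weighting matrix W optimally.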
Medicine Price Surveys, Analyses and Comparisons establishes guidelines for the study and implementation of pharmaceutical price surveys, analyses, and comparisons. Its contributors evaluate price survey literature, discuss the accessibility and reliability of data sources, and provide a checklist and training kit on conducting price surveys, analyses, and comparisons. Their investigations survey price studies while accounting for the effects of methodologies and explaining regional differences in medicine prices. They also consider policy objectives such as affordable access to medicines and cost-containment as well as options for improving the effectiveness of policies.
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, and to acquaint the reader with the history and characteristics of those methods and with time series issues in general. Both statistical and econometric methods, as well as artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods at the theoretical level, always accompanied by a case study, i.e. their specific use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff and students of economics, but also for scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could provide useful information for speculators and traders on financial markets, especially commodity markets.
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of the economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links existing between the various levels. Basic insights from National Accounts are thereby used. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications on real-life data demonstrate the empirical feasibility of the theory. The book is directed to a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
Belief and Rule Compliance: An Experimental Comparison of Muslim and Non-Muslim Economic Behavior uses modern behavioral science and game theory to examine the behavior and compliance of Muslim populations to Islamic Finance laws and norms. The work identifies behaviors characterized by unexpected complexity and profound divergence, including expectations for sharing, cooperation and entrepreneurship gleaned from studies. Adopting a unique set of recent empirical observations, the work provides a reliable behavioral foundation for practitioners seeking to evaluate, create and market Islamic financial products.
The case-control approach is a powerful method for investigating factors that may explain a particular event. It is extensively used in epidemiology to study disease incidence, one of the best-known examples being Bradford Hill and Doll's investigation of the possible connection between cigarette smoking and lung cancer. More recently, case-control studies have been increasingly used in other fields, including sociology and econometrics. With a particular focus on statistical analysis, this book is ideal for applied and theoretical statisticians wanting an up-to-date introduction to the field. It covers the fundamentals of case-control study design and analysis as well as more recent developments, including two-stage studies, case-only studies and methods for case-control sampling in time. The latter have important applications in large prospective cohorts which require case-control sampling designs to make efficient use of resources. More theoretical background is provided in an appendix for those new to the field.
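The quantity at the heart of case-control analysis is the odds ratio, since incidence itself cannot be estimated under case-control sampling. A minimal sketch in Python, using hypothetical counts (not Doll and Hill's actual data) for a smoking and lung cancer table:

```python
import math

# hypothetical 2x2 case-control table (counts are illustrative only)
#                     cases (lung cancer)   controls
a, b = 688, 650   # smokers among cases, controls
c, d = 21, 59     # non-smokers among cases, controls

# cross-product (odds) ratio: odds of exposure in cases vs controls
odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log-odds-ratio scale
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An odds ratio well above 1 with a confidence interval excluding 1 indicates an association between exposure and case status, which is the basic inference the book then extends to matched, two-stage, and time-sampled designs.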
Equilibrium Problems and Applications develops a unified variational approach to deal with single-valued, set-valued and quasi-equilibrium problems. The authors promote original results in relationship with classical contributions to the field of equilibrium problems. The content is developed in the general setting of topological vector spaces and lies at the interface between pure and applied nonlinear analysis, mathematical economics, and mathematical physics. This abstract approach is based on tools from various fields, including set-valued analysis, variational and hemivariational inequalities, fixed point theory, and optimization. Applications include models from mathematical economics, Nash equilibrium of non-cooperative games, and Browder variational inclusions. The content is self-contained and the book is mainly addressed to researchers in mathematics, economics and mathematical physics, as well as to graduate students in applied nonlinear analysis.
What is econophysics? What makes an econophysicist? Why are financial economists reluctant to use results from econophysics? Can we overcome disputes concerning hypotheses that are used in financial economics but make no sense to econophysicists? How can we create a profitable dialogue between financial economists and econophysicists? How do we develop a common theoretical framework allowing the creation of more efficient models for the financial industry? This book moves beyond the disciplinary frontiers in order to initiate the development of a common theoretical framework that makes sense for both traditionally trained financial economists and econophysicists. Unlike other publications dedicated to econophysics, this book is written by two financial economists, and it situates econophysics in the evolution of financial economics. The major issues that concern the collaboration between the two fields are analyzed in detail. More specifically, this book explains the theoretical and methodological foundations of the two fields in an accessible vocabulary, providing the first extensive analytic comparison between models and results from both. The book also identifies the major conceptual gate-keepers that complicate dialogue between the two communities, and provides elements to overcome them. By combining conceptual, historical, theoretical, and formal arguments, this analysis bridges the currently deaf dialogue between financial economists and econophysicists. The book details the recent results in econophysics that bring it closer to financial economics, and in so doing identifies what remains to be done for econophysicists to contribute significantly to financial economics. Beyond clarifying the current situation, the book also proposes a generic model compatible with the two fields, defining minimal conditions for common models. Finally, it provides a research agenda for a more fruitful collaboration between econophysicists and financial economists, creating new research opportunities. In this perspective, it lays the foundations for a common theoretical framework and common models.
Biophysical Measurement in Experimental Social Science Research is an ideal primer for the experimental social scientist wishing to update their knowledge and skillset in the area of laboratory-based biophysical measurement. Many behavioral laboratories across the globe have acquired increasingly sophisticated biophysical measurement equipment, sometimes for particular research projects or for financial or institutional reasons. Yet the expertise required to use this technology and integrate the measures it can generate on human subjects into successful social science research endeavors is often scarce and concentrated amongst a small minority of researchers. This book aims to open the door to wider and more productive use of biophysical measurement in laboratory-based experimental social science research. Suitable for doctoral students through to established researchers, the volume presents examples of the successful integration of biophysical measures into analyses of human behavior, discussions of the academic and practical limitations of laboratory-based biophysical measurement, and hands-on guidance about how different biophysical measurement devices are used. A foreword and concluding chapters comprehensively synthesize and compare biophysical measurement options, address academic, ethical and practical matters, and address the broader historical and scientific context. Research chapters demonstrate the academic potential of biophysical measurement ranging fully across galvanic skin response, heart rate monitoring, eye tracking and direct neurological measurements. An extended Appendix showcases specific examples of device adoption in experimental social science lab settings.
In many industries tariffs are not strictly proportional to the quantity purchased, i.e. they are nonlinear. Examples of nonlinear tariffs include railroad and electricity schedules and rental rates for durable goods and space. The major justification for nonlinear pricing is the existence of private information on the side of consumers. In the early papers on the subject, private information was captured either by assuming a finite number of types (e.g. Adams and Yellen, 1976) or by a unidimensional continuum of types (Mussa and Rosen, 1978). The economics of unidimensional problems is by now well understood. The unidimensional models, however, do not cover all situations of practical interest. Indeed, nonlinear tariffs often specify the payment as a function of a variety of characteristics. For example, railroad tariffs specify charges based on weight, volume, and distance of each shipment. Different customers may value each of these characteristics differently, hence the customer's type will not in general be captured by a unidimensional characteristic, and a problem of multidimensional screening arises. In such models the consumer's private information (her type) is captured by an m-dimensional vector, while the good produced by the monopolist has n quality dimensions.
Written for those who need an introduction, Applied Time Series Analysis reviews applications of the popular econometric analysis technique across disciplines. Carefully balancing accessibility with rigor, it spans economics, finance, economic history, climatology, meteorology, and public health. Terence Mills provides a practical, step-by-step approach that emphasizes core theories and results without becoming bogged down by excessive technical details. Including univariate and multivariate techniques, Applied Time Series Analysis provides data sets and program files that support a broad range of multidisciplinary applications, distinguishing this book from others.
This is the first book that examines the diverse range of experimental methods currently being used in the social sciences, gathering contributions by working economists engaged in experimentation, as well as by a political scientist, psychologists and philosophers of the social sciences. Until the mid-twentieth century, most economists believed that experiments in the economic sciences were impossible. But that's hardly the case today, as evinced by the fact that Vernon Smith, an experimental economist, and Daniel Kahneman, a behavioral economist, won the Nobel Prize in Economics in 2002. However, the current use of experimental methods in economics is more diverse than is usually assumed. As the concept of experimentation underwent considerable abstraction throughout the twentieth century, the areas of the social sciences in which experiments are applied are expanding, creating renewed interest in, and multifaceted debates on, the way experimental methods are used. This book sheds new light on the diversity of experimental methodologies used in the social sciences. The topics covered include historical insights into the evolution of experimental methods; the necessary "performativity" of experiments, i.e., the dynamic interaction with the social contexts in which they are embedded; the application of causal inferences in the social sciences; a comparison of laboratory, field, and natural experiments; and the recent use of randomized controlled trials (RCTs) in development economics. Several chapters also deal with the latest heated debates, such as those concerning the use of the random lottery method in laboratory experiments.
Partial least squares structural equation modeling (PLS-SEM) has become a standard approach for analyzing complex inter-relationships between observed and latent variables. Researchers appreciate the many advantages of PLS-SEM such as the possibility to estimate very complex models and the method's flexibility in terms of data requirements and measurement specification. This practical open access guide provides a step-by-step treatment of the major choices in analyzing PLS path models using R, a free software environment for statistical computing, which runs on Windows, macOS, and UNIX computer platforms. Adopting the R software's SEMinR package, which brings a friendly syntax to creating and estimating structural equation models, each chapter offers a concise overview of relevant topics and metrics, followed by an in-depth description of a case study. Simple instructions give readers the "how-tos" of using SEMinR to obtain solutions and document their results. Rules of thumb in every chapter provide guidance on best practices in the application and interpretation of PLS-SEM.
This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and commonly omitted in textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. It is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject. However, students of other disciplines will also find it relevant.
This book provides an introduction to the technical background of unit root testing, one of the most heavily researched areas in econometrics over the last twenty years. Starting from an elementary understanding of probability and time series, it develops the key concepts necessary to understand the structure of random walks and Brownian motion, and their role in tests for a unit root. The techniques are illustrated with worked examples, data, and programs available on the book's website, which includes more numerical and theoretical examples.
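The role of the random walk in unit root testing can be illustrated with a small simulation: regress the first difference of the series on its lagged level and examine the t-statistic on that lagged level, as in a Dickey-Fuller test. This numpy sketch uses simulated data and is not an example from the book:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
y = np.cumsum(rng.normal(size=n))  # random walk: y_t = y_{t-1} + e_t

dy = np.diff(y)      # dependent variable: delta y_t
ylag = y[:-1]        # regressor: y_{t-1}
X = np.column_stack([np.ones(n - 1), ylag])  # constant plus lagged level

beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
resid = dy - X @ beta
s2 = resid @ resid / (len(dy) - 2)
cov = s2 * np.linalg.inv(X.T @ X)
t_rho = beta[1] / np.sqrt(cov[1, 1])  # Dickey-Fuller t-statistic on y_{t-1}

# under the unit-root null this statistic follows the Dickey-Fuller
# distribution, not Student's t; with a constant included, the 5% critical
# value is roughly -2.86, so values above it fail to reject a unit root
```

The nonstandard limiting distribution of this t-statistic, built from functionals of Brownian motion, is exactly the technical background the book develops.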
In response to the damage caused by a growth-led global economy, researchers across the world started investigating the association between environmental pollution and its possible determinants using different models and techniques. Most famously, the environmental Kuznets curve hypothesizes an inverted U-shaped association between environmental degradation and gross domestic product (GDP). This book explores the latest literature on the environmental Kuznets curve, including developments in the methodology, the impacts of the pandemic, and other recent findings. Researchers have recently broadened the range of drivers of environmental pollution under consideration, which now includes variables such as foreign direct investment, trade expansion, financial development, human activities, population growth, and renewable and nonrenewable energy resources, all of which vary across different countries and times. And in addition to CO2 emissions, other proxies for environmental quality – such as water, land, and ecological footprints – have been used in recent studies. This book also incorporates analysis of the relationship between economic growth and the environment during the COVID-19 crisis, presenting new empirical work on the impact of the pandemic on energy use, the financial sector, trade, and tourism. Collectively, these developments have refined the environmental Kuznets curve hypothesis and broadened the basket of dependent and independent variables which may be incorporated. This book will be invaluable reading for researchers in environmental economics and econometrics.
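The inverted-U hypothesis described above is typically tested by regressing a pollution proxy on income and income squared; a negative quadratic coefficient with an interior turning point is consistent with the EKC. A minimal sketch on synthetic data — all series and numbers below are hypothetical, not results from the book:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical cross-section: log GDP per capita and a pollution proxy
# generated to follow an inverted U peaking at log_gdp = 9.0
log_gdp = rng.uniform(6.0, 11.0, size=400)
pollution = -1.0 * (log_gdp - 9.0) ** 2 + 10.0 + rng.normal(scale=0.5, size=400)

# EKC regression: pollution = b0 + b1*log_gdp + b2*log_gdp^2 + e
X = np.column_stack([np.ones_like(log_gdp), log_gdp, log_gdp**2])
b0, b1, b2 = np.linalg.lstsq(X, pollution, rcond=None)[0]

# with b2 < 0, pollution peaks at the interior turning point -b1 / (2*b2)
turning_point = -b1 / (2 * b2)
```

In empirical work the same specification is estimated on panel data with additional drivers (trade, FDI, energy mix) as controls, which is where the methodological debates surveyed in the book arise.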
This book examines the macroeconomic and regulatory impact of domestic and international shocks on the South African economy resulting from the 2009 financial crisis. It also assesses the impact of the US economy's eventual recovery from the crisis and the prospect of higher US interest rates in future. Told in three parts, the book explores associations between economic growth, policy uncertainty and the key domestic and international transmission channels, and transmission effects, of global financial regulatory and domestic macro-economic uncertainties on subdued and volatile economic recovery, financial channels, lending rate margins, and credit growth. The book concludes by extending its focus to the role of US monetary policy, capital flows and rand/US dollar volatility on the South African economy.
The book provides a comprehensive overview of the latest econometric methods for studying the dynamics of macroeconomic and financial time series. It examines alternative methodological approaches and concepts, including quantile spectra and co-spectra, and explores topics such as non-linear and non-stationary behavior, stochastic volatility models, and the econometrics of commodity markets and globalization. Furthermore, it demonstrates the application of recent techniques in various fields: in the frequency domain, in the analysis of persistent dynamics, in the estimation of state space models and new classes of volatility models. The book is divided into two parts: The first part applies econometrics to the field of macroeconomics, discussing trend/cycle decomposition, growth analysis, monetary policy and international trade. The second part applies econometrics to a wide range of topics in financial economics, including price dynamics in equity, commodity and foreign exchange markets and portfolio analysis. The book is essential reading for scholars, students, and practitioners in government and financial institutions interested in applying recent econometric time series methods to financial and economic data.
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.
The volume highlights state-of-the-art knowledge (including data analysis) in productivity, inequality and efficiency analysis. It showcases a selection of the best papers from the 9th North American Productivity Workshop. These papers are relevant to academia, but also to the public and private sectors, in terms of the challenges that firms, financial institutions, governments, and individuals may face when dealing with economic and education-related activities that increase or decrease productivity. The volume also aims to bring together ideas from different parts of the world about the challenges that local economies and institutions may face when changes in productivity are observed. These contributions focus on theoretical and empirical research in areas including productivity, production theory and efficiency measurement in economics, management science, operations research, public administration, and education. The North American Productivity Workshop (NAPW) brings together academic scholars and practitioners in the field of productivity and efficiency analysis from all over the world, and this proceedings volume is a reflection of that mission. The papers in this volume also address general topics such as education, health, energy, finance, agriculture, transport, utilities, and economic development, among others. The editorial team comprises the 2016 local organizers, program committee members, and celebrated guest conference speakers.
This book presents a unique collection of contributions on modern topics in statistics and econometrics, written by leading experts in the respective disciplines and their intersections. It addresses nonparametric statistics and econometrics, quantiles and expectiles, and advanced methods for complex data, including spatial and compositional data, as well as tools for empirical studies in economics and the social sciences. The book was written in honor of Christine Thomas-Agnan on the occasion of her 65th birthday. Given its scope, it will appeal to researchers and PhD students in statistics and econometrics alike who are interested in the latest developments in their field.
Today, information is vital for businesses: those that use it correctly succeed, while those that do not fall behind. Social media is an important source of such data, and that data brings us to social media analytics. Surveys are no longer the only way to hear the voice of consumers. With the data obtained from social media platforms, businesses can devise marketing strategies and gain a better understanding of consumer behavior. As consumers are at the center of all business activities, it is unrealistic to succeed without understanding consumption patterns. Social media analytics is useful especially for marketers, who can evaluate the data to make strategic marketing plans. Social media analytics and consumer behavior are two important issues that need to be addressed together, and this book differs in that it examines social media analytics in detail from the perspective of consumer behavior. The book will be useful to students, businesses, and marketers in many respects.