Analytics is one of a number of terms used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, and identify new opportunities, and it often adds many millions of dollars to the organization's bottom line. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic review of the field, this text emphasizes the strategic role that analytics is playing in globally competitive corporations today. The chapters of this book are organized in two main parts. The first part introduces a problem area and presents some basic analytical concepts that have been successfully used to address it. The objective of this material is to provide the student, the manager of the future, with a general understanding of the tools and techniques used by the analyst.
This book analyzes the institutional underpinnings of East Asia's dynamic growth by exploring the interplay between governance and flexibility. As the challenges of promoting and sustaining economic growth become ever more complex, firms in both advanced and industrializing countries face constant pressures for change from markets and technology. Globalization, heightened competition, and shorter product cycles mean that markets are increasingly volatile and fragmented. To contend with demands for higher quality, quicker delivery, and cost efficiencies, firms must enhance their capability to innovate and diversify. Achieving this flexibility, in turn, often requires new forms of governance arrangements that facilitate the exchange of resources among diverse yet interdependent economic actors. Moving beyond the literature's emphasis on developed economies, this volume emphasizes the relevance of the links between governance and flexibility for understanding East Asia's explosive economic growth over the past quarter century. In case studies that encompass a variety of key industrial sectors and countries, the contributors emphasize the importance of network patterns of governance for facilitating flexibility in firms throughout the region. Their analyses illuminate both the strengths and limitations of recent growth strategies and offer insights into prospects for continued expansion in the wake of the East Asian economic crisis of the late 1990s. Contributions by: Richard P. Appelbaum, Lu-lin Cheng, Stephen W. K. Chiu, Frederic C. Deyo, Richard F. Doner, Dieter Ernst, Eric Hershberg, Tai Lok Lui, Rajah Rasiah, David A. Smith, and Poh-Kam Wong.
This authoritative collection brings together the most important papers in time series econometrics published since 1990. These articles cover a range of central aspects of the field, concentrating in the main on theoretical and methodological developments. Taken together, they provide an overview of the current status of research in time series econometrics, emphasising those areas that appear to have attracted most recent interest in the profession. Volume I includes sections on unit root and stationarity tests; cointegration; structural breaks; nonlinearity; and long memory. Volume II covers conditional heteroskedasticity; stochastic volatility; unobserved components; trend function analysis; prediction; seasonality; and causality. These volumes will be essential reading for all who have an interest in this rapidly advancing subject.
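To give a flavour of the first topic in Volume I, here is a minimal sketch of an augmented Dickey-Fuller unit-root test. The simulated series and the use of Python's statsmodels are illustrative assumptions, not material from the collection itself.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
eps = rng.standard_normal(500)

random_walk = np.cumsum(eps)   # unit root: test should fail to reject
stationary = np.empty(500)     # AR(1) with |phi| < 1: test should reject
stationary[0] = eps[0]
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + eps[t]

for name, series in [("random walk", random_walk), ("AR(1)", stationary)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"{name}: ADF statistic {stat:.2f}, p-value {pvalue:.3f}")
```

The null hypothesis of the ADF test is that the series contains a unit root, so a small p-value is evidence of stationarity.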
* Includes many mathematical examples and problems that have students work directly with both standard and nonstandard models of behaviour, developing problem-solving and critical-thinking skills that are more valuable to students than memorized content, which is quickly forgotten.
* The applications explored in the text emphasise issues of inequality, social mobility, culture and poverty to demonstrate the impact of behavioral economics in the areas students are most passionate about.
* The text has a standardized structure (6 parts, 3 chapters in each) which provides a clear and consistent roadmap for students taking the course.
* Explores the exciting new topic of econophysics
* Takes a multidisciplinary approach that will be of interest to students and researchers from physics, engineering, mathematics, statistics, and other physical sciences
* Useful to both students and researchers
In order to understand and formulate housing policy and programs, it is necessary to have a working knowledge of the internal economic operation of housing from the points of view of both the investor and the owner. James W. Hughes argues that investors' and owners' behavior and activity tend to be governed by market forces and other realities. In that regard, he begins this work by analyzing market rates of return in real estate and housing undertakings, and the variety of analytical techniques which underlie their determination. Methods of Housing Analysis is designed to provide urban planners with an introduction to the basic, quantitative techniques associated with the analysis of housing. A myriad of specific analytical methods has evolved in each of the professions concerned with this subject area. Planners, investors, developers, engineers, appraisers, social scientists, and governmental officials all tend to exhibit unique perspectives when examining housing and have developed their analytical frameworks accordingly. The work is comprised of an extensive discussion by the author, detailed case studies and examples, and a number of essays by leading experts that detail specific analytical procedures and demonstrate their use. The book is divided into four major sections: analysis of the internal operation of housing; basic cost-revenue analysis; expanded cost-revenue/benefit analysis; and government regulation of housing. The thorough nature of Hughes' discussion and of the related readings makes this volume an ideal textbook and reference source.
Developed from the author's course on Monte Carlo simulation at Brown University, Monte Carlo Simulation with Applications to Finance provides a self-contained introduction to Monte Carlo methods in financial engineering. It is suitable for advanced undergraduate and graduate students taking a one-semester course or for practitioners in the financial industry. The author first presents the necessary mathematical tools for simulation, arbitrage-free option pricing, and the basic implementation of Monte Carlo schemes. He then describes variance reduction techniques, including control variates, stratification, conditioning, importance sampling, and cross-entropy. The text concludes with stochastic calculus and the simulation of diffusion processes. Only requiring some familiarity with probability and statistics, the book keeps much of the mathematics at an informal level and avoids technical measure-theoretic jargon to provide a practical understanding of the basics. It includes a large number of examples as well as MATLAB® coding exercises that are designed in a progressive manner so that no prior experience with MATLAB is needed.
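As a hedged illustration of the techniques the book covers, the sketch below prices a European call by plain Monte Carlo under Black-Scholes dynamics and then applies a control variate (the discounted terminal stock price, whose expectation equals the spot price). The parameter values are invented, and Python stands in for the book's MATLAB exercises.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0   # illustrative parameters
n = 100_000

# Simulate terminal stock prices under risk-neutral GBM dynamics
Z = rng.standard_normal(n)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)

# Plain Monte Carlo estimate
plain = payoff.mean()

# Control variate: discounted terminal stock price has known mean S0
cv = np.exp(-r * T) * ST
beta = np.cov(payoff, cv)[0, 1] / cv.var()
adjusted = payoff - beta * (cv - S0)

print(f"plain MC:       {plain:.4f} (se {payoff.std(ddof=1) / np.sqrt(n):.4f})")
print(f"control variate: {adjusted.mean():.4f} (se {adjusted.std(ddof=1) / np.sqrt(n):.4f})")
```

Because the payoff is highly correlated with the terminal stock price, the control-variate estimator typically cuts the standard error substantially at no extra simulation cost.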
Change of Time and Change of Measure provides a comprehensive account of two topics that are of particular significance in both theoretical and applied stochastics: random change of time and change of probability law. Random change of time is key to understanding the nature of various stochastic processes, and gives rise to interesting mathematical results and insights of importance for the modeling and interpretation of empirically observed dynamic processes. Change of probability law is a technique for solving central questions in mathematical finance, and also has a considerable role in insurance mathematics, large deviation theory, and other fields. The book comprehensively collects and integrates results from a number of scattered sources in the literature and discusses their importance relative to the existing literature, particularly with regard to mathematical finance. It is invaluable as a textbook for graduate-level courses and as a handy reference for researchers and practitioners in financial mathematics and econometrics.
This fifth edition of a classic text is appropriate for a one-semester general course in Applied Statistics or as a reference book for practicing researchers in a wide variety of disciplines, including medicine, health and human services, natural and social sciences, law, and engineering. This practical book describes the Bayesian principles necessary for applied clinical research and strategic interaction, which are frequently omitted in other texts. After a comprehensive treatment of probability theory concepts, theorems, and some basic proofs, this concisely written text illustrates sampling distributions and their importance in estimation for the purpose of statistical inference. The book then shifts its focus to the essentials associated with confidence intervals and hypothesis testing for major population parameters; namely, the population mean, population variance, and population proportion. In addition, it thoroughly describes the properties of expectations and variance, the basics of correlation and simple linear regression, as well as non-parametric statistics.
The most authoritative and up-to-date core econometrics textbook available.
Econometrics is the quantitative language of economic theory, analysis, and empirical work, and it has become a cornerstone of graduate economics programs. Econometrics provides graduate and PhD students with an essential introduction to this foundational subject in economics and serves as an invaluable reference for researchers and practitioners. This comprehensive textbook teaches fundamental concepts, emphasizes modern, real-world applications, and gives students an intuitive understanding of econometrics.
* Covers the full breadth of econometric theory and methods with mathematical rigor while emphasizing intuitive explanations that are accessible to students of all backgrounds
* Draws on integrated, research-level datasets, provided on an accompanying website
* Discusses linear econometrics, time series, panel data, nonparametric methods, nonlinear econometric models, and modern machine learning
* Features hundreds of exercises that enable students to learn by doing
* Includes in-depth appendices on matrix algebra and useful inequalities and a wealth of real-world examples
* Can serve as a core textbook for a first-year PhD course in econometrics and as a follow-up to Bruce E. Hansen's Probability and Statistics for Economists
Originally published in 1981, this book considers one particular area of econometrics, the linear model, where significant recent advances have been made. It considers both single and multiequation models with varying coefficients, explains the various theories and techniques connected with these, and goes on to describe the various applications of the models. Whilst the detailed explanation of the models will interest primarily econometrics specialists, the implications of the advances outlined and the applications of the models will interest a wide range of economists.
A fascinating and comprehensive history, this book explores the most important transformation in twentieth-century economics: the creation of econometrics. Containing fresh archival material that has not been published before, and taking Ragnar Frisch as the narrator, Francisco Louca discusses both the key events (the establishment of the Econometric Society, the Cowles Commission and the journal Econometrica) and the major players (economists like Wesley Mitchell, mathematicians like John von Neumann and statisticians like Karl Pearson) that shaped the development of econometrics. He discusses the evolution of their thought, detailing the debates, the quarrels and the interrogations that crystallized their work, and even offers a conclusion of sorts, suggesting that some of the more influential thinkers abandoned econometrics or became critical of its development. International in scope and appeal, The Years of High Econometrics is an excellent accompaniment for students taking courses on probability, econometric methods and the history of economic thought.
Tourism demand is the foundation on which all tourism-related business decisions ultimately rest. Governments and companies such as airlines, tour operators, hotels, cruise ship lines, and recreation facility providers are interested in the demand for their products by tourists. The success of many businesses depends largely or totally on the state of tourism demand, and ultimate management failure is quite often due to the failure to meet market demand. This book introduces students, researchers and practitioners to the modern developments in advanced econometric methodology within the context of tourism demand analysis, and illustrates these developments with actual tourism applications. The concepts and computations of modern advanced econometric modelling methodologies are introduced at a level that is accessible to specialists and non-specialists alike. The methodologies introduced include general-to-specific modelling, cointegration, vector autoregression, time-varying parameter modelling, panel data analysis and the almost ideal demand system (AIDS). In order to help the reader understand the various methodologies, extensive tourism demand examples are provided throughout the volume.
This rigorous textbook introduces graduate students to the principles of econometrics and statistics with a focus on methods and applications in financial research. Financial Econometrics, Mathematics, and Statistics introduces tools and methods important for both finance and accounting that assist with asset pricing, corporate finance, options and futures, and conducting financial accounting research. Divided into four parts, the text begins with topics related to regression and financial econometrics. Subsequent sections describe time-series analyses; the role of binomial, multinomial, and log-normal distributions in option pricing models; and the application of statistical analyses to risk management. The real-world applications and problems offer students a unique insight into such topics as heteroskedasticity, regression, simultaneous equation models, panel data analysis, time series analysis, and generalized method of moments. Written by leading academics in the quantitative finance field, the book allows readers to implement the principles behind financial econometrics and statistics through real-world applications and problem sets. This textbook will appeal to a less-served market of upper-undergraduate and graduate students in finance, economics, and statistics.
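One of the topics listed above, heteroskedasticity, lends itself to a short illustration. The sketch below computes White's heteroskedasticity-robust standard errors for an OLS regression; the data-generating process and all numbers are invented for illustration and are not drawn from the book's problem sets.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])
# Error variance grows with x: classic heteroskedasticity
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * x)

beta = np.linalg.solve(X.T @ X, X.T @ y)
u = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# White sandwich estimator: (X'X)^-1 X' diag(u^2) X (X'X)^-1
meat = X.T @ (X * (u**2)[:, None])
V_robust = XtX_inv @ meat @ XtX_inv
V_classic = (u @ u / (n - 2)) * XtX_inv

print("beta:       ", beta)
print("classic SEs:", np.sqrt(np.diag(V_classic)))
print("robust SEs: ", np.sqrt(np.diag(V_robust)))
```

Under heteroskedasticity the classic and robust standard errors diverge; the robust ones remain valid without assuming constant error variance.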
Introduction to Functional Data Analysis provides a concise textbook introduction to the field. It explains how to analyze functional data, both at exploratory and inferential levels. It also provides a systematic and accessible exposition of the methodology and the required mathematical framework. The book can be used as a textbook for a semester-long course on FDA for advanced undergraduate or MS statistics majors, as well as for MS and PhD students in other disciplines, including applied mathematics, environmental science, public health, medical research, geophysical sciences and economics. It can also be used for self-study and as a reference for researchers in those fields who wish to acquire a solid understanding of FDA methodology and practical guidance for its implementation. Each chapter contains plentiful examples of relevant R code and theoretical and data analytic problems. The material of the book can be roughly divided into four parts of approximately equal length: 1) basic concepts and techniques of FDA, 2) functional regression models, 3) sparse and dependent functional data, and 4) introduction to the Hilbert space framework of FDA. The book assumes advanced undergraduate background in calculus, linear algebra, distributional probability theory, foundations of statistical inference, and some familiarity with R programming. Other required statistics background is provided in scalar settings before the related functional concepts are developed. Most chapters end with references to more advanced research for those who wish to gain a more in-depth understanding of a specific topic.
As Ken Wallis (1993) has pointed out, all macroeconomic forecasters and policy analysts use economic models. That is, they have a way of going from assumptions about macroeconomic policy and the international environment to a prediction of the likely future state of the economy. Some people do this in their heads. Increasingly though, forecasting and policy analysis is based on a formal, explicit model, represented by a set of mathematical equations and solved by computer. This provides a framework for handling, in a consistent and systematic manner, the ever-increasing amounts of relevant information. Macroeconometric modelling, though, is an inexact science. A manageable model must focus only on the major driving forces in a complex economy made up of millions of households and firms. International economic agencies such as the IMF and OECD, and most treasuries and central banks in western countries, use macroeconometric models in their forecasting and policy analysis. Models are also used for teaching and research in universities, as well as for commercial forecasting in the private sector.
How the obsession with quantifying human performance threatens business, medicine, education, government, and the quality of our lives. Today, organizations of all kinds are ruled by the belief that the path to success is quantifying human performance, publicizing the results, and dividing up the rewards based on the numbers. But in our zeal to instill the evaluation process with scientific rigor, we've gone from measuring performance to fixating on measuring itself, and this tyranny of metrics now threatens the quality of our organizations and lives. In this brief, accessible, and powerful book, Jerry Muller uncovers the damage metrics are causing and shows how we can begin to fix the problem. Filled with examples from business, medicine, education, government, and other fields, the book explains why paying for measured performance doesn't work, why surgical scorecards may increase deaths, and much more. But Muller also shows that, when used as a complement to judgment based on personal experience, metrics can be beneficial, and he includes an invaluable checklist of when and how to use them. The result is an essential corrective to a harmful trend that increasingly affects us all.
In recent years there has been a substantial global increase in interest in the study of gambling. To some extent this has mirrored seismic changes in the way that betting and gaming markets worldwide are taxed and regulated. This has heightened interest in a wide range of issues related to this sector including its regulation, public policy and commercial strategy as well as the ideal structure of gambling taxes and devising optimal responses to environmental changes, such as the growth of online gambling. This volume, by bringing together the work of leading scholars, will cover the spectrum of such perspectives, as well as examining the efficiency of betting markets, to provide an assessment of developments and current understanding in the study of the economics of gambling. This timely collection will be an immensely valuable resource for academics, policy-makers, those commercially involved in the betting and gaming sectors as well as the interested layman.
Big data is presenting challenges to cybersecurity. For example, the Internet of Things (IoT) will reportedly soon generate a staggering 400 zettabytes (ZB) of data a year. Self-driving cars are predicted to churn out 4000 GB of data per hour of driving. Big data analytics, as an emerging analytical technology, offers the capability to collect, store, process, and visualize these vast amounts of data. Big Data Analytics in Cybersecurity examines security challenges surrounding big data and provides actionable insights that can be used to improve the current practices of network operators and administrators. Applying big data analytics in cybersecurity is critical. By exploiting data from the networks and computers, analysts can discover useful network information from data. Decision makers can make more informed decisions by using this analysis, including what actions need to be performed and what improvements to recommend to policies, guidelines, procedures, tools, and other aspects of the network processes. Bringing together experts from academia, government laboratories, and industry, the book provides insight to both new and more experienced security professionals, as well as data analytics professionals who have varying levels of cybersecurity expertise. It covers a wide range of topics in cybersecurity, including network forensics, threat analysis, vulnerability assessment, visualization, and cyber training. In addition, emerging security domains such as the IoT, cloud computing, fog computing, mobile computing, and cyber-social networks are examined. The book first focuses on how big data analytics can be used in different aspects of cybersecurity, including network forensics, root-cause analysis, and security training. Next it discusses big data challenges and solutions in such emerging cybersecurity domains as fog computing, IoT, and mobile app security. The book concludes by presenting the tools and datasets for future cybersecurity research.
Using the neo-classical theory of production economics as the analytical framework, this book, first published in 2004, provides a unified and easily comprehensible, yet fairly rigorous, exposition of the core literature on data envelopment analysis (DEA) for readers based in different disciplines. The various DEA models are developed as nonparametric alternatives to the econometric models. Apart from the standard fare consisting of the basic input- and output-oriented DEA models formulated by Charnes, Cooper, and Rhodes, and Banker, Charnes, and Cooper, the book covers developments such as the directional distance function, free disposal hull (FDH) analysis, non-radial measures of efficiency, multiplier bounds, mergers and break-up of firms, and measurement of productivity change through the Malmquist total factor productivity index. The chapter on efficiency measurement using market prices provides the critical link between DEA and the neo-classical theory of a competitive firm. The book also covers several forms of stochastic DEA in detail.
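For concreteness, here is a minimal sketch of the basic input-oriented CCR envelopment model mentioned above, solved as a linear program with SciPy. The four decision-making units (DMUs) and their input/output data are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: rows = inputs/outputs, columns = DMUs
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 2.0, 1.5, 2.5]])

m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(j0):
    # Variables: [theta, lam_1, ..., lam_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: X lam - theta * x0 <= 0
    A_in = np.c_[-X[:, j0], X]
    b_in = np.zeros(m)
    # Output constraints: -Y lam <= -y0  (i.e. Y lam >= y0)
    A_out = np.c_[np.zeros(s), -Y]
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for j in range(n):
    print(f"DMU {j}: efficiency {ccr_efficiency(j):.3f}")
```

A DMU with efficiency 1 lies on the nonparametric frontier; a score below 1 is the factor by which it could radially contract its inputs while still producing its observed outputs.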
Showcasing fuzzy set theory, this book highlights the enormous potential of fuzzy logic in helping to analyse the complexity of a wide range of socio-economic patterns and behaviour. The contributions to this volume explore the most up-to-date fuzzy-set methods for the measurement of socio-economic phenomena in a multidimensional and/or dynamic perspective. Thus far, fuzzy-set theory has primarily been utilised in the social sciences in the field of poverty measurement. These chapters examine the latest work in this area, while also exploring further applications including social exclusion, the labour market, educational mismatch, sustainability, quality of life and violence against women. The authors demonstrate that real-world situations are often characterised by imprecision, uncertainty and vagueness, which cannot be properly described by the classical set theory which uses a simple true-false binary logic. By contrast, fuzzy-set theory has been shown to be a powerful tool for describing the multidimensionality and complexity of social phenomena. This book will be of significant interest to economists, statisticians and sociologists utilising quantitative methods to explore socio-economic phenomena.
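To illustrate the contrast the authors draw between binary and fuzzy classification, the sketch below replaces a sharp poverty line with a membership function taking values in [0, 1]. The functional form and income thresholds are invented for illustration, not taken from the book.

```python
import numpy as np

def poverty_membership(income, lower=8000.0, upper=20000.0):
    # Degree of membership in the fuzzy set "poor": 1 below `lower`,
    # 0 above `upper`, declining linearly in between (illustrative thresholds)
    return float(np.clip((upper - income) / (upper - lower), 0.0, 1.0))

for inc in (5000, 12000, 19000, 25000):
    print(f"income {inc}: membership {poverty_membership(inc):.2f}")
```

Where a binary cut-off classifies the 19,000 and 25,000 incomes identically as "non-poor", the fuzzy measure preserves the difference in their degree of deprivation.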
The book's comprehensive coverage of the application of econometric methods to empirical analysis of economic issues is impressive. It uncovers the missing link between textbooks on economic theory and econometrics, and highlights the powerful connection between economic theory and empirical analysis through examples of rigorous experimental design. The use of data sets for estimation derived with the Monte Carlo method helps facilitate the understanding of the role of hypothesis testing applied to economic models. Topics covered in the book are: consumer behavior, producer behavior, market equilibrium, macroeconomic models, qualitative-response models, panel data analysis and time-series analysis. Key econometric models are introduced, specified, estimated and evaluated. The treatment of methods of estimation in econometrics and the discipline of hypothesis testing makes it a must-have for graduate students of economics and econometrics and aids their understanding of how to estimate economic models and evaluate the results in terms of policy implications.
Handbook of Empirical Economics and Finance explores the latest developments in the analysis and modeling of economic and financial data. Well-recognized econometric experts discuss the rapidly growing research in economics and finance and offer insight on the future direction of these fields. Focusing on micro models, the first group of chapters describes the statistical issues involved in the analysis of econometric models with cross-sectional data often arising in microeconomics. The book then illustrates time series models that are extensively used in empirical macroeconomics and finance. The last set of chapters explores the types of panel data and spatial models that are becoming increasingly significant in analyzing complex economic behavior and policy evaluations. This handbook brings together both background material and new methodological and applied results that are extremely important to the current and future frontiers in empirical economics and finance. It emphasizes inferential issues that transpire in the analysis of cross-sectional, time series, and panel data-based empirical models in economics, finance, and related disciplines.
Discover the secrets to applying simple econometric techniques to improve forecasting.
Equipping analysts, practitioners, and graduate students with a statistical framework for making effective decisions based on the application of simple economic and statistical methods, Economic and Business Forecasting offers a comprehensive and practical approach to quantifying and accurately forecasting key variables. Using simple econometric techniques, author John E. Silvia focuses on a select set of major economic and financial variables, revealing how to optimally use statistical software as a template to apply to your own variables of interest.
* Presents the economic and financial variables that offer unique insights into economic performance
* Highlights the econometric techniques that can be used to characterize variables
* Explores the application of SAS software, complete with simple explanations of SAS code and output
* Identifies key econometric issues with practical solutions to those problems
Presenting the "ten commandments" for economic and business forecasting, this book provides you with a practical forecasting framework you can use for important everyday business applications.
Maurice Potron (1872-1942), a French Jesuit mathematician, constructed and analyzed a highly original, but virtually unknown economic model. This book presents translated versions of all his economic writings, preceded by a long introduction which sketches his life and environment based on extensive archival research and family documents. Potron had no education in economics and almost no contact with the economists of his time. His primary source of inspiration was the social doctrine of the Church, which had been updated at the end of the nineteenth century. Faced with the 'economic evils' of his time, he reacted by utilizing his talents as a mathematician and an engineer to invent and formalize a general disaggregated model in which production, employment, prices and wages are the main unknowns. He introduced four basic principles or normative conditions ('sufficient production', the 'right to rest', 'justice in exchange', and the 'right to live') to define satisfactory regimes of production and labour on the one hand, and of prices and wages on the other. He studied the conditions for the existence of these regimes, both on the quantity side and the value side, and he explored the way to implement them. This book makes it clear that Potron was the first author to develop a full input-output model, to use the Perron-Frobenius theorem in economics, to state a duality result, and to formulate the Hawkins-Simon condition. These are all techniques which now belong to the standard toolkit of economists. This book will be of interest to economics postgraduate students and researchers, and will be essential reading for courses dealing with the history of mathematical economics in general, and linear production theory in particular.
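For readers unfamiliar with the techniques credited to Potron, the sketch below checks the Hawkins-Simon condition (all leading principal minors of I - A positive) and the Perron-Frobenius spectral radius for a hypothetical three-sector input-output matrix. The coefficients and the final-demand vector are invented for illustration.

```python
import numpy as np

# Hypothetical 3-sector input-output coefficient matrix: A[i, j] is the
# amount of good i needed to produce one unit of good j
A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.1, 0.4],
              [0.2, 0.2, 0.1]])
L = np.eye(3) - A  # Leontief matrix

minors = [np.linalg.det(L[:k, :k]) for k in range(1, 4)]
print("leading principal minors of I - A:", [round(m, 3) for m in minors])
print("Hawkins-Simon condition holds:", all(m > 0 for m in minors))

# Equivalent viability check via Perron-Frobenius: the dominant
# eigenvalue (spectral radius) of A must lie strictly below 1
print("spectral radius of A:", round(np.max(np.abs(np.linalg.eigvals(A))), 3))

# When the condition holds, (I - A) x = d has a non-negative solution
d = np.array([10.0, 20.0, 15.0])  # illustrative final demand
print("gross outputs:", np.linalg.solve(L, d))
```

Economically, the condition guarantees that the economy can produce a non-negative gross output vector for any non-negative final demand, which is exactly the viability question Potron's 'sufficient production' principle addresses.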