Books > Business & Economics > Economics > Econometrics

Multivariate Normal Distribution, The: Theory And Applications (Hardcover)
Thu Pham-Gia
R2,907 Discovery Miles 29 070 Ships in 18 - 22 working days

This book provides the reader with user-friendly applications of the normal distribution. In several variables it is called the multinormal distribution, which is often handled using matrices for convenience. To make the arguments less abstract, the author proceeds gradually, starting with the univariate case and moving progressively to the vector and then the matrix case. The author presents the unified aspect of the normal distribution and also addresses several other topics, including random matrix theory in physics. Other well-known applications are discussed as well, such as Herrnstein and Murray's argument that human intelligence is substantially influenced by both inherited and environmental factors, and their claim that intelligence is a better predictor of many personal outcomes - including financial income, job performance, birth out of wedlock, and involvement in crime - than an individual's parental socioeconomic status or education level.

Microeconometrics (Hardcover)
Steven Durlauf, L. Blume
R2,693 Discovery Miles 26 930 Ships in 18 - 22 working days

Following the recent publication of the award-winning and much-acclaimed "The New Palgrave Dictionary of Economics," second edition, which brings together Nobel Prize winners and the brightest young scholars to survey the discipline, we are pleased to announce "The New Palgrave Economics Collection." In response to demand from the economics community, these books address key subject areas within the field. Each title comprises specially selected articles from the Dictionary and covers a fundamental theme within the discipline. All of the articles have been specifically chosen by the editors of the Dictionary, Steven N. Durlauf and Lawrence E. Blume, and are written by leading practitioners in the field. The Collections provide the reader with easy-to-access information on complex and important subject areas, and allow individual scholars and students to have their own personal reference copy.

Interest Rates, Exchange Rates and World Monetary Policy (Hardcover, 2010 ed.)
John E. Floyd
R4,233 Discovery Miles 42 330 Ships in 18 - 22 working days

This book conducts a careful theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France, and Germany with respect to the United States. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms-of-trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended to automatically follow the same monetary policy as the United States. The history of world monetary policy is reviewed, along with the determination of real exchange rates within the Euro Area.

False Feedback in Economics - The Case for Replication (Hardcover)
Andrin Spescha
R4,488 Discovery Miles 44 880 Ships in 10 - 15 working days

This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as a false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulty knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong directions. The book searches for the reasons behind the emergence of false feedback. It thereby contributes to a wider discussion in the field of metascience about the practices of researchers when pursuing their daily business, offering a metascience case study of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates with many applied examples and a wide array of references, especially to the philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research, as well as the possible solutions. The main audience of the book is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.

Risk Measures and Insurance Solvency Benchmarks - Fixed-Probability Levels in Renewal Risk Models (Hardcover)
Vsevolod K. Malinovskii
R4,080 Discovery Miles 40 800 Ships in 10 - 15 working days

Risk Measures and Insurance Solvency Benchmarks: Fixed-Probability Levels in Renewal Risk Models is written for academics and practitioners who are concerned about potential weaknesses of the Solvency II regulatory system. It is also intended for readers who are interested in pure and applied probability, have a taste for classical and asymptotic analysis, and are motivated to delve into rather intensive calculations. The formal prerequisite for this book is a good background in analysis. The desired prerequisite is some degree of probability training, but someone with knowledge of classical real-variable theory, including asymptotic methods, will also find this book interesting. For those who find the proofs too complicated, it may be reassuring that most results in this book are formulated in rather elementary terms. This book can also be used as reading material for basic courses in risk measures, insurance mathematics, and applied probability. The material of this book was partly used by the author for his courses at several universities in Moscow, at Copenhagen University, and at the University of Montreal. Features Requires only minimal mathematical prerequisites in analysis and probability Suitable for researchers and postgraduate students in related fields Could be used as a supplement to courses in risk measures, insurance mathematics and applied probability.

Messy Data - Missing Observations, Outliers, and Mixed-Frequency Data (Hardcover)
R. Carter Hill, Thomas B. Fomby
R3,464 Discovery Miles 34 640 Ships in 10 - 15 working days

Applied econometricians are often faced with working with data that are less than ideal. The data may be observed with gaps, a model may call for variables that are observed at different frequencies, and sometimes econometric results are very fragile to the inclusion or omission of just a few observations in the sample. The papers in this volume discuss new econometric techniques for addressing these problems.

Understanding Chinese GDP (Hardcover, 1st ed. 2019)
Xuguang Song
R2,683 Discovery Miles 26 830 Ships in 18 - 22 working days

This book provides in-depth analyses of the accounting methods of GDP, statistical calibers, and comparative perspectives on Chinese GDP. Beginning with an exploration of international comparisons of GDP, the book introduces the theoretical backgrounds, data sources, and algorithms of the exchange rate method and the purchasing power parity method, and discusses the advantages, disadvantages, and latest developments of the two methods. The book further elaborates on the reasons for the imperfections of the Chinese GDP data, including limitations of current statistical techniques and the accounting system, as well as the relatively confusing statistics for the service industry. The authors then make suggestions for improvement. Finally, the authors emphasize that evaluation of a country's economy and social development should not be limited to GDP alone, but should focus more on indicators of comprehensive national power, national welfare, and the people's livelihood. This book will be of interest to economists, China-watchers, and scholars of geopolitics.

Modern Econometric Analysis - Surveys on Recent Developments (Hardcover, 2006 ed.)
Olaf Hubler, Joachim Frohn
R2,779 Discovery Miles 27 790 Ships in 18 - 22 working days

In this book, leading German econometricians in different fields present survey articles on the most important new methods in econometrics. The book gives an overview of the field, showing the progress made in recent years as well as the remaining problems.

Modelling Non-Stationary Economic Time Series - A Multivariate Approach (Hardcover, 2005 ed.)
S. Burke, J. Hunter
R2,895 Discovery Miles 28 950 Ships in 10 - 15 working days

Co-integration, equilibrium and equilibrium correction are key concepts in modern applications of econometrics to real-world problems. This book provides direction and guidance through the now vast literature facing students and graduate economists. Econometric theory is linked to practical issues such as how to identify equilibrium relationships, how to deal with structural breaks associated with regime changes, and what to do when variables are of different orders of integration.

Data Stewardship for Open Science - Implementing FAIR Principles (Paperback)
Barend Mons
R1,468 Discovery Miles 14 680 Ships in 10 - 15 working days

Data Stewardship for Open Science: Implementing FAIR Principles has been written with the intention of making scientists, funders, and innovators in all disciplines and stages of their professional activities broadly aware of the need, complexity, and challenges associated with open science, modern science communication, and data stewardship. The FAIR principles are used as a guide throughout the text, and this book should leave experimentalists consciously incompetent about data stewardship and motivated to respect data stewards as representatives of a new profession, while possibly motivating others to consider a career in the field. The ebook, available at no additional cost when you buy the paperback, will be updated every 6 months on average (provided that significant updates are needed or available). Readers will have the opportunity to contribute material towards these updates, and to develop their own data management plans, via the free Data Stewardship Wizard.

Time Series - A First Course with Bootstrap Starter (Paperback)
Tucker S. McElroy, Dimitris N. Politis
R1,439 Discovery Miles 14 390 Ships in 10 - 15 working days

Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises worked in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results. The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in depth, as are frequency domain methods. Entropy and other information-theoretic notions are introduced, with applications to time series modeling. The second half of the book focuses on statistical inference, the fitting of time series models, and computational facets of forecasting. Many time series of interest are nonlinear, in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as the solutions to exercises.

Introduction to Statistical Decision Theory - Utility Theory and Causal Analysis (Paperback)
Silvia Bacci, Bruno Chiandotto
R1,589 Discovery Miles 15 890 Ships in 10 - 15 working days

Introduction to Statistical Decision Theory: Utility Theory and Causal Analysis provides the theoretical background to approach decision theory from a statistical perspective. It covers both traditional approaches, in terms of value theory and expected utility theory, and recent developments, in terms of causal inference. The book is specifically designed to appeal to students and researchers who intend to acquire a knowledge of statistical science based on decision theory. Features Covers approaches for making decisions under certainty, risk, and uncertainty Illustrates expected utility theory and its extensions Describes approaches to elicit the utility function Reviews classical and Bayesian approaches to statistical inference based on decision theory Discusses the role of causal analysis in statistical decision theory

Analysis of Integrated Data (Paperback)
Li-Chun Zhang, Raymond L. Chambers
R1,662 Discovery Miles 16 620 Ships in 10 - 15 working days

The advent of "Big Data" has brought with it a rapid diversification of data sources, requiring analysis that accounts for the fact that these data have often been generated and recorded for different reasons. Data integration involves combining data residing in different sources to enable statistical inference, or to generate new statistical data for purposes that cannot be served by each source on its own. This can yield significant gains for scientific as well as commercial investigations. However, valid analysis of such data should allow for the additional uncertainty due to entity ambiguity, whenever it is not possible to state with certainty that the integrated source is the target population of interest. Analysis of Integrated Data aims to provide a solid theoretical basis for this statistical analysis in three generic settings of entity ambiguity: statistical analysis of linked datasets that may contain linkage errors; datasets created by a data fusion process, where joint statistical information is simulated using the information in marginal data from non-overlapping sources; and estimation of target population size when target units are either partially or erroneously covered in each source. Covers a range of topics under an overarching perspective of data integration. Focuses on statistical uncertainty and inference issues arising from entity ambiguity. Features state-of-the-art methods for analysis of integrated data. Identifies the important themes that will define future research and teaching in the statistical analysis of integrated data. Analysis of Integrated Data is aimed primarily at researchers and methodologists interested in statistical methods for data from multiple sources, with a focus on data analysts in the social sciences, and in the public and private sectors.

Time Series Clustering and Classification (Paperback)
Elizabeth Ann Maharaj, Pierpaolo D'Urso, Jorge Caiado
R1,553 Discovery Miles 15 530 Ships in 10 - 15 working days

The beginning of the age of artificial intelligence and machine learning has created new challenges and opportunities for data analysts, statisticians, mathematicians, econometricians, computer scientists and many others. At the root of these techniques are algorithms and methods for clustering and classifying different types of large datasets, including time series data. Time Series Clustering and Classification includes relevant developments on observation-based, feature-based and model-based traditional and fuzzy clustering methods, feature-based and model-based classification methods, and machine learning methods. It presents a broad and self-contained overview of techniques for both researchers and students. Features Provides an overview of the methods and applications of pattern recognition of time series Covers a wide range of techniques, including unsupervised and supervised approaches Includes a range of real examples from medicine, finance, environmental science, and more R and MATLAB code, and relevant data sets are available on a supplementary website

Statistical Portfolio Estimation (Paperback)
Masanobu Taniguchi, Hiroshi Shiraishi, Junichi Hirukawa, Hiroko Kato Solvang, Takashi Yamashita
R1,996 Discovery Miles 19 960 Ships in 10 - 15 working days

The composition of portfolios is one of the most fundamental and important methods in financial engineering, used to control the risk of investments. This book provides a comprehensive overview of statistical inference for portfolios and their various applications. A variety of asset processes are introduced, including non-Gaussian stationary processes, nonlinear processes, and non-stationary processes, and the book provides a framework for statistical inference using local asymptotic normality (LAN). The approach is generalized for portfolio estimation, so that many important problems can be covered. This book can primarily be used as a reference by researchers from statistics, mathematics, finance, econometrics, and genomics. It can also be used as a textbook by senior undergraduate and graduate students in these fields.

Adversarial Risk Analysis (Paperback)
David L. Banks, Jesus M. Rios Aliaga, David Rios Insua
R1,544 Discovery Miles 15 440 Ships in 10 - 15 working days

Winner of the 2017 De Groot Prize awarded by the International Society for Bayesian Analysis (ISBA). A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. Focuses on ARA, a recent subfield of decision analysis Compares ideas from decision theory and game theory Uses multi-agent influence diagrams (MAIDs) throughout to help readers visualize complex information structures Applies the ARA approach to simultaneous games, auctions, sequential games, and defend-attack games Contains an extended case study based on a real application in railway security, which provides a blueprint for how to perform ARA in similar security situations Includes exercises at the end of most chapters, with selected solutions at the back of the book The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on all unknown quantities. This produces a distribution over the actions of the opponent and enables analysts to maximize their expected utilities.

Financial Mathematics - A Comprehensive Treatment in Discrete Time (Hardcover, 2nd edition)
Giuseppe Campolieti, Roman N. Makarov
R3,135 Discovery Miles 31 350 Ships in 10 - 15 working days

The book has been tested and refined through years of classroom teaching experience. With an abundance of examples, problems, and fully worked out solutions, the text introduces the financial theory and relevant mathematical methods in a mathematically rigorous yet engaging way. This textbook provides complete coverage of discrete-time financial models that form the cornerstones of financial derivative pricing theory. Unlike similar texts in the field, this one presents multiple problem-solving approaches, linking related comprehensive techniques for pricing different types of financial derivatives. Key features: In-depth coverage of discrete-time theory and methodology. Numerous, fully worked out examples and exercises in every chapter. Mathematically rigorous and consistent yet bridging various basic and more advanced concepts. Judicious balance of financial theory, mathematical, and computational methods. Guide to Material. This revision contains: Almost 200 pages of new material in all chapters. A new chapter on elementary probability theory. An expanded set of solved problems and additional exercises. Answers to all exercises. This book is a comprehensive, self-contained, and unified treatment of the main theory and application of mathematical methods behind modern-day financial mathematics.
Table of Contents: List of Figures and Tables; Preface; I Introduction to Pricing and Management of Financial Securities: 1 Mathematics of Compounding; 2 Primer on Pricing Risky Securities; 3 Portfolio Management; 4 Primer on Derivative Securities; II Discrete-Time Modelling: 5 Single-Period Arrow-Debreu Models; 6 Introduction to Discrete-Time Stochastic Calculus; 7 Replication and Pricing in the Binomial Tree Model; 8 General Multi-Asset Multi-Period Model; Appendices: A Elementary Probability Theory; B Glossary of Symbols and Abbreviations; C Answers and Hints to Exercises; References; Index.

Biographies: Giuseppe Campolieti is Professor of Mathematics at Wilfrid Laurier University in Waterloo, Canada. He has been a Natural Sciences and Engineering Research Council postdoctoral research fellow and university research fellow at the University of Toronto. In 1998, he joined the Masters in Mathematical Finance program as an instructor, serving later as an adjunct professor in financial mathematics until 2002. Dr. Campolieti also founded a financial software and consulting company in 1998. He joined Laurier in 2002 as Associate Professor of Mathematics and as SHARCNET Chair in Financial Mathematics. Roman N. Makarov is Associate Professor and Chair of Mathematics at Wilfrid Laurier University. Prior to joining Laurier in 2003, he was an Assistant Professor of Mathematics at Siberian State University of Telecommunications and Informatics and a senior research fellow at the Laboratory of Monte Carlo Methods at the Institute of Computational Mathematics and Mathematical Geophysics in Novosibirsk, Russia.

The Essentials of Machine Learning in Finance and Accounting (Hardcover)
Mohammad Zoynul Abedin, M. Kabir Hassan, Petr Hajek, Mohammed Mohi Uddin
R4,509 Discovery Miles 45 090 Ships in 10 - 15 working days

* A useful guide to financial product modeling and to minimizing business risk and uncertainty * Looks at a wide range of financial assets and markets and correlates them with enterprises' profitability * Introduces advanced and novel machine learning techniques in finance, such as Support Vector Machines, Neural Networks, Random Forests, K-Nearest Neighbors, Extreme Learning Machines, and Deep Learning approaches, and applies them to analyze finance data sets * Real-world examples to further understanding

Time Series Modelling with Unobserved Components (Paperback)
Matteo M. Pelagatti
R1,557 Discovery Miles 15 570 Ships in 10 - 15 working days

Despite the unobserved components model (UCM) having many advantages over more popular forecasting techniques based on regression analysis, exponential smoothing, and ARIMA, the UCM is not well known among practitioners outside the academic community. Time Series Modelling with Unobserved Components rectifies this deficiency by giving a practical overview of the UCM approach, covering some theoretical details, several applications, and the software for implementing UCMs. The book's first part discusses introductory time series and prediction theory. Unlike most other books on time series, this text includes a chapter on prediction at the beginning because the problem of predicting is not limited to the field of time series analysis. The second part introduces the UCM, the state space form, and related algorithms. It also provides practical modeling strategies to build and select the UCM that best fits the needs of time series analysts. The third part presents real-world applications, with a chapter focusing on business cycle analysis and the construction of band-pass filters using UCMs. The book also reviews software packages that offer ready-to-use procedures for UCMs as well as systems popular among statisticians and econometricians that allow general estimation of models in state space form. This book demonstrates the numerous benefits of using UCMs to model time series data. UCMs are simple to specify, their results are easy to visualize and communicate to non-specialists, and their forecasting performance is competitive. Moreover, various types of outliers can easily be identified, missing values are effortlessly managed, and working contemporaneously with time series observed at different frequencies poses no problem.

The Law and Economics of Patent Damages, Antitrust, and Legal Process (Hardcover)
James Langenfeld, Frank Fagan, Samuel Clark
R2,791 Discovery Miles 27 910 Ships in 10 - 15 working days

Law and economics research has had an enormous impact on the laws of contracts, torts, property, crimes, corporations, and antitrust, as well as public regulation and fundamental rights. The Law and Economics of Patent Damages, Antitrust, and Legal Process examines several areas of important research by a variety of international scholars. It contains technical papers on the appropriate way to estimate damages in patent disputes, as well as methods for evaluating relevant markets and vertically integrated firms when determining the competitive effects of mergers and other actions. There are also papers on the implications of different legal processes, regulations, and liability rules for consumer welfare, which range from the impact of delays in legal decisions in labour cases in France to issues of criminal liability related to the use of artificial intelligence. This volume of Research in Law and Economics is a must-read for researchers and professionals working on patent damages, antitrust, labour, and legal process.

Practical Guide to Using Econometrics, A, Global Edition (Paperback, 7th edition)
A. Studenmund
R2,495 Discovery Miles 24 950 Ships in 9 - 17 working days

For courses in Econometrics. A Clear, Practical Introduction to Econometrics Using Econometrics: A Practical Guide offers students an innovative introduction to elementary econometrics. Through real-world examples and exercises, the book covers the topic of single-equation linear regression analysis in an easily understandable format. The Seventh Edition is appropriate for all levels: beginner econometric students, regression users seeking a refresher, and experienced practitioners who want a convenient reference. Praised as one of the most important texts of the last 30 years, the book retains the clarity and practicality of previous editions, with a number of substantial improvements throughout.

Principles of Copula Theory (Paperback)
Fabrizio Durante, Carlo Sempi
R1,988 Discovery Miles 19 880 Ships in 10 - 15 working days

Principles of Copula Theory explores the state of the art on copulas and provides you with the foundation to use copulas in a variety of applications. Throughout the book, historical remarks and further readings highlight active research in the field, including new results, streamlined presentations, and new proofs of old results. After covering the essentials of copula theory, the book addresses the issue of modeling dependence among components of a random vector using copulas. It then presents copulas from the point of view of measure theory, compares methods for the approximation of copulas, and discusses the Markov product for 2-copulas. The authors also examine selected families of copulas that possess appealing features from both theoretical and applied viewpoints. The book concludes with in-depth discussions on two generalizations of copulas: quasi- and semi-copulas. Although copulas are not the solution to all stochastic problems, they are an indispensable tool for understanding several problems about stochastic dependence. This book gives you the solid and formal mathematical background to apply copulas to a range of mathematical areas, such as probability, real analysis, measure theory, and algebraic structures.

Why Fiscal Stimulus Programs Fail, Volume 2 - Statistical Tests Comparing Monetary Policy to Growth Effects (Hardcover, 1st ed. 2021)
John J. Heim
R3,198 Discovery Miles 31 980 Ships in 18 - 22 working days

This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well as, or better than, accommodative policy. The book is intended to be the largest scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy, covering different parts or all of the period 1960 - 2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem as well as natural growth in the economy can. The book, employing the best scientific methods available to economists for this type of problem, concludes that accommodative monetary policy could have done so, but that until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the science part of the answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume, Why Fiscal Stimulus Programs Fail, explores the policy implications of these results.

Introductory Econometrics (Hardcover, 1978 ed.): P.J. Dhrymes Introductory Econometrics (Hardcover, 1978 ed.)
P.J. Dhrymes
R1,659 Discovery Miles 16 590 Ships in 10 - 15 working days

This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, have determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.

Big Data Analytics in Cybersecurity (Paperback): Onur Savas, Julia Deng Big Data Analytics in Cybersecurity (Paperback)
Onur Savas, Julia Deng
R1,539 Discovery Miles 15 390 Ships in 10 - 15 working days

Big data is presenting challenges to cybersecurity. For example, the Internet of Things (IoT) will reportedly soon generate a staggering 400 zettabytes (ZB) of data a year. Self-driving cars are predicted to churn out 4000 GB of data per hour of driving. Big data analytics, as an emerging analytical technology, offers the capability to collect, store, process, and visualize these vast amounts of data. Big Data Analytics in Cybersecurity examines security challenges surrounding big data and provides actionable insights that can be used to improve the current practices of network operators and administrators. Applying big data analytics in cybersecurity is critical. By exploiting data from networks and computers, analysts can discover useful network information from data. Decision makers can make more informed decisions using this analysis, including what actions need to be performed and what improvements should be recommended to policies, guidelines, procedures, tools, and other aspects of the network processes. Bringing together experts from academia, government laboratories, and industry, the book provides insight to both new and more experienced security professionals, as well as data analytics professionals who have varying levels of cybersecurity expertise. It covers a wide range of topics in cybersecurity, including: network forensics, threat analysis, vulnerability assessment, visualization, and cyber training. In addition, emerging security domains such as the IoT, cloud computing, fog computing, mobile computing, and cyber-social networks are examined. The book first focuses on how big data analytics can be used in different aspects of cybersecurity, including network forensics, root-cause analysis, and security training. Next it discusses big data challenges and solutions in such emerging cybersecurity domains as fog computing, IoT, and mobile app security. The book concludes by presenting the tools and datasets for future cybersecurity research.

You may like...
Matching, Regression Discontinuity…
Myoung-Jae Lee Hardcover R3,748 Discovery Miles 37 480
Agent-Based Modeling and Network…
Akira Namatame, Shu-Heng Chen Hardcover R2,970 Discovery Miles 29 700
Operations And Supply Chain Management
David Collier, James Evans Hardcover R1,391 R1,295 Discovery Miles 12 950
Transportation Economics - Theory and…
P. McCarthy Hardcover R1,731 Discovery Miles 17 310
Valuing and Investing in Equities…
Francesco Curto Paperback R1,081 Discovery Miles 10 810
Real-Estate Derivatives - From…
Radu S. Tunaru Hardcover R2,627 Discovery Miles 26 270
Generalized Method of Moments
Alastair R. Hall Hardcover R4,489 Discovery Miles 44 890
Operations and Supply Chain Management
James Evans, David Collier Hardcover R1,369 R1,276 Discovery Miles 12 760
Essential Mathematics for Economics and…
T. Bradley Paperback R1,559 Discovery Miles 15 590
The Leading Indicators - A Short History…
Zachary Karabell Paperback R456 R426 Discovery Miles 4 260
