Computational and Mathematical Modeling in the Social Sciences (Paperback, New)
Scott De Marchi
R967 Discovery Miles 9 670 Ships in 10 - 15 working days

Mathematical models in the social sciences have become increasingly sophisticated and widespread in the last decade. This period has also seen many critiques, most lamenting the sacrifices incurred in pursuit of mathematical rigor. If, as critics argue, our ability to understand the world has not improved during the mathematization of the social sciences, we might want to adopt a different paradigm. This book examines the three main fields of mathematical modeling - game theory, statistics, and computational methods - and proposes a new framework for modeling. Unlike previous treatments which view each field separately, the treatment provides a framework that spans and incorporates the different methodological approaches. The goal is to arrive at a new vision of modeling that allows researchers to solve more complex problems in the social sciences. Additionally, a special emphasis is placed upon the role of computational modeling in the social sciences.

Computational and Mathematical Modeling in the Social Sciences (Hardcover)
Scott De Marchi
R1,842 R1,570 Discovery Miles 15 700 Save R272 (15%) Ships in 10 - 15 working days

Introduction to the Mathematical and Statistical Foundations of Econometrics (Hardcover)
Herman J. Bierens
R2,807 R2,439 Discovery Miles 24 390 Save R368 (13%) Ships in 10 - 15 working days

This book is intended for use in a rigorous introductory PhD level course in econometrics, or in a field course in econometric theory. It covers the measure-theoretical foundation of probability theory, the multivariate normal distribution with its application to classical linear regression analysis, various laws of large numbers, central limit theorems and related results for independent random variables as well as for stationary time series, with applications to asymptotic inference of M-estimators, and maximum likelihood theory. Some chapters have their own appendices containing the more advanced topics and/or difficult proofs. Moreover, there are three appendices with material that is supposed to be known. Appendix I contains a comprehensive review of linear algebra, including all the proofs. Appendix II reviews a variety of mathematical topics and concepts that are used throughout the main text, and Appendix III reviews complex analysis. Therefore, this book is uniquely self-contained.

Statistics, Econometrics and Forecasting (Hardcover)
Arnold Zellner
R3,624 R3,053 Discovery Miles 30 530 Save R571 (16%) Ships in 10 - 15 working days

This book is based on two Sir Richard Stone lectures at the Bank of England and the National Institute for Economic and Social Research. Largely non-technical, the first part of the book covers some of the broader issues involved in Stone's and others' work in statistics. It explores the more philosophical issues attached to statistics, econometrics and forecasting and describes the paradigm shift back to the Bayesian approach to scientific inference. The first part concludes with simple examples from the different worlds of educational management and golf clubs. The second, more technical part covers in detail the structural econometric time series analysis (SEMTSA) approach to statistical and econometric modeling.

Practical Spreadsheet Modeling Using @Risk (Hardcover)
Dale Lehman, Huybert Groenendaal
R4,076 Discovery Miles 40 760 Ships in 10 - 15 working days

Practical Spreadsheet Modeling Using @Risk provides a guide to constructing applied decision analysis models in spreadsheets. The focus is on the use of Monte Carlo simulation to provide quantitative assessment of uncertainties and key risk drivers. The book presents numerous examples based on real data and relevant practical decisions in a variety of settings, including health care, transportation, finance, natural resources, technology, manufacturing, retail, and sports and entertainment. All examples involve decision problems where uncertainties make simulation modeling useful to obtain decision insights and explore alternative choices. Good spreadsheet modeling practices are highlighted. The book is suitable for graduate students or advanced undergraduates in business, public policy, health care administration, or any field amenable to simulation modeling of decision problems. The book is also useful for applied practitioners seeking to build or enhance their spreadsheet modeling skills. Features:
* Step-by-step examples of spreadsheet modeling and risk analysis in a variety of fields
* Description of probabilistic methods, their theoretical foundations, and their practical application in a spreadsheet environment
* Extensive example models and exercises based on real data and relevant decision problems
* Comprehensive use of the @Risk software for simulation analysis, including a free one-year educational software license
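The kind of model the blurb describes is easy to sketch outside a spreadsheet as well. Below is a minimal, hypothetical Monte Carlo profit model in plain Python/NumPy rather than @Risk; the distributions, parameter values and variable names are illustrative assumptions, not taken from the book.

```python
# Minimal Monte Carlo sketch of the kind of decision model the book builds in
# @Risk, written in plain NumPy with made-up inputs (a hypothetical product launch).
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000

demand = rng.normal(loc=5_000, scale=800, size=n_trials)              # uncertain units sold
unit_cost = rng.triangular(left=8, mode=10, right=14, size=n_trials)  # uncertain unit cost
price = 18.0
fixed_cost = 25_000.0

profit = demand * (price - unit_cost) - fixed_cost

print(f"Mean profit:         {profit.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(profit, 5):,.0f} to {np.percentile(profit, 95):,.0f}")
print(f"P(loss):             {(profit < 0).mean():.1%}")
```

In @Risk the same idea is expressed by attaching input distributions to spreadsheet cells and designating the profit cell as a simulation output.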

Logit Models from Economics and Other Fields (Hardcover, 2 Rev Ed)
J. S. Cramer
R2,725 R2,301 Discovery Miles 23 010 Save R424 (16%) Ships in 10 - 15 working days

Originating in economics but now used in a variety of disciplines, including medicine, epidemiology and the social sciences, this book provides accessible coverage of the theoretical foundations of the Logit model as well as its applications to concrete problems. It is written not only for economists but for researchers working in disciplines where it is necessary to model qualitative random variables. J.S. Cramer has also provided data sets on which to practice Logit analysis.
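For readers unfamiliar with the model itself: the logit specifies P(y = 1 | x) = 1 / (1 + exp(-(b0 + b1*x))). The following is a minimal sketch on simulated data; the use of statsmodels and all parameter values are assumptions for illustration, not material from the book.

```python
# Minimal logit illustration on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_000
x = rng.normal(size=n)
true_b0, true_b1 = -0.5, 1.2
p = 1.0 / (1.0 + np.exp(-(true_b0 + true_b1 * x)))   # logistic link
y = rng.binomial(1, p)                                # qualitative (0/1) outcome

X = sm.add_constant(x)              # add intercept column
model = sm.Logit(y, X).fit(disp=0)
print(model.params)                 # estimates should be near (-0.5, 1.2)
```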

Simplicity, Inference and Modelling - Keeping it Sophisticatedly Simple (Hardcover)
Arnold Zellner, Hugo A. Keuzenkamp, Michael McAleer
R3,481 R2,936 Discovery Miles 29 360 Save R545 (16%) Ships in 10 - 15 working days

The idea that simplicity matters in science is as old as science itself, with the much cited example of Ockham's Razor. A problem with Ockham's Razor is that nearly everybody seems to accept it, but few are able to define its exact meaning and to make it operational in a non-arbitrary way. Using a multidisciplinary perspective including philosophers, mathematicians, econometricians and economists, this monograph examines simplicity by asking six questions: What is meant by simplicity? How is simplicity measured? Is there an optimum trade-off between simplicity and goodness-of-fit? What is the relation between simplicity and empirical modelling? What is the relation between simplicity and prediction? What is the connection between simplicity and convenience?

The Econometric Analysis of Seasonal Time Series (Paperback)
Eric Ghysels, Denise R. Osborn
R1,257 Discovery Miles 12 570 Ships in 10 - 15 working days

Economic and financial time series feature important seasonal fluctuations. Despite their regular and predictable patterns over the year, month or week, they pose many challenges to economists and econometricians. This book provides a thorough review of the recent developments in the econometric analysis of seasonal time series. It is designed for an audience of specialists in economic time series analysis and advanced graduate students. It is the most comprehensive and balanced treatment of the subject since the mid-1980s.

Analysis of Integrated Data (Hardcover)
Li-Chun Zhang, Raymond L. Chambers
R3,345 Discovery Miles 33 450 Ships in 18 - 22 working days

The advent of "Big Data" has brought with it a rapid diversification of data sources, requiring analysis that accounts for the fact that these data have often been generated and recorded for different reasons. Data integration involves combining data residing in different sources to enable statistical inference, or to generate new statistical data for purposes that cannot be served by each source on its own. This can yield significant gains for scientific as well as commercial investigations. However, valid analysis of such data should allow for the additional uncertainty due to entity ambiguity, whenever it is not possible to state with certainty that the integrated source is the target population of interest. Analysis of Integrated Data aims to provide a solid theoretical basis for this statistical analysis in three generic settings of entity ambiguity: statistical analysis of linked datasets that may contain linkage errors; datasets created by a data fusion process, where joint statistical information is simulated using the information in marginal data from non-overlapping sources; and estimation of target population size when target units are either partially or erroneously covered in each source. Covers a range of topics under an overarching perspective of data integration. Focuses on statistical uncertainty and inference issues arising from entity ambiguity. Features state of the art methods for analysis of integrated data. Identifies the important themes that will define future research and teaching in the statistical analysis of integrated data. Analysis of Integrated Data is aimed primarily at researchers and methodologists interested in statistical methods for data from multiple sources, with a focus on data analysts in the social sciences, and in the public and private sectors.

Simulation-based Inference in Econometrics - Methods and Applications (Hardcover)
Roberto Mariano, Til Schuermann, Melvyn J. Weeks
R4,334 R3,653 Discovery Miles 36 530 Save R681 (16%) Ships in 10 - 15 working days

This substantial volume has two principal objectives. First, it provides an overview of the statistical foundations of simulation-based inference (SBI). This includes the summary and synthesis of the many concepts and results extant in the theoretical literature, the different classes of problems and estimators, the asymptotic properties of these estimators, as well as descriptions of the different simulators in use. Second, the volume provides empirical and operational examples of SBI methods. What is often missing, even in existing applied papers, is a treatment of operational issues: which simulator works best for which problem, and why? This volume explicitly addresses the important numerical and computational issues in SBI which are not covered comprehensively in the existing literature. Examples of such issues are: comparisons with existing tractable methods, the number of replications needed for robust results, choice of instruments, simulation noise and bias, as well as efficiency loss in practice.

An Introduction to Financial Mathematics - Option Valuation (Hardcover, 2nd edition)
Hugo D. Junghenn
R3,644 Discovery Miles 36 440 Ships in 10 - 15 working days

Introduction to Financial Mathematics: Option Valuation, Second Edition is a well-rounded primer to the mathematics and models used in the valuation of financial derivatives. The book consists of fifteen chapters, the first ten of which develop option valuation techniques in discrete time, the last five describing the theory in continuous time. The first half of the textbook develops basic finance and probability. The author then treats the binomial model as the primary example of discrete-time option valuation. The final part of the textbook examines the Black-Scholes model. The book is written to provide a straightforward account of the principles of option pricing and examines these principles in detail using standard discrete and stochastic calculus models. Additionally, the second edition has new exercises and examples, and includes many tables and graphs generated by over 30 MS Excel VBA modules available on the author's webpage https://home.gwu.edu/~hdj/.
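As a flavour of the discrete-time material described above, the sketch below prices a European call with the Cox-Ross-Rubinstein binomial model. It is a generic textbook implementation with illustrative parameters, not the author's own VBA modules.

```python
# Cox-Ross-Rubinstein binomial valuation of a European call (illustrative sketch).
import math

def crr_european_call(S0, K, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))        # up factor per step
    d = 1.0 / u                                # down factor per step
    q = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
    # Discounted expected payoff under the risk-neutral measure.
    value = 0.0
    for k in range(n + 1):
        prob = math.comb(n, k) * q**k * (1 - q)**(n - k)
        ST = S0 * u**k * d**(n - k)
        value += prob * max(ST - K, 0.0)
    return math.exp(-r * T) * value

print(crr_european_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))
# Converges to the Black-Scholes value (about 10.45) for these inputs as n grows.
```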

Data Analytics - Effective Methods for Presenting Results (Hardcover)
Subhashish Samaddar, Satish Nargundkar
R2,284 Discovery Miles 22 840 Ships in 10 - 15 working days

If you are a manager who receives the results of any data analyst's work to help with your decision-making, this book is for you. Anyone playing a role in the field of analytics can benefit from this book as well. In the two decades the editors of this book spent teaching and consulting in the field of analytics, they noticed a critical shortcoming in the communication abilities of many analytics professionals. Specifically, analysts have difficulty in articulating in business terms what their analyses showed and what actionable recommendations were made. When analysts made presentations, they tended to lapse into the technicalities of mathematical procedures, rather than focusing on the strategic and tactical impact and meaning of their work. As analytics has become more mainstream and widespread in organizations, this problem has grown more acute. Data Analytics: Effective Methods for Presenting Results tackles this issue. The editors have used their experience as presenters and as audience members who have become lost during presentations. Over the years, they experimented with different ways of presenting analytics work to make a more compelling case to top managers. They have discovered tried and true methods for improving presentations, which they share. The book also presents insights from other analysts and managers who share their own experiences. It is truly a collection of experiences and insights from academics and professionals involved with analytics. The book is not a primer on how to draw the most beautiful charts and graphs or about how to perform any specific kind of analysis. Rather, it shares the experiences of professionals in various industries about how they present their analytics results effectively. They tell their stories on how to win over audiences. The book spans multiple functional areas within a business, and in some cases, it discusses how to adapt presentations to the needs of audiences at different levels of management.

Data Analytics in Project Management (Hardcover)
Seweryn Spalek
R3,362 Discovery Miles 33 620 Ships in 10 - 15 working days

This book aims to help the reader better understand the importance of data analysis in project management. Moreover, it provides guidance by showing tools, methods, techniques and lessons learned on how to better utilize the data gathered from the projects. First and foremost, insight into the bridge between data analytics and project management aids practitioners looking for ways to maximize the practical value of data procured. The book equips organizations with the know-how necessary to adapt to a changing workplace dynamic through key lessons learned from past ventures. The book's integrated approach to investigating both fields enhances the value of research findings.

Technical Analysis of Stock Trends (Hardcover, 11th edition)
Robert D. Edwards, John Magee, W. H. C. Bassetti
R3,218 Discovery Miles 32 180 Ships in 9 - 17 working days

Technical Analysis of Stock Trends helps investors make smart, profitable trading decisions by providing proven long- and short-term stock trend analysis. It gets right to the heart of effective technical trading concepts, explaining technical theory such as The Dow Theory, reversal patterns, consolidation formations, trends and channels, technical analysis of commodity charts, and advances in investment technology. It also includes a comprehensive guide to trading tactics from long and short goals, stock selection, charting, low and high risk, trend recognition tools, balancing and diversifying the stock portfolio, application of capital, and risk management. This updated new edition includes patterns and modifiable charts that are tighter and more illustrative. Expanded material is also included on Pragmatic Portfolio Theory as a more elegant alternative to Modern Portfolio Theory; and a newer, simpler, and more powerful alternative to Dow Theory is presented. This book is the perfect introduction, giving you the knowledge and wisdom to craft long-term success.

MODA 6 - Advances in Model-Oriented Design and Analysis - Proceedings of the 6th International Workshop on Model-Oriented Design and Analysis held in Puchberg/Schneeberg, Austria, June 25-29, 2001 (Paperback, Softcover reprint of the original 1st ed. 2001)
Anthony C. Atkinson, Peter Hackl, Werner G. Muller
R1,497 Discovery Miles 14 970 Ships in 18 - 22 working days

This book includes many of the papers presented at the 6th International Workshop on Model-Oriented Data Analysis held in June 2001. This series began in March 1987 with a meeting on the Wartburg near Eisenach (at that time in the GDR). The next four meetings were in 1990 (St Kyrik monastery, Bulgaria), 1992 (Petrodvorets, St Petersburg, Russia), 1995 (Spetses, Greece) and 1998 (Marseilles, France). Initially the main purpose of these workshops was to bring together leading scientists from 'Eastern' and 'Western' Europe for the exchange of ideas in theoretical and applied statistics, with special emphasis on experimental design. Now that the separation between East and West is much less rigid, this exchange has, in principle, become much easier. However, it is still important to provide opportunities for this interaction. MODA meetings are celebrated for their friendly atmosphere. Indeed, discussions between young and senior scientists at these meetings have resulted in several fruitful long-term collaborations. This intellectually stimulating atmosphere is achieved by limiting the number of participants to around eighty, by the choice of a location in which communal living is encouraged and, of course, through the careful scientific direction provided by the Programme Committee. It is a tradition of these meetings to provide low-cost accommodation, low fees and financial support for the travel of young and Eastern participants. This is only possible through the help of sponsors, and outside financial support was again important for the success of the meeting.

Classification, Automation, and New Media - Proceedings of the 24th Annual Conference of the Gesellschaft fur Klassifikation e.V., University of Passau, March 15-17, 2000 (Paperback, Softcover reprint of the original 1st ed. 2002)
Wolfgang A. Gaul, Gunter Ritter
R4,235 Discovery Miles 42 350 Ships in 18 - 22 working days

Given the huge amount of information in the internet and in practically every domain of knowledge that we are facing today, knowledge discovery calls for automation. The book deals with methods from classification and data analysis that respond effectively to this rapidly growing challenge. The interested reader will find new methodological insights as well as applications in economics, management science, finance, and marketing, and in pattern recognition, biology, health, and archaeology.

Frontiers in Statistical Quality Control 6 (Paperback, Softcover reprint of the original 1st ed. 2001)
Hans-Joachim Lenz, Peter-Theodor Wilrich
R2,790 Discovery Miles 27 900 Ships in 18 - 22 working days

In the 1920s, Walter Shewhart visualized that the marriage of statistical methods and manufacturing processes would produce reliable and consistent quality products. Shewhart (1931) conceived the idea of statistical process control (SPC) and developed the well-known and appropriately named Shewhart control chart. However, from the 1930s to the 1990s, the literature on SPC schemes was "captured" by the Shewhart paradigm of normality, independence and homogeneous variance, when in fact the problems facing today's industries are far less consistent with those assumptions than the problems faced by Shewhart in the 1930s. As a result of the advances in machine and sensor technology, process data can often be collected on-line. In this situation, the process observations that result from data collection activities will frequently not be serially independent, but autocorrelated. Autocorrelation has a significant impact on a control chart: the process may not exhibit a state of statistical control when in fact it is in control. As the prevalence of this type of data is expected to increase in industry (Hahn 1989), so does the need to control and monitor it. Accordingly, the literature has reflected this trend, and research in the area of SPC with autocorrelated data continues so that effective methods of handling correlated data are available. This type of data regularly occurs in the chemical and process industries, and is pervasive in computer-integrated manufacturing environments, clinical laboratory settings and in the majority of SPC applications across various manufacturing and service industries (Alwan 1991).
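The practical consequence described here (spurious out-of-control signals when the data are autocorrelated but the process is actually stable) is easy to illustrate with a small simulation. This sketch is not from the book; the AR(1) coefficient and chart constants are illustrative assumptions.

```python
# An individuals (Shewhart) chart with +/- 3-sigma limits, applied to in-control
# but autocorrelated AR(1) data, signals far more often than the nominal ~0.27%.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
phi = 0.8                                    # assumed AR(1) coefficient
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]             # stationary, in-control AR(1) process

# Limits estimated the usual Shewhart way, from the average moving range.
mr = np.abs(np.diff(x))
sigma_hat = mr.mean() / 1.128                # d2 constant for subgroups of size 2
center = x.mean()
out = (x > center + 3 * sigma_hat) | (x < center - 3 * sigma_hat)
print(f"False-alarm rate: {out.mean():.2%}  (nominal 0.27% for independent data)")
```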

Classification and Information Processing at the Turn of the Millennium - Proceedings of the 23rd Annual Conference of the Gesellschaft fur Klassifikation e.V., University of Bielefeld, March 10-12, 1999 (Paperback, 2000 ed.)
Reinhold Decker, Wolfgang A. Gaul
R4,212 Discovery Miles 42 120 Ships in 18 - 22 working days

This volume contains revised versions of selected papers presented during the 23rd Annual Conference of the German Classification Society GfKl (Gesellschaft für Klassifikation). The conference took place at the University of Bielefeld (Germany) in March 1999 under the title "Classification and Information Processing at the Turn of the Millennium". Researchers and practitioners - interested in data analysis, classification, and information processing in the broad sense, including computer science, multimedia, WWW, knowledge discovery, and data mining as well as special application areas such as (in alphabetical order) biology, finance, genome analysis, marketing, medicine, public health, and text analysis - had the opportunity to discuss recent developments and to establish cross-disciplinary cooperation in their fields of interest. Additionally, software and book presentations as well as several tutorial courses were organized. The scientific program of the conference included 18 plenary or semi-plenary lectures and more than 100 presentations in special sections. The peer-reviewed papers are presented in 5 chapters as follows:
* Data Analysis and Classification
* Computer Science, Computational Statistics, and Data Mining
* Management Science, Marketing, and Finance
* Biology, Genome Analysis, and Medicine
* Text Analysis and Information Retrieval
As an unambiguous assignment of results to single chapters is sometimes difficult, papers are grouped in a way that the editors found appropriate.

Seasonal Adjustment with the X-11 Method (Paperback, Softcover reprint of the original 1st ed. 2001)
Dominique Ladiray, Benoit Quenneville
R3,168 Discovery Miles 31 680 Ships in 18 - 22 working days

The most widely used statistical method in seasonal adjustment is without doubt that implemented in the X-11 Variant of the Census Method II Seasonal Adjustment Program. Developed at the US Bureau of the Census in the 1950's and 1960's, this computer program has undergone numerous modifications and improvements, leading especially to the X-11-ARIMA software packages in 1975 and 1988 and X-12-ARIMA, the first beta version of which is dated 1998. While these software packages integrate, to varying degrees, parametric methods, and especially the ARIMA models popularized by Box and Jenkins, they remain in essence very close to the initial X-11 method, and it is this "core" that Seasonal Adjustment with the X-11 Method focuses on. With a Preface by Allan Young, the authors document the seasonal adjustment method implemented in the X-11 based software. It will be an important reference for government agencies, macroeconomists, and other serious users of economic data. After some historical notes, the authors outline the X-11 methodology. One chapter is devoted to the study of moving averages with an emphasis on those used by X-11. Readers will also find a complete example of seasonal adjustment, and have a detailed picture of all the calculations. The linear regression models used for trading-day effects and the process of detecting and correcting extreme values are studied in the example. The estimation of the Easter effect is dealt with in a separate chapter insofar as the models used in X-11-ARIMA and X-12-ARIMA are appreciably different. Dominique Ladiray is an Administrateur at the French Institut National de la Statistique et des Etudes Economiques. He is also a Professor at the Ecole Nationale de la Statistique et de l'Administration Economique, and at the Ecole Nationale de la Statistique et de l'Analyse de l'Information. He currently works on short-term economic analysis. Benoît Quenneville is a methodologist with Statistics Canada Time Series Research and Analysis Centre. He holds a Ph.D. from the University of Western Ontario. His research interests are in time series analysis with an emphasis on official statistics.
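The moving averages discussed in that chapter can be illustrated with the centered 2x12 average, which X-11 uses as its initial trend-cycle estimate for monthly series. The sketch below covers only this first step on a simulated toy series, not the full X-11 iteration.

```python
# Centered 2x12 moving average on a synthetic monthly series with a period-12
# seasonal pattern; the 13 weights (1, 2 x 11, 1)/24 span a full year on each
# side of the centre, so the seasonal component averages out of the estimate.
import numpy as np

rng = np.random.default_rng(7)
months = np.arange(72)
trend = np.linspace(100, 130, 72)
seasonal = 10 * np.sin(2 * np.pi * months / 12)
series = trend + seasonal + rng.normal(0, 1, 72)

weights = np.array([1.0] + [2.0] * 11 + [1.0]) / 24.0
trend_est = np.convolve(series, weights, mode="valid")   # loses 6 points at each end
print(trend_est[:5].round(2))
```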

Introduction to Statistical Methods for Financial Models (Hardcover)
Thomas A. Severini
R2,810 Discovery Miles 28 100 Ships in 10 - 15 working days

This book provides an introduction to the use of statistical concepts and methods to model and analyze financial data. The ten chapters of the book fall naturally into three sections. Chapters 1 to 3 cover some basic concepts of finance, focusing on the properties of returns on an asset. Chapters 4 through 6 cover aspects of portfolio theory and the methods of estimation needed to implement that theory. The remainder of the book, Chapters 7 through 10, discusses several models for financial data, along with the implications of those models for portfolio theory and for understanding the properties of return data. The audience for the book is students majoring in Statistics and Economics as well as in quantitative fields such as Mathematics and Engineering. Readers are assumed to have some background in statistical methods along with courses in multivariate calculus and linear algebra.

Evaluating Active Labour Market Policies - Empirical Evidence for Poland During Transition (Paperback, Softcover reprint of the original 1st ed. 1999)
Patrick A. Puhani
R2,642 Discovery Miles 26 420 Ships in 18 - 22 working days

Most governments in today's market economies spend significant sums of money on labour market programmes. The declared aims of these programmes are to increase the re-employment chances of the unemployed. This book investigates which active labour market programmes in Poland are value for money and which are not. To this end, modern statistical methods are applied to both macro- and microeconomic data. It is shown that training programmes increase, whereas job subsidies and public works decrease the re-employment opportunities of the unemployed. In general, all active labour market policy effects are larger in absolute size for men than for women. By surveying previous studies in the field and outlining the major statistical approaches that are employed in the evaluation literature, the book can be of help to any student interested in programme evaluation irrespective of the particular programme or country concerned.

Classification in the Information Age - Proceedings of the 22nd Annual GfKl Conference, Dresden, March 4-6, 1998 (Paperback, Softcover reprint of the original 1st ed. 1999)
Wolfgang A. Gaul, Hermann Locarek-Junge
R4,121 Discovery Miles 41 210 Ships in 18 - 22 working days

Selected papers presented at the 22nd Annual Conference of the German Classification Society GfKl (Gesellschaft für Klassifikation), held at the University of Dresden in 1998, are contained in this volume of "Studies in Classification, Data Analysis, and Knowledge Organization". One aim of GfKl was to provide a platform for a discussion of results concerning a challenge of growing importance that could be labeled as "Classification in the Information Age" and to support interdisciplinary activities from research and applications that incorporate directions of this kind. As could be expected, the largest share of papers is closely related to classification and - in the broadest sense - data analysis and statistics. Additionally, besides contributions dealing with questions arising from the usage of new media and the internet, applications in, e.g., (in alphabetical order) archeology, bioinformatics, economics, environment, and health have been reported. As always, an unambiguous assignment of results to single topics is sometimes difficult; thus, from more than 130 presentations offered within the scientific program, 65 papers are grouped into the following chapters and subchapters:
* Plenary and Semi-Plenary Presentations: Classification and Information; Finance and Risk
* Classification and Related Aspects of Data Analysis and Learning: Classification, Data Analysis, and Statistics; Conceptual Analysis and Learning
* Usage of New Media and the Internet: Information Systems, Multimedia, and WWW; Navigation and Classification on the Internet and Virtual Universities
* Applications in Economics

Handbook Of Applied Econometrics And Statistical Inference (Hardcover)
Aman Ullah
R10,636 Discovery Miles 106 360 Ships in 10 - 15 working days

Summarizes the latest developments and techniques in the field and highlights areas such as sample surveys, nonparametric analysis, hypothesis testing, time series analysis, Bayesian inference, and distribution theory for current applications in statistics, economics, medicine, biology, engineering, sociology, psychology, and information technology. Containing more than 800 contemporary references to facilitate further study, the Handbook of Applied Econometrics and Statistical Inference is an in-depth guide for applied statisticians, econometricians, economists, sociologists, psychologists, data analysts, biometricians, medical researchers, and upper-level undergraduate and graduate-level students in these disciplines.

Bootstrapping - An Integrated Approach with Python and Stata (Paperback)
Felix Bittmann
R748 R652 Discovery Miles 6 520 Save R96 (13%) Ships in 18 - 22 working days

Bootstrapping is a conceptually simple statistical technique to increase the quality of estimates, conduct robustness checks and compute standard errors for virtually any statistic. This book provides an intelligible and compact introduction for students, scientists and practitioners. It not only gives a clear explanation of the underlying concepts but also demonstrates the application of bootstrapping using Python and Stata.
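Since the book demonstrates the technique in Python, here is a minimal, generic NumPy sketch of a bootstrap standard error and percentile interval for a sample mean. The data are simulated and the code is an illustration of the idea, not an excerpt from the book.

```python
# Bootstrap sketch: resample with replacement, recompute the statistic each time,
# and use the spread of the replicates as its standard error.
import numpy as np

rng = np.random.default_rng(3)
sample = rng.exponential(scale=2.0, size=50)      # stand-in for some observed data

n_reps = 5_000
boot_means = np.empty(n_reps)
for b in range(n_reps):
    resample = rng.choice(sample, size=sample.size, replace=True)
    boot_means[b] = resample.mean()

print(f"Sample mean:              {sample.mean():.3f}")
print(f"Bootstrap standard error: {boot_means.std(ddof=1):.3f}")
print("95% percentile interval:  "
      f"({np.percentile(boot_means, 2.5):.3f}, {np.percentile(boot_means, 97.5):.3f})")
```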

Discrete Choice Experiments in Marketing - Use of Priors in Efficient Choice Designs and Their Application to Individual Preference Measurement (Paperback, illustrated edition)
Klaus Zwerina
R1,386 Discovery Miles 13 860 Ships in 18 - 22 working days

The chapter starts with a positioning of this dissertation in the marketing discipline. It then provides a comparison of the two most popular methods for studying consumer preferences/choices, namely conjoint analysis and discrete choice experiments. Chapter 1 continues with a description of the context of discrete choice experiments. Subsequently, the research problems and the objectives of this dissertation are discussed. The chapter concludes with an outline of the organization of this dissertation. 1.1 Positioning of the Dissertation. During this century, increasing globalization and technological progress have forced companies to undergo rapid and dramatic changes - for some a threat, for others a source of new opportunities. Companies have to survive in a Darwinian marketplace where the principle of natural selection applies. Marketplace success goes to those companies that are able to produce marketable value, i.e., products and services that others are willing to purchase (Kotler 1997). Every company must be engaged in new-product development to create the new products customers want, because competitors will do their best to supply them. Besides offering competitive advantages, new products usually lead to sales growth and stability. As household incomes increase and consumers become more selective, firms need to know how consumers respond to different features and appeals. Successful products and services begin with a thorough understanding of consumer needs and wants. Stated otherwise, companies need to know about consumer preferences to manufacture tailor-made products that consumers are willing to buy.
