Showing 1 - 11 of 11 matches in All Departments

Statistics for Making Decisions (Hardcover)
Nicholas T. Longford
R3,418 R2,846 Discovery Miles 28 460 Save R572 (17%) Ships in 9 - 15 working days

Making decisions is a ubiquitous mental activity in our private and professional or public lives. It entails choosing one course of action from an available shortlist of options. Statistics for Making Decisions places decision making at the centre of statistical inference, proposing its theory as a new paradigm for statistical practice. The analysis in this paradigm is earnest about prior information and the consequences of the various kinds of errors that may be committed. Its conclusion is a course of action tailored to the perspective of the specific client or sponsor of the analysis. The author's intention is a wholesale replacement of hypothesis testing, indicting it with the argument that it has no means of incorporating the consequences of errors which self-evidently matter to the client. The volume appeals to the analyst who deals with the simplest statistical problems of comparing two samples (which one has a greater mean or variance), or deciding whether a parameter is positive or negative. It combines highlighting the deficiencies of hypothesis testing with promoting a principled solution based on the idea of a currency for error, of which we want to spend as little as possible. This is implemented by selecting the option for which the expected loss is smallest (the Bayes rule). The price to pay is the need for a more detailed description of the options, and eliciting and quantifying the consequences (ramifications) of the errors. This is what our clients do informally and often inexpertly after receiving outputs of the analysis in an established format, such as the verdict of a hypothesis test or an estimate and its standard error. As a scientific discipline and profession, statistics has a potential to do this much better and deliver to the client a more complete and more relevant product.

Nicholas T. Longford is a senior statistician at Imperial College, London, specialising in statistical methods for neonatal medicine. His interests include causal analysis of observational studies, decision theory, and the contest of modelling and design in data analysis. His longer-term appointments in the past include Educational Testing Service, Princeton, NJ, USA, de Montfort University, Leicester, England, and directorship of SNTL, a statistics research and consulting company. He is the author of over 100 journal articles and six other monographs on a variety of topics in applied statistics.
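The blurb's central device, selecting the option with the smallest expected loss (the Bayes rule), can be illustrated with a minimal sketch. The options, states, losses and probabilities below are invented purely for illustration; they are not taken from the book.

```python
def bayes_rule(losses, probabilities):
    """Return the option whose expected loss is smallest.

    losses[option][state] is the loss incurred by choosing `option`
    when the true state of the world is `state`; `probabilities` are
    the probabilities of those states.
    """
    expected = {
        option: sum(p * l for p, l in zip(probabilities, row))
        for option, row in losses.items()
    }
    return min(expected, key=expected.get), expected

# Two options (act / wait), two states (parameter positive / negative).
# Acting when the parameter is negative costs 5; waiting when it is
# positive costs 2; correct choices cost nothing.
losses = {"act": [0.0, 5.0], "wait": [2.0, 0.0]}
choice, expected = bayes_rule(losses, [0.7, 0.3])
```

With these numbers the expected losses are 1.5 for "act" and 1.4 for "wait", so the rule chooses "wait" even though the parameter is more likely positive: the asymmetry of the losses, which a hypothesis test ignores, tips the decision.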

Statistics for Making Decisions (Paperback)
Nicholas T. Longford
R1,452 Discovery Miles 14 520 Ships in 12 - 17 working days

(Description as for the hardcover edition above.)

Statistical Studies of Income, Poverty and Inequality in Europe - Computing and Graphics in R using EU-SILC (Paperback)
Nicholas T. Longford
R1,431 Discovery Miles 14 310 Ships in 12 - 17 working days

There is no shortage of incentives to study and reduce poverty in our societies. Poverty is studied in economics and political sciences, and population surveys are an important source of information about it. The design and analysis of such surveys is principally a statistical subject matter and the computer is essential for their data compilation and processing. Focusing on The European Union Statistics on Income and Living Conditions (EU-SILC), a program of annual national surveys which collect data related to poverty and social exclusion, Statistical Studies of Income, Poverty and Inequality in Europe: Computing and Graphics in R presents a set of statistical analyses pertinent to the general goals of EU-SILC. The contents of the volume are biased toward computing and statistics, with reduced attention to economics, political and other social sciences. The emphasis is on methods and procedures as opposed to results, because the data from annual surveys made available since publication and in the near future will degrade the novelty of the data used and the results derived in this volume. The aim of this volume is not to propose specific methods of analysis, but to open up the analytical agenda and address the aspects of the key definitions in the subject of poverty assessment that entail nontrivial elements of arbitrariness. The presented methods do not exhaust the range of analyses suitable for EU-SILC, but will stimulate the search for new methods and adaptation of established methods that cater to the identified purposes.
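One of the "key definitions" with an arbitrary element that the blurb alludes to is the at-risk-of-poverty rate, conventionally the share of persons whose equivalised income falls below 60% of the national median. The book works in R; the sketch below restates the idea in Python with an invented, unweighted income vector (real EU-SILC analyses apply survey weights).

```python
from statistics import median

def at_risk_of_poverty_rate(incomes, fraction=0.6):
    """Share of incomes below `fraction` of the median income.

    The choice of `fraction` (conventionally 0.6) is exactly the kind
    of nontrivially arbitrary element the book's analyses probe.
    """
    threshold = fraction * median(incomes)
    return sum(1 for y in incomes if y < threshold) / len(incomes)

# Invented equivalised incomes (thousands of euro per year):
incomes = [4, 8, 10, 12, 15, 18, 20, 25, 30, 50]
rate = at_risk_of_poverty_rate(incomes)
```

Here the median is 16.5, the threshold 9.9, and two of the ten incomes fall below it, giving a rate of 0.2; changing `fraction` to 0.5 or 0.7 changes who counts as poor, which is the arbitrariness at issue.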

Statistical Decision Theory (Paperback, 2013 ed.)
Nicholas T. Longford
R1,922 Discovery Miles 19 220 Ships in 10 - 15 working days

This monograph presents a radical rethinking of how elementary inferences should be made in statistics, implementing a comprehensive alternative to hypothesis testing in which the control of the probabilities of the errors is replaced by selecting the course of action (one of the available options) associated with the smallest expected loss.

Its strength is that the inferences are responsive to the elicited or declared consequences of the erroneous decisions, and so they can be closely tailored to the client's perspective, priorities, value judgments and other prior information, together with the uncertainty about them.

Missing Data and Small-Area Estimation - Modern Analytical Equipment for the Survey Statistician (Paperback, 2005)
Nicholas T. Longford
R2,983 Discovery Miles 29 830 Ships in 10 - 15 working days

This book evolved from lectures, courses and workshops on missing data and small-area estimation that I presented during my tenure as the first Campion Fellow (2000-2002). For the Fellowship I proposed these two topics as areas in which academic statistics could contribute to the development of government statistics, in exchange for access to the operational details and background that would inform the direction and sharpen the focus of academic research. After a few years of involvement, I have come to realise that the separation of 'academic' and 'industrial' statistics is not well suited to either party, and their integration is the key to progress in both branches. Most of the work on this monograph was done while I was a visiting lecturer at Massey University, Palmerston North, New Zealand. The hospitality and stimulating academic environment of their Institute of Information Science and Technology is gratefully acknowledged. I could not name all those who commented on my lecture notes and on the presentations themselves; apart from them, I want to thank the organisers and silent attendees of all the events, and, with a modicum of reluctance, the 'grey figures' who kept inquiring whether I was any nearer the completion of whatever stage I had been foolish enough to attach a date to.

Models for Uncertainty in Educational Testing (Paperback, Softcover reprint of the original 1st ed. 1995)
Nicholas T. Longford
R1,559 Discovery Miles 15 590 Ships in 10 - 15 working days

A theme running through this book is that of making inference about sources of variation or uncertainty, and the author shows how information about these sources can be used for improved estimation of certain elementary quantities. Amongst the topics covered are: essay rating, summarizing item-level properties, equating of tests, small-area estimation, and incomplete longitudinal studies. Throughout, examples are given using real data sets which exemplify these applications.

Studying Human Populations - An Advanced Course in Statistics (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Nicholas T. Longford
R1,620 Discovery Miles 16 200 Ships in 10 - 15 working days

This textbook is for graduate students and research workers in social statistics and related subject areas. It follows a novel curriculum developed around the basic statistical activities: sampling, measurement and inference. The monograph aims to prepare the reader for the career of an independent social statistician and to serve as a reference for methods, ideas and ways of studying human populations. Elementary linear algebra and calculus are prerequisites, although the exposition is quite forgiving. Familiarity with statistical software at the outset is an advantage, but it can be developed while reading the first few chapters.

Studying Human Populations - An Advanced Course in Statistics (Hardcover, 2008 ed.)
Nicholas T. Longford
R1,652 Discovery Miles 16 520 Ships in 10 - 15 working days

(Description as for the paperback edition above.)

Missing Data and Small-Area Estimation - Modern Analytical Equipment for the Survey Statistician (Hardcover)
Nicholas T. Longford
R3,015 Discovery Miles 30 150 Ships in 10 - 15 working days

This book develops methods for two key problems in the analysis of large-scale surveys: dealing with incomplete data and making inferences about sparsely represented subdomains. The presentation is committed to two particular methods, multiple imputation for missing data and multivariate composition for small-area estimation. The methods are presented as developments of established approaches by attending to their deficiencies. Thus the change to more efficient methods can be gradual, sensitive to the management priorities in large research organisations and multidisciplinary teams and to other reasons for inertia. The typical setting of each problem is addressed first, and then the constituency of the applications is widened to reinforce the view that the general method is essential for modern survey analysis. The general tone of the book is not "from theory to practice," but "from current practice to better practice." The third part of the book, a single chapter, presents a method for efficient estimation under model uncertainty. It is inspired by the solution for small-area estimation and is an example of "from good practice to better theory."
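The "multivariate composition" mentioned above rests on a standard small-area idea: combine an unbiased but unstable direct estimator with a stable but possibly biased synthetic (e.g. national) estimator, weighting each by the other's uncertainty. The sketch below shows the generic composite form with invented numbers; it is an illustration of the principle, not the book's specific multivariate method.

```python
def composite_estimate(direct, var_direct, synthetic, mse_synthetic):
    """Convex combination of a direct and a synthetic estimator.

    The weight on the synthetic estimator grows with the sampling
    variance of the direct estimator, shrinking unstable small-area
    estimates toward the stable synthetic value.
    """
    b = var_direct / (var_direct + mse_synthetic)
    return (1.0 - b) * direct + b * synthetic

# A small area with a noisy direct estimate (variance 4.0) and a
# synthetic national estimate with modest MSE (1.0); all invented.
est = composite_estimate(direct=12.0, var_direct=4.0,
                         synthetic=10.0, mse_synthetic=1.0)
```

With these numbers the weight on the synthetic estimator is 0.8, so the composite, 10.4, sits much closer to the synthetic value than to the noisy direct estimate; as the area's sample grows and `var_direct` shrinks, the weight shifts back toward the direct estimator.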

A strength of the presentation is chapters of case studies, one for each problem. Whenever possible, turning to examples and illustrations is preferred to the theoretical argument. The book is suitable for graduate students and researchers who are acquainted with the fundamentals of sampling theory and have a good grounding in statistical computing, or in conjunction with an intensive period of learning and establishing one's own modern computing and graphical environment that would serve the reader for most of the analytical work in the future.

While some analysts might regard data imperfections and deficiencies, such as nonresponse and limited sample size, as someone else's failure that bars effective and valid analysis, this book presents them as respectable analytical and inferential challenges, opportunities to harness the computing power into service of high-quality socially relevant statistics.

Overriding in this approach is the general principle: to do the best, for the consumer of statistical information, that can be done with what is available. The reputation that government statistics is a rigid procedure-based and operation-centred activity, distant from the mainstream of statistical theory and practice, is refuted most resolutely.

After leaving De Montfort University in 2004, where he was a Senior Research Fellow in Statistics, Nick Longford founded the statistical research and consulting company SNTL in Leicester, England. He was awarded the first Campion Fellowship (2000-02) for methodological research in United Kingdom government statistics. He has served as Associate Editor of the Journal of the Royal Statistical Society, Series A, and the Journal of Educational and Behavioral Statistics, and as an Editor of the Journal of Multivariate Analysis. He is a member of the Editorial Board of the British Journal of Mathematical and Statistical Psychology. He is the author of two other monographs, Random Coefficient Models (Oxford University Press, 1993) and Models for Uncertainty in Educational Testing (Springer-Verlag, 1995).

From the reviews:

"Ultimately, this book serves as an excellent reference source to guide and improve statistical practice in survey settings exhibiting theseproblems." Psychometrika

"I am convinced this book will be useful to practitioners...[and a] valuable resource for future research in this field." Jan Kordos in Statistics in Transition, Vol. 7, No. 5, June 2006

"To sum up, I think this is an excellent book and it thoroughly covers methods to deal with incomplete data problems and small-area estimation. It is a useful and suitable book for survey statisticians, as well as for researchers and graduate students interested on sampling designs." Ramon Cleries Soler in Statistics and Operations Research Transactions, Vol. 30, No. 1, January-June 2006

Statistical Studies of Income, Poverty and Inequality in Europe - Computing and Graphics in R using EU-SILC (Hardcover)
Nicholas T. Longford
R3,400 Discovery Miles 34 000 Ships in 12 - 17 working days

(Description as for the paperback edition above.)

Random Coefficient Models (Hardcover)
Nicholas T. Longford
R2,354 Discovery Miles 23 540 Ships in 10 - 15 working days

Clustering is a phenomenon commonly observed across social science research--students are clustered in classrooms, individuals in households, and companies within industrial sectors, to name but a few examples. This book presents an elementary and systematic introduction to modeling of between-cluster variation, how results are best interpreted, and computational methods for estimation. The book addresses many important issues in the social sciences that can be best described in terms of variation sources and patterns, such as temporal, between-person, and geographical variation. By providing a balanced presentation of the advantages and limitations of these methods, the author has provided an introduction to the subject that will be of great utility to statisticians and students concentrating on social science data analysis.
