This book introduces a new data analysis technique that addresses long-standing criticisms of the current standard statistics. Observation Oriented Modelling presents the mathematics and techniques underlying the new method, discussing causality, modelling, and logical hypothesis testing. Examples of how to approach and interpret data using OOM are presented throughout the book, including analyses of several classic studies in psychology. These analyses are conducted using comprehensive software for the Windows operating system that has been written to accompany the book and will be provided free to book buyers on an accompanying website. The software has a user-friendly interface, similar to SPSS and SAS, the two most commonly used statistical analysis packages, and its analysis options are flexible enough to replace numerous traditional techniques such as t-tests, ANOVA, correlation, multiple regression, mediation analysis, chi-square tests, factor analysis, and inter-rater reliability. The output and graphs generated by the software are also easy to interpret, and all effect sizes are presented in a common metric; namely, the number of observations correctly classified by the algorithm. The software is designed so that undergraduate students in psychology will have no difficulty learning how to use it and interpreting the results of the analyses. * Describes the problems that statistics are meant to answer, why popularly used statistics often fail to fully answer the question, and how OOM overcomes these obstacles * Chapters include examples of statistical analysis using OOM * Software for OOM comes free with the book * Accompanying website includes video instruction on OOM use
Longitudinal research is an essential element in the investigation of human development over time, with considerable advantages over more widely used cross-sectional research designs. This book examines the scope for longitudinal studies in a range of developmental fields, emphasizing the advantages of this approach for the investigation of causal mechanisms and processes and the dynamics of development over the lifespan. It also discusses methodological issues and some of the practical and ethical problems that longitudinal research may present. The distinguished contributors review normal and disordered development in the emotional, cognitive and social domains, including valuable discussions of gene-environment interactions, the maturation of the human brain, and issues relating to ageing.
Wim van der Linden was recently given a lifetime achievement award by the National Council on Measurement in Education. There is no one more prominent in the area of educational testing. There are hundreds of computer-based credentialing exams in areas such as accounting, real estate, nursing, and securities, as well as the well-known admissions exams for college, graduate school, medical school, and law school, so there is great need for a sound theory of testing. This book presents the statistical theory and practice behind constructing good tests: for example, how is the first test item selected, how are the next items selected, and when do you have enough items?
Generalizability theory offers an extensive conceptual framework and a powerful set of statistical procedures for characterizing and quantifying the fallibility of measurements. Robert Brennan, the author, has written the most comprehensive and up-to-date treatment of generalizability theory. The book provides a synthesis of those parts of the statistical literature that are directly applicable to generalizability theory. The principal intended audience is measurement practitioners and graduate students in the behavioral and social sciences, although a few examples and references are provided from other fields. Readers will benefit from some familiarity with classical test theory and analysis of variance, but the treatment of most topics does not presume specific background.
In this book, experts in statistics and psychometrics describe classes of linkages, the history of score linking, data collection designs, and methods used to achieve sound score linkages. They describe and critically discuss applications to a variety of domains. They define what linking is, distinguish among the varieties of linking, and describe different procedures for linking. Furthermore, they convey the complexity and diversity of linking by covering different areas of linking and providing diverse perspectives.
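One of the simplest linking procedures, linear linking, places scores from one test form on the scale of another by matching means and standard deviations. The score data below are hypothetical; real linking designs are far more involved:

```python
import statistics as st

# Illustrative linear linking: map scores from test form X onto the
# scale of form Y so that equal z-scores correspond (hypothetical data).
x = [10, 12, 14, 16, 18]   # scores on form X
y = [40, 45, 50, 55, 60]   # scores on form Y (same trait, different scale)

mx, my = st.mean(x), st.mean(y)
sx, sy = st.pstdev(x), st.pstdev(y)

def link(score_x):
    """Linear linking function: y-equivalent of a form-X score."""
    return my + (sy / sx) * (score_x - mx)

print(link(14))  # the form-X mean maps to the form-Y mean, 50.0
```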
WINNER OF THE 2007 DEGROOT PRIZE The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models. In the more than a hundred years since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models, which combine a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods. It is the first time that the Bayesian perspective on finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach. The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers, and any other frequent users of statistical models.
This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they can be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Prior familiarity with Bayesian inference and Monte Carlo simulation is useful but not required.
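As a minimal illustration of the estimation problem Pearson posed, the sketch below fits a two-component normal mixture with the EM algorithm, a frequentist alternative to the Bayesian MCMC methods the book develops. The data are simulated with illustrative parameters:

```python
import math
import random

random.seed(1)
# Simulated data from a two-component normal mixture
# (hypothetical component means 0 and 5, unit variances).
data = [random.gauss(0, 1) for _ in range(200)] + \
       [random.gauss(5, 1) for _ in range(200)]

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# EM for a univariate two-component normal mixture.
w, mu, sigma = [0.5, 0.5], [min(data), max(data)], [1.0, 1.0]
for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    resp = []
    for x in data:
        p = [w[k] * norm_pdf(x, mu[k], sigma[k]) for k in range(2)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: update weights, means, and standard deviations.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        w[k] = nk / len(data)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        sigma[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2
                                 for r, x in zip(resp, data)) / nk)

print(mu)  # component means, roughly 0 and 5
```

With well-separated components EM converges quickly; the label-switching and component-selection difficulties the book discusses arise precisely when components overlap or their number is unknown.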
This book examines extensions of the Rasch model, one of the most researched and applied models in educational research and social science. This collection contains 22 chapters by some of the most renowned international experts in the field. They cover topics ranging from general model extensions to applications in fields as diverse as cognition, personality, organizational and sports psychology, and health sciences and education.
This is the second edition of the comprehensive treatment of statistical inference using permutation techniques. It makes available to practitioners a variety of useful and powerful data analytic tools that rely on very few distributional assumptions. Although many of these procedures have appeared in journal articles, they are not readily available to practitioners. This new and updated edition places increased emphasis on the use of alternative permutation statistical tests based on metric Euclidean distance functions that have excellent robustness characteristics. These alternative permutation techniques provide many powerful multivariate tests including multivariate multiple regression analyses.
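The basic logic of a permutation test can be sketched in a few lines: shuffle the pooled observations, recompute the statistic, and count how often the shuffled statistic is at least as extreme as the observed one. The data below are hypothetical, and the statistic is a simple difference of means rather than the book's more general distance-function statistics:

```python
import random

random.seed(0)
# Two-sample permutation test of a mean difference (hypothetical data).
group_a = [12, 14, 15, 16, 18]
group_b = [20, 21, 23, 24, 26]

observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))

pooled = group_a + group_b
n_a = len(group_a)
n_perm = 10000
count = 0
for _ in range(n_perm):
    random.shuffle(pooled)          # random relabelling of the pooled data
    a, b = pooled[:n_a], pooled[n_a:]
    diff = abs(sum(a) / n_a - sum(b) / len(b))
    if diff >= observed:
        count += 1

p_value = count / n_perm
print(p_value)  # small: the observed difference is rare under shuffling
```

Note that no distributional assumption is needed beyond exchangeability under the null, which is the appeal of the whole approach.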
The study of brain function is one of the most fascinating pursuits of modern science. Functional neuroimaging is an important component of much of the current research in cognitive, clinical, and social psychology. The excitement of studying the brain is recognized in both the popular press and the scientific community. In the pages of mainstream publications, including The New York Times and Wired, readers can learn about cutting-edge research into topics such as understanding how customers react to products and advertisements ("If your brain has a 'buy button,' what pushes it?," The New York Times, October 19, 2004), how viewers respond to campaign ads ("Using M.R.I.'s to see politics on the brain," The New York Times, April 20, 2004; "This is your brain on Hillary: Political neuroscience hits new low," Wired, November 12, 2007), how men and women react to sexual stimulation ("Brain scans arouse researchers," Wired, April 19, 2004), distinguishing lies from the truth ("Duped," The New Yorker, July 2, 2007; "Woman convicted of child abuse hopes fMRI can prove her innocence," Wired, November 5, 2007), and even what separates "cool" people from "nerds" ("If you secretly like Michael Bolton, we'll know," Wired, October 2004). Reports on pathologies such as autism, in which neuroimaging plays a large role, are also common (for instance, a Time magazine cover story from May 6, 2002, entitled "Inside the world of autism").
Shame remains at the core of much psychological distress and can manifest as physical symptoms, yet experiential approaches to healing shame are sparse. Links between shame and art making have been felt, intuited, and examined, but have not been sufficiently documented by depth psychologists. Shame and the Making of Art addresses this lacuna by surveying depth psychological conceptions of shame, art, and the role of creativity in healing, contemporary and historical shame ideologies, as well as recent psychobiological studies on shame. Drawing on research conducted with participants in three different countries, the book includes candid discussions of shame experiences. These experiences are accompanied by Cluff's heuristic inquiry into shame with an interpretative phenomenological analysis that focuses on how participants negotiate the relationship between shame and the making of art. Cluff's movement through archetypal dimensions, especially Dionysian, is developed and discussed throughout the book. The results of the research are further explicated in terms of comparative studies, wherein the psychological processes and impacts observed by other researchers and effects on self-conscious maladaptive emotions are described. Shame and the Making of Art should be essential reading for academics, researchers, and postgraduate students engaged in the study of psychology and the arts. It will be of particular interest to psychologists, Jungian psychotherapists, psychiatrists, social workers, creativity researchers, and anyone interested in understanding the dynamics of shame and self-expression.
In the decade of the 1970s, item response theory became the dominant topic for study by measurement specialists. But, the genesis of item response theory (IRT) can be traced back to the mid-thirties and early forties. In fact, the term "Item Characteristic Curve," which is one of the main IRT concepts, can be attributed to Ledyard Tucker in 1946. Despite these early research efforts, interest in item response theory lay dormant until the late 1960s and took a backseat to the emerging development of strong true score theory. While true score theory developed rapidly and drew the attention of leading psychometricians, the problems and weaknesses inherent in its formulation began to raise concerns. Such problems as the lack of invariance of item parameters across examinee groups, and the inadequacy of classical test procedures to detect item bias or to provide a sound basis for measurement in "tailored testing," gave rise to a resurgence of interest in item response theory. Impetus for the development of item response theory as we now know it was provided by Frederic M. Lord through his pioneering works (Lord, 1952; 1953a, 1953b). The progress in the fifties was painstakingly slow due to the mathematical complexity of the topic and the nonexistence of computer programs.
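The item characteristic curve mentioned above is commonly written in three-parameter logistic form. A small sketch with illustrative parameter values (not from any particular test):

```python
import math

def icc(theta, a=1.2, b=0.0, c=0.2):
    """Three-parameter logistic item characteristic curve:
    probability of a correct response given ability theta,
    discrimination a, difficulty b, and guessing parameter c
    (illustrative values)."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

# The curve rises from the guessing floor c toward 1 as ability grows;
# the item parameters (a, b, c) do not depend on which examinee group
# is tested, which is the invariance property classical test theory lacks.
for theta in (-3, 0, 3):
    print(theta, round(icc(theta), 3))
```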
This book presents the state of the art in multilevel analysis, with an emphasis on more advanced topics. These topics are discussed conceptually, analyzed mathematically, and illustrated by empirical examples. Multilevel analysis is the statistical analysis of hierarchically and non-hierarchically nested data. The simplest example is clustered data, such as a sample of students clustered within schools. Multilevel data are especially prevalent in the social and behavioral sciences and in the biomedical sciences. The chapter authors are all leading experts in the field. Given the omnipresence of multilevel data in the social, behavioral, and biomedical sciences, this book is essential for empirical researchers in these fields.
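The simplest multilevel quantity, the intraclass correlation measuring how much outcome variance lies between clusters, can be sketched on simulated students-within-schools data. The variance components are hypothetical, and the estimates below are simple method-of-moments (one-way ANOVA) estimates rather than the likelihood-based methods treated in the book:

```python
import random

random.seed(2)
# Simulated two-level data: students clustered within schools
# (hypothetical variances: between-school 1.0, within-school 4.0).
n_schools, n_students = 50, 30
schools = []
for _ in range(n_schools):
    school_effect = random.gauss(0, 1.0)
    schools.append([70 + school_effect + random.gauss(0, 2.0)
                    for _ in range(n_students)])

# One-way ANOVA variance-component estimates.
grand = sum(sum(s) for s in schools) / (n_schools * n_students)
school_means = [sum(s) / n_students for s in schools]
within = sum(sum((x - m) ** 2 for x in s)
             for s, m in zip(schools, school_means)) / (n_schools * (n_students - 1))
between_ms = n_students * sum((m - grand) ** 2 for m in school_means) / (n_schools - 1)
between = max((between_ms - within) / n_students, 0.0)

icc = between / (between + within)
print(round(icc, 3))  # true population value is 1 / (1 + 4) = 0.2
```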
Qualitative methodologies in cultural psychology often lack the objective and verifiable character of quantitative analysis. Author Carl Ratner corrects this shortcoming by rigorously systematizing qualitative methods. The book discusses, for example, means of systematizing such subjective reports as interviews, letters, and diaries, which often yield valuable data that is not easily quantified. Ratner argues that "complex psychological phenomena are expressed through extended responses" and hence are best studied by new, more regularized qualitative methods that go beyond measuring simple, overt responses.
Item response theory has become an essential component in the toolkit of every researcher in the behavioral sciences. It provides a powerful means to study individual responses to a variety of stimuli, and the methodology has been extended and developed to cover many different models of interaction. This volume presents a wide-ranging handbook to item response theory - and its applications to educational and psychological testing. It will serve as both an introduction to the subject and also as a comprehensive reference volume for practitioners and researchers. It is organized into six major sections: the nominal categories model, models for response time or multiple attempts on items, models for multiple abilities or cognitive components, nonparametric models, models for nonmonotone items, and models with special assumptions. Each chapter in the book has been written by an expert of that particular topic, and the chapters have been carefully edited to ensure that a uniform style of notation and presentation is used throughout. As a result, all researchers whose work uses item response theory will find this an indispensable companion to their work and it will be the subject's reference volume for many years to come.
This textbook is for graduate students and research workers in social statistics and related subject areas. It follows a novel curriculum developed around the basic statistical activities: sampling, measurement, and inference. The monograph aims to prepare the reader for the career of an independent social statistician and to serve as a reference for methods, ideas, and ways of studying human populations. Elementary linear algebra and calculus are prerequisites, although the exposition is quite forgiving. Familiarity with statistical software at the outset is an advantage, but it can be developed while reading the first few chapters.
The Handbook of Psychodiagnostic Testing is an invaluable aid to students and professionals performing psychological assessments. It takes the reader from client referral to finished report, demonstrating how to synthesize details of personality and pathology into a document that is focused, coherent, and clinically meaningful. This new edition covers emerging areas in borderline and narcissistic pathologies, psychological testing of preschool children, and bilingual populations. It also discusses the most current clinical issues and evaluating populations on which standard psychological tests have not been standardized.
Presenting an innovative take on researching early childhood, this book provides an international comparison of the cultural and familial influences that shape the growth of young children. The book presents a unique methodology, and includes chapters on musicality, security, humour and eating.
A Handbook of Process Tracing Methods demonstrates how to better understand decision outcomes by studying decision processes, through the introduction of a number of exciting techniques. Decades of research have identified numerous idiosyncrasies in human decision behavior, but some of the most recent advances in the scientific study of decision making involve the development of sophisticated methods for understanding decision processes, known as process tracing. In this volume, leading experts discuss the application of these methods and focus on the best practices for using some of the more popular techniques, discussing how to incorporate them into formal decision models. This edition has been expanded and thoroughly updated throughout, and now includes new chapters on mouse tracking, protocol analysis, neurocognitive methods, the measurement of valuation, as well as an overview of important software packages. The volume not only surveys cutting-edge research to illustrate the great variety in process tracing techniques, but also serves as a tutorial for how the novice researcher might implement these methods. A Handbook of Process Tracing Methods will be an essential read for all students and researchers of decision making.
This comprehensive Handbook is the first to provide a practical, interdisciplinary review of ethical issues as they relate to quantitative methodology, including how to present evidence for reliability and validity, what comprises an adequately tested population, and what constitutes scientific knowledge for eliminating biases. The book uses an ethical framework that emphasizes the human cost of quantitative decision making to help researchers understand the specific implications of their choices. The order of the Handbook chapters parallels the chronology of the research process: determining the research design and data collection; data analysis; and communicating findings. Each chapter explores the ethics of a particular topic; identifies prevailing methodological issues; reviews strategies and approaches for handling such issues and their ethical implications; provides one or more case examples; and outlines plausible approaches to the issue, including best-practice solutions. Part 1 presents ethical frameworks that cross-cut design, analysis, and modeling in the behavioral sciences. Part 2 focuses on ideas for disseminating ethical training in statistics courses. Part 3 considers the ethical aspects of selecting measurement instruments and sample size planning and explores issues related to high-stakes testing, the defensibility of experimental vs. quasi-experimental research designs, and ethics in program evaluation. Decision points that shape a researcher's approach to data analysis are examined in Part 4: when and why analysts need to account for how the sample was selected, how to evaluate tradeoffs of hypothesis testing vs. estimation, and how to handle missing data. Ethical issues that arise when using techniques such as factor analysis or multilevel modeling and when making causal inferences are also explored. The book concludes with ethical aspects of reporting meta-analyses, of cross-disciplinary statistical reform, and of the publication process.
This Handbook appeals to researchers and practitioners in psychology, human development, family studies, health, education, sociology, social work, political science, and business/marketing. This book is also a valuable supplement for quantitative methods courses required of all graduate students in these fields.
This guide is a much-needed reference for clinicians on how to use the Rorschach Inkblot Test with senior adults, an essential tool for assessing personality functioning to better identify psychological interventions. The book integrates historical developments, current research, conceptual considerations, and therapeutic and diagnostic applications. Chapters review basic guidelines for the understanding and interpretation of Rorschach variables, including protocol validity; interpretation of structural variables, thematic imagery, and cross-cultural normative data; sequence analysis; and more. The authors then provide 10 case illustrations of how the Rorschach indices of cognitive functioning, emotional experience, interpersonal relatedness, and self-perception can facilitate differential diagnosis and treatment planning in clinical work with older people. These case illustrations are rooted in previously non-existent Rorschach reference data based on an international sample of more than 250 senior adults and a second sample of more than 200 patients with Alzheimer's disease. Clinicians will come away with a solid empirical basis for distinguishing between normal-range personality functioning and manifestations of psychological disorder in the elderly and for providing beneficial interventions to senior adult patients.
The advent of "Big Data" has brought with it a rapid diversification of data sources, requiring analysis that accounts for the fact that these data have often been generated and recorded for different reasons. Data integration involves combining data residing in different sources to enable statistical inference, or to generate new statistical data for purposes that cannot be served by each source on its own. This can yield significant gains for scientific as well as commercial investigations. However, valid analysis of such data should allow for the additional uncertainty due to entity ambiguity, whenever it is not possible to state with certainty that the integrated source is the target population of interest. Analysis of Integrated Data aims to provide a solid theoretical basis for this statistical analysis in three generic settings of entity ambiguity: statistical analysis of linked datasets that may contain linkage errors; datasets created by a data fusion process, where joint statistical information is simulated using the information in marginal data from non-overlapping sources; and estimation of target population size when target units are either partially or erroneously covered in each source. The book covers a range of topics under an overarching perspective of data integration; focuses on statistical uncertainty and inference issues arising from entity ambiguity; features state-of-the-art methods for the analysis of integrated data; and identifies the important themes that will define future research and teaching in the statistical analysis of integrated data. Analysis of Integrated Data is aimed primarily at researchers and methodologists interested in statistical methods for data from multiple sources, with a focus on data analysts in the social sciences, and in the public and private sectors.
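The population-size estimation setting mentioned above has a classical two-source special case, the Lincoln-Petersen capture-recapture estimator. A sketch with hypothetical counts:

```python
# Simplest two-source population-size estimate (Lincoln-Petersen):
# two incomplete lists covering the same target population
# (hypothetical counts).
n1 = 400   # units found on source 1
n2 = 300   # units found on source 2
m = 120    # units found on both sources (linked records)

# Assuming the two sources cover units independently, the overlap
# rate m/n2 estimates the coverage rate of source 1, so N ~ n1*n2/m.
n_hat = n1 * n2 / m
print(n_hat)  # estimated population size: 1000.0
```

The entity-ambiguity theme of the book enters exactly here: if some of the 120 links are erroneous, or coverage is not independent, this simple estimate is biased, and the uncertainty must be modelled.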
Nowadays, event history analysis can draw on a well-established set of statistical tools for the description and causal analysis of event history data. The second edition of Event History Analysis with Stata provides an updated introduction to event history modeling, along with many instructive Stata examples. Using the latest Stata software, each of these practical examples develops a research question, refers to useful substantive background information, gives a short exposition of the underlying statistical concepts, describes the organization of the input data and the application of the statistical Stata procedures, and assists the reader in performing a substantive interpretation of the obtained results. Emphasising the strengths and limitations of event history model techniques in each field of application, this book demonstrates that event history models provide a useful approach with which to uncover causal relationships or to map out a system of causal relations. It demonstrates how long-term processes can be studied and how changing context information on the micro, meso, and macro levels can be integrated easily into a dynamic analysis of longitudinal data. Event History Analysis with Stata is an invaluable resource for both novice students and researchers who need an introductory textbook and experienced researchers (from sociology, economics, political science, pedagogy, psychology, or demography) who are looking for a practical handbook for their research.
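The survival function at the heart of event history analysis can be estimated with the Kaplan-Meier product-limit method. The book itself works in Stata; the Python sketch below, with hypothetical durations, only illustrates the underlying idea:

```python
# Minimal Kaplan-Meier survival estimate (hypothetical durations).
# event = 1 means the event occurred; event = 0 means the spell is censored.
spells = [(2, 1), (3, 0), (4, 1), (4, 1), (5, 0), (7, 1)]

times = sorted({t for t, e in spells if e == 1})  # distinct event times
survival = 1.0
curve = {}
for t in times:
    at_risk = sum(1 for ti, _ in spells if ti >= t)            # still at risk
    events = sum(1 for ti, e in spells if ti == t and e == 1)  # events at t
    survival *= 1 - events / at_risk                           # product-limit step
    curve[t] = survival

print(curve)  # estimated probability of surviving past each event time
```

Censored spells (the 3 and 5 above) contribute to the risk sets without ever counting as events, which is how the estimator uses incomplete durations rather than discarding them.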
Quantitative Data Analysis for Language Assessment Volume I: Fundamental Techniques is a resource book that presents the most fundamental techniques of quantitative data analysis in the field of language assessment. Each chapter provides an accessible explanation of the selected technique, a review of language assessment studies that have used the technique, and finally, an example of an authentic study that uses the technique. Readers also get a taste of how to apply each technique through the help of supplementary online resources that include sample data sets and guided instructions. Language assessment students, test designers, and researchers should find this a unique reference as it consolidates theory and application of quantitative data analysis in language assessment.
Can the phenomena of the human mind be separated from the practices of spiritual formation, of growing to have the mind of Christ? Research into the nature of moral and spiritual change has revived in recent years in the worlds of psychology on one hand and theology and philosophy on the other. But psychology and spiritual formation draw upon distinct bodies of research and theory grounded in different methodologies, resulting in conversation that has suffered from a lack of interdisciplinary cross-pollination. Rooted in a year-long discussion held by Biola University's Center for Christian Thought (CCT), this volume bridges the gaps caused by professional specialization among psychology, theology, and philosophy. Each essay was forged out of an integrative discussion among theologians, psychologists, philosophers, New Testament scholars, educators, and pastors around the CCT seminar table. Topics that emerged included relational and developmental spirituality, moral virtue and judgment, and suffering and trauma. Psychology and Spiritual Formation in Dialogue speaks across disciplinary divides, fostering fruitful conversation for fresh insights into the nature and dynamics of personal spiritual change. Contributors include Justin L. Barrett, School of Psychology, Fuller Theological Seminary; Earl D. Bland, Rosemead School of Psychology, Biola University; Ellen T. Charry, Princeton Seminary; John H. Coe, Biola University; Robert A. Emmons, University of California, Davis; Stephen Evans, Baylor University; Bruce Hindmarsh, Regent College, Vancouver; Marie T. Hoffman, New York University; James M. Houston, Regent College, Vancouver; Steven J. Sandage, David R. Paine, and Jonathan Morgan, Boston University; Siang Yang Tan, School of Psychology, Fuller Theological Seminary; and Everett L. Worthington, Jr., Brandon J. Griffin, and Caroline R. Lavelock, Virginia Commonwealth University. Edited by Thomas M. Crisp, professor of philosophy, Biola University; Steve L. Porter, professor of theology, spiritual formation, and philosophy, Talbot School of Theology and Rosemead School of Psychology, Biola University; and Gregg Ten Elshof, professor of philosophy, Biola University. Christian Association for Psychological Studies (CAPS) Books explore how Christianity relates to mental health and behavioral sciences including psychology, counseling, social work, and marriage and family therapy in order to equip Christian clinicians to support the well-being of their clients.