This book explores the ways in which the spatio-temporal contingency of human life is being conceived in different fields of research. Specifically, it looks at the relationship between the situatedness of human life, the situation or place in which human life is supposed to be situated, and the dimensions of space and time in which both situation and place are usually themselves supposed to be situated. Over the last two or three decades, the spatio-temporal contingency of human life has become an important topic of research in a broad range of different disciplines including the social sciences, the cultural sciences, the cognitive sciences, and philosophy. However, this research topic is referred to in quite different ways: while some researchers refer to it in terms of "situation", emphasizing the "situatedness" of human experience and action, others refer to it in terms of "place", emphasizing the "power of place" and advocating a "topological" or "topographical turn" in the context of a larger "spatial turn". Interdisciplinary exchange is so far hampered by the fact that the notions referred to and the relationships between them are usually not sufficiently questioned. This book addresses these issues by bringing together contributions on the spatio-temporal contingency of human life from different fields of research.
This book critically examines the work of a number of pioneers of social psychology, including legendary figures such as Kurt Lewin, Leon Festinger, Muzafer Sherif, Solomon Asch, Stanley Milgram, and Philip Zimbardo. Augustine Brannigan argues that these psychologists' reliance on experimentation has led to questions about the validity and replication of their studies. The author explores new research and archival work relating to these studies and outlines a new approach to experimentation that repudiates the use of deception in human experiments and provides clues to how social psychology can re-articulate its premises and future lines of research. Based on the author's 2004 work The Rise and Fall of Social Psychology, in which he critiques the experimental methods used, the book advocates a return to qualitative methods to redeem the essential social dimensions of social psychology. Covering famous studies such as the Stanford Prison Experiment, Milgram's studies of obedience, Sherif's Robbers Cave, and Rosenhan's exposé of psychiatric institutions, this is essential and fascinating reading for students of social psychology and the social sciences. It is also of interest to academics and researchers interested in engaging with a critical approach to classical social psychology, with a view to changing the future of this important discipline.
Research Design for the Behavioral Sciences fills an important gap for the helping professions by offering a blueprint for advanced concepts and an applied approach to understanding quantitative, qualitative, and mixed methods research design. This graduate-level text seamlessly weaves together the philosophy, science, and practical application of the most common methodological frameworks in practice. Advanced research design concepts are presented through clear and in-depth blueprints, applied case studies, myriad examples, and helpful learning activities. Written in detailed yet accessible language, this text describes the foundations of behavioral science research. The authors explore research-based philosophical integration, along with the technical application of every tradition. Through this philosophical and pragmatic approach, students will be able to attain a well-rounded and comprehensive understanding of behavioral science research. This text provides students with the opportunity to reach a greater level of research efficacy through the inclusion of methodological procedures, data analysis methods, reliability/validity standards, ethics, and directions on how to increase the rigor of each approach to research. Instructor resources include an instructor's manual, learning activities, a test bank, and PowerPoints. Purchase includes digital access for use on most mobile devices and computers. Key features:
* Provides clear, detailed, and contextually accurate examples of writing quantitative, qualitative, and mixed methods procedures
* Reviews the paradigmatic hierarchy of each research tradition along with key analytic features in detail
* Delivers instructions for enhancing the methodological rigor of each approach
* Analyzes methodology-specific multicultural issues
* Demonstrates the application of a wide range of research methodologies with case studies
* Reviews the trends and history in research for counseling, psychology, social work, and marriage and family therapy
* Offers comprehensive instructor resources including a manual, learning activities, a test bank, and PowerPoint slides
Measures of Interobserver Agreement and Reliability, Second Edition covers important issues related to the design and analysis of reliability and agreement studies. It examines factors affecting the degree of measurement errors in reliability generalization studies and characteristics influencing the process of diagnosing each subject in a reliability study. The book also illustrates the importance of blinding and random selection of subjects. New to the Second Edition:
* A new chapter that describes various models for methods comparison studies
* A new chapter on the analysis of reproducibility using the within-subjects coefficient of variation
* Emphasis on the definition of the subjects' and raters' population as well as sample size determination
This edition continues to offer guidance on how to run sound reliability and agreement studies in clinical settings and other types of investigations. The author explores two ways of producing one pooled estimate of agreement from several centers: a fixed-effect approach and a random sample of centers using a simple meta-analytic approach. The text includes end-of-chapter exercises as well as downloadable resources of data sets and SAS code.
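To give a concrete feel for the agreement statistics this kind of book covers, here is a minimal Python sketch (not the book's SAS examples) computing Cohen's kappa for two raters and a within-subjects coefficient of variation for duplicate measurements; the ratings and measurements are invented.

```python
# Hedged sketch: illustrative agreement statistics on invented data,
# not the book's SAS examples.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Two raters classifying the same 10 subjects (categorical ratings).
rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.3f}")

# Within-subjects coefficient of variation for duplicate continuous
# measurements: sqrt(mean within-subject variance) / overall mean.
measurements = np.array([
    [10.2, 10.5],   # subject 1, two replicates
    [ 9.8,  9.6],
    [11.1, 10.9],
    [10.0, 10.4],
])
within_var = measurements.var(axis=1, ddof=1).mean()
wscv = np.sqrt(within_var) / measurements.mean()
print(f"Within-subjects CV: {wscv:.3%}")
```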
* Provides an up-to-date reference point for ethnographic research in healthcare * Outlines the major methodological approaches to health and well-being ethnography * Includes illustrative case studies of ethnographic research within the healthcare setting * Offers a holistic view of ethnography, taking a multi-disciplinary approach
Prevention and developmental sciences have many complementary goals and much to gain by collaboration. With random assignment to conditions and long-term multivariate follow-up of individuals across significant years in the life span, fundamental basic and applied research questions can now be addressed using new statistical methods. This special issue includes four empirical papers that used growth modeling techniques (hierarchical linear modeling, latent growth curve analyses) to examine direct and indirect effects of theory-based, longitudinal prevention experiments on developmental trajectories of children's and adolescents' substance use, delinquency, and school bonding.
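A minimal sketch of the kind of growth model described above, fitted as a random-intercept, random-slope mixed model with statsmodels on simulated data; the variable names and effect sizes are invented and this is not the special issue's own analysis.

```python
# Hedged sketch: a random-intercept, random-slope growth model on
# simulated longitudinal data (not the special issue's own analyses).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_children, n_waves = 200, 4
child = np.repeat(np.arange(n_children), n_waves)
time = np.tile(np.arange(n_waves), n_children)
treated = np.repeat(rng.integers(0, 2, n_children), n_waves)

# Individual growth trajectories: treatment flattens the slope.
intercepts = rng.normal(2.0, 0.5, n_children)[child]
slopes = rng.normal(0.6, 0.2, n_children)[child] - 0.3 * treated
substance_use = intercepts + slopes * time + rng.normal(0, 0.3, len(child))

df = pd.DataFrame(dict(child=child, time=time, treated=treated,
                       substance_use=substance_use))

# Hierarchical linear (mixed-effects) growth model with a
# treatment-by-time interaction on the slope.
model = smf.mixedlm("substance_use ~ time * treated", df,
                    groups=df["child"], re_formula="~time")
result = model.fit()
print(result.summary())
```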
Missing data affect nearly every discipline by complicating the statistical analysis of collected data. But since the 1990s, there have been important developments in the statistical methodology for handling missing data. Written by renowned statisticians in this area, Handbook of Missing Data Methodology presents many methodological advances and the latest applications of missing data methods in empirical research. Divided into six parts, the handbook begins by establishing notation and terminology. It reviews the general taxonomy of missing data mechanisms and their implications for analysis and offers a historical perspective on early methods for handling missing data. The following three parts cover various inference paradigms when data are missing, including likelihood and Bayesian methods; semi-parametric methods, with particular emphasis on inverse probability weighting; and multiple imputation methods. The next part of the book focuses on a range of approaches that assess the sensitivity of inferences to alternative, routinely non-verifiable assumptions about the missing data process. The final part discusses special topics, such as missing data in clinical trials and sample surveys as well as approaches to model diagnostics in the missing data setting. In each part, an introduction provides useful background material and an overview to set the stage for subsequent chapters. Covering both established and emerging methodologies for missing data, this book sets the scene for future research. It provides the framework for readers to delve into research and practical applications of missing data methods.
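As a small illustration of one member of the multiple-imputation family discussed in the handbook, the following sketch uses scikit-learn's IterativeImputer (chained equations) on invented data and pools a simple point estimate across imputations; full Rubin's-rules variance pooling is omitted.

```python
# Hedged sketch: multiple imputation by chained equations on invented
# data, pooling a mean estimate across imputations (Rubin's rules are
# simplified here to the point estimate only).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
X[:, 2] += 0.8 * X[:, 0]                 # make columns related
X[rng.random((100, 3)) < 0.2] = np.nan   # roughly 20% missing at random

estimates = []
for m in range(5):  # five imputed datasets
    imputer = IterativeImputer(sample_posterior=True, random_state=m)
    completed = imputer.fit_transform(X)
    estimates.append(completed[:, 2].mean())  # analysis of interest

print("Pooled estimate of the mean of column 3:", np.mean(estimates))
```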
R for Political Data Science: A Practical Guide is a handbook for political scientists new to R who want to learn the most useful and common ways to interpret and analyze political data. It was written by political scientists, thinking about the many real-world problems faced in their work. The book has 16 chapters and is organized in three sections. The first, on the use of R, is for users who are learning R or migrating from another software package. The second section, on econometric models, covers OLS, binary and survival models, panel data, and causal inference. The third section is a data science toolbox of some of the most useful tools in the discipline: data imputation, fuzzy merging of large datasets, web mining, quantitative text analysis, network analysis, mapping, spatial cluster analysis, and principal component analysis. Key features:
* Each chapter presents the most up-to-date and simplest option available for each task, assuming minimal prerequisites and no previous experience in R
* Makes extensive use of the Tidyverse, the group of packages that has revolutionized the use of R
* Provides a step-by-step guide that you can replicate using your own data
* Includes exercises in every chapter for course use or self-study
* Focuses on practical approaches to statistical inference rather than mathematical formulae
* Supplemented by an R package, including all data
As the title suggests, this book is highly applied in nature and is designed as a toolbox for the reader. It can be used in methods and data science courses at both the undergraduate and graduate levels. It will be equally useful to a university student pursuing a PhD, a political consultant, or a public official, all of whom need to transform their datasets into substantive and easily interpretable conclusions.
Survey Development: A Theory-Driven Mixed Methods Approach provides both an overview of standard methods and tools for developing and validating surveys and a conceptual basis for survey development. It advocates logical reasoning that combines theory of construct validity with theory of survey design, survey response, item review, and the identification of misfitting responses. The book has 14 chapters divided into four parts. Part A includes six chapters that deal with theory and methodology. Part B has five chapters and walks through the process of constructing the survey. Part C comprises two chapters devoted to assessing the quality, or psychometric properties (reliability and validity), of survey responses. Finally, the single chapter in Part D presents a synopsis of the previous chapters with regard to developing a survey and conducting survey research within the Theory-Driven Mixed Methods (TDMM) framework. This provides a full process for survey development intended to yield results that can support validity. A mixed methods approach integrates both qualitative and quantitative data outcomes. Including detailed online resources, this book is suitable for graduate students who use or are responsible for interpreting survey research and survey data, as well as survey methodologists and practitioners who use surveys in their field.
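For readers wanting a concrete starting point on the reliability side, here is a minimal sketch (not drawn from the book) of Cronbach's alpha, one common internal-consistency statistic for survey items, computed on invented scores.

```python
# Hedged sketch: Cronbach's alpha for a set of survey items on
# invented data (not an example from the book).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
trait = rng.normal(size=200)                            # latent construct
items = trait[:, None] + rng.normal(0, 0.8, (200, 5))   # 5 noisy items
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```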
This book combines the latest in sociology, psychology, and biology to present evidence-based research on what works in community and institutional corrections. It spans from the theoretical underpinnings of correctional counseling to concrete examples and tools necessary for professionals in the field, equipping readers to understand what we should do, why we should do it, and how to do it in practice. It discusses interviewing, interrogating, and theories of directive and nondirective counseling, including group counseling. It examines the strengths and weaknesses of various correctional approaches such as cognitive-behavioral therapies, group counseling, and therapeutic communities, and introduces ethical and legal considerations for correctional professionals. With an explanation of the presentence investigation report, case management, and appendices containing a variety of classification and assessment instruments, this volume provides practical, hands-on experience. Students of criminal justice, psychology, and social work will gain an understanding of the unique challenges to correctional success and practical applications of their studies. "This book is a teacher/student/practitioner's dream. Grounded in theory and evidence-based research on best practices, it is accessible, well-written, filled with sound insights and tools for working with criminal justice clients. I have used and loved each new edition of this fine text." - Dorothy S. McClellan, Texas A&M University-Corpus Christi
There is a recent surge in the use of randomized controlled trials (RCTs) within education globally, with disproportionate claims being made about what they show, 'what works', and what constitutes the best 'evidence'. Drawing on up-to-date scholarship from across the world, Taming Randomized Controlled Trials in Education critically addresses the increased use of RCTs in education, exploring their benefits, limits and cautions, and ultimately questioning the prominence given to them. While acknowledging that randomized controlled trials do have some place in education, the book nevertheless argues that this place should be limited. Drawing together all arguments for and against RCTs in a comprehensive and easily accessible single volume, the book also adds new perspectives and insights to the conversation; crucially, the book considers the limits of their usefulness and applicability in education, raising a range of largely unexplored concerns about their use. Chapters include discussions on:
* The impact of complexity theory and chaos theory
* Design issues and sampling in randomized controlled trials
* Learning from clinical trials
* Data analysis in randomized controlled trials
* Reporting, evaluating and generalizing from randomized controlled trials
Considering key issues in understanding and interrogating research evidence, this book is ideal reading for all students on Research Methods modules, as well as those interested in undertaking and reviewing research in the field of education.
Bayesian Demographic Estimation and Forecasting presents three statistical frameworks for modern demographic estimation and forecasting. The frameworks draw on recent advances in statistical methodology to provide new tools for tackling challenges such as disaggregation, measurement error, missing data, and combining multiple data sources. The methods apply to single demographic series, or to entire demographic systems. The methods unify estimation and forecasting, and yield detailed measures of uncertainty. The book assumes minimal knowledge of statistics, and no previous knowledge of demography. The authors have developed a set of R packages implementing the methods. Data and code for all applications in the book are available on www.bdef-book.com. "This book will be welcome for the scientific community of forecasters...as it presents a new approach which has already given important results and which, in my opinion, will increase its importance in the future." - Daniel Courgeau, Institut national d'études démographiques
This is an essential how-to guide on the application of structural equation modeling (SEM) techniques with the AMOS software, focusing on the practical applications of both simple and advanced topics. Written in an easy-to-understand conversational style, the book covers everything from data collection and screening to confirmatory factor analysis, structural model analysis, mediation, moderation, and more advanced topics such as mixture modeling, censored data, and non-recursive models. Through step-by-step instructions, screenshots, and suggested guidelines for reporting, Collier cuts through abstract definitional perspectives to give insight on how to actually run an analysis. Unlike other SEM books, the examples often start in SPSS and then transition to AMOS so that the reader can have full confidence in running the analysis from beginning to end. Best practices are also included on topics such as how to determine whether your SEM model is formative or reflective, making the book not just an explanation of SEM topics but a guide for researchers on how to develop a strong methodology while studying their respective phenomenon of interest. With a focus on practical applications of both basic and advanced topics, and with detailed worked examples throughout, this book is ideal for experienced researchers and beginners across the behavioral and social sciences.
The number of innovative applications of randomization tests in various fields and recent developments in experimental design, significance testing, computing facilities, and randomization test algorithms have necessitated a new edition of Randomization Tests. Updated, reorganized, and revised, the text emphasizes the irrelevance and implausibility of the random sampling assumption for the typical experiment in three completely rewritten chapters. It also discusses factorial designs and interactions and combines repeated-measures and randomized block designs in one chapter. The authors focus more attention on the practicality of N-of-1 randomization tests and the availability of user-friendly software to perform them. In addition, they provide an overview of free and commercial computer programs for all of the tests presented in the book. Building on the previous editions that have served as standard textbooks for more than twenty-five years, Randomization Tests, Fourth Edition includes downloadable resources of up-to-date randomization test programs that facilitate application of the tests to experimental data. These resources enable students to work out problems that have been added to the chapters and help professors teach the basics of randomization tests and devise tasks for assignments and examinations.
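A minimal sketch of a two-sample randomization test on invented data; it is not one of the programs supplied with the book, but it shows the basic shuffle-and-compare logic.

```python
# Hedged sketch: a two-group randomization (permutation) test of a
# difference in means on invented data, not one of the book's programs.
import numpy as np

rng = np.random.default_rng(3)
treatment = rng.normal(1.0, 1.0, 12)
control = rng.normal(0.3, 1.0, 12)
observed = treatment.mean() - control.mean()

pooled = np.concatenate([treatment, control])
n_perm, count = 10_000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = pooled[:12].mean() - pooled[12:].mean()
    if abs(diff) >= abs(observed):
        count += 1

print(f"Observed difference: {observed:.2f}")
print(f"Two-sided randomization p-value: {count / n_perm:.4f}")
```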
In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches. The book begins with four concrete examples of CPCA that provide readers with a basic understanding of the technique and its applications. It gives a detailed account of two key mathematical ideas in CPCA: projection and singular value decomposition. The author then describes the basic data requirements, models, and analytical tools for CPCA and their immediate extensions. He also introduces techniques that are special cases of or closely related to CPCA and discusses several topics relevant to practical uses of CPCA. The book concludes with a technique that imposes different constraints on different dimensions (DCDD), along with its analytical extensions. MATLAB programs for CPCA and DCDD as well as data to create the book's examples are available on the author's website.
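The projection-plus-SVD idea at the heart of CPCA can be sketched in a few lines; the following NumPy example uses invented data and an invented external-information matrix, and is not one of the MATLAB programs from the author's website.

```python
# Hedged sketch of the projection + SVD idea behind constrained PCA:
# split the data into a part explained by external information G and a
# residual, then apply SVD/PCA to each part. Invented data; not the
# author's MATLAB programs.
import numpy as np

rng = np.random.default_rng(4)
Z = rng.normal(size=(50, 6))          # data matrix (centered below)
G = rng.normal(size=(50, 2))          # external information on rows
Z = Z - Z.mean(axis=0)

# Orthogonal projector onto the column space of G.
P = G @ np.linalg.pinv(G.T @ G) @ G.T
explained = P @ Z                     # structured part
residual = Z - explained              # unstructured part

# Principal components of each part via SVD.
for name, part in [("explained by G", explained), ("residual", residual)]:
    s = np.linalg.svd(part, compute_uv=False)
    share = s[:2] ** 2 / (s ** 2).sum()
    print(f"Share of variance in first two components ({name}): {share.round(2)}")
```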
Develop a Deep Understanding of the Statistical Issues of APC Analysis
Age-Period-Cohort Models: Approaches and Analyses with Aggregate Data presents an introduction to the problems and strategies for modeling age, period, and cohort (APC) effects for aggregate-level data. These strategies include constrained estimation, the use of age and/or period and/or cohort characteristics, estimable functions, variance decomposition, and a new technique called the s-constraint approach.
See How Common Methods Are Related to Each Other
After a general and wide-ranging introductory chapter, the book explains the identification problem from algebraic and geometric perspectives and discusses constrained regression. It then covers important strategies that provide information that does not directly depend on the constraints used to identify the APC model. The final chapter presents a specific empirical example showing that a combination of the approaches can make a compelling case for particular APC effects.
Get Answers to Questions about the Relationships of Ages, Periods, and Cohorts to Important Substantive Variables
This book incorporates several APC approaches into one resource, emphasizing both their geometry and algebra. This integrated presentation helps researchers effectively judge the strengths and weaknesses of the methods, which should lead to better future research and better interpretation of existing research.
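The identification problem that motivates these strategies comes from the exact relation cohort = period - age. The toy sketch below (an invented dummy-coded design, not the book's data) shows the resulting rank deficiency.

```python
# Hedged sketch: the APC identification problem as rank deficiency.
# With cohort = period - age, the three effects cannot be separated
# without a constraint. Invented toy design, not the book's data.
import numpy as np

ages = np.arange(3)
periods = np.arange(3)
rows = [(a, p, p - a) for a in ages for p in periods]

def dummies(values, levels):
    """Reference-coded dummy columns (first level dropped)."""
    return np.array([[v == lev for lev in levels[1:]] for v in values], float)

A = dummies([r[0] for r in rows], list(ages))
P = dummies([r[1] for r in rows], list(periods))
C = dummies([r[2] for r in rows], sorted({r[2] for r in rows}))
X = np.column_stack([np.ones(len(rows)), A, P, C])

print("columns:", X.shape[1], "rank:", np.linalg.matrix_rank(X))
# rank < columns: one linear constraint is needed to identify the model.
```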
Multi-State Survival Models for Interval-Censored Data introduces methods to describe stochastic processes that consist of transitions between states over time. It is targeted at researchers in medical statistics, epidemiology, demography, and social statistics. One of the applications in the book is a three-state process for dementia and survival in the older population. This process is described by an illness-death model with a dementia-free state, a dementia state, and a dead state. Statistical modelling of a multi-state process can investigate potential associations between the risk of moving to the next state and variables such as age, gender, or education. A model can also be used to predict the multi-state process. The methods are for longitudinal data subject to interval censoring. Depending on the definition of a state, it is possible that the time of the transition into a state is not observed exactly. However, when longitudinal data are available the transition time may be known to lie in the time interval defined by two successive observations. Such an interval-censored observation scheme can be taken into account in the statistical inference. Multi-state modelling is an elegant combination of statistical inference and the theory of stochastic processes. Multi-State Survival Models for Interval-Censored Data shows that the statistical modelling is versatile and allows for a wide range of applications.
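To give a feel for the kind of process being modelled, the sketch below simulates a three-state illness-death process in discrete time with invented transition probabilities; the book itself works with continuous-time transition intensities and interval-censored observation, which this toy example does not capture.

```python
# Hedged sketch: simulate a three-state illness-death process
# (0 = dementia-free, 1 = dementia, 2 = dead) in discrete time with
# invented transition probabilities.
import numpy as np

# Rows: current state; columns: probability of each next state per step.
transition = np.array([
    [0.90, 0.07, 0.03],   # from dementia-free
    [0.00, 0.85, 0.15],   # from dementia (no recovery assumed)
    [0.00, 0.00, 1.00],   # dead is absorbing
])

rng = np.random.default_rng(5)
n_people, n_steps = 1000, 20
states = np.zeros(n_people, dtype=int)
for _ in range(n_steps):
    states = np.array([rng.choice(3, p=transition[s]) for s in states])

print("After 20 steps:",
      {"dementia-free": int((states == 0).sum()),
       "dementia": int((states == 1).sum()),
       "dead": int((states == 2).sum())})
```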
This book offers a refreshing new approach to mental health by showing how 'mental health' behaviours, lived experiences, and our interventions arise from our social worlds and not from our neurophysiology gone wrong. It is part of a trilogy which offers a new way of doing psychology, one that focuses on people's social and societal environments as determining their behaviour rather than on internal and individualistic attributions. 'Mental health' behaviours are carefully analysed as ordinary behaviours which have become exaggerated and chronic because of the bad life situations people are forced to endure, especially as children. This shifts mental health treatments away from the dominance of psychology and psychiatry to show that social action is needed, because many of these bad life situations are produced by our modern society itself. By providing new ways for readers to rethink everything they thought they knew about mental health issues and how to change them, Bernard Guerin also explores how, by changing our environmental contexts (our local, societal, and discursive worlds), we can improve mental health interventions. The book reframes 'mental health' in a much wider social context to show how societal structures restrict our opportunities and pathways and so produce bad life situations, and how we can also learn from those who manage to deal with the very same bad life situations through crime, bullying, exploitation, and dropping out of mainstream society rather than through 'mental health' behaviours. By merging psychology and psychiatry into the social sciences, Guerin seeks to better understand how humans operate in their social, cultural, economic, patriarchal, discursive, and societal worlds, rather than being isolated inside their heads with a 'faulty brain'. This will provide fascinating reading for academics and students in psychology and the social sciences, and for counsellors and therapists.
* Explores the overlap/parallels between the work of clinicians and qualitative researchers. * Suggests how postmodern therapeutic approaches can be integrated into methods of data collection and analysis. * Examines a range of postmodern therapies, including collaborative language systems, narrative therapy, and solution-focused brief therapy. * Offers an innovative and unique way of enhancing the skills of the qualitative researcher, with an emphasis on reflective practice.
"Describes recent developments and surveys important topics in the areas of multivariate analysis, design of experiments, and survey sampling. Features the work of nearly 50 international leaders."
Confidence Intervals for Proportions and Related Measures of Effect Size illustrates the use of effect size measures and corresponding confidence intervals as more informative alternatives to the most basic and widely used significance tests. The book provides you with a deep understanding of what happens when these statistical methods are applied in situations far removed from the familiar Gaussian case. Drawing on his extensive work as a statistician and professor at Cardiff University School of Medicine, the author brings together methods for calculating confidence intervals for proportions and several other important measures, including differences, ratios, and nonparametric effect size measures generalizing the Mann-Whitney and Wilcoxon tests. He also explains three important approaches to obtaining intervals for related measures. Many examples illustrate the application of the methods in the health and social sciences. Requiring little computational skill, the book offers user-friendly Excel spreadsheets for download at www.crcpress.com, enabling you to easily apply the methods to your own empirical data.
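As a small illustration of the kind of interval the book covers, here is a sketch of the Wilson score interval for a single proportion, written in Python rather than the book's Excel spreadsheets, with invented counts.

```python
# Hedged sketch: the Wilson score interval for a single proportion,
# computed in Python rather than the book's Excel spreadsheets.
from math import sqrt
from scipy.stats import norm

def wilson_interval(successes: int, n: int, conf: float = 0.95):
    z = norm.ppf(1 - (1 - conf) / 2)
    p = successes / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / (1 + z**2 / n)
    return centre - half, centre + half

lo, hi = wilson_interval(successes=12, n=40)
print(f"95% Wilson interval for 12/40: ({lo:.3f}, {hi:.3f})")
```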
This book examines a basic problem in critical approaches to political and social inquiry: in what way is social inquiry animated by a practical intent? This practical intent is not external to inquiry as an add-on or a choice by the inquirer, but is inherent to the process of inquiry. The practical intent in inquiry derives from the connection between social inquiry and the participant's perspective. The social inquirer, in order to grasp the sense of those who are the subject of inquiry, has to adopt the perspective of the participant in the social world. Caterino opposes the view that research is an autonomous activity distinct from or superior to a participant's perspective. He argues that since the inquirer is on the same level as the participant, all inquiry should be considered mutual critique in which those who are addressed by inquiry have an equal right and an equal capacity to criticize addressors.