In this collection, international contributors come together to discuss how qualitative and quantitative methods can be used in psychotherapy research. The book considers the advantages and disadvantages of each approach, and recognises how each method can enhance our understanding of psychotherapy. Divided into two parts, the book begins with an examination of quantitative research and discusses how we can transfer observations into numbers and statistical findings. Chapters on quantitative methods cover the development of new findings and the improvement of existing findings, identifying and analysing change, and using meta-analysis. The second half of the book comprises chapters considering how qualitative and mixed methods can be used in psychotherapy research. Chapters on qualitative and mixed methods identify various ways to strengthen the trustworthiness of qualitative findings via rigorous data collection and analysis techniques. Adapted from a special issue of Psychotherapy Research, this volume will be key reading for researchers, academics, and professionals who want a greater understanding of how a particular area of research methods can be used in psychotherapy.
This is the first book to demonstrate the application of power analysis to the newer, more advanced statistical techniques that are increasingly used in the social and behavioral sciences. Both basic and advanced designs are covered. Readers are shown how to apply power analysis to techniques such as hierarchical linear modeling, meta-analysis, and structural equation modeling. Each chapter opens with a review of the statistical procedure and then proceeds to derive the power functions. This is followed by examples that demonstrate how to produce power tables and charts. The book clearly shows how to calculate power by providing open code for every design and procedure in R, SAS, and SPSS. Readers can verify the power computation using the computer programs on the book's website. There is a growing requirement to include power analysis to justify sample sizes in grant proposals, and this book will help readers do just that. Most chapters are self-standing and can be read in any order without much disruption. Sample computer code in R, SPSS, and SAS at www.routledge.com/9781848729810 is written to tabulate power values and produce power curves that can be included in a grant proposal. Organized according to various techniques, Chapters 1-3 introduce the basics of statistical power and sample size issues, including the historical origin, hypothesis testing, and the use of statistical power in t tests and confidence intervals. Chapters 4-6 cover common statistical procedures--analysis of variance, linear regression (both simple regression and multiple regression), correlation, analysis of covariance, and multivariate analysis. Chapters 7-11 review the newer statistical procedures--multi-level models, meta-analysis, structural equation models, and longitudinal studies. The appendixes contain a tutorial on R and the statistical theory of power analysis.
Intended as a supplement for graduate courses on quantitative methods, multivariate statistics, hierarchical linear modeling (HLM) and/or multilevel modeling and SEM taught in psychology, education, human development, nursing, and social and life sciences, this is the first text on statistical power for advanced procedures. Researchers and practitioners in these fields also appreciate the book's unique coverage of the use of statistical power analysis to determine sample size in planning a study. A prerequisite of basic through multivariate statistics is assumed.
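The core calculation behind such sample-size justifications can be illustrated outside the book's R, SAS, and SPSS programs. The following is a minimal, hypothetical Python sketch using the normal approximation to the two-sided, two-sample t test; real power software (including the book's code) typically uses the noncentral t distribution, so exact answers differ slightly.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

Z_975 = 1.959964  # critical value for a two-sided test at alpha = 0.05

def approx_power(effect_size: float, n_per_group: int) -> float:
    """Approximate power of a two-sided, two-sample t test with equal
    group sizes (normal approximation; effect_size is Cohen's d)."""
    noncentrality = effect_size * math.sqrt(n_per_group / 2.0)
    return normal_cdf(noncentrality - Z_975)

def min_n_per_group(effect_size: float, target_power: float = 0.80) -> int:
    """Smallest per-group n whose approximate power reaches the target."""
    n = 2
    while approx_power(effect_size, n) < target_power:
        n += 1
    return n

# Under this approximation, a medium effect (d = 0.5) needs about
# 63 subjects per group for 80% power at alpha = 0.05.
print(min_n_per_group(0.5))
```

Tabulating `approx_power` over a grid of effect sizes and sample sizes yields exactly the kind of power table or curve the blurb describes including in a grant proposal.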
This book addresses issues related to researching sensitive topics in social work, focusing on marginalized, vulnerable, and hard-to-reach people. It covers the definition, characteristics, challenges, and opportunities of sensitive research; its philosophical roots and methodological debates; and the skills and values required, along with the ethical, political, and legal issues involved in conducting social work research. It also covers innovative research methods appropriate for research on sensitive topics involving vulnerable people, shining light on how to use traditional research methods sensitively and how to generate data while minimizing the harm that can potentially be caused to research participants and researchers.
This book reviews the latest techniques in exploratory data mining (EDM) for the analysis of data in the social and behavioral sciences to help researchers assess the predictive value of different combinations of variables in large data sets. Methodological findings and conceptual models that explain reliable EDM techniques for predicting and understanding various risk mechanisms are integrated throughout. Numerous examples illustrate the use of these techniques in practice. Contributors provide insight through hands-on experiences with their own use of EDM techniques in various settings. Readers are also introduced to the most popular EDM software programs. A related website at http://mephisto.unige.ch/pub/edm-book-supplement/ offers color versions of the book's figures, a supplemental paper to chapter 3, and R commands for some chapters. The results of EDM analyses can be perilous - they are often taken as predictions with little regard for cross-validating the results. This carelessness can be catastrophic in terms of money lost or patients misdiagnosed. This book addresses these concerns and advocates for the development of checks and balances for EDM analyses. Both the promises and the perils of EDM are addressed. Editors McArdle and Ritschard taught the "Exploratory Data Mining" Advanced Training Institute of the American Psychological Association (APA). All contributors are top researchers from the US and Europe. Organized into two parts--methodology and applications--the techniques covered include decision, regression, and SEM tree models, growth mixture modeling, and time-based categorical sequential analysis.
Some of the applications of EDM (and the corresponding data) explored include: selection to college based on risky prior academic profiles; the decline of cognitive abilities in older persons; global perceptions of stress in adulthood; predicting mortality from demographics and cognitive abilities; and risk factors during pregnancy and their impact on neonatal development. Intended as a reference for researchers, methodologists, and advanced students in the social and behavioral sciences, including psychology, sociology, business, econometrics, and medicine, who are interested in learning to apply the latest exploratory data mining techniques. Prerequisites include a basic class in statistics.
This is the first book of its kind to include the personal accounts of people who have survived injury to the brain, along with professional therapists' reports of their progress through rehabilitation. The paintings and stories of survivors combine with experts' discussions of the theory and practice of brain injury rehabilitation to illustrate the ups and downs that survivors encounter in their journey from pre-injury status to insult and post-injury rehabilitation. Wilson, Winegardner and Ashworth's focus on the survivors' perspective shows how rehabilitation is an interactive process between people with brain injury, health care staff, and others, and gives the survivors the chance to tell their own stories of life before their injury, the nature of the insult, their early treatment, and subsequent rehabilitation. Presenting practical approaches to help survivors of brain injury achieve functionally relevant and meaningful goals, Life After Brain Injury: Survivors' Stories will help all those working in rehabilitation understand the principles involved in holistic brain injury rehabilitation and how these principles, combined with theory and models, translate into clinical practice. This book will be of great interest to anyone who wishes to extend their knowledge of the latest theories and practices involved in making life more manageable for people who have suffered damage to the brain. Life After Brain Injury: Survivors' Stories will also be essential for clinical psychologists, neuropsychologists, and anybody dealing with acquired brain injury whether they be a survivor of a brain injury themselves, a relative, a friend or a carer.
Designed for a graduate course in applied statistics, Nonparametric Methods in Statistics with SAS Applications teaches students how to apply nonparametric techniques to statistical data. It starts with the tests of hypotheses and moves on to regression modeling, time-to-event analysis, density estimation, and resampling methods. The text begins with classical nonparametric hypothesis testing, including the sign, Wilcoxon sign-rank and rank-sum, Ansari-Bradley, Kolmogorov-Smirnov, Friedman rank, Kruskal-Wallis H, Spearman rank correlation coefficient, and Fisher exact tests. It then discusses smoothing techniques (loess and thin-plate splines) for classical nonparametric regression as well as binary logistic and Poisson models. The author also describes time-to-event nonparametric estimation methods, such as the Kaplan-Meier survival curve and Cox proportional hazards model, and presents histogram and kernel density estimation methods. The book concludes with the basics of jackknife and bootstrap interval estimation. Drawing on data sets from the author's many consulting projects, this classroom-tested book includes various examples from psychology, education, clinical trials, and other areas. It also presents a set of exercises at the end of each chapter. All examples and exercises require the use of SAS 9.3 software. Complete SAS code for all examples is given in the text. Large data sets for the exercises are available on the author's website.
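To give a flavor of the classical tests such a course opens with, here is a minimal sign test for paired differences in pure Python (an illustrative sketch, not the book's SAS code), using exact binomial tail probabilities:

```python
import math

def sign_test(differences):
    """Two-sided sign test for paired differences.
    Under H0 (median difference zero), the number of positive
    differences among the nonzero ones is Binomial(n, 0.5)."""
    nonzero = [d for d in differences if d != 0]
    n = len(nonzero)
    k = sum(1 for d in nonzero if d > 0)

    # Exact binomial tail probabilities at p = 0.5 via math.comb.
    def tail_ge(j):
        return sum(math.comb(n, i) for i in range(j, n + 1)) / 2 ** n

    def tail_le(j):
        return sum(math.comb(n, i) for i in range(0, j + 1)) / 2 ** n

    # Two-sided p-value: double the smaller tail, capped at 1.
    return min(1.0, 2.0 * min(tail_ge(k), tail_le(k)))

# 8 of 10 nonzero differences are positive: p = 2 * (56/1024) = 0.109375
print(sign_test([1, 2, -1, 3, 4, 2, 5, -2, 1, 1]))
```

In SAS this corresponds to the sign test reported by PROC UNIVARIATE; the point of the sketch is only that the test reduces to binomial tail arithmetic.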
This edited volume features cutting-edge topics from the leading researchers in the areas of latent variable modeling. Content highlights include coverage of approaches dealing with missing values, semi-parametric estimation, robust analysis, hierarchical data, factor scores, multi-group analysis, and model testing. New methodological topics are illustrated with real applications. The material presented brings together two traditions: psychometrics and structural equation modeling. Latent Variable and Latent Structure Models' thought-provoking chapters from the leading researchers in the area will help to stimulate ideas for further research for many years to come. This volume will be of interest to researchers and practitioners from a wide variety of disciplines, including biology, business, economics, education, medicine, psychology, sociology, and other social and behavioral sciences. A working knowledge of basic multivariate statistics and measurement theory is assumed.
Single Case Research in Schools addresses and examines the variety of cutting-edge issues in single case research (SCR) in educational settings. Featuring simple and practical techniques for aggregating data for evidence-based practices, the book delves into methods of selecting behaviors of interest and measuring them reliably. The latter part of Single Case Research in Schools is devoted to a step-by-step model of using SCR to evaluate practices in schools. This includes considerations such as measurement, data collection, length of phases, design considerations, calculating effect size, and reliability of measures.
Built around a problem solving theme, this book extends the intermediate and advanced student's expertise to more challenging situations that involve applying statistical methods to real-world problems. Data relevant to these problems are collected and analyzed to provide useful answers. Building on its central problem-solving theme, a large number of data sets arising from real problems are contained in the text and in the exercises provided at the end of each chapter. Answers, or hints to providing answers, are provided in an appendix. Concentrating largely on the established SPSS and the newer S-Plus statistical packages, the author provides a short, end-of-chapter section entitled Computer Hints that helps the student undertake the analyses reported in the chapter using these statistical packages.
This volume presents the first wide-ranging critical review of validity generalization (VG)--a method that has dominated the field since the publication of Schmidt and Hunter's (1977) paper "Development of a General Solution to the Problem of Validity Generalization." This paper and the work that followed had a profound impact on the science and practice of applied psychology. The research suggests that fundamental relationships among tests and criteria, and the constructs they represent are simpler and more regular than they appear. Looking at the history of the VG model and its impact on personnel psychology, top scholars and leading researchers of the field review the accomplishments of the model, as well as the continuing controversies. Several chapters significantly extend the maximum likelihood estimation with existing models for meta analysis and VG. Reviewing 25 years of progress in the field, this volume shows how the model can be extended and applied to new problems and domains. This book will be important to researchers and graduate students in the areas of industrial organizational psychology and statistics.
The study of intuition and its relation to thoughtful reasoning is a burgeoning research topic in psychology and beyond. While the area has the potential to radically transform our conception of the mind and decision making, the procedures used for establishing empirical conclusions have often been vaguely formulated and obscure. This book fills a gap in the field by providing a range of methods for exploring intuition experimentally and thereby enhancing the collection of new data. The book begins by summarizing current challenges in the study of intuition and gives a new foundation for intuition research. Going beyond classical dual-process models, a new scheme is introduced to classify the different types of processes usually collected under the label of intuition. These new classifications range from learning approaches to complex cue integration models. The book then goes on to describe the wide variety of behavioural methods available to investigate these processes, including information search tracing, think aloud protocols, maximum likelihood methods, eye-tracking, and physiological and non-physiological measures of affective responses. It also discusses paradigms to investigate implicit associations and causal intuitions, video-based approaches to expert research, methods to induce specific decision modes as well as questionnaires to assess individual preferences for intuition or deliberation. By uniquely providing the basis for exploring intuition by introducing the different methods and their applications in a step-by-step manner, this text is an invaluable reference for individual research projects. It is also very useful as a course book for advanced decision making courses, and could inspire experimental explorations of intuition in psychology, behavioural economics, empirical legal studies and clinical decision making.
Age-Period-Cohort Analysis: New Models, Methods, and Empirical Applications is based on a decade of the authors' collaborative work in age-period-cohort (APC) analysis. Within a single, consistent HAPC-GLMM statistical modeling framework, the authors synthesize APC models and methods for three research designs: age-by-time period tables of population rates or proportions, repeated cross-section sample surveys, and accelerated longitudinal panel studies. The authors show how the empirical application of the models to various problems leads to many fascinating findings on how outcome variables develop along the age, period, and cohort dimensions. The book makes two essential contributions to quantitative studies of time-related change. Through the introduction of the GLMM framework, it shows how innovative estimation methods and new model specifications can be used to tackle the "model identification problem" that has hampered the development and empirical application of APC analysis. The book also addresses the major criticism against APC analysis by explaining the use of new models within the GLMM framework to uncover mechanisms underlying age patterns and temporal trends. Encompassing both methodological expositions and empirical studies, this book explores the ways in which statistical models, methods, and research designs can be used to open new possibilities for APC analysis. It compares new and existing models and methods and provides useful guidelines on how to conduct APC analysis. For empirical illustrations, the text incorporates examples from a variety of disciplines, such as sociology, demography, and epidemiology. Along with details on empirical analyses, software and programs to estimate the models are available on the book's web page.
Psychologists are under increasing pressure to demonstrate the ecological validity of their assessment procedures--to show that the recommendations concluding their evaluations are relevant to urgent concerns in the legal and social policy arenas, such as predicting dangerousness, awarding compensation, and choosing a custodial parent. How much damage does a referred patient have? Who or what "caused" the damage? What impact will it have on his or her future life, work, and family? And what can be done to remediate the damage? The purpose of this book is to provide sound objective methods for answering these questions. It integrates the knowledge of experienced practitioners who offer state-of-the-art summaries of the best current approaches to evaluating difficult cases with that of basic theorists who describe emerging methods in both predictive and inferential statistics, such as Bayesian networks, that have proven their value in other scientific fields. Arguably, the enterprise of psychological assessment is so interdependent with that of data analysis that attempts to make inferences without consideration of statistical implications is malpractice. Prediction in Forensic and Neuropsychology: Sound Statistical Practices clarifies the process of hypothesis testing and helps to push the clinical interpretation of psychological data into the 21st century. It constitutes a vital resource for all the stakeholders in the assessment process--practitioners, researchers, attorneys, and policymakers.
Notwithstanding the mythical demise of "introspection," self-observation has always been an integral aspect of the social sciences. In the century following the "behavioral revolution," psychology has seen a reduction not so much in the frequency as in the rigor with which self-observation is practiced. A great deal of self-observation has been renamed or obscured (as, for example, "self-report"), but this has served only to defer and impoverish important theoretical and technical work. This volume, which contributes to the development of a rigorous theory of self-observation, is organized around three general objectives: to re-animate a discourse on self-observation through a historical analysis of various self-observation traditions; to outline and begin to address some of the unique theoretical challenges of self-observation; and to elaborate some of the technical and practical details necessary for realizing a program of research dedicated to self-observation. In the first section of the book, three historians of psychology trace the evolution of self-observation. In the second, three scholars who are currently working in contemporary traditions of self-observation discuss the basic theoretical and practical challenges involved in conducting self-observation research. In the final two sections of the book, scholars from the phenomenological and narrative traditions trace the history, theory, and practice of self-observation in their respective traditions. Self-Observation in the Social Sciences continues the fine tradition set by Transaction's History and Theory of Psychology series edited by Jaan Valsiner. It is of interest to psychologists and to those who study methodology within the social sciences.
This new edited volume features contributions from many of the leading scientists in probability and statistics from the latter part of the 20th century. It is the only book to assemble the views of these leading scientists--the pioneers in their respective fields. Stochastic Musings features contributions by: *Sir David Cox on statistics and econometrics; *C.R. Rao, M.B. Rao, and D.N. Shanbhag on convex sets of multivariate distributions and their extreme points; *Bradley Efron on the future of statistics; *David Freedman on regression, association, and causation; *Vic Barnett on sample ordering for effective statistical inference with particular reference to environmental issues; *David Bartholomew on a unified statistical approach to some measurement problems in the social sciences; *Joe Gani on scanning a lattice for a particular pattern; *Leslie Kish on new paradigms for probability sampling (his last paper); *Samuel Kotz and Norman L. Johnson on limit distributions of uncorrelated but dependent distributions on the unit square; *Samuel Kotz and Saralees Nadarajah on some new elliptical distributions; *Jef Teugels on the life span of a renewal; *Wolfgang Urfer and Katharina Emrich on maximum likelihood estimates of genetic effects; and *Vladimir M. Zolotarev on convergence rate estimates in functional limit theorems. The volume also includes the following contributions by faculty members of the Department of Statistics, Athens University of Economics and Business: *J. Panaretos, E. Xekalaki, and S. Psarakis on a predictive model evaluation and selection approach--the correlated gamma ratio distribution; *J. Panaretos and Z. Tsourti on extreme value index estimators and smoothing alternatives; *E. Xekalaki and D. Karlis on mixtures everywhere; and *Ir. Moustaki on latent variable models with covariates.
Stochastic Musings will appeal to researchers, professionals, and students interested in the history and development of statistics and probability as well as in related areas, such as physics, biometry, economics, and mathematics. Academic and professional statisticians will benefit from the book's coverage of the latest developments in the field, as well as reflections on the future directions of the discipline.
Horrified by the Holocaust, social psychologist Stanley Milgram wondered if he could recreate the Holocaust in the laboratory setting. His (in)famous results have continued to intrigue scholars for more than half a century. Based on unpublished archival data from Milgram's personal collection, volume one of this two-volume set offers a behind-the-scenes account of how, during his unpublished pilot studies, Milgram invented his official experimental procedure step by step: how he gradually learnt to transform most ordinary people into willing inflictors of harm. Volume two then illustrates how certain innovators within the Nazi regime used the very same Milgram-like learning techniques that, with increasing effectiveness, gradually enabled them to transform most ordinary people into increasingly capable executioners of other men, women, and children. Volume two attempts to capture how, step by step, these Nazi innovators sought to transform the Fuhrer's wish of a Jewish-free Europe into a frightening reality. By the end of both volumes the reader will gain an insight into how the seemingly undoable can become increasingly doable.
This book describes a series of ground-breaking residential workshops in therapeutic counselling in the 1960s, for people working in mental health and social care disciplines seeking to expand and deepen their reach. The work is unique in the scope of its research into the process and outcomes of such active immersive enquiry in this area. Besides a wealth of more systematic features, the author invites us into the initial conversations in the meeting room, and then follows the group members back into their lives, allowing us to see both early outcomes and the impact of participation up to ten years later. Finally, Barrett-Lennard reflects on the extended history of the intensive workshops and the related group work in other contexts they led into. He makes a compelling argument that such an intensive participatory process is as powerful today as it was in the 1960s. The blend of rich qualitative and empirical data and theory is a unique strength. It will be a great resource for students and scholars in applied psychology and psychotherapy, as well as for practicing therapists and trainees committed to meaningful work with their client groups.
Research today demands the application of sophisticated and powerful research tools. Fulfilling this need, The Oxford Handbook of Quantitative Methods in Psychology is the complete tool box to deliver the most valid and generalizable answers to today's complex research questions. It is a one-stop source for learning and reviewing current best-practices in quantitative methods as practiced in the social, behavioral, and educational sciences. Comprising two volumes, this handbook covers a wealth of topics related to quantitative research methods. It begins with essential philosophical and ethical issues related to science and quantitative research. It then addresses core measurement topics before delving into the design of studies. Principal issues related to modern estimation and mathematical modeling are also detailed. Topics in the handbook then segue into the realm of statistical inference and modeling, with chapters dedicated to classical approaches as well as modern latent variable approaches. Numerous chapters associated with longitudinal data and more specialized techniques round out this broad selection of topics. Comprehensive, authoritative, and user-friendly, this two-volume set will be an indispensable resource for serious researchers across the social, behavioral, and educational sciences.
This book covers the statistical consequences of breaches of research integrity, such as fabrication and falsification of data, and of researcher glitches summarized as questionable research practices. It is unique in discussing how unwarranted data manipulation harms research results and how questionable research practices are often caused by researchers' inadequate mastery of the statistical methods and procedures they use for their data analysis. The author's solution to prevent problems concerning the trustworthiness of research results, no matter how they originated, is to publish data in publicly available repositories and to encourage researchers not trained as statisticians not to overestimate their statistical skills but to resort to professional support from statisticians or methodologists. The author discusses some of his experiences concerning mutual trust, fear of repercussions, and the bystander effect as conditions limiting revelation of colleagues' possible integrity breaches. He explains why people are unable to mimic real data and why data fabrication using statistical models still falls short of credibility. Confirmatory and exploratory research, the usefulness of preregistration, and the counter-intuitive nature of statistics are discussed. The author questions the usefulness of statistical advice concerning frequentist hypothesis testing, Bayes-factor use, alternative statistics education, and reduction of situational disturbances like performance pressure, as stand-alone means to reduce questionable research practices when researchers lack experience with statistics.
* It is a straightforward, conversational introduction to statistics that delivers exactly what its title promises. * Each chapter begins with a brief overview of a statistic that describes what the statistic does and when to use it, followed by a detailed step-by-step explanation of how the statistic works and exactly what information it provides. * Chapters also include an example of the statistic (or statistics) in use in real-world research, "Worked Examples," "Writing It Up" sections that demonstrate how to write about each statistic, "Wrapping Up and Looking Forward" sections, and practice work problems. * A new chapter on person-centered analyses, including cluster analysis and latent class analysis (LCA), has been added (Chapter 16). * Person-centered analysis is an important alternative to the more commonly used variable-centered analyses (e.g., t tests, ANOVA, regression) and is gaining popularity in social-science research. * The chapter on non-parametric statistics (Chapter 14) was enhanced significantly with in-depth descriptions of Mann-Whitney U, Kruskal-Wallis, and Wilcoxon Signed-Rank analyses. * These non-parametric statistics are important alternatives to statistics that rely on normally distributed data. * This new edition also includes more information about the assumptions of various statistics, including a detailed explanation of the assumptions and consequences of violating the assumptions of regression (Chapter 13). * There is more information provided about the importance of the normal distribution in statistics (Chapters 4 and 7). * Each of the last nine chapters includes an example from the real world of research that employs the statistic, or statistics, covered in the chapter. * Altogether, these improvements provide important foundational information about how inferential statistics work and additional statistical tools that are commonly used by researchers in the social sciences.
* The text works as a standalone or as a supplement and covers a range of statistical concepts from descriptive statistics to factor analysis and person-centered analyses.
First published in 1985, Ethical Issues in Psychosurgery examines the continuing debate surrounding the treatment of psychiatric disorder by psychosurgery and its ethical implications. Psychosurgery represents a radical treatment and it therefore raises, in a particularly acute and challenging fashion, questions which are implicit in most therapy. The book offers a focussed study in bioethics, a model for bioethical inquiry, as well as an introduction to some of the major problems in bioethics. These range from detailed discussions of informed consent, the sanctity of the brain, and the use of experimental therapies, to wider questions of social contract and professionalization. John Kleinig's balanced and informed treatment of the questions will make this book invaluable not only to those concerned with the philosophy of legal and medical ethics, but also to those in the fields of psychiatric practice and research.
This research volume serves as a comprehensive resource for psychophysiological research on media responses. It addresses the theoretical underpinnings, methodological techniques, and most recent research in this area. It goes beyond current volumes by placing the research techniques within a context of communication processes and effects as a field, and demonstrating how the real-time measurement of physiological responses enhances and complements more traditional measures of psychological effects from media. This volume introduces readers to the theoretical assumptions of psychophysiology as well as the operational details of collecting psychophysiological data. In addition to discussing specific measures, it includes brief reviews of recent experiments that have used psychophysiological measures to study how the brain processes media. It will serve as a valuable reference for media researchers utilizing these methodologies, or for other researchers needing to understand the theories, history, and methods of psychophysiological research.
Modeled after Barbara Byrne's other best-selling structural equation modeling (SEM) books, this practical guide reviews the basic concepts and applications of SEM using Mplus Version 6. The author reviews SEM applications based on actual data taken from her own research. Using non-mathematical language, it is written for the novice SEM user. In each application chapter, the author "walks" the reader through all steps involved in testing the SEM model.
The first two chapters introduce the fundamental concepts of SEM and important basics of the Mplus program. The remaining chapters focus on SEM applications and include a variety of SEM models presented within the context of three sections: Single-group analyses, Multiple-group analyses, and other important topics, the latter of which includes the multitrait-multimethod, latent growth curve, and multilevel models. Intended for researchers, practitioners, and students who use SEM and Mplus, this book is an ideal resource for graduate level courses on SEM taught in psychology, education, business, and other social and health sciences and/or as a supplement for courses on applied statistics, multivariate statistics, intermediate or advanced statistics, and/or research design. Appropriate for those with limited exposure to SEM or Mplus, a prerequisite of basic statistics through regression analysis is recommended.