Wim van der Linden was recently given a lifetime achievement award by the National Council on Measurement in Education; no one is more prominent in the area of educational testing. There are hundreds of computer-based credentialing exams in areas such as accounting, real estate, nursing, and securities, as well as the well-known admissions exams for college, graduate school, medical school, and law school, so there is great need for sound theory of testing. This book presents the statistical theory and practice behind constructing good tests: for example, how is the first test item selected, how are the next items selected, and when do you have enough items?
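The item-selection questions raised here are usually answered with item response theory. As a purely illustrative sketch (not taken from the book, and assuming a two-parameter logistic model with made-up item parameters), the next item in an adaptive test can be chosen to maximize Fisher information at the current ability estimate:

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

# Hypothetical item bank: discrimination (a) and difficulty (b) parameters.
a = np.array([1.2, 0.8, 1.5, 1.0, 2.0])
b = np.array([-1.0, 0.0, 0.5, 1.0, 1.5])

theta_hat = 0.3                      # current ability estimate
administered = {0, 2}                # items already given
info = item_information(theta_hat, a, b)
info[list(administered)] = -np.inf   # exclude items already used
next_item = int(np.argmax(info))     # most informative remaining item
print(next_item, info)
```

A stopping rule can then be as simple as ending the test once the standard error of the ability estimate, the inverse square root of the accumulated information, drops below a preset threshold.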
"Models of Psychological Space" begins the reformulation of the construct of psychological space by bringing together in one volume a sampling of theoretical models from the psychometric, developmental, and experimental approaches. The author also discusses five general issues which cut across these three approaches; namely, age-related differences, sex-related differences, trainability, imagery, processing solutions, and the effect of stimulus dimensionality upon spatial performance. "Models of " "Psychological Space" provides an overview of a significant construct which has many researchable ideas, and which should be of interest to scholars from a wide range of disciplines.
Behavioral scientists - including those in psychology, infant and child development, education, animal behavior, marketing and usability studies - use many methods to measure behavior. Systematic observation is used to study relatively natural, spontaneous behavior as it unfolds sequentially in time. This book emphasizes digital means to record and code such behavior; while observational methods do not require them, they work better with them. Key topics include devising coding schemes, training observers and assessing reliability, as well as recording, representing and analyzing observational data. In clear and straightforward language, this book provides a thorough grounding in observational methods along with considerable practical advice. It describes standard conventions for sequential data and details how to perform sequential analysis with a computer program developed by the authors. The book is rich with examples of coding schemes and different approaches to sequential analysis, including both statistical and graphical means.
Psychoanalytic infant observation is frequently used in training psychoanalytic psychotherapists and allied professionals, but increasingly its value as a research method is being recognised, particularly in understanding developmental processes in vulnerable individuals and groups. This book explores the scope of this approach and discusses its strengths and limitations from a methodological and philosophical point of view. Infant Observation and Research uses detailed case studies to demonstrate the research potential of the infant observation method. Divided into three sections, this book covers
Throughout the book, Cathy Urwin, Janine Sternberg and their contributors introduce the reader to the nature and value of psychoanalytic infant observation and its range of application. This book will therefore interest a range of mental health practitioners concerned with early development and infants' emotional relationships, as well as academics and researchers in the social sciences and humanities.
Sample surveys provide data used by researchers in a large range of disciplines to analyze important relationships using well-established and widely used likelihood methods. The methods used to select samples often result in the sample differing in important ways from the target population, and standard application of likelihood methods can lead to biased and inefficient estimates. Maximum Likelihood Estimation for Sample Surveys presents an overview of likelihood methods for the analysis of sample survey data that account for the selection methods used, and includes all necessary background material on likelihood inference. It covers a range of data types, including multilevel data, and is illustrated by many worked examples using tractable and widely used models. It also discusses more advanced topics, such as combining data, non-response, and informative sampling. The book presents and develops a likelihood approach for fitting models to sample survey data. It explores and explains how the approach works in tractable yet widely used models for which we can make considerable analytic progress. For less tractable models, numerical methods are ultimately needed to compute the score and information functions and the maximum likelihood estimates of the model parameters. For these models, the book shows what has to be done conceptually to develop analyses to the point that numerical methods can be applied. Designed for statisticians who are interested in the general theory of statistics, Maximum Likelihood Estimation for Sample Surveys is also aimed at statisticians focused on fitting models to sample survey data, as well as researchers who study relationships among variables and whose sources of data include surveys.
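One common way to account for unequal selection probabilities (offered here only as a generic illustration, not necessarily the specific approach developed in this book) is pseudo-maximum likelihood, in which each sampled unit's contribution to the score is weighted by the inverse of its selection probability. A minimal numpy sketch for a logistic regression with simulated data and hypothetical selection probabilities:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.0 * x))))
pi = np.clip(0.2 + 0.6 * (x > 0), 0.05, 1.0)    # hypothetical selection probabilities
w = 1.0 / pi                                    # inverse-probability weights

X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):                             # Newton-Raphson on the weighted score
    p = 1 / (1 + np.exp(-X @ beta))
    score = X.T @ (w * (y - p))                 # weighted score function
    info = X.T @ ((w * p * (1 - p))[:, None] * X)   # weighted information
    beta = beta + np.linalg.solve(info, score)

print(beta)   # pseudo-MLE of intercept and slope
```

The same weighting idea carries over to other generalized linear models; only the score and information expressions change.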
This is the second edition of the comprehensive treatment of statistical inference using permutation techniques. It makes available to practitioners a variety of useful and powerful data analytic tools that rely on very few distributional assumptions. Although many of these procedures have appeared in journal articles, they are not readily available to practitioners. This new and updated edition places increased emphasis on the use of alternative permutation statistical tests based on metric Euclidean distance functions that have excellent robustness characteristics. These alternative permutation techniques provide many powerful multivariate tests including multivariate multiple regression analyses.
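To make the flavor of such procedures concrete (a generic sketch with simulated data, not code from the book; weighting choices and distance exponents vary across the methods it treats), a permutation test can compare an observed average within-group Euclidean distance with its distribution under random relabelling of the groups:

```python
import numpy as np
from itertools import combinations

def within_group_delta(data, labels):
    """Weighted mean of average within-group pairwise Euclidean distances."""
    n = len(labels)
    delta = 0.0
    for g in np.unique(labels):
        grp = data[labels == g]
        dists = [np.linalg.norm(a - b) for a, b in combinations(grp, 2)]
        delta += (len(grp) / n) * np.mean(dists)
    return delta

rng = np.random.default_rng(0)
x = np.vstack([rng.normal(0, 1, (15, 2)), rng.normal(1, 1, (15, 2))])  # two groups
labels = np.repeat([0, 1], 15)

observed = within_group_delta(x, labels)
perm_deltas = np.array([
    within_group_delta(x, rng.permutation(labels)) for _ in range(999)
])
# A small delta means responses are concentrated within groups.
p_value = (np.sum(perm_deltas <= observed) + 1) / (len(perm_deltas) + 1)
print(observed, p_value)
```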
This book examines extensions of the Rasch model, one of the most researched and applied models in educational research and social science. This collection contains 22 chapters by some of the most renowned international experts in the field. They cover topics ranging from general model extensions to applications in fields as diverse as cognition, personality, organizational and sports psychology, and health sciences and education.
The study of brain function is one of the most fascinating pursuits of modern science. Functional neuroimaging is an important component of much of the current research in cognitive, clinical, and social psychology. The excitement of studying the brain is recognized in both the popular press and the scientific community. In the pages of mainstream publications, including The New York Times and Wired, readers can learn about cutting-edge research into topics such as understanding how customers react to products and advertisements ("If your brain has a 'buy button,' what pushes it?," The New York Times, October 19, 2004), how viewers respond to campaign ads ("Using M.R.I.'s to see politics on the brain," The New York Times, April 20, 2004; "This is your brain on Hillary: Political neuroscience hits new low," Wired, November 12, 2007), how men and women react to sexual stimulation ("Brain scans arouse researchers," Wired, April 19, 2004), distinguishing lies from the truth ("Duped," The New Yorker, July 2, 2007; "Woman convicted of child abuse hopes fMRI can prove her innocence," Wired, November 5, 2007), and even what separates "cool" people from "nerds" ("If you secretly like Michael Bolton, we'll know," Wired, October 2004). Reports on pathologies such as autism, in which neuroimaging plays a large role, are also common (for instance, a Time magazine cover story from May 6, 2002, entitled "Inside the world of autism").
WINNER OF THE 2007 DEGROOT PRIZE. The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models. For more than a hundred years, since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models which combine a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods. It is the first time that the Bayesian perspective on finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach. The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers, and any other frequent users of statistical models. This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Prior familiarity with Bayesian inference and Monte Carlo simulation is useful but not needed.
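As a toy illustration of the kind of MCMC machinery the book covers (this sketch is not from the book; it assumes known unit variances, simulated data, and ignores label switching), a Gibbs sampler for a two-component normal mixture alternates between sampling allocations, component means, and the mixture weight:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated data from a two-component normal mixture (known unit variances).
x = np.concatenate([rng.normal(-2, 1, 120), rng.normal(1.5, 1, 80)])
n = len(x)

mu = np.array([-1.0, 1.0])        # initial component means
eta = 0.5                         # initial weight of component 0
tau2 = 100.0                      # prior variance of the means, mu_k ~ N(0, tau2)
draws = []

for it in range(2000):
    # 1. Sample allocations z_i given the means and the weight.
    like0 = eta * np.exp(-0.5 * (x - mu[0]) ** 2)
    like1 = (1 - eta) * np.exp(-0.5 * (x - mu[1]) ** 2)
    z = (rng.random(n) < like1 / (like0 + like1)).astype(int)
    # 2. Sample each mean from its conjugate normal posterior.
    for k in (0, 1):
        xk = x[z == k]
        post_var = 1.0 / (len(xk) + 1.0 / tau2)
        post_mean = post_var * xk.sum()
        mu[k] = rng.normal(post_mean, np.sqrt(post_var))
    # 3. Sample the weight from its conjugate Beta posterior.
    eta = rng.beta(1 + np.sum(z == 0), 1 + np.sum(z == 1))
    if it >= 500:                 # discard burn-in
        draws.append((mu[0], mu[1], eta))

print(np.mean(draws, axis=0))     # posterior means of mu_0, mu_1, eta
```

Real analyses of the kind the book describes add unknown variances, more components, carefully chosen priors, and diagnostics for label switching.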
This is the first workbook that introduces the multilevel approach to modeling with categorical outcomes using IBM SPSS Version 20. Readers learn how to develop, estimate, and interpret multilevel models with categorical outcomes. The authors walk readers through data management, diagnostic tools, model conceptualization, and model specification issues related to single-level and multilevel models with categorical outcomes. Screen shots clearly demonstrate techniques and navigation of the program. Modeling syntax is provided in the appendix. Examples of various types of categorical outcomes demonstrate how to set up each model and interpret the output. Extended examples illustrate the logic of model development, interpretation of output, the context of the research questions, and the steps around which the analyses are structured. Readers can replicate examples in each chapter by using the corresponding data and syntax files available at www.psypress.com/9781848729568. The book opens with a review of multilevel modeling with categorical outcomes, followed by a chapter on IBM SPSS data management techniques to facilitate working with multilevel and longitudinal data sets. Chapters 3 and 4 detail the basics of the single-level and multilevel generalized linear model for various types of categorical outcomes. These chapters review underlying concepts to assist with troubleshooting common programming and modeling problems. Next, population-average and unit-specific longitudinal models for investigating individual or organizational developmental processes are developed. Chapter 6 focuses on single- and multilevel models using multinomial and ordinal data, followed by a chapter on models for count data. The book concludes with additional troubleshooting techniques and tips for expanding on the modeling techniques introduced. Ideal as a supplement for graduate level courses and/or professional workshops on multilevel, longitudinal, latent variable modeling, multivariate statistics, and/or advanced quantitative techniques taught in psychology, business, education, health, and sociology, this practical workbook also appeals to researchers in these fields. An excellent follow-up to the authors' highly successful Multilevel and Longitudinal Modeling with IBM SPSS and Introduction to Multilevel Modeling Techniques, 2nd Edition, this book can also be used with any multilevel and/or longitudinal modeling book or as a stand-alone text introducing multilevel modeling with categorical outcomes.
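The workbook itself works in SPSS; purely as a language-neutral illustration of what a two-level generalized linear model for a binary outcome implies (hypothetical parameter values, simulated data), the log-odds of the outcome can be made to vary across clusters through a random intercept:

```python
import numpy as np

rng = np.random.default_rng(42)
n_clusters, n_per = 50, 30
gamma_00, gamma_10 = -0.5, 0.8        # hypothetical fixed intercept and slope
tau = 0.6                             # SD of the cluster-level random intercepts

u = rng.normal(0, tau, n_clusters)            # random intercept for each cluster
cluster = np.repeat(np.arange(n_clusters), n_per)
x = rng.normal(size=n_clusters * n_per)       # level-1 predictor

# Level-1 model: logit P(y = 1) = (gamma_00 + u_j) + gamma_10 * x
eta = gamma_00 + u[cluster] + gamma_10 * x
p = 1 / (1 + np.exp(-eta))
y = rng.binomial(1, p)

# Cluster-specific observed proportions reflect both x and the random intercepts.
props = np.array([y[cluster == j].mean() for j in range(n_clusters)])
print(props.round(2))
```

Fitting such a model then amounts to estimating gamma_00, gamma_10, and tau from data like these, which is what the workbook demonstrates step by step in SPSS.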
"Polygraphy;' "lie detection;' and the "detection of deception" are all terms that refer to an application of the science of psychophysiology, which itself employs physiological measures to study and differentiate between psychological processes. The issues raised by polygraphy are controversial. One such issue is whether the polygraph is a genuinely scientifically based application, or merely a purported application, of psychophysiology. Such concerns are of interest not only to polygraph practitioners and to specialists in psychophysiology, but also to such other specialists as those in the legal and forensic professions. Moreover, there are two sorts of nonspecialists who should also be concerned. On the one hand, there are the potential "users" of the polygraph-for example, a manager who employs a polygrapher to check on subordinates; on the other hand, there are those "used by" the polygraph - the employee who is subjected to the poly graphic examination. To begin with the user of the polygraph, this person should know not only about its overall accuracy, but also about the rationales of the various detection methods and their validity for different purposes in different sorts of situations. This infor mation is important, because even for the potential user there are costs as well as benefits. Aside from the lack of trust generated by the polygraph, there have also been successful suits by employees against employers, so there are traps in polygraph usage that employers (and managers) need to keep in mind."
This book presents the state of the art in multilevel analysis, with an emphasis on more advanced topics. These topics are discussed conceptually, analyzed mathematically, and illustrated by empirical examples. Multilevel analysis is the statistical analysis of hierarchically and non-hierarchically nested data. The simplest example is clustered data, such as a sample of students clustered within schools. Multilevel data are especially prevalent in the social and behavioral sciences and in the biomedical sciences. The chapter authors are all leading experts in the field. Given the omnipresence of multilevel data in the social, behavioral, and biomedical sciences, this book is essential for empirical researchers in these fields.
Item response theory has become an essential component in the toolkit of every researcher in the behavioral sciences. It provides a powerful means to study individual responses to a variety of stimuli, and the methodology has been extended and developed to cover many different models of interaction. This volume presents a wide-ranging handbook to item response theory and its applications to educational and psychological testing. It will serve as both an introduction to the subject and a comprehensive reference volume for practitioners and researchers. It is organized into six major sections: the nominal categories model, models for response time or multiple attempts on items, models for multiple abilities or cognitive components, nonparametric models, models for nonmonotone items, and models with special assumptions. Each chapter in the book has been written by an expert on that particular topic, and the chapters have been carefully edited to ensure that a uniform style of notation and presentation is used throughout. As a result, all researchers whose work uses item response theory will find this an indispensable companion to their work, and it will be the subject's reference volume for many years to come.
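To give a concrete flavor of one of the listed models (an illustrative sketch, not taken from the handbook; the slope and intercept values are hypothetical), Bock's nominal categories model assigns each response category a probability through a softmax over category-specific slopes and intercepts:

```python
import numpy as np

def nominal_category_probs(theta, slopes, intercepts):
    """Bock's nominal categories model: P(category k | theta) is a softmax
    over z_k = a_k * theta + c_k across an item's response categories."""
    z = slopes * theta + intercepts
    z -= z.max()                      # for numerical stability
    expz = np.exp(z)
    return expz / expz.sum()

# Hypothetical parameters for one four-category item
# (first category fixed at zero for identification).
slopes = np.array([0.0, 0.7, 1.4, 2.1])
intercepts = np.array([0.0, 0.3, -0.2, -1.0])

for theta in (-2.0, 0.0, 2.0):
    print(theta, nominal_category_probs(theta, slopes, intercepts).round(3))
```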
This textbook is for graduate students and research workers in social statistics and related subject areas. It follows a novel curriculum developed around the basic statistical activities: sampling, measurement and inference. The monograph aims to prepare the reader for the career of an independent social statistician and to serve as a reference for methods, ideas, and ways of studying human populations. Elementary linear algebra and calculus are prerequisites, although the exposition is quite forgiving. Familiarity with statistical software at the outset is an advantage, but it can be developed while reading the first few chapters.
This volume provides an integrative review of the emerging and increasing use in cognitive psychology of network science techniques, which were first developed in mathematics, computer science, sociology, and physics. In this first resource on network science for cognitive psychologists, Vitevitch and a team of expert contributors provide a comprehensive and accessible overview of this cutting-edge topic. This innovative guide draws on the three traditional pillars of cognitive psychological research - experimental, computational, and neuroscientific - and incorporates the latest findings from neuroimaging. The network perspective is applied to the fundamental domains of cognitive psychology, including memory, language, problem-solving, and learning, as well as creativity and human intelligence, highlighting the insights to be gained through applying network science to a wide range of approaches and topics in cognitive psychology. Network Science in Cognitive Psychology will be essential reading for all upper-level cognitive psychology students, psychological researchers interested in using network science in their work, and network scientists interested in investigating questions related to cognition. It will also be useful for early career researchers and students in methodology and related courses.
Presenting an innovative take on researching early childhood, this book provides an international comparison of the cultural and familial influences that shape the growth of young children. The book presents a unique methodology, and includes chapters on musicality, security, humour and eating.
The Analytic Network Process (ANP) developed by Thomas Saaty in his work on multicriteria decision making applies network structures with dependence and feedback to complex decision making. This book is a selection of applications of ANP to economic, social and political decisions, and also to technological design. The chapters comprise contributions of scholars, consultants and people concerned about the outcome of certain important decisions who applied the Analytic Network Process to determine the best outcome for each decision from among several potential outcomes. The ANP is a methodological tool that is helpful to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify the judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The ANP offers economists a considerably different approach for dealing with economic problems than the usual quantitative models. The ANP approach is based on absolute scales used to represent pairwise comparison judgments in the context of dominance with respect to a property shared by the homogeneous elements being compared: how much, or how many times more, does A dominate B with respect to property P? People are able to answer this question using words that indicate intensity of dominance (equal, moderate, strong, very strong, and extreme), something all of us are biologically equipped to do all the time; the conversion of these words to numbers, their validation, and their extension to inhomogeneous elements form the foundation of the AHP/ANP. Numerous applications of the ANP have been made to economic problems, among them the prediction of the turnaround dates for the US economy in the early 1990s and again in 2001, whose accuracy and validity were both confirmed later in the news. These predictions were based on the process of comparisons of mostly intangible factors rather than on financial, employment and other data and statistics.
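As a small illustration of the pairwise-comparison machinery underlying the AHP/ANP (a generic sketch, not drawn from any chapter in this volume; the judgment matrix is made up), priorities can be derived as the principal eigenvector of a reciprocal comparison matrix, with a consistency check against Saaty's tabulated random index:

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale:
# entry A[i, j] says how strongly criterion i dominates criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
priorities = w / w.sum()                        # normalized priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)                 # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index (tabulated)
print(priorities.round(3), "CR =", round(ci / ri, 3))
```

A consistency ratio (CR) below about 0.1 is conventionally taken to mean the judgments are acceptably consistent.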
The Handbook of Psychodiagnostic Testing is an invaluable aid to students and professionals performing psychological assessments. It takes the reader from client referral to finished report, demonstrating how to synthesize details of personality and pathology into a document that is focused, coherent, and clinically meaningful. This new edition covers emerging areas in borderline and narcissistic pathologies, psychological testing of preschool children, and bilingual populations. It also discusses the most current clinical issues and the evaluation of populations on which standard psychological tests have not been standardized.
In the decade of the 1970s, item response theory became the dominant topic for study by measurement specialists. But the genesis of item response theory (IRT) can be traced back to the mid-thirties and early forties. In fact, the term "Item Characteristic Curve," which is one of the main IRT concepts, can be attributed to Ledyard Tucker in 1946. Despite these early research efforts, interest in item response theory lay dormant until the late 1960s and took a backseat to the emerging development of strong true score theory. While true score theory developed rapidly and drew the attention of leading psychometricians, the problems and weaknesses inherent in its formulation began to raise concerns. Such problems as the lack of invariance of item parameters across examinee groups, and the inadequacy of classical test procedures to detect item bias or to provide a sound basis for measurement in "tailored testing," gave rise to a resurgence of interest in item response theory. Impetus for the development of item response theory as we now know it was provided by Frederic M. Lord through his pioneering works (Lord, 1952, 1953a, 1953b). The progress in the fifties was painstakingly slow due to the mathematical complexity of the topic and the nonexistence of computer programs.
Generalizability theory offers an extensive conceptual framework and a powerful set of statistical procedures for characterizing and quantifying the fallibility of measurements. Robert Brennan, the author, has written the most comprehensive and up-to-date treatment of generalizability theory. The book provides a synthesis of those parts of the statistical literature that are directly applicable to generalizability theory. The principal intended audience is measurement practitioners and graduate students in the behavioral and social sciences, although a few examples and references are provided from other fields. Readers will benefit from some familiarity with classical test theory and analysis of variance, but the treatment of most topics does not presume specific background.
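For a flavor of the computations involved (a minimal sketch for a single-facet persons-by-items design with simulated scores; Brennan's treatment covers far more general designs), variance components can be estimated from the ANOVA mean squares and combined into a generalizability coefficient:

```python
import numpy as np

rng = np.random.default_rng(7)
n_p, n_i = 40, 10                               # persons crossed with items
person_eff = rng.normal(0, 1.0, (n_p, 1))       # true person effects
item_eff = rng.normal(0, 0.5, (1, n_i))         # item effects
scores = 5 + person_eff + item_eff + rng.normal(0, 0.8, (n_p, n_i))

grand = scores.mean()
ms_p = n_i * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
ms_i = n_p * np.sum((scores.mean(axis=0) - grand) ** 2) / (n_i - 1)
ss_res = np.sum((scores - scores.mean(axis=1, keepdims=True)
                 - scores.mean(axis=0, keepdims=True) + grand) ** 2)
ms_res = ss_res / ((n_p - 1) * (n_i - 1))

var_p = (ms_p - ms_res) / n_i                   # person variance component
var_i = (ms_i - ms_res) / n_p                   # item variance component
var_res = ms_res                                # residual (p x i interaction + error)

# Generalizability coefficient for relative decisions based on n_i items.
g_coef = var_p / (var_p + var_res / n_i)
print(round(var_p, 3), round(var_i, 3), round(var_res, 3), round(g_coef, 3))
```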
Synthesizing many years' investigation into sexual identity and orientation, this book presents Dr Money's formulation of how sexual preference is determined. It includes a review of long-term follow-up studies on pre-natal influences on sexual identity, and discusses gender differentiation in childhood. The book concludes with an examination of the conflict between gender and sexual identities, and a description of the paraphilias. Researchers, clinicians, and graduate students in psychology, biology, endocrinology, psychiatry, and family studies will find this volume of interest, as will anyone interested in gay and lesbian issues.
Qualitative methodologies in cultural psychology often lack the objective and verifiable character of quantitative analysis. Author Carl Ratner corrects this shortcoming by rigorously systematizing qualitative methods. The book discusses, for example, means of systematizing such subjective reports as interviews, letters, and diaries, which often yield valuable data that is not easily quantified. Ratner argues that "complex psychological phenomena are expressed through extended responses" and hence are best studied by new, more regularized qualitative methods that go beyond measuring simple, overt responses.
This comprehensive Handbook is the first to provide a practical, interdisciplinary review of ethical issues as they relate to quantitative methodology, including how to present evidence for reliability and validity, what comprises an adequately tested population, and what constitutes scientific knowledge for eliminating biases. The book uses an ethical framework that emphasizes the human cost of quantitative decision making to help researchers understand the specific implications of their choices. The order of the Handbook chapters parallels the chronology of the research process: determining the research design and data collection; data analysis; and communicating findings. Each chapter explores the ethics of a particular topic, identifies prevailing methodological issues, reviews strategies and approaches for handling such issues and their ethical implications, provides one or more case examples, and outlines plausible approaches to the issue, including best-practice solutions. Part 1 presents ethical frameworks that cross-cut design, analysis, and modeling in the behavioral sciences. Part 2 focuses on ideas for disseminating ethical training in statistics courses. Part 3 considers the ethical aspects of selecting measurement instruments and sample size planning and explores issues related to high stakes testing, the defensibility of experimental vs. quasi-experimental research designs, and ethics in program evaluation. Decision points that shape a researcher's approach to data analysis are examined in Part 4 - when and why analysts need to account for how the sample was selected, how to evaluate tradeoffs of hypothesis testing vs. estimation, and how to handle missing data. Ethical issues that arise when using techniques such as factor analysis or multilevel modeling and when making causal inferences are also explored. The book concludes with ethical aspects of reporting meta-analyses, of cross-disciplinary statistical reform, and of the publication process. This Handbook appeals to researchers and practitioners in psychology, human development, family studies, health, education, sociology, social work, political science, and business/marketing. This book is also a valuable supplement for quantitative methods courses required of all graduate students in these fields.
This is the sixth edition of a popular textbook on multivariate analysis. Well-regarded for its practical and accessible approach, with excellent examples and good guidance on computing, the book is particularly popular for teaching outside statistics, e.g. in epidemiology, social science, and business. The sixth edition has been updated with a new chapter on data visualization, a distinction made between exploratory and confirmatory analyses, a new section on generalized estimating equations, and many new updates throughout. This new edition will enable the book to continue as one of the leading textbooks in the area, particularly for non-statisticians. Key features: provides a comprehensive, practical and accessible introduction to multivariate analysis; keeps mathematical details to a minimum, so it is particularly geared toward a non-statistical audience; includes lots of detailed worked examples, guidance on computing, and exercises; updated with a new chapter on data visualization.
This book explores how an object relations-integrative perspective may combine in-depth psychodynamic principles and theories with the flexibility afforded by an integrative framework. Object relations theory is rooted in a psychoanalytic tradition which views individuals as essentially social and holds that their need for others is primary. Integrative psychotherapy attempts to combine the theories and/or techniques of two or more therapeutic approaches. This volume is useful for graduates, undergraduates and trainee psychotherapists as well as social workers, psychologists, psychotherapists and counsellors who are interested in broadening their understanding of different therapeutic approaches and integrative endeavours. The contributors consist of an international group of practitioners and theoreticians who draw on the knowledge of object relations and other therapeutic orientations as well as innovations in the integrative movement. Some of the contributors grapple directly with integrative questions, while others examine ways of working with specific client groups or methods, where integrative ideas enrich their work.