A Handbook of Process Tracing Methods demonstrates how to better understand decision outcomes by studying decision processes, through the introduction of a number of exciting techniques. Decades of research have identified numerous idiosyncrasies in human decision behavior, but some of the most recent advances in the scientific study of decision making involve the development of sophisticated methods for understanding decision processes, known as process tracing. In this volume, leading experts discuss the application of these methods, focus on best practices for using some of the more popular techniques, and show how to incorporate them into formal decision models. This edition has been expanded and thoroughly updated throughout, and now includes new chapters on mouse tracking, protocol analysis, neurocognitive methods, and the measurement of valuation, as well as an overview of important software packages. The volume not only surveys cutting-edge research to illustrate the great variety in process tracing techniques, but also serves as a tutorial for how the novice researcher might implement these methods. A Handbook of Process Tracing Methods will be an essential read for all students and researchers of decision making.
MATLAB Blues is an accessible, comprehensive introduction to the MATLAB computer programming language, a powerful and increasingly popular tool for students and researchers. Rosenbaum identifies many of the common mistakes and pitfalls associated with using MATLAB, and shows users how they can learn from these mistakes to become better, happier programmers. Each chapter systematically addresses one of the basic principles of the programming language, such as matrices, calculations, contingencies, plotting, input-output, and graphics, and then identifies areas that are problematic, as well as potential errors that can occur. This not only provides the reader with the fundamental "scales and chords" that a MATLAB programmer needs to know, but also with a series of examples and explanations of how to avoid and remedy common mistakes. Accompanied by an array of sample code that can be used and manipulated in conjunction with the textbook, this book is a practical, insightful introduction to MATLAB that provides motivation and encouragement to those with little or no background in programming as well as to those with more advanced concerns. It is an invaluable resource for researchers and students undertaking courses in research methods, statistics, and programming.
This guide is a much-needed reference for clinicians on how to use the Rorschach Inkblot Test with senior adults, an essential tool for assessing personality functioning in order to better identify psychological interventions. The book integrates historical developments, current research, conceptual considerations, and therapeutic and diagnostic applications. Chapters review basic guidelines for the understanding and interpretation of Rorschach variables, including protocol validity; interpretation of structural variables, thematic imagery, and cross-cultural normative data; sequence analysis; and more. The authors then provide 10 case illustrations of how the Rorschach indices of cognitive functioning, emotional experience, interpersonal relatedness, and self-perception can facilitate differential diagnosis and treatment planning in clinical work with older people. These case illustrations are rooted in previously unavailable Rorschach reference data based on an international sample of more than 250 senior adults and a second sample of more than 200 patients with Alzheimer's disease. Clinicians will come away with a solid empirical basis for distinguishing between normal-range personality functioning and manifestations of psychological disorder in the elderly and for providing beneficial interventions to senior adult patients.
Horrified by the Holocaust, social psychologist Stanley Milgram wondered whether he could recreate the Holocaust in a laboratory setting. Unabated for more than half a century, his (in)famous results have continued to intrigue scholars. Based on unpublished archival data from Milgram's personal collection, volume one of this two-volume set offers readers a behind-the-scenes account of how, during his unpublished pilot studies, Milgram invented his official experimental procedure step by step--how he gradually learnt to transform most ordinary people into willing inflictors of harm. The open access volume two then illustrates how certain innovators within the Nazi regime used the very same Milgram-like learning techniques, which, with increasing effectiveness, gradually enabled them to transform most ordinary people into increasingly capable executioners of other men, women, and children. Volume two attempts to capture how, step by step, these Nazi innovators tried to turn the Fuhrer's wish for a Jewish-free Europe into a frightening reality. By the end of both volumes, the reader will have gained an insight into how the seemingly undoable can become increasingly doable.
Since the mid-1980s, several laboratories around the world have been developing techniques for the operational use of tests derived from item generation. According to the experts, the major thrust of test development in the next decade will be the harnessing of item generation technology to the production of computer-developed tests. This is expected to revolutionize the way in which tests are constructed and delivered.
Qualitative Market Research follows a complete research project through from the perspective of both user and practitioner. In this respect, it can be used as a continuous teaching text and training manual, or individual sections may be consulted to enhance knowledge of "best practices" and improve productivity in any specific research application. Section one begins with an overview of the history and philosophy behind the practice of qualitative research, then covers the choice between qualitative and quantitative approaches, the organization of qualitative research (particularly for those in practice, such as research consultants), qualitative research applications (including product development, branding, and advertising), and the varieties of qualitative research methods. Section two looks at the management of qualitative research and discusses project management, planning, and budgeting issues. Section three looks at group moderation and interviewing techniques, and section four addresses the whole area of collecting and analyzing qualitative data, including discussion of computer-assisted software methods, as well as research reporting. This book meets the needs of several audiences by creating common ground in the applied practice of qualitative research. It should consequently be invaluable reading for a wide readership, from students of social research methods (particularly those in sociology, business, psychology, education, marketing, and market research) to practitioners of qualitative research worldwide, both clients and consultants.
". . . a comprehensive survey of the topic . . . a complete resource and a fundamental yet creative cookbook . . . Mariampolski offers detailed suggestions on how to effectively set up each particular type of project with step-by-step guidelines on how to proceed at each stage along the way. . . . It will be very interesting to those who wish to work in marketing, advertising, or research." --Journal of Advertising Research
Contributors to the volume represent an international "who's who"
of research scientists from the fields of psychology and
measurement. It offers the insights of these leading authorities
regarding cognition and personality. In particular, they address
the roles of constructs and values in clarifying the theoretical
and empirical work in these fields, as well as their relation to
educational assessment. It is intended for professionals and
students in psychology and assessment, and almost anyone doing
research in cognition and personality.
Factor Analysis and Dimension Reduction in R provides coverage, with worked examples, of a large number of dimension reduction procedures, along with model performance metrics to compare them. Factor analysis in the form of principal components analysis (PCA) or principal factor analysis (PFA) is familiar to most social scientists. However, what is less familiar is that factor analysis is a subset of the more general statistical family of dimension reduction methods. The social scientist's toolkit for factor analysis problems can be expanded to include the range of solutions this book presents. In addition to covering FA and PCA with orthogonal and oblique rotation, the book covers higher-order factor models, bifactor models, models based on binary and ordinal data, models based on mixed data, generalized low-rank models, cluster analysis with GLRM, models involving supplemental variables or observations, Bayesian factor analysis, regularized factor analysis, testing for unidimensionality, and prediction with factor scores. The second half of the book deals with other procedures for dimension reduction, including kernel PCA, factor analysis with multidimensional scaling, locally linear embedding, Laplacian eigenmaps, diffusion maps, force-directed methods, t-distributed stochastic neighbor embedding, independent component analysis (ICA), dimensionality reduction via regression (DRR), non-negative matrix factorization (NNMF), Isomap, autoencoders, uniform manifold approximation and projection (UMAP), neural network models, and longitudinal factor analysis models. In addition, a special chapter covers metrics for comparing model performance. Features of this book include: numerous worked examples with replicable R code; explicit, comprehensive coverage of data assumptions; adaptation of factor methods to binary, ordinal, and categorical data; residual and outlier analysis; visualization of factor results; and final chapters that treat the integration of factor analysis with neural network and time series methods. Presented in color, with R code and an introduction to R and RStudio, this book is suitable for graduate-level courses and optional modules for social scientists, as well as courses on quantitative methods and multivariate statistics.
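As a flavour of the workflow the book expands on, the following is a minimal sketch, in base R, of principal components analysis and common factor analysis on simulated item data; the simulated data set, variable names, and parameter values are illustrative assumptions, not examples taken from the book.

# A minimal sketch, in base R, of a PCA / factor-analysis workflow.
# The simulated data and all names below are illustrative assumptions.
set.seed(1)
n <- 200
latent <- rnorm(n)                                   # one underlying dimension
items <- sapply(1:6, function(i) 0.7 * latent + rnorm(n, sd = 0.5))
colnames(items) <- paste0("item", 1:6)

# Principal components analysis on standardized items
pca <- prcomp(items, center = TRUE, scale. = TRUE)
summary(pca)                                         # proportion of variance per component
head(pca$x[, 1:2])                                   # component scores for the first two PCs

# Common factor analysis (maximum likelihood) with regression factor scores
fa <- factanal(items, factors = 1, scores = "regression")
print(fa$loadings)                                   # factor loadings
head(fa$scores)                                      # factor scores usable for prediction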
Evaluation of staff and procedures is now a regular part of organisational efficiency, yet few people are given any training or guidance in how to set about it. This book provides this guidance, giving a step-by-step guide to designing evaluations. Questions of costs, benefits, types of methods and ethics are all discussed. The book also equips the reader with the skills to assess evaluations provided by outsiders.
Built around a problem solving theme, this book extends the
intermediate and advanced student's expertise to more challenging
situations that involve applying statistical methods to real-world
problems. Data relevant to these problems are collected and
analyzed to provide useful answers.
Stanley Milgram's experiments on obedience to authority are among the most important psychological studies of the twentieth century. Perhaps because of the enduring significance of the findings--the surprising ease with which ordinary persons can be commanded to act destructively against an innocent individual by a legitimate authority--the research continues to claim the attention of psychologists and other social scientists, as well as the general public, and continues to inspire valuable research and analysis. The goal of this book is to present current work inspired by the obedience paradigm.
Cognitive task analysis is a broad area consisting of tools and techniques for describing the knowledge and strategies required for task performance. Cognitive task analysis has implications for the development of expert systems, training and instructional design, expert decision making, and policymaking. It has been applied in a wide range of settings and with different purposes, for instance specifying user requirements in system design or specifying training requirements in training needs analysis. The topics covered by this work include general approaches to cognitive task analysis, system design, instruction, and cognitive task analysis for teams. The work settings to which the tools and techniques described here have been applied include 911 dispatching, faultfinding on board naval ships, aircraft design, and various support systems.
The question of whether someone is psychologically healthy or mentally ill, and the fundamental nature of mental health underlying that question, have been debated in cultural, academic, and clinical settings for millennia. This book provides an overview of how people have conceptualized and understood mental illness through the ages. It begins by looking at mental illness in humanity's evolutionary past and then moves through the major historical epochs: the mythological, the Classical, the Middle Ages, the Renaissance, the Enlightenment, the modern, and the postmodern. At each point, it focuses on major elements that emerged regarding how people judged sanity and insanity and places major emphasis on the fields of psychiatry and psychology as they emerged and developed. As the book moves into the twenty-first century, Dr. Jenkins presents his integrated model of knowledge, a systemic, holistic model of the psyche that creates a conceptual foundation for understanding both psychological wellness and disorder and for approaching assessment and diagnosis. This text provides a valuable exploration of mental health and illness across the ages and gives those already well versed in the subject matter a fresh perspective on the past and a new model of knowledge and assessment for the future.
This volume collects recent studies conducted within the area of
medical education that investigate two of the critical components
of problem-based curricula--the group meeting and self-directed
learning--and demonstrates that understanding these complex
phenomena is critical to the operation of this innovative
curriculum. It is the editors' contention that it is these
components of problem-based learning that connect the initiating
"problem" with the process of effective "learning." Revealing how
this occurs is the task taken on by researchers contributing to
this volume. The studies include use of self-reports, interviews,
observations, verbal protocols, and micro-analysis to find ways
into the psychological processes and sociological contexts that
constitute the world of problem-based learning.
The Handbook of Research Methods in Human Memory presents a collection of chapters on the methodology used by researchers investigating human memory. Understanding the basic cognitive function of human memory is critical in a wide variety of fields, such as clinical psychology, developmental psychology, education, neuroscience, and gerontology, and studying memory has become particularly urgent in recent years due to the prominence of a number of neurodegenerative diseases, such as Alzheimer's. However, choosing the most appropriate method of research is a daunting task for most scholars. This book explores the methods that are currently available in various areas of human memory research and serves as a reference manual to help guide readers' own research. Each chapter is written by prominent researchers and features cutting-edge research on human memory and cognition, with topics ranging from basic memory processes to cognitive neuroscience to further applications. The focus here is not on the "what" but on the "how": how research on human memory is best conducted.
This book offers a comprehensive introduction to the latest developments in the theory and practice of computerized adaptive testing (CAT). It can be used both as a basic reference and as a valuable resource on test theory. It covers such topics as item selection and ability estimation, item pool development and maintenance, item calibration and model fit, and testlet-based adaptive testing, as well as the operational aspects of existing large-scale CAT programs.
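To make the core ideas of item selection and ability estimation concrete, the following is a minimal sketch, in R, of a single adaptive-testing step under the two-parameter logistic (2PL) model; the item parameters, responses, and function names are illustrative assumptions, not material from the book.

# A minimal sketch of one CAT step under the two-parameter logistic (2PL) model.
# Item parameters, responses, and names below are made-up illustrative values.
a <- c(1.2, 0.8, 1.5, 1.0, 2.0)      # item discrimination parameters
b <- c(-1.0, 0.0, 0.5, 1.0, 1.5)     # item difficulty parameters
theta <- 0                           # current ability estimate

p_2pl <- function(theta, a, b) 1 / (1 + exp(-a * (theta - b)))
item_info <- function(theta, a, b) { # Fisher information of each item at theta
  p <- p_2pl(theta, a, b)
  a^2 * p * (1 - p)
}

next_item <- which.max(item_info(theta, a, b))   # maximum-information item selection

# After observing responses u (1 = correct) to the administered items,
# re-estimate ability by maximizing the log-likelihood.
administered <- c(2, next_item)
u <- c(1, 0)
loglik <- function(theta) {
  p <- p_2pl(theta, a[administered], b[administered])
  sum(u * log(p) + (1 - u) * log(1 - p))
}
theta_hat <- optimize(loglik, interval = c(-4, 4), maximum = TRUE)$maximum

Maximum-information selection with maximum-likelihood ability updating is only one of several possible choices for each step; the book surveys the alternatives.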
The primary purpose of this revision remains identical to that of
the first edition--to show how key personality,
cognitive/behavioral, and vocational tests/assessment procedures
can be used by counselors in their work with clients. Too often,
assessment books only provide the reader with information about
tests and assessment procedures. They do not, however, take the
next step--showing readers how these tests/assessment procedures
can be used and integrated into the actual work of counseling. This
revision is designed to fill that void. Chapter authors, all of
whom are experts in their respective topic areas, share the
theoretical and research backgrounds about a particular
test/assessment procedure and then provide a case example or
examples to show how assessment data can be meaningfully
incorporated into the counseling process.
The primary purpose of this revision remains identical to that of
the first edition--to show how key personality,
cognitive/behavioral, and vocational tests/assessment procedures
can be used by counselors in their work with clients. Too often,
assessment books only provide the reader with information about
tests and assessment procedures. They do not, however, take the
next step--showing readers how these tests/assessment procedures
can be used and integrated into the actual work of counseling. This
revision is designed to fill that void. Chapter authors, all of
whom are experts in their respective topic areas, share the
theoretical and research backgrounds about a particular
test/assessment procedure and then provide a case example or
examples to show how assessment data can be meaningfully
incorporated into the counseling process.
Teaches the principles of sampling with examples from the social sciences, public opinion research, public health, business, agriculture, and ecology. The book has been thoroughly revised to incorporate recent research and applications, and includes a new chapter on nonprobability samples as well as more than 200 new examples and exercises.
The Synoptic Problem and Statistics shows how statistics can be used for New Testament interpretation, laying the foundations for a new area of interdisciplinary research that applies statistical techniques to the synoptic problem in New Testament studies, which concerns the relationships between the Gospels of Matthew, Mark, and Luke. The techniques also have potential applications to the study of other sets of similar documents. The book provides an introductory account of the synoptic problem and the relevant theories, literature, and research at a level suitable for academic and professional statisticians. For those with no special interest in biblical studies or textual analysis, the book presents core statistical material on the use of hidden Markov models to analyze binary time series. Biblical scholars interested in the synoptic problem or in the use of statistical methods for textual analysis can omit the more technical and mathematical aspects of the book. The binary time series data sets and R code used are available on the author's website.
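As an indication of the statistical core of the book, the following is a minimal sketch, in R, of the scaled forward algorithm for a two-state hidden Markov model applied to a binary series; the observation sequence and the transition, emission, and initial probabilities are illustrative assumptions, not values or code from the book (the book's own data sets and R code are available on the author's website, as noted above).

# A minimal sketch of the scaled forward algorithm for a two-state HMM over a
# binary series; all parameter values below are made-up illustrative assumptions.
obs  <- c(1, 0, 0, 1, 1, 0, 1)                        # a short binary series
A    <- matrix(c(0.9, 0.1,
                 0.2, 0.8), nrow = 2, byrow = TRUE)   # state transition matrix
B    <- matrix(c(0.3, 0.7,
                 0.8, 0.2), nrow = 2, byrow = TRUE)   # P(obs = 0 | state), P(obs = 1 | state)
init <- c(0.5, 0.5)                                   # initial state distribution

forward_loglik <- function(obs, A, B, init) {
  alpha <- init * B[, obs[1] + 1]          # joint probability of state and first observation
  loglik <- log(sum(alpha))
  alpha <- alpha / sum(alpha)              # rescale to avoid numerical underflow
  for (t in seq_along(obs)[-1]) {
    alpha <- as.vector(crossprod(A, alpha)) * B[, obs[t] + 1]
    loglik <- loglik + log(sum(alpha))
    alpha <- alpha / sum(alpha)
  }
  loglik                                   # log-likelihood of the observed binary series
}
forward_loglik(obs, A, B, init)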