Factor Analysis
M. Venkataswamy Reddy in Statistical Methods in Psychiatry Research and SPSS, 2019
Principal components analysis (PCA) is a variant of factor analysis; both are data-analysis techniques. In PCA, all of the variance in the observed variables is analyzed, whereas in common factor analysis only the shared variance is analyzed. PCA extracts the maximum sum of squared loadings for each factor in turn; accordingly, PCA explains more variance than the loadings obtained from any other method of factoring. It is assumed that all the variables are standardized. The aim of the method is to construct new variables (Pi), called principal components, that are linear combinations of a given set of variables xj (j = 1, …, k).
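As an illustration of the passage above, here is a minimal numpy sketch, using hypothetical random data, that standardizes the variables, builds the principal components Pi as linear combinations of the xj, and recovers the stated property that the sum of squared loadings of each component equals its eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 100 observations of 4 variables, with some shared variance.
x = rng.standard_normal((100, 4))
x[:, 1] += 0.8 * x[:, 0]

# Standardize each variable (mean 0, unit variance), as the method assumes.
z = (x - x.mean(axis=0)) / x.std(axis=0)

# Eigendecomposition of the correlation matrix, largest eigenvalue first.
r = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(r)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Principal components P_i: linear combinations of the standardized x_j.
components = z @ eigvecs

# Loadings scale each eigenvector by the square root of its eigenvalue,
# so the sum of squared loadings for component i equals eigenvalue i.
loadings = eigvecs * np.sqrt(eigvals)
```

Because the variables are standardized, the eigenvalues sum to the number of variables, which is why PCA "accounts for" all the variance in turn.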
Evaluating Psoriasis in Patients
John Y. M. Koo, Ethan C. Levin, Argentina Leon, Jashin J. Wu, Alice B. Gottlieb in Moderate to Severe Psoriasis, 2014
For this study, one compound question from the 41-item PQOL (i.e., item #22: How much does your psoriasis interfere with making social contacts and relationships?) in the psychosocial domain was divided into two questions, creating a 42-item instrument. A combination of qualitative review and factor analysis was used to refine the questionnaire. Observations were randomly assigned to an exploratory or confirmatory dataset. The exploratory dataset (n = 301) was used to reduce and refine the existing PQOL instrument, and the confirmatory dataset (n = 182) was used to test the reliability of the findings from the exploratory analysis. Each PQOL item was evaluated for missing values, mean scores, floor/ceiling effects, reading level, translatability, and applicability to all patients. Qualitative criteria were applied by assessing items for redundancy, wording, and meaning/conceptual characteristics. Factor analysis was used to assess the factor structure and item loadings on factors. An item-retention grid consisting of all analytical parameters was created to evaluate all item parameter estimates simultaneously and to facilitate the item-reduction decision process. Once the instrument was reduced, all analyses performed on the exploratory dataset (i.e., descriptive and factor analysis) were repeated on the revised questionnaire (i.e., the PQOL-12) using the confirmatory dataset. The confirmatory analyses yielded results consistent with the exploratory analyses.
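The random exploratory/confirmatory split and the per-item screening described above can be sketched as follows; the response data here are hypothetical stand-ins (random 1–5 ratings), not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_respondents = 483                    # 301 + 182, matching the study's split sizes
# Hypothetical 42-item responses on a 1-5 scale.
scores = rng.integers(1, 6, size=(n_respondents, 42))

# Randomly assign observations to the exploratory and confirmatory datasets.
idx = rng.permutation(n_respondents)
exploratory = scores[idx[:301]]
confirmatory = scores[idx[301:]]

# Screen each item on the exploratory set: mean scores and
# floor/ceiling effects (share of responses at the scale extremes).
item_means = exploratory.mean(axis=0)
floor = (exploratory == 1).mean(axis=0)
ceiling = (exploratory == 5).mean(axis=0)
```

Any item-reduction decision made from these statistics would then be re-checked by repeating the same analyses on `confirmatory`, as the study did.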
Emulsion Rheology
Dennis Laba in Rheological Properties of Cosmetics and Toiletries, 2017
In multivariate scaling, the panelist is provided with a list of descriptors and asked to distinguish qualitatively between pairs of words or pairs of products. Numbers are generated that place each response on a scale or axis. Harmon (1966) utilized factor analysis, and Green and Rao (1972) used multidimensional scaling, to enable the axes to be visualized. Both techniques require that the panelists understand the concept behind the procedure, and periodic training may be needed to ensure that they maintain the correct mindset. Alander et al. (1990) utilized multivariate data analysis to classify emollients by comparing 11 physical properties with three sensory properties. They felt that the technique could be used to compare a new emollient with an already tested one, to select an emollient for a specific purpose, or to find a substitute. Multivariate scaling requires complex mathematical and statistical operations, the burden of which is minimized by user-friendly software. Examples appear in articles by Stone and Sidel (1986) and Aust et al. (1987).
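To illustrate the multidimensional-scaling step, here is a minimal classical (Torgerson) MDS sketch in numpy; the pairwise dissimilarity ratings between four products are hypothetical, but the output axes are the kind that can then be visualized and interpreted:

```python
import numpy as np

# Hypothetical averaged pairwise dissimilarity ratings between 4 products
# (symmetric, zero diagonal): products 0-1 and 2-3 were rated similar.
d = np.array([[0., 2., 6., 7.],
              [2., 0., 5., 6.],
              [6., 5., 0., 1.],
              [7., 6., 1., 0.]])

n = d.shape[0]
j = np.eye(n) - np.ones((n, n)) / n    # centering matrix
b = -0.5 * j @ (d ** 2) @ j            # double-centered squared dissimilarities

eigvals, eigvecs = np.linalg.eigh(b)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 2-D coordinates: products rated similar land close together on the axes.
coords = eigvecs[:, :2] * np.sqrt(np.maximum(eigvals[:2], 0))
```

Plotting `coords` gives the visualizable axes the passage refers to; in practice, dedicated software handles this computation.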
Assessment of the psychometric properties of the AQoL-4D questionnaire in Kannada language for use with adults with hearing loss
Published in International Journal of Audiology, 2019
Spoorthi Thammaiah, Vinaya Manchaiah, Rajalakshmi Krishna, Adriana A. Zekveld, Sophia E. Kramer
The data were analysed using IBM SPSS Statistics, Version 20 (IBM Corp., released 2011). First, means and standard deviations of the scores were obtained. Subsequently, a factor structure analysis was carried out. Factor analysis is a statistical method used to group a wide range of attributes into a smaller number of dimensions (factors). We used factor analysis to identify the factors assessing the overall construct HRQoL and the items of the AQoL questionnaire loading on those factors. Furthermore, the study involved the assessment of various other psychometric properties, including: (i) internal consistency (reliability), (ii) test–retest reliability (stability), (iii) convergent (or construct) validity, (iv) discriminant validity, and (v) floor/ceiling effects.
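Of the psychometric properties listed, internal consistency is the most straightforward to illustrate in code. Below is a minimal numpy sketch of Cronbach's alpha, the usual internal-consistency statistic, on hypothetical item responses (the data and item count are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical responses: 50 respondents, 12 items driven by one latent trait.
latent = rng.standard_normal((50, 1))
items = latent + 0.5 * rng.standard_normal((50, 12))

def cronbach_alpha(x):
    """Internal consistency of the items in the columns of x."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = x.sum(axis=1).var(ddof=1)      # variance of the total score
    return k / (k - 1) * (1 - item_var / total_var)

alpha = cronbach_alpha(items)
```

Because the simulated items share most of their variance with the latent trait, alpha comes out high; weakly related items would push it toward zero.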
Factor structure of the Rivermead Post-Concussion Symptoms Questionnaire over the first year following mild traumatic brain injury
Published in Brain Injury, 2018
Suzanne Barker-Collo, Alice Theadom, Nicola Starkey, Michael Kahan, Kelly Jones, Valery Feigin
The most commonly used measure of PCS, which was used in each of the above studies, is the Rivermead Post-Concussion Symptom Questionnaire (RPQ). Although the RPQ is one of the most commonly used measures of PCS, there is no commonly accepted way to report the data from this measure, which makes comparisons across different studies difficult. Most studies report RPQ data either as a total score across the measure or as the frequency with which specific RPQ symptoms are reported as present within a sample. Consequently, a number of examinations have been conducted into the RPQ and whether its items can be clustered into subscales or factors. Such examinations typically use factor analysis, a statistical technique applied to a set of variables when the researcher is interested in discovering which variables in the set form coherent subsets that are relatively independent of one another. The importance of a factor (or set of factors) is evaluated by the proportion of variance associated with that factor (8).
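The final point, evaluating a factor's importance by the proportion of variance associated with it, can be sketched as follows; the correlation matrix is a hypothetical example (three related symptom items plus one weakly related item), not RPQ data:

```python
import numpy as np

# Hypothetical correlation matrix for 4 symptom items: items 1-3 cluster,
# item 4 is only weakly related to the others.
r = np.array([[1.0, 0.6, 0.5, 0.1],
              [0.6, 1.0, 0.5, 0.1],
              [0.5, 0.5, 1.0, 0.1],
              [0.1, 0.1, 0.1, 1.0]])

# Eigenvalues of the correlation matrix, largest first.
eigvals = np.linalg.eigvalsh(r)[::-1]

# Importance of each factor: the proportion of total variance it carries.
proportion = eigvals / eigvals.sum()
```

Here the first factor absorbs most of the shared variance of the clustered items, which is exactly the sense in which it is the "important" factor.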
Reliability and validity of Persian version of Brief Self-Control Scale (BSCS) in motorcyclists
Published in International Journal of Psychiatry in Clinical Practice, 2020
Fatemeh Sadat Asgarian, Mahshid Namdari, Hamid Soori
Exploratory factor analysis was run using principal component analysis with varimax rotation, and items with factor loadings below 0.40 were not retained on the respective component. The adequacy of the sample size for factor analysis was assessed with the Kaiser–Meyer–Olkin Measure of Sampling Adequacy (KMO), with values of 0.6 and higher considered sufficient. Exploratory factor analysis was used for construct validity. The main purpose of factor analysis is to condense a large number of variables into a limited number of factors with the least possible loss of information. The KMO value (0.716) was higher than 0.6, and the significance level of Bartlett's test of sphericity was less than 0.001. Therefore, based on both criteria, it can be concluded that applying factor analysis to the correlation matrix in these sample groups is justified and the data are suitable for exploratory factor analysis.
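The varimax step of the procedure above can be sketched in numpy using the standard iterative varimax algorithm; the six-item, two-component loading matrix is hypothetical, not the BSCS data:

```python
import numpy as np

def varimax(loadings, tol=1e-8, max_iter=100):
    """Orthogonal varimax rotation of a (p items x k factors) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Gradient of the varimax criterion (variance of squared loadings).
        b = loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(b)
        rotation = u @ vt
        new_var = s.sum()
        if new_var - var < tol:       # criterion stopped improving
            break
        var = new_var
    return loadings @ rotation

# Hypothetical unrotated loadings for 6 items on 2 components.
l = np.array([[0.70, 0.30], [0.75, 0.35], [0.70, 0.40],
              [0.40, -0.60], [0.35, -0.70], [0.45, -0.65]])
rotated = varimax(l)
```

Because varimax is an orthogonal rotation, each item's communality (its row sum of squared loadings) is unchanged; only the split of loadings across components moves toward "simple structure", which is what makes the 0.40 retention cutoff meaningful.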
Related Knowledge Centers
- Covariance Matrix
- Intelligence
- Personality
- Psychometrics
- Scree Plot
- Latent & Observable Variables
- Operations Research
- Normalization
- Parallel Analysis
- Sample Size Determination