
Exploring symptom clusters in mild cognitive impairment and dementia with the NIH Toolbox

Published online by Cambridge University Press:  16 February 2024

Callie E. Tyner*
Affiliation:
Center for Health Assessment Research and Translation, University of Delaware, Newark, DE, USA
Aaron J. Boulton
Affiliation:
Center for Health Assessment Research and Translation, University of Delaware, Newark, DE, USA
Jerry Slotkin
Affiliation:
Center for Health Assessment Research and Translation, University of Delaware, Newark, DE, USA
Matthew L. Cohen
Affiliation:
Center for Health Assessment Research and Translation, University of Delaware, Newark, DE, USA Department of Communication Sciences & Disorders, University of Delaware, Newark, DE, USA Delaware Center for Cognitive Aging Research, University of Delaware, Newark, DE, USA
Sandra Weintraub
Affiliation:
Mesulam Center for Cognitive Neurology and Alzheimer’s Disease, Northwestern University Feinberg School of Medicine, Chicago, IL, USA Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
Richard C. Gershon
Affiliation:
Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA
David S. Tulsky
Affiliation:
Center for Health Assessment Research and Translation, University of Delaware, Newark, DE, USA Departments of Physical Therapy and Psychological and Brain Sciences, University of Delaware, Newark, DE, USA
*
Corresponding author: C. E. Tyner; Email: ctyner@udel.edu

Abstract

Objective:

Symptom clustering research provides a unique opportunity for understanding complex medical conditions. The objective of this study was to apply a variable-centered analytic approach to understand how symptoms may cluster together, within and across domains of functioning in mild cognitive impairment (MCI) and dementia, to better understand these conditions and potential etiological, prevention, and intervention considerations.

Method:

Cognitive, motor, sensory, emotional, and social measures from the NIH Toolbox were analyzed using exploratory factor analysis (EFA) from a dataset of 165 individuals with a research diagnosis of either amnestic MCI or dementia of the Alzheimer’s type.

Results:

The six-factor EFA solution described here primarily replicated the intended structure of the NIH Toolbox, with a few deviations: notably, sensory and motor scores loaded onto factors with measures of cognitive, emotional, and social health. These findings suggest the presence of cross-domain symptom clusters in these populations. In particular, negative affect, stress, loneliness, and pain formed one unique symptom cluster that bridged the NIH Toolbox domains of physical, social, and emotional health. Olfaction and dexterity formed a second unique cluster with measures of executive functioning, working memory, episodic memory, and processing speed. A third novel cluster was detected for mobility, strength, and vision, which was considered to reflect a physical functioning factor. Somewhat unexpectedly, the included hearing test did not load strongly onto any factor.

Conclusion:

This research presents a preliminary effort to detect symptom clusters in amnestic MCI and dementia using an existing dataset of outcome measures from the NIH Toolbox.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of International Neuropsychological Society

In recent years, the healthcare science community has taken a close interest in how symptoms cluster together within and across clinical conditions, as a way to understand potential shared etiologies, possible preventive strategies, and treatment interventions that may be useful (Miaskowski et al., 2007, 2017). This has been driven particularly strongly within oncology and nursing research, where so-called "symptom clustering" approaches have been used to understand the patient experience of their outcomes holistically when numerous co-occurring symptoms—such as pain, fatigue, insomnia, and depression—go undetected and unaddressed if the primary clinical focus is elsewhere, as in the case of cancer treatment (Dodd et al., 2001; Ho et al., 2015; Huang & Lin, 2009; Illi et al., 2012; Kim et al., 2005). The clinical challenges posed by complex, comorbid, and chronic conditions are familiar to clinical and health psychologists and neuropsychologists (Ashworth et al., 2015; Ford, 2018; Miles et al., 2021), although the concept of a "symptom cluster" as defined by these allied health research traditions (see Table 1) may be a novel framing (Barsevick, 2016, 2007; Dodd et al., 2001; Harris et al., 2022; Miaskowski et al., 2017).

Table 1. Definitions of a symptom cluster in the literature

In symptom clusters research, exploratory factor analysis (EFA) has been a preferred approach for detecting clusters of symptoms that may not be expected based on traditional diagnostic groupings or severity delineations (Harris et al., 2022). Considered one of several "variable-centered approaches" to symptom clusters research (in contrast to "patient-centered approaches," such as latent profile analysis), EFA provides a statistical understanding of how an array of symptoms can be grouped, revealing the underlying structure of symptoms and how they may be related or caused by shared etiologies (Barsevick, 2016; Harris et al., 2022; Oh et al., 2016; Skerman et al., 2012). In the psychometric and neuropsychological assessment traditions, EFA is often used for identifying "domains" for assessment, for psychometric validation of tests or batteries, or for further interpretation of assessments, such as developing composite scores (Ma et al., 2021; Strauss & Fritsch, 2004). While the language used for describing these approaches may differ between these research communities—trading "domains" for "symptom clusters"—the underlying clinical interest is a shared one: to use patient-centered assessments and advanced statistical techniques to better understand complex presentations of symptoms that may offer novel targets for assessment, intervention, and improvement in quality of life.
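
For readers less familiar with the psychometric formulation, the common factor model that underlies EFA can be written compactly as follows (a standard textbook expression, not a result specific to this study):

\mathbf{x} = \boldsymbol{\Lambda}\mathbf{f} + \boldsymbol{\varepsilon}, \qquad \operatorname{Cov}(\mathbf{x}) = \boldsymbol{\Lambda}\boldsymbol{\Phi}\boldsymbol{\Lambda}^{\top} + \boldsymbol{\Theta},

where x is the vector of observed symptom scores, Λ the matrix of factor loadings, f the latent factors with correlation matrix Φ, and Θ the diagonal matrix of unique (residual) variances. In this framing, a symptom cluster corresponds to a set of measures with strong loadings on the same column of Λ.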

Thus, in the present study, a symptom clustering framework was chosen to study the experience of people with amnestic mild cognitive impairment (aMCI) or dementia of the Alzheimer type (DAT), as it offers an opportunity to understand patient experiences beyond the lines of traditional domains as typically approached in neuropsychological research. The objective of this research is to detect and examine possible symptom clusters within and across individuals with a research diagnosis of either aMCI or mild DAT (Weintraub et al., 2022). Given the wide array of symptoms that can be experienced by patients with these diagnoses, which can span cognitive, emotional, motor, sensory, and social challenges (American Psychiatric Association, 2013; Lezak et al., 2012; Thomas et al., 2020), an exploratory symptom clustering approach offers the potential to detect groups of symptoms that cut across domains. While most symptom clusters research has been performed in oncology populations (Harris et al., 2022), this theoretical and methodological approach has been applied in other clinical groups—such as HIV, heart disease, lung disease, and kidney disease—and symptom clusters have been found that correlate with important patient-centered outcomes, including quality of life and health care utilization (Miaskowski et al., 2017). Furthermore, the National Institutes of Health (NIH) has recently supported several initiatives to examine other chronic disease groups for relevant symptom clusters (National Institutes of Health, 2017), with the goal of understanding what can be learned about these clinical groups through application of these methods. The present study reports on one of these recent projects.

Historically, research on symptom clusters (or domains of functioning) in dementia has primarily focused on using neuropsychological assessments to distinguish similar clinical syndromes with unique neuropathological causes (such as frontotemporal dementia, semantic dementia, and DAT) by comparing neuropsychiatric symptoms that support the diagnosis of one syndrome over another (Kramer et al., 2003; Marra et al., 2007). More recent research on symptom clusters relevant to MCI and dementia has focused primarily on understanding how patterns of cognitive functioning change with aging—for example, the distinct patterns of fluid cognition seen in adults with and without dementia (Ma et al., 2021; McDonough et al., 2016)—and on lifestyle predictors of cognitive decline, such as the connections between engaging in cognitive leisure activities and lower rates of dementia (Lee & Chi, 2016). Furthermore, research has shown the relevance of associated symptoms in noncognitive domains—such as motor, sensory, mood, and social functioning—to the daily functioning and quality of life of these patients (Dyer et al., 2020; Elovainio et al., 2022; Insha et al., 2022; Hwang et al., 2020; Rostamzadeh et al., 2022; Vaingankar et al., 2017; van der Linde et al., 2010). There has nevertheless been limited research exploring the associations of symptoms across cognitive and noncognitive domains in MCI and dementia. One study, for example, found a connection between vestibular functioning and visuospatial functioning, with worse performance in one domain associated with worse performance in the other (Bigelow et al., 2015). These findings are likely more than correlational; there is increasingly rigorous evidence that domains of noncognitive functioning (e.g., hearing, physical fitness) may be modifiable risk factors for cognitive decline and dementia (Cohen et al., 2021; Livingston et al., 2020). Thus, we believe that exploration of symptom clusters within and across domains in these populations may be fruitful for identifying targets for intervention and prevention, as well as possible markers of etiology.

For the purpose of measuring symptoms within and across domains, there have been important test development efforts to standardize systems of comprehensive symptom assessment, including the NIH Toolbox® (Gershon et al., 2010, 2013). The NIH Toolbox (NIHTB), which includes tests that span a wide range of cognitive, sensory, motor, emotional, and social domains, appears to have significant potential for exploring symptom clusters. The NIHTB tests are self-contained and conormed, and score corrections can be made based on demographics and/or premorbid ability estimates (Holdnack et al., 2017; Nitsch et al., 2017). As a computerized assessment, this widely researched battery benefits from high precision in measuring reaction times and from automated administration and scoring that can reduce user bias. The NIHTB has been validated previously in healthy aging (Scott et al., 2019) and dementia research (Ma et al., 2021), including a recent publication supporting the validity of the NIHTB in individuals aged 85 and older (Nolin et al., 2023), which further supports its utility for this study. For these reasons, we anticipate that the NIHTB will offer a useful assessment approach for symptom clusters research generally, and in aging research specifically, and we aim to evaluate it for this purpose in the current project. Using an existing dataset collected as part of the Advancing Reliable Measurement in Alzheimer’s Disease and Cognitive Aging (ARMADA) study (Weintraub et al., 2022) and a variable-oriented symptom clustering approach, we hypothesize that we will detect clusters of symptoms within, and potentially across, domains that may have relevance for understanding potential etiological factors, prevention targets, and intervention strategies for MCI and dementia.

Methods

Participants

Participants for this study were drawn from the ongoing data collection for the ARMADA study and included individuals recruited from nine established Alzheimer’s Disease Research Centers (ADRCs) across the United States. Although the ARMADA study includes several participant groups, only data from 165 individuals aged ≥ 60 years with a research diagnosis of aMCI (single or multidomain) or mild DAT were included in the present study. Diagnoses of aMCI and mild DAT were based on the 2011 NIA–Alzheimer’s Association criteria (Albert et al., 2011; McKhann et al., 2011) following the procedures of the National Alzheimer’s Coordinating Center (Morris et al., 2006; Weintraub et al., 2018, 2009), with a Clinical Dementia Rating Scale (CDR; Morris, 1993) global score of 0.5 required for aMCI and 1.0 for mild DAT. Participants were excluded for acute neurological disorders that could lead to cognitive impairment, a history of major psychiatric illness, or substance use disorder (see Weintraub et al., 2022, for full details on sample recruitment, inclusion/exclusion criteria, and diagnostic classification procedures). The study was approved by the institutional review boards at each of the participating data collection sites and was completed in accordance with the Declaration of Helsinki.

Study design, measures, and procedures

Data used for these analyses come from the baseline assessment timepoint of the ARMADA project and included administration of the NIHTB (English language version) and a detailed inventory of demographics and health history (in many cases this was the Uniform Data Set [UDS] assessment). Table 2 presents details on the NIHTB assessments administered in the baseline assessment of the ARMADA study, including scoring metrics and directions. Analyses were conducted on all measures in NIHTB version 2.0 except the Standing Balance test, which was only completed by a minority of participants. Only cases in which the NIHTB and demographic/health history data were collected within a 130-day window were included in the present analyses.

Table 2. NIH Toolbox measures by battery and domain

Notes. Uncorrected SS = uncorrected standardized scores (M = 100, SD = 15); these and the T-scores (M = 50, SD = 10) are weighted to the 2010 Census. Better Ear Threshold for the Words-In-Noise Test is defined as the lowest threshold score observed for either ear, in units of decibels of signal-to-noise ratio (dB S/N).

Data analyses

EFA was used to analyze the data. All EFA models were fit in R using the lavaan package (version 0.6-16; Rosseel, 2012). Rates of missing data were generally low (< 5%), although a small number of assessments had somewhat higher rates of missingness (e.g., Picture Sequence Memory = 18%). Full-information maximum likelihood (FIML) estimation was used, which draws on all available information in the dataset. We examined EFA solutions extracting between 1 and 9 factors. Several criteria were used to decide on the number of factors, as employing multiple criteria simultaneously may be optimal compared to reliance on any single criterion alone (Auerswald & Moshagen, 2019). First, we conducted a parallel analysis (Horn, 1965). Second, we compared global model fit between adjacent solutions, as indicated by the chi-square test statistic (χ²), the root-mean-square error of approximation (RMSEA; Steiger & Lind, 1980), and the Tucker-Lewis index (TLI; Tucker & Lewis, 1973). Conventional fit index cutoffs—despite their known limitations (Marsh et al., 2004)—were used to pinpoint solutions that exhibited acceptable fit to the data: a nonsignificant χ², RMSEA values below .06, and TLI values greater than .95 (Hu & Bentler, 1999). We also noted the solution in which the lower bound of the 90% confidence interval for the RMSEA dipped below .05 (Preacher et al., 2013). Comparative model fit was indexed by the Bayesian information criterion (BIC; Schwarz, 1978), with the lowest BIC value indicating an optimal balance between model fit and parsimony. Finally, the interpretability of each EFA solution was given the largest weight when deciding how many factors to extract. Following extraction, factor loadings were rotated using oblimin, a common oblique rotation method, as we expected the extracted factors to be correlated. Factor scores were then computed using empirical Bayes estimation and compared (t-test, Cohen’s d) between the aMCI and DAT subsamples.
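
As a minimal sketch of this workflow in R (the data frame name nihtb_data and its columns are hypothetical placeholders for the 25 NIHTB outcome variables; the exact calls and options used by the study team are not reproduced here), the analysis described above could be specified with the lavaan package roughly as follows:

# Exploratory factor analysis of NIH Toolbox scores (sketch; lavaan >= 0.6-13)
library(lavaan)

efa_fits <- efa(
  data     = nihtb_data,   # hypothetical data frame of 25 NIHTB measures
  nfactors = 1:9,          # examine 1- through 9-factor solutions
  rotation = "oblimin",    # oblique rotation; factors allowed to correlate
  missing  = "fiml"        # full-information maximum likelihood for missing data
)

# Global and comparative fit indices for each solution
lapply(efa_fits, fitMeasures,
       fit.measures = c("chisq", "df", "pvalue", "rmsea",
                        "rmsea.ci.lower", "tli", "bic"))

summary(efa_fits)   # rotated loadings and interfactor correlations per solution

# Factor scores (empirical Bayes modal estimator) for the retained 6-factor solution
scores <- lavPredict(efa_fits[[6]], method = "EBM")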

Results

Demographic characteristics of the analysis sample are shown in Table 3. Participant age was approximately normally distributed between 60 and 94 years. Participant gender was roughly evenly distributed, with a slightly higher proportion of males in each diagnostic category. The sample was predominantly non-Hispanic and White. A majority of the sample reported high levels of educational attainment (4-year college degree or higher). Descriptive statistics for the primary outcome variables are shown in Table 4 (descriptive statistics within each diagnostic category are available in the Supplementary Material). Apart from the 9-Hole Pegboard and Odor Identification tests, all variables exhibited skewness and kurtosis indices between −2 and 2. Moderate floor effects were also observed for three measures: Picture Sequence Memory, Pain Interference, and Friendship.

Table 3. Demographic characteristics

Table 4. Descriptive statistics

Parallel analysis suggested the extraction of four factors (Fig. 1). Model fit results, shown in Table 5, were mixed. The chi-square test statistic was significantly different from zero for all solutions except the 7-, 8-, and 9-factor solutions. The RMSEA fell below .06 in the 5-factor solution, and the lower bound of the 90% RMSEA confidence interval fell below .05 in this solution as well. The BIC was lowest for the 4-factor solution, although the TLI did not indicate acceptable fit until the 7-factor solution. Given these conflicting results, factor loading patterns for the 5-, 6-, 7-, 8-, and 9-factor solutions were inspected, and the 6-factor solution was judged the most interpretable. This solution exhibited acceptable fit on the RMSEA and nearly met the conventional .95 threshold for the TLI.
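
For reference, the parallel analysis step could be reproduced in R along these lines (a sketch assuming the same hypothetical nihtb_data data frame; the number of simulated datasets and the estimation method shown are illustrative rather than the study's exact settings):

library(psych)

# Compare observed eigenvalues against the average eigenvalues from random data;
# factors are suggested wherever the observed eigenvalue exceeds the random average
pa <- fa.parallel(nihtb_data, fm = "ml", fa = "fa", n.iter = 100)
pa$nfact   # number of factors suggested by parallel analysis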

Figure 1. Parallel analysis scree plot. In this figure, the triangles represent eigenvalues obtained across the 25 NIHTB measures. The dotted line represents average eigenvalues obtained from randomly generated datasets. Four of the observed eigenvalues were greater than the average of random samples, and thus the parallel analysis suggests retention of four factors, although a 6-factor structure was ultimately chosen as a more interpretable solution.

Table 5. Model fit

Note. BIC = Bayesian information criterion, CI = confidence interval, df = degrees of freedom, p = p-value, RMSEA = root-mean-square error of approximation, TLI = Tucker-Lewis index, χ² = chi-square goodness-of-fit test statistic.

Rotated factor loadings for the 6-factor solution are shown in Table 6 (a second oblique rotation method, geomin, was inspected and produced similar rotated loading values), and interfactor correlations are provided in Table 7. The factor loadings largely adhered to a simple structure pattern, with very few secondary loadings greater than .3. Communalities, which represent the proportion of variance in each indicator variable accounted for by all extracted factors (symbolized as h²), were largest (> .5) for emotional, social, and cognitive performance outcomes and smallest for motor and sensory outcomes; in particular, the tests of visual acuity (Visual Acuity Test; h² = .14) and auditory function (Words-in-Noise; h² = .17) had the lowest communalities.
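
For readers less familiar with these quantities, the communality and uniqueness reported in Table 6 are related as follows (the oblique-case expression is the general form; under an orthogonal rotation it reduces to a simple sum of squared loadings):

h_i^2 = \boldsymbol{\lambda}_i^{\top} \boldsymbol{\Phi}\, \boldsymbol{\lambda}_i, \qquad u_i^2 = 1 - h_i^2, \qquad h_i^2 = \sum_{j} \lambda_{ij}^2 \ \text{(orthogonal case)},

where λ_i is the vector of rotated loadings for test i and Φ is the interfactor correlation matrix.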

Table 6. Factor loadings for the 6-factor solution

Note. h² = communality: proportion of variance in each NIH Toolbox test accounted for collectively by all extracted factors. u² = uniqueness: proportion of variance in each NIH Toolbox test not accounted for by the extracted factors. Factor loadings with absolute values > .3 are highlighted in boldface.

Table 7. Factor correlations for 6-factor solution

Note. Interfactor correlations are shown below the diagonal. The proportion of variance accounted for in the entire variable set by each factor is shown on the diagonal.

Measures from the Cognition battery loaded onto two factors reflecting Fluid Intelligence (Factor 1) and Crystallized Intelligence (Factor 2). Along with the five cognitive tests of fluid intelligence, one motor test and one sensory test (9-Hole Pegboard, Odor Identification) also exhibited weak-to-moderate loadings on Fluid Intelligence (Factor 1). The relationships of olfaction with cognitive decline in aging and with frontal-lobe structural integrity appear potentially relevant for understanding this factor (Bathini et al., 2019; Felix et al., 2021; Roberts et al., 2016; Sohrabi et al., 2012; Yap et al., 2022). Regarding Crystallized Intelligence (Factor 2), the loading for Picture Vocabulary dominated the factor (1.00 when rounded); a similar factor loading pattern was observed for this factor by Ma et al. (2021) in individuals with MCI/dementia.

Measures from the Emotion battery loaded onto three factors that were interpreted as indicators of Negative Affect (Factor 3), Positive Affect/Life Satisfaction (Factor 4), and Social Health (Factor 5). Although these factors were quite homogeneous (i.e., indicated by variables with strong primary loadings and weak secondary loadings), both Pain Interference and Loneliness exhibited moderate loadings on Negative Affect (Factor 3), and Loneliness also exhibited a moderate loading on Social Health (Factor 5). The remaining sixth factor comprised several Physical Function outcomes across motor (gait speed/ambulation endurance, manual motor strength) and sensory (visual acuity) domains (Factor 6), although the factor loading pattern suggests this factor is most strongly indicated by lower extremity function (ambulation endurance). The auditory perception measure, the Words-in-Noise Test, did not load clearly on any factor; the highest loading observed for this test (absolute value = .25) was on the Physical Function factor (Factor 6).

Interfactor correlations ranged between −.51 and .41 (see Table 7); the strongest correlations were observed among the three factors indicated by the Emotion battery measures (Factors 3, 4, and 5). The proportion of variance in the NIHTB tests accounted for by each factor is shown on the diagonal of Table 7. These proportions ranged between .07 and .15, with the highest value (.15) observed for Negative Affect and the lowest value (.07) observed for Crystallized Intelligence and Positive Affect/Life Satisfaction. Overall, the factor loading patterns and communality estimates suggest that the outcomes from the Emotion and Cognition batteries form rather tight-knit clusters, whereas the Motor and Sensory outcome measures form less cohesive clusters that may be dominated by a single measure. Nevertheless, we believe that the cross-domain factor loadings may be useful for understanding patterns of symptoms that cluster together in meaningful ways in these populations. Finally, factor score comparisons between the aMCI and DAT subsamples are shown in Table 8. The DAT group exhibited significantly worse functioning than the aMCI group on three factors: Fluid Intelligence (Factor 1), Crystallized Intelligence (Factor 2), and Physical Function (Factor 6); effect sizes (absolute values) ranged from .01 to 1.14.
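
A minimal sketch of the subgroup comparisons summarized in Table 8 (assuming the factor scores from the retained solution have been extracted as above and merged with a hypothetical diagnosis indicator dx) might look like this:

# scores: matrix of factor scores from lavPredict(); dx: factor with levels "aMCI" and "DAT"
f1_amci <- scores[dx == "aMCI", 1]
f1_dat  <- scores[dx == "DAT",  1]

t.test(f1_amci, f1_dat)   # Welch two-sample t-test for Fluid Intelligence (Factor 1)

# Cohen's d using a pooled standard deviation
pooled_sd <- sqrt(((length(f1_amci) - 1) * var(f1_amci) +
                   (length(f1_dat)  - 1) * var(f1_dat)) /
                  (length(f1_amci) + length(f1_dat) - 2))
(mean(f1_amci) - mean(f1_dat)) / pooled_sd

The same comparison would be repeated for each of the six factors.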

Table 8. Mean comparisons

Note. Negative values for d indicate lower average factor scores for the DAT group compared to the aMCI group and vice versa.

Discussion

This study provides a novel contribution by highlighting several cross-domain symptom clusters identified with the NIHTB in a clinical sample of individuals with either aMCI or mild DAT. We found that measures of sensory and motor functioning were associated with cognitive, emotional, and social functioning in these populations; rather than forming homogeneous factors, measures from the NIHTB Sensory and Motor batteries spread across several factors, albeit with somewhat weaker loadings.

Specifically, Pain Interference loaded primarily on a factor along with three measures of Negative Affect, a measure of Stress, and a measure of Social Relationships. Taken together, Sadness, Fear, Anger, Stress, Loneliness, and Pain Interference could therefore be considered a symptom cluster indicative of the complexities of negative emotional experiences in aMCI and DAT, which are influenced by both physical and social health. Loneliness was the one measure in this EFA with dual loadings > 0.30, contributing both to this symptom cluster of Negative Affect and Pain and, with a negative loading, to a factor with two other measures of Social Health: Friendship and Emotional Support.

One sensory domain, Olfaction, and one motor domain, Dexterity, were found to load with the Fluid Intelligence factor. Both domains are known to be associated with frontal lobe functioning, so the association of Olfaction and Dexterity in a symptom cluster with the fluid cognition measures of Executive Functioning, Working Memory, Episodic Memory, and Processing Speed is not surprising. Odor identification has long been proposed as a biomarker of cognitive impairment that may be useful for early screening (Bathini et al., 2019; Roberts et al., 2016; Yap et al., 2022), so we believe these associations could be fruitful for future research on noninvasive predictive biomarkers of cognitive decline.

When the sensory domains of Vision and Hearing were evaluated for associations with other domains across the NIHTB, the Visual Acuity Test loaded meaningfully on a factor with Strength, Locomotion, and Endurance, which together formed what we considered to be a Physical Function factor. Interpreted through the lens of symptom clustering, vision appears to form a meaningful symptom cluster with mobility and strength in aMCI and DAT—markers of physical decline and frailty that prior research has shown to meaningfully impact quality of life and overall health (Liljas et al., 2017). The aMCI group performed significantly better on the Physical Function factor than the DAT group (Factor 6; effect size = .34), which highlights the relevance of declining physical functioning (specifically strength, locomotion, endurance, and vision) for individuals with dementia (Auyeung et al., 2008; Wang et al., 2006).

Hearing was expected to be relevant across domains in this clinical sample in a similar way to vision, given prior research documenting the relevance of auditory functioning to cognitive performance in older adults (O’Brien et al., 2021). However, the NIHTB test of audition, the Words-In-Noise Test, did not load strongly on any of the identified factors. This measure is more cognitively complex than a simple hearing test, as it requires listeners to identify spoken words presented against multitalker background noise of increasing volume (Zecker et al., 2013). Future prospective research is needed to determine whether a pure hearing test—such as the recently developed Hearing Threshold Test for the NIHTB (Wiseman et al., 2022)—would be relevantly associated with performance tests or symptoms in other domains.

Mirroring the structure of the NIHTB Cognition Battery, our results support separate factors for Fluid and Crystallized Intelligence (Mungas et al., 2014). This study also replicated the discrepancy between NIHTB fluid and crystallized cognition in clinical and preclinical MCI and dementia samples that has been documented in recent research (Ma et al., 2021; McDonough et al., 2016). The aMCI group performed significantly better than the DAT group on both Fluid Intelligence (Factor 1; effect size = 1.14) and Crystallized Intelligence (Factor 2; effect size = .34). Thus, the discrepancy between fluid and crystallized intelligence appears to increase in magnitude with cognitive decline/worse cognitive performance. Specifically, the difference between fluid and crystallized cognition mean scores observed in this sample was .26 for the aMCI group and .33 for the DAT group (see Table 8). Moderation analysis could be used to further elucidate this trend.
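
One way such a moderation analysis might be framed (purely illustrative; the variable names below are hypothetical, and this model was not fit in the present study) is a regression of fluid factor scores on crystallized factor scores with diagnostic group as a moderator:

# dat: hypothetical data frame with factor scores and a diagnosis factor (aMCI vs. DAT)
fit_mod <- lm(fluid ~ crystallized * diagnosis, data = dat)
summary(fit_mod)   # a significant interaction would indicate that the
                   # fluid-crystallized relation differs by diagnostic group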

On the Fluid Intelligence factor in this study (which included both Odor Identification and the 9-Hole Pegboard Dexterity Test), one notable finding was that Picture Sequence Memory had the weakest loading (0.43) and List Sorting Working Memory the second weakest (0.60), compared with the three other fluid cognition tests on this factor (range 0.72–0.87; see Table 6). As two of the more challenging tests in the NIHTB Cognition battery, Picture Sequence Memory and List Sorting Working Memory have been shown previously to have reduced completion rates in cognitively impaired samples (Hackett et al., 2018; Ma et al., 2021). This was also seen in our study, with 17.6% and 9.7% of the sample missing scores on the Picture Sequence Memory and List Sorting Working Memory tests, respectively. We suspect this is most likely attributable to the discontinue criterion applied to both tests after the sample items, such that the test is not attempted unless the test taker can accurately complete the practice component. Nevertheless, we suspect that the attenuation of these loadings is primarily due to the difficulty of these tasks, although this should be evaluated in future research.

On the Crystallized Intelligence factor, Picture Vocabulary was the dominant contributor (1.00), with a strong loading also observed for Oral Reading (0.62). The loadings for these two crystallized cognition tests were less balanced than has been shown in prior research on the NIHTB in a general adult population sample. For example, Mungas et al. (2014) documented loadings of 0.84 for Vocabulary (Picture Vocabulary Test and PPVT-R) and 0.99 for Reading (Oral Reading Recognition Test and WRAT-R) on the Crystallized Intelligence factor in their study. The larger split between the loadings of these two tests on Factor 2 found in our study may suggest a weakening of the association between vocabulary knowledge and reading ability with cognitive decline, which was also observed by Ma et al. (2021).

Potential clinical implications and future directions

This study represents an initial step toward improving patient care for aMCI and DAT using data-driven symptom science. The results, although exploratory in nature, suggest that multidomain assessment of each patient—considering their symptoms holistically, including emotional, social, and physical symptoms in addition to cognitive functioning—should continue to be the standard of care for neuropsychological assessment in MCI and DAT. The results also suggest potential shared targets for intervention and assessment. In particular, several of the symptoms loading on Factor 3 may be modifiable with a shared treatment approach. As one example, Pain, Sadness, and Fear may be symptoms of a shared underlying problem that could be improved with antidepressant and/or anxiolytic medication (Feeney, 2004; Lin et al., 2003). Alternatively, the symptoms of Anger, Stress, and Loneliness on Factor 3 could be appropriate targets for a socially based intervention, such as a support group or community engagement activity (O’Rourke et al., 2018). Recent reviews of social interventions for older adults have cited the need to expand the theoretical understanding of how social and emotional symptoms are related in older adults (Gardiner et al., 2018) and of how the social and emotional health symptoms experienced by older adults may be unique for individuals with cognitive impairments (Cohen-Mansfield & Perach, 2015). We anticipate that our results could be useful for supporting the design of a future study to evaluate and potentially validate psychosocial interventions for these patients.

Our findings also suggest approaches to screening and assessment for individuals with aMCI and DAT. Research suggests that early identification of Alzheimer’s disease is possible and beneficial, given the relevance of modifiable risk factors that, if identified early, can be treated to potentially prevent progression to dementia (Isaacson et al., 2018; Norton et al., 2014; Rasmussen & Langerman, 2019; Saxton et al., 2004). Although moderate in size, the loadings on Factor 1 in our study suggest that Odor Identification and/or the 9-Hole Pegboard Dexterity Test could potentially be used as screening tests to identify individuals who may benefit from cognitive assessment, with the goal of early detection. There has recently been renewed interest in using sensory measures to detect early brain changes associated with dementia (Bathini et al., 2019; National Institutes of Health, 2022; Yap et al., 2022). Because these sensory tests from the NIHTB are simple to deploy, they could be implemented in a variety of settings by health professionals who are not neuropsychologists. Analysis of the sensitivity and specificity of these measures would be needed to validate their predictive utility for screening and is an area warranting further research. The results from this study also lend support to the development of a clinically oriented assessment battery based on the NIHTB. This is an area of interest for members of the ARMADA investigator team, and we anticipate it will be explored in future research stemming from this larger project.
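
As a hedged illustration of what such a validation analysis might involve (the outcome and predictor variables below are hypothetical, and a real screening study would require a gold-standard diagnosis in an independent sample), sensitivity and specificity for a candidate screening measure could be summarized with an ROC analysis:

library(pROC)

# impaired: hypothetical 0/1 indicator of gold-standard cognitive impairment
# odor_id : Odor Identification scores for the same participants
roc_odor <- roc(response = impaired, predictor = odor_id)
auc(roc_odor)                          # overall discriminative ability
coords(roc_odor, x = "best",           # threshold maximizing Youden's index
       ret = c("threshold", "sensitivity", "specificity"))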

Study limitations

This study has several limitations, which stem primarily from the use of an existing dataset rather than prospective data collection designed explicitly to evaluate symptom clusters. Ideally, future research would use a broader array of symptom-oriented assessments. Prospective research is needed that includes prespecified symptom assessments across the domains found to be important in this study, as well as other domains suspected to be salient, to replicate and expand on these findings. For example, we suspect it would be fruitful to expand the assessments, particularly in the sensory domains, to capture subtle differences in, for example, hearing and visual acuity that may contribute to relevant symptom clusters in these populations. Fortunately, future waves of ARMADA data collection will use three additional tests that were not available in the baseline dataset but would be relevant for this future research: the Face-Name Associative Memory Exam (FNAME; Rentz et al., 2011), the Hearing Threshold Test (Wiseman et al., 2022), and Near Visual Acuity (under development). The results from the present study will be useful for shaping future study design and data collection efforts.

Concerns about the accuracy of self-appraisals of symptoms in dementia may also limit the interpretability of symptom cluster research with this population, although the fact that the DAT participants in this study were limited to those with mild disease should help partially assuage this concern, given that difficulties with self-awareness (including anosognosia) have been observed to progress with disease burden (Hanseeuw et al., 2020). Finally, the absence of explicit performance validity testing or symptom validity assessment in the NIHTB is an important limitation. Future prospective studies should include measures of performance and symptom validity to elucidate relevant patterns that may be affecting the symptom clusters observed in this study. In fact, symptom clustering approaches may themselves prove useful for this purpose: recent research has demonstrated how “patient-centered approaches” of symptom clusters research (e.g., latent class analysis) can be used to identify patterns of cognitive effort and symptom magnification (Morin & Axelrod, 2017).

Some features of the data and sample may limit the generalizability of these findings. The sample size was modest, the aMCI group skewed slightly older, and both subgroups were restricted to individuals with memory impairment. Additionally, the sample was not representative of the diversity of individuals with MCI and DAT in the general population with regard to race, ethnicity, and education. Furthermore, the exclusions based on preexisting neurologic and psychiatric illness also distort the sample makeup. The findings from this study may differ if replicated in a larger, more representative, and more diverse sample. Replication of these results should include tests of measurement invariance across diagnostic categories and over time. Moderate floor effects were also present for three of the measures, most notably Picture Sequence Memory, for which a small portion of the sample (17.6%) did not complete the measure (we suspect due to not passing the sample items) and another 24.2% scored at the floor. This is not surprising, given that memory impairment was consistently demonstrated throughout this aMCI and DAT sample, and we do not suspect that the results were unduly influenced, apart from possible attenuation of the relation between this test and the Fluid Intelligence factor; nevertheless, this remains an open question and a potential limitation of the current study.

Last, we believe it may be useful to understand whether the factors and loadings identified in this study sample would replicate or vary in a meaningful way in older adults without cognitive impairment. Although this was not the purpose of the current project, we did explore the applicability of the exploratory model identified in this study in a separate sample of older adults without significant memory impairment collected as part of the same ARMADA study. Specifically, we conducted a separate, post hoc EFA using data from 156 individuals ages 65–85 from the older adult “healthy control” cohort, extracting the same number of factors as was selected for the final aMCI/DAT solution. Factor loadings from this solution were then rotated using Procrustes rotation (Fischer & Karl, 2019) to make them as similar as possible to the rotated solution found in the aMCI/DAT sample. Results of these analyses, including similarities and differences in factor structure between the samples, are provided in the Supplementary Material for interested readers.
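
For readers interested in the mechanics, an orthogonal Procrustes rotation of one loading matrix toward a target can be computed in a few lines of base R (a simplified sketch; Fischer and Karl (2019) describe the fuller workflow, and the matrices below are placeholders for the two samples' rotated loadings):

# L_hc : loadings from the healthy-control EFA (rows = tests, columns = factors)
# L_ref: target loadings from the aMCI/DAT solution, same dimensions
procrustes_rotate <- function(L_hc, L_ref) {
  s    <- svd(t(L_hc) %*% L_ref)   # SVD of the cross-product between source and target
  Tmat <- s$u %*% t(s$v)           # orthogonal rotation minimizing ||L_hc %*% Tmat - L_ref||
  L_hc %*% Tmat                    # healthy-control loadings aligned to the target
}

Factor congruence between the aligned solutions could then be summarized, for example, with Tucker's congruence coefficient.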

Conclusions

This research provides support for using the NIHTB to detect clusters of symptoms that people with aMCI and DAT experience. Researchers and clinicians can use these findings to shape their assessment approaches and considerations for treatment. For example, individuals with symptoms of fear or sadness may also be experiencing pain and life stress that could be targets for intervention to improve quality of life. Future research is needed to validate the utility of symptom clusters for planning assessment and treatment approaches in these patients.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/S1355617724000055.

Acknowledgements

We thank the study participants and their family members for their time and efforts to participate in the original ARMADA study. We also thank the staff at the research sites who helped collect the data.

Funding statement

This work was supported by the National Institute of Nursing Research (D.T., C.T., A.B., J.S., & M.C.; grant number R01NR018684) and the National Institute on Aging (R.G., S.W., & J.S.; grant number U2C AG057441).

Competing interests

None.

References

Albert, M. S., DeKosky, S. T., Dickson, D., Dubois, B., Feldman, H. H., Fox, N. C., Gamst, A., Holtzman, D. M., Jagust, W. J., Petersen, R. C., Snyder, P. J., Carrillo, M. C., Thies, B., & Phelps, C. H. (2011). The diagnosis of mild cognitive impairment due to Alzheimer’s disease: Recommendations from the National Institute on Aging-Alzheimer’s Association workgroups on diagnostic guidelines for Alzheimer’s disease. Alzheimers & Dementia, 7(3), 270279. https://doi.org/10.1016/j.jalz.2011.03.008 CrossRefGoogle ScholarPubMed
American Psychiatric Association (2013). Diagnostic and statistical manual of mental disorders (DSM-5®) (5th ed.). American Psychiatric Association Publishing.Google Scholar
Ashworth, D. K., Sletten, T. L., Junge, M., Simpson, K., Clarke, D., Cunnington, D., & Rajaratnam, S. M. (2015). A randomized controlled trial of cognitive behavioral therapy for insomnia: An effective treatment for comorbid insomnia and depression. Journal of Counseling Psychology, 62(2), 115123. https://doi.org/10.1037/cou0000059 CrossRefGoogle ScholarPubMed
Auerswald, M., & Moshagen, M. (2019). How to determine the number of factors to retain in exploratory factor analysis: A comparison of extraction methods under realistic conditions. Psychological Methods, 24(4), 468491. https://doi.org/10.1037/met0000200 CrossRefGoogle ScholarPubMed
Auyeung, T. W., Kwok, T., Lee, J., Leung, P. C., Leung, J., & Woo, J. (2008). Functional decline in cognitive impairment: The relationship between physical and cognitive function. Neuroepidemiology, 31(3), 167173. https://doi.org/10.1159/000154929 CrossRefGoogle Scholar
Barsevick, A. M. (2007). The elusive concept of the symptom cluster. Oncology Nursing Forum, 34(5), 971980. https://doi.org/10.1188/07.ONF.971-980 CrossRefGoogle ScholarPubMed
Barsevick, A. (2016). Defining the symptom cluster: How far have we come? Seminars in Oncology Nursing, 32(4), 334350. https://doi.org/10.1016/j.soncn.2016.08.001 CrossRefGoogle ScholarPubMed
Bathini, P., Brai, E., & Auber, L. A. (2019). Olfactory dysfunction in the pathophysiological continuum of dementia. Ageing Research Reviews, 55, 100956. https://doi.org/10.1016/j.arr.2019.100956 CrossRefGoogle Scholar
Bigelow, R. T., Semenov, Y. R., Trevino, C., Ferrucci, L., Resnick, S. M., Simonsick, E. M., Xue, Q‐Li, & Agrawal, Y. (2015). Association between visuospatial ability and vestibular function in the Baltimore longitudinal study of aging. Journal of the American Geriatrics Society, 63(9), 18371844. https://doi.org/10.1111/jgs.13609 CrossRefGoogle ScholarPubMed
Cohen, M. L., Ryan, A. C., & Lanzi, A. M. (2021). Prevention of and early intervention for cognitive decline due to Alzheimer’s disease and related disorders. Delaware Journal of Public Health, 7(4), 118122. https://doi.org/10.32481/djph.2021.09.014 CrossRefGoogle Scholar
Cohen-Mansfield, J., & Perach, R. (2015). Interventions for alleviating loneliness among older persons: A critical review. American Journal of Health Promotion, 29(3), e109e125. https://doi.org/10.4278/ajhp.130418-LIT-182 CrossRefGoogle Scholar
Dodd, M. J., Miaskowski, C., & Paul, S. M. (2001). Symptom clusters and their effect on the functional status of patients with cancer. Oncology Nursing Forum, 28(3), 465470, https://www.ncbi.nlm.nih.gov/pubmed/11338755.Google ScholarPubMed
Dyer, A. H., Lawlor, B., & Kennelly, S. P. (2020). Gait speed, cognition and falls in people living with mild-to-moderate Alzheimer disease: Data from NILVAD. BMC Geriatrics, 20(1), Article 117. https://doi.org/10.1186/s12877-020-01531-w CrossRefGoogle ScholarPubMed
Elovainio, M., Lahti, J., Pirinen, M., Pulkki-Råback, L., Malmberg, A., Lipsanen, J., Virtanen, M., Kivimäki, M., & Hakulinen, C. (2022). Association of social isolation, loneliness and genetic risk with incidence of dementia: UK biobank cohort study. BMJ Open, 12(2), e053936. https://doi.org/10.1136/bmjopen-2021-053936 CrossRefGoogle ScholarPubMed
Feeney, S. L. (2004). The relationship between pain and negative affect in older adults: Anxiety as a predictor of pain. Journal of Anxiety Disorders, 18(6), 733744. https://doi.org/10.1016/j.janxdis.2001.04.001 CrossRefGoogle ScholarPubMed
Felix, C., Chahine, L. M., Hengenius, J., Chen, H., Rosso, A. L., Zhu, X., Cao, Z., & Rosano, C. (2021). Diffusion tensor imaging of the olfactory system in older adults with and without hyposmia. Frontiers in Aging Neuroscience, 13, Article 648598. https://doi.org/10.3389/fnagi.2021.648598 CrossRefGoogle ScholarPubMed
Fischer, R., & Karl, J. A. (2019). A primer to (Cross-cultural) multi-group invariance testing possibilities in R. Frontiers in Psychology, 10,  Article 1507. https://doi.org/10.3389/fpsyg.2019.01507 CrossRefGoogle ScholarPubMed
Ford, J. D. (2018). Understanding the intersection of borderline personality and somatoform disorders: A developmental trauma disorder framework. Clinical Psychology: Science and Practice, 25(2), Article e12243. https://doi.org/10.1111/cpsp.12243 Google Scholar
Gardiner, C., Geldenhuys, G., & Gott, M. (2018). Interventions to reduce social isolation and loneliness among older people: An integrative review. Health & Social Care in the Community, 26(2), 147157. https://doi.org/10.1111/hsc.12367 CrossRefGoogle Scholar
Gershon, R. C., Cella, D., Fox, N. A., Havlik, R. J., Hendrie, H. C., & Wagster, M. V. (2010). Assessment of neurological and behavioural function: The NIH toolbox. Lancet Neurology, 9(2), 138139. https://doi.org/10.1016/S1474-4422(09)70335-7 CrossRefGoogle Scholar
Gershon, R. C., Wagster, M. V., Hendrie, H. C., Fox, N. A., Cook, K. F., & Nowinski, C. J. (2013). NIH toolbox for assessment of neurological and behavioral function. Neurology, 80(11 Suppl 3), S26. https://doi.org/10.1212/WNL.0b013e3182872e5f CrossRefGoogle ScholarPubMed
Hackett, K., Krikorian, R., Giovannetti, T., Melendez‐Cabrero, J., Rahman, A., Caesar, E. E., Chen, J. L., Hristov, H., Seifan, A., Mosconi, L., & Isaacson, R. S. (2018). Utility of the NIH toolbox for assessment of prodromal Alzheimer’s disease and dementia. Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring, 10(1), 764772. https://doi.org/10.1016/j.dadm.2018.10.002 Google Scholar
Hanseeuw, B. J., Scott, M. R., Sikkes, S. A. M., Properzi, M., Gatchel, J. R., Salmon, E., Marshall, G. A., Vannini, P., & Alzheimer’s Disease Neuroimaging Initiative (2020). Alzheimer’s disease neuroimaging initiative, 2020, evolution of anosognosia in Alzheimer’s disease and its relationship to amyloid. Annals of Neurology, 87(2), 267280. https://doi.org/10.1002/ana.25649 CrossRefGoogle Scholar
Harris, C. S., Dodd, M., Kober, K. M., Dhruva, A. A., Hammer, M. J., Conley, Y. P., & Miaskowski, C. A. (2022). Advances in conceptual and methodological issues in symptom cluster research: A 20-year perspective. ANS: Advances in Nursing Science, 45(4), 309–322. https://doi.org/10.1097/ANS.0000000000000423
Ho, S. Y., Rohan, K. J., Parent, J., Tager, F. A., & McKinley, P. S. (2015). A longitudinal study of depression, fatigue, and sleep disturbances as a symptom cluster in women with breast cancer. Journal of Pain and Symptom Management, 49(4), 707–715. https://doi.org/10.1016/j.jpainsymman.2014.09.009
Holdnack, J. A., Tulsky, D. S., Slotkin, J., Tyner, C. E., Gershon, R., Iverson, G. L., & Heinemann, A. W. (2017). NIH toolbox premorbid ability adjustments: Application in a traumatic brain injury sample. Rehabilitation Psychology, 62(4), 496–508. https://doi.org/10.1037/rep0000198
Horn, J. L. (1965). A rationale and test for the number of factors in factor analysis. Psychometrika, 30(2), 179–185. https://doi.org/10.1007/BF02289447
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118
Huang, T. W., & Lin, C. C. (2009). The mediating effects of depression on sleep disturbance and fatigue: Symptom clusters in patients with hepatocellular carcinoma. Cancer Nursing, 32(5), 398–403. https://doi.org/10.1097/NCC.0b013e3181ac6248
Hwang, P. H., Longstreth, W. T. Jr, Brenowitz, W. D., Thielke, S. M., Lopez, O. L., Francis, C. E., DeKosky, S. T., & Fitzpatrick, A. L. (2020). Dual sensory impairment in older adults and risk of dementia from the GEM study. Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring, 12(1), Article e12054. https://doi.org/10.1002/dad2.12054
Illi, J., Miaskowski, C., Cooper, B., Levine, J. D., Dunn, L., West, C., Dodd, M., Dhruva, A., Paul, S. M., Baggott, C., Cataldo, J., Langford, D., Schmidt, B., & Aouizerat, B. E. (2012). Association between pro- and anti-inflammatory cytokine genes and a symptom cluster of pain, fatigue, sleep disturbance, and depression. Cytokine, 58(3), 437–447. https://doi.org/10.1016/j.cyto.2012.02.015
Insha, R., Arshad, H., & Fazle, R. (2022). Loneliness, social isolation, traumatic life events, and risk of Alzheimer’s dementia: A case-control study. Indian Journal of Social Psychiatry, 38(3), 276–281. https://doi.org/10.4103/ijsp.ijsp_284_20
Isaacson, R. S., Ganzer, C. A., Hristov, H., Hackett, K., Caesar, E., Cohen, R., Kachko, R., Meléndez‐Cabrero, J., Rahman, A., Scheyer, O., Hwang, M. J., Berkowitz, C., Hendrix, S., Mureb, M., Schelke, M. W., Mosconi, L., Seifan, A., & Krikorian, R. (2018). The clinical practice of risk reduction for Alzheimer’s disease: A precision medicine approach. Alzheimer’s & Dementia, 14(12), 1663–1673. https://doi.org/10.1016/j.jalz.2018.08.004
Kim, H. J., McGuire, D. B., Tulman, L., & Barsevick, A. M. (2005). Symptom clusters: Concept analysis and clinical implications for cancer nursing. Cancer Nursing, 28(4), 270–282. https://doi.org/10.1097/00002820-200507000-00005
Kramer, J. H., Jurik, J., Sha, S. J., Rankin, K. P., Rosen, H. J., Johnson, J. K., & Miller, B. L. (2003). Distinctive neuropsychological patterns in frontotemporal dementia, semantic dementia, and Alzheimer disease. Cognitive and Behavioral Neurology, 16(4), 211–218. https://doi.org/10.1097/00146965-200312000-00002
Lee, Y., & Chi, I. (2016). Do cognitive leisure activities really matter in the relationship between education and cognition? Evidence from the aging, demographics, and memory study (ADAMS). Aging & Mental Health, 20(3), 252–261. https://doi.org/10.1080/13607863.2015.1011081
Lezak, M. D., Howieson, D. B., Bigler, E. D., & Tranel, D. (2012). Neuropathology for neuropsychologists. In Neuropsychological assessment (5th ed., pp. 179–345). Oxford University Press, Inc.
Liljas, A. E. M., Carvalho, L. A., Papachristou, E., De Oliveira, C., Wannamethee, S. G., Ramsay, S. E., & Walters, K. R. (2017). Self-reported vision impairment and incident prefrailty and frailty in English community-dwelling older adults: Findings from a 4-year follow-up study. Journal of Epidemiology and Community Health, 71(11), 1053–1058. https://doi.org/10.1136/jech-2017-209207
Lin, E. H. B., Katon, W., Von Korff, M., Tang, L., Williams, J. W. Jr, Kroenke, K., Hunkeler, E., Harpole, L., Hegel, M., Arean, P., Hoffing, M., Della Penna, R., Langston, C., Unützer, J. C., & for the IMPACT Investigators (2003). Effect of improving depression care on pain and functional outcomes among older adults with arthritis: A randomized controlled trial. Journal of the American Medical Association, 290(18), 2428–2429. https://doi.org/10.1001/jama.290.18.2428
Livingston, G., Huntley, J., Sommerlad, A., Ames, D., Ballard, C., Banerjee, S., Brayne, C., Burns, A., Cohen-Mansfield, J., Cooper, C., Costafreda, S. G., Dias, A., Fox, N., Gitlin, L. N., Howard, R., Kales, H. C., Kivimäki, M., Larson, E. B., Ogunniyi, A., Orgeta, V., Ritchie, K., Rockwood, K., Sampson, E. L., Samus, Q., Schneider, L. S., Selbæk, G., Teri, L., & Mukadam, N. (2020). Dementia prevention, intervention, and care: 2020 report of the Lancet Commission. Lancet, 396(10248), 413–446. https://doi.org/10.1016/s0140-6736(20)30367-6
Ma, Y., Carlsson, C. M., Wahoske, M. L., Blazel, H. M., Chappell, R. J., Johnson, S. C., Asthana, S., & Gleason, C. E. (2021). Latent factor structure and measurement invariance of the NIH toolbox cognition battery in an Alzheimer’s disease research sample. Journal of the International Neuropsychological Society, 27(5), 412–425. https://doi.org/10.1017/S1355617720000922
Marra, C., Quaranta, D., Zinno, M., Misciagna, S., Bizzarro, A., Masullo, C., Daniele, A., & Gainotti, G. (2007). Clusters of cognitive and behavioral disorders clearly distinguish primary progressive aphasia from frontal lobe dementia, and Alzheimer’s disease. Dementia and Geriatric Cognitive Disorders, 24(5), 317–326. https://doi.org/10.1159/000108115
Marsh, H. W., Hau, K.-T., & Wen, Z. (2004). In search of golden rules: Comment on hypothesis-testing approaches to setting cutoff values for fit indexes and dangers in overgeneralizing Hu and Bentler’s (1999) findings. Structural Equation Modeling, 11(3), 320–341. https://doi.org/10.1207/s15328007sem1103_2
McDonough, I. M., Bischof, G. N., Kennedy, K. M., Rodrigue, K. M., Farrell, M. E., & Park, D. C. (2016). Discrepancies between fluid and crystallized ability in healthy adults: A behavioral marker of preclinical Alzheimer’s disease. Neurobiology of Aging, 46, 68–75. https://doi.org/10.1016/j.neurobiolaging.2016.06.011
McKhann, G. M., Knopman, D. S., Chertkow, H., Hyman, B. T., Jack, C. R. Jr, Kawas, C. H., Klunk, W. E., Koroshetz, W. J., Manly, J. J., Mayeux, R., Mohs, R. C., Morris, J. C., Rossor, M. N., Scheltens, P., Carrillo, M. C., Thies, B., Weintraub, S., & Phelps, C. H. (2011). The diagnosis of dementia due to Alzheimer’s disease: Recommendations from the National Institute on Aging-Alzheimer’s Association workgroups on diagnostic guidelines for Alzheimer’s disease. Alzheimer’s & Dementia, 7(3), 263–269. https://doi.org/10.1016/j.jalz.2011.03.005
Miaskowski, C., Aouizerat, B. E., Dodd, M., & Cooper, B. (2007). Conceptual issues in symptom clusters research and their implications for quality-of-life assessment in patients with cancer. Journal of the National Cancer Institute Monographs, 2007(37), 39–46. https://doi.org/10.1093/jncimonographs/lgm003
Miaskowski, C., Barsevick, A., Berger, A., Casagrande, R., Grady, P. A., Jacobsen, P., Kutner, J., Patrick, D., Zimmerman, L., Xiao, C., Matocha, M., & Marden, S. (2017). Advancing symptom science through symptom cluster research: Expert panel proceedings and recommendations. Journal of the National Cancer Institute, 109(4), djw253. https://doi.org/10.1093/jnci/djw253
Miles, S. R., Silva, M. A., Lang, B., Hoffman, J. M., Venkatesan, U. M., Sevigny, M., & Nakase-Richardson, R. (2021). Sleep apnea and posttraumatic stress after traumatic brain injury (TBI): A veterans affairs TBI model systems study. Rehabilitation Psychology, 66(4), 450–460. https://doi.org/10.1037/rep0000389
Morin, R. T., & Axelrod, B. N. (2017). Use of latent class analysis to define groups based on validity, cognition, and emotional functioning. Clinical Neuropsychologist, 31(6-7), 1087–1099. https://doi.org/10.1080/13854046.2017.1341550
Morris, J. C. (1993). The clinical dementia rating (CDR): Current version and scoring rules. Neurology, 43(11), 2412–2414. https://doi.org/10.1212/wnl.43.11.2412-a
Morris, J. C., Weintraub, S., Chui, H. C., Cummings, J., DeCarli, C., Ferris, S., Foster, N. L., Galasko, D., Graff-Radford, N., Peskind, E. R., Beekly, D., Ramos, E. M., & Kukull, W. A. (2006). The uniform data set (UDS): Clinical and cognitive variables and descriptive data from Alzheimer disease centers. Alzheimer Disease and Associated Disorders, 20(4), 210–216. https://doi.org/10.1097/01.wad.0000213865.09806.92
Mungas, D., Heaton, R., Tulsky, D., Zelazo, P. D., Slotkin, J., Blitz, D., Lai, J.-S., & Gershon, R. (2014). Factor structure, convergent validity, and discriminant validity of the NIH toolbox cognitive health battery (NIHTB-CHB) in adults. Journal of the International Neuropsychological Society, 20(6), 579–587. https://doi.org/10.1017/S1355617714000307
National Institutes of Health (2017). Symptom cluster characterization in chronic conditions (R01). Funding opportunity announcement (FOA) number: PA-17-462. National Institutes of Health. https://grants.nih.gov/grants/guide/pa-files/PA-17-462.html
National Institutes of Health (2022). Notice of special interest (NOSI): Sensory and motor system changes as predictors of preclinical Alzheimer’s disease. Notice number: NOT-AG-21-044. National Institutes of Health. https://grants.nih.gov/grants/guide/notice-files/NOT-AG-21-044.html
Nitsch, K. P., Casaletto, K. B., Carlozzi, N. E., Tulsky, D. S., Heinemann, A. W., & Heaton, R. K. (2017). Uncorrected versus demographically-corrected scores on the NIH toolbox cognition battery in persons with traumatic brain injury and stroke. Rehabilitation Psychology, 62(4), 485–495. https://doi.org/10.1037/rep0000122
Nolin, S. A., Cowart, H., Merritt, S., McInerney, K., Bharadwaj, P. K., Franchetti, M. K., Raichlen, D. A., Jessup, C. J., Hishaw, G. A., Van Etten, E. J., Trouard, T. P., Geldmacher, D. S., Wadley, V. G., Porges, E. S., Woods, A. J., Cohen, R. A., Levin, B. E., Rundek, T., Alexander, G. E., & Visscher, K. M. (2023). Validity of the NIH toolbox cognitive battery in a healthy oldest-old 85+ sample. Journal of the International Neuropsychological Society, 29(6), 605–614. https://doi.org/10.1017/S1355617722000443
Norton, S., Matthews, F. E., Barnes, D. E., Yaffe, K., & Brayne, C. (2014). Potential for primary prevention of Alzheimer’s disease: An analysis of population-based data. The Lancet Neurology, 13(8), 788–794. https://doi.org/10.1016/S1474-4422(14)70136-X
O’Brien, J. L., Lister, J. J., Fausto, B. A., Morgan, D. G., Maeda, H., Andel, R., & Edwards, J. D. (2021). Are auditory processing and cognitive performance assessments overlapping or distinct? Parsing the auditory behaviour of older adults. International Journal of Audiology, 60(2), 123–132. https://doi.org/10.1080/14992027.2020.1791366
Oh, J. H., Thor, M., Olsson, C., Skokic, V., Jörnsten, R., Alsadius, D., Pettersson, N., Steineck, G., & Deasy, J. (2016). A factor analysis approach for clustering patient reported outcomes. Methods of Information in Medicine, 55(5), 431–439. https://doi.org/10.3414/ME16-01-0035
O’Rourke, H. M., Collins, L., & Sidani, S. (2018). Interventions to address social connectedness and loneliness for older adults: A scoping review. BMC Geriatrics, 18(1), 214. https://doi.org/10.1186/s12877-018-0897-x
Preacher, K. J., Zhang, G., Kim, C., & Mels, G. (2013). Choosing the optimal number of factors in exploratory factor analysis: A model selection perspective. Multivariate Behavioral Research, 48(1), 28–56. https://doi.org/10.1080/00273171.2012.710386
Rasmussen, J., & Langerman, H. (2019). Alzheimer’s disease – why we need early diagnosis. Degenerative Neurological and Neuromuscular Disease, 9, 123–130. https://doi.org/10.2147/DNND.S228939
Rentz, D. M., Amariglio, R. E., Becker, J. A., Frey, M., Olson, L. E., Frishe, K., Carmasin, J., Maye, J. E., Johnson, K. A., & Sperling, R. A. (2011). Face-name associative memory performance is related to amyloid burden in normal elderly. Neuropsychologia, 49(9), 2776–2783. https://doi.org/10.1016/j.neuropsychologia.2011.06.006
Roberts, R. O., Christianson, T. J. H., Kremers, W. K., Mielke, M. M., Machulda, M. M., Vassilaki, M., Alhurani, R. E., Geda, Y. E., Knopman, D. S., & Petersen, R. C. (2016). Association between olfactory dysfunction and amnestic mild cognitive impairment and Alzheimer disease dementia. JAMA Neurology, 73(1), 93–101. https://doi.org/10.1001/jamaneurol.2015.2952
Rosseel, Y. (2012). Lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36. https://doi.org/10.18637/jss.v048.i02
Rostamzadeh, A., Kahlert, A., Kalthegener, F., & Jessen, F. (2022). Psychotherapeutic interventions in individuals at risk for Alzheimer’s dementia: A systematic review. Alzheimer’s Research & Therapy, 14(1), Article 18. https://doi.org/10.1186/s13195-021-00956-8
Saxton, J., Lopez, O. L., Ratcliff, G., Dulberg, C., Fried, L. P., Carlson, M. C., Newman, A. B., & Kuller, L. (2004). Preclinical Alzheimer disease: Neuropsychological test performance 1.5 to 8 years prior to onset. Neurology, 63(12), 2341–2347. https://doi.org/10.1212/01.wnl.0000147470.58328.50
Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics, 6(2), 461–464. https://doi.org/10.1214/aos/1176344136
Scott, E. P., Sorrell, A., & Benitez, A. (2019). Psychometric properties of the NIH toolbox cognition battery in healthy older adults: Reliability, validity, and agreement with standard neuropsychological tests. Journal of the International Neuropsychological Society, 25(8), 857–867. https://doi.org/10.1017/S1355617719000614
Skerman, H. M., Yates, P. M., & Battistutta, D. (2012). Identification of cancer-related symptom clusters: An empirical comparison of exploratory factor analysis methods. Journal of Pain and Symptom Management, 44(1), 10–22. https://doi.org/10.1016/j.jpainsymman.2011.07.009
Sohrabi, H. R., Bates, K. A., Weinborn, M. G., Johnston, A. N. B., Bahramian, A., Taddei, K., Laws, S. M., Rodrigues, M., Morici, M., Howard, M., Martins, G., Mackay-Sim, A., Gandy, S. E., & Martins, R. N. (2012). Olfactory discrimination predicts cognitive decline among community-dwelling older adults. Translational Psychiatry, 2(5), e118. https://doi.org/10.1038/tp.2012.43
Steiger, J. H., & Lind, J. C. (1980). Statistically based tests for the number of common factors. Paper presented at the annual meeting of the Psychometric Society.
Strauss, M. E., & Fritsch, T. (2004). Factor structure of the CERAD neuropsychological battery. Journal of the International Neuropsychological Society, 10(4), 559–565. https://doi.org/10.1017/S1355617704104098
Thomas, K. R., Cook, S. E., Bondi, M. W., Unverzagt, F. W., Gross, A. L., Willis, S. L., & Marsiske, M. (2020). Application of neuropsychological criteria to classify mild cognitive impairment in the ACTIVE study. Neuropsychology, 34(8), 862–873. https://doi.org/10.1037/neu0000694
Tucker, L. R., & Lewis, C. (1973). A reliability coefficient for maximum likelihood factor analysis. Psychometrika, 38(1), 1–10. https://doi.org/10.1007/bf02291170
Vaingankar, J. A., Chong, S. A., Abdin, E., Picco, L., Jeyagurunathan, A., Seow, E., Ng, L. L., Prince, M., & Subramaniam, M. (2017). Behavioral and psychological symptoms of dementia: Prevalence, symptom groups and their correlates in community-based older adults with dementia in Singapore. International Psychogeriatrics, 29(8), 1363–1376. https://doi.org/10.1017/S1041610217000564
van der Linde, R., Stephan, B. C. M., Matthews, F. E., Brayne, C., Savva, G. M., & the Medical Research Council Cognitive Function and Ageing Study (2010). Behavioural and psychological symptoms in the older population without dementia: Relationship with socio-demographics, health and cognition. BMC Geriatrics, 10(1), 87. https://doi.org/10.1186/1471-2318-10-87
Wang, L., Larson, E. B., Bowen, J. D., & van Belle, G. (2006). Performance-based physical function and future dementia in older people. Archives of Internal Medicine, 166(10), 1115–1120. https://doi.org/10.1001/archinte.166.10.1115
Weintraub, S., Besser, L., Dodge, H. H., Teylan, M., Ferris, S., Goldstein, F. C., Giordani, B., Kramer, J., Loewenstein, D., Marson, D., Mungas, D., Salmon, D., Welsh-Bohmer, K., Zhou, X.-H., Shirk, S. D., Atri, A., Kukull, W. A., Phelps, C., & Morris, J. C. (2018). Version 3 of the Alzheimer disease centers’ neuropsychological test battery in the uniform data set (UDS). Alzheimer Disease and Associated Disorders, 32(1), 10–17. https://doi.org/10.1097/WAD.0000000000000223
Weintraub, S., Karpouzian‐Rogers, T., Peipert, J. D., Nowinski, C., Slotkin, J., Wortman, K., Ho, E., Rogalski, E., Carlsson, C., Giordani, B., Goldstein, F., Lucas, J., Manly, J. J., Rentz, D., Salmon, D., Snitz, B., Dodge, H. H., Riley, M., Eldes, F., Ustsinovich, V., … Gershon, R. (2022). ARMADA: Assessing reliable measurement in Alzheimer’s disease and cognitive aging project methods. Alzheimer’s & Dementia, 18(8), 1449–1460. https://doi.org/10.1002/alz.12497
Weintraub, S., Salmon, D., Mercaldo, N., Ferris, S., Graff-Radford, N. R., Chui, H., Cummings, J., DeCarli, C., Foster, N. L., Galasko, D., Peskind, E., Dietrich, W., Beekly, D. L., Kukull, W. A., & Morris, J. C. (2009). The Alzheimer’s disease centers’ uniform data set (UDS): The neuropsychologic test battery. Alzheimer Disease and Associated Disorders, 23(2), 91–101. https://doi.org/10.1097/WAD.0b013e318191c7dd
Wiseman, K., Slotkin, J., Spratford, M., Haggerty, A., Heusinkvelt, M., Weintraub, S., Gershon, R., & McCreery, R. (2022). Validation of a tablet-based assessment of auditory sensitivity for researchers. Behavior Research Methods, 55(6), 2838–2852. https://doi.org/10.3758/s13428-022-01933-1
Yap, A. C., Mahendran, R., Kua, E. H., Zhou, W., & Wang, D. Y. (2022). Olfactory dysfunction is associated with mild cognitive impairment in community-dwelling older adults. Frontiers in Aging Neuroscience, 14, 930686. https://doi.org/10.3389/fnagi.2022.930686
Zecker, S. G., Hoffman, H. J., Frisina, R., Dubno, J. R., Dhar, S., Wallhagen, M., Kraus, N., Griffith, J. W., Walton, J. P., Eddins, D. A., Newman, C., Victorson, D., Warrier, C. M., & Wilson, R. H. (2013). Audition assessment using the NIH toolbox. Neurology, 80(11 Suppl 3), S45–S48. https://doi.org/10.1212/WNL.0b013e3182872dd2
Table 1. Definitions of a symptom cluster in the literature
Table 2. NIH Toolbox measures by battery and domain
Table 3. Demographic characteristics
Table 4. Descriptive statistics

Figure 1. Parallel analysis scree plot. The triangles represent the eigenvalues obtained across the 25 NIHTB measures; the dotted line represents the average eigenvalues obtained from randomly generated datasets. Four of the observed eigenvalues exceeded the random-data average, so the parallel analysis suggested retaining four factors, although the six-factor structure was ultimately chosen as the more interpretable solution (a minimal code sketch of this procedure follows the table listing below).

Table 5. Model fit
Table 6. Factor loadings for the 6-factor solution
Table 7. Factor correlations for the 6-factor solution
Table 8. Mean comparisons
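The Figure 1 caption above describes the parallel-analysis criterion used to judge how many factors to retain: eigenvalues observed in the sample are compared against eigenvalues averaged over randomly generated datasets of the same size (Horn, 1965). The following R sketch illustrates that comparison under stated assumptions; it is not the authors' code, the data frame name nihtb_scores is hypothetical, and the psych package is assumed here only as one common implementation of the procedure.

library(psych)        # fa.parallel() implements Horn's (1965) parallel analysis

set.seed(2024)        # make the randomly generated comparison datasets reproducible
pa <- fa.parallel(
  nihtb_scores,       # hypothetical data frame: participants x 25 NIH Toolbox scores
  fa = "fa",          # compare factor (not principal component) eigenvalues
  fm = "ml",          # maximum likelihood extraction (one common choice)
  n.iter = 100        # number of random datasets averaged for the dotted reference line
)
pa$nfact              # count of observed eigenvalues exceeding the random-data average

Under these assumptions, pa$nfact would return the number of factors suggested by the parallel analysis, which the Figure 1 caption reports as four for this sample.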

Supplementary material
Tyner et al. supplementary material 1 (File, 236.8 KB)
Tyner et al. supplementary material 2 (File, 55.9 KB)