Renal sinus fat (RSF) crucially influences metabolic regulation, inflammation, and vascular function. We investigated the association between RSF accumulation, metabolic disorders, and nutritional status in obese individuals with hypertension. A cross-sectional study involved 51 obese hypertensive patients from Salamat Specialized Community Clinic (February–September 2022). Basic and clinical information were collected through interviews. Data included anthropometrics, blood pressure, number of antihypertensive medications, body composition (bioelectrical impedance analysis), dietary intake (semi-quantitative 147-item food frequency questionnaire), and blood samples. Renal sinus fat was measured via ultrasonography. Statistical analyses included Pearson correlation, binary logistic regression, and linear regression. RSF positively correlated with abdominal visceral adipose tissue (VAT) area (P = 0.016), systolic blood pressure (SBP) (P = 0.004), and diastolic blood pressure (DBP) (P = 0.005). A strong trend toward a positive association was observed between the number of antihypertensive medications and RSF (P = 0.062). In linear regression, RSF was independently associated with abdominal VAT area, SBP, and DBP after adjusting for confounders. After accounting for other risk factors, RSF volume was associated with the number of prescribed antihypertensive medications, hypertension, and central fat accumulation in obese hypertensive subjects. These findings suggest the need for further investigation into whether RSF promotes metabolic disorders.
The aim of this study was to analyze the validity and reliability of the Turkish version of the Renal Inpatient Nutrition Screening Tool (Renal iNUT) for hemodialysis patients.
The Renal Inpatient Nutrition Screening Tool (Renal iNUT) and the Malnutrition Universal Screening Tool (MUST) were used in adult hemodialysis patients at two different centers to identify malnutrition. The Subjective Global Assessment (SGA), regarded as the gold standard for nutritional status assessment, was used for comparison. Structural validity was assessed using biochemical values and anthropometric measurements, while reliability was assessed by repeating the Renal iNUT assessment. Of the 260 patients admitted, 42.3% were malnourished (SGA score B or C). According to the Renal iNUT, 59.6% of the patients were at increased risk for malnutrition (score ≥1) and 3.8% required referral to a dietitian (score ≥2). According to the MUST, 13.1% of the patients were at increased risk for malnutrition and 8.5% required referral to a dietitian. The Renal iNUT was more sensitive in detecting increased risk of malnutrition in hemodialysis patients than the MUST (59.6% versus 13.1%). Against the SGA, the sensitivity of the Renal iNUT was higher than that of the MUST (89% and 45%, respectively). The kappa for the test-retest reliability of the Renal iNUT was 0.48 (95% CI, 0.58-0.9), indicating moderate concordance. The Renal iNUT is a valid and reliable nutritional screening tool for evaluating the nutritional status of hemodialysis patients. Its use by dietitians will contribute to the identification and treatment of malnutrition.
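As a hedged illustration of the statistics used here (sensitivity and specificity against the SGA gold standard, and kappa-based agreement), the following Python sketch uses invented toy labels and assumes scikit-learn is available:

```python
# Hypothetical illustration: sensitivity/specificity against a gold
# standard (e.g. the SGA) and chance-corrected agreement (kappa).
# The labels below are invented toy data, not study data.
from sklearn.metrics import cohen_kappa_score, confusion_matrix

gold = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # 1 = malnourished by SGA
tool = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]  # 1 = flagged by screening tool

tn, fp, fn, tp = confusion_matrix(gold, tool).ravel()
sensitivity = tp / (tp + fn)           # true cases correctly flagged
specificity = tn / (tn + fp)           # non-cases correctly cleared
kappa = cohen_kappa_score(gold, tool)  # agreement beyond chance

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} kappa={kappa:.2f}")
```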
Selenium (Se) is a mineral with several biological functions, and studies have shown that its deficiency can be linked to many complications in patients with chronic kidney disease (CKD). This study aims to systematically review the effects of Se supplementation in patients with CKD undergoing haemodialysis (HD). This systematic review was carried out according to the PRISMA statement. Clinical trials were searched in PubMed, Lilacs, Embase, Scopus and Cochrane Library databases from inception to July 2021, and the search was updated in July 2024. The protocol was registered on PROSPERO (CRD42021231444). Two independent reviewers performed the study screening and data extraction, and the risk of bias was evaluated using the Cochrane Collaboration tool. Thirteen studies were included in this review. Only nine studies reported results on Se levels; in all of these, reduced Se levels were observed before supplementation. A positive effect of supplementation on plasma Se levels was demonstrated. Of the ten studies that analysed antioxidant and inflammatory markers, six demonstrated positive effects. Only one study analysed immunological parameters, showing a positive impact. Of two studies that analysed thyroid hormones, only one showed positive results. All studies were classified as having a high risk of bias. The findings suggest that Se supplementation significantly increases plasma Se levels in these patients; however, there are still not enough studies to clarify the effects of Se supplementation on antioxidant and inflammatory markers, the immune system and thyroid hormones. Further studies are needed to elucidate the effects of Se supplementation and to provide a recommendation for patients with CKD undergoing HD.
The potential threshold for dietary energy intake (DEI) that might prevent protein-energy wasting (PEW) in chronic kidney disease (CKD) is uncertain. The subjects were non-dialysis CKD patients aged ≥ 14 years who were hospitalised from September 2019 to July 2022. PEW was assessed by subjective global assessment. DEI and dietary protein intake (DPI) were obtained from 3-d diet recalls. Patients were divided into an adequate DEI group (DEI ≥ 30 kcal/kg/d) and an inadequate DEI group (DEI < 30 kcal/kg/d). Logistic regression analysis and restricted cubic splines were used in this study. We enrolled 409 patients, of whom 53·8 % had hypertension and 18·6 % had diabetes. The DEI and DPI were 27·63 (sd 5·79) kcal/kg/d and 1·00 (0·90, 1·20) g/kg/d, respectively. 69·2 % of participants were in the inadequate DEI group. Malnutrition occurred in 18·6 % of patients. Compared with patients in the adequate DEI group, those in the inadequate DEI group had significantly lower total lymphocyte counts, serum cholesterol and LDL-cholesterol and a higher prevalence of PEW. For every 1 kcal/kg/d increase in DEI, the odds of PEW were reduced by 12·0 % (OR: 0·880, 95 % CI: 0·830, 0·933, P < 0·001). There was a nonlinear relationship between DEI and PEW (overall P < 0·001), and DEI ≥ 27·6 kcal/kg/d may have a preventive effect on PEW in CKD. Low DPI was also significantly associated with malnutrition, but not when DEI was adequate. Decreased energy intake may be a more important determinant of PEW in CKD than decreased protein intake.
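Because per-unit odds ratios multiply under a logistic model, the reported OR of 0·880 per 1 kcal/kg/d compounds over larger DEI increments. A minimal Python sketch of that arithmetic (the OR is the only value taken from the text):

```python
# Reading a per-unit odds ratio: for a k-unit increase in DEI,
# the odds of PEW scale by OR ** k under the fitted logistic model.
OR_PER_UNIT = 0.880  # reported OR per 1 kcal/kg/d

for delta in (1, 3, 5):
    print(f"+{delta} kcal/kg/d -> OR = {OR_PER_UNIT ** delta:.3f}")
# +5 kcal/kg/d gives OR ~ 0.53, i.e. roughly half the odds of PEW
```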
To examine associations between three different plant-based diet quality indices, chronic kidney disease (CKD) prevalence and related risk factors in a nationally representative sample of the Australian population.
Design:
Cross-sectional analysis. Three plant-based diet scores were calculated using data from two 24-h recalls: an overall plant-based diet index (PDI), a healthy PDI (hPDI) and an unhealthy PDI (uPDI). Consumption of plant and animal ingredients from ‘core’ and ‘discretionary’ products was also differentiated. Associations between the three PDI scores and CKD prevalence, BMI, waist circumference (WC), blood pressure (BP) measures, blood cholesterol, apo B, fasting TAG, blood glucose levels (BGL) and HbA1c were examined.
Setting:
Australian Health Survey 2011–2013.
Participants:
n 2060 adults aged ≥ 18 years (males: n 928; females: n 1132).
Results:
A higher uPDI score was associated with a 3·7 % higher odds of moderate-severe CKD (OR: 1·037; 95 % CI 1·0057, 1·0697; P = 0·021). A higher uPDI score was also associated with increased TAG (P = 0·032) and BGL (P < 0·001), but lower total and LDL-cholesterol (P = 0·035 and P = 0·009, respectively). In contrast, a higher overall PDI score was inversely associated with WC (P < 0·001) and systolic BP (P = 0·044), while higher scores for both the overall PDI and hPDI were inversely associated with BMI (P < 0·001 and P = 0·019, respectively).
Conclusions:
A higher uPDI score, reflecting greater intakes of refined grains, salty plant-based foods and added sugars, was associated with increased CKD prevalence, TAG and BGL. In the Australian population, attention to diet quality remains paramount, even among those with higher intakes of plant foods who wish to reduce their risk of CKD.
Patients with diabetes have a higher risk of developing chronic kidney disease (CKD). Early detection of CKD through microalbuminuria screening, followed by treatment, delays the progression of CKD. We evaluated the cost-effectiveness of population-based screening for microalbuminuria among normotensive type 2 diabetes mellitus patients aged >40 years, compared with a no-screening scenario, using a decision tree combined with a Markov model.
Methods
We considered two scenarios: Scenario I – dipstick microalbuminuria followed by spot-urine albumin–creatinine ratio (ACR) and serum creatinine in sequence; Scenario II – spot urine ACR plus serum creatinine. A mathematical cohort of the target population was simulated over a lifetime horizon with an annual cycle. Data for the model were obtained from secondary sources. The incremental cost-effectiveness ratios (ICERs) were estimated for the screening scenarios compared with the no-screening scenario, along with sensitivity analyses.
Results
The discounted ICERs per quality-adjusted life-year gained for annual microalbuminuria screening in the normotensive diabetic population in India were ₹ 24,114 (US$ 308) and ₹ 13,790 (US$ 176) for scenarios I and II, respectively. Annual screening under scenarios I and II resulted in a reduction of 180 and 193 end-stage renal disease (ESRD) cases per 100,000 population, respectively, yielding cost savings of ₹ 12.3 and ₹ 13.3 crore on ESRD management over 10 years. Both scenarios remained cost-effective even at screening frequencies of every 5 and 10 years.
Conclusion
Microalbuminuria screening was cost-effective at the threshold of one-time GDP per capita in India.
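For readers unfamiliar with the metric, an ICER is simply incremental cost divided by incremental effectiveness. A minimal Python sketch with placeholder inputs (these are not the study's model parameters):

```python
# ICER = (cost_new - cost_old) / (QALY_new - QALY_old).
# Numbers below are placeholders for illustration only.
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """Incremental cost per quality-adjusted life-year gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical discounted lifetime totals per patient (INR, QALYs)
print(icer(cost_new=52_000, cost_old=40_000, qaly_new=10.5, qaly_old=10.0))
# -> 24000.0 INR per QALY; cost-effective if below the chosen
#    willingness-to-pay threshold (here, one-time GDP per capita)
```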
The association between sarcopenia and kidney function remains poorly investigated. We aimed to evaluate the associations between sarcopenia status and kidney function (rapid kidney function decline and chronic kidney disease (CKD)) in a middle-aged and older Chinese population. A total of 9375 participants from the China Health and Retirement Longitudinal Study 2011 were included in the cross-sectional analyses. A total of 5864 participants with eGFRcr-cys ≥ 60 ml/min per 1·73 m2 at baseline were included in the longitudinal analyses and were followed up in 2015. Sarcopenia status was defined according to the Asian Working Group for Sarcopenia 2019 criteria. In the cross-sectional analyses, possible sarcopenia and sarcopenia were significantly associated with an increased risk of CKD. During the 4 years of follow-up, 359 (6·12 %) participants experienced a rapid decline in kidney function and 126 (2·15 %) participants developed CKD. After multivariable adjustment for baseline eGFRcr-cys level and other risk factors, possible sarcopenia (OR, 1·33; 95 % CI 1·01, 2·12) and sarcopenia (OR, 1·49; 95 % CI 1·05, 2·12) were associated with an increased risk of the primary outcome (a composite of rapid decline in kidney function (annualised decline in eGFRcr-cys ≥ 5 ml/min per 1·73 m2) and progression to CKD (eGFRcr-cys < 60 ml/min per 1·73 m2)). Individuals with low muscle mass or low muscle strength alone also had an increased risk of rapid decline in kidney function and progression to CKD.
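A small Python sketch of the rapid-decline definition used above (annualised eGFRcr-cys decline ≥ 5 ml/min per 1·73 m2); the eGFR values below are invented:

```python
# Annualised decline = (baseline eGFR - follow-up eGFR) / years.
# Rapid decline: >= 5 ml/min per 1.73 m^2 per year (study definition);
# progression to CKD: follow-up eGFR < 60 ml/min per 1.73 m^2.
def annualised_decline(egfr_baseline: float, egfr_followup: float,
                       years: float) -> float:
    return (egfr_baseline - egfr_followup) / years

baseline, followup = 85.0, 62.0  # hypothetical values
decline = annualised_decline(baseline, followup, years=4.0)
rapid = decline >= 5.0
ckd = followup < 60.0
print(f"decline={decline:.2f}/yr, rapid={rapid}, CKD={ckd}")
# -> decline=5.75/yr, rapid=True, CKD=False
```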
Although the cardiovascular benefits of increased urinary potassium excretion have been suggested, little is known about the potential cardiac association of urinary potassium excretion in patients with chronic kidney disease. In addition, whether the cardiac association of urinary potassium excretion is mediated by serum potassium levels has not yet been studied. We reviewed the data of 1633 patients from a large-scale multicentre prospective Korean study (2011–2016). The spot urinary potassium to creatinine ratio was used as a surrogate for urinary potassium excretion. Cardiac injury was defined as a high-sensitivity troponin T ≥ 14 ng/l. ORs and 95 % CIs for cardiac injury were calculated using logistic regression analyses. Of the 1633 patients, the mean spot urinary potassium to creatinine ratio was 49·5 (sd 22·6) mmol/g Cr and the overall prevalence of cardiac injury was 33·9 %. Although serum potassium levels were not associated with cardiac injury, each 10 mmol/g Cr increase in the spot urinary potassium to creatinine ratio was associated with decreased odds of cardiac injury (OR 0·917; 95 % CI 0·841, 0·998; P = 0·047) in multivariate logistic regression analysis. In mediation analysis, approximately 6·4 % of the relationship between the spot urinary potassium to creatinine ratio and cardiac injury was mediated by serum potassium levels, which was not statistically significant (P = 0·368). Higher urinary potassium excretion was associated with lower odds of cardiac injury, and this association was not mediated by serum potassium levels.
Acute kidney injury (AKI) is the most common major complication after cardiac surgery. The incidence of cardiac surgery-associated AKI (CSA-AKI) varies between 5% and 40% and leads to dramatically worse outcomes. The incidence of CSA-AKI requiring renal replacement therapy (RRT) after coronary artery bypass grafting (CABG) alone is roughly 1%. After valve surgery or combined CABG plus valve surgery, the risk of requiring RRT increases to 1.7% and 3.3%, respectively. Regardless of its reversibility, CSA-AKI has been associated with increased mortality and risk of developing chronic or end-stage renal disease, and consequently generates substantial costs.
Wearable digital health technologies (DHTs) have the potential to improve chronic kidney disease (CKD) management through patient engagement. This study aimed to elicit the preferences of individuals with CKD toward wearable DHTs designed to support self-management of their condition.
Methods
Based on a review of the published literature and qualitative patient interviews, five choice attributes were identified and included in a discrete-choice experiment. The design consisted of 10 choice tasks, each comprising two hypothetical technologies and one opt-out scenario. We collected data from 113 adult patients with CKD stages 3–5 not on dialysis and analyzed their responses via a latent class model to explore preference heterogeneity.
Results
Two patient segments were identified. In both preference segments, the most important attributes were the device's appearance and the format and type of information provided. Patients within the largest preference class (70 percent) favored information provided in any format except audio, while individuals in the other class preferred information in text format. In terms of the style of engagement with the device, both classes wanted a device that provides options rather than telling them what to do.
Conclusions
Our analysis indicates that user preferences differ between patient subgroups, supporting the case for offering different device designs to different patient strata and thus moving away from one-size-fits-all service provision. Furthermore, we showed how information on user preferences can be leveraged early in the R&D process to inform and support the provision of nuanced, person-centered wearable DHTs.
Immune cells play a key role in maintaining renal homeostasis and responding to renal injury. The physiological and pathological functions of immune cells are intricately connected to their metabolic characteristics. However, immunometabolism in chronic kidney disease (CKD) is not fully understood. Pathophysiologically, disruption of kidney immune cell homeostasis causes inflammation and tissue damage by triggering metabolic reprogramming. The diverse metabolic characteristics of immune cells at different stages of CKD are strongly associated with their different pathological effects. In this work, we review the metabolic characteristics of immune cells (macrophages, natural killer cells, T cells, natural killer T cells and B cells) and several non-immune cells, as well as potential treatments targeting immunometabolism in CKD. We attempt to elaborate on the metabolic signatures of immune cells and their intimate correlation with non-immune cells in CKD.
The aim of this study was to investigate the relationships between health behaviors and quality of life (QOL) and to test a proposed model among people with hypertension and concomitant chronic kidney disease (CKD) in primary care. In addition, the mediating effect of modifiable risk factors between self-care health behaviors and QOL was examined.
Methods:
This prospective study was conducted in primary medical care centers from January 2018 to January 2020. In total, 170 patients diagnosed with hypertension and CKD at least 12 months previously were included. The following parameters were measured: self-efficacy; self-care health behaviors, with subscales for health responsibility, exercise, consumption of a healthy diet, stress management, and smoking cessation; modifiable risk score; and QOL (assessed using the 36-item Short-Form Health Survey instrument).
Results:
Self-efficacy had a significantly positive direct effect on self-care health behaviors, with a standardized regression coefficient of 0.87 (P = 0.007); a negative indirect effect on risk factors, with a standardized regression coefficient of 0.11 (P = 0.006); and a positive indirect effect on QOL, with a standardized regression coefficient of 0.62 (P = 0.008). Self-care health behaviors had a significantly positive direct effect on QOL, with a standardized regression coefficient of 0.72 (P = 0.012); there was also an indirect effect of 0.053 (P = 0.004). The direct effect of risk factors on QOL was significant, with a standardized regression coefficient of 0.44 (P = 0.018). The total effect of self-care health behaviors on QOL was 0.77 (P = 0.008), which was reduced to a direct effect of 0.72 (P = 0.012) once risk factors were included as a mediator. The reduction of 0.05 was significant (P = 0.004), confirming the mediating role of modifiable risk factors.
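A small arithmetic check on the mediation claim above, using only the coefficients reported in the text (Python):

```python
# The total effect of self-care behaviors on QOL (0.77) splits into a
# direct effect (0.72, with the mediator in the model) and an indirect,
# mediated effect (their difference). Coefficients are from the text.
total_effect = 0.77
direct_effect = 0.72
indirect_effect = total_effect - direct_effect        # ~0.05
proportion_mediated = indirect_effect / total_effect  # ~6.5%
print(f"indirect={indirect_effect:.2f}, "
      f"proportion mediated={proportion_mediated:.1%}")
```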
Conclusions:
This study indicates that health-promoting behaviors among hypertensive patients with CKD have a potential impact on their QOL in primary care. Primary care physicians should focus on motivation strategies that encourage individuals to perform self-care health-promoting behaviors associated with improved QOL, in order to achieve better outcomes in risk factor management.
Hyperkalemia (HK) is a common and potentially life-threatening condition. If untreated, HK can progress to ventricular arrhythmia and cardiac arrest. Early treatment reduces mortality in HK. This study evaluates a novel protocol for the identification and empiric management of presumed HK in the prehospital setting.
Methods:
This was a retrospective, observational chart review of a single, large, suburban Emergency Medical Services (EMS) system. Patients treated for presumed HK, with both a clinical concern for HK and electrocardiogram (ECG) changes consistent with HK, from February 2018 through February 2021 were eligible for inclusion. Patients were excluded if found to be in cardiac arrest on EMS arrival. Empiric treatment of HK included administration of calcium, sodium bicarbonate, and albuterol. Post-treatment, patients were placed on cardiac monitoring, and adverse events were recorded en route to the receiving hospital. Protocol compliance was assessed by two independent reviewers. Serum potassium (K) levels were obtained from hospital medical records.
Results:
A total of 582 patients were treated for HK, of whom 533 were excluded due to cardiac arrest prior to EMS arrival. The remaining 48 patients included in the analysis had a mean age of 56 (SD = 20) years and were 60.4% (n = 29) male, with 77.1% (n = 37) Caucasian, 10.4% (n = 5) African American, and 12.5% (n = 6) Hispanic. Initial blood draw at the receiving facilities showed K >5.0mEq/L in 22 patients (45.8%), K of 3.5-5.0mEq/L in 23 (47.9%), and K <3.5mEq/L in three (6.3%). Independent review of the EMS ECGs found hyperkalemia-related changes in 43 (89.6%) cases; five (10.4%) patients did not meet criteria for treatment due to lack of either appropriate ECG findings or clinical suspicion. No episodes of unstable tachyarrhythmia or cardiac arrest occurred during EMS treatment or transport.
Conclusion:
The study evaluated a novel protocol for detecting and managing HK in the prehospital setting. It is feasible for EMS crews to administer this protocol, although a larger study is needed to make the results generalizable.
Kidney disease is common in older age and associated with adverse health outcomes. Guidelines recommend estimating GFR (eGFR) using a standardized estimating equation that includes age, race, gender, and serum creatinine. Management of chronic kidney disease (CKD) includes identifying and managing underlying causes to slow progression, recognizing and treating CKD-related complications, and preparing for renal replacement therapy prior to end-stage renal disease. Despite the high burden of CKD in older adults, the evidence underlying existing guidelines comes primarily from young and middle-aged adults and often does not take into account multimorbidity, geriatric conditions, and heterogeneity in life expectancy and health goals at older ages. A person-centered approach to CKD that includes geriatric assessment, engages and supports family caregivers, and incorporates palliative care principles across the spectrum of kidney disease is appropriate for older adults.
The current trial investigates the effect of renal diet therapy and nutritional education on the estimated glomerular filtration rate (eGFR), blood pressure (BP) and depression among patients with chronic kidney disease (CKD). A total of 120 CKD patients (stages 3–4; 15 < eGFR < 60) were randomised into an intensive nutrition intervention group (individualised renal diet therapy plus nutrition counselling: 0·75 g protein/kg/d and 30–35 kcal/kg/d with Na restriction) and a control group (routine and standard care) for 24 weeks. The primary outcome was the change in the eGFR. Secondary outcomes included changes in anthropometric measures, biochemistry (serum creatinine (Cr), uric acid, albumin, electrolytes, Ca, vitamin D, ferritin, blood urea nitrogen (BUN) and Hb), BP, nutritional status, depression and quality of life. The eGFR increased significantly in the intervention group compared with the control group (P < 0·001). Moreover, serum Cr levels and systolic and diastolic BP decreased significantly in the intervention group relative to the control group (P < 0·001, P < 0·001 and P = 0·020, respectively). The nutrition intervention also attenuated the increase in BUN levels and depression scores (P = 0·045 and P = 0·028, respectively). Furthermore, the reduction in protein and Na intake was greater in the intervention group (P = 0·003 and P < 0·001, respectively). Nutritional treatment along with supportive education and counselling contributed to improvements in renal function, BP control and adherence to protein intake recommendations. A significant difference in the mean eGFR between the groups was also confirmed at the end of the study using ANCOVA (β = −5·06; 95 % CI −8·203, −2·999).
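A minimal sketch of the individualised prescription described above (0·75 g protein/kg/d and 30–35 kcal/kg/d); the function and its name are ours for illustration, not the trial's:

```python
# Per-kg renal diet targets as described in the intervention arm.
def renal_diet_targets(weight_kg: float) -> dict:
    return {
        "protein_g_per_day": 0.75 * weight_kg,   # 0.75 g/kg/d
        "energy_kcal_per_day": (30 * weight_kg,  # lower bound
                                35 * weight_kg), # upper bound
    }

print(renal_diet_targets(70.0))
# -> {'protein_g_per_day': 52.5, 'energy_kcal_per_day': (2100.0, 2450.0)}
```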
The objective of this research communication was to produce low-potassium milk in which other electrolyte changes and changes in taste were minimized. To reduce potassium concentrations, several studies have reported batch methods of directly mixing milk or formula with sodium polystyrene sulfonate, which can exchange cations such as potassium for sodium. However, these studies also reported increases in sodium content, decreases in calcium and magnesium content, and changes in taste, because sodium polystyrene sulfonate also exchanged other cations such as calcium and magnesium for sodium. In the present study, a method of dialyzing whole cow's milk using both sodium polystyrene sulfonate and a small amount of water through cellophane membranes was developed. A batch method was also performed for comparison. Each milk sample was evaluated biochemically and analyzed for taste and aroma in a sensory analysis. We showed that the potassium concentration in the dialyzed milk was reduced to 38% of that in unreacted milk. The changes in sodium (increased) as well as calcium and magnesium (decreased) in the dialyzed milk were less than half of those in the batch-method milk. Sensory analysis showed that minimal changes occurred in the taste of the dialyzed milk.
To date, there is limited evidence to guide health care providers in the early assessment of poor outcomes among adult in-patients injured in earthquakes. This study aimed to explore factors related to the early assessment of adult earthquake trauma patients (AETPs).
Methods:
The data on 29,933 AETPs in the West China Earthquake Patients Database (WCEPD) were analyzed retrospectively, and 37 simple variables that could be obtained rapidly upon arrival at the hospital were collected. Least absolute shrinkage and selection operator (LASSO) regression analyses were then performed, and a nomogram was constructed.
Results:
Nine independent factors contributing to AETP in-patient mortality were identified: age (OR:1.035; 95%CI, 1.027-1.044), respiratory rate ([RR]; OR:1.091; 95%CI, 1.050-1.133), pulse rate ([PR]; OR:1.028; 95%CI, 1.020-1.036), diastolic blood pressure ([DBP]; OR:0.96; 95%CI, 0.950-0.970), Glasgow Coma Scale ([GCS]; OR:0.666; 95%CI, 0.643-0.691), crush injury (OR:3.707; 95%CI, 2.166-6.115), coronary heart disease ([CHD]; OR:4.025; 95%CI, 1.869-7.859), malignant tumor (OR:4.915; 95%CI, 2.850-8.098), and chronic kidney disease ([CKD]; OR:5.735; 95%CI, 3.209-10.019).
Conclusions:
The nine mortality-related factors for AETPs, including age, RR, PR, DBP, GCS, crush injury, CHD, malignant tumor, and CKD, can be obtained quickly on hospital arrival and should be a focal point of future earthquake response strategies. Based on these factors, a nomogram was constructed to screen for AETPs at higher risk of in-patient mortality.
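As a hedged sketch of the variable-selection step named above, L1-penalised (LASSO) logistic regression shrinks uninformative admission variables to zero. The data here are synthetic, not the WCEPD, and scikit-learn is assumed:

```python
# LASSO-style variable selection for a binary (mortality) outcome.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 37))  # 37 rapidly obtainable variables (toy)
y = (X[:, 0] - X[:, 1] + rng.normal(size=500) > 0).astype(int)

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso.fit(X, y)
selected = np.flatnonzero(lasso.coef_[0])  # indices of retained predictors
print("retained variables:", selected)
```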
Using a sample of US adults aged 65 years and older, we examined the role of dietary quality in cystatin C change over 4 years and whether this association varied by race/ethnicity. The Health and Retirement Study provided observations with biomarkers collected in 2012 and 2016, participant attributes measured in 2012, and dietary intake assessed in 2013. The sample was restricted to respondents who were non-Hispanic/Latino White (n 789), non-Hispanic/Latino Black (n 108) or Hispanic/Latino (n 61). Serum cystatin C was constructed to be equivalent to the 1999–2002 National Health and Nutrition Examination Survey (NHANES) scale. Dietary intake was assessed by a semi-quantitative FFQ with diet quality measured using an energy-adjusted form of the Alternative Healthy Eating Index-2010 (AHEI-2010). Statistical analyses were conducted using autoregressive linear modelling adjusting for covariates and complex sampling design. Cystatin C slightly increased from 1·2 mg/l to 1·3 mg/l over the observational period. Greater energy-adjusted AHEI-2010 scores were associated with slower increase in cystatin C from 2012 to 2016. Among respondents reporting moderately low to low dietary quality, Hispanic/Latinos had significantly slower increases in cystatin C than their non-Hispanic/Latino White counterparts. Our results speak to the importance of considering racial/ethnic determinants of dietary intake and subsequent changes in health in ageing populations. Further work is needed to address measurement issues including further validation of dietary intake questionnaires in diverse samples of older adults.
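One common way to energy-adjust a diet-quality score is the residual method: regress the score on total energy intake and keep the re-centred residuals. The abstract does not specify its exact procedure, so the Python sketch below (synthetic data) is only illustrative:

```python
# Residual-method energy adjustment of a diet-quality score.
import numpy as np

rng = np.random.default_rng(3)
energy = rng.normal(2000, 400, size=500)                 # kcal/d (toy)
score = 50 + 0.005 * energy + rng.normal(0, 8, size=500) # raw AHEI-like score

# Regress score on energy, keep residuals, re-centre at the mean score.
slope, intercept = np.polyfit(energy, score, 1)
adjusted = score - (intercept + slope * energy) + score.mean()
print(adjusted[:5])
```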
The aim of the present study was to develop and validate a test to evaluate dietitians' clinical competence (CC) in the nutritional care of patients with early chronic kidney disease (CKD). The study was conducted in five steps: (1) CC and its dimensions were defined; (2) test items were developed, and the response format and scoring system were selected; (3) content and face validity were established; (4) the test was piloted, and items with inadequate performance were removed; (5) criterion validity and internal consistency were established for final validation. A 120-item test was developed and applied to 207 dietitians for validation. Dietitians with previous CKD training obtained higher scores than those without, confirming the test's criterion validity. According to item analysis, Cronbach's α was 0⋅85, the difficulty index 0⋅61 ± 0⋅22, the discrimination index 0⋅26 ± 0⋅15 and the inter-item correlation 0⋅19 ± 0⋅11, displaying adequate internal consistency.
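A brief sketch of the internal-consistency statistic reported above, computed from a 0/1 item-score matrix; the data below are invented:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(totals)).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))  # latent trait (toy)
scores = (ability + rng.normal(size=(200, 20)) > 0).astype(float)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```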
We aimed to investigate the relationship between the neutrophil to lymphocyte ratio (NLR) and nutritional parameters in chronic kidney disease (CKD) patients. In this cross-sectional study, 187 non-dialysis CKD patients were enrolled. Daily dietary energy intake (DEI) and daily dietary protein intake (DPI) were assessed by 3-d dietary records. Protein-energy wasting (PEW) was defined as Subjective Global Assessment (SGA) class B and C. Spearman correlation analysis, logistic regression analysis and receiver operating characteristic (ROC) curve analysis were performed. The median NLR was 2·51 (1·83, 3·83). Patients with CKD stage 5 had the highest NLR level. A total of 19·3 % (n 36) of patients suffered from PEW. The NLR was positively correlated with SGA and serum P, and the NLR was negatively correlated with BMI, waist and hip circumference, triceps skinfold thickness, mid-arm muscle circumference, DPI and Hb. Multivariate logistic regression analysis adjusted for DPI, DEI, serum creatinine, blood urea nitrogen, uric acid and Hb showed that a high NLR was an independent risk factor for PEW (OR = 1·393, 95 % CI 1·078, 1·800, P = 0·011). ROC analysis showed that an NLR ≥ 2·62 had the ability to identify PEW among CKD patients, with a sensitivity of 77·8 %, a specificity of 62·3 % and an AUC of 0·71 (95 % CI 0·63, 0·81, P < 0·001). The NLR was closely associated with nutritional status. NLR may be an indicator of PEW in CKD patients.
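A hedged sketch of how a screening cutoff such as NLR ≥ 2·62 can be read off a ROC curve (here via the Youden index); the data are synthetic and scikit-learn is assumed:

```python
# Choosing a ROC-based cutoff for a biomarker (toy NLR-like data).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)
pew = rng.integers(0, 2, size=300)                   # 1 = PEW (toy labels)
nlr = rng.gamma(shape=3, scale=1.0, size=300) + pew  # higher NLR if PEW

fpr, tpr, thresholds = roc_curve(pew, nlr)
best = np.argmax(tpr - fpr)                          # Youden's J statistic
print(f"AUC={roc_auc_score(pew, nlr):.2f}, cutoff={thresholds[best]:.2f}, "
      f"sens={tpr[best]:.2f}, spec={1 - fpr[best]:.2f}")
```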