Acute Renal Dysfunction
Published in Stephen M. Cohn, Matthew O. Dolich, Complications in Surgery and Trauma, 2014
Meghan E. Sise, Matthew O’Rourke, Jonathan Barasch
Distinguishing prerenal AKI from intrinsic AKI is more difficult; however, it is important given the significant increase in morbidity and mortality in the latter. Prerenal causes of AKI include volume depletion from dehydration, blood loss, and diuretics. Prerenal AKI can also result from volume-overloaded or edematous states, including congestive heart failure and cirrhosis, which are also associated with decreased renal blood flow. Urinary studies are helpful to distinguish prerenal causes from intrinsic renal causes. Prerenal causes are associated with a low urine sodium (i.e., <20 mEq/L) and a low fractional excretion of sodium (FENa <1%); these measures indicate intact sodium retention [20]. However, recently administered diuretics or CKD might elevate the urine sodium even in the setting of prerenal azotemia. A blood urea nitrogen (BUN) to creatinine ratio of >20:1 also suggests prerenal azotemia; however, this is neither sensitive nor specific. Sepsis, high-protein enteral feeding, corticosteroid use, and upper gastrointestinal bleeding can all elevate BUN out of proportion to the creatinine; conversely, liver disease and poor nutritional status will depress the BUN, meaning that prerenal disease cannot be excluded by a normal BUN/creatinine ratio. It is important to note that fluid management cannot be determined from serum or urinary findings alone, but must incorporate history and physical exam findings, since both volume depletion and congestive heart failure are “prerenal” causes of AKI yet are managed very differently: intravenous fluids in the former group, and diuresis and possible inotropic support in the latter.
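To make the arithmetic concrete, the following is a minimal Python sketch of the FENa formula referenced above; the function name, variable names, and example values are illustrative assumptions, not part of the chapter.

def fena_percent(urine_na: float, serum_na: float,
                 urine_cr: float, serum_cr: float) -> float:
    """FENa (%) = (urine Na x serum Cr) / (serum Na x urine Cr) x 100."""
    return (urine_na * serum_cr) / (serum_na * urine_cr) * 100.0

# Example values consistent with a prerenal picture (assumed, for illustration):
fena = fena_percent(urine_na=15.0, serum_na=140.0, urine_cr=100.0, serum_cr=1.5)
print(f"FENa = {fena:.2f}%")  # ~0.16%, i.e. <1%, consistent with intact sodium retention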
Acute Renal Dysfunction
Published in Stephen M. Cohn, Matthew O. Dolich, Kenji Inaba, Acute Care Surgery and Trauma, 2016
David Bennett, Meghan E. Sise, Catherine S. Forster, Matthew O’Rourke, Katherine Xu, Jonathan Barasch
Distinguishing prerenal kidney stress from intrinsic AKI is more difficult, but necessary, given the significant increase in morbidity and mortality in the latter. Prerenal causes include volume depletion from dehydration, blood loss, and diuretics. Prerenal AKI can also result from volume-overloaded or edematous states, including congestive heart failure and cirrhosis, in which total body fluid is increased but the effective circulating volume is decreased owing to movement of fluid from the intravascular into the extravascular space, causing decreased renal blood flow. Urinary studies can be helpful in distinguishing prerenal from intrinsic renal causes. Prerenal causes are associated with a low urine sodium (i.e., <20 mEq/L) and a low fractional excretion of sodium (<1%), indicating intact sodium retention [28]. However, several factors can decrease the diagnostic utility of the urine sodium, including recently administered diuretics, CKD, and acute rehydration therapy. A blood urea nitrogen (BUN) to creatinine ratio of greater than 20:1 is also suggestive of prerenal azotemia; however, this is neither sensitive nor specific. Sepsis, high-protein enteral feeding, corticosteroid use, and upper gastrointestinal bleeding can all elevate BUN out of proportion to the creatinine and, conversely, liver disease and poor nutritional status will depress the BUN. Therefore, prerenal disease cannot be excluded by a normal BUN/creatinine ratio. It is important to note that fluid management cannot be determined from serum or urinary findings alone, but must incorporate history and physical examination findings, since both volume depletion and congestive heart failure are “prerenal” causes of AKI yet are managed very differently: IV fluids in the former group, and IV diuretics and possible inotropic support in the latter.
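The BUN-to-creatinine heuristic and its confounders can likewise be sketched in Python. The >20:1 cutoff and the listed confounders come from the passage above; the function names, flag names, and example values are assumptions for illustration, not clinical decision logic.

def bun_cr_ratio(bun_mg_dl: float, creatinine_mg_dl: float) -> float:
    return bun_mg_dl / creatinine_mg_dl

def interpret_ratio(bun: float, cr: float, confounders: set[str]) -> str:
    ratio = bun_cr_ratio(bun, cr)
    # Sepsis, high-protein feeds, corticosteroids, and upper GI bleeding raise BUN;
    # liver disease and poor nutrition depress it, so the ratio is neither
    # sensitive nor specific and must be read alongside the clinical context.
    if confounders:
        return f"ratio {ratio:.1f}:1 uninterpretable alone ({', '.join(sorted(confounders))})"
    return f"ratio {ratio:.1f}:1 " + ("suggests prerenal azotemia" if ratio > 20 else "is nonspecific")

print(interpret_ratio(42.0, 1.4, set()))        # ratio 30.0:1 suggests prerenal azotemia
print(interpret_ratio(42.0, 1.4, {"sepsis"}))   # confounder present: ratio flagged as uninterpretable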
Colonoscopic observation time as a predictor of stigmata of recent hemorrhage identification in colonic diverticular hemorrhage
Published in Scandinavian Journal of Gastroenterology, 2023
Sho Watanabe, Ayako Sato, Katsumasa Kobayashi, Akihiro Miyakawa, Hitoshi Uchida, Tomoyo Machida, Kenichiro Kobashi, Tsunehito Yauchi
We conducted a multicenter, retrospective cohort study at three facilities between January 2008 and October 2021. In this study, 410 hospitalized patients registered under ICD-10 (International Classification of Diseases, 10th revision) code K57.3 (CDH) were enrolled. The patients were emergently hospitalized for acute hematochezia and diagnosed with definitive or presumptive CDH via endoscopy. The ethics committees and institutional review boards of all three participating hospitals approved this study with an opt-out consent method. The study flow diagram is presented in Figure 1. The sole exclusion criterion was the lack of total colonoscopy. Finally, 392 patients diagnosed with either presumptive or definitive CDH were included in the analysis. Definitive CDH was diagnosed as CDH with SRH via endoscopy, whereas presumptive CDH was defined as acute lower GI bleeding with colonic diverticula and without any other major colonic lesion or evidence of SRH [8]. Massive upper GI bleeding was clinically excluded in all cases for the following reasons: (i) a blood urea nitrogen (BUN)-to-creatinine ratio within the normal range, (ii) no upper GI bleeding source detected on CT, and (iii) no outflow of fresh blood from the oral side of the terminal ileum. Early rebleeding was defined as rebleeding within 30 d after colonoscopy, with or without endoscopic treatment [8].
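As an illustration of these case definitions, the following Python sketch encodes the definitive/presumptive CDH rules and the 30-day early-rebleeding window described above; the function and field names are assumptions for illustration and are not taken from the study.

from datetime import date
from typing import Optional

def classify_cdh(srh_on_endoscopy: bool, diverticula_present: bool,
                 other_major_lesion: bool) -> Optional[str]:
    """Definitive CDH requires endoscopic SRH; presumptive CDH requires
    diverticula without another major colonic lesion or SRH."""
    if srh_on_endoscopy:
        return "definitive CDH"
    if diverticula_present and not other_major_lesion:
        return "presumptive CDH"
    return None  # neither definition met

def early_rebleeding(colonoscopy: date, rebleed: Optional[date]) -> bool:
    """Early rebleeding: rebleeding within 30 d after colonoscopy."""
    return rebleed is not None and (rebleed - colonoscopy).days <= 30

print(classify_cdh(True, True, False))                        # definitive CDH
print(early_rebleeding(date(2021, 5, 1), date(2021, 5, 20)))  # True (day 19)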
Delirium in hospitalized older adults
Published in Hospital Practice, 2020
Katie M Rieck, Sandeep Pagali, Donna M Miller
Risk factors are broadly classified into predisposing factors (those that make a patient vulnerable) and precipitating factors (the noxious insults to which the patient is exposed during hospitalization) [14]. Patients with more significant predisposing factors require fewer noxious insults to develop delirium, and vice versa [2]. Known predisposing risk factors include dementia or cognitive impairment, functional or sensory (visual and/or hearing) impairment, depression, substance use disorder, age over 75 years, and severity of comorbidities [2,15,16]. Significant precipitating factors include medications (polypharmacy or psychoactive drugs), use of restraints, presence of a urinary catheter, and an increased serum urea or blood urea nitrogen (BUN) to creatinine ratio [2,15,16]. Table 1 outlines a comprehensive list of these risk factors. The classification of these risk factors is not completely discrete, as some risk factors can overlap categories depending on the acuity and chronicity of the condition, such as immobility, anemia, nutritional status, and pain.
Delta Checks in the clinical laboratory
Published in Critical Reviews in Clinical Laboratory Sciences, 2019
Edward W Randell, Sedef Yenice
The work done throughout the mid- and late-1970s and early 1980s showed a maturing of the delta check concept toward the way it is used today. Among these advances, Ladenson [15] used a computer with selected clinical chemistry and immunoassay tests, and considered, but did not adopt, thresholds based on the biological variability that had been described by Young et al. [16]. Contemporary with this report, Whitehurst et al. [17] described a system involving routine clinical chemistry tests, but they also examined change in the calculated anion gap, the first multivariate approach to delta checking. Up to this point, all delta check thresholds had been set empirically. In a move toward a more systematic approach for identifying delta check thresholds, Wheeler and Sheiner [18] used archived laboratory data to determine probabilities of change affecting six commonly measured clinical chemistry tests and two calculated parameters, the anion gap and the urea (or blood urea nitrogen, BUN) to creatinine ratio. This approach was more complex than previous ones as it applied different delta check thresholds based on test result categories for the current result, used two different time intervals between specimens, and used seven different probability-based thresholds to which different actions were ascribed. Sher [6] examined a variety of clinical chemistry analytes, including the anion gap, using a computerized delta check strategy that extended the delta check interval to a 30-d period; this represented a significant change from the shorter intervals of <4 d evaluated by predecessors. The strategy resulted in a 1.6% positivity rate, but only 16% of these delta check alerts represented an error. While specimen misidentification errors (22%) were the most common, other errors included specimen mishandling and instrument failures. Later work by Sheiner et al. [19] and Wheeler and Sheiner [20], which evaluated the delta check strategies published over the previous decade, concluded that all approaches showed similar performance in terms of the frequency of alerts and errors detected. A challenge recognized by these early studies was balancing error detection against the work required to evaluate and rule out errors, as changes in the majority of results were explained by pathophysiology or clinical intervention.
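To illustrate the basic mechanism that these historical systems elaborated, here is a minimal univariate delta check sketch in Python: compare the current result with the previous one within a lookback window and flag changes exceeding an empirical threshold. The analyte names, threshold values, and 4-day lookback are assumed for illustration only; Sher [6] notably extended the interval to 30 d.

from datetime import datetime

# Empirical absolute-change thresholds per analyte (assumed values)
DELTA_THRESHOLDS = {"sodium_mmol_l": 8.0, "creatinine_mg_dl": 0.5, "anion_gap": 6.0}
LOOKBACK_DAYS = 4  # early systems compared results over short intervals

def delta_check(analyte: str, prev_value: float, prev_time: datetime,
                curr_value: float, curr_time: datetime) -> bool:
    """Return True when the result pair should be flagged for review."""
    if (curr_time - prev_time).days > LOOKBACK_DAYS:
        return False  # outside the comparison window; no check applied
    return abs(curr_value - prev_value) > DELTA_THRESHOLDS[analyte]

flag = delta_check("creatinine_mg_dl",
                   prev_value=1.0, prev_time=datetime(2019, 3, 1, 8),
                   curr_value=2.1, curr_time=datetime(2019, 3, 2, 8))
print(flag)  # True: a 1.1 mg/dL rise in one day exceeds the assumed threshold

As the early studies above found, most flagged changes reflect real pathophysiology or intervention rather than error, so any such rule trades detection against review workload.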