Can surgical need in patients with Naja atra (Taiwan or Chinese cobra) envenomation be predicted in the emergency department?

Hong Kong Med J 2016 Oct;22(5):435–44 | Epub 12 Aug 2016
DOI: 10.12809/hkmj154739
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Can surgical need in patients with Naja atra (Taiwan or Chinese cobra) envenomation be predicted in the emergency department?
HY Su, MD1; MJ Wang, PhD2; YH Li, PhD3; CN Tang, MD4; MJ Tsai, MD, PhD5
1 Department of Emergency Medicine, E-Da Hospital and I-Shou University, Kaohsiung, Taiwan; Department of Emergency Medicine, Buddhist Tzu Chi General Hospital, Hualien, Taiwan
2 Department of Medical Research, Buddhist Tzu Chi General Hospital, Hualien, Taiwan
3 Department of Public Health, Tzu Chi University, Hualien, Taiwan
4 Department of Family Medicine, Buddhist Tzu Chi General Hospital, Hualien, Taiwan
5 Department of Emergency Medicine, Ditmanson Medical Foundation Chiayi Christian Hospital, Chiayi, Taiwan; Department of Sports Management, Chia Nan University of Pharmacy and Science, Tainan, Taiwan
 
Corresponding author: Dr MJ Tsai (tshi33@gmail.com)
 
An earlier version of this paper was presented at the 7th Asian Conference on Emergency Medicine held in Tokyo, Japan on 23-25 October 2013.
 
 
Abstract
Objectives: To investigate the clinical predictors and the aetiologies for surgery in patients with Naja atra (Taiwan or Chinese cobra) envenomation.
 
Methods: This case series was conducted in the only tertiary care centre in eastern Taiwan. Patients who presented to the emergency department with Naja atra bite between January 2008 and September 2014 were included. Clinical information was collected and compared between surgical and non-surgical patients.
 
Results: A total of 28 patients with Naja atra envenomation presented to the emergency department during the study period. Of these, 60.7% (n=17) required surgery. Necrotising fasciitis (76.5%) was the main finding in surgery. Comparisons between surgical and non-surgical patients showed skin ecchymosis (odds ratio=34.36; 95% confidence interval, 2.20-536.08; P=0.012) and a high total dose of antivenin (≥6 vials; odds ratio=14.59; 95% confidence interval, 1.10-192.72; P=0.042) to be the most significant predictors of surgery. The rate of bacterial isolation from the surgical wound was 88.2%. Morganella morganii (76.5%), Enterococcus faecalis (58.8%), and Bacteroides fragilis (29.4%) were the most common pathogens involved. Bacterial susceptibility testing indicated that combined broad-spectrum antibiotics were needed to cover mixed aerobic and anaerobic bacterial infection.
 
Conclusions: Patients with Naja atra envenomation who present with skin ecchymosis or the need for a high dose of antivenin may require early surgical assessment. Combined broad-spectrum antibiotics are mandatory.
 
 
New knowledge added by this study
  • Among the six major venomous snakebites in Taiwan, Naja atra envenomation most commonly leads to surgical intervention.
  • Ecchymosis on the bite wound may be a good indicator for surgical need in N atra envenomation.
  • Adequate antibiotic treatment may play an important role in the early management of N atra envenomation.
Implications for clinical practice or policy
  • Surgical debridement and broad-spectrum antibiotic treatment are suggested in patients with N atra envenomation who develop ecchymosis. Surgery is more likely when high-dose antivenin has been used.
 
 
Introduction
Snakebites are an important public health and wilderness medical issue in Taiwan. Because of the warm and humid climate in Taiwan, there are more than 40 terrestrial snake species, of which 15 are venomous. Six of the venomous species are of high clinical importance, including Protobothrops mucrosquamatus (Taiwan Habu), Trimeresurus stejnegeri (Taiwan bamboo viper), Naja atra (Taiwan or Chinese cobra), Bungarus multicinctus (banded krait), Deinagkistrodon acutus (hundred pacer), and Daboia russelii siamensis (Russell’s viper).1 2
 
Naja atra belongs to the Elapidae family, and in addition to Taiwan, it inhabits southern China, Hong Kong, northern Laos, and northern Vietnam.3 Cobra venom contains a mixture of components, including cardiotoxin, cobrotoxin, haemotoxin, and phospholipase A2.4 Patients envenomed by a cobra experience varying degrees of neurotoxicity and cytotoxicity depending upon the proportions of the venom components. Due to evolution and geographical variations, different cobra species cause distinct clinical effects. For example, Naja philippinensis (northern Philippine cobra) causes a purely neurotoxic effect without local cytotoxicity.5 In contrast, N atra envenomation is associated with more cytotoxic effects.3 6 7 Although an equine-derived bivalent F(ab)2 antivenin has been produced by the Centers for Disease Control, ROC (Taiwan) to neutralise the venom of N atra, the surgical intervention rate remains high.1 8 The main objective of this study was to investigate the clinical presentations and predictors for surgery in patients with N atra envenomation. Due to high wound infection rates, the isolated bacteria from surgical wounds and the antimicrobial susceptibility were also analysed.
 
Methods
Study design and patient population
The Buddhist Tzu Chi General Hospital is the only tertiary care centre in eastern Taiwan. There are 1000 beds and the emergency department (ED) has more than 55 000 patient visits per year. This hospital is also the toxicant, drug information, and antidote control centre for eastern Taiwan. A retrospective study was conducted to analyse data from patients admitted to the ED with N atra envenomation between 1 January 2008 and 30 September 2014.
 
Data collection, processing, and categorisation
A medical assistant was responsible for collecting the medical records of patients admitted with snakebite during the study period by using the computerised chart system and International Classification of Diseases, 9th Revision, Clinical Modification codes 989.5, E905.0, E905.9, E906.2, and E906.5. Two physicians (the first and fifth authors) independently reviewed the charts and categorised these patients as having venomous or non-venomous snakebites based on the patient’s presentation with or without toxic effects. For venomous snakebites, classification of the snake species was based on the identification of the snake brought in by the patient or identification by the patient from a picture. All the included patients had a compatible presentation and consistent antivenin use as recorded in the patient chart. Patients who were initially recognised as having venomous snakebites but did not receive antivenin treatment were excluded from the study because of the high probability of a dry bite or misidentification of the snake species. Patients with a toxic presentation who could not identify the snake species or who received more than one type of antivenin were recorded as having an unknown poisonous snakebite.
 
Here we only report patients who were bitten by N atra. To identify the early clinical predictors of surgery, we categorised the patients into surgical and non-surgical groups. All surgical interventions were performed after surgical consultation in the ED or after admission when patients presented with progressive signs suggesting tissue necrosis, necrotising fasciitis, or suspected compartment syndrome. The final diagnoses of necrotising fasciitis and compartment syndrome were made according to surgical pathological findings and intracompartmental pressure measurement, respectively. The surgical procedures included debridement, fasciotomy, skin graft, and digit or limb amputation. The potential clinical predictors of surgery in N atra envenomation included the patient’s age, gender, season of snakebite, co-morbidities, details of envenomation, site of snakebite, initial vital signs on arriving at the ED, clinical presentation, laboratory data, treatment, timing of initial antivenin therapy, and total dose of antivenin.
 
For the laboratory analyses, the initial data obtained in the ED were collected, including haematology, biochemistry, and coagulation profiles. In regard to clinical presentation, the local signs and symptoms, local complications, and systemic manifestations and complications were classified. Local signs and symptoms included swelling, ecchymosis, necrosis, numbness, and bulla formation. Local complications included necrotising fasciitis and suspected compartment syndrome. Systemic manifestations and complications included neurological symptoms, such as ptosis, blurred vision, drooling, and paralysis of facial, limb, or respiratory muscles; leukocytosis, defined as a white blood cell count of >11.0 × 10⁹/L; thrombocytopenia, defined as a platelet count of <150 × 10³/mm³;2 prothrombin time (PT) prolongation, defined as a PT of >11.6 seconds; activated partial thromboplastin time (aPTT) prolongation, defined as an aPTT of >34.9 seconds (prolonged PT and aPTT were defined according to our clinical laboratory reference ranges); fibrinogen consumption, defined as a fibrinogen level of <1.8 g/L; elevated D-dimer, defined as a D-dimer level of >500 µg/L; acute renal impairment, defined as a creatinine level of >123.8 µmol/L9; and rhabdomyolysis, defined as a creatine kinase level of >1000 U/L.10 Two physicians reviewed the charts of the enrolled patients and rechecked the accuracy of the data collection. If a patient’s initial vital signs were not measured or laboratory tests were not performed in the ED, this was recorded as a missing value in the database. Any discrepancy in the collected data was resolved through discussion with the third physician on the research team. The study protocol was approved by the institutional review board of the Buddhist Tzu Chi General Hospital (IRB102-38). All patient records and information were anonymised and de-identified prior to analysis.
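The definitions above amount to a fixed set of laboratory thresholds. As an illustrative sketch only (not part of the study's methods; the dictionary keys and units are our assumptions), the categorisation could be encoded as follows:

```python
def flag_abnormalities(labs):
    """Return the systemic abnormalities present in a dict of initial ED labs.

    Hypothetical keys: wbc (x10^9/L), platelets (x10^3/mm^3), pt (s),
    aptt (s), fibrinogen (g/L), d_dimer (ug/L), creatinine (umol/L), ck (U/L).
    Tests that were not performed are skipped, mirroring the study's
    missing-value handling.
    """
    rules = {
        "leukocytosis":           ("wbc",        lambda v: v > 11.0),
        "thrombocytopenia":       ("platelets",  lambda v: v < 150),
        "PT prolongation":        ("pt",         lambda v: v > 11.6),
        "aPTT prolongation":      ("aptt",       lambda v: v > 34.9),
        "fibrinogen consumption": ("fibrinogen", lambda v: v < 1.8),
        "elevated D-dimer":       ("d_dimer",    lambda v: v > 500),
        "acute renal impairment": ("creatinine", lambda v: v > 123.8),
        "rhabdomyolysis":         ("ck",         lambda v: v > 1000),
    }
    return [label for label, (key, rule) in rules.items()
            if key in labs and rule(labs[key])]
```

For example, a patient with a white cell count of 12.5 × 10⁹/L and platelets of 140 × 10³/mm³ would be flagged for leukocytosis and thrombocytopenia.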
 
Statistical analyses
To identify early clinical presentations and laboratory data significantly associated with surgery in patients with N atra envenomation, the Student’s t test or the Mann-Whitney U test for continuous variables and the Chi-squared test for categorical variables were used for univariate analysis. A P value of <0.05 was considered statistically significant, and all statistical tests were two-tailed. For multivariate analysis, the categorical variables with a P value of <0.05 in the univariate analysis were entered into a forward stepwise (Wald) logistic regression to calculate odds ratios (ORs). The Statistical Package for the Social Sciences (Windows version 12.0; SPSS Inc, Chicago [IL], US) was used to perform the statistical analyses.
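As a minimal sketch of the odds-ratio arithmetic behind such comparisons (illustrative only: the paper's reported ORs come from the multivariate stepwise logistic regression, which this does not reproduce), a univariate 2×2 table can be summarised with the Woolf logit confidence interval. The counts below are an assumption inferred from the reported percentages of ecchymosis (82.4% of 17 surgical vs 9.1% of 11 non-surgical patients):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (logit) 95% CI for a 2x2 table:

        a = exposed cases,     b = unexposed cases
        c = exposed controls,  d = unexposed controls
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    log_or = math.log(or_)
    lo = math.exp(log_or - z * se)
    hi = math.exp(log_or + z * se)
    return or_, lo, hi

# Hypothetical counts: 14/17 surgical and 1/11 non-surgical patients
# presented with ecchymosis.
or_, lo, hi = odds_ratio_ci(14, 3, 1, 10)
```

The very wide confidence intervals seen in the paper (e.g. 2.20-536.08) are expected with samples this small, since the standard error of the log odds ratio is dominated by the cell with only one patient.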
 
Results
Epidemiology and surgical intervention rate for snake envenomation
Between 1 January 2008 and 30 September 2014, a total of 245 patients with venomous snakebites were recorded. Among these, 64 (26.1%) patients had P mucrosquamatus envenomation, 56 (22.9%) had T stejnegeri envenomation, 28 (11.4%) had N atra envenomation, five (2.0%) had B multicinctus envenomation, six (2.4%) had D acutus envenomation, seven (2.9%) had D r siamensis envenomation, and 79 (32.2%) had unknown poisonous snake envenomation.
 
The snakebites associated with the highest surgical intervention rates were N atra (60.7%), followed by D acutus (33.3%), and P mucrosquamatus (12.5%).
 
Characteristics and clinical status of patients with Naja atra envenomation
Of the 28 patients with an N atra bite, 20 (71.4%) were male. The mean (± standard deviation) age of patients was 52.3 ± 3.2 years. Of the patients, 22 (78.6%) were bitten in the summer or fall; 17 (60.7%) were bitten on an upper limb; and 17 (60.7%) received surgical treatment. Surgical patients had a significantly longer duration of hospitalisation than non-surgical patients (27.5 ± 10.2 days vs 2.7 ± 3.1 days; P<0.001). The main operative diagnosis was necrotising fasciitis (n=13, 76.5%), confirmed by histopathology. The clinical characteristics of the 17 surgical patients are shown in Table 1. The mean duration from the time of initial presentation to the day of surgery was 5.5 ± 4.3 days. All 13 patients with necrotising fasciitis underwent emergency fasciotomy and debridement, and two required limb or digit amputation. The other four surgical patients without necrotising fasciitis received only local debridement with or without skin graft for local tissue necrosis. Accordingly, a smaller surgical wound and a shorter duration of hospitalisation were observed for these patients (Table 1). Nearly all surgical patients presented with local swelling and ecchymosis on the bite wound. Only one non-surgical patient presented with ecchymosis on a finger and was discharged from the ED 1 day later after four vials of antivenin were administered. The Figure shows the initial ecchymosis and necrosis of an N atra bite wound, the development of extensive tissue necrosis, and the postoperative wounds of a surgical patient (patient No. 9 in Table 1).
 

Table 1. Clinical characteristics of the 17 surgical patients with Naja atra envenomation
 

Figure. Patient No. 9 in Table 1
A 59-year-old man bitten by Naja atra on his left foot presented to our hospital 6 hours after the snakebite. (a) Despite the administration of 10 vials of antivenin, progressive ecchymosis and necrosis of the bite wound developed. (b) Fasciotomy and debridement were performed on the second day after presentation. (c) Progressive wound necrosis and necrotising fasciitis of the leg developed 5 days later. (d and e) He underwent a second surgical debridement of the foot and fasciotomy of the leg
 
Demographic and clinical characteristics associated with surgical treatment in patients with Naja atra envenomation
The demographic and clinical characteristics were compared between the surgical and non-surgical patients with N atra envenomation (Tables 2 and 3). Overall, the surgical patients received significantly higher doses of antivenin (9.2 ± 4.9 vials vs 3.8 ± 2.4 vials; P=0.002) and had significantly higher white blood cell counts (11.0 ± 3.7 × 10⁹/L vs 8.2 ± 2.4 × 10⁹/L; P=0.043). A higher respiratory rate was also evident in surgical patients (median [interquartile range]: 20 [20-21] vs 18 [16-18] breaths/min; P=0.015), but the incidence of missing data in both groups for this factor was high (Table 2). A significantly higher proportion of surgical patients received six or more vials of antivenin in total compared with non-surgical patients (82.4% vs 18.2%; P=0.001) [Table 3]. For local signs, symptoms, and complications, a significantly higher proportion of surgical patients presented with local swelling (100% vs 72.7%; P=0.05), ecchymosis (82.4% vs 9.1%; P<0.001), necrosis (58.8% vs 0%; P=0.002), bulla formation (41.2% vs 0%; P=0.023), and necrotising fasciitis (76.5% vs 0%; P<0.001) [Table 3]. Age, season and site of snakebite, co-morbidity with diabetes, allergy to antivenin, and other systemic manifestations were not significantly different between surgical and non-surgical patients. None of the patients with N atra envenomation presented with neurological symptoms. One patient with a small area of ecchymosis on the bite wound of his left hand did not receive surgical intervention, because the local wound improved and healed after administration of four vials of antivenin and intravenous antibiotics.
 

Table 2. Clinical and laboratory characteristics of 28 patients with Naja atra envenomation
 

Table 3. Demographics, and clinical and laboratory characteristics of 28 patients with Naja atra envenomation
 
Independent predictors of surgery in patients with Naja atra envenomation
To determine clinical predictors of surgery, a multivariate logistic regression analysis was conducted for the significant variables derived from the univariate analysis. Necrotising fasciitis was not included in the multivariate analysis because it was a surgical finding and not an early sign that could be identified in the ED. The results showed that local ecchymosis (OR=34.36; 95% confidence interval [CI], 2.20-536.08; P=0.012) and a high total dose of antivenin (≥6 vials; OR=14.59; 95% CI, 1.10-192.72; P=0.042) were the most significant clinical predictors of surgery in patients with N atra envenomation.
 
Bacterial isolates identified from the snakebite wounds of surgical patients with Naja atra envenomation, and bacterial susceptibility to common antibiotics
To investigate the cause of necrotising fasciitis, the bacterial isolates identified from the snakebite wounds of surgical patients were further analysed. The positive culture rate was 88.2% (n=15). More than one bacterial species was isolated from the snakebite wound in 14 (82.4%) surgical patients. The isolated pathogens included aerobic Gram-positive and Gram-negative bacteria, as well as anaerobic bacteria. The most commonly identified pathogen was Morganella morganii (76.5%), followed by Enterococcus faecalis (58.8%) and Bacteroides fragilis (29.4%) [Table 4].
 

Table 4. Bacterial isolates from snakebite wounds of surgical patients
 
The susceptibility of the bacteria to common antibiotics was analysed (Table 5). All Gram-positive bacteria were susceptible to vancomycin and teicoplanin. All Gram-negative bacteria were susceptible to cefotaxime and amikacin. Cefmetazole, gentamicin, levofloxacin, and trimethoprim/sulfamethoxazole were also effective against the isolated Gram-negative bacteria. Nearly all anaerobic bacteria were susceptible to clindamycin and metronidazole (Table 5).
 

Table 5. Susceptibility of bacteria isolated from snakebite wounds to common antibiotics
 
Discussion
In our study, ecchymosis of the skin at the bite wound was a good clinical predictor of surgery for N atra envenomation. N atra venom is predominantly cytotoxic rather than haemorrhagic. The cardiotoxin and phospholipase A2 in N atra venom are direct cytotoxic polypeptides and cause degradation of cell membranes. They induce cell death by activating calcium-dependent proteases and inhibiting mitochondrial respiration. Hyaluronidase in N atra venom destroys interstitial constituents and accelerates the spread of venom.11 A histopathological study of N atra bite wounds demonstrated thrombotic and fibrinoid deposits in superficial and deep dermal vessels, and leukocytoclastic vasculitis.12 Hence, both the cytotoxic and ischaemic effects of N atra venom may lead to blood extravasation from destroyed subcutaneous vessels or capillaries and result in the characteristic ecchymosis on the bite wound. This finding may be a potentially important clinical sign of irreversible subcutaneous tissue necrosis due to development of tissue ischaemia.3 If management at this stage is inadequate, tissue destruction may progress rapidly and extensively to involve the fascia, with ultimate development of necrotising fasciitis.13 In our patients, extensive tissue destruction beyond the original bite site was evident once necrotising fasciitis developed. Further study is required to verify whether early surgical intervention can prevent the development of necrotising fasciitis, reduce the size of the surgical wound, or shorten the length of hospital stay. Nonetheless, surgical assessment may be needed in patients with N atra bite who present with local ecchymosis on the bite wound.
 
Traditionally, immediate injection of antivenin to neutralise N atra venom was considered the only effective management.14 A study using an enzyme-linked immunosorbent assay to detect the amount of N atra venom revealed that two to eight vials of antivenin are sufficient to eliminate systemic circulating venom if presentation is early.6 The efficacy of systemically administered antivenin in diminishing local tissue destruction is still controversial, however, and needs further study.3 In an animal study, the cytotoxic venom of N atra was shown to bind with high affinity to tissues, leading to high levels of local tissue destruction.15 This finding may explain the difficulties associated with neutralisation of local venom toxicity, especially in cases of delayed presentation. Thus, the adequate dose of antivenin for preventing advanced tissue destruction remains unknown. In our study, nearly all patients presented within 1 hour following envenoming. Intravenous antivenin was administered as soon as clinically possible following identification of cobra envenoming. Interestingly, the use of higher doses of antivenin in patients with N atra envenomation did not decrease surgical rates, even in cases of early presentation. More than half of the patients underwent surgery and the majority were diagnosed with necrotising fasciitis. Surgical intervention appears to be crucial for the management of N atra envenomation. Hence, the identification of clinical predictors of surgical need and sufficient evidence to support surgeons’ decisions to carry out early surgical intervention are important issues in N atra management.
 
High bacterial isolation rates and the growth of a mixed range of bacteria from bite wounds indicate bacterial infection (which may be another cause of necrotising fasciitis in N atra envenomation), bacterial colonisation, or both. Morganella morganii and Enterococcus species were the most common pathogens cultured from N atra bite wounds in this study. This finding is consistent with the bacterial cultures taken from oral swabs of N atra in Hong Kong.16 Similar results were also described in a previous study in western Taiwan.17 Hence, the use of adequate antibiotics is important in the management of N atra envenomation. In accordance with the results of our antibiotic susceptibility tests of the isolated bacteria, treatment with glycopeptide antibiotics (vancomycin or teicoplanin) combined with a third-generation cephalosporin (cefotaxime), with or without anti-anaerobic antibiotics (clindamycin or metronidazole), is recommended.
 
Limitations
There are several limitations in our study. First, this was a retrospective chart review comparative study. Non-uniform description of symptoms and signs documented by different providers may have influenced the validity of the statistics. Second, the small sample size may limit the statistical power in the multivariate analysis. Third, there are no definitive guidelines for the management of venomous snakebites in Taiwan, and various treatment strategies were employed; this may have influenced the final outcome. A large-scale prospective study is warranted to verify the risk factors we have identified to provide more accurate data for early risk stratification, treatment, and management of these patients.
 
Conclusions
Of the six common venomous snakes in eastern Taiwan, bites by N atra most frequently lead to surgical intervention. Severe tissue necrosis and necrotising fasciitis were the main findings during surgery. Patients who present with ecchymosis on the bite wound or who require higher doses of antivenin may have a higher probability of surgical intervention. In addition to early and adequate antivenin treatment, combined broad-spectrum antibiotics and surgical intervention may be needed in the management of N atra snakebites.
 
Acknowledgement
This work was supported by Buddhist Tzu Chi General Hospital Grants TCRD103-53 (to the first author).
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Liau MY, Huang RJ. Toxoids and antivenoms of venomous snakes in Taiwan. Toxin Rev 1997;16:163-75.
2. Hung DZ. Taiwan’s venomous snakebite: epidemiological, evolution and geographic differences. Trans R Soc Trop Med Hyg 2004;98:96-101. Crossref
3. Wong OF, Lam TS, Fung HT, Choy CH. Five-year experience with Chinese cobra (Naja atra)-related injuries in two acute hospitals in Hong Kong. Hong Kong Med J 2010;16:36-43.
4. Li S, Wang J, Zhang X, et al. Proteomic characterization of two snake venoms: Naja naja atra and Agkistrodon halys. Biochem J 2004;384:119-27. Crossref
5. Watt G, Padre L, Tuazon L, Theakston RD, Laughlin L. Bites by the Philippine cobra (Naja naja philippinensis): prominent neurotoxicity with minimal local signs. Am J Trop Med Hyg 1988;39:306-11.
6. Hung DZ, Liau MY, Lin-Shiau SY. The clinical significance of venom detection in patients of cobra snakebite. Toxicon 2003;41:409-15. Crossref
7. Wang W, Chen QF, Yin RX, et al. Clinical features and treatment experience: A review of 292 Chinese cobra snakebites. Environ Toxicol Pharmacol 2014;37:648-55. Crossref
8. Huang LW, Wang JD, Huang JA, Hu SY, Wang LM, Tsan YT. Wound infections secondary to snakebite in central Taiwan. J Venom Anim Toxins Incl Trop Dis 2012;18:272-6. Crossref
9. Hung DZ, Wu ML, Deng JF, Lin-Shiau SY. Russell’s viper snakebite in Taiwan: differences from other Asian countries. Toxicon 2002;40:1291-8. Crossref
10. Chen YW, Chen MH, Chen YC, et al. Differences in clinical profiles of patients with Protobothrops mucrosquamatus and Viridovipera stejnegeri envenoming in Taiwan. Am J Trop Med Hyg 2009;80:28-32.
11. Harris JB. Myotoxic phospholipases A2 and the regeneration of skeletal muscles. Toxicon 2003;42:933-45. Crossref
12. Pongprasit P, Mitrakul C, Noppakun N. Histopathology and microbiological study of cobra bite wounds. J Med Assoc Thai 1988;71:475-80.
13. Gozal D, Ziser A, Shupak A, Ariel A, Melamed Y. Necrotizing fasciitis. Arch Surg 1986;121:233-5. Crossref
14. Russell FE. Snake venom immunology: historical and practical considerations. Toxin Rev 1988;7:1-82. Crossref
15. Guo MP, Wang QC, Liu GF. Pharmacokinetics of cytotoxin from Chinese cobra (Naja naja atra) venom. Toxicon 1993;31:339-43. Crossref
16. Lam KK, Crow P, Ng KH, et al. A cross-sectional survey of snake oral bacterial flora from Hong Kong, SAR, China. Emerg Med J 2011;28:107-14. Crossref
17. Chen CM, Wu KG, Chen CJ, Wang CM. Bacterial infection in association with snakebite: a 10-year experience in a northern Taiwan medical center. J Microbiol Immunol Infect 2011;44:456-60. Crossref

Multimodal analgesia model to achieve low postoperative opioid requirement following bariatric surgery

Hong Kong Med J 2016 Oct;22(5):428–34 | Epub 15 Jul 2016
DOI: 10.12809/hkmj154769
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Multimodal analgesia model to achieve low postoperative opioid requirement following bariatric surgery
Katherine KY Lam, FHKCA, FHKAM (Anaesthesiology); Wilfred LM Mui, FCSHK, FHKAM (Surgery)
Hong Kong Bariatric and Metabolic Institute and Evangel Hospital Weight Management Centre, Room 610, Champion Building, 301-309 Nathan Road, Jordan, Hong Kong
 
Corresponding author: Dr Katherine KY Lam (katherinelamky@gmail.com)
 
Abstract
Objective: To investigate whether a new anaesthesia protocol can reduce opioid use in obese patients following laparoscopic sleeve gastrectomy.
 
Methods: This prospective observational case series was conducted in a private hospital in Hong Kong that has been accredited as a Centre of Excellence for Bariatric Surgery. Thirty consecutive patients scheduled for laparoscopic sleeve gastrectomy from 1 January 2015 to 31 March 2015 were reviewed.
 
Results: Of the 30 patients, 14 (46.7%) did not require any opioids for rescue analgesia during the entire postoperative period; six (20.0%) required rescue opioids only in the post-anaesthetic care unit, but not in the surgical ward. The mean postoperative total opioid requirement per patient was 32 mg of pethidine.
 
Conclusion: With a combination of multimodal analgesia and local anaesthetic infiltration, it is possible to avoid administering potent long-acting opioids during anaesthesia for bariatric surgery.
 
New knowledge added by this study
  • It is possible to avoid giving potent long-acting opioids in anaesthesia for bariatric surgery, by using multimodal analgesia with a combination of paracetamol, pregabalin, COX-2 inhibitors, tramadol, ketamine, dexmedetomidine, and local anaesthetic wound infiltration.
Implications for clinical practice or policy
  • The use of this opioid-sparing anaesthetic technique can potentially reduce the adverse effects and morbidity associated with the use of opioids in obese patients. The technique can be extended to other types of surgery in obese patients.
 
 
Introduction
Obese patients are particularly sensitive to the sedative and respiratory depressive effects of long-acting opioids. Many obese patients also have obstructive sleep apnoea syndrome (OSAS) and are prone to airway obstruction and desaturation in the postoperative period, especially if opioids have been given.1 2 Given this background, multimodal analgesia is advocated for bariatric surgery with the aim of reducing opioid use.3 4 At the time of writing, no study had demonstrated a technique that can consistently remove the need for any postoperative opioid analgesia. In this study, we report the use of an anaesthesia protocol that allowed a significant proportion of our patients undergoing laparoscopic sleeve gastrectomy to be completely free from any long-acting potent opioids in the intra-operative and postoperative period.
 
Methods
Patient selection
This was a prospective observational study. The study was conducted in a private hospital in Hong Kong that has been accredited as a Centre of Excellence for Bariatric Surgery. All patients scheduled for laparoscopic sleeve gastrectomy for management of obesity or type 2 diabetes from 1 January 2015 onwards were anaesthetised using the same protocol. We analysed 30 consecutive cases between 1 January 2015 and 31 March 2015 to investigate the postoperative opioid requirements using this anaesthesia protocol. Patients were excluded from the case series if they had contra-indications or allergy to any of the anaesthetic or analgesic drugs, or if anaesthesia deviated from the standard protocol for any reason. Three patients were excluded: one was taking a selective serotonin reuptake inhibitor antidepressant, so pethidine was avoided to prevent serotonin syndrome (morphine was given instead); one was allergic to non-steroidal anti-inflammatory drugs (NSAIDs), so intravenous parecoxib and oral etoricoxib were withheld; and one accidentally received a larger intra-operative dose of ketamine than the protocol allowed. Concomitant laparoscopic cholecystectomy was performed with laparoscopic sleeve gastrectomy in three patients who were included in the study.
 
The anaesthesia protocol
All patients were fasted from midnight on the night before surgery. All operations were scheduled in the morning. Patients were premedicated with oral pantoprazole 40 mg on the night before surgery, and 2 g of oral paracetamol and 150 mg or 300 mg of oral pregabalin (for patients of body mass index <35 kg/m2 or ≥35 kg/m2, respectively) 2 hours before surgery.
 
Upon arrival in the operating theatre, intravenous access was established and 1 to 2 mg of intravenous midazolam was administered followed by an infusion of dexmedetomidine. The dose of dexmedetomidine was titrated according to the calculated lean body weight (LBW) using the Hume formula.5 The starting dose of the dexmedetomidine was 0.2 µg/kg/h using LBW.6 No loading dose was given.
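The Hume formula referenced above estimates lean body weight from total body weight and height. As a sketch of the dose calculation (coefficients from Hume's 1966 equations; the function names and example values are our own):

```python
def hume_lbw(weight_kg, height_cm, male):
    """Estimated lean body weight (kg) by the Hume (1966) formula."""
    if male:
        return 0.32810 * weight_kg + 0.33929 * height_cm - 29.5336
    return 0.29569 * weight_kg + 0.41813 * height_cm - 43.2933

def dexmedetomidine_start_rate(weight_kg, height_cm, male, dose_ug_kg_h=0.2):
    """Starting infusion rate (ug/h): 0.2 ug/kg/h of lean body weight,
    per the protocol described in the text (no loading dose)."""
    return dose_ug_kg_h * hume_lbw(weight_kg, height_cm, male)
```

For example, a 100 kg, 175 cm man has an estimated lean body weight of about 62.7 kg, giving a starting dexmedetomidine rate of roughly 12.5 µg/h.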
 
Standard monitoring was applied to the patient together with a bispectral index (BIS) monitor and peripheral nerve stimulation monitor. Graduated compression stockings and sequential compression devices were used for all patients. Induction of anaesthesia was accomplished with fentanyl 100 µg, a titrated dose of propofol, and either suxamethonium or rocuronium as appropriate. The trachea was intubated and patients were ventilated with a mixture of air, oxygen, and desflurane.
 
Intra-operatively, desflurane was titrated to maintain BIS value between 40 and 60. Muscle relaxation was maintained with a rocuronium infusion to keep a train-of-four count of 1. Dexmedetomidine infusion continued at 0.2 µg/kg/h or higher if necessary. Shortly after induction, the various supplementary analgesic drugs were given. A loading dose of ketamine 0.3 mg/kg LBW was given followed by intermittent boluses roughly equivalent to 0.2 to 0.3 mg/kg/h of LBW. Intravenous parecoxib 40 mg and tramadol 100 mg were given. Dexamethasone 8 mg and tropisetron 5 mg were given intravenously for prophylaxis of postoperative nausea and vomiting (PONV).
 
For intravenous fluids, patients were given 10 mL/kg actual body weight of either lactated Ringer’s solution or normal saline, with further fluid given as appropriate. Hypotension was treated with either ephedrine or phenylephrine.
 
When the surgeon started to close the wounds, the rocuronium infusion was stopped and the dexmedetomidine infusion rate was reduced to 0.1 µg/kg/h. Wounds were infiltrated with 20 mL of 0.5% levobupivacaine. When all wounds were closed, the dexmedetomidine infusion was stopped, desflurane was switched off, and muscle relaxation was reversed with neostigmine and atropine. Patients were extubated when awake and able to obey commands.
 
After extubation, patients were transferred to the post-anaesthetic care unit (PACU) for observation for 30 minutes, or longer if appropriate. If a patient required rescue analgesia, intravenous pethidine 20 mg with intravenous ketamine 5 mg was given, and the dose was repeated if necessary. Once 10 mg of intravenous ketamine had been given, further rescue analgesia was intravenous pethidine 20 mg without any more ketamine; this avoided administering too much ketamine to an awake patient, which can cause dizziness or hallucinations. When patients had good pain control and stable vital signs, they were transferred back to the ward and the standard postoperative protocol was initiated: if patients requested analgesics, an intramuscular injection of pethidine 50 mg was given, and repeated after 4 hours if necessary. By early evening, when vital signs were stable, patients were allowed sips of water followed by a fluid diet of 60 mL/h. Regular oral paracetamol and etoricoxib were given, and oral pregabalin was added to the protocol the next day. Opioid requirements were reviewed for 24 hours after surgery.
 
As part of the standard postoperative protocol, patients were asked to get off the bed and walk around the ward with the assistance of nursing or physiotherapy staff by the evening of the day of surgery. Provided there were no complications, patients were discharged on the second postoperative day. The anaesthesia protocol is summarised in Table 1.
 

Table 1. Anaesthesia protocol
 
Results
Patient characteristics are shown in Table 2, and postoperative opioid requirements are listed in Table 3.
 

Table 2. Patient characteristics (n=30)
 

Table 3. Postoperative opioid requirements (n=30)
 
Of the 30 patients, no opioid rescue analgesia was required in 14 (46.7%) throughout the postoperative period; six (20%) required intravenous pethidine for rescue analgesia in the PACU, but not after their return to the ward. The remaining 10 (33.3%) patients were given intramuscular pethidine injections in the ward on request.
 
The mean postoperative opioid requirement per patient in the whole case series was 32 mg of pethidine. Among the 16 patients who required rescue analgesia in the ward or in the PACU, their mean opioid requirement was 60 mg of pethidine, with a range of 20 to 150 mg.
 
This anaesthetic protocol included a dexmedetomidine infusion that might cause hypotension and bradycardia due to its alpha-2 adrenoceptor agonist action. In our case series, 11 (36.7%) patients developed transient hypotension despite intravenous fluid loading and required either intravenous ephedrine or phenylephrine. One patient had transient intra-operative bradycardia requiring atropine, probably due to preoperative use of a beta blocker and a low resting heart rate.
 
Discussion
Importance of reducing postoperative opioid use in obese patients
Opioids are among the world’s oldest known drugs. Traditionally they have been used in anaesthesia as part of a balanced technique to provide hypnosis and analgesia and to blunt the sympathetic response to surgery, and they remain the mainstay of postoperative analgesia in many situations. Morbidly obese patients, however, are particularly sensitive to the respiratory depressant effects of opioids. Taylor et al2 found that the use of opioids per se is a risk factor for respiratory events in the first 24 hours after surgery. Ahmad et al1 demonstrated, in a study of 40 morbidly obese patients who underwent laparoscopic bariatric surgery with desflurane and remifentanil-morphine–based anaesthesia, that hypoxaemic episodes in the first 24 hours were common: 14 of the 40 patients had more than five hypoxaemic episodes per hour despite supplementary oxygen.
 
Another concern with use of opioids in bariatric patients is the high incidence (>70%) of obstructive sleep apnoea syndrome (OSAS).7 In our study, 30% (n=9) of patients had OSAS confirmed by an overnight sleep study. The remaining patients were not tested, although many had varying symptoms of OSAS; these untested patients were assumed to have OSAS unless proven otherwise. The American Society of Anesthesiologists recommends that in patients with OSAS, methods should be used to reduce or eliminate the requirement for systemic opioids.8 Hence, reducing perioperative opioid use in these obese patients can potentially reduce morbidity.
 
How can the anaesthetist avoid or reduce the use of perioperative opioids, and yet still provide balanced anaesthesia with hypnosis, analgesia, haemodynamic stability, and satisfactory postoperative analgesia? The first method is to combine general anaesthesia with regional analgesia techniques, such that anaesthetic agents will provide hypnosis while the regional blocks will provide analgesia and block sympathetic responses to surgery. Any form of major regional block in a morbidly obese patient can be technically challenging, however. Furthermore, with respect to bariatric surgery, most procedures are now performed laparoscopically, so that thoracic epidural analgesia techniques have become largely unnecessary.
 
Putting aside the use of regional analgesia, the second method to reduce perioperative opioid use is to use a combination of non-opioid agents with volatile agents or propofol to achieve analgesia and haemodynamic control.3 A point to note here is that as acute tolerance to the analgesic effects of opioids can rapidly develop (such as after 90 minutes of remifentanil infusion),9 any attempts to reduce postoperative opioid requirement must include an effort to either eliminate or reduce the use of intra-operative opioids. These techniques are now often described as opioid-free anaesthesia or non-opioid techniques.
 
Paracetamol, NSAIDs or COX-2 inhibitors, gabapentinoids, ketamine, and alpha-2 agonists, when used individually, have all been shown to reduce postoperative opioid requirement and improve pain relief.10 11 12 13 14 Different combinations of these agents, together with local anaesthetic infiltration of the wounds, have been reported for bariatric surgery, as discussed below.
 
Development of the study protocol based on previous studies
In 2003, Feld et al15 described a technique of using sevoflurane combined with ketorolac, clonidine, ketamine, lignocaine, and magnesium for patients undergoing open gastric bypass. Compared with the control group where sevoflurane was used with fentanyl, they found the non-opioid group to be less sedated, with less morphine use in PACU although the total morphine use at 16 hours was not significantly different to the opioid group.
 
In 2006 Feld et al16 again described using desflurane combined with dexmedetomidine infusion, and compared it with a control group using desflurane and fentanyl, for patients undergoing open gastric bypass. In the dexmedetomidine group, there were lower pain scores and less morphine use in the PACU.
 
In 2005, Hofer et al17 described a case report of a super-obese patient weighing 433 kg who underwent open gastric bypass. No opioids were used; they were replaced with a high-dose dexmedetomidine infusion together with isoflurane.
 
As laparoscopic techniques have become more common in bariatric surgery, more studies of non-opioid anaesthetic techniques for laparoscopic bariatric surgery have been carried out. Tufanogullari et al18 described a technique in which either fentanyl or varying doses of dexmedetomidine were used with desflurane for laparoscopic bariatric surgery. All patients were also given celecoxib. Postoperatively, patients were given fentanyl boluses in the PACU, then intravenous morphine via a patient-controlled analgesia system. The only statistically significant difference was decreased PACU fentanyl use in the dexmedetomidine groups.
 
Ziemann-Gimmel et al19 looked at 181 patients undergoing laparoscopic gastric bypass. In the treatment group, volatile anaesthetics were used together with intravenous paracetamol and ketorolac. Postoperatively, patients were given regular paracetamol and ketorolac; for breakthrough pain, intermittent oral oxycodone or intravenous hydromorphone was given. A small number of patients in this treatment group (3/89) were able to remain opioid-free throughout, and 15 patients did not require opioid medications after return to the ward.
 
In another study, in which the primary outcome was the incidence of PONV, Ziemann-Gimmel et al20 evaluated 119 patients undergoing laparoscopic bariatric surgery. The treatment group was managed with propofol infusion, dexmedetomidine infusion, paracetamol, ketorolac, and ketamine; the other group was managed with volatile anaesthetic and opioids. The postoperative analgesia regimen was the same as in their previous study.19 They reported a large reduction in PONV in the treatment group.
 
While most studies reported decreased requirement of opioids for postoperative analgesia in their non-opioid groups, very few studies could achieve zero postoperative opioid use. Only Ziemann-Gimmel et al19 could achieve total opioid sparing in a small proportion (3 out of 92 patients) of the treatment group by using intra-operative and postoperative intravenous paracetamol and ketorolac.
 
Most of these earlier studies used a combination of only a few of the available non-opioid adjuncts, with dexmedetomidine a mainstay non-opioid adjunct in most of them. We therefore proposed a wider mix of non-opioid adjuncts, using a combination of paracetamol, a COX-2 inhibitor, pregabalin, ketamine, dexmedetomidine, and local anaesthetic infiltration. In contrast to the earlier studies, we were able to achieve zero postoperative opioid use in a significant percentage of patients (46.7%).
 
In our protocol, the only opioids given during anaesthesia were fentanyl 100 µg for intubation and tramadol 100 mg, a weak opioid, shortly after induction. All other opioid analgesics, if required, were given after the patient was awake. This avoided blind administration of long-acting opioids during anaesthesia and allowed better titration, with small boluses given each time while the patient was awake.
 
Dexmedetomidine
Dexmedetomidine was a useful agent in our protocol. Before the addition of this agent, total opioid sparing was very difficult to achieve. Dexmedetomidine is a highly selective alpha-2 adrenoceptor agonist with analgesic and sedative properties.21 A previous study of its use in bariatric anaesthesia failed to show any reduction in opioid requirements.18 In our protocol, we used more non-opioid adjuncts, and since we calculated the infusion dose using LBW instead of total body weight (TBW), overall we administered a much lower dose of dexmedetomidine.
 
Infusion of dexmedetomidine may cause initial hypertension with reflex bradycardia (especially during a loading dose infusion), followed by hypotension and bradycardia. In our study, no loading dose was given. Of the 30 patients, 11 (36.7%) developed transient hypotension despite intravenous fluid loading and required either intravenous ephedrine or phenylephrine. This transient hypotension was also aggravated by placing the patient in a steep reverse Trendelenburg position to facilitate surgical exposure, which decreases venous return. When using dexmedetomidine in bariatric surgery, care must be taken to ensure the patient is euvolaemic.
 
Ketamine
Ketamine was another useful adjunct in our protocol. Ketamine is an N-methyl-D-aspartate receptor antagonist with strong analgesic properties when given at subanaesthetic doses.22 Ketamine has advantages in morbidly obese patients as it causes little respiratory depression compared with opioids. In our protocol, we used LBW to calculate the ketamine dose and used relatively low doses (0.3 mg/kg bolus followed by 0.2-0.3 mg/kg/h given as intermittent boluses). This resulted in a low total ketamine dose, with a mean of 31 mg per patient (range, 25-50 mg). Midazolam 1 to 2 mg was also given at induction to prevent any psychotomimetic reactions caused by ketamine. No patient developed hallucinations or dysphoria, and no delay in emergence was noted.
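As a rough arithmetic check on these figures (the 60-kg LBW and the operation durations below are illustrative assumptions, not study data):

```python
lbw_kg = 60.0                                    # hypothetical lean body weight
bolus_mg = 0.3 * lbw_kg                          # loading dose, ~18 mg
# Intermittent boluses approximating 0.2-0.3 mg/kg/h of LBW
# over roughly 0.5-1 h of surgery:
total_low_mg = bolus_mg + 0.2 * lbw_kg * 0.5     # ~24 mg
total_high_mg = bolus_mg + 0.3 * lbw_kg * 1.0    # ~36 mg
```

Per-patient totals in this band sit comfortably within the reported mean of 31 mg (range, 25-50 mg).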
 
The use of lean body weight as dosing scalar
In our protocol, we chose to use LBW to calculate the doses of dexmedetomidine and ketamine. The classic teaching is that for obese patients, anaesthetic drugs should be dosed according to TBW, ideal body weight, or LBW depending on lipid solubility. Lipophilic drugs are better dosed according to actual body weight due to an increase in volume of distribution, whereas hydrophilic drugs are better dosed according to LBW or ideal body weight.23 Lean body weight is significantly correlated with cardiac output, and drug clearance increases proportionately with LBW.6
 
There is insufficient information regarding the pharmacokinetics and pharmacodynamics of dexmedetomidine and ketamine in the morbidly obese patient. In the few previous studies of dexmedetomidine in bariatric anaesthesia, TBW was used as the dosing scalar. For example, Feld et al16 used a 0.5 µg/kg TBW loading dose followed by a 0.4 µg/kg/h infusion in their series of 10 patients undergoing open gastric bypass. Ziemann-Gimmel et al20 used a 0.5 µg/kg TBW loading dose followed by a 0.1 to 0.3 µg/kg/h infusion in their group of 60 patients undergoing a variety of bariatric procedures. Tufanogullari et al18 gave no loading dose and infused 0 to 0.8 µg/kg/h in their series of 80 patients undergoing laparoscopic banding or bypass. There are few data regarding ketamine dosing in bariatric surgery. We chose to dose these two drugs using LBW to see how our results would differ from the other published studies.
 
Limitations of the study
Our study has several limitations. It was a prospective observational study with a relatively small number of cases. We do not have data to compare this protocol with our previous protocols, nor do we have data in the form of a randomised controlled trial to look at the isolated effect of any of the drugs used.
 
The opioid that we used for rescue analgesia was pethidine, given intravenously in the recovery room by the anaesthetist or intramuscularly on the ward by the nurses under a standing order. One can argue that the mean opioid dose per patient was not accurate, as some patients were given small intravenous boluses and others were given intramuscular injections of a fixed dose. In theory, to accurately assess postoperative parenteral opioid requirements, all patients should be given a patient-controlled analgesia system to deliver boluses of parenteral opioids as required. This, however, is neither practical nor necessary for the patient, given that two thirds of our patients did not require any opioid after return to the ward. It would also represent considerable drug wastage whenever a whole cassette of drugs remained unused.
 
We were able to demonstrate that a significant proportion of patients did not require any opioids, but we do not have data to demonstrate a reduction in respiratory complications or an improvement in time to ambulation or discharge. This could be the basis for further studies.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Ahmad S, Nagle A, McCarthy RJ, et al. Postoperative hypoxaemia in morbidly obese patients with and without obstructive sleep apnea undergoing laparoscopic bariatric surgery. Anesth Analg 2008;107:138-43.
2. Taylor S, Kirton OC, Staff I, Kozol RA. Postoperative day one: a high risk period for respiratory events. Am J Surg 2005;190:752-6.
3. Mulier JP. Perioperative opioids aggravate obstructive breathing in sleep apnea syndrome: mechanisms and alternative anaesthesia strategies. Curr Opin Anaesthesiol 2016;29:129-33.
4. Alvarez A, Singh PM, Sinha AC. Postoperative analgesia in morbid obesity. Obes Surg 2014;24:652-9.
5. Hume R. Prediction of lean body mass from height and weight. J Clin Pathol 1966;19:389-91.
6. Ingrande J, Lemmens HJ. Dose adjustment of anaesthetics in the morbidly obese. Br J Anaesth 2010;105 Suppl 1:i16-23.
7. Lopez PP, Stefan B, Schulman CI, Byers PM. Prevalence of sleep apnea in morbidly obese patients who presented for weight loss surgery evaluation: more evidence for routine screening for obstructive sleep apnea before weight loss surgery. Am Surg 2008;74:834-8.
8. American Society of Anesthesiologists Task Force on Perioperative Management of Patients with Obstructive Sleep Apnea. Practice guidelines for the perioperative management of patients with obstructive sleep apnea: an updated report by the American Society of Anesthesiologists Task Force on Perioperative Management of Patients with Obstructive Sleep Apnea. Anesthesiology 2014;120:268-86.
9. Vinik HR, Kissin I. Rapid development of tolerance to analgesia during remifentanil infusion in humans. Anesth Analg 1998;86:1307-11.
10. Dahl JB, Nielsen RV, Wetterslev J, et al. Post-operative analgesic effects of paracetamol, NSAIDs, glucocorticoids, gabapentinoids and their combinations: a topical review. Acta Anaesthesiol Scand 2014;58:1165-81.
11. Blaudszun G, Lysakowski C, Elia N, Tramèr MR. Effect of perioperative systemic α2 agonists on postoperative morphine consumption and pain intensity: systematic review and meta-analysis of randomized controlled trials. Anesthesiology 2012;116:1312-22.
12. Cabrera Schulmeyer MC, de la Maza J, Ovalle C, Farias C, Vives I. Analgesic effects of a single preoperative dose of pregabalin after laparoscopic sleeve gastrectomy. Obes Surg 2010;20:1678-81.
13. Weinbroum AA. Non-opioid IV adjuvants in the perioperative period: pharmacological and clinical aspects of ketamine and gabapentinoids. Pharmacol Res 2012;65:411-29.
14. Alimian M, Imani F, Faiz SH, Pournajafian A, Navadegi SF, Safari S. Effect of oral pregabalin premedication on post-operative pain in laparoscopic gastric bypass surgery. Anesth Pain Med 2012;2:12-6.
15. Feld JM, Laurito CE, Beckerman M, Vincent J, Hoffman WE. Non-opioid analgesia improves pain relief and decreases sedation after gastric bypass surgery. Can J Anaesth 2003;50:336-41.
16. Feld JM, Hoffman WE, Stechert MM, Hoffman IW, Ananda RC. Fentanyl or dexmedetomidine combined with desflurane for bariatric surgery. J Clin Anesth 2006;18:24-8.
17. Hofer RE, Sprung J, Sarr MG, Wedel DJ. Anesthesia for a patient with morbid obesity using dexmedetomidine without narcotics. Can J Anaesth 2005;52:176-80.
18. Tufanogullari B, White PF, Peixoto MP, et al. Dexmedetomidine infusion during laparoscopic bariatric surgery: the effect on recovery outcome variables. Anesth Analg 2008;106:1741-8.
19. Ziemann-Gimmel P, Hensel P, Koppman J, Marema R. Multimodal analgesia reduces narcotic requirements and antiemetic rescue medication in laparoscopic Roux-en-Y gastric bypass surgery. Surg Obes Relat Dis 2013;9:975-80.
20. Ziemann-Gimmel P, Goldfarb AA, Koppman J, Marema RT. Opioid-free total intravenous anaesthesia reduces postoperative nausea and vomiting in bariatric surgery beyond triple prophylaxis. Br J Anaesth 2014;112:906-11.
21. Carollo DS, Nossaman BD, Ramadhyani U. Dexmedetomidine: a review of clinical applications. Curr Opin Anaesthesiol 2008;21:457-61.
22. Gammon D, Bankhead B. Perioperative pain adjuncts. In: Johnson KB, editor. Clinical pharmacology for anesthesiology. McGraw-Hill Education; 2014: 157-78.
23. Sinha AC, Eckmann DM. Anesthesia for bariatric surgery. In: Miller RD, Eriksson LI, Fleisher LA, Wiener-Kronish JP, Young WL, editors. Miller’s anesthesia. 7th ed. Philadelphia: Churchill Livingstone; 2015: 2089-104.

Seatbelt use by pregnant women: a survey of knowledge and practice in Hong Kong

Hong Kong Med J 2016 Oct;22(5):420–7 | Epub 19 Aug 2016
DOI: 10.12809/hkmj164853
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Seatbelt use by pregnant women: a survey of knowledge and practice in Hong Kong
WC Lam, MPH (CUHK), FHKAM (Obstetrics and Gynaecology)1; William WK To, MD, FHKAM (Obstetrics and Gynaecology)1; Edmond SK Ma, MD, FHKAM (Community Medicine)2
1 Department of Obstetrics and Gynaecology, United Christian Hospital, Kwun Tong, Hong Kong
2 The Jockey Club School of Public Health and Primary Care, The Chinese University of Hong Kong, Shatin, Hong Kong
 
Corresponding author: Dr WC Lam (lamwc2@ha.org.hk)
 
 
Abstract
Introduction: The use of motor vehicles is common during pregnancy. Correct seatbelt use during pregnancy has been shown to protect both the pregnant woman and her fetus. This survey aimed to evaluate the practices, beliefs, and knowledge of pregnant women in Hong Kong regarding correct seatbelt use, and to identify factors leading to reduced compliance and inadequate knowledge.
 
Methods: A self-administered survey was completed by postpartum women in the postnatal ward at the United Christian Hospital, Hong Kong, from January to April 2015. Eligible surveys were available from 495 women. The primary outcome was the proportion of pregnant women who maintained or reduced seatbelt use during pregnancy. Secondary outcomes were analysed and included knowledge of correct seatbelt use, as well as contributing factors to non-compliance and inadequate knowledge.
 
Results: Compliance with seatbelt use decreased during pregnancy, and the decrease became more pronounced with increasing gestation. Pregnant women’s knowledge about seatbelt use was inadequate and only a minority had received relevant information. Women who held a driving licence and had a higher education level were more likely to wear a seatbelt before and during pregnancy. Women with tertiary education or above knew more about seatbelt use.
 
Conclusions: Public health education about road safety for pregnant women in Hong Kong is advisable, and targeting groups with lower compliance may be more effective and successful.
 
 
New knowledge added by this study
  • There was decreased compliance with seatbelt use by pregnant women in Hong Kong. The decrease in compliance became more pronounced as gestation increased. This may be related to lack of relevant information and misconceptions.
Implications for clinical practice or policy
  • As a form of public health and road traffic safety promotion, information about seatbelt use during pregnancy should be provided to pregnant women, health care workers, and all road traffic users.
 
 
Introduction
Road traffic safety is an important public health issue. Health care professionals are usually involved in the treatment of road traffic accident victims rather than prevention of their occurrence or minimising the severity of injury. Education about and promotion of road traffic safety is important for all; pregnant women are no exception. Safety issues relate to both the mother and her fetus, and different information and/or a different approach may be required. With any kind of intervention during pregnancy, an emphasis on the safety of the fetus may improve compliance.
 
The number of pregnant drivers in Hong Kong is unknown, but the use of motor vehicles including private car, taxi, and public light bus is common during pregnancy. To promote maternal seatbelt use among the local pregnant population, information about their beliefs is essential.
 
Correct seatbelt use during pregnancy has been shown to protect both the pregnant woman and her fetus. There is evidence that pregnant women who do not wear a seatbelt and who are involved in a motor vehicle accident are more likely to experience excessive bleeding and fetal death.1 2 3 Compliance and proper use of the seatbelt are crucial. Incorrect placement of the seatbelt and a subsequent accident may result in fetal death due to abruptio placentae.4 The three-point restraint (ie shoulder harness in addition to a lap belt) provides more protection for the fetus than a lap belt alone. Previous studies have revealed incorrect positioning of the seatbelt in 40% to 50% of pregnant women.5 6 Various other studies have shown reduced seatbelt compliance during pregnancy.7 The proportion of seatbelt use has been reported to be around 70% to 80% before pregnancy, but reduced by half at 20 weeks or more of gestation.5 7 There is also evidence that pregnant women lack information about the proper use of a seatbelt and its role in preventing injury: only 14% to 37% of pregnant women received advice from health care professionals.5 6 7 8 The common reasons for not using a seatbelt have been reported to include discomfort, inconvenience, forgetfulness, and fear of harming the fetus.9
 
In this study, the current practice and knowledge of Hong Kong pregnant women about seatbelt use was surveyed, and any determining factors were identified. The results will enable public health education and promotion to be targeted to at-risk groups to improve road traffic safety among local pregnant women.
 
Methods
Study design
This was a cross-sectional survey using convenience sampling, carried out from January to April 2015. A self-administered questionnaire was distributed to postpartum women in the postnatal ward of United Christian Hospital (UCH) in Hong Kong. Participation in the survey was entirely voluntary.
 
Questionnaires were analysed if at least 50% of questions were answered, including the main outcomes. Those from women who did not understand the content or who did not understand Chinese or English were excluded.
 
Questionnaire
The questionnaire was based on a pilot study, with questions revised after review. It was available in English and Chinese (traditional and simplified) versions and was divided into four parts. The first part included demographic and pregnancy information and driving experience. The second part focused on the practice of seatbelt use before and during pregnancy, any change in habit with progression of pregnancy, and the reason(s) for non-use of a seatbelt. The third part related to awareness and knowledge of the Road Traffic Ordinance on seatbelt use and the correct use of both lap and shoulder belts. Text descriptions and diagrams of different restraint positions were provided; the correct way is to place the lap belt below the abdomen and the shoulder belt diagonally across the chest. The diagrams of restraint positions were adapted, with permission, from the leaflet “Protect your unborn child in a car” by the Transport Department of Hong Kong.10 The final part asked whether the postpartum woman had received any advice about seatbelt use during pregnancy, the source of information, and whether she thought such information was useful and/or relevant.
 
Statistical analysis
Sample size calculation
Using the results of overseas studies as reference, the sample size was calculated according to the assumption that around 80% of Hong Kong pregnant women use a seatbelt. A previous questionnaire survey among postpartum women at a local hospital indicated that a response rate of approximately 80% could be expected.11 We accepted a margin of error of 4% with a confidence level of 95%. Using the formula n = z² x p x (1-p)/d² (where p = proportion wearing a seatbelt [0.8]; d = margin of error [0.04]; and z value = 1.96), the required number of completed questionnaires was 385; after allowing for the expected 80% response rate, the adjusted sample size was 481.
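The arithmetic can be checked with a short script. The 80% response-rate adjustment in the last line is our reading of how the calculated value of 384.16 becomes the adjusted sample size of 481:

```python
import math

z, p, d = 1.96, 0.8, 0.04
n = z**2 * p * (1 - p) / d**2      # completed questionnaires needed, ~384.16
n_adjusted = math.ceil(n / 0.8)    # allow for the expected 80% response rate
print(n_adjusted)                  # 481
```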
 
All statistical analyses were performed using PASW Statistics 18 (Release Version 18.0.0; SPSS Inc, Chicago [IL], US). For categorical data, the Chi squared test was used to compare knowledge about seatbelt use in wearers and non-wearers. For continuous data with a highly skewed distribution, non-parametric tests (Mann-Whitney U test for two groups and Kruskal-Wallis H test for more than two groups) were used to compare knowledge of correct seatbelt use. The knowledge score was calculated from the answers to questions about the Road Traffic Ordinance on seatbelt use and the proper way to use both the lap and shoulder belts, with one point given for each correct answer. The critical level of statistical significance was set at 0.05.
 
The relative effects of factors (age, marital status, education level, resident status, husband’s occupation, family monthly income, respondent’s and husband’s driving licence holder status, frequency of public transport use, and stage of pregnancy) that might influence seatbelt use during pregnancy were estimated using a generalised estimating equation (GEE). The outcome variables were dichotomous correlated responses (eg use of a seatbelt at different stages of pregnancy) and could not be assumed to be independent; the overstatement of statistical significance that would result from this lack of independence was corrected using GEE.
 
To account for the interdependence of observations, we used robust estimates of variance (GEE) by including each period of observation as a cluster. For use of a seatbelt before and during each trimester of pregnancy, since the responses were correlated as time progressed, the GEE model with working correlation matrix was adopted.12
 
Results
Demographic data
There were 769 postpartum women in the postnatal ward during the study period. A total of 550 questionnaires were distributed by convenience sampling; the response rate was 91%, with 501 questionnaires returned. The remaining women (n=49, 9%) either refused to participate or did not return the questionnaire. Among the returned questionnaires, six were excluded because information on the main outcomes of the survey was missing or the questionnaire was <50% complete. At the end of the recruitment period, 495 (90%) questionnaires were valid for analysis.
 
The majority (93.5%) of respondents were aged between 21 and 40 years. Only 10 (2%) were English speakers; the others (98%) spoke Cantonese or Mandarin as their first language and completed the Chinese questionnaire. With regard to education level, 188 (38%) women had received tertiary education or above, 290 (58.6%) secondary education, and 14 (2.8%) primary education. There was no existing information about any association between a pregnant woman’s or her spouse’s occupation and compliance with or knowledge about seatbelt use, so we investigated whether occupation (eg driver or health care worker) was a relevant factor. Just under half (n=216, 43.6%) of the women were housewives, 57 (11.6%) were professionals, and 14 (2.8%) were medical health care workers. Among spouses, 32 (6.5%) were drivers, two (0.4%) were medical health care workers, and 122 (24.6%) were professionals. Other occupations were unrelated to transportation or health care, and included clerk, construction site worker, restaurant waiter, and chef. Overall, 439 (88.7%) women were Hong Kong residents; the others were new immigrants or double-entry permit holders from Mainland China. Of the respondents, 477 (96.4%) women had attended regular antenatal check-ups, and 215 (43.4%) were first-time mothers.
 
Driving experience and mode of transport
Around half of the spouses (49.1%) but only 71 (14.3%) women held a Hong Kong driving licence. Among those women with a driving licence, only 16 (22.5%) drove daily, and seven (9.9%) only at weekends. Public transport was used daily by 300 (60.6%) women. Among different means of public transport, buses (53.7%) were the most commonly used but not all seats on buses have seatbelts. In public light buses and taxis, use of a seatbelt, if available, is mandatory: 38.6% and 15.2% of respondents used public light buses and taxis, respectively.
 
Use of a seatbelt before and during pregnancy
Of the respondents, 379 (76.6%) pregnant women reported using a seatbelt in the 6 months before pregnancy, but compliance declined as pregnancy progressed: seatbelt use fell to 73.5% in the first trimester, 70.5% in the second trimester, and 67.1% in the third trimester (Table 1). There were 26 women who changed their behaviour from not wearing a seatbelt prior to pregnancy to wearing one after they became pregnant, giving a total of 405 ever seatbelt users. When the knowledge score analysis was repeated with these 26 women excluded, the findings were similar and remained statistically significant.
 

Table 1. Use of seatbelt before and during pregnancy
 
Reasons for not using a seatbelt during pregnancy
With regard to the reasons for not using a seatbelt at any time during pregnancy, 156 (89.1%) of 175 women stated that the seatbelt caused discomfort, 22 (12.6%) thought seatbelts were not useful, and 79 (45.1%) worried that they would cause harm to the fetus (Table 2). Apart from the three options given in the questionnaire, several respondents noted that the travelling distance on public light buses was usually short and that the time taken to buckle up and unfasten the seatbelt might delay other passengers. Other women admitted to being lazy or forgetful, or were simply not in the habit of using a seatbelt. They also found seatbelts inconvenient because those on public transport were “not user-friendly”, “too short”, or “dirty” (Table 2).
 

Table 2. Reasons for not using a seatbelt
 
Knowledge of seatbelt use during pregnancy
Of the respondents, 216 (43.6%) correctly answered that pregnant women are not exempted from seatbelt use under the Road Traffic Ordinance; the remaining 56.4% either answered incorrectly or did not know the answer. Just over half (52.7%) of the women correctly indicated that appropriate use of a seatbelt will not harm the fetus. Although around half of the women wrongly believed that pregnant women are exempted from seatbelt legislation or that use of a seatbelt will harm the fetus, 358 (72.3%) stated that pregnant women should wear a seatbelt. When three-point seatbelts were shown in diagrams, 403 (81.4%) women could identify the correct way of wearing the seatbelt, with the lap strap placed below the bump, not over it (Table 3).
 

Table 3. Knowledge of seatbelt use (seatbelt users vs non-users)
 
Among all the respondents, 90 (18.2%) women never wore a seatbelt, and the other 405 (81.8%) were seatbelt users either before or during pregnancy. Comparison of responses revealed that never wearers of a seatbelt had significantly poorer knowledge in three of the four questions about seatbelt use during pregnancy (P<0.05) [Table 3].
 
Information about seatbelt use during pregnancy
Information about seatbelt use had been received by only 32 (6.5%) women. Among them, 13 (40.6%) had derived the information from the internet; others obtained it from government or private clinic staff, magazines, or Transport Department publications. Seven (21.9%) received information from friends or family members; one had a car accident during pregnancy and was given relevant information by health care workers at the Accident and Emergency Department. Most (n=426, 86%) women expressed the view that information about seatbelt use during pregnancy was useful and necessary.
 
Factors influencing use of seatbelt during pregnancy
Among all potential factors, women who held a driving licence (odds ratio [OR]=3.28; P=0.004) or had a higher level of education (OR=2.13; P<0.001) were more likely to use a seatbelt. When time was considered as another variable, women were significantly less likely to use a seatbelt as pregnancy progressed (OR=0.84; P<0.001) [Table 4].
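For intuition about the odds ratios reported above (which in this study were estimated with a GEE model), the following minimal Python sketch computes an unadjusted odds ratio with a Wald 95% confidence interval from a 2×2 table; the counts are hypothetical illustrations, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% confidence interval.
    a/b: exposed users/non-users; c/d: unexposed users/non-users."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: seatbelt users vs non-users by driving-licence status
or_, lo, hi = odds_ratio_ci(a=60, b=11, c=319, d=105)
print(f"OR = {or_:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

Unlike this unadjusted calculation, the GEE approach used in the study additionally accounts for repeated (per-trimester) observations from the same woman.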
 

Table 4. Determining factors influencing use of a seatbelt before and during pregnancy
 
Factors influencing knowledge about correct seatbelt use
Women with a lower education level (P<0.001) were less aware of the Road Traffic Ordinance on seatbelt use, the protective effects of a seatbelt during pregnancy, and the correct way to position both the lap and shoulder belts (Table 5).
 

Table 5. Determining factors influencing knowledge score for correct seatbelt use
 
Discussion
Main findings
In this study, 76.6% of Hong Kong pregnant women were consistent seatbelt wearers before pregnancy; this is similar to overseas studies that reported rates of 70% to 80%.5 7 Compliance was reduced during all trimesters and decreased as gestation progressed. Only 26 women changed their behaviour from non-users to users after becoming pregnant. The study also demonstrated misconceptions about the effects of seatbelt use on pregnancy and the fetus. Pregnant women’s knowledge about seatbelt use was inadequate and only a minority had received relevant information. Women who held a driving licence or had a higher education level were more likely to wear a seatbelt before and during pregnancy. Women with a tertiary education or above were more knowledgeable about seatbelt use.
 
Strengths and limitations
As far as we know, this is the first survey in Hong Kong of the knowledge of pregnant women about seatbelt use and their associated practice, with a reasonably high response rate. One limitation of the study was that the questionnaire was not validated and there were overlapping categories for numerical variables. Results and experience from this study can serve to revise the questions for a future study with improved validity and reliability. During the study period, 769 postpartum women stayed in the postnatal ward and 495 (64%) completed questionnaires were collected. Although this proportion was relatively high, convenience sampling may still have affected the representativeness of the sampled subjects. Moreover, this was a single-centre survey in the obstetric unit of a district hospital. The UCH provides obstetric services to the population in the Kowloon East region, and the geographical location of a clinic could dictate the mode of travel to antenatal hospital appointments. Although taxis and public light buses are the usual modes of transport, some women may have taken the bus or Mass Transit Railway, which do not require use of a seatbelt. Furthermore, the delivery rate at UCH was less than 10% of total deliveries in Hong Kong, so the results may not be applicable to other clusters with patients of different education levels, driving experience, and transportation habits.
 
In addition, those who were unable to read or understand Chinese or English were excluded. These women were usually illiterate or non–Hong Kong residents, and may be the group with the lowest compliance and poorest knowledge about seatbelt use. There were 49 women who refused to participate and six who did not complete the questionnaire; this 10% may also have introduced inaccuracy and bias into our data. Reporting bias is another concern: discrepancies between observed and self-reported seatbelt use were found in a previous study,13 although the anonymity of the questionnaires might have minimised this bias. Although all demographic variables included in the questionnaire were analysed, there were other potential confounders that might have affected the knowledge score and the use of a seatbelt during pregnancy, for example prior traffic accidents involving the respondents or their family members, and risk-taking behaviours such as smoking, alcohol drinking, and drug use. These were not investigated and hence not adequately adjusted for in the knowledge score analysis or the GEE model. Finally, multivariate rather than univariate analysis of the factors affecting knowledge score could be performed to investigate the relationships among different variables.
 
Interpretation
Prevention plays a major role in ensuring maternal and fetal survival in road traffic accidents. Motor vehicle crashes are responsible for severe maternal injury and fetal loss. Despite existing knowledge about the protective effects of wearing a seatbelt, pregnant women remain poorly compliant. This was confirmed in this local survey and in overseas studies.14 15
 
In the Report on Confidential Enquiries into Maternal Deaths in the United Kingdom 1994-1996 published by the Royal College of Obstetricians and Gynaecologists, 13 pregnant women died as a result of road traffic accidents. One of the victims did not use a seatbelt and was forcibly ejected from the vehicle.16 Ten years later, in a more recent Report on Confidential Enquiries into Maternal Deaths in the United Kingdom 2006-2008,17 there were 17 pregnant women who died as a result of road traffic accidents. A specific recommendation was made in the report: “All women should be advised to wear a 3-point seat belt throughout pregnancy, with the lap strap placed as low as possible beneath the ‘bump’ lying across the thighs and the diagonal shoulder strap above the ‘bump’ lying between the breasts. The seat belt should be adjusted to fit as snugly and comfortably as possible, and if necessary the seat should be adjusted”.17
 
According to the Road Traffic Ordinance in Hong Kong, drivers and passengers must wear seatbelts where provided. Exceptions apply when reversing a vehicle, making a three-point turn, or manoeuvring in and out of a parking place, and to those who hold a medical certificate and have been granted an exemption on medical grounds by the Commissioner for Transport.18 According to a report of the Transport Department of Hong Kong, the total number of road traffic accidents was 14 436 in 2003 and rose to 16 089 in 2013; the number of pregnant women involved or injured in road traffic accidents is unknown.19 The Hong Kong SAR Government revises seatbelt legislation regularly to enhance road safety. Since 1 January 2001, passengers have been required to wear a seatbelt, if available, in the rear of taxis as well as in the front, and since 1 August 2004 passengers on public light buses have also been required to wear a seatbelt where one is fitted.20 21 Stickers were placed inside buses and taxis to remind passengers of their responsibility to wear a seatbelt and to give clear instructions on the correct way to wear it. Nonetheless, the requirement to use a seatbelt and its protective effects were not well recognised among the respondents in this survey. This may be due to the lack of information provision, as only 6.5% of women had received information related to seatbelt use in pregnancy.
 
In this study, those with a lower education level had poorer knowledge about seatbelt use in pregnancy, and effective public education should target these women. Instruction with diagrams is simple and direct, so that those with a lower education level, or who use public transport only occasionally, can easily understand and follow the advice. In the past, leaflets or stickers about seatbelt use were widely seen, especially after introduction of new legislation, but those specifically targeted at the pregnant population were not common. Maternal and child health centres and antenatal clinics of government hospitals are ideal places to distribute educational material. Television announcements may also convey the message effectively, not only to pregnant women but to all road users. They are also a good opportunity to inform drivers and other passengers so that they can help pregnant women, as well as the elderly and disabled, who use public transport. Regular spot-checks on public transport and law enforcement may also encourage compliance with seatbelt use. The majority of doctors and midwives give advice about seatbelt use only if asked, and this survey demonstrated that the proportion of pregnant women who received seatbelt information was very small. It is recommended that written instructions and advice be available from well-informed health care professionals, and that pregnant women always be encouraged to wear a correctly positioned seatbelt. Obstetricians, midwives, and general practitioners play an important role in disseminating information. A study in Ireland showed that 75% of general practitioners believed women should wear seatbelts in the third trimester, although only 30% provided regular advice and fewer than 50% indicated that they were aware of the correct advice to give.22
 
Conclusions
This study demonstrated reduced compliance with seatbelt use during pregnancy that continued to decrease as pregnancy progressed. Women with a lower education level or without a driving licence were less likely to use a seatbelt during pregnancy; the former were also less aware of the Road Traffic Ordinance on seatbelt use and the correct way to position both the lap and shoulder belts. Only a minority of pregnant women had received information about seatbelt use. Future studies to assess the knowledge of Hong Kong health care workers about use of seatbelts in pregnancy may enhance the awareness and involvement of medical professionals in educating pregnant women on this issue. Publicity and education about road safety by health care providers and the government are advised, and targeting groups with lower compliance may be more effective and successful.
 
Acknowledgements
The authors gratefully acknowledge Mr Edward Choi for his valuable statistical advice, the staff in the postnatal ward of UCH for helping to collect the questionnaires, and the Transport Department of Hong Kong for permission to use the diagram of restraint positions adopted from the leaflet “Protect your unborn child in a car” on the questionnaires.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Hyde LK, Cook LJ, Olson LM, Weiss HB, Dean JM. Effect of motor vehicle crashes on adverse fetal outcomes. Obstet Gynecol 2003;102:279-86. Crossref
2. Wolf ME, Alexander BH, Rivara FP, Hickok DE, Maier RV, Starzyk PM. A retrospective cohort study of seatbelt use and pregnancy outcome after a motor vehicle crash. J Trauma 1993;34:116-9. Crossref
3. Klinich KD, Schneider LW, Moore JL, Pearlman MD. Injuries to pregnant occupants in automotive crashes. Annu Proc Assoc Adv Automot Med 1998;42:57-91.
4. Bunai Y, Nagai A, Nakamura I, Ohya I. Fetal death from abruptio placentae associated with incorrect use of a seatbelt. Am J Forensic Med Pathol 2000;21:207-9. Crossref
5. Jamjute P, Eedarapalli P, Jain S. Awareness of correct use of a seatbelt among pregnant women and health professionals: a multicentric survey. J Obstet Gynaecol 2005;25:550-3. Crossref
6. Johnson HC, Pring DW. Car seatbelts in pregnancy: the practice and knowledge of pregnant women remain causes for concern. BJOG 2000;107:644-7. Crossref
7. Ichikawa M, Nakahara S, Okubo T, Wakai S. Car seatbelt use during pregnancy in Japan: determinants and policy implications. Inj Prev 2003;9:169-72. Crossref
8. Taylor AJ, McGwin G Jr, Sharp CE, et al. Seatbelt use during pregnancy: a comparison of women in two prenatal care settings. Matern Child Health J 2005;9:173-9. Crossref
9. Weiss H, Sirin H, Levine JA, Sauber E. International survey of seat belt use exemptions. Inj Prev 2006;12:258-61. Crossref
10. Transport Department, The Government of the Hong Kong Special Administrative Region. Protect your unborn child in a car. Available from: http://www.td.gov.hk/filemanager/en/content_174/belt-e.pdf. Accessed Aug 2016.
11. Yu CH, Chan LW, Lam WC, To WK. Pregnant women’s knowledge and consumption of long-chain omega-3 polyunsaturated fatty acid supplements. Hong Kong J Gynaecol Obstet Midwifery 2014;14:57-63.
12. Liang KY, Zeger SL. Longitudinal data analysis using generalized linear models. Biometrika 1986;73:13-22. Crossref
13. Robertson LS. The validity of self-reported behavioral risk factors: seatbelt and alcohol use. J Trauma 1992;32:58-9. Crossref
14. Luley T, Fitzpatrick CB, Grotegut CA, Hocker MB, Myers ER, Brown HL. Perinatal implications of motor vehicle accident trauma during pregnancy: identifying populations at risk. Am J Obstet Gynecol 2013;208:466.e1-5. Crossref
15. Grossman NB. Blunt trauma in pregnancy. Am Fam Physician 2004;70:1303-10.
16. Chapter 13: Fortuitous deaths. In: Why mothers die: report on confidential enquiries into maternal deaths in the United Kingdom 1994-1996. London: Royal College of Obstetricians and Gynaecologists Press; 2001.
17. Cantwell R, Clutton-Brock T, Cooper G, et al. Saving Mothers’ Lives: Reviewing maternal deaths to make motherhood safer: 2006-2008. The Eighth Report of the Confidential Enquiries into Maternal Deaths in the United Kingdom. BJOG 2011;118 Suppl 1:1-203. Crossref
18. Transport Department, The Government of the Hong Kong Special Administrative Region. Be Smart, buckle up. Available from: http://www.td.gov.hk/filemanager/en/content_174/seatbelt_leaflet.pdf. Accessed Aug 2016.
19. Transport Department, The Government of the Hong Kong Special Administrative Region. Road Traffic Accident Statistics Year 2013. Available from: http://www.td.gov.hk/en/road_safety/road_traffic_accident_statistics/2013/index.html. Accessed Aug 2016.
20. Transport Department, The Government of the Hong Kong Special Administrative Region. Seat belt: safe motoring guides. Available from: http://www.td.gov.hk/en/road_safety/safe_motoring_guides/seat_belt/index.html. Accessed Aug 2016.
21. Transport Department, The Government of the Hong Kong Special Administrative Region. Road Safety Bulletin; March 2001. Available from: http://www.td.gov.hk/filemanager/en/content_182/rs_bulletin_04.pdf. Accessed Aug 2016.
22. Wallace C. General practitioners knowledge of and attitudes to the use of seat belts in pregnancy. Ir Med J 1997;90:63-4.

Primary ventriculoperitoneal shunting outcomes: a multicentre clinical audit for shunt infection and its risk factors

Hong Kong Med J 2016 Oct;22(5):410–9 | Epub 26 Aug 2016
DOI: 10.12809/hkmj154735
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Primary ventriculoperitoneal shunting outcomes: a multicentre clinical audit for shunt infection and its risk factors
Working Group on Neurosurgical Outcomes Monitoring; Peter YM Woo, MMedSc1; HT Wong, FRCSEd (SN)1; Jenny KS Pu, FRCSEd (SN)2; WK Wong, FRCSEd (SN)3; Larry YW Wong, FRCSEd (SN)4; Michael WY Lee, FRCSEd (SN)5; KY Yam, FRCSEd (SN)6; WM Lui, FRCSEd (SN)2; WS Poon, FRCSEd (SN)7
1 Department of Neurosurgery, Kwong Wah Hospital, Yaumatei, Hong Kong
2 Division of Neurosurgery, Department of Surgery, Queen Mary Hospital, Pokfulam, Hong Kong
3 Department of Neurosurgery, Princess Margaret Hospital, Laichikok, Hong Kong
4 Department of Neurosurgery, Queen Elizabeth Hospital, Jordan, Hong Kong
5 Department of Neurosurgery, Pamela Youde Nethersole Eastern Hospital, Chai Wan, Hong Kong
6 Department of Neurosurgery, Tuen Mun Hospital, Tuen Mun, Hong Kong
7 Division of Neurosurgery, Department of Surgery, Prince of Wales Hospital, Shatin, Hong Kong
 
Corresponding author: Dr Peter YM Woo (wym307@ha.org.hk)
 
This paper was presented at the 21st Annual Scientific Meeting of the Hong Kong Neurosurgical Society, 6 December 2014, Hong Kong.
 
 Full paper in PDF
 
Abstract
Objectives: To determine the frequency of primary ventriculoperitoneal shunt infection among patients treated at neurosurgical centres of the Hospital Authority and to identify underlying risk factors.
 
Methods: This multicentre historical cohort study included consecutive patients who underwent primary ventriculoperitoneal shunting at a Hospital Authority neurosurgery centre from 1 January 2009 to 31 December 2011. The primary endpoint was shunt infection, defined as: (1) the presence of cerebrospinal fluid or shunt hardware culture that yielded the pathogenic micro-organism with associated compatible symptoms and signs of central nervous system infection or shunt malfunction; or (2) surgical incision site infection requiring shunt reinsertion (even in the absence of positive culture); or (3) intraperitoneal pseudocyst formation (even in the absence of positive culture). Secondary endpoints were shunt malfunction, defined as unsatisfactory cerebrospinal fluid drainage that required shunt reinsertion, and 30-day mortality.
 
Results: A primary ventriculoperitoneal shunt was inserted in 538 patients during the study period. The mean age of patients was 48 years (range, 13-88 years) with a male-to-female ratio of 1:1. Aneurysmal subarachnoid haemorrhage was the most common aetiology (n=169, 31%) followed by intracranial tumour (n=164, 30%), central nervous system infection (n=42, 8%), and traumatic brain injury (n=27, 5%). The mean operating time was 75 (standard deviation, 29) minutes. Shunt reinsertion and infection rates were 16% (n=87) and 7% (n=36), respectively. The most common cause for shunt reinsertion was malfunction followed by shunt infection. Independent predictors for shunt infection were: traumatic brain injury (adjusted odds ratio=6.2; 95% confidence interval, 2.3-16.8), emergency shunting (2.3; 1.0-5.1), and prophylactic vancomycin as the sole antibiotic (3.4; 1.1-11.0). The 30-day all-cause mortality was 6% and none were directly procedure-related.
 
Conclusions: This is the first Hong Kong territory-wide review of infection in primary ventriculoperitoneal shunts. Although the ventriculoperitoneal shunt infection rate met international standards, there are areas of improvement such as vancomycin administration and the avoidance of scheduling the procedure as an emergency.
 
 
New knowledge added by this study
  • The local rate of infection in ventriculoperitoneal (VP) shunts meets international standards.
  • Vancomycin as the sole prophylactic antibiotic is a risk factor for shunt infection; this is a novel finding.
  • VP shunt inserted as an emergency procedure is the strongest risk factor for infection.
Implications for clinical practice or policy
  • There is a need to review prophylactic vancomycin administration in terms of timing, dosage, and the need for its combination with another antibiotic.
  • Emergency VP shunting is not recommended. Shunts should be implanted whenever possible as an elective procedure.
  • A comprehensive local shunt surgery protocol to reduce the risk of shunt infection is recommended.
 
 
Introduction
Ventriculoperitoneal (VP) shunting is one of the most common neurosurgical procedures performed to treat patients with hydrocephalus, a disorder related to an abnormal accumulation of cerebrospinal fluid (CSF) in the brain. The operation involves diverting CSF from the ventricles of the brain to the peritoneal cavity of the abdomen by catheter implantation. Despite being a well-established procedure, the shunt failure rate can be as high as 70% in the first year, with an annual occurrence rate of 5% thereafter.1 One of the main causes of failure is shunt infection, a potentially debilitating complication that more than doubles the risk of death and exposes affected patients to 3 times as many neurosurgical procedures as non-infected patients.2 Shunt infection rates vary, occurring in 3% to 17% of patients. Standard management involves intravenous antibiotic therapy, shunt removal, insertion of an external ventricular drain, and replacement with a new shunt once the patient’s CSF is free of microbial infection.1 3 4 5 6 The economic impact of VP shunt infection can be considerable: in the US, the median cost per episode per patient has been reported as US$23 500, accounting for US$2.0 billion in annual hospital charges.7 8 Evidence suggests that the adoption of a strict institutional implantation protocol can significantly reduce the risk of this most challenging of shunt complications.9 10 11 12 This retrospective study aimed to determine the frequency of primary VP shunt reinsertion and infection among patients treated in Hong Kong’s public health system and to identify risk factors for shunt infection.
 
Methods
This was a multicentre retrospective study of patients who underwent VP shunt implantation at all seven Hong Kong Hospital Authority neurosurgical units. The Hospital Authority is a public service highly subsidised by the Hong Kong Special Administrative Region Government, and responsible for delivering health care for 90% of inpatient bed days in the city.13 Clinical research ethics committee approval was obtained from the participating centres. Patients who underwent primary VP shunting from 1 January 2009 to 31 December 2011 were included in this study. Those who underwent alternative CSF diversion procedures or those with a history of VP shunt implantation were excluded from this review. Data from clinical records, operation notes, medication-dispensing records, CSF biochemistry, cell counts, and microbiological cultures were collected. The primary endpoint for this study was primary VP shunt infection. The criteria for shunt infection were: (1) CSF or shunt hardware culture that yielded the pathogenic micro-organism with associated compatible symptoms and signs of central nervous system (CNS) infection or shunt malfunction5 14 15; or (2) surgical incision site infection, as defined by the National Nosocomial Infection Surveillance System, requiring shunt reinsertion (even in the absence of a positive culture)16; or (3) intraperitoneal pseudocyst formation (even in the absence of a positive culture). Secondary endpoints were shunt malfunction, defined as unsatisfactory CSF drainage that required shunt reinsertion, and 30-day mortality. Potential risk factors for shunt infection were classified as patient-, disease-, or surgical-related factors. All subjects were followed up for at least 30 days from the operation date or until death.
 
Statistical analysis was carried out using Pearson’s chi-squared test, Fisher’s exact test, and binary logistic regression to identify risk factors for shunt infection. The Kaplan-Meier method (log-rank test) and Cox proportional hazards model were employed for survival analysis. Patient, disease, and surgical factors were used as covariates and a stepwise regression strategy was adopted (Table 1). P values of <0.05 were considered statistically significant. All tests were performed using the Statistical Package for the Social Sciences (Windows version 16.0.1; SPSS Inc, Chicago [IL], US).
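As a minimal sketch of the univariate screening step, Pearson’s chi-squared statistic for a 2×2 table compares observed counts with those expected under independence; the counts below are illustrative only, not values taken from Table 1:

```python
def chi_squared_2x2(a, b, c, d):
    """Pearson's chi-squared statistic (1 degree of freedom) for a
    2x2 table laid out as [[a, b], [c, d]]."""
    n = a + b + c + d
    observed = [a, b, c, d]
    # Expected count for each cell = (row total * column total) / n
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical split: infected vs not infected, by emergency vs elective shunting
chi2 = chi_squared_2x2(26, 286, 10, 216)
```

In practice a statistical package (as used in the study) also reports the corresponding P value, and Fisher’s exact test replaces this statistic when expected cell counts are small.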
 

Table 1. Clinical characteristics of patients with primary ventriculoperitoneal shunt and univariate logistic regression for shunt infection
 
Results
During the 3-year period, 538 patients underwent primary VP shunt implantation and 87% (n=470) had complete clinical follow-up with a median duration of 37 months (range, 3 days to 76 months). Seven (1%) patients were transferred to other hospitals within 30 days of the procedure. The median duration of hospitalisation was 42 days (range, 3 days to 36 months) and the median length of time from admission to shunting was 18 days (range, <1 day to 21 months). The clinical features and surgical variables are presented in Table 1. The mean (± standard deviation) age of patients was 48 ± 13 years (range, 13-88 years) and the male-to-female ratio was 1:1. In the study group, 80 (15%) were paediatric patients and 48 (9%) were infants. Overall, primary VP shunting was performed for post-aneurysmal subarachnoid haemorrhage communicating hydrocephalus in 169 (31%) patients, for CNS neoplasms in 164 (30%) patients, and for spontaneous intracerebral or intraventricular haemorrhage in 64 (12%) patients. For patients who had preoperative CSF sampling performed, the mean red blood cell count was 1900/µL, white cell count was 17/µL, total protein level was 0.78 g/L, and glucose level was 3.6 mmol/L.
 
Over one quarter of patients (n=155, 29%) had no history of cranial neurosurgery, and approaching half had undergone either one (n=141, 26%) or two (n=115, 21%) previous procedures. Antiseptic skin preparation was with 10% povidone-iodine combined with another antiseptic in 422 (78%) patients and with povidone-iodine alone in the remainder. The mean operating time for VP shunting was 75 ± 29 minutes. All patients received antibiotic prophylaxis: 328 (61%) were prescribed a third-generation cephalosporin and 40 (7%) received vancomycin. Twelve (2%) patients had a rifampicin-clindamycin antibiotic-impregnated ventricular catheter as part of the shunt system. The majority of operations were performed in an emergency setting (n=312, 58%) and shunt implantation was the sole procedure performed (n=514, 96%). The burr hole was most frequently positioned at the parietal location in 320 (59%) patients, and 135 (25%) had a frontal burr hole. New burr holes were fashioned for shunt placement 95% of the time. The median number of surgeons was two, with a third of shunts performed by higher neurosurgical trainees (n=174, 32%) and the remainder by a neurosurgical specialist. Almost three quarters of VP shunts had a fixed-pressure valve (n=390, 72%) and the predominant design was the Integra Pudenz flushing valve (Integra LifeSciences Corporation, Plainsboro [NJ], US) in 324 (60%) patients.
 
The rate of VP shunt reinsertion was 16% (n=87) and that of infection was 7% (n=36). The main cause of reinsertion was malfunction (9%), followed by infection. The annual proportions of shunts that required reoperation or were infected were comparable (P=0.87) [Fig 1]. The median time from shunt implantation to shunt removal for infection was 64 days (range, 2 days to 10 months). The cumulative risk of infection was 3% in the first 30 days, 6% at 6 months, and 7% at 1 year. Although 68 (13%) patients were lost to follow-up, attrition analysis revealed that this did not affect infection rates: the mean follow-up duration in this subgroup was comparable between those with infection (526 days) and those without (554 days) [P=0.43], and the incidence of shunt infection in patients with incomplete follow-up (5%) was similar to that in patients with complete follow-up (7%) [P=0.42].
 

Figure 1. Comparison of the total number of primary ventriculoperitoneal (VP) shunts performed from 2009 to 2011 with the number of shunt reoperations and infected shunts
 
Most infections manifested as meningitis or ventriculitis (n=19, 53%), followed by wound breakdown (n=15, 42%) and peritonitis (n=2, 6%). The most common causative bacteria were coagulase-negative staphylococci (CoNS) [n=25, 69%] of which methicillin resistance was detected in 19 (76%) patients (Table 2). All CoNS species were sensitive to vancomycin with a quarter of methicillin-resistant (MR) species susceptible to aminoglycosides such as gentamicin or amikacin. The second most common infective agent affecting four (11%) patients was MR Staphylococcus aureus (MRSA). Polymicrobial infection was evident in six (17%) patients. One patient with peritonitis had mixed Gram-positive and -negative micro-organisms from CSF cultures.
 

Table 2. Micro-organisms cultured from cerebrospinal fluid or shunt hardware from the 36 infected cases, with antibiotic sensitivity distribution
 
The only patient risk factor for shunt infection was sex (Table 1). Male patients had a greater than two-fold increased odds of infection (odds ratio [OR]=2.2; 95% confidence interval [CI], 1.1-4.5). Traumatic brain injury (TBI) was the only disease risk factor (OR=7.8; 95% CI, 2.9-18.1). Surgical factors included the use of vancomycin as the prophylactic antibiotic (OR=3.7; 95% CI, 1.3-10.5) and shunts implanted as an emergency procedure (OR=2.2; 95% CI, 1.0-4.7). After adjusting for confounding factors, the independent risk factors for primary VP shunt infection were TBI (adjusted OR=6.2; 95% CI, 2.3-16.8), the use of vancomycin (adjusted OR=3.4; 95% CI, 1.1-11.0), and emergency shunting (adjusted OR=2.3; 95% CI, 1.0-5.1) [Table 3].
 

Table 3. Independent predictors for primary ventriculoperitoneal shunt infection
 
With respect to shunt infection, duration of shunt survival differed for patients with the aforementioned risk factors, as demonstrated by the significant separation of Kaplan-Meier survival curves (Fig 2) and confirmed by Cox regression analysis (Table 3). Median shunt survival was 35 days for trauma patients (vs 154 days for non-trauma patients), 32 days for patients who received vancomycin (vs 124 days for alternative antibiotics), and 65 days for emergency operations (vs 208 days for elective operations).
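The shunt survival medians above derive from the Kaplan-Meier product-limit estimator, which can be sketched in a few lines of Python; the follow-up data below are toy values for illustration, not the study’s records:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.
    times: follow-up durations; events: 1 = event observed, 0 = censored.
    Returns (time, survival probability) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    for t, group in groupby(data, key=lambda pair: pair[0]):
        group = list(group)
        d = sum(e for _, e in group)  # events at time t
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= len(group)  # both events and censored cases leave the risk set
    return curve

# Toy data (days to shunt infection; 0 = censored at last follow-up)
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0])
```

The log-rank test then compares two such curves, and Cox regression (as in Table 3) estimates the hazard ratio while adjusting for covariates.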
 

Figure 2. Kaplan-Meier shunt survival analysis
Patients who had traumatic brain injury, were administered vancomycin as the sole prophylactic antibiotic, or underwent implantation as an emergency procedure were more likely to experience shunt infection (P<0.05, log-rank test)
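The median shunt survival figures quoted above are read off the Kaplan-Meier curve as the first time point at which the estimated survival function falls to 0.5 or below. A minimal product-limit sketch (Python; the follow-up times below are invented for illustration, not study data):

```python
def km_median(times, events):
    """Kaplan-Meier (product-limit) median survival time.
    times: follow-up in days; events: 1 = shunt infection, 0 = censored.
    Returns the first time S(t) <= 0.5, or None if the median is not reached."""
    # At tied times, process events before censorings (standard convention)
    data = sorted(zip(times, events), key=lambda p: (p[0], -p[1]))
    at_risk = len(data)
    surv = 1.0
    for t, event in data:
        if event:
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1
        if surv <= 0.5:
            return t
    return None

print(km_median([10, 20, 30, 40, 50], [1, 1, 1, 0, 0]))  # 30
```

In practice the log-rank comparison and Cox modelling reported in the text would be performed with a statistics package rather than by hand.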
 
In this study, 30-day all-cause mortality was 6% (n=32), but none was directly procedure-related. Almost half of these patients (n=15, 47%) had an underlying malignant CNS tumour; the majority being brain metastases (n=12, 80%). After accounting for patient age, sex, disease aetiology, shunt reinsertion and infection, a diagnosis of malignant brain tumour was the only significant independent predictor for 30-day mortality with an adjusted OR of 5.6 (95% CI, 2.6-11.7).
 
Discussion
Ventriculoperitoneal CSF shunting has considerably reduced the morbidity and mortality of patients with hydrocephalus since its first description in 1908.17 More than a century later, the operation remains the mainstay of treatment for this condition. Despite the introduction of antibiotics and improvements in shunt materials and surgical techniques, VP shunt complications remain common. Long-term epidemiological studies have indicated that more than half of all patients with CSF shunts will require a surgical revision in their lifetime.4 18 Shunt infection is a serious complication with potentially devastating consequences. Observational studies have recorded infection rates ranging between 3% and 17%, but more consistent estimates from larger patient cohorts cite rates of 6% to 8%.1 3 4 5 Our local shunt infection rate of 7% is lower than many previously published findings and is in keeping with results from other developed countries.
 
The wide range of shunt infection rates quoted in the literature is due in part to the diverse definitions adopted for shunt infection and the patient populations studied. Many studies have defined infection as a positive CSF microbial culture or the presence of CSF pleocytosis or low CSF glucose levels with clinical features of CNS infection.3 11 19 Due to the study design, a more pragmatic definition was adopted whereby infection was determined retrospectively by either a positive CSF or shunt hardware microbial culture in the presence of shunt malfunction.5 14 15 Nonetheless, it is acknowledged that the true incidence of shunt infection may be overestimated by false-positive cultures from skin flora. The reasons for selecting this interpretation for shunt infection were three-fold. First, it allowed for micro-organism identification and consequent epidemiological analysis; second, infection may not be clinically apparent with malfunctioning shunts; and third, CSF cultures alone cannot exclude infection in cases of shunt malfunction.15
 
The only disease risk factor independently associated with VP shunt infection was TBI. There may be two reasons for this. First, delayed post-traumatic hydrocephalus commonly occurs after severe TBI and develops in over a third of patients subject to decompressive craniectomies.20 Such patients often have a prolonged hospital stay and undergo multiple operations before a shunt is eventually implanted. In this cohort, TBI patients had a mean duration of hospitalisation of 89 days, which was 2 weeks more than the mean stay of 74 days for hydrocephalic patients with alternative neurosurgical conditions. Protracted hospitalisation may lead to skin colonisation with drug-resistant organisms that can evade single-agent conventional antibiotic prophylaxis.21 This is supported by evidence from this study: the causative bacteria of shunt infection were resistant to the prescribed prophylactic antibiotic in 80% of TBI patients. Second, these patients had undergone a mean of three prior cranial procedures before shunting compared with two operations in patients with non-traumatic hydrocephalus. Previous surgery is a well-known cause of CSF leak in shunted patients and contributes to an increased risk of infection.5 22 Although in this patient series the number of prior cranial procedures per se did not impart greater risk, detailed clinical data regarding CSF leak in TBI patients were not collected and therefore the influence of TBI on infection can only be inferred.
 
An unexpected finding was that patient age was not a risk factor for shunt infection. This contrasts with several larger studies that identified paediatric patients, especially infants (younger than 1 year), to be particularly at risk.11 23 Infants have less-developed humoral and cellular immunity and immature skin, rendering them more vulnerable to shunt infection. A likely reason for this observation is the small number of paediatric patients (n=80, 15%) in this cohort, with only 9% (n=48) being infants. A larger sample size may delineate clearer differences among age-groups.
 
There is little doubt that systemic antibiotic prophylaxis can prevent shunt infection.24 Interestingly, however, we identified the sole use of vancomycin as a risk factor for shunt infection, a novel observation that has not been previously reported in the literature. The antibiotic was regularly reserved for patients allergic to penicillin-group antibiotics or for those with documented penicillin-resistant microbial infection or colonisation. This finding seems counter-intuitive, especially as all causative CoNS identified in this cohort were sensitive to vancomycin. The issue may lie with the timing of its administration before the procedure and its dosage. With regard to timing, systemic vancomycin requires slow intravenous infusion to reduce the risk of a hypersensitivity reaction that manifests as either red man syndrome or anaphylaxis and occurs in 3.7% to 47% of patients.25 To illustrate the incidence of these symptoms, the first randomised controlled trial investigating its efficacy in shunt procedures was prematurely discontinued due to these adverse effects.26 Most hospital protocols require infusion over a minimum of 1 hour, but clinical trials have demonstrated that lengthier 2-hour infusions can further reduce the frequency and severity of these reactions.25 27 Furthermore, the efficacy of vancomycin to treat CoNS and MRSA infections has been questioned due to observations of slower bactericidal activity, compared with nafcillin, than was previously recognised.28 To address this issue, we suggest that rigid guidelines be adhered to with respect to the adequate timing of vancomycin infusion before skin incision. Should more rapid infusion be required, for example in the emergency setting, pre-administration of intravenous diphenhydramine before vancomycin infusion can prevent the development of red man syndrome.25
 
Limited data are available about the pharmacokinetics and CSF concentrations of vancomycin in neurosurgical patients. In a study reviewing intra-operative serum and CSF vancomycin concentrations of paediatric patients undergoing shunt implantation, the authors noted that CSF penetration was negligible in patients with non-inflamed meninges despite presumed adequate loading doses.29 This was echoed in a later study determining that among non-meningitic patients, vancomycin CNS penetration was poor, with a CSF-to-serum ratio of only 18%.30 Its increasing use over the course of decades has also led to a corresponding rise in minimum bactericidal concentrations of CoNS.28 These findings should prompt an extensive review of prophylactic vancomycin use as there is currently no consensus on a recommended loading dose for neurosurgical procedures. In the meantime, researchers have attempted to improve bactericidal activity in patients treated with vancomycin, with varying degrees of success. Vancomycin in combination with gentamicin results in more rapid bactericidal rates in animal models28 and has been proven to be as effective as third-generation cephalosporins in preventing surgical site infection for neurosurgical procedures in a randomised trial.31 Others have proposed intra-operative combined vancomycin-aminoglycoside administration, given intraventricularly, as an antibiotic bath in which shunt hardware is immersed prior to implantation, or applied in powder form within the subgaleal space of the wound, with positive results.11 32 33 34
 
Antibiotic-impregnated (with rifampicin plus either clindamycin or minocycline) and silver-coated ventricular catheters offer the greatest promise in preventing shunt infection.35 36 A growing body of evidence supports antibiotic-impregnated ventricular catheters, and they are gradually replacing conventional plain silicone catheters in daily practice with considerable cost savings.37 38 39 40 There are, however, accompanying concerns about the development of antibiotic-resistant micro-organisms, and a recent meta-analysis highlighted a higher risk of Gram-negative and MRSA shunt infections.38 In this series, only 12 patients received antibiotic-impregnated catheters during the study period, so it is difficult to draw any conclusions about their effectiveness.
 
Emergency VP shunting is another surgery-related risk factor for infection and was performed in more than half of patients who underwent the procedure. Its significance is arguably the greatest among the three independent factors identified because it is possibly the most amenable to change in current practice. The clinical condition of most patients with hydrocephalus who require primary VP shunting does not warrant emergency surgery, although a few indications exist, for example, obstructive pineal region or cerebellar tumours that may present with acute symptoms. More than two thirds of patients (70%) in this study had conditions that necessitated delayed shunting once the primary disease had been treated and the patient stabilised. This phenomenon most likely prevails because of the limited availability of operating theatres and other related resources, and the general practice of delegating VP shunting to more junior members of the surgical team. Several reasons support why ‘emergency’ primary shunting should be discouraged. It has long been established by several protocol-driven trials that shunting should be performed as the first procedure of the operative day to minimise the risk of contamination.9 10 11 To illustrate, a surgical incision time after 10 am was observed to be a predictor for infection.5 In elective procedures the neurosurgeon in-charge and other responsible operating theatre staff are likely to be more experienced than personnel involved in emergencies. In particular, individual surgical experience has been shown to be an important factor for infection, with researchers reporting a higher incidence among neurosurgical trainees or in surgeons who performed fewer than 147 shunts within a decade.4 41 Nonetheless, using the former stratification of trainee versus specialist, this was not evident in our cohort. Another argument against ‘emergency’ VP shunting concerns the location where the procedure is performed. For a variety of resource allocation reasons, shunting scheduled as an emergency procedure is often not performed in neurosurgery-designated operating theatres. A study investigating the distribution of bacteria in the operating room environment and its relationship with ventricular shunt infections concluded that positive environmental cultures were more likely in a theatre not devoted to neurosurgery.42 Although procedure timing and location were not explored in this audit, it is believed that they were the principal explanations why shunts implanted as an emergency were more likely to become infected.
 
The time interval from shunt implantation to revision for infection in our study, with a median shunt survival of 64 days, was longer than in most published data.5 34 Our data show that 92% (n=33) of shunt infections occurred within 6 months, compatible with the commonly held belief that infection begins intra-operatively with the inoculation of skin flora, from either the patient or surgeon, into the surgical wound.42 43 44 This is further substantiated by the predominance of CoNS and S aureus in 81% of bacterial cultures in this patient series, as in several previous reports.4 5 9 10 11 12 32 35 36 43 Coupled with positive research findings that theatre discipline during surgery reduces infection risk, it seems reasonable to conclude that institutional shunt implantation protocols should be established.9 10 11 12 32
 
Even though the performance of our neurosurgical community with regard to primary VP shunt infection meets international standards, there is room for improvement. The implementation of a standardised shunt surgery protocol that covers preoperative preparation as well as intra-operative and postoperative management has consistently been proven effective in reducing infection.9 10 11 12 32 The landmark study by Choux et al10 first demonstrated that meticulous measures—such as adopting a no-touch technique for shunt handling, and limiting the length of shunt exposure time and the number of people in the operating room—dramatically decreased shunt infection rates from 16% to less than 1%. It is our belief that a similarly comprehensive protocol should be developed based on the findings of this preliminary study.
 
This study has several limitations. Data collection was retrospective, so key clinical information such as the presence of CSF leak and patient co-morbidities was missing. This may have led to inadequate control for confounding factors. An additional limitation inherent in studies of this nature is potential observational bias, as data were collected without blinding after outcomes were known. Follow-up was incomplete, with 68 (13%) patients defaulting from clinical review over the course of 3 years. Follow-up duration was also inadequate in some cases: seven (1%) patients were transferred to other hospitals within 30 days of the procedure and this might have influenced 30-day all-cause mortality findings. Finally, our definition of shunt infection did not include abnormal CSF biochemistry criteria that could have confirmed or refuted positive culture results of specimens that might have been contaminated during collection.
 
Conclusions
This is the first territory-wide review of infection in primary VP shunts conducted in Hong Kong’s public health care setting. This study is also one of the largest in the literature examining shunt infection complications in a predominantly Chinese population. Shunt infection was the second most common cause for reinsertion, occurring in 7% of patients. Significant independent predictors for shunt infection were TBI, vancomycin administration for prophylaxis, and procedures performed in an emergency setting. Although our VP shunt infection rate meets international standards, there are areas for improvement that can be readily addressed, such as the timing and dosage of vancomycin and the avoidance of performing the procedure as an emergency. The best approach to reducing shunt infection may be the design and adoption of a standardised shunt surgery protocol customised to local practice.
 
Acknowledgements
We would like to thank members of the Hospital Authority Head Office (HAHO) Co-ordinating Committee (Neurosurgery), the Clinical Effectiveness & Technology Management Department, the Division of Quality and Safety, and the Clinical Data Analysis and Reporting System Team, HAHO IT Service for their administrative advice and data collection. We also wish to thank Drs Chris YW Liu, Alphon HY Ip, and Claudia Law for their contributions in data collection and entry. This study was supported by the Tung Wah Group of Hospitals Neuroscience Research Fund.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Wong JM, Ziewacz JE, Ho AL, et al. Patterns in neurosurgical adverse events: cerebrospinal fluid shunt surgery. Neurosurg Focus 2012;33:E13. Crossref
2. Schoenbaum SC, Gardner P, Shillito J. Infections of cerebrospinal fluid shunts: epidemiology, clinical manifestations, and therapy. J Infect Dis 1975;131:543-52. Crossref
3. Birjandi A, Zare E, Hushmandi F. Ventriculoperitoneal shunt infection: a review of treatment. Neurosurg Q 2012;22:145-8. Crossref
4. Borgbjerg BM, Gjerris F, Albeck MJ, Børgesen SE. Risk of infection after cerebrospinal fluid shunt: an analysis of 884 first-time shunts. Acta Neurochir (Wien) 1995;136:1-7. Crossref
5. Korinek AM, Fulla-Oller L, Boch AL, Golmard JL, Hadiji B, Puybasset L. Morbidity of ventricular cerebrospinal fluid shunt surgery in adults: an 8-year study. Neurosurgery 2011;68:985-94; discussion 994-5. Crossref
6. Patwardhan RV, Nanda A. Implanted ventricular shunts in the United States: the billion-dollar-a-year cost of hydrocephalus treatment. Neurosurgery 2005;56:139-44; discussion 144-5. Crossref
7. Simon TD, Riva-Cambrin J, Srivastava R, et al. Hospital care for children with hydrocephalus in the United States: utilization, charges, comorbidities, and deaths. J Neurosurg Pediatr 2008;1:131-7. Crossref
8. Shannon CN, Simon TD, Reed GT, et al. The economic impact of ventriculoperitoneal shunt failure. J Neurosurg Pediatr 2011;8:593-9. Crossref
9. Faillace WJ. A no-touch technique protocol to diminish cerebrospinal fluid shunt infection. Surg Neurol 1995;43:344-50. Crossref
10. Choux M, Genitori L, Lang D, Lena G. Shunt implantation: reducing the incidence of shunt infection. J Neurosurg 1992;77:875-80. Crossref
11. Kestle JR, Riva-Cambrin J, Wellons JC 3rd, et al. A standardized protocol to reduce cerebrospinal fluid shunt infection: the Hydrocephalus Clinical Research Network Quality Improvement Initiative. J Neurosurg Pediatr 2011;8:22-9. Crossref
12. Pirotte BJ, Lubansu A, Bruneau M, Loqa C, Van Cutsem N, Brotchi J. Sterile surgical technique for shunt placement reduces the shunt infection rate in children: preliminary analysis of a prospective protocol in 115 consecutive procedures. Childs Nerv Syst 2007;23:1251-61. Crossref
13. World Health Organization and Department of Health, Hong Kong. Hong Kong (China) health service delivery profile, 2012. Available from: http://www.wpro.who.int/health_services/service_delivery_profile_hong_kong_(china).pdf. Accessed Mar 2016.
14. Overturf GD. Defining bacterial meningitis and other infections of the central nervous system. Pediatr Crit Care Med 2005;6 Suppl:S14-8. Crossref
15. Vanaclocha V, Sáiz-Sapena N, Leiva J. Shunt malfunction in relation to shunt infection. Acta Neurochir (Wien) 1996;138:829-34. Crossref
16. Horan TC, Gaynes RP, Martone WJ, Jarvis WR, Emori TG. CDC definitions of nosocomial surgical site infections, 1992: a modification of CDC definitions of surgical wound infections. Infect Control Hosp Epidemiol 1992;13:606-8. Crossref
17. Kausch W. Die Behandlung des Hydrocephalus der kleinen Kinder. Arch Klin Chir 1908;87:709-96.
18. Tuli S, Drake J, Lawless J, Wigg M, Lamberti-Pasculli M. Risk factors for repeated cerebrospinal shunt failures in pediatric patients with hydrocephalus. J Neurosurg 2000;92:31-8. Crossref
19. Odio C, McCracken GH Jr, Nelson JD. CSF shunt infections in pediatrics. A seven-year experience. Am J Dis Child 1984;138:1103-8. Crossref
20. Honeybul S, Ho KM. Incidence and risk factors for post-traumatic hydrocephalus following decompressive craniectomy for intractable intracranial hypertension and evacuation of mass lesions. J Neurotrauma 2012;29:1872-8. Crossref
21. Wang KW, Chang WN, Shih TY, et al. Infection of cerebrospinal fluid shunts: causative pathogens, clinical features, and outcomes. Jpn J Infect Dis 2004;57:44-8.
22. Jeelani NU, Kulkarni AV, Desilva P, Thompson DN, Hayward RD. Postoperative cerebrospinal fluid wound leakage as a predictor of shunt infection: a prospective analysis of 205 cases. Clinical article. J Neurosurg Pediatr 2009;4:166-9. Crossref
23. Davis SE, Levy ML, McComb JG, Masri-Lavine L. Does age or other factors influence the incidence of ventriculoperitoneal shunt infections? Pediatr Neurosurg 1999;30:253-7. Crossref
24. Ratilal B, Costa J, Sampaio C. Antibiotic prophylaxis for surgical introduction of intracranial ventricular shunts. Cochrane Database Syst Rev 2006;(3):CD005365. Crossref
25. Sivagnanam S, Deleu D. Red man syndrome. Crit Care 2003;7:119-20. Crossref
26. Odio C, Mohs E, Sklar FH, Nelson JD, McCracken GH Jr. Adverse reactions to vancomycin used as prophylaxis for CSF shunt procedures. Am J Dis Child 1984;138:17-9. Crossref
27. Healy DP, Sahai JV, Fuller SH, Polk RE. Vancomycin-induced histamine release and “red man syndrome”: comparison of 1- and 2-hour infusions. Antimicrob Agents Chemother 1990;34:550-4. Crossref
28. Stevens DL. The role of vancomycin in the treatment paradigm. Clin Infect Dis 2006;42 Suppl 1:S51-7. Crossref
29. Fan-Havard P, Nahata MC, Bartkowski MH, Barson WJ, Kosnik EJ. Pharmacokinetics and cerebrospinal fluid (CSF) concentrations of vancomycin in pediatric patients undergoing CSF shunt placement. Chemotherapy 1990;36:103-8. Crossref
30. Albanèse J, Léone M, Bruguerolle B, Ayem ML, Lacarelle B, Martin C. Cerebrospinal fluid penetration and pharmacokinetics of vancomycin administered by continuous infusion to mechanically ventilated patients in an intensive care unit. Antimicrob Agents Chemother 2000;44:1356-8. Crossref
31. Pons VG, Denlinger SL, Guglielmo BJ, et al. Ceftizoxime versus vancomycin and gentamicin in neurosurgical prophylaxis: a randomized, prospective, blinded clinical study. Neurosurgery 1993;33:416-22; discussion 422-3. Crossref
32. Choksey MS, Malik IA. Zero tolerance to shunt infections: can it be achieved? J Neurol Neurosurg Psychiatry 2004;75:87-91.
33. Abdullah KG, Attiah MA, Olsen AS, Richardson A, Lucas TH. Reducing surgical site infections following craniotomy: examination of the use of topical vancomycin. J Neurosurg 2015;123:1600-4. Crossref
34. Ragel BT, Browd SR, Schmidt RH. Surgical shunt infection: significant reduction when using intraventricular and systemic antibiotic agents. J Neurosurg 2006;105:242-7. Crossref
35. Keong NC, Bulters DO, Richards HK, et al. The SILVER (Silver Impregnated Line Versus EVD Randomized trial): a double-blind, prospective, randomized, controlled trial of an intervention to reduce the rate of external ventricular drain infection. Neurosurgery 2012;71:394-403; discussion 403-4. Crossref
36. Sciubba DM, Stuart RM, McGirt MJ, et al. Effect of antibiotic-impregnated shunt catheters in decreasing the incidence of shunt infection in the treatment of hydrocephalus. J Neurosurg 2005;103 Suppl:131-6. Crossref
37. Thomas R, Lee S, Patole S, Rao S. Antibiotic-impregnated catheters for the prevention of CSF shunt infections: a systematic review and meta-analysis. Br J Neurosurg 2012;26:175-84. Crossref
38. Konstantelias AA, Vardakas KZ, Polyzos KA, Tansarli GS, Falagas ME. Antimicrobial-impregnated and -coated shunt catheters for prevention of infections in patients with hydrocephalus: a systematic review and meta-analysis. J Neurosurg 2015;122:1096-112. Crossref
39. Parker SL, Farber SH, Adogwa O, Rigamonti D, McGirt MJ. Comparison of hospital cost and resource use associated with antibiotic-impregnated versus standard shunt catheters. Clin Neurosurg 2011;58:122-5. Crossref
40. Parker SL, McGirt MJ, Murphy JA, Megerian JT, Stout M, Engelhart L. Cost savings associated with antibiotic-impregnated shunt catheters in the treatment of adult and pediatric hydrocephalus. World Neurosurg 2015;83:382-6. Crossref
41. Cochrane DD, Kestle JR. The influence of surgical operative experience on the duration of first ventriculoperitoneal shunt function and infection. Pediatr Neurosurg 2003;38:295-301. Crossref
42. Duhaime AC, Bonner K, McGowan KL, Schut L, Sutton LN, Plotkin S. Distribution of bacteria in the operating room environment and its relation to ventricular shunt infections: a prospective study. Childs Nerv Syst 1991;7:211-4. Crossref
43. Bayston R, Lari J. A study of the sources of infection in colonised shunts. Dev Med Child Neurol 1974;16 Suppl 32:16-22. Crossref
44. Tulipan N, Cleves MA. Effect of an intraoperative double-gloving strategy on the incidence of cerebrospinal fluid shunt infection. J Neurosurg 2006;104 Suppl:5-8. Crossref

Chronic peritoneal dialysis in Chinese infants and children younger than two years

Hong Kong Med J 2016 Aug;22(4):365–71 | Epub 17 Jun 2016
DOI: 10.12809/hkmj154781
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Chronic peritoneal dialysis in Chinese infants and children younger than two years
YH Chan, FHKCPaed, FHKAM (Paediatrics); Alison LT Ma, FHKCPaed, FHKAM (Paediatrics); PC Tong, FHKCPaed, FHKAM (Paediatrics); WM Lai, FHKCPaed, FHKAM (Paediatrics); Niko KC Tse, FHKCPaed, FHKAM (Paediatrics)
Paediatric Nephrology Centre, Department of Paediatric and Adolescent Medicine, Princess Margaret Hospital, Laichikok, Hong Kong
 
Corresponding author: Dr YH Chan (genegene.chan@gmail.com)
 
Abstract
Objective: To review the outcome for Chinese infants and young children on chronic peritoneal dialysis.
 
Methods: The Paediatric Nephrology Centre of Princess Margaret Hospital is the designated site offering chronic dialysis to children in Hong Kong. Medical records of children who started chronic peritoneal dialysis before the age of 2 years, from 1 July 1995 to 31 December 2013, were retrieved and retrospectively reviewed.
 
Results: Nine Chinese patients (male-to-female ratio, 3:6) were identified. They were commenced on automated peritoneal dialysis at a median age of 4.7 (interquartile range, 1.1-13.3) months. The median duration of chronic peritoneal dialysis was 40.9 (interquartile range, 22.9-76.2) months. The underlying aetiologies were renal dysplasia (n=3), pneumococcal-associated haemolytic uraemic syndrome (n=3), ischaemic nephropathy (n=2), and primary hyperoxaluria I (n=1). Peritonitis and exit-site infection rate was 1 episode per 46.5 patient-months and 1 episode per 28.6 patient-months, respectively. Dialysis adequacy (Kt/Vurea >1.8) was achieved in 87.5% of patients. Weight gain was achieved in our patients although three required gastrostomy. Four patients were delayed in development. All patients survived except one patient with primary hyperoxaluria I who died of acute portal vein thrombosis following liver transplantation. One patient with pneumococcal-associated haemolytic uraemic syndrome had sufficient renal function to be weaned off dialysis. Four patients received deceased donor renal transplantation after a mean waiting time of 76.7 months. Three patients remained on chronic peritoneal dialysis at the end of the study.
 
Conclusions: Chronic peritoneal dialysis is technically difficult in infants. Nonetheless, low peritonitis rate, low exit-site infection rate, and no chronic peritoneal dialysis–related mortality can be achieved. Chronic peritoneal dialysis offers a promising strategy to bridge the way to renal transplantation.
 
New knowledge added by this study
  • Literature on infant chronic peritoneal dialysis (CPD) is scarce. This is the first report about long-term outcome of Chinese infants on CPD.
  • The local catheter-related infection rate is low compared with western countries.
Implications for clinical practice or policy
  • CPD in infancy is a feasible modality as a bridge to transplantation with low infection and mortality rate. A shared decision-making process between parents and paediatric nephrologists is necessary to provide an optimal care plan for this group of patients, considering the predicted outcome, associated co-morbidities, and family burden.
 
 
Introduction
End-stage renal disease (ESRD) is a rare disease with high mortality in infants and young children under 2 years of age. In the past, the decision to initiate infant dialysis was not easy due to technical difficulties and poor clinical outcome, as evidenced by a 1990 survey showing that only 50% of paediatric nephrologists would offer dialysis to ESRD children younger than 1 year, and only 40% would offer dialysis to neonates.1 With technological advances and improving outcomes for children on dialysis in terms of physical growth, development, and quality of life,2 3 most paediatric nephrologists will now consider peritoneal dialysis (PD) as a bridge to renal transplantation. Data from the 2011 North American Pediatric Renal Trials and Collaborative Studies (NAPRTCS) report indicated that 92% of ESRD children younger than 2 years were on chronic PD (CPD).4
 
Literature in this area is scarce2 3 5 6 especially on the long-term outcome of these infants in the Chinese population. As the only tertiary referral paediatric nephrology centre in Hong Kong, we retrospectively reviewed our experience in the epidemiology, dialysis prescription, complications, and outcome in this group of patients.
 
Methods
The Paediatric Nephrology Centre of Princess Margaret Hospital is the designated site offering renal replacement therapy to children in Hong Kong. Medical records of children who started CPD before the age of 2 years, from 1 July 1995 to 31 December 2013, were retrieved and reviewed. Information regarding their primary renal diagnosis, co-morbidities, growth profile, infectious and non-infectious complications, dialysis prescription, dialysis adequacy, peritoneal membrane transport status, relevant laboratory investigations, and final outcome was reviewed. Data collected were recorded on data entry forms. Patients who underwent CPD for less than 6 months were excluded. The study was approved by the ethics committee of Princess Margaret Hospital.
 
In our centre, CPD was the preferred dialysis modality in young children; PD was performed by automated cycler in the modes of nocturnal intermittent peritoneal dialysis (NIPD), continuous cyclic peritoneal dialysis (CCPD), continuous optimal peritoneal dialysis, and tidal peritoneal dialysis. Peritoneal equilibration test was performed annually with membrane transport status classified as high, high-average, low-average, or low transporters.7 8 Dialysis adequacy was monitored by both clinical parameters and biochemical parameters. Due to limited information on residual renal function, solute clearance referred to contribution by CPD only, and was expressed in terms of Kt/Vurea.
 
Peritonitis was defined as cloudy peritoneal effluent, with white cell count of >100/mm3 in the dialysate with at least 50% polymorphonuclear leukocytes.9 Additionally, clinical symptoms of fever with or without abdominal pain were included. Exit-site infection (ESI) was diagnosed in the presence of peri-catheter swelling, redness, tenderness, and discharge at the exit site.9 Developmental delay was defined as children who received special education or failed to reach a normal developmental milestone in two or more developmental domains (eg gross motor, cognition, etc).
 
Chronic kidney disease–mineral bone disease (CKD-MBD) was defined as a systemic disorder of mineral bone metabolism due to renal failure, manifesting as biochemical abnormalities (calcium, phosphate, parathyroid hormone [PTH], or vitamin D metabolism), abnormal bone turnover, or vascular calcification.10 Renal osteodystrophy, the skeletal component of CKD-MBD, was defined as alteration of bone morphology in patients with ESRD.10 The target PTH range for children on CPD was 11 to 33 pmol/L (100-300 pg/mL), supported by recent data from the International Pediatric Peritoneal Dialysis Network (IPDN).11
 
Statistical analysis
Data collection and analysis were performed with Microsoft Excel 2010. The demographic data and biochemical parameters were expressed as mean ± standard deviation, range, median, interquartile range (IQR), number, or percentage as appropriate. Height and weight were expressed as standard deviation scores (SDSs), calculated according to a local study on growth of children.12
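A standard deviation score is ordinary z-score arithmetic against age- and sex-matched reference values. A one-line sketch (Python; the reference mean and SD below are placeholders, not values from the cited local growth study):

```python
def sds(measurement, ref_mean, ref_sd):
    """Standard deviation score: how many reference SDs a measurement
    lies above (+) or below (-) the age- and sex-specific reference mean."""
    return (measurement - ref_mean) / ref_sd

# e.g. a height of 70 cm where the reference is 74 cm with SD 2.5 cm
print(sds(70, 74, 2.5))  # -1.6
```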
 
Results
Patient characteristics
From 1995 to 2013, nine Chinese children under 2 years of age (3 boys and 6 girls) receiving CPD were identified. The mean estimated glomerular filtration rate immediately prior to dialysis, calculated using the Schwartz formula, was 6.9 ± 3.8 (range, 3.9-15) mL/min/1.73 m2. The median age at initiation of CPD was 4.7 (IQR, 1.1-13.3) months and the median duration of CPD was 40.9 (IQR, 22.9-76.2) months. The most common causes of ESRD were renal dysplasia (n=3, 33%) and pneumococcal-associated haemolytic uraemic syndrome (pHUS) [n=3, 33%], followed by ischaemic nephropathy due to severe perinatal asphyxia (n=2, 22%) and primary hyperoxaluria type I (PH1) [n=1, 11%] (Tables 1 and 2). All three patients with pHUS presented with pneumococcal pneumonia, microangiopathic haemolytic anaemia, and acute kidney injury; a positive direct Coombs test or T-antigen test supported the diagnosis in each case.
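 
The Schwartz estimate is a height-based formula. The sketch below assumes the bedside Schwartz constant k = 0.413 with serum creatinine in mg/dL; the study does not state which version of the constant was applied:

```python
def schwartz_egfr(height_cm, scr_mg_dl, k=0.413):
    """Bedside Schwartz eGFR (mL/min/1.73 m^2):
    k * height (cm) / serum creatinine (mg/dL)."""
    return k * height_cm / scr_mg_dl

# A 75-cm infant with serum creatinine 4.5 mg/dL:
egfr = schwartz_egfr(75, 4.5)  # ≈ 6.9 mL/min/1.73 m^2
```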
 

Table 1. Summary of children younger than 2 years who were on chronic peritoneal dialysis during July 1995 to December 2013
 

Table 2. Clinical characteristics of nine Chinese children started on chronic peritoneal dialysis before 2 years old during July 1995 to December 2013
 
Peritoneal dialysis prescription, transporter status, and peritoneal dialysis adequacy
All patients were put on automated peritoneal dialysis (APD). Initially, eight children were on NIPD and only one was on CCPD. Over the course of CPD, five (56%) patients changed to CCPD, and one (11%) patient changed to tidal PD because of drainage pain. Three (33%) patients remained on NIPD. Decreasing residual renal function and inadequate dialysis were the most common reasons for changing modes of CPD.
 
Peritoneal equilibration test and dialysis adequacy assessment were performed in eight patients. Four patients were high transporters, while two patients were high-average transporters and two patients were low-average transporters (Table 3). Seven (87.5%) patients achieved a dialysis adequacy (Kt/Vurea) of >1.8. The mean Kt/Vurea was 2.5 ± 0.6 (range, 1.5-3.4). The mean weekly creatinine clearance was 38.3 ± 6.2 (range, 25.6-47.1) L/week/1.73 m2.
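 
Weekly Kt/Vurea relates dialytic urea clearance to the urea distribution volume (approximately total body water). A simplified sketch, assuming V ≈ 0.6 × body weight and a single representative dialysate-to-plasma (D/P) urea ratio rather than the full per-exchange calculation used clinically:

```python
def weekly_ktv(dp_urea, drain_volume_l_per_day, weight_kg, tbw_fraction=0.6):
    """Weekly Kt/Vurea = 7 * daily urea clearance / urea distribution volume,
    where daily clearance ≈ D/P urea ratio * drained dialysate volume (L)."""
    v = tbw_fraction * weight_kg          # total body water, litres
    daily_clearance = dp_urea * drain_volume_l_per_day
    return 7 * daily_clearance / v

# A 10-kg child draining 2 L/day with a D/P urea ratio of 0.9:
ktv = weekly_ktv(0.9, 2.0, 10)  # → 2.1, within the range reported above
```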
 

Table 3. Transporter status in children younger than 2 years on chronic peritoneal dialysis
 
Catheter survival
During the study period, 23 episodes of Tenckhoff catheter insertion were carried out in these nine patients. The median catheter survival was 260 (IQR, 19-569) days. Only one patient did not require any catheter change. Fourteen catheter changes were performed in eight patients. Catheters were replaced once in four patients, twice in three patients, and four times in one patient. The most common reason for catheter change was catheter blockage due to omental wrap (n=7, 50%), followed by chronic ESI or refractory peritonitis (n=4, 29%), migration or malposition (n=2, 14%), and cuff extrusion (n=1, 7%). While omentectomy was not routinely performed, 44% of patients eventually required partial omentectomy due to omental wrap.
 
Peritonitis, exit-site infection, and surgical complications
Five patients experienced a total of eight episodes of peritonitis. Four patients did not have peritonitis. The peritonitis rate was 0.26 episode per patient-year, or 1 episode per 46.5 patient-months. Two (25%) episodes were caused by Staphylococcus aureus, one of which was methicillin-resistant. One (12.5%) episode was caused by coagulase-negative Staphylococcus (CoNS) and one (12.5%) by Mycobacterium chelonae. The remaining four (50%) episodes were culture-negative peritonitis (Fig 1). Altogether 13 episodes of ESI occurred in five patients, and patient 3 contributed seven episodes. The rate of ESI was 0.42 episode per patient-year, or 1 episode per 28.6 patient-months. The most common organisms were Pseudomonas aeruginosa (n=7, 54%) and methicillin-sensitive S aureus (n=2, 15%). Other causative pathogens included CoNS, diphtheroid, Serratia, and M chelonae, each of which resulted in one ESI (Fig 2).
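 
The two ways of quoting each rate are equivalent; for example, 8 peritonitis episodes over roughly 372 patient-months of follow-up (a figure implied by, not stated in, the text) yield both numbers reported above:

```python
def infection_rates(episodes, patient_months):
    """Return (episodes per patient-year, patient-months per episode)."""
    per_patient_year = episodes / (patient_months / 12)
    months_per_episode = patient_months / episodes
    return per_patient_year, months_per_episode

rate_py, months_per = infection_rates(8, 372)
# → 0.26 episode per patient-year, 1 episode per 46.5 patient-months
```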
 

Figure 1. Causative organisms in eight peritonitis episodes (five patients) in the study population
 

Figure 2. Causative organisms in 13 exit-site infections (five patients) in the study population
 
One patient required surgical correction of patent processus vaginalis that led to hydrocoele. One patient required repair of bilateral inguinal hernia. One patient had cuff extrusion and required replacement of PD catheter. No catheters developed leakage.
 
Growth and nutrition
Weight gain was observed after initiation of CPD. At the start of dialysis, 12 months and 24 months post-dialysis, the mean weight SDS (wtSDS) was -1.32, -1.44, and -1.27, while height SDS (htSDS) was -0.75, -0.92, and -1.45, respectively (Table 4). Three (33%) patients were commenced on nasogastric (NG) enteral feeding and eventually were switched to gastrostomy feeding. Six (67%) patients were fed on demand, one of whom was awaiting gastrostomy insertion at the end of the study. Three (33%) patients were prescribed growth hormone therapy before the age of 2 years.
 

Table 4. Growth outcome in children younger than 2 years on chronic peritoneal dialysis
 
Development
Four (44%) children were delayed in development or received special education. Two of them had severe perinatal asphyxia associated with hypoxic ischaemic encephalopathy, one was born prematurely at 32 weeks of gestation and the other had PH1, all of which could account for the developmental delay.
 
Anaemia, chronic kidney disease–mineral bone disease, and hypertension
During the first 2 years of CPD, all patients received erythropoiesis-stimulating agent, except patient 5 who later became dialysis-free. The mean maximum dose of recombinant human erythropoietin (rHuEPO) was 169 ± 91 (range, 65-300) units/kg/week. Three patients received rHuEPO at a dose exceeding 200 units/kg/week. Seven patients were put on oral iron supplements; one of whom was switched to intravenous iron replacement subsequently due to functional iron deficiency. The mean haemoglobin level was 109 ± 8 g/L; only two patients (patients 1 and 7) failed to achieve a mean haemoglobin level of ≥100 g/L.
 
All patients showed some degree of CKD-MBD, as evidenced by raised PTH level and the need for activated vitamin D and phosphate binder. Five patients had severe renal osteodystrophy with clinical or radiological manifestations (Table 2). All of them had markedly elevated mean PTH (90-111 pmol/L) outside the recommended target. Of note, two patients (patients 6 and 7) had pathological fractures. Two patients (patients 7 and 9) received a calcimimetic (cinacalcet) for tertiary hyperparathyroidism. Five (56%) patients had hypertension and were on antihypertensive medications with satisfactory control.
 
Outcome
All patients survived except patient 6 with PH1 who died of acute portal vein thrombosis following liver transplantation at the age of 5 years. Patient 5 with pHUS became dialysis-free after 8.6 months of CPD. Four patients underwent deceased donor renal transplantation (DDRT) with a mean waiting time of 76.7 (range, 54-90) months, of whom two were switched to chronic haemodialysis before transplantation because of inadequate dialysis. Three patients remained on PD at the end of the study.
 
Discussion
End-stage renal disease is rare in infants and young children. The reported incidence is variable but remains low around the globe; up to 16 cases per million age-related population per year have been reported in the UK.13 In NAPRTCS 2011, 13.2% of children on dialysis were under 2 years old.4 In Hong Kong, recent data from the Hong Kong Renal Registry showed that the incidence and prevalence of ESRD in those <20 years old were around 5 and 28 per million children, respectively.14
 
The most common aetiology of ESRD in this age-group is congenital anomalies of the kidney and urinary tract, including renal dysplasia and obstructive uropathy.15 Nonetheless, pHUS constituted an important cause of ESRD in Hong Kong. A potential explanation is the later introduction of a universal pneumococcal vaccination programme in Hong Kong (2009) than in the US (2000).
 
In our study, all patients started with CPD. Difficult vascular access for haemodialysis and a high volume of daily milk intake make CPD the more favourable choice of renal replacement therapy in young infants. While the local mean DDRT waiting time in children younger than 18 years was 4.4 ± 2.4 years,16 the waiting time in our young patients was much longer (mean, 6.4 years) because, owing to technical difficulties, patients must weigh more than 15 kg before DDRT can be carried out. Therefore, CPD acts as a bridge to transplantation and reserves vascular access for future use.15
 
Ethical considerations and infection, together with growth and nutrition, are the most challenging aspects of infant CPD.
 
Ethical considerations
Decisions to initiate or withhold dialysis remain among the most challenging aspects of infant ESRD. Recent data showing improvement in mortality and developmental outcome support initiation of dialysis. Shroff et al6 reported a survival rate of 77% at 5 years in children commenced on chronic dialysis before the age of 1 year. Our unpublished data revealed that 91 patients were put on APD from 1996 to 2013, with an overall survival rate of 90%. In the present series, the survival rate in young infants was similar and there was no CPD-related mortality; the only death resulted from surgical complications after liver transplantation.
 
Warady et al17 reported that 79% of infants who started CPD had normal developmental scores at 1 and 4 years of age and that 94% of school-aged children attended school. In our series, 44% of patients were delayed in development, all attributable to co-morbidities or the underlying aetiology of ESRD.
 
Nonetheless, unpredictable outcome, psychosocial burdens, and cost continually fuel the ethical dilemma.13 15 18 The family burden is tremendous. Since CPD is a home-based treatment, caregivers must perform dialysis daily. Up to 55% of paediatric nephrologists felt a parental decision to refuse dialysis should be respected for neonates and 26% for children of 1 to 12 months old.13 In two surveys, serious co-existing co-morbidities and predicted morbidity were the most important factors when a physician considered withholding dialysis.1 19 While serious non-renal co-morbidities such as pulmonary hypoplasia are strongly associated with a poor prognosis,20 patients with isolated renal disease should be considered separately as their prognosis is generally better.18 It should be a shared decision-making process between parents and paediatric nephrologists, after detailed counselling on potential burdens and after considering co-morbidities, expected quality of life, and available resources and expertise.18 21 Designated nurses, clinical psychologists, and medical social workers are crucial in supporting patients and parents.
 
Peritoneal dialysis–related infection
Infants and young children are at risk of PD-related infectious complications. In the US, the annualised rate of peritonitis in children younger than 2 years was 0.79 episode per patient-year, compared with 0.57 episode per patient-year in adolescents aged over 12 years.4 In our series, the annualised peritonitis rate was 0.26 episode per patient-year, lower than the US rate. As previously reported, the overall annual peritonitis rate among all our paediatric patients on APD was low at 0.22.22 A low infection rate has similarly been reported in several Asian countries.22
 
There are a few possible explanations. First, all our patients were on APD, which is associated with a reduced risk of infection as shown in a systematic review by Rabindranath et al23 and in NAPRTCS 2005 data.24 Second, we strictly complied with the guidelines and recommendations on prevention of PD-related infection.4 9 21 25 26 Measures included the use of double-cuffed Tenckhoff catheters, exit sites pointing downward or laterally away from diapers and ostomies, antibiotic prophylaxis at catheter insertion, post-insertion immobilisation of the catheter, nasal methicillin-resistant S aureus screening and decolonisation with mupirocin, and selective use of prophylactic topical antibiotics for patients with a history of ESI. Third, all patients and their carers completed an intensive PD training programme before commencing home APD, conducted by a senior renal nurse with regular reviews and phone follow-ups. The high culture-negative peritonitis rate in our series highlights the need for proper specimen collection and handling.9
 
Growth and nutrition
Growth in infancy is important because one third of postnatal height is achieved during the first 2 years of life.27 Growth during this period relies largely on nutritional intake rather than growth hormone. Growth in ESRD is often impaired because of poor appetite, increased circulating leptin, nutritional loss through peritoneal dialysate, and repeated vomiting due to dysmotility, gastroesophageal reflux, and raised intraperitoneal pressure.15 27 Infants can lose more than 2 htSDS, and this loss can be irreversible.15 Importantly, infancy is also a period of catch-up growth; NAPRTCS reported improvement in both htSDS and wtSDS in children who started dialysis before the age of 2 years: htSDS improved from -2.59 at baseline to -2.15 at 24 months post-dialysis, while wtSDS improved from -2.26 to -1.05.4
 
In our cohort, there was weight gain, but a decline in htSDS was observed. The IPDN recently analysed growth in 153 very young children on CPD.27 Interestingly, htSDS decreased further in the first 6 to 12 months of CPD and then stabilised. Although catch-up in height was noted in the NAPRTCS report, such improvement was only observed in children with worse baseline height deficit, defined as htSDS ≤ –1.88. Children with htSDS > –1.88 instead had a decline in htSDS by 0.11 and 0.2 at 12 and 24 months, respectively.4 Only two of our patients had worse baseline height deficit (≤ –1.88), with htSDS being -2 at CPD initiation. Similar to the findings in IPDN and NAPRTCS, catch-up growth in height was observed in these two patients. At 12 months post-dialysis, their htSDS improved to -0.94 and -1.6, respectively.
 
Oral intake is often unsatisfactory and enteral feeding, by either NG or gastrostomy tube, is required; this allows overnight feeding and reduces vomiting. In the recent IPDN study on growth, 37% of young children were fed on demand, 39% by NG tube, and 7% by gastrostomy tube, while 17% switched from NG to gastrostomy feeding.27 Both NG and gastrostomy feeding led to a significant increase in body mass index SDS, although regional variation was observed. Gastrostomy but not NG feeding was associated with improved linear growth, an effect that was no longer significant after adjusting for baseline length. Feeding by gastrostomy appeared superior to NG tube feeding in promoting growth, possibly because of less vomiting.27
 
Over the years, the use of a gastrostomy to enhance nutritional supplementation has been promoted in our centre, with intensified collaboration with a paediatric renal dietitian. In our series, of the five patients who commenced CPD before 2008, only one (20%) received enteral feeding, owing to low parental acceptance. Of the four patients who started CPD after 2008, two had gastrostomies, one was awaiting gastrostomy insertion, and one thrived satisfactorily without the need for enteral feeding. This suggests improved nutritional management and parental acceptance. Extra efforts should also be made to optimise factors such as acidosis, anaemia, and metabolic bone disease.27 In addition, KDOQI (Kidney Disease Outcomes Quality Initiative) suggests consideration of growth hormone when children have htSDS and height velocity SDS of ≤ –1.88 after optimisation of nutrition and metabolic abnormalities.28
 
There are a few limitations to this study. First, because of the retrospective design, recall bias was possible and some information could not be retrieved from medical records, especially for children who presented in the late 1990s and early 2000s. Second, the total case number was small since patients were recruited from a single nephrology centre. Last, the practice of infant dialysis has changed considerably over the past two decades, which may in turn have affected patient outcome.
 
Conclusions
End-stage renal disease in very young children is uncommon. Chronic PD is feasible and the outcome is improving. Vigilant adoption of guidelines, universal use of APD, and a well-structured PD training programme are crucial to achieve low peritonitis and ESI rates with no CPD-related mortality in our centre. Optimisation of dialysis, nutritional support, and developmental training are important while successful renal transplantation is the ultimate goal for these infants.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Geary DF. Attitudes of pediatric nephrologists to management of end-stage renal disease in infants. J Pediatr 1998;133:154-6.
2. Kari JA, Gonzalez C, Ledermann SE, Shaw V, Rees L. Outcome and growth of infants with severe chronic renal failure. Kidney Int 2000;57:1681-7.
3. Ledermann SE, Scanes ME, Fernando ON, Duffy PG, Madden SJ, Trompeter RS. Long-term outcome of peritoneal dialysis in infants. J Pediatr 2000;136:24-9.
4. North American Pediatric Renal Trials and Collaborative Studies (NAPRTCS). 2011 Annual dialysis report. Available from: https://web.emmes.com/study/ped/annlrept/annualrept2011.pdf. Accessed Nov 2015.
5. Vidal E, Edefonti A, Murer L, et al. Peritoneal dialysis in infants: the experience of the Italian Registry of Paediatric Chronic Dialysis. Nephrol Dial Transplant 2012;27:388-95.
6. Shroff R, Rees L, Trompeter R, Hutchinson C, Ledermann S. Long-term outcome of chronic dialysis in children. Pediatr Nephrol 2006;21:257-64.
7. Warady BA, Alexander SR, Hossli S, et al. Peritoneal membrane transport function in children receiving long-term dialysis. J Am Soc Nephrol 1996;7:2385-91.
8. Warady BA, Alexander S, Hossli S, Vonesh E, Geary D, Kohaut E. The relationship between intraperitoneal volume and solute transport in pediatric patients. Pediatric Peritoneal Dialysis Study Consortium. J Am Soc Nephrol 1995;5:1935-9.
9. Warady BA, Bakkaloglu S, Newland J, et al. Consensus guidelines for the prevention and treatment of catheter-related infections and peritonitis in pediatric patients receiving peritoneal dialysis: 2012 update. Perit Dial Int 2012;32 Suppl 2:S32-86.
10. Kidney Disease: Improving Global Outcomes (KDIGO) CKD-MBD Work Group. KDIGO clinical practice guideline for the diagnosis, evaluation, prevention, and treatment of Chronic Kidney Disease-Mineral and Bone Disorder (CKD-MBD). Kidney Int Suppl 2009;113:S1-130.
11. Borzych D, Rees L, Ha IS, et al. The bone and mineral disorder of children undergoing chronic peritoneal dialysis. Kidney Int 2010;78:1295-304.
12. Leung SS, Tse LY, Wong GW, et al. Standards for anthropometric assessment of nutritional status of Hong Kong children. Hong Kong J Paediatr 1995;12:5-15.
13. Rees L. Paediatrics: Infant dialysis—what makes it special? Nat Rev Nephrol 2013;9:15-7.
14. Yap HK, Bagga A, Chiu MC. Pediatric nephrology in Asia. In: Avner ED, Harmon WE, Niaudet P, Yoshikawa N, Emma F, Goldstein SL, editors. Pediatric nephrology. 6th ed. Springer; 2010: 1981-90.
15. Zaritsky J, Warady BA. Peritoneal dialysis in infants and young children. Semin Nephrol 2011;31:213-24.
16. Chiu MC. An update overview on paediatric renal transplantation. Hong Kong J Paediatr 2004;9:74-7.
17. Warady BA, Belden B, Kohaut E. Neurodevelopmental outcome of children initiating peritoneal dialysis in early infancy. Pediatr Nephrol 1999;13:759-65.
18. Lantos JD, Warady BA. The evolving ethics of infant dialysis. Pediatr Nephrol 2013;28:1943-7.
19. Teh JC, Frieling ML, Sienna JL, Geary DF. Attitudes of caregivers to management of end-stage renal disease in infants. Perit Dial Int 2011;31:459-65.
20. Wood EG, Hand M, Briscoe DM, et al. Risk factors for mortality in infants and young children on dialysis. Am J Kidney Dis 2001;37:573-9.
21. Zurowska AM, Fischbach M, Watson AR, Edefonti A, Stefanidis CJ, European Paediatric Dialysis Working Group. Clinical practice recommendations for the care of infants with stage 5 chronic kidney disease (CKD5). Pediatr Nephrol 2013;28:1739-48.
22. Chiu MC, Tong PC, Lai WM, Lau SC. Peritonitis and exit-site infection in pediatric automated peritoneal dialysis. Perit Dial Int 2008;28 Suppl 3:S179-82.
23. Rabindranath KS, Adams J, Ali TZ, Daly C, Vale L, MacLeod AM. Automated vs continuous ambulatory peritoneal dialysis: a systematic review of randomized controlled trials. Nephrol Dial Transplant 2007;22:2991-8.
24. North American Pediatric Renal Transplant Cooperative Study (NAPRTCS). 2005 Annual report. Available from: https://web.emmes.com/study/ped/annlrept/annlrept2005.pdf. Accessed Nov 2015.
25. Piraino B, Bailie GR, Bernardini J, et al. Peritoneal dialysis-related infections recommendations: 2005 update. Perit Dial Int 2005;25:107-31.
26. Auron A, Simon S, Andrews W, et al. Prevention of peritonitis in children receiving peritoneal dialysis. Pediatr Nephrol 2007;22:578-85.
27. Rees L, Azocar M, Borzych D, et al. Growth in very young children undergoing chronic peritoneal dialysis. J Am Soc Nephrol 2011;22:2303-12.
28. KDOQI Work Group. KDOQI Clinical Practice Guideline for Nutrition in Children with CKD: 2008 update. Executive summary. Am J Kidney Dis 2009;53:S11-104.

Therapeutic inertia in the management of hyperlipidaemia in type 2 diabetic patients: a cross-sectional study in the primary care setting

Hong Kong Med J 2016 Aug;22(4):356–64 | Epub 17 Jun 2016
DOI: 10.12809/hkmj154667
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Therapeutic inertia in the management of hyperlipidaemia in type 2 diabetic patients: a cross-sectional study in the primary care setting
FY Man, MB, BS, FHKCFP; Catherine XR Chen, MRCP (UK), FHKAM (Family Medicine); YY Lau, MB, BS, FHKAM (Family Medicine); King Chan, FRACGP, FHKAM (Family Medicine)
Department of Family Medicine & General Outpatient Clinic, Queen Elizabeth Hospital, Kowloon Central Cluster, Jordan, Hong Kong
 
Corresponding author: Dr FY Man (mfy252@ha.org.hk)
 
Abstract
Objectives: To study the prevalence of therapeutic inertia in lipid management among type 2 diabetic patients in the primary care setting and to explore associated factors.
 
Methods: This was a cross-sectional study involving type 2 diabetic patients with suboptimal lipid control followed up in all general out-patient clinics of Kowloon Central Cluster in Hong Kong from 1 October 2011 to 30 September 2013. Main outcome measures included prevalence of therapeutic inertia in low-density lipoprotein management among type 2 diabetic patients and its association with patient and physician characteristics.
 
Results: Based on an agreed standard, lipid control was suboptimal in 49.1% (n=9647) of type 2 diabetic patients who attended for a regular annual check-up (n=19 662). Among the sampled 369 type 2 diabetic patients with suboptimal lipid control, therapeutic inertia was found to be present in 244 cases, with a prevalence rate of 66.1%. When the attending doctors’ profiles were compared, the mean duration of clinical practice was significantly longer in the therapeutic inertia group than the non–therapeutic inertia group. Doctors without prior training in family medicine were also found to have a higher rate of therapeutic inertia. Patients in the therapeutic inertia group had longer disease duration, a higher co-morbidity rate of cardiovascular disease, and a closer-to-normal low-density lipoprotein level. Logistic regression analysis revealed that lack of family medicine training among doctors was positively associated with the presence of therapeutic inertia whereas patient’s low-density lipoprotein level was inversely associated.
 
Conclusions: Therapeutic inertia was common in the lipid management of patients with type 2 diabetes in a primary care setting. Lack of family medicine training among doctors and patient’s low-density lipoprotein level were associated with the presence of therapeutic inertia. Further study of the barriers and strategies to overcome therapeutic inertia is needed to improve patient outcome in this aspect of chronic disease management.
 
New knowledge added by this study
  • Lipid control among patients with type 2 diabetes mellitus (T2DM) was far from satisfactory, with nearly half being suboptimally controlled.
  • Therapeutic inertia (TI) is common in the lipid management of T2DM patients in the primary care setting with a prevalence rate of 66.1%.
  • Lack of family medicine training among doctors was positively associated with the presence of TI whereas patient’s low-density lipoprotein level was inversely associated.
Implications for clinical practice or policy
  • Comprehensive strategies should be devised to overcome TI so that long-term cardiovascular outcome of diabetic patients can be improved.
 
 
Introduction
Type 2 diabetes mellitus (T2DM) is one of the most common chronic conditions encountered in primary care, affecting up to 10% of the Hong Kong population.1 It is also a leading cause of morbidity and mortality due to diabetic complications.2 Optimal control of cardiovascular risk factors can decrease the risk of developing diabetes-related complications.3 4 5
 
Hyperlipidaemia is one of the most important modifiable risk factors for cardiovascular disease (CVD) prevention. Studies have shown that optimal lipid control is associated with an improved cardiovascular outcome.6 7 8 9 Low-density lipoprotein (LDL) particles are considered more atherogenic than other cholesterol components and therefore stringent control of LDL is particularly important for the prevention of CVD in high-risk patients.10
 
Despite this evidence, lipid control among diabetic patients in the primary care setting, both locally and internationally, has been inadequate.11 The most recent study performed in Hong Kong found that 88.4% of diabetic patients had a suboptimal lipid level.12 Studies in Europe and the US found that the LDL control rate ranged from 30% to 55%.13 14 15 16 17 Similarly, a study of dyslipidaemia management in South Asia including China, South Korea, Malaysia, and Singapore revealed that only 48% of patients attained pre-defined low-density lipoprotein–cholesterol goals.18
 
Similar to other chronic conditions, the reasons for poor lipid control are multifactorial and may include patient, physician, and health care delivery factors. Among them, suboptimal medication augmentation has been identified as an important physician factor. This is known as therapeutic inertia (TI) and is said to exist whenever the health care provider does not initiate or intensify therapy appropriately when therapeutic goals are not reached: “recognition of the problem, but failure to act”.19 20 Therapeutic inertia has become increasingly acknowledged as a major impediment to CVD risk factor control. Studies have suggested that TI in the management of diabetes and hypertension (HT) may contribute to up to 80% of heart attacks and strokes.21 22
 
The prevalence of TI in chronic disease management has not been explored in Hong Kong. In this study, we specifically examined the prevalence of TI in hyperlipidaemia management among diabetic patients. Internal statistical data from the Hospital Authority (HA) Head Office revealed that lipid control has been relatively poor in this cluster compared with blood pressure and glycaemic control. Our study aimed to determine the prevalence of TI in the management of hyperlipidaemia among T2DM patients and to explore the underlying factors. By overcoming the barriers to adequate and appropriate treatment, it was expected that the long-term cardiovascular outcome of T2DM patients could be improved.
 
Methods
Subjects
Inclusion criteria
In this cross-sectional study, all T2DM patients with International Classification of Primary Care code T90 (Non-insulin Dependent Diabetes Mellitus), who had been regularly followed up in all General Outpatient Clinics (GOPCs) of Kowloon Central Cluster (KCC) from 1 October 2011 to 30 September 2013, and had blood lipid levels checked at least once during this period were recruited. In our clinics, blood and urine check-ups are usually carried out in patients with T2DM every 12 to 18 months. This 2-year retrieval period was therefore likely to cover all such patients regularly followed up in our cluster. The diagnosis of diabetes was based on the “Definition and description of diabetes mellitus” from American Diabetes Association (ADA) in 2013.23
 
Exclusion criteria
The following patients were excluded: patients who had been incorrectly diagnosed with diabetes, type 1 diabetic patients, diabetic patients who had no regular blood or urine check-up during the study period, diabetic patients followed up in a specialist clinic, and patients who died during the study period.
 
Definition of treatment target and therapeutic inertia in lipid management among type 2 diabetic patients
Various studies and guidelines have recommended targets in the treatment of hyperlipidaemia. In the HA of Hong Kong, National Cholesterol Education Program Adult Treatment Panel III Guidelines (NCEP ATP III) and ADA guidelines were used to set up the manual for the risk assessment and management programme. In this study, we used the same set of guidelines to define the level of lipid control in T2DM patients. We focused on the control of LDL as it is the most important risk factor of the lipid profile.
 
According to NCEP ATP III 200224 and ADA 2013 Guidelines on Diabetes and Lipids,23 target LDL should be <2.6 mmol/L in diabetic patients without overt CVD and <1.8 mmol/L in diabetic patients with overt CVD. In this study, CVD was defined as established ischaemic heart disease (IHD), cerebrovascular accident (CVA), or peripheral vascular disease (PVD).
 
In this study, lipid control was defined as poor and escalation of treatment indicated if the last LDL level was ≥2.6 mmol/L in diabetic patients without CVD and ≥1.8 mmol/L in diabetic patients with established CVD. Consultation notes of the follow-up immediately after the last available lipid profile test were reviewed through the HA Clinical Management System (CMS). Therapeutic inertia was considered to be present when the attending doctor failed to initiate or intensify treatment if target LDL level was not achieved. If medical notes indicated a valid reason for non-escalation of treatment despite a clinical indication, it was not considered TI. Common justifications included:
(1) Diet and lifestyle modification advice was given to patients newly diagnosed with hyperlipidaemia.
(2) Statin was started following the previous visit and LDL level was improving.
(3) Patient was non-compliant with the existing statin regimen and advice on regular drug compliance was given.
(4) Patient refused to take a statin.
(5) Patient was unable to tolerate side-effects of statin.
(6) Statin was contra-indicated, eg in patients with deranged liver function.
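 
The decision rule described above can be summarised as follows. This is our own sketch, and the function and argument names are illustrative rather than part of the study protocol:

```python
def has_therapeutic_inertia(ldl, has_cvd, treatment_escalated, valid_justification):
    """TI is present when LDL is at or above target, treatment was not
    initiated or intensified, and the consultation notes record none of
    the accepted justifications for non-escalation."""
    target = 1.8 if has_cvd else 2.6  # mmol/L, per NCEP ATP III / ADA targets
    above_target = ldl >= target
    return above_target and not treatment_escalated and not valid_justification

# LDL 3.0 mmol/L, no CVD, no action taken, no documented reason → TI present
ti = has_therapeutic_inertia(3.0, False, False, False)
```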
 
Calculation of sample size and random sampling
According to data drawn from the Clinical Data Analysis and Reporting System of the HA, a total of 19 662 T2DM patients attended GOPCs of KCC for regular follow-up with checking of blood lipid profile during the study period. Based on the definitions above, 9647 of them had suboptimal or poor LDL control. Using an online sample size calculator (Survey Software from Creative Research System, http://www.surveysystem.com), a sample size of 369 would provide a 95% confidence level with a 5% margin of error. Thus, 400 patients were sampled to ensure adequate statistical power and to allow room for case exclusion. A list of random numbers was then generated from the research randomiser (http://www.randomizer.org/form.htm), from which 400 patients were selected. Details of the visit at which the latest lipid profile result was seen were recorded. Data were derived from the consultation notes in the CMS record of selected patients and recorded on a standard data collection form (Appendix). Data were collected by the principal investigator and counter-checked by another experienced doctor in the research team.
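 
The reported figure is consistent with the standard Cochran formula with finite-population correction (z = 1.96 for 95% confidence, p = 0.5, margin of error e = 0.05), which online calculators of this kind typically implement:

```python
def sample_size(population, confidence_z=1.96, p=0.5, margin=0.05):
    """Cochran sample-size formula with finite-population correction."""
    n0 = confidence_z**2 * p * (1 - p) / margin**2  # infinite-population size
    return n0 / (1 + (n0 - 1) / population)

n = sample_size(9647)  # ≈ 369.5, rounded to the reported 369
```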
 

Appendix. Data collection form
 
Determination of variables
Age and gender of all patients as well as smoking status, body mass index (BMI), latest blood pressure, haemoglobin A1c (HbA1c) level, serum creatinine level, lipid profile, and urine albumin-to-creatinine ratio were retrieved from the CMS. The most recent blood or urine test was used for analysis if more than one test had been performed during the study period. The BMI was calculated as body weight/body height2 (kg/m2). The patient was considered a smoker if he/she currently smoked or had stopped in the last 6 months.25 The abbreviated Modification of Diet in Renal Disease formula was used to calculate the estimated glomerular filtration rate.26
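The two formulas referred to above can be written out as follows. This is an illustrative sketch, not the study's code; note that the abbreviated MDRD equation exists with coefficient 186 (the original Levey et al form, used here) or 175 for IDMS-traceable creatinine assays, and the black-race multiplier is omitted.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight / height^2, in kg/m^2."""
    return weight_kg / height_m ** 2

def egfr_mdrd(scr_mg_dl: float, age: int, female: bool) -> float:
    """Abbreviated (4-variable) MDRD estimated GFR, mL/min/1.73 m^2.

    Uses the original coefficient 186; labs reporting IDMS-traceable
    creatinine would use 175. Race adjustment omitted for brevity.
    """
    egfr = 186.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    return egfr
```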
 
The working profile of the attending doctors was retrieved from the Central Office of Department of Family Medicine (FM) and GOPC, KCC. Duration of clinical practice was calculated as the number of years from registration with the Medical Council of Hong Kong. The training status of FM of doctors was documented and categorised according to the following criteria:
  • Group 1: Doctors who had never received any formal FM training.
  • Group 2: Doctors who had completed basic vocational training from Hong Kong College of Family Physicians (HKCFP), or had studied the diploma of FM (DFM).
  • Group 3: Doctors who were intermediate fellows, having obtained fellowship of the HKCFP.
  • Group 4: Doctors who were FM specialists, having obtained fellowship of the Hong Kong Academy of Medicine.
    Statistical analysis
    All data were entered and analysed using computer software (Windows version 21.0; SPSS Inc, Chicago [IL], US). Student’s t test and analysis of variance were used to analyse continuous variables, and the Chi squared test for categorical data; Fisher’s exact test was used when any expected cell count was less than five. Multivariate stepwise logistic regression was used to determine the association between TI and the patient and doctor characteristics that differed significantly in univariate analysis. All statistical tests were two-sided, and a P value of <0.05 was considered statistically significant.
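For a 2 × 2 contingency table such as TI status versus a binary doctor characteristic, the Chi squared statistic and its 1-degree-of-freedom P value can be computed with the standard library alone. The counts below are illustrative, not data from this study:

```python
import math

def chi_square_2x2(table):
    """Pearson chi-squared test (no continuity correction) for a 2x2 table.

    Returns (statistic, two-sided P value). With 1 degree of freedom the
    chi-squared survival function equals erfc(sqrt(x / 2)).
    """
    (a, b), (c, d) = table
    total = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = rows[i] * cols[j] / total
            stat += (obs - expected) ** 2 / expected
    p_value = math.erfc(math.sqrt(stat / 2))
    return stat, p_value
```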
     
    Ethical considerations
    The study protocol was reviewed and approved by the Research Ethics Committee of HA (Kowloon Central/Kowloon East Cluster) [Reference number: KC/KE-13-0247/ER-1].
     
    Results
    A total of 21 960 T2DM patients were identified from the KCC GOPC Diabetes Mellitus registry from 1 October 2011 to 30 September 2013. Among them, 19 662 (89.5%) patients had their lipid profile checked at least once during the study period; 9647 (49.1%) cases had suboptimal lipid control based on the defined criteria above, including 1733 cases with co-existing CVD and 7914 cases without CVD.
     
    Among 400 randomly sampled diabetic patients with suboptimal lipid control, 31 were excluded including 21 who were being followed up in other clinics for diabetic control, nine who died during the study period, and one who was wrongly diagnosed with diabetes. The remaining 369 cases were recruited for data analysis (Fig).
     

    Figure. Patient recruitment in this study
     
    Table 1 summarises the demographic characteristics of the recruited patients. The mean (± standard deviation) age of the study population was 65.5 ± 11.9 years and 186 (50.4%) were female. The mean duration of diabetes was 9.1 ± 7.9 years. With regard to their co-morbidities, 306 (82.9%) patients had concomitant HT, 24 (6.5%) had IHD, 40 (10.8%) had CVA, and two (0.5%) had PVD. The mean LDL level was 3.12 ± 0.61 mmol/L and only 101 (27.4%) patients were prescribed a statin.
     

    Table 1. Demographic characteristics of type 2 diabetic patients recruited into the study
     
    Table 2a summarises the demographic characteristics of the attending doctors. A total of 56 doctors, among whom 19 (33.9%) were female, attended the 369 diabetic patients. The mean duration of clinical practice was 13.6 ± 9.6 years. With regard to FM training status, 13 (23.2%) doctors had received no FM training, 18 (32.1%) received basic training or studied DFM, 13 (23.2%) were intermediate FM fellows, and 12 (21.4%) were FM specialists.
     

    Table 2. (a) Demographic profile of physicians caring for the recruited patients with diabetes, and (b) subanalysis of attending doctors’ profile according to duration of clinical practice and Family Medicine training status
     
    Subanalysis of attending doctors’ profile according to their duration of clinical practice and FM training status is shown in Table 2b. Training status of FM varied significantly with duration of clinical practice (P<0.001). Among 13 doctors who had worked for ≤5 years, all had been a basic FM trainee or had obtained a DFM. On the other hand, among 12 doctors who had worked for over 20 years, most (n=9, 75%) had not received any formal FM training.
     
    Among the 369 recruited T2DM patients, treatment was escalated in 47 (12.7%). A valid justification for not intensifying treatment was documented in 78 (21.1%) cases: 19 patients were newly diagnosed with hyperlipidaemia and given dietary advice on lifestyle modification; in 13, a statin had been commenced at the previous visit and the lipid level was lower than before treatment; five were non-compliant with the existing regimen and advice on compliance was given; 28 refused to start a statin despite medical advice; six had been unable to tolerate statin side-effects; and in seven, statin therapy was contra-indicated because of impaired liver function. In the remaining 244 cases, TI was present, giving a prevalence rate of 66.1%.
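The quoted prevalence follows directly from the counts above:

```python
sampled = 369
escalated = 47                            # treatment intensified
justified = 19 + 13 + 5 + 28 + 6 + 7      # 78 cases with a valid justification
ti_cases = sampled - escalated - justified
prevalence = ti_cases / sampled
print(ti_cases, round(prevalence * 100, 1))   # 244 cases, 66.1%
```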
     
    Table 3 shows the characteristics of physicians in TI-positive and TI-negative patients. The duration of clinical practice of attending doctors was significantly longer in the TI group compared with the non-TI group (P=0.001), with doctors working for over 20 years having a particularly higher rate of TI (82.4%). Doctors without any FM training also had a higher rate of TI (77.7%; P=0.006).
     

    Table 3. Comparison of the prevalence of TI according to profile of attending doctors
     
    Table 4 summarises the characteristics of T2DM patients in TI-positive and TI-negative groups. Patients in the TI-positive group had a longer duration of diabetes (9.8 ± 8.1 years in TI-positive group vs 7.8 ± 7.4 years in TI-negative group; P=0.024) and lower total cholesterol level and LDL level (both P<0.001). The co-existence of CVD (IHD, CVA, PVD) was more common in the TI-positive group (P=0.003). Other characteristics including patient gender, age, BMI, smoking status, blood pressure, HbA1c level, and type and dose of current statin use were comparable for both groups (all P>0.05).
     

    Table 4. Patient profile in the presence or absence of TI
     
    Based on the results from Tables 3 and 4, multivariate stepwise logistic regression analysis was performed to identify any factors that contributed to TI (Table 5). Only variables that were significantly different in the univariate analysis were included in the regression model. As the FM training status varied significantly with the duration of clinical practice (Table 2b, P<0.001) and these two factors were interrelated, only one of these two variables was included in the logistic regression analysis. As the P value of FM training status (P=0.006) was smaller than that for years of clinical practice (P=0.007) in the univariate analysis (Table 3), FM training status was entered into the logistic regression analysis. Lack of FM training was positively associated with TI (odds ratio [OR]=2.170; P=0.008), whereas patient’s LDL level was inversely associated (OR=0.320; P=0.001).
     
    Discussion
    This was the first clinical analysis of TI in lipid management among T2DM patients managed locally in the primary care setting. It has provided important background information about the prevalence of TI in this group of patients. It also explored possible underlying factors from both the doctor’s and patient’s perspective.
     
    Our study found that lipid control among T2DM patients was far from satisfactory, with 49.1% suboptimally controlled. This is consistent with reports that a high proportion of patients with hyperlipidaemia do not achieve their LDL goal.27 28 It is important to note that TI was present in 66.1% of these cases, meaning that in over 60% of diabetic patients with dyslipidaemia, appropriate management including dietary advice or drug treatment was not provided. This relatively high TI rate should alert primary care physicians to the importance of lipid control among T2DM patients as greater TI leads to poorer clinical outcomes. A similar study carried out by Whitford et al29 has shown that TI was present in 80% of consultations when lipid control was addressed among diabetic patients managed in the primary care setting in Middle East countries. This rate was much higher than the TI in blood pressure control (68%) and glycaemic control (29%). A similar study of lipid management in high-risk patients at a large academic primary care practice in the US has shown that statin dose was augmented at only 16% of over 2000 patient visits where the patient was suboptimally controlled.30 Among the sampled 369 poorly controlled T2DM patients in this study, only 27.4% (n=101) were treated with simvastatin, which is the only statin available in Hong Kong GOPCs. In addition, most (74.3%, 75/101) were treated with a lower dose (5-10 mg daily) that is considered inadequate according to ATP-IV guidelines in which a moderate dose of statin, such as simvastatin 20-40 mg daily, is recommended for T2DM patients in order to achieve target LDL level.31 Thus, the low statin prescription rate and the inadequate dose of statin may together contribute to the suboptimal lipid control among T2DM patients in primary care.
     
    A possible explanation for the TI in dose augmentation of simvastatin is the potential drug-drug interaction with calcium channel blockers (CCB) such as amlodipine.32 The maximum recommended dose for simvastatin in conjunction with amlodipine use is 20 mg/day. Since 306 (82.9%) sampled diabetic patients were found to have concomitant HT and among them 122 (40%) were prescribed amlodipine for blood pressure control, doctors might have hesitated to increase the dose of simvastatin. In our study, 10 diabetic patients in the TI-positive group were prescribed amlodipine and simvastatin 20 mg daily. In this scenario, either changing simvastatin to an alternative statin such as atorvastatin or changing amlodipine to an alternative CCB such as nifedipine is recommended if lipid control remains suboptimal. Failing to switch to another statin or CCB when clinically indicated is also considered to be TI. A more proactive approach to prescribing different drug combinations is required in order to achieve the target LDL in a timely manner.
     
    Further studies of the physician profile relative to the presence of TI have revealed that doctors with longer duration of clinical practice have a higher rate of TI that is even more prominent in those with over 20 years’ clinical practice. These findings are contrary to an overseas study where more experienced doctors had a lower rate of TI33; nonetheless, this study was performed in a secondary care setting and involved cardiologists who managed hyperlipidaemia in patients with IHD. In our study, most doctors who had worked for over 20 years had no formal FM training (9 [75%] of 12 doctors; Table 2b). In addition, when training status was compared, doctors with no FM training had a higher rate of TI than those who had completed FM training (77.7% vs 60.0%; P=0.006). We postulate that doctors who have worked for over 20 years may be less familiar with the latest guidelines on lipid management, possibly due to a lack of FM or related training. If physicians lack appropriate training, there will be gaps in their knowledge of latest clinical management guidelines. This has been confirmed by review articles which showed that TI could be attributed to insufficient knowledge of guidelines.34
     
    When patients’ profiles were compared, surprisingly, TI was present in 51 of 62 diabetic patients with overt CVD; only 11 cases were properly managed (Table 4). This is a considerable concern since lipid control is particularly important as a secondary prevention strategy in this group of patients, for whom the LDL target is much more stringent at <1.8 mmol/L and more difficult to achieve clinically. Some doctors may not have been aware of this stricter target and may have been satisfied with an LDL level of 1.8 to 2.6 mmol/L. This is supported by our finding that among diabetic patients with overt CVD whose lipid profile was inadequately controlled (n=62), more than half (n=33, 53.2%) had LDL controlled at 1.8 to 2.6 mmol/L. Physicians should take a more proactive approach particularly in this high-risk group of patients and adhere closely to the prevailing management guidelines in CVD risk factor control.
     
    Multivariable logistic regression analysis revealed that a lower LDL level, ie one closer to target, was associated with TI (Table 5). This could be explained by the threshold effect: the closer the LDL level is to target, the less likely the doctor is to intensify treatment. This threshold effect has been commonly observed in other similar studies.30 35 Other factors that contribute to the threshold effect could be ‘overestimation of current care’ or ‘complacency with borderline values’, leading to the physician’s subjective misperception that the care provided is sufficient.34
     

    Table 5. Logistic regression analysis of factors contributing to the presence of therapeutic inertia
     
    Implications for primary care
    Our study found that TI was common in lipid management among diabetic patients managed in the GOPCs of KCC, with a prevalence of 66.1%. Doctors with a longer duration of clinical practice and who had not received formal FM training had a higher rate of TI. Patients with a closer-to-target LDL were more common in the TI group. Considering that a large volume of diabetic patients are managed in the primary care setting, comprehensive strategies with a more proactive approach should be devised to combat TI so that the cardiovascular outcome of diabetic patients can be improved.
     
    Strengths and limitations of the study
    This is the first clinical analysis of TI in lipid management among diabetic patients managed locally in the primary care setting. It has provided important background information about the prevalence of TI in lipid management among diabetic patients and explored the possible underlying factors from both the doctor’s and patient’s perspective. These findings will help improve strategies to overcome TI in lipid control for these patients.
     
    There are some limitations in this study. First, the study was carried out in a single cluster of the HA and selection bias might therefore exist. These results from the public primary health care sector might not be applicable to the private sector or secondary care. In addition, the number of doctors with and without FM training was quite discrepant in this study (43 vs 13), which may affect the generalisability of the findings. Nevertheless, the present results may lay the groundwork for similar studies in the future, both locally and internationally. Second, patients with diabetes who had not had any blood testing performed during the study period were excluded (n=2298, 10.5% of all diabetic cases). The lipid control status of this group of diabetic patients remained unknown, which might have biased the measurement of TI among our target population. Third, only TI in LDL management was explored in this study. Management of hypertriglyceridaemia was not addressed in view of its less-important role as a risk factor for CVD. Future studies exploring TI in hypertriglyceridaemia management are needed to comprehensively assess lipid control among diabetic patients. Lastly, this study relied heavily on review of consultation notes to identify justification for submaximal therapy and determine the presence of TI. Insufficient documentation of justification for a certain treatment may have resulted in an overestimation of the prevalence of TI.
     
    Conclusions
    This study found that TI was common in the lipid management of diabetic patients managed in GOPCs of KCC, with a prevalence rate of 66.1%. Doctors without FM training and a closer-to-target LDL level among T2DM patients were associated with the presence of TI. Comprehensive strategies should be devised to overcome TI so that the cardiovascular outcome of diabetic patients can be improved.
     
    Acknowledgements
    We are indebted to Ms Katherine Chan, statistical officer of Queen Elizabeth Hospital, for her expert statistical support in the data analysis.
     
    Appendix
    Additional material related to this article can be found on the HKMJ website. Please go to <http://www.hkmj.org>, and search for the article.
     
    Declaration
    All authors have disclosed no conflicts of interest.
     
    References
    1. Chan JC, Malik V, Jia W, et al. Diabetes in Asia: epidemiology, risk factors, and pathophysiology. JAMA 2009;301:2129-40. Crossref
    2. Leung GM, Lam KS. Diabetic complications and their implications on health care in Asia. Hong Kong Med J 2000;6:61-8.
    3. Colhoun HM, Betteridge DJ, Durrington PN, et al. Primary prevention of cardiovascular disease with atorvastatin in type 2 diabetes in the Collaborative Atorvastatin Diabetes Study (CARDS): multicentre randomised placebo-controlled trial. Lancet 2004;364:685-96. Crossref
    4. Collins R, Armitage J, Parish S, Sleigh P, Peto R; Heart Protection Study Collaborative Group. MRC/BHF Heart Protection Study of cholesterol-lowering with simvastatin in 5963 people with diabetes: a randomised placebo-controlled trial. Lancet 2003;361:2005-16. Crossref
    5. Gaede P, Vedel P, Larsen N, Jensen GV, Parving HH, Pedersen O. Multifactorial intervention and cardiovascular disease in patients with type 2 diabetes. N Engl J Med 2003;348:383-93. Crossref
    6. Waters DD, Ku I. Early statin therapy in acute coronary syndromes: the successful cycle of evidence, guidelines, and implementation. J Am Coll Cardiol 2009;54:1434-7. Crossref
    7. Ward S, Lloyd Jones M, Pandor A, et al. A systematic review and economic evaluation of statins for the prevention of coronary events. Health Technol Assess 2007;11:1-160,iii-iv. Crossref
    8. O’Regan C, Wu P, Arora P, Perri D, Mills EJ. Statin therapy in stroke prevention: a meta-analysis involving 121,000 patients. Am J Med 2008;121:24-33. Crossref
    9. Vrecer M, Turk S, Drinovec J, Mrhar A. Use of statins in primary and secondary prevention of coronary heart disease and ischemic stroke. Meta-analysis of randomized trials. Int J Clin Pharmacol Ther 2003;41:567-77. Crossref
    10. Law MR, Wald NJ, Rudnicka AR. Quantifying effect of statins on low density lipoprotein cholesterol, ischaemic heart disease, and stroke: systematic review and meta-analysis. BMJ 2003;326:1423. Crossref
    11. Fung CS, Chin WY, Dai DS, et al. Evaluation of the quality of care of a multi-disciplinary risk factor assessment and management programme (RAMP) for diabetic patients. BMC Fam Pract 2012;13:116. Crossref
    12. Kung K, Chow KM, Hui EM, et al. Prevalence of complications among Chinese diabetic patients in urban primary care clinics: a cross-sectional study. BMC Fam Pract 2014;15:8. Crossref
    13. Kotseva K, Wood D, De Backer G, De Bacquer D, Pyörälä K, Keil U; EUROASPIRE Study Group. Cardiovascular prevention guidelines in daily practice: a comparison of EUROASPIRE I, II, and III surveys in eight European countries. Lancet 2009;373:929-40. Crossref
    14. Toth PP, Zarotsky V, Sullivan JM, Laitinen D. Dyslipidemia treatment of patients with diabetes mellitus in a US managed care plan: a retrospective database analysis. Cardiovasc Diabetol 2009;8:26. Crossref
    15. Ferrières J, Gousse ET, Fabry C, Hermans MP; French CEPHEUS Investigators. Assessment of lipid-lowering treatment in France—the CEPHEUS study. Arch Cardiovasc Dis 2008;101:557-63. Crossref
    16. Kotseva K, Stagmo M, De Bacquer D, De Backer G, Wood D; EUROASPIRE II Study Group. Treatment potential for cholesterol management in patients with coronary heart disease in 15 European countries: findings from the EUROASPIRE II survey. Atherosclerosis 2008;197:710-7. Crossref
    17. Santos RD, Waters DD, Tarasenko L, et al. Low- and high-density lipoprotein cholesterol goal attainment in dyslipidemic women: The Lipid Treatment Assessment Project (L-TAP) 2. Am Heart J 2009;158:860-6. Crossref
    18. Kim HS, Wu Y, Lin SJ, et al. Current status of cholesterol goal attainment after statin therapy among patients with hypercholesterolemia in Asian countries and region: the Return on Expenditure Achieved for Lipid Therapy in Asia (REALITY-Asia) study. Curr Med Res Opin 2008;24:1951-63. Crossref
    19. Phillips LS, Branch WT, Cook CB, et al. Clinical inertia. Ann Intern Med 2001;135:825-34. Crossref
    20. Okonofua EC, Simpson KN, Jesri A, Rehman SU, Durkalski VL, Egan BM. Therapeutic inertia is an impediment to achieving the Healthy People 2010 blood pressure control goals. Hypertension 2006;47:345-51. Crossref
    21. Andrade SE, Gurwitz JH, Field TS, et al. Hypertension management: the care gap between clinical guidelines and clinical practice. Am J Manag Care 2004;10:481-6.
    22. O’Connor PJ, Sperl-Hillen JM, Johnson PE, Rush WA, Biltz G. Clinical inertia and outpatient medical errors. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in patient safety: from research to implementation. Vol 2: Concepts and methodology. Rockville (MD): Agency for Healthcare Research and Quality (US); 2005.
    23. American Diabetes Association. Standards of medical care in diabetes—2013. Diabetes Care 2013;36 Suppl 1:S11-66. Crossref
    24. National Cholesterol Education Program (NCEP) Expert Panel on Detection, Evaluation, and Treatment of High Blood Cholesterol in Adults (Adult Treatment Panel III). Third Report of the National Cholesterol Education Program (NCEP) Expert Panel on Detection, Evaluation, and Treatment of High Blood Cholesterol in Adults (Adult Treatment Panel III) final report. Circulation 2002;106:3143-421.
    25. Centers for Disease Control and Prevention (CDC). Smoking-attributable mortality, years of potential life lost, and productivity losses—United States, 2000-2004. MMWR Morb Mortal Wkly Rep 2008;57:1226-8.
    26. Levey AS, Bosch JP, Lewis JB, Greene T, Rogers N, Roth D. A more accurate method to estimate glomerular filtration rate from serum creatinine: a new prediction equation. Modification of Diet in Renal Disease Study Group. Ann Intern Med 1999;130:461-70. Crossref
    27. Park JE, Chiang CE, Munawar M, et al. Lipid-lowering treatment in hypercholesterolaemic patients: the CEPHEUS Pan-Asian survey. Eur J Prev Cardiol 2012;19:781-94. Crossref
    28. Khoo CM, Tan ML, Wu Y, et al. Prevalence and control of hypercholesterolaemia as defined by NCEP-ATPIII guidelines and predictors of LDL-C goal attainment in a multi-ethnic Asian population. Ann Acad Med Singapore 2013;42:379-87.
    29. Whitford DL, Al-Anjawi HA, Al-Baharna MM. Impact of clinical inertia on cardiovascular risk factors in patients with diabetes. Prim Care Diabetes 2014;8:133-8. Crossref
    30. Goldberg KC, Melnyk SD, Simel DL. Overcoming inertia: improvement in achieving target low-density lipoprotein cholesterol. Am J Manag Care 2007;13:530-4.
    31. Stone NJ, Robinson J, Lichtenstein AH, et al. 2013 ACC/AHA guideline on the treatment of blood cholesterol to reduce atherosclerotic cardiovascular risk in adults. J Am Coll Cardiol 2013. Available from: https://www.joslin.org/docs/2013-ACC-AHA-Guideline-Treatment-of-Blood-Cholestero-_to-Reduce-Atherosclerotic-Cardiovascular-Risk-in-Adults.pdf. Accessed Mar 2016.
    32. Simvastatin summary of product characteristics. Available from: http://www.medicines.org.uk/emc/medicine/1201/SPC. Accessed Mar 2016.
    33. Aujoulat I, Jacquemin P, Rietzschel E, et al. Factors associated with clinical inertia: an integrative review. Adv Med Educ Pract 2014;5:141-7. Crossref
    34. Byrnes PD. Why haven’t I changed that? Therapeutic inertia in general practice. Aust Fam Physician 2011;40:24-8.
    35. Berlowitz DR, Ash AS, Glickman M, et al. Developing a quality measure for clinical inertia in diabetes care. Health Serv Res 2005;40:1836-53. Crossref

    Prevalence of infections among residents of Residential Care Homes for the Elderly in Hong Kong

    Hong Kong Med J 2016 Aug;22(4):347–55 | Epub 6 Jul 2016
    DOI: 10.12809/hkmj164865
    © Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
     
    ORIGINAL ARTICLE  CME
    Prevalence of infections among residents of Residential Care Homes for the Elderly in Hong Kong
    Carmen SM Choy, MB, BS1; H Chen, MB, BS, FHKAM (Community Medicine)2; Carol SW Yau, MB, BS, FHKAM (Community Medicine)3; Enoch K Hsu, BSc, MSc2; NY Chik, BNurs2; Andrew TY Wong, MB, BS, FHKAM (Medicine)2
    1 Accident and Emergency Department, Queen Elizabeth Hospital, Jordan, Hong Kong
    2 Infection Control Branch, Centre for Health Protection, Hong Kong
    3 Surveillance and Epidemiology Branch, Centre for Health Protection, Hong Kong
     
    Corresponding author: Dr Carmen SM Choy (carmencly@yahoo.com.hk)
     
    Abstract
    Introduction: A point prevalence study was conducted to study the epidemiology of common infections among residents in Residential Care Homes for the Elderly in Hong Kong and their associated factors.
     
    Methods: Residential Care Homes for the Elderly in Hong Kong were selected by stratified single-stage cluster random sampling. All residents aged 65 years or above from the recruited homes were surveyed. Infections were identified using standardised definitions. Demographic and health information—including medical history, immunisation record, antibiotic use, and activities of daily living (as measured by Barthel Index)—was collected by a survey team to determine any associated factors.
     
    Results: Data were collected from 3857 residents in 46 Residential Care Homes for the Elderly from February to May 2014. A total of 105 residents had at least one type of infection based on the survey definition. The overall prevalence of all infections was 2.7% (95% confidence interval, 2.2%-3.4%). The three most common infections were of the respiratory tract (1.3%; 95% confidence interval, 0.9%-1.9%), skin and soft tissue (0.7%; 95% confidence interval, 0.5%-1.0%), and urinary tract (0.5%; 95% confidence interval, 0.3%-0.9%). Total dependence in activities of daily living, as indicated by low Barthel Index score of 0 to 20 (odds ratio=3.0; 95% confidence interval, 1.4-6.2), and presence of a wound or stoma (odds ratio=2.7; 95% confidence interval, 1.4-4.9) were significantly associated with presence of infection.
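As a cross-check on the headline figure, a simple unadjusted Wald interval for 105/3857 comes close to the reported 2.2%-3.4%; the published interval is slightly wider, presumably because it accounts for the cluster sampling design. An illustrative sketch:

```python
import math

def wald_ci(events: int, n: int, z: float = 1.96):
    """Unadjusted Wald 95% confidence interval for a proportion."""
    p = events / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

low, high = wald_ci(105, 3857)
print(f"{low:.1%} to {high:.1%}")   # about 2.2% to 3.2%, unadjusted
```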
     
    Conclusions: This survey provides information about infections among residents in Residential Care Homes for the Elderly in the territory. Local data enable us to understand the burden of infections and formulate targeted measures for prevention.
     
    New knowledge added by this study
    • Characteristics of local Residential Care Homes for the Elderly (RCHE) residents were explored. Most individuals had medical co-morbidities and required assistance with activities of daily living (ADL); use of an indwelling medical device was also common.
    • Local prevalence of infections among residents in RCHE was 2.7% and the most common infection was of the respiratory tract.
    • Total dependence in ADL and presence of a wound or stoma were associated with presence of infections among residents in RCHE in Hong Kong.
    Implications for clinical practice or policy
    • Measures that focus on prevention of respiratory tract infection among the elderly should be emphasised and an infection control programme should be designed to enhance such practice in RCHE.
    • Infection control protocols can be developed according to specific areas of nursing care, for example, wound care or catheter care.
     
     
    Introduction
    Ageing is a worldwide phenomenon, and Hong Kong, without exception, is encountering the same population change. In 2014, the proportion of our elderly population aged 65 years or above was 15%. This proportion is expected to double over the next 20 years, to 28% in 2034.1
     
    In Hong Kong, Residential Care Homes for the Elderly (RCHEs) provide different levels of care for the elderly who—for personal, social, health, or other reasons—can no longer live alone or with their family. These RCHEs can be broadly categorised as private homes (PH) and non-private homes (NPH); the former are run by private entrepreneurs and vary in size and capacity, while the latter are run by non-governmental organisations and include subvented, self-financing, and contract care and attention homes that provide subsidised care for the elderly. In 2015, there were around 740 RCHEs providing over 73 000 residential placements across the territory and residential care for approximately 7% of the elderly in Hong Kong.1 2 These numbers are expected to increase further.
     
    Generally speaking, RCHEs are very heterogeneous in terms of size, facilities, manpower, and level of care. Some residents are encouraged to participate in various types of group activities while some residents may require assistance in daily activities. These activities, together with the confined and shared living environment, may promote the transmission of infectious diseases.
     
    Infections are an important cause of morbidity and mortality among the elderly, and place a significant burden on our health care system. Residents of RCHEs are usually frail; compared with their community-dwelling counterparts, they are more susceptible to infections and related complications. Overseas studies conducted in a hospital setting have shown that the mortality rate of community-acquired pneumonia is 30%, while that in nursing homes is substantially higher, with a reported rate of up to 57%.3 Infections may be a cause as well as the consequence of functional impairment among RCHE residents, leading to a reduction in their quality of life.4
     
    Local studies of infectious diseases in the RCHE setting are scarce. The last such survey was conducted in 2006.5 Continuous or regular surveillance serves to reveal the disease burden to increase awareness of infections and to identify critical areas for infection control. It is important that we understand the local epidemiology and burden of infections among RCHE residents and apply measures to control these infections and safeguard the health of this vulnerable group.
     
    Methods
    We conducted a point prevalence survey of common infections among residents of RCHEs in Hong Kong.
     
    Population and setting
    All RCHEs in Hong Kong were included. All residents aged 65 years or above who were present at 9 am (the reference time) on the survey day were included. Residents were excluded from the survey if: (1) he/she was not present at the reference time (owing to medical appointment, admission to hospital, or home leave from the RCHE); or (2) he/she attended the RCHE as a day patient/resident.
     
    Sampling strategy
    A list of all RCHEs in Hong Kong was retrieved from the Social Welfare Department website.6 All RCHEs on the list were stratified according to the main geographical region of Hong Kong (Kowloon, Hong Kong Island, and New Territories) and type of RCHE (PH and NPH) into six strata. Stratified single-stage cluster random sampling was performed using the captioned list as the sampling frame. All residents were surveyed in each recruited home.
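The design described above (six strata formed from three regions by two home types, then whole clusters drawn within each stratum, with every resident of a drawn home surveyed) can be sketched as follows. Stratum labels and counts here are illustrative, not the study's sampling frame:

```python
import random
from collections import defaultdict

def stratified_cluster_sample(homes, n_per_stratum, seed=2014):
    """Stratified single-stage cluster sampling.

    Homes (clusters) are grouped by (region, type) stratum; whole homes
    are drawn at random from each stratum, and every eligible resident
    in a drawn home is then surveyed.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for home in homes:
        strata[(home["region"], home["type"])].append(home)
    selected = []
    for _, cluster_list in sorted(strata.items()):
        k = min(n_per_stratum, len(cluster_list))
        selected.extend(rng.sample(cluster_list, k))
    return selected
```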
     
    Data collection
    The survey was conducted from February to May 2014. A survey team comprising doctors, nurses, and research staff visited the RCHEs for data collection on any one day during the survey period. A standardised survey form was developed based on a previous similar prevalence survey.5 This survey form comprised four parts: (1) socio-demographic information about the resident; (2) health information including medical history, vaccination history, and antibiotic use; (3) measurement of activities of daily living (ADL) using the Barthel Index (BI)7; and (4) a checklist of acute symptoms of common infections. Symptoms of acute infections were elicited by doctors through interviews with the residents or their main carers, with the help of RCHE staff. All health-related information was verified by doctors or nurses; functional status of residents was assessed by trained nurses using the BI.
     
    A pilot study was conducted in two RCHEs in February 2014 to field test the data collection tools. Inter-rater reliability of the BI was assessed during this period for the six trained nurses. The estimated Cohen’s kappa for the BI ranged from 0.8 to 1, suggesting good inter-rater reliability.
     
    The survey was conducted in an anonymous manner. Written consent was obtained from the RCHEs. Verbal consent was obtained from residents and/or their relatives. If any residents (or their relatives) refused to participate in the survey, their information was not retrieved. If relevant data were not available on the survey day, data would be retrieved within 1 week. The study was approved by the Ethics Committee of the Department of Health.
     
    Outcome measures
    Infection was defined using any one of the following criteria: (1) presence of symptoms and/or signs of infection that developed in the 24 hours preceding the survey day, that fulfilled the surveillance criteria of the Canadian Consensus Conference8; (2) infection diagnosed by a locally registered physician (eg visiting medical officers, general practitioners); or (3) consumption of antimicrobial agents on the survey day for a specific infection.
     
    Sample size and power estimation
    Sample size was estimated to determine the prevalence of infections among residents of RCHEs in Hong Kong. Based on a previous prevalence survey in Hong Kong,5 the prevalence of infections was 5.7%, and the design effect (DEFF) was 1.765 with an intraclass correlation coefficient (ICC) of 0.025 and an average cluster size of 31.61. Assuming a relative margin of error of 0.2 and a conservative DEFF of 2, the calculated sample size was 3178.
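    The calculation above can be reproduced as follows (a minimal sketch, not the authors' code; it assumes the stated margin of error of 0.2 is a relative margin, ie 20% of the expected prevalence, and uses the standard relation DEFF = 1 + [average cluster size − 1] × ICC, which reproduces the reported DEFF of 1.765):

```python
import math

# Assumption: "margin of error of 0.2" is read as a relative margin,
# ie the absolute precision is 20% of the expected prevalence.
p = 0.057          # expected prevalence, from the 2006 survey
z = 1.96           # critical value for a 95% confidence level
d = 0.2 * p        # absolute precision implied by a 0.2 relative margin
deff = 2           # conservative design effect used for the calculation

# Design effect implied by the 2006 survey's clustering parameters:
# DEFF = 1 + (m - 1) * ICC, with average cluster size m = 31.61, ICC = 0.025
deff_2006 = 1 + (31.61 - 1) * 0.025   # = 1.765, as reported

n_srs = z**2 * p * (1 - p) / d**2     # sample size under simple random sampling
n = math.ceil(n_srs * deff)           # inflated for cluster sampling -> 3178
print(round(deff_2006, 3), n)
```

    Under these assumptions the calculation yields the reported figures exactly (DEFF 1.765 and n=3178), which supports the relative-margin reading.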
     
    Statistical analyses
    R (version 3.0.2) was used for statistical analysis. The “survey” package (version 3.29-5) in R was used to calculate the prevalence of infections adjusted for cluster sampling. Prevalence rates of infections and other study variables were calculated using the “svyciprop” function from the “survey” package. Logistic regression with adjustment for cluster sampling was performed using the “svyglm” function from the “survey” package to identify risk factors for infection. Variables were included in multivariate analysis if: (1) the P value was <0.25 in univariate analysis; or (2) the variable had been considered a risk factor for infection in previous studies, such as mobility status, use of medical devices, presence of a wound, home size, gender, and receipt of Comprehensive Social Security Assistance (as a surrogate measure of socioeconomic status).9 10 11 12 In addition, subgroup analyses were performed to explore the association of specific risk factors with different types of infection, such as the presence of chronic obstructive pulmonary disease (COPD) with respiratory tract infection (RTI) and diabetes mellitus (DM) with skin and soft tissue infection (SSTI). To adjust for multiple comparisons, P values calculated for these site-specific associations were adjusted using the Bonferroni correction. A P value of <0.05 was considered statistically significant.
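    The Bonferroni step described above simply multiplies each raw P value by the number of comparisons and caps the result at 1. A minimal sketch (the input P values below are hypothetical, for illustration only):

```python
def bonferroni(p_values):
    """Bonferroni correction: scale each raw P value by the number of
    comparisons performed, capping the adjusted value at 1."""
    k = len(p_values)
    return [min(1.0, p * k) for p in p_values]

# Hypothetical raw P values from three site-specific risk factor tests
adjusted = bonferroni([0.01, 0.03, 0.25])
print(adjusted)
```

    With only a handful of site-specific comparisons, this conservative adjustment keeps the family-wise error rate at the nominal 5% level at the cost of some statistical power.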
     
    Results
    A total of 100 RCHEs were invited. The overall response rate was 46% (n=46). Table 1 illustrates the number of recruited RCHEs stratified by region and home type. A higher response rate was noted in NPHs (70.6%, n=12) than PHs (42.2%, n=35). The ICC and DEFF calculated from the sample collected were 0.0035 and 1.45, respectively. The mean home capacity of the participating RCHEs was 111 residents per home (95% confidence interval [CI], 88-133 residents per home) with a median occupancy of 89%. Among the staff in participating RCHEs, 45.9% were personal care workers, 14.7% were health care workers, and 11% were nurses (including registered nurses and enrolled nurses). There was no significant difference in terms of home capacity, occupancy, or staffing level between participating and non-participating RCHEs based on data from the annual assessment of all RCHEs conducted by Elderly Health Service, Department of Health.
     

    Table 1. Recruitment of Residential Care Homes for the Elderly (RCHEs)
     
    Demographics and underlying co-morbidity of residents
    Among the 4127 residents in the participating RCHEs, 261 (6.3%) were excluded from the survey as they were unavailable owing to hospitalisation, medical appointment, home leave, or other personal reasons. All 3866 residents who were present at the participating RCHEs at 9 am on the survey day were invited and joined the survey. Nine (0.2%) residents were not included in the analysis as their RCHEs subsequently failed to provide the relevant information. Among the 3857 residents surveyed, the mean age was 85.2 years and the female-to-male ratio was 1.9:1. Most residents were Chinese (99.8%, n=3849), and 56.5% (n=2178) of those surveyed were older than 84 years. The mean age of male and female residents was 82.4 and 86.8 years, respectively. Table 2 shows the demographic information of the surveyed sample. Duration of stay in RCHEs varied: a quarter had resided in an RCHE for less than 1 year, 29.4% for 1 to 3 years, 24.7% for 3 to 6 years, and 20.9% for more than 6 years. Regarding ADL, the median BI score was 30, and 46.7% of residents scored 0-20, indicating total dependence in ADL.13 Regarding use of an indwelling medical device, 14.4% of residents required at least one device, mostly a nasogastric tube (9.2%) or urinary catheter (5.3%). Overall, 75.8% of residents had received the 2013/2014 seasonal influenza vaccine and 50.2% the pneumococcal vaccine. Most residents (87.1%) had more than one underlying co-morbidity, with the most common diagnosis being hypertension (69.3%), followed by dementia (37.0%) and stroke (35.0%) [Table 3].
     

    Table 2. Demographic information of surveyed sample at baseline
     

    Table 3. Common co-morbidities of the study population
     
    Prevalence of infections
    A total of 105 residents were diagnosed with at least one infection based on the survey definition. Among these residents, 102 had one type of infection and three had two types of infection. The overall prevalence of infections was 2.7% (95% CI, 2.2%-3.4%). Table 4 shows the prevalence of different infections.
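    As a rough cross-check of the headline figure, the crude prevalence and a design-effect-adjusted Wald interval can be computed from the reported counts. This is a simplified sketch only: the paper used the logit-based svyciprop interval from the R survey package, so the published bounds differ slightly from the Wald approximation below.

```python
import math

n_infected, n_total, deff = 105, 3857, 1.45   # counts and DEFF reported in the survey

p = n_infected / n_total                      # crude prevalence
se = math.sqrt(p * (1 - p) / n_total * deff)  # standard error inflated for clustering
lo, hi = p - 1.96 * se, p + 1.96 * se         # 95% Wald interval
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # about 2.7% (2.1%-3.3%)
```

    The crude prevalence reproduces the reported 2.7%; the slightly narrower lower bound relative to the published 2.2%-3.4% reflects the difference between the Wald and logit-based interval constructions.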
     

    Table 4. Prevalence of different types of infections among residents of Residential Care Homes for the Elderly
     
    Of all the infections, RTI was the most common type, comprising 49.1% (n=53) of all infections, followed by SSTI (25.0%, n=27) and urinary tract infection (UTI) [17.6%, n=19; Fig].
     

    Figure. Distribution of different types of infections in Residential Care Homes for the Elderly (RCHEs)
     
    Factors associated with infectious diseases
    Table 5 illustrates factors associated with the presence of any infection and with specific infections. Residents with ADL dependency, as reflected by a low BI score of 0-20 (odds ratio [OR]=3.0; 95% CI, 1.4-6.2), presence of a wound or stoma (OR=2.7; 95% CI, 1.4-4.9), or co-morbidities including cardiovascular diseases (CVD) [OR=2.4; 95% CI, 1.4-4.0] and respiratory diseases (OR=2.6; 95% CI, 1.6-4.1) were significantly more likely to have an infection. Seasonal influenza vaccination (OR=0.82; P=0.452) and pneumococcal vaccination (OR=0.66; P=0.201) were associated with a lower risk of infection, but neither association reached statistical significance.
     

    Table 5. Association between presence of infections and RCHEs/resident characteristics
     
    Subgroup analysis by site of infection showed that low BI score (OR=2.6; 95% CI, 1.3-5.2) and COPD (OR=3.7; 95% CI, 1.5-9.1) were significantly associated with RTI. Factors significantly associated with SSTI included low BI score (OR=5.5; 95% CI, 1.7-17.5), presence of wound(s) and stoma (OR=9.0; 95% CI, 4.7-17.1), having DM (OR=1.9; 95% CI, 1.0-3.6), mental illness (OR=3.7; 95% CI, 1.2-11.8), and CVD (OR=4.6; 95% CI, 1.3-16.3). Presence of a urinary catheter was significantly associated with UTI (OR=5.6; 95% CI, 1.9-16.2).
     
    Discussion
    This point prevalence survey aimed to investigate the prevalence of infections among residents of RCHEs in Hong Kong. It is essential to understand that this group of elderly people differs significantly from their community-dwelling counterparts in terms of health condition, level of mobility, daily routine behaviour, and level of care received. The confined living environment, shared bathing equipment, group dining facilities, and close human-to-human contact potentially foster the transmission of infection. A local study has shown that nursing home residency is an independent predictor of infection-related mortality, pneumonia-related mortality, and all-cause mortality.14
     
    In this study, the overall prevalence of infections was 2.7%. Among all infections, RTI, SSTI, and UTI ranked top with a prevalence of 1.3%, 0.7%, and 0.5%, respectively. Low BI score of 0-20, presence of a wound or stoma, and co-morbidities including CVD and respiratory diseases were significantly associated with presence of infection.
     
    Compared with the previous prevalence survey of infections among residents of RCHEs in Hong Kong, conducted in November 2006,5 a lower overall prevalence was noted in this survey; in 2006 the prevalence was 5.7%.5 A similar pattern by type of infection was observed: common cold or pharyngitis (included under RTI in our study) was the most common type of infection, followed by SSTI and UTI. This reduction in overall prevalence over the intervening years may be due to better awareness of infection control among the general public and health care workers, particularly after the severe acute respiratory syndrome epidemic in 2003 and the H1N1 influenza pandemic in 2009. Another encouraging finding of this study may also account for the improved trend: uptake of the seasonal influenza vaccine among surveyed residents increased from 60.3% in 2012-2013 to 75.8% in 2013-2014.
     
    Prevalence surveys conducted in long-term care facilities (LTCF) overseas have generally reported a higher overall prevalence of infection, from 3.4% to 11.8%.15 16 17 18 19 20 21 22 23 24 Most reported UTI as the most common type of infection.15 16 19 20 21 22 23 Despite the lower prevalence in our survey compared with overseas surveys, the results must be interpreted with caution for a few reasons. First, differences in survey method, study population, and case definition among these studies may render direct comparison of prevalence inappropriate. Second, it is important to understand the differences between the settings and elderly populations of LTCF in Hong Kong and those overseas. In the US, LTCF can be further categorised into veteran care centres that provide care for elderly military officers, and nursing homes and residential care communities that offer different levels of assistance in ADL depending on the elderly individual’s capacity for self-care.25 In Europe, by contrast, more than two thirds of those receiving institutional care are above 80 years of age.26 Third, staffing levels, occupancy,25 local infection control practices and guidelines, and accessibility to health care facilities, such as an emergency room or secondary health care facilities, in overseas LTCF differ significantly from our local setting. These factors may explain the difference in prevalence between local RCHEs and overseas LTCF.
     
    This study also investigated the risk factors associated with the presence of infection among residents. Low BI score of 0-20 representing total dependency in ADL, and presence of a wound or stoma were associated with presence of any type of infection. The findings are consistent with past studies that suggest limitations in ADL or functional impairment, and presence of skin ulcers are risk factors for infection.15 17 21 Nevertheless, the protective effect of immunisation with seasonal influenza vaccine and pneumococcal vaccine was not clearly demonstrated in this study.
     
    Regarding RTI, which essentially includes upper tract infections (eg common cold or influenza-like illness) and lower tract infections (eg pneumonia), COPD and a low BI score of 0-20 were the two associated factors. A few previous studies that focused on risk factors for pneumonia (or specifically nursing home–associated pneumonia) also suggested that a low BI score,27 low ADL score,27 profound debility (measured by a Karnofsky score of ≤40),28 and COPD5 28 are associated factors, which is compatible with our findings.
     
    Multiple factors were significantly associated with SSTI in our study, including low BI score, presence of wounds and stoma, and co-morbidities such as DM, mental illness, and CVD; however, few studies have examined risk factors for SSTI in LTCF. In Cotter et al’s study,16 presence of a urinary catheter, vascular catheter, pressure sores, or other wounds was significantly associated with SSTI. It is possible that individuals with DM or CVD are more prone to ulcer development or poor wound healing, and thus have a higher risk of SSTI. Further studies may be necessary to delineate the association between SSTI and other co-morbidities.
     
    Presence of an indwelling urinary catheter was, unsurprisingly, associated with UTI, a finding compatible with the previous local study5 and most overseas studies.16 29 30 This reflects the importance of proper care of indwelling urinary catheters in RCHEs.
     
    Our study provides more information regarding prevalence and risk factors associated with infectious diseases in RCHEs in Hong Kong. Readers, however, must take note of a few limitations of this study.
     
    First, a point prevalence study offers only a snapshot of events, and thus a causal relationship between risk factors and infections cannot be established. Our study was conducted from February to May, late winter to early spring in Hong Kong, and the prevalence of different infections, for example influenza, may show seasonal variation.31 32 Comparisons need to take account of the season during which the study was conducted.
     
    Second, only 46 of the 100 invited RCHEs participated in the survey. This response rate may affect the generalisability of results. It is possible that the RCHEs with stronger compliance with infection control measures volunteered to participate whilst those homes that refused were less compliant and had a higher infection prevalence.
     
    Third, the exclusion of residents who were not present at the RCHEs at the reference time may have led to underestimation of the prevalence of infection. We reviewed the list of residents excluded from the survey and found 18 of them had been admitted to hospital in the 2 days preceding the survey, of whom 10 were admitted because of symptoms or signs suggestive of infection. Assuming they all fulfilled the criteria for infection in this survey, the effect was likely minimal, with an adjusted prevalence of infections of 2.9% (95% CI, 2.3%-3.7%).
     
    Fourth, demographic data, medical history, and vaccination history were retrieved from records maintained by RCHEs, but record-keeping practices differed between RCHEs. Data may have been incomplete or inadequate in certain RCHEs while others may have provided more detailed data. These differences were minimised by a standard protocol, training of the survey team, and on-site verification of data with RCHE staff.
     
    Finally, we did not include any measures of infection control practice in our study, such as hand hygiene compliance of staff and environmental hygiene measures. While the aim of the study was not to assess the infection control practices of RCHEs, these factors could potentially affect the results of the risk factor analysis; hence, readers should interpret the regression results with the understanding that confounding may be present.
     
    Conclusions
    The overall prevalence of infections among RCHE residents was estimated to be 2.7%. Associated factors were identified. It is recommended that infection control measures be targeted towards these factors. Training for RCHE staff and a policy to execute infection control guidelines in RCHEs should be planned early in view of an increasing demand for services provided by RCHEs. Further study can be carried out at different times of the year to identify any seasonal changes and pattern of infections, or targeted at residents admitted to public hospitals with acute infections to estimate the overall burden on our health care sector.
     
    Acknowledgements
    The authors would like to thank the survey team for their hard work in study design and fieldwork. Furthermore, we extend our heartfelt gratitude to all the participating RCHEs and their staff for their assistance throughout the study. Without their support, this survey would not have been possible. The authors would like to thank the Elderly Health Service, Department of Health for sharing their data on annual assessment of RCHEs.
     
    Declaration
    All authors have disclosed no conflicts of interest.
     
    References
    1. Census and Statistics Department, Hong Kong SAR Government. Hong Kong population projections 2015-2064. September 2015. Available from: http://www.statistics.gov.hk/pub/B1120015062015XXXXB0100.pdf. Accessed 22 Dec 2015.
    2. Social Welfare Department, Hong Kong SAR Government. Provision of residential care services for elders (non-governmental organisations versus private sector). March 2015. Available from: http://www.swd.gov.hk/doc/elderly/ERCS/Overview%20item(b)English(31-3-2015).pdf. Accessed 22 Dec 2015.
    3. Gavazzi G, Krause KH. Ageing and infection. Lancet Infect Dis 2002;2:659-66. Crossref
    4. Büla CJ, Ghilardi G, Wietlisbach V, Petignat C, Francioli P. Infections and functional impairment in nursing home residents: a reciprocal relationship. J Am Geriatr Soc 2004;52:700-6. Crossref
    5. Chen H, Chiu AP, Lam PS, et al. Prevalence of infections in residential care homes for the elderly in Hong Kong. Hong Kong Med J 2008;14:444-50.
    6. Social Welfare Department, Hong Kong SAR Government. List, licences and briefs of residential care homes. Available from: http://www.swd.gov.hk/en/index/site_pubsvc/page_elderly/sub_residentia/id_listofresi/. Accessed 31 Dec 2013.
    7. Mahoney FI, Barthel DW. Functional evaluation: The Barthel Index. Md State Med J 1965;14:61-5.
    8. McGeer A, Campbell B, Emori TG, et al. Definitions of infection for surveillance in long-term care facilities. Am J Infect Control 1991;19:1-7. Crossref
    9. Magaziner J, Tenney JH, DeForge B, Hebel JR, Muncie HL Jr, Warren JW. Prevalence and characteristics of nursing home–acquired infections in the aged. J Am Geriatr Soc 1991;39:1071-8. Crossref
    10. Li J, Birkhead GS, Strogatz DS, Coles FB. Impact of institution size, staffing patterns, and infection control practices on communicable disease outbreaks in New York State nursing homes. Am J Epidemiol 1996;143:1042-9. Crossref
    11. Cohen S. Social status and susceptibility to respiratory infections. Ann N Y Acad Sci 1999;896:246-53. Crossref
    12. Harrington RD, Hooton TM. Urinary tract infection risk factors and gender. J Gend Specif Med 2000;3:27-34.
    13. McDowell I. Measuring health: a guide to rating scales and questionnaires. Oxford: Oxford University Press; 2006. Crossref
    14. Chan TC, Hung IF, Cheng VC, et al. Is nursing home residence an independent risk factor of mortality in Chinese older adults? J Am Geriatr Soc 2013;61:1430-2. Crossref
    15. Chami K, Gavazzi G, Carrat F, et al. Burden of infections among 44,869 elderly in nursing homes: a cross-sectional cluster nationwide survey. J Hosp Infect 2011;79:254-9. Crossref
    16. Cotter M, Donlon S, Roche F, Byrne H, Fitzpatrick F. Healthcare-associated infection in Irish long-term care facilities: results from the First National Prevalence Study. J Hosp Infect 2012;80:212-6. Crossref
    17. Eriksen HM, Iversen BG, Aavitsland P. Prevalence of nosocomial infections and use of antibiotics in long-term care facilities in Norway, 2002 and 2003. J Hosp Infect 2004;57:316-20. Crossref
    18. Moro ML, Mongardi M, Marchi M, Taroni F. Prevalence of long-term care acquired infections in nursing and residential homes in the Emilia-Romagna Region. Infection 2007;35:250-5. Crossref
    19. Lim CJ, McLellan SC, Cheng AC, et al. Surveillance of infection burden in residential aged care facilities. Med J Aust 2012;196:327-31. Crossref
    20. Tsan L, Langberg R, Davis C, et al. Nursing home–associated infections in Department of Veterans Affairs community living centers. Am J Infect Control 2010;38:461-6. Crossref
    21. Tsan L, Davis C, Langberg R, et al. Prevalence of nursing home–associated infections in the Department of Veterans Affairs nursing home care units. Am J Infect Control 2008;36:173-9. Crossref
    22. Marchi M, Grilli E, Mongardi M, Bedosti C, Nobilio L, Moro ML. Prevalence of infections in long-term care facilities: how to read it? Infection 2012;40:493-500. Crossref
    23. Dwyer LL, Harris-Kojetin LD, Valverde RH, et al. Infections in long-term care populations in the United States. J Am Geriatr Soc 2013;61:342-9. Crossref
    24. European Centre for Disease Prevention and Control. Point prevalence survey of healthcare-associated infections and antimicrobial use in European long-term care facilities. May-September 2010. Available from: http://www.ecdc.europa.eu/en/publications/_layouts/forms/Publication_DispForm.aspx?List=4f55ad51-4aed-4d32-b960-af70113dbb90&ID=1086. Accessed 31 Dec 2013.
    25. Harris-Kojetin L, Sengupta M, Park-Lee E, Valverde R. Long-term care services in the United States: 2013 overview. Vital Health Stat 3 2013;(37):1-107.
    26. Rodrigues R, Huber M, Lamura G, editors. Facts and figures on healthy ageing and long-term care. European Centre for Social Welfare Policy and Research; 2012. Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.303.1022&rep=rep1&type=pdf. Accessed 22 Jan 2016.
    27. Wójkowska-Mach J, Gryglewska B, Romaniszyn D, et al. Age and other risk factors of pneumonia among residents of Polish long-term care facilities. Int J Infect Dis 2013;17:e37-43. Crossref
    28. Muder RR. Pneumonia in residents of long-term care facilities: epidemiology, etiology, management, and prevention. Am J Med 1998;105:319-30. Crossref
    29. Michel JP, Lesourd B, Conne P, Richard D, Rapin CH. Prevalence of infections and their risk factors in geriatric institutions: a one-day multicentre survey. Bull World Health Organ 1991;69:35-41.
    30. Eriksen HM, Koch AM, Elstrøm P, Nilsen RM, Harthug S, Aavitsland P. Healthcare-associated infection among residents of long-term care facilities: a cohort and nested case-control study. J Hosp Infect 2007;65:334-40. Crossref
    31. Saha S, Chadha M, Al Mamun A, et al. Influenza seasonality and vaccination timing in tropical and subtropical areas of southern and south-eastern Asia. Bull World Health Organ 2014;92:318-30. Crossref
    32. Yang L, Wong CM, Lau EH, Chan KP, Ou CQ, Peiris JS. Synchrony of clinical and laboratory surveillance for influenza in Hong Kong. PLoS One 2008;3:e1399. Crossref

    Breast pain in lactating mothers

    Hong Kong Med J 2016 Aug;22(4):341–6 | Epub 17 Jun 2016
    DOI: 10.12809/hkmj154762
    © Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
     
    ORIGINAL ARTICLE
    Breast pain in lactating mothers
    Sophie SF Leung, FHKCPaed, FHKAM (Paediatrics)
    Department of Paediatrics, The Chinese University of Hong Kong, Shatin, Hong Kong (c/o: Room 1502, 15/F, Hong Kong Pacific Centre, 28 Hankow Road, Tsimshatsui, Hong Kong)
     
    Corresponding author: Dr Sophie SF Leung (dr.leung@ssfl.com.hk)
     
     Full paper in PDF
    Abstract
    Introduction: The number of new mothers who breastfeed has increased dramatically over the last three decades. There is a concern that current related medical services may be inadequate. Breast pain is the most common complaint among lactating mothers who seek medical help. This study aimed to investigate this problem.
     
    Methods: Medical records of women who presented with breast pain to a private clinic run by a doctor who was trained as an International Lactation Consultant were reviewed over a period of 6 months in 2015. Most patients were self-referred after chatting online. Assessment included characteristics and duration of pain, treatment prior to consultation, feeding practices, mother’s diet, and breast examination. Any site of blockage was identified and relieved. Those with persistent pain were given antibiotics. Women with signs of abscess, or an abscess that could not be drained, were referred to a breast surgeon.
     
    Results: A total of 69 patients were seen of whom 45 had been breastfeeding for more than 1 month. Pain was experienced for longer than 7 days in 22 women. Antifungal or antibacterial treatment had been unsuccessful in 31 women prior to consultation. The diagnoses were engorgement in five women, blocked duct in 35, mastitis in 13, breast abscess in six, poor positioning and latch in seven, nipple cracks in two, and skin infection in one. Oral antibiotics were prescribed to 21 patients and local antifungal treatment was given to one patient only.
     
    Conclusion: Blocked duct was the most common cause of breast pain in lactating mothers. Without prompt relief it is possible that it will progress to mastitis/breast abscess or the mother may discontinue breastfeeding. This may be a suitable time for Hong Kong to set up one or more public full-time breastfeeding clinics to provide a better service to lactating mothers and to facilitate professional training and research.
     
    New knowledge added by this study
    • Most breast pain in lactating mothers is not necessarily due to bacterial or fungal infection but due to duct blockage that can be relieved promptly by gentle breast massage and milk expression.
    • Local mothers had a specific dietary practice to encourage milk production that could sometimes be harmful.
    Implications for clinical practice or policy
    • To cope with the increased prevalence of breastfeeding, relevant clinical services should be established, including one or more full-time breastfeeding clinics in the public sector that mothers can attend without the need for medical referral. This will also help research, since local practices and clinical problems may differ from those described in the literature.
     
     
    Introduction
    Hong Kong has experienced a tremendous change in lifestyle, and the consequent clinical problems pose a challenge to the medical profession. A good example of this is infant feeding. Almost half a century ago, the prevalence of breastfeeding in Hong Kong was at its lowest, 5% in 1978, after a dramatic fall from 44% in 1967.1 Following the joint efforts of doctors, nurses, and mothers, the ever breastfeeding rate in Hong Kong has climbed rapidly from 20% in 1992 to 60% in 2002 and 86% in 2014.2 The efforts of both the UNICEF Baby-Friendly Hospital Initiative and the Department of Health should be applauded.
     
    Breast milk is the best for babies. Mothers should be encouraged to breastfeed fully for 6 months, followed by introduction of solid foods and continuation of breastfeeding for 2 years or more. Recent data have shown that only 27% of mothers can sustain breastfeeding for 4 to 6 months.2 There are areas where we, as medical professionals, can provide support. For historical reasons, however, not many local doctors and nurses have been trained to manage the clinical problems encountered by breastfeeding mothers. One such problem is breast pain.
     
    Breast pain, which may lead to cessation of breastfeeding, is the most common complaint of lactating mothers seen in a private general paediatric clinic run by a doctor (author) trained in 2000 as an International Lactation Consultant. This study aimed to analyse the reasons for breast pain and how it can be relieved.
     
    Methods
    Clinical records of lactating mothers who presented with breast pain over a 6-month period (January to June 2015) were retrieved. Patients were self-referred after chatting online with other breastfeeding mothers. During consultation, patients were asked about the history of pain, prior treatment, breastfeeding practices, and their own diet. Breast examination was then performed, including the nipple and areola, to identify any redness or tenderness. In particular, any blockage was identified.
     
    If redness or tenderness was generalised in either or both breasts, engorgement was diagnosed (Fig a). If it was confined to a segment, only a lobule was involved. If gentle massage and milk expression provided relief, a blocked duct was diagnosed (Fig b). The ability to express pus (Fig c) or an area of fluctuation or skin thinning (Fig d) was indicative of breast abscess. Mastitis was diagnosed in the presence of fever and tenderness/mass that could not be relieved but had not progressed to an abscess.3 Nipples were examined for cracks (Fig e). Feeding position and latch were checked when appropriate and corrected accordingly. When there was a white spot on the nipple, it was cleared by simple expression or by using a needle to open up the blockage. If the nipple and areola were shiny and reddish, with burning, stinging, and itchiness, fungal infection was diagnosed. In such cases, the baby’s mouth was also examined for the presence of oral thrush.
     

    Figure. (a) General redness of breast due to engorgement. (b) Local area of redness in blocked duct or mastitis. (c) Pus expressed from breast abscess. (d) Bluish area with sign of fluctuation. (e) Cracks in nipple (highlighted with mercurochrome solution)
     
    Results
    A total of 69 patients were seen of whom 45 had been breastfeeding for more than 1 month. All except six were in their 30s. The age of the baby was less than 1 month in 24 (35%) women, 1 to 6 months in 27 (39%), and over 6 months in 18 (26%). Only 13 (19%) used complementary infant milk formula.
     
    Breast pain was present for less than 3 days in 35 (51%) women but for longer in the remaining 34 (49%). Pain duration exceeded 7 days in 22 (32%); 15 (22%) of whom had intermittent pain for 14 to 30 days. In 31 (45%) patients, earlier treatment had been received from various sources including Maternal and Child Health Centres (MCHCs), family doctors, general practitioners, obstetricians, doctors at an accident and emergency department, surgeons in a breast surgery clinic, or lactation consultants. Antifungal or antibacterial medication, either local or systemic, was prescribed.
     
    Apart from breast pain, there were other additional complaints: nipple pain in eight (12%) women, sharp needle pain after feeding in eight (12%), white spot at nipple in 15 (22%), and fever in 14 (20%). All had decreased milk production by the affected breast despite frequent feeding or pumping.
     
    The following diagnoses were made: nipple cracks (n=2), poor positioning and latch (n=7), engorgement (n=5), blocked duct (n=35), mastitis (n=13), breast abscess (n=6), and skin infection (n=1). One had all pus drained via the milk duct. Another had pus formed in the sebaceous gland at the areola and was fully drained. The remaining four were referred to a surgeon for further management. Oral antibiotics were prescribed to 21 (30%) women. Fungal infection was suspected in only one woman. Clinical details of four patients chosen for illustration are shown in the Table.
     

    Table. Clinical details of four patients
     
    Discussion
    Subjects in this study represent mothers who were very dedicated to breastfeeding. Most had been breastfeeding for more than 1 month and had not given up, despite experiencing pain for quite a number of days.
     
    Blocked duct/mastitis
    Blocked duct was the most common cause of breast pain in this study group. Delay in diagnosing and treating a blocked duct can lead to a more serious condition of mastitis and breast abscess.
     
    Engorgement, blocked duct, mastitis, and breast abscess reflect progression from a common original problem of inadequate drainage that can be due to poor positioning and latching, inadequate emptying, or overproduction.3 Obtaining a good history, performing a thorough breast examination, and milk expression can help to make the diagnosis.
     
    Engorgement usually involves the whole breast whereas a blocked duct involves a lobule. In the latter, redness and tenderness are apparent and examination of the areola may reveal a tender swelling representing a blockage of the duct near the opening. Gentle massage and milk expression will relieve the pain and tenderness. A simple blocked duct can be relieved immediately. Nonetheless, when the swelling can only be partially relieved, it may represent tissue inflammation indicative of mastitis. Mothers were encouraged to feed more often on the affected breast. If this failed after one or two feeds, antibiotics were prescribed to prevent progression to breast abscess.
     
    Milk is a very good medium for bacterial culture. Stasis of milk for too long may lead to infection (mastitis) and pus formation (abscess). The common guideline is to relieve a blocked duct as soon as possible, especially in the presence of fever. Once fever has persisted for longer than 24 hours, antibiotics are required. In one woman in this study, however, breast abscess was evident within the first few hours of fever and in another woman without fever, thus fever should be considered a non-specific sign. Clinical assessment was the most important. The ratio of breast abscess to mastitis was higher in this series (46.2%) compared with that reported in the literature (11.1%).3 This may have been due to a difference in sampling methods or different diagnostic criteria for mastitis. The difference between a blocked duct and mastitis can be very subtle. Presence of redness and tenderness in the breast with little effort to clear the blockage may be classified as mastitis. What is of more concern is the possible delay in management that allows untreated mastitis to progress to breast abscess. An abscess can be drained through the duct manually, but needle aspiration under ultrasound guidance or incision may be required in some cases. If there is an incision, the wound must be left open for continuous drainage and the mother may be forced to stop breastfeeding.
     
    Since most of these infections are due to Staphylococcus aureus, Streptococcus, or Escherichia coli, antibiotics chosen should be amoxicillin with clavulanate, cloxacillin, or cefuroxime; all of which are compatible with continuation of breastfeeding.3
     
    Nipple pain
    Nipple pain may indicate a blocked duct because the duct beneath the areola is swollen. There should be some tenderness although not as much as that of the affected breast lobule. After relief of the blockage, nipple pain will resolve.
     
    A white spot at the nipple may also indicate a blocked duct. Blockage of a lobule and then stasis of milk at the opening of the duct can lead to further blockage by milk that has a high fat or high calcium content. This spot will be white in colour, sometimes referred to as a bleb. It can be removed by milk expression, needle puncture, or local application of vegetable oil. This should not be confused with thrush.
     
    Concern has already been raised about the general overdiagnosis of fungal infection as a cause of breast pain, nipple pain, or white spot.4 Patients treated for a presumed ‘yeast infection’ might have shown improvement in symptoms as a result of the anti-inflammatory effect of the antifungal drugs or because the blocked duct resolved on its own. Fungal infection of the breast and nipple may be considered if a blocked duct has been excluded and is often associated with other risk factors. Examples are consuming a diet with high sugar content that promotes growth of fungus, mother having received antibiotics, a maternal history of vaginal candidiasis, or baby’s oral mucosa with thrush.4 None of these risk factors was found in any of the mothers in this study. Excruciating pain after a feed is a non-specific sign. It is more likely to be due to a blocked duct or inadequate emptying of the breast, as shown in this study. These patients had failed to improve after being given local or systemic antifungal treatment at previous consultations before presenting to this clinic. Pain was relieved only after the blocked duct was cleared.
     
    Diet of lactating mothers
    There was a general misunderstanding among the lactating mothers that eating more animal foods could improve milk supply. Previous studies have shown that the protein intake of local lactating mothers is much higher than that of those in other countries. At 3 months postpartum, Hong Kong mothers had a protein intake of 98 g/day5 compared to 81 g/day in the UK6 and 80 g/day in Japan.7 In the first month after delivery (known locally as the confinement period), the protein intake was even higher (133 g/day)5 than at 3 months. During this month, local mothers usually consume a special diet consisting of much more pork, fish, chicken, egg, and milk.
     
    The practice of eating a special diet with additional animal foods during confinement may be unique to the Hong Kong Chinese population and is likely a long-standing Chinese tradition. The original rationale was to replenish the blood loss of childbirth and may have been necessary at a time when the general population had barely enough food. Prior to the 1960s, our ancestors usually ate a plant-based diet, with pork available only at Chinese New Year or during some festivals. There was very little over-nutrition. Times have changed. The diet of adults today is generally high in animal protein8 and fat. Further increase will lead to new clinical problems, not just weight gain in mothers but also increased risk of blockage and inflammation in breastfeeding mothers. A diet that contains much more meat has been shown to be associated with higher inflammatory index scores,9 one marker of which is C-reactive protein.10
     
    The quantity and quality of fat in breast milk can be affected by the fat in the maternal diet. Lactating mothers in Chongqing (a major city in Southwest China) consumed a diet in which fat came from lard. Total fat in the breast milk was higher in Chongqing: 38 g/L compared with 32 g/L in Hong Kong.11 Chongqing mothers did not appear to have problems of blocked ducts or mastitis. Thus, a high-fat diet per se may not cause mastitis; rather, it is the quality of fat that matters. Mothers who consume a diet high in saturated fat may be more prone to duct blockage.3 Mothers with a recurrent blocked duct were often advised to change their diet to one with more polyunsaturated fat or to use a lecithin supplement.3 A dietary source of lecithin is mainly soy or eggs. It would appear to be a good practice for lactating mothers in Chongqing to eat lots of eggs. However, in view of the possibility of egg allergy, Hong Kong mothers may be better advised to eat more soy products. Mothers in this study group appeared to eat very few soy products.
     
    High milk production together with inadequate emptying definitely poses a problem. Many Hong Kong mothers took both Chinese remedies (herbs, fish soups) and western teas (eg fenugreek) to increase milk production. Nearly all breastfeeding mothers had a breast pump. Some mothers pumped milk more often to produce an excess for later use. Indeed, quite a number of the studied mothers had plenty of stored milk in their refrigerator. Working mothers may have stopped pumping during weekends. Such irregular breast emptying may cause the problem of milk stasis. The presence of fatigue, stress, and an imbalanced diet can encourage inflammation that can easily progress to mastitis. Recurrence of blocked duct/mastitis may occur if the mother’s diet and practice of feeding or pumping are not corrected.
     
    A diet rich in white sugar or corn syrup, pastries, and cakes can enhance the growth of fungus but was generally not observed in our subjects. This may explain why fungal infection was rare. A natural well-balanced diet with whole grains, plenty of vegetables and fruits, and no excessive animal products should be recommended. Refined sugary foods, foods with chemicals, colouring agents, and preservatives should be avoided.
     
    Medical services
    A substantial number of nurses had passed the examination that qualified them as an International Lactation Consultant. They worked mainly in the maternity wards of hospitals and MCHCs in Hong Kong. They were very successful in initiating breastfeeding. Some hospitals ran a breastfeeding clinic to support mothers after discharge from the maternity ward but they were not available round the clock. Most mothers with breastfeeding problems attended an MCHC to seek help. Other mothers chose to see their family doctors. In general, doctors had little training in dealing with problems related to breastfeeding. In 2011, the Department of Health produced a self-learning kit on breastfeeding for any doctor who was interested, but it is difficult for the public to identify such doctors.
     
    Breast pain can sometimes be unbearable. Some patients described it as worse than labour pain. It is unknown from this study how many mothers had stopped breastfeeding because of the pain or how many ended up in hospital with a high fever and abscess that required surgery. Many patients in this study stated that after earlier treatment failed, they had no idea where else to seek further help. Others hesitated to seek medical help because they were afraid they would be told to stop breastfeeding. The mothers in this study were perhaps exceptional. They had tried very hard to find a solution for their pain even though it might have taken a number of days. These mothers deserve a better medical service. The Secretary for Food and Health has stated that the government is very supportive of breastfeeding and is ready to collaborate with health care professional bodies or non-governmental organisations in training personnel and promoting breastfeeding.12 Setting up breastfeeding clinics is the correct approach. These clinics can be run by MCHCs or a Baby-Friendly Hospital and should be full time and open to all. Doctors and lactation consultants can accumulate clinical experience faster and can then act as professional trainers. There is also a need for more local research on the diet and health of lactating mothers, especially those in confinement, so that appropriate education can be delivered to doctors, lactation consultants, midwives, peer counsellors, confinement nannies, and the public.
     
    This study was limited by its retrospective nature. There was a lack of standard protocols for data recording and retrieval. Not all women were followed up to determine if they had completely recovered since it is difficult to do so in a private clinic. What is certain is that those with engorgement and a blocked duct felt immediate relief the moment they left the clinic. It is also quite possible that many cases of breast pain were treated by other doctors and lactation consultants. The data of this study may thus not be representative of Hong Kong in general.
     
    Conclusion
    Blocked duct was the most common cause of breast pain in lactating mothers. Without prompt relief it may progress to mastitis/breast abscess or the mother may choose to stop breastfeeding. It may be a suitable time for Hong Kong to set up one or more public full-time breastfeeding clinics in order to provide a better service for lactating mothers and to facilitate professional training and research.
     
    Declaration
    The author has disclosed no conflicts of interest.
     
    References
    1. Baber FM. The current situation in Hong Kong. Hong Kong Pract 1981;5:132-7.
    2. Baby-Friendly Hospital Initiative Hong Kong Association. Available from: http://www.babyfriendly.org.hk/en/breastfeeding-in-hk/breastfeeding-trend/. Accessed Feb 2016.
    3. Lawrence RA, Lawrence RM. Breast feeding: a guide for the medical profession. 5th ed. St Louis: Mosby; 1999: 273-83.
    4. Wilson-Clay B, Hoover K. The breastfeeding atlas. 5th ed. US: LactNews Press; 2013: 57-8.
    5. Chan SM, Nelson EA, Leung SS, Cheng JC. Bone mineral density and calcium metabolism of Hong Kong Chinese postpartum women—a 1-y longitudinal study. Eur J Clin Nutr 2005;59:868-76. Crossref
    6. Black AE, Wiles SJ, Paul AA. The nutrient intakes of pregnant and lactating mothers of good socio-economic status in Cambridge, UK: some implications for recommended daily allowances of minor nutrients. Br J Nutr 1986;56:59-72. Crossref
    7. Takimoto H, Yoshiike N, Katagiri A, Ishida H, Abe S. Nutritional status of pregnant and lactating women in Japan: a comparison with non-pregnant/non-lactating controls in the National Nutrition Survey. J Obstet Gynaecol Res 2003;29:96-103. Crossref
    8. Leung SS, Woo J, Ho S, Lam TH, Janus ED. Hong Kong dietary survey. Aust J Nutr Diet 1988;55(Suppl):S11-4.
    9. Morimoto Y, Beckford F, Cooney RV, Franke AA, Maskarinec G. Adherence to cancer prevention recommendations and antioxidant and inflammatory status in premenopausal women. Br J Nutr 2015;114:134-43. Crossref
    10. Turner-McGrievy GM, Wirth MD, Shivappa N, et al. Randomization to plant-based dietary approaches leads to larger short-term improvements in Dietary Inflammatory Index scores and macronutrient intake compared with diets that contain meat. Nutr Res 2015;35:97-106. Crossref
    11. Chen ZY, Kwan KY, Tong KK, Ratnayake WM, Li HQ, Leung SS. Breast milk fatty acid composition: a comparative study between Hong Kong and Chongqing Chinese. Lipids 1997;32:1061-7. Crossref
    12. Hong Kong Paediatric Society, Hong Kong Paediatric Foundation. Summit on Breastfeeding and Early Childhood Nutrition in the First 1000 Days 2015; Abstract: 18.

    Managing malignant pleural effusion with an indwelling pleural catheter: factors associated with spontaneous pleurodesis

    Hong Kong Med J 2016 Aug;22(4):334–40 | Epub 3 Jun 2016
    DOI: 10.12809/hkmj154673
    © Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
     
    ORIGINAL ARTICLE
    Managing malignant pleural effusion with an indwelling pleural catheter: factors associated with spontaneous pleurodesis
    WM Wong, FHKCP, FHKAM (Medicine); Terence CC Tam, FHKCP, FHKAM (Medicine); Matthew KY Wong, MB, BS, FRCP; Macy MS Lui, FHKCP, FHKAM (Medicine); Mary SM Ip, MD, FRCP; David CL Lam, MD, FRCP
    Department of Medicine, Queen Mary Hospital, The University of Hong Kong, Pokfulam, Hong Kong
     
    Corresponding author: Dr David CL Lam (dcllam@hku.hk)
     
     Full paper in PDF
    Abstract
    Introduction: Malignant pleural effusion can be recurrent despite active anti-cancer treatment. Significant malignant pleural effusion leads to debilitating dyspnoea and worsening quality of life in patients with advanced cancer. An indwelling pleural catheter offers a novel means to manage recurrent malignant pleural effusion and may remove the need for repeated thoracocentesis. Spontaneous pleurodesis is another unique advantage of indwelling pleural catheter placement but the factors associated with its occurrence are not clearly established. The aims of this study were to explore the safety of an indwelling pleural catheter in the management of symptomatic recurrent malignant pleural effusion, and to identify the factors associated with spontaneous pleurodesis.
     
    Methods: This case series with internal comparisons was conducted in the Division of Respiratory Medicine, Department of Medicine, Queen Mary Hospital, Hong Kong. All patients who underwent insertion of an indwelling pleural catheter from the initiation of such service from January 2010 to December 2014 were included for data analysis. Patients were monitored until December 2014, with the last catheter inserted in July 2014.
     
    Results: Between 2010 and 2014, a total of 23 indwelling pleural catheters were inserted in 22 consecutive patients with malignant pleural effusion, including 15 (65.2%) cases with malignant pleural effusion as a result of metastatic lung cancer. Ten (43.5%) cases achieved minimal output according to defined criteria, in five of whom the pleural catheter was removed without subsequent re-accumulation of effusion (ie spontaneous pleurodesis). Factors associated with minimal output were the absence of trapped lung (P=0.036), shorter time from first appearance of malignant pleural effusion to catheter insertion (P=0.017), and longer time from catheter insertion till patient’s death or end of study (P=0.007).
     
    Conclusions: An indwelling pleural catheter provides a safe means to manage symptomatic malignant pleural effusion. Potential clinical factors associated with minimal output were identified along with the occurrence of spontaneous pleurodesis, which is a unique advantage offered by indwelling pleural catheter.
     
    New knowledge added by this study
    • An indwelling pleural catheter (IPC) offers a new and safe management option for symptomatic malignant pleural effusion (MPE).
    • Potential clinical factors associated with spontaneous pleurodesis were identified.
    Implications for clinical practice or policy
    • IPC is a safe management option for MPE.
    • In addition to drainage of effusion, the use of an IPC may be followed by spontaneous pleurodesis that obviates the need for any additional chemical sclerosant.
     
     
    Introduction
    Malignant pleural effusion (MPE) develops in up to 50% of patients with advanced lung cancer1 and can also develop in metastatic pleural involvement from non-pulmonary cancers. Such a complication can be recurrent despite active anti-cancer treatment and thus difficult to manage.1 Significant MPE leads to debilitating dyspnoea and worsening quality of life in patients with terminal cancer.2 Conventional management options of MPE include thoracocentesis, chest tube drainage, and chemical and surgical pleurodesis.3 Nonetheless, MPE often recurs and necessitates repeated thoracocentesis or chest tube drainage.4 Chemical pleurodesis via an intercostal chest tube may entail prolonged hospitalisation and despite initial ‘success’, MPE often recurs a few months later.5 Surgical pleurodesis is often too invasive for frail cancer patients.6 Systemic anti-cancer treatment may reduce MPE but there is no guarantee of success.7 To secure symptom relief and to minimise repeated interventions and hospitalisation in refractory MPE was a constant challenge, until an indwelling pleural catheter (IPC) became more commonly used.8
     
    An IPC is intended to be left in situ in the pleural cavity permanently in patients with advanced cancer. Insertion is performed under local anaesthesia, supplemented with conscious sedation if needed. An IPC is a silicone catheter with a polyester cuff for anchoring the catheter at the subcutaneous tunnel that serves to reduce infection. At the end of the external portion of the catheter is a silicone valve that remains closed unless connected to a designated drainage line or vacuum bottle. Vacuum bottles are not reusable and are discarded after each episode of drainage. Patients are usually advised to have IPC drainage every 1 or 2 days, especially when output remains substantial. In addition, drainage should be done whenever symptoms of MPE occur (Fig).
     

    Figure. (a) An indwelling pleural catheter (IPC) kit (Rocket Medical, UK). (b) Connecting different parts of IPC. (c) IPC inserted in a patient
     
    The guidelines for management of MPE published by the British Thoracic Society suggest that IPC is an alternative option for patients whose estimated survival exceeds 1 month and who have either a trapped lung or recurrent pleural effusion following a trial of pleurodesis.3 First-line use of IPC in patients with no previous trial of pleurodesis has also been shown to be superior to talc pleurodesis: subjects were less dyspnoeic at 6 months, less likely to need further pleural procedures, and had their hospital stay reduced by 3.5 days.9 Another prospective open-label trial that compared IPC with talc slurry pleurodesis as first-line treatment for MPE also demonstrated that first-line use of IPC conferred non-inferior improvement in dyspnoea and quality of life, reduced effusion-related hospital stay by 7 to 11 days, and required fewer subsequent pleural procedures compared with talc slurry pleurodesis.10 Research has shown that IPC is a safe procedure, with no complications in 87.5% (range, 54.5-100%) of patients.11 Although the IPC is designed to be left permanently in situ in the pleural cavity in patients with advanced cancer, one unique advantage of IPC is the occurrence of autopleurodesis or spontaneous pleurodesis (SP)—ie pleurodesis achieved following IPC insertion without the use of sclerosant. The achievement of SP may enable consequent removal of the IPC. The pooled rate of SP in MPE patients has been reported to be 45.6%,11 achieved after a mean duration of 26 to 56 days after IPC insertion.11 12 13 14 15 16 17 18 19 20 The possibility of SP is attractive as there is a chance that an IPC will no longer be required. The aims of this study were to review our single-centre experience of the safety of IPC in the management of symptomatic MPE and to explore the potential clinical factors associated with SP. To our knowledge, this is the first IPC study published in Hong Kong.
     
    Methods
    All patients who underwent IPC insertion at the Division of Respiratory Medicine, Department of Medicine, Queen Mary Hospital since initiation of the IPC service in January 2010 up to December 2014 were included for data analysis. Patients and data were followed up until December 2014, with the last IPC inserted in July 2014. The study was approved by the University of Hong Kong/Hong Kong Hospital Authority Hong Kong West Cluster Institutional Review Board/Ethics Committee (HKU/HAHO HKWC IRB/EC UW13-581) and informed consent was obtained from patients.
     
    An IPC was inserted in patients with MPE who had trapped lung or prior failed pleurodesis or persistent high effusion output from a chest drain and a high chance of pleurodesis failure, or in patients who preferred IPC as their first-line management of MPE. The IPC kits (Rocket Medical, UK) were used and IPCs were inserted in the endoscopy room under local anaesthesia supplemented with conscious sedation if needed.
     
    The electronic patient records, in-patient records, chest radiographs, and drainage diaries were retrospectively reviewed. Data regarding patient demographics, primary malignancy, cancer treatment, history of thoracic irradiation, number and type of prior pleural procedures, indications for IPC, serum albumin level before IPC insertion, laboratory analysis of pleural fluid obtained prior to IPC insertion, and IPC-related complications and admissions were collected and evaluated. ‘Massive effusion’ was defined as an effusion occupying more than two thirds of the hemithorax; an effusion occupying two thirds or less was defined as ‘non-massive’. Trapped lung was clinically diagnosed when chest X-ray showed an incompletely re-expanded lung despite adequate drainage and suction, together with a compatible tumour status predisposing to trapped lung (eg endobronchial tumour). The number of IPCs inserted, instead of the number of patients, was used for analysis in this study unless otherwise specified.
     
    Although IPC removal could be considered when SP was achieved clinically, there were patients who achieved minimal IPC output in whom IPC was not removed due to other clinical considerations or patient preference. Hence, the rate of SP would be underestimated if only IPC removal on the basis of minimal output was considered to reflect SP. Therefore, in this study patients were deemed to have achieved ‘minimal output’ if there was a persistently reduced IPC output of ≤50 mL per day on average that was not secondary to IPC complications, and regardless of whether the IPC was removed or kept in situ. Patients who persistently had an average IPC output that exceeded 50 mL per day, or had little output due to IPC complications (eg blocked IPC or significant pleural loculation) were defined as the ‘persistent output’ group. As achievement of SP did not necessarily imply IPC removal, because of patient preference and/or other considerations, the endpoint ‘minimal output’ was used for analysis of factors associated with SP.
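    The classification rule above can be expressed as a short sketch. This is purely illustrative (the function name and sample values are hypothetical, not from the study): a patient falls into the ‘minimal output’ group only if the average daily drainage is ≤50 mL and the low output is not an artefact of an IPC complication.

```python
# Illustrative sketch of the study's output classification rule
# (hypothetical helper, not the authors' code).

def classify_output(daily_outputs_ml, complication=False):
    """Return 'minimal' or 'persistent' for one drainage diary.

    'minimal' requires an average daily output of <=50 mL that is
    not secondary to an IPC complication (eg a blocked catheter).
    """
    avg = sum(daily_outputs_ml) / len(daily_outputs_ml)
    if avg <= 50 and not complication:
        return 'minimal'
    return 'persistent'

print(classify_output([30, 40, 20]))                    # low average, no complication
print(classify_output([10, 5, 0], complication=True))   # low, but due to blocked IPC
print(classify_output([120, 200, 90]))                  # output remains substantial
```

    Note that a blocked catheter with near-zero drainage is deliberately classed as ‘persistent output’, mirroring the study’s exclusion of complication-related low output from the SP surrogate.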
     
    The IBM PASW statistical software version 20 was used for data analysis. Association of clinical factors with outcome was analysed with Fisher’s exact test, independent sample t tests, and Mann-Whitney test where appropriate. Shapiro-Wilk tests were used to check for normal distribution of individual continuous variables. As minimal output was a dichotomous variable, the point-biserial correlation method was used for association analysis between minimal output and other factors that were continuous variables. The P values were two-sided and were considered statistically significant if <0.05.
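    As a minimal sketch of the analysis pipeline described above, the same tests can be reproduced in SciPy. All data values below are hypothetical; only the choice of tests follows the Methods (Shapiro-Wilk for normality, Mann-Whitney for non-normal continuous comparisons, Fisher’s exact for categorical factors, and point-biserial correlation between the dichotomous ‘minimal output’ endpoint and continuous factors).

```python
# Illustrative re-creation of the statistical tests named in the Methods,
# applied to hypothetical data (not the study's dataset).
import numpy as np
from scipy import stats

# Hypothetical weeks from first appearance of MPE to IPC insertion
persistent = np.array([12.0, 48.0, 8.0, 30.0, 55.0, 6.0, 20.0])  # persistent output group
minimal = np.array([3.0, 5.0, 1.5, 9.0, 12.0, 4.0])              # minimal output group

# Shapiro-Wilk: check normality before choosing parametric vs rank-based tests
_, p_norm = stats.shapiro(persistent)

# Mann-Whitney U test for a non-normally distributed continuous variable
_, p_mw = stats.mannwhitneyu(persistent, minimal, alternative='two-sided')

# Fisher's exact test for a 2x2 factor (eg trapped lung vs outcome group)
table = [[6, 1], [7, 9]]  # hypothetical counts
odds, p_fisher = stats.fisher_exact(table)

# Point-biserial correlation: dichotomous endpoint vs continuous factor
outcome = np.array([0] * len(persistent) + [1] * len(minimal))
weeks = np.concatenate([persistent, minimal])
r_pb, p_pb = stats.pointbiserialr(outcome, weeks)
print(round(r_pb, 2), p_mw < 0.05)
```

    With two-sided P values and a 0.05 significance threshold, this mirrors the decision rules stated in the Methods; the point-biserial coefficient here is negative because the hypothetical minimal output group has the shorter times to IPC insertion, consistent with the direction of the reported association.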
     
    Results
    A total of 23 IPCs were inserted in 22 consecutive patients with symptomatic MPE. Fifteen (65.2%) IPCs were inserted in patients with MPE from metastatic lung cancer. A further six were inserted for MPE from metastatic breast cancer and two in patients with MPE from metastatic colon cancer. The characteristics of patients are shown in Table 1. The mean (± standard error of the mean) duration of follow-up was 33.3 ± 28.1 weeks.
     

    Table 1. Summary of characteristics of subjects included in this study (n=23)
     
    Patients were admitted for symptomatic MPE or elective IPC insertion. Patients were discharged a mean of 4 days after IPC insertion. Ambulatory IPC drainage via vacuum bottles was performed by patients and/or their carers, except one patient who was attended by outreach nurses of the palliative care team.
     
    Complications related to IPC occurred in 10 (43.5%) cases (Table 2). Site infection and wound infection following IPC removal were minor and all resolved after a course of oral antibiotics without the need for hospitalisation. Tumour seeding at the IPC tract was successfully treated by local radiotherapy. Two patients had symptomatic loculated effusion following IPC insertion and required intrapleural fibrinolytics: only one of them improved. Complications necessitated removal of two IPCs. One patient developed empyema 6 months after IPC insertion. Pseudomonas aeruginosa was persistently isolated from pleural fluid despite appropriate antibiotics; infection resolved following IPC removal. Another patient developed intractable cough and it was suspected that her IPC was trapped at the right oblique fissure causing irritation. Cough improved following IPC removal. There were six IPC complication–related hospitalisations (either clinical or emergency admissions) in three patients: the two patients with symptomatic loculations requiring fibrinolytics and the patient with empyema mentioned above.
     

    Table 2. Complications related to indwelling pleural catheter (IPC) in this study (n=23)
     
    A total of 10 patients achieved minimal output: IPC was removed in five (21.7%) without subsequent effusion re-accumulation and the other five patients achieved minimal output but retained their IPC. In another two patients, IPC was removed because of complications as mentioned before. No difficulties were encountered during any IPC removal.
     
    Significant factors associated with minimal output were the absence of trapped lung (P=0.036), shorter time from first appearance of MPE to IPC insertion (24.5 ± 24.2 weeks in persistent output group vs 5.75 ± 4.91 weeks in minimal output group; P=0.017), and longer time from IPC insertion till patient’s death or end of study (whichever was earlier; 20.2 ± 19.5 weeks in persistent output group vs 50.3 ± 29.2 weeks in minimal output group; P=0.007; Table 3).
     

    Table 3. Association of clinical factors with minimal output by bivariate analysis
     
    Discussion
    In this small series of 22 patients with 23 IPCs, mainly minor complications were encountered. A serious IPC complication, namely empyema, occurred in one (4.3%) patient, who was successfully treated with antibiotics and removal of IPC without serious consequences. Insertion of IPC is considered a relatively safe procedure: up to 87.5% (range, 54.5-100%) of patients have no complications following the insertion.11 Complications reported in the literature include local pain (0.4-13%), bleeding (0-0.9%), pneumothorax (0-38%), cellulitis at exit site (1.3-25%), pleural infection (0-16.7%), asymptomatic loculations (4-7.3%), symptomatic loculations (2-13.5%), IPC tract metastasis (0-13.6%), clogged catheter (0-17.6%), IPC dislodgement (1.3-17.7%), and fractured IPC during removal (9.8%). Previous studies suggest that up to 20.6% (range, 1.6-20.6%) of IPCs need to be removed due to complications.9 10 11 14 15 21 22 Nonetheless, serious complications are uncommon; the most common being pleural infection (0-16.7%).23 The TIME2 study reported that the risk of pleural infection was 13.4% in the IPC group compared with 1.9% in the talc slurry pleurodesis group.9 Chemotherapy is not regarded as a contra-indication to IPC, or vice versa. No increased risk of pleural infection has been observed in patients who receive chemotherapy with an IPC in situ.24 Symptomatic loculations following IPC insertion are another relatively significant complication, as they often necessitate admission for management such as intrapleural fibrinolysis or other pleural procedure.
     
    When the daily IPC output reduces to a certain level (the exact ‘amount’ remains arbitrary), IPC removal can be considered and SP is achieved if there is no significant re-accumulation following IPC removal. In reality, some patients had little IPC output but the catheter was left in situ due to other clinical considerations. The rate of SP could be underestimated if it was solely reflected by the ultimate rate of IPC removal, hence ‘minimal output’ was used in this study as the surrogate of SP during analysis of factors that contributed to SP.
     
    We determined that absence of trapped lung, shorter time from first appearance of MPE to IPC insertion, and longer time from IPC insertion till patient’s death or end of study were associated with minimal output. Trapped lung unsurprisingly led to a higher chance of persistent output. Nonetheless, it has been observed that patients with IPC inserted for trapped lung can still achieve SP,12 15 17 18 20 or their lung expansion will improve after IPC.17 In our cohort, two patients had their trapped lung re-expanded after IPC insertion; one of whom had IPC removed successfully without re-accumulation of effusion.
     
It appears from this study that a shorter time from MPE to IPC insertion could be associated with the achievement of a minimal output state, implying that the earlier an IPC is inserted, the better the chance of achieving minimal output or even SP. Both a history of multiple pleural procedures (arbitrarily defined in this study as two or more episodes of pleurocentesis or chest drainage) and a history of failed pleurodesis are usually indicative of refractory or difficult-to-manage MPE.25 It has never been ascertained whether earlier IPC insertion, rather than repeated attempts at pleurocentesis or pleurodesis, will increase the chance of SP with IPC. Neither factor was significantly associated with minimal output in our small cohort. Further studies are required to investigate whether prompt insertion of IPC after development of MPE improves the likelihood of SP.
     
Patients who achieved minimal output had a longer time from IPC insertion until death or the end of study (20.2 ± 19.5 weeks in the persistent output group vs 50.3 ± 29.2 weeks in the minimal output group; P=0.007). Minimal output may be a marker of overall disease control. Lung cancer was the underlying pathology in eight of the 10 subjects who achieved minimal output, of whom six had adenocarcinoma and were prescribed targeted therapy and chemotherapy. Whether the concomitant use of anti-cancer treatments in these lung cancer patients contributed to longer survival following IPC insertion could not be established from this small cohort. Comparison with non–lung cancer patients with IPC in this study could not be made, as patients with metastatic breast or colorectal tumour with MPE had different treatment strategies. As at December 2014, only four of the 22 patients were still living; all had adenocarcinoma of the lung and were on palliative chemotherapy/tyrosine kinase inhibitors. Among these four patients, one had her IPC removed earlier owing to achievement of SP, two had the IPC removed earlier owing to IPC-related complications, and one still had the IPC in situ with persistent output.
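The group comparison above (reported as mean ± standard deviation with P=0.007) can be reproduced from summary statistics alone using Welch's unequal-variance t-test. The sketch below is purely illustrative and is not the authors' stated analysis; the group sizes (12 persistent vs 10 minimal output) are inferred from the cohort of 22 patients described above.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom,
    computed from per-group summary statistics."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2        # squared standard errors
    t = (mean2 - mean1) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Persistent output group (assumed n=12) vs minimal output group (n=10)
t, df = welch_t(20.2, 19.5, 12, 50.3, 29.2, 10)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A t statistic of this magnitude on roughly 15 degrees of freedom is consistent with the small two-sided P value reported above.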
     
Minimal output was used as a surrogate of SP in this study, rather than actual IPC removal, in the hope that it would better reflect which clinical factors contribute to SP. Comparison of the time from IPC insertion to achievement of minimal output between the five patients whose IPCs were ultimately removed and the five patients in whom the IPC remained in situ despite minimal output revealed no significant difference (30 [interquartile range, 15-59] days vs 23 [standard error of the mean, 6.63] days). Nonetheless, the reasons for non-removal of IPC despite minimal output must not be ignored, since they impact the ultimate goal of IPC removal. In this study, five patients achieved minimal output but their IPCs remained in situ for various reasons: poor performance state and short life expectancy, ongoing cycles of chemotherapy, or patient preference.
     
This study was limited by the very small sample size and its retrospective nature. There were missing data, and the dichotomous groupings (eg IPC drainage every 1 to 2 days versus less frequent) were crude and arbitrary. For example, more-frequent IPC drainage may theoretically increase the chance of pleural apposition and hence of SP, although in this study drainage every 1 to 2 days versus less frequent was not associated with minimal output. This could reflect the crude grouping of drainage frequency: the retrospective design did not allow us to properly characterise the IPC drainage schedule. Further studies to identify modifiable clinical factors that may facilitate SP would be particularly meaningful.
     
    Conclusions
Insertion of IPC was shown to be a safe technique in the management of symptomatic MPE. Potential factors associated with minimal output, which may predict SP, were absence of trapped lung, a shorter time from first appearance of MPE to IPC insertion, and a longer time with the IPC in situ. Validation by further studies is required owing to the small number of subjects in this study. More data are needed on the modifiable factors that contribute to achievement of minimal output, as removal of the IPC offers further enhancement of quality of life.
     
    Acknowledgement
    The authors would like to thank Ms Crystal Kwan for assistance in statistical analysis.
     
    Declaration
    All authors have disclosed no conflicts of interest.
     
    References
    1. Shaw P, Agarwal R. Pleurodesis for malignant pleural effusions. Cochrane Database Syst Rev 2004;(1):CD002916. Crossref
    2. Lorenzo MJ, Modesto M, Pérez J, et al. Quality-of-Life assessment in malignant pleural effusion treated with indwelling pleural catheter: a prospective study. Palliat Med 2014;28:326-34. Crossref
    3. Roberts ME, Neville E, Berrisford RG, Antunes G, Ali NJ; BTS Pleural Disease Guideline Group. Management of a malignant pleural effusion: British Thoracic Society Pleural Disease Guideline 2010. Thorax 2010;65 Suppl 2:ii32-40. Crossref
    4. de Andrade FM. The role of indwelling pleural catheter in management of malignant pleural effusion: A creative new technique for an old method. Lung India 2015;32:81-2.
    5. Penz ED, Mishra EK, Davies HE, Manns BJ, Miller RF, Rahman NM. Comparing cost of indwelling pleural catheter vs talc pleurodesis for malignant pleural effusion. Chest 2014;146:991-1000. Crossref
    6. Bhatnagar R, Kahan BC, Morley AJ, et al. The efficacy of indwelling pleural catheter placement versus placement plus talc sclerosant in patients with malignant pleural effusions managed exclusively as outpatients (IPC-PLUS): study protocol for a randomised controlled trial. Trials 2015;16:48. Crossref
    7. Massarelli E, Onn A, Marom EM, et al. Vandetanib and indwelling pleural catheter for non–small-cell lung cancer with recurrent malignant pleural effusion. Clin Lung Cancer 2014;15:379-86. Crossref
    8. Fysh ET, Thomas R, Read CA, et al. Protocol of the Australasian Malignant Pleural Effusion (AMPLE) trial: a multicentre randomised study comparing indwelling pleural catheter versus talc pleurodesis. BMJ Open 2014;4:e006757. Crossref
    9. Davies HE, Mishra EK, Kahan BC, et al. Effect of an indwelling pleural catheter vs chest tube and talc pleurodesis for relieving dyspnea in patients with malignant pleural effusion: the TIME2 randomized controlled trial. JAMA 2012;307:2383-9. Crossref
    10. Fysh ET, Waterer GW, Kendall PA, et al. Indwelling pleural catheters reduce inpatient days over pleurodesis for malignant pleural effusion. Chest 2012;142:394-400. Crossref
    11. Van Meter ME, McKee KY, Kohlwes RJ. Efficacy and safety of tunneled pleural catheters in adults with malignant pleural effusions: a systematic review. J Gen Intern Med 2011;26:70-6. Crossref
    12. Warren WH, Kalimi R, Khodadadian LM, Kim AW. Management of malignant pleural effusions using the Pleurx catheter. Ann Thorac Surg 2008;85:1049-55. Crossref
    13. Warren WH, Kim AW, Liptay MJ. Identification of clinical factors predicting Pleurx catheter removal in patients treated for malignant pleural effusion. Eur J Cardiothorac Surg 2008;33:89-94. Crossref
    14. Sioris T, Sihvo E, Salo J, Räsänen J, Knuuttila A. Long-term indwelling pleural catheter (PleurX) for malignant pleural effusion unsuitable for talc pleurodesis. Eur J Surg Oncol 2009;35:546-51. Crossref
    15. Tremblay A, Michaud G. Single-center experience with 250 tunnelled pleural catheter insertions for malignant pleural effusion. Chest 2006;129:362-8. Crossref
    16. Bertolaccini L, Viti A, Gorla A, Terzi A. Home-management of malignant pleural effusion with an indwelling pleural catheter: ten years experience. Eur J Surg Oncol 2012;38:1161-4. Crossref
    17. Schneider T, Reimer P, Storz K, et al. Recurrent pleural effusion: who benefits from a tunneled pleural catheter? Thorac Cardiovasc Surg 2009;57:42-6. Crossref
    18. Al-Halfawy A, Light R. Safety and efficacy of using a surgivac pump for the drainage of chronic indwelling pleural catheters in malignant pleural effusions. Respirology 2008;13:461-4. Crossref
    19. Bazerbashi S, Villaquiran J, Awan MY, Unsworth-White MJ, Rahamim J, Marchbank A. Ambulatory intercostal drainage for the management of malignant pleural effusion: a single center experience. Ann Surg Oncol 2009;16:3482-7. Crossref
    20. Ohm C, Park D, Vogen M, et al. Use of an indwelling pleural catheter compared with thoracoscopic talc pleurodesis in the management of malignant pleural effusions. Am Surg 2003;69:198-202.
    21. Tremblay A, Mason C, Michaud G. Use of tunnelled catheters for malignant pleural effusions in patients fit for pleurodesis. Eur Respir J 2007;30:759-62. Crossref
    22. Fysh ET, Wrightson JM, Lee YC, Rahman NM. Fractured indwelling pleural catheters. Chest 2012;141:1090-4. Crossref
    23. Gilbert CR, Lee HJ, Akulian JA, et al. A quality improvement intervention to reduce indwelling tunneled pleural catheter infection rates. Ann Am Thorac Soc 2015;12:847-53. Crossref
    24. Mekhaiel E, Kashyap R, Mullon JJ, Maldonado F. Infections associated with tunnelled indwelling pleural catheters in patients undergoing chemotherapy. J Bronchology Interv Pulmonol 2013;20:299-303. Crossref
    25. Fysh ET, Bielsa S, Budgeon CA, et al. Predictors of clinical use of pleurodesis and/or indwelling pleural catheter therapy for malignant pleural effusion. Chest 2015;147:1629-34. Crossref

    Impact of 18FDG PET and 11C-PIB PET brain imaging on the diagnosis of Alzheimer’s disease and other dementias in a regional memory clinic in Hong Kong

    Hong Kong Med J 2016 Aug;22(4):327–33 | Epub 17 Jun 2016
    DOI: 10.12809/hkmj154707
    © Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
     
    ORIGINAL ARTICLE
    Impact of 18FDG PET and 11C-PIB PET brain imaging on the diagnosis of Alzheimer’s disease and other dementias in a regional memory clinic in Hong Kong
    YF Shea, MRCP, FHKAM (Medicine)1; Joyce Ha, BSc1; SC Lee, BHS (Nursing)1; LW Chu, MD, FRCP1,2
    1 Division of Geriatrics, Department of Medicine, LKS Faculty of Medicine, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
    2 The Alzheimer’s Disease Research Network, SRT Ageing, The University of Hong Kong, Pokfulam, Hong Kong
     
    Corresponding author: Dr YF Shea (elphashea@gmail.com)
     
    Abstract
    Objective: This study investigated the improvement in the accuracy of diagnosis of dementia subtypes among Chinese dementia patients who underwent [18F]-2-fluoro-2-deoxy-D-glucose positron emission tomography (18FDG PET) with or without carbon 11–labelled Pittsburgh compound B (11C-PIB).
     
Methods: This case series was performed in the Memory Clinic at Queen Mary Hospital, Hong Kong. We reviewed 109 subjects (56.9% female) who received PET with or without 11C-PIB between January 2007 and December 2014. Data including age, sex, education level, Mini-Mental State Examination score, Clinical Dementia Rating scale score, neuroimaging report, and pre-/post-imaging clinical diagnoses were collected from medical records. The agreement between the initial and post-PET (with or without 11C-PIB) dementia diagnoses was analysed using Cohen's kappa statistic.
     
Results: The overall accuracy of the initial clinical diagnosis of dementia subtype was 63.7%, and the diagnosis was subsequently changed in 36.3% of subjects following PET with or without 11C-PIB. The rate of accurate initial clinical diagnosis (compared with the final post-imaging diagnosis) was 81.5%, 44.4%, 14.3%, 28.6%, 55.6%, and 0% for Alzheimer's disease, dementia with Lewy bodies, frontotemporal dementia, vascular dementia, other dementia, and mixed dementia, respectively. The agreement between the initial and final post-imaging dementia subtype diagnoses was only fair, with a Cohen's kappa of 0.25 (95% confidence interval, 0.05-0.45). Of the 21 subjects who underwent 11C-PIB PET imaging, 19% (n=4) of those with Alzheimer's disease (PIB positive) were initially diagnosed with non–Alzheimer's disease dementia.
     
    Conclusions: In this study, PET with or without 11C-PIB brain imaging helped improve the accuracy of diagnosis of dementia subtype in 36% of our patients with underlying Alzheimer’s disease, dementia with Lewy bodies, vascular dementia, and frontotemporal dementia.
     
    New knowledge added by this study
    • Positron emission tomography (PET) with or without Pittsburgh compound B (PIB) brain imaging helps improve the accuracy of dementia subtype diagnosis in Chinese patients.
    Implications for clinical practice or policy
    • PET with or without PIB brain imaging should be considered in patients with dementia who attend the memory clinic, especially if there is diagnostic difficulty.
     
     
    Introduction
With ageing of the world's population, the prevalence of dementia is increasing: 46.8 million people worldwide were living with dementia in 2015, projected to reach 74.7 million in 2030 and 131.5 million in 2050, with 60% suffering from Alzheimer's disease (AD).1 In Hong Kong, the prevalence of mild dementia has been reported to be 8.9% for adults aged 70 years or over, with 64.6% suffering from AD.2 Appropriate management of demented patients begins with a correct diagnosis of dementia subtype, which allows earlier implementation of disease-specific treatment. In particular, cholinesterase inhibitors (ChEIs) and N-methyl-D-aspartate receptor antagonists are indicated mainly for the treatment of AD. The current clinical diagnostic guidelines for the various types of dementia have limited sensitivities and specificities, however: the sensitivity and specificity of clinical diagnostic criteria for AD, dementia with Lewy bodies (DLB), and frontotemporal dementia (FTD) have been reported as 81% and 70%, 50% and 80%, and 85% and 95%, respectively.3 4 5 6 In the most recent diagnostic criteria for AD, additional use of AD biomarkers has been recommended by the National Institute on Aging and Alzheimer's Association to improve the accuracy of AD diagnosis.3 Biomarkers for the diagnosis of AD include cerebrospinal fluid (CSF) markers, amyloid pathological imaging (eg carbon 11–labelled Pittsburgh compound B [11C-PIB] positron emission tomography [PET]), and functional imaging (eg [18F]-2-fluoro-2-deoxy-D-glucose [18F-FDG] PET), which yield sensitivities and specificities of at least 90% and 85%, respectively, in the diagnosis of AD, DLB, and FTD.3 7 8 9 10 11 Because of the invasive nature of lumbar puncture for the collection of CSF, neuroimaging modalities such as 18F-FDG PET and 11C-PIB PET are more accepted in routine clinical practice to improve the diagnosis of dementia subtype.
     
The most common functional neuroimaging marker is 18F-FDG12 and the most common pathological neuroimaging marker is 11C-PIB.13 Both are imaged using PET. The 18F-FDG measures the metabolic activity of the brain, and 18F-FDG PET distinguishes well between AD and non-AD dementia.11 In a systematic review, the sensitivity and specificity of 18F-FDG PET in distinguishing AD from DLB were 83%-99% and 71%-93%, respectively, and in distinguishing AD from FTD were 97.6%-99% and 65%-86%, respectively.11 In the same systematic review, 18F-FDG PET predicted deterioration of patients with mild cognitive impairment (MCI) into dementia with a sensitivity and specificity of 81%-82% and 86%-90%, respectively.11 In addition, 11C-PIB can detect the presence of fibrillar amyloid plaques, a neuropathological marker of AD.13 Correlation studies with neuropathology have shown a sensitivity of 90% and specificity of 100%, and 11C-PIB can reasonably distinguish AD from other types of dementia, eg FTD.13 Using neuropathology as the gold standard, the sensitivity and specificity were 89% and 83%, respectively.13 The presence of 11C-PIB retention also predicts the progression of patients with MCI: 50% progress to AD in 1 year and 80% within 3 years.14
     
Previous studies with 18F-FDG and 11C-PIB PET have focused on highly selected diagnostic groups, and only a few have examined their impact in the routine clinical setting of a memory clinic at a tertiary university hospital. Such clinics are referral centres and often encounter patients with complicated diagnostic issues. Ossenkoppele et al15 reported a cohort of 145 patients who underwent 18F-FDG and 11C-PIB PET after clinical assessment: the clinical diagnosis was changed in 23%, and diagnostic confidence increased from a mean of 71% to 87%. The diagnosis remained unchanged in 96% over the 2 years following PET.15 Of seven patients with MCI and positive amyloid deposition on 11C-PIB PET, six progressed to AD during follow-up (5 had an AD pattern of hypometabolism on 18F-FDG PET).15 In a retrospective study of 94 patients with MCI or dementia, Laforce et al16 showed that 18F-FDG PET brain scan led to a change in diagnosis in 29% of patients, and reduced the frequency of atypical or unclear diagnoses from 39.4% to 16%.
     
    To the best of our knowledge, there are no published data on the impact of molecular neuroimaging on accuracy of diagnosis of AD or other dementias in the Chinese population. We hypothesised that brain 18F-FDG with or without 11C-PIB PET imaging can improve the accuracy of diagnosis of common dementia subtypes in a memory clinic. The objective of this study was to investigate the impact of brain 18F-FDG with or without 11C-PIB imaging in improving the accuracy of diagnosis of dementia subtype in a local memory clinic in Hong Kong.
     
    Methods
This was a retrospective study conducted at the Memory Clinic of Queen Mary Hospital, the University of Hong Kong. Patients were referred by general practitioners, neurologists, geriatricians, surgeons, or psychiatrists. All patient records between January 2007 and December 2014 were reviewed. Inclusion criteria were a clinical diagnosis of MCI, dementia of any type, or unclassifiable dementia; and 18F-FDG with or without 11C-PIB PET performed within 3 months of the initial clinical diagnosis. The initial clinical assessment was performed by a geriatrician experienced in dementia care and included detailed history taking from the patient's primary carers, physical examination, cognitive assessment, and laboratory studies (including thyroid function test, vitamin B12 level, folate level, and syphilis serology [Venereal Disease Research Laboratory]). Clinical criteria for AD, FTD, DLB, and vascular dementia (VaD) were employed to establish the initial clinical diagnosis, without the use of any biomarker; the pre-neuroimaging diagnosis of each dementia subtype was based on the respective diagnostic guidelines.
Patients with AD were diagnosed according to the NINCDS-ADRDA (National Institute of Neurological and Communicative Disorders and Stroke and Alzheimer's Disease and Related Disorders Association) diagnostic criteria.17 Patients with DLB were diagnosed by the McKeith criteria.4 The behavioural variant (bv) of FTD was diagnosed by the revised diagnostic criteria of the International bvFTD Criteria Consortium5 and the language variant of FTD by the latest published criteria.6 Patients with VaD were diagnosed according to the criteria of the NINDS-AIREN (National Institute of Neurological Disorders and Stroke/Association Internationale pour la Recherche et l'Enseignement en Neurosciences).18 In this study, we reviewed the medical records of eligible subjects and collected data including age, sex, education level, Mini-Mental State Examination score, Clinical Dementia Rating scale score, molecular imaging report including the standardised uptake value ratio (SUVR) of 11C-PIB PET, and the pre- and post-imaging diagnoses. For patients diagnosed with MCI, progression during subsequent follow-up visits was also reviewed.
     
The need for 18F-FDG with or without 11C-PIB PET was determined by the geriatrician who performed the initial clinical assessment. The images were evaluated by a radiologist with more than 10 years of experience in reading PET scans. Dementias were classified using the generally accepted criteria. Patients fasted for at least 4 hours before the PET, and the serum glucose level was measured in all patients. For 18F-FDG PET, the patient rested in a dimly lit room with eyes closed for 30 minutes prior to injection of 18F-FDG via a venous catheter, and for another 30 minutes before acquisition was started. The acquired data were semi-quantitatively compared with age-stratified normal controls using three-dimensional stereotactic surface projections. For PIB imaging, acquisition was performed at 5 minutes and 35 minutes after 11C-PIB injection via a venous catheter, and SUVR images of 11C-PIB between 5 and 35 minutes were generated, with cerebellar grey matter as the reference tissue. In this study, 11C-PIB PET scans were rated as positive (PIB+: binding in more than one cortical brain region, ie frontal, parietal, temporal, or occipital) or negative (PIB–: predominantly white matter binding).
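The PIB+/PIB– rating rule described above (SUVR positivity in more than one cortical region) can be sketched as a simple threshold check. This is an illustrative sketch only: the uptake values and the 1.5 SUVR cut-off are assumptions for demonstration, not the study's actual image-processing pipeline.

```python
def rate_pib(cortical_uptake, cerebellar_uptake, suvr_cutoff=1.5):
    """Rate a scan PIB+ if the SUVR (regional uptake normalised to
    cerebellar grey matter) exceeds the cut-off in more than one
    cortical region; otherwise PIB-."""
    suvr = {region: uptake / cerebellar_uptake
            for region, uptake in cortical_uptake.items()}
    positive_regions = [r for r, v in suvr.items() if v > suvr_cutoff]
    return "PIB+" if len(positive_regions) > 1 else "PIB-"

# Hypothetical uptake values: frontal and parietal SUVR exceed the cut-off,
# so two cortical regions are positive and the scan is rated PIB+
uptake = {"frontal": 2.4, "parietal": 2.1, "temporal": 1.3, "occipital": 1.2}
print(rate_pib(uptake, cerebellar_uptake=1.0))  # PIB+
```

Normalising to cerebellar grey matter is what makes the ratio comparable across patients, since cerebellar amyloid deposition is minimal in typical AD.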
     
    The pattern of 18F-FDG PET hypometabolism that is suggestive of each subtype of dementia is as follows6 12 19:
    (1) AD—uni- or bi-lateral parietotemporal hypometabolism with posterior cingulate gyrus involvement or bilateral parietal and precuneal hypometabolism.
    (2) DLB—same as AD with added hypometabolism in occipital lobes.
    (3) bvFTD—uni- or bi-lateral frontotemporal hypometabolism with or without less-severe parietal hypometabolism.
    (4) Semantic dementia—anterior temporal lobe hypometabolism.
    (5) Progressive non-fluent aphasia—left posterior frontoinsular hypometabolism.
    (6) VaD—well-defined focal defects not fitting the above described patterns.
     
    Statistical analyses
Descriptive statistics were used for data analyses. Continuous variables were expressed as mean ± standard deviation or median (interquartile range) as appropriate. Categorical data were expressed as numbers and percentages. The agreement between pre- and post-imaging diagnoses of dementia subtype was analysed by Cohen's kappa (κ) statistic, which reflects the degree of agreement: <0 = no agreement, 0-0.20 = slight agreement, 0.21-0.40 = fair agreement, 0.41-0.60 = moderate agreement, 0.61-0.80 = substantial agreement, and 0.81-1.00 = almost perfect agreement. All analyses were performed with the Statistical Package for the Social Sciences (Windows version 18.0; SPSS Inc, Chicago [IL], US).
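The agreement statistic described above can be illustrated with a minimal stdlib implementation of Cohen's κ, which corrects observed agreement for the agreement expected by chance from the two raters' marginal frequencies. The pre-/post-imaging label lists below are hypothetical and serve only to show the calculation; they are not study data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two paired categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of identical ratings
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of marginal frequencies, summed over categories
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pre- vs post-imaging subtype labels for eight subjects
pre  = ["AD", "AD", "DLB", "AD", "FTD", "AD", "VaD", "AD"]
post = ["AD", "DLB", "DLB", "AD", "AD",  "AD", "AD",  "FTD"]
print(round(cohens_kappa(pre, post), 2))  # -> 0.11 (slight agreement)
```

Note that κ can be far below the raw agreement rate (here 4/8 = 50%) when one category, such as AD, dominates both sets of labels.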
     
    Results
A total of 109 patients (56.9% female) were recruited, of whom 102 had dementia and seven had MCI. Both 18F-FDG and 11C-PIB PET data were available for 45 (41.3%) patients; the remaining 64 underwent 18F-FDG PET only. The final diagnoses of the 102 demented patients after neuroimaging are shown in Table 1.
     

    Table 1. Characteristics of demented patients by final diagnoses after brain 18F-FDG with or without 11C-PIB imaging (n=102)
     
    The accuracy of clinical diagnoses is summarised in Table 2. Overall, PET scans confirmed the clinical impression in 63.7% of patients, and corrected the diagnosis in 36.3%. Using the result of PET scan as the gold standard, the frequency of accurate initial clinical diagnosis was low for FTD, VaD, and mixed dementia (14.3%, 28.6%, and 0%, respectively). The accuracy of clinical diagnosis for AD and DLB was 81.5% and 44.4%, respectively. After excluding subjects with an initial MCI diagnosis, the agreement between the initial and final post-imaging dementia diagnosis was only fair, with a Cohen’s κ of 0.25 (95% confidence interval, 0.05-0.45).
     

    Table 2. Change in clinical diagnoses of dementia subtypes after 18F-FDG with or without 11C-PIB brain imaging
     
Table 3 lists the diagnoses of subjects before and after the availability of 18F-FDG with or without 11C-PIB PET neuroimaging. Of subjects with a final diagnosis of AD (n=65), 18.5% (n=12) were initially diagnosed with non-AD dementia (including 3 with DLB, 2 with FTD, 4 with VaD, and 3 with other dementia) and subsequently received symptomatic AD therapy (ie ChEIs and/or memantine). Of the 21 subjects who underwent PIB PET imaging, 19% (n=4) of those with AD (PIB+) were initially diagnosed with non-AD dementia. Of subjects with an initial diagnosis of AD (n=74), 28.4% (n=21) had a change in diagnosis (including 4 DLB, 6 FTD, 4 VaD, 3 mixed AD plus VaD, and 4 other dementia). Excluding subjects with DLB and mixed AD plus VaD, 13.7% of all subjects (14 out of 102) discontinued their previous symptomatic AD therapy. Of subjects with a final diagnosis of FTD (n=7), 85.7% (n=6) were initially misdiagnosed as AD. Of subjects with a final diagnosis of DLB (n=9), 44.4% (n=4) were misdiagnosed as AD.
     

    Table 3. Agreement between initial and final diagnoses
     
    Five patients were diagnosed with unclassifiable dementia following neuroimaging, which comprised four females and one male with a mean age of 78 ± 9.4 years. All presented with amnesia. In addition, one patient presented with apraxia and dysexecutive syndrome and another presented with hyperorality. All of them were PIB-. An AD pattern of hypometabolism was present in four patients (2 with hypometabolism in posterior cingulate gyrus and 2 with hypometabolism in temporoparietal lobes). Isolated hypometabolism in the temporal lobes was present in one patient.
     
The clinical information of the seven amnesic MCI subjects is summarised in Table 4. None of the three subjects without imaging risk factors for AD deteriorated over a follow-up period of 1 to 5 years. Of the four amnesic MCI subjects with imaging risk factors, two deteriorated into AD over a follow-up period of 5 years.
     

    Table 4. Longitudinal outcome of the seven patients with amnesic mild cognitive impairment
     
    Discussion
    In this study, we showed that 18F-FDG with or without 11C-PIB PET clarified and improved the accuracy of dementia diagnosis in 36.3% of patients, and confirmed the initial diagnosis in 63.7%. Using the results of PET scan as the gold standard, the accuracy of clinical diagnosis was low for FTD, VaD, and mixed dementia collectively. On the one hand, 11.7% of patients (ie 12 out of 102) were started on symptomatic AD therapy after the 18F-FDG with or without 11C-PIB PET neuroimaging investigations. On the other hand, 13.7% of patients (ie 14 out of 102) discontinued symptomatic AD therapy after 18F-FDG with or without 11C-PIB PET because they did not have AD.
     
We also showed that the accuracy of clinical diagnosis of DLB and FTD was low (44.4% and 14.3%, respectively). This finding was in agreement with a previous study.20 Both DLB and FTD are commonly misdiagnosed clinically as AD (50% for DLB and 85.7% for FTD).20 We have previously reported that 100% of our patients with biomarker-confirmed DLB and FTD presented to our memory clinic with memory impairment.20 A previous study also reported that 26% of DLB patients were initially misdiagnosed with AD, and that 57% of these DLB patients presented with memory impairment.21 An accurate diagnosis of DLB is very important for subsequent management. Patients with DLB are particularly sensitive to neuroleptics.21 Neuroleptic sensitivity can present as drowsiness, confusion, abrupt worsening of parkinsonism, postural hypotension, or neuroleptic malignant syndrome.21 Other clinical features of DLB that need to be observed and tackled include well-formed visual hallucinations, rapid eye movement sleep behavioural disorder, and autonomic symptoms (including postural hypotension, sialorrhoea, and urinary and bowel symptoms).21 Once the diagnosis of DLB is accurately established, careful observation for classic DLB symptoms may reduce unnecessary investigations. Regarding therapeutic implications, DLB is characterised by far greater cholinergic deficits than AD; hence, most DLB patients will benefit from ChEIs, and the extent of symptomatic improvement should be monitored after such therapy.22
     
Similarly, FTD may be misdiagnosed as AD, as it too can present initially with memory impairment, as illustrated by our FTD patients. There is increasing evidence that elderly patients with FTD often present with memory impairment.5 23 24 In one autopsy study, 64% (n=7) of 11 elderly patients with FTD had anterograde memory loss.23 Current treatment guidelines do not advise ChEIs or memantine for FTD patients; thus, such medications should be stopped to prevent unnecessary adverse effects.25
     
In the past few years, disease-modifying treatments (eg bapineuzumab) have failed to demonstrate efficacy in clinical trials with AD patients.26 Detailed post-hoc analyses with AD biomarkers have revealed the problem of diagnosing AD in the subjects recruited to these studies: only approximately 80% had AD amyloid pathology according to amyloid PET.26 Thus, including 11C-PIB PET confirmation of brain amyloid in study inclusion criteria can help ensure recruitment of genuine AD patients to future clinical trials of disease-modifying treatments for AD.27 Given the minimally invasive nature of 11C-PIB PET compared with CSF amyloid-beta (Aβ) 42 measurement,7 it is likely to be a more acceptable choice for patients in clinical trials. At present, there are ongoing clinical trials of AD treatments including secretase inhibitors, Aβ aggregation inhibitors, and Aβ and tau immunotherapy.27 We believe that 11C-PIB PET will play an important role in these trials.
     
    It is considered that 18F-FDG and 11C-PIB PET may detect underlying AD in patients with MCI.28 In the present study, 50% of MCI patients (ie 2 out of 4) with 18F-FDG and 11C-PIB PET imaging findings positive for AD showed deterioration over a follow-up period of 5 years. Although recommending PET brain imaging in MCI patients is still debatable, we believe that this investigation can help clinicians to better plan future and long-term treatments. In particular, disease-modifying drugs for AD or MCI due to AD may prove to be effective in the coming decade. Finally, in the present study, five patients were diagnosed with unclassifiable dementia. In the four patients with an AD pattern of hypometabolism, AD may still be present as they may have diffuse plaques or amorphous plaques that do not bind well to PIB. Alternatively they may have another type of dementia that requires pathological confirmation, eg argyrophilic grain disease or neurofibrillary tangle–only dementia.29 We will follow up the remaining patient with isolated hypometabolism in the temporal lobes to see whether additional FTD features develop.
     
There were several limitations to the present study. It was a retrospective case series, and as such we were unable to collect further information such as the pre- or post-imaging confidence of diagnosis. The diagnosis of dementia relied on clinical diagnostic criteria without pathological confirmation; we were therefore unable to compare the relative accuracy of clinical and PET diagnoses against a pathological diagnosis. Some patients with MCI were not followed up long enough to ascertain whether they deteriorated and developed dementia. Structural imaging (computed tomography or magnetic resonance imaging) of the brain was not analysed as a separate variable but was integrated into the pre-functional imaging clinical diagnoses of dementia subtypes. Our case series is likely to have selection bias, as PET imaging is mostly a self-paid service in Hong Kong, except for patients who are retired civil servants or recipients of Comprehensive Social Security Assistance; demented patients who could not afford PET may differ from the patients selected. Although the PET images were analysed and read by radiologists experienced in PET, the interpretations depended heavily on individual experience and training, and the radiologists were not blinded to clinical information written on the request form. Despite these limitations, our study should be more reflective of day-to-day practice in a memory clinic and of how 18F-FDG with or without 11C-PIB PET imaging may assist clinical diagnosis.
     
    Conclusions
    In this study, 18F-FDG with or without 11C-PIB brain imaging improved the accuracy of diagnosis of dementia subtype in 36% of patients with underlying AD, DLB, VaD, and FTD who presented to our memory clinic.
     
    Declaration
    All authors have disclosed no conflicts of interest.
     
    References
    1. Alzheimer’s Disease International. World Alzheimer Report 2015: executive summary. Available from: http://www.alz.co.uk/research/WorldAlzheimerReport2015-sheet.pdf. Accessed Sep 2015.
    2. Lam LC, Tam CW, Lui VW, et al. Prevalence of very mild and mild dementia in community-dwelling older Chinese people in Hong Kong. Int Psychogeriatr 2008;20:135-48. Crossref
    3. McKhann GM, Knopman DS, Chertkow H, et al. The diagnosis of dementia due to Alzheimer’s disease: recommendations from the National Institute on Aging and Alzheimer’s Association workgroups on diagnostic guidelines for Alzheimer’s disease. Alzheimers Dement 2011;7:263-9. Crossref
    4. McKeith IG, Dickson DW, Lowe J, et al. Diagnosis and management of dementia with Lewy bodies: third report of the DLB Consortium. Neurology 2005;65:1863-72. Crossref
    5. Rascovsky K, Hodges JR, Knopman D, et al. Sensitivity of revised diagnostic criteria for the behavioural variant of frontotemporal dementia. Brain 2011;134:2456-77. Crossref
    6. Harris JM, Gall C, Thompson JC, et al. Classification and pathology of primary progressive aphasia. Neurology 2013;81:1832-9. Crossref
    7. Shea YF, Chu LW, Zhou L, et al. Cerebrospinal fluid biomarkers of Alzheimer’s disease in Chinese patients: a pilot study. Am J Alzheimers Dis Other Demen 2013;28:769-75. Crossref
    8. Duits FH, Teunissen CE, Bouwman FH, et al. The cerebrospinal fluid “Alzheimer profile”: easily said, but what does it mean? Alzheimers Dement 2014;10:713-723.e2. Crossref
    9. Sinha N, Firbank M, O’Brien JT. Biomarkers in dementia with Lewy bodies: a review. Int J Geriatr Psychiatry 2012;27:443-53. Crossref
    10. Harris JM, Gall C, Thompson JC, et al. Sensitivity and specificity of FTDC criteria for behavioral variant frontotemporal dementia. Neurology 2013;80:1881-7. Crossref
    11. Davison CM, O’Brien JT. A comparison of FDG-PET and blood flow SPECT in the diagnosis of neurodegenerative dementias: a systematic review. Int J Geriatr Psychiatry 2014;29:551-61. Crossref
    12. Schöll M, Damián A, Engler H. Fluorodeoxyglucose PET in neurology and psychiatry. PET Clin 2014;9:371-90. Crossref
    13. Vandenberghe R, Adamczuk K, Dupont P, Laere KV, Chételat G. Amyloid PET in clinical practice: Its place in the multidimensional space of Alzheimer’s disease. Neuroimage Clin 2013;2:497-511. Crossref
    14. Cummings JL. Biomarkers in Alzheimer’s disease drug development. Alzheimers Dement 2011;7:e13-44. Crossref
    15. Ossenkoppele R, Prins ND, Pijnenburg YA, et al. Impact of molecular imaging on the diagnostic process in a memory clinic. Alzheimers Dement 2013;9:414-21. Crossref
    16. Laforce R Jr, Buteau JP, Paquet N, Verret L, Houde M, Bouchard RW. The value of PET in mild cognitive impairment, typical and atypical/unclear dementias: A retrospective memory clinic study. Am J Alzheimers Dis Other Demen 2010;25:324-32. Crossref
    17. McKhann G, Drachman D, Folstein M, Katzman R, Price D, Stadlan EM. Clinical diagnosis of Alzheimer’s disease: report of the NINCDS-ADRDA Work Group under the auspices of Department of Health and Human Services Task Force on Alzheimer’s Disease. Neurology 1984;34:939-44. Crossref
    18. Román GC, Tatemichi TK, Erkinjuntti T, et al. Vascular dementia: diagnostic criteria for research studies. Report of the NINDS-AIREN International Workshop. Neurology 1993;43:250-60. Crossref
    19. Waldö ML. The frontotemporal dementias. Psychiatr Clin North Am 2015;38:193-209. Crossref
    20. Shea YF, Ha J, Chu LW. Comparisons of clinical symptoms in biomarker-confirmed Alzheimer’s disease, dementia with Lewy bodies, and frontotemporal dementia patients in a local memory clinic. Psychogeriatrics 2014;15:235-41. Crossref
    21. Zweig YR, Galvin JE. Lewy body dementia: the impact on patients and caregivers. Alzheimers Res Ther 2014;6:21. Crossref
    22. Gauthier S. Pharmacotherapy of Parkinson disease dementia and Lewy body dementia. Front Neurol Neurosci 2009;24:135-9. Crossref
    23. Baborie A, Griffiths TD, Jaros E, et al. Frontotemporal dementia in elderly individuals. Arch Neurol 2012;69:1052-60. Crossref
    24. Hornberger M, Piguet O. Episodic memory in frontotemporal dementia: a critical review. Brain 2012;135:678-92. Crossref
    25. Portugal Mda G, Marinho V, Laks J. Pharmacological treatment of frontotemporal lobar degeneration: systematic review. Rev Bras Psiquiatr 2011;33:81-90. Crossref
    26. Blennow K, Mattsson N, Schöll M, Hansson O, Zetterberg H. Amyloid biomarkers in Alzheimer’s disease. Trends Pharmacol Sci 2015;36:297-309. Crossref
    27. Wisniewski T, Goñi F. Immunotherapeutic approaches for Alzheimer’s disease. Neuron 2015;85:1162-76. Crossref
    28. Langa KM, Levine DA. The diagnosis and management of mild cognitive impairment: a clinical review. JAMA 2014;312:2551-61. Crossref
    29. Kovacs GG. Tauopathies. In: Kovacs GG, editor. Neuropathology of neurodegenerative diseases: a practical guide. Cambridge: Cambridge University Press; 2015: 125-8.
