Initial results of selective renal parenchymal clamping with an adjustable kidney clamp in nephron-sparing surgery: an easy way to minimise renal ischaemia

Hong Kong Med J 2016 Dec;22(6):563–9 | Epub 29 Jul 2016
DOI: 10.12809/hkmj154746
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Initial results of selective renal parenchymal clamping with an adjustable kidney clamp in nephron-sparing surgery: an easy way to minimise renal ischaemia
KC Cheng, MB, ChB; MK Yiu, FHKAM (Surgery); SH Ho, FHKAM (Surgery); TL Ng, FHKAM (Surgery); HL Tsu, FHKAM (Surgery); WK Ma, FHKAM (Surgery)
Department of Surgery, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
 
Corresponding author: Dr WK Ma (kitkitma@yahoo.com)
 
Abstract
Introduction: A renal parenchymal clamp has been used at our centre since March 2012. It is positioned over the kidney to achieve optimal vascular control of the tumour while minimising parenchymal ischaemia. This study aimed to report the feasibility, surgical outcome, and oncological control achieved with the kidney clamp in partial nephrectomy.
 
Methods: This study was conducted at a teaching hospital in Hong Kong. Partial nephrectomies performed from January 2009 to March 2015 were reviewed. The tumour characteristics and surgical outcomes of kidney clamping were studied and compared with those of traditional hilar clamping.
 
Results: A total of 92 patients were identified during the study period. Kidney clamps were used in 20 patients and hilar clamping in 72, with a mean follow-up of 27 and 37 months, respectively. For patients in whom a kidney clamp was applied, all tumours were exophytic to a varying extent and the majority (90%) were located in the polar region. The PADUA (preoperative aspects and dimensions used for an anatomical classification) nephrometry score was also lower than that in patients in whom hilar clamping was used (7.07 vs 8.34; P=0.002). The clamp was used in open, laparoscopic, and robot-assisted surgery. Operating time was shorter (207 ± 72 mins vs 306 ± 80 mins; P<0.001) and estimated blood loss was lower (205 ± 191 mL vs 331 ± 275 mL; P=0.045) with the kidney clamp. No acute kidney injury occurred. Postoperative renal function was comparable between the two groups.
 
Conclusions: Partial nephrectomy using parenchymal clamping is safe and feasible in selected cases. The postoperative renal function and oncological control were satisfactory.
 
New knowledge added by this study
  • The use of a renal parenchymal clamp is feasible and safe for vascular control during partial nephrectomy.
  • Early results in terms of intra-operative blood loss, operating time, and postoperative renal function are promising.
Implications for clinical practice or policy
  • This clamp offers an ideal and convenient alternative to hilar clamping in selected cases.
 
 
Introduction
Nephron-sparing surgery (NSS) for renal tumour has been shown to produce oncological outcomes similar to, and functional outcomes superior to, those of radical nephrectomy for stage T1 renal masses.1 It is now recommended as the gold standard.2 Various vascular control methods have been described to achieve a bloodless field for tumour dissection and excision, the traditional method being renal hilar clamping (HC), which results in global renal ischaemia. The importance of minimising ischaemia in NSS to preserve postoperative renal function is well documented.3 While HC is less technically demanding, zero-ischaemia NSS with selective segmental arterial clamping requires high levels of surgical and anaesthetic expertise.1 Furthermore, not all renal masses are amenable to the technique, such as peripherally located tumours without an identifiable feeding segmental artery on preoperative imaging. In such cases, attaining regional renal ischaemia is a feasible way of maintaining a clear operative field while reducing ischaemic insult to the renal parenchyma, and should theoretically better preserve postoperative renal function. Different methods of achieving regional ischaemia have been reported, including manual compression and various parenchymal clamps.4 We describe a method of selective renal parenchymal clamping (SRPC) with a kidney clamp that can be adopted in open, conventional laparoscopic, or robot-assisted laparoscopic NSS. In this study, we report the case selection, feasibility, and surgical outcomes of our initial series of the SRPC technique with respect to traditional HC NSS.
 
Methods
All patients who underwent NSS for renal tumour from January 2009 to March 2015 were retrospectively reviewed at a tertiary centre in Hong Kong. Since March 2012, selected patients have been prospectively recruited for SRPC after careful review of the computed tomographic imaging during a preoperative planning session. Eligible indications for renal parenchymal clamping included small tumour size, and peripherally located and exophytic tumour; hilar or centrally located tumours were unsuitable. All procedures including open, conventional, and robot-assisted laparoscopic transperitoneal or retroperitoneal approaches were included. Patients with NSS performed using selective segmental artery clamping technique were excluded from the current study. This study was done in accordance with the principles outlined in the Declaration of Helsinki.
 
The kidney clamp
The technique of SRPC has been in use at our unit since March 2012. The kidney clamp (Karl Storz, Tuttlingen, Germany) is a 29-cm long, 10-mm wide instrument. It consists of a 120-mm long distal snare made of nitinol, an outer sheath, and a handle with a ratchet (Fig 1). It is reusable and can be inserted through a 10-mm laparoscopic port. It can be used in laparoscopic, robot-assisted, or open NSS via both transperitoneal and retroperitoneal approaches. Tightening of the snare across the renal parenchyma surrounding the tumour occludes blood flow, with the ratchet preventing accidental loosening of the snare during dissection. The clamp can be released by rotating the handle 90 degrees. The fine ratchet mechanism allows easy fine-tuning of tightness on the parenchymal tissue according to the bleeding encountered during tumour excision, as the ratchet can be further tightened one gear tooth at a time. Furthermore, clamp release is gradual and the snare can easily be re-tightened, allowing further haemostatic procedures to be performed if bleeding is encountered on clamp release.
 

Figure 1. The design of the kidney clamp
 
The nephron-sparing surgery with selective renal parenchymal clamping
The partial nephrectomy procedure was carried out in a standardised way via an open, pure laparoscopic, or robot-assisted laparoscopic approach. Intra-operative ureteric cannulation and catheterisation were performed after general anaesthesia in selected patients whose tumours were considered by the operating surgeon to be closer to the collecting system. With the patient in a lateral bridged position, an open oblique loin or subcostal incision along the 12th rib tip was made, or laparoscopic ports were inserted in the standard manner according to the selected approach. After creation of the operative field and exposure of the kidney, Gerota’s fascia was incised and perirenal fat was dissected from the renal capsule. Hilar dissection was performed in all cases to prepare for HC in case excessive bleeding was encountered. The renal tumour was exposed, with its circumferential and deep margins confirmed by intra-operative ultrasonography. The perirenal fat was dissected adequately to allow positioning of the parenchymal clamp over the kidney at about 1 to 1.5 cm from the tumour edge, so as to achieve optimal vascular control while avoiding slipping of the clamp during repair of the parenchymal defect after tumour excision. In laparoscopic procedures, the clamp was inserted through a 10-mm assistant port directed towards the planned axis of clamping across the polar region (Fig 2). The hilar area and the ureter were always spared from clamping. The snare was then tightened gradually until bluish discolouration of the parenchyma was noted, and the tightness adjusted according to the degree of bleeding during tumour excision. Athermal excision of the tumour was performed. Breaching of the collecting system was checked in patients with retrograde stenting by slightly releasing the clamp and injecting methylene blue dye through the ureteric catheter, and any area of leakage was repaired with polydioxanone sutures. Renorrhaphy was performed with a barbed suture in a continuous manner, with reinforcement by sliding polymer clips. Fibrin glue was applied to aid haemostasis in selected patients. The kidney clamp was loosened once major oozing had stopped but kept in place to provide stability for the renorrhaphy before its final removal, such that regional ischaemic time could be minimised.
 

Figure 2. Intra-operative photos showing application of the parenchymal clamp in a robot-assisted laparoscopic partial nephrectomy
(a) Parenchymal clamp being placed across the polar region with kidney tumour (circled in yellow) dissected free from perirenal fat; (b) excision of tumour (circled in yellow) in a bloodless field with parenchymal clamp tightened; (c) renorrhaphy with barbed suture; (d) completion of renorrhaphy with fibrin glue and parenchymal clamp released
 
Data collection and analysis
Patient demographics (age, gender, Charlson Comorbidity Index score, baseline renal function, co-existing diabetes or hypertension), tumour characteristics (radiological maximum tumour diameter, PADUA [preoperative aspects and dimensions used for an anatomical classification] nephrometry score), intra-operative data (operating time, ischaemic time, estimated blood loss), and postoperative outcomes (complications, hospital stay, renal function) were assessed for all patients. Estimated glomerular filtration rate (eGFR), calculated using the Modification of Diet in Renal Disease study equation, was used to measure postoperative renal function.5 Acute kidney injury was defined as either a two-fold increase in serum creatinine or a 50% reduction in eGFR during the postoperative hospital stay compared with the preoperative baseline, or any requirement for renal replacement therapy. Renal function at 7, 30, 60, and 90 days after operation was assessed. The PADUA nephrometry score6 was used as an objective measure of tumour characteristics and its individual parameters were also analysed. Complications were graded according to the Clavien-Dindo classification system.7 The proportion of high-grade complications (grades 3-5) was reported.
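The outcome definitions above can be expressed concretely. The sketch below is a minimal illustration, assuming the commonly used 4-variable MDRD re-expression (constant 175, creatinine in mg/dL, no ethnicity coefficient); the paper does not state which form of the equation or which units were used, so the numbers are illustrative only.

```python
# Hedged sketch: eGFR by the 4-variable MDRD study equation and the acute
# kidney injury (AKI) criteria defined in this paper. The MDRD constant (175
# vs 186) and creatinine units are assumptions, not taken from the paper.

def mdrd_egfr(scr_mg_dl: float, age_years: float, female: bool) -> float:
    """eGFR in mL/min/1.73 m^2 (4-variable MDRD, IDMS-traceable creatinine)."""
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    return egfr

def has_aki(baseline_scr: float, postop_scr: float,
            baseline_egfr: float, postop_egfr: float,
            needs_rrt: bool) -> bool:
    """AKI as defined in the paper: two-fold rise in creatinine, or 50% fall
    in eGFR versus the preoperative baseline, or any renal replacement therapy."""
    return (postop_scr >= 2.0 * baseline_scr
            or postop_egfr <= 0.5 * baseline_egfr
            or needs_rrt)

# Hypothetical patient: creatinine rises from 1.0 to 1.6 mg/dL after surgery
pre = mdrd_egfr(1.0, 65, female=False)
post = mdrd_egfr(1.6, 65, female=False)
print(round(pre, 1), round(post, 1), has_aki(1.0, 1.6, pre, post, needs_rrt=False))
```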
 
Clinical data were compared between the SRPC series and the larger cohort of conventional HC cases. Continuous variables were compared using the Mann-Whitney U test, while categorical variables were compared using the Chi squared test or Fisher’s exact test. Statistical significance was defined as P<0.05. Analysis was performed using the Statistical Package for the Social Sciences (Windows version 20.0; SPSS Inc, Chicago [IL], US).
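For readers without SPSS, the same group comparisons can be reproduced in open-source software. The sketch below uses SciPy; the arrays and the 2x2 table are hypothetical placeholders, not the study data.

```python
# Hedged sketch of the group comparisons described above, using SciPy instead
# of SPSS. All numbers here are invented for illustration.
import numpy as np
from scipy import stats

srpc_op_time = np.array([150, 180, 200, 220, 250])      # minutes, hypothetical
hc_op_time = np.array([250, 280, 300, 320, 360, 400])    # minutes, hypothetical

# Continuous variable: Mann-Whitney U test (two-sided)
u_stat, p_cont = stats.mannwhitneyu(srpc_op_time, hc_op_time, alternative="two-sided")

# Categorical variable: Fisher's exact test on a 2x2 table,
# e.g. [complication, no complication] x [SRPC, HC] (counts hypothetical)
odds_ratio, p_cat = stats.fisher_exact([[1, 19], [8, 64]])

print(f"Mann-Whitney U P = {p_cont:.3f}, Fisher exact P = {p_cat:.3f}")
```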
 
Results
From January 2009 to March 2015, a total of 93 patients were identified. After excluding one patient who underwent upper pole nephrectomy for an infected upper moiety in a duplex system, 92 patients who underwent NSS with either conventional HC (n=72) or SRPC (n=20) were included for analysis. The mean follow-up duration was 27 and 37 months for the SRPC and HC groups, respectively. Patient demographics and co-morbidity were similar between the two groups, as was baseline renal function (mean ± standard deviation of eGFR: 79.1 ± 25.9 mL/min vs 76.1 ± 26.5 mL/min; P=0.751; Table 1). Regarding tumour complexity (Table 2), there was no difference in mean radiological tumour size (26.1 ± 10.6 mm vs 30.6 ± 15.2 mm; P=0.371), while the mean PADUA nephrometry score was significantly lower in the SRPC group (7.07 vs 8.34; P=0.002). Tumours in the SRPC series were all exophytic to varying extents and were significantly more likely to be exophytic (P=0.032), laterally located (P=0.015), and located in the polar region (P=0.016). Apparently more NSS procedures in the SRPC group were performed by a laparoscopic approach (with or without robot assistance), although the difference did not reach statistical significance compared with the HC group (Table 3). Operating time was significantly shorter (207 ± 72 mins vs 306 ± 80 mins; P<0.001) and estimated blood loss was lower (205 ± 191 mL vs 331 ± 275 mL; P=0.045) with SRPC. No open conversion was needed in the minimally invasive approaches. The mean length of hospital stay was 6.6 days and the rate of complications of Clavien-Dindo grade 3 or above was 5% (n=1) in the SRPC group.
 

Table 1. Patient demographics and preoperative renal function
 

Table 2. Tumour characteristics
 

Table 3. Surgical outcomes and complications
 
Overall postoperative renal function was satisfactory in both groups and the changes between preoperative and postoperative eGFR are shown in Table 4. Cold and warm ischaemia were adopted in 50 (69.4%) and 22 (30.6%) patients in the HC group, respectively. The mean clamping time for SRPC was 20 minutes. The reduction in eGFR was significantly greater for HC at postoperative day 60 for both cold (P=0.006) and warm ischaemia (P=0.016) when compared with SRPC. No acute kidney injury occurred during the early postoperative period after parenchymal clamping, whereas there were seven (9.7%) cases of acute kidney injury in the HC group.
 

Table 4. Renal function outcomes
 
Overall, 70% of tumours in the SRPC group were renal cell carcinoma. All were pathological T1 disease, with 92.9% at stage T1a. The mean pathological tumour size was 26.8 mm. No patient in the SRPC group had a positive surgical margin or developed local recurrence or metastasis.
 
Discussion
This was a feasibility and safety study that showed promising initial results for regional ischaemia achieved by SRPC using an adjustable kidney clamp. The concept of regional ischaemia is not new and different instruments have been used successfully in other centres for partial nephrectomy. It was first described by Semb8 in 1956 using manual compression. A self-made clamp fashioned from two remodelled malleable retractors4 and the use of a Rumel tourniquet9 have also been reported. More recently, open10 and laparoscopic11 Satinsky vascular clamps have been used. A Nussbaum clamp, normally intended for intestinal clamping in general surgery, was used by Simon et al12 in 2008 for open partial nephrectomy. They later modified this into the laparoscopic Simon clamp with a ratchet mechanism similar to ours.13 It consists of a pair of 100-mm long jaws, one straight and one curved. Blood loss was minimal and no complications occurred in three cases. Recently, the first study comparing parenchymal clamping with HC for robot-assisted laparoscopic partial nephrectomy was published.14 It showed that parenchymal clamping was associated with a shorter operating time and better preserved immediate postoperative renal function. Our method of SRPC using an adjustable kidney clamp has the merit of allowing flexible control with different degrees of tightness on the parenchyma during tumour excision, collecting system repair, and renorrhaphy. The degree of regional ischaemia can thus be minimised. Furthermore, the clamp serves as a mount for controlling the kidney’s position during the procedure, mimicking the surgeon’s hand directly holding the kidney and keeping it in a stable position during tumour excision and renorrhaphy, which is particularly useful in a laparoscopic setting.
 
Tumour characteristics
Utilisation of the kidney clamp is feasible for tumours with certain characteristics. As illustrated in Table 2, favourable tumour features include lateral location, polar location, and exophytic growth. Use in tumours located near the hilum, mid pole, or medial side is generally contra-indicated.13 Recently, various nephrometry scores have been increasingly used to describe tumour complexity in partial nephrectomy.6 The PADUA nephrometry score takes several variables into account, including the longitudinal location, the rim location, the relationship with the renal sinus and collecting system, the percentage of tumour extending into the kidney, the anterior or posterior location, and the tumour diameter. A higher score is associated with a greater risk of complications,6 a finding that has been externally validated in another study.15 The PADUA nephrometry score in the SRPC group was significantly lower in this study. This suggests that these tumours were technically less challenging, and also reflects a limitation of the clamp: because the hilum and ureter must be spared during clamp application, not all tumour locations are feasible. Nonetheless, in selected cases SRPC offers an easy and safe way of performing NSS. Currently no particular cut-off value of the nephrometry score is used to determine the method of vascular control. Future studies are needed to define the selection criteria under which SRPC is feasible and can be performed safely. This would be more objective and could facilitate widespread adoption of the technique.
 
Versatility in surgical approaches
The kidney clamp is reusable and can be used regardless of the surgical approach. In the current study, seven patients had open surgery, nine had robot-assisted laparoscopic surgery, and four had pure laparoscopic surgery. In our experience, no adjustment to its application is required, regardless of surgical approach. This versatility allows the surgeon to be flexible when deciding the surgical approach.
 
Avoidance of global renal ischaemia
Another obvious benefit of the kidney clamp was the avoidance of whole kidney ischaemia.16 This was especially true for laparoscopic or robot-assisted cases in which only warm ischaemia was possible. While cold ischaemia can be tolerated for up to 3 hours, warm ischaemia is classically limited to 30 minutes.17 This is undoubtedly one of the important stresses for the surgeon during partial nephrectomy. Recent studies have shown that every minute of ischaemia can have a significant impact on postoperative renal function: longer warm ischaemia is associated with acute kidney failure, with an odds ratio of 1.05 for each 1-minute increase.3 Regional ischaemia spares most of the non–tumour-bearing area from ischaemic injury. An animal study reported that changes in serum creatinine and intra-operative oxygenation profiles were improved with parenchymal clamping or partial renal artery clamping compared with complete renal artery clamping.18 In this study, early postoperative renal function was satisfactory after parenchymal clamping with minimal change in eGFR. No patient experienced acute kidney injury during the early postoperative period. Postoperative renal function was mostly comparable between the groups (Table 4). There was significantly greater deterioration in eGFR for HC (P=0.006 and 0.016 for cold and warm ischaemia, respectively) at postoperative day 60 when compared with SRPC. This benefit of SRPC, however, did not translate into a long-term improvement in renal function, as shown by the 90-day results. The significance of this apparent transient benefit is not clear and might have been confounded by different factors in this retrospective study. With more cases and longer follow-up, we believe the renal parenchymal clamp will be shown to better preserve renal function after partial nephrectomy in the long term.
 
Safety
There was no slippage or accidental loosening of the parenchymal clamp during tumour dissection and renorrhaphy. No case required additional hilar control. In terms of oncological control, parenchymal clamping did not lead to a higher rate of positive surgical margins, local recurrence, or metastasis. On the contrary, we expect these rates to be lower, as the parenchymal clamp allows more comfortable tumour dissection with less time constraint. This may improve the quality of dissection and lead to fewer positive surgical margins and better tumour control as the surgeon becomes more experienced.
 
Surgical outcomes
The mean operating time was reasonably short at 207 minutes. This was mainly attributable to the time saved by avoiding the tedious and sometimes risky hilar dissection.19 Renal cooling with ice was also unnecessary, further reducing the overall operating time. The lower estimated blood loss could be explained by the avoidance of HC, as vascular injury during HC can lead to profuse bleeding10 or even renal artery dissection.
 
The findings of a shortened operating time and lower estimated blood loss need to be interpreted with caution in view of several limitations of our study. First, it was a retrospective study and the sample size for SRPC was small. Moreover, parenchymal clamping was performed for less complex tumours. This difference in complexity might have contributed to the smaller blood loss and shorter operating time, as well as to the preservation of renal function, as a smaller volume of renal parenchyma was removed. Another significant limitation was the lack of volumetric analysis, which made comparison of renal function between the two groups difficult. A randomised, prospective study is required to truly compare the two methods once more clinical experience has been accumulated.
 
Road to zero ischaemia
Recently, partial nephrectomy with zero ischaemia has been reported, combining the use of selective arterial clamping and controlled hypotension.20 Outcomes were favourable, with a mean absolute and percentage change between preoperative and 4-month postoperative eGFR of -11.4 mL/min/1.73 m2 and 13%, respectively. Nonetheless, this technique is technically demanding and has a steep learning curve. Its use is also limited to robot-assisted or laparoscopic surgery, as a magnified view is essential for the meticulous vascular dissection. By contrast, the kidney clamp in our study provides a relatively simpler way to perform partial nephrectomy without HC, with a reasonable postoperative renal function outcome. This kidney clamp is undoubtedly an important addition to the surgical armamentarium in the evolution of partial nephrectomy towards ultimately zero ischaemia.
 
Conclusions
Partial nephrectomy using parenchymal clamping as a means of vascular control is safe and feasible in selected cases with peripherally located and exophytic tumours. It could be used in various surgical approaches to achieve regional ischaemia. The postoperative renal function and oncological control in this initial experience were satisfactory.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Uzzo RG, Novick AC. Nephron sparing surgery for renal tumors: indications, techniques and outcomes. J Urol 2001;166:6-18.
2. Campbell SC, Novick AC, Belldegrun A, et al. Guideline for management of the clinical T1 renal mass. J Urol 2009;182:1271-9.
3. Thompson RH, Lane BR, Lohse CM, et al. Every minute counts when the renal hilum is clamped during partial nephrectomy. Eur Urol 2010;58:340-5.
4. Selikowitz SM. A simple partial nephrectomy clamp. J Urol 1995;154:489-90.
5. Levey AS, Bosch JP, Lewis JB, Greene T, Rogers N, Roth D. A more accurate method to estimate glomerular filtration rate from serum creatinine: a new prediction equation. Modification of Diet in Renal Disease Study Group. Ann Intern Med 1999;130:461-70.
6. Ficarra V, Novara G, Secco S, et al. Preoperative aspects and dimensions used for an anatomical (PADUA) classification of renal tumours in patients who are candidates for nephron-sparing surgery. Eur Urol 2009;56:786-93.
7. Clavien PA, Barkun J, de Oliveira ML, et al. The Clavien-Dindo classification of surgical complications: five-year experience. Ann Surg 2009;250:187-96.
8. Semb C. Partial resection of the kidney: anatomical, physiological and clinical aspects. Ann R Coll Surg Engl 1956;19:137-55.
9. Gill IS, Munch LC, Clayman RV, McRoberts JW, Nickless B, Roemer FD. A new renal tourniquet for open and laparoscopic partial nephrectomy. J Urol 1995;154:1113-6.
10. Denardi F, Borges GM, Silva W Jr, et al. Nephron-sparing surgery for renal tumours using selective renal parenchymal clamping. BJU Int 2005;96:1036-9.
11. Verhoest G, Manunta A, Bensalah K, et al. Laparoscopic partial nephrectomy with clamping of the renal parenchyma: initial experience. Eur Urol 2007;52:1340-6.
12. Simon J, dePetriconi R, Rinnab L, Hautmann RE, Kurtz F. Optimizing selective renal clamping in nephron-sparing surgery using the Nussbaum clamp. Urology 2008;71:1196-8.
13. Simon J, Bartsch G Jr, Finter F, Hautmann R, de Petriconi R. Laparoscopic partial nephrectomy with selective control of the renal parenchyma: initial experience with a novel laparoscopic clamp. BJU Int 2009;103:805-8.
14. Hsi RS, Macleod LC, Gore JL, Wright JL, Harper JD. Comparison of selective parenchymal clamping to hilar clamping during robotic-assisted laparoscopic partial nephrectomy. Urology 2014;83:339-44.
15. Tyritzis SI, Papadoukakis S, Katafigiotis I, et al. Implementation and external validation of Preoperative Aspects and Dimensions Used for an Anatomical (PADUA) score for predicting complications in 74 consecutive partial nephrectomies. BJU Int 2012;109:1813-8.
16. George AK, Herati AS, Srinivasan AK, et al. Perioperative outcomes of off-clamp vs complete hilar control laparoscopic partial nephrectomy. BJU Int 2013;111:E235-41.
17. Novick AC. Renal hypothermia: in vivo and ex vivo. Urol Clin North Am 1983;10:637-44.
18. Raman JD, Bensalah K, Bagrodia A, et al. Comparison of tissue oxygenation profiles using 3 different methods of vascular control during porcine partial nephrectomy. Urology 2009;74:926-31.
19. Mejean A, Vogt B, Cazin S, Balian C, Poisson JF, Dufour B. Nephron sparing surgery for renal cell carcinoma using selective renal parenchymal clamping. J Urol 2002;167:234-5.
20. Gill IS, Patil MB, Abreu AL, et al. Zero ischemia anatomical partial nephrectomy: a novel approach. J Urol 2012;187:807-14.

Sperm retrieval rate and pregnancy rate in infertile couples undergoing in-vitro fertilisation and testicular sperm extraction for non-obstructive azoospermia in Hong Kong

Hong Kong Med J 2016 Dec;22(6):556–62 | Epub 30 Sep 2016
DOI: 10.12809/hkmj154710
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Sperm retrieval rate and pregnancy rate in infertile couples undergoing in-vitro fertilisation and testicular sperm extraction for non-obstructive azoospermia in Hong Kong
Jennifer KY Ko, FHKAM (Obstetrics and Gynaecology)1; Joyce Chai, FHKAM (Obstetrics and Gynaecology)1; Vivian CY Lee, FHKAM (Obstetrics and Gynaecology)1; Raymond HW Li, FRCOG, FHKAM (Obstetrics and Gynaecology)1; Estella Lau, PhD1; KL Ho, FRCSEd (Urol), FHKAM (Surgery)2,3; PC Tam, FRCSEd (Urol), FHKAM (Surgery)2,3; William SB Yeung, PhD1; PC Ho, MD1; Ernest HY Ng, MD1
1 Department of Obstetrics and Gynaecology, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
2 Division of Urology, Department of Surgery, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
3 Private practice
 
Corresponding author: Dr Jennifer KY Ko (jenko@hku.hk)
 
 
 
Abstract
Objective: There are currently no local data on the sperm retrieval and pregnancy rates in in-vitro fertilisation and testicular sperm extraction cycles, especially with regard to the presence of genetic abnormalities. This study aimed to determine the sperm retrieval and pregnancy rates in infertile couples who underwent in-vitro fertilisation and testicular sperm extraction for non-obstructive azoospermia.
 
Methods: This retrospective case series was conducted at a tertiary assisted reproduction unit in Hong Kong. Men with non-obstructive azoospermia who underwent in-vitro fertilisation and testicular sperm extraction between January 2001 and December 2013 were included. The main outcome measures were sperm retrieval and pregnancy rates.
 
Results: During the study period, 89 men with non-obstructive azoospermia underwent in-vitro fertilisation and testicular sperm extraction. Sperm was successfully retrieved in 40 (44.9%) men. There was no statistically significant difference in the sperm retrieval rate of those with karyotypic abnormalities (2/5, 40.0% vs 28/61, 45.9%; P=1.000) and AZFc microdeletion (3/6, 50.0% vs 28/61, 45.9%; P=1.000) compared with those without. Sperms were successfully retrieved in patients who had mosaic Klinefelter syndrome (2/3, 66.7%) but not in the patient with non-mosaic Klinefelter syndrome. No sperms were found in men with AZFa or AZFb microdeletions. Pregnancy test was positive in 15 (16.9%) patients and the clinical pregnancy rate was 13.5% (12/89) per cycle. The clinical pregnancy rate per transfer was 34.3% (12/35).
 
Conclusions: The sperm retrieval rate and clinical pregnancy rate per initiated cycle in men undergoing in-vitro fertilisation and testicular sperm extraction in our unit were 44.9% and 13.5%, respectively. No sperms could be retrieved in the presence of AZFa and AZFb microdeletions, but karyotype and AZFc microdeletion abnormalities otherwise did not predict the success of sperm retrieval in couples undergoing in-vitro fertilisation and testicular sperm extraction. Genetic tests are important prior to testicular sperm extraction for patient selection and genetic counselling.
 
 
New knowledge added by this study
  • Our study provides important local data for counselling of men with non-obstructive azoospermia. The sperm retrieval rate and clinical pregnancy rate per cycle in men undergoing in-vitro fertilisation and testicular sperm extraction in our unit were 44.9% and 13.5%, respectively.
  • There was no statistically significant difference in the sperm retrieval and pregnancy rates in those with karyotypic abnormalities and AZFc microdeletion compared with those without. Sperms, however, were not found in men with AZFa or AZFb microdeletions.
Implications for clinical practice or policy
  • Although karyotype abnormalities and AZFc microdeletion did not affect the sperm retrieval and pregnancy rates in couples undergoing in-vitro fertilisation and testicular sperm extraction, karyotype and Y-microdeletion should be checked in men with non-obstructive azoospermia. The risk of vertical transmission of genetic abnormalities should be discussed and couples should be offered appropriate genetic counselling before treatment.
 
 
Introduction
Male-factor infertility is involved in about half of all infertile couples who seek assisted reproduction treatment.1 The advent of in-vitro fertilisation (IVF) with intracytoplasmic sperm injection (ICSI) has allowed many men with severe male factor infertility to have their own genetic child.2 The development of surgical sperm retrieval by testicular sperm extraction (TESE) has extended the possibility of fatherhood to those with non-obstructive azoospermia (NOA).3
 
The reported sperm retrieval rate from TESE varies in different studies due to inclusion of different populations, but is generally quoted to be around 50%.4 Nevertheless TESE is invasive. In a recent retrospective cohort study by Vloeberghs et al,5 only one (14.3%) out of seven men undergoing IVF-TESE eventually became a biological father. Studies have also shown a high prevalence of chromosomal abnormalities and Y-microdeletion in infertile men with NOA or severe oligozoospermia.6 7 8 9 These genetic abnormalities can potentially be transmitted vertically resulting in a child with sex aneuploidies or boys with the same Y-microdeletion.10 Guidelines from the American Society for Reproductive Medicine, American Urological Association, Canadian Urological Association, and International Federation of Fertility Societies are unanimous in suggesting that all men with NOA due to testicular failure should be offered genetic testing to exclude chromosomal abnormalities and Y chromosome microdeletions, and that genetic counselling be offered if an abnormality is detected.11 12 13 14
 
There are currently no local data on the sperm retrieval and pregnancy rates in IVF-TESE cycles, especially with regard to the presence of genetic abnormalities. Such information is invaluable to couples in pretreatment counselling and could affect their decision about treatment options. In this study, we determined the sperm retrieval and pregnancy rates in infertile couples who underwent IVF-TESE for NOA.
 
Methods
The study was a retrospective analysis of couples who underwent the first IVF cycle and required TESE for NOA at Queen Mary Hospital, a university affiliated tertiary care hospital, from January 2001 to December 2013. They were identified from our assisted reproductive technique database. Ethics approval was obtained from the Institutional Review Board of the University of Hong Kong / Hospital Authority Hong Kong West Cluster, with the requirement of patient informed consent waived because of its retrospective nature.
 
Husbands who attended our subfertility clinic were requested to submit one semen sample to the andrology laboratory of our centre prior to the first consultation. If the first semen analysis was abnormal, they were asked to submit a second semen sample. Semen analysis was based on the criteria of the World Health Organization (1999, 2010).15 16 Those with azoospermia confirmed in two semen samples following centrifugation were referred to the male infertility clinic for further assessment by urologists. All patients underwent detailed urological and reproductive history taking, physical examination, and hormonal profiling including morning serum testosterone, follicle-stimulating hormone (FSH), and luteinising hormone.17 Men with azoospermia deemed to be due to a non-obstructive cause, as suggested by raised FSH and small testes, were advised to undergo karyotyping and testing for microdeletion of the Y chromosome. The detailed techniques for chromosome analysis and Y-microdeletion studies by polymerase chain reaction of DNA from peripheral blood have been previously described.6 9 18 Karyotyping was performed by analysis of banded metaphase chromosomes from cultured cells. At least 15 and 30 metaphases were analysed routinely and whenever an anomaly was suspected, respectively.6 Y-microdeletion was analysed using six Y chromosome specific sequence tagged site markers that corresponded to the AZFa (sY84, sY86), AZFb (sY127, sY132), and AZFc (sY254, sY255) regions.6 9 18 Information on karyotype and Y-microdeletion was obtained from the medical record of the patients, and cross-referenced with the database of the genetic screening for male subfertility at Tsan Yuk Hospital.
 
Men with NOA who wished to have their own genetic child were advised to undergo TESE to retrieve sperms. The most common ovarian stimulation protocols used in our unit were the long gonadotropin-releasing hormone (GnRH) agonist and GnRH antagonist protocols. Details of the stimulation cycle have been described previously.19 Patients attended the clinic on day 2 of the treatment cycle. Transvaginal ultrasonography was performed to determine the antral follicle count (AFC) and the serum oestradiol level was checked. Ovarian stimulation was commenced if the serum oestradiol level was confirmed to be at basal level and there was no ovarian cyst. The gonadotropin dosage depended on the woman’s age, AFC, and previous ovarian response. Transvaginal ultrasound for follicular tracking was performed 7 days after the start of ovarian stimulation and every 1 to 3 days thereafter. The gonadotropin dosage was then titrated according to the ovarian response during follicular tracking. Human chorionic gonadotropin (hCG) was given to trigger final oocyte maturation when the mean diameter of the leading follicle was at least 18 mm and three or more follicles reached a mean diameter of at least 16 mm. Transvaginal ultrasound-guided oocyte retrieval was performed 34 to 36 hours later.
 
TESE was performed by the urologist under general anaesthesia on the day of oocyte retrieval. In conventional TESE, the scrotal skin and tunica vaginalis were opened and the testis was exposed through an incision. Bilateral testicular biopsies from the upper, middle, and lower poles were performed. Microdissection TESE, which involves identification of spermatogenically active areas under high magnification for more targeted biopsies, was performed from 2010 onwards. The biopsied testicular tissue was minced mechanically with two microscope slides and incubated for 1 hour to allow the sperms to swim out of the tissue. Enzymatic digestion of the testicular tissue was performed as previously described when no sperms were seen under the microscope.20 The testicular sperms were isolated by mini-density gradient centrifugation. Sperms with high density at the bottom of the gradient as well as those at the interface between the gradients were collected in two aliquots of culture medium. Before ICSI, the embryologist searched for sperms first in the whole aliquot containing sperms of high density, and then in the aliquot containing sperms from the interface if no sperm was found in the first aliquot. Spare sperms were cryopreserved. Fertilisation was achieved by ICSI in patients with successful sperm retrieval. One or two embryos were replaced 2 days after retrieval. Luteal phase support was given with either two doses of hCG 1500 IU 5 days apart or vaginal progesterone for 2 weeks after embryo transfer. Patients were followed up with a urinary pregnancy test 16 days after embryo transfer; those with a positive pregnancy test had a transvaginal ultrasound scan performed 10 to 14 days later and were referred for antenatal care at 8 to 10 weeks of gestation. Pregnancy outcome was monitored. Pregnancy was defined as a positive urinary pregnancy test. A clinical pregnancy was defined as a pregnancy with the presence of one or more intrauterine sacs on transvaginal ultrasound. An ongoing pregnancy was defined as the presence of at least one fetal heart pulsation on ultrasound beyond 20 weeks. When no sperms were retrieved, the collected oocytes were discarded, donated, or frozen if further treatment with donor sperm was being considered.
 
Data analyses were performed using the Statistical Package for the Social Sciences (Windows version 20.0; SPSS Inc, Chicago [IL], US). Comparisons between groups were made using Fisher’s exact test, and P<0.05 was considered statistically significant.
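As a minimal sketch of this analysis outside SPSS, the comparison of sperm retrieval rates between men with and without karyotypic abnormalities (2/5 vs 28/61, as reported in the Results) can be run with SciPy; the two-sided Fisher's exact test should approximately reproduce the reported P value.

```python
# Hedged sketch: Fisher's exact test as used for the group comparisons above,
# computed with SciPy. The 2x2 counts are taken from the Results section
# (sperm retrieved vs not, in men with vs without karyotypic abnormality).
from scipy.stats import fisher_exact

table = [[2, 3],    # karyotypic abnormality: retrieved, not retrieved
         [28, 33]]  # normal karyotype:       retrieved, not retrieved
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Fisher's exact test: OR = {odds_ratio:.2f}, P = {p_value:.3f}")
```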
 
Results
Only information from the first cycle of IVF-TESE was included. Of 112 men who underwent TESE during the study period, 23 were excluded from analysis because of non-motile sperms in the ejaculate (n=3), ejaculatory dysfunction (n=4), or obstructive azoospermia with TESE performed after failed microepididymal sperm aspiration (n=16). Therefore, 89 patients with NOA were included in the analysis.
 
The mean age of the men was 37.2 (standard deviation [SD], 6.2) years. Of the 85 patients with smoking history available, 59 (69.4%) were non-smokers and 26 (30.6%) were smokers. Genetic information was missing for nine men. Of the 76 men with genetic information available, eight (10.5%) had a karyotypic abnormality—six sex chromosomal and two autosomal, as shown in Table 1. The most common sex chromosomal abnormality was Klinefelter syndrome (3/6, 50%; 1 non-mosaic and 2 mosaic). The two men with an autosomal chromosome abnormality had a ring chromosome 21 and a mosaic supernumerary marker chromosome, respectively. Of these 76 men, nine (11.8%) had microdeletion of the Y chromosome. The most common Y-microdeletion was AZFc microdeletion (6/9, 66.7%). Three men had both a chromosomal abnormality and Y-microdeletion.
 

Table 1. Success of TESE and pregnancy outcome in patients with genetic abnormality
 
Of the 89 patients, sperms were successfully retrieved in 40 (44.9%). There was no statistically significant difference in the sperm retrieval rate between those with and without karyotype abnormalities or AZFc microdeletion (Table 2). The same was true when those with a sex chromosomal abnormality were compared with those with a normal karyotype. Sperms were retrieved in the two patients with mosaic Klinefelter syndrome but not in the one with non-mosaic Klinefelter syndrome. For those with Y-microdeletion, sperms were retrieved in 50% (3/6) of those with AZFc microdeletion, but no sperms were found in the men with AZFa+b+c or AZFb+c microdeletions. The men with AZFa+b+c and AZFb+c microdeletions all had co-existing sex chromosomal abnormalities (Table 1).
 

Table 2. The relationship between Y-microdeletion and karyotypic abnormalities and success of testicular sperm extraction
 
The mean age of the female partners was 37.2 (SD, 6.2) years. The median dosage of gonadotropins used was 1950 IU (interquartile range, 1650-2550 IU), duration of stimulation 12 days (11-14 days), peak oestradiol level 11 341 pmol/L (7859.5-19 260.5 pmol/L), and the number of metaphase II oocytes obtained was 8 (4-13). Pregnancy test was positive in 15/89 (16.9%) and the clinical pregnancy rate was 12/89 (13.5%). Among those with sperms found, pregnancy test was positive in 15/40 (37.5%) and the clinical pregnancy rate was 12/40 (30.0%). Clinical pregnancy rate per transfer was 12/35 (34.3%). Two patients had biochemical pregnancy and one had ectopic pregnancy. There were 10 live births—seven singletons and three pairs of twins. One patient underwent second-trimester medical termination of pregnancy for fetal alobar holoprosencephaly. One had an ongoing pregnancy at 12 weeks but was subsequently lost to follow-up. The ongoing pregnancy rate per cycle was 10/89 (11.2%). The clinical pregnancy rate was not statistically different between those who had karyotypic abnormalities (0/5, 0% vs 8/61, 13.1%; P=1.000) and Y-microdeletion (1/6, 16.7% vs 8/61, 13.1%; P=1.000) compared with those who did not. Among the five patients who did not have embryo transfer, two had failed fertilisation, two had no transferrable embryos, and one had no oocyte retrieved.
 
Discussion
Ho et al17 highlighted the importance of the male factor in infertility assessment and treatment in a recent case series from a local male infertility clinic. The incidence of azoospermia was reported to be up to 36.2%, of which 52.1% had a non-obstructive cause and would require TESE.17 With increasing awareness of male infertility, it is hoped that more men will seek professional help and share the burden of fertility treatment. Nevertheless, existing data in the literature on outcomes of IVF-TESE are fragmentary5 and local data are still lacking.
 
Karyotypic abnormalities and Y-microdeletion have been associated with severe male factor infertility. Fu et al21 demonstrated high rates of chromosomal abnormalities and Y chromosome microdeletions in Chinese infertile men with azoospermia or severe oligozoospermia. In another local study from our centre, the prevalence of chromosomal abnormality and Y-microdeletion was up to 21.1% and 8.5%, respectively, in the azoospermic group.6 The prevalence of sex chromosomal abnormality and Y-microdeletion in the present study was 6/76 (7.9%) and 9/76 (11.8%), respectively; the former was lower than that reported by our centre previously. This may be because we included only men who underwent TESE in the current study, and it is possible that some men, particularly those found to have a genetic abnormality, did not pursue assisted reproductive treatment further in view of the anticipated poor prognosis.
 
Testing for chromosomal abnormalities and Y-microdeletion is recommended as an essential part of the workup of men with NOA or severe oligospermia by the American Society for Reproductive Medicine,13 American Urological Association,12 Canadian Urological Association,11 and International Federation of Fertility Societies guidelines.14 In particular, microdeletion of the AZFa or AZFb regions is associated with a poor prognosis for sperm retrieval, and no sperms have been retrieved in these patients.22 Although our study did not show any statistically significant difference in sperm retrieval rates between those with and without karyotype abnormalities or Y-microdeletion, it should be noted that, among men with Y-microdeletion, all those in whom sperms were retrieved had AZFc microdeletion, consistent with existing reports, whereas those with AZFa+b+c or AZFb+c microdeletions did not have sperms retrieved. Nonetheless, the three men with AZFa+b+c and AZFb+c microdeletions all had co-existing sex chromosomal abnormalities that may also have affected spermatogenesis. Men with AZFa or AZFb microdeletion are therefore often advised against TESE because of the very low chance of successful retrieval of mature spermatozoa. Given proper counselling, these patients may opt not to pursue further assisted reproductive techniques, and instead go directly for options including donor insemination or adoption. On the other hand, the majority of men with AZFc microdeletion have sperm available for use. In some studies, AZFc deletion was even associated with an increased likelihood of sperm retrieval.22 The sperm retrieval rate in men with Klinefelter syndrome via microdissection TESE has also been reported to be similar to or even higher than that in those with NOA and a normal karyotype.23 24 25 In our experience, sperms were found only in the two patients with mosaic Klinefelter syndrome but not in the one with non-mosaic Klinefelter syndrome. Previous studies have shown that sperms are more likely to be retrieved in younger men with Klinefelter syndrome.26 27 In our study, the man with non-mosaic Klinefelter syndrome was 42 years old and might have passed the window of opportunity for successful sperm retrieval. The two patients with mosaic Klinefelter syndrome were 34 and 47 years old, respectively.
 
This study provides important information on the prognosis for men with NOA. The sperm retrieval rate of 44.9% is very similar to a previous report from our centre in which sperms were found in 12 of 26 (46.2%) men.28 In that study, the pregnancy rate was 14.3% per cycle when spermatozoa were injected. Our study showed that the chance of a man with NOA undergoing TESE fathering his own child is 13.5%, very similar to the 13.4% reported by Vloeberghs et al.5 Indeed, Vloeberghs et al5 included only men with a normal karyotype and no Y-microdeletion, whereas men with genetic abnormalities were also included in our study, and we included only the first cycle of IVF-TESE. The clinical pregnancy rate would have been higher had we also included men who underwent further attempts. Nonetheless, men were generally advised against further TESE after failure to retrieve sperms in the first attempt owing to the poor prognosis.
 
Another important issue is the potential for the genetic abnormality to be transmitted vertically via assisted reproductive technology. Couples wherein the male partner has Y-microdeletion should be counselled about the potential for inheritance of compromised fertility by male offspring and proper genetic counselling should be in place before embarking on IVF-TESE.
 
The practice at our centre was synchronous TESE on the day of oocyte retrieval. Several studies have shown comparable fertilisation and pregnancy rates when TESE is performed beforehand, either on the day before oocyte retrieval29 or even prior to initiation of controlled ovarian hyperstimulation.30 The merit of the latter approach is that women do not have to go through controlled ovarian stimulation if no sperms can be retrieved, and can therefore avoid the risks of ovarian hyperstimulation syndrome and the costs involved. The available cryopreservation-thawing procedure for sperms leads to sperm loss, however, and there remains a possibility that the cycle will have to be cancelled if there are no viable sperms after thawing.
 
As about half of couples undergo IVF-TESE in vain, with resulting psychological and financial implications, research into factors that could predict successful sperm retrieval is important. Previous findings from our retrospective study did not suggest any significant differences in age, history of mumps or orchitis/oligozoospermia, volume of both testes, or serum FSH and testosterone levels between men with and without spermatozoa retrieved in IVF-TESE cycles.28 Indeed, no individual biochemical or hormonal marker has been found to reliably predict success in TESE.31 Although histopathology of testicular biopsy has been shown to predict TESE outcome, it is invasive and in many patients is done at the time of TESE itself rather than beforehand, limiting its role as a predictive marker.31 Detection of Y chromosome microdeletion, especially of AZFa and AZFb, is important to guide prognosis as discussed above. Ramasamy et al32 showed that a high serum FSH level did not affect the success of microdissection TESE and should not be used to deny men the possibility of fathering a child with their own genetic material.14 Similarly, testicular size may reflect poor spermatogenesis in general but does not consistently predict the sperm retrieval rate.31 In a meta-analysis, serum inhibin B had a sensitivity of 0.65 and a specificity of 0.83 in predicting successful TESE, but it remained suboptimal as a single predictive criterion.33 Seminal anti-Müllerian hormone and inhibin B are secreted by the Sertoli cells into the seminiferous tubules and are therefore in theory more direct markers of spermatogenesis, but their predictive value for successful TESE was not confirmed in a prospective study of 139 men with NOA by Mitchell et al.34 A combination of factors has been shown to fare better, and authors have described a predictive score involving total testicular volume, FSH, and inhibin B to predict the sperm retrieval rate in NOA.35
 
There are several limitations to our study. The number of men who declined genetic testing or TESE is not known. These men may have been those with an anticipated poor prognosis, such that our study may have included those with a ‘better’ prognosis and therefore a higher sperm retrieval rate. In addition, while some authors have suggested that the hormonal profile or testicular volume may provide prognostic information,35 36 our information on the hormonal profile of the men was incomplete. FSH might have been checked up to 2 years before TESE because of the long waiting list for IVF, and was therefore not analysed in this study. Other important limitations are the small number of cases and the retrospective nature of our study, which preclude proper statistical analysis. When pregnancy outcome is analysed, it is essential to remember the confounding variables related to the female partner, such as age and diagnostic category, that limit the conclusions that can be drawn. Moreover, as the study spanned 13 years, there were changes over time, such as in surgical technique: men included in the earlier years underwent conventional TESE whereas those treated more recently underwent microdissection TESE. Further studies looking into different prognostic factors to predict successful sperm retrieval are needed.
 
Conclusions
The sperm retrieval rate and clinical pregnancy rate per cycle in men undergoing IVF-ICSI-TESE in our unit were 44.9% and 13.5%, respectively. Karyotype and AZFc microdeletion abnormalities did not predict the success of sperm retrieval or clinical pregnancy rate in couples undergoing IVF-ICSI-TESE in the present case series, but are important in patient counselling. Consistent with existing literature, no sperms could be retrieved in individuals with AZFa and AZFb microdeletions.
 
Acknowledgements
We would like to thank Mr Tak-ming Cheung for data management, and the laboratory colleagues of Prenatal Diagnostic Laboratory at Tsan Yuk Hospital who have helped trace the karyotype and Y-microdeletion results.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Infertility diagnosis by age of patients receiving RT procedures (other than DI and AIH) in 2013. Council on Human Reproductive Technology. Available from: http://www.chrt.org.hk/english/publications/files/table17_2013.pdf. Accessed 6 Jul 2015.
2. Devroey P, Van Steirteghem A. A review of ten years experience of ICSI. Hum Reprod Update 2004;10:19-28.
3. Devroey P, Liu J, Nagy Z, et al. Pregnancies after testicular sperm extraction and intracytoplasmic sperm injection in non-obstructive azoospermia. Hum Reprod 1995;10:1457-60.
4. Donoso P, Tournaye H, Devroey P. Which is the best sperm retrieval technique for non-obstructive azoospermia? A systematic review. Hum Reprod Update 2007;13:539-49.
5. Vloeberghs V, Verheyen G, Haentjens P, Goossens A, Polyzos NP, Tournaye H. How successful is TESE-ICSI in couples with non-obstructive azoospermia? Hum Reprod 2015;30:1790-6.
6. Ng PP, Tang MH, Lau ET, et al. Chromosomal anomalies and Y-microdeletions among Chinese subfertile men in Hong Kong. Hong Kong Med J 2009;15:31-8.
7. Chiang HS, Wei HJ, Chen YT. Genetic screening for patients with azoospermia and severe oligo-asthenospermia. Int J Androl 2000;23 Suppl 2:20-5.
8. Chandley AC. Chromosome anomalies and Y chromosome microdeletions as causal factors in male infertility. Hum Reprod 1998;13 Suppl 1:45-50.
9. Tse JY, Yeung WS, Lau EY, Ng EH, So WW, Ho PC. Deletions within the azoospermia factor subregions of the Y chromosome in Hong Kong Chinese men with severe male-factor infertility: controlled clinical study. Hong Kong Med J 2000;6:143-6.
10. Lee SH, Ahn SY, Lee KW, Kwack K, Jun HS, Cha KY. Intracytoplasmic sperm injection may lead to vertical transmission, expansion, and de novo occurrence of Y-chromosome microdeletions in male fetuses. Fertil Steril 2006;85:1512-5.
11. Jarvi K, Lo K, Grober E, et al. The workup and management of azoospermic males. Can Urol Assoc J 2015;9:229-35.
12. Jarow J, Sigman M, Kolettis PN, et al. The evaluation of the azoospermic male: AUA Best Practice Statement (revised 7/22/11). US, Maryland: American Urological Association Education and Research, Inc; 2011.
13. Practice Committee of the American Society for Reproductive Medicine in collaboration with the Society for Male Reproduction and Urology. Evaluation of the azoospermic male. Fertil Steril 2008;90(5 Suppl):S74-7.
14. Standards and Practice Committee. International Federation of Fertility Societies. Global standards of infertility care. Standard 17. Investigation and management of non-obstructive azoospermia. Recommendations for practice. June 2014.
15. World Health Organization. WHO laboratory manual for the examination of human semen and sperm-cervical mucus interaction. 4th ed. Cambridge: Cambridge University Press; 1999.
16. Cooper TG, Noonan E, von Eckardstein S, et al. World Health Organization reference values for human semen characteristics. Hum Reprod Update 2010;16:231-45.
17. Ho KL, Tsu JH, Tam PC, Yiu MK. Disease spectrum and treatment patterns in a local male infertility clinic. Hong Kong Med J 2015;21:5-9.
18. Tse JY, Yeung WS, Ng EH, et al. A comparative study of Y chromosome microdeletions in infertile males from two Chinese populations. J Assist Reprod Genet 2002;19:376-83.
19. Li HW, Lee VC, Lau EY, Yeung WS, Ho PC, Ng EH. Role of baseline antral follicle count and anti-Mullerian hormone in prediction of cumulative live birth in the first in vitro fertilisation cycle: a retrospective cohort analysis. PLoS One 2013;8:e61095.
20. Crabbé E, Verheyen G, Silber S, et al. Enzymatic digestion of testicular tissue may rescue the intracytoplasmic sperm injection cycle in some patients with non-obstructive azoospermia. Hum Reprod 1998;13:2791-6.
21. Fu L, Xiong DK, Ding XP, et al. Genetic screening for chromosomal abnormalities and Y chromosome microdeletions in Chinese infertile men. J Assist Reprod Genet 2012;29:521-7.
22. Stahl PJ, Masson P, Mielnik A, Marean MB, Schlegel PN, Paduch DA. A decade of experience emphasizes that testing for Y microdeletions is essential in American men with azoospermia and severe oligozoospermia. Fertil Steril 2010;94:1753-6.
23. Bakircioglu ME, Ulug U, Erden HF, et al. Klinefelter syndrome: does it confer a bad prognosis in treatment of nonobstructive azoospermia? Fertil Steril 2011;95:1696-9.
24. Sabbaghian M, Modarresi T, Hosseinifar H, et al. Comparison of sperm retrieval and intracytoplasmic sperm injection outcome in patients with and without Klinefelter syndrome. Urology 2014;83:107-10.
25. Ozveri H, Kayabasoglu F, Demirel C, Donmez E. Outcomes of micro-dissection TESE in patients with non-mosaic Klinefelter’s syndrome without hormonal treatment. Int J Fertil Steril 2015;8:421-8.
26. Rohayem J, Fricke R, Czeloth K, et al. Age and markers of Leydig cell function, but not of Sertoli cell function predict the success of sperm retrieval in adolescents and adults with Klinefelter’s syndrome. Andrology 2015;3:868-75.
27. Mehta A, Paduch DA. Klinefelter syndrome: an argument for early aggressive hormonal and fertility management. Fertil Steril 2012;98:274-83.
28. Ng HY, Lau YL, Yeung SB, So WK, Tam PC, Ho PC. Testicular sperm extraction and intracytoplasmic sperm injection in non-obstructive azoospermia. Chinese Med J (Engl) 2000;113:246-50.
29. Levran D, Ginath S, Farhi J, Nahum H, Glezerman M, Weissman A. Timing of testicular sperm retrieval procedures and in vitro fertilization–intracytoplasmic sperm injection outcome. Fertil Steril 2001;76:380-3. Crossref
30. Karacan M, Alwaeely F, Erkan S, et al. Outcome of intracytoplasmic sperm injection cycles with fresh testicular spermatozoa obtained on the day of or the day before oocyte collection and with cryopreserved testicular sperm in patients with azoospermia. Fertil Steril 2013;100:975-80. Crossref
31. Bernie AM, Ramasamy R, Schlegel PN. Predictive factors of successful microdissection testicular sperm extraction. Basic Clin Androl 2013;23:5. Crossref
32. Ramasamy R, Lin K, Gosden LV, Rosenwaks Z, Palermo GD, Schlegel PN. High serum FSH levels in men with nonobstructive azoospermia does not affect success of microdissection testicular sperm extraction. Fertil Steril 2009;92:590-3. Crossref
33. Toulis KA, Iliadou PK, Venetis CA, et al. Inhibin B and anti-Müllerian hormone as markers of persistent spermatogenesis in men with non-obstructive azoospermia: a meta-analysis of diagnostic accuracy studies. Hum Reprod Update 2010;16:713-24. Crossref
34. Mitchell V, Boitrelle F, Pigny P, et al. Seminal plasma levels of anti-Müllerian hormone and inhibin B are not predictive of testicular sperm retrieval in nonobstructive azoospermia: a study of 139 men. Fertil Steril 2010;94:2147-50. Crossref
35. Boitrelle F, Robin G, Marcelli F, et al. A predictive score for testicular sperm extraction quality and surgical ICSI outcome in non-obstructive azoospermia: a retrospective study. Hum Reprod 2011;26:3215-21. Crossref
36. Yang Q, Huang YP, Wang HX, et al. Follicle-stimulating hormone as a predictor for sperm retrieval rate in patients with nonobstructive azoospermia: a systematic review and meta-analysis. Asian J Androl 2015;17:281-4. Crossref

Clinical outcome of neoadjuvant chemoradiation in locally advanced rectal cancer at a tertiary hospital

Hong Kong Med J 2016 Dec;22(6):546–55 | Epub 31 Oct 2016
DOI: 10.12809/hkmj154788
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Clinical outcome of neoadjuvant chemoradiation in locally advanced rectal cancer at a tertiary hospital
William WK Yeung, FRCR, FHKAM (Radiology)1; Brigette BY Ma, FHKCP, MD (CUHK)1; Janet FY Lee, FHKAM (Surgery), MD (CUHK)2; Simon SM Ng, FHKAM (Surgery), MD (CUHK)2; Michael HY Cheung, FRCS, FHKAM (Surgery)3; WM Ho, MRCP, FHKAM (Medicine)1; Maverick WK Tsang, FRCR, FHKAM (Radiology)1; Simon Chu, FRCS, FHKAM (Surgery)2; Daisy CM Lam, MB, BS, FRCR1; Frankie KF Mo, PhD1
1 Department of Clinical Oncology, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
2 Department of Surgery, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
3 Department of Surgery, North District Hospital, Sheung Shui, Hong Kong
 
Corresponding author: Dr William WK Yeung (wilyeung@netvigator.com)
 
 
 Full paper in PDF
 
Abstract
Objectives: To review the clinical outcome of locally advanced rectal cancer treated with neoadjuvant chemoradiation followed by definitive surgery with or without adjuvant chemotherapy and to elucidate the prognostic factors for treatment outcome.
 
Methods: This historical cohort study was conducted at a tertiary public hospital in Hong Kong. All patients who had undergone neoadjuvant chemoradiation for locally advanced rectal cancer in our department from November 2005 to October 2014 were recruited. Local recurrence–free survival, distant metastasis–free survival, disease-free survival, and overall survival of patients were documented.
 
Results: A total of 135 patients who had received neoadjuvant chemoradiation during the study period were reviewed. There were 130 patients who had completed neoadjuvant chemoradiation and surgery. The median follow-up time was 35.1 months. The 3- and 5-year local recurrence–free survival, distant metastasis–free survival, disease-free survival, as well as overall survival rates were 91.8% and 86.7%, 73.9% and 72.1%, 70.1% and 64.6%, as well as 86.5% and 68.4%, respectively. The rate of pathological complete response was 13.8%. The T and N downstaging rate was 49.2% and 63.1%, respectively. The rate of conversion from threatened circumferential resection margin to clearance of margin was 90.6%. Of the 42 cases that were initially deemed to require abdominal perineal resection, 15 (35.7%) were converted to sphincter-sparing surgery.
 
Conclusions: The treatment outcome of neoadjuvant chemoradiation for locally advanced rectal cancer was comparable with overseas data in terms of local control rate and overall survival. This strategy may increase the chance of achieving a clear surgical margin by downstaging the tumour, especially in patients who presented with threatened circumferential margin.
 
 
New knowledge added by this study
  • This is a local study from a tertiary oncology centre on the clinical outcome of neoadjuvant chemoradiation in the treatment of locally advanced rectal cancer.
Implications for clinical practice or policy
  • Neoadjuvant chemoradiation is effective in downstaging advanced rectal cancers, especially those with threatened circumferential resection margin, facilitating definitive surgery to achieve a clearance of the final pathological margin.
 
 
Introduction
According to the Hong Kong Cancer Registry,1 there were 1797 new cases of rectal/anal cancer in 2013. The incidence rate per 100 000 persons was 25.0 (crude rate) and 13.3 (age-standardised rate). The total number of deaths from rectal/anal cancer was 597, and the mortality rate was 8.3 (crude rate) or 3.7 (age-standardised rate) per 100 000 persons. In Hong Kong, colorectal cancer ranks first in incidence and second in mortality among all cancers for both sexes.
 
Conventional treatment of rectal cancer is mainly surgery. In locally advanced cancer, adjuvant therapy with concurrent chemoradiation has been shown to improve local control and disease-free survival (DFS) in phase III clinical trials.2 3 4 5 The major indication for adjuvant chemoradiation is pathological T3 or T4 and/or regional nodal disease without distant metastasis.
 
Preoperative radiotherapy with or without concurrent chemotherapy has been shown to reduce the local recurrence rate of locally advanced rectal cancer.6 7 8 9 10 11 12 Preoperative radiotherapy comprises a short or long course.
 
Short-course preoperative radiotherapy is given as 5 Gy per fraction for five fractions over 1 week, followed by surgery about 1 week after completion of radiotherapy. Since the introduction of total mesorectal excision (TME), the local recurrence rate has been significantly reduced. In the new era of TME surgery, a Dutch rectal cancer trial confirmed that a short course of preoperative radiotherapy followed by TME surgery reduced the 2-year local recurrence rate from 8.2% to 2.4% compared with TME surgery alone in locally advanced rectal cancer.13 14
 
Long-course preoperative radiotherapy involves conventional fractionation of 1.8 Gy per fraction, five fractions per week, to a total dose of 45 to 50 Gy. It is given with concurrent chemotherapy, most commonly a fluoropyrimidine-based regimen. Surgery is usually performed approximately 4 to 10 weeks after completion of chemoradiation.
 
A randomised German trial (CAO/ARO/AIO 94)10 compared preoperative long-course chemoradiation with postoperative chemoradiation. At a median follow-up of 4 years, no significant difference was reported in the 5-year overall survival (OS). Nonetheless, treatment compliance, grade 3/4 acute and late toxicity profile, tumour and nodal downstaging, and rates of pelvic recurrence all favoured the preoperative chemoradiation arm. In addition, among the 194 patients with low-lying tumours declared by the surgeon before randomisation to require an abdominoperineal resection (APR), the sphincter preservation rate was higher with preoperative treatment (39% vs 19%; P=0.004).
 
Since 2005, our hospital has adopted a treatment policy of long-course neoadjuvant (or preoperative) chemoradiation (nCRT) for selected cases of locally advanced rectal cancer. The objective of this study was to retrospectively review the clinical outcome of patients with locally advanced rectal cancer treated with nCRT in our department from November 2005 to October 2014 and to elucidate prognostic factors for treatment outcome.
 
Methods
Eligible patients were those without distant metastasis who were staged preoperatively on radiological grounds as having T3 or T4 disease and/or nodal involvement. There were sometimes additional specific reasons for recommending nCRT, including threatened circumferential resection margin (CRM), intended sphincter-sparing surgery, avoidance of pelvic exenteration, and unresectability (Table 1). Patients were required to be medically fit and to agree to nCRT.
 

Table 1. Other reasons for recommending neoadjuvant chemoradiation
 
The nCRT scheme adopted in our department consisted of the following.
 
Radiotherapy
The simulation procedure was performed with the patient immobilised in the prone position with a full bladder, using a simulation computed tomography (CT) scan. Three-dimensional conformal radiotherapy planning was performed on the simulation CT images, using three coplanar fields with shielding conformal to the target volume. The radiotherapy was given in two phases. Phase 1 included the whole pelvis: a total dose of 45 Gy was delivered at 1.8 Gy per day, five fractions per week over 5 weeks. Phase 2 included only the gross tumour and the enlarged pelvic nodes with margins: a boost dose of 5.4 Gy was administered with the same fractionation as phase 1.
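For orientation, the two-phase prescription above amounts to 25 whole-pelvis fractions plus a boost, 50.4 Gy overall, assuming the boost keeps the 1.8-Gy fraction size as implied by "the same fractionation as phase 1". A minimal Python sketch of that arithmetic:

```python
# Sketch of the two-phase dose arithmetic described in the Methods.
# Assumes the phase 2 boost uses the same 1.8 Gy/fraction as phase 1.

DOSE_PER_FRACTION_GY = 1.8
PHASE1_TOTAL_GY = 45.0   # whole pelvis
PHASE2_BOOST_GY = 5.4    # gross tumour and enlarged nodes with margins

def fractions(total_gy: float, per_fraction_gy: float = DOSE_PER_FRACTION_GY) -> int:
    """Number of fractions needed to deliver total_gy at per_fraction_gy."""
    n = total_gy / per_fraction_gy
    if abs(n - round(n)) > 1e-9:
        raise ValueError("total dose is not a whole number of fractions")
    return round(n)

phase1_fx = fractions(PHASE1_TOTAL_GY)        # 25 fractions over 5 weeks
phase2_fx = fractions(PHASE2_BOOST_GY)        # 3 boost fractions
total_gy = PHASE1_TOTAL_GY + PHASE2_BOOST_GY  # 50.4 Gy
print(f"{phase1_fx + phase2_fx} fractions, {total_gy:.1f} Gy in total")
```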
 
Chemotherapy
Concurrent chemotherapy was given in the first and fifth weeks of radiation. It comprised an intravenous (IV) bolus of 5-fluorouracil (5-FU; 400 mg/m2) and leucovorin (20 mg/m2) on days 1 to 4.
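The per-dose amounts above are scaled to body surface area (BSA). Purely as an illustration of that scaling (the paper does not state which BSA formula was used clinically; the Mosteller formula and the example patient below are assumptions):

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the Mosteller formula (an assumed choice;
    the study does not specify which formula was used)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def concurrent_doses(height_cm: float, weight_kg: float) -> dict:
    """Daily IV bolus doses on days 1-4 of weeks 1 and 5, per the regimen
    described above: 5-FU 400 mg/m2 and leucovorin 20 mg/m2."""
    bsa = bsa_mosteller(height_cm, weight_kg)
    return {"BSA_m2": round(bsa, 2),
            "fluorouracil_mg": round(400 * bsa),
            "leucovorin_mg": round(20 * bsa)}

# Hypothetical patient, used only for illustration
print(concurrent_doses(height_cm=165, weight_kg=60))
```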
 
The surgery was scheduled about 4 to 10 weeks after completion of nCRT. Adjuvant chemotherapy with four cycles of 5-FU and leucovorin was administered to most patients. In some selected cases with pathological node-positive disease following surgery, four cycles of capecitabine and oxaliplatin (‘XELOX’ regimen) were given.
 
In this study, clinical data were collected retrospectively from the medical records of all patients who had undergone nCRT for locally advanced rectal cancer in the Department of Clinical Oncology at the Prince of Wales Hospital, Hong Kong from November 2005 to October 2014. The surgery was performed either at Prince of Wales Hospital or the referring hospital. There was variation in practice for pretreatment staging method, re-staging on completion of nCRT (follow-up CT scan was arranged to exclude distant metastasis at least 2 weeks after nCRT; optional magnetic resonance imaging [MRI] was considered at least 4 weeks after nCRT), and follow-up among different hospitals. The patients’ demographic information, tumour characteristics, and treatment details were retrieved. The initial type of surgery recommended by the referring surgical team at presentation and any extra specific reasons (intentions) for referral for nCRT were reviewed. The final pathology at the definitive surgery (the pathological T and N staging, the tumour size, any pathological complete response [pCR], the resection margins), the treatment-related toxicity (radiation- or chemotherapy-related, surgical complications), recurrence (local, regional, distant relapse), and disease status at follow-up were reviewed.
 
The key study endpoints included loco-regional recurrence–free survival, distant metastasis–free survival, DFS, and OS. Other secondary endpoints included the rate of pCR, tumour downstaging (T and N staging), conversion of threatened CRM to clearance of margins (R0), conversion to sphincter-sparing surgery for lower rectal cancers, conversion from a potential pelvic exenteration to non-exenterating surgery, and the rate of conversion from unresectable to resectable tumour. For toxicity endpoints, we assessed the rate of grade 3 or above acute toxicity according to the National Cancer Institute Common Terminology Criteria for Adverse Events version 4.0; the rate of grade 3 or above late radiation toxicity according to the toxicity criteria of the Radiation Therapy Oncology Group and the European Organization for Research and Treatment of Cancer; and perioperative complications, represented by the rates of 30-day postoperative mortality and morbidity (delayed wound healing, anastomotic complication, reoperation).
 
Statistical analysis
Descriptive statistics were used to report the incidence rates of the secondary endpoints, which were calculated directly. Survival rates and time-to-event rates were estimated with the Kaplan-Meier method. Univariate analysis based on the proportional hazards model was performed to investigate the relationship between the different survival outcomes and prognostic factors. Hazard ratios and the corresponding 95% confidence intervals are reported. The prognostic factors included pretreatment T stage, pretreatment N stage, histological grade, threatened CRM, completion of nCRT, time from nCRT to surgery, pathological T stage, pathological N stage, pathological group stage, pCR, pathological margin, number of involved nodes, and completion of adjuvant chemotherapy. For significant prognostic factors, multivariate analysis using Cox regression with stepwise selection was performed.
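As a rough illustration of this analysis pipeline (Kaplan-Meier estimation followed by Cox proportional hazards modelling), a minimal sketch using the Python lifelines package is shown below; the variable names and the toy data are assumptions for illustration and are not the study dataset:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy data standing in for the study dataset: months to event, event indicator,
# and two illustrative prognostic factors. Not real patient data.
df = pd.DataFrame({
    "months":         [12, 35, 40, 8, 60, 22, 15, 48, 30, 55],
    "recurrence":     [1, 0, 0, 1, 0, 1, 1, 0, 1, 0],
    "involved_nodes": [4, 0, 1, 6, 0, 3, 5, 0, 2, 1],
    "clear_margin":   [0, 1, 1, 0, 1, 1, 0, 1, 1, 1],
})

# Kaplan-Meier estimate of disease-free survival
kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["recurrence"])
print(kmf.survival_function_)

# Cox proportional hazards model: hazard ratios with 95% confidence intervals
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="recurrence")
cph.print_summary()
```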
 
This study was approved by the Joint Chinese University of Hong Kong–New Territories East Cluster Clinical Research Ethics Committee with informed consent waived. The principles outlined in the Declaration of Helsinki have also been followed.
 
Results
A total of 135 patients who had received nCRT in our department from November 2005 to October 2014 were reviewed, of whom 130 had completed nCRT and surgery with or without adjuvant chemotherapy. Of the five patients who did not have surgery, two refused surgery after nCRT and three progressed after nCRT without undergoing surgery.
 
Patient characteristics are shown in Table 2. The mean age was 60.9 (standard deviation [SD], 9.23) years. The male-to-female ratio was 3.2:1. For the pretreatment stage, 80% and 20% of patients were T3 and T4, respectively, while 13.8%, 40.0%, and 45.4% were N0, N1, and N2 stage, respectively. For the overall group stage, the incidences of stage IIA, IIB/C, and III were 8.5%, 6.1%, and 85.4%, respectively. A total of 65.4% of cases had threatened CRM on pretreatment imaging.
 

Table 2. Patient characteristics (n=130)
 
Of the 130 patients who had surgery, 128 (98.5%) completed nCRT. For the radiotherapy-related toxicities, the combined incidence of grade 3 or above acute toxicity to the skin, bowel, and urinary tract was 6.2%. Similarly, the incidence of radiotherapy-related grade 3 or above late toxicity to the bowel and urinary tract was 6.2%. For chemotherapy-related grade 3 or above acute toxicity, the incidences of neutropenia, anaemia, and thrombocytopenia were 14.6%, 1.5%, and 1.5%, respectively. The most common non-haematological grade 3 or above acute toxicities were hand-foot syndrome (0.8%), mucositis (1.5%), and diarrhoea (0.8%). Adjuvant chemotherapy was given to 103 (79.2%) patients, of whom 92 (89.3%) received the regimen of IV bolus 5-FU and leucovorin. With regard to surgical complications, 22 (16.9%) patients had delayed wound healing (>30 days after operation), six (4.6%) had an anastomotic complication, and six (4.6%) required reoperation. There was no 30-day postoperative mortality (Table 3).
 

Table 3. Toxicity and treatment compliance
 
Of the 130 cases, 124 (95.4%) underwent TME surgery and 114 (87.7%) had laparoscopic surgery. The mean time from the date of completion of nCRT to surgery was 7.2 (SD, 4.8) weeks. Comparing the type of surgery recommended before starting nCRT with that finally carried out after nCRT, the rate of anterior resection/low anterior resection increased from 50.8% to 65.4%, and the rates of APR/pelvic exenteration decreased from 32.3%/12.3% to 27.7%/3.8%, respectively. Surgical conversion was reported in several clinical contexts: (1) percentage achieving an R0 resection, (2) percentage undergoing sphincter-sparing surgery, and (3) percentage avoiding pelvic exenteration. First, of the patients who were found to have threatened CRM before treatment, 90.6% finally achieved an R0 resection. Of the 42 patients who were initially deemed on presentation to require an APR, 35.7% underwent sphincter-sparing surgery. In the subgroup of 15 patients who had received nCRT with the intention of sphincter preservation, 86.7% (n=13) underwent sphincter-sparing surgery rather than APR. Among these 13 cases with successful sphincter-sparing surgery, one had pCR and all had clear resection margins; they remained alive and free of loco-regional and distant recurrence at the end of this study. Of the 16 patients who were initially assessed to require pelvic exenteration, 62.5% (n=10) underwent non-exenterating surgery. Six patients had tumours deemed unresectable and were referred for nCRT to improve resectability. Complete resection with negative margins was subsequently achieved in four (66.7%) of the six, while the other two had a positive margin at palliative surgery.
 
The final pathological staging in the surgical specimen is reported (Table 4). The rates of pCR and clear resection margin were 13.8% and 89.2%, respectively. The rate of T downstaging was 49.2% and that for N stage was 63.1% (Table 3).
 

Table 4. Surgical and pathological outcomes
 
The median follow-up time was 35.1 months. Of the 130 patients, local recurrence, loco-regional recurrence, distant metastasis, disease recurrence, and death occurred in 10 (crude rate, 7.7%), 15 (11.5%), 30 (23.1%), 34 (26.2%), and 23 (17.7%) patients, respectively. The Kaplan-Meier estimates of the 3-year local recurrence–free survival, regional recurrence–free survival, loco-regional recurrence–free survival, distant metastasis–free survival, DFS, and OS were 91.8%, 92.6%, 87.9%, 73.9%, 70.1%, and 86.5%, respectively. The respective 5-year survival rates were 86.7%, 85.3%, 81.0%, 72.1%, 64.6%, and 68.4%. The corresponding Kaplan-Meier curves for local recurrence–free survival and OS are shown in the Figure (the curves for loco-regional recurrence–free survival, distant metastasis–free survival, and DFS are shown in the Appendix).
 

Figure. (a) Local recurrence–free and (b) overall survival curves
 

Appendix. (a) Loco-regional recurrence–free, (b) distant metastasis–free, and (c) disease-free survival curves
 
Analysis of prognostic factors
The variables in the univariate analysis included the pretreatment T stage, pretreatment N stage, histological grade, presence of threatened CRM, completion of nCRT, time from nCRT to surgery (continuous variable), pathological T stage, pathological N stage, pathological group stage, pCR, pathological margin, number of involved nodes (continuous variable), and completion of adjuvant chemotherapy; age and gender were also tested but were not significant in the univariate model. Significant prognostic factors were then studied by multivariate analysis.
 
In the multivariate analysis, pathological clear margin, completion of nCRT, and the number of involved nodes were significantly associated with local recurrence–free survival. The number of involved nodes, pathological clear margin, and time from nCRT to surgery were significantly associated with loco-regional recurrence–free survival. The number of involved nodes, pretreatment T4 stage, pathological stage III/IV, and completion of adjuvant chemotherapy were significantly associated with distant metastasis–free survival. The number of involved nodes, pathological stage III/IV, and completion of adjuvant chemotherapy were significantly associated with DFS. Finally, the number of involved nodes, pretreatment T4 stage, and pathological stage III/IV were significantly associated with OS (Table 5).
 

Table 5. Significant prognostic factors for various survival categories in both univariate and multivariate analysis
 
Discussion
Although the current study was retrospective, survival data were comparable with figures reported in international studies. In the major randomised trials, the 5-year local recurrence rate in the arm with preoperative short-course radiotherapy was in the range of 11% to 14%, and OS was in the range of 42% to 76%.6 7 8 9 In the randomised trials that had an arm with nCRT, the 4- or 5-year local recurrence rates were 5.7% to 15.6% and the OS rates were 66.2% to 76%.10 15 16 17 18 In this study, the 5-year local recurrence rate and loco-regional recurrence rate were 13.3% and 19%, respectively. These were close to the reported figures from randomised studies.10 15 16 17 18 The 5-year OS in this study was 68.4%, comparable with international studies.10 15 16 17 18
 
The pCR rate was 13.8% in this study, again comparable with randomised trials10 15 16 17 18 and reviews.19 Together with the favourable downstaging effects, the clear resection margin rate was high (89.2%). This is the primary aim of nCRT in advanced rectal cancer. The role of nCRT in sphincter preservation for low-lying tumours has been a controversial issue in some randomised trials10 11 15 16 and critical reviews.20 21 In a German study,10 among the 194 patients with tumours that were determined by the surgeon before randomisation to require an APR, a statistically significant increase in sphincter preservation was achieved among patients who received nCRT compared with those who received postoperative chemoradiation (39% vs 19%; P=0.004). Although long-course nCRT is expected to result in tumour downsizing, a Polish trial11 did not find that long-course chemoradiation was superior to short-course preoperative radiotherapy in reducing the APR rate. Possible explanations include insufficient downsizing to alter the surgical approach, the surgeon's concern about residual microscopic disease despite an apparently good response to nCRT, or clinical decisions made on the basis of pretreatment staging information. In our study, the overall rate of conversion from APR to sphincter-sparing surgery was 35.7%, comparable with that (39%) in the German trial10; for the subgroup of patients with an intention to spare the sphincter, the conversion rate was even higher, at 86.7%, with a good clinical outcome.
 
The extent of extramural tumour spread and the lymph node and CRM status are powerful predictive factors for local recurrence, distant metastases, and OS in patients with rectal cancer.22 23 24 25 26 27 28 In our study, the number of involved nodes in the final pathology was an independent prognostic factor for OS, DFS, local or loco-regional recurrence–free survival, and distant metastasis–free survival. For local or loco-regional recurrence, pathological clear margin, completion of nCRT, and time from nCRT to surgery were independent prognostic factors. Although we attempted to identify the optimal cut-off time for surgery after completion of nCRT, this was not possible because of the small sample size. Increasing the time interval from completion of nCRT to surgery was associated with a detrimental effect on loco-regional recurrence (hazard ratio=1.348).
 
In this study, completion of adjuvant chemotherapy was a prognostic factor for distant metastasis, implying that adjuvant chemotherapy might be important in reducing distant metastasis. It remains controversial whether adjuvant chemotherapy should be given after nCRT and surgery. A 2 × 2 factorial randomised trial (EORTC trial 22921)29 30 31 32 that assessed preoperative chemoradiotherapy versus preoperative radiotherapy, and postoperative chemotherapy versus no postoperative chemotherapy, in patients with cT3-4 disease could not demonstrate any improvement in progression-free survival or OS from adjuvant chemotherapy in patients with resectable T3-T4 rectal cancer. Its follow-up report of 785 eligible patients who underwent R0 resection showed that patients with a good prognosis (ypT0-2) seemed to benefit from adjuvant chemotherapy, especially if the tumour was located in the mid-rectum.33 Nonetheless, an updated analysis of the EORTC 22921 trial18 recently failed to confirm the benefit of adjuvant chemotherapy for ypT0-2 patients after a median follow-up of 10.4 years. In the I-CNR-RT phase III randomised trial,34 there was no benefit of adjuvant chemotherapy (six cycles of 5-FU and folinic acid) compared with observation alone after nCRT; the result may be partly attributed to low compliance with the planned number of chemotherapy cycles. The British Chronicle trial35 is unique in comparing postoperative XELOX against observation alone in locally advanced rectal cancer treated with nCRT. After a median follow-up of 44.8 months, there was no statistically significant benefit of adjuvant XELOX in the 3-year DFS rate. A Korean group reported the results of the phase II ADORE study, in which 321 patients with ypT3-4/ypN0 or ypTx/ypN1-2 disease after nCRT with 5-FU alone were randomised to adjuvant chemotherapy with 5-FU or FOLFOX.36 37 After a median follow-up of 38.2 months, the 3-year DFS rate was better in the FOLFOX arm (P=0.047). Although adjuvant treatment of patients with rectal cancer remains controversial, the National Comprehensive Cancer Network guidelines recommend 5-FU–based chemotherapy with oxaliplatin as the preferred adjuvant treatment for all patients with rectal cancer who receive neoadjuvant 5-FU–based chemoradiation, regardless of surgical pathology results. The recently reported German CAO/ARO/AIO-04 trial also revealed the benefit of adding oxaliplatin to both neoadjuvant and adjuvant treatment, with significant improvement in DFS of patients with clinically staged cT3-4 or cN1-2 rectal cancer compared with a conventional 5-FU–based combined modality regimen.38
 
There were limitations to this study. The data were collected retrospectively and there was no blinding during data collection. Potential confounding factors such as smoking and co-morbidity may have been inadequately controlled for. Toxicity data were not collected systematically and could therefore be under-reported; with prospective data collection, a tailor-made toxicity form could be designed to capture more toxicities. The median follow-up time was relatively short. Magnetic resonance imaging is now a standard staging tool in rectal cancer, but MRI was used for initial staging in only 66.1% of this cohort, so pretreatment staging might not accurately reflect the true stage at presentation. There was limited reporting of late radiation toxicity such as sexual and sphincter dysfunction; the full extent of late radiation toxicity requires longer follow-up. Finally, because of the small sample size, adjustment for potential confounding factors in the survival analyses was limited.
 
Conclusions
The treatment outcome following nCRT for locally advanced non-metastatic rectal cancer in our experience was comparable with overseas data in terms of local control rate and OS. The high conversion rate from a threatened circumferential margin to a clear resection margin, and the high T and N downstaging rates, suggest that this approach is effective in facilitating surgery to obtain complete surgical clearance. In the subgroup with an intention of sphincter preservation, the conversion rate from APR to sphincter-sparing surgery was high. The rate of acute toxicity was within expectation and manageable, and there were no treatment-related deaths.
 
Acknowledgements
We thank our colleagues who, although not named in the author list, contributed to the treatment of this group of patients and helped with data collection.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Hong Kong Cancer Registry. Statistics. Available from: http://www3.ha.org.hk/cancereg/statistics.html. Accessed Aug 2016.
2. Prolongation of the disease-free interval in surgically treated rectal carcinoma. Gastrointestinal Tumor Study Group. N Engl J Med 1985;312:1465-72. Crossref
3. Douglass HO Jr, Moertel CG, Mayer R, et al. Survival after postoperative combination treatment of rectal cancer. N Engl J Med 1986;315:1294-5. Crossref
4. Krook JE, Moertel CG, Gunderson LL, et al. Effective surgical adjuvant therapy for high-risk rectal carcinoma. N Engl J Med 1991;324:709-15. Crossref
5. Fisher B, Wolmark N, Rockette H, et al. Postoperative adjuvant chemotherapy or radiation therapy for rectal cancer: results from NSABP protocol R-01. J Natl Cancer Inst 1988;80:21-9. Crossref
6. Påhlman L, Glimelius B. Pre- or postoperative radiotherapy in rectal and rectosigmoid carcinoma. Report from a randomized multicenter trial. Ann Surg 1990;211:187-95. Crossref
7. Cedermark B, Johansson H, Rutqvist LE, Wilking N. The Stockholm I trial of preoperative short term radiotherapy in operable rectal carcinoma. A prospective randomized trial. Stockholm Colorectal Cancer Study Group. Cancer 1995;75:2269-75. Crossref
8. Improved survival with preoperative radiotherapy in resectable rectal cancer. Swedish Rectal Cancer Trial. N Engl J Med 1997;336:980-7. Crossref
9. Martling A, Holm T, Johansson H, Rutqvist LE, Cedermark B, Stockholm Colorectal Cancer Study Group. The Stockholm II trial on preoperative radiotherapy in rectal carcinoma: long-term follow-up of a population-based study. Cancer 2001;92:896-902. Crossref
10. Sauer R, Becker H, Hohenberger W, et al. Preoperative versus postoperative chemoradiotherapy for rectal cancer. N Engl J Med 2004;351:1731-40. Crossref
11. Sauer R, Liersch T, Merkel S, et al. Preoperative versus postoperative chemoradiotherapy for locally advanced rectal cancer: results of the German CAO/ARO/AIO-94 randomized phase III trial after a median follow-up of 11 years. J Clin Oncol 2012;30:1926-33. Crossref
12. Sebag-Montefiore D, Stephens RJ, Steele R, et al. Preoperative radiotherapy versus selective postoperative chemoradiotherapy in patients with rectal cancer (MRC CR07 and NCIC-CTG C016): a multicentre, randomised trial. Lancet 2009;373:811-20. Crossref
13. Kapiteijn E, Marijnen CA, Nagtegaal ID, et al. Preoperative radiotherapy combined with total mesorectal excision for resectable rectal cancer. N Engl J Med 2001;345:638-46. Crossref
14. Van Gijn W, Marijnen CA, Nagtegaal ID, et al. Preoperative radiotherapy combined with total mesorectal excision for resectable rectal cancer: 12-year follow-up of the multicentre, randomised controlled TME trial. Lancet Oncol 2011;12:575-82. Crossref
15. Bujko K, Nowacki MP, Nasierowska-Guttmejer A, Michalski W, Bebenek M, Kryj M. Long-term results of a randomized trial comparing preoperative short-course radiotherapy with preoperative conventionally fractionated chemoradiation for rectal cancer. Br J Surg 2006;93:1215-23. Crossref
16. Ngan SY, Burmeister B, Fisher RJ, et al. Randomized trial of short-course radiotherapy versus long-course chemoradiation comparing rates of local recurrence in patients with T3 rectal cancer: Trans-Tasman Radiation Oncology Group trial 01.04. J Clin Oncol 2012;30:3827-33. Crossref
17. Gérard JP, Conroy T, Bonnetain F, et al. Preoperative radiotherapy with or without concurrent fluorouracil and leucovorin in T3-4 rectal cancers: results of FFCD 9203. J Clin Oncol 2006;24:4620-5. Crossref
18. Bosset JF, Calais G, Mineur L, et al. Fluorouracil-based adjuvant chemotherapy after preoperative chemoradiotherapy in rectal cancer: long-term results of the EORTC 22921 randomised study. Lancet Oncol 2014;15:184-90. Crossref
19. Damin DC, Lazzaron AR. Evolving treatment strategies for colorectal cancer: a critical review of current therapeutic options. World J Gastroenterol 2014;20:877-87. Crossref
20. Gerard JP, Rostom Y, Gal J, et al. Can we increase the chance of sphincter saving surgery in rectal cancer with neoadjuvant treatments: lessons from a systematic review of recent randomized trials. Crit Rev Oncol Hematol 2012;81:21-8. Crossref
21. Bujko K, Kepka L, Michalski W, Nowacki MP. Does rectal cancer shrinkage induced by preoperative radio(chemo)therapy increase the likelihood of anterior resection? A systematic review of randomised trials. Radiother Oncol 2006;80:4-12. Crossref
22. Dukes CE, Bussey JH. The spread of rectal cancer and its effect on prognosis. Br J Cancer 1958;12:309-20. Crossref
23. Gunderson LL, Sosin H. Areas of failure found at reoperation (second or symptomatic look) following “curative surgery” for adenocarcinoma of the rectum: clinicopathologic correlation and implications for adjuvant therapy. Cancer 1974;34:1278-92. Crossref
24. Rich T, Gunderson LL, Lew R, Galdibini JJ, Cohen AM, Donaldson G. Patterns of recurrence of rectal cancer after potentially curative surgery. Cancer 1983;52:1317-29. Crossref
25. Quirke P, Durdey P, Dixon MF, Williams NS. Local recurrence of rectal adenocarcinoma due to inadequate surgical resection. Histopathological study of lateral tumour spread and surgical excision. Lancet 1986;2:996-9. Crossref
26. Merkel S, Mansmann U, Siassi M, Papadopoulos T, Hohenberger W, Hermanek P. The prognostic inhomogeneity in pT3 rectal carcinomas. Int J Colorectal Dis 2001;16:298-304. Crossref
27. Birbeck KF, Macklin CP, Tiffin NJ, et al. Rates of circumferential resection margin involvement vary between surgeons and predict outcomes in rectal cancer surgery. Ann Surg 2002;235:449-57. Crossref
28. Wibe A, Rendedal PR, Svensson E, et al. Prognostic significance of the circumferential resection margin following total mesorectal excision for rectal cancer. Br J Surg 2002;89:327-34. Crossref
29. Bosset JF, Collette L, Calais G, et al. Chemotherapy with preoperative radiotherapy in rectal cancer. N Engl J Med 2006;355:1114-23. Crossref
30. Bosset JF, Calais G, Mineur L, et al. Enhanced tumorocidal effect of chemotherapy with preoperative radiotherapy for rectal cancer: preliminary results—EORTC 22921. J Clin Oncol 2005;23:5620-7. Crossref
31. Bosset JF, Calais G, Mineur L, et al. Preoperative radiation (Preop RT) in rectal cancer: effect and timing of additional chemotherapy (CT) 5-year results of the EORTC 22921 trial. Proc Am Soc Clin Oncol 2005;23:247S (abstract no. 3505).
32. Bosset JF, Calais G, Daban A, et al. Preoperative chemoradiotherapy versus preoperative radiotherapy in rectal cancer patients: assessment of acute toxicity and treatment compliance. Report of the 22921 randomised trial conducted by the EORTC Radiotherapy Group. Eur J Cancer 2004;40:219-24. Crossref
33. Collette L, Bosset JF, den Dulk M, et al. Patients with curative resection of cT3-4 rectal cancer after preoperative radiotherapy or radiochemotherapy: does anybody benefit from adjuvant fluorouracil-based chemotherapy? A trial of the European Organisation for Research and Treatment of Cancer Radiation Oncology Group. J Clin Oncol 2007;25:4379-86. Crossref
34. Sainato A, Cernusco Luna Nunzia V, Valentini V, et al. No benefit of adjuvant fluorouracil leucovorin chemotherapy after neoadjuvant chemoradiotherapy in locally advanced cancer of the rectum (LARC): long term results of a randomized trial (I-CNR-RT). Radiother Oncol 2014;113:223-9. Crossref
35. Glynne-Jones R, Counsell N, Quirke P, et al. Chronicle: results of a randomised phase III trial in locally advanced rectal cancer after neoadjuvant chemoradiation randomising postoperative adjuvant capecitabine plus oxaliplatin (XELOX) versus control. Ann Oncol 2014;25:1356-62. Crossref
36. Hong YS, Nam B, Kim K, et al. Adjuvant chemotherapy with oxaliplatin/5-fluorouracil/leucovorin (FOLFOX) versus 5-fluorouracil/leucovorin for rectal cancer patients whose postoperative yp stage 2 or 3 after preoperative chemotherapy: updated results of 3-year disease-free survival from a randomized phase II study (The ADORE). J Clin Oncol 2014;32(5 Suppl):abstract no. 3502.
37. Hong YS, Nam BH, Kim KP, et al. Oxaliplatin, fluorouracil, and leucovorin versus fluorouracil and leucovorin as adjuvant chemotherapy for locally advanced rectal cancer after preoperative chemoradiotherapy (ADORE): an open-label, multicentre, phase 2, randomised controlled trial. Lancet Oncol 2014;15:1245-53. Crossref
38. Rödel C, Graeven U, Fietkau R, et al. Oxaliplatin added to fluorouracil-based preoperative chemoradiotherapy and postoperative chemotherapy of locally advanced rectal cancer (the German CAO/ARO/AIO-04 study): final results of the multicentre, open-label, randomised, phase 3 trial. Lancet Oncol 2015;16:979-89. Crossref

Effectiveness of proximal intra-operative salvage Palmaz stent placement for endoleak during endovascular aneurysm repair

Hong Kong Med J 2016 Dec;22(6):538–45 | Epub 24 Oct 2016
DOI: 10.12809/hkmj154799
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Effectiveness of proximal intra-operative salvage Palmaz stent placement for endoleak during endovascular aneurysm repair
Y Law, MB, BS, FRCS (Edin); YC Chan, MB, BS, FRCS (Eng); Stephen WK Cheng, MB, BS, FRCS (Edin)
Division of Vascular and Endovascular Surgery, Department of Surgery, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
 
This paper was presented in abstract form at the 15th Congress of Asian Society for Vascular Surgery (ASVS) Hong Kong, 5-7 September 2014, Hong Kong.
 
Corresponding author: Dr Y Law (lawyuksimpson@gmail.com)
 
 
 Full paper in PDF
 
Abstract
Introduction: The use of a proximal Palmaz stent is a well-recognised technique to treat proximal endoleak in endovascular aortic repair. This study aimed to report the effectiveness and safety of an intra-operative Palmaz stent for immediate type 1a endoleak in Hong Kong patients.
 
Methods: This case series was conducted at a tertiary hospital in Hong Kong. In a cohort of 494 patients who underwent infrarenal endovascular aortic repair from July 1999 to September 2015, 12 (2.4%) received an intra-operative proximal Palmaz stent for type 1a endoleak. The immediate result and any subsequent proximal endoleak on follow-up imaging were documented.
 
Results: Morphological review of the pre-repair aneurysm neck showed five conical, one funnel, five cylindrical, and one undetermined short neck, with a median neck angle of 61 degrees (range, 19-109 degrees). Stent grafts used included seven Cook Zenith, one Cook Aorto-Uni-Iliac device, three Medtronic Endurant, and one TriVascular Ovation. Eleven Palmaz stents were placed successfully as intended, but one was accidentally placed too low. Of the 12 type 1a endoleaks, postoperative imaging revealed immediate resolution of eight whilst four had improved. After a median follow-up of 16 (range, 1-59) months, no subsequent imaging showed a type 1a endoleak. The mean size of the aneurysm sac decreased from 7.4 cm preoperatively to 7.3 cm at 1 month, 6.9 cm at 6 months, 7.1 cm at 1 year, and 6.1 cm at 2 years postoperatively. None of these patients required aortic reintervention or had device-related early- or mid-term mortality. One patient required delayed iliac re-intervention for an occluded limb at 10 days post-surgery.
 
Conclusion: In our cohort, Palmaz stenting was effective and safe in securing proximal sealing and fixation.
 
 
New knowledge added by this study
  • Palmaz stenting is effective and safe as a salvage treatment of proximal endoleak during endovascular aortic repair (EVAR).
Implications for clinical practice or policy
  • Appropriate patient selection, meticulous preoperative planning, and diligent follow-up ensure the ultimate success of EVAR.
  • A Palmaz stent should always be readily available during EVAR, especially for aneurysms with a hostile aortic neck.
 
 
Introduction
Hostile proximal infrarenal aortic neck is often considered one of the relative contra-indications for infrarenal endovascular aortic repair (EVAR), but large multicentre registries have shown that more EVARs have been performed worldwide beyond manufacturers' indications.1 2 3 Data from the EUROSTAR Registry more than a decade ago, covering aneurysm morphology and procedural details of 2272 patients, showed that perioperative type 1a endoleak was significantly associated with a larger, angulated infrarenal neck with thrombus.4 Other morphological features included short aortic neck length, large sac diameter, severe angulation, and conical neck configuration.5 Endoleak is defined as the persistence of blood flow outside the lumen of an endovascular graft but within the aneurysm sac. A type 1a endoleak is a leak around the proximal graft attachment site. Such endoleaks are clinically important because they can lead to continued aneurysm enlargement and eventual rupture.6
 
Although the development of renal and mesenteric fenestration or branched stent graft technology to extend the landing zone more proximally may overcome type 1a endoleaks, these devices may not be ideal because of regulations, complexity, and manufacturing delays.7 In addition, contemporary published literature does not provide sufficiently strong evidence to warrant a change in treatment guidelines for juxtarenal or short-neck aneurysm.8
 
The use of a proximal giant Palmaz stent (Cordis Corp, Fremont [CA], US) is a well-recognised technique to manage a type 1a endoleak intra-operatively.9 10 11 12 Prophylactic Palmaz stent insertion for a hostile neck has also been reported.13 14 15 The aim of this study was to report the effectiveness and safety of an intra-operative Palmaz stent for an immediate type 1a endoleak.
 
Methods
Patient selection
Patients with an infrarenal abdominal aortic aneurysm (AAA) who had undergone EVAR between 1 July 1999 and 30 September 2015 were retrospectively reviewed. Data were extracted from a prospectively collected computerised departmental database, supplemented by patient records. This study was done in accordance with the principles outlined in the Declaration of Helsinki. All patients had undergone a preoperative fine-cut computed tomographic (CT) scan and careful preoperative planning that concluded EVAR was feasible. All patients who underwent EVAR had their preoperative three-dimensional aortic anatomy examined on the stationary TeraRecon Aquarius workstation (TeraRecon, San Mateo [CA], US), which rapidly manipulates all preoperative data sets in the Digital Imaging and Communications in Medicine (DICOM) format and displays the aortic morphology. These details would be difficult to appreciate on two-dimensional planar CT imaging, especially in patients with an inadequate, short, hostile neck. Patients deemed unsuitable for infrarenal sealing underwent either open repair or, more recently, fenestrated or adjunctive EVAR (such as the chimney technique).
 
Cook Zenith (Cook Medical, Bloomington [IN], US), Medtronic Endurant (Medtronic Inc, Minneapolis [MN], US), and TriVascular Ovation (TriVascular Inc, Santa Rosa [CA], US) stent grafts were most commonly used. We sometimes extended the manufacturers' indications for use to include difficult aortic necks. All operations were performed in our institution, a tertiary vascular referral centre, by experienced vascular specialists in a hybrid endovascular suite (Siemens Artis zee Multipurpose System; Siemens, Erlangen, Germany). Post-deployment balloon molding with a Coda balloon (Cook Medical, Bloomington [IN], US) and completion angiography with power injection were performed in all patients.
 
Proximal Palmaz stent placement
In the event of a proximal endoleak on the completion angiogram, further balloon molding was attempted. Persistent endoleak was managed with a balloon-expandable giant Palmaz stent P4014 (Cordis Corp). Our preference was to position the Palmaz stent at the infrarenal region over the fabric of the stent graft, so as not to jeopardise any future chance of proximal extension with a renal or mesenteric fenestrated cuff. The stent was crimped on a Coda balloon (maximum inflated diameter, 40 mm) and railed over a stiff guidewire to the top of the neck.
 
Preoperative aneurysm neck morphological analysis
The cause of proximal endoleak was usually a hostile neck. We therefore focused on the infrarenal aortic neck anatomy, where the Palmaz stent was placed and exerted its effect. All preoperative CT scans were analysed. Measurements were calculated automatically by the computer software Aquarius iNtuition Client version 4.4 (TeraRecon Inc) after a median centre line path (MCLP) was drawn. We used the neck measurements as defined by Filis et al16:
(1) Neck length is calculated between the point of MCLP at the orifice of the lower renal artery and MCLP at the start of the aneurysm.
(2) Neck diameter is measured on the orthogonal cross-section of the neck, from the outer wall to the outer wall. It is measured at the lower renal artery level, 1 cm and 2 cm below.
(3) Neck angle is the angle between the axis of the neck (straight line between the MCLP of the aorta at the level of the orifice of the lower renal artery and the MCLP of the aorta at the start of the aneurysm) and the axis of the lumen of the aneurysm (straight line between the MCLP at the start of the aneurysm and the MCLP at the end of the aneurysm).
(4) Neck morphology (Fig 1):
(a) Funnel shape is defined as a 20% decrease in neck area between the level of the lower renal artery and 2 cm below.
(b) Conical shape is defined as a 20% increase in neck area between the level of the lower renal artery and 2 cm below.
(c) Cylindrical shape is defined as an area change between the above two thresholds.
(d) Shape is undetermined if neck length is less than 2 cm.
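These definitions reduce to a simple classification rule on the cross-sectional neck areas at the lowest renal artery and 2 cm below it. A minimal sketch of that rule in Python (the function name and the handling of values exactly at the 20% boundary are assumptions for illustration):

```python
def classify_neck_shape(area_at_renal: float, area_2cm_below: float,
                        neck_length_cm: float) -> str:
    """Classify aortic neck morphology from cross-sectional areas (any
    consistent unit), following the definitions of Filis et al above;
    behaviour exactly at the 20% boundary is an assumption."""
    if neck_length_cm < 2.0:
        return "undetermined"          # neck too short to assess at 2 cm
    change = (area_2cm_below - area_at_renal) / area_at_renal
    if change <= -0.20:
        return "funnel"                # 20% or greater decrease in area
    if change >= 0.20:
        return "conical"               # 20% or greater increase in area
    return "cylindrical"               # between the two thresholds

# Illustrative values only
print(classify_neck_shape(area_at_renal=4.9, area_2cm_below=6.2,
                          neck_length_cm=2.5))  # conical
```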
 

Figure 1. Illustration of neck morphology
 
Clinical and radiological outcomes
The immediate radiological result following Palmaz stent placement was recorded, as was any procedural mortality or morbidity. Our departmental protocol recommends postoperative imaging, either CT scan or duplex scan, at 1 to 3 months, then every 6 months for the first 2 years, and annually thereafter, supplemented by plain X-ray to detect any stent fracture or migration. Any endoleak detected during follow-up was reported, as was any change in sac size. Follow-up was dated to the most recent objective imaging available.
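The surveillance protocol above translates into a fixed set of imaging time points. A minimal sketch generating those time points in months post-EVAR; the 3-month first scan and the 10-year horizon are illustrative assumptions, not part of the departmental protocol:

```python
def surveillance_months(first_scan_month: int = 3, horizon_years: int = 10) -> list:
    """Imaging time points (months post-EVAR) per the protocol described above:
    first scan at 1-3 months, every 6 months to 2 years, then annually."""
    points = [first_scan_month]
    points += list(range(6, 25, 6))                         # 6, 12, 18, 24 months
    points += list(range(36, horizon_years * 12 + 1, 12))   # annually thereafter
    return points

print(surveillance_months())  # [3, 6, 12, 18, 24, 36, 48, ..., 120]
```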
 
Results
Patient population
During the study period 1 July 1999 to 30 September 2015, a total of 842 AAA surgeries were performed, of which 320 (38%) were open repair and 522 (62%) were endovascular repair (28 fenestrated/branched EVAR and 494 infrarenal EVAR). In a cohort of 494 patients with infrarenal EVAR, 12 (2.4%) received an intra-operative proximal Palmaz stent for type 1a endoleak that was noticed on completion angiogram. No patients received prophylactic Palmaz stent for difficult neck anatomy.
 
Patient demographics are summarised in Table 1. The median age was 84 (range, 58-95) years. All patients had undergone elective surgery for asymptomatic AAA. The median AAA size was 7.6 cm (range, 5.0-9.4 cm). Seven patients received a Cook Zenith stent graft, one a Cook Aorto-Uni-Iliac device, three a Medtronic Endurant stent graft, and one a TriVascular Ovation stent graft. Type 1a endoleak was more common in recent years (Table 2).
 

Table 1. Baseline characteristics of 12 patients
 

Table 2. Aneurysm neck morphology
 
Analysis of aneurysm neck morphology
Table 2 summarises the neck morphologies of our cohort. Morphological review of the pre-EVAR aneurysm neck showed five conical, one funnel, five cylindrical, and one undetermined short neck. The median neck angle was 61 degrees (range, 19-109 degrees). Use of the stent graft was outside the manufacturer's guidelines in six (50%) patients. Most patients had one or more features of a hostile neck, placing them at high risk of proximal endoleak.
 
Radiological outcomes
All 12 patients had a persistent proximal type 1a endoleak after stent graft placement with standard balloon molding. Placement of a giant Palmaz stent in the infrarenal position was technically successful in all cases, although one Palmaz stent was placed too low and lodged in one of the iliac limbs. Nonetheless, it served its function well in correcting the proximal endoleak (Figs 2 and 3). Immediate resolution of the endoleak was achieved in eight patients (67%), whilst four (33%) had an improved but persistent leak at completion of the procedure.
 

Figure 2. Abdominal aortic aneurysm of patient 4
(a) Preoperative computed tomographic (CT) scan showing a short and angulated neck; the patient underwent endovascular aortic repair. (c) Completion angiogram showing type 1a endoleak from the left side (arrowhead). (d) Palmaz stent being inserted (red arrows) and (e) proximal endoleak is resolved. (b) Postoperative CT scan showing no endoleak
 

Figure 3. Aneurysm of patient 10
(a) Preoperative computed tomographic (CT) scan showing an angulated neck. (b) Postoperative CT reconstruction. (c and d) X-ray scans showing the Palmaz stent was placed too low and became lodged in the proximal right limb (red arrows). Nonetheless the proximal part of the Palmaz stent served well to prevent proximal endoleak
 
Clinical outcomes
After a median follow-up of 16 (range, 1-59) months, no patient had a type 1a endoleak on subsequent imaging, either CT scan or duplex ultrasound scan. All patients had at least one postoperative CT scan. Patient 2 had a type 1c endoleak from an embolised left internal iliac artery that was managed conservatively. Patients 9 and 11 had a type 2 endoleak, also managed conservatively. All had shrinkage of sac size. Sac size of the aneurysms decreased from a mean of 7.4 cm pre-EVAR to 7.3 cm, 6.9 cm, 7.1 cm, and 6.1 cm at 1 month, 6 months, 1 year, and 2 years post-EVAR, respectively (Fig 4). Routine X-ray surveillance did not reveal any Palmaz stent fracture or migration.
 

Figure 4. Alteration in sac size over time
Patient 2 developed a type 1c endoleak; patients 9 and 11 developed type 2 endoleaks
 
One patient required secondary endovascular re-intervention for an occluded left iliac limb on postoperative day 10, due to a tortuous iliac system causing an acute bend in the stent graft limb. There was no other secondary intervention. Seven patients have since died (at 6, 9, 16, 16, 20, 24, and 59 postoperative months) from non-aneurysm–related causes. Five remained alive at the time of writing.
 
Discussion
Hostile aortic neck anatomy often precludes endovascular treatment of AAA. It has been shown in large clinical cohorts that up to 42% of EVARs are performed outside the instructions for use for commercially available stent grafts.3 17 18 19 Multiple measures have been developed to include more of these difficult necks for endovascular treatment. Evolution of stent graft design including suprarenal fixation20 and renal and mesenteric fenestration21 are examples. Adjunctive neck measures, eg endostapling,22 23 proximal fibrin glue embolisation,24 open aortic neck banding,25 and proximal covered cuff extension10 may be used in cases of perioperative type 1a endoleak following EVAR. Endostapling and glue embolisation, though minimally invasive, are not always feasible and risk major aortic injury. Open aortic neck banding requires laparotomy. Proximal cuff extension is only feasible if there is an additional sealing zone to the most caudal renal artery. Since we routinely landed our stent graft at the level of the lowest renal artery, this technique was not usually practical. The simplest and most well-recognised manoeuvre remains placement of a proximal Palmaz stent.
 
The morphology of the infrarenal aortic neck is important in securing the proximal landing zone. Three-dimensional workstation planning has been considered useful by many26 27; for example, Sobocinski et al28 showed that it reduced the rate of type 1 endoleaks and Velazquez et al29 indicated that it decreased the rate of extra iliac extension. We emphasise the importance of proper pre-EVAR planning, as this is one of the obvious factors that may compromise long-term durability and outcome. Half of the stent grafts in our series were used outside the instructions for use, and most patients had one or more hostile neck features, placing them at high risk of proximal endoleak. Under these circumstances, a Palmaz stent should always be readily available during EVAR.
 
Multiple series have reported successful use of the Palmaz stent. An early study by Dias et al9 reported nine patients who received a Palmaz stent and in whom the aneurysm remained excluded at a median follow-up of 13 months (range, 6-24 months). Rajani et al10 reported successful treatment of intra-operative type 1 endoleak with a Palmaz stent in 27 patients with no recurrence at follow-up, although the length of follow-up was not stated. Arthurs et al11 reported no type 1 endoleak in 31 patients after a median follow-up of 53 months (interquartile range, 14-91 months). The Palmaz stent was effective across a variety of available devices with suprarenal fixation (eg Cook Zenith) or infrarenal fixation (eg Gore Excluder; WL Gore & Associates, Inc, Newark [DE], US). Our results are in agreement with these findings.
 
Other series have reported mixed results. Farley et al15 reported 18 cases of Palmaz stent placement. Technical placement failed in one patient, in whom attempts at passing the access sheath to the proximal landing zone resulted in proximal migration of the main body of the aortic stent graft; an attempt at passage of the balloon-mounted stent without sheath protection then resulted in slippage of the stent from the balloon, and the stent could not be retrieved and was deployed in the iliac limb. With a mean follow-up of 254 days in the 17 successful Palmaz stent placements, one patient had an unresolved type 1 endoleak. Malposition of the stent is not an unusual complication.30 31 Kim et al32 described a deployment technique to ensure accuracy: the Palmaz stent was asymmetrically hand-crimped on an appropriately sized valvuloplasty balloon so that the proximal aspect would deploy first.
 
Some series have advocated prophylactic Palmaz stent placement in hostile necks, including a series of 15 patients reported by Cox et al.13 One (7%) patient had a secondary endoleak requiring intervention after a mean follow-up of 12 months. Qu and Raithel14 reported 117 cases of difficult neck treated with the unibody Powerlink device (Endologix Inc, Irvine [CA], US). In this series, 83 patients (72.8%) had a proximal Palmaz stent placed as an adjunctive procedure; proximal cuff extension was also used. The mean follow-up was 2.6 years (range, 4 months-5 years). Results were satisfactory, with an overall re-intervention rate of 5.3% and no device migration, conversion, or post-EVAR rupture.
 
Palmaz stents were routinely placed at an infrarenal position in our unit on the basis that future extension with a fenestrated cuff remains possible. If a transrenal position is adopted, the struts of the Palmaz stent, which leave only a very tight space, may hinder catheterisation of the renal or visceral arteries should a fenestrated cuff be inserted. This is not absolute, however. Oikonomou et al33 reported a case previously treated with a Powerlink stent graft and a transrenal Palmaz stent in which a proximal endoleak at 3 years after operation was successfully treated with a proximal fenestrated graft. Selective catheterisation of both renal arteries and dilation of the stent struts prior to stent graft repair ensured that it would be feasible to catheterise the renal arteries through the fenestrated cuff.
 
There are limitations to this study. The retrospective nature of our cohort risks inaccurate or incomplete information. The efficacy of the Palmaz stent in aneurysms with a short infrarenal neck could not be fully assessed, as the majority of such cases were considered directly for custom-made fenestrated or branched EVAR. In our limited experience, the Palmaz stent is a valuable tool to expand the boundary of endovascular treatment for AAA.
 
Conclusion
A Palmaz stent helps to secure proximal sealing and fixation. In our experience, Palmaz stenting is effective and safe as a salvage treatment of immediate proximal endoleak during EVAR. We emphasise the importance of appropriate patient selection, pre-EVAR planning, and diligent follow-up.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Bachoo P, Verhoeven EL, Larzon T. Early outcome of endovascular aneurysm repair in challenging aortic neck morphology based on experience from the GREAT C3 registry. J Cardiovasc Surg (Torino) 2013;54:573-80.
2. Matsumoto T, Tanaka S, Okadome J, et al. Midterm outcomes of endovascular repair for abdominal aortic aneurysms with the on-label use compared with the off-label use of an endoprosthesis. Surg Today 2015;45:880-5. Crossref
3. Hoshina K, Hashimoto T, Kato M, Ohkubo N, Shigematsu K, Miyata T. Feasibility of endovascular abdominal aortic aneurysm repair outside of the instructions for use and morphological changes at 3 years after the procedure. Ann Vasc Dis 2014;7:34-9. Crossref
4. Buth J, Harris PL, van Marrewijk C, Fransen G. The significance and management of different types of endoleaks. Semin Vasc Surg 2003;16:95-102. Crossref
5. Stanley BM, Semmens JB, Mai Q, et al. Evaluation of patient selection guidelines for endoluminal AAA repair with the Zenith Stent-Graft: the Australasian experience. J Endovasc Ther 2001;8:457-64. Crossref
6. Fransen GA, Vallabhaneni SR Sr, van Marrewijk CJ, et al. Rupture of infra-renal aortic aneurysm after endovascular repair: a series from EUROSTAR registry. Eur J Vasc Endovasc Surg 2003;26:487-93. Crossref
7. Chuter T, Greenberg RK. Standardized off-the-shelf components for multibranched endovascular repair of thoracoabdominal aortic aneurysms. Perspect Vasc Surg Endovasc Ther 2011;23:195-201. Crossref
8. Ou J, Chan YC, Cheng SW. A systematic review of fenestrated endovascular repair for juxtarenal and short-neck aortic aneurysm: evidence so far. Ann Vasc Surg 2015;29:1680-8. Crossref
9. Dias NV, Resch T, Malina M, Lindblad B, Ivancev K. Intraoperative proximal endoleaks during AAA stent-graft repair: evaluation of risk factors and treatment with Palmaz stents. J Endovasc Ther 2001;8:268-73. Crossref
10. Rajani RR, Arthurs ZM, Srivastava SD, Lyden SP, Clair DG, Eagleton MJ. Repairing immediate proximal endoleaks during abdominal aortic aneurysm repair. J Vasc Surg 2011;53:1174-7. Crossref
11. Arthurs ZM, Lyden SP, Rajani RR, Eagleton MJ, Clair DG. Long-term outcomes of Palmaz stent placement for intraoperative type Ia endoleak during endovascular aneurysm repair. Ann Vasc Surg 2011;25:120-6. Crossref
12. Chung J, Corriere MA, Milner R, et al. Midterm results of adjunctive neck therapies performed during elective infrarenal aortic aneurysm repair. J Vasc Surg 2010;52:1435-41. Crossref
13. Cox DE, Jacobs DL, Motaganahalli RL, Wittgen CM, Peterson GJ. Outcomes of endovascular AAA repair in patients with hostile neck anatomy using adjunctive balloon-expandable stents. Vasc Endovascular Surg 2006;40:35-40. Crossref
14. Qu L, Raithel D. Experience with the Endologix Powerlink endograft in endovascular repair of abdominal aortic aneurysms with short and angulated necks. Perspect Vasc Surg Endovasc Ther 2008;20:158-66. Crossref
15. Farley SM, Rigberg D, Jimenez JC, Moore W, Quinones-Baldrich W. A retrospective review of Palmaz stenting of the aortic neck for endovascular aneurysm repair. Ann Vasc Surg 2011;25:735-9. Crossref
16. Filis KA, Arko FR, Rubin GD, Zarins CK. Three-dimensional CT evaluation for endovascular abdominal aortic aneurysm repair. Quantitative assessment of the infrarenal aortic neck. Acta Chir Belg 2003;103:81-6. Crossref
17. Walker J, Tucker LY, Goodney P, et al. Adherence to endovascular aortic aneurysm repair device instructions for use guidelines has no impact on outcomes. J Vasc Surg 2015;61:1151-9. Crossref
18. Igari K, Kudo T, Toyofuku T, Jibiki M, Inoue Y. Outcomes following endovascular abdominal aortic aneurysm repair both within and outside of the instructions for use. Ann Thorac Cardiovasc Surg 2014;20:61-6. Crossref
19. Lee JT, Ullery BW, Zarins CK, Olcott C 4th, Harris EJ Jr, Dalman RL. EVAR deployment in anatomically challenging necks outside the IFU. Eur J Vasc Endovasc Surg 2013;46:65-73. Crossref
20. Robbins M, Kritpracha B, Beebe HG, Criado FJ, Daoud Y, Comerota AJ. Suprarenal endograft fixation avoids adverse outcomes associated with aortic neck angulation. Ann Vasc Surg 2005;19:172-7. Crossref
21. Verhoeven EL, Vourliotakis G, Bos WT, et al. Fenestrated stent grafting for short-necked and juxtarenal abdominal aortic aneurysm: an 8-year single-centre experience. Eur J Vasc Endovasc Surg 2010;39:529-36. Crossref
22. Donas KP, Kafetzakis A, Umscheid T, Tessarek J, Torsello G. Vascular endostapling: new concept for endovascular fixation of aortic stent-grafts. J Endovasc Ther 2008;15:499-503. Crossref
23. Avci M, Vos JA, Kolvenbach RR, et al. The use of endoanchors in repair EVAR cases to improve proximal endograft fixation. J Cardiovasc Surg (Torino) 2012;53:419-26.
24. Feng JX, Lu QS, Jing ZP, et al. Fibrin glue embolization treating intra-operative type I endoleak of endovascular repair of abdominal aortic aneurysm: long-term result [in Chinese]. Zhonghua Wai Ke Za Zhi 2011;49:883-7.
25. Scarcello E, Serra R, Morrone F, Tarsitano S, Triggiani G, de Franciscis S. Aortic banding and endovascular aneurysm repair in a case of juxtarenal aortic aneurysm with unsuitable infrarenal neck. J Vasc Surg 2012;56:208-11. Crossref
26. Lee WA. Endovascular abdominal aortic aneurysm sizing and case planning using the TeraRecon Aquarius workstation. Vasc Endovascular Surg 2007;41:61-7. Crossref
27. Parker MV, O’Donnell SD, Chang AS, et al. What imaging studies are necessary for abdominal aortic endograft sizing? A prospective blinded study using conventional computed tomography, aortography, and three-dimensional computed tomography. J Vasc Surg 2005;41:199-205. Crossref
28. Sobocinski J, Chenorhokian H, Maurel B, et al. The benefits of EVAR planning using a 3D workstation. Eur J Vasc Endovasc Surg 2013;46:418-23. Crossref
29. Velazquez OC, Woo EY, Carpenter JP, Golden MA, Barker CF, Fairman RM. Decreased use of iliac extensions and reduced graft junctions with software-assisted centerline measurements in selection of endograft components for endovascular aneurysm repair. J Vasc Surg 2004;40:222-7. Crossref
30. Gabelmann A, Krämer SC, Tomczak R, Görich J. Percutaneous techniques for managing maldeployed or migrated stents. J Endovasc Ther 2001;8:291-302. Crossref
31. Slonim SM, Dake MD, Razavi MK, et al. Management of misplaced or migrated endovascular stents. J Vasc Interv Radiol 1999;10:851-9. Crossref
32. Kim JK, Noll RE Jr, Tonnessen BH, Sternbergh WC 3rd. A technique for increased accuracy in the placement of the “giant” Palmaz stent for treatment of type IA endoleak after endovascular abdominal aneurysm repair. J Vasc Surg 2008;48:755-7. Crossref
33. Oikonomou K, Botos B, Bracale UM, Verhoeven EL. Proximal type I endoleak after previous EVAR with Palmaz stents crossing the renal arteries: treatment using a fenestrated cuff. J Endovasc Ther 2012;19:672-6. Crossref

Nephrolithiasis among male patients with newly diagnosed gout

Hong Kong Med J 2016 Dec;22(6):534–7 | Epub 9 Sep 2016
DOI: 10.12809/hkmj154694
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Nephrolithiasis among male patients with newly diagnosed gout
KS Wan, MD, PhD1,2; CK Liu, MD, MPH3,4; MC Ko, MD3; WK Lee, MD3; CS Huang, MD2
1 Department of Immunology and Rheumatology, Taipei City Hospital-Zhongxing Branch, Taiwan
2 Department of Pediatrics, Taipei City Hospital-Renai Branch, Taiwan
3 Department of Urology, Taipei City Hospital-Zhongxing Branch, Taiwan
4 Fu Jen Catholic University School of Medicine, Taiwan
 
Corresponding author: Dr KS Wan (gwan1998@gmail.com)
 
 
 Full paper in PDF
 
Abstract
Introduction: An elevated serum urate level is recognised as a cause of gouty arthritis and uric acid stone. The level of serum uric acid that accelerates kidney stone formation, however, has not yet been clarified. This study aimed to find out if a high serum urate level is associated with nephrolithiasis.
 
Methods: Patients were recruited from the rheumatology clinic of Taipei City Hospital (Renai and Zhongxing branches) in Taiwan from March 2015 to February 2016. A total of 120 Chinese male patients with newly diagnosed gout and serum urate concentration of >7 mg/dL and no history of kidney stones were divided into two groups according to their serum urate level: <10 mg/dL (group 1, n=80) and ≥10 mg/dL (group 2, n=40). The mean body mass index, blood urea nitrogen level, creatinine level, urinary pH, and kidney ultrasonography were compared between the two groups.
 
Results: There were no significant differences in blood urea nitrogen or creatinine level between the two groups. Urine pH was also similar in both groups, with no statistically significant difference. Kidney stone formation was detected via ultrasonography in 6.3% (5/80) and 82.5% (33/40) of patients in groups 1 and 2, respectively (P<0.05).
 
Conclusion: A serum urate level of ≥10 mg/dL may precipitate nephrolithiasis. Further studies are warranted to substantiate the relationship between serum urate level and kidney stone formation.
 
 
New knowledge added by this study
  • Hyperuricaemia is a risk factor for renal stone formation, which is associated with a substantially higher prevalence of nephrolithiasis on ultrasonography.
  • Patients with gouty arthritis and serum urate level of ≥10 mg/dL should be advised to have renal ultrasonography.
 
 
Introduction
Over the past century, kidney stones have become increasingly prevalent, particularly in more developed countries. The incidence of urolithiasis in a given population is dependent on the geographic area, racial distribution, socio-economic status, and dietary habits.1 In general, patients with a history of gout are at greater risk of forming uric acid stones, as are patients with obesity, diabetes, or complete metabolic syndrome.2 Moreover, elevated serum urate levels are known to lead to gouty arthritis, tophi formation, and uric acid kidney stones.3 The incidence of uric acid stones varies between countries and accounts for 5% to 40% of all urinary calculi.4 Certain risk factors may be involved in the pathogenesis of uric acid nephrolithiasis, including low urinary volume and persistently low urinary pH.5
 
Calcium oxalate stones may form in some patients with gouty diathesis due to increased urinary excretion of calcium and reduced excretion of citrate. In addition, relative hypercalciuria in gouty diathesis with calcium oxalate stones may be due to intestinal hyperabsorption of calcium.6 Most urinary uric acid calculi are not pure in composition; complex urates of sodium, potassium, and calcium have been found together in various proportions.7 An analysis of stones in gout patients in Japan showed that the proportion of common calcium salt stones was over 60%, while that of uric acid stones was only 30%.8 This implies that disrupted uric acid metabolism promotes not only uric acid stones, but also calcium salt stones. A high serum urate level might therefore be associated with nephrolithiasis, and this provided the rationale for this study.
 
Methods
Overall, 120 male patients with newly diagnosed gouty arthritis, a serum urate concentration of >7 mg/dL, and no previous kidney stone disease were allocated to one of two groups according to their serum uric acid level: <10 mg/dL (group 1, n=80) and ≥10 mg/dL (group 2, n=40). Patients were recruited from the rheumatology clinic of Taipei City Hospital (Renai and Zhongxing branches), a tertiary community hospital in Taiwan, from March 2015 to February 2016. They had been newly diagnosed with gout but had no clinical suggestion of renal stone disease. The exclusion criteria included previously treated gouty arthritis and current prescription of urate reabsorption inhibitors. Each patient's age, duration of gouty arthritis, presence of tophi, body mass index (BMI), blood urea nitrogen (BUN), creatinine, and urinary pH were recorded, and kidney ultrasonography was performed and analysed. This study was approved by the hospital's Institutional Review Board, with the requirement for informed consent waived.
 
Results for continuous variables were given as means ± standard deviations. Student's t test was used to compare continuous physical characteristics between the subject groups, and the Chi-squared test was used to compare the difference in the stone detection rate between the two groups. A P value of <0.05 was regarded as statistically significant for two-sided tests. The Statistical Package for the Social Sciences (Windows version 12.0; SPSS Inc, Chicago [IL], US) was used for all statistical analyses.
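For readers who wish to reproduce the key comparison, the Chi-squared test on the stone detection rates can be sketched from the counts reported in the Results (5/80 in group 1 vs 33/40 in group 2). The following is a minimal illustration in Python with SciPy, not the authors' SPSS analysis.

```python
# Minimal sketch of the chi-squared comparison of stone detection rates,
# using the counts reported in the Results (not the authors' SPSS output).
from scipy.stats import chi2_contingency

#         stone  no stone
table = [[5,     75],   # group 1: serum urate <10 mg/dL (n=80)
         [33,    7]]    # group 2: serum urate >=10 mg/dL (n=40)

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-squared = {chi2:.1f}, dof = {dof}, P = {p:.1e}")  # P well below 0.05
```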
 
Results
The mean age of the two study groups was similar (40 years). A family history of gout was present in 67.5% and 90% of groups 1 and 2, respectively. The time elapsed since onset of gout was less than 4 years in both groups. Tophaceous gout was found in 8.8% of group 1 and 10.0% of group 2. The proportion of patients with a BMI of ≥30 kg/m2 did not differ significantly between the two groups. Only 6% of group 2 patients with kidney stones had a BMI of >95th percentile. In most cases, urinary pH was less than 5.5 in both groups and there were no abnormal changes to BUN or creatinine levels. Interestingly, the prevalence of kidney stones detected by ultrasonography was 6.3% in group 1 and 82.5% in group 2 (P<0.05). The sensitivity and specificity of a high serum urate level (≥10 mg/dL) in predicting kidney stones were 87% and 91%, respectively (Table).
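The reported sensitivity and specificity appear to follow from treating a serum urate level of ≥10 mg/dL as the screening test and a stone on ultrasonography as the reference finding; under that assumption, the counts above give:

```latex
\[
\text{sensitivity} = \frac{33}{33+5} \approx 87\%, \qquad
\text{specificity} = \frac{75}{75+7} \approx 91\%
\]
```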
 

Table. Risk factors of male gout patients with and without nephrolithiasis
 
Discussion
Gout is a common metabolic disorder characterised by chronic hyperuricaemia, with serum urate levels of >6.8 mg/dL exceeding the physiological threshold of saturation. Urolithiasis is one of the well-known complications of gout. We hypothesise that serum urate level can be used as a predictive marker for urolithiasis. Uric acid, a weak organic acid, has very low pH-dependent solubility in aqueous solution. Approximately 70% of urate elimination occurs in urine, and the kidney plays a dominant role in determining the plasma urate level.9 A serum urate level of >7 mg/dL is recognised as leading to gouty arthritis and uric acid stone formation. Moreover, recent epidemiological studies have identified serum urate elevation as an independent risk factor for chronic kidney disease, cardiovascular disease, and hypertension.3 Impaired renal uric acid excretion is the major mechanism of hyperuricaemia in patients with primary gout.10 The molecular mechanisms of renal urate transport are still incompletely understood. Urate transporter 1 is an organic anion transporter with highly specific urate transport activity, exchanging this anion with others, including most of the endogenous organic anions and drug anions that are known to affect renal uric acid transport.10 11
 
Uric acid stones account for 10% of all kidney stones and are the second most common cause of urinary stones after calcium oxalate and calcium phosphate. The most important risk factor for uric acid crystallisation and stone formation is a low urine pH (<5.5) rather than an increased urinary uric acid excretion.12 The proportion of uric acid stones varies between countries and accounts for 5% to 40% of all urinary calculi.4 Uric acid homeostasis is determined by the balance between its production, intestinal secretion, and renal excretion. The kidney is an important regulator of circulating uric acid levels, reabsorbing about 90% of filtered urate and being responsible for 60% to 70% of total body uric acid excretion, a key factor underpinning hyperuricaemia and gout.13 Pure uric acid stones are radiolucent but well visualised on renal ultrasound or non-contrast helical computed tomographic scanning; the latter is especially good for detection of stones that are <5 mm in size.14 Nonetheless, the reason why most patients with gout present with acidic urine, even though only 20% have uric acid stones, remains unclear. In a US study, the prevalence of kidney stone disease was almost two-fold higher in men with a history of gout than in those without (15% vs 8%).15 Higher adiposity and weight gain are strong risk factors for gout in men, while weight loss is protective.15 An analysis by Shimizu8 of stones in gout patients revealed that the proportion of common calcium salt stones was over 60%, while that of uric acid stones was only about 30%. Overweight/obesity and older age associated with low urine pH were the principal characteristics of ‘pure’ uric acid stone formers. Impaired urate excretion associated with increased serum uric acid is another characteristic of uric acid stone formers and resembles that seen in patients with primary gout. Patients with pure calcium oxalate stones were younger; they had a low proportion of obese subjects and higher urinary calcium.16
 
Conventionally, BMI was stratified as normal (<25 kg/m2), overweight (25-29.9 kg/m2), or obese (≥30 kg/m2). In males, the proportion of uric acid stones gradually increased with BMI, from 7.1% in normal BMI to 28.7% in obese subjects.17 The same was true in females, with the proportion of uric acid stones rising from 6.1% in normal BMI to 17.1% in obese subjects.17 Studies found that BMI is associated with an increased risk of kidney stone disease, but with a BMI of >30 kg/m2, further increases do not appear to significantly increase the risk of stone disease.17 18 An independent association between kidney stone disease and gout strongly suggests that they share common underlying pathophysiological mechanisms.19
 
Three major conditions control the potential for uric acid stone formation: the quantity of uric acid, the volume of urine as it affects the urinary concentration of uric acid, and the urinary pH.20 Two major abnormalities have been suggested to explain overly acidic urine: increased net acid excretion and impaired buffering caused by defective urinary ammonium excretion, the combination resulting in abnormally acidic urine.21 Urinary alkalisation, which involves maintaining a continuously high urinary pH (pH 6-6.5), is considered by many to be the treatment of choice for uric acid stone dissolution and prevention.20 In general, gout is caused by the deposition of monosodium urate crystals in tissue that provokes a local inflammatory reaction. The formation of monosodium urate crystals is facilitated by hyperuricaemia. In a study by Sakhaee and Maalouf,21 being overweight and of older age were associated with low urine pH and were among the principal characteristics of pure uric acid stone formers. Impaired urate excretion associated with increased serum uric acid was another characteristic of uric acid stone formers that resembles patients with primary gout.
 
The limitations of this study included the lack of measurement of urinary uric acid concentration in the participants, the absence of confirmatory computed tomographic scanning for kidney stones, the lack of stone composition analysis, and the limited representativeness of the study subjects. For example, there were only 10 obese patients (BMI ≥30 kg/m2) in the analysis. In this study, hyperuricaemia was a risk factor for kidney stone formation. Patients with a serum urate level of ≥10 mg/dL should undergo ultrasound examination to look for nephrolithiasis.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. López M, Hoppe B. History, epidemiology and regional diversities of urolithiasis. Pediatr Nephrol 2010;25:49-59. Crossref
2. Liebman SE, Taylor JG, Bushinsky DA. Uric acid nephrolithiasis. Curr Rheumatol Rep 2007;9:251-7. Crossref
3. Edwards NL. The role of hyperuricemia and gout in kidney and cardiovascular disease. Cleve Clin J Med 2008;75 Suppl 5:S13-6. Crossref
4. Shekarriz B, Stoller ML. Uric acid nephrolithiasis: current concepts and controversies. J Urol 2002;168:1307-14. Crossref
5. Ngo TC, Assimos DG. Uric acid nephrolithiasis: recent progress and future directions. Rev Urol 2007;9:17-27.
6. Pak CY, Moe OW, Sakhaee K, Peterson RD, Poindexter JR. Physicochemical metabolic characteristics for calcium oxalate stone formation in patients with gouty diathesis. J Urol 2005;173:1606-9. Crossref
7. Bellanato J, Cifuentes JL, Salvador E, Medina JA. Urates in uric acid renal calculi. Int J Urol 2009;16:318-21; discussion 322. Crossref
8. Shimizu T. Urolithiasis and nephropathy complicated with gout [in Japanese]. Nihon Rinsho 2008;66:717-22.
9. Marangella M. Uric acid elimination in the urine. Pathophysiological implications. Contrib Nephrol 2005;147:132-48.
10. Taniguchi A, Kamatani N. Control of renal uric acid excretion and gout. Curr Opin Rheumatol 2008;20:192-7. Crossref
11. Yamauchi T, Ueda T. Primary hyperuricemia due to decreased renal uric acid excretion [in Japanese]. Nihon Rinsho 2008;66:679-81.
12. Ferrari P, Bonny O. Diagnosis and prevention of uric acid stones [in German]. Ther Umsch 2004;61:571-4. Crossref
13. Bobulescu IA, Moe OW. Renal transport of uric acid: evolving concepts and uncertainties. Adv Chronic Kidney Dis 2012;19:358-71. Crossref
14. Wiederkehr MR, Moe OW. Uric acid nephrolithiasis: a systemic metabolic disorder. Clin Rev Bone Miner Metab 2011;9:207-17. Crossref
15. Choi HK, Atkinson K, Karlson EW, Curhan G. Obesity, weight change, hypertension, diuretic use, and risk of gout in men: the health professionals follow-up study. Arch Intern Med 2005;165:742-8. Crossref
16. Negri AL, Spivacow R, Del Valle E, et al. Clinical and biochemical profile of patients with “pure” uric acid nephrolithiasis compared with “pure” calcium oxalate stone formers. Urol Res 2007;35:247-51. Crossref
17. Daudon M, Lacour B, Jungers P. Influence of body size on urinary stone composition in men and women. Urol Res 2006;34:193-9. Crossref
18. Semins MJ, Shore AD, Makary MA, Magnuson T, Johns R, Matlaga BR. The association of increasing body mass index and kidney stone disease. J Urol 2010;183:571-5. Crossref
19. Kramer HM, Curhan G. The association between gout and nephrolithiasis: the National Health and Nutrition Examination Survey III, 1988-1994. Am J Kidney Dis 2002;40:37-42. Crossref
20. Cicerello E, Merlo F, Maccatrozzo L. Urinary alkalization for the treatment of uric acid nephrolithiasis. Arch Ital Urol Androl 2010;82:145-8.
21. Sakhaee K, Maalouf NM. Metabolic syndrome and uric acid nephrolithiasis. Semin Nephrol 2008;28:174-80. Crossref

Silver-Russell syndrome in Hong Kong

Hong Kong Med J 2016 Dec;22(6):526–33 | Epub 29 Jul 2016
DOI: 10.12809/hkmj154750
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Silver-Russell syndrome in Hong Kong
HM Luk, MB, BS, FHKAM (Paediatrics)1#; KS Yeung, BSc, MPhil2#; WL Wong, BSc, MPhil2; Brian HY Chung, FHKAM (Paediatrics), FCCMG (Clinical Genetics)2; Tony MF Tong, MPhil, MSc1; Ivan FM Lo, MB, ChB, FHKAM (Paediatrics)1
1 Clinical Genetic Service, Department of Health, 3/F, Cheung Sha Wan Jockey Club Clinic, 2 Kwong Lee Road, Sham Shui Po, Hong Kong
2 Department of Paediatrics and Adolescent Medicine, Queen Mary Hospital, The University of Hong Kong, Pokfulam, Hong Kong
 
# Co-first author
 
Corresponding author: Dr Ivan FM Lo (con_cg@dh.gov.hk)
 
 Full paper in PDF
Abstract
Objectives: To examine the molecular pathogenetic mechanisms, (epi)genotype-phenotype correlation, and the performance of the three clinical scoring systems—namely Netchine et al, Bartholdi et al, and Birmingham scores—for patients with Silver-Russell syndrome in Hong Kong.
 
Methods: This retrospective case series was conducted at two tertiary genetic clinics, the Clinical Genetic Service, Department of Health, and clinical genetic clinic in Queen Mary Hospital in Hong Kong. All records of patients with suspected Silver-Russell syndrome under the care of the two genetic clinics between January 2010 and September 2015 were retrieved from the computer database.
 
Results: Of the 28 live-birth patients with Silver-Russell syndrome, 35.7% had H19 loss of DNA methylation, 21.4% had maternal uniparental disomy of chromosome 7, 3.6% had mosaic maternal uniparental disomy of chromosome 11, and the remaining 39.3% were Silver-Russell syndrome of unexplained molecular origin. No significant correlation between (epi)genotype and phenotype could be identified between H19 loss of DNA methylation and maternal uniparental disomy of chromosome 7. Comparison of molecularly confirmed patients and patients with Silver-Russell syndrome of unexplained origin revealed that postnatal microcephaly and café-au-lait spots were more common in the latter group, and body and limb asymmetry was more common in the former group. Performance analysis showed the Netchine et al and Birmingham scoring systems had similar sensitivity in identifying Hong Kong Chinese subjects with Silver-Russell syndrome.
 
Conclusion: This is the first territory-wide study of Silver-Russell syndrome in Hong Kong. The clinical features and the spectrum of underlying epigenetic defects were comparable to those reported in western populations.
 
New knowledge added by this study
  • The epigenetic defects of Silver-Russell syndrome (SRS) in Hong Kong Chinese patients are comparable to those reported in western populations.
  • No epigenotype-phenotype correlation was demonstrated among SRS patients in this study.
Implications for clinical practice or policy
  • All suspected SRS patients should be referred to a genetic clinic for assessment.
  • A new diagnostic algorithm has been proposed for Chinese patients with SRS.
 
 
Introduction
Silver-Russell syndrome (SRS) [OMIM 180860] is a clinically and genetically heterogeneous congenital imprinting disorder. It was first described in 1953 by Dr Henry Silver and his colleagues, who reported two children with short stature and congenital hemihypertrophy.1 In the following year, Dr Alexander Russell reported five similar cases with intrauterine dwarfism and craniofacial dysostosis.2 The term SRS has been used since 1970 to describe a constellation of features with intrauterine growth retardation without postnatal catch-up, distinct facial characteristics, relative macrocephaly, body asymmetry, and/or fifth finger clinodactyly.3 4 The prevalence of SRS was estimated to be 1 in 100 000,5 but was probably underestimated due to the diverse and variable clinical manifestations. The majority of SRS cases are sporadic, although occasional familial cases have been reported.
 
Two major molecular mechanisms have been implicated in SRS: maternal uniparental disomy of chromosome 7 (mUPD7)6 and loss of DNA methylation (LOM) of the imprinting control region 1 (ICR1) on the paternal allele of the chromosome 11p15 region that regulates the IGF2/H19 locus.6 7 8 9 According to these studies, LOM of ICR1 and mUPD7 roughly account for 45% to 50% and 5% to 10% of SRS cases, respectively.6 7 8 9 Rare cytogenetic rearrangements have also been reported in 1% to 2% of cases.4 10 11 In the remaining 30% to 40% of SRS cases, however, the molecular mechanism remains elusive.
 
Owing to the wide spectrum of clinical presentations of SRS, there is considerable clinical overlap with other growth retardation syndromes. At present there is no consensus for the diagnostic criteria, so diagnosing SRS is challenging. Several scoring systems have been proposed to facilitate clinical diagnosis and to guide genetic testing.7 11 12 13 14 Based on the prevalence of different molecular mechanisms, methylation study of the 11p15 region is the recommended first-tier investigation for patients with suspected SRS, and mUPD7 analysis is the second tier.14
 
A comprehensive clinical and molecular study of SRS has not been reported in the Chinese population. We therefore conducted a retrospective review to summarise the clinical and genetic findings of all SRS patients in Hong Kong. The sensitivity and specificity of different scoring systems7 11 12 13 14 in identifying Hong Kong Chinese SRS patients were also studied.
 
Methods
Patients
The Clinical Genetic Service (CGS), Department of Health and the clinical genetic clinic at Queen Mary Hospital (QMH), The University of Hong Kong, are the only two tertiary genetic referral centres that provide comprehensive genetic counselling, and diagnostic and laboratory service for the Hong Kong population. Patients with a clinical suspicion of growth failure due to genetic causes or possibly SRS were referred for assessment and genetic testing.
 
In this review, all records of patients with suspected SRS seen at the CGS or clinical genetic clinic of QMH between January 2010 and September 2015 were retrieved from the computer database system using the key words of “Silver Russell syndrome” and “failure to thrive and growth retardation”. The clinical and laboratory data of these patients were retrospectively analysed. Patients with alternative diagnoses after assessment and genetic investigation were excluded. This study was done in accordance with the principles outlined in the Declaration of Helsinki.
 
Clinical diagnostic criteria for Silver-Russell syndrome in this study
Currently, there is no universal consensus on the diagnostic criteria for SRS, but Hitchins et al's criteria15 are the most commonly used in clinical practice. In this study, the diagnosis of SRS was made when a patient fulfilled three major criteria, or two major and two minor criteria.
 
Major criteria included (1) intrauterine growth retardation/small for gestational age (<10th percentile); (2) postnatal growth with height/length <3rd percentile; (3) normal head circumference (3rd-97th percentile); and (4) limb, body, and/or facial asymmetry.
 
Minor criteria included (1) short arm span with normal upper-to-lower segment ratio; (2) fifth finger clinodactyly; (3) triangular facies; and (4) frontal bossing/prominent forehead.
 
Epimutation in imprinting control region 1
Investigation of the methylation status and copy number change of the H19 differentially methylated region (H19 DMR) and KvDMR1 at the chromosome 11p15 region was performed with the methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) method, using the SALSA MLPA ME030-B1 BWS/RSS kit (MRC-Holland, Amsterdam, The Netherlands). Following the manufacturer's instructions, approximately 100 ng of genomic DNA was first denatured and hybridised overnight with the probe mixture supplied with the kit. The samples were then split into two portions and treated either with ligase alone or with ligase and HhaI. Polymerase chain reaction (PCR) was then performed with the reagents and primers supplied in the kit. The PCR products were separated by capillary electrophoresis (model 3130xl; Applied Biosystems, Foster City [CA], US). The electropherograms were analysed using GeneScan software (Applied Biosystems, Foster City [CA], US), and the relative peak areas were calculated using Coffalyser version 9.4 software (MRC-Holland, Amsterdam, The Netherlands).
 
Analysis of maternal uniparental disomy of chromosome 7
We studied mUPD7 with eight polymorphic microsatellite markers, three on 7p and five on 7q (D7S531, D7S507, D7S2552, D7S2429, D7S2504, D7S500, D7S2442, and D7S2465), using a standard protocol. Haplotype analysis was then performed. A diagnosis of mUPD7 required evidence of exclusive maternal inheritance at two or more informative markers.
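The decision rule described above can be illustrated with a short sketch. The allele calls below are invented for illustration (only the marker names come from the panel), and this is not the laboratory's actual analysis software.

```python
# Illustrative sketch of the mUPD7 decision rule described above.
# Allele calls are hypothetical; only the marker names come from the panel.
def exclusively_maternal(child, mother, father):
    """A marker is informative for exclusive maternal inheritance when every
    child allele is present in the mother and none is present in the father."""
    return set(child) <= set(mother) and set(child).isdisjoint(father)

genotypes = {  # marker: (child, mother, father) allele sizes -- hypothetical
    "D7S531": ((152, 152), (152, 156), (148, 160)),
    "D7S507": ((201, 205), (201, 205), (199, 203)),
    "D7S500": ((133, 137), (133, 141), (137, 145)),
}

informative = [m for m, (c, mo, fa) in genotypes.items()
               if exclusively_maternal(c, mo, fa)]
# The study required two or more informative markers to diagnose mUPD7
print(informative, "-> mUPD7 supported" if len(informative) >= 2 else "-> not diagnostic")
```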
 
Data analysis and (epi)genotype-phenotype correlation
Epidemiological data, physical characteristics, growth records, and molecular findings were collected for analysis. Clinical photographs were taken during consultation (Fig 1). In order to delineate the (epi)genotype-phenotype correlation, we divided the patients according to their (epi)genotype, namely H19 LOM, mUPD7, mosaic maternal uniparental disomy of chromosome 11 (mUPD11), or SRS of unexplained origin. SRS of unexplained origin was defined as a negative result on both the 11p15 epimutation and mUPD7 studies. For statistical calculation, Student's t test was used for continuous variables and Fisher's exact test for categorical variables. Two-tailed P values were computed. Differences were considered statistically significant when P≤0.05.
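As an illustration of the categorical comparison, Fisher's exact test on a 2 × 2 table can be sketched as follows. The group sizes (17 molecularly confirmed, 11 of unexplained origin) come from the Results; the feature counts are hypothetical placeholders, not data from Table 2.

```python
# Sketch of the Fisher's exact test used for categorical comparisons.
# Group sizes are from the Results; the feature counts below are hypothetical.
from scipy.stats import fisher_exact

confirmed_with, confirmed_total = 2, 17      # hypothetical count of a feature
unexplained_with, unexplained_total = 6, 11  # hypothetical count of a feature

table = [[confirmed_with, confirmed_total - confirmed_with],
         [unexplained_with, unexplained_total - unexplained_with]]

_, p = fisher_exact(table)  # two-sided by default
print(f"two-tailed P = {p:.3f}")  # considered significant if P <= 0.05
```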
 

Figure 1. Clinical photos for molecularly confirmed SRS in this study
Patients with (a to d) mUPD7 and (e to g) H19 LOM. All had relative macrocephaly, frontal bossing, triangular face, and pointed chin. Patients showing (e) fifth finger clinodactyly and (f) body asymmetry. (h) Informative microsatellite markers in UPD study that shows mUPD7
 
Clinical score
Three clinical scoring systems were applied to all patients referred with suspected SRS: the Netchine et al score,7 the Bartholdi et al score,12 and the Birmingham score.14 An overview of the three SRS scoring systems is presented in Table 1. Using Hitchins et al's criteria15 as the standard in this study, the sensitivity and specificity of these three scoring systems in identifying SRS were compared.
 

Table 1. Comparison of three common clinical scoring systems for SRS
 
Results
During the study period, 83 patients with suspected SRS were referred to the two genetic clinics. After clinical assessment and investigation, 54 patients had an alternative diagnosis. The remaining 29 patients were clinically diagnosed with SRS using Hitchins et al's criteria.15 All were Chinese. One was a prenatal case with maternal H19 duplication; since termination of pregnancy was performed at 23 weeks of gestation, this case was excluded from downstream analysis. For the remaining 28 SRS patients, age at the end of the study (September 2015) ranged from 2 years to 22 years 9 months, with a median of 9 years 4 months. The male-to-female ratio was 9:5. Sequential MS-MLPA study of the chromosome 11p15 region and mUPD7 study were performed on all SRS patients. Among the 28 live-birth SRS patients, 35.7% (n=10) had H19 LOM, 21.4% (n=6) had mUPD7, 3.6% (n=1) had mosaic mUPD11, and 39.3% (n=11) had SRS of unexplained origin. The clinical features of the SRS cohort are summarised in Table 2. The clinical features of some molecularly confirmed SRS patients in this study and one illustrative microsatellite electropherogram from mUPD7 analysis are shown in Figure 1.
 

Table 2. Summary of the clinical features in different subgroups of SRS patients
 
In order to study the (epi)genotype-phenotype correlation between the H19 LOM and mUPD7 groups, their clinical features were compared. There was no significant difference between the two groups (data not shown). When the 17 molecularly confirmed SRS patients were compared with the 11 patients with SRS of unexplained origin, postnatal microcephaly (P=0.01) and café-au-lait spots (P=0.05) were more common in the group with SRS of unexplained origin, while body asymmetry (P<0.01) and limb asymmetry (P<0.01) were more common in the molecularly confirmed group.
 
The performance of the three clinical scoring systems, namely the Netchine et al score,7 Bartholdi et al score,12 and Birmingham score,14 in identifying SRS in our cohort was compared. The proportion of molecularly confirmed cases among those classified as ‘likely SRS’ and ‘unlikely SRS’ by each scoring system is summarised in Table 3. The sensitivity and specificity of the different scoring systems for identifying SRS are summarised in Table 4.
 

Table 3. Proportion of different SRS subtypes with ‘likely SRS’ and ‘unlikely SRS’ score in different scoring systems in our cohort
 

Table 4. The sensitivity and specificity of the three clinical scoring systems compared with Hitchin et al’s criteria15 in identifying SRS in our cohort
 
Discussion
Silver-Russell syndrome is a clinically and genetically heterogeneous disorder. This is the first comprehensive clinical and epigenetic study of SRS in Hong Kong. With sequential 11p15 epimutation analysis and mUPD7 study of the SRS patients in this cohort, molecular confirmation was achieved in 60.7% of cases; H19 LOM and mUPD7 accounted for 35.7% and 21.4% of cases, respectively. Although the proportion of H19 LOM–related SRS cases was similar to that in western and Japanese populations,6 7 8 9 16 the proportion of mUPD7 in our cohort was significantly higher. Nonetheless, due to the small sample size, this observation might not reflect a true ethnic-specific difference in epigenetic alterations in the Chinese population. Further studies are necessary to confirm this difference.
 
In previous studies of (epi)genotype-phenotype correlation4 7 12 17 18 19 20 in SRS, patients with mUPD7 had a milder phenotype but were more likely to have developmental delay. On the contrary, patients with H19 LOM appeared to have more typical SRS features such as characteristic facial profile and body asymmetry. Such a correlation could not be demonstrated in our cohort. When comparing the molecularly confirmed and SRS of unexplained origin groups, postnatal microcephaly and café-au-lait spots were more common in the group of SRS of unexplained origin, while body/limb asymmetry was more common in the molecularly confirmed group. This observation has also been reported in Japanese SRS patients.16 This might be due to the greater clinical and genetic heterogeneity in the molecularly negative SRS.
 
Although SRS has been extensively studied, there remains no universal consensus on the clinical diagnostic criteria. Hitchins et al's criteria15 are currently the most commonly used. To facilitate clinical diagnosis, several additional scoring systems have been proposed, including the Netchine et al,7 Bartholdi et al,12 and Birmingham scores.14 Each has its advantages and limitations. The major caveats of these scoring systems include the relative subjectivity of clinical signs, and time-dependent and evolving clinical features. The heterogeneity of clinical manifestations also limits their application. To validate these scoring systems, several studies have evaluated their accuracy in predicting the result of molecular genetic testing.14 21 We also evaluated the performance of these three scoring systems in this Chinese cohort. All three scoring systems were 100% specific in diagnosing SRS, but the sensitivities of the Netchine et al score,7 Bartholdi et al score,12 and Birmingham score14 were 75%, 53.6%, and 71.4%, respectively, when compared with Hitchins et al's criteria.15 This suggests that Hitchins et al's criteria15 remain the most sensitive diagnostic criteria for SRS when used clinically.
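Assuming the denominator for each sensitivity is the 28 patients who satisfied Hitchins et al's criteria, the reported figures correspond to the following implied counts of patients classified as ‘likely SRS’ by each system (an inference from the percentages, not figures taken from Table 4):

```latex
\[
\frac{21}{28} = 75\%, \qquad
\frac{15}{28} \approx 53.6\%, \qquad
\frac{20}{28} \approx 71.4\%
\]
```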
 
The management of SRS is challenging and requires multidisciplinary input. Growth hormone (GH) treatment is currently recommended for children who are small for gestational age without spontaneous catch-up growth and for those with GH deficiency. In SRS, abnormalities in spontaneous GH secretion and subnormal responses to provocative GH stimulation have been well reported.20 The proposed mechanism is dysregulation of growth factors and their major binding proteins,4 particularly in the H19 LOM group. In addition, SRS patients are expected to have poor catch-up growth. Nonetheless, GH therapy is not a universal standard treatment for SRS. In Hong Kong, the Hospital Authority indications for GH therapy22 do not include SRS without GH response abnormalities. In our cohort, only three patients, all of whom had a suboptimal GH provocative stimulation test, are currently receiving GH treatment. The long-term outcome is not yet known.
 
Although tissue-specific epigenetic alterations have been reported in SRS,23 mosaic genetic or epigenetic alteration is uncommon.24 We had one patient with mosaic mUPD11 confirmed by molecular testing of peripheral blood and buccal swab samples. Mosaicism should be considered when a patient has a typical SRS phenotype but negative routine testing. Testing of other tissues should be pursued to provide an accurate molecular diagnosis that can guide subsequent genetic counselling and clinical management.
 
Finally, it is well known from the literature that gain of function of the CDKN1C gene25 and maternal UPD14 (Temple syndrome)26 27 can result in a phenotype mimicking SRS. There are also other syndromic growth retardation disorders with many clinical features that overlap with those of SRS, such as mulibrey nanism and 3-M syndrome.28 29 Therefore, with the latest understanding of the molecular pathogenetic mechanisms of SRS, together with published evidence21 30 31 and the results of this study, we propose the diagnostic algorithm for Chinese SRS patients depicted in Figure 2. All clinically suspected SRS patients should be assessed by a clinical geneticist. Although the Netchine et al score,7 Bartholdi et al score,12 and Birmingham score14 are highly specific, they are less sensitive than Hitchins et al's criteria15 for diagnosing SRS in our Chinese cohort. Therefore, Hitchins et al's criteria15 should be used clinically to classify suspected SRS patients as ‘likely’ or ‘unlikely’ SRS. For ‘likely SRS’ patients, sequential 11p15 region methylation study and mUPD7 analysis should be performed, because 11p15 region epigenetic alteration is more prevalent than mUPD7 in SRS. For molecularly unconfirmed SRS, further testing for other SRS-like syndromes, including Temple syndrome and CDKN1C-related disorder, should be pursued if indicated.
 

Figure 2. Proposed algorithm for management and genetic investigations for suspected SRS in Hong Kong
 
Conclusion
This 5-year review is the first territory-wide study of Chinese SRS patients in Hong Kong. It showed that the clinical features and underlying epigenetic mechanisms of Chinese SRS are similar to those in western populations. Early diagnosis and multidisciplinary care are important in managing SRS patients. Vigilant clinical suspicion with confirmation by molecular testing is essential. Based on the current evidence and a performance evaluation of different clinical scoring systems, a comprehensive diagnostic algorithm is proposed. We hope that with an increased understanding of the underlying pathophysiology and the (epi)genotype-phenotype correlation in Chinese SRS patients, the quality of medical care will be greatly improved in the near future.
 
Acknowledgements
We thank all the paediatricians and physicians who have referred their SRS patients to our service and QMH. We are also grateful to all the laboratory staff in CGS for their technical support. This work in HKU is supported by HKU small project funding and The Society for the Relief of Disabled Children in Hong Kong.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Silver HK, Kiyasu W, George J, Deamer WC. Syndrome of congenital hemihypertrophy, shortness of stature, and elevated urinary gonadotropins. Pediatrics 1953;12:368-76.
2. Russell A. A syndrome of intra-uterine dwarfism recognizable at birth with cranio-facial dysostosis, disproportionately short arms, and other anomalies (5 examples). Proc R Soc Med 1954;47:1040-4.
3. Wollmann HA, Kirchner T, Enders H, Preece MA, Ranke MB. Growth and symptoms in Silver-Russell syndrome: review on the basis of 386 patients. Eur J Pediatr 1995;154:958-68. Crossref
4. Wakeling EL, Amero SA, Alders M, et al. Epigenotype-phenotype correlations in Silver-Russell syndrome. J Med Genet 2010;47:760-8. Crossref
5. Christoforidis A, Maniadaki I, Stanhope R. Managing children with Russell-Silver syndrome: more than just growth hormone treatment? J Pediatr Endocrinol Metab 2005;18:651-2. Crossref
6. Kotzot D, Schmitt S, Bernasconi F, et al. Uniparental disomy 7 in Silver-Russell syndrome and primordial growth retardation. Hum Mol Genet 1995;4:583-7. Crossref
7. Netchine I, Rossignol S, Dufourg MN, et al. 11p15 Imprinting center region 1 loss of methylation is a common and specific cause of typical Russell-Silver syndrome: clinical scoring system and epigenetic-phenotypic correlations. J Clin Endocrinol Metab 2007;92:3148-54. Crossref
8. Gicquel C, Rossignol S, Cabrol S, et al. Epimutation of the telomeric imprinting center region on chromosome 11p15 in Silver-Russell syndrome. Nat Genet 2005;37:1003-7. Crossref
9. Schönherr N, Meyer E, Eggermann K, Ranke MB, Wollmann HA, Eggermann T. (Epi)mutations in 11p15 significantly contribute to Silver-Russell syndrome: but are they generally involved in growth retardation? Eur J Med Genet 2006;49:414-8. Crossref
10. Azzi S, Abi Habib W, Netchine I. Beckwith-Wiedemann and Russell-Silver Syndromes: from new molecular insights to the comprehension of imprinting regulation. Curr Opin Endocrinol Diabetes Obes 2014;21:30-8. Crossref
11. Price SM, Stanhope R, Garrett C, Preece MA, Trembath RC. The spectrum of Silver-Russell syndrome: a clinical and molecular genetic study and new diagnostic criteria. J Med Genet 1999;36:837-42.
12. Bartholdi D, Krajewska-Walasek M, Ounap K, et al. Epigenetic mutations of the imprinted IGF2-H19 domain in Silver-Russell syndrome (SRS): results from a large cohort of patients with SRS and SRS-like phenotypes. J Med Genet 2009;46:192-7. Crossref
13. Eggermann T, Gonzalez D, Spengler S, Arslan-Kirchner M, Binder G, Schönherr N. Broad clinical spectrum in Silver-Russell syndrome and consequences for genetic testing in growth retardation. Pediatrics 2009;123:e929-31. Crossref
14. Dias RP, Nightingale P, Hardy C, et al. Comparison of the clinical scoring systems in Silver-Russell syndrome and development of modified diagnostic criteria to guide molecular genetic testing. J Med Genet 2013;50:635-9. Crossref
15. Hitchins MP, Stanier P, Preece MA, Moore GE. Silver-Russell syndrome: a dissection of the genetic aetiology and candidate chromosomal regions. J Med Genet 2001;38:810-9. Crossref
16. Fuke T, Mizuno S, Nagai T, et al. Molecular and clinical studies in 138 Japanese patients with Silver-Russell syndrome. PLoS One 2013;8:e60105. Crossref
17. Bliek J, Terhal P, van den Bogaard MJ, et al. Hypomethylation of the H19 gene causes not only Silver-Russell syndrome (SRS) but also isolated asymmetry or an SRS-like phenotype. Am J Hum Genet 2006;78:604-14. Crossref
18. Bruce S, Hannula-Jouppi K, Peltonen J, Kere J, Lipsanen-Nyman M. Clinically distinct epigenetic subgroups in Silver-Russell syndrome: the degree of H19 hypomethylation associates with phenotype severity and genital and skeletal anomalies. J Clin Endocrinol Metab 2009;94:579-87. Crossref
19. Kotzot D. Maternal uniparental disomy 7 and Silver-Russell syndrome—clinical update and comparison with other subgroups. Eur J Med Genet 2008;51:444-51. Crossref
20. Binder G, Seidel AK, Martin DD, et al. The endocrine phenotype in Silver-Russell syndrome is defined by the underlying epigenetic alteration. J Clin Endocrinol Metab 2008;93:1402-7. Crossref
21. Azzi S, Salem J, Thibaud N, et al. A prospective study validating a clinical scoring system and demonstrating phenotypical-genotypical correlations in Silver-Russell syndrome. J Med Genet 2015;52:446-53. Crossref
22. But WM, Huen KF, Lee CY, Lam YY, Tse WY, Yu CM. An update on the indications of growth hormone treatment under Hospital Authority in Hong Kong. Hong Kong J Paediatr 2012;17:208-16.
23. Azzi S, Blaise A, Steunou V, et al. Complex tissue-specific epigenotypes in Russell-Silver Syndrome associated with 11p15 ICR1 hypomethylation. Hum Mutat 2014;35:1211-20. Crossref
24. Bullman H, Lever M, Robinson DO, Mackay DJ, Holder SE, Wakeling EL. Mosaic maternal uniparental disomy of chromosome 11 in a patient with Silver-Russell syndrome. J Med Genet 2008;45:396-9. Crossref
25. Brioude F, Oliver-Petit I, Blaise A, et al. CDKN1C mutation affecting the PCNA-binding domain as a cause of familial Russell Silver syndrome. J Med Genet 2013;50:823-30. Crossref
26. Ioannides Y, Lokulo-Sodipe K, Mackay DJ, Davies JH, Temple IK. Temple syndrome: improving the recognition of an underdiagnosed chromosome 14 imprinting disorder: an analysis of 51 published cases. J Med Genet 2014;51:495-501. Crossref
27. Kagami M, Mizuno S, Matsubara K, et al. Epimutations of the IG-DMR and the MEG3-DMR at the 14q32.2 imprinted region in two patients with Silver-Russell Syndrome–compatible phenotype. Eur J Hum Genet 2015;23:1062-7. Crossref
28. Hämäläinen RH, Mowat D, Gabbett MT, O’Brien TA, Kallijärvi J, Lehesjoki AE. Wilms’ tumor and novel TRIM37 mutations in an Australian patient with mulibrey nanism. Clin Genet 2006;70:473-9. Crossref
29. van der Wal G, Otten BJ, Brunner HG, van der Burgt I. 3-M syndrome: description of six new patients with review of the literature. Clin Dysmorphol 2001;10:241-52. Crossref
30. Scott RH, Douglas J, Baskcomb L, et al. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) robustly detects and distinguishes 11p15 abnormalities associated with overgrowth and growth retardation. J Med Genet 2008;45:106-13. Crossref
31. Spengler S, Begemann M, Ortiz Brüchle N, et al. Molecular karyotyping as a relevant diagnostic tool in children with growth retardation with Silver-Russell features. J Pediatr 2012;161:933-42. Crossref

Management of health care workers following occupational exposure to hepatitis B, hepatitis C, and human immunodeficiency virus

Hong Kong Med J 2016 Oct;22(5):472–7 | Epub 26 Aug 2016
DOI: 10.12809/hkmj164897
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE  CME
Management of health care workers following occupational exposure to hepatitis B, hepatitis C, and human immunodeficiency virus
Winnie WY Sin, MB, ChB, FHKAM (Medicine); Ada WC Lin, MB, BS, FHKAM (Medicine); Kenny CW Chan, MB, BS, FHKAM (Medicine); KH Wong, MB, BS, FHKAM (Medicine)
Special Preventive Programme, Centre for Health Protection, Department of Health, Kowloon Bay Health Centre, Hong Kong
 
Corresponding author: Dr Kenny CW Chan (kcwchan@dh.gov.hk)
 
 Full paper in PDF
 
Abstract
Introduction: Needlestick injury or mucosal contact with blood or body fluids is well recognised in the health care setting. This study aimed to describe the post-exposure management and outcome in health care workers following exposure to hepatitis B, hepatitis C, or human immunodeficiency virus (HIV) during needlestick injury or mucosal contact.
 
Methods: This case series study was conducted in a public clinic in Hong Kong. All health care workers with a needlestick injury or mucosal contact with blood or body fluids who were referred to the Therapeutic Prevention Clinic of Department of Health from 1999 to 2013 were included.
 
Results: A total of 1525 health care workers were referred to the Therapeutic Prevention Clinic following occupational exposure. Most sustained a percutaneous injury (89%), in particular during post-procedure cleaning or tidying up. Gloves were worn in 62.7% of instances. The source patient could be identified in 83.7% of cases, but the infection status was usually unknown, with baseline positivity rates of hepatitis B, hepatitis C, and HIV of all identified sources, as reported by the injured, being 7.4%, 1.6%, and 3.3%, respectively. Post-exposure prophylaxis of HIV was prescribed to 48 health care workers, of whom 14 (38.9%) had been exposed to known HIV-infected blood or body fluids. The majority (89.6%) received HIV post-exposure prophylaxis within 24 hours of exposure. Drug-related adverse events were encountered by 88.6%. The completion rate of post-exposure prophylaxis was 73.1%. After a follow-up period of 6 months (or 1 year for those who had taken HIV post-exposure prophylaxis), no hepatitis B, hepatitis C, or HIV seroconversions were detected.
 
Conclusions: Percutaneous injury in the health care setting is not uncommon but post-exposure prophylaxis for HIV is infrequently indicated. There was no hepatitis B, hepatitis C, or HIV transmission via sharps or mucosal injury in this cohort of health care workers.
 
 
New knowledge added by this study
  • The risk of hepatitis B (HBV), hepatitis C (HCV), and human immunodeficiency virus (HIV) transmission following occupational sharps or mucosal injury in Hong Kong is small.
Implications for clinical practice or policy
  • Meticulous adherence to infection control procedures and timely post-exposure management prevents HBV, HCV, and HIV infection following occupational exposure to blood and body fluids.
 
 
Introduction
Needlestick injury or mucosal contact with blood or body fluids is well recognised in the health care setting. These incidents pose a small but definite risk for health care workers of acquiring blood-borne viruses, notably hepatitis B virus (HBV), hepatitis C virus (HCV), and human immunodeficiency virus (HIV). The estimated risk of contracting HBV infection through occupational exposure to known infected blood via needlestick injury varies from 18% to 30%, while that for HCV infection is 1.8% (range, 0%-7%).1 The risk of HIV transmission following percutaneous or mucosal exposure to HIV-contaminated blood is 0.3% and 0.09%, respectively.1 The risk is further affected by the type of exposure, body fluid involved, and infectivity of the source.
 
In Hong Kong, injured health care workers usually receive initial first aid and immediate management in the Accident and Emergency Department. They are then referred to designated clinics for specific post-exposure management. Currently, aside from staff of the Hospital Authority who are managed at two designated clinics post-exposure, all other health care workers from private hospitals, and government or private clinics and laboratories are referred to the Therapeutic Prevention Clinic (TPC) of the Integrated Treatment Centre, Department of Health. Since its launch in mid-1999, the TPC has provided comprehensive post-exposure management to people with documented percutaneous, mucosal, or breached skin exposure to blood or body fluids in accordance with the local guidelines set out by the Scientific Committee on AIDS and STI, and Infection Control Branch of Centre for Health Protection, Department of Health.2 The present study describes the characteristics and outcome of health care workers who attended the TPC from mid-1999 to 2013 following occupational exposure to blood or body fluids.
 
Methods
The study included all health care workers seen in the TPC from July 1999 to December 2013 following occupational exposure to blood or body fluids, who attended after secondary referral from an accident and emergency department of a public hospital. Using two standard questionnaires (Appendices 1 and 2), data were collected by the attending nurse and doctor during a face-to-face interview with each health care worker on the following: demography and occupation of the exposed client, type and pattern of exposure, post-exposure management, and clinical outcome.
 
Appendix 1. TPC First Consultation Assessment Form
 
Appendix 2. Therapeutic Prevention Clinic (TPC) human immunodeficiency virus (HIV) Post-exposure Prophylaxis Registry Form (to be completed on completion or cessation of post-exposure prophylaxis)
 
Details of the exposure, including type of exposure and the situation in which it occurred, were noted. The number of risk factors (see definitions below) for HIV transmission was counted for each exposure and further classified as high risk or low risk. Where known and reported by the injured party, hepatitis B surface antigen (HBsAg), HCV, and HIV status of the source were recorded.
 
The timing of the first medical consultation in the accident and emergency department, any prescription of HIV post-exposure prophylaxis (PEP), and the time since injury were noted. Exposed health care workers who received HIV PEP were reviewed at clinic visits every 2 weeks until completion of the 4-week course of treatment, and any treatment-related adverse effects were reported. Blood was obtained as appropriate at these visits for measurement of complete blood count, renal and liver function, and amylase, creatine kinase, fasting lipid, and glucose levels.
 
Apart from HIV PEP–related side-effects (reported and rated by patients as mild, moderate, or severe), the rate of completion of PEP and the number of HBV, HCV, and HIV seroconversions following the incident were also recorded. The HBsAg, anti-HBs, anti-HCV, and anti-HIV were checked at baseline and 6 months post-exposure to determine whether seroconversion had occurred. Those exposed to a known HCV-infected source or a source known to be an injecting drug user had additional blood tests 6 weeks post-exposure for liver function, anti-HCV, and HCV RNA. Additional HIV antibody testing at 3 and 12 months post-exposure was arranged for those who received HIV PEP. For those who contracted HCV infection from a source co-infected with HCV and HIV, further HIV testing was performed at 1 year post-exposure to detect delayed seroconversion.
 
Definitions
Health care workers included doctors and medical students, dentists and dental workers, nurses, midwives, inoculators, laboratory workers, phlebotomists, ward or clinic attendants, and workmen. Staff working in non–health care institutions (eg elderly homes, hostels, and sheltered workshops) were excluded. Five factors were classified as high-risk exposure: (i) deep percutaneous injury, (ii) a procedure involving a device placed in a blood vessel, (iii) use of a hollow-bore needle, (iv) a device visibly contaminated with blood, and (v) a source person with acquired immunodeficiency syndrome (AIDS).3 Another five factors were classified as low-risk exposure: (i) moderate percutaneous injury, (ii) mucosal contact, (iii) contact with deep body fluids other than blood, (iv) a source person known to be HIV-infected but without, or with unknown, AIDS status, and (v) any other factor judged to increase risk on clinical grounds.
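A minimal sketch of how an exposure might be counted and classified from the factor lists above follows; the field names, the labelling rule (any high-risk factor makes the exposure high risk), and the example exposure are assumptions for illustration, not the TPC assessment forms.

```python
# Illustrative classification of an exposure from the risk-factor lists above.
# Field names, labelling rule, and the example exposure are assumptions.
HIGH_RISK = {"deep_injury", "device_in_blood_vessel", "hollow_bore_needle",
             "device_visibly_bloody", "source_has_aids"}
LOW_RISK = {"moderate_injury", "mucosal_contact", "deep_body_fluid_not_blood",
            "source_hiv_positive_aids_unknown", "other_clinical_judgement"}

def classify(factors):
    """Return (number of listed risk factors, risk label) for one exposure."""
    n_high = len(factors & HIGH_RISK)
    n_low = len(factors & LOW_RISK)
    if n_high:
        return n_high + n_low, "high risk"
    if n_low:
        return n_low, "low risk"
    return 0, "no listed risk factor"

# Hypothetical exposure: blood taking with a visibly bloody hollow-bore needle
print(classify({"hollow_bore_needle", "device_visibly_bloody"}))  # (2, 'high risk')
```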
 
Results
From July 1999 to December 2013, 1525 health care workers (75-168 per year) with occupational exposure to HBV, HCV, or HIV were referred to the TPC (Fig). Females constituted 77% of all attendees. The median age was 33 years (range, 17-73 years). The majority came from the dental profession (36.8%) and nursing profession (33.4%), followed by ward/clinic ancillary staff (11.6%) and the medical profession (4.7%).
 

Figure. Referrals of health care workers with occupational exposure to Therapeutic Prevention Clinic and the post-exposure prophylaxis (PEP) prescription
 
Type and pattern of exposure
The majority of exposures occurred in a public clinic or laboratory (n=519, 34.0%), followed by public hospital (n=432, 28.3%), private clinic or laboratory (n=185, 12.1%), and private hospital (n=23, 1.5%). Most were percutaneous injuries (88.9%). Mucosal contact, breached skin contact, and human bite were infrequent (Table 1). Approximately 60% of the incidents occurred in one of four situations: (a) cleaning/tidying up after procedures (the most common), (b) other bedside/treatment room procedures, (c) injection, including recapping of needles, or (d) blood taking/intravenous catheter insertion. The contact specimen was blood or blood products, blood-contaminated fluid, and saliva or urine in 30.6%, 5.8%, and 14.1% of the cases, respectively. The technical device involved was a hollow-bore needle in 48.1%, a dental instrument in 20.7%, and a lancet in 7.7%. More than 80% considered the injury superficial.
 

Table 1. Details of occupational exposure in health care workers
 
High-risk and low-risk factors were noted in 869 (57%) and 166 (11%) exposures, respectively. Blood taking/intravenous catheter insertion carried the highest risk among all the procedures, with a mean of 1.29 risk factors per exposure (Table 2). Gloves were used in 956 (62.7%) exposures, goggles/mask in 50 (3.3%), and gown/apron in 55 (3.6%). Nonetheless, 101 (6.6%) health care workers indicated that they did not use any personal protective equipment during the exposure.
 

Table 2. Risk factors in health care workers with higher-risk occupational exposure during various activities/procedures from 1999 to 2013
 
The source patient could be identified in 1277 (83.7%) cases but the infectious status was unknown in most. The baseline known positivity rate for HBV, HCV, and HIV of all identified sources was 7.4%, 1.6%, and 3.3%, respectively (Table 1).
 
Care and clinical outcome
Nearly half of the injured health care workers attended a medical consultation within 2 hours (n=720, 47.2%) and another 552 (36.2%) attended between 2 and 12 hours following exposure. The median time between exposure and medical consultation was 2.0 hours.
 
During the study period, 48 (3.1%) health care workers received HIV PEP for occupational exposure, ranging from zero to eight per year (Fig). One third received PEP within 2 hours of exposure, and the majority (89.6%) within 24 hours. The median time to PEP was 4.0 hours post-exposure (interquartile range, 2.0-8.1 hours). A three-drug regimen was prescribed in 85.7% of cases. The most common regimen was zidovudine/lamivudine/indinavir (39.6%), followed by zidovudine/lamivudine/ritonavir-boosted lopinavir (31.3%), and zidovudine/lamivudine (12.5%) [Table 3]. Upon consultation and risk assessment at the TPC, 36 (75%) workers had treatment continued from the accident and emergency department. Among them, the source was confirmed to be HIV-positive in 14 (38.9%) cases. Of the 35 clients with known outcome, drug-related adverse events were seen in 31 (88.6%) health care workers; in more than half of these workers (n=18, 58.1%) the events were considered moderate or severe. Treatment-related side-effects led to early termination of PEP in eight (22.9%) health care workers. Excluding nine clients in whom prophylaxis was stopped when the source was established to be HIV-negative, 19 (73.1%) clients were able to complete the 28-day course of PEP. Of the 14 clients who sustained injury from an HIV-infected source patient, all received PEP but two did not complete the course; the completion rate was 85.7%.
 

Table 3. Post-exposure prophylaxis regimens of human immunodeficiency virus
 
At baseline, none of the injured health care workers tested positive for HCV or HIV, while 49 (3.2% of all health care workers seen in TPC) tested HBsAg-positive. Almost half of the health care workers (n=732, 48.0%) were immune to HBV (anti-HBs positive). After follow-up of 6 months (1 year for those who took PEP), no case of HBV, HCV, or HIV seroconversion was detected in this cohort.
 
Discussion
Health care workers may be exposed to blood-borne viruses when they handle sharps and body fluids. Thus, adherence to standard precautions of infection control is an integral component of occupational health and safety for health care workers. In this cohort, percutaneous injury with sharps during cleaning or tidying up after procedures remained the most common mechanism of injury. Many of these incidents could have been prevented by safer practice, for instance, by not recapping needles or by disposing of needles directly into a sharps box after use. The use of gloves as part of standard precautions was suboptimal, and greater emphasis should be placed on the importance of wearing appropriate personal protective equipment during staff training at induction and on refresher courses. Devices with safety-engineered or needleless features may reduce sharps injuries. Improvement in the system (eg by placing a sharps box near the work area) or the workflow to minimise distraction may also help compliance with infection control measures.
 
Once exposure occurs, PEP is the last defence against HBV and HIV. For HBV infection, PEP with hepatitis B immunoglobulin followed by hepatitis B vaccination has long been the standard practice in Hong Kong. For HIV infection, the efficacy of PEP in health care workers following occupational exposure was demonstrated by a landmark overseas case-control study.3 Prescription of zidovudine achieved an 81% reduction in risk of HIV seroconversion following percutaneous exposure to HIV-infected blood.3 Local and international guidelines now recommend a combination of three antiretroviral drugs for PEP.2 4 5 6 In this cohort, although more than half of the exposures had higher risk factors for HIV acquisition, it was uncommon for the source patients to have known HIV infection (2.8% of these exposures). Thus, in accordance with the local guideline, PEP was not commonly prescribed. Nevertheless, PEP was prescribed in all 14 exposures to a known HIV-positive source and in a further 34 exposures after risk assessment. Our experience is comparable with that of health care services in the UK and US. In the UK, 78% of health care workers exposed to an HIV-infected source patient were prescribed PEP.7 In a report from the US, only 68% of health care workers with such exposure took PEP.8 For HCV, PEP with antiviral therapy is not recommended according to the latest guidelines from the American Association for the Study of Liver Diseases/Infectious Diseases Society of America.9 If seroconversion occurs and early treatment is considered desirable, patients with acute hepatitis C can be treated with direct-acting antivirals using the same regimen recommended for chronic hepatitis C.
 
If indicated, HIV PEP should be taken as early as possible after exposure to achieve maximal effect. Initiation of PEP after 72 hours of exposure was shown to be ineffective in animal studies.10 The timing of PEP initiation in our cohort appeared to be less prompt (33.3% within 2 hours compared with more than 60% and 80% within 3 hours in the UK and US, respectively). Overall, however, 89.6% managed to start PEP within 24 hours, in line with experience in the UK or US. Health care workers should be reminded about post-exposure management and the need for timely medical assessment following occupational exposure. In the accident and emergency department, priority assessment should be given to health care workers exposed to blood-borne viruses. The median duration of PEP intake of 28 days was in line with the local guidelines. With the availability of newer drugs with fewer toxicities, the tolerance and compliance rate should improve.
 
Finally, using the estimated risk of HIV transmission with percutaneous injury of 0.3%, we would expect four HIV seroconversions in 1356 percutaneous exposures in the TPC if all were exposed to HIV-infected blood. Because in most of these exposures the source HIV status was unknown and likely negative in this region of overall low HIV prevalence (approximately 0.1%11), the actual risk of HIV transmission was much lower in the health care setting of Hong Kong. This is consistent with the observation that no HIV seroconversion occurred in this cohort. In addition, those with exposure of the highest risk received HIV PEP. In the UK, there were 4381 significant occupational exposures from 2002 to 2011, of which 1336 were exposures to HIV-infected blood or body fluid. No HIV seroconversions occurred among these exposures.7 In the US, there has been one confirmed case of occupational transmission of HIV in health care workers since 1999.12 Similarly, the local prevalence of HCV infection is low (<0.1% in new blood donors13), partly explaining the absence of HCV transmission in this cohort. In contrast, there were 20 cases of HCV seroconversion in health care workers reported between 1997 and 2011 in the UK.7 Hepatitis B is considered to be endemic in Hong Kong, with HBsAg positivity of 1.1% in new blood donors and 6.5% in antenatal women in 2013.13 Nonetheless, the HBV vaccination programme in health care workers coupled with HBV PEP has proven successful in preventing HBV transmission to health care workers. With concerted efforts in infection control and timely PEP, transmission of blood-borne viruses via sharps and mucosal injury in the health care setting is largely preventable.
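As a quick arithmetic check of the expected number quoted above, using only the 0.3% per-injury transmission risk stated in the text:
\[ 1356 \times 0.003 \approx 4.1 \]
that is, approximately four expected seroconversions had every source carried HIV-infected blood.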
 
There are several limitations to our study. First, data were collected from a single centre and based on secondary referral. We did not have data for other health care workers who had occupational exposure but who were not referred to the TPC for post-exposure management, or who were referred but did not attend. Thus, we were not able to draw any general conclusions on the true magnitude of the problem. Second, details of the exposure and the infection status of the source were self-reported by the exposed client and prone to bias and under-reporting.
 
Conclusions
Percutaneous injury with sharps during cleaning or tidying up after procedures was the most common cause of occupational exposure to blood or body fluids in this cohort of health care workers. The majority of source patients were not confirmed HIV-positive and HIV PEP was not generally indicated. Prescriptions of HIV PEP were appropriate and timely in most cases. There were no HIV, HBV, or HCV seroconversions in health care workers who attended the TPC following sharps or mucosal injury from mid-1999 to 2013.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Pruss-Ustun A, Rapiti E, Hutin Y. Sharps injuries: Global burden of disease from sharps injuries to health-care workers (World Health Organization Environmental Burden of Disease Series, No. 3). Available from: http://www.who.int/quantifying_ehimpacts/publications/en/sharps.pdf?ua=1. Accessed 2 Feb 2016.
2. Scientific Committee on AIDS and STI (SCAS), and Infection Control Branch, Centre for Health Protection, Department of Health. Recommendations on the management and postexposure prophylaxis of needlestick injury or mucosal contact to HBV, HCV and HIV. Hong Kong: Department of Health; 2014.
3. Cardo DM, Culver DH, Ciesielski CA, et al. A case-control study of HIV seroconversion in health care workers after percutaneous exposure. Centers for Disease Control and Prevention Needlestick Surveillance Group. N Engl J Med 1997;337:1485-90. Crossref
4. Kuhar DT, Henderson DK, Struble KA, et al. Updated US Public Health Service guidelines for the management of occupational exposures to human immunodeficiency virus and recommendations for postexposure prophylaxis. Infect Control Hosp Epidemiol 2013;34:875-92. Crossref
5. UK Department of Health. HIV post-exposure prophylaxis: guidance from the UK Chief Medical Officers’ Expert Advisory Group on AIDS. 19 September 2008 (last updated 29 April 2015).
6. WHO Guidelines Approved by the Guidelines Review Committee. Guidelines on Post-Exposure Prophylaxis for HIV and the Use of Co-Trimoxazole Prophylaxis for HIV-Related Infections Among Adults, Adolescents and Children: Recommendations for a Public Health Approach: December 2014 supplement to the 2013 consolidated guidelines on the use of antiretroviral drugs for treating and preventing HIV infection. Geneva: World Health Organization; December 2014.
7. Eye of the Needle. United Kingdom surveillance of significant occupational exposures to bloodborne viruses in healthcare workers. London: Health Protection Agency; December 2012.
8. US Department of Health and Human Services, Centers for Disease Control and Prevention. The National Surveillance System for Healthcare Workers (NaSH): Summary report for blood and body fluid exposure data collected from participating healthcare facilities (June 1995 through December 2007).
9. American Association for the Study of Liver Diseases/Infectious Diseases Society of America. HCV guidance: recommendations for testing, managing, and treating hepatitis C (updated 24 February 2016). Available from: http://www.hcvguidelines.org. Accessed 5 May 2016.
10. Tsai CC, Emau P, Follis KE, et al. Effectiveness of postinoculation (R)-9-(2-phosphonylmethoxypropyl) adenine treatment for prevention of persistent simian immunodeficiency virus SIVmne infection depends critically on timing of initiation and duration of treatment. J Virol 1998;72:4265-73.
11. HIV surveillance report—2014 update. Department of Health, The Government of the Hong Kong Special Administrative Region; December 2015.
12. Joyce MP, Kuhar D, Brooks JT. Occupationally acquired HIV infection among health care workers—United States, 1985-2013. MMWR Morb Mortal Wkly Rep 2015;63:1245-6.
13. Surveillance of viral hepatitis in Hong Kong—2014 update. Department of Health, The Government of the Hong Kong Special Administrative Region; December 2015.

Violence against emergency department employees and the attitude of employees towards violence

Hong Kong Med J 2016 Oct;22(5):464–71 | Epub 26 Aug 2016
DOI: 10.12809/hkmj154714
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Violence against emergency department employees and the attitude of employees towards violence
Halil İ Çıkrıklar, MD1; Yusuf Yürümez, MD1; Buket Güngör, MD2; Rüstem Aşkın, MD2; Murat Yücel, MD1; Canan Baydemir, MD3
1 Department of Emergency Medicine, Sakarya University, Medical Faculty, Sakarya, Turkey
2 Psychiatry Clinic, Ministry of Health, Şevket Yilmaz Training and Research Hospital, Bursa, Turkey
3 Department of Biostatistics, Eskişehir Osmangazi University, Medical Faculty, Eskişehir, Turkey
 
Corresponding author: Dr Halil İ Çıkrıklar (halilcikriklar@hotmail.com)
 
 Full paper in PDF
 
Abstract
Introduction: This study was conducted to evaluate the occurrence of violent incidents in the workplace among the various professional groups working in the emergency department. We characterised the types of violence encountered by different occupation groups and the attitude of individuals working in different capacities.
 
Methods: This cross-sectional study included 323 people representing various professional groups working in two distinct emergency departments in Turkey. The participants were asked to complete questionnaires prepared in advance by the researchers. The data were analysed using the Statistical Package for the Social Sciences (Windows version 15.0).
 
Results: A total of 323 subjects including 189 (58.5%) men and 134 (41.5%) women participated in the study. Their mean (± standard deviation) age was 31.5 ± 6.5 years and 32.0 ± 6.9 years, respectively. In all, 74.0% of participants had been subjected to verbal or physical violence at any point since starting employment in a medical profession. Moreover, 50.2% of participants stated that they had been subjected to violence on more than five occasions. Among those who reported being subjected to violence, 42.7% had formally reported the incident(s). Furthermore, 74.3% of participants did not enjoy their profession, did not want to work in the emergency department, or would prefer employment in a non–health care field after being subjected to violence. According to the study participants, the most common cause of violence was the attitude of patients or their family members (28.7%). In addition, 79.6% (n=257) of participants stated that they did not have adequate safety protection in their working area. According to the study participants, there is a need for legal regulations to effectively deter violence and increased safety measures designed to reduce the incidence of violence in the emergency department.
 
Conclusion: Violence against employees in the emergency department is a widespread problem. This situation has a strong negative effect on employee satisfaction and work performance. In order to reduce the incidence of violence in the emergency department, both patients and their families should be better informed so they have realistic expectations as an emergency patient, deterrent legal regulations should be put in place, and increased efforts should be made to provide enhanced security for emergency department personnel. These measures will reduce workplace violence and the stress experienced by emergency workers. We expect this to have a positive impact on emergency health care service delivery.
 
 
New knowledge added by this study
  • The prevalence of violence against employees in emergency departments is high.
Implications for clinical practice or policy
  • Various measures can be implemented to reduce the incidence of violence in the emergency department.
 
 
Introduction
Violence, which has been ever present throughout the history of humanity, is defined as the threatened or actual use of power or strength against another person, oneself, a group, or a community in order to cause injury and/or loss.1 The World Health Organization defines violence as “physical assault, homicide, verbal assault, emotional, sexual or racial harassment”.2
 
Workplace violence is defined as “abuse or attacks by one or more people on an employee within the workplace”.3 The health care field, which encompasses a wide range of employees, is among those in which workplace violence is common.4 Violence in the health care field is defined as “risk to a health worker due to threatening behaviour, verbal threats, physical assault and sexual assault committed by patients, patient relatives, or any other person”.3
 
According to the 2002 Workplace Violence in the Health Sector report, 25% of all violent incidents occurred in the health care sector.5 A study conducted in the United States determined that the risk of being subjected to violence is 16 times higher in the health care sector relative to other service sectors.6 Within the health care field, the department that is most frequently exposed to violence is the emergency department (ED).3 7 8 9 In this context, verbal and physical attacks by dissatisfied patients and their relatives are at the forefront.10 11
 
In this study we aimed to determine the extent of violence towards ED employees, analyse the attitude of the staff exposed to violence, and propose possible solutions.
 
Methods
This cross-sectional study was conducted in the EDs of Şevket Yilmaz Training and Research Hospital and Sakarya University between 1 July and 15 August 2012. Employees of ED—including doctors, nurses, health care officials, Emergency Medical Technicians (EMT), secretaries, laboratory technicians, radiology technicians, and security and cleaning staff—were included in the study. The questionnaire was prepared in accordance with previous publications3 10 11 and distributed to participants. All study participants were provided with information regarding the objectives of the study and were given instructions for completing the form. Of the 437 ED employees working in the two hospitals, 323 (73.9%) agreed to participate in the study and returned a completed questionnaire.
 
In addition to demographic information, the questionnaire contained questions about the number of violent incidents to which the individual had been subjected, the type of violence, and whether the subject reported the incident or the reason for not reporting. Additional questions concerned a description of the person(s) responsible for the violence, the estimated age of the person(s) responsible for the violence, and the severity of the violence. We also asked participants about their attitude following the violent incident and suggestions for reducing violence in the ED.
 
This study was conducted in accordance with the principles of the 2008 Helsinki Declaration. The data were analysed using the Statistical Package for the Social Sciences (Windows version 15.0; SPSS Inc, Chicago [IL], US). Both proportions and mean ± standard deviation were used to represent the results. The Student’s t test, Pearson’s Chi squared test, and the Monte Carlo Chi squared test were used to evaluate observed differences between groups, and a P value of <0.05 was considered to represent a statistically significant difference.
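The analyses were performed in SPSS; purely as an illustration of the kinds of comparisons described, the sketch below runs equivalent tests in Python with scipy on made-up numbers. None of the values are study data, and the Monte Carlo variant of the Chi squared test is not shown.

import numpy as np
from scipy import stats

# Student's t test: e.g. comparing the mean age of male and female participants.
rng = np.random.default_rng(0)
male_ages = rng.normal(31.5, 6.5, 189)
female_ages = rng.normal(32.0, 6.9, 134)
t, p_t = stats.ttest_ind(male_ages, female_ages)

# Pearson's Chi squared test: e.g. exposure to violence (yes/no) by gender.
# Rows: men, women; columns: exposed, not exposed (illustrative counts only).
table = np.array([[145, 44],
                  [94, 40]])
chi2, p_chi, dof, expected = stats.chi2_contingency(table, correction=False)

print(f"t test: t={t:.2f}, P={p_t:.3f}")
print(f"Chi squared: chi2={chi2:.2f}, df={dof}, P={p_chi:.3f}")
# As in the study, a P value of <0.05 would be taken as statistically significant.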
 
Results
Among the 323 participants included in the study, 189 (58.5%) were male and 134 (41.5%) were female. The mean age of the male participants was 31.5 ± 6.5 years (range, 18-55 years) and that of the female participants was 32.0 ± 6.9 years (range, 20-52 years). There was no significant difference in the age distribution between the male and female participants (P=0.476).
 
When participants were asked if they had ever been exposed to verbal or physical violence in the workplace during the course of their career, 239 (74.0%) indicated that they had been subjected to one or the other, and 57 (17.6%) reported being subjected to both verbal and physical violence. Among the participants who were subjected to violence, 162 (67.8%) reported being the victim of more than five violent incidents (Table 1).
 

Table 1. Frequency of exposure to violence for male and female employees
 
The frequency of exposure to violence and the frequency of exposure to more than five violent incidents were similar for both men and women (P=0.185 and 0.104, respectively). Nonetheless, 25.9% of men reported both verbal and physical violence compared with only 6.0% of women, suggesting that the incidence of verbal and physical violence against men was greater than that against women (P<0.001) [Table 1].
 
We investigated the frequency of exposure to violence and the reported incidence of violence among various occupation groups (Table 2). The prevalence of exposure to violence was the highest among health care officials, EMTs, doctors, and security staff (P<0.001). In addition, only 102 (42.7%) out of 239 participants reported these violent incidents. It is notable that although the rate of incident reporting was 100% among security staff, none of the laboratory technicians reported the violent incidents (P<0.001).
 

Table 2. The distribution of occupation groups according to frequency of exposure to violence and rate of reporting
 
A total of 43 (31.4%) out of the 137 study participants who had been exposed to violence but had not reported the incident provided reasons (Table 3). The most common reason for not notifying the authorities was the perception that “no resolution will be reached”. Other important reasons included the heavy workload, not wanting to deal with the legal process, disregarding verbal attacks, understanding/sympathising with the emotions of patients and their relatives, fear of the threat from patients and their relatives, and not knowing how and where to report such incidents.
 

Table 3. Reasons for not reporting a violent incident (n=43)
 
A total of 248 participants responded to a question regarding the identity of the person who was to blame for the violence in ED in general (not their own experiences). Accordingly, 65.3% (n=162) stated that the patient’s relatives were responsible, 27.0% (n=67) stated that both the patients and their relatives were responsible, and 5.2% (n=13) placed sole responsibility on the patients. Six (2.4%) participants stated that they had been subjected to violence from other health care professionals.
 
When we asked individuals to estimate the age of the person(s) causing the violence that they had experienced, respondents who were exposed to multiple violent incidents answered this question by selecting multiple options and a total of 405 answers were obtained. As shown in Table 4, the majority (71.4%) of people responsible for violent incidents were young patients and patient relatives between the ages of 18 and 39 years.
 

Table 4. Estimated age of violent patients/family members (n=405)
 
When participants who were exposed to violence were asked who caused the violent incident, three (1.3%) participants stated that they themselves were responsible, five (2.1%) indicated that both sides were responsible, and the remaining 231 (96.7%) held the attacker responsible.
 
Participants were asked “What do you think is the reason for the violence?”. A total of 181 (56.0%) participants responded to this question. Some participants indicated more than one reason and a total of 188 answers were obtained. The top 10 most common responses to this question are given in descending order of frequency in Table 5. The most common cause of violence was ignorance and lack of education of patients and their relatives (28.7%), followed by impatient attitudes and demands for priority (23.4%), and the heavy workload and prolonged waiting time (10.6%).
 

Table 5. Answers to the questions: “What do you think is the reason for the violence?” and “How do you think violence against health care workers can be reduced?”
 
Participants were asked “How do you think violence against health care workers can be reduced?”. Some participants indicated more than one reason and a total of 509 answers were obtained. They considered the most important steps suggested to reduce violence against ED employees were the enactment of deterrent legislation (42.6%), increased security measures in hospitals (28.5%), and improved public education (16.7%) [Table 5].
 
Participants were asked about their attitude after experiencing violence. Some respondents gave more than one answer and a total of 498 answers were obtained. Overall, 27.1% of participants no longer enjoyed working in their current profession, 25.7% wanted to work in a non–health care field, and 21.5% did not want to work in the ED (Table 6).
 

Table 6. The attitude of health care workers after exposure to violence (n=498)
 
A total of 96.3% (n=311) of participants answered “Yes” to the question “Do you think that the violence against health care workers has increased in recent years?” Moreover, 90.7% (n=293) of the participants answered “Yes” to the question “Do news reports regarding violence against health care workers affect you?”. Then, when participants were asked “How does the news affect you?”, 64.7% (n=209) reported that they were “sad”, 44.3% (n=143) said they were “angry”, and 18.9% (n=61) said they were “scared”.
 
When participants were asked “Are there sufficient security measures in your workplace?”, only 66 (20.4%) participants gave a positive response, while 257 (79.6%) responded negatively. Among the 41 participants working as security staff, 33 (80.5%) found the safety measures inadequate. Thus, both the security staff and the general employee population agreed that hospital security was inadequate.
 
Discussion
Workplace violence is most prevalent in the health care sector.4 The ED is the health care unit with the highest frequency of exposure to violence.3 7 8 9 According to several previous studies, the proportion of health care professionals who report prior exposure to violence in the workplace ranges from 45% to 67.6%.3 8 12 13 14 The rate of violence against ED employees (79%-99%), however, is higher than the average for the health care field.15 16 17
 
Emergency services are high-risk areas for patients and staff with regard to workplace violence18 19 20 21; 24-hour accessibility, a high-stress environment, and the apparent lack of trained security personnel are underlying factors.22 Workplace violence negatively affects the morale of health care workers, as well as their health and the effectiveness of the services they provide.23 24 25 26
 
Our study was conducted among ED employees of two different hospitals. We investigated the rate of exposure to verbal or physical violence. Among the participants, 239 (74.0%) stated that they had been exposed to violence, and 57 (17.6%) reported having been exposed to both verbal and physical violence. A study of ED employees, including nurses, in the İzmir province of Turkey found that 98.5% of respondents had been subjected to verbal violence and 19.7% were exposed to physical violence.16 In another study conducted in Turkey, 88.6% of ED employees were subjected to verbal violence and 49.4% reported having been the victim of physical violence.17
 
In the present study, the rate of exposure to violence by profession was 95.7% among health care officials/EMTs, 90.7% among doctors, and 80.5% among security personnel. According to Ayrancı et al,3 exposure to violence was most common among practitioners (67.6%) and nurses (58.4%). In another study, Alçelik et al27 reported that nurses were exposed to violence 3 times more often than other health care professionals. In the present study, the frequency of exposure to violence among nurses was 62.7%, which is lower than that in other professional groups.
 
In the present study, the estimated age distribution of patients and patient relatives responsible for violent incidents showed that the majority (71.4%) were between 18 and 39 years of age. Other studies have reported that individuals prone to violence are generally younger than 30 years.28
 
Health care workers are often subjected to verbal and physical attacks from patients and their relatives who are dissatisfied with the services provided.10 11 In the present study, the most common cause of violence was the lack of education and ignorance of the patients and their relatives. Heavy workload was identified as another cause of workplace violence. Factors such as patient stress and anxiety regarding their condition, high expectations of the patients and their relatives, lack of effective institutional and legal arrangements aimed at preventing violence, and the failure to effectively document the extent of workplace violence contribute to the high frequency of violence.12 There are several factors that increase the risk of violence in health care institutions, including 24-hour service, long waiting time for patients, poor access to health care services, heavy workload, limited staff, inadequate employee training, and lack of security personnel.29 30
 
Previous studies conducted in Turkey revealed that 60% of ED employees who were exposed to violence did not report the incident. Among the reasons for not reporting was a lack of confidence in health care and executive leadership as well as the justice system.12 In the present study, the incident reporting rate was also low (42.7%) and the most important reason (34.9%) for not reporting was the perception that “no resolution will be reached”. Indeed, a study found that there were no repercussions for the attacker in 77% of instances.12 This suggests the perception that “no resolution will be reached” is a valid one.
 
A heavy workload consumes the energy of employees and reduces their ability to empathise with patients and tolerate violent situations. Sometimes verbal or physical conflicts may arise between a stressed patient who may be subject to long waiting times and exhausted and stressed health care workers. Training regarding communication with patients helps health care professionals to avoid these problems.31 Effective communication alone, however, is not sufficient and additional steps must be taken to reduce waiting time of patients. Previous studies have indicated that the most important reason for patient dissatisfaction in the ED is the waiting time.32 33 Yet, the most important reason for long waiting times is the heavy workload caused, in part, by the discourteous attitude of patients and their relatives. Studies have also shown that more than half of patients who present to the ED are not ‘emergency patients’.34 35 36 Further education regarding the definition of “emergency” and the practice of effective triage may reduce the heavy workload in the ED and associated violent incidents.
 
One previous study reported that verbal and physical attacks by patients and their relatives are the most important factors contributing to stress among ED employees.37 Consistent exposure to high-stress conditions resulting from exposure to verbal and physical violence results in both physical and mental exhaustion. As a result, a situation known commonly as ‘burnout syndrome’ emerges.38 39 The burnout syndrome is defined as holding a negative view of current events, frequent despair, and lost productivity and motivation.40 Reluctance among physicians to work in the ED is one consequence of burnout syndrome.41 In the present study, among the participants who were subjected to violence, 21.5% indicated that they wanted to work in a department other than the ED, while 25.7% stated a desire to work outside the health care field. In a study conducted in Canada, 18% of participants who had been exposed to violence stated that they did not want to work in the ED, and 38% wanted to work outside the health care field.9 Others indicated that they had quit their jobs because of workplace stress.9 In the present study, 10.4% of ED employees stated that they were afraid of patients and their relatives. In the same Canadian study, 73% of respondents stated that after experiencing violence they were afraid of patients.9 In our study, 96.3% of respondents thought that there had been an increase in violence against ED health care workers in recent years. Moreover, 79.6% of respondents stated that the safety measures in their institutions were insufficient. The participants in the present study suggested that the preparation of deterrent legislation, increased security measures, and efforts to better educate the general population regarding the appropriate use of ED resources will help to reduce violence against health care workers.
 
Limitations
The study was carried out in only two hospitals in Turkey that may not be representative of all hospitals. In addition, participants could decide whether or not to answer all questions and some questionnaires were incomplete. The response rate was only 74% and this might give rise to self-selection bias, that is, those who did not respond may have had a higher (or lower) exposure to violence than those who responded. Hence, the various percentages reported in this paper might be over- or under-estimated.
 
Conclusion
The results of the current study as well as those of earlier studies indicate that the prevalence of violence against ED employees is high. Factors such as stress in patients and health care providers, prolonged waiting times due to overcrowding in the ED, negative attitudes of discourteous patients and their relatives, insufficient security measures, and the lack of sufficiently dissuasive legal regulations may contribute to increased violence in the ED. These factors in turn increase stress among ED employees, reduce job satisfaction, and lower the quality of services provided. Measures to decrease the workload in the ED and shorten patient waiting times, the adoption of legal policies that deter violent behaviour, and increased security measures in health care facilities should be reassessed. Steps should be taken to educate the public in order to reduce violence against health care workers.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Kocacik F. On violence [in Turkish]. Cumhuriyet Univ J Econ Adm Sci 2001;2:1-2.
2. Violence and injury prevention. Available from: http://www.who.int/violence_injury_prevention/violence/activities/workplace/documents/en/index.html. Accessed Nov 2012.
3. Ayrancı Ü, Yenilmez Ç, Günay Y, Kaptanoğlu C. The frequency of being exposed to violence in the various health institutions and health profession groups. Anatol J Psychiatry 2002;3:147-54.
4. Wells J, Bowers L. How prevalent is violence towards nurses working in general hospitals in the UK? J Adv Nurs 2002;39:230-40. Crossref
5. Workplace violence in the health sector. Framework guidelines for addressing workplace violence in the health sector. Available from: http://www.ilo.org/wcmsp5/groups/public/---ed_dialogue/---sector/documents/publication/wcms_160912.pdf. Accessed Nov 2012.
6. Kingma M. Workplace violence in the health sector: a problem of epidemic proportion. Int Nurs Rev 2001;48:129-30. Crossref
7. Gülalp B, Karcıoğlu Ö, Köseoğlu Z, Sari A. Dangers faced by emergency staff: experience in urban centers in southern Turkey. Ulus Travma Acil Cerrahi Derg 2009;15:239-42.
8. Lau J, Magarey J, McCutcheon H. Violence in the emergency department: a literature review. Aust Emerg Nurs J 2004;7:27-37. Crossref
9. Fernandes CM, Bouthillette F, Raboud JM, et al. Violence in the emergency department: a survey of health care workers. CMAJ 1999;161:1245-8.
10. Yanci H, Boz B, Demirkiran Ö, Kiliççioğlu B, Yağmur F. Medical personal subjected to the violence in emergency department—enquiry study. Turk J Emerg Med 2003;3:16-20.
11. Sucu G, Cebeci F, Karazeybek E. Violence by patient and relatives against emergency service personnel. Turk J Emerg Med 2007;7:156-62.
12. Çamci O, Kutlu Y. Determination of workplace violence toward health workers in Kocaeli. J Psychiatr Nurs 2011;2:9-16.
13. Stirling G, Higgins JE, Cooke MW. Violence in A&E departments: a systematic review of the literature. Accid Emerg Nurs 2001;9:77-85. Crossref
14. Sönmez M, Karaoğlu L, Egri M, Genç MF, Günes G, Pehlivan E. Prevalence of workplace violence against health staff in Malatya. Bitlis Eren Univ J Sci Technol 2013;3:26-31.
15. Stene J, Larson E, Levy M, Dohlman M. Workplace violence in the emergency department: giving staff the tools and support to report. Perm J 2015;19:e113-7. Crossref
16. Senuzun Ergün F, Karadakovan A. Violence towards nursing staff in emergency departments in one Turkish city. Int Nurs Rev 2005;52:154-60. Crossref
17. Boz B, Acar K, Ergin A, et al. Violence toward health care workers in emergency departments in Denizli, Turkey. Adv Ther 2006;23:364-9. Crossref
18. Joa TS, Morken T. Violence towards personnel in out-of-hours primary care: a cross-sectional study. Scand J Prim Health Care 2012;30:55-60. Crossref
19. Magnavita N, Heponiemi T. Violence towards health care workers in a Public Health Care Facility in Italy: a repeated cross-sectional study. BMC Health Serv Res 2012;12:108. Crossref
20. Arimatsu M, Wada K, Yoshikawa T, et al. An epidemiological study of work-related violence experienced by physicians who graduated from a medical school in Japan. J Occup Health 2008;50:357-61. Crossref
21. Taylor JL, Rew L. A systematic review of the literature: workplace violence in the emergency department. J Clin Nurs 2011;20:1072-85. Crossref
22. Gacki-Smith J, Juarez AM, Boyett L, Homeyer C, Robinson L, MacLean SL. Violence against nurses working in US emergency departments. J Nurs Adm 2009;39:340-9. Crossref
23. Kowalenko T, Gates D, Gillespie GL, Succop P, Mentzel TK. Prospective study of violence against ED workers. Am J Emerg Med 2013;31:197-205. Crossref
24. Position statement: violence in the emergency care setting. Available from: https://www.ena.org/government/State/Documents/ENAWorkplaceViolencePS.pdf. Accessed Nov 2012.
25. Workplace violence. Washington, DC: United States Department of Labor; 2013. Available from: http://www.osha.gov/SLTC/workplaceviolence/index.html. Accessed Nov 2012.
26. Adib SM, Al-Shatti AK, Kamal S, El-Gerges N, Al-Raqem M. Violence against nurses in healthcare facilities in Kuwait. Int J Nurs Stud 2002;39:469-78. Crossref
27. Alçelik A, Deniz F, Yeşildal N, Mayda AS, Ayakta Şerifi B. Health survey and life habits of nurses who work at the medical faculty hospital at AIBU [in Turkish]. TAF Prev Med Bull 2005;4:55-65.
28. Young GP. The agitated patient in the emergency department. Emerg Med Clin North Am 1987;5:765-81.
29. Stathopoulou HG. Violence and aggression towards health care professionals. Health Sci J 2007;2:1-7.
30. Hoag-Apel CM. Violence in the emergency department. Nurs Manage 1998;29:60,63.
31. Yardan T, Eden AO, Baydın A, Genç S, Gönüllü H. Communication with relatives of the patients in emergency department. Eurasian J Emerg Med 2008;7:9-13.
32. Al B, Yıldırım C, Togun İ, et al. Factors that affect patient satisfaction in emergency department. Eurasian J Emerg Med 2009;8:39-44.
33. Yiğit Ö, Oktay C, Bacakoğlu G. Analysis of the patient satisfaction forms about Emergency Department services at Akdeniz University Hospital. Turk J Emerg Med 2010;10:181-6.
34. Kiliçaslan İ, Bozan H, Oktay C, Göksu E. Demographic properties of patients presenting to the emergency department in Turkey. Turk J Emerg Med 2005;5:5-13.
35. Ersel M, Karcıoğlu Ö, Yanturali S, Yörüktümen A, Sever M, Tunç MA. Emergency Department utilization characteristics and evaluation for patient visit appropriateness from the patients’ and physicians’ point of view. Turk J Emerg Med 2006;6:25-35.
36. Aydin T, Aydın ŞA, Köksal Ö, Özdemir F, Kulaç S, Bulut M. Evaluation of features of patients attending the Emergency Department of Uludağ University Medicine Faculty Hospital and emergency department practices. Eurasian J Emerg Med 2010;9:163-8. Crossref
37. Kalemoglu M, Keskin O. Evaluation of stress factors and burnout in the emergency department staff [in Turkish]. Ulus Travma Derg 2002;8:215-9.
38. Ferns T, Stacey C, Cork A. Violence and aggression in the emergency department: Factors impinging on nursing research. Accid Emerg Nurs 2006;14:49-55. Crossref
39. Keser Özcan N, Bilgin H. Violence towards healthcare workers in Turkey: A systematic review [in Turkish]. Turkiye Klinikleri J Med Sci 2011;31:1442-56. Crossref
40. Maslach C. Burned-out. Hum Behav 1976;5:16-22.
41. Dwyer BJ. Surviving the 10-year ache: emergency practice burnout. Emerg Med Rep 1991;23:S1-8.

Population-based survey of the prevalence of lower urinary tract symptoms in adolescents with and without psychotropic substance abuse

Hong Kong Med J 2016 Oct;22(5):454–63 | Epub 12 Aug 2016
DOI: 10.12809/hkmj154806
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Population-based survey of the prevalence of lower urinary tract symptoms in adolescents with and without psychotropic substance abuse
YH Tam, FHKAM (Surgery)1; CF Ng, FHKAM (Surgery)2; YS Wong, FHKAM (Surgery)1; Kristine KY Pang, FHKAM (Surgery)1; YL Hong, MSc1; WM Lee, MSc2; PT Lai, BN2
1 Division of Paediatric Surgery and Paediatric Urology, Department of Surgery, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
2 Division of Urology, Department of Surgery, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
 
Corresponding author: Dr YH Tam (pyhtam@surgery.cuhk.edu.hk)
 
 Full paper in PDF
 
Abstract
Objective: To investigate the prevalence of lower urinary tract symptoms in adolescents and the effects of psychotropic substance use.
 
Methods: This was a population-based cross-sectional survey using a validated questionnaire in students from 45 secondary schools in Hong Kong randomly selected over the period of January 2012 to January 2014. A total of 11 938 secondary school students (response rate, 74.6%) completed and returned a questionnaire that was eligible for analysis. Individual lower urinary tract symptoms and history of psychotropic substance abuse were documented.
 
Results: In this study, 11 617 non-substance abusers were regarded as control subjects and 321 (2.7%) were psychotropic substance users. Among the control subjects, 2106 (18.5%) had experienced at least one lower urinary tract symptom with urinary frequency being the most prevalent symptom (10.2%). Females had more daytime urinary incontinence (P<0.001) and males had more voiding symptoms (P=0.01). Prevalence of lower urinary tract symptoms increased with age, from 13.9% to 25.8% towards young adulthood (age ≥18 years; P<0.001). Among the substance users, ketamine was most commonly abused. Substance users had significantly more lower urinary tract symptoms than control subjects (P<0.001). In multivariate analysis, increasing age and psychotropic substance abuse increased the odds for lower urinary tract symptoms. Non-ketamine substance users and ketamine users were respectively 2.8-fold (95% confidence interval, 2.0-3.9) and 6.2-fold (4.1-9.1) more likely than control subjects to develop lower urinary tract symptoms. Females (odds ratio=9.9; 95% confidence interval, 5.4-18.2) were more likely to develop lower urinary tract symptoms than males (4.2; 2.5-7.1) when ketamine was abused.
 
Conclusions: Lower urinary tract symptoms are prevalent in the general adolescent population. It is important to obtain an accurate history regarding psychotropic substance use when treating teenagers with lower urinary tract symptoms.
 
 
New knowledge added by this study
  • Prevalence of lower urinary tract symptoms (LUTS) increases consistently from onset of adolescence towards adulthood. Psychotropic substance abuse, particularly ketamine, is associated with an increased risk of developing LUTS in adolescents. Girls are more susceptible than boys if ketamine is abused.
Implications for clinical practice or policy
  • It is important to obtain an accurate history regarding psychotropic substance use when treating teenagers with LUTS.
 
 
Introduction
Lower urinary tract symptoms (LUTSs) are prevalent worldwide. An estimated 45.2% of the 2008 worldwide population aged ≥20 years were affected by at least one LUTS.1 Large-scale population-based surveys have reported that LUTS prevalence increases with advancing age, reaching 60% at the age of 60 years.2 Evaluation and treatment of LUTS for the general population have incurred significant costs to the health care system. In children, the association of LUTS with urinary tract infection, persistent vesicoureteric reflux, renal scarring, and constipation has drawn substantial attention over the years.3 4 Among various LUTSs, urinary incontinence (UI) has been most extensively investigated in children with the reported prevalence varying from 1.8% to 20%.5 Previous studies of the prevalence of individual LUTS using the International Children’s Continence Society (ICCS) definitions6 have focused primarily on pre-adolescent children in primary schools.7 8 9 10 To date, no large-scale studies have investigated the prevalence of LUTSs in adolescents.
 
Psychotropic substance use among adolescents is a growing concern worldwide and creates psychosocial, security, and health care issues. In recent years, ketamine abuse has been found to cause severe LUTSs and Hong Kong is one of the earliest countries/regions to report the newly established clinical entity of ketamine-associated uropathy.11 12 13 Ketamine is the most popular psychotropic substance being abused by people aged <21 years in our society.14 The aim of the present study was to investigate the prevalence of LUTSs in our adolescents and the differences between those with and without psychotropic substance use.
 
Methods
Study design, sample size estimation, and participant recruitment
This was a cross-sectional questionnaire survey that recruited adolescents from secondary schools serving Hong Kong local residents during the period of January 2012 to January 2014. There were almost 500 secondary schools in Hong Kong serving approximately 470 000 adolescents in 2009/10. Based on the data of children and young adults available in the literature,2 9 we assumed the prevalence of LUTSs among our adolescents to be 20%. A study sample of 6050 participants would be required to allow an error of ±1%. Government sources suggested 2.3% of our secondary school students used psychotropic substance in 2011/12.15 We assumed the prevalence of LUTSs among those secondary students using psychotropic substance was 15% higher than in normal subjects. In order to detect a difference with a type 1 error of 0.05 and a power of 0.8, a sample size of 4500 participants would be required. Based on the above two assumptions and a predicted response rate of 50% to 60%, we determined that a potential target of not less than 10 000 participants would be required.
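For readers who wish to reproduce the first of these estimates, the standard precision-based sample-size formula for a single prevalence can be sketched as below. The exact formula and any adjustments the authors applied are not stated, so this sketch yields roughly 6100, in the same range as the 6050 quoted above.

from math import ceil
from scipy.stats import norm

def sample_size_for_prevalence(p, margin, alpha=0.05):
    # n = z^2 * p * (1 - p) / margin^2 for a two-sided (1 - alpha) confidence level
    z = norm.ppf(1 - alpha / 2)   # about 1.96 when alpha = 0.05
    return ceil(z ** 2 * p * (1 - p) / margin ** 2)

# Assumed LUTS prevalence of 20% and an allowable error of +/-1%:
print(sample_size_for_prevalence(0.20, 0.01))   # about 6147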
 
In the selection of schools we included all government, aided, and Direct Subsidy Scheme schools. Private international schools and special schools were excluded. Co-educational, boys’, and girls’ schools were included. The list of secondary schools was provided by the Education Bureau and schools were grouped into 18 geographical districts. As the prevalence of psychotropic substance use might vary significantly between schools, we arbitrarily determined to recruit participants from not less than 8% to 10% of the secondary schools in order to reduce the sampling bias.
 
The random selection process started with drawing a district followed by a school within the selected district. Based on a rough estimation of population distribution, we intended to select schools from Hong Kong Island (HKI), Kowloon (Kln), and New Territories (NT) in an approximate ratio of 1:2:3. We invited the selected schools to participate in the study. If the invitation was declined, the next school following the drawing sequence would be contacted. The above procedure was repeated until the target sampling size was reached. Finally, of the 121 schools selected and approached, 45 agreed to participate in the study (HKI, n=7; Kln, n=13; NT, n=25), giving a potential target of 16 000 participants.
 
The grades/classes of students participating in the survey from each school were not randomly selected but were determined after discussion and mutual agreement with the school management. In order to avoid the possible bias of intentional selection or exclusion of a particular class of students, school management was invited to express their preferences about which grade/grades of students would participate provided that all students of the selected grade/grades participated. Although we tried to avoid over-representation of a particular grade of students by making some suggestions to the school management, their preferences were always respected and accepted. Of the 45 participating schools, we recruited two or three grades of Form 1-3 students, two or three grades of Form 4-6 students, and all the students in 18, 10, and 8 schools, respectively. In the remaining nine schools, we recruited only one grade of their students.
 
Study measures
The measuring tool was an anonymous self-reported questionnaire accompanied by an information sheet. In both the information sheet and the questionnaire, we stated clearly that participation in the study was voluntary and consent to participate was presumed on receipt of a completed questionnaire that was returned in the envelope provided. Individuals who did not consent to participate were told to disregard the questionnaire. The questionnaire consisted of three parts: demographic data on gender and age, LUTS assessment, and history of psychotropic substance use (Appendix).
 

Appendix. The questionnaire
 
Age was divided into four categories: <13, 13-15, 16-17, and ≥18 years. Participants were asked to respond to an 8-item LUTS assessment that included storage symptoms (urinary frequency, urgency, nocturia, and daytime UI), voiding symptoms (intermittent stream, straining, and dysuria), and post-micturition symptom (incomplete emptying). The recall period was the last 4 weeks. The LUTS questions were adapted from the Hong Kong Chinese version of International Prostate Symptom Score questionnaire that has been validated to assess LUTSs in our local adult population.16 We believed that the level of comprehension of most of our adolescent participants in secondary education was close to that of an average adult. The response options for most of the LUTSs were on a 6-point Likert scale: “never”, “seldom (<20% of the time)”, “sometimes (20-50% of the time)”, “often (50% of the time)”, “always (>50% of the time)”, and “almost every time”. Any LUTS with frequency threshold of ‘≥20% of the time’ was defined as being present in the study subject. Daytime UI and nocturia were assessed on a different 5-point Likert scale according to their frequency. Daytime UI and nocturia were defined as present if the study subject had ≥1 to 3 times per month and ≥2 times per night, respectively.2 17
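A minimal sketch of the dichotomisation rules just described follows; the thresholds are taken from the text, while the response labels and function names are hypothetical and this is not the study's actual data-processing code.

# Responses of "sometimes" or more frequent correspond to >=20% of the time.
FREQUENT_ENOUGH = {"sometimes", "often", "always", "almost every time"}

def luts_present(response):
    """General LUTS items on the 6-point scale."""
    return response in FREQUENT_ENOUGH

def daytime_ui_present(episodes_per_month):
    return episodes_per_month >= 1    # >=1 to 3 times per month

def nocturia_present(voids_per_night):
    return voids_per_night >= 2       # >=2 times per night

print(luts_present("seldom"), daytime_ui_present(2), nocturia_present(1))
# -> False True False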
 
Responses to questions on psychotropic substance use were dichotomised as either “yes” or “no”. Those with positive responses were directed to questions on the type of substance being abused, which included ketamine, ecstasy, methamphetamine, cough mixture, marijuana, and others. Participants were allowed to indicate more than one substance. According to the response to questions on psychotropic substance use, the participants were classified as control subjects or psychotropic substance users. The psychotropic substance users were further subdivided into either ketamine users or non-ketamine users.
 
Statistical analysis
The responses to each LUTS were dichotomised as “present” versus “absent” and the prevalence rate for each LUTS was expressed as a percentage with 95% confidence interval (CI). Missing data were excluded from analysis. Chi squared and trend tests were performed in univariate analysis to compare prevalence differences between groups divided by gender, age, and psychotropic substance use. Using the outcome of “at least one LUTS”, which was dichotomised into “yes” or “no”, a binary logistic regression model using the enter method was set up to investigate risk factors including gender, age, and psychotropic substance use. The odds ratio (OR) of “at least one LUTS” was estimated with 95% CI for the potential risk factors. A P value of <0.05 was considered to be significant.
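The regression step can be illustrated with the following Python (statsmodels) sketch on simulated data; the variable names, category labels, and values are invented for illustration and do not reproduce the study analysis.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "any_luts": rng.integers(0, 2, n),   # outcome: at least one LUTS (yes/no)
    "female": rng.integers(0, 2, n),
    "age_group": rng.choice(["<13", "13-15", "16-17", ">=18"], n),
    "substance": rng.choice(["control", "non-ketamine", "ketamine"], n, p=[0.95, 0.025, 0.025]),
})

# Binary logistic regression (enter method: all covariates entered together).
model = smf.logit(
    "any_luts ~ C(female) + C(age_group, Treatment('<13')) + C(substance, Treatment('control'))",
    data=df,
).fit(disp=False)

# Odds ratios with 95% confidence intervals.
or_table = np.exp(model.conf_int())
or_table.columns = ["2.5%", "97.5%"]
or_table["OR"] = np.exp(model.params)
print(or_table)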
 
The study protocol was approved by the Joint CUHK-NTEC Clinical Research Ethics Committee.
 
Results
A total of 16 000 questionnaires were sent to schools, and 11 938 questionnaires eligible for analysis were returned (estimated response rate, 74.6%). The response rate was estimated since the number of questionnaires delivered to each school was not necessarily equal to the number of students of that school who received the questionnaire. The conduct of the survey at schools was not supervised. We were uncertain if students absent from school would receive our questionnaire. The number of questionnaires requested by each school was always rounded off to the nearest 10 and not necessarily equal to the actual number of students in the selected classes. It seems logical to assume that the actual number of students who received our questionnaires was less than 16 000 and the actual response rate might be higher. There were similar numbers of males (n=6040) and females (n=5819) among the participants who responded to the question on gender. Among the 11 938 participants, 11 617 did not report use of any psychotropic substances and were defined as control subjects; 321 (2.7%) participants who reported having used one or more types of psychotropic substance were defined as substance users.
 
Of 11 617 control subjects, 2106 (18.5%; including only the valid subjects) had experienced at least one LUTS with the symptom frequency of ‘≥20% of the time’ in the last 4 weeks (Table 1). The most prevalent LUTSs were urinary frequency (10.2%), incomplete emptying (5.4%), and nocturia ≥2 times per night (4.4%). Daytime UI ≥1 to 3 times per month was reported by 3.7% of control subjects. Females had more daytime UI than males (5.2% vs 2.2%; P<0.001), while males had significantly more voiding symptoms and incomplete emptying. There was a significant increase in the prevalence of all LUTSs except for daytime UI across the age-groups from <13 years to the young adulthood age-group of ≥18 years (Table 2).
 

Table 1. Prevalence of LUTS in control subjects and comparison by gender
 

Table 2. Comparison of LUTS in control subjects by age
 
Compared with control subjects, the psychotropic substance users experienced significantly more LUTSs in all areas (Table 3). Of the 321 substance abusers, 305 responded to the question about the types of psychotropic substance abused. Ketamine was the most commonly abused substance (n=139; 45.6%), followed by cough mixture (n=96; 31.5%), ecstasy (n=77; 25.2%), methamphetamine (n=76; 24.9%), and marijuana (n=70; 23.0%). Of ketamine users, 60.7% had at least one LUTS. Comparing the ketamine users with the non-ketamine substance users, the former experienced significantly more LUTSs in all areas except daytime UI, for which a higher prevalence was nonetheless still observed. Female ketamine users appeared to be more affected by LUTS than males (Table 4).
 

Table 3. Comparison of control subjects with psychotropic substance users
 

Table 4. Comparison of ketamine users with non-ketamine substance users, and male with female ketamine users
 
In multivariate analysis, increasing age and psychotropic substance use were found to increase the odds of experiencing at least one LUTS. With reference to age of <13 years, the ORs of experiencing at least one LUTS at age 13-15, 16-17, and ≥18 years were 1.3 (95% CI, 1.1-1.5), 1.7 (95% CI, 1.4-2.0), and 2.1 (95% CI, 1.7-2.7), respectively. With reference to the control subjects, the ORs of experiencing at least one LUTS were 2.8 (95% CI, 2.0-3.9) for those who used substances other than ketamine and 6.2 (95% CI, 4.1-9.1) for those who used ketamine. When the two genders were assessed separately in multivariate analysis, female ketamine users were 9.9-fold (95% CI, 5.4-18.2) and male ketamine users 4.2-fold (95% CI, 2.5-7.1) more likely than their non-abuser counterparts to develop LUTSs.
 
Discussion
Large-scale population-based surveys of LUTS prevalence have been conducted in adults.2 17 Recently a few paediatric studies using the ICCS definitions have reported LUTS prevalence in children varying from 9.3% to 46.4%.7 8 9 The wide variation in prevalence can be attributed to the differences in the study population, questions used to assess LUTS, and the criteria to define the presence of symptoms. Vaz et al8 reported a prevalence of 21.8% in 739 Brazilian children aged 6 to 12 years while Yüksel et al7 found 9.3% of their 4016 Turkish children aged 6 to 15 years had LUTSs. In both studies, the investigators used validated scoring systems for a combination of LUTSs being assessed and pre-determined cut-off points in the total scores to define the presence or absence of LUTS.7 8 In contrast, Chung et al9 investigated 16 516 Korean children aged 5 to 13 years by measuring the presence of individual LUTS and reported the highest prevalence of 46.4% experiencing at least one LUTS. The high prevalence rate in the Korean study can be partly explained by their methodology wherein the responses to the LUTS questions were dichotomised into “yes” or “no” and a positive symptom was defined without considering its frequency.9
 
To the best of our knowledge, the present study is the first large-scale prevalence study focused on adolescents. We used a similar methodology to other major adult studies to measure each LUTS individually and define its presence by a frequency threshold of ‘≥20% of the time’.2 17 18 19 We agree with others that using a scoring system to define LUTS in a prevalence study may not reflect the true impact of individual LUTSs, because a highly prevalent symptom may occur alone and the summed score may not reach the threshold.17
 
In our adolescents without any substance abuse, 18.5% experienced at least one LUTS. Our finding suggests that LUTS prevalence in adolescents appears to be lower than that in young adults. Previous studies including two conducted in Chinese populations have reported that 17% to 42% of men and women aged 18 to 39 years experience at least one LUTS.2 18 19 Notably, LUTS prevalence increased with age during adolescence from 13.9% in those <13 years to 25.8% in those aged ≥18 years in this study. In children, the prevalence of LUTS peaks at age 5 to 7 years and then declines with increasing age up to 13 to 14 years.7 8 9 10 20 The decline in prevalence has been attributed to the maturation of urinary bladder function along with the growth and development of children. Our study is the first to provide evidence that LUTS prevalence rises from the trough at the onset of adolescence and continues to increase throughout adolescence into adulthood. Our reported prevalence of 25.8% in those participants aged ≥18 years is in agreement with the trend in young adults reported elsewhere.2 18 19
 
Little is known in the existing literature regarding the trend of LUTS prevalence from adolescence to adulthood. In a Finnish study of 594 subjects aged 4 to 26 years, the authors reported that individuals aged 18 to 26 years had more urgency than the other two age-groups of 8 to 12 years and 13 to 17 years.10 Although adolescence spans less than a decade, it is unique with rapid physical, psychological, and developmental changes. Reasons for the increase in LUTS prevalence from adolescence to young adulthood are largely unknown but likely to be multifactorial. Changes in lifestyle, altered micturition behaviour, habitual postponement of micturition, unhealthy bowel habits, attitudes to the use of school toilets, anxiety associated with academic expectations, or worsening of relationships with family may all contribute to newly developed LUTS during adolescence. Further studies are warranted to investigate this phenomenon.
 
Our findings that storage symptoms were more prevalent than voiding symptoms are in agreement with the reported results in young adults.2 18 19 Urinary frequency (10.2%) and nocturia ≥2 times per night (4.4%) were the two most prevalent storage symptoms among the control subjects. We agree with others that nocturia once per night is very common in the general population and using the threshold of nocturia ≥2 times per night as LUTS is more appropriate.2 17 18 19 Only 2.9% of our control subjects had urgency suggestive of overactive bladder (OAB) according to ICCS definitions,6 in contrast to 12% of Korean children aged 13 years.20 Children with OAB may have urinary frequency in addition to urgency. The much lower prevalence of urgency than urinary frequency in our study suggests that many of our study subjects had urinary frequency unrelated to OAB. Glassberg et al21 found over 70% of their paediatric patients with dysfunctional voiding (DV) and primary bladder neck dysfunction (PBND) experienced urinary frequency; DV and PBND are also associated with high residual urine volume. Our finding that the feeling of incomplete emptying (5.4%) was the second most prevalent LUTS suggests that in some participants urinary frequency was secondary to incomplete bladder emptying associated with DV or PBND.
 
In our study, male non-substance users experienced more voiding symptoms while females had more daytime UI. Literature has consistently found female gender to be a risk factor for daytime UI in children.5 22 23 Our finding suggests that the gender association with daytime UI extends from childhood to adolescence. There are inconsistencies in the paediatric literature with respect to gender differences in voiding symptoms. Kyrklund et al10 found more voiding symptoms in boys than girls only in the age-group of 4 to 7 years, while such difference was not noted by others.8 24
 
Psychotropic substance use increased the risk of LUTS in our adolescents. Notably, 60% of our adolescents who abused ketamine had experienced at least one LUTS, with high prevalence rates of 28% to 47% in all areas of LUTS. Our finding that 2.7% of our participants abused psychotropic substances is consistent with the latest figure of 2.3% estimated by our government in its survey conducted in 2011/12.15 Ketamine-associated uropathy has emerged as a new clinical entity in our society since 2007.13 This chemically induced cystitis, a result of the urinary metabolites of ketamine, is associated with severe LUTS and the possible consequence of irreversible bladder damage.12 25 Little information is available in the medical literature about the prevalence of LUTS among ketamine users. An online survey conducted in the UK reported a prevalence of 26.6% of at least one LUTS in the last 12 months among 1285 participants who had illicitly used ketamine.26 The LUTS prevalence is likely influenced by variation in the dose and frequency of ketamine use in the study population. We have recently reported that the dose and frequency of ketamine use, as well as female gender, are associated with the severity of LUTS at presentation among young patients who sought urological treatment for ketamine-associated uropathy.25 In the present study, female ketamine users were at a higher risk of developing LUTS than males. This observation is in agreement with our previous findings and our postulation that, for unknown reasons, females appear to be more susceptible to the chemically induced injury following illicit use of ketamine.25
 
Non-ketamine substance users also experienced more LUTSs than the control subjects in this study although the prevalence was not as high as that of ketamine users. Most recently Korean investigators have reported a 77% prevalence rate of LUTS among a group of young methamphetamine (also known as ‘ice’) users, and suggested that a pathological dopaminergic mechanism plays a predominant role in methamphetamine-associated LUTS.27 There has been a rising trend of using methamphetamine in recent years and it is now the second most popular psychotropic substance abused by youths aged <21 years in our community.14 It would not be surprising if we encountered more and more young patients presenting with LUTS associated with methamphetamine use in the foreseeable future.
 
Limitations of this study
There was potential bias in the sampling process as almost two thirds of the schools that we selected and approached refused to participate, the grades of the participants were not randomly selected, and the non-response rate was approximately 20%. Young participants in lower grades may not have been able to comprehend the LUTS questions that were designed for adults. Nevertheless, our finding of 2.7% psychotropic substance use appears to be consistent with the 2.3% reported by the 2011/12 government survey of over 80 000 secondary school students.15 We did not study other potential risk factors that may be associated with LUTS in adolescents such as bowel function, urinary tract infection, stressful events, lifestyle, and toilet environment. The 0.5% to 2% missing data in each of the LUTS questions, though small, may still affect the estimated prevalence of each LUTS among our control subjects. Although daytime UI was not a prevalent symptom, the fact that less than half of the participants were asked this question because of a printing error may have led to underestimation of the overall prevalence of experiencing at least one LUTS among different subgroups. The 4-week recall period only allowed a crude assessment of LUTS. A more-prevalent symptom may not necessarily cause more inconvenience than a less-prevalent symptom; how much each individual LUTS bothered the participants, and whether substance abusers and non-substance abusers were bothered to a different extent, were not investigated in this study. Therefore, individuals, particularly non-substance abusers, who reported the experience of LUTS did not necessarily have any established lower urinary tract condition that warranted medical attention. The dose and frequency of illicit psychotropic substance use would certainly have an impact on the prevalence of LUTS but this was not investigated in this survey.
 
Despite all these limitations, our study provides important data on the prevalence of LUTS in adolescents and the effect of psychotropic substance use. LUTSs are prevalent in the general adolescent population. It is important for clinicians to obtain a history of psychotropic substance use when treating teenagers with LUTS, as there is a substantial possibility that the LUTSs are caused by organic pathology associated with psychotropic substance use rather than by functional voiding disorders.
 
Appendix
Additional material related to this article can be found on the HKMJ website. Please go to <http://www.hkmj.org>, and search for the article.
 
Declaration
The study was supported by the Beat Drugs Fund (BDF101012) of the Hong Kong SAR Government. The funding source had no role in the study design, data collection, data analysis, results interpretation, writing of the manuscript, or the decision to submit the manuscript for publication. All authors have no conflicts of interest relevant to this article to disclose.
 
References
1. Irwin DE, Kopp ZS, Agatep B, Milsom I, Abrams P. Worldwide prevalence estimates of lower urinary tract symptoms, overactive bladder, urinary incontinence and bladder outlet obstruction. BJU Int 2011;108:1132-8. Crossref
2. Irwin DE, Milsom I, Hunskaar S, et al. Population-based survey of urinary incontinence, overactive bladder, and other lower urinary tract symptoms in five countries: results of the EPIC study. Eur Urol 2006;50:1306-14; discussion 1314-5. Crossref
3. Koff AS, Wagner TT, Jayanthi VR. The relationship among dysfunctional elimination syndromes, primary vesicoureteral reflux and urinary tract infections in children. J Urol 1998;160:1019-22. Crossref
4. Leonardo CR, Filgueiras MF, Vasconcelos MM, et al. Risk factors for renal scarring in children and adolescents with lower urinary tract dysfunction. Pediatr Nephrol 2007;22:1891-6. Crossref
5. Sureshkumar P, Jones M, Cumming R, Craig J. A population based study of 2,856 school-age children with urinary incontinence. J Urol 2009;181:808-15; discussion 815-6. Crossref
6. Nevéus T, von Gontard A, Hoebeke P, et al. The standardization of terminology of lower urinary tract function in children and adolescents: report from the Standardization Committee of the International Children’s Continence Society. J Urol 2006;176:314-24. Crossref
7. Yüksel S, Yurdakul AC, Zencir M, Cördük N. Evaluation of lower urinary tract dysfunction in Turkish primary schoolchildren: an epidemiological study. J Pediatr Urol 2014;10:1181-6. Crossref
8. Vaz GT, Vasconcelo MM, Oliveira EA, et al. Prevalence of lower urinary tract symptoms in school-age children. Pediatr Nephrol 2012;27:597-603. Crossref
9. Chung JM, Lee SD, Kang DI, et al. An epidemiologic study of voiding and bowel habits in Korean children: a nationwide multicenter study. Urology 2010;76:215-9. Crossref
10. Kyrklund K, Taskinen S, Rintala RJ, Pakarinen MP. Lower urinary tract symptoms from childhood to adulthood: a population based study of 594 Finnish individuals 4 to 26 years old. J Urol 2012;188:588-93. Crossref
11. Wood D, Cottrell A, Baker SC, et al. Recreational ketamine: from pleasure to pain. BJU Int 2011;107:1881-4. Crossref
12. Chu PS, Ma WK, Wong SC, et al. The destruction of the lower urinary tract by ketamine abuse: a new syndrome? BJU Int 2008;102:1616-22. Crossref
13. Chu PS, Kwok SC, Lam KM, et al. ‘Street ketamine’–associated bladder dysfunction: a report of ten cases. Hong Kong Med J 2007;13:311-3.
14. Central Registry of Drug Abuse Sixty-third Report 2004-2013. Narcotics Division, Security Bureau, The Government of the Hong Kong Special Administrative Region. Available from: http://www.nd.gov.hk/en/crda_63rd_report.htm. Accessed Dec 2015.
15. The 2011/12 survey of drug use among students. Narcotics Division, Security Bureau, The Government of the Hong Kong Special Administrative Region. Available from: http://www.nd.gov.hk/en/survey_of_drug_use_11-12.htm. Accessed Dec 2015.
16. Yee CH, Li JK, Lam HC, Chan ES, Hou SS, Ng CF. The prevalence of lower urinary tract symptoms in a Chinese population, and the correlation with uroflowmetry and disease perception. Int Urol Nephrol 2014;46:703-10. Crossref
17. Coyne KS, Sexton CC, Thompson CL, et al. The prevalence of lower urinary tract symptoms (LUTS) in the USA, the UK and Sweden: results from the Epidemiology of LUTS (EpiLUTS) study. BJU Int 2009;104:352-60. Crossref
18. Zhang L, Zhu L, Xu T, et al. A population-based survey of the prevalence, potential risk factors, and symptom-specific bother of lower urinary tract symptoms in adult Chinese women. Eur Urol 2015;68:97-112. Crossref
19. Wang Y, Hu H, Xu K, Wang X, Na Y, Kang X. Prevalence, risk factors and the bother of lower urinary tract symptoms in China: a population-based survey. Int Urogynecol J 2015;26:911-9. Crossref
20. Chung JM, Lee SD, Kang DI, et al. Prevalence and associated factors of overactive bladder in Korean children 5-13 years old: a nationwide multicenter study. Urology 2009;73:63-7; discussion 68-9. Crossref
21. Glassberg KI, Combs AJ, Horowitz M. Nonneurogenic voiding disorders in children and adolescents: clinical and videourodynamic findings in 4 specific conditions. J Urol 2010;184:2123-7. Crossref
22. Kajiwara M, Inoue K, Usui A, Kurihara M, Usui T. The micturition habits and prevalence of daytime urinary incontinence in Japanese primary school children. J Urol 2004;171:403-7. Crossref
23. Hellström A, Hanson E, Hansson S, Hjälmås K, Jodal U. Micturition habits and incontinence in 7-year-old Swedish school entrants. Eur J Pediatr 1990;149:434-7. Crossref
24. Akil IO, Ozmen D, Cetinkaya AC. Prevalence of urinary incontinence and lower urinary tract symptoms in school-age children. Urol J 2014;11:1602-8.
25. Tam YH, Ng CF, Pang KK, et al. One-stop clinic for ketamine-associated uropathy: report on service delivery model, patients’ characteristics and non-invasive investigations at baseline by a cross-sectional study in a prospective cohort of 318 teenagers and young adults. BJU Int 2014;114:754-60. Crossref
26. Winstock AR, Mitcheson L, Gillatt DA, Cottrell AM. The prevalence and natural history of urinary symptoms among recreational ketamine users. BJU Int 2012;110:1762-6. Crossref
27. Koo KC, Lee DH, Kim JH, et al. Prevalence and management of lower urinary tract symptoms in methamphetamine abusers: an under-recognized clinical identity. J Urol 2014;191:722-6. Crossref

Clinical transition for adolescents with developmental disabilities in Hong Kong: a pilot study

Hong Kong Med J 2016 Oct;22(5):445–53 | Epub 19 Aug 2016
DOI: 10.12809/hkmj154747
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Clinical transition for adolescents with developmental disabilities in Hong Kong: a pilot study
Tamis W Pin, PhD; Wayne LS Chan, PhD; CL Chan, BSc (Hons) Physiotherapy; KH Foo, BSc (Hons) Physiotherapy; Kevin HW Fung, BSc (Hons) Physiotherapy; LK Li, BSc (Hons) Physiotherapy; Tina CL Tsang, BSc (Hons) Physiotherapy
Department of Rehabilitation Sciences, Hong Kong Polytechnic University, Hunghom, Hong Kong
 
Corresponding author: Dr Tamis W Pin (tamis.pin@polyu.edu.hk)
 
This paper was presented as a poster at the Hong Kong Physiotherapy Association Conference 2015, Hong Kong on 3-4 November 2015.
 
 Full paper in PDF
 
Abstract
Introduction: Children with developmental disabilities usually move from the paediatric to adult health service after the age of 18 years. This clinical transition is fragmented in Hong Kong. There are no local data for adolescents with developmental disabilities and their families about the issues they face during the clinical transition. This pilot study aimed to explore and collect information from adolescents with developmental disabilities and their caregivers about their transition from paediatric to adult health care services in Hong Kong.
 
Methods: This exploratory survey was carried out in two special schools in Hong Kong. A convenience sample of adolescents with developmental disabilities and their parents was taken. The questionnaire was administered by interviewers in Cantonese. Descriptive statistics were used to analyse the answers to closed-ended questions. Responses to open-ended questions were summarised.
 
Results: In this study, 22 parents (mean age ± standard deviation: 49.9 ± 10.0 years) and 13 adolescents (19.6 ± 1.0 years) completed the face-to-face questionnaire. The main diagnoses of the adolescents were cerebral palsy (59%) and cognitive impairment (55%). Of the study parents, 77% were reluctant to transition. Of the 10 families who did move to adult care, 60% of the parents were not satisfied with the services. The main reasons were reluctance to change and dissatisfaction with the adult medical service. The participants emphasised their need for a structured clinical transition service to support them during this challenging time.
 
Conclusions: This study is the first in Hong Kong to present preliminary data on adolescents with developmental disabilities and their families during transition from paediatric to adult medical care. Further studies are required to understand the needs of this population group during clinical transition.
 
 
New knowledge added by this study
  • These results are the first published findings on clinical transition for adolescents with developmental disabilities in Hong Kong.
  • Dissatisfaction with the adult health services and reluctance to change were the main barriers to clinical transition.
  • The concerns and needs of the families were similar regardless of whether adolescents had physical or cognitive disabilities.
Implications for clinical practice or policy
  • A structured clinical transition service is required for adolescents with developmental disabilities and their parents.
  • Further in-depth studies are required to examine the needs for and concerns about clinical transition for all those involved. This should include adolescents with developmental disabilities, their parents or caregivers, and service providers in both paediatric and adult health services.
 
 
Introduction
Advances in medical management now enable children with developmental disabilities (DD) who may previously have died to live well into adulthood.1 Such disabilities are defined as any condition that is present before the age of 22 years and due to physical or cognitive impairment or a combination of both that significantly affects self-care, receptive and expressive language, mobility, learning, independent living, or the economic independence of the individual.2 The transition from adolescence to adulthood is a critical period for all young people.3 In a clinical context, adult transition is “the purposeful, planned movement of adolescents and young adults with chronic physical and medical conditions from child-centered to adult-oriented health-care systems”.4 In 2001, a consensus statement with guidelines was endorsed to ensure adolescents with DD, who depend on coordinated health care services, make a smooth transition to the adult health care system in order to receive the services that they need in developed countries such as the United States.5
 
Researchers have identified needs and factors necessary for the successful transition of adolescents with DD.6 7 From the adolescent’s perspective, barriers to success include their dependence on others, reduced treatment time and access to specialists in the adult health service, lack of information about transition, and lack of involvement in the decision-making process. Parents of adolescents with DD were reluctant or confused about changing responsibilities during the transition period. The majority of challenges came from the service systems and included unclear eligibility criteria and procedures, limited time and lack of carer training, fragmented adult health service provision, lack of communication between service providers, and inaccessibility to resources including information.6 7
 
Based on the 2015 census of ‘Persons with disabilities and chronic diseases in Hong Kong’ from the Hong Kong Census and Statistics Department, there were 22 100 (3.8%) people with disability (excluding cognitive impairment) aged between 15 and 29 years, ie those transitioning from the paediatric to adult health service.8 According to the Hospital Authority, Hong Kong, all public hospitals and specialist and general out-patient clinics are organised into seven hospital clusters based on geographical location.9 The Duchess of Kent Children’s Hospital (DKCH) is a tertiary centre that provides specialised services for children with orthopaedic problems, spinal deformities, cerebral palsy and neurodevelopmental disorders, and neurological and degenerative diseases. Unlike overseas health care systems, there is no children’s hospital in Hong Kong that provides an acute health service. All paediatric patients go to the same hospital as adult patients but are triaged into the paediatric section for management by both in-patient and out-patient services. The specialised out-patient clinic list under each hospital cluster varies. Children with DD might receive services from a general paediatrics clinic, a cerebral palsy clinic, Down’s clinic, behavioural clinic, or paediatric neurology out-patient clinic. Once a child reaches the age of 18 years, they are referred to the adult section of the same hospital for continued care. They will be followed up in neurology or movement disorder clinics, where other patients with adult-onset neurological conditions or movement disorders, such as stroke, Parkinson’s disease, and multiple sclerosis, are followed up. There is no separate specialised clinic for complex child-onset DD.10
 
Although adult transition for adolescents with DD has been recognised as a crucial area in health care overseas, it is an under-developed service in Hong Kong.11 A local study found that there is a service gap in adult transition for young people with chronic medical conditions, such as asthma, diabetes, and epilepsy. Training and education are urgently required for both service providers and young people with chronic health conditions and their families.11 It is unclear if the challenges and barriers identified in overseas literature6 7 are applicable to Hong Kong, where the paediatric and adult health services, especially the medical services, are located in the same building. At present, no study has been conducted with adolescents with DD and their families in Hong Kong about the issues they face during clinical transition. As a start, in this pilot study, we aimed to explore the acceptance of clinical transition and identify the main barriers to successful clinical transition for adolescents with DD and their caregivers in Hong Kong.
 
Methods
Participants
A survey study was conducted on a convenience sample of adolescents and/or their caregivers, who were recruited from two special high schools (Hong Kong Red Cross John F Kennedy Centre [JFK] and Haven of Hope Sunnyside School [HOH]) in Hong Kong. Students from JFK have primarily physical and multiple disabilities including cerebral palsy or muscular dystrophy, and those from HOH have severe cognitive impairment. Both schools provide rehabilitation services on site including physiotherapy, occupational therapy, speech therapy, nursing support, and family support via the school social workers. The medical out-patient services for the students fall under different hospital clusters, depending on where the students’ families live. The parents are responsible for taking their adolescent children for medical review. As there was no previous study on which to base a sample size calculation and as this was a pilot study, we aimed to recruit 10 adolescents and their parents/caregivers from each school.
 
The inclusion criteria of the adolescents were: (1) aged 16 to 19 years and (2) a diagnosis of DD. All participating adolescents and/or their parents or legal guardians gave written informed consent before the survey. For adolescents with severe cognitive impairment, consent was sought from their parents as proxy and only their parents participated in the survey. This pilot study was approved by the Human Subjects Ethics Sub-committee of the Hong Kong Polytechnic University.
 
Survey
A specific questionnaire was developed for this pilot study to collect information on: (1) demographic characteristics of the participants; (2) whether or not the study participants were aware of the transition and the information source(s); (3) whether the study participants were willing to transition to the adult health service and the underlying reasons; (4) for those who had transitioned, whether they were satisfied with the adult health service and the underlying reasons; and (5) the opinion of the study participants on a clinical transition service. As this was a pioneering pilot study, ‘health service’ included both medical and rehabilitation services, and the study aimed to explore the general issues faced by this group of adolescents and their families during clinical transition. Lastly, as we predicted that the competency level of the adolescents and their caregivers in managing the disabilities might influence their perception of clinical transition, information about their self-rated competency level was also collected. Questions in the latter part were based on information from the Adolescent Transition Care Assessment Tool and were designed to assist health care professionals to provide a better transition for adolescents with chronic illness.12 The whole survey was administered by interviewers to the study adolescents and/or their caregivers separately in Cantonese. All interviews were recorded for data analyses.
 
The survey comprised closed- and open-ended questions. The closed-ended questions were designed to require a dichotomous answer, ie ‘yes’ or ‘no’, or an answer from a set of choices. For example, when asked about the sources of information about clinical transition, the study participants could choose their answers from a list of professionals such as paediatricians, social workers, physiotherapists, and school teachers. The open-ended questions focused on the reasons for an earlier response. For example, when asked if they wished to move to the adult health service, the individual would first answer ‘yes’ or ‘no’, then give their reasons. For the questions about self-perceived competency level, study participants read a number of statements (8 for the adolescents and 14 for parents) and indicated how much they agreed with the statement using a Likert-scale from ‘strongly disagree’ to ‘strongly agree’.
 
Data analyses
Descriptive statistics—including mean, median, standard deviation, or quartiles—were used to analyse the responses to the closed-ended questions. Discussion of the open-ended questions was transcribed and summarised by five team members (CLC, KHF, KHWF, LKL, and TCLT). The content was analysed and themes were identified independently by two other team members (TWP and WLSC). These themes were discussed and a consensus reached by all team members.
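 
As a rough sketch of how the closed-ended responses might be summarised, the following Python lines illustrate the descriptive statistics described above; the file and column names (transition_survey.csv, willing_to_transition, parent_age) are hypothetical. The open-ended responses were summarised thematically by the team rather than by code.
 
import pandas as pd

responses = pd.read_csv("transition_survey.csv")  # hypothetical data file

# Counts and percentages for a dichotomous item, eg willingness to transition.
counts = responses["willing_to_transition"].value_counts()
percentages = responses["willing_to_transition"].value_counts(normalize=True) * 100

# Mean, standard deviation, median, and quartiles for a continuous item, eg parent age.
age_summary = responses["parent_age"].describe(percentiles=[0.25, 0.5, 0.75])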
 
Results
Thirteen potential families were approached at the JFK via the school physiotherapist, anticipating possible refusal by some. All the students and their parents agreed to participate, so all were included. Ten families were approached at the HOH via the school social worker, but one parent declined the offer and no other family was interested in participating in the study. Since the students from HOH, who were cognitively impaired, could not be interviewed, only their parents/caregivers were interviewed. As a result, 22 parents (13 from the JFK and 9 from the HOH) and 13 adolescents (all from the JFK) were asked to complete the face-to-face survey. The demographic data of the participants are listed in Table 1. Cerebral palsy and cognitive impairment were the principal types of DD. All adolescents received rehabilitation from the Hospital Authority and/or from their special school. All the adolescents accessed between four and seven paediatric medical specialists (eg paediatrician, neurologist, orthopaedic surgeon) and rehabilitation services (eg physiotherapy, occupational therapy, speech therapy), indicating their complex needs. Over 90% of the JFK students were followed up at the DKCH, which is within walking distance of the school (personal communication, Senior Physiotherapist at JFK).
 

Table 1. Demographic information of study participants
 
Table 2 summarises the participant responses about clinical transition. The majority of the parents (77%) and adolescents (85%) knew that clinical transition to adult care would occur at 18 years of age. They were mainly informed by their paediatrician (50% of parents and 69% of adolescents). Most parents (77%) were reluctant to make this move. Ten parents stated that their adolescent child was already receiving care from the adult sector and over half of them (60%) were dissatisfied with the service. Four of the 13 adolescents clearly stated during the survey that they had transitioned to the adult health service, and all of them were happy with the transition. Among the 17 families who knew that clinical transition to adult care would occur at 18 years of age, the 10 adolescents who had transitioned were all over 18 years old, while the adolescents in another seven (41%) families were also over 18 years old but were still receiving services from the paediatric sector at the time of the study.
 

Table 2. Summary of responses to questions about clinical transition
 
When asked why they were not willing to transition or why they were dissatisfied after the transition, the parent responses could be summed up as two main areas of concern: reluctance to change and dissatisfaction with the adult health services. Most parents (16/22, 73%) did not want to change their existing care circumstances. When asked why, some parents cited dissatisfaction with the adult health service or health system (for the latter, 13/22, 59% of parents). For example, parents found it difficult to attend the follow-up appointments using public transport. Although there was a free shuttle bus service for families who needed it, parents were frustrated by the limited service.
 
Some parents also wanted more flexible visiting hours in the adult hospital so that they could look after their adolescent children, especially those who were cognitively impaired. The parents worried about the quality of care for their children who were entirely dependent for their daily activities. They were also unhappy about the waiting time for medical appointments and stated that their children with DD had a short attention span and were unable to control their behaviour. Long waiting times in a crowded waiting area, which is commonly observed in the adult setting, could easily trigger their behavioural problems.
 
There was also dissatisfaction with the adult health service providers (13/22, 59% of parents). Parents often found that the adult medical staff demonstrated limited understanding and knowledge of their child’s clinical presentation and abilities, especially for those with severe cognitive impairment. The adult health service providers did not know how to communicate with the cognitively impaired adolescents and treated them like any other adult.
 
There is no formal clinical transition service in Hong Kong but when asked, the majority of parents (21/22, 95%) and adolescents (11/13, 85%) stated that they would welcome such a service. About two thirds of the study parents and adolescents (23/35, 66%) would like the clinical transition service to support them during the clinical transition. About one third of the parents (7/22, 32%) believed that the service could act as a bridge linking the paediatric and adult health services, providing information about available services in the adult sector.
 
The Figure summarises the responses of adolescents for self-perceived competency in managing their disability, and Table 3 summarises the study parents’ responses. Most adolescents demonstrated understanding of instructions (11/13, 85%), confidence in communicating with the service providers about their condition (10/13, 77%), and understanding the importance of treatments for their condition (12/13, 92%) [Fig]. About half were confident in seeking help from different specialties according to their condition (6/13, 46%) and making medical decisions (7/13, 54%) [Fig]. Over half of the adolescents, however, lacked the confidence to attend routine medical visits on their own (8/13, 62%) and worried about the unfamiliar adult medical service (6/13, 46%) [Fig]. Most parents stated that they were familiar with their children’s medical conditions and treatments (20/22, 91%) and able to seek help from different medical specialties based on their child’s condition (14/22, 64%) [Table 3]. Only a minority of parents (1/22, 5%), however, believed that their children were capable of attending medical appointments on their own. Less than half of the parents believed that their children would be able to explain their medical condition (9/22, 41%) or make independent clinical decisions in the future (7/22, 32%). None of the parents of an adolescent with cognitive impairment believed that the child would ever be able to manage their own health.
 

Figure. Responses of study adolescents about their self-perceived competency in managing their disability
 

Table 3. Summary of responses of study parents about their self-perceived competency in managing the disability
 
Discussion
The present pilot study aimed to determine how well adolescents with DD and their parents in Hong Kong accept clinical transition and to identify the main barriers to successful transition. This was the first step to understanding the issues of this population group during clinical transition and to enable planning for the future. As far as we know, this study is the first to be conducted in Hong Kong for this population group. Overall, 22 parents and 13 adolescents were recruited from two special schools (one for primarily physically disabled children and the other for severely physically and/or cognitively impaired individuals), with the aim of understanding the acceptance level of, and barriers faced by, these two vastly different groups with DD. The results were very similar between these two subgroups, indicating that the study parents had similar issues during clinical transition, regardless of the type of DD of their child. Hence, the results from these two subgroups were discussed as one group.
 
Most of the study participants were aware of the clinical transition necessary at the age of 18 years. Only 10 (45%) of the 22 families had shifted to the adult health service, despite the fact that their adolescent child was close to or over 18 years old (Tables 1 and 2). The reasons for this delay were not thoroughly explored, but it has been suggested that medical practitioners in the paediatric service felt that the adolescents were not ready for the transition and so continued to see them well into adulthood, while the parents and the adolescents were reluctant to make the move.11 The latter appeared to be true because, when asked, most study participants did not want to change and move to the adult health service. This contradicts the results of a previous local study of adolescents with chronic medical conditions, in which over 80% of the study participants (adolescents and parents) were willing to move to the adult health service.11 The difference is likely due to the complexity of the health conditions of the present cohort. Adolescents with DD usually have varying degrees of physical and/or cognitive impairment and so depend more on others to manage their health condition, making them and their carers more anxious about any change.6 7 For those with chronic medical conditions, the physical and cognitive abilities of the adolescent were unlikely to be affected and hence the adolescents could manage their condition more independently and more readily after the transition.13 This speculation was supported by the findings about self-perceived competency level. Most study adolescents, who had mainly physical disabilities, were not confident about attending a medical appointment alone because of their limited physical abilities (Fig). The reluctance to change may also be due to fear of the unknown and of not being well prepared.11 In Hong Kong, clinical transition is non-structured and unplanned.11 Parents are often informed just before the transition, leading to poor preparation and confusion. Early and continuous clinical transition planning from early adolescence can enable parents and adolescents with DD to be prepared and to actively participate in the transition planning.7 14 Although the clinical transition service is not well known in Hong Kong, the study participants had a positive attitude towards a clinical transition service to help them navigate the process by bridging the paediatric and adult health services. In addition, the study parents wished to have more information about available adult health services, eg rehabilitation services, wheelchair maintenance services, etc. More information about the unknown has been shown to reduce the reluctance of parents and adolescents to change and to further improve their confidence about moving to adult care.5 6 15 16
 
Another barrier was dissatisfaction with the health care system and service providers in the adult setting (Table 2), which is in line with the present literature.7 Some parents found it difficult to arrange transport to the adult hospital for follow-up, while most of the appointments were currently at the special school. More accessible public transport might help, especially in Hong Kong, where private vehicles are not a common option. Flexible visiting hours in the adult hospital that would enable parents to care for their dependent adolescent child may also reduce their dissatisfaction. Longer waiting times for medical appointments in the adult setting were frequently mentioned by the study parents. In the adult sector, patients with all kinds of neurological conditions, both child-onset and adult-onset, are reviewed in the same clinic, and the number of patients attending the clinic is vastly greater than in the paediatric setting. In addition, patients with adult-onset neurological conditions and their families may not understand the characteristics of DD. Stressed behaviour of an adolescent child with DD may be perceived by other families as impatience. In the paediatric clinic, where all clinic attendees were children or adolescents with DD and their parents, the waiting time was shorter, and families as well as clinic staff had a full understanding of DD and would be more tolerant. It is likely that this lack of support in the adult setting further discouraged the study parents from making the transition willingly. Changes to the existing health care system, such as a separate clinic for child-onset DD conditions, may be a possible small step to assist this group in making a smooth transition. Education about clinical transition for staff in both the paediatric and adult settings would allow them to prepare the families in advance. Education about paediatric conditions and communication with adolescents with DD can also equip adult staff with the confidence to develop a strong rapport with the families.16
 
Interestingly, from the perspective of the adolescents, the four who had transitioned stated that they were happy with the adult health service. Two (50%) indicated that because the ‘new’ doctors did not know them well, they paid more attention to them; one adolescent did not give any reason to support his statement. It is likely that in the paediatric sector, these adolescents had been followed up from early childhood by the same medical staff who had virtually watched them grow up. A change of scene and people in the adult sector was welcomed by these adolescents. In addition, they might welcome the idea that they had started to actively participate in the consultation as a ‘patient’, unlike in the paediatric sector, where the consultation was directed at their parents rather than at them.14
 
Parents of adolescents who had physical and/or cognitive impairment had a similar perception of their adolescent child: that they would never be able to attend a medical appointment alone, presumably because of their disability (responses to questions 9 to 10 in Table 3). None of the parents of a cognitively impaired adolescent child believed their child to be capable of explaining their medical condition to others or making an independent medical decision. On the contrary, parents of a physically disabled child thought that, while it might not apply at present, their child would be able to make their own decisions in the future (responses to questions 11 to 14 in Table 3). Nonetheless, there was a discrepancy in this perception between the study adolescents and parents (Fig and Table 3). Most adolescents believed they could explain their condition to others (question 2 in the Fig) and over half believed that they could make an independent medical decision at the time of the study (question 7 in the Fig). Further studies are needed to determine whether this discrepancy is due to confusion on the part of the parents because of changing responsibilities during the transition.13 While western literature emphasises the importance of active participation by adolescents during clinical transition,14 it would be interesting to see if this can be endorsed in a parent-dominant Chinese society such as Hong Kong, where cultural differences may influence attitudes towards clinical transition.17
 
We were unable to analyse other challenges identified in overseas literature, such as unclear eligibility criteria and procedures and limited time for clinical transition, because comparable data were not available for Hong Kong. In other developed countries, adolescents are shifted with the assistance of a clinical transition service based in the children’s hospital (if any) or by applying clinical guidelines for best service.18 In Hong Kong, adolescents are referred from the paediatric section to the adult section within the same hospital. Each hospital cluster may have different procedures for this ‘transition’ and there is no defined department within the hospital structure to assist adolescents and their families through this process. Nor do the families receive any advice about how to negotiate this process. A future in-depth study is recommended to understand the existing situation of clinical transition in different hospital clusters and determine how to establish a more formal approach among all the hospital clusters in Hong Kong to support this group of adolescents and their families during this confusing time.
 
Limitations of the present study
There may have been a selection bias in the study sample as the families were approached by convenience through the school staff. Due to the small sample size, no statistical analysis was conducted to compare the subgroups of adolescents with physical and cognitive impairments. Although the sample size was small, the purpose of the present pilot study was not to generalise the findings to all adolescents with DD but to begin to understand the acceptance of clinical transition and the main barriers to success for adolescents with DD and their family in Hong Kong. The present results are also in line with the literature in this area.4 7 11 14 17 Future studies with a larger sample size and more in-depth qualitative data are required to verify the present results. The potential subjective bias of the results, especially for the open-ended questions, was another limitation but we attempted to minimise this through consensus agreement among the team.
 
Conclusions
In the present explorative study, close to half of the study families had a delayed clinical transition to the adult health service. Most study parents were reluctant for their adolescent children to shift to the adult health service due to unwillingness to change and dissatisfaction with the adult medical service. A structured and well-planned clinical transition was urged by the study participants to bridge the paediatric and adult health services and to provide support to the family. Further studies are required to analyse the needs and concerns of adolescents with DD and their families as well as the service providers in the adult medical setting to facilitate the future development of a clinical transition service in Hong Kong.
 
Acknowledgements
The authors would like to thank all the participating families from the Hong Kong Red Cross John F Kennedy Centre and Haven of Hope Sunnyside School.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Westbom L, Bergstrand L, Wagner P, Nordmark E. Survival at 19 years of age in a total population of children and young people with cerebral palsy. Dev Med Child Neurol 2011;53:808-14. Crossref
2. Public Law 98-527, Developmental Disabilities Act of 1984.
3. Staff J, Mortimer JT. Diverse transitions from school to work. Work Occup 2003;30:361-9. Crossref
4. Blum RW, Garell D, Hodgman CH, et al. Transition from child-centered to adult health-care systems for adolescents with chronic conditions. A position paper of the Society for Adolescent Medicine. J Adolesc Health 1993;14:570-6. Crossref
5. American Academy of Pediatrics, American Academy of Family Physicians, American College of Physicians-American Society of Internal Medicine. A consensus statement on health care transitions for young adults with special health care needs. Pediatrics 2002;110(6 Pt 2):1304-6.
6. Bindels-de Heus KG, van Staa A, van Vliet I, Ewals FV, Hilberink SR. Transferring young people with profound intellectual and multiple disabilities from pediatric to adult medical care: parents’ experiences and recommendations. Intellect Dev Disabil 2013;51:176-89. Crossref
7. Stewart D, Stavness C, King G, Antle B, Law M. A critical appraisal of literature reviews about the transition to adulthood for youth with disabilities. Phys Occup Ther Pediatr 2006;26:5-24. Crossref
8. Persons with disabilities and chronic diseases in Hong Kong. Hong Kong: Hong Kong Census and Statistics Department; 2016. Available from: http://www.statistics.gov.hk/pub/B71501FB2015XXXXB0100.pdf. Accessed Jul 2016.
9. Clusters, hospitals & institutions. Hospital Authority. 2016. Available from: http://www.ha.org.hk/visitor/ha_visitor_index.asp?Content_ID=10036&Lang=ENG&Dimension=100&Parent_ID=10004. Accessed Jul 2016.
10. Hospital Authority Statistical Report 2012-2013. Hong Kong: Hospital Authority; 2013.
11. Wong LH, Chan FW, Wong FY, et al. Transition care for adolescents and families with chronic illnesses. J Adolesc Health 2010;47:540-6. Crossref
12. Hong Kong Society for Adolescent Health. Adolescent Transition Care Assessment Tool, Public Education Series No. 12 (2013). Available from: http://hksah.blogspot.hk/2013/11/adolescent-transition-care-assessment.html. Accessed 6 Feb 2016.
13. Stewart DA, Law MC, Rosenbaum P, Willms DG. A qualitative study of the transition to adulthood for youth with physical disabilities. Phys Occup Ther Pediatr 2002;21:3-21. Crossref
14. Viner RM. Transition of care from paediatric to adult services: one part of improved health services for adolescents. Arch Dis Child 2008;93:160-3. Crossref
15. Blum RW. Introduction. Improving transition for adolescents with special health care needs from pediatric to adult-centered health care. Pediatrics 2002;110(6 Pt 2):1301-3.
16. Stewart D. Transition to adult services for young people with disabilities: current evidence to guide future research. Dev Med Child Neurol 2009;51 Suppl 4:169-73. Crossref
17. Barnhart RC. Aging adult children with developmental disabilities and their families: challenges for occupational therapists and physical therapists. Phys Occup Ther Pediatr 2001;21:69-81. Crossref
18. Department of Health. National Service Framework for Children, Young People and Maternity Services. Transition: getting it right for young people. Improving the transition of young people with long term conditions from children’s to adult health services. 2006. Available from: http://dera.ioe.ac.uk/8742/1/DH_4132145%3FIdcService%3DGET_FILE%26dID%3D23915%26Rendition%3DWeb. Accessed Jul 2016.
