13 March 2026

Nederlandse Ziekenhuisfarmaciedagen 2025

  • Section: Congress abstracts
  • Identification: 2026;11:a1811
  • Author(s): NPFO editorial office - various authors

Survival of patients with colorectal or pancreatic cancer who received UGT1A1 genotype-guided dosing of irinotecan: a multicenter real-world study

Sofía L.J. Peeters ab*, Niels Heersche cd, Doortje M.M. Bohm ab, Stefan Böhringer e, Roselien Guiljam c, Marije Joosse a, Aisha Osman a, Emma C. Hulshof ab, Femke M. de Man c, Mirjam de With cd, Irene E.G. van Hellemond f, Brigitte C.M. Haberkorn g, Arjan J. Verschoor h, Miriam L. Wumkes i, Ron H.N. van Schaik d, Anna M.J. Thijs f, Hans Gelderblom j, Henk-Jan Guchelaar b, Ron H.J. Mathijssen c and Maarten J. Deenen ab

a Department of Clinical Pharmacy, Catharina Hospital, Eindhoven.
b Department of Clinical Pharmacy and Toxicology, Leiden University Medical Centre, Leiden.
c Department of Medical Oncology, Erasmus MC Cancer Institute, Rotterdam.
d Department of Clinical Chemistry, Erasmus University Medical Centre, Rotterdam.
e Department of Biomedical Data Sciences, Leiden University Medical Centre, Leiden.
f Department of Medical Oncology, Catharina Hospital, Eindhoven.
g Department of Medical Oncology, Maasstad Hospital, Rotterdam.
h Department of Medical Oncology, Reinier de Graaf Gasthuis Hospital, Delft.
i Department of Medical Oncology, Jeroen Bosch Hospital, 's-Hertogenbosch.
j Department of Medical Oncology, Leiden University Medical Centre, Leiden.

* Correspondence: sofia.peeters@catharinaziekenhuis.nl.

 

Background
Uridine diphosphate glucuronosyl transferase 1A1 (UGT1A1) genotype-guided dosing significantly reduces the incidence of severe toxicity in UGT1A1 poor metabolizer (PM) patients treated with irinotecan [1]. However, the impact of UGT1A1 genotype-guided dosing of irinotecan on survival outcomes remains unknown. This study evaluated whether upfront 30% dose reductions of irinotecan in UGT1A1 PMs affect survival by comparing progression-free survival (PFS) and overall survival (OS) between PMs treated with an initial 30% dose reduction and fully dosed intermediate and normal metabolizers (IM/NMs).

Methods
We conducted a retrospective, multicenter cohort study in patients with pancreatic cancer (PC) or colorectal cancer (CRC) treated with UGT1A1 genotype-guided irinotecan dosing at six Dutch hospitals between August 2017 and April 2024. All treatment regimens were eligible for inclusion. Patients were included in the primary analysis if irinotecan was dosed according to UGT1A1 genotype (i.e. 100% dose intensity for IM/NMs and 70% for PMs; ± 10% deviation allowed) in at least cycle 1. PFS events were defined as either radiological progression (RECIST 1.1), clinical progression or death from any cause. Survival analyses were performed using Kaplan-Meier estimates and multivariable Cox regressions, stratified by tumor type. Safety was also assessed.
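The Kaplan-Meier estimates mentioned above can be illustrated with a minimal hand-rolled sketch (illustrative data only, not study data; a real analysis would use a dedicated survival library):

```python
# Minimal Kaplan-Meier estimator sketch. Survival probability S(t) is the
# product over event times t_i <= t of (1 - d_i / n_i), where d_i = events
# at t_i and n_i = patients still at risk just before t_i.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = progression/death, 0 = censored.
    Returns [(time, survival probability)] at each event time."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(1 for tt, e in pairs if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in pairs if tt == t)             # all leaving risk set at t
        if d > 0:
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= c
        i += c
    return curve

# Toy example: 5 patients, censoring marked with event=0
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```

By convention, censored patients at a given time are still counted in the risk set for events occurring at that same time, which the sketch reproduces by decrementing the risk set only after the event calculation.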

Results
In total, 779 patients were included in the primary analysis, 76 (9.8%) of whom were PMs (Figure 1). All baseline characteristics were evenly distributed across UGT1A1 groups. PFS and OS rates were comparable over time between PMs and IM/NMs (stratified log-rank test: PFS: P = 0.542; OS: P = 0.419) (Figure 2). For patients with PC, median PFS was 9.0 months (95% CI: 6.2-11.8) in PMs and 8.3 months (95% CI: 7.2-9.4) in IM/NMs. For patients with CRC, median PFS was 6.2 months (95% CI: 5.1-7.3) in PMs and 6.0 months (95% CI: 5.3-6.7) in IM/NMs. Median OS was similar between PMs and IM/NMs in both PC and CRC groups. In stratified multivariable Cox regression analyses, the adjusted hazard ratio of PMs vs IM/NMs was 1.015 (95% CI: 0.78-1.32; P = 0.90) for PFS and 1.10 (95% CI: 0.82-1.48; P = 0.51) for OS, indicating no significant differences in survival outcomes between 30% dose-reduced PMs and fully dosed IM/NMs. Severe toxicity rates were comparable between PMs and IM/NMs.

Conclusions
Survival of UGT1A1 poor metabolizers is not affected by an upfront 30% dose reduction of irinotecan. Therefore, UGT1A1 genotype-guided dosing of irinotecan can be confidently performed and should become the new standard-of-care dosing strategy for irinotecan to improve patient safety.

References
1. Hulshof EC, de With M, de Man FM, et al. UGT1A1 genotype-guided dosing of irinotecan: A prospective safety and cost analysis in poor metaboliser patients. Eur J Cancer. 2022 Feb;162:148-157.

 

The abstract presentation by Sofía Peeters was awarded the prize for Best Abstract 2025.

Cefazolin versus (flu)cloxacillin for methicillin-susceptible Staphylococcus aureus bacteremia: a randomized clinical trial

F.S. Sinkeler a*, N.G.L. Jager a, R.J. Brüggemann a, I. Kouijzer b and J. ten Oever b

a Department of Pharmacy, Pharmacology and Toxicology, Radboud university medical center, Research Institute for Medical Innovation, Nijmegen.
b Department of Internal Medicine and Radboud Community for Infectious Diseases, Radboud university medical center, Nijmegen.

* Correspondence: fleur.sinkeler@radboudumc.nl.

 

Background
Staphylococcus aureus (S. aureus) bacteremia is a leading cause of mortality worldwide, with more than 1 in 4 afflicted patients dying within 90 days. In the Netherlands, reported mortality rates range from 15% to 30%. The optimal therapy for methicillin-susceptible S. aureus (MSSA) bacteremia remains debated, as guidelines favor antistaphylococcal penicillins over cefazolin due to concerns about the inoculum effect. Meta-analyses of observational studies suggest cefazolin may lower 30-day mortality and has fewer adverse events than antistaphylococcal penicillins, but only a randomized trial can provide reliable evidence. To address this and other questions in S. aureus bacteremia, the international S. aureus Adaptive Platform (SNAP) trial was initiated in 2022 as an ongoing Bayesian adaptive platform trial.

Methods
Within the SNAP platform, an open-label, randomized 1:1 comparison of cefazolin versus (flu)cloxacillin was conducted in patients ≥ 18 years with MSSA bacteremia, including participation from Dutch centers. The primary outcome was 90-day all-cause mortality, analyzed using a hierarchical Bayesian logistic regression model. Secondary outcomes included acute kidney injury (AKI), defined as an absolute creatinine increase ≥ 26.5 µmol/L within 5 days or a relative increase ≥ 50% from baseline within 14 days, and hepatotoxicity within 14 days. Non-inferiority was defined as an adjusted odds ratio (aOR) < 1.2 (< 2.5% absolute margin if mortality in the (flu)cloxacillin arm was 15%) and superiority as an aOR < 1.0; conclusions were drawn if the respective posterior probability exceeded 99%.

Results
The trial was conducted between February 2022 and August 2024, at which time non-inferiority for 90-day mortality was met, with a concurrent safety signal of increased AKI in the (flu)cloxacillin group. In total, 671 patients (22 in the Netherlands) were randomized to the cefazolin arm and 670 patients to the (flu)cloxacillin arm. Fifty-four patients were lost to follow-up, leaving 1287 patients in the primary outcome analysis. At 90 days, all-cause mortality was 15.0% with cefazolin (97/645) and 17.0% with (flu)cloxacillin (109/642; aOR 0.81; 95% CrI 0.59-1.12), demonstrating non-inferiority (posterior probability 99.2%) but not superiority (89.8%). Cefazolin was superior to (flu)cloxacillin with respect to AKI (13.9% [92/660] vs. 19.6% [127/648]; aOR 0.67; 95% CrI 0.50-0.89; posterior probability of superiority 99.7%). Hepatotoxicity within 14 days occurred in 13.1% (75/574) of cefazolin recipients and 13.9% (78/562) of (flu)cloxacillin recipients.
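As a rough illustration of the decision rule, the sketch below computes a crude, unadjusted posterior probability that the odds ratio falls below a margin, assuming a flat prior and a normal approximation on the log odds ratio. The trial itself used a hierarchical Bayesian logistic regression model, so these values only approximate the reported posterior probabilities:

```python
# Crude, unadjusted approximation of the non-inferiority/superiority check.
# Assumption: flat prior + normal approximation on log(OR); the actual trial
# analysis (hierarchical Bayesian logistic regression) will give different numbers.
from math import log, sqrt, erf

def prob_or_below(events_a, n_a, events_b, n_b, margin):
    """Approximate posterior P(OR < margin) for arm A vs arm B."""
    a, b = events_a, n_a - events_a   # arm A: deaths / survivors
    c, d = events_b, n_b - events_b   # arm B: deaths / survivors
    log_or = log((a / b) / (c / d))
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Wald standard error of log(OR)
    z = (log(margin) - log_or) / se
    return 0.5 * (1 + erf(z / sqrt(2)))        # standard normal CDF

# 90-day mortality: 97/645 cefazolin vs 109/642 (flu)cloxacillin
print(round(prob_or_below(97, 645, 109, 642, 1.2), 3))  # P(OR < 1.2), non-inferiority margin
print(round(prob_or_below(97, 645, 109, 642, 1.0), 3))  # P(OR < 1.0), superiority threshold
```

Even this crude version reproduces the qualitative result: the probability of non-inferiority is far higher than the probability of superiority.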

Conclusions
Cefazolin was non-inferior to (flu)cloxacillin for 90-day mortality and was associated with less AKI. Cefazolin should be considered the preferred therapy for most adults with MSSA bacteremia.

The effect of allopurinol on incident knee and hip osteoarthritis: a population-based cohort study

M. Hartog ab†, J.P.A. Houwen cd†*, M.W.J. Heijman ab, C.H.M. van den Ende ab, P.C. Souverein c, A. Lalmohamed cd, C.D. Popa be and B.J.F. van den Bemt afg

a Department of Research, Sint Maartenskliniek, Nijmegen.
b Department of Rheumatology, Radboudumc, Nijmegen.
c Division of Pharmacoepidemiology & Clinical Pharmacology, Utrecht Institute for Pharmaceutical Sciences, Utrecht University, Utrecht.
d Department of Clinical Pharmacy, UMC Utrecht, Utrecht.
e Department of Rheumatology, Sint Maartenskliniek, Nijmegen.
f Department of Pharmacy, Sint Maartenskliniek, Nijmegen.
g Department of Pharmacy, Radboudumc, Nijmegen.

† These authors contributed equally to this work.

* Correspondence: j.p.a.houwen-2@umcutrecht.nl.

 

Background
Osteoarthritis (OA) is a leading cause of pain and disability worldwide, yet no disease-modifying OA drugs have been established. Allopurinol, a xanthine oxidase inhibitor commonly used for gout, has been hypothesized to possess disease-modifying properties in OA. This study aimed to investigate the association between allopurinol use and the risk of incident knee or hip OA.

Methods
A retrospective propensity score-matched population-based cohort study was conducted using data from the United Kingdom Clinical Practice Research Datalink. The study included patients aged ≥ 40 years who initiated allopurinol therapy, 1:1 propensity score-matched to non-users. The primary outcome was knee or hip OA incidence. Time-dependent Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). Secondary analyses were performed to assess the impact of allopurinol usage patterns on OA incidence.

Results
A total of 214,452 allopurinol users and 214,452 matched non-users were included, with a mean follow-up of 7.6 years (SD 6.2) and 7.4 years (SD 5.9) respectively. Allopurinol use was significantly associated with a 19% lower hazard of OA (HR: 0.81; 95% CI: 0.74-0.88), and was comparable for both hip (HR: 0.82; 95% CI: 0.69-0.96) and knee OA (HR: 0.80; 95% CI: 0.73-0.88). Longer use (> 1 year) was associated with progressively lower risk (HR: 0.77; 95% CI: 0.71-0.84). High medication adherence (> 80%) showed the strongest risk reduction (HR: 0.18; 95% CI: 0.16-0.20).

Conclusions
Allopurinol use was associated with a 19% reduction in risk of incident knee or hip OA, suggesting disease-modifying properties. Further research is necessary to confirm the preventive role in OA onset.

The impact of therapeutic drug monitoring of fludarabine on clinical outcomes after allogeneic hematopoietic cell transplantation: a randomized controlled clinical trial

Tim Bognàr a†*, Moniek de Witte bc†, Klaartje Nijssen b, Anna van Rhenen b, Lotte van der Wagen b, Laura van Hussen b, Anke Janssen b, Anniek Stuut c, Peter van de Ven d, Dirk-Jan Moes e, Peter van Balen f, Stijn Halkes f, Aurelia de Vries Schultink a, Arief Lalmohamed a, Kim van der Elst a and Jurgen Kuball bc

a Department of Clinical Pharmacy, University Medical Center Utrecht/Wilhelmina Children's Hospital, Utrecht.
b Department of Hematology, University Medical Center Utrecht, Utrecht.
c Center of Translational Immunology, University Medical Center Utrecht, Utrecht.
d Department of Data Science and Biostatistics, Julius Center, University Medical Center Utrecht.
e Pharmacy Department, Leiden University Medical Center, Leiden.
f Hematology Department, Leiden University Medical Center, Leiden.
† These authors contributed equally to this work.

* Correspondence: t.bognar-2@umcutrecht.nl.

 

Background
In a retrospective study, we showed that 57% of adults receiving fludarabine during conditioning for allogeneic hematopoietic stem cell transplantation (allo-SCT) were overexposed, which was associated with an increase in non-relapse mortality (NRM) [1]. Here, we report an interim analysis of a prospective randomized controlled trial, in which therapeutic drug monitoring (TDM)-guided fludarabine dosing was compared to conventional body surface area (BSA)-based dosing in recipients of an allo-SCT.

Methods
Adult patients with hematological malignancies and a creatinine clearance of ≥ 40 ml/min were eligible for inclusion. Conditioning consisted of anti-thymocyte globulin, fludarabine, and busulfan. Patients were randomized (1:1) into two groups. The fludarabine-BSA group received 40 mg/m2/day fludarabine over 4 days, while in the fludarabine-TDM group dosing was performed using TDM targeting a cumulative area under the curve (AUC) of 20 mg*h/L (range 15-25 mg*h/L). The primary endpoint was the cumulative incidence of severe viral infections (defined as CTC-AE > grade 3) 3 months post allo-SCT. Secondary endpoints were NRM, acute graft-versus-host disease (aGVHD) ≥ grade II, chronic GVHD (cGVHD), engraftment, overall survival (OS), event-free survival (EFS), GVHD-relapse-free survival (GRFS), immune reconstitution, and attained fludarabine exposure.

Results
A total of 98 patients with a median age of 59 years were enrolled: 48 in the fludarabine-BSA group and 50 in the fludarabine-TDM group. Median time of follow-up was 24 months. Cohorts were well balanced, except for the HCT-comorbidity index (HCT-CI) score, with more patients scoring ≥ 3 in the fludarabine-TDM group (32% vs 13%). TDM-guided dosing was effective, as all patients received the target fludarabine exposure (median 19.6 mg*h/L, range 18.1-24.1), whereas in the BSA-based dosing group, 52.1% were overexposed (median 25.7 mg*h/L, range 13.8-44.5; Figure 1). Severe viral infections were comparable between those with fludarabine BSA-based dosing and fludarabine TDM-guided dosing (Figure 2). GVHD incidence was not impacted (Figure 2). Relapse incidence was also similar between groups (31% in the fludarabine-BSA group and 27% in the fludarabine-TDM group; P = 0.89). Multivariate analysis showed no differences in any endpoints including OS and NRM. Immune reconstitution was improved at one month in the fludarabine-TDM group, but not at later time points.

Conclusions
In conclusion, while fludarabine TDM was successful for attaining the target exposure, we did not see a positive impact on clinical outcomes. TDM was associated with improved early CD4+ immune reconstitution, but failed to meet the primary endpoint, namely the cumulative incidence of severe viral infections.

References
1. Langenhorst JB, van Kesteren C, van Maarseveen EM, et al. Fludarabine exposure in the conditioning prior to allogeneic hematopoietic cell transplantation predicts outcomes. Blood Adv. 2019 Jul 23;3(14):2179-2187.

 

Dried blood spot recommendations for optimal practices in children and adolescents

Lisa T. Ringeling ab*, Jiayi Liang a, Hilal Cetin a, Soma Bahmany a, Rebecca A. Hermans ab, Bram Dierckx b, Birgit C.P. Koch a and Brenda C.M. de Winter a

a Department of Hospital Pharmacy, Erasmus University Medical Center, Rotterdam.
b Department of Child and Adolescent Psychiatry/Psychology, Erasmus University Medical Center, Rotterdam.

* Correspondence: l.ringeling@erasmusmc.nl.

 

Background
Dried blood spot (DBS) offers several advantages for therapeutic drug monitoring (TDM), including a minimally invasive procedure, small blood volume requirements, and the possibility for home sampling. However, the main challenge of DBS lies in obtaining good-quality spots for reliable analysis. To evaluate the impact of enhanced visual and textual support in the sampling guidance on DBS quality, we analysed data collected from a prospective multicenter study in children and adolescents.

Methods
A total of 108 DBS cards from 56 children were assessed based on the quality of the spots. Rejection criteria included: spot diameter less than 6 mm, smeared or irregularly shaped spots, and overlapping spots. Approval and rejection rates, overall success rates of DBS cards, and the most common rejection reasons were compared with and without the enhanced visual and textual support in the sampling guidance (intervention).
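The rejection criteria above amount to a simple rule check per spot. The function below is a hypothetical sketch of that logic, not the study's assessment procedure (which was visual), and the input format is an illustrative assumption:

```python
# Hypothetical sketch of the DBS spot rejection rules described above.
# Input format (diameter, smeared/irregular flag, overlap flag) is an assumption.

MIN_DIAMETER_MM = 6.0

def assess_spot(diameter_mm, smeared_or_irregular, overlapping):
    """Return the list of rejection reasons; an empty list means the spot is approved."""
    reasons = []
    if diameter_mm < MIN_DIAMETER_MM:
        reasons.append("diameter < 6 mm")
    if smeared_or_irregular:
        reasons.append("smeared or irregularly shaped")
    if overlapping:
        reasons.append("overlapping spots")
    return reasons

def card_usable(spots):
    """A card is analysable if at least one spot has no rejection reasons."""
    return any(not assess_spot(*spot) for spot in spots)

# Example card: one small smeared spot plus one clean 8 mm spot
print(card_usable([(4.5, True, False), (8.0, False, False)]))  # True
```

The distinction between the two outcome measures in the Results follows directly: `card_usable` corresponds to the per-card success rate, while counting spots with no rejection reasons corresponds to the per-spot rate.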

Results
Pre-intervention, 47.6% of the DBS cards contained at least one qualitatively approved spot for analysis, and 25.7% of all spots qualified for analysis. Post-intervention, these rates increased to 85.1% and 63.8%, respectively. In both the pre- and post-intervention groups, the most common reason for rejection was the presence of smeared or irregularly shaped spots.

Conclusions
Continued improvement in DBS quality hinges on the effective resolution of home sampling challenges, which is critical for its seamless incorporation into clinical practice.

Comparable infliximab exposure and reduced treatment burden after switching from intravenous to subcutaneous maintenance therapy in adult patients with inflammatory bowel disease in remission, independent of immunomodulator co-use (SHUFFLE study)

Lieke M.J. van de Ven-van Dinter a*, Mariëlle Romberg-Camps b, Dennis R. Wong a, Adriaan A. van Bodegraven b† and Niels W. Boone a

a Department of Clinical Pharmacy, Pharmacology and Toxicology, Zuyderland Medical Centre, Heerlen/Sittard-Geleen.
b Department of Gastroenterology, Zuyderland Medical Centre, Heerlen/Sittard-Geleen.
† These authors share last authorship.

* Correspondence: l.vandinter@zuyderland.nl.

 

Background
Infliximab (IFX), an anti-TNF-alpha monoclonal antibody, is approved for the treatment of Inflammatory Bowel Disease (IBD). Recently, a subcutaneous (SC) flat-dose IFX biosimilar was introduced, offering potential advantages over intravenous (IV) weight-based dosing. Prospective pharmacokinetic data and clinical outcomes in IBD practice are scarce, since drug approval was primarily based on a rheumatoid arthritis study population with concurrent use of methotrexate.
The main objective of this prospective study was to compare IFX exposure and clinical outcomes when switching from IV to SC IFX therapy in IBD patients in remission, and to assess the effect of concomitant immunomodulators.

Methods
In this single-centre, prospective study, thirty-two adult IBD patients in clinical remission on a 6-8 weekly IFX IV-dosing interval were switched to biweekly dosed SC IFX and followed for 24 weeks. The primary endpoint was the comparison between the Area Under the Concentration-time curves (AUCs) at steady state before and after switching. AUCs were calculated using MwPharm++ (Mediware, version 2.4.0; Hanzel et al., 2021). Secondary endpoints included trough levels, time burden, quality of life (IBDQ-NL) and anti-drug antibodies (ADAbs).

Results
A total of 32 patients were evaluated, of whom 20 received IFX monotherapy and 12 received IFX combination therapy. The cohort comprised 11 patients with ulcerative colitis and 21 with Crohn's disease. Mean AUCs over the 6-8 week interval were comparable between IV and SC administration (28116 ± 7240 mg·h/L vs 28534 ± 10274 mg·h/L; P = 0.75), independent of immunomodulator use (SC AUC 28810 ± 10683 mg·h/L vs 27075 ± 10001 mg·h/L; P = 0.962). IFX trough levels increased on SC IFX (median 15.9 vs 4.9 mg/L; P < 0.001), independent of immunomodulator use (P = 0.713). Time burden decreased substantially (median reduction 536 minutes/6 months; P < 0.001) and the IBDQ-NL score increased (188 to 197; P = 0.006). ADAbs were detected in 6% of patients, without clinical impact. During the follow-up period, no patient had an exacerbation or required escalation of treatment. After 6 months, 97% of patients were still being treated with SC IFX.

Conclusions
Switching from IV to SC IFX in IBD patients in remission maintains equivalent drug exposure with higher trough levels, without clinically relevant ADAb formation, reduces time burden, and improves quality of life, regardless of immunomodulator co-use.

The use of ChatGPT in summarizing adverse drug reactions in patients with polypharmacy

Johanna H.M. Driessen ab, Receb Gündogan c*, Bregje Witjes c, Laura Nijstad c, Alan Abdulla c and Fatma Karapinar-Çarkit ab

a Department of Clinical Pharmacy & Toxicology, Maastricht University Medical Center+, Maastricht.
b CARIM, department of clinical pharmacy, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht.
c Department of Hospital Pharmacy, Erasmus University Medical Center, Rotterdam.

* Correspondence: r.gundogan@franciscus.nl.

 

Background
Polypharmacy occurs frequently, especially in older people. Polypharmacy has been associated with medication-related harm, including hospital admissions and adverse drug reactions (ADRs). ADRs are often not recognized in patients with polypharmacy. Therefore, an overview of potential ADRs for currently used drugs is needed. However, retrieving ADR data is time-consuming and often impractical. Large language models, such as ChatGPT, are capable of summarizing information very quickly. Therefore, we investigated whether ChatGPT can produce an adequate overview of side effects.

Methods
Thirty commonly prescribed cardiovascular drugs (including ten combination drugs) were tested across five iterations using an engineered prompt in ChatGPT 4o that restricted sourcing to the Farmacotherapeutisch Kompas (FK). ChatGPT extracted adverse effects and categorized frequency; only common (1–10%) and very common (>10%) effects were tabulated. Outputs were manually cross-checked with the FK.
The accuracy, categorization errors, hallucinations and omission errors were evaluated. Categorization and hallucination error rates were calculated as the number of errors divided by the number of side effects reported by ChatGPT × 100%. The omission error rate was the number of omissions divided by the number of side effects reported in the FK × 100%. Accuracy was calculated by dividing the number of correctly reported side effects by the total number of side effects reported in the FK. The mean with standard deviation (SD) and the median with interquartile range (IQR) were calculated over the five iterations per investigated drug.
Furthermore, we investigated whether results were affected by brand versus substance names and stepwise prompts versus single-message prompts. An independent samples t-test was used to compare the mean accuracy between the groups.
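Treating side-effect lists as sets, the error metrics defined above can be computed as follows (illustrative data; in the study, outputs were cross-checked against the FK manually):

```python
# Sketch of the error metrics defined in the Methods, with side-effect lists
# modelled as sets. The example drug data below is illustrative, not from the study.

def evaluate(reported, reference, miscategorized):
    """reported: side effects ChatGPT listed; reference: side effects in the FK;
    miscategorized: correctly named effects placed in the wrong frequency class."""
    hallucinations = reported - reference          # reported but absent from the FK
    omissions = reference - reported               # in the FK but not reported
    correct = reported & reference
    return {
        "accuracy_%": 100 * len(correct) / len(reference),
        "omission_%": 100 * len(omissions) / len(reference),
        "hallucination_%": 100 * len(hallucinations) / len(reported),
        "categorization_%": 100 * len(miscategorized) / len(reported),
    }

reported = {"nausea", "dizziness", "headache", "dry mouth"}
reference = {"nausea", "dizziness", "headache", "fatigue", "cough"}
print(evaluate(reported, reference, miscategorized={"headache"}))
```

Note that accuracy and omission rate are normalized by the FK list, while hallucination and categorization rates are normalized by ChatGPT's own output, exactly as the Methods specify; the two denominators generally differ.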

Results
The mean accuracy for the thirty tested drugs was 77.9% (SD 21.7%), with a median of 80.0% (IQR: 62.4-98.4%). The most common errors were omission errors (mean: 19.4%, SD: 19.5%), whereas hallucination (mean: 0.5%, SD: 1.8%) and categorization errors (mean: 0.11%, SD: 0.5%) occurred less often. Seven of 30 drugs were fully correct across all five iterations. Stepwise split prompts yielded better results than single-message prompts (78% vs. 13% mean accuracy; P = 0.046), whereas substance versus brand names did not significantly influence the results (75% vs. 78% mean accuracy; P = 0.541).

Conclusions
ChatGPT showed promising results in summarizing ADRs of the tested cardiovascular drugs. Omission errors were the most frequently occurring errors. Prompt optimization is needed before clinical application to avoid omission errors.

Interventions in response to computerized decision support alerts on antithrombotics: a nationwide flashmob study in Dutch community and outpatient pharmacies

Jetske Graafsma ab, Sarah Agterhuis c, Sander Borgsteede cd, Liset van Dijk ce, Joanna E. Klopotowska fg, Fatma Karapinar-Carkit hi* and Patricia M.L.A. van den Bemt a; for the Dutch primary care CDSS-efficiency study group

a Department of Clinical Pharmacy and Pharmacology, University Medical Center Groningen, University of Groningen, Groningen.
b Department of Pharmacy, Frisius MC Heerenveen, Heerenveen.
c Department of PharmacoTherapy, -Epidemiology and -Economics, University of Groningen, Groningen.
d Department of Clinical Decision Support, Health Base Foundation, Houten.
e Nivel, Netherlands Institute for Health Services Research, Utrecht.
f Department of Medical Informatics Amsterdam UMC, University of Amsterdam, Amsterdam.
g Amsterdam Public Health Institute, Amsterdam.
h Department of Clinical Pharmacy & Toxicology, Maastricht University Medical Center+, Maastricht.
i Department of Clinical Pharmacy, CARIM, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht.

* Correspondence: f.karapinar@mumc.nl.

 

Background
Medication safety is a critical aspect of healthcare, as adverse drug events (ADEs) can result in severe patient harm or even death. Antithrombotics are frequently implicated in (fatal) ADEs. In Dutch community and outpatient pharmacies, computerized decision support systems (CDSS) generate medication safety alerts to help prevent ADEs. However, current CDSS have low specificity due to limited personalization and lack of clinical context. The extent to which this hampers the efficiency of CDSS remains largely unexplored in the primary care setting, for medication alerts in general and for antithrombotic alerts in particular. Therefore, the main objective of this study was to determine the proportion of CDSS alerts on antithrombotics that result in an intervention in Dutch community and outpatient pharmacies.

Methods
A multicenter, single-day, cross-sectional observational study was conducted using a flashmob design. All community and outpatient pharmacies in the Netherlands were invited to participate. Pharmacy staff were asked to collect the number and type of CDSS alerts (e.g. drug-drug interactions) on antithrombotics and on any medication, the number of interventions triggered by these alerts, and the estimated time spent on assessing alert relevance. An intervention was defined as an action in response to an alert by pharmacy staff, e.g. contacting the physician. The primary outcome was the efficiency of CDSS alerts involving antithrombotics, defined as the proportion of alerts needing an intervention. Secondary outcomes included the efficiency of alerts on any medication, the efficiency per alert type and the time spent on assessing alert relevance. Descriptive statistics were used for data analysis.

Results
Of the 2004 pharmacies invited (1928 community and 76 outpatient), 82 pharmacies (50 community and 32 outpatient) participated (response rate 4.1%) and 80 pharmacies were eventually included. A total of 2463 alerts on antithrombotics were included (median 25 per pharmacy, interquartile range (IQR) 42) and 36,508 alerts on any medication (median 382 per pharmacy, IQR 369). CDSS efficiency was 0.0% (IQR 2.4%) for alerts on antithrombotics and 1.9% (IQR 2.9%) for alerts on any medication. For alerts on any medication, drug-drug interaction alerts showed the highest efficiency (4.0%, IQR 9.5%). The estimated time needed for assessing the relevance of the alerts was almost 2 hours per pharmacy (1 h 54 min; IQR 1 h 38 min).

Conclusions
This study shows that the efficiency of CDSS alerts on both antithrombotics and on any medication is very low in community and outpatient pharmacies. Optimization of CDSS efficiency is necessary to improve medication safety in primary care.

A digital assistant for medication allergy history taking (ChaRM-A): discrepancies between medication allergy history documentation by patients supported by a digital assistant and pharmacy technicians

S. Heeren a*, B. Apaydin b, E. Derissen b, G. Schoonman cd, K. Tenfelde e, C. van der Lee d, J. de Wit e and B. Maat b

a Department of Pharmacoepidemiology and Clinical Pharmacology, Utrecht University, Utrecht.
b Department of Hospital Pharmacy, Elisabeth-Tweesteden Hospital, Tilburg.
c Department of Neurology, Elisabeth-Tweesteden Hospital, Tilburg.
d Department of Communication and Cognition, Tilburg School of Humanities and Digital Sciences, Tilburg University, Tilburg.
e Tranzo Scientific Center for Care and Wellbeing, Tilburg School of Social and Behavioral Sciences, Tilburg University, Tilburg.

* Correspondence: s.heeren@students.uu.nl.

 

Background
Medication reconciliation is the process of creating an accurate overview of the medication and allergy history of patients, aiming to achieve safe medication use around care transitions. Electronic personal health records (ePHRs) increasingly allow patients to manage their own medical data. Integration of a digital assistant (DA; i.e. a conversational agent) into ePHRs may be of added value as a supportive tool to help better engage patients and improve patients’ experience of entering (complex) data. Previously, the use of a DA for medication history taking (ChaRM) showed positive results: patients positively experienced DA guidance and few clinically relevant inaccuracies occurred. Consequently, an additional prototype was developed that supports patients in documenting their medication allergy history (ChaRM-A). The aim of this study was to evaluate the accuracy of medication allergy history taking performed by patients guided by ChaRM-A compared to pharmacy technicians.

Methods
A prospective, observational study was conducted from February-May 2025. Outpatients ≥ 18 years, who were scheduled for a visit to the outpatient medication reconciliation department, were randomly selected for participation. Patients performed medication allergy history taking in an online standalone environment with ChaRM-A support. Subsequently, a pharmacy technician conducted medication reconciliation including medication allergy history taking (usual care). The primary outcome was the proportion of patients with a discrepancy between the two composed medication allergy lists. Secondary outcomes were the nature and prevalence of the discrepancies, their clinical relevance and potential risk factors. Data were analysed through descriptive statistics and regression analysis.

Results
Forty-five patients (median age 66 years [IQR 57-76], range 19-88 years) documented their medication allergy history with support of ChaRM-A. The median time they needed was 2 minutes and 27 seconds [IQR 01:33-04:39]. Sixteen (35.6%) patients did not have any allergies. The other 29 patients documented ≥ 1 allergy and 28 (96.6%) of them had ≥ 1 discrepancy (median 1 discrepancy per patient [IQR 1-4], range 1-10). In total, 70 discrepancies were found: 37.1% documentation deviations, 35.7% omissions and 27.1% additions. Most discrepancies (n = 61, 87.1%) were classified as unharmful. The number of discrepancies per patient was significantly associated with the number of allergies per patient (P = 0.03 [95% CI: 1.03-1.94]).

Conclusions
Although a high number of discrepancies between medication allergy history taken by patients with a DA and pharmacy technicians was found, their clinical relevance was low, indicating that a DA may safely support patients. Additional research and optimisation of the DA is needed to fully support patients in self-reporting medication allergies.

Portability of a text mining algorithm for detecting adverse drug reactions in electronic health records across diverse patient groups in two Dutch hospitals

Britt W.M. van de Burgt ab*, Loes F.C. van Dijck c, Bjorn Dullemond d, Naomi T. Jessurun e, Minou van Seyen f, Rob J. van Marum fgh, Remco J.A. van Wensen i, Wai-Yan Liu i, Carolien M.J. van der Linden j, Rene J.E. Grouls a, R. Arthur Bouwman bk, Erik H.M. Korsten b and Toine C.G. Egberts lm

a Department of Clinical Pharmacy, Catharina Hospital Eindhoven, Eindhoven.
b Department of Electrical engineering, signal processing group, Technical University Eindhoven, Eindhoven.
c Department of Clinical Pharmacy, SJG Weert, Weert.
d Department of Mathematics and Computer Science, Technical University Eindhoven, Eindhoven.
e Netherlands Pharmacovigilance Centre LAREB, 's-Hertogenbosch.
f Department of Clinical Pharmacy, Jeroen Bosch Hospital, ‘s-Hertogenbosch.
g Departments of Clinical Pharmacology and Geriatrics, Jeroen Bosch Hospital, 's-Hertogenbosch.
h Department of Elderly Care Medicine, Amsterdam Public Health Research Institute, Amsterdam UMC Location VUmc.
i Department of Orthopaedic Surgery & Trauma, Catharina Hospital, Eindhoven.
j Department of Geriatrics, Catharina Hospital Eindhoven, Eindhoven.
k Department of Anesthesiology, Catharina Hospital Eindhoven, Eindhoven.
l Department of Clinical Pharmacy, University Medical Centre Utrecht, Utrecht.
m Department of Pharmacoepidemiology and Clinical Pharmacology, Utrecht Institute for Pharmaceutical Sciences, Faculty of Science, Utrecht University, Utrecht.

* Correspondence: britt.vd.burgt@catharinaziekenhuis.nl.

 

Background
Adverse drug reactions (ADRs) pose a significant challenge in healthcare. While structured documentation of ADRs in electronic health records (EHRs) enables automated alerting, many ADRs are recorded as unstructured free text, limiting detection. Text mining (TM) shows potential for extracting clinically relevant data from unstructured text. However, the portability of TM algorithms across institutions and departments remains uncertain, owing to variations in EHR structures and documentation practices. Evaluating portability is therefore essential to ensure that such general-purpose algorithms perform effectively across diverse clinical settings. This study aimed to evaluate the portability of a previously developed TM-based ADR identification algorithm by assessing its performance on EHRs from two different departments in two different hospitals.

Methods
EHR free-text data from 62 hospitalized patients in the geriatric and orthopaedic departments of two Dutch teaching hospitals were reviewed for ADRs via manual review and the TM algorithm. Performance was evaluated using F-score, sensitivity and positive predictive value (PPV), with comparisons across hospitals and departments.

Results
Manual review identified 359 unique ADRs. The TM algorithm detected 534 potential ADRs, 286 of which overlapped with manual review, yielding an F-score of 0.64, sensitivity of 80% and PPV of 54%. Performance was consistent across hospitals and departments. Notably, 26 ADRs identified by the algorithm were clinically relevant yet missed in manual review.
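The reported metrics follow directly from the stated counts (359 manually identified ADRs, 534 algorithm detections, 286 overlapping); a minimal check in Python:

```python
# Reproduce the reported performance metrics from the abstract's counts.
manual = 359    # unique ADRs found by manual review (reference standard)
detected = 534  # potential ADRs flagged by the text-mining algorithm
overlap = 286   # flagged ADRs that were confirmed by manual review

sensitivity = overlap / manual   # share of true ADRs the algorithm found
ppv = overlap / detected         # share of flags that were true ADRs
f_score = 2 * ppv * sensitivity / (ppv + sensitivity)  # harmonic mean

print(f"sensitivity {sensitivity:.0%}, PPV {ppv:.0%}, F-score {f_score:.2f}")
# → sensitivity 80%, PPV 54%, F-score 0.64
```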

Conclusions
This study demonstrates the portability of the TM algorithm, which identified potential ADRs across different hospitals and departments without adaptation. These findings support its potential for broader implementation of ADR detection in diverse healthcare settings.

Adequacy of Direct Oral Anticoagulant Dosing in Nursing Home Residents: Cross-Sectional Data from the DOAC-FRAIL Study

Irme I.S. Franssen ab*, Melanie J. de Jong ab, Anne Vievermans, Jasmijn J.F. Stroo, Yvonne M.C. Henskens c, Henri Spronk ab, Daisy J.A. Janssen de, Dionne Braeken a, Sander M.J. van Kuijk f, Kristien Winckers abg and Fabienne J.H. Magdelijns bh

a Thrombosis Expertise Center, Maastricht University Medical Center+, Maastricht.
b Cardiovascular Research Institute Maastricht (CARIM), Maastricht University, Maastricht.
c Central Diagnostic Laboratory, Department of Clinical Chemistry and Hematology, Maastricht University Medical Center+, Maastricht.
d Department of Health Services Research, CAPHRI, Maastricht University, Maastricht.
e Department of Expertise and Treatment, Proteion, Horn.
f Department of Clinical Epidemiology and Medical Technology Assessment, Maastricht University Medical Center+, Maastricht.
g Department of Internal Medicine, Section of Vascular Medicine, Maastricht University Medical Center+, Maastricht.
h Department of Internal Medicine, Section of Geriatric Medicine, Maastricht University Medical Center+, Maastricht.

* Correspondence: irme.franssen@mumc.nl.

 

Background
Direct oral anticoagulants (DOACs) are widely prescribed in older adults, but dosing adequacy in frail institutionalized populations remains unclear. Inappropriate dosing may increase the risk of thromboembolic events or bleeding. We aimed to determine the prevalence of inappropriate DOAC dosing in nursing home residents and identify associated clinical factors.

Methods
We conducted a cross-sectional study among 100 residents prescribed a DOAC in a Dutch long-term care organization. Dosing adequacy was assessed according to guideline-based criteria (EHRA for atrial fibrillation, NICE for venous thromboembolism). Two independent reviewers evaluated each case, with expert consensus in case of disagreement. Logistic regression was used to identify factors associated with under- or overdosing.

Results
Median age was 86 years (IQR 82-89); 59% were female. DOAC distribution: apixaban 60%, rivaroxaban 22%, dabigatran 10%, edoxaban 8%. The indication was atrial fibrillation in 87% of residents. Overall, 76% were adequately dosed, 18% underdosed, and 6% overdosed. Rivaroxaban use was independently associated with underdosing (aOR 4.61, 95% CI 1.37-15.51). Impaired renal function (eGFR < 60 mL/min) was not significantly associated with underdosing (aOR 0.35, 95% CI 0.11-1.12). The small number of overdosed patients (n = 6) precluded multivariable analysis.

Conclusions
Nearly one in four nursing home residents prescribed a DOAC received inappropriate dosing, predominantly underdosing. Rivaroxaban use was strongly associated with underdosing. These findings highlight the need for systematic dose reassessment and tailored prescribing support in frail institutionalized populations to optimize safety and efficacy of anticoagulation.

Evaluation of drug-drug interactions associated with metamizole at a Dutch regional hospital

T.G. Jacobs ab*, S.E.J.D. van den Eijnde a, L. Nijboer c, A. Kylstra a, S.B. Nicia d, D.M. Burger ce, P.D. van der Linden a and M. van Nuland a

a Department of Pharmacy, Tergooi Medical Center, Hilversum.
b Department of Pharmacy, Antonius Ziekenhuis, Nieuwegein.
c Department of Pharmacy, Pharmacology & Toxicology, Research Institute for Medical Innovations (RIMI), Radboud UMC, Nijmegen.
d Department of Anaesthesiology, Tergooi Medical Center, Hilversum.
e Global DDI Solutions, Utrecht.

* Correspondence: tjacobs@tergooi.nl.

 

Background
Metamizole, a non-steroidal anti-inflammatory drug, has seen increased use in national clinical practice following its inclusion in guidelines for managing post-operative pain. Recent studies have shown that metamizole is a moderate inducer of cytochrome P450 (CYP)3A4, CYP2B6, and CYP2C19, which can lead to drug-drug interactions (DDIs) with co-medications.

Methods
This retrospective observational study evaluated the number of DDIs associated with metamizole when prescribed for ≥ 24 hours in a clinical setting. Data were collected from the electronic healthcare records of adult patients prescribed metamizole at Tergooi Medical Center between June 2017 and May 2024. Data were extracted using Clinical Data Collector (CTcue) and analysed using descriptive statistics in SPSS and R. Relevant DDIs with metamizole were identified using the Metamizole DDI Manager developed by Global DDI Solutions (www.DDIManagers.com). Only clinically relevant DDIs, i.e. orange (action may be needed) and red (contra-indicated), that occurred during or within 7 days after discontinuation of metamizole treatment were considered in this study.
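The inclusion window for DDIs (during treatment or within 7 days after discontinuation) amounts to a simple date comparison. The function below is an illustrative reconstruction of that rule, not the study's actual extraction code:

```python
from datetime import date, timedelta

def ddi_in_window(metamizole_start: date, metamizole_stop: date,
                  comedication_day: date, washout_days: int = 7) -> bool:
    """True if an interacting co-medication falls within metamizole
    treatment or the washout window after its discontinuation.
    Illustrative sketch of the study's 7-day inclusion rule."""
    window_end = metamizole_stop + timedelta(days=washout_days)
    return metamizole_start <= comedication_day <= window_end

# A co-medication given 5 days after stopping metamizole still counts:
print(ddi_in_window(date(2024, 1, 1), date(2024, 1, 10), date(2024, 1, 15)))
# → True
```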

Results
A total of 37,110 unique patients received at least one prescription of metamizole between June 2017 and May 2024. Of these, 2.6% (n = 968) were treated with metamizole for ≥24 hours. Notably, the yearly number of patients receiving metamizole for ≥24 hours has increased over the study period. Among all patients receiving metamizole ≥24 hours, 98.6% (n = 954) were prescribed at least one interacting medication. In total, 3,680 DDIs were identified, corresponding to an average of 3.8 DDIs per metamizole prescription. A total of 98 different interacting medications were identified, of which 95% were classified as orange, and 5% as red.

Conclusion
Metamizole is associated with a high number of potential DDIs in clinical practice, many of which are potentially clinically relevant. Given the prevalence of these interactions, caution is needed when prescribing metamizole, and pharmacists play a critical role in identifying and managing DDIs. Further research is needed to assess the impact of metamizole on commonly co-prescribed medications and to better understand how these interactions may affect patient outcomes.

The Incidence of Bleeding and Thrombotic Complications After Bariatric Surgery among Vitamin K Antagonist Users and Direct Oral Anticoagulant Users

I. Tebbens a*, M.A. Damhof b, M.A. Kaijser c, M. Emous c, D. Sizoo c, E.N. van Roon b and B.E. Oortgiesen ab

a Department of PharmacoTherapy, -Epidemiology & -Economics, University of Groningen, Groningen.
b Department of Pharmacy and Pharmacology, Frisius Medical Centre, Leeuwarden.
c Centre Obesity Northern Netherlands, Department of Surgery, Frisius Medical Centre, Leeuwarden.

* Correspondence: ilse.tebbens@frisiusmc.nl.

 

Background
Bariatric surgery alters gastrointestinal (GI) anatomy, potentially affecting drug absorption [1]. While the effect of vitamin K antagonists (VKAs) can be monitored via international normalized ratio (INR), direct oral anticoagulants (DOACs) lack such routine monitoring. Furthermore, absorption sites differ between individual DOACs [2,3], and the safety and efficacy of DOACs after bariatric surgery remain unclear. Therefore, this study aims to determine the incidence of bleeding and thrombotic complications after bariatric surgery in patients using VKAs or DOACs.

Methods
In this retrospective study at Frisius Medical Centre Leeuwarden, patients using a DOAC (apixaban, dabigatran, edoxaban, rivaroxaban) or a VKA (acenocoumarol, phenprocoumon) after bariatric surgery were included. Patients with surgery before 2016 or with gastric/intestinal tubes were excluded. Bleeding and thrombotic events were assessed from postoperative day 30 onwards, to exclude surgery-related events. Follow-up ended at the earliest of: (1) a bleeding or thrombotic event, (2) discontinuation/switch of anticoagulant, (3) death, (4) absence of records in the last month, or (5) the predefined study end. Only the first prescribed anticoagulant was analysed. Primary outcomes were the incidence of any and clinically relevant bleeding and thrombotic events per 100 person-years. Statistical analyses included chi-square, Fisher's exact and survival analyses (Kaplan-Meier, log-rank, Cox regression).

Results
This study included 169 patients (97 VKA, 72 DOAC). The median follow-up duration was 2.1 years for VKA users, 1.7 years for apixaban, 1.5 years for dabigatran, 1.9 years for edoxaban and 3.1 years for rivaroxaban. No thrombotic complications occurred. When considering any bleeding, 12 events occurred: 1.6 per 100 person-years for VKA users and 6.8 per 100 person-years for rivaroxaban (n = 38) users. Notably, no bleeding events occurred with apixaban (n = 20), dabigatran (n = 8) or edoxaban (n = 6). Rivaroxaban was associated with a significantly higher risk of any bleeding versus VKA (hazard ratio 4.6, 95% CI 1.2-17.5), emerging after 12 months. Clinically relevant bleeding occurred in 2 VKA users and 3 rivaroxaban users (0.8 vs. 2.5 per 100 person-years), suggesting a nonsignificant trend toward higher risk with rivaroxaban.
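Incidence rates per 100 person-years, as reported above, are simply events divided by total follow-up time. A minimal helper; the numbers in the example are illustrative, since the abstract does not report the raw person-year denominators:

```python
def rate_per_100py(events: int, person_years: float) -> float:
    """Incidence rate per 100 person-years of follow-up."""
    return 100 * events / person_years

# Illustrative numbers only; the study's person-year totals are not reported.
print(rate_per_100py(3, 120))  # 3 events over 120 person-years → 2.5
```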

Conclusions
No thrombotic complications were observed in this study. Rivaroxaban was associated with a significantly higher risk of any bleeding compared with VKA, while showing a nonsignificant trend toward an increased risk of clinically relevant bleeding. Given the small sample sizes for individual DOACs, these findings should be interpreted with caution, and larger studies are warranted to further assess bleeding and thrombotic risks across DOAC subtypes.

References
1. Kingma JS, Burgers DMT, Monpellier VM, et al. Oral drug dosing following bariatric surgery: General concepts and specific dosing advice. Br J Clin Pharmacol. 2021 Dec;87(12):4560-4576.
2. Martin KA, Lee CR, Farrell TM, Moll S. Oral Anticoagulant Use After Bariatric Surgery: A Literature Review and Clinical Guidance. Am J Med. 2017 May;130(5):517-524.
3. Byon W, Nepal S, Schuster AE, Shenker A, Frost CE. Regional Gastrointestinal Absorption of Apixaban in Healthy Subjects. J Clin Pharmacol. 2018 Jul;58(7):965-971.

Incidence of rash and acute kidney failure in a real-life cohort of hospitalized patients with legionella pneumonia treated with levofloxacin or ciprofloxacin

Vera Bukkems ab*, Jet Gisolf c, Maaike de Blauw c and Monique de Maat a

a Department of Clinical Pharmacy and Pharmacology, Rijnstate, Arnhem.
b Department of Pharmacy, Pharmacology and Toxicolgy, Radboudumc, Nijmegen.
c Department of Internal medicine, Rijnstate, Arnhem.

* Correspondence: vera.bukkems@radboudumc.nl.

 

Background
Directed antibiotic therapy for the serious condition legionella pneumonia consists of a fluoroquinolone or a macrolide. Since the introduction of levofloxacin instead of ciprofloxacin as first-line agent in 2020, healthcare providers in our hospital seem to observe more rash and acute kidney injury (AKI). In the literature these side effects are described for both fluoroquinolones, with an incidence of 1-3%. However, no difference in incidence between ciprofloxacin and levofloxacin has been described. This study therefore aimed to objectively test this hypothesis in a retrospective database study in our hospital.

Methods
In this single-centre retrospective cohort study, patients who were hospitalized with a positive legionella urine antigen and/or PCR test and treated with levofloxacin and/or ciprofloxacin between June 2015 and December 2024 were included using CTcue v4.13.1. The primary endpoint was a descriptive comparison of the incidence of rash and AKI for ciprofloxacin versus levofloxacin. The occurrence of rash, as assessed by the treating healthcare provider, was collected from descriptions of rash (or a synonym) in the patient data file. AKI was defined as an increase in serum creatinine after the start of medication, in accordance with the KDIGO 2012 criteria.
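As a simplified sketch of the serum-creatinine arm of the KDIGO 2012 definition referred to above (an increase of ≥ 26.5 µmol/L within 48 hours, or a rise to ≥ 1.5 times baseline), not the study's actual code; the urine-output criterion and severity staging are deliberately omitted:

```python
def kdigo_aki(baseline_umol_l: float, current_umol_l: float,
              rise_within_48h: bool) -> bool:
    """Simplified KDIGO 2012 serum-creatinine criterion for AKI.

    AKI if creatinine rises >= 26.5 umol/L within 48 hours, or reaches
    >= 1.5x baseline (presumed within the prior 7 days). The KDIGO
    urine-output criterion and severity staging are omitted here.
    """
    absolute_rise = rise_within_48h and (current_umol_l - baseline_umol_l >= 26.5)
    relative_rise = current_umol_l >= 1.5 * baseline_umol_l
    return absolute_rise or relative_rise

print(kdigo_aki(80, 130, rise_within_48h=True))   # +50 umol/L in 48 h → True
print(kdigo_aki(80, 100, rise_within_48h=False))  # +20, 1.25x baseline → False
```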

Results
A total of 192 patients with a median (min-max) age of 69 (29-96) years met the inclusion criteria. Treatment consisted of ciprofloxacin alone (45%), levofloxacin alone (26%) or both (switch therapy; 30%). Rash and AKI occurred in 12 (6%) and 27 (14%) patients, respectively. When comparing the treatment groups, 7%, 4% and 9% of the ciprofloxacin, levofloxacin and switch therapy groups, respectively, experienced rash. For AKI, this was 15%, 14% and 12%. Of the patients with a rash or AKI, 50% and 67%, respectively, were admitted to the intensive care unit.

Conclusions
Rash and AKI are common adverse events in hospitalized patients with legionella pneumonia, and we observed a higher incidence of these adverse events than described in the literature. The levofloxacin group did not experience more rash or AKI than the ciprofloxacin group. These results do not provide grounds for changing the first-line policy, and suggest that switching to another fluoroquinolone upon a rash or AKI is not sensible.

Closing the dosing gap in pregnancy: evidence-based antibiotic dose recommendations from Project MADAM

D.T.A. Hiensch a*, V. Bukkems a, A.C. Dibbets abc, H.C.J. Scheepers bc and S.N. de Wildt ade

a Department of Pharmacy, Pharmacology and Toxicology, Radboud University Medical Center, Nijmegen.
b Department of Obstetrics and Gynaecology, Maastricht University Medical Centre, Maastricht.
c GROW, Institute for Oncology and Reproduction, Maastricht.
d Department of Pediatric and Neonatal Intensive Care, Erasmus MC-Sophia Children’s Hospital, Rotterdam.
e Department of Intensive Care, Radboud University Medical Center, Nijmegen.

* Correspondence: dagmar.hiensch@radboudumc.nl.

 

Background
Amoxicillin, cefuroxime, cefazolin, and azithromycin are frequently used antibiotics during pregnancy. Standard dosing guidelines often recommend equal or lower doses than those for non-pregnant individuals, despite pregnancy-related physiological changes that can reduce drug exposure and therefore compromise therapeutic effectiveness. We aimed to determine pregnancy-specific dosing recommendations using literature review, physiologically based pharmacokinetic (PBPK) modelling, and multidisciplinary expert consensus.

Methods
Pharmacokinetic and safety data were collected for the four antibiotics and assessed using our Framework for Dose Selection in Pregnancy (FDSP) [1]. Antenatal dosing recommendations were established for the following indications: amoxicillin for mild-to-moderate infections, prophylaxis in preterm pre-labour rupture of membranes (PPROM), and prevention of early-onset neonatal sepsis (EONS); cefuroxime for pyelonephritis and community-acquired pneumonia (CAP); cefazolin as peri-operative prophylaxis; and azithromycin for general infection treatment. Recommendations were based on the minimum inhibitory concentrations (MICs) of the disease-associated pathogens, aiming for adequate target attainment. The proposed dose recommendations were reviewed by a multidisciplinary committee, comprising clinicians, pharmacists, pharmacometricians, a medical ethicist, and patient representatives.

Results
Studies show that amoxicillin exposure declines during the second and third trimesters of pregnancy. Therefore, the highest dose within the usual range is recommended for treating mild-to-moderate infections during pregnancy. For PPROM, increasing the dose from 500 mg three times daily to 750 mg three times daily is warranted. The standard EONS prophylaxis regimen remains sufficient for target attainment. Similarly, cefuroxime exposure decreases in late pregnancy. For the treatment of pyelonephritis, it is advised to increase the dosing frequency from three to four times daily, using a 1500 mg intravenous (IV) dose in the second and third trimester. Conversely, the current regimen for CAP remains appropriate. Cefazolin exposure is also reduced during pregnancy. PBPK modelling suggests that pregnant individuals weighing over 80 kg should receive a 2 g IV dose 30 minutes before surgery, while those weighing under 80 kg should receive 1 g IV. In contrast, azithromycin concentrations appear comparable between pregnant and non-pregnant individuals, supporting the use of non-pregnant dosing regimens during pregnancy for this antibiotic. These dose recommendations have been endorsed by the multidisciplinary committee and are available via the websites of Moeders van Morgen Lareb and Farmacotherapeutisch Kompas, enabling their use in Dutch clinical practice.

Conclusions
These pharmacokinetic-pharmacodynamic based evidence reviews underscore the increased risk of underdosing antibiotics during pregnancy and the consequent need to adjust doses accordingly to ensure effective and safe treatment for both the mother and foetus.

Reference
1. Koldeweij C, Dibbets C, Franklin BD, Scheepers HCJ, de Wildt SN. A User-Driven Framework for Dose Selection in Pregnancy: Proof of Concept for Sertraline. Clin Pharmacol Ther. 2025 Jan;117(1):214-224.

Antibiotic treatment duration of complicated urinary tract infection patients in Europe: results from POS-cUTI study

C.J.A.R. Kats ab*, J.M. Bravo-Ferrer c, T. ten Doesschate ad, P.D. van der Linden ab, M.J.M. Bonten a and J. Rodríguez-Baño c

a Epidemiology of Infectious Diseases, Julius Center for Health Sciences and Primary Care, Utrecht.
b Tergooi Hospital, Hilversum.
c Hospital Universitario Virgen Macarena/Universidad de Sevilla/IBiS/CIBERINFEC, Spain.
d Jeroen Bosch Hospital, ‘s Hertogenbosch.

* Correspondence: ckats@tergooi.nl.

 

Background
Complicated urinary tract infections (cUTI) are a leading cause of hospital admissions and antibiotic use, contributing significantly to healthcare costs and antimicrobial resistance. Guidelines recommend antibiotic treatment durations of 7-14 days. However, limited data exist regarding real-world treatment durations in European countries. This study aims to map antibiotic treatment duration and associated baseline characteristics in the POS-cUTI study.

Methods
The POS-cUTI study is a perpetual observational study enrolling hospitalised patients with cUTI, initiated in October 2022. Patients aged ≥ 18 years with cUTI are included; patients with a limited life expectancy prior to cUTI or participating in randomised controlled trials are excluded. Information (day 0-30) on clinical course and outcomes, including treatment and complications, is collected prospectively using structured forms. Antibiotic treatment duration is calculated by subtracting the start date from the end date of antibiotic therapy.
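The stated duration calculation is a plain date difference; a minimal sketch:

```python
from datetime import date

def treatment_days(start: date, end: date) -> int:
    """Antibiotic treatment duration: end date minus start date, in days."""
    return (end - start).days

print(treatment_days(date(2023, 3, 1), date(2023, 3, 13)))  # → 12
```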

Results
The median total, intravenous, and oral antibiotic treatment durations were 12 days (interquartile range [IQR] 8-15), 6 days (IQR 3-10), and 4 days (IQR 0-8), respectively. Approximately a quarter of patients had a treatment duration longer than 14 days. Longer treatment duration was associated with prostatitis, renal abscesses, ICU admission, and hospital readmission due to cUTI (Figure 1). Conversely, complicated cystitis (median 6.5, IQR 5-14) was associated with shorter treatment duration. There was significant cross-country variation in treatment duration, though country representation (e.g., Romania: n = 4; Denmark: n = 8) and hospital type per country varied (Figure 2).

Conclusions
These results reveal variability in antibiotic treatment duration among hospitalised cUTI patients, influenced by infection type and severity, and country. The relatively long median intravenous treatment duration of 6 days suggests a potential opportunity for reduction, which could shorten hospital stays, decrease healthcare costs, and improve sustainability. Further investigation into the underlying motivations for prescribing longer total treatment durations, particularly in less severe cases, is warranted. Addressing these discrepancies is crucial for optimising antibiotic stewardship and mitigating antimicrobial resistance.

Prevalence and impact of medication discrepancies across care transitions: A prospective observational cohort study in Dutch hospitals

Betsie Limmen ab*, Joy van Broekhuizen c, Alan Abdulla d, Michiel Duyvendak e, Linda van Eikenhorst f, Karin Hek f, Judith de Ruijter-van Dalem ag, Patricia van den Bemt c and Fatma Karapinar ab

a Department of Clinical Pharmacy and Toxicology, Maastricht University Medical Center+, Maastricht.
b Department of Clinical Pharmacy, CARIM Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht.
c Department of Clinical Pharmacy and Pharmacology, University of Groningen, University Medical Centre Groningen, Groningen.
d Department of Hospital Pharmacy, Erasmus University Medical Center, Rotterdam.
e Department of Clinical Pharmacy, Antonius Hospital Sneek, Sneek.
f Netherlands Institute for Health Services Research (NIVEL), Utrecht.
g Department of Clinical Pharmacy, NUTRIM, Institute of Nutrition and Translational Research in Metabolism, Maastricht University, Maastricht.

* Correspondence: betsie.limmen@mumc.nl.

Background
Transitions in healthcare can pose risks to patient safety, as medication information must be transferred between healthcare sectors. Incomplete or inconsistent transfer often results in medication discrepancies (MDs), which are associated with patient harm, avoidable hospitalisations, and increased healthcare costs. To address these risks, the Dutch National Medication Transfer Program aims to implement comprehensive electronic transfer of medication information. The primary aim of this study was to quantify the proportion of patients with at least one MD across different care transitions. Secondary aims were to assess the time required for medication reconciliation and the associated costs.

Methods
A prospective observational cohort study was conducted in two Dutch hospitals, the Erasmus University Medical Centre Rotterdam and the Antonius Hospital Sneek. Adults (≥ 18 years) using ≥ 3 prescribed medications were eligible for inclusion between July and September 2024. At hospital admission and discharge, medication lists reconciled by pharmacy technicians were compared with medication lists from the Nationwide Medication Record System (NMRS), community pharmacies, and general practices (GP). For outpatient clinics, the physicians’ medication lists were compared with the NMRS. In the hospital setting, the unintentionality of discrepancies was assessed with the treating physician. The primary outcome was the percentage of patients with ≥ 1 unintentional medication discrepancy (UMD) at admission, discharge, or outpatient clinics, or ≥ 1 MD in community pharmacy or GP records. Secondary outcomes were the time required for medication reconciliation and the associated costs. Descriptive statistics were used for analysis.

Results
A total of 144 patients were included (82 [57%] female; median age 65 years, IQR 55-75). Of these, 65 were included at admission, 66 at discharge (partly overlapping with the admission group), and 66 at outpatient clinics. At admission, 6 (9%) patients had ≥ 1 UMD, and at discharge, 9 (14%) patients. In outpatient clinics, 29 patients (44%) had ≥ 1 UMD.
In community pharmacies, 47 (76%) patients at admission had ≥ 1 MD, and 46 (78%) at discharge. For general practitioners, these figures were 43 (78%) and 52 (90%), respectively.
The median time spent on medication reconciliation was 12 minutes (IQR 8-15) at admission, 10 minutes (IQR 7-15) at discharge, and 1.4 minutes (IQR 0.8-1.8) in outpatient clinics, corresponding to costs of € 6.00, € 5.00, and € 3.21 per patient.

Conclusions
This study shows that (U)MDs are common, particularly in settings without medication reconciliation by pharmacy technicians. The time and costs of medication reconciliation are substantial. These findings underscore the need for improved medication information transfer.

Reducing oxycodone prescription and use in orthopedic patients after hip and knee arthroplasty: a pre-post intervention study

M. Phaff a*, R. Geuze b and B. Maat c

a Department of Pharmacoepidemiology and Clinical Pharmacology, Utrecht University, Utrecht.
b Department of Orthopaedic Surgery, Elisabeth-Tweesteden Hospital, Tilburg.
c Department of Hospital Pharmacy, Elisabeth-Tweesteden Hospital, Tilburg.

* Correspondence: m.m.phaff@students.uu.nl.

 

Background
Surgical procedures such as total knee arthroplasty (TKA) and total hip arthroplasty (THA) are associated with higher risks of long-term opioid use, compared to non-surgical procedures. Studied interventions to reduce opioid use after THA and TKA mainly focused on pharmacological approaches, including peripheral nerve blocks, local anaesthetic infiltration, nonsteroidal anti-inflammatory drugs, and multimodal analgesia. The aim of this study was to assess the effect of a multifaceted intervention (at patient-, nurse- and prescriber-level) on postoperative prescribing and use of oxycodone after discharge in TKA and THA patients.

Methods
A prospective monocenter pre-post intervention study was conducted. Patients ≥ 18 years scheduled for TKA or THA between 17/03/2025-04/04/2025 (pre-intervention) and 06/05/2025-23/05/2025 (post-intervention) were included. The multifaceted intervention combined (i) intensified patient education on opioid use, (ii) extended postoperative pain assessment by nurses and (iii) tailored oxycodone prescribing by physicians. Primary outcomes were postoperative opioid prescribing and use (percentage of patients discharged with at least one oxycodone prescription and percentage of patients using oxycodone on the day of discharge and on days 1, 3, 7 and 14 post-discharge). Secondary outcomes were postoperative pain, defined as Numeric Rating Scale pain scores on the prespecified days, prescription/refill requests, and leftover medication after discharge. Differences between pre- and post-intervention groups were analysed using an unpaired t-test or chi-square test. Changes in pain scores over time, and differences in these changes between groups, were analysed using mixed repeated-measures ANOVA.

Results
Pre-intervention, 34 patients were included; post-intervention, 32 patients. Patient characteristics were comparable between groups. Pre-intervention, 34/34 patients (100%) were discharged with oxycodone (immediate release, IR), compared with 16/32 post-intervention patients (50%) (P < 0.0001). This reduction remained significant for THA and TKA patients separately (P < 0.0001 and P < 0.005, respectively). No significant reduction in opioid use after discharge was observed for either THA or TKA patients. There were no significant differences in pain scores between the pre- and post-intervention groups. Among patients who were not discharged with oxycodone IR, only one requested a prescription, which was ultimately not used. Pre-intervention, 28/34 (82.4%) patients had surplus oxycodone tablets; post-intervention this proportion was lower, but still 14/34 (41.2%) patients reported having oxycodone tablets left.

Conclusions
The multifaceted intervention significantly reduced the proportion of THA and TKA patients discharged with oxycodone postoperatively, without increasing postoperative pain after discharge. Nonetheless, post-intervention, about 40% of patients had excess oxycodone tablets, indicating room to further reduce the availability of oxycodone and the consequent risk of long-term use.

Towards standardized drug-drug interaction alerts and workflows in Dutch intensive care units: results of a national survey

Charlotte A.M. Mittendorff ab*, Joanna E. Klopotowska acd and Arthur T.M. Wasylewicz ae, ADAPTIVE Study Group

a Department of Medical Informatics, Amsterdam UMC location University of Amsterdam, Amsterdam.
b Department of Clinical Pharmacy, Haga Teaching Hospital, The Hague.
c Amsterdam Public Health, Digital Health, Amsterdam.
d Amsterdam Public Health, Quality of Care, Amsterdam.
e Lead Healthcare, Baarn.

* Correspondence: c.a.m.mittendorff@amsterdamumc.nl.

 

Background
Intensive care unit (ICU) professionals are frequently overloaded with drug-drug interaction (DDI)-alerts that lack clinical relevance in the ICU context, undermining the effectiveness of Clinical Decision Support Systems (CDSSs).
In 2019, a Dutch multicenter trial demonstrated that an ICU-specific strategy for DDI-alerts, developed by a national panel of ICU professionals, reduced administration of high-risk drug combinations by 12% (P = 0.0008) and ICU length of stay by 6% (P = 0.0021). These findings underscored that CDSSs only become an effective safety tool in DDI prevention in the ICU when tailored to that setting. Since this trial, however, most ICUs have transitioned from ICU-exclusive to hospital-wide electronic health records (EHRs), while CDSS alerting has shifted from simple to flowchart-based logic. Whether these transitions have altered the need for a centralized ICU-specific strategy for DDI-alerts remains unclear.
Therefore, the objective of this study was to characterize current CDSS practices for DDI-alerts and associated workflows within Dutch ICUs in order to inform future CDSS intervention strategies.

Methods
We conducted a nationwide electronic survey between March and September 2025 among Dutch clinical pharmacists with expertise in ICU, medication safety, or ICT. The 11-question survey assessed EHR/CDSS infrastructure, ICU-specific strategy for DDI-alerts, and related workflows.

Results
The response rate was 74% (50/68 hospitals), covering all seven university hospitals, 82% of teaching hospitals, and 63% of general hospitals. Regarding EHR/CDSS infrastructure, 74% of ICUs reported using an integrated CDSS as part of a hospital-wide EHR, 12% an integrated CDSS within an ICU-exclusive EHR, and 14% non-integrated or manual surveillance. A majority (88%) used flowchart-based alert logic. Despite trial evidence supporting ICU-specific DDI-alerts, 46% of hospitals reported no ICU-specific strategy. Substantial workflow variation was observed: at ICU admission, for example, home medication is routinely discontinued in 10% of hospitals, temporarily paused in 16% (still triggering alerts), and continued in 70%, which impacts DDI-alert generation. Alert presentation also varies, being directed to both pharmacy and ICU staff in 74%, only to pharmacy in 18%, and only to ICU staff in 6% of hospitals.

Conclusions
Despite trial evidence that an ICU-specific strategy for DDI-alerts improves CDSS effectiveness and patient outcomes, nearly half of Dutch hospitals lack such a strategy. In addition, variation in workflows further undermines consistency across organizations. Centralizing the development and maintenance of ICU-specific strategies for DDI-alerts would standardize practice, reduce the workload needed for local customizations, and facilitate national implementation compatible with modern EHR/CDSS systems, thus ensuring that CDSSs achieve their potential in critical care.

Drug-related problems following bariatric surgery identified during clinical pharmacist-initiated reviews: prevalence and practical recommendations from a cohort study

B.E. Oortgiesen ab*, W.E.M. van Bakel b, J. Witteveen c, W. Burger d, M. Emous d and E.N. van Roon a

a Department of Pharmacy, Frisius MC, Leeuwarden.
b University of Groningen, Groningen Research Institute of Pharmacy, Department of Pharmaceutical Analysis, Groningen.
c Department of Internal medicine, Frisius MC, Leeuwarden.
d Department of Bariatric surgery, Frisius MC, Leeuwarden.

* Correspondence: berdien.oortgiesen@frisiusmc.nl.

 

Background
Patients undergoing bariatric surgery are at increased risk of drug-related problems (DRPs) due to gastrointestinal alterations, postoperative shifts in body composition and subsequent changes in pharmacokinetics and pharmacodynamics [1]. Identifying and managing DRPs in this population is essential to ensure optimal pharmacotherapy and minimize associated risks. This study aimed to evaluate the prevalence and nature of DRPs identified during a clinical pharmacist-initiated review, six months after bariatric surgery, in patients at high risk for medication-related complications.

Methods
This prospective observational study included patients at high risk for medication-related complications who underwent bariatric surgery and attended the outpatient clinic for follow-up six months after surgery. DRPs were identified through patient interviews and medication reviews conducted by a clinical pharmacist. Patients were classified as ‘high risk for medication-related complications’ based on polypharmacy (defined as the concurrent use of five or more medications with systemic exposure, excluding standard postoperative bariatric surgery medications), the use of specific high-risk drugs (e.g., psychotropic agents or medications with a narrow therapeutic range), and/or the presence of patient-reported medication-related concerns. Medication regimens were systematically reviewed, and data were analysed to characterize common DRPs and explore their associations with patient demographics and comorbidities.

Results
Seventy-eight patients, all of whom provided informed consent, were reviewed between April 2023 and December 2024; 173 DRPs were identified in 69 patients (median 2 per patient; range 0-7). As shown in Table 1, the most frequent DRPs were drug therapy without a clear indication (40%, n = 69), adverse drug reactions (12%, n = 21), drug-drug interactions (12%, n = 21), suboptimal drug selection (10%, n = 18), and issues related to therapy adherence (10%, n = 17). Commonly proposed interventions included discontinuation of proton pump inhibitors when no longer indicated, addressing orthostatic hypotension caused by continued use of antihypertensive drugs, and resolving interactions in which calcium supplements restrict iron absorption from bariatric multivitamins. The number of comorbidities at the time of the medication review showed a statistically significant (P < 0.001) correlation with the occurrence of DRPs, whereas no associations were observed with age or BMI.

Conclusions
This study found a high prevalence of DRPs in patients at high risk for medication-related complications six months after bariatric surgery, particularly regarding drug therapy without indication, adverse drug reactions, and drug-drug interactions. These results emphasize the value of medication reviews conducted by clinical pharmacists in the post-bariatric surgery population, as they effectively identify DRPs and enable targeted interventions to optimize pharmacotherapeutic management.

Reference
1. Kingma JS, Burgers DMT, Monpellier VM, et al. Oral drug dosing following bariatric surgery: General concepts and specific dosing advice. Br J Clin Pharmacol. 2021 Dec;87(12):4560-4576.

Beliefs, practices and knowledge regarding the use and risks of opioids among clinicians

M.B.J.M. Smit a*, B. Maat b, R.E. Geuze c, T.C.C. Jaspers b, C.P.A. van Hees d, D. Jansen e and M.M.P.M. Jansen b

a Department of Pharmaceutical Sciences, Utrecht University, Utrecht.
b Department of Clinical Pharmacy, Elisabeth-TweeSteden Hospital, Tilburg.
c Department of Orthopaedic Surgery, Elisabeth-TweeSteden Hospital, Tilburg.
d Department of Vascular Surgery, Elisabeth-TweeSteden Hospital, Tilburg.
e Department of Anaesthesiology, Elisabeth-TweeSteden Hospital, Tilburg.

* Correspondence: m.b.j.m.smit@students.uu.nl.

 

Background
In the Netherlands, opioid prescriptions increased significantly between 2008 and 2017, raising concerns about misuse and resulting in the development of prescribing guidelines and awareness efforts. While previous research mainly focused on the perception of prescribing opioids for chronic non-malignant pain in primary care, this study aimed to provide an overview of the perspectives among secondary care clinicians, distinguishing between different pain types and comparing perspectives with those in primary care.

Methods
A prospective quantitative study was conducted at the Elisabeth-TweeSteden Hospital, a large teaching hospital, using a self-developed, anonymous questionnaire, based on the Health Belief Model and available literature, to assess clinicians’ beliefs, practices, and knowledge regarding opioid prescribing and (mis)use. The questionnaire included 38 items across three domains and distinguished between three types of pain: acute and post-operative pain (APOP), chronic non-malignant pain (CNMP), and cancer pain (CP). In addition, an indirect comparison was made between primary and secondary care prescribers with regard to CNMP. Data were analysed descriptively.

Results
Of the 668 invited clinicians, 168 completed the questionnaire for at least one type of pain (response rate 25.1%). Most responses concerned APOP (n = 133), followed by CNMP (n = 49) and CP (n = 30). Clinicians managing CNMP reported less confidence and were more cautious when prescribing opioids than those treating APOP or CP. Clinicians treating CP perceived fewer barriers and greater benefits of opioids. Clinicians treating APOP considered opioids primarily for effective short-term pain management, in contrast to clinicians treating CNMP or CP, who more often considered long-term opioid use necessary. Across all pain types, 70.5% of clinicians reported low confidence in recognising symptoms of misuse. Clear agreements on opioid refills within specialties (26%) or between specialties and general practitioners (GPs) (18%) were rare. Finally, an indirect comparison between GPs and clinicians in secondary care showed that GPs reported greater confidence in their skills in prescribing opioids for CNMP (61.2% vs. 34.7%).

Conclusions
This study shows that clinicians' beliefs about opioid prescribing vary by pain type, with those treating CNMP expressing more concerns and less confidence than those managing APOP or CP. Recognizing opioid misuse is a challenge for all clinicians. General practitioners seem more confident in managing CNMP than clinicians in secondary care. Finally, there seems to be room for harmonisation of responsibilities regarding opioid refills.

Creating Awareness of preScribing Cascades & managing Adverse Drug Effects (CASC-ADE) tool: appraisal by future end-users

P. Denig a, K. van der Walle bc*, T. Boerman a, F.J.H. Magdelijns de, J.T.H. Nielen bd and F. Karapinar-Çarkit bd

a Department of Clinical Pharmacy and Pharmacology, University of Groningen, University Medical Centre Groningen, Groningen.
b Department of Clinical Pharmacy & Toxicology, Maastricht University Medical Center+, Maastricht.
c NUTRIM, Institute of Nutrition and Translational Research in Metabolism, Maastricht University, Maastricht.
d CARIM, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht.
e Department of Internal Medicine, subdivision of General Internal Medicine, section of Geriatric medicine, Maastricht University Medical Center+, Maastricht.

* Correspondence: kjell.van.der.walle@mumc.nl.

 

Background
Prescribing cascades occur when adverse drug reactions (ADRs) are misinterpreted as new medical conditions and subsequently treated with additional medications (Figure 1), leading to various negative outcomes for patients and the health care system in general. A key pillar in the management of prescribing cascades is having theoretical and procedural knowledge of ADRs and prescribing cascades. Therefore, a tool was developed comprising knowledge documents that summarize background information and support advice to recognise, address and prevent several clinically relevant prescribing cascades. This study aimed to assess end-users’ appraisal of this so-called “Creating Awareness of preScribing Cascades & managing Adverse Drug Effects” (CASC-ADE) tool.

Methods
An online questionnaire including 15 items, scored on 7-point Likert scales, was developed to cover four domains: purpose, rigour, usability, and usefulness of the CASC-ADE tool. The questionnaire was sent, along with at least one knowledge document, to a convenience sample of healthcare professionals (HCPs) and medical and pharmacy students. Respondents could make suggestions for improvement and other comments in open text boxes. Descriptive statistics are presented separately for HCPs and students.

Results
Seventeen HCPs (9 pharmacists, 8 clinicians) and 11 students (5 pharmacy, 6 medical) were included. All items received positive median scores, ranging from 5-7 for HCPs and 5-6 for students. The highest scores were seen for purpose, usability (in particular, clarity and efficiency), and usefulness for recognising prescribing cascades (4 items, medians 6-7). The usability for addressing and preventing prescribing cascades was perceived slightly lower (medians 5-6). Fifteen HCPs and 9 students were positive about using the tool in practice, although some indicated that this would depend on linkage to existing information systems. A few HCPs and students suggested presenting explicit recommendations instead of a summary of advice on how to manage specific prescribing cascades.

Conclusions
The CASC-ADE tool was generally perceived as more than adequate across the studied domains by both HCPs and students. Further research is needed to assess whether the CASC-ADE tool improves recognition, management and prevention of prescribing cascades in clinical practice.

The time to next treatment (TTNT) of patients with chronic lymphocytic leukemia (CLL): a population-based study

A. Veenstra a, B.E. Oortgiesen ab*, F.G.A. Jansman ac and M. Hoogendoorn d

a University of Groningen, Groningen Research Institute of Pharmacy, Department of Pharmaceutical Analysis, Groningen.
b Department of Pharmacy, Frisius Medical Centre, Leeuwarden.
c Department of Pharmacy, Deventer Hospital, Deventer.
d Department of Haematology, Frisius Medical Centre, Leeuwarden.

* Correspondence: berdien.oortgiesen@frisiusmc.nl.

 

Background
The treatment landscape of chronic lymphocytic leukemia (CLL) has changed drastically in recent years, as new treatments have emerged from clinical trials. However, differences between clinical trials and daily practice, for example in disease monitoring, limit understanding of treatment patterns and outcomes of real-world patients. This study aimed to address knowledge gaps between clinical trials and daily practice, by determining the time to next treatment (TTNT) in a CLL population.

Methods
A retrospective, observational study was conducted. All patients diagnosed with CLL between January 2005 and January 2024, who received ≥ 1 treatment at any of the four Frisian hospitals (Frisius MC Leeuwarden, Frisius MC Heerenveen, Nij Smellinghe, and Antonius), were included using the electronic health record system Hemobase. TTNT was defined as the interval between the start of a treatment and the start of a subsequent treatment, death, or censoring at last follow-up. Secondary endpoints included time to first treatment (TTFT), defined as the interval between diagnosis and initiation of first-line (1L) therapy, and progression-free survival (PFS), defined as the interval between the start of a treatment and disease progression, death, or censoring at last follow-up. In accordance with the International Workshop on Chronic Lymphocytic Leukemia (iwCLL) criteria, disease progression was defined as an increase in lymphocyte count of ≥ 50% after treatment, reaching at least 5 × 10⁹/L lymphocytes [1].
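The iwCLL progression criterion above combines a relative and an absolute threshold; a minimal sketch of that rule (illustrative only, not the study's analysis code) is:

```python
def iwcll_progression(baseline_lymphocytes: float, current_lymphocytes: float) -> bool:
    """iwCLL criterion: >= 50% increase in lymphocyte count after treatment
    AND an absolute count of at least 5 x 10^9/L (inputs in 10^9 cells/L)."""
    increased_50pct = current_lymphocytes >= 1.5 * baseline_lymphocytes
    above_threshold = current_lymphocytes >= 5.0
    return increased_50pct and above_threshold

assert iwcll_progression(4.0, 6.5)       # +62.5% and above 5 x 10^9/L
assert not iwcll_progression(2.0, 2.8)   # +40%: relative threshold not met
assert not iwcll_progression(2.0, 4.5)   # +125% but under 5 x 10^9/L
```

Both conditions must hold simultaneously; a large relative rise at a low absolute count does not qualify.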

Results
Of 697 eligible patients, 282 (40.5%) received 1L treatment. Median follow-up of the treated cohort was 90 months. Median TTFT was 15 months. Three-year PFS was 44.3% overall, and 9.5%, 46.4% and 78.8% for chemotherapy (CT), CT + anti-CD20 and targeted therapy, respectively (Figure 1A). 139 patients (19.9%) received second-line (2L) treatment. Three-year 1L TTNT was 62.4% overall, and 36.9%, 68.3% and 86.6% for CT, CT + anti-CD20 and targeted therapy, respectively (Figure 1B). Patients receiving CT + anti-CD20 (HR = 0.46, 95% CI [0.32, 0.65], P < 0.001) or targeted therapy (HR = 0.25, 95% CI [0.12, 0.52], P < 0.001) had a lower risk of initiating 2L therapy compared to CT. No significant difference was observed between CT + anti-CD20 and targeted therapy (P = 0.096).

Conclusions
Targeted therapy showed a longer TTNT than CT, and a longer, though not statistically significant, TTNT than CT + anti-CD20 (chemoimmunotherapy, CIT) in our analysis, whereas clinical trials have demonstrated a significant benefit of targeted therapy over CIT [2,3]. Future research is necessary to determine whether this advantage translates into real-world practice and to optimize treatment sequencing.

References
1. Hallek M, Cheson BD, Catovsky D, et al. iwCLL guidelines for diagnosis, indications for treatment, response assessment, and supportive management of CLL. Blood. 2018 Jun 21;131(25):2745-2760.
2. Al-Sawaf O, Robrecht S, Zhang C, et al. Venetoclax-obinutuzumab for previously untreated chronic lymphocytic leukemia: 6-year results of the randomized phase 3 CLL14 study. Blood. 2024 Oct 31;144(18):1924-1935.
3. Niemann CU, Munir T, Moreno C, et al. Fixed-duration ibrutinib-venetoclax versus chlorambucil-obinutuzumab in previously untreated chronic lymphocytic leukaemia (GLOW): 4-year follow-up from a multicentre, open-label, randomised, phase 3 trial. Lancet Oncol. 2023 Dec;24(12):1423-1433.

Pharmacokinetics of long-acting cabotegravir and rilpivirine in virologically suppressed adolescents living with HIV-1 in Sub-Saharan Africa: Data from the LATA trial

Lisanne Bevers a*, Elizabeth Chappell b, Cissy Kityo Mutuluuza c, Henry Mugerwa c, Mutsa Bwakura Dangarembizi d, Sibusisiwe Weza d, Louis Diana Anena e, Moherndran Archary f, Molly Bush b, Alexandra Green b, Margaret J. Thomason b, Deborah Ford b, Adeodata R. Kekitiinwa e, Sarah L. Pett b, Tom Jacobs a and David M. Burger a, for LATA trial team

a Radboud University Medical Center, Nijmegen.
b Medical Research Council Clinical Trials Unit at University College London (UCL), London, United Kingdom.
c Joint Clinical Research Centre, Kampala, Uganda.
d University of Zimbabwe Clinical Research Centre, Harare, Zimbabwe.
e Baylor College of Medicine Children's Foundation, Kampala, Uganda.
f Department of Paediatrics and Children Health, King Edward VIII Hospital, Enhancing Care Foundation, University of KwaZulu-Natal, Durban, South Africa.

* Correspondence: lisanne.bevers@radboudumc.nl.

 

Background
Adolescents living with HIV often face difficulties in maintaining viral suppression, with adherence to daily oral antiretroviral therapy (ART) being a key barrier. Long-acting (LA) injectable ART with cabotegravir (CAB) and rilpivirine (RPV), administered every 8 weeks (Q8W), has shown efficacy and acceptability in adults and adolescents. However, data in African adolescents remain limited. The LATA trial (NCT05154747) compares Q8W LA CAB/RPV (600/900 mg) with once-daily oral lamivudine/tenofovir disoproxil fumarate/dolutegravir in virologically suppressed adolescents (HIV-1 RNA < 50 c/mL) aged 12 to < 20 years in Sub-Saharan Africa. Here, we present pharmacokinetic (PK) results from a sub-study in the LA arm after 1 year of treatment.

Methods
We aimed for ≥ 20 evaluable PK curves with balanced sex representation. Participants received LA CAB/RPV loading doses (600/900 mg) at weeks 4 and 8, followed by Q8W maintenance injections. An optional 4-week oral lead-in (OLI) was available; those not receiving OLI continued their existing ART. Around the second maintenance dose (week 24), five blood samples were collected at pre-injection (t = 0), day 3, day 7, day 28, and day 56 (pre-week 32 injection). Samples were processed within 3 hours in 99% of cases. CAB and RPV concentrations were measured using validated UPLC-MS/MS. PK parameters (AUC0-56d, Cmax, Ctrough) were estimated with non-compartmental analysis and compared with data from adults (LATTE-2, ATLAS-2M) and adolescents (MOCHA cohort 2).
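Non-compartmental estimation of AUC0-56d from the five sampling times can be sketched with the linear trapezoidal rule. The sampling schedule below matches the sub-study design; the concentration values are purely hypothetical, not trial data:

```python
def auc_trapezoid(times_h, concs_mg_l):
    """Linear trapezoidal AUC over the sampling interval (h*mg/L)."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times_h, concs_mg_l),
                                             zip(times_h[1:], concs_mg_l[1:])))

# Sampling scheme: pre-injection (t = 0), day 3, day 7, day 28, day 56 (in hours)
times = [0, 72, 168, 672, 1344]
concs = [2.0, 5.0, 4.5, 3.0, 2.2]  # hypothetical CAB-like profile, mg/L

auc = auc_trapezoid(times, concs)  # AUC0-56d
cmax = max(concs)                  # observed maximum concentration
ctrough = concs[-1]                # pre-next-injection concentration
```

Real NCA software additionally handles below-quantification samples and may use log-linear interpolation on the declining phase; this sketch shows only the core calculation.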

Results
Twenty-eight adolescents from Uganda, Zimbabwe, and South Africa were included (13 female, 15 male); 16 (57%) received an OLI. At week 24, median (range) age was 17.4 (12.7–20.4) years and BMI 21.0 (14.0–28.9) kg/m². For CAB, geometric mean (GM, %CV) AUC0–56d, Cmax and Ctrough were 4887 (30) h*mg/L, 5.35 (42) mg/L and 2.19 (47) mg/L, respectively. For RPV, GM (%CV) AUC0-56d, Cmax and Ctrough were 150.1 (29) h*mg/L, 0.194 (33) mg/L and 0.078 (28) mg/L. Three participants (11%) had CAB Ctrough below the Q1 threshold of 1.12 mg/L, with one (4%) also below 0.66 mg/L (4×PA-IC90) at 0.63 mg/L. All participants had RPV Ctrough above both Q1 (0.032 mg/L) and PA-IC90 (0.048 mg/L) thresholds. CAB and RPV exposures were comparable with previously reported adult and adolescent data.

Conclusion
In this African adolescent cohort, LA CAB/RPV achieved drug exposures consistent with earlier adult and adolescent studies. The ongoing 96-week primary analysis will provide further data on the long-term efficacy and safety of LA CAB/RPV in this population.

Silent Danger: Dabigatran Build-Up in a Frail Nonagenarian - Rethinking DOAC Monitoring in Acute Care

Irme S. Franssen ab*, Sabine R. de Wild c, Melanie J. de Jong ab, Astrid M.L. Oude Lashof d, Kristien Winckers be and Fabienne J.H. Magdelijns bf

a Thrombosis Expertise Center, Maastricht University Medical Center+, Maastricht.
b Cardiovascular Research Institute Maastricht (CARIM), Maastricht University, Maastricht.
c Department of Internal Medicine, Maastricht University Medical Center+, Maastricht.
d Department of Medical Microbiology and Infectious Diseases, Maastricht University Medical Center+, and NUTRIM School for Nutrition and Translational Research in Metabolism, Maastricht.
e Department of Internal Medicine, Section of Vascular Medicine, Maastricht University Medical Center+, Maastricht.
f Department of Internal Medicine, Section of Geriatric Medicine, Maastricht University Medical Center+, Maastricht.
Shared first authorship.

* Correspondence: irme.franssen@mumc.nl.

 

Background
Direct oral anticoagulants (DOACs) are generally prescribed without routine monitoring due to predictable pharmacokinetics. However, frail elderly patients with renal impairment or acute illness may be at risk for drug accumulation and bleeding. We describe a case of clinically significant dabigatran build-up in a frail nonagenarian with acute kidney injury.

Methods (case presentation)
A 91-year-old woman on dabigatran 110 mg twice daily presented with a leg hematoma after a fall. Laboratory testing revealed anaemia (Hb 5.7 mmol/L), severe inflammation, and acute kidney injury (creatinine 246 µmol/L). Dabigatran was discontinued after gastrointestinal bleeding occurred on day 1. Despite transfusion, haemoglobin remained low. A diluted thrombin time assay ~24 h after the last intake showed a dabigatran concentration of 446 ng/mL, far exceeding expected trough levels (40-150 ng/mL). Levels remained elevated on day 3 (240-270 ng/mL). Prothrombin complex concentrate was administered; idarucizumab was not used. The patient opted for comfort care and died a few days later.
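The persistence of the drug level can be made concrete with a two-point apparent half-life estimate. This is a back-of-the-envelope sketch, not part of the case work-up: it assumes the day-3 value is the midpoint of the reported 240-270 ng/mL range and that roughly 48 h elapsed between the two measurements:

```python
import math

def two_point_half_life(c1: float, c2: float, dt_h: float) -> float:
    """Apparent elimination half-life (h) from two declining
    concentrations c1 > c2 measured dt_h hours apart."""
    return dt_h * math.log(2) / math.log(c1 / c2)

# 446 ng/mL at ~24 h post-intake, ~255 ng/mL (midpoint) on day 3
t_half = two_point_half_life(446, 255, 48)
```

Under these assumptions the apparent half-life is roughly 60 h, several-fold longer than the 12-17 h typically seen with normal renal function, consistent with the impaired renal clearance described.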

Results
This case demonstrated persistent supratherapeutic dabigatran concentrations despite discontinuation, reflecting impaired renal clearance. Laboratory confirmation of drug accumulation was delayed until clinical deterioration had progressed.

Conclusions
Elderly, frail patients on dabigatran with acute kidney injury are at risk for drug build-up and bleeding. Early drug level testing in acute care may guide reversal strategies and transfusion decisions. Systems that enable timely DOAC assays and clear reversal protocols are crucial to improve outcomes in this vulnerable population.

Personalized osimertinib dose frequency adjustment and pharmacokinetic boosting to improve cost-effectiveness in advanced epidermal growth factor receptor (EGFR) mutated Non-Small Cell Lung Cancer (NSCLC) treatment: the OSIBOOST 2A study

P.D. Kruithof a*, B.J. Sikkema b, D.A.C. Lanser bc, M.M.A. Salmans ad, A.C. Dingemans c, M.S. Paats c, G. Ruiter e, J. Smit e, A.J. van der Wekken f, T.H. Oude Munnink g, R.H.J. Mathijssen b, L.E.L. Hendriks d, R.M.J.M. van Geel a and S. Croes a

a Department of Clinical Pharmacy & Toxicology, CARIM, research institute for Cardiovascular Disease, Maastricht University Medical Centre+, Maastricht.
b Department of Medical Oncology, Erasmus MC Cancer Institute, Erasmus University Medical Center, Rotterdam.
c Department of Pulmonology, Erasmus MC Cancer Institute, Erasmus University Medical Center, Rotterdam.
d Department of Pulmonary Diseases, GROW research institute for Oncology and Reproduction, Maastricht University Medical Center+, Maastricht.
e Department of Thoracic Oncology and Department of Early Clinical Development, Netherlands Cancer Institute, Amsterdam.
f Department of Pulmonology and Tuberculosis, University of Groningen, University Medical Centre Groningen, Groningen.
g Department of Clinical Pharmacy and Pharmacology, University Medical Centre Groningen.

* Correspondence: paul.kruithof@mumc.nl.

 

Background
Osimertinib 80 mg once daily (QD) is a standard treatment for EGFR-mutated NSCLC, but is associated with a substantial financial burden. Although reducing the dosing frequency may alleviate treatment-associated costs, there is limited evidence supporting the efficacy of osimertinib at lower steady-state trough concentrations (Cmin,ss < 125 ng/mL). Additionally, due to flat pricing, dose reduction (to 40 mg QD) is equally expensive. Furthermore, this reduced dose results in Cmin,ss < 125 ng/mL for the majority of patients. Therefore, we are investigating the feasibility of personalized osimertinib dose frequency reduction, with or without pharmacokinetic boosting (co-administration with the strong CYP3A inhibitor cobicistat), as a novel strategy to improve cost-effectiveness.

Methods
OSIBOOST-2A (NCT05748093) is an open-label multi-center feasibility trial, funded by ZonMw (848017002). Patients on osimertinib 80 mg QD are eligible and are stratified by baseline osimertinib exposure: those with an initial osimertinib Cmin,ss > 259 ng/mL first undergo dose frequency reduction, those with Cmin,ss < 210 ng/mL are boosted with cobicistat 150 mg QD, and patients with Cmin,ss 210-259 ng/mL undergo dose frequency reduction and boosting with cobicistat simultaneously. After reaching the new steady state, osimertinib exposure is re-evaluated and dosing is adjusted if needed to maintain osimertinib exposure within the provisional therapeutic window (Cmin,ss 125-259 ng/mL), ensuring efficacy and avoiding toxicity. Treatment continues until disease progression, with follow-up PK sampling every 12 weeks. The primary endpoint is cumulative osimertinib usage.
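The baseline stratification rule described in the methods can be expressed as a small decision function. This is an illustrative sketch of the published thresholds only, not trial software, and the arm labels are our own shorthand:

```python
def osiboost_stratify(cmin_ss_ng_ml: float) -> str:
    """Map baseline osimertinib Cmin,ss (ng/mL) to the initial
    intervention arm, per the thresholds described in the protocol."""
    if cmin_ss_ng_ml > 259:
        return "dose frequency reduction"
    if cmin_ss_ng_ml < 210:
        return "cobicistat boosting"
    return "dose frequency reduction + cobicistat boosting"

assert osiboost_stratify(300) == "dose frequency reduction"
assert osiboost_stratify(150) == "cobicistat boosting"
assert osiboost_stratify(230) == "dose frequency reduction + cobicistat boosting"
```

After the new steady state, the same window logic (target Cmin,ss 125-259 ng/mL) drives any further adjustment.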

Results
To date, 15 out of 40 participants have been enrolled, with 9 having completed the study intervention. After completing the osimertinib adjustment intervention, the most commonly used dosing schedule – applied in 5 out of 9 patients – is osimertinib 80 mg every other day, combined with cobicistat 150 mg QD. Preliminary results indicate an average reduction in osimertinib usage of 39%. So far, no serious adverse events have been observed. Patients using CYP3A substrates with a narrow therapeutic window (~30%) were excluded from participation.

Conclusions
OSIBOOST-2A evaluates whether improving osimertinib cost-effectiveness is feasible. Preliminary findings suggest that substantial cost reductions are achievable with the applied strategy. Updated results will be presented.

Exploring barriers and facilitators regarding medication adherence to anti-seizure medication in epilepsy patients with a low socio-economic status: a qualitative study

S.H.K. Günes a, M. van Heuckelum b*, S.M. Droger c, A.J.J. Rampen d, T.A. Smits b and B. Kuijper e

a Department of Neurology, Maasstad Ziekenhuis, Rotterdam; Faculty of Science, Utrecht University, Utrecht.
b Department of Clinical Pharmacy, Maasstad Ziekenhuis, Rotterdam.
c Department of Neurology, Maasstad Ziekenhuis, Rotterdam.
d Academisch Centrum voor Epileptologie, Kempenhaeghe, Breda.
e Department of Neurology, Maasstad Ziekenhuis, Rotterdam; Academisch Centrum voor Epileptologie, Breda.

* Correspondence: m.vanheuckelum@asz.nl.

 

Background
With approximately five million people diagnosed with epilepsy each year, epilepsy contributes significantly to the disease burden worldwide. Up to 70% of patients with epilepsy could become seizure-free with the appropriate use of anti-seizure medication (ASM). However, several reasons might exist for not taking ASM as prescribed. In adherence research, individuals with a low socio-economic status (SES) are often underrepresented, resulting in a limited understanding of their perspectives and needs regarding barriers and facilitators for adequate medication intake. This study aims to provide insight into the barriers and facilitators for adherence to ASM among patients diagnosed with epilepsy who have a low SES.

Methods
This qualitative, monocenter study, based on semi-structured interviews, was performed between March 2024 and May 2024. A purposive sampling method was used to recruit adult patients (aged ≥ 18 years) diagnosed with epilepsy and a low SES. Patients were eligible to participate if they had been treated for epilepsy in the Maasstad Hospital in the two years preceding recruitment. An interview guide, based on a literature review and expert opinions from a neurologist, nurse practitioner, and clinical pharmacist, was used to identify barriers and facilitators to ASM adherence. A low SES was determined based on level of education, income, and position in the labour market. Data were analysed using thematic content analysis.

Results
Of the 26 patients who met the inclusion criteria, 12 patients were included (response rate: 46.1%). The main themes identified in this study were: 1) disease acceptance, 2) knowledge and understanding of epilepsy, 3) trigger factors for seizures, 4) interaction between patients and healthcare providers, 5) involvement of social environment, 6) influence on daily functioning and health, and 7) medication use. Facilitators for ASM adherence were daily routines, tools to support medication-intake, trust in the prescribed medication, perceived necessity and support from significant others. Barriers for ASM adherence were forgetfulness, insufficient disease acceptance, medication adjustments and concerns about medication.

Conclusion
Healthcare professionals should establish a strong trust-based relationship with their patients to support medication adherence. Efforts should be made to minimize unnecessary modifications in the prescribed ASM and attention should be paid to no-shows for medical appointments. No-shows for medical appointments may indicate limited health literacy or underlying financial or occupational challenges. Punitive responses to these no-shows should be avoided, as they may contribute to a vicious cycle of reduced healthcare access and a further decline in adherence rates to ASM.

Advancing sustainable medication use in Dutch hospital care: a Delphi study on prescribing interventions

E.M. Smale a*, J.L. van der Giessen a, C. Appels b, E. Leegwater c, M.F. Dietz d, P.M.L.A. van den Bemt e, S.M. Coenradie f, R.B. Kool g, H.F. Kwint h, E. Ista ij and N.G.M. Hunfeld ak

a Department of Adult Intensive Care, Erasmus MC University Medical Centre, Rotterdam.
b Rheumatic care South West Netherlands (RZWN), location Roosendaal, Roosendaal.
c Department of Pharmacy, Radboud University Medical Center, Nijmegen.
d Department of Internal Medicine, Alrijne Ziekenhuis, Leiderdorp.
e Department of Clinical Pharmacy and Pharmacology, University Medical Centre Groningen, University of Groningen, Groningen.
f Department of Pharmacy, Reinier de Graaf Gasthuis, Delft.
g IQ Health science department, Radboud University Medical Center, Nijmegen.
h SIR Institute for Pharmacy Practice and Policy, Leiden.
i Department of Internal Medicine, Division of Nursing Science, Erasmus MC University Medical Centre, Rotterdam.
j Department of Neonatal and Pediatric Intensive Care, Division of Pediatric Intensive Care, Erasmus MC-Sophia Children’s Hospital, Erasmus MC University Medical Centre, Rotterdam.
k Department of Hospital Pharmacy, Erasmus MC University Medical Centre, Rotterdam.

* Correspondence: e.smale@erasmusmc.nl.

 

Background
The ‘Greening Healthcare Together’ programme seeks to reduce the environmental impact of medication in Dutch hospital care by implementing evidence-based prescribing interventions, focusing on more sustainable dosage forms and deprescribing. This study aimed to identify and prioritize the most appropriate (de)prescribing interventions for commonly used medication (classes) in in- and outpatient hospital care to advance sustainable medication use.

Methods
The twenty most frequently used medication classes were identified from national outpatient claims data and inpatient purchasing data from four hospitals. Potential interventions were retrieved from the literature and expert opinion, limited to active pharmaceutical ingredients commonly prescribed in hospital care and modifiable according to clinical guidelines. Interventions were categorized by care setting (inpatient/outpatient) and intervention type (deprescribing/sustainable dosage form). A modified RAND Delphi study was conducted (April-July 2025) with physicians and pharmacists working in Dutch hospitals, recruited through newsletters and social media. In two online Delphi rounds, the appropriateness of interventions was rated on a 7-point Likert scale. Median scores and the disagreement index guided consecutive evaluation until consensus (median score 6-7). In the third round, participants prioritized interventions, with final selection through Rank Sum Weighting (RSW).
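A common form of rank-sum scoring awards each intervention points inversely related to its rank position and sums these across respondents; higher totals indicate higher priority. The abstract does not specify the exact RSW scheme used, so the sketch below is an assumption for illustration only:

```python
def rank_sum_weights(rankings, n_items):
    """rankings: list of per-respondent ordered lists (best first).
    Each item scores (n_items - position) points per respondent;
    unranked items score nothing. Higher total = higher priority."""
    scores = {}
    for ranking in rankings:
        for position, item in enumerate(ranking):
            scores[item] = scores.get(item, 0) + (n_items - position)
    return scores

# Three hypothetical respondents ranking three interventions
votes = [["A", "B", "C"], ["B", "A", "C"], ["A", "C", "B"]]
scores = rank_sum_weights(votes, 3)
```

With these hypothetical votes, intervention A accumulates the largest weight (3 + 2 + 3 = 8 points) and would be selected first.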

Results
Fifty-one interventions were identified for eighteen medication classes. The Delphi panel consisted of 63 participants (36 physicians, 27 pharmacists), of whom three withdrew during the study. Response rates were 53 (85%) in the first round, 49 (80%) in the second, and 49 (82%) in the third. Overall, 22/29 deprescribing interventions (76%), and 20/22 sustainable prescribing interventions (91%) were deemed appropriate.
For outpatients, the highest-ranked deprescribing interventions were stopping chronically used proton pump inhibitors (RSW: 351), communicating opioid stop dates to first-line healthcare providers (RSW: 261), and monitoring stop dates of dual/triple anticoagulant therapy (RSW: 231). For inpatients, they were prescribing opioids with fixed stop date at discharge (RSW: 293), stopping chronically used proton pump inhibitors (RSW: 287), and reducing non-indicated proton pump inhibitor prescriptions (RSW: 283).
The top sustainable dosage form interventions were switching or initiating dry powder inhalers (RSW: 227 and 220 respectively) for outpatients, and for inpatients oral administration of paracetamol (RSW: 429) and antibiotics (RSW: 342), and enteral administration of metoclopramide (RSW: 257).

Conclusions
Most prescribing interventions were considered appropriate for advancing sustainable medication use, indicating support for their implementation in Dutch hospital care. This study informed the ‘Greening Healthcare Together’ programme, which will promote the implementation and evaluation of these interventions in Dutch hospitals (Figure 1).

Prevention of drug residues reaching surface water by placing a filter at the source: a hospital setting

Froucke van Gosliga a*, Caspar Korteweg b and Minke Jongsma a

a Centre of Pharmacy, Frisius Medical Centre, Leeuwarden.
b Laboratory and Toxicology Unit, Martini Hospital, Groningen.

* Correspondence: froucke.van.gosliga@frisiusmc.nl.

 

Background
In 2020 it was estimated that in the Netherlands at least 190 tons of drug residues end up in surface water every year, despite purification by sewage treatment plants [1]. Due to the aging population and increasing drug use, the amount of drug residues in water is expected to rise further [2]. These drug residues can adversely affect aquatic ecosystems, for example by contributing to antibiotic resistance or by causing hormonal and behavioural changes in fish [1].
The aim of this study was to assess the potential of a medication filter at source to prevent drug residues from entering the sewage system by measuring the percentage of drug removal. A secondary objective was to determine the filter’s operational lifetime.

Methods
In this explorative study, a medication filter (MediCatch Zereau) was installed at the Intensive Care Unit of the Frisius MC Heerenveen location. This unit was selected because of its high use of intravenous medication and because urine is often already collected via catheter. Urine from ICU patients was filtered by disposing of the catheter contents through the filter. A total of 14 samples were taken on seven different days over a 52-day period from the reservoir before the filter (influent) and after the filter (effluent). Drugs were identified using LC-MS analysis (ToxTyper®). Removal efficiency was determined by comparing drug signals (counts) in the influent and effluent.
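The removal-efficiency comparison described above amounts to a simple percentage calculation on the influent and effluent signals. A minimal sketch, with hypothetical counts rather than measured values from the study:

```python
# Sketch of the removal-efficiency calculation: the drug signal (counts)
# in the effluent is compared with that in the influent.
# The counts below are hypothetical, not measured values from the study.

def removal_percentage(influent_counts, effluent_counts):
    """Percentage of a drug's signal removed by the filter."""
    if influent_counts <= 0:
        raise ValueError("no influent signal to compare against")
    return (influent_counts - effluent_counts) / influent_counts * 100

print(removal_percentage(200_000, 10_000))  # 95.0  -> above the 90% threshold
print(removal_percentage(150_000, 0))       # 100.0 -> completely removed
```

Note that a higher effluent than influent signal yields a negative value, matching the observation in the Results that some drugs showed a higher signal after the filter on individual sample days.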

Results
Of the 62 drugs and metabolites detected, 51 (82%) had a removal percentage above 90% across all sample days; 47 of these (92%) were removed completely (100%). Among drugs detected on at least five different sampling days, 61% showed a removal rate of 100%, as can be seen in Figure 1. For three drugs, a higher signal was found in the effluent than in the influent on at least one sample day. At 38 days after filter placement the overall removal percentage dropped below 90% (to 89%), and after 52 days it was 79%.

Conclusions
A medication filter at source can be used to reduce the entry of drug residues into the sewage system and ultimately surface waters, thereby potentially easing the burden on sewage treatment plants. To maintain sufficient purification efficiency, a filter lifetime of one month is recommended. Further research is needed to optimize the sampling method and to determine the contexts in which filtration at source provides the greatest added value.

References
1. Moermond CTA, Montforts MHMM, Roex EWM, Venhuis BJ. Medicijnresten en waterkwaliteit: een update. RIVM-briefrapport 2020-0088. Bilthoven: RIVM; 2020.
2. Stichting Farmaceutische Kengetallen. 25% meer gebruik van geneesmiddelen in 2040. Pharmaceutisch Weekblad. 2022 Dec 22;PW51/52.

Acknowledgement

The abstracts included here were the oral presentations at the Nederlandse Ziekenhuisfarmaciedagen, held on 13 and 14 November 2025 in Arnhem.

Reference

Cite as: Nederlandse Ziekenhuisfarmaciedagen 2025. Nederlands Platform voor Farmaceutisch Onderzoek. 2026;11:a1811.

DOI

https://www.knmp.nl/resolveuid/d8977f4ac05a49f0b193d53e7036fcc5

Open access
