2.
Chemotherapy‐induced peripheral neuropathy (CIPN) is a dose‐limiting adverse event associated with treatment with paclitaxel and other chemotherapeutic agents. The prevention and treatment of CIPN are limited by a lack of understanding of the molecular mechanisms underlying this toxicity. In the current study, a human induced pluripotent stem cell–derived sensory neuron (iPSC‐SN) model was developed for the study of chemotherapy‐induced neurotoxicity. The iPSC‐SNs express proteins characteristic of nociceptor, mechanoreceptor, and proprioceptor sensory neurons and show Ca2+ influx in response to capsaicin, α,β‐meATP, and glutamate. The iPSC‐SNs are relatively resistant to the cytotoxic effects of paclitaxel, with half‐maximal inhibitory concentration (IC50) values of 38.1 µM (95% confidence interval (CI) 22.9–70.9 µM) for 48‐hour exposure and 9.3 µM (95% CI 5.7–16.5 µM) for 72‐hour treatment. Paclitaxel causes dose‐dependent and time‐dependent changes in neurite network complexity detected by βIII‐tubulin staining and high content imaging. The IC50 for paclitaxel reduction of neurite area was 1.4 µM (95% CI 0.3–16.9 µM) for 48‐hour exposure and 0.6 µM (95% CI 0.09–9.9 µM) for 72‐hour exposure. Decreased mitochondrial membrane potential, slower movement of mitochondria down the neurites, and changes in glutamate‐induced neuronal excitability were also observed with paclitaxel exposure. The iPSC‐SNs were also sensitive to docetaxel, vincristine, and bortezomib. Collectively, these data support the use of iPSC‐SNs for detailed mechanistic investigations of genes and pathways implicated in chemotherapy‐induced neurotoxicity and the identification of novel therapeutic approaches for its prevention and treatment.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Sensory peripheral neuropathy is a common and dose‐limiting adverse event during chemotherapy. The lack of a molecular understanding of this toxicity limits options for its prevention and treatment.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ The current study tested whether sensory neurons differentiated from human induced pluripotent stem cells (iPSC‐SNs) can be used to investigate chemotherapy‐induced neurotoxicity, using paclitaxel as a model neurotoxic chemotherapeutic.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ The iPSC‐SNs are a robust and reproducible model of paclitaxel‐induced neurotoxicity. Treatment of iPSC‐SNs with paclitaxel affects neurite networks, neuron excitability, and mitochondrial function.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ This novel stem cell model of chemotherapy‐induced neurotoxicity will be valuable for identifying genes and pathways critical for this toxicity and could be a useful platform for testing therapeutic approaches for treatment.

Chemotherapy‐induced peripheral neuropathy (CIPN) is a dose‐limiting toxicity associated with a number of drugs used for the treatment of solid tumors and hematological cancers. 1 , 2 , 3 Drugs with diverse mechanisms of action, including microtubule disruptors, proteasome inhibitors, and DNA‐crosslinking agents, all cause significant peripheral neuropathy. CIPN typically presents as burning, tingling, or numbness in the hands and feet that occurs in a glove and stocking distribution. 2 , 4 In addition to negatively affecting a patient’s quality of life, CIPN can necessitate dose reductions, treatment delays, and discontinuation, which can impact the therapeutic effectiveness of these drugs. 2 Despite years of research, there are no effective therapies to prevent and/or treat CIPN, highlighting the need to define the molecular basis of this toxicity to support the development of novel strategies for treatment.

Most mechanistic studies of CIPN have used behavioral testing in rodent models or cell‐based studies using primary rodent dorsal root ganglion (DRG) neurons. Common mechanisms associated with the development of CIPN include axon degeneration, altered Ca2+ homeostasis, mitochondrial dysfunction, changes in neuronal excitability, and neuroinflammation, although the relative contribution of these mechanisms varies for individual drugs. 5 , 6 , 7 For example, the microtubule stabilizing effects of paclitaxel inhibit anterograde and retrograde transport of synaptic vesicles down the microtubules, resulting in axon degeneration and membrane remodeling. This phenomenon is thought to be a major contributor to paclitaxel‐induced peripheral neuropathy. 8 In contrast, the ability of DNA‐crosslinking agents, such as cisplatin and oxaliplatin, to form adducts with mitochondrial DNA and increase reactive oxygen species contributes significantly to their peripheral neuropathy. 
5 Although these studies in preclinical models and primary cultures of rodent DRG neurons have enhanced our knowledge of potential mechanisms for CIPN, attempts to translate these findings into humans have been largely unsuccessful. 3 In recent years, human induced pluripotent stem cell (iPSC)‐derived neurons have been used for the study of CIPN. Commercially available iPSC‐derived neurons (e.g., iCell neurons and Peri.4U neurons) have been evaluated as a model of neurotoxicity, 9 used to screen for neurotoxic compounds, 10 , 11 , 12 , 13 and utilized for functional validation of genes identified in human genomewide association studies of CIPN. 9 , 14 , 15 , 16 The use of human iPSC‐derived neurons affords an advantage over rodent DRG neurons in their human origin and the potential to differentiate into specific peripheral sensory neuron populations. The iCell neurons are a mixture of postmitotic GABAergic and glutamatergic cortical neurons that are more characteristic of relatively immature forebrain neurons than the sensory neurons found in the DRG. 17 , 18 Peri.4U neurons are more peripheral‐like, expressing βIII‐tubulin, peripherin, MAP2, and vGLUT2, but have been minimally characterized with respect to functional properties. 10 , 19 Additionally, neurons derived from human fibroblasts, blood, and embryonic stem cells that express more canonical nociceptive markers, like ISL1, BRN3A, P2RX3, the NTRK receptors, and NF200, 20 , 21 , 22 , 23 have also been used to study chemotherapy toxicity. Although these human derived cells resemble the DRG sensory neurons that are targeted by chemotherapeutics, there is significant interindividual variation across donor samples that limits their routine use for mechanistic studies and confounds the evaluation of functional consequences of genetic variation associated with human CIPN. 
24 Despite advances made in recent years in the development of human cell‐based models for the study of CIPN, there remains a need for a robust, widely available, and reproducible model for detailed mechanistic studies of this dose‐limiting toxicity. The goal of the studies described below was to develop an iPSC‐derived sensory neuron (iPSC‐SN) model for the study of chemotherapy‐induced neurotoxicity. Paclitaxel was used as a model neurotoxic chemotherapeutic to evaluate morphological, mitochondrial, and functional changes associated with exposure of iPSC‐SNs to neurotoxic compounds.
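The IC50 values reported above come from fitting concentration–response data to a sigmoidal model. As a rough illustration of that calculation (not the study's actual curve‐fitting pipeline, and with simulated rather than measured viability data), the sketch below generates responses from a four‐parameter Hill model and recovers the half‐maximal concentration by log‐linear interpolation:

```python
import math

def hill(conc, ic50, slope=1.0, top=100.0, bottom=0.0):
    """Four-parameter logistic (Hill) model: % viability vs. drug concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

def interpolate_ic50(concs, responses, threshold=50.0):
    """Log-linear interpolation of the concentration giving `threshold` % response.
    Assumes responses decrease monotonically with increasing concentration."""
    pairs = list(zip(concs, responses))
    for (c_lo, r_lo), (c_hi, r_hi) in zip(pairs, pairs[1:]):
        if r_lo >= threshold >= r_hi:
            frac = (r_lo - threshold) / (r_lo - r_hi)
            return 10 ** (math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo)))
    raise ValueError("threshold not bracketed by the data")

# Hypothetical viability readings simulated around the reported 48-hour IC50 of ~38 µM
concs = [1, 3, 10, 30, 100, 300]                # µM paclitaxel
resp = [hill(c, ic50=38.1) for c in concs]      # % viable cells (simulated)
print(round(interpolate_ic50(concs, resp), 1))  # close to, but not exactly, 38.1 µM
```

Because the interpolation is linear in log concentration while the Hill curve is not, the recovered value only approximates the model's true IC50; a real analysis would use nonlinear regression and report a 95% CI, as the study does.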

3.
An investigational wearable injector (WI), the BD Libertas Wearable Injector (BD Libertas is a trademark of Becton, Dickinson and Company), was evaluated in an early feasibility clinical study for functional performance, tissue effects, subject tolerability, and acceptability of 5 mL, non‐Newtonian ~ 8 cP subcutaneous placebo injections in 52 healthy adult subjects of 2 age groups (18–64 years and ≥ 65 years). Randomized WI subcutaneous injections (n = 208, 4/subject) were delivered to the right and left abdomen and thigh of each subject, 50% (1 thigh and 1 abdomen) with a defined movement sequence during injection. Injector functional performance was documented. Deposition was qualified and quantified with ultrasound. Tissue effects and tolerability (pain) were monitored through 24 hours with corresponding acceptability questionnaires administered through 72 hours. WI (n = 205) automatically inserted the needle, delivered 5 mL ± 5% in 5.42 minutes (SD 0.74), and retracted. Depots were entirely (93.2%) or predominantly (5.4%) localized within the target subcutaneous tissue. Slight to moderate wheals (63.9%) and erythema (75.1%) were observed, with ≥ 50% resolution within 30–60 minutes. Subject pain (100 mm Visual Analog Scale) peaked mid‐injection (mean 9.1 mm, SD 13.4) and rapidly resolved within 30 minutes (mean 0.4 mm, SD 2.6). Subjects’ peak pain (≥ 90.2%), injection site appearance (≥ 92.2%), and injector wear, size, and removal (≥ 92.1%) were acceptable (Likert responses), with 100% likely to use the injector if prescribed. Injection site preference was divided among none (46%), abdomen (25%), and thigh (26.9%). The investigational WI successfully delivered 5 mL viscous subcutaneous injections. Tissue effects and pain were transient, well tolerated, and acceptable. Neither injection site, movement, nor subject age affected injector functional performance or subject pain and acceptability.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Transitioning chronic disease therapies from intravenous infusion to large volume subcutaneous injection requires reliable and accurate delivery devices that may enable intuitive self‐ or caregiver administration. Limited options are commercially available.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ An investigational wearable injector’s functionality and tolerability for 5 mL, ~ 8 cP subcutaneous placebo injections to the thigh and abdomen with and without movement in healthy adults of 2 age groups (18–64 years and ≥ 65 years) is described. Depot location, corresponding local tissue effects, and acceptability are documented.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ The investigational injector performed as designed, consistently delivering 5 mL ± 5% to the target subcutaneous tissue in ~ 5.5 minutes with transient, well‐tolerated tissue effects and pain. Neither injection site, movement, nor subject age affected injector functional performance or subject pain and acceptability.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ The investigational injector demonstrated equivalent functional performance with broad acceptability across subject genders, body mass index categories, and age groups, with and without movement. The results indicate the promise of this device design and its delivery boundaries.

Chronic disease biological therapies are transitioning from traditional intravenous to subcutaneous administration. Adapting intravenous therapies to subcutaneous administration creates delivery challenges, such as larger than traditional volumes and viscosities. 1 , 2 , 3 , 4 , 5 , 6 Intuitive and reliable subcutaneous injection system design will help navigate the complexity of these new delivery challenges while ensuring patient ease of wear and use. Effective subcutaneous injection system design requires a strong understanding of the biomechanical and physiological impact on subcutaneous tissue of delivery at increased volumes and viscosities with corresponding subject tolerability and acceptability. 1 , 7 , 8 , 9 , 10 , 11 Subcutaneous administration conveys many benefits, such as reduced cost and treatment time and increased patient autonomy, convenience, and tolerance/acceptance. 3 , 21 Multiple comparative studies report that both patients and health care providers (HCPs) prefer subcutaneous to intravenous administration, citing improved clinical management, efficiency, and convenience with decreased pain and adverse systemic effects. 12 , 18 , 19 , 20 , 21 , 22 , 23 , 24 , 25 Historically, the literature has identified multiple thresholds for the subcutaneous bolus limit between 1.5 and 3 mL due to subject pain and tissue feasibility. 1 , 2 , 3 , 7 , 15 , 26 The observation that injection volumes > 2 mL may create site wheals (surface tissue displacement) or induration (hardening of the soft tissue) likely contributed to the anticipated low tolerability of these injections, despite the absence of relevant clinical evidence linking wheal formation or induration to pain. 2 Multiple studies using pump‐driven injection systems as surrogates for functional subcutaneous injection devices document the feasibility and tolerability of 3 to 20 mL single subcutaneous bolus injections in human clinical subjects. 
1 , 10 , 12 , 27 , 28 Subcutaneous administration is both feasible and convenient with the introduction of combination products, such as wearable or on‐body injectors, autoinjectors, and prefilled syringes that use fixed dosing to reduce dosing errors and enable patient choice in injection provider, device type, and setting. 2 , 12 Wearable injectors (WIs) complement and may exceed the volume and viscosity capacities currently available in prefilled syringes or autoinjectors; however, there are currently limited commercial on‐body or WI options available. 3 , 10 , 29 The current study is a first‐in‐human clinical assessment of an investigational WI for functional performance and corresponding tissue effects, depot location, subject tolerability, and acceptability for 5 mL, ~ 8 cP injections of a viscous non‐Newtonian placebo, hyaluronic acid (HA) diluted in saline. The study included 52 healthy adult subjects of both genders and 2 age groups (18–64 years and ≥ 65 years). Each subject received four injections (two abdominal and two thigh), with and without movement for each location. WI functional performance (injection duration, delivered volume, adherence, and status indicator) was documented from application through removal. Depot location was qualified and quantified via ultrasound. Site tissue effects (wheal and erythema) and subject pain tolerance (100 mm Visual Analog Scale, VAS) were monitored through 24 hours with corresponding acceptability documented via questionnaires through 72 hours postinjection.

5.
Graft function is crucial for successful kidney transplantation. Many factors may affect graft function or cause delayed graft function (DGF), which decreases the prognosis for graft survival. This study was designed to evaluate whether the perioperative use of dexmedetomidine (Dex) could improve graft kidney function and reduce complications after kidney transplantation. A total of 780 patients underwent kidney transplantation; 315 received intravenous Dex infusion during surgery, and 465 did not. Data were adjusted with propensity scores, and multivariate logistic regression was used. The primary outcomes were major adverse complications, including DGF and acute rejection in the early post‐transplantation phase. The secondary outcomes included length of hospital stay (LOS), infection, overall complications, graft functional status, post‐transplantation serum creatinine values, and estimated glomerular filtration rate (eGFR). Dex use significantly decreased DGF (19.37% vs. 23.66%; adjusted odds ratio, 0.744; 95% confidence interval, 0.564–0.981; P = 0.036), risk of infection, risk of acute rejection in the early post‐transplantation phase, risk of overall complications, and LOS. However, there were no statistically significant differences in 90‐day graft functional status or 7‐day, 30‐day, and 90‐day eGFR. Perioperative Dex use reduced the incidence of DGF, risk of infection, risk of acute rejection, overall complications, and LOS in patients who underwent kidney transplantation.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Graft function is crucial for successful kidney transplantation. Dexmedetomidine (Dex) has been shown to have renoprotective effects in preclinical studies and in other types of surgery.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ The objective of this study was to evaluate whether perioperative Dex administration was associated with improved graft kidney function or decreased complications after kidney transplantation.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ This study demonstrated that perioperative Dex administration was associated with improved kidney function and outcomes in patients who underwent kidney transplantation.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ The results from this study suggest perioperative Dex administration could be beneficial to donor kidney grafts.

The cost to care for patients with chronic kidney disease and end‐stage renal disease (ESRD) is significant, with total spending of over US $120 billion for Medicare beneficiaries alone, representing 33.8% of total Medicare fee‐for‐service spending, according to the United States Renal Data System 2019 annual data report. 1 There were nearly 500,000 patients receiving maintenance dialysis treatments and well over 200,000 living with a kidney transplant in the United States by the end of 2015. 2 Thus, ESRD is a major public health problem due to its high morbidity and mortality as well as its social and financial implications. 3 Treatment outcomes vary across modalities such as hemodialysis, peritoneal dialysis, and renal transplantation. Renal transplantation has an obvious survival advantage over dialysis for patients with ESRD, along with better quality of life. 4 , 5 , 6 However, the 5‐year graft survival rate was 74.4% in deceased‐donor transplants and 85.6% in living‐donor transplants. 7 The etiology of graft kidney dysfunction is multifactorial and involves immunologic factors, surgical techniques, hemodynamic alterations, inflammatory mechanisms, apoptosis, and ischemia/reperfusion (I/R) injury. 8 Although advances in immunosuppressive therapy and in the treatment of hypertension and hyperlipidemia have improved outcomes following kidney transplantation, poor initial graft function occurs in up to 5% of living donor recipients and up to 20% of deceased donor recipients. Infection occurs in up to 30% of renal transplant recipients during the first 3 months post‐transplantation. 9 , 10 The transplant population has expanded to older and sicker patients, and only about 7.3% of candidates on the US kidney transplant waiting list received deceased donor kidney transplantations. 11 Approximately 15% of procured kidneys were discarded despite long waiting lists. 
11 At the same time, graft rejection episodes occur in about 20% of low‐risk transplant recipients within the first 26 weeks post‐transplantation. 9 The probability of first‐year all‐cause graft failure (return to dialysis, repeat transplantation, or death with a functioning transplant) for deceased donor kidney transplant recipients was about 7.7%. 3 , 12 , 13 It is important to identify factors responsible for decreased graft function and find appropriate interventions.

It is well known that renal function is closely associated with hemodynamic performance, sympathetic activity, inflammatory responses, and I/R injury. The hemodynamic stabilizing and sympatholytic effects produced by alpha2 agonists have been shown to prevent the deterioration of renal function after cardiac surgery. 12 , 14 , 15 The mechanisms could be inhibition of renin release, increased glomerular filtration, and increased excretion of sodium and water via the kidneys. 16 Dexmedetomidine (Dex) is a short‐acting selective alpha2 agonist that, in comparison to clonidine, has an alpha2 to alpha1 selectivity ratio of 1,600:1. 17 Dex has a stabilizing effect on hemodynamics mediated by reducing sympathetic tone, decreasing inflammatory response, alleviating I/R injury, inhibiting renin release, increasing glomerular filtration rate, increasing excretion of sodium and water by the kidneys, and decreasing insulin secretion. 18 , 19 Although Dex has been shown to alleviate acute kidney injury (AKI) in other surgeries, 14 , 15 no study has demonstrated the benefit of Dex on graft function in renal transplantation. Thus, this study was designed to determine whether the perioperative use of Dex is associated with improved graft kidney function and decreased incidence of complications after renal transplantation.
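As a numeric aside, the headline result above (adjusted OR 0.744 for DGF) can be sanity‐checked against an unadjusted odds ratio computed directly from the reported event rates. The counts below are back‐calculated from the percentages (19.37% of 315 and 23.66% of 465) and rounded, so this is a hypothetical reconstruction; the unadjusted estimate will not match the propensity‐score‐adjusted one:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Wald 95% CI from a 2x2 table:
    a = events in exposed, b = non-events in exposed,
    c = events in unexposed, d = non-events in unexposed."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (or_ * math.exp(s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Event counts back-calculated from the reported DGF rates (rounded, hypothetical):
# Dex group: 19.37% of 315 ~ 61 events; control: 23.66% of 465 ~ 110 events
or_, lo, hi = odds_ratio_ci(61, 315 - 61, 110, 465 - 110)
print(f"unadjusted OR {or_:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

The study's adjusted analysis additionally conditions on the propensity score within a multivariate logistic regression, which shifts both the point estimate and the confidence interval relative to this crude calculation.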

6.
Therapeutic drug monitoring (TDM) is mandatory for the immunosuppressive drug tacrolimus (Tac). For clinical applicability, TDM is performed using morning trough concentrations. With recent developments making tacrolimus concentration determination possible in capillary microsamples, together with Bayesian estimator–predicted area under the concentration curve (AUC), AUC‐guided TDM may now be clinically applicable. Circadian variation in Tac exposure has, however, been reported, with lower systemic exposure following the evening dose. The aim of the present study was to investigate tacrolimus pharmacokinetics (PK) after morning and evening administration of twice‐daily tacrolimus in a real‐life setting without restrictions regarding food and concomitant drug timing. Two 12‐hour tacrolimus investigations were performed, one after the morning dose and one after the following evening dose, in 31 renal transplant recipients early after transplantation, both in the fasting state and under real‐life nonfasting conditions (14 patients repeated the investigation). We observed circadian variation under fasting conditions: 45% higher peak concentration and 20% higher AUC following the morning dose. In the real‐life nonfasting setting, the PK‐profiles were flat but comparable after the morning and evening doses, showing a slower absorption rate and lower AUC compared with the fasting state. Limited sampling strategies using concentrations at 0, 1, and 3 hours predicted AUC after fasting morning administration, and samples obtained at 1, 3, and 6 hours predicted AUC for the other conditions (evening and real‐life nonfasting). In conclusion, circadian variation of tacrolimus is present in patients in the fasting state, whereas flatter PK‐profiles and no circadian variation were present in a real‐life, nonfasting setting.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Circadian variation of tacrolimus (Tac) is controversial. Most Tac population pharmacokinetic (PK) models are based on fasting‐day data.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ It investigated circadian variation in Tac PK and the effect on Tac PK‐profiles when administered in a real‐life setting with regard to food and concomitant drug timing.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ In a real‐life nonfasting setting, the PK‐profiles were flat without circadian variation. The study supports circadian variation of Tac under fasting conditions. Data on the real‐world behavior of the patients are needed for a population PK model to predict area under the concentration curve (AUC) during both conditions.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ Proposed Tac AUC‐target levels need to be redefined due to circadian variation and flat real‐life nonfasting PK‐profiles. The association between high peak concentrations and side effects of Tac may be overestimated given the flat real‐life nonfasting PK‐profiles. The effect of real‐life dosing of Tac may very well be present for other drugs and should be investigated for drugs where TDM is indicated.

Following organ transplantation, there is a need for life‐long immunosuppressive therapy. For the last 10–15 years, the calcineurin inhibitor tacrolimus (Tac) has been the cornerstone in most transplant centers. 1 The narrow therapeutic index and large pharmacokinetic (PK) interindividual and intra‐individual variability make therapeutic drug monitoring (TDM) of Tac mandatory, 2 and TDM is normally performed using morning trough concentrations.

When Tac was introduced in transplant protocols, the importance of avoiding acute rejection led to TDM targeting high Tac trough concentrations. High concentrations induce nephrotoxicity and the development of other side effects, like hypertension, post‐transplant diabetes mellitus, neurotoxicity, and cancer. 3 , 4 In combination with mycophenolate mofetil (MMF) and modern induction therapy, the recommended Tac trough concentration target has gradually been reduced. 5 , 6 There is still room for improving long‐term outcomes following renal transplantation, 7 , 8 and improved tailoring of Tac dosing may be an important contributor. 9 The area under the concentration vs. time curve (AUC), reflecting total systemic Tac exposure, should theoretically be a more relevant measure for both efficacy and side effects compared with trough concentrations. 10 A recent consensus report also recommended AUC thresholds and advocates the need for prospective AUC‐dosed studies. 10 Limited sampling strategies (LSS), preferably using capillary microsampling, in combination with population PK model‐derived Bayesian estimators, have made AUC‐targeted dosing of Tac applicable in clinical practice. 11 , 12 However, most Tac population PK models were developed using data from clinical trials. 
13 , 14 Such data are generally obtained in selected patients under highly controlled conditions (i.e., fasting, without concomitant drugs at the time of Tac dose administration); hence, these results may not reflect the real‐life situation of individual transplant recipients. In addition, the majority of AUC data are obtained during the day (i.e., following the morning dose of Tac). Because Tac has shown circadian variation, with higher drug exposure after the morning dose, 15 , 16 , 17 , 18 using models that assume a similar PK‐profile following the morning dose and evening dose will introduce bias into predicted 0–24‐hour AUC (AUC0–24) Tac exposure. In addition, Tac PK is also affected by food consumption. 19 If there is a correlation between systemic Tac exposure and long‐term outcomes, models reflecting the real‐life scenario over the entire dosing interval may prove advantageous.

The primary aim of this study was to investigate Tac PK after the morning and evening administration of twice‐daily Tac in a real‐life setting with regard to food and concomitant drug timing. Second, we aimed to determine the predictive performance of Tac AUC predictions using LSS and Bayesian estimators from a nonparametric population PK model.
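To make the AUC and limited‐sampling ideas concrete, here is a minimal sketch: a linear trapezoidal AUC over a full 12‐hour profile, next to a hypothetical three‐point LSS predictor. Both the concentration profile and the regression coefficients are invented for illustration; they are not the study's measured data or its fitted Bayesian/LSS parameters:

```python
def auc_trapezoid(times, concs):
    """AUC by the linear trapezoidal rule (µg·h/L, given times in h and concs in µg/L)."""
    pairs = list(zip(times, concs))
    return sum((t1 - t0) * (c0 + c1) / 2
               for (t0, c0), (t1, c1) in zip(pairs, pairs[1:]))

# Hypothetical 12-hour tacrolimus profile (µg/L) after a morning dose
times = [0, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 12]
concs = [5.0, 9.0, 14.0, 16.0, 15.0, 12.0, 10.0, 8.0, 6.5, 5.2]
full_auc = auc_trapezoid(times, concs)

# A limited sampling strategy replaces the full profile with, e.g., C0, C1, and C3
# weighted by regression coefficients from a population PK model; the coefficients
# below are illustrative placeholders, not the study's fitted values
b0, b1, b3 = 4.0, 1.5, 5.0
lss_auc = b0 * concs[0] + b1 * concs[2] + b3 * concs[5]
print(round(full_auc, 1), round(lss_auc, 1))
```

In practice the LSS coefficients (or a Bayesian estimator) are derived so that the sparse‐sample prediction tracks the full‐profile AUC with acceptable bias and imprecision across patients.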

7.
On March 11, 2020, the World Health Organization declared its assessment of coronavirus disease 2019 (COVID‐19) as a global pandemic. However, specific anti‐severe acute respiratory syndrome‐coronavirus 2 (SARS‐CoV‐2) drugs are still under development, and patients are managed by multiple complementary treatments. We performed a retrospective analysis to compare and evaluate the effect of low molecular weight heparin (LMWH) treatment on disease progression. For this purpose, the clinical records and laboratory indicators were extracted from electronic medical records of 42 patients with COVID‐19 (21 of whom were treated with LMWH, and 21 without LMWH) hospitalized (Union Hospital of Huazhong University of Science and Technology) from February 1 to March 15, 2020. Changes in the percentage of lymphocytes before and after LMWH treatment were significantly different from those in the control group (P = 0.011). Likewise, changes in the levels of D‐dimer and fibrinogen degradation products in the LMWH group before and after treatment were significantly different from those in the control group (P = 0.035). Remarkably, IL‐6 levels were significantly reduced after LMWH treatment (P = 0.006), indicating that, besides other beneficial properties, LMWH may exert an anti‐inflammatory effect and attenuate in part the “cytokine storm” induced by the virus. Our results support the use of LMWH as a potential therapeutic drug for the treatment of COVID‐19, paving the way for a subsequent well‐controlled clinical study.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Our results strongly suggest low molecular weight heparin (LMWH) as an effective strategy in a therapeutic or combination therapy against coronavirus disease 2019 (COVID‐19).
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ LMWH exerts an anti‐inflammatory effect by reducing IL‐6 levels and increasing the lymphocyte percentage. We therefore favor the use of LMWH as a potential therapeutic drug for the treatment of COVID‐19.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ A new therapeutic approach for COVID‐19 was proposed based on the non‐anticoagulant properties of LMWH.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ In view of the COVID‐19 pandemic, our study will be of pronounced interest to a broad spectrum of clinicians and scientists of several disciplines focusing on translational and basic aspects related to COVID‐19 and virology in general.

On March 11, 2020, the World Health Organization (WHO) declared its assessment of coronavirus disease 2019 (COVID‐19) as a global pandemic. Severe acute respiratory syndrome‐coronavirus 2 (SARS‐CoV‐2) is characterized by a long incubation period, high infectivity, and multiple routes of transmission. 1 , 2 However, no effective medicines are currently available, so patients are treated symptomatically. A better understanding of the mechanisms of pathological changes will help to screen potential drugs out of the currently available medications.

Several clinical studies revealed that cytokine storms are important mechanisms underlying disease exacerbation and death of patients with COVID‐19. 3 , 4 , 5 In particular, IL‐6 levels in severely ill patients were significantly higher than in mild cases. 6 IL‐6 is one of the core cytokines, 7 contributing to many of the key features of cytokine storm, such as vascular leakage and activation of the complement and coagulation cascades, inducing disseminated intravascular coagulation. 8 , 9 Reducing the levels of IL‐6 and decreasing its activity may prevent or even reverse the cytokine storm syndrome, 10 thereby improving the condition of patients with COVID‐19.

Numerous studies have reported that low molecular weight heparin (LMWH) has various non‐anticoagulant properties that play an anti‐inflammatory role by reducing the release of IL‐6. 11 , 12 , 13 However, the anti‐inflammatory effects of LMWH in COVID‐19 are currently unknown. By analyzing the effect of LMWH in patients with COVID‐19, our retrospective cohort study demonstrates, for the first time, the significant beneficial effect of LMWH in controlling cytokine storm and delaying disease progression (Figure 1).

Figure 1. Possible mechanism of the anti‐inflammatory effects of low molecular weight heparin (LMWH) in patients with coronavirus disease 2019 (COVID‐19). Under conventional antiviral treatment regimens, LMWH improves hypercoagulability, inhibits IL‐6 release, and attenuates IL‐6 biological activity. It has potential antiviral effects and helps delay or block inflammatory cytokine storms. LMWH can increase the lymphocyte percentage in patients. The multiple effects of LMWH encourage its application for the treatment of patients with COVID‐19. HSPG, heparan sulfate proteoglycan; SARS‐CoV‐2, severe acute respiratory syndrome‐coronavirus 2.

8.
Small cell lung cancer (SCLC) is a leading cause of cancer death worldwide, with few treatment options. Rovalpituzumab tesirine (Rova‐T) is an antibody‐drug conjugate that targets delta‐like 3 on SCLC cells to deliver a cytotoxic payload directly to tumor cells. In this study, the cardiac safety profile of Rova‐T was assessed by evaluating changes in QT interval, electrocardiogram (ECG) waveform, heart rate, and proarrhythmic adverse events (AEs) after treatment with Rova‐T in patients with previously treated extensive‐stage SCLC. Patients underwent ECG monitoring for 2 weeks after each of 2 i.v. infusions of 0.3 mg/kg Rova‐T over 30 minutes, administered 6 weeks apart. Forty‐six patients received at least one dose of Rova‐T. At the geometric mean Rova‐T maximum serum concentration of 7,940 ng/mL, ECG monitoring showed no significant changes in the Fridericia‐corrected QT (QTcF) interval; the upper limit of the 2‐sided 90% confidence interval did not exceed 10 msec for any time point. There were no clinically significant changes in QRS or PR intervals, ECG waveforms, or heart rate after Rova‐T administration. All patients experienced a treatment‐emergent AE (TEAE); 78% had a grade ≥ 3 TEAE, 59% had a serious TEAE, and 41% had a cardiac‐related TEAE. The TEAEs that might signal proarrhythmia tendencies were uncommon. Confirmed partial responses were observed in 24% of patients. Based on the evaluation of ECG data collected in this study from patients treated with Rova‐T at 0.3 mg/kg i.v. administered every 6 weeks, a QTcF effect of clinical concern can be excluded.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ There currently are no clinical data regarding the electrocardiographic (ECG) effects of rovalpituzumab tesirine (Rova‐T).
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ This study was conducted to address questions regarding cardiac safety of this agent during its clinical drug development cycle to meet the US Food and Drug Administration (FDA) requirements.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ This study confirmed that targeting delta‐like 3 using Rova‐T at the 0.3 mg/kg dose that was utilized in the phase II/III studies did not result in any clinically significant changes in the ECG throughout several time points.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ This knowledge will facilitate future development of tesirine‐containing antibody‐drug conjugates by alleviating concerns about potential ECG effects, even though Rova‐T itself did not meet efficacy end points for drug approval.

Lung cancer is one of the most common and deadly cancers, with 228,000 new diagnoses per year and 143,000 deaths per year in the United States. 1 Small cell lung cancer (SCLC) accounts for 10% to 15% of lung cancers 1 and is a leading cause of cancer death worldwide. 2 , 3 The prognosis of patients with SCLC is poor, with a 5‐year survival rate of < 5%. 4 , 5 SCLC is categorized into limited‐stage and extensive‐stage (ES) disease based on the extent of the disease, with ES disease accounting for 65% of cases. 6 Treatment options are limited for ES disease, with platinum doublet chemotherapy along with anti‐PD‐L1 checkpoint blockade (atezolizumab or durvalumab) as the preferred first‐line treatment. There are few effective therapies approved for second‐line treatment of ES SCLC 7 ; median overall survival in patients treated with topotecan is only 26 weeks. 8 Recent studies of single‐agent cytotoxic agents and immunotherapy have yielded only modest improvements. 7 The Notch‐family ligand delta‐like 3 (DLL3) is highly expressed on SCLC cells but not expressed in most normal tissue, making it a tractable drug target for SCLC. 9 Rovalpituzumab tesirine (Rova‐T) is a first‐in‐class antibody‐drug conjugate (ADC) that targets DLL3 to deliver a cytotoxic agent directly to SCLC cells. Rova‐T is composed of a monoclonal DLL3 antibody linked to a DNA‐intercalating payload (pyrrolobenzodiazepine (PBD)) via a protease‐cleavable linker. The safety and efficacy of Rova‐T were initially evaluated in 82 patients in the first‐in‐human phase I study SCRX16‐001 (74 patients with SCLC and 8 patients with large‐cell neuroendocrine carcinoma). Treatment‐related cardiac adverse events (AEs) were uncommon. 10 The change in Fridericia‐corrected QT interval (QTcF) remained below a 10‐msec increase relative to baseline at 30 minutes after the end of infusion, when maximum Rova‐T serum concentrations were observed (unpublished data).
Effects on QTcF at later time points have not been evaluated. In the phase II TRINITY study of patients with DLL3‐expressing relapsed/refractory SCLC, 12% of patients had a confirmed objective response to Rova‐T, and a manageable safety profile was observed. 11 Further development of Rova‐T has since been halted because two phase III studies showed a lack of clinical benefit of Rova‐T in the frontline maintenance and second‐line settings. 12 , 13 Determining cardiovascular safety is a key part of drug characterization. A delay in cardiac repolarization results in an electrophysiological environment that may lead to the development of potentially fatal cardiac arrhythmias. Typically, a thorough QT/corrected QT (QTc) study is conducted after the initial clinical study to determine whether an investigational agent meets a threshold level of effect on cardiac repolarization as detected by QT/QTc prolongation. 14 The cytotoxic component of Rova‐T precluded a thorough conventional QT/QTc study, which usually includes a crossover with the therapeutic dose, a supratherapeutic dose, a placebo, and a positive control in healthy volunteers. Additionally, because patients with SCLC require active treatment, the use of a placebo and/or a positive control would not be ethical. 15 Therefore, an alternative, intensive QT/QTc study design was determined to be appropriate, with the primary objective of determining the effect of Rova‐T on QTcF in patients with SCLC during the first two cycles of Rova‐T administration.
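The Fridericia correction used for the QTcF end point divides the measured QT interval by the cube root of the RR interval expressed in seconds. A minimal sketch with illustrative numbers (not trial data):

```python
def qtcf_ms(qt_ms: float, rr_s: float) -> float:
    """Fridericia-corrected QT interval: QTcF = QT / RR^(1/3).

    qt_ms: measured QT interval in milliseconds.
    rr_s:  RR interval in seconds (60 / heart rate in beats per minute).
    """
    return qt_ms / rr_s ** (1.0 / 3.0)

# Illustrative example: QT = 400 ms measured at 75 bpm (RR = 0.8 s)
# corrects to roughly 431 ms.
print(round(qtcf_ms(400.0, 60.0 / 75.0), 1))
```

In a QT study of this kind, the quantity of interest is the baseline‐adjusted change in QTcF at each time point, with the upper bound of its 90% confidence interval compared against the 10‐msec threshold noted above.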

9.
This open‐label randomized controlled pilot study aimed to test the study feasibility of bromhexine hydrochloride (BRH) tablets for the treatment of mild or moderate coronavirus disease 2019 (COVID‐19) and to explore its clinical efficacy and safety. Patients with mild or moderate COVID‐19 were randomly divided into the BRH group or the control group at a 2:1 ratio. Routine treatment according to China’s Novel Coronavirus Pneumonia Diagnosis and Treatment Plan was performed in both groups, whereas patients in the BRH group were additionally given oral BRH (32 mg t.i.d.) for 14 consecutive days. The efficacy and safety of BRH were evaluated. A total of 18 patients with moderate COVID‐19 were randomized into the BRH group (n = 12) or the control group (n = 6). There were suggestions of a BRH advantage over control in improved chest computed tomography, need for oxygen therapy, and discharge rate within 20 days. However, none of these findings were statistically significant. BRH tablets may potentially have a beneficial effect in patients with COVID‐19, especially for those with lung or hepatic injury. A further definitive large‐scale clinical trial is feasible and necessary.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Bromhexine hydrochloride (BRH) is capable of inhibiting transmembrane protease serine 2 (TMPRSS2) and TMPRSS2‐specific viral entry and is theoretically regarded to be effective against severe acute respiratory syndrome‐coronavirus 2.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ This open‐label randomized controlled pilot study evaluated the study feasibility of BRH tablets for the treatment of coronavirus disease 2019 (COVID‐19) and explored its clinical efficacy and safety.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ The study of BRH tablets for the treatment of COVID‐19 is feasible and necessary.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ BRH tablets may potentially have a beneficial effect against COVID‐19, especially for patients with lung and hepatic injury. A further large‐scale clinical trial is warranted to confirm our findings.

The epidemic novel coronavirus disease 2019 (COVID‐19) has now rapidly spread from China to around the world. 1 , 2 Among all infected patients, 80% have been categorized as having moderate disease, and the overall fatality rate is ~ 2.3%, with the elderly experiencing a higher rate. 3 Asymptomatic carriers are also contagious, which contributes to the growing epidemic. There is an urgent need for effective treatment not only to relieve symptomatic patients but also to curb viral transmission. The novel coronavirus mainly invades the human body through angiotensin‐converting enzyme 2/transmembrane protease serine 2 (TMPRSS2). 4 Previous studies on severe acute respiratory syndrome, Middle East respiratory syndrome, and other respiratory viruses have revealed that TMPRSS2 participates in the process of host cell entry, maturation, and release of the virus, which enhances viral infectivity. 5 , 6 , 7 Therefore, inhibition of TMPRSS2 may be a promising therapeutic approach for COVID‐19. 4 The latest prophylactic and treatment option for severe acute respiratory syndrome‐coronavirus 2 (SARS‐CoV‐2) infection proposed by researchers from the Netherlands, the United States, Indonesia, South Africa, and Italy is the use of a TMPRSS2 inhibitor. 8 Bromhexine hydrochloride (BRH), a widely used mucolytic agent, has a specific inhibitory effect on TMPRSS2. 9 The half‐maximal inhibitory concentration value of BRH toward TMPRSS2 is merely 0.75 μM. 9 BRH at a dosage that selectively inhibits TMPRSS2 and TMPRSS2‐specific viral entry is regarded to be effective against SARS‐CoV‐2. 8 Therefore, the aim of this study was to conduct a clinical pilot study to test the study feasibility of BRH tablets for the treatment of moderate COVID‐19 and to explore its clinical efficacy and safety.

10.
The rapidly advancing field of digital health technologies provides a great opportunity to radically transform the way clinical trials are conducted and to shift the clinical trial paradigm from a site‐centric to a patient‐centric model. Merck’s (Kenilworth, NJ) digitally enabled clinical trial initiative is focused on introduction of digital technologies into the clinical trial paradigm to reduce patient burden, improve drug adherence, provide a means of more closely engaging with the patient, and enable higher quality, faster, and more frequent data collection. This paper will describe the following four key areas of focus from Merck’s digitally enabled clinical trials initiative, along with corresponding enabling technologies: (i) use of technologies that can monitor and improve drug adherence (smart dosing), (ii) collection of pharmacokinetic (PK), pharmacodynamic (PD), and biomarker samples in an outpatient setting (patient‐centric sampling), (iii) use of digital devices to collect and measure physiological and behavioral data (digital biomarkers), and (iv) use of data platforms that integrate digital data streams, visualize data in real‐time, and provide a means of greater patient engagement during the trial (digital platform). Furthermore, this paper will discuss the synergistic power in implementation of these approaches jointly within a trial to enable better understanding of adherence, safety, efficacy, PK, PD, and corresponding exposure‐response relationships of investigational therapies as well as reduced patient burden for clinical trial participation. Obstacles and challenges to adoption and full realization of the vision of patient‐centric, digitally enabled trials will also be discussed.

The rapidly advancing field of digital health technologies provides an opportunity to transform the pharmaceutical industry and the way clinical trials are conducted. Although the conduct of clinical trials has evolved over the last century to improve the unbiased evaluation of new therapies, there remain several limitations in the current clinical trial paradigm. Pharmaceutical clinical trials are often site‐centric, requiring patients to come to the clinical site for sample and data collection. The need to travel to the clinical site often restricts the trial population to those who live in geographic proximity to the clinical site, and, thus, restricts who participates and limits patient diversity, leaving many patients excluded and underserved. 1 , 2 , 3 , 4 , 5 The current trial paradigm provides only static snapshots of data (corresponding to the time of the clinical visit), resulting in lost opportunity to monitor end points of disease progression, pharmacokinetics (PK), pharmacodynamics (PD), and safety and tolerability end points in between clinical visits. Additionally, clinical trial outcome measures may not be particularly meaningful to patients or their health care providers, and end points may be limited by categorical, episodic, subjective assessments that progress slowly, thus requiring large, long, expensive clinical trials to enable detection of meaningful change in the end point. Furthermore, patient medication adherence and persistence to therapy in clinical trials is often low, 6 , 7 limiting the researcher’s ability to adequately assess the drug’s safety, efficacy, and exposure‐response relationships. Lastly, patients often find the clinical trial language confusing and the trial’s expectations intrusive into their daily lives, limiting the number of patients who participate in clinical trials and threatening the retention of those patients who do consent to participate.
The potential benefits of digital health and outpatient sampling technologies in clinical trials are tremendous. They can enable increased access to the appropriate patient population, reduced patient burden to participate, augmented, more informed, objective data sets (both in collecting and measuring existing end points at home and in access to new end points that would have been impossible to collect in the past), increased engagement with the patient, and better understanding of the patient experience throughout the trial. All these benefits will ultimately improve the patient experience during the trial and enable improved drug development decisions and understanding of drug and disease effects. 8 Despite these potential improvements, and despite the relative “explosion” in both the number and capabilities of digital health technologies and the increased adoption of consumer‐grade health‐tracking devices in the marketplace, adoption of such technologies in pharmaceutical trials has lagged by comparison. 9 , 10 , 11 Some of the challenges to pharmaceutical trial adoption include questions around patient privacy, lack of sufficient validation for digital end points, lack of transparency for calculation of end points (“black box” algorithms), challenges related to patient adherence and the burden of wearing and using devices, operational and data transfer challenges, and regulatory unknowns. However, use of digital end points in drug development trials, including as primary and secondary end points and to support label claims, is becoming a reality, and “pilot” trials evaluating technologies of interest, often evaluating digital end points in comparison to a traditionally accepted clinical standard end point, are being increasingly conducted. 12 , 13 , 14
The digitally enabled clinical trials initiative at Merck (Kenilworth, NJ) is aimed at using innovative, digital technologies in clinical trials both at the clinical site and in at‐home settings to reduce patient burden, collect higher‐quality data, enrich clinical trial data sets, and ultimately enable more rapid and informed clinical decisions. We ultimately aim to shift the clinical trial paradigm from one that is site‐centric to one that is patient‐centric. Key areas of focus include (i) collection of at‐home PK, PD, and biomarker samples (outpatient sampling), (ii) use of technologies to monitor and improve patient adherence (smart dosing), (iii) use of digital devices to collect and measure physiological and behavioral data (digital biomarkers), and (iv) development and use of data platforms that can acquire the data from digital devices, provide real‐time analytic capabilities, and maintain patient engagement throughout the trial (digital platform; Figure 1).
Figure 1. Areas of focus for digitally enabled clinical trials.
This paper describes the four key areas of focus of our digitally enabled clinical trials initiative and reviews corresponding enabling technologies. Furthermore, this paper discusses the synergistic power in implementation of these approaches jointly within a trial to enable a more accurate understanding of adherence, safety, PK, and corresponding exposure‐response relationships of investigational new drugs (INDs) as well as reduced patient burden for clinical trial participation. Obstacles and challenges to adoption and fully realizing the vision of patient‐centric, digitally enabled trials are also discussed.

11.
12.
13.
The current diagnosis of Parkinson’s disease (PD) mostly relies on clinical rating scales related to motor dysfunction. Given that clinical symptoms of PD appear after significant neuronal cell death in the brain, it is required to identify accessible, objective, and quantifiable biomarkers for early diagnosis of PD. In this study, a total of 20 patients with idiopathic PD and 20 age‐matched patients with essential tremor according to the UK Brain Bank Criteria were consecutively enrolled to identify peripheral blood biomarkers for PD. Clinical data were obtained by clinical survey and assessment. Using albumin‐depleted and immunoglobulin G‐depleted plasma samples, we performed immunoblot analysis of seven autophagy‐related proteins and compared the levels of proteins to those of the control group. We also analyzed the correlation between the levels of candidate proteins and clinical characteristics. Finally, we validated our biomarker models using receiver operating characteristic curve analysis. We found that the levels of BCL2‐associated athanogene 2 (BAG2) and cathepsin D were significantly decreased in plasma of patients with PD (P = 0.009 and P = 0.0077, respectively). The level of BAG2 in patients with PD was significantly correlated with Cross‐Culture Smell Identification Test score, which indicates olfactory dysfunction. We found that our biomarker model distinguishes PD with 87.5% diagnostic accuracy (area under the curve (AUC) = 0.875, P < 0.0001). Our result suggests BAG2 and cathepsin D as candidates for early‐diagnosis plasma biomarkers for PD. We provide the possibility of plasma biomarkers related to the autophagy pathway, by which decreased levels of BAG2 and cathepsin D might lead to dysfunction of autophagy.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Although the current diagnostic method for Parkinson’s disease (PD) shows high accuracy, it is often ineffective for diagnosing early PD or predicting PD onset. Several studies showed that the autophagy‐lysosomal pathway is altered in patients with early PD, suggesting that autophagy‐related proteins could be potential biomarkers for early PD.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ We aimed to identify plasma biomarkers for PD by quantitative analysis of proteins related to the autophagy‐lysosomal pathway.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ This study showed that decreased plasma levels of BCL2‐associated athanogene 2 (BAG2) and cathepsin D could be used as PD biomarkers with high accuracy.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ The diagnostic model using biomarkers identified in this study can be used for more accurate and convenient PD diagnosis. This study also supports that the autophagy‐lysosomal pathway is fundamentally linked to the pathogenesis of PD.

Parkinson’s disease (PD) is the second most common neurodegenerative disorder of insidious onset. PD is characterized by the presence of motor symptoms, including shaking, rigidity, bradykinesia, and postural disturbances, and non‐motor symptoms, including gait, speech, and swallowing difficulties. 1 The motor symptoms of PD are caused by a significant decrease in dopamine levels in the brain due to the degeneration of dopaminergic (DA) neurons. 2 Because the motor disturbance symptoms begin after a 60 to 80% loss of the DA neurons, it is critical to initiate the appropriate medical intervention at the early stage of disease progression. 3 Despite the rapid increase in PD prevalence, there are still no effective biological or imaging markers. The current diagnosis of PD is made using the clinical criteria developed by the Brain Bank of the Parkinson’s Disease Society in the UK. 4 Even though these criteria have a high degree of accuracy, they are still not effective for predicting PD onset or diagnosing patients with early PD without motor symptoms. Thus, the development of early PD biomarkers is of importance. Identifying biomarkers is necessary because they can be administered in worldwide screening to predict PD progression and diagnose early PD. A biomarker should be applicable to all sexes and ages, easily accessible, noninvasive, and, most importantly, it should yield a quantifiable value for clinical application. In this regard, using peripheral blood plasma is a promising way to develop biomarkers for PD. 5 Although several studies showed that the level of different forms of α‐synuclein in the plasma of patients with PD could be used as a biomarker, it is still controversial whether the α‐synuclein level is a suitable biomarker for PD prediction or diagnosis, due to the inconsistency of the results. 6 , 7 Accordingly, the peripheral α‐synuclein level does not seem to have potential as a biomarker.
Nonetheless, a promising finding is that the level of DJ‐1 is decreased in the cerebrospinal fluid of patients with PD; however, DJ‐1 levels in the sera of patients with PD did not differ from those of control patients. 6 , 8 In addition, the levels of uric acid in sera and epidermal growth factor in plasma are reported to be decreased in patients with PD. 6 Thanks to years of research on PD, it is now well known that several factors, including α‐synuclein, parkin, PINK1, LRRK2, and DJ‐1, are deeply related to the pathogenesis of PD. α‐Synuclein is a presynaptic neuronal protein that is neuropathologically related to PD. 9 , 10 Several studies showed the implication of parkin and PINK1 in mitophagy, which is thought to be one of the underlying pathogenic mechanisms of PD. 10 , 11 , 12 In addition, LRRK2 and DJ‐1 are reported to play important roles in autophagy‐mediated DA neuronal cell loss. 13 , 14 Most recently, Laperle et al. showed that lysosomal membrane proteins, such as LAMP1, were decreased in induced pluripotent stem cells of patients with young‐onset PD. 15 Together, these studies suggest a deep implication of autophagy in PD. In this study, we aimed to identify autophagy‐related proteins as potential biomarkers for PD by quantitative analysis of patient plasma samples. In addition, we investigated the relationship between the potential biomarkers and clinical characteristics of the patients with PD.
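The diagnostic accuracy reported for the biomarker model comes from receiver operating characteristic (ROC) curve analysis. The area under the ROC curve (AUC) equals the probability that a randomly chosen case is ranked above a randomly chosen control, which can be computed directly by pairwise comparison. A self‐contained sketch on synthetic scores (not the study's data):

```python
def roc_auc(case_scores, control_scores):
    """AUC as the probability that a case outranks a control (ties count 1/2)."""
    wins = ties = 0.0
    for s in case_scores:
        for t in control_scores:
            if s > t:
                wins += 1
            elif s == t:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Synthetic example: because BAG2 and cathepsin D are *decreased* in PD,
# lower plasma levels should rank cases above controls, so levels are negated.
pd_levels = [0.4, 0.5, 0.6, 0.7]        # hypothetical patient values
control_levels = [0.8, 0.9, 1.0, 0.6]   # hypothetical control values
auc = roc_auc([-x for x in pd_levels], [-x for x in control_levels])
print(auc)  # prints 0.90625
```

This pairwise formulation is equivalent to the Mann‐Whitney U statistic; libraries such as scikit‐learn compute the same quantity from labels and scores.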

14.
Many targeted therapies are administered at or near the maximum tolerated dose (MTD). With the advent of precision medicine, a larger therapeutic window is expected. Therefore, dose optimization will require a new approach to early clinical trial design. We analyzed publicly available data for 21 therapies targeting six kinases, and four poly (ADP‐ribose) polymerase inhibitors, focusing on potency and exposure to gain insight into dose selection. The free average steady‐state concentration (Css) at the approved dose was compared to the in vitro cell potency (half‐maximal inhibitory concentration (IC50)). Average steady‐state area under the plasma concentration‐time curve, the fraction unbound drug in plasma, and the cell potency were taken from the US drug labels, US and European regulatory reviews, and peer‐reviewed journal articles. The Css was remarkably similar to the IC50. The median Css/IC50 value was 1.2, and 76% of the values were within 3‐fold of unity. However, three drugs (encorafenib, erlotinib, and ribociclib) had a Css/IC50 value > 25. Seven other therapies targeting the same 3 kinases had much lower Css/IC50 values ranging from 0.5 to 4. These data suggest that these kinase inhibitors have a large therapeutic window that is not fully exploited; lower doses may be similarly efficacious with improved tolerability. We propose a revised first‐in‐human trial design in which dose cohort expansion is initiated at doses less than the MTD when there is evidence of clinical activity and Css exceeds a potency threshold. This potency‐guided approach is expected to maximize the therapeutic window thereby improving patient outcomes.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ The primary objective of most first‐in‐human (FIH) studies is to establish a maximum tolerated dose (MTD). In oncology, the MTD is assumed to be ideal and lower doses are rarely studied.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ How can we best leverage preclinical data to identify doses that exploit the larger therapeutic window expected for next generation targeted therapies?
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ At the approved doses of 25 targeted therapies studied, the average free concentration at steady state (Css) was similar to the in vitro cell potency (half‐maximal inhibitory concentration (IC50)). However, 3 of these drugs have Css/IC50 values > 25, suggesting a large therapeutic window. Lower doses of these agents may be equally effective with less toxicity.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ We propose a revised FIH trial design for next generation targeted therapy in which dose cohort expansion is initiated at doses less than the MTD when there is evidence of clinical activity and Css exceeds a threshold informed by in vitro cell potency.

Most often, the primary objective of the first‐in‐human (FIH) trial in oncology is to establish a maximum tolerated dose (MTD). Where targeted therapies are studied in defined patient populations, it is not uncommon to observe meaningful clinical responses during dose escalation. Nonetheless, the MTD is typically assumed to be the ideal therapeutic dose and dose escalation continues with 3–6 patients per dose level until the MTD is reached. An expansion cohort is initiated most often at the MTD to evaluate preliminary efficacy, at which point lower doses are no longer explored. Thus, limited information is collected in these FIH studies that would facilitate a comparison of the efficacy at the MTD with that of lower doses, which may be better tolerated. 1 , 2 , 3 Given the desire to advance the most promising agents to confirmatory trials as rapidly as possible, there has been considerable debate regarding dose selection in oncology. 4 , 5 , 6 It remains a question whether the “MTD approach,” which is well‐established for chemotherapeutics that have a narrow therapeutic window, is equally appropriate for targeted therapies that may have a larger therapeutic window. Analysis of the growing number of approved targeted agents, including preclinical data made public during the regulatory review and approval process, provides unique insights into this question. A potency‐guided FIH trial leverages quantitative preclinical data regarding the underlying concentration‐response relationship driving therapeutic efficacy. At steady state, for cell‐permeable drugs not subject to active transport processes, the unbound drug concentration in the blood is equal to the unbound concentration in the tumor, where the free drug interacts with its target. Under these conditions, systemic drug concentrations approximating the in vitro potency are expected to elicit the desired pharmacologic response.
This hypothesis can be validated using xenograft models in which the inhibition of tumor cell growth is studied in cell culture and in animals under similar conditions. Concordance between in vitro and in vivo potency has been demonstrated for drugs targeting specific genetic abnormalities that drive tumor cell growth. 7 , 8 , 9 , 10 In the present study, the free average steady‐state concentration (Css) of 25 marketed oncology drugs, including 21 kinase inhibitors (5 ABL, 3 ALK, 3 BRAF, 3 CDK4/6, 4 EGFR, and 3 MEK1/2) and 4 poly (ADP‐ribose) polymerase (PARP) inhibitors, has been compared with the in vitro cell line potency (half‐maximal inhibitory concentration (IC50)) of the drug to derive a unitless ratio herein defined as Css/IC50. Many of these therapies have a Css/IC50 value near unity and are administered at their MTD. Drugs that fit these parameters have a relatively narrow therapeutic window where higher doses are not tolerated, and lower doses result in insufficient target engagement. However, for those drugs administered at their MTD that have a Css/IC50 value substantially > 1, a lower dose has the potential to provide similar efficacy with a more favorable safety profile. FIH studies of mutant‐selective oncogene inhibitors and drugs that leverage synthetic lethal interactions are expected to enroll homogeneous patient populations that are highly sensitive to therapy. We propose that these studies use a revised FIH trial design in which dose cohort expansion is initiated at doses less than the MTD when there is evidence of clinical activity and Css values exceed an IC50 threshold (Figure 1). The performance of multiple expansion cohorts can be compared directly before selecting the dose for further evaluation. For medicines, which by design are expected to minimize toxicity to normal tissue and maximize tumor cytotoxicity, this approach should help identify the optimal dose.
Where efficacy can be achieved at lower, equally effective doses, we expect less toxicity, better compliance, and, accordingly, increased benefit to patients.
Figure 1. Potency‐guided first‐in‐human trial design, including theoretical outcomes. Dose expansion is initiated at dose level 2 when the steady‐state concentration (Css) value is 2‐fold greater than the half‐maximal inhibitory concentration (IC50) with no DLTs and 1 PR. Dose expansion is also initiated at dose levels 4 and 5 (the maximum tolerated dose (MTD)). Comparison of the first 10 patients in the 3 expansion cohorts suggests dose level 4 is most promising, and further enrollment is limited to dose level 4. Dose level 3 is not selected for expansion because, due to pharmacokinetic variability, its exposure overlaps with that of adjacent dose levels. Blue arrows represent enrollment into dose expansion cohorts. DLT, dose‐limiting toxicity; PR, partial response; RP2D, recommended phase 2 dose.
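The unitless Css/IC50 ratio underpinning this design can be reproduced from label‐level quantities: the average total Css is the steady‐state AUC divided by the dosing interval, the unbound Css is that value times the fraction unbound, and the result is converted to molar units for comparison with the in vitro IC50. A sketch with hypothetical (not drug‐specific) numbers:

```python
def css_ic50_ratio(auc_ss_ng_h_per_ml: float, tau_h: float, fu: float,
                   mw_g_per_mol: float, ic50_nm: float) -> float:
    """Free average steady-state concentration over in vitro cell potency.

    auc_ss_ng_h_per_ml: AUC over one dosing interval at steady state (ng*h/mL).
    tau_h:              dosing interval in hours.
    fu:                 fraction of drug unbound in plasma.
    mw_g_per_mol:       molecular weight of the drug.
    ic50_nm:            in vitro cell potency in nM.
    """
    css_total_ng_per_ml = auc_ss_ng_h_per_ml / tau_h  # average total Css
    # convert unbound Css from ng/mL to nM: 1 ng/mL = 1000/MW nM
    css_free_nm = css_total_ng_per_ml * fu * 1000.0 / mw_g_per_mol
    return css_free_nm / ic50_nm

# Hypothetical once-daily drug: AUCss 12,000 ng*h/mL, fu 0.05, MW 500 g/mol,
# cell IC50 50 nM -> total Css 500 ng/mL, free Css 50 nM, ratio 1.0.
print(css_ic50_ratio(12000.0, 24.0, 0.05, 500.0, 50.0))
```

Ratios near 1 indicate dosing at the edge of target coverage, whereas values well above 1, as reported for encorafenib, erlotinib, and ribociclib, suggest room to dose below the MTD.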

15.
Recurrent and acute bleeding from intestinal tract angioectasia (AEC) presents a major challenge for clinical intervention. Current treatments are empiric, with frequent poor clinical outcomes. Improvements in understanding the pathophysiology of these lesions will help guide treatment. Using data from the US Food and Drug Administration (FDA)’s Adverse Event Reporting System (FAERS), we analyzed 12 million patient reports to identify drugs inversely correlated with gastrointestinal bleeding and potentially limiting AEC severity. FAERS analysis revealed that drugs used in patients with diabetes and those targeting PPARγ‐related mechanisms were associated with decreased AEC phenotypes (P < 0.0001). Electronic health records (EHRs) at the University of Cincinnati Hospital were analyzed to validate the FAERS analysis. EHR data showed a 5.6% decrease in risk of AEC and associated phenotypes in patients on PPARγ agonists. Murine knockout models of AEC phenotypes were used to construct a gene‐regulatory network of candidate drug targets and pathways, which revealed that wound healing, vasculature development, and regulation of oxidative stress were impacted in AEC pathophysiology. Human colonic tissue was examined for expression differences across key pathway proteins, PPARγ, HIF1α, VEGF, and TGFβ1. In vitro analysis of human AEC tissues showed lower expression of PPARγ and TGFβ1 compared with controls (0.55 ± 0.07 and 0.49 ± 0.05, respectively). National Center for Biotechnology Information (NCBI) Gene Expression Omnibus (GEO) RNA‐Seq data were analyzed to substantiate the human tissue findings. This integrative discovery approach showing altered expression of key genes involved in oxidative stress and injury repair mechanisms presents novel insight into AEC etiology, which will improve targeted mechanistic studies and more optimal medical therapy for AEC.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ The clinical detection of angioectasia (AEC) has increased using push‐enteroscopy, capsule enterography, colonoscopy, and esophagogastroduodenoscopy. Management is difficult. Currently, endoscopic ablation is an option for lesions within endoscopic reach, whereas angiogenesis inhibitors and octreotide are pharmacological agents additionally used in the treatment of AEC, often with limited clinical benefit. The precise pathophysiology of AEC is unknown; however, AECs are known to result from an imbalance between pro‐angiogenic and anti‐angiogenic factors.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ How do intestinal AECs develop, and how can targeted therapeutic discovery for AEC be designed?
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ Insight into the development of intestinal AEC and a targeted approach for novel therapeutic strategies.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ The results of this study demonstrate the complexity of AEC development and novel therapeutic directions that could impact patient care and treatment.

Angioectasia (AEC) lesions are common vascular abnormalities characterized by ectatic, dilated, and proliferated blood vessels, and are a significant source of obscure gastrointestinal (GI) bleeding. These aberrant blood vessels are typically < 10 mm in diameter, thin‐walled with little or no smooth muscle, malformed, and uncommunicative, 1, 2 and present symptomatically with overt and occult GI hemorrhage, 1 melena, hematochezia, and resulting anemia. 1, 3 Endoscopy has demonstrated the presence of AEC in the upper GI tract, 1 small bowel, 1, 3 and descending colon, 1, 4 and linked these lesions to upper and lower GI hemorrhage. 1, 5 AECs are also significantly correlated with the occurrence of synchronous lesions 6, 7, 8 and with aging. 1, 9 The clinical detection of AEC has increased with the use of push‐enteroscopy, capsule enterography, colonoscopy, and esophagogastroduodenoscopy, and management of these lesions is difficult, with treatment options remaining suboptimal. 1, 10, 11 Currently, endoscopic ablation is an option for lesions within endoscopic reach, whereas angiogenesis inhibitors, such as thalidomide and its derivative lenalidomide, and octreotide are pharmacological agents additionally used in the treatment of AEC, often with limited clinical benefit. 11, 12 The precise pathophysiology of AEC is unknown; however, AECs are known to result from an imbalance between pro‐angiogenic and anti‐angiogenic factors, and the expression of growth factors, including VEGF, in AECs suggests that angiogenesis plays a role in their development. 1 Angiogenesis promotes the formation of new functional microvascular networks in human tissues in response to hypoxia or ischemia. 1 AEC formation appears to be linked to loss of von Willebrand factor in patients with Heyde’s syndrome or a left ventricular assist device, whereas mutations in several genes of the TGFβ pathway are common in patients with hereditary hemorrhagic telangiectasia. 
13 VEGF‐dependent proliferation and migration represent an important angiogenesis–hemostasis relationship that may have therapeutic implications in the management of AEC. 1, 14, 15, 16 Understanding the role of key mediators in AEC development will be important in identifying novel therapeutic strategies to address this unmet clinical need.

In this report, we describe a novel integrative systems biology‐based approach and clinical validation study evaluating the pathophysiology of these lesions. We sought to identify whether specific drugs were associated with reduced severity or a decreased rate of AEC and AEC‐correlated events, thereby using each medication’s mechanism of action to obtain a “first cut” of the inflammatory processes at work in vivo. To understand how therapeutic agents may impact AEC‐associated disease pathology, we used in silico drug discovery and gene‐regulatory network analysis to identify key pathways and proteins involved in the pathophysiology of AEC and to test candidate therapeutics for their protective mechanisms.
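The FAERS screen described above looks for drugs whose reports show disproportionally fewer bleeding events. The study does not state which disproportionality statistic was used, so the sketch below uses the reporting odds ratio (ROR), a standard signal-detection measure for spontaneous-report databases; the counts are invented for illustration and are not taken from FAERS.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Reporting odds ratio with a 95% CI for a 2x2 FAERS-style table.

    a: reports with the drug AND the event (e.g. GI bleeding)
    b: reports with the drug, without the event
    c: reports without the drug, with the event
    d: reports without the drug, without the event
    """
    ror = (a / b) / (c / d)
    # Standard error of ln(ROR) for a 2x2 contingency table
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, (lo, hi)

# Illustrative counts only: a drug class with proportionally fewer
# bleeding reports yields an ROR whose entire 95% CI lies below 1,
# i.e. an inverse-association signal of the kind the study screened for.
ror, (lo, hi) = reporting_odds_ratio(a=40, b=9960, c=500, d=49500)
```

An ROR below 1 with an upper confidence limit below 1 is the usual criterion for an inverse disproportionality signal; a full analysis would also correct for multiple comparisons across the drug list.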

16.
Volunteer infection studies using the induced blood stage malaria (IBSM) model have been shown to facilitate antimalarial drug development. Such studies have traditionally been undertaken with a single dose per cohort, using as many cohorts as necessary to obtain the dose‐response relationship. To enhance the ethical and logistic aspects of such studies, and to reduce the number of cohorts needed to establish the dose‐response relationship, we undertook a retrospective in silico analysis of previously accrued data to improve study design. A pharmacokinetic (PK)/pharmacodynamic (PD) model was developed from an initial fictive cohort for OZ439, constructed by mixing the data of the three single‐dose cohorts (n = 2 on 100 mg, 2 on 200 mg, and 4 on 500 mg). A three‐compartment model described OZ439 PK. Net parasite growth was modeled using a Gompertz function and drug‐induced parasite death using a Hill function. Parameter estimates for the PK and PD models were comparable between the multidose single‐cohort analysis and the pooled analysis of all cohorts. Simulations based on the multidose single‐cohort design described the complete data from the original IBSM study. The novel design allows the PK/PD relationship to be ascertained early in the study, providing a basis for rational dose selection for subsequent cohorts and studies.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Volunteer infection studies are routinely used in antimalarial drug development to generate early pharmacokinetic/pharmacodynamic data for compounds.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ Can in silico analyses be used to suggest improvements to volunteer infection study designs?
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ Multiple dose adaptive trial designs can potentially reduce the number of cohorts needed to establish the dose‐response relationship in volunteer infection studies.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ Real time data analyses can be used to recommend doses for adaptive volunteer infection studies.

Volunteer infection studies using the induced blood stage malaria (IBSM) model have been recognized as a valuable system for defining the key pharmacokinetic (PK) and pharmacodynamic (PD) relationships for dose selection in antimalarial drug development. 1, 2, 3, 4, 5, 6, 7 In such studies, healthy volunteers are inoculated intravenously with a given quantity (with small variability) of Plasmodium‐infected red cells. Parasitemia is then followed by quantitative polymerase chain reaction until a prespecified treatment threshold is reached, at which point the test drug is administered. Parasite and drug concentrations are then measured. These studies are conducted prior to phase II dose‐response (D‐R) trials and can be included in an integrated first‐in‐human study protocol, or conducted after completion of the first‐in‐human PK and safety study. IBSM studies have typically been designed as flexible multiple‐cohort studies in which each volunteer of a given cohort receives a single dose of the same amount of drug (“single dose per cohort”). 2, 3, 4, 5 After each cohort, a decision is made to stop or to add a cohort testing a lower or higher dose, based on the response observed in the previous cohorts.

For the multiple single‐dose‐per‐cohort design, the starting dose is typically selected based on safety and PK information from a phase I single ascending dose (SAD) study and, more recently, on preclinical data from a severe combined immunodeficient mouse model, with the dose selected on the basis of being best able to inform the D‐R relationship rather than aiming for cure. This approach, in which a single dose is tested in all subjects of the initial cohort, risks missing the dose likely to be most informative for defining the PK/PD relationship.

An alternative approach is to spread a range of doses across a smaller number of subjects within the initial cohort and use PK/PD models developed from this cohort’s data to support dose selection for subsequent cohorts and studies. 
Using data from a previous study, 2 we undertook an in silico investigation of such an adaptive study design, aiming to reduce the number of subjects exposed to inefficacious doses and to establish a D‐R relationship. This multiple‐dose‐groups‐per‐cohort design, referred to as the “2‐2‐4” design, is contrasted with the already implemented study design in Figure 1.

Figure 1. Comparison of standard and adaptive designs of IBSM studies. A/B/C, dose levels to be selected during the progress of the study based on pharmacokinetic/pharmacodynamic results of the initial cohort; CHMI, controlled human malaria infection; D‐R, dose‐response; IBSM, induced blood stage malaria infection; n, number of subjects at each dose.

The objectives of this retrospective analysis were to: (i) compare PK/PD parameter estimates from the initial cohort of the 2‐2‐4 study design with the prior results from the data of the full study, and (ii) propose a preliminary workflow to establish the D‐R relationship early in an IBSM study and use modeling and simulation (M&S) to support dose selection for subsequent cohorts and later‐phase clinical trials.
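The PD model structure named above, Gompertz net growth minus a Hill-function kill term, can be sketched as a simple forward simulation. All parameter values below are illustrative placeholders chosen only to show the qualitative behavior; they are not the published OZ439 estimates.

```python
import math

def simulate_parasitemia(p0, conc, hours, dt=0.1,
                         kg=0.0026, pmax=1e12,     # Gompertz net-growth terms
                         emax=0.2, ec50=30.0, hill=2.0):  # Hill kill terms
    """Euler simulation of the PD model sketched in the text:
    dP/dt = kg * ln(pmax / P) * P  -  Emax * C^h / (EC50^h + C^h) * P
    Parameters are illustrative, not fitted values from the IBSM study.
    """
    p = p0
    for _ in range(int(round(hours / dt))):
        growth = kg * math.log(pmax / p) * p
        kill = emax * conc**hill / (ec50**hill + conc**hill) * p
        p += dt * (growth - kill)
    return p

# Untreated parasites expand; a drug concentration far above EC50
# drives parasitemia down, mimicking post-dose clearance in IBSM data.
p_untreated = simulate_parasitemia(p0=1e4, conc=0.0, hours=48)
p_treated = simulate_parasitemia(p0=1e4, conc=300.0, hours=48)
```

A real analysis would fit kg, Emax, EC50, and the Hill coefficient to the quantitative PCR parasitemia data jointly with the three-compartment PK model, rather than simulating with fixed values.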

17.
Umibecestat, an orally active β‐secretase inhibitor, reduces production of the amyloid beta‐peptide that accumulates in the brain of patients with Alzheimer’s disease. The electrocardiographic effects of umibecestat on the Fridericia‐corrected QT interval (QTcF), PR interval, QRS duration, and heart rate (HR) were estimated by concentration‐effect modeling. Three phase I/II studies with durations up to 3 months, comprising 372 healthy subjects over a wide age range, of both sexes, and of two ethnicities, were pooled, providing a large data set with good statistical power. No clinically relevant effects on QTcF, PR interval, QRS duration, or HR were observed up to supratherapeutic doses. The upper bound of the 90% confidence interval of ∆QTcF remained below the 10‐ms threshold of regulatory concern at all measured concentrations. Prespecified sensitivity analyses confirmed the results in both sexes, in subjects above and below 60 years of age, and in Japanese subjects. All conclusions were endorsed by the US Food and Drug Administration (FDA).

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Cardiac safety remains a focus of drug development and regulation. The International Conference on Harmonization E14 recommends a definitive QT assessment, for which concentration‐response modeling now serves as an accepted alternative to the thorough QT study.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ Can electrocardiogram data from phase I/II trials involving young and elderly, male and female volunteers of two ethnicities be successfully pooled for concentration‐effect modeling?
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ These results demonstrate the cardiac safety of a beta‐secretase inhibitor, umibecestat. Furthermore, the pooling strategy supports the pooling of phase I/II studies to increase power in concentration‐response modeling, including sensitivity analyses regarding age, sex, and ethnicity.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ We demonstrate that pooled analysis of phase I/II studies can be a successful approach to assess cardiac safety to achieve health authority approval. Adapting such an approach a priori instead of post hoc, as demonstrated here, may reduce the sample sizes necessary, expedite drug development, and be more cost‐effective.

Alzheimer’s disease (AD) is one of the most prevalent and debilitating neurodegenerative disorders, and there is a high unmet medical need for effective prevention or treatment. According to the Global Burden of Disease Study 2016, a total of 43.8 million individuals were living with dementia globally. 1 Although the approved pharmacological agents (donepezil, galantamine, rivastigmine, and memantine) treat the symptoms of AD, no disease‐modifying treatment for presymptomatic or prodromal AD is currently available. 2 AD is associated with the accumulation of amyloid beta (Aβ)‐peptide plaques and tau proteins, which are the hallmarks of the multifactorial nature of late‐onset AD. 3 , 4 These plaques consist of aggregated fibrils of Aβ peptides that are derived via enzymatic processing of the amyloid precursor protein (APP). 5 A substantial body of genetic, histopathological, and biomarker evidence supports a potential causal role for Aβ in AD. 6 , 7 , 8 , 9 Thus, prevention of Aβ formation by inhibiting the protease responsible for the critical first step in APP processing, beta‐site‐APP cleaving enzyme‐1 (BACE‐1), has been proposed as a therapeutic approach. 10 Umibecestat, an orally active BACE‐1 inhibitor, reduces brain and cerebrospinal fluid Aβ in rats and dogs, and Aβ plaque deposition in APP‐transgenic mice. 11 Treatment of healthy adults of white and Japanese origin, including healthy adults ≥ 60 years old, resulted in robust and dose‐dependent Aβ reduction in the cerebrospinal fluid (first‐in‐human (FIH) study, Novartis, data on file). 11 The pharmacokinetics (PK) of umibecestat in humans were reported by Neumann et al., 2018. 
After a single dose, umibecestat displayed a moderate absorption rate (time of maximum concentration (Tmax) within 1–8 hours after dosing), and the mean terminal half‐life was 61.3–83.8 hours in healthy adult participants and 81.4–109 hours in participants ≥ 60 years of age, suggesting that umibecestat is suitable for once‐daily dosing in humans. Umibecestat plasma exposure (peak plasma concentration (Cmax) and area under the curve (AUC)) increased approximately in proportion to dose following either single or repeated doses, with an accumulation ratio of up to five. Upon daily dosing, plasma levels of umibecestat increased within the first month of administration in subjects ≥ 60 years and then remained stable for an additional 2 months of dosing. In blood, umibecestat distributed mainly to the plasma fraction (blood‐to‐plasma concentration ratio = 0.739), with high protein binding (95.9%). Umibecestat displayed good penetration through the blood–brain barrier. Assessment of metabolites showed that the major circulating component in plasma was unchanged umibecestat (44% of total AUC), followed by the amide hydrolysis and oxidative metabolites (FIH study, Novartis, data on file).

Umibecestat 15 mg and 50 mg were studied in the Generation Program in two clinical prevention studies in subjects at risk for clinical onset of AD. 12, 13 However, an interim assessment of outcomes performed during a preplanned review of unblinded data identified a worsening in some assessments of cognitive function, similar to that reported with other BACE inhibitors, 14, 15 and the sponsor decided to discontinue both studies.

Here, we present the results of the cardiac safety analysis for umibecestat in healthy volunteers. 
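As a rough consistency check on the PK summary above: for a drug with first-order elimination, the steady-state accumulation ratio under repeated dosing is R = 1 / (1 − e^(−k·τ)). With the reported adult half-lives of 61.3–83.8 hours and τ = 24 hours, this textbook one-compartment formula predicts roughly 4.2- to 5.6-fold accumulation, on the order of the reported ratio of up to five (the observed value will also reflect any time-dependent kinetics, which this simple formula ignores).

```python
import math

def accumulation_ratio(t_half_h, tau_h):
    """Steady-state accumulation ratio for repeated dosing of a drug with
    first-order elimination: R = 1 / (1 - exp(-k * tau)),
    where k = ln(2) / t_half. One-compartment, time-invariant kinetics."""
    k = math.log(2) / t_half_h
    return 1.0 / (1.0 - math.exp(-k * tau_h))

# Once-daily dosing over the reported healthy-adult half-life range.
r_short = accumulation_ratio(t_half_h=61.3, tau_h=24)   # ~4.2-fold
r_long = accumulation_ratio(t_half_h=83.8, tau_h=24)    # ~5.6-fold
```

The longer half-life observed in subjects ≥ 60 years would predict somewhat higher accumulation, consistent with the slow rise in plasma levels during the first month of dosing in that group.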
Cardiotoxicity is a well‐known serious side effect that may result from off‐target interactions between drugs and cardiac voltage‐gated ion channels, such as the human ether‐à‐go‐go related gene (hERG) potassium channel, which helps control heart rhythm: such interactions can prolong the QT interval and trigger potentially lethal arrhythmias such as Torsades de Pointes. 16, 17 The hERG channel, unlike other ion channels, can bind a very wide range of ligands, 18 including BACE‐1 inhibitors. Previous in vitro cardio‐safety assessments showed that umibecestat inhibited hERG potassium channel currents (half‐maximal inhibitory concentration (IC50) of 3.2 μM) and the L‐type calcium channel hCav1.2 (IC50 of 9.1 μM). Based on the mean Cmax drug exposure at the highest dose of 50 mg once daily investigated in the AD prevention studies, safety margins (SMs) of > 100‐fold were established relative to the projected free plasma concentration (hERG SM = 114‐fold; hCav1.2 SM ≥ 300‐fold). Additionally, in a functional assay, umibecestat had no relevant effects on the human cardiac potassium channel hKCNQ1/MinK 1.5. In vivo safety pharmacology studies with umibecestat in dogs showed no electrocardiogram (ECG) or cardiovascular effects up to the highest doses tested, with a fivefold margin to this highest dose. 11

The potential cardiac safety issues that can result from blockade of cardiac ion channels, and in particular of hERG, are a major focus of drug development and regulation. Although the International Conference on Harmonization (ICH) E14 recommends a dedicated QT assessment of the potential for corrected QT (QTc) prolongation for all small molecules with systemic exposure, concentration–response modeling is now an accepted alternative to a thorough QT study to satisfy the regulatory requirement for QT assessment. 
19, 20, 21 Although the approach has become increasingly popular, few publications describe a strategic developmental approach to pooling data from individual early studies. In a recent example, pooled data from two 14‐day multiple ascending‐dose studies of lemborexant, used in the treatment of insomnia, involving 48 and 18 healthy subjects of Japanese and non‐Japanese ethnicity, were used to estimate QTc intervals with a linear mixed‐effects concentration‐response model. 22 The model predicted a QTc effect of 1.1 ms (90% confidence interval (CI), −3.49 to 5.78 ms) at the highest observed Cmax. Another recent study used pooled data from two phase I studies involving a total of 122 subjects treated with the novel phosphodiesterase‐4 inhibitor CHF6001. 23 The upper limit of the 90% CI for the placebo‐corrected change from baseline in QTcF (ΔΔQTcF) did not exceed 10 ms in either of the two models used: a simple linear mixed‐effects model and a model including oscillatory functions.

We present the results of a cardiac safety analysis, including concentration‐effect modeling, based on pooled analyses of ECG data from healthy volunteers enrolled in early‐phase studies: an FIH study in healthy adult and elderly subjects, an ethnic sensitivity study in healthy adult and elderly Japanese subjects, and a safety and tolerability dose‐range study in healthy elderly subjects (Novartis, data on file). 11 In addition to presenting the cardiac safety data for umibecestat, we offer this pooled analysis as an example of how to address cardiac safety early in the drug development process.
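The concentration–QTc analyses cited above rest on a linear exposure-response fit with an upper 90% confidence bound evaluated at Cmax. The sketch below shows the core calculation with ordinary least squares on synthetic data and a large-sample normal approximation (z = 1.645); a formal analysis would instead use a linear mixed-effects model with subject random effects and t quantiles.

```python
import math

def ols_qtc_upper_bound(conc, dqtcf, c_ref, z=1.645):
    """OLS fit of ΔQTcF (ms) on plasma concentration. Returns the slope,
    intercept, and the upper limit of the two-sided 90% CI of the mean
    predicted ΔQTcF at concentration c_ref (normal approximation)."""
    n = len(conc)
    xbar = sum(conc) / n
    ybar = sum(dqtcf) / n
    sxx = sum((x - xbar) ** 2 for x in conc)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(conc, dqtcf))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    sse = sum((y - intercept - slope * x) ** 2 for x, y in zip(conc, dqtcf))
    s2 = sse / (n - 2)                       # residual variance
    se = math.sqrt(s2 * (1 / n + (c_ref - xbar) ** 2 / sxx))
    return slope, intercept, intercept + slope * c_ref + z * se

# Synthetic illustration: true slope 0.01 ms per unit concentration,
# deterministic ±1 ms "noise". The upper bound at the reference Cmax
# staying below 10 ms is the regulatory criterion described in the text.
conc = [50 * i for i in range(12)]
noise = [1, -1] * 6
dqtcf = [0.01 * c + e for c, e in zip(conc, noise)]
slope, intercept, upper90 = ols_qtc_upper_bound(conc, dqtcf, c_ref=500)
```

Pooling several studies, as done here for umibecestat, mainly tightens this bound by increasing n and widening the observed concentration range.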

18.
High‐salt (HS) intake is closely associated with the initiation and progression of hypertension. The underlying mechanisms may involve endothelial dysfunction, nitric oxide deficiency, oxidative stress, and proinflammatory cytokines. Propolis is widely used as a natural antioxidant and is a well‐known functional food owing to its biological activities, which include anti‐inflammatory, antimicrobial, and liver‐detoxifying effects. In this study, we established a HS diet‐induced hypertensive rat model. We found that long‐term HS feeding altered myocardial function in the rats, leading to a significant decrease (around 49%) in heart function. Increasing doses of Chinese water‐soluble propolis (WSP), however, produced dose‐dependent improvements in the myocardial function of hypertensive rats (11%, 60%, and 91%, respectively). Results from the blood circulation test and hematoxylin‐eosin staining showed that propolis had protective effects on myocardial function and blood vessels in hypertensive rats. In addition, western blot and polymerase chain reaction results showed that WSP effectively regulated Nox2 and Nox4 levels and thereby decreased reactive oxygen species synthesis. Our findings demonstrate that Chinese WSP significantly lowered blood pressure and improved cardiovascular function in hypertensive rats. The improvement in cardiovascular function may be related to anti‐oxidative and anti‐inflammatory processes and to improved endothelial function in hypertensive rats.

Salt consumption has increased dramatically in China. A diet low in salt promotes good cardiovascular health, whereas a diet high in salt is detrimental to health. 1 High‐salt (HS) intake is therefore well recognized as a main risk factor in the initiation and progression of hypertension. Many studies suggest that endothelial dysfunction, nitric oxide deficiency, oxidative stress, and proinflammatory cytokines contribute to the development of hypertension. 2 The HS diet has been associated with dysregulation of the intrarenal renin‐angiotensin system (RAS), oxidative stress, and inflammatory cytokines, leading to excessive Na+ retention, increased vascular resistance, and high blood pressure. 3 HS intake affects cardiovascular function through a mechanism involving transforming growth factor beta (TGF‐β1) and nitric oxide (NO). 4 It has also been reported that a HS diet promotes increased generation of superoxide anion (O2−) from nitric oxide synthase (NOS), which could, in turn, impair endothelium‐dependent dilation through reduced NO bioavailability. It is therefore crucial to understand and prevent tissue injuries associated with oxidative stress and inflammatory cytokines.

Propolis is the generic name of a complex resinous mixture collected from plant buds and exudates by honey bees. Propolis is enriched with bees’ saliva and enzyme‐containing secretions and is used in the construction, adaptation, and protection of hives after pollen collection. 5 Numerous studies now show that honey and propolis have beneficial effects on human health. For this reason, honey and propolis are widely used in cosmetics and are also popular alternatives for self‐treatment of various diseases. Propolis samples from Asia, South America, and Europe have different compositions and therefore varying biological activities. 
However, propolis generally shows great similarity in composition regardless of its botanical source. Propolis compounds have cardioprotective, antioxidant, antiangiogenic, antiatherosclerotic, vasoprotective, and anti‐inflammatory properties, which could be exploited in the modulation of cardiovascular disease.

Recent studies on Malaysian propolis (MP) demonstrated antioxidant properties and cardioprotective activity against isoproterenol‐induced oxidative stress through direct cytotoxic radical scavenging. 6 Other studies have shown that Brazilian red propolis attenuates hypertension and renal damage in the 5/6 renal ablation model. 7 Studies on Chinese poplar propolis have shown that it decreases oxidized low‐density lipoprotein‐induced endothelial cell injury. 8 Flavonoids extracted from propolis have the potential to inhibit pathological cardiac hypertrophy progression and heart failure. 9 Chinese propolis could be considered a healthy food option; however, its beneficial effects on the protection of healthy cardiovascular function remain elusive. It has been reported that Chinese propolis protects myocardial cells from oxidative stress‐induced damage, inhibits platelet aggregation, and attenuates endothelial dysfunction. 10 This study aimed to investigate the protective effects of Chinese water‐soluble propolis (WSP) against hypertension induced by a HS diet and to discuss the mechanisms involved.

19.
20.
Voriconazole is the mainstay of treatment for invasive fungal infections in kidney transplant recipients. Variant CYP2C19 alleles, hepatic function, and concomitant medications are directly involved in the metabolism of voriconazole. However, the drug is also associated with numerous adverse events. The purpose of this study was to identify predictors of adverse events using binary logistic regression and to model voriconazole trough concentration using multiple linear regression. We conducted a prospective analysis of 93 kidney transplant recipients treated with voriconazole and recorded 213 trough concentrations. Predictors of adverse events were voriconazole trough concentration (odds ratio (OR) 2.614, P = 0.016), cytochrome P450 2C19 (CYP2C19) phenotype, and hemoglobin (OR 0.181, P = 0.005). The predictive power of these three factors was 91.30%. We also found that CYP2C19 phenotype, hemoglobin, platelet count, and concomitant use of ilaprazole had quantitative relationships with voriconazole trough concentration. The fit coefficient of this regression equation was R2 = 0.336, indicating that the model explained 33.60% of the interindividual variability in voriconazole disposition. In conclusion, the predictors of adverse events were CYP2C19 phenotype, hemoglobin, and voriconazole trough concentration, and the determinants of trough concentration were CYP2C19 phenotype, platelet count, hemoglobin, and concomitant use of ilaprazole. Considering these factors during voriconazole use is likely to maximize treatment effect and minimize adverse events.

Study Highlights
  • WHAT IS THE CURRENT KNOWLEDGE ON THE TOPIC?
☑ Voriconazole demonstrates wide interpatient variability in serum concentrations, due in part to variant CYP2C19 alleles. Individuals who are CYP2C19 ultrarapid metabolizers have decreased trough voriconazole concentrations, delaying achievement of target blood concentrations. In comparison, poor metabolizers have increased trough concentrations and are at increased risk of adverse drug events. However, CYP2C19 genotyping cannot replace therapeutic drug monitoring, as other factors (e.g., drug interactions, hepatic function, renal function, site of infection, and comorbidities) also influence the use of voriconazole. Moreover, this association is markedly less well characterized in kidney transplant recipients. Further studies are required to ensure the intelligent use of voriconazole.
  • WHAT QUESTION DID THIS STUDY ADDRESS?
☑ This study identified predictors of the occurrence of adverse events and determinants of the magnitude of serum voriconazole trough concentration in kidney transplant recipients.
  • WHAT DOES THIS STUDY ADD TO OUR KNOWLEDGE?
☑ This paper adds to the evidence that the CYP2C19 genotype mediates voriconazole‐associated adverse events. Notably, hemoglobin concentration, a factor seldom reported previously, had a statistically significant influence on both the occurrence of adverse events and the trough concentration of voriconazole.
  • HOW MIGHT THIS CHANGE CLINICAL PHARMACOLOGY OR TRANSLATIONAL SCIENCE?
☑ Attention should be given not only to the genotype of CYP2C19 but also to other predictors, such as hemoglobin, platelet count, and drug interactions, during therapy with voriconazole in kidney transplant recipients.

Invasive fungal infections are a feared complication of kidney transplantation, occurring in 0.1–3.5% of solid organ recipients. 1 The 12‐week survival rate is only 60.7%, and 22.1% of survivors experience graft loss because of invasive fungal infections. 2 Kidney transplantation, in conjunction with calcineurin inhibitors, is regarded as the best option for patients with end‐stage kidney disease. However, immunosuppression increases the risk of opportunistic infections and induces secondary fungal infections with high mortality rates (40–60%). 1, 3 Voriconazole is the first available second‐generation triazole, and experts recommend it as primary therapy for invasive aspergillosis. 4 Clinicians also use it prophylactically to prevent severe infections in immunosuppressed organ transplant recipients. However, voriconazole exhibits nonlinear pharmacokinetics: maximum concentration and area under the plasma concentration‐time curve increase more than proportionally with dose, so its accumulation and elimination are difficult to predict. 5 Based on data from healthy individuals, voriconazole is rapidly absorbed within 2 hours after oral administration. Its oral bioavailability exceeds 90%, allowing switching between oral and intravenous formulations. Protein binding is 58%, independent of dose or plasma concentration. The mean elimination half‐life of voriconazole is generally about 6 hours, and steady‐state plasma concentrations are reached in approximately 5 days with a maintenance dose, or within 24 hours if a loading dose is administered. The volume of distribution of voriconazole is 2–4.6 L/kg. 5, 6, 7 Metabolism is hepatic, mediated by the CYP isoenzymes CYP2C9, CYP2C19, and CYP3A4 via N‐oxidation, predominantly by CYP2C19. 
8 Furthermore, voriconazole is both a substrate and an inhibitor of CYP2C19. 4 Like other CYP450 superfamily members, CYP2C19 is highly polymorphic, with 35 defined variant star (*) alleles; a gene summary of CYP2C19 is available online. 9 Of note, the CYP2C19 genotype is a significant determinant of the wide pharmacokinetic variability of voriconazole. 4, 10 Voriconazole is also associated with numerous adverse events, such as neurotoxicity, hepatotoxicity, and visual disturbances, and these adverse events correlate with concentration. 11, 12 Nevertheless, the risk factors for adverse events in kidney transplant recipients require further study. It is worth remembering that the ideal target trough concentration is not uniform, ranging from 0.5 mg/L to 6.0 mg/L. At the same time, voriconazole concentrations are affected by variant CYP2C19 alleles, age, hepatic function, concomitant medications, and inflammation, 13, 14, 15 and generally demonstrate wide interpatient variability. 16 Further studies are required to determine this variability in pathological states. Furthermore, most prior studies used classical population pharmacokinetic approaches, which are not readily applied by clinicians. The purpose of this study was to identify predictors of the occurrence of adverse events and to determine the factors governing serum voriconazole trough concentration in kidney transplant recipients.
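The binary logistic regression used in this study is reported as odds ratios rather than a full fitted equation. The sketch below therefore uses the published ORs as coefficients with a hypothetical intercept and covariate coding, to illustrate the shape of such a risk model; it is not the authors' actual fit.

```python
import math

def adverse_event_probability(trough_mg_l, normal_hemoglobin, intercept=-2.0):
    """Logistic-regression sketch of adverse-event risk in kidney
    transplant recipients on voriconazole.

    Coefficients are ln(OR) for the two published odds ratios:
      ln(2.614) per mg/L of voriconazole trough concentration,
      ln(0.181) for a protective 'normal hemoglobin' indicator (0/1).
    The intercept and the covariate coding are hypothetical; the paper
    does not report the fitted equation or the CYP2C19 coefficient.
    """
    logit = (intercept
             + math.log(2.614) * trough_mg_l
             + math.log(0.181) * normal_hemoglobin)
    return 1.0 / (1.0 + math.exp(-logit))

# Risk rises steeply with trough concentration and falls with the
# protective hemoglobin indicator, matching the direction of the ORs.
p_low = adverse_event_probability(trough_mg_l=1.0, normal_hemoglobin=1)
p_high = adverse_event_probability(trough_mg_l=5.0, normal_hemoglobin=1)
```

With OR > 1 the coefficient ln(OR) is positive, so each additional mg/L of trough concentration multiplies the odds of an adverse event by 2.614; the hemoglobin OR of 0.181 works in the opposite direction.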
