Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia. In the Asian population, patients with AF have a 3.34-fold higher risk of ischemic stroke and a 2.61-fold higher risk of all-cause death than patients without AF. AF guidelines recommend oral anticoagulation (OAC) therapy for patients with non-valvular AF and a CHA2DS2-VASc score of ≥1 for men or ≥2 for women. Since their introduction as a treatment for AF, non-vitamin K antagonist oral anticoagulants (NOACs) have become widely used. Compared to warfarin, NOACs show comparable efficacy for the prevention of thromboembolic events and superior safety in terms of bleeding complications, especially intracranial hemorrhage. Appropriate dose selection is therefore a key consideration for achieving the best clinical outcome in practice. Not all NOACs share the same dose-reduction rules; dose reduction is primarily recommended according to the criteria investigated in each agent's pivotal randomized controlled trial. In this review, we focus on optimal NOAC dosing and summarize current guidelines and evidence for appropriate dosing of NOACs.
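As an example of how a trial-derived dose-reduction criterion translates into a simple rule, the sketch below encodes the apixaban criterion from its pivotal trial (reduce to 2.5 mg twice daily if at least 2 of: age ≥80 years, body weight ≤60 kg, serum creatinine ≥1.5 mg/dL); the helper function is hypothetical and for illustration only, not a clinical tool:

```python
def apixaban_dose_reduced(age_y: float, weight_kg: float, scr_mg_dl: float) -> bool:
    """Hypothetical illustration of the apixaban dose-reduction criterion:
    True if >= 2 of (age >= 80 y, weight <= 60 kg, creatinine >= 1.5 mg/dL)."""
    met = [age_y >= 80, weight_kg <= 60, scr_mg_dl >= 1.5]
    return sum(met) >= 2

# An 82-year-old patient weighing 58 kg with creatinine 1.1 mg/dL meets two
# criteria, so the reduced 2.5 mg twice-daily dose would apply.
print(apixaban_dose_reduced(82, 58, 1.1))  # True
```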
Type 2 diabetes mellitus (T2DM) is a complex disorder associated with an increased risk of atherosclerotic cardiovascular disease. Controlling the major risk factors of T2DM can reduce major adverse cardiovascular events (MACEs). Glycemic control has long been the gold standard of T2DM treatment; however, strict blood glucose control strategies have repeatedly failed to prevent cardiovascular events in key clinical trials. The 2019 American and European practice guidelines for the prevention of cardiovascular disease in patients with T2DM recommend the use of novel hypoglycemic agents, such as sodium-glucose cotransporter 2 inhibitors and glucagon-like peptide-1 receptor agonists, which have shown significant reductions in the risk of MACE despite their modest glycemic control capacity. A paradigm shift away from the glucose-centered approach to treating diabetic patients with cardiovascular disease is imperative. Based on this evidence, reducing the risk of MACE should be a primary objective of treatment.
Interrupted time series analysis is often used to evaluate the effects of healthcare policies and interventional projects using observational data. It is an epidemiological method based on the assumption that, had no intervention occurred, the trend of the pre-intervention time series would have continued unchanged into the post-intervention period. The pre-intervention series is therefore used to model the counterfactual: what the post-intervention period would have looked like without the intervention. Intervention effects can appear as an abrupt change in the outcome level (intercept) and/or as a change in the outcome trend over time (slope) after the intervention. If the expected form of the effect is specified in advance, the intervention effect can be isolated and estimated with a time series model constructed accordingly. Interrupted time series analysis is generally performed as a pre-post comparison within the intervention series alone. Recently, however, controlled interrupted time series analysis, which uses a control series alongside the intervention series, has also been used; the control series helps adjust for potential confounding by events occurring concurrently with the intervention of interest. Although interrupted time series analysis is a useful way to assess intervention effects with observational data, misleading results can be obtained if the conditions for its proper application are not met, so these conditions should be verified before applying the method.
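To make the level-and-slope model concrete, below is a minimal segmented-regression sketch in Python with statsmodels; the series length, variable names, and simulated effect sizes are all hypothetical:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly outcome series: 24 pre- and 24 post-intervention points.
rng = np.random.default_rng(0)
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)              # time since the start of the series
x = (t >= n_pre).astype(int)               # indicator: 1 after the intervention
t_after = np.where(x == 1, t - n_pre, 0)   # time elapsed since the intervention

# Simulated data: baseline trend, an abrupt level drop, a slope change, noise.
y = 50 + 0.5 * t - 8 * x - 0.3 * t_after + rng.normal(0, 2, t.size)

# Segmented regression: y = b0 + b1*t + b2*x + b3*t_after, where
# b2 estimates the level (intercept) change and b3 the slope change.
X = sm.add_constant(np.column_stack([t, x, t_after]))
fit = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 3})
print(fit.params)  # [b0, b1, b2, b3]
```

The Newey-West (HAC) covariance is used because serial autocorrelation, common in time series data, would otherwise understate the standard errors.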
Mendelian randomization (MR) is an approach that enables causal inference in observational studies. Three assumptions must be satisfied to obtain valid results: 1) the genetic variant is strongly associated with the exposure; 2) the genetic variant is independent of the outcome, given the exposure and all confounders (measured and unmeasured) of the exposure-outcome association; and 3) the genetic variant is independent of factors (measured and unmeasured) that confound the exposure-outcome relationship. MR has been used increasingly since 2011, but many researchers remain unfamiliar with how to perform it. Here, we introduce the basic concepts, assumptions, and methods of MR analysis to enable a better understanding of this approach.
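To make the estimation step concrete, here is a minimal sketch of the inverse-variance weighted (IVW) combination of per-variant Wald ratios, one of the simplest and most common MR estimators; all summary statistics below are invented for illustration:

```python
import numpy as np

# Hypothetical GWAS summary statistics for 3 independent variants
# (all numbers invented for illustration).
beta_gx = np.array([0.12, 0.09, 0.15])     # variant -> exposure effects
beta_gy = np.array([0.030, 0.020, 0.041])  # variant -> outcome effects
se_gy = np.array([0.010, 0.008, 0.012])    # SEs of the outcome effects

# Per-variant Wald ratios: causal effect implied by each variant.
ratio = beta_gy / beta_gx
se_ratio = se_gy / np.abs(beta_gx)

# Inverse-variance weighted (IVW) combination across variants.
w = 1 / se_ratio**2
beta_ivw = np.sum(w * ratio) / np.sum(w)
se_ivw = np.sqrt(1 / np.sum(w))

print(f"IVW estimate = {beta_ivw:.3f} (SE {se_ivw:.3f})")
```

The per-variant standard errors use the common first-order approximation that ignores uncertainty in the variant-exposure estimates.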