1.Latent class analysis and influencing factor study of work-related musculoskeletal disorders among operating room nurses in tertiary hospitals
Xiaogui TANG ; Li LI ; Yue ZHAO ; Ningning HU ; Feng FU ; Boya LI ; Mengru YANG ; Yinglan LI
Journal of Environmental and Occupational Medicine 2025;42(3):293-301
Background Work-related musculoskeletal disorders (WMSDs), as one of the major occupational health issues worldwide, have shown an increasing positive rate year by year. Due to the unique demands of their work, operating room nurses exhibit a higher positive rate of WMSDs than other occupational groups, necessitating active attention and intervention. Objective To estimate the prevalence of WMSDs among operating room nurses in tertiary hospitals, explore the characteristics and latent categories of WMSDs, and analyze the influencing factors associated with the occurrence of WMSDs. Method Using a randomized cluster sampling method, operating room nurses from nine tertiary hospitals in Urumqi were selected as study participants between December 2023 and January 2024. Data were collected through a general information questionnaire, an ergonomic questionnaire for operating room nurses, and the Chinese Musculoskeletal Disorders Questionnaire. Latent class analysis was employed to examine the patterns of WMSDs among the nurses, while the chi-square test and multinomial logistic regression were used to analyze the influencing factors of WMSDs. Result A total of 411 valid questionnaires were collected. The positive rate of WMSDs among operating room nurses in the tertiary hospitals of Urumqi over the past year was 91.9%. The body regions with the highest positive rates were the neck (79.1%), shoulders (70.3%), and lower back (68.1%). Latent class analysis categorized the operating room nurses into three distinct groups: a multi-site pain group, a neck-shoulder-back pain group, and a neck-lower back pain group. The multinomial logistic regression models revealed that gender, years of work experience, job strain level, ergonomic load level in the operating room, and exposure to cold or drafty working conditions were significant influencing factors for reporting WMSDs among operating room nurses. Specifically, less than 5 years of work experience, low ergonomic load, and low or moderate job strain were identified as protective factors against WMSDs, whereas exposure to cold or drafty working environments and being female were identified as risk factors. Relative to the neck-lower back pain group, membership in the neck-shoulder-back pain group was associated with low job strain (OR=0.168, 95%CI: 0.029, 0.968) and being female (OR=4.847, 95%CI: 2.506, 9.378). Likewise, relative to the neck-lower back pain group, membership in the multi-site pain group was associated with low ergonomic load (OR=0.079, 95%CI: 0.015, 0.412), low job strain (OR=0.019, 95%CI: 0.002, 0.145), moderate job strain (OR=0.080, 95%CI: 0.016, 0.401), high job strain (OR=0.132, 95%CI: 0.027, 0.647), less than 5 years of work experience (OR=0.173, 95%CI: 0.044, 0.683), being female (OR=2.424, 95%CI: 1.130, 5.200), and exposure to cold or drafty working environments (OR=3.277, 95%CI: 1.657, 6.481). Conclusion The positive rate of WMSDs among operating room nurses in tertiary hospitals in Urumqi is notably high, with distinct co-occurrence patterns observed within the population. To mitigate the risk of WMSDs, targeted health education and prevention training programs should be implemented for the different patterns of WMSDs, and proactive measures such as improving working conditions and optimizing human resource allocation should be undertaken.
These efforts will effectively reduce the incidence of WMSDs among operating room nurses and safeguard their occupational health.
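As a rough illustration of the modeling step described above (not the authors' code), the sketch below fits a multinomial logistic regression of latent-class membership on the kinds of predictors named in the abstract, using statsmodels. The variable names, codings, and simulated data are hypothetical, and the latent class analysis itself, which would normally assign the class labels, is assumed to have been run beforehand.

```python
# Hypothetical sketch: multinomial logistic regression of WMSD latent-class
# membership on candidate influencing factors (simulated data, illustrative only).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 411  # number of valid questionnaires reported in the abstract
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),         # 1 = female
    "tenure_lt5": rng.integers(0, 2, n),     # 1 = less than 5 years of work experience
    "low_ergo_load": rng.integers(0, 2, n),  # 1 = low ergonomic load in the operating room
    "job_strain": rng.integers(0, 3, n),     # 0 = low, 1 = moderate, 2 = high
    "cold_draft": rng.integers(0, 2, n),     # 1 = exposed to cold or drafty conditions
    "wmsd_class": rng.integers(0, 3, n),     # 0 = neck-lower back, 1 = neck-shoulder-back, 2 = multi-site
})

# Dummy-code the ordinal job-strain variable and add an intercept.
X = sm.add_constant(
    pd.get_dummies(df.drop(columns="wmsd_class"),
                   columns=["job_strain"], drop_first=True, dtype=float)
)
result = sm.MNLogit(df["wmsd_class"], X).fit(disp=False)

# Exponentiated coefficients give odds ratios, one column per non-reference class.
print(np.exp(result.params))
```

With the reference class set to the neck-lower back pain group, the exponentiated coefficients correspond to odds ratios of the form reported in the abstract.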
2.Introduction and implications of the Recommendations and Expert Consensus for Plasma and Platelet Transfusion Practice in Critically Ill Children: from the Transfusion and Anemia Expertise Initiative-Control/Avoidance of Bleeding (TAXI-CAB)
Lu LU ; Jiaohui ZENG ; Hao TANG ; Lan GU ; Junhua ZHANG ; Zhi LIN ; Dan WANG ; Mingyi ZHAO ; Minghua YANG ; Rong HUANG ; Rong GUI
Chinese Journal of Blood Transfusion 2025;38(4):585-594
To guide transfusion practice in critically ill children, who often need plasma and platelet transfusions, the Transfusion and Anemia Expertise Initiative-Control/Avoidance of Bleeding (TAXI-CAB) developed the Recommendations and Expert Consensus for Plasma and Platelet Transfusion Practice in Critically Ill Children. The guideline provides 53 recommendations covering plasma and platelet transfusion in critically ill children across 8 disease categories, laboratory testing, selection and treatment of plasma and platelet components, and research priorities. This paper introduces the methods used to formulate the guideline and the resulting recommendations.
3.Research advances in the gut microbiota-gut-brain axis in migraine
Journal of Apoplexy and Nervous Diseases 2025;42(7):583-587
Migraine is a complex chronic central nervous system disorder with a gradually increasing prevalence worldwide, causing a significant healthcare burden. Recent studies have shown that the gut microbiota plays a crucial role in the pathophysiological process of migraine through the bidirectional communication network of the gut-brain axis. This article systematically reviews the association and underlying mechanisms between the gut microbiota-gut-brain axis and migraine, in order to provide new perspectives for in-depth research and the clinical prevention and treatment of migraine.
4.Analysis of prognosis and influencing factors of sepsis patients receiving blood component transfusion
Bingjie ZHAO ; Bowei CAO ; Yuanpei ZHU ; Ningjie ZHANG
Chinese Journal of Blood Transfusion 2025;38(7):879-885
Objective: To identify influencing factors associated with the prognosis of sepsis patients receiving blood component transfusion, and to provide a more rational and scientific transfusion strategy for clinical management. Methods: Clinical data of 232 patients with sepsis treated at the Second Xiangya Hospital of Central South University between January 2022 and December 2023 were retrospectively analyzed. The patients were categorized into a transfusion group (n=64) and a non-transfusion group (n=168) based on whether they received transfusions, and the patients in the transfusion group were further divided into a non-survivor group (n=26) and a survivor group (n=38) based on their survival outcome. Baseline and clinical characteristics were compared between the two groups. Factors affecting the prognosis of sepsis patients undergoing blood component transfusion were identified using logistic regression. Results: Compared with the non-transfusion group, the transfusion group showed significantly higher levels of coagulation indicators (prothrombin time, activated partial thromboplastin time, international normalized ratio, D-dimer) and inflammatory markers (C-reactive protein, procalcitonin, interleukin-6), while the levels of hemoglobin, platelets, lymphocytes, fibrinogen, albumin, blood glucose, and oxygen saturation were significantly lower (P<0.05). The M(P25, P75) values for C-reactive protein (mg/L), hemoglobin (g/L), and platelet count (×10⁹/L) in the transfusion vs non-transfusion groups were 178.0(156.1-178.0) vs 102.7(74.0-119.6), 88.5(72.3-113.0) vs 110.5(101.0-121.8), and 63.0(26.5-156.5) vs 202.5(108.3-286.8), respectively (all P<0.05). Logistic regression analysis revealed that hemoglobin level, platelet count, lactate concentration, and the storage duration of transfused red blood cells were independent risk factors affecting the survival outcomes of sepsis patients receiving transfusions (P<0.05). Among transfused sepsis patients, the M(P25, P75) values for lactate concentration (mmol/L) and red blood cell storage time (d) in the non-survivor vs survivor groups were 3.5(1.9-7.7) vs 2.1(1.3-3.5) and 18.0(13.0-18.0) vs 12.0(9.0-14.0), respectively (both P<0.05). Conclusion: Compared with non-transfused sepsis patients, those receiving transfusions exhibited poorer baseline conditions, more severe infections, and worse survival outcomes. More importantly, the study found that the timing of transfusion decisions and the quality control of blood products (such as storage duration) may directly impact patient prognosis, providing critical evidence for optimizing transfusion strategies in sepsis patients.
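As a hedged sketch of the prognostic modeling described above (not the study's analysis code), the snippet below fits a binary logistic regression of survival status on the four reported predictors and prints odds ratios with 95% confidence intervals. The data and column names are hypothetical placeholders.

```python
# Hypothetical sketch: logistic regression of survival outcome in transfused
# sepsis patients on the reported candidate predictors (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 64  # transfused sepsis patients reported in the abstract
df = pd.DataFrame({
    "hemoglobin_g_l": rng.normal(88, 15, n),
    "platelet_1e9_l": rng.normal(90, 40, n).clip(5),
    "lactate_mmol_l": rng.gamma(2.0, 1.5, n),
    "rbc_storage_days": rng.integers(7, 21, n),
    "died": rng.integers(0, 2, n),  # 1 = non-survivor
})

X = sm.add_constant(df.drop(columns="died"))
fit = sm.Logit(df["died"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals for each predictor.
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```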
6.tRF Prospect: tRNA-derived Fragment Target Prediction Based on Neural Network Learning
Dai-Xi REN ; Jian-Yong YI ; Yong-Zhen MO ; Mei YANG ; Wei XIONG ; Zhao-Yang ZENG ; Lei SHI
Progress in Biochemistry and Biophysics 2025;52(9):2428-2438
Objective Transfer RNA-derived fragments (tRFs) are a recently characterized and rapidly expanding class of small non-coding RNAs, typically ranging from 13 to 50 nucleotides in length. They are derived from mature or precursor tRNA molecules through specific cleavage events and have been implicated in a wide range of cellular processes. Increasing evidence indicates that tRFs play important regulatory roles in gene expression, primarily by interacting with target messenger RNAs (mRNAs) to induce transcript degradation, in a manner partially analogous to microRNAs (miRNAs). However, despite their emerging biological relevance and potential roles in disease mechanisms, there remains a significant lack of computational tools capable of systematically predicting the interaction landscape between tRFs and their target mRNAs. Existing databases often rely on limited interaction features and lack the flexibility to accommodate novel or user-defined tRF sequences. The primary goal of this study was to develop a machine learning-based prediction algorithm that enables high-throughput, accurate identification of tRF:mRNA binding events, thereby facilitating the functional analysis of tRF regulatory networks. Methods We began by assembling a manually curated dataset of 38 687 experimentally verified tRF:mRNA interaction pairs and extracting seven biologically informed features for each pair: (1) AU content of the binding site, (2) site pairing status, (3) binding region location, (4) number of binding sites per mRNA, (5) length of the longest consecutive complementary stretch, (6) total binding region length, and (7) seed sequence complementarity. Using this dataset and feature set, we trained 4 distinct machine learning classifiers, namely logistic regression, random forest, decision tree, and a multilayer perceptron (MLP), to compare their ability to discriminate true interactions from non-interactions. Each model's performance was evaluated using overall accuracy, receiver operating characteristic (ROC) curves, and the corresponding area under the ROC curve (AUC). The MLP consistently achieved the highest AUC among the four and was therefore selected as the backbone of our prediction framework, which we named tRF Prospect. For biological validation, we retrieved 3 high-throughput RNA-seq datasets from the Gene Expression Omnibus (GEO) in which individual tRFs were overexpressed: AS-tDR-007333 (GSE184690), tRF-3004b (GSE197091), and tRF-20-S998LO9D (GSE208381). Differential expression analysis of each dataset identified genes downregulated upon tRF overexpression, which we designated as putative targets. We then compared the predictions generated by tRF Prospect against those from three established tools (tRFTar, tRForest, and tRFTarget) by quantifying the number of predicted targets for each tRF and assessing concordance with the experimentally derived gene sets. Results The proposed algorithm achieved high predictive accuracy, with an AUC of 0.934. Functional validation was conducted using transcriptome-wide RNA-seq datasets from cells overexpressing specific tRFs, confirming the model's ability to accurately predict biologically relevant downregulation of mRNA targets. When benchmarked against established tools such as tRFTar, tRForest, and tRFTarget, tRF Prospect consistently demonstrated superior performance, both in terms of predictive precision and sensitivity, as well as in identifying a higher number of true-positive interactions.
Moreover, unlike static databases that are limited to precomputed results, tRF Prospect supports real-time prediction for any user-defined tRF sequence, enhancing its applicability in exploratory and hypothesis-driven research. Conclusion This study introduces tRF Prospect as a powerful and flexible computational tool for investigating tRF:mRNA interactions. By leveraging the predictive strength of deep learning and incorporating a broad spectrum of interaction-relevant features, it addresses key limitations of existing platforms. Specifically, tRF Prospect: (1) expands the range of detectable tRF and target types; (2) improves prediction accuracy through a multilayer perceptron model; and (3) allows for dynamic, user-driven analysis beyond database constraints. Although the current version emphasizes miRNA-like repression mechanisms and faces challenges in accurately capturing 5'UTR-associated binding events, it nonetheless provides a critical foundation for future studies aiming to unravel the complex roles of tRFs in gene regulation, cellular function, and disease pathogenesis.
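The implementation is not included in the abstract, but the general workflow it describes, a multilayer perceptron trained on seven engineered interaction features and scored by ROC AUC, can be sketched with scikit-learn as follows. The feature matrix, labels, and network size here are placeholders rather than the published tRF Prospect model.

```python
# Illustrative sketch of the described workflow: an MLP over 7 interaction
# features, evaluated by ROC AUC (random placeholder data, not the curated set).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2000                   # placeholder; the curated set reportedly holds 38 687 pairs
X = rng.random((n, 7))     # 7 features: AU content, pairing status, region location, ...
y = rng.integers(0, 2, n)  # 1 = verified tRF:mRNA interaction, 0 = negative pair

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
)
clf.fit(X_tr, y_tr)

auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"test AUC = {auc:.3f}")  # the paper reports an AUC of 0.934 on its own data
```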
7.Guideline for the workflow of clinical comprehensive evaluation of drugs
Zhengxiang LI ; Rong DUAN ; Luwen SHI ; Jinhui TIAN ; Xiaocong ZUO ; Yu ZHANG ; Lingli ZHANG ; Junhua ZHANG ; Hualin ZHENG ; Rongsheng ZHAO ; Wudong GUO ; Liyan MIAO ; Suodi ZHAI
China Pharmacy 2025;36(19):2353-2365
OBJECTIVE To standardize the main processes and related technical links of the clinical comprehensive evaluation of drugs, and provide guidance and reference for improving the quality of comprehensive evaluation evidence and its transformation and application value. METHODS The construction of Guideline for the Workflow of Clinical Comprehensive Evaluation of Drugs was based on the standard guideline formulation method of the World Health Organization (WHO), strictly followed the latest definition of guidelines by the Institute of Medicine of the National Academy of Sciences of the United States, and conformed to the six major areas of the Guideline Research and Evaluation Tool Ⅱ. Delphi method was adopted to construct the research questions; research evidence was established by applying the research methods of evidence-based medicine. The evidence quality classification system of the Chinese Evidence-Based Medicine Center was adopted for evidence classification and evaluation. The recommendation strength was determined by the recommendation strength classification standard formulated by the Oxford University Evidence-Based Medicine Center, and the recommendation opinions were formed through the expert consensus method. RESULTS & CONCLUSIONS The Guideline for the Workflow of Clinical Comprehensive Evaluation of Drugs covers 4 major categories of research questions, including topic selection, evaluation implementation, evidence evaluation, and application and transformation of results. The formulation of this guideline has standardized the technical links of the entire process of clinical comprehensive evaluation of drugs, which can effectively guide the high-quality and high-efficient development of this work, enhance the standardized output and transformation application value of evaluation evidence, and provide high-quality evidence support for the scientific decision-making of health and the rationalization of clinical medication.
8.Effectiveness of narrative exposure therapy for post-traumatic stress disorder in children and adolescents: a Meta-analysis
Junyu LIU ; Jianjian WANG ; Yuan LUO ; Liping ZHAO ; Zhijing LIU
Sichuan Mental Health 2024;37(2):179-186
Background Narrative exposure therapy (NET), an integration of narrative therapy and exposure therapy, has been shown to be effective in relieving the symptoms of post-traumatic stress disorder (PTSD); it can help patients gain a deeper understanding of their trauma and is also considered to be quite safe. PTSD is highly prevalent in children and adolescents, yet the effectiveness of NET intervention varies among subjects. Objective To systematically evaluate the effectiveness of NET for PTSD in children and adolescents, so as to provide references for the clinical application of NET. Methods On August 1, 2022, the Cochrane Library, PubMed, Web of Science, CINAHL, China National Knowledge Infrastructure (CNKI), SinoMed, VIP, and Wanfang databases were searched from their inception to June 2022. The search used a combination of medical subject headings and free-text terms, and randomized controlled trials of NET for PTSD in children and adolescents were collected. The quality of the included trials was evaluated according to the Cochrane Collaboration's tool for assessing risk of bias (2011), and a Meta-analysis was performed using RevMan 5.4 software. Results Nine randomized controlled trials involving 394 children and adolescents with PTSD were included. Meta-analysis showed that NET and relaxation therapy produced comparable PTSD symptom relief within 1 to 3 months after intervention (SMD=0.22, 95% CI: -0.84 to 1.28) and at 6 months after intervention (SMD=0.21, 95% CI: -0.75 to 1.17), while NET provided greater PTSD symptom relief than routine therapy both within 1 to 3 months after intervention (SMD=-0.66, 95% CI: -1.04 to -0.27) and at 6 months after intervention (SMD=-0.77, 95% CI: -1.36 to -0.19), with statistically significant differences. Regarding the alleviation of depressive symptoms, the effect was similar between NET and routine therapy within 1 to 3 months after intervention (SMD=-0.39, 95% CI: -0.98 to 0.21) and at 6 months after intervention (SMD=-0.74, 95% CI: -2.23 to 0.75). No statistically significant difference was demonstrated between NET and routine therapy in relieving psychological distress (SMD=-0.54, 95% CI: -2.14 to 1.07) or suppressing hyperorexia (SMD=-0.17, 95% CI: -0.54 to 0.19) 1 to 3 months after intervention. Conclusion Compared with routine therapy, NET yields better medium- and long-term effectiveness in alleviating PTSD symptoms in children and adolescents, while it does not offer significant advantages in improving depressive symptoms, psychological distress, or hyperorexia.
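The pooling in this study was performed in RevMan 5.4; purely as an illustration of what such pooling computes, the sketch below applies a DerSimonian-Laird random-effects model to a set of made-up standardized mean differences (the numbers are not the study's extracted data).

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of SMDs (illustrative numbers).
import numpy as np

smd = np.array([-0.80, -0.50, -0.30, -0.60, 0.10])  # hypothetical per-study SMDs
var = np.array([0.05, 0.08, 0.12, 0.06, 0.10])      # hypothetical per-study variances

w_fixed = 1.0 / var
pooled_fe = np.sum(w_fixed * smd) / np.sum(w_fixed)  # fixed-effect pooled estimate
q = np.sum(w_fixed * (smd - pooled_fe) ** 2)         # Cochran's Q
dof = len(smd) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - dof) / c)                       # between-study variance

w = 1.0 / (var + tau2)                               # random-effects weights
pooled = np.sum(w * smd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled SMD = {pooled:.2f}, 95% CI: {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}")
```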
9.Mid to long-term clinical outcomes improvement through dual antiplatelet therapy after coronary artery bypass grafting: Interpretation of DACAB-FE trial
Jianyu QU ; Si CHEN ; Zhijian WANG ; Kang ZHOU ; Yuan ZHAO ; Ran DONG ; Dongmei SHI ; Nianguo DONG ; Zhe ZHENG
Chinese Journal of Clinical Thoracic and Cardiovascular Surgery 2024;31(08):1096-1100
Coronary artery bypass grafting (CABG) is one of the most effective revascularization treatments for coronary heart disease. Secondary prevention strategies, which rely on antiplatelet and lipid-lowering drugs, are crucial after CABG to ensure the durability of revascularization treatment effects and prevent adverse cardiovascular and cerebrovascular events in the medium to long term. Previous research conducted by Professor Zhao Qiang's team from Ruijin Hospital of Shanghai Jiao Tong University, known as the DACAB study, indicated that dual antiplatelet therapy (DAPT, specifically ticagrelor+aspirin) after CABG can enhance venous graft patency. However, it remains uncertain whether DAPT can further improve the medium to long-term clinical outcomes of CABG patients. Recently, the team reported the medium to long-term follow-up results of the DACAB study, termed the DACAB-FE study, finding that DAPT administered after CABG can reduce the incidence of major cardiovascular events over five years and improve patients' medium to long-term clinical outcomes. This article will interpret the methodological highlights and significant clinical implications of the DACAB-FE study.
10.Apatinib and gamabufotalin co-loaded lipid/Prussian blue nanoparticles for synergistic therapy to gastric cancer with metastasis
Chen BINLONG ; Zhao YANZHONG ; Lin ZICHANG ; Liang JIAHAO ; Fan JIALONG ; Huang YANYAN ; He LEYE ; Liu BIN
Journal of Pharmaceutical Analysis 2024;14(5):707-721
Due to its non-targeted release and low solubility, the anti-gastric cancer agent apatinib (Apa), a first-line drug, often induces multi-drug resistance and causes serious side effects when used long term at a high dosage. To avoid these drawbacks, lipid-film-coated Prussian blue nanoparticles (PB NPs) with hyaluronan (HA) modification were used for Apa loading to improve its solubility and targeting ability. Furthermore, the anti-tumor compound gamabufotalin (CS-6) was selected as a partner of Apa at a reduced dosage for combinational gastric cancer therapy. Thus, HA-Apa-Lip@PB-CS-6 NPs were constructed to synchronously transport the two drugs into tumor tissue. In vitro assays indicated that HA-Apa-Lip@PB-CS-6 NPs can synergistically inhibit the proliferation and invasion/metastasis of BGC-823 cells via downregulating vascular endothelial growth factor receptor (VEGFR) and matrix metalloproteinase-9 (MMP-9). In vivo assays demonstrated the strongest inhibition of tumor growth and liver metastasis with HA-Apa-Lip@PB-CS-6 NPs administration in BGC-823 cell-bearing mice compared with the other groups, owing to excellent penetration into tumor tissues and outstanding synergistic effects. In summary, we have successfully developed new nanocomplexes for synchronous Apa/CS-6 delivery and synergistic gastric cancer (GC) therapy.
