1. Strategic optimization of patient flow and staffing schemes during the COVID-19 pandemic through Operations Management in a Neonatal Intensive Care Unit
Paul Sherwin O. Tarnate ; Anna Lisa T. Ong-Lim
Acta Medica Philippina 2024;58(7):90-102
Background:
The COVID-19 pandemic posed challenges in making time-bound hospital management decisions. The University of the Philippines - Philippine General Hospital (UP-PGH) is a tertiary COVID-19 referral center located in Manila, Philippines. The growing number of mothers with suspected or confirmed COVID-19 infection, set against the few documented cases of infected infants, caused significant patient overflow and a manpower shortage in its neonatal intensive care unit (NICU).
Objective:
We present an evaluated scheme for NICU bed reallocation to maximize capacity performance, staff rostering, and resource conservation, while preserving COVID-19 infection prevention and control measures.
Methods:
Existing process workflows were translated into operational models, which helped create a solution that modified the cohorting and testing schemes. Staffing models were transitioned to meet patient flow. Outcome measurements were obtained, and feedback was monitored during the implementation phase.
Results:
The scheme evaluation demonstrated benefits in (a) achieving shorter COVID-19 subunit length of stay; (b) better occupancy rates with minimal overflows; (c) workforce shortage mitigation with increased non-COVID workforce pool; (d) reduced personal protective equipment requirements; and (e) zero true SARS-CoV-2 infections.
Conclusion:
Designed for hospital operations leaders and stakeholders, this operations process can aid in hospital policy formulation in modifying cohorting schemes to maintain quality NICU care and service during the COVID-19 pandemic.
COVID-19; Operations Research; Intensive Care Units, Neonatal
2. Computer-assisted simulations using R and RStudio to assist in operations research and analysis in the context of clinical laboratory management: A gentle introduction and simple guide for pathologists and laboratory professionals
Mark Angelo Ang ; Karen Cybelle Sotalbo
Philippine Journal of Pathology 2024;9(2):38-52
Operations research (OR) is a valuable yet underutilized field in clinical laboratory management, offering practical solutions to optimize workflows, resource allocation, and decision-making. Despite its potential, the adoption of OR methodologies remains limited due to a lack of training and familiarity among pathologists and laboratory professionals. This paper addresses this gap by presenting an accessible introduction and practical guide to analyzing operations research problems in clinical laboratories using computer-assisted simulations in R, implemented within the RStudio environment.
The proposed framework emphasizes simplicity and flexibility, leveraging the extensive capabilities of base R to model and analyze critical OR questions. The paper outlines step-by-step methods for defining problems, constructing simulation models, and interpreting results, ensuring that readers can replicate and adapt these techniques to their unique laboratory contexts.
Key features of the framework include its emphasis on reproducibility, customization, and the integration of data-driven insights into decision-making processes. Case studies and examples drawn from real-world laboratory scenarios illustrate the application of R simulations to address challenges such as minimizing turnaround times, balancing staffing levels, and managing inventory efficiently.
This guide aims to empower laboratory professionals and pathologists with the tools and skills to integrate operations research into their practice, fostering a culture of innovation and efficiency in clinical settings. By bridging the gap between OR theory and practical application, this paper contributes to the broader adoption of computational approaches in laboratory management, ultimately enhancing the quality and sustainability of healthcare services.
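The style of simulation the guide describes can be sketched briefly. The authors work in base R; the following Python analogue shows the same kind of Monte Carlo model for one of the example problems, estimating specimen turnaround time at a single analyzer station. All arrival rates, service times, and counts below are invented for illustration, not taken from the paper.

```python
import random

def simulate_turnaround(n_specimens=1000, mean_interarrival=5.0,
                        mean_service=4.0, seed=42):
    """Single-station lab queue with exponential arrivals and service times.
    Returns the mean turnaround (waiting + processing) in minutes."""
    rng = random.Random(seed)
    t_arrival = 0.0      # clock time of the current specimen's arrival
    analyzer_free_at = 0.0
    turnarounds = []
    for _ in range(n_specimens):
        t_arrival += rng.expovariate(1.0 / mean_interarrival)
        start = max(t_arrival, analyzer_free_at)   # wait if analyzer is busy
        service = rng.expovariate(1.0 / mean_service)
        analyzer_free_at = start + service
        turnarounds.append(analyzer_free_at - t_arrival)
    return sum(turnarounds) / len(turnarounds)

print(round(simulate_turnaround(), 2))
```

Rerunning with different staffing or demand parameters (e.g., a shorter `mean_service` after adding an analyzer) gives the kind of what-if comparison the paper advocates.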
Human; Operations Research
3. Model construction and software design of computed tomography radiation system based on visualization.
Ying LIU ; Ting MENG ; Haowei ZHANG ; Heqing LU
Journal of Biomedical Engineering 2023;40(5):989-995
The Monte Carlo N-Particle (MCNP) code is often used to calculate the radiation dose delivered during computed tomography (CT) scans. However, the physical calculation process of the model is complicated, the structure of the program's input files is complex, and three-dimensional (3D) display of the geometric model is not supported, so researchers cannot easily establish an accurate CT radiation system model, which affects the accuracy of the dose calculation results. To address these problems, this study designed software that visualizes CT modeling and automatically generates the input files. The model calculations integrate CT modeling improvement schemes published by other researchers. For 3D model visualization, LabVIEW was used as the development platform, constructive solid geometry (CSG) served as the algorithmic principle, and editing of MCNP input files was introduced to visualize CT geometry modeling. Compared with a CT model established in a recent study, the root mean square error between the results simulated by this visual CT modeling software and actual measurements was smaller. In conclusion, the proposed CT visualization modeling software not only helps researchers obtain an accurate CT radiation system model, but also offers a new approach to visualizing geometric modeling for MCNP.
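Constructive solid geometry, the algorithmic principle named in the abstract, builds complex solids by combining primitives with Boolean operations; MCNP cell definitions work the same way with signed surface senses. A minimal point-membership sketch in Python (the shapes and dimensions are invented for illustration, not taken from the software described):

```python
# CSG primitives are membership predicates; combinators produce new solids.
def sphere(cx, cy, cz, r):
    return lambda x, y, z: (x-cx)**2 + (y-cy)**2 + (z-cz)**2 <= r*r

def box(x0, x1, y0, y1, z0, z1):
    return lambda x, y, z: x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def union(a, b):        return lambda x, y, z: a(x, y, z) or b(x, y, z)
def intersection(a, b): return lambda x, y, z: a(x, y, z) and b(x, y, z)
def difference(a, b):   return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# Toy example: a cube with a spherical cavity removed.
solid = difference(box(-2, 2, -2, 2, -2, 2), sphere(0, 0, 0, 1))
print(solid(1.5, 1.5, 0))   # inside the box, outside the cavity -> True
print(solid(0.2, 0.0, 0))   # inside the removed sphere -> False
```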
Radiation Dosage; Software Design; Tomography, X-Ray Computed/methods*; Software; Algorithms; Phantoms, Imaging; Monte Carlo Method
4. Evaluation of PET Mainstream Scattering Correction Methods.
Zhipeng SUN ; Ming LI ; Jian MA ; Jinjin MA ; Guodong LIANG
Chinese Journal of Medical Instrumentation 2023;47(1):47-53
OBJECTIVE:
Current mainstream PET scattering correction methods are introduced and compared side by side, and the remaining problems and development directions of scattering correction are discussed.
METHODS:
Based on the NeuWise Pro PET/CT product of Neusoft Medical System Co., Ltd., a simulation experiment was carried out to evaluate the influence of the radionuclide distribution outside the FOV (field of view) on the scattering estimation accuracy of each method.
RESULTS:
Scattering events produced by radionuclides outside the FOV have an obvious impact on the spatial distribution of scattering and should be considered in the model. The scattering estimation accuracy of the Monte Carlo method is higher than that of single scatter simulation (SSS).
CONCLUSIONS:
Clinically, if the activity of adjacent regions outside the FOV is high, such as the brain, liver, kidneys, and bladder, the scattering estimation is likely to be biased. Accounting for the out-of-FOV radionuclide distribution in Monte Carlo scattering estimation helps improve the accuracy of the estimated scattering distribution.
Positron Emission Tomography Computed Tomography; Scattering, Radiation; Computer Simulation; Brain; Monte Carlo Method; Phantoms, Imaging; Image Processing, Computer-Assisted
5. Comparison of 7 methods for sample size determination based on confidence interval estimation for a single proportion.
Mi Lai YU ; Xiao Tong SHI ; Bi Qing ZOU ; Sheng Li AN
Journal of Southern Medical University 2023;43(1):105-110
OBJECTIVE:
To compare different methods for calculating sample size based on confidence interval estimation for a single proportion with different event incidences and precisions.
METHODS:
We compared 7 methods for confidence interval estimation for a single proportion, namely Wald, Agresti-Coull (add z²), Agresti-Coull (add 4), Wilson Score, Clopper-Pearson, Mid-p, and Jeffreys. The sample size was calculated using the search method with different parameter settings (proportion of specified events and half-width of the confidence interval [ω = 0.05, 0.1]). With Monte Carlo simulation, the estimated sample size was used to simulate and compare the width of the confidence interval, the coverage of the confidence interval, and the ratio of the non-coverage probabilities.
RESULTS:
For a high precision requirement (ω = 0.05), the Mid-p method and Clopper-Pearson method performed better when the incidence of events was low (p < 0.15). In other settings, the performance of the 7 methods did not differ significantly except for the poor symmetry of the Wald method. In the setting of ω = 0.1 with a very low p (0.01-0.05), iteration failed for nearly all the methods except the Clopper-Pearson method.
CONCLUSION:
Different sample size determination methods based on confidence interval estimation should be selected for single proportions with different parameter settings.
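As an illustration of the search method the abstract describes, here is a sketch of one of the seven intervals, the Wilson score interval, with a search for the smallest n whose half-width does not exceed ω. The p and ω values below are illustrative only; the paper evaluates a much richer grid of settings.

```python
import math

def wilson_halfwidth(p, n, z=1.959964):
    """Half-width of the Wilson score interval for an observed proportion p
    and sample size n (z defaults to the two-sided 95% normal quantile)."""
    denom = 1 + z*z/n
    return (z / denom) * math.sqrt(p*(1-p)/n + z*z/(4*n*n))

def sample_size_wilson(p, omega, z=1.959964, n_max=100000):
    """Search method: smallest n whose Wilson half-width is <= omega."""
    for n in range(2, n_max):
        if wilson_halfwidth(p, n, z) <= omega:
            return n
    return None  # omega not reachable within n_max

print(sample_size_wilson(0.5, 0.05))
```

The same search loop works for any of the seven interval formulas; only `wilson_halfwidth` needs to be swapped out.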
Confidence Intervals; Sample Size; Computer Simulation; Monte Carlo Method; Probability
6. Influence of group sample size on statistical power of tests for quantitative data with an imbalanced design.
Qihong LIANG ; Xiaolin YU ; Shengli AN
Journal of Southern Medical University 2020;40(5):713-717
OBJECTIVE:
To explore the relationship between group sample sizes and the statistical power of ANOVA and the Kruskal-Wallis test under an imbalanced design.
METHODS:
The sample sizes for the two tests were estimated by a SAS program with given parameter settings, and Monte Carlo simulation was used to examine the changes in power when the total sample size varied or remained fixed.
RESULTS:
In ANOVA, when the total sample size was fixed, increasing the sample size in the group with a larger mean square error improved the statistical power, but an excessively large difference in sample sizes between groups reduced the power. When the total sample size was not fixed, a larger mean square error in the group with increased sample size was associated with a greater increase in statistical power. In the Kruskal-Wallis test, when the total sample size was fixed, increasing the sample size in groups with large mean square errors increased the statistical power irrespective of the sample size difference between groups; when the total sample size was not fixed, a larger mean square error in the group with increased sample size resulted in increased statistical power, with an increment similar to that for a fixed total sample size.
CONCLUSIONS:
The relationship between statistical power and group sample sizes is affected by the mean square error; increasing the sample size in a group with a large mean square error increases the statistical power. In the Kruskal-Wallis test, increasing the sample size in a group with a large mean square error is more cost-effective than increasing the total sample size to improve the statistical power.
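The simulation design described, estimating power by repeated sampling under an unbalanced one-way layout, can be sketched as follows. The study used SAS; this standard-library Python sketch draws the ANOVA critical value from a simulated null distribution instead of an F-table, and all group sizes, means, and standard deviations below are invented for illustration.

```python
import random
import statistics

def f_stat(groups):
    """One-way ANOVA F statistic for a list of samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.fmean(g) - grand)**2 for g in groups)
    ss_within = sum(sum((x - statistics.fmean(g))**2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def power_anova(ns, means, sds, alpha=0.05, reps=2000, seed=0):
    """Monte Carlo power of one-way ANOVA for unbalanced normal groups.
    The critical value comes from a simulated null distribution, so no
    F-distribution quantile function is needed."""
    rng = random.Random(seed)
    draw = lambda ms: [[rng.gauss(m, s) for _ in range(n)]
                       for n, m, s in zip(ns, ms, sds)]
    null = sorted(f_stat(draw([0.0] * len(ns))) for _ in range(reps))
    crit = null[int((1 - alpha) * reps)]
    hits = sum(f_stat(draw(means)) > crit for _ in range(reps))
    return hits / reps

# Illustrative unbalanced design: extra observations in the noisier group.
print(power_anova(ns=[20, 20, 40], means=[0, 0, 0.5], sds=[1, 1, 2]))
```

Varying `ns` while holding the total fixed reproduces the kind of comparison the study reports.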
Computer Simulation; Models, Statistical; Monte Carlo Method; Sample Size
7. A Fluorescence Diffusion Optical Tomography System Based on Lattice Boltzmann Forward Model.
Xingxing CEN ; Zhuangzhi YAN ; Huandi WU
Chinese Journal of Medical Instrumentation 2020;44(1):1-6
Fluorescence diffuse optical tomography (FDOT) is significant for biomedical applications such as medical diagnostics and drug research. The fluorescence probe distribution in biological tissues can be obtained quantitatively and non-invasively via FDOT, enabling target positioning and detection. To reduce the cost of FDOT, this study designs an FDOT system based on a Lattice Boltzmann forward model. The system realizes two functions, light propagation simulation and FDOT reconstruction, and is composed of a parameter module, an algorithm module, a result display module, and a data interaction module. To verify the effectiveness of the platform, this study carries out a light propagation simulation experiment and an FDOT reconstruction experiment, comparing the results against Monte Carlo (MC) light propagation simulations and against the true position of the light source to be reconstructed, respectively. Experiments show that the proposed FDOT system is reliable and holds strong potential for wider adoption.
Algorithms; Computer Simulation; Monte Carlo Method; Optical Devices; Reproducibility of Results; Tomography, Optical
8. Study of clustered damage in DNA after proton irradiation based on density-based spatial clustering of applications with noise algorithm.
Jing TANG ; Pengcheng ZHANG ; Qinfeng XIAO ; Jie LI ; Zhiguo GUI
Journal of Biomedical Engineering 2019;36(4):633-642
Simulations of deoxyribonucleic acid (DNA) molecular damage with an atom-level geometric model use a traversal algorithm that is time-consuming, converges slowly, and requires high-performance computing. Therefore, this work presents a density-based spatial clustering of applications with noise (DBSCAN) algorithm based on the spatial distributions of energy depositions and hydroxyl radicals (·OH). Combined with probability and statistics, the algorithm quickly yields DNA strand break yields and helps to study how clustered DNA damage varies. First, we simulated the transport of protons and secondary particles through the nucleus, as well as the ionization and excitation of water molecules, using Geant4-DNA, a Monte Carlo simulation toolkit for radiobiology, and obtained the distributions of energy depositions and hydroxyl radicals. We then used damage probability functions to generate the spatial distribution dataset of DNA damage points in a simplified geometric model. The DBSCAN algorithm, driven by the density of damage points, was used to determine the single-strand break (SSB) yield and double-strand break (DSB) yield. Finally, we analyzed how the DNA strand break yields vary with particle linear energy transfer (LET) and summarized the variation pattern of damage clusters. The simulation results show that the new algorithm is faster than the traversal algorithm, gives good precision, and is consistent with other experiments and simulations. This work provides more precise information on clustered DNA damage induced by proton radiation at the molecular level, at high speed, offering an essential and powerful research method for studying the mechanisms of radiation-induced biological damage.
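DBSCAN itself is compact enough to sketch. The naive O(n²) Python version below clusters toy one-dimensional "damage site" coordinates; the coordinates, `eps`, and `min_pts` are invented for illustration and are not the parameters used in the study.

```python
def dbscan(points, eps, min_pts):
    """Naive O(n^2) DBSCAN. Returns one cluster label per point (-1 = noise)."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b)**2 for a, b in zip(points[i], q)) <= eps*eps]
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1           # provisionally noise
            continue
        cluster += 1                 # i is a core point: start a new cluster
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts: # j is also a core point: keep expanding
                queue.extend(nbrs)
    return labels

# Toy damage sites along a DNA axis: two dense clusters plus one isolated hit.
sites = [(0.0,), (1.0,), (1.5,), (10.0,), (10.4,), (11.0,), (50.0,)]
labels = dbscan(sites, eps=2.0, min_pts=2)
print(labels)   # -> [0, 0, 0, 1, 1, 1, -1]
```

In the study's setting, clusters containing break sites on both strands would be counted toward the DSB yield, isolated break sites toward the SSB yield.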
Algorithms; Computer Simulation; DNA/radiation effects; DNA Damage; Linear Energy Transfer; Monte Carlo Method; Protons
9. Randomization in clinical studies
Korean Journal of Anesthesiology 2019;72(3):221-232
The randomized controlled trial is widely accepted as the best design for evaluating the efficacy of a new treatment because of the advantages of randomization (random allocation). Randomization eliminates accidental bias, including selection bias, and provides a basis for the use of probability theory. Despite its importance, randomization is often not properly understood. This article introduces the different randomization methods with examples: simple randomization; block randomization; adaptive randomization, including minimization; and response-adaptive randomization. Ethics related to randomization are also discussed. The article is helpful in understanding the basic concepts of randomization and how to implement them using R software.
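Of the methods listed, block randomization is the easiest to sketch: within each block every arm appears equally often, so group sizes stay balanced throughout accrual. The article's examples use R; the Python version below is an equivalent sketch, with block size and arm labels chosen for illustration.

```python
import random

def block_randomize(n_subjects, block_size=4, arms=("A", "B"), seed=2024):
    """Block randomization: shuffle each block of equal arm counts,
    then concatenate blocks until n_subjects allocations exist."""
    assert block_size % len(arms) == 0, "block size must be a multiple of arms"
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)           # random order within the block
        allocation.extend(block)
    return allocation[:n_subjects]

seq = block_randomize(12)
print(seq)
print(seq.count("A"), seq.count("B"))   # balanced: 6 and 6
```

Because allocation within each block is predictable near the block's end, block sizes are often varied or concealed in practice.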
Bias (Epidemiology); Ethics; Probability Theory; Random Allocation; Selection Bias
10. Designing optimized food intake patterns for Korean adults using linear programming (II): adjustment of the optimized food intake pattern by establishing stepwise intake goals of sodium
Kana ASANO ; Hongsuk YANG ; Youngmi LEE ; Meeyoung KIM ; Jihyun YOON
Journal of Nutrition and Health 2019;52(4):342-353
PURPOSE: The Dietary Reference Intakes for Koreans (KDRIs) suggest that sodium intake should be less than 2,000 mg, a goal thought to be infeasible with a typical Korean diet. This study aimed to obtain new sodium intake goals that are more feasible to achieve, and to design optimized food intake patterns for Korean adults by linear programming. METHODS: Data from one-day 24-hour dietary recalls of the 2010 ~ 2014 Korea National Health and Nutrition Survey were used to quantify the food items that Korean adults usually consumed. These food items were categorized into seven groups and 24 subgroups. The mean intakes and intake distributions of the food groups and subgroups were calculated for eight age (19 ~ 29, 30 ~ 49, 50 ~ 64, and over 65 years old) and gender (male and female) groups. A linear programming model was constructed to minimize the difference between the optimized and mean intakes of the food subgroups while meeting the KDRIs for energy and 13 nutrients and not exceeding the typical quantities of each food subgroup consumed by the respective age and gender groups. In the initial solution of the linear programming, the optimized intake of seasonings, including salt, was calculated as 0 g for all age and gender groups when the sodium constraint was set not to exceed 2,000 mg. Therefore, the sodium constraint was progressively increased by 100 mg until the optimized intake of seasonings came closest to the 25th percentile of the intake distribution of seasonings for the respective age and gender groups. RESULTS: Optimized food intake patterns were obtained mathematically by linear programming when the sodium constraint values were 3,600 mg, 4,500 mg, 4,200 mg, 3,400 mg, 2,800 mg, 3,100 mg, 3,100 mg, and 2,500 mg for the eight age and gender groups.
CONCLUSION: The optimized food intake patterns for Korean adults were designed by linear programming after increasing the sodium constraint values from 2,000 mg to 2,500 ~ 4,500 mg according to the age and gender groups. The resulting patterns suggest that current diets should be modified to increase the intake of vegetables for all groups, milk/dairy products for the female groups, and fruits for the female groups except for females aged 50 ~ 64 years.
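The stepwise adjustment described, raising the sodium cap from 2,000 mg in 100 mg increments until the optimization becomes feasible, can be sketched in miniature. The foods, serving limits, and energy goal below are invented; and with only a single nutrient constraint, the inner minimization reduces to a greedy (fractional knapsack) choice rather than the full linear program the study solves.

```python
def min_sodium_for_energy(foods, energy_goal):
    """foods: (name, kcal/serving, mg sodium/serving, max servings).
    Fill the energy goal preferring foods with the least sodium per kcal;
    with one nutrient constraint this greedy order is optimal."""
    total_na, need = 0.0, energy_goal
    for name, kcal, na, cap in sorted(foods, key=lambda f: f[2] / f[1]):
        servings = min(cap, need / kcal)
        total_na += servings * na
        need -= servings * kcal
        if need <= 1e-9:
            return total_na
    return None  # energy goal unreachable within serving limits

def stepwise_sodium_goal(foods, energy_goal, start=2000, step=100):
    """Raise the sodium cap in fixed steps until a feasible pattern exists."""
    best = min_sodium_for_energy(foods, energy_goal)
    if best is None:
        return None
    cap = start
    while cap < best:
        cap += step
    return cap   # smallest step value at or above the achievable minimum

foods = [  # name, kcal/serving, sodium mg/serving, max servings (invented)
    ("rice",         300,  10, 4),
    ("kimchi",        30, 600, 3),
    ("soup",         150, 900, 2),
    ("grilled fish", 200, 300, 2),
]
print(stepwise_sodium_goal(foods, energy_goal=1900))
```

The study's version solves a full LP over 24 food subgroups and 13 nutrient constraints at each step, but the relaxation loop has the same shape.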
Adult; Diet; Eating; Female; Fruit; Humans; Korea; Nutrition Surveys; Nutritional Requirements; Programming, Linear; Recommended Dietary Allowances; Seasons; Sodium; Vegetables

