1.A Gaussian mixture-hidden Markov model of human visual behavior.
Huaqian LIU ; Xiujuan ZHENG ; Yan WANG ; Yun ZHANG ; Kai LIU
Journal of Biomedical Engineering 2021;38(3):512-519
Vision is an important way for human beings to interact with the outside world and obtain information. In order to study human visual behavior under different conditions, this paper uses a Gaussian mixture-hidden Markov model (GMM-HMM) to model the scanpath and proposes a new model optimization method, time-shifting segmentation (TSS). The TSS method highlights the characteristics of the time dimension in the scanpath, improves the pattern recognition results, and enhances the stability of the model. A linear discriminant analysis (LDA) method is used for multi-dimensional feature pattern recognition to evaluate the rationality and accuracy of the proposed model. Four sets of comparative trials were carried out for model evaluation. The first group applied the GMM-HMM to model the scanpath, and the average classification accuracy reached 0.507, which is greater than the chance level for three-class classification (0.333). The second group applied the TSS method, and the mean classification accuracy rose to 0.610. The third group combined the GMM-HMM with the TSS method, and the mean classification accuracy reached 0.602, which was more stable than that of the second model. Finally, comparing the model-based analysis results with the saccade amplitude (SA) feature analysis results shows that the modeling analysis method is much better than the basic-information analysis method. Analysis of the characteristics of the three types of tasks shows that the free-viewing task has a higher specificity value and the cued object search task a higher sensitivity. In summary, the GMM-HMM model performs well in scanpath pattern recognition, and the introduction of the TSS method enhances the differences among scanpath characteristics. The model is particularly advantageous for recognizing the scanpaths of search-type tasks, and it also provides a new solution for single-state eye movement sequences.
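As a rough illustration of the modeling step, the sketch below fits one GMM-HMM per task to fixation sequences and classifies a new scanpath by maximum log-likelihood; this simplified scoring stands in for the authors' LDA-based feature classification, and the TSS optimization is not reproduced. The feature choice (x, y, duration), the state and mixture counts, and the hmmlearn package are assumptions for illustration only.

```python
# Minimal sketch: one GMM-HMM per task, scored by log-likelihood.
# Not the authors' pipeline (which adds TSS and LDA-based classification).
import numpy as np
from hmmlearn.hmm import GMMHMM  # assumed third-party dependency

def fit_task_model(scanpaths, n_states=3, n_mix=2):
    """Fit a GMM-HMM to all scanpaths of one task.

    scanpaths: list of (T_i, 3) arrays of fixation features, e.g. (x, y, duration).
    """
    X = np.vstack(scanpaths)
    lengths = [len(s) for s in scanpaths]
    model = GMMHMM(n_components=n_states, n_mix=n_mix,
                   covariance_type="diag", n_iter=100, random_state=0)
    model.fit(X, lengths)
    return model

def classify_scanpath(scanpath, models):
    """Assign a scanpath to the task whose model gives the highest log-likelihood."""
    scores = {task: m.score(scanpath) for task, m in models.items()}
    return max(scores, key=scores.get)
```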
Algorithms
;
Discriminant Analysis
;
Eye Movements
;
Humans
;
Markov Chains
;
Normal Distribution
;
Probability
2.Progress in filters for denoising cryo-electron microscopy images.
Xin Rui HUANG ; Sha LI ; Song GAO
Journal of Peking University(Health Sciences) 2021;53(2):425-433
Cryo-electron microscopy (cryo-EM) imaging has the unique potential to bridge the gap between cellular and molecular biology. Cryo-EM three-dimensional (3D) reconstruction has therefore developed rapidly in recent years and has been applied widely in life science research to reveal the structures of large macromolecular assemblies and cellular complexes, which is critical to understanding their functions at all scales. Although technical breakthroughs in recent years, for example the introduction of the direct detection device (DDD) camera and the development of cryo-EM software tools, earned three cryo-EM pioneers the 2017 Nobel Prize, several bottleneck problems still exist that hamper further increases in the resolution of single-particle reconstruction and hold back the application of in situ subnanometer structure determination by cryo-tomography. Radiation damage is still the key limiting factor in cryo-EM. To minimize radiation damage and preserve as much resolution as possible, imaging is performed at low dose and weak contrast, which makes cryo-EM images extremely noisy, with very low signal-to-noise ratios (SNR), generally about 0.1. The high noise obscures fine details in cryo-EM images and reconstructed maps. Thus, methods to reduce the noise level and improve resolution have become an important issue. In this paper, we systematically reviewed and compared robust filters used in two branches of the cryo-EM field, single-particle analysis (SPA) and cryo-electron tomography (cryo-ET), with particular attention to their applications, such as 3D reconstruction, visualization, structural analysis, and interpretation. Conventional approaches to noise reduction in cryo-EM imaging include Gaussian, median, and bilateral filters, among others. A Gaussian filter convolves the noisy image with an appropriately chosen kernel. Although noise with larger standard deviations in cryo-EM images can be suppressed and satisfactory performance is achieved in certain cases, this filter also blurs the images and over-smooths small-scale image features. This is especially detrimental when precise quantitative information needs to be extracted. Unlike a Gaussian filter, a median filter is based on the order statistics of the image and selects the median intensity in a window of adjacent pixels to denoise the image. Although this filter is robust to outliers, it suffers from aliasing problems that may result in incorrect information for cryo-EM structure interpretation. A bilateral filter is a nonlinear filter that performs spatially weighted averaging and is more selective about the pixels allowed to contribute to the weighted sum, excluding high-frequency noise from the smoothing process. Thus, this filter can smooth out noise while maintaining edge details, which makes it similar to an anisotropic diffusion filter and distinct from a Gaussian filter, but its utility is limited when the SNR of a cryo-EM image is very low. Generally, spatial filtering methods have the disadvantage of losing image resolution when reducing noise. A wavelet transform can exploit the wavelet's natural ability to separate signal from noise at multiple image scales, allowing joint resolution in both the spatial and frequency domains, and thus has the potential to outperform existing methods. The modified wavelet shrinkage filter we developed offers a remarkable improvement in image quality with a good compromise between detail preservation and noise smoothing. We expect that our review of different filters can benefit cryo-EM applications and the interpretation of biological structures.
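The short sketch below applies generic versions of the filters discussed above to a noisy 2D array, using SciPy, scikit-image, and PyWavelets; the wavelet function is a plain universal-threshold soft shrinkage, not the modified wavelet shrinkage filter developed by the authors, and all parameter values are illustrative rather than tuned for cryo-EM data.

```python
# Comparative sketch of conventional denoising filters on a stand-in noisy image.
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter
from skimage.restoration import denoise_bilateral
import pywt

def wavelet_shrinkage(img, wavelet="db4", level=3):
    """Generic soft-threshold wavelet shrinkage (not the authors' modified filter)."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Universal threshold estimated from the finest-scale diagonal detail coefficients.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)

img = np.random.rand(256, 256)                        # stand-in for a low-SNR micrograph
smoothed   = gaussian_filter(img, sigma=2)            # blurs noise and fine detail alike
rank_based = median_filter(img, size=3)               # robust to outliers
edge_aware = denoise_bilateral(img, sigma_spatial=3)  # edge-preserving smoothing
multiscale = wavelet_shrinkage(img)                   # multi-scale shrinkage
```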
Algorithms
;
Cryoelectron Microscopy
;
Image Processing, Computer-Assisted
;
Normal Distribution
;
Signal-To-Noise Ratio
;
Tomography, X-Ray Computed
3.Segmentation of heart sound signals based on duration hidden Markov model.
Haoran KUI ; Jiahua PAN ; Rong ZONG ; Hongbo YANG ; Wei SU ; Weilian WANG
Journal of Biomedical Engineering 2020;37(5):765-774
Heart sound segmentation is a key step before heart sound classification. It refers to processing the acquired heart sound signal so as to separate the cardiac cycle into systole, diastole, and other components. To overcome the accuracy limitation of heart sound segmentation that does not rely on an electrocardiogram, an algorithm based on the duration hidden Markov model (DHMM) was proposed. Firstly, the heart sound samples were positionally labeled. Then an autocorrelation estimation method was used to estimate the cardiac cycle duration, and a Gaussian mixture distribution was used to model the duration of the sample states. Next, the hidden Markov model (HMM) was optimized on the training set and the DHMM was established. Finally, the Viterbi algorithm was used to track back the heart sound states to obtain S
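For orientation, the sketch below decodes a generic four-state cardiac cycle (S1, systole, S2, diastole) with a standard Viterbi pass; it is an assumption-laden simplification that omits the explicit duration modeling that distinguishes the DHMM described above.

```python
# Generic Viterbi decoding over a 4-state cardiac cycle (S1, systole, S2, diastole).
# The paper's DHMM additionally constrains state durations; that part is omitted here.
import numpy as np

def viterbi(log_emission, log_trans, log_start):
    """log_emission: (T, N) per-frame state log-likelihoods;
    log_trans: (N, N) state transition log-probabilities; log_start: (N,).
    Returns the most likely state index per frame."""
    T, N = log_emission.shape
    delta = np.full((T, N), -np.inf)
    psi = np.zeros((T, N), dtype=int)
    delta[0] = log_start + log_emission[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # scores[i, j]: from state i to j
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(N)] + log_emission[t]
    path = np.empty(T, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path
```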
Algorithms
;
Electrocardiography
;
Heart Sounds
;
Markov Chains
;
Normal Distribution
4.Detection of carotid intima and media thicknesses based on ultrasound B-mode images clustered with Gaussian mixture model.
Guiling QI ; Bingbing HE ; Yufeng ZHANG ; Zhiyao LI ; Hong MO ; Jie CHENG
Journal of Biomedical Engineering 2020;37(6):1080-1088
In the clinic, intima and media thickness are the main indicators for evaluating the development of atherosclerosis. At present, these indicators are measured by professional doctors manually marking the boundaries of the intima and media on B-mode images, which is complicated, time-consuming, and affected by many human factors. A gray-level threshold method based on Gaussian mixture model (GMM) clustering is therefore proposed in this paper to detect the intima and media thickness of carotid arteries from B-mode images. Firstly, the B-mode images are clustered based on the GMM, the boundary between the intima and media of the vessel wall is then detected by the gray threshold method, and finally the thicknesses of the two are measured. Compared with applying the gray threshold method directly, clustering the carotid B-mode images resolves the blurred gray-level boundary between the intima and media, thereby improving the stability and detection accuracy of the gray threshold method. In clinical trials on 120 healthy carotid arteries, the means of four manual measurements obtained by two experts were used as reference values. Experimental results show that the normalized root mean square errors (NRMSEs) of the intima and media thicknesses estimated after GMM clustering were 0.1047 ± 0.0762 and 0.0974 ± 0.0683, respectively. Compared with the results of direct gray threshold estimation, the mean NRMSEs are reduced by 19.6% and 22.4%, respectively, which indicates that the proposed method has higher measurement accuracy. The standard deviations are reduced by 17.0% and 21.7%, respectively, which indicates that the proposed method has better stability. In summary, this method is helpful for the early diagnosis and monitoring of vascular diseases such as atherosclerosis.
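A minimal sketch of the two-stage idea, intensity clustering with a GMM followed by a gray-level threshold, is given below using scikit-learn; the per-column thickness logic, the threshold value, and the pixel spacing are illustrative assumptions rather than the authors' exact procedure.

```python
# Sketch: GMM intensity clustering of a B-mode image, then a simple gray threshold.
import numpy as np
from sklearn.mixture import GaussianMixture

def cluster_bmode(image, n_clusters=3):
    """Replace each pixel by the mean intensity of its GMM cluster."""
    gmm = GaussianMixture(n_components=n_clusters, random_state=0)
    labels = gmm.fit_predict(image.reshape(-1, 1))
    return gmm.means_.ravel()[labels].reshape(image.shape)

def column_thickness_mm(clustered_roi, threshold, pixel_size_mm):
    """Crude per-column thickness of a far-wall region of interest:
    count pixels brighter than the threshold and convert to millimetres."""
    return (clustered_roi > threshold).sum(axis=0) * pixel_size_mm
```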
Carotid Arteries/diagnostic imaging*
;
Carotid Intima-Media Thickness
;
Normal Distribution
;
Ultrasonography
5.A tooth cone beam computed tomography image segmentation method based on local Gaussian distribution fitting.
Journal of Biomedical Engineering 2019;36(2):291-297
Oral tooth image segmentation plays an important role in orthodontic surgery and implant surgery. Because tooth roots are often surrounded by alveolar bone, the structure of molars is complex, and a pulp chamber usually exists inside the tooth, it is easy to over-segment or to produce spurious inner edges during tooth segmentation. In order to further improve segmentation accuracy, a segmentation algorithm based on local Gaussian distribution fitting and edge detection is proposed to solve the above problems. The algorithm combines the local pixel variance and mean values, and improves robustness by incorporating gradient information. In the experiments, the roots are segmented precisely in cone beam computed tomography (CBCT) tooth images. Segmentation results of the proposed algorithm are then compared with those of classical algorithms. The comparison shows that the proposed method can distinguish the root from the alveolar bone around it. In addition, split molars can be segmented accurately, and no inner contours appear around the pulp chamber.
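The sketch below computes two of the ingredients named above, a local Gaussian-weighted mean and variance plus a gradient-magnitude edge map; how these are combined into the segmentation energy is specific to the paper and is not reproduced here, and the window size is an assumed value.

```python
# Local Gaussian statistics and an edge map, as building blocks of the method above.
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def local_gaussian_stats(image, sigma=3.0):
    """Local mean and variance under a Gaussian window: E[x] and E[x^2] - E[x]^2."""
    mean = gaussian_filter(image, sigma)
    sq_mean = gaussian_filter(image ** 2, sigma)
    var = np.maximum(sq_mean - mean ** 2, 1e-8)   # clamp tiny negatives from rounding
    return mean, var

def gradient_magnitude(image):
    """Edge-strength map used to discourage leakage across root/alveolar boundaries."""
    gx, gy = sobel(image, axis=0), sobel(image, axis=1)
    return np.hypot(gx, gy)
```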
Algorithms
;
Computers
;
Cone-Beam Computed Tomography
;
Humans
;
Image Processing, Computer-Assisted
;
Normal Distribution
;
Tooth/diagnostic imaging
;
Tooth Root/diagnostic imaging
6.More about the basic assumptions of t-test: normality and sample size
Korean Journal of Anesthesiology 2019;72(4):331-335
Most parametric tests start from basic assumptions about the distribution of the population. The conditions required to conduct the t-test include measured values on a ratio or interval scale, simple random sampling, normally distributed data, an appropriate sample size, and homogeneity of variance. The normality test is itself a hypothesis test with Type I and Type II errors, similar to other hypothesis tests, which means that sample size necessarily influences the power of the normality test and its reliability. It is hard to find an established sample size that guarantees adequate power for the normality test. In the current article, the relationships between normality, power, and sample size are discussed. As the sample size decreased in the normality test, sufficient power was not guaranteed even at the same significance level. In the independent t-test, the change in power according to sample size and to the sample size ratio between groups was observed. When the sample size of one group was fixed and that of the other group increased, power increased to some extent, but this was not more efficient than increasing the sample sizes of both groups equally. To ensure adequate power in the normality test, a sufficient sample size is required. The power of the t-test is maximized when the sample size ratio between the two groups is 1:1.
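The small Monte-Carlo sketch below illustrates the two relationships discussed above: the power of a normality test grows with sample size, and at a fixed total sample size the two-sample t-test is most powerful for balanced groups. The Shapiro-Wilk test, the exponential alternative, and the effect size of 0.5 SD are illustrative choices, not taken from the article.

```python
# Monte-Carlo illustration: normality-test power vs n, and t-test power vs group balance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def shapiro_power(n, reps=2000, alpha=0.05):
    """Power of the Shapiro-Wilk test against a skewed (exponential) alternative."""
    hits = sum(stats.shapiro(rng.exponential(size=n)).pvalue < alpha for _ in range(reps))
    return hits / reps

def ttest_power(n1, n2, delta=0.5, reps=2000, alpha=0.05):
    """Power of the two-sample t-test for a true mean difference of `delta` SDs."""
    hits = sum(stats.ttest_ind(rng.normal(0, 1, n1),
                               rng.normal(delta, 1, n2)).pvalue < alpha
               for _ in range(reps))
    return hits / reps

for n in (10, 30, 100):
    print(f"Shapiro-Wilk power at n={n}: {shapiro_power(n):.2f}")
# Same total N of 60: balanced vs. unbalanced groups.
print(f"t-test power 30+30: {ttest_power(30, 30):.2f}, 10+50: {ttest_power(10, 50):.2f}")
```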
Biostatistics
;
Normal Distribution
;
Sample Size
7.Normality Test in Clinical Research.
Sang Gyu KWAK ; Sung Hoon PARK
Journal of Rheumatic Diseases 2019;26(1):5-11
In data analysis, given that many statistical methods assume that the population data follow a normal distribution, it is essential to check and test whether the data satisfy this normality requirement. The analytical methods to be used differ depending on whether normality is satisfied, and inconsistent results may be obtained depending on which analysis method is applied. In many clinical research papers, results are presented and interpreted without checking or testing normality. According to the central limit theorem, the distribution of the sample mean approximates a normal distribution when the sample size is above 30. However, in many clinical studies, due to cost and time restrictions during data collection, the sample size is frequently lower than 30. In this case, a normality test should be performed, and a proper statistical analysis method chosen according to whether normality is satisfied. In this regard, this paper discusses checking normality, several normality-test methods, and statistical analysis methods to use with or without normality.
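As a minimal sketch of the decision described above, the helper below runs a Shapiro-Wilk test on each group and falls back to the Mann-Whitney U test when normality is doubtful; the 0.05 cut-off and the specific pairing of tests are common conventions assumed for illustration, not prescriptions from the article.

```python
# Check normality first, then choose a parametric or non-parametric two-group comparison.
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    """Student's t-test if both samples pass Shapiro-Wilk, otherwise Mann-Whitney U."""
    if stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha:
        return "t-test", stats.ttest_ind(a, b).pvalue
    return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue
```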
Data Collection
;
Methods
;
Normal Distribution
;
Statistics as Topic
8.Survey of the use of statistical methods in Journal of the Korean Association of Oral and Maxillofacial Surgeons
Journal of the Korean Association of Oral and Maxillofacial Surgeons 2018;44(1):25-28
OBJECTIVES: This study aimed to describe recent patterns in the types of statistical test used in original articles published in the Journal of the Korean Association of Oral and Maxillofacial Surgeons. MATERIALS AND METHODS: Thirty-six original articles published in the Journal in 2015 and 2016 were examined. The type of statistical test was identified by one researcher. Descriptive statistics, such as frequency, rank, and proportion, were calculated. Graphical statistics, such as a histogram, were constructed to reveal the overall utilization pattern of statistical test types. RESULTS: Twenty-two types of statistical test were used. The type of statistical test was not reported in four original articles and was classified as unclear in 5% of cases. The four most frequently used statistical tests constituted 47% of the total and were, in descending order, the chi-square test, Student's t-test, Fisher's exact test, and the Mann-Whitney test. Regression models, such as the Cox proportional hazards model and multiple logistic regression to adjust for potential confounding variables, were used in only 6% of the studies. Normality tests, including the Kolmogorov-Smirnov test, Levene test, Shapiro-Wilk test, and Scheffé's test, were used in various forms but in only 10% of the studies. CONCLUSION: A total of 22 statistical tests were identified, with four tests accounting for almost half of the uses. Adoption of a nonparametric test is recommended when the status of normality is vague. Adjustment for confounding variables should be pursued using a multiple regression model when there are numerous potential confounding variables.
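Two of the most frequently used tests in the survey, the chi-square test and Fisher's exact test, are typically chosen by a rule of thumb based on expected cell counts. The sketch below encodes that common convention (Fisher's exact test for a 2x2 table when any expected count is below 5); the cut-off of 5 and the example counts are assumptions for illustration, not taken from the surveyed articles.

```python
# Choose between the chi-square test and Fisher's exact test for a 2x2 table.
import numpy as np
from scipy import stats

def association_test(table):
    """table: 2x2 contingency table of observed counts."""
    chi2, p, dof, expected = stats.chi2_contingency(np.asarray(table))
    if (expected < 5).any():            # small expected counts: prefer the exact test
        _, p_exact = stats.fisher_exact(table)
        return "Fisher's exact test", p_exact
    return "chi-square test", p

print(association_test([[12, 3], [4, 18]]))
```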
Confounding Factors (Epidemiology)
;
Logistic Models
;
Methods
;
Normal Distribution
;
Oral and Maxillofacial Surgeons
;
Proportional Hazards Models
9.Central limit theorem: the cornerstone of modern statistics.
Korean Journal of Anesthesiology 2017;70(2):144-156
According to the central limit theorem, the mean of a random sample of size n from a population with mean µ and variance σ² is approximately normally distributed with mean µ and variance σ²/n. Using the central limit theorem, a variety of parametric tests have been developed under assumptions about the parameters that determine the population probability distribution. Compared to non-parametric tests, which do not require any assumptions about the population probability distribution, parametric tests produce more accurate and precise estimates with higher statistical power. However, many medical researchers use parametric tests to present their data without knowing the contribution of the central limit theorem to the development of such tests. Thus, this review presents the basic concepts of the central limit theorem and its role in binomial distributions and the Student's t-test, and provides an example of the sampling distributions of small populations. A proof of the central limit theorem is also described, together with the mathematical concepts required for its near-complete understanding.
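Below is a brief simulation of the statement above, using an intentionally skewed population to show that the sample means still line up with the theoretical mean µ and variance σ²/n; the exponential population and n = 30 are arbitrary illustrative choices.

```python
# Simulate the central limit theorem: sample means vs. the theoretical mean and variance.
import numpy as np

rng = np.random.default_rng(0)
population = rng.exponential(scale=2.0, size=1_000_000)   # a skewed, non-normal population
mu, sigma2 = population.mean(), population.var()

n, reps = 30, 10_000
sample_means = rng.choice(population, size=(reps, n)).mean(axis=1)

print(f"theory  : mean={mu:.3f}, var={sigma2 / n:.3f}")
print(f"observed: mean={sample_means.mean():.3f}, var={sample_means.var():.3f}")
```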
Mathematical Concepts
;
Normal Distribution
;
Statistical Distributions
10.Practical statistics in pain research.
The Korean Journal of Pain 2017;30(4):243-249
Pain is subjective, while the statistics related to pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues are the levels of measurement scales often used in pain research, the choice between parametric and nonparametric statistics, and the problems that arise from repeated measurements. In the field of pain research, parametric statistics have often been applied erroneously. This is closely related to the scales of the data and to repeated measurements. The levels of measurement include nominal, ordinal, interval, and ratio scales, and the level of measurement affects the choice between parametric and non-parametric methods. In pain research, the most frequently used pain assessment scale is the ordinal scale, which includes the visual analogue scale (VAS). There is, however, another view that considers the VAS to be an interval or ratio scale, so that the use of parametric statistics is accepted in practice in some cases. Repeated measurements of the same subjects always complicate the statistics: the measurements are inevitably correlated with one another, which precludes the application of one-way ANOVA, in which independence between measurements is required. Repeated measures ANOVA (RMANOVA), however, permits comparison between correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are met.
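A minimal sketch of a one-way repeated-measures ANOVA in Python is shown below for pain scores measured at three time points per subject; the data frame, variable names, and scores are fabricated for illustration, and note that statsmodels' AnovaRM does not itself test the sphericity assumption discussed above (a Mauchly test, for example from the pingouin package, would be needed for that).

```python
# One-way repeated-measures ANOVA on illustrative within-subject pain scores.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

data = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "time":    ["pre", "1h", "24h"] * 4,
    "vas":     [7, 4, 3, 8, 5, 4, 6, 4, 2, 7, 6, 5],
})

# Sphericity is assumed here; AnovaRM does not check it.
result = AnovaRM(data, depvar="vas", subject="subject", within=["time"]).fit()
print(result)
```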
Analysis of Variance
;
Biostatistics
;
Normal Distribution
;
Pain Measurement
;
Visual Analog Scale
;
Weights and Measures
