This study presents an estimation approach to non-life insurance claim counts relating to a specified time. The objective is to estimate the parameters of the non-life insurance claim counting process, including the homogeneous Poisson process (HPP) and the non-homogeneous Poisson process (NHPP) with a bell-shaped intensity. We use an estimating function, the zero mean martingale (ZMM), as the procedure for parameter estimation in the insurance claim counting process. Then Λ(t), the compensator of N(t), is proposed for the number of claims in the time interval (0,t]. We present situations through a simulation study of both processes on the time interval (0,t]. Some situations in the simulation study are depicted by a sample path of N(t) together with its compensator Λ(t). In addition, an example of the claim counting process illustrates the result of compensator estimate misspecification.
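The abstract does not give the exact bell-shaped intensity, so the minimal sketch below assumes a Gaussian bell λ(t) = a·exp(−(t−c)²/(2b²)) with hypothetical parameters a, c, b, and shows two ingredients such a study works with: simulating NHPP claim arrivals by Lewis-Shedler thinning and evaluating the compensator Λ(t) = ∫₀ᵗ λ(s) ds in closed form.

```python
import numpy as np
from scipy.stats import norm

def bell_intensity(t, a=50.0, c=5.0, b=1.5):
    """Bell-shaped claim intensity lambda(t); parameters are hypothetical."""
    return a * np.exp(-((t - c) ** 2) / (2 * b ** 2))

def compensator(t, a=50.0, c=5.0, b=1.5):
    """Lambda(t) = integral_0^t lambda(s) ds, in closed form via the normal CDF."""
    return a * b * np.sqrt(2 * np.pi) * (norm.cdf((t - c) / b) - norm.cdf(-c / b))

def simulate_nhpp(T=10.0, a=50.0, c=5.0, b=1.5, rng=None):
    """Simulate claim arrival times on (0, T] by Lewis-Shedler thinning."""
    rng = rng or np.random.default_rng(0)
    lam_max = bell_intensity(c, a, c, b)          # the bell peaks at t = c
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)       # candidate from HPP(lam_max)
        if t > T:
            return np.array(times)
        if rng.uniform() < bell_intensity(t, a, c, b) / lam_max:
            times.append(t)                       # accept with prob lambda(t)/lam_max

claims = simulate_nhpp()
print(len(claims), "claims; E[N(10)] =", round(compensator(10.0), 1))
```

By construction, N(t) minus this Λ(t) is a zero mean martingale, which is what makes the ZMM estimating-function approach workable.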
An important problem of actuarial risk management is the calculation of the probability of ruin. Using probability theory and the definition of the Laplace transform, one obtains expressions, in the classical risk model, for survival probabilities in a finite time horizon. Explicit solutions are then found by inverting the double Laplace transform, using algebra, the complex Laplace inversion formula, and Matlab, for the exponential claim amount distribution.
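For intuition, the infinite-horizon ruin probability has a well-known closed form when claim amounts are exponential. The sketch below compares it against a Monte Carlo estimate of the finite-horizon probability; this stands in for, and is not, the paper's double-Laplace inversion, and all numeric parameters are chosen arbitrarily.

```python
import numpy as np

def ruin_prob_finite(u, lam, beta, c, T, n_paths=20000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability psi(u, T)
    in the classical Cramer-Lundberg model: Poisson(lam) claim arrivals,
    Exp(beta) claim sizes, premium income at rate c."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, s = 0.0, u
        while True:
            w = rng.exponential(1.0 / lam)               # inter-arrival time
            t += w
            if t > T:
                break
            s += c * w - rng.exponential(1.0 / beta)     # premiums earned minus claim
            if s < 0:
                ruined += 1
                break
    return ruined / n_paths

def ruin_prob_infinite(u, lam, beta, c):
    """Closed form for exponential claims (valid when c > lam/beta):
    psi(u) = (lam / (c*beta)) * exp(-(beta - lam/c) * u)."""
    return lam / (c * beta) * np.exp(-(beta - lam / c) * u)

print("finite horizon :", ruin_prob_finite(u=5.0, lam=1.0, beta=1.0, c=1.2, T=50.0))
print("infinite bound :", ruin_prob_infinite(u=5.0, lam=1.0, beta=1.0, c=1.2))
```

The finite-horizon estimate should approach the closed-form value from below as T grows, which gives a quick sanity check on either computation.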
The aim of this study is to propose an estimation approach to non-life insurance claim counts related to the insurance claim counting process, including the non-homogeneous Poisson process (NHPP) with a bell-shaped intensity and a beta-shaped intensity. An estimating function, the zero mean martingale (ZMM), is used as the procedure for parameter estimation of the insurance claim counting process, and the parameters of the model claim intensity are estimated by the Bayesian method. Then Λ(t), the compensator of N(t), is proposed for the number of claims in the time interval (0,t]. Given the process over the time interval (0,t], situations are presented through a simulation study, and some of these situations are depicted by a sample path relating N(t) to its compensator Λ(t).
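The beta-shaped intensity and the Bayesian step can be illustrated with a toy grid-approximation posterior. The intensity form k times a Beta(a,b) density, the Gamma(2,1) prior, and the claim times below are assumptions for illustration, not the paper's specification.

```python
import numpy as np
from scipy.stats import beta as beta_dist

# Hypothetical beta-shaped intensity on (0, 1]: lambda(t) = k * Beta(a, b) pdf.
def intensity(t, k, a, b):
    return k * beta_dist.pdf(t, a, b)

def log_lik(times, T, k, a, b):
    """NHPP log-likelihood: sum_i log lambda(t_i) - Lambda(T)."""
    Lam_T = k * beta_dist.cdf(T, a, b)        # integral of k * pdf over (0, T]
    return np.sum(np.log(intensity(times, k, a, b))) - Lam_T

# Grid-approximated posterior for k (shapes a, b fixed), Gamma(2, 1) prior on k.
times = np.array([0.21, 0.34, 0.40, 0.47, 0.52, 0.58, 0.66])   # toy claim times
ks = np.linspace(1, 30, 300)
log_post = np.array([log_lik(times, 1.0, k, 2.0, 2.0) for k in ks]) \
           + (2 - 1) * np.log(ks) - ks        # log Gamma(2,1) prior, up to a constant
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mean of k:", (ks * post).sum())
```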
Red blood cell (RBC) counting is a standard medical test that can help diagnose various conditions and diseases. Manual counting of blood cells is highly tedious and time consuming, so newer methods for counting blood cells customarily employ electronic and computer-assisted techniques. Image segmentation, a classical task in most image processing applications, can be used to count blood cells in a microscopic image. In this research work, an approach for erythrocyte counting is proposed. We perform classification before counting, and a new segmentation idea is applied to the complex overlapping clusters in a microscopic smear image. Experimental results show that the proposed method achieves higher counting accuracy and performs much better than most existing counting algorithms when three or more RBCs overlap into a complex group. The average total erythrocyte counting accuracy of the proposed method reaches 92.9%. Funding: This work was supported by the 863 National Plan Foundation of China under Grant No. 2007AA01Z333 and the Special Grand National Project of China under Grant No. 2009ZX02204-008.
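The paper's own segmentation idea is not reproduced here, but a common baseline for splitting overlapping erythrocytes, a distance-transform watershed with scikit-image, sketches the counting pipeline. The Otsu threshold, min_distance, and the file name are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import feature, filters, measure, segmentation
from skimage.io import imread

def count_rbcs(path):
    """Count erythrocytes, splitting overlapping clusters with a
    distance-transform + watershed step (a common stand-in for the
    paper's segmentation of complex overlapping groups)."""
    gray = imread(path, as_gray=True)
    mask = gray < filters.threshold_otsu(gray)        # cells darker than background
    mask = ndi.binary_fill_holes(mask)
    dist = ndi.distance_transform_edt(mask)
    peaks = feature.peak_local_max(dist, min_distance=7, labels=measure.label(mask))
    markers = np.zeros(dist.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    labels = segmentation.watershed(-dist, markers, mask=mask)
    return labels.max()

# print(count_rbcs("smear.png"))   # hypothetical image file
```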
Glaucoma is a multifactorial optic neuropathy characterized by the damage and death of retinal ganglion cells. The disease results in vision loss and blindness. Any vision loss resulting from the disease cannot be restored, and there is currently no cure for glaucoma; however, early detection and treatment can offer neuronal protection and avoid later serious damage to visual function. A full understanding of the etiology of the disease will still require many scientific efforts. Glial activation has been observed in glaucoma, with microglial proliferation being a hallmark of this neurodegenerative disease. A typical project studying these cellular changes in glaucoma often needs thousands of images, from several animals, covering different layers and regions of the retina. The gold standard for evaluating them is the manual count, which requires a large amount of time from specialized personnel and is a tedious process prone to human error. We present here a new method to count microglial cells using a computer algorithm. It counts in one hour the same number of images that a researcher counts in four weeks, with no loss of reliability. Funding: This work was supported by the Science Foundation of Arizona through the Bisgrove Program (Grant No. BSP 0529-13), the Ophthalmological Network OFTARED (RD12-0034/0002) of the Institute of Health Carlos III, the PN I+D+i 2008-2011, the ISCIII-Subdireccion General de Redes y Centros de Investigación Cooperativa, the European Programme FEDER, the project SAF2014-53779-R, and the project "The role of encapsulated NSAIDs in PLGA microparticles as a neuroprotective treatment" funded by the Spanish Ministry of Economy and Competitiveness.
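As a rough illustration of automated soma counting (not the authors' published pipeline), a threshold-label-filter sketch could look as follows; the Otsu threshold, opening step, and min_area are assumed calibrations.

```python
from scipy import ndimage as ndi
from skimage import filters, measure
from skimage.io import imread

def count_microglia(path, min_area=30):
    """Sketch of automated soma counting: threshold the stained channel,
    label connected components, and keep blobs above a plausible soma size.
    The threshold and min_area are assumptions, not the paper's calibration."""
    img = imread(path, as_gray=True)
    mask = img > filters.threshold_otsu(img)     # stained cells assumed brighter
    mask = ndi.binary_opening(mask)              # drop speckle noise
    labels = measure.label(mask)
    regions = [r for r in measure.regionprops(labels) if r.area >= min_area]
    return len(regions)

# print(count_microglia("retina_layer.png"))     # hypothetical micrograph
```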
In biomedical research fields, in vivo flow cytometry (IVFC) is a widely used technology able to monitor target cells dynamically in living animals. Although the setup of the IVFC system is well established, baseline drift remains a challenge in quantifying circulating cells. Previous methods, e.g., the dynamic peak picking method, counted cells by setting a static threshold without considering the baseline drift, leading to inaccurate cell quantification. Here, we developed a cell counting method for IVFC data with baseline drift based on interpolation fitting, automatic segmentation, and wavelet-based denoising. We demonstrated its performance on IVFC signals with three types of representative baseline drift. Compared with non-baseline-correction methods, this method showed higher sensitivity and specificity, as well as better results in the Pearson's correlation coefficient and the mean-squared error (MSE). Funding: This work was supported by grants from the National Major Scientific Research Program of China (Grant Nos. 2011CB910404 and 2012CB966801), the National Natural Science Foundation of China (Grant No. 61227017), and the National Science Fund for Distinguished Young Scholars (Grant No. 61425006).
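A hedged sketch of such a pipeline, with an assumed wavelet (db4), segment length, and peak thresholds rather than the paper's settings:

```python
import numpy as np
import pywt
from scipy.interpolate import UnivariateSpline
from scipy.signal import find_peaks

def count_cells(signal, fs=100.0):
    """Sketch: wavelet denoising, spline-interpolated baseline estimated from
    per-segment minima, then peak counting on the corrected trace."""
    # 1) wavelet denoising: soft-threshold the detail coefficients
    coeffs = pywt.wavedec(signal, "db4", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    den = pywt.waverec(coeffs, "db4")[: len(signal)]
    # 2) baseline by spline interpolation through per-segment minima
    seg = int(fs)                                         # assumed 1-s segments
    xs = np.arange(len(den) // seg) * seg + seg // 2
    ys = [den[i : i + seg].min() for i in range(0, len(den) - seg + 1, seg)]
    baseline = UnivariateSpline(xs, ys[: len(xs)], s=len(xs))(np.arange(len(den)))
    # 3) count peaks above an assumed threshold on the corrected signal
    corrected = den - baseline
    peaks, _ = find_peaks(corrected, height=3 * sigma, distance=int(0.05 * fs))
    return len(peaks)

t = np.arange(3000) / 100.0
sig = 0.5 * np.sin(0.2 * t) + np.random.default_rng(0).normal(0, 0.05, t.size)
sig[::150] += 2.0                                         # sparse spikes as "cells"
print(count_cells(sig), "cells detected")
```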
Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are first investigated, and the discrepancy arising from the Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on existing analytical spectral methods are selected to express the statistical distribution of mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of the Dirlik and Tovo-Benasciutti formulas are suitable for the separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment under non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is validated through a numerical case study of a moored semi-submersible platform. Funding: This work was supported by the Major Program of the National Natural Science Foundation of China (No. 51490675), the National Science Fund for Distinguished Young Scholars (No. 51625902), the Taishan Scholars Program of Shandong Province, and the Fundamental Research Funds for the Central Universities (No. 841713035).
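The Dirlik formula mentioned above is a standard published form. The sketch below computes spectral moments from a toy bimodal tension PSD and evaluates the Dirlik rainflow-range PDF; the PSD shape and all numbers are invented for illustration, not taken from the case study.

```python
import numpy as np

def spectral_moments(f, psd, orders=(0, 1, 2, 4)):
    """m_n = integral f^n S(f) df from a one-sided tension PSD."""
    return {n: np.trapz(f ** n * psd, f) for n in orders}

def dirlik_pdf(s, m):
    """Dirlik's empirical PDF of rainflow ranges s from moments m0, m1, m2, m4,
    in its standard published form (one of the candidate PDFs above)."""
    m0, m1, m2, m4 = m[0], m[1], m[2], m[4]
    xm = (m1 / m0) * np.sqrt(m2 / m4)
    g = m2 / np.sqrt(m0 * m4)                       # irregularity factor
    D1 = 2 * (xm - g ** 2) / (1 + g ** 2)
    R = (g - xm - D1 ** 2) / (1 - g - D1 + D1 ** 2)
    D2 = (1 - g - D1 + D1 ** 2) / (1 - R)
    D3 = 1 - D1 - D2
    Q = 1.25 * (g - D3 - D2 * R) / D1
    Z = s / (2 * np.sqrt(m0))
    return (D1 / Q * np.exp(-Z / Q)
            + D2 * Z / R ** 2 * np.exp(-Z ** 2 / (2 * R ** 2))
            + D3 * Z * np.exp(-Z ** 2 / 2)) / (2 * np.sqrt(m0))

# Toy bimodal PSD mimicking a combined LF + WF tension response.
f = np.linspace(0.001, 0.5, 2000)
psd = 1e4 * np.exp(-((f - 0.01) / 0.005) ** 2) + 3e3 * np.exp(-((f - 0.1) / 0.02) ** 2)
m = spectral_moments(f, psd)
s = np.linspace(0.0, 800.0, 400)
print("PDF integrates to ~1:", np.trapz(dirlik_pdf(s, m), s))
```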
The issue of document management has been raised for a long time, especially with the appearance of office automation in the 1980s, which led to dematerialization and Electronic Document Management (EDM). In the same period, workflow management experienced significant development, but became more focused on industry. It seems to us, however, that document workflows have not received the same interest from the scientific community. Nowadays, the emergence and supremacy of the Internet in electronic exchanges are leading to a massive dematerialization of documents, which requires a conceptual reconsideration of the organizational framework for processing these documents in both public and private administrations. This problem seems open to us and deserves the interest of the scientific community. Indeed, EDM has mainly focused on the storage (referencing) and circulation of documents (traceability); it has paid little attention to the overall behavior of the system in processing documents. The purpose of our research is to model document processing systems. In previous works, we proposed a general model and its specialization to the case of small documents (any document processed by a single person at a time during its processing life cycle), which, according to our study, represent 70% of documents processed by administrations. In this contribution, we extend the model for processing small documents to the case where they are managed in a system comprising document classes organized into subclasses, which is the case in most administrations. We have observed that this model is a Markovian M^(L×K)/M^(L×K)/1 queueing network. We have analyzed the constraints of this model and deduced certain characteristics and metrics. Ultimately, the objective of our work is to design a document workflow management system integrating a component for predicting global behavior.
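Each station of such a network, taken in isolation, obeys the textbook M/M/1 formulas. A small sketch with hypothetical arrival and service rates per (class, subclass) station:

```python
from dataclasses import dataclass

@dataclass
class MM1:
    """Steady-state metrics of one M/M/1 station in the document-processing
    network (arrival rate lam and service rate mu, both per unit time)."""
    lam: float
    mu: float

    @property
    def rho(self):          # utilization; stability requires rho < 1
        return self.lam / self.mu

    @property
    def L(self):            # mean number of documents at the station
        return self.rho / (1 - self.rho)

    @property
    def W(self):            # mean sojourn time of a document (Little's law)
        return 1 / (self.mu - self.lam)

# One station per (class, subclass) pair, e.g. L = 2 classes x K = 2 subclasses;
# the class names and rates are hypothetical.
stations = {("admin", "leave"): MM1(3.0, 5.0), ("admin", "travel"): MM1(1.0, 2.0),
            ("finance", "invoice"): MM1(4.0, 6.0), ("finance", "order"): MM1(2.0, 4.0)}
for key, st in stations.items():
    print(key, f"rho={st.rho:.2f} L={st.L:.2f} W={st.W:.2f}")
```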
Spinning has a significant influence on all textile processes, and the combined state of the capital equipment reflects the critical condition of the process. By transforming unprocessed fibers into carded sliver and yarn, the carding machine serves a critical role in the textile industry. The carding machine's licker-in and flat speeds are crucial operational factors that strongly influence the quality of the finished goods. The purpose of this study is to examine the relationship between licker-in and flat speeds and how they affect yarn and carded sliver quality. To accomplish this, a thorough experimental examination was carried out on a carding machine. The carded sliver and yarn produced under different combinations of licker-in and flat speeds were assessed for important quality factors including evenness, strength, and flaws. To account for changes in material qualities and machine settings, the study also considered the impact of various fiber types and processing conditions. The findings showed a direct relationship between the quality of the carded sliver and yarn and the licker-in and flat speeds. Within a limited range, greater licker-in speeds were shown to increase carding efficiency and decrease fiber tangling, whereas extremely high speeds led to more fiber breakage and neps. Higher flat speeds, in turn, helped to enhance fiber alignment, which increased the evenness and strength of the carded sliver and yarn. Additionally, the ideal combination of licker-in and flat speeds was found to vary with fiber type and processing conditions: during carding, different fibers displayed distinctive behaviors that necessitated adjusting the operating settings to achieve the required quality. The study also determined the critical speed ratios between the licker-in and flat speeds that minimized fiber breakage and maximized the quality of the finished goods. The results offer useful information for textile producers and process engineers seeking to improve the quality of carded sliver and yarn while maximizing the performance of carding machines. By understanding the effects of licker-in and flat speeds, operators can choose machine settings and parameter adjustments wisely, increasing efficiency, productivity, and product quality in the textile industry.
Fingerspelling recognition by hand shape is an important step in developing a human-computer interaction system. A method of fingerspelling recognition by hand shape using HLAC (higher-order local auto-correlation) features is proposed. Furthermore, to use HLAC features more effectively, the use of image processing techniques, namely reducing the image resolution, dividing the image, and pre-processing the image, is also proposed. The experimental results show that the proposed method is promising.
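HLAC features are sums of products of pixel values over fixed local displacement patterns. The sketch below computes a small subset of the 25 standard binary-image masks on a toy hand mask; the mask subset and the image are illustrative, not the paper's configuration.

```python
import numpy as np

def shift(img, dy, dx):
    """Return an image whose pixel (y, x) holds img[y + dy, x + dx],
    zero-padded at the borders."""
    out = np.zeros_like(img)
    h, w = img.shape
    ys, xs = slice(max(dy, 0), h + min(dy, 0)), slice(max(dx, 0), w + min(dx, 0))
    yd, xd = slice(max(-dy, 0), h + min(-dy, 0)), slice(max(-dx, 0), w + min(-dx, 0))
    out[yd, xd] = img[ys, xs]
    return out

def hlac_features(img, displacements):
    """x_k = sum_r f(r) * prod_i f(r + a_i): local autocorrelation of the
    binary hand mask over a fixed set of 3x3 displacement tuples."""
    feats = []
    for disp in displacements:
        prod = img.astype(float)
        for (dy, dx) in disp:
            prod = prod * shift(img.astype(float), dy, dx)
        feats.append(prod.sum())
    return np.array(feats)

# 0th-, 1st- and two 2nd-order masks (a subset of the 25 standard ones).
MASKS = [(), ((0, 1),), ((1, 0),), ((1, 1),), ((0, 1), (0, -1)), ((1, 0), (-1, 0))]
hand = np.zeros((64, 64)); hand[20:50, 25:40] = 1          # toy binary hand mask
print(hlac_features(hand, MASKS))
```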
To realize automatic counting of urediospores of Puccinia striiformis f. sp. tritici (Pst), the causal agent of wheat stripe rust, an automatic counting system based on image processing was developed on the MATLAB GUIDE platform in combination with the Local C Compiler (LCC). The system is independent of the MATLAB environment and can run on a computer without the MATLAB software. Using this system, automatic counting of Pst urediospores in a microscopic image is implemented via image processing technologies including image scaling, clustering segmentation, morphological modification, watershed transformation, and connected region labeling. The structural design of the system, its key algorithms, and the realization of its main functions are described in detail. Spore counting tests conducted on microscopic digital images of Pst urediospores achieved accuracies greater than 95%. The results indicate that it is feasible to count Pst urediospores automatically using the developed image processing system. Funding: This work was supported by the International Research Exchange Scheme of the Marie Curie Program of the 7th Framework Program (Ref. PIRSES-GA-2013-612659), the National Key Basic Research Program of China (2013CB127700), and the National Key Technologies Research and Development Program of China (2012BAD19BA04).
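A rough Python analogue of the described MATLAB pipeline stages (clustering segmentation, morphological cleanup, connected-region labeling); the cluster count, structuring element, and file name are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import measure, morphology
from skimage.io import imread
from sklearn.cluster import KMeans

def count_spores(path):
    """Sketch of the pipeline above: cluster pixel intensities into
    spore/background with k-means, clean the mask morphologically, then
    count connected regions."""
    img = imread(path, as_gray=True)
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(img.reshape(-1, 1))
    dark = np.argmin(km.cluster_centers_.ravel())   # spores assumed darker
    mask = (km.labels_ == dark).reshape(img.shape)
    mask = morphology.binary_opening(mask, morphology.disk(2))
    mask = ndi.binary_fill_holes(mask)
    return measure.label(mask).max()

# print(count_spores("urediospores.png"))    # hypothetical micrograph
```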
The High-energy Fragment Separator (HFRS), currently under construction, is a leading international radioactive beam device. Multiple sets of position-sensitive twin time projection chamber (TPC) detectors are distributed along HFRS for particle identification and beam monitoring. The twin TPCs' readout electronics system operates in a trigger-less mode due to its high counting rate, which poses the challenge of handling large amounts of data. To address this problem, we introduced an event-building algorithm. This algorithm employs a hierarchical processing strategy to compress data during transmission and aggregation. In addition, it reconstructs twin TPC events online and stores only the reconstructed particle information, which significantly reduces the burden on data transmission and storage resources. Simulation studies demonstrated that the algorithm accurately matches twin TPC events and reduces the data volume by more than 98% at a counting rate of 500 kHz per channel. Funding: This work was partially supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB 34030000) and the National Natural Science Foundation of China (Nos. 11975293 and 12205348).
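The essential step of trigger-less event building, matching hits from the two TPCs inside a coincidence window, can be sketched as a single pass over two time-sorted streams. The window width and data layout are assumptions, not the HFRS design values.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    ts: float      # hit timestamp in ns
    x: float       # reconstructed position

def build_events(tpc1, tpc2, window_ns=200.0):
    """Pair hits from the two TPCs whose timestamps fall within a coincidence
    window, walking both time-sorted streams once (O(n + m)). Only matched
    pairs are kept, mirroring the store-only-reconstructed-events idea."""
    events, j = [], 0
    for h1 in tpc1:
        # advance the second stream past hits that are too early
        while j < len(tpc2) and tpc2[j].ts < h1.ts - window_ns:
            j += 1
        if j < len(tpc2) and abs(tpc2[j].ts - h1.ts) <= window_ns:
            events.append((h1, tpc2[j]))
            j += 1                             # each hit used at most once
    return events

tpc1 = [Hit(100.0, 1.2), Hit(950.0, -0.4), Hit(2100.0, 0.8)]
tpc2 = [Hit(180.0, 1.1), Hit(2160.0, 0.9)]     # second hit of tpc1 unmatched
print(len(build_events(tpc1, tpc2)), "matched events")
```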
Autocorrelations are widespread in real production, and special statistical tools are needed for process monitoring. Residual charts based on autoregressive integrated moving average (ARIMA) models are typically used. However, ARIMA models require a considerable amount of experience, which sometimes causes inconvenience in implementation. Hidden Markov models (HMMs) were proposed because they perform well with little or even no such experience. Since ARIMA models behave quite differently under positive and negative autocorrelations, it is interesting and essential to study how HMMs affect the performance of residual charts under opposite autocorrelations, which has not been studied yet. We therefore extend HMMs to negatively auto-correlated observations. The cross-validation method is used to select the relatively optimal number of states. The experimental results show that HMMs are more stable than first-order autoregressive (AR(1)) models under both positive and negative autocorrelations. For detecting abnormalities, the HMM approach performs much better than AR(1) models under positive autocorrelations, while under negative autocorrelations both methods perform similarly. Funding: This work was supported by the National Natural Science Foundation of China (No. 71701098), the Natural Science Foundation of Jiangsu Province (No. BK20160940), the Humanities and Social Sciences Youth Fund of the Chinese Ministry of Education (No. 17YJC630070), and the Philosophy and Social Sciences Fund of Colleges and Universities in Jiangsu Province (No. 2017SJB0105).
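A minimal AR(1) residual chart, the baseline the paper compares HMMs against, can be sketched as below; the phase-I/phase-II split, the 3-sigma limits, and the simulated negatively autocorrelated series are illustrative choices, not the paper's experimental design.

```python
import numpy as np

def ar1_residual_chart(x, n_train=150, L=3.0):
    """Phase I: fit AR(1) x_t = c + phi * x_{t-1} + e_t by least squares on
    in-control data. Phase II: flag points whose one-step residual leaves
    the +/- L*sigma control limits."""
    tr = x[:n_train]
    X = np.column_stack([np.ones(n_train - 1), tr[:-1]])
    (c, phi), *_ = np.linalg.lstsq(X, tr[1:], rcond=None)
    resid = x[1:] - (c + phi * x[:-1])
    sigma = resid[: n_train - 1].std(ddof=2)        # residual spread in phase I
    alarms = np.where(np.abs(resid) > L * sigma)[0] + 1
    return alarms[alarms >= n_train]

rng = np.random.default_rng(0)
x = np.zeros(300)
for t in range(1, 300):                 # negatively autocorrelated process
    x[t] = -0.6 * x[t - 1] + rng.normal()
x[200:] += 2.5                          # injected mean shift to detect
print("alarms at:", ar1_residual_chart(x))
```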