The feasibility of estimating patient-specific dose verification results directly from linear accelerator (linac) log files has been investigated for prostate cancer patients who undergo volumetric modulated arc therapy (VMAT). Twenty-six patients who underwent VMAT in our facility were consecutively selected. VMAT plans were created using the Monaco treatment planning system and were transferred to an Elekta linac. During beam delivery, dynamic machine parameters such as the positions of the multi-leaf collimator and the gantry were recorded in the log files; subsequently, root mean square (rms) values of the control errors, speeds, and accelerations of these machine parameters were calculated for each delivery. Dose verification was performed for all plans using a cylindrical phantom with diodes placed in a spiral array. The gamma index pass rates were evaluated under the 3%/3 mm and 2%/2 mm criteria with a dose threshold of 10%. Subsequently, the correlation coefficients between the gamma index pass rates and each of the above rms values were calculated. Under the 2%/2 mm criterion, significant negative correlations were found between the gamma index pass rates and the rms gantry angle errors (r = 0.64, p < 0.001), as well as between the pass rates and the rms gantry accelerations (r = 0.68, p < 0.001). The rms values of the other dynamic machine parameters did not correlate significantly with the gamma index pass rates. We suggest that VMAT quality assurance (QA) results can be estimated directly from the log files, thereby potentially simplifying the patient-specific prostate VMAT QA procedure.
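The analysis above reduces to two operations: computing rms values of logged machine-parameter errors and correlating them with gamma pass rates. A minimal sketch with NumPy, using invented illustrative numbers (the variable names and data are assumptions, not the paper's measurements):

```python
import numpy as np

def rms(values):
    """Root mean square of a sequence of per-delivery errors."""
    arr = np.asarray(values, dtype=float)
    return float(np.sqrt(np.mean(arr ** 2)))

# Hypothetical per-delivery values (not the study's data): rms gantry
# angle errors in degrees, paired with 2%/2 mm gamma pass rates in %.
rms_gantry_err = [0.18, 0.21, 0.30, 0.35, 0.42]
gamma_pass = [98.4, 97.8, 96.0, 95.1, 93.6]

# Pearson correlation coefficient; the study also reports p-values,
# which would require e.g. scipy.stats.pearsonr.
r = np.corrcoef(rms_gantry_err, gamma_pass)[0, 1]
print(f"r = {r:.2f}")  # negative: larger gantry errors, lower pass rates
```

A negative r here mirrors the reported trend: deliveries with larger gantry control errors tend to show lower gamma pass rates.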
In this era of a data-driven society, useful data (Big Data) is often unintentionally ignored due to a lack of convenient tools and the high cost of software. For example, web log files can be used to identify explicit information about browsing patterns when users access web sites. Some hidden information, however, cannot be derived directly from the log files; external resources may be needed to discover more knowledge from browsing patterns. The purpose of this study is to investigate the application of web usage mining based on web log files. The outcome of this study sets further directions for this investigation into what implicit information is embedded in log files and how it can be efficiently and effectively extracted. Further work involves combining the use of social media data to improve business decision quality.
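The explicit browsing-pattern information mentioned above can be pulled from server logs with a few lines of parsing. A minimal sketch, assuming logs in Apache Common Log Format (the study's actual log format is not specified; the sample entries are invented):

```python
import re
from collections import defaultdict

# Common Log Format: host, identity, user, [time], "request", status, size.
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

def browsing_paths(lines):
    """Group requested paths by client host, preserving request order."""
    paths = defaultdict(list)
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            paths[m.group("host")].append(m.group("path"))
    return dict(paths)

sample = [
    '10.0.0.1 - - [28/Mar/2016:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [28/Mar/2016:10:00:05 +0000] "GET /products HTTP/1.1" 200 2048',
    '10.0.0.2 - - [28/Mar/2016:10:00:07 +0000] "GET /about HTTP/1.1" 200 1024',
]
result = browsing_paths(sample)
print(result)
```

The per-host request sequences this produces are the raw material for web usage mining; the hidden information the abstract refers to would require joining these sequences with external data.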
To better understand different users' accessing intentions, a novel clustering and supervising method based on accessing paths is presented. This method partitions the users' interest space to express the distribution of user interests and directly guides the construction of web-page indexes for improved performance.
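The abstract gives no algorithmic detail, but the core idea of partitioning an interest space by accessing path can be sketched as grouping users by the shared leading pages of their paths. This is a simplification under assumed data shapes, not the paper's actual method:

```python
from collections import defaultdict

def partition_by_path(user_paths, depth=1):
    """Group users whose accessing paths share the same leading pages.

    `user_paths` maps a user id to an ordered list of visited page ids;
    the shared prefix of length `depth` stands in for one region of the
    users' interest space (a deliberate simplification).
    """
    clusters = defaultdict(list)
    for user, path in user_paths.items():
        clusters[tuple(path[:depth])].append(user)
    return dict(clusters)

# Invented example users and paths.
users = {
    "u1": ["/sports", "/sports/football"],
    "u2": ["/sports", "/sports/tennis"],
    "u3": ["/finance", "/finance/stocks"],
}
clusters = partition_by_path(users)
print(clusters)
```

Each resulting group could then drive a separate page index, which is the performance benefit the abstract claims.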
The evolution of telecommunications has allowed the development of broadband services based mainly on fiber optic backbone networks. The operation and maintenance of these optical networks is made possible by supervision platforms that generate alarms, which can be archived in the form of log files. Analyzing the alarms in the log files, however, is a laborious and difficult task that demands a degree of expertise from engineers. Identifying failures and their root causes can be time consuming and can impact quality of service, network availability, and the service level agreements signed between the operator and its customers. It is therefore important to study the different possibilities for alarm classification and to use machine learning algorithms for alarm correlation in order to determine the root causes of problems faster. We conducted a case study on one of the operators in Cameroon, which operates an optical backbone based on SDH and WDM technologies, with data collected from 2016-03-28 to 2022-09-01 comprising 7201 rows and 18 columns. In this paper, we classify alarms according to different criteria and use two unsupervised learning algorithms, K-Means and DBSCAN, to establish correlations between alarms, identify the root causes of problems, and reduce troubleshooting time. To achieve this objective, log files were exploited to obtain the root causes of the alarms, and then K-Means and DBSCAN were used to evaluate their performance and their capability to identify the root causes of alarms in an optical network.
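To make the clustering step concrete, here is a minimal Lloyd's K-Means written directly in NumPy (in practice one would likely reach for a library such as scikit-learn, and the paper's actual alarm features are not described; the two-dimensional features below are invented for illustration):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's K-Means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from k distinct data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Hypothetical alarm features: [severity code, minutes since first alarm].
# Two bursts of alarms that should correlate into two groups.
X = np.array([
    [1.0, 0.0], [1.0, 0.5], [1.0, 1.0],
    [5.0, 60.0], [5.0, 61.0], [5.0, 59.5],
])
centroids, labels = kmeans(X, 2)
print(labels)
```

Alarms landing in the same cluster are candidates for a common root cause. DBSCAN, the second algorithm used in the study, is density-based and would additionally mark isolated alarms as noise rather than forcing them into a cluster.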
Funding: Supported by a Royal Thai Government Scholarship; resources support from the Faculty of IT, Monash University.