In basketball, each player’s skill level is key to a team’s success or failure, and that skill level is affected by many personal and environmental factors, which makes physics-informed AI statistics extremely important. In this article, a complex non-linear process is considered by taking into account each player’s average points per game, playing time, shooting percentage, and other factors. The physics-informed statistical approach constructs a multiple linear regression model with physics-informed neural networks. Based on official data provided by the American Basketball League, combined with analysis in R, the regression model of the factors affecting a player’s average points per game is verified, and the key factors are elucidated. The paper provides a novel window for coaches to make meaningful in-game adjustments to team members.
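The regression step described in this abstract can be sketched as follows. The player statistics below are hypothetical, and NumPy's least-squares solver stands in for the R workflow the abstract mentions:

```python
import numpy as np

# Hypothetical per-player season stats: [minutes per game, field-goal %, free-throw %]
X = np.array([
    [34.1, 0.52, 0.81],
    [28.5, 0.44, 0.74],
    [21.3, 0.47, 0.69],
    [36.0, 0.49, 0.88],
    [15.2, 0.41, 0.65],
    [30.7, 0.55, 0.79],
])
y = np.array([24.8, 14.2, 9.6, 27.1, 5.3, 21.4])  # average points per game

# Add an intercept column and solve the least-squares problem A @ beta ~= y
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(beta)  # intercept plus one coefficient per predictor
print(r2)    # in-sample coefficient of determination
```

The fitted coefficients indicate which factor moves scoring most per unit change, which is the kind of key-factor reading the abstract describes.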
At present, Human Activity Recognition (HAR) has been of considerable aid in health monitoring and recovery. Exploiting machine learning with an intelligent agent on health-informatics data gathered through HAR augments decision-making quality and significance. Although much research has been conducted on Smart Healthcare Monitoring, a number of pitfalls remain, such as time, overhead, and falsification during analysis. Therefore, this paper proposes Statistical Partial Regression and Support Vector Intelligent Agent Learning (SPR-SVIAL) for Smart Healthcare Monitoring. First, the Statistical Partial Regression Feature Extraction model is used for data preprocessing together with the extraction of dimensionality-reduced features. Here, the input dataset, comprising continuous beat-to-beat heart data, triaxial accelerometer data, and psychological characteristics, was acquired from IoT wearable devices. To attain highly accurate Smart Healthcare Monitoring in less time, Partial Least Squares is used to extract the dimensionality-reduced features. With these resulting features, SVIAL is then applied for Smart Healthcare Monitoring with the help of machine learning and intelligent agents to minimize both analysis falsification and overhead. Experimental evaluation is carried out for factors such as time, overhead, and false-positive-rate accuracy over several instances. The quantitative results indicate the better performance of the proposed SPR-SVIAL method compared with two state-of-the-art methods.
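The Partial Least Squares extraction step can be illustrated with a minimal NIPALS-style PLS1 sketch in NumPy. The sensor matrix here is random stand-in data, not the wearable-device dataset from the paper:

```python
import numpy as np

def pls1_scores(X, y, n_components):
    """Extract dimensionality-reduced PLS score vectors for a single response y."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    scores = []
    for _ in range(n_components):
        w = X.T @ y                      # weight vector: direction of max covariance with y
        w = w / np.linalg.norm(w)
        t = X @ w                        # score vector = one reduced feature
        p = X.T @ t / (t @ t)            # loading
        X = X - np.outer(t, p)           # deflate X
        y = y - t * (y @ t) / (t @ t)    # deflate y
        scores.append(t)
    return np.column_stack(scores)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))           # stand-in for beat-to-beat/accelerometer features
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=100)
T = pls1_scores(X, y, n_components=3)    # 12 raw features reduced to 3 components
print(T.shape)
```

Unlike plain PCA, each component is chosen for covariance with the target, which is why PLS suits the supervised monitoring setting the abstract describes.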
The development of defect prediction plays a significant role in improving software quality. Such predictions are used to identify defective modules before testing and to minimize time and cost. Software with defects negatively impacts operational costs and ultimately affects customer satisfaction. Numerous approaches exist to predict software defects; however, timely and accurate defect prediction remains a major challenge. To address it, a novel technique called Nonparametric Statistical feature scaled QuAdratic regressive convolution Deep nEural Network (SQADEN) is introduced. The proposed SQADEN technique includes two major processes, namely metric (feature) selection and classification. First, SQADEN uses the nonparametric statistical Torgerson–Gower scaling technique to identify the relevant software metrics by measuring similarity with the Dice coefficient. The feature selection process minimizes the time complexity of software fault prediction. With the selected metrics, software faults are predicted with the help of Quadratic Censored regressive convolution deep neural network-based classification. The deep learning classifier analyzes the training and testing samples using the contingency correlation coefficient, and the softstep activation function provides the final fault prediction results. To minimize the error, the Nelder–Mead method is applied to solve the non-linear least-squares problem, so that accurate classification results with minimum error are obtained at the output layer. Experimental evaluation is carried out with quantitative metrics such as accuracy, precision, recall, F-measure, and time complexity. The results demonstrate the superior performance of the proposed SQADEN technique, with maximum accuracy, sensitivity, and specificity by 3%, 3%, 2%, and 3% and minimum time and space by 13% and 15%, compared with the two state-of-the-art methods.
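The Dice-coefficient similarity used in the metric-selection step has a simple closed form, 2|A ∩ B| / (|A| + |B|). A pure-Python sketch over hypothetical software-metric sets:

```python
def dice_coefficient(a, b):
    """Dice similarity between two sets: 2*|A & B| / (|A| + |B|)."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty sets are conventionally treated as identical
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical metric-name sets describing two software modules
m1 = {"loc", "cyclomatic", "fan_in", "fan_out"}
m2 = {"loc", "cyclomatic", "halstead"}
print(dice_coefficient(m1, m2))  # 2*2 / (4+3) ≈ 0.571
```

Metrics whose Dice similarity to already-selected metrics is high are redundant, which is how a similarity measure like this supports the feature-selection goal of reducing prediction time.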
Electrical impedance tomography (EIT) aims to reconstruct the conductivity distribution from voltage potentials measured on the boundary. Traditional regularization-based methods suffer from error propagation through the iteration process, whereas the statistical inverse-problem approach uses statistical inference to estimate the unknown parameters. In this article, we develop a nonlinear weighted anisotropic total variation (NWATV) prior density function based on the recently proposed NWATV regularization method. We calculate the corresponding posterior density function, i.e., the solution of the EIT inverse problem in the statistical sense, via modified Markov chain Monte Carlo (MCMC) sampling, and validate the proposed approach with numerical experiments.
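As a toy illustration of the statistical-inversion idea (a plain random-walk Metropolis sampler, not the paper's modified MCMC or NWATV construction), one can draw from a posterior that combines a Gaussian likelihood with a total-variation-style prior on a small 1-D conductivity profile; the identity forward map and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

true_sigma = np.array([1.0, 1.0, 2.0, 2.0, 1.0])    # toy piecewise-constant conductivity
data = true_sigma + rng.normal(scale=0.05, size=5)  # noisy "measurements" (identity forward map)

def log_posterior(sigma, noise=0.05, lam=5.0):
    log_lik = -0.5 * np.sum((data - sigma) ** 2) / noise**2
    log_prior = -lam * np.sum(np.abs(np.diff(sigma)))  # TV-style edge-preserving prior
    return log_lik + log_prior

# Random-walk Metropolis sampling of the posterior
sigma = np.ones(5)
lp = log_posterior(sigma)
samples = []
for _ in range(5000):
    prop = sigma + rng.normal(scale=0.05, size=5)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept with prob min(1, posterior ratio)
        sigma, lp = prop, lp_prop
    samples.append(sigma)

posterior_mean = np.mean(samples[1000:], axis=0)   # discard burn-in
print(posterior_mean)
```

The posterior mean plays the role of the reconstructed conductivity; because every iterate is a sample rather than a deterministic update, estimation error does not propagate through iterations the way it can in regularized iterative solvers.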
Statistical literacy is crucial for cultivating well-rounded thinkers, and integrating evidence-based strategies into teaching and learning is pivotal for enhancing it. This research focuses on the use of Share and Model Concepts and Nurturing Metacognition as evidence-based strategies aimed at improving learners' statistical literacy. The study employed a quasi-experimental design, specifically the nonequivalent control group, wherein students answered pre-test and post-test instruments and researcher-made questionnaires. It included 50 first-year Bachelor in Secondary Education majors in Mathematics and Science for the academic year 2023-2024. The results revealed a significant difference in the scores of student respondents, indicating that the evidence-based strategies helped students enhance their statistical literacy. This signifies a noteworthy increase in their performance, from very low to very high proficiency, in understanding statistical concepts, insights into the application of statistical concepts, numeracy, graph skills, interpretation capabilities, and visualization and communication skills. Furthermore, the study showed a significant difference in the post-test scores of the two groups in understanding statistical concepts and in visualization and communication skills. However, no significant difference was found in the post-test scores of the two groups concerning insights into the application of statistical concepts, numeracy and graph skills, and interpretation capabilities. Additionally, students acknowledged that the evidence-based strategies significantly contributed to the improvement of their statistical literacy.
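The pre-test/post-test comparison underlying significance claims of this kind can be sketched as a paired t-statistic; the scores below are invented for illustration, not the study's data:

```python
import math
import statistics

# Hypothetical pre-test and post-test scores for the same ten students
pre  = [12, 15, 9, 14, 10, 13, 11, 8, 16, 12]
post = [18, 20, 14, 19, 15, 17, 16, 12, 21, 17]

# Paired t-statistic: mean of the per-student differences over its standard error
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
print(t_stat)  # compare against the t-distribution with n-1 = 9 degrees of freedom
```

A t-statistic exceeding the two-tailed critical value (about 2.262 at the 5% level for 9 degrees of freedom) would indicate a significant pre-to-post gain.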
The study of land surface temperature (LST) is of great significance for ecosystem monitoring and ecological environmental protection in the Qinling Mountains of China. In view of the conflicting spatial and temporal resolutions encountered when extracting LST from satellite remote sensing (RS) data, areas with complex landforms in the Eastern Qinling Mountains were selected as research targets to establish the correlation between the normalized difference vegetation index (NDVI) and LST. Detailed information on surface features and on temporal changes of the land surface was provided by Sentinel-2 and Sentinel-3, respectively. With the statistical downscaling method, the spatial scale could be decreased from 1000 m to 10 m, retrieving LST with Sentinel-3 temporal resolution and 10 m spatial resolution. Comparison of the 1 km resolution Sentinel-3 LST with the downscaled results shows that the 10 m LST data accurately reflect the spatial distribution of the thermal characteristics of the original LST image. Moreover, the 10 m surface temperature data have clear texture and obvious geomorphic features that depict detailed information about the ground surface. The results showed an average error of 5 K on April 16, 2019 and 2.6 K on July 15, 2019; the smaller error reflects the higher vegetation coverage of the summer downscaling result, which was highest on July 15.
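NDVI-based statistical downscaling of this kind can be sketched in the classic TsHARP/DisTrad spirit: fit LST against NDVI at the coarse scale, apply the fit to fine-scale NDVI, and add back the coarse-scale residual. The grids below are synthetic stand-ins, not Sentinel data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic coarse-scale (e.g. 1 km) NDVI and LST, flattened to 1-D for simplicity
ndvi_coarse = rng.uniform(0.1, 0.8, size=64)
lst_coarse = 310.0 - 15.0 * ndvi_coarse + rng.normal(scale=0.5, size=64)

# 1) Fit the coarse-scale regression LST ~ a + b * NDVI
b, a = np.polyfit(ndvi_coarse, lst_coarse, 1)

# 2) Coarse residual: the part the NDVI relationship cannot explain
residual = lst_coarse - (a + b * ndvi_coarse)

# 3) Apply the fit to fine-scale NDVI (here: 4 subpixels per coarse pixel)
ndvi_fine = np.repeat(ndvi_coarse, 4) + rng.normal(scale=0.05, size=64 * 4)
lst_fine = a + b * ndvi_fine + np.repeat(residual, 4)  # residual-corrected downscaled LST

# Aggregating the fine result back to the coarse scale approximately recovers the input
recovered = lst_fine.reshape(64, 4).mean(axis=1)
print(np.max(np.abs(recovered - lst_coarse)))
```

The residual correction is what keeps the fine-scale field consistent with the original thermal image when re-aggregated, matching the consistency check the abstract reports.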
The frequent occurrence of dry and hot (DH) days in South China in summer has a negative impact on social development and human health. This study explored the variation characteristics of DH days and the possible reasons behind them. The findings revealed a notable increase in the number of DH days across most stations, indicating a significant upward trend, and DH events were observed to occur frequently. The number of DH days increased during 1970-1990, decreased from 1991 to 1997, and stayed stable after 1997. The key climate factors affecting the interannual variability of the number of DH days were the Indian Ocean Basin warming (IOBW) in spring and the East Asian Summer Monsoon (EASM). Compared with the negative phase of the IOBW, in the positive phase the 500 hPa and 850 hPa geopotential heights were enhanced, the West Pacific subtropical high strengthened and extended anomalously westward, more solar radiation reached the surface, surface outgoing longwave radiation increased, and there was an anomalous anticyclone over eastern South China. The atmospheric circulation characteristics of the positive and negative phases of the EASM were opposite to those of the IOBW, and the anomalous circulation of the positive (negative) EASM phase was unfavorable (favorable) for an increase in the number of DH days. A long-term prediction model for the number of summer DH days was established using multiple linear regression incorporating these key climate factors. The correlation coefficient between the observed and predicted number of DH days was 0.65, and the root-mean-square error was 2.8. In addition, an independent forecast for 2019 showed a deviation of just 1 day. The results of the independent recovery test confirmed the stability of the model, providing evidence that climatic factors did have an impact on DH days in South China.
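The skill scores the abstract reports (the correlation between observed and predicted DH-day counts, and the root-mean-square error in days) compute as follows; the yearly counts here are fabricated for illustration:

```python
import numpy as np

# Hypothetical observed and regression-predicted summer DH-day counts per year
observed  = np.array([10, 14, 8, 20, 16, 11, 18, 9, 13, 17])
predicted = np.array([12, 13, 9, 17, 15, 12, 16, 11, 12, 15])

corr = np.corrcoef(observed, predicted)[0, 1]         # prediction skill of the model
rmse = np.sqrt(np.mean((observed - predicted) ** 2))  # typical error, in days
print(corr, rmse)
```

Reporting both is standard for a regression forecast: correlation measures whether the model tracks year-to-year variability, while RMSE gives the error in the same units as the predictand.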
Two statistical validation methods were used to evaluate the confidence level of Total Column Ozone (TCO) measurements recorded by satellite systems measuring simultaneously: one using the normal distribution and another using the Mann-Whitney test. First, the reliability of the TCO measurements was studied hemispherically. While similar coincidences and significance levels > 0.05 were found with the two statistical tests, an enormous variability in the significance levels throughout the year was also exposed. Then, using the same statistical comparison methods, a latitudinal study was carried out to elucidate the geographical distribution that gave rise to this variability. Our study reveals that the TOMS and OMI measurements in 2005 coincided at only 50% of the latitudes, which explains the variability. This implies that, for 2005, the TOMS measurements are not completely reliable, except within the -50° to -15° latitude band in the Southern Hemisphere and the +15° to +50° band in the Northern Hemisphere. In the case of OMI-OMPS, we observe that between 2011 and 2016 the measurements of both satellite systems are reasonably similar, with a confidence level higher than 95%. However, in 2017 a band 20° of latitude wide centered on the equator appeared in which the significance levels were much less than 0.05, indicating that one of the measurement systems had begun to fail. In 2018, the fault was not only located at the equator but was also replicated in various bands in the Southern Hemisphere. We interpret this as evidence of irreversible failure in one of the measurement systems.
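A Mann-Whitney comparison of two co-measuring instruments can be sketched in pure Python using the normal approximation for the U statistic (without the tie correction a full implementation would include); the TCO values below are invented:

```python
import math

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation (no tie correction)."""
    n1, n2 = len(x), len(y)
    # U = number of (x_i, y_j) pairs with x_i > y_j, counting ties as 0.5
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return u, p

# Hypothetical daily TCO values (Dobson units) from two simultaneously measuring satellites
tco_a = [301, 298, 305, 310, 295, 299, 303, 297, 306, 300]
tco_b = [300, 297, 304, 311, 296, 298, 302, 296, 305, 301]
u, p = mann_whitney_u(tco_a, tco_b)
print(u, p)  # a large p-value gives no evidence the two instruments disagree
```

A p-value well below 0.05, as in the equatorial band the abstract describes, would instead indicate that the two instruments are drawing from different distributions, i.e., that one has begun to fail.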
Funding: supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R194), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding: supported by the National Key R&D Plan (2018YFC1506500); the Open Research Fund Project of the Key Laboratory of Ecological Environment Meteorology of Qinling Mountains and Loess Plateau, Shaanxi Provincial Meteorological Bureau (2020Y-13); the Open Research Fund of the Shangluo Key Laboratory of Climate Adaptable City (SLSYS2022007); and the Shangluo Demonstration Project of the Qinling Ecological Monitoring Service System (2020-611002-74-01-006200).
Funding: National Natural Science Foundation of China (92044302, 41805115); Guangzhou Municipal Science and Technology Project (202002020065).