BACKGROUND: Meniscal tears are one of the most common knee injuries. After the diagnosis of a meniscal tear has been made, there are several factors physicians use to guide clinical decision-making. The influence of time between injury and isolated meniscus repair on patient outcomes is not well described. Assessing this relationship is important, as it may influence clinical decision-making and can add to the preoperative patient education process. We hypothesized that increasing the time from injury to meniscus surgery would worsen postoperative outcomes. AIM: To investigate the current literature for data on the relationship between the time from meniscus injury to repair and patient outcomes. METHODS: PubMed, Academic Search Complete, MEDLINE, CINAHL, and SPORTDiscus were searched for studies published between January 1, 1995 and July 13, 2023 on isolated meniscus repair. Exclusion criteria included concomitant ligament surgery, incomplete outcome or time-to-surgery data, and meniscectomies. Patient demographics, time to surgery, and postoperative outcomes from each study were abstracted and analyzed. RESULTS: Five studies met all inclusion and exclusion criteria, comprising 204 patients (121 male, 83 female). Three of the five studies (60%) found that time between injury and surgery was not statistically significant for postoperative Lysholm scores (P = 0.62), Tegner scores (P = 0.46), failure rate (P = 0.45, P = 0.86), or International Knee Documentation Committee scores (P = 0.65). Two of the five studies (40%) found a statistically significant increase in Lysholm scores with shorter time to surgery (P = 0.03) and a statistically significant association between progression of the medial meniscus extrusion ratio (P = 0.01) and increasing time to surgery. CONCLUSION: Our results do not support the hypothesis that increased time from injury to isolated meniscus surgery worsens postoperative outcomes. Decision-making based primarily on the injury interval is therefore not recommended.
Several promising plasma biomarker proteins, such as amyloid-β (Aβ), tau, neurofilament light chain, and glial fibrillary acidic protein, are widely used for the diagnosis of neurodegenerative diseases. However, little is known about the long-term stability of these biomarker proteins in plasma samples stored at -80°C. We aimed to explore how storage time affects the diagnostic accuracy of these biomarkers using a large cohort. Plasma samples from 229 cognitively unimpaired individuals, encompassing healthy controls and those experiencing subjective cognitive decline, as well as 99 patients with cognitive impairment, comprising those with mild cognitive impairment and dementia, were acquired from the Sino Longitudinal Study on Cognitive Decline project. These samples were stored at -80°C for up to 6 years before being used in this study. Our results showed that plasma levels of Aβ42, Aβ40, neurofilament light chain, and glial fibrillary acidic protein were not significantly correlated with sample storage time. However, the level of total tau showed a negative correlation with sample storage time. Notably, in individuals without cognitive impairment, plasma levels of total tau and tau phosphorylated at threonine 181 (p-tau181) also showed a negative correlation with sample storage time; this was not observed in individuals with cognitive impairment. Consequently, we speculate that the diagnostic accuracy of plasma p-tau181 and of the p-tau181 to total tau ratio may be influenced by sample storage time. Therefore, caution is advised when using these plasma biomarkers for the identification of neurodegenerative diseases such as Alzheimer's disease. Furthermore, in cohort studies, it is important to consider the impact of storage time on the overall results.
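The storage-time analysis described above reduces to correlating each biomarker's plasma level with how long its sample was stored. A minimal sketch of such a check with a plain Pearson correlation; the function name and the illustrative numbers are ours, not the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length arrays."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.sum(xm * ym) / np.sqrt(np.sum(xm**2) * np.sum(ym**2)))

# Illustrative only: a biomarker level declining linearly with storage time.
storage_years = np.arange(0, 6, 0.5)
total_tau = 5.0 - 0.3 * storage_years   # simulated decline, not study data
r = pearson_r(storage_years, total_tau)
```

A negative r, as this toy decline produces, is the qualitative pattern the study reports for total tau (and, in cognitively unimpaired individuals, for p-tau181).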
Despite extensive research, timing channels (TCs) remain a principal category of threats that aim to leak and transmit information by perturbing the timing or ordering of events. Existing TC detection approaches either use signatures to detect known TCs or model the legitimate network traffic to detect unknown TCs as anomalies. Unfortunately, in a software-defined networking (SDN) environment, most existing TC detection approaches would fail due to factors such as volatile network traffic, imprecise timekeeping mechanisms, and dynamic network topology. Furthermore, stealthy TCs can be designed to mimic the legitimate traffic pattern and thus evade anomaly-based TC detection. In this paper, we overcome the above challenges by presenting a novel framework that harnesses the advantages of elastic resources in the cloud. In particular, our framework dynamically configures SDN to enable or disable differential analysis of the outbound network flows of different virtual machines (VMs). The framework is tightly coupled with a new metric that first decomposes the timing data of network flows into a number of resolution levels using the discrete wavelet-based multi-resolution transform (DWMT), and then applies the Kullback-Leibler divergence (KLD) to measure the variance among flow pairs. The appealing feature of our approach is that, compared with existing anomaly detection approaches, it can detect most existing and some new stealthy TCs without requiring legitimate traffic for modeling, even in the presence of noise and imprecise timekeeping in an SDN virtual environment. We implement our framework as a prototype system, OBSERVER, which can be dynamically deployed in an SDN environment. Empirical evaluation shows that our approach can efficiently detect TCs with a higher detection rate, lower latency, and negligible performance overhead compared to existing approaches.
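The metric's two stages (wavelet multi-resolution decomposition of flow timing data, then KLD between flow pairs) can be sketched as follows. This is a simplified stand-in, not the OBSERVER implementation: it uses a hand-rolled Haar decomposition and a histogram-based KLD, and all names and parameters are ours.

```python
import numpy as np

def haar_dwmt(x, levels=3):
    """Decompose a timing series (e.g. inter-packet delays) into
    multi-resolution Haar detail bands plus a final approximation."""
    bands = []
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):
        if len(approx) < 2:
            break
        even, odd = approx[0::2], approx[1::2]
        n = min(len(even), len(odd))
        bands.append((even[:n] - odd[:n]) / np.sqrt(2))  # detail coefficients
        approx = (even[:n] + odd[:n]) / np.sqrt(2)       # coarser approximation
    return bands, approx

def kld(p_samples, q_samples, bins=16):
    """KL divergence between histograms of two coefficient sets."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    p = (p + 1e-9) / (p + 1e-9).sum()   # smooth to avoid log(0)
    q = (q + 1e-9) / (q + 1e-9).sum()
    return float(np.sum(p * np.log(p / q)))
```

In a toy experiment, a flow carrying an alternating on/off delay pattern shows a noticeably larger divergence from a benign flow (in the first detail band) than two benign flows show from each other.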
Two methods for smoothing the pseudorange observable, by carrier and by Doppler, are discussed. The procedure, based on RINEX observation files, is then tested using Ashtech Z-XII3T geodetic receivers driven by a stable external frequency at USNO. This paper proposes to adapt this procedure for links between geodetic receivers in order to take advantage of the P codes available on L1 and L2. The new procedure uses the 30-second RINEX observation files, the standard of the International GPS Service (IGS), and processes the ionosphere-free combination of the P1 and P2 codes; the satellite positions are deduced from the IGS rapid orbits, available after two days.
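For reference, the two ingredients the procedure combines, carrier smoothing of the code observable and the ionosphere-free P1/P2 combination, have standard closed forms. The sketch below is illustrative (a classic Hatch-style recursion), not the paper's exact processing chain; the constants are the GPS L1/L2 carrier frequencies, and the simulation values in the test are invented.

```python
import numpy as np

F1, F2 = 1575.42e6, 1227.60e6   # GPS L1 / L2 carrier frequencies (Hz)

def iono_free(p1, p2):
    """Ionosphere-free combination of the P1 and P2 code observables."""
    g = (F1**2) / (F1**2 - F2**2)   # ~2.546 for GPS L1/L2
    return g * p1 - (g - 1.0) * p2

def hatch_smooth(code, carrier, window=100):
    """Carrier-smoothed code: recursive Hatch filter.

    The carrier phase (in meters) supplies the epoch-to-epoch range change;
    the noisy code is blended in with weight 1/w."""
    sm = np.empty_like(code, dtype=float)
    sm[0] = code[0]
    for k in range(1, len(code)):
        w = min(k + 1, window)
        sm[k] = code[k] / w + (w - 1) / w * (sm[k - 1] + carrier[k] - carrier[k - 1])
    return sm
```

The smoothed code inherits the low noise of the carrier while keeping the absolute (unambiguous) level of the code.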
This paper proposes a digital background calibration scheme for timing skew in time-interleaved analog-to-digital converters (TIADCs). It detects the relevant timing error by comparing the inter-channel output difference with the sum of the first derivatives of the digital output, and a least-mean-square (LMS) loop is exploited to compensate the timing skew. Since the calibration scheme depends only on the digital output, all timing skew sources can be calibrated and the main ADC is left unchanged. The proposed scheme is effective over the entire frequency range from 0 to fs/2. Compared with traditional calibration schemes, the proposed approach is more feasible and consumes significantly less power and area.
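The derivative-correlation idea behind such schemes can be illustrated on a toy two-channel TIADC model: residual skew shows up as a component of the even/odd interpolation error that is correlated with the signal derivative, and an LMS-style loop drives it to zero. This is our sketch of the general technique, not the paper's algorithm; all names and parameters are hypothetical.

```python
import numpy as np

def estimate_skew(y, mu=0.5, iters=100):
    """Estimate the odd-channel timing skew (in samples) of an interleaved
    output y via an LMS loop.

    Each iteration applies a first-order skew correction, forms the error of
    odd samples against the average of their even neighbors, and nudges the
    estimate by the derivative-correlated part of that error."""
    d_hat = 0.0
    for _ in range(iters):
        yc = np.asarray(y, dtype=float).copy()
        dy = np.gradient(yc)
        yc[1::2] -= d_hat * dy[1::2]          # first-order Taylor correction
        dy = np.gradient(yc)
        odd = np.arange(1, len(yc) - 1, 2)
        err = yc[odd] - 0.5 * (yc[odd - 1] + yc[odd + 1])
        # the skew contributes err ~ (d - d_hat) * dy; project it out
        d_hat += mu * np.mean(err * dy[odd]) / np.mean(dy[odd] ** 2)
    return d_hat
```

On a slow sine sampled with a 0.08-sample skew on the odd channel, the loop converges close to 0.08.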
In order to achieve auto-timing counts measurement of nuclear radiation using the ORTEC 974 Counter/Timer, an auto-timing counts virtual instrument system based on the LabVIEW virtual instrument development platform and the GPIB instrument control and transmission bus protocol is designed in this paper. By introducing a software timing technique, the minimum time base is improved from the factory setting of 0.1 s to 0.03 s. The timing counts performance and long-term stability are also discussed in detail. The automatic data recording and saving facilitates data analysis and processing, and the real-time display and statistics functions are very convenient for monitoring nuclear radiation.
A fine-grained distributed multimedia synchronization model, the Enhanced Fuzzy-Timing Petri Net, is proposed; it is well suited to modeling indeterminacy and fuzziness. To satisfy the maximum tolerable jitter, sufficient conditions are given for intra-object synchronization. A method for finding a proper granularity in inter-object synchronization is also given to satisfy skew requirements. Exceptions are detected and corrected as early as possible using a restricted blocking method.
This paper develops a real-time traffic-signal timing model to be integrated into a single urban-road intersection, thereby addressing the problem of traffic congestion. We first analyze the current traffic-flow situation using a release matrix, and then put forward basic models that minimize the total delay time of vehicles at the intersection. The optimal real-time signal timing model (non-fixed cycle and non-fixed split) is built with the Webster optimal-split model. Finally, the simulated results, compared with a conventional model, demonstrate the promising properties of the proposed model.
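The Webster optimization that such signal-timing models build on has a well-known closed form: the optimum cycle length is C0 = (1.5L + 5)/(1 - Y), where L is the total lost time per cycle (s) and Y is the sum of the critical flow ratios, with green time then split in proportion to the flow ratios. A small sketch with invented flow ratios:

```python
def webster_cycle(lost_time, flow_ratios):
    """Webster's optimum cycle length C0 = (1.5L + 5) / (1 - Y)."""
    Y = sum(flow_ratios)               # sum of critical flow ratios y_i = q_i / s_i
    if Y >= 1.0:
        raise ValueError("intersection oversaturated (Y >= 1)")
    return (1.5 * lost_time + 5.0) / (1.0 - Y)

def green_splits(cycle, lost_time, flow_ratios):
    """Allocate the effective green time in proportion to each phase's flow ratio."""
    Y = sum(flow_ratios)
    total_green = cycle - lost_time
    return [total_green * y / Y for y in flow_ratios]
```

A real-time variant would recompute the flow ratios from measured volumes each cycle, which is what makes the resulting cycle and split non-fixed.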
In recent years, the time-frequency overlapping multi-carrier signal has become a novel and valuable topic in blind signal processing, especially in the field of non-cooperative receiving, but little related research has been published. This paper proposes two timing estimation algorithms that are non-data-aided and based on the cyclic auto-correlation function. To evaluate the performance of the proposed algorithms, the theoretical bound of the timing estimation is derived. The analyses and simulation results demonstrate the effectiveness of the proposed algorithms: Method I performs better than Method II, but Method II needs no prior information and therefore has a wider range of applications.
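The paper's Method I and Method II are not reproduced here, but the underlying idea, non-data-aided timing recovery from a cyclic (symbol-rate) component of the received signal, can be illustrated with the classic square-law estimator of Oerder and Meyr, which reads the timing offset off the phase of the cyclic correlation of |x|². The single-carrier Hann-pulsed signal below is our toy example, not the overlapped multi-carrier scenario.

```python
import numpy as np

def nda_timing_estimate(x, sps):
    """Non-data-aided timing estimate: phase of the symbol-rate cyclic
    component of |x|^2, scaled back to a delay in samples (mod sps)."""
    k = np.arange(len(x))
    c = np.sum(np.abs(x) ** 2 * np.exp(-2j * np.pi * k / sps))
    return float(-np.angle(c) / (2 * np.pi) * sps)

# Illustrative: Hann-pulsed BPSK, 8 samples/symbol, delayed by 3 samples.
rng = np.random.default_rng(0)
sps, delay = 8, 3
syms = rng.choice([-1.0, 1.0], size=512)
sig = np.concatenate([s * np.hanning(sps) for s in syms])
sig = np.roll(sig, delay)
est = nda_timing_estimate(sig, sps)
```

Because np.hanning(sps) pulses are centered at (sps - 1)/2 = 3.5 samples, the estimate returns delay + 3.5 modulo sps; squaring removes the data symbols, which is what makes the estimator non-data-aided.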
A VHF (Very High Frequency) band antenna array receives analog signals from space, stores them after digital sampling and time-scale stamping, and then performs interference analysis of the digital signals from the different sub-stations. This requires a time-frequency system with high precision and low drift. This paper describes a time-frequency system for a VHF band antenna array that produces the standard 10 MHz reference and the clock signal needed by the sampler, so that the two sampling-control computers share the same system time and the stored data carry an accurate time scale. The system includes a time-comparison program, based on GPS network timing, for the two sampling-control computers. The timing strategy uses time-comparison software built on the LabVIEW graphical programming platform. This software captures the system time of the two computers, analyzes it to determine the time deviation when an offset occurs, and then grants the GPS time of an NTP server to the two computers through the local area network. Final results show that this method can automatically calibrate the system time of the computers in the LAN to a precision of 0.1 s or better.
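Independent of LabVIEW, the compare-and-correct logic can be sketched in a few lines; the function names are ours, and the 0.1 s threshold is the precision target quoted above:

```python
def time_offset(t_computer, t_ntp):
    """Deviation of a sampling-control computer's clock from the NTP/GPS time (s)."""
    return t_ntp - t_computer

def corrected_time(t_computer, t_ntp, threshold=0.1):
    """Grant the NTP server's GPS time to the computer only when the deviation
    exceeds the precision target; otherwise leave its clock alone."""
    if abs(time_offset(t_computer, t_ntp)) > threshold:
        return t_ntp
    return t_computer
```

Applying this check to both sampling-control computers keeps their mutual deviation within the target, so the stored data from the two sub-stations carry a consistent time scale.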
The amount of oxygen blown into the converter is one of the key parameters for the control of the converter blowing process, and it directly affects the tap-to-tap time of the converter. In this study, a hybrid model based on an oxygen balance mechanism (OBM) and a deep neural network (DNN) was established for predicting the oxygen blowing time in a converter. A three-step method was utilized in the hybrid model. First, the oxygen consumption volume was predicted by the OBM model and the DNN model, respectively. Second, a more accurate oxygen consumption volume was obtained by integrating the OBM and DNN models. Finally, the converter oxygen blowing time was calculated from the oxygen consumption volume and the oxygen supply intensity of each heat. The proposed hybrid model was verified using actual data collected from an integrated steel plant in China and compared with a multiple linear regression model, the OBM model, and neural network models including an extreme learning machine, a back-propagation neural network, and a DNN. The test results indicate that the hybrid model with three hidden layers, 32-16-8 neurons per layer, and a 0.1 learning rate has the best prediction accuracy and stronger generalization ability than the other models. The predicted hit ratio of oxygen consumption volume within an error of ±300 m³ is 96.67%; the determination coefficient (R²) and root mean square error (RMSE) are 0.6984 and 150.03 m³, respectively. The oxygen blowing time prediction hit ratio within an error of ±0.6 min is 89.50%; R² and RMSE are 0.9486 and 0.3592 min, respectively. As a result, the proposed model can effectively predict the oxygen consumption volume and oxygen blowing time in the converter.
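Steps two and three of the hybrid model (integrating the two volume predictions, then converting volume to time via the oxygen supply intensity) can be sketched as follows. The convex-combination weighting is our placeholder for the paper's integration step, and all numbers are invented:

```python
def blend_oxygen_volume(v_obm, v_dnn, w=0.5):
    """Integrate the mechanism-based (OBM) and data-driven (DNN) predictions
    of oxygen consumption volume (m^3); a simple convex combination stands in
    for the paper's integration step."""
    return w * v_obm + (1.0 - w) * v_dnn

def blowing_time(volume_m3, supply_intensity_m3_per_min):
    """Step 3: blowing time (min) = oxygen volume / oxygen supply intensity."""
    return volume_m3 / supply_intensity_m3_per_min
```

For a heat where the OBM predicts 5000 m³, the DNN predicts 5200 m³, and oxygen is supplied at 600 m³/min, the sketch yields a blended volume of 5100 m³ and a blowing time of 8.5 min.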
Historically, landslides have been the primary type of geological disaster worldwide. Generally, the stability of reservoir banks is primarily affected by rainfall and reservoir water level fluctuations. Moreover, the stability of reservoir banks changes with the long-term dynamics of the external disaster-causing factors. Thus, assessing the time-varying reliability of reservoir landslides remains a challenge. In this paper, a machine learning (ML)-based approach is proposed to analyze the long-term reliability of reservoir bank landslides in spatially variable soils through time series prediction. This study systematically investigated the prediction performance of three ML algorithms, i.e., multilayer perceptron (MLP), convolutional neural network (CNN), and long short-term memory (LSTM). Additionally, the effects of data quantity and data ratio on the predictive power of the deep learning models are considered. The results show that all three ML models can accurately depict the changes in the time-varying failure probability of reservoir landslides. The CNN model outperforms both the MLP and LSTM models in predicting the failure probability. Furthermore, selecting the right data ratio can improve the prediction accuracy of the failure probability obtained by the ML models.
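The time-series prediction setup such studies use can be sketched independently of any particular network: turn the failure-probability history into sliding supervised windows and fit a one-step-ahead predictor. The linear least-squares model below is a deliberately simple stand-in for the MLP/CNN/LSTM models compared in the paper; names and window length are ours:

```python
import numpy as np

def make_windows(series, n_in):
    """Past n_in failure-probability values -> the next value (supervised pairs)."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + n_in] for i in range(len(series) - n_in)])
    y = series[n_in:]
    return X, y

def fit_predict_next(series, n_in=3):
    """Fit a least-squares linear one-step predictor on the windows and
    predict the value following the last observed window."""
    X, y = make_windows(series, n_in)
    A = np.c_[X, np.ones(len(X))]            # add a bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    last = np.r_[np.asarray(series, dtype=float)[-n_in:], 1.0]
    return float(last @ w)
```

The deep models in the study replace the linear fit with learned nonlinear maps, but consume the same windowed inputs; the "data ratio" discussed above corresponds to how the windowed pairs are split between training and testing.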
Background: Prolonged sitting and reduced physical activity lead to low energy expenditure. However, little is known about the joint impact of daily sitting time and physical activity on body fat distribution. We investigated the independent and joint associations of daily sitting time and physical activity with body fat among adults. Methods: This was a cross-sectional analysis of U.S. nationally representative data from the National Health and Nutrition Examination Survey 2011-2018 among adults aged 20 years or older. Daily sitting time and leisure-time physical activity (LTPA) were self-reported using the Global Physical Activity Questionnaire. Body fat (total and trunk fat percentage) was determined via dual X-ray absorptiometry. Results: Among 10,808 adults, about 54.6% spent 6 h/day or more sitting; more than one-half reported no LTPA (inactive) or less than 150 min/week of LTPA (insufficiently active), while only 43.3% reported 150 min/week or more of LTPA (active) in the past week. After full adjustment for sociodemographic data, lifestyle behaviors, and chronic conditions, prolonged sitting time and low levels of LTPA were associated with higher total and trunk fat percentages in both sexes. When stratifying by LTPA, the association between daily sitting time and body fat appeared stronger in those who were inactive or insufficiently active. In the joint analyses, inactive or insufficiently active adults who reported sitting more than 8 h/day had the highest total body fat percentage (female: 3.99% (95% confidence interval (95%CI): 3.09%-4.88%); male: 3.79% (95%CI: 2.75%-4.82%)) and trunk body fat percentage (female: 4.21% (95%CI: 3.09%-5.32%); male: 4.07% (95%CI: 2.95%-5.19%)) compared with those who were active and sitting less than 4 h/day. Conclusion: Prolonged daily sitting time was associated with increased body fat among U.S. adults. The higher body fat associated with sitting 6 h/day or more may not be offset by achieving recommended levels of physical activity.
The problem of prescribed performance tracking control for unknown time-delay nonlinear systems subject to output constraints is dealt with in this paper. In contrast with related works, only the most fundamental requirements, i.e., boundedness and the local Lipschitz condition, are assumed for the allowable time delays. Moreover, we focus on the case where the reference is unknown beforehand, which renders the standard prescribed performance control designs under output constraints infeasible. To conquer these challenges, a novel robust prescribed performance control approach is put forward in this paper. Herein, a reverse tuning function is skillfully constructed that automatically generates a performance envelope for the tracking error. In addition, a unified performance analysis framework based on proof by contradiction and the barrier function is established to reveal the inherent robustness of the control system against the time delays. It turns out that the system output tracks the reference with a preassigned settling time and good accuracy, without constraint violations. A comparative simulation on a two-stage chemical reactor is carried out to illustrate the above theoretical findings.
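The mechanism at the heart of prescribed performance control can be made concrete with a decaying performance envelope and a barrier transform that blows up as the tracking error approaches the envelope, which is what keeps the error inside the funnel. The exponential envelope and tan-type barrier below are standard illustrative choices, not the paper's reverse tuning function; all parameter values are ours:

```python
import math

def envelope(t, rho0=2.0, rho_inf=0.05, settle=5.0):
    """Prescribed performance funnel: decays from rho0 at t = 0 down to the
    steady-state bound rho_inf by the preassigned settling time."""
    lam = math.log(rho0 / rho_inf) / settle
    return max(rho_inf, rho0 * math.exp(-lam * t))

def barrier_transform(e, rho):
    """Tan-type barrier: finite for |e| < rho, unbounded as |e| -> rho.
    Feeding this transformed error to the controller enforces |e(t)| < rho(t)."""
    return math.tan(math.pi * e / (2.0 * rho))
```

Because the barrier output grows without bound near the envelope, any bounded control analysis (such as the proof-by-contradiction argument mentioned above) implies the error never reaches the envelope.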
Funding for the plasma biomarker study: supported by the National Key Research & Development Program of China, Nos. 2021YFC2501205 (to YC) and 2022YFC24069004 (to JL); the STI2030-Major Project, Nos. 2021ZD0201101 (to YC) and 2022ZD0211800 (to YH); the National Natural Science Foundation of China (Major International Joint Research Project), No. 82020108013 (to YH); the Sino-German Center for Research Promotion, No. M-0759 (to YH); and a grant from the Beijing Municipal Science & Technology Commission (Beijing Brain Initiative), No. Z201100005520018 (to JL).
Funding for the pseudorange smoothing study: funded by the Key Laboratory of Geospace Environment and Geodesy, Ministry of Education, China (No. 02 09 0.5) and the National Natural Science Foundation of China (No. 40174005).
Funding for the timing estimation study: supported by the National Natural Science Foundation of China under Grant No. 61501084.
Funding for the converter oxygen study: financially supported by the National Natural Science Foundation of China (Nos. 51974023 and 52374321) and the funding of the State Key Laboratory of Advanced Metallurgy, University of Science and Technology Beijing, China (No. 41620007).
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52308340), the Innovative Projects of Universities in Guangdong (Grant No. 2022KTSCX208), and the Sichuan Transportation Science and Technology Project (Grant No. 2018-ZL-01).
Abstract: Historically, landslides have been the primary type of geological disaster worldwide. Generally, the stability of reservoir banks is primarily affected by rainfall and reservoir water level fluctuations. Moreover, the stability of reservoir banks changes with the long-term dynamics of external disaster-causing factors. Thus, assessing the time-varying reliability of reservoir landslides remains a challenge. In this paper, a machine learning (ML) based approach is proposed to analyze the long-term reliability of reservoir bank landslides in spatially variable soils through time series prediction. This study systematically investigated the prediction performances of three ML algorithms, i.e., multilayer perceptron (MLP), convolutional neural network (CNN), and long short-term memory (LSTM). Additionally, the effects of the data quantity and data ratio on the predictive power of the deep learning models are considered. The results show that all three ML models can accurately depict the changes in the time-varying failure probability of reservoir landslides. The CNN model outperforms both the MLP and LSTM models in predicting the failure probability. Furthermore, selecting the right data ratio can improve the prediction accuracy of the failure probability obtained by the ML models.
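The supervised time-series setup shared by the MLP, CNN, and LSTM predictors can be sketched as follows. This is a minimal illustration with a synthetic failure-probability curve and a least-squares linear predictor as a lightweight stand-in for the deep models; the window length and data are assumptions, not the paper's configuration.

```python
import numpy as np

def make_windows(series, window):
    """Build (input window, next value) pairs from a failure-probability
    time series -- the supervised setup used by sliding-window predictors."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

# Synthetic stand-in for a slowly rising time-varying failure probability.
t = np.linspace(0.0, 1.0, 50)
p_f = 0.1 + 0.05 * t

X, y = make_windows(p_f, window=5)
# Least-squares linear predictor in place of the MLP/CNN/LSTM models.
A = np.c_[X, np.ones(len(X))]                  # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
```

Because the synthetic curve is linear, the stand-in fits it essentially exactly; the deep models in the paper are needed when the failure-probability dynamics driven by rainfall and water-level fluctuations are nonlinear.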
Abstract: Background: Prolonged sitting and reduced physical activity lead to low energy expenditure. However, little is known about the joint impact of daily sitting time and physical activity on body fat distribution. We investigated the independent and joint associations of daily sitting time and physical activity with body fat among adults. Methods: This was a cross-sectional analysis of U.S. nationally representative data from the National Health and Nutrition Examination Survey 2011-2018 among adults aged 20 years or older. Daily sitting time and leisure-time physical activity (LTPA) were self-reported using the Global Physical Activity Questionnaire. Body fat (total and trunk fat percentage) was determined via dual X-ray absorptiometry. Results: Among 10,808 adults, about 54.6% spent 6 h/day or more sitting; more than one-half reported no LTPA (inactive) or less than 150 min/week LTPA (insufficiently active), with only 43.3% reporting 150 min/week or more LTPA (active) in the past week. After fully adjusting for sociodemographic data, lifestyle behaviors, and chronic conditions, prolonged sitting time and low levels of LTPA were associated with higher total and trunk fat percentages in both sexes. When stratifying by LTPA, the association between daily sitting time and body fat appeared to be stronger in those who were inactive/insufficiently active. In the joint analyses, inactive/insufficiently active adults who reported sitting more than 8 h/day had the highest total (female: 3.99% (95% confidence interval (95%CI): 3.09%-4.88%); male: 3.79% (95%CI: 2.75%-4.82%)) and trunk body fat percentages (female: 4.21% (95%CI: 3.09%-5.32%); male: 4.07% (95%CI: 2.95%-5.19%)) when compared with those who were active and sitting less than 4 h/day. Conclusion: Prolonged daily sitting time was associated with increased body fat among U.S. adults. The higher body fat associated with 6 h/day sitting may not be offset by achieving recommended levels of physical activity.
Funding: Supported in part by the National Natural Science Foundation of China (62103093), the National Key Research and Development Program of China (2022YFB3305905), the Xingliao Talent Program of Liaoning Province of China (XLYC2203130), the Fundamental Research Funds for the Central Universities of China (N2108003), the Natural Science Foundation of Liaoning Province (2023-MS-087), the BNU Talent Seed Fund, UIC Start-Up Fund (R72021115), the Guangdong Key Laboratory of AI and MM Data Processing (2020KSYS007), the Guangdong Provincial Key Laboratory IRADS for Data Science (2022B1212010006), and the Guangdong Higher Education Upgrading Plan 2021-2025 of "Rushing to the Top, Making Up Shortcomings and Strengthening Special Features" with UIC Research, China (R0400001-22, R0400025-21).
Abstract: The problem of prescribed performance tracking control for unknown time-delay nonlinear systems subject to output constraints is dealt with in this paper. In contrast with related works, only the most fundamental requirements, i.e., boundedness and the local Lipschitz condition, are assumed for the allowable time delays. Moreover, we focus on the case where the reference is unknown beforehand, which renders the standard prescribed performance control designs under output constraints infeasible. To conquer these challenges, a novel robust prescribed performance control approach is put forward in this paper. Herein, a reverse tuning function is skillfully constructed and automatically generates a performance envelope for the tracking error. In addition, a unified performance analysis framework based on proof by contradiction and the barrier function is established to reveal the inherent robustness of the control system against the time delays. It turns out that the system output tracks the reference with a preassigned settling time and good accuracy, without constraint violations. A comparative simulation on a two-stage chemical reactor is carried out to illustrate the above theoretical findings.
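The core prescribed-performance idea, an error envelope that shrinks over time and a barrier-style gain that grows as the error approaches it, can be illustrated on a toy example. This is a minimal sketch on a scalar integrator with a standard exponentially decaying performance function; the paper's reverse tuning function, time-delay handling, and chemical-reactor model are not reproduced here, and all gains and parameters are illustrative assumptions.

```python
import math

def rho(t, rho0=1.0, rho_inf=0.05, decay=2.0):
    """Exponentially decaying performance envelope: |e(t)| < rho(t)."""
    return (rho0 - rho_inf) * math.exp(-decay * t) + rho_inf

# Scalar integrator x' = u tracking r(t) = sin(t). The barrier-scaled gain
# k / (1 - (e/rho)^2) grows without bound as the error nears the envelope,
# which keeps the error inside it.
dt, T, k = 1e-3, 5.0, 5.0
x, t = 0.5, 0.0                      # initial tracking error e(0) = 0.5 < rho(0)
max_ratio = 0.0
while t < T:
    e = x - math.sin(t)
    ratio = abs(e) / rho(t)
    max_ratio = max(max_ratio, ratio)
    u = math.cos(t) - (k / (1.0 - ratio**2)) * e   # barrier-scaled feedback
    x += u * dt                                    # explicit Euler step
    t += dt
final_error = abs(x - math.sin(t))
```

Running the loop, the error ratio never reaches 1 (no constraint violation) and the tracking error ends up well inside the steady-state band rho_inf, mirroring the qualitative behavior the paper proves for the much harder time-delay case.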