Buried interfacial voids are a notorious phenomenon in the fabrication of lead perovskite films. Voids at the buried interface trap carriers, suppress carrier transport efficiency, and degrade the stability of photovoltaic devices. However, the impact of such buried interfacial voids on tin perovskites, a promising avenue for advancing lead-free photovoltaics, has been largely overlooked. Here, we utilize an innovative weakly polar solvent pretreatment strategy (WPSPS) to mitigate buried interfacial voids in tin perovskites. Our investigation reveals numerous voids forming in tin perovskites during annealing, attributed to trapped dimethyl sulfoxide (DMSO) used in film formation. The WPSPS method accelerates DMSO evaporation, effectively reducing residual DMSO. Interestingly, the WPSPS also shifts the energy level of PEDOT:PSS downward, aligning it better with the perovskite; this alignment enhances charge carrier transport. As a result, tin perovskite film quality is significantly improved, achieving a maximum power conversion efficiency approaching 12% with only an 8.3% efficiency loss after 1700 h of stability testing, which compares well with the state-of-the-art stability of tin-based perovskite solar cells.
Many fields, such as neuroscience, are experiencing a vast proliferation of cellular data, underscoring the need for organizing and interpreting large datasets. A popular approach partitions data into manageable subsets via hierarchical clustering, but objective methods to determine the appropriate classification granularity are missing. We recently introduced a technique to systematically identify when to stop subdividing clusters based on the fundamental principle that cells must differ more between than within clusters. Here we present the corresponding protocol to classify cellular datasets by combining data-driven unsupervised hierarchical clustering with statistical testing. These general-purpose functions are applicable to any cellular dataset that can be organized as a two-dimensional matrix of numerical values, including molecular, physiological, and anatomical datasets. We demonstrate the protocol using cellular data from the Janelia MouseLight project to characterize morphological aspects of neurons.
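The stopping criterion above lends itself to a compact illustration. The sketch below is not the authors' released protocol code; it is a minimal Python stand-in that splits a cluster in two and keeps the split only when between-cluster distances statistically exceed within-cluster distances. The one-sided Mann-Whitney U test and the alpha threshold are our assumptions; a naive test like this tends to over-split homogeneous data, which is precisely why a principled, calibrated stopping criterion such as the authors' is needed.

```python
# Minimal sketch (not the published functions): decide whether to split a
# cluster by testing if between-cluster distances exceed within-cluster ones.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, cdist
from scipy.stats import mannwhitneyu

def should_split(X, alpha=0.05):
    """Split X into two subclusters; keep the split only if cells differ
    more between than within clusters (one-sided Mann-Whitney U test)."""
    labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
    a, b = X[labels == 1], X[labels == 2]
    if len(a) < 3 or len(b) < 3:
        return False
    within = np.concatenate([pdist(a), pdist(b)])   # pairwise, inside clusters
    between = cdist(a, b).ravel()                   # pairwise, across clusters
    _, p = mannwhitneyu(between, within, alternative="greater")
    return p < alpha

rng = np.random.default_rng(0)
two_groups = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(4, 1, (30, 5))])
print(should_split(two_groups))  # True: groups differ more between than within
```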
With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools rely primarily on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in 'bundle' units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle carries its own update information, so only the updated portions need to be re-encrypted when changes occur. The proposed encryption method addresses the inefficiency of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. Using this information, it accurately identifies the modified portions and selectively re-encrypts only those sections. This approach significantly improves the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate the approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. The results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that the method provides robust protection against both passive and active attacks.
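To illustrate the bundle idea, the sketch below encrypts each bundle independently under AES-CTR with a per-bundle nonce, so an update re-encrypts only the affected bundle. This is a minimal stand-in under assumed parameters (the 4096-byte bundle size and the AES-CTR choice are ours), not the paper's full scheme, which additionally maintains bundle-to-plaintext mapping metadata.

```python
# Minimal sketch of bundle-wise encryption (our illustration, not the paper's
# scheme): each bundle is encrypted independently, so an update touches only
# the affected bundle instead of the whole dataset.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

BUNDLE = 4096  # bytes per bundle (assumed granularity)

def encrypt_bundle(key: bytes, plaintext: bytes):
    nonce = os.urandom(16)  # fresh nonce per bundle, stored alongside it
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return nonce, enc.update(plaintext) + enc.finalize()

def encrypt_all(key: bytes, data: bytes):
    return [encrypt_bundle(key, data[i:i + BUNDLE])
            for i in range(0, len(data), BUNDLE)]

def update(key: bytes, bundles: list, index: int, new_plaintext: bytes):
    # Re-encrypt only the modified bundle; all others are untouched.
    bundles[index] = encrypt_bundle(key, new_plaintext)

key = os.urandom(32)
bundles = encrypt_all(key, os.urandom(3 * BUNDLE))
update(key, bundles, 1, os.urandom(BUNDLE))  # one bundle re-encrypted
```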
A remarkable marine heatwave, known as the "Blob", occurred in the Northeast Pacific Ocean from late 2013 to early 2016, displaying strong warm anomalies extending from the surface to a depth of 300 m. This study employed two assimilation schemes based on the global Climate Forecast System of Nanjing University of Information Science and Technology (NUIST-CFS 1.0) to investigate the impact of ocean data assimilation on the seasonal prediction of this extreme marine heatwave. The sea surface temperature (SST) nudging scheme assimilates SST only, while the deterministic ensemble Kalman filter (EnKF) scheme assimilates observations from the surface to the deep ocean. The latter notably improves the forecasting skill for subsurface temperature anomalies, especially at depths of 100-300 m (the lower layer), outperforming the SST nudging scheme. It excels in predicting both horizontal and vertical heat transport in the lower layer, contributing to improved forecasts of lower-layer warming during the Blob. These improvements stem from the assimilation of subsurface observational data, which are important for predicting upper-ocean conditions. The results suggest that assimilating ocean data with the EnKF scheme significantly enhances the accuracy of predicted subsurface temperature anomalies during the Blob and offers a better understanding of its underlying mechanisms.
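For readers unfamiliar with the first scheme, SST nudging is commonly written as a relaxation term appended to the model's temperature tendency; the textbook form below is our illustration, not the exact NUIST-CFS 1.0 implementation:

```latex
\frac{\partial T}{\partial t} = \mathcal{M}(T) - \frac{T - T^{\mathrm{obs}}}{\tau}
```

where \(\mathcal{M}(T)\) denotes the model tendency, \(T^{\mathrm{obs}}\) the observed SST, and \(\tau\) the relaxation timescale. By construction this constrains only the surface field, which is why subsurface anomalies benefit from the EnKF scheme instead.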
There is a growing body of clinical research on the utility of synthetic data derivatives, an emerging research tool in medicine. In nephrology, clinicians can use machine learning and artificial intelligence as powerful aids in their clinical decision-making while also preserving patient privacy. This is especially important given the worldwide epidemiology of chronic kidney disease, renal oncology, and hypertension. However, a framework is still needed to guide how synthetic data can be better utilized as a practical application in this research.
Phosphate-manganese, tannic acid, and vanadium conversion coatings were proposed as effective pre-treatment layers between an electroless Ni-P coating and the AZ91D magnesium alloy substrate, replacing the traditional chromate plus HF pre-treatment. Electrochemical results show that the chrome-free coatings plus electroless Ni-P coating on the magnesium alloy have the lowest corrosion current density and the most positive corrosion potential compared with the chromate plus electroless Ni-P coating. These pre-treatment layers reduce the corrosion of magnesium during the plating process and reduce the potential difference between the matrix and the second phase. Thus, an electroless Ni-P coating with a fine crystalline and dense structure was obtained, with favorable phosphorus content, low porosity, good corrosion resistance, and stronger adhesion than the chromate plus electroless Ni-P coating.
AIM: To investigate the effects of selenium in a rat retinal ischemia-reperfusion (IR) model and compare pre-treatment and post-treatment use. METHODS: The selenium pre-treatment group (n=8) was treated with intraperitoneal (i.p.) selenium 0.5 mg/kg for 7 d and terminated 24 h after the IR injury. The selenium post-treatment group (n=8) was treated with i.p. selenium 0.5 mg/kg for 7 d after the IR injury, with termination at the end of the 7 d period. The sham group (n=8) received i.p. saline injections identical in volume to the selenium for 7 d, with termination 24 h after the IR injury. The control group (n=8) received no intervention. Main outcome measures were retinal superoxide dismutase (SOD), glutathione (GSH), total antioxidant status (TAS), malondialdehyde (MDA), DNA fragmentation levels, and immunohistological evaluation of apoptosis. RESULTS: Compared with the sham group, selenium pre-treatment showed statistically significant differences in all parameters except SOD. Post-treatment selenium also showed statistically significant differences in all parameters except MDA levels. When comparing the selenium groups, pre-treatment was statistically more successful in reducing markers of cell damage such as MDA and DNA fragmentation, whereas post-treatment resulted in statistically higher levels of GSH. Histologically, both selenium groups limited retinal thickening and apoptosis, and pre-treatment was statistically more successful than post-treatment in decreasing apoptosis in the ganglion cell layer. CONCLUSION: Selenium protected the retina against IR injury. Pre-treatment was superior in preventing tissue damage and apoptosis.
Pre-treatment, which supplies a stable, high-quality feed to reverse osmosis (RO) membranes, is a critical step for the successful operation of a seawater reverse osmosis plant. In this study, ceramic membrane systems were employed as pre-treatment for seawater desalination. A laboratory experiment was performed to investigate the effect of the cross-flow velocity on the critical flux and consequently to optimize the permeate flux; a pilot test was then performed to investigate long-term performance. The results show no significant effect of the cross-flow velocity on the critical flux when the velocity varies within the laminar flow region only or within the turbulent flow region only, but the effect is distinct when the velocity varies across the transition region. Membrane fouling is slight at a permeate flux of 150 L·m^-2·h^-1 and the system is stable, producing a high-quality feed (turbidity and silt density index less than 0.1 NTU and 3.0, respectively) for RO over 2922.4 h of operation without chemical cleaning. Thus, ceramic membranes are suitable for filtering seawater as pre-treatment for RO.
Microwave roasting, as a new heat treatment technology, offers uniform and fast heating. It is an energy-saving technology known for improving oilseed product quality, with efficiency depending mainly on roasting power and time. However, data on producing high-quality peanut butter under short-time roasting conditions are limited. Here, we determined an appropriate microwave roasting power and time for peanuts and evaluated the impact on peanut butter quality. Different roasting powers (400 W, 800 W, and 1200 W) and times (4, 4.5, 5, and 5.5 min) were preliminarily tested; 800 W for 5 min was the most suitable. Roasting efficiency was further evaluated using the color, sensory properties, bioactive compounds, storage stability, and safety risk factors of peanut butter produced from four peanut cultivars (Silihong, Baisha-1016, Yuanza-9102, and Yuhua-9414). The pre-treated butter obtained from the three cultivars with moisture content between 5% and 7.2% (Silihong, Yuanza-9102, and Yuhua-9414) had sensory scores (6-7 on a 9-point hedonic scale) similar to the commercial product. The color of the pre-treated peanut butter differed statistically from the commercial product but remained within the recommended range of Hunter L* values of 51-52 for the high-initial-moisture cultivars. The total polyphenol content (31.59-35.20 ± 0.59 μmol GAE/g) in the butter from three cultivars (Yuanza-9102, Yuhua-9414, and Baisha-1016) and the tocopherol content (19.05 ± 0.35 mg/100 g) in the butter from Silihong were significantly (P<0.05) higher than those in the commercial butter. The induction times of all pre-treated butters (7.84 ± 0.07 to 19.80 ± 0.99 h) during storage at accelerated temperature were significantly (P<0.05) longer than those of commercial samples. In addition, no benzo[a]pyrene was found in the pre-treated samples. Collectively, the microwave pre-treated peanut butter was superior to the commercial product. These findings provide data support and a reference basis for promoting microwave use in peanut butter production.
BACKGROUND: In cirrhosis, the impaired liver cannot properly synthesize and clear thrombopoietin (TPO). At the same time, the spleen assumes hemofiltration and storage functions because of liver dysfunction, resulting in hypersplenism and excessive removal of platelets in the spleen, further reducing the platelet count. When liver function decompensates in cirrhotic patients, decreased TPO synthesis is the main cause of reduced platelet production, leading to thrombocytopenia and a bleeding tendency in cirrhotic patients with hypersplenism. AIM: To investigate the clinical efficacy of recombinant human TPO (rhTPO) in treating perioperative thrombocytopenia during liver transplantation in cirrhotic mice with hypersplenism. METHODS: C57BL/6J mice and TPO receptor-deficient mice were used to establish models of cirrhosis with hypersplenism. These mice then underwent orthotopic liver transplantation (OLT). Mice in the experimental group received rhTPO for 3 consecutive days before surgery and 5 consecutive days after surgery, while mice in the control group received the same dose of saline at the same frequency. Differences in liver function and platelet counts were determined between the experimental and control groups. Enzyme-linked immunosorbent assay was used to assess the expression of TPO and the TPO receptor (c-Mpl) in the blood. RESULTS: Preoperative administration of rhTPO significantly improved peri-OLT thrombocytopenia in mice with cirrhosis and hypersplenism, whereas blocking the expression of TPO receptors exacerbated it. In the mouse model of cirrhosis with hypersplenism, the TPO concentration decreased while the c-Mpl concentration increased in compensation. TPO pre-treatment significantly increased the postoperative TPO concentration, which in turn decreased the c-Mpl concentration. TPO pre-treatment also significantly enhanced Janus kinase (Jak)/signal transducer and activator of transcription pathway protein expression in bone marrow stem cells of the C57BL/6J mice. Moreover, administration of TPO both before and after surgery regulated the levels of biochemical indicators such as alanine aminotransferase, alkaline phosphatase, and aspartate aminotransferase in the C57BL/6J mice. CONCLUSION: Pre-treatment with TPO not only had therapeutic effects on perioperative thrombocytopenia in mice with cirrhosis and hypersplenism undergoing liver transplantation but also significantly enhanced perioperative liver function.
Detailed experimental investigations were carried out on the microwave pre-treatment of high-ash Indian coal at a high power level (900 W) in a microwave oven, with exposure times fixed at 60 s and 120 s. The rheological characteristics of coal-water slurry (CWS) prepared from the pre-treated coal were measured in an online Bohlin viscometer. The non-Newtonian character of the slurry follows the Ostwald-de Waele rheological model, with n varying from 0.31 to 0.64 and k from 0.19 to 0.81 Pa·s^n. This paper presents an artificial neural network (ANN) model to predict the effects of operational parameters on the apparent viscosity of CWS. A 4-2-1 topology with the Levenberg-Marquardt training algorithm (trainlm) was selected as the controlled ANN. A mean squared error (MSE) of 0.002 and a coefficient of multiple determination (R^2) of 0.99 were obtained for the best-performing model. The promising correlation coefficients further confirm the robustness and satisfactory performance of the proposed ANN model.
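For reference, the Ostwald-de Waele (power-law) model cited above has the standard form (standard rheology notation, not reproduced from the paper):

```latex
\tau = k\,\dot{\gamma}^{\,n}, \qquad
\mu_{\mathrm{app}} = \frac{\tau}{\dot{\gamma}} = k\,\dot{\gamma}^{\,n-1}
```

where \(\tau\) is the shear stress, \(\dot{\gamma}\) the shear rate, \(k\) the consistency index, and \(n\) the flow behavior index; the reported values \(n<1\) indicate shear-thinning behavior of the slurry.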
Knowledge of the tree-ring stable nitrogen isotope ratio (δ^(15)N) can deepen our understanding of forest ecosystem dynamics by indicating the long-term availability, cycling, and sources of nitrogen (N). However, the radial mobility of N blurs interannual variations in long-term N records. Previous studies of the chemical extraction of tree rings before analysis have produced inconsistent results, and it is still unclear whether wood samples from specific tree species must be pre-treated to remove soluble N compounds before determining δ^(15)N values. We compared the effects of pre-treatment with organic solvents and with hot ultrapure water on the N concentration and δ^(15)N of tree rings from endemic Qinghai spruce (Picea crassifolia) growing in the interior of the central Qilian Mountains, China, over the last 60 a. We assessed the effects of different preparation protocols on the removal of labile N compounds and investigated whether wood samples need pre-treatment before determining tree-ring δ^(15)N values. Increasing trends in tree-ring N concentration were observed consistently in both extracted and unextracted wood samples. The total N removed by extraction with organic solvents was about 17.60%, with a significantly higher amount in the sapwood section (P<0.01). The δ^(15)N values of tree rings decreased consistently from 1960 to 2019 in both extracted and unextracted wood samples. Extraction with organic solvents increased the δ^(15)N values markedly, by about 5.2‰, and reduced the variations in the δ^(15)N series, whereas extraction with hot ultrapure water had little effect, with only a slight decrease in δ^(15)N values of about 0.5‰. Our results showed that the radial pattern of inter-ring movement of N in Qinghai spruce was not minimized by extraction with either organic solvents or hot ultrapure water. Hot ultrapure water extraction is unnecessary for wood samples from Qinghai spruce because of its negligible effect on the removal of labile N. The δ^(15)N trend of tree rings in the unextracted wood samples was not influenced by the heartwood-sapwood transition zone. We suggest that the δ^(15)N values of unextracted wood samples from the climate-sensitive Qinghai spruce can be used to explore ecophysiological dynamics when focusing on long-term variations.
The purpose of this work was to study the potential to enhance biogas production from pulp and paper mill sludge by thermal pre-treatment in combination with chemical pre-treatment. Biogas from waste is a renewable fuel with very low emissions during combustion; to further reduce the use of fossil fuels, more biogas substrates are needed, and pulp and paper mill sludge is a large untapped reservoir of potential biogas. Sludge was collected from a mill that produces both pulp and paper and has a modified waste activated sludge system as part of its wastewater treatment. The chosen pre-treatments were heat (70°C or 140°C) combined with either acid (pH 2 or pH 4) or base (pH 9 or pH 11, obtained with Ca(OH)2 or NaOH). Biogas potential was tested by anaerobic digestion batch assays under mesophilic conditions, with all pre-treatments tested in six replicates. Biogas volume was measured with a gas-tight syringe and methane concentration with a gas chromatograph. The methane yield from sludge subjected to thermal pre-treatment at 70°C did not differ from that of untreated sludge, but thermal pre-treatment at 140°C had a positive effect. Within the 70°C group, pH 2 acid was the most successful chemical pre-treatment, and Ca(OH)2 at pH 9 had the least effect, with no measurable improvement in methane yield. In the 140°C group, acid and NaOH affected methane production negatively, while the Ca(OH)2-treated sludge did not differ from sludge with no chemical pre-treatment. In conclusion, thermal pre-treatment at 70°C showed no effect, whereas pre-treatment at 140°C improved methane yield by 170%, and for this sludge additional chemical pre-treatments are unnecessary.
To address the problems of single encryption algorithms, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based big data secure storage scheme. First, to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate the nodes and achieve dual-channel storage. Second, we improve the ECC encryption algorithm for encrypting ordinary data and adopt a homomorphic encryption algorithm for data that needs to be computed on; to accelerate encryption, we adopt a dual-thread encryption mode. Finally, an HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution solves the problem of a single point of failure for metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm integrated with the dual-channel storage mode improves encryption storage efficiency by 27.6% on average.
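The dual-thread mode can be sketched compactly. In the illustration below, the paper's improved ECC and homomorphic schemes are stood in for by Fernet, a symmetric authenticated cipher from the `cryptography` package, and the 1 MiB block size is our assumption; the point is only the structure of splitting data into blocks and encrypting them on two worker threads.

```python
# Minimal sketch of the dual-thread encryption idea (our illustration; Fernet
# stands in for the paper's improved ECC / homomorphic schemes).
from concurrent.futures import ThreadPoolExecutor
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)
BLOCK = 1 << 20  # 1 MiB blocks (assumed granularity)

def encrypt_blocks(data: bytes) -> list:
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    # Two workers, mirroring the dual-thread mode described above.
    with ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(cipher.encrypt, blocks))

ciphertexts = encrypt_blocks(b"x" * (4 * BLOCK))
print(len(ciphertexts))  # 4 encrypted blocks
```

Note that in CPython a genuine speed-up requires the cipher backend to release the GIL (or processes instead of threads); the sketch shows only the block-splitting structure.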
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate these inherent defects of the IIoT. In a traditional blockchain, data is stored in a Merkle tree; as data continues to grow, the scale of the proofs used to validate it grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Second, this paper uses PVC instead of the Merkle tree to store big data generated by IIoT; PVC improves the efficiency of traditional VC in both the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data secure storage mechanism and carries out a comparative experimental analysis. The mechanism greatly reduces communication loss and makes maximal, rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
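The proof-growth problem that motivates PVC is easy to demonstrate. The sketch below is our illustration of the baseline, not the paper's PVC construction: in a Merkle tree, a membership proof carries one sibling hash per level, so proof size grows as O(log n) with the number of leaves, whereas vector commitments aim for smaller openings.

```python
# Minimal sketch of the baseline inefficiency PVC targets: Merkle membership
# proofs grow logarithmically with the number of stored items.
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_proof_len(n_leaves: int) -> int:
    """Number of sibling hashes in a membership proof for one leaf."""
    level = [h(str(i).encode()) for i in range(n_leaves)]
    proof = 0
    while len(level) > 1:
        if len(level) % 2:          # duplicate the last node on odd levels
            level.append(level[-1])
        proof += 1                   # one sibling hash per tree level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return proof

for n in (1 << 10, 1 << 16, 1 << 20):
    print(n, merkle_proof_len(n))    # 10, 16, 20 sibling hashes
```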
Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Methodologies have therefore been studied for augmenting data and converting time series into images for analysis. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. Augmentation is performed by adding Gaussian noise, with the noise level set to 0.002, to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time series into images; this enables the identification of patterns in time-series data and helps capture their sequential dependencies. For anomaly detection, the PatchCore model is applied, showing excellent performance, and detected anomalous regions are represented as heat maps. Applying the anomaly map to the original image makes it possible to localize the areas where anomalies occur. The performance evaluation shows that both F1-score and accuracy are high when time-series data are converted to images. Additionally, processing the data as images rather than as raw time series significantly reduced both the data size and the training time. The proposed method can provide an important springboard for research in anomaly detection using time-series data and helps make the analysis of complex patterns in data lightweight.
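The augmentation and MTF steps can be reproduced in a few lines. The sketch below is our illustration of those two preprocessing stages only (PatchCore training is omitted); it assumes the `pyts` package for the MTF transform, and the 64-pixel image size and toy sine series are our choices.

```python
# Minimal sketch of the augmentation + MTF conversion steps (our illustration;
# the downstream PatchCore detector is not shown).
import numpy as np
from pyts.image import MarkovTransitionField

rng = np.random.default_rng(42)
X = np.sin(np.linspace(0, 8 * np.pi, 256))[None, :]  # one toy series

# Augmentation: additive Gaussian noise at the level quoted above (0.002).
X_aug = X + rng.normal(0.0, 0.002, size=X.shape)

# Convert both series to images; each pixel encodes a quantile-bin
# transition probability, preserving sequential structure.
mtf = MarkovTransitionField(image_size=64, n_bins=8, strategy="quantile")
images = mtf.fit_transform(np.vstack([X, X_aug]))
print(images.shape)  # (2, 64, 64) -> inputs for an image anomaly detector
```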
Since the impoundment of the Three Gorges Reservoir (TGR) in 2003, numerous slopes have experienced noticeable movement or destabilization owing to reservoir level changes and seasonal rainfall. One case is the Outang landslide, a large-scale, active landslide on the south bank of the Yangtze River. The latest available monitoring data and site investigations are analyzed to establish the spatial and temporal deformation characteristics of the landslide. Data mining technology, including two-step clustering and the Apriori algorithm, is then used to identify the dominant triggers of landslide movement. In the data mining process, the two-step clustering method groups the candidate triggers and the displacement rate into clusters, and the Apriori algorithm generates correlation criteria for cause and effect. The analysis considers multiple locations on the landslide and incorporates two time scales: long-term deformation on a monthly basis and short-term deformation on a daily basis. It shows that deformation of the Outang landslide is driven by both rainfall and reservoir water, and that it varies spatiotemporally mainly because of differences in local responses to hydrological factors. The data mining results reveal different dominant triggering factors depending on the monitoring frequency: monthly and bi-monthly cumulative rainfall control the monthly deformation, whereas 10-d cumulative rainfall and the 5-d cumulative drop in reservoir water level dominate the daily deformation. The spatiotemporal deformation pattern and the data mining rules associating precipitation and reservoir water level have the potential to be broadly applied to improve landslide prevention and control in dam reservoirs and other landslide-prone areas.
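The rule-mining step can be illustrated with a toy example. The sketch below is our reconstruction of the Apriori stage only, using the `mlxtend` package; the column names, discretization, and thresholds are assumptions, not the study's values, and the clustering stage that would produce the discretized flags is omitted.

```python
# Minimal sketch of the trigger-mining step (our illustration). Each row is
# one monitoring period flagged with discretized trigger levels and the
# observed displacement-rate class; Apriori extracts co-occurrence rules.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

records = pd.DataFrame(
    {
        "high_monthly_rainfall": [1, 1, 0, 1, 0, 1],
        "rapid_reservoir_drawdown": [0, 1, 0, 1, 0, 0],
        "high_displacement_rate": [0, 1, 0, 1, 0, 1],
    },
    dtype=bool,
)

itemsets = apriori(records, min_support=0.3, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
# Keep rules that predict fast movement from hydrological triggers.
print(rules[["antecedents", "consequents", "support", "confidence"]])
```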
Capabilities to assimilate Geostationary Operational Environmental Satellite R-series (GOES-R) Geostationary Lightning Mapper (GLM) flash extent density (FED) data within the operational Gridpoint Statistical Interpolation ensemble Kalman filter (GSI-EnKF) framework were previously developed and tested with a mesoscale convective system (MCS) case. In this study, these capabilities are further developed to assimilate GOES GLM FED data within the GSI ensemble-variational (EnVar) hybrid data assimilation (DA) framework. The results of assimilating the GLM FED data using 3DVar and pure En3DVar (PEn3DVar, using 100% ensemble covariance and no static covariance) are compared with those of EnKF/DfEnKF for a supercell storm case. The focus of this study is to validate the correctness and evaluate the performance of the new implementation rather than to compare the performance of FED DA among different DA schemes; therefore, only the results of 3DVar and PEn3DVar are examined and compared with EnKF/DfEnKF. Assimilation of a single FED observation shows that the magnitude and horizontal extent of the analysis increments from PEn3DVar are generally larger than those from EnKF, mainly because of the different localization strategies used in EnKF/DfEnKF and PEn3DVar, as well as the integration limits of the graupel mass in the observation operator. Overall, the forecast performance of PEn3DVar is comparable to that of EnKF/DfEnKF, suggesting a correct implementation.
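For context, the ensemble Kalman filter analysis step referenced throughout has the standard textbook form (generic notation, not specific to the GSI implementation):

```latex
\mathbf{x}^{a} = \mathbf{x}^{f} + \mathbf{K}\left(\mathbf{y} - H\mathbf{x}^{f}\right),
\qquad
\mathbf{K} = \mathbf{P}^{f}H^{\mathsf{T}}\left(H\,\mathbf{P}^{f}H^{\mathsf{T}} + \mathbf{R}\right)^{-1}
```

where \(\mathbf{P}^{f}\) is the flow-dependent forecast error covariance estimated from the ensemble, \(H\) the observation operator (here conceptually mapping model graupel mass to FED), and \(\mathbf{R}\) the observation error covariance. The localization applied to \(\mathbf{P}^{f}\) is one of the implementation choices noted above that drives the differing increment sizes.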
When building a classification model, the scenario where the samples of one class significantly outnumber those of the other class is called data imbalance. Data imbalance biases the trained classification model toward the majority class (usually defined as the negative class), which may harm the accuracy on the minority class (usually defined as the positive class) and lead to poor overall performance. This article proposes a method called MSHR-FCSSVM for imbalanced data classification, based on a new hybrid resampling approach (MSHR) and a new fine cost-sensitive support vector machine (CS-SVM) classifier (FCSSVM). The MSHR measures the separability of each negative sample through its silhouette value, calculated from the Mahalanobis distance between samples; on this basis, so-called pseudo-negative samples are screened out and used to generate new positive samples through linear interpolation (over-sampling step) before being deleted (under-sampling step). This approach replaces pseudo-negative samples with newly generated positive samples one by one to clear the inter-class overlap on the borderline without changing the overall scale of the dataset. The FCSSVM is an improved version of the traditional CS-SVM: it simultaneously considers the influence of both the sample-number imbalance and the class distribution on classification, and it finely tunes the class cost weights with an efficient optimization algorithm based on the physical phenomenon of rime ice (the RIME algorithm), using cross-validation accuracy as the fitness function, to accurately adjust the classification borderline. To verify the effectiveness of the proposed method, a series of experiments was carried out on 20 imbalanced datasets, including both mildly and extremely imbalanced ones. The experimental results show that MSHR-FCSSVM performs better than the comparison methods in most cases, and that both the MSHR and the FCSSVM play significant roles.
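The MSHR resampling pass can be sketched from the description above. The code below is our reconstruction, not the authors' implementation: silhouette values under the Mahalanobis metric flag "pseudo-negative" samples in the overlap region, each of which is replaced by a new positive sample interpolated between existing positives; the zero silhouette threshold and the interpolation endpoints are our assumptions.

```python
# Minimal sketch of an MSHR-style resampling pass (our reconstruction).
import numpy as np
from sklearn.metrics import silhouette_samples

rng = np.random.default_rng(0)
X_neg = rng.normal(0.0, 1.0, (80, 2))        # majority (negative) class
X_pos = rng.normal(1.5, 0.5, (12, 2))        # minority (positive) class
X = np.vstack([X_neg, X_pos])
y = np.array([0] * len(X_neg) + [1] * len(X_pos))

VI = np.linalg.inv(np.cov(X, rowvar=False))  # inverse covariance (Mahalanobis)
sil = silhouette_samples(X, y, metric="mahalanobis", VI=VI)

# Negatives with poor separability (silhouette < 0, an assumed threshold)
# are treated as pseudo-negatives sitting in the class overlap.
pseudo = (y == 0) & (sil < 0)

# Over-sampling: one new positive per pseudo-negative, interpolated between
# two random existing positives; under-sampling: drop the pseudo-negatives.
k = int(pseudo.sum())
w = rng.uniform(size=(k, 1))
i, j = rng.integers(0, len(X_pos), (2, k))
new_pos = w * X_pos[i] + (1 - w) * X_pos[j]

X_bal = np.vstack([X_neg[~pseudo[:len(X_neg)]], X_pos, new_pos])
print(X.shape, X_bal.shape)                  # overall scale unchanged
```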
基金National Natural Science Foundation of China (62274094, 62175117)Natural Science Foundation of Jiangsu Higher Education Institutions (22KJB510011)+1 种基金Key Lab of Modern Optical Technologies of Education Ministry of China, Soochow University (KJS2260)Huali Talents Program of Nanjing University of Posts and Telecommunications。
文摘Buried interfacial voids have always been a notorious phenomenon observed in the fabrication of lead perovskite films. The existence of interfacial voids at the buried interface will capture the carrier, suppress carrier transport efficiencies, and affect the stability of photovoltaic devices. However, the impact of these buried interfacial voids on tin perovskites, a promising avenue for advancing lead-free photovoltaics, has been largely overlooked. Here, we utilize an innovative weakly polar solvent pretreatment strategy(WPSPS) to mitigate buried interfacial voids of tin perovskites. Our investigation reveals the presence of numerous voids in tin perovskites during annealing, attributed to trapped dimethyl sulfoxide(DMSO) used in film formation. The WPSPS method facilitates accelerated DMSO evaporation, effectively reducing residual DMSO. Interestingly, the WPSPS shifts the energy level of PEDOT:PSS downward, making it more aligned with the perovskite. This alignment enhances the efficiency of charge carrier transport. As the result, tin perovskite film quality is significantly improved,achieving a maximum power conversion efficiency approaching 12% with only an 8.3% efficiency loss after 1700 h of stability tests, which compares well with the state-of-the-art stability of tin-based perovskite solar cells.
基金supported in part by NIH grants R01NS39600,U01MH114829RF1MH128693(to GAA)。
文摘Many fields,such as neuroscience,are experiencing the vast prolife ration of cellular data,underscoring the need fo r organizing and interpreting large datasets.A popular approach partitions data into manageable subsets via hierarchical clustering,but objective methods to determine the appropriate classification granularity are missing.We recently introduced a technique to systematically identify when to stop subdividing clusters based on the fundamental principle that cells must differ more between than within clusters.Here we present the corresponding protocol to classify cellular datasets by combining datadriven unsupervised hierarchical clustering with statistical testing.These general-purpose functions are applicable to any cellular dataset that can be organized as two-dimensional matrices of numerical values,including molecula r,physiological,and anatomical datasets.We demonstrate the protocol using cellular data from the Janelia MouseLight project to chara cterize morphological aspects of neurons.
基金supported by the Institute of Information&communications Technology Planning&Evaluation(IITP)grant funded by the Korea government(MSIT)(RS-2024-00399401,Development of Quantum-Safe Infrastructure Migration and Quantum Security Verification Technologies).
文摘With the rise of remote collaboration,the demand for advanced storage and collaboration tools has rapidly increased.However,traditional collaboration tools primarily rely on access control,leaving data stored on cloud servers vulnerable due to insufficient encryption.This paper introduces a novel mechanism that encrypts data in‘bundle’units,designed to meet the dual requirements of efficiency and security for frequently updated collaborative data.Each bundle includes updated information,allowing only the updated portions to be reencrypted when changes occur.The encryption method proposed in this paper addresses the inefficiencies of traditional encryption modes,such as Cipher Block Chaining(CBC)and Counter(CTR),which require decrypting and re-encrypting the entire dataset whenever updates occur.The proposed method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data.By utilizing this information,the method accurately identifies the modified portions and applies algorithms to selectively re-encrypt only those sections.This approach significantly enhances the efficiency of data updates while maintaining high performance,particularly in large-scale data environments.To validate this approach,we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied.Results show that the proposed method significantly outperforms CBC and CTR modes in execution speed,with greater performance gains as data size increases.Additionally,our security evaluation confirms that this method provides robust protection against both passive and active attacks.
基金supported by the National Natural Science Foundation of China [grant number 42030605]the National Key R&D Program of China [grant number 2020YFA0608004]。
文摘A remarkable marine heatwave,known as the“Blob”,occurred in the Northeast Pacific Ocean from late 2013 to early 2016,which displayed strong warm anomalies extending from the surface to a depth of 300 m.This study employed two assimilation schemes based on the global Climate Forecast System of Nanjing University of Information Science(NUIST-CFS 1.0)to investigate the impact of ocean data assimilation on the seasonal prediction of this extreme marine heatwave.The sea surface temperature(SST)nudging scheme assimilates SST only,while the deterministic ensemble Kalman filter(EnKF)scheme assimilates observations from the surface to the deep ocean.The latter notably improves the forecasting skill for subsurface temperature anomalies,especially at the depth of 100-300 m(the lower layer),outperforming the SST nudging scheme.It excels in predicting both horizontal and vertical heat transport in the lower layer,contributing to improved forecasts of the lower-layer warming during the Blob.These improvements stem from the assimilation of subsurface observational data,which are important in predicting the upper-ocean conditions.The results suggest that assimilating ocean data with the EnKF scheme significantly enhances the accuracy in predicting subsurface temperature anomalies during the Blob and offers better understanding of its underlying mechanisms.
文摘There is a growing body of clinical research on the utility of synthetic data derivatives,an emerging research tool in medicine.In nephrology,clinicians can use machine learning and artificial intelligence as powerful aids in their clinical decision-making while also preserving patient privacy.This is especially important given the epidemiology of chronic kidney disease,renal oncology,and hypertension worldwide.However,there remains a need to create a framework for guidance regarding how to better utilize synthetic data as a practical application in this research.
基金Project(50871046)supported by the National Natural Science Foundation of ChinaProject(2010CB631001)supported by the National Basic Research Program of China+1 种基金Project supported by the Program for Changjiang Scholars and Innovative Research Team inUniversity,ChinaIndo-China cultural exchange scholarship program by the Ministry of Human Resource Department(MHRD,India)and Ministry of Education(MOE,China)
文摘Phosphate-manganese, tannic acid and vanadium conversion coatings were proposed as an effective pre-treatment layer between electroless Ni-P coating and AZ91D magnesium alloy substrate to replace the traditional chromate plus HF pre-treatment. The electrochemical results show that the chrome-free coatings plus electroless Ni-P coating on the magnesium alloy has the lowest corrosion current density and most positive corrosion potential compared with chromate plus electroless Ni-P coating on the magnesium alloy. These proposed pre-treatment layers on the substrate reduce the corrosion of magnesium during plating process, and reduce the potential difference between the matrix and the second phase. Thus, an electroless Ni-P coating with fine crystalline and dense structure was obtained, with preferential phosphorus content, low porosity, good corrosion-resistance and strengthened adhesion than the chromate plus electroless Ni-P.
文摘AIM: To investigate the effects of selenium in rat retinal ischemia reperfusion(IR) model and compare pretreatment and post-treatment use.METHODS: Selenium pre-treatment group(n =8) was treated with intraperitoneal(i.p.) selenium 0.5 mg/kg for7 d and terminated 24 h after the IR injury. Selenium posttreatment group( n = 8) was treated with i. p. selenium0.5 mg/kg for 7d after the IR injury with termination at the end of the 7d period. Sham group(n =8) received i.p.saline injections identical to the selenium volume for 7d with termination 24 h after the IR injury. Control group(n =8) received no intervention. Main outcome measures were retina superoxide dismutase(SOD), glutathione(GSH),total antioxidant status(TAS), malondialdehyde(MDA),DNA fragmentation levels, and immunohistological apoptosis evaluation.RESULTS: Compared to the Sham group, selenium pre-treatment had a statistical difference in all parameters except SOD. Post-treatment selenium also resulted in statistical differences in all parameters except the MDA levels. When comparing selenium groups, the pre-treatment selenium group had a statistically higher success in reduction of markers of cell damage such as MDA and DNA fragmentation. In contrast, the post-selenium treatment group had resulted in statisticallyhigher levels of GSH. Histologically both selenium groups succeeded to limit retinal thickening and apoptosis. Pre-treatment use was statistically more successful in decreasing apoptosis in ganglion cell layer compared to post-treatment use.CONCLUSION: Selenium was successful in retinal protection in IR injuries. Pre-treatment efficacy was superior in terms of prevention of tissue damage and apoptosis.
基金Supported by the National High Technology Research and Development of China (2007AA030303)
文摘Pre-treatment, which supplies a stable, high-quality feed for reverse osmosis (RO) membranes, is a criti- cal step for successful operation in a seawater reverse osmosis plant. In this study, ceramic membrane systems were employed as pre-treatment for seawater desalination. A laboratory experiment was performed to investigate the effect of the cross-flow velocity on the critical flux and consequently to optimize the permeate flux. Then a pilot test was performed to investigate the long-term performance. The result shows that there is no significant effect of the cross-flow velocity on the critical flux when the cross-flow velocity varies in laminar flow region only or in turbulent flow region only, but the effect is distinct when the cross-flow velocity varies in the transition region. The membrane fouling is slight at the permeate flux of 150 L·m^-2·h^-1 and the system is stable, producing a high-quality feed (the turbidity and silt density index are less than 0.1 NTU and 3.0, respectively) for RO to run for 2922.4 h without chemical cleaning. Thus the ceramic membranes are suitable to filtrate seawater as the pre-treatment for RO.
基金Key Research Projects of Hubei Province(2020BBA045)the Agricultural Science and Technology Innovation Project of the Chinese Academy of Agricultural Sciences(CAAS-ASTIP-2016-OCRI)。
文摘Microwave,as a new heat treatment technology,has the characteristics of uniform and fast heating speed.It is an energy-saving technology known for improving oilseed product quality.Its efficiency mainly depends on the roasting power and time.However,the production of high-quality peanut butter using short-time roasting con-ditions are limited.Herein,we determined an appropriate microwave roasting power and time for peanuts and evaluated its impacts on the quality of peanut butter.Different roasting powers(400 W,800 W and 1200 W)and times(4,4.5,5,and 5.5 min)were preliminarily tested.Among them,800 W at 5 min was the most suitable.The roasting efficiency was further evaluated using color,sensory,bioactive compounds,storage stability,and safety risk factors of peanut butter produced from four peanut cultivars(Silihong,Baisha-1016,Yuanza-9102,and Yuhua-9414).The pre-treated butter obtained from three cultivars(Silihong,Yuanza-9102,and Yuhua-9414)with moisture content between 5%and 7.2%had a similar sensory score(6-7)as the commercial on a 9-point hedonic scale compare to the other.The color of the pre-treated peanut butter varies statistically with the commercial but remained in the recommended range of Hunter L*values of 51-52,respectively,for high initial moisture peanut cultivars.The total polyphenol(35.20-31.59±0.59μmol GAE/g)and tocopherol(19.05±0.35 mg/100 g)content in the butter obtained from three cultivars(Yuahua-9102,Yuhua,and Baisha-1016)and Silihong respectively,were significantly(P<0.05)higher than those in the commercial butter.The induction times of all pre-treated butter(19.80±0.99-7.84±0.07 h)were significantly(P<0.05)longer during storage at accelerated temperature than commercial samples.In addition,no benzo[a]pyrene was found in the pre-treated samples.Collectively,the microwave pretreated peanut butter was superior to the commercial one.These findings provided data support and a reference basis to promote microwave use for peanut butter production.
基金All procedures involving animals were reviewed and approved by the Tianjin Tiancheng New Drug Evaluation Co.,Ltd(Approval No.2023041701).
文摘BACKGROUND During cirrhosis,the liver is impaired and unable to synthesize and clear thrombopoietin properly.At the same time,the spleen assumes the function of hemofiltration and storage due to liver dysfunction,resulting in hypersplenism and excessive removal of platelets in the spleen,further reducing platelet count.When liver function is decompensated in cirrhotic patients,the decrease of thrombopoietin(TPO)synthesis is the main reason for the decrease of new platelet production.This change of TPO leads to thrombocytopenia and bleeding tendency in cirrhotic patients with hypersplenism.AIM To investigate the clinical efficacy of recombinant human TPO(rhTPO)in the treatment of perioperative thrombocytopenia during liver transplantation in cirrhotic mice with hypersplenism.METHODS C57BL/6J mice and TPO receptor-deficient mice were used to establish models of cirrhosis with hypersplenism.Subsequently,these mice underwent orthotopic liver transplantation(OLT).The mice in the experimental group were given rhTPO treatment for 3 consecutive days before surgery and 5 consecutive days after surgery,while the mice in the control group received the same dose of saline at the same frequency.Differences in liver function and platelet counts were determined between the experimental and control groups.Enzyme-linked immunosorbent assay was used to assess the expression of TPO and TPO receptor(c-Mpl)in the blood.RESULTS Preoperative administration of rhTPO significantly improved peri-OLT thrombocytopenia in mice with cirrhosis and hypersplenism.Blocking the expression of TPO receptors exacerbated peri-OLT thrombocytopenia.The concentration of TPO decreased while the concentration of c-Mpl increased in compensation in the mouse model of cirrhosis with hypersplenism.TPO pre-treatment significantly increased the postoperative TPO concentration in mice,which in turn led to a decrease in the c-Mpl concentration.TPO pre-treatment also significantly enhanced the Janus kinase(Jak)/signal transducers and activators of transcription pathway protein expressions in bone marrow stem cells of the C57BL/6J mice.Moreover,the administration of TPO,both before and after surgery,regulated the levels of biochemical indicators,such as alanine aminotransferase,alkaline phosphatase,and aspartate aminotransferase in the C57BL/6J mice.CONCLUSION Pre-treatment with TPO not only exhibited therapeutic effects on perioperative thrombocytopenia in the mice with cirrhosis and hypersplenism,who underwent liver transplantation but also significantly enhanced the perioperative liver function.
基金the sponsor CSIR (Council of Scientific and Industrial Research), New Delhi for their financial grant to carry out the present research work
文摘Detailed experimental investigations were carried out for microwave pre-treatment of high ash Indian coal at high power level(900 W) in microwave oven. The microwave exposure times were fixed at60 s and 120 s. A rheology characteristic for microwave pre-treatment of coal-water slurry(CWS) was performed in an online Bohlin viscometer. The non-Newtonian character of the slurry follows the rheological model of Ostwald de Waele. The values of n and k vary from 0.31 to 0.64 and 0.19 to 0.81 Pa·sn,respectively. This paper presents an artificial neural network(ANN) model to predict the effects of operational parameters on apparent viscosity of CWS. A 4-2-1 topology with Levenberg-Marquardt training algorithm(trainlm) was selected as the controlled ANN. Mean squared error(MSE) of 0.002 and coefficient of multiple determinations(R^2) of 0.99 were obtained for the outperforming model. The promising values of correlation coefficient further confirm the robustness and satisfactory performance of the proposed ANN model.
基金supported by the National Natural Science Foundation of China (41971104)the Open Foundation of the State Key Laboratory of Loess and Quaternary Geology,Institute of Earth Environment+1 种基金Chinese Academy of Sciences (CASSKLLQG1817)the Qilian Mountain National Park Research Center (Qinghai)(GKQ2019-01)。
文摘A knowledge of the tree-ring stable nitrogen isotope ratio(δ^(15)N)can deepen our understanding of forest ecosystem dynamics by indicating the long-term availability,cycling and sources of nitrogen(N).However,the radial mobility of N blurs the interannual variations in the long-term N records.Previous studies of the chemical extraction of tree rings before analysis had produced inconsistent results and it is still unclear whether it is necessary to pre-treat wood samples from specific tree species to remove soluble N compounds before determining theδ^(15)N values.We compared the effects of pre-treatment with organic solvents and hot ultrapure water on the N concentration andδ^(15)N of tree rings from endemic Qinghai spruce(Picea crassifolia)growing in the interior of the central Qilian Mountains,China,during the last 60 a.We assessed the effects of different preparation protocols on the removal of the labile N compounds and investigated the need to pre-treat wood samples before determining theδ^(15)N values of tree rings.Increasing trends of the tree-ring N concentration were consistently observed in both the extracted and unextracted wood samples.The total N removed by extraction with organic solvents was about 17.60%,with a significantly higher amount in the sapwood section(P<0.01).Theδ^(15)N values of tree rings decreased consistently from 1960 to 2019 in both the extracted and unextracted wood samples.Extraction with organic solvents increased theδ^(15)N values markedly by about 5.2‰and reduced the variations in theδ^(15)N series.However,extraction with hot ultrapure water had little effect,with only a slight decrease in theδ^(15)N values of about 0.5‰.Our results showed that the radial pattern in the inter-ring movement of N in Qinghai spruce was not minimized by extraction with either organic solvents or hot ultrapure water.It is unnecessary to conduct hot ultrapure water extraction for the wood samples from Qinghai spruce because of its negligible effect on the removal of the labile N.Theδ^(15)N variation trend of tree rings in the unextracted wood samples was not influenced by the heartwood-sapwood transition zone.We suggest that theδ^(15)N values of the unextracted wood samples of the climate-sensitive Qinghai spruce could be used to explore the ecophysiological dynamics while focusing on the long-term variations.
文摘The purpose of this work was to study the potential to enhance biogas production from pulp and paper mill sludge by the use of thermal pre-treatment in combination with chemical pre-treatment. Biogas from waste is a renewable fuel with very low emissions during combustion. To further reduce the use of fossil fuels, more biogas substrates are necessary. Pulp and paper mill sludge is a large untapped reservoir of potential biogas. Pulp and paper mill sludge was collected from a mill that produces both pulp and paper and has a modified waste activated sludge system as part of its wastewater treatment. Pre-treatments were chosen heat (70 ~C or 140℃) combined with either acid (pH 2 or pH 4) or base (pH 9 or pH 11, obtained with Ca(OH)2 or NaOH). Biogas potential was tested by anaerobic digestion batch assays under mesophilic conditions. All pre-treatments were tested in six replicates. Biogas volume was measured with a gas-tight syringe and methane concentration was measured with a gas chromatograph. The methane yield from sludge subjected to thermal pre-treatment at 70℃ did not differ from the untreated sludge, but thermal pre-treatment at 140℃ had a positive effect. Within the 70℃ thermal pre-treatment group, the pH 2 acid was the most successful chemical pre-treatment, and Ca(OH)2 pH 9 had the least effect with no measurable improvement in methane yield. For the 140 ℃ thermal pre-treatment group, acid and NaOH impacted methane production negatively, while the Ca(OH)2-treated sludge did not differ from sludge with no chemical pre-treatment. In conclusion, thermal pre-treatment at 70℃ showed no effect, whereas, pre-treatment at 140℃ improved methane yield with 170%, and for this sludge additional, chemical pre-treatments are unnecessary.
文摘In order to address the problems of the single encryption algorithm,such as low encryption efficiency and unreliable metadata for static data storage of big data platforms in the cloud computing environment,we propose a Hadoop based big data secure storage scheme.Firstly,in order to disperse the NameNode service from a single server to multiple servers,we combine HDFS federation and HDFS high-availability mechanisms,and use the Zookeeper distributed coordination mechanism to coordinate each node to achieve dual-channel storage.Then,we improve the ECC encryption algorithm for the encryption of ordinary data,and adopt a homomorphic encryption algorithm to encrypt data that needs to be calculated.To accelerate the encryption,we adopt the dualthread encryption mode.Finally,the HDFS control module is designed to combine the encryption algorithm with the storage model.Experimental results show that the proposed solution solves the problem of a single point of failure of metadata,performs well in terms of metadata reliability,and can realize the fault tolerance of the server.The improved encryption algorithm integrates the dual-channel storage mode,and the encryption storage efficiency improves by 27.6% on average.
基金supported by China’s National Natural Science Foundation(Nos.62072249,62072056)This work is also funded by the National Science Foundation of Hunan Province(2020JJ2029).
文摘With the development of Industry 4.0 and big data technology,the Industrial Internet of Things(IIoT)is hampered by inherent issues such as privacy,security,and fault tolerance,which pose certain challenges to the rapid development of IIoT.Blockchain technology has immutability,decentralization,and autonomy,which can greatly improve the inherent defects of the IIoT.In the traditional blockchain,data is stored in a Merkle tree.As data continues to grow,the scale of proofs used to validate it grows,threatening the efficiency,security,and reliability of blockchain-based IIoT.Accordingly,this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data.To solve this problem,a new Vector Commitment(VC)structure,Partition Vector Commitment(PVC),is proposed by improving the traditional VC structure.Secondly,this paper uses PVC instead of the Merkle tree to store big data generated by IIoT.PVC can improve the efficiency of traditional VC in the process of commitment and opening.Finally,this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative analysis of experiments.This mechanism can greatly reduce communication loss and maximize the rational use of storage space,which is of great significance for maintaining the security and stability of blockchain-based IIoT.
Funding: This research was financially supported by the Ministry of Trade, Industry, and Energy (MOTIE), Korea, under the “Project for Research and Development with Middle Markets Enterprises and DNA (Data, Network, AI) Universities” (AI-based Safety Assessment and Management System for Concrete Structures) (Reference Number P0024559), supervised by the Korea Institute for Advancement of Technology (KIAT).
Abstract: Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Methodologies have therefore been studied for augmenting time-series data and converting it into images for analysis. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. Augmentation is performed by adding Gaussian noise, with the noise level set to 0.002 to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to convert the time-series data into images while effectively visualizing their dynamic transitions; this enables the identification of patterns in the data and helps capture its sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, with detected anomalous regions represented as heat maps; overlaying the anomaly map on the original image makes it possible to localize where anomalies occur. The performance evaluation shows that both F1-score and accuracy are high when the time-series data is converted to images. Additionally, processing the data as images rather than as raw time series significantly reduced both the data size and the training time. The proposed method can provide an important springboard for research on anomaly detection using time-series data, and it helps keep the analysis of complex patterns in the data lightweight.
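A minimal numpy sketch of the two preprocessing steps, Gaussian-noise augmentation at level 0.002 and a quantile-binned Markov Transition Field, might look as follows; the stand-in signal and the bin count are illustrative assumptions.

    # Sketch: Gaussian-noise augmentation and a Markov Transition Field (MTF).
    import numpy as np

    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 8 * np.pi, 256))            # stand-in signal
    augmented = series + rng.normal(0.0, 0.002, series.shape)  # noise level 0.002

    def markov_transition_field(x, n_bins=8):
        # Assign each point to a quantile bin.
        edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
        states = np.digitize(x, edges)
        # First-order Markov transition matrix over the bins.
        W = np.zeros((n_bins, n_bins))
        for s, t in zip(states[:-1], states[1:]):
            W[s, t] += 1
        W /= np.maximum(W.sum(axis=1, keepdims=True), 1)
        # MTF[i, j] = transition probability between the bins of points i and j.
        return W[states[:, None], states[None, :]]

    mtf_image = markov_transition_field(augmented)
    print(mtf_image.shape)  # (256, 256), ready to be rendered as an image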
Funding: Supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021QD032).
Abstract: Since the impoundment of the Three Gorges Reservoir (TGR) in 2003, numerous slopes have experienced noticeable movement or destabilization owing to reservoir level changes and seasonal rainfall. One case is the Outang landslide, a large-scale and active landslide on the south bank of the Yangtze River. The latest available monitoring data and site investigations are analyzed to establish the spatial and temporal deformation characteristics of the landslide. Data mining techniques, namely two-step clustering and the Apriori algorithm, are then used to identify the dominant triggers of landslide movement: the two-step clustering method groups the candidate triggers and displacement rates into several clusters, and the Apriori algorithm generates association rules linking cause and effect. The analysis considers multiple locations on the landslide and incorporates two time scales: long-term deformation on a monthly basis and short-term deformation on a daily basis. It shows that the deformation of the Outang landslide is driven by both rainfall and reservoir water, and that it varies spatiotemporally mainly because of differences in local responses to hydrological factors. The data mining results reveal different dominant triggering factors depending on the monitoring frequency: the monthly and bi-monthly cumulative rainfall control the monthly deformation, while the 10-day cumulative rainfall and the 5-day cumulative drop in reservoir water level dominate the daily deformation. It is concluded that the spatiotemporal deformation pattern and the data mining rules associated with precipitation and reservoir water level have the potential to be broadly applied to improving landslide prevention and control in dam reservoirs and other landslide-prone areas.
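The rule-mining step can be sketched with the mlxtend implementation of Apriori; the boolean columns below are hypothetical discretizations of the triggers and the displacement rate, not the project's monitoring data.

    # Sketch of Apriori rule mining on discretized landslide monitoring records.
    import pandas as pd
    from mlxtend.frequent_patterns import apriori, association_rules

    records = pd.DataFrame({
        "rain_monthly_high":   [1, 1, 0, 1, 0, 1, 0, 1],
        "reservoir_drop_fast": [0, 1, 0, 1, 1, 1, 0, 0],
        "displacement_high":   [0, 1, 0, 1, 0, 1, 0, 1],
    }).astype(bool)

    itemsets = apriori(records, min_support=0.3, use_colnames=True)
    rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
    # Keep rules whose consequent is the displacement-rate category.
    mask = rules["consequents"].apply(lambda c: c == frozenset({"displacement_high"}))
    print(rules[mask][["antecedents", "consequents", "support", "confidence"]])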
Funding: Supported by a NOAA JTTI award via Grant #NA21OAR4590165 and NOAA GOES-R Program funding via Grant #NA16OAR4320115, provided by NOAA/Office of Oceanic and Atmospheric Research under NOAA-University of Oklahoma Cooperative Agreement #NA11OAR4320072, U.S. Department of Commerce, and supported by the National Oceanic and Atmospheric Administration (NOAA) of the U.S. Department of Commerce via Grant #NA18NWS4680063.
Abstract: Capabilities to assimilate Geostationary Operational Environmental Satellite R-series (GOES-R) Geostationary Lightning Mapper (GLM) flash extent density (FED) data within the operational Gridpoint Statistical Interpolation ensemble Kalman filter (GSI-EnKF) framework were previously developed and tested with a mesoscale convective system (MCS) case. In this study, these capabilities are further developed to assimilate GOES GLM FED data within the GSI ensemble-variational (EnVar) hybrid data assimilation (DA) framework. The results of assimilating the GLM FED data using 3DVar and pure En3DVar (PEn3DVar, using 100% ensemble covariance and no static covariance) are compared with those of EnKF/DfEnKF for a supercell storm case. The focus of this study is to validate the correctness and evaluate the performance of the new implementation rather than to compare the performance of FED DA among different DA schemes; only the results of 3DVar and PEn3DVar are examined and compared with EnKF/DfEnKF. Assimilation of a single FED observation shows that the magnitude and horizontal extent of the analysis increments from PEn3DVar are generally larger than those from EnKF, mainly because EnKF/DfEnKF and PEn3DVar use different localization strategies and because of the integration limits of the graupel mass in the observation operator. Overall, the forecast performance of PEn3DVar is comparable to that of EnKF/DfEnKF, suggesting a correct implementation.
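For orientation, the ensemble analysis increment probed by such a single-observation test can be sketched in numpy as follows; the ensemble, observation operator, and Gaussian localization taper are simplified stand-ins, not the GSI implementation.

    # Sketch of a single-observation ensemble Kalman update with localization.
    import numpy as np

    rng = np.random.default_rng(1)
    n_state, n_ens = 100, 40
    X = rng.normal(size=(n_state, n_ens))   # ensemble of state vectors
    Hx = X[50, :]                           # toy observation operator: one grid point
    y_obs, r_var = 2.0, 0.5                 # FED-like observation and error variance

    x_mean = X.mean(axis=1)
    Xp = X - x_mean[:, None]                # ensemble perturbations
    hxp = Hx - Hx.mean()

    PHt = Xp @ hxp / (n_ens - 1)            # cross-covariance with the observation
    HPHt = hxp @ hxp / (n_ens - 1)          # observation-space ensemble variance

    # Simple Gaussian taper with distance, standing in for Gaspari-Cohn.
    dist = np.abs(np.arange(n_state) - 50)
    rho = np.exp(-0.5 * (dist / 10.0) ** 2)

    K = rho * PHt / (HPHt + r_var)          # localized Kalman gain
    increment = K * (y_obs - Hx.mean())     # mean analysis increment
    print(increment[45:56].round(3))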
Funding: Supported by the Yunnan Major Scientific and Technological Projects (Grant No. 202302AD080001) and the National Natural Science Foundation of China (No. 52065033).
Abstract: When building a classification model, the scenario where the samples of one class significantly outnumber those of the other class is called data imbalance. Data imbalance biases the trained classification model toward the majority class (usually defined as the negative class), which may harm the accuracy of the minority class (usually defined as the positive class) and lead to poor overall performance. This article proposes a method called MSHR-FCSSVM for imbalanced data classification, based on a new hybrid resampling approach (MSHR) and a new fine cost-sensitive support vector machine (CS-SVM) classifier (FCSSVM). The MSHR measures the separability of each negative sample through its silhouette value, calculated from the Mahalanobis distance between samples; on this basis, so-called pseudo-negative samples are screened out, used to generate new positive samples through linear interpolation (the over-sampling step), and finally deleted (the under-sampling step). This approach replaces pseudo-negative samples with newly generated positive samples one by one to clear up the inter-class overlap on the borderline, without changing the overall scale of the dataset. The FCSSVM is an improved version of the traditional CS-SVM: it simultaneously considers the influence on classification of both the imbalance in sample numbers and the class distribution, and it finely tunes the class cost weights with the rime-ice (RIME) optimization algorithm, using cross-validation accuracy as the fitness function, to accurately adjust the classification borderline. To verify the effectiveness of the proposed method, a series of experiments is carried out on 20 imbalanced datasets, including both mildly and extremely imbalanced ones. The experimental results show that MSHR-FCSSVM outperforms the comparison methods in most cases, with both the MSHR and the FCSSVM playing significant roles.
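A compressed numpy sketch of the MSHR resampling idea, scoring each negative sample with a Mahalanobis-distance silhouette and replacing low-scoring pseudo-negatives with interpolated positives, might read as follows; the data, the 0.1 threshold, and the neighbor choice are illustrative assumptions.

    # Sketch of silhouette screening (Mahalanobis) + interpolation resampling.
    import numpy as np

    rng = np.random.default_rng(2)
    neg = rng.normal(0.0, 1.0, (50, 2))     # majority (negative) class
    pos = rng.normal(1.5, 1.0, (10, 2))     # minority (positive) class

    VI = np.linalg.inv(np.cov(np.vstack([neg, pos]).T))  # inverse covariance

    def maha(a, b):
        d = a - b
        return np.sqrt(np.einsum("...i,ij,...j->...", d, VI, d))

    sil = np.empty(len(neg))
    for i, x in enumerate(neg):
        a = maha(np.delete(neg, i, axis=0), x).mean()  # within-class cohesion
        b = maha(pos, x).mean()                        # separation from positives
        sil[i] = (b - a) / max(a, b)

    pseudo = np.where(sil < 0.1)[0]         # poorly separated negatives (assumed cut)
    # Over-sampling: a new positive between each pseudo-negative and a random
    # positive; under-sampling: the pseudo-negative itself is dropped.
    lam = rng.uniform(0, 1, (len(pseudo), 1))
    new_pos = lam * neg[pseudo] + (1 - lam) * pos[rng.integers(0, len(pos), len(pseudo))]
    neg_clean = np.delete(neg, pseudo, axis=0)
    print(len(neg_clean), len(np.vstack([pos, new_pos])))  # overall scale unchanged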