Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is difficult because of data imbalance, temporal dependence, and noise. Methodologies for data augmentation and for converting time-series data into images for analysis have therefore been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address data imbalance and temporal dependence and to improve robustness to noise. Data augmentation is performed by adding Gaussian noise, with the noise level set to 0.002, to maximize the generalization performance of the model. In addition, the Markov Transition Field (MTF) method is used to convert the time-series data into images while effectively visualizing their dynamic transitions, which helps identify patterns and capture the sequential dependencies of the data. For anomaly detection, the PatchCore model is applied and shows excellent performance, with the detected anomalous regions represented as heat maps; overlaying the anomaly map on the original image makes it possible to localize where anomalies occur. The performance evaluation shows that both the F1-score and accuracy are high when the time-series data are converted to images. In addition, processing the data as images rather than as raw time series significantly reduces both the data size and the training time. The proposed method can serve as an important springboard for research on anomaly detection with time-series data and helps address problems such as analyzing complex patterns in a lightweight manner.
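As a rough illustration of the augmentation and image-conversion steps described above, the sketch below adds Gaussian noise to a univariate series and builds a Markov Transition Field from quantile bins. The bin count, the toy signal, and the helper names are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def augment_with_noise(series, noise_level=0.002, seed=0):
    """Add zero-mean Gaussian noise to a 1-D series (augmentation step)."""
    rng = np.random.default_rng(seed)
    return series + rng.normal(0.0, noise_level, size=series.shape)

def markov_transition_field(series, n_bins=8):
    """Minimal Markov Transition Field: MTF[i, j] = P(bin(x_i) -> bin(x_j))."""
    # Assign each point to a quantile bin.
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(series, edges)
    # Estimate the first-order Markov transition matrix from adjacent points.
    W = np.zeros((n_bins, n_bins))
    for s, t in zip(states[:-1], states[1:]):
        W[s, t] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)  # row-normalize
    # Spread transition probabilities over all time-index pairs to form the image.
    return W[states[:, None], states[None, :]]

t = np.linspace(0, 4 * np.pi, 256)                   # toy signal (assumption)
x = augment_with_noise(np.sin(t), noise_level=0.002)
image = markov_transition_field(x, n_bins=8)         # 256 x 256 image for a detector such as PatchCore
print(image.shape)
```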
With the continuous development of big data technology, the digital transformation of enterprise human resource management has become a development trend. Human resources are among the most important resources of an enterprise and are crucial to its competitiveness. Enterprises need to attract, retain, and motivate excellent employees, thereby enhancing their innovation capability and improving their competitiveness and market share. To maintain an advantage in fierce market competition, enterprises need to adopt more scientific and effective human resource management methods that enhance organizational efficiency and competitiveness. This paper analyzes the dilemmas faced by enterprise human resource management, points out the new characteristics of human resource management enabled by big data, and puts forward feasible suggestions for enterprise digital transformation.
This study endeavors to formulate a comprehensive methodology for establishing a Geological Knowledge Base (GKB) tailored to fracture-cavity reservoir outcrops within the North Tarim Basin. Quantitative geological parameters were acquired through diverse means such as outcrop observations, thin-section studies, unmanned aerial vehicle scanning, and high-resolution cameras. Subsequently, a three-dimensional digital outcrop model was generated, and the parameters were standardized. An assessment of traditional geological knowledge was conducted to delineate the knowledge framework, content, and system of the GKB. Basic parameter knowledge was extracted using multiscale fine characterization techniques, including core statistics, field observations, and microscopic thin-section analysis. Key mechanism knowledge was identified by integrating trace-element analyses of fillings, isotope geochemical tests, and water-rock simulation experiments. Significant representational knowledge was then extracted by employing methods such as multiple linear regression, neural network technology, and discriminant classification. Subsequently, an analogy study was performed on the karst fracture-cavity system (KFCS) in both outcrop and underground reservoir settings. The results underscored several key findings: (1) A diverse range of techniques, including outcrop observations, core statistics, unmanned aerial vehicle scanning, high-resolution cameras, thin-section analysis, and electron scanning imaging, enabled the acquisition and standardization of data, facilitating effective management and integration of geological parameter data from multiple sources and scales. (2) The GKB for fracture-cavity reservoir outcrops, encompassing basic parameter knowledge, key mechanism knowledge, and significant representational knowledge, provides robust data support and systematic geological insights for the intricate and in-depth examination of the genetic mechanisms of fracture-cavity reservoirs. (3) The developmental characteristics of fracture-cavities in karst outcrops offer effective, efficient, and accurate guidance for fracture-cavity research in underground karst reservoirs. The outlined construction method of the outcrop geological knowledge base is applicable to fracture-cavity reservoirs in different layers and regions worldwide.
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In a traditional blockchain, data are stored in a Merkle tree; as the data grow, the proofs used to validate them grow as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store the big data generated by IIoT; PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. This mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
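The PVC construction itself is not specified in the abstract, so it is not reproduced here; the sketch below only illustrates the Merkle-tree baseline it improves on, where proof size grows logarithmically with the number of stored items. The record contents and helper names are illustrative assumptions.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_merkle(leaves):
    """Return all levels of a Merkle tree built over the given leaf data."""
    levels = [[h(x) for x in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        if len(prev) % 2:                      # duplicate last node on odd-sized levels
            prev = prev + [prev[-1]]
        levels.append([h(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def merkle_proof(levels, index):
    """Collect the sibling hashes needed to verify one leaf against the root."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append((index % 2, level[index ^ 1]))   # (am I the right child?, sibling hash)
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for is_right, sibling in proof:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

data = [f"iiot-record-{i}".encode() for i in range(1000)]   # toy IIoT records (assumption)
levels = build_merkle(data)
root = levels[-1][0]
proof = merkle_proof(levels, 417)
print(verify(data[417], proof, root), "proof length:", len(proof))   # ~log2(1000) siblings
```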
In order to address the problems of a single encryption algorithm, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in a cloud computing environment, we propose a Hadoop-based big data secure storage scheme. First, to distribute the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate the nodes and achieve dual-channel storage. Then, we improve the ECC encryption algorithm for the encryption of ordinary data and adopt a homomorphic encryption algorithm for data that needs to be computed on. To accelerate encryption, we adopt a dual-thread encryption mode. Finally, the HDFS control module is designed to combine the encryption algorithm with the storage model. Experimental results show that the proposed solution solves the problem of a single point of failure for metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm integrated with the dual-channel storage mode improves encryption storage efficiency by 27.6% on average.
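The abstract does not give the improved ECC scheme, so the sketch below only illustrates the dual-thread idea with a stand-in symmetric cipher (Fernet from the `cryptography` package): the payload is split into two channels and encrypted by two worker threads. The cipher choice, the chunking, and the payload are assumptions for illustration, not the paper's design.

```python
from concurrent.futures import ThreadPoolExecutor
from cryptography.fernet import Fernet  # stand-in cipher, not the paper's improved ECC

key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_channel(chunk: bytes) -> bytes:
    """Encrypt one storage channel's share of the data."""
    return cipher.encrypt(chunk)

def dual_thread_encrypt(data: bytes):
    """Split data into two channels and encrypt them concurrently (dual-thread mode)."""
    mid = len(data) // 2
    channels = [data[:mid], data[mid:]]
    with ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(encrypt_channel, channels))

blob = b"x" * 1_000_000                               # toy payload (assumption)
ciphertexts = dual_thread_encrypt(blob)
print([len(c) for c in ciphertexts])
```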
Hemorrhagic transformation is a major complication of large-artery atherosclerotic stroke (a major ischemic stroke subtype) that worsens outcomes and increases mortality. Disruption of the gut microbiota is an important feature of stroke, and some specific bacteria and bacterial metabolites may contribute to the pathogenesis of hemorrhagic transformation. We aimed to investigate the relationship between the gut microbiota and hemorrhagic transformation in large-artery atherosclerotic stroke. An observational retrospective study was conducted. From May 2020 to September 2021, blood and fecal samples were obtained upon admission from 32 patients with first-ever acute ischemic stroke not undergoing intravenous thrombolysis or endovascular thrombectomy, as well as from 16 healthy controls. Patients with stroke who developed hemorrhagic transformation (n=15) were compared with those who did not develop hemorrhagic transformation (n=17) and with healthy controls. The gut microbiota was assessed through 16S ribosomal ribonucleic acid sequencing. We also examined key components of the lipopolysaccharide pathway: lipopolysaccharide, lipopolysaccharide-binding protein, and soluble CD14. We observed that bacterial diversity was decreased in both the hemorrhagic transformation and non-hemorrhagic transformation groups compared with the healthy controls. The patients with ischemic stroke who developed hemorrhagic transformation exhibited altered gut microbiota composition, in particular an increase in the relative abundance and diversity of members of the Enterobacteriaceae family. Plasma lipopolysaccharide and lipopolysaccharide-binding protein levels were higher in the hemorrhagic transformation group than in the non-hemorrhagic transformation group. Lipopolysaccharide, lipopolysaccharide-binding protein, and soluble CD14 concentrations were associated with increased abundance of Enterobacteriaceae. Next, the role of the gut microbiota in hemorrhagic transformation was evaluated using an experimental stroke rat model. In this model, transplantation of the gut microbiota from hemorrhagic transformation rats into recipient rats triggered higher plasma levels of lipopolysaccharide, lipopolysaccharide-binding protein, and soluble CD14. Taken together, our findings demonstrate a noticeable change in the gut microbiota and the lipopolysaccharide-related inflammatory response in stroke patients with hemorrhagic transformation. This suggests that maintaining a balanced gut microbiota may be an important factor in preventing hemorrhagic transformation after stroke.
To study the formation and transformation mechanisms of long-period stacking ordered (LPSO) structures, a systematic atomic-scale analysis was conducted of the structural evolution of LPSO structures in the Mg-Gd-Y-Zn-Zr alloy annealed at 300-500 °C. Various types of metastable LPSO building-block clusters were found to exist in the alloy at different temperatures, precipitating during the solidification and homogenization processes. The stability of Zn/Y clusters is explained by first-principles density functional theory. The LPSO structure is distinguished by the arrangement of its different Zn/Y-enriched LPSO structural units, which comprise local FCC stacking sequences on a close-packed plane. The presence of solute atoms causes local lattice distortion, thereby enabling the rearrangement of Mg atoms into different configurations in the local lattice, and local HCP-FCC transitions occur between Mg and Zn atoms occupying nearest-neighbor positions. This finding indicates that LPSO structures can generate the necessary Shockley partial dislocations on specific slip planes, providing direct evidence of the transition from 18R to 14H. Growth of the LPSO, devoid of any defects and non-coherent interfaces, was observed separately from the other precipitated phases. As a result, the precipitation sequence of LPSO in the solidification stage was: Zn/Y cluster + Mg layers → various metastable LPSO building-block clusters → 18R/24R LPSO; whereas the precipitation sequence of LPSO during homogenization treatment was: 18R LPSO → various metastable LPSO building-block clusters → 14H LPSO. Of these, 14H LPSO was found to be the most thermodynamically stable structure.
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely the GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
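As a rough illustration of the GRU component only (the abstract does not disclose the GAN architecture or the tuned hyperparameters), the sketch below defines a minimal PyTorch GRU regressor that maps a strain sequence plus a condition vector to a flow-stress sequence. The layer sizes, condition encoding, and inputs are assumptions.

```python
import torch
import torch.nn as nn

class FlowCurveGRU(nn.Module):
    """Minimal GRU regressor: (strain sequence, condition) -> stress sequence."""
    def __init__(self, n_cond=4, hidden=64):
        super().__init__()
        self.gru = nn.GRU(input_size=1 + n_cond, hidden_size=hidden,
                          num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, strain, cond):
        # strain: (batch, steps, 1); cond: (batch, n_cond), e.g. annealing state and loading direction
        cond_seq = cond.unsqueeze(1).expand(-1, strain.size(1), -1)
        out, _ = self.gru(torch.cat([strain, cond_seq], dim=-1))
        return self.head(out)                            # predicted stress at every strain step

model = FlowCurveGRU()
strain = torch.linspace(0, 0.15, 100).view(1, 100, 1)    # toy compressive strain path
cond = torch.tensor([[350.0, 1.0, 0.0, 1.0]])            # illustrative condition encoding (assumption)
stress = model(strain, cond)
print(stress.shape)                                      # torch.Size([1, 100, 1])
```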
The reliability evaluation of insulated gate bipolar transistors (IGBTs) in electric vehicles faces challenges such as junction temperature measurement and limited computational and storage resources. In this paper, a junction temperature estimation approach based on a neural network, without additional cost, is proposed, and the lifetime calculation for the IGBT using electric vehicle big data is performed. The direct-current (DC) voltage, operating current, switching frequency, negative temperature coefficient (NTC) thermistor temperature, and IGBT lifetime are the inputs, and the junction temperature (T_j) is the output. With the rainflow counting method, the classified irregular temperatures are fed into the lifetime model to obtain the number of cycles to failure. The fatigue accumulation method is then used to calculate the IGBT lifetime. To overcome the limited computational and storage resources of electric vehicle controllers, the IGBT lifetime calculation runs on a big data platform; the lifetime is then transmitted wirelessly to the electric vehicles as an input for the neural network. Thus, the junction temperature of the IGBT under long-term operating conditions can be accurately estimated. A test platform combining the motor controller with the vehicle big data server is built for the IGBT accelerated aging test. Subsequently, IGBT lifetime predictions are derived from the junction temperature estimates obtained by the neural network method and by the thermal network method. The experiment shows that the lifetime prediction based on a neural network with big data achieves higher accuracy than that of the thermal network, which improves the reliability evaluation of the system.
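The rainflow counting and fatigue accumulation steps mentioned above follow a standard pattern; the sketch below counts temperature cycles with the third-party `rainflow` package (assumed available) and sums damage with Miner's rule using a generic Coffin-Manson lifetime model. The model coefficients and the temperature trace are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np
import rainflow  # third-party rainflow-counting package (assumption)

def cycles_to_failure(delta_T, A=3.0e14, n=5.0):
    """Generic Coffin-Manson model N_f = A * dT^-n (illustrative coefficients, not fitted here)."""
    return A * delta_T ** (-n)

def accumulated_damage(tj_trace):
    """Miner's rule: sum cycle_count / N_f over all rainflow-counted temperature swings."""
    damage = 0.0
    for delta_T, count in rainflow.count_cycles(tj_trace):
        if delta_T > 1.0:                      # ignore negligible swings
            damage += count / cycles_to_failure(delta_T)
    return damage                              # failure expected when accumulated damage reaches ~1

# Toy junction-temperature trace (assumption): load cycles plus measurement noise.
t = np.arange(0, 3600, 1.0)
tj = 60 + 25 * np.abs(np.sin(2 * np.pi * t / 300)) + 5 * np.random.default_rng(0).normal(size=t.size)
d = accumulated_damage(tj)
print(f"damage per hour: {d:.2e}, estimated life: {1/d:.1f} h")
```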
As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study explores a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data, treating the detection of atmospheric turbulence as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events. Moreover, comparisons with alternative machine learning techniques indicate that the proposed technique is the optimal methodology currently available. In summary, symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to that of established EDR approaches and surpassing that of machine learning algorithms. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amid rising environmental and operational hazards.
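As a rough sketch of symbolic classification by genetic programming (not the paper's exact setup), the example below assumes the open-source gplearn package and fits its SymbolicClassifier to a synthetic two-class dataset standing in for QAR-derived features; the feature construction and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from gplearn.genetic import SymbolicClassifier  # assumed available (gplearn >= 0.4)

# Synthetic stand-in for QAR-derived features (e.g., vertical-acceleration statistics).
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
y = (X[:, 0] ** 2 + 0.5 * X[:, 1] * X[:, 2] > 1.0).astype(int)   # toy "turbulence" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SymbolicClassifier(
    population_size=1000,
    generations=20,
    function_set=("add", "sub", "mul", "div"),
    parsimony_coefficient=0.001,
    random_state=0,
)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
print("evolved expression:", clf._program)   # human-readable symbolic classifier
```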
A benchmark experiment on ^(238)U slab samples was conducted using a deuterium-tritium neutron source at the China Institute of Atomic Energy. The leakage neutron spectra within the 0.8-16 MeV energy range at 60° and 120° were measured using the time-of-flight method. The samples were prepared as rectangular slabs with a 30 cm square base and thicknesses of 3, 6, and 9 cm. The leakage neutron spectra were also calculated using the MCNP-4C program based on the latest evaluated ^(238)U neutron data files from CENDL-3.2, ENDF/B-VIII.0, JENDL-5.0, and JEFF-3.3. Based on the comparison, the deficiencies and needed improvements in the ^(238)U evaluated nuclear data were analyzed. The results showed the following. (1) The calculated results for CENDL-3.2 significantly overestimated the measurements in the elastic scattering energy interval at 60° and 120°. (2) The calculated results for CENDL-3.2 overestimated the measurements in the inelastic scattering energy interval at 120°. (3) The calculated results for CENDL-3.2 significantly overestimated the measurements in the 3-8.5 MeV energy interval at 60° and 120°. (4) The calculated results with JENDL-5.0 were generally consistent with the measurements.
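The time-of-flight method mentioned above converts a measured flight time into neutron energy via relativistic kinematics; the short sketch below shows that standard conversion for an assumed flight path, not the experiment's actual geometry.

```python
import numpy as np

M_N_C2 = 939.565  # neutron rest energy in MeV
C = 299.792458    # speed of light in m per microsecond

def tof_to_energy(flight_path_m, flight_time_us):
    """Relativistic neutron kinetic energy (MeV) from flight path and flight time."""
    beta = (flight_path_m / flight_time_us) / C
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * M_N_C2

# Illustrative values only: a few-metre flight path typical of TOF spectrometry (assumption).
L = 5.0  # m
for t in (0.10, 0.20, 0.40):  # microseconds
    print(f"t = {t:.2f} us -> E = {tof_to_energy(L, t):.2f} MeV")
```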
When building a classification model, the scenario where the samples of one class significantly outnumber those of the other class is called data imbalance. Data imbalance causes the trained classification model to favor the majority class (usually defined as the negative class), which may harm the accuracy on the minority class (usually defined as the positive class) and thus lead to poor overall performance. A method called MSHR-FCSSVM for imbalanced data classification is proposed in this article, based on a new hybrid resampling approach (MSHR) and a new fine cost-sensitive support vector machine classifier (FCSSVM). The MSHR measures the separability of each negative sample through its silhouette value calculated with the Mahalanobis distance between samples; on this basis, so-called pseudo-negative samples are screened out to generate new positive samples through linear interpolation (over-sampling step) and are finally deleted (under-sampling step). This approach replaces pseudo-negative samples with the newly generated positive samples one by one, clearing up the inter-class overlap on the borderline without changing the overall scale of the dataset. The FCSSVM is an improved version of the traditional cost-sensitive SVM (CS-SVM). It simultaneously considers the influence of both the imbalance in sample numbers and the class distribution on classification, and it finely tunes the class cost weights using an efficient optimization algorithm based on the physical phenomenon of rime ice (the RIME algorithm), with cross-validation accuracy as the fitness function, to accurately adjust the classification borderline. To verify the effectiveness of the proposed method, a series of experiments are carried out on 20 imbalanced datasets, including both mildly and extremely imbalanced ones. The experimental results show that the MSHR-FCSSVM method performs better than the comparison methods in most cases and that both the MSHR and the FCSSVM play significant roles.
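The cost-sensitive SVM component can be illustrated with scikit-learn's class-weighted SVC; the sketch below compares an unweighted SVM with one whose minority-class cost is raised, on a synthetic imbalanced dataset. The weight value and the data are assumptions; the paper tunes such weights with the RIME optimizer, which is not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score
from sklearn.svm import SVC

# Synthetic imbalanced dataset: roughly 5% positive (minority) class.
X, y = make_classification(n_samples=3000, n_features=10, weights=[0.95, 0.05],
                           class_sep=0.8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

plain = SVC(kernel="rbf").fit(X_tr, y_tr)
# Cost-sensitive SVM: minority (positive) errors cost more. The 10x weight is an
# illustrative guess, whereas the paper tunes the class cost weights with RIME.
weighted = SVC(kernel="rbf", class_weight={0: 1.0, 1: 10.0}).fit(X_tr, y_tr)

print("plain    F1:", round(f1_score(y_te, plain.predict(X_te)), 3))
print("weighted F1:", round(f1_score(y_te, weighted.predict(X_te)), 3))
```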
A large-scale fine-grained Mg-Gd-Y-Zn-Zr alloy plate with high strength and ductility was successfully prepared by multi-pass friction stir processing (MFSP) in this work. The grain structure and the long-period stacking ordered (LPSO) phase were characterized, and the uniformity of the mechanical properties was investigated. Moreover, a quantitative relationship between the microstructure and the tensile yield strength was established. The results showed that the grains in the processed zone (PZ) and the interfacial zone (IZ) were refined from 50 μm to 3 μm and 4 μm, respectively, and numerous original LPSO phases were broken up. In the IZ, some block-shaped 18R LPSO phases were transformed into needle-like 14H LPSO phases owing to stacking faults and the short-range diffusion of solute atoms. The severe shear deformation, in the form of kinetic energy, caused profuse stacking faults to be generated and to move rapidly, greatly increasing the transformation rate of the LPSO phase. After MFSP, the ultimate tensile strength, yield strength, and elongation to failure of the large-scale plate were 367 MPa, 305 MPa, and 18.0%, respectively. Grain refinement and LPSO phase strengthening were the major strengthening mechanisms in the MFSP sample. In particular, the strength of the IZ was comparable to that of the PZ because the strength contribution of the 14H LPSO phase offsets the lack of grain-refinement strengthening in the IZ. This result opposes the widely accepted notion that the IZ is a weak region in MFSP-prepared large-scale fine-grained plates.
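The quantitative microstructure-strength relationship mentioned above is not given in the abstract; as a generic illustration of the grain-refinement contribution, the sketch below evaluates the standard Hall-Petch term for the reported grain sizes, with friction-stress and slope values that are assumptions rather than the paper's fitted constants.

```python
import numpy as np

def hall_petch(d_um, sigma0=80.0, k=210.0):
    """Hall-Petch estimate sigma_y = sigma_0 + k / sqrt(d), d in um, k in MPa*um^0.5.
    sigma0 and k are illustrative values for a Mg alloy, not fitted to this study."""
    return sigma0 + k / np.sqrt(d_um)

for zone, d in [("as-received", 50.0), ("processed zone", 3.0), ("interfacial zone", 4.0)]:
    print(f"{zone:16s} d = {d:4.1f} um -> Hall-Petch yield estimate ~ {hall_petch(d):.0f} MPa")
```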
Accurate prediction of formation pore pressure is essential for predicting fluid flow and managing hydrocarbon production in petroleum engineering. Deep learning techniques have recently received growing interest owing to their great potential for pore pressure prediction. However, most traditional deep learning models are less efficient at addressing generalization problems. To fill this technical gap, in this work we developed a new adaptive physics-informed deep learning model with high generalization capability to predict pore pressure values directly from seismic data. Specifically, the new model, named CGP-NN, consists of a novel parametric feature extraction approach (1DCPP), a stacked multilayer gated recurrent model (multilayer GRU), and an adaptive physics-informed loss function. Through training, the developed model can automatically select the optimal physical model to constrain the result of each pore pressure prediction. The CGP-NN model has the best generalization when the physics-related metric λ = 0.5. A hybrid approach combining the Eaton and Bowers methods is also proposed to build machine-learnable labels, addressing the problem of scarce labels. To validate the developed model and methodology, a case study on a complex reservoir in the Tarim Basin was performed, demonstrating high accuracy in the pore pressure prediction for new wells along with strong generalization ability. The adaptive physics-informed deep learning approach presented here has potential application in the prediction of pore pressures governed by multiple genesis mechanisms using seismic data.
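The Eaton method used for label construction is a standard empirical relation; the sketch below shows a typical velocity-based form, P_p = OBG - (OBG - P_n)·(V/V_n)^3, with illustrative inputs. The exponent and the normal-compaction trend are assumptions, and the paper's hybrid Eaton-Bowers labeling is not reproduced.

```python
import numpy as np

def eaton_pore_pressure(obg, p_normal, v_obs, v_normal, exponent=3.0):
    """Eaton's velocity-ratio method for pore-pressure gradient (same units as obg/p_normal)."""
    return obg - (obg - p_normal) * (v_obs / v_normal) ** exponent

# Illustrative gradients in equivalent mud weight (g/cm^3) and interval velocities (m/s).
obg = 2.30          # overburden gradient (assumption)
p_normal = 1.03     # hydrostatic gradient (assumption)
v_normal = np.array([3200.0, 3400.0, 3600.0])   # normal-compaction trend velocities
v_obs = np.array([3200.0, 3100.0, 2900.0])      # observed velocities (slower => overpressure)

pp = eaton_pore_pressure(obg, p_normal, v_obs, v_normal)
for vn, vo, p in zip(v_normal, v_obs, pp):
    print(f"Vn={vn:.0f}  Vobs={vo:.0f}  Pp ~ {p:.2f} g/cm^3 EMW")
```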
Rock mass quality serves as a vital index for predicting the stability and safety status of rock tunnel faces. In tunneling practice, rock mass quality is often assessed via a combination of qualitative and quantitative parameters. However, owing to the harsh on-site construction conditions, it is rather difficult to obtain some of the evaluation parameters that are essential for rock mass quality prediction. In this study, a novel improved Swin Transformer is proposed to detect, segment, and quantify rock mass characteristic parameters such as water leakage, fractures, and weak interlayers. The site experiment results demonstrate that the improved Swin Transformer achieves optimal segmentation results, with accuracies of 92%, 81%, and 86% for water leakage, fractures, and weak interlayers, respectively. A multisource rock tunnel face characteristic (RTFC) dataset comprising 11 parameters for predicting rock mass quality is established. Considering the limited predictive performance caused by the incomplete evaluation parameters in this dataset, a novel tree-augmented naive Bayesian network (BN) is proposed to address the challenge of the incomplete dataset, achieving a prediction accuracy of 88%. In comparison with other commonly used machine learning models, the proposed BN-based approach shows improved performance in predicting rock mass quality with the incomplete dataset. Using the established BN, a further sensitivity analysis is conducted to quantitatively evaluate the importance of the various parameters; the results indicate that the rock strength and fracture parameters exert the most significant influence on rock mass quality.
Gaining insight into the spatiotemporal distribution patterns of knowledge innovation is receiving increasing attention from policymakers and economic research organizations. Many studies use bibliometric data to analyze the popularity of certain research topics, well-adopted methodologies, influential authors, and the interrelationships among research disciplines. However, the visual exploration of the patterns of research topics with an emphasis on their spatial and temporal distribution remains challenging. This study combined a space-time cube (STC) and a 3D glyph to represent complex multivariate bibliographic data. We further implemented the visual design by developing an interactive interface. The effectiveness, understandability, and engagement of ST-Map were evaluated by seven experts in geovisualization. The results suggest that it is promising to use three-dimensional visualization to show both an overview and on-demand details on a single screen.
For the goals of security and privacy preservation, we propose a data sharing protocol based on blind batch encryption and a public ledger, which allows the integrity of sensitive data to be audited via the public ledger while preserving private information. Data owners can tightly manage their data with efficient revocation and grant only one-time, adaptive access to fulfill a request. We prove that our protocol is semantically secure, blind, and secure against oblivious requesters and malicious file keepers. We also provide a security analysis in the context of four typical attacks.
Background: Cavernous transformation of the portal vein (CTPV) due to portal vein obstruction is a rare vascular anomaly defined as the formation of multiple collateral vessels in the hepatic hilum. This study aimed to investigate the imaging features of the intrahepatic portal vein in adult patients with CTPV and to establish the relationship between the manifestations of the intrahepatic portal vein and the progression of CTPV. Methods: We retrospectively analyzed 14 CTPV patients in Beijing Tsinghua Changgung Hospital. All patients underwent both direct portal venography (DPV) and computed tomography angiography (CTA) to reveal the manifestations of the portal venous system. The vessels measured included the left portal vein (LPV), right portal vein (RPV), main portal vein (MPV), and portal vein bifurcation (PVB). Results: Nine males and 5 females, with a median age of 40.5 years, were included in the study. No significant difference was found in the diameters of the LPV or RPV measured by DPV and CTA. The visualization of the LPV, RPV, and PVB was better with DPV than with CTA. There was a significant association between LPV/RPV and PVB/MPV in terms of visibility revealed with DPV (P = 0.01), while this association was not observed with CTA. According to the imaging features of the portal vein measured by DPV, CTPV was classified into three categories to facilitate diagnosis and treatment. Conclusions: DPV was more accurate than CTA in revealing the course of the intrahepatic portal vein in patients with CTPV. The classification of CTPV, derived from the imaging features of the portal vein revealed by DPV, may provide a new perspective on the diagnosis and treatment of CTPV.
Ratoon rice, which refers to a second harvest of rice obtained from the regenerated tillers originating from the stubble of the first harvested crop, plays an important role in both food security and agroecology while requiring minimal agricultural inputs. However, accurately identifying ratoon rice crops is challenging because their spectral features are similar to those of other rice cropping systems (e.g., double rice). Moreover, images with a high spatiotemporal resolution are essential, since ratoon rice is generally cultivated in fragmented croplands within regions that frequently exhibit cloudy and rainy weather. In this study, taking Qichun County in Hubei Province, China as an example, we developed a new phenology-based ratoon rice vegetation index (PRVI) for ratoon rice mapping at a 30 m spatial resolution using a robust time series generated from Harmonized Landsat and Sentinel-2 (HLS) images. The PRVI, which incorporates the red, near-infrared, and shortwave infrared 1 bands, was developed based on an analysis of spectro-phenological separability and feature selection. Based on actual field samples, the performance of the PRVI for ratoon rice mapping was carefully evaluated by comparing it with several vegetation indices, including the normalized difference vegetation index (NDVI), the enhanced vegetation index (EVI), and the land surface water index (LSWI). The results suggested that the PRVI sufficiently captures the specific characteristics of ratoon rice, leading to favorable separability between ratoon rice and other land cover types. Furthermore, the PRVI showed the best performance for identifying ratoon rice in the phenological phases characterized by grain filling and harvesting through tillering of the ratoon crop (GHS-TS2), indicating that only a few images are required to obtain an accurate ratoon rice map. Finally, the PRVI performed better than the NDVI, EVI, LSWI, and their combination at the GHS-TS2 stages, with a producer's accuracy and a user's accuracy of 92.22% and 89.30%, respectively. These results demonstrate that the proposed PRVI based on HLS data can effectively identify ratoon rice in fragmented croplands at crucial phenological stages, which is promising for identifying the earliest timing of ratoon rice planting and can provide a fundamental dataset for crop management activities.
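The abstract does not give the PRVI formula, so it is not reproduced here; the sketch below only computes the three benchmark indices the PRVI is compared against (NDVI, EVI, LSWI) using their conventional definitions. The reflectance values for the example pixel are illustrative assumptions.

```python
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    # Standard MODIS-style EVI coefficients.
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def lswi(nir, swir1):
    return (nir - swir1) / (nir + swir1)

# Illustrative surface-reflectance values for one pixel (assumptions, not HLS samples).
blue, red, nir, swir1 = 0.04, 0.06, 0.42, 0.18
print(f"NDVI = {ndvi(nir, red):.3f}")
print(f"EVI  = {evi(nir, red, blue):.3f}")
print(f"LSWI = {lswi(nir, swir1):.3f}")
```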
基金This research was financially supported by the Ministry of Trade,Industry,and Energy(MOTIE),Korea,under the“Project for Research and Development with Middle Markets Enterprises and DNA(Data,Network,AI)Universities”(AI-based Safety Assessment and Management System for Concrete Structures)(ReferenceNumber P0024559)supervised by theKorea Institute for Advancement of Technology(KIAT).
文摘Time-series data provide important information in many fields,and their processing and analysis have been the focus of much research.However,detecting anomalies is very difficult due to data imbalance,temporal dependence,and noise.Therefore,methodologies for data augmentation and conversion of time series data into images for analysis have been studied.This paper proposes a fault detection model that uses time series data augmentation and transformation to address the problems of data imbalance,temporal dependence,and robustness to noise.The method of data augmentation is set as the addition of noise.It involves adding Gaussian noise,with the noise level set to 0.002,to maximize the generalization performance of the model.In addition,we use the Markov Transition Field(MTF)method to effectively visualize the dynamic transitions of the data while converting the time series data into images.It enables the identification of patterns in time series data and assists in capturing the sequential dependencies of the data.For anomaly detection,the PatchCore model is applied to show excellent performance,and the detected anomaly areas are represented as heat maps.It allows for the detection of anomalies,and by applying an anomaly map to the original image,it is possible to capture the areas where anomalies occur.The performance evaluation shows that both F1-score and Accuracy are high when time series data is converted to images.Additionally,when processed as images rather than as time series data,there was a significant reduction in both the size of the data and the training time.The proposed method can provide an important springboard for research in the field of anomaly detection using time series data.Besides,it helps solve problems such as analyzing complex patterns in data lightweight.
文摘With the continuous development of big data technology,the digital transformation of enterprise human resource management has become a development trend.Human resources is one of the most important resources of enterprises,which is crucial to the competitiveness of enterprises.Enterprises need to attract,retain,and motivate excellent employees,thereby enhancing the innovation ability of enterprises and improving competitiveness and market share in the market.To maintain advantages in the fierce market competition,enterprises need to adopt more scientific and effective human resource management methods to enhance organizational efficiency and competitiveness.At the same time,this paper analyzes the dilemma faced by enterprise human resource management,points out the new characteristics of enterprise human resource management enabled by big data,and puts forward feasible suggestions for enterprise digital transformation.
基金supported by the Major Scientific and Technological Projects of CNPC under grant ZD2019-183-006the National Science and Technology Major Project of China (2016ZX05014002-006)the National Natural Science Foundation of China (42072234,42272180)。
文摘This study endeavors to formulate a comprehensive methodology for establishing a Geological Knowledge Base(GKB)tailored to fracture-cavity reservoir outcrops within the North Tarim Basin.The acquisition of quantitative geological parameters was accomplished through diverse means such as outcrop observations,thin section studies,unmanned aerial vehicle scanning,and high-resolution cameras.Subsequently,a three-dimensional digital outcrop model was generated,and the parameters were standardized.An assessment of traditional geological knowledge was conducted to delineate the knowledge framework,content,and system of the GKB.The basic parameter knowledge was extracted using multiscale fine characterization techniques,including core statistics,field observations,and microscopic thin section analysis.Key mechanism knowledge was identified by integrating trace elements from filling,isotope geochemical tests,and water-rock simulation experiments.Significant representational knowledge was then extracted by employing various methods such as multiple linear regression,neural network technology,and discriminant classification.Subsequently,an analogy study was performed on the karst fracture-cavity system(KFCS)in both outcrop and underground reservoir settings.The results underscored several key findings:(1)Utilization of a diverse range of techniques,including outcrop observations,core statistics,unmanned aerial vehicle scanning,high-resolution cameras,thin section analysis,and electron scanning imaging,enabled the acquisition and standardization of data.This facilitated effective management and integration of geological parameter data from multiple sources and scales.(2)The GKB for fracture-cavity reservoir outcrops,encompassing basic parameter knowledge,key mechanism knowledge,and significant representational knowledge,provides robust data support and systematic geological insights for the intricate and in-depth examination of the genetic mechanisms of fracture-cavity reservoirs.(3)The developmental characteristics of fracturecavities in karst outcrops offer effective,efficient,and accurate guidance for fracture-cavity research in underground karst reservoirs.The outlined construction method of the outcrop geological knowledge base is applicable to various fracture-cavity reservoirs in different layers and regions worldwide.
基金supported by China’s National Natural Science Foundation(Nos.62072249,62072056)This work is also funded by the National Science Foundation of Hunan Province(2020JJ2029).
文摘With the development of Industry 4.0 and big data technology,the Industrial Internet of Things(IIoT)is hampered by inherent issues such as privacy,security,and fault tolerance,which pose certain challenges to the rapid development of IIoT.Blockchain technology has immutability,decentralization,and autonomy,which can greatly improve the inherent defects of the IIoT.In the traditional blockchain,data is stored in a Merkle tree.As data continues to grow,the scale of proofs used to validate it grows,threatening the efficiency,security,and reliability of blockchain-based IIoT.Accordingly,this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data.To solve this problem,a new Vector Commitment(VC)structure,Partition Vector Commitment(PVC),is proposed by improving the traditional VC structure.Secondly,this paper uses PVC instead of the Merkle tree to store big data generated by IIoT.PVC can improve the efficiency of traditional VC in the process of commitment and opening.Finally,this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative analysis of experiments.This mechanism can greatly reduce communication loss and maximize the rational use of storage space,which is of great significance for maintaining the security and stability of blockchain-based IIoT.
文摘In order to address the problems of the single encryption algorithm,such as low encryption efficiency and unreliable metadata for static data storage of big data platforms in the cloud computing environment,we propose a Hadoop based big data secure storage scheme.Firstly,in order to disperse the NameNode service from a single server to multiple servers,we combine HDFS federation and HDFS high-availability mechanisms,and use the Zookeeper distributed coordination mechanism to coordinate each node to achieve dual-channel storage.Then,we improve the ECC encryption algorithm for the encryption of ordinary data,and adopt a homomorphic encryption algorithm to encrypt data that needs to be calculated.To accelerate the encryption,we adopt the dualthread encryption mode.Finally,the HDFS control module is designed to combine the encryption algorithm with the storage model.Experimental results show that the proposed solution solves the problem of a single point of failure of metadata,performs well in terms of metadata reliability,and can realize the fault tolerance of the server.The improved encryption algorithm integrates the dual-channel storage mode,and the encryption storage efficiency improves by 27.6% on average.
基金supported by the National Key Research and Development Projects,Nos.2022 YFC3602400,2022 YFC3602401(to JX)the Project Program of National Clinical Research Center for Geriatric Disorders(Xiangya Hospital),No.2020LNJJ16(to JX)the National Natural Science Foundation of China,No.82271369(to JX)。
文摘Hemorrhagic transformation is a major complication of large-artery atheroscle rotic stroke(a major ischemic stro ke subtype)that wo rsens outcomes and increases mortality.Disruption of the gut microbiota is an important feature of stroke,and some specific bacteria and bacterial metabolites may contribute to hemorrhagic transformation pathogenesis.We aimed to investigate the relationship between the gut microbiota and hemorrhagic transformation in largearte ry atheroscle rotic stro ke.An observational retrospective study was conducted.From May 2020 to September 2021,blood and fecal samples were obtained upon admission from 32 patients with first-ever acute ischemic stroke and not undergoing intravenous thrombolysis or endovascular thrombectomy,as well as 16 healthy controls.Patients with stro ke who developed hemorrhagic transfo rmation(n=15)were compared to those who did not develop hemorrhagic transformation(n=17)and with healthy controls.The gut microbiota was assessed through 16S ribosomal ribonucleic acid sequencing.We also examined key components of the lipopolysaccharide pathway:lipopolysaccharide,lipopolysaccharide-binding protein,and soluble CD14.We observed that bacterial diversity was decreased in both the hemorrhagic transformation and non-hemorrhagic transfo rmation group compared with the healthy controls.The patients with ischemic stro ke who developed hemorrhagic transfo rmation exhibited altered gut micro biota composition,in particular an increase in the relative abundance and dive rsity of members belonging to the Enterobacteriaceae family.Plasma lipopolysaccharide and lipopolysaccharide-binding protein levels were higher in the hemorrhagic transformation group compared with the non-hemorrhagic transfo rmation group.lipopolysaccharide,lipopolysaccharide-binding protein,and soluble CD14 concentrations were associated with increased abundance of Enterobacte riaceae.Next,the role of the gut microbiota in hemorrhagic transformation was evaluated using an experimental stroke rat model.In this model,transplantation of the gut microbiota from hemorrhagic transformation rats into the recipient rats triggered higher plasma levels of lipopolysaccharide,lipopolysaccharide-binding protein,and soluble CD14.Ta ken togethe r,our findings demonstrate a noticeable change in the gut microbiota and lipopolysaccharide-related inflammatory response in stroke patients with hemorrhagic transformation.This suggests that maintaining a balanced gut microbiota may be an important factor in preventing hemorrhagic transfo rmation after stro ke.
基金financially funded by Natural Science Basic Research Program of Shaanxi(grant number 2022JM-239)Key Research and Development Project of Shaanxi Provincial(grant number 2021LLRH-05–08)。
文摘To study the formation and transformation mechanism of long-period stacked ordered(LPSO)structures,a systematic atomic scale analysis was conducted for the structural evolution of long-period stacked ordered(LPSO)structures in the Mg-Gd-Y-Zn-Zr alloy annealed at 300℃~500℃.Various types of metastable LPSO building block clusters were found to exist in alloy structures at different temperatures,which precipitate during the solidification and homogenization process.The stability of Zn/Y clusters is explained by the first principles of density functional theory.The LPSO structure is distinguished by the arrangement of its different Zn/Y enriched LPSO structural units,which comprises local fcc stacking sequences upon a tightly packed plane.The presence of solute atoms causes local lattice distortion,thereby enabling the rearrangement of Mg atoms in the different configurations in the local lattice,and local HCP-FCC transitions occur between Mg and Zn atoms occupying the nearest neighbor positions.This finding indicates that LPSO structures can generate necessary Schockley partial dislocations on specific slip surfaces,providing direct evidence of the transition from 18R to 14H.Growth of the LPSO,devoid of any defects and non-coherent interfaces,was observed separately from other precipitated phases.As a result,the precipitation sequence of LPSO in the solidification stage was as follows:Zn/Ycluster+Mg layers→various metastable LPSO building block clusters→18R/24R LPSO;whereas the precipitation sequence of LPSO during homogenization treatment was observed to be as follows:18R LPSO→various metastable LPSO building block clusters→14H LPSO.Of these,14H LPSO was found to be the most thermodynamically stable structure.
基金Korea Institute of Energy Technology Evaluation and Planning(KETEP)grant funded by the Korea government(Grant No.20214000000140,Graduate School of Convergence for Clean Energy Integrated Power Generation)Korea Basic Science Institute(National Research Facilities and Equipment Center)grant funded by the Ministry of Education(2021R1A6C101A449)the National Research Foundation of Korea grant funded by the Ministry of Science and ICT(2021R1A2C1095139),Republic of Korea。
文摘Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition.This characteristic results in a diverse range of flow curves that vary with a deformation condition.This study proposes a novel approach for accurately predicting an anisotropic deformation behavior of wrought Mg alloys using machine learning(ML)with data augmentation.The developed model combines four key strategies from data science:learning the entire flow curves,generative adversarial networks(GAN),algorithm-driven hyperparameter tuning,and gated recurrent unit(GRU)architecture.The proposed model,namely GAN-aided GRU,was extensively evaluated for various predictive scenarios,such as interpolation,extrapolation,and a limited dataset size.The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions.The GAN-aided GRU results were superior to those of previous ML models and constitutive equations.The superior performance was attributed to hyperparameter optimization,GAN-based data augmentation,and the inherent predictivity of the GRU for extrapolation.As a first attempt to employ ML techniques other than artificial neural networks,this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
文摘There are challenges to the reliability evaluation for insulated gate bipolar transistors(IGBT)on electric vehicles,such as junction temperature measurement,computational and storage resources.In this paper,a junction temperature estimation approach based on neural network without additional cost is proposed and the lifetime calculation for IGBT using electric vehicle big data is performed.The direct current(DC)voltage,operation current,switching frequency,negative thermal coefficient thermistor(NTC)temperature and IGBT lifetime are inputs.And the junction temperature(T_(j))is output.With the rain flow counting method,the classified irregular temperatures are brought into the life model for the failure cycles.The fatigue accumulation method is then used to calculate the IGBT lifetime.To solve the limited computational and storage resources of electric vehicle controllers,the operation of IGBT lifetime calculation is running on a big data platform.The lifetime is then transmitted wirelessly to electric vehicles as input for neural network.Thus the junction temperature of IGBT under long-term operating conditions can be accurately estimated.A test platform of the motor controller combined with the vehicle big data server is built for the IGBT accelerated aging test.Subsequently,the IGBT lifetime predictions are derived from the junction temperature estimation by the neural network method and the thermal network method.The experiment shows that the lifetime prediction based on a neural network with big data demonstrates a higher accuracy than that of the thermal network,which improves the reliability evaluation of system.
基金supported by the Meteorological Soft Science Project(Grant No.2023ZZXM29)the Natural Science Fund Project of Tianjin,China(Grant No.21JCYBJC00740)the Key Research and Development-Social Development Program of Jiangsu Province,China(Grant No.BE2021685).
文摘As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry,it has become imperative to monitor and mitigate these threats to ensure civil aviation safety.The eddy dissipation rate(EDR)has been established as the standard metric for quantifying turbulence in civil aviation.This study aims to explore a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder(QAR)data.The detection of atmospheric turbulence is approached as an anomaly detection problem.Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events.Moreover,comparisons with alternative machine learning techniques indicate that the proposed technique is the optimal methodology currently available.In summary,the use of symbolic classification via genetic programming enables accurate turbulence detection from QAR data,comparable to that with established EDR approaches and surpassing that achieved with machine learning algorithms.This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amidst rising environmental and operational hazards.
基金This work was supported by the general program(No.1177531)joint funding(No.U2067205)from the National Natural Science Foundation of China.
文摘A benchmark experiment on^(238)U slab samples was conducted using a deuterium-tritium neutron source at the China Institute of Atomic Energy.The leakage neutron spectra within energy levels of 0.8-16 MeV at 60°and 120°were measured using the time-of-flight method.The samples were prepared as rectangular slabs with a 30 cm square base and thicknesses of 3,6,and 9 cm.The leakage neutron spectra were also calculated using the MCNP-4C program based on the latest evaluated files of^(238)U evaluated neutron data from CENDL-3.2,ENDF/B-Ⅷ.0,JENDL-5.0,and JEFF-3.3.Based on the comparison,the deficiencies and improvements in^(238)U evaluated nuclear data were analyzed.The results showed the following.(1)The calculated results for CENDL-3.2 significantly overestimated the measurements in the energy interval of elastic scattering at 60°and 120°.(2)The calculated results of CENDL-3.2 overestimated the measurements in the energy interval of inelastic scattering at 120°.(3)The calculated results for CENDL-3.2 significantly overestimated the measurements in the 3-8.5 MeV energy interval at 60°and 120°.(4)The calculated results with JENDL-5.0 were generally consistent with the measurement results.
基金supported by the Yunnan Major Scientific and Technological Projects(Grant No.202302AD080001)the National Natural Science Foundation,China(No.52065033).
文摘When building a classification model,the scenario where the samples of one class are significantly more than those of the other class is called data imbalance.Data imbalance causes the trained classification model to be in favor of the majority class(usually defined as the negative class),which may do harm to the accuracy of the minority class(usually defined as the positive class),and then lead to poor overall performance of the model.A method called MSHR-FCSSVM for solving imbalanced data classification is proposed in this article,which is based on a new hybrid resampling approach(MSHR)and a new fine cost-sensitive support vector machine(CS-SVM)classifier(FCSSVM).The MSHR measures the separability of each negative sample through its Silhouette value calculated by Mahalanobis distance between samples,based on which,the so-called pseudo-negative samples are screened out to generate new positive samples(over-sampling step)through linear interpolation and are deleted finally(under-sampling step).This approach replaces pseudo-negative samples with generated new positive samples one by one to clear up the inter-class overlap on the borderline,without changing the overall scale of the dataset.The FCSSVM is an improved version of the traditional CS-SVM.It considers influences of both the imbalance of sample number and the class distribution on classification simultaneously,and through finely tuning the class cost weights by using the efficient optimization algorithm based on the physical phenomenon of rime-ice(RIME)algorithm with cross-validation accuracy as the fitness function to accurately adjust the classification borderline.To verify the effectiveness of the proposed method,a series of experiments are carried out based on 20 imbalanced datasets including both mildly and extremely imbalanced datasets.The experimental results show that the MSHR-FCSSVM method performs better than the methods for comparison in most cases,and both the MSHR and the FCSSVM played significant roles.
基金supported by the National Key Research and Development Program of China(2021YFB3501002)State Key Program of National Natural Science Foundation of China(5203405)+3 种基金National Natural Science Foundation of China(51974220,52104383)National Key Research and Development Program of China(2021YFB3700902)Key Research and Development Program of Shaanxi Province(2020ZDLGY13-06,2017ZDXM-GY-037)Shaanxi Province National Science Fund for Distinguished Young Scholars(2022JC-24)。
文摘A large-scale fine-grained Mg-Gd-Y-Zn-Zr alloy plate with high strength and ductility was successfully prepared by multi-pass friction stir processing(MFSP)technology in this work.The structure of grains and long period stacking ordered(LPSO)phase were characterized,and the mechanical properties uniformity was investigated.Moreover,a quantitative relationship between the microstructure and tensile yield strength was established.The results showed that the grains in the processed zone(PZ)and interfacial zone(IZ)were refined from 50μm to 3μm and 4μm,respectively,and numerous original LPSO phases were broken.In IZ,some block-shaped 18R LPSO phases were transformed into needle-like 14H LPSO phases due to stacking faults and the short-range diffusion of solute atoms.The severe shear deformation in the form of kinetic energy caused profuse stacking fault to be generated and move rapidly,greatly increasing the transformation rate of LPSO phase.After MFSP,the ultimate tensile strength,yield strength and elongation to failure of the large-scale plate were 367 MPa,305 MPa and 18.0% respectively.Grain refinement and LPSO phase strengthening were the major strengthening mechanisms for the MFSP sample.In particularly,the strength of IZ was comparable to that of PZ because the strength contribution of the 14H LPSO phase offsets the lack of grain refinement strengthening in IZ.This result opposes the widely accepted notion that IZ is a weak region in MFSP-prepared large-scale fine-grained plate.
Funding: Funded by the National Natural Science Foundation of China (General Program: No. 52074314, No. U19B6003-05) and the National Key Research and Development Program of China (2019YFA0708303-05).
Abstract: Accurate prediction of formation pore pressure is essential for predicting fluid flow and managing hydrocarbon production in petroleum engineering. Deep learning techniques have recently been receiving more interest because of their great potential for pore pressure prediction. However, most traditional deep learning models are less effective at addressing generalization problems. To fill this technical gap, in this work we developed a new adaptive physics-informed deep learning model with high generalization capability that predicts pore pressure values directly from seismic data. Specifically, the new model, named CGP-NN, consists of a novel parametric feature extraction approach (1DCPP), a stacked multilayer gated recurrent model (multilayer GRU), and an adaptive physics-informed loss function. Through machine training, the developed model can automatically select the optimal physical model to constrain the results for each pore pressure prediction. The CGP-NN model generalizes best when the physics-related metric λ = 0.5. A hybrid approach combining the Eaton and Bowers methods is also proposed to build machine-learnable labels, addressing the problem of scarce labels. To validate the developed model and methodology, a case study on a complex reservoir in the Tarim Basin was performed, demonstrating high accuracy in the pore pressure prediction of new wells along with strong generalization ability. The adaptive physics-informed deep learning approach presented here has potential application in the prediction of pore pressures coupled with multiple genesis mechanisms using seismic data.
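One simple way to picture a physics-informed loss of the kind described above is a weighted sum of a data-fit term and a physics-consistency term, with the physics term supplied by a classical pore-pressure model such as Eaton's velocity method. The sketch below assumes that form; the weighting scheme, the Eaton exponent of 3, and the way the physics term enters the loss are assumptions of this sketch, not the authors' exact formulation.

```python
# Illustrative physics-informed loss: data misfit plus a penalty tying the
# network prediction to an Eaton-style pore-pressure estimate, weighted by lam.
import torch

def eaton_pore_pressure(overburden, hydrostatic, velocity, normal_velocity, n=3.0):
    """Eaton's velocity method: Pp = S - (S - Ph) * (V / Vn)**n."""
    return overburden - (overburden - hydrostatic) * (velocity / normal_velocity) ** n

def physics_informed_loss(pred, target, overburden, hydrostatic, velocity,
                          normal_velocity, lam=0.5):
    data_loss = torch.mean((pred - target) ** 2)                         # fit to labels
    physics_ref = eaton_pore_pressure(overburden, hydrostatic, velocity, normal_velocity)
    physics_loss = torch.mean((pred - physics_ref) ** 2)                 # consistency with physics
    return (1.0 - lam) * data_loss + lam * physics_loss
```

With lam = 0 the loss reduces to plain supervised regression, while lam = 1 pushes the network entirely toward the physical model; intermediate values trade off the two terms.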
Funding: Supported by the National Natural Science Foundation of China (Nos. 52279107 and 52379106), the Qingdao Guoxin Jiaozhou Bay Second Submarine Tunnel Co., Ltd., the Academician and Expert Workstation of Yunnan Province (No. 202205AF150015), and the Science and Technology Innovation Project of YCIC Group Co., Ltd. (No. YCIC-YF-2022-15).
Abstract: Rock mass quality serves as a vital index for predicting the stability and safety status of rock tunnel faces. In tunneling practice, rock mass quality is often assessed via a combination of qualitative and quantitative parameters. However, owing to the harsh on-site construction conditions, it is rather difficult to obtain some of the evaluation parameters that are essential for rock mass quality prediction. In this study, a novel improved Swin Transformer is proposed to detect, segment, and quantify rock mass characteristic parameters such as water leakage, fractures, and weak interlayers. The site experiment results demonstrate that the improved Swin Transformer achieves optimal segmentation results, with accuracies of 92%, 81%, and 86% for water leakage, fractures, and weak interlayers, respectively. A multisource rock tunnel face characteristic (RTFC) dataset, which includes 11 parameters for predicting rock mass quality, is established. Because incomplete evaluation parameters in this dataset limit predictive performance, a novel tree-augmented naive Bayesian network (BN) is proposed to address the challenge of the incomplete dataset, achieving a prediction accuracy of 88%. In comparison with other commonly used machine learning models, the proposed BN-based approach delivered improved performance in predicting rock mass quality from the incomplete dataset. Using the established BN, a further sensitivity analysis is conducted to quantitatively evaluate the importance of the various parameters; the results indicate that the rock strength and fracture parameters exert the most significant influence on rock mass quality.
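To show why a Bayesian network copes naturally with incomplete evaluation parameters, the sketch below uses a plain discrete naive Bayes classifier as a simplification of the tree-augmented BN in the paper: any face parameter that could not be measured on site is simply left out of the evidence and contributes no likelihood term. The class names, feature names, and probability tables are hypothetical.

```python
# Missing-evidence handling in a discrete Bayesian classifier (naive Bayes here,
# as a simplification of the paper's tree-augmented BN). All numbers are hypothetical.
import math

priors = {"good": 0.40, "fair": 0.35, "poor": 0.25}
cpts = {  # P(feature value | class); two illustrative parameters out of the 11
    "water_leakage": {"dry":      {"good": 0.7, "fair": 0.4, "poor": 0.2},
                      "dripping": {"good": 0.3, "fair": 0.6, "poor": 0.8}},
    "fractures":     {"sparse":   {"good": 0.8, "fair": 0.5, "poor": 0.1},
                      "dense":    {"good": 0.2, "fair": 0.5, "poor": 0.9}},
}

def predict(evidence):
    """evidence maps observed feature names to values; unmeasured parameters are omitted."""
    scores = {}
    for cls, prior in priors.items():
        log_p = math.log(prior)
        for feat, val in evidence.items():          # missing features add no term
            log_p += math.log(cpts[feat][val][cls])
        scores[cls] = log_p
    return max(scores, key=scores.get)

print(predict({"water_leakage": "dripping"}))       # works even though 'fractures' is missing
```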
Abstract: Gaining insight into the spatiotemporal distribution patterns of knowledge innovation is receiving increasing attention from policymakers and economic research organizations. Many studies use bibliometric data to analyze the popularity of certain research topics, well-adopted methodologies, influential authors, and the interrelationships among research disciplines. However, visual exploration of the patterns of research topics with an emphasis on their spatial and temporal distribution remains challenging. This study combined a Space-Time Cube (STC) and a 3D glyph to represent complex multivariate bibliographic data, and we further implemented the visual design by developing an interactive interface. The effectiveness, understandability, and engagement of ST-Map were evaluated by seven experts in geovisualization. The results suggest that it is promising to use three-dimensional visualization to show the overview and on-demand details on a single screen.
Funding: Partially supported by the National Natural Science Foundation of China under Grant No. 62372245, the Foundation of the Yunnan Key Laboratory of Blockchain Application Technology under Grant 202105AG070005, in part by the Foundation of the State Key Laboratory of Public Big Data, and in part by the Foundation of the Key Laboratory of Computational Science and Application of Hainan Province under Grant JSKX202202.
Abstract: For the goals of security and privacy preservation, we propose a blind batch encryption- and public ledger-based data sharing protocol that allows the integrity of sensitive data to be audited by a public ledger while preserving privacy information. Data owners can tightly manage their data with efficient revocation and grant only one-time adaptive access to fulfill a requester's needs. We prove that our protocol is semantically secure, blind, and secure against oblivious requesters and malicious file keepers. We also provide a security analysis in the context of four typical attacks.
Abstract: Background: Cavernous transformation of the portal vein (CTPV) due to portal vein obstruction is a rare vascular anomaly defined as the formation of multiple collateral vessels in the hepatic hilum. This study aimed to investigate the imaging features of the intrahepatic portal vein in adult patients with CTPV and to establish the relationship between the manifestations of the intrahepatic portal vein and the progression of CTPV. Methods: We retrospectively analyzed 14 CTPV patients in Beijing Tsinghua Changgung Hospital. All patients underwent both direct portal venography (DPV) and computed tomography angiography (CTA) to reveal the manifestations of the portal venous system. The vessels measured included the left portal vein (LPV), right portal vein (RPV), main portal vein (MPV), and the portal vein bifurcation (PVB). Results: Nine males and five females, with a median age of 40.5 years, were included in the study. No significant difference was found in the diameters of the LPV or RPV measured by DPV and CTA. The visualization rates of the LPV, RPV, and PVB were higher with DPV than with CTA. There was a significant association between LPV/RPV and PVB/MPV in terms of visibility revealed by DPV (P = 0.01), whereas this association was not observed with CTA. According to the imaging features of the portal vein measured by DPV, CTPV was classified into three categories to facilitate diagnosis and treatment. Conclusions: DPV was more accurate than CTA for revealing the course of the intrahepatic portal vein in patients with CTPV. The classification of CTPV, which originated from the imaging features of the portal vein revealed by DPV, may provide a new perspective for the diagnosis and treatment of CTPV.
Funding: Supported by the National Natural Science Foundation of China (42271360 and 42271399), the Young Elite Scientists Sponsorship Program by the China Association for Science and Technology (CAST) (2020QNRC001), and the Fundamental Research Funds for the Central Universities, China (2662021JC013, CCNU22QN018).
Abstract: Ratoon rice, which refers to a second harvest of rice obtained from the regenerated tillers originating from the stubble of the first harvested crop, plays an important role in both food security and agroecology while requiring minimal agricultural inputs. However, accurately identifying ratoon rice crops is challenging because their spectral features are similar to those of other rice cropping systems (e.g., double rice). Moreover, images with a high spatiotemporal resolution are essential, since ratoon rice is generally cultivated in fragmented croplands within regions that frequently exhibit cloudy and rainy weather. In this study, taking Qichun County in Hubei Province, China as an example, we developed a new phenology-based ratoon rice vegetation index (PRVI) for ratoon rice mapping at a 30 m spatial resolution using a robust time series generated from Harmonized Landsat and Sentinel-2 (HLS) images. The PRVI, which incorporates the red, near-infrared, and shortwave infrared 1 bands, was developed based on an analysis of spectro-phenological separability and feature selection. Based on actual field samples, the performance of the PRVI for ratoon rice mapping was carefully evaluated by comparing it to several vegetation indices, including the normalized difference vegetation index (NDVI), the enhanced vegetation index (EVI), and the land surface water index (LSWI). The results suggested that the PRVI sufficiently captures the specific characteristics of ratoon rice, leading to favorable separability between ratoon rice and other land cover types. Furthermore, the PRVI showed the best performance for identifying ratoon rice in the phenological phases from grain filling and harvesting to tillering of the ratoon crop (GHS-TS2), indicating that only a few images are required to obtain an accurate ratoon rice map. Finally, the PRVI performed better than the NDVI, EVI, LSWI, and their combination at the GHS-TS2 stages, with a producer's accuracy of 92.22% and a user's accuracy of 89.30%. These results demonstrate that the proposed PRVI based on HLS data can effectively identify ratoon rice in fragmented croplands at crucial phenological stages, which is promising for identifying the earliest timing of ratoon rice planting and can provide a fundamental dataset for crop management activities.
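For reference, the sketch below computes the three baseline indices the PRVI is benchmarked against, using their standard formulas on HLS surface-reflectance bands; the band variable names and sample values are assumptions of this sketch, and the PRVI formula itself is not reproduced here because the abstract does not give it.

```python
# Standard vegetation/water indices used as baselines in the comparison above.
def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    """Enhanced vegetation index (standard coefficients)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def lswi(nir, swir1):
    """Land surface water index."""
    return (nir - swir1) / (nir + swir1)

# Single-pixel example with made-up surface reflectance values.
nir, red, blue, swir1 = 0.42, 0.08, 0.04, 0.18
print(f"NDVI={ndvi(nir, red):.2f}  EVI={evi(nir, red, blue):.2f}  LSWI={lswi(nir, swir1):.2f}")
```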