With the intelligentization of the Internet of Vehicles (IoVs), Artificial Intelligence (AI) technology is becoming more and more essential, especially deep learning. Federated Deep Learning (FDL) is a novel distributed machine learning technology that is able to address challenges such as data security, privacy risks, and the huge communication overheads of big raw data sets. However, FDL can only guarantee data security and privacy among multiple clients during data training. If the data sets stored locally in clients are corrupted, including being tampered with or lost, the training results of FDL in intelligent IoVs are inevitably negatively affected. In this paper, we are the first to design a secure data auditing protocol to guarantee the integrity and availability of data sets in FDL-empowered IoVs. Specifically, the cuckoo filter and Reed-Solomon codes are utilized to guarantee error tolerance, including efficient corrupted-data locating and recovery. In addition, a novel data structure, the Skip Hash Table (SHT), is designed to optimize data dynamics. Finally, we illustrate the security of the scheme under the Computational Diffie-Hellman (CDH) assumption on bilinear groups. Thorough theoretical analyses and performance evaluations demonstrate the security and efficiency of our scheme for data sets in FDL-empowered IoVs.
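The error-tolerance design above pairs a cuckoo filter (fast approximate membership tests over data blocks) with Reed-Solomon codes. As background only, here is a minimal cuckoo filter sketch in Python; the bucket count, bucket size, and one-byte fingerprints are illustrative assumptions, and the paper's actual construction (including deletion and auditing logic) is not reproduced.

```python
import hashlib
import random

def _h(data: bytes) -> int:
    """64-bit hash derived from SHA-256."""
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big")

class CuckooFilter:
    """Toy cuckoo filter: each item is stored as a short fingerprint in one of
    two candidate buckets; full buckets evict ("kick") an entry to its
    alternate bucket."""

    def __init__(self, num_buckets=128, bucket_size=4, max_kicks=200):
        self.buckets = [[] for _ in range(num_buckets)]
        self.num_buckets = num_buckets
        self.bucket_size = bucket_size
        self.max_kicks = max_kicks

    def _fingerprint(self, item: bytes) -> int:
        return _h(b"fp" + item) % 255 + 1          # non-zero one-byte fingerprint

    def _index_pair(self, item: bytes, fp: int):
        i1 = _h(item) % self.num_buckets
        i2 = (i1 ^ _h(fp.to_bytes(1, "big"))) % self.num_buckets
        return i1, i2

    def insert(self, item: bytes) -> bool:
        fp = self._fingerprint(item)
        i1, i2 = self._index_pair(item, fp)
        for i in (i1, i2):
            if len(self.buckets[i]) < self.bucket_size:
                self.buckets[i].append(fp)
                return True
        # both candidate buckets full: evict entries to their alternate bucket
        i = random.choice((i1, i2))
        for _ in range(self.max_kicks):
            j = random.randrange(len(self.buckets[i]))
            fp, self.buckets[i][j] = self.buckets[i][j], fp
            i = (i ^ _h(fp.to_bytes(1, "big"))) % self.num_buckets
            if len(self.buckets[i]) < self.bucket_size:
                self.buckets[i].append(fp)
                return True
        return False                                # filter considered full

    def contains(self, item: bytes) -> bool:
        fp = self._fingerprint(item)
        i1, i2 = self._index_pair(item, fp)
        return fp in self.buckets[i1] or fp in self.buckets[i2]
```

Lookups touch at most two buckets, so membership tests stay O(1); the short fingerprints trade a small false-positive rate for compactness.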
Background: Intravoxel incoherent motion diffusion-weighted imaging (IVIM-DWI) can not only observe the diffusion of tissue water molecules but also reflect the blood perfusion information of tissue microcirculation. IVIM-DWI has been applied in many clinical areas. However, few studies have addressed the use of IVIM-DWI for the evaluation of transarterial chemoembolization (TACE) response in hepatocellular carcinoma (HCC) patients. Objectives: The purpose of the present study was to explore the role of IVIM-DWI in assessing the therapeutic response to TACE for HCC. Materials and Methods: Twenty patients underwent an IVIM-DWI scan on a 3.0T magnetic resonance imaging instrument 1 - 3 days before and 30 to 40 days after TACE. The IVIM-DWI parameters, including the standard apparent diffusion coefficient (ADC), pure diffusion coefficient (Dslow), pseudo-diffusion coefficient (Dfast) and perfusion fraction (f), were measured. The parameter values before and after TACE were compared using paired t tests. The values of the responsive and non-responsive groups were compared using independent-samples t tests. P < 0.05 indicated statistical significance. Results: After TACE, the ADC and Dslow values in the tumors increased significantly and the Dfast values decreased significantly, while the f values did not change obviously. The ADC values in the responsive group were remarkably higher than those in the non-responsive group, and the Dfast values in the responsive group were significantly lower than those in the non-responsive group, but the Dslow and f values did not differ significantly between the two groups. Conclusions: IVIM-DWI parameters can be used as potential markers of the therapeutic response to TACE for HCC.
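The parameters named above come from the standard biexponential IVIM signal model, S(b) = S0·(f·e^(−b·Dfast) + (1−f)·e^(−b·Dslow)). The sketch below uses illustrative (not patient-derived) parameter values to show how a two-point monoexponential fit yields an ADC estimate, and why ADC computed from low b-values is inflated by perfusion.

```python
import math

def ivim_signal(b, S0, f, D_slow, D_fast):
    """Biexponential IVIM model: S(b) = S0*(f*exp(-b*D_fast) + (1-f)*exp(-b*D_slow)).
    b in s/mm^2, diffusion coefficients in mm^2/s."""
    return S0 * (f * math.exp(-b * D_fast) + (1 - f) * math.exp(-b * D_slow))

def adc_two_point(s_b1, s_b2, b1, b2):
    """Monoexponential ADC estimate from two b-values: ADC = ln(S1/S2)/(b2-b1)."""
    return math.log(s_b1 / s_b2) / (b2 - b1)
```

With f = 0.25, Dslow = 1.0e-3 and Dfast = 50e-3 (illustrative values), an ADC fit over b = 200-800 recovers roughly Dslow, while a fit over b = 0-50 is dominated by the fast perfusion compartment and comes out several times larger.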
Fall behavior is closely related to high mortality in the elderly, so fall detection has become an important and urgent research area. However, existing fall detection methods are difficult to apply in daily life due to their large amount of computation and poor detection accuracy. To solve these problems, this paper proposes a dense spatial-temporal graph convolutional network based on lightweight OpenPose. Lightweight OpenPose uses MobileNet as its feature extraction network, and its prediction layer uses a bottleneck-asymmetric structure, thus reducing the size of the network. The bottleneck-asymmetric structure compresses the number of input channels of the feature maps by 1×1 convolution and replaces the 7×7 convolution with an asymmetric structure of 1×7 convolution, 7×1 convolution, and 7×7 convolution in parallel. The spatial-temporal graph convolutional network divides the multi-layer convolution into dense blocks, and the convolutional layers in each dense block are connected, which improves feature transitivity and enhances the network's ability to extract features, thus improving detection accuracy. Two representative datasets, the Multiple Cameras Fall dataset (MCF) and the Nanyang Technological University Red Green Blue + Depth Action Recognition dataset (NTU RGB+D), are selected for our experiments; NTU RGB+D has two evaluation benchmarks. The results show that the proposed model is superior to current fall detection models. The accuracy of the network on the MCF dataset is 96.3%, and the accuracies on the two evaluation benchmarks of the NTU RGB+D dataset are 85.6% and 93.5%, respectively.
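The saving from a bottleneck-asymmetric block can be made concrete by counting convolution parameters. The channel widths below (64 input channels, bottlenecked to 16) are assumptions for illustration, not the paper's configuration.

```python
def conv_params(c_in, c_out, kh, kw, bias=True):
    """Parameter count of a 2-D convolution layer: c_out * (c_in*kh*kw + bias)."""
    return c_out * (c_in * kh * kw + (1 if bias else 0))

# plain 7x7 convolution on 64 channels
plain = conv_params(64, 64, 7, 7)

# bottleneck: 1x1 reduction to 16 channels, then parallel 1x7, 7x1 and 7x7 branches
bottleneck = (conv_params(64, 16, 1, 1)
              + conv_params(16, 16, 1, 7)
              + conv_params(16, 16, 7, 1)
              + conv_params(16, 16, 7, 7))
```

Under these assumed widths the bottleneck-asymmetric block needs roughly a tenth of the parameters of the plain 7×7 layer, which is the kind of reduction that makes the network lightweight.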
With the rapid development of information technology, blockchain technology has also developed rapidly. When performing block verification in a blockchain network, verifying all transactions on the chain causes data to accumulate on the chain, resulting in data storage problems. At the same time, the security of the data is also challenged, which puts enormous pressure on the block and results in extremely low communication efficiency. The traditional blockchain system uses the Merkle tree to store data. While it can verify the integrity and correctness of the data, the proof size is large and data cannot be verified in batches. A large amount of proof data greatly impacts verification efficiency, causing end-to-end communication delays and seriously affecting the stability, efficiency, and security of the blockchain system. To solve this problem, this paper proposes to replace the Merkle tree with polynomial commitments, which take advantage of the properties of polynomials to reduce the proof size and communication consumption. Through the ingenious use of aggregated proofs and smart contracts, the verification efficiency of blocks is improved and the pressure of node communication is reduced.
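For context on the proof-size problem, here is a minimal Merkle tree sketch: an inclusion proof carries one sibling hash per tree level, so it grows as ceil(log2 n) in the number of transactions, whereas a polynomial commitment opening is constant-size. This sketch shows only the Merkle side being replaced, not the paper's commitment scheme.

```python
import hashlib

def merkle_root_and_proof(leaves, index):
    """Build a Merkle tree over the leaves; return (root, inclusion proof for
    leaves[index]). The proof holds one sibling hash per level."""
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        proof.append(level[index ^ 1])          # sibling of the tracked node
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
        index //= 2
    return level[0], proof

def verify(root, leaf, index, proof):
    """Recompute the path from the leaf to the root using the sibling hashes."""
    h = hashlib.sha256(leaf).digest()
    for sibling in proof:
        if index % 2 == 0:
            h = hashlib.sha256(h + sibling).digest()
        else:
            h = hashlib.sha256(sibling + h).digest()
        index //= 2
    return h == root
```

For 16 transactions the proof already holds 4 hashes; a million transactions need 20 per proven transaction, which is the batch-verification overhead the polynomial commitment avoids.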
Gesture recognition technology enables machines to read human gestures and has significant application prospects in the fields of human-computer interaction and sign language translation. Existing research usually uses convolutional neural networks to extract features directly from raw gesture data, but the networks are affected by the large amount of interference information in the input data and thus fit to unimportant features. In this paper, we propose a novel method for encoding spatio-temporal information, which enhances the key features required for gesture recognition, such as the shape, structure, contour, position and hand motion of gestures, thereby improving recognition accuracy. This encoding method can encode arbitrarily many frames of gesture data into a single-frame spatio-temporal feature map and use that feature map as the input to the neural network. This guides the model to fit important features while avoiding the use of complex recurrent network structures to extract temporal features. In addition, we designed two sub-networks and trained the model with a sub-network pre-training strategy that trains the sub-networks first and then the entire network, so as to avoid the sub-networks focusing too much on the information of a single category of features and being overly influenced by each other's features. Experimental results on two public gesture datasets show that the proposed spatio-temporal information encoding method achieves state-of-the-art accuracy.
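As a toy illustration of the general idea only (this is not the paper's actual encoding), multiple frames of hand keypoints can be collapsed into a single map whose pixel intensity encodes temporal order, so that one image carries both spatial layout and motion direction.

```python
def encode_frames(frames, h, w):
    """Collapse T frames of (x, y) keypoints into one h*w intensity map.
    Later frames are drawn brighter, so a single static map preserves the
    temporal order of the motion (a toy stand-in for the paper's encoding)."""
    grid = [[0.0] * w for _ in range(h)]
    T = len(frames)
    for t, points in enumerate(frames, start=1):
        weight = t / T                      # later frames dominate
        for x, y in points:
            grid[y][x] = max(grid[y][x], weight)
    return grid
```

A CNN fed such a map can infer motion (e.g. a diagonal sweep) from intensity gradients without any recurrent structure, which is the property the abstract exploits.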
Due to the extensive use of various intelligent terminals and the popularity of online social tools, a large amount of data has emerged in the medical field. How to manage these massive data safely and reliably has become an important challenge for the medical network community. This paper proposes a data management framework for the medical network community based on Consortium Blockchain (CB) and Federated Learning (FL), which realizes secure data sharing between medical institutions and research institutions. Under this framework, a smart-contract-based data security sharing mechanism and a data privacy protection mechanism based on FL and the consortium chain are designed to ensure the security of data and the privacy of important data in the medical network community, respectively. A smart contract system based on the Keyed-Homomorphic Public Key (KH-PKE) encryption scheme is designed, so that medical data can be saved in the CB in the form of ciphertext and automatic data sharing is realized. A zero-knowledge mechanism is used to ensure the correctness of shared data. Moreover, the zero-knowledge mechanism introduces a dynamic group signature scheme with chosen-ciphertext-attack (CCA) anonymity, which makes the scheme more efficient in computation and communication cost. At the end of this paper, the performance of the scheme is analyzed from both asymptotic and practical aspects. Experimental comparative analysis shows that the proposed scheme is effective and feasible.
A Generative Adversarial Network (GAN) is designed based on deep learning for the Super-Resolution (SR) reconstruction task of temperature fields (comparable to downscaling in the meteorological field), which is limited by the small number of ground stations and the sparse distribution of observations, resulting in a lack of fineness of data. To improve the network's generalization performance, a residual structure and batch normalization are used. Nearest-neighbor interpolation is applied instead of the bicubic interpolation conventional in the computer vision field, to avoid over-smoothing of the climate element values. Sub-pixel convolution is used instead of transposed convolution or interpolation methods for up-sampling to speed up network inference. The experimental dataset is the European Centre for Medium-Range Weather Forecasts Reanalysis v5 (ERA5) with a spatial resolution of 0.1°×0.1°. Moreover, the task aims to scale up the size by a factor of 8, which is rare compared to conventional methods. The comparison methods include traditional interpolation methods and a widely used GAN-based network, SRGAN. The final experimental results show that the proposed scheme improves the Root Mean Square Error (RMSE) by 37.25%, the Peak Signal-to-Noise Ratio (PSNR) by 14.4%, and the Structural Similarity (SSIM) by 10.3% compared to bicubic interpolation. A relatively obvious performance improvement over the traditional SRGAN network is also observed experimentally. Meanwhile, the GAN converges stably and reaches an approximate Nash equilibrium for various initialization parameters, which empirically illustrates the effectiveness of the method for temperature fields.
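The RMSE and PSNR figures quoted above follow the standard definitions, sketched below for flat arrays of values; SSIM is omitted since it requires windowed local statistics.

```python
import math

def rmse(a, b):
    """Root mean square error between two equal-length sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def psnr(a, b, peak=1.0):
    """Peak signal-to-noise ratio in dB (higher is better); `peak` is the
    maximum possible value of the signal."""
    e = rmse(a, b)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)
```

For reanalysis fields the values would first be normalized so that `peak` is meaningful; the percentages in the abstract compare these metrics between reconstruction methods.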
Since transactions in blockchain are verified against a public ledger, security concerns arise about privacy protection. Verifying whole transactions on the chain also causes data to accumulate on the chain, resulting in low block verification efficiency. In order to improve the efficiency and privacy protection of block data verification, this paper proposes an efficient block verification mechanism with privacy protection based on zero-knowledge proof (ZKP), which not only protects the privacy of users but also improves the speed of data block verification. There is no need to put the whole transaction on the chain when verifying block data; it is only necessary to generate the ZKP and root hash from the transaction information and save them to the smart contract for verification. The ZKP verification in the smart contract realizes the privacy protection of the transaction and efficient verification of the block. When the data is validated, the buffer accepts the complete transaction, updates the transaction status in the cloud database, and packages it onto the chain. Thus, the ZKP strengthens the privacy protection ability of the blockchain, and the smart contracts save the time cost of block verification.
In data communication systems, the real-time information interaction of communication devices increases the risk of privacy-sensitive data being tampered with. Therefore, maintaining data security is one of the most important issues in network data communication. Because the timestamp is the most important way to authenticate data in information interaction, it is necessary to provide a timestamp service in the data communication system. However, the existing centralized timestamp mechanism has difficulty providing a credible timestamp service, and users can conspire with timestamping servers to forge timestamps. Therefore, this paper designs a distributed timestamp mechanism based on continuous verifiable delay functions. It utilizes multiple independent timestamp servers to provide timestamp services in a distributed model and appends the timestamp to the data as soon as the data is generated. Thus, it can prove that the data already existed at a certain time and ensure the accuracy of the timestamp. Moreover, a digital blind signature based on elliptic curve cryptography is utilized to solve the problem of timestamp forgery in the timestamp service. Finally, the security analysis of the scheme ensures the data security of the data communication system and the concurrency of the timestamp service. The experimental results also show that the scheme greatly improves the efficiency of digital signatures.
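The sequential core of a verifiable delay function can be sketched with repeated modular squaring: without the factorization of the modulus N, no known shortcut beats performing the T squarings one after another, so the output certifies elapsed computation time. The succinct correctness proof that makes a VDF efficiently verifiable, and the paper's continuous variant, are omitted from this sketch.

```python
def sequential_square(x: int, T: int, N: int) -> int:
    """Evaluate x^(2^T) mod N by T repeated squarings. When the factorization
    of N is unknown, this is believed to require ~T sequential modular
    squarings -- the delay core of an RSA-group VDF (proof of correctness
    not included here)."""
    y = x % N
    for _ in range(T):
        y = (y * y) % N
    return y
```

A party who knows the factorization of N could shortcut the loop by reducing the exponent 2^T modulo Euler's totient, which is why the modulus must come from a trusted setup or a class group.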
[Objectives] miRNAs play an important role in the proliferation and differentiation of myoblasts. This study was conducted to elucidate the complex genetic mechanisms that affect the meat production performance of Sichuan white rabbits and to reveal the regulatory role of miRNAs in their muscle growth and meat quality formation. [Methods] Three skeletal muscle libraries constructed from six-month-old Sichuan white rabbits were sequenced by Solexa technology to identify known miRNAs, predict new miRNAs and construct an expression profile of muscle miRNAs. [Results] A total of 511 known miRNAs and 42 new miRNAs were detected in 34,089,472 clean sequences, and the proportion of miRNAs with a length of 22 nt was the highest. The number of known miRNA sequences accounted for 71.38% of the clean sequences, which was much higher than the proportion of other types of RNAs. The proportion of sequences from exons was 0.38%, indicating a low degree of mRNA degradation in the samples. Base U had the highest proportion at the first position, and the bases with the highest proportions at positions 8 and 10 were U and A, respectively. Muscle-specific miRNAs (miR-1, miR-133, and miR-206) ranked in the top 10 in terms of expression level. The number and expression levels of new miRNAs were lower than those of known miRNAs. The length distribution, base bias at different positions and expression profile characteristics of the miRNAs might be related to their biological functions in regulating muscle proliferation and differentiation and their mechanisms of action with target genes. [Conclusions] The identification and expression profiling of miRNAs in muscle tissues of Sichuan white rabbits will help in understanding the complex molecular mechanisms of meat production performance and provide a theoretical basis for functional research on miRNAs in meat rabbits.
Aortic dissection (AD) is a life-threatening clinical emergency requiring rapid diagnosis and effective intervention to improve patient survival and prognosis. Computed tomography angiography (CTA) can be used to diagnose AD accurately and quickly, making it the first choice for diagnosing AD in an emergency. This article reviews the application of CTA in the diagnosis and treatment of AD.
Since the British National Archives put forward the concept of digital continuity in 2007, several developed countries have worked out digital continuity action plans. However, technologies for guaranteeing digital continuity are still lacking. This paper first analyzes the requirements of the digital continuity guarantee for electronic records based on data quality theory and points out the necessity of data quality guarantees for electronic records. Moreover, we convert the digital continuity guarantee of electronic records into ensuring the consistency, completeness and timeliness of electronic records, and construct the first technology framework of the digital continuity guarantee for electronic records. Finally, temporal functional dependency technology is utilized to build the first integration method to ensure the consistency, completeness and timeliness of electronic records.
Distributed storage can store data on multiple devices or servers to improve data security. However, with today's explosive growth of network data, traditional distributed storage schemes face severe challenges such as insufficient performance, data tampering, and data loss. Distributed storage schemes based on blockchain have been proposed to improve the security and efficiency of traditional distributed storage, and this paper makes the following improvements under such a scheme. It first analyzes the problems faced by distributed storage, and then proposes a new distributed storage blockchain scheme built on a sharding blockchain. The proposed scheme partitions the network and nodes by means of blockchain sharding technology, which can improve the efficiency of data verification between nodes. In addition, this paper uses polynomial commitments to construct a new verifiable secret sharing scheme called PolyVSS. This new scheme is one of the foundations for building our improved distributed storage blockchain scheme. Compared with previous schemes, our new scheme does not require a trusted third party and has new features such as homomorphism and batch opening, which further improve the security of VSS. Experimental comparisons show that the proposed scheme significantly reduces storage and communication costs.
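PolyVSS builds on polynomial commitments, but its secret-sharing layer is the classic Shamir construction: the secret is the constant term of a random polynomial, and any threshold number of evaluations recovers it by Lagrange interpolation. The sketch below shows plain Shamir sharing only, without the commitments that make shares verifiable.

```python
import random

PRIME = 2 ** 127 - 1                  # Mersenne prime: the demo finite field

def make_shares(secret, threshold, n, prime=PRIME):
    """Shamir sharing: hide the secret as f(0) of a random polynomial of
    degree threshold-1, and hand out the points (x, f(x)) for x = 1..n."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(threshold - 1)]

    def f(x):                          # Horner evaluation of the polynomial
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % prime
        return acc

    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the secret from any
    `threshold` distinct shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret
```

A polynomial-commitment-based VSS such as the paper's additionally publishes a commitment to the coefficients, so each participant can check that their share really lies on the committed polynomial.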
Under the co-promotion of the wave of urbanization and the rise of data science, smart cities have become a new concept and new practice of urban development. Smart cities are the combination of urbanization and information technology represented by the Internet of Things, cloud computing, mobile networks and big data. How to effectively achieve the long-term preservation of massive, heterogeneous, and multi-source digital electronic records in smart cities is a key issue that must be solved. Digital continuity can ensure the accessibility, integrity and availability of information. The quality management of electronic records, like the quality management of products, runs through every phase of the urban lifecycle. Based on data quality management, this paper constructs the digital continuity of smart city electronic records. Furthermore, the work in this paper ensures the authenticity, integrity, availability and timeliness of electronic documents through the quality management of electronic records. This paper elaborates on the overall technical architecture of electronic records, as well as the various technical means needed to protect these four characteristics.
Cloud storage represents the trend toward intensive, large-scale and specialized information technology, and it has changed the technical architecture and implementation methods of electronic records management. Moreover, it provides a convenient way to achieve more advanced and efficient management of electronic data records. However, in the cloud storage environment, it is difficult to guarantee the trustworthiness of electronic records, which results in a series of severe challenges to electronic records management. Starting from the definition and specification of electronic records, this paper first analyzes the requirements for trustworthiness in cloud storage during long-term preservation according to information security theory, and subdivides trustworthiness into the authenticity, integrity, usability, and reliability of electronic records in cloud storage. Moreover, this paper proposes a technology framework for the preservation of trusted electronic records. The technologies of blockchain, proofs of retrievability, the Open Archival Information System model and erasure codes are adopted to protect these four security attributes and guarantee the credibility of the electronic records.
The application field of the Internet of Things (IoT) involves all aspects of life, and its application in industry, agriculture, environment, transportation, logistics, security and other infrastructure has effectively promoted the intelligent development of these areas. Although the IoT has grown steadily in recent years, there are still many problems to overcome in terms of technology, management, cost, policy, and security. We need to constantly weigh the benefits of trusting IoT products against the risk of leaking private data. To avoid the leakage and loss of user data, this paper develops a hybrid algorithm combining a kernel function and the random perturbation method on top of non-negative matrix factorization, which realizes personalized recommendation and solves the problem of protecting users' private data in the process of personalized recommendation. Compared with the non-negative matrix factorization privacy-preserving algorithm, the new algorithm does not need detailed information about the data but only the relationships between data points, and it can process data points with negative features. Experiments show that the new algorithm can produce recommendation results with good accuracy while preserving users' personal privacy.
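The random-perturbation half of such a hybrid can be illustrated by masking each rating with zero-mean Laplace noise before it is shared: aggregate statistics are roughly preserved while individual entries are hidden. The noise scale and rating range below are illustrative assumptions, and the kernel/NMF side of the paper's algorithm is not shown.

```python
import math
import random

def laplace_noise(scale, rng):
    """Zero-mean Laplace sample via the inverse-CDF transform of a uniform draw."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(max(1e-300, 1.0 - 2.0 * abs(u)))

def perturb_ratings(ratings, scale=0.5, lo=1.0, hi=5.0, seed=None):
    """Mask each rating with Laplace noise before it leaves the device, then
    clamp back into the valid rating range [lo, hi]."""
    rng = random.Random(seed)
    return [min(hi, max(lo, r + laplace_noise(scale, rng))) for r in ratings]
```

Because the noise has zero mean, averages over many perturbed ratings stay close to the true averages, which is what keeps the downstream recommendation usable.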
In view of the low accuracy of traditional ground nephogram recognition models, the authors put forward a neural network ensemble method based on the k-means algorithm, which takes the BP neural network ensemble model as its basis, uses the k-means algorithm to choose individual neural networks with partial diversity for integration, and builds a cloud form classification model. Simulation experiments on ground nephogram samples show that the proposed algorithm can effectively improve the classification accuracy of ground nephogram recognition in comparison with a single BP neural network and the traditional BP-AdaBoost ensemble algorithm.
In order to solve the problem that real-time face recognition is susceptible to illumination changes, this paper proposes a face recognition method that combines Local Binary Patterns (LBP) and the Embedded Hidden Markov Model (EHMM). The method first performs LBP preprocessing on the input face image, then extracts the feature vector, and finally sends the extracted feature observation vector to the EHMM for training or recognition. Experiments on multiple face databases show that the proposed algorithm is robust to illumination and improves the recognition rate.
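The LBP preprocessing step computes, for each pixel, an 8-bit code from how its eight 3×3 neighbours compare with it; because the code depends only on sign comparisons, it is invariant to monotonic illumination changes, which is what gives the method its robustness. A minimal sketch of the basic operator:

```python
def lbp_code(patch):
    """Basic 3x3 Local Binary Pattern: threshold the 8 neighbours against the
    centre pixel and read them clockwise from the top-left as an 8-bit code."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, p in enumerate(neighbours):
        if p >= c:                       # sign comparison only: illumination-robust
            code |= 1 << (7 - bit)
    return code
```

A full pipeline would slide this over the image and histogram the codes per region; those histograms form the observation vectors fed to the EHMM.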
With the diversification of electronic devices, cloud-based services have become the link between different devices. As a cryptosystem with a secure conversion function, proxy re-encryption enables secure sharing of data in a cloud environment. Proxy re-encryption is a public key encryption system with a ciphertext security conversion function: a semi-trusted agent plays the role of ciphertext converter and can convert a ciphertext under one user's public key into a ciphertext of the same plaintext under the delegatee's public key. Proxy re-encryption has been a hotspot in the field of information security since it was proposed by Blaze et al. [Blaze, Bleumer and Strauss (1998)]. After 20 years of development, proxy re-encryption has evolved into many forms and been widely used. This paper elaborates on the definition, characteristics and development status of proxy re-encryption, and classifies proxy re-encryption schemes from the perspectives of user identity, conversion condition, conversion hop count and conversion direction. Existing schemes are compared and briefly reviewed in terms of features, performance, and security. Finally, this paper looks forward to possible future directions for proxy re-encryption.
As an important maritime hub, Bohai Bay provides great convenience for shipping but suffers from sea ice disasters of varying severity every winter, which greatly affects the socio-economic development of the region. Therefore, this paper uses FY-4A (a weather satellite) data to study sea ice in the Bohai Sea. After processing the data for land removal and cloud detection, it combines a multi-channel threshold method and an adaptive threshold algorithm to realize the recognition of Bohai Sea ice under clear-sky conditions. The random forest classification algorithm is introduced for sea ice identification, which achieves a degree of sea ice classification and recognition under cloud cover. Under non-clear-sky conditions, the results of Bohai Sea ice identification based on random forests are improved; the algorithm can effectively identify Bohai Sea ice and improve the accuracy of sea ice identification, which lays a foundation for the accuracy and stability of sea ice identification. It realizes sea ice identification in the Bohai Sea and provides data and algorithm support for marine climate forecasting departments.
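The multi-channel threshold idea can be sketched as a per-pixel test: sea ice is brighter than open water in the visible band and colder in the thermal band. The threshold values below are illustrative assumptions only, not FY-4A calibrated values, and the adaptive-threshold and random forest stages are not shown.

```python
def classify_pixel(vis_reflectance, bt_kelvin, vis_thresh=0.3, bt_thresh=271.0):
    """Two-channel threshold test: flag a pixel as ice when it is both bright
    in the visible band and below the brightness-temperature threshold.
    Thresholds are illustrative, not calibrated."""
    if vis_reflectance > vis_thresh and bt_kelvin < bt_thresh:
        return "ice"
    return "water"

def classify_scene(pixels, vis_thresh=0.3, bt_thresh=271.0):
    """Apply the threshold test to a list of (reflectance, temperature) pixels."""
    return [classify_pixel(v, t, vis_thresh, bt_thresh) for v, t in pixels]
```

In the paper's pipeline, the labels produced by such clear-sky thresholding can serve as training samples for the random forest that handles cloud-covered pixels.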
Funding: Supported by the National Natural Science Foundation of China under Grants No. U1836115, No. 61922045, No. 61877034 and No. 61772280; the Natural Science Foundation of Jiangsu Province under Grant No. BK20181408; the Peng Cheng Laboratory Project of Guangdong Province PCL2018KP004; the CICAEET fund; and the PAPD fund.
Funding: Supported in part by the National Natural Science Foundation of China under Grant Nos. 62272236 and 62376128, and in part by the Natural Science Foundation of Jiangsu Province under Grant Nos. BK20201136 and BK20191401.
Abstract: Fall behavior is closely related to high mortality in the elderly, so fall detection has become an important and urgent research area. However, existing fall detection methods are difficult to apply in daily life due to their large computational cost and poor detection accuracy. To solve these problems, this paper proposes a dense spatial-temporal graph convolutional network based on lightweight OpenPose. Lightweight OpenPose uses MobileNet as the feature extraction network, and its prediction layer uses a bottleneck-asymmetric structure, thus reducing the size of the network. The bottleneck-asymmetric structure compresses the number of input channels of the feature maps with 1×1 convolutions and replaces the 7×7 convolution with a parallel combination of 1×7, 7×1, and 7×7 convolutions. The spatial-temporal graph convolutional network divides the multi-layer convolution into dense blocks in which the convolutional layers are densely connected, improving feature transitivity and enhancing the network's ability to extract features, thereby improving detection accuracy. Two representative datasets, the Multiple Cameras Fall dataset (MCF) and the Nanyang Technological University Red Green Blue + Depth Action Recognition dataset (NTU RGB+D), the latter with two evaluation benchmarks, are selected for our experiments. The results show that the proposed model is superior to current fall detection models: its accuracy on the MCF dataset is 96.3%, and its accuracies on the two evaluation benchmarks of the NTU RGB+D dataset are 85.6% and 93.5%, respectively.
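The savings from the bottleneck-asymmetric idea can be quantified with a quick parameter count. The channel sizes below are hypothetical, and the paper's parallel branch structure is simplified here to the decomposed 1×7 + 7×1 pair versus a plain 7×7; the arithmetic still shows where the reduction comes from.

```python
def conv_params(c_in, c_out, kh, kw):
    """Weight count of a 2-D convolution layer (biases ignored)."""
    return c_in * c_out * kh * kw

c_in = c_out = 128                                    # hypothetical channels
full_7x7 = conv_params(c_in, c_out, 7, 7)             # 128*128*49
asym_pair = conv_params(c_in, c_out, 1, 7) + conv_params(c_in, c_out, 7, 1)
# Bottleneck: a 1x1 convolution compresses channels before the spatial conv.
c_mid = c_in // 4
bottleneck_7x7 = conv_params(c_in, c_mid, 1, 1) + conv_params(c_mid, c_out, 7, 7)
print(full_7x7, asym_pair, bottleneck_7x7)
```

With these sizes, the asymmetric pair needs 229,376 weights versus 802,816 for the full 7×7, and the 1×1 bottleneck path needs 204,800.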
Funding: This work is supported by the Fundamental Research Funds for the Central Universities (Zhejiang University NGICS Platform); Xiaofeng Yu receives the grant, and the sponsor's website is https://www.zju.edu.cn/. The work is also supported by China's National Natural Science Foundation (Nos. 62072249, 62072056); Jin Wang and Yongjun Ren receive the grants, and the sponsor's website is https://www.nsfc.gov.cn/. This work is also funded by the Natural Science Foundation of Hunan Province (2020JJ2029); Jin Wang receives the grant, and the sponsor's website is http://kjt.hunan.gov.cn/.
Abstract: With the rapid development of information technology, blockchain technology has also been deeply affected. When performing block verification in a blockchain network, verifying all transactions on the chain causes data to accumulate on the chain, resulting in storage problems. At the same time, data security is also challenged, which puts enormous pressure on the block and results in extremely low communication efficiency. The traditional blockchain system uses the Merkle tree to store data. While it can verify the integrity and correctness of the data, the proofs are large and the data cannot be verified in batches. A large amount of proof data greatly impacts verification efficiency, causing end-to-end communication delays and seriously affecting the blockchain system's stability, efficiency, and security. To solve this problem, this paper proposes replacing the Merkle tree with polynomial commitments, which take advantage of the properties of polynomials to reduce the proof size and communication consumption. Through the ingenious use of aggregated proofs and smart contracts, the verification efficiency of blocks is improved and the communication pressure on nodes is reduced.
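As a baseline for the comparison the abstract draws, a minimal Merkle tree sketch shows why per-transaction proofs grow logarithmically with the number of transactions and must be checked one by one, whereas a polynomial commitment (e.g., KZG, not implemented here) answers with a constant-size, batchable opening.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a binary Merkle tree (a lone node at a level is duplicated)."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes from leaf to root: O(log n) hashes per transaction."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))  # (hash, sibling-is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the root from the leaf and its sibling path."""
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root
```

For 8 transactions the proof already carries 3 sibling hashes, and n separate transactions need n separate paths; that growth is exactly the communication cost the polynomial-commitment replacement targets.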
Funding: This work was supported in part by the National Natural Science Foundation of China under Grant No. 62272236, in part by the Natural Science Foundation of Jiangsu Province under Grant Nos. BK20201136 and BK20191401, and in part by the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD) fund.
Abstract: Gesture recognition technology enables machines to read human gestures and has significant application prospects in human-computer interaction and sign language translation. Existing studies usually use convolutional neural networks to extract features directly from raw gesture data, but the networks are affected by interference in the input data and thus fit unimportant features. In this paper, we propose a novel method for encoding spatio-temporal information that enhances the key features required for gesture recognition, such as the shape, structure, contour, position, and motion of the hands, thereby improving recognition accuracy. This method encodes an arbitrary number of frames of gesture data into a single-frame spatio-temporal feature map, which serves as the input to the neural network. This guides the model to fit important features while avoiding complex recurrent network structures for extracting temporal features. In addition, we designed two sub-networks and trained the model with a sub-network pre-training strategy that trains the sub-networks first and then the entire network, so as to prevent the sub-networks from focusing too much on single-category features or being overly influenced by each other's features. Experimental results on two public gesture datasets show that the proposed spatio-temporal information encoding method achieves state-of-the-art accuracy.
Funding: Supported by the NSFC (No. 62072249). Yongjun Ren received the grant, and the sponsor's website is https://www.nsfc.gov.cn/.
Abstract: Due to the extensive use of various intelligent terminals and the popularity of social networking tools, a large amount of data has emerged in the medical field. How to manage these massive data safely and reliably has become an important challenge for the medical network community. This paper proposes a data management framework for the medical network community based on Consortium Blockchain (CB) and Federated Learning (FL), which realizes secure data sharing between medical institutions and research institutions. Under this framework, a smart-contract-based data sharing mechanism and a privacy protection mechanism based on FL and the consortium chain are designed to ensure the security of data and the privacy of important data in the medical network community, respectively. A smart contract system based on the Keyed-Homomorphic Public Key (KH-PKE) encryption scheme is designed so that medical data can be saved in the CB in ciphertext form while data sharing is automated. A zero-knowledge mechanism is used to ensure the correctness of shared data. Moreover, the zero-knowledge mechanism incorporates a dynamic group signature scheme with chosen-ciphertext attack (CCA) anonymity, which makes the scheme more efficient in computation and communication cost. At the end of this paper, the performance of the scheme is analyzed from both asymptotic and practical aspects. Experimental comparisons show that the proposed scheme is effective and feasible.
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 61772280 and 62072249.
Abstract: A Generative Adversarial Network (GAN) is designed based on deep learning for the Super-Resolution (SR) reconstruction of temperature fields (comparable to downscaling in the meteorological field), which is limited by the small number of ground stations and the sparse distribution of observations, resulting in a lack of fineness in the data. To improve the network's generalization performance, residual structures and batch normalization are used. The nearest-neighbor interpolation method is applied to avoid over-smoothing of the climate element values, instead of the Bicubic interpolation conventional in the computer vision field. Sub-pixel convolution is used instead of transposed convolution or interpolation for up-sampling to speed up network inference. The experimental dataset is the European Centre for Medium-Range Weather Forecasts Reanalysis v5 (ERA5) with a resolution of 0.1°×0.1° in both directions. Moreover, the task aims to scale up the size by a factor of 8, which is rare compared to conventional methods. The comparison methods include traditional interpolation and widely used GAN-based networks such as SRGAN. The final experimental results show that the proposed scheme improves the Root Mean Square Error (RMSE) by 37.25%, the Peak Signal-to-Noise Ratio (PSNR) by 14.4%, and the Structural Similarity (SSIM) by 10.3% compared to Bicubic interpolation. Compared with the traditional SRGAN network, a relatively obvious performance improvement is also observed. Meanwhile, the GAN converges stably and reaches an approximate Nash equilibrium for various initialization parameters, empirically illustrating the effectiveness of the method on temperature fields.
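The reported gains are in standard image-quality metrics whose definitions fit in a few lines. A pure-Python sketch, with flat lists standing in for temperature grids and an assumed peak value of 1.0 for normalized fields:

```python
import math

def rmse(a, b):
    """Root mean square error between two equal-sized grids (flat lists)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def psnr(a, b, peak=1.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    e = rmse(a, b)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)
```

Lower RMSE and higher PSNR both mean the super-resolved field is closer to the reference; SSIM additionally compares local structure and is omitted here for brevity.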
Funding: This work was supported by China's National Natural Science Foundation (Nos. 62072249, 62072056); Jin Wang and Yongjun Ren received the grants, and the sponsor's website is https://www.nsfc.gov.cn/. This work was also funded by Researchers Supporting Project No. RSP-2021/102, King Saud University, Riyadh, Saudi Arabia.
Abstract: Since transactions in blockchain are based on public ledger verification, this raises security concerns about privacy protection. Verifying whole transactions on the chain also causes data to accumulate, resulting in low block verification efficiency. In order to improve the efficiency and privacy protection of block data verification, this paper proposes an efficient block verification mechanism with privacy protection based on zero-knowledge proofs (ZKP), which not only protects the privacy of users but also improves the speed of block data verification. There is no need to put the whole transaction on the chain when verifying block data: it suffices to generate the ZKP and the root hash from the transaction information and save them to the smart contract for verification. The ZKP verification in the smart contract realizes both privacy protection for the transaction and efficient verification of the block. When the data is validated, the buffer accepts the complete transaction, updates the transaction status in the cloud database, and packages it onto the chain. Thus, the ZKP strengthens the privacy protection ability of the blockchain, and the smart contracts save the time cost of block verification.
Funding: Supported by the National Key R&D Program of China (Grant Nos. 2021YFB2700503, 2020YFB1005900) and by the National Natural Science Foundation of China (No. 62072249).
Abstract: In data communication systems, the real-time information interaction of communication devices increases the risk of privacy-sensitive data being tampered with. Therefore, maintaining data security is one of the most important issues in network data communication. Because the timestamp is the most important way to authenticate data in information interaction, it is necessary to provide a timestamp service in the data communication system. However, the existing centralized timestamp mechanism struggles to provide a credible timestamp service, and users can conspire with timestamping servers to forge timestamps. Therefore, this paper designs a distributed timestamp mechanism based on continuous verifiable delay functions. It utilizes multiple independent timestamp servers to provide timestamp services in a distributed model and appends the timestamp to the data as soon as the data is generated. Thus, it can prove that the data already existed at a certain time and ensure the accuracy of the timestamp. Moreover, a digital blind signature based on elliptic curve cryptography is utilized to solve the problem of timestamp forgery in the timestamp service. Finally, the security analysis of the scheme ensures the data security of the data communication system and the concurrency rate of timestamping. The experimental results also show that the scheme greatly improves the efficiency of digital signatures.
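The core primitive here, a verifiable delay function, forces T inherently sequential steps between input and output. A toy iterated-squaring sketch conveys the idea; note that the modulus below is tiny and the verifier naively re-runs the whole computation, whereas real constructions (e.g., Wesolowski's or Pietrzak's) work in an RSA or class group and attach a succinct proof checkable in roughly O(log T).

```python
import hashlib

def delay_eval(x: int, t: int, n: int) -> int:
    """Iterated squaring x^(2^t) mod n: t sequential squarings."""
    y = x % n
    for _ in range(t):
        y = y * y % n
    return y

def timestamp(data: bytes, t: int, n: int):
    """Bind data to a delay output; a real VDF would add a succinct proof."""
    x = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
    return x, delay_eval(x, t, n)

def verify_timestamp(data: bytes, t: int, n: int, x: int, y: int) -> bool:
    """Naive check by re-evaluation (a real VDF verifies far faster)."""
    xi = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
    return xi == x and delay_eval(x, t, n) == y
```

Because the t squarings cannot be parallelized, producing a valid (x, y) pair demonstrates that real time elapsed after the data existed, which is what lets the timestamp resist back-dating.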
Funding: Supported by the Science and Technology Achievement Transformation Project of Scientific Research Institutions (2021JDZH0019), the National Modern Agricultural Industry Technology System of the Ministry of Agriculture (CARS-43-D-1), the Sichuan Provincial Breeding Research Program (2021YFYZ0033), the Special Fund for Basic Scientific Research Business of Sichuan Animal Science Academy (SASA202105, SASA202305), the Natural Science Foundation of Sichuan Province (2023NSFSC0171), and the Basic Research for Application of Sichuan Provincial Science and Technology Planning Project (2021YJ0267).
Abstract: [Objectives] miRNAs play an important role in the proliferation and differentiation of myoblasts. This study was conducted to elucidate the complex genetic mechanisms that affect the meat production performance of Sichuan white rabbits and to reveal the regulatory role of miRNAs in their muscle growth and meat quality formation. [Methods] Three skeletal muscle libraries constructed from six-month-old Sichuan white rabbits were sequenced by Solexa technology to identify known miRNAs, predict new miRNAs, and construct an expression profile of muscle miRNAs. [Results] A total of 511 known miRNAs and 42 new miRNAs were detected in 34,089,472 clean sequences, and miRNAs with a length of 22 nt were the most abundant. Known miRNA sequences accounted for 71.38% of the clean sequences, much higher than the proportion of other RNA types. Sequences from exons accounted for 0.38%, indicating a low degree of mRNA degradation in the samples. Base U had the highest proportion at the first position, and the most frequent bases at positions 8 and 10 were U and A, respectively. Muscle-specific miRNAs (miR-1, miR-133, and miR-206) ranked in the top 10 in expression level. The number and expression levels of new miRNAs were lower than those of known miRNAs. The length distribution, positional base bias, and expression profile characteristics of the miRNAs may be related to their biological function in regulating muscle proliferation and differentiation and to their mechanisms of action on target genes. [Conclusions] The identification and expression profiling of miRNAs in muscle tissues of Sichuan white rabbits will help elucidate the complex molecular mechanisms of meat production performance and provide a theoretical basis for functional research on miRNAs in meat rabbits.
Abstract: Aortic dissection (AD) is a life-threatening clinical emergency requiring rapid diagnosis and effective intervention to improve patient survival and prognosis. Computed tomography angiography (CTA) can diagnose AD accurately and quickly, making it the first choice for diagnosing AD in an emergency. This article reviews the application of CTA in the diagnosis and treatment of AD.
Funding: This work is supported by the NSFC (Nos. 61772280, 61772454), the Changzhou Sci&Tech Program (No. CJ20179027), and the PAPD fund from NUIST. Prof. Jin Wang is the corresponding author.
Abstract: Since the British National Archives put forward the concept of digital continuity in 2007, several developed countries have worked out digital continuity action plans. However, technologies to guarantee digital continuity are still lacking. This paper first analyzes the requirements of the digital continuity guarantee for electronic records based on data quality theory, and points out the necessity of data quality assurance for electronic records. Moreover, we recast the digital continuity guarantee of electronic records as ensuring their consistency, completeness, and timeliness, and construct the first technology framework for the digital continuity guarantee of electronic records. Finally, temporal functional dependency technology is utilized to build the first integration method to ensure the consistency, completeness, and timeliness of electronic records.
Funding: This work was supported by the National Natural Science Foundation of China under Grants 62072249, 61772280, 61772454, and 62072056; J. Wang and Y. Ren received the grants, and the sponsor's website is http://www.nsfc.gov.cn/. This work was also supported by the Project of Transformation and Upgrading of Industries and Information Technologies of Jiangsu Province (No. JITC-1900AX2038/01); X. Yu received the grant, and the sponsor's website is http://gxt.jiangsu.gov.cn/.
Abstract: Distributed storage can store data on multiple devices or servers to improve data security. However, with today's explosive growth of network data, traditional distributed storage schemes face severe challenges such as insufficient performance, data tampering, and data loss. A distributed storage scheme based on blockchain has been proposed to improve the security and efficiency of traditional distributed storage, and this paper makes the following improvements to it. We first analyze the problems faced by distributed storage, and then propose a new distributed storage blockchain scheme built on a sharding blockchain. The proposed scheme partitions the network and nodes by means of blockchain sharding technology, which improves the efficiency of data verification between nodes. In addition, we use polynomial commitments to construct a new verifiable secret sharing scheme called PolyVSS. This new scheme is one of the foundations of our improved distributed storage blockchain scheme. Compared with previous schemes, it does not require a trusted third party and has new features such as homomorphism and batch opening, further improving the security of VSS. Experimental comparisons show that the proposed scheme significantly reduces storage and communication costs.
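PolyVSS itself is not reproduced here, but the layer it verifies is standard Shamir secret sharing over a prime field. The minimal sketch below shares and reconstructs a secret; the polynomial-commitment machinery that makes the sharing *verifiable* (and enables batch opening) is omitted, and the modulus and seed are illustrative.

```python
import random

P = 2**61 - 1  # Mersenne prime field modulus (illustrative)

def make_shares(secret, threshold, n, seed=7):
    """Shamir (t, n) sharing: the secret is f(0) of a degree t-1 polynomial."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):      # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any threshold-sized subset of shares reconstructs the secret; a verifiable variant additionally publishes a commitment to the polynomial so each shareholder can check that their share really lies on it.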
Funding: Supported by the NSFC (Nos. 61772280, 62072249), the AI recognition scoring system of weather maps (No. SYCX202011), the national training programs of innovation and entrepreneurship for undergraduates (Nos. 201910300123Y, 202010300200), and the PAPD fund from NUIST. Jinyue Xia is the corresponding author.
Abstract: Promoted jointly by the wave of urbanization and the rise of data science, smart cities have become a new concept and practice of urban development. Smart cities combine urbanization with information technology represented by the Internet of Things, cloud computing, mobile networks, and big data. How to effectively achieve the long-term preservation of the massive, heterogeneous, multi-source digital electronic records of smart cities is a key issue that must be solved. Digital continuity can ensure the accessibility, integrity, and availability of information. The quality management of electronic records, like the quality management of products, runs through every phase of the urban lifecycle. Based on data quality management, this paper constructs digital continuity for smart city electronic records. Furthermore, the work in this paper ensures the authenticity, integrity, availability, and timeliness of electronic records through their quality management. This paper elaborates the overall technical architecture for electronic records, as well as the various technical means needed to protect these four characteristics.
Funding: This work is supported by the NSFC (Nos. 61772280, 61772454, 6171101570, 61702236), the Natural Science Foundation of Jiangsu Province under Grant No. BK20150460, the Changzhou Sci&Tech Program (No. CJ20179027), and the PAPD fund from NUIST. Prof. Hye-Jin Kim is the corresponding author.
Abstract: Cloud storage represents the trend of intensive, large-scale, and specialized information technology; it has changed the technical architecture and implementation of electronic records management and will provide a convenient way to achieve more advanced and efficient management of electronic records. However, in the cloud storage environment it is difficult to guarantee the trustworthiness of electronic records, which poses a series of severe challenges to electronic records management. Starting from the definition and specification of electronic records, this paper first analyzes the requirements for trustworthiness in cloud storage during long-term preservation according to information security theory, and subdivides trustworthiness into the authenticity, integrity, usability, and reliability of electronic records in cloud storage. Moreover, this paper proposes a technology framework for the preservation of trusted electronic records. Blockchain technology, proofs of retrievability, the Open Archival Information System model, and erasure codes are adopted to protect these four security attributes and guarantee the credibility of electronic records.
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61772280, by the China Special Fund for Meteorological Research in the Public Interest under Grant GYHY201306070, and by the Jiangsu Province Innovation and Entrepreneurship Training Program for College Students under Grant No. 201910300122Y.
Abstract: The application field of the Internet of Things (IoT) touches all aspects of life, and its application in industry, agriculture, the environment, transportation, logistics, security, and other infrastructure has effectively promoted the intelligent development of these areas. Although the IoT has grown steadily in recent years, many problems remain to be overcome in terms of technology, management, cost, policy, and security. We need to constantly weigh the benefits of trusting IoT products against the risk of leaking private data. To avoid the leakage and loss of user data, this paper develops a hybrid algorithm combining kernel functions and random perturbation, built on non-negative matrix factorization, which realizes personalized recommendation while solving the problem of user privacy protection in the recommendation process. Compared with the plain non-negative matrix factorization privacy-preserving algorithm, the new algorithm does not need detailed information about the data, only the relationships between data items, and it can process data points with negative features. Experiments show that the new algorithm produces recommendation results with reasonable accuracy while preserving users' personal privacy.
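The perturbation-plus-factorization idea can be sketched with Lee-Seung multiplicative updates after adding Laplace-style noise to the ratings matrix, so the factorizer never sees exact values. This is only an illustrative stand-in: the paper's kernel-function component is omitted, and all sizes, noise scales, and iteration counts below are hypothetical.

```python
import numpy as np

def perturbed_nmf(R, k, noise=0.01, iters=200, seed=0):
    """Factor a noise-perturbed nonnegative matrix R ~= W @ H (W, H >= 0)."""
    rng = np.random.default_rng(seed)
    # Perturb first, then clip to keep the input nonnegative for NMF.
    Rp = np.clip(R + rng.laplace(0.0, noise, R.shape), 0.0, None)
    m, n = Rp.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    for _ in range(iters):  # Lee-Seung multiplicative update rules
        H *= (W.T @ Rp) / (W.T @ W @ H + 1e-9)
        W *= (Rp @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H
```

Because the updates are multiplicative, W and H stay nonnegative throughout, and with small noise the product W @ H still approximates the original ratings closely enough for recommendation.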
Funding: This research was supported by the National Natural Science Foundation of China under Grant No. 61772280, by the China Special Fund for Meteorological Research in the Public Interest under Grant GYHY201306070, and by the Jiangsu Province Innovation and Entrepreneurship Training Program for College Students under Grant No. 201810300079X.
Abstract: In view of the low accuracy of traditional ground nephogram recognition models, the authors put forward a neural network ensemble method based on the k-means algorithm, which takes a BP neural network ensemble model as its basis, uses the k-means algorithm to choose individual neural networks with partial diversity for integration, and builds a cloud form classification model. Simulation experiments on ground nephogram samples show that the proposed algorithm effectively improves classification accuracy in comparison with a single BP neural network and the traditional BP-AdaBoost ensemble algorithm.
Abstract: In order to solve the problem that real-time face recognition is susceptible to illumination changes, this paper proposes a face recognition method that combines Local Binary Patterns (LBP) and an Embedded Hidden Markov Model (EHMM). The method first performs LBP preprocessing on the input face image, then extracts the feature vector, and finally sends the extracted feature observation vectors to the EHMM for training or recognition. Experiments on multiple face databases show that the proposed algorithm is robust to illumination and improves the recognition rate.
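The LBP preprocessing step fits in a few lines: each pixel becomes an 8-bit code recording comparisons with its neighbors, which is what gives the representation its tolerance to illumination changes (adding a constant brightness to every pixel leaves every comparison, hence every code, unchanged). Pure Python, with nested lists standing in for an image:

```python
def lbp_pixel(img, r, c):
    """8-neighbor local binary pattern code for interior pixel (r, c)."""
    center = img[r][c]
    # Clockwise from the top-left neighbor; each contributes one bit.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code
```

Histograms of these codes over image blocks form the feature observation vectors that, in the paper's pipeline, are fed to the EHMM.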
Funding: This work is supported by the NSFC (Nos. 61772280, 61702236), the Changzhou Sci&Tech Program (No. CJ20179027), and the PAPD fund from NUIST. Prof.
Abstract: With the diversification of electronic devices, cloud-based services have become the link between different devices. As a cryptosystem with a secure conversion function, proxy re-encryption enables secure sharing of data in a cloud environment. Proxy re-encryption is a public key encryption system with a ciphertext conversion function: a semi-trusted agent plays the role of ciphertext converter, transforming a ciphertext under the delegator's public key into a ciphertext of the same plaintext under the delegatee's public key. Proxy re-encryption has been a hotspot in the field of information security since it was proposed by Blaze et al. [Blaze, Bleumer and Strauss (1998)]. After 20 years of development, proxy re-encryption has evolved into many forms and has been widely used. This paper elaborates the definition, characteristics, and development status of proxy re-encryption, and classifies schemes from the perspectives of user identity, conversion condition, conversion hop count, and conversion direction. Existing schemes are compared and briefly reviewed in terms of features, performance, and security. Finally, this paper looks forward to possible future directions for proxy re-encryption.
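The classic construction this survey starts from, the Blaze-Bleumer-Strauss ElGamal-based scheme, fits in a few lines. The sketch below uses a toy subgroup of Z_p* (p = 2039 is far too small for real security; all key and randomness values are illustrative): the re-encryption key b/a mod q lets the proxy convert Alice's ciphertext into Bob's without ever seeing the plaintext or either secret key in full.

```python
# Toy BBS-style proxy re-encryption. P is far too small for real use.
P, Q, G = 2039, 1019, 4   # G generates the order-Q subgroup of Z_P*

def keygen(sk):
    """Public key g^sk for a secret key sk in [1, Q-1]."""
    return pow(G, sk, P)

def encrypt(pk, m, r):
    """Ciphertext under pk = g^a: (m * g^r, pk^r = g^(a*r))."""
    return m * pow(G, r, P) % P, pow(pk, r, P)

def rekey(sk_a, sk_b):
    """Re-encryption key b * a^(-1) mod Q, handed to the semi-trusted proxy."""
    return sk_b * pow(sk_a, Q - 2, Q) % Q

def reencrypt(ct, rk):
    """Proxy step: raises g^(a*r) to b/a, giving g^(b*r); m is untouched."""
    c1, c2 = ct
    return c1, pow(c2, rk, P)

def decrypt(sk, ct):
    """Recover m = c1 / g^r, where g^r = c2^(1/sk) in the subgroup."""
    c1, c2 = ct
    g_r = pow(c2, pow(sk, Q - 2, Q), P)
    return c1 * pow(g_r, P - 2, P) % P
```

Note the trust model this illustrates: the proxy holding rk can only translate ciphertexts between the two keys; it learns neither the message nor a or b individually, which is exactly the "semi-trusted agent" role described above.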
Funding: This research was supported by the National Natural Science Foundation of China under Grant Nos. 61772280 and 62072249.
Abstract: As an important maritime hub, Bohai Bay provides great convenience for shipping but suffers sea ice disasters of varying severity every winter, which greatly affect the socio-economic development of the region. Therefore, this paper uses FY-4A (a meteorological satellite) data to study sea ice in the Bohai Sea. After processing the data for land removal and cloud detection, it combines a multi-channel threshold method with an adaptive threshold algorithm to recognize Bohai Sea ice under clear-sky conditions. The random forest classification algorithm is introduced to achieve sea ice classification and recognition under cloud cover. Under non-clear-sky conditions, the results of Bohai Sea ice identification based on random forests are improved; the algorithm can effectively identify Bohai Sea ice and improve identification accuracy, laying a foundation for the accuracy and stability of sea ice identification. It realizes sea ice identification in the Bohai Sea and provides data and algorithm support for departments concerned with marine climate forecasting.