In traditional digital twin communication system testing, we can apply test cases as completely as possible in order to ensure the correctness of the system implementation, and even then, there is no guarantee that the digital twin communication system implementation is completely correct. Formal verification is currently recognized as a method to ensure the correctness of software systems for communication in digital twins, because it uses rigorous mathematical methods to verify the correctness of such systems and can effectively help system designers determine whether the system is designed and implemented correctly. In this paper, we use the interactive theorem proving tool Isabelle/HOL to construct a formal model of the X86 architecture and to model the related assembly instructions. The verification result shows that the system states obtained after executing the relevant assembly instructions are consistent with the expected states, indicating that the system meets its design expectations.
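The abstract does not reproduce the Isabelle/HOL theory itself. As a rough, hypothetical illustration of the underlying idea, machine state can be modeled as a record and each assembly instruction as a state-transition function whose post-state is checked against the expected state. The Python sketch below is only an analogy (names such as X86State and exec_mov_imm are invented for illustration); in Isabelle/HOL the corresponding statement would be proved as a lemma over all states rather than asserted for one concrete run.

```python
# Minimal sketch (not the paper's Isabelle/HOL theory): machine state as an
# immutable record, one instruction as a state-transition function, and an
# assertion standing in for the proof obligation. All names are illustrative.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class X86State:
    regs: tuple   # toy register file, e.g. (eax, ebx)
    zf: bool      # zero flag

def exec_mov_imm(state: X86State, reg_index: int, imm: int) -> X86State:
    """Semantics of `mov reg, imm`: write imm into the register; flags unchanged."""
    regs = list(state.regs)
    regs[reg_index] = imm
    return replace(state, regs=tuple(regs))

pre = X86State(regs=(0, 0), zf=False)
post = exec_mov_imm(pre, 0, 42)
assert post == X86State(regs=(42, 0), zf=False)   # post-state matches the expected state
```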
The consensus of the automotive industry and traffic management authorities is that autonomous vehicles must follow the same traffic laws as human drivers. Using formal or digital methods, natural language traffic rules can be translated into machine language and used by autonomous vehicles. In this paper, a translation flow is designed. Beyond the translation, a deeper examination is required, because the semantics of natural languages are rich and complex, and frequently contain hidden assumptions. The issue of how to ensure that digital rules are accurate and consistent with the original intent of the traffic rules they represent is both significant and unresolved. In response, we propose a method of formal verification that combines equivalence verification with model checking. Reasonable and reassuring digital traffic rules can be obtained by utilizing the proposed traffic rule digitization flow and verification method. In addition, we offer a number of simulation applications that employ digital traffic rules to assess vehicle violations. The experimental findings indicate that our digital rules utilizing metric temporal logic (MTL) can be easily incorporated into simulation platforms and autonomous driving systems (ADS).
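The abstract does not give a concrete digitized rule, so the following is only a hedged sketch of how a bounded MTL-style obligation, for example "globally, if the traffic light is red then the vehicle stops within 2 s", might be evaluated over a sampled simulation trace. The predicate names, trace format, and sampling step are assumptions, not the paper's encoding.

```python
# Hedged sketch of evaluating a bounded MTL-style rule over a discrete trace:
#   G (red_light -> F_[0,2s] stopped)
# Trace format, predicate names, and the sampling step are assumptions.
def satisfies_red_light_rule(trace, dt=0.5, horizon_s=2.0):
    window = int(horizon_s / dt)
    for i, state in enumerate(trace):
        if state["red_light"]:
            # "eventually within [0, 2 s]" over the sampled suffix
            if not any(s["stopped"] for s in trace[i:i + window + 1]):
                return False
    return True

trace = [{"red_light": t >= 2, "stopped": t >= 3} for t in range(6)]  # 1 s steps
print(satisfies_red_light_rule(trace, dt=1.0))  # True: the vehicle stops 1 s after red
```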
Recent advancements in satellite technologies and the declining cost of access to space have led to the emergence of large satellite constellations in Low Earth Orbit (LEO). However, these constellations often rely on a bent-pipe architecture, resulting in high communication costs. Existing onboard inference architectures suffer from limitations in terms of low accuracy and inflexibility in the deployment and management of in-orbit applications. To address these challenges, we propose a cloud-native-based satellite design specifically tailored for Earth Observation tasks, enabling diverse computing paradigms. In this work, we present a case study of a satellite-ground collaborative inference system deployed in the Tiansuan constellation, demonstrating a remarkable 50% accuracy improvement and a substantial 90% data reduction. Our work also sheds light on the in-orbit energy budget, where in-orbit computing accounts for 17% of the total onboard energy consumption. Our approach represents a significant advancement in cloud-native satellites, aiming to enhance the accuracy of in-orbit computing while simultaneously reducing communication costs.
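The gating idea behind satellite-ground collaborative inference can be sketched as follows: a lightweight on-board model scores each captured tile, and only tiles whose score clears a threshold are downlinked for full ground-side inference, which is how a large data reduction is obtained. The model, threshold, and tile format below are assumptions for illustration, not the Tiansuan deployment.

```python
# Hedged sketch of on-board gating before downlink; scores are random
# stand-ins for the confidences a small on-board model would produce.
import numpy as np

def onboard_filter(tile_ids, scores, threshold):
    keep = [t for t, s in zip(tile_ids, scores) if s >= threshold]
    return keep, 1 - len(keep) / len(tile_ids)

rng = np.random.default_rng(0)
tile_ids = list(range(1000))                  # captured image tiles
scores = rng.random(1000)                     # stand-in for on-board model confidences
downlink, reduction = onboard_filter(tile_ids, scores, threshold=0.9)
print(f"downlinking {len(downlink)} of {len(tile_ids)} tiles "
      f"(~{reduction:.0%} data reduction)")   # roughly 90% with this threshold
```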
The maturity of 5G technology has enabled crowd-sensing services to collect multimedia data over wireless networks, which has promoted the application of crowd-sensing services in different fields but also brings more privacy and security challenges, the most common of which is privacy leakage. As a privacy protection technology combining data integrity checking with identity anonymity, the ring signature is widely used in the field of privacy protection. However, introducing signature technology leads to additional signature verification overhead. In crowd-sensing scenarios, existing signature schemes have low efficiency in multi-signature verification. Therefore, it is necessary to design an efficient multi-signature verification scheme while ensuring security. In this paper, a batch-verifiable signature scheme is proposed for the crowd-sensing setting, which allows the sensing platform to verify multiple uploaded signed data items efficiently, thereby overcoming the defects of traditional signature schemes in multi-signature verification. In our proposal, a method for linking homologous data is also presented, which is valuable for incentive mechanisms and data analysis. Simulation results show that the proposed scheme performs well in terms of security and efficiency in crowd-sensing applications with a large number of users and data.
Dynamic signature is a biometric modality that recognizes an individual's anatomic and behavioural characteristics when signing their name. The rampant incidence of signature falsification (identity theft) was the key motivating factor for embarking on this study, which was necessitated by the damages and dangers posed by signature forgery coupled with the intractable nature of the problem. The aim and objectives of this study were to design a proactive and responsive system that could compare two signature samples and distinguish the genuine signature from the forged one. Dynamic signature verification is an important biometric technique that aims to detect whether a given signature is genuine or forged. In this research work, Convolutional Neural Networks (CNNs or ConvNets), a class of deep, feed-forward artificial neural networks that has successfully been applied to analysing visual imagery, were used to train the model. The signature images are stored in a file directory structure that the Keras Python library can work with. The CNN was then implemented in Python using Keras with the TensorFlow backend to learn the patterns associated with the signatures. The results showed that, for the same CNN-based network, the larger the training dataset, the higher the test accuracy; however, acceptable results can still be obtained even when the training dataset is limited. The paper concluded that by training the datasets with the CNN, 98% accuracy was recorded; in the experimental part, the model achieved a high degree of accuracy in the classification of the biometric parameters used.
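The abstract names Keras with the TensorFlow backend and a directory-based image layout. As a minimal, hedged sketch of such a genuine-vs-forged classifier (the directory names, image size, architecture depth, and hyperparameters below are assumptions, not the paper's setup):

```python
# Minimal Keras sketch of a binary genuine-vs-forged signature classifier.
# Directory layout ("signatures/train", "signatures/val"), image size, and
# hyperparameters are illustrative assumptions.
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "signatures/train", label_mode="binary", image_size=(128, 128), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "signatures/val", label_mode="binary", image_size=(128, 128), batch_size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # genuine vs. forged
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```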
Blockchain can realize the reliable storage of a large amount of data that is chronologically related and verifiable within the system. This technology has been widely used and has developed rapidly in big data systems across various fields. An increasing number of users are participating in application systems that use blockchain as their underlying architecture. As the number of transactions and the capital involved in blockchain grow, ensuring information security becomes imperative. Addressing the verification of transactional information security and privacy has emerged as a critical challenge. Blockchain-based verification methods can effectively eliminate the need for centralized third-party organizations. However, the efficiency of nodes in storing and verifying blockchain data faces unprecedented challenges. To address this issue, this paper introduces an efficient verification scheme for transaction security. Initially, it presents a node evaluation module to estimate the activity level of user nodes participating in transactions, accompanied by a probabilistic analysis for all transactions. Subsequently, this paper optimizes the conventional transaction organization form, introduces a heterogeneous Merkle tree storage structure, and designs algorithms for constructing these heterogeneous trees. Theoretical analyses and simulation experiments conclusively demonstrate the superior performance of this scheme. When verifying the same number of transactions, the heterogeneous Merkle tree transmits less data and is more efficient than traditional methods. The findings indicate that the heterogeneous Merkle tree structure is suitable for various blockchain applications, including the Internet of Things. This scheme can markedly enhance the efficiency of information verification and bolster the security of distributed systems.
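The paper's heterogeneous tree construction is not reproduced here; for reference, the baseline it improves on is the standard balanced Merkle tree, sketched below (root construction and membership-proof verification with SHA-256). The heterogeneous variant reportedly also shapes the tree using node-activity estimates, which this sketch does not attempt.

```python
# Baseline (standard, balanced) Merkle tree: root construction and
# membership-proof verification. The paper's heterogeneous tree is not shown.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify_proof(leaf, proof, root):
    """proof: list of (sibling_hash, sibling_is_right) pairs from leaf to root."""
    node = h(leaf)
    for sibling, is_right in proof:
        node = h(node + sibling) if is_right else h(sibling + node)
    return node == root

txs = [b"tx0", b"tx1", b"tx2", b"tx3"]
root = merkle_root(txs)
proof_tx0 = [(h(b"tx1"), True), (h(h(b"tx2") + h(b"tx3")), True)]
print(verify_proof(b"tx0", proof_tx0, root))     # True
```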
Online Signature Verification (OSV), as a personal identification technology, is widely used in various industries. However, it faces challenges such as incomplete feature extraction, low accuracy, and heavy computational cost. To address these issues, we propose a novel approach for online signature verification using a one-dimensional Ghost-ACmix Residual Network (1D-ACGRNet), a residual network that combines convolution with a self-attention mechanism and is further improved by the Ghost method. The Ghost-ACmix residual structure is introduced to leverage both self-attention and convolution for capturing global feature information and extracting local information, effectively complementing whole and local signature features and mitigating the problem of insufficient feature extraction. The Ghost-based Convolution and Self-Attention (ACG) block is then proposed to simplify the parts shared between convolution and self-attention using the Ghost module and to employ feature transformation to obtain intermediate features, thus reducing computational cost. Additionally, feature selection is performed using the random forest method, and the data is dimensionally reduced using Principal Component Analysis (PCA). Finally, tests are carried out on the MCYT-100 and SVC-2004 Task2 datasets, and the equal error rates (EERs) for small-sample training using five genuine and forged signatures are 3.07% and 4.17%, respectively. The EERs for training with ten genuine and forged signatures are 0.91% and 2.12% on the respective datasets. The experimental results illustrate that the proposed approach effectively enhances the accuracy of online signature verification.
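The reported metric is the equal error rate; as a hedged illustration of how an EER is read off a set of match scores (the scores and labels below are synthetic, and the 1D-ACGRNet that would produce them is not reproduced), the EER is the operating point where the false acceptance and false rejection rates cross:

```python
# Hedged sketch: computing an equal error rate (EER) from verification scores.
import numpy as np

def eer(labels, scores):
    """labels: 1 = genuine, 0 = forged; scores: higher = more genuine-like."""
    best = None
    for t in np.sort(np.unique(scores)):
        far = np.mean(scores[labels == 0] >= t)   # false acceptance rate
        frr = np.mean(scores[labels == 1] < t)    # false rejection rate
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.8, 0.10, 200),   # genuine scores (synthetic)
                         rng.normal(0.4, 0.15, 200)])  # forged scores (synthetic)
labels = np.concatenate([np.ones(200), np.zeros(200)])
print(f"EER ≈ {eer(labels, scores):.2%}")
```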
Background: The purpose of the study was to investigate the active ingredients and potential biochemical mechanisms of Juanbi capsule in knee osteoarthritis based on network pharmacology, molecular docking and animal experiments. Methods: Chemical components of each drug in the Juanbi capsule were obtained from the Traditional Chinese Medicine Systems Pharmacology Database and Analysis Platform, while the target proteins for knee osteoarthritis were retrieved from the DrugBank, GeneCards, and OMIM databases. The study compared information on knee osteoarthritis and the targets of the drugs to identify common elements. The data was imported into the STRING platform to generate a protein-protein interaction network diagram. Subsequently, a "component-target" network diagram was created using the screened drug components and target information with Cytoscape software. Common targets were imported into Metascape for GO function and KEGG pathway enrichment analysis. AutoDockTools was utilized to predict the molecular docking of the primary chemical components and core targets. Ultimately, the key targets were validated through animal experiments. Results: Juanbi capsule ameliorated knee osteoarthritis mainly by affecting tumor necrosis factor, interleukin-1β, MMP9, PTGS2, VEGFA, TP53, and other cytokines through quercetin, kaempferol, and β-sitosterol. The drug also influenced the AGE-RAGE, interleukin-17, tumor necrosis factor, Relaxin, and NF-κB signaling pathways. The network pharmacology analysis results were further validated in animal experiments. The results indicated that Juanbi capsule could decrease the levels of tumor necrosis factor-α and interleukin-1β in the serum and synovial fluid of knee osteoarthritis rats and also down-regulate the expression levels of MMP9 and PTGS2 proteins in the articular cartilage. Conclusion: Juanbi capsule may improve the knee bone microstructure and reduce the expression of inflammatory factors of knee osteoarthritis via multiple targets and multiple signaling pathways.
Objective: To apply and verify the intelligent audit rules for urine analysis developed by Cui et al. Method: A total of 1139 urine samples from hospitalized patients in Tai'an Central Hospital from September 2021 to November 2021 were randomly selected, and all samples were examined by manual microscopy after detection on the UN9000 urine analysis line. The intelligent audit rules (including the microscopic review rules and manual verification rules) were validated against the manual microscopic examination and manual audit, and the rules were adjusted to apply to our laboratory. The laboratory turnaround time (TAT) before and after the application of the intelligent audit rules was compared. Result: The microscopic review rate of the intelligent rules was 25.63% (292/1139); the true positive rate, false positive rate, true negative rate, and false negative rate were 27.66% (315/1139), 6.49% (74/1139), 62.34% (710/1139) and 3.51% (40/1139), respectively. The approval consistency rate of the manual verification rules was 84.92% (727/856), the approval inconsistency rate was 0% (0/856), the interception consistency rate was 12.61% (108/856), and the interception inconsistency rate was 0% (0/856). Conclusion: The intelligent audit rules for urine analysis by Cui et al. have good clinical applicability in our laboratory.
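As a small worked check, the reported rates follow directly from the counts given in the abstract (differences in the last decimal place are rounding):

```python
# Worked check of the rates reported above, from the raw counts in the abstract.
total = 1139
rates = {
    "microscopic review rate": 292 / total,
    "true positive rate": 315 / total,
    "false positive rate": 74 / total,
    "true negative rate": 710 / total,
    "false negative rate": 40 / total,
}
for name, value in rates.items():
    print(f"{name}: {value:.2%}")
# 292/1139 ≈ 25.6%, 315/1139 ≈ 27.66%, 74/1139 ≈ 6.5%,
# 710/1139 ≈ 62.34%, 40/1139 ≈ 3.51% — consistent with the abstract.
```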
In order to evaluate the precipitation forecast performance of the mesoscale numerical model for Northeast China, the mesoscale model for Liaoning Province and the T213 model, and to improve forecasters' ability to use their forecast products, synoptic verifications of the 12 h accumulated precipitation forecasts of the three numerical models from May to August 2008 were made on the basis of the different systems affecting weather in Liaoning Province. The forecast lead times were 24, 36, 48 and 60 h. The verified contents covered six aspects, including the intensity and position of the precipitation center and the intensity, location, scope and moving velocity of the main precipitation body. The results showed that the three models had good forecasting capability for precipitation in Liaoning Province, but the capability of each model was obviously different.
This paper introduces a novel verification development platform for the passive UHF RFID tag, which is compatible with the ISO/IEC 18000-6B standard and operates in the 915 MHz ISM band. The platform efficiently reduces design and development time and cost, and enables a fast prototype design of the passive UHF RFID tag. It includes the RFID analog front end and the tag control logic, which is implemented in an Altera ACEX FPGA. The RFID analog front end, fabricated in a Chartered 0.35 μm two-poly four-metal CMOS process, contains a local oscillator, power-on-reset circuit, matching network and backscatter, rectifier, regulator, AM demodulator, etc. The platform achieves rapid, flexible and efficient verification and development, and can also be adapted to other RFID standards by changing the tag control logic in the FPGA.
To achieve an on-demand and dynamic composition model of inter-organizational business processes, a new approach for business process modeling and verification is introduced using the pi-calculus theory. A new business process model, which is multi-role, multi-dimensional, integrated and dynamic, is proposed relying on inter-organizational collaboration. Compatible with the traditional linear sequence model, the new model is an M × N multi-dimensional mesh and provides horizontal and vertical formal descriptions of the collaborative business process model. Finally, the pi-calculus theory is utilized to verify the deadlocks, livelocks and synchronization of the example models. The result shows that the proposed approach is efficient and applicable in inter-organizational business process modeling.
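The pi-calculus models themselves are not shown in the abstract. As a hedged illustration of the kind of property being checked, a deadlock in a finite labelled transition system (which a process-algebra model can be compiled into) is a reachable non-final state with no outgoing transition; the example LTS below is synthetic:

```python
# Hedged illustration of deadlock detection on a finite labelled transition
# system; the LTS here is a synthetic example, not one of the paper's models.
from collections import deque

def find_deadlocks(transitions, initial, final_states=frozenset()):
    """A deadlock is a reachable non-final state with no outgoing transition."""
    seen, queue, deadlocks = {initial}, deque([initial]), []
    while queue:
        state = queue.popleft()
        successors = [dst for (src, _label, dst) in transitions if src == state]
        if not successors and state not in final_states:
            deadlocks.append(state)
        for dst in successors:
            if dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return deadlocks

lts = [("s0", "order", "s1"), ("s1", "invoice", "s2"), ("s1", "cancel", "s3")]
print(find_deadlocks(lts, "s0", final_states=frozenset({"s2"})))  # ['s3']
```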
This paper presents an approximation method to display realistic pictures of numerical control (NC) machining simulation very quickly. The tool movement envelope is divided into many small regions and the normal to these small regions is calculated. The system saves the calculated result in a file before starting the animation display. When the system starts displaying the machining animation, it does not need to calculate the normals of the small triangular facets of the workpiece surface. It only needs to find out what part of the cutter cuts the workpiece surface and to read the normal from the file. A highly efficient NC code verification method is also presented in this paper. The method first detects the error in the z direction. If some points are reported to be out of tolerance, the system divides the neighborhood of these points into smaller grids and calculates the surface normal at each grid intersection and the error in the normal vector direction.
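The core precomputation described above, computing facet normals once and saving them so the animation loop only reads them back, can be sketched as follows; the file name and mesh layout are assumptions for illustration:

```python
# Hedged sketch of the precomputation step: unit normals of small triangular
# facets are computed once (via cross products) and saved, so the animation
# display only needs to read them back.
import numpy as np

def facet_normals(vertices, triangles):
    """vertices: (n, 3) array; triangles: (m, 3) array of vertex indices."""
    v0, v1, v2 = (vertices[triangles[:, i]] for i in range(3))
    n = np.cross(v1 - v0, v2 - v0)                       # facet normal
    return n / np.linalg.norm(n, axis=1, keepdims=True)  # normalise to unit length

vertices = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 1]], float)
triangles = np.array([[0, 1, 2], [1, 3, 2]])
np.save("facet_normals.npy", facet_normals(vertices, triangles))  # precompute once
print(np.load("facet_normals.npy"))  # read back during the animation display
```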
We describe a post resolution-enhancement-technique verification method for use in the manufacturing data flow. The goal of the method is to verify whether designs function as intended, or more precisely, whether the printed images are consistent with the design intent. The process modeling for the model-based verification method is described. The performance of the method is demonstrated by experiment.
A solution is proposed for real-time vehicle verification, which is an important problem for numerous on-road vehicle applications. First, based on the vertical symmetry characteristics of vehicle images, a vertical symmetrical histogram of oriented gradients (VS-HOG) descriptor is proposed for extracting the image features. In the classification stage, an extreme learning machine (ELM) is used to improve real-time performance. Experimental data demonstrate that, compared with other classical methods, the vehicle verification algorithm based on VS-HOG and ELM achieves a better trade-off between cost and performance. The computational cost is reduced by using the algorithm, while keeping the performance loss as low as possible. Furthermore, experimental results show that the proposed vehicle verification method is suitable for on-road vehicle applications due to its better performance in both efficiency and accuracy.
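The classification stage can be sketched: an extreme learning machine fixes random hidden-layer weights and solves the output weights in closed form via the pseudo-inverse, which is what makes it fast. The input features (e.g. VS-HOG descriptors) are assumed precomputed, and the data below is synthetic:

```python
# Hedged sketch of an ELM classifier: fixed random hidden weights, closed-form
# least-squares output weights. Features stand in for precomputed VS-HOG
# descriptors; labels are synthetic vehicle / non-vehicle classes.
import numpy as np

class ELM:
    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))   # fixed random input weights
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                     # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ y                    # closed-form output weights
        return self

    def predict(self, X):
        return (np.tanh(X @ self.W + self.b) @ self.beta > 0.5).astype(int)

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 64))                 # stand-in for VS-HOG feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # synthetic vehicle / non-vehicle labels
print("train accuracy:", (ELM().fit(X, y).predict(X) == y).mean())
```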
This study investigated the impact of different verification-area designs on the sensitive areas identified using the conditional nonlinear optimal perturbation (CNOP) method for tropical cyclone targeted observations. The sensitive areas identified using the first singular vector (FSV) method, which is the linear approximation of CNOP, were also investigated for comparison. By analyzing the validity of the sensitive areas, a proper design for the verification area was developed. Tropical cyclone Rananim, which occurred in August 2004 in the northwest Pacific Ocean, was studied. Two sets of verification areas were designed; one changed position, and the other changed both size and position. The CNOP and its identified sensitive areas were found to be less sensitive to small variations of the verification areas than the FSV and its sensitive areas. With larger variations of the verification area, the CNOP and the FSV, as well as their identified sensitive areas, changed substantially. In terms of reducing forecast errors in the verification area, the CNOP-identified sensitive areas were more beneficial than those identified using the FSV. The design of the verification area is important for cyclone prediction. The verification area should be designed with a proper size according to the possible locations of the cyclone obtained from ensemble forecast results. In addition, the development trend of the cyclone analyzed from its dynamic mechanisms is another reference. When the general position of the verification area was determined, a small variation in size or position had little influence on the results of CNOP.
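The CNOP is not written out in the abstract; for reference, its standard definition in the targeted-observation literature (notation below is assumed, not quoted from the paper) is the initial perturbation that maximizes the nonlinear evolution of the forecast difference over the verification area, subject to an initial-norm constraint:

```latex
% Standard CNOP definition: \delta x_0^* maximizes the nonlinear forecast
% difference measured over the verification area at time T, subject to an
% initial constraint of radius \beta.
J(\delta x_0^{*}) \;=\; \max_{\|\delta x_0\|_a \le \beta}
  \left\| M_T(x_0 + \delta x_0) - M_T(x_0) \right\|_b^{2}
```

Here M_T is the nonlinear propagation operator, β the constraint radius, and ‖·‖_a, ‖·‖_b the chosen initial and verification norms; the FSV corresponds to replacing M_T by its tangent linear approximation, which is why the two methods and their sensitive areas diverge as the verification area (and hence the optimization) changes.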
Objective To verify the Working Group for Obesity in China (WGOC) recommended body mass index (BMI) classification reference for overweight and obesity in Chinese children and adolescents using the data of the 2002 China Nationwide Nutrition and Health Survey. Methods Pediatric metabolic syndrome (MetS) and abnormality of each risk factor for MetS were defined using the criteria for US adolescents. The adult definitions of hyper-TC, LDL, and dyslipidemia were applied as well. The average level and abnormality rate of the metabolic indicators were described by BMI percentiles and compared with general linear model analysis. Receiver operating characteristic (ROC) analysis was used to summarize the potential of BMI to discriminate between the presence and absence of the abnormality of these indicators. Results There was neither a significantly increasing nor a significantly decreasing trend of biochemical parameter levels in the low BMI percentile range (<65th). A slight increasing trend from the 75th percentile and a significant increase were found when BMI ≥ 85th percentile. In general, the prevalence of the examined risk factors varied slightly when the BMI percentile was <75th, and substantial increases were consistently seen when the BMI percentile was ≥75th. As an indicator of hyper-TG, hypertension and MetS, the sensitivity and specificity were equal at a BMI below the 75th percentile, and the Youden's index of the risk factors also reached its peak before the 75th percentile, except for MetS. When the BMI percentile was used as the screening indicator of MetS, Youden's index reached its peak at the 85th percentile, just the point in the ROC graph nearest to the upper left corner. Conclusion The BMI classification reference for overweight and obesity recommended by WGOC is rational for predicting and preventing health risks in Chinese children and adolescents. Lower screening cut-off points, such as the 83rd or 80th percentile, should not be excluded when they are considered as overweight criteria in future intervention or prevention studies.
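As a hedged sketch of the screening analysis described above, sensitivity, specificity, and Youden's index (sensitivity + specificity − 1) can be computed at candidate BMI-percentile cut-offs; the data below are synthetic, not the survey data:

```python
# Hedged sketch: sensitivity, specificity, and Youden's index of BMI-percentile
# cut-offs for predicting a binary risk outcome (e.g. MetS). Data are synthetic.
import numpy as np

def screening_stats(bmi_percentile, outcome, cutoff):
    pred = bmi_percentile >= cutoff
    sens = np.mean(pred[outcome == 1])          # true positive rate
    spec = np.mean(~pred[outcome == 0])         # true negative rate
    return sens, spec, sens + spec - 1          # Youden's index

rng = np.random.default_rng(0)
bmi = rng.uniform(0, 100, 2000)                         # BMI percentiles (synthetic)
risk = rng.random(2000) < (0.02 + 0.3 * (bmi > 80))     # risk rises at high BMI
for cutoff in (65, 75, 85, 95):
    s, p, j = screening_stats(bmi, risk, cutoff)
    print(f"cutoff {cutoff}th: sensitivity={s:.2f}  specificity={p:.2f}  Youden={j:.2f}")
```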
The mathematical model used to describe the detonation multi-physics phenomenon is usually given by highly coupled nonlinear partial differential equations. Numerical simulation and the computer aided engineering (CAE) technique have become the third pillar of detonation research, along with theory and experiment, because the detonation phenomenon is difficult to explain by theoretical analysis, the cost required to accredit the reliability of detonation products is very high, and some physical experiments on detonation are even impossible. The numerical simulation technique can solve these complex problems under realistic conditions repeatedly and dramatically reduce design cost and time. However, the reliability of numerical simulation software and the serviceability of the computational results seriously hinder the extension, application and self-restoration of the simulation software, and restrict its capacity for independent innovation. This article deals with the physical modeling, numerical simulation, and software development of detonation in a unified way. Verification and validation and uncertainty quantification (V&V&UQ) is an important approach to ensuring the credibility of the modeling and simulation of detonation. V&V of detonation is based on our independently developed detonation multi-physics software, LAD2D. We propose a verification method based on mathematical theory and program function, as well as on the availability of its program execution. Validation is carried out by comparing with experimental data. Finally, we discuss the future prospects of numerical simulation software and the CAE technique, and we also highlight research directions for V&V&UQ.