Journal Articles
Found 4,809 articles
Study on An Absolute Non-Collision Hash and Jumping Table IP Classification Algorithms
1
Authors: SHANG Feng-jun 1,2, PAN Ying-jun 1. Affiliations: 1. Key Laboratory of Opto-Electronic Technology and System of Ministry of Education/College of Opto-Electronic Engineering, Chongqing University, Chongqing 400044, China; 2. College of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China. 《Wuhan University Journal of Natural Sciences》 EI CAS, 2004, No. 5, pp. 835-838 (4 pages)
In order to classify packets, we propose a novel IP classification algorithm based on a non-collision hash and jumping table trie-tree (NHJTTT), which builds on the non-collision hash trie-tree and the 2-dimensional classification algorithm proposed by Lakshman and Stiliadis (the LS algorithm). The core of the algorithm consists of two parts: constructing the non-collision hash function, built mainly from the destination/source port and protocol type fields so that the hash function avoids the space explosion problem; and introducing a jumping table trie-tree based on the LS algorithm in order to reduce time complexity. Test results show that the classification rate of the NHJTTT algorithm reaches 1 million packets per second and that the maximum memory consumed is 9 MB for 10,000 rules. CLC number: TN 393.06. Foundation item: Supported by the Chongqing University of Posts and Telecommunications Younger Teacher Foundation (A2003-03). Biography: SHANG Feng-jun (1972-), male, Ph.D. candidate, lecturer; research direction: smart instruments and networks.
Keywords: IP classification, lookup algorithm, trie-tree, non-collision hash, jumping table
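The abstract above keys its collision-free hash function on the destination/source port and protocol type fields. A minimal sketch of that idea (an exact-match table keyed on those narrow header fields, not the paper's actual NHJTTT structure; all names are illustrative):

```python
# Illustrative sketch: classifying packets by hashing the (protocol,
# destination port) header fields. Keying on these narrow fields keeps the
# table small and avoids the space explosion of wider rule keys.

def make_classifier(rules):
    """rules: list of (protocol, dst_port, action) tuples."""
    table = {}
    for proto, dport, action in rules:
        table[(proto, dport)] = action
    return table

def classify(table, proto, dport, default="forward"):
    # Average O(1) lookup via the hash table.
    return table.get((proto, dport), default)

rules = [("tcp", 80, "web-queue"), ("udp", 53, "dns-queue"), ("tcp", 22, "drop")]
table = make_classifier(rules)
print(classify(table, "tcp", 80))    # web-queue
print(classify(table, "udp", 5353))  # forward (no matching rule)
```

The real algorithm layers a jumping-table trie on top of such a hash stage to handle prefix and range rules; the sketch only shows the exact-match hashing step.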
Underwater Pulse Waveform Recognition Based on Hash Aggregate Discriminant Network
2
Authors: WANG Fangchen, ZHONG Guoqiang, WANG Liang. 《Journal of Ocean University of China》 SCIE CAS CSCD, 2024, No. 3, pp. 654-660 (7 pages)
Underwater pulse waveform recognition is an important method for underwater object detection. Most existing works focus on the application of traditional pattern recognition methods, which ignore the time- and space-varying characteristics of sound propagation channels and cannot easily extract valuable waveform features. Sound propagation channels in seawater are time- and space-varying convolutional channels. When extracting the waveform features of underwater acoustic signals, high-accuracy recognition depends on eliminating the influence of these time- and space-varying convolutional channels to the greatest extent possible. We propose a hash aggregate discriminative network (HADN), which combines hash learning and deep learning to minimize the time- and space-varying effects on convolutional channels and adaptively learns effective underwater waveform features to achieve high-accuracy underwater pulse waveform recognition. In the extraction of the hash features of acoustic signals, a discrete constraint between clusters within a hash feature class is introduced. This constraint ensures that the influence of convolutional channels on hash features is minimized. In addition, we design a new loss function called aggregate discriminative loss (AD-loss). The combined use of AD-loss and softmax-loss increases the discriminativeness of the learned hash features. Experimental results show that on pool and ocean datasets, which were collected in pools and oceans, respectively, using acoustic collectors, the proposed HADN performs better than other comparative models in terms of accuracy and mAP.
Keywords: convolutional channel, hash aggregate discriminative network, aggregate discriminant loss, waveform recognition
Design and Implementation of a Question Bank Duplicate Checking System Based on the Simhash Algorithm
3
Authors: 熊良钰, 邓伦丹. 《科学技术创新》, 2024, No. 9, pp. 91-94 (4 pages)
The Simhash algorithm is a technique based on locality-sensitive hashing (LSH), known for its fast computation and high duplicate-detection accuracy. The algorithm converts text features into binary codes and then evaluates text similarity by computing the Hamming distance between these codes. Simhash has shown notable effectiveness in many areas such as text deduplication and duplicate document detection. In view of this, applying the Simhash algorithm to question bank duplicate checking has high feasibility and practical value.
Keywords: Simhash algorithm, Hamming distance, question bank duplicate checking system, text similarity computation, hash function
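The Simhash-plus-Hamming-distance pipeline described above can be sketched in a few lines (a common minimal formulation with 64-bit fingerprints and whitespace tokens; the paper's exact feature extraction is not given here):

```python
# Minimal Simhash sketch: each token votes per bit position, and the signs of
# the accumulated votes form the fingerprint; similarity is then the Hamming
# distance between fingerprints.
import hashlib

def simhash(text, bits=64):
    v = [0] * bits
    for token in text.lower().split():
        h = int.from_bytes(hashlib.md5(token.encode()).digest()[:8], "big")
        for i in range(bits):
            v[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if v[i] > 0)

def hamming(a, b):
    return bin(a ^ b).count("1")

q1 = "the time complexity of quicksort in the worst case"
q2 = "in the worst case the time complexity of quicksort"  # same words, reordered
q3 = "chemical structure of the benzene molecule"
assert hamming(simhash(q1), simhash(q2)) == 0  # identical token multiset
assert hamming(simhash(q1), simhash(q3)) > 0   # unrelated text
```

Because the fingerprint depends only on the token multiset, reordered near-duplicates collide exactly, which is what makes the scheme suitable for question-bank duplicate checking.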
A Scalable Blockchain Storage Scheme Based on Chameleon Hash (cited by 1)
4
Authors: 胡宁玉, 郝耀军, 常建龙, 冯丽萍. 《计算机应用研究》 CSCD (Peking University Core), 2023, No. 12, pp. 3539-3544, 3550 (7 pages)
Nodes in a blockchain store data as replicas. As time passes, the number of blocks keeps growing, increasing the storage pressure on each node; this storage pressure has become one of the bottlenecks for deploying blockchain applications. To address it, a scalable blockchain storage scheme based on chameleon hash is proposed. Using the probability of a node being successfully attacked together with an improved temperature model, the scheme divides blocks into hot and cold blocks of high and low security; based on the chameleon hash algorithm and an improved Merkle tree, high-security cold blocks are stored on only a subset of nodes. During storage, only the block body information of high-security cold blocks is reconstructed; all other data remain unchanged. Simulation experiments show that, without changing the blockchain structure or its security properties, the proposed scheme reduces the total amount of data stored in the blockchain and the storage pressure on storage nodes; the more blocks there are, the more pronounced the advantage.
Keywords: blockchain, chameleon hash, storage scalability
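The key primitive here is the chameleon hash: a trapdoor holder can find a second input with the same hash, which is what lets a block body be rewritten without breaking the hash chain. A toy sketch over a small prime-order group (parameters are illustrative and far too small to be secure; the paper's concrete scheme is not reproduced here):

```python
# Toy chameleon hash: CH(m, r) = g^m * h^r mod p, with h = g^x and x the
# trapdoor. Knowing x, one can compute a collision r' so that
# CH(m', r') == CH(m, r) for any new message m'.
q = 1019                 # prime order of the subgroup
p = 2 * q + 1            # 2039, also prime (safe-prime setup)
g = 4                    # generator of the order-q subgroup
x = 123                  # trapdoor (secret key), 1 <= x < q
h = pow(g, x, p)         # public key

def ch(m, r):
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def collide(m, r, m_new):
    """With the trapdoor x, find r' with CH(m_new, r') == CH(m, r)."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 777, 42
m_new = 555                     # the rewritten "block body"
r_new = collide(m, r, m_new)
assert ch(m, r) == ch(m_new, r_new)   # same hash, different message
```

Without the trapdoor x, finding such a collision is as hard as the discrete logarithm problem in the group, so ordinary nodes cannot forge block rewrites.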
An Association Rule Algorithm for Satellite Payloads Based on RS_Hash Frequent Itemsets
5
Authors: 贾澎涛, 温滋. 《国外电子测量技术》 (Peking University Core), 2023, No. 2, pp. 9-15 (7 pages)
Telemetry data are an important basis for assessing satellite health, and correlation analysis of telemetry payload data can, to a certain extent, reflect how well the satellite as a whole is operating. To address the low efficiency and excessive memory consumption of traditional association rule algorithms, an association rule algorithm for satellite payloads based on RS_Hash frequent itemsets is proposed. First, dynamic random sampling is applied to the transaction database to obtain sample data, with a sampling error measure and a sampling stopping rule designed to determine the optimal sample size. Second, the extracted samples use hash buckets to store frequent itemsets, reducing memory usage and improving runtime efficiency. Finally, experiments on three public datasets similar to payload data and on a satellite payload dataset show good results on the public datasets and especially clear gains on the large-scale satellite payload dataset: across different transaction lengths and support thresholds, RS_Hash improves average time efficiency by 75.81%, 49.10%, 59.38%, 50.22%, 40.16%, and 39.22% over the Apriori, PCY, SON, FP-Growth, RCM_Apriori, and Hash_Cumulate algorithms, respectively.
Keywords: satellite payload analysis, association rules, frequent itemsets, dynamic random sampling algorithm, hash bucket
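The hash-bucket idea used above (and in the PCY baseline it is compared against) can be sketched briefly: candidate pairs are hashed into a fixed number of buckets so memory stays bounded, and only pairs falling in frequent buckets are counted exactly. This is an illustrative PCY-style sketch, not the paper's RS_Hash algorithm:

```python
# Hash-bucket filtering for frequent pair mining. Bucket counts over-estimate
# pair counts (collisions only add), so the filter never discards a truly
# frequent pair; the second pass then counts the surviving pairs exactly.
from itertools import combinations

def frequent_pairs(transactions, min_support, n_buckets=101):
    buckets = [0] * n_buckets
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            buckets[hash(pair) % n_buckets] += 1
    counts = {}
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            if buckets[hash(pair) % n_buckets] >= min_support:
                counts[pair] = counts.get(pair, 0) + 1
    return {p: c for p, c in counts.items() if c >= min_support}

txns = [["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"], ["a", "b", "c"]]
print(frequent_pairs(txns, min_support=3))  # all three pairs, each with count 3
```

The memory saving comes from the first pass: a bucket array of fixed size replaces a counter per distinct pair.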
Design and Implementation of TOE Hash Collision Handling (cited by 1)
6
Authors: 许旭晗, 张俊杰, 陈彦昊, 裴华明. 《工业控制计算机》, 2023, No. 3, pp. 79-81 (3 pages)
As Ethernet speeds keep increasing, the CPU bears a heavy burden processing the TCP/IP stack, so offloading TCP streams to a dedicated TCP offload engine (TOE) has become especially important. A TOE must first identify connections, and hash functions, as mapping functions, are commonly used for TCP connection identification; using them, however, inevitably introduces hash collisions. The peacock hashing collision resolution design implemented here saves 4.1% and 51.57% of space compared with traditional one-redundancy and three-redundancy structures, respectively. Using the same hash at a fill rate of 106%, its collision resolution rate is 21% higher than that of the one-redundancy structure and 1% lower than that of the three-redundancy structure. Considering space usage and collision resolution rate together, the scheme has clear advantages.
Keywords: TCP offload engine, hash collision handling, peacock hashing
Association Analysis of Hidden Hazards on Offshore Drilling Platforms Based on Hash and Inverted Itemsets
7
Authors: 易军, 陈凯, 蔡昆, 车承志, 周伟, 刘洪. 《安全与环境学报》 CAS CSCD (Peking University Core), 2023, No. 4, pp. 981-988 (8 pages)
To fully mine the associations between hazard attributes and their causes in offshore platform hazard cases, and to improve the accuracy and timeliness of platform hazard risk prediction, an association rule mining approach based on hash techniques and inverted itemsets is proposed. First, each hazard attribute is extracted from 292 field records of platform structures. Next, under multiple support thresholds, multidimensional and multilevel association rules are mined over attributes such as hazard equipment, hazard location, and hazard phenomenon. Information gain is then proposed as a measure of rule validity. Finally, the mining results are analyzed to summarize the hazard characteristics of offshore drilling platforms. The results show that in spring, oil-water separators often lack maintenance and casing pipes are often left unsealed; general corrosion and rust are caused by improper operation and maintenance, peeling protective coatings, and protective devices that are badly rusted, de-welded, or non-compliant. Raising the management level of offshore platforms can reduce the probability of field hazards.
Keywords: safety engineering, offshore drilling, multidimensional association rules, Apriori algorithm, hash technique
Design of a Parallel Hash Function Based on Coupled Dynamic Integer Tent Maps
8
Authors: 刘玉杰, 刘建东, 刘博, 钟鸣, 李博. 《计算机应用与软件》 (Peking University Core), 2023, No. 2, pp. 296-301 (6 pages)
A parallel hash function based on coupled dynamic integer tent maps is proposed. To handle large data volumes and slow processing speeds, it adopts the MD6 algorithm framework and uses multi-core processors to process data in parallel. In the compression function, a bidirectional coupled map lattice model provides diffusion, and dynamic integer tent maps replace the traditional logic functions. Experimental results show that the algorithm can produce hash values of different lengths as needed, completes hashing in a short time, and exhibits good confusion and diffusion properties, meeting the performance requirements of a hash function.
Keywords: hash function, dynamic integer tent map, parallelism, coupled map lattice
Research on a Klotski Algorithm Based on Hash Tables
9
Authors: 周扬, 陈伊琳, 韦妮君, 周一诺. 《计算机应用文摘》, 2023, No. 1, pp. 102-104, 109 (4 pages)
Current Klotski (Huarong Dao) algorithms are improvements on breadth-first or depth-first search strategies, with time complexity O(V+E). To improve efficiency, this paper uses a permutation-and-combination algorithm to enumerate all openings, then finds the optimal solution of each opening with breadth-first search and saves all openings with their optimal solutions in a file. At query time, the optimal solution can then be found in O(1) time at best. Theoretical analysis and experimental results both show that the hash-table-based Klotski solver clearly improves algorithm efficiency.
Keywords: Klotski, time complexity, hash table
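The precompute-then-lookup pattern described above can be shown on a toy analogue: a 2x2 sliding puzzle (3 tiles plus a blank) instead of full Klotski. BFS from the goal fills a hash table mapping every reachable board to its optimal move count, so each later query is an O(1) dictionary lookup:

```python
# Toy illustration (not the paper's Klotski encoding): exhaustive BFS from the
# goal precomputes optimal distances for all reachable boards; queries then
# hit the hash table in O(1).
from collections import deque

GOAL = (1, 2, 3, 0)                                   # 0 is the blank
MOVES = {0: (1, 2), 1: (0, 3), 2: (0, 3), 3: (1, 2)}  # grid adjacency in a 2x2 board

def precompute():
    dist = {GOAL: 0}
    queue = deque([GOAL])
    while queue:
        state = queue.popleft()
        blank = state.index(0)
        for nb in MOVES[blank]:
            s = list(state)
            s[blank], s[nb] = s[nb], s[blank]
            s = tuple(s)
            if s not in dist:
                dist[s] = dist[state] + 1
                queue.append(s)
    return dist

TABLE = precompute()          # built once; the paper saves this to a file
print(TABLE[(1, 2, 0, 3)])    # 1: one move from the goal
print(TABLE[(0, 2, 1, 3)])    # 2: two moves from the goal
print(len(TABLE))             # 12 reachable states in the 2x2 puzzle
```

The same idea scales to Klotski by replacing the 4-tuple board encoding with one for the 4x5 board; the table grows, but lookups stay O(1).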
Inverting the Focal Mechanism Solutions of Three Moderate Earthquakes in the United States Using the HASH Method
10
Authors: 仇瀚之, 张学涛. 《山东石油化工学院学报》, 2023, No. 1, pp. 19-23, 41 (6 pages)
Focal mechanisms are an important means of understanding source information and the stress field. After a detailed introduction to the principles of the HASH (Hardebeck and Shearer) method, the method is used to invert the focal mechanism solutions of three natural earthquakes of magnitude 4 or above in the United States. The inversion results are basically consistent with those of the United States Geological Survey (USGS), further confirming the reliability of the method for inversion calculations on moderate earthquakes.
Keywords: focal mechanism, P-wave first-motion polarity, HASH method, inversion, takeoff angle
Research on a Hash Retrieval Algorithm for Ethnic Fabric Patterns Based on Vision Transformer Hashing (cited by 1)
11
Authors: 韩雨萌, 宁涛, 段晓东, 高原. 《大连民族大学学报》 CAS, 2023, No. 3, pp. 250-254 (5 pages)
To address the complexity and diversity of ethnic fabric patterns and the difficulty of semantic extraction, image recognition, and retrieval, an image retrieval algorithm is proposed, taking batik and brocade patterns as examples, to improve the accuracy of matching and retrieving ethnic fabric patterns. Domain knowledge of ethnic fabric patterns is used to preprocess the images, and a Vision Transformer (ViT) serves as the backbone network within a hash retrieval framework. The method improves on deep hash retrieval: its self-attention mechanism strengthens the extraction of deep semantic features of the patterns, raising both the speed and the precision of deep-hash retrieval of ethnic fabric patterns. Experimental results show that the proposed method achieves a best retrieval precision of 95.32%.
Keywords: image retrieval, deep hash retrieval, ViT, ethnic fabric pattern retrieval
An Efficient Encrypted Speech Retrieval Based on Unsupervised Hashing and B+ Tree Dynamic Index
12
Authors: Qiu-yu Zhang, Yu-gui Jia, Fang-Peng Li, Le-Tian Fan. 《Computers, Materials & Continua》 SCIE EI, 2023, No. 7, pp. 107-128 (22 pages)
Existing speech retrieval systems are frequently confronted with expanding volumes of speech data. A dynamic updating strategy applied to the index can add or remove speech data in a timely manner to meet users' real-time retrieval requirements. This study proposes an efficient method for retrieving encrypted speech, using unsupervised deep hashing and a B+ tree dynamic index, which avoids privacy leakage of speech data and enhances the accuracy and efficiency of retrieval. The cloud's encrypted speech library is constructed with the multi-threaded Dijk-Gentry-Halevi-Vaikuntanathan (DGHV) Fully Homomorphic Encryption (FHE) technique, which encrypts the original speech. In addition, a Residual Neural Network18-Gated Recurrent Unit (ResNet18-GRU) is employed to learn compact binary hash codes, store them in the designed B+ tree index table, and create a one-to-one mapping between the binary hash codes and the corresponding encrypted speech. External B+ tree index technology achieves dynamic updating of the B+ tree index table, satisfying users' needs for real-time retrieval. Experimental results on THCHS-30 and TIMIT show that the retrieval accuracy of the proposed method is more than 95.84%, higher than existing unsupervised hashing methods; retrieval efficiency is greatly improved compared with methods using hash index tables; and the security of the speech data is effectively guaranteed.
Keywords: encrypted speech retrieval, unsupervised deep hashing, learning to hash, B+ tree dynamic index, DGHV fully homomorphic encryption
TECMH:Transformer-Based Cross-Modal Hashing For Fine-Grained Image-Text Retrieval
13
Authors: Qiqi Li, Longfei Ma, Zheng Jiang, Mingyong Li, Bo Jin. 《Computers, Materials & Continua》 SCIE EI, 2023, No. 5, pp. 3713-3728 (16 pages)
In recent years, cross-modal hash retrieval has become a popular research field because of its advantages of high efficiency and low storage. Cross-modal retrieval technology can be applied to search engines, cross-modal medical processing, etc. The existing mainstream method uses a multi-label matching paradigm to finish retrieval tasks. However, such methods do not use the fine-grained information in multi-modal data, which may lead to suboptimal results. To avoid cross-modal matching turning into label matching, this paper proposes an end-to-end fine-grained cross-modal hash retrieval method that focuses on the fine-grained semantic information of multi-modal data. First, the method refines the image features and no longer uses multiple labels to represent text features, instead processing them with BERT. Second, it uses the inference capabilities of the transformer encoder to generate global fine-grained features. Finally, to better judge the effect of the fine-grained model, this paper uses datasets from the image-text matching field instead of the traditional label-matching datasets. We experiment on the Microsoft COCO (MS-COCO) and Flickr30K datasets and compare against previous classical methods. The experimental results show that this method obtains more advanced results in the cross-modal hash retrieval field.
Keywords: deep learning, cross-modal retrieval, hash learning, transformer
ViT2CMH:Vision Transformer Cross-Modal Hashing for Fine-Grained Vision-Text Retrieval
14
Authors: Mingyong Li, Qiqi Li, Zheng Jiang, Yan Ma. 《Computer Systems Science & Engineering》 SCIE EI, 2023, No. 8, pp. 1401-1414 (14 pages)
In recent years, the development of deep learning has further improved hash retrieval technology. Most existing hashing methods use Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) to process image and text information, respectively. This subjects images or texts to local constraints, and inherent label matching cannot capture fine-grained information, often leading to suboptimal results. Driven by the development of the transformer model, we propose a framework called ViT2CMH, based mainly on the Vision Transformer rather than CNNs or RNNs, to handle deep cross-modal hashing tasks. Specifically, we use a BERT network to extract text features and the Vision Transformer as the image network of the model. Finally, the features are transformed into hash codes for efficient and fast retrieval. We conduct extensive experiments on Microsoft COCO (MS-COCO) and Flickr30K, comparing with baselines of several hashing methods and image-text matching methods, showing that our method has better performance.
Keywords: hash learning, cross-modal retrieval, fine-grained matching, transformer
Cross-Domain Authentication Scheme Based on Blockchain and Consistent Hash Algorithm for System-Wide Information Management
15
Authors: Lizhe Zhang, Yongqiang Huang, Jia Nie, Kenian Wang. 《Computers, Materials & Continua》 SCIE EI, 2023, No. 11, pp. 1467-1488 (22 pages)
System-wide information management (SWIM) is a complex distributed information transfer and sharing system for the next generation of Air Transportation System (ATS). In response to the growing volume of civil aviation air operations, users accessing different authentication domains in the SWIM system face problems with the validity, security, and privacy of SWIM-shared data. To solve these problems, this paper proposes a SWIM cross-domain authentication scheme based on a consistent hashing algorithm on a consortium blockchain and designs a blockchain certificate format for SWIM cross-domain authentication. The scheme uses a consistent hash algorithm with virtual nodes, combined with a cluster of authentication centers in the SWIM consortium blockchain architecture, to synchronize users' authentication mapping relationships between authentication domains. The virtual authentication nodes are mapped separately using the different services provided by SWIM to guarantee the partitioning of the consistent hash ring on the consortium blockchain. As users' authentication requests change dynamically, virtual service authentication nodes can be added and deleted to achieve dynamic load balancing of cross-domain authentication for different services. Security analysis shows that this protocol can resist network attacks such as man-in-the-middle attacks, replay attacks, and Sybil attacks. Experiments show that this scheme reduces redundant authentication operations on identity information and solves the problems of traditional cross-domain authentication: single point of failure, difficulty of expansion, and uneven load. At the same time, it stores information more securely and can meet the cross-domain authentication requirements of SWIM users with low communication costs and system overhead.
Keywords: system-wide information management (SWIM), consortium blockchain, consistent hash, cross-domain authentication, load balancing
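The consistent-hash-with-virtual-nodes mechanism at the heart of the scheme can be sketched as follows (node and key names are illustrative; the paper's certificate handling is omitted):

```python
# Minimal consistent-hash ring with virtual nodes. Each physical
# authentication node is placed at several points on the ring, which evens
# out the load and limits remapping when nodes join or leave.
import bisect
import hashlib

class ConsistentHashRing:
    def __init__(self, nodes=(), vnodes=50):
        self.vnodes = vnodes
        self.ring = []            # sorted list of (point, node)
        for n in nodes:
            self.add(n)

    def _point(self, key):
        return int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")

    def add(self, node):
        for i in range(self.vnodes):
            bisect.insort(self.ring, (self._point(f"{node}#{i}"), node))

    def remove(self, node):
        self.ring = [(p, n) for p, n in self.ring if n != node]

    def lookup(self, key):
        # First ring point clockwise from the key's hash owns the key.
        points = [p for p, _ in self.ring]
        i = bisect.bisect(points, self._point(key)) % len(self.ring)
        return self.ring[i][1]

ring = ConsistentHashRing(["auth-A", "auth-B", "auth-C"])
owner = ring.lookup("user:42")
ring.remove(owner)                       # that authentication node leaves...
assert ring.lookup("user:42") != owner   # ...and only its keys are remapped
```

Keys mapped to surviving nodes keep their owners after a removal, which is exactly the property that avoids mass re-authentication when a node fails.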
Secure Content Based Image Retrieval Scheme Based on Deep Hashing and Searchable Encryption
16
Authors: Zhen Wang, Qiu-yu Zhang, Ling-tao Meng, Yi-lin Liu. 《Computers, Materials & Continua》 SCIE EI, 2023, No. 6, pp. 6161-6184 (24 pages)
To solve the problem that existing ciphertext-domain image retrieval systems struggle to balance security, retrieval efficiency, and retrieval accuracy, this research proposes a secure image retrieval technique based on searchable encryption and deep hashing that extracts more expressive image features and constructs a secure, searchable encryption scheme. First, a deep learning framework based on a residual network and a transfer learning model is designed to extract more representative deep image features. Second, central similarity is used to quantify the features and construct their deep hash sequence, and Paillier homomorphic encryption encrypts the deep hash sequence to build a high-security, low-complexity searchable index. Finally, based on the additive homomorphic property of Paillier encryption, a similarity measurement method suitable for computation in the encrypted domain ensures the security of the retrieval system. Experimental results on the Web Image Database from the National University of Singapore (NUS-WIDE), Microsoft Common Objects in Context (MS COCO), and ImageNet datasets demonstrate the system's robust security and precise retrieval; the proposed scheme achieves efficient image retrieval without revealing user privacy. Retrieval accuracy is improved by at least 37% compared with traditional hashing schemes, while retrieval time is reduced by at least 9.7% compared with the latest deep hashing schemes.
Keywords: content-based image retrieval, deep supervised hashing, central similarity quantification, searchable encryption, Paillier homomorphic encryption
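The additive homomorphic property that the scheme relies on can be demonstrated with a tiny Paillier instance (toy primes, far too small to be secure; this is the textbook cryptosystem, not the paper's full retrieval pipeline):

```python
# Tiny Paillier demo: multiplying two ciphertexts yields a ciphertext of the
# SUM of the plaintexts, so distances over hash sequences can be accumulated
# without decrypting individual values.
import math
import random

p, q = 2357, 2551
n = p * q
n2 = n * n
g = n + 1                       # standard choice of generator
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

c1, c2 = encrypt(123), encrypt(456)
assert decrypt(c1) == 123
assert decrypt((c1 * c2) % n2) == 579   # E(a) * E(b) decrypts to a + b
```

In the retrieval setting this lets the server combine encrypted hash-sequence components into an encrypted similarity score that only the key holder can open.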
Hash-Indexing Block-Based Deduplication Algorithm for Reducing Storage in the Cloud
17
Authors: D. Viji, S. Revathy. 《Computer Systems Science & Engineering》 SCIE EI, 2023, No. 7, pp. 27-42 (16 pages)
Cloud storage is essential for managing user data stored in and retrieved from distributed data centres, with the storage service billed according to the volume of data kept. Because the massive amount of data in a data centre contains similar information and file structures held in multiple copies, duplication increases storage space. Existing deduplication systems do not reduce data efficiently because of inaccuracy in finding similar data, creating complexity that drives up storage consumption and cost. To resolve this problem, this paper proposes an efficient storage reduction method called Hash-Indexing Block-based Deduplication (HIBD), based on Segmented Bind Linkage (SBL) methods, for reducing storage in a cloud environment. Initially, preprocessing is done using a sparse augmentation technique. The preprocessed files are then segmented into blocks to build the hash index. Block contents are compared with other files through Semantic Content Source Deduplication (SCSD), which identifies similar content shared between files. Based on the content presence count, Distance Vector Weightage Correlation (DVWC) estimates the document similarity weight, and related files are grouped into a cluster. Finally, segmented bind linkage compares documents to find duplicate content in the cluster using similarity weights based on coefficient match cases. This implementation identifies data redundancy efficiently and reduces service cost in distributed cloud storage.
Keywords: cloud computing, deduplication, hash indexing, relational content analysis, document clustering, cloud storage, record linkage
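The block-level hash-indexing step can be sketched simply (an illustrative fixed-size-block scheme, not the paper's full HIBD pipeline with SCSD and clustering): each block is indexed by its hash, and a block already present in the index is stored only once.

```python
# Simple block-level deduplication: files are split into fixed-size blocks,
# each block is keyed by its SHA-256 hash, and duplicate blocks across files
# are stored a single time. A file is kept as a "recipe" of block hashes.
import hashlib

BLOCK = 4                  # toy block size in bytes
store = {}                 # hash -> block bytes (the deduplicated store)

def put(data):
    """Store data, returning the list of block hashes (the file recipe)."""
    recipe = []
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)     # store each unique block once
        recipe.append(h)
    return recipe

def get(recipe):
    return b"".join(store[h] for h in recipe)

r1 = put(b"AAAABBBBCCCC")
r2 = put(b"AAAABBBBDDDD")       # shares two blocks with the first file
assert get(r1) == b"AAAABBBBCCCC" and get(r2) == b"AAAABBBBDDDD"
assert len(store) == 4          # 6 blocks written, only 4 unique stored
```

Real systems add content-defined chunking and similarity detection (as HIBD does) so that small edits do not shift every subsequent block boundary.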
Fusion of Hash-Based Hard and Soft Biometrics for Enhancing Face Image Database Search and Retrieval
18
Authors: Ameerah Abdullah Alshahrani, Emad Sami Jaha, Nahed Alowidi. 《Computers, Materials & Continua》 SCIE EI, 2023, No. 12, pp. 3489-3509 (21 pages)
The utilization of digital picture search and retrieval has grown substantially in numerous fields for different purposes during the last decade, owing to continuing advances in image processing and computer vision approaches. In multiple real-life applications, for example social media, content-based face picture retrieval is a well-invested technique for large-scale databases, where there is a significant need for reliable retrieval capabilities enabling quick search across a vast number of pictures. Humans widely employ faces for recognizing and identifying people. Thus, face recognition through formal or personal pictures is increasingly used in various real-life applications, such as helping crime investigators retrieve matching images from face image databases to identify victims and criminals. However, such face image retrieval becomes more challenging in large-scale databases, where traditional vision-based face analysis requires ample additional storage space beyond the raw face images to store extracted lengthy feature vectors, and takes much longer to process and match thousands of face images. This work mainly contributes to enhancing face image retrieval performance in large-scale databases using hash codes inferred by locality-sensitive hashing (LSH) for facial hard and soft biometrics, as (Hard BioHash) and (Soft BioHash) respectively, to be used as search input for retrieving the top-k matching faces. Moreover, we propose multi-biometric score-level fusion of both face hard and soft BioHashes (Hard-Soft BioHash Fusion) for further augmented face image retrieval. The experimental outcomes on the Labeled Faces in the Wild (LFW) dataset and the related attributes dataset (LFW-attributes) demonstrate that the suggested fusion approach (Hard-Soft BioHash Fusion) significantly improved retrieval performance compared to using Hard BioHash or Soft BioHash in isolation: it provides an augmented accuracy of 87% when executed on 1000 specimens and 77% on 5743 samples. These results remarkably outperform the Hard BioHash method (by 50% on the 1000 samples and 30% on the 5743 samples) and the Soft BioHash method (by 78% on the 1000 samples and 63% on the 5743 samples).
Keywords: face image retrieval, soft biometrics, similar pictures, hashing, database search, large databases, score-level fusion, multimodal fusion
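A common way to realize the locality-sensitive hashing the paper builds on is random-hyperplane LSH: similar feature vectors receive binary codes with a small Hamming distance. A minimal sketch (dimensions, bit count, and the feature vectors are illustrative, not the paper's BioHash construction):

```python
# Random-hyperplane LSH: each bit records which side of a random hyperplane
# the feature vector falls on, so nearby vectors agree on most bits.
import random

random.seed(7)
DIM, BITS = 8, 16
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def lsh_code(vec):
    return tuple(int(sum(p * x for p, x in zip(plane, vec)) >= 0)
                 for plane in planes)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

face_a  = [0.9, 0.1, 0.4, 0.7, 0.2, 0.8, 0.5, 0.3]   # hypothetical face features
face_a2 = [0.85, 0.15, 0.42, 0.68, 0.2, 0.79, 0.5, 0.31]  # near-duplicate of face_a
face_b  = [-x for x in face_a]                        # maximally different vector
assert hamming(lsh_code(face_a), lsh_code(face_a2)) < hamming(lsh_code(face_a), lsh_code(face_b))
```

Top-k retrieval then reduces to ranking stored codes by Hamming distance to the query code, which is far cheaper than comparing the lengthy raw feature vectors the abstract mentions.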
Numeric Identifier Transmission Algorithm Using Hash Function
19
Author: Vladyslav Kutsman. 《Open Journal of Applied Sciences》, 2023, No. 9, pp. 1581-1587 (7 pages)
When developing programs or websites, it is very convenient to use relational databases, which provide powerful and convenient tools for working with data flexibly and obtaining the necessary information in a matter of milliseconds. A relational database consists of tables and records in these tables; each table must have a primary key, which can in particular be a number of BIGINT type, a unique index of a record in the table that allows fetch operations with maximum speed and O(1) complexity. After a row is written to a database table, the program receives the row identifier ID as a number, and in the future this ID can be used to obtain that record. In the case of a website, this could be the GET method of the HTTP protocol with the record ID in the request. But very often, transmitting an identifier in clear form is not safe, both for business reasons and for reasons of securing access to information. In that case it is necessary to create additional functionality for checking access rights and to devise a way to encode data so that the record identifier cannot be determined, which in turn makes the program code much more complicated and increases the amount of data necessary for the program to operate. This article presents an algorithm that solves these problems "on the fly" without complicating the application logic and without requiring resources to store additional information. The algorithm is also very reliable, since it is based on hash functions and was synthesized from many years of work on complex systems requiring a heightened level of data security and program performance.
Keywords: cryptography, security, coding, hash functions, algorithms, fintech, banking, Golang, PostgreSQL
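The article's exact algorithm is not reproduced in the abstract; one related, well-known way to protect a transmitted numeric ID with a hash function is to attach a keyed tag (HMAC) so the server can verify it statelessly and clients cannot forge or enumerate identifiers. A sketch under those assumptions (key and helper names are hypothetical):

```python
# Illustrative keyed-hash protection for a row ID: the ID travels with an
# HMAC tag, so tampering is detected without any extra server-side storage.
# Note: this authenticates the ID but does not hide its value.
import hashlib
import hmac

SECRET = b"server-side-secret"   # hypothetical key, never sent to the client

def encode_id(record_id: int) -> str:
    payload = str(record_id)
    tag = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()[:16]
    return f"{payload}.{tag}"

def decode_id(token: str) -> int:
    payload, tag = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()[:16]
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered identifier")
    return int(payload)

token = encode_id(12345)
assert decode_id(token) == 12345
try:
    decode_id("99999." + token.split(".")[1])   # reuse the tag on another ID
except ValueError:
    print("forged token rejected")
```

Verification is a pure recomputation, which matches the article's goal of adding no stored state; fully hiding the numeric value would additionally require an encryption step.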
A Wireless Network Security Situation Assessment Method Based on the Hash Algorithm
20
Author: 刘亚鹏. 《计算机应用文摘》, 2023, No. 5, pp. 82-84 (3 pages)
Because traditional wireless network security state assessment techniques cannot evaluate the network security state correctly, a wireless network security situation assessment method based on the Hash algorithm is studied. Wireless network transmission and virtualization technology are used to build a model for judging the wireless network security state; by optimizing the message queue problem and changing the highest degree in the finite field, a multivariate Hash compression function is constructed; and hierarchical normalization with wireless network deep learning realizes the assessment of the wireless network security situation. Test results show that, after adjustment by the Hash algorithm, the assessment method achieves an average precision of 91.55% with a response time of only 20.44 s, enabling effective evaluation of the wireless network security state.
Keywords: Hash algorithm, wireless network, security situation, assessment method