Journal Articles
6 articles found.
1. Design of area and power efficient Radix-4 DIT FFT butterfly unit using floating point fused arithmetic (Cited by 2)
Authors: Prabhu E, Mangalam H, Karthick S. Journal of Central South University (SCIE, EI, CAS, CSCD), 2016, Issue 7, pp. 1669-1681.
In this work, a power-efficient butterfly-unit-based FFT architecture is presented. The butterfly unit is designed using floating-point fused arithmetic units: a two-term dot-product unit and an add-subtract unit, both operating on complex data values. A modified fused floating-point two-term dot product and an enhanced model for the Radix-4 FFT butterfly unit are proposed. The modified fused two-term dot product is designed using a Radix-16 Booth multiplier, which reduces switching activity compared to the Radix-8 Booth multiplier of the existing system and also reduces the required area. The proposed architecture is implemented efficiently for the Radix-4 decimation-in-time (DIT) FFT butterfly with the two floating-point fused arithmetic units. The enhanced architecture is synthesized, implemented, placed, and routed on an FPGA device using the Xilinx ISE tool. The Radix-4 DIT fused floating-point FFT butterfly requires 50.17% less area and 12.16% less power than the existing methods, and the enhanced model requires 49.82% less area on the FPGA device than the proposed design. Power consumption is further reduced by a reusability technique, which yields an 11.42% power reduction for the enhanced model compared to the proposed design.
Keywords: floating-point arithmetic; floating-point fused dot product; Radix-16 Booth multiplier; Radix-4 FFT butterfly; fast Fourier transform; decimation in time
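As a rough illustration of the two fused primitives the abstract names, here is a minimal behavioral sketch (Python, illustrative names only, not the paper's RTL) of a radix-4 DIT butterfly built from a two-term dot-product unit and an add-subtract unit:

```python
def dot2(a, b, c, d):
    """Two-term dot product a*b + c*d -- the operation a fused FP
    dot-product unit computes with a single rounding step."""
    return a * b + c * d

def cmul(x: complex, w: complex) -> complex:
    """Complex (twiddle) multiply expressed as two two-term dot products."""
    return complex(dot2(x.real, w.real, -x.imag, w.imag),
                   dot2(x.real, w.imag,  x.imag, w.real))

def add_sub(x: complex, y: complex):
    """Fused add-subtract unit: sum and difference from one shared block."""
    return x + y, x - y

def radix4_dit_butterfly(x0, x1, x2, x3, w1, w2, w3):
    """One radix-4 decimation-in-time butterfly; twiddles applied at inputs."""
    t1, t2, t3 = cmul(x1, w1), cmul(x2, w2), cmul(x3, w3)
    s02, d02 = add_sub(x0, t2)   # x0 +/- W^2k * x2
    s13, d13 = add_sub(t1, t3)   # W^k * x1 +/- W^3k * x3
    # The +/-j factors are wiring (real/imag swaps) in hardware, not multipliers.
    return (s02 + s13, d02 - 1j * d13, s02 - s13, d02 + 1j * d13)
```

With unit twiddles (w1 = w2 = w3 = 1) this reduces to a 4-point DFT, which gives a quick sanity check against numpy.fft.fft on four samples.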
2. ON AN EXTENSION OF ABEL-GONTSCHAROFF'S EXPANSION FORMULA (Cited by 1)
Authors: Tianxiao He, Leetsch C. Hsu, Peter J. S. Shiue. Analysis in Theory and Applications, 2005, Issue 4, pp. 359-369.
We present a constructive generalization of Abel-Gontscharoff's series expansion to higher dimensions. A constructive application to a problem of multivariate interpolation is also investigated. In addition, two algorithms for constructing the basis functions of the interpolants are given.
Keywords: Abel-Gontscharoff expansion formula; Abel-Gontscharoff-Gould polynomial; multivariate Abel-Gontscharoff interpolation; higher-dimensional dot product; annihilation coefficients
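For orientation, the classical one-variable expansion that the paper generalizes can be stated as follows (standard textbook form, not quoted from the paper):

```latex
% Gontscharoff polynomials for a node sequence a_0, a_1, a_2, ...:
\[
  G_0(x) = 1, \qquad
  G_n(x; a_0, \dots, a_{n-1})
    = \int_{a_0}^{x}\!\int_{a_1}^{t_1}\!\cdots\int_{a_{n-1}}^{t_{n-1}} dt_n \cdots dt_1 .
\]
% Abel--Gontscharoff expansion of a sufficiently smooth f, with the n-th
% derivative sampled at the n-th node:
\[
  f(x) \sim \sum_{n=0}^{\infty} G_n(x; a_0, \dots, a_{n-1})\, f^{(n)}(a_n).
\]
```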
3. THE EQUATION OF MOTION FOR THE SYSTEM OF THE VARIABLE MASS IN THE NON-LINEAR NON-HOLONOMIC SPACE
Author: Qiu Rong (邱荣). Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 1996, Issue 4, pp. 379-386.
The dot product of basis vectors on the super-surface of constraints of the non-linear non-holonomic space, together with the Meshchersky equations, may act as the fundamental dynamical equations of a mechanical system of variable mass. These are very simple and convenient for computation. From these known equations, the equations of Chaplygin, Nielsen, Appell, Mac-Millan et al. are derived; it is unnecessary to introduce the definition of Appell-Chetaev or Niu Qinping for the virtual displacement. These are compatible with the d'Alembert-Lagrange principle.
Keywords: non-linear non-holonomic constraints; variable-mass system; dot product of basis vectors on the super-surface of constraints; Meshchersky equation
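For context, the Meshchersky equation the abstract refers to is, in its classical point-mass form (standard statement, not taken from the paper):

```latex
% Variable-mass point: F is the external force, v the body's velocity,
% u the velocity of the gained or expelled mass; the last term is the
% reactive (thrust) force.
\[
  m(t)\,\frac{d\mathbf{v}}{dt}
    = \mathbf{F} + (\mathbf{u} - \mathbf{v})\,\frac{dm}{dt} .
\]
```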
4. Deep Neural Network Based Spam Email Classification Using Attention Mechanisms
Authors: Md. Tofael Ahmed, Mariam Akter, Md. Saifur Rahman, Maqsudur Rahman, Pintu Chandra Paul, Miss. Nargis Parvin, Almas Hossain Antar. Journal of Intelligent Learning Systems and Applications, 2023, Issue 4, pp. 144-164.
Spam emails pose a threat to individuals. The daily proliferation of spam emails has rendered traditional machine learning and deep learning methods for screening them ineffective and inefficient. In our research, we employ deep neural networks such as RNN, LSTM, and GRU, incorporating attention mechanisms such as Bahdanau attention, scaled dot product (SDP) attention, and Luong SDP self-attention for spam email filtering. We evaluate our approach on several publicly available datasets, including the Trec spam, Enron spam, SMS spam collection, and Ling spam datasets, as well as a substantial custom dataset. For the Enron dataset, we attain an accuracy of 99.97% using LSTM with SDP self-attention. Our custom dataset exhibits its highest accuracy of 99.01% when employing GRU with SDP self-attention. The SMS spam collection dataset yields a peak accuracy of 99.61% with LSTM and SDP attention. Using GRU (gated recurrent unit) alongside Luong and SDP attention mechanisms, a peak accuracy of 99.89% is reached on the Ling spam dataset. For the Trec spam dataset, the most accurate results are achieved with Luong-attention LSTM, at an accuracy rate of 99.01%. Our performance analyses consistently indicate that employing the scaled dot product attention mechanism in conjunction with gated recurrent neural networks (GRU) delivers the most effective results. In summary, our research underscores the efficacy of advanced deep learning techniques and attention mechanisms for spam email filtering, with remarkable accuracy across multiple datasets. This approach presents a promising solution to the ever-growing problem of spam emails.
Keywords: spam email; attention mechanism; deep neural network; Bahdanau attention; scaled dot product
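As a concrete reference for the SDP mechanism compared throughout the abstract, here is a minimal NumPy sketch of scaled dot-product (self-)attention (the standard formulation; the paper's exact layer shapes and masking are not given in the abstract):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.
    Q: (n_q, d), K: (n_k, d), V: (n_k, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # scaled pairwise dot products
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of values

# Self-attention over a token sequence: queries, keys, and values all come
# from the same (here random, illustrative) token embeddings.
tokens = np.random.randn(12, 64)    # 12 tokens, 64-dim embeddings
context = scaled_dot_product_attention(tokens, tokens, tokens)
```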
5. On Conditionally Positive Definite Dot Product Kernels
Authors: V. A. Menegatto, C. P. Oliveira, Ana P. Peron. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2008, Issue 7, pp. 1127-1138.
Let m and n be fixed positive integers and P a space composed of real polynomials in m variables. The authors study functions f : R → R which map Gram matrices, based upon n points of R^m, into matrices that are nonnegative definite with respect to P. Among other things, the authors discuss continuity, differentiability, convexity, and convexity in the sense of Jensen, of such functions.
Keywords: conditionally positive definite kernels; dot product kernels; Gram matrices; convexity; convexity in the sense of Jensen
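In the usual terminology behind this abstract (standard definitions; the paper's precise setting may differ in detail), a dot product kernel applies f entrywise to inner products, and nonnegative definiteness relative to the polynomial space P constrains the admissible coefficient vectors:

```latex
% Dot product kernel on R^m:
\[
  K(x, y) = f(x \cdot y), \qquad x, y \in \mathbb{R}^m,
\]
% so f maps the Gram matrix (x_i \cdot x_j)_{i,j=1}^{n} of n points into
% the matrix (f(x_i \cdot x_j))_{i,j=1}^{n}. Nonnegative definiteness with
% respect to P requires
\[
  \sum_{i=1}^{n}\sum_{j=1}^{n} c_i c_j\, f(x_i \cdot x_j) \;\ge\; 0
  \quad\text{whenever}\quad
  \sum_{i=1}^{n} c_i\, p(x_i) = 0 \ \text{for all } p \in P .
\]
```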
6. Nonorientable Genera of Petersen Powers
Authors: Wen Zhong Liu, Ting Ru Shen, Yi Chao Chen. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2015, Issue 4, pp. 557-564.
In the paper, we prove that for every integer n ≥ 1 there exists a Petersen power Pn with nonorientable genus and Euler genus precisely n, which improves the upper bound in Mohar and Vodopivec's result [J. Graph Theory, 67, 1-8 (2011)] that for every integer k (2 ≤ k ≤ n − 1) a Petersen power Pn exists with nonorientable genus and Euler genus precisely k.
Keywords: dot product; Petersen power; genus
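For readers comparing the two genus parameters in the statement, the standard relations (textbook definitions, not from the paper) are:

```latex
% A graph embedded in a surface of Euler genus g with V vertices, E edges,
% and F faces satisfies Euler's formula
\[
  V - E + F = 2 - g ,
\]
% where g = 2h for the orientable surface with h handles and g = k for the
% nonorientable surface with k crosscaps. The Euler genus of a graph G is
% the minimum over all embeddings:
\[
  \tilde{\gamma}(G) = \min\{\, 2\gamma(G),\ \bar{\gamma}(G) \,\},
\]
% with \gamma(G) the orientable genus and \bar{\gamma}(G) the nonorientable genus.
```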