Abstract: The requirements of data coding in multimedia applications are presented, current coding techniques and the related standards are introduced, and then our ongoing work is described, namely a wavelet-based coding method and a VE (Visual Entropy)-based coding method. Experimental results show that these methods achieve better perceptual quality of the reconstructed image at a lower bit rate, and that their performance exceeds that of JPEG (Joint Photographic Experts Group) coding. Finally, future research topics are put forward.
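The energy-compaction idea behind wavelet-based coding can be sketched with a one-level Haar transform: the signal splits into low-pass averages and high-pass details, and because most energy lands in the averages, small detail coefficients can be zeroed with little perceptual loss. This is a minimal illustration, not the paper's actual codec; all function names here are illustrative.

```python
# One-level orthonormal Haar transform: averages carry most of the
# energy, details are mostly small and cheap to quantize away.

def haar_forward(x):
    """Split an even-length sequence into averages and details."""
    s = 2 ** 0.5
    avg = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    det = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return avg, det

def haar_inverse(avg, det):
    """Exactly invert haar_forward."""
    s = 2 ** 0.5
    x = []
    for a, d in zip(avg, det):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

signal = [10.0, 12.0, 11.0, 9.0, 50.0, 52.0, 49.0, 51.0]
avg, det = haar_forward(signal)
# Crude "coding" step: drop detail coefficients below a threshold.
det_q = [d if abs(d) > 2.0 else 0.0 for d in det]
approx = haar_inverse(avg, det_q)  # close to signal, fewer nonzero coefficients
```

Real wavelet coders iterate this split over several levels and both image axes before entropy-coding the quantized coefficients.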
Funding: Supported by the National Natural Science Foundation (40705025).
Abstract: The upper-air weather forecast data used in current operations and research and the digitized data from the recently completed upper-air meteorological monthly reports were comparatively analyzed in terms of data completeness and data quality, and the changes in sounding curves caused by differences in data completeness were also compared, so as to evaluate the advantages and disadvantages of the two types of data.
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61501064 and the Sichuan Provincial Science and Technology Project under Grant No. 2016GZ0122.
Abstract: Erasure codes over binary fields, which require only AND and XOR operations for encoding and decoding and therefore have high computational efficiency, are widely used in many areas of information technology. A matrix decoding method is proposed in this paper. The method is a universal data reconstruction scheme for erasure codes over binary fields. Besides a pre-judgment of whether the errors can be recovered, the method can rebuild lost data sectors on a fault-tolerant storage system constructed with erasure codes when disk errors occur. The reconstruction process consists of simple, clear steps, which makes it easy to implement in software. Moreover, it can easily be extended to non-binary fields, so the method is expected to find wide application.
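The XOR-only property the abstract relies on can be shown with the simplest binary erasure code, single parity: the parity block is the XOR of all data blocks, and any one lost block is recovered by XOR-ing the survivors. The paper's matrix method generalizes this to arbitrary codes over binary fields; the names below are illustrative only.

```python
# Single-parity erasure code: encoding and single-loss recovery are
# both pure XOR, the operation the abstract highlights as cheap.

def xor_blocks(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(blocks):
    """Parity block = XOR of all equal-length data blocks."""
    parity = bytes(len(blocks[0]))
    for blk in blocks:
        parity = xor_blocks(parity, blk)
    return parity

def rebuild(surviving, parity):
    """Recover the single missing data block from survivors + parity."""
    missing = parity
    for blk in surviving:
        missing = xor_blocks(missing, blk)
    return missing

data = [b"disk0", b"disk1", b"disk2"]
p = make_parity(data)
recovered = rebuild([data[0], data[2]], p)  # simulate losing data[1]
assert recovered == b"disk1"
```

Tolerating more than one simultaneous loss requires additional independent parity equations, which is where the matrix formulation of the paper comes in.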
Abstract: A field-programmable gate array (FPGA)-based high-speed broadband data acquisition system is designed. The system supports simultaneous dual-channel acquisition, with a maximum sampling rate of 500 MSa/s and a bandwidth of 200 MHz, which addresses the problems of acquiring and processing large-bandwidth, high-speed signals. At present, the data acquisition system is successfully used in broadband receiver test systems.
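As a quick sanity check (not from the paper), the two headline specifications are mutually consistent under the Nyquist criterion: a 500 MSa/s sampler can represent signals up to half its sample rate, comfortably covering the 200 MHz analog bandwidth.

```python
# Nyquist consistency check for the stated acquisition specs.
SAMPLE_RATE_HZ = 500e6   # 500 MSa/s
BANDWIDTH_HZ = 200e6     # 200 MHz analog bandwidth

nyquist_limit = SAMPLE_RATE_HZ / 2  # highest representable frequency
assert BANDWIDTH_HZ < nyquist_limit  # 200 MHz < 250 MHz
```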
Funding: Supported by the Ph.D. Program Research Foundation from the MOE of China (20060147004) and the Research Foundation of Liaoning Technical University (04A02001).
Abstract: The checking survey is one of the most frequent and important tasks in open-pit mining, linking open-pit mine planning with production. Traditional checking methods suffer from long time consumption, heavy workload, complicated calculation, and low automation. GPS and GIS technologies were used to systematically study the core issues of the checking survey in open-pit mines. A detailed GPS data acquisition coding scheme is presented, and based on this scheme an algorithm for semi-automatic computer cartography was developed. Three methods for eliminating gross errors from the raw data needed to create a DEM are discussed. Two algorithms were developed and implemented to create a fine open-pit mine DEM with constrained conditions and to update the model dynamically. The precision of the resulting model was analyzed and evaluated.
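The abstract mentions eliminating gross errors from raw survey data before DEM creation but does not name the three methods. A common robust choice, sketched below purely as an illustration and not as the paper's exact procedure, is a median/MAD filter: discard elevations whose deviation from the median exceeds a few robust standard deviations.

```python
# Robust gross-error filter for raw elevation samples (illustrative).
import statistics

def mad_filter(elevations, k=3.0):
    """Keep points within k robust sigmas (MAD / 0.6745) of the median."""
    med = statistics.median(elevations)
    mad = statistics.median([abs(z - med) for z in elevations])
    if mad == 0:  # all points identical up to the median: nothing to reject
        return list(elevations)
    return [z for z in elevations if abs(z - med) <= k * mad / 0.6745]

raw = [100.1, 100.3, 99.9, 100.0, 100.2, 250.0]  # 250.0 is a gross error
cleaned = mad_filter(raw)  # the outlier is rejected, the rest survive
```

The median-based estimate is preferred over a plain mean/3-sigma rule here because a single large blunder inflates the ordinary standard deviation enough to mask itself in small samples.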
Funding: Supported by the National Natural Science Foundation of China (61273290, 61373147), the Xiamen Scientific Plan Project (2014S0048, 3502Z20123037), the Fujian Scientific Plan Project (2013HZ0004-1), and a Fujian Provincial Education Office A-class project (JA13238).
Abstract: Many websites use verification codes to prevent machines from automatically registering, logging in, voting maliciously, or spamming, but entering verification codes manually places a heavy burden on enterprises engaged in internet marketing. Improving verification code security requires an identification method as the corresponding testing system. We propose an anisotropic heat kernel equation group that can generate a heat-source scale space during kernel evolution, based on the infinite heat source axiom, and design a multi-step anisotropic verification code identification algorithm. Its core procedures build the anisotropic heat kernel, set wave energy information parameters, and combine verification code characters; its peripheral procedures perform gray-scaling, binarization, denoising, normalization, segmentation, and identification. Detailed criteria and parameter sets are given. Actual tests show that the anisotropic heat kernel identification algorithm can handle many kinds of verification codes, including text characters, mathematical, Chinese, voice, 3D, programming, video, and advertising codes. Its recognition rate is higher than that of neural network and context matching algorithms by 25% and 50%, respectively, for the Yahoo site; 49% and 60% for the Captcha site; 20% and 52% for the Baidu site; 60% and 65% for the 3DTakers site; and 40% and 51% for the MDP site.
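The heat-kernel evolution at the core of the algorithm can be sketched, in a much-simplified isotropic form, as repeated steps of the discrete heat equation on the character image: each step diffuses intensity toward neighboring pixels. This is only an illustration of the diffusion idea; the paper's kernel is anisotropic and considerably more elaborate.

```python
# One explicit step of the isotropic discrete heat equation on a 2-D
# grid: u_new = u + alpha * laplacian(u), boundaries held fixed.

def heat_step(img, alpha=0.2):
    """Diffuse a grid of floats one time step (interior pixels only)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            out[y][x] = img[y][x] + alpha * lap
    return out

grid = [[0.0] * 5 for _ in range(5)]
grid[2][2] = 1.0            # a single hot pixel, e.g. one character stroke dot
blurred = heat_step(grid)   # heat spreads to the four neighbors
```

An anisotropic variant would make `alpha` direction- and gradient-dependent so diffusion follows stroke directions instead of blurring uniformly.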
Abstract: The availability of magnetic materials is crucial for modern Europe, as they are integral to energy conversion across the renewable energy and electric mobility sectors. Unfortunately, there is still no circular economy to reuse these materials and recapture their value. With demand for NdFeB rare-earth (RE) magnets predicted to double in the next 10 years, the problem becomes even more urgent. As the quality of recollected materials varies significantly, a classification system for recyclate grades of end-of-life (EOL) NdFeB magnets, combined with an eco-labelling system for newly produced RE permanent magnets, is proposed to clearly identify different magnet types and qualities. It categorizes NdFeB magnets by their technical pre-processing requirements, facilitating use of the highly effective HPMS (Hydrogen Processing of Magnetic Scrap) process for re-processing extracted materials directly from NdFeB alloy. The proposed measures will help overcome the currently low recycling rates caused by poor collection, high leakage of collected materials into unsuitable channels, and inadequate interface management between logistics, mechanical pre-processing, and metallurgical metals recovery.
Abstract: In the field of lossless compression, most traditional software has shortcomings when facing mass data: its compression ability is limited by the data window size and by the design of the compression format. This paper presents a new compression format named 'CZ format', which supports a data window size of up to 4 GB and has advantages in mass data compression. Using this format, a compression shareware named 'ComZip' was designed. The experimental results show that ComZip achieves a better compression ratio than WinZip, Bzip2, and WinRAR in most cases, especially when GBs or TBs of mass data are compressed, and ComZip has the potential to beat 7-zip in the future as the data window size exceeds 128 MB.
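The abstract's central point, that a larger data window improves compression when repeated content lies far apart, can be demonstrated at small scale with zlib's window-size parameter (`wbits`). ComZip's 4 GB window is far beyond zlib's 32 KB maximum, so this is only a scaled-down illustration of the same effect, not a reproduction of the paper's experiments.

```python
# A repeated incompressible chunk separated by ~20 KB: visible to a
# 32 KB deflate window (wbits=15) but not to a 512-byte one (wbits=9).
import random
import zlib

random.seed(0)
block = bytes(random.randrange(256) for _ in range(2048))  # incompressible chunk
data = block + b"\x00" * 20000 + block                     # far-apart repeat

def deflate(payload, wbits):
    c = zlib.compressobj(level=9, wbits=wbits)
    return c.compress(payload) + c.flush()

small_window = len(deflate(data, 9))    # repeat out of range: stored twice
large_window = len(deflate(data, 15))   # repeat matched: stored once
assert large_window < small_window
```

Scaling the same reasoning up, a multi-GB window lets far-apart duplicates in mass data be encoded as back-references instead of being stored again.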