Funding: Project (52161135301) supported by the International Cooperation and Exchange of the National Natural Science Foundation of China; Project (202306370296) supported by the China Scholarship Council.
Abstract: Rockburst is a common geological disaster in underground engineering, which seriously threatens the safety of personnel, equipment and property. Utilizing machine learning models to evaluate rockburst risk is gradually becoming a trend. In this study, ensemble algorithms under the Gradient Boosting Decision Tree (GBDT) framework were used to evaluate and classify rockburst intensity. First, a total of 301 rockburst data samples were obtained from a case database, and the data were preprocessed using the synthetic minority over-sampling technique (SMOTE). Then, rockburst evaluation models including GBDT, eXtreme Gradient Boosting (XGBoost), Light Gradient Boosting Machine (LightGBM), and Categorical Features Gradient Boosting (CatBoost) were established, and the optimal hyperparameters of the models were obtained through random grid search and five-fold cross-validation. Afterwards, the optimal hyperparameter configurations were used to fit the evaluation models, and the models were analyzed on the test set. To evaluate performance, metrics including accuracy, precision, recall, and F1-score were selected for analysis and comparison with other machine learning models. Finally, the trained models were used to conduct rockburst risk assessment on rock samples from a mine in Shanxi Province, China, providing theoretical guidance for the mine's safe production. The models under the GBDT framework perform well in the evaluation of rockburst levels, and the proposed methods can provide a reliable reference for rockburst risk level analysis and safety management.
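The workflow above maps directly onto standard Python tooling. Below is a minimal sketch of one branch of it (SMOTE oversampling, a GBDT classifier, random hyperparameter search with five-fold cross-validation); the column layout, parameter ranges, and file name are assumptions rather than the study's actual configuration, and the XGBoost/LightGBM/CatBoost models would follow the same pattern with their own estimators.

```python
# Sketch: SMOTE + GBDT with randomized hyperparameter search (assumed data layout).
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Hypothetical case database: feature columns plus a rockburst intensity label.
df = pd.read_csv("rockburst_cases.csv")          # assumed file name
X, y = df.drop(columns=["intensity"]), df["intensity"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)

# Balance the minority intensity classes on the training split only.
X_tr, y_tr = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

# Random search over an assumed hyperparameter space, five-fold cross-validation.
param_dist = {"n_estimators": [100, 300, 500],
              "learning_rate": [0.01, 0.05, 0.1],
              "max_depth": [2, 3, 4, 5]}
search = RandomizedSearchCV(GradientBoostingClassifier(random_state=0),
                            param_dist, n_iter=20, cv=5,
                            scoring="f1_macro", random_state=0)
search.fit(X_tr, y_tr)

# Accuracy, precision, recall and F1 on the held-out test set.
print(classification_report(y_te, search.predict(X_te)))
```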
Abstract: This work demonstrates that a ΣΔ modulator with a low oversampling ratio (OSR) is a viable option for high-resolution digitization in a low-voltage environment. Low power dissipation is achieved by designing a low-OSR modulator based on a differential cascade architecture, while a large signal swing is maintained to achieve a high dynamic range in the low-voltage environment. Operating from a 1.8 V supply, the sixth-order cascade modulator at a sampling frequency of 4 MHz with an OSR of 24 achieves a dynamic range of 81 dB for an 80 kHz test signal, while dissipating only 5 mW.
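As a rough illustration of why oversampling and noise shaping buy resolution, the sketch below simulates a plain first-order ΣΔ modulator and measures the in-band SNR of its one-bit output. It is a behavioral toy, not the paper's sixth-order cascade circuit, and the tone frequency, OSR and record length are arbitrary assumptions.

```python
# Behavioral sketch of first-order sigma-delta noise shaping (not the cascade design).
import numpy as np

fs, osr, n = 4.0e6, 24, 1 << 15          # assumed sample rate, OSR, record length
f_in = fs / (2 * osr) / 4                # in-band test tone
t = np.arange(n) / fs
x = 0.5 * np.sin(2 * np.pi * f_in * t)   # half-scale input

# First-order loop: integrator followed by a 1-bit quantizer with feedback.
y = np.zeros(n)
integ = 0.0
for i in range(n):
    integ += x[i] - (y[i - 1] if i else 0.0)
    y[i] = 1.0 if integ >= 0.0 else -1.0

# In-band SNR from the windowed spectrum of the bit stream.
win = np.hanning(n)
spec = np.abs(np.fft.rfft(y * win)) ** 2
freqs = np.fft.rfftfreq(n, 1.0 / fs)
sig_bin = np.argmin(np.abs(freqs - f_in))
sig = spec[sig_bin - 2: sig_bin + 3].sum()              # tone plus window leakage
band = (freqs > 0) & (freqs <= fs / (2 * osr))
noise = spec[band].sum() - sig
print("in-band SNR ~ %.1f dB" % (10 * np.log10(sig / noise)))
```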
Funding: Financially supported by the National Science and Technology Major Project of China (No. 2011ZX05023-005-005) and the National Natural Science Foundation of China (No. 41274137).
Abstract: The conventional nonstationary convolutional model assumes that the seismic signal is recorded at normal incidence. Raw shot gathers are far from this assumption because of the effect of offsets. To address this problem, we propose a novel prestack nonstationary deconvolution approach. We introduce the radial trace (RT) transform into nonstationary deconvolution, estimate the nonstationary deconvolution factor with hyperbolic smoothing based on variable-step sampling (VSS) in the RT domain, and obtain high-resolution prestack nonstationary deconvolution data. The RT transform maps the shot record from offset-traveltime coordinates to apparent velocity-traveltime coordinates. The ray paths of the traces in the RT domain better satisfy the assumptions of the convolutional model. The proposed method combines the advantages of stationary deconvolution and inverse Q filtering, without requiring prior information on Q. Nonstationary deconvolution in the RT domain is more suitable than that in the space-time (XT) domain for prestack data because it is a generalized extension of normal incidence. Tests with synthetic and real data demonstrate that the proposed method is more effective in compensating for large-offset and deep data.
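The radial trace mapping itself is straightforward: each radial trace gathers, for every time sample, the amplitude at offset x = v·t interpolated across the recorded traces. The sketch below shows only that coordinate mapping under assumed array conventions; the hyperbolic smoothing and the deconvolution itself are not included.

```python
# Sketch: radial trace (RT) transform of a shot gather, offsets in metres.
import numpy as np

def radial_trace_transform(gather, offsets, dt, velocities):
    """gather: (nt, nx) array; returns an (nt, nv) radial-trace panel."""
    nt, _ = gather.shape
    t = np.arange(nt) * dt
    rt = np.zeros((nt, len(velocities)))
    for j, v in enumerate(velocities):
        for i, ti in enumerate(t):
            x = v * ti                      # offset sampled by this radial trace
            rt[i, j] = np.interp(x, offsets, gather[i, :], left=0.0, right=0.0)
    return rt

# Toy usage with assumed geometry: 1000 samples at 2 ms, offsets 0-3000 m.
gather = np.random.randn(1000, 120)
offsets = np.linspace(0.0, 3000.0, 120)
panel = radial_trace_transform(gather, offsets, dt=0.002,
                               velocities=np.linspace(500.0, 5000.0, 60))
print(panel.shape)                           # (1000, 60)
```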
Abstract: We propose a novel all-optical sampling method using nonlinear polarization rotation in a semiconductor optical amplifier. A rate-equation model capable of describing the all-optical sampling mechanism is presented in this paper. Based on this model, we investigate the optimized operating parameters of the proposed system by simulating the output intensity of the probe light as a function of the input polarization angle, the phase induced by the polarization controller, and the orientation of the polarization beam splitter. The simulated results show that we can obtain a good linear slope and a large linear dynamic range, which are suitable for all-optical sampling. The operating power of the pump light can be less than 1 mW. The presented all-optical sampling method can potentially operate at sampling rates of up to hundreds of GS/s and requires low optical power.
Abstract: A new scheme combining a scalable transcoder with space-time block codes (STBC) in an orthogonal frequency division multiplexing (OFDM) system is proposed for robust video transmission over dispersive fading channels. The target application of the scalable transcoder is to provide access to pre-encoded high-quality MPEG-2 video from mobile wireless terminals. In the scalable transcoder, besides outputting an MPEG-4 fine granular scalability (FGS) bitstream, both the video frame size and the bit rate are reduced. An array processing algorithm for layer interference suppression is used at the receiver, which allows the system structure to provide different levels of protection to different layers. Furthermore, by considering the importance of each scalable bitstream, different bitstreams can be given different levels of protection by the system structure and channel coding. With the proposed system, the large diversity gain of STBC and the alleviation of frequency-selective fading by OFDM are achieved simultaneously. Simulation results show that the proposed schemes integrating scalable transcoding provide a basic quality of video transmission and outperform conventional single-layer transcoding under random and bursty error channel conditions.
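Of the building blocks above, the STBC component has a well-known minimal instance, the Alamouti two-antenna code, in which two symbols are sent over two symbol periods and linearly combined at a single receive antenna to recover full diversity. The sketch below shows that encode/combine step over one flat-fading block; it is a generic illustration, not the paper's OFDM/transcoder chain.

```python
# Sketch: Alamouti 2x1 space-time block coding over one flat-fading block.
import numpy as np

rng = np.random.default_rng(3)
s1, s2 = (1 + 1j) / np.sqrt(2), (-1 + 1j) / np.sqrt(2)   # two QPSK symbols
h1, h2 = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

# Time slot 1: antennas send (s1, s2); time slot 2: (-conj(s2), conj(s1)).
r1 = h1 * s1 + h2 * s2 + noise[0]
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + noise[1]

# Linear combining recovers each symbol scaled by the total channel gain.
gain = np.abs(h1) ** 2 + np.abs(h2) ** 2
s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / gain
s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / gain
print(np.round(s1_hat, 3), np.round(s2_hat, 3))          # close to s1 and s2
```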
Funding: The National Natural Science Foundation of China (No. 60672094).
Abstract: The condensation tracking algorithm uses the prior transition probability as the proposal distribution, which does not make full use of the current observation. To overcome this shortcoming, a new face tracking algorithm based on a particle filter with mean shift importance sampling is proposed. First, a coarse location of the face target is obtained by an efficient mean shift tracker, and the result is then used to construct the proposal distribution for particle propagation. Because the particles obtained with this method cluster around the true state region, particle efficiency is improved greatly. Experimental results show that the performance of the proposed algorithm is better than that of the standard condensation tracking algorithm.
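The key mechanics are the proposal built around the mean shift output and the corrected importance weights. The sketch below shows that weighting step on a toy 2D position state; the mean shift tracker itself is abstracted as a supplied coarse location, and the motion and observation noise levels are arbitrary assumptions.

```python
# Sketch: one particle-filter update with a mean-shift-centred proposal (toy 2D state).
import numpy as np

def gauss_pdf(x, mean, sigma):
    d = x - mean
    return np.exp(-0.5 * np.sum(d * d, axis=-1) / sigma**2) / (2 * np.pi * sigma**2)

def pf_step(particles, weights, ms_location, observation,
            sigma_trans=8.0, sigma_prop=3.0, sigma_obs=5.0):
    """particles: (N, 2); ms_location: coarse mean-shift estimate (assumed given)."""
    n = len(particles)
    # Propose new particles around the mean-shift result, not the old particles.
    new = ms_location + sigma_prop * np.random.randn(n, 2)
    # Importance weights: likelihood * transition prior / proposal density.
    lik = gauss_pdf(new, observation, sigma_obs)
    prior = gauss_pdf(new, particles, sigma_trans)
    prop = gauss_pdf(new, ms_location, sigma_prop)
    w = weights * lik * prior / np.maximum(prop, 1e-12)
    w /= w.sum()
    return new, w

# Toy usage: particles near (50, 50), target drifted to roughly (57, 55).
particles = 50.0 + 2.0 * np.random.randn(200, 2)
weights = np.full(200, 1.0 / 200)
particles, weights = pf_step(particles, weights,
                             ms_location=np.array([56.0, 54.0]),
                             observation=np.array([57.0, 55.0]))
print(particles[np.argmax(weights)])
```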
Abstract: There are some limitations when conventional methods are applied to analyze the massive amounts of seismic data acquired with high-density spatial sampling, since processors usually obtain the properties of raw data from common shot gathers or other datasets located at certain points or along lines. We propose a novel method in this paper to observe seismic data on time slices from spatial subsets. The composition of a spatial subset and the unique character of orthogonal or oblique subsets are described, and pre-stack subsets are shown by 3D visualization. In seismic data processing, spatial subsets can be used for the following purposes: (1) to check the uniformity and regularity of the trace distribution; (2) to observe the main features of ground roll and linear noise; (3) to find abnormal traces from slices of datasets; and (4) to QC the results of pre-stack noise attenuation. The field data application shows that seismic data analysis in spatial subsets is an effective method that may lead to better discrimination among various wavefields and help us obtain more information.
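Item (3), spotting abnormal traces from a slice, reduces to simple array operations once the pre-stack volume is indexed by time and two spatial coordinates. The sketch below extracts a time slice and flags traces whose RMS amplitude deviates strongly from the robust level of the subset; the cube layout and the threshold are assumptions for illustration.

```python
# Sketch: spatial-subset QC, a time slice plus abnormal-trace detection (assumed layout).
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical pre-stack subset cube: (time samples, shot lines, receiver lines).
data = rng.standard_normal((1500, 40, 60))
data[:, 12, 33] *= 25.0                        # plant one anomalously strong trace

time_slice = data[800, :, :]                   # spatial view at a single time sample

# Per-trace RMS viewed as a map over the same spatial grid, with a robust outlier test.
rms_map = np.sqrt(np.mean(data ** 2, axis=0))
med = np.median(rms_map)
mad = np.median(np.abs(rms_map - med)) + 1e-12
abnormal = np.argwhere((rms_map - med) / mad > 10.0)       # assumed threshold
print(time_slice.shape, "abnormal (shot, receiver):", abnormal.tolist())
```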
Abstract: An approach to contour extraction and feature point detection for 3-D fragment reassembly is proposed. A simple and effective technique is used to build the intrinsic topology of the fragment data suitable for contour extraction. For scanned data whose topology is difficult to obtain, corresponding solutions are given to handle the problem. A robust approach is used for calculating the curvature and torsion of a discrete contour in 3-D space. Finally, a method is developed for detecting feature points of the fragment contour based on total curvature. The contour description therefore combines simple global information with local feature points. Experiments with real contour curves extracted from 3-D fragments demonstrate that the proposed method is robust and efficient.
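For reference, the curvature and torsion of a discrete 3-D contour can be approximated from finite differences of the sampled points, using κ = |r′×r″|/|r′|³ and τ = (r′×r″)·r‴/|r′×r″|². The short sketch below applies exactly those textbook formulas; it is not the paper's specific robust estimator.

```python
# Sketch: finite-difference curvature and torsion of a sampled 3-D contour.
import numpy as np

def curvature_torsion(points):
    """points: (n, 3) samples along the contour (assumed ordered, uniformly spaced)."""
    r1 = np.gradient(points, axis=0)          # r'
    r2 = np.gradient(r1, axis=0)              # r''
    r3 = np.gradient(r2, axis=0)              # r'''
    cross = np.cross(r1, r2)
    cross_norm = np.linalg.norm(cross, axis=1)
    speed = np.linalg.norm(r1, axis=1)
    kappa = cross_norm / np.maximum(speed ** 3, 1e-12)
    tau = np.einsum("ij,ij->i", cross, r3) / np.maximum(cross_norm ** 2, 1e-12)
    return kappa, tau

# Check on a helix x = cos t, y = sin t, z = 0.5 t: exact kappa = 0.8, tau = 0.4.
t = np.linspace(0, 6 * np.pi, 2000)
helix = np.column_stack([np.cos(t), np.sin(t), 0.5 * t])
kappa, tau = curvature_torsion(helix)
print(kappa[1000], tau[1000])                 # ~0.8 and ~0.4 away from the ends
```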
Funding: Funded by the National Science VIP specialized project of China (Grant No. 2011ZX05025-001-03) and by the National Science Foundation of China (Grant No. 41274117).
Abstract: In tomographic statics seismic data processing, it is crucial to determine an optimum base for the near-surface model. In this paper, we treat near-surface model base determination as a global optimization problem. Given information from uphole shooting and first-arrival times from a surface seismic survey, we present a near-surface velocity model construction method based on a Monte-Carlo sampling scheme under a layered equivalent medium assumption. Compared with traditional least-squares first-arrival tomography, this scheme can delineate a clearer weathering-layer base, resulting in a better implementation of datum correction. Examples using synthetic and field data demonstrate the effectiveness of the proposed scheme.
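To make the Monte-Carlo idea concrete, the sketch below randomly samples layer thickness and velocities for a single two-layer equivalent medium and keeps the model whose predicted first arrivals (direct wave versus head wave) best fit the picks. The forward model, search ranges, and misfit are simplified assumptions, not the paper's scheme.

```python
# Sketch: Monte-Carlo search of a two-layer near-surface model from first arrivals.
import numpy as np

def first_arrival(offsets, h, v1, v2):
    """Earliest of direct and head-wave times for one weathering layer over bedrock."""
    direct = offsets / v1
    head = offsets / v2 + 2.0 * h * np.sqrt(1.0 / v1**2 - 1.0 / v2**2)
    return np.minimum(direct, head)

rng = np.random.default_rng(0)
offsets = np.linspace(50.0, 1500.0, 30)
t_obs = first_arrival(offsets, h=25.0, v1=800.0, v2=2400.0)   # synthetic "picks"
t_obs += 0.002 * rng.standard_normal(t_obs.size)              # picking noise

best, best_misfit = None, np.inf
for _ in range(20000):                       # plain Monte-Carlo sampling
    h = rng.uniform(5.0, 60.0)               # assumed search ranges
    v1 = rng.uniform(400.0, 1500.0)
    v2 = rng.uniform(v1 + 200.0, 4000.0)     # keep v2 > v1 so the head wave exists
    misfit = np.sum((first_arrival(offsets, h, v1, v2) - t_obs) ** 2)
    if misfit < best_misfit:
        best, best_misfit = (h, v1, v2), misfit

print("recovered (h, v1, v2):", np.round(best, 1))
```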
Abstract: This paper presents the design considerations and implementation of an area-efficient interpolator suitable for a delta-sigma D/A converter. In an effort to reduce area and design complexity, a method for designing an FIR filter as a tapped cascaded interconnection of identical subfilters is modified. The proposed subfilter structure further reduces the number of arithmetic operations. Experimental results show that the proposed interpolator achieves the design specification, exhibiting high performance and hardware efficiency, and also has good noise rejection capability. The interpolation filter can be applied to a delta-sigma DAC and is fully functional.
Funding: Supported by the Fund for Agricultural Science and Technology Independent Innovation of Jiangsu Province [CX(12)1001-05].
Abstract: Objective: This study aimed to establish a simple method for collecting and detecting Mycoplasma hyopneumoniae (Mhp) in aerosol. Method: Based on the mechanisms of liquid impinger and filtration samplers, a double concentration aerosol sampler was designed for collecting Mhp aerosol. Firstly, collection was performed in a closed environment filled with artificial Mhp aerosol. Secondly, the collection efficiency was determined by real-time PCR. Thereafter, the clinical feasibility of the designed equipment was tested by collecting aerosol samples from different pig herds. In one assay, samples were collected at different times from a pig house challenged with Mhp. In another assay, samples were collected from the delivery room, nursery and fattening house of a farm during an MPS outbreak, as well as from an Mhp-infection-positive pig farm without obvious clinical symptoms. All the aerosol samples were then detected by real-time PCR or nested PCR. Result: The collection efficiency of the designed bioaerosol sampler was (37.04±6.43)%. Mhp could be detected 7 d after intratracheal challenge with pneumonic lung homogenate suspension. Aerosol samples from 11 pig houses on the two Mhp-positive pig farms, with or without clinical symptoms, all gave positive PCR results; the positivity rate was 100%. Conclusion: A highly sensitive aerosol collection and detection technology was successfully established, which can be applied to the clinical detection of Mhp in aerosol.
Abstract: Frequency sampling is one of the popular methods for FIR digital filter design. In the frequency sampling method, the values of the transition-band samples, which are usually obtained by consulting a table, must be determined in order to maximize the attenuation within the stopband. However, the values obtained by table lookup cannot be guaranteed to be optimal. Evolutionary programming (EP), a multi-agent stochastic optimization technique, can lead to globally optimal solutions for complex problems. In this paper, a new application of EP to the frequency sampling method is introduced. Two examples of lowpass and bandpass FIR filters are presented, and the steps of the EP realization and the experimental results are given. The experimental results show that the transition-band sample values obtained by EP are optimal and that the performance of the filter is improved.
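The optimization target is easy to state in code: synthesize the impulse response from the frequency samples, then score the peak stopband level as a function of the transition sample. The sketch below does this with a bare-bones mutate-and-select search standing in for full EP; the filter length, band edges, and search schedule are assumptions for illustration.

```python
# Sketch: frequency-sampling lowpass design with the transition sample tuned by
# a simple mutate-and-select search (a stand-in for full evolutionary programming).
import numpy as np

N = 33                                  # filter length (odd, type-I linear phase)
PASS_BINS = 5                           # frequency samples k = 0..4 set to 1

def impulse_response(trans):
    """Frequency-sampling synthesis with one transition-band sample 'trans'."""
    Hk = np.zeros(N // 2 + 1)
    Hk[:PASS_BINS] = 1.0
    Hk[PASS_BINS] = trans               # the sample being optimized
    n = np.arange(N)
    alpha = (N - 1) / 2.0
    h = np.full(N, Hk[0])
    for k in range(1, N // 2 + 1):
        h += 2.0 * Hk[k] * np.cos(2.0 * np.pi * k * (n - alpha) / N)
    return h / N

def stopband_peak_db(h):
    """Peak stopband magnitude in dB (stopband assumed to start past the transition bin)."""
    w = np.linspace(0.0, np.pi, 4096)
    H = np.abs(np.exp(-1j * np.outer(w, np.arange(N))) @ h)
    stop = w >= 2.0 * np.pi * (PASS_BINS + 1.5) / N
    return 20.0 * np.log10(H[stop].max())

rng = np.random.default_rng(1)
t_best = 0.5
f_best = stopband_peak_db(impulse_response(t_best))
for _ in range(400):                    # mutate the transition value, keep improvements
    t_new = np.clip(t_best + 0.05 * rng.standard_normal(), 0.0, 1.0)
    f_new = stopband_peak_db(impulse_response(t_new))
    if f_new < f_best:
        t_best, f_best = t_new, f_new

print("transition sample %.4f, stopband peak %.1f dB" % (t_best, f_best))
```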
Funding: Sponsored by the National Natural Science Foundation of China (No. 41174107) and the National Science and Technology Project of oil and gas (No. 2011ZX05023-005).
Abstract: Traditional seismic data sampling follows the Nyquist sampling theorem. In this paper, we introduce the theory of compressive sensing (CS), breaking through the limitations of the traditional Nyquist sampling theorem: random undersampling renders the coherent aliases of regular undersampling into harmless incoherent random noise, effectively turning the reconstruction problem into a much simpler denoising problem. We introduce the projections onto convex sets (POCS) algorithm into the data reconstruction process, apply an exponentially decaying threshold parameter in the iterations, and modify the traditional reconstruction process, which performs forward and inverse transforms in both the time and space domains, into a new method that uses forward and inverse transforms in the space domain only. The proposed method uses less computer memory and improves computational speed. We also analyze the anti-noise and anti-aliasing ability of the proposed method, and compare 2D and 3D data reconstruction. Theoretical models and real data show that the proposed method is effective and of practical importance, as it can reconstruct missing traces and reduce the exploration cost of complex data acquisition.
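The POCS loop with an exponentially decaying threshold is compact enough to sketch directly: transform, hard-threshold, inverse-transform, then re-insert the observed traces. The version below works in the 2D Fourier domain on a toy gather with randomly killed traces; the transform choice, threshold schedule endpoints, and iteration count are assumptions for illustration.

```python
# Sketch: POCS trace interpolation with an exponentially decaying threshold.
import numpy as np

rng = np.random.default_rng(0)

# Toy "complete" gather: two dipping events plus weak noise.
nt, nx = 256, 64
t, x = np.meshgrid(np.arange(nt), np.arange(nx), indexing="ij")
full = (np.sin(2 * np.pi * (t - 0.8 * x) / 40.0)
        + 0.5 * np.sin(2 * np.pi * (t + 1.5 * x) / 60.0)
        + 0.05 * rng.standard_normal((nt, nx)))

# Random undersampling: kill roughly half of the traces.
mask = (rng.random(nx) < 0.5).astype(float)          # 1 = trace recorded
observed = full * mask[None, :]

# POCS iterations with exponential threshold decay (assumed schedule).
rec = observed.copy()
n_iter = 60
smax = np.abs(np.fft.fft2(observed)).max()
lam_max, lam_min = 0.9 * smax, 0.005 * smax
for it in range(n_iter):
    lam = lam_max * (lam_min / lam_max) ** (it / (n_iter - 1))
    spec = np.fft.fft2(rec)
    spec[np.abs(spec) < lam] = 0.0                   # hard threshold in the f-k domain
    rec = np.real(np.fft.ifft2(spec))
    rec = observed + rec * (1.0 - mask[None, :])     # re-insert the recorded traces

err = np.linalg.norm(rec - full) / np.linalg.norm(full)
print("relative reconstruction error: %.3f" % err)
```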
Abstract: The interleaving/multiplexing technique was used to realize a 200 MHz real-time data acquisition system. Two 100 MHz ADC modules work in parallel, and each ADC outputs data in ping-pong fashion. The design raises the system conversion rate to 200 MHz while reducing the speed of data transport and storage to 50 MHz. High-speed HDPLD and ECL logic parts were used to control the system timing and the memory addressing. A multi-layer printed circuit board and shielding were used to reduce the interference produced by the high-speed circuitry, and the system timing was designed carefully. The interleaving/multiplexing technique can greatly improve the system conversion rate while greatly reducing the speed of the external digital interfaces, effectively resolving the difficulties of high-speed systems. Experiments proved that the data acquisition system is stable and accurate.
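The time-interleaving principle is easy to show in a few lines: two converters sample on alternating clock phases, and their streams are re-woven into one sequence at twice the rate. The sketch below is a purely numerical illustration of that reordering, not a model of the HDPLD/ECL hardware.

```python
# Sketch: two-way time-interleaved sampling reassembled into one 200 MHz stream.
import numpy as np

fs = 200e6                               # combined sample rate
n = 1000
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * 7.3e6 * t)   # arbitrary test tone

adc_a = signal[0::2]                     # ADC A samples the even instants (100 MHz)
adc_b = signal[1::2]                     # ADC B samples the odd instants (100 MHz)

# Ping-pong reassembly back into the full-rate record.
combined = np.empty(n)
combined[0::2] = adc_a
combined[1::2] = adc_b
print(np.allclose(combined, signal))     # True: interleaving preserves the stream
```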
Abstract: Software synchronous sampling is widely employed in periodic signal measurement, but measurement errors occur very commonly. This paper analyses the cause of these errors and derives mathematical models for the measurement errors in RMS voltage, RMS current and active power obtained by the software synchronous sampling method. Measures to reduce the errors are put forward on the basis of simulations.
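The error mechanism is easy to reproduce numerically: when the sampling window does not hold an integer number of signal periods, the discrete RMS and active-power estimates deviate from their true values. The sketch below compares a synchronous window with a slightly non-synchronous one; the frequencies, phase shift and sample counts are arbitrary assumptions.

```python
# Sketch: RMS and active-power error from a non-integer number of sampled periods.
import numpy as np

f, fs, n = 50.0, 10000.0, 2000           # 50 Hz signal, 10 kHz sampling, window length
Vm, Im, phi = 311.0, 14.1, np.pi / 6     # amplitudes and current phase lag

def estimates(f_actual):
    """Discrete RMS voltage, RMS current and active power over the fixed window."""
    t = np.arange(n) / fs
    v = Vm * np.sin(2 * np.pi * f_actual * t)
    i = Im * np.sin(2 * np.pi * f_actual * t - phi)
    return np.sqrt(np.mean(v**2)), np.sqrt(np.mean(i**2)), np.mean(v * i)

true_p = 0.5 * Vm * Im * np.cos(phi)      # exact active power for the sinusoids

for f_actual in (50.0, 50.37):            # synchronous vs. slightly off-frequency
    vrms, irms, p = estimates(f_actual)
    print("f=%.2f Hz  Vrms=%.3f  Irms=%.3f  P err=%.3f %%"
          % (f_actual, vrms, irms, 100.0 * (p - true_p) / true_p))
```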
Abstract: The sinusoid curve fit is widely applied in the evaluation of digitizing measurement equipment, such as data acquisition systems, digital storage oscilloscopes, waveform recorders and A/D converters. Because of the distortion and noise of the sinusoid signal generator and the digitizing and nonlinearity errors in the measurement, distortion and noise in the sampled sinusoid series are unavoidable, and they limit the accuracy of the curve-fit results. It is therefore desirable to find a filter that can remove both the distortion and the noise of the sampled sinusoid series without influencing the amplitude, frequency, phase and DC bias of the fitted sine wave, so that the uncertainty of the fitted parameters can be reduced. Such a filter is designed and realized: its realization in the time domain is described and its transfer function in the frequency domain is presented.
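For context, the underlying sine fit (when the test frequency is known) is a linear least-squares problem in the in-phase, quadrature and DC components, in the spirit of the IEEE 1057 three-parameter fit. The sketch below shows that fit on a noisy record; it is not the paper's filter, and the record parameters are assumed.

```python
# Sketch: three-parameter (known-frequency) least-squares sine fit of a sampled record.
import numpy as np

fs, f0, n = 1.0e6, 9.7e3, 4096           # assumed sample rate, test frequency, length
t = np.arange(n) / fs
rng = np.random.default_rng(2)
record = 1.23 * np.sin(2 * np.pi * f0 * t + 0.4) + 0.05 + 0.01 * rng.standard_normal(n)

# Model: A*cos(wt) + B*sin(wt) + C, solved by linear least squares.
w = 2 * np.pi * f0
design = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones(n)])
(a, b, c), *_ = np.linalg.lstsq(design, record, rcond=None)

amplitude = np.hypot(a, b)
phase = np.arctan2(a, b)                  # phase in the sin(wt + phase) convention
print("amplitude %.4f, phase %.4f rad, DC %.4f" % (amplitude, phase, c))
```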
Funding: The High Technology Research Plan of Jiangsu Province (No. BG2004034) and the Foundation of Graduate Creative Program of Jiangsu Province (No. xm04-36).
Abstract: A novel data stream partitioning method is proposed to address range-aggregation continuous queries over parallel streams in the power industry. The first step of the method is to sample the data in parallel, implemented as an extended reservoir-sampling algorithm; a skip factor based on the change ratio of data values is introduced to describe the distribution characteristics of the data values adaptively. The second step is to partition the flux of the data streams evenly, implemented with two alternative equal-depth histogram generation algorithms that fit different cases: one performs incremental maintenance based on heuristics and the other performs periodical updates to generate an approximate partition vector. Experimental results on actual data show that the method is efficient, practical and suitable for processing time-varying data streams.
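Both building blocks named above have compact textbook forms: classic reservoir sampling keeps a uniform fixed-size sample of an unbounded stream, and an equal-depth (equi-depth) histogram over such a sample is just a set of quantile boundaries that can serve as a partition vector. The sketch below shows those two pieces only; the skip factor and the incremental maintenance strategy of the paper are not reproduced.

```python
# Sketch: classic reservoir sampling plus an equal-depth partition vector from the sample.
import random
import numpy as np

def reservoir_sample(stream, k, rng=random.Random(0)):
    """Algorithm R: uniform sample of size k from a stream of unknown length."""
    reservoir = []
    for i, value in enumerate(stream):
        if i < k:
            reservoir.append(value)
        else:
            j = rng.randint(0, i)            # keep with probability k/(i+1)
            if j < k:
                reservoir[j] = value
    return reservoir

def equal_depth_boundaries(sample, n_partitions):
    """Quantile boundaries so each partition holds roughly the same number of values."""
    qs = np.linspace(0.0, 1.0, n_partitions + 1)[1:-1]
    return np.quantile(sample, qs)

# Toy usage: a skewed stream of measurement values, 8-way partition vector.
src = random.Random(1)
stream = (src.lognormvariate(0.0, 1.0) for _ in range(100000))
sample = reservoir_sample(stream, k=2000)
print(np.round(equal_depth_boundaries(sample, 8), 3))
```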