Machine learning and big data are among the latest approaches in corrosion research. The biggest challenge in corrosion research is to accurately predict how materials will degrade in a given environment. Corrosion big data is the application of mathematical methods to huge amounts of data to find correlations and infer probabilities. The corrosion big data method makes it possible to distinguish the influence of minimal changes in alloying elements and small differences in microstructure on the corrosion resistance of low-alloy steels. In this research, corrosion big data evaluation methods and machine learning were used to study the effects of Sb and Sn, as well as environmental factors, on the corrosion behavior of low-alloy steels. The results show that the corrosion big data method can accurately identify the influence of various factors on the corrosion resistance of low-alloy steels and is an effective and promising approach in corrosion research.
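As a rough illustration of the kind of model such a study might use, the sketch below fits a gradient-boosted regression relating alloy chemistry and environmental factors to a corrosion-rate target. The feature names, value ranges, and data are hypothetical; the paper's actual big-data pipeline is not described in the abstract.

```python
# Minimal sketch, assuming synthetic features (Sb, Sn, temperature, humidity) and a
# synthetic corrosion-rate target; not the paper's actual dataset or model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0.0, 0.2, n),    # Sb content (wt%), assumed range
    rng.uniform(0.0, 0.2, n),    # Sn content (wt%), assumed range
    rng.uniform(5.0, 40.0, n),   # temperature (deg C)
    rng.uniform(40.0, 95.0, n),  # relative humidity (%)
])
# Synthetic target standing in for a measured corrosion rate
y = 0.5 - 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.01 * X[:, 2] + 0.005 * X[:, 3] \
    + rng.normal(0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
print("Feature importances (Sb, Sn, T, RH):", model.feature_importances_)
```

The feature importances are one simple way such a model can attribute corrosion resistance to small compositional and environmental differences.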
Using four satellite data sets (TOMS/SBUV, OMI, MLS, and HALOE), we analyze the seasonal variations of the total column ozone (TCO) and its zonal deviation (TCO*), and reveal the vertical structure of the Ozone Low over the Asian continent. Our principal findings are: (1) The TCO over the Asian continent reaches its maximum in spring and its minimum in autumn; the Ozone Low exists from May to September. (2) The Ozone Low has two negative cores, located in the lower and the upper stratosphere. The lower core is near 30 hPa in winter and 70 hPa in the other seasons, while the upper core varies from 10 hPa to 1 hPa among the four seasons. (3) The position of the Ozone Low in the lower and upper stratosphere over the Asian continent shows seasonal variability.
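A zonal deviation such as TCO* is conventionally computed by subtracting the zonal (longitudinal) mean from the field at each latitude. The sketch below shows that computation on a synthetic gridded field; the grid size and variable names are illustrative assumptions, not the paper's data.

```python
# Minimal sketch of a zonal-deviation calculation on a synthetic ozone field.
import numpy as np

n_lat, n_lon = 73, 144                                   # 2.5-degree global grid (assumed)
tco = 280.0 + 20.0 * np.random.rand(n_lat, n_lon)        # total column ozone (Dobson units)

zonal_mean = tco.mean(axis=1, keepdims=True)             # average over longitude at each latitude
tco_star = tco - zonal_mean                              # TCO*: deviation from the zonal mean

# A persistently negative TCO* over a region (e.g., the Asian continent) marks a relative ozone low.
print(tco_star.shape, round(float(tco_star.mean()), 6))
```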
A great challenge faced by wireless sensor networks (WSNs) is reducing the energy consumption of sensor nodes. Fortunately, data gathering via random sensing can save the energy of sensor nodes. Nevertheless, its randomness and density usually result in difficult implementations, high computational complexity, and large storage requirements in practical settings, so deterministic sparse sensing matrices are desirable in some situations. However, it is difficult to guarantee the performance of a deterministic sensing matrix by the acknowledged metrics. In this paper, we construct a class of deterministic sparse sensing matrices with the statistical version of the restricted isometry property (StRIP) via regular low-density parity-check (RLDPC) matrices. The key idea of our construction is to achieve small mutual coherence of the matrices by confining the column weights of the RLDPC matrices such that StRIP is satisfied. We also prove that the constructed sensing matrices require the same scale of measurement numbers as dense measurements, and we propose a data gathering method based on the RLDPC matrix. Experimental results verify that the constructed sensing matrices have better reconstruction performance than the Gaussian, Bernoulli, and CSLDPC matrices, and that data gathering via the RLDPC matrix can reduce the energy consumption of WSNs.
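Mutual coherence, the metric this construction tries to keep small, is the largest normalized inner product between distinct columns of the sensing matrix. The sketch below computes it for a random sparse binary matrix that stands in for an RLDPC-based construction; the exact column-weight rules of the paper are not reproduced here.

```python
# Minimal sketch: mutual coherence of a sparse 0/1 sensing matrix (illustrative stand-in).
import numpy as np

rng = np.random.default_rng(1)
m, n, col_weight = 64, 256, 4            # measurements, signal length, ones per column (assumed)

A = np.zeros((m, n))
for j in range(n):
    A[rng.choice(m, size=col_weight, replace=False), j] = 1.0

A /= np.linalg.norm(A, axis=0)           # normalize columns
G = np.abs(A.T @ A)                      # absolute column inner products
np.fill_diagonal(G, 0.0)
mu = G.max()                             # mutual coherence: largest off-diagonal entry
print(f"mutual coherence: {mu:.3f}")
```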
Seismic data reconstruction is an essential and fundamental step in the seismic data processing workflow, and it is of profound significance for improving migration imaging quality, multiple suppression, and seismic inversion accuracy. Regularization methods play a central role in solving the underdetermined inverse problem of seismic data reconstruction. In this paper, a novel regularization approach, the low-dimensional manifold model (LDMM), is proposed for reconstructing missing seismic data. Our work relies on the fact that seismic patches always occupy a low-dimensional manifold. Specifically, we use the dimension of the seismic patch manifold as a regularization term in the reconstruction problem and reconstruct the missing seismic data by enforcing low dimensionality on this manifold. The crucial procedure of the proposed method is to estimate the dimension of the patch manifold. To this end, we adopt an efficient dimensionality calculation method based on low-rank approximation, which provides a reliable safeguard for enforcing the constraints in the reconstruction process. Numerical experiments on synthetic and field seismic data demonstrate that, compared with the curvelet-based sparsity-promoting L1-norm minimization method and the multichannel singular spectrum analysis method, the proposed method obtains state-of-the-art reconstruction results.
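To make the patch-manifold idea concrete, the sketch below extracts patches from a synthetic seismic section and estimates an effective dimension via a low-rank (SVD) approximation, the kind of quantity LDMM penalizes. The patch size, energy threshold, and synthetic section are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch: effective dimension of a patch set via SVD energy (assumed procedure).
import numpy as np

rng = np.random.default_rng(2)
nt, nx, p = 256, 128, 16                          # time samples, traces, patch size
t = np.arange(nt)[:, None]
section = np.sin(2 * np.pi * 0.02 * (t + 0.5 * np.arange(nx))) \
          + 0.05 * rng.standard_normal((nt, nx))  # synthetic dipping-event section

# Collect non-overlapping p-by-p patches as rows of a matrix
patches = [section[i:i + p, j:j + p].ravel()
           for i in range(0, nt - p + 1, p)
           for j in range(0, nx - p + 1, p)]
P = np.array(patches)

s = np.linalg.svd(P - P.mean(axis=0), compute_uv=False)
energy = np.cumsum(s**2) / np.sum(s**2)
eff_dim = int(np.searchsorted(energy, 0.99) + 1)  # ranks capturing 99% of the energy
print("effective patch-manifold dimension (approx.):", eff_dim)
```

A small effective dimension relative to the patch size is exactly the low-dimensionality property the regularizer exploits.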
The harmonic analysis method based on high and low water levels is discussed in this paper. In order to make full use of the information in high and low water observations (the time derivative of the water level at the observation time is zero), a weight coefficient, w, is introduced to control the importance of the part of the error formula related to this information. The major diurnal constituents, O1 and K1, and semidiurnal constituents, N2, M2, and S2, are selected directly from the monthly data analysis, and some other important constituents, P1, ν2, and K2, are included as inferred constituents. The harmonic constants obtained for the major constituents are very close to those obtained from the analysis of hourly data, which shows that high and low water data can be used to extract tidal constants with high accuracy. The analysis results also show that inference and the weight coefficient are important in high and low water data analysis, and it is suggested that w ≥ 1 be used in monthly high and low water data analysis. The method can be applied directly to altimetric data with w = 0.
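The idea can be sketched as a weighted least-squares fit: each high or low water contributes a height equation and a dh/dt = 0 equation, with the derivative part scaled by w. The sketch below uses the standard M2 and K1 constituent speeds on a synthetic tide; the exact error formula and inference step of the paper are not reproduced.

```python
# Minimal sketch of high/low-water harmonic analysis with a weighted dh/dt = 0 condition.
import numpy as np

omega = np.array([28.9841042, 15.0410686]) * np.pi / 180.0   # M2, K1 speeds (rad/hour)

def height(t):
    return 1.2 * np.cos(omega[0] * t - 0.7) + 0.5 * np.cos(omega[1] * t - 1.9)

# Synthetic high/low water times: local extrema of the signal over one month
tt = np.arange(0.0, 24.0 * 30.0, 0.01)
hh = height(tt)
idx = np.where(np.diff(np.sign(np.diff(hh))) != 0)[0] + 1
t, h = tt[idx], hh[idx]

# Height equations: h(t) = a0 + sum_k [c_k cos(w_k t) + s_k sin(w_k t)]
A_h = np.column_stack([np.ones_like(t)] +
                      [f(w_ * t) for w_ in omega for f in (np.cos, np.sin)])
# Derivative-zero equations at the same times, weighted by w
A_d = np.column_stack([np.zeros_like(t)] +
                      [g for w_ in omega for g in (-w_ * np.sin(w_ * t), w_ * np.cos(w_ * t))])

w = 1.0                                   # weight coefficient; the paper suggests w >= 1
A = np.vstack([A_h, w * A_d])
b = np.concatenate([h, np.zeros_like(t)])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)

for k, name in enumerate(["M2", "K1"]):
    c, s = coef[1 + 2 * k], coef[2 + 2 * k]
    print(name, "amplitude:", round(float(np.hypot(c, s)), 3))
```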
With the development of oilfield exploration and production, research on continental oil and gas reservoirs has gradually been refined, and offshore reservoir exploration has entered the stage of intensive study of small sand bodies, small fault blocks, complex structures, low permeability, and various heterogeneous geological bodies. Marine oil and gas development will therefore inevitably enter the complicated-reservoir stage, and the corresponding assessment technologies, engineering measures, and exploration methods should be designed carefully. Studying the hydraulic flow units of low-permeability reservoirs in offshore oilfields has practical significance for assessing connectivity and the distribution of remaining oil. An integrated method combining data mining and flow unit identification was applied to flow unit prediction in a low-permeability reservoir, and the predicted results were compared with those of a mature commercial system for verification. This strategy successfully increases accuracy by selecting the best-performing prediction result, and a capable computing system can provide more accurate geological information for reservoir characterization.
This paper describes the implementation of a data logger for the real-time in-situ monitoring of hydrothermal systems. A compact mechanical structure ensures the security and reliability of the data logger when used in the deep sea. The data logger is a battery-powered instrument that can connect chemical sensors (pH, H2S, and H2 electrodes) and temperature sensors. To achieve major energy savings, dynamic power management is implemented in both the hardware and the software design. The working current of the data logger is 15 μA in idle mode and 1.44 mA in active mode, which greatly extends the battery life. The data logger was successfully tested in the first Sino-American Cooperative Deep Submergence Project from August 13 to September 3, 2005.
The quality of low-frequency electromagnetic data is affected by spike and trend noise, and failure to remove the spikes and trends reduces the credibility of data interpretation. Based on analyses of the causes and characteristics of these noises, this paper presents a preset statistics stacking method (PSSM) and a piecewise linear fitting method (PLFM) for de-noising the spikes and the trends, respectively. The magnitudes of the spikes are either higher or lower than the normal values, which distorts the useful signal. Comparisons of spike removal among the averaging, statistics, and PSSM methods indicate that only the PSSM can remove the spikes successfully. On the other hand, the spectra of the linear and nonlinear trends lie mainly in the low-frequency band and can change the calculated resistivity significantly; no influence of the trends is observed when the frequency is higher than a certain threshold. The PLFM can effectively remove both linear and nonlinear trends, with errors of around 1% in the power spectrum. The proposed methods provide an effective way to de-noise spike and trend noise in low-frequency electromagnetic data and establish a basis for further research on de-noising low-frequency noise.
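The two clean-up steps the abstract describes can be illustrated with generic stand-ins: a robust threshold despike (the actual PSSM statistics are not specified in the abstract) followed by a piecewise linear fit removed segment by segment to suppress slow trends.

```python
# Minimal sketch: generic despiking and piecewise linear detrending on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
n = 2000
t = np.arange(n)
data = np.sin(2 * np.pi * t / 50) + 0.002 * t + 0.05 * rng.standard_normal(n)  # signal + trend + noise
data[rng.choice(n, 10, replace=False)] += 5.0                                   # inject spikes

# Despike: replace samples far from a running median with that median
med = np.array([np.median(data[max(0, i - 5):i + 6]) for i in range(n)])
resid = data - med
spikes = np.abs(resid) > 4 * np.median(np.abs(resid))
despiked = np.where(spikes, med, data)

# Piecewise linear detrend: fit and subtract a line in each segment
detrended = despiked.copy()
for seg in np.array_split(np.arange(n), 8):
    p = np.polyfit(seg, despiked[seg], 1)
    detrended[seg] -= np.polyval(p, seg)

print("spikes removed:", int(spikes.sum()),
      "| residual trend slope ~", round(float(np.polyfit(t, detrended, 1)[0]), 6))
```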
This paper presents current research in the low-power Very Large Scale Integration (VLSI) domain. Low power has become a much sought-after research topic in the electronics industry, and power dissipation is the most important consideration when designing a VLSI chip. Today, almost all high-speed switching devices include Ternary Content Addressable Memory (TCAM) as one of their most important features; a device that consumes less power is more reliable and works more efficiently, and Complementary Metal Oxide Semiconductor (CMOS) technology is best known for low-power devices. This paper aims at designing a router application device that consumes less power and works more efficiently. Various strategies, methodologies, and power management techniques for low-power circuits and systems are discussed, along with the challenges that may be met while designing low-power, high-performance circuits. This work develops a Data Aware AND-type match-line architecture for TCAM. A 256 × 128 TCAM macro was designed using the Cadence Advanced Development Environment (ADE) with a 90 nm technology file from Taiwan Semiconductor Manufacturing Company (TSMC). The results show that the proposed Data Aware architecture provides around 35% speed and 45% power improvement over the existing architecture.
The observation of geomagnetic field variations is an important approach to studying earthquake precursors. Since 1987, the China Earthquake Administration has explored this seismomagnetic relationship, in particular by studying local magnetic field anomalies over the Chinese mainland for earthquake prediction. From years of research on the seismomagnetic relationship, earthquake prediction experts have concluded that the compressive magnetic effect, the tectonic magnetic effect, the electromagnetic fluid effect, and other factors contribute to pre-earthquake magnetic anomalies. However, these involve only small magnetic field changes, and the high cost of professional geomagnetic equipment limits large-scale deployment, making it difficult to capture the strong magnetic field changes that may precede extreme earthquakes. The Tianjin Earthquake Agency has therefore developed low-cost geomagnetic field observation equipment through the Beijing–Tianjin–Hebei geomagnetic equipment test project. The new system was used to test the availability of the equipment and to derive findings based on big data.
Traditional particle identification methods in heavy-ion collisions at low and intermediate energies are time-consuming, experience-dependent, and poorly repeatable, and researchers urgently need a way out of this dilemma. This study explores the possibility of applying intelligent learning algorithms to particle identification in heavy-ion collisions at low and intermediate energies. Multiple intelligent algorithms, including XGBoost and TabNet, were selected and tested on datasets from the Neutron Ion Multidetector for Reaction-Oriented Dynamics (NIMROD-ISiS) and Geant4 simulation. Tree-based machine learning algorithms and deep learning algorithms such as TabNet show excellent performance and generalization ability, and adding data features beyond energy deposition can improve performance when the data distribution is nonuniform. Intelligent learning algorithms can thus be applied to solve the particle identification problem in heavy-ion collisions at low and intermediate energies.
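The tree-based approach can be sketched as classifying particle species from detector features such as a fast/slow (ΔE–E style) energy-deposition pair. The synthetic bands and the use of scikit-learn's gradient boosting below are illustrative stand-ins for the paper's XGBoost/TabNet models trained on NIMROD-ISiS and Geant4 data.

```python
# Minimal sketch: species classification from two synthetic detector features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
species = {"p": 1.0, "d": 2.0, "alpha": 4.0}          # crude mass factors for synthetic bands
X, y = [], []
for label, m in species.items():
    e_total = rng.uniform(20.0, 200.0, 400)            # total energy deposition (synthetic)
    delta_e = m * 500.0 / e_total + rng.normal(0, 0.5, e_total.size)  # Bethe-like ΔE band
    X.append(np.column_stack([delta_e, e_total]))
    y += [label] * e_total.size
X, y = np.vstack(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("identification accuracy on held-out events:", round(clf.score(X_te, y_te), 3))
```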
[Objectives] To explore brand trends in the design of waist protection products through data mining, and to provide a reference for the design concept of a waist protection pillow contour. [Methods] The structural design information of all waist protection equipment was collected from national Internet platforms, and the data were classified and compiled into a database. IBM SPSS 26.0 and MATLAB 2018a were used to analyze the data, which were tabulated in Tableau 2022.4. After the association rules were clarified, the data were imported into Cinema 4D R21 to create the concept contour of the waist protection pillow. [Results] The mean and standard deviation of the single-airbag design were the highest of all groups, with a mean of 0.511 and a standard deviation of 0.502, while the mean and standard deviation of the upper-and-lower dual-airbag design were the lowest, with a mean of 0.015 and a standard deviation of 0.120. The correlation coefficient between the single airbag and 120° arc stretching was 0.325, a positive correlation (P<0.01); the correlation coefficient between multiple airbags and 360° encircling fitting was 0.501, also a positive correlation and the highest of all groups (P<0.01). [Conclusions] The single-airbag design is well recognized by companies and has received the highest attention among all brand products. While focusing on the single-airbag design, most brands also consider adding 120° arc stretching elements; when focusing on multiple-airbag designs, some brands add 360° encircling fitting elements, and the correlation between these two features is the highest among all groups.
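The feature-association analysis described here amounts to computing means, standard deviations, and correlations between binary design-feature indicators. The sketch below does this on a synthetic product table; the column names and co-occurrence pattern are assumptions for illustration only.

```python
# Minimal sketch: association between binary design features (synthetic product table).
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n_products = 200
single_airbag = rng.binomial(1, 0.5, n_products)
# make arc stretching co-occur with the single-airbag feature more often than chance
arc_120 = (rng.random(n_products) < 0.3 + 0.3 * single_airbag).astype(int)

df = pd.DataFrame({"single_airbag": single_airbag, "arc_120": arc_120})
print(df.agg(["mean", "std"]))
print("Pearson correlation:\n", df.corr(method="pearson"))
```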
Two cold vortex weather processes in Liaoning Province in June 2006 were analyzed. In the low-vortex process of June 3, strong convective weather, such as thunderstorms and hail, occurred in most areas of Liaoning Province; white, bright cloud was shown in the satellite nephogram, and a bow echo and cyclonic circulation were shown in the weather radar products. In the low-vortex process of June 14, strong precipitation occurred in most areas of Liaoning Province. Based on the velocity field products of the weather radar, the position of the front relative to the radar station can be judged. The weather situation and forecast were the main basis of short-term prediction, and satellite nephograms, weather radar, and automatic weather stations play important roles in the monitoring and short-term prediction of disastrous weather.
This study of the general features of occurrence frequencies, spatial distribution, lifetime, and cloud patterns of polar lows over the Japan Sea and the neighboring Northwestern Pacific in the winter of 1995/1996, based on observation and satellite data, showed that polar lows develop most frequently in mid-winter over the Japan Sea (35–45°N) and the Northwestern Pacific (30–50°N); they rarely form over the Eurasian continent. Polar lows over the Northwestern Pacific are usually long-lived (2–3 days), but polar lows over the Japan Sea are relatively short-lived (1–2 days), because the east–west width of the Japan Sea is relatively narrow and polar lows tend to decay after passing over the Japan Islands. Generally speaking, polar lows over the Japan Sea are characterized by tight, spiral (or comma) cloud patterns on satellite images, typically showing a spiral cloud band with a clear "eye" at the mature stage. In winter, because of the effect of the warm Tsushima Current, the annual mean SST of the Japan Sea is 5–9°C higher than that of oceans at the same latitude; the large sea–air temperature difference sustained over the Japan Sea provides favorable conditions for polar low formation. The general features of polar lows over the Japan Sea are compared with those of other areas where polar lows often occur.
Seismic data typically contain randomly missing traces because of obstacles and economic restrictions, which influences subsequent processing and interpretation. Seismic data recovery can be expressed as a low-rank matrix approximation problem by assuming a low-rank structure for the complete seismic data in the frequency–space (f–x) domain. The nuclear norm minimization (NNM) approach (the sum of singular values) treats all singular values equally, yielding a solution that deviates from the optimum. The log-sum majorization–minimization (LSMM) approach instead uses the nonconvex log-sum function as a rank surrogate for seismic data interpolation, which is highly accurate but time-consuming. This study therefore proposes an efficient nonconvex reconstruction model based on the nonconvex Geman function (the nonconvex Geman low-rank (NCGL) model), which provides a tighter approximation of the original rank function. Without introducing additional parameters, the nonconvex problem is solved using Karush–Kuhn–Tucker condition theory. Experiments on synthetic and field data demonstrate that the proposed NCGL approach achieves a higher signal-to-noise ratio than the singular value thresholding method based on NNM and the projection-onto-convex-sets method based on the data-driven threshold model, and achieves higher reconstruction efficiency than the singular value thresholding and LSMM methods.
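To ground the comparison, the sketch below implements the baseline the paper improves on: iterative singular value soft-thresholding (the NNM approach) on a matrix with missing entries. The NCGL model replaces this uniform soft threshold with a Geman-function-based rule, which is not reproduced here; the matrix, observation ratio, and threshold are illustrative.

```python
# Minimal sketch: low-rank completion via singular value soft-thresholding (NNM baseline).
import numpy as np

rng = np.random.default_rng(7)
m, n, r = 60, 60, 4
M_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # synthetic low-rank "data"
mask = rng.random((m, n)) < 0.6                                        # 60% of entries observed

X, tau = np.zeros((m, n)), 2.0
for _ in range(200):
    X[mask] = M_true[mask]                        # enforce the observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = (U * np.maximum(s - tau, 0.0)) @ Vt       # soft-threshold the singular values

err = np.linalg.norm((X - M_true)[~mask]) / np.linalg.norm(M_true[~mask])
print("relative error on missing entries:", round(float(err), 3))
```

A nonconvex surrogate such as the Geman function shrinks large singular values less than this uniform soft threshold does, which is why it approximates the rank function more tightly.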
The very low frequency (VLF) regime below 30 MHz in the electromagnetic spectrum has been drawing global attention in radio astronomical research because of its potentially significant science outcomes, including the exploration of many unknown extragalactic sources and transients. However, the non-transparency of the Earth's ionosphere, ionospheric distortion, and artificial radio frequency interference (RFI) have made it difficult to detect VLF celestial radio emission with ground-based instruments. A straightforward solution to these problems is a space-based VLF radio telescope, such as the VLF radio instruments onboard the Chang'E-4 spacecraft, but building such a space telescope is inevitably costly and technically challenging. The alternative is a ground-based VLF radio telescope. In particular, in the period after 2020, when solar and terrestrial ionospheric activity is expected to be in a 'calm' state, there will be a good opportunity to perform VLF ground-based radio observations. Anticipating this opportunity, we built an agile VLF radio spectrum explorer co-located with the currently operational Mingantu Spectral Radioheliograph (MUSER). The instrument includes four antennas operating in the frequency range 1–70 MHz, together with eight-channel analog and digital receivers that amplify, digitize, and process the radio signals received by the antennas. This paper presents the VLF radio spectrum explorer, which will be useful for celestial studies of VLF radio emission.
To address the contradiction between depth control accuracy and energy consumption in a self-sustaining intelligent buoy, a low-energy-consumption depth control method based on historical Array for Real-time Geostrophic Oceanography (Argo) data is proposed. As known from the buoy kinematic model, the volume of the external oil sac depends only on the density and temperature of the seawater at the hovering depth. Hence, historical Argo data are used to extract fitting curves of density and temperature and to obtain the relationship between the hovering depth and the volume of the external oil sac. A genetic algorithm is used to carry out energy-optimal motion planning for the depth control process, and a specific motion strategy is obtained. Compared with a dual closed-loop fuzzy PID control method and a radial basis function (RBF)-PID method, the proposed method reduces energy consumption to 1/50 with the same accuracy. Finally, a hardware-in-the-loop simulation system was used to verify the method. When the error caused by the fitting curves is not considered, the average error is 2.62 m, the energy consumption is 3.214 × 10^4 J, and the error of the energy consumption is only 0.65%. This shows the effectiveness and reliability of the method, as well as the advantage of considering accuracy and energy consumption together.
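The underlying hover relation can be sketched as a neutral-buoyancy balance: fit seawater density against depth from profile data, then solve m = ρ(z)·(V_hull + V_oil) for the external oil sac volume at the target depth. The buoy mass, hull volume, and density profile below are illustrative assumptions, not the paper's values.

```python
# Minimal sketch: oil sac volume for neutral buoyancy from a fitted density-depth curve.
import numpy as np

# Synthetic "historical profile": density increases slightly with depth (kg/m^3)
depth = np.linspace(0.0, 2000.0, 50)
density = 1025.0 + 0.004 * depth + np.random.default_rng(8).normal(0, 0.05, depth.size)

coeffs = np.polyfit(depth, density, 2)            # fitted density-depth curve

def oil_sac_volume(target_depth, mass=60.0, hull_volume=0.0575):
    """Oil sac volume (m^3) for neutral buoyancy at target_depth (assumed buoy parameters)."""
    rho = np.polyval(coeffs, target_depth)
    return mass / rho - hull_volume

for z in (500.0, 1000.0, 1500.0):
    print(f"hover at {z:6.0f} m -> external oil sac volume ~ {oil_sac_volume(z) * 1e3:.3f} L")
```

Because the required volume follows directly from the fitted curve, the buoy can plan its pumping before descent instead of correcting continuously at depth, which is where the energy saving comes from.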
Logic regression is an adaptive regression method that searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome, thereby revealing interaction effects associated with the response. In this study, we extend logic regression to longitudinal data with a binary response and propose the "transition logic regression method" to find interactions related to the response. In this method, interaction effects over time are found with a simulated annealing algorithm, using the AIC (Akaike Information Criterion) as the score function of the model. First- and second-order Markov dependence is also allowed, to capture the correlation among successive observations of the same individual in a longitudinal binary response. The performance of the method was evaluated in a simulation study under various conditions. The proposed method was used to find interactions of SNPs and other risk factors related to low HDL over time in data on 329 participants of the longitudinal TLGS study.
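The core scoring idea can be sketched as forming Boolean combinations of binary predictors, fitting a logistic model with each combination, and comparing AIC. A brute-force search over pairwise AND/OR terms stands in below for the simulated annealing search (and ignores the Markov transition structure); the data are synthetic.

```python
# Minimal sketch: AIC scoring of Boolean terms as in logic regression (greedy stand-in).
import itertools
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n, p = 400, 5
X = rng.binomial(1, 0.5, size=(n, p))                 # binary predictors (e.g., SNP indicators)
true_term = X[:, 0] & X[:, 2]                          # the "real" interaction
y = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 2.0 * true_term))))

def aic(term, y):
    """AIC of a logistic model with one Boolean term (2 parameters: intercept + slope)."""
    model = LogisticRegression().fit(term.reshape(-1, 1), y)
    p_hat = model.predict_proba(term.reshape(-1, 1))[:, 1]
    loglik = np.sum(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))
    return 2 * 2 - 2 * loglik

best = min(((aic(op(X[:, i], X[:, j]).astype(int), y), f"X{i}{name}X{j}")
            for i, j in itertools.combinations(range(p), 2)
            for op, name in ((np.logical_and, " AND "), (np.logical_or, " OR "))),
           key=lambda t: t[0])
print("best Boolean term by AIC:", best[1], "| AIC:", round(best[0], 1))
```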
A comparison study is performed to contrast the improvements in the tropical Pacific oceanic state of a low-resolution model obtained via data assimilation with those obtained by an increase in horizontal resolution. A low-resolution model (LR; 1° lat by 2° lon) and a high-resolution model (HR; 0.5° lat by 0.5° lon) are employed for the comparison. The authors perform 20-year numerical experiments and analyze the annual mean fields of temperature and salinity. The results indicate that the low-resolution model with data assimilation performs better than the high-resolution model in estimating large-scale ocean features. From 1990 to 2000, the average RMSE (root-mean-square error) of HR relative to independent Tropical Atmosphere Ocean (TAO) project mooring data at randomly selected points is 0.97°C, compared with an RMSE of 0.56°C for LR with temperature assimilation. Moreover, LR with data assimilation is computationally cheaper. Although there is room to improve the high-resolution model, a low-resolution model with data assimilation may be an advisable choice for achieving a more realistic large-scale ocean state at the limited level of information provided by the current observational system.
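The skill metric used in the comparison is the root-mean-square error of modeled temperature against mooring observations at matching points. The sketch below shows that calculation on synthetic stand-ins for the TAO observations and the LR/HR model fields.

```python
# Minimal sketch: RMSE of model temperature against mooring observations (synthetic data).
import numpy as np

rng = np.random.default_rng(10)
obs = 26.0 + rng.normal(0, 1.0, 200)              # "observed" temperatures at mooring points (deg C)
model_hr = obs + rng.normal(0, 0.97, 200)         # HR-like errors
model_lr_assim = obs + rng.normal(0, 0.56, 200)   # LR-with-assimilation-like errors

def rmse(model, obs):
    return float(np.sqrt(np.mean((model - obs) ** 2)))

print("HR RMSE:", round(rmse(model_hr, obs), 2), "deg C")
print("LR + assimilation RMSE:", round(rmse(model_lr_assim, obs), 2), "deg C")
```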
Funding (corrosion big data study): financially supported by the Postdoctor Research Foundation of Shunde Graduate School of University of Science and Technology Beijing (No. 2022BH003).
Funding (ozone column study): funded by the National Science Foundation of China (91537213, 91837311, 41675039, 41875048).
Funding (WSN compressive data gathering study): supported by the National Natural Science Foundation of China (61307121), ABRP of Datong (2017127), and the Ph.D. Initiated Research Projects of Datong University (2013-B-17, 2015-B-05).
Funding (seismic LDMM reconstruction study): supported by the National Natural Science Foundation of China (Grants No. 41874146 and No. 42030103) and the Postgraduate Innovation Project of China University of Petroleum (East China) (No. YCX2021012).
Funding (tidal harmonic analysis study): supported by the NSFC project (No. 49906001) and the Excellent Young Teacher Award Foundation of the State Education Ministry [2000] (No. 6).
Funding (hydrothermal data logger study): supported by the International Cooperative Key Project (Grant No. 2004DFA04900) of the Ministry of Science and Technology of the PRC, and the National Natural Science Foundation of China (Grant Nos. 40637037 and 50675198).
Funding (geomagnetic observation study): supported by the Spark Program of Earthquake Science and Technology (No. XH23003C).
Funding (particle identification study): this work was supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB34030000), the National Key Research and Development Program of China (No. 2022YFA1602404), the National Natural Science Foundation (No. U1832129), and the Youth Innovation Promotion Association CAS (No. 2017309).
Funding (waist protection design study): supported by the Municipal Public Welfare Science and Technology Project of Zhoushan Science and Technology Bureau, Zhejiang Province (2021C31064).
Funding (nonconvex low-rank seismic reconstruction study): financially supported by the National Key R&D Program of China (No. 2018YFC1503705), the Science and Technology Research Project of Hubei Provincial Department of Education (No. B2017597), the Hubei Subsurface Multiscale Imaging Key Laboratory (China University of Geosciences) (No. SMIL-2018-06), and the Fundamental Research Funds for the Central Universities (No. CCNU19TS020).
Funding (VLF radio spectrum explorer study): funded by the National Natural Science Foundation of China (NSFC, Nos. 11573043, 11790305, 11433006), the National Key R&D Program of China (2018YFA0404602), the CE4 mission of the Chinese Lunar Exploration Program: the Netherlands-China Low Frequency Explorer (NCLE), and the Chinese Academy of Sciences (CAS) Strategic Priority Research Program (XDA15020200).
Funding (Argo buoy depth control study): Qingdao Entrepreneurship and Innovation Leading Researchers Program (No. 19-3-2-40-zhc), the Key Research and Development Program of Shandong Province (Nos. 2019GHY112072, 2019GHY112051), and a project supported by the State Key Laboratory of Precision Measuring Technology and Instruments (No. pilab1906).
Funding (tropical Pacific data assimilation study): this study is supported by the Key Program of the Chinese Academy of Sciences (KZCX3-SW-221) and the National Natural Science Foundation of China (Grant Nos. 40233033 and 40221503).