Journal Articles
1,903 articles found
1. Data-mining and atmospheric corrosion resistance evaluation of Sn- and Sb-additional low alloy steel based on big data technology (cited 8 times)
Authors: Xiaojia Yang, Jike Yang, Ying Yang, Qing Li, Di Xu, Xuequn Cheng, Xiaogang Li
Journal: International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2022, No. 4, pp. 825-835 (11 pages)
Abstract: Machine learning and big data are among the latest approaches in corrosion research. The biggest challenge in corrosion research is to accurately predict how materials will degrade in a given environment. Corrosion big data is the application of mathematical methods to huge amounts of data to find correlations and infer probabilities. The corrosion big data method makes it possible to distinguish the influence of minimal changes in alloying elements and small differences in microstructure on the corrosion resistance of low alloy steels. In this research, corrosion big data evaluation methods and machine learning were used to study the effect of Sb and Sn, as well as environmental factors, on the corrosion behavior of low alloy steels. The results show that the corrosion big data method can accurately identify the influence of various factors on the corrosion resistance of low alloy steels and is an effective and promising approach in corrosion research.
Keywords: machine-learning; corrosion big data; low alloy steels; corrosion resistance
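To make the workflow in the abstract above concrete, the sketch below fits a generic regression model to synthetic composition-and-environment data and inspects feature importances. The feature names (Sn/Sb content, temperature, humidity), value ranges, the synthetic target, and the random-forest choice are all illustrative assumptions, not the paper's dataset or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
# Hypothetical inputs: Sn and Sb mass fractions (wt%), plus environment.
X = np.column_stack([
    rng.uniform(0.0, 0.3, n),     # Sn content
    rng.uniform(0.0, 0.3, n),     # Sb content
    rng.uniform(5.0, 35.0, n),    # temperature (deg C)
    rng.uniform(40.0, 100.0, n),  # relative humidity (%)
])
# Synthetic corrosion loss: alloying additions reduce it, humidity increases it.
y = 50 - 60 * X[:, 0] - 40 * X[:, 1] + 0.4 * X[:, 3] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
# Feature importances hint at which factors dominate corrosion resistance.
print("importances:", model.feature_importances_.round(3))
```

On real exposure data the same pattern applies: tabular composition and environment features in, corrosion loss out, with importances indicating which factors dominate.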
2. Double cores of the Ozone Low in the vertical direction over the Asian continent in satellite data sets (cited 2 times)
Authors: Zhou Tang, Dong Guo, YuCheng Su, ChunHua Shi, ChenXi Zhang, Yu Liu, XiangDong Zheng, WenWen Xu, JianJun Xu, RenQiang Liu, WeiLiang Li
Journal: Earth and Planetary Physics (CSCD), 2019, No. 2, pp. 93-101 (9 pages)
Abstract: Using four satellite data sets (TOMS/SBUV, OMI, MLS, and HALOE), we analyze the seasonal variations of the total column ozone (TCO) and its zonal deviation (TCO*), and reveal the vertical structure of the Ozone Low over the Asian continent. Our principal findings are: (1) The TCO over the Asian continent reaches its maximum in spring and its minimum in autumn. The Ozone Low exists from May to September. (2) The Ozone Low has two negative cores, located in the lower and the upper stratosphere. The lower core is near 30 hPa in winter and 70 hPa in the other seasons. The upper core varies from 10 hPa to 1 hPa among the four seasons. (3) The position of the Ozone Low in the lower and the upper stratosphere over the Asian continent shows seasonal variability.
Keywords: ozone low; double core; Asian continent; satellite data
3. Data Gathering in Wireless Sensor Networks Via Regular Low Density Parity Check Matrix (cited 1 time)
Authors: Xiaoxia Song, Yong Li
Journal: IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2018, No. 1, pp. 83-91 (9 pages)
Abstract: A great challenge faced by wireless sensor networks (WSNs) is to reduce the energy consumption of sensor nodes. Fortunately, data gathering via random sensing can save the energy of sensor nodes. Nevertheless, its randomness and density usually result in difficult implementations, high computation complexity and large storage spaces in practical settings, so deterministic sparse sensing matrices are desired in some situations. However, it is difficult to guarantee the performance of a deterministic sensing matrix by the acknowledged metrics. In this paper, we construct a class of deterministic sparse sensing matrices with statistical versions of the restricted isometry property (StRIP) via regular low density parity check (RLDPC) matrices. The key idea of our construction is to achieve small mutual coherence of the matrices by confining the column weights of the RLDPC matrices such that StRIP is satisfied. Besides, we prove that the constructed sensing matrices have the same scale of measurement numbers as the dense measurements. We also propose a data gathering method based on the RLDPC matrix. Experimental results verify that the constructed sensing matrices have better reconstruction performance compared with the Gaussian, Bernoulli, and CSLDPC matrices, and that data gathering via the RLDPC matrix can reduce the energy consumption of WSNs.
Keywords: data gathering; regular low density parity check (RLDPC) matrix; sensing matrix; signal reconstruction; wireless sensor networks (WSNs)
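The construction described above turns on keeping the mutual coherence of a sparse, column-weight-limited binary matrix small. The sketch below computes mutual coherence for a random binary matrix with a fixed column weight and for a dense Gaussian matrix of the same size; the random binary matrix is a simplified stand-in for an RLDPC parity-check matrix, not the paper's construction, and the sizes are arbitrary.

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute normalized inner product between distinct columns."""
    A = A / np.linalg.norm(A, axis=0, keepdims=True)
    G = np.abs(A.T @ A)
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(1)
m, n, col_weight = 64, 256, 4   # measurements, signal length, ones per column

# Sparse binary matrix with a fixed column weight (a stand-in for a
# parity-check matrix with confined column weights).
A_sparse = np.zeros((m, n))
for j in range(n):
    A_sparse[rng.choice(m, size=col_weight, replace=False), j] = 1.0

A_gauss = rng.normal(size=(m, n))
print("sparse binary coherence:", round(mutual_coherence(A_sparse), 3))
print("dense Gaussian coherence:", round(mutual_coherence(A_gauss), 3))
```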
4. Seismic data reconstruction based on low dimensional manifold model (cited 1 time)
Authors: Nan-Ying Lan, Fan-Chang Zhang, Xing-Yao Yin
Journal: Petroleum Science (SCIE, CAS, CSCD), 2022, No. 2, pp. 518-533 (16 pages)
Abstract: Seismic data reconstruction is an essential and fundamental step in the seismic data processing workflow, and it is of profound significance for improving migration imaging quality, multiple suppression and seismic inversion accuracy. Regularization methods play a central role in solving the underdetermined inverse problem of seismic data reconstruction. In this paper, a novel regularization approach, the low dimensional manifold model (LDMM), is proposed for reconstructing missing seismic data. Our work relies on the fact that seismic patches always occupy a low dimensional manifold. Specifically, we exploit the dimension of the seismic patch manifold as a regularization term in the reconstruction problem, and reconstruct the missing seismic data by enforcing low dimensionality on this manifold. The crucial procedure of the proposed method is solving for the dimension of the patch manifold. To this end, we adopt an efficient dimensionality calculation method based on low-rank approximation, which provides a reliable safeguard for enforcing the constraints in the reconstruction process. Numerical experiments performed on synthetic and field seismic data demonstrate that, compared with the curvelet-based sparsity-promoting L1-norm minimization method and the multichannel singular spectrum analysis method, the proposed method obtains state-of-the-art reconstruction results.
Keywords: seismic data reconstruction; low dimensional manifold model; regularization; low-rank approximation
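The LDMM regularization above leans on a low-rank approximation to estimate the dimension of the patch manifold. The sketch below shows that generic building block: a truncated-SVD rank-k approximation and a crude singular-value-threshold dimension estimate on a synthetic patch matrix. The 5% threshold, patch sizes, and noise level are assumptions for illustration, not the paper's procedure.

```python
import numpy as np

def low_rank_approx(M, k):
    """Best rank-k approximation of M in the Frobenius norm (truncated SVD)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(2)
# Hypothetical "patch matrix": rows are vectorized seismic patches that
# approximately lie on a low dimensional manifold (rank 3 plus noise).
patches = (rng.normal(size=(200, 3)) @ rng.normal(size=(3, 64))
           + 0.01 * rng.normal(size=(200, 64)))

s = np.linalg.svd(patches, compute_uv=False)
effective_dim = int(np.sum(s > 0.05 * s[0]))   # crude dimension estimate
print("estimated patch-manifold dimension:", effective_dim)

approx = low_rank_approx(patches, effective_dim)
rel_err = np.linalg.norm(patches - approx) / np.linalg.norm(patches)
print("relative approximation error:", round(rel_err, 4))
```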
5. Tidal Analysis of High and Low Water Data
Authors: LI Peiliang, LI Lei, ZUO Juncheng, ZHAO Wei, CHEN Zongyong
Journal: Journal of Ocean University of China (SCIE, CAS), 2004, No. 1, pp. 10-16 (7 pages)
Abstract: The harmonic analysis method based on high and low water levels is discussed in this paper. In order to make full use of the information in high and low water observations (the time derivative of the water level at the observation time is zero), a weight coefficient, w, is introduced to control the importance of the part of the error formula related to this information. The major diurnal constituents, O1 and K1, and the semidiurnal constituents, N2, M2 and S2, are selected directly from the monthly data analysis, and some other important constituents, P1, ν2 and K2, are included as inferred constituents. The harmonic constants obtained for the major constituents are very close to those obtained from the analysis of hourly data, which shows that high and low water data can be used to extract tidal constants with high accuracy. The analysis results also show that the inference and the weighting coefficient are important in high and low water data analysis, and it is suggested that w ≥ 1 should be used in monthly high and low water data analysis. The method can be applied directly to altimetric data with w = 0.
Keywords: tidal analysis; high and low water data; altimetric data
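At its core, the tidal analysis above is a weighted least-squares fit of a mean level plus cosine/sine pairs for each constituent. The sketch below fits three constituents to synthetic hourly water levels with uniform weights; the constituent set, record length, and noise are assumptions, and the paper's specific treatment of high/low water times and the weight w on the zero-derivative condition is not reproduced.

```python
import numpy as np

# Angular frequencies (rad/hour) of a few major constituents.
omega = {"M2": 2*np.pi/12.4206, "S2": 2*np.pi/12.0, "K1": 2*np.pi/23.9345}

def harmonic_fit(t, h, w):
    """Weighted least-squares fit of mean level + cos/sin terms per constituent."""
    cols = [np.ones_like(t)]
    for om in omega.values():
        cols += [np.cos(om*t), np.sin(om*t)]
    A = np.column_stack(cols)
    W = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * W[:, None], h * W, rcond=None)
    return coef

rng = np.random.default_rng(3)
t = np.arange(0, 24*30, 1.0)                       # hourly samples, one month
true = 0.8*np.cos(omega["M2"]*t - 1.0) + 0.3*np.cos(omega["K1"]*t + 0.5)
h = 2.0 + true + 0.05*rng.normal(size=t.size)

coef = harmonic_fit(t, h, w=np.ones_like(t))       # a weight w would re-weight selected samples
amp_M2 = np.hypot(coef[1], coef[2])
print("recovered M2 amplitude:", round(amp_M2, 3))  # close to the true 0.8
```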
6. An Integrated Method of Data Mining and Flow Unit Identification for Typical Low Permeability Reservoir Prediction
Authors: Peng Yu
Journal: World Journal of Engineering and Technology, 2019, No. 1, pp. 122-128 (7 pages)
Abstract: With the development of oilfield exploration and mining, research on continental oil and gas reservoirs has been gradually refined, and offshore reservoir exploration has also entered the stage of intensive study of small sand bodies, small fault blocks, complex structures, low permeability and various heterogeneous geological bodies. Marine oil and gas development will therefore inevitably enter the complicated-reservoir stage, and the corresponding assessment technologies, engineering measures and exploration methods should be designed carefully. Studying the hydraulic flow units of low permeability reservoirs in offshore oilfields has practical significance for assessing connectivity and the distribution of remaining oil. An integrated method that combines data mining with flow unit identification was used for flow unit prediction in a low permeability reservoir, and the predicted results were compared with results from a mature commercial system to verify its applicability. This strategy increases accuracy by selecting the best prediction result, and an excellent computing system can provide more accurate geological information for reservoir characterization.
Keywords: low permeability reservoir; offshore oilfield; hydraulic flow unit; flow unit identification; data mining
7. Design of Low-Power Data Logger of Deep Sea for Long-Term Field Observation (cited 1 time)
Authors: 赵伟, 陈鹰, 杨灿军, 曹建伟, 顾临怡
Journal: China Ocean Engineering (SCIE, EI), 2009, No. 1, pp. 133-144 (12 pages)
Abstract: This paper describes the implementation of a data logger for the real-time in-situ monitoring of hydrothermal systems. A compact mechanical structure ensures the security and reliability of the data logger when used in the deep sea. The data logger is a battery-powered instrument which can connect chemical sensors (pH electrode, H2S electrode, H2 electrode) and temperature sensors. In order to achieve major energy savings, dynamic power management is implemented in both the hardware and the software design. The working current of the data logger is 15 μA in idle mode and 1.44 mA in active mode, which greatly extends the working time of the battery. The data logger was successfully tested in the first Sino-American Cooperative Deep Submergence Project from August 13 to September 3, 2005.
Keywords: data logger; low-power design; deep sea; long-term monitoring
8. Methods of de-noising the low frequency electromagnetic data
Authors: 王艳
Journal: Journal of Measurement Science and Instrumentation (CAS), 2012, No. 1, pp. 62-65 (4 pages)
Abstract: The quality of low frequency electromagnetic data is affected by spike and trend noise, and failure to remove the spikes and trends reduces the credibility of the data interpretation. Based on analyses of the causes and characteristics of these noises, this paper presents the results of a preset statistics stacking method (PSSM) and a piecewise linear fitting method (PLFM) for de-noising the spikes and the trends, respectively. The magnitudes of the spikes are either higher or lower than the normal values, which distorts the useful signal. Comparisons of spike removal among the averaging method, the statistics method and the PSSM indicate that only the PSSM can remove the spikes successfully. On the other hand, the spectra of the linear and nonlinear trends lie mainly in the low frequency band and can change the calculated resistivity significantly, while no influence of the trends is observed when the frequency is higher than a certain threshold. The PLFM can effectively remove both linear and nonlinear trends, with errors of around 1% in the power spectrum. The proposed methods present an effective way of de-noising the spike and trend noise in low frequency electromagnetic data and establish a basis for further research on de-noising low frequency noise.
Keywords: spike; trend; low frequency electromagnetic data; de-noising; preset statistics stacking method (PSSM); piecewise linear fitting method (PLFM)
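As a rough illustration of trend removal in the spirit of the PLFM described above, the sketch below subtracts a straight-line fit within each of several segments of a synthetic record containing a slow nonlinear drift. The segment count and data are assumptions, and the PSSM de-spiking step is not reproduced.

```python
import numpy as np

def piecewise_linear_detrend(x, n_segments):
    """Remove a trend by fitting a straight line in each of n_segments pieces.
    A simplified stand-in for a piecewise linear fitting method."""
    x = np.asarray(x, dtype=float)
    out = x.copy()
    for seg in np.array_split(np.arange(x.size), n_segments):
        t = seg - seg[0]
        a, b = np.polyfit(t, x[seg], deg=1)      # slope, intercept per segment
        out[seg] = x[seg] - (a*t + b)
    return out

rng = np.random.default_rng(4)
n = 2000
signal = np.sin(2*np.pi*0.01*np.arange(n))
trend = 0.002*np.arange(n) + 0.5*np.sin(2*np.pi*np.arange(n)/n)   # slow nonlinear drift
data = signal + trend + 0.05*rng.normal(size=n)

clean = piecewise_linear_detrend(data, n_segments=20)
print("std before/after detrend:", round(data.std(), 3), round(clean.std(), 3))
```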
9. Data Intelligent Low Power High Performance TCAM for IP-Address Lookup Table
Authors: K. Mathan, T. Ravichandran
Journal: Circuits and Systems, 2016, No. 11, pp. 3734-3745 (12 pages)
Abstract: This paper presents current research in the low-power Very Large Scale Integration (VLSI) domain. Low power has become a much sought-after research topic in the electronics industry, and power dissipation is a key consideration in VLSI chip design. Today almost all high speed switching devices include a Ternary Content Addressable Memory (TCAM) as one of their most important features. A device that consumes less power is more reliable and works more efficiently, and Complementary Metal Oxide Semiconductor (CMOS) technology is best known for low power consumption. This paper aims at designing a router application device which consumes less power and works more efficiently. Various strategies, methodologies and power management techniques for low power circuits and systems are discussed, along with the challenges that might be met while designing low power, high performance circuits. This work develops a Data-Aware AND-type match line architecture for TCAM. A 256 × 128 TCAM macro was designed using the Cadence Advanced Development Environment (ADE) with a 90 nm technology file from Taiwan Semiconductor Manufacturing Company (TSMC). The results show that the proposed Data-Aware architecture provides around 35% speed and 45% power improvement over the existing architecture.
Keywords: low power; TCAM; switching power; match line; searchline; data aware; speech processing
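The entry above concerns a hardware match-line architecture; as background, the sketch below is only a small software behavioral model of what a TCAM computes for IP-address lookup: each entry stores a value and a ternary mask, and the most specific matching entry wins. The example prefixes and actions are hypothetical.

```python
# Software model of a ternary match: each TCAM entry stores (value, mask);
# a key matches when the unmasked bits agree. The longest-prefix entry wins.
entries = [
    (0b11000000_10101000_00000001_00000000, 0xFFFFFF00, "192.168.1.0/24 -> port 2"),
    (0b11000000_10101000_00000000_00000000, 0xFFFF0000, "192.168.0.0/16 -> port 1"),
    (0x00000000,                            0x00000000, "default -> port 0"),
]

def tcam_lookup(ip):
    best = None
    for value, mask, action in entries:
        if (ip & mask) == (value & mask):
            # keep the most specific (largest mask) match
            if best is None or mask > best[0]:
                best = (mask, action)
    return best[1]

ip = (192 << 24) | (168 << 16) | (1 << 8) | 23     # 192.168.1.23
print(tcam_lookup(ip))                             # 192.168.1.0/24 -> port 2
```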
10. Design and implementation of low-cost geomagnetic field monitoring equipment for high-density deployment
Authors: Sun Lu-Qiang, Bai Xian-Fu, Kang Jian, Zeng Ning, Zhu Hong, Zhang Ming-Dong
Journal: Applied Geophysics (SCIE, CSCD), 2024, No. 3, pp. 505-512, 618 (9 pages)
Abstract: The observation of geomagnetic field variations is an important approach to studying earthquake precursors. Since 1987, the China Earthquake Administration has explored this seismomagnetic relationship, in particular by studying local magnetic field anomalies over the Chinese mainland for earthquake prediction. From years of research on the seismomagnetic relationship, earthquake prediction experts have concluded that the compressive magnetic effect, the tectonic magnetic effect, the electric magnetic fluid effect, and other factors contribute to pre-earthquake magnetic anomalies. However, these involve magnetic field changes of small magnitude, and it is difficult to relate them to the abnormally large magnetic field changes in regions with extreme earthquakes, because the high cost of professional geomagnetic equipment limits large-scale deployment and makes it difficult to capture strong magnetic field changes before an earthquake. The Tianjin Earthquake Agency has therefore developed low-cost geomagnetic field observation equipment through the Beijing–Tianjin–Hebei geomagnetic equipment test project. The new system was used to test the availability of the equipment and to derive findings based on big data.
Keywords: geomagnetic field; earthquake prediction; low cost; high density; big data
11. The study of intelligent algorithm in particle identification of heavy-ion collisions at low and intermediate energies
Authors: Gao-Yi Cheng, Qian-Min Su, Xi-Guang Cao, Guo-Qiang Zhang
Journal: Nuclear Science and Techniques (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 170-182 (13 pages)
Abstract: Traditional particle identification methods face challenges of being time-consuming, experience-dependent and poorly repeatable in heavy-ion collisions at low and intermediate energies, and researchers urgently need solutions to this dilemma. This study explores the possibility of applying intelligent learning algorithms to particle identification in heavy-ion collisions at low and intermediate energies. Multiple intelligent algorithms, including XGBoost and TabNet, were selected and tested on datasets from the neutron ion multi-detector for reaction-oriented dynamics (NIMROD-ISiS) and Geant4 simulations. Tree-based machine learning algorithms and deep learning algorithms such as TabNet show excellent performance and generalization ability. Adding data features beyond energy deposition can improve an algorithm's performance when the data distribution is nonuniform. Intelligent learning algorithms can thus be applied to solve the particle identification problem in heavy-ion collisions at low and intermediate energies.
Keywords: heavy-ion collisions at low and intermediate energies; machine learning; ensemble learning algorithm; particle identification; data imbalance
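A minimal sketch of tree-based particle identification in the spirit of the study above, using scikit-learn's gradient boosting on toy ΔE-E features. The detector response, the three species, and the classifier choice are stand-ins for the paper's NIMROD-ISiS/Geant4 data and its XGBoost/TabNet models.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

def fake_events(z, n):
    """Toy DeltaE-E style features for a particle with charge number z."""
    e_total = rng.uniform(20, 200, n)
    delta_e = z**2 * 50.0 / e_total + rng.normal(0, 0.5, n)   # Bethe-like 1/E trend
    return np.column_stack([delta_e, e_total])

# Three toy species (e.g. Z = 1, 2, 3); labels are the charge numbers.
X = np.vstack([fake_events(z, 2000) for z in (1, 2, 3)])
y = np.repeat([1, 2, 3], 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("particle-ID accuracy on toy data:", round(clf.score(X_te, y_te), 3))
```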
12. Data Mining Based Research of Development Direction of Waist Protection Equipment
Authors: Lingfeng ZHU, Zhizhen LU, Haijie YU, Haifen YING, Zheming LI, Huashan FAN
Journal: Medicinal Plant, 2024, No. 2, pp. 84-90 (7 pages)
Abstract: [Objectives] To explore brand trends in the design of waist protection products through data mining, and to provide a reference for the design concept of the contour of a waist protection pillow. [Methods] Structural design information for all waist protection equipment was collected from the national Internet platform, the data were classified, and a database was established. IBM SPSS 26.0 and MATLAB 2018a were used to analyze the data, which were tabulated in Tableau 2022.4. After the association rules were clarified, the data were imported into Cinema 4D R21 to create the concept contour of the waist protection pillow. [Results] The mean and standard deviation of the single airbag design were the highest of all groups (mean 0.511, standard deviation 0.502), while those of the upper-and-lower dual airbag design were the lowest (mean 0.015, standard deviation 0.120). The correlation coefficient between the single airbag and 120° arc stretching was 0.325, a positive correlation (P<0.01); the correlation coefficient between multiple airbags and 360° encircling fitting was 0.501, a positive correlation and the highest of all groups (P<0.01). [Conclusions] The single airbag design is well recognized by companies and has received the highest attention among all brand products. While focusing on the single airbag design, most brands also consider adding 120° arc stretching elements to product design. When focusing on multiple airbag designs, some brands consider that 360° encircling fitting elements need to be added to the product, and the correlation between the two is the highest among all groups.
Keywords: spine; low back pain; data mining; airbag; stretching; fitting; steel plate support; bidirectional compression; conceptual contour; design
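The [Results] above are pairwise correlation coefficients with significance tests. A minimal sketch of that computation on hypothetical binary design-feature indicators (not the study's database):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(8)
# Hypothetical binary design-feature indicators for 200 products (1 = present).
single_airbag = rng.integers(0, 2, 200)
arc_stretch_120 = np.where(rng.random(200) < 0.3,
                           single_airbag,              # partially co-occurs with single airbag
                           rng.integers(0, 2, 200))    # otherwise independent

r, p = pearsonr(single_airbag, arc_stretch_120)
print(f"correlation between features: r = {r:.3f}, p = {p:.3g}")
```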
13. Contrast Analysis of Two Low Vortices Weather Processes (cited 1 time)
Authors: 廖国进, 黄阁, 孟鹏
Journal: Meteorological and Environmental Research (CAS), 2010, No. 4, pp. 68-71, 91 (5 pages)
Abstract: Two cold vortex weather processes in Liaoning Province in June 2006 were analyzed. In the low vortex process of June 3, strong convective weather such as thunderstorms and hail occurred in most areas of Liaoning Province; white, bright cloud appeared in the satellite nephogram, and a bow echo and cyclonic circulation were shown in the weather radar products. In the low vortex process of June 14, strong precipitation occurred in most areas of Liaoning Province. Based on the velocity field products of the weather radar, the position of the front relative to the radar station can be judged. The weather situation and forecast were the main basis of short-term prediction, and the satellite nephogram, weather radar and automatic weather stations play important roles in the monitoring and short-term prediction of disastrous weather.
Keywords: low vortex; weather situation; satellite nephogram; weather radar; intensity field; velocity field; automatic station data; China
14. GENERAL FEATURES OF POLAR LOWS OVER THE JAPAN SEA AND THE NORTHWESTERN PACIFIC (cited 1 time)
Authors: 傅刚, 刘秦玉, 吴增茂
Journal: Chinese Journal of Oceanology and Limnology (SCIE, CAS, CSCD), 1999, No. 4, pp. 300-307, 289 (9 pages)
Abstract: This study of the general features of the occurrence frequencies, spatial distribution, lifetimes and cloud patterns of polar lows over the Japan Sea and the neighboring Northwestern Pacific in the winter of 1995/1996, based on observation and satellite data, showed that polar lows develop most frequently in mid-winter over the Japan Sea (35-45°N) and the Northwestern Pacific (30-50°N); they rarely form over the Eurasian Continent. Polar lows over the Northwestern Pacific are usually long lived (2-3 days), but polar lows over the Japan Sea are relatively short lived (1-2 days), because the east-west width of the Japan Sea is relatively narrow and polar lows tend to decay after passing over the Japan Islands. Generally speaking, polar lows over the Japan Sea are characterized by tight, spiral (or comma) cloud patterns in satellite images, and they typically have a spiral cloud band with a clear "eye" at their mature stage. In winter, because of the effect of the warm Tsushima Current, the annual mean SST of the Japan Sea is 5-9℃ higher than that of oceans at the same latitude; the large sea-air temperature difference sustained over the Japan Sea provides favorable conditions for polar low formation. The general features of polar lows over the Japan Sea are compared with those of other areas where polar lows often occur.
Keywords: polar lows; the Japan Sea; satellite data; spiral cloud band
15. Efficient seismic data reconstruction based on Geman function minimization (cited 2 times)
Authors: Li Yan-Yan, Fu Li-Hua, Cheng Wen-Ting, Niu Xiao, Zhang Wan-Juan
Journal: Applied Geophysics (SCIE, CSCD), 2022, No. 2, pp. 185-196, 307 (13 pages)
Abstract: Seismic data typically contain randomly missing traces because of obstacles and economic restrictions, which affects subsequent processing and interpretation. Seismic data recovery can be expressed as a low-rank matrix approximation problem by assuming a low-rank structure for the complete seismic data in the frequency–space (f–x) domain. The nuclear norm minimization (NNM) approach (the sum of singular values) treats all singular values equally, yielding a solution that deviates from the optimum. The log-sum majorization–minimization (LSMM) approach uses the nonconvex log-sum function as a rank surrogate for seismic data interpolation, which is highly accurate but time-consuming. This study therefore proposes an efficient nonconvex reconstruction model based on the nonconvex Geman function (the nonconvex Geman low-rank (NCGL) model), which provides a tighter approximation of the original rank function. Without introducing additional parameters, the nonconvex problem is solved using Karush–Kuhn–Tucker condition theory. Experiments on synthetic and field data demonstrate that the proposed NCGL approach achieves a higher signal-to-noise ratio than the singular value thresholding method based on NNM and the projection-onto-convex-sets method based on the data-driven threshold model, and higher reconstruction efficiency than the singular value thresholding and LSMM methods.
Keywords: seismic data reconstruction; low rank; Geman function; nonconvex; Karush–Kuhn–Tucker condition
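The contrast drawn above is between nuclear-norm soft thresholding, which shrinks every singular value by the same amount, and a nonconvex Geman penalty that leaves large singular values nearly untouched. The sketch below compares the two shrinkage rules on the singular values of a noisy low-rank matrix, using the gradient of the Geman penalty g(σ) = γσ/(σ+γ) as a reweighting factor; this is a simplified illustration under assumed τ and γ, not the paper's KKT-based NCGL solver.

```python
import numpy as np

def soft_threshold(s, tau):
    """Nuclear-norm (NNM) proximal step: shrink every singular value by tau."""
    return np.maximum(s - tau, 0.0)

def geman_weighted_threshold(s, tau, gamma):
    """Shrink each singular value by tau times the Geman penalty's gradient
    gamma**2 / (s + gamma)**2, so large singular values are barely penalized.
    A simplified reweighting step, not the full NCGL solver."""
    w = gamma**2 / (s + gamma)**2
    return np.maximum(s - tau * w, 0.0)

rng = np.random.default_rng(6)
# Low-rank (rank-5) matrix plus noise, standing in for an f-x data block.
M = rng.normal(size=(60, 5)) @ rng.normal(size=(5, 60)) + 0.3*rng.normal(size=(60, 60))

U, s, Vt = np.linalg.svd(M, full_matrices=False)
s_nnm = soft_threshold(s, tau=3.0)
s_geman = geman_weighted_threshold(s, tau=3.0, gamma=1.0)

print("largest singular value:", round(s[0], 2))
print("after NNM shrinkage:   ", round(s_nnm[0], 2))     # reduced by the full tau
print("after Geman shrinkage: ", round(s_geman[0], 2))   # nearly untouched
```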
16. An agile very low frequency radio spectrum explorer (cited 1 time)
Authors: Lin-Jie Chen, Yi-Hua Yan, Qiu-Xiang Fan, Li-Hong Geng, Susanta Kumar Bisoi
Journal: Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2021, No. 4, pp. 139-148 (10 pages)
Abstract: The very low frequency (VLF) regime below 30 MHz in the electromagnetic spectrum has been drawing global attention in radio astronomical research because of its potentially significant science outcomes, such as the exploration of many unknown extragalactic sources and transients. However, the opacity of the Earth's ionosphere, ionospheric distortion and artificial radio frequency interference (RFI) make it difficult to detect VLF celestial radio emission with ground-based instruments. A straightforward solution is a space-based VLF radio telescope, like the VLF radio instruments onboard the Chang'E-4 spacecraft, but building such a space telescope is inevitably costly and technically challenging. The alternative is a ground-based VLF radio telescope. In particular, in the period after 2020, when solar and terrestrial ionospheric activity is expected to be in a 'calm' state, there is a good opportunity to perform ground-based VLF radio observations. Anticipating this opportunity, we built an agile VLF radio spectrum explorer co-located with the currently operational Mingantu Spectral Radioheliograph (MUSER). The instrument includes four antennas operating in the VLF frequency range 1-70 MHz, together with eight-channel analog and digital receivers that amplify, digitize and process the radio signals received by the antennas. This paper presents the VLF radio spectrum explorer, which will be useful for celestial studies of VLF radio emission.
Keywords: very low frequency; instrumentation: polarimeters; methods: data analysis
17. Low energy consumption depth control method of self-sustaining intelligent buoy (cited 1 time)
Authors: ZHENG Di, XU Jiayi, LI Xingfei, LI Hongyu
Journal: Journal of Measurement Science and Instrumentation (CAS, CSCD), 2021, No. 1, pp. 74-82 (9 pages)
Abstract: To address the contradiction between depth control accuracy and energy consumption in a self-sustaining intelligent buoy, a low energy consumption depth control method based on historical Array for Real-time Geostrophic Oceanography (Argo) data is proposed. From the buoy kinematic model, the volume of the external oil sac depends only on the density and temperature of the seawater at the hovering depth. Hence, historical Argo data are used to extract fitting curves of density and temperature and to obtain the relationship between hovering depth and the volume of the external oil sac. A genetic algorithm is used to plan the depth control process for optimal energy consumption, yielding the specific motion strategy for depth control. Compared with a dual closed-loop fuzzy PID control method and a radial basis function (RBF)-PID method, the proposed method reduces energy consumption to 1/50 at the same accuracy. Finally, a hardware-in-the-loop simulation system was used to verify the method. When the error caused by the fitting curves is not considered, the average error is 2.62 m, the energy consumption is 3.214×10^4 J, and the error in energy consumption is only 0.65%. This demonstrates the effectiveness and reliability of the method, as well as the advantage of considering accuracy and energy consumption together.
Keywords: self-sustaining intelligent buoy; low energy consumption; depth control; Argo data; genetic algorithm; hardware-in-the-loop simulation system
18. Transition Logic Regression Method to Identify Interactions in Binary Longitudinal Data (cited 1 time)
Authors: Parvin Sarbakhsh, Yadollah Mehrabi, Jeanine J. Houwing-Duistermaat, Farid Zayeri, Maryam Sadat Daneshpour
Journal: Open Journal of Statistics, 2016, No. 3, pp. 469-481 (13 pages)
Abstract: Logic regression is an adaptive regression method which searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome, and thus reveals interaction effects associated with the response. In this study, we extended logic regression to longitudinal data with a binary response and proposed the "Transition Logic Regression Method" to find interactions related to the response. In this method, interaction effects over time are found by an annealing algorithm with AIC (Akaike Information Criterion) as the score function of the model. First- and second-order Markov dependence is allowed, to capture the correlation among successive observations of the same individual in the longitudinal binary response. The performance of the method was evaluated by simulation under various conditions. The proposed method was used to find interactions of SNPs and other risk factors related to low HDL over time in data on 329 participants of the longitudinal TLGS study.
Keywords: logic regression; longitudinal data; transition model; interaction; TLGS study; low HDL; SNP
19. An Improved Subthreshold SRAM Circuit with a Data-Aware Structure
Authors: 黄海超, 陈昕, 金威, 何卫锋
Journal: 微电子学与计算机 (Microelectronics & Computer; CSCD, Peking University Core), 2015, No. 9, pp. 28-32 (5 pages)
Abstract: To address the power wasted by row half-selection during read operations in conventional data-aware SRAMs, an improved data-aware 9T SRAM circuit is proposed. Compared with a conventional SRAM, the structure uses a cross-point read access scheme to eliminate the bitline power wasted in the selected row by the read paths of half-selected cells. Experimental data show that the proposed SRAM circuit reduces the power consumed on the bitlines by up to 51.4%. A 16 kb SRAM test circuit was designed in a 0.13 μm process; it operates at a supply voltage of 420 mV with an average power consumption of 5.37 μW.
Keywords: subthreshold; low power
20. A Comparison Study of Tropical Pacific Ocean State Estimation: Low-Resolution Assimilation vs. High-Resolution Simulation (cited 5 times)
Authors: 符伟伟, 朱江, 周广庆, 王会军
Journal: Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2005, No. 2, pp. 212-219 (8 pages)
Abstract: A comparison study is performed to contrast the improvements in the tropical Pacific oceanic state of a low-resolution model obtained via data assimilation with those obtained by increasing the horizontal resolution. A low-resolution model (LR, 1° lat by 2° lon) and a high-resolution model (HR, 0.5° lat by 0.5° lon) are employed for the comparison. The authors perform 20-year numerical experiments and analyze the annual mean fields of temperature and salinity. The results indicate that the low-resolution model with data assimilation estimates large-scale ocean features better than the high-resolution model. From 1990 to 2000, the average RMSE (root-mean-square error) of HR relative to independent Tropical Atmosphere Ocean project (TAO) mooring data at randomly selected points is 0.97℃, compared to an RMSE of 0.56℃ for LR with temperature assimilation. Moreover, LR with data assimilation is more frugal in computation. Although there is room to improve the high-resolution model, the low-resolution model with data assimilation may be an advisable choice for achieving a more realistic large-scale ocean state at the limited level of information provided by the current observational system.
Keywords: comparison study; high-resolution model; data assimilation; low-resolution model
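The comparison above is scored by RMSE against independent TAO mooring temperatures. The sketch below just shows that metric on synthetic numbers whose noise levels echo the reported 0.97℃ and 0.56℃ errors; the data are hypothetical, not model output.

```python
import numpy as np

def rmse(model_temp, obs_temp):
    """Root-mean-square error of model temperature against mooring observations."""
    model_temp, obs_temp = np.asarray(model_temp), np.asarray(obs_temp)
    return float(np.sqrt(np.mean((model_temp - obs_temp) ** 2)))

rng = np.random.default_rng(7)
# Hypothetical TAO-like observations at randomly selected points, plus two
# model estimates with different error levels (illustrative numbers only).
obs = 25.0 + rng.normal(0, 1.0, 300)
hr_model = obs + rng.normal(0, 0.97, 300)        # high-resolution run, no assimilation
lr_assim = obs + rng.normal(0, 0.56, 300)        # low-resolution run with assimilation

print("HR RMSE   :", round(rmse(hr_model, obs), 2))
print("LR+DA RMSE:", round(rmse(lr_assim, obs), 2))
```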