Reconstruction of plasma equilibrium plays an important role in the analysis and simulation of plasma experiments. Kinetic equilibrium reconstruction with pressure and edge current constraints has been employed on the EAST tokamak. However, the internal safety factor (q) profile is not accurate. This paper proposes a new way of incorporating q-profile constraints into kinetic equilibrium reconstruction. The q profile is obtained from the Polarimeter Interferometer (POINT) reconstruction. Virtual probes carrying the q-profile constraint information are added to the inputs of the kinetic equilibrium reconstruction program to obtain the final equilibrium. The new equilibrium produces a more accurate internal q profile. This improved method should help in analyzing EAST experiments.
Vast amounts of heterogeneous marine observation data have been accumulated due to the rapid development of ocean observation technology. Several state-of-the-art methods have been proposed to manage the emerging Internet of Things (IoT) sensor data. However, an inefficient data management strategy during storage can lead to missing metadata; as a result, part of the sensor data cannot be indexed and utilized (a 'data swamp'). Researchers have focused on optimizing storage procedures to prevent such disasters, but few have attempted to restore the missing metadata. In this study, we propose an AI-based algorithm to reconstruct the metadata of heterogeneous marine data in data swamps. First, a MapReduce algorithm is proposed to preprocess raw marine data and extract its feature tensors in parallel. Second, the feature tensors are loaded into a machine learning algorithm and a clustering operation is performed; the similarities between incoming data and the trained clustering results are also calculated. Finally, metadata reconstruction is performed based on the processing results of existing marine observation data. Experiments designed on existing datasets from ocean observing systems verify the effectiveness of the algorithms. The results demonstrate the excellent performance of the proposed algorithm for metadata reconstruction of heterogeneous marine observation data.
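The abstract does not name the clustering algorithm applied to the feature tensors; as a hedged sketch, a minimal k-means in NumPy can stand in for the clustering step, together with an inverse-distance similarity between an incoming feature vector and the trained cluster centroids (the function names and parameters below are illustrative, not from the paper):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal k-means: cluster feature vectors, return labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each feature vector to its nearest centroid
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

def similarity_to_clusters(x, centroids):
    """Similarity of an incoming feature vector to each trained cluster
    (inverse-distance weights, normalized to sum to 1)."""
    d = np.linalg.norm(centroids - x, axis=1)
    s = 1.0 / (d + 1e-9)
    return s / s.sum()
```

Each feature tensor would be flattened into a vector before clustering; the normalized similarity vector can then drive the metadata reconstruction decision for incoming data.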
At present, the acquisition of seismic data is developing toward high-precision and high-density methods. However, complex natural environments and cultural factors in many exploration areas make uniform, dense acquisition difficult, so complete seismic data collection is often impossible. Therefore, data reconstruction is required during processing to ensure imaging accuracy. Deep learning, a rapidly developing field, presents clear advantages in feature extraction and modeling. In this study, a convolutional neural network deep learning algorithm is applied to seismic data reconstruction. Based on the convolutional neural network and the characteristics of seismic data acquisition, two training strategies, supervised and unsupervised, are designed to reconstruct sparsely acquired seismic records. First, a supervised learning strategy is proposed for labeled data, wherein the complete seismic data are segmented as the input of the training set and are randomly sampled before each training pass, thereby increasing the number of samples and the richness of features. Second, an unsupervised learning strategy based on large samples is proposed for unlabeled data, and a rolling segmentation method is used to update (pseudo) labels and training parameters during training. Reconstruction tests on simulated and actual data show that the convolutional neural network-based deep learning algorithm achieves better reconstruction quality and higher accuracy than compressed sensing based on the curvelet transform.
Certain deterministic nonlinear systems may show chaotic behavior. We consider the notion of qualitative information and the practicalities of extracting it from chaotic experimental data. Our approach, based on a theorem of Takens, draws on ideas from the generalized theory of information known as singular system analysis. We illustrate the technique with numerical data from the chaotic region of a dynamical system. The singular-value decomposition is used to calculate the eigenvalues of the embedding-space matrix. A concrete algorithm to calculate the eigenvectors and obtain a basis of the embedding vector space is proposed in this paper. The projection of the time-series data onto the orthogonal basis generated by the eigenvectors is also provided, together with a concrete example. The state-space reconstruction of different kinds of chaotic data obtained from dynamical systems is also discussed in detail.
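The procedure described — embed the time series via Takens' theorem, form the embedding matrix, compute its eigenvalues by singular-value decomposition, and project the trajectory onto the eigenvector basis — can be sketched in NumPy. The logistic map serves here as a stand-in chaotic series (an assumption; the paper's data source is not reproduced):

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Trajectory (embedding) matrix: rows are delay vectors of the series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

# stand-in chaotic series: logistic map in its chaotic regime (r = 4)
x = np.empty(2000)
x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

X = delay_embed(x - x.mean(), dim=7)
# singular system analysis: the SVD of the embedding matrix yields the
# eigenvalues (squared singular values, up to 1/N) and an orthonormal
# eigenvector basis of the embedding space
U, s, Vt = np.linalg.svd(X, full_matrices=False)
eigvals = s ** 2 / len(X)
proj = X @ Vt.T   # projection of the trajectory onto the orthogonal basis
```

The decay of `eigvals` indicates how many directions of the embedding space carry signal rather than noise, which is the basis for choosing the reconstruction subspace.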
Accurate reconstruction from a reduced data set is highly essential for computed tomography in fast and/or low-dose imaging applications. Conventional total variation (TV)-based algorithms apply L1 norm-based penalties, which are not as efficient as Lp (0<p<1) quasi-norm-based penalties. TV with a p-th power-based norm can serve as a feasible alternative to conventional TV, referred to as total p-variation (TpV). This paper proposes a TpV-based reconstruction model and develops an efficient algorithm. The total p-variation and the Kullback-Leibler (KL) data divergence, which has better noise-suppression capability than the often-used quadratic term, are combined to build the reconstruction model. The proposed algorithm is derived by the alternating direction method (ADM), which offers a stable, efficient, and easily coded implementation. We apply the proposed method to reconstructions from very few views of projections (7 views evenly acquired within 180°). The images reconstructed by the new method show clearer edges and higher numerical accuracy than the conventional TV method. Both the simulations and real CT data experiments indicate that the proposed method may be promising for practical applications.
We use Radial Basis Functions (RBFs) to reconstruct smooth surfaces from 3D scattered data. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. We propose improvements on the methods of surface reconstruction with radial basis functions. A sparse approximation set of the scattered data is constructed by reducing the number of interpolating points on the surface. We present an adaptive method for finding the off-surface normal points. The order of the system of equations decreases greatly as the number of off-surface constraints is gradually reduced. Experimental results illustrate that the proposed method is robust and produces visually pleasing surfaces.
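As a hedged illustration of the implicit-RBF idea (value zero on the surface, signed values at off-surface normal points), the sketch below fits a triharmonic-kernel interpolant to points on a unit sphere; the kernel choice, the regularizer, and the point construction are assumptions, not details from the paper:

```python
import numpy as np

def fit_rbf(centers, values, eps=1e-8):
    """Fit s(x) = sum_i w_i * phi(|x - c_i|) with phi(r) = r^3
    (triharmonic kernel); eps is a tiny regularizer for stability."""
    r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    return np.linalg.solve(r ** 3 + eps * np.eye(len(centers)), values)

def eval_rbf(x, centers, w):
    return (np.linalg.norm(centers - x, axis=1) ** 3) @ w

# toy data: quasi-uniform (Fibonacci) points on the unit sphere with value 0,
# plus off-surface points along the outward normals with value = offset d
n, d = 40, 0.1
idx = np.arange(n) + 0.5
polar = np.arccos(1.0 - 2.0 * idx / n)
azim = np.pi * (1.0 + 5 ** 0.5) * idx
p = np.column_stack([np.sin(polar) * np.cos(azim),
                     np.sin(polar) * np.sin(azim),
                     np.cos(polar)])
centers = np.vstack([p, (1.0 + d) * p])
values = np.concatenate([np.zeros(n), d * np.ones(n)])
w = fit_rbf(centers, values)
# the reconstructed surface is the zero set {x : s(x) = 0}
```

Reducing the number of interpolating centers and off-surface constraints, as the paper proposes, directly shrinks the linear system solved in `fit_rbf`.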
Oil and gas seismic exploration has to adopt irregular acquisition geometries to adapt to increasingly complex geological conditions and environments. However, irregular seismic acquisition leaves gaps in the acquired data, which require high-precision regularization. In this paper, the sparsity of the signal in a transform domain, as used in compressed sensing theory, is exploited to recover the missing signal, involving sparse transform-base optimization and threshold modeling. First, this paper analyzes and compares the effects of six sparse transform bases on the reconstruction accuracy and efficiency of irregular seismic data and establishes the quantitative relationship between the sparse transform and reconstruction accuracy and efficiency. Second, an adaptive threshold modeling method based on the sparse coefficients is provided to improve the reconstruction accuracy. Test results show that the method adapts well to different seismic data and sparse transform bases. An f-x domain reconstruction method using effective frequency samples is studied to address the problem of low computational efficiency. A parallel computing strategy for the curvelet transform combined with OpenMP is further proposed, which substantially improves the computational efficiency while preserving the reconstruction accuracy. Finally, actual acquisition data are used to verify the proposed method. The results indicate that the proposed strategy can solve the regularization problem of irregular seismic data in production and improve the imaging quality of the target layer economically and efficiently.
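The combination of transform-domain sparsity with an iteratively decayed threshold can be illustrated with a POCS-style loop in NumPy; the 2-D FFT stands in for the paper's sparse transform bases (curvelets, etc.), and the exponential threshold decay is one common adaptive schedule, not the paper's exact model:

```python
import numpy as np

def pocs_reconstruct(d_obs, mask, n_iter=100):
    """POCS-style sketch: FFT -> amplitude threshold -> inverse FFT ->
    reinsert acquired traces, with an exponentially decaying threshold."""
    d = d_obs.copy()
    fmax = np.abs(np.fft.fft2(d_obs)).max()
    for k in range(n_iter):
        D = np.fft.fft2(d)
        tau = 0.5 * fmax * (0.99 ** k)   # illustrative decay schedule
        D[np.abs(D) < tau] = 0           # keep only significant coefficients
        d = np.real(np.fft.ifft2(D))
        d[mask] = d_obs[mask]            # keep the acquired samples fixed
    return d
```

Each pass thresholds the spectrum, inverts, and reinserts the acquired traces, so the missing traces converge toward values consistent with both the data and the sparsity assumption.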
We show a quantitative technique, characterized by low numerical mediation, for the reconstruction of temporal sequences of geophysical data of length L interrupted for a time ΔT. The aim is to protect the information acquired before and after the interruption by means of a numerical protocol with the lowest possible computational weight. The signal reconstruction process is based on the synthesis of the low-frequency signal extracted by subsampling (subsampling ∇Dirac = ΔT, in phase with ΔT) with the high-frequency signal recorded before the crash. The SYRec (SYnthetic REConstruction) method, for its simplicity, speed of calculation, and spectral-response stability, is particularly effective in studies of high-speed transient phenomena that develop in strongly perturbed fields. This operative condition is fundamental when almost immediate informational responses are required of the observation system. In this example we deal with geomagnetic data coming from an underwater counter-intrusion magnetic system. The system produces on-time information about the transit of local magnetic singularities (magnetic perturbations with low spatial extension), originated by quasi-point-form kinematic sources (divers), in harbor underwater magnetic fields. The stability of the SYRec system also makes it usable over long and medium observation periods (activity of geomagnetic observatories).
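A hedged sketch of the low-frequency/high-frequency synthesis: the low-frequency trend is interpolated across the interruption, and the high-frequency residual recorded just before the crash is replayed in phase across the gap. The moving-average split and the parameter `win` are illustrative assumptions; the paper's subsampling protocol is not reproduced:

```python
import numpy as np

def syrec_fill(x, gap_start, gap_len, win=20):
    """Fill x[gap_start:gap_start+gap_len] by summing a low-frequency trend
    interpolated across the gap with a pre-gap high-frequency residual."""
    y = x.copy()
    pre = x[:gap_start]                              # recorded before the crash
    lf_pre = np.convolve(pre, np.ones(win) / win, mode="same")
    hf_pre = pre - lf_pre                            # high-frequency residual
    lf_post = x[gap_start + gap_len:][:win].mean()   # low-frequency level after
    t = (np.arange(gap_len) + 1.0) / (gap_len + 1.0)
    trend = (1.0 - t) * lf_pre[-win] + t * lf_post   # linear trend across gap
    # replay a pre-gap residual segment, kept clear of the filter's edge
    # region; this assumes the segment is in phase with the gap
    y[gap_start:gap_start + gap_len] = trend + hf_pre[-(gap_len + win):-win]
    return y
```

The replay step is where the "in phase with ΔT" condition of the paper matters: the recycled residual must line up with the oscillation that would have been recorded during the interruption.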
In encoding and decoding, erasure codes over binary fields are widely used in information technology because they need only AND and XOR operations and therefore have high computational efficiency. A matrix decoding method is proposed in this paper. The method is a universal data reconstruction scheme for erasure codes over binary fields. Besides pre-judging whether errors can be recovered, the method can rebuild sectors of lost data on a fault-tolerant storage system constructed with erasure codes after disk errors. The data reconstruction process of the new method has simple and clear steps, so it is easy to implement in computer code. Moreover, it can easily be applied to non-binary fields, so the method is expected to find extensive application in the future.
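The simplest member of this family — a single-parity erasure code over GF(2), where both encoding and decoding reduce to XOR — can be sketched in a few lines (the paper's matrix decoding method generalizes this; the sketch only illustrates XOR-based recovery):

```python
import functools
import operator

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks (arithmetic over GF(2))."""
    return bytes(functools.reduce(operator.xor, t) for t in zip(*blocks))

# encode: the parity block is the XOR of all data blocks
data = [b"abcd", b"efgh", b"ijkl"]
parity = xor_blocks(data)

# decode: any single lost data block is the XOR of the parity and survivors
lost_index = 1
survivors = [b for i, b in enumerate(data) if i != lost_index]
recovered = xor_blocks(survivors + [parity])
```

XORing the parity with all data blocks yields zero, which is the invariant the decoder exploits: the missing block is whatever restores that invariant.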
Seismic data reconstruction is an essential and fundamental step in the seismic data processing workflow, and it is of profound significance for improving migration imaging quality, multiple-suppression effectiveness, and seismic inversion accuracy. Regularization methods play a central role in solving the underdetermined inverse problem of seismic data reconstruction. In this paper, a novel regularization approach, the low dimensional manifold model (LDMM), is proposed for reconstructing missing seismic data. Our work relies on the fact that seismic patches always occupy a low dimensional manifold. Specifically, we exploit the dimension of the seismic patch manifold as a regularization term in the reconstruction problem and reconstruct the missing seismic data by enforcing low dimensionality on this manifold. The crucial procedure of the proposed method is solving for the dimension of the patch manifold. To this end, we adopt an efficient dimensionality calculation method based on low-rank approximation, which provides a reliable safeguard for enforcing the constraints in the reconstruction process. Numerical experiments on synthetic and field seismic data demonstrate that, compared with the curvelet-based sparsity-promoting L1-norm minimization method and the multichannel singular spectrum analysis method, the proposed method obtains state-of-the-art reconstruction results.
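The low-rank dimensionality calculation can be illustrated by stacking seismic patches into a matrix and counting its significant singular values; the threshold `tol` and the synthetic linear-gradient section below are illustrative assumptions, not the paper's exact estimator:

```python
import numpy as np

def patch_matrix(img, p):
    """Stack all p-by-p patches of a 2-D section as rows of a matrix."""
    H, W = img.shape
    return np.array([img[i:i + p, j:j + p].ravel()
                     for i in range(H - p + 1) for j in range(W - p + 1)])

def effective_dimension(P, tol=1e-6):
    """Numerical rank via low-rank approximation: count the singular
    values above tol times the largest one."""
    s = np.linalg.svd(P, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# a linear-gradient section: every patch is an affine function, so the
# patch set spans only two directions (constant offset + fixed gradient)
img = np.fromfunction(lambda i, j: 0.5 * i + 0.3 * j, (20, 20))
P = patch_matrix(img, 4)
```

In the LDMM setting this estimated dimension becomes the regularization term: reconstructions whose patch matrices need many singular values are penalized.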
3D image reconstruction of weather radar data can not only help forecasters improve forecast efficiency and accuracy but also help people understand weather conditions easily and quickly. The Marching Cubes (MC) surface-rendering algorithm is well suited to 3D reconstruction from slice images; it shortens the time needed to find and calculate the isosurface from raw volume data and reflects the shape structure more accurately. In this paper, we discuss a method to reconstruct 3D weather cloud images using the proposed Cube Weighting Interpolation (CWI) and the MC algorithm. First, we detail the steps of CWI and apply it to project the raw radar data into cubes and obtain equally spaced cloud slice images; we then employ the MC algorithm to draw the isosurface. Experiments show that our method is effective and simple to operate, and it may provide an intuitive and useful reference for 3D surface reconstruction and stereoscopic visualization of meteorological images.
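The exact CWI formulation is not given in the abstract; the sketch below illustrates the general idea of projecting scattered radar samples onto a regular cube grid with inverse-distance weights (the radius and weighting function are assumptions):

```python
import numpy as np

def cube_weight_interpolate(points, values, grid_shape, radius=1.5):
    """Accumulate scattered samples into nearby voxels with
    inverse-distance weights, then normalize by the weight sums."""
    grid = np.zeros(grid_shape)
    wsum = np.zeros(grid_shape)
    voxels = np.indices(grid_shape).reshape(3, -1).T   # all voxel centers
    for pt, v in zip(points, values):
        dist = np.linalg.norm(voxels - pt, axis=1)
        near = dist < radius                           # voxels inside the radius
        w = 1.0 / (dist[near] + 1e-6)
        for (i, j, k), wi in zip(voxels[near], w):
            grid[i, j, k] += wi * v
            wsum[i, j, k] += wi
    filled = wsum > 0
    grid[filled] /= wsum[filled]
    return grid
```

The resulting regular volume can be sliced into equally spaced images, after which Marching Cubes extracts the isosurface from it.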
Rhododendron is famous for its high ornamental value. However, the genus is taxonomically difficult, and the relationships within Rhododendron remain unresolved. In addition, the origin of key morphological characters with high horticultural value needs to be explored. Both problems largely hinder the utilization of germplasm resources. Most studies have attempted to disentangle the phylogeny of Rhododendron but used only a few genomic markers and lacked large-scale sampling, resulting in low clade support and contradictory phylogenetic signals. Here, we used restriction-site associated DNA sequencing (RAD-seq) data and morphological traits for 144 species of Rhododendron, representing all subgenera and most sections and subsections of this species-rich genus, to decipher its intricate evolutionary history and reconstruct ancestral states. Our results revealed high resolution at the subgenus and section levels of Rhododendron based on RAD-seq data. Both the optimal phylogenetic tree and the split tree recovered five lineages within Rhododendron. Subg. Therorhodion (clade I) formed the basal lineage. Subg. Tsutsusi and Azaleastrum formed clade II and had sister relationships. Clade III included all scaly rhododendron species. Subg. Pentanthera (clade IV) formed a sister group to Subg. Hymenanthes (clade V). The ancestral state reconstruction showed that the Rhododendron ancestor was a deciduous woody plant with terminal inflorescences, ten stamens, leaf blades without scales, and a broadly funnelform corolla of pink or purple color. This study shows significant power to resolve the evolutionary history of Rhododendron based on the high clade support of the phylogenetic tree constructed from RAD-seq data. It also provides an example of resolving discordant signals in phylogenetic trees and demonstrates the feasibility of applying RAD-seq with large amounts of missing data to decipher intricate evolutionary relationships. Additionally, the reconstructed ancestral states of six important characters provide insights into the innovation of key characters in Rhododendron.
We all live on one planet, and geology has no borders. Countries that reside on different continents share the same architecture beneath the surface; they were once neighbors with common foundations. Interoperable geological data are now freely available to everyone for the benefit of society, demonstrating that geoscience can address both global and regional problems, whilst increasingly large datasets ("Big Data") provide clear opportunities (e.g., Spina, 2018).
The quality of real-time traffic information is of great importance; therefore, the factors affecting traffic characteristics are analyzed in general, and the necessities of real-time data processing are summarized. The identification and reconstruction of real-time traffic data are analyzed using the Kalman filter equations and a statistical approach. Four methods of ITS (Intelligent Transportation System) detector data screening in traffic management systems are discussed in detail. Traffic data examinations are also discussed, with solutions formulated through analysis, and recommendations are made for information collection and data management in the future.
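A scalar Kalman filter with an innovation gate is one standard way to implement the identification (screening) and reconstruction of detector data described above; the random-walk state model and the parameters `q`, `r`, and `gate` are illustrative, not taken from the paper:

```python
import numpy as np

def kalman_screen(z, q=1e-3, r=0.5, gate=3.0):
    """Scalar Kalman filter: flag measurements whose innovation exceeds
    `gate` standard deviations and replace them with the prediction."""
    x, p = z[0], 1.0
    out, flags = [x], [False]
    for zk in z[1:]:
        p = p + q                       # predict (random-walk state model)
        s = p + r                       # innovation variance
        nu = zk - x                     # innovation
        if nu * nu > gate * gate * s:   # outlier: reject, keep prediction
            out.append(x)
            flags.append(True)
            continue
        k = p / s                       # Kalman gain
        x = x + k * nu
        p = (1.0 - k) * p
        out.append(x)
        flags.append(False)
    return np.array(out), np.array(flags)
```

Flagged samples are treated as faulty detector readings and reconstructed from the filter prediction; unflagged samples update the state normally.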
Background: A task assigned to space exploration satellites involves detecting the physical environment within a certain space. However, space detection data are complex and abstract, and are not conducive to researchers' visual perception of the evolution and interaction of events in the space environment. Methods: A time-series dynamic data sampling method for large-scale space was proposed to sample detection data in space and time, and the corresponding relationships between data location features and other attribute features were established. A tone-mapping method based on statistical histogram equalization was proposed and applied to the final attribute feature data. The visualization process was optimized for rendering by merging materials, reducing the number of patches, and performing other operations. Results: Sampling, feature extraction, and uniform visualization were achieved for detection data of complex types, long duration spans, and uneven spatial distributions. The real-time visualization of large-scale spatial structures using augmented reality devices, particularly low-performance devices, was also investigated. Conclusions: The proposed visualization system can reconstruct the three-dimensional structure of a large-scale space, express the structure and changes of the spatial environment using augmented reality, and assist in intuitively discovering spatial environmental events and evolutionary rules.
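The tone-mapping step based on statistical histogram equalization can be sketched directly: map each attribute value through the empirical CDF so that the display range is used evenly (the bin count is an illustrative parameter):

```python
import numpy as np

def equalize(values, bins=256):
    """Histogram-equalization tone mapping: send each value through the
    empirical CDF so outputs spread evenly over [0, 1]."""
    hist, edges = np.histogram(values, bins=bins)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    # locate each value's bin (interior edges only), then look up its CDF
    idx = np.clip(np.digitize(values, edges[1:-1]), 0, bins - 1)
    return cdf[idx]
```

Skewed attribute distributions (common in space detection data) would otherwise crowd into a narrow band of the color map; after equalization, equal fractions of the data occupy equal fractions of the display range.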
To reconstruct the missing data of total electron content (TEC) observations, a new method is proposed based on empirical orthogonal function (EOF) decomposition and the eigenvalues themselves. It is a self-adaptive EOF decomposition that needs no prior information, and the error of the reconstructed data can be estimated. An interval-quartering algorithm and a cross-validation algorithm are used to compute the optimal number of EOFs for reconstruction; the interval-quartering algorithm reduces the computation time. Application of the data interpolating empirical orthogonal functions (DINEOF) method to real data demonstrates that it can reconstruct the TEC map with high accuracy, so it can be employed in a real-time system in future work.
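The DINEOF idea — initialize the gaps, then alternate a truncated EOF (SVD) reconstruction with refreshing the missing entries — can be sketched as follows; selecting the optimal number of EOFs by interval quartering and cross-validation is omitted, and `n_eof` is fixed by hand here:

```python
import numpy as np

def dineof_fill(X, mask, n_eof=1, n_iter=200):
    """Iterative EOF gap filling: initialize missing entries (mask=True)
    with the mean of the observed data, then repeatedly truncate the SVD
    to n_eof modes and refresh the missing entries from the reconstruction."""
    Y = X.copy()
    Y[mask] = X[~mask].mean()
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        recon = (U[:, :n_eof] * s[:n_eof]) @ Vt[:n_eof]
        Y[mask] = recon[mask]
    return Y
```

When the underlying field is well captured by a few EOFs, the iteration converges to a reconstruction consistent with both the observed entries and the low-mode structure.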
There are only two quantitative tools for Precambrian paleogeographic reconstructions: paleomagnetic data and dyke swarm geometries. Paleomagnetic data provide information about paleolatitudes and orientation of rigid
Based on compressive sensing, a novel algorithm is proposed to solve the reconstruction problem under sparsity assumptions. Instead of estimating the reconstructed data by minimizing an objective function, the authors parameterize the problem as a linear combination of a few elementary thresholding functions, which can be solved by calculating the linear weighting coefficients. The thresholding functions are updated during the iteration process. The advantage of this method is that, at each step, the optimization problem only requires calculating linear coefficients. With elementary thresholding functions satisfying certain constraints, global convergence of the iterative algorithm is guaranteed. Results on synthetic and field data prove the effectiveness of the proposed algorithm.
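The paper's elementary thresholding functions and their linear-coefficient update are not reproduced here; as a hedged stand-in, classical iterative soft thresholding (IST) illustrates the threshold-iteration structure that such methods build on:

```python
import numpy as np

def soft(x, t):
    """Elementary soft-thresholding function."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ist_reconstruct(A, y, lam=0.05, step=None, n_iter=300):
    """Iterative soft thresholding for y = A x with sparse x:
    gradient step on the data misfit followed by shrinkage."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x + step * A.T @ (y - A @ x), lam * step)
    return x
```

For an orthonormal `A` the iteration reaches its fixed point, elementwise shrinkage of `y`, after a single step, which makes the behavior easy to verify.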
A new B-spline surface reconstruction method from layer data, based on a deformable model, is presented. An initial deformable surface, represented as a closed cylinder, is first given. The surface is subject to internal forces describing its implicit smoothness property and external forces attracting it toward the layer data points. The finite element method is then adopted to solve the energy minimization problem, which results in a bicubic closed B-spline surface with C^2 continuity. The proposed method can provide a smooth and accurate surface model directly from the layer data, without the need to fit cross-sectional curves and make them compatible. The feasibility of the proposed method is verified by experimental results.
In this paper, on the basis of experimental data from two kinds of chemical explosions, the piston-pushing model of spherical blast waves and a second-order Godunov-type finite difference scheme with high discontinuity resolution are used for the numerical reconstruction of part of an actual hemispherical blast-wave flow field by properly adjusting the moving boundary conditions of a piston. This method is simple and reliable. It is suitable for evaluating the effects of the blast-wave flow field away from the explosion center.
Funding: supported by the National Key R&D Program of China (Nos. 2019YFE03040004 and 2017YFE0300404) and by the Comprehensive Research Facility for Fusion Technology Program of China (No. 2018000052-73-01-001228).
Funding: supported by the Shandong Province Natural Science Foundation (No. ZR2020QF028).
Funding: this study was supported by the National Natural Science Foundation of China under the project 'Research on the Dynamic Location of Receiver Points and Wave Field Separation Technology Based on Deep Learning in OBN Seismic Exploration' (No. 42074140).
Funding: supported by the National Natural Science Foundation of China (19672043).
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 61372172 and 61601518).
Funding: Supported by the National Science and Technology Major Project (No. 2016ZX05024001003), the Innovation Consortium Project of China Petroleum, and Southwest Petroleum University (No. 2020CX010201).
Abstract: Oil and gas seismic exploration has to adopt irregular seismic acquisition to adapt to increasingly complex geological conditions and environments. However, irregular seismic acquisition is accompanied by missing acquisition data, which requires high-precision regularization. In this paper, the transform-domain sparsity of seismic signals from compressed sensing theory is used to recover the missing signal, involving sparse transform base optimization and threshold modeling. First, this paper analyzes and compares the effects of six sparse transform bases on the reconstruction accuracy and efficiency of irregular seismic data and establishes the quantitative relationship between the sparse transform and reconstruction accuracy and efficiency. Second, an adaptive threshold modeling method based on sparse coefficients is provided to improve the reconstruction accuracy. Test results show that the method adapts well to different seismic data and sparse transform bases. The f-x domain reconstruction method of effective frequency samples is studied to address the problem of low computational efficiency. A parallel computing strategy combining the curvelet transform with OpenMP is further proposed, which substantially improves the computational efficiency while ensuring the reconstruction accuracy. Finally, actual acquisition data are used to verify the proposed method. The results indicate that the proposed strategy can solve the regularization problem of irregular seismic data in production and improve the imaging quality of the target layer economically and efficiently.
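The sparse-transform-plus-threshold recovery loop can be sketched with a projection-onto-convex-sets (POCS) style iteration. This is an assumption-laden toy, not the paper's algorithm: it uses the 2-D FFT as the sparse basis (the paper optimizes over six bases, including curvelets), a simple linearly decaying hard threshold in place of the adaptive threshold model, and illustrative names and parameters throughout.

```python
import numpy as np

def pocs_reconstruct(data, mask, n_iter=100):
    """Iteratively hard-threshold in the 2-D Fourier domain (the sparse
    transform here) and re-insert the observed traces each pass."""
    x = data * mask
    for k in range(n_iter):
        coeff = np.fft.fft2(x)
        # linearly decaying hard threshold on coefficient magnitude
        tau = np.abs(coeff).max() * (1.0 - (k + 1) / n_iter)
        coeff[np.abs(coeff) < tau] = 0.0
        x = np.real(np.fft.ifft2(coeff))
        x = data * mask + x * (1 - mask)   # keep observed samples fixed
    return x

# Synthetic "seismic" section of two plane waves; drop ~30% of the traces.
nt, nx = 64, 64
tg, xg = np.meshgrid(np.arange(nt), np.arange(nx), indexing="ij")
sec = (np.sin(2 * np.pi * (3 * tg + 2 * xg) / nt)
       + 0.5 * np.sin(2 * np.pi * (5 * tg - 4 * xg) / nt))
rng = np.random.default_rng(0)
mask = (rng.random(nx) > 0.3).astype(float)[None, :] * np.ones((nt, 1))
rec = pocs_reconstruct(sec, mask)
err = np.linalg.norm(rec - sec) / np.linalg.norm(sec)
print(err)  # small relative error: the missing traces are interpolated
```

Because the synthetic section is exactly sparse in the Fourier basis, the decaying threshold picks up the true coefficients early and the re-insertion step progressively suppresses the aliasing from the irregular sampling.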
Abstract: We show a quantitative technique, characterized by low numerical mediation, for the reconstruction of temporal sequences of geophysical data of length L interrupted for a time ΔT. The aim is to protect the information acquired before and after the interruption by means of a numerical protocol with the lowest possible computational weight. The signal reconstruction process is based on the synthesis of the low-frequency signal extracted by subsampling (subsampling ∇Dirac = ΔT, in phase with ΔT) with the high-frequency signal recorded before the crash. The SYRec (SYnthetic REConstruction) method, for its simplicity, speed of calculation, and spectral response stability, is particularly effective in studies of high-speed transient phenomena that develop in very perturbed fields. This operative condition is fundamental when almost immediate informational responses are required from the observation system. In this example we deal with geomagnetic data coming from an underwater counter-intrusion magnetic system. The system produces (in time) information about the transit of local magnetic singularities (magnetic perturbations with low spatial extension), originated by quasi-point-form kinematic sources (divers), in harbor underwater magnetic fields. The stability performance of the SYRec system makes it usable also over medium and long observation periods (activity of geomagnetic observatories).
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61501064 and the Sichuan Provincial Science and Technology Project under Grant No. 2016GZ0122.
Abstract: In the process of encoding and decoding, erasure codes over binary fields, which require only AND and XOR operations and therefore have high computational efficiency, are widely used in various fields of information technology. A matrix decoding method is proposed in this paper. The method is a universal data reconstruction scheme for erasure codes over binary fields. Besides pre-judging whether the errors can be recovered, the method can rebuild sectors of lost data on a fault-tolerant storage system constructed with erasure codes after disk errors. The data reconstruction process of the new method has simple and clear steps, which facilitates software implementation. Moreover, it can easily be applied to non-binary fields, so the method is expected to find extensive application in the future.
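The XOR-only arithmetic that gives binary-field erasure codes their speed is easy to demonstrate in the simplest case, a single-parity code. This is a minimal sketch of the general idea, not the paper's matrix decoding method: with one parity block, any one lost block is the XOR of all survivors.

```python
def xor_bytes(blocks):
    """Bitwise XOR of equal-length byte blocks (addition over GF(2))."""
    out = bytearray(len(blocks[0]))
    for b in blocks:
        for i, v in enumerate(b):
            out[i] ^= v
    return bytes(out)

def encode_parity(data_blocks):
    """Single-parity erasure code over GF(2): parity = XOR of all data."""
    return xor_bytes(data_blocks)

def rebuild(surviving_blocks):
    """Any single lost block equals the XOR of all surviving blocks,
    parity included -- AND/XOR-only decoding for the one-erasure case."""
    return xor_bytes(surviving_blocks)

data = [b"sector-0", b"sector-1", b"sector-2"]
parity = encode_parity(data)
recovered = rebuild([data[0], data[2], parity])  # sector 1 was lost
print(recovered == data[1])  # True
```

Codes tolerating multiple erasures (e.g., Reed-Solomon style constructions) generalize this by solving a small linear system over the field, which is where a matrix decoding view becomes natural.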
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41874146 and 42030103) and the Postgraduate Innovation Project of China University of Petroleum (East China) (No. YCX2021012).
Abstract: Seismic data reconstruction is an essential and fundamental step in the seismic data processing workflow, of profound significance for improving migration imaging quality, multiple suppression, and seismic inversion accuracy. Regularization methods play a central role in solving the underdetermined inverse problem of seismic data reconstruction. In this paper, a novel regularization approach, the low dimensional manifold model (LDMM), is proposed for reconstructing missing seismic data. Our work relies on the fact that seismic patches always occupy a low dimensional manifold. Specifically, we exploit the dimension of the seismic patch manifold as a regularization term in the reconstruction problem, and reconstruct the missing seismic data by enforcing low dimensionality on this manifold. The crucial procedure of the proposed method is to solve for the dimension of the patch manifold. Toward this, we adopt an efficient dimensionality calculation method based on low-rank approximation, which provides a reliable safeguard for enforcing the constraints in the reconstruction process. Numerical experiments performed on synthetic and field seismic data demonstrate that, compared with the curvelet-based sparsity-promoting L1-norm minimization method and the multichannel singular spectrum analysis method, the proposed method obtains state-of-the-art reconstruction results.
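The low-rank surrogate for patch-manifold dimension can be illustrated directly: patches drawn from a coherent plane-wave event span a far lower dimensional space than patches of noise. The sketch below is illustrative only (names, patch size, and the 99%-energy rank criterion are all assumptions), not the paper's LDMM solver.

```python
import numpy as np

def patch_matrix(img, ps=8, stride=4):
    """Stack overlapping ps x ps patches as rows of a matrix."""
    rows = []
    for i in range(0, img.shape[0] - ps + 1, stride):
        for j in range(0, img.shape[1] - ps + 1, stride):
            rows.append(img[i:i + ps, j:j + ps].ravel())
    return np.array(rows)

def effective_rank(P, energy=0.99):
    """Number of singular values capturing the given fraction of the
    spectral energy -- a low-rank proxy for patch-manifold dimension."""
    s = np.linalg.svd(P, compute_uv=False)
    c = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(c, energy) + 1)

# A plane-wave "seismic" section: every patch is a phase-shifted copy of
# one pattern, so the patch matrix has effective rank ~2.  Noise doesn't.
t, x = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
waves = np.sin(2 * np.pi * (3 * t + 2 * x) / 64)
noise = np.random.default_rng(1).standard_normal((64, 64))
print(effective_rank(patch_matrix(waves)), effective_rank(patch_matrix(noise)))
```

A regularizer that penalizes this effective dimension pushes the reconstruction toward coherent, event-like structure, which is the intuition behind enforcing low dimensionality on the patch manifold.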
Abstract: 3D image reconstruction of weather radar data can not only help forecasters improve forecast efficiency and accuracy, but also help people understand weather conditions easily and quickly. The Marching Cubes (MC) algorithm in surface rendering is well suited to 3D reconstruction from slice images; it shortens the time needed to find and compute the isosurface from raw volume data and reflects the shape structure more accurately. In this paper, we discuss a method to reconstruct 3D weather cloud images using the proposed Cube Weighting Interpolation (CWI) and the MC algorithm. First, we detail the steps of CWI and apply it to project the raw radar data into cubes and obtain equally spaced cloud slice images; then we employ the MC algorithm to draw the isosurface. Experiments show that our method is effective and simple to operate, and may provide an intuitive and useful reference for 3D surface reconstruction and stereo visualization of meteorological images.
Funding: Supported by the Ten Thousand Talent Program of Yunnan Province (Grant No. YNWR-QNBJ-2018-174), the Key Basic Research Program of Yunnan Province, China (Grant No. 202101BC070003), the National Natural Science Foundation of China (Grant No. 31901237), the Conservation Program for Plant Species with Extremely Small Populations in Yunnan Province (Grant No. 2022SJ07X-03), Key Technologies Research for the Germplasm of Important Woody Flowers in Yunnan Province (Grant No. 202302AE090018), and the Natural Science Foundation of Guizhou Province (Grant Nos. Qiankehejichu-ZK2021yiban 089 and Qiankehejichu-ZK2023yiban 035).
Abstract: Rhododendron is famous for its high ornamental value. However, the genus is taxonomically difficult and the relationships within Rhododendron remain unresolved. In addition, the origin of key morphological characters with high horticultural value needs to be explored. Both problems largely hinder the utilization of germplasm resources. Most studies have attempted to disentangle the phylogeny of Rhododendron, but used only a few genomic markers and lacked large-scale sampling, resulting in low clade support and contradictory phylogenetic signals. Here, we used restriction-site associated DNA sequencing (RAD-seq) data and morphological traits for 144 species of Rhododendron, representing all subgenera and most sections and subsections of this species-rich genus, to decipher its intricate evolutionary history and reconstruct ancestral states. Our results revealed high resolution at the subgenus and section levels of Rhododendron based on RAD-seq data. Both the optimal phylogenetic tree and the split tree recovered five lineages within Rhododendron. Subg. Therorhodion (clade I) formed the basal lineage. Subg. Tsutsusi and Azaleastrum formed clade II and had sister relationships. Clade III included all scaly rhododendron species. Subg. Pentanthera (clade IV) formed a sister group to Subg. Hymenanthes (clade V). The results of ancestral state reconstruction showed that the Rhododendron ancestor was a deciduous woody plant with terminal inflorescences, ten stamens, leaf blades without scales, and a broadly funnelform corolla of pink or purple color. This study shows significant power to resolve the evolutionary history of Rhododendron based on the high clade support of the phylogenetic tree constructed from RAD-seq data. It also provides an example of resolving discordant signals in phylogenetic trees and demonstrates the feasibility of applying RAD-seq with large amounts of missing data to deciphering intricate evolutionary relationships. Additionally, the reconstructed ancestral states of six important characters provide insights into the innovation of key characters in Rhododendron.
Funding: Granted by the National Natural Science Foundation of China (Grant Nos. 41572154 and 41820104004), the National Key R&D Plan (Grant No. 2017YFC0601405), and the Strategic Priority Research Program (B) of the Chinese Academy of Sciences (Grant No. XDB18000000).
Abstract: We all live on one planet and geology has no borders. Countries that reside on different continents share the same architecture beneath the surface; they were once neighbors with common foundations. Interoperable geological data are now freely available to everyone for the benefit of society, demonstrating that geoscience can address both global and regional problems, whilst increasingly large datasets ("Big Data") provide clear opportunities (e.g., Spina, 2018).
Abstract: The quality of real-time traffic information is of great importance; therefore, the factors affecting traffic characteristics are analyzed in general, and the necessities of real-time data processing are summarized. The identification and reconstruction of real-time traffic data are analyzed using the Kalman filter equations and a statistical approach. Four methods for ITS (intelligent transportation system) detector data screening in traffic management systems are discussed in detail. Traffic data examinations are also discussed, with solutions formulated through analysis, and recommendations are made for information collection and data management in the future.
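One common realization of Kalman-filter-based detector screening is innovation gating: predict the next count, flag a measurement whose deviation from the prediction exceeds a few standard deviations, and reconstruct it from the prediction. The scalar random-walk sketch below is illustrative only; the function name, noise variances `q` and `r`, and the 3-sigma gate are assumptions, not the paper's calibrated values.

```python
def kalman_screen(measurements, q=1.0, r=25.0, gate=3.0):
    """Scalar Kalman filter (random-walk state) that smooths a traffic
    count series and flags measurements whose innovation exceeds
    `gate` standard deviations; flagged values are reconstructed
    from the one-step prediction."""
    x, p = measurements[0], r
    cleaned, flags = [x], [False]
    for z in measurements[1:]:
        p = p + q                       # predict: random-walk state
        s = p + r                       # innovation variance
        outlier = abs(z - x) > gate * s ** 0.5
        if outlier:
            z = x                       # reconstruct: use the prediction
        k = p / s                       # Kalman gain
        x = x + k * (z - x)             # update state estimate
        p = (1 - k) * p                 # update state variance
        cleaned.append(x)
        flags.append(outlier)
    return cleaned, flags

counts = [50, 52, 51, 53, 250, 52, 54, 53]   # 250 is a detector glitch
cleaned, flags = kalman_screen(counts)
print(flags)  # only the glitch at index 4 is flagged
```

The same gate doubles as a screening rule (report the flag) and a reconstruction rule (substitute the prediction), which is why the two steps are usually treated together.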
Abstract: Background: A task assigned to space exploration satellites involves detecting the physical environment within a certain space. However, space detection data are complex and abstract. These data are not conducive to researchers' visual perception of the evolution and interaction of events in the space environment. Methods: A time-series dynamic data sampling method for large-scale space was proposed to sample detection data in space and time, and the corresponding relationships between data location features and other attribute features were established. A tone-mapping method based on statistical histogram equalization was proposed and applied to the final attribute feature data. The visualization process was optimized for rendering by merging materials, reducing the number of patches, and performing other operations. Results: The results of sampling, feature extraction, and uniform visualization of detection data of complex types, long time spans, and uneven spatial distributions were obtained. The real-time visualization of large-scale spatial structures using augmented reality devices, particularly low-performance devices, was also investigated. Conclusions: The proposed visualization system can reconstruct the three-dimensional structure of a large-scale space, express the structure and changes in the spatial environment using augmented reality, and assist in intuitively discovering spatial environmental events and evolutionary rules.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41105013, 41375028, and 61271106), the Natural Science Foundation of Jiangsu Province, China (Grant No. BK2011122), and the Key Laboratory of Meteorological Observation and Information Processing Scientific Research Fund of Jiangsu Province, China (Grant No. KDXS1205).
Abstract: To reconstruct missing data in total electron content (TEC) observations, a new method is proposed based on empirical orthogonal function (EOF) decomposition and the eigenvalues themselves. It is a self-adaptive EOF decomposition requiring no prior information, and the error of the reconstructed data can be estimated. An interval-quartering algorithm and a cross-validation algorithm are used to compute the optimal number of EOFs for reconstruction; the interval-quartering algorithm reduces the computation time. Application of the data interpolating empirical orthogonal functions (DINEOF) method to real data demonstrates that the method can reconstruct the TEC map with high accuracy and could be employed in a real-time system in future work.
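The core DINEOF loop (fill gaps, rebuild from leading EOFs, re-impose observations, repeat) fits in a few lines. The sketch below is a minimal illustration under stated assumptions: a truncated SVD stands in for the EOF decomposition, the EOF count `n_eof` is fixed by hand rather than chosen by the paper's interval-quartering cross-validation, and all names are illustrative.

```python
import numpy as np

def dineof_fill(field, mask, n_eof=2, n_iter=50):
    """DINEOF-style gap filling: initialize gaps with the mean of the
    observed values, then alternately rebuild the field from its leading
    EOFs (truncated SVD) and re-impose the observations."""
    x = np.where(mask, field, field[mask].mean())
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        recon = (u[:, :n_eof] * s[:n_eof]) @ vt[:n_eof]
        x = np.where(mask, field, recon)   # keep observed entries fixed
    return x

# Rank-2 synthetic "TEC map" (40 epochs x 60 grid cells), 20% missing.
rng = np.random.default_rng(2)
tec = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 60))
mask = rng.random(tec.shape) > 0.2
filled = dineof_fill(tec, mask, n_eof=2)
err = np.linalg.norm(filled - tec) / np.linalg.norm(tec)
print(err)  # small relative error on the low-rank field
```

Because the observed entries are re-imposed every pass, the iteration only ever changes the gaps, and the quality of the fill is governed by how well the chosen number of EOFs captures the field, which is exactly what the cross-validation step in the paper tunes.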
Abstract: There are only two quantitative tools for Precambrian paleogeographic reconstructions: paleomagnetic data and dyke swarm geometries. Paleomagnetic data provide information about paleolatitudes and orientation of rigid
Abstract: Based on compressive sensing, a novel algorithm is proposed to solve the reconstruction problem under sparsity assumptions. Instead of estimating the reconstructed data by minimizing the objective function directly, the authors parameterize the problem as a linear combination of a few elementary thresholding functions, which can be solved by calculating the linear weighting coefficients. The thresholding functions are updated during the iteration process. The advantage of this method is that the optimization problem at each iteration only requires computing the linear coefficients. With the elementary thresholding functions satisfying certain constraints, global convergence of the iterative algorithm is guaranteed. Synthetic and field data results prove the effectiveness of the proposed algorithm.
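The baseline this family of methods builds on is iterative soft thresholding (ISTA). The sketch below keeps the single classical soft-threshold for clarity; the paper's contribution, replacing it with a learned linear combination of elementary thresholding functions, is not implemented here, and all names and parameter values are illustrative.

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=1000):
    """Plain ISTA for  min_x ||Ax - y||^2 / 2 + lam * ||x||_1:
    a gradient step on the data-fit term followed by soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - (A.T @ (A @ x - y)) / L     # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

# Recover a 3-sparse vector from 40 random measurements of dimension 100.
rng = np.random.default_rng(3)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100); x_true[[5, 37, 80]] = [2.0, -1.5, 1.0]
y = A @ x_true
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true))  # the support and amplitudes are recovered
```

Parameterizing the thresholding step as a weighted sum of elementary functions, as the abstract describes, turns each iteration into a small linear least-squares problem for the weights instead of a fixed nonlinear shrinkage.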
Funding: This project is supported by the National Natural Science Foundation of China (No. 10272033) and the Provincial Natural Science Foundation of Guangdong, China (No. 04105385).
Abstract: A new B-spline surface reconstruction method from layer data, based on a deformable model, is presented. An initial deformable surface, represented as a closed cylinder, is first given. The surface is subject to internal forces describing its implicit smoothness property and external forces attracting it toward the layer data points. The finite element method is then adopted to solve the energy minimization problem, which results in a bicubic closed B-spline surface with C^2 continuity. The proposed method can provide a smooth and accurate surface model directly from the layer data, without the need to fit cross-sectional curves and make them compatible. The feasibility of the proposed method is verified by the experimental results.
Abstract: In this paper, on the basis of experimental data from two kinds of chemical explosions, the piston-pushing model of spherical blast waves and a second-order Godunov-type finite difference scheme with sharp resolution of discontinuities are used for the numerical reconstruction of part of an actual hemispherical blast-wave flow field by properly adjusting the moving boundary conditions of a piston. This method is simple and reliable, and is suitable for evaluating the effects of the blast-wave flow field away from the explosion center.