Funding: Supported by the National Natural Science Foundation of China (51406031)
Abstract: A new method of nonlinear analysis is established by combining phase space reconstruction with sub-frequency-band wavelet data reduction. The method is applied to two types of chaotic dynamic systems (Lorenz and Rössler) to examine its anti-noise ability for complex systems. Results show that the nonlinear dynamic system analysis method resists noise and reveals the internal dynamics of a weak signal buried in noise pollution. On this basis, vertical upward gas–liquid two-phase flow in a 2 mm × 0.81 mm small rectangular channel is investigated. The frequency and energy distributions of the main oscillation mode are revealed by analyzing the time–frequency spectra of the pressure signals of different flow patterns. The positive power spectral density of singular-value frequency entropy and the damping ratio are extracted to characterize the evolution of flow patterns and achieve accurate recognition of different vertical upward gas–liquid flow patterns (bubbly flow: 100%, slug flow: 92%, churn flow: 96%, annular flow: 100%). The proposed analysis method will enrich the dynamics theory of multi-phase flow in small channels.
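The abstract gives no implementation details; as a rough sketch of the phase space reconstruction step only, the following Python snippet performs the time-delay embedding such analyses typically start from (NumPy assumed; the embedding dimension and delay are illustrative, not values from the paper):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a phase space trajectory from a scalar series x
    using time-delay embedding (Takens' method)."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for the chosen dim and tau")
    # Each row is one reconstructed state vector [x(t), x(t+tau), ...]
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Example: embed a noisy oscillation (stand-in for a chaotic pressure signal)
t = np.linspace(0, 50, 5000)
x = np.sin(t) + 0.1 * np.random.randn(t.size)
states = delay_embed(x, dim=3, tau=10)
print(states.shape)  # (4980, 3)
```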
Abstract: In this paper, three techniques for compressing classified satellite cloud images with no distortion are presented: line run coding, quadtree DF (Depth-First) representation, and H coding. The first two were developed by others; the third is our own. A comparison of their compression rates is given at the end of the paper. Further application of these image compression techniques to satellite data and other meteorological data looks promising.
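Of the three codings, line run coding is the simplest; a generic run-length encoder over one scan line of class labels, not necessarily the authors' exact variant, can be sketched as follows:

```python
def run_length_encode(row):
    """Encode one scan line of class labels as (label, run length) pairs."""
    runs = []
    i = 0
    while i < len(row):
        j = i
        while j < len(row) and row[j] == row[i]:
            j += 1  # extend the run while the label repeats
        runs.append((row[i], j - i))
        i = j
    return runs

def run_length_decode(runs):
    out = []
    for label, count in runs:
        out.extend([label] * count)
    return out

row = [2, 2, 2, 0, 0, 5, 5, 5, 5]   # classified pixels along one line
codes = run_length_encode(row)
assert run_length_decode(codes) == row  # lossless round trip
print(codes)  # [(2, 3), (0, 2), (5, 4)]
```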
Funding: Supported by the Natural Science Foundation of China (50075032) and the State High-Technology Development Program of China (2001AA421150)
Abstract: A strategy for B-spline curve data reduction based on non-uniform B-spline wavelet decomposition is presented. In existing methods of knot removal, ranking the knots to be removed depends on a procedure that assigns a weight to each knot to indicate its significance. This is reasonable but not straightforward. Proposed here is a more straightforward and accurate way to calculate the weight: the wavelet coefficient is taken as the weight of the corresponding knot. The approximating curve and the error can be obtained directly from the wavelet decomposition, and by exploiting the hierarchical structure of the wavelet, the error can be computed efficiently in an accumulative manner.
Abstract: Imbalanced data classification is one of the major problems in machine learning. An imbalanced dataset typically has significant differences in the number of samples between its classes. In most cases, the performance of a machine learning algorithm such as the Support Vector Machine (SVM) suffers when dealing with an imbalanced dataset: classification accuracy is skewed toward the majority class, and prediction of minority-class samples is poor. In this paper, a hybrid approach combining a data pre-processing technique and an SVM algorithm based on improved Simulated Annealing (SA) is proposed. First, a data pre-processing technique aimed at the resampling strategy for handling imbalanced datasets is proposed: data are synthetically generated to equalize the number of samples between classes, followed by a reduction step that removes redundant and duplicated data. Next, the balanced dataset is trained using SVM. Since this algorithm requires an iterative search for the best penalty parameter during training, an improved SA algorithm is proposed for this task, introducing a new acceptance criterion to enhance the accuracy of the optimization process. Experiments on ten publicly available imbalanced datasets demonstrate higher classification accuracy with the proposed approach than with the conventional implementation of SVM; an average accuracy of 89.65% on binary classification demonstrates the good performance of the proposed work.
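The abstract does not spell out the improved acceptance criterion; as a baseline sketch of tuning the SVM penalty parameter with simulated annealing (classic Metropolis acceptance rather than the paper's criterion; scikit-learn assumed, all hyperparameters illustrative):

```python
import math
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def sa_tune_C(X, y, T=1.0, alpha=0.9, steps=30, seed=0):
    """Search the SVM penalty parameter C with simulated annealing.
    Uses the classic Metropolis criterion; the paper's improved one differs."""
    rng = random.Random(seed)
    logC = 0.0  # search in log space, C = 10**logC
    best = cur = cross_val_score(SVC(C=10**logC), X, y, cv=3).mean()
    best_logC = logC
    for _ in range(steps):
        cand = logC + rng.gauss(0, 0.5)          # perturb the solution
        score = cross_val_score(SVC(C=10**cand), X, y, cv=3).mean()
        # Accept better solutions always; worse ones with Boltzmann probability
        if score > cur or rng.random() < math.exp((score - cur) / T):
            logC, cur = cand, score
            if score > best:
                best, best_logC = score, cand
        T *= alpha  # cool down
    return 10**best_logC, best

X, y = make_classification(n_samples=200, weights=[0.9, 0.1], random_state=1)
C, acc = sa_tune_C(X, y)
print(f"C = {C:.3f}, CV accuracy = {acc:.3f}")
```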
Funding: Supported in part by the National High Technology Research and Development 863 Program of China under Grant No. 2013AA013201, and the National Natural Science Foundation of China under Grant Nos. 61025009, 61232003, 61120106005, 61170288, and 61379146
Abstract: Cloud backup has been an important issue ever since large quantities of valuable data came to be stored on personal computing devices. Data reduction techniques, such as deduplication, delta encoding, and Lempel-Ziv (LZ) compression, performed at the client side before data transfer can ease cloud backup by saving network bandwidth and reducing cloud storage space. However, client-side data reduction in cloud backup services faces efficiency and privacy challenges. In this paper, we present Pangolin, a secure and efficient cloud backup service for personal data storage that exploits application awareness. It speeds up backup operations with an application-aware client-side data reduction technique, and mitigates data security risks by integrating selective encryption into data reduction for sensitive applications. Our experimental evaluation, based on a prototype implementation, shows that our scheme improves data reduction efficiency over state-of-the-art methods by shortening the backup window to 33%-75% of its original size, and that its security mechanism for sensitive applications has negligible impact on backup window size.
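Pangolin's reduction is application-aware; purely to illustrate the underlying deduplication idea, here is a fixed-size-chunk fingerprinting sketch in Python (not the paper's scheme, which also applies delta encoding and LZ compression):

```python
import hashlib

def dedup_fixed_chunks(data, chunk_size=4096):
    """Split a byte stream into fixed-size chunks and keep each unique
    chunk once, indexed by its SHA-256 fingerprint."""
    store = {}    # fingerprint -> chunk bytes (kept once)
    recipe = []   # ordered fingerprints to rebuild the stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)
        recipe.append(fp)
    return store, recipe

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # repeated content
store, recipe = dedup_fixed_chunks(data)
saved = len(data) - sum(len(c) for c in store.values())
print(f"unique chunks: {len(store)}, bytes saved: {saved}")  # 2 unique, 8192 saved
```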
Funding: Supported by the Incheon National University Research Grant of Korea in 2011
Abstract: As location data become widely available to portable devices, trajectory tracking of moving objects has become an essential technology for most location-based services. To maintain such streaming data of location updates from mobile clients, conventional approaches such as time-based and distance-based location updating have been used. However, these methods suffer from large data volumes, redundant location updates, and large trajectory estimation errors due to the varying speed of moving objects. In this paper, we propose a simple but efficient online trajectory data reduction method for portable devices. To address redundancy and large estimation errors, the proposed algorithm computes trajectory errors and finds the recent location update that should be sent to the server to satisfy the user requirements. We evaluate the proposed algorithm on real GPS trajectory data consisting of 17 201 trajectories. Intensive simulation results show that the proposed algorithm always meets the given user requirements and achieves a data reduction ratio greater than 87% when the acceptable trajectory error is at least 10 meters.
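The paper's exact error computation is not given in the abstract; a simplified error-bounded online filter in the same spirit (transmit a fix only when some buffered point deviates from the current segment by more than the acceptable error) might look like this:

```python
import math

def reduce_trajectory(points, eps=10.0):
    """Online trajectory reduction sketch, not the paper's exact algorithm:
    buffer fixes and emit one only when a buffered point deviates from the
    segment (last sent -> current fix) by more than eps."""
    def deviation(p, a, b):
        # Perpendicular distance from p to segment a-b
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        L2 = dx * dx + dy * dy
        if L2 == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    sent = [points[0]]
    buffer = []
    for p in points[1:]:
        if any(deviation(q, sent[-1], p) > eps for q in buffer):
            sent.append(buffer[-1])   # previous fix still satisfied the bound
            buffer = [p]
        else:
            buffer.append(p)
    sent.append(points[-1])
    return sent

track = [(x, 0.05 * x * x) for x in range(0, 200, 5)]  # curved path
kept = reduce_trajectory(track, eps=10.0)
print(f"reduction ratio: {1 - len(kept) / len(track):.0%}")
```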
Abstract: Various Wireless Sensor Network (WSN) applications involve the common task of collecting data from the sensor nodes at the sink. Since data collection is iterative, an effective technique is needed to obtain the data efficiently while reducing nodal energy consumption. Hence, a data reduction technique for WSNs is presented in this paper, based on a proposed prediction algorithm called Hierarchical Fractional Bidirectional Least-Mean Square (HFBLMS). The algorithm modifies Hierarchical Least-Mean Square (HLMS) by including BLMS for bidirectional data prediction and Fractional Calculus (FC) in the weight update process. Redundant transmissions are avoided by sending only the data that are required, based on the values predicted at both the sensor node and the sink. Moreover, the proposed HFBLMS algorithm reduces energy consumption in the network through the effective prediction attained by BLMS. Two metrics, energy consumption and prediction error, are used to evaluate the HFBLMS prediction algorithm: it attains energy values of 0.3587 and 0.1953 at the maximum number of rounds, and prediction errors of just 0.0213 and 0.0095, on air quality and localization datasets, respectively.
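As a minimal illustration of the dual-prediction idea that HFBLMS builds on, here is a plain LMS sketch (no hierarchy, no bidirectional pass, no fractional update; the filter order, step size, and tolerance are illustrative):

```python
import numpy as np

def lms_dual_prediction(series, order=4, mu=0.01, tol=0.05):
    """Dual prediction with a basic LMS filter: node and sink run identical
    predictors, and the node transmits a reading only when the prediction
    error exceeds tol. Plain LMS, not the paper's HFBLMS."""
    w = np.zeros(order)
    history = list(series[:order])   # warm-up samples, sent as-is
    transmitted = order
    for x in series[order:]:
        u = np.array(history[-order:])
        pred = w @ u
        err = x - pred
        if abs(err) > tol:
            transmitted += 1          # send the real value to the sink
            history.append(x)
            w += mu * err * u         # sink received x, so it updates too
        else:
            history.append(pred)      # both sides substitute the prediction
    return transmitted / len(series)

t = np.linspace(0, 20, 500)
signal = np.sin(t) + 0.01 * np.random.randn(t.size)  # smooth sensor reading
print(f"fraction transmitted: {lms_dual_prediction(signal):.2f}")
```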
Abstract: In the preparation of firing tables, determining projectile drag coefficients through reduction of firing-test radar data is very important. Many methods have been developed for this task, but none is fully satisfactory in one way or another. In this paper, a multi-spline model of the drag coefficient (cd) curve is developed that guarantees first-derivative continuity of the cd curve and is flexible enough to fit a cd curve accurately from the subsonic up to the supersonic range. Practical firing data reduction tests show both fast convergence and accurate fitting; typical velocity fitting RMS errors are 0.05-0.08 m/s.
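A least-squares piecewise-cubic spline already provides the first-derivative continuity the paper requires; a sketch with SciPy on synthetic cd data (the knot placement and the synthetic drag model are illustrative assumptions, not the paper's):

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# Synthetic drag-coefficient data versus Mach number, with a transonic rise
mach = np.linspace(0.4, 2.5, 120)
cd_true = (0.15 + 0.12 / (1.0 + np.exp(-25 * (mach - 1.0)))
           - 0.03 * (mach - 1.0).clip(0))
cd_meas = cd_true + 0.005 * np.random.default_rng(0).normal(size=mach.size)

# Piecewise-cubic least-squares spline: interior knots split the subsonic,
# transonic and supersonic ranges; cubic splines are C2, so first-derivative
# continuity of the fitted cd curve holds by construction.
knots = [0.8, 0.95, 1.05, 1.3]
spline = LSQUnivariateSpline(mach, cd_meas, knots, k=3)

rms = np.sqrt(np.mean((spline(mach) - cd_true) ** 2))
print(f"fit RMS error: {rms:.4f}")
```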
Abstract: This paper describes the data release of the LAMOST pilot survey, including data reduction, calibration, spectral analysis, data products, and data access. The accuracy of the released data and the information in the FITS headers of the spectra are also introduced. The released data set includes 319 000 spectra and a catalog of these objects.
Abstract: A data buoy moored at sea is influenced by environmental factors such as wind, waves, and currents, so its stability is of great importance. This paper describes the analysis and calculation of the stability of a data buoy that has a weight suspended from its bottom under normal working conditions, and of the influence of the distance from the weight to the bottom of the buoy on the buoy's righting ability when it is capsized. The paper also provides curves showing how the size of the weight and its distance from the bottom of the buoy affect the righting ability. These results are of great reference value for the design and use of data buoys at sea and for the design of similar floating bodies. The calculated results agree with those from the model experiment.
Abstract: Large-scale dense Wireless Sensor Networks (WSNs) have been progressively employed in different classes of applications for precise monitoring. As a result of the high node density, spatially and temporally correlated information is detected by several nodes, and exploiting these correlations can reduce communication and data exchange, saving energy, which is a major concern in these networks. In this paper, a novel algorithm is proposed that selects data based on their contextual importance: only contextually important data are transmitted to the upper layer, and the rest are ignored. In this way, the proposed method achieves significant data reduction and in turn improves the energy efficiency of data gathering.
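The notion of contextual importance is not formalized in the abstract; a toy stand-in (forward a reading only when it departs from the last forwarded value by a relative tolerance) illustrates the transmission-suppression mechanism:

```python
def contextually_important(readings, rel_tol=0.05):
    """Forward only readings that differ from the last forwarded value by
    more than rel_tol (a toy notion of contextual importance; the paper's
    criterion is richer)."""
    out, last = [], None
    for r in readings:
        if last is None or abs(r - last) > rel_tol * max(abs(last), 1e-9):
            out.append(r)   # contextually important: forward it
            last = r
    return out

data = [20.0, 20.1, 20.05, 22.5, 22.6, 19.0]   # e.g. temperature samples
print(contextually_important(data))  # [20.0, 22.5, 19.0]
```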
Abstract: Optimizing sensor energy is one of the most important concerns in Three-Dimensional (3D) Wireless Sensor Networks (WSNs). Improved dynamic hierarchical clustering has been used in previous works to compute the optimal cluster count so that total energy consumption is minimized. However, computational complexity grows with the data dimension, which increases delay in network data transmission and reception. To solve these issues, an efficient dimensionality reduction model based on Incremental Linear Discriminant Analysis (ILDA) is proposed for 3D hierarchical clustering WSNs. The major objective of the proposed work is to design an efficient dimensionality reduction and energy-efficient clustering algorithm for such networks. The ILDA approach consists of four major steps: data dimension reduction, introduction of a distance similarity index, a double cluster head technique, and a node dormancy approach. The protocol differs from normal hierarchical routing protocols in how it formulates Cluster Head (CH) selection: an optimal cluster-head function is generated from each node's position and residual energy, and every CH is elected by this formulation. For a 3D spherical structure under the same network conditions, the performance of the proposed ILDA with Improved Dynamic Hierarchical Clustering (IDHC) is compared with the Distributed Energy-Efficient Clustering (DEEC), Hybrid Energy Efficient Distributed (HEED) and Stable Election Protocol (SEP) techniques. The proposed ILDA-based IDHC approach provides better results with respect to throughput, network residual energy, network lifetime, and first-node-death round.
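As a stand-in for the incremental variant, batch LDA from scikit-learn shows the dimension-reduction step that precedes clustering (the dataset and sizes are illustrative, not from the paper):

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Project high-dimensional sensor readings onto at most (n_classes - 1)
# discriminant axes before clustering, cutting the per-packet payload.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
lda = LinearDiscriminantAnalysis(n_components=2)
X_low = lda.fit_transform(X, y)
print(X.shape, "->", X_low.shape)   # (300, 20) -> (300, 2)
```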
Funding: Supported by the National Natural Science Foundation of China (No. 92371201)
Abstract: As the dimensionality of design variables and the complexity of constraints increase, the efficacy of Surrogate-Based Optimization (SBO) is limited. Traditional linear and nonlinear dimensionality reduction algorithms mainly decompose, in various forms, the mathematical matrix composed of design variables or objective functions; the smoothness of the design space cannot be guaranteed in the process, and additional constraint functions must be added to the optimization, which increases the computational cost. This study presents a new parameterization method that addresses both problems of SBO. The parameterization decouples affine transformations (dilation, rotation, shearing, and translation) within the Grassmannian submanifold, which enables a separate representation of the physical information of the airfoil in a high-dimensional space. Building upon this, Principal Geodesic Analysis (PGA) is employed to achieve geometric control, compress the design space, reduce the number and dimension of the design variables, and enhance predictive performance during surrogate optimization. For comparison, a reduced design space is defined using 95% of the energy, and the RAE 2822 airfoil under transonic conditions serves as a demonstration. The method significantly enhances the optimization efficiency of the surrogate model while effectively enforcing geometric constraints. In three-dimensional problems, it enables simultaneous design of the planform shapes of various aircraft components and high-order perturbation deformations. Optimization applied to the ONERA M6 wing achieved a lift-drag ratio of 18.09, a 27.25% improvement over the baseline configuration, whereas conventional surrogate model optimization achieved only a 17.97% improvement, demonstrating the superiority of this approach.
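The 95%-energy truncation rule can be illustrated with plain PCA; note the paper's reduction actually operates on a Grassmannian via PGA, and this linear sketch only shows the energy criterion on synthetic shape vectors:

```python
import numpy as np

def reduce_to_energy(X, energy=0.95):
    """Keep the fewest principal modes whose singular values capture the
    given fraction of total energy. Plain PCA, shown only to illustrate
    the 95%-energy truncation rule used in the paper's comparison."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(cum, energy)) + 1   # first k modes reach 95%
    return Xc @ Vt[:k].T, k

# 200 airfoil-like shape vectors of dimension 60 with a few dominant modes
rng = np.random.default_rng(1)
X = (rng.normal(size=(200, 5)) @ rng.normal(size=(5, 60))
     + 0.01 * rng.normal(size=(200, 60)))
Z, k = reduce_to_energy(X)
print(f"kept {k} of 60 design variables")  # typically k == 5 here
```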
Abstract: Three recent breakthroughs due to AI in arts and science serve as motivation: an award-winning digital image, protein folding, and fast matrix multiplication. Many recent developments in artificial neural networks, particularly deep learning (DL), applied and relevant to computational mechanics (solids, fluids, finite-element technology) are reviewed in detail. Both hybrid and pure machine learning (ML) methods are discussed. Hybrid methods combine traditional PDE discretizations with ML methods either (1) to help model complex nonlinear constitutive relations, (2) to nonlinearly reduce the model order for efficient simulation (turbulence), or (3) to accelerate the simulation by predicting certain components of the traditional integration methods. Here, methods (1) and (2) rely on the Long Short-Term Memory (LSTM) architecture, and method (3) on convolutional neural networks. Pure ML methods for solving (nonlinear) PDEs are represented by Physics-Informed Neural Network (PINN) methods, which can be combined with an attention mechanism to address discontinuous solutions. Both LSTM and attention architectures, together with modern and generalized classic optimizers that include stochasticity for DL networks, are extensively reviewed. Kernel machines, including Gaussian processes, are covered in sufficient depth for more advanced works such as shallow networks with infinite width. The review does not address experts only: readers are assumed familiar with computational mechanics but not with DL, whose concepts and applications are built up from the basics, aiming to bring first-time learners quickly to the forefront of research. The history and limitations of AI are recounted and discussed, with particular attention to pointing out misstatements or misconceptions of the classics, even in well-known references. Positioning and pointing control of a large-deformable beam is given as an example.
Abstract: Canopy stomatal movement, a plant physiological process, occurs within leaves, but it influences the exchange of CO2, water vapor, and sensible heat fluxes between the atmosphere and terrestrial ecosystems. Many studies have documented a clear interaction between leaf photosynthesis and canopy stomatal conductance, so information on stomatal conductance is valuable in climate and ecosystem models. In the current study, a newly developed model was adopted to calculate the canopy stomatal conductance of winter wheat in the Huang-Huai-Hai (H-H-H) Plain of China (31.5-42.7°N, 110.0-123.0°E). Remote sensing information from NOAA-AVHRR and meteorological observations were used to estimate the regional-scale stomatal conductance distribution, and the canopy stomatal conductance distribution pattern of winter wheat on March 18, 1997 is presented. The developed model may be used to estimate canopy stomatal conductance in land surface schemes and can act as a boundary condition in regional climate model runs.
Abstract: The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST, also called the Guo Shou Jing Telescope) is a special reflecting Schmidt telescope. LAMOST's special design allows both a large aperture (effective aperture of 3.6 m-4.9 m) and a wide field of view (FOV) of 5°. It has an innovative active reflecting Schmidt configuration that continuously adjusts the mirror's surface during observation, combining thin deformable-mirror active optics with segmented active optics. Its primary mirror (6.67 m × 6.05 m) and active Schmidt mirror (5.74 m × 4.40 m) are both segmented, composed of 37 and 24 hexagonal sub-mirrors respectively. By using a parallel controllable fiber positioning technique, the focal surface of 1.75 m in diameter can accommodate 4000 optical fibers, and LAMOST has 16 spectrographs with 32 CCD cameras, making it the telescope with the highest rate of spectral acquisition. As a national large scientific project, the LAMOST project was formally proposed in 1996 and approved by the Chinese government in 1997; construction started in 2001, was completed in 2008, and passed official acceptance in June 2009. The LAMOST pilot survey started in October 2011 and the spectroscopic survey will launch in September 2012. Up to now, LAMOST has released more than 480 000 spectra of objects. LAMOST will make important contributions to the study of the large-scale structure of the Universe, the structure and evolution of the Galaxy, and the cross-identification of multi-waveband properties of celestial objects.
Funding: This project is supported by the Provincial Technology Cooperation Program of Yunnan, China (No. 2003EAAAA00D043)
Abstract: Because the point cloud of a whole vehicle body has large geometric dimensions, a huge data volume, and rigorous reverse-engineering precision requirements, a pretreatment algorithm for automobile body point clouds is put forward. The basic idea of the registration algorithm based on skeleton points is to construct the skeleton points of the whole vehicle model and the mark points of each separate point cloud, to search the mapping between skeleton points and mark points using a congruent-triangle method, and to match the whole vehicle point cloud using an improved Iterative Closest Point (ICP) algorithm. The data reduction algorithm, based on the average square root of distance, condenses the data in three steps: computing each dataset's average square root of distance within a sampling cube grid, sorting by the value computed in the first step, and choosing a sampling percentage. The accuracy of the two algorithms is demonstrated by a registration and reduction example on the whole vehicle point cloud of a light truck.
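A simplified sketch of the three-step reduction (grid bucketing, distance-based scoring, percentage-based selection) in Python, with the scoring and selection details assumed rather than taken from the paper:

```python
import numpy as np

def grid_reduce(points, cell=0.1, keep=0.3):
    """Condense a point cloud in three steps, loosely following the paper:
    (1) bucket points into cube grid cells and score each cell by the mean
    distance of its points to the cell centroid, (2) sort cells by score,
    (3) keep all points in the top `keep` fraction of cells (high detail)
    and only the centroid elsewhere. A sketch, not the exact method."""
    keys = np.floor(points / cell).astype(int)
    cells = {}
    for key, p in zip(map(tuple, keys), points):
        cells.setdefault(key, []).append(p)
    scored = []
    for pts in cells.values():
        pts = np.array(pts)
        c = pts.mean(axis=0)
        score = np.mean(np.linalg.norm(pts - c, axis=1))  # spread inside cell
        scored.append((score, pts, c))
    scored.sort(key=lambda t: t[0], reverse=True)
    n_detail = int(keep * len(scored))
    out = [p for _, pts, _ in scored[:n_detail] for p in pts]  # keep detail
    out += [c for _, _, c in scored[n_detail:]]   # flat regions -> centroid
    return np.array(out)

cloud = np.random.default_rng(2).random((5000, 3))
reduced = grid_reduce(cloud, cell=0.1, keep=0.2)
print(f"{len(cloud)} -> {len(reduced)} points")
```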
Abstract: The principle, method, and measuring equipment used in studying siltation in the Lianyun Harbor by means of the γ-ray density gauge are described in this paper. From field observations and analyses, some primary principles concerning the infill rate, the distribution of silt density with depth, the consolidation rate, and the change of the shear strength of the silt have been established.
Abstract: The difference-similitude matrix (DSM) is effective in reducing an information system, offering a high reduction rate and high validity. We use the DSM method to analyze the fault data of computer networks and obtain fault diagnosis rules. By discretizing the relative values of the fault data, we obtain the information system of the fault data; the DSM method then reduces this information system and yields the diagnosis rules. Simulation with an actual scenario shows that fault diagnosis based on DSM can obtain few and effective rules.
Keywords: computer networks; data reduction; fault management; difference-similitude matrix
Funding: Supported by the National Natural Science Foundation of China (90204008)