Abstract: In recent years, functional data have been widely used in finance, medicine, biology, and other fields. Existing clustering methods can solve problems in finite-dimensional spaces, but they are difficult to apply directly to the clustering of functional data. In this paper, we propose a new unsupervised clustering algorithm based on adaptive weights. Without requiring initialization parameters, we use entropy-type penalty terms and a fuzzy partition matrix to find the optimal number of clusters. At the same time, we introduce a measure based on adaptive weights to reflect the difference in information content between different clustering metrics. Simulation experiments show that the proposed algorithm achieves higher purity than several existing algorithms.
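The abstract does not give the objective function or update rules, so the following is only a minimal sketch of the general idea of entropy-penalized fuzzy clustering applied to curves discretized on a common grid; the function name entropy_fuzzy_cmeans, the penalty weight lam, and the toy data are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def entropy_fuzzy_cmeans(X, n_clusters, lam=1.0, n_iter=100, seed=0):
    """Generic entropy-regularized fuzzy c-means on discretized curves.

    X : (n_curves, n_points) array of curves sampled on a common grid.
    lam : entropy penalty weight; larger values give fuzzier memberships.
    Returns the fuzzy partition matrix U and the cluster-centre curves.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # squared L2 distance between each curve and each centre curve
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        # entropy-regularized membership update (softmax of negative distances)
        U = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / lam)
        U /= U.sum(axis=1, keepdims=True)
        # weighted-mean update of the centre curves
        centers = (U.T @ X) / U.sum(axis=0)[:, None]
    return U, centers

# toy usage: two groups of noisy sine/cosine curves
t = np.linspace(0, 1, 50)
curves = np.vstack([np.sin(2 * np.pi * t) + 0.1 * np.random.randn(20, 50),
                    np.cos(2 * np.pi * t) + 0.1 * np.random.randn(20, 50)])
U, V = entropy_fuzzy_cmeans(curves, n_clusters=2)
print(U.argmax(axis=1))
```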
Abstract: Human life would be impossible without adequate air quality. Continual advances in practically every aspect of contemporary life have harmed air quality, and everyday industrial, transportation, and household activities release dangerous contaminants into our surroundings. This study investigated two years of air quality data from two Indian cities and performed outlier detection on them. Studies of air pollution have used many methodologies, typically treating the gases as a vector whose components are the concentration values recorded at each observation. In our technique, we instead use curves to represent the monthly averages of daily gas emissions. The approach, which is based on functional depth, was used to find outliers in the gas emissions of Delhi and Kolkata, and the outcomes were compared with those of the traditional method. In the evaluation and comparison of the models' performance, the functional approach performed well.
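The paper's specific depth notion is not stated in the abstract; the sketch below uses a generic integrated (Fraiman-Muniz-type) functional depth and flags the lowest-depth curves as outlier candidates. The names fraiman_muniz_depth and depth_outliers and the toy monthly curves are illustrative.

```python
import numpy as np

def fraiman_muniz_depth(curves):
    """Integrated (Fraiman-Muniz-type) functional depth of each curve.

    curves : (n_curves, n_points) array, one monthly-average curve per row.
    At each grid point the univariate depth 1 - |0.5 - F_n(x)| is computed
    from the empirical CDF of the sample, then averaged over the grid.
    """
    n, p = curves.shape
    ranks = curves.argsort(axis=0).argsort(axis=0) + 1   # ranks within each time point
    F = ranks / n                                        # empirical CDF values
    pointwise_depth = 1.0 - np.abs(0.5 - F)
    return pointwise_depth.mean(axis=1)

def depth_outliers(curves, quantile=0.1):
    """Flag curves whose depth falls below a chosen quantile as outlier candidates."""
    depth = fraiman_muniz_depth(curves)
    return np.where(depth <= np.quantile(depth, quantile))[0]

# toy usage: one shifted curve among smooth monthly profiles
t = np.linspace(0, 1, 12)
data = np.sin(2 * np.pi * t) + 0.05 * np.random.randn(30, 12)
data[5] += 2.0                                           # contaminated curve
print(depth_outliers(data))
```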
Funding: Supported by the Fundamental Research Funds for the Central Universities (Nos. 202341017, 202313024).
Abstract: Chlorophyll-a (Chl-a) concentration is a primary indicator for marine environmental monitoring. The spatio-temporal variations of sea surface Chl-a concentration in the Yellow Sea (YS) and the East China Sea (ECS) during 2001-2020 were investigated by reconstructing MODIS Level 3 products with the data interpolating empirical orthogonal functions (DINEOF) method. The reconstructions obtained by interpolating the combined MODIS daily + 8-day datasets were better than those obtained by interpolating the daily or 8-day data alone. Chl-a concentration in the YS and the ECS reached its maximum in spring, when blooms occurred, decreased in summer and autumn, and increased again in late autumn and early winter. By performing empirical orthogonal function (EOF) decomposition of the reconstructed fields and correlation analysis with several potential environmental factors, we found that sea surface temperature (SST) plays a significant role in the seasonal variation of Chl-a, especially during spring and summer. The increase of SST in spring, together with the upper-layer nutrients mixed up during the preceding winter, may favor the occurrence of spring blooms. The high SST throughout the summer would strengthen vertical stratification and prevent nutrient supply from deep water, resulting in low surface Chl-a concentrations. The sea surface Chl-a concentration in the YS decreased significantly from 2012 to 2020, which was possibly related to the Pacific Decadal Oscillation (PDO).
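As a rough illustration of the two ingredients named in the abstract, gap filling by iterated truncated SVD (the core idea behind DINEOF, which additionally selects the number of modes by cross-validation) and EOF decomposition via SVD, a minimal sketch on a synthetic space-time field is given below; the function names and the fixed number of modes are illustrative assumptions.

```python
import numpy as np

def dineof_like_fill(field, n_modes=3, n_iter=50):
    """Minimal DINEOF-style gap filling of a space-time data matrix.

    field : (n_time, n_space) array with NaN for missing (e.g. cloud-covered) values.
    Missing entries start as zero anomalies and are iteratively replaced by a
    truncated-SVD reconstruction until the estimate stabilizes.
    """
    mask = np.isnan(field)
    mean = np.nanmean(field)
    X = np.where(mask, 0.0, field - mean)          # anomalies, zeros in the gaps
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        X = np.where(mask, recon, X)               # only the missing entries are updated
    return X + mean

def leading_eofs(field, n_modes=2):
    """EOF decomposition via SVD: spatial patterns and variance fraction of each mode."""
    anom = field - field.mean(axis=0)
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    return Vt[:n_modes], var_frac[:n_modes]

# toy usage on a synthetic field with about 30% random gaps
rng = np.random.default_rng(1)
truth = np.outer(np.sin(np.linspace(0, 6, 120)), np.linspace(1, 2, 200))
obs = truth.copy()
obs[rng.random(truth.shape) < 0.3] = np.nan
filled = dineof_like_fill(obs)
patterns, var_frac = leading_eofs(filled)
print(var_frac)
```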
Abstract: An earthquake of Ms=6.9 occurred at Gonghe, Qinghai Province, China on April 26, 1990. Three larger aftershocks took place in the same region: Ms=5.0 on May 7, 1990, Ms=6.0 on Jan. 3, 1994, and Ms=5.7 on Feb. 16, 1994. The long-period recordings of the main shock from the China Digital Seismograph Network (CDSN) are deconvolved for the source time functions using the corresponding recordings of the three aftershocks as empirical Green's functions (EGFs). No matter which aftershock is taken as the EGF, the relative source time functions (RSTFs) obtained are nearly identical. The RSTFs suggest that the Ms=6.9 event consists of at least two subevents of approximately equal size whose occurrence times are about 30 s apart; the first has a duration of 12 s and a rise time of about 5 s, and the second has a duration of 17 s and a rise time of about & s. Comparing the RSTFs obtained from P- and SH-phases, we notice that those from SH-phases are slightly more complex than those from P-phases, implying that finer subevents exist within the rupture process of the main shock. It is interesting that the results from the EGF deconvolution of long-period waveform data are in good agreement with the results from the moment tensor inversion and from the EGF deconvolution of broadband waveform data. Additionally, the two larger aftershocks are deconvolved for their RSTFs. The deconvolution results show that the processes of the Ms=6.0 event on Jan. 3, 1994 and the Ms=5.7 event on Feb. 16, 1994 are quite simple: both RSTFs are single impulses. The RSTFs of the Ms=6.9 main shock obtained from different stations are azimuthally dependent, their shapes differing slightly from station to station. However, the RSTFs of the two smaller aftershocks are not azimuthally dependent. The integrations of the RSTFs over the source processes are quite close to each other, i.e., the scalar seismic moments estimated from different stations are in good agreement. Finally, the scalar seismic moments of the three aftershocks are compared. The relative scalar seismic moments of the three aftershocks deduced from the relative scalar seismic moments of the Ms=6.9 main shock are very close to those inverted directly from the EGF deconvolution. The relative scalar seismic moments of the Ms=6.9 main shock calculated using the three aftershocks as EGFs are 22 (the Ms=6.0 aftershock as EGF), 26 (the Ms=5.7 aftershock as EGF), and 66 (the Ms=5.5 aftershock as EGF), respectively. Deducing from those results, the relative scalar seismic moments of the Ms=6.0 to the Ms=5.7 events, the Ms=6.0 to the Ms=5.5 events, and the Ms=5.7 to the Ms=5.5 events are 1.18, 3.00, and 2.54, respectively. The corresponding relative scalar seismic moments calculated directly from the waveform recordings are 1.15, 3.43, and 3.05.
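A minimal sketch of the underlying operation, spectral division of a main-shock record by an aftershock record used as an empirical Green's function, is given below; the water-level regularization, the synthetic two-subevent source, and the name rstf_by_egf are illustrative assumptions, not the processing actually used in the study.

```python
import numpy as np

def rstf_by_egf(mainshock, aftershock, dt, water_level=0.01):
    """Frequency-domain deconvolution of an empirical Green's function (EGF).

    mainshock, aftershock : equally sampled waveforms from the same station/phase.
    dt : sampling interval in seconds.
    A water-level floor stabilizes the spectral division; the inverse FFT of the
    spectral ratio approximates the relative source time function (RSTF).
    """
    n = max(len(mainshock), len(aftershock))
    M = np.fft.rfft(mainshock, n)
    G = np.fft.rfft(aftershock, n)
    denom = np.abs(G) ** 2
    denom = np.maximum(denom, water_level * denom.max())   # water-level regularization
    rstf_spec = M * np.conj(G) / denom
    rstf = np.fft.irfft(rstf_spec, n)
    return np.arange(n) * dt, rstf

# toy usage: a main shock built as two boxcar subevents convolved with a common response
dt = 0.1
green = np.exp(-np.arange(0, 20, dt) / 2.0)                # stand-in path/instrument response
stf = np.zeros(600); stf[50:170] = 1.0; stf[350:520] = 1.0 # two subevents ~30 s apart
main = np.convolve(stf, green)
t, rstf = rstf_by_egf(main, green, dt)
print(t[np.argmax(rstf)])
```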
Abstract: For Hermite-Birkhoff interpolation of scattered multidimensional data by a radial basis function φ, existence and characterization theorems and a variational principle are proved. Examples include φ(r) = r^b, Duchon's thin-plate splines, Hardy's multiquadrics, and inverse multiquadrics.
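As a purely illustrative companion (plain Lagrange interpolation only, not the Hermite-Birkhoff setting with derivative functionals), the basis functions named in the abstract can be used for scattered-data interpolation as sketched below; note that the conditionally positive definite kernels (r^b, thin-plate splines) strictly require an appended polynomial term, which is omitted here for brevity.

```python
import numpy as np

def rbf_interpolate(centers, values, kernel, query):
    """Plain scattered-data RBF interpolation at the query points."""
    A = kernel(np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2))
    coeffs = np.linalg.solve(A, values)
    B = kernel(np.linalg.norm(query[:, None, :] - centers[None, :, :], axis=2))
    return B @ coeffs

# the basis functions named in the abstract (c is a shape parameter)
c = 1.0
multiquadric = lambda r: np.sqrt(r**2 + c**2)          # Hardy's multiquadric
inverse_mq   = lambda r: 1.0 / np.sqrt(r**2 + c**2)    # inverse multiquadric
power        = lambda r: r**3                          # phi(r) = r^b with b = 3

pts = np.random.rand(30, 2)
vals = np.sin(pts[:, 0]) * pts[:, 1]
print(rbf_interpolate(pts, vals, multiquadric, np.array([[0.5, 0.5]])))
```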
Abstract: We use radial basis functions (RBFs) to reconstruct smooth surfaces from 3D scattered data. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. We propose improvements to the methods of surface reconstruction with radial basis functions. A sparse approximating set of scattered data is constructed by reducing the number of interpolation points on the surface, and we present an adaptive method for finding the off-surface normal points. The order of the resulting system of equations decreases greatly as the number of off-surface constraints is gradually reduced. Experimental results illustrate that the proposed method is robust and produces visually pleasing reconstructions.
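A minimal sketch of the baseline construction, on-surface points constrained to zero and points displaced along the normals constrained to ±offset, is shown below; the adaptive selection of off-surface points and the center reduction proposed in the paper are not reproduced, and the kernel and parameter values are illustrative.

```python
import numpy as np

def implicit_rbf_surface(points, normals, offset=0.05, c=0.1):
    """Fit an implicit function f with f = 0 at the surface samples and
    f = +/-offset at points displaced along the (outward) normals.

    Returns a callable f(query_points); the reconstructed surface is its zero set.
    """
    # assemble constraint points: on-surface (value 0) plus off-surface pairs
    cons = np.vstack([points, points + offset * normals, points - offset * normals])
    vals = np.concatenate([np.zeros(len(points)),
                           np.full(len(points), offset),
                           np.full(len(points), -offset)])

    def kernel(r):
        return np.sqrt(r**2 + c**2)                      # multiquadric basis

    A = kernel(np.linalg.norm(cons[:, None, :] - cons[None, :, :], axis=2))
    w = np.linalg.solve(A, vals)

    def f(q):
        B = kernel(np.linalg.norm(q[:, None, :] - cons[None, :, :], axis=2))
        return B @ w
    return f

# toy usage: noisy-free samples of the unit sphere (normal equals the point itself)
rng = np.random.default_rng(0)
p = rng.normal(size=(200, 3))
p /= np.linalg.norm(p, axis=1, keepdims=True)
f = implicit_rbf_surface(p, p)
print(f(np.array([[0, 0, 0.9], [0, 0, 1.0], [0, 0, 1.1]])))   # inside, on, and outside the sphere
```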
Funding: Supported by the National Natural Science Foundation of China (60474022).
Abstract: The method chosen for data classification influences the efficiency of classification. Attribute reduction based on the discernibility matrix and discernibility function in rough set theory can be used for data classification, so we put forward a classification method built on it. First, we use the discernibility matrix and discernibility function to delete superfluous attributes from the information system and obtain a necessary attribute set. Second, we delete superfluous attribute values and obtain decision rules. Finally, we classify data by means of these decision rules. Experiments show that data classification using this method has a simpler structure and improves the efficiency of classification.
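A small sketch of the first step is given below: building the discernibility entries and covering them to obtain a reduct. The paper derives reducts from the discernibility function itself; the sketch substitutes a greedy set-cover approximation, and the toy decision table is invented for illustration.

```python
from itertools import combinations

def discernibility_matrix(objects, decisions):
    """Discernibility entries for every pair of objects with different decisions.

    objects   : list of tuples of condition-attribute values.
    decisions : list of decision values of the same length.
    Each entry is the set of attribute indices on which the two objects differ.
    """
    entries = []
    for i, j in combinations(range(len(objects)), 2):
        if decisions[i] != decisions[j]:
            diff = {a for a, (u, v) in enumerate(zip(objects[i], objects[j])) if u != v}
            if diff:
                entries.append(diff)
    return entries

def greedy_reduct(entries, n_attrs):
    """Greedy cover of the discernibility entries: repeatedly keep the attribute
    that discerns the most still-uncovered pairs until every entry is covered."""
    uncovered, reduct = list(entries), set()
    while uncovered:
        best = max(range(n_attrs), key=lambda a: sum(a in e for e in uncovered))
        reduct.add(best)
        uncovered = [e for e in uncovered if best not in e]
    return reduct

# toy decision table: 3 condition attributes, binary decision
objs = [(1, 0, 2), (1, 1, 2), (0, 0, 1), (0, 1, 1)]
dec  = [1, 1, 0, 0]
print(greedy_reduct(discernibility_matrix(objs, dec), n_attrs=3))
```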
Funding: Supported by the National Hi-tech Research and Development Program of China (863 Program, Grant No. 2007AA04Z433), the Hunan Provincial Natural Science Foundation of China (Grant No. 09JJ8005), and the Scientific Research Foundation of the Graduate School of Beijing University of Chemical Technology, China (Grant No. 10Me002).
Abstract: Because differences in sensor precision and various random factors are difficult to control, the actual measured signals deviate from the target signals, which affects the reliability and precision of rotating machinery fault diagnosis. Traditional signal processing methods, such as classical inference and weighted averaging, usually lack dynamic adaptability, so faults can easily be misjudged or missed. To enhance the accuracy and precision of vibration measurements in multi-sensor fault diagnosis of rotating machinery, a novel data-level fusion approach based on correlation function analysis is presented to quickly determine the weights of multi-sensor vibration signals. The approach does not require prior information about the sensors; the weights are determined from the correlation measures of the real-time data acquired during the data-level fusion process. Sensor signals with larger correlation measures receive larger weights, and vice versa. The approach can effectively suppress large errors and can still fuse data in the case of sensor failures, because it takes full advantage of the sensors' own information to determine the weights. Moreover, it has good anti-interference performance, since the correlation measures between noise and the effective signals are usually small. Using typical signals collected from multiple sensors, a comparative analysis of the dynamic adaptability and fault tolerance of the proposed approach and the traditional weighted averaging approach is carried out. Finally, a rotor dynamics and integrated fault simulator is used to verify the feasibility and advantages of the proposed approach; the results show that multi-sensor data-level fusion based on correlation-function weighting outperforms the traditional weighted averaging approach in fusion precision and dynamic adaptability. The approach is also adaptable, easy to use, and applicable to other areas of vibration measurement.
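The paper's exact correlation functions and real-time weighting scheme are not given in the abstract; the sketch below only illustrates the basic idea of weighting each sensor by its correlation with the other sensors, so that a drifting or faulty channel receives a small weight.

```python
import numpy as np

def correlation_weighted_fusion(signals):
    """Data-level fusion of multi-sensor vibration signals by correlation weighting.

    signals : (n_sensors, n_samples) array of synchronously sampled signals.
    Each sensor's weight is proportional to its total correlation with the other
    sensors, so signals that agree with the ensemble dominate the fused output.
    """
    R = np.corrcoef(signals)                      # pairwise correlation coefficients
    support = np.abs(R).sum(axis=1) - 1.0         # each sensor's agreement with the others
    w = support / support.sum()
    return w, w @ signals

# toy usage: three good sensors plus one attenuated, noisy one
t = np.linspace(0, 1, 2000)
clean = np.sin(2 * np.pi * 50 * t)
sensors = np.vstack([clean + 0.05 * np.random.randn(2000) for _ in range(3)]
                    + [0.3 * clean + 2.0 * np.random.randn(2000)])
weights, fused = correlation_weighted_fusion(sensors)
print(np.round(weights, 3))                       # the faulty sensor gets the smallest weight
```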
Abstract: The objective of this paper is to present a review of different calibration and classification methods for functional data in the context of chemometric applications. In chemometrics, it is usual to measure certain parameters in terms of a set of spectrometric curves that are observed at a finite set of points (functional data). Although the predictor variable is clearly functional, this problem is usually solved by using multivariate calibration techniques that treat it as a finite set of variables associated with the observed points (wavelengths or times). But these explanatory variables are highly correlated, and it is therefore more informative to first reconstruct the true functional form of the predictor curves. Although several articles have been published on the application of functional data analysis techniques in chemometrics, their power to solve real problems is not yet well known. For this reason, the extension of multivariate calibration techniques (linear regression, principal component regression, and partial least squares) and classification methods (linear discriminant analysis and logistic regression) to the functional domain, together with some relevant chemometric applications, is reviewed in this paper.
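As one concrete instance of the techniques being reviewed, a minimal functional principal component regression on discretized spectra might look as follows; in a full functional treatment the curves would first be smoothed with a basis expansion, and the toy Gaussian-band spectra and the name functional_pcr are invented for illustration.

```python
import numpy as np

def functional_pcr(spectra, y, n_components=3):
    """Functional principal component regression on discretized spectra.

    spectra : (n_samples, n_wavelengths) matrix of absorbance curves.
    y       : (n_samples,) chemical concentrations to calibrate against.
    The curves are centred, projected on their leading principal components
    (computed by SVD), and the response is regressed on the component scores.
    Returns a prediction function for new spectra.
    """
    mean_curve, mean_y = spectra.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(spectra - mean_curve, full_matrices=False)
    components = Vt[:n_components]                     # discretized eigenfunctions
    scores = (spectra - mean_curve) @ components.T
    beta, *_ = np.linalg.lstsq(scores, y - mean_y, rcond=None)

    def predict(new_spectra):
        return mean_y + ((new_spectra - mean_curve) @ components.T) @ beta
    return predict

# toy usage: concentrations drive the amplitude of a Gaussian absorption band
wl = np.linspace(400, 700, 150)
conc = np.random.rand(40)
X = conc[:, None] * np.exp(-((wl - 550) / 30) ** 2) + 0.01 * np.random.randn(40, 150)
predict = functional_pcr(X, conc)
print(np.corrcoef(predict(X), conc)[0, 1])
```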
Funding: Zhou's research was partially supported by the NNSF of China (10471140, 10571169); Wu's research was partially supported by the NNSF of China (0571170).
Abstract: A kernel-type estimator of the quantile function Q(p) = inf{t : F(t) ≥ p}, 0 ≤ p ≤ 1, is proposed based on the kernel smoother when the data are subject to random truncation. Bahadur-type representations of the kernel smooth estimator are established, and from these representations the authors show that the estimator is strongly consistent, asymptotically normal, and weakly convergent.
Abstract: The purpose of this paper is to study the theory of conservative estimating functions in nonlinear regression models with aggregated data. In this model, a quasi-score function with aggregated data is defined. When this function happens to be conservative, it is the projection of the true score function onto a class of estimating functions. By construction, the potential function of the projected score with aggregated data is obtained, which has some properties of a log-likelihood function.
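For orientation, the classical (non-aggregated) quasi-score and the conservativeness condition can be written as below; the aggregated-data version studied in the paper modifies the mean and variance functions, so this is only the standard background, not the paper's definition.

```latex
% Classical quasi-score for observations y_i with mean \mu_i(\beta) and working variance V(\mu_i):
g(\beta) \;=\; \sum_{i=1}^{n}
  \left(\frac{\partial \mu_i(\beta)}{\partial \beta}\right)^{\!\top}
  V\!\big(\mu_i(\beta)\big)^{-1}\,\big(y_i - \mu_i(\beta)\big).

% g is called conservative when its Jacobian is symmetric,
\frac{\partial g_j(\beta)}{\partial \beta_k}
  \;=\; \frac{\partial g_k(\beta)}{\partial \beta_j}
  \qquad \text{for all } j,k,

% in which case a potential Q with \nabla_\beta Q(\beta) = g(\beta) exists and plays
% the role of a log-(quasi-)likelihood, obtainable by a line integral:
Q(\beta) \;=\; Q(\beta_0) + \int_{0}^{1}
  g\big(\beta_0 + t(\beta-\beta_0)\big)^{\!\top}(\beta - \beta_0)\, dt .
```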
Abstract: The authors have applied a systems analysis approach to describe the musculoskeletal system as a stack of superimposed kinematic hierarchical segments in which each lower segment tends to transfer its motion to the segments superimposed on it. This segmental chain enables the derivation of both conscious perception and sensory control of action in space. The applied systems analysis approach involves measurements of complex motor behavior in order to elucidate the fusion of multiple sensor data for the reliable and efficient acquisition of the kinetic, kinematic and electromyographic data of human spatial behavior. The acquired kinematic and related kinetic signals represent attributive features of the internal reconstruction of the physical links between the superimposed body segments. Indeed, this reconstruction of the physical links was established as a result of the fusion of the multiple sensor data. Furthermore, the acquired kinematic, kinetic and electromyographic data provided detailed means to record, annotate, process, transmit, and display pertinent information derived from the musculoskeletal system, to quantify and differentiate between subjects with mobility-related disabilities and able-bodied subjects, and to enable an inference into the active neural processes underlying balance reactions. To gain insight into the basis for this long-term dependence, the authors have applied the fusion of multiple sensor data to investigate the effects of cerebral palsy, multiple sclerosis and diabetic neuropathy on biomechanical and neurophysiological changes that may alter the ability of the human locomotor system to generate ambulation, balance and posture.
Funding: Supported by the Science and Technology Development Fund of Macao (China) grants (No. 042/2007/A3, No. 003/2008/A1), and partly supported by NSFC Project (No. 10631080) and the National Key Basic Research Project of China grant (No. 2004CB318000).
Abstract: In this paper, a new criterion is derived to obtain the optimum fitting curve when using cubic B-spline basis functions to remove statistical noise from spectroscopic data. In this criterion, smoothed fitting curves using cubic B-spline basis functions are first computed with an increasing number of knots. The best fitting curves are then selected according to the value of the minimum residual sum of squares (RSS) of two adjacent fitting curves. If more than one best fitting curve results, the authors use Reinsch's first condition to find a better one. The minimum RSS of the fitting curve with respect to the noisy data is not recommended as the criterion for determining the best fitting curve, because this value decreases to zero as the number of selected channels increases, and the minimum value gives no smoothing effect. Compared with Reinsch's method, the derived criterion is simple and enables the smoothing conditions to be determined automatically without any initial input parameter. With the derived criterion, satisfactory results were obtained in removing statistical noise from experimental spectroscopic data using cubic B-spline basis functions.
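One plausible reading of the criterion, stop increasing the knot number once two adjacent least-squares fits give nearly the same RSS, can be sketched with SciPy as follows; the quantile knot placement, the threshold, and the omission of Reinsch's first condition are simplifying assumptions rather than the paper's actual rule.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def smooth_with_rss_criterion(x, y, max_knots=40, tol=None):
    """Cubic B-spline smoothing with an automatic choice of the knot number.

    Least-squares cubic splines are fitted with an increasing number of interior
    knots (placed at quantiles of x); fitting stops at the knot count where the
    residual sum of squares (RSS) of two adjacent fits barely changes.
    """
    prev_rss, prev_spline = None, None
    for m in range(2, max_knots):
        knots = np.quantile(x, np.linspace(0, 1, m + 2)[1:-1])   # interior knots only
        spline = LSQUnivariateSpline(x, y, knots, k=3)
        rss = float(np.sum((spline(x) - y) ** 2))
        if prev_rss is not None:
            if tol is None:
                tol = 0.01 * prev_rss                            # crude illustrative threshold
            if abs(prev_rss - rss) < tol:
                return prev_spline                               # adjacent fits agree: stop
        prev_rss, prev_spline = rss, spline
    return prev_spline

# toy usage: a noisy spectroscopic-like peak
x = np.linspace(0, 10, 400)
y = np.exp(-((x - 5) / 0.8) ** 2) + 0.05 * np.random.randn(400)
fit = smooth_with_rss_criterion(x, y)
print(float(np.sum((fit(x) - y) ** 2)))
```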
Funding: Supported by the National Natural Science Youth Foundation (10401021).
Abstract: Solving large radial basis function (RBF) interpolation problems with non-customized methods is computationally expensive, and the matrices that occur are typically badly conditioned. In order to avoid these difficulties, we present a least-squares fitting based on radial basis functions satisfying side conditions; although the method loses some accuracy compared with interpolation, it greatly reduces the computational cost. Since the fitting accuracy and the non-singularity of the coefficient matrix in the normal equations depend on the uniformity of the chosen centers of the fitted RBF, we present a method for choosing uniform centers. Numerical results confirm the efficiency of the fitting.
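A minimal sketch of the basic formulation, least-squares fitting with far fewer (uniformly placed) centers than data sites, is given below; the side conditions and the paper's specific rule for choosing uniform centers are not reproduced, and the kernel, shape parameter, and hand-picked center grid are illustrative.

```python
import numpy as np

def rbf_least_squares_fit(x, y, centers, c=0.5):
    """Least-squares RBF fit with far fewer centres than data points.

    x       : (n, d) data sites, y : (n,) values to approximate.
    centers : (m, d) chosen centres, m << n (e.g. on a uniform grid).
    Solves the overdetermined system Phi(x, centers) w ~ y by least squares
    instead of interpolating, which keeps the system small and better conditioned.
    """
    def phi(r):
        return np.sqrt(r**2 + c**2)                      # multiquadric basis
    Phi = phi(np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    def evaluate(q):
        return phi(np.linalg.norm(q[:, None, :] - centers[None, :, :], axis=2)) @ w
    return evaluate

# toy usage: 2000 scattered samples of a smooth 2-D function, 36 uniform centres
rng = np.random.default_rng(0)
x = rng.random((2000, 2))
y = np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])
g = np.linspace(0.05, 0.95, 6)
centers = np.array([[u, v] for u in g for v in g])        # uniform 6 x 6 centre grid
f = rbf_least_squares_fit(x, y, centers)
print(float(np.abs(f(x) - y).max()))
```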
Abstract: In this paper, based on randomly left-truncated and right-censored data, the authors derive strong representations of the cumulative hazard function estimator and the product-limit estimator of the survival function, which are valid up to a given order statistic of the observations. A precise bound for the errors is obtained which depends only on the index of the last order statistic to be included.