Experimental and theoretical studies of the mechanisms of vibration stimulation of oil recovery in watered fields lead to the conclusion that resonance oscillations develop in fractured-block formations. These oscillations, caused by weak but long-lasting and frequency-stable influences, create the conditions for the generation of ultrasonic waves in the layers, which are capable of destroying thickened oil membranes in reservoir cracks. For fractured-porous reservoirs exploited by high-pressure water displacement of oil, the possibility of intensifying ultrasonic vibrations can have important technological significance. Even very weak ultrasound can, over a long period of time, destroy the viscous oil membranes formed in the cracks between the blocks; these membranes lower the permeability of the layers, so their destruction can increase oil recovery. To describe these effects, it is necessary to consider the wave process in a hierarchically blocky medium and to theoretically model the mechanism by which self-oscillations appear under the action of relaxation shear stresses. For the time analysis of the seismoacoustic response at fixed intervals along the borehole, an algorithm based on phase diagrams of the state of the multiphase medium is suggested.
Mitigating the increasing incidence of cyberattacks may require strategies such as reinforcing organizations' networks with Honeypots and effectively analyzing attack traffic to detect zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and methods, which are required for effective visual perception of the useful attack information hidden in attack data. Additionally, Honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we review visualization practices and methods commonly used in the discovery and communication of attack patterns based on Honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only the 37 papers with high impact. Most Honeypot papers computed summary statistics of Honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed Honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, commonly visualizing attack data with scatter and line plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results.
While a significant number of automated visualization tools incorporate visualization standards by default, constructing effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally interdisciplinary collaboration with peers. We therefore suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
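As a minimal illustration of the box-plot-style summaries the review recommends, the sketch below computes the five-number summary of a hypothetical packet-size sample (the data values are invented for the example; real Honeypot traffic would supply them):

```python
import numpy as np

def five_number_summary(values):
    """Min, Q1, median, Q3, max -- the statistics a box plot draws,
    useful for sanity-checking e.g. packet-size distributions."""
    q = np.percentile(values, [0, 25, 50, 75, 100])
    return dict(zip(["min", "q1", "median", "q3", "max"], q))

# Hypothetical packet sizes (bytes) from honeypot traffic.
sizes = np.array([40, 40, 60, 60, 60, 576, 576, 1500, 1500, 1500, 1500, 9000])
summary = five_number_summary(sizes)
```

Unlike a pie chart of the same data, the quartiles immediately expose the skew toward a few very large packets.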
Seeing is an important index for evaluating the quality of an astronomical site. To estimate seeing at the Muztagh-Ata site quantitatively as a function of height and time, the European Centre for Medium-Range Weather Forecasts reanalysis database (ERA5) is used. Seeing calculated from ERA5 is consistent with the Differential Image Motion Monitor seeing at a height of 12 m. Results show that seeing decays exponentially with height at the Muztagh-Ata site. In 2021, seeing decayed with height fastest in fall and most slowly in summer, and the seeing condition was better in fall than in summer. The median seeing at 12 m is 0.89 arcsec; the monthly maximum is 1.21 arcsec in August and the minimum is 0.66 arcsec in October. The median seeing at 12 m is 0.72 arcsec in the nighttime and 1.08 arcsec in the daytime. Seeing is a combination of annual and roughly biannual variations with the same phase as temperature and wind speed, indicating that the time variation of seeing is influenced by temperature and wind speed. The Richardson number Ri is used to analyze atmospheric stability, and the variations of seeing are consistent with Ri between layers. These quantitative results can provide an important reference for telescope observation strategy.
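The reported exponential decay of seeing with height can be recovered by a simple log-linear least-squares fit. The sketch below uses synthetic data and an assumed functional form seeing(h) = a * exp(-h / H); it is not the paper's ERA5 pipeline, and the amplitude and scale height are invented:

```python
import numpy as np

def fit_exponential_decay(h, seeing):
    """Fit seeing(h) = a * exp(-h / H) by least squares on log(seeing).

    log(seeing) = log(a) - h / H is linear in h, so an ordinary
    line fit recovers both parameters.  Returns (a, H).
    """
    slope, intercept = np.polyfit(h, np.log(seeing), 1)
    return np.exp(intercept), -1.0 / slope

# Synthetic profile: a = 1.2 arcsec near the ground, scale height H = 2000 m.
h = np.linspace(0.0, 5000.0, 50)
seeing = 1.2 * np.exp(-h / 2000.0)
a, H = fit_exponential_decay(h, seeing)
```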
Open clusters (OCs) serve as invaluable tracers for investigating the properties and evolution of stars and galaxies. Despite recent advancements in machine learning clustering algorithms, accurately discerning such clusters remains challenging. We revisited the 3013 samples generated with a hybrid clustering algorithm combining FoF and pyUPMASK. A multi-view clustering (MvC) ensemble method was applied, which analyzes each member star of an OC from three perspectives (proper motion, spatial position, and a composite view) before integrating the clustering outcomes to deduce more reliable cluster memberships. Based on the MvC results, we further excluded cluster candidates with fewer than ten member stars, obtaining 1256 OC candidates. After isochrone fitting and visual inspection, we identified 506 candidate OCs in the Milky Way. Beyond the 493 previously reported candidates, we discovered 13 high-confidence new candidate clusters.
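A toy sketch of the multi-view consensus idea (not the paper's MvC algorithm): a star is retained as a cluster member only if enough of the views agree on it. The star IDs and vote threshold below are invented for illustration:

```python
def consensus_members(view_memberships, min_votes=2):
    """Keep a star only if at least `min_votes` of the views assign it
    to the cluster -- a simple stand-in for multi-view consensus."""
    votes = {}
    for view in view_memberships:
        for star in view:
            votes[star] = votes.get(star, 0) + 1
    return {star for star, v in votes.items() if v >= min_votes}

# Hypothetical member lists from the three views.
pm_view = {1, 2, 3, 4, 7}          # proper-motion view
spatial_view = {2, 3, 4, 5}        # spatial-position view
composite_view = {2, 3, 4, 7, 9}   # composite view
members = consensus_members([pm_view, spatial_view, composite_view])
```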
Fast and reliable localization of high-energy transients is crucial for characterizing burst properties and guiding follow-up observations. Localization based on the relative counts of different detectors has been widely used for all-sky gamma-ray monitors. There are two major methods for this count-distribution localization: the χ² minimization method and the Bayesian method. Here we propose a modified Bayesian method that takes advantage of both the accuracy of the Bayesian method and the simplicity of the χ² method. With comprehensive simulations, we find that our Bayesian method with a Poisson likelihood is generally more applicable to various bursts than the χ² method, especially for weak bursts. We further propose a location-spectrum iteration approach based on Bayesian inference, which can alleviate the problems caused by the spectral difference between the burst and the location templates. Our method is well suited to scenarios with limited computation resources or time-sensitive applications, such as in-flight localization software and low-latency localization for rapid follow-up observations.
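A minimal sketch of count-distribution localization with a Poisson likelihood, in the spirit of the proposed method (the detector templates and counts below are synthetic, and the real method handles spectra and priors rather than scanning a bare grid):

```python
import numpy as np

def localize_poisson(counts, templates, bkg):
    """Grid localization by maximizing the Poisson log-likelihood.

    counts    -- observed counts per detector, shape (n_det,)
    templates -- expected source counts per sky-grid point, shape (n_grid, n_det)
    bkg       -- background counts per detector, shape (n_det,)
    Returns the index of the best-fitting grid point.
    """
    mu = templates + bkg                       # expected counts per hypothesis
    # Poisson log-likelihood up to a constant: sum(n * log(mu) - mu)
    loglike = np.sum(counts * np.log(mu) - mu, axis=1)
    return int(np.argmax(loglike))

rng = np.random.default_rng(0)
templates = rng.uniform(5.0, 50.0, size=(100, 12))  # 100 sky points, 12 detectors
bkg = np.full(12, 20.0)
true_idx = 37
counts = rng.poisson(templates[true_idx] + bkg)     # noisy observation
best = localize_poisson(counts, templates, bkg)
```

Because the Poisson log-likelihood per detector peaks where the expected rate equals the observed count, noiseless counts recover the generating grid point exactly.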
In the two-dimensional positioning method for pulsars, the grid method is used to provide non-sensitive direction and positional estimates. However, the grid method has a high computational load and low accuracy due to the grid interval. To improve estimation accuracy and reduce the computational load, we propose a fast two-dimensional positioning method for the Crab pulsar based on multiple optimization algorithms (FTPCO). The FTPCO uses the Levenberg–Marquardt (LM) algorithm, a three-point orientation (TPO) method, particle swarm optimization (PSO), and a Newton–Raphson-based optimizer (NRBO) in place of the grid method. First, to avoid the influence of the non-sensitive direction on positioning, we take the orbital error and the distortion of the pulsar profile as optimization objectives and combine the grid method with the LM algorithm or PSO to search for the non-sensitive direction. Then, on the sensitive plane perpendicular to the non-sensitive direction, the TPO method is proposed to quickly search for the sensitive and sub-sensitive directions. Finally, the NRBO is employed along the sensitive and sub-sensitive directions to achieve two-dimensional positioning of the Crab pulsar. Simulation results show that, compared with the grid method, the FTPCO reduces the computational load by 89.4% and improves positioning accuracy by approximately 38%. The FTPCO has the advantage of high real-time accuracy and does not fall into local optima.
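The NRBO is a specific published optimizer; as a stand-in, the sketch below shows only the classical one-dimensional Newton–Raphson refinement step of the kind used for a final local search along a chosen direction (the objective function is invented for the example):

```python
def newton_minimize(f_prime, f_second, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson iteration x <- x - f'(x)/f''(x) for 1-D minimisation:
    the kind of fast local refinement used after a coarse direction search."""
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_second(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Minimise f(x) = (x - 3)^4 + x^2, whose derivative 4(x-3)^3 + 2x
# vanishes at x = 2 (and f'' > 0 there, so it is the minimum).
xmin = newton_minimize(lambda x: 4 * (x - 3) ** 3 + 2 * x,
                       lambda x: 12 * (x - 3) ** 2 + 2,
                       x0=0.0)
```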
To address the problem of real-time processing of ultra-wide-bandwidth pulsar baseband data, we designed and implemented a pulsar baseband data processing algorithm (PSRDP) based on GPU parallel computing technology. PSRDP can perform operations such as baseband data unpacking, channel separation, coherent dedispersion, Stokes detection, phase and folding-period prediction, and folding integration on GPU clusters. We tested the algorithm using J0437-4715 baseband data generated by the CASPSR and Medusa backends at Parkes, and J0332+5434 baseband data generated by the self-developed backend of the Nan Shan Radio Telescope, obtaining the pulse profile for each data set. Experimental analysis shows that the pulse profiles generated by the PSRDP algorithm are essentially consistent with the processing results of the Digital Signal Processing Software for Pulsar Astronomy (DSPSR), which verifies the effectiveness of PSRDP. Furthermore, using the same baseband data, we compared the processing speed of PSRDP with that of DSPSR; the results show that PSRDP is not slower than DSPSR. The theoretical and technical experience gained from the PSRDP research lays a technical foundation for the real-time processing of ultra-wide-bandwidth pulsar baseband data from the QTT (Qi Tai radio Telescope).
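The coherent dedispersion step can be sketched as a Fourier-domain division by the cold-plasma dispersion chirp. This is a simplified round-trip demonstration with an assumed transfer-function convention and synthetic data, not PSRDP's GPU implementation:

```python
import numpy as np

DM_CONST = 4.148808e3  # dispersion constant, MHz^2 pc^-1 cm^3 s

def dispersion_chirp(n, bw_mhz, f0_mhz, dm):
    """Phase factor of the dispersion kernel for an n-point complex
    baseband channel of width bw_mhz centred on f0_mhz (assumed convention)."""
    f = np.fft.fftfreq(n, d=1.0 / bw_mhz)  # offset from band centre, MHz
    phase = 2.0 * np.pi * DM_CONST * 1e6 * dm * f ** 2 / (f0_mhz ** 2 * (f0_mhz + f))
    return np.exp(1j * phase)

def coherently_dedisperse(x, bw_mhz, f0_mhz, dm):
    """Undo interstellar dispersion by dividing out the chirp in the
    Fourier domain -- the core step of coherent dedispersion."""
    chirp = dispersion_chirp(len(x), bw_mhz, f0_mhz, dm)
    return np.fft.ifft(np.fft.fft(x) / chirp)

# Round trip: disperse a synthetic baseband snippet, then dedisperse it.
rng = np.random.default_rng(1)
voltage = rng.normal(size=4096) + 1j * rng.normal(size=4096)
chirp = dispersion_chirp(len(voltage), bw_mhz=64.0, f0_mhz=1400.0, dm=2.64)
dispersed = np.fft.ifft(np.fft.fft(voltage) * chirp)
recovered = coherently_dedisperse(dispersed, 64.0, 1400.0, 2.64)
```

Since the chirp has unit modulus, dividing it out in the Fourier domain inverts the dispersion exactly up to floating-point error.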
In recent years, improper allocation of safety input has prevailed in coal mines in China, resulting in frequent accidents in coal mining operations. A comprehensive assessment of the efficiency of coal mine safety input should lead to improved use of funds and management resources, helping government and enterprise managers better understand how safety inputs are used and optimize the allocation of resources. This paper studies the efficiency assessment of coal mine safety input. A C²R model with a non-Archimedean infinitesimal vector based on output is established after consideration of the input characteristics and the model properties. An assessment of an operating mine was conducted using a specific set of input and output criteria. The safety input was found to be efficient in 2002 and 2005 and weakly efficient in 2003, while efficiency was relatively low in both 2001 and 2004. The safety input resources can be optimized and adjusted by means of projection theory. Such analysis shows that, on average in 2001 and 2004, 45% of the expended funds could have been saved, 10% of the safety management and technical staff could have been eliminated, and working hours devoted to safety could have been reduced by 12%, while giving the same results.
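For reference, the basic input-oriented CCR (C²R) envelopment LP can be solved with an off-the-shelf linear programming routine. This sketch omits the non-Archimedean infinitesimal and slack terms of the paper's output-based model and uses made-up data:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (envelopment form):
    minimise theta  s.t.  X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0].

    X: inputs, shape (m, n);  Y: outputs, shape (s, n);  n DMUs.
    Decision vector is [theta, lam_1, ..., lam_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # objective: minimise theta
    A_in = np.hstack([-X[:, [j0]], X])            # X lam - theta x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])     # -Y lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Two inputs, one output, four DMUs; DMU 0 dominates DMU 3.
X = np.array([[2.0, 4.0, 3.0, 4.0],
              [3.0, 2.0, 4.0, 6.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
```

DMU 0 lies on the frontier (score 1), while DMU 3 could produce the same output with half its inputs (score 0.5).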
A set of indices is proposed for evaluating the performance of business processes with multiple inputs and multiple outputs, as found in machinery manufacturers. Based on the traditional methods of data envelopment analysis (DEA) and the analytic hierarchy process (AHP), a hybrid model called the DEA/AHP model is proposed for evaluating business process performance. With the proposed method, DEA is first used to develop a pairwise comparison matrix, and AHP is then applied to evaluate the performance of the business process using that matrix. The significant advantage of this hybrid model is the use of objective data instead of subjective human judgment for performance evaluation. In a case study, a business process reengineering (BPR) project with a hydraulic machinery manufacturer is used to demonstrate the effectiveness of the DEA/AHP model.
The application of data envelopment analysis (DEA) as a multiple-criteria decision making (MCDM) technique has been gaining more and more attention in recent research. In practice, uncertainties in the input and output data of a decision making unit (DMU) may make the nominal solution infeasible and render the efficiency scores meaningless from a practical point of view. This paper analyzes the impact of data uncertainty on DEA evaluation results and proposes several robust DEA models, based on recently developed robust optimization approaches, that are immune to input and output data uncertainties. The robust DEA models are based on the input-oriented and output-oriented CCR models, for uncertainties appearing in the output data and the input data, respectively. Furthermore, the robust DEA models can deal with random symmetric uncertainty and unknown-but-bounded uncertainty, in both of which the distributions of the random data entries are permitted to be unknown. The robust DEA models are implemented in a numerical example, and the efficiency scores and rankings of the models are compared. The results indicate that the robust DEA approach can be a more reliable method for efficiency evaluation and ranking in MCDM problems.
This paper establishes the phase space in the light of spatial series data, discusses the fractal structure of geological data in terms of correlation functions, and studies the chaos of these data. In addition, it introduces R/S analysis, originally developed for time series, into spatial series in order to calculate the structural fractal dimensions of the range and standard deviation of spatial series data, to establish the fractal dimension matrix, and to give the procedure for plotting the fractal dimension anomaly diagram with vector distances of fractal dimension. Finally, examples of its application are given.
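R/S analysis can be sketched as follows: compute the rescaled range over windows of increasing size and read the scaling (Hurst) exponent off the log-log slope. The sketch applies it to white noise, for which the exponent should come out near 0.5; the window sizes are arbitrary choices, and a spatial series would be treated the same way as this synthetic one:

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of one window: range of the cumulative
    mean-adjusted sum divided by the standard deviation."""
    y = np.cumsum(x - np.mean(x))
    return (np.max(y) - np.min(y)) / np.std(x)

def hurst_exponent(x, window_sizes):
    """Estimate H from the slope of log(R/S) against log(window size)."""
    log_rs, log_n = [], []
    for n in window_sizes:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        log_rs.append(np.log(np.mean([rescaled_range(c) for c in chunks])))
        log_n.append(np.log(n))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(2)
white = rng.normal(size=4096)  # uncorrelated noise: H should be near 0.5
H = hurst_exponent(white, [16, 32, 64, 128, 256])
```

Note the small-sample R/S estimator is biased slightly above 0.5 for short windows, so only the rough range is checked here.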
Data envelopment analysis (DEA) has become a standard nonparametric approach to productivity analysis, especially to the relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs that cannot easily be solved by regression analysis. However, traditional DEA models cannot handle undesirable outputs. In this paper, the inherent relationship between goal programming and the DEA method is therefore explored, based on the relationship between multiple-goal programming and goal programming, and a mixed DEA model is built in which all input factors and undesirable outputs decrease in different proportions while, at the same time, all desirable output factors increase in different proportions.
This paper introduces the basic theory and algorithm of the surrogate data method, which provides a rigorous way to detect random and seemingly stochastic characteristics in a system. Gaussian data and Rössler data are used to show the availability and effectiveness of the method. Analysis by this method of short-circuiting current signals obtained under the same voltage but different wire feed speeds demonstrates that the electrical signal time series exhibit apparent randomness when the welding parameters do not match, but are deterministic when a match is found. The stability of the short-circuiting transfer process can thus be judged exactly by the surrogate data method.
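The core of the surrogate data method, generating phase-randomized surrogates that preserve the linear (spectral) structure of the series while destroying any deterministic structure, can be sketched as follows (a minimal version; a practical test compares a discriminating statistic across many surrogates):

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Fourier-transform surrogate: keep the amplitude spectrum,
    randomize the phases.  Linear correlations are preserved while
    nonlinear/deterministic structure is destroyed."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spec))
    phases[0] = 0.0           # keep the DC component real
    if n % 2 == 0:
        phases[-1] = 0.0      # keep the Nyquist bin real for even n
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n)

rng = np.random.default_rng(3)
x = np.sin(0.3 * np.arange(1024)) + 0.1 * rng.normal(size=1024)
s = phase_randomized_surrogate(x, rng)
```

The surrogate has exactly the same power spectrum as the original series, which is what makes it a valid null realization for the "linear stochastic process" hypothesis.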
This paper proposes a new approach for ranking efficient units in data envelopment analysis as a modification of the super-efficiency models developed by Tone [1]. The new approach, based on the slacks-based measure of efficiency (SBM), modifies the objective function used to classify the decision-making units, allows the ranking of all inefficient DMUs, and overcomes the disadvantage of infeasibility. The method is applied to rank super-efficiency scores for a sample of 145 agricultural bank branches in Viet Nam during 2007-2010. We then compare the estimated results from the new SCI model and the existing SBM model using statistical tests.
According to eco-efficiency theory, combined with the characteristics of agricultural production, I point out the environmental impact and the material and energy consumption characteristics of agricultural production. On this basis, I establish an eco-efficiency evaluation indicator system for agricultural production and conduct a comprehensive analysis of the agricultural eco-efficiency of 17 prefecture-level cities in Anhui Province using the data envelopment analysis method.
Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. Terrestrial Laser Scanning (TLS), an innovative technique, can collect high-density, high-accuracy point cloud data in a few minutes, which makes it promising for tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of the tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterward, convergence analysis is performed at angles of 0°, 30°, and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS to tunnel deformation monitoring and can also be extended to other engineering applications.
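The least-squares circle fit of a cross-section can be sketched with the algebraic (Kåsa) formulation, which reduces to a single linear solve; the point cloud below is synthetic, with a couple of millimetres of noise standing in for a TLS slice (the paper does not specify which least-squares variant it uses):

```python
import numpy as np

def fit_circle(points):
    """Algebraic least-squares circle fit (Kasa method):
    solve x^2 + y^2 = 2a x + 2b y + c linearly for (a, b, c),
    then centre = (a, b) and radius = sqrt(c + a^2 + b^2)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    a, b, c = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)[0]
    return (a, b), np.sqrt(c + a * a + b * b)

# Noisy points on a circle of radius 2.75 m centred at (1.0, -0.5).
rng = np.random.default_rng(4)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
pts = np.column_stack([1.0 + 2.75 * np.cos(t), -0.5 + 2.75 * np.sin(t)])
pts += rng.normal(scale=0.002, size=pts.shape)  # ~2 mm ranging noise
centre, radius = fit_circle(pts)
```

Comparing fitted radii at different angular sectors of successive scans is then a direct way to quantify convergence.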
The paper studies the non-zero slacks in data envelopment analysis. A procedure is developed for the treatment of non-zero slacks, so that DEA projections can be done in just one step.
In this paper, principal component analysis is applied to 8 seismicity parameters: earthquake frequency N (ML≥3.0), b-value, η-value, A(b)-value, Mf-value, Ac-value, C-value, and D-value, which reflect the characteristics of the magnitude, time, and space distribution of seismicity from different respects. Using the principal component analysis method, a synthesis parameter W reflecting the anomalous features of the magnitude, time, and space distribution of earthquakes can be obtained. Generally, there is some correlation among the 8 parameters, but their variations differ in different periods, and earthquake prediction based on these parameters individually does not perform well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS≥5.8) that occurred in North China, indicating that W better reflects the anomalous characteristics of the magnitude, time, and space distribution of seismicity. Other problems related to the conclusions drawn by the principal component analysis method are also discussed.
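The synthesis parameter W is essentially the projection of the standardized parameters onto the first principal component. A toy sketch, with an invented common "anomaly" signal standing in for the correlated part of the 8 seismicity parameters:

```python
import numpy as np

def first_principal_component(data):
    """Project standardized observations onto the first principal
    component, giving a single synthesis parameter per observation."""
    z = (data - data.mean(axis=0)) / data.std(axis=0)
    # SVD of the standardized matrix: rows of vt are the principal axes.
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    return z @ vt[0]

# Toy stand-in for 8 correlated seismicity parameters over 60 epochs:
# each column is a scaled copy of a shared signal plus noise.
rng = np.random.default_rng(5)
common = rng.normal(size=60)
data = np.outer(common, rng.uniform(0.5, 1.5, 8)) + 0.3 * rng.normal(size=(60, 8))
w = first_principal_component(data)
```

Because the columns share one dominant signal, the projection w tracks that signal closely (up to an arbitrary sign), which is the sense in which a single synthesis parameter can summarize many correlated precursors.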
This work evaluates dry port competitiveness through an analysis of efficiencies for selected dry ports in Africa. Five dry ports were selected and analyzed over a period of four years: Mojo and Kality in Ethiopia, Mombasa in Kenya, Isaka in Tanzania, and Casablanca in Morocco. Data Envelopment Analysis (DEA) was applied. Container throughput at each port was used as the output variable of the model, while the number of reach stackers, the number of tractors, the number of forklifts, and the size of the dry port were used as the input variables. From the results, the Mombasa dry port was found to be the most efficient, with an average score of approximately 1 over the period under consideration. Casablanca was the second most efficient dry port with an average score of 0.762, while Isaka was the least efficient with an average score of 0.142. This research is significant because African countries have embraced the dry port concept, as witnessed by the huge investments in this sector, and it serves to highlight areas needing improvement for the few existing dry port facilities, most of which are undergoing expansion and modernization.
This paper improves the slacks-based method for estimating inefficiency, derives criteria for selecting the weights of output and input inefficiencies in the objective function, and creates a new nonparametric method for growth accounting. Based on this method, the paper estimates the sources of China's economic growth from 1978 to 2013. Our findings suggest that factor input, and especially capital, is a major source of economic growth for China as a whole and for its major regions, and that economic growth in recent years is increasingly dependent on capital. For a rather long period before 2005, China's northeast, central, and western regions lagged behind the eastern region in economic growth, and TFP and factor input are the major reasons behind these regional growth disparities. Although the other regions have since narrowed their disparities with, and even overtaken, the eastern region in economic growth, the key driver is the rapid increase in the contribution of factor input. The advanced technologies of the eastern region should be utilized to promote TFP progress in the other regions, which is vital to economic growth in those regions and in China as a whole.
Funding (Muztagh-Ata seeing study): funded by the National Natural Science Foundation of China (NSFC) and the Chinese Academy of Sciences (CAS) (grant No. U2031209), and the National Natural Science Foundation of China (NSFC, grant Nos. 11872128, 42174192, and 91952111).
Funding (open cluster study): supported by the National Key Research and Development Program of China (No. 2022YFF0711500), the National Natural Science Foundation of China (NSFC, Grant No. 12373097), the Basic and Applied Basic Research Foundation Project of Guangdong Province (No. 2024A1515011503), and the Guangzhou Science and Technology Funds (2023A03J0016).
Funding: Supported by the National Key R&D Program of China (2021YFA0718500), the Strategic Priority Research Program on Space Science of the Chinese Academy of Sciences (Grant Nos. XDA15360102, XDA15360300, XDA15052700 and E02212A02S), the National Natural Science Foundation of China (Grant Nos. 12173038 and U2038106), and the National HEP Data Center (Grant No. E029S2S1).
Abstract: Fast and reliable localization of high-energy transients is crucial for characterizing burst properties and guiding follow-up observations. Localization based on the relative counts of different detectors has been widely used for all-sky gamma-ray monitors. There are two major methods for this count-distribution localization: the χ² minimization method and the Bayesian method. Here we propose a modified Bayesian method that takes advantage of both the accuracy of the Bayesian method and the simplicity of the χ² method. With comprehensive simulations, we find that our Bayesian method with a Poisson likelihood is generally more applicable to various bursts than the χ² method, especially for weak bursts. We further propose a location-spectrum iteration approach based on Bayesian inference, which alleviates the problems caused by the spectral difference between the burst and the location templates. Our method is well suited to scenarios with limited computational resources or time-sensitive applications, such as in-flight localization software and low-latency localization for rapid follow-up observations.
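Count-distribution localization with a Poisson likelihood can be sketched in a few lines: score each candidate direction's detector-response template against the observed counts. The templates and counts below are synthetic assumptions, and the burst amplitude is simply fixed to the observed total rather than marginalized as a full treatment would do:

```python
import numpy as np

rng = np.random.default_rng(0)

# Expected relative counts in 4 detectors for 3 candidate directions (invented).
templates = np.array([
    [0.5, 0.3, 0.15, 0.05],
    [0.2, 0.4, 0.30, 0.10],
    [0.1, 0.2, 0.30, 0.40],
])
true_dir = 1
observed = rng.poisson(templates[true_dir] * 2000)   # simulated burst counts

# Poisson log-likelihood per candidate direction (dropping the constant term).
expected = templates * observed.sum()
loglike = (observed * np.log(expected) - expected).sum(axis=1)
best = int(np.argmax(loglike))
```

Because the Poisson likelihood models the counting noise directly, it remains well behaved for weak bursts where the Gaussian assumption behind χ² breaks down, which matches the advantage reported above.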
Funding: Supported by the National Natural Science Foundation of China (Nos. 61873196 and 62373030) and the Innovation Program for Quantum Science and Technology (No. 2021ZD0303400).
Abstract: In the two-dimensional positioning method for pulsars, the grid method is used to provide estimates of the non-sensitive direction and the position. However, the grid method has a high computational load and low accuracy due to the spacing of the grid. To improve estimation accuracy and reduce the computational load, we propose a fast two-dimensional positioning method for the Crab pulsar based on multiple optimization algorithms (FTPCO). The FTPCO uses the Levenberg-Marquardt (LM) algorithm, a three-point orientation (TPO) method, particle swarm optimization (PSO) and a Newton-Raphson-based optimizer (NRBO) to replace the grid method. First, to avoid the influence of the non-sensitive direction on positioning, we take the orbital error and the distortion of the pulsar profile as optimization objectives and combine the grid method with the LM algorithm or PSO to search for the non-sensitive direction. Then, on the sensitive plane perpendicular to the non-sensitive direction, the TPO method is proposed to rapidly search for the sensitive and sub-sensitive directions. Finally, the NRBO is employed along the sensitive and sub-sensitive directions to achieve two-dimensional positioning of the Crab pulsar. Simulation results show that, compared with the grid method, the FTPCO reduces the computational load by 89.4% and improves the positioning accuracy by approximately 38%. The FTPCO has the advantage of high real-time accuracy and does not fall into local optima.
Funding: Supported by the National Key R&D Program of China (Nos. 2021YFC2203502 and 2022YFF0711502), the National Natural Science Foundation of China (NSFC) (12173077 and 12003062), the Tianshan Innovation Team Plan of Xinjiang Uygur Autonomous Region (2022D14020), the Tianshan Talent Project of Xinjiang Uygur Autonomous Region (2022TSYCCX0095), the Scientific Instrument Developing Project of the Chinese Academy of Sciences (Grant No. PTYQ2022YZZD01), the China National Astronomical Data Center (NADC), the Operation, Maintenance and Upgrading Fund for Astronomical Telescopes and Facility Instruments, budgeted from the Ministry of Finance of China (MOF) and administrated by the Chinese Academy of Sciences (CAS), and the Natural Science Foundation of Xinjiang Uygur Autonomous Region (2022D01A360).
Abstract: To address the problem of real-time processing of ultra-wide-bandwidth pulsar baseband data, we designed and implemented a pulsar baseband data processing algorithm (PSRDP) based on GPU parallel computing technology. PSRDP can perform operations such as baseband data unpacking, channel separation, coherent dedispersion, Stokes detection, phase and folding-period prediction, and folding integration on GPU clusters. We tested the algorithm using J0437-4715 baseband data generated by the CASPSR and Medusa backends at Parkes, and J0332+5434 baseband data generated by the self-developed backend of the Nan Shan Radio Telescope, and obtained the pulse profile from each data set. Experimental analysis shows that the pulse profiles generated by the PSRDP algorithm are essentially consistent with the processing results of the Digital Signal Processing Software for Pulsar Astronomy (DSPSR), which verifies the effectiveness of PSRDP. Furthermore, using the same baseband data, we compared the processing speed of PSRDP with that of DSPSR, and the results showed that PSRDP was no slower than DSPSR. The theoretical and technical experience gained from this research lays a foundation for the real-time processing of ultra-wide-bandwidth pulsar baseband data from the QTT (Qi Tai radio Telescope).
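The dedispersion step in any pulsar pipeline is driven by the standard cold-plasma dispersion relation: a pulse at frequency f arrives later than at a higher frequency by an amount proportional to the dispersion measure (DM). A small sketch of that textbook formula; the DM value is illustrative (of order the known DM of J0437-4715), not a number taken from the paper:

```python
# Standard pulsar dispersion constant, in MHz^2 cm^3 pc^-1 s.
K_DM = 4.148808e3

def dispersion_delay(dm, f_lo_mhz, f_hi_mhz):
    """Extra arrival delay (s) of the low-frequency channel relative to the high one."""
    return K_DM * dm * (f_lo_mhz ** -2 - f_hi_mhz ** -2)

# Example: DM ~ 2.64 pc cm^-3 across a 1200-1500 MHz band.
delay = dispersion_delay(2.64, 1200.0, 1500.0)
```

Coherent dedispersion, as used in PSRDP and DSPSR, removes this delay inside each channel by deconvolving the dispersion phase from the raw voltages rather than merely shifting detected channels, which is why it must run on the baseband data.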
Funding: Project 70771105, supported by the National Natural Science Foundation of China.
Abstract: In recent years, improper allocation of safety input has prevailed in coal mines in China, resulting in frequent accidents in coal mining operations. A comprehensive assessment of the efficiency of coal mine safety input should lead to improved use of funds and management resources, helping government and enterprise managers better understand how safety inputs are used and how to optimize the allocation of resources. This paper studies the efficiency assessment of coal mine safety input. A C^2R model with a non-Archimedean infinitesimal vector based on output is established after consideration of the input characteristics and the model properties. An assessment of an operating mine was carried out using a specific set of input and output criteria. The safety input was found to be efficient in 2002 and 2005 and weakly efficient in 2003; efficiency was relatively low in both 2001 and 2004. The safety input resources can be optimized and adjusted by means of projection theory. This analysis shows that, on average in 2001 and 2004, 45% of the expended funds could have been saved; likewise, 10% of the safety management and technical staff could have been eliminated, and working hours devoted to safety could have been reduced by 12%, while achieving the same results.
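The paper's model is an output-based C^2R formulation with a non-Archimedean infinitesimal, which is more elaborate than can be shown briefly. As a rough illustration of how a DEA efficiency score is obtained at all, here is the plain input-oriented CCR envelopment linear program solved with scipy.optimize.linprog; the input/output numbers are made up (three hypothetical years, two input factors, one output), not the mine's figures:

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 4.0],    # inputs: rows = input factors, cols = DMUs
              [3.0, 1.0, 3.0]])
Y = np.array([[1.0, 1.0, 1.0]])   # single (normalized) output

def ccr_efficiency(j):
    """min theta s.t. X @ lam <= theta * x_j, Y @ lam >= y_j, lam >= 0."""
    m, n = X.shape
    c = np.zeros(n + 1)
    c[0] = 1.0                                     # minimize theta
    A_ub = np.vstack([np.hstack([-X[:, [j]], X]),  # X lam - theta x_j <= 0
                      np.hstack([np.zeros((Y.shape[0], 1)), -Y])])  # -Y lam <= -y_j
    b_ub = np.concatenate([np.zeros(m), -Y[:, j]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

effs = [round(ccr_efficiency(j), 4) for j in range(X.shape[1])]
```

An inefficient DMU's projection onto the frontier (scaling its inputs by theta) is the same idea the paper uses, via projection theory, to quantify how much safety funding and staffing could have been saved.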
Funding: Supported by the National Natural Science Foundation of China (No. 70471009) and the Natural Science Foundation Project of CQ CSTC, China (No. 2006BA2033).
Abstract: A set of indices is proposed for evaluating the performance of business processes with multiple inputs and multiple outputs, as found in machinery manufacturers. Based on the traditional methods of data envelopment analysis (DEA) and the analytic hierarchy process (AHP), a hybrid model called the DEA/AHP model is proposed for evaluating business process performance. In the proposed method, DEA is first used to develop a pairwise comparison matrix, and AHP is then applied to evaluate the performance of the business process using that matrix. The significant advantage of this hybrid model is its use of objective data instead of subjective human judgment for performance evaluation. In a case study, a business process reengineering (BPR) project at a hydraulic machinery manufacturer is used to demonstrate the effectiveness of the DEA/AHP model.
Abstract: The application of data envelopment analysis (DEA) as a multiple-criteria decision making (MCDM) technique has been gaining increasing attention in recent research. In practice, uncertainties in the input and output data of a decision making unit (DMU) may make the nominal solution infeasible and render the efficiency scores meaningless from a practical point of view. This paper analyzes the impact of data uncertainty on DEA evaluation results and proposes several robust DEA models, based on the adaptation of recently developed robust optimization approaches, that are immune to uncertainties in the input and output data. The robust DEA models are built on the input-oriented and output-oriented CCR models, for uncertainties appearing in the output data and the input data, respectively. Furthermore, the robust DEA models can handle both random symmetric uncertainty and unknown-but-bounded uncertainty, in both of which the distributions of the random data entries are permitted to be unknown. The robust DEA models are implemented in a numerical example, and the efficiency scores and rankings of these models are compared. The results indicate that the robust DEA approach can be a more reliable method for efficiency evaluation and ranking in MCDM problems.
Abstract: This paper establishes the phase space in light of spatial series data, discusses the fractal structure of geological data in terms of correlation functions, and studies the chaos in these data. In addition, it introduces R/S analysis from time series analysis into spatial series to calculate the structural fractal dimensions of range and standard deviation for spatial series data, to establish the fractal dimension matrix, and to give the procedure for plotting the fractal dimension anomaly diagram with vector distances of fractal dimension. Finally, examples of its application are given.
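The R/S statistic mentioned above is the rescaled range: the range of cumulative deviations of a series from its mean, divided by its standard deviation. A minimal sketch of the statistic itself (the paper's extension to spatial series and fractal dimension matrices is not reproduced here; the example inputs are synthetic):

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of a one-dimensional series."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())   # cumulative deviations from the mean
    r = y.max() - y.min()         # range of the cumulative deviations
    s = x.std()                   # standard deviation of the series
    return r / s

demo = rescaled_range([1.0, 2.0, 3.0, 4.0])   # small deterministic example

rng = np.random.default_rng(1)
rs = rescaled_range(rng.standard_normal(512))  # white-noise example
```

Computing R/S over windows of increasing length and fitting log(R/S) against log(length) yields the Hurst exponent, from which a fractal dimension of the series can be derived.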
Abstract: Data envelopment analysis (DEA) has become a standard nonparametric approach to productivity analysis, especially to the relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs that cannot easily be solved by regression analysis. However, traditional DEA models cannot handle undesirable outputs. This paper therefore explores the inherent relationship between goal programming and the DEA method, based on the relationship between multiple-goal programming and goal programming, and builds a mixed DEA model in which all input factors and undesirable outputs decrease in different proportions while, at the same time, all desirable output factors increase in different proportions.
Funding: Supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 51205283).
Abstract: This paper introduces the basic theory and algorithm of the surrogate data method, which provides a rigorous way to detect random and seemingly stochastic characteristics in a system. Gaussian data and Rössler data are used to show the availability and effectiveness of this method. Analysis by this method of short-circuiting current signals acquired under the same voltage but different wire feed speeds demonstrates that the electrical signal time series are apparently random when the welding parameters do not match, whereas the time series are deterministic when a match is found. The stability of the short-circuiting transfer process can thus be judged exactly by the surrogate data method.
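The standard construction behind the surrogate data method is the phase-randomized (FFT) surrogate: it preserves the amplitude spectrum of the original series, and hence its linear autocorrelation, while scrambling the Fourier phases. Any nonlinear determinism is destroyed, so a discriminating statistic that differs between the data and its surrogates signals determinism. A minimal sketch on a synthetic signal:

```python
import numpy as np

def fft_surrogate(x, rng):
    """Phase-randomized surrogate preserving the amplitude spectrum of x."""
    x = np.asarray(x, dtype=float)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.size)
    phases[0] = 0.0    # keep the mean (zero-frequency) term real
    phases[-1] = 0.0   # keep the Nyquist term real for even-length series
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

rng = np.random.default_rng(2)
x = np.sin(np.linspace(0.0, 8.0 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
s = fft_surrogate(x, rng)

# The surrogate shares the power spectrum of the original to numerical precision.
same_power = bool(np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x))))
```

In practice one generates an ensemble of such surrogates and compares a nonlinear statistic (e.g. a prediction error or correlation dimension) of the data against the surrogate distribution.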
Abstract: This paper proposes a new approach for ranking efficient units in data envelopment analysis as a modification of the super-efficiency models developed by Tone [1]. The new approach, based on the slacks-based measure of efficiency (SBM), modifies the objective function used to classify the decision-making units, allows the ranking of all inefficient DMUs, and overcomes the disadvantage of infeasibility. The method is applied to rank super-efficiency scores for a sample of 145 agricultural bank branches in Viet Nam during 2007-2010. We then compare the estimated results of the new SCI model and the existing SBM model using statistical tests.
Funding: Supported by the Special Project for Youth Research of the Anhui Institute of Architecture & Industry (20104012).
Abstract: According to eco-efficiency theory, combined with the characteristics of agricultural production, this paper points out the environmental impacts and the material and energy consumption characteristics of agricultural production. On this basis, it establishes an eco-efficiency evaluation indicator system for agricultural production and conducts a comprehensive analysis of the agricultural eco-efficiency of 17 prefecture-level cities in Anhui Province using the data envelopment analysis method.
Funding: Supported by the National Natural Science Foundation of China (No. 41801379), the Fundamental Research Funds for the Central Universities (No. 2019B08414), and the National Key R&D Program of China (No. 2016YFC0401801).
Abstract: Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. Terrestrial Laser Scanning (TLS), an innovative technique, can collect high-density, high-accuracy point cloud data in a few minutes, which makes it promising for tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by point cloud division, and the ground points are then removed by defining an elevation width of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of a tunnel cross-section is circular, circle fitting is implemented using the least-squares method. Afterward, convergence analysis is performed at angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS to tunnel deformation monitoring and can be extended to other engineering applications.
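One common way to implement the least-squares circle-fitting step described above is the algebraic (Kasa) fit, which linearizes the circle equation and solves a small least-squares system. A sketch on synthetic cross-section points (a 5.5 m-diameter ring with an offset center, not the Nanjing scan data):

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares circle fit: solve x^2 + y^2 = 2a*x + 2b*y + c."""
    A = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a ** 2 + b ** 2)
    return a, b, r    # center (a, b) and radius r

theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
x = 1.5 + 2.75 * np.cos(theta)    # synthetic tunnel ring, center (1.5, -0.5)
y = -0.5 + 2.75 * np.sin(theta)
cx, cy, r = fit_circle(x, y)
```

With noisy scan data the fitted radius at different epochs gives the convergence, and residuals from the fitted circle highlight local deformation of the lining.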
Abstract: This paper studies non-zero slacks in data envelopment analysis. A procedure is developed for the treatment of non-zero slacks, so that DEA projections can be done in just one step.
Funding: Project of the Joint Seismological Science Foundation of China (104090).
Abstract: In this paper, a principal component analysis is made using eight seismicity parameters, namely the earthquake frequency N (ML ≥ 3.0), b-value, η-value, A(b)-value, Mf-value, Ac-value, C-value and D-value, which reflect the characteristics of the magnitude, time and space distribution of seismicity from different perspectives. Using the principal component analysis method, a synthesis parameter W reflecting the anomalous features of earthquake magnitude, time and space distribution can be obtained. In general, there is some correlation among the eight parameters, but their variations differ from period to period, and earthquake prediction based on these parameters individually does not perform well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS ≥ 5.8) that occurred in North China, indicating that W can better reflect the anomalous characteristics of the magnitude, time and space distribution of seismicity. Other problems related to the conclusions drawn by the principal component analysis method are also discussed.
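A synthesis parameter of the kind described can be formed by projecting the standardized parameter matrix onto its first principal component. A minimal sketch; the 60 x 8 matrix below is random stand-in data, not the North China seismicity parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.standard_normal((60, 8))     # 60 epochs x 8 seismicity parameters (synthetic)

# Standardize each parameter to zero mean and unit variance.
z = (data - data.mean(axis=0)) / data.std(axis=0)

# Eigen-decomposition of the covariance matrix; eigh returns ascending eigenvalues.
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, -1]                    # direction of maximum variance

# Synthesis parameter: the time series of first-principal-component scores.
W = z @ pc1
```

Anomalies in W then summarize coherent departures across all eight parameters in a single series, which is the role the synthesis parameter plays above.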
Abstract: This work evaluates dry port competitiveness through an analysis of efficiencies for selected dry ports in Africa. Five dry ports were selected and analyzed over a period of four years: Mojo and Kality in Ethiopia, Mombasa in Kenya, Isaka in Tanzania and Casablanca in Morocco. Data Envelopment Analysis (DEA) was applied. Container throughput for each port under consideration was used as the output variable, while the number of reach stackers, the number of tractors, the number of forklifts and the size of the dry port were used as the input variables. From the results, the Mombasa dry port was the most efficient, with an average score of approximately 1 over the period under consideration; Casablanca was the second most efficient, with an average score of 0.762, while Isaka was the least efficient, with an average score of 0.142. This research is significant because African countries have embraced the dry port concept, as witnessed by the huge investments in this sector, and it serves to highlight areas needing improvement for the few existing dry port facilities, most of which are undergoing expansion and modernization.
Abstract: This paper improves the slacks-based method for estimating inefficiency, derives criteria for selecting the weights of output and input inefficiencies in the objective function, and creates a new nonparametric method for growth accounting. Based on this method, the paper estimates the sources of China's economic growth from 1978 to 2013. Our findings suggest that factor input, and especially capital, is a major source of economic growth for China as a whole and for its major regions, and that economic growth in recent years is increasingly dependent on capital. For a rather long period before 2005, China's northeast, central and western regions lagged behind the eastern region in economic growth, with TFP and factor input the major reasons behind these regional growth disparities. Although the other regions have since narrowed their disparities with, and even overtaken, the eastern region in terms of economic growth, the key driver is the rapid increase in the contribution of factor input. The advanced technologies of the eastern region should be utilized to promote TFP progress in the other regions, which is vital to economic growth both in these regions and in China as a whole.