Abstract: Inductive fault analysis is a technique for enumerating likely bridges that is limited by the weighted critical area computation. Based on the rectangle model of a real defect and mathematical morphology, an efficient algorithm is presented to compute the weighted critical area of a layout. The algorithm avoids both the need to determine which rectangles belong to a net and the merging of the critical areas corresponding to a net pair. Experimental results showing the algorithm's performance are presented.
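The abstract gives no pseudocode, so the following is only a minimal, hypothetical sketch of the morphology-based idea it describes: on a rasterized layout, a rectangular (here square) defect bridges two nets wherever its centre lies in the intersection of the two nets each dilated by half the defect size, and the per-size areas are weighted by an assumed defect-size distribution. The layout masks, defect sizes, and densities below are made up for illustration; this is not the paper's algorithm.

```python
# Illustrative sketch: bridging critical area via morphological dilation
# with a square (rectangle-model) defect on a rasterized layout.
import numpy as np
from scipy.ndimage import binary_dilation

def critical_area(net_a, net_b, defect_size):
    """Count pixels where a square defect of the given size bridges both nets.

    net_a, net_b : boolean 2-D masks of the two conductor patterns.
    defect_size  : defect edge length in pixels (square defect model).
    """
    # Dilate each net by half the defect size; a defect centred on a pixel
    # covered by both dilations touches both nets simultaneously.
    half = max(defect_size // 2, 1)
    struct = np.ones((2 * half + 1, 2 * half + 1), dtype=bool)
    grown_a = binary_dilation(net_a, structure=struct)
    grown_b = binary_dilation(net_b, structure=struct)
    return np.logical_and(grown_a, grown_b).sum()

def weighted_critical_area(net_a, net_b, defect_sizes, densities):
    """Weight the per-size critical areas by the defect-size distribution."""
    return sum(d * critical_area(net_a, net_b, s)
               for s, d in zip(defect_sizes, densities))

# Two parallel wires, three pixels apart, on a 40x40 grid (hypothetical data).
layout_a = np.zeros((40, 40), dtype=bool); layout_a[10:12, :] = True
layout_b = np.zeros((40, 40), dtype=bool); layout_b[15:17, :] = True
print(weighted_critical_area(layout_a, layout_b, [3, 5, 7], [0.5, 0.3, 0.2]))
```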
Funding: the National High Technology Research and Development Program of China (Grant No. 2002AA424058) and the 10th Five-Year National S&T Program of China (Grant No. 2001BA203B13-02).
Abstract: To provide a basis for the reliability improvement design of CNC systems, one year of failure data for a type of CNC system was collected under field conditions in workshops. The parameters of the distribution model of time between failures were estimated by the least squares method, and the hypothesis was tested by the d-test. It is shown that the time between failures of the CNC system follows a Weibull distribution and that the system has entered the wear-out failure period. The failure positions and failure causes are analyzed further to identify the weak subsystems of the CNC system. The servo unit, electrical system, detecting unit and power supply are the principal failure positions, and the main failure cause is breakage of components. Corresponding improvement measures are put forward. The paper provides a reference for reliability design and analysis of CNC systems for the manufacturer and guidance on using and maintaining the CNC system for the user.
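For concreteness, here is a minimal sketch (not the authors' code) of the linearized least-squares Weibull fit described above, applied to a hypothetical sample of times between failures; the final d statistic is a simple largest-CDF-deviation stand-in for the d-test mentioned in the abstract.

```python
# Least-squares Weibull fit of times between failures (TBF), linearized form.
import numpy as np

def weibull_ls_fit(tbf):
    """Estimate Weibull shape (beta) and scale (eta) by least squares."""
    t = np.sort(np.asarray(tbf, dtype=float))
    n = len(t)
    # Median-rank estimate of the empirical CDF.
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    # Linearize: ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)
    x, y = np.log(t), np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)
    # Largest deviation between empirical and fitted CDF (d statistic).
    d = np.max(np.abs(F - (1.0 - np.exp(-(t / eta) ** beta))))
    return beta, eta, d

# Hypothetical TBF sample in hours; beta > 1 indicates wear-out failures.
beta, eta, d = weibull_ls_fit([120, 340, 410, 560, 700, 820, 990, 1150])
print(f"beta={beta:.2f}, eta={eta:.0f} h, d={d:.3f}")
```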
Funding: Supported by the National Natural Science Foundation of China (No. 60421002).
Abstract: Since there are not enough fault data in historical data sets, fault diagnosis for batch processes is very difficult. In addition, a complete batch trajectory is not available until the end of the batch operation. In order to avoid estimating or filling in future unmeasured values during online fault diagnosis, make full use of the limited fault information, and enhance diagnostic performance, an improved multi-model Fisher discriminant analysis is presented. The distinguishing feature of the proposed method is that the training data sets consist of the current measured information and the major discriminant information from the past, rather than only the current information or the whole batch of data. A typical industrial multi-stage streptomycin fermentation process is used to test the fault diagnosis performance of the proposed method.
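The multi-model, stage-wise training-set construction of the paper is not reproduced here; the following is only a minimal sketch of the basic Fisher discriminant analysis step it builds on, with hypothetical data shapes (rows are samples, columns are process variables, labels identify fault classes).

```python
# Basic Fisher discriminant analysis (FDA) for fault classification.
import numpy as np

def fda_fit(X, y, n_components=1):
    """Return FDA projection matrix and per-class means in the FDA space."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Leading eigenvectors of pinv(Sw) @ Sb give the discriminant directions.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-eigvals.real)
    W = eigvecs.real[:, order[:n_components]]
    means = {c: (X[y == c] @ W).mean(axis=0) for c in classes}
    return W, means

def fda_diagnose(x, W, means):
    """Assign a new sample to the nearest class mean in the FDA space."""
    z = np.asarray(x, dtype=float) @ W
    return min(means, key=lambda c: np.linalg.norm(z - means[c]))
```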
Funding: Supported in part by the Major State Basic Research Development Program (973 Plan) of China under grant 2013CB338004; the National Natural Science Foundation of China under grants 61173191, 61271124, 61272491, 61309021 and 61472357; the Zhejiang Provincial Natural Science Foundation of China under grant LY13F010001; and the Fundamental Research Funds for the Central Universities under grant 2015QNA5005.
Abstract: PRINCE is a 64-bit lightweight block cipher with a 128-bit key published at ASIACRYPT 2012. Assuming one nibble fault is injected, previous differential fault analysis (DFA) attacks on PRINCE adopted the technique from DFA on AES, and their reported results differ. This paper makes a comprehensive study of algebraic fault analysis (AFA) on PRINCE. How to build the equations for PRINCE and for the injected faults is explained, and extensive experiments are conducted. Under a nibble-based fault model, AFA with three or four fault injections succeeds within 300 seconds with a very high probability. Under other fault models, such as byte-based, half-word-based and word-based fault models, the faults overlap in the last round and previous DFA techniques become difficult to apply; our results show that AFA can still recover the full master key. To evaluate the security of PRINCE against fault attacks, we use AFA to calculate the reduced entropy of the secret key for a given number of fault injections. The results interpret and compare the efficiency of previous work. Under the nibble-based fault model, the master key of PRINCE can be reduced to 2^9.69 and 2^36.10 candidates with 3 and 2 fault injections on average, respectively.
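A quick back-of-the-envelope reading of those figures: if the fault equations leave N consistent key candidates, the residual key entropy is log2(N) bits, and exhausting the rest costs roughly N trials. The short calculation below only restates the numbers quoted in the abstract; the 128-bit figure is the PRINCE master key length.

```python
# Residual key entropy vs. number of fault injections (figures from the abstract).
full_key_bits = 128  # PRINCE master key length
for faults, residual_bits in [(3, 9.69), (2, 36.10)]:
    candidates = 2 ** residual_bits
    print(f"{faults} fault injections: ~2^{residual_bits:.2f} "
          f"≈ {candidates:.3g} remaining candidates "
          f"({full_key_bits - residual_bits:.2f} bits of key recovered)")
```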
Funding: Supported by the Natural Science Foundation of China (No. 60971092).
Abstract: Potential failures of electronic instruments are very common in engineering practice. In this paper, a potential-failure state model is analyzed based on the dynamic characteristics of an electronic instrument at work, and a comprehensive method for judging multi-state reliability is put forward. Then, a multi-state reliability analysis model for electronic instruments is built based on Bayesian networks (BN). Considering the failure, potential-failure and normal-work states, the model is used to estimate the reliability of the system and the conditional probabilities of its elements. Finally, the model is shown to be correct and effective by examples.
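To make the three-state idea concrete, here is a small hand-rolled sketch (not the paper's BN model): independent component nodes with normal / potential-failure / failure states feed a deterministic "system" node, and exact enumeration yields the system-state distribution, which is what inference in a discrete Bayesian network would compute for this structure. The component names and probabilities are made up.

```python
# Three-state system reliability by enumeration over independent components.
import itertools

STATES = ("normal", "potential", "failed")

# P(state) for each element of a hypothetical instrument.
components = {
    "power_supply": {"normal": 0.95, "potential": 0.04, "failed": 0.01},
    "amplifier":    {"normal": 0.90, "potential": 0.07, "failed": 0.03},
    "adc":          {"normal": 0.97, "potential": 0.02, "failed": 0.01},
}

def system_state(states):
    """Series-type logic: the system takes the worst component state."""
    if "failed" in states:
        return "failed"
    return "potential" if "potential" in states else "normal"

# Enumerate all joint component states and accumulate their probabilities.
dist = {s: 0.0 for s in STATES}
names = list(components)
for combo in itertools.product(STATES, repeat=len(names)):
    p = 1.0
    for name, s in zip(names, combo):
        p *= components[name][s]
    dist[system_state(combo)] += p

print(dist)  # P(system = normal / potential / failed)
```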
Funding: Supported by the National Natural Science Foundation of China-Shandong Joint Fund for Marine Science Research Centers (Grant No. U1406404) and the Transparent Ocean Project (Grant No. 2015ASKJ01); the corresponding author is also supported by the Ao-Shan Talent Program.
Abstract: The Coupled Model Intercomparison Project Phase 5 (CMIP5) contains a group of state-of-the-art climate models and represents the highest level of climate simulation thus far. However, these models significantly overestimated global mean surface temperature (GMST) during 2006-2014. Based on the ensemble empirical mode decomposition (EEMD) method, the long-term change of the observed GMST time series from HadCRUT4 records during 1850-2014 was analyzed, and the GMST simulated by 33 CMIP5 climate models was then assessed, revealing a possible reason why the models failed to project the recent global warming hiatus. Results show that during 1850-2014 the GMST on a centennial timescale rose with fluctuations, dominated by the secular trend and the multi-decadal variability (MDV). The secular trend was relatively steady from the early 20th century onward, with an average warming rate of 0.0883°C/decade over the last 50 years, while the MDV (with a ~65-year cycle) showed 2.5 multi-decadal waves during 1850-2014, which deepened and steepened with time. The alarming warming over the last quarter of the 20th century resulted from the concurrence of the secular warming trend and the warming phase of the MDV, both of which accounted for one third of the temperature increase during 1975-1998. Recently the slowdown of global warming emerged as the MDV approached its third peak, leading to a reduction in the warming rate. A comparative analysis between the GMST time series derived from HadCRUT4 records and the 33 CMIP5 model outputs reveals that the GMSTs during the historical simulation period of 1850-2005 are reproduced well by the models, especially the accelerated global warming over the last quarter of the 20th century. However, the projected GMSTs and their linear trends during 2006-2014 under the RCP4.5 scenario were significantly higher than observed. This is because the CMIP5 models confused the MDV with the secular trend underlying the GMST time series, which results in an overly fast secular trend and an improper MDV with irregular phases and small amplitudes. This implies that the role of atmospheric CO2 in global warming may be overestimated, while the MDV, an internal oscillation of the climate system, may be underestimated, which is likely related to an insufficient understanding of key internal climate dynamic processes. Our study puts forward an important criterion for the new generation of climate models: they should be able to simulate both the secular trend and the MDV of GMST.
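The paper uses EEMD; as a much simpler stand-in, the toy sketch below separates a secular trend from a ~65-year oscillation in a synthetic GMST-like anomaly series by curve fitting, just to make the trend-versus-MDV decomposition concrete. The series, coefficients, and noise level are invented; none of the numbers correspond to HadCRUT4 or CMIP5 output.

```python
# Toy separation of a secular trend and a ~65-year MDV from synthetic data.
import numpy as np
from scipy.optimize import curve_fit

years = np.arange(1850, 2015)
rng = np.random.default_rng(0)
# Synthetic anomaly: slow quadratic warming + 65-year oscillation + noise.
truth_trend = 0.00004 * (years - 1850) ** 2
truth_mdv = 0.12 * np.sin(2 * np.pi * (years - 1850) / 65.0)
gmst = truth_trend + truth_mdv + rng.normal(0.0, 0.08, years.size)

def model(t, a, b, c, amp, period, phase):
    t0 = t - 1850
    return a + b * t0 + c * t0 ** 2 + amp * np.sin(2 * np.pi * t0 / period + phase)

p0 = [0.0, 0.0, 0.0, 0.1, 65.0, 0.0]
params, _ = curve_fit(model, years, gmst, p0=p0)
a, b, c, amp, period, phase = params
print(f"fitted MDV period ≈ {period:.1f} yr, amplitude ≈ {amp:.2f} °C")

# Warming rate of the fitted secular trend over the last 50 years (°C/decade).
t0 = years[-50:] - 1850
rate = np.polyfit(years[-50:], a + b * t0 + c * t0 ** 2, 1)[0] * 10
print(f"secular warming rate, last 50 yr ≈ {rate:.3f} °C/decade")
```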
Abstract: We further analyse the reliability behaviour of series and parallel systems in the successive damage model initiated by Downton. The results are compared with those obtained for other models with different bivariate distributions.
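For readers unfamiliar with the setting, the Monte Carlo sketch below shows how series and parallel reliability are computed from dependent component lifetimes; a Gaussian copula with exponential marginals is used here only as a simple stand-in for a bivariate lifetime model such as Downton's, and the correlation, scales, and mission time are arbitrary.

```python
# Series vs. parallel reliability for two dependent lifetimes (Monte Carlo).
import numpy as np
from scipy.stats import norm

def simulate_lifetimes(n, rho, scale1=1.0, scale2=1.0, seed=0):
    """Draw n correlated lifetime pairs with exponential marginals."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = norm.cdf(z)                    # uniform marginals via the copula
    x = -scale1 * np.log1p(-u[:, 0])   # inverse-CDF exponential transform
    y = -scale2 * np.log1p(-u[:, 1])
    return x, y

x, y = simulate_lifetimes(200_000, rho=0.5)
t = 1.0
r_series = np.mean((x > t) & (y > t))    # both components must survive
r_parallel = np.mean((x > t) | (y > t))  # at least one must survive
print(f"R_series({t}) ≈ {r_series:.3f}, R_parallel({t}) ≈ {r_parallel:.3f}")
```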