Funding: Project (2012CB725400) supported by the National Basic Research Program of China; Projects (71271023, 71322102, 7121001) supported by the National Natural Science Foundation of China
Abstract: A widely used assumption in user equilibrium models for stochastic networks is that travelers know the probability distributions of travel times explicitly. However, these distributions may be unavailable in reality. By relaxing this restrictive assumption, a robust user equilibrium model based on cumulative prospect theory with distribution-free travel times is presented. In the absence of the cumulative distribution function of travel time, the exact cumulative prospect value (CPV) for each route cannot be obtained; however, upper and lower bounds on the CPV can be calculated from probability inequalities. Travelers are assumed to choose the routes with the best worst-case CPVs. The proposed model is formulated as a variational inequality problem and solved via a heuristic solution algorithm. A numerical example illustrates the application of the proposed model and the efficiency of the solution algorithm.
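As a worked illustration of the bound-based approach, suppose only the mean μ and variance σ² of a route's travel time T are known. A one-sided Chebyshev (Cantelli) inequality, one possible choice of probability inequality (the paper's specific inequalities are not reproduced here), bounds the cumulative distribution function: for t > μ, P(T ≥ t) ≤ σ²/(σ² + (t − μ)²), so F_T(t) ≥ (t − μ)²/(σ² + (t − μ)²); for t < μ, F_T(t) ≤ σ²/(σ² + (μ − t)²). Substituting such upper and lower CDF bounds into the cumulative prospect value yields the kind of CPV bounds on which the worst-case route choice is based.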
Funding: the National Basic Research Program of China (973 Program) under Grant No. 2012CB821200; the National Natural Science Foundation of China under Grants No. 91024001 and No. 61070142; the Beijing Natural Science Foundation under Grant No. 4111002
Abstract: Video Super-Resolution (SR) reconstruction produces video sequences with High Resolution (HR) via the fusion of several Low-Resolution (LR) video frames. Traditional methods rely on the accurate estimation of subpixel motion, which constrains their applicability to video sequences with relatively simple motions such as global translation. We propose an efficient iterative spatio-temporal adaptive SR reconstruction model based on Zernike Moment (ZM), which is effective for spatial video sequences with arbitrary motion. The model uses region correlation judgment and self-adaptive threshold strategies to improve the effectiveness and time efficiency of the ZM-based SR method. This leads to better mining of non-local self-similarity and local structural regularity, and is robust to noise and rotation. An efficient iterative curvature-based interpolation scheme is introduced to obtain the initial HR estimate of each LR video frame. Experimental results on both spatial and standard video sequences demonstrate that the proposed method outperforms existing methods in terms of both subjective visual and objective quantitative evaluations, and greatly improves time efficiency.
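For background, the following is a minimal numpy sketch of computing a single Zernike moment of a square grayscale patch mapped onto the unit disk, the kind of rotation-robust feature a ZM-based method relies on; the function name, the discrete normalization, and the example patch are illustrative assumptions rather than the paper's implementation.

import numpy as np
from math import factorial

def zernike_moment(img, n, m):
    """Zernike moment Z_{n,m} of a square grayscale patch mapped onto the unit disk.

    Requires n >= |m| and (n - |m|) even; |Z_{n,m}| is invariant to rotation of the patch.
    """
    size = img.shape[0]
    ys, xs = np.mgrid[0:size, 0:size]
    x = (2.0 * xs - size + 1) / (size - 1)      # map the pixel grid to [-1, 1] x [-1, 1]
    y = (2.0 * ys - size + 1) / (size - 1)
    rho, theta = np.hypot(x, y), np.arctan2(y, x)
    mask = rho <= 1.0                           # keep only pixels inside the unit disk
    R = np.zeros_like(rho)                      # radial polynomial R_{n,|m|}(rho)
    for k in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k) * factorial((n + abs(m)) // 2 - k) * factorial((n - abs(m)) // 2 - k)))
        R += c * rho ** (n - 2 * k)
    V_conj = R * np.exp(-1j * m * theta)        # conjugate basis function V*_{n,m}
    dA = (2.0 / (size - 1)) ** 2                # pixel area in unit-disk coordinates
    return (n + 1) / np.pi * np.sum(img[mask] * V_conj[mask]) * dA

patch = np.random.default_rng(0).random((32, 32))
print(abs(zernike_moment(patch, n=2, m=0)))     # |Z_{2,0}| of a random test patch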
Funding: supported by the National 863 Program (No. 2008AA042207)
Abstract: Based on an evaluation of the advantages and disadvantages of high-precision digital time-interval measurement algorithms, and in combination with the principle of typical time-difference ultrasonic flow measurement, the requirements that ultrasonic flow measurement places on measuring the echo time of flight are analyzed. A new high-precision time-interval measurement algorithm is presented, which combines the pulse counting method with phase-delay interpolation. The pulse counting method is used to ensure a large dynamic measuring range, and a double-edge triggering counter is designed to improve the accuracy and reduce the counting quantization error. The phase-delay interpolation is used to reduce the quantization error of pulse counting and further improve the time measurement resolution. Test data show that a system for measuring the ultrasonic echo time of flight based on this algorithm and implemented on a Field Programmable Gate Array (FPGA) needs a relatively short measurement time and has a measurement error of less than 105 ps.
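As an illustration of how the two stages described above combine, here is a minimal sketch (function and variable names are hypothetical, not from the paper) that merges a double-edge pulse count with a phase-delay interpolation term into one time-interval estimate.

def time_interval(n_edges, f_clk, phase_frac):
    """Combine coarse double-edge pulse counting with phase-delay interpolation.

    n_edges    -- clock edges (rising + falling) counted within the interval
    f_clk      -- reference clock frequency in Hz
    phase_frac -- interpolated fraction of one clock period (0 <= phase_frac < 1)
    """
    coarse = n_edges / (2.0 * f_clk)  # double-edge counting halves the quantization step
    fine = phase_frac / f_clk         # sub-period correction from phase interpolation
    return coarse + fine

# Example: 100 MHz clock, 12345 counted edges, 0.37 of a period resolved by interpolation
print(time_interval(12345, 100e6, 0.37))  # about 6.17287e-05 s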
Funding: Project (2013ZX04013047) supported by the Major Program of the National Natural Science Foundation of China; Project (51275014) supported by the National Natural Science Foundation of China
Abstract: The reliability of an electromechanical product is traditionally determined from the number of faults and the working time. The shortcoming of this method is that the product must already be in service. To design and enhance the reliability of an electromechanical product, the reliability evaluation method must be feasible and correct. A reliability evaluation method and algorithm are proposed in which the reliability of the product is calculated from the reliability of its subsystems, which can be obtained from experiments or historical data. As an example, the reliability of a machining center was evaluated with the method and algorithm. The calculation result shows that the mean time between failures calculated by the proposed method achieves 97.4% accuracy compared with the traditional method. The method and algorithm can be used to evaluate the reliability of an electromechanical product before it enters service.
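A minimal sketch of the subsystem-to-system calculation, under the illustrative assumptions that the subsystems are independent, connected in series, and have exponentially distributed times between failures (the paper's actual reliability model may differ):

def system_mtbf(subsystem_mtbfs):
    """MTBF of a series system of independent subsystems with constant failure rates.

    The system failure rate is the sum of the subsystem failure rates,
    so MTBF_sys = 1 / sum(1 / MTBF_i).
    """
    return 1.0 / sum(1.0 / m for m in subsystem_mtbfs)

# Subsystem MTBFs (hours) obtained from experiments or historical data
print(system_mtbf([1200.0, 800.0, 2500.0]))  # about 402.7 hours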
Abstract: This paper considers the parallel machines scheduling problem where jobs are subject to different release times. A constructive heuristic is first proposed to solve the problem in a modest amount of computer time. In general, the quality of the solutions provided by heuristics degrades as the problem's scale increases. By combining the heuristic with the global search ability of a genetic algorithm, this paper proposes a hybrid heuristic that further improves solution quality. The computational results show that the hybrid heuristic combines the advantages of the heuristic and the genetic algorithm effectively and can provide very good solutions to some large problems in a reasonable amount of computer time.
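A minimal sketch of a constructive list-scheduling heuristic of the kind described above (not necessarily the paper's heuristic): jobs are taken in order of release time and each is assigned to the machine that can start it earliest.

def list_schedule(jobs, n_machines):
    """Greedy list scheduling for parallel machines with release times.

    jobs -- list of (release_time, processing_time) tuples
    Returns the makespan and per-job (machine, start, finish) assignments.
    """
    free_at = [0.0] * n_machines              # time at which each machine becomes idle
    schedule = []
    for release, proc in sorted(jobs):        # earliest-release-time order
        m = min(range(n_machines), key=lambda i: max(free_at[i], release))
        start = max(free_at[m], release)
        free_at[m] = start + proc
        schedule.append((m, start, free_at[m]))
    return max(free_at), schedule

makespan, plan = list_schedule([(0, 5), (2, 3), (2, 7), (6, 2)], n_machines=2)
print(makespan, plan)  # makespan 12 on two machines for this small instance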
Abstract: Objective To establish and evaluate a hypercoagulable animal model for the assessment of anticoagulants. Methods Forty mice, thirty-two rats, and twenty-four rabbits were randomly and equally divided into a control group (saline) and three ellagic acid (EA)-treated groups (low, middle, and high doses). In the mice, bleeding time (BT) was estimated by tail transection, and clotting time (CT) by the template method. Prothrombin time (PT) and activated partial thromboplastin time (APTT) in rats and rabbits were measured by means of Quick's one-stage assay and a modified APTT assay, respectively. In addition, thrombin activity was estimated in rats with the PT assay using a hemagglutination analyzer. Circulating platelet aggregates were detected in rabbits through platelet counting and presented as the circulating platelet aggregate ratio (CPAR). Results EA shortened BT and CT in mice and PT and APTT in rats, and increased thrombin activity and CPAR, all in a dose-dependent manner. EA also reduced PT and APTT in rabbits in dose- and time-dependent manners. Conclusion EA could induce a hypercoagulable state by activating the coagulation system and platelets in mice, rats, and rabbits.
Funding: Project (Nos. 60174009 and 70071017) supported by the National Natural Science Foundation of China
Abstract: This paper proposes a Genetic Programming-Based Modeling (GPM) algorithm for chaotic time series. GP is used here to search for appropriate model structures in function space, and the Particle Swarm Optimization (PSO) algorithm is used for Nonlinear Parameter Estimation (NPE) of the dynamic model structures. In addition, GPM integrates the results of Nonlinear Time Series Analysis (NTSA) to adjust the parameters and uses them as criteria for the established models. Experiments showed the effectiveness of these improvements on chaotic time series modeling.
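A minimal sketch of PSO used for nonlinear parameter estimation, the role it plays inside GPM according to the abstract; the inertia and acceleration coefficients and the test function are illustrative choices, not the paper's settings.

import numpy as np

def pso_fit(loss, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimizer minimizing loss over dim parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # candidate parameter vectors
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_val = x.copy(), np.array([loss(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Example: estimate (a, b) of y = a * sin(b * t) from noisy samples
t = np.linspace(0, 10, 200)
y = 2.0 * np.sin(1.3 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
params, err = pso_fit(lambda p: np.mean((p[0] * np.sin(p[1] * t) - y) ** 2), dim=2)
print(params, err)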
Funding: the National Natural Science Foundation of China (No. 60533040); the Hi-Tech Research and Development Program (863) of China (Nos. 2007AA010304 and 2007AA01Z129); the Key Scientific and Technological Project of Hangzhou Technology Bureau, China (No. 20062412B01)
Abstract: In hard real-time systems, schedulability analysis is not only one of the important means of guaranteeing the timeliness of embedded software but also one of the fundamental theories underlying the application of other new techniques, such as energy savings and fault tolerance. However, most existing schedulability analysis methods assume that schedulers use either preemptive or non-preemptive scheduling. In this paper, we present a schedulability analysis method, the worst-case hybrid scheduling (WCHS) algorithm, which considers the influence of release jitters of transactions and extends schedulability analysis theory to the timing analysis of linear transactions under fixed-priority hybrid scheduling. To the best of our knowledge, this is the first method for timing analysis of linear transactions under hybrid scheduling. An example is employed to demonstrate the use of the method. Experiments show that the method has lower computational complexity while maintaining correctness, and that hybrid scheduling has little influence on the average worst-case response time (WCRT) but a negative impact on the schedulability of systems.
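For context, here is a sketch of the classical worst-case response-time recurrence for independent periodic tasks under fully preemptive fixed-priority scheduling; this is the textbook analysis, not the WCHS algorithm for linear transactions under hybrid scheduling, and the termination policy is an illustrative simplification.

import math

def wcrt(task, higher_priority):
    """Worst-case response time under preemptive fixed-priority scheduling.

    task            -- (C, T): worst-case execution time and period
    higher_priority -- list of (C, T) pairs for all higher-priority tasks
    Iterates R = C + sum(ceil(R / T_j) * C_j) until convergence.
    """
    c, t = task
    r = c
    while True:
        r_next = c + sum(math.ceil(r / tj) * cj for cj, tj in higher_priority)
        if r_next == r:
            return r          # converged: schedulable if r is within the deadline
        if r_next > t:
            return None       # exceeds the period: treated as unschedulable here
        r = r_next

# Example: task (C=2, T=20) with higher-priority tasks (1, 4) and (2, 10)
print(wcrt((2, 20), [(1, 4), (2, 10)]))  # 6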
Funding: the Graduate Developing Innovation Project of Jiangsu Province of China (No. CXZZ11-0306); the Major State Basic Research and Development Program of China (No. 2007CB209400)
Abstract: A new composite two-component grout comprising modified urea-formaldehyde resin and cement was formulated to take account of the advantages and disadvantages of both cement grouts and chemical grouts. The new grout is designed for water blocking by reinforcement as well as seepage control by bore grouting. The A component consists of modified urea-formaldehyde resin, cement, and water; the B component is an alkaline coagulant. An orthogonal test of four factors at three levels showed that gel time increased with increased water content and with urea-formaldehyde resin content, and decreased at increased levels of alkaline coagulant. The A component of the new composite grout is stable over time. A mixed cross-over test showed that as the volume ratio of A to B increases, the gel time first falls and then increases. The solid strength decreases with increasing levels of the B component; it increases over time and becomes stable by the 28th day after mixing. The viscosity increases with increasing levels of the resin A component; the increase is exponential and may be fitted by μ = 8.162e^(0.0286x).
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51279033) and the Natural Science Foundation of Heilongjiang Province, China (Grant No. F201346)
Abstract: Underwater acoustic scattering echoes have time–space structures and are aliased in the time and frequency domains. Different series of echo properties cannot be identified when the incident angle is unknown. This article investigates variations in the target echoes of monostatic sonar to address this problem. A mother wavelet with a similar structure is proposed on the basis of preprocessing the signal waveform with a matched filter, and theoretical expressions relating the delay factor and the incident angle are derived in the wavelet domain. Analysis of simulation data and experimental results in a free-field pool shows that this method can effectively separate the geometrical scattering components of target echoes. The time delay estimates obtained from geometrical echoes at a single angle are consistent with the target's geometrical features, which provides a basis for object recognition without angle information. The findings provide valuable insights for analyzing elastic scattering echoes in actual ocean environments.
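A minimal numpy sketch of matched-filter time-delay estimation by cross-correlation with the transmitted waveform, a simplified stand-in for the wavelet-domain processing described above; the signal parameters and names are illustrative.

import numpy as np

def matched_filter_delay(received, template, fs):
    """Estimate echo delay from the peak of the cross-correlation with the template."""
    corr = np.correlate(received, template, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(template) - 1)   # sample lag of the peak
    return lag / fs

fs = 100e3                                    # 100 kHz sampling rate
t = np.arange(0, 1e-3, 1 / fs)                # 1 ms transmit pulse
template = np.sin(2 * np.pi * 20e3 * t)       # 20 kHz pulse
received = np.zeros(1000)
received[300:300 + len(template)] += template     # echo arriving 300 samples late
received += 0.1 * np.random.default_rng(2).normal(size=received.size)
print(matched_filter_delay(received, template, fs))   # about 0.003 s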
Abstract: NLTHA (nonlinear time history analysis) is impractical for widespread use by professional engineers because it requires long and inefficient computation involving considerable complexity when six DOFs (degrees of freedom) per node are applied. NLTHA results are nowadays approximated by MPA (modal pushover analysis), in which the effects of higher modes on the dynamic response are considered to estimate seismic demands on structures. In this study, the effect of reducing the number of DOFs is analyzed using 3D NLTHA together with MPA of a rigid-connection RC bridge under large earthquake motion. The results are compared with 6-DOF NLTHA in terms of structural response and CPU time to identify the most efficient computational effort. The NLTHA results showed that the computational time for both the 4-DOF model (without the two lateral torsional effects) and the 3-DOF model (without the two lateral torsional and vertical displacements) was reduced significantly compared with the 6-DOF model; the reduction in computational time was close to fifty percent in both cases. When the maximum responses from NLTHA and MPA are compared, the differences are found to be insignificant.
Abstract: The braking behavior of drivers when a pedestrian steps from the sidewalk onto the road was analyzed using a driving simulator. Based on the drivers' braking behavior, a braking control timing was proposed for a system that avoids collisions with pedestrians. In this study, the subject drivers started braking at almost the same time in terms of TTC (Time to Collision), regardless of the velocity of the subject vehicle and the crossing velocity of the pedestrians. This experimental result showed that the system brake timing that minimizes interference between braking by the drivers and braking by the system is a TTC of 1.3 s. Next, the drivers' braking behavior was investigated when the system applied braking at this timing to avoid a collision. As a result, drivers showed no change in braking behavior that would indicate excessive dependence on the system, and there was no excessive interference between the system's braking control and the drivers' braking operation for avoiding collisions with pedestrians.
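A minimal sketch of the time-to-collision computation on which such a braking-control trigger relies; the 1.3 s threshold is taken from the study above, while the function names and example numbers are illustrative.

def time_to_collision(gap_m, closing_speed_mps):
    """TTC in seconds; returns infinity if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def system_should_brake(gap_m, closing_speed_mps, ttc_threshold_s=1.3):
    """Trigger automatic braking when TTC drops to the 1.3 s threshold reported above."""
    return time_to_collision(gap_m, closing_speed_mps) <= ttc_threshold_s

# An 18 m gap closing at 13.9 m/s (about 50 km/h) gives a TTC of roughly 1.29 s
print(system_should_brake(gap_m=18.0, closing_speed_mps=13.9))  # True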
Abstract: In the standard statistical literature, the stationarity of a time-dependent process is generally defined by the invariance in time of the distribution of the variable, such as an SPL (sound pressure level) fluctuating in time. In reality, however, no distribution or characteristic derived from it can be constant in time in the strict mathematical sense, because the observation intervals can only be finite for practical reasons. Hence every distribution, and every characteristic based on it, carries a certain but evaluable uncertainty. For monitoring these uncertainties, online measurement techniques, i.e. primarily appropriate software, are already available, also to customers. In line with this state of the art, the following expanded definition of stationarity is proposed: stationarity during a quality-controlled measurement process is established when the upper confidence limit of the specific characteristic of interest has no positive slope in time, the lower confidence limit correspondingly has no negative slope, and, as a third, common condition, the characteristic of interest has settled to a constant position in time. From this, a systematic scheme of criteria is established and applied in examples to different indoor and outdoor sound-impact situations.
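A minimal numpy sketch of the proposed criterion, assuming running upper and lower confidence limits of the characteristic are already available and using a simple linear-trend fit as the slope test (an illustrative choice; the names are hypothetical).

import numpy as np

def is_stationary(times, upper_cl, lower_cl):
    """Check the expanded stationarity criterion on running confidence limits.

    times    -- measurement times of the running estimates
    upper_cl -- running upper confidence limit of the characteristic (e.g. a mean SPL)
    lower_cl -- running lower confidence limit
    """
    upper_slope = np.polyfit(times, upper_cl, 1)[0]
    lower_slope = np.polyfit(times, lower_cl, 1)[0]
    return upper_slope <= 0 and lower_slope >= 0   # limits settle toward a constant position

t = np.linspace(0, 600, 61)                  # 10 minutes of running estimates
upper = 62.0 + 3.0 * np.exp(-t / 120.0)      # upper limit settling downward
lower = 58.0 - 3.0 * np.exp(-t / 120.0)      # lower limit settling upward
print(is_stationary(t, upper, lower))        # True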