In this paper, a single-machine scheduling model with a given common due date is considered. Job processing time is a linear decreasing function of its starting time. The objective is to minimize the total weighted earliness and tardiness penalties, and our aim is to find an optimal schedule for this objective. As the problem is NP-hard, some structural properties and polynomially solvable cases are given, and a dynamic programming algorithm for the general case is provided.
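As an illustration only, the objective above can be evaluated for a candidate sequence as follows. The per-job parameters a, b, w and the decreasing-time model p_j(t) = a_j - b_j * t are hypothetical stand-ins consistent with the abstract, a single symmetric weight per job is assumed, and the paper's dynamic program itself is not reproduced:

```python
def weighted_et(sequence, a, b, w, d):
    """Total weighted earliness/tardiness for a common due date d.

    Processing time of job j started at time t is a[j] - b[j] * t
    (linear decreasing, as in the abstract); a, b, w are hypothetical
    per-job parameters and w[j] is a symmetric earliness/tardiness weight.
    """
    t, total = 0.0, 0.0
    for j in sequence:
        t += a[j] - b[j] * t          # completion time C_j = start + processing
        total += w[j] * abs(d - t)    # earliness or tardiness, weighted
    return total
```

With b = 0 this reduces to the classical fixed-processing-time earliness/tardiness objective.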
In this paper, single-machine scheduling problems with variable processing times are raised. The criteria considered include minimizing the schedule length of all jobs, the flow time, and the number of tardy jobs. The complexity of the problem is determined.
Most papers in scheduling research have treated individual job processing times as fixed parameters. However, in many practical situations, a manager may control processing times by reallocating resources. In this paper, the authors consider a machine scheduling problem with controllable processing times. In the first part of the paper, a special case where the processing times and compression costs are uniform among jobs is discussed, and theoretical results are derived that aid in developing an O(n^2) algorithm to solve the problem optimally. In the second part, the discussion is generalized and an effective heuristic for the general problem is presented.
Due date quotation and scheduling are important tools to match demand with production capacity in the MTO (make-to-order) environment. We consider an order scheduling problem faced by a manufacturing firm operating in an MTO environment, where the firm needs to quote a common due date for its customers and simultaneously control the processing times of customer orders (by allocating extra resources to process the orders) so as to complete the orders before a given deadline. The objective is to minimize the total costs of earliness, tardiness, due date assignment, and extra resource consumption. We show that the problem is NP-hard even if the cost weights for controlling the order processing times are identical. We identify several polynomially solvable cases of the problem, and develop a branch-and-bound algorithm and three Tabu search algorithms to solve the general problem. Computational experiments evaluating the three Tabu search algorithms show that they are generally effective in terms of solution quality.
Two-stage hybrid flow shop scheduling has been extensively considered in single-factory settings. However, the distributed two-stage hybrid flow shop scheduling problem (DTHFSP) with fuzzy processing times is seldom investigated in multiple factories, and the integration of reinforcement learning with metaheuristics is seldom applied to solve the DTHFSP. In this study, the DTHFSP with fuzzy processing times is investigated, and a novel Q-learning-based teaching-learning-based optimization (QTLBO) is constructed to minimize makespan. The algorithm recruits several teachers and comprises a teacher phase, a learner phase, a teacher self-learning phase, and a learner self-learning phase. Q-learning is implemented with 9 states, 4 actions defined as combinations of the above phases, a reward, and an adaptive action selection, and is used to dynamically adjust the algorithm structure. Computational experiments demonstrate that the new strategies of QTLBO are effective and that it produces promising results on the considered DTHFSP.
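A minimal sketch of the Q-learning component (epsilon-greedy selection plus the standard one-step update). The 9 states and 4 actions mirror the abstract, but alpha, gamma, epsilon, and the reward are illustrative assumptions, not the paper's settings:

```python
import random

class QLearner:
    """Minimal Q-learning controller in the spirit of QTLBO: 9 states and
    4 actions (phase combinations). Hyperparameters are illustrative."""

    def __init__(self, n_states=9, n_actions=4, alpha=0.1, gamma=0.9, eps=0.2):
        self.q = [[0.0] * n_actions for _ in range(n_states)]
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def select(self, s):
        """Epsilon-greedy action selection in state s."""
        if random.random() < self.eps:                 # explore
            return random.randrange(len(self.q[s]))
        row = self.q[s]
        return row.index(max(row))                     # exploit

    def update(self, s, a, r, s_next):
        """One-step Q-learning update after observing reward r."""
        best_next = max(self.q[s_next])
        self.q[s][a] += self.alpha * (r + self.gamma * best_next - self.q[s][a])
```

In the paper, the selected action decides which combination of teaching/learning phases to run next; the reward would be derived from makespan improvement.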
In this paper, single-machine scheduling problems with variable processing times are discussed, motivated by published instances from management engineering. The processing time of a job is the product of a "coefficient" associated with position i and a "normal" processing time of the job. The criterion considered is minimizing the schedule length of all jobs. A lemma is proposed and proved. Without deadline constraints, the problem is solvable in polynomial time. It is proved by reduction from 3-PARTITION that the deadline-constrained version is strongly NP-hard. Finally, a conjecture that remains to be proved is proposed.
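A sketch of why the unconstrained case is easy, assuming the schedule length is the sum of coefficient × normal-time products over positions: by the rearrangement inequality, pairing the largest positional coefficient with the smallest normal time minimizes that sum (the paper's exact lemma may differ in detail):

```python
def min_makespan(coeffs, normals):
    """Minimum schedule length under the multiplicative positional model,
    assuming makespan = sum of coefficient * normal-time products.
    Pair ascending coefficients with descending normal times
    (rearrangement inequality); a sketch, not the paper's lemma."""
    c = sorted(coeffs)                 # positional coefficients, ascending
    p = sorted(normals, reverse=True)  # normal processing times, descending
    return sum(ci * pi for ci, pi in zip(c, p))
```

For example, with coefficients {1, 2} and normal times {3, 5}, the pairing 1·5 + 2·3 = 11 beats 1·3 + 2·5 = 13.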
Background: Dry specimen transport has shown equivalence to traditional liquid transport using a novel high-risk human papillomavirus (hrHPV) assay. Considering that dry transport might overcome obstacles during cervical cancer screening in low- and middle-resource settings, this study was designed to evaluate different processing times for dry specimen transport using the same isothermal amplification hrHPV assay. Methods: 564 women between the ages of 30 and 55 were recruited from a colposcopy clinic. For each patient, two endocervical samples were collected by a physician and placed into empty collection tubes. Samples were stored at room temperature until analyzed for hrHPV using the AmpFire assay at two time points: 2 days and 2 weeks. 511 of the 564 participants with positive hrHPV received a colposcopy exam and quadrant biopsy. Results: A total of 1128 endocervical samples from 564 patients were tested with the AmpFire assay. Good agreement was found between the two time points (kappa ± standard error = 0.67 ± 0.04). Sensitivity (2 days/2 weeks) for CIN2+ was 95.28% (95% CI: 92.14%–98.42%) vs 90.57% (CI: 86.65%–94.49%), and specificity (2 days/2 weeks) was 22.47% (CI: 19.33%–25.61%) vs 28.15% (CI: 24.23%–32.07%), respectively. The difference in AmpFire HPV detection sensitivity for CIN2+ between the two time points was not significant (P = 0.227), while the difference in specificity for CIN2+ was significant (P = 0.001). The difference in Ct values, 29.23 (CI: 28.15–30.31) vs 29.27 (CI: 28.19–30.35), between the two time points was not significant (P = 0.164). Conclusion: Processing of dry brush specimens can be delayed up to 2 weeks using the AmpFire assay platform, which supports cervical cancer prevention programs in low-to-middle-income countries (LMICs).
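For readers who want to reproduce the agreement statistic, a minimal Cohen's kappa on paired binary calls (illustrative; the paper also reports a standard error, which is not computed here):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired binary result lists (e.g. hrHPV
    positive/negative calls at 2 days vs 2 weeks for the same patients)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
    pa = sum(a) / n, 1 - sum(a) / n                  # marginals of first reading
    pb = sum(b) / n, 1 - sum(b) / n                  # marginals of second reading
    pe = pa[0] * pb[0] + pa[1] * pb[1]               # chance agreement
    return (po - pe) / (1 - pe)
```

Perfect agreement gives kappa = 1, and agreement at exactly chance level gives kappa = 0.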
1) Seismological Bureau of Sichuan Province, Chengdu 610041, China; 2) Center for Analysis and Prediction, State Seismological Bureau, Beijing 100036, China; 3) Observation Center for Prediction of Earthquakes and Volcanic Eruptions, Faculty of Sciences, Tohoku University, Sendai 98077, Japan
Collecting and storing big health data for further analysis is a challenging task because such data are large and have many features. Several cloud-based IoT health providers have been described in the literature, but a number of issues remain concerning processing time and overall network performance for big data. Existing methods have relied on lower-performing optimization algorithms for optimizing the data. In the proposed method, a Chaotic Cuckoo Optimization algorithm is used for feature selection and a Convolutional Support Vector Machine (CSVM) is used for classification. The research presents a method for analyzing healthcare information for use in future prediction. The major goal is to handle a variety of data while improving efficiency and minimizing processing time. The suggested hybrid method is divided into two stages. In the first stage, it reduces the features using the Chaotic Cuckoo Optimization algorithm with Lévy flight, opposition-based learning, and a distributor operator. In the second stage, the CSVM combines the benefits of convolutional neural networks (CNNs) and SVMs: it modifies the CNN's convolution product to learn representations hidden deep inside the data sources. For improved economic flexibility, greater protection, better analytics with confidentiality, and lower operating cost, the suggested approach is built on fog computing. Overall, the experiments show that the suggested method can reduce the number of features in the datasets, enhances accuracy by 82%, and decreases processing time.
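A sketch of the Lévy-flight step commonly used in cuckoo-search variants (Mantegna's algorithm); the chaotic maps, opposition-based learning, and distributor operator of the proposed variant are not shown:

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-flight step via Mantegna's algorithm, the heavy-tailed
    random step typically used in cuckoo search. beta = 1.5 is the
    customary default, not necessarily the paper's value."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)   # occasionally very large jumps
```

In feature selection, such a step perturbs a candidate feature-subset encoding, letting the search escape local optima via occasional long jumps.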
In this paper, a fabrication scheduling problem concerning the production of components at a single manufacturing facility is studied, in which the manufactured components are subsequently assembled into a finite number of end products. Each product is assumed to comprise a component common to all jobs and a component unique to itself. Common operations are processed in batches, and each batch requires a setup time. A product is completed when both of its operations have been processed and are available. The optimality criterion considered is the minimization of weighted flow time. For this scheduling problem, the optimal schedules are described by a weighted shortest processing time first (WSPT) order, and two algorithms are constructed corresponding to batch availability and item availability, respectively.
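The WSPT order mentioned above can be sketched as follows (jobs as (processing_time, weight) pairs; the batching and setup-time aspects of the paper are ignored in this illustration):

```python
def wspt_order(jobs):
    """Sort jobs in weighted-shortest-processing-time order:
    ascending p_j / w_j, equivalently descending w_j / p_j.
    jobs: list of (processing_time, weight) tuples."""
    return sorted(jobs, key=lambda j: j[0] / j[1])
```

WSPT is the classical optimal order for minimizing total weighted flow time on a single machine; the paper extends it to the batched common-component setting.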
Ground conditions and construction (excavation and support) time and costs are the key factors in decision-making during the planning and design phases of a tunnel project. An innovative methodology for probabilistic estimation of ground conditions and construction time and costs is proposed, which integrates a ground prediction approach based on a Markov process with time and cost variance analysis based on Monte Carlo (MC) simulation. The former provides a probabilistic description of ground classification along the tunnel alignment according to the geological information revealed by the geological profile and boreholes. The latter provides a probabilistic description of the expected construction time and costs for each operation according to survey feedback from experts. An engineering application to the Hamro tunnel is then presented to demonstrate how the ground conditions and the construction time and costs are estimated in a probabilistic way. For most items, the data needed for this methodology are estimated by distributing questionnaires among tunneling experts and applying the mean values of the responses. These results make both owners and contractors aware of the risk they carry before construction, and are useful for both tendering and bidding.
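A minimal sketch of the combined Markov/Monte-Carlo idea, assuming a transition matrix P over ground classes and a hypothetical per-segment cost sampler for each class (all numbers illustrative, not from the Hamro case study):

```python
import random

def simulate_tunnel(P, seg_cost, n_segments, n_runs=10000, start=0):
    """Monte-Carlo estimate of mean total construction cost.

    The ground class of successive tunnel segments follows a Markov
    chain with transition matrix P (rows sum to 1); seg_cost[k]()
    samples the cost of excavating one segment in class k."""
    totals = []
    for _ in range(n_runs):
        state, total = start, 0.0
        for _ in range(n_segments):
            total += seg_cost[state]()
            r, cum = random.random(), 0.0
            for nxt, p in enumerate(P[state]):   # sample next ground class
                cum += p
                if r < cum:
                    state = nxt
                    break
        totals.append(total)
    return sum(totals) / n_runs                  # mean total cost
```

The same loop, with per-class time samplers instead of cost samplers, yields the construction-time distribution; keeping the full `totals` list instead of the mean gives the variance analysis the abstract describes.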
Some properties of super-Brownian motion have been studied by Dawson & Hochberg [1], Iscoe [2, 3], Konno & Shiga [4], and others. In this paper, we restrict our attention to the occupation time processes of super-Brownian motion and give an intuitive proof of their absolute continuity with respect to the Lebesgue measure on R^d (d ≤ 3) when the initial measure of the super-Brownian motion is absolutely continuous.
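For reference, the occupation time process of a measure-valued process is standardly defined as follows (notation assumed, not taken from the paper, with $X_t$ the super-Brownian motion):

```latex
Y_t(B) \;=\; \int_0^t X_s(B)\,\mathrm{d}s, \qquad B \subseteq \mathbb{R}^d \text{ Borel}.
```

Absolute continuity then means $Y_t(\mathrm{d}x) = y_t(x)\,\mathrm{d}x$ for some density $y_t$, which the paper establishes for $d \le 3$.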
A multi-GPU system designed for high-speed, real-time signal processing of optical coherence tomography (OCT) is described herein. For OCT data sampled in linear wavenumbers, the maximum processing rates reached 2.95 MHz for 1024-OCT and 1.96 MHz for 2048-OCT. Data sampled at linear wavelengths were re-sampled using a time-domain interpolation method and a zero-padding interpolation method to improve image quality. The maximum processing rates for 1024-OCT reached 2.16 MHz for the time-domain method and 1.26 MHz for the zero-padding method; for 2048-OCT they reached 1.58 MHz and 0.68 MHz, respectively. The system is thus capable of high-speed, real-time processing for OCT systems.
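The wavelength-to-wavenumber re-sampling step can be sketched on the CPU as follows (a simplified stand-in for the paper's GPU time-domain interpolation; numpy is assumed, and linear rather than higher-order interpolation is used):

```python
import numpy as np

def resample_to_linear_k(spectrum, wavelengths):
    """Re-sample an OCT spectrum acquired on a uniform wavelength grid
    onto a uniform wavenumber grid (k = 2*pi/lambda) by linear
    interpolation, so that a plain FFT yields an undistorted A-scan."""
    k = 2 * np.pi / wavelengths                  # nonuniform k grid (descending)
    k_lin = np.linspace(k.min(), k.max(), len(k))
    # np.interp requires ascending sample points, so reverse both arrays
    return np.interp(k_lin, k[::-1], spectrum[::-1])
```

After this step, `np.fft.fft` of the re-sampled spectrum gives the depth profile; the zero-padding method instead pads the spectrum before interpolation to refine the grid.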
In our previous work, a novel algorithm to perform robust pose estimation was presented. The pose was estimated using correspondences between points on the object and regions on the image. Laboratory experiments in that work showed that the accuracy of the estimated pose was over 99% for position and 84% for orientation. However, for larger objects the algorithm requires a higher number of points to achieve the same accuracy, which makes it computationally intensive and thus infeasible for real-time computer vision applications. In this paper, the algorithm is parallelized to run on NVIDIA GPUs. The results indicate that even for objects having more than 2000 points, the algorithm can estimate the pose in real time for each frame of high-resolution video.
In the first step, the Ehrenfest reasoning concerning the adiabatic invariance of the orbital angular momentum is applied to the electron motion in the hydrogen atom. It is demonstrated that the time of the energy emission from quantum level n+1 to level n can be deduced from the orbital angular momentum examined in the hydrogen atom. This time is found to be precisely equal to the time interval dictated by the Joule-Lenz law governing the electron transition between levels n+1 and n. In the next step, the mechanical parameters entering the quantum systems are applied in calculating the time intervals characteristic of the electron transitions. This concerns the neighbouring energy levels in the hydrogen atom as well as the Landau levels of an electron gas subjected to a constant magnetic field.
In existing integrated scheduling algorithms, all processes are ordered and scheduled as a whole, ignoring the influence of the vertical and horizontal characteristics of the product process tree on the scheduling effect. This paper presents an integrated scheduling algorithm for same-equipment process sequencing based on Root-Subtree horizontal and vertical pre-scheduling to solve this problem. First, the tree decomposition method extracts the root node to split the process tree into several Root-Subtrees, and Root-Subtree priorities are set from large to small according to the optimal completion times of vertical and horizontal pre-scheduling. All Root-Subtree processes on the same equipment are pushed onto a stack ordered by the equipment-process pre-start time, and the stack-top processes are combined with the schedulable process set to schedule and dispatch from the stack. The start time of each process is determined by a dynamic start-time strategy for equipment processes, completing the fusion of Root-Subtree processes under the constraints of the vertical process tree and the horizontal equipment. The root node is then re-inserted to form the final scheduling scheme, which realizes scheduling optimization by mining the vertical and horizontal characteristics of the process tree. Examples verify that, compared with traditional integrated scheduling algorithms that sort all scheduling processes as a whole, the proposed algorithm is better: it enhances process scheduling compactness, reduces idle time on the processing equipment, and optimizes the production scheduling target, which is of general significance for solving the integrated scheduling problem.
The process flow of the ordinary fourdrinier paper machine is briefly introduced first. After the difficulties of paper basis weight (BW) control are analyzed, an auto-tuning PID/PI control algorithm based on relay feedback identification is proposed. It has such advantages as simple parameter adjustment, little dependence on a process model, strong robustness, and ease of implementation, and it is well suited to controlling processes such as the BW loop with large time delay.
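A sketch of the relay-feedback tuning idea: the Åström–Hägglund describing-function estimate gives the ultimate gain Ku = 4d/(πa) from the relay amplitude d and the measured limit-cycle amplitude a, the limit-cycle period gives Pu, and classical Ziegler–Nichols rules then yield PID settings. The paper's actual tuning rules may differ:

```python
import math

def zn_pid_from_relay(d, a, period):
    """PID settings from a relay-feedback experiment.

    d      : relay output amplitude
    a      : amplitude of the resulting process limit cycle
    period : limit-cycle period (taken as the ultimate period Pu)
    Returns classical Ziegler-Nichols PID parameters."""
    ku = 4.0 * d / (math.pi * a)     # describing-function ultimate gain
    pu = period
    return {"Kp": 0.6 * ku, "Ti": 0.5 * pu, "Td": 0.125 * pu}
```

For a large-delay loop such as basis weight, more conservative rules (or the PI subset) are often substituted for the raw Ziegler–Nichols constants.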
With its spatial-temporal focusing of acoustic energy, time reversal processing (TRP) shows potential for active target detection in shallow water. To turn this potential into reality, TRP based on a model source (MS) instead of a physical probe source (PS) is investigated. For uncertain ocean environments, the robustness of TRP is discussed for narrowband and broadband signals respectively. The channel transfer function matrix is first constructed in the acoustic perturbation space, and a steering vector for time reversal transmission is then obtained by singular value decomposition (SVD) of the matrix. To verify the robust TRP, tank experiments on time reversal transmission focusing and its application to active target detection were undertaken. The experimental results show that the robust TRP can effectively detect and locate a small bottom target.
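The SVD step can be sketched as follows, assuming H is the channel transfer function matrix (receivers × sources): the right singular vector of the largest singular value gives the transmission weights of the dominant focusing mode, phase-conjugated for time reversal. This is a generic sketch of the technique the abstract names, not the paper's full perturbation-space construction:

```python
import numpy as np

def tr_steering_vector(H):
    """Steering vector for time-reversal transmission: the right
    singular vector of H associated with the largest singular value,
    phase-conjugated. Transmitting these weights maximizes the
    received energy ||H v|| among unit-norm excitations v."""
    _, _, vh = np.linalg.svd(H)      # numpy returns V^H with rows sorted
    return np.conj(vh[0])            # dominant mode, conjugated
```

Building H over an ensemble of perturbed environments (as the abstract describes) makes the leading singular vector a focusing solution that is robust to environmental uncertainty.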
A water model and a high-speed video camera were utilized with 300-t RH equipment to study the effect of steel flow patterns in the vacuum chamber on fast decarburization, and a superior flow-pattern map was obtained for the practical RH process. There are three flow patterns with different bubbling characteristics and steel surface states in the vacuum chamber: the boiling pattern (BP), transition pattern (TP), and wave pattern (WP). The effects of the liquid-steel level and of the residence time of the steel in the chamber on the flow patterns and the decarburization reaction were investigated. The liquid-steel level significantly affected the flow-pattern transition from BP to WP, and the residence time and reaction area, rather than the circulation flow rate and mixing time, were crucial for evaluating the whole decarburization process. A superior flow-pattern map of the practical RH process showed that the steel flow pattern changed from BP to TP quickly and then remained TP until the end of decarburization.
Supported by the joint project of Peking University Third Hospital and the Chinese Academy of Sciences (Grant Nos. 7490-04 and KJZD-EW-TZ-L03), the Sichuan Youth Science & Technology Foundation (Grant No. 13QNJJ0034), the West Light Foundation of the Chinese Academy of Sciences, the National Major Scientific Equipment Program (Grant No. 2012YQ120080), and the National Science Foundation of China (Grant No. 6118082).
A multi-GPU system designed for high-speed, real-time signal processing of optical coherence tomography (OCT) is described herein. For OCT data sampled in linear wavenumber, the maximum processing rates reached 2.95 MHz for 1024-OCT and 1.96 MHz for 2048-OCT. Data sampled in linear wavelength were re-sampled using a time-domain interpolation method and a zero-padding interpolation method to improve image quality. The maximum processing rates for 1024-OCT reached 2.16 MHz for the time-domain method and 1.26 MHz for the zero-padding method. The maximum processing rates for 2048-OCT reached 1.58 MHz and 0.68 MHz, respectively. This method is capable of high-speed, real-time processing for OCT systems.
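Zero-padding interpolation, one of the two re-sampling methods mentioned, can be sketched on the CPU with NumPy: the spectrum of the sampled signal is zero-padded in the Fourier domain so the inverse transform yields an upsampled signal from which linear-wavenumber positions can be picked. The test signal and upsampling factor are illustrative; the paper's GPU implementation is not reproduced here.

```python
# Hedged sketch of zero-padding (FFT) interpolation for re-sampling.
import numpy as np

def zero_pad_interp(x, factor=4):
    """Upsample a real signal by a factor via zero-padding its spectrum."""
    n = len(x)
    X = np.fft.rfft(x)
    # place the original spectrum into a longer (zero-padded) spectrum
    Xp = np.zeros(factor * n // 2 + 1, dtype=complex)
    Xp[: len(X)] = X
    # scale by the factor so the original sample values are preserved
    return np.fft.irfft(Xp, factor * n) * factor

x = np.cos(2 * np.pi * 5 * np.arange(64) / 64)  # band-limited test signal
y = zero_pad_interp(x, 4)                        # 4x upsampled version
```

Every fourth sample of the upsampled signal coincides with the original samples, which is the property that makes the method usable for re-sampling wavelength-spaced data onto a wavenumber grid.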
In our previous work, a novel algorithm to perform robust pose estimation was presented. The pose was estimated using correspondences between points on the object and regions on the image. The laboratory experiments conducted in the previous work showed that the accuracy of the estimated pose was over 99% for position and 84% for orientation estimation, respectively. However, for larger objects, the algorithm requires a higher number of points to achieve the same accuracy. This requirement makes the algorithm computationally intensive, rendering it infeasible for real-time computer vision applications. In this paper, the algorithm is parallelized to run on NVIDIA GPUs. The results indicate that even for objects having more than 2000 points, the algorithm can estimate the pose in real time for each frame of high-resolution video.
In the first step, the Ehrenfest reasoning concerning the adiabatic invariance of the orbital angular momentum is applied to the electron motion in the hydrogen atom. It is demonstrated that the time of energy emission from quantum level n+1 to level n can be deduced from the orbital angular momentum examined in the hydrogen atom. This time is found to be precisely equal to the time interval dictated by the Joule-Lenz law governing the electron transition between levels n+1 and n. In the next step, the mechanical parameters entering the quantum systems are applied in calculating the time intervals characteristic of the electron transitions. This concerns the neighbouring energy levels in the hydrogen atom as well as the Landau levels in an electron gas subjected to a constant magnetic field.
Supported by the National Natural Science Foundation of China (Grant No. 61772160).
In existing integrated scheduling algorithms, all processes are ordered and scheduled as a whole, and these algorithms ignore the influence of the vertical and horizontal characteristics of the product process tree on the scheduling effect. This paper presents an integrated scheduling algorithm that sequences same-equipment processes based on Root-Subtree horizontal and vertical pre-scheduling to solve this problem. Firstly, the tree decomposition method extracts the root node to split the process tree into several Root-Subtrees, and the Root-Subtree priorities are set from large to small according to the optimal completion times of vertical and horizontal pre-scheduling. All Root-Subtree processes on the same equipment are pushed onto a stack ordered by the equipment-process pre-start time, and the stack-top processes are combined with the schedulable process set for scheduling and dispatching. The start processing time of each process is determined according to the dynamic start-time strategy for equipment processes, completing the fusion of the Root-Subtree processes under the constraints of the vertical process tree and the horizontal equipment. Then, the root node is re-attached to form the final scheduling scheme, which realizes scheduling optimization by mining the vertical and horizontal characteristics of the process tree. Verification by examples shows that, compared with traditional integrated scheduling algorithms that sort all processes as a whole, the proposed algorithm performs better: it enhances process scheduling compactness, reduces the idle time of the processing equipment, and optimizes the production scheduling target, which is of general significance for solving the integrated scheduling problem.
This project was supported by the National Key Project of the Ninth Five-Year Plan (97-619-02-03).
The basic process of the ordinary fourdrinier paper machine is introduced first. After the difficulties of paper basis weight (BW) control are analyzed, an auto-tuning PID/PI control algorithm based on relay feedback identification is proposed, which has such advantages as simple parameter adjustment, little dependence on the process model, strong robustness, and ease of implementation. It is very suitable for controlling processes such as the BW loop with large time delay.
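The relay-feedback identification step can be sketched as follows: a relay of height d drives the loop into a limit cycle, the observed oscillation amplitude a and period Tu give a describing-function estimate of the ultimate gain, and classical Ziegler-Nichols rules then yield PID parameters. The measured values and the specific tuning rules below are standard textbook choices assumed for illustration; the paper's exact tuning formulas may differ.

```python
# Hedged sketch of relay-feedback auto-tuning (Astrom-Hagglund style).
# d: relay height, a: limit-cycle amplitude, Tu: limit-cycle period.
import math

def relay_tune(d, a, Tu):
    """Return (Kp, Ti, Td) PID parameters from relay experiment data."""
    Ku = 4.0 * d / (math.pi * a)  # describing-function ultimate-gain estimate
    Kp = 0.6 * Ku                 # Ziegler-Nichols PID rules
    Ti = 0.5 * Tu
    Td = 0.125 * Tu
    return Kp, Ti, Td

# illustrative numbers for a slow loop with large time delay, such as BW control
Kp, Ti, Td = relay_tune(d=1.0, a=0.5, Tu=40.0)
```

A PI variant, as the abstract's "PID/PI" suggests, would simply drop the derivative term (Td = 0) and use the corresponding PI rule.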
Project (Nos. 60702022 and 60872066) supported by the National Natural Science Foundation of China.
With the spatial-temporal focusing of acoustic energy, time reversal processing (TRP) shows potential for active target detection in shallow water. To turn this potential into reality, TRP based on a model source (MS) instead of a physical probe source (PS) is investigated. For uncertain ocean environments, the robustness of TRP is discussed for narrowband and broadband signals, respectively. The channel transfer function matrix is first constructed in the acoustic perturbation space. Then a steering vector for time reversal transmission is obtained by singular value decomposition (SVD) of the matrix. To verify the robust TRP, tank experiments of time reversal transmission focusing and its application to active target detection are undertaken. The experimental results show that the robust TRP can effectively detect and locate a small bottom target.
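The SVD step can be sketched numerically: channel transfer vectors sampled over the acoustic perturbation space are stacked into a matrix, and the leading right singular vector serves as a steering vector that is robust across the perturbations. The random "channel realizations" below are illustrative stand-ins for modeled transfer functions, not ocean data.

```python
# Hedged sketch: robust steering vector via SVD of a channel transfer matrix.
import numpy as np

rng = np.random.default_rng(1)

# assumed mean channel (complex transfer function across 16 receiver elements)
base = rng.standard_normal(16) + 1j * rng.standard_normal(16)

# rows: channel realizations perturbed around the mean (the perturbation space)
H = np.array([
    base + 0.1 * (rng.standard_normal(16) + 1j * rng.standard_normal(16))
    for _ in range(8)
])

# leading right singular vector captures the component common to all realizations
U, s, Vh = np.linalg.svd(H, full_matrices=False)
steer = Vh[0].conj()  # steering vector for time reversal transmission
```

Because the realizations share a dominant common component, the first singular value well exceeds the rest, which is what makes the resulting steering vector robust to environmental uncertainty.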
Financially supported by the National Natural Science Foundation of China (No. 51704203), the PhD Early Development Program of Taiyuan University of Science and Technology (Nos. 20152008, 20152013, and 20152018), the Shanxi Province Science Foundation for Youths (No. 201601D202027), the Key Project of the Research and Development Plan of Shanxi Province (Nos. 201603D111004 and 201603D121010), and the NSFC-Shanxi Coal Based Low Carbon Joint Fund (No. U1510131).
A water model and a high-speed video camera were utilized with the 300-t RH equipment to study the effect of steel flow patterns in the vacuum chamber on fast decarburization, and a superior flow-pattern map was obtained for the practical RH process. There are three flow patterns with different bubbling characteristics and steel surface states in the vacuum chamber: the boiling pattern (BP), the transition pattern (TP), and the wave pattern (WP). The effects of the liquid-steel level and the residence time of the steel in the chamber on the flow patterns and the decarburization reaction were investigated, respectively. The liquid-steel level significantly affected the flow-pattern transition from BP to WP, and the residence time and reaction area were crucial for evaluating the whole decarburization process rather than the circulation flow rate and mixing time. A superior flow-pattern map for the practical RH process showed that the steel flow pattern changed from BP to TP quickly, and then remained TP until the end of decarburization.