The number of aviation operations is growing worldwide, increasing the need for innovation and new technology to meet rising demand. In response, the FAA (Federal Aviation Administration) is developing NextGen in the United States, while EUROCONTROL is implementing Point Merge as an air traffic flow management solution in Europe. These alternatives are not mutually exclusive: Panama, a small country in Latin America, is pursuing a combination of the vectoring approach and Point Merge for its air traffic flow management. In addition, the AAC (Autoridad de Aeronautica Civil) and Tocumen International Airport are working in continuous collaboration with the FAA on the shared challenge of improving the current system. As a result, Panama's main airline, Compania Panamena de Aviacion (COPA Airlines), and the AAC constructed a simulation model to select an air traffic flow alternative capable of changing the current situation; that is, COPA Airlines and the AAC are pursuing the minimization of the number of conflicts, the number of sequencing actions, the flight time, the track flight distance, and the fuel burn. Building on the final draft of that simulation-based analysis, this study conducts a Design and Analysis of Computer Experiments with the objective of increasing the statistical significance of the current model.
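A Design and Analysis of Computer Experiments typically starts from a space-filling design over the simulation inputs. A minimal Latin hypercube sketch in Python, with hypothetical factor ranges standing in for the study's actual simulation variables:

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube sample: one point per equal-probability stratum
    in each dimension, with strata randomly permuted across dimensions."""
    rng = random.Random(seed)
    dims = len(bounds)
    # For each dimension, a random permutation of the n strata.
    strata = [rng.sample(range(n_samples), n_samples) for _ in range(dims)]
    points = []
    for i in range(n_samples):
        point = []
        for d, (lo, hi) in enumerate(bounds):
            # Jitter within the assigned stratum, then scale to [lo, hi].
            u = (strata[d][i] + rng.random()) / n_samples
            point.append(lo + u * (hi - lo))
        points.append(point)
    return points

# Hypothetical factors: arrival rate (aircraft/h) and merge-point spacing (NM).
design = latin_hypercube(10, [(20.0, 60.0), (3.0, 8.0)])
```

Each of the 10 simulation runs then evaluates the responses (conflicts, flight time, fuel burn) at one design point, so every stratum of every factor is exercised exactly once.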
Typically, magnesium alloys have been designed using a so-called hill-climbing approach, with rather incremental advances over the past century. Iterative, incremental alloy design is slow and expensive, but more importantly it does not harness all the data that exist in the field. In this work, a new approach is proposed that utilises data science to provide a detailed understanding of the data accumulated in Mg-alloy design to date. First, a consolidated alloy database incorporating 916 datapoints was developed from the literature and experimental work. To analyse the characteristics of the database, alloying and thermomechanical processing effects on mechanical properties were explored via composition-process-property matrices. An unsupervised machine learning (ML) clustering method was also applied to the unlabelled data, with the aim of revealing potentially useful information in an alloy representation space of low dimensionality. In addition, the alloy database was correlated with thermodynamically stable secondary phases to further understand the relationships between microstructure and mechanical properties. This work not only introduces an invaluable open-source database, it also provides, for the first time, data-driven insights that enable future accelerated digital Mg-alloy design.
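The unsupervised clustering step can be sketched with a minimal k-means over unlabelled feature vectors; the two-feature alloy "datapoints" below are purely illustrative, not entries from the 916-point database:

```python
import math
import random

def kmeans(points, k, iters=50, seed=1):
    """Minimal k-means: alternate nearest-centroid assignment and
    centroid recomputation on unlabelled feature vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[j].append(p)
        centroids = [
            [sum(xs) / len(xs) for xs in zip(*cl)] if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical 2-D alloy representation: (wt% Al, yield strength / 100 MPa).
data = [(1.0, 1.1), (1.2, 1.0), (0.9, 1.2), (6.0, 2.1), (6.2, 2.0), (5.8, 2.2)]
centroids, clusters = kmeans(data, 2)
```

On well-separated data like this, the two recovered clusters correspond to the low-Al and high-Al groups; in the paper's setting the features would be composition, processing, and property descriptors.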
Cavitation is one of the most important performance characteristics of centrifugal pumps. However, current optimization work on centrifugal pumps focuses mostly on hydraulic efficiency alone, which may result in poor cavitation performance. It is therefore necessary to find an appropriate way to improve cavitation performance while maintaining acceptable efficiency. In this paper, to improve the cavitation performance of a centrifugal pump with a vaned diffuser, the influence of impeller geometric parameters on the cavitation of the pump is investigated using an orthogonal design of experiment (DOE) based on computational fluid dynamics. The impeller inlet diameter D1, the inlet incidence angle Δβ, and the blade wrap angle φ are selected as the main impeller geometric parameters, and an L9(3^3) orthogonal experiment is performed. Three-dimensional steady cavitation simulations are conducted using a constant gas mass fraction model with a second-order upwind scheme, and the predicted cavitation performance is validated by laboratory experiment. The optimization results are obtained by the range analysis method, improving cavitation performance without an obvious decrease in the efficiency of the pump. The internal flow of the pump is analyzed to identify the flow behavior that affects cavitation performance. The results show that D1 has the greatest influence on pump cavitation and that the final optimized impeller provides a better flow distribution at the blade leading edge. The optimized impeller achieves better cavitation and hydraulic performance, with the NPSHR decreased by 0.63 m compared with the original design. The presented work supplies a feasible route in engineering practice for optimizing a centrifugal pump impeller for better cavitation performance.
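The range analysis used to rank factor influence in an L9(3^3) orthogonal experiment can be sketched as follows; the response values are hypothetical stand-ins for the simulated cavitation results, not the paper's data:

```python
# L9(3^3) orthogonal array: rows are runs, entries are factor levels (0, 1, 2).
L9 = [
    (0, 0, 0), (0, 1, 1), (0, 2, 2),
    (1, 0, 1), (1, 1, 2), (1, 2, 0),
    (2, 0, 2), (2, 1, 0), (2, 2, 1),
]
# Hypothetical responses, e.g. simulated NPSHR in metres (lower is better).
y = [3.2, 3.0, 3.4, 2.8, 2.9, 3.1, 2.6, 2.7, 3.0]

def range_analysis(array, y, n_levels=3):
    """Per factor: mean response at each level, and range R = max - min."""
    results = []
    for j in range(len(array[0])):
        means = []
        for lvl in range(n_levels):
            vals = [y[i] for i, row in enumerate(array) if row[j] == lvl]
            means.append(sum(vals) / len(vals))
        results.append({"means": means, "range": max(means) - min(means)})
    return results

res = range_analysis(L9, y)
ranking = sorted(range(3), key=lambda j: -res[j]["range"])  # most influential first
best_levels = [min(range(3), key=lambda l: r["means"][l]) for r in res]
```

The factor with the largest range dominates the response, mirroring the paper's finding that D1 has the greatest influence; the level with the best mean is carried into the optimized design.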
Four process parameters, pad diameter, stencil thickness, ball diameter and stand-off, were chosen as control factors. Using an L25(5^6) orthogonal array, ceramic ball grid array (CBGA) solder joints with 25 different combinations of the process parameters were designed. Numerical models of all 25 CBGA solder joints were developed using Surface Evolver. Using the surface coordinates exported from these numerical models, finite element models were set up and nonlinear finite element analyses of the CBGA solder joints under thermal cycling were performed in ANSYS. The thermal fatigue life of each CBGA solder joint was calculated using the Coffin-Manson equation. Based on the calculated thermal fatigue lives, range analysis and variance analysis were performed. The results show that the fatigue life of a CBGA solder joint is affected by the pad diameter, the stencil thickness, the ball diameter and the stand-off, in descending order; the combination of process parameters yielding the longest fatigue life is a 0.07 mm stand-off, 0.125 mm stencil thickness, 0.85 mm ball diameter and 0.89 mm pad diameter. With 95% confidence, the pad diameter has a significant effect on the reliability of CBGA solder joints, whereas the stand-off, the stencil thickness and the ball diameter have little effect.
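The fatigue life step can be sketched with the Coffin-Manson relation; the ductility constants below are generic textbook-style values for eutectic solder, not the paper's calibrated ones:

```python
def coffin_manson_life(d_gamma, eps_f=0.325, c=-0.442):
    """Thermal fatigue life (cycles) from the Coffin-Manson relation
    N_f = 1/2 * (d_gamma / (2 * eps_f)) ** (1 / c),
    where d_gamma is the cyclic shear strain range from the FE analysis,
    eps_f the fatigue ductility coefficient, and c the fatigue ductility
    exponent. eps_f and c here are illustrative values for eutectic
    solder, assumed for the sketch."""
    return 0.5 * (d_gamma / (2.0 * eps_f)) ** (1.0 / c)

# Hypothetical strain ranges for two joint geometries from the FE runs.
n_a = coffin_manson_life(0.010)
n_b = coffin_manson_life(0.015)
```

Because the exponent c is negative, a larger cyclic strain range always maps to a shorter predicted life, which is what the range and variance analyses then rank across the 25 parameter combinations.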
The Application of Computer in the Quantitative Analysis Chemistry Experiment is chemistry-experiment teaching software, developed in Visual Basic 6.0, based on the quantitative analytical chemistry laboratory content for chemistry majors in higher education. The software automatically processes experimental data, generates test report copies, and evaluates students' experimental results, which addresses the reliability, objectivity and accuracy problems of processing and evaluating students' experimental data and avoids interference from human factors. The software is easy to install and operate, practical, targeted, systematic and stable in operation, so it provides a platform for automatically processing student assessments in the quantitative analytical chemistry laboratory and has high value for wider adoption. The project's technical design is reasonable, its research method is correct, and its data processing results are reliable, reaching the leading domestic level in computer data processing for quantitative analytical chemistry teaching. The project has passed the achievement appraisal of the Gansu Provincial Science & Technology Department.
As of 2020, the issue of user satisfaction has generated a significant amount of interest. We therefore employ a big data approach to explore user satisfaction among Uber users. We develop a research model of user satisfaction by expanding the list of user experience (UX) elements (pragmatic, expectation confirmation, hedonic, and burden) with additional elements, namely risk, cost, promotion, anxiety, sadness, and anger. We then collect 125,768 comments from online reviews of Uber services and perform a sentiment analysis to extract the UX elements. The results of a regression analysis reveal the following: hedonic, promotion, and pragmatic elements significantly and positively affect user satisfaction, while burden, cost, and risk have a substantial negative influence. The influence of expectation confirmation on user satisfaction is not supported. Moreover, sadness, anxiety, and anger are positively related to users' perceived risk; compared with sadness and anxiety, anger plays a more important role in increasing users' perceived burden. Based on these findings, we provide theoretical implications for future UX literature and core suggestions for establishing strategies for Uber and similar services. The proposed big data approach may be utilized in other UX studies in the future.
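The pipeline, lexicon-based sentiment extraction followed by a regression on satisfaction, can be sketched as below; the lexicon, reviews, and ratings are invented for illustration and are not the paper's data or dictionary:

```python
def lexicon_sentiment(text, lexicon):
    """Toy sentiment score: mean polarity of matched lexicon words."""
    words = text.lower().split()
    hits = [lexicon[w] for w in words if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

def ols_slope(x, y):
    """Simple least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Hypothetical hedonic lexicon, reviews, and star ratings (satisfaction proxy).
lex = {"fun": 1.0, "cheap": 0.5, "late": -1.0, "scary": -1.0}
reviews = ["fun and cheap ride", "driver was late", "late and scary trip", "fun trip"]
ratings = [5, 2, 1, 5]
hedonic = [lexicon_sentiment(r, lex) for r in reviews]
slope = ols_slope(hedonic, ratings)
```

A positive slope here corresponds to the paper's finding that the hedonic element positively affects satisfaction; the full study fits a multiple regression over ten UX elements rather than one slope at a time.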
Big data on product sales are an emerging resource for supporting modular product design that meets customers' diverse requirements for product specification combinations. To better facilitate modular design decisions, the correlations among specifications and components that originate from customers' conscious and subconscious preferences can be investigated using big data on product sales. This study proposes a framework, and the associated methods, for supporting modular product design decisions based on correlation analysis of product specifications and components using big sales data. The correlations among the product specifications are determined by analyzing the collected product sales data. By building the relations between the product components and specifications, a matrix measuring the correlation among product components is formed for component clustering. Six rules for supporting modular product design decisions are proposed based on frequency analysis of the specification values per component cluster. A case study of electric vehicles illustrates the application of the proposed method.
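The specification-correlation and clustering idea can be sketched as follows; the sales records and the 0.8 correlation threshold are hypothetical choices for illustration, not the paper's data or rules:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length numeric series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical sales records: each column lists one spec across sold configurations.
# battery_kwh and motor_kw tend to be chosen together; seats varies freely.
sales = {
    "battery_kwh": [40, 60, 60, 80, 40, 80],
    "motor_kw":    [100, 150, 160, 200, 110, 190],
    "seats":       [5, 5, 7, 5, 7, 5],
}
specs = list(sales)
corr = {(a, b): pearson(sales[a], sales[b]) for a in specs for b in specs if a < b}

def cluster(specs, corr, thresh=0.8):
    """Greedily group specs whose |r| with any group member exceeds thresh."""
    groups = []
    for s in specs:
        for g in groups:
            if any(abs(corr.get((min(s, t), max(s, t)), 0.0)) >= thresh for t in g):
                g.append(s)
                break
        else:
            groups.append([s])
    return groups

modules = cluster(specs, corr)
```

Strongly correlated specifications become candidates for one module (here, battery and motor), while weakly correlated ones (seats) stay independent, which is the intuition behind the component-correlation matrix in the paper.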
The clothing industry is booming and clothing prices are more affordable today, so people's requirements for clothing are no longer limited to having enough to wear. To address the main problem that consumers own many clothes but do not know how to match them well, this article designs a matchmaker application (app), the Coordinator app. By analyzing the current situation of existing apparel-matching apps and a typical case, the paper summarizes the advantages and disadvantages of the existing matching apps and, combining this with feedback from a user-needs survey, designs the four main modules required by the Coordinator app, which are elaborated and displayed separately.
Hazard maps are usually prepared for each type of disaster, including seismic hazard maps, flood hazard maps, and landslide hazard maps. However, when members of the general public try to check their own disaster risk, most are not aware of the specific types of disaster, so the first question is which kinds of hazards are important. Information that integrates multiple hazards is not well maintained, and there are few such studies. On the other hand, in Japan a lot of hazard information is released on the Internet. We therefore summarized and assessed hazard data that can be accessed online for shelters (where evacuees live during disasters) and their catchments (areas assigned to each shelter) in Yokohama City, Kanagawa Prefecture. Based on the results, we investigated whether grouping by cluster analysis would allow a multi-hazard assessment. We used six parameters: four natural disaster hazards (seismic, flood, tsunami, and sediment disaster) plus the total population and the senior population. However, since the characteristics of the total population and the senior population were almost the same, only the total population was used in the final examination. The cluster analysis showed that it is appropriate to group the designated evacuation centers in Yokohama City into six groups. In addition, each of the six groups was found to have explainable characteristics, confirming the effectiveness of multi-hazard assessment using cluster analysis; for example, groups could be characterized as "all hazards low", "both flood and seismic hazards high", or "sediment hazards high". In many Japanese cities, disaster prevention measures have been designed in consideration of ground hazards, mainly for earthquake disasters. In this paper, we confirmed the consistency between the multi-hazard evaluation obtained here and an existing ground hazard map, and examined the usefulness of the designated evacuation centers. Finally, the validity was confirmed by comparing this result with ground hazards based on actual measurements from past research: in places where the seismic hazard is large, the two are consistent in that the measured susceptibility to shaking is also large.
In our former work [Catal. Today 174 (2011) 127], 12 heterogeneous catalysts were screened for CO oxidation, and Au-ZnO/Al2O3 was chosen and optimized in terms of the weight loadings of Au and ZnO. The present study follows on to consider the impact of the process parameters (catalyst preparation and reaction conditions) in conjunction with the catalyst composition (weight loadings of Au and ZnO, and the total weight of the catalyst), since optimizing the process parameters simultaneously optimizes the catalyst composition. The optimization target is the reactivity of this important reaction. These factors were first optimized using response surface methodology (RSM) with 25 experiments, yielding the optimum: 100 mg of 1.0%Au-4.1%ZnO/Al2O3 catalyst with 220 °C calcination and 100 °C reduction. After optimization, the main effects and interactions of the five factors were studied using statistical sensitivity analysis (SA). Certain observations from the SA were verified by reaction mechanism, reactivity tests and/or characterization techniques, while others need further investigation.
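An RSM step reduces to least-squares fitting of a second-order polynomial and locating its stationary point. A one-factor sketch with hypothetical calcination-temperature data (the study itself fits five factors jointly):

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_quadratic(t, y):
    """Least-squares fit of y = b0 + b1*t + b2*t^2 via the normal equations."""
    X = [[1.0, ti, ti * ti] for ti in t]
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    return solve3(A, b)

# Hypothetical CO-conversion response vs calcination temperature (degC).
temps = [180.0, 200.0, 220.0, 240.0, 260.0]
conv = [74.0, 86.0, 90.0, 86.0, 74.0]
b0, b1, b2 = fit_quadratic(temps, conv)
t_opt = -b1 / (2.0 * b2)  # stationary point of the fitted response surface
```

With a concave fit (b2 < 0) the stationary point is the predicted optimum, here near 220, echoing the role the 220 °C calcination optimum plays in the paper's five-factor design.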
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. The program runs on standard Windows computers. Many of its functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology.
A systematic approach was presented to develop an empirical model for predicting the ultimate tensile strength of AA5083-H111 aluminum alloy, which is widely used in the shipbuilding industry, from friction stir welding (FSW) process parameters: tool rotational speed, welding speed, and axial force. FSW was carried out using a three-factor, five-level central composite rotatable design with full replication. Response surface methodology (RSM) was applied to develop a linear regression model establishing the relationship between the FSW process parameters and the ultimate tensile strength, and the analysis of variance (ANOVA) technique was used to check the adequacy of the developed model. The FSW process parameters were also optimized using RSM to maximize the ultimate tensile strength. The joint welded at a tool rotational speed of 1000 r/min, a welding speed of 69 mm/min and an axial force of 1.33 t exhibits a higher tensile strength than the other joints.
This study first defines a set of arrangement rules for the perforated holes of a multi-hole orifice (MO) and then presents three critical geometrical parameters that quantify the MO structure: the total number of perforated holes, the equivalent diameter ratio, and the distribution density of the perforated holes. A throttling test apparatus was built for nine test MO plates, which were designed according to orthogonal theory. Experiments were conducted to investigate the effect of the three critical geometrical parameters on the pressure loss coefficient of the test MOs. Results show that the equivalent diameter ratio is the dominant parameter affecting the MO throttling characteristic.
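The two quantities at the core of the study, the equivalent diameter ratio and the pressure loss coefficient, can be computed directly; the orifice geometry and test point below are hypothetical, and the loss-coefficient form is the standard dynamic-pressure normalization, assumed for the sketch:

```python
import math

def equivalent_diameter_ratio(n_holes, d_hole, d_pipe):
    """beta_eq = d_eq / D, where d_eq is the diameter of a single hole
    with the same total open area as the n perforated holes."""
    d_eq = math.sqrt(n_holes) * d_hole
    return d_eq / d_pipe

def loss_coefficient(dp, rho, u):
    """Pressure loss coefficient zeta = dp / (0.5 * rho * u**2)."""
    return dp / (0.5 * rho * u * u)

# Hypothetical test point: 9-hole orifice with 12 mm holes in a 100 mm pipe.
beta = equivalent_diameter_ratio(9, 0.012, 0.100)
zeta = loss_coefficient(dp=42_000.0, rho=998.0, u=2.0)  # Pa, kg/m^3, m/s
```

Sweeping beta across the nine orthogonally designed plates while holding the other two parameters at their array levels is exactly what lets the range analysis identify the equivalent diameter ratio as dominant.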
In network-supported collaborative design, data processing plays a vital role. Much effort has been spent in this area, and many approaches have been proposed. Building on related work, this paper presents an extensible markup language (XML) based strategy for several important data processing problems in network-supported collaborative design, such as representing the Standard for the Exchange of Product model data (STEP) with XML for product information expression, and managing XML documents using a relational database. The paper gives a detailed exposition of how to define the mapping between the XML structure and the relational database structure, and of how XML-QL queries can be translated into structured query language (SQL) queries. Finally, the structure of an XML-based data processing system is presented.
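Shredding XML into relational tables and translating path queries into SQL can be sketched with the standard library; the product fragment and table layout are simplified illustrations, not the paper's STEP mapping or its XML-QL translator:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical STEP-like product fragment expressed in XML.
doc = """<product id="P1">
  <part id="A" name="shaft"><mass unit="kg">1.2</mass></part>
  <part id="B" name="gear"><mass unit="kg">0.4</mass></part>
</product>"""

# Shred the XML elements into one relational table per element type.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE part (product_id TEXT, part_id TEXT, name TEXT, mass REAL)")
root = ET.fromstring(doc)
for part in root.findall("part"):
    conn.execute(
        "INSERT INTO part VALUES (?, ?, ?, ?)",
        (root.get("id"), part.get("id"), part.get("name"), float(part.findtext("mass"))),
    )

# An XML path query like /product/part[mass > 1] becomes a SQL predicate:
heavy = conn.execute("SELECT part_id FROM part WHERE mass > 1").fetchall()
```

The element-to-row mapping carries parent keys (product_id) so joins can reassemble the original hierarchy, which is the crux of translating path expressions into SQL joins and predicates.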
This article attempts to develop a procedure for the simultaneous optimization of several response variables from incomplete multi-response experiments, in which not all of the p responses are recorded on all n experimental units. Two situations are considered: (i) all p responses are recorded on n1 units while a subset of responses is recorded on the remaining n2 units, and (ii) all p responses are recorded on n1 units, one subset of responses is recorded on n2 units, and the remaining subset of responses is recorded on n3 units. A procedure for estimating the parameters of linear multi-response models from incomplete multi-response experiments has been developed for both situations, and the parameter estimates are shown to be consistent and asymptotically unbiased. Using these parameter estimates, simultaneous optimization of incomplete multi-response experiments is attempted following the generalized distance criterion [1]. For the implementation of these procedures, SAS codes have been developed for both complete (k ≤ 5, p = 5) and incomplete (k ≤ 5, p1 = 2, 3 and p2 = 2, 3, where k is the number of factors) multi-response experiments. The procedure developed is illustrated with the help of a real data set.
Objective The evaluation of medical equipment's economic benefit is based on equipment usage, but the traditional data collection method is time-consuming, laborious and not entirely accurate. Here, the usage of medical equipment is obtained by designing data query statements against the HIS (hospital information system). Methods First, the charging items are matched with the names of the devices involved; second, fees and other relevant data are extracted from the charging module of the HIS. Through a rough estimate of the recovery period and of the rate of increase or decrease, the economic benefit of the medical equipment can be analyzed. Results With this benefit analysis method, the differing economic benefits of the equipment can be clearly identified and the underlying reasons analyzed. Conclusion Practice has proved that this method can greatly reduce the human and material resources required for data collection and improve the accuracy of the data. It can help hospital managers keep timely track of the operating costs of medical equipment and related information, and it also provides scientific data for hospital managers when they purchase medical equipment.
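The HIS query idea, joining charge records to devices and aggregating usage and income, can be sketched with an in-memory database; the schema and figures are hypothetical stand-ins, not the hospital's actual HIS layout:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hypothetical two-table stand-in for the HIS charging module.
CREATE TABLE charge_item (item_id INTEGER, device TEXT, fee REAL);
CREATE TABLE charge_log (item_id INTEGER, charged_on TEXT);
INSERT INTO charge_item VALUES (1, 'CT scanner', 300.0), (2, 'MRI', 800.0);
INSERT INTO charge_log VALUES (1, '2023-01-02'), (1, '2023-01-03'), (2, '2023-01-02');
""")

# Usage count and income per device: the inputs to the benefit analysis.
rows = conn.execute("""
    SELECT i.device, COUNT(*) AS uses, SUM(i.fee) AS income
    FROM charge_log l JOIN charge_item i ON l.item_id = i.item_id
    GROUP BY i.device ORDER BY income DESC
""").fetchall()

def payback_years(price, annual_income, annual_cost):
    """Rough recovery period: purchase price over net annual income."""
    return price / (annual_income - annual_cost)
```

Once usage and income come straight from the charging tables, the recovery-period estimate needs no manual logging, which is the labour saving the abstract describes.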
Quality assessment and prediction have become among the most critical requirements for improving the reliability, efficiency and safety of laser welding, and an accurate, efficient model for non-destructive quality estimation is an essential part of this assessment. This paper presents a structured and comprehensive approach to designing an effective artificial neural network based model for weld bead geometry prediction and control in laser welding of galvanized steel in butt joint configurations. The proposed approach examines the laser welding parameters and conditions known to influence the geometric characteristics of the welds and builds a weld quality prediction model step by step. The modelling procedure begins by examining, through structured experimental investigations and exhaustive 3D modelling and simulation efforts, the direct and interaction effects of laser welding parameters such as laser power, welding speed, fibre diameter and gap on the weld bead geometry (i.e. depth of penetration and bead width). Using these results and various statistical tools, several neural network based prediction models are developed and evaluated. The results demonstrate that the proposed approach can effectively lead to a consistent model able to accurately and reliably predict weld bead geometry under variable welding conditions.
Laser surface transformation hardening has become one of the most modern processes used to improve the fatigue and wear properties of steel surfaces. In this process, the material properties and the heating parameters are the factors with the most significant effects on the hardened surface attributes, and controlling these factors using predictive modelling approaches to achieve the desired surface properties leads to conclusive results. However, when the dimensions of the surface to be treated are larger than the cross-section of the laser beam, various laser-scanning patterns are involved. This paper presents an experimental investigation of laser surface hardening of AISI 4340 steel using different laser scanning patterns, based on a structured experimental design using the Taguchi method and improved statistical analysis tools. Experiments are carried out using a 3 kW Nd:YAG laser source to evaluate the effects of the heating parameters and pattern design parameters on the physical and geometrical characteristics of the hardened surface. Laser power, scanning speed and scanning pattern (linear, sinusoidal, triangular and trochoid) are the factors used to evaluate the variations in hardened depth and hardened width and to identify possible relationships between these factors and the hardened zone attributes. Various statistical tools such as ANOVA, correlation analysis and response surfaces are applied to examine the effects of the experimental factors on the hardened surface characteristics. The results reveal that the scanning patterns do not modify the nature of the laser parameters' effects on the hardened depth and the hardened width, but they can accentuate or reduce these effects depending on the pattern considered. The results also show that the sinusoidal and triangular patterns are relevant when a maximum hardened width with an acceptable hardened depth is desired.
This paper discusses Bayesian variable selection methods for models from split-plot mixture designs, using samples drawn by Metropolis-Hastings within the Gibbs sampling algorithm. Bayesian variable selection is easy to implement thanks to the improvement in computing via MCMC sampling. We describe the Bayesian methodology by introducing the Bayesian framework and explaining Markov chain Monte Carlo (MCMC) sampling; Metropolis-Hastings within Gibbs sampling is used to draw dependent samples from the full conditional distributions, which are derived. In mixture experiments with process variables, the response depends not only on the proportions of the mixture components but also on the effects of the process variables. In many such mixture-process variable experiments, constraints such as time or cost prohibit selecting treatments completely at random: restrictions on the randomisation force the level combinations of one group of factors to be fixed while the combinations of the other group of factors are run; then a new level of the first factor group is set and the combinations of the other factors are run again. We discuss the computational algorithm for Stochastic Search Variable Selection (SSVS) in linear mixed models and extend it to fit models from split-plot mixture designs, yielding the Stochastic Search Variable Selection for Split-plot Design (SSVS-SPD) algorithm. The motivation for this extension is that a split-plot mixture design has two different levels of experimental units, one for the whole plots and the other for the subplots.
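The Metropolis-Hastings-within-Gibbs building block, updating one coordinate at a time with a random-walk proposal, can be sketched on a toy two-parameter posterior; this is the sampler mechanics only, not the SSVS-SPD model itself, and the target below is invented for illustration:

```python
import math
import random

def mh_within_gibbs(logpost, x0, n_iter=5000, step=0.8, seed=42):
    """Metropolis-Hastings within Gibbs: sweep the coordinates, proposing
    a random-walk move for each and accepting with the usual MH ratio."""
    rng = random.Random(seed)
    x = list(x0)
    samples = []
    for _ in range(n_iter):
        for j in range(len(x)):
            prop = x[:]
            prop[j] += rng.gauss(0.0, step)
            # Accept with probability min(1, post(prop) / post(x)).
            if math.log(rng.random() + 1e-300) < logpost(prop) - logpost(x):
                x = prop
        samples.append(tuple(x))
    return samples

# Toy posterior: independent N(1, 1) and N(-2, 0.5^2) coordinates, standing
# in for one whole-plot effect and one subplot effect.
def logpost(t):
    return -0.5 * (t[0] - 1.0) ** 2 - 0.5 * ((t[1] + 2.0) / 0.5) ** 2

draws = mh_within_gibbs(logpost, [0.0, 0.0])
burn = draws[1000:]
mean0 = sum(d[0] for d in burn) / len(burn)
mean1 = sum(d[1] for d in burn) / len(burn)
```

In the full SSVS-SPD setting the same sweep structure runs over regression coefficients, inclusion indicators, and the two variance components for whole plots and subplots; each full conditional simply replaces the toy logpost.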
To address various fisheries science problems around Japan, the Japan Fisheries Research and Education Agency (FRA) has developed an ocean forecast system by combining an ocean circulation model based on the Regional Ocean Modeling System (ROMS) with three-dimensional variational analysis schemes. This system, which is called FRA-ROMS, is a basic and essential tool for the systematic conduct of fisheries science. The main aim of FRA-ROMS is to realistically simulate mesoscale variations over the Kuroshio-Oyashio region. Here, in situ oceanographic and satellite data were assimilated into FRA-ROMS using a weekly time window. We first examined the reproducibility through comparison with several oceanographic datasets in an Eulerian reference frame. FRA-ROMS was able to reproduce representative features of mesoscale variations such as the position of the Kuroshio path, the variability of the Kuroshio Extension, and southward intrusions of the Oyashio. Second, using a Lagrangian reference frame, we estimated position errors between ocean drifters and particles passively transported by the simulated currents, because particle tracking is an essential technique in applying reanalysis products to fisheries science. Finally, we summarize recent and ongoing fisheries studies that use FRA-ROMS and mention several new developments and enhancements that will be implemented in the near future.
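The particle-tracking step can be sketched with forward-Euler advection in a prescribed velocity field; the solid-body rotation below is an idealized stand-in for FRA-ROMS currents, with the exact rotation serving as the "drifter" truth:

```python
import math

def advect(pos, velocity, dt, steps):
    """Forward-Euler particle tracking in a 2-D velocity field."""
    x, y = pos
    for _ in range(steps):
        u, v = velocity(x, y)
        x += u * dt
        y += v * dt
    return x, y

# Idealized solid-body rotation (angular rate omega) as the current field.
def rotation(x, y, omega=0.1):
    return -omega * y, omega * x

start = (1.0, 0.0)
sim_end = advect(start, rotation, dt=0.01, steps=1000)  # 10 time units

# For solid-body rotation the true trajectory is an exact rotation by omega*t.
theta = 0.1 * 10.0
truth = (math.cos(theta), math.sin(theta))
pos_error = math.dist(sim_end, truth)
```

The position error between the advected particle and the analytic path plays the role of the drifter-versus-particle error in the paper's Lagrangian validation; with real reanalysis currents the velocity function would interpolate the gridded FRA-ROMS fields in space and time.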
Funding: supported by the Monash-IITB Academy Scholarship; funded in part by the Australian Research Council (DP190103592).
Abstract: Typically, magnesium alloys have been designed using a so-called hill-climbing approach, with rather incremental advances over the past century. Iterative and incremental alloy design is slow and expensive, but more importantly it does not harness all the data that exists in the field. In this work, a new approach is proposed that utilises data science and provides a detailed understanding of the data that exists in the field of Mg-alloy design to date. First, a consolidated alloy database incorporating 916 datapoints was developed from the literature and experimental work. To analyse the characteristics of the database, alloying and thermomechanical processing effects on mechanical properties were explored via composition-process-property matrices. An unsupervised machine learning (ML) method, clustering, was also applied to the unlabelled data with the aim of revealing potentially useful information in an alloy representation space of low dimensionality. In addition, the alloy database was correlated with thermodynamically stable secondary phases to further understand the relationships between microstructure and mechanical properties. This work not only introduces an invaluable open-source database but also provides, for the first time, data-driven insights that enable future accelerated digital Mg-alloy design.
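The unsupervised clustering step can be illustrated with a minimal k-means sketch in pure Python; the two-feature alloy points, the feature names, and the choice of k = 2 below are invented for illustration and are not drawn from the paper's 916-datapoint database.

```python
import random
import math

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: returns a cluster label for each point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest center by Euclidean distance.
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: math.dist(p, centers[c]))
        # Update step: move each center to the mean of its members.
        for c in range(k):
            members = [p for i, p in enumerate(points) if labels[i] == c]
            if members:
                centers[c] = tuple(sum(x) / len(members) for x in zip(*members))
    return labels

# Hypothetical (Al wt.%, Zn wt.%) compositions forming two rough groups.
alloys = [(3.0, 1.0), (3.2, 0.9), (2.8, 1.1), (9.0, 0.7), (8.8, 0.8), (9.2, 0.6)]
labels = kmeans(alloys, k=2)
```

In a real alloy representation space the feature vectors would be higher-dimensional (compositions plus processing descriptors), which is why the paper pairs clustering with dimensionality reduction.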
Funding: supported by the National Science & Technology Pillar Program of China (Grant No. 2014BAB08B01), the National Natural Science Foundation of China (Grant No. 51409123), the Jiangsu Provincial Natural Science Foundation of China (Grant No. BK20140554), and the Training Project for Young Core Teachers of Jiangsu University, China.
Abstract: Cavitation is one of the most important performance characteristics of centrifugal pumps. However, current optimization work on centrifugal pumps focuses mostly on hydraulic efficiency, which may result in poor cavitation performance; it is therefore necessary to find an appropriate way to improve cavitation performance while keeping acceptable efficiency. In this paper, to improve the cavitation performance of a centrifugal pump with a vaned diffuser, the influence of impeller geometric parameters on the cavitation of the pump is investigated using an orthogonal design of experiment (DOE) based on computational fluid dynamics. The impeller inlet diameter D1, inlet incidence angle Δβ, and blade wrap angle φ are selected as the main impeller geometric parameters, and an orthogonal experiment of L9(3^3) is performed. Three-dimensional steady simulations of cavitation are conducted using a constant gas mass fraction model with second-order upwind discretization, and the predicted cavitation performance is validated by laboratory experiment. The optimization results are obtained by the range analysis method to improve cavitation performance without an obvious decrease in the efficiency of the centrifugal pump. The internal flow of the pump is analyzed in order to identify the flow behavior that affects cavitation performance. The results show that D1 has the greatest influence on pump cavitation and that the final optimized impeller provides a better flow distribution at the blade leading edge. The optimized impeller achieves better cavitation and hydraulic performance, with the NPSHR decreased by 0.63 m compared with the original design. The presented work supplies a feasible route in engineering practice for optimizing a centrifugal pump impeller for better cavitation performance.
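The range-analysis step over an L9(3^3) orthogonal array can be sketched as follows. The array is the standard nine-run, three-factor, three-level layout; the response values are made-up stand-ins (e.g. a simulated NPSHR in metres), not results from the paper.

```python
# Standard L9 orthogonal array: 9 runs, 3 factors, 3 levels (coded 0, 1, 2).
L9 = [
    (0, 0, 0), (0, 1, 1), (0, 2, 2),
    (1, 0, 1), (1, 1, 2), (1, 2, 0),
    (2, 0, 2), (2, 1, 0), (2, 2, 1),
]

# Hypothetical response for each run (e.g. simulated NPSHR in metres).
response = [3.1, 2.9, 2.8, 2.6, 2.5, 3.0, 2.4, 3.2, 2.7]

def range_analysis(array, y, n_levels=3):
    """For each factor, average y per level and take the max-min spread.
    A larger range means that factor has a stronger effect on y."""
    ranges = []
    for factor in range(len(array[0])):
        means = []
        for level in range(n_levels):
            vals = [y[i] for i, run in enumerate(array) if run[factor] == level]
            means.append(sum(vals) / len(vals))
        ranges.append(max(means) - min(means))
    return ranges

ranges = range_analysis(L9, response)
dominant = ranges.index(max(ranges))  # index of the most influential factor
```

Because each level of each factor appears in exactly three runs, the per-level means are balanced, which is what makes this simple spread comparison meaningful.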
Funding: supported by the Science Foundation of Guangxi Zhuang Autonomous Region (Contract No. 02336060).
Abstract: Four process parameters, pad diameter, stencil thickness, ball diameter and stand-off, were chosen as control factors. Using an L25(5^6) orthogonal array, ceramic ball grid array (CBGA) solder joints with 25 different combinations of process parameters were designed. Numerical models of all 25 CBGA solder joints were developed using Surface Evolver. Utilizing the surface coordinates exported from the 25 numerical models, finite element analysis models were set up, and nonlinear finite element analyses of the CBGA solder joints under thermal cycling were performed in ANSYS. The thermal fatigue life of each CBGA solder joint was calculated using the Coffin-Manson equation. Based on the calculated thermal fatigue life results, range analysis and variance analysis were performed. The results show that the fatigue life of a CBGA solder joint is affected by the pad diameter, the stencil thickness, the ball diameter and the stand-off, in descending order; the combination of process parameters giving the longest fatigue life is a stand-off of 0.07 mm, a stencil thickness of 0.125 mm, a ball diameter of 0.85 mm and a pad diameter of 0.89 mm. With 95% confidence, the pad diameter has a significant effect on the reliability of CBGA solder joints, whereas the stand-off, the stencil thickness and the ball diameter have little effect.
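The Coffin-Manson life calculation referred to above can be sketched as follows; the fatigue ductility coefficient and exponent below are illustrative placeholder values, not the constants used for the CBGA joints in the paper.

```python
def coffin_manson_life(delta_eps_p, eps_f=0.325, c=-0.5):
    """Coffin-Manson low-cycle fatigue relation:
        delta_eps_p / 2 = eps_f * (2 * N_f) ** c
    solved for N_f, the mean number of cycles to failure.
    eps_f (fatigue ductility coefficient) and c (fatigue ductility
    exponent) are material constants; the defaults are placeholders."""
    return 0.5 * (delta_eps_p / (2.0 * eps_f)) ** (1.0 / c)

# Hypothetical plastic strain range per thermal cycle for one joint design.
n_f = coffin_manson_life(0.01)
```

In a thermal-cycling FEA workflow, `delta_eps_p` would be taken from the stabilized hysteresis loop of the critical solder element; a larger strain range always yields a shorter predicted life.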
Abstract: The application of the computer in the quantitative analytical chemistry experiment is a teaching application developed in Visual Basic 6.0, based on the content of the quantitative analytical chemistry experiments for chemistry majors in higher institutions. The software automatically processes experimental data, generates test report copies, and evaluates students' experimental results, which addresses the reliability, objectivity and accuracy of processing and evaluating students' experimental data and avoids interference from human factors. The software is easy to install and operate, practical, targeted, systematic and stable in operation, so it provides a platform for automatic processing within the student assessment system for the quantitative analytical chemistry experiment and has high popularization value. The project's technical route is reasonably designed, the research method is correct, and the experimental data processing results are reliable, reaching the leading domestic level in computer data processing for quantitative analytical chemistry experiment teaching. The project has passed the achievement appraisal of the Gansu Provincial Science & Technology Department.
Funding: supported by a National Research Foundation of Korea (NRF) (http://nrf.re.kr/eng/index) grant funded by the Korean government (NRF-2020R1A2C1014957).
Abstract: As of 2020, the issue of user satisfaction has generated a significant amount of interest. Therefore, we employ a big data approach for exploring user satisfaction among Uber users. We develop a research model of user satisfaction by expanding the list of user experience (UX) elements (i.e., pragmatic, expectation confirmation, hedonic, and burden) with additional elements, namely: risk, cost, promotion, anxiety, sadness, and anger. Subsequently, we collect 125,768 comments from online reviews of Uber services and perform a sentiment analysis to extract the UX elements. The results of a regression analysis reveal the following: hedonic, promotion, and pragmatic significantly and positively affect user satisfaction, while burden, cost, and risk have a substantial negative influence. However, the influence of expectation confirmation on user satisfaction is not supported. Moreover, sadness, anxiety, and anger are positively related to the perceived risk of users. Compared with sadness and anxiety, anger has a more important role in increasing the perceived burden of users. Based on these findings, we also provide theoretical implications for future UX literature and core suggestions for establishing strategies for Uber and similar services. The proposed big data approach may be utilized in other UX studies in the future.
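The regression step linking extracted UX scores to satisfaction can be sketched with ordinary least squares solved from the normal equations; the per-review scores, the two predictors shown (hedonic, burden), and the coefficients recovered below are synthetic, invented purely to illustrate the mechanics.

```python
def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved with Gaussian elimination. Each row of X starts with a 1
    so the first coefficient is the intercept."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):  # forward elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * p for a, p in zip(A[r], A[col])]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

# Hypothetical per-review rows: [1, hedonic score, burden score].
X = [[1, 0.2, 0.7], [1, 0.9, 0.1], [1, 0.5, 0.5], [1, 0.8, 0.3], [1, 0.1, 0.9]]
y = [0.5 + 2.0 * h - 1.0 * b for _, h, b in X]  # noise-free synthetic target
intercept, b_hedonic, b_burden = fit_ols(X, y)
```

With the synthetic target built as an exact linear function, the fitted coefficients recover the generating values (positive for hedonic, negative for burden), mirroring the sign pattern the paper reports.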
Funding: supported by the National Key R&D Program of China (Grant No. 2018YFB1701701), the Sailing Talent Program, the Guangdong Provincial Science and Technology Program of China (Grant No. 2017B090922008), and a Special Grand Grant from the Tianjin City Government of China.
Abstract: Big data on product sales are an emerging resource for supporting modular product design to meet diversified customers' requirements for product specification combinations. To better facilitate decision-making in modular product design, correlations among specifications and components originating from customers' conscious and subconscious preferences can be investigated using big data on product sales. This study proposes a framework and associated methods for supporting modular product design decisions based on correlation analysis of product specifications and components using big sales data. The correlations of the product specifications are determined by analyzing the collected product sales data. By building the relations between the product components and specifications, a matrix for measuring the correlation among product components is formed for component clustering. Six rules for supporting the decision making of modular product design are proposed based on frequency analysis of the specification values per component cluster. A case study of electric vehicles illustrates the application of the proposed method.
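The specification-correlation step can be sketched with a plain Pearson correlation over per-sale records; the electric-vehicle specification values below are invented for illustration and do not come from the paper's case study.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-sale specification values across six EV sales records:
# battery capacity (kWh) and driving range (km).
capacity = [40, 60, 75, 40, 60, 75]
rng_km = [250, 380, 460, 260, 370, 470]
r = pearson(capacity, rng_km)
```

Computing such coefficients for every specification pair yields the correlation matrix that, combined with the component-specification relations, drives the component clustering described above.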
Abstract: The clothing industry is booming, and clothing prices are more affordable today. People's requirements for clothing are no longer limited to having enough clothes to wear. To solve the main problem that consumers own many clothes but do not know how to match them well, this article presents the design of an outfit-matching application: the Coordinator app. By analyzing the current state of existing apparel-matching apps and a typical case, the paper summarizes their advantages and disadvantages and, combined with feedback from a user needs survey, designs the four main modules required by the Coordinator app, which are elaborated and displayed separately.
Abstract: Hazard maps are usually prepared for each type of disaster, including seismic hazard maps, flood hazard maps, and landslide hazard maps. However, when members of the general public try to check their own disaster risk, most are not aware of the specific types of disaster, so first of all they need to know which kinds of hazards are important. Information that integrates multiple hazards is not well maintained, and there are few such studies. On the other hand, in Japan a great deal of hazard information is released on the Internet. We therefore summarized and assessed hazard data that can be accessed online regarding shelters (where evacuees live during disasters) and their catchments (areas assigned to each shelter) in Yokohama City, Kanagawa Prefecture. Based on the results, we investigated whether grouping by cluster analysis would allow a multi-hazard assessment. We used parameters for four natural disasters (seismic, flood, tsunami, and sediment disaster) together with total population and senior population; however, since the characteristics of the total and senior populations were almost the same, only the total population data were used in the final examination. The cluster analysis showed that it is appropriate to group the designated evacuation centers in Yokohama City into six groups. Each of the six groups was found to have explainable characteristics (for example: all hazards low; both flood and seismic hazards high; sediment hazards high), confirming the effectiveness of multi-hazard assessment using cluster analysis. In many Japanese cities, disaster prevention measures have been designed with ground hazards in mind, mainly for earthquake disasters. In this paper, we confirmed the consistency between our multi-hazard evaluation results and the existing ground hazard map and examined the usefulness of the designated evacuation centers. Finally, the validity was confirmed by comparing this result with the measurement-based ground hazard from previous research: in places where the seismic hazard is large, the measured susceptibility to shaking is also large, so the two are consistent.
Funding: supported by the Singapore AcRF Tier 1 Grant (RG 19/09) and the A*STAR SERC Grant (102 101 0020).
Abstract: In our former work [Catal. Today 174 (2011) 127], 12 heterogeneous catalysts were screened for CO oxidation, and Au-ZnO/Al2O3 was chosen and optimized in terms of the weight loadings of Au and ZnO. The present study follows on to consider the impact of process parameters (catalyst preparation and reaction conditions) in conjunction with catalyst composition (weight loadings of Au and ZnO, and the total weight of the catalyst), since optimizing the process parameters simultaneously optimizes the catalyst composition. The optimization target is the reactivity of this important reaction. These factors were first optimized using response surface methodology (RSM) with 25 experiments, yielding the optimum: 100 mg of 1.0%Au-4.1%ZnO/Al2O3 catalyst with 220 °C calcination and 100 °C reduction. After optimization, the main effects and interactions of the five factors were studied using statistical sensitivity analysis (SA). Certain observations from the SA were verified by reaction mechanism, reactivity tests and/or characterization techniques, while others need further investigation.
Abstract: A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. The program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology.
Abstract: A systematic approach is presented to develop an empirical model for predicting the ultimate tensile strength of AA5083-H111 aluminum alloy, which is widely used in the shipbuilding industry, by incorporating friction stir welding (FSW) process parameters such as tool rotational speed, welding speed, and axial force. FSW was carried out using a three-factor, five-level central composite rotatable design with full replication. Response surface methodology (RSM) was applied to develop a linear regression model establishing the relationship between the FSW process parameters and the ultimate tensile strength. Analysis of variance (ANOVA) was used to check the adequacy of the developed model. The FSW process parameters were also optimized using RSM to maximize the ultimate tensile strength. The joint welded at a tool rotational speed of 1000 r/min, a welding speed of 69 mm/min and an axial force of 1.33 t exhibits higher tensile strength than the other joints.
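The coded points of a three-factor central composite rotatable design of the kind described above can be generated as follows. The five levels per factor are -α, -1, 0, +1, +α with α = (2^k)^(1/4) for rotatability; showing a single centre point (rather than the replicated centre runs a real run table would contain) is a simplification.

```python
import itertools

def ccd_rotatable(k):
    """Coded design points of a central composite rotatable design for
    k factors: 2**k factorial corners, 2k axial (star) points at
    alpha = (2**k) ** 0.25, plus one centre point."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = sign
            axial.append(pt)
    return corners + axial + [[0.0] * k]

# Three FSW factors: tool rotational speed, welding speed, axial force.
points = ccd_rotatable(3)
```

For k = 3 this gives 8 + 6 + 1 = 15 distinct coded points with α ≈ 1.682; each coded level is then mapped back to a physical setting (e.g. rotational speed in r/min) before welding the trial joints.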
Funding: sponsored by the National Natural Science Foundation of China (Grant No. 50578049).
Abstract: This study first defines a set of arrangement rules for the perforated holes of a multi-hole orifice (MO) and then presents three critical geometrical parameters, the total number of perforated holes, the equivalent diameter ratio, and the distribution density of perforated holes, to quantify MO structure. A throttling test apparatus was built for nine test MO plates designed according to orthogonal theory. Experiments were conducted to investigate the effect of the three critical geometrical parameters on the pressure loss coefficient of the test MOs. Results show that the equivalent diameter ratio is the dominant parameter affecting the MO throttling characteristics.
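The pressure loss coefficient studied above is conventionally defined as ζ = Δp / (½ρv²); a one-line sketch of that definition with invented readings (the paper's own measured values are not reproduced here):

```python
def loss_coefficient(delta_p, rho, v):
    """Dimensionless pressure loss coefficient: zeta = dp / (0.5 * rho * v**2),
    with delta_p in Pa, rho in kg/m^3, and v the reference velocity in m/s."""
    return delta_p / (0.5 * rho * v * v)

# Hypothetical reading: 52 kPa drop across the orifice, water at 2 m/s.
zeta = loss_coefficient(52_000.0, 998.0, 2.0)
```

Comparing ζ across the nine orthogonally designed plates is what allows the range analysis to rank the three geometrical parameters.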
Funding: supported by the National High Technology Research and Development Program of China (863 Program) (No. AA420060).
Abstract: In network-supported collaborative design, data processing plays a vital role. Much effort has been spent in this area, and many kinds of approaches have been proposed. Based on the relevant literature, this paper presents an extensible markup language (XML) based strategy for several important data processing problems in network-supported collaborative design, such as representing the standard for the exchange of product model data (STEP) with XML for product information expression, and managing XML documents using a relational database. The paper gives a detailed exposition of how to clarify the mapping between the XML structure and the relational database structure and how XML-QL queries can be translated into structured query language (SQL) queries. Finally, the structure of an XML-based data processing system is presented.
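The XML-to-relational mapping and the query-translation idea can be sketched with Python's standard library; the miniature product fragment, the table layout, and the example query are all hypothetical, and sqlite3 stands in for whichever relational database a real system would use.

```python
import sqlite3
import xml.etree.ElementTree as ET

# A miniature STEP-like product fragment expressed in XML (illustrative only).
XML_DOC = """
<product id="P1">
  <part id="A" name="shaft"><mass>1.2</mass></part>
  <part id="B" name="gear"><mass>0.8</mass></part>
</product>
"""

# Map the XML structure onto a relational table: one row per <part> element,
# with the enclosing <product> id carried along as a foreign key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE part (product_id TEXT, part_id TEXT, name TEXT, mass REAL)")
root = ET.fromstring(XML_DOC)
for part in root.findall("part"):
    conn.execute(
        "INSERT INTO part VALUES (?, ?, ?, ?)",
        (root.get("id"), part.get("id"), part.get("name"), float(part.findtext("mass"))),
    )

# A query over the XML ("parts of product P1 heavier than 1 kg") translated
# into SQL against the relational shadow of the document:
heavy = conn.execute(
    "SELECT part_id FROM part WHERE product_id = ? AND mass > 1.0", ("P1",)
).fetchall()
```

The same shredding pattern (element type → table, attribute/child text → column, nesting → foreign key) is the structural core of translating path-style XML queries into SQL joins and predicates.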
Abstract: This article attempts to develop a simultaneous optimization procedure for several response variables from incomplete multi-response experiments, in which not all of the p responses are recorded on all n experimental units. Two situations are considered: (i) all responses are recorded on some of the units while only a subset of the responses is recorded on the remaining units, and (ii) all p responses are recorded on one group of units, one subset of responses is recorded on a second group, and the remaining subset of responses is recorded on a third group. The procedure for estimating the parameters of linear multi-response models from incomplete multi-response experiments has been developed for both situations, and the parameter estimates are shown to be consistent and asymptotically unbiased. Using these parameter estimates, simultaneous optimization of incomplete multi-response experiments is attempted following the generalized distance criterion [1]. For the implementation of these procedures, SAS codes have been developed for both complete (k ≤ 5, p = 5) and incomplete (k ≤ 5, p1 = 2, 3 and p2 = 2, 3, where k is the number of factors) multi-response experiments. The procedure developed is illustrated with the help of a real data set.
Abstract: Objective The evaluation of a medical device's economic benefit is based on its usage, but the traditional data collection method is time-consuming, laborious and not entirely accurate; here, device usage is instead obtained by designing data query statements against the HIS system. Methods First, the charging items are matched with the names of the devices involved; second, fees and other relevant data are extracted from the charging module in the HIS. Through a rough estimate of the payback period and the rate of increase or decrease, the economic benefit of the medical equipment can be analyzed. Results With this benefit analysis method, the differing economic benefits of the equipment can be clearly identified and the underlying reasons analyzed. Conclusion Practice has proved that this method can greatly reduce the human and material resources required for data collection and improve the accuracy of the data. It can help hospital managers grasp the operating costs of medical equipment and related information in a timely manner, and it also provides scientific data for hospital managers when making reasonable medical equipment purchases.
Abstract: Quality assessment and prediction have become among the most critical requirements for improving the reliability, efficiency and safety of laser welding. An accurate and efficient model for non-destructive quality estimation is an essential part of this assessment. This paper presents a structured and comprehensive approach developed to design an effective artificial neural network based model for weld bead geometry prediction and control in laser welding of galvanized steel in butt joint configurations. The proposed approach examines laser welding parameters and conditions known to influence the geometric characteristics of the welds and builds a weld quality prediction model step by step. The modelling procedure begins by examining, through structured experimental investigations and exhaustive 3D modelling and simulation efforts, the direct and interaction effects of laser welding parameters such as laser power, welding speed, fibre diameter and gap on the weld bead geometry (i.e. depth of penetration and bead width). Using these results and various statistical tools, several neural network based prediction models are developed and evaluated. The results demonstrate that the proposed approach can effectively lead to a consistent model able to accurately and reliably provide an appropriate prediction of weld bead geometry under variable welding conditions.
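A neural-network prediction model of the kind described can be sketched as a tiny one-hidden-layer network trained by plain stochastic gradient descent. The normalized (laser power, welding speed) → penetration depth samples and the network size below are invented for illustration and are far smaller than anything the authors would actually have trained.

```python
import math
import random

def train_mlp(data, hidden=4, lr=0.2, epochs=2000, seed=1):
    """Tiny 2-input, one-hidden-layer tanh network with a linear output,
    trained by per-sample gradient descent on squared error."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for (x1, x2), t in data:
            h = [math.tanh(w[0] * x1 + w[1] * x2 + b) for w, b in zip(w1, b1)]
            y = sum(wo * hi for wo, hi in zip(w2, h)) + b2
            err = y - t
            for j in range(hidden):
                g = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                b1[j] -= lr * g
                w1[j][0] -= lr * g * x1
                w1[j][1] -= lr * g * x2
            b2 -= lr * err

    def predict(x1, x2):
        h = [math.tanh(w[0] * x1 + w[1] * x2 + b) for w, b in zip(w1, b1)]
        return sum(wo * hi for wo, hi in zip(w2, h)) + b2

    return predict

# Hypothetical normalized (laser power, welding speed) -> depth samples.
samples = [((0.1, 0.9), 0.15), ((0.5, 0.5), 0.5), ((0.9, 0.1), 0.85),
           ((0.3, 0.7), 0.3), ((0.7, 0.3), 0.7)]
predict = train_mlp(samples)
```

The same forward/backward pattern scales to the multi-input, multi-output case (predicting both depth of penetration and bead width from power, speed, fibre diameter and gap), where a library implementation would normally be used instead.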
Abstract: Laser surface transformation hardening has become one of the most modern processes used to improve the fatigue and wear properties of steel surfaces. In this process, the material properties and the heating parameters are the factors with the most significant effects on the hardened surface attributes. Controlling these factors using predictive modeling approaches to achieve desired surface properties leads to conclusive results. However, when the dimensions of the surface to be treated are larger than the cross-section of the laser beam, various laser-scanning patterns are involved. This paper presents an experimental investigation of laser surface hardening of AISI 4340 steel using different laser scanning patterns. The investigation is based on a structured experimental design using the Taguchi method and improved statistical analysis tools. Experiments are carried out using a 3 kW Nd:YAG laser source in order to evaluate the effects of the heating parameters and pattern design parameters on the physical and geometrical characteristics of the hardened surface. Laser power, scanning speed and scanning pattern (linear, sinusoidal, triangular and trochoid) are the factors used to evaluate the hardened depth and hardened width variations and to identify possible relationships between these factors and the hardened zone attributes. Various statistical tools such as ANOVA, correlation analysis and response surfaces are applied in order to examine the effects of the experimental factors on the hardened surface characteristics. The results reveal that the scanning patterns do not modify the nature of the laser parameters' effects on the hardened depth and the hardened width, but they can accentuate or reduce these effects depending on the pattern considered. The results also show that the sinusoidal and triangular patterns are relevant when a maximum hardened width with an acceptable hardened depth is desired.
Abstract: This paper discusses Bayesian variable selection methods for models from split-plot mixture designs using samples from Metropolis-Hastings within the Gibbs sampling algorithm. Bayesian variable selection is easy to implement thanks to improvements in computing via MCMC sampling. We describe the Bayesian methodology by introducing the Bayesian framework and explaining Markov chain Monte Carlo (MCMC) sampling. Metropolis-Hastings within Gibbs sampling is used to draw dependent samples from the full conditional distributions, which are derived. In mixture experiments with process variables, the response depends not only on the proportions of the mixture components but also on the effects of the process variables. In many such mixture-process variable experiments, constraints such as time or cost prohibit selecting treatments completely at random. In these situations, restrictions on the randomisation force the level combinations of one group of factors to be fixed while the combinations of the other group of factors are run; then a new level of the first factor group is set and the combinations of the other factors are run again. We discuss the computational algorithm for Stochastic Search Variable Selection (SSVS) in linear mixed models and extend it to fit models from split-plot mixture designs, introducing the Stochastic Search Variable Selection for Split-plot Design (SSVS-SPD) algorithm. The motivation for this extension is that the split-plot mixture design has two different levels of experimental units: one for the whole plots and the other for the subplots.
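The Gibbs-sampling machinery itself, drawing in turn from each full conditional distribution (within which Metropolis-Hastings steps can be embedded when a conditional is not available in closed form), can be illustrated on a toy target. The bivariate normal below is a stand-in chosen because its full conditionals are known exactly; it is not the SSVS-SPD posterior from the paper.

```python
import random

def gibbs_bivariate_normal(rho, n_samples=20000, burn=1000, seed=3):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional is x | y ~ N(rho * y, 1 - rho**2), and vice versa."""
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    draws = []
    for i in range(n_samples + burn):
        x = rng.gauss(rho * y, sd)  # draw x from its full conditional
        y = rng.gauss(rho * x, sd)  # then y given the freshly drawn x
        if i >= burn:
            draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8)

# The sample correlation of the retained draws should approach rho.
n = len(draws)
mx = sum(x for x, _ in draws) / n
my = sum(y for _, y in draws) / n
cov = sum((x - mx) * (y - my) for x, y in draws) / n
vx = sum((x - mx) ** 2 for x, _ in draws) / n
vy = sum((y - my) ** 2 for _, y in draws) / n
corr = cov / (vx * vy) ** 0.5
```

In SSVS the same cycling structure is applied to regression coefficients and their spike-and-slab inclusion indicators, with the split-plot error structure adding a second variance component to the conditionals.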
Abstract: To address various fisheries science problems around Japan, the Japan Fisheries Research and Education Agency (FRA) has developed an ocean forecast system by combining an ocean circulation model based on the Regional Ocean Modeling System (ROMS) with three-dimensional variational analysis schemes. This system, which is called FRA-ROMS, is a basic and essential tool for the systematic conduct of fisheries science. The main aim of FRA-ROMS is to realistically simulate mesoscale variations over the Kuroshio-Oyashio region. Here, in situ oceanographic and satellite data were assimilated into FRA-ROMS using a weekly time window. We first examined the reproducibility through comparison with several oceanographic datasets in an Eulerian reference frame. FRA-ROMS was able to reproduce representative features of mesoscale variations such as the position of the Kuroshio path, variability of the Kuroshio Extension, and southward intrusions of the Oyashio. Second, using a Lagrangian reference frame, we estimated position errors between ocean drifters and particles passively transported by simulated currents, because particle tracking is an essential technique in applications of reanalysis products to fisheries science. Finally, we summarize recent and ongoing fisheries studies that use FRA-ROMS and mention several new developments and enhancements that will be implemented in the near future.
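The Lagrangian position-error evaluation amounts to measuring the great-circle separation between an observed drifter position and the corresponding simulated particle position; a haversine sketch with invented coordinates (the paper's actual drifter tracks are not reproduced here):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points,
    using a mean Earth radius of 6371 km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical positions after some tracking period: an observed drifter
# vs. a particle advected by the simulated currents (coordinates invented).
error_km = haversine_km(35.0, 142.0, 35.4, 143.1)
```

Averaging such separations over many drifter-particle pairs and lead times gives the position-error statistics used to judge how trustworthy the simulated currents are for fisheries applications such as larval transport.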