Buildings with large open spaces in which chemicals are handled are often exposed to the risk of explosions. Computational fluid dynamics (CFD) is a useful and convenient way to investigate contaminant dispersion in such large spaces. The turbulent Schmidt number (Sc_t) is typically used in this context, and most studies have adopted a default value. We studied the concentration distribution of sulfur hexafluoride (SF6) for different emission rates while considering the effect of Sc_t, and then examined the same problem for a light gas, taking hydrogen (H2) as the contaminant. When SF6 was the contaminant gas, a variation in the emission rate completely changed the concentration distribution; when the emission rate was low, the gravitational effect did not come into play. For both low and high emission rates, an increase in Sc_t accelerated the transport of SF6. In contrast, for H2 as the contaminant gas, a larger Sc_t could induce a decrease in the H2 transport rate.
Funding: Funded by the National Natural Science Foundation of China and the Machinery Industry Innovation Platform Construction Project of the China Machinery Industry Federation, Grant Numbers 52378103 and 2019SA-10-07.
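For reference, in RANS-based CFD the turbulent Schmidt number enters the scalar transport equation through the eddy diffusivity. A generic form (not necessarily the exact model used in this study) is:

```latex
\frac{\partial C}{\partial t} + \nabla \cdot (\mathbf{u}\,C)
  = \nabla \cdot \left[ \left( D_m + \frac{\nu_t}{Sc_t} \right) \nabla C \right] + S_C
```

where C is the contaminant concentration, D_m the molecular diffusivity, nu_t the turbulent (eddy) viscosity and S_C the source term; a smaller Sc_t thus implies stronger turbulent mixing of the scalar.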
Corona discharge is a common means of producing non-equilibrium plasma, and high plasma concentrations can generally be obtained by adding discharge points to meet production needs. However, the existing numerical models of multi-point corona discharge are all small-scale space models and cannot capture the distribution characteristics of the plasma in a large space. Building on our previous research, this paper proposes a hybrid model for studying the distribution of multi-point discharge plasma in large-scale spaces: the computational domain is divided, and the sub-domains are computed separately with a hydrodynamic model and an ion mobility model. The simulation results are verified with a needle–ball electrode device. First, the electric field and plasma distributions of needle electrodes with a single tip and with double tips are compared and discussed. Second, the plasma distribution of the double-tip needle electrode at different voltages is investigated. Both the computational and the experimental results indicate that the charged particle concentration and the current of the double-tip needle electrode are twice those of the single-tip needle electrode. This model extends the computational area of the multi-point corona discharge finite element model to the sub-meter (25 cm) or meter level, which provides an effective means of studying the plasma distribution generated by multiple discharge points in large-scale spaces.
Funding: Supported by the National Natural Science Foundation of China (Nos. 52207158 and 51821005), the Fundamental Research Funds for the Central Universities (HUST: No. 2022JYCXJJ012), and the National Key Research and Development Program of China (Nos. 2016YFC0401002 and 2016YFC0401006).
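As context for the two sub-models, a generic drift-diffusion description of charged-species transport, the kind of equation an ion mobility model advances together with Poisson's equation (the authors' exact formulation may differ), is:

```latex
\frac{\partial n_i}{\partial t} + \nabla \cdot \left( n_i \mu_i \mathbf{E} - D_i \nabla n_i \right) = S_i,
\qquad \mathbf{E} = -\nabla \phi, \qquad \nabla^{2} \phi = -\frac{\rho}{\varepsilon_0},
```

where n_i, mu_i and D_i are the number density, mobility and diffusivity of species i, S_i collects ionization and recombination source terms, and rho is the net space-charge density.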
This paper discusses the two-block large-scale nonconvex optimization problem with general linear constraints. Based on the ideas of splitting and sequential quadratic optimization (SQO), a new feasible descent method for this problem is proposed. First, we consider the quadratic optimization (QO) approximation problem associated with the current feasible iterate and split the QO into two small-scale QOs which can be solved in parallel. Second, a feasible descent direction for the problem is obtained and a new SQO-type method is proposed, namely the splitting feasible SQO (SF-SQO) method. Under suitable conditions, we analyse the global convergence, strong convergence and superlinear convergence rate of the SF-SQO method. Finally, preliminary numerical experiments on the economic dispatch of a power system are carried out and show that the SF-SQO method is promising.
Funding: Supported by the National Natural Science Foundation of China (12171106), the Natural Science Foundation of Guangxi Province (2020GXNSFDA238017 and 2018GXNSFFA281007), and the Shanghai Sailing Program (21YF1430300).
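To illustrate the splitting idea in schematic form (an illustration only; the paper's exact subproblems, constraint handling and direction combination differ in detail), consider a two-block problem

```latex
\min_{x,\,y}\ f(x) + g(y) \quad \text{s.t.} \quad Ax + By \le b ,
```

whose quadratic approximation at a feasible iterate (x^k, y^k) can be separated into two small QPs solved in parallel,

```latex
\min_{d_x}\ \nabla f(x^k)^{\top} d_x + \tfrac{1}{2}\, d_x^{\top} H_x\, d_x
\ \ \text{s.t.}\ \ A(x^k + d_x) + B y^k \le b ,
\qquad
\min_{d_y}\ \nabla g(y^k)^{\top} d_y + \tfrac{1}{2}\, d_y^{\top} H_y\, d_y
\ \ \text{s.t.}\ \ A x^k + B(y^k + d_y) \le b ,
```

after which (d_x, d_y) is assembled into a feasible descent direction for a line search.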
The author of this paper previously attempted to propose a unified framework for gauge fields based on the mathematical and physical picture of the principal fiber bundle: the idea is that our universe may have more fundamental interactions than the known four, and that these fundamental gauge fields are only components on the base manifold (i.e. our universe) projected from a unified gauge potential of the principal fiber bundle manifold. These components satisfy the gauge potential transformation law, can even be transformed from the gauge potential of one basic interaction into that of another, and can be summarized in a unified equation, namely the generalized gauge equation, corresponding to gauge transformation invariance. Gauge transformation invariance is therefore a necessary condition for a unified field theory, and the four (or more) fundamental interaction fields of the universe are unified in a single gauge field defined by the connection on the principal fiber bundle. In this paper, the author continues by proposing a model of the large-scale (gravitational) fundamental interactions of the universe based on the same picture, attempting to explain dark matter and dark energy as merely reflections of gravitational fundamental interactions whose intensity deviates, at galactic scales or at cosmic scales much larger than the solar system, from the gravitational interaction of the solar system. All these "gravitational" fundamental interactions originate from the unified gauge field of the universe, namely the connection or curvature on the principal fiber bundle; they are its projected representations on the base manifold (i.e. our universe) through different cross-sections (gauge transformations). These projected representations are described by the generalized gauge equation or curvature similarity equation and, under the guidance of curvature gauge transformation factors, oscillate and evolve between the curvatures 1→0→-1→0→1 of the universe.
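For reference, in one common convention the transformation law of a connection (gauge potential) and of its curvature under a gauge transformation g, which the "generalized gauge equation" mentioned above generalizes (the author's specific equation is not reproduced here), reads:

```latex
A' = g^{-1} A\, g + g^{-1}\,\mathrm{d} g, \qquad
F = \mathrm{d} A + A \wedge A, \qquad
F' = g^{-1} F\, g .
```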
In this paper we report a sparse truncated Newton algorithm for handling large-scale simple-bound nonlinear constrained minimization problems. The truncated Newton method is used to update the variables with indices outside the active set, while the projected gradient method is used to update the active variables. At each iteration, the search direction consists of three parts: a subspace truncated Newton direction, a subspace gradient direction and a modified gradient direction. The subspace truncated Newton direction is obtained by solving a sparse system of linear equations. The global convergence and quadratic convergence rate of the algorithm are proved, and some numerical tests are given.
Funding: The research was supported by the State Education Grant for Returned Scholars.
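A minimal sketch of the bound-constrained machinery referred to above (hypothetical helper names; the paper's actual active-set rules and truncated Newton solve are more elaborate):

```python
import numpy as np

def project(x, lo, hi):
    """Project a point onto the box [lo, hi]."""
    return np.minimum(np.maximum(x, lo), hi)

def active_set(x, grad, lo, hi, tol=1e-8):
    """Indices where a bound is active and the gradient pushes outward."""
    at_lower = (x <= lo + tol) & (grad > 0.0)
    at_upper = (x >= hi - tol) & (grad < 0.0)
    return at_lower | at_upper

# Free variables would receive a (truncated) Newton step computed on the
# subspace; active variables follow a projected-gradient step.
x = np.array([0.0, 0.5, 1.0])
lo, hi = np.zeros(3), np.ones(3)
grad = np.array([0.3, -0.2, 0.1])
mask = active_set(x, grad, lo, hi)
print("active:", mask)                      # active: [ True False False]
print("projected step:", project(x - 0.1 * grad, lo, hi))
```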
We address two difficult problems in the adaptive control of large-scale time-variant aerospace systems. One is parameter identification of a time-variant continuous-time state-space model; the other is how to solve an algebraic Riccati equation (ARE) of large order efficiently. In our approach, two neural networks are employed to independently solve the system identification problem and the ARE associated with the optimal control problem. The identification and the control computation are thus combined in a closed-loop, adaptive, real-time control system. The advantage of this approach is that the neural networks converge to their solutions very quickly and simultaneously.
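For reference, the continuous-time algebraic Riccati equation arising in the standard LQR problem (the generic time-invariant form; the paper's time-variant setting adds structure not shown here) and the corresponding feedback law are:

```latex
A^{\top} P + P A - P B R^{-1} B^{\top} P + Q = 0, \qquad u = -R^{-1} B^{\top} P\, x .
```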
A method to synthesize the anticancer drug N-(4-hydroxyphenyl)retinamide (4-HPR) on a large scale is described. It consists of the preferred steps of reacting all-trans retinoic acid with thionyl chloride to form retinoyl chloride, reacting this with triethylamine to generate a retinoyl ammonium salt, and reacting the salt with p-aminophenol to produce 4-HPR. The process overcomes many of the scale-up challenges of the methods reported in the literature and provides an easy, mild and high-yield route for the large-scale synthesis of 4-HPR. In addition, the effects of the molar ratios of the reagents on the yield are examined. The best molar ratios are 2.0 molar equivalents of thionyl chloride and 3.0 molar equivalents of p-aminophenol relative to retinoic acid, giving a total yield of 80.7%.
The thermal environmental characteristics of a large-space building air conditioned with a low sidewall air supply are studied experimentally for different air supply volumes and outdoor meteorological parameters. The experimental results show that the indoor vertical temperature distributions under different conditions are similar. The maximum vertical temperature difference (MVTD) reaches about 20 ℃ and changes linearly with the sol-air temperature. The indoor vertical temperature gradients (VTGs) in the upper, central and lower zones differ, and the influence of the sol-air temperature on the VTGs in the upper and lower zones is greater than that in the central zone. The VTGs in the three zones respond to the air supply volume in the same way as they respond to the sol-air temperature. Besides, because of the small air velocity, the predicted mean vote (PMV) on comfort in the occupied zone is slightly high, and the air temperature difference between the head and the ankle is usually more than 3 ℃.
Funding: The National Natural Science Foundation of China (No. 50478113) and the Leading Academic Discipline Project of the Shanghai Municipal Education Commission (No. J50502).
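The sol-air temperature referred to above is conventionally defined as follows (a textbook form; the exact expression used in the study may differ slightly):

```latex
T_{\mathrm{sol\text{-}air}} = T_{o} + \frac{\alpha I}{h_{o}} - \frac{\varepsilon\,\Delta R}{h_{o}},
```

where T_o is the outdoor air temperature, I the solar irradiance on the surface, alpha the surface absorptance, h_o the combined outside convective and radiative coefficient, and the last term the long-wave radiation correction.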
A new limited-memory symmetric rank-one algorithm is proposed. It combines a modified self-scaled symmetric rank-one (SSR1) update with limited-memory and nonmonotone line search techniques. In this algorithm, the descent search direction is generated by an inverse limited-memory SSR1 update, which simplifies the computation. A numerical comparison of the algorithm with the well-known limited-memory BFGS algorithm is given; the results indicate that the new algorithm can handle a class of large-scale unconstrained optimization problems.
Funding: The National Natural Science Foundation of China (No. 10471062) and the Natural Science Foundation of Jiangsu Province (BK2006184).
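For reference, the basic (unscaled) symmetric rank-one update that the SSR1 variant modifies is, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k:

```latex
B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k},
\qquad
H_{k+1} = H_k + \frac{(s_k - H_k y_k)(s_k - H_k y_k)^{\top}}{(s_k - H_k y_k)^{\top} y_k},
```

the second form being the corresponding inverse update (H approximates the inverse Hessian), which is the one used when the search direction is built directly.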
The large-scale periodic orbits of a nonlinear mechanical system can represent the homology classes, which are generally non-trivial, of an energy level surface, and the topological properties of an energy level surface are determined by those of the phase space and by the large-scale properties of the Hamiltonian. In this paper, these properties are used to estimate the rank of the first homology group of energy level surfaces.
The end-effector of a large space manipulator is employed to assist the manipulator in handling and manipulating large payloads on orbit. At present there is little research on such end-effectors, and the existing designs have disadvantages such as poor misalignment tolerance and complex mechanical components. Based on the end positioning errors and the residual vibration characteristics of large space manipulators, two basic performance requirements for the end-effector are proposed: misalignment tolerance and soft capture. The end-effector should accommodate the following misalignments of the mechanical interface: translational misalignments of ±100 mm in the axial direction and 100 mm in the radial direction, and angular misalignments of ±10° in roll and ±15° in pitch and yaw. Seven end-effector schemes are presented, and their misalignment tolerance and soft capture capabilities are analyzed preliminarily. The three fingers-three petals end-effector and the steel cable-snared end-effector are the most feasible of the seven schemes, and they are designed in detail. Their misalignment tolerance and soft capture capabilities are validated and evaluated through experiments on a micro-gravity simulating device and dynamic analysis in the ADAMS software. The results show that the misalignment tolerance of both schemes satisfies the requirement; the translational misalignment tolerances in the axial and radial directions and the angular misalignment tolerances in roll, pitch and yaw of the steel cable-snared end-effector are 30 mm, 15 mm, 6°, 3° and 3° larger, respectively, than those of the three fingers-three petals end-effector, and its contact force is smaller and smoother. The end-effector schemes and research methods are beneficial to the development of large space manipulator end-effectors and space docking mechanisms.
Funding: Supported by the National Hi-tech Research and Development Program of China (863 Program, Grant No. 2006AA04Z228).
We present a deterministic algorithm for large-scale VLSI module placement. Following the less flexibility first (LFF) principle, we simulate a manual packing process in which the concept of placement by stages is introduced to reduce the overall evaluation complexity. The complexity of the proposed algorithm is (N1 + N2) × O(n^2) + N3 × O(n^4 lg n), where N1, N2 and N3 denote the numbers of modules placed in each stage, N1 + N2 + N3 = n, and N3 << n. This is much less than the original time complexity of O(n^5 lg n). Experimental results indicate that this approach is quite promising.
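To get a feel for the saving, take hypothetical stage sizes that are not from the paper, say n = 1000 modules with N3 = 20 and N1 + N2 = 980, and ignore the hidden constants:

```latex
(N_1 + N_2)\, n^{2} + N_3\, n^{4} \lg n \;\approx\; 9.8 \times 10^{8} + 2 \times 10^{14} \;\approx\; 2 \times 10^{14},
\qquad
n^{5} \lg n \;\approx\; 10^{16},
```

so under these assumptions the staged evaluation is roughly fifty times cheaper in the dominant term.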
This study investigates the dominant modes of variability in monthly and seasonal rainfall over the India-China region, mainly through Empirical Orthogonal Function (EOF) analysis. The EOFs show that whereas the rainfall over India varies as one coherent zone, that over China varies in east-west oriented bands, and the influence of this banded structure extends well into India. The relationship of rainfall with large-scale parameters such as the subtropical ridge over the Indian and western Pacific regions, the Southern Oscillation, Northern Hemispheric surface air temperature and stratospheric winds has also been investigated. The results show that rainfall over the area around 40°N, 110°E in China is highly related to rainfall over India, and that the subtropical ridge over the Indian region is an important predictor over India as well as over the northern China region.
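A minimal sketch of how EOFs of a rainfall anomaly field are typically computed via the singular value decomposition (synthetic data only; not the study's dataset or preprocessing):

```python
import numpy as np

# Synthetic rainfall field: 480 monthly time steps x 200 grid points.
rng = np.random.default_rng(0)
rain = rng.gamma(shape=2.0, scale=3.0, size=(480, 200))

# Remove the time mean at each grid point to form anomalies.
anom = rain - rain.mean(axis=0)

# SVD of the anomaly matrix: rows of vt are the spatial EOF patterns,
# u * s gives the corresponding principal-component time series.
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt                       # each row: one spatial mode
pcs = u * s                     # each column: one PC time series
explained = s**2 / np.sum(s**2)

print("variance explained by first 3 modes:", explained[:3])
```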
We present the design and performance of a home-built scanning tunneling microscope (STM) that is compact (66 mm tall and 25 mm in diameter), yet equipped with a 3D atomic-precision piezoelectric motor in which the Z coarse approach relies on a very simple friction-type walker (of our own invention) driven by an axially cut piezoelectric tube. The walker is vertically inserted into a piezoelectric scanner tube (PST), with its brim lying flat on the PST end as the inertial slider (driven by the PST) for the XZ (sample plane) motion. The STM is designed to be capable of searching for rare microscopic targets (defects, dopants, boundaries, nano-devices, etc.) over a macroscopic sample area (square millimeters) under the extreme conditions (low temperatures, strong magnetic fields, etc.) in which it fits. It gives good atomic-resolution images after scanning a highly oriented pyrolytic graphite sample in air at room temperature.
Human pluripotent stem cells (hPSCs), including human embryonic stem cells and human induced pluripotent stem cells, are promising sources of hematopoietic cells due to their unlimited growth capacity and their pluripotency. Dendritic cells (DCs), the unique immune cells of the hematopoietic system, can be loaded with tumor-specific antigen and used as vaccines for cancer immunotherapy. While autologous DCs from peripheral blood are limited in cell number, hPSC-derived DCs provide a novel alternative cell source with the potential for large-scale production. This review summarizes recent advances in differentiating hPSCs into DCs through the intermediate stage of hematopoietic stem cells. Step-wise growth factor induction has been used to derive DCs from hPSCs either in suspension culture of embryoid bodies (EBs) or in co-culture with stromal cells. To fulfill the clinical potential of hPSC-derived DCs, the bioprocess needs to be scaled up to produce a large number of cells economically under tight quality control. This requires the development of novel bioreactor systems combining guided EB-based differentiation with an engineered culture environment. Hence, recent progress in using bioreactors for hPSC lineage-specific differentiation is reviewed. In particular, potential scale-up strategies for the multistage DC differentiation and the effect of shear stress on hPSC differentiation in bioreactors are discussed in detail.
Funding: Supported in part by a Florida State University start-up fund, a Florida State University Research Foundation GAP award, and partial support from the National Science Foundation, No. 1342192.
A local diversity AdaBoost support vector machine (LDAB-SVM) is proposed for large-scale dataset classification problems. The training dataset is first split into several blocks, and models are built on these blocks. To obtain better performance, AdaBoost is used in building each model: in the boosting iterations, component learners with higher diversity and accuracy are collected by adjusting the kernel parameters. The local models are then integrated by a voting method. The experimental study shows that LDAB-SVM can deal with large-scale datasets efficiently without reducing the performance of the classifier.
Funding: Supported by the National Natural Science Foundation of China (60603098).
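A minimal sketch of the block-wise boosted-SVM-plus-voting idea (scikit-learn ≥ 1.2 parameter names assumed; the paper's diversity-based learner selection and kernel-parameter adjustment are not reproduced):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1200, n_features=20, random_state=0)

# Split the training data into blocks and boost an SVM ensemble on each block.
n_blocks = 3
blocks = np.array_split(np.random.default_rng(0).permutation(len(X)), n_blocks)
local_models = []
for idx in blocks:
    model = AdaBoostClassifier(
        estimator=SVC(kernel="rbf", gamma="scale"),  # base SVM learner
        n_estimators=10,
        algorithm="SAMME",      # SVC exposes no predict_proba by default
        random_state=0,
    )
    model.fit(X[idx], y[idx])
    local_models.append(model)

# Integrate the local models by majority voting.
votes = np.stack([m.predict(X) for m in local_models])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("training-set vote accuracy:", (y_pred == y).mean())
```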
Recently a new clustering algorithm called affinity propagation (AP) has been proposed, which efficiently clusters sparsely related data by passing messages between data points. In many cases, however, we want to cluster large-scale data whose similarities are not sparse. This paper presents two variants of AP for grouping large-scale data with a dense similarity matrix: a local approach, partition affinity propagation (PAP), and a global method, landmark affinity propagation (LAP). PAP first passes messages within subsets of the data and then merges the results as the initialization of the remaining iterations, which can effectively reduce the number of clustering iterations. LAP passes messages between landmark data points first and then clusters the non-landmark data points; it is a global approximation method for speeding up clustering of large-scale data. Experiments are conducted on many datasets, such as random data points, manifold subspaces, images of faces and Chinese calligraphy, and the results demonstrate that the two approaches are feasible and practicable.
Funding: Supported by the National Natural Science Foundation of China (Nos. 60533090 and 60603096), the National Hi-Tech Research and Development Program (863) of China (No. 2006AA010107), the Key Technology R&D Program of China (No. 2006BAH02A13-4), the Program for Changjiang Scholars and Innovative Research Team in University of China (No. IRT0652), and the Cultivation Fund of the Key Scientific and Technical Innovation Project of MOE, China (No. 706033).
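A minimal sketch of the landmark idea behind LAP, using plain scikit-learn AffinityPropagation on a landmark subset followed by nearest-exemplar assignment (an illustration of the general strategy, not the authors' message-passing scheme):

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.datasets import make_blobs
from sklearn.metrics import pairwise_distances_argmin

X, _ = make_blobs(n_samples=5000, centers=5, random_state=0)

# Pick a small set of landmark points and run affinity propagation on them only.
rng = np.random.default_rng(0)
landmark_idx = rng.choice(len(X), size=200, replace=False)
ap = AffinityPropagation(random_state=0).fit(X[landmark_idx])
exemplars = ap.cluster_centers_

# Assign every non-landmark point to its nearest exemplar.
labels = pairwise_distances_argmin(X, exemplars)
print("number of clusters found:", len(exemplars))
```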
We study how to use the SR1 update to realize minimization methods for problems where storage is critical. We give an update formula which generates matrices using information from the last m iterations. Numerical tests show that the method is efficient.
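A minimal sketch of a limited-memory SR1 matrix-vector product built from the last m curvature pairs (a dense-matrix illustration of the recursion; a genuine limited-memory implementation avoids forming B explicitly, and the paper's formula and safeguards may differ):

```python
import numpy as np

def lsr1_matvec(v, s_list, y_list, gamma=1.0):
    """Apply a limited-memory SR1 approximation B to the vector v.

    B starts from gamma*I and receives one SR1 rank-one correction per
    stored pair (s_k, y_k) from the last m iterations.
    """
    n = len(v)
    B = gamma * np.eye(n)
    for s, y in zip(s_list, y_list):
        r = y - B @ s
        denom = r @ s
        # Standard SR1 safeguard: skip the update when the denominator is tiny.
        if abs(denom) > 1e-8 * np.linalg.norm(r) * np.linalg.norm(s):
            B += np.outer(r, r) / denom
    return B @ v

# Example with m = 2 stored pairs on a 3-dimensional problem.
s_list = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
y_list = [np.array([2.0, 0.0, 0.0]), np.array([0.0, 3.0, 0.0])]
print(lsr1_matvec(np.ones(3), s_list, y_list))   # -> [2. 3. 1.]
```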
The thermal stratification level of a low sidewall air supply system in a large space is defined. Based on an experiment on low sidewall air supply in the summer of 2008, the thermal stratification level is studied by simulation. Starting from the simulation of the experimental condition, the air velocity and vertical temperature distribution in a large space are simulated at different air-outlet velocities, and the thermal stratification level line is obtained. The simulation results match the experimental ones well, with an average relative error of 3.4%. The thermal stratification level is raised by increasing the air-outlet velocity in the low sidewall air supply mode. It is concluded that at the experimental air-outlet velocity of 0.29 m/s, a uniform thermal environment in the higher occupied zone and a stable stratification level are formed. When the air-outlet velocity is low, such as 0.05 m/s, the thermal stratification level is too low and the air velocity is too small to satisfy human thermal comfort requirements in the occupied zone. It would therefore be reasonable to design the air-outlet velocity as 0.31 m/s if the height of the occupied zone is 2 m.
Funding: Project 50478113 supported by the National Natural Science Foundation of China and Project J50502 supported by the Leading Academic Discipline Project of Shanghai Municipal Education Commission, China.
The conservation of ancient and large trees in domestic and overseas cities is compared. Ancient and large trees are regarded as important cultural relics that play an important role in optimizing the urban natural environment and enriching urban humanistic and natural landscapes, and they are also an important element of the urban garden works represented by urban parks. A case study of Yunqizhujing Park was carried out to examine the conservation of ancient and large trees in the park green spaces of Hangzhou City; solutions to current problems are proposed, and constructive suggestions are given for the conservation of ancient and large trees in urban park green spaces.
Funding: Supported by the Lin'an Scientific and Technological Program of Zhejiang Province (200933) and the Hangzhou Social Development Scientific Research Program of Zhejiang Province (20100933B34).