An earthquake of MS=6.9 occurred in Gonghe County, Qinghai Province, China on April 26, 1990. This earthquake was followed by three larger aftershocks: MS=5.5 on May 7, 1990, MS=6.0 on January 3, 1994, and MS=5.7 on February 16, 1994. The moment tensors of these earthquakes as functions of time were obtained by moment tensor inversion in the frequency domain. The inverted results indicate that the earthquakes had very similar focal mechanisms of predominantly reverse faulting on a plane striking NWW and dipping SSW. The scalar seismic moments are M0=9.4×10^18 N·m for the MS=6.9 event, 8.0×10^16 N·m for the MS=5.5 event, 4.9×10^17 N·m for the MS=6.0 event, and 2.9×10^17 N·m for the MS=5.7 event, respectively. The results also show that the source processes of these events differed significantly. The main shock had a very complex process consisting of two distinct subevents of comparable size. The first subevent occurred in the first 12 s, with a seismic moment of 4.7×10^18 N·m, and the second continued from 31 s to 41 s, with a seismic moment of 2.5×10^18 N·m. In addition, a much smaller subevent, with a seismic moment of about 2.1×10^18 N·m, may exist in the interval between 12 s and 31 s. In contrast, the source processes of the three aftershocks were quite simple: the source time function of each aftershock is a single impulse, suggesting an essentially uninterrupted rupture. The rise times and total rupture durations are 4 s and 11 s for the MS=5.5 event, 6 s and 16 s for the MS=6.0 event, and 6 s and 13 s for the MS=5.7 event, respectively.
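As a quick consistency check (our illustration, not part of the original inversion), the scalar moments quoted above can be converted to moment magnitudes with the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1) for M0 in N·m:

```python
import math

def moment_magnitude(m0_nm):
    """Hanks-Kanamori moment magnitude for a scalar seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# Scalar moments quoted in the abstract (N*m)
events = {"MS=6.9": 9.4e18, "MS=5.5": 8.0e16, "MS=6.0": 4.9e17, "MS=5.7": 2.9e17}
for name, m0 in events.items():
    print(name, round(moment_magnitude(m0), 2))
```

The resulting Mw values (about 6.6, 5.2, 5.7, and 5.6) track the quoted surface-wave magnitudes reasonably well, as expected for shallow events of this size.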
Cochlodinium polykrikoides is a notoriously harmful algal species that inflicts severe damage on aquaculture in the coastal seas of Korea and Japan. Information on its expected movement tracks and boundaries of influence is very useful for establishing an effective mitigation plan. In general, this information is provided by a red-tide (algal bloom) model. The performance of such a model depends strongly on the accuracy of its parameters, which are the coefficients of the functions approximating the biological growth and loss patterns of C. polykrikoides. These parameters have been estimated from bioassay data composed of pairs of growth-limiting factors and net growth rate values. Because bioassay data for C. polykrikoides are relatively abundant compared with other algal species, the estimated parameters differ depending on which dataset is used. Parameters estimated from one specific dataset can be viewed as locally optimized, because they are adjusted only to that dataset; when another dataset is used, the estimation error can be considerable. In this study, the parameters are estimated from all available datasets rather than any single one, and can therefore be considered globally optimized. The cost function for the optimization is defined as the integrated mean squared estimation error, i.e., the difference between the experimental and estimated rate values. Quantitative error analysis shows that the root-mean-square errors of the global parameters are approximately 25%–50% smaller than those of the local parameters, and the bias is removed completely for the globally estimated parameters. The parameter sets can be used as reference default values for a red-tide model because they are optimal and representative. However, additional tuning of the parameters using in-situ monitoring data is still required, because the bioassay data have limitations in representing in-situ coastal conditions.
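The local-versus-global distinction can be sketched with a toy least-squares fit. Everything below is synthetic and illustrative (the quadratic growth-rate curve, coefficients, and noise level are our assumptions, not the paper's bioassay functions); the point is only that pooling datasets trades a little per-dataset accuracy for a smaller integrated error:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_dataset(a, b, c, n=30):
    # Hypothetical bioassay: temperature vs. net growth rate (quadratic response)
    t = rng.uniform(10, 30, n)
    mu = a * t**2 + b * t + c + rng.normal(0, 0.01, n)
    return t, mu

# Two labs report noticeably different response curves.
d1 = make_dataset(-0.004, 0.20, -1.5)
d2 = make_dataset(-0.006, 0.28, -2.4)

def fit(t, mu):
    # Ordinary least squares for the quadratic coefficients.
    X = np.column_stack([t**2, t, np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, mu, rcond=None)
    return coef

def rmse(coef, t, mu):
    X = np.column_stack([t**2, t, np.ones_like(t)])
    return float(np.sqrt(np.mean((X @ coef - mu) ** 2)))

local1 = fit(*d1)                          # locally optimized on dataset 1
t_all = np.concatenate([d1[0], d2[0]])
mu_all = np.concatenate([d1[1], d2[1]])
global_fit = fit(t_all, mu_all)            # globally optimized on pooled data

# The local fit looks good on its own dataset but transfers poorly;
# the global fit minimizes the integrated error over all datasets.
print(rmse(local1, *d2), rmse(global_fit, *d2))
```

By construction the pooled fit is the minimizer of the integrated mean squared error, which is exactly the cost function the study describes.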
In this study, the authors introduce a new bogus data assimilation method based on the dimension-reduced projection 4DVar, which resolves the cost function directly in a low-dimensional space. The authors also try a new method to improve the quality of the samples, which form the basis of dimension-reduced-projection bogus data assimilation (DRP-BDA). By running a number of numerical weather models with different combinations of model parameterizations on Typhoon Sinlaku, the authors obtained two groups of samples with different spreads and similarities. The results show that, compared with the control runs, DRP-BDA deepens the simulated typhoon center pressure by 20 to 30 hPa and that the intensity lasts as long as 60 hours. The mean track error is improved after DRP-BDA, and the structure of the typhoon is also improved: the wind near the typhoon center is enhanced dramatically, while the warm core remains moderate.
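The dimension-reduced idea can be sketched with a linear toy problem (ours, not the authors' implementation; the state size, ensemble size, and observation operator below are arbitrary). The analysis increment is confined to the span of the ensemble perturbation matrix P, so the quadratic cost function is minimized over a handful of weights w instead of the full model state:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 100, 5                   # state dimension, ensemble size

xb = rng.normal(size=n)         # background state
P = rng.normal(size=(n, m))     # ensemble perturbations (columns = samples)
H = np.eye(n)[:20]              # observe the first 20 state components
y = H @ (xb + P @ np.ones(m))   # synthetic "bogus" observations
R_inv = np.eye(20)              # observation-error precision (identity toy)

# Restrict the increment to x = xb + P w. Then
# J(w) = 0.5 w^T w + 0.5 (H(xb + Pw) - y)^T R^-1 (H(xb + Pw) - y)
# is quadratic in the m weights, solved by a small m x m linear system.
HP = H @ P
d = y - H @ xb                  # innovation
A = np.eye(m) + HP.T @ R_inv @ HP
w = np.linalg.solve(A, HP.T @ R_inv @ d)
xa = xb + P @ w                 # analysis state
```

The full 4DVar cost function additionally involves the model trajectory, but the algebra of projecting onto the sample space is the same.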
A number of spectroscopic surveys have been carried out or are planned to study the origin of the Milky Way. Their exploitation requires reliable automated methods and software to measure the fundamental parameters of the stars. Adopting the ULySS package, we have tested the effect of different resolutions and signal-to-noise ratios (SNR) on the measurement of the stellar atmospheric parameters (effective temperature Teff, surface gravity log g, and metallicity [Fe/H]). We show that ULySS is reliable for determining these parameters with medium-resolution spectra (R ~ 2000). We then applied the method to measure the parameters of 771 stars selected from the commissioning database of the Guoshoujing Telescope (LAMOST). The results were compared with the SDSS/SEGUE Stellar Parameter Pipeline (SSPP), and we derived precisions of 167 K, 0.34 dex, and 0.16 dex for Teff, log g, and [Fe/H], respectively. Furthermore, 120 of these stars were selected to construct the primary stellar spectral template library (Version 1.0) of LAMOST and will be deployed as basic ingredients for the LAMOST automated parametrization pipeline.
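A degradation test of the kind described can be sketched as follows (our toy, not how ULySS itself operates; the spectrum, Gaussian smoothing width, and SNR definition are all illustrative): smooth a clean spectrum to a lower resolution and inject noise for a target SNR before re-measuring the parameters.

```python
import numpy as np

rng = np.random.default_rng(6)
wave = np.linspace(4000, 5500, 1500)   # wavelength grid (Angstrom)
# Toy spectrum: continuum with a single absorption line near H-beta
flux = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 4861) / 2.0) ** 2)

def degrade(flux, sigma_pix, snr):
    # Gaussian smoothing kernel lowers the effective resolution.
    x = np.arange(-25, 26)
    kernel = np.exp(-0.5 * (x / sigma_pix) ** 2)
    kernel /= kernel.sum()
    padded = np.pad(flux, 25, mode="edge")          # avoid edge dimming
    smooth = np.convolve(padded, kernel, mode="same")[25:-25]
    # Add Gaussian noise scaled to the target SNR.
    noise = rng.normal(0.0, smooth.mean() / snr, flux.size)
    return smooth + noise

low = degrade(flux, sigma_pix=4.0, snr=20.0)        # R-degraded, SNR~20 version
```

Running the parameter pipeline on a grid of such degraded versions of reference spectra is one way to map precision as a function of resolution and SNR.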
The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) published its first data release (DR1) in 2013, currently the largest dataset of stellar spectra in the world. We combine the PASTEL catalog and SIMBAD radial velocities as a testing standard to validate the stellar parameters (effective temperature Teff, surface gravity log g, metallicity [Fe/H], and radial velocity Vr) derived in DR1. Through cross-identification of the DR1 catalogs with the PASTEL catalog, we obtain a preliminary sample of 422 stars. After removing stellar parameter measurements from problematic spectra and applying effective temperature constraints to the sample, we compare the stellar parameters from DR1 with those from PASTEL and SIMBAD and demonstrate that the DR1 results are reliable within restricted ranges of Teff. We derive standard deviations of 110 K, 0.19 dex, and 0.11 dex for Teff, log g, and [Fe/H], respectively, when Teff < 8000 K, and 4.91 km s^-1 for Vr when Teff < 10 000 K. Systematic errors are negligible except for those of Vr. In addition, metallicities in DR1 are systematically higher than those in PASTEL in the range PASTEL [Fe/H] < -1.5.
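The validation step, once the catalogs are cross-matched, amounts to computing the bias and scatter of the parameter differences within a temperature cut. A minimal sketch on synthetic values (the numbers and "catalogs" below are placeholders, not the actual DR1/PASTEL schema):

```python
import numpy as np

rng = np.random.default_rng(2)
teff_ref = rng.uniform(4000, 12000, 400)         # "PASTEL" temperatures (K)
teff_dr = teff_ref + rng.normal(0, 110, 400)     # "DR1" values with 110 K scatter

mask = teff_ref < 8000                           # restrict the comparison range
diff = teff_dr[mask] - teff_ref[mask]
bias, scatter = diff.mean(), diff.std()          # systematic offset and std. dev.
print(f"bias = {bias:.0f} K, scatter = {scatter:.0f} K")
```

The same bias/scatter computation, repeated per parameter and per Teff range, yields the standard deviations quoted in the abstract.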
With the rapid development of large-scale sky surveys such as the Sloan Digital Sky Survey (SDSS), GAIA, and LAMOST (Guoshoujing Telescope), stellar spectra can be obtained on an ever-increasing scale. It is therefore necessary to estimate stellar atmospheric parameters such as Teff, log g, and [Fe/H] automatically in order to achieve the scientific goals and make full use of the potential value of these observations. Feature selection plays a key role in the automatic measurement of atmospheric parameters. We propose using the least absolute shrinkage and selection operator (Lasso) algorithm to select features from stellar spectra. Feature selection reduces redundancy in the spectra, alleviates the influence of noise, improves calculation speed, and enhances the robustness of the estimation system. Based on the extracted features, stellar atmospheric parameters are estimated with a support vector regression model. Three typical schemes are evaluated on spectral data from both the ELODIE library and SDSS. Experimental results demonstrate the potential of the approach and show that the method is stable when applied to different spectra.
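The two-stage pipeline (Lasso for feature selection, support vector regression for estimation) can be sketched on synthetic data. The "spectra" below are random placeholders with a planted linear signal, and the hyperparameters are our guesses, not the paper's tuned values:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n_spec, n_pix = 300, 500
X = rng.normal(size=(n_spec, n_pix))           # synthetic "spectra"
w_true = np.zeros(n_pix)
w_true[:10] = 1.0                              # only 10 pixels carry signal
y = X @ w_true + rng.normal(0, 0.1, n_spec)    # standardized "Teff" target

# Stage 1: Lasso zeroes out uninformative pixels.
lasso = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(lasso.coef_)         # indices of retained features

# Stage 2: SVR regresses the parameter from the selected fluxes only.
svr = SVR(kernel="rbf", C=10.0).fit(X[:200, selected], y[:200])
score = svr.score(X[200:, selected], y[200:])  # R^2 on held-out spectra
print(len(selected), f"{score:.2f}")
```

Restricting the regression to the Lasso-selected pixels is what reduces redundancy and noise sensitivity in the estimation stage.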
Multispecies ecological models have been used for predicting the effects of fishing activity and evaluating the performance of management strategies. Size-spectrum models are one type of physiologically structured ecological model that provides a feasible approach to describing fish communities in terms of individual dietary variation and ontogenetic niche shifts. Despite the potential of ecological models to improve our understanding of ecosystems, their application is usually limited for data-poor fisheries. As a first step toward implementing ecosystem-based fisheries management (EBFM), this study built a size-spectrum model for the fish community in Haizhou Bay, China. We describe the data collection procedures and model parameterization to facilitate the implementation of such size-spectrum models in future studies of data-poor ecosystems. The effects of fishing on the ecosystem were exemplified over a range of fishing efforts and monitored with a set of ecological indicators. Total community biomass, the biodiversity index, the W-statistic, the LFI (large fish index), mean W (mean body weight), and the slope of the community size spectrum showed strongly non-linear responses to fishing pressure, and the largest fishing effort did not generate the most drastic responses in certain scenarios. We emphasize the value and feasibility of developing size-spectrum models to capture ecological dynamics, and we discuss limitations as well as potential model improvements. This study aims to promote wider use of this type of model in support of EBFM.
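Two of the indicators named above, mean body weight and the size-spectrum slope, can be computed from a sample of individual body weights as sketched below (the heavy-tailed synthetic weights and binning choices are our illustration, not the Haizhou Bay data):

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic community: heavy-tailed individual body weights (g)
weights = rng.pareto(1.5, 5000) + 1.0

mean_w = weights.mean()                       # indicator: mean body weight

# Size spectrum: log-log regression of abundance on body-weight class.
bins = np.logspace(0, 2, 12)                  # weight classes, 1-100 g
counts, edges = np.histogram(weights, bins=bins)
mids = np.sqrt(edges[:-1] * edges[1:])        # geometric bin midpoints
keep = counts > 0
slope, intercept = np.polyfit(np.log10(mids[keep]), np.log10(counts[keep]), 1)
print(mean_w, slope)
```

A steeper (more negative) slope indicates relatively fewer large individuals, which is why the slope is tracked as fishing pressure increases.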
The accuracy of estimated stellar atmospheric parameters decreases evidently with decreasing spectral signal-to-noise ratio (S/N), and there is a huge number of such observations, especially with S/N < 30. It is therefore helpful to improve parameter estimation performance for these spectra, and this work studies the (Teff, log g, [Fe/H]) estimation problem for LAMOST DR8 low-resolution spectra with 20 ≤ S/N < 30. We propose a data-driven method based on machine learning techniques. First, the scheme detects stellar atmospheric parameter-sensitive features in the spectra with the Least Absolute Shrinkage and Selection Operator (LASSO), rejecting ineffective data components and irrelevant data. Second, a Multi-layer Perceptron (MLP) is used to estimate the stellar atmospheric parameters from the LASSO features. Finally, the performance of LASSO-MLP is evaluated by computing and analyzing the consistency between its estimates and reference values from Apache Point Observatory Galactic Evolution Experiment high-resolution spectra. Experiments show that the mean absolute errors of Teff, log g, and [Fe/H] are reduced from the LASP values (137.6 K, 0.195 dex, 0.091 dex) to the LASSO-MLP values (84.32 K, 0.137 dex, 0.063 dex), indicating evident improvement in stellar atmospheric parameter estimation. In addition, this work estimates the stellar atmospheric parameters for 1,162,760 low-resolution spectra with 20 ≤ S/N < 30 from LAMOST DR8 using LASSO-MLP, and releases the estimation catalog, experimental code, trained model, training data, and test data for scientific exploration and algorithm study.
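The LASSO-then-MLP structure can be sketched in a few lines of scikit-learn on a synthetic low-S/N problem (the planted signal, noise level, and network size below are our assumptions, not the released LASSO-MLP configuration):

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
X = rng.normal(size=(400, 300))               # synthetic "spectra"
coef = np.zeros(300)
coef[::30] = 2.0                              # 10 parameter-sensitive pixels
y = X @ coef + rng.normal(0, 0.5, 400)        # noisy target (low S/N)

# Stage 1: LASSO keeps parameter-sensitive pixels, discards the rest.
sel = np.flatnonzero(Lasso(alpha=0.1).fit(X, y).coef_)

# Stage 2: an MLP regresses the parameter from the retained pixels.
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(X[:300, sel], y[:300])
r2 = mlp.score(X[300:, sel], y[300:])          # R^2 on held-out spectra
```

In the paper the MLP's output is compared against APOGEE high-resolution reference values; here the held-out R^2 plays that role.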
In the present study, the simulation of a heavy rainfall event that occurred over Jharkhand on 18 August 2016 was taken as a case study. The Weather Research and Forecasting (WRF) model was used, initialized with National Centers for Environmental Prediction (NCEP) analysis data, and its output was compared with GSMaP data for different combinations of physical parameterization schemes, namely microphysics (MP) and cumulus parameterization (CP). Three MP schemes (the Kessler scheme, the Lin et al. scheme, and the WRF Single-Moment 6-class scheme) were used in combination with three CP schemes (the Betts-Miller-Janjic, Multi-scale Kain-Fritsch, and New Simplified Arakawa-Schubert schemes). The model-predicted humidity, temperature, and precipitation were compared with the GSMaP product. The model depicted the cloud pattern well and captured the rain event spatially. The results show that the model overestimates precipitation for all the schemes.
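The verification behind a statement like "the model overestimates precipitation" reduces to simple gridded scores. A minimal sketch with placeholder arrays (the values are illustrative, not WRF or GSMaP output):

```python
import numpy as np

# Reference rain (e.g., a GSMaP grid) and model rain on the same grid (mm)
obs = np.array([[2.0, 5.0], [12.0, 0.5]])
model = np.array([[3.5, 7.0], [15.0, 1.0]])

bias = float(np.mean(model - obs))                    # > 0 means overestimation
rmse = float(np.sqrt(np.mean((model - obs) ** 2)))    # overall error magnitude
print(bias, rmse)
```

Computing this bias for each MP/CP combination is how overestimation "for all the schemes" would be established.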
Parameterization is one of the key problems in constructing a curve to interpolate a set of ordered points. We propose a new local parameterization method based on a curvature model. The new method determines the knots by minimizing the maximum curvature of a quadratic curve. When the knots determined by the new method are used to construct the interpolation curve, the constructed curve has good precision. We also compare the new method with existing methods: our method performs better in interpolation error, and the interpolated curve is fairer.
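For context, the standard baseline parameterizations that such a curvature-based method is typically compared against (uniform, centripetal, and chord-length knots) can be computed in one small function. This is a sketch of those classical baselines, not of the paper's curvature-minimizing method:

```python
import numpy as np

def knots(points, alpha):
    """Knot vector on [0, 1] from the exponential parameterization family.

    alpha = 0 gives uniform, 0.5 centripetal, 1.0 chord-length knots.
    """
    p = np.asarray(points, dtype=float)
    d = np.linalg.norm(np.diff(p, axis=0), axis=1) ** alpha
    t = np.concatenate([[0.0], np.cumsum(d)])
    return t / t[-1]

pts = [(0, 0), (1, 0), (1, 1), (4, 1)]
print(knots(pts, 1.0))   # chord length: spacing follows segment lengths
print(knots(pts, 0.0))   # uniform: equal spacing regardless of geometry
```

The proposed method instead chooses each local knot so that the maximum curvature of the interpolating quadratic is minimized, rather than using segment lengths alone.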
To scrutinize the nature of dark energy, many equations of state have been proposed. In this context, we examine the simplest parameterization of the dark energy equation-of-state parameter in an anisotropic Bianchi type I universe, compared with the ΛCDM model. Using different combinations of data samples, including Pantheon and Pantheon + H(z), and minimizing the χ² function of the distance modulus of the data samples, we obtain constrained values of the cosmographic parameters in this dark energy parameterization. One condition for crossing the phantom barrier is obtained. Several physical properties of the universe are discussed, considering the anisotropy effect and different observational data points. The deduced cosmological parameters are consistent with recent observational data.
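The χ² minimization over a distance-modulus dataset can be sketched as follows. A flat-ΛCDM-like expansion rate is used purely for illustration; the paper's anisotropic Bianchi type I parameterization would replace E(z) below, and the mock data points are ours:

```python
import numpy as np
from scipy.optimize import minimize

C = 299792.458                                # speed of light (km/s)

def mu_model(z, h0, om):
    """Distance modulus for a flat-LCDM-like toy model (illustrative only)."""
    zz = np.linspace(0.0, z, 401)
    ez = np.sqrt(om * (1.0 + zz) ** 3 + (1.0 - om))
    dz = zz[1] - zz[0]
    # trapezoidal rule for the comoving-distance integral of 1/E(z)
    dc = C / h0 * np.sum((1.0 / ez[:-1] + 1.0 / ez[1:]) * 0.5 * dz)
    return 5.0 * np.log10((1.0 + z) * dc) + 25.0

z_obs = np.array([0.1, 0.3, 0.5, 0.8, 1.2])
mu_obs = np.array([mu_model(z, 70.0, 0.3) for z in z_obs])  # noiseless mock data
sigma = 0.1                                                  # assumed error (mag)

def chi2(theta):
    h0, om = theta
    mu = np.array([mu_model(z, h0, om) for z in z_obs])
    return float(np.sum(((mu - mu_obs) / sigma) ** 2))

best = minimize(chi2, x0=[65.0, 0.25], method="Nelder-Mead")
print(best.x)   # recovered (H0, Omega_m) near the input values
```

With real supernova samples the same minimization, run over the model's free parameters, yields the constrained cosmographic values discussed in the abstract.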
Over the past decade, the use of open-source software has grown. Today, many companies, including Google, Microsoft, Meta, RedHat, MongoDB, and Apache, are major contributors to open source. With the increased use of open-source software, and its integration into custom-developed software, the quality of these software components becomes increasingly important. This study examined a sample of open-source applications from GitHub. Static software analytics were conducted, and each application was classified by risk level. Of the analyzed applications, 90% were classified as low risk or moderately low risk, indicating a high level of quality for open-source applications.
Funding (C. polykrikoides red-tide study): part of the project "Development of Korea Operational Oceanographic System (KOOS), Phase 2" and part of the project "Cooperative Project on Korea-China Bilateral Committee on Ocean Science", both funded by the Ministry of Oceans and Fisheries, Korea, together with the China-Korea Joint Ocean Research Center.
Funding (DRP-BDA typhoon study): the Ministry of Finance of China and the China Meteorological Administration, Special Project of the Meteorological Sector (Grant No. GYHY(QX)200906009); the National Natural Science Foundation of China, innovation group project (Grant No. 40821092).
Funding (ULySS/LAMOST stellar-parameter study): the National Natural Science Foundation of China (Grant Nos. 10973021, 10778626, and 10933001); the National Basic Research Development Program of China (Grant No. 2007CB815404); the China Scholarship Council (CSC) (Grant No. 2007104275).
Funding (LAMOST DR1 validation study): the National Key Basic Research Program of China (NKBRP) 2014CB845700; the National Natural Science Foundation of China (Grant Nos. 11473001 and 11233004).
Funding (Haizhou Bay size-spectrum study): the Special Fund for Agro-scientific Research in the Public Interest under contract No. 201303050; the Fundamental Research Funds for the Central Universities under contract Nos. 201022001 and 201262004.
Funding (LASSO-MLP study): the National Natural Science Foundation of China (Grant Nos. 11973022, 11973049, and U1811464); the Natural Science Foundation of Guangdong Province (No. 2020A1515010710); the Youth Innovation Promotion Association of the CAS (id. Y202017).
Funding (curve parameterization study): the National Research Foundation for the Doctoral Program of Higher Education of China (20110131130004); the Independent Innovation Foundation of Shandong University, IIFSDU (2012TB013).