Dynamic data driven simulation (DDDS) is proposed to improve the model by incorporating real data from the practical system into the model. Instead of giving a static input, multiple possible sets of inputs are fed into the model, and the computational errors are corrected using statistical approaches. It involves a variety of aspects, including uncertainty modeling, measurement evaluation, the coupling of the system model and the measurement model, computational complexity, and performance. The authors set up the architecture of DDDS for a wildfire spread model, DEVS-FIRE, based on the discrete event system specification (DEVS) formalism. The experimental results show that the framework can track the dynamically changing fire front based on fire sensor data and thus provides more accurate predictions.
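The abstract does not spell out the statistical correction step; as a hedged illustration only, the sketch below shows one common way an ensemble of candidate input sets can be weighted against sensor data, assuming fire fronts represented as point arrays and a Gaussian sensor-error model (both assumptions, not taken from DEVS-FIRE).

```python
import numpy as np

def weight_input_sets(simulated_fronts, sensed_front, sigma=1.0):
    """Score each simulated fire front against the sensed front under an
    assumed Gaussian error model and normalize the scores into weights,
    in the spirit of sequential importance sampling."""
    errors = np.array([np.linalg.norm(f - sensed_front) for f in simulated_fronts])
    weights = np.exp(-0.5 * (errors / sigma) ** 2)
    return weights / weights.sum()

# Example: three candidate input sets produce three predicted fronts (50 boundary points each).
fronts = [np.random.rand(50, 2) for _ in range(3)]
observed = np.random.rand(50, 2)
print(weight_input_sets(fronts, observed, sigma=0.5))
```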
An outlier in one variable will smear the estimation of other measurements in data reconciliation (DR). In this article, a novel robust method is proposed for nonlinear dynamic data reconciliation to reduce the influence of outliers on the result of DR. The method introduces a penalty function matrix into a conventional least-squares objective function, assigning small weights to outliers and large weights to normal measurements. To avoid losing data information, an element-wise Mahalanobis distance is proposed, as an improvement on the vector-wise distance, to construct the penalty function matrix. The correlation of measurement errors is also considered. By constructing the penalty weight matrix, the method brings robust statistical theory into the conventional least-squares estimator and achieves not only good robustness but also simple calculation. Simulation of a continuous stirred tank reactor verifies the effectiveness of the proposed algorithm.
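For orientation only, the structure described above can be sketched as a generic robust weighted least-squares problem; the exact penalty function, process-model constraints, and covariance handling used in the article are not reproduced here.

$$
\min_{\hat{x}}\;(y-\hat{x})^{\mathsf T}\, W\, \Sigma^{-1}\,(y-\hat{x})
\quad\text{s.t.}\quad f\big(\hat{x},\dot{\hat{x}}\big)=0,
\qquad W=\operatorname{diag}\big(w(d_1),\dots,w(d_n)\big),
$$

where $\Sigma$ is the measurement-error covariance (capturing the error correlation), $d_i$ is the element-wise Mahalanobis distance of measurement $i$, and $w(\cdot)$ is a decreasing penalty function that assigns small weights to likely outliers and weights close to one to normal measurements.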
Health monitoring data or data about infectious diseases such as COVID-19 may need to be constantly updated and dynamically released, but they may contain users' sensitive information. Thus, how to preserve users' privacy before release is critically important yet challenging. Differential Privacy (DP) is well known to provide effective privacy protection, and dynamic DP-preserving data release was designed to publish a histogram that meets the DP guarantee. Unfortunately, this scheme may result in high cumulative errors and lower data availability. To address this problem, in this paper we apply Jensen-Shannon (JS) divergence to design an OPTICS (Ordering Points To Identify The Clustering Structure) scheme. It uses JS divergence to measure the difference between the updated data set at the current release time and the private data set at the previous release time. Only when the difference is greater than a threshold do we apply OPTICS to publish DP-protected data sets. Our experimental results show that the absolute errors and average relative errors are significantly lower than those of existing works.
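A minimal sketch of the threshold test described above follows; the JS-divergence computation is standard, while the threshold value and the downstream OPTICS-based DP publication step are placeholders rather than the paper's algorithm.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        return float(np.sum(a * np.log2(a / b)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def should_republish(current_hist, previous_hist, threshold=0.05):
    """Trigger a new (OPTICS-based, omitted here) DP release only when the
    updated data differ enough from the previously released data set."""
    return js_divergence(current_hist, previous_hist) > threshold

print(should_republish([10, 20, 30], [11, 19, 31]))  # small change, stays below the threshold
print(should_republish([10, 20, 30], [40, 5, 15]))   # large change, triggers a new release
```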
A class of networked control systems is investigated whose communication network is shared with other applications. The design objective for such a system setting is not only the optimization of the control performance but also the efficient utilization of the communication resources. We observe that at a large time scale the data packet delay in the communication network varies in a roughly piecewise-constant fashion, which is typically true for data networks like the Internet. Based on this observation, a dynamic data packing scheme is proposed within the recently developed packet-based control framework for networked control systems. As expected, the proposed approach achieves a fine balance between control performance and communication utilization: similar control performance can be obtained at a dramatically reduced cost in communication resources. Simulations illustrate the effectiveness of the proposed approach.
The experimental random error and the desired values of non-observed points in dynamic indexes were estimated by establishing linear regression equations for the variation patterns of the dynamic indexes. Methods for significance testing of differences among treatments, using dynamic points as indexes, were presented without setting replications at each observed dynamic point.
Nowadays, an increasing number of people choose to outsource their computing and storage demands to the Cloud. In order to ensure the integrity of data in the untrusted Cloud, especially dynamic files that can be updated online, we propose an improved dynamic provable data possession model. We use homomorphic tags to verify the integrity of the file and use hash values generated from secret values and tags to prevent replay and forgery attacks. Compared with previous works, our proposal reduces the computational and communication complexity from O(log n) to O(1). Experiments confirm this improvement, and we extend the model to the file-sharing setting.
Due to the development of 5G communication, many aspects of information technology (IT) services are changing. With the development of communication technologies such as 5G, it has become possible to provide IT services that were difficult to provide in the past. One of the services made possible by this change is cloud-based collaboration. In order to support secure collaboration over the cloud, encryption technology that can securely manage dynamic data is essential. However, since existing encryption technology is not suitable for encrypting dynamic data, a new technology that can encrypt dynamic data is required for secure cloud-based collaboration. In this paper, we propose a new encryption technology to support secure collaboration on dynamic data in the cloud. Specifically, we propose an encryption mode of operation that supports data updates such as modification, addition, and deletion of encrypted data while it remains encrypted. To support dynamic updates of encrypted data, we introduce a new mode-of-operation technique named linked-block cipher (LBC). The basic idea of our work is to use an updatable random value, the so-called link, to connect two encrypted blocks. Due to the use of updatable random link values, we can modify, insert, and delete encrypted data without decrypting it.
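Since the abstract does not reproduce the LBC construction, the toy sketch below only illustrates the stated idea of chaining each block to its own updatable random link value instead of to the previous ciphertext; it is not the actual LBC mode of operation and is not cryptographically secure.

```python
import os
import hashlib

BLOCK = 16  # bytes per block

def prf(key, link):
    """Toy keystream derivation; a stand-in for a real block cipher call."""
    return hashlib.sha256(key + link).digest()[:BLOCK]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_blocks(key, plain_blocks):
    """Tie each block to a fresh random link rather than to the previous
    ciphertext block, so any single block can later be replaced in place."""
    links = [os.urandom(BLOCK) for _ in plain_blocks]
    cipher_blocks = [xor(p, prf(key, l)) for p, l in zip(plain_blocks, links)]
    return cipher_blocks, links

def update_block(key, cipher_blocks, links, i, new_plain):
    """Modify block i without decrypting or re-encrypting any other block."""
    links[i] = os.urandom(BLOCK)  # refresh only this block's link
    cipher_blocks[i] = xor(new_plain, prf(key, links[i]))

key = os.urandom(16)
cts, links = encrypt_blocks(key, [b"0123456789abcdef", b"fedcba9876543210"])
update_block(key, cts, links, 0, b"NEW-BLOCK-000000")  # only block 0 changes
```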
In order to provide a practicable solution to data confidentiality in cloud storage services, a data assured deletion scheme is proposed that achieves fine-grained access control, resistance to hopping and sniffing attacks, data dynamics, and deduplication. In our scheme, data blocks are encrypted by a two-level encryption approach, in which the control keys are generated from a key derivation tree, encrypted by an All-Or-Nothing algorithm, and then distributed into a DHT network after being partitioned by secret sharing. This guarantees that only authorized users can recover the control keys and then decrypt the outsourced data within an owner-specified data lifetime. Besides confidentiality, data dynamics and deduplication are also achieved, respectively, by adjustment of the key derivation tree and by convergent encryption. The analysis and experimental results show that our scheme satisfies its security goals and performs assured deletion at low cost.
A decision model of knowledge transfer is presented on the basis of the characteristics of knowledge transfer in a big data environment. This model can determine the weight of knowledge transferred from another enterprise or from a big data provider. Numerous simulation experiments are implemented to test the efficiency of the optimization model. The simulation results show that when the weight of knowledge from the big data provider increases, the total discounted expectation of profits increases and the transfer cost is reduced. The calculated results are in accordance with the actual economic situation. The optimization model can provide useful decision support for enterprises in a big data environment.
Related factors for measuring the urban agglomeration effect were studied first. Then, panel data for 283 prefecture-level cities of China were collected to analyze the effect of agglomeration on employment density. A fixed-effects model was applied to analyze the static panel data, and a two-step generalized method of moments (GMM) estimator was employed to analyze the dynamic panel data. The results reveal that per capita regional GDP, public medical care level, and population mobility have significant effects on employment density. Therefore, an agglomeration economy effect exists in prefecture-level cities of China at the current stage.
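As a hedged illustration of the dynamic-panel setup described above (the variable names and functional form are illustrative, not the paper's exact equation), a two-step GMM specification might take the form

$$
ED_{it} = \alpha\, ED_{i,t-1} + \beta_1 \ln(\mathrm{GDPpc}_{it}) + \beta_2\, \mathrm{Med}_{it} + \beta_3\, \mathrm{Mob}_{it} + \mu_i + \varepsilon_{it},
$$

where $ED_{it}$ is employment density, $\mu_i$ is the city fixed effect, and the lagged dependent variable is instrumented with its deeper lags in the two-step GMM estimation.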
This article presents a new data fusion approach for reasonably predicting the dynamic serviceability reliability of a long-span bridge girder. First, a multivariate Bayesian dynamic linear model (MBDLM) considering the dynamic correlation among multiple variables is provided to predict dynamic extreme deflections; second, with the proposed MBDLM, the dynamic correlation coefficients between any two performance functions can be predicted; finally, based on the MBDLM and the Gaussian copula technique, a new data fusion method is given to predict the serviceability reliability of the long-span bridge girder. Monitoring extreme deflection data from an actual bridge are provided to illustrate the feasibility and application of the proposed method.
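The fusion step relies on a Gaussian copula; in the bivariate case its standard form, with the MBDLM-predicted correlation $\rho$ used as the copula parameter, is

$$
C(u_1,u_2;\rho)=\Phi_2\big(\Phi^{-1}(u_1),\,\Phi^{-1}(u_2);\,\rho\big),
$$

where $\Phi$ is the standard normal CDF, $\Phi_2(\cdot,\cdot;\rho)$ is the bivariate standard normal CDF with correlation $\rho$, and $u_1,u_2$ are the marginal serviceability probabilities of two performance functions; the paper's multivariate extension is not reproduced here.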
A new method is developed to assess and analyze the dynamic performance of the hydrostatic bearing oil film using a multi-layer dynamic mesh technique. It is implemented in C by compiling a UDF program for a single oil film of the hydrostatic bearing. The effects of key lubrication parameters of the hydrostatic bearing are evaluated and analyzed under various working conditions, i.e., no load, a load of 40 t, a full load of 160 t, and rotation speeds of 1, 2, 4, 8, 16, and 32 r/min. The transient data of the oil film bearing capacity under different loads and rotation speeds are acquired for a total of 18 working conditions while the oil film thickness changes. This allows effective prediction of the dynamic performance of large-size hydrostatic bearings. Experiments on the hydrostatic bearing oil film were performed, and the results were used to define the boundary conditions for the numerical simulations and to validate the developed numerical model. The results show that the oil film thickness becomes thinner with increasing operating time of the hydrostatic bearing, both the oil film rigidity and the oil cavity pressure increase significantly, and the increase of the bearing capacity is inversely proportional to the cube of the change of the film thickness. Meanwhile, the effect of the load condition on the carrying capacity of a large-size hydrostatic bearing is more important than that of the speed condition. The error between the simulation value and the experimental value is 4.25%.
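The reported inverse-cube dependence is consistent with the standard constant-flow relation for an idealized circular hydrostatic pad (a textbook simplification, not the paper's bearing geometry):

$$
p_r=\frac{6\,\mu\,Q}{\pi\,h^{3}}\ln\frac{R}{R_0},
\qquad
W=p_r\,\frac{\pi\,(R^{2}-R_0^{2})}{2\,\ln(R/R_0)}=\frac{3\,\mu\,Q\,(R^{2}-R_0^{2})}{h^{3}},
$$

so at constant supply flow $Q$ and viscosity $\mu$, a thinner film $h$ raises the recess pressure $p_r$ and the load capacity $W$ roughly as $1/h^{3}$.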
Differential privacy has recently become a widely recognized strict privacy protection model for data release. Differential privacy histogram publishing can directly show the statistical data distribution under the premise of ensuring user privacy for data query, sharing, and analysis. Dynamic data release is a topic with a wide range of current industry needs. However, the amount of data varies considerably over different periods, and unreasonable data processing will result in the risk of users' information leakage and in unavailability of the data. Therefore, we designed a differential privacy histogram publishing method based on a dynamic sliding window and LSTM (DPHP-DL), which can improve data availability on the premise of guaranteeing data privacy. DPHP-DL integrates DSW-LSTM and DPHK+. DSW-LSTM updates the size of the sliding windows based on data value prediction via long short-term memory (LSTM) networks, which evenly divides the data stream into several windows. DPHK+ heuristically publishes non-isometric histograms based on k-means++ clustering that automatically obtains the optimal K, so as to achieve differential privacy histogram publishing of dynamic data. Extensive experiments on real-world dynamic datasets demonstrate the superior performance of DPHP-DL.
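As a simplified sketch of the group-then-perturb idea behind DPHK+ (the automatic choice of K, the LSTM-driven sliding window, and the paper's budget allocation are omitted, and k and epsilon below are hypothetical inputs), a differentially private histogram release could look like this:

```python
import numpy as np
from sklearn.cluster import KMeans

def dp_grouped_histogram(hist, k, epsilon, sensitivity=1.0):
    """Cluster similar bin counts (k-means++ initialization), replace each bin
    by its cluster mean, then add Laplace noise calibrated to epsilon."""
    counts = np.asarray(hist, dtype=float).reshape(-1, 1)
    labels = KMeans(n_clusters=k, init="k-means++", n_init=10).fit_predict(counts)
    cluster_means = {c: float(counts[labels == c].mean()) for c in set(labels)}
    grouped = np.array([cluster_means[c] for c in labels])
    noise = np.random.laplace(scale=sensitivity / epsilon, size=len(grouped))
    return grouped + noise

# Example: a 10-bin histogram released with epsilon = 1.0 and 3 groups.
print(dp_grouped_histogram([12, 14, 13, 40, 42, 41, 5, 6, 4, 5], k=3, epsilon=1.0))
```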
User control over the life cycle of data is of extreme importance in clouds in order to determine whether the service provider adheres to the client's pre-specified needs in the contract between them. Significant client concerns arise regarding aspects such as social issues, location, and the laws to which the data are subject. The problem is magnified further by the lack of transparency of Cloud Service Providers (CSPs). Auditing and compliance enforcement introduce a different set of challenges in cloud computing that are not yet resolved. In this paper, a conducted questionnaire showed that data owners have real concerns not just about the secrecy and integrity of their data in a cloud environment, but also about spatial, temporal, and legal issues related to their data, especially sensitive or personal data. The questionnaire results show the importance for data owners of addressing three major issues: their ability to continue their work, the secrecy and integrity of their data, and the spatial, legal, and temporal constraints related to their data. Although a good volume of work has been dedicated to auditing in the literature, only little work has been dedicated to the fulfillment of the contractual obligations of CSPs. The paper contributes to knowledge by proposing an extension to auditing models to include the fulfillment of contractual obligations alongside the important aspects of secrecy and integrity of client data.
A regional coupled prediction system for the Asia-Pacific (AP-RCP) area (38°E-180°, 20°S-60°N) has been established. The AP-RCP system consists of WRF-ROMS (Weather Research and Forecasting and Regional Ocean Modeling System) coupled models combined with local observational information through dynamically downscaling coupled data assimilation (CDA). The system generates 18-day forecasts of the atmosphere and ocean environment on a daily quasi-operational schedule at the Pilot National Laboratory for Marine Science and Technology (Qingdao) (QNLM), using two coupled models of different resolution: 27 km WRF coupled with 9 km ROMS, and 9 km WRF coupled with 3 km ROMS, while a version of 3 km WRF coupled with 3 km ROMS is in test mode. This study is a first step toward evaluating the impact of a high-resolution coupled model with dynamically downscaling CDA on extended-range predictions, focusing on forecasts of typhoon onset, improved precipitation and typhoon intensity forecasts, and simulation of the Kuroshio current variability associated with mesoscale oceanic activities. The results show that, for realizing the extended-range predictability of the atmospheric and oceanic environment characterized by the statistics of mesoscale activities, a fine-resolution coupled model resolving local mesoscale phenomena with balanced and coherent coupled initialization is a necessary first step. The next challenges include improving the planetary boundary layer physics and the representation of air-sea and air-land interactions to enable the model to resolve kilometer or sub-kilometer processes.
A number of proposals have been suggested to tackle data integrity and privacy concerns in cloud storage, among which some existing schemes suffer from vulnerabilities in data dynamics. In this paper, we propose an improved fairness and dynamic provable data possession scheme that supports public verification and batch auditing while preserving data privacy. The rb23Tree is utilized to facilitate data dynamics. Moreover, fairness is considered to prevent a dishonest user from accusing the cloud service provider of manipulating the data. The scheme allows a third-party auditor (TPA) to verify data integrity without learning any information about the data content during the auditing process. Furthermore, our scheme also allows batch auditing, which greatly accelerates the auditing process when there are multiple auditing requests. Security analysis and extensive experimental evaluations show that our scheme is secure and efficient.
Using data for China for the years 1991 to 2005 by province and employing the semi-parametric panel data model estimation method developed by Horowitz (2004) and Henderson et al. (2006) and Hubler's non-parametric generalized method of moments (GMM) estimation (2005), this article constructs a dynamic semi-parametric panel data model and describes the dynamically changing trajectory of the effect of income disparity among urban residents on consumption. Our findings show that there is a significant "ratchet effect" in the consumption of urban residents; that income disparity among urban residents has a clear negative influence on consumption; and that the trajectory of this influence follows a roughly bimodal curve.
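A generic form consistent with this description, with illustrative variable names rather than the article's exact specification, is

$$
c_{it}=\gamma\, c_{i,t-1}+g(\mathrm{Gini}_{it})+x_{it}^{\mathsf T}\beta+\mu_i+\varepsilon_{it},
$$

where $c_{it}$ is urban residents' consumption, the coefficient $\gamma$ on lagged consumption captures the ratchet effect, $g(\cdot)$ is the nonparametric component in income disparity, and the lagged term is instrumented in the non-parametric GMM step.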
Although China's urban floating population is mainly concentrated in developed cities, flowing from central and western cities to the developed eastern cities, the characteristics of the floating population differ significantly across cities. This paper systematically investigates the spatiotemporal characteristics and influencing factors of the floating population in cities of different levels. The results show that the regional imbalance has further strengthened, the trends of accumulation and dispersion have become increasingly obvious, mobility is positively correlated with city level and scale, and urban agglomerations and core cities remain the polarization centers of the floating population. The flow range is closely related to the urban hierarchy: the higher the city grade, the more the population tends toward inter-provincial flow; the lower the city grade, the more it tends toward intra-urban mobility. Short-term (1-2 years) and long-term (more than 7 years) flows coexist; short-term mobility increases with city grade, while long-term mobility decreases with city grade. Farmers are still the main body of the floating population. Younger age, lower education level, low skills, and a high gender ratio are the basic demographic characteristics of the floating population, although there are differences between cities. The main reasons for migration are seeking jobs and doing business.
The aim of this paper is to provide a clear insight into the determinants of the female employment rate in the European Union, using panel data analyses of 27 member countries of the European Union from 1995 to 2009. Applying dynamic modeling, i.e., generalized method of moments (GMM) econometrics, our findings lead us to a system-estimated model in which the following institutional variables are tested: maternity leave, child care facilities, college education, fertility rate, GDP growth, female unemployment rate, and part-time employment. We expect these variables to have a positive impact on the female employment rate, except for the female unemployment rate and maternity leave.
We propose a new functional single index model, called the dynamic single-index model for functional data, or DSIM, to efficiently capture non-linear and dynamic relationships between a functional predictor and a functional response. The proposed model naturally allows for some curvature not captured by the ordinary functional linear model. Using the proposed two-step estimating algorithm, we develop estimates for both the link function and the regression coefficient function, and then provide predictions of new response trajectories. Besides the asymptotic properties of the estimates of the unknown functions, we also establish the consistency of the predictions of new response trajectories under mild conditions. Finally, we show through extensive simulation studies and a real data example that the proposed DSIM can substantially outperform existing functional regression methods in most settings.
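A single-index form consistent with this description (the paper's exact specification, for example whether the link varies with $t$, is not reproduced here) is

$$
Y(t)=g\!\left(\int_{\mathcal S} X(s)\,\beta(s)\,ds,\;t\right)+\varepsilon(t),
$$

where $X$ is the functional predictor, $\beta(\cdot)$ is the regression coefficient function, and $g$ is the unknown link; both $g$ and $\beta$ are estimated by the two-step algorithm, after which new response trajectories are predicted by plugging in new predictor curves.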