As the volume of healthcare and medical data from diverse sources increases, real-world data-sharing and collaboration scenarios face several challenges, including the risk of privacy leakage, difficulty of data fusion, low reliability of data storage, and low effectiveness of data sharing. To guarantee the service quality of data collaboration, this paper presents a privacy-preserving Healthcare and Medical Data Collaboration Service System combining blockchain with federated learning, termed FL-HMChain. The system is composed of three layers: data extraction and storage, data management, and data application. Focusing on healthcare and medical data, a healthcare and medical blockchain is constructed to realize secure, real-time, reliable, and integrity-preserving data storage, transfer, processing, and access. An improved master-node-selection consensus mechanism is presented to detect and prevent dishonest behavior, ensuring the overall reliability and trustworthiness of the collaborative model-training process. Healthcare and medical data collaboration services in real-world scenarios are also discussed and developed. To further validate the performance of FL-HMChain, a Convolutional Neural Network-based Federated Learning (FL-CNN-HMChain) model is investigated for medical image identification. This model outperforms the baseline Convolutional Neural Network (CNN), with average improvements of 4.7% in Area Under Curve (AUC) and 7% in Accuracy (ACC). Furthermore, the probability of privacy leakage is effectively reduced by the blockchain-based parameter transfer mechanism between local and global models in federated learning.
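The abstract does not specify how local parameters are aggregated into the global model; a minimal FedAvg-style sketch of the federated parameter-transfer idea it describes, with all names hypothetical:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Aggregate per-client parameter lists into global parameters,
    weighting each client by its local dataset size (FedAvg-style)."""
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Two hypothetical clients, each holding one weight matrix
w1 = [np.array([[1.0, 2.0]])]
w2 = [np.array([[3.0, 4.0]])]
agg = federated_average([w1, w2], client_sizes=[100, 300])
print(agg[0])  # weighted toward client 2, which has more data: [[2.5 3.5]]
```

In the system described, only such parameter arrays (never raw patient data) would cross the blockchain, which is what limits privacy leakage.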
The global ionosphere maps (GIM) provided by the International GNSS Service (IGS) are extensively utilized for ionospheric morphology monitoring, scientific research, and practical applications. Assessing the credibility of GIM products in data-sparse regions is of paramount importance. In this study, measurements from the Crustal Movement Observation Network of China (CMONOC) are leveraged to evaluate the suitability of IGS-GIM products over the China region in 2013-2014. The indices of mean error (ME), root mean square error (RMSE), and normalized RMSE (NRMSE) are utilized to quantify the accuracy of IGS-GIM products. Results revealed distinct local-time and latitudinal dependencies in IGS-GIM errors, with substantially higher errors at nighttime (NRMSE: 39%) and above 40° latitude (NRMSE: 49%). Seasonal differences also emerged, with larger equinoctial deviations (NRMSE: 33.5%) compared with summer (20%). A preliminary analysis implied that the irregular assimilation of sparse IGS observations, compounded by China's distinct geomagnetic topology, may manifest as these error variations. These results suggest that modeling based solely on IGS-GIM observations yields inadequate representations across China, and that a thorough examination would provide the necessary foundation for advancing regional total electron content (TEC) constructions.
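The three accuracy indices are standard; a minimal sketch, assuming NRMSE is normalized by the mean reference TEC (the abstract does not state the normalization):

```python
import numpy as np

def tec_error_metrics(gim_tec, ref_tec):
    """ME, RMSE, and NRMSE (RMSE as a percentage of the mean
    reference value) between model TEC and reference TEC."""
    gim_tec = np.asarray(gim_tec, dtype=float)
    ref_tec = np.asarray(ref_tec, dtype=float)
    err = gim_tec - ref_tec
    me = err.mean()                       # signed bias
    rmse = np.sqrt((err ** 2).mean())     # overall error magnitude
    nrmse = 100.0 * rmse / ref_tec.mean() # relative error, in %
    return me, rmse, nrmse

# Toy values in TECU, not the study's data
me, rmse, nrmse = tec_error_metrics([12.0, 8.0], [10.0, 10.0])
print(me, rmse, nrmse)  # → 0.0 2.0 20.0
```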
In an attempt to assess the Kenyan healthcare system, this study looks at the current efforts already in place, the challenges they face, and the strategies that can be put into practice to foster interoperability. By reviewing a variety of literature and using statistics, the paper identifies notable impediments such as the absence of standard protocols, lack of adequate technological infrastructure, and weak regulatory frameworks. These challenges undermine health-provision efforts that depend on enhanced data sharing and integration for better patient outcomes and resource allocation. The study also highlights several opportunities, including the adoption of emerging technologies and the establishment of public-private partnerships to strengthen the healthcare framework. In this regard, the article provides recommendations, based on stakeholder views and global best practices, addressed to policymakers, medical practitioners, and IT specialists concerned with achieving effective interoperability within Kenya's health system. This research is relevant because it adds to the existing literature on how healthcare quality can be improved and made more patient-centered, especially in Kenya.
As an important part of railway lines, the healthy service status of track fasteners is vital to train safety, and deep learning algorithms are becoming an important means of detecting that status. However, when traditional deep learning models are used to detect the service state of track fasteners, detection accuracy and computation speed are often difficult to balance. Targeting this issue, an improved Yolov4 model for detecting the service status of track fasteners is proposed. Firstly, Mixup data augmentation is introduced into the Yolov4 model to enhance its generalization ability. Secondly, the lightweight MobileNet-V2 network is employed in lieu of CSPDarknet53 as the backbone, reducing the number of parameters and improving computational efficiency. Finally, the SE attention mechanism is incorporated to emphasize image features relevant to rail fastener identification, ensuring that the network's focus remains primarily on the fasteners being inspected. The algorithm achieves both high-precision and high-speed detection of the rail fastener service state while keeping the model lightweight. The experimental results show that the mAP of the improved Yolov4-based detection algorithm reaches 83.2%, which is 2.83% higher than that of the traditional Yolov4 model, with computation speed improved by 67.39%. Compared with the traditional Yolov4 model, the proposed method achieves the joint optimization of detection accuracy and computation speed.
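Of the three modifications, Mixup is framework-agnostic and easy to illustrate; a minimal sketch of the augmentation step (toy arrays, not the fastener dataset):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Blend two training samples and their one-hot labels with a
    Beta(alpha, alpha)-distributed coefficient (standard Mixup)."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2  # convex mix of inputs
    y = lam * y1 + (1.0 - lam) * y2  # same mix of labels
    return x, y

# Two hypothetical fastener images (as arrays) with one-hot labels
img_a, img_b = np.zeros((4, 4)), np.ones((4, 4))
lab_a, lab_b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x, y = mixup(img_a, lab_a, img_b, lab_b)
print(x.shape, float(y.sum()))  # mixed image, label mass still sums to 1
```

The mixed labels are soft targets, which is what discourages the detector from memorizing individual training images.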
This paper presents the experience gathered in the Italian alpine city of Bolzano within the project "Bolzano Traffic", whose goal is the introduction of an experimental open ITS platform for local service providers, fostering the diffusion of advanced traveller information services and the future deployment of cooperative mobility systems in the region. Several end-user applications targeted to the needs of different user groups have been developed in collaboration with local companies and research centers; a partnership with the EU Co-Cities project has been activated as well. The implemented services rely on real-time travel and traffic information collected by urban traffic monitoring systems or published by local stakeholders (e.g. public transportation operators). An active involvement of end users, who have recently started testing these demo applications for free, is currently ongoing.
The aim of this work was to determine the spatial distribution of visitor activity in the forests of the Forest Promotional Complex "Sudety Zachodnie" using mobile phone data. The study identified the sites with the highest (hot spot) and lowest (cold spot) use, and analyzed the habitat, stand, demographic, topographic, and spatial factors affecting the distribution of activity. Two approaches were applied: global and local Moran's I coefficients, and a machine learning technique, Boosted Regression Trees. The results show that 11,503,320 visits to forest areas were recorded in "Sudety Zachodnie" in 2019; the most popular season for activities was winter, and the least popular was spring. Using global and local Moran's I coefficients, three small hot clusters of activity and one large cold cluster were identified. Locations with high values and similar neighbours (hot spots) were the most visited forest areas, averaging almost 200,000 visits over 2019, while significantly fewer visits were recorded in cold spots, which averaged about 4,500. The global Moran's I was 0.54, indicating significant positive spatial autocorrelation. Boosted Regression Trees modeling of forest visits, using tree-stand, habitat, and spatial factors, accurately explained 76% of randomly selected input data. The variables with the greatest effect on the distribution of activities were the density of hiking and biking trails and the diversity of topography. The methodology presented in this article allows delineation of Cultural Ecosystem Services hot spots in forest areas based on mobile phone data, and identification of factors that may influence the distribution of visits in forests. Such data are important for managing forest areas and adapting forest management to the needs of society while maintaining ecosystem stability.
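The global Moran's I reported above (0.54) follows the standard formula I = n Σᵢⱼ wᵢⱼ zᵢ zⱼ / (Σᵢⱼ wᵢⱼ · Σᵢ zᵢ²); a compact sketch on a toy one-dimensional grid, not the authors' data:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` under a
    symmetric, unnormalized spatial weight matrix `weights`."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = len(x)
    z = x - x.mean()                       # deviations from the mean
    num = n * (w * np.outer(z, z)).sum()   # cross-products of neighbours
    den = w.sum() * (z ** 2).sum()
    return num / den

# Four cells on a line, adjacent cells are neighbours;
# high values cluster on the left, low on the right
vals = [10.0, 9.0, 1.0, 2.0]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i(vals, w), 3))  # → 0.323, positive: similar values cluster
```

A value near +1 indicates clustering (as in the study's hot/cold spots), near 0 spatial randomness, and negative values a checkerboard-like pattern.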
According to Cisco's Internet Report 2020 white paper, there will be 29.3 billion connected devices worldwide by 2023, up from 18.4 billion in 2018, and 5G connections will generate nearly three times more traffic than 4G connections. While bringing a boom to the network, this also presents unprecedented challenges for flow-forwarding decisions. The path-assignment mechanism used in traditional traffic-scheduling methods tends to concentrate elephant flows and cause local network congestion, resulting in unbalanced network load and degraded quality of service. Using the centralized control of software-defined networks, this study proposes a data center traffic-scheduling strategy for minimizing congestion while guaranteeing quality of service (MCQG). The ideal transmission path is selected for data flows while considering the network congestion rate and quality of service, and different scheduling strategies are used according to the characteristics of the different service types in data centers. Elephant flows, which tend to cause local congestion, are rerouted: a path evaluation function is formed from the maximum link utilization on the path, the number of elephant flows, and the time delay, and the fast optimum-seeking capability of the sparrow search algorithm is used to find the path with the lowest actual link overhead as the rerouting path, reducing the likelihood of local congestion. Equal-cost multi-path (ECMP) protocols, with their faster response time, are used to schedule the shorter-duration mouse flows, guaranteeing network quality of service and achieving isolated transmission of the various types of data streams. The experimental results show that the proposed strategy has higher throughput, better network load balancing, and better robustness than ECMP under different traffic models. In addition, because it can fully utilize the resources in the network, MCQG also outperforms Hedera, another traffic-scheduling strategy that reroutes elephant flows. Compared with ECMP and Hedera, MCQG improves average throughput by 11.73% and 4.29%, normalized total throughput by 6.74% and 2.64%, and link utilization by 23.25% and 15.07%, respectively; in addition, its average round-trip delay and packet loss rate fluctuate significantly less than those of the two compared strategies.
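The abstract names the three inputs of the path evaluation function but not how they are combined; a hypothetical weighted-sum sketch (the weights and candidate paths are assumptions, not the paper's formulation):

```python
def path_cost(max_link_util, elephant_count, delay_ms,
              w_util=0.5, w_eleph=0.3, w_delay=0.2):
    """Hypothetical MCQG-style path evaluation: combines maximum link
    utilization, elephant-flow count, and delay. Lower is better."""
    return (w_util * max_link_util
            + w_eleph * elephant_count
            + w_delay * delay_ms)

# Evaluate two candidate reroute paths for an elephant flow
candidates = {
    "p1": path_cost(max_link_util=0.90, elephant_count=3, delay_ms=1.2),
    "p2": path_cost(max_link_util=0.40, elephant_count=1, delay_ms=2.0),
}
best = min(candidates, key=candidates.get)
print(best)  # → p2: far lower utilization outweighs its extra delay
```

In the actual strategy, the sparrow search algorithm would explore the candidate set instead of this exhaustive `min`, which only works for tiny path sets.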
As part of the product development process, after-sales services are not only a source of innovation but also benefit from value creation through new managerial methodologies aimed at competitive advantage and customer satisfaction. The objective of this paper is to further understand value creation for after-sales services. We present the case of the creation of a new after-sales services business for entrance into a new market, created by two gurus in the aerospace industry. A typology of guidelines is derived, based on organizational and strategic perspectives, for after-sales services value creation, and guidelines for the creation of a new business and for entrance into a new market are presented.
The rapid rise and development of the computer industry in China has promoted the production of and demand for computer diskettes. China, with over 30 diskette production lines and an annual production of 2 billion diskettes, including some for export, has become the biggest diskette-producing country in the world. However, imported diskettes, especially high-grade brands, enjoy a good market here. That is why many world-renowned diskette
JCOMM has a strategy to establish the network of WMO-IOC Centres for Marine-meteorological and Oceanographic Climate Data (CMOCs) under the new Marine Climate Data System (MCDS), launched in 2012 to improve the quality and timeliness of the marine-meteorological and oceanographic data, metadata, and products available to end users. CMOC China, with China as the candidate host, was approved to run on a trial basis after the 4th Meeting of the Joint IOC/WMO Technical Commission for Oceanography and Marine Meteorology (JCOMM). This article describes the development plans of CMOC China for the next few years through a brief introduction to its critical marine data, products and service system, and international cooperation projects.
To achieve the Sustainable Development Goals (SDGs), high-quality data are needed to inform the formulation of policies and investment decisions, to monitor progress towards the SDGs, and to evaluate the impacts of policies. However, the data landscape is changing. With emerging big data and cloud-based services, there are new opportunities for data collection, influencing both official data collection processes and the operation of the programmes they monitor. This paper uses cases and examples to explore the potential of crowdsourcing and public earth observation (EO) data products for monitoring and tracking the SDGs. It suggests that cloud-based services that integrate crowdsourcing and public EO data products provide cost-effective solutions for monitoring and tracking the SDGs, particularly for low-income countries. The paper also discusses the challenges of using cloud services and big data for SDG monitoring: validation and quality control of public EO data are especially important, as otherwise users cannot assess the quality of the data or use it with confidence.
This paper proposes a method of data-flow testing for Web services composition. Firstly, to facilitate data-flow analysis and constraint collection, the existing model representation of the Business Process Execution Language (BPEL) is modified in light of an analysis of data dependencies, and an exact representation of dead path elimination (DPE) is proposed, overcoming the difficulties DPE poses for data-flow analysis. Then, def-use information based on data-flow rules is collected by parsing the BPEL and Web Services Description Language (WSDL) documents, and a def-use-annotated control flow graph is created. Based on this model, data-flow anomalies that indicate potential errors can be discovered by traversing the paths of the graph, and the all-du-paths used in dynamic data-flow testing for Web services composition are generated automatically; testers can then design test cases according to the constraints collected for each selected path.
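A toy illustration of the kind of def-use anomaly detection the paper describes, reduced to a single linear path rather than a BPEL control flow graph (event names are hypothetical):

```python
def dataflow_anomalies(events):
    """Scan a linear sequence of ('def'|'use', var) events for two
    classic data-flow anomalies: a use before any definition, and a
    redefinition with no intervening use."""
    defined, used_since_def = set(), set()
    anomalies = []
    for action, var in events:
        if action == "use":
            if var not in defined:
                anomalies.append(("use-before-def", var))
            used_since_def.add(var)
        elif action == "def":
            if var in defined and var not in used_since_def:
                anomalies.append(("redefine-without-use", var))
            defined.add(var)
            used_since_def.discard(var)
    return anomalies

# One path through a hypothetical composition: x read too early,
# y written twice before it is ever read
events = [("use", "x"), ("def", "y"), ("def", "y"), ("use", "y")]
print(dataflow_anomalies(events))
```

The paper's method does this per du-path over the annotated control flow graph; the bookkeeping per path is the same idea.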
Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, assessing the levels of security within the cloud computing paradigm using engineering methods and techniques applied to CCS. The framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
An ocean state monitoring and analysis radar (OSMAR), developed by Wuhan University in China, has been mounted at six stations along the coasts of the East China Sea (ECS) to measure sea-surface velocities (currents, waves, and winds). Radar-observed surface current is taken as an example to illustrate the operational high-frequency (HF) radar observing and data service platform (OP), presenting an operational flow from data observing, transmitting, processing, and visualizing through to end-user service. Three layers (systems) are introduced, along with the data flow within the platform: the radar observing system (ROS), the data service system (DSS), and the visualization service system (VSS). Surface velocities observed at the stations are synthesized at the radar data receiving and preprocessing center of the ROS and transmitted to the DSS, where data processing and quality control (QC) are conducted. Users can browse the processed data on the DSS portal and access the data files. The VSS better presents the data products by displaying the information on a visual globe. Using the OP, the surface currents in the East China Sea are monitored, and their hourly and seasonal variability is investigated.
Background: Given the importance of customers as the most valuable assets of organizations, customer retention is an essential, basic requirement for any organization, and banks are no exception. The competitive atmosphere within which electronic banking services are provided by different banks increases the necessity of customer retention. Methods: Building on existing information technologies that allow one to collect data from organizations' databases, data mining provides a powerful tool for extracting knowledge from huge amounts of data. In this research, the decision tree technique was applied to build a model incorporating this knowledge. Results: The results characterize churned customers. Conclusions: Bank managers can use the decision tree results to identify likely future churners, and should devise retention strategies for customers whose features increasingly resemble those of churners.
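The decision tree technique used here chooses splits by information gain; a minimal sketch of that core step on a tiny, invented customer table (the feature names are hypothetical, not the study's variables):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label list."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_split(rows, labels):
    """ID3-style choice: the feature index with the highest
    information gain over this table of categorical features."""
    base = entropy(labels)
    gains = {}
    for f in range(len(rows[0])):
        groups = {}
        for row, lab in zip(rows, labels):
            groups.setdefault(row[f], []).append(lab)
        remainder = sum(len(g) / len(labels) * entropy(g)
                        for g in groups.values())
        gains[f] = base - remainder
    return max(gains, key=gains.get)

# Hypothetical features: [account_tenure, complaint_filed]; label: churn?
rows = [("long", "no"), ("long", "yes"), ("short", "no"), ("short", "yes")]
labels = ["stay", "churn", "stay", "churn"]
print(best_split(rows, labels))  # → 1: complaint_filed perfectly splits churn
```

A full tree repeats this split recursively on each subgroup; the resulting rules are what would characterize churned customers.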
In this paper, we present a set of best practices for workflow design and implementation for numerical weather prediction models and meteorological data services, which have been in operation at the China Meteorological Administration (CMA) for years and have proven effective in reliably managing the complexities of large-scale meteorology-related workflows. Based on previous work on these platforms, we argue that a minimum set of guidelines, covering workflow scheme, module design, implementation standards, and maintenance considerations throughout the establishment of the platform, is highly recommended, serving to reduce the need for future maintenance and adjustment. A significant gain in performance can be achieved through workflow-based projects. We believe that a good workflow system plays an important role in the weather forecast service, providing a useful tool for monitoring the whole process, fixing errors, repairing a workflow, or redesigning an equivalent workflow pattern with new components.
The New Austrian Tunneling Method (NATM) has been widely used in the construction of mountain tunnels, urban metro lines, underground storage tanks, underground power houses, mining roadways, and so on. The variation patterns of advance geological prediction data, stress-strain data of supporting structures, and deformation data of the surrounding rock are vitally important in assessing the rationality and reliability of construction schemes, and provide essential information for ensuring the safety and scheduling of tunnel construction. However, as the quantity of these data increases significantly, their uncertainty and discreteness make it extremely difficult to produce a reasonable construction scheme; they also reduce the forecast accuracy of accidents and dangerous situations, creating huge challenges for tunnel construction safety. To solve this problem, a novel data service system is proposed that uses data-association technology and the NATM, with the support of a big data environment. This system can integrate data resources from distributed monitoring sensors during the construction process, and then identify associations and build relations among data resources under the same construction conditions. These data associations and relations are stored in a data pool; as the data pool develops and grows, similar relations can be used under similar conditions to provide data references for construction schematic designs and resource allocation. The proposed data service system also provides valuable guidance for the construction of similar projects.
With the growing popularity of data-intensive services on the Internet, the traditional process-centric model for business processes meets challenges due to its inability to describe data semantics and dependencies, resulting in inflexibility in the design and implementation of processes. This paper proposes a novel data-aware business process model that can describe both explicit control flow and implicit data flow. A data model with dependencies formulated in Linear-time Temporal Logic (LTL) is presented, and their satisfiability is validated by an automaton-based model-checking algorithm. Data dependencies are fully considered in the modeling phase, which helps to improve the efficiency and reliability of programming during the development phase. Finally, a prototype system for data-aware workflow, based on jBPM, was designed using this model and deployed to the Beijing Kingfore heating management system, validating the flexibility, efficacy, and convenience of our approach for large-scale coding and system management in practice.
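The abstract does not give a concrete dependency; as one hypothetical example, a data dependency like "a variable must be written before it is read" is an LTL precedence property (¬read U write, weak until), checkable over a finite execution trace:

```python
def satisfies_precedence(trace, write, read):
    """Check a finite event trace against the LTL-style weak-until
    precedence property: `read` must not occur before `write`."""
    for event in trace:
        if event == read:
            return False   # read seen before any write: violated
        if event == write:
            return True    # write seen first: satisfied
    return True            # neither occurred: vacuously satisfied

# Hypothetical workflow traces for an order-processing process
ok_trace  = ["init", "write_order", "read_order", "ship"]
bad_trace = ["init", "read_order", "write_order"]
print(satisfies_precedence(ok_trace, "write_order", "read_order"))   # True
print(satisfies_precedence(bad_trace, "write_order", "read_order"))  # False
```

The paper's automaton-based model checker validates such properties over all paths of the process model, not just a single recorded trace as in this sketch.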
Cloud computing (CC) is an advanced technology that provides access to predictive resources and data sharing. The cloud environment represents the right type regarding cloud usage model ownership, size, and rights to access, and it defines the scope and nature of cloud computing. In recent times, all processes are fed into the system, for which consumer data and cache size are required. One of the major security issues in the cloud environment is Distributed Denial of Service (DDoS) attacks, which are responsible for cloud server overloading. The proposed system, the ID3 (Iterative Dichotomiser 3) Maximum Multifactor Dimensionality Posteriori Method (ID3-MMDP), is used to overcome this drawback and offers a relatively simple way to detect DDoS attacks. First, the proposed ID3-MMDP method calls on the resources of the cloud platform and then implements attack detection based on information entropy. Because the entropy value reflects the discrete or aggregated characteristics of the current data set, it can be used to detect abnormal data flow: for user-uploaded data, the ID3-MMDP system checks and reads risk measurements and monitors entropy-value changes arising from file size changes, file name changes, and changes in data format. These properties can be used to detect abnormal data services whenever the program encounters a data error. Finally, experiments verify the DDoS attack detection capability of the algorithm.
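Entropy-based DDoS detection rests on one observation: normal traffic is spread across many sources, while flooding traffic concentrates on few. A minimal sketch of that detection signal (the threshold and source labels are assumptions, not the paper's parameters):

```python
from collections import Counter
from math import log2

def normalized_entropy(events):
    """Shannon entropy of an event stream (e.g. source IPs per window),
    normalized to [0, 1]. Near 1: traffic evenly spread across sources
    (normal); near 0: concentrated on few sources (possible flooding)."""
    counts = Counter(events)
    n = len(events)
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    max_h = log2(len(counts)) if len(counts) > 1 else 1.0
    return h / max_h

# Hypothetical per-window source-IP samples
normal = ["a", "b", "c", "d"] * 25      # balanced across four sources
attack = ["a"] * 97 + ["b", "c", "d"]   # one source dominates the window
print(round(normalized_entropy(normal), 2))
print(round(normalized_entropy(attack), 2))
```

A detector would flag a window whenever the normalized entropy drops below a tuned threshold; the ID3-MMDP system applies the same entropy signal to file-level attributes as well.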
Funding (FL-HMChain study): the Science and Technology Projects of the National Archives Administration of China (Grant Number 2022-R-031) and the Fundamental Research Funds for the Central Universities, Central China Normal University (Grant Number CCNU24CG014).
Funding (IGS-GIM study): the National Key R&D Program of China (Grant No. 2022YFF0503702); the National Natural Science Foundation of China (Grant Nos. 42074186, 41831071, 42004136, and 42274195); the Natural Science Foundation of Jiangsu Province (Grant No. BK20211036); the Specialized Research Fund for State Key Laboratories; and the University of Science and Technology of China Research Funds of the Double First-Class Initiative (Grant No. YD2080002013).
文摘The global ionosphere maps(GIM)provided by the International GNSS Service(IGS)are extensively utilized for ionospheric morphology monitoring,scientific research,and practical application.Assessing the credibility of GIM products in data-sparse regions is of paramount importance.In this study,measurements from the Crustal Movement Observation Network of China(CMONOC)are leveraged to evaluate the suitability of IGS-GIM products over China region in 2013-2014.The indices of mean error(ME),root mean square error(RMSE),and normalized RMSE(NRMSE)are then utilized to quantify the accuracy of IGS-GIM products.Results revealed distinct local time and latitudinal dependencies in IGS-GIM errors,with substantially high errors at nighttime(NRMSE:39%)and above 40°latitude(NRMSE:49%).Seasonal differences also emerged,with larger equinoctial deviations(NRMSE:33.5%)compared with summer(20%).A preliminary analysis implied that the irregular assimilation of sparse IGS observations,compounded by China’s distinct geomagnetic topology,may manifest as error variations.These results suggest that modeling based solely on IGS-GIM observations engenders inadequate representations across China and that a thorough examination would proffer the necessary foundation for advancing regional total electron content(TEC)constructions.
文摘In an attempt to assess the Kenyan healthcare system, this study looks at the current efforts that are already in place, what challenges they face, and what strategies can be put into practice to foster interoperability. By reviewing a variety of literature and using statistics, the paper ascertains notable impediments such as the absence of standard protocols, lack of adequate technological infrastructure, and weak regulatory frameworks. Resultant effects from these challenges regarding health provision target enhanced data sharing and merging for better patient outcomes and allocation of resources. It also highlights several opportunities that include the adoption of emerging technologies, and the establishment of public-private partnerships to strengthen the healthcare framework among others. In this regard, the article provides recommendations based on stakeholder views and global best practices addressed to policymakers, medical practitioners, and IT specialists concerned with achieving effective interoperability within Kenya’s health system. This research is relevant because it adds knowledge to the existing literature on how healthcare quality can be improved to make it more patient-centered especially in Kenya.
Abstract: As an important part of railway lines, the service condition of track fasteners is critical to ensuring train safety, and deep learning algorithms are becoming an important means of detecting their service state. However, when traditional deep learning models are used for this task, detection accuracy and computation speed are often difficult to balance. Targeting this issue, an improved Yolov4 model for detecting the service status of track fasteners is proposed. Firstly, Mixup data augmentation is introduced into the Yolov4 model to enhance its generalization ability. Secondly, the lightweight MobileNet-V2 network is employed in lieu of the CSPDarknet53 network as the backbone, thereby reducing the number of parameters and improving computational efficiency. Finally, the SE attention mechanism is incorporated to emphasize relevant image features, ensuring that the network's focus is primarily on the fasteners being inspected. The algorithm achieves both high precision and high speed in rail fastener service state detection while keeping the model lightweight. The experimental results show that the mAP of the improved Yolov4 model reaches 83.2%, which is 2.83% higher than that of the traditional Yolov4 model, while the computation speed is improved by 67.39%. Compared with the traditional Yolov4 model, the proposed method achieves the collaborative optimization of detection accuracy and computation speed.
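The SE (Squeeze-and-Excitation) mechanism the abstract incorporates can be sketched with plain NumPy. The two weight matrices stand in for learned fully-connected layers; their shapes and the channels-last layout are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def se_block(feature_map, weights1, weights2):
    """Squeeze-and-Excitation: reweight channels of an (H, W, C) feature map.

    weights1: (C, C//r) and weights2: (C//r, C) are hypothetical learned
    matrices; in the paper they would come from training the improved Yolov4."""
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    z = feature_map.mean(axis=(0, 1))
    # Excitation: bottleneck FC layer with ReLU, then sigmoid gating
    s = np.maximum(z @ weights1, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(s @ weights2)))
    # Scale: amplify informative channels (e.g. fastener features), damp others
    return feature_map * gate
```

Because the gate is a per-channel scalar in (0, 1), the block adds very few parameters, which is consistent with the lightweight design goal described above.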
Abstract: This paper presents the experience gathered in the Italian alpine city of Bolzano within the project "Bolzano Traffic", whose goal is the introduction of an experimental open ITS platform for local service providers, fostering the diffusion of advanced traveller information services and the future deployment of cooperative mobility systems in the region. Several end-user applications targeted to the needs of different user groups have been developed in collaboration with local companies and research centers; a partnership with the EU Co-Cities project has been activated as well. The implemented services rely on real-time travel and traffic information collected by urban traffic monitoring systems or published by local stakeholders (e.g., public transportation operators). Active involvement of end users, who have recently started testing these demo applications for free, is currently ongoing.
Funding: Funded by the National Science Centre, Poland, under the OPUS call in the Weave programme (project No. 2021/43/I/HS4/01451), and by the Ministry of Education and Science (901503).
Abstract: The aim of this work was to determine the spatial distribution of activity in the forests of the Forest Promotional Complex "Sudety Zachodnie" using mobile phone data. The study identified the sites with the highest (hot spot) and lowest (cold spot) use. Habitat, stand, demographic, topographic, and spatial factors affecting the distribution of activity were also analyzed. Two approaches were applied: global and local Moran's I coefficients, and a machine learning technique, Boosted Regression Trees. The results show that 11,503,320 visits to forest areas were recorded in the "Sudety Zachodnie" in 2019. The most popular season for activities was winter, and the least popular was spring. Using global and local Moran's I coefficients, three small hot clusters of activity and one large cold cluster were identified. Locations with high values and similar neighbours (hot spots) were the most frequently visited forest areas, averaging almost 200,000 visits over 2019. Significantly fewer visits were recorded in cold spots, which averaged about 4,500 visits. The global Moran's I was 0.54, indicating significant positive spatial autocorrelation. Boosted Regression Trees modeling of forest visits, using stand, habitat, and spatial factors, accurately explained 76% of randomly selected input data. The variables with the greatest effect on the distribution of activities were the density of hiking and biking trails and the diversity of the topography. The methodology presented in this article allows the delineation of Cultural Ecosystem Services hot spots in forest areas based on mobile phone data, as well as the identification of factors that may influence the distribution of forest visits. Such data are important for managing forest areas and adapting forest management to the needs of society while maintaining ecosystem stability.
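The global Moran's I statistic reported above (0.54) can be computed directly from a vector of per-site visit counts and a spatial weight matrix. This is a minimal sketch; the toy values and binary adjacency weights are assumptions for illustration:

```python
import numpy as np

def morans_i(values, weight_matrix):
    """Global Moran's I for spatial autocorrelation.

    values: per-location observations (e.g. visit counts).
    weight_matrix: (n, n) spatial weights, zero on the diagonal."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weight_matrix, dtype=float)
    n = len(x)
    dev = x - x.mean()
    # Cross-products of deviations, weighted by spatial proximity
    num = n * np.sum(w * np.outer(dev, dev))
    den = w.sum() * np.sum(dev ** 2)
    return num / den

# Four sites on a line (path adjacency), low values clustered next to
# low and high next to high -> positive autocorrelation.
adjacency = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
i_stat = morans_i([1, 1, 5, 5], adjacency)
```

Values near +1 indicate clustering of similar values (hot/cold spots), near 0 spatial randomness, and near -1 a checkerboard pattern.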
Funding: This work is funded by the National Natural Science Foundation of China under Grant No. 61772180 and the Key R&D Plan of Hubei Province (2020BHB004, 2020BAB012).
Abstract: According to Cisco's Internet Report 2020 white paper, there will be 29.3 billion connected devices worldwide by 2023, up from 18.4 billion in 2018, and 5G connections will generate nearly three times more traffic than 4G connections. While bringing a boom to the network, this also presents unprecedented challenges for flow forwarding decisions. The path assignment mechanism used in traditional traffic scheduling methods tends to cause local network congestion through the concentration of elephant flows, resulting in unbalanced network load and degraded quality of service. Using the centralized control of software-defined networks, this study proposes a data center traffic scheduling strategy for congestion minimization and quality-of-service guarantees (MCQG). The ideal transmission path is selected for data flows while considering the network congestion rate and quality of service, and different scheduling strategies are used according to the characteristics of the different service types in data centers. Elephant flows, which tend to cause local congestion, are rerouted: a path evaluation function is formed from the maximum link utilization on the path, the number of elephant flows, and the time delay, and the fast merit-seeking capability of the sparrow search algorithm is used to find the path with the lowest actual link overhead as the rerouting path, reducing the likelihood of local network congestion. Equal-cost multi-path (ECMP) protocols, with their faster response times, are used to schedule mouse flows of shorter duration, guaranteeing the network's quality of service and achieving isolated transmission of the various types of data streams. The experimental results show that the proposed strategy has higher throughput, better network load balancing, and better robustness than ECMP under different traffic models. In addition, because it can fully utilize the resources in the network, MCQG also outperforms another traffic scheduling strategy that reroutes elephant flows (namely Hedera). Compared with ECMP and Hedera, MCQG improves average throughput by 11.73% and 4.29%, and normalized total throughput by 6.74% and 2.64%, respectively; MCQG improves link utilization by 23.25% and 15.07%; in addition, its average round-trip delay and packet loss rate fluctuate significantly less than those of the two compared strategies.
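The path evaluation step can be sketched as a weighted cost over the three factors the abstract names. The weights and candidate-path data below are illustrative assumptions; in MCQG the search over candidates is performed by the sparrow search algorithm rather than exhaustive enumeration:

```python
def path_cost(max_link_utilization, elephant_count, delay_ms,
              alpha=0.5, beta=0.3, gamma=0.2):
    """Evaluate a candidate path from the three factors in the abstract:
    max link utilization on the path, number of elephant flows, and delay.
    The weights alpha/beta/gamma are illustrative, not the paper's values."""
    return (alpha * max_link_utilization
            + beta * elephant_count
            + gamma * delay_ms)

def pick_reroute_path(candidate_paths):
    """Choose the candidate with the lowest evaluated cost; this is the role
    the sparrow search algorithm plays over large topologies."""
    return min(candidate_paths, key=lambda p: path_cost(*p["metrics"]))

# Hypothetical candidates: (max_link_utilization, elephant_count, delay_ms)
candidates = [
    {"name": "spine-a", "metrics": (0.9, 3, 5.0)},   # congested path
    {"name": "spine-b", "metrics": (0.2, 1, 4.0)},   # lightly loaded path
]
best = pick_reroute_path(candidates)
```

Mouse flows would bypass this evaluation entirely and be hashed onto ECMP paths, which is what keeps their scheduling latency low.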
Abstract: As part of the product development process, after-sales services are not only a source of innovation but also benefit from value creation through new managerial methodologies aimed at competitive advantage and customer satisfaction. The objective of this paper is to further understand value creation for after-sales services. We present the case of the creation of a new after-sales services business for entry into a new market, a business created by two gurus of the aerospace industry. A typology of guidelines for after-sales services value creation is derived, based on organizational and strategic perspectives, and guidelines for the creation of a new business and for entry into a new market are presented.
Abstract: The rapid rise and development of the computer industry in China has promoted the production of, and demand for, computer diskettes. China, with over 30 diskette production lines and an annual production of 2 billion diskettes, including some for export, has become the biggest diskette-producing country in the world. However, imported diskettes, especially high-grade brands, enjoy a good market here. That is why many world-renowned diskette
Abstract: JCOMM has a strategy to establish the network of WMO-IOC Centres for Marine-Meteorological and Oceanographic Climate Data (CMOCs) under the new Marine Climate Data System (MCDS) in 2012, to improve the quality and timeliness of the marine-meteorological and oceanographic data, metadata, and products available to end users. China, as a candidate for CMOC China, was approved to run on a trial basis after the 4th Meeting of the Joint IOC/WMO Technical Commission for Oceanography and Marine Meteorology (JCOMM). This article states the development intentions of CMOC China over the next few years through a brief introduction to its critical marine data, products, service systems, and cooperation projects around the world.
Funding: Funded by the National Key Research and Development Program of China (Grant No. 2016YFA0600304) and the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA19030201).
Abstract: To achieve the Sustainable Development Goals (SDGs), high-quality data are needed to inform the formulation of policies and investment decisions, to monitor progress towards the SDGs, and to evaluate the impacts of policies. However, the data landscape is changing. With emerging big data and cloud-based services, there are new opportunities for data collection, influencing both official data collection processes and the operation of the programmes they monitor. This paper uses cases and examples to explore the potential of crowdsourcing and public earth observation (EO) data products for monitoring and tracking the SDGs. It suggests that cloud-based services that integrate crowdsourcing and public EO data products provide cost-effective solutions for monitoring and tracking the SDGs, particularly for low-income countries. The paper also discusses the challenges of using cloud services and big data for SDG monitoring. Validation and quality control of public EO data are very important; otherwise, users will be unable to assess the quality of the data or use it with confidence.
Funding: Supported by the National Natural Science Foundation of China (60425206, 60503033), the National Basic Research Program of China (973 Program, 2002CB312000), and the Opening Foundation of the State Key Laboratory of Software Engineering at Wuhan University.
Abstract: This paper proposes a method of data-flow testing for Web services composition. Firstly, to facilitate data-flow analysis and constraint collection, the existing model representation of the Business Process Execution Language (BPEL) is modified in line with an analysis of data dependencies, and an exact representation of dead path elimination (DPE) is proposed, overcoming the difficulties DPE poses for data-flow analysis. Then, def-use information based on data-flow rules is collected by parsing BPEL and Web Services Description Language (WSDL) documents, and the def-use-annotated control flow graph is created. Based on this model, data-flow anomalies that indicate potential errors can be discovered by traversing the paths of the graph, and the all-du-paths used in dynamic data-flow testing for Web services composition are automatically generated; testers can then design test cases according to the constraints collected for each selected path.
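The anomaly detection step over an annotated path can be sketched as a linear scan for the classic def-use anomaly patterns. The event encoding below is a simplified stand-in for the def-use annotations the paper attaches to the BPEL control-flow graph:

```python
def dataflow_anomalies(path):
    """Scan a linear path of ('def'|'use', variable) events for anomalies.

    Detects two classic patterns: 'ur' (use before any definition) and
    'dd' (redefinition with no intervening use). A simplified stand-in
    for traversing a def-use-annotated control flow graph."""
    last = {}        # variable -> last operation seen on this path
    anomalies = []
    for op, var in path:
        prev = last.get(var)
        if op == 'use' and prev is None:
            anomalies.append(('ur', var))   # used while still undefined
        elif op == 'def' and prev == 'def':
            anomalies.append(('dd', var))   # defined twice, value wasted
        last[var] = op
    return anomalies

# Hypothetical path extracted from a BPEL process graph.
issues = dataflow_anomalies([('def', 'x'), ('def', 'x'),
                             ('use', 'y'), ('use', 'x')])
```

In the paper's setting, each path fed to such a check would be one of the automatically generated all-du-paths rather than a hand-written list.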
Abstract: Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and the migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, assessing the levels of security within the cloud computing paradigm using engineering methods and techniques applied to CCS. The framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
Funding: The National Natural Science Foundation of China under contract No. 41206012.
Abstract: An ocean state monitor and analysis radar (OSMAR), developed by Wuhan University in China, has been mounted at six stations along the coast of the East China Sea (ECS) to measure velocities (currents, waves, and winds) at the sea surface. Radar-observed surface current is taken as an example to illustrate the operational high-frequency (HF) radar observing and data service platform (OP), presenting an operational flow from data observation, transmission, processing, and visualization through to end-user service. Three layers (systems), namely the radar observing system (ROS), the data service system (DSS), and the visualization service system (VSS), as well as the data flow within the platform, are introduced. Surface velocities observed at the stations are synthesized at the radar data receiving and preprocessing center of the ROS and transmitted to the DSS, where data processing and quality control (QC) are conducted. Users can browse the processed data on the portal of the DSS and access the data files. The VSS aims to better present the data products by displaying the information on a visual globe. By utilizing the OP, the surface currents in the East China Sea are monitored, and their hourly and seasonal variabilities are investigated.
Abstract: Background: Given the importance of customers as the most valuable assets of organizations, customer retention is an essential, basic requirement for any organization, and banks are no exception to this rule. The competitive atmosphere within which electronic banking services are provided by different banks increases the necessity of customer retention. Methods: Building on existing information technologies that allow data to be collected from organizations' databases, data mining provides a powerful tool for extracting knowledge from huge amounts of data. In this research, the decision tree technique was applied to build a model incorporating this knowledge. Results: The results characterize churned customers. Conclusions: Bank managers can identify future churners using the decision tree results, and should devise retention strategies for customers whose characteristics increasingly resemble those of churners.
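The splitting criterion at the heart of entropy-based decision trees (such as ID3/C4.5) can be sketched briefly. The attribute values and churn labels below are hypothetical; a real model would be fit on the bank's customer database:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting on one attribute: the criterion an
    entropy-based decision tree uses to pick features that separate
    churners from retained customers."""
    base = entropy(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(g) / len(labels) * entropy(g)
                    for g in groups.values())
    return base - remainder

# Hypothetical single-attribute customers: activity level vs churn outcome.
rows = [('low',), ('low',), ('high',), ('high',)]
labels = ['churn', 'churn', 'stay', 'stay']
gain = information_gain(rows, labels, 0)
```

An attribute that perfectly separates churners, as above, yields the maximum gain; the tree recursively picks the highest-gain attribute at each node.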
Abstract: In this paper, we present a set of best practices for workflow design and implementation for numerical weather prediction models and meteorological data services, which have been in operation at the China Meteorological Administration (CMA) for years and have proven effective in reliably managing the complexities of large-scale meteorology-related workflows. Based on previous work on these platforms, we argue that a minimum set of guidelines, covering the workflow scheme, module design, implementation standards, and maintenance considerations across the whole establishment of a platform, is highly recommended, serving to reduce the need for future maintenance and adjustment. A significant gain in performance can be achieved through workflow-based projects. We believe that a good workflow system plays an important role in the weather forecast service, providing a useful tool for monitoring the whole process, fixing errors, repairing a workflow, or redesigning an equivalent workflow pattern with new components.
Abstract: The New Austrian Tunneling Method (NATM) has been widely used in the construction of mountain tunnels, urban metro lines, underground storage tanks, underground power houses, mining roadways, and so on. The variation patterns of advance geological prediction data, stress-strain data of supporting structures, and deformation data of the surrounding rock are vitally important in assessing the rationality and reliability of construction schemes, and provide essential information to ensure the safety and scheduling of tunnel construction. However, as the quantity of these data increases significantly, their uncertainty and discreteness make it extremely difficult to produce a reasonable construction scheme; they also reduce the forecast accuracy for accidents and dangerous situations, creating huge challenges for tunnel construction safety. In order to solve this problem, a novel data service system is proposed that uses data-association technology and the NATM, with the support of a big data environment. This system can integrate data resources from distributed monitoring sensors during the construction process, and then identify associations and build relations among data resources under the same construction conditions. These data associations and relations are then stored in a data pool. As the data pool develops and is supplemented, similar relations can be used under similar conditions, providing data references for construction schematic designs and resource allocation. The proposed data service system also provides valuable guidance for the construction of similar projects.
Funding: Supported by the National Natural Science Foundation of China (No. 61502043, No. 61132001), the Beijing Natural Science Foundation (No. 4162042), and the Beijing Talents Fund (No. 2015000020124G082).
Abstract: With the growing popularity of data-intensive services on the Internet, the traditional process-centric model for business processes faces challenges due to its inability to describe data semantics and dependencies, resulting in inflexibility in process design and implementation. This paper proposes a novel data-aware business process model that is able to describe both explicit control flow and implicit data flow. A data model with dependencies formulated in Linear-time Temporal Logic (LTL) is presented, and their satisfiability is validated by an automaton-based model checking algorithm. Data dependencies are fully considered in the modeling phase, which helps to improve the efficiency and reliability of programming during the development phase. Finally, a prototype system for data-aware workflows based on jBPM is designed using this model and has been deployed to the Beijing Kingfore heating management system, validating the flexibility, efficacy, and convenience of our approach for massive coding and large-scale system management in practice.
Abstract: Cloud computing (CC) is an advanced technology that provides access to predictive resources and data sharing. The cloud environment is characterized by its usage model, ownership, size, and access rights, which define the scope and nature of cloud computing. In recent times, all processes are fed into the system, for which consumer data and cache size are required. One of the most serious security issues in the cloud environment is the Distributed Denial of Service (DDoS) attack, which is responsible for cloud server overloading. The proposed ID3 (Iterative Dichotomiser 3) Maximum Multifactor Dimensionality Posteriori Method (ID3-MMDP) is used to overcome this drawback and offers a relatively simple way to detect DDoS attacks. First, the proposed ID3-MMDP method calls on the resources of the cloud platform and then implements attack detection based on information entropy. Because the entropy value can reveal the discrete or aggregated characteristics of the current data set, it can be used to detect abnormal data flows: for user-uploaded data, the ID3-MMDP system checks and reads risk measurement and processing, bug rating, file size changes, file name changes, and changes in the entropy value of the data size and format design. These properties can be used to detect abnormal data services whenever the program encounters a data error. Finally, experiments verify the DDoS attack detection capability of the algorithm.
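The entropy-based detection idea can be sketched on a window of packet source addresses. The fixed thresholds below are illustrative assumptions; in ID3-MMDP the decision boundaries would come from the ID3 procedure rather than hand-picked cut-offs:

```python
import math
from collections import Counter

def normalized_entropy(items):
    """Shannon entropy of a feature distribution, normalized to [0, 1]."""
    n = len(items)
    counts = Counter(items)
    if len(counts) <= 1:
        return 0.0
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(len(counts))

def looks_like_ddos(source_ips, low=0.1, high=0.95):
    """Flag a traffic window whose source-IP entropy is anomalous.

    Very low entropy suggests a single-source flood; very high entropy
    suggests widely spoofed/distributed sources. Thresholds are
    illustrative, not the paper's learned boundaries."""
    h = normalized_entropy(source_ips)
    return h < low or h > high

# Hypothetical windows of packet source IPs.
spoofed = [f"10.0.0.{i}" for i in range(100)]       # all distinct sources
normal = ["a"] * 50 + ["b"] * 30 + ["c"] * 20       # moderately skewed mix
```

The same scheme extends to the other features the abstract lists (file sizes, file names), by computing entropy per feature and feeding the values to the classifier.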