China's mainland has a sparse and uneven distribution of meteorological stations. The estimation accuracy of existing models for creating high-resolution surfaces of meteorological data is limited for air temperature and low for relative humidity and wind speed (with few studies reported). This study compared a typical generalized additive model (GAM) and an autoencoder-based residual neural network (hereafter, residual network) in predicting three meteorological parameters, namely air temperature, relative humidity, and wind speed, using data from 824 monitoring stations across China's mainland in 2015. The performance of the two models was assessed using a 10-fold cross-validation procedure. The air temperature models employ basic variables such as latitude, longitude, elevation, and the day of the year. The relative humidity models additionally employ air temperature and ozone concentration as covariates, while the wind speed models use coarse-resolution wind speed reanalysis data as covariates, in addition to the basic variables. Spatial coordinates represent spatial variation, while the time index of the day captures temporal variation in our spatiotemporal models. Compared with GAM, the residual network considerably improved prediction accuracy: on average, the cross-validation (CV) R2 of the three meteorological parameters rose by 0.21, the CV root-mean-square error (RMSE) fell by 37%, and the relative humidity model improved the most. The accuracy of the relative humidity models improved considerably once a monthly index was included, demonstrating that temporal variables at varied scales are crucial for relative humidity models. We also discussed the benefits and drawbacks of using coarse-resolution reanalysis data and nearest-neighbor values as covariates. Compared with classic GAMs, this study indicates that the residual network model can considerably increase the accuracy of national high-spatial-resolution (1 km) and high-temporal-resolution (daily) meteorological data. Our findings have implications for high-resolution and high-accuracy meteorological parameter mapping in China.
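The 10-fold cross-validation scheme used to score both models can be sketched as follows. A linear least-squares baseline and toy covariates stand in for the actual GAM/residual network and station data, which are not reproduced here; `cross_validate` pools held-out predictions across folds before computing R2 and RMSE.

```python
import numpy as np

def cross_validate(model_fit, X, y, k=10, seed=0):
    """k-fold CV: fit on k-1 folds, predict the held-out fold,
    then compute pooled R^2 and RMSE over all held-out predictions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    preds = np.empty(len(y), dtype=float)
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        predict = model_fit(X[train], y[train])  # model_fit returns a predictor
        preds[fold] = predict(X[fold])
    ss_res = np.sum((y - preds) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y - preds) ** 2))
    return r2, rmse

# Hypothetical linear baseline standing in for the GAM / residual network.
def linear_fit(Xtr, ytr):
    A = np.c_[np.ones(len(Xtr)), Xtr]
    coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    return lambda Xte: np.c_[np.ones(len(Xte)), Xte] @ coef

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 4))  # toy stand-ins for lat, lon, elevation, day-of-year
y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + rng.normal(0, 0.1, 200)
r2, rmse = cross_validate(linear_fit, X, y)
```

The same harness works for any `model_fit` that returns a prediction function, so the two models in the study can be compared on identical folds.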
In integrating geo-spatial datasets, layers are sometimes unable to overlay each other perfectly. In most cases, the cause of misalignment is cartographic variation of the objects forming features in the datasets. This could be due to actual changes on the ground, or to the collection or storage approaches used, leading to overlaps or openings between features. In this paper, we present an alignment method that uses adjustment algorithms to update the geometry of features within a dataset, or within complementary adjacent datasets, so that they align to achieve perfect integration. The method identifies every unique spatial instance in the datasets and the spatial points that define their geometry; the differences are compared and used to compute the alignment parameters. This provides a uniform alignment of geo-spatial features that takes into consideration changes in the different datasets being integrated, without affecting topology and attributes.
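The core adjustment step, reduced here to a pure-translation least-squares fit between corresponding vertices, can be sketched as below. This is a simplified stand-in for the paper's adjustment algorithm, which may also handle rotation and scale; the coordinates are invented.

```python
import numpy as np

def alignment_shift(ref_pts, src_pts):
    """Least-squares translation that best aligns src_pts onto ref_pts.
    ref_pts, src_pts: (n, 2) arrays of corresponding vertex coordinates
    from the two datasets being integrated."""
    return (ref_pts - src_pts).mean(axis=0)  # optimal translation in the L2 sense

def align(features, shift):
    """Apply the computed adjustment to every vertex of every feature."""
    return [pts + shift for pts in features]

ref = np.array([[10.0, 20.0], [11.0, 21.0], [12.0, 19.5]])
src = ref + np.array([0.4, -0.3])  # misaligned copy of the same feature
shift = alignment_shift(ref, src)
aligned = align([src], shift)[0]
```

Because the shift is computed once from the vertex differences and applied uniformly, the relative geometry (and hence topology) of each feature is preserved.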
Through analyzing the principle of data sharing in the database system, this paper discusses the principle and method for integrating and sharing GIS data via a data engine, introduces a way to achieve high integration and sharing of GIS data on the basis of VCT in VC++, and provides a method for uniting VCT into an RDBMS in order to implement a spatial database with an object-oriented data model.
This paper presents a set of digital watermarking techniques by which copyright and user-rights messages are hidden in geo-spatial graphics data, as well as techniques for compressing and encrypting the watermarked geo-spatial graphics data. The technology aims at tracing and resisting illegal distribution and duplication of geo-spatial graphics data products, so as to effectively protect the data producer's rights and to facilitate secure sharing of geo-spatial graphics data. So far in the GIS field worldwide, little research has been conducted on digital watermarking. This research is a novel exploration both in the security management of geo-spatial graphics data and in the applications of digital watermarking techniques. Application software employing the proposed technology has been developed. A number of experimental tests on the 1:500,000 digital bathymetric chart of the South China Sea and the 1:10,000 digital topographic map of Jiangsu Province have been conducted to verify the feasibility of the proposed technology.
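To illustrate the general idea of hiding watermark bits in vector coordinates, here is a toy parity/LSB scheme: each coordinate is quantized to a step `scale` and its parity carries one bit. This is not the paper's actual algorithm, and the coordinates and quantization step are invented.

```python
import numpy as np

def embed_watermark(coords, bits, scale=1e-4):
    """Hide one watermark bit per coordinate in the parity of its
    least-significant quantized digit (toy LSB-style scheme)."""
    q = np.round(coords / scale).astype(np.int64)
    q = (q // 2) * 2 + np.resize(bits, q.shape)  # force parity = bit
    return q * scale

def extract_watermark(coords, n_bits, scale=1e-4):
    q = np.round(coords / scale).astype(np.int64)
    return (q % 2).ravel()[:n_bits]

pts = np.array([121.47432, 31.23041, 118.79647, 32.05838])  # invented coordinates
bits = np.array([1, 0, 1, 1])
marked = embed_watermark(pts, bits)
recovered = extract_watermark(marked, 4)
```

The geometric distortion is bounded by roughly 1.5 times the quantization step, which is why such schemes pick `scale` below the data's nominal accuracy.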
Many fields, such as neuroscience, are experiencing a vast proliferation of cellular data, underscoring the need for organizing and interpreting large datasets. A popular approach partitions data into manageable subsets via hierarchical clustering, but objective methods to determine the appropriate classification granularity are missing. We recently introduced a technique to systematically identify when to stop subdividing clusters, based on the fundamental principle that cells must differ more between clusters than within them. Here we present the corresponding protocol to classify cellular datasets by combining data-driven unsupervised hierarchical clustering with statistical testing. These general-purpose functions are applicable to any cellular dataset that can be organized as a two-dimensional matrix of numerical values, including molecular, physiological, and anatomical datasets. We demonstrate the protocol using cellular data from the Janelia MouseLight project to characterize morphological aspects of neurons.
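The stopping principle, that cells must differ more between clusters than within them, can be sketched with SciPy's hierarchical clustering plus a significance test on pairwise distances. This is a simplification of the published protocol; the test, the margin threshold, and the toy data are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform
from scipy.stats import ttest_ind

def split_stats(X, labels):
    """Mean within/between pairwise distances and a one-sided Welch
    t-test p-value for 'between exceeds within'."""
    D = squareform(pdist(X))
    i, j = np.triu_indices(len(X), k=1)
    same = labels[i] == labels[j]
    within, between = D[i, j][same], D[i, j][~same]
    _, p = ttest_ind(between, within, equal_var=False, alternative='greater')
    return within.mean(), between.mean(), p

def should_split(X, labels, alpha=0.01, margin=1.5):
    # margin is an illustrative effect-size guard, not the protocol's value
    w, b, p = split_stats(X, labels)
    return p < alpha and b / w > margin

rng = np.random.default_rng(0)
separated = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(6, 1, (30, 5))])
labels = fcluster(linkage(separated, method='ward'), t=2, criterion='maxclust')
homogeneous = rng.normal(0, 1, (60, 5))
labels_h = fcluster(linkage(homogeneous, method='ward'), t=2, criterion='maxclust')
```

Applied recursively down the dendrogram, such a test stops subdivision once a forced split no longer separates cells more between groups than within them.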
There is a growing body of clinical research on the utility of synthetic data derivatives, an emerging research tool in medicine. In nephrology, clinicians can use machine learning and artificial intelligence as powerful aids in their clinical decision-making while also preserving patient privacy. This is especially important given the worldwide epidemiology of chronic kidney disease, renal oncology, and hypertension. However, there remains a need for a framework to guide how synthetic data can be better utilized as a practical tool in this research.
Given the rapid urban growth in recent China, this study takes Beijing as a case and proposes that urban sprawl can be measured from spatial configuration, urban growth efficiency, and external impacts; it then develops a geo-spatial index system for measuring sprawl, comprising 13 indicators in total. To calculate these indices, data from different sources are selected, including land use maps, former land use planning, land price and floor-area-ratio samples, digitized maps of highways and city centers, population and GDP statistics, etc. Various GIS spatial analysis methods are used to spatialize these indices into 100 m × 100 m cells. In addition, an integrated urban sprawl index is calculated as a weighted sum of the 13 indices. The application results indicate that the geo-spatial index system can capture most of the typical features and internal differences of urban sprawl. Construction land in Beijing has kept growing rapidly in large amounts, with low efficiency and a disordered spatial configuration, indicating a typical sprawling tendency. The following specific sprawl features are identified by the indicators: (1) typical sprawling spatial configuration: obvious fragmentation and irregularity of the landscape due to unsuccessful enforcement of land use planning, and unadvisable patterns of discontinuous development, strip development, and leapfrog development; (2) low efficiency of sprawl: low development density, and low population density and economic output in newly developed areas; and (3) negative impacts on agriculture, the environment, and city life. According to the integrated sprawl index, the sprawling amount in the northern part is larger than that in the southern part, but the sprawling extent shows the reverse pattern; most sprawling areas include the margins of the near suburbs and the areas between highways, etc. Four sprawling patterns are identified: random expansion at the urban fringe, strip development along or between highways, scattered development of industrial land, and leapfrog development of urban residential and industrial areas.
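The integrated index construction, a weighted sum of normalized indicators per grid cell, can be sketched as follows. The three toy indicators and their weights are purely illustrative, not the study's 13 actual indicators or weights.

```python
import numpy as np

def integrated_sprawl_index(indices, weights):
    """Weighted sum of min-max normalized sprawl indicators per grid cell.
    indices: (n_indicators, n_cells) raw indicator values
    weights: (n_indicators,) weights summing to 1."""
    mins = indices.min(axis=1, keepdims=True)
    spans = indices.max(axis=1, keepdims=True) - mins
    normed = (indices - mins) / np.where(spans == 0, 1, spans)  # guard constant rows
    return weights @ normed

raw = np.array([[0.2, 0.8, 0.5],    # e.g. landscape fragmentation per cell
                [10., 40., 25.],    # e.g. leapfrog distance
                [1.0, 3.0, 2.0]])   # e.g. inverse development density
w = np.array([0.5, 0.3, 0.2])       # illustrative weights
isi = integrated_sprawl_index(raw, w)
```

Min-max normalization puts indicators with different units on a common 0-1 scale before weighting, so no single indicator dominates by magnitude alone.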
Tourism is a rapidly growing investment sector in Sri Lanka, where huge investment is taking place. Even though the investment is massive, planning, development, and marketing are key components of success in tourism zone enhancement. The main objective of this study was to implement a geo-spatial information system for the development of tourism in Kandy district. Primary data collection methods, i.e. questionnaire surveys, interviews, focus group interviews, and observations, were employed. Google Maps with Google API standards, which are specially designed for developers and computer programmers, were used to implement the system. System requirements were identified by interviewing tourists and by observations made at tourist sites. Proximity analysis, spatial join, and network analysis with the Google Directions application program interface (API) and the Google Places API were used to analyze the data. The study highlights potential tourist attractions, their accessibility, and other required details through a web output. The issues and challenges faced by travelers are mainly a lack of specific location information, public transport schedules, and reliable tourist attraction information. The online geo-spatial information system created in this study provides a guide for tourists to find destination routes, service areas, and all necessary details on particular destinations.
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In a traditional blockchain, data are stored in a Merkle tree. As data continue to grow, the scale of the proofs used to validate them grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new vector commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by IIoT. PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. This mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
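The Merkle-proof growth that motivates PVC can be illustrated directly: a membership proof contains one sibling hash per tree level, so it scales as O(log n) in the number of stored records. The sketch below builds a small Merkle tree and verifies a proof (PVC itself, which replaces this structure, is not reproduced here; the record contents are invented).

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root_and_proof(leaves, index):
    """Build a Merkle root and the membership proof for leaves[index].
    The proof holds one sibling hash per level: O(log n) in len(leaves)."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        sib = index ^ 1
        proof.append((sib < index, level[sib]))  # (sibling-is-left, hash)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return level[0], proof

def verify(root, leaf, proof):
    node = h(leaf)
    for is_left, sib in proof:
        node = h(sib + node) if is_left else h(node + sib)
    return node == root

leaves = [f"iiot-record-{i}".encode() for i in range(8)]  # toy IIoT records
root, proof = merkle_root_and_proof(leaves, 5)
```

With 8 leaves the proof already needs 3 hashes; doubling the data adds another, which is the per-proof cost a constant-size vector commitment avoids.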
To address the problems of a single encryption algorithm, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based secure storage scheme for big data. First, in order to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate the nodes and achieve dual-channel storage. Then, we improve the ECC encryption algorithm for the encryption of ordinary data and adopt a homomorphic encryption algorithm for data that needs to be computed on. To accelerate encryption, we adopt a dual-thread encryption mode. Finally, an HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution solves the single-point-of-failure problem for metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm, integrated with the dual-channel storage mode, improves encryption storage efficiency by 27.6% on average.
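The dual-thread encryption mode can be sketched as below: the plaintext is split in two and both halves are encrypted concurrently. A toy SHA-256-counter stream cipher stands in for the paper's improved ECC and homomorphic ciphers, which are not reproduced; the block name and key are invented.

```python
import threading
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy SHA-256-in-counter-mode keystream (illustrative stand-in only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

def dual_thread_encrypt(data: bytes, key: bytes) -> bytes:
    """Split the data in two and encrypt both halves concurrently,
    mirroring the scheme's dual-thread encryption mode."""
    mid = len(data) // 2
    parts, out = [data[:mid], data[mid:]], [b"", b""]
    def work(i):
        out[i] = xor_encrypt(parts[i], key + bytes([i]))  # per-half subkey
    threads = [threading.Thread(target=work, args=(i,)) for i in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return out[0] + out[1]

ct = dual_thread_encrypt(b"hdfs-block-0042: sensor payload", b"secret")
```

Because the toy cipher is a symmetric XOR stream, applying the same function again with the same key decrypts the data; a real deployment would use the scheme's actual ciphers.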
Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time-series data into images for analysis have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. The data augmentation method is the addition of noise: Gaussian noise, with the noise level set to 0.002, is added to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time-series data into images. This enables the identification of patterns in time-series data and helps capture their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, and the detected anomalous areas are represented as heat maps. By applying an anomaly map to the original image, it is possible to locate the areas where anomalies occur. The performance evaluation shows that both the F1-score and accuracy are high when time-series data are converted to images. Additionally, when processed as images rather than as time series, there was a significant reduction in both data size and training time. The proposed method can provide an important springboard for research in anomaly detection using time-series data, and it helps address problems such as analyzing complex patterns in data in a lightweight manner.
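The MTF transformation of a series into an image can be sketched as follows: quantile-bin the values, estimate the first-order Markov transition matrix between bins, then map every pair of time points (i, j) to the transition probability from bin(x_i) to bin(x_j). Quantile binning and the bin count here are common choices, not necessarily the paper's settings.

```python
import numpy as np

def markov_transition_field(x, n_bins=4):
    """Markov Transition Field: turn a 1-D series into an (n, n) image
    of state-transition probabilities."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(x, edges)                 # bin index of each point
    W = np.zeros((n_bins, n_bins))
    for s, t in zip(states[:-1], states[1:]):      # count observed transitions
        W[s, t] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)  # row-normalize safely
    return W[np.ix_(states, states)]               # pixel (i, j) = P(state_i -> state_j)

x = np.sin(np.linspace(0, 4 * np.pi, 64))  # toy periodic signal
mtf = markov_transition_field(x, n_bins=4)
```

The resulting probability image preserves temporal ordering along both axes, which is what lets an image-based detector such as PatchCore pick up sequential anomalies.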
Given the particular potential of ports on international coastlines, such as the Makoran Sea (Iran), for goods and human smuggling, the national level of coastline security is very important. Such ports can play a significant role in the development of power and security. Based on military reviews and analyses, police station location and field-of-view monitoring along coastlines are strategic issues in modern security development. This research proposes a tool for the development of coastal roads and coastal walking routes for the deployment of police. The main focuses are the monitored field of view and accessibility to the strategic coastline. GIS tools play an essential role in producing important security maps. Chabahar Port in Iran, the most important port on the Makoran Sea, was selected as the study area because of its strategic role in the national economy and security. The research method focused on these major axes: the successful establishment of police stations along the shoreline to increase monitoring and coastal security, and suitable patrol routes for police cars on the coastal roads. This study adopts a scientific approach to the analysis of present and future development in urban and security planning for coastal towns at the national and regional levels.
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
There are challenges in the reliability evaluation of insulated gate bipolar transistors (IGBTs) on electric vehicles, such as junction temperature measurement and limited computational and storage resources. In this paper, a junction temperature estimation approach based on a neural network, without additional cost, is proposed, and the lifetime calculation for IGBTs using electric vehicle big data is performed. The direct current (DC) voltage, operating current, switching frequency, negative temperature coefficient (NTC) thermistor temperature, and IGBT lifetime are the inputs, and the junction temperature (T_(j)) is the output. With the rainflow counting method, the classified irregular temperatures are fed into the lifetime model to obtain the cycles to failure. A fatigue accumulation method is then used to calculate the IGBT lifetime. To work around the limited computational and storage resources of electric vehicle controllers, the IGBT lifetime calculation runs on a big data platform. The lifetime is then transmitted wirelessly to the electric vehicles as an input to the neural network, so the junction temperature of the IGBT under long-term operating conditions can be accurately estimated. A test platform combining the motor controller with the vehicle big data server was built for the IGBT accelerated aging test. Subsequently, IGBT lifetime predictions were derived from the junction temperature estimates of both the neural network method and the thermal network method. The experiment shows that the lifetime prediction based on a neural network with big data has higher accuracy than that of the thermal network, which improves the reliability evaluation of the system.
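The lifetime step, rainflow-counted temperature swings fed into a cycles-to-failure model and accumulated with Miner's rule, can be sketched as below. The Coffin-Manson coefficients and the cycle histogram are illustrative assumptions, not the paper's fitted values or measured data.

```python
def coffin_manson_cycles(delta_tj, a=3.0e14, n=-5.0):
    """Cycles to failure for a junction-temperature swing (K), using the
    common Coffin-Manson form Nf = a * dTj^n (illustrative coefficients)."""
    return a * delta_tj ** n

def miner_damage(counted_cycles):
    """Palmgren-Miner linear damage accumulation over rainflow-counted
    temperature swings: failure is predicted when damage reaches 1."""
    return sum(n_i / coffin_manson_cycles(dt) for dt, n_i in counted_cycles)

# Hypothetical rainflow-counting output: (swing amplitude in K, cycle count)
histogram = [(20.0, 5000), (40.0, 800), (60.0, 50)]
damage = miner_damage(histogram)
remaining_life_fraction = 1.0 - damage
```

Larger swings cost disproportionately more life under the power law, which is why accurate junction temperature estimation matters so much for the damage sum.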
As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study aims to explore a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data. The detection of atmospheric turbulence is approached as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events. Moreover, comparisons with alternative machine learning techniques indicate that the proposed technique is the optimal methodology currently available. In summary, the use of symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to that with established EDR approaches and surpassing that achieved with machine learning algorithms. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amidst rising environmental and operational hazards.
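A minimal flavor of symbolic classification, an expression tree over flight-data features whose sign gives the class, can be sketched as follows. The feature name, toy data, and random-search loop (standing in for full genetic programming with crossover and mutation) are all invented for illustration.

```python
import random

# Toy operator set for expression trees; expressions are nested tuples,
# feature names (strings), or numeric constants.
OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
       '*': lambda a, b: a * b, 'max': max}

def evaluate(expr, row):
    if isinstance(expr, str):
        return row[expr]
    if isinstance(expr, (int, float)):
        return expr
    op, a, b = expr
    return OPS[op](evaluate(a, row), evaluate(b, row))

def random_expr(feats, depth=2):
    if depth == 0 or random.random() < 0.3:
        return random.choice(feats + [round(random.uniform(-2, 2), 2)])
    op = random.choice(list(OPS))
    return (op, random_expr(feats, depth - 1), random_expr(feats, depth - 1))

def accuracy(expr, rows, labels):
    hits = sum((evaluate(expr, r) > 0) == y for r, y in zip(rows, labels))
    return hits / len(rows)

random.seed(0)
# Hypothetical QAR-derived feature: std of vertical acceleration per window.
rows = [{'vacc_std': v} for v in [0.1, 0.2, 0.3, 0.9, 1.1, 0.7]]
labels = [False, False, False, True, True, True]
best = max((random_expr(['vacc_std']) for _ in range(500)),
           key=lambda e: accuracy(e, rows, labels))
```

A hand-written tree like `('-', 'vacc_std', 0.5)` classifies this toy set perfectly; genetic programming automates the search for such expressions over real QAR features.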
Since the impoundment of the Three Gorges Reservoir (TGR) in 2003, numerous slopes have experienced noticeable movement or destabilization owing to reservoir level changes and seasonal rainfall. One case is the Outang landslide, a large-scale and active landslide on the south bank of the Yangtze River. The latest available monitoring data and site investigations are analyzed to establish the spatial and temporal deformation characteristics of the landslide. Data mining technology, including two-step clustering and the Apriori algorithm, is then used to identify the dominant triggers of landslide movement. In the data mining process, the two-step clustering method clusters the candidate triggers and the displacement rate into several groups, and the Apriori algorithm generates correlation criteria for cause and effect. The analysis considers multiple locations on the landslide and incorporates two time scales: long-term deformation on a monthly basis and short-term deformation on a daily basis. This analysis shows that the deformation of the Outang landslide is driven by both rainfall and reservoir water, while it varies spatiotemporally mainly due to differences in local responses to hydrological factors. The data mining results reveal different dominant triggering factors depending on the monitoring frequency: the monthly and bi-monthly cumulative rainfall control the monthly deformation, while the 10-day cumulative rainfall and the 5-day cumulative drop of the reservoir water level dominate the daily deformation of the landslide. It is concluded that the spatiotemporal deformation pattern and the data mining rules associated with precipitation and reservoir water level have the potential to be broadly applied for improving landslide prevention and control in dam reservoirs and other landslide-prone areas.
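The Apriori step, mining frequent trigger-deformation itemsets and deriving rule confidence, can be sketched on toy daily records as below. The trigger and deformation categories are invented; the study's actual items come from the clustered hydrological and displacement-rate groups.

```python
from itertools import combinations

def apriori(transactions, min_support=0.5):
    """Plain Apriori: grow candidate itemsets level by level and prune
    any candidate whose support falls below min_support."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    freq = {}
    level = [frozenset([i]) for i in items]
    while level:
        counts = {c: sum(c <= t for t in transactions) for c in level}
        kept = {c: v / n for c, v in counts.items() if v / n >= min_support}
        freq.update(kept)
        keys = list(kept)
        level = list({a | b for a, b in combinations(keys, 2)
                      if len(a | b) == len(a) + 1})  # join step
    return freq

def confidence(freq, antecedent, consequent):
    return freq[antecedent | consequent] / freq[antecedent]

# Toy daily records: each day tagged with trigger and deformation classes.
days = [frozenset(t) for t in [
    {'heavy_rain', 'fast_move'}, {'heavy_rain', 'fast_move'},
    {'heavy_rain', 'drawdown', 'fast_move'}, {'drawdown', 'slow_move'},
]]
freq = apriori(days, min_support=0.5)
conf = confidence(freq, frozenset({'heavy_rain'}), frozenset({'fast_move'}))
```

A high-confidence rule such as heavy_rain → fast_move is exactly the kind of cause-and-effect criterion the study extracts to rank candidate triggers.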
A benchmark experiment on ^(238)U slab samples was conducted using a deuterium-tritium neutron source at the China Institute of Atomic Energy. The leakage neutron spectra in the 0.8-16 MeV energy range at 60° and 120° were measured using the time-of-flight method. The samples were prepared as rectangular slabs with a 30 cm square base and thicknesses of 3, 6, and 9 cm. The leakage neutron spectra were also calculated using the MCNP-4C program based on the latest evaluated ^(238)U neutron data files from CENDL-3.2, ENDF/B-VIII.0, JENDL-5.0, and JEFF-3.3. Based on the comparison, the deficiencies in and improvements to the evaluated ^(238)U nuclear data were analyzed. The results showed the following: (1) the calculated results for CENDL-3.2 significantly overestimated the measurements in the elastic scattering energy interval at 60° and 120°; (2) the calculated results for CENDL-3.2 overestimated the measurements in the inelastic scattering energy interval at 120°; (3) the calculated results for CENDL-3.2 significantly overestimated the measurements in the 3-8.5 MeV energy interval at 60° and 120°; and (4) the calculated results with JENDL-5.0 were generally consistent with the measurements.
Funding: Under the auspices of the Jiangsu Provincial Science and Technology Foundation of Surveying and Mapping (No. 200416) (for the geo-spatial graphics data watermarking study above).
Funding: Supported in part by NIH grants R01NS39600, U01MH114829, and RF1MH128693 (to GAA) (for the cellular dataset classification protocol above).
Abstract: There is a growing body of clinical research on the utility of synthetic data derivatives, an emerging research tool in medicine. In nephrology, clinicians can use machine learning and artificial intelligence as powerful aids in their clinical decision-making while also preserving patient privacy. This is especially important given the worldwide epidemiology of chronic kidney disease, renal oncology, and hypertension. However, there remains a need for a framework guiding how to better utilize synthetic data as a practical tool in this research.
Funding: National Natural Science Foundation of China, No. 40571056; Sustentation Fund on Doctoral Thesis from the Beijing Science and Technology Committee, No. ZZ0608.
Abstract: Concerned with the rapid urban growth in contemporary China, this study takes Beijing as a case and proposes that urban sprawl can be measured in terms of spatial configuration, urban growth efficiency, and external impacts. It then develops a geo-spatial index system for measuring sprawl, comprising 13 indicators. To calculate these indices, data from different sources are used, including land use maps, previous land use plans, land price and floor-area-ratio samples, digitized maps of highways and city centers, and population and GDP statistics. Various GIS spatial analysis methods are used to spatialize these indices into 100 m × 100 m cells, and an integrated urban sprawl index is calculated as a weighted sum of the 13 indices. The application results indicate that the geo-spatial index system captures most of the typical features and internal differences of urban sprawl. Construction land in Beijing has kept growing rapidly, with large volume, low efficiency, and a disordered spatial configuration, indicating a typical sprawling tendency. The following specific sprawl features are identified by the indicators: (1) a typical sprawling spatial configuration, with obvious fragmentation and irregularity of the landscape due to unsuccessful enforcement of land use planning and unadvisable patterns of discontinuous, strip, and leapfrog development; (2) low sprawl efficiency, with low development density and low population density and economic output in newly developed areas; and (3) negative impacts on agriculture, the environment, and city life. According to the integrated sprawl index, the amount of sprawl in the northern part is larger than in the southern part, but the extent of sprawl shows the converse pattern; the most sprawl-affected areas include the margins of the near suburbs and the areas between highways.
Four sprawling patterns are identified: random expansion at the urban fringe, strip development along or between highways, scattered development of industrial land, and leapfrog development of urban residential and industrial areas.
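The integrated index described above, a weighted sum of 13 spatialized indicator layers, can be sketched as follows. The grid size, random indicator values, and equal weights are placeholders, since the study's actual indicator values and weights are not given here:

```python
import numpy as np

rng = np.random.default_rng(42)
n_indices, rows, cols = 13, 4, 5                   # 13 indicators on a toy cell grid
indicators = rng.random((n_indices, rows, cols))   # each layer normalized to [0, 1]
weights = np.full(n_indices, 1.0 / n_indices)      # equal weights as a placeholder

# Integrated sprawl index per cell = weighted sum over the 13 indicator layers.
sprawl = np.tensordot(weights, indicators, axes=1)
```

With normalized layers and weights summing to 1, the integrated index stays in [0, 1], which makes cells directly comparable across the study area.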
Abstract: Tourism is a rapidly growing investment sector in Sri Lanka, where huge investment is taking place. Even though the investment is massive, planning, development, and marketing are key components of success in tourism zone enhancement. The main objective of this study was to implement a geo-spatial information system for the development of tourism in Kandy district. Primary data collection methods, i.e., questionnaire surveys, interviews, focus group interviews, and observations, were employed. Google Maps with Google API standards, which are specially designed for developers and computer programmers, was used to implement the system. System requirements were identified by interviewing tourists and by observations made at tourist sites. Proximity analysis, spatial joins, and network analysis with the Google Directions application program interface (API) and Google Places API were used to analyze the data. The study highlights potential tourist attractions, their accessibility, and other required details through a web output. The issues and challenges faced by travelers are mainly a lack of specific location information, public transport schedules, and reliable tourist attraction information. The online geo-spatial information system created in this study provides a guide for tourists to find destination routes, service areas, and all necessary details on particular destinations.
Funding: Supported by the National Natural Science Foundation of China (Nos. 62072249, 62072056). This work is also funded by the Natural Science Foundation of Hunan Province (2020JJ2029).
Abstract: With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly remedy the inherent defects of the IIoT. In a traditional blockchain, data are stored in a Merkle tree; as the data grow, so does the size of the proofs used to validate them, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new vector commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by the IIoT; PVC improves the efficiency of traditional VC in both the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. The mechanism greatly reduces communication overhead and makes rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
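The proof-growth problem that motivates PVC can be seen in a minimal Merkle tree sketch: the inclusion proof for one record contains one sibling hash per tree level, so its size grows as log2 of the number of records. The record contents and tree size below are illustrative, and the leaf count is assumed to be a power of two for simplicity:

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root_and_proof(leaves, index):
    """Build a Merkle tree over the leaves; return the root and one leaf's proof path."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        proof.append(level[index ^ 1])             # sibling hash at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return level[0], proof

def verify(leaf, index, proof, root):
    """Recompute the root from a leaf and its sibling path."""
    node = h(leaf)
    for sib in proof:
        node = h(node + sib) if index % 2 == 0 else h(sib + node)
        index //= 2
    return node == root

leaves = [f"iiot-record-{i}".encode() for i in range(16)]
root, proof = merkle_root_and_proof(leaves, 5)
# Proof size grows logarithmically with the data: log2(16) = 4 sibling hashes.
```

A vector commitment aims to shrink this opening proof (ideally to constant size), which is the efficiency gap the paper's PVC construction targets.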
Abstract: To address the problems of single encryption algorithms, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in cloud computing environments, we propose a Hadoop-based secure storage scheme for big data. First, to distribute the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate the nodes and achieve dual-channel storage. Then, we improve the ECC encryption algorithm for encrypting ordinary data and adopt a homomorphic encryption algorithm for data that need to be computed on. To accelerate encryption, we adopt a dual-thread encryption mode. Finally, an HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution solves the single-point-of-failure problem for metadata, performs well in terms of metadata reliability, and realizes server fault tolerance. The improved encryption algorithm, integrated with the dual-channel storage mode, improves encryption storage efficiency by 27.6% on average.
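The dual-thread encryption mode can be sketched as follows. The XOR keystream cipher below is a deliberately toy stand-in for the paper's improved ECC and homomorphic schemes, and serves only to show the two-thread split-and-join structure:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def xor_stream_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR keystream cipher (stand-in for the paper's improved ECC scheme)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):                 # derive keystream blocks from the key
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def dual_thread_encrypt(data: bytes, key: bytes) -> bytes:
    """Split the payload in half and encrypt both halves on two worker threads."""
    half = len(data) // 2
    chunks = [data[:half], data[half:]]
    with ThreadPoolExecutor(max_workers=2) as pool:
        parts = list(pool.map(lambda c: xor_stream_encrypt(c, key), chunks))
    return b"".join(parts)

message = b"IIoT sensor batch 0001: temp=21.4C pressure=101.2kPa"
key = b"demo-key"
ciphertext = dual_thread_encrypt(message, key)
```

Because the toy cipher is a self-inverse XOR and the split is deterministic, applying `dual_thread_encrypt` again with the same key decrypts the payload; a real deployment would of course use the scheme's proper decryption path.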
Funding: This research was financially supported by the Ministry of Trade, Industry, and Energy (MOTIE), Korea, under the "Project for Research and Development with Middle Markets Enterprises and DNA (Data, Network, AI) Universities" (AI-based Safety Assessment and Management System for Concrete Structures) (Reference Number P0024559), supervised by the Korea Institute for Advancement of Technology (KIAT).
Abstract: Time-series data provide important information in many fields, and their processing and analysis have been a research focus. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Methodologies for data augmentation and for converting time-series data into images for analysis have therefore been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. Data augmentation is performed by adding Gaussian noise, with the noise level set to 0.002 to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time series into images; this enables the identification of patterns in the time series and helps capture its sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, with detected anomalous regions represented as heat maps; by applying the anomaly map to the original image, it is possible to localize the areas where anomalies occur. The performance evaluation shows that both F1-score and accuracy are high when time-series data are converted to images. Additionally, when processed as images rather than as raw time series, both the data size and the training time were significantly reduced. The proposed method can provide an important springboard for anomaly detection research on time-series data and enables lightweight analysis of complex patterns in the data.
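The two preprocessing steps named above, Gaussian-noise augmentation at level 0.002 and the MTF image transform, can be sketched as follows. The series length, bin count, and quantile binning are illustrative choices, not the paper's exact settings:

```python
import numpy as np

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 4 * np.pi, 64))

# Data augmentation: add Gaussian noise at the level used in the paper (sigma = 0.002).
augmented = series + rng.normal(0.0, 0.002, size=series.shape)

def markov_transition_field(x, n_bins=8):
    """Minimal MTF: quantize the series, estimate one-step transition
    probabilities, then index them for every pair of time steps."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    q = np.digitize(x, edges)                      # bin label of each time step
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(q[:-1], q[1:]):                # count one-step transitions
        W[a, b] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)   # row-normalize to probabilities
    return W[q[:, None], q[None, :]]               # pixel (i, j) = P(q_i -> q_j)

mtf = markov_transition_field(augmented)
```

The resulting 64×64 image encodes the series' transition structure and can be fed to an image-based detector such as PatchCore.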
Abstract: Given the particular potential of ports located on international coastlines such as the Makoran Sea (Iran) for goods and human smuggling, coastline security at the national level is very important; such ports can play a significant role in the development of power and security. Based on military reviews and analyses, police station location and monitoring fields of view along coastlines are strategic issues in modern security development. This research proposes a tool for the development of coastal roads and coastal walking routes in the deployment of police, focusing mainly on monitoring fields of view and accessibility to the strategic coastline. GIS tools play an essential role in producing the relevant security maps. Chabahar Port in Iran, the most important port on the Makoran Sea, was selected as the study area because of its strategic role in the national economy and security. The research method focused on two major axes: the successful siting of police stations along the shoreline to increase monitoring and coastal security, and suitable patrol routes for police cars on the coastal roads. The study adopts a scientific approach to analyzing present and future developments in urban and security planning for coastal towns at the national and regional levels.
Funding: Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea government (Grant No. 20214000000140, Graduate School of Convergence for Clean Energy Integrated Power Generation); Korea Basic Science Institute (National Research Facilities and Equipment Center) grant funded by the Ministry of Education (2021R1A6C101A449); and the National Research Foundation of Korea grant funded by the Ministry of Science and ICT (2021R1A2C1095139), Republic of Korea.
Abstract: Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely the GAN-aided GRU, was extensively evaluated in various predictive scenarios, including interpolation, extrapolation, and limited dataset sizes. The model exhibited significant predictability and improved generalizability in estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations; the superior performance is attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent suitability of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study offers a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
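The GRU recurrence at the core of the model can be sketched as a single NumPy step. This is only the standard GRU cell, not the paper's full GAN-aided pipeline, and the weight shapes and toy strain input are hypothetical:

```python
import numpy as np

def gru_step(x, h, params):
    """One GRU step: update gate z, reset gate r, candidate state, blended output."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sigmoid(Wz @ x + Uz @ h)                 # update gate
    r = sigmoid(Wr @ x + Ur @ h)                 # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))     # candidate hidden state
    return (1 - z) * h + z * h_tilde             # gated blend of old and candidate

rng = np.random.default_rng(1)
hidden, features = 8, 3
params = [rng.normal(0, 0.1, (hidden, features)) if i % 2 == 0
          else rng.normal(0, 0.1, (hidden, hidden)) for i in range(6)]

# Roll the cell over a toy strain sequence to produce a hidden-state trajectory,
# the kind of sequential representation a flow-curve predictor would read out from.
h = np.zeros(hidden)
for strain in np.linspace(0.0, 0.1, 20):
    h = gru_step(np.array([strain, 1.0, 0.0]), h, params)
```

The gated recurrence is what lets the model carry history along a flow curve, which is the property credited above for its extrapolation behavior.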
Abstract: The reliability evaluation of insulated gate bipolar transistors (IGBTs) in electric vehicles faces challenges such as junction temperature measurement and limited computational and storage resources. In this paper, a junction temperature estimation approach based on a neural network, requiring no additional hardware cost, is proposed, and the IGBT lifetime is calculated using electric vehicle big data. The direct current (DC) voltage, operating current, switching frequency, negative temperature coefficient (NTC) thermistor temperature, and IGBT lifetime are the inputs, and the junction temperature T_(j) is the output. With the rainflow counting method, the classified irregular temperature swings are fed into the life model to obtain the cycles to failure, and a fatigue accumulation method is then used to calculate the IGBT lifetime. To cope with the limited computational and storage resources of electric vehicle controllers, the lifetime calculation runs on a big data platform, and the lifetime is transmitted wirelessly to the electric vehicles as an input to the neural network. Thus, the junction temperature of the IGBT under long-term operating conditions can be accurately estimated. A test platform combining the motor controller with the vehicle big data server was built for IGBT accelerated aging tests. IGBT lifetime predictions were then derived from junction temperature estimates obtained with both the neural network method and the thermal network method. The experiments show that the lifetime prediction based on the neural network with big data is more accurate than that based on the thermal network, which improves the reliability evaluation of the system.
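The rainflow-to-lifetime step can be sketched with Miner's linear damage rule. The Coffin-Manson-style life model and its constants A and k below are illustrative placeholders, not the paper's calibrated model, and the cycle counts stand in for actual rainflow output:

```python
import numpy as np

def cycles_to_failure(delta_T, A=1e12, k=5.0):
    """Toy Coffin-Manson-type life model: N_f = A * dT^(-k) (A, k illustrative)."""
    return A * delta_T ** (-k)

# Junction-temperature swings (K) and their counts, as rainflow counting would produce
# for one representative driving profile.
swings = np.array([10.0, 20.0, 40.0])
counts = np.array([500.0, 100.0, 5.0])

# Miner's rule: accumulated damage is the sum of n_i / N_f(dT_i);
# failure is predicted when the accumulated damage reaches 1.
damage_per_drive = np.sum(counts / cycles_to_failure(swings))
repetitions_to_failure = 1.0 / damage_per_drive
```

Note how the few large 40 K swings dominate the damage sum despite their low count, which is why accurate junction temperature estimation matters so much for lifetime prediction.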
Funding: Supported by the Meteorological Soft Science Project (Grant No. 2023ZZXM29), the Natural Science Foundation of Tianjin, China (Grant No. 21JCYBJC00740), and the Key Research and Development (Social Development) Program of Jiangsu Province, China (Grant No. BE2021685).
Abstract: As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study explores a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data, treating atmospheric turbulence detection as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events, and comparisons with alternative machine learning techniques indicate that it is the best-performing methodology among those evaluated. In summary, symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to established EDR approaches and surpassing the machine learning algorithms tested. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amid rising environmental and operational hazards.
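Symbolic classification by evolving expression trees can be sketched as follows. For brevity this shows only random generation and fitness-based selection of expression trees (a single genetic-programming generation, without crossover or mutation), on hypothetical QAR-like features; the paper's actual operators, features, and fitness function are not specified here:

```python
import numpy as np

rng = np.random.default_rng(7)
OPS = {"add": np.add, "sub": np.subtract, "mul": np.multiply}

def random_expr(depth=2):
    """Grow a random expression tree over two toy features
    (e.g. vertical-acceleration and airspeed-deviation channels)."""
    if depth == 0 or rng.random() < 0.3:
        return ("x", rng.integers(0, 2)) if rng.random() < 0.7 else ("c", rng.normal())
    op = rng.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, X):
    kind = expr[0]
    if kind == "x":
        return X[:, expr[1]]                      # feature leaf
    if kind == "c":
        return np.full(len(X), expr[1])           # constant leaf
    return OPS[kind](evaluate(expr[1], X), evaluate(expr[2], X))

# Toy QAR-like data: turbulence samples (label 1) have larger vertical acceleration.
X = np.vstack([rng.normal(0, 0.2, (50, 2)), rng.normal([1.5, 0], 0.2, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def fitness(expr):
    pred = (evaluate(expr, X) > 0).astype(int)    # expression output thresholded at 0
    return (pred == y).mean()

# One generation of random search; a full GP loop would breed the fittest trees.
population = [random_expr() for _ in range(200)]
best = max(population, key=fitness)
```

The appeal of this family of methods is that `best` is a human-readable formula over flight parameters rather than an opaque model, which suits safety-critical review.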
Funding: Supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021QD032).
Abstract: Since the impoundment of the Three Gorges Reservoir (TGR) in 2003, numerous slopes have experienced noticeable movement or destabilization owing to reservoir level changes and seasonal rainfall. One case is the Outang landslide, a large-scale, active landslide on the south bank of the Yangtze River. The latest available monitoring data and site investigations are analyzed to establish the spatial and temporal deformation characteristics of the landslide. Data mining techniques, namely two-step clustering and the Apriori algorithm, are then used to identify the dominant triggers of landslide movement: the two-step clustering method groups the candidate triggers and displacement rates into several clusters, and the Apriori algorithm generates correlation criteria for cause and effect. The analysis considers multiple locations on the landslide and two time scales: long-term deformation on a monthly basis and short-term deformation on a daily basis. It shows that deformation of the Outang landslide is driven by both rainfall and reservoir water, and that it varies spatiotemporally mainly because of differences in local responses to hydrological factors. The data mining results reveal different dominant triggering factors depending on the monitoring frequency: monthly and bi-monthly cumulative rainfall control the monthly deformation, while 10-day cumulative rainfall and the 5-day cumulative drop in reservoir water level dominate the daily deformation. It is concluded that the spatiotemporal deformation pattern and the data mining rules associated with precipitation and reservoir water level have the potential to be broadly applied to improve landslide prevention and control in dam reservoirs and other landslide-prone areas.
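The Apriori step, mining rules that link discretized triggers to displacement classes by support and confidence, can be sketched as follows. The transactions, thresholds, and item names are illustrative, not the study's actual monitoring records:

```python
from itertools import combinations

# Discretized daily records: candidate triggers plus the observed displacement class.
transactions = [
    {"rain_10d=high", "drop_5d=high", "disp=fast"},
    {"rain_10d=high", "drop_5d=low",  "disp=fast"},
    {"rain_10d=high", "drop_5d=high", "disp=fast"},
    {"rain_10d=low",  "drop_5d=low",  "disp=slow"},
    {"rain_10d=low",  "drop_5d=high", "disp=slow"},
]

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def apriori_rules(min_support=0.4, min_confidence=0.9):
    """Exhaustive small-scale Apriori: frequent trigger itemsets, then rules
    of the form {triggers} -> {displacement class}."""
    items = sorted(set().union(*transactions))
    triggers = [i for i in items if not i.startswith("disp=")]
    rules = []
    for k in (1, 2):
        for lhs in combinations(triggers, k):
            for rhs in ("disp=fast", "disp=slow"):
                s = support(set(lhs) | {rhs})
                # Short-circuit keeps the confidence division safe when s is 0.
                if s >= min_support and s / support(set(lhs)) >= min_confidence:
                    rules.append((lhs, rhs, s))
    return rules

rules = apriori_rules()
```

On this toy data the miner recovers the expected cause-and-effect rule that high 10-day cumulative rainfall accompanies fast displacement, which mirrors the kind of criteria the study reports.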
Funding: This work was supported by the general program (No. 1177531) and joint funding (No. U2067205) of the National Natural Science Foundation of China.
Abstract: A benchmark experiment on ^(238)U slab samples was conducted using a deuterium-tritium neutron source at the China Institute of Atomic Energy. The leakage neutron spectra in the 0.8-16 MeV energy range at 60° and 120° were measured using the time-of-flight method. The samples were rectangular slabs with a 30 cm square base and thicknesses of 3, 6, and 9 cm. The leakage neutron spectra were also calculated with the MCNP-4C program based on the latest evaluated ^(238)U neutron data files from CENDL-3.2, ENDF/B-VIII.0, JENDL-5.0, and JEFF-3.3. Based on the comparison, the deficiencies of and possible improvements to the ^(238)U evaluated nuclear data were analyzed. The results show that: (1) the CENDL-3.2 calculations significantly overestimate the measurements in the elastic scattering energy interval at 60° and 120°; (2) the CENDL-3.2 calculations overestimate the measurements in the inelastic scattering energy interval at 120°; (3) the CENDL-3.2 calculations significantly overestimate the measurements in the 3-8.5 MeV energy interval at 60° and 120°; and (4) the JENDL-5.0 calculations are generally consistent with the measurements.