There are many cloud data security techniques and algorithms that can detect attacks on cloud data, but they cannot protect the data from an attacker. Cloud cryptography is the most effective way to transmit data in a secure and reliable form. Researchers have developed various mechanisms for transferring data securely by converting it from readable to unreadable form, but these algorithms alone do not provide complete data security; each has its own weaknesses. If effective data protection techniques are used, an attacker cannot decipher the encrypted data, and even if the attacker tampers with the data, the original data remains inaccessible. In this paper, several data security techniques are developed that together protect data from attackers completely. First, a customized American Standard Code for Information Interchange (ASCII) table is developed, in which the value of each index is redefined. An attacker who tries to decrypt the data will typically apply the standard ASCII table to the ciphertext, which would otherwise help the attacker recover the data. Next, a radix 64-bit encryption mechanism is used, which doubles the number of cipher values relative to the original data; when the attacker tries to decrypt each value, the result bears no relation to the original data. Finally, a Hill matrix algorithm is created that generates a key valid only for the exact plaintext for which it was created; this key cannot be used for any other plaintext, so the scope of each Hill key is bounded by its own text. The techniques used in this paper are compared with those used in related work, and the extent to which the proposed algorithm improves on them is discussed. The Kasiski test is then used to verify the validity of the proposed algorithm, showing that when it is used for data encryption, an attacker cannot break its security using any known technique or algorithm.
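The Kasiski test invoked above can be sketched in a few lines (our own minimal illustration, not the authors' implementation): it locates repeated n-grams in a ciphertext and takes the GCD of the distances between repeats, which for a periodic (Vigenère-style) cipher is a multiple of the key length.

```python
from math import gcd
from collections import defaultdict

def kasiski_key_length(ciphertext: str, seq_len: int = 3) -> int:
    """Guess the key period of a polyalphabetic cipher via the
    Kasiski examination: collect the distances between repeated
    n-grams and return the GCD of those distances (0 if none)."""
    positions = defaultdict(list)
    for i in range(len(ciphertext) - seq_len + 1):
        positions[ciphertext[i:i + seq_len]].append(i)
    distances = [
        later - earlier
        for occ in positions.values() if len(occ) > 1
        for earlier, later in zip(occ, occ[1:])
    ]
    g = 0
    for d in distances:
        g = gcd(g, d)
    return g

# A ciphertext that repeats a 12-character block: every repeated
# trigram recurs at distance 12, so the true key period divides 12.
demo = "LXFOPVEFRNHRLXFOPVEFRNHR"
```

A strong cipher defeats this analysis precisely because it leaves no exploitable repeat structure, which is the property the abstract claims for the proposed scheme.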
Cognitive Radio Networks (CRNs) have in recent years become a successful platform for a diverse range of future systems, particularly industrial Internet of Things (IIoT) applications. To provide efficient connections among IIoT devices, CRNs enhance spectrum utilization by opportunistically using licensed spectrum. However, routing in these networks remains one of the main problems due to node mobility and time-variant channel selection. In particular, channel selection is indispensable for a CRN routing protocol to adapt adequately to Primary User (PU) activity and create a robust routing path. This study aims to construct a robust routing path that minimizes PU interference and routing delay while maximizing throughput within the IIoT domain. A generic routing framework is investigated from a cross-layer perspective, sharing information resources by exploiting a recently proposed metric, Channel Availability Probability. Moreover, a novel cross-layer routing protocol based on a time-variant channel estimation technique is proposed. The protocol combines lower-layer (physical and data link layer) sensing derived from the channel estimation model, and it periodically updates and stores the routing table for optimal route decision-making. A new routing metric is also presented to achieve higher throughput and lower delay. To evaluate the proposed protocol, network simulations were conducted and compared against widely used routing protocols as benchmarks. The simulation results for different routing scenarios demonstrate that the proposed solution outperforms existing protocols on standard network performance metrics, including packet delivery ratio (with an improvement margin of roughly 5–20%), under varying numbers of PUs and cognitive users in Mobile Cognitive Radio Networks (MCRNs). Moreover, the cross-layer routing protocol achieves high routing performance in finding a robust route, selecting channels with high stability, and reducing the probability of PU interference for continued communication.
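The abstract does not state the paper's exact routing metric; as a purely hypothetical illustration of how channel availability can be folded into a cross-layer route cost, a per-hop delay can be inflated by the probability that the hop's channel is occupied by a PU (all names and numbers below are stand-ins, not the authors' formulation):

```python
def hop_cost(delay_s: float, p_available: float, eps: float = 1e-9) -> float:
    """Hypothetical per-hop cost: delay divided by the channel
    availability probability, so PU-prone channels look expensive."""
    return delay_s / max(p_available, eps)

def route_cost(hops) -> float:
    """Total cost of a route given (delay, availability) per hop."""
    return sum(hop_cost(d, p) for d, p in hops)

# Prefer the route whose channels are more likely to be free of PUs:
route_a = [(0.010, 0.9), (0.010, 0.9)]   # slower but stable channels
route_b = [(0.008, 0.5), (0.008, 0.5)]   # faster but PU-prone
best = min((route_a, route_b), key=route_cost)
```

Under this toy cost, the stable route wins despite its higher raw delay, which mirrors the paper's stated goal of trading delay against PU interference.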
A measure of the non-classicality of even and odd coherent states is studied. We first calculate the Wigner functions of the even and odd coherent states, each of which consists of two terms: a positive-definite Gaussian term and a wave (interference) term that takes negative values. We then calculate the integrated value εmax of the wave term of the Wigner function over its region of negativity and use εmax to measure the non-classicality of the even and odd coherent states. For even and odd coherent states with a given photon number, εmax is a convenient measure of their non-classicality. The definition and calculation of εmax presented here have theoretical reference value.
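For orientation, the Wigner functions of the even (+) and odd (−) coherent states |ψ±⟩ ∝ |β⟩ ± |−β⟩ take the standard two-Gaussians-plus-interference form quoted below (a textbook expression given for reference; the paper's sign and normalization conventions may differ):

```latex
W_{\pm}(\alpha) = \frac{2}{\pi N_{\pm}}
\left[ e^{-2|\alpha-\beta|^{2}} + e^{-2|\alpha+\beta|^{2}}
\pm 2\, e^{-2|\alpha|^{2}} \cos\!\bigl(4\,\operatorname{Im}(\alpha\beta^{*})\bigr) \right],
\qquad N_{\pm} = 2\left(1 \pm e^{-2|\beta|^{2}}\right).
```

The first two Gaussian terms are positive-definite, while the oscillating cosine term is the "wave term" whose negative region is integrated to obtain εmax.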
Rock slides are among the most common geohazards in the Three Gorges Reservoir area, affecting shipping on the Yangtze River and the safety of people living on its banks. To investigate the internal fracturing mechanism of the rock mass, a distributed microseismic monitoring network of 15 three-component geophones (3C geophones) was deployed in boreholes and outside the sliding mass on the unstable Dulong slope. The Stein Unbiased Risk Estimation (SURE) method was used for noise suppression of the microseismic records, and the decomposition parameters of the Continuous Wavelet Transform (CWT) were determined with the maximum energy of correlation coefficient (MECC) method. The signal-to-noise ratio was tripled after processing, and source parameters were obtained by full-waveform inversion. A rupture volume model was built from irregular-grid statistics of event density; it shows that the rock slide is small in scale and composed of a single block. Moreover, the relationships among microseismicity, displacement, and rainfall are discussed. The deformation rate changed dramatically during periods of intensive events, with particularly good consistency during rainfall periods; although there is a time delay, continuous rainfall is more likely to increase the number of microseismic events. The results show that the Dulong slope is a shallow rock slide in a state of creep deformation and that the rupture mechanism of the rock mass is left-lateral normal faulting with shear failure. This research provides key information for the early warning and prevention of rock slides and helps reduce the risk of geohazards.
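The SURE denoising idea can be illustrated with the classical soft-threshold risk estimate (a generic Donoho–Johnstone-style sketch in plain Python, assuming unit noise variance on the coefficients; the paper's CWT/MECC pipeline is not reproduced here):

```python
def sure_risk(coeffs, t):
    """Stein's unbiased risk estimate for soft thresholding at t,
    assuming unit-variance Gaussian noise on each coefficient."""
    n = len(coeffs)
    return (n
            - 2 * sum(1 for x in coeffs if abs(x) <= t)
            + sum(min(x * x, t * t) for x in coeffs))

def sure_soft_threshold(coeffs):
    """Pick the threshold minimizing the SURE risk over the
    candidate set {|x_i|}, then soft-threshold the coefficients."""
    t = min((abs(x) for x in coeffs), key=lambda c: sure_risk(coeffs, c))
    return [(abs(x) - t) * (1 if x > 0 else -1) if abs(x) > t else 0.0
            for x in coeffs]
```

Large coefficients (signal) are shrunk slightly, while small coefficients (noise) are zeroed, which is how the threshold raises the signal-to-noise ratio of the microseismic record.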
Traditional energy-efficient power allocation algorithms for cognitive radio networks (CRNs) usually assume constant or perfect channel state information, which may degrade performance in real systems with disturbances or uncertainties. To address this, we propose a robust energy-efficient power allocation algorithm for underlay cognitive radio (CR) systems with channel uncertainty, taking into account the interference power threshold constraint and the minimum target SINR requirement constraint. Ellipsoid sets are used to describe the channel uncertainty, and the constrained fractional program for the allocation is transformed into a convex optimization problem via a worst-case optimization approach. A simplified, lower-complexity version of the robust energy efficiency scheme based on a substitute constraint is also presented. Simulation results show that the proposed scheme provides higher energy efficiency than a capacity-maximization algorithm and guarantees the signal-to-interference-plus-noise ratio (SINR) requirement of each cognitive user under channel uncertainty.
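The worst-case idea can be shown in its simplest scalar form (a toy sketch under an assumed interval uncertainty model, far simpler than the paper's ellipsoid sets): if the secondary-to-primary channel gain is only known to lie in ĝ ± ε, the interference constraint must hold at the worst realization, which tightens the transmit power cap.

```python
def robust_power_cap(i_threshold: float, g_hat: float, eps: float) -> float:
    """Largest transmit power p satisfying p * g <= i_threshold for
    every gain g in [g_hat - eps, g_hat + eps]; the binding case is
    the largest gain g_hat + eps (worst case for the primary user)."""
    worst_gain = g_hat + eps
    return i_threshold / worst_gain

# With a 1 mW interference budget, nominal gain 0.01, uncertainty 0.01:
cap_nominal = 0.001 / 0.01                          # ignores uncertainty
cap_robust = robust_power_cap(0.001, 0.01, 0.01)    # half the nominal cap
```

The robust cap is strictly smaller than the nominal one, which is the price paid for guaranteeing the primary user's protection under every admissible channel realization.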
Locusts are agricultural pests worldwide. To understand how locust distribution density and community structure are related to the hydrothermal and vegetation growth conditions of their habitats, and thereby to provide rapid and accurate warning of locust invasions, it is important to develop efficient and accurate techniques for acquiring locust information. In this paper, by analyzing the differences between the morphological features of Locusta migratoria manilensis and Oedaleus decorus asiaticus, we propose a semi-automatic model for detecting locust species and instar information based on locust image segmentation, feature variable extraction, and support vector machine (SVM) classification, and we examine its applicability and accuracy on sample image data acquired in the field. The image segmentation experiment showed that the proposed GrabCut-based interactive segmentation method can rapidly extract images of various locust body parts and is easy to operate. In the feature extraction experiment, the textural, color, and morphological features of various body parts were calculated; based on the results, eight feature variables were selected to identify locust species and instars using outlier detection, variable function calculation, and principal component analysis. The SVM-based classification experiment achieved a semi-automatic detection accuracy of 96.16% with a polynomial kernel function (penalty factor c = 2040, gamma parameter g = 0.5). The proposed detection model is highly applicable and accurate for identifying instars of L. migratoria manilensis and O. decorus asiaticus, and it can also be used to identify other locust species.
Most resource allocation algorithms in cognitive radio networks are based on an interference power constraint. Instead of the conventional primary-user interference constraint, we introduce a new criterion, an allowable signal-to-interference-plus-noise ratio (SINR) loss constraint, to protect primary users during cognitive transmission. Considering the power allocation problem for cognitive users over flat fading channels, we propose a new power allocation algorithm that maximizes the throughput of cognitive users subject to the allowable SINR loss constraint and a maximum transmit power for each cognitive user. Computer simulations comparing the proposed algorithm with an interference-power-constrained algorithm show that it achieves higher throughput and provides stability for cognitive radio networks.
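The SINR-loss criterion can be made concrete with a short derivation (our hedged reading of the abstract, with hypothetical symbols): if the primary receiver sees noise power σ² and aggregate cognitive interference Σᵢ pᵢgᵢ, then bounding the PU's SINR degradation by a factor δ ≥ 1 is equivalent to a cap on the aggregate interference:

```latex
\frac{\mathrm{SINR}_0}{\mathrm{SINR}}
= \frac{\sigma^2 + \sum_i p_i g_i}{\sigma^2} \le \delta
\quad\Longleftrightarrow\quad
\sum_i p_i g_i \le (\delta - 1)\,\sigma^2,
```

where SINR₀ is the primary user's SINR without cognitive transmissions and SINR is its value with them. In this form the SINR-loss constraint reduces to an interference cap scaled by the noise floor rather than an absolute threshold.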
Considering a quantum model consisting of two effective two-level atoms and a single-mode cavity, this paper investigates the entanglement dynamics between the two atoms and the effect of the Stark shift on that entanglement. The results show that, on the one hand, the atom-atom entanglement evolves periodically in time, with periods affected by the Stark shift; on the other hand, when the Stark shift is considered the two atoms are never completely disentangled, and for large values of the Stark shift parameter they can remain in a stationary entangled state. In addition, for an initially partially entangled atomic state, the atom-atom entanglement can be greatly enhanced by the presence of the Stark shift. These properties show that the Stark shift can be used to control the entanglement between two atoms.
Pooling design is a mathematical tool used in many application areas. In this paper, we give a new construction of a pooling design based on subspaces of the pseudo-symplectic space and discuss its properties. We determine the design parameters of the resulting d^z-disjunct matrix and then discuss how these parameters in our construction change with their defining variables.
Recent advances in hardware and communication technologies have enabled worldwide interconnection through the Internet of Things (IoT). The IoT is the backbone of smart city applications such as smart grids and green energy management. In smart cities, IoT devices link power, price, energy, and demand information for smart homes and home energy management (HEM) in the smart grid. In complex smart grid-connected systems, power scheduling and the secure dispatch of information are the main research challenges; they can be addressed through various machine learning techniques and data analytics. In this paper, we propose a particle swarm optimization based machine learning algorithm, a collaborative execute-before-after dependency-based requirement scheme, for the smart grid. The proposed algorithm works in two phases: analysis and assessment of the requirements of end users and power distribution companies. In the first phase, a fixed load is adjusted over a period of 24 h; in the second phase, a randomly produced population load for 90 days is evaluated using particle swarm optimization. Simulation results demonstrate that the proposed algorithm outperforms particle swarm optimization and inclined block rate in terms of percentage cost reduction, peak-to-average ratio, and power variance mean ratio.
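The optimizer above builds on particle swarm optimization; the core PSO loop can be sketched as follows (a generic textbook version on a toy objective, not the authors' collaborative execute-before-after variant, whose load model and cost terms are not given in the abstract):

```python
import random

def pso_minimize(f, dim, bounds, n_particles=20, iters=100, seed=1):
    """Minimal PSO: each particle keeps a velocity and a personal
    best, and is pulled toward the swarm's global best with inertia
    w and cognitive/social weights c1, c2."""
    rng = random.Random(seed)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in "cost" objective: the sphere function, minimized at the origin.
best_x, best_cost = pso_minimize(lambda x: sum(v * v for v in x),
                                 dim=3, bounds=(-5.0, 5.0))
```

In the paper's setting, the objective would instead encode electricity cost and peak-to-average penalties over the scheduled load, with each particle encoding a candidate schedule.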
By employing molecular mechanics and molecular dynamics simulations, we investigate the radial collapse and elasticity of single-walled carbon nanotubes (SWCNTs) of different chiralities containing divacancy and 5-8-5 defects. It is found that divacancy and 5-8-5 defects reduce the collapse pressure (Pc) of SWCNT (10, 10), while a 5-8-5 defect can greatly increase the Pc of SWCNT (17, 0), by as much as 500%. A model is established to explain the effects of chirality, divacancies, and 5-8-5 defects on the radial collapse of SWCNTs. These results are of particular value for understanding the mechanical behavior of SWCNTs with divacancy and 5-8-5 defects, which may be considered as fillers for high-loading composites.
This paper proposes a universal steganalysis scheme based on a quantization attack that can detect several kinds of data hiding algorithms for grayscale images. In practice, most techniques produce stego images that are perceptually identical to the cover images but exhibit statistical irregularities that distinguish them from covers. By attacking suspicious images with a quantization method, we obtain statistics that differ between images that were embedded and then quantization-attacked and sources that were quantization-attacked but not embedded. We have developed a technique based on a one-class SVM for discriminating between cover images and stego images. Simulation results show that our approach distinguishes cover from stego images with reasonable accuracy.
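The intuition behind a quantization attack can be shown with a deliberately simplified feature (a hypothetical illustration, not the paper's statistic): re-quantizing pixel values and measuring the residual exposes LSB-style embedding, because a clean cover that follows the quantization grid leaves no residual while an embedded one does.

```python
def quantization_residual_energy(pixels, step=4):
    """Hypothetical calibration feature: re-quantize pixel values
    with the given step and return the mean squared residual.
    LSB-style embedding perturbs this statistic relative to a
    cover that conforms to the quantization grid."""
    resid = [p - step * round(p / step) for p in pixels]
    return sum(r * r for r in resid) / len(resid)

cover = [8, 12, 16, 16, 20, 24, 24, 28]   # exact multiples of the step
stego = [9, 12, 17, 16, 21, 24, 25, 28]   # LSBs flipped on half the pixels
e_cover = quantization_residual_energy(cover)
e_stego = quantization_residual_energy(stego)
```

A one-class SVM trained only on cover statistics would then flag images whose features, like `e_stego` here, fall outside the learned region.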
One of the challenges in accurately estimating photovoltaic (PV) cell electrical performance is the uncertainty of the equivalent circuit model parameters. The parameters considered in this study are the series resistance, shunt resistance, photocurrent, saturation current, and diode ideality factor. Parameter estimation for the PV cell equivalent circuit model is challenging because of the implicit, transcendental I-V characteristic of the cell. This paper presents a fuzzy logic based study for estimating the uncertainty of the cell parameters. The model parameters, which change with temperature and irradiance, are the source of the uncertainties. Mathematical programming is used to estimate the fuzzy parameters. The approach is applied to practical data, and the analysis yields estimates of the PV cell parameters; the results of this research give better estimated parameters than other methods, as measured by the absolute mean error (AME).
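The implicit, transcendental I-V relationship mentioned above is the single-diode equation, which must be solved numerically at each voltage point. A minimal Newton-iteration sketch (with illustrative, not measured, parameter values) shows why the relationship is implicit: the current appears on both sides through the series-resistance term.

```python
import math

def cell_current(v, i_ph, i_0, r_s, r_sh, n, v_t=0.02585, iters=50):
    """Solve the single-diode model
        I = I_ph - I_0*(exp((V + I*R_s)/(n*V_t)) - 1) - (V + I*R_s)/R_sh
    for I at terminal voltage V using Newton's method."""
    i = i_ph  # short-circuit current is a good starting guess
    for _ in range(iters):
        x = (v + i * r_s) / (n * v_t)
        f = i_ph - i_0 * (math.exp(x) - 1) - (v + i * r_s) / r_sh - i
        df = -i_0 * math.exp(x) * r_s / (n * v_t) - r_s / r_sh - 1
        step = f / df
        i -= step
        if abs(step) < 1e-12:
            break
    return i

# Illustrative single-cell parameters (assumed, not from the paper):
i_sc = cell_current(0.0, i_ph=5.0, i_0=1e-9, r_s=0.01, r_sh=100.0, n=1.3)
```

In the paper's approach these five parameters would be fuzzy numbers rather than crisp values, with this forward model embedded in the mathematical program.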
In this note, we study some properties of local random pull-back attractors on compact metric spaces. We obtain relations between attractors and their fundamental neighborhoods and basins of attraction, as well as some properties of omega-limit sets and the connectedness of random attractors. A simple deterministic example is given to illustrate some potentially confusing points.
This paper explores data mining issues in small and medium enterprises (SMEs), first examining the relationship between data mining and economic development. SMEs contribute most of the employment and output in an emerging economy such as the Kingdom of Saudi Arabia, and adopting technology improves their potential for effective decision-making and efficient operations. It is therefore important that SMEs have access to data mining techniques and implement those best suited to their business to improve their business intelligence (BI). The paper critically reviews the existing literature on data mining in the SME field to provide a theoretical underpinning for future work. Data mining is found to be complicated and fragmented, with a multitude of options available to businesses, from basic systems implemented in Excel or Access to more sophisticated cloud-based systems. For any business, data mining is a trade-off between the need for data analysis and intelligence and the cost and resource use of the system put in place. Multiple challenges to data mining are identified, most notably the resource-intensive nature of such systems (in both labor and capital) and the security issues of data collection, analysis, and storage, with the General Data Protection Regulation (GDPR) a key focus for businesses in the Kingdom of Saudi Arabia. Given these challenges, the paper suggests that an SME start small with an internal data mining exercise to digitalize and analyze its customer data, scaling up over time as the business grows and acquires the resources needed to properly manage a larger system.
In this work, a particle swarm optimization based method, non-Gaussian improved particle swarm optimization, is introduced for minimizing the cost of energy (COE) of wind turbines (WTs) at high-altitude sites. Since the COE depends on site-specific constants and the initialized parameters of the wind turbine, the focus is on the design optimization of rotor radius, hub height, and rated power. Following the literature, the COE is adapted to the Saudi Arabian context, and the constrained wind turbine optimization problem is formulated. Non-Gaussian improved particle swarm optimization is then compared with conventional particle swarm optimization for solving the design problem at altitudes ranging from 2500 to 4000 m. The results show that as altitude rises, the optimal rotor radius grows while the optimal hub height and rated power drop, resulting in an increase in COE. Furthermore, the non-Gaussian method displays faster convergence than classical particle swarm optimization. These findings provide a useful reference for wind turbine design at high altitudes and could be employed to optimize the initial turbine parameters for the planned large wind farm at the selected Dumat Al-Jandal site in Saudi Arabia.
The main objective of this research is to discuss the current legal and methodological issues in the field of software reusability. Although numerous online forums discuss such issues in Q&A form, this paper attempts to raise awareness of the legal issues a software engineer may be trapped by. The paper discusses the current issues with software reusability within the legal and methodological context, applying an extensive literature review to appraise past studies critically and reach a collective conclusion. Before discussing the issues, the benefits of reuse are outlined, including the saving of time and cost for users. Legally, however, the reuse of software assets creates complexities for the user in meeting all licensing requirements and dealing with liability in case of a breach. Methodologically, there are major barriers to software reuse in technical competence and managerial issues such as a lack of resources. Even when reusing software to save time and leverage the specialization of other authors, the end user must have the technical expertise to search for, adapt, and merge reusable assets into the larger software infrastructure. The review ultimately shows that high barriers to software reuse remain, which may mean that smaller developers and businesses are still reluctant to fully exploit open-source components.
Big data refers to data sets whose size or complexity prevents them from being stored or processed with the usual data management tools or applications; the term has become prominent in recent years with the massive development of technology. Almost immediately thereafter, the term "big data mining", i.e., mining from big data, emerged as an interconnected field of research. Classification is an important stage in data mining, since it supports better decisions in a variety of situations, including scientific endeavors, biomedical research, and industrial applications. The probabilistic neural network (PNN) is a commonly used and successful method for handling classification and pattern recognition problems. In this study, the authors propose combining the PNN, a data mining technique, with the vibrating particles system (VPS), a metaheuristic algorithm, into a method named VPS-PNN that solves classification problems more effectively. The method was tested on eleven common benchmark medical data sets from the machine learning repository. The suggested VPS-PNN mechanism outperforms the PNN, biogeography-based optimization, the enhanced water cycle algorithm (E-WCA), and the firefly algorithm (FA) in terms of convergence speed and classification accuracy.
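A probabilistic neural network is essentially a Parzen-window classifier: each class is scored by the average Gaussian kernel between the query and that class's training points. A minimal sketch (our Specht-style illustration with toy data; the VPS metaheuristic, which would tune the smoothing parameter, is omitted):

```python
import math

def pnn_classify(train, labels, x, sigma=0.5):
    """Minimal PNN: score each class by the mean Gaussian kernel
    between x and the class's training points, pick the argmax."""
    scores = {}
    for xi, yi in zip(train, labels):
        d2 = sum((a - b) ** 2 for a, b in zip(xi, x))
        k = math.exp(-d2 / (2 * sigma ** 2))
        scores.setdefault(yi, []).append(k)
    return max(scores, key=lambda c: sum(scores[c]) / len(scores[c]))

# Toy two-class data with hypothetical labels:
train = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
labels = ["healthy", "healthy", "sick", "sick"]
pred = pnn_classify(train, labels, (0.95, 0.95))
```

The smoothing parameter `sigma` is the main hyperparameter; in the paper's hybrid, a metaheuristic such as VPS would search for the value maximizing classification accuracy.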
This paper gives a general account of the main origins of English and Chinese names, namely characters in ancient mythology, religion, and typical literary images; common people in daily life; and famous trademarks. It is therefore necessary to focus on names carrying cultural connotations. In addition, it introduces transliteration and annotated translation.
Funding: This research was supported by the Researchers Supporting Program (TUMA Project-2021-27), Almaarefa University, Riyadh, Saudi Arabia.
Funding: Supported by the Chongqing Administration of Science and Technology (Grant Nos. cstc2021jxjl20008 and cstc2020jcyj-msxmX1068) and the Chongqing Administration of Planning and Natural Resources (Grant No. KJ-2019018).
Abstract: Rock slides are among the common geohazards in the Three Gorges Reservoir area, affecting shipping on the Yangtze River and the safety of people living on its banks. In order to investigate the internal fracturing mechanism of the rock mass, a distributed microseismic monitoring network of 15 three-component geophones (3C geophones) was deployed in boreholes and outside the sliding mass on the unstable Dulong slope. The Stein Unbiased Risk Estimation (SURE) method was used to suppress noise in the microseismic records, and the decomposition parameters of the Continuous Wavelet Transform (CWT) were determined with the maximum energy of correlation coefficient (MECC) method. The signal-to-noise ratio was tripled after this processing, and source parameters were obtained with full-waveform inversion. The rupture volume model was computed by irregular-grid statistics on event density. It shows that the rock slide is of small scale and composed of a single block. Moreover, the relationship among microseismicity, displacement and rainfall is discussed. The deformation rate changed dramatically during periods of intensive events, with good consistency especially during rainfall periods. Although there is a time delay, continuous rainfall is more likely to cause an increase in microseismic events. The results show that the Dulong slope is a shallow rock slide in a state of creep deformation, and that the rupture mechanism of the rock mass is a left-lateral normal fault with shear failure. The research provides key information for the early warning and prevention of rock slides and helps to reduce the risk of geohazards.
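A minimal sketch of SURE-based soft thresholding of the kind applied to wavelet coefficients in such denoising pipelines (the Donoho–Johnstone SureShrink rule). The CWT decomposition and the MECC parameter selection of the paper are omitted; noise variance is assumed to be 1, and the signal is synthetic.

```python
import numpy as np

def soft_threshold(x, t):
    """Shrink coefficients toward zero by t, zeroing those below t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sure_threshold(x):
    """Pick the threshold minimizing Stein's Unbiased Risk Estimate
    (unit noise variance assumed; rescale coefficients accordingly)."""
    a = np.sort(np.abs(x))
    n = len(x)
    cums = np.cumsum(a**2)
    risks = []
    for k, t in enumerate(a):
        # SURE(t) = n - 2*#{|x_i| <= t} + sum_i min(x_i^2, t^2)
        risks.append(n - 2 * (k + 1) + cums[k] + (n - k - 1) * t**2)
    return a[int(np.argmin(risks))]

rng = np.random.default_rng(0)
signal = np.zeros(256)
signal[::32] = 5.0                       # sparse spikes (stand-in for events)
noisy = signal + rng.normal(0, 1, 256)
t = sure_threshold(noisy)
denoised = soft_threshold(noisy, t)
```

On sparse-in-wavelet-domain signals this choice of threshold typically reduces the mean squared error well below that of the raw noisy record.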
Funding: Supported by the National Natural Science Foundation of China (Grant No. 61501059) and the Education Department of Jilin Province (Grant No. 2016343).
Abstract: Traditional energy-efficiency power allocation algorithms in cognitive radio networks (CRNs) usually assume constant or perfect channel state information, which may lead to performance degradation in real systems with disturbances or uncertainties. To solve this problem, we propose a robust energy-efficiency power allocation algorithm for underlay cognitive radio (CR) systems with channel uncertainty, taking into account the interference power threshold constraint and the minimum target SINR requirement constraint. Ellipsoid sets are used to describe the channel uncertainty, and the constrained fractional program for the allocation is transformed into a convex optimization problem by a worst-case optimization approach. A simplified version of the robust energy-efficiency scheme with lower complexity, obtained via a substitute constraint, is also presented. Simulation results show that our proposed scheme provides higher energy efficiency than a capacity-maximization algorithm and guarantees the signal-to-interference-plus-noise ratio (SINR) requirement of each cognitive user under channel uncertainty.
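The fractional (rate over power) objective and the worst-case treatment of uncertainty can be illustrated in a single-channel toy problem. Dinkelbach-style iteration is a standard way to solve such fractional programs; the interval uncertainty below is a one-dimensional stand-in for the paper's ellipsoid sets, and all parameter values are made up.

```python
import math

def dinkelbach_ee(g_nominal, g_err, N0=1.0, Pc=0.5, p_max=4.0, iters=30):
    """Maximize EE(p) = log2(1 + g*p/N0) / (p + Pc) over p in [0, p_max],
    using the worst-case gain g = g_nominal - g_err (robust design)."""
    g = g_nominal - g_err
    lam = 0.0          # current energy-efficiency estimate
    p = p_max
    for _ in range(iters):
        if lam > 0:
            # inner problem: maximize log2(1+g*p/N0) - lam*(p+Pc);
            # stationarity gives a water-filling-like closed form
            p = min(max(1.0 / (lam * math.log(2)) - N0 / g, 0.0), p_max)
        lam = math.log2(1 + g * p / N0) / (p + Pc)   # Dinkelbach update
    return p, lam      # optimal power and achieved energy efficiency

p_rob, ee_rob = dinkelbach_ee(2.0, 0.5)   # robust (worst-case) design
p_nom, ee_nom = dinkelbach_ee(2.0, 0.0)   # nominal (perfect CSI) design
```

As expected, designing for the worst-case gain sacrifices some energy efficiency relative to the nominal design, in exchange for a guarantee under uncertainty.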
Funding: Funded by the National Natural Science Foundation of China (31471762) and the Fundamental Research Funds for the Central Universities of China (2018NTST03).
Abstract: Locusts are agricultural pests around the world. To understand how locust distribution density and community structure are related to the hydrothermal and vegetation growth conditions of their habitats, and thereby provide rapid and accurate warning of locust invasions, it is important to develop efficient and accurate techniques for acquiring locust information. In this paper, by analyzing the differences between the morphological features of Locusta migratoria manilensis and Oedaleus decorus asiaticus, we propose a semi-automatic locust species and instar detection model based on locust image segmentation, locust feature variable extraction and support vector machine (SVM) classification, and we examine its applicability and accuracy on sample image data acquired in the field. A locust image segmentation experiment showed that the proposed GrabCut-based interactive segmentation method can rapidly extract images of various locust body parts and exhibits excellent operability. In a locust feature variable extraction experiment, the textural, color and morphological features of various locust body parts were calculated. Based on the results, eight feature variables were selected to identify locust species and instars using outlier detection, variable function calculation and principal component analysis. An SVM-based locust classification experiment achieved a semi-automatic detection accuracy of 96.16% when a polynomial kernel function with a penalty factor c of 2040 and a gamma parameter g of 0.5 was used. The proposed detection model exhibits high applicability and accuracy when used to identify instars of L. migratoria manilensis and O. decorus asiaticus, and it can also be used to identify other species of locusts.
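The classifier in this experiment is an SVM with a polynomial kernel (penalty factor c = 2040, gamma g = 0.5). The sketch below illustrates kernelized classification with the same form of polynomial kernel, using a tiny kernel perceptron in place of a full SVM solver; the feature vectors are made up, and the kernel's degree and coef0 are assumptions not stated in the abstract.

```python
import numpy as np

def poly_kernel(X, Y, gamma=0.5, coef0=1.0, degree=3):
    """Polynomial kernel of the form used with SVMs: (gamma*<x,y>+coef0)^degree."""
    return (gamma * X @ Y.T + coef0) ** degree

def kernel_perceptron(X, y, epochs=20, **kw):
    """Train dual coefficients alpha on labels y in {-1, +1}."""
    alpha = np.zeros(len(X))
    K = poly_kernel(X, X, **kw)
    for _ in range(epochs):
        for i in range(len(X)):
            if np.sign(np.sum(alpha * y * K[:, i])) != y[i]:
                alpha[i] += 1.0      # perceptron update on a mistake
    return alpha

def predict(alpha, X_train, y, X_new, **kw):
    K = poly_kernel(X_train, X_new, **kw)
    return np.sign((alpha * y) @ K)

# toy two-class "feature variables" (e.g. color/texture measures per specimen)
X = np.array([[0.2, 0.1], [0.3, 0.2], [0.8, 0.9], [0.9, 0.7]])
y = np.array([-1, -1, 1, 1])
alpha = kernel_perceptron(X, y)
print(predict(alpha, X, y, np.array([[0.25, 0.15], [0.85, 0.8]])))
```

In practice one would use a proper SVM implementation with soft margin (the penalty factor c) rather than this perceptron, but the kernel mechanics are the same.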
Funding/Acknowledgements: This work is supported by the National Natural Science Foundation of China (No. 61171079). The authors would like to thank the editors and the anonymous reviewers for their detailed constructive comments that helped to improve the presentation of this paper.
Abstract: Most resource allocation algorithms in cognitive radio networks are based on an interference power constraint. Instead of the conventional primary-user interference constraint, we introduce a new criterion, an allowable signal-to-interference-plus-noise ratio (SINR) loss constraint, to protect primary users during cognitive transmission. Considering the power allocation problem for cognitive users over flat fading channels, we propose a new power allocation algorithm that maximizes the throughput of cognitive users subject to the allowable SINR loss constraint and a maximum transmit power for each cognitive user. Computer simulations comparing our proposed algorithm with an algorithm based on the interference power constraint show that it achieves higher throughput and provides stability to cognitive radio networks.
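The SINR-loss criterion admits a simple closed form in a single-link sketch: cognitive transmission raises the primary receiver's effective noise floor from n0 to n0 + p·g_sp, so bounding the SINR loss directly caps the cognitive power p. All symbol names and numbers below are illustrative, not taken from the paper.

```python
def max_cognitive_power(loss_db, n0, g_sp, p_max):
    """Largest cognitive transmit power keeping the primary user's SINR loss
    below loss_db, where SINR_loss = (n0 + p*g_sp) / n0 <= 10**(loss_db/10).
    g_sp is the cognitive-transmitter-to-primary-receiver channel gain."""
    loss_lin = 10 ** (loss_db / 10)
    p_cap = n0 * (loss_lin - 1) / g_sp    # closed-form cap from the constraint
    return min(p_cap, p_max)              # also respect the transmit-power limit

# e.g. allow a 1 dB primary SINR loss
p = max_cognitive_power(loss_db=1.0, n0=1e-3, g_sp=0.01, p_max=1.0)
```

Unlike a raw interference-power cap, this constraint scales with the primary receiver's own noise level, which is what makes the criterion adaptive.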
Funding: Supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 10905028), the Natural Science Foundation of Hunan Province of China (Grant No. 07JJ3013), the Program for the Science and Technology Department of Henan Province of China (Grant No. 102300410050), and the Foundation of the Hunan Provincial Education Department of China (Grant No. 06A038).
Abstract: Considering a quantum model consisting of two effective two-level atoms and a single-mode cavity, this paper investigates the entanglement dynamics between the two atoms and studies the effect of the Stark shift on the entanglement. The results show that, on the one hand, the atom-atom entanglement evolves periodically with time and the periods are affected by the Stark shift; on the other hand, the two atoms are never disentangled when the Stark shift is considered, and for large values of the Stark shift parameter the two atoms can remain in a stationary entangled state. In addition, for an initially partially entangled atomic state, the atom-atom entanglement can be greatly enhanced by the presence of the Stark shift. These properties show that the Stark shift can be used to control the entanglement between two atoms.
Funding: Supported by the NSF of Hebei Province (A2009000253).
Abstract: Pooling design is a mathematical tool in many application areas. In this paper, we give a new construction of a pooling design using subspaces of the pseudo-symplectic space and discuss its properties. We define the design parameters of a d^z-disjunct matrix and then discuss how the design parameters in our construction change with their variables.
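Under the convention common in the pooling-design literature (stated here as an assumption, since conventions vary slightly between papers), the defining property of a d^z-disjunct binary matrix M is:

```latex
M \text{ is } d^z\text{-disjunct} \iff
\forall\, c_0,\ \forall\, D \subseteq \mathrm{cols}(M)\setminus\{c_0\},\ |D| = d:\quad
\Big|\operatorname{supp}(c_0) \setminus \bigcup_{c \in D} \operatorname{supp}(c)\Big| \;\ge\; z .
```

That is, every column retains at least z private 1-rows outside the union of any d other columns; in the usual decoding argument such a design identifies up to d positives while tolerating roughly ⌊(z−1)/2⌋ erroneous test outcomes.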
Abstract: Recent advancements in hardware and communication technologies have enabled worldwide interconnection through the Internet of Things (IoT). The IoT is the backbone of smart city applications such as smart grids and green energy management. In smart cities, IoT devices are used to link power, price, energy and demand information for smart homes and home energy management (HEM) in the smart grids. In complex smart grid-connected systems, power scheduling and the secure dispatch of information are the main research challenges. These challenges can be addressed through various machine learning techniques and data analytics. In this paper, we propose a particle swarm optimization based machine learning algorithm, known as collaborative execute-before-after dependency-based requirement, for the smart grid. The proposed algorithm works in two phases: analysis and assessment of the requirements of end users and power distribution companies. In the first phase, a fixed load is adjusted over a period of 24 h, and in the second phase, a randomly produced population load for 90 days is evaluated using particle swarm optimization. The simulation results demonstrate that the proposed algorithm performs better in terms of percentage cost reduction, peak-to-average ratio, and power variance mean ratio than particle swarm optimization and inclined block rate.
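A minimal particle swarm optimization sketch of the load-evaluation idea: minimize daily electricity cost over a 24-hour schedule under a fixed total demand. The price profile and demand are made up, and the paper's collaborative execute-before-after dependency logic is not reproduced; this only shows the PSO machinery.

```python
import numpy as np

def pso_schedule(price, total_demand, n_particles=30, iters=200, seed=0):
    """Standard PSO over hourly load shares; shares are rescaled so the
    schedule always meets total_demand exactly."""
    rng = np.random.default_rng(seed)
    dim = len(price)
    pos = rng.uniform(0.1, 1.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))

    def cost(x):
        load = x / x.sum() * total_demand      # normalize to the daily demand
        return float(price @ load)

    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + cognitive + social terms (typical PSO coefficients)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.01, 1.0)
        c = np.array([cost(p) for p in pos])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], c[better]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest / gbest.sum() * total_demand, pbest_cost.min()

price = np.array([0.1] * 8 + [0.3] * 8 + [0.2] * 8)   # off-peak / peak / mid
load, cost = pso_schedule(price, total_demand=24.0)
```

The swarm learns to shift load into the cheap off-peak hours, driving the cost well below that of a flat schedule (which would cost 4.8 with these prices).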
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 11374372), the Natural Science Foundation of Shandong Province, China (Grant No. ZR2014EMQ006), the Postdoctoral Science Foundation of China (Grant No. 2014M551983), the Postdoctoral Applied Research Foundation of Qingdao City, China (Grant No. 2014), the Fundamental Research Funds for the Central Universities, China (Grant Nos. 12CX04087A and 14CX02018A), and the Qingdao Science and Technology Program, China (Grant No. 14-2-4-27-jch).
Abstract: By employing molecular mechanics and molecular dynamics simulations, we investigate the radial collapse and elasticity of single-walled carbon nanotubes (SWCNTs) of different chiralities containing divacancy and 5-8-5 defects. It is found that divacancy and 5-8-5 defects can reduce the collapse pressure (Pc) of SWCNT (10, 10), while the 5-8-5 defect can greatly increase the Pc of SWCNT (17, 0); for example, the 5-8-5 defect can increase the Pc of SWCNT (17, 0) by 500%. A model is established to understand the effects of chirality, divacancy and 5-8-5 defects on the radial collapse of SWCNTs. The results are particularly valuable for understanding the mechanical behavior of SWCNTs with divacancy and 5-8-5 defects, which may be considered as fillers for high-loading composites.
Funding: Science Fund of Shanghai Municipal Education Commission (03DZ13).
Abstract: This paper proposes a universal steganalysis approach based on a quantization attack that can detect several kinds of data hiding algorithms for grayscale images. In practice, most techniques produce stego images that are perceptually identical to the cover images but exhibit statistical irregularities that distinguish them from cover images. By attacking the suspicious images with the quantization method, we can obtain statistically different results from embedded-and-quantization-attacked images and from quantization-attacked-but-not-embedded sources. We have developed a technique based on a one-class SVM for discriminating between cover images and stego images. Simulation results show that our approach is able to distinguish between cover and stego images with reasonable accuracy.
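The detector here is a one-class SVM trained only on cover-image statistics. As a dependency-light stand-in that shows the same one-class idea (fit a boundary around the cover class, flag outliers as suspected stego), the sketch below uses a Mahalanobis-distance novelty detector on synthetic feature vectors; the actual quantization-attack features and the SVM itself are not reproduced.

```python
import numpy as np

class OneClassDetector:
    """Fit on 'cover' feature vectors only; flag outliers as stego."""

    def fit(self, X, quantile=0.95):
        self.mu = X.mean(axis=0)
        # regularized inverse covariance for the Mahalanobis distance
        self.cov_inv = np.linalg.inv(np.cov(X.T) + 1e-6 * np.eye(X.shape[1]))
        self.threshold = np.quantile(self._dist(X), quantile)
        return self

    def _dist(self, X):
        diff = X - self.mu
        return np.einsum('ij,jk,ik->i', diff, self.cov_inv, diff)

    def predict(self, X):
        # +1 = looks like a cover image, -1 = suspected stego image
        return np.where(self._dist(X) <= self.threshold, 1, -1)

rng = np.random.default_rng(1)
cover = rng.normal(0, 1, (200, 4))   # synthetic cover-image statistics
stego = rng.normal(3, 1, (20, 4))    # embedding shifts the statistics
det = OneClassDetector().fit(cover)
```

The key property shared with the one-class SVM is that no stego samples are needed at training time, which is what makes the scheme "universal" across embedding algorithms.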
Abstract: One of the challenges in accurately estimating photovoltaic (PV) cell electrical performance is the uncertainty of the equivalent-circuit model parameters. The parameters considered in this study are the series resistance, shunt resistance, photocurrent, saturation current, and diode ideality factor. Parameter estimation for the PV cell equivalent-circuit model is challenging due to the implicit transcendental relationship of the I-V characteristics of the cell. This paper presents a fuzzy-logic-based study for estimating the uncertainty of the cell parameters. The model parameters, which change with temperature and irradiance, are the source of the uncertainties. Mathematical programming is used to estimate the fuzzy parameters. The approach is applied to practical data, and the results of the analysis provide estimates of the PV cell parameters. This research yields better estimated parameters than other methods, as measured by the Absolute Mean Error (AME).
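The "implicit transcendental relationship" referred to is the standard single-diode model (symbol names may differ from the paper's):

```latex
I \;=\; I_{ph} \;-\; I_s\!\left[\exp\!\left(\frac{V + I R_s}{n\,V_T}\right) - 1\right]
\;-\; \frac{V + I R_s}{R_{sh}},
\qquad V_T = \frac{kT}{q},
```

where I_ph is the photocurrent, I_s the saturation current, n the ideality factor, R_s and R_sh the series and shunt resistances, and V_T the thermal voltage. The current I appears on both sides inside the exponential, so the equation cannot be solved for I in closed elementary form, which is why numerical or fuzzy estimation of the five parameters is needed.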
Funding: Partially supported by the SRFDP (20070183053) and the Young Fund of the College of Mathematics at Jilin University.
Abstract: In this note, we study some properties of local random pull-back attractors on compact metric spaces. We obtain some relations between attractors and their fundamental neighborhoods and basins of attraction. We also obtain some properties of omega-limit sets, as well as the connectedness of random attractors. A simple deterministic example is given to illustrate some subtle points.
Abstract: This paper explores data mining issues in Small and Medium Enterprises (SMEs), first examining the relationship between data mining and economic development. SMEs contribute most of the employment and output in an emerging economy such as the Kingdom of Saudi Arabia, and adopting technology will improve their potential for effective decision making and efficient operations. Hence, it is important that SMEs have access to data mining techniques and implement those best suited to their business to improve their business intelligence (BI). The paper critically reviews the existing literature on data mining in the field of SMEs to provide a theoretical underpinning for future work. Data mining has been found to be complicated and fragmented, with a multitude of options available to businesses, from quite basic systems implemented in Excel or Access to more sophisticated cloud-based systems. For any business, data mining is a trade-off between the need for data analysis and intelligence and the cost and resource use of the system put in place. Multiple challenges to data mining have been identified, most notably the resource-intensive nature of such systems (in terms of both labor and capital) and the security issues of data collection, analysis and storage, with the General Data Protection Regulation (GDPR) a key focus for businesses in the Kingdom of Saudi Arabia. Given these challenges, the paper suggests that an SME start small with an internal data mining exercise to digitalize and analyze its customer data, scaling up over time as the business grows and acquires the resources needed to properly manage any system.
Funding: The authors extend their appreciation to the Researchers Supporting Project (No. RSP2022R515), King Saud University, Riyadh, Saudi Arabia, for funding this research work.
Abstract: In this work, a particle swarm optimization based method, named non-Gaussian improved particle swarm optimization, is introduced for minimizing the cost of energy (COE) of wind turbines (WTs) on high-altitude sites. Since the COE depends on site-specific constants and the initialized parameters of the wind turbine, the focus is on the design optimization of rotor radius, hub height and rated power. Based on the literature, the COE is converted to the Saudi Arabian context, and the constrained wind turbine optimization problem is developed. The non-Gaussian improved particle swarm optimization is then presented and compared with conventional particle swarm optimization for solving the design optimization of wind turbine efficiency at altitudes ranging from 2500 to 4000 m. The results show that as altitude rises, the optimal rotor radius grows, but the optimal hub height and rated power drop, resulting in an increase in COE. Further, the non-Gaussian method displays faster convergence than classical particle swarm optimization. These findings will be useful as a reference for wind turbine design at high altitudes and could be employed to optimize the initialized parameters of wind turbines for the planned wind farm, the largest in Saudi Arabia, at the selected Dumat Al-Jandal site.
文摘<div style="text-align:justify;"> <span style="font-family:Verdana;">The main objective of this research is to discuss the current legal and methodological issues in the field of software Re-Usability. Though there are enormous online forums discussing such issues via Q&A but this paper is an attempt to raise the awareness about the legal issues, which a software engineer may trap into. The paper discussed the current issues with software reusability within the legal and methodological context. This paper applied an extensive literature review to critically appraise the past studies to come to a collective conclusion. Prior to discussing the issues, the benefits of reuse were mentioned, including the saving of time and cost for users. But legally the reuse of software assets creates complexities for the user in relation to meeting all the licensing requirements and dealing with the liability in case of a breach. Methodologically, there are major barriers to reused software when it comes to technical competence and managerial issues such as a lack of resources. Even when reusing software to save time, and leverage off the specialization of other authors, the end-user must also have the technical expertise to search, adapt and merge these reusable assets into the larger software infrastructure. The review ultimately shows the high barriers still remain to software reuse which could mean that smaller developers and businesses will still be reluctant to fully utilize open-source components to the best advantage.</span> </div>
Abstract: Big data is a term that refers to data sets that, due to their size or complexity, cannot be stored or processed with the usual tools or applications for data management, and it has become a prominent term in recent years with the massive development of technology. Almost immediately thereafter, the term "big data mining" emerged, i.e., mining from big data, as an emerging and interconnected field of research. Classification is an important stage in data mining since it helps people make better decisions in a variety of situations, including scientific endeavors, biomedical research, and industrial applications. The probabilistic neural network (PNN) is a commonly used and successful method for handling classification and pattern recognition problems. In this study, the authors propose to combine the probabilistic neural network (PNN), one of the data mining techniques, with the vibrating particles system (VPS), one of the metaheuristic algorithms, into a method named "VPS-PNN", to solve classification problems more effectively. The suggested method was tested on eleven common benchmark medical datasets from the machine-learning repository. The suggested VPS-PNN mechanism outperforms the PNN, biogeography-based optimization, the enhanced water cycle algorithm (E-WCA) and the firefly algorithm (FA) in terms of convergence speed and classification accuracy.
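A probabilistic neural network is essentially a Parzen-window classifier: each training sample contributes a Gaussian kernel, per-class scores are averaged kernel responses, and the argmax class wins. The sketch below shows that core; the VPS metaheuristic of the paper (which would, e.g., tune the smoothing parameter) is replaced by a fixed value, and the data are made up.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Classify each test point by averaged Gaussian kernel responses
    per class (the pattern and summation layers of a PNN)."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances
        k = np.exp(-d2 / (2 * sigma ** 2))           # Parzen kernel responses
        scores = [k[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# toy two-class data standing in for medical feature vectors
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[0.1, 0.0], [1.0, 0.9]])))
```

Because training is just storing the samples, the only free quantity is the smoothing parameter sigma, which is exactly the kind of scalar a metaheuristic such as VPS can optimize for accuracy.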
Abstract: This paper broadly illustrates the main origins of English and Chinese names, that is, origins in characters from ancient mythology, religion, typical literary images, common people in daily life, and famous trademarks. It is therefore necessary to focus on names involving cultural connotations. In addition, it introduces transliteration and annotated translation.