Funding: This work was funded by the Florida Department of Agriculture and Consumer Services Specialty Crop Block Grant #018023 and the Citrus Research and Development Foundation #834.
Abstract: Huanglongbing (HLB), a systemic and destructive disease of citrus, is associated with 'Candidatus Liberibacter asiaticus' (Las) in the United States. Our earlier work has shown that Las bacteria were significantly reduced or eliminated when potted HLB-affected citrus were continuously exposed to high temperatures of 40 to 42°C for a minimum of 48 h. To determine the feasibility and effectiveness of solar thermotherapy in the field, portable plastic enclosures were placed over commercial and residential citrus, exposing trees to high temperatures through solarization. Within 3–6 weeks after treatment, most trees responded with vigorous new growth. Las titer in new growth was greatly reduced for 18–36 months after treatment. Unlike with potted trees, exposure to high heat did not eradicate the Las population under field conditions. This may be attributed to reduced temperatures at night in the field, compared with the continuous high-temperature exposure that can be maintained in growth chambers, and to the failure to achieve therapeutic temperatures in the root zone. Despite the presence of Las in heat-treated commercial citrus, many trees produced abundant flush and grew vigorously for 2 to 3 years after treatment. Transcriptome analysis comparing healthy trees to HLB-affected citrus both before and after heat treatment demonstrated that post-treatment transcriptional expression patterns more closely resembled those of healthy controls for most differentially expressed genes, and that genes involved in plant–bacterium interactions are upregulated after heat treatment. Overall, these results indicate that solar thermotherapy can be an effective component of an integrated control strategy for citrus HLB.
Funding: This work is supported by the National Science Foundation under Award OIA-1757207.
Abstract: To accommodate the large number of Internet of Things (IoT) connections in mobile networks, Non-Orthogonal Multiple Access (NOMA) has been applied as the resource sharing mechanism in mobile access networks to improve spectrum efficiency. NOMA-based resource management for uplink communications comprises two problems: user clustering, and power and wireless channel allocation. User clustering refers to assigning users (i.e., IoT devices) to different clusters, where users in the same cluster share the wireless channels to upload their data; power and channel allocation optimizes the transmission power of the users and the number of wireless channels allocated to the users in a cluster. The two problems are coupled, which makes the NOMA-based resource management problem difficult to solve. In this paper, we propose a QoS-aWarE resourcE managemenT (SWEET) for NOMA algorithm to jointly optimize user clustering, power management, and wireless channel allocation such that the number of wireless channels is minimized and the data rate requirements of the users are satisfied. The performance of SWEET is validated via extensive simulations.
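The uplink rate model underlying this kind of NOMA resource management can be sketched with the standard successive-interference-cancellation (SIC) formula. This is an illustrative sketch, not the SWEET algorithm itself; the function and parameter names are assumptions.

```python
import math

def noma_uplink_rates(powers, gains, noise=1.0, bandwidth=1.0):
    """Achievable uplink rates for one NOMA cluster under SIC.

    Users sharing a channel are decoded in order of decreasing received
    power p*g; while user i is decoded, the not-yet-decoded users in the
    cluster act as interference."""
    # Decode the strongest received signal first (a typical SIC order).
    order = sorted(range(len(powers)),
                   key=lambda i: powers[i] * gains[i], reverse=True)
    rates = [0.0] * len(powers)
    for k, i in enumerate(order):
        # Interference from users not yet decoded.
        interference = sum(powers[j] * gains[j] for j in order[k + 1:])
        sinr = powers[i] * gains[i] / (interference + noise)
        rates[i] = bandwidth * math.log2(1 + sinr)
    return rates
```

A resource manager would compare such rates against each user's QoS requirement when deciding cluster membership and transmit powers.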
Funding: Supported by the Guangxi Major Project of Science and Technology (Guike AA18118027), the Postdoctoral Project of the Hainan Yazhou Bay Seed Laboratory Program (B21Y10203), and the Scientific Research and Development Fund of the College of Agriculture, Guangxi University (EE101731).
Abstract: The lemon (Citrus limon; family Rutaceae) is one of the most important and popular fruits worldwide. Lemon also tolerates huanglongbing (HLB), a devastating citrus disease. Here we produced a gap-free and haplotype-resolved chromosome-scale genome assembly of the lemon by combining Pacific Biosciences circular consensus sequencing, Oxford Nanopore 50-kb ultra-long reads, and high-throughput chromatin conformation capture technologies. The assembly contained nine pairs of chromosomes with a contig N50 of 35.6 Mb and zero gaps, and a total of 633.0 Mb of genomic sequence was generated. The origination analysis identified 338.5 Mb of genomic sequence originating from citron (53.5%), 147.4 Mb from mandarin (23.3%), and 147.1 Mb from pummelo (23.2%). The genome included 30,528 protein-coding genes, and most of the assembled sequences were found to be repetitive. Several significantly expanded gene families were associated with plant-pathogen interactions, plant hormone signal transduction, and the biosynthesis of major active components, such as terpenoids and flavor compounds. Most HLB-tolerance genes were expanded in the lemon genome, including 2-oxoglutarate (2OG)/Fe(II)-dependent oxygenase, constitutive disease resistance 1, cell wall-related genes, and lignin synthesis genes. Comparative transcriptomic analysis showed that phloem regeneration and lower levels of phloem plugging are the elements that contribute to HLB tolerance in lemon. Our results provide insight into lemon genome evolution, active component biosynthesis, and genes associated with HLB tolerance.
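The contig N50 reported for the assembly is a standard summary statistic; as a quick reference, it can be computed as follows (illustrative code, not from the paper):

```python
def contig_n50(lengths):
    """Contig N50: the length L such that contigs of length >= L
    together cover at least half of the total assembly size."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
    return 0
```

A larger N50 with zero gaps, as reported here, indicates a highly contiguous assembly.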
Funding: Supported in part by the U.S. National Science Foundation under Grant Nos. CCF-1551511 and CNS-1551262.
Abstract: Modern computer systems are increasingly bounded by the available or permissible power at multiple layers, from individual components to data centers. To cope with this reality, it is necessary to understand how power bounds impact performance, especially for systems built from high-end nodes, each consisting of multiple power-hungry components. Because placing an inappropriate power bound on a node or a component can lead to severe performance loss, coordinating power allocation among nodes and components is mandatory to achieve desired performance given a total power budget. In this article, we describe the paradigm of power-bounded high-performance computing, which considers coordinated power bound assignment to be a key factor in computer system performance analysis and optimization. We apply this paradigm to the problem of power coordination across multiple layers for both CPU and GPU computing. Using several case studies, we demonstrate how the principles of balanced power coordination can be applied and adapted to the interplay of workloads, hardware technology, and the available total power for performance improvement.
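As an illustration of the coordination problem (a naive baseline, not the paper's actual policy), a node-level budget can be split across components by guaranteeing each component its minimum power and dividing the remainder in proportion to headroom:

```python
def allocate_power(budget, components):
    """Split a node-level power budget (watts) across components.

    components: dict name -> (min_w, max_w) operating range. Every component
    first receives its minimum; the remaining budget is spread in proportion
    to each component's headroom (max_w - min_w), clamped at max_w."""
    mins = {name: lo for name, (lo, hi) in components.items()}
    if budget < sum(mins.values()):
        raise ValueError("budget below the sum of component minimums")
    alloc = dict(mins)
    remaining = budget - sum(mins.values())
    headroom = {name: hi - lo for name, (lo, hi) in components.items()}
    total_headroom = sum(headroom.values())
    if total_headroom > 0:
        for name in components:
            extra = remaining * headroom[name] / total_headroom
            alloc[name] = min(alloc[name] + extra, components[name][1])
    return alloc
```

A real coordinator would instead weight the split by each component's marginal performance gain per watt for the running workload, which is exactly where balanced power coordination matters.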
Abstract: The Pathfinder paradigm has been used in generating and analyzing graph models that support clustering similar concepts and minimum-cost paths to provide an associative network structure within a domain. The co-occurrence pathfinder network (CPFN) extends the traditional pathfinder paradigm so that co-occurring concepts can be calculated at each sampling time. Existing algorithms take O(n^4) time to calculate the pathfinder network (PFN) at each sampling time for a non-complete input graph of a CPFN (r = ∞, q = n−1), where n is the number of nodes in the input graph, r is the Minkowski exponent, and q is the maximum number of links considered in finding a minimum-cost path between vertices. To reduce the complexity of calculating the CPFN, we propose a greedy-based algorithm, the MEC(G) algorithm, which takes shortcuts to avoid unnecessary steps in the existing algorithms and correctly calculates a CPFN (r = ∞, q = n−1) in O(k log k) time, where k is the number of edges of the input graph. Our example demonstrates the efficiency and correctness of the proposed MEC(G) algorithm, confirming our mathematical analysis of this algorithm.
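An O(k log k) bound for PFN(r = ∞, q = n−1) is plausible because, in that parameter setting, an edge survives exactly when its weight equals the minimax (bottleneck) path cost between its endpoints. The following sketch exploits that property with a union-find over weight-sorted edges; it illustrates the same complexity class, but it is not the MEC(G) algorithm from the paper:

```python
class DSU:
    """Union-find with path halving."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def pfn_inf(n, edges):
    """Edges surviving in PFN(r=inf, q=n-1) for edges given as (u, v, w).

    Edge (u, v, w) is kept iff u and v are NOT already connected using only
    edges of weight strictly less than w, i.e. w equals the minimax path
    cost between u and v. Dominated by the sort: O(k log k)."""
    edges = sorted(edges, key=lambda e: e[2])
    dsu, kept, i = DSU(n), [], 0
    while i < len(edges):
        j = i
        while j < len(edges) and edges[j][2] == edges[i][2]:
            j += 1
        # Test the whole equal-weight tier against strictly smaller edges
        # before merging, so tied edges are judged independently.
        for u, v, w in edges[i:j]:
            if dsu.find(u) != dsu.find(v):
                kept.append((u, v, w))
        for u, v, w in edges[i:j]:
            dsu.union(u, v)
        i = j
    return kept
```

In the triangle {(0,1,1), (1,2,1), (0,2,2)}, the weight-2 edge is pruned because its endpoints are already joined by the two cheaper edges.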
Abstract: In traditional networks, enabling new network functions often requires adding new proprietary middleboxes. However, finding the space and power to accommodate these middleboxes is becoming increasingly difficult,
Funding: National Natural Science Foundation of China (Nos. 62172308, U1626107, 61972297, and 62172144).
Abstract: Cyber-physical systems (CPS) have been widely deployed in critical infrastructures and are vulnerable to various attacks. Data integrity attacks, which manipulate sensor measurements and cause control systems to fail, are one of the most prominent threats to CPS. Anomaly detection methods have been proposed to secure CPS. However, existing anomaly detection studies usually require expert knowledge (e.g., system model-based approaches) or lack interpretability (e.g., deep learning-based approaches). In this paper, we present DEEPNOISE, a deep learning-based anomaly detection method for CPS with interpretability. Specifically, we utilize sensor and process noise to detect data integrity attacks. Such noise represents the intrinsic characteristics of the physical devices and the production process in CPS. One key enabler is that we use a robust deep autoencoder to automatically extract the noise from measurement data. Further, an LSTM-based detector is designed to inspect the obtained noise and detect anomalies. Data integrity attacks change noise patterns and are thus identified as the root cause of anomalies by DEEPNOISE. Evaluated on the SWaT testbed, DEEPNOISE achieves higher accuracy and recall than state-of-the-art model-based and deep learning-based methods. On average, when detecting direct attacks, the precision is 95.47%, the recall is 96.58%, and the F1 score is 95.98%. When detecting stealthy attacks, precision, recall, and F1 scores are between 96% and 99.5%.
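The detection metrics reported above follow the standard binary-classification definitions; as a quick reference (illustrative code, not from the paper):

```python
def precision_recall_f1(y_true, y_pred):
    """Precision, recall, and F1 for binary attack detection
    (1 = attack, 0 = normal)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

High recall matters most in this setting, since a missed data integrity attack can propagate into control failures.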
Funding: Supported by the National Natural Science Foundation of China (Grant No. 61772462) and the 100 Talents Program of Zhejiang University.
Abstract: Unsupervised image translation (UIT) studies the mapping between two image domains. Since such mappings are under-constrained, existing research has pursued various desirable properties such as distributional matching or two-way consistency. In this paper, we re-examine UIT from a new perspective: distributional semantics consistency, based on the observation that data variations contain semantics, e.g., shoes varying in color. Further, the semantics can be multi-dimensional, e.g., shoes also varying in style, functionality, etc. Given two image domains, matching these semantic dimensions during UIT will produce mappings with explicable correspondences, which has not been investigated previously. We propose distributional semantics mapping (DSM), the first UIT method that explicitly matches semantics between two domains. We show that distributional semantics has rarely been considered within and beyond UIT, even though it is a common problem in deep learning. We evaluate DSM on several benchmark datasets, demonstrating its general ability to capture distributional semantics. Extensive comparisons show that DSM not only produces explicable mappings, but also improves image quality in general.
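The idea of matching a single semantic dimension across two domains can be illustrated with empirical quantile matching: map each value in the source distribution to the equally-ranked value in the target distribution. This is a toy 1-D analogue for intuition only, not the DSM method:

```python
def quantile_match(source, target):
    """Map each source value to the equally-ranked target value, so the
    mapped source follows the target's empirical distribution while
    preserving the within-domain ordering of the semantic attribute."""
    ranked_target = sorted(target)
    n = len(source)
    # Rank each source value within its own domain.
    order = sorted(range(n), key=lambda i: source[i])
    mapped = [0.0] * n
    for rank, i in enumerate(order):
        # Pick the target value at the corresponding quantile.
        j = min(rank * len(ranked_target) // n, len(ranked_target) - 1)
        mapped[i] = ranked_target[j]
    return mapped
```

The rank-preserving correspondence is what makes such a mapping "explicable": the darkest source shoe maps to the darkest target shoe, and so on along each matched dimension.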
Abstract: In the past decade, dramatic progress has been made in the field of machine learning. This paper explores the possibility of applying deep learning to power system state estimation. Traditionally, physics-based models are used, including weighted least squares (WLS) and weighted least absolute value (WLAV). These models typically consider a single snapshot of the system without capturing temporal correlations of system states. In this paper, a physics-guided deep learning (PGDL) method is proposed. Specifically, inspired by autoencoders, deep neural networks (DNNs) are used to learn the temporal correlations. The estimated system states from the DNNs are then checked against physical laws by running them through a set of power flow equations. Hence, the proposed PGDL is both data-driven and physics-guided. The accuracy and robustness of the proposed PGDL method are compared with those of traditional methods on standard IEEE cases. Simulations show promising results, and the applicability is further discussed.
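For reference, the classical WLS baseline solves, for a linearized measurement model z = Hx + e with diagonal weights W, the normal equations x̂ = (HᵀWH)⁻¹HᵀWz. A minimal pure-Python sketch (matrix names illustrative; real estimators work on the nonlinear AC model):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting
    (small dense systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b_ for a, b_ in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def wls_estimate(H, W, z):
    """Weighted least squares: x_hat = (H^T W H)^{-1} H^T W z.

    H: m x n measurement matrix, W: list of m per-measurement weights
    (diagonal of the weight matrix), z: m measurements."""
    m, n = len(H), len(H[0])
    HtWH = [[sum(W[k] * H[k][i] * H[k][j] for k in range(m))
             for j in range(n)] for i in range(n)]
    HtWz = [sum(W[k] * H[k][i] * z[k] for k in range(m)) for i in range(n)]
    return solve(HtWH, HtWz)
```

Each call estimates one snapshot independently, which is exactly the limitation (no temporal correlation) that the proposed PGDL method targets.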
Funding: Supported by the National Natural Science Foundation of China (No. 61732016).
Abstract: We consider semantic image segmentation. Our method is inspired by Bayesian deep learning, which improves image segmentation accuracy by modeling the uncertainty of the network output. In contrast to modeling uncertainty, our method directly learns to predict the erroneous pixels of a segmentation network, which is modeled as a binary classification problem. This speeds up training compared with the Monte Carlo integration often used in Bayesian deep learning. It also allows us to train a branch to correct the labels of erroneous pixels. Our method consists of three stages: (i) predict the pixel-wise error probability of the initial result, (ii) redetermine new labels for pixels with high error probability, and (iii) fuse the initial result and the redetermined result with respect to the error probability. We formulate the error-pixel prediction problem as a classification task and employ an error-prediction branch in the network to predict pixel-wise error probabilities. We also introduce a detail branch to focus the training process on the erroneous pixels. We have experimentally validated our method on the Cityscapes and ADE20K datasets. Our model can easily be added to various advanced segmentation networks to improve their performance. Taking DeepLabv3+ as an example, our network achieves 82.88% mIoU on the Cityscapes test set and 45.73% on the ADE20K validation set, improving the corresponding DeepLabv3+ results by 0.74% and 0.13%, respectively.
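The three-stage pipeline culminates in fusing two label maps by error probability. A hard-threshold sketch of stage (iii), shown on flattened label lists for brevity (the paper fuses with respect to the error probability; the simple threshold rule and names here are illustrative assumptions):

```python
def fuse_predictions(initial, redetermined, error_prob, threshold=0.5):
    """Stage (iii) sketch: keep the initial label where the predicted
    error probability is low; otherwise take the redetermined label."""
    return [redet if p > threshold else init
            for init, redet, p in zip(initial, redetermined, error_prob)]
```

Because the fusion only overrides pixels flagged by the error-prediction branch, the refinement cannot degrade confidently correct regions of the initial segmentation.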