Background: Deep 3D morphable models (deep 3DMMs) play an essential role in computer vision. They are used in facial synthesis, compression, reconstruction and animation, avatar creation, virtual try-on, facial recognition systems and medical imaging. These applications require high spatial and perceptual quality of synthesised meshes. Despite their significance, these models have not been compared across different mesh representations or evaluated jointly with point-wise distance and perceptual metrics. Methods: We compare the influence of different mesh representation features on the spatial and perceptual fidelity of meshes reconstructed by various deep 3DMMs. The paper tests the hypothesis that building deep 3DMMs from meshes encoded with global representations leads to lower spatial reconstruction error, measured with L1- and L2-norm metrics, but underperforms on perceptual metrics; in contrast, using differential mesh representations, which describe differential surface properties, yields lower perceptual FMPD and DAME scores but higher spatial fidelity error. The influence of mesh feature normalisation and standardisation is also compared and analysed from both perceptual and spatial fidelity perspectives. Results: The results provide guidance for selecting mesh representations when building deep 3DMMs according to spatial and perceptual quality objectives, and propose combinations of mesh representations and deep 3DMMs that improve either the perceptual or the spatial fidelity of existing methods.
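To make the contrast between global and differential mesh representations concrete, here is a minimal sketch (not the paper's code; the toy mesh, adjacency construction and names are illustrative assumptions) that converts absolute vertex coordinates into uniform-Laplacian (delta) coordinates, one common differential representation:

```python
import numpy as np

# Toy mesh: 4 vertices, 2 triangles (a square split along a diagonal).
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 0.1]])
faces = np.array([[0, 1, 2], [0, 2, 3]])

def vertex_neighbours(faces, n_vertices):
    """Collect the one-ring neighbours of every vertex from the face list."""
    nbrs = [set() for _ in range(n_vertices)]
    for a, b, c in faces:
        nbrs[a].update((b, c)); nbrs[b].update((a, c)); nbrs[c].update((a, b))
    return nbrs

def to_delta_coordinates(vertices, nbrs):
    """Differential (uniform Laplacian) representation: each vertex minus
    the centroid of its one-ring neighbourhood."""
    delta = np.zeros_like(vertices)
    for i, ring in enumerate(nbrs):
        delta[i] = vertices[i] - vertices[list(ring)].mean(axis=0)
    return delta

nbrs = vertex_neighbours(faces, len(vertices))
delta = to_delta_coordinates(vertices, nbrs)
print("global (absolute) coordinates:\n", vertices)
print("differential (delta) coordinates:\n", delta)
```

A deep 3DMM built on the "global" representation would consume the raw vertex array, while a "differential" variant would consume the delta coordinates instead.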
A new chaos game representation (CGR) of protein sequences based on the detailed hydrophobic-hydrophilic (HP) model has been proposed by Yu et al (Physica A 337 (2004) 171). In the present paper, a CGR-walk model is proposed based on the new CGR coordinates for protein sequences from complete genomes. The new CGR coordinates based on the detailed HP model are converted into a time series, and a long-memory ARFIMA(p, d, q) model is introduced into protein sequence analysis. This model is applied to simulating real CGR-walk sequence data of twelve protein sequences. Remarkably long-range correlations are uncovered in the data, and the results obtained from these models are reasonably consistent with those available from the ARFIMA(p, d, q) model.
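A minimal sketch of the CGR-walk idea, assuming residues have already been mapped to four HP-style classes (the class-to-corner assignment below is a placeholder, not Yu et al.'s detailed HP grouping), with one CGR coordinate read off as the time series to be fitted by a long-memory model:

```python
import numpy as np

# Corner assignment for four residue classes of a detailed HP-type alphabet;
# the grouping here is a placeholder, not Yu et al.'s exact classification.
CLASS_CORNERS = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}

def cgr_walk(residue_classes):
    """Standard chaos-game iteration: step halfway towards the corner of the
    current residue class, starting from the centre of the unit square."""
    point = np.array([0.5, 0.5])
    trajectory = []
    for c in residue_classes:
        point = (point + np.array(CLASS_CORNERS[c])) / 2.0
        trajectory.append(point.copy())
    return np.array(trajectory)

# Toy protein encoded as residue classes (assumed input).
classes = np.random.default_rng(0).integers(0, 4, size=200)
traj = cgr_walk(classes)
# One simple way to obtain a time series for long-memory (ARFIMA-style)
# analysis: read off a single CGR coordinate along the sequence.
series = traj[:, 0]
print(series[:10])
```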
To make system-of-systems combat simulation models easy to develop and reuse, simulation model formal specification and representation are studied. According to the view of system-of-systems combat simulation, and based on DEVS, the simulation model's fundamental formalisms are explored; they include the entity model, the system-of-systems model and the experiment model, each given a rigorous formal specification. The XML data exchange standard is then combined with these formalisms to design an XML-based language, SCSL, that supports simulation model representation. The correspondence between SCSL and the simulation model formalisms is discussed, and the syntax and semantics of the elements in SCSL are detailed. Based on the formal specification, the abstract simulation algorithm is given and an SCSL virtual machine is designed that is capable of automatically interpreting and executing simulation models represented in SCSL. Finally, an application case is presented, which demonstrates the validity of the theory and verifies SCSL.
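The abstract does not reproduce SCSL or its virtual machine; as rough orientation only, the sketch below shows the kind of DEVS-style atomic model and next-event loop such a virtual machine would interpret (class and method names are illustrative assumptions, and external events are omitted):

```python
from dataclasses import dataclass

INFINITY = float("inf")

@dataclass
class Counter:
    """A minimal DEVS-style atomic model that emits five internal events."""
    count: int = 0
    def time_advance(self):           # ta(s): time until next internal event
        return 1.0 if self.count < 5 else INFINITY
    def output(self):                 # lambda(s): output just before transition
        return f"tick {self.count}"
    def internal_transition(self):    # delta_int(s)
        self.count += 1

def run(model, t_end=10.0):
    """Generic next-event loop of the kind an SCSL-style virtual machine would
    drive; a single atomic model with no external events, for illustration."""
    t = 0.0
    while True:
        dt = model.time_advance()
        if t + dt > t_end:
            break
        t += dt
        print(t, model.output())
        model.internal_transition()

run(Counter())
```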
Data-driven turbulence modeling studies have reached a stage where the basic framework is settled, but several essential issues remain that strongly affect performance. Two problems are studied in the current research: (1) the processing of the Reynolds stress tensor and (2) the coupling method between the machine learning model and the flow solver. For the Reynolds stress processing issue, we perform the theoretical derivation to extend the relevant tensor arguments of the Reynolds stress. Then, the tensor representation theorem is employed to give the complete irreducible invariants and integrity basis. An adaptive regularization term is employed to enhance the representation performance. For the coupling issue, an iterative coupling framework with consistent convergence is proposed and then applied to a canonical separated flow. The results are highly consistent with direct numerical simulation reference values, which demonstrates the validity of the current approach.
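For orientation, the classical (Pope-type) tensor representation that such derivations extend can be written as below; the paper enlarges the set of tensor arguments and invariants beyond the strain/rotation pair shown here, so this is only the baseline form.

```latex
\[
  b_{ij} \;=\; \sum_{n} g^{(n)}\!\left(\lambda_1,\dots,\lambda_m\right)\, T^{(n)}_{ij},
  \qquad
  T^{(1)}_{ij} = S_{ij}, \quad
  T^{(2)}_{ij} = S_{ik}\Omega_{kj} - \Omega_{ik}S_{kj}, \;\dots
\]
```

Here $b_{ij}$ is the Reynolds-stress anisotropy, $S_{ij}$ and $\Omega_{ij}$ are the normalised strain-rate and rotation-rate tensors, $\lambda_k$ are their joint invariants, and the scalar coefficients $g^{(n)}$ are what the machine learning model predicts.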
A sinusoidal representation of speech and a cochlear model are used to extract speech parameters in this paper, and a speech analysis/synthesis system controlled by the auditory spectrum is developed with the model. Computer simulation shows that speech can be synthesized with only 12 parameters per frame on average. The method has the advantages of few parameters, low complexity and high speech representation performance. The synthetic speech has high intelligibility.
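The sinusoidal model underlying such analysis/synthesis systems represents each frame by a small set of sinusoid parameters; written for one frame (this is the standard form, not the paper's cochlear-model parameterisation):

```latex
\[
  \hat{s}(n) \;=\; \sum_{l=1}^{L} A_l \cos\!\left(\omega_l\, n + \phi_l\right),
\]
```

so a frame is described by the amplitudes, frequencies and phases $\{A_l, \omega_l, \phi_l\}_{l=1}^{L}$; the roughly 12 parameters per frame reported above come from keeping only a few such components selected through the auditory spectrum.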
In distributed storage systems, file access efficiency has an important impact on the real-time nature of information forensics. As a popular approach to improving file access efficiency, a prefetching model fetches data before it is needed according to the file access pattern, which reduces I/O waiting time and increases system concurrency. However, a prefetching model needs to mine the degree of association between files to ensure the accuracy of prefetching. In the massive-small-file situation, the sheer volume of files poses a challenge to the efficiency and accuracy of relevance mining. In this paper, we propose a massive-file prefetching model based on an LSTM neural network with a cache transaction strategy to improve file access efficiency. Firstly, we propose a file clustering algorithm based on temporal and spatial locality to reduce the computational complexity. Secondly, we define cache transactions according to file occurrences in the cache, instead of time-offset-distance-based methods, to extract file block features accurately. Lastly, we propose a file access prediction algorithm based on an LSTM neural network which predicts the files that are most likely to be accessed. Experiments show that, compared with the traditional LRU and plain grouping methods, the proposed model notably increases the cache hit rate and effectively reduces I/O wait time.
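A minimal sketch of the prediction step, assuming file accesses have been mapped to integer IDs; the cache-transaction feature extraction and clustering described above are not reproduced, and all sizes and names are illustrative:

```python
import torch
import torch.nn as nn

class NextFilePredictor(nn.Module):
    """Generic LSTM that predicts the next file ID from a window of recent
    accesses; the paper's cache-transaction features are not modelled here."""
    def __init__(self, n_files=1000, emb_dim=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(n_files, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_files)

    def forward(self, file_ids):          # file_ids: (batch, seq_len)
        x = self.embed(file_ids)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # logits over the next file ID

model = NextFilePredictor()
batch = torch.randint(0, 1000, (8, 16))   # 8 access windows of length 16
logits = model(batch)
prefetch_candidates = logits.topk(4, dim=-1).indices  # top-4 files to prefetch
print(prefetch_candidates.shape)          # torch.Size([8, 4])
```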
Action model learning has become a hot topic in knowledge engineering for automated planning. A key problem for learning action models is to analyze state changes before and after action executions from observed "plan traces". To support such an analysis, a new approach is proposed to partition the propositions of plan traces into states. First, vector representations of propositions and actions are obtained by training a Skip-Gram neural network borrowed from natural language processing (NLP). Then, a type of semantic distance among propositions and actions is defined based on their similarity in the vector space. Finally, k-means and k-nearest neighbor (kNN) algorithms are exploited to map propositions to states. This approach, called state partition by word vector (SPWV), is implemented on top of a recent action model learning framework by Rao et al. Experimental results on the benchmark domains show that SPWV leads to a lower error rate of the learnt action model than the probability-based approach for state partition developed by Rao et al.
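A small sketch of the two main ingredients of such a pipeline, skip-gram embeddings and k-means clustering, using gensim and scikit-learn on toy traces (tokens and cluster count are illustrative; the kNN mapping step and Rao et al.'s framework are omitted):

```python
from gensim.models import Word2Vec
from sklearn.cluster import KMeans

# Each "sentence" is one plan trace written as a token sequence of actions and
# the propositions observed around them (toy data, not a benchmark domain).
traces = [
    ["at-a", "pick-up", "holding-x", "move-a-b", "at-b"],
    ["at-b", "put-down", "handempty", "move-b-a", "at-a"],
    ["at-a", "pick-up", "holding-x", "put-down", "handempty"],
]

# Skip-gram (sg=1) embeddings for propositions and actions.
w2v = Word2Vec(traces, vector_size=16, window=2, min_count=1, sg=1, seed=7)

# Cluster proposition vectors; each cluster is treated as one candidate state.
propositions = ["at-a", "at-b", "holding-x", "handempty"]
vectors = [w2v.wv[p] for p in propositions]
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(dict(zip(propositions, labels)))
```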
In order to improve the performance of wireless distributed peer-to-peer (P2P) file sharing systems, a general system architecture and a novel peer selecting model based on fuzzy cognitive maps (FCM) are proposed in this paper. The new model provides an effective approach to choosing an optimal peer from several resource discovery results for the best file transfer. Compared with the traditional min-hops scheme that uses hops as the only selection criterion, the proposed model uses FCM to investigate the complex relationships among various relevant factors in wireless environments and gives an overall evaluation score for each candidate. It also has strong scalability, being independent of specific P2P resource discovery protocols. Furthermore, a complete implementation is explained in concrete modules. The simulation results show that the proposed model is effective and feasible compared with the min-hops scheme, with the successful transfer rate increased by at least 20% and the transfer time improved by as much as 34%.
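A minimal sketch of how an FCM can turn several wireless-related factors into one evaluation score for a candidate peer; the concepts, weights and update rule below are illustrative assumptions, not the paper's calibrated map:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fcm_score(initial_state, weights, n_iter=10):
    """Iterate a fuzzy cognitive map: concept activations are repeatedly
    combined through weighted causal links until they settle.
    Uses the common rule A(t+1) = f(A(t) + A(t) @ W)."""
    a = np.asarray(initial_state, dtype=float)
    for _ in range(n_iter):
        a = sigmoid(a + a @ weights)
    return a

# Illustrative concepts for one candidate peer:
# 0 hop count, 1 link bandwidth, 2 residual energy, 3 peer load, 4 "goodness".
W = np.array([
    [0.0, 0.0, 0.0, 0.0, -0.6],   # more hops  -> lower goodness
    [0.0, 0.0, 0.0, 0.0,  0.7],   # bandwidth  -> higher goodness
    [0.0, 0.0, 0.0, 0.0,  0.5],   # energy     -> higher goodness
    [0.0, 0.0, 0.0, 0.0, -0.4],   # load       -> lower goodness
    [0.0, 0.0, 0.0, 0.0,  0.0],
])
candidate = [0.2, 0.9, 0.7, 0.3, 0.0]
print("evaluation score:", fcm_score(candidate, W)[-1])
```

The peer with the highest final "goodness" activation would be selected for the transfer.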
Traditional topic models have been widely used for analyzing semantic topics from electronic documents. However, the topic words they acquire are poor in readability and consistency; only domain experts can guess their meaning. In fact, phrases are the main unit by which people express semantics. This paper presents Distributed Representation-Phrase Latent Dirichlet Allocation (DR-Phrase LDA), a phrase topic model. Specifically, we enhance the semantic information of phrases via distributed representations in this model. The experimental results show that the topics acquired by our model are more readable and consistent than those of other similar topic models.
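A minimal phrase-topic-model baseline, assuming gensim: frequent bigrams are merged into phrase tokens before running LDA; the distributed-representation enhancement that defines DR-Phrase LDA is not reproduced here:

```python
from gensim.corpora import Dictionary
from gensim.models import LdaModel, Phrases

docs = [
    ["topic", "model", "latent", "dirichlet", "allocation", "topic", "model"],
    ["neural", "network", "distributed", "representation", "word", "vector"],
    ["latent", "dirichlet", "allocation", "word", "vector", "topic", "model"],
]

# Detect frequent bigrams and rewrite documents with phrase tokens
# (e.g. "topic_model"); thresholds are illustrative.
phrases = Phrases(docs, min_count=1, threshold=1.0)
phrase_docs = [phrases[d] for d in docs]

dictionary = Dictionary(phrase_docs)
corpus = [dictionary.doc2bow(d) for d in phrase_docs]
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2, passes=10,
               random_state=0)
for topic_id, words in lda.show_topics(num_topics=2, num_words=4, formatted=False):
    print(topic_id, [w for w, _ in words])
```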
A knowledge representation for the Dynamic Programming Model has been proposed using the state space theory of Artificial Intelligence, in which a model can be defined as a six-tuple M=(I,G,O,T,D,S). A building-block modeling method uses the modules of the six-tuple to form a rule-based solution model. Moreover, a rule-based system has been designed and set up to solve the Dynamic Programming Model. This knowledge-based representation can easily express symbolic knowledge and the dynamic characteristics of the Dynamic Programming Model, and the inference based on this knowledge in the process of solving the model can also be conveniently realized on a computer.
The existing geometrical solution models for predicting ternary thermodynamic properties from the relevant binary ones have been analysed, and a general representation is proposed in an integral form on the basis of these models.
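The abstract does not reproduce the proposed integral form; as one classical member of the family of geometrical models it generalises, Kohler's symmetric model extrapolates the ternary excess Gibbs energy from the three binaries as follows (quoted from memory, so treat it as indicative rather than authoritative):

```latex
\[
  \Delta G^{E}_{123}
  \;=\;
  \sum_{i<j}
  \left(x_i + x_j\right)^{2}\,
  \Delta G^{E}_{ij}\!\left(\frac{x_i}{x_i+x_j},\; \frac{x_j}{x_i+x_j}\right),
\]
```

where $x_i$ are the ternary mole fractions and $\Delta G^{E}_{ij}$ is the excess Gibbs energy of the binary $ij$ evaluated at renormalised compositions; asymmetric models such as Toop's differ mainly in how these binary compositions are chosen.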
An explicit algebraic stress model (EASM) has been formulated for two-dimensional turbulent buoyant flows using a five-term tensor representation in a prior study. The derivation was based on partitioning the buoyant flux tensor into a two-dimensional and a three-dimensional component. The five-term basis was formed with the two-dimensional component of the buoyant flux tensor. As such, the derived EASM is limited to two-dimensional flows only. In this paper, a more general approach using a seven-term representation without partitioning the buoyant flux tensor is used to derive an EASM valid for two- and three-dimensional turbulent buoyant flows. Consequently, the basis tensors are formed with the fully three-dimensional buoyant flux tensor. The derived EASM has the two-dimensional flow as a special case. The matrices and the representation coefficients are further simplified using a four-term representation. When this four-term representation model is applied to calculate two-dimensional homogeneous buoyant flows, the results are essentially identical with those obtained previously using the two-dimensional component of the buoyant flux tensor. Therefore, the present approach leads to a more general EASM formulation that is equally valid for two- and three-dimensional turbulent buoyant flows.
Feature-based design has been regarded as a promising approach for CAD/CAM integration. This paper aims to establish a domain-independent representation formalism for feature-based design in three aspects: formal representation, design process model and design algorithms. The implementation scheme and formal description of feature taxonomy, feature operators, feature model validation and feature transformation are given in the paper. A feature-based design process model suited for either sequential or concurrent engineering is proposed, and its application to product structural design and process plan design is presented. Some general design algorithms for developing feature-based design systems are also addressed. The proposed scheme provides an elementary formal methodology for developing and operating feature-based design systems in a structured way.
The machinery fault signal is a typical non-Gaussian and non-stationary process. The fault signal can be described by an SαS distribution model because of the presence of impulses. The time-frequency distribution is a useful tool to extract helpful information from the machinery fault signal. Various fractional lower order (FLO) time-frequency distribution methods have been proposed based on fractional lower order statistics, including the fractional lower order short-time Fourier transform (FLO-STFT), fractional lower order Wigner-Ville distributions (FLO-WVDs), fractional lower order Cohen class time-frequency distributions (FLO-CDs), fractional lower order adaptive kernel time-frequency distributions (FLO-AKDs) and the adaptive fractional lower order time-frequency auto-regressive moving average (FLO-TFARMA) model time-frequency representation method. These methods are compared with the existing methods based on second-order statistics in SαS distribution environments; simulation results show that the new methods perform better than the existing ones. The advantages and disadvantages of the improved time-frequency methods are summarized. Finally, the new methods are applied to analyze outer race fault signals, and the results illustrate their good performance.
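The property motivating the switch away from second-order statistics is worth recalling: a symmetric α-stable (SαS) signal with characteristic exponent α < 2 has finite moments only below order α, so correlations inside the STFT/WVD/Cohen-class definitions are replaced by p-th order (fractional lower order) statistics with p < α (the exact substitution in each FLO distribution follows the cited methods):

```latex
\[
  \operatorname{E}|X|^{p} < \infty \;\;\text{for } 0 < p < \alpha,
  \qquad
  \operatorname{E}|X|^{p} = \infty \;\;\text{for } p \ge \alpha
  \qquad (\alpha < 2).
\]
```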
Many methods based on deep learning have achieved impressive results in image sentiment analysis. However, these existing methods usually pursue high accuracy while ignoring the effect on model training efficiency; when faced with large-scale sentiment analysis tasks, a high accuracy rate often requires a long experimental time. In view of this weakness, a method is proposed that can greatly improve experimental efficiency with only small fluctuations in model accuracy. Singular value decomposition (SVD) is used to find sparse features of the image, which are sparse vectors with strong discriminativeness that effectively reduce redundant information. The authors also propose the Fast Dictionary Learning algorithm (FDL), which combines a neural network with sparse representation. This method is based on K-Singular Value Decomposition and, through iteration, can effectively reduce the calculation time and greatly improve training efficiency with only a small fluctuation in accuracy. The effectiveness of the proposed method is evaluated on the FER2013 dataset. By adding singular value decomposition, the accuracy on the test set increased by 0.53% and the total experiment time was shortened by 8.2%; Fast Dictionary Learning shortened the total experiment time by 36.3%.
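A minimal sketch of the SVD step, assuming images are handled as plain matrices: keeping only the leading singular values yields a compact descriptor and a low-rank reconstruction with reduced redundancy (the FDL/K-SVD part and the network are not shown):

```python
import numpy as np

def svd_sparse_feature(image, k=16):
    """Keep only the k largest singular values of an image matrix; the
    truncated values act as a compact, low-redundancy descriptor and the
    rank-k product as a denoised reconstruction (illustrative only)."""
    u, s, vt = np.linalg.svd(image, full_matrices=False)
    feature = s[:k]                              # compact descriptor
    approx = (u[:, :k] * s[:k]) @ vt[:k, :]      # rank-k reconstruction
    return feature, approx

rng = np.random.default_rng(0)
img = rng.random((48, 48))                       # stand-in for a face crop
feature, approx = svd_sparse_feature(img, k=8)
print(feature.shape, np.linalg.norm(img - approx) / np.linalg.norm(img))
```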
Predicting anomalous behaviour of a running process using its system call trace is a common practice in the security community and is still an active research area. It is a typical pattern recognition problem and can be dealt with using machine learning algorithms. Standard system call datasets were employed to train these algorithms; however, advancements in operating systems have made these datasets outdated and irrelevant. The Australian Defence Force Academy Linux Dataset (ADFA-LD) and the Australian Defence Force Academy Windows Dataset (ADFA-WD) are new-generation system call datasets that contain labelled system call traces for modern exploits and attacks on various applications. In this paper, we evaluate the performance of the Modified Vector Space Representation technique on the ADFA-LD and ADFA-WD datasets using various classification algorithms. Our experimental results show that the method performs well and helps accurately distinguish process behaviour through system calls.
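A minimal sketch of the basic vector-space pipeline on system-call traces using scikit-learn; the traces and labels are toy data, and the paper's modified representation refines this plain n-gram counting rather than being reproduced by it:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.tree import DecisionTreeClassifier

# Each trace is a whitespace-joined string of system-call numbers
# (toy data; flattened ADFA-LD traces look roughly like this).
traces = ["6 6 63 6 42 120 6 195", "240 311 78 240 240 311",
          "6 6 63 42 120 195 6",   "311 240 78 311 240 240"]
labels = [0, 1, 0, 1]              # 0 = normal, 1 = attack (illustrative)

# Plain vector-space representation: unigram and bigram counts over call IDs.
vectoriser = CountVectorizer(analyzer="word", ngram_range=(1, 2),
                             token_pattern=r"\b\w+\b")
X = vectoriser.fit_transform(traces)

clf = DecisionTreeClassifier(random_state=0).fit(X, labels)
print(clf.predict(vectoriser.transform(["6 63 6 42 120"])))
```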
Due to the NP-hardness of the job shop scheduling problem (JSP), many heuristic approaches have been proposed; among them is the genetic algorithm (GA). In the literature, there are eight different GA representations for the JSP; each one aims to provide a suitable environment through which the GA's reproduction and mutation operators can find near-optimal solutions in small computational time. This paper provides a computational study comparing the performance of the GA under six different representations.
Due to the NP-hard nature of job shop scheduling problems (JSP), exact methods fail to provide optimal solutions in reasonable computational time. Because of this, many heuristics and meta-heuristics have been proposed to obtain optimal or near-optimal solutions for easy-to-tough JSP instances in less computational time than exact methods. One such heuristic is the genetic algorithm (GA). The representation used in a GA has a direct impact on the computational time it takes to provide optimal or near-optimal solutions. Different representation schemes are possible for job scheduling problems, and these schemes in turn have a strong impact on the performance of the GA. This paper shows how these representations perform through a comparative analysis based on average deviation, the evolution of the solution over the generations, and other measures.
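For concreteness, here is a small sketch of one classical JSP encoding used in such studies, the operation-based representation, together with its decoding into a makespan (the instance and chromosome are toy examples and are not tied to either paper's specific representation set):

```python
# Operation-based representation: a chromosome is a permutation in which each
# job index appears m times; the k-th occurrence of job j denotes the k-th
# operation of job j. Decoding the chromosome gives a feasible schedule whose
# makespan serves as the GA fitness.

# processing[j][k] = (machine, duration) of operation k of job j
processing = [
    [(0, 3), (1, 2), (2, 2)],   # job 0
    [(0, 2), (2, 1), (1, 4)],   # job 1
    [(1, 4), (2, 3), (0, 1)],   # job 2
]

def decode(chromosome, processing):
    """Decode an operation-based chromosome; return the schedule makespan."""
    n_machines = 1 + max(m for job in processing for m, _ in job)
    next_op = [0] * len(processing)          # next operation index per job
    job_ready = [0] * len(processing)        # time each job becomes free
    machine_ready = [0] * n_machines         # time each machine becomes free
    for job in chromosome:
        machine, duration = processing[job][next_op[job]]
        start = max(job_ready[job], machine_ready[machine])
        finish = start + duration
        job_ready[job] = machine_ready[machine] = finish
        next_op[job] += 1
    return max(job_ready)

chromosome = [0, 1, 2, 1, 0, 2, 2, 0, 1]     # each of the 3 jobs appears 3 times
print("makespan:", decode(chromosome, processing))
```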
As a part of the Smart Grid concept, efficient energy management at the residential level has received increasing attention in recent research. Its main focus is to balance energy consumption in the residential environment in order to avoid the undesirable peaks faced by the electricity supplier. This challenge can be addressed by means of a home energy management system (HEMS). The HEMS may consider local renewable energy production and energy storage, as well as local control of some particular loads when peak mitigation is necessary. This paper presents the modeling and comparison of two residential systems: one using conventional electric baseboard heating and the other supported by Electric Thermal Storage (ETS); the ETS is employed to optimize local energy utilization, pursuing peak shaving of the residential consumption profile. Simulations of the proposed architecture using the Energetic Macroscopic Representation (EMR) demonstrate the potential of ETS technologies in future HEMS.
File labeling techniques have a long history in analyzing anthological trends in computational linguistics. The situation becomes worse for files downloaded into systems from the Internet. Currently, most users either have to change file names manually or leave the files with meaningless names, which increases the time needed to search for required files and results in redundancy and duplication among user files. No significant work has been done on automated file labeling during the organization of heterogeneous user files. A few attempts have been made in topic modeling; however, one major drawback of current topic modeling approaches is that better results rely on specific language types and on domain similarity of the data. In this research, machine learning approaches have been employed to analyze and extract information from a heterogeneous corpus. A different file labeling technique has also been used to obtain meaningful and cohesive topics for the files. The results show that the proposed methodology can generate relevant and context-sensitive names for heterogeneous data files and provide additional insight into automated file labeling in operating systems.
Funding (deep 3DMM mesh representation study): Supported by the Centre for Digital Entertainment at Bournemouth University, the UK Engineering and Physical Sciences Research Council (EPSRC) EP/L016540/1, and Humain Ltd.
Funding (CGR-walk protein sequence model): Project supported by the National Natural Science Foundation of China (Grant No. 60575038), the Natural Science Foundation of Jiangnan University, China (Grant No. 20070365), and the Program for Innovative Research Team of Jiangnan University, China.
Funding (data-driven turbulence modeling study): This work was supported by the National Natural Science Foundation of China (91852108, 11872230 and 92152301).
Funding (LSTM file prefetching model): This work is supported by the Fundamental Research Funds for the Central Universities (Grant No. HIT.NSRIF.201714), the Weihai Science and Technology Development Program (2016DXGJMS15), and the Key Research and Development Program of Shandong Province (2017GGX90103).
Funding (action model learning / state partition study): Supported by the National Natural Science Foundation of China (61103136, 61370156, 61503074) and the Open Research Foundation of the Science and Technology on Aerospace Flight Dynamics Laboratory (2014afdl002).
Funding (P2P peer selection model): Sponsored by the National Natural Science Foundation of China (Grant Nos. 60672124 and 60832009) and the Hi-Tech Research and Development Program (National 863 Program) (Grant No. 2007AA01Z221).
Funding (DR-Phrase LDA topic model): This work was supported by the Project of Industry and University Cooperative Research of Jiangsu Province, China (No. BY2019051). Ma, J. would like to thank the Jiangsu Eazytec Information Technology Company (www.eazytec.com) for their financial support.
Funding (fractional lower order time-frequency study): Supported by the National Natural Science Foundation of China (61261046, 61362038), the Natural Science Foundation of Jiangxi Province (20142BAB207006, 20151BAB207013), the Science and Technology Project of the Provincial Education Department of Jiangxi Province (GJJ14738, GJJ14739), the Research Foundation of the Health Department of Jiangxi Province (20175561), and the Science and Technology Project of Jiujiang University (2016KJ001, 2016KJ002).
Funding (image sentiment analysis / Fast Dictionary Learning study): Supported by the National Natural Science Foundation of China (No. 61801440), the High-quality and Cutting-edge Disciplines Construction Project for Universities in Beijing (Internet Information, Communication University of China), the State Key Laboratory of Media Convergence and Communication (Communication University of China), and the Fundamental Research Funds for the Central Universities (CUC2019B069).