A natural extension of the Lorentz transformation to its complex version was constructed, together with a parallel extension of the Minkowski M^4 model for special relativity (SR) to complex C^4 space-time. From the [signed] absolute values of the complex coordinates characterizing motion in C^4 one obtains a Newtonian-like type of motion, whereas from the real parts of the complex motion's description and of the complex Lorentz transformation, all of SR as modeled by real M^4 space-time can be recovered. This means the whole of SR is preserved in the real subspace M^4 of the space-time C^4, while becoming simpler and clearer in the new complex model's framework. Since velocities in the complex model can be determined geometrically, with no primary use of time, time turns out to be definable within the equivalent theory obtained by reducing the complex C^4 model to the C^3 "para-space" model. That procedure allows us to separate time from the (para)space and to consider all of SR as a theory of C^3 alone. The complex time defined within the C^3 theory is, in turn, interpreted and modeled by a single separate complex plane C^1. The possibility of applying the C^3 model to quantum mechanics is suggested. As such, the C^3 model seems to have unifying potential for application to different physical theories.
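For reference, the standard real Lorentz boost (one spatial dimension) that the complex construction generalizes can be written as below; the specific complexification to C^4 used in the paper is not reproduced here.

```latex
% Standard one-dimensional Lorentz boost in real M^4,
% the starting point that the paper promotes to complex C^4 coordinates.
\begin{aligned}
x' &= \gamma\,(x - v t), \\
t' &= \gamma\left(t - \frac{v x}{c^{2}}\right), \\
\gamma &= \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
\end{aligned}
```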
One of the critical hurdles, and breakthroughs, in the field of Natural Language Processing (NLP) in the last two decades has been the development of techniques for text representation that solve the so-called curse of dimensionality, a problem which plagues NLP in general given that the feature set for learning starts as a function of the size of the language in question, typically upwards of hundreds of thousands of terms. As such, much of the research and development in NLP in the last two decades has been devoted to finding and optimizing solutions to this problem, which is effectively the problem of feature selection in NLP. This paper looks at the development of these various techniques, leveraging a variety of statistical methods which rest on linguistic theories advanced in the middle of the last century, namely the distributional hypothesis, which suggests that words found in similar contexts generally have similar meanings. In this survey paper we look at the development of some of the most popular of these techniques from a mathematical as well as a data structure perspective, from Latent Semantic Analysis to Vector Space Models to their more modern variants, typically referred to as word embeddings. In this review of algorithms such as Word2Vec, GloVe, ELMo and BERT, we explore the idea of semantic spaces more generally, beyond their applicability to NLP.
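As a minimal illustration of the distributional idea behind Latent Semantic Analysis (not code from the survey itself), the sketch below builds a small term-document count matrix, reduces it with a truncated SVD, and compares terms by cosine similarity; the toy corpus and the latent dimension are arbitrary choices.

```python
import numpy as np

# Toy corpus: rows of the count matrix are terms, columns are documents.
terms = ["cat", "dog", "pet", "stock", "market"]
docs = [
    "cat pet", "dog pet", "cat dog pet",    # animal documents
    "stock market", "market stock stock",   # finance documents
]
X = np.array([[d.split().count(t) for d in docs] for t in terms], dtype=float)

# Truncated SVD (the core of LSA): keep k latent dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]      # terms embedded in the latent semantic space

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Distributionally similar words ("cat", "dog") end up closer than unrelated ones.
print(cosine(term_vecs[0], term_vecs[1]))  # cat vs dog   -> high
print(cosine(term_vecs[0], term_vecs[3]))  # cat vs stock -> low
```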
We present a new interpretation of the Higgs field as a composite particle made up of a positive and a negative mass Planck particle. According to the Winterberg hypothesis, space, i.e., the vacuum, consists of both positive and negative physical massive particles, which he called planckions, interacting through strong superfluid forces. In our composite model for the Higgs boson, there is an intrinsic length scale associated with the vacuum, different from the one introduced by Winterberg, where, when the vacuum is in a perfectly balanced state, the number density of positive Planck particles equals the number density of negative Planck particles. Due to the mass compensating effect, the vacuum thus appears massless, chargeless, without pressure, energy density, or entropy. However, a situation can arise where there is an effective mass density imbalance due to the two species of Planck particle not matching in terms of populations within their respective excited energy states. This does not require the physical addition or removal of either positive or negative Planck particles within a given region of space, as originally thought. Ordinary matter, dark matter, and dark energy can thus be given a new interpretation as residual vacuum energies within the context of a greater vacuum, where the populations of the positive and negative energy states exactly balance. In the present epoch, it is estimated that the dark energy number density imbalance amounts to, , per cubic meter, when cosmic distance scales in excess of 100 Mpc are considered. Compared to a strictly balanced vacuum, where we estimate that the positive and the negative Planck number density is of the order of 7.85E54 particles per cubic meter, the above is a very small perturbation. This slight imbalance, we argue, would dramatically alleviate, if not altogether eliminate, the long-standing cosmological constant problem.
We work within a Winterberg framework where space, i.e., the vacuum, consists of a two-component superfluid/supersolid made up of a vast assembly (sea) of positive and negative mass Planck particles, called planckions. These material particles interact indirectly, and have very strong restoring forces keeping them a finite distance apart from each other within their respective species. Because of their mass compensating effect, the vacuum appears massless, chargeless, without pressure, net energy density or entropy. In addition, we consider two varying G models, where G is Newton's constant and G^-1 increases with increasing cosmological time. We argue that there are at least two competing models for the quantum vacuum within such a framework. The first follows a strict extension of Winterberg's model. This leads to nonsensical results if G increases going back in cosmological time, as the length scale inherent in such a model will not scale properly. The second model introduces a different length scale, which does scale properly, but keeps the mass of the Planck particle as ± the Planck mass. Moreover, we establish a connection between ordinary matter, dark matter, and dark energy, where all three mass densities within the Friedman equation must be interpreted as residual vacuum energies, which only surface once aggregate matter has formed, at relatively low CMB temperatures. The symmetry of the vacuum will be shown to be broken, because of the different scaling laws, beginning with the formation of elementary particles. Much like waves on an ocean where positive and negative planckion mass densities effectively cancel each other out and form a zero vacuum energy density/zero vacuum pressure surface, these positive mass densities are very small perturbations (anomalies) about the mean. This greatly alleviates, i.e., minimizes, the cosmological constant problem, a long-standing problem associated with the vacuum.
On the basis of using the entropy weight method to measure China's education poverty alleviation and rural revitalization evaluation indicators, and using panel data for 30 provinces in China (excluding Xizang, Hong Kong, Macao and Taiwan) from 2012 to 2021, a spatial panel simultaneous equation model is constructed based on an adjacency matrix, a geographical distance matrix and an economic-geographical distance matrix, to study in depth the interaction mechanism and spatial spillover effects between education poverty alleviation and rural revitalization through the generalized spatial three-stage least squares method (GS3SLS). The results indicate that there is a significant spatial spillover effect and a positive spatial correlation between education poverty alleviation and rural revitalization, and that there is a significant interactive effect between the two variables, which promote each other positively. Therefore, the government should clarify the deep relationship between education poverty alleviation and rural revitalization against the current background, and better consolidate and expand the effective connection between the achievements of education poverty alleviation and rural revitalization.
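A compact sketch of the entropy weight method referred to above (a generic implementation, not the authors' code); the indicator data are assumed to be positive-oriented and arranged as provinces × indicators, and the sample numbers are synthetic.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: X is an (m samples x n indicators) array,
    with all indicators assumed positive-oriented (larger is better)."""
    m = X.shape[0]
    # Min-max normalize each indicator column to [0, 1].
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
    # Proportion of each sample under each indicator.
    P = Xn / (Xn.sum(axis=0) + 1e-12)
    # Information entropy per indicator (epsilon guards log(0)).
    E = -np.sum(P * np.log(P + 1e-12), axis=0) / np.log(m)
    # Indicators with lower entropy (more dispersion across samples) get higher weight.
    d = 1.0 - E
    return d / d.sum()

# Example: 5 provinces (rows) scored on 3 indicators (synthetic numbers).
X = np.array([[0.2, 30, 5.0],
              [0.5, 45, 4.2],
              [0.9, 60, 3.8],
              [0.4, 20, 6.1],
              [0.7, 55, 4.9]])
print(entropy_weights(X))  # weights sum to 1
```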
This study introduces the Orbit Weighting Scheme (OWS), a novel approach aimed at enhancing the precision and efficiency of Vector Space information retrieval (IR) models, which have traditionally relied on weighting schemes like tf-idf and BM25. These conventional methods often struggle with accurately capturing document relevance, leading to inefficiencies in both retrieval performance and index size management. OWS proposes a dynamic weighting mechanism that evaluates the significance of terms based on their orbital position within the vector space, emphasizing term relationships and distribution patterns overlooked by existing models. Our research focuses on evaluating OWS's impact on model accuracy using Information Retrieval metrics like Recall, Precision, Interpolated Average Precision (IAP), and Mean Average Precision (MAP). Additionally, we assess OWS's effectiveness in reducing the inverted index size, crucial for model efficiency. We compare OWS-based retrieval models against others using different schemes, including tf-idf variations and BM25Delta. Results reveal OWS's superiority, achieving 54% Recall and 81% MAP, and a notable 38% reduction in the inverted index size. This highlights OWS's potential in optimizing retrieval processes and underscores the need for further research in this underrepresented area to fully leverage OWS's capabilities in information retrieval methodologies.
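The OWS weighting itself is specific to the paper; as context, the sketch below shows the conventional tf-idf weighting and cosine ranking that schemes like OWS and BM25 are compared against (illustrative only, with a toy corpus and one common idf variant).

```python
import math
from collections import Counter

docs = ["orbit weighting scheme", "vector space retrieval model",
        "space model for retrieval", "weighting scheme for terms"]
query = "retrieval model"

N = len(docs)
tokenized = [d.split() for d in docs]
df = Counter(t for toks in tokenized for t in set(toks))   # document frequency

def tfidf(tokens):
    tf = Counter(tokens)
    # Smoothed idf; many variants exist, this is one common choice.
    return {t: tf[t] * math.log((1 + N) / (1 + df[t]) + 1) for t in tf}

def cosine(a, b):
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

qv = tfidf(query.split())
ranking = sorted(range(N), key=lambda i: cosine(qv, tfidf(tokenized[i])), reverse=True)
print(ranking)  # document indices, best match first
```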
A pre-selection space-time model was proposed to estimate the traffic condition at locations with poor detector data, and especially at non-detector locations. The space-time model is well suited to integrating spatial and temporal information comprehensively. Firstly, the influencing factors of the "cause nodes" were studied, and then the pre-selection "cause nodes" procedure, which utilizes the Pearson correlation coefficient to evaluate the relevancy of the traffic data, was introduced. Finally, only the most relevant data were collected to compose the space-time model. The experimental results with actual data demonstrate that the model performs better than the other three models.
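A minimal sketch of the pre-selection step as described: ranking candidate "cause node" series by Pearson correlation with the target location and keeping only the most relevant ones. The variable names, the synthetic data and the top-k cutoff are illustrative, not the paper's.

```python
import numpy as np

def preselect_cause_nodes(target, candidates, top_k=3):
    """target: 1-D array of traffic measurements at the location to be estimated.
    candidates: dict mapping node id -> 1-D array of the same length.
    Returns the top_k node ids ranked by |Pearson correlation| with the target."""
    scores = {}
    for node, series in candidates.items():
        r = np.corrcoef(target, series)[0, 1]   # Pearson correlation coefficient
        scores[node] = abs(r)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Synthetic example: nodes 'A' and 'B' track the target, 'C' is pure noise.
rng = np.random.default_rng(0)
target = np.sin(np.linspace(0, 6, 50)) + 0.1 * rng.normal(size=50)
candidates = {
    "A": target * 0.9 + 0.05 * rng.normal(size=50),
    "B": np.roll(target, 1),
    "C": rng.normal(size=50),
}
print(preselect_cause_nodes(target, candidates, top_k=2))  # likely ['A', 'B']
```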
During high-intensity, fully mechanized mining of an extra-thick coal seam, the top coal caves to a certain 3D form. Based on the data collected during drilling, a 3D model of the top coal caving surface space was established to determine the relationship between the location of the stope roof and the caving surface, enabling the mathematical computation of the top caving angle (φ). The drilling method was employed to measure the top caving angle on two extra-thick fully mechanized coal caving faces under the conditions of three geological structures, namely, no geological structure, igneous rock structure, and fault structure. The results show that the value of the top caving angle can be accurately estimated on-site with the 9-parameter 3D top coal caving surface model built with the drilling method. This method is a novel on-site measurement that can be easily applied. Our findings reveal that the characteristics of the coal-rock in the two mining faces are different; yet their caving angles follow the rule φ(igneous rock structure) < φ(no geological structure) < φ(fault structure). Finally, through data fitting with two indexes (the top coal uniaxial compressive strength and the top caving angle), it is found that the relationship between the two indexes satisfies an exponential decay function.
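The exponential decay relationship mentioned between uniaxial compressive strength and the top caving angle could be fitted as below; this is a generic curve-fitting sketch with made-up sample values, not the paper's data or coefficients.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(x, a, b, c):
    # Generic exponential decay: angle = a * exp(-b * strength) + c
    return a * np.exp(-b * x) + c

# Hypothetical (uniaxial compressive strength in MPa, top caving angle in degrees) pairs.
strength = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
angle = np.array([68.0, 61.0, 56.5, 53.5, 52.0, 51.0])

params, _ = curve_fit(exp_decay, strength, angle, p0=(20.0, 0.1, 50.0))
a, b, c = params
print(f"angle ≈ {a:.2f} * exp(-{b:.3f} * strength) + {c:.2f}")
```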
Multistation machining processes are widely applied in the contemporary manufacturing environment. Modeling of variation propagation in a multistation machining process is one of the most important research topics. Due to the existence of multiple variation streams, it is challenging to model and analyze variation propagation in a multistation system. Current approaches to error modeling for multistation machining processes are not explicit enough for error control and for ensuring final product quality. In this paper, a mathematical model to depict the part dimensional variation of the complex multistation manufacturing process is formulated. A linear state space dimensional error propagation equation is established through kinematic analysis of the influence of locating parameter variations and locating datum variations on dimensional errors, so the dimensional error accumulation and transformation within the multistation process are quantitatively described. A systematic procedure to build the model is presented, which enhances the way variation sources are determined in complex machining systems. A simple two-dimensional example is used to illustrate the proposed procedures. Finally, an industrial case of a multistation machining part in a manufacturing shop is given to verify the validity and practicability of the method. The proposed analytical model is essential to quality control and improvement for multistation systems in machining quality forecasting and design optimization.
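The linear state space form of dimensional error propagation described above can be illustrated generically as follows; the matrices, dimensions and error-source values are placeholders, not the paper's model.

```python
import numpy as np

def propagate_errors(A_list, B_list, u_list, x0):
    """Linear state-space error propagation across stations:
        x_k = A_k @ x_{k-1} + B_k @ u_k
    where x_k is the accumulated dimensional error after station k and
    u_k collects the local error sources (e.g. fixture/datum variations) at station k."""
    x = x0
    history = [x0]
    for A, B, u in zip(A_list, B_list, u_list):
        x = A @ x + B @ u
        history.append(x)
    return history

# Toy 2-state, 3-station example with illustrative numbers.
A = [np.eye(2), np.array([[1.0, 0.1], [0.0, 1.0]]), np.eye(2)]
B = [np.eye(2)] * 3
u = [np.array([0.02, 0.00]), np.array([0.00, 0.01]), np.array([0.01, 0.01])]
print(propagate_errors(A, B, u, np.zeros(2))[-1])  # final accumulated error
```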
A hybrid model based on the combination of keywords and concepts was put forward. The hybrid model is built on the vector space model and a probabilistic reasoning network. It can not only exploit the advantages of keyword retrieval and concept retrieval but also compensate for their shortcomings. The parameters can be adjusted according to different usage in order to achieve the best information retrieval result, as has been proved by our experiments.
An integrated dynamic model of natural gas pipeline networks is developed in this paper. Components for gas supply, e.g., pipelines, junctions, compressor stations, LNG terminals, regulation stations and gas storage facilities, are included in the model. These components are first modeled with respect to their properties and functions and then integrated at the system level by Graph Theory. The model can be used for simulating the system response in different scenarios of operation, and for evaluating the consequences from the perspectives of supply security and resilience. A case study is considered to evaluate the accuracy of the model by benchmarking its results against those from the literature and the software Pipeline Studio. Finally, the model is applied to a relatively complex natural gas pipeline network and the results are analyzed in detail from the supply security and resilience points of view. The main contributions of the paper are: firstly, a novel model of a complex gas pipeline network is proposed as a dynamic state-space model at the system level; secondly, a method, based on the dynamic model, is proposed to analyze the security and resilience of supply from a system perspective.
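As a sketch of the graph-based integration step (component names and attributes are illustrative, not the paper's model), a pipeline network can be assembled as a graph whose nodes are supply, junction, storage and demand points and whose edges carry pipeline or compressor parameters; supply-security questions then become graph queries.

```python
import networkx as nx

# Nodes are supply points, junctions, storage and demand points; edges are pipeline
# segments or compressor stations carrying their own component parameters.
G = nx.DiGraph()
G.add_node("LNG_terminal", kind="supply")
G.add_node("junction_1", kind="junction")
G.add_node("storage_1", kind="storage")
G.add_node("city_gate", kind="demand")

G.add_edge("LNG_terminal", "junction_1", kind="pipeline", length_km=120, diameter_m=0.9)
G.add_edge("junction_1", "city_gate", kind="compressor", ratio=1.4)
G.add_edge("storage_1", "junction_1", kind="pipeline", length_km=40, diameter_m=0.6)

# System-level questions (e.g. which demand nodes stay supplied if a component fails)
# can then be asked with ordinary graph algorithms.
G_fail = G.copy()
G_fail.remove_edge("LNG_terminal", "junction_1")
print(nx.has_path(G_fail, "storage_1", "city_gate"))  # True: storage still reaches the demand node
```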
Formal state space models of quantum control systems are deduced, and a scheme for establishing formal state space models of quantum control systems via quantization is proposed. State evolution of quantum control systems must accord with the Schrödinger equation, so it is foremost to obtain the Hamiltonian operators of the systems. There are corresponding relations between operators of quantum systems and the corresponding physical quantities of classical systems, such as momentum, energy and the Hamiltonian, so Schrödinger equation models of the corresponding quantum control systems can be obtained from classical control systems via quantization, and formal state space models can then be established through a suitable transformation of the Schrödinger equations for these quantum control systems. This method provides a new kind of path for modeling in quantum control.
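For orientation, the Schrödinger equation underlying such models can be read as a linear state equation once the state vector is expanded in a finite basis with coefficient vector x(t); this is the standard form, not the paper's specific construction.

```latex
% Schrodinger equation as a linear state equation (time-independent Hamiltonian H,
% finite basis, coefficient vector x(t)):
i\hbar \frac{\partial}{\partial t}\,\lvert \psi(t)\rangle = \hat{H}\,\lvert \psi(t)\rangle
\quad\Longrightarrow\quad
\dot{x}(t) = -\frac{i}{\hbar}\, H\, x(t)
```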
The fixture layout is crucial to assuring product quality in a multistation assembly process (MAP). A well-designed fixture layout makes the final product's variability insensitive to the fixture variation inputs. As the basis of fixture layout design, the design criterion plays an important role in the effectiveness of a solution and in the optimization efficiency. In this paper, an effective and efficient design criterion is proposed for the fixture layout with a fixed reference point (FRP) in an MAP. First of all, a state space model for the individual part's variation propagation and accumulation is developed, which is the mathematical foundation of the proposed criterion. Then, based on this model, a novel design criterion for evaluating the performance of the fixture layout with an FRP is proposed. Finally, a method extracted from the proposed design criterion is developed for quick fixture layout design. A four-station assembly process is used to validate the effectiveness and efficiency of the proposed models and methods.
A new analytical method is proposed to analyze the force acting on a rectangular oscillating buoy due to linear waves. In the method, a new analytical expression for the diffraction velocity potential is obtained first by use of the eigenfunction expansion method, and then the wave excitation force is calculated by use of the known incident wave potential and the diffraction potential. Compared with the classical analytical method, it can be seen that the present method is simpler for a two-dimensional problem due to the comparable effort needed for the computation of the diffraction potential and for that of the radiated potential. To verify the correctness of the method, a classical example in the reference is recomputed and the obtained results are in good accordance with those by use of other methods, which shows that the present method is correct.
A hyperstatic structure plane model built using structural mechanics is studied. A space model that precisely reflects the real stress of the structure is built with commercial finite element method (FEM) analysis software. A mapping model of the complex structural system is then set up, offering calculation as convenient as with the plane model and information as comprehensive as with the space model. The plane model and the space model are calculated under the same working condition. The modular construction inner forces of the plane model are taken as input data, and those of the space model as output data; training samples are thus built from the input and output data. Characteristics and relations are extracted by training on these samples, employing the nonlinear mapping capability of the artificial neural network. A mapping model capable of interpolation and extrapolation is obtained, laying the foundation for optimum design. The steel structure of a high-layer parking system (SSHLPS) is calculated as an instance. A three-layer back-propagation (BP) net including one hidden layer is constructed with nine input nodes and eight output nodes for a five-layer SSHLPS. The optimization result for a three-layer structure obtained through mapping-model interpolation is compared with a full re-analysis, as is the result for a seven-layer structure obtained through mapping-model extrapolation. An SSHLPS with any number of layers from 1 to 8 can be calculated with good accuracy. The amount of computation can also be reduced if the model is applied to the same topological structure, with reduced distortion and assured precision.
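A minimal sketch of the kind of three-layer BP network described (nine inputs, one hidden layer, eight outputs), using scikit-learn in place of whatever toolkit the authors used; the hidden-layer size and the training data below are synthetic placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Placeholder training samples: 40 cases, 9 plane-model inner forces in,
# 8 space-model inner forces out (synthetic numbers, not the paper's data).
X_train = rng.normal(size=(40, 9))
Y_train = X_train @ rng.normal(size=(9, 8)) + 0.05 * rng.normal(size=(40, 8))

# Three-layer BP net: input layer (9 nodes), one hidden layer, output layer (8 nodes).
net = MLPRegressor(hidden_layer_sizes=(12,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(X_train, Y_train)

# The trained mapping model is then used for interpolation/extrapolation on a new case.
x_new = rng.normal(size=(1, 9))
print(net.predict(x_new))   # predicted 8 space-model inner forces
```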
A state space model (SSM) is derived for quantum-dot semiconductor optical amplifiers (QD-SOAs). Rate equations of the QD-SOA are formulated in the form of state update equations, where average occupation probabilities along the QD-SOA cavity are considered as state variables of the system. Simulations show that the SSM calculates the QD-SOA's static and dynamic characteristics with high accuracy.
Granular computing has been a very active research field in recent years. In our previous work, an algebraic quotient space model was proposed, where the quotient structure could not be deduced if the granulation was based on an equivalence relation. In this paper, definitions were given and formulas for the lower quotient congruence and upper quotient congruence were calculated to roughly represent the quotient structure. Then, accuracy and roughness were defined to measure the quotient structure quantitatively. Finally, a numerical example was given to demonstrate that the rough representation and measuring methods are efficient and applicable. This work greatly enriches the algebraic quotient space model and granular computing theory.
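The accuracy and roughness measures are defined in the paper for quotient congruences; the toy below only illustrates the analogous rough-set style quantities (lower/upper approximation, accuracy = |lower|/|upper|, roughness = 1 − accuracy) for a set under an equivalence-class granulation, which may differ from the paper's exact definitions.

```python
def rough_measures(blocks, target):
    """blocks: a partition of the universe into equivalence classes (Python sets).
    target: the subset to be approximated.
    Returns (lower, upper, accuracy, roughness) in the usual rough-set sense."""
    lower = set().union(*[b for b in blocks if b <= target])   # union of blocks inside target
    upper = set().union(*[b for b in blocks if b & target])    # union of blocks touching target
    accuracy = len(lower) / len(upper) if upper else 1.0
    return lower, upper, accuracy, 1.0 - accuracy

# Granulation of {1,...,6} into three blocks, approximating the set {1, 2, 3}.
blocks = [{1, 2}, {3, 4}, {5, 6}]
target = {1, 2, 3}
print(rough_measures(blocks, target))  # lower={1,2}, upper={1,2,3,4}, accuracy=0.5, roughness=0.5
```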
This paper studies the problem of space station short-term mission planning, which aims to allocate the execution times of missions effectively, schedule the corresponding resources reasonably and arrange the astronauts' time properly. A domain model is developed by using ontology theory to describe the concepts, constraints and relations of the planning domain formally, abstractly and normatively. A method based on time iteration is adopted to solve the short-term planning problem. Meanwhile, resolution strategies are proposed to resolve different kinds of conflicts induced by the power, heat, resource, astronaut and relationship constraints. The proposed approach is evaluated in a test case with fifteen missions, thirteen resources and three astronauts. The results show that the developed domain ontology model is reasonable, and that the time iteration method using the proposed resolution strategies can successfully obtain a plan satisfying all considered constraints.
The high potential of integrating renewable energies, such as photovoltaics, into a modern electrical microgrid system using DC-to-DC converters raises some issues associated with controller loop design and system stability. The generalized state space average model (GSSAM) concept was consequently introduced to design DC-to-DC converter controllers, in order to evaluate DC-to-DC converter performance and to conduct stability studies. This paper presents a GSSAM for parallel DC-to-DC converters, namely buck, boost, and buck-boost converters. The rationale of this study is that modern electrical systems, such as DC networks, hybrid microgrids, and electric ships, are formed by parallel DC-to-DC converters with separate DC input sources. Therefore, this paper proposes a GSSAM for any number of parallel DC-to-DC converters. The proposed GSSAM is validated and investigated in a time-domain simulation environment, namely MATLAB/SIMULINK. The study compares the steady-state, transient, and oscillatory performance of the state-space average model with a fully detailed switching model.
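As background for the GSSAM idea, the classical (non-generalized) state-space averaged model of a single ideal buck converter is sketched below; component values and the duty cycle are arbitrary, and the paper's generalized, parallel-converter formulation is more elaborate than this.

```python
import numpy as np

# Classical state-space averaged model of an ideal buck converter:
#   x = [iL, vC],  dx/dt = A x + B * d * Vin,  with duty cycle d.
L, C, R = 1e-3, 100e-6, 10.0      # inductance [H], capacitance [F], load [ohm]
Vin, d = 48.0, 0.5                # input voltage [V], duty cycle
A = np.array([[0.0, -1.0 / L],
              [1.0 / C, -1.0 / (R * C)]])
B = np.array([1.0 / L, 0.0])

x = np.zeros(2)                   # start from zero inductor current / capacitor voltage
dt, steps = 1e-6, 20000           # simple forward-Euler integration over 20 ms
for _ in range(steps):
    x = x + dt * (A @ x + B * d * Vin)

print(f"steady state: iL ≈ {x[0]:.2f} A, vC ≈ {x[1]:.2f} V (expect vC ≈ d*Vin = 24 V)")
```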
The current highly competitive environment has driven industries to operate with increasingly restricted profit margins. Thus, it is imperative to optimize production processes. Faced with this scenario, multivariable predictive control of processes has been presented as a powerful alternative for achieving these goals. Moreover, the rationale for the implementation of advanced control, and the subsequent analysis of its post-match performance, also focuses on the benefits that this tool brings to the plant. It is therefore essential to establish a methodology for analysis, based on clear and measurable criteria. Currently, there are different methodologies available in the market to assist with such analysis. These tools can have a quantitative or qualitative focus. The aim of this study is to evaluate three of the main current performance assessment technologies: the Minimum Variance Control (Harris) Index; Statistical Process Control (Cp and Cpk); and the Qin and Yu Index. These indices were studied for an alumina plant controlled by three MPC (model predictive control) algorithms (GPC (generalized predictive control), RMPCT (robust multivariable predictive control technology) and ESSMPC (extended state space model predictive controller)), with different results.
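For reference, the Statistical Process Control capability indices mentioned above are computed as below; these are the standard textbook definitions, applied here to made-up specification limits and synthetic data rather than the alumina plant's measurements.

```python
import numpy as np

def cp_cpk(samples, lsl, usl):
    """Standard process capability indices:
       Cp  = (USL - LSL) / (6 * sigma)
       Cpk = min(USL - mean, mean - LSL) / (3 * sigma)"""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

# Hypothetical controlled-variable measurements and specification limits.
rng = np.random.default_rng(42)
samples = rng.normal(loc=50.2, scale=0.8, size=200)
print(cp_cpk(samples, lsl=47.0, usl=53.0))
```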