In this article, we study Kähler metrics on a certain line bundle over some compact Kähler manifolds to find complete Kähler metrics with positive holomorphic sectional (or bisectional) curvature. In this way, we apply a strategy to a famous conjecture of Yau via a co-homogeneity one geometry.
We investigate the quantum metric and topological Euler number in a cyclically modulated Su-Schrieffer-Heeger (SSH) model with long-range hopping terms. By computing the quantum geometric tensor, we derive exact expressions for the quantum metric and Berry curvature of the energy-band electrons, and we obtain the phase diagram of the model marked by the first Chern number. Furthermore, we obtain the topological Euler number of the energy band based on the Gauss-Bonnet theorem, which characterizes the closed manifold of Bloch states in the first Brillouin zone. However, in some regions of the first Brillouin zone the Berry curvature is identically zero, so the quantum metric becomes degenerate there, which leads to ill-defined, non-integer topological Euler numbers. Nevertheless, the non-integer "Euler number" provides valuable insights and an upper bound for the absolute values of the Chern numbers.
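For a Bloch state $|u(\mathbf{k})\rangle$, the quantities discussed above are conventionally obtained from the quantum geometric tensor; the following is a standard summary, not taken from the paper itself:

$$Q_{\mu\nu}(\mathbf{k})=\langle\partial_{k_\mu}u\,|\,\big(1-|u\rangle\langle u|\big)\,|\,\partial_{k_\nu}u\rangle,\qquad g_{\mu\nu}=\operatorname{Re}Q_{\mu\nu},\qquad \Omega_{\mu\nu}=-2\operatorname{Im}Q_{\mu\nu}.$$

The first Chern number is then $C=\frac{1}{2\pi}\int_{\mathrm{BZ}}\Omega_{k_xk_y}\,d^2k$, while the Gauss-Bonnet theorem gives the Euler number $\chi=\frac{1}{2\pi}\int_{\mathrm{BZ}}K\sqrt{\det g}\;d^2k$ for the Gaussian curvature $K$ of $g$, which becomes ill defined wherever $\det g=0$.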
Assessment of rock mass quality significantly impacts the design and construction of underground and open-pit mines from the standpoint of stability and economy. This study develops the novel Gromov-Hausdorff distance for rock quality (GHDQR) methodology for rock mass quality rating based on a multi-criteria grey metric space. The quality of the surrounding rock is usually presented by classes (metric spaces) with specified properties and adequate interval grey numbers. Measuring the distance between the characteristics of surrounding rock samples and the existing classes represents the core of this study. The Gromov-Hausdorff distance is an especially useful discriminant function, i.e., a classifier, for calculating these distances and assessing the quality of the surrounding rock. The efficiency of the developed methodology is analyzed using the Mean Absolute Percentage Error (MAPE) technique. Seven existing methods, namely the Gaussian cloud method, the Discriminant method, the Mutation series method, Artificial neural networks (ANN), Support vector machines (SVM), the Grey wolf optimizer with support vector classification (GWO-SVC), and the Rock mass rating method (RMR), are used for comparison with the proposed GHDQR method. The share of the highly accurate category of 85.71% clearly indicates compliance with the actual values obtained by the compared methods. The results of the comparisons showed that the model enables an objective, efficient, and reliable assessment of rock mass quality.
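For reference, the accuracy measure named above is conventionally defined as follows (a standard definition, not quoted from the study; $A_i$ and $P_i$ denote actual and predicted values):

$$\mathrm{MAPE}=\frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{A_i-P_i}{A_i}\right|.$$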
In this paper, we study a class of Finsler metrics defined by a vector field on a gradient Ricci soliton. We obtain a necessary and sufficient condition for these Finsler metrics on a compact gradient Ricci soliton to be of isotropic S-curvature by establishing a new integral inequality. We then determine the Ricci curvature of navigation Finsler metrics of isotropic S-curvature on a gradient Ricci soliton, generalizing a result previously known only in the case where the soliton is of Einstein type. As an application, we obtain the Ricci curvature of all navigation Finsler metrics of isotropic S-curvature on the Gaussian shrinking soliton.
Background: Failure to rescue has been an effective quality metric in congenital heart surgery. Conversely, morbidity and mortality depend greatly on non-modifiable individual factors and have a weak correlation with better quality of performance. We aim to measure the complications, mortality, and risk factors in pediatric patients undergoing congenital heart surgery in a high-complexity institution located in a middle-income country and to compare them with those of other institutions that have conducted a similar study. Methods: A retrospective observational study was conducted in a high-complexity service provider institution in Cali, Colombia. All pediatric patients undergoing any congenital heart surgery between 2019 and 2022 were included. The main outcomes evaluated in the study were the complication, mortality, and failure to rescue (FTR) rates. Univariate and multivariate logistic regression analyses were performed with mortality as the outcome variable. Results: We evaluated 308 congenital heart surgeries. Regarding the outcomes, 201 (65%) complications occurred, 23 (7.5%) patients died, and the FTR of the entire cohort was 11.4%. The presence of a postoperative complication (OR 14.88, CI 3.06–268.37, p=0.009), age (OR 0.79, CI 0.57–0.96, p=0.068), and urgent/emergent surgery (OR 8.14, CI 2.97–28.66, p<0.001) were the most significant variables in predicting mortality. Conclusions: Failure to rescue is an effective and comparable quality measure in healthcare institutions and is the major contributor to postoperative mortality in congenital heart surgeries. Despite our higher mortality and complication rates, we obtained a failure to rescue rate comparable to that of health institutions in high-income countries.
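A quick consistency check, assuming the failure to rescue rate is computed in the usual way as deaths among patients who suffered at least one postoperative complication divided by the number of such patients:

$$\mathrm{FTR}=\frac{23}{201}\approx 11.4\%,$$

which matches the rate reported for the cohort.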
In this article, we first establish an asymptotically sharp result on the higher-order Fréchet derivatives for bounded holomorphic mappings $f(x)=f(0)+\sum_{s=1}^{\infty}\frac{D^{sk}f(0)(x^{sk})}{(sk)!}: B_X \to B_Y$, where $B_X$ is the unit ball of $X$. We next give a sharp result on the first-order Fréchet derivative for bounded holomorphic mappings $F(x)=F(0)+\sum_{s=k}^{\infty}\frac{D^{s}F(0)(x^{s})}{s!}: B_X \to B_Y$, where $B_X$ is the unit ball of $X$. The results that we derive include some results in several complex variables, and extend the classical result in one complex variable to several complex variables.
Cross entropy is a measure in machine learning and deep learning that assesses the difference between predicted and actual probability distributions. In this study, we propose cross entropy as a performance evaluation metric for image classifier models and apply it to the CT image classification of lung cancer. A convolutional neural network is employed as the deep neural network (DNN) image classifier, with the residual network (ResNet) 50 chosen as the DNN architecture. The image data used comprise a lung CT image set. Two classification models are built from datasets with varying amounts of data, and lung cancer is categorized into four classes using 10-fold cross-validation. Furthermore, we employ t-distributed stochastic neighbor embedding to visually explain the data distribution after classification. Experimental results demonstrate that cross entropy is a highly useful metric for evaluating the reliability of image classifier models. It is noted that, for a more comprehensive evaluation of model performance, combining it with other evaluation metrics is considered essential.
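For reference, the cross entropy between a target distribution $p$ and a predicted distribution $q$ over $C$ classes takes the usual form (a standard definition, not specific to this study):

$$H(p,q)=-\sum_{c=1}^{C}p_c\log q_c,$$

which, for a one-hot label with true class $\hat{c}$, reduces to $-\log q_{\hat{c}}$, so lower average values indicate more confident correct predictions.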
Using the Raychaudhuri equation, we associate quantum probability amplitudes (propagators) to equatorial principal ingoing and outgoing null geodesic congruences in the Kerr metric. The expansion scalars diverge at the ring singularity; however, the propagators remain finite, which is an indication that at the quantum level singularities might disappear or, at least, become softened.
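For context, the Raychaudhuri equation for an affinely parametrized null geodesic congruence with tangent $k^a$ reads, in its standard form (not quoted from the article):

$$\frac{d\theta}{d\lambda}=-\frac{1}{2}\theta^{2}-\sigma_{ab}\sigma^{ab}+\omega_{ab}\omega^{ab}-R_{ab}k^{a}k^{b},$$

where $\theta$ is the expansion scalar, $\sigma_{ab}$ the shear, and $\omega_{ab}$ the twist of the congruence.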
The distance between two vertices u and v in a connected graph G is the number of edges lying in a shortest path (geodesic) between them. A vertex x of G performs the metric identification for a pair (u,v) of vertices in G if and only if equality between the distances of u and v from x implies that u=v (that is, for distinct u and v, the distance between u and x differs from the distance between v and x). The minimum number of vertices performing the metric identification for every pair of vertices in G defines the metric dimension of G. In this paper, we perform the metric identification of vertices in two types of polygonal cacti: the chain polygonal cactus and the star polygonal cactus.
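To make the definition concrete, the following is a minimal brute-force sketch of the metric dimension of a small graph; it assumes the networkx library is available, and the function names are illustrative rather than taken from the paper:

```python
from itertools import combinations
import networkx as nx

def is_resolving(G, dist, landmarks):
    # Every vertex must receive a distinct vector of distances to the landmarks.
    signatures = {tuple(dist[v][w] for w in landmarks) for v in G.nodes}
    return len(signatures) == G.number_of_nodes()

def metric_dimension(G):
    # Precompute all pairwise shortest-path distances.
    dist = dict(nx.all_pairs_shortest_path_length(G))
    nodes = list(G.nodes)
    # Try landmark sets of increasing size; the first resolving set gives the dimension.
    for k in range(1, len(nodes) + 1):
        for landmarks in combinations(nodes, k):
            if is_resolving(G, dist, landmarks):
                return k, landmarks
    return len(nodes), tuple(nodes)

# Example: a cycle on six vertices has metric dimension 2.
print(metric_dimension(nx.cycle_graph(6)))
```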
In a very recent article of mine I have corrected the traditional derivation of the Schwarzschild metric, thus arriving at a correct Schwarzschild metric different from the traditional one. In this article, starting from this correct Schwarzschild metric, I also propose corrections to the other traditional Reissner-Nordstrøm, Kerr and Kerr-Newman metrics, on the basis of the fact that these metrics should be equal to the correct Schwarzschild metric in the borderline case in which they reduce to the case described by this metric. In this way, we see that, like the correct Schwarzschild metric, the correct Reissner-Nordstrøm, Kerr and Kerr-Newman metrics also do not present any event horizon (and therefore do not present any black hole), unlike the traditional Reissner-Nordstrøm, Kerr and Kerr-Newman metrics.
In this paper, we prove that for some completions of certain fiber bundles there is a Maxwell-Einstein metric conformally related to any given Kähler class.
Deep metric learning (DML) has achieved great results on visual understanding tasks by seamlessly integrating conventional metric learning with deep neural networks. Existing deep metric learning methods focus on designing pair-based distance losses that decrease intra-class distance while increasing inter-class distance. However, these methods fail to preserve the geometric structure of data in the embedding space, which leads to spatial structure shift across mini-batches and may slow down the convergence of embedding learning. To alleviate these issues, by assuming that the input data are embedded in a lower-dimensional sub-manifold, we propose a novel deep Riemannian metric learning (DRML) framework that exploits non-Euclidean geometric structural information. Considering that the curvature information of data measures how much the Riemannian (non-Euclidean) metric deviates from the Euclidean metric, we leverage geometry flow, which is called a geometric evolution equation, to characterize the relation between the Riemannian metric and its curvature. Our DRML not only regularizes the local neighborhood connections of the embeddings at the hidden layer but also adapts the embeddings to preserve the geometric structure of the data. On several benchmark datasets, the proposed DRML outperforms all existing methods, and these results demonstrate its effectiveness.
The metric of a graph plays an essential role in the arrangement of different dimensional structures and finding their basis in various terms. The metric dimension of a graph is the minimum possible number of vertices selected so that each vertex of the graph is distinctively defined by its vector of distances to the set of selected vertices. This set of selected vertices is known as the metric basis of a graph. In applied mathematics and computer science, the topic of metric basis is considered as the locating number or locating set, and it has applications in robot navigation and finding a beacon set of a computer network. Due to the vast applications of this concept in computer science, optimization problems, and also in chemistry, enormous research has been conducted. To extend this research to a four-dimensional structure, we studied the metric basis of the Klein bottle and proved that the Klein bottle has a constant metric dimension for the variation of all its parameters. Although the metric basis varies among 3 and 4 values when the values of its parameters change, the metric dimension remains constant and unchanged with respect to its order, or number of vertices. The methodology of determining the metric basis or locating set is based on the distances of a graph. Therefore, we proved the main theorems in distance form.
The problem of finding a minimum set of landmarks consisting of auto-machines (robots) in a connected network is studied with the concept of the location number or metric dimension of this network. In this paper, we study the latest type of metric dimension, called the local fractional metric dimension (LFMD), and find its upper bounds for generalized Petersen networks GP(n,3), where n≥7. For n≥9, the limiting value of LFMD for GP(n,3) is also obtained as 1 (bounded) as n approaches infinity.
Component-based software engineering is concerned with the development of software that can satisfy the customer prerequisites through reuse or independent development. Coupling and cohesion measurements are primarily used to analyse better software design quality, increase reliability, and reduce system software complexity. The complexity measurement of cohesion and coupling components analyzes the relationship between the component modules. In this paper, we propose a component selection framework based on the Hexa-oval optimization algorithm for selecting suitable components from the repository. It measures the interface density modules of coupling and cohesion in a modular software system. The cohesion measurement takes two parameters for analyzing the result of complexity, with the help of low cohesion and high cohesion. Coupling is measured between a component's inside parameters and outside parameters. In the final process, the measured coupling and cohesion values are used to calculate the average of the component parameters. This paper measures the complexity of direct and indirect interaction among the components, and the proposed algorithm selects the optimal component for the repository. The best results are observed for high cohesion and low coupling in component-based software engineering.
Meteorological droughts occur when there is a deficiency in rainfall, i.e., rainfall availability is below some acclaimed normal values. Hence, the greater challenge is to obtain suitable methods for assessing drought occurrence, its onset or initiation, and its termination. Thus, an attempt was made in this paper to evaluate the performance of the Standardised Precipitation Index (SPI) and the Standardised Precipitation Anomaly Index (SPAI) in characterising drought in Northern Nigeria, for purposes of comparison and the eventual adoption of a probable candidate index for the development of an Early Warning System. The findings indicated that, despite the fact that the annual timescale may be long, it can be employed to obtain information on the temporal evolution of drought, especially regional behaviour. However, a monthly timescale can be more appropriate if the emphasis is on evaluating the effects of drought in situations relating to water supply, agriculture, and groundwater abstractions. The SPAI can be employed for periodic rainfall time series, though it accentuates drought signatures and may not necessarily dampen high fluctuations, owing to the implications of high climatic variability and considering the stochastic nature and state transitions of drought phenomena. On the other hand, the temporal evolutions of SPI and SPAI were not coherent at different temporal accumulations, with differences in fluctuations. However, despite the differences between the SPI and SPAI, at some timescales, for instance the 6-month accumulation, both the spatial and temporal distributions of drought characteristics were seemingly consistent. In view of the observed shortcomings of both indices, especially the SPI, the Standardised Nonstationary Precipitation Index (SnsPI) should be looked into, and other indices that take into consideration the implications of global warming by incorporating potential evapotranspiration may be deemed more suitable for drought studies in Northern Nigeria.
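As an illustration of the SPI computation described above, here is a minimal sketch for a single accumulation period; it assumes numpy and scipy are available, the zero-rainfall correction is omitted, and the variable names and synthetic data are illustrative rather than taken from the study:

```python
import numpy as np
from scipy import stats

def spi(precip, window=6):
    # Accumulate rainfall over a rolling window (e.g., 6 months).
    totals = np.convolve(precip, np.ones(window), mode="valid")
    # Fit a gamma distribution to the accumulated totals.
    shape, loc, scale = stats.gamma.fit(totals, floc=0)
    # Map cumulative probabilities onto the standard normal to obtain SPI values.
    cdf = stats.gamma.cdf(totals, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(0)
monthly_rain = rng.gamma(shape=2.0, scale=30.0, size=240)  # 20 years of synthetic monthly rainfall
print(spi(monthly_rain, window=6)[:5])
```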
In recent years, deep learning techniques have been used to estimate gaze, a significant task in computer vision and human-computer interaction. Previous studies have made significant achievements in predicting 2D or 3D gazes from monocular face images. This study presents a deep neural network for 2D gaze estimation on mobile devices. It achieves state-of-the-art 2D gaze point regression error, while significantly improving gaze classification error on quadrant divisions of the display. To this end, an efficient attention-based module that correlates and fuses the left and right eye contextual features is first proposed to improve gaze point regression performance. Subsequently, through a unified perspective on gaze estimation, metric learning for gaze classification on quadrant divisions is incorporated as additional supervision. Consequently, both gaze point regression and quadrant classification performances are improved. The experiments demonstrate that the proposed method outperforms existing gaze estimation methods on the GazeCapture and MPIIFaceGaze datasets.
Purpose: This study examines the effects of using publication-based metrics for the initial screening in the application process for a project leader. The key questions are whether formal policy affects the allocation of funds to researchers with a better publication record and how the previous academic performance of principal investigators is related to future project results. Design/methodology/approach: We compared two competitions, before and after the policy raised the publication threshold for principal investigators. We analyzed 9,167 papers published by 332 winners in physics and the social sciences and humanities (SSH), and 11,253 publications resulting from each funded project. Findings: We found that among physicists, even in the first period, grants tended to be allocated to prolific authors publishing in high-quality journals. In contrast, the SSH project grantees had been less prolific in publishing internationally in both periods; however, in the second period, the selection of grant recipients yielded better results regarding the awarding of grants to more productive authors in terms of the quantity and quality of publications. There was no evidence that this better selection of grant recipients resulted in better publication records during grant realization. Originality: This study contributes to the discussion of formal policies that rely on metrics for the evaluation of grant proposals. The Russian case shows that such a policy may have a profound effect on changing the supply side of applicants, especially in disciplines that are less suitable for metric-based evaluations. In spite of the criticism directed at metrics, they might be a useful additional instrument in academic systems where professional expertise is corrupted and prevents the allocation of funds to prolific researchers.
In this paper, we solve obstacle problems on metric measure spaces with generalized Ricci lower bounds. We show the existence and Lipschitz continuity of the solutions, and then we establish some regularity properties of the free boundaries.
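For orientation, in the classical Euclidean setting the obstacle problem is the variational inequality below (a textbook formulation; in the metric measure setting the Dirichlet energy is typically replaced by the Cheeger energy, and the notation here is illustrative):

$$u\in K_{\psi,g}:=\{v\in H^{1}(\Omega):\ v-g\in H^{1}_{0}(\Omega),\ v\ge\psi\ \text{a.e.}\},\qquad \int_{\Omega}\nabla u\cdot\nabla(v-u)\,dx\ \ge\ 0\quad\text{for all } v\in K_{\psi,g}.$$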
Existing systems use key performance indicators (KPIs) as metrics for physical layer (PHY) optimization, which suffers from the problem of over-optimization, because some unnecessary PHY enhancements are imperceptible to terminal users and thus induce additional cost and energy waste. Therefore, it is necessary to directly utilize the quality of experience (QoE) of the user as the optimization metric, which can achieve the global optimum of QoE under cost and energy constraints. However, QoE is still an application-layer metric that cannot be easily used to design and optimize the PHY. To address this problem, in this paper we propose a novel end-to-end QoE (E2E-QoE) based optimization architecture at the user side for the first time. Specifically, a cross-layer parameterized model is proposed to establish the relationship between the PHY and E2E-QoE. Based on this, an E2E-QoE oriented PHY anomaly diagnosis method is further designed to locate the time and root cause of anomalies. Finally, we investigate optimizing the PHY algorithm directly based on the E2E-QoE. The proposed frameworks and algorithms are all validated using data from a real fifth-generation (5G) mobile system, which shows that using E2E-QoE as the metric for PHY optimization is feasible and can outperform existing schemes.