Since Grover’s algorithm was first introduced, it has grown into a category of quantum algorithms that can be applied to many problems by exploiting quantum parallelism. The original application was unstructured search, with a time complexity of O(√N). The keys to Grover’s algorithm are the Oracle and Amplitude Amplification. In this paper, our purpose is to show through examples that, in general, the time complexity of the Oracle Phase is O(N), not O(1). As a result, the time complexity of Grover’s algorithm is O(N), not O(√N). As a secondary purpose, we also attempt to restore the time complexity of Grover’s algorithm to its original O(√N) by introducing an O(1) parallel algorithm for unstructured search without repeated items, which works in most cases. In worst-case scenarios where the number of repeated items is O(N), the time complexity of the Oracle Phase remains O(N) even after additional preprocessing.
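The oracle-cost claim can be illustrated with a small classical statevector simulation of Grover's algorithm (a minimal sketch, not the paper's construction; the function name and parameters are my own). Even when the oracle is modeled as a simple phase flip, preparing or simulating it classically touches the amplitudes of all N basis states.

```python
import numpy as np

def grover_search(n_qubits, marked, n_iter=None):
    """Classical statevector sketch of Grover's algorithm.

    The oracle is modeled as a phase flip on the marked index; simulating
    it classically already requires access to all N amplitudes, which is
    the O(N) oracle cost discussed in the abstract.
    """
    N = 2 ** n_qubits
    if n_iter is None:
        # standard iteration count, roughly (pi/4) * sqrt(N)
        n_iter = int(np.floor(np.pi / 4 * np.sqrt(N)))
    state = np.full(N, 1 / np.sqrt(N))       # uniform superposition
    for _ in range(n_iter):
        state[marked] *= -1                  # oracle: phase flip
        mean = state.mean()
        state = 2 * mean - state             # diffusion: inversion about the mean
    return int(np.argmax(np.abs(state) ** 2))  # most likely measurement outcome

print(grover_search(3, 5))  # → 5
```

For N = 8 the two Grover iterations boost the marked item's measurement probability to about 0.95.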
Maximum frequent pattern generation from a large database of transactions and items for association rule mining is an important research topic in data mining. Association rule mining aims to discover interesting correlations, frequent patterns, associations, or causal structures among items hidden in a large database. By exploiting quantum computing, we propose an efficient quantum search algorithm to discover the maximum frequent patterns. We modify Grover’s search algorithm so that a subspace of arbitrary symmetric states is used instead of the whole search space. We present a novel quantum oracle design that employs a quantum counter to count the maximum frequent items and a quantum comparator to check against a minimum support threshold. The derived algorithm increases the rate of correct solutions since the search is restricted to a subspace. Furthermore, our algorithm scales well and optimizes the number of qubits required in the design, which directly improves performance. The proposed design can accommodate more transactions and items while maintaining good performance with a small number of qubits.
The “open community” policy has attracted widespread attention and research. This paper presents a system analysis of the problem based on statistics, including regression-equation curve fitting and mathematical theory, combined with the practical camera measurement method, Prim’s algorithm, and a neural network, and examines the conditions under which opening a community is applicable. The results show that as the number of roads within a district increases, the travel-time benefit gradually increases, but the capacity of each type of district differs.
Based on the principle that “pre-disaster prevention outweighs rescue during disasters”, this study targets areas threatened by natural disasters and develops an automatic algorithm based on Prim’s algorithm to serve as an automatic identification system. When natural disasters disable key facilities in a region and prevent settlements from contacting the outside world, or prevent outsiders from sending rescuers to the settlements, the proposed system helps identify whether these regions will become isolated areas, so that disaster mitigation and relief resources can be allocated before any natural disaster in order to reduce potential losses. The automatic identification system, based on the threshold of channel blocking caused by broken roads and bridges, uses a decision tree model and relevant patterns to determine whether such regions will become isolated areas. The system’s identification results are verified against actual case histories; the comparison shows that the results correctly identify isolated areas. Finally, Microsoft Visual Studio C# and Google Maps are employed to apply the results and to produce an information model for the determination and decision support of isolated areas affected by natural disasters.
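The Prim-based connectivity step can be sketched as follows (a minimal illustration of Prim's algorithm with an isolated-node check, not the paper's actual system; node numbering and edge weights are assumed). Nodes that the spanning tree cannot reach are candidates for isolated areas.

```python
import heapq

def prim_mst(n, edges):
    """Prim's algorithm on an undirected weighted graph.

    n: number of nodes (0..n-1); edges: list of (u, v, weight).
    Returns (total_weight, mst_edges, isolated): any node left outside
    the tree is unreachable from node 0 and flagged as isolated.
    """
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    visited = [False] * n
    heap = [(0, 0, -1)]          # (weight, node, parent); start from node 0
    total, mst = 0, []
    while heap:
        w, u, p = heapq.heappop(heap)
        if visited[u]:
            continue
        visited[u] = True
        if p >= 0:
            total += w
            mst.append((p, u, w))
        for wv, v in adj[u]:
            if not visited[v]:
                heapq.heappush(heap, (wv, v, u))
    isolated = [i for i in range(n) if not visited[i]]
    return total, mst, isolated
```

For example, on a four-node road network where node 3 has no surviving link, `prim_mst(4, [(0, 1, 1), (1, 2, 2), (0, 2, 3)])` builds a tree of weight 3 and reports node 3 as isolated.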
This paper delves into the intricate interplay between artificial intelligence (AI) systems and the perpetuation of Anti-Black racism within the United States medical industry. Despite the promising potential of AI to enhance healthcare outcomes and reduce disparities, there is a growing concern that these technologies may, inadvertently or deliberately, exacerbate existing racial inequalities. Focusing specifically on the experiences of Black patients, this research investigates how three AI components, medical algorithms, machine learning, and natural language processing, contribute to the unequal distribution of medical resources, diagnoses, and healthcare treatment of those classified as Black. This review employs a multidisciplinary approach, combining insights from computer science, medical ethics, and social justice theory, to analyze the mechanisms through which AI systems may encode and reinforce racial biases. By dissecting these three primary components of AI, this paper aims to present a clear understanding of how these technologies work, how they intersect, and how they may inherently perpetuate harmful stereotypes, resulting in negligent outcomes for Black patients. The paper also explores the ethical implications of deploying AI in healthcare settings and calls for increased transparency, accountability, and diversity in the development and implementation of these technologies. Finally, it is important that I preface this paper with a clear and concise definition of what I refer to as Anti-Black racism throughout the text. Therefore, I assert the following: Anti-Black racism refers to prejudice, discrimination, or antagonism directed against individuals or communities of African descent based on their race. It involves the belief in the inherent superiority of one race over another and the systemic and institutional practices that perpetuate inequality and disadvantage for Black people. This form of racism can be manifested in various ways, such as unequal access to opportunities, resources, education, employment, and fair treatment within social, economic, and political systems. It is also pertinent to acknowledge that Anti-Black racism is deeply rooted in historical and societal structures within U.S. borders and beyond, leading to systemic disadvantages and disparities that impact the well-being and life chances of Black individuals and communities. Addressing Anti-Black racism involves recognizing and challenging both individual attitudes and systemic structures that contribute to discrimination and inequality. Efforts to combat it include promoting awareness, education, advocacy for policy changes, and fostering a culture of inclusivity and equality.
With the help of the classical Abel’s lemma on summation by parts and the algorithm for q-hypergeometric summations, we deal with summations that can be written as the product of a q-hypergeometric term and q-harmonic numbers. This enables us to construct and prove identities on q-harmonic numbers. Several examples are also given.
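For reference, one common form of Abel's lemma on summation by parts, the starting point for such manipulations, can be stated as follows (notation mine; the paper's exact formulation may differ):

```latex
\sum_{k=m}^{n} f_k \,\bigl(g_{k+1}-g_k\bigr)
  = f_{n+1}\, g_{n+1} - f_m\, g_m
  - \sum_{k=m}^{n} g_{k+1}\,\bigl(f_{k+1}-f_k\bigr).
```

It follows by telescoping: adding the two sums collapses to \(\sum_{k=m}^{n} (f_{k+1}g_{k+1} - f_k g_k) = f_{n+1}g_{n+1} - f_m g_m\).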
In this paper, we modify the Bregman APG<sub>s</sub> (BAPG<sub>s</sub>) method proposed in (Wang, L., et al.) for solving the support vector machine problem with truncated loss (HTPSVM) given in (Zhu, W., et al.), and we add an adaptive parameter selection technique based on (Ren, K., et al.). In each iteration, we use a linear approximation to obtain an explicit solution of the subproblem and choose a function to which the Bregman distance is applied. Finally, numerical experiments are performed to verify the efficiency of BAPG<sub>s</sub>.
Accurate frequency estimation in a wideband digital receiver using the FFT algorithm encounters challenges such as spectral leakage, which results from the FFT’s assumption of signal periodicity. High-resolution FFTs impose heavy computational demands, and estimating frequencies that are non-integer multiples of the frequency resolution is especially challenging. This paper introduces two novel methods for enhanced frequency precision, polynomial interpolation and array indexing, and compares their results with super-resolution and scalloping loss. Simulation results demonstrate the effectiveness of the proposed methods in contemporary radar systems, with array indexing providing the best frequency estimation despite consuming the most hardware resources. The paper demonstrates the trade-off between frequency-estimation accuracy and hardware resources when comparing polynomial interpolation and array indexing.
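One standard form of polynomial interpolation around an FFT peak is quadratic (parabolic) interpolation on the log-magnitude spectrum; the sketch below illustrates that general technique under my own assumptions (Hann window, single dominant tone) and is not the paper's exact design.

```python
import numpy as np

def estimate_freq(signal, fs):
    """Refine an FFT peak with quadratic (parabolic) interpolation.

    Fits a parabola through the log magnitudes of the peak bin and its
    two neighbors; the parabola's vertex gives a sub-bin frequency offset,
    recovering frequencies that fall between FFT bins.
    """
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
    k = int(np.argmax(spec))                       # coarse peak bin
    if 0 < k < len(spec) - 1:
        a, b, c = np.log(spec[k - 1:k + 2])
        delta = 0.5 * (a - c) / (a - 2 * b + c)    # vertex offset in bins
    else:
        delta = 0.0                                # edge bin: no refinement
    return (k + delta) * fs / n
```

With a 1024-point FFT at 1 kHz sampling (bin width ≈ 0.98 Hz), this typically recovers a tone such as 123.4 Hz to within a small fraction of a bin, whereas taking the raw peak bin alone can be off by up to half a bin.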