Edge devices, due to their limited computational and storage resources, often require the use of compilers for program optimization. Therefore, ensuring the security and reliability of these compilers is of paramount importance in the emerging field of edge AI. One widely used testing method for this purpose is fuzz testing, which detects bugs by feeding random test cases into the target program. However, this process consumes significant time and resources. To improve the efficiency of compiler fuzz testing, it is common practice to use test case prioritization techniques. Some researchers use machine learning to predict the code coverage of test cases, aiming to maximize the test capability for the target compiler by increasing the overall predicted coverage of the test cases. Nevertheless, these methods can only forecast the code coverage of the compiler at a specific optimization level, potentially missing many optimization-related bugs. In this paper, we introduce C-CORE (short for Clustering by Code Representation), the first framework to prioritize test cases according to their code representations, which are derived directly from the source code. This approach avoids being limited to specific compiler states and extends to a broader range of compiler bugs. Specifically, we first train a scaled pre-trained programming language model to capture as many common features as possible from the test cases generated by a fuzzer. Using this pre-trained model, we then train two downstream models: one for predicting the likelihood of triggering a bug and another for identifying code representations associated with bugs. Subsequently, we cluster the test cases according to their code representations and select the highest-scoring test case from each cluster as a high-quality test case. This reduction in redundant test cases leads to time savings. Comprehensive evaluation results reveal that code representations are better at distinguishing test capabilities, and that C-CORE significantly enhances testing efficiency. Across four datasets, C-CORE increases the average percentage of faults detected (APFD) by 0.16 to 0.31 and reduces test time by over 50% in 46% of cases. Compared with the best results from approaches using predicted code coverage, C-CORE improves the APFD value by 1.1% to 12.3% and achieves an overall time saving of 159.1%.
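The final selection step described above, keeping only the highest-scoring test case from each cluster, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the cluster labels are assumed to come from any clustering of the code representations, and the scores from the bug-likelihood model.

```python
import numpy as np

def select_representatives(labels, scores):
    """Pick the index of the highest-scoring test case in each cluster.

    labels: cluster id per test case (from clustering the code representations)
    scores: predicted bug-triggering likelihood per test case
    Returns indices of the selected high-quality test cases, one per cluster.
    """
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    selected = []
    for c in np.unique(labels):
        members = np.flatnonzero(labels == c)          # test cases in cluster c
        selected.append(int(members[np.argmax(scores[members])]))
    return sorted(selected)

# Example: 6 test cases grouped into 3 clusters
print(select_representatives([0, 0, 1, 1, 2, 2],
                             [0.2, 0.9, 0.5, 0.4, 0.1, 0.7]))  # -> [1, 2, 5]
```

Dropping the remaining cluster members is what yields the reported time savings while retaining one representative of each behavioral group.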
In real life, incomplete information, inaccurate data, and the preferences of decision-makers during qualitative judgment can all affect the decision-making process. As a technical instrument that can successfully handle uncertain information, Fermatean fuzzy sets have recently been used to solve multi-attribute decision-making (MADM) problems. This paper proposes a Fermatean hesitant fuzzy information aggregation method to address fusion problems in which membership, non-membership, and priority are considered simultaneously. Combining Fermatean hesitant fuzzy sets with Heronian mean operators, this paper proposes the Fermatean hesitant fuzzy Heronian mean (FHFHM) operator and the Fermatean hesitant fuzzy weighted Heronian mean (FHFWHM) operator. Then, considering that the priority relationship between attributes is often easier to obtain than the attribute weights, this paper defines a new Fermatean hesitant fuzzy prioritized Heronian mean (FHFPHM) operator and discusses its elegant properties, such as idempotency, boundedness, and monotonicity, in detail. Later, for problems with unknown weights and Fermatean hesitant fuzzy information, a MADM approach based on prioritized attributes is proposed, which can effectively depict the correlation between attributes and avoid the influence of subjective factors on the results. Finally, a numerical example of multi-sensor electronic surveillance is applied to verify the feasibility and validity of the proposed method.
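For context, the classical generalized Heronian mean on nonnegative reals, which operators such as the FHFHM lift to Fermatean hesitant fuzzy numbers, is commonly written as follows (stated here from the general aggregation-operator literature, not taken from this abstract):

```latex
\mathrm{HM}^{p,q}(a_1,\dots,a_n)
  = \left( \frac{2}{n(n+1)} \sum_{i=1}^{n} \sum_{j=i}^{n} a_i^{\,p}\, a_j^{\,q} \right)^{\frac{1}{p+q}},
  \qquad p, q \ge 0,\ p+q > 0 .
```

The pairwise products $a_i^{p} a_j^{q}$ over $i \le j$ are what let Heronian-mean operators capture interrelationships between the aggregated values.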
In view of environmental competencies, selecting the optimal green supplier is one of the crucial issues for enterprises, and multi-criteria decision-making (MCDM) methodologies can more easily solve this green supplier selection (GSS) problem. In addition, the prioritized aggregation (PA) operator can focus on the prioritization relationship over the criteria, the Choquet integral (CI) operator can fully take account of the importance of criteria and the interactions among them, and the Bonferroni mean (BM) operator can capture the interrelationships of criteria. However, most existing research cannot simultaneously consider the interactions, interrelationships, and prioritizations over the criteria that are involved in the GSS process. Moreover, the interval type-2 fuzzy set (IT2FS) is a more effective tool to represent fuzziness. Therefore, based on the advantages of PA, CI, BM, and IT2FS, this paper proposes interval type-2 fuzzy prioritized Choquet normalized weighted BM operators with fuzzy measure and generalized prioritized measure, and discusses some of their properties. Then, a novel MCDM approach for GSS based upon the presented operators is developed, and detailed decision steps are given. Finally, the applicability and practicability of the proposed methodology are demonstrated by its application to shared-bike GSS and by comparisons with other methods. The advantage of the proposed method is that it can consider interactions, interrelationships, and prioritizations over the criteria simultaneously.
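For reference, the classical Bonferroni mean that BM-based operators such as those above generalize is usually stated as follows (a standard definition from the literature, not quoted from this abstract):

```latex
\mathrm{BM}^{p,q}(a_1,\dots,a_n)
  = \left( \frac{1}{n(n-1)} \sum_{\substack{i,j=1 \\ i \ne j}}^{n} a_i^{\,p}\, a_j^{\,q} \right)^{\frac{1}{p+q}},
  \qquad p, q \ge 0 .
```

Because every term couples two distinct arguments $a_i$ and $a_j$, the BM models interrelationships between criteria that simple weighted averages miss.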
The Medical Internet of Things (MIoT) is a collection of small, energy-efficient wireless sensor devices that monitor a patient's body. Healthcare networks transmit continuous monitoring data so that patients can live independently. Despite many improvements in MIoT, critical issues remain that can affect the Quality of Service (QoS) of a network. Congestion handling is one of the critical factors that directly affect QoS: congestion in an MIoT network can cause higher energy consumption, delay, and loss of important data. If a patient has an emergency, the life-critical signals must be transmitted with minimum latency. During emergencies, the MIoT has to monitor patients continuously and transmit data (e.g., ECG, BP, heart rate) with minimum delay. Therefore, an efficient technique is required that can deliver the emergency data of high-risk patients to the medical staff on time and with maximum reliability. The main objective of this research is to monitor and transmit a patient's real-time data efficiently and to prioritize emergency data. In this paper, the Emergency Prioritized and Congestion Handling Protocol for Medical IoT (EPCP_MIoT) is proposed, which efficiently monitors patients and mitigates congestion by enabling different monitoring modes, while emergency data transmissions are prioritized and sent after a short interframe space (SIFS). The proposed technique is implemented and compared with a previous technique; the comparison results show that it outperforms the previous techniques in terms of network throughput, end-to-end delay, energy consumption, and packet loss ratio.
The object-based scalable coding in MPEG-4 is investigated, and a prioritized transmission scheme for MPEG-4 audio-visual objects (AVOs) over a DiffServ network with QoS guarantees is proposed. MPEG-4 AVOs are extracted and classified into different groups according to their priority values and scalable layers (visual importance). These priority values are mapped to the IP DiffServ per-hop behaviors (PHBs). This scheme can selectively discard packets of low importance in order to avoid network congestion. Simulation results show that the quality of the received video gracefully adapts to the network state, compared with the best-effort manner. Also, by allowing the content provider to define the prioritization of each audio-visual object, the adaptive transmission of object-based scalable video can be customized based on the content.
Pancreatic cancer (PC) occurs when malignant cells develop in part of the pancreas, a glandular organ behind the stomach. In 2015, about 40,560 people (20,710 men and 19,850 women) were expected to die of pancreatic cancer in the US (Siegel et al., 2015). Though PC accounts for about 3% of all cancers in the US, it causes about 7% of cancer deaths. This is mainly because the early stages of this cancer do not usually produce symptoms, so the cancer is almost always fatal by the time it is diagnosed.
Purpose: The authors develop prioritized operators named the Pythagorean fuzzy prioritized averaging operator with priority degrees and the Pythagorean fuzzy prioritized geometric operator with priority degrees. The properties of the proposed operators are compared with those of other current approaches, emphasizing the superiority of the presented work over currently used methods, and the impact of priority degrees on the aggregated outcome is thoroughly examined. Based on these operators, a decision-making approach is presented under the Pythagorean fuzzy set environment, and an illustrative example related to the selection of the best alternative demonstrates the efficiency of the proposed approach.
Design/methodology/approach: In real-world situations, Pythagorean fuzzy numbers are exceptionally useful for representing ambiguous data. The authors study multi-criteria decision-making problems in which the parameters have a prioritization relationship and introduce the idea of a priority degree. The aggregation operators are formed by awarding non-negative real numbers, known as priority degrees, among strict priority levels, yielding the two prioritized operators named above.
Findings: Comparisons with other current approaches emphasize the superiority of the presented work over currently used methods, and the impact of priority degrees on the aggregated outcome is thoroughly examined.
Originality/value: Forming aggregation operators by awarding priority degrees among strict priority levels is the main novelty; it yields the new Pythagorean fuzzy prioritized averaging and geometric operators with priority degrees.
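For reference, the scalar prioritized averaging (PA) operator of Yager, which the Pythagorean fuzzy prioritized operators above extend, derives its weights from the satisfaction of higher-priority criteria (stated from the general literature, not from this abstract):

```latex
T_1 = 1, \qquad T_j = \prod_{k=1}^{j-1} a_k \ \ (j \ge 2), \qquad
\mathrm{PA}(a_1,\dots,a_n) = \sum_{j=1}^{n} \frac{T_j}{\sum_{k=1}^{n} T_k}\, a_j ,
```

where the criteria are indexed in decreasing order of priority and $a_k \in [0,1]$ is the satisfaction value of criterion $k$. A poorly satisfied high-priority criterion thus shrinks the weight of every lower-priority criterion.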
Uncovering causal genes for human inherited diseases, as the primary step toward understanding the pathogenesis of these diseases, requires a combined analysis of genetic and genomic data. Although bioinformatics methods have been designed to prioritize candidate genes resulting from genetic linkage analysis or association studies, the coverage of both diseases and genes in existing methods is quite limited, preventing a scan for causal genes at the whole-genome level for a significant proportion of diseases. To overcome this limitation, we propose a method named pgWalk to prioritize candidate genes by integrating multiple phenomic and genomic data. We derive three types of phenotype similarities among 7,719 diseases and nine types of functional similarities among 20,327 genes. Based on a pair of phenotype and gene similarities, we construct a disease-gene network and then simulate the process in which a random walker wanders on such a heterogeneous network to quantify the strength of association between a candidate gene and a query disease. A weighted version of Fisher's method with dependence correction is adopted to integrate the 27 scores obtained in this way, and a final q-value is calibrated for prioritizing candidate genes. A series of validation experiments demonstrate the superior performance of this approach. We further show the effectiveness of the method in exome sequencing studies of autism and epileptic encephalopathies. An online service and standalone software for pgWalk can be found at http://bioinfo.au.tsinghua.edu.cn/jianglab/pgwalk.
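The walker simulation described above is typically a random walk with restart, whose iteration can be sketched as follows. This is a generic illustration, not pgWalk's implementation; the function name, restart probability, and toy network are assumptions for the example.

```python
import numpy as np

def random_walk_with_restart(W, seed, restart=0.7, tol=1e-10, max_iter=1000):
    """Random walk with restart on a (heterogeneous) network.

    W: adjacency matrix (n x n, nonnegative); seed: indices of query nodes.
    Returns the stationary probability of the walker at each node, which can
    rank candidate genes by association strength with the query disease.
    """
    n = W.shape[0]
    col_sums = W.sum(axis=0)
    P = W / np.where(col_sums == 0, 1, col_sums)    # column-normalize
    p0 = np.zeros(n)
    p0[list(seed)] = 1.0 / len(seed)                # restart distribution
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * P @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p

# Tiny example: a path graph 0-1-2, query node 0
W = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
ranks = random_walk_with_restart(W, seed=[0])
print(np.argsort(-ranks))   # nodes ranked by proximity to the seed
```

The fixed point of this iteration weights short, numerous paths to the seed nodes, which is why nearby genes score higher than distant ones.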
Test Case Prioritization (TCP) techniques perform better than other regression test optimization techniques, including Test Suite Reduction (TSR) and Test Case Selection (TCS). Many TCP techniques are available, and their performance is usually measured with the Average Percentage of Faults Detected (APFD) metric. This metric is value-neutral: it works well only when all test cases have the same cost and all faults have the same severity. Using APFD to evaluate test case orders where test case cost or fault severity varies is prone to produce misleading results. Therefore, using the right metric for the performance evaluation of TCP techniques is very important to obtain reliable and correct results. In this paper, two value-based TCP techniques using a Genetic Algorithm (GA) are introduced: Value-Cognizant Fault Detection-Based TCP (VCFDB-TCP) and Value-Cognizant Requirements Coverage-Based TCP (VCRCB-TCP). Two novel value-based performance evaluation metrics are also introduced: Average Percentage of Fault Detection per value (APFDv) and Average Percentage of Requirements Coverage per value (APRCv). Two case studies are performed to validate the proposed techniques and metrics. The proposed GA-based techniques outperformed existing state-of-the-art TCP techniques, including Original Order (OO), Reverse Order (REV-O), Random Order (RO), and a greedy algorithm.
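The standard APFD metric that this abstract critiques can be computed directly from its usual definition. The sketch below uses the classic formula; the test identifiers and fault matrix are made up for illustration.

```python
def apfd(order, faults):
    """Average Percentage of Faults Detected for a test-case order.

    order:  list of test-case ids in execution order (length n)
    faults: list of sets; faults[i] = ids of test cases that detect fault i
    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2 * n),
    where TF_i is the 1-based position of the first test revealing fault i.
    """
    n, m = len(order), len(faults)
    position = {t: i + 1 for i, t in enumerate(order)}
    tf = [min(position[t] for t in detecting if t in position)
          for detecting in faults]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

# Two faults, five tests: the order detecting both faults early scores higher
good = apfd(["t3", "t1", "t2", "t4", "t5"], [{"t3"}, {"t1", "t4"}])
bad  = apfd(["t5", "t4", "t2", "t1", "t3"], [{"t3"}, {"t1", "t4"}])
print(round(good, 2), round(bad, 2))   # -> 0.8 0.4
```

Note how the formula credits only detection positions, not test cost or fault severity, which is exactly the value-neutrality the paper's APFDv metric addresses.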
Although the construction of underground dams is one of the best methods to conserve water resources in arid and semi-arid regions, applying efficient methods for selecting suitable sites for subsurface dam construction remains a challenge. Because existing site-selection methods for underground dam construction are costly and time-consuming, this study aimed to present a new method using geographic information system techniques and decision-making processes. Exclusionary criteria including faults, slope, hypsometry, land use, soil, streams, geology, and the chemical properties of groundwater were selected for dam site selection, and inappropriate regions were excluded by integrating and scoring layers in ArcGIS based on Boolean logic. Finally, appropriate sites were prioritized using Multi-Attribute Utility Theory. According to the utility coefficients, seven sites were selected as candidate regions for underground dam construction based on all criteria and experts' opinions. The Nazarabad dam site was the best location for underground dam construction, with a utility coefficient of 0.7137, followed by the Akhavan site with a utility coefficient of 0.4633 and the Mirshamsi site with a utility coefficient of 0.4083. This study proposes a new approach for constructing subsurface dams at proper sites and helps managers and decision-makers achieve sustainable water resources with limited facilities and capital, avoiding the waste of national capital.
Software needs modifications and requires revisions regularly. Owing to these revisions, retesting software becomes essential to ensure that the enhancements made have not affected its bug-free functioning. The time and cost incurred in this process need to be reduced through test case selection and prioritization. Many nature-inspired techniques have been applied in this area; African Buffalo Optimization is one such approach, applied here to regression test selection and prioritization. This paper explains and proves the applicability of African Buffalo Optimization to test case selection and prioritization. The proposed algorithm converges in polynomial time (O(n^2)). The empirical evaluation of African Buffalo Optimization for test case prioritization is performed on a sample data set with multiple iterations. An astounding 62.5% drop in size and a 48.57% drop in the runtime of the original test suite were recorded. The obtained results are compared with Ant Colony Optimization. The comparative analysis indicates that African Buffalo Optimization and Ant Colony Optimization exhibit similar fault detection capabilities (80%), with a reduction in the overall execution time and size of the resultant test suite. The results and analysis hence advocate and encourage the use of African Buffalo Optimization in the area of test case selection and prioritization.
Regression testing is a widely used approach to confirm the correct functionality of software in incremental development. The use of test cases makes it easier to test the ripple effect of changed requirements. Rigorous testing may help in meeting quality criteria based on conformance to the requirements given by the intended stakeholders. However, a minimized and prioritized set of test cases may reduce the effort and time required for testing while focusing on the timely delivery of the software application. In this research, a technique named TestReduce is presented to obtain a minimal, high-priority set of test cases that ensures a web application meets the required quality criteria. TestReduce blends in a genetic algorithm to find an optimized and minimal set of test cases. The ultimate objective of this study is to provide a technique that solves the minimization problem of regression test cases in the case of linked requirements. The 100-Dollar prioritization approach is used to define the priority of new requirements.
Both unit and integration testing are crucial for almost any software application because each of them operates a distinct process to examine the product. Due to resource constraints, when software is subjected to modifications, the drastic increase in the number of test cases forces testers to opt for a test optimization strategy. One such strategy is test case prioritization (TCP). Existing works have propounded various methodologies that re-order system-level test cases with the intent of boosting either the fault detection capability or the coverage efficacy as early as possible. Nonetheless, singularity in objective functions and the lack of dissimilitude among the re-ordered test sequences have degraded the cogency of these approaches. Considering such gaps, and scenarios in which meteoric and continuous updates to the software make the intensive unit and integration testing process more fragile, this study introduces a memetics-inspired methodology for TCP. The proposed structure is first embedded with diverse parameters, and then the traditional steps of the shuffled frog-leaping algorithm (SFLA) are followed to prioritize the test cases at the unit and integration levels. On five standard test functions, a comparative analysis is conducted between established algorithms and the proposed approach, where the latter enhances the coverage rate and fault detection of the re-ordered test sets. Investigation results for the mean average percentage of fault detection (APFD) confirm that the proposed approach exceeds the memetic, basic multi-walk, PSO, and optimized multi-walk approaches by 21.7%, 13.99%, 12.24%, and 11.51%, respectively.
Automation software needs to be continuously updated by addressing the software bugs contained in its repositories. However, bugs have different levels of importance; hence, it is essential to prioritize bug reports based on their severity and importance. Manually managing the deluge of incoming bug reports faces the time and resource constraints of the development team and delays the resolution of critical bugs. Therefore, bug report prioritization is vital. This study proposes a new model for bug prioritization based on the averaged one-dependence estimator; it prioritizes bug reports based on severity, which is determined by the number of attributes: the greater the number of attributes, the higher the severity. The proposed model is evaluated using precision, recall, F1-score, accuracy, G-measure, and the Matthews correlation coefficient. Results of the proposed model are compared with those of support vector machine (SVM) and Naive Bayes (NB) models. The Eclipse and Mozilla datasets were used as the sources of bug reports. The proposed model improved bug repository management and outperformed the SVM and NB models. Additionally, the proposed model uses a weaker attribute-independence assumption than the former models, thereby improving prediction accuracy with minimal computational cost.
In this paper, we propose a practical design and implementation of network-adaptive high-definition (HD) MPEG-2 video streaming combined with cross-layered channel monitoring (CLM) over the IEEE 802.11a wireless local area network (WLAN). For wireless channel monitoring, we adopt a cross-layered approach in which an access point (AP) periodically measures lower-layer information from the medium access control (MAC) and physical (PHY) layers (e.g., the MAC-layer loss rate) and then sends the monitored information to the streaming server application. The adaptive streaming server with the CLM scheme reacts more quickly and efficiently to the fluctuating wireless channel than the end-to-end application-layer monitoring (E2EM) scheme. The streaming server dynamically performs priority-based frame dropping to adjust the sending rate according to the measured wireless channel condition. For this purpose, the proposed streaming system provides frame-based prioritized packetization using a real-time stream parsing module. Various evaluation results on an IEEE 802.11a WLAN testbed are provided to verify the intended Quality of Service (QoS) adaptation capability. Experimental results show that the proposed system can mitigate the quality degradation of video streaming caused by the fluctuations of the time-varying channel.
The main idea of reinforcement learning is evaluating the chosen action based on the current reward. Following this concept, many algorithms have achieved proper performance on classic Atari 2600 games. The main challenge arises when the reward is sparse or missing, as in complex exploration environments like the Montezuma's Revenge, Pitfall, and Private Eye games. Approaches built to deal with such challenges have been very demanding. This work introduces a different reward system that enables a simple classical algorithm to learn fast and achieve high performance in hard exploration environments. Moreover, we add some simple enhancements to several hyperparameters, such as the number of actions and the sampling ratio, which help improve performance. We include the extra reward within the human demonstrations, and then use Prioritized Double Deep Q-Networks (Prioritized DDQN) to learn from these demonstrations. Our approach enabled the Prioritized DDQN, with a short learning time, to finish the first level of Montezuma's Revenge and to perform well in both Pitfall and Private Eye. We used the same games to compare our results with several baselines, such as Rainbow and the Deep Q-learning from Demonstrations (DQfD) algorithm. The results show that the new reward system enables Prioritized DDQN to outperform the baselines in hard exploration games with a short learning time.
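The Prioritized DDQN component relies on prioritized experience replay, whose sampling rule can be sketched as follows. This is a generic illustration of proportional prioritization after Schaul et al.; the function name, hyperparameter values, and toy priorities are assumptions, not the paper's code.

```python
import numpy as np

def sample_batch(priorities, batch_size, alpha=0.6, beta=0.4, seed=0):
    """Proportional prioritized replay sampling (a sketch).

    P(i) = p_i^alpha / sum_j p_j^alpha ; importance-sampling weights
    w_i = (N * P(i))^(-beta), normalized by their maximum.
    """
    rng = np.random.default_rng(seed)
    p = np.asarray(priorities, float) ** alpha
    probs = p / p.sum()                         # sampling distribution
    idx = rng.choice(len(probs), size=batch_size, p=probs)
    weights = (len(probs) * probs[idx]) ** (-beta)
    return idx, weights / weights.max()         # bias-correcting weights in (0, 1]

# Transitions with a large TD error are sampled far more often:
idx, w = sample_batch([0.1, 0.1, 5.0, 0.1], batch_size=8)
print(len(idx), float(w.max()))   # -> 8 1.0
```

Transitions carrying the demonstration-derived extra reward would get larger TD errors and thus be replayed more, which is one plausible reason the combination learns quickly.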
Background: Clinical judgment is a specific role that establishes a professional identity. The purpose of this paper is to prepare nursing students to make better judgments in the clinical setting and to realign learning and teaching. Methods: We used the six steps to competent clinical judgment suggested by the National Council of State Boards of Nursing (NCSBN) as a clinical judgment model: 1) recognizing cues, 2) analyzing cues, 3) prioritizing hypotheses, 4) generating solutions, 5) taking action, and 6) evaluating outcomes during the head-to-toe examination of the patient. Results: The primary outcomes are stabilization of the patient's hemodynamics and prevention of further blood loss. Fluids are given to help keep the vascular volume from being depleted, but they cannot solve the underlying problem. Continued assessment, intervention, and monitoring of vital signs take place through the course of the hospital stay, ending with the patient's discharge. Discussion: Survivors of sexual assault present unique care challenges for a nurse. The nurse needs to assess, intervene, monitor, and pay attention to detail across the six steps of clinical judgment, resulting in positive outcomes for the patient. Conclusions: Forensic nursing is a field of nursing that focuses on the care of sexual assault survivors and works to make the aftermath of their tragic situation easier to cope with. Strengthening clinical judgment skills could remedy significant mistakes made by novice forensic nurses. Critical thinking and clinical ethical reasoning are the building blocks of clinical judgment.
The aim of liver transplantation (LT) for hepatocellular carcinoma (HCC) is to ensure a rate of disease-free survival similar to that of patients transplanted for benign disease. Therefore, we are forced to adopt strict criteria when selecting candidates for LT and prioritizing patients on the waiting list (WL), to clarify the indications for bridging therapy for groups at risk of progression or recurrence, and to establish certain limits for downstaging therapies. Although the Milan criteria (MC) remain by far the standard and most widely employed criteria for indicating LT in HCC patients, in the coming years criteria will be consolidated that take into account not only data regarding the size/volume and number of tumors but also their biology. These criteria will mainly include alpha-fetoprotein (AFP) values and, in view of their wide variability, any of the published logarithmic models for the selection of candidates for LT. Bridging therapy is necessary for HCC patients on the WL who meet the MC and may experience a delay to LT of more than 6 months, or who have any of the known risk factors for recurrence. It is difficult to define single AFP values that would indicate bridging therapy (200, 300, or 400 ng/mL); therefore, it is preferable to rely on a French AFP model score > 2. Other single indications for bridging therapy include a tumor diameter greater than 3 cm, more than one tumor, and an AFP slope greater than 15 ng/mL per month or > 50 ng/mL over three months during strict monitoring on the WL. When considering the inclusion of patients on the WL who do not meet the MC, it is mandatory to determine their eligibility for downstaging therapy prior to inclusion. The upper limit for this therapy could be one lesion up to 8 cm, 2-3 lesions with a total tumor diameter up to 8 cm, or a total tumor volume of 115 cm^3. Lastly, liver allocation and the prioritization of patients with HCC on the WL should take into account the recently described HCC model for end-stage liver disease, which considers hepatic function, HCC size and number, and the log of AFP values. This formula has been calibrated against the survival data of non-HCC patients and produces a dynamic and more accurate assessment model.
Abstract: Edge devices, due to their limited computational and storage resources, often require the use of compilers for program optimization. Therefore, ensuring the security and reliability of these compilers is of paramount importance in the emerging field of edge AI. One widely used testing method for this purpose is fuzz testing, which detects bugs by feeding random test cases into the target program. However, this process consumes significant time and resources. To improve the efficiency of compiler fuzz testing, it is common practice to use test case prioritization techniques. Some researchers use machine learning to predict the code coverage of test cases, aiming to maximize the test capability for the target compiler by increasing the overall predicted coverage of the test cases. Nevertheless, these methods can only forecast the code coverage of the compiler at a specific optimization level, potentially missing many optimization-related bugs. In this paper, we introduce C-CORE (short for Clustering by Code Representation), the first framework to prioritize test cases according to their code representations, which are derived directly from the source code. This approach avoids being limited to specific compiler states and extends to a broader range of compiler bugs. Specifically, we first train a scaled pre-trained programming language model to capture as many common features as possible from the test cases generated by a fuzzer. Using this pre-trained model, we then train two downstream models: one for predicting the likelihood of triggering a bug and another for identifying code representations associated with bugs. Subsequently, we cluster the test cases according to their code representations and select the highest-scoring test case from each cluster as a high-quality test case. This reduction in redundant test cases leads to time savings. Comprehensive evaluation results reveal that code representations are better at distinguishing test capabilities, and C-CORE significantly enhances testing efficiency. Across four datasets, C-CORE increases the average percentage of faults detected (APFD) value by 0.16 to 0.31 and reduces test time by over 50% in 46% of cases. Compared to the best results from approaches using predicted code coverage, C-CORE improves the APFD value by 1.1% to 12.3% and achieves an overall time saving of 159.1%.
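The cluster-then-select step described in this abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the k-means routine and function names are hypothetical, and the code representations and bug scores are assumed to come from the two downstream models mentioned above.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Squared distance from every point to every center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return labels

def prioritize(reprs, bug_scores, k):
    """Cluster code representations, keep the highest-scoring test case
    per cluster, and order the survivors by descending bug score."""
    labels = kmeans(reprs, k)
    picks = []
    for j in range(k):
        members = np.flatnonzero(labels == j)
        if len(members):
            picks.append(members[bug_scores[members].argmax()])
    return sorted(picks, key=lambda i: -bug_scores[i])
```

Dropping all but the top-scoring test case per cluster is what removes redundant test cases and yields the reported time savings.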
Abstract: In real life, incomplete information, inaccurate data, and the preferences of decision-makers during qualitative judgment affect the process of decision-making. As a technical instrument that can successfully handle uncertain information, Fermatean fuzzy sets have recently been used to solve multi-attribute decision-making (MADM) problems. This paper proposes a Fermatean hesitant fuzzy information aggregation method to address the problem of fusion where the membership, non-membership, and priority are considered simultaneously. Combining Fermatean hesitant fuzzy sets with Heronian mean operators, this paper proposes the Fermatean hesitant fuzzy Heronian mean (FHFHM) operator and the Fermatean hesitant fuzzy weighted Heronian mean (FHFWHM) operator. Then, considering that the priority relationship between attributes is often easier to obtain than the weights of attributes, this paper defines a new Fermatean hesitant fuzzy prioritized Heronian mean (FHFPHM) operator and discusses its elegant properties, such as idempotency, boundedness, and monotonicity, in detail. Later, for problems with unknown weights and Fermatean hesitant fuzzy information, a MADM approach based on prioritized attributes is proposed, which can effectively depict the correlation between attributes and avoid the influence of subjective factors on the results. Finally, a numerical example of multi-sensor electronic surveillance is applied to verify the feasibility and validity of the proposed method.
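For context, the two building blocks named above, as they are commonly defined in the fuzzy-aggregation literature (the hesitant extensions are the paper's own): a Fermatean fuzzy value is a pair of membership and non-membership grades constrained by cubes, and the generalized Heronian mean couples every pair of inputs:

```latex
% Fermatean fuzzy constraint on membership \mu and non-membership \nu:
0 \le \mu^{3} + \nu^{3} \le 1 .

% Generalized Heronian mean of a_1,\dots,a_n with parameters p, q \ge 0:
\mathrm{HM}^{p,q}(a_1,\dots,a_n)
  = \left( \frac{2}{n(n+1)} \sum_{i=1}^{n} \sum_{j=i}^{n} a_i^{p}\, a_j^{q} \right)^{\frac{1}{p+q}} .
```

The double sum over pairs with $i \le j$ is what lets Heronian-mean operators capture interrelationships between attributes.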
Funding: Supported by the National Natural Science Foundation of China (71771140), the Project of Cultural Masters and "the Four Kinds of a Batch" Talents, the Special Funds of the Taishan Scholars Project of Shandong Province (ts201511045), and the Major Bidding Projects of the National Social Science Fund of China (19ZDA080).
Abstract: In view of environmental competencies, selecting the optimal green supplier is one of the crucial issues for enterprises, and multi-criteria decision-making (MCDM) methodologies can more easily solve this green supplier selection (GSS) problem. In addition, the prioritized aggregation (PA) operator can focus on the prioritization relationship over the criteria, the Choquet integral (CI) operator can fully take account of the importance of criteria and the interactions among them, and the Bonferroni mean (BM) operator can capture the interrelationships of criteria. However, most existing research cannot simultaneously consider the interactions, interrelationships, and prioritizations over the criteria that are involved in the GSS process. Moreover, the interval type-2 fuzzy set (IT2FS) is a more effective tool to represent fuzziness. Therefore, based on the advantages of PA, CI, BM, and IT2FS, in this paper the interval type-2 fuzzy prioritized Choquet normalized weighted BM operators with fuzzy measure and generalized prioritized measure are proposed, and some of their properties are discussed. Then, a novel MCDM approach for GSS based upon the presented operators is developed, and detailed decision steps are given. Finally, the applicability and practicability of the proposed methodology are demonstrated by its application to shared-bike GSS and by comparisons with other methods. The advantage of the proposed method is that it can consider interactions, interrelationships, and prioritizations over the criteria simultaneously.
Funding: The Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah, under grant no. G:292-612-1440.
Abstract: The Medical Internet of Things (MIoT) is a collection of small, energy-efficient wireless sensor devices that monitor a patient's body. Healthcare networks transmit continuous monitoring data so that patients can be supervised independently. There have been many improvements in the MIoT, but critical issues remain that can affect the Quality of Service (QoS) of a network. Congestion handling is one of the critical factors that directly affects the QoS of the network. Congestion in an MIoT can cause higher energy consumption, delay, and loss of important data. If a patient has an emergency, the life-critical signals must be transmitted with minimum latency. During emergencies, the MIoT has to monitor patients continuously and transmit data (e.g., ECG, BP, heart rate) with minimum delay. Therefore, an efficient technique is required that can transmit the emergency data of high-risk patients to the medical staff on time and with maximum reliability. The main objective of this research is to monitor and transmit a patient's real-time data efficiently and to prioritize emergency data. In this paper, an Emergency Prioritized and Congestion Handling Protocol for Medical IoT (EPCP_MIoT) is proposed that efficiently monitors patients and overcomes congestion by enabling different monitoring modes, while emergency data transmissions are prioritized and sent after a SIFS interval. The proposed technique is implemented and compared with previous techniques; the comparison results show that it outperforms them in terms of network throughput, end-to-end delay, energy consumption, and packet loss ratio.
Abstract: The object-based scalable coding in MPEG-4 is investigated, and a prioritized transmission scheme for MPEG-4 audio-visual objects (AVOs) over a DiffServ network with QoS guarantees is proposed. MPEG-4 AVOs are extracted and classified into different groups according to their priority values and scalable layers (visual importance). These priority values are mapped to the IP DiffServ per-hop behaviors (PHBs). This scheme can selectively discard packets of low importance in order to avoid network congestion. Simulation results show that the quality of the received video gracefully adapts to the network state, as compared with the 'best-effort' manner. Also, by allowing the content provider to define the prioritization of each audio-visual object, the adaptive transmission of object-based scalable video can be customized based on the content.
Abstract: Pancreatic cancer (PC) occurs when malignant cells develop in the pancreas, a glandular organ behind the stomach. In 2015, about 40,560 people (20,710 men and 19,850 women) died of pancreatic cancer in the US (Siegel et al., 2015). Though PC accounts for about 3% of all cancers in the US, it causes about 7% of cancer deaths. This is mainly because the early stages of this cancer do not usually produce symptoms, and thus the cancer is almost always fatal by the time it is diagnosed.
Abstract: Purpose - The authors develop prioritized operators named the Pythagorean fuzzy prioritized averaging operator with priority degrees and the Pythagorean fuzzy prioritized geometric operator with priority degrees. The properties of the proposed method are systematically compared to those of other current approaches, emphasizing the superiority of the presented work over currently used methods. Furthermore, the impact of priority degrees on the aggregate outcome is thoroughly examined. Further, based on these operators, a decision-making approach is presented under the Pythagorean fuzzy set environment. An illustrative example related to the selection of the best alternative is considered to demonstrate the efficiency of the proposed approach. Design/methodology/approach - In real-world situations, Pythagorean fuzzy numbers are exceptionally useful for representing ambiguous data. The authors look at multi-criteria decision-making issues in which the parameters have a prioritization relationship, and introduce the idea of a priority degree. The aggregation operators are formed by awarding non-negative real numbers, known as priority degrees, among strict priority levels. Findings - The resulting operators' properties compare favorably with those of existing approaches, the impact of priority degrees on the aggregate outcome is examined in detail, and a decision-making approach based on the operators is presented under the Pythagorean fuzzy set environment, illustrated by a best-alternative selection example. Originality/value - Forming aggregation operators by awarding non-negative priority degrees among strict priority levels distinguishes the presented work from currently used methods.
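As background, a sketch of the standard prioritized averaging scheme (due to Yager) on which such prioritized operators are typically built; the priority-degree variants described in the abstract generalize this weighting. With criteria ordered by strict priority and $s(a_k) \in [0,1]$ a satisfaction score, the weights decay along the priority chain:

```latex
T_1 = 1, \qquad
T_i = \prod_{k=1}^{i-1} s(a_k) \quad (i \ge 2), \qquad
w_i = \frac{T_i}{\sum_{j=1}^{n} T_j}, \qquad
\mathrm{PA}(a_1,\dots,a_n) = \sum_{i=1}^{n} w_i\, a_i .
```

A poorly satisfied high-priority criterion shrinks every $T_i$ that follows it, so lower-priority criteria cannot compensate for it.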
Funding: This work was supported by the National Basic Research Program of China (2012CB316504), the National High Technology Research and Development Program of China (2012AA020401), and the National Natural Science Foundation of China (61175002).
Abstract: Uncovering causal genes for human inherited diseases, as the primary step toward understanding the pathogenesis of these diseases, requires a combined analysis of genetic and genomic data. Although bioinformatics methods have been designed to prioritize candidate genes resulting from genetic linkage analysis or association studies, the coverage of both diseases and genes in existing methods is quite limited, thereby preventing the scan for causal genes for a significant proportion of diseases at the whole-genome level. To overcome this limitation, we propose a method named pgWalk to prioritize candidate genes by integrating multiple phenomic and genomic data. We derive three types of phenotype similarities among 7,719 diseases and nine types of functional similarities among 20,327 genes. Based on a pair of phenotype and gene similarities, we construct a disease-gene network and then simulate the process in which a random walker wanders on such a heterogeneous network to quantify the strength of association between a candidate gene and a query disease. A weighted version of Fisher's method with dependent correction is adopted to integrate the 27 scores obtained in this way, and a final q-value is calibrated for prioritizing candidate genes. A series of validation experiments demonstrates the superior performance of this approach. We further show the effectiveness of this method in exome sequencing studies of autism and epileptic encephalopathies. An online service and the standalone software of pgWalk can be found at http://bioinfo.au.tsinghua.edu.cn/jianglab/pgwalk.
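The core random-walk step described above can be sketched with random walk with restart on a plain adjacency matrix. This is a generic illustration, not pgWalk itself: the function name and parameters are hypothetical, and pgWalk operates on a heterogeneous disease-gene network rather than this toy graph.

```python
import numpy as np

def random_walk_with_restart(W, seeds, restart=0.7, tol=1e-8, max_iter=1000):
    """Rank nodes by the steady-state visiting probability of a walker
    that restarts at the seed nodes (e.g., a query disease)."""
    # Column-normalize the adjacency matrix into transition probabilities.
    col = W.sum(axis=0)
    T = W / np.where(col == 0, 1, col)
    p0 = np.zeros(len(W))
    p0[seeds] = 1.0 / len(seeds)
    p = p0.copy()
    for _ in range(max_iter):
        # With prob. (1 - restart) follow an edge; otherwise jump to a seed.
        p_next = (1 - restart) * T @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p
```

Candidate genes are then ranked by their entries in the returned probability vector: the closer a gene sits to the query disease in the network, the higher its score.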
Abstract: Test Case Prioritization (TCP) techniques perform better than other regression test optimization techniques, including Test Suite Reduction (TSR) and Test Case Selection (TCS). Many TCP techniques are available, and their performance is usually measured with the metric Average Percentage of Fault Detection (APFD). This metric is value-neutral: it only works well when all test cases have the same cost and all faults have the same severity. Using APFD to evaluate test case orders where test case cost or fault severity varies is prone to produce misleading results. Therefore, using the right metric for the performance evaluation of TCP techniques is very important for obtaining reliable and correct results. In this paper, two value-based TCP techniques are introduced using a Genetic Algorithm (GA): Value-Cognizant Fault Detection-Based TCP (VCFDB-TCP) and Value-Cognizant Requirements Coverage-Based TCP (VCRCB-TCP). Two novel value-based performance evaluation metrics are also introduced for value-based TCP: Average Percentage of Fault Detection per value (APFDv) and Average Percentage of Requirements Coverage per value (APRCv). Two case studies are performed to validate the proposed techniques and performance evaluation metrics. The proposed GA-based techniques outperformed existing state-of-the-art TCP techniques, including Original Order (OO), Reverse Order (REV-O), Random Order (RO), and a greedy algorithm.
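The standard APFD metric discussed above has a simple closed form: for n test cases, m faults, and TF_i the 1-based position of the first test that reveals fault i, APFD = 1 - (ΣTF_i)/(n·m) + 1/(2n). A minimal sketch (function name hypothetical; the value-based APFDv/APRCv variants from the abstract are not shown):

```python
def apfd(order, fault_matrix):
    """Average Percentage of Faults Detected for a test-case order.

    order        : list of test indices in execution order
    fault_matrix : fault_matrix[t][f] is True if test t detects fault f
    Assumes every fault is detected by at least one test in the order.
    """
    n = len(order)
    m = len(fault_matrix[0])
    first = []  # TF_i: position of the first test revealing fault i
    for f in range(m):
        for pos, t in enumerate(order, start=1):
            if fault_matrix[t][f]:
                first.append(pos)
                break
    return 1 - sum(first) / (n * m) + 1 / (2 * n)
```

Orders that reveal faults earlier score closer to 1, which is why the same test suite can score very differently under different prioritizations.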
Abstract: Although the construction of underground dams is one of the best methods to conserve water resources in arid and semi-arid regions, applying efficient methods for the selection of suitable sites for subsurface dam construction remains a challenge. Because existing site-selection methods for underground dam construction are costly and time-consuming, this study aimed to present a new method using geographic information system techniques and decision-making processes. Exclusionary criteria, including fault, slope, hypsometry, land use, soil, stream, geology, and chemical properties of groundwater, were selected for dam site selection, and inappropriate regions were omitted by integrating and scoring layers in ArcGIS based on Boolean logic. Finally, appropriate sites were prioritized using Multi-Attribute Utility Theory. According to the resulting utility coefficients, seven sites were selected as candidate regions for underground dam construction based on all criteria and experts' opinions. The Nazarabad dam site was the best location for underground dam construction, with a utility coefficient of 0.7137, followed by the Akhavan site with a utility coefficient of 0.4633 and the Mirshamsi site with a utility coefficient of 0.4083. This study proposes a new approach for constructing a subsurface dam at a proper site and helps managers and decision-makers achieve sustainable water resources with limited facilities and capital while avoiding the waste of national capital.
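The Multi-Attribute Utility Theory ranking step can be illustrated with a common additive form: normalize each criterion, then compute a weighted sum per site. This is a generic sketch, not the study's exact model; the function name, the min-max normalization, and the assumption that higher values are better are all illustrative choices.

```python
def utility_coefficients(scores, weights):
    """Additive MAUT: min-max normalize each criterion across sites,
    then combine with a weighted sum to get one utility per site.

    scores  : list of rows, one per site, each row holding criterion values
    weights : one non-negative weight per criterion (should sum to 1)
    """
    n_sites = len(scores)
    n_crit = len(weights)
    cols = list(zip(*scores))  # transpose: one column per criterion
    normed = []
    for j in range(n_crit):
        lo, hi = min(cols[j]), max(cols[j])
        normed.append([(v - lo) / (hi - lo) if hi > lo else 0.0
                       for v in cols[j]])
    return [sum(weights[j] * normed[j][i] for j in range(n_crit))
            for i in range(n_sites)]
```

Sites are then ranked by descending utility coefficient, which is how a shortlist like the seven candidate regions above would be ordered.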
Funding: This research is funded by the Deanship of Scientific Research at Umm Al-Qura University, Grant Code: 22UQU4281755DSR02.
Abstract: Software needs modifications and requires regular revisions. Owing to these revisions, retesting the software becomes essential to ensure that the enhancements made have not affected its bug-free functioning. The time and cost incurred in this process need to be reduced by test case selection and prioritization. Many nature-inspired techniques have been applied in this area; African Buffalo Optimization is one such approach applied to regression test selection and prioritization. In this paper, the proposed work explains and demonstrates the applicability of the African Buffalo Optimization approach to test case selection and prioritization. The proposed algorithm converges in polynomial time (O(n^2)). The empirical evaluation of applying African Buffalo Optimization to test case prioritization is done on a sample dataset with multiple iterations. A 62.5% drop in size and a 48.57% drop in the runtime of the original test suite were recorded. The obtained results are compared with Ant Colony Optimization. The comparative analysis indicates that African Buffalo Optimization and Ant Colony Optimization exhibit similar fault detection capabilities (80%), together with a reduction in the overall execution time and size of the resultant test suite. The results and analysis hence advocate and encourage the use of African Buffalo Optimization in the area of test case selection and prioritization.
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work through the Large Groups Project under grant number RGP.2/49/43.
Abstract: Regression testing is a widely used approach to confirm the correct functionality of software in incremental development. The use of test cases makes it easier to test the ripple effect of changed requirements. Rigorous testing may help in meeting quality criteria based on conformance to the requirements given by the intended stakeholders. However, a minimized and prioritized set of test cases may reduce the effort and time required for testing while focusing on the timely delivery of the software application. In this research, a technique named TestReduce is presented to obtain a minimal set of test cases based on high priority, ensuring that the web application meets the required quality criteria. TestReduce uses a blend of genetic algorithms to find an optimized and minimal set of test cases. The ultimate objective of this study is to provide a technique that may solve the minimization problem of regression test cases in the case of linked requirements. In this research, the 100-Dollar prioritization approach is used to define the priority of the new requirements.
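The 100-Dollar prioritization approach mentioned above is a cumulative-voting scheme: each stakeholder distributes exactly 100 points across the requirements, and a requirement's priority is the total it receives. A minimal sketch (function name and data shape are illustrative, not the paper's implementation):

```python
def hundred_dollar_priorities(allocations):
    """100-Dollar (cumulative voting) prioritization.

    allocations : {stakeholder: {requirement: points}}, where each
                  stakeholder's points must sum to exactly 100.
    Returns requirements sorted from highest to lowest total points.
    """
    totals = {}
    for stakeholder, points in allocations.items():
        if sum(points.values()) != 100:
            raise ValueError(f"{stakeholder} must allocate exactly 100 points")
        for req, p in points.items():
            totals[req] = totals.get(req, 0) + p
    # Highest total first = highest priority.
    return sorted(totals, key=totals.get, reverse=True)
```

Forcing the budget to exactly 100 points is what makes stakeholders trade requirements off against each other instead of rating everything as important.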
Abstract: Both unit and integration testing are crucial for almost any software application because each operates a distinct process to examine the product. Due to resource constraints, when software is subjected to modifications, the drastic increase in the number of test cases forces testers to opt for a test optimization strategy. One such strategy is test case prioritization (TCP). Existing works have propounded various methodologies that re-order system-level test cases with the intent of boosting either the fault detection capabilities or the coverage efficacy at the earliest. Nonetheless, singularity in objective functions and the lack of dissimilitude among the re-ordered test sequences have degraded the cogency of their approaches. Considering such gaps, and scenarios in which meteoric and continuous updates to the software make the intensive unit and integration testing process more fragile, this study introduces a memetics-inspired methodology for TCP. The proposed structure is first embedded with diverse parameters, and then the traditional steps of the shuffled frog leaping approach (SFLA) are followed to prioritize the test cases at the unit and integration levels. On 5 standard test functions, a comparative analysis is conducted between the established algorithms and the proposed approach, where the latter enhances the coverage rate and fault detection of re-ordered test sets. Investigation results for the mean average percentage of fault detection (APFD) confirmed that the proposed approach exceeds the memetic, basic multi-walk, PSO, and optimized multi-walk approaches by 21.7%, 13.99%, 12.24%, and 11.51%, respectively.
Funding: This work was supported in part by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2020R1A2C1013308).
Abstract: Automation software needs to be continuously updated by addressing the software bugs contained in its repositories. However, bugs have different levels of importance; hence, it is essential to prioritize bug reports based on their severity and importance. Manually managing the deluge of incoming bug reports faces the time and resource constraints of the development team and delays the resolution of critical bugs. Therefore, bug report prioritization is vital. This study proposes a new model for bug prioritization based on the averaged one-dependence estimators; it prioritizes bug reports based on severity, which is determined by the number of attributes: the more attributes, the higher the severity. The proposed model is evaluated using precision, recall, F1-score, accuracy, G-measure, and the Matthews correlation coefficient. Results of the proposed model are compared with those of the support vector machine (SVM) and Naive Bayes (NB) models. Eclipse and Mozilla datasets were used as the sources of bug reports. The proposed model improved bug repository management and outperformed the SVM and NB models. Additionally, the proposed model uses a weaker attribute independence assumption than the former models, thereby improving prediction accuracy with minimal computational cost.
Funding: Project (No. R05-2004-000-10987-0) partly supported by the Basic Research Program of the Korea Research Foundation.
Abstract: In this paper, we propose a practical design and implementation of network-adaptive high-definition (HD) MPEG-2 video streaming combined with cross-layered channel monitoring (CLM) over the IEEE 802.11a wireless local area network (WLAN). For wireless channel monitoring, we adopt a cross-layered approach, in which an access point (AP) periodically measures lower-layer information from the medium access control (MAC) and physical (PHY) layers (e.g., the MAC-layer loss rate) and then sends the monitored information to the streaming server application. The adaptive streaming server with the CLM scheme reacts more quickly and efficiently to the fluctuating wireless channel than the end-to-end application-layer monitoring (E2EM) scheme. The streaming server dynamically performs priority-based frame dropping to adjust the sending rate according to the measured wireless channel condition. For this purpose, the proposed streaming system provides frame-based prioritized packetization using a real-time stream parsing module. Various evaluation results over an IEEE 802.11a WLAN testbed are provided to verify the intended Quality of Service (QoS) adaptation capability. Experimental results show that the proposed system can mitigate the quality degradation of video streaming caused by the fluctuations of a time-varying channel.
Abstract: The main idea of reinforcement learning is evaluating the chosen action based on the current reward. Following this concept, many algorithms have achieved proper performance on the classic Atari 2600 games. The main challenge arises when the reward is sparse or missing, as in complex exploration environments like the Montezuma's Revenge, Pitfall, and Private Eye games. Approaches built to deal with such challenges have been very demanding. This work introduces a different reward system that enables a simple classical algorithm to learn fast and achieve high performance in hard exploration environments. Moreover, we added some simple enhancements to several hyperparameters, such as the number of actions and the sampling ratio, which helped improve performance. We include the extra reward within the human demonstrations. After that, we use Prioritized Double Deep Q-Networks (Prioritized DDQN) to learn from these demonstrations. Our approach enabled Prioritized DDQN, with a short learning time, to finish the first level of the Montezuma's Revenge game and to perform well in both Pitfall and Private Eye. We used the same games to compare our results with several baselines, such as the Rainbow and Deep Q-learning from Demonstrations (DQfD) algorithms. The results showed that the new reward system enabled Prioritized DDQN to outperform the baselines in the hard exploration games with a short learning time.
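The "prioritized" part of Prioritized DDQN refers to prioritized experience replay: transitions with larger TD errors are replayed more often. A minimal proportional-variant sketch (class and parameter names are illustrative, not this paper's code; a production version would use a sum-tree instead of a flat list):

```python
import random

class PrioritizedReplayBuffer:
    """Proportional prioritized replay: transition i is sampled with
    probability p_i**alpha / sum_j p_j**alpha, where p_i is |TD error|
    plus a small epsilon so nothing gets probability zero."""

    def __init__(self, capacity, alpha=0.6, eps=1e-3):
        self.capacity, self.alpha, self.eps = capacity, alpha, eps
        self.data, self.prios = [], []

    def add(self, transition, td_error):
        if len(self.data) >= self.capacity:  # evict the oldest transition
            self.data.pop(0)
            self.prios.pop(0)
        self.data.append(transition)
        self.prios.append((abs(td_error) + self.eps) ** self.alpha)

    def sample(self, batch_size):
        total = sum(self.prios)
        weights = [p / total for p in self.prios]
        idx = random.choices(range(len(self.data)), weights=weights, k=batch_size)
        return [self.data[i] for i in idx], idx

    def update(self, idx, td_errors):
        """Refresh priorities after the sampled batch is re-evaluated."""
        for i, e in zip(idx, td_errors):
            self.prios[i] = (abs(e) + self.eps) ** self.alpha
```

Because surprising transitions (large TD error) dominate the sampling, the agent revisits rare, informative events far more often than a uniform replay buffer would, which is particularly valuable with sparse rewards.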
Abstract: Background: Clinical judgment is a specific role that establishes a professional identity. The purpose of this paper is to prepare nursing students to make better judgments in the clinical setting and to realign learning and teaching. Methods: We used the six steps to a competent clinical judgment suggested by the National Council of State Boards of Nursing (NCSBN) as a clinical judgment model: 1) recognizing cues, 2) analyzing cues, 3) prioritizing hypotheses, 4) generating solutions, 5) taking action, and 6) evaluating outcomes during the head-to-toe examination of the patient. Results: The primary outcomes are stabilization of the patient's hemodynamics and prevention of further blood loss. Fluids are given to help keep the vascular volume from being depleted, but they cannot solve the underlying problem. Continued assessment, intervention, and monitoring of vital signs proceed through the course of the hospital stay, ending with the patient's discharge. Discussion: Survivors of sexual assault are unique patients for a nurse to care for. The nurse needs to assess, intervene, monitor, and pay attention to the details of the six steps of clinical judgment, resulting in positive outcomes for the patient. Conclusions: Forensic nursing is a field of nursing that focuses on the care of sexual assault survivors and works to make the aftermath of their tragic situation easier to cope with. Strengthening clinical judgment skills could remedy significant mistakes made by novice forensic nurses. Critical thinking and clinical ethical reasoning are the building blocks of clinical judgment.
Abstract: The aim of liver transplantation (LT) for hepatocellular carcinoma (HCC) is to ensure a rate of disease-free survival similar to that of patients transplanted for benign disease. Therefore, we are forced to adopt strict criteria when selecting candidates for LT and prioritizing patients on the waiting list (WL), to clarify the indications for bridging therapy for groups at risk of progression or recurrence, and to establish certain limits for downstaging therapies. Although the Milan criteria (MC) remain by far the standard and most employed criteria for the indication of HCC patients for LT, in the coming years criteria will be consolidated that take into account not only data regarding the size/volume and number of tumors but also their biology. These criteria will mainly include alpha-fetoprotein (AFP) values and, in view of their wide variability, any of the published logarithmic models for the selection of candidates for LT. Bridging therapy is necessary for HCC patients on the WL who meet the MC and may experience a delay before LT greater than 6 mo or who have any of the known risk factors for recurrence. It is difficult to define a single AFP value that would indicate bridging therapy (200, 300 or 400 ng/mL); therefore, it is preferable to rely on the criterion of a French AFP model score > 2. Other single indications for bridging therapy include a tumor diameter greater than 3 cm, more than one tumor, and an AFP slope greater than 15 ng/mL per month or > 50 ng/mL over three months during strict monitoring while on the WL. When considering the inclusion of patients on the WL who do not meet the MC, it is mandatory to determine their eligibility for downstaging therapy prior to inclusion. The upper limit for this therapy could be one lesion up to 8 cm, 2-3 lesions with a total tumor diameter up to 8 cm, or a total tumor volume of 115 cm^3. Lastly, liver allocation and the prioritization of patients with HCC on the WL should take into account the recently described HCC model for end-stage liver disease, which considers hepatic function, HCC size and number, and the log of AFP values. This formula has been calibrated with the survival data of non-HCC patients and produces a dynamic and more accurate assessment model.