This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration to the human element and how people fit into the security process. On the other hand, cyber security adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect presents moral questions, since it is becoming more widely accepted that society has a duty to protect weaker members of society, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cybersecurity and information security on a larger scale. The study also highlights other areas of cybersecurity, which include defending people, social norms, and vital infrastructure from threats that arise online, in addition to information and technology protection. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it deeply analyzes real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses to protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.
A network analyzer can often comprehend many protocols, which enables it to display conversations taking place between hosts over a network. A network analyzer analyzes the device or network response and takes measurements that let the operator monitor the performance of a network or object in an RF circuit. The purpose of the following research is to analyze the capabilities of NetFlow Analyzer to measure various parts, including filters, mixers, frequency-sensitive networks, transistors, and other RF-based instruments. NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other types of network parameter sets, including Y-, Z-, and H-parameters, these instruments are typically employed to measure S-parameters, since transmission and reflection of electrical networks are simple to calculate at high frequencies. These analyzers are widely employed to characterize two-port networks, including filters and amplifiers. By allowing the user to view the actual data that is sent over a network, packet by packet, a network analyzer shows what is happening on the network. This research also contains the design model of NetFlow Analyzer used in transmission and reflection measurements. Gain, insertion loss, and the transmission coefficient are measured in transmission measurements, whereas return loss, the reflection coefficient, impedance, and other variables are measured in reflection measurements. These analyzers' operational frequencies vary from 1 Hz to 1.5 THz. These analyzers can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.
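The reflection quantities named above are related by standard RF formulas. As a minimal sketch (illustrative only, not part of the paper's NetFlow Analyzer design), the reflection coefficient, return loss, and VSWR of a load against a reference impedance can be computed as follows; the 75-ohm example load is hypothetical:

```python
import math

def reflection_coefficient(z_load, z0=50.0):
    """Reflection coefficient of a load against reference impedance Z0."""
    return (z_load - z0) / (z_load + z0)

def return_loss_db(gamma):
    """Return loss in dB from the reflection coefficient magnitude."""
    return -20.0 * math.log10(abs(gamma))

def vswr(gamma):
    """Voltage standing wave ratio from the reflection coefficient magnitude."""
    m = abs(gamma)
    return (1 + m) / (1 - m)

# Example: a 75-ohm load measured in a 50-ohm system
gamma = reflection_coefficient(75.0)   # (75-50)/(75+50) = 0.2
rl = return_loss_db(gamma)             # about 13.98 dB
ratio = vswr(gamma)                    # (1+0.2)/(1-0.2) = 1.5
```

A perfectly matched load (z_load equal to z0) gives a reflection coefficient of zero and infinite return loss, which is why these quantities are the natural outputs of a reflection measurement.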
Aims: This study aims at designing and implementing a syllabus-oriented question-bank system that is capable of producing paper-based exams with multiple forms along with answer keys. The developed software tool is named Χ(Chi)-Pro Milestone and supports four types of questions, namely: Multiple-choice, True/False, Short-Answer and Free-Response Essay questions. The study is motivated by the fact that the number of students in schools and universities is continuously growing at high, non-linear, and uncontrolled rates. This growth, however, is not accompanied by an equivalent growth of educational resources (mainly: instructors, classrooms, and labs). A direct result of this situation is a relatively large number of students in each classroom. It is observed that providing and using online-examining systems can be intractable and expensive. As an alternative, paper-based exams can be used. One main issue is that manually produced paper-based exams are of low quality because of human factors such as instability and a relatively narrow range of topics [1]. Further, it is observed that instructors usually need to spend a lot of time and energy composing paper-based exams with multiple forms. Therefore, the use of computers for automatic production of paper-based exams from question banks is becoming more and more important. Methodology: The design and evaluation of X-Pro Milestone are done by considering a basic set of design principles that are based on a list of identified Functional and Non-Functional Requirements. Deriving those requirements is made possible by developing X-Pro Milestone using the Iterative and Incremental model from the software engineering domain.
Results: We demonstrate that X-Pro Milestone has a number of excellent characteristics compared to the exam-preparation and question-bank tools available in the market. Some of these characteristics are: ease of use and operation, a user-friendly interface with good usability, high security and protection of the question-bank items, high stability, and reliability. Further, X-Pro Milestone makes initiating, maintaining and archiving question banks and produced exams possible. Putting X-Pro Milestone into real use has shown that it is easy to learn and use effectively. We demonstrate that X-Pro Milestone is a cost-effective alternative to online examining systems, with more and richer features and with low infrastructure requirements.
This research paper analyzes data breaches in the human service sector. The hypothesis is that an increase in information assurance will lead to a significant reduction in data breaches in the human service sector. The hypothesis is tested using data from the United States Department of Health and Human Services data breach notification repository from January 2018 to December 2020. Our results show that without increased mitigation through information assurance, data breaches in the human service sector will continue to increase.
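The trend claim above can be checked with a simple least-squares slope over yearly breach counts. The sketch below uses hypothetical numbers, not the HHS repository figures analyzed in the paper; a positive slope indicates that breaches are increasing over the study window:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Hypothetical yearly breach counts for 2018-2020 (illustrative only)
years = [2018, 2019, 2020]
counts = [40, 55, 73]

slope = ols_slope(years, counts)  # positive -> breaches trending upward
```

In practice one would also test whether the slope is statistically significant before accepting the hypothesis, e.g. with a t-test on the regression coefficient.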
The purpose of this paper is to provide a better understanding of cloud computing and to suggest relevant research paths in this growing field. We also go through the future benefits of cloud computing and the challenges that may lie ahead. Index terms used in this context include cloud, performance, cloud computing, architecture, scale-up, and big data. Cloud computing offers a wide range of architectural configurations, including the number of processors, memory, and nodes. Cloud computing has already changed the way we store, process, and access data, and it is expected to continue to have a significant impact on the future of information technology. Cloud computing enables organizations to scale their IT resources up or down quickly and easily, without the need for costly hardware upgrades. This can help organizations respond more quickly to changing business needs and market conditions. By moving IT resources to the cloud, organizations can reduce their IT infrastructure costs and improve their operational efficiency. Cloud computing also allows organizations to pay only for the resources they use, rather than investing in expensive hardware and software licenses. Cloud providers invest heavily in security and compliance measures, which can help to protect organizations from cyber threats and ensure regulatory compliance. Cloud computing provides a scalable platform for AI and machine learning applications, enabling organizations to build and deploy these technologies more easily and cost-effectively. A task, an application, and its input can take up to 20 times longer or cost 10 times more than optimal. The ready adaptability of cloud products has resulted in a paradigm change.
Previously, an application was optimized for a specific cluster; however, in the cloud, the architectural configuration is tuned for the workload. The evolution of cloud computing from the era of mainframes and dumb terminals has been significant, but there are still many advancements to come. As we look towards the future, IT leaders and the companies they serve will face increasingly complex challenges in order to stay competitive in a constantly evolving cloud computing landscape. Additionally, it will be crucial to remain compliant with existing regulations as well as new regulations that may emerge in the future. It is safe to say that the next decade of cloud computing will be just as dramatic as the last, with many internet services becoming cloud-based and huge enterprises struggling to fund physical infrastructure. Cloud computing is significantly used in business innovation, and because of its agility and adaptability, cloud technology enables new ways of working, operating, and running a business. The service enables users to access files and applications stored in the cloud from anywhere, removing the requirement for users to always be physically close to actual hardware. Cloud computing makes this connection available from anywhere because files are kept on a network of hosted computers that carry data over the internet. Cloud computing has proven to be advantageous to both consumers and corporations. To be more specific, the cloud has altered our way of life. Overall, cloud computing is likely to continue to play a significant role in the future of IT, enabling organizations to become more agile, efficient, and innovative in the face of rapid technological change. This is likely to drive further innovation in AI and machine learning in the coming years.
The COVID-19 pandemic has had a profound influence on education around the world, with schools and institutions shifting to remote learning to ensure the safety of students and faculty. Concerns have been expressed about the impact of virtual learning on student performance and grades. The purpose of this study is to investigate the impact of remote learning on student performance and grades, as well as to investigate the obstacles and benefits of this new educational paradigm. The study will examine current literature on the subject, analyze data from surveys and interviews with students and educators, and investigate potential solutions to improve student performance and participation in virtual classrooms. The study's findings will provide insights into the effectiveness of remote learning and inform ideas to improve student learning and achievement in an educational virtual world. This article also investigates the influence of remote learning on educational institutions, collecting data from students, instructors, and administrators through questionnaires and interviews. The paper will look at the challenges and opportunities that remote learning presents, such as the effect on student involvement, motivation, and academic achievement, as well as changes in teaching styles and technology. The outcomes of this study will provide insights into the effectiveness of remote learning and will inform future decisions about the usage of virtual learning environments in education. The research will also investigate potential solutions to improve the quality of remote education and handle any issues that occur.
Video games have been around for several decades and have seen many advancements since their original start. Video games started as virtual games that were advertised towards children, and these virtual games created a virtual reality across a variety of genres. These genres included sports games, such as tennis, football, and baseball, as well as war games, fantasy, puzzles, etc. These games were derived from a sports genre, and popularity now lies in multiplayer online shooting games. The purpose of this paper is to investigate different types of tools available for cheating in virtual worlds, which give players an undue advantage over other players in a competition. With the advancement in technology, these video games have expanded in the development aspects of gaming. Video game developers have written long lines of code to create a new look for video games. As video games have progressed, the coding, bugs, bots, and errors of video games have changed throughout the years. The coding of video games has branched out from the original video games, which has given many benefits to this virtual world, while simultaneously creating more problems such as bots. Analysis of the tools available for cheating in a game shows how they disadvantage normal gamers in a fair contest.
Over the past decade, open-source software use has grown. Today, many companies, including Google, Microsoft, Meta, RedHat, MongoDB, and Apache, are major contributors to open-source software. With the increased use of open-source software, or its integration into custom-developed software, the quality of these software components grows in importance. This study examined a sample of open-source applications from GitHub. Static software analysis was conducted, and each application was classified for its risk level. Of the analyzed applications, 90% were classified as low risk or moderately low risk, indicating a high level of quality for open-source applications.
The proposed study focuses on the critical issue of corrosion, which leads to significant economic losses and safety risks worldwide. A key area of emphasis is the accuracy of corrosion detection methods. While recent studies have made progress, a common challenge is the low accuracy of existing detection models. These models often struggle to reliably identify corrosion tendencies, which are crucial for minimizing industrial risks and optimizing resource use. The proposed study introduces an innovative approach that significantly improves the accuracy of corrosion detection using a convolutional neural network (CNN), as well as two pretrained models, namely YOLOv8 and EfficientNetB0. By leveraging advanced technologies and methodologies, we have achieved high accuracies in identifying and managing the hazards associated with corrosion across various industrial settings. This advancement not only supports the overarching goals of enhancing safety and efficiency, but also sets a new benchmark for future research in the field. The results demonstrate a significant improvement in the ability to detect and mitigate corrosion-related concerns, providing a more accurate and comprehensive solution for industries facing these challenges. Both CNN and EfficientNetB0 exhibited 100% accuracy, precision, recall, and F1-score, followed by YOLOv8 with respective metrics of 95%, 100%, 90%, and 94.74%. Our approach outperformed the state-of-the-art with similar datasets and methodologies.
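The reported YOLOv8 F1-score follows directly from its precision and recall via the harmonic-mean formula; a quick check reproduces the 94.74% figure:

```python
def f1_score(precision, recall):
    """F1-score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported YOLOv8 metrics: precision 100%, recall 90%
f1 = f1_score(1.00, 0.90)  # 2*1.0*0.9/1.9 = 0.9474 -> 94.74%, matching the paper
```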
Enhancing the security of Wireless Sensor Networks (WSNs) improves the usability of their applications. Therefore, finding solutions to various attacks, such as the blackhole attack, is crucial for the success of WSN applications. This paper proposes an enhanced version of the AODV (Ad Hoc On-Demand Distance Vector) protocol capable of detecting blackholes and malfunctioning benign nodes in WSNs, thereby avoiding them when delivering packets. The proposed version employs a network-based reputation system to select the best and most secure path to a destination. To achieve this goal, the proposed version utilizes the Watchdogs/Pathrater mechanisms in AODV to gather and broadcast reputations to all network nodes to build the network-based reputation system. To minimize the network overhead of the proposed approach, the paper uses reputation aggregator nodes only for forwarding reputation tables. Moreover, to reduce the overhead of updating reputation tables, the paper proposes three mechanisms, which are the prompt broadcast, the regular broadcast, and the light broadcast approaches. The proposed enhanced version has been designed to perform effectively in dynamic environments such as mobile WSNs, where nodes, including blackholes, move continuously, which is considered a challenge for other protocols. Using the proposed enhanced protocol, a node evaluates the security of different routes to a destination and can select the most secure routing path. The paper provides an algorithm that explains the proposed protocol in detail and demonstrates a case study that shows the operations of calculating and updating reputation values when nodes move across different zones. Furthermore, the paper discusses the proposed approach's overhead analysis to prove the proposed enhancement's correctness and applicability.
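The paper's exact reputation calculation is given in its algorithm. As a simplified sketch of the core idea only, a node might rank candidate routes by the reputation of their weakest hop, so a path containing a suspected blackhole is avoided; the route names and reputation values below are hypothetical:

```python
def most_secure_route(routes):
    """Pick the route whose weakest hop has the highest reputation.

    routes: dict mapping route name -> list of per-hop reputation values
    (e.g. as gathered by Watchdog-style observation).
    """
    return max(routes, key=lambda name: min(routes[name]))

# Hypothetical reputation tables for three candidate routes to a destination
routes = {
    "via_A": [0.90, 0.80, 0.85],  # all hops well-reputed
    "via_B": [0.95, 0.20, 0.90],  # one suspected blackhole (0.20) on the path
    "via_C": [0.70, 0.75, 0.70],
}

best = most_secure_route(routes)  # "via_A": weakest hop 0.80 beats 0.20 and 0.70
```

A min-based score is deliberately pessimistic: a single low-reputation hop disqualifies an otherwise strong path, which matches the goal of avoiding blackholes entirely rather than averaging them away.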
With the rapid growth of internet usage, a new situation has been created that enables the practice of bullying. Cyberbullying has increased over the past decade, and it has the same adverse effects as face-to-face bullying, like anger, sadness, anxiety, and fear. With the anonymity people get on the internet, they tend to be more aggressive and express their emotions freely without considering the effects, which can be a reason for the increase in cyberbullying; this is the main motive behind the current study. This study presents a thorough background of cyberbullying and the techniques used to collect, preprocess, and analyze the datasets. Moreover, a comprehensive review of the literature has been conducted to figure out research gaps and effective techniques and practices in cyberbullying detection in various languages, and it was deduced that there is significant room for improvement in the Arabic language. As a result, the current study focuses on the investigation of shortlisted machine learning algorithms in natural language processing (NLP) for the classification of Arabic datasets duly collected from Twitter (also known as X). In this regard, support vector machine (SVM), Naive Bayes (NB), Random Forest (RF), Logistic Regression (LR), Bootstrap Aggregating (Bagging), Gradient Boosting (GBoost), Light Gradient Boosting Machine (LightGBM), Adaptive Boosting (AdaBoost), and eXtreme Gradient Boosting (XGBoost) were shortlisted and investigated due to their effectiveness on similar problems. Finally, the scheme was evaluated by well-known performance measures like accuracy, precision, recall, and F1-score. Consequently, XGBoost exhibited the best performance with 89.95% accuracy, which is promising compared to the state-of-the-art.
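As a minimal illustration of one of the shortlisted baselines, the sketch below implements a multinomial Naive Bayes text classifier from scratch. The training strings are hypothetical English stand-ins; the study's actual Arabic Twitter dataset and preprocessing pipeline are not reproduced here:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train multinomial Naive Bayes on (text, label) pairs.

    Returns per-class log-priors and Laplace-smoothed word log-likelihoods.
    """
    by_label = defaultdict(list)
    for text, label in docs:
        by_label[label].append(text)
    vocab = {w for text, _ in docs for w in text.split()}
    priors, likelihoods = {}, {}
    for label, texts in by_label.items():
        priors[label] = math.log(len(texts) / len(docs))
        counts = Counter(w for t in texts for w in t.split())
        total = sum(counts.values())
        likelihoods[label] = {
            w: math.log((counts[w] + 1) / (total + len(vocab))) for w in vocab
        }
    return priors, likelihoods

def predict(text, priors, likelihoods):
    """Return the class with the highest posterior log-probability."""
    scores = {
        label: priors[label]
        + sum(likelihoods[label].get(w, 0.0) for w in text.split())
        for label in priors
    }
    return max(scores, key=scores.get)

# Tiny hypothetical training set (illustrative labels only)
docs = [
    ("you are a stupid loser", "bullying"),
    ("i hate you go away", "bullying"),
    ("have a wonderful day friend", "clean"),
    ("great game well played", "clean"),
]
priors, likelihoods = train_nb(docs)
label = predict("you stupid loser", priors, likelihoods)
```

The stronger ensemble methods in the shortlist (XGBoost, LightGBM, etc.) replace this generative model with boosted trees over the same kind of bag-of-words or TF-IDF features.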
Obstacle removal in crowd evacuation is critical to safety and the evacuation system efficiency. Recently, many researchers proposed game theoretic models to avoid and remove obstacles for crowd evacuation. Game theoretical models aim to study and analyze the strategic behaviors of individuals within a crowd and their interactions during the evacuation. Game theoretical models have some limitations in the context of crowd evacuation. These models consider a group of individuals as homogeneous objects with the same goals, involve complex mathematical formulation, and cannot model real-world scenarios such as panic, environmental information, crowds that move dynamically, etc. The proposed work presents a game theoretic model integrating an agent-based model to remove the obstacles from exits. The proposed model considered the parameters named: (1) obstacle size, length, and width, (2) removal time, (3) evacuation time, (4) crowd density, (5) obstacle identification, and (6) route selection. The proposed work conducts various experiments considering different conditions, such as obstacle types, obstacle removal, and several obstacles. Evaluation results show the proposed model's effectiveness compared with existing literature in reducing the overall evacuation time, cell selection, and obstacle removal. The study is potentially useful for public safety situations such as emergency evacuations during disasters and calamities.
In the realm of Artificial Intelligence (AI), there exists a complex landscape where promises of efficiency and innovation clash with unforeseen disruptions across Information Technology (IT) and broader societal realms. This paper sets out on a journey to explore the intricate paradoxes inherent in AI, focusing on the unintended consequences that ripple through IT and beyond. Through a thorough examination of literature and analysis of related works, this study aims to shed light on the complexities surrounding the AI paradox. It delves into how this paradox appears in various domains, such as algorithmic biases, job displacement, ethical dilemmas, and privacy concerns. By mapping out these unintended disruptions, this research seeks to offer a nuanced understanding of the challenges brought forth by AI-driven transformations. Ultimately, its goal is to pave the way for the responsible development and deployment of AI, fostering a harmonious integration of technological progress with societal values and priorities.
Pervasive schemes are significant techniques that allow intelligent communication among devices without any human intervention. Recently, the Internet of Vehicles (IoV) has been introduced as one of the applications of pervasive computing that addresses road safety challenges. Vehicles participating within the IoV are embedded with a wide range of sensors which operate in a real-time environment to improve road safety. Various mechanisms have been proposed which allow automatic actions based on uncertainty of sensory and managed data. Due to the lack of existing transportation integration schemes, IoV has not been completely explored by business organizations. In order to tackle this problem, we have proposed a novel trusted mechanism in IoV for communication, sensing, and record storing. Our proposed method uses trust-based analysis and subjective logic functions with the aim of creating a trust environment for vehicles to communicate. In addition, the subjective logic function is integrated with a multi-attribute SAW scheme to improve the decision metrics for authenticating nodes. The trust analysis depends on a variety of metrics to ensure an accurate identification of legitimate vehicles embedded in the IoT devices ecosystem. The proposed scheme is determined and verified rigorously through various IoT devices and decision-making metrics against a baseline solution. The simulation results show that the proposed scheme leads to an 88% improvement in terms of better identification of legitimate nodes, road accidents, and message alteration records during data transmission among vehicles as compared to the baseline approach.
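The multi-attribute SAW (Simple Additive Weighting) scheme mentioned above combines normalized criterion values into a single score per alternative. The sketch below shows the standard SAW computation with hypothetical vehicle attributes and weights, not the paper's actual trust metrics:

```python
def saw_rank(alternatives, weights, benefit):
    """Rank alternatives by Simple Additive Weighting (SAW).

    alternatives: dict of name -> tuple of criterion values.
    weights: one weight per criterion (should sum to 1).
    benefit: True for benefit criteria (higher is better), False for cost.
    """
    names = list(alternatives)
    cols = list(zip(*alternatives.values()))
    scores = {name: 0.0 for name in names}
    for j, col in enumerate(cols):
        for i, name in enumerate(names):
            if benefit[j]:
                norm = col[i] / max(col)   # benefit criterion: divide by column max
            else:
                norm = min(col) / col[i]   # cost criterion: column min over value
            scores[name] += weights[j] * norm
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Hypothetical vehicles scored on (trust value, signal quality), both benefit criteria
vehicles = {"A": (0.9, 0.5), "B": (0.6, 0.9)}
ranking = saw_rank(vehicles, weights=(0.7, 0.3), benefit=(True, True))
# ranking[0] is the best-scoring candidate ("A" here, since trust is weighted 0.7)
```

Because SAW is a weighted sum of normalized values, shifting weight between criteria directly trades off trust against the other attributes when authenticating nodes.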
Autonomic software recovery enables software to automatically detect and recover from software faults. This feature makes the software run more efficiently and actively, and reduces maintenance time and cost. This paper proposes an automated approach for Software Fault Detection and Recovery (SFDR). The SFDR detects cases in which a fault occurs with software components, such as component deletion, replacement, or modification, and recovers the component to enable the software to continue its intended operation. The SFDR is analyzed and implemented in parallel as standalone software at the design phase of the target software. The practical applicability of the proposed approach has been tested by implementing an application demonstrating the performance and effectiveness of the SFDR. The experimental results and the comparisons with other works show the effectiveness of the proposed approach.
The complexity of computer architectures, software, and web applications, their large worldwide spread via the internet, and the rapid increase in the number of users, together with rising maintenance costs, are all factors that have guided many researchers to develop software, web applications, and systems that have the ability of self-healing. The aim of the self-healing software feature is to quickly recover the application and keep it running and available 24/7, as optimally as possible. This survey provides an overview of self-healing software and systems, which are especially useful in all of those situations in which the involvement of humans is costly, recovery is hard, and the process needs to be automated with self-healing. There are different aspects which will help us understand the different benefits of these self-healing systems. Finally, the approaches, techniques, mechanisms, and individual characteristics of self-healing are classified in different tables and then summarized.
Developments in service-oriented architecture (SOA) have taken us near to the once-fictional dream of forming and running an online business, a commercial activity in which most or all of its commercial roles are outsourced to online services. The novel concept of cloud computing gives an understanding of SOA in which Information Technology assets are provided as services that are more flexible, inexpensive, and attractive to commercial activities. In this paper, we concisely study developments in the concept of cloud computing, and discuss the advantages of using cloud services for commercial activities and the trade-offs that they have to consider. Further, we present a layered architecture for online business, and then a conceptual architecture for a complete online business working atmosphere. Moreover, we discuss the prospects and research challenges that are ahead of us in realizing the technical components of this conceptual architecture. We conclude by giving the outlook and impact of cloud services on both large and small businesses.
The total reliance on internet connectivity and World Wide Web (WWW) based services is forcing many organizations to look for alternative solutions for providing adequate access and response time to the demand of their ever-increasing users. A typical solution is to increase the bandwidth; this can be achieved at additional cost, but this solution neither scales nor decreases users' perceived response time. Another concern is the security of the network. An alternative, scalable solution is to deploy a proxy server to provide adequate access and improve response time, as well as provide some level of security for clients using the network. While some studies have reported performance increases due to the use of proxy servers, one study has reported a performance decrease due to a proxy server. We then conducted a six-month proxy server experiment. During this period, we collected access logs from three different proxy servers and analyzed these logs with Webalizer, a web server log file analysis program. A few years later, in September 2010, we collected log files from another proxy server, analyzed the logs using Webalizer, and compared our results. The result of the analysis showed that the hit rate of the proxy servers ranged between 21% and 39%, and over 70% of web pages were dynamic. Furthermore, clients accessing the internet through a proxy server are more secure. We conclude that although the nature of the web is changing, the proxy server is still capable of improving performance by decreasing the response time perceived by web clients and improving network security.
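Hit rate, the key measure in the analysis above, is the fraction of requests served from the proxy cache. The paper used Webalizer for its analysis; as an independent hedged sketch, assuming Squid's native access-log format (the paper does not specify its log format, and the sample lines below are hypothetical), the hit rate can be computed directly from access logs:

```python
def proxy_hit_rate(log_lines):
    """Percentage of requests served from cache, from Squid-style access logs."""
    hits = total = 0
    for line in log_lines:
        fields = line.split()
        if len(fields) < 4:
            continue  # skip malformed lines
        total += 1
        # In Squid's native format the 4th field is the cache result code,
        # e.g. TCP_HIT/200 or TCP_MISS/200; any *HIT code counts as a cache hit.
        if fields[3].split("/")[0].endswith("HIT"):
            hits += 1
    return 100.0 * hits / total if total else 0.0

# Hypothetical log excerpt (illustrative values only)
sample = [
    "1284550000.123 120 10.0.0.5 TCP_HIT/200 2048 GET http://example.com/ - NONE/- text/html",
    "1284550001.456 340 10.0.0.6 TCP_MISS/200 5120 GET http://example.org/a - DIRECT/1.2.3.4 text/html",
    "1284550002.789 210 10.0.0.7 TCP_MISS/200 4096 GET http://example.org/b - DIRECT/1.2.3.4 text/html",
    "1284550003.001 90 10.0.0.5 TCP_MEM_HIT/200 1024 GET http://example.com/ - NONE/- text/html",
]

rate = proxy_hit_rate(sample)  # 2 hits out of 4 requests -> 50.0
```

A real analysis over months of logs would stream the file line by line rather than hold it in memory, but the counting logic is the same.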
Abstract: This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although the terms information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration given to the human element and how people fit into the security process. Cyber security, on the other hand, adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect raises moral questions, since it is becoming more widely accepted that society has a duty to protect its weaker members, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cybersecurity and information security on this larger scale. It also highlights other areas of cybersecurity, including defending people, social norms, and vital infrastructure from threats that arise online, in addition to protecting information and technology. It contends that ethical issues and the human factor are becoming increasingly important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
Abstract: This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it analyzes in depth real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to help businesses protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.
Abstract: A network analyzer can often comprehend many protocols, which enables it to display conversations taking place between hosts over a network. A network analyzer measures the response of a device or network so that the operator can monitor the performance of the network or of an object in an RF circuit. The purpose of the following research is to analyze the capabilities of NetFlow Analyzer to measure various components, including filters, mixers, frequency-sensitive networks, transistors, and other RF-based instruments. NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other network parameter sets, including Y-, Z-, and H-parameters, these instruments are typically employed to measure S-parameters, since the transmission and reflection of electrical networks are simple to calculate at high frequencies. These analyzers are widely employed to characterize two-port networks, such as filters and amplifiers. By allowing the user to view the actual data that is sent over a network, packet by packet, a network analyzer shows what is happening on it. This research also presents a design model of the analyzer used for transmission and reflection measurements. Gain, insertion loss, and the transmission coefficient are measured in transmission measurements, whereas return loss, the reflection coefficient, impedance, and other variables are measured in reflection measurements. These analyzers' operational frequencies range from 1 Hz to 1.5 THz. They can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.
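As a worked illustration of the reflection quantities named above (a sketch, not code from the paper), return loss and input impedance follow directly from a measured reflection coefficient; the 50-ohm reference impedance is an assumption:

```python
import math

def return_loss_db(gamma):
    """Return loss in dB from the magnitude of a reflection coefficient."""
    return -20 * math.log10(abs(gamma))

def input_impedance(gamma, z0=50.0):
    """Input impedance from a reflection coefficient and reference impedance Z0."""
    return z0 * (1 + gamma) / (1 - gamma)

print(return_loss_db(0.1))    # return loss in dB for |gamma| = 0.1
print(input_impedance(0.2))   # impedance of a partially reflective load, Z0 = 50 ohms
```

A perfectly matched load (gamma = 0) gives an input impedance equal to Z0; larger |gamma| means a lower return loss and a worse match.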
Abstract: Aims: This study aims at designing and implementing a syllabus-oriented question-bank system capable of producing paper-based exams with multiple forms, along with answer keys. The developed software tool is named Χ(Chi)-Pro Milestone and supports four types of questions, namely: multiple-choice, true/false, short-answer, and free-response essay questions. The study is motivated by the fact that student numbers in schools and universities are continuously growing at high, non-linear, and uncontrolled rates. This growth, however, is not accompanied by an equivalent growth of educational resources (mainly instructors, classrooms, and labs). A direct result of this situation is a relatively large number of students in each classroom. It is observed that providing and using online examining systems can be intractable and expensive. As an alternative, paper-based exams can be used. One main issue is that manually produced paper-based exams tend to be of low quality because of human factors such as instability and a relatively narrow range of topics [1]. Further, instructors usually need to spend a lot of time and energy composing paper-based exams with multiple forms. Therefore, the use of computers for the automatic production of paper-based exams from question banks is becoming more and more important. Methodology: The design and evaluation of X-Pro Milestone are done by considering a basic set of design principles based on a list of identified functional and non-functional requirements. Deriving those requirements is made possible by developing X-Pro Milestone using the iterative and incremental model from the software engineering domain. Results: We demonstrate that X-Pro Milestone has a number of excellent characteristics compared to the exam-preparation and question-bank tools available on the market. Some of these characteristics are: ease of use and operation, a user-friendly interface with good usability, high security and protection of question-bank items, and high stability and reliability. Further, X-Pro Milestone makes initiating, maintaining, and archiving question banks and produced exams possible. Putting X-Pro Milestone into real use has shown that it is easy to learn and can be used effectively. We demonstrate that X-Pro Milestone is a cost-effective alternative to online examining systems, with more and richer features and low infrastructure requirements.
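One core mechanism such a tool needs is generating multiple exam forms from a single question bank. The sketch below illustrates the idea with seeded random sampling; the function and data names are illustrative and are not X-Pro Milestone's actual API:

```python
import random

def make_exam_forms(question_bank, questions_per_form, num_forms, seed=0):
    """Generate multiple exam forms by sampling questions from a bank.

    Each form is a list of questions in shuffled order. The seed makes the
    forms reproducible, so the same seed regenerates the forms (and hence
    the answer keys) exactly.
    """
    rng = random.Random(seed)
    forms = []
    for _ in range(num_forms):
        forms.append(rng.sample(question_bank, questions_per_form))
    return forms

bank = [f"Q{i}" for i in range(1, 21)]  # a toy bank of 20 questions
forms = make_exam_forms(bank, questions_per_form=10, num_forms=3)
for i, form in enumerate(forms, 1):
    print(f"Form {i}: {form}")
```

In a real system each question would carry its topic, difficulty, and answer key, and sampling would be stratified by syllabus topic rather than uniform.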
Abstract: This research paper analyzes data breaches in the human service sector. The hypothesis for the solution to this problem is that data breaches in the human service sector will be significantly reduced by an increase in information assurance. The hypothesis is tested using data from the United States Department of Health and Human Services data breach notification repository, covering January 2018 to December 2020. Our result shows that without increased information assurance mitigation, data breaches in the human service sector will continue to increase.
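Testing for an increasing trend in breach counts can be sketched as a simple least-squares slope over a monthly series. The counts below are made up for illustration; they are not the HHS repository data used in the paper:

```python
def trend_slope(counts):
    """Least-squares slope of a time series (simple linear regression).

    A positive slope indicates an increasing trend in the series.
    """
    n = len(counts)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(counts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, counts))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical monthly breach counts (illustrative only):
monthly = [3, 4, 4, 6, 5, 7, 8, 9]
print(f"slope = {trend_slope(monthly):.2f} breaches/month")
```

A fuller analysis would add a significance test on the slope, but the sign of the slope already distinguishes a rising series from a flat one.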
Abstract: The purpose of this paper is to provide a better understanding of cloud computing and to suggest relevant research paths in this growing field. We also go through the future benefits of cloud computing and the challenges that may lie ahead. Keywords used in this context include cloud, performance, cloud computing, architecture, scale-up, and big data. Cloud computing offers a wide range of architectural configurations, including the number of processors, memory, and nodes. Cloud computing has already changed the way we store, process, and access data, and it is expected to continue to have a significant impact on the future of information technology. It enables organizations to scale their IT resources up or down quickly and easily, without the need for costly hardware upgrades, which can help them respond more quickly to changing business needs and market conditions. By moving IT resources to the cloud, organizations can reduce their IT infrastructure costs and improve their operational efficiency. Cloud computing also allows organizations to pay only for the resources they use, rather than investing in expensive hardware and software licenses. Cloud providers invest heavily in security and compliance measures, which can help to protect organizations from cyber threats and ensure regulatory compliance. Cloud computing provides a scalable platform for AI and machine learning applications, enabling organizations to build and deploy these technologies more easily and cost-effectively. A poorly configured task, application, and input can take up to 20 times longer, or cost 10 times more, than the optimal configuration. The ready adaptability of cloud products has resulted in a paradigm change: previously, an application was optimized for a specific cluster, whereas in the cloud, the architectural configuration is tuned for the workload.
The evolution of cloud computing from the era of mainframes and dumb terminals has been significant, but there are still many advancements to come. As we look towards the future, IT leaders and the companies they serve will face increasingly complex challenges in staying competitive in a constantly evolving cloud computing landscape. Additionally, it will be crucial to remain compliant with existing regulations as well as new regulations that may emerge. It is safe to say that the next decade of cloud computing will be just as dramatic as the last, as many internet services become cloud-based and huge enterprises struggle to fund physical infrastructure. Cloud computing figures significantly in business innovation; because of its agility and adaptability, cloud technology enables new ways of working, operating, and running a business. The service enables users to access files and applications stored in the cloud from anywhere, removing the requirement for users to always be physically close to the actual hardware. Cloud computing makes this connection available from anywhere because data is kept on a network of hosted computers that carry it over the internet. Cloud computing has proven advantageous to both consumers and corporations; to be more specific, the cloud has altered our way of life. Overall, cloud computing is likely to continue to play a significant role in the future of IT, enabling organizations to become more agile, efficient, and innovative in the face of rapid technological change. This is likely to drive further innovation in AI and machine learning in the coming years.
Abstract: The COVID-19 pandemic has had a profound influence on education around the world, with schools and institutions shifting to remote learning to safeguard the safety of students and faculty. Concerns have been expressed about the impact of virtual learning on student performance and grades. The purpose of this study is to investigate the impact of remote learning on student performance and grades, and on educational institutions more broadly, as well as to examine the obstacles and benefits of this new educational paradigm. The study examines current literature on the subject, analyzes data from questionnaires and interviews with students, instructors, and administrators, and investigates potential solutions to improve student performance and participation in virtual classrooms. It looks at the challenges and opportunities that remote learning presents, such as the effect on student involvement, motivation, and academic achievement, as well as changes in teaching styles and technology. The study's findings will provide insights into the effectiveness of remote learning, inform ideas to improve student learning and achievement in an educational virtual world, and affect future decisions about the usage of virtual learning environments in education. The research also investigates potential solutions to improve the quality of remote education and to handle any issues that arise.
Abstract: Video games have been around for several decades and have advanced considerably since their origins. Video games started as virtual games advertised towards children, creating virtual realities in a variety of genres, including sports games such as tennis, football, and baseball, as well as war games, fantasy, and puzzles. The medium began with the sports genre and is now most popular in multiplayer online shooting games. The purpose of this paper is to investigate the different types of tools available for cheating in virtual worlds, which give players an undue advantage over other players in a competition. With the advancement of technology, the development aspects of these video games have expanded, and developers write long lines of code to create new looks for video games. As video games have progressed, their coding, bugs, bots, and errors have changed throughout the years. The coding of video games has branched out from the originals, bringing many benefits to this virtual world while simultaneously creating more problems, such as bots. Our analysis of the tools available for cheating in a game shows how they disadvantage normal gamers in a fair contest.
Abstract: Over the past decade, open-source software use has grown. Today, many companies, including Google, Microsoft, Meta, RedHat, MongoDB, and Apache, are major open-source contributors. With the increased use of open-source software, and its integration into custom-developed software, the quality of these software components increases in importance. This study examined a sample of open-source applications from GitHub. Static software analytics were conducted, and each application was classified by its risk level. Among the analyzed applications, 90% were classified as low risk or moderately low risk, indicating a high level of quality for open-source applications.
Abstract: The proposed study focuses on the critical issue of corrosion, which leads to significant economic losses and safety risks worldwide. A key area of emphasis is the accuracy of corrosion detection methods. While recent studies have made progress, a common challenge is the low accuracy of existing detection models. These models often struggle to reliably identify corrosion tendencies, which are crucial for minimizing industrial risks and optimizing resource use. The proposed study introduces an innovative approach that significantly improves the accuracy of corrosion detection using a convolutional neural network (CNN), as well as two pretrained models, namely YOLOv8 and EfficientNetB0. By leveraging advanced technologies and methodologies, we have achieved high accuracies in identifying and managing the hazards associated with corrosion across various industrial settings. This advancement not only supports the overarching goals of enhancing safety and efficiency, but also sets a new benchmark for future research in the field. The results demonstrate a significant improvement in the ability to detect and mitigate corrosion-related concerns, providing a more accurate and comprehensive solution for industries facing these challenges. Both CNN and EfficientNetB0 exhibited 100% accuracy, precision, recall, and F1-score, followed by YOLOv8 with respective metrics of 95%, 100%, 90%, and 94.74%. Our approach outperformed the state-of-the-art with similar datasets and methodologies.
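The reported metrics follow mechanically from a confusion matrix. The sketch below reproduces the YOLOv8-style figures (95% accuracy, 100% precision, 90% recall, 94.74% F1) from made-up labels chosen to match, not from the paper's dataset:

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall, and F1 for a binary classification."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# 100% precision with 90% recall arises from, e.g., 9 true positives,
# 1 false negative, and no false positives on a balanced 20-sample set.
y_true = [1] * 10 + [0] * 10
y_pred = [1] * 9 + [0] + [0] * 10
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(acc, prec, rec, round(f1 * 100, 2))  # 0.95 1.0 0.9 94.74
```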
Abstract: Enhancing the security of Wireless Sensor Networks (WSNs) improves the usability of their applications. Therefore, finding solutions to various attacks, such as the blackhole attack, is crucial for the success of WSN applications. This paper proposes an enhanced version of the AODV (Ad Hoc On-Demand Distance Vector) protocol capable of detecting blackholes and malfunctioning benign nodes in WSNs, thereby avoiding them when delivering packets. The proposed version employs a network-based reputation system to select the best and most secure path to a destination. To achieve this goal, the proposed version utilizes the Watchdog/Pathrater mechanisms in AODV to gather and broadcast reputations to all network nodes to build the network-based reputation system. To minimize the network overhead of the proposed approach, the paper uses reputation aggregator nodes only for forwarding reputation tables. Moreover, to reduce the overhead of updating reputation tables, the paper proposes three mechanisms, which are the prompt broadcast, the regular broadcast, and the light broadcast approaches. The proposed enhanced version has been designed to perform effectively in dynamic environments such as mobile WSNs, where nodes, including blackholes, move continuously, which is considered a challenge for other protocols. Using the proposed enhanced protocol, a node evaluates the security of different routes to a destination and can select the most secure routing path. The paper provides an algorithm that explains the proposed protocol in detail and demonstrates a case study that shows the operations of calculating and updating reputation values when nodes move across different zones. Furthermore, the paper discusses the proposed approach's overhead analysis to prove the proposed enhancement's correctness and applicability.
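The core of reputation-based route selection can be sketched as rating each candidate route by its weakest node and choosing the route whose weakest node is most trusted. This is an illustrative simplification of the paper's protocol, with made-up reputation values:

```python
def most_secure_route(routes, reputation):
    """Pick the route whose least-reputable node has the highest reputation.

    routes: list of routes, each a list of node ids.
    reputation: node id -> trust score in [0, 1]; unknown nodes score 0.
    A route through a suspected blackhole (a low-reputation node) is
    penalized by this weakest-link rule.
    """
    def weakest_link(route):
        return min(reputation.get(node, 0.0) for node in route)
    return max(routes, key=weakest_link)

# Node C has a low reputation, e.g. Watchdog observed it dropping packets.
reputation = {"A": 0.9, "B": 0.8, "C": 0.2, "D": 0.85}
routes = [["A", "C"], ["A", "B", "D"]]
print(most_secure_route(routes, reputation))  # avoids the route through C
```

The full protocol additionally propagates and ages these reputation values via aggregator nodes; the sketch only shows the final selection step.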
Abstract: With the rapid growth of internet usage, a new situation has been created that enables the practice of bullying. Cyberbullying has increased over the past decade, and it has the same adverse effects as face-to-face bullying, like anger, sadness, anxiety, and fear. With the anonymity people get on the internet, they tend to be more aggressive and express their emotions freely without considering the effects, which can be a reason for the increase in cyberbullying, and it is the main motive behind the current study. This study presents a thorough background of cyberbullying and the techniques used to collect, preprocess, and analyze the datasets. Moreover, a comprehensive review of the literature has been conducted to figure out research gaps and effective techniques and practices in cyberbullying detection in various languages, and it was deduced that there is significant room for improvement in the Arabic language. As a result, the current study focuses on the investigation of shortlisted machine learning algorithms in natural language processing (NLP) for the classification of Arabic datasets duly collected from Twitter (also known as X). In this regard, support vector machine (SVM), Naive Bayes (NB), Random Forest (RF), Logistic Regression (LR), Bootstrap aggregating (Bagging), Gradient Boosting (GBoost), Light Gradient Boosting Machine (LightGBM), Adaptive Boosting (AdaBoost), and eXtreme Gradient Boosting (XGBoost) were shortlisted and investigated due to their effectiveness on similar problems. Finally, the scheme was evaluated using well-known performance measures such as accuracy, precision, recall, and F1-score. Consequently, XGBoost exhibited the best performance, with 89.95% accuracy, which is promising compared to the state-of-the-art.
Abstract: Obstacle removal in crowd evacuation is critical to safety and the efficiency of the evacuation system. Recently, many researchers have proposed game theoretic models to avoid and remove obstacles for crowd evacuation. Game theoretical models aim to study and analyze the strategic behaviors of individuals within a crowd and their interactions during the evacuation. Game theoretical models have some limitations in the context of crowd evacuation: they consider a group of individuals as homogeneous objects with the same goals, involve complex mathematical formulation, and cannot model real-world scenarios such as panic, environmental information, crowds that move dynamically, etc. The proposed work presents a game theoretic model integrating an agent-based model to remove the obstacles from exits. The proposed model considers the following parameters: (1) obstacle size, length, and width, (2) removal time, (3) evacuation time, (4) crowd density, (5) obstacle identification, and (6) route selection. The proposed work conducts various experiments considering different conditions, such as obstacle types, obstacle removal, and several obstacles. Evaluation results show the proposed model's effectiveness compared with the existing literature in reducing the overall evacuation time, cell selection, and obstacle removal. The study is potentially useful for public safety situations such as emergency evacuations during disasters and calamities.
Abstract: In the realm of Artificial Intelligence (AI), there exists a complex landscape where promises of efficiency and innovation clash with unforeseen disruptions across Information Technology (IT) and broader societal realms. This paper sets out to explore the intricate paradoxes inherent in AI, focusing on the unintended consequences that ripple through IT and beyond. Through a thorough examination of the literature and an analysis of related works, this study aims to shed light on the complexities surrounding the AI paradox. It delves into how this paradox appears in various domains, such as algorithmic biases, job displacement, ethical dilemmas, and privacy concerns. By mapping out these unintended disruptions, this research seeks to offer a nuanced understanding of the challenges brought forth by AI-driven transformations. Ultimately, its goal is to pave the way for the responsible development and deployment of AI, fostering a harmonious integration of technological progress with societal values and priorities.
Funding: Funded by the Abu Dhabi University Faculty Research Incentive Grant (19300483 – Adel Khelifi), United Arab Emirates. Link to sponsor website: https://www.adu.ac.ae/research/research-at-adu/overview.
Abstract: Pervasive schemes are significant techniques that allow intelligent communication among devices without any human intervention. Recently, the Internet of Vehicles (IoV) has been introduced as one of the applications of pervasive computing that addresses road safety challenges. Vehicles participating within the IoV are embedded with a wide range of sensors which operate in a real-time environment to improve road safety. Various mechanisms have been proposed which allow automatic actions based on the uncertainty of sensory and managed data. Due to the lack of existing transportation integration schemes, the IoV has not been completely explored by business organizations. In order to tackle this problem, we have proposed a novel trusted mechanism for the IoV covering communication, sensing, and record storing. Our proposed method uses trust-based analysis and subjective logic functions with the aim of creating a trusted environment for vehicles to communicate. In addition, the subjective logic function is integrated with a multi-attribute SAW scheme to improve the decision metrics for authenticating nodes. The trust analysis depends on a variety of metrics to ensure accurate identification of legitimate vehicles embedded in the IoT device ecosystem. The proposed scheme is determined and verified rigorously through various IoT devices and decision-making metrics against a baseline solution. The simulation results show that the proposed scheme leads to an 88% improvement in terms of better identification of legitimate nodes, road accidents, and message alteration records during data transmission among vehicles as compared to the baseline approach.
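The multi-attribute SAW (Simple Additive Weighting) scheme mentioned above scores each alternative by a weighted sum of normalized attribute values. The sketch below is a generic SAW implementation; the vehicle attributes, weights, and values are illustrative, not the paper's data:

```python
def saw_scores(alternatives, weights, benefit):
    """Simple Additive Weighting (SAW) over a decision matrix.

    alternatives: name -> list of raw attribute values.
    weights: one weight per attribute (should sum to 1).
    benefit: per attribute, True if higher is better, False for cost criteria.
    Benefit values are normalized against the column max, cost values
    against the column min, then combined as a weighted sum.
    """
    cols = list(zip(*alternatives.values()))
    scores = {}
    for name, row in alternatives.items():
        s = 0.0
        for j, w in enumerate(weights):
            norm = row[j] / max(cols[j]) if benefit[j] else min(cols[j]) / row[j]
            s += w * norm
        scores[name] = s
    return scores

# Illustrative trust attributes for three vehicles:
# (signal quality, reported accuracy, message delay) - delay is a cost.
vehicles = {"v1": [0.9, 0.8, 20], "v2": [0.7, 0.9, 10], "v3": [0.5, 0.6, 40]}
scores = saw_scores(vehicles, weights=[0.4, 0.4, 0.2], benefit=[True, True, False])
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

In the paper's setting the SAW score would be combined with the subjective logic opinion about each node before authentication is granted.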
Abstract: Autonomic software recovery enables software to automatically detect and recover from software faults. This feature makes the software run more efficiently and actively, and reduces maintenance time and cost. This paper proposes an automated approach for Software Fault Detection and Recovery (SFDR). The SFDR detects faults that occur in software components, such as component deletion, replacement, or modification, and recovers the component so that the software can continue its intended operation. The SFDR is analyzed and implemented in parallel as standalone software at the design phase of the target software. The practical applicability of the proposed approach has been tested by implementing an application demonstrating the performance and effectiveness of the SFDR. The experimental results and comparisons with other works show the effectiveness of the proposed approach.
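Detecting component deletion or modification and restoring the component can be sketched with stored checksums and a backup copy. This is an illustrative mechanism in the spirit of the abstract, not the paper's actual SFDR implementation:

```python
import hashlib
import os
import shutil
import tempfile

def fingerprint(path):
    """SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def check_and_recover(component, backup, expected_hash):
    """Detect deletion or modification of a component and restore it.

    Returns the fault detected: 'deleted', 'modified', or None if healthy.
    """
    if not os.path.exists(component):
        shutil.copy(backup, component)
        return "deleted"
    if fingerprint(component) != expected_hash:
        shutil.copy(backup, component)
        return "modified"
    return None

# Demo with a toy "component" file in a temporary directory.
with tempfile.TemporaryDirectory() as d:
    comp = os.path.join(d, "component.bin")
    backup = os.path.join(d, "component.bak")
    with open(comp, "wb") as f:
        f.write(b"original code")
    shutil.copy(comp, backup)
    good = fingerprint(comp)

    with open(comp, "wb") as f:  # simulate a fault: the component is tampered with
        f.write(b"tampered")
    print(check_and_recover(comp, backup, good))  # modified
    print(check_and_recover(comp, backup, good))  # None - already recovered
```

A production system would run such checks periodically from a separate watchdog process, which matches the abstract's point that the SFDR runs in parallel with the target software.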
Abstract: The complexity of computer architectures, software, and web applications, their worldwide spread over the internet, and the rapid increase in the number of users, together with rising maintenance costs, are all factors that have guided many researchers to develop software, web applications, and systems that have the ability to self-heal. The aim of the self-healing software feature is to recover the application quickly and keep it running and available 24/7, as optimally as possible. This survey provides an overview of self-healing software and systems, which are especially useful in situations in which human involvement in recovery is costly and difficult and therefore needs to be automated with self-healing. Different aspects are covered to convey the different benefits of these self-healing systems. Finally, the approaches, techniques, mechanisms, and individual characteristics of self-healing are classified in different tables and then summarized.
Abstract: Developments in service-oriented architecture (SOA) have taken us near to the once fictional dream of forming and running an online business: a commercial activity in which most or all of its commercial roles are outsourced to online services. The novel concept of cloud computing gives an understanding of SOA in which Information Technology assets are provided as services that are more flexible, inexpensive, and attractive to commercial activities. In this paper, we concisely survey developments in the concept of cloud computing, and discuss the advantages of using cloud services for commercial activities and the trade-offs that they have to consider. We further present a layered architecture for online business, followed by a conceptual architecture for a complete online business working environment. Moreover, we discuss the prospects and research challenges that lie ahead in realizing the technical components of this conceptual architecture. We conclude by giving the outlook and impact of cloud services on both large and small businesses.
Abstract: The total reliance on internet connectivity and World Wide Web (WWW) based services is forcing many organizations to look for alternative solutions for providing adequate access and response time to the demands of their ever-increasing users. A typical solution is to increase the bandwidth; this can be achieved at additional cost, but it neither scales nor decreases users' perceived response time. Another concern is the security of the network. An alternative, scalable solution is to deploy a proxy server to provide adequate access and improve response time, as well as provide some level of security for clients using the network. While some studies have reported performance increases due to the use of proxy servers, one study has reported a performance decrease due to a proxy server. We therefore conducted a six-month proxy server experiment. During this period, we collected access logs from three different proxy servers and analyzed these logs with Webalizer, a web server log file analysis program. A few years later, in September 2010, we collected log files from another proxy server, analyzed the logs using Webalizer, and compared our results. The analysis showed that the hit rate of the proxy servers ranged between 21% and 39% and that over 70% of web pages were dynamic. Furthermore, clients accessing the internet through a proxy server are more secure. We conclude that although the nature of the web is changing, the proxy server is still capable of improving performance by decreasing the response time perceived by web clients and improving network security.
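The hit rate reported above is the fraction of requests a proxy serves from its cache. As a sketch (Webalizer computes this from access logs; the log lines below are illustrative, abbreviated Squid-style entries), it can be derived from cache result codes:

```python
def cache_hit_rate(log_lines):
    """Fraction of requests served from cache, from Squid-style result codes."""
    hits = sum(1 for line in log_lines
               if "TCP_HIT" in line or "TCP_MEM_HIT" in line)
    total = len(log_lines)
    return hits / total if total else 0.0

# Illustrative access-log excerpts (abbreviated Squid format):
logs = [
    "1284000001.000 23 10.0.0.5 TCP_HIT/200 4512 GET http://example.com/a.css",
    "1284000002.000 310 10.0.0.6 TCP_MISS/200 10234 GET http://example.com/page.php",
    "1284000003.000 18 10.0.0.5 TCP_MEM_HIT/200 812 GET http://example.com/logo.png",
    "1284000004.000 295 10.0.0.7 TCP_MISS/200 9921 GET http://example.com/search?q=x",
]
print(f"hit rate: {cache_hit_rate(logs):.0%}")  # 2 of 4 requests -> 50%
```

Dynamic pages (like the `.php` and `?q=` requests above) are typically cache misses, which is why a web that is over 70% dynamic caps the achievable hit rate.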