This paper examines how cyber security is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration of the human element and how people fit into the security process. Cyber security, on the other hand, adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect raises moral questions, since it is becoming more widely accepted that society has a duty to protect its weaker members, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cyber security and information security on a larger scale. The study also highlights other areas of cyber security, which include defending people, social norms, and vital infrastructure from threats that arise online, in addition to protecting information and technology. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it analyzes in depth real-life case studies illustrating successful cloud migrations, and it highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses on protecting themselves from cloud data breaches and by providing insights into selecting a suitable cloud services provider from an information security perspective.
In the realm of Artificial Intelligence (AI), there exists a complex landscape where promises of efficiency and innovation clash with unforeseen disruptions across Information Technology (IT) and broader societal realms. This paper sets out on a journey to explore the intricate paradoxes inherent in AI, focusing on the unintended consequences that ripple through IT and beyond. Through a thorough examination of literature and analysis of related works, this study aims to shed light on the complexities surrounding the AI paradox. It delves into how this paradox appears in various domains, such as algorithmic biases, job displacement, ethical dilemmas, and privacy concerns. By mapping out these unintended disruptions, this research seeks to offer a nuanced understanding of the challenges brought forth by AI-driven transformations. Ultimately, its goal is to pave the way for the responsible development and deployment of AI, fostering a harmonious integration of technological progress with societal values and priorities.
A network analyzer can often comprehend many protocols, which enables it to display conversations taking place between hosts over a network. A network analyzer analyzes the device or network response and takes measurements that let the operator keep an eye on the performance of the network or of an object in an RF circuit. The purpose of the following research includes analyzing the capabilities of NetFlow Analyzer to measure various parts, including filters, mixers, frequency-sensitive networks, transistors, and other RF-based instruments. NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other types of network parameter sets, including Y-, Z-, and H-parameters, these instruments are typically employed to measure S-parameters, since transmission and reflection of electrical networks are simple to calculate at high frequencies. These analyzers are widely employed to characterize two-port networks, including filters and amplifiers. By allowing the user to view the actual data that is sent over a network, packet by packet, a network analyzer shows the operator what is happening on the network. This research also presents the design model of NetFlow Analyzer used for measurements involving transmission and reflection. Gain, insertion loss, and the transmission coefficient are measured in transmission measurements, whereas return loss, the reflection coefficient, impedance, and other variables are measured in reflection measurements. These analyzers' operational frequencies range from 1 Hz to 1.5 THz. These analyzers can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.
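As an illustrative aside (not part of the paper): the transmission- and reflection-side quantities named in this abstract follow directly from the measured S-parameters. Below is a minimal Python sketch, assuming a two-port network with measured complex S11 and S21 and a 50-ohm reference impedance; the function names are ours, not the analyzer's.

```python
import math

def reflection_metrics(s11: complex, z0: float = 50.0) -> dict:
    """Reflection-side quantities derived from a measured S11."""
    return {
        "reflection_coefficient": s11,
        "return_loss_dB": -20 * math.log10(abs(s11)),        # RL = -20*log10(|S11|)
        "input_impedance_ohms": z0 * (1 + s11) / (1 - s11),  # Zin from S11
    }

def transmission_metrics(s21: complex) -> dict:
    """Transmission-side quantities derived from a measured S21."""
    gain_db = 20 * math.log10(abs(s21))                      # negative => loss
    return {
        "transmission_coefficient": s21,
        "gain_dB": gain_db,
        "insertion_loss_dB": -gain_db,                       # IL = -20*log10(|S21|)
    }

# Example: a filter in its passband with a small input mismatch.
print(reflection_metrics(0.10 + 0.05j))
print(transmission_metrics(0.90 + 0.10j))
```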
The purpose of this paper is to provide a better understanding of cloud computing and to suggest relevant research paths in this growing field. We also go through the future benefits of cloud computing and the possible challenges ahead. Cloud, performance, cloud computing, architecture, scale-up, and big data are all terms used in this context. Cloud computing offers a wide range of architectural configurations, including the number of processors, memory, and nodes. Cloud computing has already changed the way we store, process, and access data, and it is expected to continue to have a significant impact on the future of information technology. Cloud computing enables organizations to scale their IT resources up or down quickly and easily, without the need for costly hardware upgrades. This can help organizations respond more quickly to changing business needs and market conditions. By moving IT resources to the cloud, organizations can reduce their IT infrastructure costs and improve their operational efficiency. Cloud computing also allows organizations to pay only for the resources they use, rather than investing in expensive hardware and software licenses. Cloud providers invest heavily in security and compliance measures, which can help to protect organizations from cyber threats and ensure regulatory compliance. Cloud computing provides a scalable platform for AI and machine learning applications, enabling organizations to build and deploy these technologies more easily and cost-effectively. Depending on the chosen configuration, a task, an application, and its input can take up to 20 times longer or cost 10 times more than optimal. The ready adaptability of cloud products has resulted in a paradigm change: previously, an application was optimized for a specific cluster, whereas in the cloud, the architectural configuration is tuned for the workload. The evolution of cloud computing from the era of mainframes and dumb terminals has been significant, but there are still many advancements to come. As we look towards the future, IT leaders and the companies they serve will face increasingly complex challenges in order to stay competitive in a constantly evolving cloud computing landscape. Additionally, it will be crucial to remain compliant with existing regulations as well as new regulations that may emerge in the future. It is safe to say that the next decade of cloud computing will be just as dramatic as the last, as many internet services become cloud-based and huge enterprises struggle to fund physical infrastructure. Cloud computing is used significantly in business innovation, and because of its agility and adaptability, cloud technology enables new ways of working, operating, and running a business. The service enables users to access files and applications stored in the cloud from anywhere, removing the requirement for users to always be physically close to actual hardware. Cloud computing makes this connection available from anywhere because files and applications are kept on a network of hosted computers that carry data over the internet. Cloud computing has proven to be advantageous to both consumers and corporations. To be more specific, the cloud has altered our way of life.
Overall, cloud computing is likely to continue to play a significant role in the future of IT, enabling organizations to become more agile, efficient, and innovative in the face of rapid technological change. This is likely to drive further innovation in AI and machine learning in the coming years.
The COVID-19 pandemic has had a profound influence on education around the world, with schools and institutions shifting to remote learning to safeguard students and faculty. Concerns have been expressed about the impact of virtual learning on student performance and grades. The purpose of this study is to investigate the impact of remote learning on student performance and grades, as well as the obstacles and benefits of this new educational paradigm. The study will examine current literature on the subject, analyze data from surveys and interviews with students and educators, and investigate potential solutions to improve student performance and participation in virtual classrooms. The study's findings will provide insights into the effectiveness of remote learning and inform ideas to improve student learning and achievement in an educational virtual world. This article also investigates the influence of remote learning on both students and educational institutions. The project will examine existing literature on the subject and collect data from students, instructors, and administrators through questionnaires and interviews. The paper will look at the challenges and opportunities that remote learning presents, such as the effect on student involvement, motivation, and academic achievement, as well as changes in teaching styles and technology. The outcomes of this study will provide insights into the effectiveness of remote learning and will inform future decisions about the usage of virtual learning environments in education. The research will also investigate potential solutions to improve the quality of remote education and handle any issues that arise.
Video games have been around for several decades and have seen many advancements since their origin. Video games started as virtual games marketed toward children, and these virtual games created a virtual reality across a variety of genres. These genres included sports games, such as tennis, football, and baseball, as well as war games, fantasy, puzzles, and more. These games began in the sports genre, and their popularity now lies in multiplayer online shooting games. The purpose of this paper is to investigate the different types of tools available for cheating in virtual worlds, which give players an undue advantage over other players in a competition. With the advancement of technology, video games have expanded in their development aspects. Video game developers have written long lines of code to create new looks for video games. As video games have progressed, the coding, bugs, bots, and errors of video games have changed throughout the years. The coding of video games has branched out from the original video games, which has given many benefits to this virtual world, while simultaneously creating more problems, such as bots. The analysis shows that tools available for cheating in a game disadvantage normal gamers in a fair contest.
System analysis and design (SAD) is a crucial process in the development of software systems. The impact of modeling techniques and software engineering practices on SAD has been the focus of research for many years. Two such techniques that have had a significant impact on SAD are Unified Modeling Language (UML) and machine learning. UML has been used to model the structure and behavior of software systems, while machine learning has been used to automatically learn patterns in data and make predictions. The purpose of this paper is to review the literature on the impact of UML and machine learning on SAD. We summarize the findings from several studies and highlight the key insights related to the benefits and limitations of these techniques for SAD. Our review shows that both UML and machine learning have had a positive impact on SAD, with UML improving communication and documentation, and machine learning improving the accuracy of predictions. However, there are also challenges associated with their use, such as the need for expertise and the difficulty of interpreting machine learning models. Our findings suggest that a combination of UML and machine learning can enhance SAD by leveraging the strengths of each technique.
In fields such as science and engineering, virtual environments are commonly used to provide replacements for practical hands-on laboratories. Sometimes these environments take the form of a remote interface to the physical laboratory apparatus, and at other times they take the form of a complete software implementation that simulates the laboratory apparatus. In this paper, we report on the use of a semi-immersive 3D mobile Augmented Reality (mAR) interface and limited simulations as a replacement for practical hands-on laboratories in science and engineering. The 3D-mAR based interface implementations for three different experiments (from micro-electronics, power, and communications engineering) are presented; the discovered limitations are discussed along with the results of an evaluation by science and engineering students from two different institutions and plans for future work.
The cloud computing paradigm is a service-oriented system that delivers services to the customer at low cost. Cloud computing needs to address three main security issues: confidentiality, integrity, and availability. In this paper, we propose a user identity management protocol for cloud computing customers and cloud service providers. This protocol will authenticate and authorize customers/providers in order to achieve global security networks. The protocol will be developed to achieve the set global security objectives in cloud computing environments. Confidentiality, integrity, and availability are the key challenges for web services and utility providers. A layered protocol design is proposed for cloud computing systems, spanning the physical, network, and application layers. Each layer will integrate existing security features such as firewalls, NIDS, NIPS, Anti-DDoS, and others to prevent security threats and attacks. System vulnerability is critical to cloud computing facilities; the proposed protocol will address this as part of measures to secure data at all levels. The protocol will protect the infrastructure of customers and cloud service providers by preventing unauthorized users from gaining access to the service/facility.
The exponential growth of World Wide Web (WWW) users has made the deployment of proxy servers popular on networks with limited resources. WWW clients perceive better response time, improved performance, and speed when responses to requested pages are served from the cache of a proxy server, resulting in faster response times after the first document fetch. This work proposes cyclic multicast as a scalable technique for improving proxy server performance for next-generation networks. The proposed system uses a cyclic multicast engine for the delivery of popular web pages from the proxy server cache to an increasingly large user base under limited server capacity and network resources. The cyclic multicast technique would be more efficient for the delivery of highly requested web pages from the cache to a large number of receivers. We describe the operation of the cyclic multicast proxy server and characterize the gains in performance.
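To make the delivery mechanism concrete, here is a minimal sketch (our illustration, not the paper's engine) of a sender that cycles through the most popular cached pages and pushes each onto a UDP multicast group; receivers that join the group pick a page up on its next pass, so server load stays flat as the audience grows. The group address, port, and page set are hypothetical.

```python
import itertools
import socket
import time

MCAST_GROUP = "224.1.1.1"  # hypothetical multicast group for the example
MCAST_PORT = 5007

def cyclic_multicast(popular_pages: dict, period: float = 0.5) -> None:
    """Cycle endlessly through cached pages, multicasting one per pass.

    popular_pages maps a URL to its cached body (bytes); in a real proxy
    this would be the hottest cache entries, refreshed as popularity shifts.
    Large pages would need fragmenting to fit UDP datagrams; omitted here.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    for url in itertools.cycle(popular_pages):
        payload = url.encode() + b"\n" + popular_pages[url]
        sock.sendto(payload, (MCAST_GROUP, MCAST_PORT))
        time.sleep(period)  # pace the cycle so receivers can keep up

cyclic_multicast({"/index.html": b"<html>...</html>",
                  "/news.html": b"<html>...</html>"})
```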
The total reliance on internet connectivity and World Wide Web (WWW) based services is forcing many organizations to look for alternative solutions for providing adequate access and response time to the demands of their ever-increasing users. A typical solution is to increase the bandwidth; this can be achieved at additional cost, but this solution neither scales nor decreases users' perceived response time. Another concern is the security of the network. An alternative, scalable solution is to deploy a proxy server to provide adequate access and improve response time, as well as provide some level of security for clients using the network. While some studies have reported performance increases due to the use of proxy servers, one study has reported a performance decrease due to a proxy server. We then conducted a six-month proxy server experiment. During this period, we collected access logs from three different proxy servers and analyzed these logs with Webalizer, a web server log file analysis program. A few years later, in September 2010, we collected log files from another proxy server, analyzed the logs using Webalizer, and compared our results. The analysis showed that the hit rate of the proxy servers ranged between 21% and 39% and that over 70% of web pages were dynamic. Furthermore, clients accessing the internet through a proxy server are more secure. We conclude that although the nature of the web is changing, the proxy server is still capable of improving performance by decreasing the response time perceived by web clients and by improving network security.
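For readers who want to reproduce the hit-rate figure, the sketch below shows the calculation under an assumption we are making for illustration: Squid-style access logs, in which the fourth whitespace-separated field carries a result code such as TCP_HIT/200 or TCP_MISS/200 (Webalizer computes its statistics differently; this log format is not taken from the paper).

```python
def proxy_hit_rate(log_path: str) -> float:
    """Fraction of requests served from the cache, from a Squid-style access log."""
    hits = total = 0
    with open(log_path) as log:
        for line in log:
            fields = line.split()
            if len(fields) < 4:
                continue              # skip blank or malformed lines
            total += 1
            if "HIT" in fields[3]:    # e.g. TCP_HIT/200, TCP_MEM_HIT/200
                hits += 1
    return hits / total if total else 0.0

# Printed as a percentage, comparable to the 21% - 39% range reported above.
print(f"hit rate: {proxy_hit_rate('access.log'):.1%}")
```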
The proliferation of web services and users' appeal for high scalability, availability, and reliability of web servers demand rapid response and high throughput for clients' requests occurring at any time. Distributed Web Servers (DWSs) provide an effective solution for improving the quality of web services. This paper addresses unregulated job/task migration among the servers. Considering distributed web services with several servers running, a lot of bandwidth is wasted due to unnecessary job migration. With bandwidth optimization in mind, it is important to develop a policy that addresses bandwidth consumption while loads/tasks are being transferred among the servers. The goal of this work is to regulate this movement to minimize bandwidth consumption. In the literature, little or no attention has been given to this problem, making it difficult to implement some of these policies/schemes in bandwidth-scarce environments. Our policy, "Cooperative Adaptive Symmetrical Initiated Dynamic/Diffusion (CASID)", was developed using the Java Agent DEvelopment Framework (JADE), an agent-based, service-oriented middleware environment. The software was used to simulate events (job distribution) on the servers. With no job transfer allowed when all servers are busy, any overloaded server processes jobs internally to completion. We achieved this by having two different kinds of agents: static cognitive agents and dynamic cognitive agents. The results were compared with existing schemes. The CASID policy outperforms the PLB scheme in terms of response time and system throughput.
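The regulation idea can be illustrated with a small sketch; this is our rendering of a bandwidth-aware diffusion rule, not the CASID implementation, and all names and thresholds are hypothetical. A server offers a job to a neighbour only when it is overloaded, some neighbour has spare capacity, and the job is small enough to be worth the transfer; when every server is busy, the job is processed locally, as the abstract specifies.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Server:
    name: str
    queue_len: int   # jobs currently queued
    capacity: int    # queue length at which the server counts as busy

    @property
    def busy(self) -> bool:
        return self.queue_len >= self.capacity

def choose_migration_target(local: Server, neighbours: List[Server],
                            job_size_kb: float,
                            max_transfer_kb: float = 512.0) -> Optional[Server]:
    """Pick a neighbour to receive one job, or None to process it locally."""
    if not local.busy or job_size_kb > max_transfer_kb:
        return None                               # not overloaded, or too costly to move
    idle = [s for s in neighbours if not s.busy]
    if not idle:
        return None                               # all servers busy: run to completion locally
    return min(idle, key=lambda s: s.queue_len)   # least-loaded neighbour wins

overloaded = Server("web1", queue_len=12, capacity=10)
peers = [Server("web2", 3, 10), Server("web3", 9, 10)]
target = choose_migration_target(overloaded, peers, job_size_kb=40.0)
print(target.name if target else "process locally")  # -> web2
```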
The lack of current network dynamics studies that evaluate the effects of new application and protocol deployment, the lack of long-term studies that observe the effect of incremental changes on the Internet, and the change in the overall stability of the Internet under various conditions and threats have made network monitoring challenging. A good understanding of the nature and type of network traffic is the key to solving congestion problems. In this paper we describe the architecture and implementation of a scalable network traffic monitoring and analysis system. The gigabit interface on the monitoring system was configured to capture network traffic, and the Multi Router Traffic Grapher (MRTG) and Webalizer produce graphical and detailed traffic analysis. This system is in use at Obafemi Awolowo University, Ile-Ife, Nigeria; we describe how this system can be replicated in another environment.
The broadcast nature of wireless networks makes traditional link-layer attacks readily available to anyone within range of the network. User authentication is the best safeguard against the risk of unauthorized access to wireless networks. The present 802.1X authentication scheme has some flaws, making mutual authentication impossible and leaving it open to man-in-the-middle attacks. These characteristics mean that traditional cryptographic mechanisms provide weak security for the wireless environment. We propose the use of mobile agents to provide dependable Internet service delivery to users, which will guarantee secure authentication in wireless networks; we examine the feasibility of our solution and propose a model for wireless network security.
The framework that Information Technology professionals and network organizations use is often seen as open and dynamic. This can create many different pathways for cybercriminals to launch an attack on an enterprise network and cause panic; this situation could be prevented. Using the proposed framework, network administrators and networked organizations can improve their cybersecurity framework for future consumer networks. Implementing a network security plan that is up to date and outlines the responsibilities of team members, creating a government subsidy to implement and increase safeguards on US-based networks, and analyzing the metadata of past cyber-attacks to further understand the attacks that are causing problems for consumer networks can improve the cybersecurity framework for consumer networks and increase potential security on US-based networks. Research found that the implementation of security plans, the creation of a government subsidy, and the analysis of past metadata all show signs of improving the framework of cybersecurity in consumer-based networks.
This research paper analyzes data breaches in the human service sector. The hypothesis for the solution to this problem is that there will be a significant reduction in data breaches in the human service sector due to an increase in information assurance. The hypothesis is tested using data from the United States Department of Health and Human Services data breach notification repository from January 2018 to December 2020. Our results show that without increased mitigation through information assurance, data breaches in the human service sector will continue to increase.
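A minimal sketch of the kind of yearly trend count behind this result is shown below; the file name and the 'Breach Submission Date' column layout are assumptions for illustration, not a documented schema of the HHS repository.

```python
import csv
from collections import Counter
from datetime import datetime

def breaches_per_year(csv_path: str) -> Counter:
    """Count reported breaches per year from a CSV export of the repository.

    Assumes (hypothetically) one row per breach and a 'Breach Submission Date'
    column formatted MM/DD/YYYY.
    """
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            date = datetime.strptime(row["Breach Submission Date"], "%m/%d/%Y")
            if 2018 <= date.year <= 2020:
                counts[date.year] += 1
    return counts

counts = breaches_per_year("hhs_breaches.csv")
for year in sorted(counts):
    # A rising sequence here is the pattern the paper's conclusion points to.
    print(year, counts[year])
```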
Li-Fi, also known as light fidelity, is a new technology that could alleviate some network congestion for the ever-increasing Internet of Things (IoT). The patent for Li-Fi was created by German physicist Harald Haas in 2011 around visible light communication. The purpose of the following research includes examining the capabilities of Li-Fi technologies and how the implementation of a Li-Fi network can improve network infrastructure. A main point is to highlight the advantages that Li-Fi technology brings in comparison to traditional Wi-Fi networks, such as increased bandwidth frequency, faster transmission speeds, and immunity to network latency caused by high traffic. Benefits that Li-Fi technologies provide to network infrastructure include lower energy use, the need for fewer components to operate, and the simplicity of only needing a light source to carry high-speed internet traffic. Some of our research shows the implementation of these systems and how they can serve different consumer needs. The research gives a complete picture of hybrid indoor systems based on Li-Fi and Wi-Fi, indicating how Li-Fi technology raises the possibilities of fulfilling future technological demand. It also explains the security concerns of Li-Fi technology, and the technology can be considered secure after updating some system protocols. At present, Li-Fi lacks the infrastructure that Wi-Fi has, which makes replacement impractical. Rather, Li-Fi can be seen as complementary to Wi-Fi and used to improve current technology.