In today’s highly competitive retail industry, offline stores face increasing pressure on profitability and hope to improve their shelf management capabilities with the help of big data technology. To this end, on-shelf availability is an essential indicator of shelf data management and is closely related to customer purchase behavior. RFM (recency, frequency, and monetary) pattern mining is a powerful tool for evaluating the value of customer behavior. However, existing RFM pattern mining algorithms do not consider the quarterly nature of goods, resulting in unreasonable shelf availability and difficulty in profit-making. To solve this problem, we propose a quarterly RFM mining algorithm for on-shelf products named OS-RFM. Our algorithm mines high-recency, high-frequency, and high-monetary patterns and considers the period of on-shelf goods in quarterly units. We conducted experiments on two real datasets, with numerical and graphical analysis, to demonstrate the algorithm’s effectiveness. Compared with the state-of-the-art RFM mining algorithm, our algorithm identifies more patterns and performs well in terms of precision, recall, and F1-score, with the recall rate nearing 100%. The novel algorithm also runs significantly faster and uses memory more stably than existing mining algorithms. Additionally, we analyze the sales trends of products in different quarters and their seasonal variations. The analysis assists businesses in maintaining reasonable on-shelf availability and achieving greater profitability.
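The abstract does not spell out how the quarterly RFM measures are computed, so the following is only a minimal sketch of per-quarter recency, frequency, and monetary aggregation over a generic transaction table; the column names and toy data are assumptions, not the authors’ OS-RFM algorithm.

```python
# Minimal quarterly RFM sketch (illustrative; not the OS-RFM algorithm itself).
# Assumed input: a transaction table with customer, product, date, and amount columns.
import pandas as pd

transactions = pd.DataFrame({
    "customer": ["c1", "c1", "c2", "c2", "c3"],
    "product":  ["p1", "p2", "p1", "p1", "p2"],
    "date":     pd.to_datetime(["2023-01-05", "2023-02-11", "2023-01-20",
                                "2023-04-02", "2023-05-30"]),
    "amount":   [12.0, 30.0, 8.0, 15.0, 22.0],
})

transactions["quarter"] = transactions["date"].dt.to_period("Q")

def quarterly_rfm(df: pd.DataFrame) -> pd.DataFrame:
    """Compute recency, frequency, and monetary value per (customer, quarter)."""
    rfm = (df.groupby(["customer", "quarter"])
             .agg(last_purchase=("date", "max"),
                  frequency=("date", "count"),
                  monetary=("amount", "sum"))
             .reset_index())
    # Recency: days from the last purchase to the end of that quarter.
    quarter_end = rfm["quarter"].dt.end_time.dt.normalize()
    rfm["recency_days"] = (quarter_end - rfm["last_purchase"]).dt.days
    return rfm

print(quarterly_rfm(transactions))
```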
As cloud computing usage grows, cloud data centers play an increasingly important role. To maximize resource utilization, ensure service quality, and enhance system performance, it is crucial to allocate tasks and manage performance effectively. The purpose of this study is to provide an extensive analysis of the task allocation and performance management techniques employed in cloud data centers, systematically categorizing and organizing previous research by identifying cloud computing methodologies, categories, and gaps. A literature review was conducted that analyzed 463 task allocation papers and 480 performance management papers. The review revealed three task allocation research topics and seven performance management methods. The task allocation research areas are resource allocation, load balancing, and scheduling. Performance management covers monitoring and control, power and energy management, resource utilization optimization, quality of service management, fault management, virtual machine management, and network management. The study proposes new techniques to enhance cloud computing work allocation and performance management, and the shortcomings identified in each approach can guide future research. The findings on cloud data center task allocation and performance management can assist academics, practitioners, and cloud service providers in optimizing their systems for dependability, cost-effectiveness, and scalability, and the innovative methodologies discussed can steer future research toward filling gaps in the literature.
This study investigates the influence of social media on college choice among undergraduates majoring in Big Data Management and Application in China. It attempts to reveal how information on social media platforms such as Weibo, WeChat, and Zhihu influences prospective students’ cognition and choice process. Using an online quantitative survey questionnaire, data were collected from the 2022 and 2023 cohorts of new students majoring in Big Data Management and Application at Guilin University of Electronic Technology, with the aim of evaluating the role of social media in their college choice process and understanding the features and information that most attract prospective students. Social media has become a key factor influencing the college choice decision-making of these undergraduates: students tend to obtain school information through social media platforms and use it as an important reference in their decision-making. Higher education institutions should therefore strengthen their social media information dissemination, providing accurate, timely, and attractive information. They should also manage social media platforms effectively, maintain a positive reputation for the school on social media, and increase the interest and trust of prospective students. At the same time, educational decision-makers should consider incorporating social media analysis into their recruitment strategies to better attract new student enrollment. This study provides a new perspective for understanding higher education choice behavior in the digital age, particularly by revealing the importance of social media in the educational decision-making process, with practical and theoretical implications for higher education institutions, policymakers, and social media platform operators.
As an introductory course for the emerging major of big data management and application, “Introduction to Big Data” has not yet acquired a widely accepted curriculum standard and implementation plan. To this end, we discuss some of our explorations and attempts in constructing and teaching big data courses for this major from the perspectives of course planning, course implementation, and course summary. Interviews with students and questionnaire feedback show that students are highly satisfied with many of the teaching measures and programs currently adopted.
That the world has become a global village through the tremendous advancement of information and communication technology (ICT) is no longer news. The metamorphosis of human data storage and analysis, from analogue methods through the Jacquard loom and mainframe computers to today’s high-powered processing computers with sextillion-byte storage capacity, has prompted discussion of the Big Data concept as a tool for managing the multiplier effects of complex human systems. Supply chain management (SCM), which deals with spatial service delivery that must be safe, efficient, reliable, cheap, transparent, and foreseeable to meet customers’ needs, cannot but employ big data tools in its operation. This study uses secondary online data to review the importance of big data in supply chain management and its levels of adoption in Nigeria. The study revealed that the application of big data tools in SCM and other industrial sectors is synonymous with human and national development. It is therefore recommended that both private and governmental bodies adopt e-transactions for easy data assemblage and analysis, supporting profitable forecasting and policy formulation.
With the continuous development of big data technology, the digital transformation of enterprise human resource management has become a development trend. Human resources are among an enterprise’s most important resources and are crucial to its competitiveness. Enterprises need to attract, retain, and motivate excellent employees, thereby enhancing their innovation ability and improving their competitiveness and market share. To maintain an advantage in fierce market competition, enterprises need to adopt more scientific and effective human resource management methods that enhance organizational efficiency and competitiveness. This paper analyzes the dilemmas faced by enterprise human resource management, points out the new characteristics of human resource management enabled by big data, and puts forward feasible suggestions for enterprise digital transformation.
With the rapid development and widespread application of Big Data technology, the supply chain management of agricultural product enterprises is facing unprecedented reform and challenges. Taking the perspective of Big Data technology, this study collects information on supply chain management practices in 100 agricultural product enterprises through a survey questionnaire. It finds that Big Data can effectively improve the accuracy of demand forecasting, raise inventory management efficiency, optimize logistics costs, improve supplier management, and enhance an enterprise’s overall level of supply chain management; on this basis, it proposes innovative supply chain management strategies for agricultural product enterprises. Big Data technology offers agricultural product enterprises a new way to raise their supply chain management level, making the supply chain smarter and more efficient.
This study investigates university English teachers’ acceptance of and willingness to use learning management system (LMS) data analysis tools in their teaching practices. The research employs a mixed-methods approach, combining quantitative surveys and qualitative interviews, to understand teachers’ perceptions and attitudes and the factors influencing their adoption of LMS data analysis tools. The findings reveal that perceived usefulness, perceived ease of use, technical literacy, organizational support, and data privacy concerns significantly affect teachers’ willingness to use these tools. Based on these insights, the study offers practical recommendations for educational institutions to promote the effective adoption of LMS data analysis tools in English language teaching.
To solve the problem of delayed updates of spectrum information (SI) in database-assisted dynamic spectrum management (DB-DSM), this paper studies a novel dynamic SI update scheme. Firstly, a dynamic SI update mechanism based on spectrum opportunity incentives is established, in which spectrum users are encouraged to actively assist the database in updating SI in real time. Secondly, the information update contribution (IUC) of a spectrum opportunity is defined to describe the cost heterogeneous spectrum users incur in accessing that opportunity and the profit the database obtains from spectrum allocation for updating SI; the process by which the database determines the IUC of a spectrum opportunity and a spectrum user selects an opportunity is mapped to a Hotelling model. Thirdly, the process of determining the IUC of spectrum opportunities is further modelled as a Stackelberg game by establishing multiple virtual spectrum resource providers (VSRPs) in the database, and it is proved that the game in which VSRPs determine the IUC has a Nash equilibrium. Finally, an algorithm based on a genetic algorithm is designed to find the optimal IUC. Theoretical analysis and simulation results show that the proposed method quickly finds the optimal IUC and ensures that the spectrum resource provider obtains the optimal profit from SI updates.
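As an illustration of the final step only, a generic genetic-algorithm search over a scalar IUC value might be organized as below; the profit function is a made-up stand-in, and the paper’s Hotelling/Stackelberg formulation is not reproduced.

```python
# Illustrative genetic-algorithm skeleton for searching an IUC value that
# maximizes a provider's profit. The profit curve is hypothetical.
import random

def profit(iuc: float) -> float:
    # Assumed concave profit: too low an IUC attracts few updates,
    # too high an IUC deters spectrum users from accessing opportunities.
    return iuc * max(0.0, 1.0 - 0.04 * iuc)

def run_ga(pop_size=30, generations=100, lo=0.0, hi=50.0,
           mutation_rate=0.2, seed=1):
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=profit, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2.0                  # arithmetic crossover
            if rng.random() < mutation_rate:       # Gaussian mutation, clipped
                child = min(hi, max(lo, child + rng.gauss(0.0, 2.0)))
            children.append(child)
        population = parents + children
    best = max(population, key=profit)
    return best, profit(best)

best_iuc, best_profit = run_ga()
print(f"best IUC ~ {best_iuc:.2f}, profit ~ {best_profit:.2f}")
```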
Driven by the wave of big data, the traditional financial accounting model faces an urgent need for transformation, as it struggles to adapt to the complex requirements of modern enterprise management. This paper explores feasible paths for transitioning enterprise financial accounting to management accounting in the context of big data. It first analyzes the limitations of financial accounting in the era of big data and then highlights the necessity of transitioning to management accounting. Following this, the paper outlines the challenges that may arise during the transition and, based on this analysis, proposes a series of corresponding transition strategies. These strategies aim to provide theoretical support and practical guidance for enterprises seeking a smooth transition from financial accounting to management accounting.
With its rapid development, big data has been applied more and more widely in all walks of life. In the big data environment, massive data provide convenience for regional tax risk control and strategic decision-making, but they also increase the difficulty of data supervision and management. By analyzing the status quo of big data and tax risk management, this paper identifies a number of problems and puts forward effective countermeasures for tax risk supervision and strategic management using big data.
In the 21st century, with the development of the Internet, mobile devices, and information technology, society has entered a new era: the era of big data. With the help of big data technology, enterprises can obtain massive market and consumer data and carry out in-depth analysis of their business and markets, giving them a deeper understanding of consumer needs, preferences, and behaviors. Big data technology can also help enterprises innovate in human resource management and improve their performance and competitiveness. At the same time, enterprises in this era face severe challenges. Processing and analyzing massive data requires strong data handling and analysis capabilities, and enterprises need to reconstruct their management systems to adapt to the changes of the big data era: they must treat data as assets and establish sound data management systems, and they must protect customer privacy and data security to avoid data leakage and abuse. In this context, this paper explores enterprise human resource management innovation in the era of big data and puts forward some suggestions for such innovation.
The power Internet of Things (IoT) is a significant trend in technology and a requirement for national strategic development. With the deepening digital transformation of the power grid, China’s power system has initially built a power IoT architecture comprising a perception layer, a network layer, and a platform application layer. However, owing to the structural complexity of the power system, the construction of the power IoT continues to face problems such as complex access management of massive heterogeneous equipment, diverse IoT protocol access methods, highly concurrent network communications, and weak data security protection. To address these issues, this study optimizes the existing power IoT architecture and designs an integrated management framework for the access of multi-source heterogeneous data in the power IoT, comprising cloud, pipe, edge, and terminal parts. It further reviews and analyzes the key technologies involved, such as unified management of the physical model, highly concurrent access, multi-protocol access, multi-source heterogeneous data storage management, and data security control, to provide a more flexible, efficient, secure, and easy-to-use solution for multi-source heterogeneous data access in the power IoT.
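To make the multi-protocol access idea concrete, here is a small, hypothetical sketch of a gateway that registers protocol adapters and normalizes readings from heterogeneous devices into one envelope; the class and field names are assumptions, not the framework designed in the study.

```python
# Minimal illustration of a unified device/data model for multi-protocol access.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class DeviceRecord:
    device_id: str
    protocol: str            # e.g. "MQTT", "CoAP", "Modbus"
    layer: str               # "terminal", "edge", "pipe", or "cloud"
    metadata: Dict[str, str] = field(default_factory=dict)

class AccessGateway:
    """Registers protocol adapters and normalizes readings into one schema."""
    def __init__(self) -> None:
        self.adapters: Dict[str, Callable[[bytes], dict]] = {}
        self.devices: Dict[str, DeviceRecord] = {}

    def register_adapter(self, protocol: str, parse: Callable[[bytes], dict]) -> None:
        self.adapters[protocol] = parse

    def register_device(self, record: DeviceRecord) -> None:
        self.devices[record.device_id] = record

    def ingest(self, device_id: str, payload: bytes) -> dict:
        device = self.devices[device_id]
        reading = self.adapters[device.protocol](payload)
        # Normalized envelope shared by all protocols.
        return {"device_id": device_id, "layer": device.layer, **reading}

gateway = AccessGateway()
gateway.register_adapter("MQTT", lambda raw: {"value": float(raw.decode())})
gateway.register_device(DeviceRecord("meter-01", "MQTT", "terminal"))
print(gateway.ingest("meter-01", b"230.4"))
```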
Owing to the extensive use of intelligent terminals and the popularity of online social tools, a large amount of data has emerged in the medical field, and managing these massive data safely and reliably has become an important challenge for the medical network community. This paper proposes a data management framework for the medical network community based on a Consortium Blockchain (CB) and Federated Learning (FL), which realizes secure data sharing between medical institutions and research institutions. Under this framework, a smart-contract-based data sharing mechanism and an FL- and consortium-chain-based privacy protection mechanism are designed to ensure, respectively, the security of data and the privacy of important data in the medical network community. A smart contract system based on a Keyed-Homomorphic Public Key Encryption (KH-PKE) scheme is designed so that medical data can be saved in the CB in ciphertext form and shared automatically, and a zero-knowledge mechanism is used to ensure the correctness of shared data. Moreover, the zero-knowledge mechanism incorporates a dynamic group signature scheme with chosen-ciphertext-attack (CCA) anonymity, which makes the scheme more efficient in computation and communication cost. Finally, the performance of the scheme is analyzed from both asymptotic and practical aspects; experimental comparison shows that the proposed scheme is effective and feasible.
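The federated-learning side of such a framework can be illustrated with a generic federated-averaging loop, in which institutions exchange model weights rather than raw records; this sketch uses a toy linear model and synthetic data, and is not the paper’s CB+FL construction or its KH-PKE machinery.

```python
# Generic federated-averaging sketch for privacy-preserving collaboration.
import numpy as np

def local_update(global_weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1, epochs: int = 5) -> np.ndarray:
    """Each institution trains locally (simple linear model, squared loss)."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(updates, sizes) -> np.ndarray:
    """Server aggregates only model weights, never raw patient records."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
global_w = np.zeros(2)
# Two hospitals with synthetic local data that never leaves their premises.
features = [rng.normal(size=(50, 2)), rng.normal(size=(80, 2))]
hospitals = [(X, X @ true_w + rng.normal(scale=0.1, size=len(X))) for X in features]

for _ in range(20):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in hospitals]
    global_w = federated_average(updates, [len(X) for X, _ in hospitals])

print("estimated weights:", np.round(global_w, 3))
```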
Connected and autonomous vehicles (CAVs) are seeing their dawn at this moment. They provide numerous benefits to vehicle owners, manufacturers, vehicle service providers, insurance companies, and others, but they also generate a large amount of data, which makes privacy and security a major challenge to their success. The complicated machine-led mechanics of connected and autonomous vehicles increase the risks of privacy invasion and cybersecurity violations for their users by making them more susceptible to data exploitation and more vulnerable to cyber-attacks than any of their predecessors. This could harm public acceptance of CAVs, give them a poor reputation at this early stage of their development, hinder their adoption and expanded use, and complicate the economic models for their future operations. At the same time, congestion remains a bottleneck for traffic management and planning. This paper presents a blockchain-based framework that protects the privacy of vehicle owners and provides data security by storing vehicular data on the blockchain, which is then used for congestion detection and mitigation. Devices placed along the road communicate with passing cars and collect their data, which are compiled periodically to find the average travel time of vehicles and the traffic density on a particular road segment. These data are stored in a memory pool together with data from other devices; after a predetermined amount of time, the memory pool is mined and the data are uploaded to the blockchain in blocks that store traffic statistics. The information is then used in two ways. First, the blockchain’s final block provides real-time traffic data that trigger an intelligent traffic signal system to reduce congestion. Second, the data stored on the blockchain provide historical, statistical records that facilitate the analysis of traffic conditions according to past behavior.
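A minimal sketch of the batching idea, in which roadside observations in a memory pool are summarized into per-segment statistics and sealed into a hash-linked block, might look like the following; the consensus, mining, and signal-control logic of the actual framework are omitted, and all field names are illustrative.

```python
# Sketch: batch roadside observations into hash-linked blocks of traffic statistics.
import hashlib, json, time

def make_block(stats: dict, prev_hash: str) -> dict:
    header = {"timestamp": time.time(), "prev_hash": prev_hash, "stats": stats}
    block_hash = hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()
    return {**header, "hash": block_hash}

# Memory pool of raw observations reported by roadside devices (hypothetical values).
mempool = [
    {"segment": "A1", "travel_time_s": 95, "vehicles": 40},
    {"segment": "A1", "travel_time_s": 120, "vehicles": 55},
    {"segment": "B2", "travel_time_s": 60, "vehicles": 12},
]

def summarize(pool):
    """Aggregate per-segment average travel time and vehicle counts."""
    out = {}
    for obs in pool:
        seg = out.setdefault(obs["segment"], {"time": 0, "vehicles": 0, "n": 0})
        seg["time"] += obs["travel_time_s"]
        seg["vehicles"] += obs["vehicles"]
        seg["n"] += 1
    return {s: {"avg_travel_time_s": v["time"] / v["n"], "vehicle_count": v["vehicles"]}
            for s, v in out.items()}

chain = [make_block(summarize(mempool), prev_hash="0" * 64)]
print(chain[-1]["stats"])   # the latest block would drive the adaptive signal logic
```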
This study analyzed time efficiency in the data management process associated with personnel training and competence assessments in one of the quality control (QC) laboratories of Nigeria’s Foods and Drugs Authority (NAFDAC). The laboratory administrators were burdened with extensive mental and paper-based record keeping because the personnel training data were managed manually and hence not processed efficiently. The Excel spreadsheet provided by a Purdue doctoral dissertation as a remedy for this challenge was found to be deficient in handling database table operations and therefore did not adequately address the inefficiencies. Purpose: This study aimed to reduce the time it takes to generate, obtain, manipulate, exchange, and securely store the data associated with personnel competence training and assessments. Method: The study developed a software system integrated with a relational database management system (RDBMS) to improve the manual/Excel-based data management procedures. To validate the efficiency of the software, the mean operational times of the Excel-based format were compared with those of the “New” software system; the data were obtained by performing four predefined core tasks for five hypothetical subjects using Excel and the “New” (model) system, respectively. Results: The average time to accomplish the specified tasks using the “New” system (37.08 seconds) was significantly (p = 0.00191, α = 0.05) lower than the time measured for the Excel system (77.39 seconds) in the ANACHEM laboratory. The RDBMS-based “New” system provided operational (time) efficiency in the personnel training and competence assessment process in the QC laboratory and reduced human errors.
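For readers who want to reproduce the style of comparison, a two-sample t-test over task-timing measurements can be run as below; the numbers shown are hypothetical placeholders, not the study’s raw data.

```python
# Hypothetical re-creation of the timing comparison (illustrative numbers only).
from scipy import stats

excel_times = [70.2, 81.5, 76.0, 79.8, 79.4]   # seconds per task, Excel workflow
new_times   = [36.1, 39.0, 35.5, 38.2, 36.6]   # seconds per task, RDBMS-based system

t_stat, p_value = stats.ttest_ind(excel_times, new_times)
print(f"t = {t_stat:.2f}, p = {p_value:.5f}")   # p < 0.05 -> significant speed-up
```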
Mobile networks possess significant information and are thus considered a gold mine for the research community. The call detail records (CDR) of a mobile network are used to identify the network’s efficacy and mobile users’ behavior. Recent literature shows that cyber-physical systems (CPS) have been used in the analytics and modeling of telecom data, and CPS are also used to provide valuable services in smart cities. A typical telecom company has millions of subscribers and thus generates massive amounts of data, so data storage, analysis, and processing are key concerns. To address these issues, we propose a multilevel cyber-physical social system (CPSS) for the analysis and modeling of large internet data. The proposed system has three levels, each with a specific functionality. At the first level, raw CDR data are collected and preprocessing, cleaning, and error removal are performed. At the second level, data processing, reduction, integration, and storage are carried out, and the suggested internet activity record measures are applied. The proposed system then constructs a graph and performs network analysis, accurately identifying areas of peak internet usage in a city (Milan). Our research helps network operators plan effective network configuration, management, and optimization of resources.
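The graph-construction step can be illustrated with a small, hypothetical example in which aggregated activity records between city cells are loaded into a graph and cells are ranked by weighted degree; the field names and the ranking measure are assumptions, not the paper’s exact internet activity record measures.

```python
# Sketch: turn aggregated CDR-style activity records into a graph and rank cells.
import networkx as nx

records = [  # (cell_id, neighbour_cell, internet_activity) -- illustrative values
    ("cell_1", "cell_2", 120.0),
    ("cell_2", "cell_3", 340.0),
    ("cell_1", "cell_3", 80.0),
]

G = nx.Graph()
for a, b, activity in records:
    G.add_edge(a, b, weight=activity)

# Weighted degree as a simple proxy for peak-usage areas of the city.
usage = sorted(G.degree(weight="weight"), key=lambda kv: kv[1], reverse=True)
print(usage)  # e.g. [('cell_2', 460.0), ('cell_3', 420.0), ('cell_1', 200.0)]
```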
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate these inherent defects of the IIoT. In a traditional blockchain, data are stored in a Merkle tree; as the data grow, so does the size of the proofs used to validate them, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, the Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses the PVC instead of the Merkle tree to store the big data generated by IIoT; the PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses the PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. The mechanism greatly reduces communication loss and makes maximal, rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
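To see the motivating problem, the sketch below builds a plain Merkle tree and shows that each membership proof carries about log2(n) sibling hashes, which grows with the data set; the PVC construction proposed in the paper is not reproduced here.

```python
# Merkle-proof sketch illustrating proof-size growth with the number of records.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves):
    levels = [[h(x) for x in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        if len(prev) % 2:                      # duplicate last node if the level is odd
            prev = prev + [prev[-1]]
        levels.append([h(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def prove(levels, index):
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))   # (hash, sibling-is-left?)
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

leaves = [f"iiot-record-{i}".encode() for i in range(8)]
levels = build_tree(leaves)
root = levels[-1][0]
proof = prove(levels, 5)
print(len(proof), verify(leaves[5], proof, root))   # 3 sibling hashes, True
```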
To address the problems of single encryption algorithms, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based big data secure storage scheme. Firstly, to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate each node, achieving dual-channel storage. Then, we improve the ECC encryption algorithm for encrypting ordinary data and adopt a homomorphic encryption algorithm for data that need to be computed on; to accelerate encryption, we adopt a dual-thread encryption mode. Finally, an HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution solves the single-point-of-failure problem for metadata, performs well in terms of metadata reliability, and realizes server fault tolerance. The improved encryption algorithm, integrated with the dual-channel storage mode, improves encryption storage efficiency by 27.6% on average.
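The dual-thread encryption mode can be illustrated generically as two worker threads encrypting data blocks in parallel before storage; Fernet is used here purely as a stand-in cipher, since the paper’s improved ECC and homomorphic algorithms are not specified in the abstract, and the block size is an assumption.

```python
# Dual-thread encryption sketch: two workers encrypt file blocks in parallel.
from concurrent.futures import ThreadPoolExecutor
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_block(block: bytes) -> bytes:
    return cipher.encrypt(block)

# Split a payload into fixed-size blocks, as a block-oriented store would.
payload = b"sensor-readings," * 1000
blocks = [payload[i:i + 4096] for i in range(0, len(payload), 4096)]

with ThreadPoolExecutor(max_workers=2) as pool:   # the "dual-thread" mode
    encrypted_blocks = list(pool.map(encrypt_block, blocks))

print(len(encrypted_blocks), "blocks encrypted")
```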