In today’s highly competitive retail industry, offline stores face increasing pressure on profitability. They hope to improve their shelf management capabilities with the help of big data technology. To this end, on-shelf availability is an essential indicator of shelf data management and closely relates to customer purchase behavior. RFM (recency, frequency, and monetary) pattern mining is a powerful tool for evaluating the value of customer behavior. However, existing RFM pattern mining algorithms do not consider the quarterly nature of goods, resulting in unreasonable shelf availability and difficulty in profit-making. To solve this problem, we propose a quarterly RFM mining algorithm for on-shelf products named OS-RFM. Our algorithm mines high-recency, high-frequency, and high-monetary patterns and considers the period of on-shelf goods in quarterly units. We conducted experiments on two real datasets, with numerical and graphical analysis, to demonstrate the algorithm’s effectiveness. Compared with the state-of-the-art RFM mining algorithm, our algorithm identifies more patterns and performs well in terms of precision, recall, and F1-score, with the recall rate nearing 100%. The novel algorithm also operates with significantly shorter running times and more stable memory usage than existing mining algorithms. Additionally, we analyze product sales trends across quarters and seasonal variations. This analysis helps businesses maintain reasonable on-shelf availability and achieve greater profitability.
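The quarterly RFM computation underlying such an approach can be sketched in a few lines. This is a minimal illustration of computing recency, frequency, and monetary values per customer per quarter, not the authors' OS-RFM implementation; the transaction records and field names are hypothetical.

```python
from collections import defaultdict
from datetime import date

# Hypothetical transaction records: (customer_id, purchase_date, amount)
transactions = [
    ("c1", date(2023, 1, 15), 40.0),
    ("c1", date(2023, 2, 20), 25.0),
    ("c2", date(2023, 3, 5), 90.0),
    ("c2", date(2023, 8, 1), 10.0),
]

def quarter(d):
    """Map a date to its (year, quarter) bucket."""
    return d.year, (d.month - 1) // 3 + 1

def rfm_by_quarter(records, today=date(2023, 12, 31)):
    """Per-customer, per-quarter RFM values:
    R = days since the customer's last purchase in that quarter,
    F = number of purchases, M = total spend."""
    acc = defaultdict(lambda: {"last": None, "F": 0, "M": 0.0})
    for cust, d, amount in records:
        cell = acc[(cust, quarter(d))]
        cell["F"] += 1
        cell["M"] += amount
        if cell["last"] is None or d > cell["last"]:
            cell["last"] = d
    return {key: {"R": (today - v["last"]).days, "F": v["F"], "M": v["M"]}
            for key, v in acc.items()}
```

A high-value pattern in a given quarter would then be a customer segment with low R (recent purchases) and high F and M; the pattern mining stage operates over such quarterly values.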
That the world is a global village is no longer news, thanks to tremendous advancement in Information and Communication Technology (ICT). The metamorphosis of human data storage and analysis, from analogue methods through the Jacquard loom and mainframe computer to today’s high-power processing computers with sextillion-byte storage capacity, has prompted discussion of the Big Data concept as a tool for managing the multiplier effects of complex human systems. Supply chain management (SCM), which deals with spatial service delivery that must be safe, efficient, reliable, cheap, transparent, and foreseeable to meet customers’ needs, cannot but employ big data tools in its operation. This study employs secondary online data to review the importance of big data in supply chain management and its levels of adoption in Nigeria. The study revealed that the application of big data tools in SCM and other industrial sectors is synonymous with human and national development. It is therefore recommended that both private and governmental bodies key into e-transactions for easy data assemblage and analysis for profitable forecasting and policy formulation.
As cloud computing usage grows, cloud data centers play an increasingly important role. To maximize resource utilization, ensure service quality, and enhance system performance, it is crucial to allocate tasks and manage performance effectively. The purpose of this study is to provide an extensive analysis of task allocation and performance management techniques employed in cloud data centers, systematically categorizing and organizing previous research by identifying cloud computing methodologies, categories, and gaps. A literature review was conducted, covering 463 task allocation papers and 480 performance management papers. The review revealed three task allocation research topics and seven performance management methods. The task allocation research areas are resource allocation, load balancing, and scheduling. Performance management includes monitoring and control, power and energy management, resource utilization optimization, quality of service management, fault management, virtual machine management, and network management. The study proposes new techniques to enhance cloud computing work allocation and performance management, and the shortcomings identified in each approach can guide future research. The findings on cloud data center task allocation and performance management can assist academics, practitioners, and cloud service providers in optimizing their systems for dependability, cost-effectiveness, and scalability. Innovative methodologies can steer future research to fill gaps in the literature.
The power Internet of Things (IoT) is a significant trend in technology and a requirement for national strategic development. With the deepening digital transformation of the power grid, China’s power system has initially built a power IoT architecture comprising perception, network, and platform application layers. However, owing to the structural complexity of the power system, the construction of the power IoT continues to face problems such as complex access management of massive heterogeneous equipment, diverse IoT protocol access methods, high concurrency of network communications, and weak data security protection. To address these issues, this study optimizes the existing architecture of the power IoT and designs an integrated management framework for the access of multi-source heterogeneous data in the power IoT, comprising cloud, pipe, edge, and terminal parts. It further reviews and analyzes the key technologies involved in the power IoT, such as unified management of the physical model, highly concurrent access, multi-protocol access, multi-source heterogeneous data storage management, and data security control, to provide a more flexible, efficient, secure, and easy-to-use solution for multi-source heterogeneous data access in the power IoT.
This study investigates university English teachers’ acceptance of and willingness to use learning management system (LMS) data analysis tools in their teaching practices. The research employs a mixed-method approach, combining quantitative surveys and qualitative interviews to understand teachers’ perceptions and attitudes, as well as the factors influencing their adoption of LMS data analysis tools. The findings reveal that perceived usefulness, perceived ease of use, technical literacy, organizational support, and data privacy concerns significantly impact teachers’ willingness to use these tools. Based on these insights, the study offers practical recommendations for educational institutions to enhance the effective adoption of LMS data analysis tools in English language teaching.
As an introductory course for the emerging major of big data management and application, “Introduction to Big Data” has not yet acquired a widely accepted curriculum standard and implementation plan. To this end, we discuss our explorations and attempts in the construction and teaching of big data courses for this major from the perspectives of course planning, course implementation, and course summary. Student interviews and questionnaire feedback indicate that students are highly satisfied with many of the teaching measures and programs currently adopted.
Driven by the wave of big data, the traditional financial accounting model faces an urgent need for transformation, as it struggles to adapt to the complex requirements of modern enterprise management. This paper explores feasible paths for transitioning enterprise financial accounting to management accounting in the context of big data. It first analyzes the limitations of financial accounting in the era of big data, then highlights the necessity of transitioning to management accounting. The paper then outlines the challenges that may arise during this transition and, based on this analysis, proposes a series of corresponding transition strategies. These strategies aim to provide theoretical support and practical guidance for enterprises seeking a smooth transition from financial accounting to management accounting.
With the rapid development of big data technology, big data has been applied in more and more walks of life. In the big data environment, massive data provides convenience for regional tax risk control and strategic decision-making, but it also increases the difficulty of data supervision and management. By analyzing the status quo of big data and tax risk management, this paper identifies several problems and puts forward effective countermeasures for tax risk supervision and strategic management using big data.
In the 21st century, with the development of the Internet, mobile devices, and information technology, society has entered a new era: the era of big data. With the help of big data technology, enterprises can obtain massive market and consumer data, conduct in-depth analysis of business and markets, and gain a deeper understanding of consumer needs, preferences, and behaviors. At the same time, big data technology can help enterprises innovate in human resource management and improve their performance and competitiveness. From another perspective, however, enterprises in this era also face severe challenges. First, processing and analyzing massive data requires superb data processing and analysis capabilities. Second, enterprises need to reconstruct their management systems to adapt to the changes of the big data era: they must treat data as assets and establish sound data management systems. In addition, enterprises need to protect customer privacy and data security to avoid data leakage and abuse. In this context, this paper explores enterprise human resource management innovation in the era of big data and puts forward some corresponding suggestions.
With the continuous development of big data technology, the digital transformation of enterprise human resource management has become a development trend. Human resources are among an enterprise’s most important resources and are crucial to its competitiveness. Enterprises need to attract, retain, and motivate excellent employees, thereby enhancing their innovation capability and improving their competitiveness and market share. To maintain an advantage in fierce market competition, enterprises need to adopt more scientific and effective human resource management methods to enhance organizational efficiency and competitiveness. This paper analyzes the dilemmas faced by enterprise human resource management, points out the new characteristics of human resource management enabled by big data, and puts forward feasible suggestions for enterprise digital transformation.
With the rapid development and widespread application of big data technology, the supply chain management of agricultural product enterprises is facing unprecedented reform and challenges. Taking the perspective of big data technology, this study collects information on the application of supply chain management in 100 agricultural product enterprises through a survey questionnaire. The study found that big data can effectively improve the accuracy of demand forecasting and the efficiency of inventory management, optimize logistics costs, improve supplier management efficiency, and enhance the overall level of enterprise supply chain management; on this basis, it proposes innovative supply chain management strategies for agricultural product enterprises. Big data technology offers these enterprises a new way to enhance their supply chain management, making the supply chain smarter and more efficient.
This study investigates the influence of social media on college choice among undergraduates majoring in Big Data Management and Application in China, examining how information on social media platforms such as Weibo, WeChat, and Zhihu shapes prospective students’ perceptions and choice processes. Using an online quantitative survey questionnaire, data were collected from the 2022 and 2023 cohorts of new students majoring in Big Data Management and Application at Guilin University of Electronic Technology, with the aim of evaluating the role of social media in their college choice process and identifying the features and information that most attract prospective students. Social media has become a key factor in the college choice decision-making of these undergraduates: students tend to obtain school information through social media platforms and use it as an important reference in their decisions. Higher education institutions should therefore strengthen their social media communication by providing accurate, timely, and attractive information; manage their social media platforms effectively; maintain a positive reputation on social media; and increase the interest and trust of prospective students. Educational decision-makers should also consider incorporating social media analysis into their recruitment strategies to better attract new enrollment. This study provides a new perspective for understanding higher education choice behavior in the digital age, particularly by revealing the importance of social media in the educational decision-making process, with important practical and theoretical implications for higher education institutions, policymakers, and social media platform operators.
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In a traditional blockchain, data is stored in a Merkle tree. As data continues to grow, the proofs used to validate it grow as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Second, this paper uses PVC instead of the Merkle tree to store big data generated by the IIoT; PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. The mechanism greatly reduces communication loss and makes rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
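The proof-growth problem described above can be seen concretely in a plain Merkle tree, where the proof for one record consists of one sibling hash per tree level and therefore grows logarithmically with the number of records. The sketch below illustrates only that baseline growth; it is not the paper's PVC construction.

```python
import hashlib

def h(b):
    """SHA-256 digest of a byte string."""
    return hashlib.sha256(b).digest()

def merkle_root_and_proof(leaves, index):
    """Build a binary Merkle tree over `leaves`; return the root and the
    sibling-hash proof for the leaf at `index` (one hash per level)."""
    level = [h(x) for x in leaves]
    proof, i = [], index
    while len(level) > 1:
        if len(level) % 2 == 1:           # duplicate the last node on odd levels
            level.append(level[-1])
        proof.append(level[i ^ 1])        # sibling of the current node
        level = [h(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
        i //= 2
    return level[0], proof

# 1024 records -> a proof of log2(1024) = 10 hashes; doubling the data
# adds another hash to every proof, which is the growth PVC aims to curb.
leaves = [f"record-{k}".encode() for k in range(1024)]
root, proof = merkle_root_and_proof(leaves, 5)
```

Verifying a record means re-hashing it with the 10 sibling hashes up to the root, so proof size and verification cost both track the tree height.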
To address the problems of single encryption algorithms for static data storage on big data platforms in cloud computing environments, such as low encryption efficiency and unreliable metadata, we propose a Hadoop-based big data secure storage scheme. First, to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the ZooKeeper distributed coordination mechanism to coordinate the nodes and achieve dual-channel storage. Then, we improve the ECC encryption algorithm for encrypting ordinary data and adopt a homomorphic encryption algorithm for data that needs to be computed on; to accelerate encryption, we adopt a dual-thread encryption mode. Finally, an HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution eliminates the single point of failure for metadata, performs well in terms of metadata reliability, and achieves server fault tolerance. The improved encryption algorithm, integrated with the dual-channel storage mode, improves encryption storage efficiency by 27.6% on average.
Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time-series data into images for analysis have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. Data augmentation is performed by adding Gaussian noise, with the noise level set to 0.002, to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time-series data into images. This enables the identification of patterns in time-series data and helps capture their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, with detected anomaly areas represented as heat maps; by applying an anomaly map to the original image, it is possible to capture the areas where anomalies occur. The performance evaluation shows that both F1-score and accuracy are high when time-series data are converted to images. Additionally, processing the data as images rather than as raw time series significantly reduced both the data size and the training time. The proposed method can provide an important springboard for research in anomaly detection using time-series data, and it helps address problems such as analyzing complex patterns in data in a lightweight manner.
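The MTF conversion described above can be sketched as follows: quantile-bin the series, estimate the bin-to-bin Markov transition matrix from consecutive points, and index that matrix by every pair of time steps to form an image. The Gaussian noise augmentation uses the stated 0.002 level; the bin count and test signal are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np

def markov_transition_field(series, n_bins=8):
    """Convert a 1-D series into an MTF image: quantile-bin each value,
    estimate the bin-to-bin Markov transition matrix W from consecutive
    points, then set MTF[i, j] = W[bin(x_i), bin(x_j)]."""
    x = np.asarray(series, dtype=float)
    # Interior quantile edges give roughly equal-mass bins 0..n_bins-1
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(x, edges)
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(bins[:-1], bins[1:]):
        W[a, b] += 1.0
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1.0)  # row-normalize
    return W[np.ix_(bins, bins)]

# Augmentation as described: additive Gaussian noise at level 0.002
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 6 * np.pi, 200))
augmented = signal + rng.normal(0.0, 0.002, size=signal.shape)
img = markov_transition_field(augmented)
```

The resulting n-by-n image encodes, at pixel (i, j), how likely the value level at time i transitions to the level at time j, which is what lets an image-based detector such as PatchCore pick up on sequential structure.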
In the era of big data, data application based on data governance has become an inevitable trend in the construction of smart campuses in higher education. This paper proposes a data governance system framework covering the whole data life cycle, suitable for higher education, and applies the ideas and methods of data governance to the construction of a data management system for the basic development status of faculties, drawing on the data governance practice of Donghua University. The system forms a closed loop of data management across collection, information feedback, and statistical analysis of the college’s basic development status data. While optimizing the management business of higher education, the system provides a scientific and reliable basis for precise decision-making and strategic development.
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and limited dataset sizes. The model exhibited significant predictability and improved generalizability in estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent suitability of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study offers a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
There are challenges in the reliability evaluation of insulated gate bipolar transistors (IGBTs) in electric vehicles, such as junction temperature measurement and limited computational and storage resources. In this paper, a junction temperature estimation approach based on a neural network, incurring no additional hardware cost, is proposed, and the lifetime calculation for IGBTs using electric vehicle big data is performed. The direct current (DC) voltage, operating current, switching frequency, negative temperature coefficient (NTC) thermistor temperature, and IGBT lifetime are the inputs, and the junction temperature (T_j) is the output. With the rainflow counting method, the classified irregular temperature swings are fed into the life model to obtain the failure cycles, and the fatigue accumulation method is then used to calculate the IGBT lifetime. To work around the limited computational and storage resources of electric vehicle controllers, the IGBT lifetime calculation runs on a big data platform; the lifetime is then transmitted wirelessly to the electric vehicles as an input for the neural network. Thus, the junction temperature of the IGBT under long-term operating conditions can be accurately estimated. A test platform combining the motor controller with the vehicle big data server is built for the IGBT accelerated aging test. Subsequently, IGBT lifetime predictions are derived from the junction temperature estimates produced by the neural network method and the thermal network method. The experiment shows that lifetime prediction based on a neural network with big data achieves higher accuracy than the thermal network approach, improving the reliability evaluation of the system.
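The lifetime step described above, combining rainflow-counted temperature swings with a life model and linear fatigue accumulation, can be sketched as follows. A Coffin-Manson life model is assumed here for illustration, and its coefficients are hypothetical; the paper does not state its life model parameters, and real values would be fitted from power-cycling tests or a device datasheet.

```python
# Hypothetical Coffin-Manson parameters (illustrative only; real values
# come from power-cycling tests or the IGBT datasheet)
A, N_EXP = 1.0e14, 5.0

def cycles_to_failure(delta_tj):
    """Coffin-Manson life model: N_f = A * (dTj ** -n),
    with dTj the junction temperature swing in kelvin."""
    return A * delta_tj ** (-N_EXP)

def miner_damage(counted_cycles):
    """Linear fatigue accumulation (Miner's rule) over rainflow-counted
    (delta_Tj, cycle_count) pairs; failure is predicted at damage >= 1."""
    return sum(n / cycles_to_failure(dt) for dt, n in counted_cycles)

# Hypothetical rainflow output from a junction-temperature history:
# 10,000 small 20 K swings and 500 large 60 K swings
damage = miner_damage([(20.0, 1.0e4), (60.0, 500.0)])
remaining_life_fraction = 1.0 - damage
```

Because the exponent makes large swings disproportionately damaging, a few deep temperature cycles can consume more life than thousands of shallow ones, which is why rainflow counting of irregular profiles precedes the accumulation step.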
As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study explores a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data, framing the detection of atmospheric turbulence as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events, and comparisons with alternative machine learning techniques indicate that it is the best-performing methodology among those evaluated. In summary, symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to established EDR approaches and surpassing machine learning algorithms. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amid rising environmental and operational hazards.
Funding: Partially supported by the Foundation of State Key Laboratory of Public Big Data (No. PBD2022-01).
Funding: Supported by the Ministerio Español de Ciencia e Innovación under Project Number PID2020-115570GB-C22, MCIN/AEI/10.13039/501100011033, and by the Cátedra de Empresa Tecnología para las Personas (UGR-Fujitsu).
Funding: Supported by the National Key Research and Development Program of China (grant number 2019YFE0123600).
Abstract: The power Internet of Things (IoT) is a significant technology trend and a requirement for national strategic development. With the deepening digital transformation of the power grid, China's power system has initially built a power IoT architecture comprising a perception layer, a network layer, and a platform application layer. However, owing to the structural complexity of the power system, the construction of the power IoT continues to face problems such as complex access management of massive heterogeneous equipment, diverse IoT protocol access methods, high concurrency of network communications, and weak data security protection. To address these issues, this study optimizes the existing architecture of the power IoT and designs an integrated management framework for the access of multi-source heterogeneous data in the power IoT, comprising cloud, pipe, edge, and terminal parts. It further reviews and analyzes the key technologies involved, such as unified management of the physical model, highly concurrent access, multi-protocol access, multi-source heterogeneous data storage management, and data security control, to provide a more flexible, efficient, secure, and easy-to-use solution for multi-source heterogeneous data access in the power IoT.
Abstract: This study investigates university English teachers' acceptance of and willingness to use learning management system (LMS) data analysis tools in their teaching practices. The research employs a mixed-methods approach, combining quantitative surveys and qualitative interviews, to understand teachers' perceptions and attitudes and the factors influencing their adoption of LMS data analysis tools. The findings reveal that perceived usefulness, perceived ease of use, technical literacy, organizational support, and data privacy concerns significantly affect teachers' willingness to use these tools. Based on these insights, the study offers practical recommendations for educational institutions to promote the effective adoption of LMS data analysis tools in English language teaching.
Abstract: As an introductory course for the emerging major of big data management and application, "Introduction to Big Data" does not yet have a widely accepted curriculum standard and implementation plan. To this end, we discuss our explorations and attempts in the construction and teaching of big data courses for this major from the perspectives of course planning, course implementation, and course summary. Interviews with students and questionnaire feedback show that students are highly satisfied with the teaching measures and programs currently adopted.
Abstract: Driven by the wave of big data, the traditional financial accounting model faces an urgent need for transformation, as it struggles to meet the complex requirements of modern enterprise management. This paper explores feasible paths for transitioning enterprise financial accounting to management accounting in the context of big data. It first analyzes the limitations of financial accounting in the era of big data and then highlights the necessity of transitioning to management accounting. The paper then outlines the challenges that may arise during this transition and, based on this analysis, proposes a series of corresponding transition strategies. These strategies aim to provide theoretical support and practical guidance for enterprises seeking a smooth transition from financial accounting to management accounting.
Abstract: With the rapid development of big data, it has been applied more and more widely across industries. In a big data environment, massive data provides convenience for regional tax risk control and strategic decision-making, but also increases the difficulty of data supervision and management. By analyzing the current state of big data and tax risk management, this paper identifies a number of problems and puts forward effective countermeasures for tax risk supervision and strategic management using big data.
Abstract: In the 21st century, with the development of the Internet, mobile devices, and information technology, society has entered the era of big data. With the help of big data technology, enterprises can obtain massive market and consumer data, conduct in-depth analysis of business and markets, and gain a deeper understanding of consumer needs, preferences, and behaviors. Big data technology can also help enterprises innovate in human resource management and improve their performance and competitiveness. At the same time, enterprises in this era face severe challenges: processing and analyzing massive data demands strong analytical capabilities; management systems must be restructured to adapt to the changes of the big data era; data must be treated as an asset under a sound data management system; and customer privacy and data security must be protected to avoid data leakage and abuse. In this context, this paper explores enterprise human resource management innovation in the era of big data and puts forward corresponding suggestions.
Abstract: With the continuous development of big data technology, the digital transformation of enterprise human resource management has become a clear trend. Human resources are among the most important resources of an enterprise and are crucial to its competitiveness. Enterprises need to attract, retain, and motivate excellent employees, thereby enhancing their innovation ability and improving their competitiveness and market share. To maintain an advantage in fierce market competition, enterprises need to adopt more scientific and effective human resource management methods to enhance organizational efficiency and competitiveness. This paper analyzes the dilemmas faced by enterprise human resource management, points out the new characteristics of human resource management enabled by big data, and puts forward feasible suggestions for enterprise digital transformation.
Abstract: With the rapid development and widespread application of Big Data technology, the supply chain management of agricultural products enterprises faces unprecedented reform and challenges. Taking the perspective of Big Data technology, this study collects information on supply chain management practices at 100 agricultural product enterprises through a survey questionnaire. The study finds that Big Data can effectively improve the accuracy of demand forecasting, raise inventory management efficiency, optimize logistics costs, and improve supplier management efficiency, enhancing the overall level of enterprise supply chain management; on this basis, it proposes innovative supply chain management strategies for agricultural products enterprises. Big Data technology offers these enterprises a new way to enhance their supply chain management, making the supply chain smarter and more efficient.
Abstract: This study investigates the influence of social media on college choice among undergraduates majoring in Big Data Management and Application in China. It examines how information on social media platforms such as Weibo, WeChat, and Zhihu influences the cognition and choice process of prospective students. Using an online quantitative survey questionnaire, data were collected from the 2022 and 2023 cohorts of new students majoring in Big Data Management and Application at Guilin University of Electronic Technology, with the aim of evaluating the role of social media in their college choice process and understanding the features and information that most attract prospective students. Social media has become a key factor in these students' college choice decision-making: they tend to obtain school information through social media platforms and use it as an important reference in their decisions. Higher education institutions should therefore strengthen their social media communication by providing accurate, timely, and attractive information; manage their social media platforms effectively; maintain a positive reputation on social media; and build the interest and trust of prospective students. Educational decision-makers should also consider incorporating social media analysis into their recruitment strategies to better attract new students. This study provides a new perspective for understanding higher education choice behavior in the digital age, particularly by revealing the importance of social media in the educational decision-making process, with practical and theoretical implications for higher education institutions, policymakers, and social media platform operators.
Funding: Supported by China's National Natural Science Foundation (Nos. 62072249, 62072056) and by the National Science Foundation of Hunan Province (2020JJ2029).
Abstract: With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate these inherent defects. In a traditional blockchain, data is stored in a Merkle tree; as the data grows, so does the size of the proofs used to validate it, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by the IIoT; PVC improves the efficiency of traditional VC during commitment and opening. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. The mechanism greatly reduces communication overhead and makes the best use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
Abstract: To address the problems of single-algorithm encryption, such as low encryption efficiency and unreliable metadata, for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based secure big data storage scheme. First, to distribute the NameNode service from a single server across multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms and use the Zookeeper distributed coordination mechanism to coordinate the nodes and achieve dual-channel storage. Then, we improve the ECC encryption algorithm for ordinary data and adopt a homomorphic encryption algorithm for data that needs to be computed on. To accelerate encryption, we adopt a dual-thread encryption mode. Finally, an HDFS control module is designed to combine the encryption algorithms with the storage model. Experimental results show that the proposed solution eliminates the single point of failure for metadata, performs well in terms of metadata reliability, and achieves server fault tolerance. The improved encryption algorithm, integrated with the dual-channel storage mode, improves encrypted storage efficiency by 27.6% on average.
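The dual-thread encryption mode can be sketched as routing each file to one of two worker threads depending on whether its contents must remain computable. The `ecc_encrypt` and `he_encrypt` functions below are hypothetical placeholders (toy byte transforms standing in for the paper's improved ECC and homomorphic ciphers, which are not public); only the routing structure reflects the described design.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical placeholder ciphers -- toy transforms only, NOT real
# encryption; stand-ins for the scheme's ECC and homomorphic algorithms.
def ecc_encrypt(data: bytes) -> bytes:
    return bytes(b ^ 0x5A for b in data)

def he_encrypt(data: bytes) -> bytes:
    return bytes((b + 7) % 256 for b in data)

def encrypt_batch(files):
    """files: list of (name, data, needs_compute) tuples.
    Two worker threads mirror the dual-thread encryption mode: data that
    must support computation goes to the homomorphic path, the rest to ECC."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = {
            name: pool.submit(he_encrypt if needs_compute else ecc_encrypt, data)
            for name, data, needs_compute in files
        }
        return {name: f.result() for name, f in futures.items()}

out = encrypt_batch([("log.bin", b"raw", False), ("stats.bin", b"sum", True)])
```

The thread pool lets the two cipher paths run concurrently, which is where the claimed throughput gain of the dual-thread mode would come from.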
Funding: This research was financially supported by the Ministry of Trade, Industry, and Energy (MOTIE), Korea, under the "Project for Research and Development with Middle Markets Enterprises and DNA (Data, Network, AI) Universities" (AI-based Safety Assessment and Management System for Concrete Structures) (Reference Number P0024559), supervised by the Korea Institute for Advancement of Technology (KIAT).
Abstract: Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is difficult due to data imbalance, temporal dependence, and noise, so methodologies for data augmentation and for converting time series into images have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address data imbalance, temporal dependence, and robustness to noise. Augmentation is performed by adding Gaussian noise, with the noise level set to 0.002, to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to convert the time series into images while effectively visualizing their dynamic transitions; this makes patterns in the time series identifiable and helps capture their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, with detected anomalous regions represented as heat maps; overlaying the anomaly map on the original image makes it possible to localize where anomalies occur. The performance evaluation shows that both F1-score and accuracy are high when the time series is converted to images. Moreover, processing the data as images rather than as raw time series significantly reduced both the data size and the training time. The proposed method can provide an important springboard for research on anomaly detection in time-series data and helps keep the analysis of complex data patterns lightweight.
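The two preprocessing steps described above can be sketched in a few lines of NumPy (an illustrative re-implementation, not the authors' code): Gaussian noise at the stated level of 0.002 is added for augmentation, and a minimal Markov Transition Field maps a series to an image by quantile-binning the values, estimating the bin-to-bin transition matrix W, and filling MTF[i, j] with the transition probability between the bins of x_i and x_j.

```python
import numpy as np

def augment(x, sigma=0.002, seed=0):
    """Gaussian-noise augmentation at the noise level used in the paper."""
    rng = np.random.default_rng(seed)
    return x + rng.normal(0.0, sigma, size=x.shape)

def markov_transition_field(x, n_bins=8):
    """Minimal MTF: quantile-bin the series, estimate the Markov
    transition matrix W, then set MTF[i, j] = W[bin(x_i), bin(x_j)]."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    q = np.digitize(x, edges)                          # bin index per sample
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(q[:-1], q[1:]):                    # count observed transitions
        W[a, b] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)   # row-normalize safely
    return W[np.ix_(q, q)]                             # (len(x), len(x)) image

x = np.sin(np.linspace(0, 6 * np.pi, 100))
img = markov_transition_field(augment(x))
```

The resulting square image preserves where in the series each value pair sits, which is what lets an image model such as PatchCore pick up sequential structure.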
Funding: Special Project for Renovation and Procurement of Donghua University, Ministry of Education, China (No. CG202002845).
Abstract: In the era of big data, data application based on data governance has become an inevitable trend in the construction of smart campuses in higher education. This paper proposes a data governance system framework suitable for higher education that covers the whole data life cycle and, on this basis, applies data governance ideas and methods to the construction of a data management system for the basic development status of faculties, drawing on the data governance practice of Donghua University. The system forms a closed loop over all aspects of the college's basic development status data, including collection, information feedback, and statistical analysis. While optimizing the management of higher education, it provides a scientific and reliable basis for precise decision-making and strategic development.
Funding: Supported by a Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea government (Grant No. 20214000000140, Graduate School of Convergence for Clean Energy Integrated Power Generation), a Korea Basic Science Institute (National Research Facilities and Equipment Center) grant funded by the Ministry of Education (2021R1A6C101A449), and a National Research Foundation of Korea grant funded by the Ministry of Science and ICT (2021R1A2C1095139), Republic of Korea.
Abstract: Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic produces a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, GAN-aided GRU, was extensively evaluated in various predictive scenarios, such as interpolation, extrapolation, and limited dataset sizes. The model exhibited significant predictability and improved generalizability in estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations, a performance attributable to hyperparameter optimization, GAN-based data augmentation, and the inherent suitability of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study offers a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
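The GRU recurrence at the core of the GAN-aided GRU architecture can be written out explicitly. The sketch below is a from-scratch NumPy illustration of the standard GRU update equations, not the authors' trained model; the input size, hidden size, and random weights are hypothetical stand-ins.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    """One GRU step: W, U, b stack the update (z), reset (r), and
    candidate (n) parameters as 3 x hidden x (input | hidden) arrays."""
    z = sigmoid(W[0] @ x + U[0] @ h + b[0])          # update gate
    r = sigmoid(W[1] @ x + U[1] @ h + b[1])          # reset gate
    n = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])    # candidate state
    return (1 - z) * h + z * n                       # interpolated new state

# Hypothetical dimensions: a 2-feature input (e.g. strain-step and a
# condition variable) and a hidden state of size 4; weights are random
# stand-ins, not fitted to any flow-curve data.
rng = np.random.default_rng(0)
d_in, d_h = 2, 4
W = rng.normal(size=(3, d_h, d_in))
U = rng.normal(size=(3, d_h, d_h))
b = np.zeros((3, d_h))
h = np.zeros(d_h)
for x in rng.normal(size=(10, d_in)):  # unroll over a short input sequence
    h = gru_step(x, h, W, U, b)
```

Because the update gate interpolates between the previous state and a bounded candidate, the hidden state stays in (-1, 1), one reason recurrent gating behaves stably when extrapolating along a flow curve.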
Abstract: Reliability evaluation of the insulated gate bipolar transistors (IGBTs) in electric vehicles faces challenges such as junction temperature measurement and limited computational and storage resources. In this paper, a junction temperature estimation approach based on a neural network, with no additional hardware cost, is proposed, and the IGBT lifetime is calculated using electric vehicle big data. The inputs are the direct current (DC) voltage, operating current, switching frequency, negative temperature coefficient (NTC) thermistor temperature, and IGBT lifetime; the output is the junction temperature T_(j). Using the rain-flow counting method, the classified irregular temperature cycles are fed into a lifetime model to obtain the cycles to failure, and a fatigue accumulation method is then used to calculate the IGBT lifetime. To work within the limited computational and storage resources of electric vehicle controllers, the lifetime calculation runs on a big data platform, and the result is transmitted wirelessly to the vehicles as an input to the neural network, so that the junction temperature of the IGBT under long-term operating conditions can be accurately estimated. A test platform combining the motor controller with the vehicle big data server was built for IGBT accelerated aging tests. Lifetime predictions were then derived from junction temperatures estimated by both the neural network method and the thermal network method. The experiments show that lifetime prediction based on a neural network with big data is more accurate than that based on the thermal network, improving the reliability evaluation of the system.
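The fatigue accumulation step described above is, in essence, Miner's rule applied to rain-flow-counted temperature cycles. The sketch below assumes a Coffin-Manson-style lifetime model N_f(ΔT) = A·ΔT^(-n); the constants A and n and the example cycle counts are made-up illustrations, not fitted values from the paper.

```python
# Hypothetical Coffin-Manson-style lifetime model: cycles to failure as a
# function of junction temperature swing dT (constants are illustrative).
A, n = 3.0e14, 5.0

def cycles_to_failure(dT):
    return A * dT ** (-n)

def accumulated_damage(counted_cycles):
    """Miner's-rule fatigue accumulation over rain-flow-counted cycles.
    counted_cycles: list of (dT, count) pairs; failure is predicted when
    the damage sum D = sum(count_i / N_f(dT_i)) reaches 1."""
    return sum(count / cycles_to_failure(dT) for dT, count in counted_cycles)

# Example: the kind of (swing, count) histogram a rain-flow count over a
# junction temperature history would produce (numbers are illustrative).
cycles = [(20.0, 1.0e5), (40.0, 2.0e4), (60.0, 5.0e3)]
D = accumulated_damage(cycles)
remaining_fraction = max(0.0, 1.0 - D)   # crude remaining-life estimate
```

Note how strongly the exponent weights large swings: with n = 5, one 60 K cycle consumes as much life as hundreds of 20 K cycles, which is why accurate junction temperature estimation matters for the lifetime result.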
Funding: Supported by the Meteorological Soft Science Project (Grant No. 2023ZZXM29), the Natural Science Fund Project of Tianjin, China (Grant No. 21JCYBJC00740), and the Key Research and Development-Social Development Program of Jiangsu Province, China (Grant No. BE2021685).
Abstract: As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) is the established standard metric for quantifying turbulence in civil aviation. This study explores a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies in quick access recorder (QAR) data, treating atmospheric turbulence detection as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events, and comparisons with alternative machine learning techniques indicate that it is the best-performing methodology among those tested. In summary, symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to established EDR approaches and surpassing the machine learning algorithms evaluated. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amid rising environmental and operational hazards.
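Symbolic classification by genetic programming means evolving an expression tree whose sign classifies each sample. The toy sketch below (illustrative only, far simpler than the paper's system: synthetic two-feature data instead of QAR channels, and a bare mutate-and-select loop instead of a full GP with crossover) shows the core idea of searching expression space for a classifier.

```python
import random, operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def rand_tree(depth):
    """Random expression tree over two features x0, x1 and constants."""
    if depth == 0 or random.random() < 0.3:
        if random.random() < 0.7:
            return ("x", random.randrange(2))       # feature leaf
        return ("c", random.uniform(-1, 1))         # constant leaf
    op = random.choice(list(OPS))
    return (op, rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(t, x):
    if t[0] == "x":
        return x[t[1]]
    if t[0] == "c":
        return t[1]
    return OPS[t[0]](evaluate(t[1], x), evaluate(t[2], x))

def mutate(t):
    """Replace a random subtree with a fresh one."""
    if t[0] in ("x", "c") or random.random() < 0.3:
        return rand_tree(2)
    if random.random() < 0.5:
        return (t[0], mutate(t[1]), t[2])
    return (t[0], t[1], mutate(t[2]))

def accuracy(t, data):
    """Classify by the sign of the evolved expression."""
    return sum((evaluate(t, x) > 0) == y for x, y in data) / len(data)

random.seed(1)
# Synthetic stand-in for turbulence labels: "anomalous" iff x0 * x1 > 0.25.
xs = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
data = [(x, x[0] * x[1] > 0.25) for x in xs]

best = max((rand_tree(3) for _ in range(50)), key=lambda t: accuracy(t, data))
for _ in range(200):                 # mutate-and-select evolution loop
    child = mutate(best)
    if accuracy(child, data) >= accuracy(best, data):
        best = child
```

A full GP system would add a population, crossover, and parsimony pressure, but the evolved-expression-as-classifier structure is the same.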