BACKGROUND Clinical belonging refers to the feeling that clinical medical staff are recognized and accepted by others or by groups. The level of clinical belonging of nursing interns affects students' learning motivation and confidence, which in turn affects their clinical practice behavior. AIM To explore the effects of professional identity and nursing information ability on clinical belonging among nursing interns and to establish a relationship model for these factors. METHODS Convenience sampling was used to select 682 nursing interns from China. The survey was conducted using a general information questionnaire, a clinical sense of belonging scale, a nursing information ability self-assessment scale, and a nursing student professional identity questionnaire. The mediating effect of nursing information ability between professional identity and clinical sense of belonging was analyzed using SPSS 21.0 and path analysis in structural equation modeling. RESULTS The total scores of clinical belonging, professional identity, and nursing information ability of the nursing interns were (104.29±13.11), (57.89±7.16), and (70.29±6.20) points, respectively. Nursing information ability had a direct effect on the clinical sense of belonging (effect value=0.46, P<0.05). Professional identity had a direct effect (effect value=0.52, P<0.05) and an indirect effect (effect value=0.21, P<0.05) on clinical belonging. CONCLUSION Nursing administrators in nursing colleges and hospitals should take effective measures to improve the professional identity, nursing information ability, and clinical sense of belonging of nursing interns.
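The reported direct effect (0.52) and indirect effect (0.21) follow the standard product-of-paths logic of mediation analysis. As a hedged illustration (synthetic data and hypothetical path coefficients, not the study's data or its SPSS workflow), the sketch below shows the decomposition: the indirect effect is the product of the identity-to-ability and ability-to-belonging slopes, and in linear least squares the total effect decomposes exactly into direct plus indirect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 682  # sample size matching the study; the data below are synthetic

# Hypothetical generating process: identity -> ability -> belonging,
# plus a direct identity -> belonging path (coefficients are invented).
identity = rng.normal(size=n)
ability = 0.4 * identity + rng.normal(size=n)
belonging = 0.5 * identity + 0.45 * ability + rng.normal(size=n)

def ols(y, *preds):
    """Least-squares slopes of y on the given predictors (intercept dropped)."""
    X = np.column_stack([np.ones(len(y)), *preds])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols(ability, identity)[0]                    # path a: identity -> ability
b, c_direct = ols(belonging, ability, identity)  # path b and direct effect c'
c_total = ols(belonging, identity)[0]            # total effect c

indirect = a * b  # mediated (indirect) effect
# In linear OLS mediation the decomposition c = c' + a*b holds exactly.
```

This algebraic identity is why path analysis can report direct and indirect effects that sum to the total effect, as in the abstract's results.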
This paper was motivated by existing problems of cloud data storage at Imo State University, Nigeria, such as outsourced data causing loss of data and misuse of customer information by unauthorized users or hackers, leaving customer/client data visible and unprotected. Clients were also exposed to enormous risk from defective equipment, bugs, faulty servers, and suspicious actions. The aim of this paper is therefore to analyze a secure model using Unicode Transformation Format (UTF) and Base64 algorithms for secure storage of data in the cloud. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) were integrated, to enhance security, into an information system developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. The paper also discusses related concepts: the development of cloud computing, characteristics of cloud computing, cloud deployment models, cloud service models, etc. The results showed that the proposed enhanced security model for a corporate-platform information system handles multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which in turn redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical-location validation. The newly developed system addresses the shortcomings of the existing systems and reduces the time and resources incurred while using them.
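The UTF/Base64 transform named in the abstract can be sketched with Python's standard library. One caveat worth stating plainly: Base64 is an encoding, not encryption, so in a real deployment it would sit alongside the RBAC and MFA layers the paper describes rather than replace them. The record string below is hypothetical.

```python
import base64

def encode_record(plaintext: str) -> str:
    """UTF-8 encode a record, then wrap it in Base64 for transport/storage."""
    return base64.b64encode(plaintext.encode("utf-8")).decode("ascii")

def decode_record(token: str) -> str:
    """Reverse the Base64 wrapping and UTF-8 decoding."""
    return base64.b64decode(token.encode("ascii")).decode("utf-8")

# Round-trip a hypothetical client record before cloud upload
token = encode_record("student-id:2024/001")
restored = decode_record(token)
```

The encoding guarantees the stored bytes are plain ASCII, which simplifies transport, but anyone holding the token can decode it; confidentiality must come from the access-control layers.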
Traditional methods for selecting models in experimental data analysis are susceptible to researcher bias, hindering exploration of alternative explanations and potentially leading to overfitting. The Finite Information Quantity (FIQ) approach offers a novel solution by acknowledging the inherent limits on the information-processing capacity of physical systems. This framework facilitates the development of an objective criterion for model selection (comparative uncertainty) and paves the way for a more comprehensive understanding of phenomena through exploring diverse explanations. This work presents a detailed comparison of the FIQ approach with ten established model selection methods, highlighting the advantages and limitations of each. We demonstrate the potential of FIQ to enhance the objectivity and robustness of scientific inquiry through three practical examples: selecting appropriate models for measuring fundamental constants, sound velocity, and underwater electrical discharges. Further research is warranted to explore the full applicability of FIQ across scientific disciplines.
Whole-process project cost management based on building information modeling (BIM) is a new management method that aims to comprehensively optimize and improve project cost management through the application of BIM technology. This paper summarizes and analyzes whole-process project cost management based on BIM and explores its application and development prospects in the construction industry. First, the paper introduces the role and advantages of BIM technology in engineering cost management, including information integration, data sharing, and collaborative work. Second, it analyzes the key technologies and methods of BIM-based whole-process project cost management, including model construction, data management, and cost control. The paper also discusses the challenges and limitations of this approach, such as inconsistent technical standards, personnel training, and the required change in mindset. Finally, it summarizes the advantages and development prospects of BIM-based whole-process project cost management and puts forward directions and suggestions for future research. This work can serve as a reference for construction cost management and promote innovation and development in the construction industry.
As a branch of computer science, information visualization aims to help users understand and analyze complex data through graphical interfaces and interactive technologies. Information visualization primarily encompasses visual structures such as time-series structures, spatial relationship structures, statistical distribution structures, and geographic map structures, each with unique functions and application scenarios. To better explain the cognitive process of visualization, researchers have proposed various cognitive models based on interaction mechanisms, the steps of visual perception, and novices' use of visualization. These models help explain user cognition in information visualization, enhancing the effectiveness of data analysis and decision-making.
With the acceleration of social informatization, information awareness and information skills have become basic qualities for every citizen. Establishing the training mechanism for scientific and technological innovation talents only from the beginning of higher education is insufficient to meet the needs of the times. It is imperative to improve the training of information technology innovation talents and explore a new training model. This paper describes the state of education in the field of information technology from domestic and international perspectives, analyzes existing problems, explores new models and implementation suggestions, and concludes with prospects for future work.
Media and Information Literacy (MIL) is one of the most important topics in today's mediatized world. Under the leadership of the United Nations Educational, Scientific and Cultural Organization (UNESCO), many international organizations, acting as foreign donors, annually announce projects and grants for the promotion and development of MIL in countries around the world. One of the main actors in this movement is DW Akademie, with various media and MIL projects in several countries. This research paper delves into the role of DW Akademie's MIL model in shaping a media-savvy generation. The study explores the theoretical underpinnings and practical applications of the Deutsche Welle (DW) Akademie MIL model, analysing its effectiveness in fostering media literacy skills. The research employs a multi-faceted approach, incorporating case studies to assess the model's impact across diverse demographics. The paper also considers the model's alignment with global educational policies and proposes recommendations for its integration into broader frameworks. By investigating DW Akademie's MIL model, this research contributes to the ongoing discourse on media literacy education, providing valuable insights for educators, policymakers, and researchers. The findings offer a nuanced understanding of the model's role in cultivating a media-savvy generation poised to navigate the complexities of the information age.
The core educational function of higher vocational colleges is to train high-quality technical talents to meet society's needs at its current stage of development. Guided by talent training, higher vocational colleges need to establish an all-round, three-dimensional education model and promote innovation in higher vocational education on this basis. Vigorously promoting the "post, course, competition, certificate" integrated mode is one way to advance such innovation. Through the construction of this mode, the education model of higher vocational colleges is gradually improved, strengthening the effectiveness of talent training. Therefore, in this paper, the author puts forward suggestions for building the integrated education mode of the electronic information engineering technology major in higher vocational colleges, so as to help improve the talent training level of higher vocational colleges.
Computer vision-based inspection methods show promise for automating post-earthquake building inspections. These methods survey a building with unmanned aerial vehicles (UAVs) and automatically detect damage in the collected images. Nevertheless, assessing the damage's impact on structural safety requires localizing damage to specific building components with known design and function. This paper proposes a BIM-based automated inspection framework to provide context for visual surveys. A deep learning-based semantic segmentation algorithm is trained to automatically identify damage in images. The BIM automatically associates any identified damage with specific building components. Components are then classified into damage states consistent with component fragility models for integration with a structural analysis. To demonstrate the framework, methods are developed to photorealistically simulate severe structural damage in a synthetic computer graphics environment. A graphics model of a real building in Urbana, Illinois, is generated to test the framework; the model is integrated with a structural analysis to apply earthquake damage in a physically realistic manner. A simulated UAV survey of the graphics model is flown, and the framework is applied. The method achieves high accuracy in assigning damage states to visible structural components, enabling integration with a performance-based earthquake assessment to classify building safety.
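The component-level damage-state step can be given a minimal sketch: given a segmentation mask of damaged pixels and a BIM component-id mask over the same image, each component is assigned a state from its damaged-pixel fraction. The masks and thresholds below are purely illustrative assumptions, not the paper's trained network or its fragility models.

```python
import numpy as np

def damage_states(seg_mask, comp_mask, thresholds=(0.1, 0.4)):
    """Assign each BIM component a damage state (0, 1, or 2) from the
    fraction of its pixels flagged as damaged; thresholds are illustrative."""
    states = {}
    for cid in np.unique(comp_mask):
        if cid == 0:  # 0 = background, no component
            continue
        frac = seg_mask[comp_mask == cid].mean()
        states[int(cid)] = int(np.searchsorted(thresholds, frac, side="right"))
    return states

# Hypothetical 2x4 image: component 1 (left) undamaged,
# component 2 (right) with 3 of 4 pixels flagged as damage.
seg_mask = np.array([[0, 0, 1, 1],
                     [0, 0, 1, 0]])
comp_mask = np.array([[1, 1, 2, 2],
                      [1, 1, 2, 2]])
states = damage_states(seg_mask, comp_mask)
```

In the framework, the equivalent lookup runs against the BIM geometry projected into the survey images rather than a flat label mask, but the aggregation from pixels to component damage states follows the same pattern.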
Mechanical excavation, blasting, adjacent rockbursts, and fracture slip during mining excavation impose dynamic loads on the rock mass, leading to further fracturing of damaged surrounding rock under three-dimensional high stress and even causing disasters. Therefore, a novel complex true triaxial static-dynamic combined loading method is proposed, reflecting underground excavation damage followed by frequent intermittent disturbance failure. True triaxial static compression and intermittent disturbance tests are carried out on monzogabbro. The effects of intermediate principal stress and amplitude on the strength characteristics, deformation characteristics, failure characteristics, and failure precursors of monzogabbro are analyzed; intermediate principal stress and amplitude increase monzogabbro strength, and a tensile fracture mechanism is observed. Rapid increases in microseismic parameters during rock loading can serve as precursors of intermittent rock disturbance. Based on the experimental results, new damage fractional elements and a method taking crack initiation stress and crack unstable stress as the initiation and acceleration conditions of intermittent-disturbance irreversible deformation are proposed. A novel three-dimensional disturbance fractional deterioration model considering the intermediate principal stress effect and the intermittent disturbance damage effect is established, and the model's predictions align well with the experimental results. The sensitivity of stress states and model parameters is further explored, and the intermittent disturbance behaviors at different f are predicted. This study provides a valuable theoretical basis for stability analysis in deep mining engineering under dynamic loads.
There are five widely used contact angle schemes in the pseudopotential lattice Boltzmann (LB) model for simulating wetting phenomena: the pseudopotential-based scheme (PB scheme); the improved virtual-density scheme (IVD scheme); the modified pseudopotential-based scheme with a ghost fluid layer constructed using the fluid density of the layer above the wall (MPB-C scheme); the modified pseudopotential-based scheme with a ghost fluid layer constructed using the weighted average density of surrounding fluid nodes (MPB-W scheme); and the geometric formulation scheme (GF scheme). However, the numerical stability and accuracy of these schemes for wetting simulation have remained unclear. In this paper, their numerical stability and accuracy are clarified for the first time by applying the five schemes to simulate a two-dimensional (2D) sessile droplet on a wall and capillary imbibition in a 2D channel, as examples of static and dynamic wetting simulations, respectively. (i) The contact angles simulated by the GF scheme are consistent across density ratios for the same prescribed contact angle, whereas the contact angles simulated by the PB, IVD, MPB-C, and MPB-W schemes change with density ratio for the same fluid-solid interaction strength. The PB scheme is found to be the most unstable scheme for simulating static wetting at increased density ratios. (ii) Although the spurious velocity increases with the liquid/vapor density ratio for all the contact angle schemes, its magnitude in the PB, IVD, and GF schemes is smaller than in the MPB-C and MPB-W schemes. (iii) The fluid density variation near the wall is most significant in the PB scheme; the variation is diminished in the IVD, MPB-C, and MPB-W schemes and disappears entirely in the GF scheme. (iv) For the simulation of capillary imbibition, the MPB-C, MPB-W, and GF schemes simulate the dynamics of the liquid-vapor interface well, with the GF scheme being the most accurate. The accuracy of the IVD scheme is low at a small contact angle (44 degrees) but high at a large contact angle (60 degrees). The PB scheme is the least accurate in simulating the dynamics of the liquid-vapor interface. Overall, the GF scheme is the most recommended for simulating static or dynamic wetting, while the PB scheme is the least recommended.
At present, database cache models for power information systems suffer from slow running speed and a low database hit rate. To this end, this paper proposes a database cache model for power information systems based on deep machine learning. The caching model includes a program cache, Structured Query Language (SQL) preprocessing, and a core caching module. Statement efficiency is improved by adjusting operations such as multi-table joins and keyword replacement in the SQL optimizer. In the core caching module, predictive models are built using boosted regression trees: a series of regression tree models is generated by the machine learning algorithm, the resource occupancy of the power information system is analyzed to dynamically adjust the voting selection of the regression trees, and the voting threshold of the prediction model is dynamically adjusted as well, after which the cache model is re-initialized. The experimental results show that the model achieves a good cache hit rate and cache efficiency and can improve the data caching performance of the power information system. It attains a high hit rate and short delay time, maintains a good hit rate under different amounts of computer memory, and occupies little space and CPU during actual operation, which helps the power information system operate efficiently and quickly.
In the field of target recognition based on temporal-spatial information fusion, evidence theory has received extensive attention. To achieve accurate and efficient target recognition with evidence theory, an adaptive temporal-spatial information fusion model is proposed. First, an adaptive evaluation and correction mechanism is constructed from the evidence distance and Deng entropy, realizing credibility discrimination and adaptive correction of spatial evidence. Second, a credibility decay operator is introduced to obtain the dynamic credibility of temporal evidence. Finally, the sequential combination of temporal-spatial evidence is achieved by Shafer's discounting criterion and Dempster's combination rule. The simulation results show that the proposed method not only accounts for the dynamic and sequential characteristics of temporal-spatial evidence combination but also has a strong capability for processing conflicting information, providing a new reference for the field of temporal-spatial information fusion.
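Dempster's combination rule, the final step named in the abstract, can be sketched as follows. The mass functions here are hypothetical two-sensor reports on targets A and B; the credibility-decay operator and Deng-entropy correction of the proposed model are not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination of two mass functions
    (dicts mapping frozenset focal elements to mass), normalized by
    the total conflict mass."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible focal elements reinforce their intersection
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:      # disjoint focal elements contribute to conflict
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical reports from two sensors on targets A and B
A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.9, B: 0.1}
m2 = {A: 0.8, B: 0.2}
fused = dempster_combine(m1, m2)
```

The normalization by 1 − K (here K = 0.26) is exactly the step that behaves badly under high conflict, which is why the paper's credibility discrimination and discounting of unreliable evidence matter before combination.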
The escalating costs of research and development, coupled with the influx of researchers, have led to a surge in published articles across scientific disciplines. However, concerns have arisen regarding the accuracy, validity, and reproducibility of reported findings. Issues such as replication problems, fraudulent practices, and a lack of expertise in measurement theory and uncertainty analysis have raised doubts about the reliability and credibility of scientific research. Rigorous assessment practices in certain fields highlight the importance of identifying potential errors and understanding the relationship between technical parameters and research outcomes. To address these concerns, a universally applicable criterion called comparative certainty is urgently needed. This criterion, grounded in an analysis of the modeling process and of information transmission, accumulation, and transformation in both theoretical and applied research, aims to evaluate the acceptable deviation between a model and the observed phenomenon. It provides a theoretically grounded framework applicable to all scientific disciplines adhering to the International System of Units (SI). Objective evaluations based on this criterion can enhance the reproducibility and reliability of scientific investigations, instilling greater confidence in published findings. Establishing this criterion would be a significant stride towards ensuring the robustness and credibility of scientific research across disciplines.
Success or failure of an e-commerce platform is often reduced to its ability to maximize the conversion rate of its visitors, commonly understood as its capacity to induce a purchase. Visitors possess individual characteristics, histories, and objectives that complicate the choice of which platform features maximize the conversion rate. Modern web technology has made clickstream data accessible, allowing a complete record of a visitor's actions on a website to be analyzed. What remains poorly constrained is which parts of the clickstream data are meaningful information, and which are incidental, for the problem of platform design. In this research, clickstream data from an online retailer was examined to demonstrate how statistical modeling can improve the use of clickstream information. A conceptual model was developed that conjectured relationships between visitor and platform variables, visitors' platform exit rate, bounce rate, and decision to purchase. Several hypotheses on the nature of the clickstream relationships were posited and tested with the models. A discrete choice logit model showed that the content of a website, the history of website use, and the exit rate of pages visited had marginal effects on the derived utility for the visitor. Exit rate and bounce rate were modeled as beta-distributed random variables. It was found that the exit rate, and its variability, for pages visited were associated with site content, site quality, prior visitor history on the site, and the technological preferences of the visitor. Bounce rate was found to be influenced by the same factors, but in a direction opposite to the registered hypotheses. Most findings supported the view that clickstream data is amenable to statistical modeling with interpretable and comprehensible models.
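Modeling a rate as a beta-distributed random variable, as the study does for exit and bounce rates, can be illustrated with a simple method-of-moments fit. The sample below is synthetic, drawn from a known Beta(2, 5), not the retailer's clickstream; the study's actual estimation procedure may differ.

```python
import numpy as np

def beta_mom(rates):
    """Method-of-moments fit of Beta(alpha, beta) to rates in (0, 1):
    solve the mean and variance equations of the beta distribution."""
    m, v = rates.mean(), rates.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

# Synthetic page-level exit rates with known true parameters (2, 5)
rng = np.random.default_rng(42)
sample = rng.beta(2.0, 5.0, size=50_000)
alpha_hat, beta_hat = beta_mom(sample)
```

Because the beta family captures both the level and the variability of a bounded rate, covariates such as site content or visitor history can then be linked to the fitted parameters, which is the spirit of the modeling reported above.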
When building a model of a physical phenomenon or process, scientists face an inevitable compromise between the simplicity of the model (its qualitative-quantitative set of variables) and its accuracy. For hundreds of years, the visual simplicity of a law testified to the genius and depth of the physical thinking of the scientist who proposed it. Currently, the desire for a deeper physical understanding of the surrounding world and of newly discovered physical phenomena motivates researchers to increase the number of variables considered in a model. This direction leads to an increased probability of choosing an inaccurate or even erroneous model. This study describes a method for estimating the limit of measurement accuracy that takes into account the model-building stage in terms of the storage, transmission, processing, and use of information by the observer. This limit, due to the finite amount of information stored in the model, allows one to select the optimal number of variables for the best reproduction of the observed object and to calculate exact values of the threshold discrepancy between the model and the phenomenon under study in measurement theory. We consider two examples: measurement of the speed of sound and measurement of physical constants.
As a novel paradigm, semantic communication provides an effective way to break through the development dilemma facing classical communication systems. However, how to measure the information transmission capability of a given semantic communication method, and subsequently compare it with a classical communication method, remains an unsolved problem. In this paper, we first review the semantic communication system, including its system model and two typical coding and transmission methods for its implementation. To address the unsolved issue of measuring the information transmission capability of semantic communication methods, we propose a new universal performance measure called Information Conductivity. We give its definition and physical significance to establish its effectiveness in representing the information transmission capabilities of semantic communication systems, and present elaborations including its measurement methods, degrees of freedom, and progressive analysis. Experimental results in image transmission scenarios validate its practical applicability.
Analysis of catchment land use/land cover (LULC) change is a vital tool for ensuring sustainable catchment management. This study analyzed land use/land cover changes in the Rwizi catchment, south-western Uganda, from 1989 to 2019 and projected the trend to 2040. Landsat images, field observations, key informant interviews, and focus group discussions were used to collect data. Changes in cropland, forestland, built-up area, grazing land, wetland, and open water bodies were analyzed in ArcGIS version 10.2.2 and ERDAS IMAGINE 14, together with a Markov chain model. All LULC classes increased in area except grazing land. Forest land and built-up area increased by 370.03% and 229.53%, respectively, between 2009 and 2019. Projections revealed an increase in forest land and built-up area by 2030, and in built-up area only by 2040. LULC change in the catchment results from population pressure, reduced soil fertility, and the high value of agricultural products.
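A Markov chain projection of the kind used in the study can be sketched as repeated multiplication of class shares by a transition matrix. The three-class matrix and shares below are hypothetical stand-ins (the study used six classes and dedicated GIS software to estimate transitions from the classified Landsat maps).

```python
import numpy as np

# Hypothetical transition probabilities over one ~decade-long period,
# rows: from-class, columns: to-class (cropland, forest, built-up).
P = np.array([
    [0.90, 0.05, 0.05],   # cropland mostly persists, some conversion
    [0.10, 0.85, 0.05],   # forest partly cleared to cropland
    [0.00, 0.00, 1.00],   # built-up assumed absorbing for illustration
])

shares_2019 = np.array([0.55, 0.30, 0.15])  # hypothetical 2019 class shares

def project(shares, P, steps):
    """Propagate class shares forward by `steps` transition periods."""
    return shares @ np.linalg.matrix_power(P, steps)

shares_2040 = project(shares_2019, P, 2)  # two periods: roughly 2019 -> 2040
```

Each row of P sums to 1, so the projected shares also sum to 1; with the absorbing built-up row, the built-up share can only grow, mirroring the direction of the study's 2040 projection.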
Cross-region innovation is widely recognized as an important source of long-term regional innovation capacity. In the recent past, a growing number of studies have investigated the network structure and mechanisms of cross-region innovation collaboration in various contexts. However, existing research focuses mainly on physical effects, such as geographical distance and high-speed railway connections, ignoring the intangible drivers present in a changing environment: a more digitalized economy and an increasingly solidified innovation network structure. The focus of this study is therefore on estimating the determinants of innovation networks, especially the intangible drivers that have been largely neglected so far. Using city-level data on Chinese patents (excluding Hong Kong, Macao, and Taiwan Province of China), we trace innovation networks across Chinese cities over a long period. By integrating a measure of the Information and Communications Technology (ICT) development gap and network structural effects into the general proximity framework, this paper explores the changing mechanisms of Chinese innovation networks from a new perspective. The results show that the structure of cross-region innovation networks in China has changed. As the mechanisms behind this development, the results confirm the increasingly important role of intangible drivers in Chinese inter-city innovation collaboration when controlling for effects of physical proximity, such as geographical distance. Since digitalization and coordinated development are mainstream trends in China and other developing countries, these countries' inter-city innovation collaboration patterns will change dramatically under the influence of intangible drivers.
In this study, the influence of confined concrete models on the response of reinforced concrete (RC) structures is investigated at the member and global system levels. Commonly encountered confined concrete models, such as the Modified Kent-Park, Saatçioğlu-Razvi, and Mander models, are considered. Two moment-resisting frames designed according to a pre-modern code are taken into consideration to represent RC moment-resisting frames in the current building stock. The building is in an earthquake-prone zone located on Z3 soil type. The inelastic response of the building frame is modelled by considering the plastic hinges formed on each beam and column element for different concrete classes and stirrup spacings. The models are subjected to non-linear static analyses. The differences between confined concrete models are comparatively investigated at both the reinforced concrete member and system levels. The comparative analysis reveals that column behaviour is most influenced by the choice of model, due to axial loads and confinement effects, while beams are less affected; it is also observed that the differences exhibited in the moment-curvature response of column cross-sections do not significantly affect the overall behaviour of the global system. This highlights the critical role of model selection relative to the concrete strength and stirrup spacing of the member.
Abstract: BACKGROUND Clinical belonging refers to the feeling that clinical medical staff are recognized and accepted by others or by groups. The level of clinical belonging of nursing interns affects students' learning motivation and confidence, which in turn affects their clinical practice behavior. AIM To explore the effects of professional identity and nursing information ability on clinical belonging among nursing interns and to establish a relationship model for these factors. METHODS The researchers used convenience sampling to select 682 nursing interns from China. The survey was conducted using a general information questionnaire, a clinical sense of belonging scale, a nursing information ability self-assessment scale, and a nursing student professional identity questionnaire. The mediating effect of nursing information ability between professional identity and clinical sense of belonging was analyzed using SPSS 21.0 and path analysis in structural equation modeling. RESULTS The total scores for clinical belonging, professional identity, and nursing information ability of nursing interns were (104.29 ± 13.11), (57.89 ± 7.16), and (70.29 ± 6.20) points, respectively. Nursing information ability had a direct effect on the clinical sense of belonging (effect value = 0.46, P < 0.05). Professional identity had a direct effect (effect value = 0.52, P < 0.05) and an indirect effect (effect value = 0.21, P < 0.05) on clinical belonging. CONCLUSION Nursing administrators in nursing colleges and hospitals should take effective measures to improve the professional identity and nursing information ability of nursing interns, and thereby their clinical sense of belonging.
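The mediation structure analyzed above (direct path plus an indirect path through the mediator) can be sketched as follows. This is a minimal illustration of how a total effect decomposes in path analysis; the path coefficients used here are hypothetical placeholders, not the study's fitted values.

```python
# X = professional identity, M = nursing information ability (mediator),
# Y = clinical sense of belonging.

def mediation_effects(a, b, c_prime):
    """a: X->M path, b: M->Y path, c_prime: direct X->Y path."""
    indirect = a * b            # effect of X on Y transmitted through M
    total = c_prime + indirect  # total effect = direct + indirect
    return {"direct": c_prime, "indirect": indirect, "total": total}

# Hypothetical standardized coefficients, for illustration only.
effects = mediation_effects(a=0.45, b=0.46, c_prime=0.52)
print(effects)
```

In the study's terms, a significant product of the two mediator paths alongside a significant direct path would indicate partial mediation.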
Abstract: This paper was motivated by existing problems of cloud data storage at Imo State University, Nigeria, where outsourced data led to data loss and to the misuse of customer information by unauthorized users or hackers, leaving customer/client data visible and unprotected. This also exposed clients/customers to enormous risk from defective equipment, bugs, faulty servers, and suspicious actions. The aim of this paper is therefore to analyze a secure model that uses Unicode Transformation Format (UTF) and Base64 algorithms to store data in the cloud securely. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) algorithms were integrated to enhance the security of the information system, which was developed with HTML 5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. This paper also discusses concepts such as the development of cloud computing, characteristics of cloud computing, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for the information systems of a corporate platform handled multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which in turn redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical-location validation. The newly developed system addresses the shortcomings of the existing systems and reduces the time and resources incurred while using them.
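The UTF/Base64 transformation described above can be sketched as a simple round trip. Note that Base64 is a reversible encoding, not encryption, so in a system like the one described it would sit behind the RBAC and MFA layers; the record content below is a hypothetical example, not data from the paper.

```python
import base64

def encode_record(record: str) -> bytes:
    # UTF-8 encode the text, then Base64-encode the bytes for storage.
    return base64.b64encode(record.encode("utf-8"))

def decode_record(blob: bytes) -> str:
    # Reverse: Base64-decode, then UTF-8 decode back to text.
    return base64.b64decode(blob).decode("utf-8")

token = encode_record("student_id=IMSU-001;status=active")  # hypothetical record
assert decode_record(token) == "student_id=IMSU-001;status=active"
```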
Abstract: Traditional methods for selecting models in experimental data analysis are susceptible to researcher bias, hindering exploration of alternative explanations and potentially leading to overfitting. The Finite Information Quantity (FIQ) approach offers a novel solution by acknowledging the inherent limitations in the information processing capacity of physical systems. This framework facilitates the development of objective criteria for model selection (comparative uncertainty) and paves the way for a more comprehensive understanding of phenomena through exploring diverse explanations. This work presents a detailed comparison of the FIQ approach with ten established model selection methods, highlighting the advantages and limitations of each. We demonstrate the potential of FIQ to enhance the objectivity and robustness of scientific inquiry through three practical examples: selecting appropriate models for measuring fundamental constants, sound velocity, and underwater electrical discharges. Further research is warranted to explore the full applicability of FIQ across various scientific disciplines.
Abstract: Whole-process project cost management based on building information modeling (BIM) is a new management method that aims to comprehensively optimize and improve project cost management through the application of BIM technology. This paper summarizes and analyzes whole-process project cost management based on BIM, aiming to explore its application and development prospects in the construction industry. Firstly, this paper introduces the role and advantages of BIM technology in engineering cost management, including information integration, data sharing, and collaborative work. Secondly, the paper analyzes the key technologies and methods of whole-process project cost management based on BIM, including model construction, data management, and cost control. In addition, the paper discusses the challenges and limitations of whole-process BIM project cost management, such as inconsistent technical standards, personnel training, and changes in awareness. Finally, the paper summarizes the advantages and development prospects of whole-process project cost management based on BIM and puts forward directions and suggestions for future research. This research can serve as a reference for construction cost management and promote innovation and development in the construction industry.
Abstract: As a branch of computer science, information visualization aims to help users understand and analyze complex data through graphical interfaces and interactive technologies. Information visualization primarily includes various visual structures, such as time-series structures, spatial relationship structures, statistical distribution structures, and geographic map structures, each with unique functions and application scenarios. To better explain the cognitive process of visualization, researchers have proposed various cognitive models based on interaction mechanisms, visual perception steps, and novices' use of visualization. These models help in understanding user cognition in information visualization, enhancing the effectiveness of data analysis and decision-making.
Abstract: With the acceleration of societal informatization, information awareness and information skills have become basic qualities for every citizen. Establishing the training mechanism for scientific and technological innovation talents only at the start of higher education is insufficient to meet the needs of the times. It is imperative to improve the training of information technology innovation talents and to explore a new training model. This paper describes the general state of education in the field of information technology from domestic and international perspectives. It then analyzes existing problems, explores new training models and implementation suggestions, and closes with future prospects.
Abstract: Media and Information Literacy (MIL) is one of the most important topics in today's mediatized world. Under the leadership of the United Nations Educational, Scientific and Cultural Organization (UNESCO), many international organizations, acting as foreign donors, annually announce projects and grants for the promotion and development of MIL in countries around the world. One of the main actors in this movement is DW Akademie, with various media and MIL projects in several countries. This research paper delves into the role of DW Akademie's MIL model in shaping a media-savvy generation. The study explores the theoretical underpinnings and practical applications of the Deutsche Welle (DW) Akademie MIL model, analysing its effectiveness in fostering media literacy skills. The research employs a multi-faceted approach, incorporating case studies to assess the model's impact across diverse demographics. The paper also considers the model's alignment with global educational policies and proposes recommendations for its integration into broader frameworks. By investigating DW Akademie's MIL model, this research contributes to the ongoing discourse on media literacy education, providing valuable insights for educators, policymakers, and researchers. The findings offer a nuanced understanding of the model's role in cultivating a media-savvy generation poised to navigate the complexities of the information age.
Abstract: The core educational function of higher vocational colleges is to train high-quality technical talents to meet society's needs at its current stage of development. Guided by talent training, higher vocational colleges need to establish an all-round, three-dimensional education model and promote the innovation of higher vocational education on that basis. Vigorously promoting the "post, course, competition, certificate" mode is one way to drive this innovation. Through the construction of the "post, course, competition, certificate" mode, the education model of higher vocational colleges is gradually improved, strengthening the effectiveness of talent training. Therefore, in this paper, the author puts forward suggestions for building an integrated education mode for the electronic information engineering technology major in higher vocational colleges, so as to help improve the talent training level of these colleges.
Funding: Financial support for this research was provided in part by the US Army Corps of Engineers through a subaward from the University of California, San Diego, USA.
Abstract: Computer vision-based inspection methods show promise for automating post-earthquake building inspections. These methods survey a building with unmanned aerial vehicles (UAVs) and automatically detect damage in the collected images. Nevertheless, assessing the damage's impact on structural safety requires localizing damage to specific building components with known design and function. This paper proposes a BIM-based automated inspection framework to provide context for visual surveys. A deep learning-based semantic segmentation algorithm is trained to automatically identify damage in images. The BIM automatically associates any identified damage with specific building components. Components are then classified into damage states consistent with component fragility models for integration with a structural analysis. To demonstrate the framework, methods are developed to photorealistically simulate severe structural damage in a synthetic computer graphics environment. A graphics model of a real building in Urbana, Illinois, is generated to test the framework; the model is integrated with a structural analysis to apply earthquake damage in a physically realistic manner. A simulated UAV survey of the graphics model is flown and the framework is applied. The method achieves high accuracy in assigning damage states to visible structural components. This assignment enables integration with a performance-based earthquake assessment to classify building safety.
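The component classification step above can be sketched as a mapping from a segmentation result to a discrete damage state. The thresholds and state labels below are hypothetical placeholders standing in for the limits a component fragility model would supply.

```python
def damage_state(damaged_pixels: int, component_pixels: int) -> str:
    """Classify a component by the fraction of its pixels flagged as
    damaged in the semantic segmentation mask (thresholds hypothetical)."""
    ratio = damaged_pixels / component_pixels
    if ratio < 0.05:
        return "DS0 (none/slight)"
    elif ratio < 0.20:
        return "DS1 (moderate)"
    elif ratio < 0.50:
        return "DS2 (extensive)"
    return "DS3 (complete)"

# Example: a column with 30% of its visible area flagged as damaged.
print(damage_state(damaged_pixels=30, component_pixels=100))
```

In the framework described, the per-component state would then feed the performance-based earthquake assessment.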
Funding: The authors acknowledge financial support from the National Natural Science Foundation of China (No. 52109119), the Guangxi Natural Science Foundation (No. 2021GXNSFBA075030), the Guangxi Science and Technology Project (No. Guike AD20325002), the Chinese Postdoctoral Science Fund Project (No. 2022M723408), and the Open Research Fund of the State Key Laboratory of Simulation and Regulation of Water Cycle in River Basin (China Institute of Water Resources and Hydropower Research) (No. IWHR-SKL-202202).
Abstract: Mechanical excavation, blasting, adjacent rockbursts, and fracture slip that occur during mining excavation impose dynamic loads on the rock mass, leading to further fracture of damaged surrounding rock under three-dimensional high stress and even causing disasters. Therefore, a novel complex true triaxial static-dynamic combined loading method is proposed that reflects underground excavation damage followed by frequent intermittent disturbance failure. True triaxial static compression and intermittent disturbance tests are carried out on monzogabbro. The effects of intermediate principal stress and amplitude on the strength characteristics, deformation characteristics, failure characteristics, and failure precursors of monzogabbro are analyzed; intermediate principal stress and amplitude increase monzogabbro's strength and promote a tensile fracture mechanism. Rapid increases in microseismic parameters during rock loading can serve as precursors of intermittent rock disturbance failure. Based on the experimental results, new damage fractional elements and a method taking crack initiation stress and crack unstable stress as the initiation and acceleration conditions of intermittent-disturbance irreversible deformation are proposed. A novel three-dimensional disturbance fractional deterioration model considering the intermediate principal stress effect and the intermittent disturbance damage effect is established, and the model's predictions align well with the experimental results. The sensitivity of stress states and model parameters is further explored, and the intermittent disturbance behaviors at different f are predicted. This study provides valuable theoretical bases for the stability analysis of deep mining engineering under dynamic loads.
Funding: This work was sponsored by the National Natural Science Foundation of China under Grant No. 52206101, the Shanghai Sailing Program under Grant No. 20YF1431200, and the Experiments for Space Exploration Program and the Qian Xuesen Laboratory, China Academy of Space Technology, under Grant No. TKTSPY-2020-01-01.
Abstract: There are five widely used contact angle schemes in the pseudopotential lattice Boltzmann (LB) model for simulating wetting phenomena: the pseudopotential-based scheme (PB scheme), the improved virtual-density scheme (IVD scheme), the modified pseudopotential-based scheme with a ghost fluid layer constructed using the fluid layer density above the wall (MPB-C scheme), the modified pseudopotential-based scheme with a ghost fluid layer constructed using the weighted average density of surrounding fluid nodes (MPB-W scheme), and the geometric formulation scheme (GF scheme). The numerical stability and accuracy of these schemes for wetting simulation have, however, remained unclear. In this paper, their numerical stability and accuracy are clarified for the first time by applying the five schemes to simulate a two-dimensional (2D) sessile droplet on a wall and capillary imbibition in a 2D channel, as examples of static and dynamic wetting simulations, respectively. (i) The contact angles simulated by the GF scheme are consistent across density ratios for the same prescribed contact angle, whereas those simulated by the PB, IVD, MPB-C, and MPB-W schemes change with the density ratio for the same fluid-solid interaction strength. The PB scheme is found to be the most unstable scheme for simulating static wetting at increased density ratios. (ii) Although the spurious velocity increases with the liquid/vapor density ratio for all contact angle schemes, its magnitude in the PB, IVD, and GF schemes is smaller than in the MPB-C and MPB-W schemes. (iii) The fluid density variation near the wall is most significant in the PB scheme; the variation is diminished in the IVD, MPB-C, and MPB-W schemes and disappears entirely in the GF scheme. (iv) For the simulation of capillary imbibition, the MPB-C, MPB-W, and GF schemes simulate the dynamics of the liquid-vapor interface well, with the GF scheme being the most accurate. The accuracy of the IVD scheme is low at a small contact angle (44 degrees) but high at a large contact angle (60 degrees), while the PB scheme is the least accurate in simulating the interface dynamics. Overall, the GF scheme is the most recommended, and the PB scheme the least recommended, for simulating either static or dynamic wetting.
Abstract: At present, the database cache model of power information systems suffers from slow running speed and a low database hit rate. To this end, this paper proposes a database cache model for power information systems based on deep machine learning. The caching model includes program caching, Structured Query Language (SQL) preprocessing, and core caching modules. Statement efficiency is improved by adjusting operations such as multi-table joins and keyword replacement in the SQL optimizer. In the core caching module, predictive models are built using boosted regression trees: machine learning algorithms generate a series of regression tree models, the resource occupancy rate of the power information system is analyzed to dynamically adjust the voting selection of the regression trees, the voting threshold of the prediction model is likewise adjusted dynamically, and the cache model is then re-initialized. The experimental results show that the model achieves a good cache hit rate and cache efficiency and can improve the data caching performance of the power information system. It attains a high hit rate and short delay time, maintains a good hit rate under different amounts of computer memory, and occupies little space and CPU during actual operation, which helps the power information system run efficiently and quickly.
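The voting mechanism described above can be sketched as follows: several tree predictors each estimate the probability that a query result will be re-used, and a resource-dependent threshold decides whether to cache it. The averaging rule, the occupancy-based threshold, and the numbers are hypothetical stand-ins for the paper's learned regression trees.

```python
def should_cache(scores, resource_occupancy):
    """scores: per-tree reuse-probability estimates in [0, 1];
    resource_occupancy: fraction of cache resources currently in use."""
    vote = sum(scores) / len(scores)            # average the tree votes
    threshold = 0.5 + 0.4 * resource_occupancy  # stricter when cache is full
    return vote >= threshold

# Same votes, different system load: admission tightens as occupancy rises.
assert should_cache([0.9, 0.8, 0.85], resource_occupancy=0.2) is True
assert should_cache([0.9, 0.8, 0.85], resource_occupancy=0.9) is False
```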
Funding: This work was supported by the National Natural Science Foundation of China (No. 61976080), the Key Project on Research and Practice of Henan University Graduate Education and Teaching Reform (YJSJG2023XJ006), the Key Research and Development Projects of Henan Province (231111212500), and the Henan University Graduate Education Innovation and Quality Improvement Program (SYLKC2023016).
Abstract: In the field of target recognition based on temporal-spatial information fusion, evidence theory has received extensive attention. To achieve accurate and efficient target recognition with evidence theory, an adaptive temporal-spatial information fusion model is proposed. Firstly, an adaptive evaluation correction mechanism is constructed from the evidence distance and Deng entropy, which realizes credibility discrimination and adaptive correction of the spatial evidence. Secondly, a credibility decay operator is introduced to obtain the dynamic credibility of temporal evidence. Finally, the sequential combination of temporal-spatial evidence is achieved by Shafer's discount criterion and Dempster's combination rule. The simulation results show that the proposed method not only accounts for the dynamic and sequential characteristics of temporal-spatial evidence combination but also has a strong capability for processing conflicting information, providing a new reference for the field of temporal-spatial information fusion.
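The final fusion step named above, Dempster's combination rule, can be sketched directly: two mass functions over sets of hypotheses are combined pairwise, and the mass assigned to contradictory pairs is redistributed by normalization. The two targets {A, B} and their masses are hypothetical examples, not the paper's simulation data.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by
    Dempster's rule, normalizing out the conflicting mass."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # mass on contradictory focal elements
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.6, A | B: 0.4}        # sensor 1: leans toward target A
m2 = {B: 0.5, A | B: 0.5}        # sensor 2: leans toward target B
print(dempster_combine(m1, m2))
```

In the proposed model, the inputs to this rule would first be discounted by the adaptive credibility weights rather than combined raw.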
Abstract: The escalating costs of research and development, coupled with the influx of researchers, have led to a surge in published articles across scientific disciplines. However, concerns have arisen regarding the accuracy, validity, and reproducibility of reported findings. Issues such as replication problems, fraudulent practices, and a lack of expertise in measurement theory and uncertainty analysis have raised doubts about the reliability and credibility of scientific research. Rigorous assessment practices in certain fields highlight the importance of identifying potential errors and understanding the relationship between technical parameters and research outcomes. To address these concerns, a universally applicable criterion called comparative certainty is urgently needed. This criterion, grounded in an analysis of the modeling process and information transmission, accumulation, and transformation in both theoretical and applied research, aims to evaluate the acceptable deviation between a model and the observed phenomenon. It provides a theoretically grounded framework applicable to all scientific disciplines adhering to the International System of Units (SI). Objective evaluations based on this criterion can enhance the reproducibility and reliability of scientific investigations, instilling greater confidence in published findings. Establishing this criterion would be a significant stride towards ensuring the robustness and credibility of scientific research across disciplines.
Abstract: Success or failure of an e-commerce platform is often reduced to its ability to maximize the conversion rate of its visitors, commonly regarded as its capacity to induce a purchase. Visitors possess individual characteristics, histories, and objectives, which complicate the choice of which platform features maximize the conversion rate. Modern web technology has made clickstream data accessible, allowing a complete record of a visitor's actions on a website to be analyzed. What remains poorly constrained is which parts of the clickstream data are meaningful information and which are incidental for the problem of platform design. In this research, clickstream data from an online retailer were examined to demonstrate how statistical modeling can improve clickstream information usage. A conceptual model was developed that conjectured relationships between visitor and platform variables, visitors' platform exit rate, bounce rate, and decision to purchase. Several hypotheses on the nature of the clickstream relationships were posited and tested with the models. A discrete choice logit model showed that the content of a website, the history of website use, and the exit rate of pages visited had marginal effects on the derived utility for the visitor. Exit rate and bounce rate were modeled as beta-distributed random variables. Exit rate and its variability for pages visited were found to be associated with site content, site quality, prior visitor history on the site, and the technological preferences of the visitor. Bounce rate was found to be influenced by the same factors, but in a direction opposite to the registered hypotheses. Most findings supported that clickstream data are amenable to statistical modeling with interpretable and comprehensible models.
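The discrete choice logit above can be sketched as a logistic function of clickstream covariates, where the marginal effect of covariate k on purchase probability is beta_k * p * (1 - p). The feature names and coefficient values below are hypothetical illustrations, not the fitted model from the study.

```python
import math

def purchase_probability(features, betas, intercept):
    """Logit choice model: P(purchase) = sigmoid(intercept + beta . x)."""
    utility = intercept + sum(b * x for b, x in zip(betas, features))
    return 1.0 / (1.0 + math.exp(-utility))

# Hypothetical covariates: [pages_viewed, mean_exit_rate_of_pages, prior_visits]
betas = [0.08, -2.1, 0.3]
p = purchase_probability([12, 0.35, 2], betas, intercept=-1.5)

# Marginal effect of exit rate on P(purchase) at this point: negative,
# i.e. visiting high-exit-rate pages lowers the purchase probability.
marginal_exit_rate = betas[1] * p * (1 - p)
```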
Abstract: When building a model of a physical phenomenon or process, scientists face an inevitable compromise between the simplicity of the model (its qualitative-quantitative set of variables) and its accuracy. For hundreds of years, the visual simplicity of a law testified to the genius and depth of the physical thinking of the scientist who proposed it. Currently, the desire for a deeper physical understanding of the surrounding world and of newly discovered physical phenomena motivates researchers to increase the number of variables considered in a model. This direction leads to an increased probability of choosing an inaccurate or even erroneous model. This study describes a method for estimating the limit of measurement accuracy that takes into account the model-building stage in terms of the storage, transmission, processing, and use of information by the observer. This limit, due to the finite amount of information stored in the model, allows one to select the optimal number of variables for the best reproduction of the observed object and to calculate the exact values of the threshold discrepancy between the model and the phenomenon under study in measurement theory. We consider two examples: measurement of the speed of sound and measurement of physical constants.
Funding: This work was supported by the National Natural Science Foundation of China (No. 62293481, No. 62071058).
Abstract: As a novel paradigm, semantic communication provides an effective solution for breaking through the development dilemma facing classical communication systems. However, how to measure the information transmission capability of a given semantic communication method, and then compare it with that of a classical communication method, remains an unsolved problem. In this paper, we first review the semantic communication system, including its system model and two typical coding and transmission methods for its implementation. To address the unsolved issue of measuring the information transmission capability of semantic communication methods, we propose a new universal performance measure called Information Conductivity. We provide its definition and physical significance to establish its effectiveness in representing the information transmission capability of semantic communication systems, and present elaborations including its measurement methods, degrees of freedom, and progressive analysis. Experimental results in image transmission scenarios validate its practical applicability.
Abstract: Analysis of catchment land use/land cover (LULC) change is a vital tool for ensuring sustainable catchment management. The study analyzed land use/land cover changes in the Rwizi catchment, south-western Uganda, from 1989 to 2019 and projected the trend to 2040. Landsat images, field observations, key informant interviews, and focus group discussions were used to collect data. Changes in cropland, forestland, built-up area, grazing land, wetland, and open water bodies were analyzed in ArcGIS version 10.2.2 and ERDAS IMAGINE 14 software together with a Markov chain model. All the LULC classes increased in area except grazing land. Between 2009 and 2019, forest land and built-up area increased by 370.03% and 229.53%, respectively. Projections revealed an increase in forest land and built-up area by 2030 and in built-up area only by 2040. LULC change in the catchment results from population pressure, reduced soil fertility, and the high value of agricultural products.
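A Markov chain projection of the kind used above can be sketched as repeated multiplication of class shares by a transition matrix. The three classes and every matrix entry below are hypothetical, not the study's calibrated transition probabilities.

```python
CLASSES = ["cropland", "forest", "builtup"]
T = [  # T[i][j] = P(class i -> class j) over one decade (hypothetical)
    [0.80, 0.10, 0.10],
    [0.05, 0.85, 0.10],
    [0.00, 0.00, 1.00],  # built-up treated as effectively irreversible
]

def project(shares, steps):
    """Advance class shares `steps` decades with transition matrix T."""
    for _ in range(steps):
        shares = [sum(shares[i] * T[i][j] for i in range(len(T)))
                  for j in range(len(T))]
    return shares

shares_2019 = [0.5, 0.3, 0.2]                 # hypothetical current shares
shares_2039 = project(shares_2019, steps=2)   # two decades ahead
```

In practice the transition matrix is estimated from the pairwise cross-tabulation of the classified Landsat maps.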
Funding: Under the auspices of the China Scholarship Council.
Abstract: Cross-region innovation is widely recognized as an important source of long-term regional innovation capacity. In the recent past, a growing number of studies have investigated the network structure and mechanisms of cross-region innovation collaboration in various contexts. However, existing research mainly focuses on physical effects, such as geographical distance and high-speed railway connections. These studies ignore the intangible drivers of a changing environment: the increasingly digitalized economy and the increasingly solidified innovation network structure. Thus, the focus of this study is on estimating the determinants of innovation networks, especially the intangible drivers, which have so far been largely neglected. Using city-level data on Chinese patents (excluding Hong Kong, Macao, and Taiwan Province of China), we trace innovation networks across Chinese cities over a long period of time. By integrating a measure of the Information and Communications Technology (ICT) development gap and network structural effects into the general proximity framework, this paper explores the changing mechanisms of Chinese innovation networks from a new perspective. The results show that the structure of cross-region innovation networks in China has changed. As mechanisms behind this development, the results confirm the increasingly important role of intangible drivers in Chinese inter-city innovation collaboration when controlling for effects of physical proximity, such as geographical distance. Since digitalization and coordinated development are mainstream trends in China and other developing countries, these countries' inter-city innovation collaboration patterns will see dramatic changes under the influence of intangible drivers.
Abstract: In this study, the influence of confined concrete models on the response of reinforced concrete structures is investigated at member and global system levels. Commonly encountered concrete models such as Modified Kent-Park, Saatçioğlu-Razvi, and Mander are considered. Two moment-resisting frames designed according to the pre-modern code are taken into consideration to reflect an example of an RC moment-resisting frame in the current building stock. The building is in an earthquake-prone zone located on Z3 soil type. The inelastic response of the building frame is modelled by considering the plastic hinges formed on each beam and column element for different concrete classes and stirrup spacings. The models are subjected to non-linear static analyses. The differences between confined concrete models are comparatively investigated at both the reinforced concrete member and system levels. Based on the results of the comparative analysis, it is revealed that column behaviour is most influenced by the choice of model, due to axial loads and confinement effects, while beams are less affected; it is also observed that the differences exhibited in the moment-curvature response of column cross-sections do not significantly affect the overall behaviour of the global system. This highlights the critical role of model selection relative to the concrete strength and stirrup spacing of the member.
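One of the confinement models compared above, the Mander model, relates confined to unconfined strength through the effective lateral confining stress. The sketch below uses the commonly cited form of that ratio; treat it as illustrative, and the numeric inputs as hypothetical, since the study's exact implementation details are not given here.

```python
import math

def mander_strength_ratio(fl, fco):
    """Confined/unconfined strength ratio f'cc/f'co in the Mander-type
    model; fl = effective lateral confining stress, fco = unconfined
    strength (same units)."""
    x = fl / fco
    return -1.254 + 2.254 * math.sqrt(1.0 + 7.94 * x) - 2.0 * x

# With no confinement the ratio reduces to 1.0; confinement raises it.
print(mander_strength_ratio(0.0, 25.0))   # unconfined case
print(mander_strength_ratio(2.5, 25.0))   # hypothetical confined column
```

This strength gain, absent in beams with light axial load, is consistent with the finding that columns are the members most sensitive to the model choice.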