This study proposes the use of the MERISE conceptual data model to create indicators for monitoring and evaluating the effectiveness of vocational training in the Republic of Congo. The importance of MERISE for structuring and analyzing data is underlined, as it enables the measurement of the adequacy between training and the needs of the labor market. The innovation of the study lies in the adaptation of the MERISE model to the local context, the development of innovative indicators, and the integration of a participatory approach including all relevant stakeholders. Contextual adaptation and local innovation: the study suggests adapting MERISE to the specific context of the Republic of Congo, considering the local particularities of the labor market. Development of innovative indicators and new measurement tools: it proposes creating indicators to assess skills matching and employer satisfaction, which are crucial for evaluating the effectiveness of vocational training. Participatory approach and inclusion of stakeholders: the study emphasizes actively involving training centers, employers, and recruitment agencies in the evaluation process. This participatory approach ensures that the perspectives of all stakeholders are considered, leading to more relevant and practical outcomes. Using the MERISE model allows for:
• Rigorous data structuring, organization, and standardization: clearly defining entities and relationships facilitates data organization and standardization, crucial for effective data analysis.
• Facilitation of monitoring, analysis, and relevant indicators: developing both quantitative and qualitative indicators helps measure the effectiveness of training in relation to the labor market, allowing for a comprehensive evaluation.
• Improved communication and common language: by providing a common language for different stakeholders, MERISE enhances communication and collaboration, ensuring that all parties have a shared understanding.
The study’s approach and contribution to existing research lie in:
• Structured theoretical and practical framework and holistic approach: the study offers a structured framework for data collection and analysis, covering both quantitative and qualitative aspects, thus providing a comprehensive view of the training system.
• Reproducible methodology and international comparison: the proposed methodology can be replicated in other contexts, facilitating international comparison and the adoption of best practices.
• Extension of knowledge and new perspective: by integrating a participatory approach and developing indicators adapted to local needs, the study extends existing research and offers new perspectives on vocational training evaluation.
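As a toy illustration of how a MERISE-style conceptual model of entities and relationships can drive such indicators, the sketch below encodes two hypothetical entities (Graduate, JobOffer) and derives a skills-matching rate. All names and fields are invented for illustration; this is not the study's actual model.

```python
# Hypothetical MERISE-style entities and a derived skills-matching indicator.
from dataclasses import dataclass

@dataclass
class Graduate:
    graduate_id: int
    program: str
    skills: set[str]

@dataclass
class JobOffer:
    employer: str
    required_skills: set[str]

def skills_match_rate(grads: list[Graduate], offer: JobOffer) -> float:
    # Fraction of graduates covering all skills the employer requires.
    matches = sum(offer.required_skills <= g.skills for g in grads)
    return matches / len(grads) if grads else 0.0

grads = [Graduate(1, "welding", {"arc welding", "safety"}),
         Graduate(2, "welding", {"safety"})]
print(skills_match_rate(grads, JobOffer("Acme", {"arc welding", "safety"})))  # 0.5
```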
This paper examines how cyber security is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration of the human element and how people fit into the security process. Cyber security, on the other hand, adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect raises moral questions, since it is becoming more widely accepted that society has a duty to protect its weaker members, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. This paper focuses on differentiating between the two at that larger scale. The study also highlights other areas of cyber security, which include defending people, social norms, and vital infrastructure from threats that arise online, in addition to protecting information and technology. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
China established Xiong’an New Area in Hebei Province in 2017, which is planned to accommodate about 5 million people, aiming to relieve Beijing City of the functions non-essential to its role as China’s capital and to expedite the coordinated development of the Beijing-Tianjin-Hebei region. From 2017 to 2021, the China Geological Survey (CGS) took the lead in multi-factor urban geological surveys involving space, resources, environments, and disasters according to the general requirements of “global vision, international standards, distinctive Chinese features, and future-oriented goals” in Xiong’an New Area, identifying the engineering geologic conditions and geologic environmental challenges of this area. The achievements also include a 3D engineering geological structure model for the whole area, along with “one city proper and five clusters”, insights into the ecology and the background endowment of natural resources like land, geothermal resources, groundwater, and wetland of the area before engineering construction, a comprehensive monitoring network of resources and environments in the area, and the “Transparent Xiong’an” geological information platform that is open, shared, dynamically updated, and three-dimensionally visualized. China’s geologists and urban geology have played a significant role in the urban planning and construction of Xiong’an New Area, providing whole-process geological solutions for urban planning, construction, operation, and management. The future urban construction of Xiong’an New Area will necessitate the theoretical and technical support of earth system science (ESS) from various aspects, the purpose being to enhance the resilience of the new type of city and to support the green, low-carbon, and sustainable development of this area.
Data compression plays a key role in optimizing the use of memory storage space and reducing latency in data transmission. In this paper, we are interested in lossless compression techniques, because their performance is exploited alongside lossy compression techniques for images and videos, generally in a mixed approach. To achieve our objective, which is to study the performance of lossless compression methods, we first carried out a literature review, a summary of which enabled us to select the most relevant techniques, namely: arithmetic coding, LZW, Tunstall’s algorithm, RLE, BWT, Huffman coding, and Shannon-Fano. Secondly, we designed a purposive text dataset with a repeating pattern in order to test the behavior and effectiveness of the selected compression techniques. Thirdly, we designed the compression algorithms and developed the programs (scripts) in Matlab in order to test their performance. Finally, following the tests conducted on this deliberately constructed data, the results show that these methods, listed in order of performance, are very satisfactory: LZW, arithmetic coding, Tunstall’s algorithm, and BWT + RLE. Likewise, it appears that the performance of certain techniques relative to others is strongly linked, on the one hand, to the sequencing and/or recurrence of symbols that make up the message and, on the other hand, to the cumulative time of encoding and decoding.
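To make the top-ranked technique concrete, here is a minimal LZW compressor sketch in Python (an illustration of the general algorithm only; the paper's own implementations were Matlab scripts). On input with a repeating pattern, as in the dataset described above, long repeated substrings collapse into single dictionary codes.

```python
def lzw_compress(text: str) -> list[int]:
    # Initialize the dictionary with all single-byte symbols.
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    w = ""
    codes = []
    for c in text:
        wc = w + c
        if wc in dictionary:
            w = wc                      # extend the current phrase
        else:
            codes.append(dictionary[w]) # emit the longest known phrase
            dictionary[wc] = next_code  # learn the new phrase
            next_code += 1
            w = c
    if w:
        codes.append(dictionary[w])
    return codes

codes = lzw_compress("ABABABABABABAB")
print(codes)  # 14 symbols shrink to 7 codes: [65, 66, 256, 258, 257, 260, 256]
```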
In this in-depth exploration, I delve into the complex implications and costs of cybersecurity breaches. Venturing beyond the immediate repercussions, the research unearths both the overt and the concealed long-term consequences that businesses encounter. This study integrates findings from various sources, including quantitative reports, drawing upon real-world incidents faced by both small and large enterprises. The investigation emphasizes the profound intangible costs, such as trade name devaluation and potential damage to brand reputation, which can persist long after the breach. By collating insights from industry experts and a wide range of research, the study provides a comprehensive perspective on the profound, multi-dimensional impacts of cybersecurity incidents. The overarching aim is to underscore the often-underestimated scope and depth of these breaches, emphasizing the entire post-incident timeline and the urgent need for fortified preventative and reactive measures in the digital domain.
This investigation deals with an intelligent system for the parallel construction of fault-tolerant diagnostic tests (DTs). A modified parallel algorithm for fault-tolerant diagnostic test construction is proposed; the algorithm makes it possible to optimize the processing time of test construction. A matrix model of data and knowledge representation, as well as various kinds of regularities in data and knowledge, are presented. An applied intelligent system for diagnosing the mental health of a population, developed with the use of the intelligent system for parallel fault-tolerant DT construction, is also presented.
This work is dedicated to the formation of a data warehouse for processing a large volume of domain name registration data. Data cleaning is applied in order to increase the effectiveness of decision-making support: it is applied in warehouses to detect and delete errors and discrepancies in data in order to improve data quality. For this purpose, fuzzy record comparison algorithms for cleaning domain name registration data are reviewed in this work. An identification method for domain name registration data for data warehouse formation is also proposed. Decision-making algorithms for the identification of registration data are implemented in DrRacket and Python.
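As a hedged illustration of the kind of fuzzy record comparison such cleaning relies on, the sketch below flags near-duplicate registrant strings using a normalized similarity ratio from Python's standard library. The threshold and record values are assumptions for illustration, not the paper's algorithm.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Ratio in [0, 1]; 1.0 means identical strings (case-insensitive here).
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def near_duplicates(records, threshold=0.85):
    # Compare each pair of registrant names and flag likely matches.
    matches = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i], records[j]) >= threshold:
                matches.append((records[i], records[j]))
    return matches

print(near_duplicates(["Example Ltd.", "example ltd", "Other Corp"]))
# [('Example Ltd.', 'example ltd')]
```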
Every day, an NDT (Non-Destructive Testing) report will govern key decisions and inform inspection strategies that could affect the flow of millions of dollars, which ultimately affects local environments and potential risk to life. There is a direct correlation between report quality and equipment capability. The more capable the equipment is, in terms of efficient data gathering, signal-to-noise ratio, positioning, and coverage, the more actionable the report is. This results in optimal maintenance and repair strategies, provided the report is clear and well presented. Furthermore, when considering tank floor storage inspection, it is essential that asset owners have total confidence in inspection findings and the ensuing reports. Tank floor inspection equipment must not only be efficient and highly capable, but data sets should be traceable and their integrity maintained throughout. Corrosion mapping of large surface areas such as storage tank bottoms is an inherently arduous and time-consuming process. MFL (magnetic flux leakage) based tank bottom scanners present a well-established and highly rated method for inspection. There are many benefits of using modern MFL technology to generate actionable reports, chief among which is efficiency of coverage while gaining valuable information regarding defect location, severity, surface origin, and extent. More recent advancements in modern MFL tank bottom scanners afford the ability to scan and record data sets at areas of the tank bottom which were previously classed as dead zones, i.e., areas not scanned due to physical restraints. An example of this is scanning the CZ (critical zone), the area close to the annular-to-shell junction weld. Inclusion of these additional dead zones increases overall inspection coverage, quality, and traceability. Inspection of the CZ allows engineers to quickly determine the integrity of arguably the most important area of the tank bottom. Herein we discuss notable developments in CZ coverage, inspection efficiency, and data integrity that combine to deliver an actionable report, which the asset owner can interrogate to develop pertinent and accurate maintenance and repair strategies.
We develop various statistical methods important for multidimensional genetic data analysis, and establish theorems justifying their application. We concentrate on multifactor dimensionality reduction, logic regression, random forests, and stochastic gradient boosting, along with their new modifications. We use these complementary approaches to study the risk of complex diseases such as cardiovascular ones, examining the roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors. To perform the data analysis concerning coronary heart disease and myocardial infarction, the Lomonosov Moscow State University supercomputer “Chebyshev” was employed.
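The following sketch illustrates one of the listed methods, a random forest, applied to synthetic SNP genotypes plus one non-genetic factor. The data coding, effect structure, and parameters are invented for illustration; this is not the authors' pipeline.

```python
# A random forest on synthetic SNP data (illustrative assumptions only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, n_snps = 500, 20
genotypes = rng.integers(0, 3, size=(n, n_snps))  # SNPs coded 0/1/2
age = rng.normal(55, 10, size=(n, 1))             # non-genetic risk factor
X = np.hstack([genotypes, age])

# Toy outcome: risk driven by an interaction of two SNPs plus age.
logit = 0.8 * (genotypes[:, 0] * genotypes[:, 1]) + 0.05 * (age[:, 0] - 55)
y = (logit + rng.normal(0, 1, n) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(model.feature_importances_[:5])  # the two causal SNPs should rank high
```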
The unique composition of milk makes this basic foodstuff an exceptional raw material for the production of new ingredients with desired properties and diverse applications in the food industry. The fractionation of milk is key to the development of those ingredients and products; hence, continuous research and development in this field, especially on various levels of fractionation and separation by filtration, have been carried out. This review focuses on the production of milk fractions as well as their particular properties, applications, and the processes that increase their exploitation. Whey proteins and caseins from the protein fraction are excellent emulsifiers and protein supplements. Besides, they can be chemically or enzymatically modified to obtain bioactive peptides with numerous functional and nutritional properties. In this context, valorization techniques for cheese-whey proteins, a by-product of the dairy industry that constitutes both an economic and an environmental problem, are being developed. Phospholipids from the milk fat fraction are powerful emulsifiers and also have exclusive nutraceutical properties. In addition, enzymatic modification of milk phospholipids makes it possible to tailor emulsifiers with particular properties. However, several aspects remain to be overcome; these refer to a deeper understanding of the healthy, functional, and nutritional properties of these new ingredients, which might otherwise be barriers to their use and acceptability. Additionally, this review introduces alternative applications of milk constituents in non-food areas, such as the manufacture of plastic materials and textile fibers. The unmet needs, the cross-fertilization between various protein domains, the carbon footprint requirements, the environmental necessities, the new demand for health and wellness, etc., are dominant factors in the search for innovation approaches; these factors also outline the further innovation potential deriving from those “apparent” constraints, obliging science and technology to take them into account.
Smart Materials are, along with Innovation attributes and Artificial Intelligence, among the most used “buzz” words in all media. Central to their practical occurrence, many talents are to be gathered within new contextual data influxes. Has this, in the last 20 years, changed some of the essential fundamental dimensions and the required skills of the actors such as providers, users, and insiders? This is a preliminary focus and prelude of this review. As an example, polysaccharide materials are the most abundant macromolecules present as an integral part of the natural system of our planet. They are renewable, biodegradable, and carbon neutral, with low environmental, health, and safety risks, and serve as structural materials in the cell walls of plants. Most of them have been used for many years as engineering materials in many important industrial processes, such as pulp and papermaking and the manufacture of synthetic textile fibres. They are also used in other domains such as conversion into biofuels and, more recently, in the design of processes using polysaccharide nanoparticles. The main properties of polysaccharides (e.g. low density, thermal stability, chemical resistance, high mechanical strength, etc.), together with their biocompatibility, biodegradability, functionality, durability, and uniformity, allow their use for manufacturing smart materials such as blends and composites, electroactive polymers, and hydrogels, which can be obtained 1) through direct utilization and/or 2) after chemical or physical modifications of the polysaccharides. This paper reviews recent works on polysaccharides, mainly cellulose, hemicelluloses, chitin, chitosans, alginates, and their by-products (blends and composites), with the objective of manufacturing smart materials. It is worth noting that, today, the fundamental understanding of the molecular-level interactions that confer smartness to polysaccharides remains poor, and one can predict that new experimental and theoretical tools will emerge to develop the necessary understanding of the structure-property-function relationships that will enable polysaccharide smartness to be better understood and controlled. This will give rise to new and innovative applications such as nanotechnology, foods, cosmetics, and medicine (e.g. controlled drug release and regenerative medicine), opening up major commercial markets in the context of green chemistry.
We show a quantitative technique, characterized by low numerical mediation, for the reconstruction of temporal sequences of geophysical data of length L interrupted for a time ΔT. The aim is to protect the information acquired before and after the interruption by means of a numerical protocol with the lowest possible calculation weight. The signal reconstruction process is based on the synthesis of the low-frequency signal, extracted by subsampling (subsampling step ΔT, in phase with the interruption ΔT), with the high-frequency signal recorded before the crash. The SYRec (SYnthetic REConstruction) method, for its simplicity and speed of calculation and for its spectral response stability, is particularly effective in studies of high-speed transient phenomena that develop in very perturbed fields. This operating condition is fundamental when almost immediate informational responses are required from the observation system. In this example we deal with geomagnetic data coming from an underwater counter-intrusion magnetic system. The system produces on-time information about the transit of local magnetic singularities (magnetic perturbations with low spatial extension), originated by quasi-point-form kinematic sources (divers), in harbor underwater magnetic fields. The stability performance of the SYRec system makes it usable also over long and medium periods of observation (activity of geomagnetic observatories).
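A minimal sketch of the gap-filling idea follows, under our own simplifying assumptions (a linear low-frequency trend across the gap, high-frequency residuals reused from the pre-gap window); it is not the authors' SYRec code.

```python
# Fill a gap by adding a low-frequency trend interpolated across the gap
# to high-frequency content borrowed from the signal before the gap.
import numpy as np

def reconstruct_gap(signal, gap_start, gap_end, window=256):
    out = signal.copy()
    n_gap = gap_end - gap_start
    # Low-frequency part: linear interpolation between the gap edges.
    trend = np.linspace(signal[gap_start - 1], signal[gap_end], n_gap)
    # High-frequency part: residual of the pre-gap segment about its own trend.
    pre = signal[gap_start - window:gap_start]
    pre_trend = np.linspace(pre[0], pre[-1], window)
    hf = (pre - pre_trend)[-n_gap:]   # reuse the most recent residuals
    out[gap_start:gap_end] = trend + hf
    return out

t = np.arange(1000)
x = np.sin(2 * np.pi * t / 200) + 0.2 * np.sin(2 * np.pi * t / 11)
x[600:640] = np.nan                   # simulated acquisition crash
print(reconstruct_gap(x, 600, 640)[595:645])
```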
Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, i.e., data that sums to a constant, like 100%. The statistical linear model is the most used technique for identifying hidden relationships between underlying random variables of interest, and the linear regression model is commonly applied across fields to find such relationships. However, data quality is a significant challenge in machine learning, especially when missing data is present. When estimating linear regression parameters, which are useful for things like future prediction and partial-effects analysis of independent variables, maximum likelihood estimation (MLE) is the method of choice. However, many datasets contain missing observations, which can lead to costly and time-consuming data recovery. To address this issue, the expectation-maximization (EM) algorithm has been suggested as a solution for situations involving missing data. The EM algorithm iteratively finds the best estimates of the parameters of statistical models that depend on variables or data that have not been observed; these are maximum likelihood or maximum a posteriori (MAP) estimates. Using the present estimate as input, the expectation (E) step constructs the expected log-likelihood function; finding the parameters that maximize this expected log-likelihood, as determined in the E step, is the job of the maximization (M) step. This study examined how well the EM algorithm worked on a simulated compositional dataset with missing observations, using both robust least squares and ordinary least squares regression. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
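As a hedged, self-contained illustration of the E/M alternation for missing responses in a simple linear regression (a toy analogue, not the study's compositional-data procedure):

```python
# Toy EM-style imputation: alternately impute missing y-values from the
# current regression fit (E step) and refit the regression (M step).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=200)
missing = rng.random(200) < 0.2                       # 20% of y missing at random
y_obs = np.where(missing, np.nan, y)

y_hat = np.where(missing, np.nanmean(y_obs), y_obs)   # initial fill with the mean
for _ in range(50):
    # M step: refit slope/intercept on the completed data.
    b1, b0 = np.polyfit(x, y_hat, 1)
    # E step: replace missing entries with their conditional expectation.
    y_new = np.where(missing, b0 + b1 * x, y_obs)
    if np.max(np.abs(y_new - y_hat)) < 1e-8:
        break
    y_hat = y_new

print(b0, b1)  # should approach the true values (2.0, 1.5)
```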
This research paper describes the design and implementation of the Consultative Committee for Space Data Systems (CCSDS) standards [1] for the Space Data Link Layer Protocol (SDLP). The primary focus is the telecommand (TC) part of the standard. The standard was implemented in the form of DLL functions using the C++ programming language. The second objective of this paper was to use the DLL functions within the OMNeT++ simulation environment to create a simulator in order to analyze the mean end-to-end packet delay, the maximum achievable application-layer throughput for a given fixed link capacity, and the normalized protocol overhead, defined as the total number of bytes transmitted on the link in a given period of time (e.g. per second) divided by the number of bytes of application data received at the application-layer model data sink. In addition, the DLL was also integrated with the Ground Support Equipment Operating System (GSEOS), a software system for space instruments and small spacecraft especially suited for low-budget missions. The SDLP implementation is designed for rapid test system design and high flexibility for changing telemetry and command requirements. GSEOS can be seamlessly moved from EM/FM development (bench testing) to flight operations. It features the Python programming language as a configuration/scripting tool and can easily be extended to accommodate custom hardware interfaces. This paper also shows the results of the simulations and their analysis.
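For concreteness, the normalized-overhead metric defined above can be computed as follows; the byte counts are made-up numbers, not simulation results.

```python
# Hypothetical illustration of the normalized protocol overhead: total
# bytes on the link per second divided by application bytes delivered
# to the sink in the same period.
link_bytes_per_s = 125_000   # all bytes transmitted on the link
app_bytes_per_s = 118_400    # application data received at the sink
overhead = link_bytes_per_s / app_bytes_per_s
print(f"normalized protocol overhead: {overhead:.3f}")  # ≈ 1.056
```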
The lithium element has attracted remarkable attention for energy storage devices over the past 30 years. Lithium is a light element with the low atomic number 3, just after hydrogen and helium in the periodic table. The lithium atom has a strong tendency to release one electron and form a positive ion, Li⁺. Initially, lithium metal was employed as the negative electrode, which released electrons. However, it was observed that its structure changed after the repetition of charge-discharge cycles. To remedy this, the cathode came to consist mainly of layered metal oxides and olivines, e.g., cobalt oxide, LiFePO₄, etc., along with some content of lithium, while the anode was assembled from graphite, silicon, etc. Moreover, the electrolyte was prepared using a lithium salt in a suitable solvent to attain a greater concentration of lithium ions. Owing to the role of lithium ions, the battery came to be called the lithium-ion battery. Herein, the presented work describes the working and operational mechanism of the lithium-ion battery. Further, a general view of lithium-ion batteries and their future prospects are also elaborated.
[Objectives] To explore brands’ tendencies in the design of waist protection products through data mining, and to provide a reference for the design concept of the contour of a waist protection pillow. [Methods] The structural design information of all waist protection equipment was collected from the national Internet platform, the data were classified, and a database was established. IBM SPSS 26.0 and MATLAB 2018a were used to analyze the data, which were tabulated in Tableau 2022.4. After the association rules were clarified, the data were imported into Cinema 4D R21 to create the concept contour of the waist protection pillow. [Results] The average and standard deviation of the single-airbag design were the highest in all groups, with an average of 0.511 and a standard deviation of 0.502. The average and standard deviation of the upper-and-lower dual airbags were the lowest in all groups, with an average of 0.015 and a standard deviation of 0.120. The correlation coefficient between the single airbag and 120° arc stretching was 0.325, a positive correlation (P<0.01); the correlation coefficient between multiple airbags and 360° encircling fitting was 0.501, a positive correlation and the highest correlation degree of all groups (P<0.01). [Conclusions] The single-airbag design is well recognized by companies and has received the highest attention among all brand products. While focusing on single-airbag design, most brands will consider adding 120° arc-stretching elements to product design. When focusing on multiple-airbag design, some brands believe that 360° encircling fitting elements need to be added to the product, and the correlation between the two is the highest among all groups.
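A hedged sketch of the kind of correlation test reported above, run on synthetic binary design-feature indicators; the data and effect size are invented, not the study's database.

```python
# Pearson correlation between two binary design-feature indicators,
# mirroring the single-airbag vs 120° arc-stretching association.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
single_airbag = rng.integers(0, 2, 200)
# Make the second feature co-occur with the first more often than chance.
arc_stretch = np.where(rng.random(200) < 0.3, single_airbag,
                       rng.integers(0, 2, 200))
r, p = stats.pearsonr(single_airbag, arc_stretch)
print(f"r = {r:.3f}, p = {p:.4f}")
```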
Chlorophyll-a (Chl-a) concentration is a primary indicator for marine environmental monitoring. The spatio-temporal variations of sea surface Chl-a concentration in the Yellow Sea (YS) and the East China Sea (ECS) in 2001-2020 were investigated by reconstructing the MODIS Level 3 products with the data interpolating empirical orthogonal functions (DINEOF) method. The results reconstructed by interpolating the combined MODIS daily + 8-day datasets were found to be better than those obtained by interpolating daily or 8-day data alone. Chl-a concentration in the YS and the ECS reached its maximum in spring, with blooms occurring, decreased in summer and autumn, and increased in late autumn and early winter. By performing empirical orthogonal function (EOF) decomposition of the reconstructed data fields and correlation analysis with several potential environmental factors, we found that sea surface temperature (SST) plays a significant role in the seasonal variation of Chl-a, especially during spring and summer. The increase of SST in spring, together with the upper-layer nutrients mixed up during the previous winter, might favor the occurrence of spring blooms, while high SST throughout the summer would strengthen vertical stratification and prevent nutrient supply from deep water, resulting in low surface Chl-a concentrations. The sea surface Chl-a concentration in the YS was found to have decreased significantly from 2012 to 2020, which was possibly related to the Pacific Decadal Oscillation (PDO).
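As an illustration of the EOF decomposition step, the sketch below computes EOFs of a synthetic space-time anomaly matrix via singular value decomposition; it is a generic textbook construction, not the paper's DINEOF implementation.

```python
# EOFs are the singular vectors of the space-time anomaly matrix.
import numpy as np

rng = np.random.default_rng(3)
n_time, n_space = 240, 500            # e.g., monthly maps on 500 grid cells
field = rng.normal(size=(n_time, n_space))
anomaly = field - field.mean(axis=0)  # remove the time mean per grid cell

# SVD: U holds temporal amplitudes, rows of Vt are spatial EOF patterns.
U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first 3 EOFs:", explained[:3])
```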
Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT expands the existing common concepts of anytime and anyplace to connectivity for anything. The proliferation of IoT offers opportunities but may also bear risks. A hitherto neglected aspect is the possible increase in power consumption, as smart devices in IoT applications are expected to be reachable by other devices at all times. This implies that a device consumes electrical energy even when it is not in use for its primary function. Many research communities have started addressing the storage ability (e.g., cache memory) of smart devices using the concept of Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge: when the internal memory of a node exceeds its limit, data with the highest degree of freshness may not be accommodated, and the entire scenario behaves like a traditional network. In such a case, data caching is not performed by intermediate nodes to guarantee the highest degree of freshness. Given the periodic updates sent by data producers, it is strongly demanded that data consumers get up-to-date information at the cost of the least energy. Consequently, there is a challenge in maintaining the trade-off between freshness and energy consumption during publisher-subscriber interaction. In our work, we propose an architecture that overcomes the cache strategy issue with a Smart Caching Algorithm for improved memory management and data freshness. The smart caching strategy updates the data at precise intervals while taking garbage data into consideration. It is also observed from the experiments that data redundancy can easily be reduced by ignoring/dropping data packets carrying information that is not of interest to other participating nodes in the network, ultimately optimizing the trade-off between freshness and the energy required.
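A hedged sketch of a freshness-aware cache in the spirit of the proposal; the eviction policy, names, and parameters are our own assumptions, not the paper's Smart Caching Algorithm.

```python
# Entries carry an expiry time; stale entries are evicted instead of served.
import time

class FreshnessCache:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.store = {}  # name -> (data, expires_at)

    def put(self, name, data, freshness_s):
        if len(self.store) >= self.capacity:
            # Evict the entry closest to expiry to make room.
            oldest = min(self.store, key=lambda k: self.store[k][1])
            del self.store[oldest]
        self.store[name] = (data, time.time() + freshness_s)

    def get(self, name):
        entry = self.store.get(name)
        if entry is None:
            return None
        data, expires_at = entry
        if time.time() > expires_at:   # stale: drop and force a refetch
            del self.store[name]
            return None
        return data

cache = FreshnessCache()
cache.put("/sensor/temp", 21.5, freshness_s=2.0)
print(cache.get("/sensor/temp"))  # 21.5 while fresh, None after expiry
```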