Abstract: With the continuous improvement of product quality and function, the quality control of mold steel is becoming increasingly strict. Argon protection is essential for ensuring casting quality during ingot casting. The development of argon protection in ingot casting and the production process of enclosed argon protection on the 40 t line are discussed, with particular focus on the factors affecting the flow of oxygen into the argon protection cover. The influence of related factors on the oxygen content is analyzed. On the basis of online measurements of the oxygen content, optimized operational approaches for improving the effect of argon protection are developed. These approaches can reduce the secondary oxidation of the liquid steel and improve the quality of the ingots.
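The abstract describes adjusting operations based on online oxygen measurements inside the argon cover. The following is a minimal illustrative sketch of such a measurement-driven adjustment loop; the sensor interface, threshold, and flow increments are hypothetical values for illustration, not parameters reported in the paper.

```python
# Minimal sketch: raise the argon flow when the measured oxygen level in
# the protection cover exceeds a target threshold. All numeric values
# and names below are hypothetical illustration choices.

O2_TARGET_PPM = 100.0     # hypothetical acceptable oxygen level in the cover
FLOW_STEP_L_MIN = 5.0     # hypothetical argon flow increment (L/min)
FLOW_MAX_L_MIN = 120.0    # hypothetical upper limit of the argon supply

def adjust_argon_flow(o2_ppm: float, current_flow: float) -> float:
    """Return an updated argon flow rate from one online O2 reading."""
    if o2_ppm > O2_TARGET_PPM:
        # Oxygen ingress detected: increase argon flow to flush the cover.
        return min(current_flow + FLOW_STEP_L_MIN, FLOW_MAX_L_MIN)
    # Oxygen within target: hold the current flow to conserve argon.
    return current_flow
```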
Funding: This work is funded in part by the Ministry of Science and Technology, Taiwan, under grant MOST 108-2221-E-011-162-MY2.
Abstract: Nowadays, since the Internet is ubiquitous, the frequency of data transfer through the public network is increasing. Hiding secret data in these transmitted data raises broad security issues, such as authentication and copyright protection. On the other hand, considering transmission efficiency, image transmission in Internet-based applications usually involves image compression. To address both issues, this paper presents a data hiding scheme for the image compression method called absolute moment block truncation coding (AMBTC). First, an image is divided into nonoverlapping blocks through AMBTC compression, and the blocks are classified into four types, namely smooth, semi-smooth, semi-complex, and complex. The secret data are embedded into the smooth blocks by using a simple replacement strategy. The proposed method embeds nine bits (and five bits) of secret data into the bitmap of the semi-smooth blocks (and semi-complex blocks, respectively) through the exclusive-or (XOR) operation. The secret data are embedded into the complex blocks by using a hidden function. After the embedding phase, the direct binary search (DBS) method is performed to improve the image quality without damaging the secret data. The experimental results demonstrate that the proposed method yields higher quality and hiding capacity than other reference methods.
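To make the AMBTC step and the replacement strategy for smooth blocks concrete, here is a minimal Python sketch. The 4x4 block size follows the usual AMBTC convention; the smoothness threshold and the decision to leave the quantization levels unchanged are illustrative assumptions, not the paper's parameters.

```python
# Illustrative sketch: AMBTC compression of one 4x4 grayscale block and
# simple replacement embedding in a smooth block's bitmap.
import numpy as np

SMOOTH_THRESHOLD = 8  # hypothetical |high - low| cutoff for smooth blocks

def ambtc_block(block: np.ndarray):
    """Compress a 4x4 block into its AMBTC triple (low, high, bitmap)."""
    mean = block.mean()
    bitmap = (block >= mean).astype(np.uint8)        # 1 = above-mean pixel
    high = int(round(block[bitmap == 1].mean()))     # mean of high group
    low = (int(round(block[bitmap == 0].mean()))     # mean of low group,
           if (bitmap == 0).any() else high)         # flat block: low = high
    return low, high, bitmap

def embed_smooth(low, high, bitmap, secret_bits):
    """Replace a smooth block's 16-bit bitmap with 16 secret bits."""
    if high - low <= SMOOTH_THRESHOLD:
        # The bitmap carries little visual information in a smooth block,
        # so it can be overwritten wholesale. A common variant also sets
        # both quantization levels to the block mean to limit distortion.
        bitmap = np.array(secret_bits, dtype=np.uint8).reshape(bitmap.shape)
    return low, high, bitmap
```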
Funding: This work is supported by the U.S. Office of Naval Research under grants N00014-16-1-3214 and N00014-16-1-3216.
Abstract: Web crawlers have been misused for several malicious purposes, such as downloading server data without permission from the website administrator. Moreover, armoured crawlers are evolving against new anti-crawler mechanisms in the arms race between crawler developers and crawler defenders. In this paper, based on the observation that normal users and malicious crawlers have different short-term and long-term download behaviours, we develop a new anti-crawler mechanism called PathMarker to detect and constrain persistent distributed crawlers. By adding a marker to each Uniform Resource Locator (URL), we can trace the page that leads to the access of this URL and the identity of the user who accesses this URL. With this supporting information, we can not only perform more accurate heuristic detection using path-related features, but also develop a Support Vector Machine based machine learning detection model to distinguish malicious crawlers from normal users by inspecting their different patterns of URL visiting paths and URL visiting timings. In addition to effectively detecting crawlers at the earliest stage, PathMarker can dramatically suppress the scraping efficiency of crawlers before they are detected. We deploy our approach on an online forum website, and the evaluation results show that PathMarker can quickly capture all six open-source and in-house crawlers, plus two external crawlers (i.e., Googlebots and Yahoo Slurp).
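The core PathMarker idea is attaching a server-generated marker to each URL that records the referring page and the accessing user. The sketch below illustrates one way to do this; an HMAC-signed plaintext marker stands in for the paper's marker scheme, and the parameter names, key handling, and marker layout are illustrative assumptions rather than the paper's exact format.

```python
# Minimal sketch: append a tamper-evident marker encoding (parent page,
# user identity) to outgoing URLs, so later accesses can be traced back
# for path-based crawler detection. Layout and key handling are
# hypothetical illustration choices.
import hmac
import hashlib
from urllib.parse import urlencode

SERVER_KEY = b"secret-server-key"  # hypothetical per-site signing key

def mark_url(url: str, parent_page: str, user_id: str) -> str:
    """Attach a signed marker recording the referring page and user."""
    payload = f"{parent_page}|{user_id}"
    tag = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()[:16]
    sep = "&" if "?" in url else "?"
    return url + sep + urlencode({"pm": payload, "sig": tag})

def verify_marker(payload: str, sig: str) -> bool:
    """Validate a marker before trusting it as a detection feature."""
    expected = hmac.new(SERVER_KEY, payload.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(expected, sig)
```

On each page view, the server would rewrite every embedded link with mark_url and, on the next request, recover the visiting path and user from the verified marker to feed the heuristic and SVM detectors.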