An AlCoCuCrFeNiTi high-entropy alloy (HEA) was prepared by mechanical alloying and sintering to study the effect of Ti addition to the widely studied AlCoCuCrFeNi system. The structural and microstructural characteristics were investigated by X-ray diffraction (XRD), scanning electron microscopy (SEM), and transmission electron microscopy (TEM). The formation of four micrometric phases was detected: a Cu-rich phase with a face-centered cubic (fcc) structure, a body-centered cubic (bcc) solid solution with Cu-rich plate-like precipitates (fcc), an ordered bcc phase, and a tetragonal structure. The XRD patterns corroborate the presence of a mixture of bcc-, fcc-, and tetragonal-structured phases. The Vickers hardness of the alloy under study was more than twice that of the AlCoCuCrFeNi alloy. Nanoindentation tests were performed to evaluate the mechanical response of the individual phases and to elucidate the relationship between chemical composition, crystal structure, and mechanical performance of the multiphase microstructure of the AlCoCuCrFeNiTi HEA.
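The hardness comparison above rests on the standard Vickers relation HV = 1.8544·F/d², with the load F in kgf and the mean indent diagonal d in mm. A minimal sketch of that conversion; the load and diagonal used in the demo are illustrative numbers, not values from the study:

```python
def vickers_hardness(load_kgf, diagonal_mm):
    """Standard Vickers relation: HV = 1.8544 * F / d^2 (F in kgf, d in mm)."""
    return 1.8544 * load_kgf / diagonal_mm ** 2

# Illustrative only (not data from the study): a 1 kgf indent
# leaving a 0.06 mm mean diagonal.
hv = vickers_hardness(1.0, 0.06)
```

The same relation, inverted, is how commercial testers report HV directly from the two measured diagonals of the indent.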
To detect security vulnerabilities in a web application, the security analyst must choose the Security Analysis Static Tool (SAST) that performs best, in the sense of discovering as many security vulnerabilities as possible. Comparing static analysis tools for web applications requires a benchmark adapted to the vulnerability categories included in the well-known standard Open Web Application Security Project (OWASP) Top Ten project. Information on the security effectiveness of commercial static analysis tools is not usually publicly accessible, and the state of the art on static security analyzers shows that the differing designs and implementations of these tools yield different effectiveness rates in terms of security performance. Given the significant cost of commercial tools, this paper studies the performance of seven static tools using a newly proposed methodology and a new benchmark designed for the vulnerability categories included in the OWASP Top Ten project. Practitioners will thus have more precise information to select the best tool using a benchmark adapted to the latest versions of the OWASP Top Ten project. The results of this work have been obtained using widely accepted metrics and are classified according to three different degrees of web application criticality.
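Benchmark comparisons of this kind typically reduce each tool's findings to true positives, false positives, and false negatives, and then rank tools by an F-score. A minimal sketch of such a ranking; the tool names and counts below are hypothetical, not results from the paper:

```python
def f_score(tp, fp, fn, beta=1.0):
    """F_beta computed from raw true-positive, false-positive and
    false-negative counts against a benchmark's ground truth."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical benchmark results per tool: (TP, FP, FN).
tools = {"tool_a": (80, 20, 40), "tool_b": (95, 60, 25), "tool_c": (70, 5, 50)}
ranking = sorted(tools, key=lambda t: f_score(*tools[t]), reverse=True)
```

Varying `beta` (e.g., weighting recall more heavily for highly critical applications) is one simple way to express the "degrees of criticality" idea in the ranking.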
Larch wood is structurally classified in many countries as one of the conifers with the highest load-bearing capacity (strength class C30). The Spanish visual classification regulation assigns a strength class to only 4 pine woods: Laricio pine (Pinus nigra Arn. var. salzmannii), Silvestre pine (Pinus sylvestris L.), Radiata pine (Pinus radiata D. Don), and Pinaster pine (Pinus pinaster Ait.). This work adds to the number of structurally characterised species by creating a visual classification table for Japanese larch wood (Larix kaempferi (Lamb.) Carr.) which differentiates between 2 visual classes, MEG-1 and MEG-2. Characteristic strength values were calculated for each class (f_k,MEG-1 = 31.80 MPa, f_k,MEG-2 = 24.55 MPa), along with the mean modulus of elasticity (E_0,mean,MEG-1 = 13,082 MPa, E_0,mean,MEG-2 = 12,320 MPa) and density (ρ_k,MEG-1 = 456.6 kg m−3, ρ_k,MEG-2 = 469.1 kg m−3), before finally assigning strength class C30 to visual class MEG-1 and strength class C24 to visual class MEG-2.
Paper devices have recently attracted considerable attention as a class of cost-effective cell culture substrates for various biomedical applications. The paper biomaterial can be used to partially mimic in vivo cell microenvironments, mainly due to its natural three-dimensional character. Paper-based devices provide precise control over their structures as well as cell distributions, allowing recapitulation of certain interactions between the cells and the extracellular matrix. These features have shown great potential for the development of normal and diseased human tissue models. In this review, we discuss the fabrication of paper-based devices for in vitro tissue modeling, as well as the applications of these devices toward drug screening and personalized medicine. It is believed that paper as a biomaterial will play an essential role in the field of tissue model engineering due to its unique properties, such as good biocompatibility, eco-friendliness, cost-effectiveness, and amenability to various biodesign and manufacturing needs.
The search for efficient and versatile structural elements led to the fabrication of I-joists (6.5 cm × 18.5 cm × 600 cm, width × depth × length) with glue-laminated bamboo (Guadua angustifolia) in the flanges and 12-mm Gmelina arborea structural plywood in the web. The results showed a modulus of rupture (MOR) of 39.45 MPa and an effective modulus of elasticity (MOE) of 17.05 GPa. Shear strength in the glue line was 5.95 MPa and the lamination strength was 6.45 MPa. Structural design values averaged 9.43 MPa in bending and 4.72 MPa in shear according to Costa Rican structural standards. Both resistance values (flexure and shear) were considered satisfactory for structural purposes, and I-joists fabricated with bamboo and G. arborea plywood are comparable to the Andean classification group "C" structural grade. The use of this I-joist in roofing and flooring systems was also demonstrated. This beam can be used in allowable spans from 2 to 4 m for flooring systems and from 5 to 7 m for roofing applications.
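A design bending stress such as the 9.43 MPa above translates into a moment capacity through the section modulus of the I-profile. A sketch of that textbook calculation for an idealized symmetric I-section; the overall 65 × 185 mm envelope comes from the abstract, while the flange thickness (35 mm) is an assumption for illustration only:

```python
def i_section_moment_capacity(B, H, b, h, f_b):
    """Moment capacity (N*mm) of an idealized symmetric I-section:
    I = (B*H^3 - b*h^3) / 12,  S = I / (H/2),  M = f_b * S.
    B, H: overall width/depth (mm); b: total removed width (mm);
    h: clear height between flanges (mm); f_b: bending stress (N/mm^2)."""
    I = (B * H ** 3 - b * h ** 3) / 12.0   # second moment of area
    S = I / (H / 2.0)                      # elastic section modulus
    return f_b * S

# 65 x 185 mm envelope (abstract); 12 mm web, 35 mm flanges are assumed.
M = i_section_moment_capacity(B=65, H=185, b=65 - 12, h=185 - 2 * 35, f_b=9.43)
M_kNm = M / 1e6
```

The result (a few kN·m) is only a geometry exercise; real joist capacities also depend on lateral stability, shear in the web, and the glue-line strengths reported above.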
An old automotive industrial site located in Mexico City, with many years of operation and contaminated with heavy oil hydrocarbons, particularly spent oils, was assessed for restoration using the surfactant-enhanced soil washing (SESW) process. The main goal of this study was to characterize the contaminated soil in terms of TPH, BTEX, PAH, and metal contents as well as microbiologically (total heterotrophs and specific degrading microorganisms). We also aimed to determine the surfactant type and concentration to be used in the SESW process for the automotive waste-oil contaminated soil. Finally, sixteen kg of contaminated soil were washed and the produced wastewater (approximately 40 L) was characterized in terms of COD, BOD, solids, and other physico-chemical parameters. The soil contained about 14,000 mg of TPH/kg soil (heavy fraction), 0.13 mg/kg of benzo(k)fluoranthene and 0.07 mg/kg of benzo(a)pyrene, as well as traces of some metals. Metal concentrations were always under the maximum concentration levels suggested by Mexican regulations. Fifteen different surfactants were evaluated to identify the one capable of achieving the highest TPH removal: 5 anionics, 2 zwitterionics, 5 nonionics, and 3 natural gums. Sulfopon 30 at a concentration of 0.5% offered the best surfactant performance. TPH removals employing the different surfactants ranged from 38% to 68%, compared with soil washing with water alone (10% TPH removal). Once the surfactant was selected, 70 kg of soil were washed and the resulting water contained approximately 1300 mg/L of COD, 385 mg/L of BOD (BOD/COD = 0.29), 122 mg/L of MBAS, and 212 mg/L of oils and greases, among other contaminants.
The purpose of this article is to present a metallographic analysis of an underground pipeline taken out of operation upon failure. The pipeline had an 8.89 cm (3.5") diameter and a 7 mm wall thickness. The study was based on a 45 cm long pipe sample, visibly and entirely corroded, with a fish-mouth crack along its length. The work contributes to finding new ways to prevent structural failure, which has high-impact consequences from the point of view of production, damage to property, pollution, and risks to human life. Through this analysis, knowledge of the behavior of failures in buried pipelines has been extended. The research included metallographic, chemical, and mechanical tests on the sample in order to determine the composition of the material, its strength, and its physical condition upon being taken out of operation. After analysis of the laboratory tests, the physical and chemical features were compared with existing national and international regulations, which allowed a specific characterization of the conditions of the sample. In accordance with the regulations, the grade of the pipe was between X65 and X70. Tensile testing was carried out to obtain mechanical properties in order to corroborate the grade of the pipeline steel and complement the metallographic analysis.
Fractal interpolation is a modern technique for fitting and analyzing scientific data. We develop a new class of fractal interpolation functions which converge to the data-generating (original) function for any choice of the scaling factors. Consequently, our method offers an alternative to the existing fractal interpolation functions (FIFs). We construct a sequence of α-FIFs using a suitable sequence of iterated function systems (IFSs). Without imposing any condition on the scaling vector, we establish constrained interpolation using fractal functions. In particular, the constrained interpolation discussed herein includes a method to obtain fractal functions that preserve the positivity inherent in the given data. The existence of C^r-α-FIFs is investigated. We identify suitable conditions on the associated scaling factors so that α-FIFs preserve r-convexity in addition to the C^r-smoothness of the original function.
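For context, the classic affine FIF construction (not the paper's new convergent class) builds one affine map per data interval and samples the attractor with a chaos-game iteration. A sketch, with illustrative data points and vertical scaling factors:

```python
import random

def fif_chaos_game(xs, ys, scales, n_iter=20000, seed=0):
    """Sample the attractor of the classic affine fractal-interpolation IFS
    for data (x_i, y_i), i = 0..N, with vertical scaling factors d_i.
    Maps: L_i(x) = a_i*x + b_i,  F_i(x, y) = c_i*x + d_i*y + f_i, with
    coefficients chosen so the graph passes through the data points."""
    N = len(xs) - 1
    x0, xN = xs[0], xs[-1]
    maps = []
    for i in range(1, N + 1):
        a = (xs[i] - xs[i - 1]) / (xN - x0)
        b = (xN * xs[i - 1] - x0 * xs[i]) / (xN - x0)
        d = scales[i - 1]
        c = (ys[i] - ys[i - 1] - d * (ys[-1] - ys[0])) / (xN - x0)
        f = (xN * ys[i - 1] - x0 * ys[i] - d * (xN * ys[0] - x0 * ys[-1])) / (xN - x0)
        maps.append((a, b, c, d, f))
    rng = random.Random(seed)
    x, y = xs[0], ys[0]
    pts = []
    for _ in range(n_iter):
        a, b, c, d, f = rng.choice(maps)   # pick one contraction at random
        x, y = a * x + b, c * x + d * y + f
        pts.append((x, y))
    return pts

# Interpolation data and scaling factors are illustrative only.
pts = fif_chaos_game([0.0, 0.5, 1.0], [0.0, 1.0, 0.0], [0.3, 0.3])
```

With |d_i| < 1 every map is a contraction, so the iteration converges to the unique attractor, which is the graph of the FIF through the given points.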
Spatial patterns reveal critical features at the individual and community levels. However, how to evaluate changes in spatial characteristics remains largely unexplored. We assess changes in spatial point patterns by augmenting current statistical functions and indices. We fitted functions to describe unmarked and marked (tree size) spatial patterns using data from a large-scale silvicultural experiment in southern Chile. Furthermore, we computed the mingling index to represent spatial tree diversity. We propose the difference of the pair correlation function before and after treatment to detect changes in the unmarked point pattern of trees, and the semivariogram ratio to evaluate changes in the marked point pattern. Our research provides a quantitative assessment of a critical aspect of forest heterogeneity: changes in spatial unmarked and marked point patterns. The proposed approach can be a powerful tool for quantifying the impacts of disturbances and silvicultural treatments on spatial patterns in forest ecosystems.
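The mingling index used above is, for each tree, the fraction of its k nearest neighbours that belong to a different species, averaged over the stand (von Gadow's species mingling). A minimal sketch on a toy four-tree stand; the coordinates and species labels are illustrative:

```python
from math import dist

def mingling_index(points, species, k=4):
    """Mean mingling: for each tree, the fraction of its k nearest
    neighbours belonging to a different species, averaged over all trees."""
    n = len(points)
    values = []
    for i in range(n):
        neighbours = sorted((j for j in range(n) if j != i),
                            key=lambda j: dist(points[i], points[j]))[:k]
        values.append(sum(species[j] != species[i] for j in neighbours) / k)
    return sum(values) / n

# Toy stand: two species on a unit square, k = 1 neighbour for illustration.
m = mingling_index([(0, 0), (1, 0), (0, 1), (1, 1)], ["A", "A", "B", "B"], k=1)
```

Values near 0 indicate spatial segregation of species; values near 1 indicate strong intermingling.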
An influence game is a simple game represented over an influence graph (i.e., a labeled, weighted graph) on which the influence spread phenomenon is exerted. Influence games allow applying different properties and parameters from cooperative game theory to the contexts of social network analysis, decision systems, voting systems, and collective behavior. The exact calculation of several of these properties and parameters is computationally hard, even for a small number of players. Two examples of these parameters are the length and the width of a game. The length of a game is the size of its smallest winning coalition, while the width of a game is the size of its largest losing coalition. Both parameters are relevant for understanding the levels of difficulty in reaching agreements in collective decision-making systems. Despite this hardness, new bio-inspired metaheuristic algorithms have recently been developed to solve the NP-hard influence maximization problem in an efficient and approximate way, finding small winning coalitions that maximize the influence spread within an influence graph. In this article, we apply variations of this solution to find extreme winning and losing coalitions, and thus efficient approximate solutions for the length and the width of influence games. As a case study, we consider two real social networks: one formed by the 58 members of the European Union Council under the Nice voting rules, and the other formed by the 705 members of the European Parliament, connected by political affinity. Results are promising and show that it is feasible to generate approximate solutions for the length and width parameters of influence games in a reduced solving time.
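For very small games, length and width can be computed exactly by enumerating all coalitions, which also makes clear why metaheuristics are needed at the scale of a 705-member network: the search space grows as 2^n. A brute-force sketch on a hypothetical weighted voting game (not an influence game and not the EU Council rules):

```python
from itertools import combinations

def length_and_width(weights, quota):
    """Brute force over all coalitions of a weighted voting game:
    length = size of the smallest winning coalition (total >= quota),
    width  = size of the largest losing coalition (total < quota)."""
    n = len(weights)
    players = range(n)
    length = width = None
    for size in range(n + 1):
        for coalition in combinations(players, size):
            total = sum(weights[i] for i in coalition)
            if total >= quota and length is None:
                length = size          # first (smallest) winning size found
            if total < quota:
                width = size           # keeps growing to the largest losing size
    return length, width

# Illustrative 4-player game: weights 4, 3, 2, 1 and quota 6.
L, W = length_and_width([4, 3, 2, 1], quota=6)
```

Here `{4, 3}` wins with two players (length 2), while `{2, 1}` is the largest coalition that still loses (width 2); every three-player coalition reaches the quota.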
Web applications represent one of the principal vehicles by which attackers gain access to an organization's network or resources. Thus, different approaches to protect web applications have been proposed to date. The two major approaches among them are Web Application Firewalls (WAF) and Runtime Application Self-Protection (RASP). It is therefore essential to understand the differences and relative effectiveness of both approaches for effective decision-making regarding the security of web applications. Here we present a comparative study of WAF and RASP in simulated settings, with the aim of comparing their effectiveness and efficiency against different categories of attacks. For this, we computed different metrics and ranked the results using the F-score index. We found that RASP tools scored better than WAF tools. In this study, we also developed a new experimental methodology for the objective evaluation of web protection tools since, to the best of our knowledge, no existing method specifically evaluates web protection tools.
This study quantifies seismic amplifications in near-shore regions arising from seaquakes. Within the Boundary Element Method, boundary elements are used to irradiate waves, and force densities are obtained for each element. Huygens' principle is implemented, since the diffracted waves are constructed at the boundary from which they are radiated, which is equivalent to Somigliana's theorem. Application of the boundary conditions leads to a system of integral equations of the Fredholm type of the second kind and zero order. Several numerical configurations are analyzed: the first is used to verify the present formulation with ideal sea-floor configurations to estimate seismic amplifications. With the formulation verified, simple slope configurations are studied to estimate spectra of seismic motions. It is found that P-waves can produce seismic amplifications from 1.2 to 3.9 times the amplitude of the incident wave, while SV-waves can generate seismic amplifications up to 4.5 times the incident wave. Another relevant finding is that the highest amplifications occur at the shore rather than at the sea floor.
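Once the boundary conditions yield a Fredholm integral equation of the second kind, a standard numerical route is a Nyström discretization: replace the integral by a quadrature rule and solve the resulting linear system. A sketch on a toy scalar kernel with a known solution (the elastodynamic BEM kernels of the study are far more involved):

```python
# Nystrom discretization of a Fredholm equation of the second kind:
#   phi(x) - lam * integral_0^1 K(x, t) phi(t) dt = f(x)
# using the trapezoidal rule. The kernel below is a toy stand-in,
# not the Green's-function kernel of the boundary element method.

def nystrom_fredholm2(K, f, lam, n=101):
    h = 1.0 / (n - 1)
    xs = [i * h for i in range(n)]
    w = [h] * n
    w[0] = w[-1] = h / 2.0          # trapezoidal weights
    # Build (I - lam * K W) and solve by Gaussian elimination.
    A = [[(1.0 if i == j else 0.0) - lam * K(xs[i], xs[j]) * w[j]
          for j in range(n)] for i in range(n)]
    b = [f(x) for x in xs]
    for col in range(n):            # elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    phi = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        s = sum(A[r][c] * phi[c] for c in range(r + 1, n))
        phi[r] = (b[r] - s) / A[r][r]
    return xs, phi

# Toy problem with known solution phi(x) = x:
# K(x, t) = x*t, lam = 1  =>  f(x) = x - x * (1/3) = 2x/3.
xs, phi = nystrom_fredholm2(lambda x, t: x * t, lambda x: 2 * x / 3, lam=1.0)
```

The recovered `phi` matches x to quadrature accuracy, which is the essential mechanism behind solving the boundary integral system in the study, whatever quadrature and kernel are actually used.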
Using first-principles density functional theory calculations within the generalized gradient approximation (GGA), we study the bis(1H-imidazolium-κN3)silver(I) nitrate molecular crystal. A number of different exchange-correlation functionals are considered for a possible treatment of the system. The Perdew-Burke-Ernzerhof (PBE) GGA exchange-correlation functionals are found to be adequate for our system. The obtained results show that it is possible to reproduce very well the geometry of at least some molecular crystals if the computational parameters are chosen adequately. In addition to reproducing the crystal structure of bis(1H-imidazolium-κN3)silver(I) nitrate in close agreement with the available experimental data, the present work reports an analysis of the chemical bonding in the material and gives the total and partial density of states of this system.
Security weaknesses in web applications deployed in cloud architectures can seriously affect their data confidentiality and integrity. The procedures implemented by static source-code security analysis tools differ, and therefore each tool finds a different number of each weakness type for which it is designed. To exploit the possible synergies different static analysis tools may provide, this work uses a new method to combine several of these tools, aiming to investigate how to increase the performance of security weakness detection while reducing the number of false positives. Specifically, five static analysis tools are combined with the designed method to study their behavior using an updated benchmark for the OWASP Top Ten Security Weaknesses (OWASP TTSW). The method selects specific metrics to rank the tools for different criticality levels of web applications, considering different weights in the ratios. The findings show that simply including more tools in a combination is not synonymous with better results; the outcome depends on the specific tools included in the combination, due to their different designs and techniques.
In a computer environment, an operating system is prone to malware, and even the Linux operating system is no exception. In recent years, malware has evolved, and attackers have become more qualified compared to a few years ago. Furthermore, Linux-based systems have become more attractive to cybercriminals because of the increasing use of the Linux operating system in web servers and Internet of Things (IoT) devices. Windows is the most widely used OS, so most research efforts have been focused on its malware protection rather than on other operating systems. As a result, hundreds of research articles, documents, and methodologies dedicated to malware analysis have been reported. However, there has not been much literature concerning Linux security and protection from malware. To address all these new challenges, it is necessary to develop a methodology that can standardize the steps required to perform malware analysis in depth. A systematic analysis process makes the difference between a good and an ordinary malware analysis. Additionally, a deep comprehension of the malware can yield a faster and much more efficient eradication. In order to address all of the challenges mentioned, this article proposes a methodology for malware analysis in the Linux operating system, a traditionally overlooked field compared to other operating systems. The proposed methodology is tested on a specific Linux malware sample, and the obtained test results show high effectiveness in malware detection.
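A first standardized step in any such methodology is static triage: hashing the sample and identifying its file format before deeper analysis. A minimal sketch of that step (illustrative only; a real workflow adds string extraction, ELF-header parsing, YARA rules, sandboxed dynamic analysis, and so on):

```python
import hashlib
import os
import tempfile

ELF_MAGIC = b"\x7fELF"  # first four bytes of every ELF executable

def triage(path):
    """Minimal static-triage step for a Linux sample: fingerprint the file
    with SHA-256 and check the ELF magic number."""
    with open(path, "rb") as fh:
        data = fh.read()
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "size": len(data),
        "is_elf": data[:4] == ELF_MAGIC,
    }

# Demo on a synthetic file standing in for a sample (not real malware).
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"\x7fELF" + b"\x00" * 60)
tmp.close()
report = triage(tmp.name)
os.unlink(tmp.name)
```

The hash gives a stable identifier for cross-referencing threat-intelligence feeds, and the format check decides which analysis branch (ELF tooling vs. script interpreters) the methodology should follow next.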
This work clarifies the relation between the Maxwell, Dirac, and Majorana neutrino equations, presenting an original way to derive the Dirac and neutrino equations from chiral electrodynamics, leading, perhaps, to a novel conception of mass generation by electromagnetic fields. In the present article, it is shown that the Maxwell equations can be written in the same form as the two-component Dirac and neutrino equations; that is, the vector representation of electromagnetic theory can be factorized into a pair of two-component spinor field equations. We propose a simple approach with the electric field E parallel to the magnetic field H. Our analysis is based on the chiral, or Weyl, form of the Maxwell equations in a chiral vacuum. This theory provides a new quantum mechanics (QM) interpretation of the Dirac and neutrino equations. The present research argues that the QM of particles represents the electrodynamics of curvilinear closed chiral waves. Electromagnetic properties of neutrinos are discussed.
Images captured in a bad environment usually lose fidelity and contrast. As light rays travel towards their destination they are scattered several times by the tiny particles of fog and pollutants in the environment; energy is therefore lost to multiple scattering before the light arrives at its destination, and this degrades the images. Images taken in bad weather thus appear of poor quality, and single-image haze removal is quite a difficult task. Significant research has been done on haze removal algorithms, but in all of these techniques the scattering coefficient is taken as a constant, corresponding to a homogeneous atmosphere, which does not hold in real scenes. This paper therefore introduces a simple and efficient method that makes the scattering coefficient variable according to the inhomogeneous environment. This research then aims to remove the haze with the help of a fast and effective algorithm, i.e., Prior Color Fading, according to the inhomogeneous environmental properties. To filter the depth map, the authors use weighted guided image filtering, which removes the drawbacks of the guided image filter. Afterwards, the scattering coefficient is made variable according to the inhomogeneous atmosphere, and the Simple Color Balance algorithm is applied to increase the readability of the images. The proposed method was tested on various general outdoor images and synthetic hazy images and analyzed on several parameters: Mean Square Error (MSE), Root Mean Square Error (RMSE), Peak Signal-to-Noise Ratio (PSNR), Mean Structural Similarity (MSSIM), and the Universal Objective Quality Index (UQI). Experimental results show that the proposed approach provides better results than state-of-the-art haze removal algorithms.
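The constant-versus-variable scattering coefficient discussion refers to the standard atmospheric scattering model I = J·t + A·(1 − t), with transmission t(x) = exp(−β(x)·d(x)). A sketch on a synthetic 1-D signal where β varies per pixel, with a PSNR check; all numbers are illustrative and no image I/O is involved:

```python
import math

def transmission(beta, depth):
    """t(x) = exp(-beta(x) * d(x)); beta may vary per pixel to model
    an inhomogeneous atmosphere instead of a single global constant."""
    return [math.exp(-b * d) for b, d in zip(beta, depth)]

def dehaze(I, t, A, t0=0.1):
    """Invert I = J*t + A*(1 - t) for the scene radiance J,
    clamping t from below to avoid amplifying noise."""
    return [(i - A) / max(ti, t0) + A for i, ti in zip(I, t)]

def psnr(ref, est, peak=1.0):
    mse = sum((r - e) ** 2 for r, e in zip(ref, est)) / len(ref)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

# Synthetic 1-D "image": haze a clean signal J, then invert exactly.
J = [0.2, 0.5, 0.8, 0.4]
beta, depth, A = [1.0, 1.2, 0.8, 1.5], [1.0, 1.5, 0.5, 1.2], 0.9
t = transmission(beta, depth)
I = [j * ti + A * (1 - ti) for j, ti in zip(J, t)]
J_hat = dehaze(I, t, A)
```

With known t and A the inversion is exact; the hard parts that the paper addresses are estimating the depth map (here via weighted guided filtering) and the spatially varying β from a single hazy image.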
Funding: We thank the Basque centre of research and applied innovation in VET (TKNIKA), the Centre for services and promotion of Castilla y León forestry and its industry (CESEFOR), D. Bixente Dorronsoro, the Gipuzkoa provincial council, the Commercial services of the wood of Guipuzkoa (SECOMA), and the Larranaga sawmill (Azpeitia).
Funding: This work was supported by the National Institutes of Health (R00CA201603, R21EB025270, R21EB026175, R01EB028143) and the Brigham Research Institute.
Funding: Supported by the Council of Scientific & Industrial Research (CSIR), India (25(0290)/18/EMR-II).
Abstract: Fractal interpolation is a modern technique to fit and analyze scientific data. We develop a new class of fractal interpolation functions which converge to the data-generating (original) function for any choice of the scaling factors. Consequently, our method offers an alternative to the existing fractal interpolation functions (FIFs). We construct a sequence of α-FIFs using a suitable sequence of iterated function systems (IFSs). Without imposing any condition on the scaling vector, we establish constrained interpolation using fractal functions. In particular, the constrained interpolation discussed herein includes a method to obtain fractal functions that preserve the positivity inherent in the given data. The existence of Cr-α-FIFs is investigated. We identify suitable conditions on the associated scaling factors so that the α-FIFs preserve r-convexity in addition to the Cr-smoothness of the original function.
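As a rough illustration of the fixed-point construction underlying FIFs (a classical affine scheme, not the paper's α-FIF sequence, whose base functions and convergence conditions are specific to that work), the sketch below iterates the Read-Bajraktarević operator of an affine IFS until the attractor interpolates the data; all parameter values are illustrative:

```python
import numpy as np

def affine_fif(x, y, s, n_iter=20, m=401):
    """Fixed-point iteration for a classical affine fractal interpolation
    function. x, y: interpolation data (x strictly increasing);
    s: vertical scaling factors (|s_i| < 1), one per subinterval.
    Returns a dense grid (xs, fs) approximating the FIF."""
    x, y, s = map(np.asarray, (x, y, s))
    x0, xN, y0, yN = x[0], x[-1], y[0], y[-1]
    xs = np.linspace(x0, xN, m)
    fs = np.interp(xs, x, y)              # start from the linear interpolant
    for _ in range(n_iter):
        new = np.empty_like(fs)
        for i in range(1, len(x)):
            # L_i maps [x0, xN] onto [x_{i-1}, x_i]
            a = (x[i] - x[i-1]) / (xN - x0)
            b = x[i-1] - a * x0
            # F_i(t, f) = s_i f + c_i t + d_i, pinned to the data endpoints
            c = (y[i] - y[i-1] - s[i-1] * (yN - y0)) / (xN - x0)
            d = y[i-1] - c * x0 - s[i-1] * y0
            xi = a * xs + b                    # image of the grid under L_i
            fi = s[i-1] * fs + c * xs + d      # transformed function values
            mask = (xs >= x[i-1]) & (xs <= x[i])
            new[mask] = np.interp(xs[mask], xi, fi)
        fs = new
    return xs, fs
```

By construction the fixed point passes through every data point, which is easy to check on the returned grid.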
Funding: Supported by the Chilean research grant Fondecyt No. 1210147.
Abstract: Spatial patterns reveal critical features at the individual and community levels. However, how to evaluate changes in spatial characteristics remains largely unexplored. We assess changes in spatial point patterns by augmenting current statistical functions and indices. We fitted functions to describe unmarked and marked (tree size) spatial patterns using data from a large-scale silvicultural experiment in southern Chile. Furthermore, we computed the mingling index to represent spatial tree diversity. We propose the difference of the pair correlation function before and after treatment to detect changes in the unmarked point pattern of trees, and the semivariogram ratio to evaluate changes in the marked point pattern. Our research provides a quantitative assessment of a critical aspect of forest heterogeneity: changes in spatial unmarked and marked point patterns. The proposed approach can be a powerful tool for quantifying the impacts of disturbances and silvicultural treatments on spatial patterns in forest ecosystems.
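The mingling index mentioned above is simple to compute. A minimal sketch, assuming the common definition (the fraction of each tree's k nearest neighbours that carry a different mark); the coordinates, marks, and k below are illustrative, not the experiment's data:

```python
import numpy as np

def mean_mingling(coords, marks, k=4):
    """Mean mingling index of a marked point pattern: for each tree, the
    fraction of its k nearest neighbours with a different mark (species
    or size class), averaged over all trees."""
    coords = np.asarray(coords, dtype=float)
    marks = np.asarray(marks)
    # brute-force pairwise distances (fine for small stands)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # a tree is not its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]      # indices of the k nearest
    mixed = marks[nn] != marks[:, None]    # neighbour has a different mark?
    return float(mixed.mean())
```

Values near 0 indicate spatial segregation of marks; values near 1 indicate strong mixing.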
Funding: F. Riquelme has been partially supported by Fondecyt de Iniciación 11200113, Chile, and by the SEGIB scholarship of Fundación Carolina, Spain; X. Molinero under grant PID2019-104987GB-I00 (JUVOCO); and M. Serna under grants PID2020-112581GB-C21 (MOTION) and 2017-SGR-786 (ALBCOM).
Abstract: An influence game is a simple game represented over an influence graph (i.e., a labeled, weighted graph) on which the influence-spread phenomenon is exerted. Influence games allow applying different properties and parameters from cooperative game theory to the contexts of social network analysis, decision systems, voting systems, and collective behavior. The exact calculation of several of these properties and parameters is computationally hard, even for a small number of players. Two examples are the length and the width of a game: the length is the size of its smallest winning coalition, while the width is the size of its largest losing coalition. Both parameters are relevant to knowing how difficult it is to reach agreements in collective decision-making systems. Recently, new bio-inspired metaheuristic algorithms have been developed to solve the NP-hard influence maximization problem efficiently and approximately, finding small winning coalitions that maximize the influence spread within an influence graph. In this article, we apply variations of this solution to find extreme winning and losing coalitions, and thus efficient approximate solutions for the length and the width of influence games. As a case study, we consider two real social networks: one formed by the 58 members of the European Union Council under the Nice voting rules, and the other formed by the 705 members of the European Parliament, connected by political affinity. Results are promising and show that it is feasible to generate approximate solutions for the length and width parameters of influence games in reduced solving time.
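To make the length and width parameters concrete, here is a minimal exhaustive-search sketch over a simple weighted majority game (a stand-in, not the paper's influence games, whose winning condition is defined by influence spread); the weights and quota are illustrative:

```python
from itertools import combinations

def length_and_width(weights, quota):
    """Length (size of the smallest winning coalition) and width (size of
    the largest losing coalition) of a weighted majority game, by brute
    force. Exponential in the number of players, which is exactly why
    approximate metaheuristics are needed for larger games."""
    n = len(weights)
    wins = lambda coal: sum(weights[i] for i in coal) >= quota
    # smallest k such that some coalition of size k wins
    length = next(k for k in range(n + 1)
                  if any(wins(c) for c in combinations(range(n), k)))
    # largest k such that some coalition of size k loses
    width = next(k for k in range(n, -1, -1)
                 if any(not wins(c) for c in combinations(range(n), k)))
    return length, width
```

For instance, with weights [4, 3, 2, 1] and quota 6, no single player wins but {4, 3} does, and the largest losing coalition is {2, 1}.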
Abstract: Web applications represent one of the principal vehicles by which attackers gain access to an organization's network or resources. Thus, different approaches to protect web applications have been proposed to date. The two major approaches are Web Application Firewalls (WAF) and Runtime Application Self-Protection (RASP). It is therefore essential to understand the differences and relative effectiveness of these approaches for effective decision-making regarding the security of web applications. Here we present a comparative study between WAF and RASP in simulated settings, with the aim of comparing their effectiveness and efficiency against different categories of attacks. For this, we computed different metrics and sorted the results using the F-score index. We found that RASP tools scored better than WAF tools. In this study, we also developed a new experimental methodology for the objective evaluation of web protection tools since, to the best of our knowledge, no existing method specifically evaluates web protection tools.
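The F-score index used to sort the results combines precision and recall over detection counts. A minimal sketch of the standard metric (the counts below are illustrative, not the study's measurements):

```python
def f_score(tp, fp, fn, beta=1.0):
    """F-beta score from detection counts: tp = attacks blocked,
    fp = legitimate requests blocked, fn = attacks missed.
    beta > 1 weighs recall (missed attacks) more heavily."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

With beta = 1 this is the usual harmonic mean of precision and recall, so a tool that blocks everything (high recall, low precision) does not automatically outrank a more selective one.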
Abstract: This study quantifies seismic amplifications in near-shore regions arising from seaquakes. Within the Boundary Element Method, boundary elements are used to radiate waves, and force densities are obtained for each element. Huygens' principle is implemented, since the diffracted waves are constructed at the boundary from which they are radiated, which is equivalent to Somigliana's theorem. Application of the boundary conditions leads to a system of Fredholm integral equations of the second kind and zero order. Several numerical configurations are analyzed: the first is used to verify the present formulation with ideal sea-floor configurations to estimate seismic amplifications. With the formulation verified, simple slope configurations are studied to estimate spectra of seismic motions. It is found that P-waves can produce seismic amplifications of 1.2 to 3.9 times the amplitude of the incident wave, while SV-waves can generate amplifications up to 4.5 times the incident wave. Another relevant finding is that the highest amplifications occur at the shore rather than at the sea floor.
Abstract: Using first-principles density functional theory calculations within the generalized gradient approximation, we study the bis(1H-imidazolium-kN3)silver(I) nitrate molecular crystal. A number of different exchange-correlation functionals are considered for a possible treatment of the system. The Perdew-Burke-Ernzerhof (PBE) GGA exchange-correlation functional is found to be adequate for our system. The obtained results show that it is possible to reproduce the geometry of at least some molecular crystals very well if the computational parameters are chosen adequately. In addition to reproducing the crystal structure of bis(1H-imidazolium-kN3)silver(I) nitrate in close agreement with the available experimental data, the present work reports an analysis of the chemical bonding in the material and gives the total and partial densities of states of this system.
Abstract: Security weaknesses in web applications deployed in cloud architectures can seriously affect their data confidentiality and integrity. The procedures implemented by static source code security analysis tools differ, and therefore each tool finds a different number of each weakness type for which it is designed. To exploit the possible synergies among different static analysis tools, this work uses a new method to combine several of them, aiming to investigate how to increase the performance of security weakness detection while reducing the number of false positives. Specifically, five static analysis tools are combined with the designed method to study their behavior using an updated benchmark for OWASP Top Ten Security Weaknesses (OWASP TTSW). The method selects specific metrics to rank the tools for different criticality levels of web applications, considering different weights in the ratios. The findings show that simply including more tools in a combination is not synonymous with better results; the outcome depends on the specific tools included, due to their different designs and techniques.
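The idea of ranking tool combinations with criticality-dependent weights can be sketched as a weighted composite score. This is a loose illustration under assumed metric names (the paper's actual metrics, ratios, and weights are not reproduced here):

```python
def rank_tools(metrics, weights):
    """Rank tools (or tool combinations) by a weighted sum of per-metric
    ratios in [0, 1]. The weight vector encodes the criticality level of
    the web application: a critical application would weigh detection
    (recall) heavily, a low-criticality one might favour precision.
    Metric names here are illustrative."""
    scored = {tool: sum(weights[m] * v for m, v in vals.items())
              for tool, vals in metrics.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

Note how the same measurements produce different rankings as the weights change, which is the point of grading by application criticality.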
Abstract: In a computing environment, an operating system is prone to malware, and even the Linux operating system is no exception. In recent years malware has evolved, and attackers have become more skilled than they were a few years ago. Furthermore, Linux-based systems have become more attractive to cybercriminals because of the increasing use of the Linux operating system in web servers and Internet of Things (IoT) devices. Windows is the most widely used OS, so most research efforts have focused on its malware protection rather than on other operating systems. As a result, hundreds of research articles, documents, and methodologies dedicated to malware analysis have been reported, yet there is little literature concerning Linux security and protection from malware. To address these new challenges, it is necessary to develop a methodology that standardizes the steps required to perform in-depth malware analysis. A systematic analysis process makes the difference between a good and an ordinary malware analysis, and deep malware comprehension can yield faster and much more efficient malware eradication. To address all the challenges mentioned, this article proposes a methodology for malware analysis on the Linux operating system, a traditionally overlooked field compared to other operating systems. The proposed methodology is tested on a specific Linux malware sample, and the test results show high effectiveness in malware detection.
Abstract: This work clarifies the relation between the Maxwell, Dirac, and Majorana neutrino equations, presenting an original way to derive the Dirac and neutrino equations from chiral electrodynamics and leading, perhaps, to a novel conception of mass generation by electromagnetic fields. It is shown that the Maxwell equations can be written in the same form as the two-component Dirac and neutrino equations; that is, the vector representation of electromagnetic theory can be factorized into a pair of two-component spinor field equations. We propose a simple approach with the electric field E parallel to the magnetic field H. Our analysis is based on the chiral, or Weyl, form of the Maxwell equations in a chiral vacuum. This theory offers a new quantum mechanics (QM) interpretation of the Dirac and neutrino equations. The research argues that the QM of particles represents the electrodynamics of curvilinear, closed chiral waves. The electromagnetic properties of neutrinos are also discussed.
Abstract: Images captured in a bad environment usually lose fidelity and contrast. As light rays travel toward their destination, they are scattered many times by the tiny particles of fog and pollutants in the atmosphere; the energy lost to this multiple scattering degrades the image, so pictures taken in bad weather appear low in quality, and single-image haze removal is a difficult task. Significant research has been done on haze-removal algorithms, but in these techniques the scattering coefficient is taken as constant, assuming a homogeneous atmosphere, which does not hold in practice. This paper therefore introduces a simple and efficient method that makes the scattering coefficient variable according to the inhomogeneous environment. The haze is then removed with a fast and effective algorithm, Prior Color Fading, adapted to the inhomogeneous environmental properties. To filter the depth map, the authors use weighted guided image filtering, which removes the drawbacks of the guided image filter. The scattering coefficient is made variable according to the inhomogeneous atmosphere, and the Simple Color Balance algorithm is then applied to increase the readability of the images. The proposed method was tested on various general outdoor images and synthetic hazy images and analyzed with several metrics: Mean Square Error (MSE), Root Mean Square Error (RMSE), Peak Signal-to-Noise Ratio (PSNR), Mean Structural Similarity (MSSIM), and the Universal Objective Quality Index (UQI). Experimental results show that the proposed approach outperforms state-of-the-art haze-removal algorithms.
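Two of the quality metrics used above, MSE and PSNR, are standard and easy to state. A minimal sketch for 8-bit images (the arrays below are illustrative, not the paper's test images):

```python
import numpy as np

def mse_psnr(ref, test, peak=255.0):
    """Mean squared error and peak signal-to-noise ratio between a
    reference image and a dehazed result, assuming an 8-bit value range
    by default. Lower MSE and higher PSNR indicate a closer match."""
    ref = np.asarray(ref, dtype=np.float64)
    test = np.asarray(test, dtype=np.float64)
    mse = float(np.mean((ref - test) ** 2))
    psnr = float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    return mse, psnr
```

RMSE is simply the square root of the MSE; MSSIM and UQI are structural metrics and need windowed statistics rather than a pixelwise mean.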