Abstract: The Enhanced Complexity Model (ECM) developed previously has been further extended to produce a Motivationally Enhanced Complexity Model (MECM), which enables the degree of motivation, capability and opportunity of a hypothetical Trojan Horse author to be included in quantifying the relative plausibility of competing explanations for the existence of uncontested digital evidence. This new model has been applied to the case of the Trojan Horse defence (THD) against the possession of child pornography. Our results demonstrate that the THD in this case cannot be plausibly sustained unless it can be shown that an 'off-the-shelf' (OTS) Trojan Horse for this task is available and that it is not detectable by the target computer at the material time.
Abstract: Loop detectors are widespread and relatively cheap detection devices. With today's traffic communication revolution, high-resolution detector data are available at the central traffic management level. High-resolution detector data consist of detector slopes, also called pulse data. There is an initial and continuous need to check the detectors for correct data, as all kinds of disturbances may add erroneous information to the data. This paper proposes pulse data checking and interval data checking with optional data replacement in order to guarantee a continuous data flow even if detectors do not deliver the expected data quality: raw detector data checking analyses the rising and falling slopes of detector signals; cumulative data checking compares interval values to reference curves. Cumulative data checking needs less computational effort but more parameterization effort than raw detector data checking. Both checking principles have been applied to different systems in Switzerland for about five years.
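To make the slope-level (pulse) checking concrete, the sketch below flags pulses whose rising and falling slopes fail to alternate or whose occupancy time falls outside plausible bounds. The data format and thresholds are assumptions for illustration only, not values taken from the paper.

```python
# Hypothetical sketch (not the paper's implementation): plausibility checks on
# loop-detector pulse data, where each record is (timestamp, edge) with
# edge = 1 for a rising slope (vehicle enters the loop) and 0 for a falling slope.

MIN_ON_TIME = 0.1    # s, shortest plausible occupancy of a single vehicle (assumed)
MAX_ON_TIME = 60.0   # s, longest plausible occupancy before flagging a stuck detector (assumed)

def check_pulses(pulses):
    """Return a list of (timestamp, reason) tuples for implausible pulses."""
    errors = []
    last_edge, last_time = None, None
    for t, edge in pulses:
        if last_edge is not None:
            if edge == last_edge:
                errors.append((t, "missing opposite slope"))  # two rising or two falling edges in a row
            elif last_edge == 1 and edge == 0:
                on_time = t - last_time
                if on_time < MIN_ON_TIME:
                    errors.append((t, "occupancy too short"))
                elif on_time > MAX_ON_TIME:
                    errors.append((t, "occupancy too long"))
        last_edge, last_time = edge, t
    return errors

# Example: a double rising edge and an over-long occupancy are both flagged.
print(check_pulses([(0.0, 1), (0.4, 0), (1.0, 1), (1.2, 1), (80.0, 0)]))
```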
Funding: Supported by the Federal Ministry of Education and Research of Germany under Grant Numbers 16ES1131 and 16ES1128K.
Abstract: The application field for Unmanned Aerial Vehicle (UAV) technology and its adoption rate have been increasing steadily in the past years. The decreasing cost of commercial drones has enabled their use at a scale broader than ever before. However, the increasing complexity of UAVs and their decreasing cost both contribute to a lack of implemented security measures and raise new security and safety concerns. For instance, the issue of implausible or tampered UAV sensor measurements is barely addressed in the current research literature and thus requires more attention from the research community. The goal of this survey is to extensively review state-of-the-art literature regarding common sensor- and communication-based vulnerabilities, existing threats, and active or passive cyber-attacks against UAVs, as well as shed light on the research gaps in the literature. In this work, we describe the Unmanned Aerial System (UAS) architecture to point out the origination sources for security and safety issues. We evaluate the coverage and completeness of each related research work in a comprehensive comparison table, and classify the threats, vulnerabilities and cyber-attacks into sensor-based and communication-based categories. Additionally, for each individual cyber-attack, we describe existing countermeasures or detection mechanisms and provide a list of requirements to ensure UAV security and safety. We also address the problem of implausible sensor measurements and introduce the idea of a plausibility check for sensor data. By doing so, we discover additional measures to improve security and safety and report on a research niche that is not well represented in the current research literature.
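As a minimal illustration of the kind of plausibility check discussed for sensor data (our own sketch, not a system from the surveyed literature), the following check rejects altitude samples that leave the physically possible range or change faster than a plausible climb rate; the limits are assumed values.

```python
# Minimal sensor plausibility check sketch: each new altitude reading is compared
# against a physical range and a maximum plausible climb rate (assumed limits).

MAX_ALT_M = 10000.0   # assumed ceiling for a small UAV, in metres
MAX_CLIMB_MS = 20.0   # assumed maximum plausible vertical speed, in m/s

def altitude_plausible(prev_alt, prev_t, alt, t):
    """Return True if the new altitude sample (alt at time t) is plausible."""
    if not (0.0 <= alt <= MAX_ALT_M):
        return False                      # outside the physically possible range
    if prev_alt is not None and t > prev_t:
        rate = abs(alt - prev_alt) / (t - prev_t)
        if rate > MAX_CLIMB_MS:
            return False                  # jump too fast to be a genuine measurement
    return True

print(altitude_plausible(120.0, 0.0, 121.5, 1.0))   # True: small, slow change
print(altitude_plausible(120.0, 0.0, 900.0, 1.0))   # False: a 780 m/s climb is implausible
```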
Abstract: This paper presents a methodology driven by database constraints for designing and developing (database) software applications. Much needed and delivering excellent results, this paradigm guarantees the highest possible quality of the managed data. The proposed methodology is illustrated with an easy-to-understand, yet complex, medium-sized genealogy software application driven by more than 200 database constraints, which fully meets such expectations.
Funding: Supported by the National Natural Science Foundation of China grant 61836014 to CL; the STI2030-Major Projects (2022ZD0205100); the Strategic Priority Research Program of the Chinese Academy of Sciences, Grant No. XDB32010300; the Shanghai Municipal Science and Technology Major Project (Grant No. 2018SHZDZX05); and the Innovation Academy of Artificial Intelligence, Chinese Academy of Sciences, to ZW.
Abstract: Lateral interaction in the biological brain is a key mechanism that underlies higher cognitive functions. The linear self-organising map (SOM) introduces lateral interaction in a general form in which signals of any modality can be used. Some approaches directly incorporate SOM learning rules into neural networks, but incur complex operations and poor extendibility. An efficient way to implement lateral interaction in deep neural networks has not been well established. The use of Laplacian Matrix-based Smoothing (LS) regularisation is proposed for implementing lateral interaction in a concise form. The authors' derivation and experiments show that lateral interaction as implemented by the SOM model is a special case of LS-regulated k-means, and both show the topology-preserving capability. The authors also verify that LS regularisation can be used in conjunction with the end-to-end training paradigm in deep auto-encoders. Additionally, the benefits of LS regularisation in relaxing the requirement of parameter initialisation in various models and in improving the classification performance of prototype classifiers are evaluated. Furthermore, the topologically ordered structure introduced by LS regularisation in the feature extractor can improve the generalisation performance on classification tasks. Overall, LS regularisation is an effective and efficient way to implement lateral interaction and can be easily extended to different models.
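For orientation, a Laplacian-matrix-based smoothing penalty is commonly written as the graph-Laplacian quadratic form below, added to the task loss; this is a generic formulation in our own notation and not necessarily the exact regulariser used by the authors.

```latex
% Generic Laplacian-smoothing penalty (our notation): the rows w_i of W are the unit
% (e.g. prototype) weight vectors, A is the adjacency matrix of the fixed neighbourhood
% topology, D its degree matrix, and L = D - A the graph Laplacian.
\mathcal{L} = \mathcal{L}_{\text{task}}
  + \lambda \operatorname{tr}\!\left(W^{\top} L W\right)
  = \mathcal{L}_{\text{task}}
  + \frac{\lambda}{2}\sum_{i,j} A_{ij}\,\lVert w_i - w_j\rVert_2^2 .
```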
Funding: The authors acknowledge the financial support of the ARC ITTC DARE Centre IC190100031 (ML, MJ, RS, EC); the ARC DECRA scheme DE190100431 (ML); ARC Linkage Loop3D LP170100985 (ML, MJ, GP, JG); MRIWA Project M0557 (NP, MJ); MinEx CRC (ML, MJ, JG, GP); and support from the European Union's Horizon 2020 research and innovation programme under Marie Sklodowska-Curie grant agreement No. 101032994. This work was also supported by the Mineral Exploration Cooperative Research Centre, whose activities are funded by the Australian Government's Cooperative Research Centre Program.
Abstract: The past two decades have seen a rapid adoption of artificial intelligence methods applied to mineral exploration. More recently, the easier acquisition of some types of data has inspired a broad literature that has examined many machine learning and modelling techniques that combine exploration criteria, or 'features', to generate predictions for mineral prospectivity. Central to the design of prospectivity models is a 'mineral system', a conceptual model describing the key geological elements that control the timing and location of economic mineralisation. The mineral systems model defines what constitutes a training set, which features represent geological evidence of mineralisation, how features are engineered and what modelling methods are used. Mineral systems are knowledge-driven conceptual models, so all parameter choices are subject to human biases and opinion, and alternative models are therefore possible. However, the effect of alternative mineral systems models on prospectivity is rarely compared despite the potential to heavily influence final predictions. In this study, we focus on the effect of conceptual uncertainty on Fe ore prospectivity models in the Hamersley region, Western Australia. Four important considerations are tested. (1) Five different supergene and hypogene conceptual mineral systems models guide the inputs for five forest-based classification prospectivity models. (2) To represent conceptual uncertainty, the predictions are then combined for prospectivity model comparison. (3) Representations of three-dimensional objects as two-dimensional features are tested to address the commonly ignored thickness of geological units. (4) The training dataset is composed of known economic mineralisation sites (deposits) as 'positive' examples and exploration drilling data providing 'negative' sampling locations. Each of the spatial predictions is assessed using independent performance metrics common to AI-based classification methods and subjected to geological plausibility testing. We find that different conceptual mineral systems produce significantly different spatial predictions, so conceptual uncertainty must be recognised. A benefit of recognising and modelling different conceptual models is that robust and geologically plausible predictions can be made that may guide mineral discovery.
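As a toy illustration of the forest-based classification step (invented data and parameters, not the study's features, models or training sets), a random-forest classifier can be trained on feature vectors at deposit and barren drill-hole locations and then used to score a prediction grid:

```python
# Toy sketch of a forest-based prospectivity classifier (all data invented for illustration).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_pos = rng.normal(1.0, 1.0, size=(50, 4))    # feature vectors at known deposits ('positive' examples)
X_neg = rng.normal(-1.0, 1.0, size=(200, 4))  # feature vectors at barren drill holes ('negative' examples)
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_neg))])

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Score every cell of a (toy) prediction grid; the class-1 probability acts as the prospectivity map.
grid = rng.normal(0.0, 1.5, size=(1000, 4))
prospectivity = forest.predict_proba(grid)[:, 1]
print(prospectivity[:5])
```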
Funding: Supported by a grant from the NSFC (70871036) and a grant from the National Basic Research Program of China (2009CB219801-3).
Abstract: Covering rough sets improve traditional rough sets by considering a cover of the universe instead of a partition. In this paper, we develop several measures based on evidence theory to characterize covering rough sets. First, we present belief and plausibility functions in covering information systems and study their properties. With these measures we characterize lower and upper approximation operators and attribute reductions in covering information systems and decision systems, respectively. With these discussions we propose a basic framework for the numerical characterization of covering rough sets.
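For reference, the standard belief and plausibility functions of evidence theory, which the paper adapts to covering information systems, are defined from a basic probability assignment m as follows:

```latex
% Standard definitions from Dempster-Shafer evidence theory (general form, not the
% covering-specific constructions developed in the paper). Here m is a basic probability
% assignment on subsets of the universe U with m(\varnothing) = 0 and \sum_{A \subseteq U} m(A) = 1.
\operatorname{Bel}(X) = \sum_{A \subseteq X} m(A), \qquad
\operatorname{Pl}(X) = \sum_{A \cap X \neq \varnothing} m(A), \qquad
\operatorname{Bel}(X) \le \operatorname{Pl}(X) = 1 - \operatorname{Bel}(U \setminus X).
```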
Abstract: The paper deals with the self-defending capacity of buildings and structures against earthquakes. The idea of this concept comes from Hooke's law of elasticity, published in 1678. The original deterministic concept was converted, with the aid of reliability analysis, into a probabilistic form. In this way seismic resilience covers a larger field of random phenomena and becomes directly involved in more engineering applications.
Abstract: The paper deals with the mathematical concept of jerk, created in 1825 by Carl Jacobi and introduced into dynamics for application purposes in 1936 by Aurel A. Beles. Extended to seismic engineering, this phenomenon of dynamic amplification is based on the first law of conservation in mechanics combined with the theory of dislocations developed by Lev Landau. Finally, the two case studies presented in the paper recommend this simple analysis method for advanced practical cases with any degree of accuracy, as well as for solving plausibility inquiries as requested by ISO 13822:2010.
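For reference, jerk is the third time derivative of displacement, i.e. the rate of change of acceleration:

```latex
% Jerk as the third time derivative of displacement x(t), with v(t) the velocity and a(t) the acceleration.
j(t) = \frac{\mathrm{d}a(t)}{\mathrm{d}t}
     = \frac{\mathrm{d}^{2}v(t)}{\mathrm{d}t^{2}}
     = \frac{\mathrm{d}^{3}x(t)}{\mathrm{d}t^{3}}.
```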
Abstract: Based on a corpus of 10 texts in the materials science discipline, this paper explores the use of hedging in scientific English by both Chinese and native English writers. The results show that there are similarities as well as differences in hedging frequency and distribution between research articles by Chinese writers and those by native English writers. Research articles written by Chinese writers tend to be more direct and authoritative in tone as a result of a higher frequency of approximators and a lower frequency of plausibility shields.
Abstract: In this paper, the theory of plausible and paradoxical reasoning of Dezert-Smarandache (DSmT) is used to take into account the paradoxical character of the intersections of vegetation, aquatic and mineral surfaces. To do this, we developed a pixel classification model that aggregates information using DSmT based on the PCR5 rule, using the ∩NDVI, ∩MNDWI and ∩NDBaI spectral indices obtained from ASTER satellite images. On the qualitative level, the model produced three simple classes for certain knowledge (E, V, M) and eight composite classes, including two union classes characterizing partial ignorance ({E,V}, {M,V}) and six intersection classes, of which three are simple intersection classes (E∩V, M∩V, E∩M) and three are composite intersection classes (E∩{M,V}, M∩{E,V}, V∩{E,M}), which represent paradoxes. The model was validated with an average rate of 93.34% for well-classified pixels and a field compliance rate of the entities of 96.37%. Thus, the retained model 1 provides 84.98% for the simple classes against 15.02% for the composite classes.
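For context, the generic two-source PCR5 rule from the DSmT literature proportionally redistributes the conflicting mass of each empty intersection back to the focal elements involved; the formula below is the textbook form (terms with zero denominators are discarded), not an excerpt from this paper.

```latex
% Two-source PCR5 combination (standard DSmT form): m_{12} is the conjunctive consensus,
% and the conflicting mass of each pair X \cap Y = \varnothing is redistributed to X and Y
% in proportion to the masses involved.
m_{12}(X) = \sum_{X_1 \cap X_2 = X} m_1(X_1)\, m_2(X_2),
\qquad
m_{\mathrm{PCR5}}(X) = m_{12}(X)
  + \sum_{\substack{Y \\ X \cap Y = \varnothing}}
    \left[ \frac{m_1(X)^2\, m_2(Y)}{m_1(X) + m_2(Y)}
         + \frac{m_2(X)^2\, m_1(Y)}{m_2(X) + m_1(Y)} \right].
```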
文摘"Wide Sargasso Sea"is one of Jean Rhys' novella which appealed enormous research. Although many studies touched on different meanings from its narrative perspective, most of them illustrated from the aspect which mixed narrator and point of view. This study researched this novella from the perspective of point of view suggested by Prof. Shen Dan to decode author's method of telling a"plausible story".
Funding: Supported by the National Natural Science Foundation of China (No. 82204605); the National Natural Science Foundation of China (No. 81560709); the Technology Fund of the Guizhou Administration of Traditional Chinese Medicine (No. QZYY-2022019); the Science and Technology Tip-top Talent Foundation of Universities in Guizhou Province (No. Qian jiao he KY(2021)034); the Research Grants Council of the Hong Kong Special Administrative Region, China (No. HKBU 12102219); and the University Grants Committee of the Hong Kong Special Administrative Region, China (UGC Research Matching Grant Scheme, No. RMG2019_1_19).
Abstract: A phytochemical investigation of Isodon flavidus led to the isolation of flavidanolide A (1), a rearranged diterpenoid featuring a six/seven/five-membered tricyclic skeleton, together with flavidanolide B (2), an uncommon heterodimeric diterpenoid consisting of a norabietane and a seco-isopimarane monomeric unit. Their structures were elucidated by extensive spectroscopic data and single-crystal X-ray diffraction analyses. Their plausible biosynthetic routes are also proposed. In the bioassay, flavidanolide B was found to exhibit a good inhibitory effect against lipopolysaccharide (LPS)-induced nitric oxide (NO) production in RAW264.7 cells, comparable to the positive control pyrrolidinedithiocarbamate ammonium (PDTC), which provides evidence for the medicinal value of I. flavidus as a folk medicine for treating inflammatory diseases.
Abstract: Unlike conventional forensics, digital forensics does not at present generally quantify the results of its investigations. It is suggested that digital forensics should aim to catch up with other forensic disciplines by using Bayesian and other numerical methodologies to quantify the results of its investigations. Assessing the plausibility of alternative hypotheses (or propositions, or claims) which explain how recovered digital evidence came to exist on a device could assist both the prosecution and the defence in criminal proceedings: helping the prosecution to decide whether to proceed to trial, and helping defence lawyers to advise a defendant how to plead. This paper reviews some numerical approaches to the goal of quantifying the relative weights of individual items of digital evidence and the plausibility of hypotheses based on that evidence. The potential advantages, such as enabling the construction of cost-effective digital forensic triage schemas, are also outlined.
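The core of such Bayesian approaches is usually stated in the odds form of Bayes' theorem, in which the likelihood ratio carries the weight of the evidence; the relationship below is the standard general statement rather than a formula quoted from the paper.

```latex
% Odds form of Bayes' theorem: E is the recovered digital evidence, H_p and H_d are the
% prosecution and defence hypotheses; the first right-hand factor is the likelihood ratio (LR).
\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}
  = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)} \times \frac{\Pr(H_p)}{\Pr(H_d)} .
```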
Funding: Partially supported by the National Key Research & Development Program of China (Grant No. 2017YFC0822704) and the National Natural Science Foundation of China (Nos. 61602476, 61772518 and 61602475).
Abstract: Ensuring the confidentiality of sensitive data is of paramount importance, since data leakage may not only endanger data owners' privacy, but also ruin the reputation of businesses and violate regulations such as HIPAA and the Sarbanes-Oxley Act. To provide a confidentiality guarantee, data should be protected while they are preserved in personal computing devices (i.e., confidentiality during their lifetime), and they should also be rendered irrecoverable after they are removed from the devices (i.e., confidentiality after their lifetime). Encryption and secure deletion are used to ensure data confidentiality during and after their lifetime, respectively. This work performs a thorough literature review of the techniques used to protect the confidentiality of data in personal computing devices, including both encryption and secure deletion. For encryption, we mainly focus on the novel plausibly deniable encryption (PDE), which can ensure data confidentiality against both a coercive attacker (i.e., one who can coerce the data owner for the decryption key) and a non-coercive attacker.
Funding: Financially supported by the National Natural Science Foundation of China (No. 81172958) and the Basic Research Subject of Key Laboratory supported by the Educational Commission of Liaoning Province of China (No. LZ2014044).
Abstract: Macleayine (1), a new naturally occurring alkaloid with a unique spiro[furanone-piperidinedione] framework, was isolated from the aerial parts of Macleaya cordata. Its unusual structure was established by extensive spectroscopic analyses, computer-assisted structure elucidation software (ACD/Structure Elucidator), quantum chemistry calculations and ECD calculation. The result of virtual molecular docking predicts that the compound can enhance the effects of insulin and may be used to treat type II diabetes.