This paper examines how cyber security is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration of the human element and how people fit into the security process. Cyber security, on the other hand, adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect raises moral questions, since it is becoming more widely accepted that society has a duty to protect its weaker members, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, many of these sources neglect to define the differences or the relationship between information security and cyber security [1]. The paper focuses on differentiating between cyber security and information security on a larger scale. The study also highlights other areas of cyber security, which include defending people, societal norms, and critical infrastructure from threats that arise online, in addition to protecting information and technology. It contends that ethical issues and the human factor are becoming increasingly important in protecting assets in the digital age, and that cyber security represents a paradigm shift in this regard [1].
We advance here a novel methodology for robust intelligent biometric information management with inferences and predictions made using randomness and complexity concepts. Intelligence refers to learning, adaptation, and functionality, and robustness refers to the ability to handle incomplete and/or corrupt adversarial information, on one side, and image and/or device variability, on the other side. The proposed methodology is model-free and non-parametric. It draws support from discriminative methods using likelihood ratios to link biometrics and forensics at the conceptual level. It further links, at the modeling and implementation level, the Bayesian framework, statistical learning theory (SLT) using transduction and semi-supervised learning, and information theory (IT) using mutual information. The key concepts supporting the proposed methodology are a) local estimation to facilitate learning and prediction using both labeled and unlabeled data; b) similarity metrics using regularity of patterns, randomness deficiency, and Kolmogorov complexity (similar to MDL) using strangeness/typicality and ranking p-values; and c) the Cover-Hart theorem on the asymptotic performance of k-nearest neighbors approaching the optimal Bayes error. Several topics on biometric inference and prediction related to 1) multi-level and multi-layer data fusion, including quality and multi-modal biometrics; 2) score normalization and revision theory; 3) face selection and tracking; and 4) identity management are described here using an integrated approach that includes transduction and boosting for ranking and sequential fusion/aggregation, respectively, on one side, and active learning and change/outlier/intrusion detection realized using information gain and martingales, respectively, on the other side. The methodology proposed can be mapped to additional types of information beyond biometrics.
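Concept b) above, strangeness/typicality with ranking p-values, can be illustrated with a small transductive sketch. This is not the authors' code; the nearest-neighbour strangeness measure, the toy data, and all names are illustrative assumptions.

```python
# Hypothetical sketch of strangeness / p-value ranking as used in
# transductive (conformal-style) prediction; names and data are illustrative.
import numpy as np

def strangeness(x, label, X, y, k=3):
    """Ratio of summed distances to the k nearest same-label points
    over the k nearest different-label points; larger = stranger."""
    d = np.linalg.norm(X - x, axis=1)
    same = np.sort(d[y == label])[:k]
    diff = np.sort(d[y != label])[:k]
    return same.sum() / (diff.sum() + 1e-12)

def p_value(test_x, candidate_label, X, y, k=3):
    """Transductive p-value: rank the strangeness of the test point
    against the strangeness of the labelled examples."""
    alphas = [strangeness(X[i], y[i], np.delete(X, i, 0), np.delete(y, i), k)
              for i in range(len(X))]
    alpha_test = strangeness(test_x, candidate_label, X, y, k)
    return (np.sum(np.array(alphas) >= alpha_test) + 1) / (len(alphas) + 1)

# Toy usage: two Gaussian "biometric template" clusters and one probe.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
probe = np.array([3.8, 4.1])
print({label: round(p_value(probe, label, X, y), 3) for label in (0, 1)})
```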
AIM To establish the minimum clinically important difference (MCID) for measurements in an orthopaedic patient population with joint disorders. METHODS Adult patients aged 18 years and older seeking care for joint conditions at an orthopaedic clinic took the Patient-Reported Outcomes Measurement Information System Physical Function (PROMIS® PF) computerized adaptive test (CAT), the hip disability and osteoarthritis outcome score for joint reconstruction (HOOS JR), and the knee injury and osteoarthritis outcome score for joint reconstruction (KOOS JR) from February 2014 to April 2017. MCIDs were calculated using anchor-based and distribution-based methods. Patient reports of meaningful change in function since their first clinic encounter were used as an anchor. RESULTS There were 2226 patients who participated, with a mean age of 61.16 (SD = 12.84) years; 41.6% were male and 89.7% Caucasian. Mean change ranged from 7.29 to 8.41 for the PROMIS® PF CAT, from 14.81 to 19.68 for the HOOS JR, and from 14.51 to 18.85 for the KOOS JR. ROC cut-offs ranged from 1.97 to 8.18 for the PF CAT, 6.33 to 43.36 for the HOOS JR, and 2.21 to 8.16 for the KOOS JR. Distribution-based methods estimated MCID values ranging from 2.45 to 21.55 for the PROMIS® PF CAT, from 3.90 to 43.61 for the HOOS JR, and from 3.98 to 40.67 for the KOOS JR. The median MCID value in the range was similar to the mean change score for each measure: 7.9 for the PF CAT, 18.0 for the HOOS JR, and 15.1 for the KOOS JR. CONCLUSION This is the first comprehensive study providing a wide range of MCIDs for the PROMIS® PF, HOOS JR, and KOOS JR in orthopaedic patients with joint ailments. Funding: National Institute of Arthritis and Musculoskeletal and Skin Diseases of the National Institutes of Health, No. U01AR067138.
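The two families of MCID estimates named in the abstract can be sketched as follows. This is a hedged illustration with simulated data, not the study's analysis; the half-SD rule and the Youden-index ROC cut-off are common choices assumed here.

```python
# Illustrative sketch (not the authors' code) of distribution-based and
# anchor-based MCID estimation on simulated change scores.
import numpy as np

def mcid_distribution(change_scores, fraction=0.5):
    """Distribution-based MCID, e.g. half the SD of the observed change."""
    return fraction * np.std(change_scores, ddof=1)

def mcid_roc(change_scores, improved):
    """Anchor-based MCID: the change-score cut-off that maximises
    sensitivity + specificity - 1 against the patient-reported anchor."""
    best_cut, best_j = None, -np.inf
    for cut in np.unique(change_scores):
        pred = change_scores >= cut
        sens = np.mean(pred[improved == 1])
        spec = np.mean(~pred[improved == 0])
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut

# Toy data standing in for change scores and an anchor question.
rng = np.random.default_rng(1)
improved = rng.integers(0, 2, 300)
change = rng.normal(8, 6, 300) * improved + rng.normal(1, 6, 300) * (1 - improved)
print("distribution-based MCID:", round(mcid_distribution(change), 2))
print("anchor-based ROC cut-off:", round(mcid_roc(change, improved), 2))
```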
This study examines the key factors that have an impact on the successful adoption of a Human Resource Information System (HRIS) within the Aqaba Special Economic Zone Authority (ASEZA), Jordan. In order to accomplish the purpose of the study, four critical factors are examined: first, the TAM model (Perceived Ease of Use (PEOU) and Perceived Usefulness (PU)); second, Information Technology Infrastructure (ITI); third, Top Management Support (TMS); and finally, Individual Experience with Computers (IEC). The research model was applied to collect primary data from questionnaires answered by 45 users of HRIS, based on a convenience sample; the response rate was about 91%. The results were analyzed using the Statistical Package for the Social Sciences (SPSS). Multiple regression analysis indicated that the research variables jointly have a significant relationship with the successful adoption of HRIS. The findings indicated that IT infrastructure has a positive and significant effect on the successful adoption of HRIS, whereas PU, PEOU, TMS, and IEC showed no significant effect. Finally, the results indicated no statistically significant differences in HRIS adoption across demographic characteristics. Based on the research findings, the researchers proposed a set of recommendations for better adoption of HRIS in ASEZA.
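For readers who want to reproduce this kind of multiple-regression analysis outside SPSS, here is a minimal sketch in Python. The data are simulated; the score scales and effect sizes are assumptions made only to mirror the shape of the reported finding (only ITI having a significant effect).

```python
# A minimal multiple-regression sketch with simulated survey data;
# not the study's data or SPSS output.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 45  # sample size reported in the abstract
df = pd.DataFrame({
    "PEOU": rng.normal(3.5, 0.7, n),
    "PU":   rng.normal(3.6, 0.6, n),
    "ITI":  rng.normal(3.8, 0.5, n),
    "TMS":  rng.normal(3.4, 0.8, n),
    "IEC":  rng.normal(3.2, 0.9, n),
})
# Simulated outcome in which only IT infrastructure has a real effect.
df["HRIS_adoption"] = 1.2 * df["ITI"] + rng.normal(0, 0.5, n)

X = sm.add_constant(df[["PEOU", "PU", "ITI", "TMS", "IEC"]])
model = sm.OLS(df["HRIS_adoption"], X).fit()
print(model.summary())
```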
The aim of this work is mathematical education through the knowledge system and mathematical modeling. A net model of the formation of mathematical knowledge as a deductive theory is suggested here. Within this model the formation of a deductive theory is represented as the development of a certain informational space, the elements of which are structured in the form of an orientated semantic net. This net is properly metrized and characterized by a certain system of coverings. It allows introducing net optimization parameters that regulate qualitative aspects of the knowledge system under consideration. To regulate the creative processes of the formation and realization of mathematical knowledge, a stochastic model of the formation of deductive theory is suggested here in the form of a branching Markovian process, which is realized in the corresponding informational space as a semantic net. According to this stochastic model we can obtain a correct foundation for a criterion of optimization of creative processes, which leads to the "great main points" strategy (GMP-strategy) for realizing effective control of research work in the sphere of mathematics and its applications.
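As a toy illustration of the branching stochastic growth of such a semantic net, the sketch below simulates a simple Galton-Watson-style process. The offspring distribution, depth, and node labels are assumptions for illustration only and are not taken from the paper.

```python
# Toy Galton-Watson-style growth of a "derivation net"; offspring
# probabilities and labels are illustrative assumptions.
import random

def grow_semantic_net(max_depth=4, p_offspring=(0.3, 0.4, 0.2, 0.1), seed=42):
    """Grow a tree of statements; each node spawns 0-3 derived statements
    with the given probabilities. Returns the edges of the resulting net."""
    random.seed(seed)
    edges, frontier, counter = [], ["axiom_0"], 1
    for _ in range(max_depth):
        next_frontier = []
        for node in frontier:
            k = random.choices(range(len(p_offspring)), weights=p_offspring)[0]
            for _ in range(k):
                child = f"stmt_{counter}"
                counter += 1
                edges.append((node, child))
                next_frontier.append(child)
        frontier = next_frontier
    return edges

net = grow_semantic_net()
print(f"{len(net)} derivation edges, e.g. {net[:3]}")
```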
Throughout the globe, diabetes mellitus (DM) is increasing in incidence, with limited therapies presently available to prevent or resolve the significant complications of this disorder. DM impacts multiple organs and affects all components of the central and peripheral nervous systems, with complications that can range from dementia to diabetic neuropathy. The mechanistic target of rapamycin (mTOR) is a promising agent for the development of novel regenerative strategies for the treatment of DM. mTOR and its related signaling pathways impact multiple metabolic parameters that include cellular metabolic homeostasis, insulin resistance, insulin secretion, stem cell proliferation and differentiation, pancreatic β-cell function, and programmed cell death with apoptosis and autophagy. mTOR is the central element of the protein complexes mTOR Complex 1 (mTORC1) and mTOR Complex 2 (mTORC2) and is a critical component of a number of signaling pathways that involve phosphoinositide 3-kinase (PI3-K), protein kinase B (Akt), AMP-activated protein kinase (AMPK), silent mating type information regulation 2 homolog 1 (Saccharomyces cerevisiae) (SIRT1), Wnt1 inducible signaling pathway protein 1 (WISP1), and growth factors. As a result, mTOR represents an exciting target to offer new clinical avenues for the treatment of DM and the complications of this disease. Future studies directed to elucidate the delicate balance mTOR holds over cellular metabolism and the impact of its broad signaling pathways should foster the translation of these targets into effective clinical regimens for DM. Funding: supported by the American Diabetes Association, American Heart Association, NIH NIEHS, NIH NIA, NIH NINDS, and NIH ARRA.
This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it analyzes in depth real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses to protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.
The subversive nature of information war lies not only in the information itself, but also in the circulation and application of information. It has always been a challenge to quantitatively analyze the function and effect of information flow through a command, control, communications, computer, kill, intelligence, surveillance, reconnaissance (C4KISR) system. In this work, we propose a framework of force of information influence and methods for calculating the force of information influence between C4KISR nodes of sensing, intelligence processing, decision making and fire attack. Specifically, the basic concept of force of information influence between nodes in a C4KISR system is formally proposed and its mathematical definition is provided. Then, based on information entropy theory, the model of force of information influence between C4KISR system nodes is constructed. Finally, simulation experiments were performed under an air defense and attack scenario. The experimental results show that, with the proposed force of information influence framework, we can effectively evaluate the contribution of information circulation through different C4KISR system nodes to the corresponding tasks. Our framework of force of information influence can also serve as an effective tool for the design and dynamic reconfiguration of C4KISR system architecture. Funding: supported by the Natural Science Foundation Research Plan of Shanxi Province (2023JCQN0728).
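Since the abstract does not reproduce the model itself, the sketch below only illustrates the underlying entropy machinery: it approximates the influence of one node on another by the mutual information between their discretised message streams. The node names, data, and this particular measure are assumptions, not the paper's definition of force of information influence.

```python
# Hedged stand-in for an entropy-based influence measure between two nodes:
# mutual information I(A;B) = H(A) + H(B) - H(A,B) over discrete streams.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(a, b):
    """Mutual information between two discrete integer sequences."""
    a, b = np.asarray(a), np.asarray(b)
    pa = np.bincount(a) / len(a)
    pb = np.bincount(b) / len(b)
    joint = np.zeros((pa.size, pb.size))
    for x, y in zip(a, b):
        joint[x, y] += 1
    joint /= joint.sum()
    return entropy(pa) + entropy(pb) - entropy(joint.ravel())

# Toy example: a sensing node's reports partially drive a decision node's states.
rng = np.random.default_rng(3)
sensor = rng.integers(0, 4, 1000)
decision = (sensor + rng.integers(0, 2, 1000)) % 4
print("influence of sensor on decision (bits):",
      round(mutual_information(sensor, decision), 3))
```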
This article proves the existence of a hyper-precise global numerical meta-architecture unifying, structuring, binding and controlling the billion triplet codons constituting the sequence of single-stranded DNA of the entire human genome. Beyond evolution and erratic mutations such as transposons within the genome, it is as if the memory of a fossil genome with multiple symmetries persists. This recalls the "intermingling" of information characterizing the fractal universe of chaos theory. The result leads to a balanced and perfect tuning between the masses of the two strands of the huge DNA molecule that constitutes our genome. We show here how codon populations forming the single-stranded DNA sequences can constitute a critical approach to understanding the function of junk DNA. We then suggest revisiting certain methods published in our 2009 book "Codex Biogenesis". In fact, we demonstrate here how the universal genetic code table is a powerful analytical filter for characterizing the single-stranded DNA sequences constituting chromosomes and genomes. We can then show that any genomic DNA sequence is characterized by three numbers, which characterize it and its 64 codon populations with correlations greater than 99%. The number "1" is common to all sequences, expressing Chargaff's second law. The other two numbers are related to each specific DNA sequence, characterizing life species. For example, the entire human genome is characterized by three remarkable numbers: 1, 2, and Phi = 1.618, the golden ratio. Associated with each of these three numbers, we can match three axes of symmetry, and then "imagine" a kind of hyperspace formed by these codon populations. We then revisit the value (3-Phi)/2, which is probably universal and common to both the scale of quarks and the atomic level, balancing and tuning the whole human genome codon population. Finally, we demonstrate a new kind of duality between "form and substance" overlapping the whole human genome: we show that, simultaneously with the duality between genes and junk DNA, there is a second layer of embedded hidden structure overlapping all the DNA of the whole human genome, dividing it into a second type of duality of information/redundancy involving golden ratio proportions.
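To make the notion of "codon populations" and the reference to Chargaff's second law concrete, here is a small sketch that counts the 64 codon populations of a single-stranded sequence and checks the intra-strand A/T and C/G balance. The random sequence is a stand-in; it is not genome data and the code is not from the paper.

```python
# Illustrative sketch: codon population counts and Chargaff's second
# parity rule (A ~= T and C ~= G within one strand) on a stand-in sequence.
from collections import Counter
from itertools import product
import random

random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(3 * 10_000))  # stand-in sequence

# Count non-overlapping codons and fill in all 64 possible codon keys.
codons = Counter(seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
populations = {c: codons.get(c, 0)
               for c in ("".join(p) for p in product("ACGT", repeat=3))}

bases = Counter(seq)
print("codon types observed:", sum(1 for n in populations.values() if n > 0))
print("A/T ratio:", round(bases["A"] / bases["T"], 3),
      " C/G ratio:", round(bases["C"] / bases["G"], 3))
```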
COVID-19 pandemic restrictions limited all social activities to curtail the spread of the virus. Among the sectors affected, the foremost were schools, colleges, and universities, and the education systems of entire nations shifted to online education during this time. Many shortcomings of Learning Management Systems (LMSs) in supporting education in an online mode were detected, which spawned research into Artificial Intelligence (AI) based tools being developed by the research community to improve the effectiveness of LMSs. This paper presents a detailed survey of the different enhancements to LMSs, led by key advances in the area of AI, that enhance the real-time and non-real-time user experience. The AI-based enhancements proposed for LMSs start from the Application and Presentation layers, in the form of flipped classroom models for an efficient learning environment and appropriately designed UI/UX for efficient utilization of LMS utilities and resources, including AI-based chatbots. Session layer enhancements are also required, such as AI-based online proctoring and user authentication using biometrics. These extend to the Transport layer to support real-time and rate-adaptive encrypted video transmission for user security/privacy and satisfactory working of AI algorithms. Support is also needed from the Network layer for IP-based geolocation features, the Virtual Private Network (VPN) feature, and Software-Defined Networks (SDN) for optimum Quality of Service (QoS). Finally, in addition to these, the non-real-time user experience is enhanced by other AI-based enhancements such as plagiarism detection algorithms and data analytics.
This paper generalizes the makeup and forming dynamic mechanism of natural disaster systems, the principles and methods of comprehensive division of natural disasters, and the structure, function and construction routes of a map and file information visualization system (MFIVS). Taking the Changjiang (Yangtze) Valley as an example, and on the basis of revealing the integrated mechanism of the formation of its natural disasters and their distribution law, the paper relies on the MFIVS technique and adopts both top-down and bottom-up approaches to study a comprehensive division of natural disasters. The resulting division, which is relatively objective and precise, comprises three natural disaster sections and nine natural disaster sub-sections; it can not only provide a scientific basis for utilizing natural resources and controlling natural disasters and environmental degradation, but also illustrate a concise, practical and effective technique for comprehensive division. Funding: under the auspices of the President Foundation of the Chinese Academy of Sciences (1999).
We start from a minimal number of generally accepted premises, in particular the Hartle-Hawking quantum wave of the universe and von Neumann-Connes' pointless and self-referential spacetime geometry. We then proceed from there to show, using Dvoretzky's theorem of measure concentration, that the total energy of the universe is divided into two parts: a very small ordinary energy part which we can measure, while most of the energy is concentrated, as the second part, at the holographic boundary, which we cannot measure in a direct way. Finally the results are shown to imply a resolution of the black hole information paradox without violating the fundamental laws of physics. In this way the main thrust of the two opposing arguments and views, namely that of Hawking on one side and of Susskind as well as 't Hooft on the other side, is brought into a consistent, compatible and coherent whole.
AIM To examine the practice pattern in Kaiser Permanente Southern California (KPSC), i.e., gastroenterology (GI)/surgery referrals and endoscopic ultrasound (EUS), for pancreatic cystic neoplasms (PCNs) after the region-wide dissemination of the PCN management algorithm. METHODS A retrospective review was performed; patients given a PCN diagnosis between April 2012 and April 2015 (18 mo before and after the publication of the algorithm) in KPSC (an integrated health system with 15 hospitals and 202 medical offices in Southern California) were identified. RESULTS 2558 patients (1157 pre- and 1401 post-algorithm) received a new diagnosis of PCN in the study period. There was no difference in the mean cyst size (pre 19.1 mm vs post 18.5 mm, P = 0.119). A smaller percentage of PCNs resulted in EUS after the implementation of the algorithm (pre 45.5% vs post 34.8%, P < 0.001). A smaller proportion of patients were referred for GI (pre 65.2% vs post 53.3%, P < 0.001) and surgery consultations (pre 24.8% vs post 16%, P < 0.001) for PCN after the implementation. There was no significant change in operations for PCNs. The cost of diagnostic care was reduced after the implementation by 24%, 18%, and 36% for EUS, GI, and surgery consultations, respectively, with a total cost saving of 24%. CONCLUSION In the current healthcare climate, there is an increased need to optimize resource utilization. Dissemination of an algorithm for PCN management in an integrated health system resulted in fewer EUS and GI/surgery referrals, likely by aiding the physicians ordering imaging studies in decision making for the management of PCNs. This translated to cost savings of 24%, 18%, and 36% for EUS, GI, and surgical consultations, respectively, with a total diagnostic cost saving of 24%.
In this research we are going to define two new concepts: a) "The Potential of Events" (EP) and b) "The Catholic Information" (CI). The term CI derives from the ancient Greek language and denotes all the Catholic (general) logical propositions which hold true for every element of a set A. We will study the Riemann Hypothesis in two stages: a) by using the EP we will prove that the distribution of events e (even) and o (odd) of Square Free Numbers (SFN) on the axis Ax(N) of naturals is of Heads-Tails (H-T) type; b) by using the CI we will explain the way that the distribution of prime numbers can be correlated with the non-trivial zeros of the Riemann function ζ(s). The Introduction and Chapter 2 are necessary for understanding the solution. In Chapter 3 we will present a simple method of forecasting in many very useful applications (e.g. financial, technological, medical, social, etc.), developing a generalization of this new theory, proven here, which we finally apply to the solution of RH. The Introduction, as well as the Results and the Discussion at the end, shed light on the possibility of the proof of all the above. The article consists of nine chapters, numbered 1 through 9.
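One reading of the "even" and "odd" events for square-free numbers is the parity of their number of prime factors, i.e. the sign of the Mobius function; the sketch below tallies how balanced the two classes are up to a bound, the kind of heads-tails picture the abstract evokes. This interpretation and the code are assumptions made for illustration, not material from the paper.

```python
# Illustrative sketch (interpretation assumed): classify square-free numbers
# by mu(n) = +1 (even number of prime factors) or -1 (odd) and compare counts.
def mobius_sieve(n):
    """Compute the Mobius function mu(1..n) with a simple sieve."""
    mu = [1] * (n + 1)
    is_prime = [True] * (n + 1)
    for p in range(2, n + 1):
        if is_prime[p]:
            for m in range(p, n + 1, p):
                if m > p:
                    is_prime[m] = False
                mu[m] *= -1
            for m in range(p * p, n + 1, p * p):
                mu[m] = 0          # not square-free
    return mu

N = 100_000
mu = mobius_sieve(N)
even = sum(1 for n in range(2, N + 1) if mu[n] == 1)
odd = sum(1 for n in range(2, N + 1) if mu[n] == -1)
print(f"square-free up to {N}: even-factor {even}, odd-factor {odd}, "
      f"imbalance {even - odd}")
```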
Based on approaches deduced from previous research findings and empirical observations from density control experiments, genetic worth effect response models were developed for black spruce (Picea mariana (Mill.) BSP) and jack pine (Pinus banksiana Lamb.) plantations. The models accounted for the increased rate of stand development arising from the planting of genetically improved stock through temporal adjustments to the species-specific, site-based mean dominant height-age functions. The models utilized a relative height growth modifier based on known estimates of genetic gain. The models also incorporated a phenotypic juvenile age-mature age correlation function in order to account for the intrinsic temporal decline in the magnitude of genetic worth effects throughout the rotation. Integrating the functions into algorithmic variants of structural stand density management models produced stand development patterns that were consistent with axioms of even-aged stand dynamics.
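The structure described here, a height-age function scaled by a genetic-gain modifier that is discounted over time through a juvenile-mature correlation, can be sketched schematically as below. The curve form, parameter values, and decay function are placeholders, not the published model.

```python
# Schematic sketch only: placeholder height-age curve with a genetic-gain
# modifier discounted by an assumed juvenile-mature correlation function.
import math

def dominant_height(age, site_index=18.0, b=0.03):
    """Generic Chapman-Richards style height-age curve (placeholder form)."""
    return site_index * (1 - math.exp(-b * age)) ** 1.3

def juvenile_mature_correlation(age, selection_age=10.0):
    """Assumed correlation that declines as stand age passes the selection age."""
    return min(1.0, (selection_age / age) ** 0.25) if age > 0 else 1.0

def improved_height(age, genetic_gain=0.10, **kwargs):
    """Dominant height of improved stock: unimproved curve scaled by a
    gain modifier discounted by the age-age correlation."""
    modifier = 1.0 + genetic_gain * juvenile_mature_correlation(age)
    return dominant_height(age, **kwargs) * modifier

for age in (10, 25, 50, 75):
    print(age, round(dominant_height(age), 2), round(improved_height(age), 2))
```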
A black hole is essentially a relativistic as well as a quantum object. Therefore the information paradox of black holes is a consequence of the clash between these two most fundamental theories of modern physics. It is logical to conclude that a resolution of the problem requires some form of a quantum gravity theory. The present work proposes such a resolution using set theory and pointless spacetime geometry.
This text discusses an approximation to the concept of human emancipation, as part of our well-being, in terms of Education and Knowledge. Without abandoning our metaphysical perception of wholeness, as an extension of the continuity principle which connects our conscious and unconscious world, emancipation is considered as a personal struggle against all oppressions, some of which are grounded in our inner world. In accordance with the Enlightenment request, reasoning and knowledge can help us to structure new forms of acceptances which shape our own emancipatory meaning. Under the impact of social influence and personal interpretation, perceived knowledge is considered as a mental tool containing upgraded, valid information. Taking into consideration that this validity is not able to overcome the metaphysical origins of human thought, it is suggested that when this mental tool functions in a self-transformative, self-constructed, and flexible form, human intelligence structures a compatible information management mechanism, which can enable us to formulate our personal acceptances, bridge our empirical and hyper-empirical inner worlds, and enlighten our request for self-criticism, self-determination, and, above all, emancipation.
As Natural Language Processing (NLP) continues to advance, driven by the emergence of sophisticated large language models such as ChatGPT, there has been notable growth in research activity. This rapid uptake reflects increasing interest in the field and prompts critical inquiries into ChatGPT's applicability in the NLP domain. This review paper systematically investigates the role of ChatGPT in diverse NLP tasks, including information extraction, Named Entity Recognition (NER), event extraction, relation extraction, Part of Speech (PoS) tagging, text classification, sentiment analysis, emotion recognition and text annotation. The novelty of this work lies in its comprehensive analysis of the existing literature, addressing a critical gap in understanding ChatGPT's adaptability, limitations, and optimal application. In this paper, we employed a systematic stepwise approach following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework to direct our search process and seek relevant studies. Our review reveals ChatGPT's significant potential in enhancing various NLP tasks. Its adaptability in information extraction tasks, sentiment analysis, and text classification showcases its ability to comprehend diverse contexts and extract meaningful details. Additionally, ChatGPT's flexibility in annotation tasks reduces manual effort and accelerates the annotation process, making it a valuable asset in NLP development and research. Furthermore, GPT-4 and prompt engineering emerge as a complementary mechanism, empowering users to guide the model and enhance overall accuracy. Despite its promising potential, challenges persist. The performance of ChatGPT needs to be tested using more extensive datasets and diverse data structures. Subsequently, its limitations in handling domain-specific language and the need for fine-tuning in specific applications highlight the importance of further investigations to address these issues.