Context: The advent of Artificial Intelligence (AI) requires that most human skills be modeled before they can be implemented in algorithms. This in turn requires a detailed and precise understanding of the interfaces of verbal and emotional communication. AI has made significant progress at the verbal level, but only modest progress in the recognition of facial emotions, even though this faculty is one of the oldest in humans and is omnipresent in our daily lives. Dysfunction of facial emotional expression is present in many brain pathologies encountered by psychiatrists, neurologists, psychotherapists, and other mental health professionals, including social workers. Yet it cannot be objectively verified and measured, for lack of reliable tools that are valid and consistently sensitive. Indeed, the scientific literature on Visual-Facial-Emotions-Recognition (ViFaEmRe) suffers from the absence of 1) consensual, rational tools for continuous quantified measurement and 2) operational concepts. We have developed computer-morphing software that attempts to overcome these two obstacles, identified as the Method of Analysis and Research of the Integration of Emotions (M.A.R.I.E.). Our primary goal is to use M.A.R.I.E. to understand the physiology of ViFaEmRe in normal healthy subjects by standardizing the measurements; it will then allow us to focus on subjects manifesting abnormalities in this ability. Our second goal is to contribute to the progress of AI, in the hope of adding the dimension of recognition of facial emotional expressions. Objective: To study 1) categorical vs. dimensional aspects of ViFaEmRe, 2) universality vs. idiosyncrasy, 3) immediate vs. ambivalent Emotional-Decision-Making, 4) the Emotional-Fingerprint of a face, and 5) the creation of population reference data. Methods: M.A.R.I.E. enables the rational, quantified measurement of Emotional-Visual-Acuity (EVA) in an individual observer and in a population aged 20 to 70 years. It was used to measure the range and intensity of the emotions expressed by three Face-Tests, to quantify the performance of a sample of 204 observers with hypernormal measures of cognition and “thymia” (defined elsewhere) and low levels of anxiety, and to analyze the six primary emotions. Results: We have individualized the following continuous parameters: 1) “Emotional-Visual-Acuity”, 2) “Visual-Emotional-Feeling”, 3) “Emotional-Quotient”, 4) “Emotional-Decision-Making”, 5) the “Emotional-Decision-Making Graph” or “Individual-Gun-Trigger”, 6) the “Emotional-Fingerprint” or “Key-graph”, 7) the “Emotional-Fingerprint-Graph”, 8) detection of “misunderstanding”, and 9) detection of “error”. This allowed us to build a taxonomy with a coding of each face-emotion pair. Each face has specific measurements and graphics. EVA improves from ages 20 to 55 years, then decreases. It depends neither on the sex of the observer nor on the face studied. In addition, 1% of people endowed with normal intelligence do not recognize emotions. The categorical dimension varies from one person to another: the range and intensity of ViFaEmRe are idiosyncratic, not universally uniform. The recognition of emotions is purely categorical for a single individual; it is dimensional for a population sample.
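The abstract does not specify M.A.R.I.E.'s scoring rules, but the idea of a continuous, morph-based measure of Emotional-Visual-Acuity can be sketched. Everything below is an illustrative assumption, not the published protocol: the trial record, the ascending-series threshold, the conventional Ekman set of six emotions, and the (100 − threshold) scoring convention are all hypothetical.

```python
# Hypothetical sketch of a morph-based recognition-threshold measurement.
# Not the published M.A.R.I.E. protocol: field names, the ascending-series
# rule, and the scoring convention are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional

# The abstract does not enumerate the six primary emotions;
# the conventional Ekman set is assumed here.
EMOTIONS = ["joy", "sadness", "anger", "fear", "disgust", "surprise"]

@dataclass
class Trial:
    face_id: str    # which Face-Test portrait was shown
    emotion: str    # emotion being morphed toward
    intensity: int  # morph level, 0 (neutral) .. 100 (full expression)
    response: str   # emotion the observer reported

def recognition_threshold(trials: List[Trial], face_id: str,
                          emotion: str) -> Optional[int]:
    """Lowest morph intensity at which the observer labels this
    face-emotion pair correctly on an ascending series."""
    relevant = sorted(
        (t for t in trials if t.face_id == face_id and t.emotion == emotion),
        key=lambda t: t.intensity,
    )
    for t in relevant:
        if t.response == t.emotion:
            return t.intensity
    return None  # never recognized, even at full intensity

def eva_score(trials: List[Trial], face_id: str) -> float:
    """One illustrative convention: average (100 - threshold) over the six
    emotions, so recognition at fainter morphs yields a higher score."""
    scores = []
    for emotion in EMOTIONS:
        thr = recognition_threshold(trials, face_id, emotion)
        scores.append(0.0 if thr is None else 100.0 - thr)
    return sum(scores) / len(scores)
```

On this convention, an observer who names an emotion at faint morph levels scores close to 100 for it, while one who never recognizes it contributes 0, keeping the measure continuous rather than a binary hit/miss count.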
Conclusions: Firstly, M.A.R.I.E. has made it possible to bring out new concepts and new continuous measurement variables. The comparison between healthy and abnormal individuals makes it possible to appreciate the significance of this line of study. From now on, these new functional parameters will allow us to identify and name “emotional” disorders* or illnesses, which can add a further dimension to the behavioral disorders found in all pathologies that affect the brain. Secondly, ViFaEmRe is idiosyncratic and categorical, and a function of the identity of both the observer and the observed face. These findings pose a challenge to Artificial Intelligence: no global or regional algorithm for this faculty can be programmed into a robot, nor can AI compete with human abilities and judgment in this domain. *Here “emotional disorders” refers to disorders of emotional expression and recognition.
Objective: The aim of this study was to compare the effect of Vitamin D and Vitamin E supplementation, alone and in combination, on reducing the intensity and duration of dysmenorrhea in women referred to the Kowsar gynecological clinics of Shahid Motahari Hospital of Urmia University of Medical Sciences. Materials and Methods: A double-blinded clinical trial was conducted on 112 women with dysmenorrhea who were randomly allocated into four study groups. In total, 100 women completed the study; they received capsules containing a placebo (n = 25), 1000 IU of Vitamin D (n = 25), 400 mg of Vitamin E (n = 25), or 1000 IU of Vitamin D + 400 mg of Vitamin E (n = 25) every 24 h for 2 consecutive months, starting at the beginning of the menstrual period. Pain intensity and duration were compared among the groups before and after the intervention. Results: The mean menstrual pain duration in all three experimental groups (Vitamin E, Vitamin D, and their combination) was lower than in the placebo group (all P < 0.05), and the mean pain intensity scores in all three experimental groups were significantly lower than in the placebo group (all P < 0.05). For every supplement administered, the results (means) at the end of the 2nd month were better than at the end of the 1st month (all P < 0.01). Conclusions: Vitamin E and Vitamin D, as well as their combination, are recommended as an effective and safe treatment for the management of the complications of dysmenorrhea.
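As a minimal sketch of the kind of between-group comparison reported above (each supplement arm against placebo), the snippet below runs an omnibus test followed by pairwise tests. The pain-intensity values are placeholder data, and the specific tests are assumptions; the study's actual data and analysis plan are not reproduced here.

```python
# Placeholder illustration of arm-vs-placebo comparisons; the numbers are
# invented, not the trial's data.

from scipy import stats

pain_intensity = {
    "placebo":   [7.1, 6.8, 7.4, 6.9, 7.2],
    "vitamin_D": [5.2, 5.8, 5.5, 5.1, 5.6],
    "vitamin_E": [5.0, 5.4, 5.7, 5.3, 5.1],
    "D_plus_E":  [4.6, 4.9, 4.4, 4.8, 4.7],
}

# Omnibus test across the four arms
f_stat, p_omnibus = stats.f_oneway(*pain_intensity.values())
print(f"one-way ANOVA: F={f_stat:.2f}, p={p_omnibus:.4f}")

# Each supplement arm against placebo, mirroring the "all P < 0.05" pattern
for arm in ("vitamin_D", "vitamin_E", "D_plus_E"):
    t_stat, p = stats.ttest_ind(pain_intensity[arm], pain_intensity["placebo"])
    print(f"{arm} vs placebo: t={t_stat:.2f}, p={p:.4f}")
```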