Journal Articles
2 articles found
1. Temporal alterations in pericytes at the acute phase of ischemia/reperfusion in the mouse brain (Cited by: 3)
Authors: Shuang Zhang, Xue-Jing Liao, Jia Wang, Yi Shen, Han-Fen Shi, Yan Zou, Chong-Yang Ma, Xue-Qian Wang, Qing-Guo Wang, Xu Wang, Ming-Yang Xu, Fa-Feng Cheng, Wan-Zhu Bai
Journal: Neural Regeneration Research (SCIE, CAS, CSCD), 2022, No. 10, pp. 2247-2252 (6 pages)
Abstract: Pericytes, as the mural cells surrounding the microvasculature, play a critical role in the regulation of microcirculation; however, how these cells respond to ischemic stroke remains unclear. To determine the temporal alterations in pericytes after ischemia/reperfusion, we used the 1-hour middle cerebral artery occlusion model, which was examined at 2, 12, and 24 hours after reperfusion. Our results showed that in the reperfused regions, the cerebral blood flow decreased and the infarct volume increased with time. Furthermore, the pericytes in the infarct regions contracted and acted on the vascular endothelial cells within 24 hours after reperfusion. These effects may result in incomplete microcirculation reperfusion and a gradual worsening trend with time in the acute phase. These findings provide strong evidence for explaining the "no-reflow" phenomenon that occurs after recanalization in clinical practice.
Keywords: acute ischemic stroke; alpha-smooth muscle; cerebral blood flow; microcirculation; no-reflow phenomenon; pericytes; platelet endothelial cell adhesion molecule-1; platelet-derived growth factor receptor beta; vascular endothelial cells
2. Emotion-Aware Music Driven Movie Montage
Authors: 刘伍琴, 林敏轩, 黄海斌, 马重阳, 宋玉, 董未名, 徐常胜
Journal: Journal of Computer Science & Technology (SCIE, EI, CSCD), 2023, No. 3, pp. 540-553 (14 pages)
Abstract: In this paper, we present Emotion-Aware Music Driven Movie Montage, a novel paradigm for the challenging task of generating movie montages. Specifically, given a movie and a piece of music as the guidance, our method aims to generate a montage out of the movie that is emotionally consistent with the music. Unlike previous work such as video summarization, this task requires not only video content understanding, but also emotion analysis of both the input movie and music. To this end, we propose a two-stage framework, including a learning-based module for the prediction of emotion similarity and an optimization-based module for the selection and composition of candidate movie shots. The core of our method is to align and estimate emotional similarity between music clips and movie shots in a multi-modal latent space via contrastive learning. Subsequently, the montage generation is modeled as a joint optimization of emotion similarity and additional constraints such as scene-level story completeness and shot-level rhythm synchronization. We conduct both qualitative and quantitative evaluations to demonstrate that our method can generate emotionally consistent montages and outperforms alternative baselines.
Keywords: movie montage; emotion analysis; audio-visual modality; contrastive learning
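Note: the cross-modal contrastive alignment described in this abstract can be illustrated with a minimal PyTorch sketch. Everything below is an assumption made for illustration only (module names, feature dimensions, the symmetric InfoNCE loss, the greedy shot selection at the end); it is not the authors' code, which additionally solves a joint optimization with story-completeness and rhythm-synchronization constraints.

```python
# Hypothetical sketch: align music-clip and movie-shot features in a shared
# latent space via contrastive learning, then score shots by cosine similarity.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmotionAlignment(nn.Module):
    def __init__(self, music_dim=128, shot_dim=512, latent_dim=256, temperature=0.07):
        super().__init__()
        # Project each modality into a shared latent space (dimensions assumed).
        self.music_proj = nn.Sequential(nn.Linear(music_dim, latent_dim), nn.ReLU(),
                                        nn.Linear(latent_dim, latent_dim))
        self.shot_proj = nn.Sequential(nn.Linear(shot_dim, latent_dim), nn.ReLU(),
                                       nn.Linear(latent_dim, latent_dim))
        self.temperature = temperature

    def forward(self, music_feats, shot_feats):
        # L2-normalize so dot products become cosine similarities.
        m = F.normalize(self.music_proj(music_feats), dim=-1)
        s = F.normalize(self.shot_proj(shot_feats), dim=-1)
        return m, s

    def contrastive_loss(self, m, s):
        # Symmetric InfoNCE: matched (music clip, movie shot) pairs lie on the diagonal.
        logits = m @ s.t() / self.temperature
        targets = torch.arange(m.size(0), device=m.device)
        return 0.5 * (F.cross_entropy(logits, targets) +
                      F.cross_entropy(logits.t(), targets))

# Usage sketch with random stand-in features: train with contrastive_loss, then
# pick, for each music clip, the candidate shot with the highest similarity.
model = EmotionAlignment()
music = torch.randn(8, 128)   # 8 music clips, assumed 128-d audio features
shots = torch.randn(8, 512)   # 8 candidate shots, assumed 512-d visual features
m, s = model(music, shots)
loss = model.contrastive_loss(m, s)
best_shot_per_clip = (m @ s.t()).argmax(dim=1)
```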