Title
Convolutional Neural Networks for Searching Superflares from Pixel-level Data of the Transiting Exoplanet Survey Satellite
Authors
Abstract
In this work, six convolutional neural networks (CNNs) have been trained on different feature images and arrays from a database of 15,638 superflare candidates on solar-type stars, collected from three years of observations by the Transiting Exoplanet Survey Satellite ({\em TESS}). These networks are used to replace manual visual inspection, which has in recent years been the direct way to search for superflares and exclude false-positive events. Unlike other methods, which use only stellar light curves to search for superflare signals, we identify superflares through {\em TESS} pixel-level data, which carries a lower risk of including false-positive events and gives more reliable identification results for statistical analysis. The evaluated accuracy of each network is around 95.57\%. After applying ensemble learning to these networks, the stacking method raises the accuracy to 97.62\% with a 100\% classification rate, while the voting method raises the accuracy to 99.42\% with a relatively lower classification rate of 92.19\%. We find that superflare candidates with short duration and low peak amplitude have lower identification precision, as their superflare features are difficult to identify. The database, including 71,732 solar-type stars and 15,638 superflare candidates from {\em TESS} with corresponding feature images and arrays, along with the CNNs trained in this work, is publicly available.
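The trade-off between accuracy and classification rate in the voting ensemble can be sketched as follows. This is a minimal illustration only, assuming a simple unanimity rule over the six networks' binary outputs; the threshold, probabilities, and rejection convention here are hypothetical and are not taken from the paper's actual configuration.

```python
import numpy as np

def vote_ensemble(probs, threshold=6):
    """Hard voting over six binary classifiers.

    probs: (n_models, n_samples) array of predicted superflare
    probabilities. A sample receives a label only when at least
    `threshold` models agree; otherwise it is left unclassified
    (label -1). Raising the threshold trades classification rate
    for accuracy, as in the voting results quoted above.
    """
    votes = (probs > 0.5).astype(int)        # per-model binary decisions
    positives = votes.sum(axis=0)            # models voting "superflare"
    n_models = votes.shape[0]
    labels = np.full(probs.shape[1], -1)     # -1 = rejected / unclassified
    labels[positives >= threshold] = 1                 # agreed positive
    labels[(n_models - positives) >= threshold] = 0    # agreed negative
    return labels

# Toy example: six models scoring four candidate events.
probs = np.array([
    [0.90, 0.20, 0.80, 0.40],
    [0.80, 0.10, 0.70, 0.60],
    [0.95, 0.30, 0.90, 0.40],
    [0.70, 0.20, 0.60, 0.70],
    [0.85, 0.40, 0.55, 0.30],
    [0.90, 0.10, 0.45, 0.20],
])
labels = vote_ensemble(probs)
rate = np.mean(labels != -1)   # fraction of samples actually classified
print(labels, rate)            # two samples agreed on, two rejected
```

Only the samples on which all models agree are kept, so the retained subset is classified more accurately while some ambiguous candidates (here, two of four) go unlabeled; a stacking meta-classifier instead labels every sample, which matches the 100\% classification rate reported for stacking.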