Paper Title


Effective Early Stopping of Point Cloud Neural Networks

Paper Authors

Thanasis Zoumpekas, Maria Salamó, Anna Puig

Paper Abstract


Early stopping techniques can be utilized to decrease the time cost of training; currently, however, the ultimate goal of early stopping is accuracy improvement, or enabling the neural network to generalize better on unseen data without being large or complex in structure, rather than training efficiency itself. Time efficiency is a critical factor in neural networks, especially when dealing with the segmentation of 3D point cloud data, not only because a neural network itself is computationally expensive, but also because point clouds are large and noisy, which makes the learning process even more costly. In this paper, we propose a new early stopping technique based on fundamental mathematics, aiming to improve the trade-off between the learning efficiency and the accuracy of neural networks dealing with 3D point clouds. Our results show that by employing our early stopping technique in four distinct and highly utilized neural networks for segmenting 3D point clouds, the training time efficiency of the models is greatly improved, with efficiency gains of up to 94\%, while within just a few epochs the models achieve segmentation accuracy metric values approximately similar to those obtained by training the networks for 200 epochs. Moreover, our proposal outperforms four conventional early stopping approaches in segmentation accuracy, making it a promising and innovative early stopping technique for point cloud segmentation.
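The abstract does not detail the paper's mathematical stopping criterion, but it contrasts it with conventional early stopping approaches. As background, the following is a minimal sketch of one such conventional baseline, patience-based early stopping on a validation loss; the class name and parameters are illustrative, not from the paper.

```python
# Minimal sketch of conventional patience-based early stopping
# (a typical baseline such a method is compared against; the paper's
# own criterion is not reproduced here).

class EarlyStopper:
    """Stop training when the monitored validation loss stops improving."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience    # epochs to wait after the last improvement
        self.min_delta = min_delta  # minimum decrease that counts as improvement
        self.best = float("inf")    # best validation loss seen so far
        self.wait = 0               # epochs since the last improvement

    def step(self, val_loss):
        """Record one epoch's validation loss; return True to stop training."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.wait = 0
        else:
            self.wait += 1
        return self.wait >= self.patience


# Example: a loss curve that plateaus after epoch 3
stopper = EarlyStopper(patience=2)
losses = [1.0, 0.8, 0.7, 0.69, 0.69, 0.69]
stop_epoch = None
for epoch, loss in enumerate(losses):
    if stopper.step(loss):
        stop_epoch = epoch
        break
```

With `patience=2`, training halts once the loss has failed to improve for two consecutive epochs, so here the loop stops at epoch 5.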
