Paper Title

LayoutFormer++: Conditional Graphic Layout Generation via Constraint Serialization and Decoding Space Restriction

Paper Authors

Zhaoyun Jiang, Jiaqi Guo, Shizhao Sun, Huayu Deng, Zhongkai Wu, Vuksan Mijovic, Zijiang James Yang, Jian-Guang Lou, Dongmei Zhang

Abstract

Conditional graphic layout generation, which generates realistic layouts according to user constraints, is a challenging task that has not been well studied yet. First, there is limited discussion about how to handle diverse user constraints flexibly and uniformly. Second, to make layouts conform to user constraints, existing work often sacrifices generation quality significantly. In this work, we propose LayoutFormer++ to tackle the above problems. First, to flexibly handle diverse constraints, we propose a constraint serialization scheme, which represents different user constraints as sequences of tokens with a predefined format. We then formulate conditional layout generation as a sequence-to-sequence transformation and leverage an encoder-decoder framework with the Transformer as the basic architecture. Furthermore, to make layouts better meet user requirements without harming quality, we propose a decoding space restriction strategy. Specifically, we prune the predicted distribution by ignoring the options that definitely violate the user constraints or likely result in low-quality layouts, and have the model sample from the restricted distribution. Experiments demonstrate that LayoutFormer++ outperforms existing approaches on all tasks, achieving both better generation quality and fewer constraint violations.
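The two ideas in the abstract can be illustrated with a minimal sketch. The token format and masking rule below are hypothetical stand-ins (the paper defines its own serialization format per constraint type and task-specific pruning modules); the sketch only shows the general mechanism: flatten constraints into a token sequence, and at decoding time zero out the probability of options that violate the constraints before sampling.

```python
import math
import random

def serialize_constraints(constraints):
    """Flatten user constraints into a token sequence with a fixed
    format (hypothetical format: attribute, value, separator)."""
    tokens = []
    for c in constraints:
        tokens += [c["attr"], str(c["value"]), "|"]
    return tokens

def restricted_sample(logits, valid_mask, rng):
    """Sample a token id after pruning disallowed options.

    Options marked invalid in `valid_mask` get probability zero;
    the remaining probabilities are renormalized via softmax.
    """
    # Assign -inf to invalid options so softmax gives them zero mass.
    pruned = [l if ok else float("-inf") for l, ok in zip(logits, valid_mask)]
    m = max(pruned)
    exps = [math.exp(l - m) for l in pruned]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sample from the restricted (renormalized) distribution.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# Constraint serialization on a toy constraint set:
tokens = serialize_constraints([
    {"attr": "type", "value": "button"},
    {"attr": "width", "value": 120},
])
print(tokens)  # ['type', 'button', '|', 'width', '120', '|']

# Decoding space restriction: options 1 and 3 violate the
# constraints, so they are never sampled even with high logits.
mask = [True, False, True, False]
samples = {
    restricted_sample([1.0, 5.0, 2.0, 5.0], mask, random.Random(s))
    for s in range(100)
}
print(samples)  # only ids 0 and 2 can appear
```

In a real decoder the same masking would be applied to the model's logit vector at every generation step, so the hard constraints hold by construction while the model still chooses freely among the remaining valid options.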
