Institute of Computing Technology, Chinese Academy of Sciences Institutional Repository (IR)
Title | AttGAN: Facial Attribute Editing by Only Changing What You Want
Authors | He, Zhenliang1,2; Zuo, Wangmeng3; Kan, Meina1,2; Shan, Shiguang1,2,4,5; Chen, Xilin1,2
Date Issued | 2019-11-01
Journal | IEEE TRANSACTIONS ON IMAGE PROCESSING
ISSN | 1057-7149 |
Volume | 28
Issue | 11
Pages | 5464-5478
Abstract | Facial attribute editing aims to manipulate single or multiple attributes on a given face image, i.e., to generate a new face image with desired attributes while preserving other details. Recently, the generative adversarial network (GAN) and the encoder-decoder architecture have commonly been combined to handle this task, with promising results. Based on the encoder-decoder architecture, facial attribute editing is achieved by decoding the latent representation of a given face conditioned on the desired attributes. Some existing methods attempt to establish an attribute-independent latent representation for further attribute editing. However, such an attribute-independent constraint on the latent representation is excessive, because it restricts the capacity of the latent representation and may result in information loss, leading to over-smoothed or distorted generation. Instead of imposing constraints on the latent representation, in this work we propose to apply an attribute classification constraint to the generated image to guarantee just the correct change of the desired attributes, i.e., to change what you want. Meanwhile, reconstruction learning is introduced to preserve attribute-excluding details, in other words, to only change what you want. In addition, adversarial learning is employed for visually realistic editing. These three components cooperate with each other, forming an effective framework for high-quality facial attribute editing, referred to as AttGAN. Furthermore, the proposed method is extended to attribute style manipulation in an unsupervised manner. Experiments on two wild datasets, CelebA and LFW, show that the proposed method outperforms the state of the art in realistic attribute editing while preserving other facial details well. |
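As a reading aid for the abstract: AttGAN combines an adversarial term for realism, an attribute-classification term on the edited image ("change what you want"), and a reconstruction term under the original attributes ("only change what you want"). Below is a minimal PyTorch-style sketch of how these three generator-side terms could be composed; the network modules (enc, dec, disc_cls) and the loss weights are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the three-part AttGAN generator objective described in the
# abstract. The encoder/decoder/discriminator modules and the loss weights are
# placeholders, not the paper's exact architecture or hyperparameters.
import torch
import torch.nn.functional as F

def attgan_generator_loss(enc, dec, disc_cls, x, a_src, a_tgt,
                          lambda_cls=10.0, lambda_rec=100.0):
    """Compose the generator-side losses for one batch.

    enc, dec : encoder / attribute-conditioned decoder (callables)
    disc_cls : shared discriminator + attribute classifier,
               returns (realism_logits, attr_logits)
    x        : input face images, shape (B, 3, H, W)
    a_src    : original binary attribute vectors, shape (B, n_attrs)
    a_tgt    : desired binary attribute vectors, shape (B, n_attrs)
    """
    z = enc(x)                 # latent code, no attribute-independence constraint
    x_edit = dec(z, a_tgt)     # decode toward the desired attributes
    x_rec = dec(z, a_src)      # decode with the original attributes

    realism_logits, attr_logits = disc_cls(x_edit)

    # Adversarial term: edited images should look real to the discriminator.
    loss_adv = F.binary_cross_entropy_with_logits(
        realism_logits, torch.ones_like(realism_logits))

    # Attribute-classification term: the edited image should exhibit a_tgt
    # ("change what you want").
    loss_cls = F.binary_cross_entropy_with_logits(attr_logits, a_tgt.float())

    # Reconstruction term: decoding with the original attributes should recover x,
    # preserving attribute-excluding details ("only change what you want").
    loss_rec = F.l1_loss(x_rec, x)

    return loss_adv + lambda_cls * loss_cls + lambda_rec * loss_rec
```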
Keywords | Facial attribute editing; attribute style manipulation; adversarial learning
DOI | 10.1109/TIP.2019.2916751 |
Indexed By | SCI
Language | English
Funding Project | National Key R&D Program of China [2017YFA0700800]; Natural Science Foundation of China [61671182]; Natural Science Foundation of China [61772496]; Natural Science Foundation of China [61732004]
WoS Research Area | Computer Science; Engineering
WoS Subject Categories | Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WoS Record Number | WOS:000482600600017
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Document Type | Journal article
Identifier | http://119.78.100.204/handle/2XEOYT63/4732
Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English)
Corresponding Author | Shan, Shiguang
Affiliations | 1. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Beijing 100049, Peoples R China; 3. Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150001, Heilongjiang, Peoples R China; 4. CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China; 5. Peng Cheng Lab, Shenzhen 518055, Peoples R China
Recommended Citation (GB/T 7714) | He, Zhenliang, Zuo, Wangmeng, Kan, Meina, et al. AttGAN: Facial Attribute Editing by Only Changing What You Want[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28(11): 5464-5478.
APA | He, Zhenliang, Zuo, Wangmeng, Kan, Meina, Shan, Shiguang, & Chen, Xilin. (2019). AttGAN: Facial Attribute Editing by Only Changing What You Want. IEEE TRANSACTIONS ON IMAGE PROCESSING, 28(11), 5464-5478.
MLA | He, Zhenliang, et al. "AttGAN: Facial Attribute Editing by Only Changing What You Want". IEEE TRANSACTIONS ON IMAGE PROCESSING 28.11 (2019): 5464-5478.