
Research Article
Image Extrapolation Based on Perceptual Loss and Style Loss
@INPROCEEDINGS{10.1007/978-3-030-77569-8_13,
  author={Yongpeng Ren and Xian Zhang and Hongping Ren and Lutao Wang and Guanrao Huang and Taisong Xiong and Xiaojie Li},
  title={Image Extrapolation Based on Perceptual Loss and Style Loss},
  proceedings={Quality, Reliability, Security and Robustness in Heterogeneous Systems. 16th EAI International Conference, QShine 2020, Virtual Event, November 29--30, 2020, Proceedings},
  proceedings_a={QSHINE},
  year={2021},
  month={6},
  keywords={Image extrapolation; Perceptual loss; Style loss},
  doi={10.1007/978-3-030-77569-8_13}
}
- Yongpeng Ren
- Xian Zhang
- Hongping Ren
- Lutao Wang
- Guanrao Huang
- Taisong Xiong
- Xiaojie Li
Year: 2021
QSHINE
Springer
DOI: 10.1007/978-3-030-77569-8_13
Abstract
In recent years, deep learning-based image extrapolation has achieved remarkable progress. Image extrapolation uses the structural and semantic information in the known region of an image to extrapolate the unknown region. The extrapolated content should not only remain spatially and structurally consistent with the known region, but also look clear, natural, and visually harmonious. To address the shortcomings of traditional image extrapolation methods, this paper proposes an image extrapolation method based on perceptual loss and style loss. We use the perceptual loss and the style loss to constrain the texture and style of the generated content, which reduces the distorted and blurry structures produced by traditional methods. The perceptual loss and the style loss capture the semantic information and the overall style of the known region, respectively, which helps the network reproduce the texture and style of the image. Experiments on the Places2 and Paris StreetView datasets show that our approach produces better results.
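The abstract does not give the exact formulation of the two losses, but both are standard in the literature: the perceptual loss compares deep features of the generated and ground-truth images (Johnson et al.), while the style loss compares the Gram matrices of those features (Gatys et al.). The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation; the choice of VGG-16, the three ReLU layers, and the L1 distance are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16

class VGGFeatures(nn.Module):
    """Frozen VGG-16 feature extractor (relu1_2, relu2_2, relu3_3 activations)."""
    def __init__(self):
        super().__init__()
        vgg = vgg16(weights="DEFAULT").features.eval()
        for p in vgg.parameters():
            p.requires_grad_(False)
        # Slice torchvision's VGG-16 `features` at the chosen ReLU layers.
        self.slices = nn.ModuleList([vgg[:4], vgg[4:9], vgg[9:16]])

    def forward(self, x):
        feats = []
        for s in self.slices:
            x = s(x)
            feats.append(x)
        return feats

def gram_matrix(f):
    """Channel-wise Gram matrix of a feature map, used by the style loss."""
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def perceptual_and_style_loss(extractor, pred, target):
    """Perceptual loss: feature distance. Style loss: Gram-matrix distance."""
    loss_perc, loss_style = 0.0, 0.0
    for fp, ft in zip(extractor(pred), extractor(target)):
        loss_perc += F.l1_loss(fp, ft)
        loss_style += F.l1_loss(gram_matrix(fp), gram_matrix(ft))
    return loss_perc, loss_style
```

In training, these two terms would typically be weighted and added to the generator's reconstruction and adversarial objectives; the relative weights are hyperparameters not stated in the abstract.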