Graph edit distance reward : learning to edit scene graph

Scene graphs, as a vital tool for bridging the gap between the language and image domains, have been widely adopted in cross-modality tasks such as VQA. In this paper, we propose a new method for editing a scene graph according to user instructions, a task that has not been explored before. Specifically, in order to learn to edit scene graphs according to the semantics given by texts, we propose a Graph Edit Distance Reward, based on Policy Gradient and a Graph Matching algorithm, to optimize a neural symbolic model. In the context of text-editing image retrieval, we validate the effectiveness of our method on the CSS and CRIR datasets. In addition, CRIR is a new synthetic dataset generated by us, which we will publish soon for future use.
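The core idea in the abstract can be illustrated with a minimal sketch. This is NOT the authors' released code: the brute-force alignment search and the exp(-GED) reward shaping below are assumptions chosen for clarity, standing in for the paper's graph matching algorithm and reward design.

```python
# Illustrative sketch: exact graph edit distance between two tiny labeled
# scene graphs, shaped into a reward usable with policy gradient.
import math
from itertools import permutations

def edit_distance(g1, g2):
    """Exact GED for tiny graphs. A graph is (node_labels, edges):
    node_labels maps node id -> label, edges is a set of (u, v) pairs.
    Tries every alignment of g1's nodes onto g2's nodes; a None slot
    means the g1 node is deleted."""
    labels1, edges1 = g1
    labels2, edges2 = g2
    ids1 = list(labels1)
    slots = list(labels2) + [None] * max(0, len(labels1) - len(labels2))
    best = float("inf")
    for perm in permutations(slots, len(ids1)):
        mapping = dict(zip(ids1, perm))
        # node substitutions and deletions
        cost = sum(1 for a in ids1
                   if mapping[a] is None or labels1[a] != labels2[mapping[a]])
        # node insertions: g2 nodes left unmatched
        cost += len(set(labels2) - set(mapping.values()))
        # edge edits: edges present in only one graph under this mapping
        mapped = {(mapping[u], mapping[v]) for (u, v) in edges1}
        cost += len(mapped ^ edges2)
        best = min(best, cost)
    return best

def ged_reward(predicted, target):
    # Closer graphs -> higher reward, in (0, 1]; a model that emits edit
    # operations can be trained with this as its policy-gradient reward.
    return math.exp(-edit_distance(predicted, target))
```

For example, a predicted graph identical to the target gets reward 1.0, and each wrong node label or edge lowers the reward; the brute-force search is exponential in graph size, which is why the paper relies on a graph matching algorithm instead.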


Bibliographic Details
Main Authors: Chen, Lichang, Lin, Guosheng, Wang, Shijie, Wu, Qingyao
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2020
Subjects: Engineering::Computer science and engineering; Scene Graph Editing; Policy Gradient
Online Access: https://hdl.handle.net/10356/144419
Record ID: sg-ntu-dr.10356-144419
Conference: European Conference on Computer Vision (ECCV) 2020
Funding Agencies: AI Singapore; Ministry of Education (MOE); National Research Foundation (NRF)
Version: Accepted version
Funding Acknowledgment: This research was supported by the National Research Foundation Singapore under its AI Singapore Programme (Award Number: AISG-RP-2018-003) and the MOE Tier-1 research grants RG28/18 (S) and RG22/19 (S). Q. Wu's participation was supported by NSFC 61876208 and the Key Area Research and Development Program of Guangdong 2018B010108002. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation, Singapore.
Citation: Chen, L., Lin, G., Wang, S., & Wu, Q. (2020). Graph edit distance reward: learning to edit scene graph. Proceedings of the European Conference on Computer Vision (ECCV) 2020.
Rights: © 2020 Springer Nature Switzerland AG. This is a post-peer-review, pre-copyedit version of an article published in European Conference on Computer Vision (ECCV) 2020.
Institution: Nanyang Technological University
Collection: DR-NTU (NTU Library), Singapore