Learning network-based multi-modal mobile user interface embeddings
Rich multi-modal information - text, code, images, categorical and numerical data - co-exists in the user interface (UI) design of mobile applications. UI designs are composed of UI entities supporting different functions, which together enable the application. To support effective search and recommen...
Main Authors: | ANG, Gary; LIM, Ee-Peng
---|---
Format: | text
Language: | English
Published: | Institutional Knowledge at Singapore Management University, 2021
Online Access: | https://ink.library.smu.edu.sg/sis_research/7049 https://ink.library.smu.edu.sg/context/sis_research/article/8052/viewcontent/3397481.3450693.pdf
Institution: | Singapore Management University
Similar Items

- Learning semantically rich network-based multi-modal mobile user interface embeddings
  by: ANG, Meng Kiat Gary, et al. Published: (2022)
- Learning user interface semantics from heterogeneous networks with multi-modal and positional attributes
  by: ANG, Gary, et al. Published: (2022)
- Cross-modal recipe retrieval with stacked attention model
  by: CHEN, Jing-Jing, et al. Published: (2018)
- Multi-view collaborative network embedding
  by: ATA, Sezin Kircali, et al. Published: (2021)
- Cross-modal recipe retrieval: How to cook this dish?
  by: CHEN, Jingjing, et al. Published: (2017)