Learning semantically rich network-based multi-modal mobile user interface embeddings
Semantically rich information from multiple modalities - text, code, images, and categorical and numerical data - co-exists in the user interface (UI) designs of mobile applications. Moreover, each UI design is composed of inter-linked UI entities that support different functions of an application, e.g.,...
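The abstract sketches a setting where each UI entity carries features from several modalities and entities are inter-linked in a network. Below is a minimal sketch, assuming a PyTorch setup, of how such multi-modal, network-based embeddings could in principle be computed; the dimensions, the concatenate-and-project fusion, and the mean-neighbour aggregation are illustrative assumptions, not the method of the paper.

```python
# Illustrative sketch (not the paper's method): fuse per-entity
# multi-modal features, then average over linked entities in the UI graph.
import torch
import torch.nn as nn

class MultiModalUIEncoder(nn.Module):
    def __init__(self, text_dim=64, image_dim=128, n_categories=10,
                 cat_dim=16, out_dim=32):
        super().__init__()
        self.cat_embed = nn.Embedding(n_categories, cat_dim)
        # Project the concatenated modalities into one joint embedding space.
        self.fuse = nn.Linear(text_dim + image_dim + cat_dim, out_dim)

    def forward(self, text_feat, image_feat, cat_ids, adjacency):
        cat_feat = self.cat_embed(cat_ids)  # (N, cat_dim)
        fused = torch.relu(self.fuse(
            torch.cat([text_feat, image_feat, cat_feat], dim=-1)))  # (N, out_dim)
        # One round of neighbourhood averaging over the UI-entity graph,
        # so each embedding also reflects the entities it is linked to.
        deg = adjacency.sum(dim=1, keepdim=True).clamp(min=1)
        return (adjacency @ fused) / deg

# Toy usage: 4 UI entities with random modality features and a small graph.
torch.manual_seed(0)
encoder = MultiModalUIEncoder()
adj = torch.tensor([[1., 1., 0., 0.],
                    [1., 1., 1., 0.],
                    [0., 1., 1., 1.],
                    [0., 0., 1., 1.]])
emb = encoder(torch.randn(4, 64), torch.randn(4, 128),
              torch.randint(0, 10, (4,)), adj)
print(emb.shape)  # torch.Size([4, 32])
```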
| Main Authors: | ANG, Meng Kiat Gary; LIM, Ee-peng |
| --- | --- |
| Format: | text |
| Language: | English |
| Published: | Institutional Knowledge at Singapore Management University, 2022 |
| Online Access: | https://ink.library.smu.edu.sg/sis_research/7269 https://ink.library.smu.edu.sg/context/sis_research/article/8272/viewcontent/3533856.pdf |
| Institution: | Singapore Management University |
Similar Items
- Learning network-based multi-modal mobile user interface embeddings
  by: ANG, Gary, et al.
  Published: (2021)
- Learning user interface semantics from heterogeneous networks with multi-modal and positional attributes
  by: ANG, Gary, et al.
  Published: (2022)
- A high-level user interface management system
  by: Singh, G., et al.
  Published: (2016)
- Cross-modal recipe retrieval with stacked attention model
  by: CHEN, Jing-Jing, et al.
  Published: (2018)
- Cross-modal recipe retrieval: How to cook this dish?
  by: CHEN, Jingjing, et al.
  Published: (2017)