Support app UI development with deep learning
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: 2018
Subjects:
Online Access: http://hdl.handle.net/10356/74058
Institution: Nanyang Technological University
Summary: Mobile UI development involves two major steps. First, UI designers translate requirements into an attractive mock-up; then UI developers implement the design as working source code. Both responsibilities are challenging in their own ways: UI designers constantly need new design inspirations to stimulate creativity, while for developers the translation of a design into code is a repetitive and tedious process.
In this paper, we proposed methods to support both roles in the development lifecycle and implemented them in two web applications. The first is a web gallery of UI widgets that offers a pragmatic approach to design inspiration. To make the gallery useful, we used Stoat, an automated GUI testing tool, together with further data processing to collect a large number of diverse widget images from popular apps on Google Play.
To enhance data collection, we proposed a novel approach that leverages Faster R-CNN, an object detection network, to extract widget information directly from UI screenshots. Experiments were also carried out to improve the model's accuracy.
A second web application incorporates a neural translation model, providing a convenient platform for UI developers to obtain a GUI skeleton by uploading a UI design image.
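The design-to-skeleton step can be sketched as an encoder-decoder: a CNN encodes the design image and a recurrent decoder emits skeleton tokens one at a time. All names, layer sizes, and the token vocabulary below are illustrative assumptions, not the project's actual model.

```python
import torch
import torch.nn as nn

# Assumed, simplified token vocabulary for a GUI skeleton.
VOCAB = ["<start>", "<end>", "LinearLayout", "TextView", "Button", "ImageView"]

class Design2Skeleton(nn.Module):
    """Toy image-to-token model: CNN encoder + GRU decoder (untrained)."""

    def __init__(self, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, hidden),
        )
        self.embed = nn.Embedding(len(VOCAB), hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, len(VOCAB))

    def generate(self, image, max_len=10):
        # Image features initialize the decoder's hidden state.
        h = self.encoder(image).unsqueeze(0)
        token = torch.tensor([[VOCAB.index("<start>")]])
        skeleton = []
        for _ in range(max_len):  # greedy decoding, one token per step
            out, h = self.rnn(self.embed(token), h)
            token = self.out(out[:, -1]).argmax(-1, keepdim=True)
            word = VOCAB[token.item()]
            if word == "<end>":
                break
            skeleton.append(word)
        return skeleton

model = Design2Skeleton()
model.eval()
tokens = model.generate(torch.rand(1, 3, 64, 64))
```

A trained version of such a model would turn an uploaded design image into a token sequence that can be rendered as layout source code.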