Our mission has been to re-imagine how software is developed. We believe software development follows patterns that, harnessed the right way, can jumpstart new projects.

Over the summer we ran a 12-week program with a team of UI/UX designers and computer vision engineers. We set out to create a data model for object detection in mobile app screen designs. We called the project the “Mobile Application Genome”.

During the program we aggregated data from UI kits, design pattern sites, and app stores, amounting to over 10,000 mobile application screens. In planning sessions we created an ontology that mapped groupings of atomic elements in each screen to a design pattern: for instance, a series of input boxes accompanied by a button was classified as a form. Once we understood the data model, we began annotating.
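The mapping from atomic elements to design patterns can be sketched as a small set of rules. This is a hypothetical illustration, not our actual ontology; the class names and rules here are invented for the example.

```python
from collections import Counter

# Hypothetical rules: each maps a minimum multiset of atomic element
# classes to a design pattern. Names are illustrative only.
PATTERN_RULES = [
    ({"input_box": 2, "button": 1}, "form"),   # e.g. a login form
    ({"image": 1, "text_label": 1}, "card"),
    ({"icon": 3}, "tab_bar"),
]

def classify_group(elements):
    """Return the design pattern whose rule this element group satisfies,
    or "unknown" if no rule matches."""
    counts = Counter(elements)
    for required, pattern in PATTERN_RULES:
        if all(counts[cls] >= n for cls, n in required.items()):
            return pattern
    return "unknown"
```

For example, `classify_group(["input_box", "input_box", "button"])` returns `"form"`, matching the input-boxes-plus-button case described above.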

Annotating data

Given the vertical expertise of the UI/UX group, we labeled each mobile design screen atomically. We started with 80 classes and ended with 120. We found nuances in certain classes that were oversampled, resulting in poor identification by the object detector.

“Each week accuracy climbed another 60%, improving results used by other processes”
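Spotting the oversampled classes comes down to checking the distribution of annotations. A minimal sketch, assuming labels arrive as a flat list of class names (the threshold value is an arbitrary choice for illustration):

```python
from collections import Counter

def imbalance_report(labels, threshold=0.2):
    """Return classes whose share of the annotations exceeds `threshold`.

    A dominant class can crowd out the rest during training, which is
    one way a class ends up oversampled relative to the others.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {cls: n / total for cls, n in counts.items() if n / total > threshold}
```

Classes flagged by a report like this are candidates for splitting, merging, or re-sampling before the next training run.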

A working model

Once the model was trained on the annotated data and we compensated for the outlier classes, we found we could detect atomic elements in a design with 99.7% accuracy. A few other findings:

  • We needed to create additional processes to handle composite elements
  • We needed to run multiple passes using color treatments as various designers didn’t always follow best practices
  • Reading the text in the images helped disambiguate detections where a single element had high confidence for multiple classes
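The last finding can be sketched as a simple tie-breaker: when one element scores highly for several classes, text read from the cropped region nudges the score of the class it supports. The class names, keyword lists, and boost value below are all illustrative assumptions, not our production logic.

```python
# Hypothetical keyword hints per class; in practice the text would come
# from an OCR pass over the detection crop.
TEXT_HINTS = {
    "button": {"submit", "save", "cancel", "sign in"},
    "text_input": {"enter", "email", "password", "search"},
}

def disambiguate(candidates, ocr_text):
    """candidates: {class_name: confidence}; ocr_text: text read from the crop.

    Boost any candidate whose hint keywords appear in the OCR text,
    then return the highest-scoring class.
    """
    words = ocr_text.lower()
    scored = {
        cls: conf + (0.2 if any(k in words for k in TEXT_HINTS.get(cls, ())) else 0.0)
        for cls, conf in candidates.items()
    }
    return max(scored, key=scored.get)
```

So a crop reading “Sign in” with near-tied scores for `button` and `text_input` resolves to `button`, since only the button keywords match.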