As Elementary Robotics' sole and lead UX/UI designer, I worked with the CEO to identify the market for industrial robots. After researching the products in the QA market, we found that most were expensive, unintuitive, and hard to customize.
We then developed a human-centric, machine-learning-driven robot that lets users automate repetitive tasks and increase work efficiency.
Game Plan
My Role
Lead designer for a human-centric AI product that allows users to streamline the Quality Assurance (QA) process.
Research, prototyping, user testing, wireframing, information architecture, UI design.
Results
Shipped the web app with two main functionalities:
1. QA inspection
2. Control robots to create inspection routines
Increased customers' work efficiency.
Reduced labor costs.
What Is It
A tool that allows Quality Assurance (QA) inspectors to semi-automate their work: robots with high-resolution cameras scan items along preset routines, and pre-trained machine learning models give Pass/Fail judgments. Inspectors remain an integral part of the process by providing feedback on every Pass/Fail judgment to improve the machine learning models over time.
User Types
Inspectors are the primary users who interact with the live inspection UI.
Managers will be trained to create customized inspection routines with the robot.
Inspector
The inspector experience centers on using the live inspection UI to achieve inspection goals and to provide live feedback on the robot's judgments.
Manager
The manager experience centers on using the routine creation UI to generate customized inspection routines with the robot.
They can supervise multiple inspection stations remotely to ensure smooth operation.
Example of a Traditional Inspection Workflow
After a few onsite customer interviews, here is a simplified journey map of their current process.
01. Unpack parts to be inspected from the external supplier.
02. Perform visual inspections to locate defects on parts.
03. Mark defects with red stickers.
04. Group, package, and label all defective parts sharing the same defect.
05. Store defective parts in a separate location.
06. Place all good parts back in the original box and pass it down the line.
Pain Point Highlights
1. Live Inspection
The first major feature we launched was the ability for customers to inspect their products with our solution. To generate the machine learning models, our internal ML engineering team set up routines and collected data for our customers. The flow below is designed for inspectors and addresses pain points 1, 2, and 3 above.
Free users from repetitive work and tiring postures.
Problem
Manual inspection requires inspectors to complete detailed repetitive tasks to locate defects under a magnifying lens.
Solution
Our product only requires inspectors to place the parts being inspected in fixtures and to validate the machine learning results on a screen. The robot executes the pre-programmed inspection routine, recording images of areas with potential defects, and the machine learning models identify defects in those areas.
Transparency
Problem
Inspectors wanted to understand how the robot was making decisions during inspections.
Solution
We included a routine overview panel that gives users a live update on the robot's progress during every inspection. Bounding boxes with Pass/Fail results appear automatically around areas of interest as each position is scanned.
Increase Efficiency
Problem
After the initial deployment of our product, onsite interviews revealed that our software's workflow was not efficient enough to match customers' existing workload.
Solution
We redesigned the UI to let users review images concurrently while the robot completed the inspection routine. With the new UI, inspectors could tap the "Review" page to review inspection results, and toggle back to the "Inspection" page at any time to keep track of the inspection.
By updating this design, we recorded a 25-30% reduction in time spent on each inspection compared to the previous prototype.
There are two elements users can navigate on the "Review" page:
- Areas of interest (AOIs)
- Inspection positions
One inspection position might contain multiple AOIs that need to be reviewed.
Since the inspector interacts with our product through a physical touch screen, the AOI navigator buttons are placed ergonomically in the bottom-right corner of the screen so the inspector can quickly review AOIs. For inspection positions, the review screen automatically advances to the next inspection position once the previous one has been fully reviewed. Together, these features let the inspector review ML model decisions more efficiently.
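To make the auto-advance behavior concrete, here is a minimal TypeScript sketch of the review navigation described above. All names and types are hypothetical, not the product's actual code.

```typescript
// Hypothetical data model: each inspection position holds one or more
// areas of interest (AOIs) that the inspector must review.
interface AOI {
  id: string;
  reviewed: boolean;
}

interface InspectionPosition {
  id: string;
  aois: AOI[];
}

// Mark an AOI as reviewed; once every AOI at the current position is
// done, advance the review view to the next inspection position.
function reviewAOI(
  positions: InspectionPosition[],
  currentIndex: number,
  aoiId: string
): number {
  const position = positions[currentIndex];
  const aoi = position.aois.find((a) => a.id === aoiId);
  if (aoi) aoi.reviewed = true;

  const allReviewed = position.aois.every((a) => a.reviewed);
  const hasNext = currentIndex + 1 < positions.length;
  return allReviewed && hasNext ? currentIndex + 1 : currentIndex;
}
```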
Supervised machine learning model training.
During inspection routines, our machine learning model takes a first pass at making Pass/Fail judgment calls for every AOI at each inspection position. To improve the ML model's accuracy over time, we included a step for the user to provide feedback on the model's results. This allows inspectors to use their experience to correct or validate the ML model's decisions.
This UX feature also increases business value for our product. The human-in-the-loop aspect of our product is important since our goal is to augment the inspectors’ capabilities to improve their efficiency for our customers.
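As an illustration, the feedback loop might capture records like the sketch below; this is a hypothetical data model, not the product's actual one.

```typescript
type Judgment = "pass" | "fail";

// Hypothetical record of an inspector's feedback on one AOI judgment.
interface AOIFeedback {
  aoiId: string;
  modelJudgment: Judgment;     // the ML model's prediction
  inspectorJudgment: Judgment; // the inspector's correction or validation
}

// Fraction of AOIs where the model and inspector agree: a simple proxy
// for model accuracy that can be tracked across retraining cycles.
function agreementRate(feedback: AOIFeedback[]): number {
  if (feedback.length === 0) return 0;
  const agreed = feedback.filter(
    (f) => f.modelJudgment === f.inspectorJudgment
  ).length;
  return agreed / feedback.length;
}
```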
Problem
Customers have different preferences for accuracy vs. efficiency.
Some customers only want to review failed image results; others want to review all images.
Solution
We provided a filter option to choose between "All" and "Needs Review" (failed results) on the "Areas of Interest" and "Inspection Positions" panels.
When "All" is selected, inspection positions with AOIs labeled "Failed" or "Unknown" are placed at the front of the queue.
2. Routine Creation
As customer needs and the company scaled, we began offering customers the ability to create their own inspection routines through our interface.
Managers or inspection leads are the target audience for this feature.
Users can easily toggle between light and dark modes in the routine creation UI.
Problem
Users were afraid of breaking the robot, so an intuitive interface for controlling it was needed.
Challenge
This specific robot has 5 degrees of freedom. It can be difficult to describe robot movement relative to the user.
Solution
We worked closely with the hardware team to devise a control system that commands the robot's movement relative to the perspective of the camera on the end effector.
This way the user only needs to care about which way they want the camera to look.
We also included an information tooltip for the pitch and yaw rotation concepts.
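To illustrate the idea, here is a minimal sketch of camera-relative jogging: the operator's command is expressed in the camera's frame and rotated into the robot's base frame using the end effector's current yaw and pitch. Real 5-DOF kinematics are more involved; this only shows the frame change, and all names are hypothetical.

```typescript
type Vec3 = [number, number, number];

// Rotate a jog command from the camera frame into the robot base
// frame: apply pitch (rotation about the x-axis), then yaw (rotation
// about the z-axis). Angles are in radians.
function cameraToBase(jog: Vec3, yaw: number, pitch: number): Vec3 {
  const [x, y, z] = jog;
  const cy = Math.cos(yaw), sy = Math.sin(yaw);
  const cp = Math.cos(pitch), sp = Math.sin(pitch);

  // Pitch: rotate about the camera's horizontal axis.
  const y1 = y * cp - z * sp;
  const z1 = y * sp + z * cp;

  // Yaw: rotate about the vertical axis.
  const xb = x * cy - y1 * sy;
  const yb = x * sy + y1 * cy;

  return [xb, yb, z1];
}

// Example: "move the camera forward" is always +z in the camera frame,
// regardless of where the end effector currently points.
const baseCommand = cameraToBase([0, 0, 1], Math.PI / 4, Math.PI / 6);
```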
Trade-off
This system may not be the perfect UX for our product. An alternative I pitched was dragging the live camera feed to control some of the robot's movements, but the team went with the faster solution for the MVP version of the feature due to the small team size and the project timeline.
Working with the PM, I put together a roadmap for the next version of this feature.
Routine creation UI
Live Inspection UI
By making the routine creation and live inspection UI panels similar to each other, we were able to reduce the learning curve for inspectors. It also helped them more quickly overcome fears of breaking the robot.
This design decision also helped our development team build the routine creation page faster.
It was the beginning of our own design system and guidelines.
Roadmap
We ranked each feature by customer value (requests), business value, and development effort to organize features into releases, as sketched below.
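As a hypothetical example, such a ranking could be scored like this (the scales and formula are illustrative, not our actual rubric):

```typescript
interface Feature {
  name: string;
  customerValue: number; // customer requests, scored 1-5
  businessValue: number; // scored 1-5
  devEffort: number;     // scored 1-5; higher means more work
}

// Higher customer and business value raise priority; higher
// development effort lowers it.
const priority = (f: Feature) =>
  (f.customerValue + f.businessValue) / f.devEffort;

// Sort features into a release order, highest priority first.
const roadmap = (features: Feature[]) =>
  [...features].sort((a, b) => priority(b) - priority(a));
```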
3. Design System
- We used Ant Design as our initial library for general components.
- I created new components for the camera feed, images, positions, etc. on top of Ant Design for our Elementary Robotics UI.
- We modified colors and styles to match the UI to Elementary's branding (see the theming sketch after this list).
- We ran accessibility testing with the team to finalize font sizes and colors that were easier to use in the inspection work environment.
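For illustration, brand theming on top of Ant Design might look like the sketch below (assuming antd v5's ConfigProvider token API; the color and size values are placeholders, not Elementary's actual palette):

```typescript
import React from "react";
import { ConfigProvider } from "antd";

// Wrap the app so every Ant Design component picks up the brand theme.
export function BrandedApp({ children }: { children: React.ReactNode }) {
  return (
    <ConfigProvider
      theme={{
        token: {
          colorPrimary: "#1f6feb", // placeholder brand color
          fontSize: 16, // larger base size for shop-floor touch screens
        },
      }}
    >
      {children}
    </ConfigProvider>
  );
}
```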
General Typography
And we launched! Check out Elementary Robotics for more info about the product.