The company that created the ATS (Applicant Tracking System) contacted us to help them refresh their main product. Their main software had been on the market for more than 8 years, and some of their clients were asking for improvements and updates. They wanted to understand their users and create something more competitive.
Redesign the ATS (Applicant Tracking System) to meet the current needs of its user base.
My main responsibility was to create a solid research plan that would fit the client's needs and budget. Once that was done, I worked with my research partner to run the plan, analyse the data, and capitalise on the findings to create a new ATS.
This is a simplified view of the process we followed in this project. The actual process was far more iterative, with many back-and-forths as we gained new understanding and received more insights and evidence. This short selection of steps does not cover everything we did.
The company wanted to know the main pain points that people in HR and Talent Acquisition faced. They had a lot of experience themselves, but understood that external research was needed to test their riskiest assumptions and to pinpoint unseen opportunities.
A tool aimed at the modern workforce should be at the very least on par with existing patterns and expectations. We wanted to optimise the task flows and processes so people could focus and do their best work.
The company needed a tool that would be a significant improvement over their existing solution.
The company wanted a tool that would stand out from the competition in the hiring space; this would be their ticket to increasing their market share.
This created a shared understanding and shed light on certain areas of the hiring process we couldn't have guessed. It also helped the client understand how much things had changed since the initial design of the ATS.
We gathered behavioural data (both attitudinal and observational) that helped us drill down and understand the jobs the users needed to perform.
We started by asking users to show us how they create and handle job positions, from opening a position until they hire someone and close it. We then conducted a series of interviews with users of the existing ATS, and with an equal number of users of competitor systems.
Our goal was to discover how these users did their tasks, what their pain points were, and where we could create value for them.
These interviews were semi-structured in order to create space for exploration on the things we and the stakeholders didn’t know - the unknown unknowns that could potentially show us a way to create value.
Creation of behavioural user personas and a shared understanding of the users' needs and desires. This helped the team focus and make decisions.
We gathered notes, quotes, and insights from the observations and interviews on a whiteboard. We coded them, found themes, and used them to create behavioural personas.
This helped us to create customer journey maps based on user data.
Both of these helped us and the client's team build a shared understanding and make decisions.
After coding the findings from the interviews, we arrived at 4 distinct personas based on behaviours and motivations.
People in HR and Talent Acquisition were overwhelmed by the number of candidates who applied. They mentioned having to review over 800 profiles for a single job in some cases, making this first step an extremely tedious one.
The users had to enter a lot of data by hand, as the systems they used didn't support automation. This increased both the workload and the error-proneness of the process.
Excel was used as a tool to capture data and organise their work. Because this happened outside the system, colleagues and managers had no visibility into that work. This created multiple problems, especially when people were on vacation.
The systems didn't have reporting functionality. Users had to download data (in the best cases) or copy/paste it into Excel in order to create reports on their work.
For certain positions, candidates didn't show up for their meetings. This led people to overbook meetings for a day, trying to make the best use of the available hours. The workaround backfired when everyone did show up.
We had a shared understanding of what needed to happen and how the system could handle each case. The developers were able to start working on the infrastructure that would enable the system to work as intended.
We created task flows for the top tasks users must be able to perform within the system. To do this, we explored all the paths, accounted for what could go wrong, and defined how we should handle such cases.
In this process, we had the help of a tech lead from the client’s side in order to make sure that what we were designing was actually feasible within their technology stack and budget.
We were able to pinpoint areas we could improve before we even moved to usability testing, as there was some low-hanging fruit that both stakeholders and users agreed on.
Once the task flows were at a good level, we started working on the wireframes of the system.
For this, we used a component library in order to stay consistent and work faster.
Once we had all the necessary screens ready, we presented them to the stakeholders and a selection of users for feedback.
Faster job ad creation, a less error-prone process, and automated steps that previously required a lot of manual work.
We streamlined the job ad creation process by adding a wizard-like creation flow with automations: ready-made job templates and the ability to create custom ones.
We also incorporated system checks so that users get notified when they're about to create something contradictory, e.g. marking a job as remote on one page and on-site on another.
We also added the option to promote job ads to multiple job boards with a single click, reducing the manual work as well as the potential for errors in the process.
We saved time by making the most-used actions available with a single click, placed close to where users expected to find them, and by grouping relevant information together.
We streamlined how users switch between finding the information they need and performing the most common actions. We also grouped related information closer together to make comparisons and decisions easier.
We also made contacting candidates a lot easier, and we kept the details of each communication in the system so the whole team has a full history.
As we focused on a single persona (this was the strategy), we recruited 6 users to test the new flows and designs with. To do this, we prepared 10 tasks, the top tasks, and created an interactive mockup that could facilitate all the interactions and paths (over 70 screens).
After each task, we used the SEQ (Single Ease Question), and at the end of the test we ran the SUS with one additional question. This gave us both the SUS and the UMUX-Lite questionnaires, so we could compare the measured SUS score with the SUS predicted from the UMUX-Lite.
We did this because we wanted to use only the UMUX-Lite in the future to measure and benchmark improvements.
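For readers unfamiliar with these questionnaires, the scoring above can be sketched in a few lines. This is a minimal illustration using the standard SUS scoring rules and the UMUX-Lite-to-SUS regression published by Lewis, Utesch and Maher (2013); the exact coefficients and item scales used in this project may have differed.

```python
def sus_score(responses):
    """SUS score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribute r - 1),
    even-numbered items are negatively worded (contribute 5 - r);
    the sum of contributions is multiplied by 2.5.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

def umux_lite_score(usefulness, ease, scale_max=7):
    """UMUX-Lite raw score (0-100) from its two Likert items."""
    return ((usefulness - 1) + (ease - 1)) / (2 * (scale_max - 1)) * 100

def sus_from_umux_lite(raw):
    """SUS predicted from the UMUX-Lite raw score.

    Regression coefficients from Lewis, Utesch & Maher (2013).
    """
    return 0.65 * raw + 22.9
```

Comparing `sus_score(...)` against `sus_from_umux_lite(umux_lite_score(...))` for each participant is what lets a team validate the predictor before dropping down to the two-question UMUX-Lite for ongoing benchmarking.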
We found several usability issues and uncovered additional cases and ways we could better serve users of the system.
Even though the users we tested with understood the interface and the new process, and managed to complete the tasks, we observed areas where we could make the product more intuitive and clear.
We presented all of these findings to the client as suggested improvements. Because we had convinced the client to be present at a couple of the usability sessions, they experienced first-hand the value these tests bring, and approved the suggested fixes.
We benchmarked the existing ATS as a baseline against the version we created and tested. To do this, we used the UMUX-Lite and the SUS (in practice, one extra question added to the SUS questionnaire).
We did this to validate the SUS predictor derived from the UMUX-Lite, as we planned to use only the latter going forward, since it is just 2 questions.
We observed significant improvements on all measures, which made us confident that, after incorporating the test findings into the designs, we could move on and develop the platform, as it met the goals of the company as well as those of the users.
Keeping in mind that the baseline was measured with a tool the users knew and had experience with, while they were seeing the newer version for the first time (which also let us gauge how intuitive and learnable the new designs were), the results are even more impressive.