This is a Guest Blog by Udit Khandelwal.
“The website is amazing and the session was awesome! I would love to come back for another round!” – Deepika called out while leaving for home after going through a usability testing session at Zivame. From that moment, Zivame was always going to be the first choice for Deepika – a proud homemaker and entrepreneur.
What did we do to evoke such emotions in the people who shop with us?
Well, we involved users at each and every step of product design. In my article How I Changed The Way Women Buy Bras Online (http://notuser.com/how-i-changed-the-way-women-buy-bras-online), I covered how we leveraged user research to discover a new way of shopping for bras. Here, I am going to walk you through the journey wherein we tested the design with real users, obtained real insights and implemented course corrections.
The Game Plan
The resources were limited, the scope was huge and we had to keep churning out incrementally enhanced designs.
The entire website was supposed to be designed and developed from scratch within 5 months. The last month was already reserved for all the testing, deployment and stability related activities. So we had no more than 120 days to design and develop the new Zivame experience. I had heard this saying somewhere:
No shortcuts today; I’m in a hurry – Swiss saying
… and I thought it was just the right time to follow it. We needed a game plan that involved no shortcuts because we couldn’t afford to go down the wrong lane. Hence, we decided to do the following after our early research was over:
- Lo-Fi Prototypes: Refine the sketches until we were done discovering the loopholes and until we had fixed all the broken flows.
- Formative Tests: Use paper prototypes to test with whoever we could, whenever we could, even if it was at the lunch table.
- Hi-Fi Prototypes: Develop a high fidelity interactive prototype, as close to the final product as possible.
- Summative Tests: Test it with real users and obtain real insights.
- Beta Tests: One final round of testing, on the actual product, before we handed it over to the world.
- Iterations: Every test delivered some good news and some bad news (which, caught early, was also good news). We had to prioritize bug fixes and enhancements for every iteration.
Usability testing is an ongoing activity that needs proper planning, time and resources.
The initial design was completely untested and development teams were waiting on us. We had to give them something solid, real quick.
Low fidelity prototypes work beautifully when you want to fail faster. They help you discover flaws in design at an early stage and designers themselves are very open to making changes in the raw design. We did the same at Zivame. Once we were done with white-boarding, we quickly moved on to our tools to create different screens on the computer and took prints to create screen flows.
And then, we simply asked some folks from within the office (who closely matched our target persona Mahi Agarwal) to perform certain scenarios. Whenever they were stuck even for a bit, we knew there was a problem. More often than not, they would even give us suggestions, which we took note of. We would then come back and discuss why the users were suggesting what they were, get to the root cause of the problem, and figure out our own solution.
Sometimes tests not only tell you what’s wrong with the design, but also reveal new opportunities. For example, it was during the formative tests that we discovered users were able to comprehend and interact with notifications. We took that as a green signal and designed a stronger, deeper integration of notifications.
For testing, don’t wait for the final product; go ahead with whatever prototype you can afford and test with whoever you can find. Fail faster.
High Fidelity Prototypes
The website had more than 300 unique screens and screen-states. Putting together a click-through was a mammoth task.
We first set our goals and non-goals before building the prototype. This helped us reduce the scope of prototyping. For example, we wanted to test the new shop-by-experience feature, but were pretty sure we had followed best practices for the checkout, so we decided not to focus on that flow.
Next, we evaluated 3 options for building the prototype:
- Flash – my personal favorite, I know I sound old-school!
- Marvel App – because everyone is going this way.
- Invision App – Marvel’s competitor.
Now Flash was quickly ruled out because of the overheads involved. Marvel kind of worked, but had limited support for overlays, and my entire design was based on surfaces and overlays. Invision offered me better flexibility, so I went ahead with Invision.
Scoping is critical; even for a prototype.
The test had to be optimized to FAIL the design.
We decided to write down the script that we were going to use during the usability testing sessions. The critical piece here was to figure out exactly what to test and how. So I listed down my goals and non-goals, based on which I broadly figured out what I was going to test:
- Initial Mental Model – Expectancy Test
- Actual Usage – Free Exploration Test
- Navigation – Performance Test
- Affordance – Visual Affordance Test
- Task Flow – Performance Test
- Sentiment – Semantic Differential Process
Once that was done, I quickly mapped a technique against each line item and then moved on to the modules that I wanted to focus on. After this, I defined the task-flow of individual scenarios. I didn’t write down the exact language of the scenarios as I didn’t want to sound stiff. I have shared the script below on Slideshare.
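As a side note, the sentiment ratings from a semantic differential test are straightforward to aggregate once the sessions are done. Here is a minimal sketch in Python; the adjective pairs and ratings are hypothetical, not our actual data:

```python
# Semantic differential: participants rate the design between opposite
# adjective pairs on a 1-7 scale; we average per pair across participants.
from statistics import mean

# Hypothetical adjective pairs and per-participant ratings
# (1 = left pole, 7 = right pole).
responses = {
    ("confusing", "intuitive"):  [6, 5, 7, 6],
    ("cluttered", "clean"):      [5, 6, 6, 7],
    ("boring",    "delightful"): [4, 6, 5, 5],
}

def summarize(responses):
    """Return the mean rating per adjective pair, sorted worst-first."""
    scores = {pair: mean(ratings) for pair, ratings in responses.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

for (left, right), score in summarize(responses):
    print(f"{left} / {right}: {score:.1f}")
```

Sorting worst-first means the weakest impressions are the first thing you see when triaging results.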
Use a hammer for the nails, but a screwdriver for the screws.
Finding women (in India) who’d agree to participate in the testing of a lingerie website.
We wanted women who closely resembled Zivame’s target persona Mahi Agarwal. We took to social media and made an announcement. We asked women to help Zivame in building a great shopping experience for women! A lot of women came forward and we received a good number of responses. Not only this, they also invited their friends to participate. And kudos to forward-thinking women like Deepika, Subha and Aastha, who went a step further and agreed to put a face to our participants.
We were in a good position to screen and recruit our participants. Women who expressed interest had to fill out the participant recruitment form. After a quick screening, we shortlisted the participants and called them to schedule their sessions.
Finding real users isn’t that difficult. You just have to do it the right way!
– The Dragonfly Effect, Jennifer Aaker & Andy Smith
Usability Test Sessions
Talking to women about a website that sells bras.
On the D-day, we were well prepared: the systems were set up, printed copies of the script were at hand, the team was ready to perform their roles, and the participants were greeted warmly.
We started the sessions by making the participants feel comfortable. We began with some chit-chat and brief introductions. We emphasised how important their contribution was and asked them not to worry about hurting our feelings – we wanted honest feedback. To spare them from sharing their actual bra size with us, we asked them to assume their size was 34C. I think they liked the idea. We made sure they believed:
You are not being tested, we are!
Invariably, every user would smile at this moment, and we knew we had managed to make them comfortable. Throughout the session, we made sure the focus was on the tasks (and hence on the design) and not on the products. That’s when we would begin the flow (as mentioned in the PPT).
We discovered some pretty interesting things during these tests. To our surprise, none of the users had trouble figuring out the ‘shop’ menu, which was a big change from our previous design. However, users were confused when they reached the sub-menu, and we knew it needed to be simplified (which we did later).
We also discovered a basic issue with the title bars of our surfaces. Users had difficulty going back and closing the surfaces. Again, we resolved this issue in our next iteration.
One of the key findings was about the sticky buttons at the bottom of the surfaces. We realised that the button would sometimes break the user flow: users clicked it without reading the label, or misinterpreted the label. We designed alternatives for such situations.
All in all, the users proved to be very helpful!
Users are humans, if you treat them well, they will be very helpful.
The Bug Bash
We needed to make sure the final product was behaving as designed and intended, and we were running out of time!
There is always a fair degree of difference between the design and the actual implementation.
When the Zivame beta product website was ready, we wanted to make sure that the product was behaving as expected. We wanted to see if users were comfortable interacting with the UI controls we had implemented and wanted to see it working in the real world.
So, we opened it up to all the Zivame employees and conducted a 3-hour bug-logging marathon, which we called the Bug Bash. We issued a coupon that would work only on the beta website for those 3 hours and asked all the employees to use the coupon and log whatever bugs they faced in the Bug Bash Form.
We divided them into different teams and announced prizes for the top 3 teams. The plan worked and we received 223 responses from different teams. It took us 2 days to go through the entire list and figure out which ones were genuine. Most of the bugs filed were duplicates or known issues, but we discovered 12 new bugs (3 of which were related to the UI).
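Much of those two days went into spotting duplicates among free-text reports. That part is easy to assist programmatically; here is a rough sketch using fuzzy title matching – the report titles and the 0.8 threshold are illustrative, not our actual data:

```python
# Group near-duplicate bug reports by fuzzy title similarity.
from difflib import SequenceMatcher

def is_duplicate(a, b, threshold=0.8):
    """Treat two titles as duplicates if their similarity ratio meets the threshold."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def dedupe(titles, threshold=0.8):
    """Return unique titles, keeping the first occurrence of each duplicate group."""
    unique = []
    for title in titles:
        if not any(is_duplicate(title, seen, threshold) for seen in unique):
            unique.append(title)
    return unique

reports = [
    "Coupon code not applied at checkout",
    "coupon code not applying on checkout",   # near-duplicate of the first
    "Size chart overlay does not close",
    "Back button broken on size chart overlay",
]
print(dedupe(reports))
```

The threshold needs tuning against real data: too low and distinct bugs collapse together, too high and rephrased duplicates slip through.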
This gave us a high degree of confidence to release the beta externally!
Testing never hurts and it can be done at any stage.
Kudos to the women, who helped us all along the way in building such a fantastic shopping experience at Zivame!
Have any challenging experiences to share as a product manager yourself? Or something you found extremely interesting in this post? Comment below and let us know.
Waiting to drive such game-changing product experiences yourself? Don’t wait any longer. Enroll in the UpGrad Product Management Program now!