Redesigning the Google Drive Homepage to make it easier for users to find & access files
Design Research | Think-Aloud Usability Testing | A/B Testing | Prototyping
I conducted think-aloud and A/B usability testing to redesign the Google Drive homepage. The redesign made file search 80% faster and halved the number of search attempts users needed to find files.
Role
UX Researcher & Designer
Duration
May 2024 - June 2024
Team
Vaidehi Chotai
Methods
Moderated & Unmoderated Think-Aloud Usability Testing
Figma Wireframing & Prototyping
A/B Usability Testing
Time on Task Quantification
SUS Quantification
PROBLEM SPACE
It is difficult to quickly find desired files and folders in Google Drive. One main reason for this is its Homepage.
The Google Drive Homepage, the intended starting point to the Drive experience, is cluttered and overwhelming, especially for college students.
80%
of users either bypass the Homepage completely or avoid using Google Drive altogether due to its confusing & cluttered interface, preferring Notion or their local Desktop instead for collaboration and file storage.
*n = 8, exploratory research with college students aged 23 - 27 y/o
"One reason I don't use Google Drive often is because it's very overwhelming to me. When it has my files in here mixed in with other things that are shared, it just becomes overwhelming... all my Docs, Sheets are all jumbled together on the home screen"
OPPORTUNITY
Improving Google Drive's Homepage experience for students will cultivate loyalty and drive long-term revenue growth through future subscriptions
While this is a self-started project, if Google were my client, I would recommend that the Google team improve the Drive Homepage experience for students, because Google Drive is the entryway to the entire Google suite of applications (Docs, Slides, and Sheets), and today's students are tomorrow's Google Drive power users.
By providing a seamless experience now, Google can cultivate a loyal student user base that can convert into paid subscriptions as these students enter the workforce. This strategy, pioneered by companies like Figma, Adobe, and Apple, fosters brand loyalty and fuels long-term user growth and revenue.
🔑 Entryway to Google suite of applications
📈 Long-Term User Growth
💵 Revenue from Paid Subscriptions
Project Preview
CURRENT STATE
The current Google Drive homepage lacks organization, simply displaying a list of recently accessed files and folders based on both user and collaborator activity. As a result, it quickly becomes cluttered with irrelevant content, especially for users who work in collaborative settings and have a large number of shared files. Additionally, the files are not organized by file type or ownership, making file search time-consuming and difficult.
PROPOSED REDESIGN
The redesign introduces a "Recent Files" section at the top, displaying files recently accessed by the user, organized by file type and share status (personal/shared). This allows users to quickly access files they've recently worked on or viewed. The lower half of the homepage showcases user-selected personal and shared folders, similar to a desktop, allowing easy access to frequently used folders. This approach aligns with users' preference for folder-based organization, as 100% of users that were interviewed expected the Drive homepage to reflect their local computer desktop.
Overall, this design personalizes the homepage by displaying files based on user activity and giving users control over what appears, rather than relying on an auto-generated list of file recommendations.
Mid-fidelity prototype of the final design solution
80%
faster file search
52%
fewer search strategies required
PROCESS
How did I get to this solution?
RESEARCH FRAMEWORK
My Approach
My Role
Since this was an independent project, I led all aspects of the research and design process.
I designed and executed the research studies:
Recruited participants
Developed study protocols (including the creation of usability tasks, task metrics, and interview questions)
Conducted testing sessions
I also conducted data synthesis and designed prototypes:
Analyzed the collected data
Synthesized key insights
Translated insights into mid-fidelity prototypes using Figma
Users struggle to quickly find and access files on Google Drive. The homepage, intended for efficient file and folder access, currently does not meet this need. Redesigning it would deliver the greatest impact to users’ Drive experience.
EXPLORATORY RESEARCH: FINDINGS
Based on my exploratory research, I discovered that users often struggle to quickly find and access files on Google Drive, especially when they don’t remember the file name and can’t use the search function.
This forces them to manually search through various directories based on file access time and ownership (e.g., My Drive, Recents, Shared With Me, etc.). This process requires extra steps, heavy reliance on memory, and multiple search attempts just to find the right file. As a result, many users tend to “star” important or frequently accessed files for quicker access.
Ideally, the homepage should provide easy access to frequently used and important files and folders, allowing users to get to their content with the fewest clicks and minimal reliance on recall.
However, the Drive homepage currently does not allow for this. In fact, from usability testing, I found that a majority of participants avoid the homepage altogether.
Despite being the first page users see, they rarely use the homepage to find files. Instead, they go directly to My Drive/Starred directories, or even bypass Google Drive entirely in favor of Docs, Sheets, or Slides applications.
Users are bypassing the homepage, taking extra steps and clicks to access their files. Why?
Lack of user control
Files are displayed based on auto-generated recommendations instead of user-selected content. This strips users of control over the homepage and reduces its utility.
Lack of folder-based organization
Files and folders are filtered separately on the homepage, creating a clutter of random files.
Lack of classification based on file type and share status
The Recent Files list displays files of all types (Docs, Sheets, Slides, etc.) owned by different users all at once, causing clutter and making it difficult to identify and access files of interest.
Irrelevant files
Files opened or edited by a collaborator are suggested on a user’s homepage, which leads to the homepage displaying irrelevant files.
Problem behind the problem: Google Drive’s homepage file organization and display don't match users' expectations, which are based on how files are organized on a local computer desktop
Google Drive users generally expect a file organization experience similar to their local computers, relying heavily on a folder-based structure. The drawings above depict frequent users' mental models of file storage and search on Google Drive.
A majority of participants illustrated folders or folder-like groupings in their drawings (see examples, left and center). These visualizations suggest that users consider folders their primary tool for both storing and retrieving files, viewing them as essential for managing and searching their Drive content. One participant visualized Drive as a cabinet of files (right), with a clear distinction between their own files and those of collaborators, stored in separate areas. They seem to rely on the search function to find files, requiring recall of file names.
EXPLORATORY RESEARCH: PROCESS
To understand how other users interact with Google Drive and identify pain points, I conducted exploratory think-aloud usability testing (n = 5).
Specifically, my goal was to:
Discover specific difficulties users encounter when managing multiple project files and collaborating heavily within Google Drive.
Analyze user behaviors and mental models related to file storage, organization, and retrieval.
In these testing sessions, I tasked participants with finding a specific year-old file without giving them its name. I also asked participants to give me a guided tour of their Google Drive to understand their file organization behaviors, strategies, and mental models. Finally, I asked them to draw their mental model of Google Drive file storage and organization.
I followed this with a think-aloud usability test of a competitive platform (Microsoft OneDrive, n = 2) to expand my understanding of users' core goals, mental models, and motivations when storing and organizing files. This exercise provided valuable insights into how another company approaches file storage and organization, and allowed me to learn from their design decisions.
THINK-ALOUD USABILITY TESTING
I chose think-aloud usability testing because it allows me to 1) directly observe participants’ actions (clicks, scrolls, cursor movements etc.) and also 2) hear their thought process, providing insight into why they make certain choices - their motivations and mental models.
File Findability: How do users locate specific files and folders without using the search function? What browsing strategies do they employ to navigate their Drive? Do they encounter difficulties finding things, and if so, what are the pain points?
Organization: How do users categorize and structure their files and folders in Google Drive? What challenges do they face when organizing their Drive? Do they use any specific naming conventions or organizational strategies?
Observe how users navigate their cloud storage to locate files.
Task 1: Find a slide deck from a project you did in Fall 2023. While you complete this task, make sure to think out loud.
Task 2: You are updating your resume with the project you did in the Spring. You want to refer to the original project problem statement to get guidance on what to write on your resume. Find this piece of content in your Drive. While you complete this task, make sure to think out loud.
Understand how users organize their Drive (folder labeling, subfolder categorization etc).
Task 3: Now please give me a tour of your Google Drive. Show me how you organize your files and folders. While you show me around, please explain your thought process behind your organization.
File findability was measured by noting the number of browsing strategies used by the participant to find the file specified in the task.
Reflection
At first, I planned to measure the number of clicks it took participants to find files. I decided instead to use a different metric, the number of strategies used by participants, because it better accommodates the variation in subfolder organization among participants and is simpler to benchmark. While determining an ideal number of clicks for a task can be challenging, determining the number of strategies it should take (one) is easy.
7
Participants
23 - 27
Age Range
Screenshots showing an unmoderated think-aloud usability testing session in progress. The participants were provided with a detailed instruction sheet (link) guiding them through tasks while screen sharing.
I decided to conduct unmoderated sessions to reduce facilitator bias and increase efficiency, as multiple sessions can be run simultaneously. However, I also ran moderated sessions alongside them, as the unmoderated approach doesn't allow for follow-up questions, limiting deeper insights into the participant’s motivations.
SYNTHESIS
Screenshot of my think-aloud usability testing synthesis spreadsheet. I organized data and findings from the usability testing interviews by task, and included additional insights on user mental models and motivations at the bottom of the sheet.
After compiling key insights and issues from the usability testing interviews, I used a frequency, impact, and persistence framework to assign a priority score to each issue. This helped me identify which issues were most critical for the business to address first.
This approach helped me focus my project on redesigning the Drive homepage, as it was the most frequent and persistent issue with the highest impact and relatively few trade-offs.
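The prioritization described above can be sketched in a few lines of code. The issue names and 1-5 ratings below are illustrative placeholders, and the equal-weight sum is one plausible scoring rule, not the exact rubric used in the study:

```python
# Illustrative frequency-impact-persistence prioritization.
# Issue names and 1-5 ratings are hypothetical, not the study's actual data.

issues = {
    "Cluttered, avoided homepage": {"frequency": 5, "impact": 5, "persistence": 5},
    "Irrelevant collaborator files shown": {"frequency": 4, "impact": 4, "persistence": 4},
    "No grouping by file type": {"frequency": 3, "impact": 3, "persistence": 4},
}

def priority_score(ratings):
    # Equal weighting across the three dimensions; a real rubric
    # might weight impact more heavily than the other two.
    return ratings["frequency"] + ratings["impact"] + ratings["persistence"]

# Rank issues from most to least critical.
ranked = sorted(issues.items(), key=lambda kv: priority_score(kv[1]), reverse=True)
for name, ratings in ranked:
    print(f"{priority_score(ratings):>2}  {name}")
```

Making the scoring explicit like this keeps the triage auditable: anyone reviewing the synthesis spreadsheet can see exactly why one issue outranked another.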
Design
Recommendations
These insights point to a redesigned homepage that is uncluttered, streamlines file search and access, and matches users’ underlying expectations of a file storage platform.
My design recommendations to achieve this would be —
📂
Organize files and folders in a familiar desktop-like layout
✂️
Eliminate unnecessary details (“Location”, “Reason Suggested”) that contribute to a cluttered homepage
👩🏽‍💻
Display recent files only based on user activity, excluding collaborator activity, to minimize clutter
🗂️
Group files by file type (Docs, Slides, Sheets etc.) for a more organized and intuitive browsing experience
🙋🏽‍♀️
Clearly distinguish between user-owned and shared items
With these design recommendations in mind, I sketched several paper prototypes and finalized two layouts, which I converted into clickable Figma prototypes.
DESIGN HYPOTHESES
Low-fidelity design explorations for the Google Drive homepage
Hypothesis
Incorporating a 'Recent Files' section based on user activity, where files are organized by file type and share status, along with a dedicated 'Folders' section for quick access to frequently used folders, will significantly speed up file search and discovery.
Prototype A: Mid-fidelity Design Variation 1
Prototype B: Mid-fidelity Design Variation 2
EVALUATIVE RESEARCH: FINDINGS
Both prototypes enabled faster file search and required fewer search attempts to locate a file compared to the existing design.
While Prototype B enabled faster and more efficient file search than Prototype A, users favored A's usability due to its consistent vertical scroll design, compared to B's split layout.
Prototype A
Prototype B
Control
It is significantly faster to find desired files using the prototypes than using the existing design.
Users were 4x faster with Prototype A and 5x faster with Prototype B compared to the current interface.
The prototypes have a more intuitive design; they enabled more efficient file searches.
They required half as many search strategies on average to locate files compared to the control.
The redesign successfully made the homepage a valuable entry point to access files.
Users chose to, and were quickly able to, find the file or its containing folder via the homepages of prototypes A & B, but chose to avoid the homepage of the existing design entirely.
Users perceived the redesigned interfaces to be more intuitive and user-friendly than the existing design.
Prototype A and B had a higher SUS score (higher = better usability) than control.*
Many SUS questions assess confidence and comfort using the system, so testing the current design with frequent Google Drive users likely introduced a “familiarity bias”. This makes it difficult to directly compare SUS scores between the new and existing designs, as users were encountering the new designs for the first time.
In hindsight, I would have tested the existing design with first-time or less frequent Google Drive users to create a more equal comparison. However, the fact that the new designs still scored higher on the SUS despite users' familiarity with the old design further supports the conclusion that the new designs are more intuitive and user-friendly than the existing one.
Testing showed that a redesigned Google Drive homepage, with user-selected folders and a Recent Files section (displaying only the user's recently accessed files, organized by file type and share status), improves user experience by making file access faster and more streamlined.
Although Prototype B enabled faster and more efficient file access, users perceived Prototype A as more usable. My analysis of think-aloud data and user justifications for SUS scores revealed that the unfamiliar vertically split layout of Prototype B was the primary issue, specifically the combination of a scrollable list on the left and a grid layout of folders on the right.
Users struggled with the side-by-side differing interactions.
To produce a final homepage design that improves file access and is also perceived as highly usable, I combined elements of Prototypes A and B. The final design below reflects the design choices made based on insights from the multivariate usability testing:
Insights from multivariate testing were translated into the mid-fidelity prototype of the Google Drive homepage shown above. It combines features from Prototypes A and B that received positive user feedback and demonstrated strong performance in file search and SUS metrics.
EVALUATIVE RESEARCH: PROCESS
MULTIVARIATE THINK-ALOUD USABILITY TESTING
I hypothesized that a homepage design featuring "Recent Files" (organized by file type and share status based on user activity) and a dedicated "Folders" section (for quick access to user-selected folders) would significantly improve file search and discovery speed. Prototypes A and B both incorporated these features, but with different layouts.
To test this hypothesis and determine which design has better file search performance and usability, I conducted multivariate think-aloud usability testing with 9 student participants (ages 23-27) who regularly use Google Drive for academic work and collaborative projects. Participants were divided into three groups of three, each using either Prototype A, Prototype B, or the existing Google Drive interface. All participants completed the same file search task (find a specific file).
Specifically, my research goal was to:
Measure and compare the file search speed (average time to locate a target file) and efficiency (average number of strategies used) across the three designs to identify the most effective design.
Evaluate and compare the usability and intuitiveness of the three designs, and identify the design with the best perceived file search experience and overall usability (SUS score).
Gather qualitative user feedback on the new designs to inform future design iterations and improvements.
File search speed (average time to locate a target file) and file search efficiency (average number of strategies used) were both measured because users shouldn't just find their files quickly; they should also find them easily, ideally on the first try.
A screen-share video of each session was recorded with participant consent and reviewed afterward to extract time-on-task data and the number of search strategies used.
A System Usability Scale questionnaire was administered at the end of each session to measure the perceived usability of the design the participant tested.
I conducted multivariate think-aloud usability testing because I had multiple designs (Prototypes A and B, plus the existing Google Drive interface as a control) and wanted to understand how users interact with each of them while simultaneously gathering their immediate thoughts and feedback.
I opted for a multivariate approach rather than a simple A/B test because I needed to compare all three versions. An A/B test would only allow me to compare two designs, and since I didn’t have baseline file search, efficiency, or SUS metrics for the existing design, I needed to include it in the comparison.
Overall, this approach not only helped me assess whether the design hypothesis (introducing a user-activity-prioritized “Recent Files” section and a dedicated, user-selected “Folders” section) is valid, offering better usability and file search outcomes than the current design, but also helped me determine which of the two design variations (A or B) performs better.
While a typical multivariate test focuses solely on metrics, my goal was to also understand users’ thought processes behind their actions, their reasoning for their SUS ratings, and their feedback on the designs. This qualitative data was crucial for understanding what features worked (or didn’t) and why, offering valuable insights to inform future design iterations and improvements.
By combining quantitative data (time on task, number of search strategies) with qualitative data (think-aloud feedback), I was able to make informed decisions about which design is most effective.
You conducted concept validation testing for your Consumer Reports group project last month and stored the transcript files in Google Drive, which you use to collaborate with your team. You want to now access participant P1’s transcript file to pull a quote for a research report.
Use the given Google Drive interface to find the “P1 Transcript” file without using the search functionality. Make sure to think out loud while you complete this task.
File Search Speed (Time on Task): Users were timed from the start of the task until they found the target file to measure how quickly users can access files using the given design. The goal was for users to locate the file within 30 seconds.
File Search Efficiency: The number of search strategies used by each participant to find the target file was counted to measure how straightforward and intuitive the design is. Ideally, users should find the file using only one strategy.
Both file search speed and efficiency were measured because it is not only important for users to find their files quickly, it is also important for them to find them easily, ideally on their first try. Having to try multiple approaches, navigate to the wrong places, and backtrack takes cognitive effort and creates frustration and disappointment, even if the file is eventually found quickly.
The goal is for the new designs to enable users to find files more quickly and easily (using fewer strategies and less time) than the current Google Drive interface.
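The time-on-task comparison reduces to simple arithmetic over per-condition samples. The timings below are hypothetical stand-ins (the study's actual measurements are not reproduced here), with n = 3 per group mirroring the study design:

```python
import statistics

# Hypothetical time-on-task samples (seconds) per condition, n = 3 each,
# mirroring the study's group sizes but NOT its actual measurements.
times = {
    "Prototype A": [22, 18, 25],
    "Prototype B": [15, 19, 17],
    "Control":     [80, 95, 110],
}

# Mean time on task per condition.
means = {cond: statistics.mean(ts) for cond, ts in times.items()}

# Speedup of each prototype relative to the existing interface.
for cond in ("Prototype A", "Prototype B"):
    speedup = means["Control"] / means[cond]
    print(f"{cond}: mean {means[cond]:.1f}s, {speedup:.1f}x faster than control")
```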
System Usability Scale (SUS) Score
The usability and intuitiveness of the designs were measured using the SUS questionnaire.
The goal was to test whether the SUS scores of the two prototypes are higher than that of the current Google Drive interface, and to determine which of the two prototypes has the higher SUS score and why.
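SUS scoring follows a fixed formula: each of the ten items is answered on a 1-5 scale, positively worded (odd-numbered) items contribute `response - 1`, negatively worded (even-numbered) items contribute `5 - response`, and the summed contributions are multiplied by 2.5 to yield a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 responses.

    Odd-numbered items (1-indexed) are positively worded: contribution = r - 1.
    Even-numbered items are negatively worded: contribution = 5 - r.
    The summed contributions (0-40) are scaled by 2.5.
    """
    assert len(responses) == 10, "SUS requires exactly ten item responses"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral responses (3s) land exactly at the midpoint.
print(sus_score([3] * 10))  # -> 50.0
```

Note that the resulting number is not a percentage; it is conventionally interpreted against benchmark averages (a score around 68 is often cited as average), which is why comparing scores across the three conditions mattered more than any single absolute value.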
9
Participants
23 - 27
Age Range
Screenshots showing a moderated think-aloud usability testing session of Prototype B in progress. The participant is interacting with the prototype to complete the assigned task (top) and filling out the SUS questionnaire (bottom).
SYNTHESIS
Screenshot of my spreadsheet layout for synthesizing multivariate think-aloud usability testing results. It organizes data by metric (left) and compiles positive and negative feedback, along with user experiences, for each prototype and participant (top).
REFLECTION
Key Takeaways
What Worked Well —
Data triangulation strengthened my confidence in my insights and design recommendations
Combining quantitative metrics with qualitative think-aloud data was very valuable. This triangulation strengthened my confidence in my insights and design recommendations, particularly since I was the sole researcher on the project and worked with a limited sample size.
Working independently, I was concerned about potential bias in my interpretation of user behavior and feedback. However, the quantitative data gathered from usability tasks and the SUS questionnaire provided objective support for my findings, boosting my confidence in the resulting insights. Overall, the combination of "what" users did (metrics) and "why" they did it (qualitative feedback) provided a deeper understanding of the user experience. Plus, developing and analyzing the usability tasks and metrics was a lot of fun!
Observing user behavior helped me uncover deeper insights into user needs
Observing user behavior firsthand was crucial as it allowed me to see how participants carried out different tasks and identify patterns in behavior across users. Analyzing these patterns helped me uncover deeper insights into user needs, which directly informed my design recommendations.
For example, during the exploratory research, I noticed that participants predominantly used the "My Drive" and "Starred" directories over the homepage to access files. Through follow-up questions, I discovered that this was because these directories offered more control and personalization, allowing users to organize files and folders according to their own conventions, which made file retrieval faster and easier. This led me to a deeper insight: users have a strong need for control and personalization in file storage and management. A homepage that offers this level of control would be more effective and useful.
This insight directly informed my design recommendations, and the experience solidified my belief in the power of firsthand observation and probing to uncover user needs and drive impactful design decisions.
Mental model sketches allowed me to visualize how users conceptualized file storage and search
Incorporating mental model drawings into the research process was a unique and insightful approach. It allowed me to visualize how users conceptualized file storage and search, revealing deeper insights into their expectations, motivations, and needs when using Google Drive. Also, it was really fun analyzing the drawings and the insights I extracted from them directly informed my final design recommendations.
:)
Challenges and How I Addressed Them —
Creating comparable tasks was difficult as every user stores and organizes files differently.
Focusing on a very specific user group made it easier to create relevant and comparable tasks.
Designing comparable usability testing tasks for a platform as personalized as Google Drive was a significant challenge. Because everyone stores and organizes files in their Drive differently, creating tasks and metrics that work for everyone and are comparable was difficult. Recognizing this, I focused on a very specific user group (college students who work on collaborative projects) to create tasks that are relevant and comparable for all participants. While this approach had limitations, it allowed me to gather valuable data within the project constraints.
There was significant discrepancy between what some users say versus what they do.
Asking probing questions and focusing more on user actions helped me get reliable insights.
While think-aloud usability testing can provide rich data because it allows one to observe behavior and also hear from the participants, I faced the common challenge of users saying one thing and doing another. This was especially confusing at times when participants emphatically expressed their preferences to be a certain way, but their task performance proved otherwise. This made it challenging to draw clear and clean insights.
I combated this by asking probing follow-up questions to understand the discrepancy and by focusing more on what participants were doing than on what they were saying. In some cases, I also started accounting for differences in participant personality when extracting insights from their feedback, because I realized that some participants are overly positive or overly negative in their perception and articulation, which can lead to incorrect insights if taken at face value.
:'(
Learnings & Improvements —
I need to conduct more divergent prototyping before narrowing down A/B prototypes to explore more diverse ideas and uncover innovative solutions
A key takeaway from this project is the importance of early-stage exploration and rapid prototyping. In future projects, I would dedicate more time to generating a wider variety of concepts before narrowing them down to A/B prototypes for final testing. This would allow for more diverse ideas to be explored, increasing the chances of uncovering innovative solutions. Given the tight timeline of this project, I wasn’t able to iterate as much as I would have liked, but with more time, I’d prioritize this in the process.
To ensure reliable comparison between new and control prototypes, recruit users with minimal experience with the existing design
In future A/B tests, I will ensure participants are recruited in a way that allows for direct comparisons between prototypes. In this study, I tested three designs (Prototype A, B, and the Control) with participants who were heavy Google Drive users.
However, since these participants were interacting with Prototypes A and B for the first time, but were already very familiar with the Control (the existing Google Drive homepage), it likely introduced a "familiarity bias." This made it difficult to draw meaningful comparisons in task performance and SUS results.
In hindsight, I would have tested the existing design with first-time or less frequent Google Drive users to create a more balanced comparison.
Format questions in a tiered fashion in unmoderated interview guides to ensure participants respond to all parts of the question
My initial unmoderated interview guide placed "why/why not" follow-up questions at the end of the main question, like this:
From 1-5, how easy or difficult is it for you to create a folder on Google Drive? (1=very difficult, and 5=very easy). Why?
During pilot testing, I observed that participants, focused on the first part of the question, often neglected the "why" portion. To address this, I revised the format to a tiered approach. I included "why" questions on a separate line immediately following the initial question to mimic a real-time interview scenario. This ensures participants don't miss this crucial element.
I also added context to the "why" questions, referencing the original question to aid recall and encourage more thoughtful responses. Here's the updated format:
From 1-5, how easy or difficult is it for you to create a folder on Google Drive? (1=very difficult, and 5=very easy).
Why did you give that rating?
This new format worked out really well in the main unmoderated testing sessions. Moving forward, I will make sure to frame my questions in this way in interview guides.
Disable prototype hints in Figma to get reliable quantitative behavioral data
During pilot testing, I realized that leaving Figma prototype hints active during testing can inadvertently guide participants through the task, influencing their behavior. To prevent this, I disabled the hints during the main testing phase. Moving forward, I will always ensure that hints are turned off to encourage participants to explore the prototype independently and authentically.